Always preserve None-delimited groups in a captured TokenStream

Previously, we would silently remove any `None`-delimiters when
capturing a `TokenStream`, 'flattening' them to their inner tokens.
This was not normally visible, since we usually have
`TokenKind::Interpolated` (which gets converted to a `None`-delimited
group during macro invocation) instead of an actual `None`-delimited
group.

However, there are a couple of cases where this becomes visible to
proc-macros:
1. A cross-crate `macro_rules!` macro has a `None`-delimited group
   stored in its body (as a result of being produced by another
   `macro_rules!` macro). The cross-crate `macro_rules!` invocation
   can then expand to an attribute macro invocation, which needs
   to be able to see the `None`-delimited group.
2. A proc-macro can invoke an attribute proc-macro with its re-collected
   input. If there are any nonterminals present in the input, they will
   get re-collected to `None`-delimited groups, which will then get
   captured as part of the attribute macro invocation.
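Case 1 can be sketched with a small stand-alone example (hypothetical macro names; the point is that the `$e:expr` capture travels into the nested macro's body as an invisible `None`-delimited group):

```rust
// A macro_rules! macro that defines another macro_rules! macro.
// The `$e:expr` capture is substituted into `inner`'s body, where it
// is stored as a `None`-delimited group around the captured tokens.
macro_rules! outer {
    ($e:expr) => {
        macro_rules! inner {
            () => { $e };
        }
    };
}

// `inner` now has `(1 + 2)`-as-a-None-group baked into its body.
outer!(1 + 2);

fn main() {
    // Expanding `inner` yields the captured expression.
    assert_eq!(inner!(), 3);
}
```

If `inner` instead expanded to an attribute macro invocation, that attribute macro would observe the `None`-delimited group in its input.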

Both of these cases are incredibly obscure, so there hopefully won't be
any breakage. This change will allow more aggressive 'flattening' of
nonterminals in #82608 without losing `None`-delimited groups.
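The old 'flattening' behavior can be modeled with a simplified stand-in for the token-tree types (these are not rustc's actual `TokenStream`/`TokenTree` definitions, just an illustration of what splicing a `None`-delimited group into its parent stream means):

```rust
// Simplified model of a token tree: either a bare token or a
// delimited group. `Delim::None` is the invisible delimiter.
#[derive(Clone, Debug, PartialEq)]
enum Delim {
    None,
    Paren,
}

#[derive(Clone, Debug, PartialEq)]
enum Tree {
    Token(&'static str),
    Group(Delim, Vec<Tree>),
}

// Old behavior: drop the invisible `None` delimiters and splice the
// inner tokens directly into the surrounding stream.
fn flatten(trees: Vec<Tree>) -> Vec<Tree> {
    let mut out = Vec::new();
    for tree in trees {
        match tree {
            Tree::Group(Delim::None, inner) => out.extend(flatten(inner)),
            other => out.push(other),
        }
    }
    out
}

fn main() {
    let stream = vec![
        Tree::Token("a"),
        Tree::Group(
            Delim::None,
            vec![Tree::Token("1"), Tree::Token("+"), Tree::Token("2")],
        ),
    ];
    // Previously the group was dissolved into its inner tokens; after
    // this change, a captured stream keeps the group intact.
    assert_eq!(
        flatten(stream),
        vec![
            Tree::Token("a"),
            Tree::Token("1"),
            Tree::Token("+"),
            Tree::Token("2")
        ]
    );
}
```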
Aaron Hill
2021-03-26 23:23:20 -04:00
parent f811f14006
commit f94360fd83
5 changed files with 113 additions and 28 deletions

@@ -172,6 +172,13 @@ struct TokenCursor {
// appended to the captured stream when
// we evaluate a `LazyTokenStream`
append_unglued_token: Option<TreeAndSpacing>,
+// If `true`, skip the delimiters for `None`-delimited groups,
+// and just yield the inner tokens. This is `true` during
+// normal parsing, since the parser code is not currently prepared
+// to handle `None` delimiters. When capturing a `TokenStream`,
+// however, we want to handle `None`-delimiters, since
+// proc-macros always see `None`-delimited groups.
+skip_none_delims: bool,
}
#[derive(Clone)]
@@ -184,13 +191,13 @@ struct TokenCursorFrame {
}
impl TokenCursorFrame {
-fn new(span: DelimSpan, delim: DelimToken, tts: TokenStream) -> Self {
+fn new(span: DelimSpan, delim: DelimToken, tts: TokenStream, skip_none_delims: bool) -> Self {
TokenCursorFrame {
delim,
span,
-open_delim: delim == token::NoDelim,
+open_delim: delim == token::NoDelim && skip_none_delims,
tree_cursor: tts.into_trees(),
-close_delim: delim == token::NoDelim,
+close_delim: delim == token::NoDelim && skip_none_delims,
}
}
}
@@ -218,7 +225,7 @@ impl TokenCursor {
return (token, spacing);
}
TokenTree::Delimited(sp, delim, tts) => {
-let frame = TokenCursorFrame::new(sp, delim, tts);
+let frame = TokenCursorFrame::new(sp, delim, tts, self.skip_none_delims);
self.stack.push(mem::replace(&mut self.frame, frame));
}
}
@@ -276,6 +283,7 @@ impl TokenCursor {
.cloned()
.collect::<TokenStream>()
},
+self.skip_none_delims,
),
));
@@ -371,12 +379,19 @@ impl<'a> Parser<'a> {
prev_token: Token::dummy(),
restrictions: Restrictions::empty(),
expected_tokens: Vec::new(),
+// Skip over the delimiters for `None`-delimited groups
token_cursor: TokenCursor {
-frame: TokenCursorFrame::new(DelimSpan::dummy(), token::NoDelim, tokens),
+frame: TokenCursorFrame::new(
+    DelimSpan::dummy(),
+    token::NoDelim,
+    tokens,
+    /* skip_none_delims */ true,
+),
stack: Vec::new(),
num_next_calls: 0,
desugar_doc_comments,
append_unglued_token: None,
+skip_none_delims: true,
},
desugar_doc_comments,
unmatched_angle_bracket_count: 0,