Rollup merge of #140147 - xizheyin:issue-138401-1, r=compiler-errors

Clean: rename `open_braces` to `open_delimiters` in lexer and move `make_unclosed_delims_error` into `diagnostics.rs`.

Cleanup in preparation for resolving #138401. To avoid packing too many incidental changes into one PR, this PR tidies up some naming and method placement in the lexer.
1. The `make_unclosed_delims_error` function defined in `mod.rs` is only used by the lexer, so it is moved into the lexer, which improves encapsulation (a rough sketch of its new home follows this list).
2. The name `open_braces` in `TokenTreeDiagInfo` is misleading: `Brace` refers only to `{...}`, while this field can hold every kind of `Delimiter`, so it is renamed to `open_delimiters` (also sketched below).
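For item 1, here is a minimal sketch of what the relocated function could look like in `lexer/diagnostics.rs`. The body is taken from the code removed in the diff below; the exact imports, file path, and visibility are assumptions, not what the PR necessarily does.

```rust
// Hypothetical contents of compiler/rustc_parse/src/lexer/diagnostics.rs after the move;
// import paths and visibility are guesses, the body matches the code removed from mod.rs.
use rustc_ast_pretty::pprust;
use rustc_errors::Diag;
use rustc_session::parse::ParseSess;

use super::UnmatchedDelim;
use crate::errors::MismatchedClosingDelimiter;

pub(super) fn make_unclosed_delims_error(
    unmatched: UnmatchedDelim,
    psess: &ParseSess,
) -> Option<Diag<'_>> {
    // `None` means an `Eof` was found; those errors are emitted elsewhere, and entries are
    // kept in `unmatched_delims` only for error recovery in the `Parser`.
    let found_delim = unmatched.found_delim?;
    let mut spans = vec![unmatched.found_span];
    if let Some(sp) = unmatched.unclosed_span {
        spans.push(sp);
    }
    let err = psess.dcx().create_err(MismatchedClosingDelimiter {
        spans,
        delimiter: pprust::token_kind_to_string(&found_delim.as_close_token_kind()).to_string(),
        unmatched: unmatched.found_span,
        opening_candidate: unmatched.candidate_span,
        unclosed: unmatched.unclosed_span,
    });
    Some(err)
}
```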
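For item 2, a minimal sketch of the rename in `TokenTreeDiagInfo`; the element type and the surrounding fields are placeholders for illustration, not the actual struct definition.

```rust
use rustc_ast::token::Delimiter;
use rustc_span::Span;

pub(super) struct TokenTreeDiagInfo {
    /// Stack of currently open delimiters: parentheses and brackets as well as braces,
    /// which is why `open_delimiters` describes it better than `open_braces`.
    pub open_delimiters: Vec<(Delimiter, Span)>,
    // ... other diagnostic bookkeeping fields elided ...
}
```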

r? `@chenyukang`
Authored by Chris Denton on 2025-04-23 00:43:06 +00:00, committed by GitHub
5 changed files with 54 additions and 50 deletions


@@ -43,11 +43,8 @@ use token_type::TokenTypeSet;
 pub use token_type::{ExpKeywordPair, ExpTokenPair, TokenType};
 use tracing::debug;
 
-use crate::errors::{
-    self, IncorrectVisibilityRestriction, MismatchedClosingDelimiter, NonStringAbiLiteral,
-};
+use crate::errors::{self, IncorrectVisibilityRestriction, NonStringAbiLiteral};
 use crate::exp;
-use crate::lexer::UnmatchedDelim;
 
 #[cfg(test)]
 mod tests;
@@ -1745,27 +1742,6 @@ impl<'a> Parser<'a> {
     }
 }
 
-pub(crate) fn make_unclosed_delims_error(
-    unmatched: UnmatchedDelim,
-    psess: &ParseSess,
-) -> Option<Diag<'_>> {
-    // `None` here means an `Eof` was found. We already emit those errors elsewhere, we add them to
-    // `unmatched_delims` only for error recovery in the `Parser`.
-    let found_delim = unmatched.found_delim?;
-    let mut spans = vec![unmatched.found_span];
-    if let Some(sp) = unmatched.unclosed_span {
-        spans.push(sp);
-    };
-    let err = psess.dcx().create_err(MismatchedClosingDelimiter {
-        spans,
-        delimiter: pprust::token_kind_to_string(&found_delim.as_close_token_kind()).to_string(),
-        unmatched: unmatched.found_span,
-        opening_candidate: unmatched.candidate_span,
-        unclosed: unmatched.unclosed_span,
-    });
-    Some(err)
-}
 
 /// A helper struct used when building an `AttrTokenStream` from
 /// a `LazyAttrTokenStream`. Both delimiter and non-delimited tokens
 /// are stored as `FlatToken::Token`. A vector of `FlatToken`s