[librustdoc] Reform lang string token splitting
Only split doctest lang strings on `,`, ` `, and `\t`. Additionally, to
preserve backwards compatibility with pandoc-style lang strings, strip a
surrounding `{}`, and remove leading `.`s from each token.
Prior to this change, doctest lang strings were split on all
non-alphanumeric characters except `-` or `_`, which limited future
extensions to doctest lang string tokens, for example using `=` for
key-value tokens.
This is a breaking change, although it is not expected to be disruptive,
because lang strings using separators other than `,` and ` ` are not
very common.
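For illustration, here is a minimal sketch of the splitting rules described above. The function name `tokenize_lang_string` and its exact shape are assumptions for this example, not librustdoc's actual internals.

```rust
// Hypothetical helper illustrating the new lang-string tokenization rules:
// split only on `,`, ` `, and `\t`, strip a surrounding `{}`, and trim
// leading `.`s from each token. This is a sketch, not librustdoc's code.
fn tokenize_lang_string(lang: &str) -> Vec<String> {
    // Strip a surrounding `{}` (pandoc-style), e.g. `{.rust .should_panic}`.
    let lang = lang
        .strip_prefix('{')
        .and_then(|s| s.strip_suffix('}'))
        .unwrap_or(lang);

    lang
        // Split only on `,`, ` `, and `\t`; the old code split on every
        // non-alphanumeric character except `-` and `_`.
        .split(|c| c == ',' || c == ' ' || c == '\t')
        .filter(|token| !token.is_empty())
        // Remove leading `.`s from each token, so `.rust` becomes `rust`.
        .map(|token| token.trim_start_matches('.').to_string())
        .collect()
}

fn main() {
    // Pandoc-style lang string: `{}` and leading `.`s are stripped.
    assert_eq!(
        tokenize_lang_string("{.rust .should_panic}"),
        ["rust", "should_panic"]
    );
    // `=` is no longer a separator, so a future `key=value` token stays intact.
    assert_eq!(
        tokenize_lang_string("rust,edition=2018"),
        ["rust", "edition=2018"]
    );
}
```

Because `=` is no longer treated as a separator, key-value style tokens survive tokenization unchanged, which is the extension point the commit message refers to.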
@@ -10,7 +10,7 @@
 //! fixpoint solution to your dataflow problem, or implement the `ResultsVisitor` interface and use
 //! `visit_results`. The following example uses the `ResultsCursor` approach.
 //!
-//! ```ignore(cross-crate-imports)
+//! ```ignore (cross-crate-imports)
 //! use rustc_mir::dataflow::Analysis; // Makes `into_engine` available.
 //!
 //! fn do_my_analysis(tcx: TyCtxt<'tcx>, body: &mir::Body<'tcx>) {
@@ -211,7 +211,7 @@ pub trait Analysis<'tcx>: AnalysisDomain<'tcx> {
 /// default impl and the one for all `A: GenKillAnalysis` will do the right thing.
 /// Its purpose is to enable method chaining like so:
 ///
-/// ```ignore(cross-crate-imports)
+/// ```ignore (cross-crate-imports)
 /// let results = MyAnalysis::new(tcx, body)
 ///     .into_engine(tcx, body, def_id)
 ///     .iterate_to_fixpoint()