proc_macro: Reorganize public API

This commit is a reorganization of the `proc_macro` crate's public user-facing
API. This is the result of a number of discussions at the recent Rust All-Hands
where we're hoping to get the `proc_macro` crate into ship shape for
stabilization of a subset of its functionality in the Rust 2018 release.

The reorganization here is motivated by experiences from the `proc-macro2`,
`quote`, and `syn` crates on crates.io (and other crates which depend on them).
The main focus is future flexibility along with making a few more operations
consistent and/or fixing bugs. A summary of the changes made from today's
`proc_macro` API is:

* The `TokenNode` enum has been removed and the public fields of `TokenTree`
  have also been removed. Instead the `TokenTree` type is now a public enum
  (what `TokenNode` was) and each variant is an opaque struct which internally
  contains `Span` information. This makes the various tokens a bit more
  consistent, requires fewer wrappers, and otherwise provides good
  future-compatibility, as opaque structs are easy to modify later on.
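
  For a sense of how consumers interact with the new shape, here is a sketch
  written against the mirrored `proc-macro2` 0.3 API (which, unlike
  `proc_macro` itself, runs outside a macro invocation); the variant and
  accessor names follow this PR:

  ```rust
  extern crate proc_macro2;

  use proc_macro2::{TokenStream, TokenTree};

  // `TokenTree` is now the enum itself; each variant wraps an opaque
  // struct reached through accessor methods rather than public fields.
  fn describe(stream: TokenStream) {
      for tree in stream {
          match tree {
              TokenTree::Group(g) => println!("group delimited by {:?}", g.delimiter()),
              TokenTree::Term(t) => println!("identifier: {}", t.as_str()),
              TokenTree::Op(o) => println!("operator: {}", o.op()),
              TokenTree::Literal(l) => println!("literal: {}", l),
          }
      }
  }

  fn main() {
      let stream: TokenStream = "a + 1".parse().unwrap();
      describe(stream);
  }
  ```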

* `Literal` integer constructors have been expanded to be unambiguous as to what
  they're doing and also allow for more future flexibility. Previously
  constructors like `Literal::float` and `Literal::integer` were used to create
  unsuffixed literals and the concrete methods like `Literal::i32` would create
  a suffixed token. This wasn't immediately clear to all users (the
  suffixed/unsuffixed aspect), and having *one* constructor for unsuffixed
  literals required us to pick a single largest type, which may not always be
  the right choice. To fix these issues all constructors are now of the form
  `Literal::i32_unsuffixed` or `Literal::i32_suffixed` (for all integral types).
  This should allow future compatibility as well as being immediately clear
  what's suffixed and what isn't.
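
  Sketched with the mirrored `proc-macro2` 0.3 API, the distinction looks
  like this:

  ```rust
  extern crate proc_macro2;

  use proc_macro2::Literal;

  fn main() {
      // Unsuffixed: only the digits appear in the token; the compiler
      // infers the concrete type later, as with `let x = 10;`.
      let unsuffixed = Literal::i32_unsuffixed(10);

      // Suffixed: the type is baked into the token itself, as in `10i32`.
      let suffixed = Literal::i32_suffixed(10);

      assert_eq!(unsuffixed.to_string(), "10");
      assert_eq!(suffixed.to_string(), "10i32");
  }
  ```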

* Each variant of `TokenTree` internally contains a `Span` which can also be
  configured via `set_span`. For example `Literal` and `Term` now both
  internally contain a `Span` rather than having it stored in an auxiliary
  location.
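
  In practice that means every token exposes `span` and `set_span` directly; a
  small sketch against the mirrored `proc-macro2` 0.3 API:

  ```rust
  extern crate proc_macro2;

  use proc_macro2::{Literal, Span};

  fn main() {
      // The literal owns its span; no auxiliary (span, kind) pair needed.
      let mut lit = Literal::u8_unsuffixed(7);
      let _default = lit.span();        // defaults to Span::call_site()
      lit.set_span(Span::call_site());  // reconfigure in place
      assert_eq!(lit.to_string(), "7");
  }
  ```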

* Constructors of all tokens are now called `new` (e.g. `Term::intern` is gone)
  and most do not take spans. Manufactured tokens typically don't have a fresh
  span to go with them and the span is purely used for error-reporting
  **except** the span for `Term`, which currently affects hygiene. The default
  span for all these constructed tokens is `Span::call_site()` for now.

  The `Term` type's constructor explicitly requires passing in a `Span` to
  provide future-proofing against possible hygiene changes. It's intended that a
  first pass of stabilization will likely only stabilize `Span::call_site()`
  which is an explicit opt-in for "I would like no hygiene here please". The
  intention here is to make this explicit in procedural macros to be
  forwards-compatible with a hygiene-specifying solution.
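
  Constructing tokens might then look like the following (again via the
  mirrored `proc-macro2` 0.3 API; the identifier name is only illustrative,
  and only `Term` takes an explicit span):

  ```rust
  extern crate proc_macro2;

  use proc_macro2::{Op, Spacing, Span, Term};

  fn main() {
      // Most tokens: `new` without a span, defaulting to Span::call_site().
      let plus = Op::new('+', Spacing::Alone);

      // `Term` is the exception: its span carries hygiene, so the
      // constructor demands one. `Span::call_site()` is the explicit
      // "no hygiene here please" opt-in described above.
      let ident = Term::new("my_helper", Span::call_site());

      assert_eq!(plus.op(), '+');
      assert_eq!(ident.as_str(), "my_helper");
  }
  ```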

* Some of the conversions for `TokenStream` have been simplified a little.

* The `TokenTreeIter` iterator was renamed to `token_stream::IntoIter`.

Overall the hope is that this is the "final pass" at the API of `TokenStream`
and most of `TokenTree` before stabilization. Explicitly left out here are any
changes to `Span`'s API, which will likely need to be re-evaluated before
stabilization.

All changes in this PR have already been reflected in the [`proc-macro2`],
`quote`, and `syn` crates. New versions of all these crates have also been
published to crates.io.

Once this lands in nightly I plan on making an internals post again summarizing
the changes made here and also calling on all macro authors to give the APIs a
spin and see how they work. Pending no major issues, we can hopefully then have
an FCP to stabilize later this cycle!

[`proc-macro2`]: https://docs.rs/proc-macro2/0.3.1/proc_macro2/
Author: Alex Crichton
Date: 2018-04-02 08:19:32 -07:00
Parent: 097efa9a99
Commit: 553c04d9eb
10 changed files with 750 additions and 338 deletions


@@ -59,7 +59,6 @@ use syntax::errors::DiagnosticBuilder;
 use syntax::parse::{self, token};
 use syntax::symbol::Symbol;
 use syntax::tokenstream;
-use syntax_pos::DUMMY_SP;
 use syntax_pos::{FileMap, Pos, SyntaxContext, FileName};
 use syntax_pos::hygiene::Mark;
@@ -73,7 +72,7 @@ use syntax_pos::hygiene::Mark;
 /// The API of this type is intentionally bare-bones, but it'll be expanded over
 /// time!
 #[stable(feature = "proc_macro_lib", since = "1.15.0")]
-#[derive(Clone, Debug)]
+#[derive(Clone)]
 pub struct TokenStream(tokenstream::TokenStream);
 /// Error returned from `TokenStream::from_str`.
@@ -83,6 +82,20 @@ pub struct LexError {
     _inner: (),
 }
+impl TokenStream {
+    /// Returns an empty `TokenStream`.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn empty() -> TokenStream {
+        TokenStream(tokenstream::TokenStream::empty())
+    }
+
+    /// Checks if this `TokenStream` is empty.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn is_empty(&self) -> bool {
+        self.0.is_empty()
+    }
+}
 #[stable(feature = "proc_macro_lib", since = "1.15.0")]
 impl FromStr for TokenStream {
     type Err = LexError;
@@ -110,6 +123,81 @@ impl fmt::Display for TokenStream {
     }
 }
+#[stable(feature = "proc_macro_lib", since = "1.15.0")]
+impl fmt::Debug for TokenStream {
+    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+        self.0.fmt(f)
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl From<TokenTree> for TokenStream {
+    fn from(tree: TokenTree) -> TokenStream {
+        TokenStream(tree.to_internal())
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl iter::FromIterator<TokenTree> for TokenStream {
+    fn from_iter<I: IntoIterator<Item = TokenTree>>(trees: I) -> Self {
+        let mut builder = tokenstream::TokenStreamBuilder::new();
+        for tree in trees {
+            builder.push(tree.to_internal());
+        }
+        TokenStream(builder.build())
+    }
+}
+
+/// Implementation details for the `TokenTree` type, such as iterators.
+#[unstable(feature = "proc_macro", issue = "38356")]
+pub mod token_stream {
+    use syntax::tokenstream;
+    use syntax_pos::DUMMY_SP;
+
+    use {TokenTree, TokenStream, Delimiter};
+
+    /// An iterator over `TokenTree`s.
+    #[derive(Clone)]
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub struct IntoIter {
+        cursor: tokenstream::Cursor,
+        stack: Vec<TokenTree>,
+    }
+
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    impl Iterator for IntoIter {
+        type Item = TokenTree;
+
+        fn next(&mut self) -> Option<TokenTree> {
+            loop {
+                let tree = self.stack.pop().or_else(|| {
+                    let next = self.cursor.next_as_stream()?;
+                    Some(TokenTree::from_internal(next, &mut self.stack))
+                })?;
+                if tree.span().0 == DUMMY_SP {
+                    if let TokenTree::Group(ref group) = tree {
+                        if group.delimiter() == Delimiter::None {
+                            self.cursor.insert(group.stream.clone().0);
+                            continue
+                        }
+                    }
+                }
+                return Some(tree);
+            }
+        }
+    }
+
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    impl IntoIterator for TokenStream {
+        type Item = TokenTree;
+        type IntoIter = IntoIter;
+
+        fn into_iter(self) -> IntoIter {
+            IntoIter { cursor: self.0.trees(), stack: Vec::new() }
+        }
+    }
+}
 /// `quote!(..)` accepts arbitrary tokens and expands into a `TokenStream` describing the input.
 /// For example, `quote!(a + b)` will produce a expression, that, when evaluated, constructs
 /// the `TokenStream` `[Word("a"), Op('+', Alone), Word("b")]`.
@@ -124,71 +212,6 @@ macro_rules! quote { () => {} }
 #[doc(hidden)]
 mod quote;
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl From<TokenTree> for TokenStream {
-    fn from(tree: TokenTree) -> TokenStream {
-        TokenStream(tree.to_internal())
-    }
-}
-
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl From<TokenNode> for TokenStream {
-    fn from(kind: TokenNode) -> TokenStream {
-        TokenTree::from(kind).into()
-    }
-}
-
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl<T: Into<TokenStream>> iter::FromIterator<T> for TokenStream {
-    fn from_iter<I: IntoIterator<Item = T>>(streams: I) -> Self {
-        let mut builder = tokenstream::TokenStreamBuilder::new();
-        for stream in streams {
-            builder.push(stream.into().0);
-        }
-        TokenStream(builder.build())
-    }
-}
-
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl IntoIterator for TokenStream {
-    type Item = TokenTree;
-    type IntoIter = TokenTreeIter;
-
-    fn into_iter(self) -> TokenTreeIter {
-        TokenTreeIter { cursor: self.0.trees(), stack: Vec::new() }
-    }
-}
-
-impl TokenStream {
-    /// Returns an empty `TokenStream`.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn empty() -> TokenStream {
-        TokenStream(tokenstream::TokenStream::empty())
-    }
-
-    /// Checks if this `TokenStream` is empty.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn is_empty(&self) -> bool {
-        self.0.is_empty()
-    }
-}
-
-/// A region of source code, along with macro expansion information.
-#[unstable(feature = "proc_macro", issue = "38356")]
-#[derive(Copy, Clone, Debug, PartialEq, Eq)]
-pub struct Span(syntax_pos::Span);
-
-impl Span {
-    /// A span that resolves at the macro definition site.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn def_site() -> Span {
-        ::__internal::with_sess(|(_, mark)| {
-            let call_site = mark.expn_info().unwrap().call_site;
-            Span(call_site.with_ctxt(SyntaxContext::empty().apply_mark(mark)))
-        })
-    }
-}
-
 /// Quote a `Span` into a `TokenStream`.
 /// This is needed to implement a custom quoter.
 #[unstable(feature = "proc_macro", issue = "38356")]
@@ -196,6 +219,11 @@ pub fn quote_span(span: Span) -> TokenStream {
     quote::Quote::quote(span)
 }
+/// A region of source code, along with macro expansion information.
+#[unstable(feature = "proc_macro", issue = "38356")]
+#[derive(Copy, Clone, Debug)]
+pub struct Span(syntax_pos::Span);
+
 macro_rules! diagnostic_method {
     ($name:ident, $level:expr) => (
         /// Create a new `Diagnostic` with the given `message` at the span
@@ -208,6 +236,15 @@ macro_rules! diagnostic_method {
 }
 
 impl Span {
+    /// A span that resolves at the macro definition site.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn def_site() -> Span {
+        ::__internal::with_sess(|(_, mark)| {
+            let call_site = mark.expn_info().unwrap().call_site;
+            Span(call_site.with_ctxt(SyntaxContext::empty().apply_mark(mark)))
+        })
+    }
+
     /// The span of the invocation of the current procedural macro.
     #[unstable(feature = "proc_macro", issue = "38356")]
     pub fn call_site() -> Span {
@@ -284,6 +321,12 @@ impl Span {
         other.resolved_at(*self)
     }
 
+    /// Compares to spans to see if they're equal.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn eq(&self, other: &Span) -> bool {
+        self.0 == other.0
+    }
+
     diagnostic_method!(error, Level::Error);
     diagnostic_method!(warning, Level::Warning);
     diagnostic_method!(note, Level::Note);
@@ -379,39 +422,97 @@ impl PartialEq<FileName> for SourceFile {
 /// A single token or a delimited sequence of token trees (e.g. `[1, (), ..]`).
 #[unstable(feature = "proc_macro", issue = "38356")]
 #[derive(Clone, Debug)]
-pub struct TokenTree {
-    /// The `TokenTree`'s span
-    pub span: Span,
-    /// Description of the `TokenTree`
-    pub kind: TokenNode,
+pub enum TokenTree {
+    /// A delimited tokenstream
+    Group(Group),
+    /// A unicode identifier
+    Term(Term),
+    /// A punctuation character (`+`, `,`, `$`, etc.).
+    Op(Op),
+    /// A literal character (`'a'`), string (`"hello"`), number (`2.3`), etc.
+    Literal(Literal),
+}
+
+impl TokenTree {
+    /// Returns the span of this token, accessing the `span` method of each of
+    /// the internal tokens.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn span(&self) -> Span {
+        match *self {
+            TokenTree::Group(ref t) => t.span(),
+            TokenTree::Term(ref t) => t.span(),
+            TokenTree::Op(ref t) => t.span(),
+            TokenTree::Literal(ref t) => t.span(),
+        }
+    }
+
+    /// Configures the span for *only this token*.
+    ///
+    /// Note that if this token is a `Group` then this method will not configure
+    /// the span of each of the internal tokens, this will simply delegate to
+    /// the `set_span` method of each variant.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn set_span(&mut self, span: Span) {
+        match *self {
+            TokenTree::Group(ref mut t) => t.set_span(span),
+            TokenTree::Term(ref mut t) => t.set_span(span),
+            TokenTree::Op(ref mut t) => t.set_span(span),
+            TokenTree::Literal(ref mut t) => t.set_span(span),
+        }
+    }
 }
 
 #[unstable(feature = "proc_macro", issue = "38356")]
-impl From<TokenNode> for TokenTree {
-    fn from(kind: TokenNode) -> TokenTree {
-        TokenTree { span: Span::def_site(), kind: kind }
+impl From<Group> for TokenTree {
+    fn from(g: Group) -> TokenTree {
+        TokenTree::Group(g)
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl From<Term> for TokenTree {
+    fn from(g: Term) -> TokenTree {
+        TokenTree::Term(g)
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl From<Op> for TokenTree {
+    fn from(g: Op) -> TokenTree {
+        TokenTree::Op(g)
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl From<Literal> for TokenTree {
+    fn from(g: Literal) -> TokenTree {
+        TokenTree::Literal(g)
     }
 }
 
 #[unstable(feature = "proc_macro", issue = "38356")]
 impl fmt::Display for TokenTree {
     fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
-        TokenStream::from(self.clone()).fmt(f)
+        match *self {
+            TokenTree::Group(ref t) => t.fmt(f),
+            TokenTree::Term(ref t) => t.fmt(f),
+            TokenTree::Op(ref t) => t.fmt(f),
+            TokenTree::Literal(ref t) => t.fmt(f),
+        }
     }
 }
 
-/// Description of a `TokenTree`
+/// A delimited token stream
+///
+/// A `Group` internally contains a `TokenStream` which is delimited by a
+/// `Delimiter`. Groups represent multiple tokens internally and have a `Span`
+/// for the entire stream.
 #[derive(Clone, Debug)]
 #[unstable(feature = "proc_macro", issue = "38356")]
-pub enum TokenNode {
-    /// A delimited tokenstream.
-    Group(Delimiter, TokenStream),
-    /// A unicode identifier.
-    Term(Term),
-    /// A punctuation character (`+`, `,`, `$`, etc.).
-    Op(char, Spacing),
-    /// A literal character (`'a'`), string (`"hello"`), or number (`2.3`).
-    Literal(Literal),
+pub struct Group {
+    delimiter: Delimiter,
+    stream: TokenStream,
+    span: Span,
 }
 
 /// Describes how a sequence of token trees is delimited.
@@ -428,23 +529,72 @@ pub enum Delimiter {
     None,
 }
 
-/// An interned string.
-#[derive(Copy, Clone, Debug)]
-#[unstable(feature = "proc_macro", issue = "38356")]
-pub struct Term(Symbol);
-
-impl Term {
-    /// Intern a string into a `Term`.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn intern(string: &str) -> Term {
-        Term(Symbol::intern(string))
-    }
-
-    /// Get a reference to the interned string.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn as_str(&self) -> &str {
-        unsafe { &*(&*self.0.as_str() as *const str) }
-    }
+impl Group {
+    /// Creates a new `group` with the given delimiter and token stream.
+    ///
+    /// This constructor will set the span for this group to
+    /// `Span::call_site()`. To change the span you can use the `set_span`
+    /// method below.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn new(delimiter: Delimiter, stream: TokenStream) -> Group {
+        Group {
+            delimiter: delimiter,
+            stream: stream,
+            span: Span::call_site(),
+        }
+    }
+
+    /// Returns the delimiter of this `Group`
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn delimiter(&self) -> Delimiter {
+        self.delimiter
+    }
+
+    /// Returns the `TokenStream` of tokens that are delimited in this `Group`.
+    ///
+    /// Note that the returned token stream does not include the delimiter
+    /// returned above.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn stream(&self) -> TokenStream {
+        self.stream.clone()
+    }
+
+    /// Returns the span for the delimiters of this token stream, spanning the
+    /// entire `Group`.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn span(&self) -> Span {
+        self.span
+    }
+
+    /// Configures the span for this `Group`'s delimiters, but not its internal
+    /// tokens.
+    ///
+    /// This method will **not** set the span of all the internal tokens spanned
+    /// by this group, but rather it will only set the span of the delimiter
+    /// tokens at the level of the `Group`.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn set_span(&mut self, span: Span) {
+        self.span = span;
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl fmt::Display for Group {
+    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+        TokenStream::from(TokenTree::from(self.clone())).fmt(f)
+    }
+}
+
+/// An `Op` is an operator like `+` or `-`, and only represents one character.
+///
+/// Operators like `+=` are represented as two instance of `Op` with different
+/// forms of `Spacing` returned.
+#[unstable(feature = "proc_macro", issue = "38356")]
+#[derive(Copy, Clone, Debug)]
+pub struct Op {
+    op: char,
+    spacing: Spacing,
+    span: Span,
 }
 
 /// Whether an `Op` is either followed immediately by another `Op` or followed by whitespace.
@@ -457,68 +607,285 @@ pub enum Spacing {
     Joint,
 }
 
-/// A literal character (`'a'`), string (`"hello"`), or number (`2.3`).
-#[derive(Clone, Debug)]
-#[unstable(feature = "proc_macro", issue = "38356")]
-pub struct Literal(token::Token);
-
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl fmt::Display for Literal {
-    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
-        TokenTree { kind: TokenNode::Literal(self.clone()), span: Span(DUMMY_SP) }.fmt(f)
-    }
-}
-
-macro_rules! int_literals {
-    ($($int_kind:ident),*) => {$(
-        /// Integer literal.
-        #[unstable(feature = "proc_macro", issue = "38356")]
-        pub fn $int_kind(n: $int_kind) -> Literal {
-            Literal::typed_integer(n as i128, stringify!($int_kind))
-        }
-    )*}
-}
+impl Op {
+    /// Creates a new `Op` from the given character and spacing.
+    ///
+    /// The returned `Op` will have the default span of `Span::call_site()`
+    /// which can be further configured with the `set_span` method below.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn new(op: char, spacing: Spacing) -> Op {
+        Op {
+            op: op,
+            spacing: spacing,
+            span: Span::call_site(),
+        }
+    }
+
+    /// Returns the character this operation represents, for example `'+'`
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn op(&self) -> char {
+        self.op
+    }
+
+    /// Returns the spacing of this operator, indicating whether it's a joint
+    /// operator with more operators coming next in the token stream or an
+    /// `Alone` meaning that the operator has ended.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn spacing(&self) -> Spacing {
+        self.spacing
+    }
+
+    /// Returns the span for this operator character
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn span(&self) -> Span {
+        self.span
+    }
+
+    /// Configure the span for this operator's character
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn set_span(&mut self, span: Span) {
+        self.span = span;
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl fmt::Display for Op {
+    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+        TokenStream::from(TokenTree::from(self.clone())).fmt(f)
+    }
+}
+
+/// An interned string.
+#[derive(Copy, Clone, Debug)]
+#[unstable(feature = "proc_macro", issue = "38356")]
+pub struct Term {
+    sym: Symbol,
+    span: Span,
+}
+
+impl Term {
+    /// Creates a new `Term` with the given `string` as well as the specified
+    /// `span`.
+    ///
+    /// Note that `span`, currently in rustc, configures the hygiene information
+    /// for this identifier. As of this time `Span::call_site()` explicitly
+    /// opts-in to **non-hygienic** information (aka copy/pasted code) while
+    /// spans like `Span::def_site()` will opt-in to hygienic information,
+    /// meaning that code at the call site of the macro can't access this
+    /// identifier.
+    ///
+    /// Due to the current importance of hygiene this constructor, unlike other
+    /// tokens, requires a `Span` to be specified at construction.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn new(string: &str, span: Span) -> Term {
+        Term {
+            sym: Symbol::intern(string),
+            span,
+        }
+    }
+
+    /// Get a reference to the interned string.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn as_str(&self) -> &str {
+        unsafe { &*(&*self.sym.as_str() as *const str) }
+    }
+
+    /// Returns the span of this `Term`, encompassing the entire string returned
+    /// by `as_str`.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn span(&self) -> Span {
+        self.span
+    }
+
+    /// Configures the span of this `Term`, possibly changing hygiene
+    /// information.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn set_span(&mut self, span: Span) {
+        self.span = span;
+    }
+}
+
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl fmt::Display for Term {
+    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+        self.as_str().fmt(f)
+    }
+}
+
+/// A literal character (`'a'`), string (`"hello"`), a number (`2.3`), etc.
+#[derive(Clone, Debug)]
+#[unstable(feature = "proc_macro", issue = "38356")]
+pub struct Literal {
+    token: token::Token,
+    span: Span,
+}
+
+macro_rules! suffixed_int_literals {
+    ($($name:ident => $kind:ident,)*) => ($(
+        /// Creates a new suffixed integer literal with the specified value.
+        ///
+        /// This function will create an integer like `1u32` where the integer
+        /// value specified is the first part of the token and the integral is
+        /// also suffixed at the end.
+        ///
+        /// Literals created through this method have the `Span::call_site()`
+        /// span by default, which can be configured with the `set_span` method
+        /// below.
+        #[unstable(feature = "proc_macro", issue = "38356")]
+        pub fn $name(n: $kind) -> Literal {
+            let lit = token::Lit::Integer(Symbol::intern(&n.to_string()));
+            let ty = Some(Symbol::intern(stringify!($kind)));
+            Literal {
+                token: token::Literal(lit, ty),
+                span: Span::call_site(),
+            }
+        }
+    )*)
+}
+
+macro_rules! unsuffixed_int_literals {
+    ($($name:ident => $kind:ident,)*) => ($(
+        /// Creates a new unsuffixed integer literal with the specified value.
+        ///
+        /// This function will create an integer like `1` where the integer
+        /// value specified is the first part of the token. No suffix is
+        /// specified on this token, meaning that invocations like
+        /// `Literal::i8_unsuffixed(1)` are equivalent to
+        /// `Literal::u32_unsuffixed(1)`.
+        ///
+        /// Literals created through this method have the `Span::call_site()`
+        /// span by default, which can be configured with the `set_span` method
+        /// below.
+        #[unstable(feature = "proc_macro", issue = "38356")]
+        pub fn $name(n: $kind) -> Literal {
+            let lit = token::Lit::Integer(Symbol::intern(&n.to_string()));
+            Literal {
+                token: token::Literal(lit, None),
+                span: Span::call_site(),
+            }
+        }
+    )*)
+}
 
 impl Literal {
-    /// Integer literal
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn integer(n: i128) -> Literal {
-        Literal(token::Literal(token::Lit::Integer(Symbol::intern(&n.to_string())), None))
-    }
-
-    int_literals!(u8, i8, u16, i16, u32, i32, u64, i64, usize, isize);
-
-    fn typed_integer(n: i128, kind: &'static str) -> Literal {
-        Literal(token::Literal(token::Lit::Integer(Symbol::intern(&n.to_string())),
-                               Some(Symbol::intern(kind))))
-    }
-
-    /// Floating point literal.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn float(n: f64) -> Literal {
-        if !n.is_finite() {
-            panic!("Invalid float literal {}", n);
-        }
-        Literal(token::Literal(token::Lit::Float(Symbol::intern(&n.to_string())), None))
-    }
-
-    /// Floating point literal.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn f32(n: f32) -> Literal {
-        if !n.is_finite() {
-            panic!("Invalid f32 literal {}", n);
-        }
-        Literal(token::Literal(token::Lit::Float(Symbol::intern(&n.to_string())),
-                               Some(Symbol::intern("f32"))))
-    }
-
-    /// Floating point literal.
-    #[unstable(feature = "proc_macro", issue = "38356")]
-    pub fn f64(n: f64) -> Literal {
-        if !n.is_finite() {
-            panic!("Invalid f64 literal {}", n);
-        }
-        Literal(token::Literal(token::Lit::Float(Symbol::intern(&n.to_string())),
-                               Some(Symbol::intern("f64"))))
-    }
+    suffixed_int_literals! {
+        u8_suffixed => u8,
+        u16_suffixed => u16,
+        u32_suffixed => u32,
+        u64_suffixed => u64,
+        u128_suffixed => u128,
+        usize_suffixed => usize,
+        i8_suffixed => i8,
+        i16_suffixed => i16,
+        i32_suffixed => i32,
+        i64_suffixed => i64,
+        i128_suffixed => i128,
+        isize_suffixed => isize,
+    }
+
+    unsuffixed_int_literals! {
+        u8_unsuffixed => u8,
+        u16_unsuffixed => u16,
+        u32_unsuffixed => u32,
+        u64_unsuffixed => u64,
+        u128_unsuffixed => u128,
+        usize_unsuffixed => usize,
+        i8_unsuffixed => i8,
+        i16_unsuffixed => i16,
+        i32_unsuffixed => i32,
+        i64_unsuffixed => i64,
+        i128_unsuffixed => i128,
+        isize_unsuffixed => isize,
+    }
+
+    /// Creates a new unsuffixed floating-point literal.
+    ///
+    /// This constructor is similar to those like `Literal::i8_unsuffixed` where
+    /// the float's value is emitted directly into the token but no suffix is
+    /// used, so it may be inferred to be a `f64` later in the compiler.
+    ///
+    /// # Panics
+    ///
+    /// This function requires that the specified float is finite, for
+    /// example if it is infinity or NaN this function will panic.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn f32_unsuffixed(n: f32) -> Literal {
+        if !n.is_finite() {
+            panic!("Invalid float literal {}", n);
+        }
+        let lit = token::Lit::Float(Symbol::intern(&n.to_string()));
+        Literal {
+            token: token::Literal(lit, None),
+            span: Span::call_site(),
+        }
+    }
+
+    /// Creates a new suffixed floating-point literal.
+    ///
+    /// This consturctor will create a literal like `1.0f32` where the value
+    /// specified is the preceding part of the token and `f32` is the suffix of
+    /// the token. This token will always be inferred to be an `f32` in the
+    /// compiler.
+    ///
+    /// # Panics
+    ///
+    /// This function requires that the specified float is finite, for
+    /// example if it is infinity or NaN this function will panic.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn f32_suffixed(n: f32) -> Literal {
+        if !n.is_finite() {
+            panic!("Invalid float literal {}", n);
+        }
+        let lit = token::Lit::Float(Symbol::intern(&n.to_string()));
+        Literal {
+            token: token::Literal(lit, Some(Symbol::intern("f32"))),
+            span: Span::call_site(),
+        }
+    }
+
+    /// Creates a new unsuffixed floating-point literal.
+    ///
+    /// This constructor is similar to those like `Literal::i8_unsuffixed` where
+    /// the float's value is emitted directly into the token but no suffix is
+    /// used, so it may be inferred to be a `f64` later in the compiler.
+    ///
+    /// # Panics
+    ///
+    /// This function requires that the specified float is finite, for
+    /// example if it is infinity or NaN this function will panic.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn f64_unsuffixed(n: f64) -> Literal {
+        if !n.is_finite() {
+            panic!("Invalid float literal {}", n);
+        }
+        let lit = token::Lit::Float(Symbol::intern(&n.to_string()));
+        Literal {
+            token: token::Literal(lit, None),
+            span: Span::call_site(),
+        }
+    }
+
+    /// Creates a new suffixed floating-point literal.
+    ///
+    /// This consturctor will create a literal like `1.0f64` where the value
+    /// specified is the preceding part of the token and `f64` is the suffix of
+    /// the token. This token will always be inferred to be an `f64` in the
+    /// compiler.
+    ///
+    /// # Panics
+    ///
+    /// This function requires that the specified float is finite, for
+    /// example if it is infinity or NaN this function will panic.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn f64_suffixed(n: f64) -> Literal {
+        if !n.is_finite() {
+            panic!("Invalid float literal {}", n);
+        }
+        let lit = token::Lit::Float(Symbol::intern(&n.to_string()));
+        Literal {
+            token: token::Literal(lit, Some(Symbol::intern("f64"))),
+            span: Span::call_site(),
+        }
+    }
 
     /// String literal.
@@ -528,7 +895,10 @@ impl Literal {
         for ch in string.chars() {
             escaped.extend(ch.escape_debug());
         }
-        Literal(token::Literal(token::Lit::Str_(Symbol::intern(&escaped)), None))
+        Literal {
+            token: token::Literal(token::Lit::Str_(Symbol::intern(&escaped)), None),
+            span: Span::call_site(),
+        }
     }
 
     /// Character literal.
@@ -536,7 +906,10 @@ impl Literal {
     pub fn character(ch: char) -> Literal {
         let mut escaped = String::new();
         escaped.extend(ch.escape_unicode());
-        Literal(token::Literal(token::Lit::Char(Symbol::intern(&escaped)), None))
+        Literal {
+            token: token::Literal(token::Lit::Char(Symbol::intern(&escaped)), None),
+            span: Span::call_site(),
+        }
     }
 
     /// Byte string literal.
@@ -544,36 +917,29 @@ impl Literal {
     pub fn byte_string(bytes: &[u8]) -> Literal {
         let string = bytes.iter().cloned().flat_map(ascii::escape_default)
                           .map(Into::<char>::into).collect::<String>();
-        Literal(token::Literal(token::Lit::ByteStr(Symbol::intern(&string)), None))
+        Literal {
+            token: token::Literal(token::Lit::ByteStr(Symbol::intern(&string)), None),
+            span: Span::call_site(),
+        }
+    }
+
+    /// Returns the span encompassing this literal.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn span(&self) -> Span {
+        self.span
+    }
+
+    /// Configures the span associated for this literal.
+    #[unstable(feature = "proc_macro", issue = "38356")]
+    pub fn set_span(&mut self, span: Span) {
+        self.span = span;
     }
 }
 
-/// An iterator over `TokenTree`s.
-#[derive(Clone)]
-#[unstable(feature = "proc_macro", issue = "38356")]
-pub struct TokenTreeIter {
-    cursor: tokenstream::Cursor,
-    stack: Vec<TokenTree>,
-}
-
-#[unstable(feature = "proc_macro", issue = "38356")]
-impl Iterator for TokenTreeIter {
-    type Item = TokenTree;
-
-    fn next(&mut self) -> Option<TokenTree> {
-        loop {
-            let tree = self.stack.pop().or_else(|| {
-                let next = self.cursor.next_as_stream()?;
-                Some(TokenTree::from_internal(next, &mut self.stack))
-            })?;
-            if tree.span.0 == DUMMY_SP {
-                if let TokenNode::Group(Delimiter::None, stream) = tree.kind {
-                    self.cursor.insert(stream.0);
-                    continue
-                }
-            }
-            return Some(tree);
-        }
+#[unstable(feature = "proc_macro", issue = "38356")]
+impl fmt::Display for Literal {
+    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+        TokenStream::from(TokenTree::from(self.clone())).fmt(f)
     }
 }
@@ -607,31 +973,34 @@ impl TokenTree {
             tokenstream::TokenTree::Token(span, token) => (span, token),
             tokenstream::TokenTree::Delimited(span, delimed) => {
                 let delimiter = Delimiter::from_internal(delimed.delim);
-                return TokenTree {
-                    span: Span(span),
-                    kind: TokenNode::Group(delimiter, TokenStream(delimed.tts.into())),
-                };
+                let mut g = Group::new(delimiter, TokenStream(delimed.tts.into()));
+                g.set_span(Span(span));
+                return g.into()
             }
         };
 
        let op_kind = if is_joint { Spacing::Joint } else { Spacing::Alone };
        macro_rules! tt {
-            ($e:expr) => (TokenTree { span: Span(span), kind: $e })
+            ($e:expr) => ({
+                let mut x = TokenTree::from($e);
+                x.set_span(Span(span));
+                x
+            })
        }
        macro_rules! op {
-            ($a:expr) => (TokenNode::Op($a, op_kind));
+            ($a:expr) => (tt!(Op::new($a, op_kind)));
            ($a:expr, $b:expr) => ({
-                stack.push(tt!(TokenNode::Op($b, op_kind).into()));
-                TokenNode::Op($a, Spacing::Joint)
+                stack.push(tt!(Op::new($b, op_kind)));
+                tt!(Op::new($a, Spacing::Joint))
            });
            ($a:expr, $b:expr, $c:expr) => ({
-                stack.push(tt!(TokenNode::Op($c, op_kind)));
-                stack.push(tt!(TokenNode::Op($b, Spacing::Joint)));
-                TokenNode::Op($a, Spacing::Joint)
+                stack.push(tt!(Op::new($c, op_kind)));
+                stack.push(tt!(Op::new($b, Spacing::Joint)));
+                tt!(Op::new($a, Spacing::Joint))
            })
        }

-        let kind = match token {
+        match token {
            Eq => op!('='),
            Lt => op!('<'),
            Le => op!('<', '='),
@@ -679,80 +1048,88 @@ impl TokenTree {
     Dollar => op!('$'),
     Question => op!('?'),
-    Ident(ident, false) | Lifetime(ident) => TokenNode::Term(Term(ident.name)),
-    Ident(ident, true) => TokenNode::Term(Term(Symbol::intern(&format!("r#{}", ident)))),
-    Literal(..) => TokenNode::Literal(self::Literal(token)),
+    Ident(ident, false) | Lifetime(ident) => {
+        tt!(Term::new(&ident.name.as_str(), Span(span)))
+    }
+    Ident(ident, true) => {
+        tt!(Term::new(&format!("r#{}", ident), Span(span)))
+    }
+    Literal(..) => tt!(self::Literal { token, span: Span(span) }),
     DocComment(c) => {
         let stream = vec![
-            tt!(TokenNode::Term(Term::intern("doc"))),
-            tt!(op!('=')),
-            tt!(TokenNode::Literal(self::Literal(Literal(Lit::Str_(c), None)))),
+            tt!(Term::new("doc", Span(span))),
+            tt!(Op::new('=', Spacing::Alone)),
+            tt!(self::Literal::string(&c.as_str())),
         ].into_iter().collect();
-        stack.push(tt!(TokenNode::Group(Delimiter::Bracket, stream)));
-        op!('#')
+        stack.push(tt!(Group::new(Delimiter::Bracket, stream)));
+        tt!(Op::new('#', Spacing::Alone))
     }
     Interpolated(_) => {
         __internal::with_sess(|(sess, _)| {
             let tts = token.interpolated_to_tokenstream(sess, span);
-            TokenNode::Group(Delimiter::None, TokenStream(tts))
+            tt!(Group::new(Delimiter::None, TokenStream(tts)))
         })
     }
     DotEq => op!('.', '='),
     OpenDelim(..) | CloseDelim(..) => unreachable!(),
     Whitespace | Comment | Shebang(..) | Eof => unreachable!(),
-};
-TokenTree { span: Span(span), kind: kind }
+}
 }

 fn to_internal(self) -> tokenstream::TokenStream {
     use syntax::parse::token::*;
     use syntax::tokenstream::{TokenTree, Delimited};

-    let (op, kind) = match self.kind {
-        TokenNode::Op(op, kind) => (op, kind),
-        TokenNode::Group(delimiter, tokens) => {
-            return TokenTree::Delimited(self.span.0, Delimited {
-                delim: delimiter.to_internal(),
-                tts: tokens.0.into(),
+    let (op, kind, span) = match self {
+        self::TokenTree::Op(tt) => (tt.op(), tt.spacing(), tt.span()),
+        self::TokenTree::Group(tt) => {
+            return TokenTree::Delimited(tt.span.0, Delimited {
+                delim: tt.delimiter.to_internal(),
+                tts: tt.stream.0.into(),
             }).into();
         },
-        TokenNode::Term(symbol) => {
-            let ident = ast::Ident { name: symbol.0, ctxt: self.span.0.ctxt() };
-            let sym_str = symbol.0.as_str();
+        self::TokenTree::Term(tt) => {
+            let ident = ast::Ident { name: tt.sym, ctxt: tt.span.0.ctxt() };
+            let sym_str = tt.sym.as_str();
             let token =
                 if sym_str.starts_with("'") { Lifetime(ident) }
                 else if sym_str.starts_with("r#") {
                     let name = Symbol::intern(&sym_str[2..]);
-                    let ident = ast::Ident { name, ctxt: self.span.0.ctxt() };
+                    let ident = ast::Ident { name, ctxt: tt.span.0.ctxt() };
                     Ident(ident, true)
                 } else { Ident(ident, false) };
-            return TokenTree::Token(self.span.0, token).into();
+            return TokenTree::Token(tt.span.0, token).into();
         }
-        TokenNode::Literal(self::Literal(Literal(Lit::Integer(ref a), b)))
+        self::TokenTree::Literal(self::Literal {
+            token: Literal(Lit::Integer(ref a), b),
+            span,
+        })
             if a.as_str().starts_with("-") =>
         {
             let minus = BinOp(BinOpToken::Minus);
             let integer = Symbol::intern(&a.as_str()[1..]);
             let integer = Literal(Lit::Integer(integer), b);
-            let a = TokenTree::Token(self.span.0, minus);
-            let b = TokenTree::Token(self.span.0, integer);
+            let a = TokenTree::Token(span.0, minus);
+            let b = TokenTree::Token(span.0, integer);
             return vec![a, b].into_iter().collect()
         }
-        TokenNode::Literal(self::Literal(Literal(Lit::Float(ref a), b)))
+        self::TokenTree::Literal(self::Literal {
+            token: Literal(Lit::Float(ref a), b),
+            span,
+        })
             if a.as_str().starts_with("-") =>
         {
             let minus = BinOp(BinOpToken::Minus);
             let float = Symbol::intern(&a.as_str()[1..]);
             let float = Literal(Lit::Float(float), b);
-            let a = TokenTree::Token(self.span.0, minus);
-            let b = TokenTree::Token(self.span.0, float);
+            let a = TokenTree::Token(span.0, minus);
+            let b = TokenTree::Token(span.0, float);
             return vec![a, b].into_iter().collect()
         }
-        TokenNode::Literal(token) => {
-            return TokenTree::Token(self.span.0, token.0).into()
+        self::TokenTree::Literal(tt) => {
+            return TokenTree::Token(tt.span.0, tt.token).into()
         }
     };
@@ -781,7 +1158,7 @@ impl TokenTree {
         _ => panic!("unsupported character {}", op),
     };

-    let tree = TokenTree::Token(self.span.0, token);
+    let tree = TokenTree::Token(span.0, token);
     match kind {
         Spacing::Alone => tree.into(),
         Spacing::Joint => tree.joint(),
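The two guarded `Literal` arms in `to_internal` above split a negative literal such as `-1i32` into a `-` op token followed by the positive remainder. A self-contained sketch of just that splitting rule, operating on plain strings rather than the real token types:

```rust
// Split a literal's textual form into the token texts it lowers to:
// a leading `-` becomes its own op token, followed by the positive
// remainder, mirroring the guarded `to_internal` arms above.
fn split_neg_literal(lit: &str) -> Vec<String> {
    if let Some(rest) = lit.strip_prefix('-') {
        vec!["-".to_string(), rest.to_string()]
    } else {
        // Non-negative literals stay a single token.
        vec![lit.to_string()]
    }
}

fn main() {
    assert_eq!(split_neg_literal("-1i32"), vec!["-", "1i32"]);
    assert_eq!(split_neg_literal("-1.0"), vec!["-", "1.0"]);
    assert_eq!(split_neg_literal("42"), vec!["42"]);
    println!("ok");
}
```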


@@ -14,7 +14,7 @@
 //! This quasiquoter uses macros 2.0 hygiene to reliably access
 //! items from `proc_macro`, to build a `proc_macro::TokenStream`.

-use {Delimiter, Literal, Spacing, Span, Term, TokenNode, TokenStream, TokenTree};
+use {Delimiter, Literal, Spacing, Span, Term, Op, Group, TokenStream, TokenTree};

 use syntax::ext::base::{ExtCtxt, ProcMacro};
 use syntax::parse::token;
@@ -23,47 +23,59 @@ use syntax::tokenstream;
 pub struct Quoter;

 pub fn unquote<T: Into<TokenStream> + Clone>(tokens: &T) -> TokenStream {
-    T::into(tokens.clone())
+    tokens.clone().into()
 }

 pub trait Quote {
     fn quote(self) -> TokenStream;
 }

+macro_rules! tt2ts {
+    ($e:expr) => (TokenStream::from(TokenTree::from($e)))
+}
+
 macro_rules! quote_tok {
-    (,) => { TokenNode::Op(',', Spacing::Alone) };
-    (.) => { TokenNode::Op('.', Spacing::Alone) };
-    (:) => { TokenNode::Op(':', Spacing::Alone) };
+    (,) => { tt2ts!(Op::new(',', Spacing::Alone)) };
+    (.) => { tt2ts!(Op::new('.', Spacing::Alone)) };
+    (:) => { tt2ts!(Op::new(':', Spacing::Alone)) };
+    (|) => { tt2ts!(Op::new('|', Spacing::Alone)) };
     (::) => {
         [
-            TokenNode::Op(':', Spacing::Joint),
-            TokenNode::Op(':', Spacing::Alone)
-        ].iter().cloned().collect::<TokenStream>()
+            TokenTree::from(Op::new(':', Spacing::Joint)),
+            TokenTree::from(Op::new(':', Spacing::Alone)),
+        ].iter()
+            .cloned()
+            .map(|mut x| {
+                x.set_span(Span::def_site());
+                x
+            })
+            .collect::<TokenStream>()
     };
-    (!) => { TokenNode::Op('!', Spacing::Alone) };
-    (<) => { TokenNode::Op('<', Spacing::Alone) };
-    (>) => { TokenNode::Op('>', Spacing::Alone) };
-    (_) => { TokenNode::Op('_', Spacing::Alone) };
-    (0) => { TokenNode::Literal(::Literal::integer(0)) };
-    (&) => { TokenNode::Op('&', Spacing::Alone) };
-    ($i:ident) => { TokenNode::Term(Term::intern(stringify!($i))) };
+    (!) => { tt2ts!(Op::new('!', Spacing::Alone)) };
+    (<) => { tt2ts!(Op::new('<', Spacing::Alone)) };
+    (>) => { tt2ts!(Op::new('>', Spacing::Alone)) };
+    (_) => { tt2ts!(Op::new('_', Spacing::Alone)) };
+    (0) => { tt2ts!(Literal::i8_unsuffixed(0)) };
+    (&) => { tt2ts!(Op::new('&', Spacing::Alone)) };
+    ($i:ident) => { tt2ts!(Term::new(stringify!($i), Span::def_site())) };
 }

 macro_rules! quote_tree {
     ((unquote $($t:tt)*)) => { $($t)* };
     ((quote $($t:tt)*)) => { ($($t)*).quote() };
-    (($($t:tt)*)) => { TokenNode::Group(Delimiter::Parenthesis, quote!($($t)*)) };
-    ([$($t:tt)*]) => { TokenNode::Group(Delimiter::Bracket, quote!($($t)*)) };
-    ({$($t:tt)*}) => { TokenNode::Group(Delimiter::Brace, quote!($($t)*)) };
+    (($($t:tt)*)) => { tt2ts!(Group::new(Delimiter::Parenthesis, quote!($($t)*))) };
+    ([$($t:tt)*]) => { tt2ts!(Group::new(Delimiter::Bracket, quote!($($t)*))) };
+    ({$($t:tt)*}) => { tt2ts!(Group::new(Delimiter::Brace, quote!($($t)*))) };
     ($t:tt) => { quote_tok!($t) };
 }

 macro_rules! quote {
     () => { TokenStream::empty() };
     ($($t:tt)*) => {
-        [
-            $(TokenStream::from(quote_tree!($t)),)*
-        ].iter().cloned().collect::<TokenStream>()
+        [$(quote_tree!($t),)*].iter()
+            .cloned()
+            .flat_map(|x| x.into_iter())
+            .collect::<TokenStream>()
     };
 }
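The reworked `quote!` expansion collects by flattening: each quoted fragment now expands to a whole `TokenStream`, so the fragments' trees are chained with `flat_map` before collecting. The same shape on ordinary vectors, as a sketch:

```rust
// Each quoted fragment expands to a whole stream (modeled here as a
// Vec<char>); flat_map chains the fragments' items before collecting,
// which is what `.flat_map(|x| x.into_iter()).collect()` above does
// for token streams.
fn join_streams(fragments: Vec<Vec<char>>) -> Vec<char> {
    fragments.into_iter().flat_map(|s| s.into_iter()).collect()
}

fn main() {
    let joined = join_streams(vec![vec!['f', 'n'], vec![' '], vec!['m', 'a', 'i', 'n']]);
    assert_eq!(joined, vec!['f', 'n', ' ', 'm', 'a', 'i', 'n']);
    println!("{:?}", joined);
}
```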
@@ -97,72 +109,81 @@ impl Quote for TokenStream {
         let tokens = self.into_iter().filter_map(|tree| {
             if after_dollar {
                 after_dollar = false;
-                match tree.kind {
-                    TokenNode::Term(_) => {
+                match tree {
+                    TokenTree::Term(_) => {
+                        let tree = TokenStream::from(tree);
                         return Some(quote!(::__internal::unquote(&(unquote tree)),));
                     }
-                    TokenNode::Op('$', _) => {}
+                    TokenTree::Op(ref tt) if tt.op() == '$' => {}
                     _ => panic!("`$` must be followed by an ident or `$` in `quote!`"),
                 }
-            } else if let TokenNode::Op('$', _) = tree.kind {
-                after_dollar = true;
-                return None;
+            } else if let TokenTree::Op(tt) = tree {
+                if tt.op() == '$' {
+                    after_dollar = true;
+                    return None;
+                }
             }

             Some(quote!(::TokenStream::from((quote tree)),))
-        }).collect::<TokenStream>();
+        }).flat_map(|t| t.into_iter()).collect::<TokenStream>();

         if after_dollar {
             panic!("unexpected trailing `$` in `quote!`");
         }

-        quote!([(unquote tokens)].iter().cloned().collect::<::TokenStream>())
+        quote!(
+            [(unquote tokens)].iter()
+                .cloned()
+                .flat_map(|x| x.into_iter())
+                .collect::<::TokenStream>()
+        )
     }
 }

 impl Quote for TokenTree {
     fn quote(self) -> TokenStream {
-        quote!(::TokenTree { span: (quote self.span), kind: (quote self.kind) })
-    }
-}
-
-impl Quote for TokenNode {
-    fn quote(self) -> TokenStream {
-        macro_rules! gen_match {
-            ($($i:ident($($arg:ident),+)),*) => {
-                match self {
-                    $(TokenNode::$i($($arg),+) => quote! {
-                        ::TokenNode::$i($((quote $arg)),+)
-                    },)*
-                }
-            }
-        }
-        gen_match! { Op(op, kind), Group(delim, tokens), Term(term), Literal(lit) }
+        match self {
+            TokenTree::Op(tt) => quote!(::TokenTree::Op( (quote tt) )),
+            TokenTree::Group(tt) => quote!(::TokenTree::Group( (quote tt) )),
+            TokenTree::Term(tt) => quote!(::TokenTree::Term( (quote tt) )),
+            TokenTree::Literal(tt) => quote!(::TokenTree::Literal( (quote tt) )),
+        }
     }
 }
 impl Quote for char {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::character(self)).into()
+        TokenTree::from(Literal::character(self)).into()
     }
 }

 impl<'a> Quote for &'a str {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::string(self)).into()
+        TokenTree::from(Literal::string(self)).into()
     }
 }

 impl Quote for usize {
     fn quote(self) -> TokenStream {
-        TokenNode::Literal(Literal::integer(self as i128)).into()
+        TokenTree::from(Literal::usize_unsuffixed(self)).into()
     }
 }

+impl Quote for Group {
+    fn quote(self) -> TokenStream {
+        quote!(::Group::new((quote self.delimiter()), (quote self.stream())))
+    }
+}
+
+impl Quote for Op {
+    fn quote(self) -> TokenStream {
+        quote!(::Op::new((quote self.op()), (quote self.spacing())))
+    }
+}
+
 impl Quote for Term {
     fn quote(self) -> TokenStream {
-        quote!(::Term::intern((quote self.as_str())))
+        quote!(::Term::new((quote self.as_str()), (quote self.span())))
     }
 }
@@ -182,14 +203,20 @@ macro_rules! literals {
         impl LiteralKind {
             pub fn with_contents_and_suffix(self, contents: Term, suffix: Option<Term>)
                 -> Literal {
-                let contents = contents.0;
-                let suffix = suffix.map(|t| t.0);
+                let sym = contents.sym;
+                let suffix = suffix.map(|t| t.sym);
                 match self {
                     $(LiteralKind::$i => {
-                        Literal(token::Literal(token::Lit::$i(contents), suffix))
+                        Literal {
+                            token: token::Literal(token::Lit::$i(sym), suffix),
+                            span: contents.span,
+                        }
                     })*
                     $(LiteralKind::$raw(n) => {
-                        Literal(token::Literal(token::Lit::$raw(contents, n), suffix))
+                        Literal {
+                            token: token::Literal(token::Lit::$raw(sym, n), suffix),
+                            span: contents.span,
+                        }
                     })*
                 }
             }
@@ -197,16 +224,17 @@ macro_rules! literals {
         impl Literal {
             fn kind_contents_and_suffix(self) -> (LiteralKind, Term, Option<Term>) {
-                let (lit, suffix) = match self.0 {
+                let (lit, suffix) = match self.token {
                     token::Literal(lit, suffix) => (lit, suffix),
-                    _ => panic!("unsupported literal {:?}", self.0),
+                    _ => panic!("unsupported literal {:?}", self.token),
                 };

                 let (kind, contents) = match lit {
                     $(token::Lit::$i(contents) => (LiteralKind::$i, contents),)*
                     $(token::Lit::$raw(contents, n) => (LiteralKind::$raw(n), contents),)*
                 };
-                (kind, Term(contents), suffix.map(Term))
+                let suffix = suffix.map(|sym| Term::new(&sym.as_str(), self.span()));
+                (kind, Term::new(&contents.as_str(), self.span()), suffix)
             }
         }


@@ -16,7 +16,7 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, TokenTree, TokenNode, Delimiter, Literal, Spacing};
+use proc_macro::{TokenStream, TokenTree, Delimiter, Literal, Spacing, Group};

 #[proc_macro_attribute]
 pub fn foo(attr: TokenStream, input: TokenStream) -> TokenStream {
@@ -52,24 +52,30 @@ pub fn bar(attr: TokenStream, input: TokenStream) -> TokenStream {
 }

 fn assert_inline(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', _) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
         _ => panic!("expected '#' char"),
     }
-    match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, _) => {}
+    match &slice[1] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Bracket),
         _ => panic!("expected brackets"),
     }
     *slice = &slice[2..];
 }

 fn assert_doc(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', Spacing::Alone) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => {
+            assert_eq!(tt.op(), '#');
+            assert_eq!(tt.spacing(), Spacing::Alone);
+        }
         _ => panic!("expected #"),
     }
-    let inner = match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, ref s) => s.clone(),
+    let inner = match &slice[1] {
+        TokenTree::Group(tt) => {
+            assert_eq!(tt.delimiter(), Delimiter::Bracket);
+            tt.stream()
+        }
         _ => panic!("expected brackets"),
     };
     let tokens = inner.into_iter().collect::<Vec<_>>();
@@ -79,16 +85,19 @@ fn assert_doc(slice: &mut &[TokenTree]) {
         panic!("expected three tokens in doc")
     }

-    match tokens[0].kind {
-        TokenNode::Term(ref t) => assert_eq!("doc", t.as_str()),
+    match &tokens[0] {
+        TokenTree::Term(tt) => assert_eq!("doc", tt.as_str()),
         _ => panic!("expected `doc`"),
     }
-    match tokens[1].kind {
-        TokenNode::Op('=', Spacing::Alone) => {}
+    match &tokens[1] {
+        TokenTree::Op(tt) => {
+            assert_eq!(tt.op(), '=');
+            assert_eq!(tt.spacing(), Spacing::Alone);
+        }
         _ => panic!("expected equals"),
     }
-    match tokens[2].kind {
-        TokenNode::Literal(_) => {}
+    match tokens[2] {
+        TokenTree::Literal(_) => {}
         _ => panic!("expected literal"),
     }
@@ -96,32 +105,35 @@ fn assert_doc(slice: &mut &[TokenTree]) {
 }

 fn assert_invoc(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Op('#', _) => {}
+    match &slice[0] {
+        TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
         _ => panic!("expected '#' char"),
     }
-    match slice[1].kind {
-        TokenNode::Group(Delimiter::Bracket, _) => {}
+    match &slice[1] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Bracket),
         _ => panic!("expected brackets"),
     }
     *slice = &slice[2..];
 }

 fn assert_foo(slice: &mut &[TokenTree]) {
-    match slice[0].kind {
-        TokenNode::Term(ref name) => assert_eq!(name.as_str(), "fn"),
+    match &slice[0] {
+        TokenTree::Term(tt) => assert_eq!(tt.as_str(), "fn"),
         _ => panic!("expected fn"),
     }
-    match slice[1].kind {
-        TokenNode::Term(ref name) => assert_eq!(name.as_str(), "foo"),
+    match &slice[1] {
+        TokenTree::Term(tt) => assert_eq!(tt.as_str(), "foo"),
         _ => panic!("expected foo"),
     }
-    match slice[2].kind {
-        TokenNode::Group(Delimiter::Parenthesis, ref s) => assert!(s.is_empty()),
+    match &slice[2] {
+        TokenTree::Group(tt) => {
+            assert_eq!(tt.delimiter(), Delimiter::Parenthesis);
+            assert!(tt.stream().is_empty());
+        }
         _ => panic!("expected parens"),
     }
-    match slice[3].kind {
-        TokenNode::Group(Delimiter::Brace, _) => {}
+    match &slice[3] {
+        TokenTree::Group(tt) => assert_eq!(tt.delimiter(), Delimiter::Brace),
         _ => panic!("expected braces"),
     }
     *slice = &slice[4..];
@@ -132,22 +144,17 @@ fn fold_stream(input: TokenStream) -> TokenStream {
 }

 fn fold_tree(input: TokenTree) -> TokenTree {
-    TokenTree {
-        span: input.span,
-        kind: fold_node(input.kind),
-    }
-}
-
-fn fold_node(input: TokenNode) -> TokenNode {
     match input {
-        TokenNode::Group(a, b) => TokenNode::Group(a, fold_stream(b)),
-        TokenNode::Op(a, b) => TokenNode::Op(a, b),
-        TokenNode::Term(a) => TokenNode::Term(a),
-        TokenNode::Literal(a) => {
+        TokenTree::Group(b) => {
+            TokenTree::Group(Group::new(b.delimiter(), fold_stream(b.stream())))
+        }
+        TokenTree::Op(b) => TokenTree::Op(b),
+        TokenTree::Term(a) => TokenTree::Term(a),
+        TokenTree::Literal(a) => {
             if a.to_string() != "\"foo\"" {
-                TokenNode::Literal(a)
+                TokenTree::Literal(a)
             } else {
-                TokenNode::Literal(Literal::integer(3))
+                TokenTree::Literal(Literal::i32_unsuffixed(3))
             }
         }
     }


@@ -15,15 +15,15 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, TokenNode, quote};
+use proc_macro::*;

 #[proc_macro]
 pub fn cond(input: TokenStream) -> TokenStream {
     let mut conds = Vec::new();
     let mut input = input.into_iter().peekable();
     while let Some(tree) = input.next() {
-        let cond = match tree.kind {
-            TokenNode::Group(_, cond) => cond,
+        let cond = match tree {
+            TokenTree::Group(tt) => tt.stream(),
             _ => panic!("Invalid input"),
         };
         let mut cond_trees = cond.clone().into_iter();
@@ -32,8 +32,8 @@ pub fn cond(input: TokenStream) -> TokenStream {
         if rhs.is_empty() {
             panic!("Invalid macro usage in cond: {}", cond);
         }
-        let is_else = match test.kind {
-            TokenNode::Term(word) => word.as_str() == "else",
+        let is_else = match test {
+            TokenTree::Term(word) => word.as_str() == "else",
             _ => false,
         };
         conds.push(if is_else || input.peek().is_none() {
@@ -43,5 +43,5 @@ pub fn cond(input: TokenStream) -> TokenStream {
         });
     }

-    conds.into_iter().collect()
+    conds.into_iter().flat_map(|x| x.into_iter()).collect()
 }


@@ -15,7 +15,7 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, quote};
+use proc_macro::*;

 #[proc_macro_attribute]
 pub fn attr_tru(_attr: TokenStream, item: TokenStream) -> TokenStream {


@@ -15,20 +15,25 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, TokenNode, Spacing, Literal, quote};
+use proc_macro::{TokenStream, TokenTree, Spacing, Literal, quote};

 #[proc_macro]
 pub fn count_compound_ops(input: TokenStream) -> TokenStream {
     assert_eq!(count_compound_ops_helper(quote!(++ (&&) 4@a)), 3);
-    TokenNode::Literal(Literal::u32(count_compound_ops_helper(input))).into()
+    let l = Literal::u32_suffixed(count_compound_ops_helper(input));
+    TokenTree::from(l).into()
 }

 fn count_compound_ops_helper(input: TokenStream) -> u32 {
     let mut count = 0;
     for token in input {
-        match token.kind {
-            TokenNode::Op(c, Spacing::Alone) => count += 1,
-            TokenNode::Group(_, tokens) => count += count_compound_ops_helper(tokens),
+        match &token {
+            TokenTree::Op(tt) if tt.spacing() == Spacing::Alone => {
+                count += 1;
+            }
+            TokenTree::Group(tt) => {
+                count += count_compound_ops_helper(tt.stream());
+            }
             _ => {}
         }
     }


@@ -19,16 +19,10 @@ use proc_macro::*;
 #[proc_macro]
 pub fn neg_one(_input: TokenStream) -> TokenStream {
-    TokenTree {
-        span: Span::call_site(),
-        kind: TokenNode::Literal(Literal::i32(-1)),
-    }.into()
+    TokenTree::Literal(Literal::i32_suffixed(-1)).into()
 }

 #[proc_macro]
 pub fn neg_one_float(_input: TokenStream) -> TokenStream {
-    TokenTree {
-        span: Span::call_site(),
-        kind: TokenNode::Literal(Literal::f32(-1.0)),
-    }.into()
+    TokenTree::Literal(Literal::f32_suffixed(-1.0)).into()
 }
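The `neg_one` tests above use the new explicit constructors; the `_suffixed`/`_unsuffixed` split determines whether the rendered token carries its type. A sketch of that effect on the token text, using our own toy renderer rather than the real `Literal` implementation:

```rust
// Toy rendering of the suffixed/unsuffixed distinction: a suffixed
// literal embeds its type name in the token text, an unsuffixed one
// leaves the type to be inferred from context.
fn render_i32_suffixed(n: i32) -> String {
    format!("{}i32", n)
}

fn render_i32_unsuffixed(n: i32) -> String {
    n.to_string()
}

fn main() {
    assert_eq!(render_i32_suffixed(-1), "-1i32");
    assert_eq!(render_i32_unsuffixed(-1), "-1");
    println!("ok");
}
```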


@@ -27,7 +27,7 @@ pub fn reemit(input: TokenStream) -> TokenStream {
 #[proc_macro]
 pub fn assert_fake_source_file(input: TokenStream) -> TokenStream {
     for tk in input {
-        let source_file = tk.span.source_file();
+        let source_file = tk.span().source_file();
         assert!(!source_file.is_real(), "Source file is real: {:?}", source_file);
     }
@@ -37,7 +37,7 @@ pub fn assert_fake_source_file(input: TokenStream) -> TokenStream {
 #[proc_macro]
 pub fn assert_source_file(input: TokenStream) -> TokenStream {
     for tk in input {
-        let source_file = tk.span.source_file();
+        let source_file = tk.span().source_file();
         assert!(source_file.is_real(), "Source file is not real: {:?}", source_file);
     }


@@ -14,12 +14,12 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, TokenTree, TokenNode, Span};
+use proc_macro::{TokenStream, TokenTree, Span};

 fn lit_span(tt: TokenTree) -> (Span, String) {
-    use TokenNode::*;
-    match tt.kind {
-        Literal(..) | Group(..) => (tt.span, tt.to_string().trim().into()),
+    match tt {
+        TokenTree::Literal(..) |
+        TokenTree::Group(..) => (tt.span(), tt.to_string().trim().into()),
         _ => panic!("expected a literal in token tree, got: {:?}", tt)
     }
 }


@@ -14,26 +14,27 @@
 extern crate proc_macro;

-use proc_macro::{TokenStream, TokenNode, Span, Diagnostic};
+use proc_macro::{TokenStream, TokenTree, Span, Diagnostic};

 fn parse(input: TokenStream) -> Result<(), Diagnostic> {
     let mut count = 0;
     let mut last_span = Span::def_site();
     for tree in input {
-        let span = tree.span;
+        let span = tree.span();
         if count >= 3 {
             return Err(span.error(format!("expected EOF, found `{}`.", tree))
                 .span_note(last_span, "last good input was here")
                 .help("input must be: `===`"))
         }

-        if let TokenNode::Op('=', _) = tree.kind {
+        if let TokenTree::Op(tt) = tree {
+            if tt.op() == '=' {
                 count += 1;
+            } else {
+                return Err(span.error(format!("expected `=`, found `{}`.", tree)));
+            }
             last_span = span;
+            continue
         }
+
+        return Err(span.error(format!("expected `=`, found `{}`.", tree)));
     }

     if count < 3 {