Use `ty::Binder` in rustdoc instead of `skip_binder`
r? `@GuillaumeGomez`
This is a preliminary cleanup required to be able to normalize correctly and conveniently in rustdoc.
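To illustrate why keeping the binder around matters, here is a toy sketch (not rustc's actual `ty::Binder`, which lives in `rustc_middle` and tracks bound lifetime/type variables): transforming a value through the binder preserves the bound-variable context that later normalization needs, whereas `skip_binder` throws it away.
```rust
// Toy stand-in for rustc's `ty::Binder`; illustration only.
struct Binder<T> {
    bound_vars: Vec<&'static str>,
    value: T,
}

impl<T> Binder<T> {
    // Discards the information about which variables are bound.
    fn skip_binder(self) -> T {
        self.value
    }

    // Transforms the inner value while keeping the bound variables attached.
    fn map_bound<U>(self, f: impl FnOnce(T) -> U) -> Binder<U> {
        Binder { bound_vars: self.bound_vars, value: f(self.value) }
    }
}

fn main() {
    let sig = Binder { bound_vars: vec!["'a"], value: ("&'a u32", "&'a u32") };
    // Working through the binder keeps track of the bound vars...
    let ret_ty = sig.map_bound(|(_, ret)| ret);
    println!("for<{}> {}", ret_ty.bound_vars.join(", "), ret_ty.value);
    // ...whereas `skip_binder` throws that context away.
    let bare: &str = ret_ty.skip_binder();
    println!("{bare}");
}
```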
If we have a call such as `foo(&mut buf)` and, after reference collapsing, the type is inferred as `&T` whereas the required type is `&mut T`, don't suggest `foo(&mut mut buf)`. That suggestion is syntactically invalid, and the underlying problem lies elsewhere, not in the borrow.
Fixes #105645
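A hypothetical sketch of the shape of code that hits this path (not necessarily the reproducer from the linked issue): the argument is already written as `&mut buf`, but `buf` is itself a shared reference, so adding another `mut` to the borrow cannot help.
```rust
// Hypothetical illustration; not necessarily the reproducer from #105645.
fn foo(_: &mut i32) {}

fn main() {
    let v = vec![1, 2, 3];
    let mut buf = v.first().unwrap(); // `buf` is `&i32`, a shared reference
    // `&mut buf` is `&mut &i32`; after reference collapsing that is at best
    // `&i32`, while `foo` requires `&mut i32`. The borrow already uses
    // `&mut`, so a suggestion that inserts another `mut`
    // (`foo(&mut mut buf)`) cannot be right: the problem is `buf`'s type,
    // not the borrow.
    foo(&mut buf); // ERROR: mismatched types
}
```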
Fix cases where the "invalid base prefix for number literal" error was emitted for suffixes that merely look like erroneously capitalized base prefixes but are in fact invalid suffixes.
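A hedged illustration of the intended distinction, assuming the problematic literals are ones whose suffix merely begins with an uppercase `B`/`O`/`X`:
```rust
fn main() {
    // A genuinely miscapitalized base prefix: lowercasing the `X` yields a
    // valid hexadecimal literal, so the lowercase suggestion is helpful.
    let _a = 0XABC; // error today: invalid base prefix; `0xABC` would compile

    // Assumed example of the kind of literal this change targets: `XYZ` is
    // just an invalid suffix on the integer `0`, and lowercasing the `X`
    // would not produce a valid literal, so "invalid base prefix" is the
    // wrong diagnosis here.
    let _b = 0XYZ;
}
```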
Inline and remove `place_contents_drop_state_cannot_differ`.
It has a single call site and is hot enough to be worth inlining. Also make sure `is_terminal_path` is inlined.
r? `@ghost`
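As an illustrative sketch only (the names and bodies below are stand-ins, not the real rustc functions), the pattern is: fold the single-caller helper's body into its caller, and mark the small predicate `#[inline]` so it keeps being inlined:
```rust
// Illustrative stand-ins; not the real rustc functions.
#[inline] // tiny, hot predicate: make sure it keeps getting inlined
fn is_terminal_path(path: &[usize]) -> bool {
    path.len() == 1
}

fn lookup_drop_state(path: &[usize]) -> bool {
    // The former single-call-site helper's body is written out here
    // directly instead of sitting behind another (possibly non-inlined)
    // function call.
    path.is_empty() || is_terminal_path(path)
}

fn main() {
    println!("{}", lookup_drop_state(&[7]));
}
```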
Point out the type of associated types at every method call in iterator chains
Partially address #105184 by pointing out the type of relevant associated types (here, `Iterator::Item`) at every method call in an iterator chain:
```
note: the expression is of type `Map<std::slice::Iter<'_, {integer}>, [closure@src/test/ui/iterators/invalid-iterator-chain.rs:12:18: 12:21]>`
--> src/test/ui/iterators/invalid-iterator-chain.rs:12:14
|
10 | vec![0, 1]
| ---------- this expression has type `Vec<{integer}>`
11 | .iter()
| ------ associated type `std::iter::Iterator::Item` is `&{integer}` here
12 | .map(|x| { x; })
| ^^^^^^^^^^^^^^^ associated type `std::iter::Iterator::Item` is `()` here
```
We also reduce the number of impls we mention when any of the candidates is an "exact match". This greatly improves the output in cases involving numeric types.
Outstanding work would be to provide a structured suggestion for the appropriate change, like detecting the spurious `;` in the closure in this case.
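For reference, a sketch of the kind of source that produces the note above, reconstructed from the rendered output (the trailing `collect` is an assumption about how the full test forces the error):
```rust
fn main() {
    let _scores: Vec<i32> = vec![0, 1]
        .iter()
        .map(|x| { x; }) // the spurious `;` turns `Item` into `()`
        .collect(); // ERROR: `Vec<i32>` cannot be built from an iterator
                    // over elements of type `()`
}
```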
Previously, we used spans to pick the best matching arm. This was not reliable: the span of the token that failed to match may point to a position later in the source file even though that token was transcribed into the token stream much earlier. If precisely this token fails to match, we would consider its arm the best match because its span is furthest along, even though other arms may have gotten further through the token stream.
We now use the token's position in the token stream instead.
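A minimal hypothetical example of how arm selection comes into play when no rule matches (not taken from the compiler's test suite):
```rust
// Hypothetical illustration of arm selection when no rule matches.
macro_rules! m {
    (foo $e:expr) => { $e };
    (bar $a:ident, $b:ident) => { 0 };
}

fn main() {
    // The first arm fails on its very first token (`bar` is not `foo`);
    // the second arm consumes `bar` and `a` before failing on `+`.
    // Measuring "which arm got furthest" by position in the token stream,
    // rather than by the failing token's span, keeps the error attributed
    // to the second arm even when transcription has shuffled the spans.
    let _ = m!(bar a + b); // ERROR: no rules expected the token `+`
}
```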