Revert "Highlight conflicting param-env candidates"
This reverts #98794, commit 08135254dc.
Seems to have caused an incremental compilation bug. The root cause of the incr-comp bug is somewhat unrelated but is triggered by this PR, so I don't feel comfortable keeping this PR in the codebase until it can be investigated further. Fixes #99233.
don't succeed `evaluate_obligation` query if new opaque types were registered
fixes #98608, fixes #98604
The root cause of all this is that type flag computation entirely ignores non-generic things like struct fields and the signatures of function items. So even if a flag is supposed to be set on a struct whenever it is set on one of its fields, that only happens when the field type mentions a generic parameter, because only the generic parameters are checked.
I now believe we cannot use type flags to handle opaque types. They seem like the wrong tool for this.
Instead, this PR replaces the previous logic with a new `EvaluatedToOk`-like variant, `EvaluatedToOkModuloOpaqueTypes`, which says that some opaque types had hidden types bound, but that the binding may not have been legal (because we don't know whether the opaque type was in its defining scope or not).
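As a rough sketch of the shape of the change (the surrounding variants are my reading of the existing enum and may not match the exact rustc definition):

```rust
// Sketch only, not the exact rustc definition of `EvaluationResult`.
// The new variant records that evaluation succeeded, but only modulo the
// hidden types it registered for opaque types, which may not be legal
// bindings outside the opaque type's defining scope.
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum EvaluationResult {
    /// Evaluation succeeded.
    EvaluatedToOk,
    /// Evaluation succeeded, modulo some unevaluated region obligations.
    EvaluatedToOkModuloRegions,
    /// Evaluation succeeded, but hidden types were bound for opaque types
    /// and those bindings may not be legal (new in this PR).
    EvaluatedToOkModuloOpaqueTypes,
    /// Evaluation is known to be ambiguous.
    EvaluatedToAmbig,
    // ...remaining variants unchanged...
}
```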
Check ADT field is well-formed before checking it is sized
Fixes #96810.
There is one diagnostics regression, in [`src/test/ui/generic-associated-types/bugs/issue-80626.stderr`](https://github.com/rust-lang/rust/pull/97780/files#diff-53795946378e78a0af23a10277c628ff79091c18090fdc385801ee70c1ba6963). I am not super concerned about it, since it's GAT related.
We _could_ fix it, possibly by using the `FieldSized` obligation cause code instead of `BuiltinDerivedObligation`. But that would require changing `Sized` trait confirmation and the `adt_sized_constraint` query.
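For illustration only (this is not the reproducer from #96810), here is the kind of ordering the PR changes: the field type below is ill-formed because `T: Tr` is never required, and that error should be reported before we ever ask whether the field is `Sized`.

```rust
// Hypothetical example, not the test case from #96810: `<T as Tr>::Assoc`
// is not well-formed without a `T: Tr` bound, so WF checking should reject
// the field before the `Sized` check ever runs on it.
trait Tr {
    type Assoc;
}

struct S<T> {
    // error[E0277]: the trait bound `T: Tr` is not satisfied
    field: <T as Tr>::Assoc,
}
```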
Do not use `ParamEnv::and` when building a cache key from a param-env and trait eval candidate
Do not use `ParamEnv::and` to cache a param-env with a selection/evaluation candidate.
This is because if the param-env is in `RevealAll` mode and the candidate looks global (i.e. it has erased regions, which can show up when we normalize a projection type under a binder<sup>1</sup>), then when we use `ParamEnv::and` to pair the candidate and the param-env for use as a cache key, we throw away the param-env's caller bounds. We then end up caching a candidate that was inferred from the param-env under an empty param-env, which may cause a cache hit later when we actually do have an empty param-env, and possibly mess with normalization during codegen, as seen in the referenced issue.
Not sure how to trigger this with a more structured test, but changing `check-pass` to `build-pass` triggers the case that https://github.com/rust-lang/rust/issues/94903 detected.
<sup>1.</sup> That is, we replace the late-bound region with a placeholder, which gets canonicalized and turned into an inference variable, which then gets erased during region freshening right before we cache the result. Sorry, it's quite a few steps.
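To make the failure mode concrete, here is a self-contained toy model of the cache-key mix-up. None of these types are the real rustc ones, and `and` only mimics the behaviour of `ParamEnv::and` described above:

```rust
// Toy model of the caching bug: when the paired value looks global,
// the caller bounds are dropped from the cache key.
use std::collections::HashMap;

#[derive(Clone, Hash, PartialEq, Eq, Debug)]
struct ParamEnv {
    caller_bounds: Vec<&'static str>,
}

#[derive(Clone, Hash, PartialEq, Eq, Debug)]
struct Candidate {
    looks_global: bool,
}

fn and(env: &ParamEnv, candidate: &Candidate) -> (ParamEnv, Candidate) {
    if candidate.looks_global {
        // Caller bounds are thrown away, even though the candidate may only
        // hold *because of* those bounds.
        (ParamEnv { caller_bounds: vec![] }, candidate.clone())
    } else {
        (env.clone(), candidate.clone())
    }
}

fn main() {
    let mut cache: HashMap<(ParamEnv, Candidate), &'static str> = HashMap::new();

    let rich_env = ParamEnv { caller_bounds: vec!["T: Trait"] };
    let empty_env = ParamEnv { caller_bounds: vec![] };
    // A candidate that was inferred from `rich_env`'s caller bounds but has
    // erased regions, so it "looks" global.
    let candidate = Candidate { looks_global: true };

    // Cache a result derived under the rich param-env...
    cache.insert(and(&rich_env, &candidate), "EvaluatedToOk");

    // ...and observe a spurious hit when querying with an empty param-env.
    assert!(cache.contains_key(&and(&empty_env, &candidate)));
}
```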
Fixes #94903
r? `@Aaron1011` (or reassign as you see fit)
Rename `DtorckConstraint` to `DropckConstraint`
This change was made on the suspicion that this struct was simply never renamed once the codebase consistently adopted the `DropCk` spelling.
This also clarifies the meaning behind the name of this structure.
Fixes https://github.com/rust-lang/rust/issues/94310
Instead of probing for all possible impls that could have caused an
`ImplObligation`, keep track of its `DefId` and obligation spans for
accurate error reporting.
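Roughly, the shape of the change is sketched below with placeholder types (not the actual rustc definitions): the cause of an impl-derived obligation now carries the originating impl's `DefId` and span, so diagnostics can point at the right impl without re-probing all impls.

```rust
// Sketch with placeholder types, not the rustc ones: record the originating
// impl and span on the obligation cause when the nested obligation is
// created, instead of probing all impls at error-reporting time.
#[derive(Clone, Copy, Debug)]
struct DefId(u32); // stand-in for rustc's DefId

#[derive(Clone, Copy, Debug)]
struct Span(u32, u32); // stand-in for rustc's Span

#[derive(Debug)]
enum ObligationCauseCode {
    Misc,
    /// An obligation derived from a specific impl's where-clauses.
    ImplDerived {
        impl_def_id: DefId,
        span: Span,
        parent: Box<ObligationCauseCode>,
    },
}

fn main() {
    // When selection confirms an impl, its nested obligations are tagged
    // with the impl's DefId and the span of the bound that produced them.
    let cause = ObligationCauseCode::ImplDerived {
        impl_def_id: DefId(42),
        span: Span(10, 25),
        parent: Box::new(ObligationCauseCode::Misc),
    };
    println!("{cause:?}");
}
```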
Follow-up to #89580. Addresses #89418.
Remove some unnecessary clones.
Tweak output for auto trait impl obligations.