
Fix #10605: Avoid mismatched constraints when prototypes are context … #10916


Merged (1 commit, Jan 6, 2021)

18 changes: 9 additions & 9 deletions compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala
@@ -38,13 +38,13 @@ object ProtoTypes {
   def isCompatible(tp: Type, pt: Type)(using Context): Boolean =
     (tp.widenExpr relaxed_<:< pt.widenExpr) || viewExists(tp, pt)
 
-  /** Like isCompatibe, but using a subtype comparison with necessary eithers
-   *  that don't unnecessarily truncate the constraint space, returning false instead.
+  /** Like normalize and then isCompatible, but using a subtype comparison with
+   *  necessary eithers that does not unnecessarily truncate the constraint space,
+   *  returning false instead.
    */
   def necessarilyCompatible(tp: Type, pt: Type)(using Context): Boolean =
-    val tpw = tp.widenExpr
-    val ptw = pt.widenExpr
-    necessarySubType(tpw, ptw) || tpw.isValueSubType(ptw) || viewExists(tp, pt)
+    val tpn = normalize(tp, pt, followIFT = !defn.isContextFunctionType(pt))
+    necessarySubType(tpn, pt) || tpn.isValueSubType(pt) || viewExists(tpn, pt)

Contributor:
Sorry for so many questions on such a simple PR. But I noticed that you don't widen pt here anymore. Can't this change the result of comparing the types for other examples?

Contributor (Author):
In theory it could, but in practice it did not. Conceptually, the correct thing is not to widen here, so unless we see an example where it causes a problem, that should be the default.


   /** Test compatibility after normalization.
    *  Do this in a fresh typerstate unless `keepConstraint` is true.
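
A minimal user-level sketch may help to see the distinction that the new `followIFT = !defn.isContextFunctionType(pt)` argument encodes: whether the expected type is itself a context function type. The names below (`compute`, `useResult`, `useCtxFun`) are invented for illustration and are not part of this PR or of the compiler.

```scala
// Illustrative user-level code (hypothetical names), not compiler internals.
def compute(using String, Int): Boolean = summon[String].length > summon[Int]

// 1. The expected type is NOT a context function type: the contextual
//    parameters get supplied and only the result type Boolean matters.
def useResult(b: Boolean): Boolean = b
val r1: Boolean = useResult(compute(using "abc", 2))

// 2. The expected type IS a context function type: `compute` has to be
//    matched against the prototype (String, Int) ?=> Boolean as a whole
//    rather than being unwrapped to its result type Boolean.
def useCtxFun(f: (String, Int) ?=> Boolean): Boolean = f(using "abc", 2)
val r2: Boolean = useCtxFun(compute)
```
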
@@ -84,9 +84,9 @@ object ProtoTypes {
           case _ => true
         }
       case _: ValueTypeOrProto if !disregardProto(pt) =>
-        necessarilyCompatible(normalize(mt, pt), pt)
+        necessarilyCompatible(mt, pt)
       case pt: WildcardType if pt.optBounds.exists =>
-        necessarilyCompatible(normalize(mt, pt), pt)
+        necessarilyCompatible(mt, pt)
       case _ =>
         true
     }
@@ -607,7 +607,7 @@ object ProtoTypes {
    *  of toString method. The problem is solved by dereferencing nullary method types if the corresponding
    *  function type is not compatible with the prototype.
    */
-  def normalize(tp: Type, pt: Type)(using Context): Type = {
+  def normalize(tp: Type, pt: Type, followIFT: Boolean = true)(using Context): Type = {
     Stats.record("normalize")
     tp.widenSingleton match {
       case poly: PolyType =>
@@ -632,7 +632,7 @@
         normalize(et.resultType, pt)

Contributor:
Just to be sure, adding the followIFT argument will only affect the fall-through case and will be dropped for recursive calls. Is this intended?

Contributor (Author):
Yes, because all recursive normalize calls are against the same pt, which determined the value of followIFT.

Contributor (@b-studios), Jan 5, 2021:
I just wondered whether the check can be inlined at line 635 to

if iftp.exists && !defn.isContextFunctionType(pt) then ...

to avoid this additional argument. But this would not be a semantics-preserving refactoring, since the argument followIFT defaults to true for the next recursive call.

Contributor (Author):
Yes, exactly.

       case wtp =>
         val iftp = defn.asContextFunctionType(wtp)
-        if iftp.exists then normalize(iftp.dropDependentRefinement.argInfos.last, pt)
+        if iftp.exists && followIFT then normalize(iftp.dropDependentRefinement.argInfos.last, pt)
         else tp
     }
   }
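
The behaviour discussed in the thread above can be sketched with a small standalone model. This is only an illustration under simplifying assumptions: `Ty`, `normalizeSketch` and `compatibleSketch` are invented stand-ins for the compiler's `Type`, `normalize` and `necessarilyCompatible`, and plain equality stands in for `necessarySubType`.

```scala
// Toy model (not compiler code) of the followIFT guard.
// Ty.Ctx stands for a context function type (P1, ..., Pn) ?=> R.
enum Ty:
  case Ctx(params: List[Ty], res: Ty)
  case Plain(name: String)

// Mirrors the shape of the patched normalize: the context function is only
// "followed" into its result when followIFT is true, and the recursive call
// goes back to the default of true, as noted in the review thread.
def normalizeSketch(tp: Ty, pt: Ty, followIFT: Boolean = true): Ty = tp match
  case Ty.Ctx(_, res) if followIFT => normalizeSketch(res, pt)
  case _                           => tp

// Mirrors the new call site in necessarilyCompatible: do not unwrap the type
// when the prototype is itself a context function type.
def compatibleSketch(tp: Ty, pt: Ty): Boolean =
  val followIFT = !pt.isInstanceOf[Ty.Ctx]
  normalizeSketch(tp, pt, followIFT) == pt  // stand-in for the real subtype test

@main def followIFTDemo(): Unit =
  val ctxFun = Ty.Ctx(List(Ty.Plain("String"), Ty.Plain("Int")), Ty.Plain("Boolean"))
  println(normalizeSketch(ctxFun, Ty.Plain("Boolean"))) // Plain(Boolean): followed to its result
  println(compatibleSketch(ctxFun, ctxFun))             // true: kept as a context function
```
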
9 changes: 9 additions & 0 deletions tests/neg/i10605.scala
@@ -0,0 +1,9 @@
+def xa[A, B, X, Y](f: X => ((A, B) ?=> Y)) =
+  (z: X) => (a: A, b: B) => f(z)(using a, b)
+
+def superxa1(using String, Int): Nothing = ???
+def superxa2(using String, Int): Unit = ???
+
+def main =
+  xa(Function.const(superxa1)(_: Int)) // error
+  xa(Function.const(superxa2)(_: Int)) // error