Numeric widening and weak conformance #2451

Closed
felixmulder opened this issue May 17, 2017 · 10 comments

@felixmulder
Contributor

Currently we're doing what scalac does with respect to this issue. Example:

List(1, 2.3) // res: List[Double]

From a user standpoint, a stricter language would have inferred this as List[Int | Double]. scalac has a flag for this, -Ywarn-numeric-widen, which is not yet implemented in Dotty.

If we choose not to make inference stricter for this kind of construct, then we should definitely implement the -Ywarn-numeric-widen flag.

@sjrd
Member

sjrd commented May 17, 2017

Could we simply remove weak conformance from the language altogether? Seems like a thing inherited from Java that few like anyway.

@felixmulder changed the title from "Numeric widening" to "Numeric widening and weak conformance" May 17, 2017
@odersky
Contributor

odersky commented May 17, 2017

Strictly speaking we don't have weak conformance in dotty anymore. What we have is a new, more restricted and more localized rule. It applies to

  • the arguments corresponding to a vararg parameter
  • the branches of an if-then-else, match, or try

If the expressions in these scenarios all have primitive numeric types, we make sure it's the same numeric type by widening as necessary.

The new rule is less sweeping and much easier to specify than the old one.

(See harmonize in Applications for the implementation).
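The localized rule can be seen in a small example. This is a sketch of the behavior as described above, not a specification:

```scala
object HarmonizeDemo {
  // Vararg arguments: the Int literals 1 and 3 are widened to Double so
  // that all elements share one primitive numeric type; inferred List[Double].
  val xs = List(1, 2.5, 3)

  // Branches of an if-then-else are harmonized the same way: 0 is
  // widened to 0.0, so the whole expression is inferred as Double.
  val x = if (xs.nonEmpty) 0 else 1.5

  def main(args: Array[String]): Unit = {
    println(xs)  // List(1.0, 2.5, 3.0)
    println(x)   // 0.0
  }
}
```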

@propensive
Contributor

Could we experiment with disabling it completely, i.e. inferring a union type or raising a type error? It still seems like the new approach wouldn't address the problem some people complain about: Longs getting "widened" to Floats, losing precision.

@smarter
Member

smarter commented May 17, 2017

Yes, this has been requested before: #2289
But meanwhile the compiler has been changed to never infer union types: #2330. The proposed alternative was warnings, which I'm not super enthusiastic about unless they're turned on by default.

@odersky
Contributor

odersky commented May 17, 2017

I believe there's lots of code out there, in particular code dealing with numerics, where you write something like

Array(-2.33, 0, 3.0)

and so on. What's the point in requiring people to write 0.0? In my mind it's just pedantic busywork which will alienate people. By contrast, what's the gain? What do we stand to gain by inferring a useless type?

The other observation I have is that the amount of heat is often inversely proportional to the importance of an issue. This one here is about as unimportant as it gets. Scala's original weak conformance rules could be criticized for being too complicated. The new rules aren't. Are there constructed corner cases where they might do something that surprises some people? Sure. Should we care? I guess you know my position.

@DarkDimius
Contributor

@odersky, in cases where constants are written explicitly, we can actually check at compile time whether precision is lost during the numeric conversion.

@dwijnand
Member

@DarkDimius Indeed.

> unless you can determine at compile time that the conversion is safe, so for example a 0 literal would convert without any cost

(from my brainstorming in #2289 (comment))
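A minimal sketch of that check (the helper name here is hypothetical; real support would live in the compiler's constant folding): a literal converts losslessly iff round-tripping it through the target type gives back the original value.

```scala
object LosslessCheck {
  // Hypothetical helper: does this Long survive a round trip through Double?
  // Double has a 53-bit mantissa, so Longs above 2^53 may not.
  def fitsInDouble(n: Long): Boolean = n.toDouble.toLong == n

  def main(args: Array[String]): Unit = {
    println(fitsInDouble(0L))                  // true: converts without any cost
    println(fitsInDouble(92233720368547751L))  // false: precision would be lost
  }
}
```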

@propensive
Contributor

propensive commented May 18, 2017

@odersky I think the "heat" argument works both ways: we shouldn't make too much fuss about allowing people to write 0 when it would be trivial for them to write 0.0 if they want a Double (especially as they have to meet this expectation of type safety everywhere else). And given that the solutions (i.e. the "complicated" weak conformance in Scala, or the new scheme) are themselves complicated, shouldn't the burden be on the user to just add .0 everywhere? But I think that discussion isn't worth having.

So as a more radical alternative, could we instead infer the type of literal numbers to be the intersection of all the types in which those literals are precisely representable, and then deal with removing the intersections during erasure?

  • 1: Int(1) & Long(1L) & Double(1.0) & Float(1.0F)
  • 1L: Long(1L)
  • 2147483648: Long(2147483648L) & Double(2.147483648E9)
  • 3.141592653589793: Double(3.141592653589793)
  • 3.1415927: Double(3.1415927) & Float(3.1415927F)
  • 3.1415927F: Float(3.1415927F)
  • 3.1415927D: Double(3.1415927)

The LUBs in an expression like Array(-2.33, 0, 3.0) would fall out to the "right" answer (i.e. Double), without any special rules. Likewise for if/else expressions and match blocks.

This may also help solve the problem, still present in both Dotty and Scala, where an expression like 1L :: 2 :: 3 :: Nil is inferred as List[AnyVal]. My hope would be that 3 :: Nil could be typed as List[Int(3) & Long(3L)], 2 :: 3 :: Nil as List[Int & Long], and 1L :: 2 :: 3 :: Nil would infer to List[Long].

I think the challenge would be in finding the correct point at which to widen the type. It seems to work well for singleton literals currently (though maybe we're all just too accustomed to working with their widened primitive types instead of the singleton types), but having types like Int & Long & Float & Double inferred everywhere would be a visual distraction, so we would want to avoid that; I suspect by using these types only during local inference, not in result types...
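One way to sanity-check this proposal is to model each literal's type as the set of primitive types that represent it exactly; the LUB of several literals is then the intersection of their sets. A rough sketch (the names are made up, and exact representability is tested by round-tripping through the target type):

```scala
object ExactTypes {
  // The set of primitive numeric types that can hold this value exactly.
  def exactTypes(d: Double): Set[String] = {
    val asFloat =
      if (d.toFloat.toDouble == d) Set("Float") else Set.empty[String]
    val asIntegral =
      if (d.toLong.toDouble == d) {  // whole number within Long range
        val l = d.toLong
        Set("Long") ++ (if (l.toInt.toLong == l) Set("Int") else Set.empty)
      } else Set.empty[String]
    Set("Double") ++ asFloat ++ asIntegral
  }

  // The "LUB" of several literals is the intersection of their type sets.
  def lub(literals: Double*): Set[String] =
    literals.map(exactTypes).reduce(_ intersect _)

  def main(args: Array[String]): Unit = {
    // -2.33 is exactly representable only as Double, so the intersection
    // over Array(-2.33, 0, 3.0) collapses to Set(Double).
    println(lub(-2.33, 0, 3.0))
  }
}
```

Note this sketch is itself lossy: modeling every literal as a Double cannot distinguish large Long-only values, so it only illustrates the Int/Long/Float/Double interplay for small literals; a real implementation would presumably work from the literal's source text.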

And cases like this would remain problematic:

val list = List(1, 2, 3)
val list2 = 0L :: list

unless they were typed explicitly as

val list: List[Int & Long] = List(1, 2, 3)
val list2 = 0L :: list

And as a wild future idea, could we support BigInteger and BigDecimal literals just by adding them to the intersection? I have a feeling doing so might end up working only if everything gets boxed, in which case it becomes a less interesting idea, but I don't know the details...

@lihaoyi
Contributor

lihaoyi commented May 22, 2017

@odersky what about the following:

object Foo{
  def main(args: Array[String]): Unit = {
    val list = List(1.2, 92233720368547751L, 4)
    val long =           92233720368547751L
    println(list)
    println(list(1))
    println(list(1).getClass)
    println(list(1).toLong)

    println(long)
    println(list(1).toLong == long)
    
  }
}
Output:

[info] Running Foo
List(1.2, 9.2233720368547744E16, 4.0)
9.2233720368547744E16
double
92233720368547744
92233720368547751
false

(https://scastie.scala-lang.org/btdDoZBcRNqJRWm12VMXWA)

Here, we have the same literal resulting in two different, unequal values, due to widening/conformance/whatever. There are no type annotations here, so I don't expect any implicits to kick in: if anything, I'd expect to get a List[Any] containing the mix of Longs and Doubles that I wrote in the source code. But I don't, and so 92233720368547751L ends up denoting two different values.

Surely the same literal resulting in different values is a real problem, and that's not just pedantic busywork?

I don't really care what the solution is, whether it's union types or intersection types or inferring Any or whatever. I just don't want my numbers mysteriously, irreversibly losing precision despite there being no explicit conversions in sight.

@odersky
Contributor

odersky commented Jan 24, 2018

Somebody would have to take this on. This means:

  • Come up with rules
  • Convince everyone that they are the right ones, or at least an improvement over the current ones
  • Implement them
  • Implement migration rules
  • Compile a large body of code to ensure we are not breaking too much of it.
  • Submit everything as a SIP

It's a big project. I don't see anybody in the core team having the stamina to do it. Until that changes or we have an outside contributor, I will close the issue.


8 participants