Support JDK 20 by upgrading to ASM 9.4 #10184
Conversation
@Philippus Would you mind targeting 2.12.x for the upgrade? Then we can merge it forward onto 2.13.x as usual. Reminder to self: I should add a notice to the JDK compatibility page that 2.12.18 and 2.13.11 will have this support.
I made it a separate PR, like last time. Hope that's ok.
Maybe I'm being forgetful... is it a tricky merge? If it's a tricky merge, then it makes sense to have two PRs from the start. (Regardless, it doesn't matter much either way.)
I think last time(s) it was a tricky merge indeed.
I've been swamped in recent weeks, but now that the 2.13.10 dust has settled I do intend to get back to this.
Force-pushed from 0b899c8 to 78ad9c9
I manually tested this by installing a JDK 20 early access build on my Mac (with
Scala 2.12.x PR: scala/scala#10185
Scala 2.13.x PR: scala/scala#10184
### What changes were proposed in this pull request?
This PR aims to upgrade Scala to 2.13.11.
- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warning messages related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in the Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but this doesn't mean they won't affect code written by end users themselves.

The full release notes are as follows:
- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA; all tests passed
  - Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
  - Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
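The `Implicit definition should have explicit type` check mentioned above flags implicit definitions whose type is left to inference. A minimal sketch of the kind of fix it asks for (the names here are hypothetical, not from the Spark codebase):

```scala
object ImplicitTypeExample {
  // Under the new check in 2.13.11, an implicit without an explicit
  // type annotation would trigger the warning:
  //   implicit val reverseInt = Ordering.Int.reverse
  // Annotating the type explicitly (also required by Scala 3) avoids it:
  implicit val reverseInt: Ordering[Int] = Ordering.Int.reverse

  // The local implicit takes precedence over the default Ordering[Int],
  // so `sorted` produces a descending result here.
  def sortedDescending(xs: List[Int]): List[Int] = xs.sorted
}
```

Annotating implicits explicitly is cheap to do now and removes one source of behavior drift when cross-building against Scala 3.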
### What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11; after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.
- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warning messages related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in the Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but this doesn't mean they won't affect code written by end users themselves.

The full release notes are as follows:
- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
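The `match may not be exhaustive` concern above refers to the compiler's exhaustivity checker for `sealed` hierarchies. A minimal sketch of the kind of match it verifies (the types here are hypothetical, not from the Spark codebase):

```scala
// A sealed trait restricts subtypes to this compilation unit, so the
// compiler can enumerate all cases at a pattern match.
sealed trait JobStatus
case object Running extends JobStatus
case object Succeeded extends JobStatus
case object Failed extends JobStatus

object ExhaustivityExample {
  // Covering every subtype keeps the exhaustivity checker quiet;
  // removing any case here would produce the
  // "match may not be exhaustive" warning.
  def describe(s: JobStatus): String = s match {
    case Running   => "in progress"
    case Succeeded => "finished"
    case Failed    => "failed"
  }
}
```

Since scala/bug#12800 concerns a regression in this checker, auditing the compile log for new or missing exhaustivity warnings, as done above, is the right way to assess exposure.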
There's first some work to be done in https://github.com/scala/scala-asm.