diff --git a/.github/workflows/ci.yaml b/.github/workflows/ci.yaml index 368766349bab..72e7149fa762 100644 --- a/.github/workflows/ci.yaml +++ b/.github/workflows/ci.yaml @@ -2,8 +2,18 @@ name: Dotty on: push: - tags: - - '**' + ## Be careful if you add or remove something here! Quoting from + ## : + ## + ## > If you define only tags/tags-ignore or only branches/branches-ignore, the + ## > workflow won't run for events affecting the undefined Git ref. If you + ## > define neither tags/tags-ignore or branches/branches-ignore, the workflow + ## > will run for events affecting either branches or tags. + ## + ## We want the CI to run on both branches and tags, so we should either have: + ## - both (tags or tags-ignore) and (branches or branches-ignore), + ## - or neither of them. + ## But it's important to not have only one or the other. pull_request: schedule: - cron: '0 3 * * *' # Every day at 3 AM diff --git a/.gitignore b/.gitignore index eb9541428302..4ac67ddfbb06 100644 --- a/.gitignore +++ b/.gitignore @@ -35,6 +35,9 @@ metals.sbt .idea_modules /.worksheet/ +# scala-cli +.scala-build + # Partest dotty.jar dotty-lib.jar diff --git a/MAINTENANCE.md b/MAINTENANCE.md index 7bde90839724..d1309a6b404d 100644 --- a/MAINTENANCE.md +++ b/MAINTENANCE.md @@ -1,9 +1,12 @@ # Issue Supervisor Role -This document formally defines the _Issue Supervisor_ role. This is a repository maintenance role that is assigned to core contributors on rotating basis. + +This document formally defines the _Issue Supervisor_ role. This is a repository maintenance role that is assigned to core contributors on a rotating basis. ## Responsibilities -Issue supervisor is responsible for: -- Health of the CI, nightly releases and benchmark infrastructure. + +The issue supervisor is responsible for: + +- The health of the CI, nightly releases and benchmark infrastructure. - PRs of external contributors: assigning someone to review, or handling themselves. - Triaging issues (especially new): - Each issue needs to be assigned an `itype` and 1 or more `area` labels. @@ -12,33 +15,39 @@ Issue supervisor is responsible for: - Modifying issue labels to best capture information about the issues - Attempting to reproduce the issue (or label “stat:cannot reproduce”) - Further minimizing the issue or asking the reporter of the issue to minimize it correctly (or label “stat:needs minimization”) + - Identifying which issues are of considerable importance and bringing them to the attention of the team during the Dotty meeting, where they can be filtered and added to the [Future Versions](https://github.com/lampepfl/dotty/milestone/46) milestone. Other core teammates are responsible for providing information to the issue supervisor in a timely manner when it is requested if they have that information. ## Assignment -Issue supervisor is appointed for 7 days and is responsible for what is specified in the “Responsibilities” section during those 7 days. Their assumption of the role starts from the Dotty Meeting on Monday and ends on the next Dotty Meeting on Monday. + +The issue supervisor is appointed for 7 days and is responsible for what is specified in the “Responsibilities” section during those 7 days. Their assumption of the role starts from the Dotty Meeting on Monday and ends on the next Dotty Meeting on Monday. During the Dotty Meeting, an issue supervisor is assigned for the current week and for the week after that. 
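For context on the ci.yaml comment above: it describes GitHub Actions' rule that `tags`/`tags-ignore` and `branches`/`branches-ignore` filters must be supplied together or not at all. Below is a minimal sketch of an `on:` block satisfying the "both" variant of that rule; the `'**'` patterns are illustrative placeholders, not the repository's actual filter values.

```
on:
  push:
    # Defining both filters keeps branch pushes AND tag pushes triggering CI;
    # defining only one of the two would silently skip events for the other kind of ref.
    branches:
      - '**'
    tags:
      - '**'
  pull_request:
```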
-Issue supervisor schedule is maintained in the [Issue Supervisor Statistics spreadsheet](https://docs.google.com/spreadsheets/d/19IAqNzHfJ9rsii3EsjIGwPz5BLTFJs_byGM3FprmX3E/edit?usp=sharing). So, someone who knows their availability several weeks ahead into the future can assign themselves to be an issue supervisor well ahead of time. +The issue supervisor schedule is maintained in the [Issue Supervisor Statistics spreadsheet](https://docs.google.com/spreadsheets/d/19IAqNzHfJ9rsii3EsjIGwPz5BLTFJs_byGM3FprmX3E/edit?usp=sharing). So, someone who knows their availability several weeks ahead into the future can assign themselves to be an issue supervisor well ahead of time. ## Prerequisites + An issue supervisor needs to have all the accesses and privileges required to get their job done. This might include: + - Admin rights in lampepfl/dotty repository - Admin rights in lampepfl/dotty-feature-requests repository -- Permissions to create new repositories in lampepfl organization (needed to fork repositories for the community build) +- Permission to create new repositories in lampepfl organization (needed to fork repositories for the community build) - Access to the LAMP slack to be able to ask for help with the infrastructure, triaging and such ## Procedures -To ensure proper health of the infrastructure, the supervisor regularly monitors its proper operation. If a malfunction is detected, the supervisor's job is to ensure that someone is working on it (or solve it on their own). + +To ensure the proper health of the infrastructure, the supervisor regularly monitors its proper operation. If a malfunction is detected, the supervisor's job is to ensure that someone is working on it (or solve it on their own). If it is unclear what area an issue belongs to, the supervisor asks for advice from other team members on Slack or GitHub. If, after asking for advice, it turns out that nobody in the team knows how to classify it, the issue must be classified with a “stat:needs triage” label. If it is unclear who should review an external PR, the supervisor asks for advice from the rest of the core team. If after asking for advice, it is still unclear who should do it, the reviewer for such a PR will be decided at the next Dotty meeting. -In general, if anything else is unclear for proper fulfillment of responsibilities, the supervisor must proactively seek advice from other team members on Slack or other channels. +In general, if anything else is unclear for the proper fulfillment of responsibilities, the supervisor must proactively seek advice from other team members on Slack or other channels. ## Reporting + At the end of their supervision period, the supervisor reports to the team during the Dotty meeting on the following points: - Whether there were any incidents with the CI, nightlies and benchmarks, how they were resolved and what steps were taken to prevent them from happening in the future. @@ -46,8 +55,10 @@ At the end of their supervision period, the supervisor reports to the team durin - How many new issues were opened during their supervision period? Were there any areas that got a lot of issues? How many regressions from a prior Scala 3 release were there? Which were designated for an MSc project or an Issue Spree? - If new labels were created or old ones were removed, or there is any other feedback on how to improve the issue supervision, mention that. - Unassigned PRs and issues that the team failed to classify: bring them one by one so that the team can make a decision on them. 
+- Issues of importance – candidates for the Future Versions milestone. + +## Maintenance List -# Maintenance List The following is the list of all the principal areas of the compiler and the core team members who are responsible for their maintenance: - Parser: @odersky @@ -73,5 +84,5 @@ The following is the list of all the principal areas of the compiler and the cor - Vulpix: @dwijnand, @prolativ - JVM backend: @Kordyjan, (@sjrd) - Derivation & Mirrors: @bishabosha, (@dwijnand) -- Linting (especially unused warnings) / Reporting UX : VirtusLab TBD? +- Linting (especially unused warnings) / Reporting UX: VirtusLab TBD? - Java-compat: @Kordyjan diff --git a/NOTICE.md b/NOTICE.md index 64ebae49efe5..f4d0e6ed2b5a 100644 --- a/NOTICE.md +++ b/NOTICE.md @@ -1,6 +1,6 @@ Dotty (https://dotty.epfl.ch) -Copyright 2012-2020 EPFL -Copyright 2012-2020 Lightbend, Inc. +Copyright 2012-2023 EPFL +Copyright 2012-2023 Lightbend, Inc. Licensed under the Apache License, Version 2.0 (the "License"): http://www.apache.org/licenses/LICENSE-2.0 diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala new file mode 100644 index 000000000000..2a115ad63496 --- /dev/null +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedAccessInt.scala @@ -0,0 +1,30 @@ +package dotty.tools.benchmarks.lazyvals + +import org.openjdk.jmh.annotations.* +import org.openjdk.jmh.infra.Blackhole +import LazyVals.LazyIntHolder +import java.util.concurrent.TimeUnit + +@BenchmarkMode(Array(Mode.AverageTime)) +@Fork(2) +@Threads(1) +@Warmup(iterations = 5) +@Measurement(iterations = 5) +@OutputTimeUnit(TimeUnit.NANOSECONDS) +@State(Scope.Benchmark) +class InitializedAccessInt { + + var holder: LazyIntHolder = _ + + @Setup + def prepare: Unit = { + holder = new LazyIntHolder + holder.value + } + + @Benchmark + def measureInitialized(bh: Blackhole) = { + bh.consume(holder) + bh.consume(holder.value) + } +} diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala new file mode 100644 index 000000000000..672cc4bf6544 --- /dev/null +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/InitializedObject.scala @@ -0,0 +1,22 @@ +package dotty.tools.benchmarks.lazyvals + +import org.openjdk.jmh.annotations.* +import org.openjdk.jmh.infra.Blackhole +import LazyVals.ObjectHolder +import java.util.concurrent.TimeUnit + +@BenchmarkMode(Array(Mode.AverageTime)) +@Fork(2) +@Threads(1) +@Warmup(iterations = 5) +@Measurement(iterations = 5) +@OutputTimeUnit(TimeUnit.NANOSECONDS) +@State(Scope.Benchmark) +class InitializedObject { + + @Benchmark + def measureInitialized(bh: Blackhole) = { + bh.consume(ObjectHolder) + bh.consume(ObjectHolder.value) + } +} diff --git a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala index 0afd93d086be..68379f9e142c 100644 --- a/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala +++ b/bench-micro/src/main/scala/dotty/tools/benchmarks/lazyvals/LazyVals.scala @@ -50,4 +50,22 @@ object LazyVals { } } } + + class LazyIntHolder { + lazy val value: Int = { + (System.nanoTime() % 1000).toInt + } + } + + object ObjectHolder { + lazy val value: String = { + System.nanoTime() % 5 match { + case 0 => "abc" + case 1 => "def" + case 2 
=> "ghi" + case 3 => "jkl" + case 4 => "mno" + } + } + } } diff --git a/changelogs/3.3.0-RC1.md b/changelogs/3.3.0-RC1.md new file mode 100644 index 000000000000..1d632e49032a --- /dev/null +++ b/changelogs/3.3.0-RC1.md @@ -0,0 +1,225 @@ +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. [#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them 
[#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution [#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent [#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val 
[#16660](https://github.com/lampepfl/dotty/pull/16660) +- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. [#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. [#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) 
+- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining [#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions [#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0-RC1` these are: + +``` + 225 Martin Odersky + 73 Dale Wijnand + 58 Szymon Rodziewicz + 54 Nicolas Stucki + 48 Kamil Szewczyk + 48 Paul Coral + 30 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 14 Guillaume Martres + 8 Fengyun Liu + 8 Michał Pałka + 7 Chris Birchall + 7 rochala + 6 Kacper Korban + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Jamie Thompson + 2 Alex + 2 Anatolii Kmetiuk + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + +``` \ No newline at end of file diff --git a/changelogs/3.3.0-RC2.md b/changelogs/3.3.0-RC2.md new file mode 100644 index 000000000000..57d785816489 --- /dev/null +++ b/changelogs/3.3.0-RC2.md @@ -0,0 +1,229 @@ +This release is nearly identical to 3.3.0-RC1. The only difference is that 3.3.0-RC1 generated output with incorrect TASTy version. + +The following changelog is identical to the changelog of 3.3.0-RC1. 
+ +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. [#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them [#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution 
[#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent [#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val [#16660](https://github.com/lampepfl/dotty/pull/16660) +- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local 
inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. [#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. [#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) +- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining [#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions 
[#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0-RC1` these are: + +``` + 225 Martin Odersky + 73 Dale Wijnand + 58 Szymon Rodziewicz + 54 Nicolas Stucki + 48 Kamil Szewczyk + 48 Paul Coral + 30 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 14 Guillaume Martres + 8 Fengyun Liu + 8 Michał Pałka + 7 Chris Birchall + 7 rochala + 6 Kacper Korban + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Jamie Thompson + 2 Alex + 2 Anatolii Kmetiuk + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + +``` \ No newline at end of file diff --git a/changelogs/3.3.0-RC3.md b/changelogs/3.3.0-RC3.md new file mode 100644 index 000000000000..79a47fcf0bb9 --- /dev/null +++ b/changelogs/3.3.0-RC3.md @@ -0,0 +1,23 @@ +# Backported fixes + +- Added jpath check to `ClassLikeSupport` getParentsAsTreeSymbolTuples [#16759](https://github.com/lampepfl/dotty/pull/16759) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) +- Avoid bidirectional GADT typebounds from fullBounds [#15683](https://github.com/lampepfl/dotty/pull/15683) +- Fix static lazy field holder for GraalVM [#16800](https://github.com/lampepfl/dotty/pull/16800) +- Add support for disabling redirected output in the REPL driver for usage in worksheets in the Scala Plugin for IntelliJ IDEA [#16810](https://github.com/lampepfl/dotty/pull/16810) +- Add missing criterion to subtype check [#16889](https://github.com/lampepfl/dotty/pull/16889) + +# Contributors + +Thank you to all the 
contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC2..3.3.0-RC3` these are: + +``` + 7 Dale Wijnand + 5 Szymon Rodziewicz + 2 Paweł Marks + 2 Vasil Vasilev + 1 Martin Odersky + 1 Mohammad Yousuf Minhaj Zia +``` diff --git a/changelogs/3.3.0-RC4.md b/changelogs/3.3.0-RC4.md new file mode 100644 index 000000000000..4c4a490237b6 --- /dev/null +++ b/changelogs/3.3.0-RC4.md @@ -0,0 +1,35 @@ +# Backported fixes + +- Fix HK quoted pattern type variables [#16907](https//github.com/lampepfl/dotty/pull/16907) +- Fix caching issue caused by incorrect isProvisional check [#16989](https://github.com/lampepfl/dotty/pull/16989) +- Fix race condition in new LazyVals [#16975](https://github.com/lampepfl/dotty/pull/16975) +- Fix "-Wunused: False positive on parameterless enum member" [#16927](https://github.com/lampepfl/dotty/pull/16927) +- Register usage of symbols in non-inferred type trees in CheckUnused [#16939](https://github.com/lampepfl/dotty/pull/16939) +- Traverse annotations instead of just registering in -W [#16956](https://github.com/lampepfl/dotty/pull/16956) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Improve override detection in CheckUnused [#16965](https://github.com/lampepfl/dotty/pull/16965) +- WUnused: Fix unused warning in synthetic symbols [#17020](https://github.com/lampepfl/dotty/pull/17020) +- Fix WUnused with idents in derived code [#17095](https//github.com/lampepfl/dotty/pull/17095) +- WUnused: Fix for symbols with synthetic names and unused transparent inlines [#17061](https//github.com/lampepfl/dotty/pull/17061) +- Skip extension method params in WUnused [#17178](https//github.com/lampepfl/dotty/pull/17178) +- Fix wunused false positive when deriving alias type [#17157](https//github.com/lampepfl/dotty/pull/17157) +- Fix WUnused for accessible symbols that are renamed [#17177](https//github.com/lampepfl/dotty/pull/17177) +- Fix WUnused false positive in for [#17176](https//github.com/lampepfl/dotty/pull/17176) +- Make CheckUnused run both after Typer and Inlining [#17206](https//github.com/lampepfl/dotty/pull/17206) +- Disable WUnused for params of non-private defs [#17223](https//github.com/lampepfl/dotty/pull/17223) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC3..3.3.0-RC4` these are: + +``` + 41 Szymon Rodziewicz + 4 Paul Coral + 3 Paweł Marks + 1 Guillaume Martres + 1 Kacper Korban + 1 Nicolas Stucki + +``` diff --git a/changelogs/3.3.0-RC5.md b/changelogs/3.3.0-RC5.md new file mode 100644 index 000000000000..a9cc120ae39a --- /dev/null +++ b/changelogs/3.3.0-RC5.md @@ -0,0 +1,22 @@ +# Backported fixes + +- Remove experimental from `Mirror#fromProductTyped` [#16829](https//github.com/lampepfl/dotty/pull/16829) +- Wunused: Check if symbol exists before `isValidMemberDef` check [#17316](https://github.com/lampepfl/dotty/pull/17316) +- Wunused: Include import selector bounds in unused checks [#17323](https://github.com/lampepfl/dotty/pull/17323) +- Fix compiler crash in WUnused [#17340](https://github.com/lampepfl/dotty/pull/17340) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 
3.3.0-RC4..3.3.0-RC5` these are: + +``` + 2 Kacper Korban + 2 Michael Pilquist + 2 Paweł Marks + 2 Szymon Rodziewicz + 1 Matt Bovel + + +``` diff --git a/changelogs/3.3.0-RC6.md b/changelogs/3.3.0-RC6.md new file mode 100644 index 000000000000..ab98f0055974 --- /dev/null +++ b/changelogs/3.3.0-RC6.md @@ -0,0 +1,18 @@ +# Backported fixes + +- Patmat: Use less type variables in prefix inference [#16827](https//github.com/lampepfl/dotty/pull/16827) +- Just warn on type ascription on a pattern [#17454](https://github.com/lampepfl/dotty/pull/17454) +- Fix #17187: allow patches with same span [#17366](https://github.com/lampepfl/dotty/pull/17366) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.3.0-RC5..3.3.0-RC6` these are: + +``` + 2 Adrien Piquerez + 2 Michał Pałka + 2 Paweł Marks + 1 Dale Wijnand +``` diff --git a/changelogs/3.3.0.md b/changelogs/3.3.0.md new file mode 100644 index 000000000000..e3cc3703fadd --- /dev/null +++ b/changelogs/3.3.0.md @@ -0,0 +1,268 @@ +# Highlights of the release + +- Stabilize new lazy vals [#16614](https://github.com/lampepfl/dotty/pull/16614) +- Experimental Macro annotations [#16392](https://github.com/lampepfl/dotty/pull/16392) [#16454](https://github.com/lampepfl/dotty/pull/16454) [#16534](https://github.com/lampepfl/dotty/pull/16534) +- Fix stability check for inline parameters [#15511](https://github.com/lampepfl/dotty/pull/15511) +- Make `fewerBraces` a standard feature [#16297](https://github.com/lampepfl/dotty/pull/16297) +- Add new front-end phase for unused entities and add support for unused imports [#16157](https://github.com/lampepfl/dotty/pull/16157) +- Implement -Wvalue-discard warning [#15975](https://github.com/lampepfl/dotty/pull/15975) +- Introduce boundary/break control abstraction. 
[#16612](https://github.com/lampepfl/dotty/pull/16612) + +# Other changes and fixes + +## Annotations + +- Support use-site meta-annotations [#16445](https://github.com/lampepfl/dotty/pull/16445) + +## Desugaring + +- Reuse typed prefix for `applyDynamic` and `applyDynamicNamed` [#16552](https://github.com/lampepfl/dotty/pull/16552) +- Fix object selftype match error [#16441](https://github.com/lampepfl/dotty/pull/16441) + +## Erasure + +- Dealias before checking for outer references in types [#16525](https://github.com/lampepfl/dotty/pull/16525) +- Fix generic signature for type params bounded by primitive [#16442](https://github.com/lampepfl/dotty/pull/16442) +- Avoid EmptyScope.cloneScope crashing, eg on missing references [#16314](https://github.com/lampepfl/dotty/pull/16314) + +## GADTs + +- Inline GADT state restoring in TypeComparer [#16564](https://github.com/lampepfl/dotty/pull/16564) +- Add extension/conversion to GADT selection healing [#16638](https://github.com/lampepfl/dotty/pull/16638) +- Split out immutable GadtConstraint [#16602](https://github.com/lampepfl/dotty/pull/16602) +- Avoid bidirectional GADT typebounds from fullBounds [#15683](https://github.com/lampepfl/dotty/pull/15683) + +## Incremental compilation + +- Unpickle arguments of parent constructors in Templates lazily [#16688](https://github.com/lampepfl/dotty/pull/16688) + +## Initialization + +- Fix #16438: Supply dummy args for erroneous parent call in init check [#16448](https://github.com/lampepfl/dotty/pull/16448) + +## Inline + +- Dealias in ConstantValue, for inline if cond [#16652](https://github.com/lampepfl/dotty/pull/16652) +- Set Span for top level annotations generated in PostTyper [#16378](https://github.com/lampepfl/dotty/pull/16378) +- Interpolate any type vars from comparing against SelectionProto [#16348](https://github.com/lampepfl/dotty/pull/16348) +- Handle binding of beta reduced inlined lambdas [#16377](https://github.com/lampepfl/dotty/pull/16377) +- Do not add dummy RHS to abstract inline methods [#16510](https://github.com/lampepfl/dotty/pull/16510) +- Warn on inline given aliases with functions as RHS [#16499](https://github.com/lampepfl/dotty/pull/16499) +- Support inline overrides in value classes [#16523](https://github.com/lampepfl/dotty/pull/16523) + +## Java interop + +- Represent Java annotations as interfaces so they can be extended, and disallow various misuses of them [#16260](https://github.com/lampepfl/dotty/pull/16260) + +## Linting + +- Fix -Wunused:import registering constructor `` instead of its owner (also fix false positive for enum) [#16661](https://github.com/lampepfl/dotty/pull/16661) +- Fix #16675 : -Wunused false positive on case class generated method, due to flags used to distinguish case accessors. 
[#16683](https://github.com/lampepfl/dotty/pull/16683) +- Fix #16682: CheckUnused missed some used symbols [#16690](https://github.com/lampepfl/dotty/pull/16690) +- Fix "-Wunused: False positive on parameterless enum member" [#16927](https://github.com/lampepfl/dotty/pull/16927) +- Register usage of symbols in non-inferred type trees in CheckUnused [#16939](https://github.com/lampepfl/dotty/pull/16939) +- Traverse annotations instead of just registering in -Wunused [#16956](https://github.com/lampepfl/dotty/pull/16956) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Ignore parameter of accessors in -Wunused [#16957](https://github.com/lampepfl/dotty/pull/16957) +- Improve override detection in CheckUnused [#16965](https://github.com/lampepfl/dotty/pull/16965) +- WUnused: Fix unused warning in synthetic symbols [#17020](https://github.com/lampepfl/dotty/pull/17020) +- Fix WUnused with idents in derived code [#17095](https//github.com/lampepfl/dotty/pull/17095) +- WUnused: Fix for symbols with synthetic names and unused transparent inlines [#17061](https//github.com/lampepfl/dotty/pull/17061) +- Skip extension method params in WUnused [#17178](https//github.com/lampepfl/dotty/pull/17178) +- Fix wunused false positive when deriving alias type [#17157](https//github.com/lampepfl/dotty/pull/17157) +- Fix WUnused for accessible symbols that are renamed [#17177](https//github.com/lampepfl/dotty/pull/17177) +- Fix WUnused false positive in for [#17176](https//github.com/lampepfl/dotty/pull/17176) +- Make CheckUnused run both after Typer and Inlining [#17206](https//github.com/lampepfl/dotty/pull/17206) +- Disable WUnused for params of non-private defs [#17223](https//github.com/lampepfl/dotty/pull/17223) +- Wunused: Check if symbol exists before `isValidMemberDef` check [#17316](https://github.com/lampepfl/dotty/pull/17316) +- Wunused: Include import selector bounds in unused checks [#17323](https://github.com/lampepfl/dotty/pull/17323) +- Fix compiler crash in WUnused [#17340](https://github.com/lampepfl/dotty/pull/17340) + +## Opaque Types + +- Delay opaque alias checking until PostTyper [#16644](https://github.com/lampepfl/dotty/pull/16644) + +## Overloading + +- Handle context function arguments in overloading resolution [#16511](https://github.com/lampepfl/dotty/pull/16511) + +## Parser + +- Improve support for Unicode supplementary characters in identifiers and string interpolation (as in Scala 2) [#16278](https://github.com/lampepfl/dotty/pull/16278) +- Require indent after colon at EOL [#16466](https://github.com/lampepfl/dotty/pull/16466) +- Help givens return refined types [#16293](https://github.com/lampepfl/dotty/pull/16293) + +## Pattern Matching + +- Tweak AvoidMap's derivedSelect [#16563](https://github.com/lampepfl/dotty/pull/16563) +- Space: Use RHS of & when refining subtypes [#16573](https://github.com/lampepfl/dotty/pull/16573) +- Freeze constraints in a condition check of maximiseType [#16526](https://github.com/lampepfl/dotty/pull/16526) +- Restrict syntax of typed patterns [#16150](https://github.com/lampepfl/dotty/pull/16150) +- Test case to show that #16252 works with transparent 
[#16262](https://github.com/lampepfl/dotty/pull/16262) +- Support inline unapplySeq and with leading given parameters [#16358](https://github.com/lampepfl/dotty/pull/16358) +- Handle sealed prefixes in exh checking [#16621](https://github.com/lampepfl/dotty/pull/16621) +- Detect irrefutable quoted patterns [#16674](https://github.com/lampepfl/dotty/pull/16674) +- Patmat: Use less type variables in prefix inference [#16827](https//github.com/lampepfl/dotty/pull/16827) + +## Pickling + +- Allow case classes with up to 254 parameters [#16501](https://github.com/lampepfl/dotty/pull/16501) +- Correctly unpickle Scala 2 private case classes in traits [#16519](https://github.com/lampepfl/dotty/pull/16519) + +## Polyfunctions + +- Fix #9996: Crash with function accepting polymorphic function type with singleton result [#16327](https://github.com/lampepfl/dotty/pull/16327) + +## Quotes + +- Remove contents of inline methods [#16345](https://github.com/lampepfl/dotty/pull/16345) +- Fix errors in explicit type annotations in inline match cases [#16257](https://github.com/lampepfl/dotty/pull/16257) +- Handle macro annotation suspends and crashes [#16509](https://github.com/lampepfl/dotty/pull/16509) +- Fix macro annotations `spliceOwner` [#16513](https://github.com/lampepfl/dotty/pull/16513) +- Fix HK quoted pattern type variables [#16907](https//github.com/lampepfl/dotty/pull/16907) + +## REPL + +- REPL: Fix crash when printing instances of value classes [#16393](https://github.com/lampepfl/dotty/pull/16393) +- Attempt to fix completion crash [#16267](https://github.com/lampepfl/dotty/pull/16267) +- Fix REPL shadowing bug [#16389](https://github.com/lampepfl/dotty/pull/16389) +- Open up for extensibility [#16276](https://github.com/lampepfl/dotty/pull/16276) +- Don't crash if completions throw [#16687](https://github.com/lampepfl/dotty/pull/16687) + +## Reflection + +- Fix reflect typeMembers to return all members [#15033](https://github.com/lampepfl/dotty/pull/15033) +- Deprecate reflect Flags.Static [#16568](https://github.com/lampepfl/dotty/pull/16568) + +## Reporting + +- Suppress follow-on errors for erroneous import qualifiers [#16658](https://github.com/lampepfl/dotty/pull/16658) +- Fix order in which errors are reported for assignment to val [#16660](https://github.com/lampepfl/dotty/pull/16660) +- Fix class name in error message [#16635](https://github.com/lampepfl/dotty/pull/16635) +- Make refined type printing more source compatible [#16303](https://github.com/lampepfl/dotty/pull/16303) +- Add error hint on local inline def used in quotes [#16572](https://github.com/lampepfl/dotty/pull/16572) +- Fix Text wrapping [#16277](https://github.com/lampepfl/dotty/pull/16277) +- Fix #16680 by registering Ident not containing a symbol [#16689](https://github.com/lampepfl/dotty/pull/16689) +- Fix the non-miniphase tree traverser [#16684](https://github.com/lampepfl/dotty/pull/16684) +- Just warn on type ascription on a pattern [#17454](https://github.com/lampepfl/dotty/pull/17454) + +## Scala-JS + +- Fix #14289: Accept Ident refs to `js.native` in native member rhs. 
[#16185](https://github.com/lampepfl/dotty/pull/16185) + +## Scaladoc + +- Added jpath check to `ClassLikeSupport` getParentsAsTreeSymbolTuples [#16759](https://github.com/lampepfl/dotty/pull/16759) + +## Standard Library + +- Add `CanEqual` instance for `Map` [#15886](https://github.com/lampepfl/dotty/pull/15886) +- Refine `Tuple.Append` return type [#16140](https://github.com/lampepfl/dotty/pull/16140) +- Remove experimental from `Mirror#fromProductTyped` [#16829](https//github.com/lampepfl/dotty/pull/16829) + +## TASTy format + +- Make it a fatal error if erasure cannot resolve a type [#16373](https://github.com/lampepfl/dotty/pull/16373) + +## Tooling + +- Add -Yimports compiler flag [#16218](https://github.com/lampepfl/dotty/pull/16218) +- Allow BooleanSettings to be set with a colon [#16425](https://github.com/lampepfl/dotty/pull/16425) +- Add support for disabling redirected output in the REPL driver for usage in worksheets in the Scala Plugin for IntelliJ IDEA [#16810](https://github.com/lampepfl/dotty/pull/16810) +- Fix #17187: allow patches with same span [#17366](https://github.com/lampepfl/dotty/pull/17366) + +## Transform + +- Avoid stackoverflow in ExplicitOuter [#16381](https://github.com/lampepfl/dotty/pull/16381) +- Make lazy vals run on non-fallback graal image - remove dynamic reflection [#16346](https://github.com/lampepfl/dotty/pull/16346) +- Patch to avoid crash in #16351 [#16354](https://github.com/lampepfl/dotty/pull/16354) +- Don't treat package object's `` methods as package members [#16667](https://github.com/lampepfl/dotty/pull/16667) +- Space: Refine isSubspace property & an example [#16574](https://github.com/lampepfl/dotty/pull/16574) +- Fix static lazy field holder for GraalVM [#16800](https://github.com/lampepfl/dotty/pull/16800) +- Fix race condition in new LazyVals [#16975](https://github.com/lampepfl/dotty/pull/16975) + +## Typer + +- Drop requirement that self types are closed [#16648](https://github.com/lampepfl/dotty/pull/16648) +- Disallow constructor params from appearing in parent types for soundness [#16664](https://github.com/lampepfl/dotty/pull/16664) +- Don't search implicit arguments in singleton type prefix [#16490](https://github.com/lampepfl/dotty/pull/16490) +- Don't rely on isProvisional to determine whether atoms computed [#16489](https://github.com/lampepfl/dotty/pull/16489) +- Support signature polymorphic methods (`MethodHandle` and `VarHandle`) [#16225](https://github.com/lampepfl/dotty/pull/16225) +- Prefer parameterless alternatives during ambiguous overload resolution [#16315](https://github.com/lampepfl/dotty/pull/16315) +- Fix calculation to drop transparent classes [#16344](https://github.com/lampepfl/dotty/pull/16344) +- Test case for issue 16311 [#16317](https://github.com/lampepfl/dotty/pull/16317) +- Skip caching provisional OrType atoms [#16295](https://github.com/lampepfl/dotty/pull/16295) +- Avoid cyclic references due to experimental check when inlining [#16195](https://github.com/lampepfl/dotty/pull/16195) +- Track type variable dependencies to guide instantiation decisions [#16042](https://github.com/lampepfl/dotty/pull/16042) +- Two fixes to constraint solving [#16353](https://github.com/lampepfl/dotty/pull/16353) +- Fix regression 
in cyclic constraint handling [#16514](https://github.com/lampepfl/dotty/pull/16514) +- Sharpen range approximation for applied types with capture set ranges [#16261](https://github.com/lampepfl/dotty/pull/16261) +- Cut the Gordian Knot: Don't widen unions to transparent [#15642](https://github.com/lampepfl/dotty/pull/15642) +- Fix widening logic to keep instantiation within bounds [#16417](https://github.com/lampepfl/dotty/pull/16417) +- Skip ambiguous reference error when symbols are aliases [#16401](https://github.com/lampepfl/dotty/pull/16401) +- Avoid incorrect simplifications when updating bounds in the constraint [#16410](https://github.com/lampepfl/dotty/pull/16410) +- Take `@targetName` into account when resolving extension methods [#16487](https://github.com/lampepfl/dotty/pull/16487) +- Improve ClassTag handling to avoid invalid ClassTag generation and inference failure [#16492](https://github.com/lampepfl/dotty/pull/16492) +- Fix extracting the elemType of a union of arrays [#16569](https://github.com/lampepfl/dotty/pull/16569) +- Make sure annotations are typed in expression contexts [#16699](https://github.com/lampepfl/dotty/pull/16699) +- Throw a type error when using hk-types in unions or intersections [#16712](https://github.com/lampepfl/dotty/pull/16712) +- Add missing criterion to subtype check [#16889](https://github.com/lampepfl/dotty/pull/16889) +- Fix caching issue caused by incorrect isProvisional check [#16989](https://github.com/lampepfl/dotty/pull/16989) + +# Contributors + +Thank you to all the contributors who made this release possible 🎉 + +According to `git shortlog -sn --no-merges 3.2.2..3.3.0` these are: + +``` + 226 Martin Odersky + 106 Szymon Rodziewicz + 81 Dale Wijnand + 56 Nicolas Stucki + 52 Paul Coral + 48 Kamil Szewczyk + 45 Paweł Marks + 28 Florian3k + 28 Yichen Xu + 15 Guillaume Martres + 10 Michał Pałka + 9 Kacper Korban + 8 Fengyun Liu + 7 Chris Birchall + 7 rochala + 6 Sébastien Doeraene + 6 jdudrak + 5 Seth Tisue + 5 Som Snytt + 5 nizhikov + 4 Filip Zybała + 4 Jan Chyb + 4 Michael Pollmeier + 4 Natsu Kagami + 3 Anatolii Kmetiuk + 3 Jamie Thompson + 2 Adrien Piquerez + 2 Alex + 2 Dmitrii Naumenko + 2 Lukas Rytz + 2 Michael Pilquist + 2 Vasil Vasilev + 2 adampauls + 2 yoshinorin + 1 Alexander Slesarenko + 1 Chris Kipp + 1 Guillaume Raffin + 1 Jakub Kozłowski + 1 Jan-Pieter van den Heuvel + 1 Julien Richard-Foy + 1 Kenji Yoshida + 1 Matt Bovel + 1 Mohammad Yousuf Minhaj Zia + 1 Philippus + 1 Szymon R + 1 Tim Spence + 1 s.bazarsadaev + + +``` \ No newline at end of file diff --git a/community-build/community-projects/betterfiles b/community-build/community-projects/betterfiles index 0ab941360880..d098f2799092 160000 --- a/community-build/community-projects/betterfiles +++ b/community-build/community-projects/betterfiles @@ -1 +1 @@ -Subproject commit 0ab941360880095419183309b0b9b3363eb1ad00 +Subproject commit d098f279909246243643ba3b85f3520a24c377af diff --git a/community-build/community-projects/cats-effect-3 b/community-build/community-projects/cats-effect-3 index 3a32c0e5b7b6..1d425e6efdf8 160000 --- a/community-build/community-projects/cats-effect-3 +++ b/community-build/community-projects/cats-effect-3 @@ -1 +1 @@ -Subproject commit 3a32c0e5b7b61665e5bb94ccf0ed92beb66615dd +Subproject commit 1d425e6efdf8aee619a4a906e950473c51f78161 diff --git a/community-build/community-projects/cats-mtl 
b/community-build/community-projects/cats-mtl index 149f002c8774..0ab7aa1cc8a0 160000 --- a/community-build/community-projects/cats-mtl +++ b/community-build/community-projects/cats-mtl @@ -1 +1 @@ -Subproject commit 149f002c8774b61df87cb846455d94ae858b3b54 +Subproject commit 0ab7aa1cc8a087693b2b04c8a9cb63f69f4af54a diff --git a/community-build/community-projects/fs2 b/community-build/community-projects/fs2 index ac5275baf33b..6d7c6d6924cb 160000 --- a/community-build/community-projects/fs2 +++ b/community-build/community-projects/fs2 @@ -1 +1 @@ -Subproject commit ac5275baf33b03da0a461b5de735ee6a1f5a524e +Subproject commit 6d7c6d6924cb055028458ac8236622190acf66d1 diff --git a/community-build/community-projects/http4s b/community-build/community-projects/http4s index c3d46f561ed1..aa85f5f2e660 160000 --- a/community-build/community-projects/http4s +++ b/community-build/community-projects/http4s @@ -1 +1 @@ -Subproject commit c3d46f561ed1026ae54e1acbd5e4730f0498ea93 +Subproject commit aa85f5f2e660d1d4370d90316333718fd6517051 diff --git a/community-build/community-projects/play-json b/community-build/community-projects/play-json index 356b7044ed3e..b2b7f8b834a4 160000 --- a/community-build/community-projects/play-json +++ b/community-build/community-projects/play-json @@ -1 +1 @@ -Subproject commit 356b7044ed3efd6cf9350eb9930be6abd4906b6e +Subproject commit b2b7f8b834a405ec6ba5455dc345b754fab21e8f diff --git a/community-build/community-projects/protoquill b/community-build/community-projects/protoquill index 16d26fcb3072..494c2ddc06e7 160000 --- a/community-build/community-projects/protoquill +++ b/community-build/community-projects/protoquill @@ -1 +1 @@ -Subproject commit 16d26fcb30720b9aa81d29f08b9da10916e269a2 +Subproject commit 494c2ddc06e71f1c7f13b382675525130feee9a0 diff --git a/community-build/community-projects/requests-scala b/community-build/community-projects/requests-scala index 6d4a223bc33d..8e4a40588491 160000 --- a/community-build/community-projects/requests-scala +++ b/community-build/community-projects/requests-scala @@ -1 +1 @@ -Subproject commit 6d4a223bc33def14ae9a4def24a3f5c258451e8e +Subproject commit 8e4a40588491608aa40099f79c881d54a5094e75 diff --git a/community-build/community-projects/scala-parallel-collections b/community-build/community-projects/scala-parallel-collections index a6bd648bb188..7d0e41ae4d09 160000 --- a/community-build/community-projects/scala-parallel-collections +++ b/community-build/community-projects/scala-parallel-collections @@ -1 +1 @@ -Subproject commit a6bd648bb188a65ab36be07e956e52fe25f64d67 +Subproject commit 7d0e41ae4d09e1ddf063651e377921ec493fc5bf diff --git a/community-build/community-projects/scalaz b/community-build/community-projects/scalaz index ee85b0925809..6e7f3d9caf64 160000 --- a/community-build/community-projects/scalaz +++ b/community-build/community-projects/scalaz @@ -1 +1 @@ -Subproject commit ee85b0925809f6e04808a6124ae04dd89adba0d6 +Subproject commit 6e7f3d9caf64d8ad1c82804cf418882345f41930 diff --git a/community-build/community-projects/specs2 b/community-build/community-projects/specs2 index 2bfe446a4e91..789f23b75db1 160000 --- a/community-build/community-projects/specs2 +++ b/community-build/community-projects/specs2 @@ -1 +1 @@ -Subproject commit 2bfe446a4e9122b1122a7e13a3d100b3749b8630 +Subproject commit 789f23b75db1cf7961d04468b21a2cc0d7ba32d8 diff --git a/community-build/community-projects/spire b/community-build/community-projects/spire index 7f630c0209e3..bc524eeea735 160000 --- 
a/community-build/community-projects/spire +++ b/community-build/community-projects/spire @@ -1 +1 @@ -Subproject commit 7f630c0209e327bdc782ade2210d8e4b916fddcc +Subproject commit bc524eeea735a3cf4d5108039f95950b024a14e4 diff --git a/community-build/community-projects/stdLib213 b/community-build/community-projects/stdLib213 index 986dcc160aab..1a2521996bad 160000 --- a/community-build/community-projects/stdLib213 +++ b/community-build/community-projects/stdLib213 @@ -1 +1 @@ -Subproject commit 986dcc160aab85298f6cab0bf8dd0345497cdc01 +Subproject commit 1a2521996badfe4cb3d9b8cdecefacb1251faeb9 diff --git a/community-build/src/scala/dotty/communitybuild/projects.scala b/community-build/src/scala/dotty/communitybuild/projects.scala index 52155189a31f..fe3f5cfed5a2 100644 --- a/community-build/src/scala/dotty/communitybuild/projects.scala +++ b/community-build/src/scala/dotty/communitybuild/projects.scala @@ -140,7 +140,7 @@ final case class SbtCommunityProject( case Some(ivyHome) => List(s"-Dsbt.ivy.home=$ivyHome") case _ => Nil extraSbtArgs ++ sbtProps ++ List( - "-sbt-version", "1.7.1", + "-sbt-version", "1.8.0", "-Dsbt.supershell=false", s"-Ddotty.communitybuild.dir=$communitybuildDir", s"--addPluginSbtFile=$sbtPluginFilePath" diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeBodyBuilder.scala b/compiler/src/dotty/tools/backend/jvm/BCodeBodyBuilder.scala index 3e2a8f1b0b60..1d559c9950f1 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeBodyBuilder.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeBodyBuilder.scala @@ -4,7 +4,7 @@ package jvm import scala.language.unsafeNulls -import scala.annotation.switch +import scala.annotation.{switch, tailrec} import scala.collection.mutable.SortedMap import scala.tools.asm @@ -23,6 +23,7 @@ import dotty.tools.dotc.transform.SymUtils._ import dotty.tools.dotc.util.Spans._ import dotty.tools.dotc.core.Contexts._ import dotty.tools.dotc.core.Phases._ +import dotty.tools.dotc.core.Decorators.em import dotty.tools.dotc.report /* @@ -78,9 +79,14 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { tree match { case Assign(lhs @ DesugaredSelect(qual, _), rhs) => + val savedStackHeight = stackHeight val isStatic = lhs.symbol.isStaticMember - if (!isStatic) { genLoadQualifier(lhs) } + if (!isStatic) { + genLoadQualifier(lhs) + stackHeight += 1 + } genLoad(rhs, symInfoTK(lhs.symbol)) + stackHeight = savedStackHeight lineNumber(tree) // receiverClass is used in the bytecode to access the field. 
using sym.owner may lead to IllegalAccessError val receiverClass = qual.tpe.typeSymbol @@ -144,7 +150,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { } genLoad(larg, resKind) + stackHeight += resKind.size genLoad(rarg, if (isShift) INT else resKind) + stackHeight -= resKind.size (code: @switch) match { case ADD => bc add resKind @@ -181,14 +189,19 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { if (isArrayGet(code)) { // load argument on stack assert(args.length == 1, s"Too many arguments for array get operation: $tree"); + stackHeight += 1 genLoad(args.head, INT) + stackHeight -= 1 generatedType = k.asArrayBType.componentType bc.aload(elementType) } else if (isArraySet(code)) { val List(a1, a2) = args + stackHeight += 1 genLoad(a1, INT) + stackHeight += 1 genLoad(a2) + stackHeight -= 2 generatedType = UNIT bc.astore(elementType) } else { @@ -222,7 +235,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { val resKind = if (hasUnitBranch) UNIT else tpeTK(tree) val postIf = new asm.Label - genLoadTo(thenp, resKind, LoadDestination.Jump(postIf)) + genLoadTo(thenp, resKind, LoadDestination.Jump(postIf, stackHeight)) markProgramPoint(failure) genLoadTo(elsep, resKind, LoadDestination.FallThrough) markProgramPoint(postIf) @@ -481,7 +494,17 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { dest match case LoadDestination.FallThrough => () - case LoadDestination.Jump(label) => + case LoadDestination.Jump(label, targetStackHeight) => + if targetStackHeight < stackHeight then + val stackDiff = stackHeight - targetStackHeight + if expectedType == UNIT then + bc dropMany stackDiff + else + val loc = locals.makeTempLocal(expectedType) + bc.store(loc.idx, expectedType) + bc dropMany stackDiff + bc.load(loc.idx, expectedType) + end if bc goTo label case LoadDestination.Return => bc emitRETURN returnType @@ -576,7 +599,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { if dest == LoadDestination.FallThrough then val resKind = tpeTK(tree) val jumpTarget = new asm.Label - registerJumpDest(labelSym, resKind, LoadDestination.Jump(jumpTarget)) + registerJumpDest(labelSym, resKind, LoadDestination.Jump(jumpTarget, stackHeight)) genLoad(expr, resKind) markProgramPoint(jumpTarget) resKind @@ -634,7 +657,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { markProgramPoint(loop) if isInfinite then - val dest = LoadDestination.Jump(loop) + val dest = LoadDestination.Jump(loop, stackHeight) genLoadTo(body, UNIT, dest) dest else @@ -649,7 +672,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { val failure = new asm.Label genCond(cond, success, failure, targetIfNoJump = success) markProgramPoint(success) - genLoadTo(body, UNIT, LoadDestination.Jump(loop)) + genLoadTo(body, UNIT, LoadDestination.Jump(loop, stackHeight)) markProgramPoint(failure) end match LoadDestination.FallThrough @@ -700,7 +723,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { var elemKind = arr.elementType val argsSize = args.length if (argsSize > dims) { - report.error(s"too many arguments for array constructor: found ${args.length} but array has only $dims dimension(s)", ctx.source.atSpan(app.span)) + report.error(em"too many arguments for array constructor: found ${args.length} but array has only $dims dimension(s)", ctx.source.atSpan(app.span)) } if (argsSize < dims) { /* In one step: @@ -743,7 +766,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { // scala/bug#10290: qual can be `this.$outer()` (not just `this`), so we call genLoad (not just ALOAD_0) genLoad(superQual) + stackHeight += 1 
genLoadArguments(args, paramTKs(app)) + stackHeight -= 1 generatedType = genCallMethod(fun.symbol, InvokeStyle.Super, app.span) // 'new' constructor call: Note: since constructors are @@ -765,7 +790,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { assert(classBTypeFromSymbol(ctor.owner) == rt, s"Symbol ${ctor.owner.showFullName} is different from $rt") mnode.visitTypeInsn(asm.Opcodes.NEW, rt.internalName) bc dup generatedType + stackHeight += 2 genLoadArguments(args, paramTKs(app)) + stackHeight -= 2 genCallMethod(ctor, InvokeStyle.Special, app.span) case _ => @@ -798,8 +825,12 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { else if (app.hasAttachment(BCodeHelpers.UseInvokeSpecial)) InvokeStyle.Special else InvokeStyle.Virtual - if (invokeStyle.hasInstance) genLoadQualifier(fun) + val savedStackHeight = stackHeight + if invokeStyle.hasInstance then + genLoadQualifier(fun) + stackHeight += 1 genLoadArguments(args, paramTKs(app)) + stackHeight = savedStackHeight val DesugaredSelect(qual, name) = fun: @unchecked // fun is a Select, also checked in genLoadQualifier val isArrayClone = name == nme.clone_ && qual.tpe.widen.isInstanceOf[JavaArrayType] @@ -857,6 +888,8 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { bc iconst elems.length bc newarray elmKind + stackHeight += 3 // during the genLoad below, there is the result, its dup, and the index + var i = 0 var rest = elems while (!rest.isEmpty) { @@ -868,6 +901,8 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { i = i + 1 } + stackHeight -= 3 + generatedType } @@ -882,7 +917,7 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { val (generatedType, postMatch, postMatchDest) = if dest == LoadDestination.FallThrough then val postMatch = new asm.Label - (tpeTK(tree), postMatch, LoadDestination.Jump(postMatch)) + (tpeTK(tree), postMatch, LoadDestination.Jump(postMatch, stackHeight)) else (expectedType, null, dest) @@ -1159,14 +1194,21 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { } def genLoadArguments(args: List[Tree], btpes: List[BType]): Unit = - args match - case arg :: args1 => - btpes match - case btpe :: btpes1 => - genLoad(arg, btpe) - genLoadArguments(args1, btpes1) - case _ => - case _ => + @tailrec def loop(args: List[Tree], btpes: List[BType]): Unit = + args match + case arg :: args1 => + btpes match + case btpe :: btpes1 => + genLoad(arg, btpe) + stackHeight += btpe.size + loop(args1, btpes1) + case _ => + case _ => + + val savedStackHeight = stackHeight + loop(args, btpes) + stackHeight = savedStackHeight + end genLoadArguments def genLoadModule(tree: Tree): BType = { val module = ( @@ -1265,11 +1307,14 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { }.sum bc.genNewStringBuilder(approxBuilderSize) + stackHeight += 1 // during the genLoad below, there is a reference to the StringBuilder on the stack for (elem <- concatArguments) { val elemType = tpeTK(elem) genLoad(elem, elemType) bc.genStringBuilderAppend(elemType) } + stackHeight -= 1 + bc.genStringBuilderEnd } else { @@ -1286,12 +1331,15 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { var totalArgSlots = 0 var countConcats = 1 // ie. 
1 + how many times we spilled + val savedStackHeight = stackHeight + for (elem <- concatArguments) { val tpe = tpeTK(elem) val elemSlots = tpe.size // Unlikely spill case if (totalArgSlots + elemSlots >= MaxIndySlots) { + stackHeight = savedStackHeight + countConcats bc.genIndyStringConcat(recipe.toString, argTypes.result(), constVals.result()) countConcats += 1 totalArgSlots = 0 @@ -1316,8 +1364,10 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { val tpe = tpeTK(elem) argTypes += tpe.toASMType genLoad(elem, tpe) + stackHeight += 1 } } + stackHeight = savedStackHeight bc.genIndyStringConcat(recipe.toString, argTypes.result(), constVals.result()) // If we spilled, generate one final concat @@ -1512,7 +1562,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { } else { val tk = tpeTK(l).maxType(tpeTK(r)) genLoad(l, tk) + stackHeight += tk.size genLoad(r, tk) + stackHeight -= tk.size genCJUMP(success, failure, op, tk, targetIfNoJump) } } @@ -1627,7 +1679,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { } genLoad(l, ObjectRef) + stackHeight += 1 genLoad(r, ObjectRef) + stackHeight -= 1 genCallMethod(equalsMethod, InvokeStyle.Static) genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) } @@ -1643,7 +1697,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { } else if (isNonNullExpr(l)) { // SI-7852 Avoid null check if L is statically non-null. genLoad(l, ObjectRef) + stackHeight += 1 genLoad(r, ObjectRef) + stackHeight -= 1 genCallMethod(defn.Any_equals, InvokeStyle.Virtual) genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) } else { @@ -1653,7 +1709,9 @@ trait BCodeBodyBuilder extends BCodeSkelBuilder { val lNonNull = new asm.Label genLoad(l, ObjectRef) + stackHeight += 1 genLoad(r, ObjectRef) + stackHeight -= 1 locals.store(eqEqTempLocal) bc dup ObjectRef genCZJUMP(lNull, lNonNull, Primitives.EQ, ObjectRef, targetIfNoJump = lNull) diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala b/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala index b6d898b3b221..3cf7d88b9282 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeHelpers.scala @@ -61,7 +61,6 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { @threadUnsafe lazy val AnnotationRetentionSourceAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("SOURCE") @threadUnsafe lazy val AnnotationRetentionClassAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("CLASS") @threadUnsafe lazy val AnnotationRetentionRuntimeAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("RUNTIME") - @threadUnsafe lazy val JavaAnnotationClass: ClassSymbol = requiredClass("java.lang.annotation.Annotation") val bCodeAsmCommon: BCodeAsmCommon[int.type] = new BCodeAsmCommon(int) @@ -80,7 +79,7 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { outputDirectory } catch { case ex: Throwable => - report.error(s"Couldn't create file for class $cName\n${ex.getMessage}", ctx.source.atSpan(csym.span)) + report.error(em"Couldn't create file for class $cName\n${ex.getMessage}", ctx.source.atSpan(csym.span)) null } } @@ -415,7 +414,7 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { arrAnnotV.visitEnd() } // for the lazy val in ScalaSigBytes to be GC'ed, the invoker of emitAnnotations() should hold the ScalaSigBytes in a method-local var that doesn't escape. 
*/ - case t @ Apply(constr, args) if t.tpe.derivesFrom(JavaAnnotationClass) => + case t @ Apply(constr, args) if t.tpe.classSymbol.is(JavaAnnotation) => val typ = t.tpe.classSymbol.denot.info val assocs = assocsFromApply(t) val desc = innerClasesStore.typeDescriptor(typ) // the class descriptor of the nested annotation class @@ -423,7 +422,7 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { emitAssocs(nestedVisitor, assocs, bcodeStore)(innerClasesStore) case t => - report.error(ex"Annotation argument is not a constant", t.sourcePos) + report.error(em"Annotation argument is not a constant", t.sourcePos) } } @@ -872,10 +871,11 @@ trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { try body catch { case ex: Throwable => - report.error(i"""|compiler bug: created invalid generic signature for $sym in ${sym.denot.owner.showFullName} - |signature: $sig - |if this is reproducible, please report bug at https://github.com/lampepfl/dotty/issues - """.trim, sym.sourcePos) + report.error( + em"""|compiler bug: created invalid generic signature for $sym in ${sym.denot.owner.showFullName} + |signature: $sig + |if this is reproducible, please report bug at https://github.com/lampepfl/dotty/issues + """, sym.sourcePos) throw ex } } diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeIdiomatic.scala b/compiler/src/dotty/tools/backend/jvm/BCodeIdiomatic.scala index 02268c2919ba..b86efb7cacb1 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeIdiomatic.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeIdiomatic.scala @@ -54,6 +54,7 @@ trait BCodeIdiomatic { case "17" => asm.Opcodes.V17 case "18" => asm.Opcodes.V18 case "19" => asm.Opcodes.V19 + case "20" => asm.Opcodes.V20 } lazy val majorVersion: Int = (classfileVersion & 0xFF) @@ -619,6 +620,16 @@ trait BCodeIdiomatic { // can-multi-thread final def drop(tk: BType): Unit = { emit(if (tk.isWideType) Opcodes.POP2 else Opcodes.POP) } + // can-multi-thread + final def dropMany(size: Int): Unit = { + var s = size + while s >= 2 do + emit(Opcodes.POP2) + s -= 2 + if s > 0 then + emit(Opcodes.POP) + } + // can-multi-thread final def dup(tk: BType): Unit = { emit(if (tk.isWideType) Opcodes.DUP2 else Opcodes.DUP) } diff --git a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala index a524d5fb5a8b..9c1ff1f26763 100644 --- a/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala +++ b/compiler/src/dotty/tools/backend/jvm/BCodeSkelBuilder.scala @@ -45,7 +45,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { /** The value is put on the stack, and control flows through to the next opcode. */ case FallThrough /** The value is put on the stack, and control flow is transferred to the given `label`. */ - case Jump(label: asm.Label) + case Jump(label: asm.Label, targetStackHeight: Int) /** The value is RETURN'ed from the enclosing method. */ case Return /** The value is ATHROW'n. */ @@ -151,7 +151,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { // !!! 
Part of this logic is duplicated in JSCodeGen.genCompilationUnit claszSymbol.info.decls.foreach { f => - if f.isField && !f.name.is(LazyBitMapName) then + if f.isField && !f.name.is(LazyBitMapName) && !f.name.is(LazyLocalName) then f.setFlag(JavaStatic) } @@ -368,6 +368,8 @@ trait BCodeSkelBuilder extends BCodeHelpers { // used by genLoadTry() and genSynchronized() var earlyReturnVar: Symbol = null var shouldEmitCleanup = false + // stack tracking + var stackHeight = 0 // line numbers var lastEmittedLineNr = -1 @@ -504,6 +506,13 @@ trait BCodeSkelBuilder extends BCodeHelpers { loc } + def makeTempLocal(tk: BType): Local = + assert(nxtIdx != -1, "not a valid start index") + assert(tk.size > 0, "makeLocal called for a symbol whose type is Unit.") + val loc = Local(tk, "temp", nxtIdx, isSynth = true) + nxtIdx += tk.size + loc + // not to be confused with `fieldStore` and `fieldLoad` which also take a symbol but a field-symbol. def store(locSym: Symbol): Unit = { val Local(tk, _, idx, _) = slots(locSym) @@ -574,6 +583,8 @@ trait BCodeSkelBuilder extends BCodeHelpers { earlyReturnVar = null shouldEmitCleanup = false + stackHeight = 0 + lastEmittedLineNr = -1 } @@ -748,7 +759,7 @@ trait BCodeSkelBuilder extends BCodeHelpers { if (params.size > MaximumJvmParameters) { // SI-7324 - report.error(s"Platform restriction: a parameter list's length cannot exceed $MaximumJvmParameters.", ctx.source.atSpan(methSymbol.span)) + report.error(em"Platform restriction: a parameter list's length cannot exceed $MaximumJvmParameters.", ctx.source.atSpan(methSymbol.span)) return } @@ -800,9 +811,10 @@ trait BCodeSkelBuilder extends BCodeHelpers { val veryFirstProgramPoint = currProgramPoint() if trimmedRhs == tpd.EmptyTree then - report.error("Concrete method has no definition: " + dd + ( - if (ctx.settings.Ydebug.value) "(found: " + methSymbol.owner.info.decls.toList.mkString(", ") + ")" - else ""), + report.error( + em"Concrete method has no definition: $dd${ + if (ctx.settings.Ydebug.value) "(found: " + methSymbol.owner.info.decls.toList.mkString(", ") + ")" + else ""}", ctx.source.atSpan(NoSpan) ) else diff --git a/compiler/src/dotty/tools/backend/jvm/CoreBTypes.scala b/compiler/src/dotty/tools/backend/jvm/CoreBTypes.scala index e94bda16fbb8..d5fce3f53627 100644 --- a/compiler/src/dotty/tools/backend/jvm/CoreBTypes.scala +++ b/compiler/src/dotty/tools/backend/jvm/CoreBTypes.scala @@ -134,8 +134,8 @@ class CoreBTypes[BTFS <: BTypesFromSymbols[_ <: DottyBackendInterface]](val bTyp private lazy val jliCallSiteRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.CallSite]) private lazy val jliLambdaMetafactoryRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.LambdaMetafactory]) - private lazy val jliMethodHandleRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.MethodHandle]) - private lazy val jliMethodHandlesLookupRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.MethodHandles.Lookup]) + private lazy val jliMethodHandleRef : ClassBType = classBTypeFromSymbol(defn.MethodHandleClass) + private lazy val jliMethodHandlesLookupRef : ClassBType = classBTypeFromSymbol(defn.MethodHandlesLookupClass) private lazy val jliMethodTypeRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.MethodType]) private lazy val jliStringConcatFactoryRef : ClassBType = classBTypeFromSymbol(requiredClass("java.lang.invoke.StringConcatFactory")) // since JDK 9 private lazy val srLambdaDeserialize : ClassBType = 
classBTypeFromSymbol(requiredClass[scala.runtime.LambdaDeserialize]) diff --git a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala index 5461ff81341c..f8f683a429f6 100644 --- a/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala +++ b/compiler/src/dotty/tools/backend/jvm/DottyBackendInterface.scala @@ -14,6 +14,7 @@ import Contexts._ import Types._ import Symbols._ import Phases._ +import Decorators.em import dotty.tools.dotc.util.ReadOnlyMap import dotty.tools.dotc.report @@ -21,7 +22,7 @@ import dotty.tools.dotc.report import tpd._ import StdNames.nme -import NameKinds.LazyBitMapName +import NameKinds.{LazyBitMapName, LazyLocalName} import Names.Name class DottyBackendInterface(val outputDirectory: AbstractFile, val superCallsMap: ReadOnlyMap[Symbol, Set[ClassSymbol]])(using val ctx: Context) { @@ -71,7 +72,7 @@ class DottyBackendInterface(val outputDirectory: AbstractFile, val superCallsMap def _1: Type = field.tpe match { case JavaArrayType(elem) => elem case _ => - report.error(s"JavaSeqArray with type ${field.tpe} reached backend: $field", ctx.source.atSpan(field.span)) + report.error(em"JavaSeqArray with type ${field.tpe} reached backend: $field", ctx.source.atSpan(field.span)) UnspecifiedErrorType } def _2: List[Tree] = field.elems @@ -128,10 +129,11 @@ object DottyBackendInterface { * the new lazy val encoding: https://github.com/lampepfl/dotty/issues/7140 */ def isStaticModuleField(using Context): Boolean = - sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) + sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) && !sym.name.is(LazyLocalName) def isStaticMember(using Context): Boolean = (sym ne NoSymbol) && - (sym.is(JavaStatic) || sym.isScalaStatic || sym.isStaticModuleField) + (sym.is(JavaStatic) || sym.isScalaStatic || sym.isStaticModuleField) + // guard against NoSymbol, because this code is executed to select which call type (static/dynamic) to use to call array.clone /** diff --git a/compiler/src/dotty/tools/backend/jvm/GenBCode.scala index 73e8fd9edb3b..e788c2b2a4ec 100644 --- a/compiler/src/dotty/tools/backend/jvm/GenBCode.scala +++ b/compiler/src/dotty/tools/backend/jvm/GenBCode.scala @@ -21,6 +21,7 @@ import dotty.tools.dotc.sbt.ExtractDependencies import Contexts._ import Phases._ import Symbols._ +import Decorators.em import java.io.DataOutputStream import java.nio.channels.ClosedByInterruptException @@ -308,7 +309,7 @@ class GenBCodePipeline(val int: DottyBackendInterface, val primitives: DottyPrim getFileForClassfile(outF, cls.name, ".class") } catch { case e: FileConflictException => - report.error(s"error writing ${cls.name}: ${e.getMessage}") + report.error(em"error writing ${cls.name}: ${e.getMessage}") null } } else null @@ -608,11 +609,11 @@ class GenBCodePipeline(val int: DottyBackendInterface, val primitives: DottyPrim val method = s"${e.getClassName.replaceAll("/", ".")}.${e.getMethodName}" val msg = - s"Generated bytecode for method '$method' is too large. Size: ${e.getCodeSize} bytes. Limit is 64KB" + em"Generated bytecode for method '$method' is too large. Size: ${e.getCodeSize} bytes. Limit is 64KB" report.error(msg) case e: ClassTooLargeException => val msg = - s"Class '${e.getClassName.replaceAll("/", ".")}' is too large. Constant pool size: ${e.getConstantPoolCount}. 
Limit is 64K entries" + em"Class '${e.getClassName.replaceAll("/", ".")}' is too large. Constant pool size: ${e.getConstantPoolCount}. Limit is 64K entries" report.error(msg) } diff --git a/compiler/src/dotty/tools/backend/jvm/scalaPrimitives.scala b/compiler/src/dotty/tools/backend/jvm/scalaPrimitives.scala index 2d4c3ce5c9c4..bc453aec17af 100644 --- a/compiler/src/dotty/tools/backend/jvm/scalaPrimitives.scala +++ b/compiler/src/dotty/tools/backend/jvm/scalaPrimitives.scala @@ -8,6 +8,7 @@ import Contexts._ import Names.TermName, StdNames._ import Types.{JavaArrayType, UnspecifiedErrorType, Type} import Symbols.{Symbol, NoSymbol} +import Decorators.em import dotc.report import dotc.util.ReadOnlyMap @@ -66,7 +67,7 @@ class DottyPrimitives(ictx: Context) { case defn.ArrayOf(el) => el case JavaArrayType(el) => el case _ => - report.error(s"expected Array $tpe") + report.error(em"expected Array $tpe") UnspecifiedErrorType } @@ -133,7 +134,7 @@ class DottyPrimitives(ictx: Context) { def addPrimitives(cls: Symbol, method: TermName, code: Int)(using Context): Unit = { val alts = cls.info.member(method).alternatives.map(_.symbol) if (alts.isEmpty) - report.error(s"Unknown primitive method $cls.$method") + report.error(em"Unknown primitive method $cls.$method") else alts foreach (s => addPrimitive(s, s.info.paramInfoss match { diff --git a/compiler/src/dotty/tools/backend/sjs/JSCodeGen.scala b/compiler/src/dotty/tools/backend/sjs/JSCodeGen.scala index 6714f664620b..4caf1f6b5fa2 100644 --- a/compiler/src/dotty/tools/backend/sjs/JSCodeGen.scala +++ b/compiler/src/dotty/tools/backend/sjs/JSCodeGen.scala @@ -125,7 +125,14 @@ class JSCodeGen()(using genCtx: Context) { /** Implicitly materializes the current local name generator. */ implicit def implicitLocalNames: LocalNameGenerator = localNames.get - private def currentClassType = encodeClassType(currentClassSym) + def currentThisType: jstpe.Type = { + encodeClassType(currentClassSym) match { + case tpe @ jstpe.ClassType(cls) => + jstpe.BoxedClassToPrimType.getOrElse(cls, tpe) + case tpe => + tpe + } + } /** Returns a new fresh local identifier. */ private def freshLocalIdent()(implicit pos: Position): js.LocalIdent = @@ -1023,7 +1030,7 @@ class JSCodeGen()(using genCtx: Context) { // Constructor of a non-native JS class ------------------------------------ def genJSClassCapturesAndConstructor(constructorTrees: List[DefDef])( - implicit pos: SourcePosition): (List[js.ParamDef], js.JSMethodDef) = { + implicit pos: SourcePosition): (List[js.ParamDef], js.JSConstructorDef) = { /* We need to merge all Scala constructors into a single one because the * IR, like JavaScript, only allows a single one. 
* @@ -1095,20 +1102,21 @@ class JSCodeGen()(using genCtx: Context) { (exports.result(), jsClassCaptures.result()) } + // The name 'constructor' is used for error reporting here val (formalArgs, restParam, overloadDispatchBody) = jsExportsGen.genOverloadDispatch(JSName.Literal("constructor"), exports, jstpe.IntType) val overloadVar = js.VarDef(freshLocalIdent("overload"), NoOriginalName, jstpe.IntType, mutable = false, overloadDispatchBody) - val ctorStats = genJSClassCtorStats(overloadVar.ref, ctorTree) - - val constructorBody = js.Block( - paramVarDefs ::: List(overloadVar, ctorStats, js.Undefined())) + val constructorBody = wrapJSCtorBody( + paramVarDefs :+ overloadVar, + genJSClassCtorBody(overloadVar.ref, ctorTree), + js.Undefined() :: Nil + ) - val constructorDef = js.JSMethodDef( - js.MemberFlags.empty, - js.StringLiteral("constructor"), + val constructorDef = js.JSConstructorDef( + js.MemberFlags.empty.withNamespace(js.MemberNamespace.Constructor), formalArgs, restParam, constructorBody)(OptimizerHints.empty, None) (jsClassCaptures, constructorDef) @@ -1150,7 +1158,8 @@ class JSCodeGen()(using genCtx: Context) { assert(jsSuperCall.isDefined, s"Did not find Super call in primary JS construtor at ${dd.sourcePos}") - new PrimaryJSCtor(sym, genParamsAndInfo(sym, dd.paramss), jsSuperCall.get :: jsStats.result()) + new PrimaryJSCtor(sym, genParamsAndInfo(sym, dd.paramss), + js.JSConstructorBody(Nil, jsSuperCall.get, jsStats.result())(dd.span)) } private def genSecondaryJSClassCtor(dd: DefDef): SplitSecondaryJSCtor = { @@ -1251,9 +1260,9 @@ class JSCodeGen()(using genCtx: Context) { (jsExport, jsClassCaptures) } - /** generates a sequence of JS constructor statements based on a constructor tree. */ - private def genJSClassCtorStats(overloadVar: js.VarRef, - ctorTree: ConstructorTree[PrimaryJSCtor])(implicit pos: Position): js.Tree = { + /** Generates a JS constructor body based on a constructor tree. */ + private def genJSClassCtorBody(overloadVar: js.VarRef, + ctorTree: ConstructorTree[PrimaryJSCtor])(implicit pos: Position): js.JSConstructorBody = { /* generates a statement that conditionally executes body iff the chosen * overload is any of the descendants of `tree` (including itself). 
@@ -1348,13 +1357,19 @@ class JSCodeGen()(using genCtx: Context) { val primaryCtor = ctorTree.ctor val secondaryCtorTrees = ctorTree.subCtors - js.Block( - secondaryCtorTrees.map(preStats(_, primaryCtor.paramsAndInfo)) ++ - primaryCtor.body ++ + wrapJSCtorBody( + secondaryCtorTrees.map(preStats(_, primaryCtor.paramsAndInfo)), + primaryCtor.body, secondaryCtorTrees.map(postStats(_)) ) } + private def wrapJSCtorBody(before: List[js.Tree], body: js.JSConstructorBody, + after: List[js.Tree]): js.JSConstructorBody = { + js.JSConstructorBody(before ::: body.beforeSuper, body.superCall, + body.afterSuper ::: after)(body.pos) + } + private sealed trait JSCtor { val sym: Symbol val paramsAndInfo: List[(Symbol, JSParamInfo)] @@ -1362,7 +1377,7 @@ class JSCodeGen()(using genCtx: Context) { private class PrimaryJSCtor(val sym: Symbol, val paramsAndInfo: List[(Symbol, JSParamInfo)], - val body: List[js.Tree]) extends JSCtor + val body: js.JSConstructorBody) extends JSCtor private class SplitSecondaryJSCtor(val sym: Symbol, val paramsAndInfo: List[(Symbol, JSParamInfo)], @@ -1945,9 +1960,9 @@ class JSCodeGen()(using genCtx: Context) { }*/ thisLocalVarIdent.fold[js.Tree] { - js.This()(currentClassType) + js.This()(currentThisType) } { thisLocalIdent => - js.VarRef(thisLocalIdent)(currentClassType) + js.VarRef(thisLocalIdent)(currentThisType) } } @@ -2014,9 +2029,7 @@ class JSCodeGen()(using genCtx: Context) { val (exceptValDef, exceptVar) = if (mightCatchJavaScriptException) { val valDef = js.VarDef(freshLocalIdent("e"), NoOriginalName, - encodeClassType(defn.ThrowableClass), mutable = false, { - genModuleApplyMethod(jsdefn.Runtime_wrapJavaScriptException, origExceptVar :: Nil) - }) + encodeClassType(defn.ThrowableClass), mutable = false, js.WrapAsThrowable(origExceptVar)) (valDef, valDef.ref) } else { (js.Skip(), origExceptVar) @@ -2307,7 +2320,7 @@ class JSCodeGen()(using genCtx: Context) { val privateFieldDefs = mutable.ListBuffer.empty[js.FieldDef] val classDefMembers = mutable.ListBuffer.empty[js.MemberDef] val instanceMembers = mutable.ListBuffer.empty[js.MemberDef] - var constructor: Option[js.JSMethodDef] = None + var constructor: Option[js.JSConstructorDef] = None originalClassDef.memberDefs.foreach { case fdef: js.FieldDef => @@ -2321,17 +2334,13 @@ class JSCodeGen()(using genCtx: Context) { "Non-static, unexported method in non-native JS class") classDefMembers += mdef - case mdef: js.JSMethodDef => - mdef.name match { - case js.StringLiteral("constructor") => - assert(!mdef.flags.namespace.isStatic, "Exported static method") - assert(constructor.isEmpty, "two ctors in class") - constructor = Some(mdef) + case cdef: js.JSConstructorDef => + assert(constructor.isEmpty, "two ctors in class") + constructor = Some(cdef) - case _ => - assert(!mdef.flags.namespace.isStatic, "Exported static method") - instanceMembers += mdef - } + case mdef: js.JSMethodDef => + assert(!mdef.flags.namespace.isStatic, "Exported static method") + instanceMembers += mdef case property: js.JSPropertyDef => instanceMembers += property @@ -2361,7 +2370,7 @@ class JSCodeGen()(using genCtx: Context) { val jsClassCaptures = originalClassDef.jsClassCaptures.getOrElse { throw new AssertionError(s"no class captures for anonymous JS class at $pos") } - val js.JSMethodDef(_, _, ctorParams, ctorRestParam, ctorBody) = constructor.getOrElse { + val js.JSConstructorDef(_, ctorParams, ctorRestParam, ctorBody) = constructor.getOrElse { throw new AssertionError("No ctor found") } assert(ctorParams.isEmpty && ctorRestParam.isEmpty, @@ 
-2396,6 +2405,9 @@ class JSCodeGen()(using genCtx: Context) { case mdef: js.MethodDef => throw new AssertionError("unexpected MethodDef") + case cdef: js.JSConstructorDef => + throw new AssertionError("unexpected JSConstructorDef") + case mdef: js.JSMethodDef => implicit val pos = mdef.pos val impl = memberLambda(mdef.args, mdef.restParam, mdef.body) @@ -2468,36 +2480,43 @@ class JSCodeGen()(using genCtx: Context) { } // Transform the constructor body. - val inlinedCtorStats = new ir.Transformers.Transformer { - override def transform(tree: js.Tree, isStat: Boolean): js.Tree = tree match { - // The super constructor call. Transform this into a simple new call. - case js.JSSuperConstructorCall(args) => - implicit val pos = tree.pos - - val newTree = { - val ident = originalClassDef.superClass.getOrElse(throw new FatalError("No superclass")) - if (args.isEmpty && ident.name == JSObjectClassName) - js.JSObjectConstr(Nil) - else - js.JSNew(jsSuperClassRef, args) - } - - js.Block( - js.VarDef(selfName, thisOriginalName, jstpe.AnyType, mutable = false, newTree) :: - memberDefinitions) + val inlinedCtorStats: List[js.Tree] = { + val beforeSuper = ctorBody.beforeSuper - case js.This() => - selfRef(tree.pos) + val superCall = { + implicit val pos = ctorBody.superCall.pos + val js.JSSuperConstructorCall(args) = ctorBody.superCall - // Don't traverse closure boundaries - case closure: js.Closure => - val newCaptureValues = closure.captureValues.map(transformExpr) - closure.copy(captureValues = newCaptureValues)(closure.pos) + val newTree = { + val ident = originalClassDef.superClass.getOrElse(throw new FatalError("No superclass")) + if (args.isEmpty && ident.name == JSObjectClassName) + js.JSObjectConstr(Nil) + else + js.JSNew(jsSuperClassRef, args) + } - case tree => - super.transform(tree, isStat) + val selfVarDef = js.VarDef(selfName, thisOriginalName, jstpe.AnyType, mutable = false, newTree) + selfVarDef :: memberDefinitions } - }.transform(ctorBody, isStat = true) + + // After the super call, substitute `selfRef` for `This()` + val afterSuper = new ir.Transformers.Transformer { + override def transform(tree: js.Tree, isStat: Boolean): js.Tree = tree match { + case js.This() => + selfRef(tree.pos) + + // Don't traverse closure boundaries + case closure: js.Closure => + val newCaptureValues = closure.captureValues.map(transformExpr) + closure.copy(captureValues = newCaptureValues)(closure.pos) + + case tree => + super.transform(tree, isStat) + } + }.transformStats(ctorBody.afterSuper) + + beforeSuper ::: superCall ::: afterSuper + } val closure = js.Closure(arrow = true, jsClassCaptures, Nil, None, js.Block(inlinedCtorStats, selfRef), jsSuperClassValue :: args) @@ -2926,7 +2945,7 @@ class JSCodeGen()(using genCtx: Context) { case defn.ArrayOf(el) => el case JavaArrayType(el) => el case tpe => - val msg = ex"expected Array $tpe" + val msg = em"expected Array $tpe" report.error(msg) ErrorType(msg) } @@ -2989,14 +3008,12 @@ class JSCodeGen()(using genCtx: Context) { implicit val pos: SourcePosition = tree.sourcePos val exception = args.head val genException = genExpr(exception) - js.Throw { - if (exception.tpe.typeSymbol.derivesFrom(jsdefn.JavaScriptExceptionClass)) { - genModuleApplyMethod( - jsdefn.Runtime_unwrapJavaScriptException, - List(genException)) - } else { - genException - } + genException match { + case js.New(cls, _, _) if cls != JavaScriptExceptionClassName => + // Common case where ex is neither null nor a js.JavaScriptException + js.Throw(genException) + case _ => + 
js.Throw(js.UnwrapFromThrowable(genException)) } } @@ -3652,7 +3669,7 @@ class JSCodeGen()(using genCtx: Context) { } else if (sym.isJSType) { if (sym.is(Trait)) { report.error( - s"isInstanceOf[${sym.fullName}] not supported because it is a JS trait", + em"isInstanceOf[${sym.fullName}] not supported because it is a JS trait", pos) js.BooleanLiteral(true) } else { @@ -3982,6 +3999,53 @@ class JSCodeGen()(using genCtx: Context) { js.JSFunctionApply(fVarDef.ref, List(keyVarRef)) })) + case JS_THROW => + // js.special.throw(arg) + js.Throw(genArgs1) + + case JS_TRY_CATCH => + /* js.special.tryCatch(arg1, arg2) + * + * We must generate: + * + * val body = arg1 + * val handler = arg2 + * try { + * body() + * } catch (e) { + * handler(e) + * } + * + * with temporary vals, because `arg2` must be evaluated before + * `body` executes. Moreover, exceptions thrown while evaluating + * the function values `arg1` and `arg2` must not be caught. + */ + val (arg1, arg2) = genArgs2 + val bodyVarDef = js.VarDef(freshLocalIdent("body"), NoOriginalName, + jstpe.AnyType, mutable = false, arg1) + val handlerVarDef = js.VarDef(freshLocalIdent("handler"), NoOriginalName, + jstpe.AnyType, mutable = false, arg2) + val exceptionVarIdent = freshLocalIdent("e") + val exceptionVarRef = js.VarRef(exceptionVarIdent)(jstpe.AnyType) + js.Block( + bodyVarDef, + handlerVarDef, + js.TryCatch( + js.JSFunctionApply(bodyVarDef.ref, Nil), + exceptionVarIdent, + NoOriginalName, + js.JSFunctionApply(handlerVarDef.ref, List(exceptionVarRef)) + )(jstpe.AnyType) + ) + + case WRAP_AS_THROWABLE => + // js.special.wrapAsThrowable(arg) + js.WrapAsThrowable(genArgs1) + + case UNWRAP_FROM_THROWABLE => + // js.special.unwrapFromThrowable(arg) + js.UnwrapFromThrowable(genArgs1) + case UNION_FROM | UNION_FROM_TYPE_CONSTRUCTOR => /* js.|.from and js.|.fromTypeConstructor * We should not have to deal with those. 
They have a perfectly valid @@ -4764,6 +4828,7 @@ object JSCodeGen { private val NullPointerExceptionClass = ClassName("java.lang.NullPointerException") private val JSObjectClassName = ClassName("scala.scalajs.js.Object") + private val JavaScriptExceptionClassName = ClassName("scala.scalajs.js.JavaScriptException") private val ObjectClassRef = jstpe.ClassRef(ir.Names.ObjectClass) diff --git a/compiler/src/dotty/tools/backend/sjs/JSDefinitions.scala b/compiler/src/dotty/tools/backend/sjs/JSDefinitions.scala index c252ac892548..5336d60129ac 100644 --- a/compiler/src/dotty/tools/backend/sjs/JSDefinitions.scala +++ b/compiler/src/dotty/tools/backend/sjs/JSDefinitions.scala @@ -162,10 +162,6 @@ final class JSDefinitions()(using Context) { @threadUnsafe lazy val RuntimePackageVal = requiredPackage("scala.scalajs.runtime") @threadUnsafe lazy val RuntimePackageClass = RuntimePackageVal.moduleClass.asClass - @threadUnsafe lazy val RuntimePackage_wrapJavaScriptExceptionR = RuntimePackageClass.requiredMethodRef("wrapJavaScriptException") - def Runtime_wrapJavaScriptException(using Context) = RuntimePackage_wrapJavaScriptExceptionR.symbol - @threadUnsafe lazy val Runtime_unwrapJavaScriptExceptionR = RuntimePackageClass.requiredMethodRef("unwrapJavaScriptException") - def Runtime_unwrapJavaScriptException(using Context) = Runtime_unwrapJavaScriptExceptionR.symbol @threadUnsafe lazy val Runtime_toScalaVarArgsR = RuntimePackageClass.requiredMethodRef("toScalaVarArgs") def Runtime_toScalaVarArgs(using Context) = Runtime_toScalaVarArgsR.symbol @threadUnsafe lazy val Runtime_toJSVarArgsR = RuntimePackageClass.requiredMethodRef("toJSVarArgs") @@ -206,6 +202,14 @@ final class JSDefinitions()(using Context) { def Special_instanceof(using Context) = Special_instanceofR.symbol @threadUnsafe lazy val Special_strictEqualsR = SpecialPackageClass.requiredMethodRef("strictEquals") def Special_strictEquals(using Context) = Special_strictEqualsR.symbol + @threadUnsafe lazy val Special_throwR = SpecialPackageClass.requiredMethodRef("throw") + def Special_throw(using Context) = Special_throwR.symbol + @threadUnsafe lazy val Special_tryCatchR = SpecialPackageClass.requiredMethodRef("tryCatch") + def Special_tryCatch(using Context) = Special_tryCatchR.symbol + @threadUnsafe lazy val Special_wrapAsThrowableR = SpecialPackageClass.requiredMethodRef("wrapAsThrowable") + def Special_wrapAsThrowable(using Context) = Special_wrapAsThrowableR.symbol + @threadUnsafe lazy val Special_unwrapFromThrowableR = SpecialPackageClass.requiredMethodRef("unwrapFromThrowable") + def Special_unwrapFromThrowable(using Context) = Special_unwrapFromThrowableR.symbol @threadUnsafe lazy val WrappedArrayType: TypeRef = requiredClassRef("scala.scalajs.js.WrappedArray") def WrappedArrayClass(using Context) = WrappedArrayType.symbol.asClass diff --git a/compiler/src/dotty/tools/backend/sjs/JSExportsGen.scala b/compiler/src/dotty/tools/backend/sjs/JSExportsGen.scala index 0884ec19b53e..78412999bb34 100644 --- a/compiler/src/dotty/tools/backend/sjs/JSExportsGen.scala +++ b/compiler/src/dotty/tools/backend/sjs/JSExportsGen.scala @@ -135,8 +135,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { for ((info, _) <- tups.tail) { report.error( - em"export overload conflicts with export of $firstSym: " + - "a field may not share its exported name with another export", + em"export overload conflicts with export of $firstSym: a field may not share its exported name with another export", info.pos) } @@ -264,8 +263,8 @@ final class 
JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { .alternatives assert(!alts.isEmpty, - em"Ended up with no alternatives for ${classSym.fullName}::$name. " + - em"Original set was ${alts} with types ${alts.map(_.info)}") + em"""Ended up with no alternatives for ${classSym.fullName}::$name. + |Original set was ${alts} with types ${alts.map(_.info)}""") val (jsName, isProp) = exportNameInfo(name) @@ -309,7 +308,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { if (isProp && methodSyms.nonEmpty) { val firstAlt = alts.head report.error( - i"Conflicting properties and methods for ${classSym.fullName}::$name.", + em"Conflicting properties and methods for ${classSym.fullName}::$name.", firstAlt.srcPos) implicit val pos = firstAlt.span js.JSPropertyDef(js.MemberFlags.empty, genExpr(name)(firstAlt.sourcePos), None, None) @@ -613,7 +612,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { val altsTypesInfo = alts.map(_.info.show).sorted.mkString("\n ") report.error( - s"Cannot disambiguate overloads for $fullKind $displayName with types\n $altsTypesInfo", + em"Cannot disambiguate overloads for $fullKind $displayName with types\n $altsTypesInfo", pos) } @@ -650,7 +649,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { js.LoadJSConstructor(encodeClassName(superClassSym)) } - val receiver = js.This()(jstpe.AnyType) + val receiver = js.This()(currentThisType) val nameTree = genExpr(sym.jsName) if (sym.isJSGetter) { @@ -754,7 +753,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { genApplyMethodMaybeStatically(receiver, modAccessor, Nil) } } else { - js.This()(encodeClassType(targetSym)) + js.This()(currentThisType) } } @@ -811,7 +810,7 @@ final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { def receiver = if (static) genLoadModule(sym.owner) - else js.This()(encodeClassType(currentClass)) + else js.This()(currentThisType) def boxIfNeeded(call: js.Tree): js.Tree = box(call, atPhase(elimErasedValueTypePhase)(sym.info.resultType)) diff --git a/compiler/src/dotty/tools/backend/sjs/JSPositions.scala b/compiler/src/dotty/tools/backend/sjs/JSPositions.scala index 9b19e66058e8..2fd007165952 100644 --- a/compiler/src/dotty/tools/backend/sjs/JSPositions.scala +++ b/compiler/src/dotty/tools/backend/sjs/JSPositions.scala @@ -6,6 +6,7 @@ import java.net.{URI, URISyntaxException} import dotty.tools.dotc.core._ import Contexts._ +import Decorators.em import dotty.tools.dotc.report @@ -31,7 +32,7 @@ class JSPositions()(using Context) { URIMap(from, to) :: Nil } catch { case e: URISyntaxException => - report.error(s"${e.getInput} is not a valid URI") + report.error(em"${e.getInput} is not a valid URI") Nil } } diff --git a/compiler/src/dotty/tools/backend/sjs/JSPrimitives.scala b/compiler/src/dotty/tools/backend/sjs/JSPrimitives.scala index 6b3854ed677f..029273aed54b 100644 --- a/compiler/src/dotty/tools/backend/sjs/JSPrimitives.scala +++ b/compiler/src/dotty/tools/backend/sjs/JSPrimitives.scala @@ -5,6 +5,7 @@ import Names.TermName import Types._ import Contexts._ import Symbols._ +import Decorators.em import dotty.tools.dotc.ast.tpd._ import dotty.tools.backend.jvm.DottyPrimitives @@ -36,12 +37,16 @@ object JSPrimitives { inline val LINKING_INFO = WITH_CONTEXTUAL_JS_CLASS_VALUE + 1 // runtime.linkingInfo inline val DYNAMIC_IMPORT = LINKING_INFO + 1 // runtime.dynamicImport - inline val STRICT_EQ = DYNAMIC_IMPORT + 1 // js.special.strictEquals - inline val IN = STRICT_EQ + 1 // js.special.in - inline val INSTANCEOF = IN + 1 // 
js.special.instanceof - inline val DELETE = INSTANCEOF + 1 // js.special.delete - inline val FORIN = DELETE + 1 // js.special.forin - inline val DEBUGGER = FORIN + 1 // js.special.debugger + inline val STRICT_EQ = DYNAMIC_IMPORT + 1 // js.special.strictEquals + inline val IN = STRICT_EQ + 1 // js.special.in + inline val INSTANCEOF = IN + 1 // js.special.instanceof + inline val DELETE = INSTANCEOF + 1 // js.special.delete + inline val FORIN = DELETE + 1 // js.special.forin + inline val JS_THROW = FORIN + 1 // js.special.throw + inline val JS_TRY_CATCH = JS_THROW + 1 // js.special.tryCatch + inline val WRAP_AS_THROWABLE = JS_TRY_CATCH + 1 // js.special.wrapAsThrowable + inline val UNWRAP_FROM_THROWABLE = WRAP_AS_THROWABLE + 1 // js.special.unwrapFromThrowable + inline val DEBUGGER = UNWRAP_FROM_THROWABLE + 1 // js.special.debugger inline val THROW = DEBUGGER + 1 @@ -90,7 +95,7 @@ class JSPrimitives(ictx: Context) extends DottyPrimitives(ictx) { def addPrimitives(cls: Symbol, method: TermName, code: Int)(using Context): Unit = { val alts = cls.info.member(method).alternatives.map(_.symbol) if (alts.isEmpty) { - report.error(s"Unknown primitive method $cls.$method") + report.error(em"Unknown primitive method $cls.$method") } else { for (s <- alts) addPrimitive(s, code) @@ -125,6 +130,10 @@ class JSPrimitives(ictx: Context) extends DottyPrimitives(ictx) { addPrimitive(jsdefn.Special_instanceof, INSTANCEOF) addPrimitive(jsdefn.Special_delete, DELETE) addPrimitive(jsdefn.Special_forin, FORIN) + addPrimitive(jsdefn.Special_throw, JS_THROW) + addPrimitive(jsdefn.Special_tryCatch, JS_TRY_CATCH) + addPrimitive(jsdefn.Special_wrapAsThrowable, WRAP_AS_THROWABLE) + addPrimitive(jsdefn.Special_unwrapFromThrowable, UNWRAP_FROM_THROWABLE) addPrimitive(jsdefn.Special_debugger, DEBUGGER) addPrimitive(defn.throwMethod, THROW) diff --git a/compiler/src/dotty/tools/dotc/Bench.scala b/compiler/src/dotty/tools/dotc/Bench.scala index c9c032b0ae7d..5f5e9fc799b5 100644 --- a/compiler/src/dotty/tools/dotc/Bench.scala +++ b/compiler/src/dotty/tools/dotc/Bench.scala @@ -14,24 +14,22 @@ import scala.annotation.internal.sharable object Bench extends Driver: @sharable private var numRuns = 1 - - private def ntimes(n: Int)(op: => Reporter): Reporter = - (0 until n).foldLeft(emptyReporter)((_, _) => op) - + @sharable private var numCompilers = 1 + @sharable private var waitAfter = -1 + @sharable private var curCompiler = 0 @sharable private var times: Array[Int] = _ override def doCompile(compiler: Compiler, files: List[AbstractFile])(using Context): Reporter = - times = new Array[Int](numRuns) var reporter: Reporter = emptyReporter for i <- 0 until numRuns do + val curRun = curCompiler * numRuns + i val start = System.nanoTime() reporter = super.doCompile(compiler, files) - times(i) = ((System.nanoTime - start) / 1000000).toInt - println(s"time elapsed: ${times(i)}ms") - if ctx.settings.Xprompt.value then + times(curRun) = ((System.nanoTime - start) / 1000000).toInt + println(s"time elapsed: ${times(curRun)}ms") + if ctx.settings.Xprompt.value || waitAfter == curRun + 1 then print("hit to continue >") System.in.nn.read() - println() reporter def extractNumArg(args: Array[String], name: String, default: Int = 1): (Int, Array[String]) = { @@ -42,20 +40,26 @@ object Bench extends Driver: def reportTimes() = val best = times.sorted - val measured = numRuns / 3 + val measured = numCompilers * numRuns / 3 val avgBest = best.take(measured).sum / measured val avgLast = times.reverse.take(measured).sum / measured - println(s"best 
out of $numRuns runs: ${best(0)}") + println(s"best out of ${numCompilers * numRuns} runs: ${best(0)}") println(s"average out of best $measured: $avgBest") println(s"average out of last $measured: $avgLast") - override def process(args: Array[String], rootCtx: Context): Reporter = + override def process(args: Array[String]): Reporter = val (numCompilers, args1) = extractNumArg(args, "#compilers") val (numRuns, args2) = extractNumArg(args1, "#runs") + val (waitAfter, args3) = extractNumArg(args2, "#wait-after", -1) + this.numCompilers = numCompilers this.numRuns = numRuns + this.waitAfter = waitAfter + this.times = new Array[Int](numCompilers * numRuns) var reporter: Reporter = emptyReporter - for i <- 0 until numCompilers do - reporter = super.process(args2, rootCtx) + curCompiler = 0 + while curCompiler < numCompilers do + reporter = super.process(args3) + curCompiler += 1 reportTimes() reporter diff --git a/compiler/src/dotty/tools/dotc/CompilationUnit.scala b/compiler/src/dotty/tools/dotc/CompilationUnit.scala index 44ca582c3c61..046b649941b1 100644 --- a/compiler/src/dotty/tools/dotc/CompilationUnit.scala +++ b/compiler/src/dotty/tools/dotc/CompilationUnit.scala @@ -16,6 +16,8 @@ import core.Decorators._ import config.{SourceVersion, Feature} import StdNames.nme import scala.annotation.internal.sharable +import scala.util.control.NoStackTrace +import transform.MacroAnnotations class CompilationUnit protected (val source: SourceFile) { @@ -45,6 +47,8 @@ class CompilationUnit protected (val source: SourceFile) { */ var needsInlining: Boolean = false + var hasMacroAnnotations: Boolean = false + /** Set to `true` if inliner added anonymous mirrors that need to be completed */ var needsMirrorSupport: Boolean = false @@ -102,7 +106,7 @@ class CompilationUnit protected (val source: SourceFile) { object CompilationUnit { - class SuspendException extends Exception + class SuspendException extends Exception with NoStackTrace /** Make a compilation unit for top class `clsd` with the contents of the `unpickled` tree */ def apply(clsd: ClassDenotation, unpickled: Tree, forceTrees: Boolean)(using Context): CompilationUnit = @@ -119,6 +123,7 @@ object CompilationUnit { force.traverse(unit1.tpdTree) unit1.needsStaging = force.containsQuote unit1.needsInlining = force.containsInline + unit1.hasMacroAnnotations = force.containsMacroAnnotation } unit1 } @@ -131,11 +136,11 @@ object CompilationUnit { if (!mustExist) source else if (source.file.isDirectory) { - report.error(s"expected file, received directory '${source.file.path}'") + report.error(em"expected file, received directory '${source.file.path}'") NoSource } else if (!source.file.exists) { - report.error(s"source file not found: ${source.file.path}") + report.error(em"source file not found: ${source.file.path}") NoSource } else source @@ -147,6 +152,7 @@ object CompilationUnit { var containsQuote = false var containsInline = false var containsCaptureChecking = false + var containsMacroAnnotation = false def traverse(tree: Tree)(using Context): Unit = { if (tree.symbol.isQuote) containsQuote = true @@ -160,6 +166,9 @@ object CompilationUnit { Feature.handleGlobalLanguageImport(prefix, imported) case _ => case _ => + for annot <- tree.symbol.annotations do + if MacroAnnotations.isMacroAnnotation(annot) then + ctx.compilationUnit.hasMacroAnnotations = true traverseChildren(tree) } } diff --git a/compiler/src/dotty/tools/dotc/Compiler.scala b/compiler/src/dotty/tools/dotc/Compiler.scala index b121a47781e1..15d4a39c511f 100644 --- 
a/compiler/src/dotty/tools/dotc/Compiler.scala +++ b/compiler/src/dotty/tools/dotc/Compiler.scala @@ -35,6 +35,7 @@ class Compiler { protected def frontendPhases: List[List[Phase]] = List(new Parser) :: // Compiler frontend: scanner, parser List(new TyperPhase) :: // Compiler frontend: namer, typer + List(new CheckUnused.PostTyper) :: // Check for unused elements List(new YCheckPositions) :: // YCheck positions List(new sbt.ExtractDependencies) :: // Sends information on classes' dependencies to sbt via callbacks List(new semanticdb.ExtractSemanticDB) :: // Extract info into .semanticdb files @@ -49,6 +50,7 @@ class Compiler { List(new Pickler) :: // Generate TASTY info List(new Inlining) :: // Inline and execute macros List(new PostInlining) :: // Add mirror support for inlined code + List(new CheckUnused.PostInlining) :: // Check for unused elements List(new Staging) :: // Check staging levels and heal staged types List(new Splicing) :: // Replace level 1 splices with holes List(new PickleQuotes) :: // Turn quoted trees into explicit run-time data structures @@ -87,7 +89,8 @@ class Compiler { new sjs.ExplicitJSClasses, // Make all JS classes explicit (Scala.js only) new ExplicitOuter, // Add accessors to outer classes from nested ones. new ExplicitSelf, // Make references to non-trivial self types explicit as casts - new StringInterpolatorOpt) :: // Optimizes raw and s and f string interpolators by rewriting them to string concatenations or formats + new StringInterpolatorOpt, // Optimizes raw and s and f string interpolators by rewriting them to string concatenations or formats + new DropBreaks) :: // Optimize local Break throws by rewriting them List(new PruneErasedDefs, // Drop erased definitions from scopes and simplify erased expressions new UninitializedDefs, // Replaces `compiletime.uninitialized` by `_` new InlinePatterns, // Remove placeholders of inlined patterns diff --git a/compiler/src/dotty/tools/dotc/Driver.scala b/compiler/src/dotty/tools/dotc/Driver.scala index 14a71463c66d..5a2c8b7be56e 100644 --- a/compiler/src/dotty/tools/dotc/Driver.scala +++ b/compiler/src/dotty/tools/dotc/Driver.scala @@ -94,7 +94,7 @@ class Driver { val newEntries: List[String] = files .flatMap { file => if !file.exists then - report.error(s"File does not exist: ${file.path}") + report.error(em"File does not exist: ${file.path}") None else file.extension match case "jar" => Some(file.path) @@ -102,10 +102,10 @@ class Driver { TastyFileUtil.getClassPath(file) match case Some(classpath) => Some(classpath) case _ => - report.error(s"Could not load classname from: ${file.path}") + report.error(em"Could not load classname from: ${file.path}") None case _ => - report.error(s"File extension is not `tasty` or `jar`: ${file.path}") + report.error(em"File extension is not `tasty` or `jar`: ${file.path}") None } .distinct @@ -171,7 +171,7 @@ class Driver { * the other overloads without worrying about breaking compatibility * with sbt. */ - final def process(args: Array[String]): Reporter = + def process(args: Array[String]): Reporter = process(args, null: Reporter | Null, null: interfaces.CompilerCallback | Null) /** Entry point to the compiler using a custom `Context`. 
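
Aside from the patch itself: the reworked Bench driver above folds the per-run timings of several compiler instances into one flat times array and then reports the best run plus averages over the best and last thirds. Below is a minimal standalone sketch of just that aggregation, with a hypothetical compileOnce stand-in and hard-coded counts in place of the real #compilers/#runs/#wait-after arguments; it is an illustration under those assumptions, not part of the change set.

  // Standalone sketch of the Bench timing aggregation shown in the diff above.
  // compileOnce and the constants are illustrative stand-ins, not the real driver.
  object BenchTimingSketch:
    def main(args: Array[String]): Unit =
      val numCompilers = 2      // stand-in for the #compilers argument
      val numRuns      = 6      // stand-in for the #runs argument
      val times        = new Array[Int](numCompilers * numRuns)

      def compileOnce(): Unit = Thread.sleep(10)   // stand-in for super.doCompile(...)

      // One flat index per run, across all compiler instances.
      for curCompiler <- 0 until numCompilers; i <- 0 until numRuns do
        val curRun = curCompiler * numRuns + i
        val start  = System.nanoTime()
        compileOnce()
        times(curRun) = ((System.nanoTime() - start) / 1000000).toInt
        println(s"time elapsed: ${times(curRun)}ms")

      // Report best run, average of the best third, and average of the last third.
      val best     = times.sorted
      val measured = numCompilers * numRuns / 3
      println(s"best out of ${numCompilers * numRuns} runs: ${best(0)}")
      println(s"average out of best $measured: ${best.take(measured).sum / measured}")
      println(s"average out of last $measured: ${times.reverse.take(measured).sum / measured}")
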
diff --git a/compiler/src/dotty/tools/dotc/Run.scala b/compiler/src/dotty/tools/dotc/Run.scala index 022ffbed5408..8cd1d160b42c 100644 --- a/compiler/src/dotty/tools/dotc/Run.scala +++ b/compiler/src/dotty/tools/dotc/Run.scala @@ -164,10 +164,15 @@ class Run(comp: Compiler, ictx: Context) extends ImplicitRunInfo with Constraint private var finalizeActions = mutable.ListBuffer[() => Unit]() /** Will be set to true if any of the compiled compilation units contains - * a pureFunctions or captureChecking language import. + * a pureFunctions language import. */ var pureFunsImportEncountered = false + /** Will be set to true if any of the compiled compilation units contains + * a captureChecking language import. + */ + var ccImportEncountered = false + def compile(files: List[AbstractFile]): Unit = try val sources = files.map(runContext.getSource(_)) @@ -226,9 +231,13 @@ class Run(comp: Compiler, ictx: Context) extends ImplicitRunInfo with Constraint ctx.settings.Yskip.value, ctx.settings.YstopBefore.value, stopAfter, ctx.settings.Ycheck.value) ctx.base.usePhases(phases) + if ctx.settings.YnoDoubleBindings.value then + ctx.base.checkNoDoubleBindings = true + def runPhases(using Context) = { var lastPrintedTree: PrintedTree = NoPrintedTree val profiler = ctx.profiler + var phasesWereAdjusted = false for (phase <- ctx.base.allPhases) if (phase.isRunnable) @@ -247,6 +256,11 @@ class Run(comp: Compiler, ictx: Context) extends ImplicitRunInfo with Constraint Stats.record(s"retained typed trees at end of $phase", unit.tpdTree.treeSize) ctx.typerState.gc() } + if !phasesWereAdjusted then + phasesWereAdjusted = true + if !Feature.ccEnabledSomewhere then + ctx.base.unlinkPhaseAsDenotTransformer(Phases.checkCapturesPhase.prev) + ctx.base.unlinkPhaseAsDenotTransformer(Phases.checkCapturesPhase) profiler.finished() } diff --git a/compiler/src/dotty/tools/dotc/ast/Desugar.scala b/compiler/src/dotty/tools/dotc/ast/Desugar.scala index 1e1db19bcf25..5326361ada98 100644 --- a/compiler/src/dotty/tools/dotc/ast/Desugar.scala +++ b/compiler/src/dotty/tools/dotc/ast/Desugar.scala @@ -6,6 +6,7 @@ import core._ import util.Spans._, Types._, Contexts._, Constants._, Names._, NameOps._, Flags._ import Symbols._, StdNames._, Trees._, ContextOps._ import Decorators._, transform.SymUtils._ +import Annotations.Annotation import NameKinds.{UniqueName, EvidenceParamName, DefaultGetterName, WildcardParamName} import typer.{Namer, Checking} import util.{Property, SourceFile, SourcePosition, Chars} @@ -117,7 +118,7 @@ object desugar { if (local.exists) (defctx.owner.thisType select local).dealiasKeepAnnots else { def msg = - s"no matching symbol for ${tp.symbol.showLocated} in ${defctx.owner} / ${defctx.effectiveScope.toList}" + em"no matching symbol for ${tp.symbol.showLocated} in ${defctx.owner} / ${defctx.effectiveScope.toList}" ErrorType(msg).assertingErrorsReported(msg) } case _ => @@ -165,32 +166,41 @@ object desugar { * * Generate setter where needed */ - def valDef(vdef0: ValDef)(using Context): Tree = { + def valDef(vdef0: ValDef)(using Context): Tree = val vdef @ ValDef(_, tpt, rhs) = vdef0 - val mods = vdef.mods - val valName = normalizeName(vdef, tpt).asTermName - val vdef1 = cpy.ValDef(vdef)(name = valName) + var mods1 = vdef.mods + + def dropInto(tpt: Tree): Tree = tpt match + case Into(tpt1) => + mods1 = vdef.mods.withAddedAnnotation( + TypedSplice( + Annotation(defn.AllowConversionsAnnot, tpt.span.startPos).tree)) + tpt1 + case ByNameTypeTree(tpt1) => + cpy.ByNameTypeTree(tpt)(dropInto(tpt1)) + case 
PostfixOp(tpt1, op) if op.name == tpnme.raw.STAR => + cpy.PostfixOp(tpt)(dropInto(tpt1), op) + case _ => + tpt + + val vdef1 = cpy.ValDef(vdef)(name = valName, tpt = dropInto(tpt)) + .withMods(mods1) - if (isSetterNeeded(vdef)) { - // TODO: copy of vdef as getter needed? - // val getter = ValDef(mods, name, tpt, rhs) withPos vdef.pos? - // right now vdef maps via expandedTree to a thicket which concerns itself. - // I don't see a problem with that but if there is one we can avoid it by making a copy here. + if isSetterNeeded(vdef) then val setterParam = makeSyntheticParameter(tpt = SetterParamTree().watching(vdef)) // The rhs gets filled in later, when field is generated and getter has parameters (see Memoize miniphase) val setterRhs = if (vdef.rhs.isEmpty) EmptyTree else unitLiteral val setter = cpy.DefDef(vdef)( - name = valName.setterName, - paramss = (setterParam :: Nil) :: Nil, - tpt = TypeTree(defn.UnitType), - rhs = setterRhs - ).withMods((mods | Accessor) &~ (CaseAccessor | GivenOrImplicit | Lazy)) - .dropEndMarker() // the end marker should only appear on the getter definition + name = valName.setterName, + paramss = (setterParam :: Nil) :: Nil, + tpt = TypeTree(defn.UnitType), + rhs = setterRhs + ).withMods((vdef.mods | Accessor) &~ (CaseAccessor | GivenOrImplicit | Lazy)) + .dropEndMarker() // the end marker should only appear on the getter definition Thicket(vdef1, setter) - } else vdef1 - } + end valDef def makeImplicitParameters(tpts: List[Tree], implicitFlag: FlagSet, forPrimaryConstructor: Boolean = false)(using Context): List[ValDef] = for (tpt <- tpts) yield { @@ -911,7 +921,7 @@ object desugar { case params :: paramss1 => // `params` must have a single parameter and without `given` flag def badRightAssoc(problem: String) = - report.error(i"right-associative extension method $problem", mdef.srcPos) + report.error(em"right-associative extension method $problem", mdef.srcPos) extParamss ++ mdef.paramss params match @@ -1137,7 +1147,7 @@ object desugar { def errorOnGivenBinding(bind: Bind)(using Context): Boolean = report.error( em"""${hl("given")} patterns are not allowed in a ${hl("val")} definition, - |please bind to an identifier and use an alias given.""".stripMargin, bind) + |please bind to an identifier and use an alias given.""", bind) false def isTuplePattern(arity: Int): Boolean = pat match { @@ -1237,7 +1247,7 @@ object desugar { def checkOpaqueAlias(tree: MemberDef)(using Context): MemberDef = def check(rhs: Tree): MemberDef = rhs match case bounds: TypeBoundsTree if bounds.alias.isEmpty => - report.error(i"opaque type must have a right-hand side", tree.srcPos) + report.error(em"opaque type must have a right-hand side", tree.srcPos) tree.withMods(tree.mods.withoutFlags(Opaque)) case LambdaTypeTree(_, body) => check(body) case _ => tree @@ -1454,7 +1464,10 @@ object desugar { val param = makeSyntheticParameter( tpt = if params.exists(_.tpt.isEmpty) then TypeTree() - else Tuple(params.map(_.tpt))) + else Tuple(params.map(_.tpt)), + flags = + if params.nonEmpty && params.head.mods.is(Given) then SyntheticTermParam | Given + else SyntheticTermParam) def selector(n: Int) = if (isGenericTuple) Apply(Select(refOfDef(param), nme.apply), Literal(Constant(n))) else Select(refOfDef(param), nme.selectorName(n)) diff --git a/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala b/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala index 096a885dcf32..a1c3c0ed0775 100644 --- a/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala +++ 
b/compiler/src/dotty/tools/dotc/ast/DesugarEnums.scala @@ -75,8 +75,8 @@ object DesugarEnums { def problem = if (!tparam.isOneOf(VarianceFlags)) "is invariant" else "has bounds that depend on a type parameter in the same parameter list" - errorType(i"""cannot determine type argument for enum parent $enumClass, - |type parameter $tparam $problem""", ctx.source.atSpan(span)) + errorType(em"""cannot determine type argument for enum parent $enumClass, + |type parameter $tparam $problem""", ctx.source.atSpan(span)) } } TypeTree(enumClass.typeRef.appliedTo(targs)).withSpan(span) @@ -216,7 +216,7 @@ object DesugarEnums { case Ident(name) => val matches = tparamNames.contains(name) if (matches && (caseTypeParams.nonEmpty || vparamss.isEmpty)) - report.error(i"illegal reference to type parameter $name from enum case", tree.srcPos) + report.error(em"illegal reference to type parameter $name from enum case", tree.srcPos) matches case LambdaTypeTree(lambdaParams, body) => underBinders(lambdaParams, foldOver(x, tree)) diff --git a/compiler/src/dotty/tools/dotc/ast/MainProxies.scala b/compiler/src/dotty/tools/dotc/ast/MainProxies.scala index 040582476e96..c0cf2c0d1b81 100644 --- a/compiler/src/dotty/tools/dotc/ast/MainProxies.scala +++ b/compiler/src/dotty/tools/dotc/ast/MainProxies.scala @@ -56,7 +56,7 @@ object MainProxies { def addArgs(call: untpd.Tree, mt: MethodType, idx: Int): untpd.Tree = if (mt.isImplicitMethod) { - report.error(s"@main method cannot have implicit parameters", pos) + report.error(em"@main method cannot have implicit parameters", pos) call } else { @@ -74,7 +74,7 @@ object MainProxies { mt.resType match { case restpe: MethodType => if (mt.paramInfos.lastOption.getOrElse(NoType).isRepeatedParam) - report.error(s"varargs parameter of @main method must come last", pos) + report.error(em"varargs parameter of @main method must come last", pos) addArgs(call1, restpe, idx + args.length) case _ => call1 @@ -83,7 +83,7 @@ object MainProxies { var result: List[TypeDef] = Nil if (!mainFun.owner.isStaticOwner) - report.error(s"@main method is not statically accessible", pos) + report.error(em"@main method is not statically accessible", pos) else { var call = ref(mainFun.termRef) mainFun.info match { @@ -91,9 +91,9 @@ object MainProxies { case mt: MethodType => call = addArgs(call, mt, 0) case _: PolyType => - report.error(s"@main method cannot have type parameters", pos) + report.error(em"@main method cannot have type parameters", pos) case _ => - report.error(s"@main can only annotate a method", pos) + report.error(em"@main can only annotate a method", pos) } val errVar = Ident(nme.error) val handler = CaseDef( @@ -203,7 +203,7 @@ object MainProxies { )) (sym, paramAnnotations.toVector, defaultValueSymbols(scope, sym), stat.rawComment) :: Nil case mainAnnot :: others => - report.error(s"method cannot have multiple main annotations", mainAnnot.tree) + report.error(em"method cannot have multiple main annotations", mainAnnot.tree) Nil } case stat @ TypeDef(_, impl: Template) if stat.symbol.is(Module) => @@ -379,26 +379,26 @@ object MainProxies { end generateMainClass if (!mainFun.owner.isStaticOwner) - report.error(s"main method is not statically accessible", pos) + report.error(em"main method is not statically accessible", pos) None else mainFun.info match { case _: ExprType => Some(generateMainClass(unitToValue(ref(mainFun.termRef)), Nil, Nil)) case mt: MethodType => if (mt.isImplicitMethod) - report.error(s"main method cannot have implicit parameters", pos) + report.error(em"main method 
cannot have implicit parameters", pos) None else mt.resType match case restpe: MethodType => - report.error(s"main method cannot be curried", pos) + report.error(em"main method cannot be curried", pos) None case _ => Some(generateMainClass(unitToValue(Apply(ref(mainFun.termRef), argRefs(mt))), argValDefs(mt), parameterInfos(mt))) case _: PolyType => - report.error(s"main method cannot have type parameters", pos) + report.error(em"main method cannot have type parameters", pos) None case _ => - report.error(s"main can only annotate a method", pos) + report.error(em"main can only annotate a method", pos) None } } diff --git a/compiler/src/dotty/tools/dotc/ast/NavigateAST.scala b/compiler/src/dotty/tools/dotc/ast/NavigateAST.scala index 054ffe66f323..ace396d1e583 100644 --- a/compiler/src/dotty/tools/dotc/ast/NavigateAST.scala +++ b/compiler/src/dotty/tools/dotc/ast/NavigateAST.scala @@ -4,7 +4,7 @@ package ast import core.Contexts._ import core.Decorators._ import util.Spans._ -import Trees.{MemberDef, DefTree, WithLazyField} +import Trees.{MemberDef, DefTree, WithLazyFields} import dotty.tools.dotc.core.Types.AnnotatedType import dotty.tools.dotc.core.Types.ImportType import dotty.tools.dotc.core.Types.Type @@ -106,16 +106,14 @@ object NavigateAST { // FIXME: We shouldn't be manually forcing trees here, we should replace // our usage of `productIterator` by something in `Positioned` that takes // care of low-level details like this for us. - p match { - case p: WithLazyField[?] => - p.forceIfLazy + p match + case p: WithLazyFields => p.forceFields() case _ => - } val iterator = p match case defdef: DefTree[?] => p.productIterator ++ defdef.mods.productIterator case _ => - p.productIterator + p.productIterator childPath(iterator, p :: path) } else { diff --git a/compiler/src/dotty/tools/dotc/ast/Positioned.scala b/compiler/src/dotty/tools/dotc/ast/Positioned.scala index d14addb8c9c7..dd783be7a9e1 100644 --- a/compiler/src/dotty/tools/dotc/ast/Positioned.scala +++ b/compiler/src/dotty/tools/dotc/ast/Positioned.scala @@ -154,14 +154,17 @@ abstract class Positioned(implicit @constructorOnly src: SourceFile) extends Src } } + private class LastPosRef: + var positioned: Positioned | Null = null + var span = NoSpan + /** Check that all positioned items in this tree satisfy the following conditions: * - Parent spans contain child spans * - If item is a non-empty tree, it has a position */ def checkPos(nonOverlapping: Boolean)(using Context): Unit = try { import untpd._ - var lastPositioned: Positioned | Null = null - var lastSpan = NoSpan + val last = LastPosRef() def check(p: Any): Unit = p match { case p: Positioned => assert(span contains p.span, @@ -181,19 +184,19 @@ abstract class Positioned(implicit @constructorOnly src: SourceFile) extends Src case _: XMLBlock => // FIXME: Trees generated by the XML parser do not satisfy `checkPos` case _: WildcardFunction - if lastPositioned.isInstanceOf[ValDef] && !p.isInstanceOf[ValDef] => + if last.positioned.isInstanceOf[ValDef] && !p.isInstanceOf[ValDef] => // ignore transition from last wildcard parameter to body case _ => - assert(!lastSpan.exists || !p.span.exists || lastSpan.end <= p.span.start, + assert(!last.span.exists || !p.span.exists || last.span.end <= p.span.start, i"""position error, child positions overlap or in wrong order |parent = $this - |1st child = $lastPositioned - |1st child span = $lastSpan + |1st child = ${last.positioned} + |1st child span = ${last.span} |2nd child = $p |2nd child span = ${p.span}""".stripMargin) } - 
lastPositioned = p - lastSpan = p.span + last.positioned = p + last.span = p.span p.checkPos(nonOverlapping) case m: untpd.Modifiers => m.annotations.foreach(check) diff --git a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala index ff59a795d818..9b55db600d3d 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala +++ b/compiler/src/dotty/tools/dotc/ast/TreeInfo.scala @@ -14,10 +14,7 @@ import scala.collection.mutable import scala.annotation.tailrec -trait TreeInfo[T >: Untyped <: Type] { self: Trees.Instance[T] => - - // Note: the <: Type constraint looks necessary (and is needed to make the file compile in dotc). - // But Scalac accepts the program happily without it. Need to find out why. +trait TreeInfo[T <: Untyped] { self: Trees.Instance[T] => def unsplice(tree: Trees.Tree[T]): Trees.Tree[T] = tree @@ -105,6 +102,12 @@ trait TreeInfo[T >: Untyped <: Type] { self: Trees.Instance[T] => case _ => tree } + def stripTyped(tree: Tree): Tree = unsplice(tree) match + case Typed(expr, _) => + stripTyped(expr) + case _ => + tree + /** The number of arguments in an application */ def numArgs(tree: Tree): Int = unsplice(tree) match { case Apply(fn, args) => numArgs(fn) + args.length @@ -113,6 +116,24 @@ trait TreeInfo[T >: Untyped <: Type] { self: Trees.Instance[T] => case _ => 0 } + /** The type arguments of a possibly curried call */ + def typeArgss(tree: Tree): List[List[Tree]] = + @tailrec + def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match + case TypeApply(fn, args) => loop(fn, args :: argss) + case Apply(fn, args) => loop(fn, argss) + case _ => argss + loop(tree, Nil) + + /** The term arguments of a possibly curried call */ + def termArgss(tree: Tree): List[List[Tree]] = + @tailrec + def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match + case Apply(fn, args) => loop(fn, args :: argss) + case TypeApply(fn, args) => loop(fn, argss) + case _ => argss + loop(tree, Nil) + /** All term arguments of an application in a single flattened list */ def allArguments(tree: Tree): List[Tree] = unsplice(tree) match { case Apply(fn, args) => allArguments(fn) ::: args @@ -298,7 +319,7 @@ trait TreeInfo[T >: Untyped <: Type] { self: Trees.Instance[T] => */ def parentsKind(parents: List[Tree])(using Context): FlagSet = parents match { case Nil => NoInitsInterface - case Apply(_, _ :: _) :: _ => EmptyFlags + case Apply(_, _ :: _) :: _ | Block(_, _) :: _ => EmptyFlags case _ :: parents1 => parentsKind(parents1) } @@ -311,6 +332,50 @@ trait TreeInfo[T >: Untyped <: Type] { self: Trees.Instance[T] => case Block(_, expr) => forallResults(expr, p) case _ => p(tree) } + + def appliedCore(tree: Tree): Tree = tree match { + case Apply(fn, _) => appliedCore(fn) + case TypeApply(fn, _) => appliedCore(fn) + case AppliedTypeTree(fn, _) => appliedCore(fn) + case tree => tree + } + + /** Is tree an application with result `this.type`? + * Accept `b.addOne(x)` and also `xs(i) += x` + * where the op is an assignment operator. 
+ */ + def isThisTypeResult(tree: Tree)(using Context): Boolean = appliedCore(tree) match { + case fun @ Select(receiver, op) => + val argss = termArgss(tree) + tree.tpe match { + case ThisType(tref) => + tref.symbol == receiver.symbol + case tref: TermRef => + tref.symbol == receiver.symbol || argss.exists(_.exists(tref.symbol == _.symbol)) + case _ => + def checkSingle(sym: Symbol): Boolean = + (sym == receiver.symbol) || { + receiver match { + case Apply(_, _) => op.isOpAssignmentName // xs(i) += x + case _ => receiver.symbol != NoSymbol && + (receiver.symbol.isGetter || receiver.symbol.isField) // xs.addOne(x) for var xs + } + } + @tailrec def loop(mt: Type): Boolean = mt match { + case m: MethodType => + m.resType match { + case ThisType(tref) => checkSingle(tref.symbol) + case tref: TermRef => checkSingle(tref.symbol) + case restpe => loop(restpe) + } + case PolyType(_, restpe) => loop(restpe) + case _ => false + } + fun.symbol != NoSymbol && loop(fun.symbol.info) + } + case _ => + tree.tpe.isInstanceOf[ThisType] + } } trait UntypedTreeInfo extends TreeInfo[Untyped] { self: Trees.Instance[Untyped] => @@ -686,24 +751,6 @@ trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] => } } - /** The type arguments of a possibly curried call */ - def typeArgss(tree: Tree): List[List[Tree]] = - @tailrec - def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match - case TypeApply(fn, args) => loop(fn, args :: argss) - case Apply(fn, args) => loop(fn, argss) - case _ => argss - loop(tree, Nil) - - /** The term arguments of a possibly curried call */ - def termArgss(tree: Tree): List[List[Tree]] = - @tailrec - def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match - case Apply(fn, args) => loop(fn, args :: argss) - case TypeApply(fn, args) => loop(fn, argss) - case _ => argss - loop(tree, Nil) - /** The type and term arguments of a possibly curried call, in the order they are given */ def allArgss(tree: Tree): List[List[Tree]] = @tailrec @@ -746,8 +793,6 @@ trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] => Some(meth) case Block(Nil, expr) => unapply(expr) - case Inlined(_, bindings, expr) if bindings.forall(isPureBinding) => - unapply(expr) case _ => None } @@ -791,10 +836,12 @@ trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] => /** The symbols defined locally in a statement list */ def localSyms(stats: List[Tree])(using Context): List[Symbol] = - val locals = new mutable.ListBuffer[Symbol] - for stat <- stats do - if stat.isDef && stat.symbol.exists then locals += stat.symbol - locals.toList + if stats.isEmpty then Nil + else + val locals = new mutable.ListBuffer[Symbol] + for stat <- stats do + if stat.isDef && stat.symbol.exists then locals += stat.symbol + locals.toList /** If `tree` is a DefTree, the symbol defined by it, otherwise NoSymbol */ def definedSym(tree: Tree)(using Context): Symbol = @@ -1040,7 +1087,7 @@ trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] => case Inlined(_, Nil, expr) => unapply(expr) case Block(Nil, expr) => unapply(expr) case _ => - tree.tpe.widenTermRefExpr.normalized match + tree.tpe.widenTermRefExpr.dealias.normalized match case ConstantType(Constant(x)) => Some(x) case _ => None } diff --git a/compiler/src/dotty/tools/dotc/ast/TreeMapWithImplicits.scala b/compiler/src/dotty/tools/dotc/ast/TreeMapWithImplicits.scala index caf8d68442f6..e52bf1064e4c 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeMapWithImplicits.scala +++ 
b/compiler/src/dotty/tools/dotc/ast/TreeMapWithImplicits.scala @@ -55,10 +55,10 @@ class TreeMapWithImplicits extends tpd.TreeMapWithPreciseStatContexts { transform(tree.tpt), transform(tree.rhs)(using nestedScopeCtx(tree.paramss.flatten))) } - case impl @ Template(constr, parents, self, _) => + case impl @ Template(constr, _, self, _) => cpy.Template(tree)( transformSub(constr), - transform(parents)(using ctx.superCallContext), + transform(impl.parents)(using ctx.superCallContext), Nil, transformSelf(self), transformStats(impl.body, tree.symbol)) diff --git a/compiler/src/dotty/tools/dotc/ast/TreeTypeMap.scala b/compiler/src/dotty/tools/dotc/ast/TreeTypeMap.scala index 71998aff9304..f5bf55802adf 100644 --- a/compiler/src/dotty/tools/dotc/ast/TreeTypeMap.scala +++ b/compiler/src/dotty/tools/dotc/ast/TreeTypeMap.scala @@ -92,11 +92,11 @@ class TreeTypeMap( cpy.Inlined(tree)(call, bindings1, expanded1) override def transform(tree: tpd.Tree)(using Context): tpd.Tree = treeMap(tree) match { - case impl @ Template(constr, parents, self, _) => + case impl @ Template(constr, _, self, _) => val tmap = withMappedSyms(localSyms(impl :: self :: Nil)) cpy.Template(impl)( constr = tmap.transformSub(constr), - parents = parents.mapconserve(transform), + parents = impl.parents.mapconserve(transform), self = tmap.transformSub(self), body = impl.body mapconserve (tmap.transform(_)(using ctx.withOwner(mapOwner(impl.symbol.owner)))) diff --git a/compiler/src/dotty/tools/dotc/ast/Trees.scala b/compiler/src/dotty/tools/dotc/ast/Trees.scala index 1159d13d5aef..c0b5987c3875 100644 --- a/compiler/src/dotty/tools/dotc/ast/Trees.scala +++ b/compiler/src/dotty/tools/dotc/ast/Trees.scala @@ -15,11 +15,12 @@ import config.Printers.overload import annotation.internal.sharable import annotation.unchecked.uncheckedVariance import annotation.constructorOnly +import compiletime.uninitialized import Decorators._ object Trees { - type Untyped = Nothing + type Untyped = Type | Null /** The total number of created tree nodes, maintained if Stats.enabled */ @sharable var ntrees: Int = 0 @@ -45,36 +46,34 @@ object Trees { * - Type checking an untyped tree should remove all embedded `TypedSplice` * nodes. */ - abstract class Tree[-T >: Untyped](implicit @constructorOnly src: SourceFile) + abstract class Tree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends Positioned, SrcPos, Product, Attachment.Container, printing.Showable { if (Stats.enabled) ntrees += 1 /** The type constructor at the root of the tree */ - type ThisTree[T >: Untyped] <: Tree[T] + type ThisTree[T <: Untyped] <: Tree[T] - protected var myTpe: T @uncheckedVariance = _ + protected var myTpe: T @uncheckedVariance = uninitialized /** Destructively set the type of the tree. This should be called only when it is known that * it is safe under sharing to do so. One use-case is in the withType method below * which implements copy-on-write. Another use-case is in method interpolateAndAdapt in Typer, * where we overwrite with a simplified version of the type itself. */ - private[dotc] def overwriteType(tpe: T): Unit = + private[dotc] def overwriteType(tpe: T @uncheckedVariance): Unit = myTpe = tpe /** The type of the tree. In case of an untyped tree, * an UnAssignedTypeException is thrown. 
(Overridden by empty trees) */ - final def tpe: T @uncheckedVariance = { - if (myTpe == null) - throw UnAssignedTypeException(this) - myTpe - } + final def tpe: T = + if myTpe == null then throw UnAssignedTypeException(this) + myTpe.uncheckedNN /** Copy `tpe` attribute from tree `from` into this tree, independently * whether it is null or not. - final def copyAttr[U >: Untyped](from: Tree[U]): ThisTree[T] = { + final def copyAttr[U <: Untyped](from: Tree[U]): ThisTree[T] = { val t1 = this.withSpan(from.span) val t2 = if (from.myTpe != null) t1.withType(from.myTpe.asInstanceOf[Type]) @@ -131,10 +130,9 @@ object Trees { */ final def hasType: Boolean = myTpe != null - final def typeOpt: Type = myTpe match { + final def typeOpt: Type = myTpe match case tp: Type => tp - case _ => NoType - } + case null => NoType /** The denotation referred to by this tree. * Defined for `DenotingTree`s and `ProxyTree`s, NoDenotation for other @@ -166,7 +164,7 @@ object Trees { def toList: List[Tree[T]] = this :: Nil /** if this tree is the empty tree, the alternative, else this tree */ - inline def orElse[U >: Untyped <: T](inline that: Tree[U]): Tree[U] = + inline def orElse[U >: T <: Untyped](inline that: Tree[U]): Tree[U] = if (this eq genericEmptyTree) that else this /** The number of nodes in this tree */ @@ -217,42 +215,42 @@ object Trees { override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef] } - class UnAssignedTypeException[T >: Untyped](tree: Tree[T]) extends RuntimeException { + class UnAssignedTypeException[T <: Untyped](tree: Tree[T]) extends RuntimeException { override def getMessage: String = s"type of $tree is not assigned" } - type LazyTree[-T >: Untyped] = Tree[T] | Lazy[Tree[T]] - type LazyTreeList[-T >: Untyped] = List[Tree[T]] | Lazy[List[Tree[T]]] + type LazyTree[+T <: Untyped] = Tree[T] | Lazy[Tree[T]] + type LazyTreeList[+T <: Untyped] = List[Tree[T]] | Lazy[List[Tree[T]]] // ------ Categories of trees ----------------------------------- /** Instances of this class are trees for which isType is definitely true. * Note that some trees have isType = true without being TypTrees (e.g. Ident, Annotated) */ - trait TypTree[-T >: Untyped] extends Tree[T] { - type ThisTree[-T >: Untyped] <: TypTree[T] + trait TypTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: TypTree[T] override def isType: Boolean = true } /** Instances of this class are trees for which isTerm is definitely true. * Note that some trees have isTerm = true without being TermTrees (e.g. Ident, Annotated) */ - trait TermTree[-T >: Untyped] extends Tree[T] { - type ThisTree[-T >: Untyped] <: TermTree[T] + trait TermTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: TermTree[T] override def isTerm: Boolean = true } /** Instances of this class are trees which are not terms but are legal * parts of patterns. 
*/ - trait PatternTree[-T >: Untyped] extends Tree[T] { - type ThisTree[-T >: Untyped] <: PatternTree[T] + trait PatternTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: PatternTree[T] override def isPattern: Boolean = true } /** Tree's denotation can be derived from its type */ - abstract class DenotingTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] <: DenotingTree[T] + abstract class DenotingTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: DenotingTree[T] override def denot(using Context): Denotation = typeOpt.stripped match case tpe: NamedType => tpe.denot case tpe: ThisType => tpe.cls.denot @@ -262,8 +260,8 @@ object Trees { /** Tree's denot/isType/isTerm properties come from a subtree * identified by `forwardTo`. */ - abstract class ProxyTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] <: ProxyTree[T] + abstract class ProxyTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: ProxyTree[T] def forwardTo: Tree[T] override def denot(using Context): Denotation = forwardTo.denot override def isTerm: Boolean = forwardTo.isTerm @@ -271,24 +269,24 @@ object Trees { } /** Tree has a name */ - abstract class NameTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends DenotingTree[T] { - type ThisTree[-T >: Untyped] <: NameTree[T] + abstract class NameTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends DenotingTree[T] { + type ThisTree[+T <: Untyped] <: NameTree[T] def name: Name } /** Tree refers by name to a denotation */ - abstract class RefTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends NameTree[T] { - type ThisTree[-T >: Untyped] <: RefTree[T] + abstract class RefTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends NameTree[T] { + type ThisTree[+T <: Untyped] <: RefTree[T] def qualifier: Tree[T] override def isType: Boolean = name.isTypeName override def isTerm: Boolean = name.isTermName } /** Tree defines a new symbol */ - trait DefTree[-T >: Untyped] extends DenotingTree[T] { - type ThisTree[-T >: Untyped] <: DefTree[T] + trait DefTree[+T <: Untyped] extends DenotingTree[T] { + type ThisTree[+T <: Untyped] <: DefTree[T] - private var myMods: untpd.Modifiers | Null = _ + private var myMods: untpd.Modifiers | Null = uninitialized private[dotc] def rawMods: untpd.Modifiers = if (myMods == null) untpd.EmptyModifiers else myMods.uncheckedNN @@ -313,7 +311,7 @@ object Trees { extension (mdef: untpd.DefTree) def mods: untpd.Modifiers = mdef.rawMods - sealed trait WithEndMarker[-T >: Untyped]: + sealed trait WithEndMarker[+T <: Untyped]: self: PackageDef[T] | NamedDefTree[T] => import WithEndMarker.* @@ -356,9 +354,9 @@ object Trees { end WithEndMarker - abstract class NamedDefTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) + abstract class NamedDefTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends NameTree[T] with DefTree[T] with WithEndMarker[T] { - type ThisTree[-T >: Untyped] <: NamedDefTree[T] + type ThisTree[+T <: Untyped] <: NamedDefTree[T] protected def srcName(using Context): Name = if name == nme.CONSTRUCTOR then nme.this_ @@ -395,8 +393,8 @@ object Trees { * The envelope of a MemberDef contains the whole definition and has its point * on the opening keyword (or the next token after that if keyword is 
missing). */ - abstract class MemberDef[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends NamedDefTree[T] { - type ThisTree[-T >: Untyped] <: MemberDef[T] + abstract class MemberDef[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends NamedDefTree[T] { + type ThisTree[+T <: Untyped] <: MemberDef[T] def rawComment: Option[Comment] = getAttachment(DocComment) @@ -409,40 +407,40 @@ object Trees { } /** A ValDef or DefDef tree */ - abstract class ValOrDefDef[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends MemberDef[T] with WithLazyField[Tree[T]] { - type ThisTree[-T >: Untyped] <: ValOrDefDef[T] + abstract class ValOrDefDef[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends MemberDef[T], WithLazyFields { + type ThisTree[+T <: Untyped] <: ValOrDefDef[T] def name: TermName def tpt: Tree[T] - def unforcedRhs: LazyTree[T] = unforced - def rhs(using Context): Tree[T] = forceIfLazy + def unforcedRhs: LazyTree[T] + def rhs(using Context): Tree[T] } - trait ValOrTypeDef[-T >: Untyped] extends MemberDef[T]: - type ThisTree[-T >: Untyped] <: ValOrTypeDef[T] + trait ValOrTypeDef[+T <: Untyped] extends MemberDef[T]: + type ThisTree[+T <: Untyped] <: ValOrTypeDef[T] - type ParamClause[T >: Untyped] = List[ValDef[T]] | List[TypeDef[T]] + type ParamClause[T <: Untyped] = List[ValDef[T]] | List[TypeDef[T]] // ----------- Tree case classes ------------------------------------ /** name */ - case class Ident[-T >: Untyped] private[ast] (name: Name)(implicit @constructorOnly src: SourceFile) + case class Ident[+T <: Untyped] private[ast] (name: Name)(implicit @constructorOnly src: SourceFile) extends RefTree[T] { - type ThisTree[-T >: Untyped] = Ident[T] + type ThisTree[+T <: Untyped] = Ident[T] def qualifier: Tree[T] = genericEmptyTree def isBackquoted: Boolean = hasAttachment(Backquoted) } - class SearchFailureIdent[-T >: Untyped] private[ast] (name: Name, expl: => String)(implicit @constructorOnly src: SourceFile) + class SearchFailureIdent[+T <: Untyped] private[ast] (name: Name, expl: => String)(implicit @constructorOnly src: SourceFile) extends Ident[T](name) { def explanation = expl override def toString: String = s"SearchFailureIdent($explanation)" } /** qualifier.name, or qualifier#name, if qualifier is a type */ - case class Select[-T >: Untyped] private[ast] (qualifier: Tree[T], name: Name)(implicit @constructorOnly src: SourceFile) + case class Select[+T <: Untyped] private[ast] (qualifier: Tree[T], name: Name)(implicit @constructorOnly src: SourceFile) extends RefTree[T] { - type ThisTree[-T >: Untyped] = Select[T] + type ThisTree[+T <: Untyped] = Select[T] override def denot(using Context): Denotation = typeOpt match case ConstantType(_) if ConstFold.foldedUnops.contains(name) => @@ -464,15 +462,15 @@ object Trees { else span } - class SelectWithSig[-T >: Untyped] private[ast] (qualifier: Tree[T], name: Name, val sig: Signature)(implicit @constructorOnly src: SourceFile) + class SelectWithSig[+T <: Untyped] private[ast] (qualifier: Tree[T], name: Name, val sig: Signature)(implicit @constructorOnly src: SourceFile) extends Select[T](qualifier, name) { override def toString: String = s"SelectWithSig($qualifier, $name, $sig)" } /** qual.this */ - case class This[-T >: Untyped] private[ast] (qual: untpd.Ident)(implicit @constructorOnly src: SourceFile) + case class This[+T <: Untyped] private[ast] (qual: untpd.Ident)(implicit @constructorOnly src: SourceFile) extends DenotingTree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] = This[T] 
+ type ThisTree[+T <: Untyped] = This[T] // Denotation of a This tree is always the underlying class; needs correction for modules. override def denot(using Context): Denotation = typeOpt match { @@ -484,21 +482,21 @@ object Trees { } /** C.super[mix], where qual = C.this */ - case class Super[-T >: Untyped] private[ast] (qual: Tree[T], mix: untpd.Ident)(implicit @constructorOnly src: SourceFile) + case class Super[+T <: Untyped] private[ast] (qual: Tree[T], mix: untpd.Ident)(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] = Super[T] + type ThisTree[+T <: Untyped] = Super[T] def forwardTo: Tree[T] = qual } - abstract class GenericApply[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] <: GenericApply[T] + abstract class GenericApply[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] <: GenericApply[T] val fun: Tree[T] val args: List[Tree[T]] def forwardTo: Tree[T] = fun } object GenericApply: - def unapply[T >: Untyped](tree: Tree[T]): Option[(Tree[T], List[Tree[T]])] = tree match + def unapply[T <: Untyped](tree: Tree[T]): Option[(Tree[T], List[Tree[T]])] = tree match case tree: GenericApply[T] => Some((tree.fun, tree.args)) case _ => None @@ -509,9 +507,9 @@ object Trees { case InfixTuple // r f (x1, ..., xN) where N != 1; needs to be treated specially for an error message in typedApply /** fun(args) */ - case class Apply[-T >: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class Apply[+T <: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends GenericApply[T] { - type ThisTree[-T >: Untyped] = Apply[T] + type ThisTree[+T <: Untyped] = Apply[T] def setApplyKind(kind: ApplyKind) = putAttachment(untpd.KindOfApply, kind) @@ -525,57 +523,57 @@ object Trees { } /** fun[args] */ - case class TypeApply[-T >: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class TypeApply[+T <: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends GenericApply[T] { - type ThisTree[-T >: Untyped] = TypeApply[T] + type ThisTree[+T <: Untyped] = TypeApply[T] } /** const */ - case class Literal[-T >: Untyped] private[ast] (const: Constant)(implicit @constructorOnly src: SourceFile) + case class Literal[+T <: Untyped] private[ast] (const: Constant)(implicit @constructorOnly src: SourceFile) extends Tree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] = Literal[T] + type ThisTree[+T <: Untyped] = Literal[T] } /** new tpt, but no constructor call */ - case class New[-T >: Untyped] private[ast] (tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + case class New[+T <: Untyped] private[ast] (tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] = New[T] + type ThisTree[+T <: Untyped] = New[T] } /** expr : tpt */ - case class Typed[-T >: Untyped] private[ast] (expr: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Typed[+T <: Untyped] private[ast] (expr: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TermTree[T] { - type ThisTree[-T >: Untyped] = Typed[T] + type ThisTree[+T <: Untyped] = Typed[T] def forwardTo: Tree[T] = 
expr } /** name = arg, in a parameter list */ - case class NamedArg[-T >: Untyped] private[ast] (name: Name, arg: Tree[T])(implicit @constructorOnly src: SourceFile) + case class NamedArg[+T <: Untyped] private[ast] (name: Name, arg: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] = NamedArg[T] + type ThisTree[+T <: Untyped] = NamedArg[T] } /** name = arg, outside a parameter list */ - case class Assign[-T >: Untyped] private[ast] (lhs: Tree[T], rhs: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Assign[+T <: Untyped] private[ast] (lhs: Tree[T], rhs: Tree[T])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = Assign[T] + type ThisTree[+T <: Untyped] = Assign[T] } /** { stats; expr } */ - case class Block[-T >: Untyped] private[ast] (stats: List[Tree[T]], expr: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Block[+T <: Untyped] private[ast] (stats: List[Tree[T]], expr: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] = Block[T] + type ThisTree[+T <: Untyped] = Block[T] override def isType: Boolean = expr.isType override def isTerm: Boolean = !isType // this will classify empty trees as terms, which is necessary } /** if cond then thenp else elsep */ - case class If[-T >: Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) + case class If[+T <: Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = If[T] + type ThisTree[+T <: Untyped] = If[T] def isInline = false } - class InlineIf[-T >: Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) + class InlineIf[+T <: Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) extends If(cond, thenp, elsep) { override def isInline = true override def toString = s"InlineIf($cond, $thenp, $elsep)" @@ -590,33 +588,33 @@ object Trees { * of the closure is a function type, otherwise it is the type * given in `tpt`, which must be a SAM type. 
*/ - case class Closure[-T >: Untyped] private[ast] (env: List[Tree[T]], meth: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Closure[+T <: Untyped] private[ast] (env: List[Tree[T]], meth: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = Closure[T] + type ThisTree[+T <: Untyped] = Closure[T] } /** selector match { cases } */ - case class Match[-T >: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + case class Match[+T <: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = Match[T] + type ThisTree[+T <: Untyped] = Match[T] def isInline = false } - class InlineMatch[-T >: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + class InlineMatch[+T <: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) extends Match(selector, cases) { override def isInline = true override def toString = s"InlineMatch($selector, $cases)" } /** case pat if guard => body */ - case class CaseDef[-T >: Untyped] private[ast] (pat: Tree[T], guard: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) + case class CaseDef[+T <: Untyped] private[ast] (pat: Tree[T], guard: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] = CaseDef[T] + type ThisTree[+T <: Untyped] = CaseDef[T] } /** label[tpt]: { expr } */ - case class Labeled[-T >: Untyped] private[ast] (bind: Bind[T], expr: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Labeled[+T <: Untyped] private[ast] (bind: Bind[T], expr: Tree[T])(implicit @constructorOnly src: SourceFile) extends NameTree[T] { - type ThisTree[-T >: Untyped] = Labeled[T] + type ThisTree[+T <: Untyped] = Labeled[T] def name: Name = bind.name } @@ -625,33 +623,33 @@ object Trees { * After program transformations this is not necessarily the enclosing method, because * closures can intervene. */ - case class Return[-T >: Untyped] private[ast] (expr: Tree[T], from: Tree[T] = genericEmptyTree)(implicit @constructorOnly src: SourceFile) + case class Return[+T <: Untyped] private[ast] (expr: Tree[T], from: Tree[T] = genericEmptyTree)(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = Return[T] + type ThisTree[+T <: Untyped] = Return[T] } /** while (cond) { body } */ - case class WhileDo[-T >: Untyped] private[ast] (cond: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) + case class WhileDo[+T <: Untyped] private[ast] (cond: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = WhileDo[T] + type ThisTree[+T <: Untyped] = WhileDo[T] } /** try block catch cases finally finalizer */ - case class Try[-T >: Untyped] private[ast] (expr: Tree[T], cases: List[CaseDef[T]], finalizer: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Try[+T <: Untyped] private[ast] (expr: Tree[T], cases: List[CaseDef[T]], finalizer: Tree[T])(implicit @constructorOnly src: SourceFile) extends TermTree[T] { - type ThisTree[-T >: Untyped] = Try[T] + type ThisTree[+T <: Untyped] = Try[T] } /** Seq(elems) * @param tpt The element type of the sequence. 
*/ - case class SeqLiteral[-T >: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) + case class SeqLiteral[+T <: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] = SeqLiteral[T] + type ThisTree[+T <: Untyped] = SeqLiteral[T] } /** Array(elems) */ - class JavaSeqLiteral[-T >: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) + class JavaSeqLiteral[+T <: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends SeqLiteral(elems, elemtpt) { override def toString: String = s"JavaSeqLiteral($elems, $elemtpt)" } @@ -672,17 +670,17 @@ object Trees { * different context: `bindings` represent the arguments to the inlined * call, whereas `expansion` represents the body of the inlined function. */ - case class Inlined[-T >: Untyped] private[ast] (call: tpd.Tree, bindings: List[MemberDef[T]], expansion: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Inlined[+T <: Untyped] private[ast] (call: tpd.Tree, bindings: List[MemberDef[T]], expansion: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] = Inlined[T] + type ThisTree[+T <: Untyped] = Inlined[T] override def isTerm = expansion.isTerm override def isType = expansion.isType } /** A type tree that represents an existing or inferred type */ - case class TypeTree[-T >: Untyped]()(implicit @constructorOnly src: SourceFile) + case class TypeTree[+T <: Untyped]()(implicit @constructorOnly src: SourceFile) extends DenotingTree[T] with TypTree[T] { - type ThisTree[-T >: Untyped] = TypeTree[T] + type ThisTree[+T <: Untyped] = TypeTree[T] override def isEmpty: Boolean = !hasType override def toString: String = s"TypeTree${if (hasType) s"[$typeOpt]" else ""}" @@ -693,25 +691,25 @@ object Trees { * - as a (result-)type of an inferred ValDef or DefDef. * Every TypeVar is created as the type of one InferredTypeTree. 
*/ - class InferredTypeTree[-T >: Untyped](implicit @constructorOnly src: SourceFile) extends TypeTree[T] + class InferredTypeTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends TypeTree[T] /** ref.type */ - case class SingletonTypeTree[-T >: Untyped] private[ast] (ref: Tree[T])(implicit @constructorOnly src: SourceFile) + case class SingletonTypeTree[+T <: Untyped] private[ast] (ref: Tree[T])(implicit @constructorOnly src: SourceFile) extends DenotingTree[T] with TypTree[T] { - type ThisTree[-T >: Untyped] = SingletonTypeTree[T] + type ThisTree[+T <: Untyped] = SingletonTypeTree[T] } /** tpt { refinements } */ - case class RefinedTypeTree[-T >: Untyped] private[ast] (tpt: Tree[T], refinements: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class RefinedTypeTree[+T <: Untyped] private[ast] (tpt: Tree[T], refinements: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TypTree[T] { - type ThisTree[-T >: Untyped] = RefinedTypeTree[T] + type ThisTree[+T <: Untyped] = RefinedTypeTree[T] def forwardTo: Tree[T] = tpt } /** tpt[args] */ - case class AppliedTypeTree[-T >: Untyped] private[ast] (tpt: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class AppliedTypeTree[+T <: Untyped] private[ast] (tpt: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TypTree[T] { - type ThisTree[-T >: Untyped] = AppliedTypeTree[T] + type ThisTree[+T <: Untyped] = AppliedTypeTree[T] def forwardTo: Tree[T] = tpt } @@ -738,40 +736,40 @@ object Trees { * source code written by the user with the trees used by the compiler (for * example, to make "find all references" work in the IDE). */ - case class LambdaTypeTree[-T >: Untyped] private[ast] (tparams: List[TypeDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) + case class LambdaTypeTree[+T <: Untyped] private[ast] (tparams: List[TypeDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) extends TypTree[T] { - type ThisTree[-T >: Untyped] = LambdaTypeTree[T] + type ThisTree[+T <: Untyped] = LambdaTypeTree[T] } - case class TermLambdaTypeTree[-T >: Untyped] private[ast] (params: List[ValDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) + case class TermLambdaTypeTree[+T <: Untyped] private[ast] (params: List[ValDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) extends TypTree[T] { - type ThisTree[-T >: Untyped] = TermLambdaTypeTree[T] + type ThisTree[+T <: Untyped] = TermLambdaTypeTree[T] } /** [bound] selector match { cases } */ - case class MatchTypeTree[-T >: Untyped] private[ast] (bound: Tree[T], selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + case class MatchTypeTree[+T <: Untyped] private[ast] (bound: Tree[T], selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) extends TypTree[T] { - type ThisTree[-T >: Untyped] = MatchTypeTree[T] + type ThisTree[+T <: Untyped] = MatchTypeTree[T] } /** => T */ - case class ByNameTypeTree[-T >: Untyped] private[ast] (result: Tree[T])(implicit @constructorOnly src: SourceFile) + case class ByNameTypeTree[+T <: Untyped] private[ast] (result: Tree[T])(implicit @constructorOnly src: SourceFile) extends TypTree[T] { - type ThisTree[-T >: Untyped] = ByNameTypeTree[T] + type ThisTree[+T <: Untyped] = ByNameTypeTree[T] } /** >: lo <: hi * >: lo <: hi = alias for RHS of bounded opaque type */ - case class TypeBoundsTree[-T >: Untyped] private[ast] (lo: 
Tree[T], hi: Tree[T], alias: Tree[T])(implicit @constructorOnly src: SourceFile) + case class TypeBoundsTree[+T <: Untyped] private[ast] (lo: Tree[T], hi: Tree[T], alias: Tree[T])(implicit @constructorOnly src: SourceFile) extends TypTree[T] { - type ThisTree[-T >: Untyped] = TypeBoundsTree[T] + type ThisTree[+T <: Untyped] = TypeBoundsTree[T] } /** name @ body */ - case class Bind[-T >: Untyped] private[ast] (name: Name, body: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Bind[+T <: Untyped] private[ast] (name: Name, body: Tree[T])(implicit @constructorOnly src: SourceFile) extends NamedDefTree[T] with PatternTree[T] { - type ThisTree[-T >: Untyped] = Bind[T] + type ThisTree[+T <: Untyped] = Bind[T] override def isType: Boolean = name.isTypeName override def isTerm: Boolean = name.isTermName @@ -780,9 +778,9 @@ object Trees { } /** tree_1 | ... | tree_n */ - case class Alternative[-T >: Untyped] private[ast] (trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class Alternative[+T <: Untyped] private[ast] (trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends PatternTree[T] { - type ThisTree[-T >: Untyped] = Alternative[T] + type ThisTree[+T <: Untyped] = Alternative[T] } /** The typed translation of `extractor(patterns)` in a pattern. The translation has the following @@ -799,29 +797,33 @@ object Trees { * val result = fun(sel)(implicits) * if (result.isDefined) "match patterns against result" */ - case class UnApply[-T >: Untyped] private[ast] (fun: Tree[T], implicits: List[Tree[T]], patterns: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class UnApply[+T <: Untyped] private[ast] (fun: Tree[T], implicits: List[Tree[T]], patterns: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with PatternTree[T] { - type ThisTree[-T >: Untyped] = UnApply[T] + type ThisTree[+T <: Untyped] = UnApply[T] def forwardTo = fun } /** mods val name: tpt = rhs */ - case class ValDef[-T >: Untyped] private[ast] (name: TermName, tpt: Tree[T], private var preRhs: LazyTree[T @uncheckedVariance])(implicit @constructorOnly src: SourceFile) + case class ValDef[+T <: Untyped] private[ast] (name: TermName, tpt: Tree[T], private var preRhs: LazyTree[T])(implicit @constructorOnly src: SourceFile) extends ValOrDefDef[T], ValOrTypeDef[T] { - type ThisTree[-T >: Untyped] = ValDef[T] + type ThisTree[+T <: Untyped] = ValDef[T] assert(isEmpty || (tpt ne genericEmptyTree)) - def unforced: LazyTree[T] = preRhs - protected def force(x: Tree[T @uncheckedVariance]): Unit = preRhs = x + + def unforcedRhs: LazyTree[T] = preRhs + def forceFields()(using Context): Unit = preRhs = force(preRhs) + def rhs(using Context): Tree[T] = { forceFields(); preRhs.asInstanceOf[Tree[T]] } } /** mods def name[tparams](vparams_1)...(vparams_n): tpt = rhs */ - case class DefDef[-T >: Untyped] private[ast] (name: TermName, - paramss: List[ParamClause[T]], tpt: Tree[T], private var preRhs: LazyTree[T @uncheckedVariance])(implicit @constructorOnly src: SourceFile) + case class DefDef[+T <: Untyped] private[ast] (name: TermName, + paramss: List[ParamClause[T]], tpt: Tree[T], private var preRhs: LazyTree[T])(implicit @constructorOnly src: SourceFile) extends ValOrDefDef[T] { - type ThisTree[-T >: Untyped] = DefDef[T] + type ThisTree[+T <: Untyped] = DefDef[T] assert(tpt ne genericEmptyTree) - def unforced: LazyTree[T] = preRhs - protected def force(x: Tree[T @uncheckedVariance]): Unit = preRhs = x + + def unforcedRhs: LazyTree[T] = preRhs + def 
forceFields()(using Context): Unit = preRhs = force(preRhs) + def rhs(using Context): Tree[T] = { forceFields(); preRhs.asInstanceOf[Tree[T]] } def leadingTypeParams(using Context): List[TypeDef[T]] = paramss match case (tparams @ (tparam: TypeDef[_]) :: _) :: _ => tparams.asInstanceOf[List[TypeDef[T]]] @@ -842,9 +844,9 @@ object Trees { * mods type name >: lo <: hi, if rhs = TypeBoundsTree(lo, hi) or * mods type name >: lo <: hi = rhs if rhs = TypeBoundsTree(lo, hi, alias) and opaque in mods */ - case class TypeDef[-T >: Untyped] private[ast] (name: TypeName, rhs: Tree[T])(implicit @constructorOnly src: SourceFile) + case class TypeDef[+T <: Untyped] private[ast] (name: TypeName, rhs: Tree[T])(implicit @constructorOnly src: SourceFile) extends MemberDef[T], ValOrTypeDef[T] { - type ThisTree[-T >: Untyped] = TypeDef[T] + type ThisTree[+T <: Untyped] = TypeDef[T] /** Is this a definition of a class? */ def isClassDef: Boolean = rhs.isInstanceOf[Template[?]] @@ -857,22 +859,26 @@ object Trees { * if this is of class untpd.DerivingTemplate. * Typed templates only have parents. */ - case class Template[-T >: Untyped] private[ast] (constr: DefDef[T], parentsOrDerived: List[Tree[T]], self: ValDef[T], private var preBody: LazyTreeList[T @uncheckedVariance])(implicit @constructorOnly src: SourceFile) - extends DefTree[T] with WithLazyField[List[Tree[T]]] { - type ThisTree[-T >: Untyped] = Template[T] - def unforcedBody: LazyTreeList[T] = unforced - def unforced: LazyTreeList[T] = preBody - protected def force(x: List[Tree[T @uncheckedVariance]]): Unit = preBody = x - def body(using Context): List[Tree[T]] = forceIfLazy + case class Template[+T <: Untyped] private[ast] (constr: DefDef[T], private var preParentsOrDerived: LazyTreeList[T], self: ValDef[T], private var preBody: LazyTreeList[T])(implicit @constructorOnly src: SourceFile) + extends DefTree[T] with WithLazyFields { + type ThisTree[+T <: Untyped] = Template[T] + + def forceFields()(using Context): Unit = + preParentsOrDerived = force(preParentsOrDerived) + preBody = force(preBody) - def parents: List[Tree[T]] = parentsOrDerived // overridden by DerivingTemplate - def derived: List[untpd.Tree] = Nil // overridden by DerivingTemplate + def unforcedBody: LazyTreeList[T] = preBody + def body(using Context): List[Tree[T]] = { forceFields(); preBody.asInstanceOf[List[Tree[T]]] } + def parentsOrDerived(using Context): List[Tree[T]] = { forceFields(); preParentsOrDerived.asInstanceOf[List[Tree[T]]] } + + def parents(using Context): List[Tree[T]] = parentsOrDerived // overridden by DerivingTemplate + def derived: List[untpd.Tree] = Nil // overridden by DerivingTemplate } - abstract class ImportOrExport[-T >: Untyped](implicit @constructorOnly src: SourceFile) + abstract class ImportOrExport[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends DenotingTree[T] { - type ThisTree[-T >: Untyped] <: ImportOrExport[T] + type ThisTree[+T <: Untyped] <: ImportOrExport[T] val expr: Tree[T] val selectors: List[untpd.ImportSelector] } @@ -881,36 +887,36 @@ object Trees { * where a selector is either an untyped `Ident`, `name` or * an untyped thicket consisting of `name` and `rename`. 
*/ - case class Import[-T >: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) + case class Import[+T <: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) extends ImportOrExport[T] { - type ThisTree[-T >: Untyped] = Import[T] + type ThisTree[+T <: Untyped] = Import[T] } /** export expr.selectors * where a selector is either an untyped `Ident`, `name` or * an untyped thicket consisting of `name` and `rename`. */ - case class Export[-T >: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) + case class Export[+T <: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) extends ImportOrExport[T] { - type ThisTree[-T >: Untyped] = Export[T] + type ThisTree[+T <: Untyped] = Export[T] } /** package pid { stats } */ - case class PackageDef[-T >: Untyped] private[ast] (pid: RefTree[T], stats: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class PackageDef[+T <: Untyped] private[ast] (pid: RefTree[T], stats: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with WithEndMarker[T] { - type ThisTree[-T >: Untyped] = PackageDef[T] + type ThisTree[+T <: Untyped] = PackageDef[T] def forwardTo: RefTree[T] = pid protected def srcName(using Context): Name = pid.name } /** arg @annot */ - case class Annotated[-T >: Untyped] private[ast] (arg: Tree[T], annot: Tree[T])(implicit @constructorOnly src: SourceFile) + case class Annotated[+T <: Untyped] private[ast] (arg: Tree[T], annot: Tree[T])(implicit @constructorOnly src: SourceFile) extends ProxyTree[T] { - type ThisTree[-T >: Untyped] = Annotated[T] + type ThisTree[+T <: Untyped] = Annotated[T] def forwardTo: Tree[T] = arg } - trait WithoutTypeOrPos[-T >: Untyped] extends Tree[T] { + trait WithoutTypeOrPos[+T <: Untyped] extends Tree[T] { override def withTypeUnchecked(tpe: Type): ThisTree[Type] = this.asInstanceOf[ThisTree[Type]] override def span: Span = NoSpan override def span_=(span: Span): Unit = {} @@ -921,17 +927,17 @@ object Trees { * The contained trees will be integrated when transformed with * a `transform(List[Tree])` call. 
*/ - case class Thicket[-T >: Untyped](trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + case class Thicket[+T <: Untyped](trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) extends Tree[T] with WithoutTypeOrPos[T] { myTpe = NoType.asInstanceOf[T] - type ThisTree[-T >: Untyped] = Thicket[T] + type ThisTree[+T <: Untyped] = Thicket[T] - def mapElems(op: Tree[T] => Tree[T] @uncheckedVariance): Thicket[T] = { + def mapElems[U >: T <: Untyped](op: Tree[T] => Tree[U]): Thicket[U] = { val newTrees = trees.mapConserve(op) if (trees eq newTrees) this else - Thicket[T](newTrees)(source).asInstanceOf[this.type] + Thicket[U](newTrees)(source).asInstanceOf[this.type] } override def foreachInThicket(op: Tree[T] => Unit): Unit = @@ -950,12 +956,12 @@ object Trees { mapElems(_.withSpan(span)).asInstanceOf[this.type] } - class EmptyTree[T >: Untyped] extends Thicket(Nil)(NoSource) { + class EmptyTree[T <: Untyped] extends Thicket(Nil)(NoSource) { // assert(uniqueId != 1492) override def withSpan(span: Span) = throw AssertionError("Cannot change span of EmptyTree") } - class EmptyValDef[T >: Untyped] extends ValDef[T]( + class EmptyValDef[T <: Untyped] extends ValDef[T]( nme.WILDCARD, genericEmptyTree[T], genericEmptyTree[T])(NoSource) with WithoutTypeOrPos[T] { myTpe = NoType.asInstanceOf[T] setMods(untpd.Modifiers(PrivateLocal)) @@ -966,8 +972,8 @@ object Trees { @sharable val theEmptyTree = new EmptyTree[Type]() @sharable val theEmptyValDef = new EmptyValDef[Type]() - def genericEmptyValDef[T >: Untyped]: ValDef[T] = theEmptyValDef.asInstanceOf[ValDef[T]] - def genericEmptyTree[T >: Untyped]: Thicket[T] = theEmptyTree.asInstanceOf[Thicket[T]] + def genericEmptyValDef[T <: Untyped]: ValDef[T] = theEmptyValDef.asInstanceOf[ValDef[T]] + def genericEmptyTree[T <: Untyped]: Thicket[T] = theEmptyTree.asInstanceOf[Thicket[T]] /** Tree that replaces a level 1 splices in pickled (level 0) quotes. * It is only used when picking quotes (will never be in a TASTy file). @@ -978,13 +984,13 @@ object Trees { * @param content Lambda that computes the content of the hole. This tree is empty when in a quote pickle. * @param tpt Type of the hole */ - case class Hole[-T >: Untyped](isTermHole: Boolean, idx: Int, args: List[Tree[T]], content: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { - type ThisTree[-T >: Untyped] <: Hole[T] + case class Hole[+T <: Untyped](isTermHole: Boolean, idx: Int, args: List[Tree[T]], content: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: Hole[T] override def isTerm: Boolean = isTermHole override def isType: Boolean = !isTermHole } - def flatten[T >: Untyped](trees: List[Tree[T]]): List[Tree[T]] = { + def flatten[T <: Untyped](trees: List[Tree[T]]): List[Tree[T]] = { def recur(buf: ListBuffer[Tree[T]] | Null, remaining: List[Tree[T]]): ListBuffer[Tree[T]] | Null = remaining match { case Thicket(elems) :: remaining1 => @@ -1010,34 +1016,31 @@ object Trees { // ----- Lazy trees and tree sequences - /** A tree that can have a lazy field - * The field is represented by some private `var` which is - * accessed by `unforced` and `force`. Forcing the field will - * set the `var` to the underlying value. 
- */ - trait WithLazyField[+T <: AnyRef] { - def unforced: T | Lazy[T] - protected def force(x: T @uncheckedVariance): Unit - def forceIfLazy(using Context): T = unforced match { - case lzy: Lazy[T @unchecked] => - val x = lzy.complete - force(x) - x - case x: T @ unchecked => x - } - } - /** A base trait for lazy tree fields. * These can be instantiated with Lazy instances which * can delay tree construction until the field is first demanded. */ - trait Lazy[+T <: AnyRef] { + trait Lazy[+T <: AnyRef]: def complete(using Context): T - } + + /** A tree that can have a lazy fields. + * Such fields are variables of type `T | Lazy[T]`, for some tyope `T`. + */ + trait WithLazyFields: + + /** If `x` is lazy, computes the underlying value */ + protected def force[T <: AnyRef](x: T | Lazy[T])(using Context): T = x match + case x: Lazy[T] @unchecked => x.complete + case x: T @unchecked => x + + /** Assigns all lazy fields their underlying non-lazy value. */ + def forceFields()(using Context): Unit + + end WithLazyFields // ----- Generic Tree Instances, inherited from `tpt` and `untpd`. - abstract class Instance[T >: Untyped <: Type] { inst => + abstract class Instance[T <: Untyped] { inst => type Tree = Trees.Tree[T] type TypTree = Trees.TypTree[T] @@ -1357,7 +1360,7 @@ object Trees { DefDef(tree: Tree)(name, paramss, tpt, rhs) def TypeDef(tree: TypeDef)(name: TypeName = tree.name, rhs: Tree = tree.rhs)(using Context): TypeDef = TypeDef(tree: Tree)(name, rhs) - def Template(tree: Template)(constr: DefDef = tree.constr, parents: List[Tree] = tree.parents, derived: List[untpd.Tree] = tree.derived, self: ValDef = tree.self, body: LazyTreeList = tree.unforcedBody)(using Context): Template = + def Template(tree: Template)(using Context)(constr: DefDef = tree.constr, parents: List[Tree] = tree.parents, derived: List[untpd.Tree] = tree.derived, self: ValDef = tree.self, body: LazyTreeList = tree.unforcedBody): Template = Template(tree: Tree)(constr, parents, derived, self, body) def Hole(tree: Hole)(isTerm: Boolean = tree.isTerm, idx: Int = tree.idx, args: List[Tree] = tree.args, content: Tree = tree.content, tpt: Tree = tree.tpt)(using Context): Hole = Hole(tree: Tree)(isTerm, idx, args, content, tpt) @@ -1372,7 +1375,7 @@ object Trees { * innermost enclosing call for which the inlined version is currently * processed. */ - protected def inlineContext(call: Tree)(using Context): Context = ctx + protected def inlineContext(call: tpd.Tree)(using Context): Context = ctx /** The context to use when mapping or accumulating over a tree */ def localCtx(tree: Tree)(using Context): Context @@ -1620,8 +1623,8 @@ object Trees { inContext(localCtx(tree)) { this(x, rhs) } - case tree @ Template(constr, parents, self, _) if tree.derived.isEmpty => - this(this(this(this(x, constr), parents), self), tree.body) + case tree @ Template(constr, _, self, _) if tree.derived.isEmpty => + this(this(this(this(x, constr), tree.parents), self), tree.body) case Import(expr, _) => this(x, expr) case Export(expr, _) => @@ -1747,7 +1750,7 @@ object Trees { val denot = receiver.tpe.member(method) if !denot.exists then overload.println(i"members = ${receiver.tpe.decls}") - report.error(i"no member $receiver . $method", receiver.srcPos) + report.error(em"no member $receiver . 
$method", receiver.srcPos) val selected = if (denot.isOverloaded) { def typeParamCount(tp: Type) = tp.widen match { diff --git a/compiler/src/dotty/tools/dotc/ast/tpd.scala b/compiler/src/dotty/tools/dotc/ast/tpd.scala index 52325e36037d..01d61986dee4 100644 --- a/compiler/src/dotty/tools/dotc/ast/tpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/tpd.scala @@ -428,7 +428,7 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { else val res = Select(TypeTree(pre), tp) if needLoad && !res.symbol.isStatic then - throw new TypeError(em"cannot establish a reference to $res") + throw TypeError(em"cannot establish a reference to $res") res def ref(sym: Symbol)(using Context): Tree = @@ -857,7 +857,7 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { } /** After phase `trans`, set the owner of every definition in this tree that was formerly - * owner by `from` to `to`. + * owned by `from` to `to`. */ def changeOwnerAfter(from: Symbol, to: Symbol, trans: DenotTransformer)(using Context): ThisTree = if (ctx.phase == trans.next) { @@ -1144,35 +1144,38 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { expand(tree, tree.tpe.widen) } - inline val MapRecursionLimit = 10 - extension (trees: List[Tree]) - /** A map that expands to a recursive function. It's equivalent to + /** Equivalent (but faster) to * * flatten(trees.mapConserve(op)) * - * and falls back to it after `MaxRecursionLimit` recursions. - * Before that it uses a simpler method that uses stackspace - * instead of heap. - * Note `op` is duplicated in the generated code, so it should be - * kept small. + * assuming that `trees` does not contain `Thicket`s to start with. */ - inline def mapInline(inline op: Tree => Tree): List[Tree] = - def recur(trees: List[Tree], count: Int): List[Tree] = - if count > MapRecursionLimit then - // use a slower implementation that avoids stack overflows - flatten(trees.mapConserve(op)) - else trees match - case tree :: rest => - val tree1 = op(tree) - val rest1 = recur(rest, count + 1) - if (tree1 eq tree) && (rest1 eq rest) then trees - else tree1 match - case Thicket(elems1) => elems1 ::: rest1 - case _ => tree1 :: rest1 - case nil => nil - recur(trees, 0) + inline def flattenedMapConserve(inline f: Tree => Tree): List[Tree] = + @tailrec + def loop(mapped: ListBuffer[Tree] | Null, unchanged: List[Tree], pending: List[Tree]): List[Tree] = + if pending.isEmpty then + if mapped == null then unchanged + else mapped.prependToList(unchanged) + else + val head0 = pending.head + val head1 = f(head0) + + if head1 eq head0 then + loop(mapped, unchanged, pending.tail) + else + val buf = if mapped == null then new ListBuffer[Tree] else mapped + var xc = unchanged + while xc ne pending do + buf += xc.head + xc = xc.tail + head1 match + case Thicket(elems1) => buf ++= elems1 + case _ => buf += head1 + val tail0 = pending.tail + loop(buf, tail0, tail0) + loop(null, trees, trees) /** Transform statements while maintaining import contexts and expression contexts * in the same way as Typer does. 
The code addresses additional concerns: @@ -1296,7 +1299,7 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { else if (tree.tpe.widen isRef numericCls) tree else { - report.warning(i"conversion from ${tree.tpe.widen} to ${numericCls.typeRef} will always fail at runtime.") + report.warning(em"conversion from ${tree.tpe.widen} to ${numericCls.typeRef} will always fail at runtime.") Throw(New(defn.ClassCastExceptionClass.typeRef, Nil)).withSpan(tree.span) } } @@ -1495,7 +1498,7 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { } } - /** Creates the tuple type tree repesentation of the type trees in `ts` */ + /** Creates the tuple type tree representation of the type trees in `ts` */ def tupleTypeTree(elems: List[Tree])(using Context): Tree = { val arity = elems.length if arity <= Definitions.MaxTupleArity then @@ -1506,10 +1509,14 @@ object tpd extends Trees.Instance[Type] with TypedTreeInfo { else nestedPairsTypeTree(elems) } - /** Creates the nested pairs type tree repesentation of the type trees in `ts` */ + /** Creates the nested pairs type tree representation of the type trees in `ts` */ def nestedPairsTypeTree(ts: List[Tree])(using Context): Tree = ts.foldRight[Tree](TypeTree(defn.EmptyTupleModule.termRef))((x, acc) => AppliedTypeTree(TypeTree(defn.PairClass.typeRef), x :: acc :: Nil)) + /** Creates the nested higher-kinded pairs type tree representation of the type trees in `ts` */ + def hkNestedPairsTypeTree(ts: List[Tree])(using Context): Tree = + ts.foldRight[Tree](TypeTree(defn.QuoteMatching_KNil.typeRef))((x, acc) => AppliedTypeTree(TypeTree(defn.QuoteMatching_KCons.typeRef), x :: acc :: Nil)) + /** Replaces all positions in `tree` with zero-extent positions */ private def focusPositions(tree: Tree)(using Context): Tree = { val transformer = new tpd.TreeMap { diff --git a/compiler/src/dotty/tools/dotc/ast/untpd.scala b/compiler/src/dotty/tools/dotc/ast/untpd.scala index ec3eb4f05b79..aeebb1f203e8 100644 --- a/compiler/src/dotty/tools/dotc/ast/untpd.scala +++ b/compiler/src/dotty/tools/dotc/ast/untpd.scala @@ -42,7 +42,7 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { /** mods object name impl */ case class ModuleDef(name: TermName, impl: Template)(implicit @constructorOnly src: SourceFile) extends MemberDef { - type ThisTree[-T >: Untyped] <: Trees.NameTree[T] with Trees.MemberDef[T] with ModuleDef + type ThisTree[+T <: Untyped] <: Trees.NameTree[T] with Trees.MemberDef[T] with ModuleDef def withName(name: Name)(using Context): ModuleDef = cpy.ModuleDef(this)(name.toTermName, impl) } @@ -54,7 +54,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { */ class DerivingTemplate(constr: DefDef, parentsOrDerived: List[Tree], self: ValDef, preBody: LazyTreeList, derivedCount: Int)(implicit @constructorOnly src: SourceFile) extends Template(constr, parentsOrDerived, self, preBody) { - override val parents = parentsOrDerived.dropRight(derivedCount) + private val myParents = parentsOrDerived.dropRight(derivedCount) + override def parents(using Context) = myParents override val derived = parentsOrDerived.takeRight(derivedCount) } @@ -117,6 +118,7 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { case class ContextBounds(bounds: TypeBoundsTree, cxBounds: List[Tree])(implicit @constructorOnly src: SourceFile) extends TypTree case class PatDef(mods: Modifiers, pats: List[Tree], tpt: Tree, rhs: Tree)(implicit @constructorOnly src: SourceFile) extends DefTree case class ExtMethods(paramss: 
List[ParamClause], methods: List[Tree])(implicit @constructorOnly src: SourceFile) extends Tree + case class Into(tpt: Tree)(implicit @constructorOnly src: SourceFile) extends Tree case class MacroTree(expr: Tree)(implicit @constructorOnly src: SourceFile) extends Tree case class ImportSelector(imported: Ident, renamed: Tree = EmptyTree, bound: Tree = EmptyTree)(implicit @constructorOnly src: SourceFile) extends Tree { @@ -414,6 +416,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { def Template(constr: DefDef, parents: List[Tree], derived: List[Tree], self: ValDef, body: LazyTreeList)(implicit src: SourceFile): Template = if (derived.isEmpty) new Template(constr, parents, self, body) else new DerivingTemplate(constr, parents ++ derived, self, body, derived.length) + def Template(constr: DefDef, parents: LazyTreeList, self: ValDef, body: LazyTreeList)(implicit src: SourceFile): Template = + new Template(constr, parents, self, body) def Import(expr: Tree, selectors: List[ImportSelector])(implicit src: SourceFile): Import = new Import(expr, selectors) def Export(expr: Tree, selectors: List[ImportSelector])(implicit src: SourceFile): Export = new Export(expr, selectors) def PackageDef(pid: RefTree, stats: List[Tree])(implicit src: SourceFile): PackageDef = new PackageDef(pid, stats) @@ -649,6 +653,9 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { def ExtMethods(tree: Tree)(paramss: List[ParamClause], methods: List[Tree])(using Context): Tree = tree match case tree: ExtMethods if (paramss eq tree.paramss) && (methods == tree.methods) => tree case _ => finalize(tree, untpd.ExtMethods(paramss, methods)(tree.source)) + def Into(tree: Tree)(tpt: Tree)(using Context): Tree = tree match + case tree: Into if tpt eq tree.tpt => tree + case _ => finalize(tree, untpd.Into(tpt)(tree.source)) def ImportSelector(tree: Tree)(imported: Ident, renamed: Tree, bound: Tree)(using Context): Tree = tree match { case tree: ImportSelector if (imported eq tree.imported) && (renamed eq tree.renamed) && (bound eq tree.bound) => tree case _ => finalize(tree, untpd.ImportSelector(imported, renamed, bound)(tree.source)) @@ -718,6 +725,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { cpy.PatDef(tree)(mods, transform(pats), transform(tpt), transform(rhs)) case ExtMethods(paramss, methods) => cpy.ExtMethods(tree)(transformParamss(paramss), transformSub(methods)) + case Into(tpt) => + cpy.Into(tree)(transform(tpt)) case ImportSelector(imported, renamed, bound) => cpy.ImportSelector(tree)(transformSub(imported), transform(renamed), transform(bound)) case Number(_, _) | TypedSplice(_) => @@ -777,6 +786,8 @@ object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { this(this(this(x, pats), tpt), rhs) case ExtMethods(paramss, methods) => this(paramss.foldLeft(x)(apply), methods) + case Into(tpt) => + this(x, tpt) case ImportSelector(imported, renamed, bound) => this(this(this(x, imported), renamed), bound) case Number(_, _) => diff --git a/compiler/src/dotty/tools/dotc/cc/CaptureOps.scala b/compiler/src/dotty/tools/dotc/cc/CaptureOps.scala index 3261cb1d90f8..e4533aa73ce0 100644 --- a/compiler/src/dotty/tools/dotc/cc/CaptureOps.scala +++ b/compiler/src/dotty/tools/dotc/cc/CaptureOps.scala @@ -166,8 +166,54 @@ extension (tp: Type) case CapturingType(_, _) => true case _ => false + def isEventuallyCapturingType(using Context): Boolean = + tp match + case EventuallyCapturingType(_, _) => true + case _ => false + + /** Is type known to be always pure by 
its class structure, + * so that adding a capture set to it would not make sense? + */ + def isAlwaysPure(using Context): Boolean = tp.dealias match + case tp: (TypeRef | AppliedType) => + val sym = tp.typeSymbol + if sym.isClass then sym.isPureClass + else tp.superType.isAlwaysPure + case CapturingType(parent, refs) => + parent.isAlwaysPure || refs.isAlwaysEmpty + case tp: TypeProxy => + tp.superType.isAlwaysPure + case tp: AndType => + tp.tp1.isAlwaysPure || tp.tp2.isAlwaysPure + case tp: OrType => + tp.tp1.isAlwaysPure && tp.tp2.isAlwaysPure + case _ => + false + +extension (cls: ClassSymbol) + + def pureBaseClass(using Context): Option[Symbol] = + cls.baseClasses.find(bc => + defn.pureBaseClasses.contains(bc) + || { + val selfType = bc.givenSelfType + selfType.exists && selfType.captureSet.isAlwaysEmpty + }) + extension (sym: Symbol) + /** A class is pure if: + * - one its base types has an explicitly declared self type with an empty capture set + * - or it is a value class + * - or it is an exception + * - or it is one of Nothing, Null, or String + */ + def isPureClass(using Context): Boolean = sym match + case cls: ClassSymbol => + cls.pureBaseClass.isDefined || defn.pureSimpleClasses.contains(cls) + case _ => + false + /** Does this symbol allow results carrying the universal capability? * Currently this is true only for function type applies (since their * results are unboxed) and `erasedValue` since this function is magic in diff --git a/compiler/src/dotty/tools/dotc/cc/CaptureSet.scala b/compiler/src/dotty/tools/dotc/cc/CaptureSet.scala index d3e32ac538a4..2b9fe9d3d923 100644 --- a/compiler/src/dotty/tools/dotc/cc/CaptureSet.scala +++ b/compiler/src/dotty/tools/dotc/cc/CaptureSet.scala @@ -222,7 +222,7 @@ sealed abstract class CaptureSet extends Showable: /** The largest subset (via <:<) of this capture set that only contains elements * for which `p` is true. 
*/ - def filter(p: CaptureRef => Boolean)(using Context): CaptureSet = + def filter(p: Context ?=> CaptureRef => Boolean)(using Context): CaptureSet = if this.isConst then val elems1 = elems.filter(p) if elems1 == elems then this @@ -271,7 +271,7 @@ sealed abstract class CaptureSet extends Showable: map(Substituters.SubstParamsMap(tl, to)) /** Invoke handler if this set has (or later aquires) the root capability `*` */ - def disallowRootCapability(handler: () => Unit)(using Context): this.type = + def disallowRootCapability(handler: () => Context ?=> Unit)(using Context): this.type = if isUniversal then handler() this @@ -373,7 +373,7 @@ object CaptureSet: def isAlwaysEmpty = false /** A handler to be invoked if the root reference `*` is added to this set */ - var addRootHandler: () => Unit = () => () + var rootAddedHandler: () => Context ?=> Unit = () => () var description: String = "" @@ -404,7 +404,7 @@ object CaptureSet: def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = if !isConst && recordElemsState() then elems ++= newElems - if isUniversal then addRootHandler() + if isUniversal then rootAddedHandler() // assert(id != 2 || elems.size != 2, this) (CompareResult.OK /: deps) { (r, dep) => r.andAlso(dep.tryInclude(newElems, this)) @@ -421,8 +421,8 @@ object CaptureSet: else CompareResult.fail(this) - override def disallowRootCapability(handler: () => Unit)(using Context): this.type = - addRootHandler = handler + override def disallowRootCapability(handler: () => Context ?=> Unit)(using Context): this.type = + rootAddedHandler = handler super.disallowRootCapability(handler) private var computingApprox = false @@ -546,7 +546,7 @@ object CaptureSet: else CompareResult.fail(this) } .andAlso { - if (origin ne source) && mapIsIdempotent then + if (origin ne source) && (origin ne initial) && mapIsIdempotent then // `tm` is idempotent, propagate back elems from image set. // This is sound, since we know that for `r in newElems: tm(r) = r`, hence // `r` is _one_ possible solution in `source` that would make an `r` appear in this set. @@ -559,7 +559,7 @@ object CaptureSet: // elements from variable sources in contra- and non-variant positions. In essence, // we approximate types resulting from such maps by returning a possible super type // from the actual type. But this is neither sound nor complete. 
- report.warning(i"trying to add elems ${CaptureSet(newElems)} from unrecognized source $origin of mapped set $this$whereCreated") + report.warning(em"trying to add elems ${CaptureSet(newElems)} from unrecognized source $origin of mapped set $this$whereCreated") CompareResult.fail(this) else CompareResult.OK @@ -613,7 +613,7 @@ object CaptureSet: /** A variable with elements given at any time as { x <- source.elems | p(x) } */ class Filtered private[CaptureSet] - (val source: Var, p: CaptureRef => Boolean)(using @constructorOnly ctx: Context) + (val source: Var, p: Context ?=> CaptureRef => Boolean)(using @constructorOnly ctx: Context) extends DerivedVar(source.elems.filter(p)): override def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = diff --git a/compiler/src/dotty/tools/dotc/cc/CapturingType.scala b/compiler/src/dotty/tools/dotc/cc/CapturingType.scala index e9862f1f20b8..a7c283f4cc3b 100644 --- a/compiler/src/dotty/tools/dotc/cc/CapturingType.scala +++ b/compiler/src/dotty/tools/dotc/cc/CapturingType.scala @@ -48,6 +48,16 @@ object CapturingType: EventuallyCapturingType.unapply(tp) else None + /** Check whether a type is uncachable when computing `baseType`. + * - Avoid caching all the types during the setup phase, since at that point + * the capture set variables are not fully installed yet. + * - Avoid caching capturing types when IgnoreCaptures mode is set, since the + * capture sets may be thrown away in the computed base type. + */ + def isUncachable(tp: Type)(using Context): Boolean = + ctx.phase == Phases.checkCapturesPhase && + (Setup.isDuringSetup || ctx.mode.is(Mode.IgnoreCaptures) && tp.isEventuallyCapturingType) + end CapturingType /** An extractor for types that will be capturing types at phase CheckCaptures. Also diff --git a/compiler/src/dotty/tools/dotc/cc/CheckCaptures.scala b/compiler/src/dotty/tools/dotc/cc/CheckCaptures.scala index cf1d4266e89b..077d345d792d 100644 --- a/compiler/src/dotty/tools/dotc/cc/CheckCaptures.scala +++ b/compiler/src/dotty/tools/dotc/cc/CheckCaptures.scala @@ -10,7 +10,8 @@ import config.Printers.{capt, recheckr} import config.{Config, Feature} import ast.{tpd, untpd, Trees} import Trees.* -import typer.RefChecks.{checkAllOverrides, checkParents} +import typer.RefChecks.{checkAllOverrides, checkSelfAgainstParents, OverridingPairsChecker} +import typer.Checking.{checkBounds, checkAppliedTypesIn} import util.{SimpleIdentitySet, EqHashMap, SrcPos} import transform.SymUtils.* import transform.{Recheck, PreRecheck} @@ -18,6 +19,7 @@ import Recheck.* import scala.collection.mutable import CaptureSet.{withCaptureSetsExplained, IdempotentCaptRefMap} import StdNames.nme +import NameKinds.DefaultGetterName import reporting.trace /** The capture checker */ @@ -139,25 +141,12 @@ class CheckCaptures extends Recheck, SymTransformer: override def run(using Context): Unit = if Feature.ccEnabled then - checkOverrides.traverse(ctx.compilationUnit.tpdTree) super.run override def transformSym(sym: SymDenotation)(using Context): SymDenotation = if Synthetics.needsTransform(sym) then Synthetics.transformFromCC(sym) else super.transformSym(sym) - /** Check overrides again, taking capture sets into account. - * TODO: Can we avoid doing overrides checks twice? - * We need to do them here since only at this phase CaptureTypes are relevant - * But maybe we can then elide the check during the RefChecks phase under captureChecking? 
- */ - def checkOverrides = new TreeTraverser: - def traverse(t: Tree)(using Context) = - t match - case t: Template => checkAllOverrides(ctx.owner.asClass) - case _ => - traverseChildren(t) - class CaptureChecker(ictx: Context) extends Rechecker(ictx): import ast.tpd.* @@ -201,7 +190,7 @@ class CheckCaptures extends Recheck, SymTransformer: def checkElem(elem: CaptureRef, cs: CaptureSet, pos: SrcPos)(using Context) = val res = elem.singletonCaptureSet.subCaptures(cs, frozen = false) if !res.isOK then - report.error(i"$elem cannot be referenced here; it is not included in the allowed capture set ${res.blocking}", pos) + report.error(em"$elem cannot be referenced here; it is not included in the allowed capture set ${res.blocking}", pos) /** Check subcapturing `cs1 <: cs2`, report error on failure */ def checkSubset(cs1: CaptureSet, cs2: CaptureSet, pos: SrcPos)(using Context) = @@ -210,7 +199,7 @@ class CheckCaptures extends Recheck, SymTransformer: def header = if cs1.elems.size == 1 then i"reference ${cs1.elems.toList}%, % is not" else i"references $cs1 are not all" - report.error(i"$header included in allowed capture set ${res.blocking}", pos) + report.error(em"$header included in allowed capture set ${res.blocking}", pos) /** The current environment */ private var curEnv: Env = Env(NoSymbol, nestedInOwner = false, CaptureSet.empty, isBoxed = false, null) @@ -335,12 +324,21 @@ class CheckCaptures extends Recheck, SymTransformer: override def recheckApply(tree: Apply, pt: Type)(using Context): Type = val meth = tree.fun.symbol includeCallCaptures(meth, tree.srcPos) - if meth == defn.Caps_unsafeBox || meth == defn.Caps_unsafeUnbox then + def mapArgUsing(f: Type => Type) = val arg :: Nil = tree.args: @unchecked - val argType0 = recheckStart(arg, pt) - .forceBoxStatus(boxed = meth == defn.Caps_unsafeBox) + val argType0 = f(recheckStart(arg, pt)) val argType = super.recheckFinish(argType0, arg, pt) super.recheckFinish(argType, tree, pt) + + if meth == defn.Caps_unsafeBox then + mapArgUsing(_.forceBoxStatus(true)) + else if meth == defn.Caps_unsafeUnbox then + mapArgUsing(_.forceBoxStatus(false)) + else if meth == defn.Caps_unsafeBoxFunArg then + mapArgUsing { + case defn.FunctionOf(paramtpe :: Nil, restpe, isContectual, isErased) => + defn.FunctionOf(paramtpe.forceBoxStatus(true) :: Nil, restpe, isContectual, isErased) + } else super.recheckApply(tree, pt) match case appType @ CapturingType(appType1, refs) => @@ -432,7 +430,8 @@ class CheckCaptures extends Recheck, SymTransformer: block match case closureDef(mdef) => pt.dealias match - case defn.FunctionOf(ptformals, _, _, _) if ptformals.forall(_.captureSet.isAlwaysEmpty) => + case defn.FunctionOf(ptformals, _, _, _) + if ptformals.nonEmpty && ptformals.forall(_.captureSet.isAlwaysEmpty) => // Redo setup of the anonymous function so that formal parameters don't // get capture sets. This is important to avoid false widenings to `*` // when taking the base type of the actual closures's dependent function @@ -442,9 +441,10 @@ class CheckCaptures extends Recheck, SymTransformer: // First, undo the previous setup which installed a completer for `meth`. 
atPhase(preRecheckPhase.prev)(meth.denot.copySymDenotation()) .installAfter(preRecheckPhase) + // Next, update all parameter symbols to match expected formals meth.paramSymss.head.lazyZip(ptformals).foreach { (psym, pformal) => - psym.copySymDenotation(info = pformal).installAfter(preRecheckPhase) + psym.updateInfoBetween(preRecheckPhase, thisPhase, pformal.mapExprType) } // Next, update types of parameter ValDefs mdef.paramss.head.lazyZip(ptformals).foreach { (param, pformal) => @@ -452,36 +452,21 @@ class CheckCaptures extends Recheck, SymTransformer: tpt.rememberTypeAlways(pformal) } // Next, install a new completer reflecting the new parameters for the anonymous method + val mt = meth.info.asInstanceOf[MethodType] val completer = new LazyType: def complete(denot: SymDenotation)(using Context) = - denot.info = MethodType(ptformals, mdef.tpt.knownType) + denot.info = mt.companion(ptformals, mdef.tpt.knownType) .showing(i"simplify info of $meth to $result", capt) recheckDef(mdef, meth) - meth.copySymDenotation(info = completer, initFlags = meth.flags &~ Touched) - .installAfter(preRecheckPhase) + meth.updateInfoBetween(preRecheckPhase, thisPhase, completer) case _ => case _ => super.recheckBlock(block, pt) - /** If `rhsProto` has `*` as its capture set, wrap `rhs` in a `unsafeBox`. - * Used to infer `unsafeBox` for expressions that get assigned to variables - * that have universal capture set. - */ - def maybeBox(rhs: Tree, rhsProto: Type)(using Context): Tree = - if rhsProto.captureSet.isUniversal then - ref(defn.Caps_unsafeBox).appliedToType(rhsProto).appliedTo(rhs) - else rhs - - override def recheckAssign(tree: Assign)(using Context): Type = - val rhsProto = recheck(tree.lhs).widen - recheck(maybeBox(tree.rhs, rhsProto), rhsProto) - defn.UnitType - override def recheckValDef(tree: ValDef, sym: Symbol)(using Context): Unit = try if !sym.is(Module) then // Modules are checked by checking the module class - if sym.is(Mutable) then recheck(maybeBox(tree.rhs, sym.info), sym.info) - else super.recheckValDef(tree, sym) + super.recheckValDef(tree, sym) finally if !sym.is(Param) then // Parameters with inferred types belong to anonymous methods. We need to wait @@ -503,7 +488,8 @@ class CheckCaptures extends Recheck, SymTransformer: /** Class-specific capture set relations: * 1. The capture set of a class includes the capture sets of its parents. * 2. The capture set of the self type of a class includes the capture set of the class. - * 3. The capture set of the self type of a class includes the capture set of every class parameter. + * 3. The capture set of the self type of a class includes the capture set of every class parameter, + * unless the parameter is marked @constructorOnly. 
*/ override def recheckClassDef(tree: TypeDef, impl: Template, cls: ClassSymbol)(using Context): Type = val saved = curEnv @@ -515,7 +501,12 @@ class CheckCaptures extends Recheck, SymTransformer: val thisSet = cls.classInfo.selfType.captureSet.withDescription(i"of the self type of $cls") checkSubset(localSet, thisSet, tree.srcPos) // (2) for param <- cls.paramGetters do - checkSubset(param.termRef.captureSet, thisSet, param.srcPos) // (3) + if !param.hasAnnotation(defn.ConstructorOnlyAnnot) then + checkSubset(param.termRef.captureSet, thisSet, param.srcPos) // (3) + for pureBase <- cls.pureBaseClass do + checkSubset(thisSet, + CaptureSet.empty.withDescription(i"of pure base class $pureBase"), + tree.srcPos) super.recheckClassDef(tree, impl, cls) finally curEnv = saved @@ -602,11 +593,28 @@ class CheckCaptures extends Recheck, SymTransformer: /** Massage `actual` and `expected` types using the methods below before checking conformance */ override def checkConformsExpr(actual: Type, expected: Type, tree: Tree)(using Context): Unit = - val expected1 = addOuterRefs(expected, actual) + val expected1 = alignDependentFunction(addOuterRefs(expected, actual), actual.stripCapturing) val actual1 = adaptBoxed(actual, expected1, tree.srcPos) //println(i"check conforms $actual1 <<< $expected1") super.checkConformsExpr(actual1, expected1, tree) + private def toDepFun(args: List[Type], resultType: Type, isContextual: Boolean, isErased: Boolean)(using Context): Type = + MethodType.companion(isContextual = isContextual, isErased = isErased)(args, resultType) + .toFunctionType(isJava = false, alwaysDependent = true) + + /** Turn `expected` into a dependent function when `actual` is dependent. */ + private def alignDependentFunction(expected: Type, actual: Type)(using Context): Type = + def recur(expected: Type): Type = expected.dealias match + case expected @ CapturingType(eparent, refs) => + CapturingType(recur(eparent), refs, boxed = expected.isBoxed) + case expected @ defn.FunctionOf(args, resultType, isContextual, isErased) + if defn.isNonRefinedFunction(expected) && defn.isFunctionType(actual) && !defn.isNonRefinedFunction(actual) => + val expected1 = toDepFun(args, resultType, isContextual, isErased) + expected1 + case _ => + expected + recur(expected) + /** For the expected type, implement the rule outlined in #14390: * - when checking an expression `a: Ca Ta` against an expected type `Ce Te`, * - where the capture set `Ce` contains Cls.this, @@ -647,8 +655,11 @@ class CheckCaptures extends Recheck, SymTransformer: case _ => expected - /** Adapt `actual` type to `expected` type by inserting boxing and unboxing conversions */ - def adaptBoxed(actual: Type, expected: Type, pos: SrcPos)(using Context): Type = + /** Adapt `actual` type to `expected` type by inserting boxing and unboxing conversions + * + * @param alwaysConst always make capture set variables constant after adaptation + */ + def adaptBoxed(actual: Type, expected: Type, pos: SrcPos, alwaysConst: Boolean = false)(using Context): Type = /** Adapt function type `actual`, which is `aargs -> ares` (possibly with dependencies) * to `expected` type. 
@@ -725,7 +736,8 @@ class CheckCaptures extends Recheck, SymTransformer: else ((parent, cs, tp.isBoxed), reconstruct) case actual => - ((actual, CaptureSet(), false), reconstruct) + val res = if tp.isFromJavaObject then tp else actual + ((res, CaptureSet(), false), reconstruct) def adapt(actual: Type, expected: Type, covariant: Boolean): Type = trace(adaptInfo(actual, expected, covariant), recheckr, show = true) { if expected.isInstanceOf[WildcardType] then actual @@ -772,7 +784,8 @@ class CheckCaptures extends Recheck, SymTransformer: // We can't box/unbox the universal capability. Leave `actual` as it is // so we get an error in checkConforms. This tends to give better error // messages than disallowing the root capability in `criticalSet`. - capt.println(i"cannot box/unbox $actual vs $expected") + if ctx.settings.YccDebug.value then + println(i"cannot box/unbox $actual vs $expected") actual else // Disallow future addition of `*` to `criticalSet`. @@ -784,9 +797,9 @@ class CheckCaptures extends Recheck, SymTransformer: } if !insertBox then // unboxing markFree(criticalSet, pos) - recon(CapturingType(parent1, cs1, !actualIsBoxed)) + recon(CapturingType(parent1, if alwaysConst then CaptureSet(cs1.elems) else cs1, !actualIsBoxed)) else - recon(CapturingType(parent1, cs1, actualIsBoxed)) + recon(CapturingType(parent1, if alwaysConst then CaptureSet(cs1.elems) else cs1, actualIsBoxed)) } var actualw = actual.widenDealias @@ -805,12 +818,49 @@ class CheckCaptures extends Recheck, SymTransformer: else actual end adaptBoxed + /** Check overrides again, taking capture sets into account. + * TODO: Can we avoid doing overrides checks twice? + * We need to do them here since only at this phase CaptureTypes are relevant + * But maybe we can then elide the check during the RefChecks phase under captureChecking? + */ + def checkOverrides = new TreeTraverser: + class OverridingPairsCheckerCC(clazz: ClassSymbol, self: Type, srcPos: SrcPos)(using Context) extends OverridingPairsChecker(clazz, self) { + /** Check subtype with box adaptation. + * This function is passed to RefChecks to check the compatibility of overriding pairs. + * @param sym symbol of the field definition that is being checked + */ + override def checkSubType(actual: Type, expected: Type)(using Context): Boolean = + val expected1 = alignDependentFunction(addOuterRefs(expected, actual), actual.stripCapturing) + val actual1 = + val saved = curEnv + try + curEnv = Env(clazz, nestedInOwner = true, capturedVars(clazz), isBoxed = false, outer0 = curEnv) + val adapted = adaptBoxed(actual, expected1, srcPos, alwaysConst = true) + actual match + case _: MethodType => + // We remove the capture set resulted from box adaptation for method types, + // since class methods are always treated as pure, and their captured variables + // are charged to the capture set of the class (which is already done during + // box adaptation). 
+ adapted.stripCapturing + case _ => adapted + finally curEnv = saved + actual1 frozen_<:< expected1 + } + + def traverse(t: Tree)(using Context) = + t match + case t: Template => + checkAllOverrides(ctx.owner.asClass, OverridingPairsCheckerCC(_, _, t)) + case _ => + traverseChildren(t) + override def checkUnit(unit: CompilationUnit)(using Context): Unit = - Setup(preRecheckPhase, thisPhase, recheckDef) - .traverse(ctx.compilationUnit.tpdTree) + Setup(preRecheckPhase, thisPhase, recheckDef)(ctx.compilationUnit.tpdTree) //println(i"SETUP:\n${Recheck.addRecheckedTypes.transform(ctx.compilationUnit.tpdTree)}") withCaptureSetsExplained { super.checkUnit(unit) + checkOverrides.traverse(unit.tpdTree) checkSelfTypes(unit.tpdTree) postCheck(unit.tpdTree) if ctx.settings.YccDebug.value then @@ -845,28 +895,104 @@ class CheckCaptures extends Recheck, SymTransformer: cls => !parentTrees(cls).exists(ptree => parentTrees.contains(ptree.tpe.classSymbol)) } assert(roots.nonEmpty) - for root <- roots do - checkParents(root, parentTrees(root)) + for case root: ClassSymbol <- roots do + checkSelfAgainstParents(root, root.baseClasses) val selfType = root.asClass.classInfo.selfType interpolator(startingVariance = -1).traverse(selfType) if !root.isEffectivelySealed then + def matchesExplicitRefsInBaseClass(refs: CaptureSet, cls: ClassSymbol): Boolean = + cls.baseClasses.tail.exists { psym => + val selfType = psym.asClass.givenSelfType + selfType.exists && selfType.captureSet.elems == refs.elems + } selfType match - case CapturingType(_, refs: CaptureSet.Var) if !refs.isUniversal => + case CapturingType(_, refs: CaptureSet.Var) + if !refs.isUniversal && !matchesExplicitRefsInBaseClass(refs, root) => + // Forbid inferred self types unless they are already implied by an explicit + // self type in a parent. report.error( - i"""$root needs an explicitly declared self type since its - |inferred self type $selfType - |is not visible in other compilation units that define subclasses.""", + em"""$root needs an explicitly declared self type since its + |inferred self type $selfType + |is not visible in other compilation units that define subclasses.""", root.srcPos) case _ => parentTrees -= root capt.println(i"checked $root with $selfType") end checkSelfTypes + /** Heal ill-formed capture sets in the type parameter. + * + * We can push parameter refs into a capture set in type parameters + * that this type parameter can't see. + * For example, when capture checking the following expression: + * + * def usingLogFile[T](op: (f: {*} File) => T): T = ... + * + * usingLogFile[box ?1 () -> Unit] { (f: {*} File) => () => { f.write(0) } } + * + * We may propagate `f` into ?1, making ?1 ill-formed. + * This also causes soundness issues, since `f` in ?1 should be widened to `*`, + * giving rise to an error that `*` cannot be included in a boxed capture set. + * + * To solve this, we still allow ?1 to capture parameter refs like `f`, but + * compensate this by pushing the widened capture set of `f` into ?1. + * This solves the soundness issue caused by the ill-formness of ?1. + */ + private def healTypeParam(tree: Tree)(using Context): Unit = + val checker = new TypeTraverser: + private def isAllowed(ref: CaptureRef): Boolean = ref match + case ref: TermParamRef => allowed.contains(ref) + case _ => true + + // Widen the given term parameter refs x₁ : C₁ S₁ , ⋯ , xₙ : Cₙ Sₙ to their capture sets C₁ , ⋯ , Cₙ. 
+ // + // If in these capture sets there are any capture references that are term parameter references we should avoid, + // we will widen them recursively. + private def widenParamRefs(refs: List[TermParamRef]): List[CaptureSet] = + @scala.annotation.tailrec + def recur(todos: List[TermParamRef], acc: List[CaptureSet]): List[CaptureSet] = + todos match + case Nil => acc + case ref :: rem => + val cs = ref.captureSetOfInfo + val nextAcc = cs.filter(isAllowed(_)) :: acc + val nextRem: List[TermParamRef] = (cs.elems.toList.filter(!isAllowed(_)) ++ rem).asInstanceOf + recur(nextRem, nextAcc) + recur(refs, Nil) + + private def healCaptureSet(cs: CaptureSet): Unit = + val toInclude = widenParamRefs(cs.elems.toList.filter(!isAllowed(_)).asInstanceOf) + toInclude.foreach(checkSubset(_, cs, tree.srcPos)) + + private var allowed: SimpleIdentitySet[TermParamRef] = SimpleIdentitySet.empty + + def traverse(tp: Type) = + tp match + case CapturingType(parent, refs) => + healCaptureSet(refs) + traverse(parent) + case tp @ RefinedType(parent, rname, rinfo: MethodType) if defn.isFunctionType(tp) => + traverse(rinfo) + case tp: TermLambda => + val saved = allowed + try + tp.paramRefs.foreach(allowed += _) + traverseChildren(tp) + finally allowed = saved + case _ => + traverseChildren(tp) + + if tree.isInstanceOf[InferredTypeTree] then + checker.traverse(tree.knownType) + end healTypeParam + /** Perform the following kinds of checks * - Check all explicitly written capturing types for well-formedness using `checkWellFormedPost`. * - Check that externally visible `val`s or `def`s have empty capture sets. If not, * suggest an explicit type. This is so that separate compilation (where external * symbols have empty capture sets) gives the same results as joint compilation. + * - Check that arguments of TypeApplys and AppliedTypes conform to their bounds. + * - Heal ill-formed capture sets of type parameters. See `healTypeParam`. */ def postCheck(unit: tpd.Tree)(using Context): Unit = unit.foreachSubTree { @@ -885,25 +1011,55 @@ class CheckCaptures extends Recheck, SymTransformer: val isLocal = sym.owner.ownersIterator.exists(_.isTerm) || sym.accessBoundary(defn.RootClass).isContainedIn(sym.topLevelClass) - - // The following classes of definitions need explicit capture types ... - if !isLocal // ... since external capture types are not inferred - || sym.owner.is(Trait) // ... since we do OverridingPairs checking before capture inference - || sym.allOverriddenSymbols.nonEmpty // ... since we do override checking before capture inference - then + def canUseInferred = // If canUseInferred is false, all capturing types in the type of `sym` need to be given explicitly + sym.is(Private) // private symbols can always have inferred types + || sym.name.is(DefaultGetterName) // default getters are exempted since otherwise it would be + // too annoying. This is a hole since a defualt getter's result type + // might leak into a type variable. 
+ || // non-local symbols cannot have inferred types since external capture types are not inferred + isLocal // local symbols still need explicit types if + && !sym.owner.is(Trait) // they are defined in a trait, since we do OverridingPairs checking before capture inference + def isNotPureThis(ref: CaptureRef) = ref match { + case ref: ThisType => !ref.cls.isPureClass + case _ => true + } + if !canUseInferred then val inferred = t.tpt.knownType def checkPure(tp: Type) = tp match - case CapturingType(_, refs) if !refs.elems.isEmpty => + case CapturingType(_, refs) + if !refs.elems.filter(isNotPureThis).isEmpty => val resultStr = if t.isInstanceOf[DefDef] then " result" else "" report.error( em"""Non-local $sym cannot have an inferred$resultStr type |$inferred |with non-empty capture set $refs. - |The type needs to be declared explicitly.""", t.srcPos) + |The type needs to be declared explicitly.""".withoutDisambiguation(), + t.srcPos) case _ => inferred.foreachPart(checkPure, StopAt.Static) + case t @ TypeApply(fun, args) => + fun.knownType.widen match + case tl: PolyType => + val normArgs = args.lazyZip(tl.paramInfos).map { (arg, bounds) => + arg.withType(arg.knownType.forceBoxStatus( + bounds.hi.isBoxedCapturing | bounds.lo.isBoxedCapturing)) + } + checkBounds(normArgs, tl) + case _ => + + args.foreach(healTypeParam(_)) case _ => } - + if !ctx.reporter.errorsReported then + // We dont report errors here if previous errors were reported, because other + // errors often result in bad applied types, but flagging these bad types gives + // often worse error messages than the original errors. + val checkApplied = new TreeTraverser: + def traverse(t: Tree)(using Context) = t match + case tree: InferredTypeTree => + case tree: New => + case tree: TypeTree => checkAppliedTypesIn(tree.withKnownType) + case _ => traverseChildren(t) + checkApplied.traverse(unit) end CaptureChecker end CheckCaptures diff --git a/compiler/src/dotty/tools/dotc/cc/Setup.scala b/compiler/src/dotty/tools/dotc/cc/Setup.scala index 42c80e524a6e..461c18ea0980 100644 --- a/compiler/src/dotty/tools/dotc/cc/Setup.scala +++ b/compiler/src/dotty/tools/dotc/cc/Setup.scala @@ -11,6 +11,7 @@ import ast.tpd import transform.Recheck.* import CaptureSet.IdentityCaptRefMap import Synthetics.isExcluded +import util.Property /** A tree traverser that prepares a compilation unit to be capture checked. * It does the following: @@ -98,7 +99,10 @@ extends tpd.TreeTraverser: def addCaptureRefinements(tp: Type): Type = tp match case _: TypeRef | _: AppliedType if tp.typeParams.isEmpty => tp.typeSymbol match - case cls: ClassSymbol if !defn.isFunctionClass(cls) => + case cls: ClassSymbol + if !defn.isFunctionClass(cls) && !cls.is(JavaDefined) => + // We assume that Java classes can refer to capturing Scala types only indirectly, + // using type parameters. Hence, no need to refine them. cls.paramGetters.foldLeft(tp) { (core, getter) => if getter.termRef.isTracked then val getterType = tp.memberInfo(getter).strippedDealias @@ -117,14 +121,14 @@ extends tpd.TreeTraverser: case tp: (TypeRef | AppliedType) => val sym = tp.typeSymbol if sym.isClass then - tp.typeSymbol == defn.AnyClass + sym == defn.AnyClass // we assume Any is a shorthand of {*} Any, so if Any is an upper // bound, the type is taken to be impure. 
else superTypeIsImpure(tp.superType) case tp: (RefinedOrRecType | MatchType) => superTypeIsImpure(tp.underlying) case tp: AndType => - superTypeIsImpure(tp.tp1) || canHaveInferredCapture(tp.tp2) + superTypeIsImpure(tp.tp1) || needsVariable(tp.tp2) case tp: OrType => superTypeIsImpure(tp.tp1) && superTypeIsImpure(tp.tp2) case _ => @@ -132,23 +136,26 @@ extends tpd.TreeTraverser: }.showing(i"super type is impure $tp = $result", capt) /** Should a capture set variable be added on type `tp`? */ - def canHaveInferredCapture(tp: Type): Boolean = { + def needsVariable(tp: Type): Boolean = { tp.typeParams.isEmpty && tp.match case tp: (TypeRef | AppliedType) => val tp1 = tp.dealias - if tp1 ne tp then canHaveInferredCapture(tp1) + if tp1 ne tp then needsVariable(tp1) else val sym = tp1.typeSymbol - if sym.isClass then !sym.isValueClass && sym != defn.AnyClass + if sym.isClass then + !sym.isPureClass && sym != defn.AnyClass else superTypeIsImpure(tp1) case tp: (RefinedOrRecType | MatchType) => - canHaveInferredCapture(tp.underlying) + needsVariable(tp.underlying) case tp: AndType => - canHaveInferredCapture(tp.tp1) && canHaveInferredCapture(tp.tp2) + needsVariable(tp.tp1) && needsVariable(tp.tp2) case tp: OrType => - canHaveInferredCapture(tp.tp1) || canHaveInferredCapture(tp.tp2) - case CapturingType(_, refs) => - refs.isConst && !refs.isUniversal + needsVariable(tp.tp1) || needsVariable(tp.tp2) + case CapturingType(parent, refs) => + needsVariable(parent) + && refs.isConst // if refs is a variable, no need to add another + && !refs.isUniversal // if refs is {*}, an added variable would not change anything case _ => false }.showing(i"can have inferred capture $tp = $result", capt) @@ -181,7 +188,7 @@ extends tpd.TreeTraverser: CapturingType(OrType(parent1, tp2, tp.isSoft), refs1, tp1.isBoxed) case tp @ OrType(tp1, tp2 @ CapturingType(parent2, refs2)) => CapturingType(OrType(tp1, parent2, tp.isSoft), refs2, tp2.isBoxed) - case _ if canHaveInferredCapture(tp) => + case _ if needsVariable(tp) => val cs = tp.dealias match case CapturingType(_, refs) => CaptureSet.Var(refs.elems) case _ => CaptureSet.Var() @@ -206,17 +213,22 @@ extends tpd.TreeTraverser: val tycon1 = this(tycon) if defn.isNonRefinedFunction(tp) then // Convert toplevel generic function types to dependent functions - val args0 = args.init - var res0 = args.last - val args1 = mapNested(args0) - val res1 = this(res0) - if isTopLevel then - depFun(tycon1, args1, res1) - .showing(i"add function refinement $tp --> $result", capt) - else if (tycon1 eq tycon) && (args1 eq args0) && (res1 eq res0) then - tp + if !defn.isFunctionSymbol(tp.typeSymbol) && (tp.dealias ne tp) then + // This type is a function after dealiasing, so we dealias and recurse. + // See #15925. 
+ this(tp.dealias) else - tp.derivedAppliedType(tycon1, args1 :+ res1) + val args0 = args.init + var res0 = args.last + val args1 = mapNested(args0) + val res1 = this(res0) + if isTopLevel then + depFun(tycon1, args1, res1) + .showing(i"add function refinement $tp ($tycon1, $args1, $res1) (${tp.dealias}) --> $result", capt) + else if (tycon1 eq tycon) && (args1 eq args0) && (res1 eq res0) then + tp + else + tp.derivedAppliedType(tycon1, args1 :+ res1) else tp.derivedAppliedType(tycon1, args.mapConserve(arg => this(arg))) case tp @ RefinedType(core, rname, rinfo) if defn.isFunctionType(tp) => @@ -382,20 +394,18 @@ extends tpd.TreeTraverser: return tree.tpt match case tpt: TypeTree if tree.symbol.allOverriddenSymbols.hasNext => + tree.paramss.foreach(traverse) transformTT(tpt, boxed = false, exact = true) + traverse(tree.rhs) //println(i"TYPE of ${tree.symbol.showLocated} = ${tpt.knownType}") case _ => - traverseChildren(tree) + traverseChildren(tree) case tree @ ValDef(_, tpt: TypeTree, _) => - val isVar = tree.symbol.is(Mutable) - val overrides = tree.symbol.allOverriddenSymbols.hasNext - //if overrides then println(i"transforming overriding ${tree.symbol}") - if isVar || overrides then - transformTT(tpt, - boxed = isVar, // types of mutable variables are boxed - exact = overrides // types of symbols that override a parent don't get a capture set - ) - traverseChildren(tree) + transformTT(tpt, + boxed = tree.symbol.is(Mutable), // types of mutable variables are boxed + exact = tree.symbol.allOverriddenSymbols.hasNext // types of symbols that override a parent don't get a capture set + ) + traverse(tree.rhs) case tree @ TypeApply(fn, args) => traverse(fn) for case arg: TypeTree <- args do @@ -475,4 +485,14 @@ extends tpd.TreeTraverser: capt.println(i"update info of ${tree.symbol} from $info to $newInfo") case _ => end traverse + + def apply(tree: Tree)(using Context): Unit = + traverse(tree)(using ctx.withProperty(Setup.IsDuringSetupKey, Some(()))) end Setup + +object Setup: + val IsDuringSetupKey = new Property.Key[Unit] + + def isDuringSetup(using Context): Boolean = + ctx.property(IsDuringSetupKey).isDefined + diff --git a/compiler/src/dotty/tools/dotc/cc/Synthetics.scala b/compiler/src/dotty/tools/dotc/cc/Synthetics.scala index e8f7fd502baa..dacbd27e0f35 100644 --- a/compiler/src/dotty/tools/dotc/cc/Synthetics.scala +++ b/compiler/src/dotty/tools/dotc/cc/Synthetics.scala @@ -31,10 +31,12 @@ object Synthetics: * The types of these symbols are transformed in a special way without * looking at the definitions's RHS */ - def needsTransform(sym: SymDenotation)(using Context): Boolean = - isSyntheticCopyMethod(sym) - || isSyntheticCompanionMethod(sym, nme.apply, nme.unapply) - || isSyntheticCopyDefaultGetterMethod(sym) + def needsTransform(symd: SymDenotation)(using Context): Boolean = + isSyntheticCopyMethod(symd) + || isSyntheticCompanionMethod(symd, nme.apply, nme.unapply) + || isSyntheticCopyDefaultGetterMethod(symd) + || (symd.symbol eq defn.Object_eq) + || (symd.symbol eq defn.Object_ne) /** Method is excluded from regular capture checking. 
* Excluded are synthetic class members @@ -141,13 +143,16 @@ object Synthetics: /** Drop added capture information from the type of an `unapply` */ private def dropUnapplyCaptures(info: Type)(using Context): Type = info match case info: MethodType => - val CapturingType(oldParamInfo, _) :: Nil = info.paramInfos: @unchecked - def oldResult(tp: Type): Type = tp match - case tp: MethodOrPoly => - tp.derivedLambdaType(resType = oldResult(tp.resType)) - case CapturingType(tp, _) => - tp - info.derivedLambdaType(paramInfos = oldParamInfo :: Nil, resType = oldResult(info.resType)) + info.paramInfos match + case CapturingType(oldParamInfo, _) :: Nil => + def oldResult(tp: Type): Type = tp match + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = oldResult(tp.resType)) + case CapturingType(tp, _) => + tp + info.derivedLambdaType(paramInfos = oldParamInfo :: Nil, resType = oldResult(info.resType)) + case _ => + info case info: PolyType => info.derivedLambdaType(resType = dropUnapplyCaptures(info.resType)) @@ -163,7 +168,9 @@ object Synthetics: sym.copySymDenotation(info = addUnapplyCaptures(sym.info)) case nme.apply | nme.copy => sym.copySymDenotation(info = addCaptureDeps(sym.info)) - + case n if n == nme.eq || n == nme.ne => + sym.copySymDenotation(info = + MethodType(defn.ObjectType.capturing(CaptureSet.universal) :: Nil, defn.BooleanType)) /** If `sym` refers to a synthetic apply, unapply, copy, or copy default getter method * of a case class, transform it back to what it was before the CC phase. @@ -176,5 +183,7 @@ object Synthetics: sym.copySymDenotation(info = dropUnapplyCaptures(sym.info)) case nme.apply | nme.copy => sym.copySymDenotation(info = dropCaptureDeps(sym.info)) + case n if n == nme.eq || n == nme.ne => + sym.copySymDenotation(info = defn.methOfAnyRef(defn.BooleanType)) end Synthetics \ No newline at end of file diff --git a/compiler/src/dotty/tools/dotc/config/CliCommand.scala b/compiler/src/dotty/tools/dotc/config/CliCommand.scala index 68c900e405da..914df040fbf7 100644 --- a/compiler/src/dotty/tools/dotc/config/CliCommand.scala +++ b/compiler/src/dotty/tools/dotc/config/CliCommand.scala @@ -60,7 +60,7 @@ trait CliCommand: def defaultValue = s.default match case _: Int | _: String => s.default.toString case _ => "" - val info = List(shortHelp(s), if defaultValue.nonEmpty then s"Default $defaultValue" else "", if s.legalChoices.nonEmpty then s"Choices ${s.legalChoices}" else "") + val info = List(shortHelp(s), if defaultValue.nonEmpty then s"Default $defaultValue" else "", if s.legalChoices.nonEmpty then s"Choices : ${s.legalChoices}" else "") (s.name, info.filter(_.nonEmpty).mkString("\n")) end help diff --git a/compiler/src/dotty/tools/dotc/config/Config.scala b/compiler/src/dotty/tools/dotc/config/Config.scala index 17e3ec352e7c..247fa28efbda 100644 --- a/compiler/src/dotty/tools/dotc/config/Config.scala +++ b/compiler/src/dotty/tools/dotc/config/Config.scala @@ -22,6 +22,11 @@ object Config { */ inline val checkConstraintsNonCyclic = false + /** Check that reverse dependencies in constraints are correct and complete. + * Can also be enabled using -Ycheck-constraint-deps. + */ + inline val checkConstraintDeps = false + /** Check that each constraint resulting from a subtype test * is satisfiable. Also check that a type variable instantiation * satisfies its constraints. @@ -78,13 +83,6 @@ object Config { */ inline val failOnInstantiationToNothing = false - /** Enable noDoubleDef checking if option "-YnoDoubleDefs" is set. 
- * The reason to have an option as well as the present global switch is - * that the noDoubleDef checking is done in a hotspot, and we do not - * want to incur the overhead of checking an option each time. - */ - inline val checkNoDoubleBindings = true - /** Check positions for consistency after parsing */ inline val checkPositions = true @@ -184,6 +182,9 @@ object Config { /** If set, prints a trace of all symbol completions */ inline val showCompletions = false + /** If set, show variable/variable reverse dependencies when printing constraints. */ + inline val showConstraintDeps = true + /** If set, method results that are context functions are flattened by adding * the parameters of the context function results to the methods themselves. * This is an optimization that reduces closure allocations. diff --git a/compiler/src/dotty/tools/dotc/config/Feature.scala b/compiler/src/dotty/tools/dotc/config/Feature.scala index 6d905f500c54..188526bb094f 100644 --- a/compiler/src/dotty/tools/dotc/config/Feature.scala +++ b/compiler/src/dotty/tools/dotc/config/Feature.scala @@ -30,6 +30,7 @@ object Feature: val saferExceptions = experimental("saferExceptions") val pureFunctions = experimental("pureFunctions") val captureChecking = experimental("captureChecking") + val into = experimental("into") val globalOnlyImports: Set[TermName] = Set(pureFunctions, captureChecking) @@ -79,19 +80,27 @@ object Feature: def scala2ExperimentalMacroEnabled(using Context) = enabled(scala2macros) + /** Is pureFunctions enabled for this compilation unit? */ def pureFunsEnabled(using Context) = enabledBySetting(pureFunctions) || ctx.compilationUnit.knowsPureFuns || ccEnabled + /** Is captureChecking enabled for this compilation unit? */ def ccEnabled(using Context) = enabledBySetting(captureChecking) || ctx.compilationUnit.needsCaptureChecking + /** Is pureFunctions enabled for any of the currently compiled compilation units? */ def pureFunsEnabledSomewhere(using Context) = enabledBySetting(pureFunctions) - || enabledBySetting(captureChecking) || ctx.run != null && ctx.run.nn.pureFunsImportEncountered + || ccEnabledSomewhere + + /** Is captureChecking enabled for any of the currently compiled compilation units? */ + def ccEnabledSomewhere(using Context) = + enabledBySetting(captureChecking) + || ctx.run != null && ctx.run.nn.ccImportEncountered def sourceVersionSetting(using Context): SourceVersion = SourceVersion.valueOf(ctx.settings.source.value) @@ -101,7 +110,11 @@ object Feature: case Some(v) => v case none => sourceVersionSetting - def migrateTo3(using Context): Boolean = sourceVersion == `3.0-migration` + def migrateTo3(using Context): Boolean = + sourceVersion == `3.0-migration` + + def fewerBracesEnabled(using Context) = + sourceVersion.isAtLeast(`3.3`) || enabled(fewerBraces) /** If current source migrates to `version`, issue given warning message * and return `true`, otherwise return `false`. 
@@ -117,7 +130,7 @@ object Feature: def checkExperimentalFeature(which: String, srcPos: SrcPos, note: => String = "")(using Context) = if !isExperimentalEnabled then - report.error(i"Experimental $which may only be used with a nightly or snapshot version of the compiler$note", srcPos) + report.error(em"Experimental $which may only be used with a nightly or snapshot version of the compiler$note", srcPos) def checkExperimentalDef(sym: Symbol, srcPos: SrcPos)(using Context) = if !isExperimentalEnabled then @@ -128,7 +141,7 @@ object Feature: i"${sym.owner} is marked @experimental" else i"$sym inherits @experimental" - report.error(s"$symMsg and therefore may only be used in an experimental scope.", srcPos) + report.error(em"$symMsg and therefore may only be used in an experimental scope.", srcPos) /** Check that experimental compiler options are only set for snapshot or nightly compiler versions. */ def checkExperimentalSettings(using Context): Unit = @@ -139,6 +152,11 @@ object Feature: def isExperimentalEnabled(using Context): Boolean = Properties.experimental && !ctx.settings.YnoExperimental.value + /** Handle language import `import language..` if it is one + * of the global imports `pureFunctions` or `captureChecking`. In this case + * make the compilation unit's and current run's fields accordingly. + * @return true iff import that was handled + */ def handleGlobalLanguageImport(prefix: TermName, imported: Name)(using Context): Boolean = val fullFeatureName = QualifiedName(prefix, imported.asTermName) if fullFeatureName == pureFunctions then @@ -147,7 +165,7 @@ object Feature: true else if fullFeatureName == captureChecking then ctx.compilationUnit.needsCaptureChecking = true - if ctx.run != null then ctx.run.nn.pureFunsImportEncountered = true + if ctx.run != null then ctx.run.nn.ccImportEncountered = true true else false diff --git a/compiler/src/dotty/tools/dotc/config/Printers.scala b/compiler/src/dotty/tools/dotc/config/Printers.scala index ecb189de9bb3..63d616e1ce3d 100644 --- a/compiler/src/dotty/tools/dotc/config/Printers.scala +++ b/compiler/src/dotty/tools/dotc/config/Printers.scala @@ -32,6 +32,7 @@ object Printers { val init = noPrinter val inlining = noPrinter val interactiv = noPrinter + val macroAnnot = noPrinter val matchTypes = noPrinter val nullables = noPrinter val overload = noPrinter diff --git a/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala b/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala index 09bedd3e8b35..5ae99ec7e6fa 100644 --- a/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala +++ b/compiler/src/dotty/tools/dotc/config/ScalaSettings.scala @@ -17,7 +17,7 @@ class ScalaSettings extends SettingGroup with AllScalaSettings object ScalaSettings: // Keep synchronized with `classfileVersion` in `BCodeIdiomatic` private val minTargetVersion = 8 - private val maxTargetVersion = 19 + private val maxTargetVersion = 20 def supportedTargetVersions: List[String] = (minTargetVersion to maxTargetVersion).toList.map(_.toString) @@ -64,7 +64,6 @@ trait AllScalaSettings extends CommonScalaSettings, PluginSettings, VerboseSetti val oldSyntax: Setting[Boolean] = BooleanSetting("-old-syntax", "Require `(...)` around conditions.") val indent: Setting[Boolean] = BooleanSetting("-indent", "Together with -rewrite, remove {...} syntax when possible due to significant indentation.") val noindent: Setting[Boolean] = BooleanSetting("-no-indent", "Require classical {...} syntax, indentation is not significant.", aliases = List("-noindent")) - val YindentColons: 
Setting[Boolean] = BooleanSetting("-Yindent-colons", "(disabled: use -language:experimental.fewerBraces instead)") /* Decompiler settings */ val printTasty: Setting[Boolean] = BooleanSetting("-print-tasty", "Prints the raw tasty.", aliases = List("--print-tasty")) @@ -156,20 +155,71 @@ private sealed trait VerboseSettings: */ private sealed trait WarningSettings: self: SettingGroup => + import Setting.ChoiceWithHelp + val Whelp: Setting[Boolean] = BooleanSetting("-W", "Print a synopsis of warning options.") val XfatalWarnings: Setting[Boolean] = BooleanSetting("-Werror", "Fail the compilation if there are any warnings.", aliases = List("-Xfatal-warnings")) + val WvalueDiscard: Setting[Boolean] = BooleanSetting("-Wvalue-discard", "Warn when non-Unit expression results are unused.") - val Wunused: Setting[List[String]] = MultiChoiceSetting( + val Wunused: Setting[List[ChoiceWithHelp[String]]] = MultiChoiceHelpSetting( name = "-Wunused", helpArg = "warning", descr = "Enable or disable specific `unused` warnings", - choices = List("nowarn", "all"), + choices = List( + ChoiceWithHelp("nowarn", ""), + ChoiceWithHelp("all",""), + ChoiceWithHelp( + name = "imports", + description = "Warn if an import selector is not referenced.\n" + + "NOTE : overrided by -Wunused:strict-no-implicit-warn"), + ChoiceWithHelp("privates","Warn if a private member is unused"), + ChoiceWithHelp("locals","Warn if a local definition is unused"), + ChoiceWithHelp("explicits","Warn if an explicit parameter is unused"), + ChoiceWithHelp("implicits","Warn if an implicit parameter is unused"), + ChoiceWithHelp("params","Enable -Wunused:explicits,implicits"), + ChoiceWithHelp("linted","Enable -Wunused:imports,privates,locals,implicits"), + ChoiceWithHelp( + name = "strict-no-implicit-warn", + description = "Same as -Wunused:import, only for imports of explicit named members.\n" + + "NOTE : This overrides -Wunused:imports and NOT set by -Wunused:all" + ), + // ChoiceWithHelp("patvars","Warn if a variable bound in a pattern is unused"), + ChoiceWithHelp( + name = "unsafe-warn-patvars", + description = "(UNSAFE) Warn if a variable bound in a pattern is unused.\n" + + "This warning can generate false positive, as warning cannot be\n" + + "suppressed yet." 
+ ) + ), default = Nil ) object WunusedHas: + def isChoiceSet(s: String)(using Context) = Wunused.value.pipe(us => us.contains(s)) def allOr(s: String)(using Context) = Wunused.value.pipe(us => us.contains("all") || us.contains(s)) def nowarn(using Context) = allOr("nowarn") + // overrided by strict-no-implicit-warn + def imports(using Context) = + (allOr("imports") || allOr("linted")) && !(strictNoImplicitWarn) + def locals(using Context) = + allOr("locals") || allOr("linted") + /** -Wunused:explicits OR -Wunused:params */ + def explicits(using Context) = + allOr("explicits") || allOr("params") + /** -Wunused:implicits OR -Wunused:params */ + def implicits(using Context) = + allOr("implicits") || allOr("params") || allOr("linted") + def params(using Context) = allOr("params") + def privates(using Context) = + allOr("privates") || allOr("linted") + def patvars(using Context) = + isChoiceSet("unsafe-warn-patvars") // not with "all" + // allOr("patvars") // todo : rename once fixed + def linted(using Context) = + allOr("linted") + def strictNoImplicitWarn(using Context) = + isChoiceSet("strict-no-implicit-warn") + val Wconf: Setting[List[String]] = MultiStringSetting( "-Wconf", "patterns", @@ -282,6 +332,7 @@ private sealed trait YSettings: val Yscala2Unpickler: Setting[String] = StringSetting("-Yscala2-unpickler", "", "Control where we may get Scala 2 symbols from. This is either \"always\", \"never\", or a classpath.", "always") val YnoImports: Setting[Boolean] = BooleanSetting("-Yno-imports", "Compile without importing scala.*, java.lang.*, or Predef.") + val Yimports: Setting[List[String]] = MultiStringSetting("-Yimports", helpArg="", "Custom root imports. If set, none of scala.*, java.lang.*, or Predef.* will be imported unless explicitly included.") val YnoGenericSig: Setting[Boolean] = BooleanSetting("-Yno-generic-signatures", "Suppress generation of generic signatures for Java.") val YnoPredef: Setting[Boolean] = BooleanSetting("-Yno-predef", "Compile without importing Predef.") val Yskip: Setting[List[String]] = PhasesSetting("-Yskip", "Skip") @@ -309,10 +360,12 @@ private sealed trait YSettings: val YforceSbtPhases: Setting[Boolean] = BooleanSetting("-Yforce-sbt-phases", "Run the phases used by sbt for incremental compilation (ExtractDependencies and ExtractAPI) even if the compiler is ran outside of sbt, for debugging.") val YdumpSbtInc: Setting[Boolean] = BooleanSetting("-Ydump-sbt-inc", "For every compiled foo.scala, output the API representation and dependencies used for sbt incremental compilation in foo.inc, implies -Yforce-sbt-phases.") val YcheckAllPatmat: Setting[Boolean] = BooleanSetting("-Ycheck-all-patmat", "Check exhaustivity and redundancy of all pattern matching (used for testing the algorithm).") + val YcheckConstraintDeps: Setting[Boolean] = BooleanSetting("-Ycheck-constraint-deps", "Check dependency tracking in constraints (used for testing the algorithm).") val YretainTrees: Setting[Boolean] = BooleanSetting("-Yretain-trees", "Retain trees for top-level classes, accessible from ClassSymbol#tree") val YshowTreeIds: Setting[Boolean] = BooleanSetting("-Yshow-tree-ids", "Uniquely tag all tree nodes in debugging output.") val YfromTastyIgnoreList: Setting[List[String]] = MultiStringSetting("-Yfrom-tasty-ignore-list", "file", "List of `tasty` files in jar files that will not be loaded when using -from-tasty") val YnoExperimental: Setting[Boolean] = BooleanSetting("-Yno-experimental", "Disable experimental language features") + val YlegacyLazyVals: Setting[Boolean] = 
BooleanSetting("-Ylegacy-lazy-vals", "Use legacy (pre 3.3.0) implementation of lazy vals") val YprofileEnabled: Setting[Boolean] = BooleanSetting("-Yprofile-enabled", "Enable profiling.") val YprofileDestination: Setting[String] = StringSetting("-Yprofile-destination", "file", "Where to send profiling output - specify a file, default is to the console.", "") @@ -330,7 +383,6 @@ private sealed trait YSettings: val YrecheckTest: Setting[Boolean] = BooleanSetting("-Yrecheck-test", "Run basic rechecking (internal test only)") val YccDebug: Setting[Boolean] = BooleanSetting("-Ycc-debug", "Used in conjunction with captureChecking language import, debug info for captured references") val YccNoAbbrev: Setting[Boolean] = BooleanSetting("-Ycc-no-abbrev", "Used in conjunction with captureChecking language import, suppress type abbreviations") - val YlightweightLazyVals: Setting[Boolean] = BooleanSetting("-Ylightweight-lazy-vals", "Use experimental lightweight implementation of lazy vals") /** Area-specific debug output */ val YexplainLowlevel: Setting[Boolean] = BooleanSetting("-Yexplain-lowlevel", "When explaining type errors, show types at a lower level.") diff --git a/compiler/src/dotty/tools/dotc/config/Settings.scala b/compiler/src/dotty/tools/dotc/config/Settings.scala index 277833afbd5d..34e5582e8a91 100644 --- a/compiler/src/dotty/tools/dotc/config/Settings.scala +++ b/compiler/src/dotty/tools/dotc/config/Settings.scala @@ -11,6 +11,7 @@ import annotation.tailrec import collection.mutable.ArrayBuffer import reflect.ClassTag import scala.util.{Success, Failure} +import dotty.tools.dotc.config.Settings.Setting.ChoiceWithHelp object Settings: @@ -69,11 +70,11 @@ object Settings: def updateIn(state: SettingsState, x: Any): SettingsState = x match case _: T => state.update(idx, x) - case _ => throw IllegalArgumentException(s"found: $x of type ${x.getClass.getName}, required: ${implicitly[ClassTag[T]]}") + case _ => throw IllegalArgumentException(s"found: $x of type ${x.getClass.getName}, required: ${summon[ClassTag[T]]}") def isDefaultIn(state: SettingsState): Boolean = valueIn(state) == default - def isMultivalue: Boolean = implicitly[ClassTag[T]] == ListTag + def isMultivalue: Boolean = summon[ClassTag[T]] == ListTag def legalChoices: String = choices match { @@ -106,6 +107,11 @@ object Settings: def missingArg = fail(s"missing argument for option $name", args) + def setBoolean(argValue: String, args: List[String]) = + if argValue.equalsIgnoreCase("true") || argValue.isEmpty then update(true, args) + else if argValue.equalsIgnoreCase("false") then update(false, args) + else fail(s"$argValue is not a valid choice for boolean setting $name", args) + def setString(argValue: String, args: List[String]) = choices match case Some(xs) if !xs.contains(argValue) => @@ -126,9 +132,9 @@ object Settings: catch case _: NumberFormatException => fail(s"$argValue is not an integer argument for $name", args) - def doSet(argRest: String) = ((implicitly[ClassTag[T]], args): @unchecked) match { + def doSet(argRest: String) = ((summon[ClassTag[T]], args): @unchecked) match { case (BooleanTag, _) => - update(true, args) + setBoolean(argRest, args) case (OptionTag, _) => update(Some(propertyClass.get.getConstructor().newInstance()), args) case (ListTag, _) => @@ -184,6 +190,19 @@ object Settings: def update(x: T)(using Context): SettingsState = setting.updateIn(ctx.settingsState, x) def isDefault(using Context): Boolean = setting.isDefaultIn(ctx.settingsState) + /** + * A choice with help description. 
+ * + * NOTE : `equals` and `toString` have special behaviors + */ + case class ChoiceWithHelp[T](name: T, description: String): + override def equals(x: Any): Boolean = x match + case s:String => s == name.toString() + case _ => false + override def toString(): String = + s"\n- $name${if description.isEmpty() then "" else s" :\n\t${description.replace("\n","\n\t")}"}" + end Setting + class SettingGroup { private val _allSettings = new ArrayBuffer[Setting[?]] @@ -265,6 +284,9 @@ object Settings: def MultiChoiceSetting(name: String, helpArg: String, descr: String, choices: List[String], default: List[String], aliases: List[String] = Nil): Setting[List[String]] = publish(Setting(name, descr, default, helpArg, Some(choices), aliases = aliases)) + def MultiChoiceHelpSetting(name: String, helpArg: String, descr: String, choices: List[ChoiceWithHelp[String]], default: List[ChoiceWithHelp[String]], aliases: List[String] = Nil): Setting[List[ChoiceWithHelp[String]]] = + publish(Setting(name, descr, default, helpArg, Some(choices), aliases = aliases)) + def IntSetting(name: String, descr: String, default: Int, aliases: List[String] = Nil): Setting[Int] = publish(Setting(name, descr, default, aliases = aliases)) @@ -290,6 +312,6 @@ object Settings: publish(Setting(name, descr, default)) def OptionSetting[T: ClassTag](name: String, descr: String, aliases: List[String] = Nil): Setting[Option[T]] = - publish(Setting(name, descr, None, propertyClass = Some(implicitly[ClassTag[T]].runtimeClass), aliases = aliases)) + publish(Setting(name, descr, None, propertyClass = Some(summon[ClassTag[T]].runtimeClass), aliases = aliases)) } end Settings diff --git a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala index 545e2f2d9b42..4b9b1b247856 100644 --- a/compiler/src/dotty/tools/dotc/config/SourceVersion.scala +++ b/compiler/src/dotty/tools/dotc/config/SourceVersion.scala @@ -8,6 +8,7 @@ import util.Property enum SourceVersion: case `3.0-migration`, `3.0`, `3.1` // Note: do not add `3.1-migration` here, 3.1 is the same language as 3.0. case `3.2-migration`, `3.2` + case `3.3-migration`, `3.3` case `future-migration`, `future` val isMigrating: Boolean = toString.endsWith("-migration") @@ -18,7 +19,7 @@ enum SourceVersion: def isAtLeast(v: SourceVersion) = stable.ordinal >= v.ordinal object SourceVersion extends Property.Key[SourceVersion]: - def defaultSourceVersion = `3.2` + def defaultSourceVersion = `3.3` /** language versions that may appear in a language import, are deprecated, but not removed from the standard library. 
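The `contains`-based checks in `WunusedHas` keep working on the new `List[ChoiceWithHelp[String]]` because of the `equals` override above. An abridged copy as a standalone sketch (the real class also overrides `toString` for help rendering; `demoChoices` is illustrative only):

```scala
// Abridged copy of ChoiceWithHelp: equality is defined against plain strings,
// so a List[ChoiceWithHelp[String]] answers contains("imports") just like the
// old List[String] did.
case class ChoiceWithHelp[T](name: T, description: String):
  override def equals(x: Any): Boolean = x match
    case s: String => s == name.toString()
    case _         => false

@main def demoChoices(): Unit =
  val selected = List(ChoiceWithHelp("imports", ""), ChoiceWithHelp("locals", ""))
  println(selected.contains("imports")) // true
  println(selected.contains("params"))  // false
```

The equality is deliberately one-directional (`"imports" == choice` is still false), which is all the `contains` and `allOr` checks need.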
*/ val illegalSourceVersionNames = List("3.1-migration").map(_.toTermName) diff --git a/compiler/src/dotty/tools/dotc/core/Annotations.scala b/compiler/src/dotty/tools/dotc/core/Annotations.scala index aa8ead280bbf..3b00f2915f1c 100644 --- a/compiler/src/dotty/tools/dotc/core/Annotations.scala +++ b/compiler/src/dotty/tools/dotc/core/Annotations.scala @@ -2,12 +2,13 @@ package dotty.tools package dotc package core -import Symbols._, Types._, Contexts._, Constants._ -import dotty.tools.dotc.ast.tpd, tpd.* +import Symbols._, Types._, Contexts._, Constants._, Phases.* +import ast.tpd, tpd.* import util.Spans.Span import printing.{Showable, Printer} import printing.Texts.Text -import annotation.internal.sharable + +import scala.annotation.internal.sharable object Annotations { @@ -87,6 +88,22 @@ object Annotations { def sameAnnotation(that: Annotation)(using Context): Boolean = symbol == that.symbol && tree.sameTree(that.tree) + def hasOneOfMetaAnnotation(metaSyms: Set[Symbol], orNoneOf: Set[Symbol] = Set.empty)(using Context): Boolean = atPhaseNoLater(erasurePhase) { + def go(metaSyms: Set[Symbol]) = + def recTp(tp: Type): Boolean = tp.dealiasKeepAnnots match + case AnnotatedType(parent, metaAnnot) => metaSyms.exists(metaAnnot.matches) || recTp(parent) + case _ => false + def rec(tree: Tree): Boolean = methPart(tree) match + case New(tpt) => rec(tpt) + case Select(qual, _) => rec(qual) + case Annotated(arg, metaAnnot) => metaSyms.exists(metaAnnot.tpe.classSymbol.derivesFrom) || rec(arg) + case t @ Ident(_) => recTp(t.tpe) + case Typed(expr, _) => rec(expr) + case _ => false + metaSyms.exists(symbol.hasAnnotation) || rec(tree) + go(metaSyms) || orNoneOf.nonEmpty && !go(orNoneOf) + } + /** Operations for hash-consing, can be overridden */ def hash: Int = System.identityHashCode(this) def eql(that: Annotation) = this eq that @@ -177,27 +194,21 @@ object Annotations { object Annotation { def apply(tree: Tree): ConcreteAnnotation = ConcreteAnnotation(tree) + + def apply(cls: ClassSymbol, span: Span)(using Context): Annotation = + apply(cls, Nil, span) - def apply(cls: ClassSymbol)(using Context): Annotation = - apply(cls, Nil) - - def apply(cls: ClassSymbol, arg: Tree)(using Context): Annotation = - apply(cls, arg :: Nil) - - def apply(cls: ClassSymbol, arg1: Tree, arg2: Tree)(using Context): Annotation = - apply(cls, arg1 :: arg2 :: Nil) - - def apply(cls: ClassSymbol, args: List[Tree])(using Context): Annotation = - apply(cls.typeRef, args) - - def apply(atp: Type, arg: Tree)(using Context): Annotation = - apply(atp, arg :: Nil) + def apply(cls: ClassSymbol, arg: Tree, span: Span)(using Context): Annotation = + apply(cls, arg :: Nil, span) - def apply(atp: Type, arg1: Tree, arg2: Tree)(using Context): Annotation = - apply(atp, arg1 :: arg2 :: Nil) + def apply(cls: ClassSymbol, args: List[Tree], span: Span)(using Context): Annotation = + apply(cls.typeRef, args, span) - def apply(atp: Type, args: List[Tree])(using Context): Annotation = - apply(New(atp, args)) + def apply(atp: Type, arg: Tree, span: Span)(using Context): Annotation = + apply(atp, arg :: Nil, span) + + def apply(atp: Type, args: List[Tree], span: Span)(using Context): Annotation = + apply(New(atp, args).withSpan(span)) /** Create an annotation where the tree is computed lazily. 
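Every `Annotation.apply` factory now takes a `Span`, so synthesized annotation trees carry a position. A sketch of what a call site looks like, assuming the imports already in scope in `Annotations.scala`; the `sourceFileAnnot` wrapper is hypothetical, with `makeSourceFile` below being the real analogue:

```scala
// Hypothetical wrapper, mirroring makeSourceFile: the span of the annotated
// definition is threaded into the synthesized New(...) annotation tree.
def sourceFileAnnot(path: String, mdef: tpd.Tree)(using Context): Annotation =
  Annotation(defn.SourceFileAnnot, Literal(Constant(path)), mdef.span)
```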
*/ def deferred(sym: Symbol)(treeFn: Context ?=> Tree): Annotation = @@ -234,15 +245,15 @@ object Annotations { else None } - def makeSourceFile(path: String)(using Context): Annotation = - apply(defn.SourceFileAnnot, Literal(Constant(path))) + def makeSourceFile(path: String, span: Span)(using Context): Annotation = + apply(defn.SourceFileAnnot, Literal(Constant(path)), span) } @sharable val EmptyAnnotation = Annotation(EmptyTree) def ThrowsAnnotation(cls: ClassSymbol)(using Context): Annotation = { val tref = cls.typeRef - Annotation(defn.ThrowsAnnot.typeRef.appliedTo(tref), Ident(tref)) + Annotation(defn.ThrowsAnnot.typeRef.appliedTo(tref), Ident(tref), cls.span) } /** Extracts the type of the thrown exception from an annotation. diff --git a/compiler/src/dotty/tools/dotc/core/CheckRealizable.scala b/compiler/src/dotty/tools/dotc/core/CheckRealizable.scala index 4b441d512dec..a61701eee2d7 100644 --- a/compiler/src/dotty/tools/dotc/core/CheckRealizable.scala +++ b/compiler/src/dotty/tools/dotc/core/CheckRealizable.scala @@ -149,7 +149,7 @@ class CheckRealizable(using Context) { */ private def boundsRealizability(tp: Type) = { - val memberProblems = withMode(Mode.CheckBounds) { + val memberProblems = withMode(Mode.CheckBoundsOrSelfType) { for { mbr <- tp.nonClassTypeMembers if !(mbr.info.loBound <:< mbr.info.hiBound) @@ -157,7 +157,7 @@ class CheckRealizable(using Context) { yield new HasProblemBounds(mbr.name, mbr.info) } - val refinementProblems = withMode(Mode.CheckBounds) { + val refinementProblems = withMode(Mode.CheckBoundsOrSelfType) { for { name <- refinedNames(tp) if (name.isTypeName) diff --git a/compiler/src/dotty/tools/dotc/core/Constraint.scala b/compiler/src/dotty/tools/dotc/core/Constraint.scala index 07b6e71cdcc9..c634f847e510 100644 --- a/compiler/src/dotty/tools/dotc/core/Constraint.scala +++ b/compiler/src/dotty/tools/dotc/core/Constraint.scala @@ -4,6 +4,7 @@ package core import Types._, Contexts._ import printing.Showable +import util.{SimpleIdentitySet, SimpleIdentityMap} /** Constraint over undetermined type parameters. Constraints are built * over values of the following types: @@ -70,6 +71,9 @@ abstract class Constraint extends Showable { */ def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds + /** The current bounds of type parameter `param` */ + def bounds(param: TypeParamRef)(using Context): TypeBounds + /** A new constraint which is derived from this constraint by adding * entries for all type parameters of `poly`. * @param tvars A list of type variables associated with the params, @@ -87,6 +91,8 @@ abstract class Constraint extends Showable { * - Another type, indicating a solution for the parameter * * @pre `this contains param`. + * @pre `tp` does not contain top-level references to `param` + * (see `validBoundsFor`) */ def updateEntry(param: TypeParamRef, tp: Type)(using Context): This @@ -128,7 +134,7 @@ abstract class Constraint extends Showable { /** Is `tv` marked as hard in the constraint? */ def isHard(tv: TypeVar): Boolean - + /** The same as this constraint, but with `tv` marked as hard. */ def withHard(tv: TypeVar)(using Context): This @@ -165,15 +171,49 @@ abstract class Constraint extends Showable { */ def hasConflictingTypeVarsFor(tl: TypeLambda, that: Constraint): Boolean - /** Check that no constrained parameter contains itself as a bound */ - def checkNonCyclic()(using Context): this.type - /** Does `param` occur at the toplevel in `tp` ? * Toplevel means: the type itself or a factor in some * combination of `&` or `|` types. 
*/ def occursAtToplevel(param: TypeParamRef, tp: Type)(using Context): Boolean + /** Sanitize `bound` to make it either a valid upper or lower bound for + * `param` depending on `isUpper`. + * + * Toplevel references to `param`, are replaced by `Any` if `isUpper` is true + * and `Nothing` otherwise. + * + * @see `occursAtTopLevel` for a definition of "toplevel" + * @see `validBoundsFor` to sanitize both the lower and upper bound at once. + */ + def validBoundFor(param: TypeParamRef, bound: Type, isUpper: Boolean)(using Context): Type + + /** Sanitize `bounds` to make them valid constraints for `param`. + * + * @see `validBoundFor` for details. + */ + def validBoundsFor(param: TypeParamRef, bounds: TypeBounds)(using Context): Type + + /** A string that shows the reverse dependencies maintained by this constraint + * (coDeps and contraDeps for OrderingConstraints). + */ + def depsToString(using Context): String + + /** Does the constraint restricted to variables outside `except` depend on `tv` + * in the given direction `co`? + * @param `co` If true, test whether the constraint would change if the variable is made larger + * otherwise, test whether the constraint would change if the variable is made smaller. + */ + def dependsOn(tv: TypeVar, except: TypeVars, co: Boolean)(using Context): Boolean + + /** Depending on Config settngs: + * - Under `checkConstraintsNonCyclic`, check that no constrained + * parameter contains itself as a bound. + * - Under `checkConstraintDeps`, check hat reverse dependencies in + * constraints are correct and complete. + */ + def checkWellFormed()(using Context): this.type + /** Check that constraint only refers to TypeParamRefs bound by itself */ def checkClosed()(using Context): Unit diff --git a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala index 1dfa04822766..9ffe2bda73cb 100644 --- a/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala +++ b/compiler/src/dotty/tools/dotc/core/ConstraintHandling.scala @@ -58,6 +58,12 @@ trait ConstraintHandling { */ protected var comparedTypeLambdas: Set[TypeLambda] = Set.empty + /** Used for match type reduction: If false, we don't recognize an abstract type + * to be a subtype type of any of its base classes. This is in place only at the + * toplevel; it is turned on again when we add parts of the scrutinee to the constraint. + */ + protected var canWidenAbstract: Boolean = true + protected var myNecessaryConstraintsOnly = false /** When collecting the constraints needed for a particular subtyping * judgment to be true, we sometimes need to approximate the constraint @@ -146,8 +152,8 @@ trait ConstraintHandling { return param LevelAvoidMap(0, maxLevel)(param) match case freshVar: TypeVar => freshVar.origin - case _ => throw new TypeError( - i"Could not decrease the nesting level of ${param} from ${nestingLevel(param)} to $maxLevel in $constraint") + case _ => throw TypeError( + em"Could not decrease the nesting level of ${param} from ${nestingLevel(param)} to $maxLevel in $constraint") def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = constraint.nonParamBounds(param) @@ -251,7 +257,7 @@ trait ConstraintHandling { end LevelAvoidMap /** Approximate `rawBound` if needed to make it a legal bound of `param` by - * avoiding wildcards and types with a level strictly greater than its + * avoiding cycles, wildcards and types with a level strictly greater than its * `nestingLevel`. 
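A standalone toy model of what `validBoundFor` is specified to do; the `Tp` mini-ADT below is illustrative only, not the compiler's `Type` hierarchy. Toplevel occurrences of the constrained parameter are replaced by `Any` in upper bounds and `Nothing` in lower bounds, while occurrences under a type constructor are left alone:

```scala
// Toy model only: "toplevel" means the type itself or a factor of & / |
// combinations, mirroring occursAtToplevel above.
enum Tp:
  case Param(name: String)
  case And(tp1: Tp, tp2: Tp)
  case Or(tp1: Tp, tp2: Tp)
  case Applied(tycon: String, args: List[Tp])
  case AnyTp, NothingTp
import Tp.*

def validBoundFor(param: String, bound: Tp, isUpper: Boolean): Tp =
  def recur(tp: Tp): Tp = tp match
    case Param(`param`) => if isUpper then AnyTp else NothingTp
    case And(tp1, tp2)  => And(recur(tp1), recur(tp2))
    case Or(tp1, tp2)   => Or(recur(tp1), recur(tp2))
    case _              => tp   // arguments of Applied are not at toplevel
  recur(bound)

@main def demoValidBound(): Unit =
  // An upper bound `P & Ordered[P]` is sanitized to `Any & Ordered[P]`.
  val bound = And(Param("P"), Applied("Ordered", List(Param("P"))))
  println(validBoundFor("P", bound, isUpper = true))
```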
* * Note that level-checking must be performed here and cannot be delayed @@ -277,7 +283,7 @@ trait ConstraintHandling { // This is necessary for i8900-unflip.scala to typecheck. val v = if necessaryConstraintsOnly then -this.variance else this.variance atVariance(v)(super.legalVar(tp)) - approx(rawBound) + constraint.validBoundFor(param, approx(rawBound), isUpper) end legalBound protected def addOneBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Boolean = @@ -407,8 +413,10 @@ trait ConstraintHandling { constraint = constraint.addLess(p2, p1, direction = if pKept eq p1 then KeepParam2 else KeepParam1) - val boundKept = constraint.nonParamBounds(pKept).substParam(pRemoved, pKept) - var boundRemoved = constraint.nonParamBounds(pRemoved).substParam(pRemoved, pKept) + val boundKept = constraint.validBoundsFor(pKept, + constraint.nonParamBounds( pKept).substParam(pRemoved, pKept).bounds) + var boundRemoved = constraint.validBoundsFor(pKept, + constraint.nonParamBounds(pRemoved).substParam(pRemoved, pKept).bounds) if level1 != level2 then boundRemoved = LevelAvoidMap(-1, math.min(level1, level2))(boundRemoved) @@ -550,6 +558,13 @@ trait ConstraintHandling { inst end approximation + private def isTransparent(tp: Type, traitOnly: Boolean)(using Context): Boolean = tp match + case AndType(tp1, tp2) => + isTransparent(tp1, traitOnly) && isTransparent(tp2, traitOnly) + case _ => + val cls = tp.underlyingClassRef(refinementOK = false).typeSymbol + cls.isTransparentClass && (!traitOnly || cls.is(Trait)) + /** If `tp` is an intersection such that some operands are transparent trait instances * and others are not, replace as many transparent trait instances as possible with Any * as long as the result is still a subtype of `bound`. But fall back to the @@ -562,18 +577,17 @@ trait ConstraintHandling { var dropped: List[Type] = List() // the types dropped so far, last one on top def dropOneTransparentTrait(tp: Type): Type = - val tpd = tp.dealias - if tpd.typeSymbol.isTransparentTrait && !tpd.isLambdaSub && !kept.contains(tpd) then - dropped = tpd :: dropped + if isTransparent(tp, traitOnly = true) && !kept.contains(tp) then + dropped = tp :: dropped defn.AnyType - else tpd match + else tp match case AndType(tp1, tp2) => val tp1w = dropOneTransparentTrait(tp1) if tp1w ne tp1 then tp1w & tp2 else val tp2w = dropOneTransparentTrait(tp2) if tp2w ne tp2 then tp1 & tp2w - else tpd + else tp case _ => tp @@ -612,8 +626,9 @@ trait ConstraintHandling { /** Widen inferred type `inst` with upper `bound`, according to the following rules: * 1. If `inst` is a singleton type, or a union containing some singleton types, - * widen (all) the singleton type(s), provided the result is a subtype of `bound`. - * (i.e. `inst.widenSingletons <:< bound` succeeds with satisfiable constraint) + * widen (all) the singleton type(s), provided the result is a subtype of `bound` + * (i.e. `inst.widenSingletons <:< bound` succeeds with satisfiable constraint) and + * is not transparent according to `isTransparent`. * 2a. If `inst` is a union type and `widenUnions` is true, approximate the union type * from above by an intersection of all common base types, provided the result * is a subtype of `bound`. 
@@ -635,7 +650,7 @@ trait ConstraintHandling { def widenOr(tp: Type) = if widenUnions then val tpw = tp.widenUnion - if (tpw ne tp) && (tpw <:< bound) then tpw else tp + if (tpw ne tp) && !isTransparent(tpw, traitOnly = false) && (tpw <:< bound) then tpw else tp else tp.hardenUnions def widenSingle(tp: Type) = @@ -648,7 +663,12 @@ trait ConstraintHandling { val wideInst = if isSingleton(bound) then inst - else dropTransparentTraits(widenIrreducible(widenOr(widenSingle(inst))), bound) + else + val widenedFromSingle = widenSingle(inst) + val widenedFromUnion = widenOr(widenedFromSingle) + val widened = dropTransparentTraits(widenedFromUnion, bound) + widenIrreducible(widened) + wideInst match case wideInst: TypeRef if wideInst.symbol.is(Module) => TermRef(wideInst.prefix, wideInst.symbol.sourceModule) @@ -729,16 +749,7 @@ trait ConstraintHandling { } /** The current bounds of type parameter `param` */ - def bounds(param: TypeParamRef)(using Context): TypeBounds = { - val e = constraint.entry(param) - if (e.exists) e.bounds - else { - // TODO: should we change the type of paramInfos to nullable? - val pinfos: List[param.binder.PInfo] | Null = param.binder.paramInfos - if (pinfos != null) pinfos(param.paramNum) // pinfos == null happens in pos/i536.scala - else TypeBounds.empty - } - } + def bounds(param: TypeParamRef)(using Context): TypeBounds = constraint.bounds(param) /** Add type lambda `tl`, possibly with type variables `tvars`, to current constraint * and propagate all bounds. @@ -839,13 +850,17 @@ trait ConstraintHandling { //checkPropagated(s"adding $description")(true) // DEBUG in case following fails checkPropagated(s"added $description") { addConstraintInvocations += 1 + val saved = canWidenAbstract + canWidenAbstract = true try bound match case bound: TypeParamRef if constraint contains bound => addParamBound(bound) case _ => val pbound = avoidLambdaParams(bound) kindCompatible(param, pbound) && addBoundTransitively(param, pbound, !fromBelow) - finally addConstraintInvocations -= 1 + finally + canWidenAbstract = saved + addConstraintInvocations -= 1 } end addConstraint diff --git a/compiler/src/dotty/tools/dotc/core/Contexts.scala b/compiler/src/dotty/tools/dotc/core/Contexts.scala index d2a88a422b2e..2f28975dd066 100644 --- a/compiler/src/dotty/tools/dotc/core/Contexts.scala +++ b/compiler/src/dotty/tools/dotc/core/Contexts.scala @@ -28,6 +28,7 @@ import printing._ import config.{JavaPlatform, SJSPlatform, Platform, ScalaSettings} import classfile.ReusableDataReader import StdNames.nme +import compiletime.uninitialized import scala.annotation.internal.sharable @@ -123,7 +124,9 @@ object Contexts { */ abstract class Context(val base: ContextBase) { thiscontext => - given Context = this + protected given Context = this + + def outer: Context /** All outer contexts, ending in `base.initialCtx` and then `NoContext` */ def outersIterator: Iterator[Context] = new Iterator[Context] { @@ -132,65 +135,21 @@ object Contexts { def next = { val c = current; current = current.outer; c } } - /** The outer context */ - private var _outer: Context = _ - protected def outer_=(outer: Context): Unit = _outer = outer - final def outer: Context = _outer - - /** The current context */ - private var _period: Period = _ - protected def period_=(period: Period): Unit = { - assert(period.firstPhaseId == period.lastPhaseId, period) - _period = period - } - final def period: Period = _period - - /** The scope nesting level */ - private var _mode: Mode = _ - protected def mode_=(mode: Mode): Unit = _mode = 
mode - final def mode: Mode = _mode - - /** The current owner symbol */ - private var _owner: Symbol = _ - protected def owner_=(owner: Symbol): Unit = _owner = owner - final def owner: Symbol = _owner - - /** The current tree */ - private var _tree: Tree[? >: Untyped]= _ - protected def tree_=(tree: Tree[? >: Untyped]): Unit = _tree = tree - final def tree: Tree[? >: Untyped] = _tree - - /** The current scope */ - private var _scope: Scope = _ - protected def scope_=(scope: Scope): Unit = _scope = scope - final def scope: Scope = _scope - - /** The current typerstate */ - private var _typerState: TyperState = _ - protected def typerState_=(typerState: TyperState): Unit = _typerState = typerState - final def typerState: TyperState = _typerState - - /** The current bounds in force for type parameters appearing in a GADT */ - private var _gadt: GadtConstraint = _ - protected def gadt_=(gadt: GadtConstraint): Unit = _gadt = gadt - final def gadt: GadtConstraint = _gadt - - /** The history of implicit searches that are currently active */ - private var _searchHistory: SearchHistory = _ - protected def searchHistory_= (searchHistory: SearchHistory): Unit = _searchHistory = searchHistory - final def searchHistory: SearchHistory = _searchHistory - - /** The current source file */ - private var _source: SourceFile = _ - protected def source_=(source: SourceFile): Unit = _source = source - final def source: SourceFile = _source + def period: Period + def mode: Mode + def owner: Symbol + def tree: Tree[?] + def scope: Scope + def typerState: TyperState + def gadt: GadtConstraint = gadtState.gadt + def gadtState: GadtState + def searchHistory: SearchHistory + def source: SourceFile /** A map in which more contextual properties can be stored * Typically used for attributes that are read and written only in special situations. */ - private var _moreProperties: Map[Key[Any], Any] = _ - protected def moreProperties_=(moreProperties: Map[Key[Any], Any]): Unit = _moreProperties = moreProperties - final def moreProperties: Map[Key[Any], Any] = _moreProperties + def moreProperties: Map[Key[Any], Any] def property[T](key: Key[T]): Option[T] = moreProperties.get(key).asInstanceOf[Option[T]] @@ -200,9 +159,7 @@ object Contexts { * Access to store entries is much faster than access to properties, and only * slightly slower than a normal field access would be. */ - private var _store: Store = _ - protected def store_=(store: Store): Unit = _store = store - final def store: Store = _store + def store: Store /** The compiler callback implementation, or null if no callback will be called. 
*/ def compilerCallback: CompilerCallback = store(compilerCallbackLoc) @@ -240,7 +197,7 @@ object Contexts { def typeAssigner: TypeAssigner = store(typeAssignerLoc) /** The new implicit references that are introduced by this scope */ - protected var implicitsCache: ContextualImplicits | Null = null + private var implicitsCache: ContextualImplicits | Null = null def implicits: ContextualImplicits = { if (implicitsCache == null) implicitsCache = { @@ -299,13 +256,12 @@ object Contexts { file catch case ex: InvalidPathException => - report.error(s"invalid file path: ${ex.getMessage}") + report.error(em"invalid file path: ${ex.getMessage}") NoAbstractFile /** AbstractFile with given path, memoized */ def getFile(name: String): AbstractFile = getFile(name.toTermName) - private var related: SimpleIdentityMap[Phase | SourceFile, Context] | Null = null private def lookup(key: Phase | SourceFile): Context | Null = @@ -356,7 +312,7 @@ object Contexts { /** If -Ydebug is on, the top of the stack trace where this context * was created, otherwise `null`. */ - private var creationTrace: Array[StackTraceElement] = _ + private var creationTrace: Array[StackTraceElement] = uninitialized private def setCreationTrace() = creationTrace = (new Throwable).getStackTrace().take(20) @@ -455,7 +411,7 @@ object Contexts { val constrCtx = outersIterator.dropWhile(_.outer.owner == owner).next() superOrThisCallContext(owner, constrCtx.scope) .setTyperState(typerState) - .setGadt(gadt) + .setGadtState(gadtState) .fresh .setScope(this.scope) } @@ -469,7 +425,7 @@ object Contexts { } /** The context of expression `expr` seen as a member of a statement sequence */ - def exprContext(stat: Tree[? >: Untyped], exprOwner: Symbol): Context = + def exprContext(stat: Tree[?], exprOwner: Symbol): Context = if (exprOwner == this.owner) this else if (untpd.isSuperConstrCall(stat) && this.owner.isClass) superCallContext else fresh.setOwner(exprOwner) @@ -491,36 +447,11 @@ object Contexts { /** Is the explicit nulls option set? */ def explicitNulls: Boolean = base.settings.YexplicitNulls.value - /** Initialize all context fields, except typerState, which has to be set separately - * @param outer The outer context - * @param origin The context from which fields are copied - */ - private[Contexts] def init(outer: Context, origin: Context): this.type = { - _outer = outer - _period = origin.period - _mode = origin.mode - _owner = origin.owner - _tree = origin.tree - _scope = origin.scope - _gadt = origin.gadt - _searchHistory = origin.searchHistory - _source = origin.source - _moreProperties = origin.moreProperties - _store = origin.store - this - } - - def reuseIn(outer: Context): this.type = - implicitsCache = null - related = null - init(outer, outer) - /** A fresh clone of this context embedded in this context. */ def fresh: FreshContext = freshOver(this) /** A fresh clone of this context embedded in the specified `outer` context. */ def freshOver(outer: Context): FreshContext = - util.Stats.record("Context.fresh") FreshContext(base).init(outer, this).setTyperState(this.typerState) final def withOwner(owner: Symbol): Context = @@ -565,6 +496,15 @@ object Contexts { def uniques: util.WeakHashSet[Type] = base.uniques def initialize()(using Context): Unit = base.initialize() + + protected def resetCaches(): Unit = + implicitsCache = null + related = null + + /** Reuse this context as a fresh context nested inside `outer` + * But keep the typerstate, this one has to be set explicitly if needed. 
+ */ + def reuseIn(outer: Context): this.type } /** A condensed context provides only a small memory footprint over @@ -579,55 +519,138 @@ object Contexts { * of its attributes using the with... methods. */ class FreshContext(base: ContextBase) extends Context(base) { + util.Stats.record("Context.fresh") + + private var _outer: Context = uninitialized + def outer: Context = _outer + + private var _period: Period = uninitialized + final def period: Period = _period + + private var _mode: Mode = uninitialized + final def mode: Mode = _mode + + private var _owner: Symbol = uninitialized + final def owner: Symbol = _owner + + private var _tree: Tree[?]= _ + final def tree: Tree[?] = _tree + + private var _scope: Scope = uninitialized + final def scope: Scope = _scope + + private var _typerState: TyperState = uninitialized + final def typerState: TyperState = _typerState + + private var _gadtState: GadtState = uninitialized + final def gadtState: GadtState = _gadtState + + private var _searchHistory: SearchHistory = uninitialized + final def searchHistory: SearchHistory = _searchHistory + + private var _source: SourceFile = uninitialized + final def source: SourceFile = _source + + private var _moreProperties: Map[Key[Any], Any] = uninitialized + final def moreProperties: Map[Key[Any], Any] = _moreProperties + + private var _store: Store = uninitialized + final def store: Store = _store + + /** Initialize all context fields, except typerState, which has to be set separately + * @param outer The outer context + * @param origin The context from which fields are copied + */ + private[Contexts] def init(outer: Context, origin: Context): this.type = { + _outer = outer + _period = origin.period + _mode = origin.mode + _owner = origin.owner + _tree = origin.tree + _scope = origin.scope + _gadtState = origin.gadtState + _searchHistory = origin.searchHistory + _source = origin.source + _moreProperties = origin.moreProperties + _store = origin.store + this + } + + def reuseIn(outer: Context): this.type = + resetCaches() + init(outer, outer) + def setPeriod(period: Period): this.type = util.Stats.record("Context.setPeriod") - this.period = period + //assert(period.firstPhaseId == period.lastPhaseId, period) + this._period = period this + def setMode(mode: Mode): this.type = util.Stats.record("Context.setMode") - this.mode = mode + this._mode = mode this + def setOwner(owner: Symbol): this.type = util.Stats.record("Context.setOwner") assert(owner != NoSymbol) - this.owner = owner + this._owner = owner this - def setTree(tree: Tree[? 
>: Untyped]): this.type = + + def setTree(tree: Tree[?]): this.type = util.Stats.record("Context.setTree") - this.tree = tree + this._tree = tree this - def setScope(scope: Scope): this.type = { this.scope = scope; this } + + def setScope(scope: Scope): this.type = + this._scope = scope + this + def setNewScope: this.type = util.Stats.record("Context.setScope") - this.scope = newScope + this._scope = newScope + this + + def setTyperState(typerState: TyperState): this.type = + this._typerState = typerState this - def setTyperState(typerState: TyperState): this.type = { this.typerState = typerState; this } - def setNewTyperState(): this.type = setTyperState(typerState.fresh(committable = true)) - def setExploreTyperState(): this.type = setTyperState(typerState.fresh(committable = false)) - def setReporter(reporter: Reporter): this.type = setTyperState(typerState.fresh().setReporter(reporter)) - def setTyper(typer: Typer): this.type = { this.scope = typer.scope; setTypeAssigner(typer) } - def setGadt(gadt: GadtConstraint): this.type = - util.Stats.record("Context.setGadt") - this.gadt = gadt + def setNewTyperState(): this.type = + setTyperState(typerState.fresh(committable = true)) + def setExploreTyperState(): this.type = + setTyperState(typerState.fresh(committable = false)) + def setReporter(reporter: Reporter): this.type = + setTyperState(typerState.fresh().setReporter(reporter)) + + def setTyper(typer: Typer): this.type = + this._scope = typer.scope + setTypeAssigner(typer) + + def setGadtState(gadtState: GadtState): this.type = + util.Stats.record("Context.setGadtState") + this._gadtState = gadtState this - def setFreshGADTBounds: this.type = setGadt(gadt.fresh) + def setFreshGADTBounds: this.type = + setGadtState(gadtState.fresh) + def setSearchHistory(searchHistory: SearchHistory): this.type = util.Stats.record("Context.setSearchHistory") - this.searchHistory = searchHistory + this._searchHistory = searchHistory this + def setSource(source: SourceFile): this.type = util.Stats.record("Context.setSource") - this.source = source + this._source = source this + private def setMoreProperties(moreProperties: Map[Key[Any], Any]): this.type = util.Stats.record("Context.setMoreProperties") - this.moreProperties = moreProperties + this._moreProperties = moreProperties this + private def setStore(store: Store): this.type = util.Stats.record("Context.setStore") - this.store = store + this._store = store this - def setImplicits(implicits: ContextualImplicits): this.type = { this.implicitsCache = implicits; this } def setCompilationUnit(compilationUnit: CompilationUnit): this.type = { setSource(compilationUnit.source) @@ -681,6 +704,28 @@ object Contexts { def setDebug: this.type = setSetting(base.settings.Ydebug, true) } + object FreshContext: + /** Defines an initial context with given context base and possible settings. 
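Each setter above returns `this.type`, so nested contexts are configured by chaining. A hypothetical call site as a sketch (the `nestedTypingContext` helper is illustrative, not part of this change):

```scala
// Hypothetical helper: builds a nested context with its own owner, scope,
// compilation unit and a fresh committable typer state.
def nestedTypingContext(owner: Symbol, unit: CompilationUnit)(using Context): FreshContext =
  ctx.fresh
    .setOwner(owner)
    .setNewScope
    .setCompilationUnit(unit)
    .setNewTyperState()
```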
*/ + def initial(base: ContextBase, settingsGroup: SettingGroup): Context = + val c = new FreshContext(base) + c._outer = NoContext + c._period = InitialPeriod + c._mode = Mode.None + c._typerState = TyperState.initialState() + c._owner = NoSymbol + c._tree = untpd.EmptyTree + c._moreProperties = Map(MessageLimiter -> DefaultMessageLimiter()) + c._scope = EmptyScope + c._source = NoSource + c._store = initialStore + .updated(settingsStateLoc, settingsGroup.defaultState) + .updated(notNullInfosLoc, Nil) + .updated(compilationUnitLoc, NoCompilationUnit) + c._searchHistory = new SearchRoot + c._gadtState = GadtState(GadtConstraint.empty) + c + end FreshContext + given ops: AnyRef with extension (c: Context) def addNotNullInfo(info: NotNullInfo) = @@ -710,56 +755,40 @@ object Contexts { final def retractMode(mode: Mode): c.type = c.setMode(c.mode &~ mode) } - private def exploreCtx(using Context): FreshContext = - util.Stats.record("explore") - val base = ctx.base - import base._ - val nestedCtx = - if exploresInUse < exploreContexts.size then - exploreContexts(exploresInUse).reuseIn(ctx) - else - val ts = TyperState() - .setReporter(ExploringReporter()) - .setCommittable(false) - val c = FreshContext(ctx.base).init(ctx, ctx).setTyperState(ts) - exploreContexts += c - c - exploresInUse += 1 - val nestedTS = nestedCtx.typerState - nestedTS.init(ctx.typerState, ctx.typerState.constraint) - nestedCtx - - private def wrapUpExplore(ectx: Context) = - ectx.reporter.asInstanceOf[ExploringReporter].reset() - ectx.base.exploresInUse -= 1 - + /** Run `op` with a pool-allocated context that has an ExporeTyperState. */ inline def explore[T](inline op: Context ?=> T)(using Context): T = - val ectx = exploreCtx - try op(using ectx) finally wrapUpExplore(ectx) + exploreInFreshCtx(op) + /** Run `op` with a pool-allocated FreshContext that has an ExporeTyperState. */ inline def exploreInFreshCtx[T](inline op: FreshContext ?=> T)(using Context): T = - val ectx = exploreCtx - try op(using ectx) finally wrapUpExplore(ectx) - - private def changeOwnerCtx(owner: Symbol)(using Context): Context = - val base = ctx.base - import base._ - val nestedCtx = - if changeOwnersInUse < changeOwnerContexts.size then - changeOwnerContexts(changeOwnersInUse).reuseIn(ctx) - else - val c = FreshContext(ctx.base).init(ctx, ctx) - changeOwnerContexts += c - c - changeOwnersInUse += 1 - nestedCtx.setOwner(owner).setTyperState(ctx.typerState) - - /** Run `op` in current context, with a mode is temporarily set as specified. + val pool = ctx.base.exploreContextPool + val nestedCtx = pool.next() + try op(using nestedCtx) + finally + nestedCtx.typerState.reporter.asInstanceOf[ExploringReporter].reset() + pool.free() + + /** Run `op` with a pool-allocated context that has a fresh typer state. + * Commit the typer state if `commit` applied to `op`'s result returns true. */ + inline def withFreshTyperState[T](inline op: Context ?=> T, inline commit: T => Context ?=> Boolean)(using Context): T = + val pool = ctx.base.freshTSContextPool + val nestedCtx = pool.next() + try + val result = op(using nestedCtx) + if commit(result)(using nestedCtx) then + nestedCtx.typerState.commit() + nestedCtx.typerState.setCommittable(true) + result + finally + pool.free() + + /** Run `op` with a pool-allocated context that has the given `owner`. 
*/ inline def runWithOwner[T](owner: Symbol)(inline op: Context ?=> T)(using Context): T = if Config.reuseOwnerContexts then - try op(using changeOwnerCtx(owner)) - finally ctx.base.changeOwnersInUse -= 1 + val pool = ctx.base.generalContextPool + try op(using pool.next().setOwner(owner).setTyperState(ctx.typerState)) + finally pool.free() else op(using ctx.fresh.setOwner(owner)) @@ -796,30 +825,9 @@ object Contexts { finally ctx.base.comparersInUse = saved end comparing - /** A class defining the initial context with given context base - * and set of possible settings. - */ - private class InitialContext(base: ContextBase, settingsGroup: SettingGroup) extends FreshContext(base) { - outer = NoContext - period = InitialPeriod - mode = Mode.None - typerState = TyperState.initialState() - owner = NoSymbol - tree = untpd.EmptyTree - moreProperties = Map(MessageLimiter -> DefaultMessageLimiter()) - scope = EmptyScope - source = NoSource - store = initialStore - .updated(settingsStateLoc, settingsGroup.defaultState) - .updated(notNullInfosLoc, Nil) - .updated(compilationUnitLoc, NoCompilationUnit) - searchHistory = new SearchRoot - gadt = GadtConstraint.empty - } - - @sharable object NoContext extends Context((null: ContextBase | Null).uncheckedNN) { - source = NoSource + @sharable val NoContext: Context = new FreshContext((null: ContextBase | Null).uncheckedNN) { override val implicits: ContextualImplicits = new ContextualImplicits(Nil, null, false)(this: @unchecked) + setSource(NoSource) } /** A context base defines state and associated methods that exist once per @@ -833,10 +841,10 @@ object Contexts { val settings: ScalaSettings = new ScalaSettings /** The initial context */ - val initialCtx: Context = new InitialContext(this, settings) + val initialCtx: Context = FreshContext.initial(this: @unchecked, settings) /** The platform, initialized by `initPlatform()`. 
*/ - private var _platform: Platform | Null = _ + private var _platform: Platform | Null = uninitialized /** The platform */ def platform: Platform = { @@ -872,6 +880,47 @@ object Contexts { allPhases.find(_.period.containsPhaseId(p.id)).getOrElse(NoPhase) } + class ContextPool: + protected def fresh()(using Context): FreshContext = + FreshContext(ctx.base).init(ctx, ctx) + + private var inUse: Int = 0 + private var pool = new mutable.ArrayBuffer[FreshContext] + + def next()(using Context): FreshContext = + val base = ctx.base + import base._ + val nestedCtx = + if inUse < pool.size then + pool(inUse).reuseIn(ctx) + else + val c = fresh() + pool += c + c + inUse += 1 + nestedCtx + + final def free(): Unit = + inUse -= 1 + end ContextPool + + class TSContextPool extends ContextPool: + override def next()(using Context) = + val nextCtx = super.next() + nextCtx.typerState.init(ctx.typerState, ctx.typerState.constraint) + nextCtx + + class FreshTSContextPool extends TSContextPool: + override protected def fresh()(using Context) = + super.fresh().setTyperState(ctx.typerState.fresh(committable = true)) + + class ExploreContextPool extends TSContextPool: + override protected def fresh()(using Context) = + val ts = TyperState() + .setReporter(ExploringReporter()) + .setCommittable(false) + super.fresh().setTyperState(ts) + /** The essential mutable state of a context base, collected into a common class */ class ContextState { // Symbols state @@ -922,22 +971,27 @@ object Contexts { // Phases state - private[core] var phasesPlan: List[List[Phase]] = _ + private[core] var phasesPlan: List[List[Phase]] = uninitialized /** Phases by id */ - private[dotc] var phases: Array[Phase] = _ + private[dotc] var phases: Array[Phase] = uninitialized /** Phases with consecutive Transforms grouped into a single phase, Empty array if fusion is disabled */ private[core] var fusedPhases: Array[Phase] = Array.empty[Phase] /** Next denotation transformer id */ - private[core] var nextDenotTransformerId: Array[Int] = _ + private[core] var nextDenotTransformerId: Array[Int] = uninitialized - private[core] var denotTransformers: Array[DenotTransformer] = _ + private[core] var denotTransformers: Array[DenotTransformer] = uninitialized /** Flag to suppress inlining, set after overflow */ private[dotc] var stopInlining: Boolean = false + /** Cached -Yno-double-bindings setting. This is accessed from `setDenot`, which + * is fairly hot, so we don't want to lookup the setting each time it is called. + */ + private[dotc] var checkNoDoubleBindings = false + /** A variable that records that some error was reported in a globally committable context. * The error will not necessarlily be emitted, since it could still be that * the enclosing context will be aborted. 
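All pool classes above share one discipline: take a context with `next()`, always release it with `free()`. The inline helpers `explore`, `withFreshTyperState` and `runWithOwner` earlier in this file are the intended entry points; the sketch below only restates `runWithOwner`'s pooled path (without the `Config.reuseOwnerContexts` fallback) under a hypothetical name:

```scala
// Mirrors runWithOwner: borrow a pooled FreshContext, give it the desired
// owner and the current typer state, and always release it when done.
def withPooledOwner[T](owner: Symbol)(op: Context ?=> T)(using Context): T =
  val pool = ctx.base.generalContextPool
  try op(using pool.next().setOwner(owner).setTyperState(ctx.typerState))
  finally pool.free()
```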
The variable is used as a smoke test @@ -954,11 +1008,9 @@ object Contexts { protected[dotc] val indentTab: String = " " - private[Contexts] val exploreContexts = new mutable.ArrayBuffer[FreshContext] - private[Contexts] var exploresInUse: Int = 0 - - private[Contexts] val changeOwnerContexts = new mutable.ArrayBuffer[FreshContext] - private[Contexts] var changeOwnersInUse: Int = 0 + val exploreContextPool = ExploreContextPool() + val freshTSContextPool = FreshTSContextPool() + val generalContextPool = ContextPool() private[Contexts] val comparers = new mutable.ArrayBuffer[TypeComparer] private[Contexts] var comparersInUse: Int = 0 @@ -967,7 +1019,7 @@ object Contexts { private[core] val reusableDataReader = ReusableInstance(new ReusableDataReader()) - private[dotc] var wConfCache: (List[String], WConf) = _ + private[dotc] var wConfCache: (List[String], WConf) = uninitialized def sharedCharArray(len: Int): Array[Char] = while len > charArray.length do diff --git a/compiler/src/dotty/tools/dotc/core/Decorators.scala b/compiler/src/dotty/tools/dotc/core/Decorators.scala index d392a4e3079a..9f55e29c0f59 100644 --- a/compiler/src/dotty/tools/dotc/core/Decorators.scala +++ b/compiler/src/dotty/tools/dotc/core/Decorators.scala @@ -11,7 +11,7 @@ import printing.{ Printer, Showable }, printing.Formatting._, printing.Texts._ import transform.MegaPhase import reporting.{Message, NoExplanation} -/** This object provides useful implicit decorators for types defined elsewhere */ +/** This object provides useful extension methods for types defined elsewhere */ object Decorators { /** Extension methods for toType/TermName methods on PreNames. @@ -58,8 +58,11 @@ object Decorators { padding + s.replace("\n", "\n" + padding) end extension + /** Convert lazy string to message. To be with caution, since no message-defined + * formatting will be done on the string. + */ extension (str: => String) - def toMessage: Message = reporting.NoExplanation(str) + def toMessage: Message = NoExplanation(str)(using NoContext) /** Implements a findSymbol method on iterators of Symbols that * works like find but avoids Option, replacing None with NoSymbol. @@ -78,7 +81,7 @@ object Decorators { /** Implements filterConserve, zipWithConserve methods * on lists that avoid duplication of list nodes where feasible. */ - implicit class ListDecorator[T](val xs: List[T]) extends AnyVal { + extension [T](xs: List[T]) final def mapconserve[U](f: T => U): List[U] = { @tailrec @@ -207,11 +210,18 @@ object Decorators { } /** Union on lists seen as sets */ - def | (ys: List[T]): List[T] = xs ::: (ys filterNot (xs contains _)) + def setUnion (ys: List[T]): List[T] = xs ::: ys.filterNot(xs contains _) - /** Intersection on lists seen as sets */ - def & (ys: List[T]): List[T] = xs filter (ys contains _) - } + /** Reduce left with `op` as long as list `xs` is not longer than `seqLimit`. + * Otherwise, split list in two half, reduce each, and combine with `op`. + */ + def reduceBalanced(op: (T, T) => T, seqLimit: Int = 100): T = + val len = xs.length + if len > seqLimit then + val (leading, trailing) = xs.splitAt(len / 2) + op(leading.reduceBalanced(op, seqLimit), trailing.reduceBalanced(op, seqLimit)) + else + xs.reduceLeft(op) extension [T, U](xss: List[List[T]]) def nestedMap(f: T => U): List[List[U]] = xss match @@ -269,17 +279,22 @@ object Decorators { catch case ex: CyclicReference => "... (caught cyclic reference) ..." 
case NonFatal(ex) - if !ctx.mode.is(Mode.PrintShowExceptions) && !ctx.settings.YshowPrintErrors.value => - val msg = ex match { case te: TypeError => te.toMessage case _ => ex.getMessage } + if !ctx.mode.is(Mode.PrintShowExceptions) && !ctx.settings.YshowPrintErrors.value => + val msg = ex match + case te: TypeError => te.toMessage.message + case _ => ex.getMessage s"[cannot display due to $msg, raw string = $x]" case _ => String.valueOf(x).nn + /** Returns the simple class name of `x`. */ + def className: String = x.getClass.getSimpleName.nn + extension [T](x: T) def assertingErrorsReported(using Context): T = { assert(ctx.reporter.errorsReported) x } - def assertingErrorsReported(msg: => String)(using Context): T = { + def assertingErrorsReported(msg: Message)(using Context): T = { assert(ctx.reporter.errorsReported, msg) x } @@ -289,21 +304,16 @@ object Decorators { if (xs.head eq x1) && (xs.tail eq xs1) then xs else x1 :: xs1 extension (sc: StringContext) + /** General purpose string formatting */ def i(args: Shown*)(using Context): String = new StringFormatter(sc).assemble(args) - /** Formatting for error messages: Like `i` but suppress follow-on - * error messages after the first one if some of their arguments are "non-sensical". - */ - def em(args: Shown*)(using Context): String = - forErrorMessages(new StringFormatter(sc).assemble(args)) - - /** Formatting with added explanations: Like `em`, but add explanations to - * give more info about type variables and to disambiguate where needed. + /** Interpolator yielding an error message, which undergoes + * the formatting defined in Message. */ - def ex(args: Shown*)(using Context): String = - explained(new StringFormatter(sc).assemble(args)) + def em(args: Shown*)(using Context): NoExplanation = + NoExplanation(i(args*)) extension [T <: AnyRef](arr: Array[T]) def binarySearch(x: T | Null): Int = java.util.Arrays.binarySearch(arr.asInstanceOf[Array[Object | Null]], x) diff --git a/compiler/src/dotty/tools/dotc/core/Definitions.scala b/compiler/src/dotty/tools/dotc/core/Definitions.scala index 174244b4a456..56409ad050f6 100644 --- a/compiler/src/dotty/tools/dotc/core/Definitions.scala +++ b/compiler/src/dotty/tools/dotc/core/Definitions.scala @@ -518,8 +518,8 @@ class Definitions { def staticsMethod(name: PreName): TermSymbol = ScalaStaticsModule.requiredMethod(name) @tu lazy val DottyArraysModule: Symbol = requiredModule("scala.runtime.Arrays") - def newGenericArrayMethod(using Context): TermSymbol = DottyArraysModule.requiredMethod("newGenericArray") - def newArrayMethod(using Context): TermSymbol = DottyArraysModule.requiredMethod("newArray") + @tu lazy val newGenericArrayMethod: TermSymbol = DottyArraysModule.requiredMethod("newGenericArray") + @tu lazy val newArrayMethod: TermSymbol = DottyArraysModule.requiredMethod("newArray") def getWrapVarargsArrayModule: Symbol = ScalaRuntimeModule @@ -644,6 +644,8 @@ class Definitions { @tu lazy val RepeatedParamClass: ClassSymbol = enterSpecialPolyClass(tpnme.REPEATED_PARAM_CLASS, Covariant, Seq(ObjectType, SeqType)) + @tu lazy val IntoType: TypeSymbol = enterAliasType(tpnme.INTO, HKTypeLambda(TypeBounds.empty :: Nil)(_.paramRefs(0))) + // fundamental classes @tu lazy val StringClass: ClassSymbol = requiredClass("java.lang.String") def StringType: Type = StringClass.typeRef @@ -732,6 +734,10 @@ class Definitions { } def JavaEnumType = JavaEnumClass.typeRef + @tu lazy val MethodHandleClass: ClassSymbol = requiredClass("java.lang.invoke.MethodHandle") + @tu lazy val MethodHandlesLookupClass: 
ClassSymbol = requiredClass("java.lang.invoke.MethodHandles.Lookup") + @tu lazy val VarHandleClass: ClassSymbol = requiredClass("java.lang.invoke.VarHandle") + @tu lazy val StringBuilderClass: ClassSymbol = requiredClass("scala.collection.mutable.StringBuilder") @tu lazy val MatchErrorClass : ClassSymbol = requiredClass("scala.MatchError") @tu lazy val ConversionClass : ClassSymbol = requiredClass("scala.Conversion").typeRef.symbol.asClass @@ -856,7 +862,12 @@ class Definitions { @tu lazy val QuoteMatchingClass: ClassSymbol = requiredClass("scala.quoted.runtime.QuoteMatching") @tu lazy val QuoteMatching_ExprMatch: Symbol = QuoteMatchingClass.requiredMethod("ExprMatch") + @tu lazy val QuoteMatching_ExprMatchModule: Symbol = QuoteMatchingClass.requiredClass("ExprMatchModule") @tu lazy val QuoteMatching_TypeMatch: Symbol = QuoteMatchingClass.requiredMethod("TypeMatch") + @tu lazy val QuoteMatching_TypeMatchModule: Symbol = QuoteMatchingClass.requiredClass("TypeMatchModule") + @tu lazy val QuoteMatchingModule: Symbol = requiredModule("scala.quoted.runtime.QuoteMatching") + @tu lazy val QuoteMatching_KNil: Symbol = QuoteMatchingModule.requiredType("KNil") + @tu lazy val QuoteMatching_KCons: Symbol = QuoteMatchingModule.requiredType("KCons") @tu lazy val ToExprModule: Symbol = requiredModule("scala.quoted.ToExpr") @tu lazy val ToExprModule_BooleanToExpr: Symbol = ToExprModule.requiredMethod("BooleanToExpr") @@ -889,6 +900,8 @@ class Definitions { @tu lazy val QuotedTypeModule: Symbol = QuotedTypeClass.companionModule @tu lazy val QuotedTypeModule_of: Symbol = QuotedTypeModule.requiredMethod("of") + @tu lazy val MacroAnnotationClass: ClassSymbol = requiredClass("scala.annotation.MacroAnnotation") + @tu lazy val CanEqualClass: ClassSymbol = getClassIfDefined("scala.Eql").orElse(requiredClass("scala.CanEqual")).asClass def CanEqual_canEqualAny(using Context): TermSymbol = val methodName = if CanEqualClass.name == tpnme.Eql then nme.eqlAny else nme.canEqualAny @@ -960,18 +973,24 @@ class Definitions { def TupledFunctionClass(using Context): ClassSymbol = TupledFunctionTypeRef.symbol.asClass def RuntimeTupleFunctionsModule(using Context): Symbol = requiredModule("scala.runtime.TupledFunctions") + @tu lazy val boundaryModule: Symbol = requiredModule("scala.util.boundary") + @tu lazy val LabelClass: Symbol = requiredClass("scala.util.boundary.Label") + @tu lazy val BreakClass: Symbol = requiredClass("scala.util.boundary.Break") + @tu lazy val CapsModule: Symbol = requiredModule("scala.caps") - @tu lazy val Caps_unsafeBox: Symbol = CapsModule.requiredMethod("unsafeBox") - @tu lazy val Caps_unsafeUnbox: Symbol = CapsModule.requiredMethod("unsafeUnbox") @tu lazy val captureRoot: TermSymbol = CapsModule.requiredValue("*") + @tu lazy val CapsUnsafeModule: Symbol = requiredModule("scala.caps.unsafe") + @tu lazy val Caps_unsafeBox: Symbol = CapsUnsafeModule.requiredMethod("unsafeBox") + @tu lazy val Caps_unsafeUnbox: Symbol = CapsUnsafeModule.requiredMethod("unsafeUnbox") + @tu lazy val Caps_unsafeBoxFunArg: Symbol = CapsUnsafeModule.requiredMethod("unsafeBoxFunArg") // Annotation base classes @tu lazy val AnnotationClass: ClassSymbol = requiredClass("scala.annotation.Annotation") - @tu lazy val ClassfileAnnotationClass: ClassSymbol = requiredClass("scala.annotation.ClassfileAnnotation") @tu lazy val StaticAnnotationClass: ClassSymbol = requiredClass("scala.annotation.StaticAnnotation") @tu lazy val RefiningAnnotationClass: ClassSymbol = requiredClass("scala.annotation.RefiningAnnotation") // Annotation 
classes + @tu lazy val AllowConversionsAnnot: ClassSymbol = requiredClass("scala.annotation.allowConversions") @tu lazy val AnnotationDefaultAnnot: ClassSymbol = requiredClass("scala.annotation.internal.AnnotationDefault") @tu lazy val BeanPropertyAnnot: ClassSymbol = requiredClass("scala.beans.BeanProperty") @tu lazy val BooleanBeanPropertyAnnot: ClassSymbol = requiredClass("scala.beans.BooleanBeanProperty") @@ -991,6 +1010,7 @@ class Definitions { @tu lazy val MappedAlternativeAnnot: ClassSymbol = requiredClass("scala.annotation.internal.MappedAlternative") @tu lazy val MigrationAnnot: ClassSymbol = requiredClass("scala.annotation.migration") @tu lazy val NowarnAnnot: ClassSymbol = requiredClass("scala.annotation.nowarn") + @tu lazy val UnusedAnnot: ClassSymbol = requiredClass("scala.annotation.unused") @tu lazy val TransparentTraitAnnot: ClassSymbol = requiredClass("scala.annotation.transparentTrait") @tu lazy val NativeAnnot: ClassSymbol = requiredClass("scala.native") @tu lazy val RepeatedAnnot: ClassSymbol = requiredClass("scala.annotation.internal.Repeated") @@ -1013,6 +1033,8 @@ class Definitions { @tu lazy val UncheckedVarianceAnnot: ClassSymbol = requiredClass("scala.annotation.unchecked.uncheckedVariance") @tu lazy val VolatileAnnot: ClassSymbol = requiredClass("scala.volatile") @tu lazy val WithPureFunsAnnot: ClassSymbol = requiredClass("scala.annotation.internal.WithPureFuns") + @tu lazy val BeanGetterMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.beanGetter") + @tu lazy val BeanSetterMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.beanSetter") @tu lazy val FieldMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.field") @tu lazy val GetterMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.getter") @tu lazy val ParamMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.param") @@ -1029,8 +1051,10 @@ class Definitions { @tu lazy val JavaRepeatableAnnot: ClassSymbol = requiredClass("java.lang.annotation.Repeatable") // A list of meta-annotations that are relevant for fields and accessors - @tu lazy val FieldAccessorMetaAnnots: Set[Symbol] = + @tu lazy val NonBeanMetaAnnots: Set[Symbol] = Set(FieldMetaAnnot, GetterMetaAnnot, ParamMetaAnnot, SetterMetaAnnot) + @tu lazy val MetaAnnots: Set[Symbol] = + NonBeanMetaAnnots + BeanGetterMetaAnnot + BeanSetterMetaAnnot // A list of annotations that are commonly used to indicate that a field/method argument or return // type is not null. These annotations are used by the nullification logic in JavaNullInterop to @@ -1347,6 +1371,15 @@ class Definitions { @tu lazy val untestableClasses: Set[Symbol] = Set(NothingClass, NullClass, SingletonClass) + /** Base classes that are assumed to be pure for the purposes of capture checking. + * Every class inheriting from a pure baseclass is pure. 
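The boundaryModule, LabelClass and BreakClass symbols registered a bit earlier give the compiler access to the scala.util.boundary library API. A minimal usage sketch of that API, assuming the published boundary/break signatures; the firstIndex helper and the demo values are illustrative only.

    import scala.util.boundary, boundary.break

    // break jumps to the enclosing boundary through the Label it provides,
    // giving an early exit without a user-visible exception.
    def firstIndex[T](xs: List[T], elem: T): Int =
      boundary:
        for (x, i) <- xs.zipWithIndex do
          if x == elem then break(i)
        -1

    @main def boundaryDemo(): Unit =
      println(firstIndex(List("a", "b", "c"), "b")) // 1
      println(firstIndex(List("a", "b", "c"), "z")) // -1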
+ */ + @tu lazy val pureBaseClasses = Set(defn.AnyValClass, defn.ThrowableClass) + + /** Non-inheritable lasses that are assumed to be pure for the purposes of capture checking, + */ + @tu lazy val pureSimpleClasses = Set(StringClass, NothingClass, NullClass) + @tu lazy val AbstractFunctionType: Array[TypeRef] = mkArityArray("scala.runtime.AbstractFunction", MaxImplementedFunctionArity, 0).asInstanceOf[Array[TypeRef]] val AbstractFunctionClassPerRun: PerRun[Array[Symbol]] = new PerRun(AbstractFunctionType.map(_.symbol.asClass)) def AbstractFunctionClass(n: Int)(using Context): Symbol = AbstractFunctionClassPerRun()(using ctx)(n) @@ -1389,8 +1422,8 @@ class Definitions { val classRefs1 = new Array[TypeRef | Null](classRefs.length * 2) Array.copy(classRefs, 0, classRefs1, 0, classRefs.length) classRefs = classRefs1 - val funName = s"scala.$prefix$n" if classRefs(n) == null then + val funName = s"scala.$prefix$n" classRefs(n) = if prefix.startsWith("Impure") then staticRef(funName.toTypeName).symbol.typeRef @@ -1541,12 +1574,21 @@ class Definitions { private val PredefImportFns: RootRef = RootRef(() => ScalaPredefModule.termRef, isPredef=true) - @tu private lazy val JavaRootImportFns: List[RootRef] = - if ctx.settings.YnoImports.value then Nil - else JavaImportFns + @tu private lazy val YimportsImportFns: List[RootRef] = ctx.settings.Yimports.value.map { name => + val denot = + getModuleIfDefined(name).suchThat(_.is(Module)) `orElse` + getPackageClassIfDefined(name).suchThat(_.is(Package)) + if !denot.exists then + report.error(s"error: bad preamble import $name") + val termRef = denot.symbol.termRef + RootRef(() => termRef) + } + + @tu private lazy val JavaRootImportFns: List[RootRef] = JavaImportFns @tu private lazy val ScalaRootImportFns: List[RootRef] = - if ctx.settings.YnoImports.value then Nil + if !ctx.settings.Yimports.isDefault then YimportsImportFns + else if ctx.settings.YnoImports.value then Nil else if ctx.settings.YnoPredef.value then ScalaImportFns else ScalaImportFns :+ PredefImportFns @@ -1827,20 +1869,53 @@ class Definitions { def isInfix(sym: Symbol)(using Context): Boolean = (sym eq Object_eq) || (sym eq Object_ne) - @tu lazy val assumedTransparentTraits = - Set[Symbol](ComparableClass, ProductClass, SerializableClass, - // add these for now, until we had a chance to retrofit 2.13 stdlib - // we should do a more through sweep through it then. - requiredClass("scala.collection.SortedOps"), - requiredClass("scala.collection.StrictOptimizedSortedSetOps"), - requiredClass("scala.collection.generic.DefaultSerializable"), - requiredClass("scala.collection.generic.IsIterable"), - requiredClass("scala.collection.generic.IsIterableOnce"), - requiredClass("scala.collection.generic.IsMap"), - requiredClass("scala.collection.generic.IsSeq"), - requiredClass("scala.collection.generic.Subtractable"), - requiredClass("scala.collection.immutable.StrictOptimizedSeqOps") - ) + @tu lazy val assumedTransparentNames: Map[Name, Set[Symbol]] = + // add these for now, until we had a chance to retrofit 2.13 stdlib + // we should do a more through sweep through it then. 
+ val strs = Map( + "Any" -> Set("scala"), + "AnyVal" -> Set("scala"), + "Matchable" -> Set("scala"), + "Product" -> Set("scala"), + "Object" -> Set("java.lang"), + "Comparable" -> Set("java.lang"), + "Serializable" -> Set("java.io"), + "BitSetOps" -> Set("scala.collection"), + "IndexedSeqOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "IterableOnceOps" -> Set("scala.collection"), + "IterableOps" -> Set("scala.collection"), + "LinearSeqOps" -> Set("scala.collection", "scala.collection.immutable"), + "MapOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SeqOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SetOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SortedMapOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SortedOps" -> Set("scala.collection"), + "SortedSetOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "StrictOptimizedIterableOps" -> Set("scala.collection"), + "StrictOptimizedLinearSeqOps" -> Set("scala.collection"), + "StrictOptimizedMapOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSeqOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSetOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSortedMapOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSortedSetOps" -> Set("scala.collection", "scala.collection.immutable"), + "ArrayDequeOps" -> Set("scala.collection.mutable"), + "DefaultSerializable" -> Set("scala.collection.generic"), + "IsIterable" -> Set("scala.collection.generic"), + "IsIterableLowPriority" -> Set("scala.collection.generic"), + "IsIterableOnce" -> Set("scala.collection.generic"), + "IsIterableOnceLowPriority" -> Set("scala.collection.generic"), + "IsMap" -> Set("scala.collection.generic"), + "IsSeq" -> Set("scala.collection.generic")) + strs.map { case (simple, pkgs) => ( + simple.toTypeName, + pkgs.map(pkg => staticRef(pkg.toTermName, isPackage = true).symbol.moduleClass) + ) + } + + def isAssumedTransparent(sym: Symbol): Boolean = + assumedTransparentNames.get(sym.name) match + case Some(pkgs) => pkgs.contains(sym.owner) + case none => false // ----- primitive value class machinery ------------------------------------------ @@ -1962,6 +2037,7 @@ class Definitions { orType, RepeatedParamClass, ByNameParamClass2x, + IntoType, AnyValClass, NullClass, NothingClass, diff --git a/compiler/src/dotty/tools/dotc/core/Denotations.scala b/compiler/src/dotty/tools/dotc/core/Denotations.scala index f267e6c85e03..723f9408d805 100644 --- a/compiler/src/dotty/tools/dotc/core/Denotations.scala +++ b/compiler/src/dotty/tools/dotc/core/Denotations.scala @@ -175,7 +175,7 @@ object Denotations { * * @param symbol The referencing symbol, or NoSymbol is none exists */ - abstract class Denotation(val symbol: Symbol, protected var myInfo: Type) extends PreDenotation with printing.Showable { + abstract class Denotation(val symbol: Symbol, protected var myInfo: Type, val isType: Boolean) extends PreDenotation with printing.Showable { type AsSeenFromResult <: Denotation /** The type info. @@ -194,12 +194,6 @@ object Denotations { */ def infoOrCompleter: Type - /** The period during which this denotation is valid. */ - def validFor: Period - - /** Is this a reference to a type symbol? 
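The reworked assumedTransparentNames table above is keyed by simple type name and maps each name to the package classes that may own it, so isAssumedTransparent only compares a symbol's name and owner instead of forcing every listed class with requiredClass. A simplified standalone model of that two-step lookup, using plain strings in place of Name and Symbol; isAssumedTransparent below only mirrors the shape of the new method.

    // Simple name -> owner packages for which the name counts as transparent.
    val assumedTransparentNames: Map[String, Set[String]] = Map(
      "IterableOps" -> Set("scala.collection"),
      "SeqOps"      -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"),
      "IsIterable"  -> Set("scala.collection.generic"))

    // Transparent iff the simple name is known and the owner is one of its packages.
    def isAssumedTransparent(simpleName: String, ownerPackage: String): Boolean =
      assumedTransparentNames.get(simpleName) match
        case Some(pkgs) => pkgs.contains(ownerPackage)
        case None       => false

    @main def transparentDemo(): Unit =
      println(isAssumedTransparent("SeqOps", "scala.collection.immutable")) // true
      println(isAssumedTransparent("SeqOps", "scala.util"))                 // false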
*/ - def isType: Boolean - /** Is this a reference to a term symbol? */ def isTerm: Boolean = !isType @@ -229,6 +223,15 @@ object Denotations { */ def current(using Context): Denotation + /** The period during which this denotation is valid. */ + private var myValidFor: Period = Nowhere + + final def validFor: Period = myValidFor + final def validFor_=(p: Period): Unit = { + myValidFor = p + symbol.invalidateDenotCache() + } + /** Is this denotation different from NoDenotation or an ErrorDenotation? */ def exists: Boolean = true @@ -300,9 +303,9 @@ object Denotations { case NoDenotation | _: NoQualifyingRef | _: MissingRef => def argStr = if (args.isEmpty) "" else i" matching ($args%, %)" val msg = - if (site.exists) i"$site does not have a member $kind $name$argStr" - else i"missing: $kind $name$argStr" - throw new TypeError(msg) + if site.exists then em"$site does not have a member $kind $name$argStr" + else em"missing: $kind $name$argStr" + throw TypeError(msg) case denot => denot.symbol } @@ -571,7 +574,7 @@ object Denotations { end infoMeet /** A non-overloaded denotation */ - abstract class SingleDenotation(symbol: Symbol, initInfo: Type) extends Denotation(symbol, initInfo) { + abstract class SingleDenotation(symbol: Symbol, initInfo: Type, isType: Boolean) extends Denotation(symbol, initInfo, isType) { protected def newLikeThis(symbol: Symbol, info: Type, pre: Type, isRefinedMethod: Boolean): SingleDenotation final def name(using Context): Name = symbol.name @@ -610,16 +613,13 @@ object Denotations { */ def signature(sourceLanguage: SourceLanguage)(using Context): Signature = if (isType) Signature.NotAMethod // don't force info if this is a type denotation - else info match { + else info match case info: MethodOrPoly => try info.signature(sourceLanguage) - catch { // !!! DEBUG - case scala.util.control.NonFatal(ex) => - report.echo(s"cannot take signature of $info") - throw ex - } + catch case ex: Exception => + if ctx.debug then report.echo(s"cannot take signature of $info") + throw ex case _ => Signature.NotAMethod - } def derivedSingleDenotation(symbol: Symbol, info: Type, pre: Type = this.prefix, isRefinedMethod: Boolean = this.isRefinedMethod)(using Context): SingleDenotation = if ((symbol eq this.symbol) && (info eq this.info) && (pre eq this.prefix) && (isRefinedMethod == this.isRefinedMethod)) this @@ -644,15 +644,19 @@ object Denotations { def atSignature(sig: Signature, targetName: Name, site: Type, relaxed: Boolean)(using Context): SingleDenotation = val situated = if site == NoPrefix then this else asSeenFrom(site) - val sigMatches = sig.matchDegree(situated.signature) match - case FullMatch => - true - case MethodNotAMethodMatch => - // See comment in `matches` - relaxed && !symbol.is(JavaDefined) - case ParamMatch => - relaxed - case noMatch => + val sigMatches = + try + sig.matchDegree(situated.signature) match + case FullMatch => + true + case MethodNotAMethodMatch => + // See comment in `matches` + relaxed && !symbol.is(JavaDefined) + case ParamMatch => + relaxed + case noMatch => + false + catch case ex: MissingType => false if sigMatches && symbol.hasTargetName(targetName) then this else NoDenotation @@ -663,14 +667,6 @@ object Denotations { // ------ Transformations ----------------------------------------- - private var myValidFor: Period = Nowhere - - def validFor: Period = myValidFor - def validFor_=(p: Period): Unit = { - myValidFor = p - symbol.invalidateDenotCache() - } - /** The next SingleDenotation in this run, with wrap-around from last to first. 
* * There may be several `SingleDenotation`s with different validity @@ -694,7 +690,7 @@ object Denotations { if (validFor.firstPhaseId <= 1) this else { var current = nextInRun - while (current.validFor.code > this.myValidFor.code) current = current.nextInRun + while (current.validFor.code > this.validFor.code) current = current.nextInRun current } @@ -775,7 +771,7 @@ object Denotations { * are otherwise undefined. */ def skipRemoved(using Context): SingleDenotation = - if (myValidFor.code <= 0) nextDefined else this + if (validFor.code <= 0) nextDefined else this /** Produce a denotation that is valid for the given context. * Usually called when !(validFor contains ctx.period) @@ -792,15 +788,13 @@ object Denotations { def current(using Context): SingleDenotation = util.Stats.record("current") val currentPeriod = ctx.period - val valid = myValidFor + val valid = validFor def assertNotPackage(d: SingleDenotation, transformer: DenotTransformer) = d match case d: ClassDenotation => assert(!d.is(Package), s"illegal transformation of package denotation by transformer $transformer") case _ => - def escapeToNext = nextDefined.ensuring(_.validFor != Nowhere) - def toNewRun = util.Stats.record("current.bringForward") if exists then initial.bringForward().current else this @@ -836,9 +830,6 @@ object Denotations { // creations that way, and also avoid phase caches in contexts to get large. // To work correctly, we need to demand that the context with the new phase // is not retained in the result. - catch case ex: CyclicReference => - // println(s"error while transforming $this") - throw ex finally mutCtx.setPeriod(savedPeriod) if next eq cur then @@ -875,7 +866,7 @@ object Denotations { // can happen if we sit on a stale denotation which has been replaced // wholesale by an installAfter; in this case, proceed to the next // denotation and try again. - escapeToNext + nextDefined else if valid.runId != currentPeriod.runId then toNewRun else if currentPeriod.code > valid.code then @@ -962,7 +953,7 @@ object Denotations { case denot: SymDenotation => s"in ${denot.owner}" case _ => "" } - s"stale symbol; $this#${symbol.id} $ownerMsg, defined in ${myValidFor}, is referred to in run ${ctx.period}" + s"stale symbol; $this#${symbol.id} $ownerMsg, defined in ${validFor}, is referred to in run ${ctx.period}" } /** The period (interval of phases) for which there exists @@ -1148,9 +1139,9 @@ object Denotations { acc(false, symbol.info) } - abstract class NonSymSingleDenotation(symbol: Symbol, initInfo: Type, override val prefix: Type) extends SingleDenotation(symbol, initInfo) { + abstract class NonSymSingleDenotation(symbol: Symbol, initInfo: Type, override val prefix: Type) + extends SingleDenotation(symbol, initInfo, initInfo.isInstanceOf[TypeType]) { def infoOrCompleter: Type = initInfo - def isType: Boolean = infoOrCompleter.isInstanceOf[TypeType] } class UniqueRefDenotation( @@ -1246,10 +1237,10 @@ object Denotations { /** An overloaded denotation consisting of the alternatives of both given denotations. 
*/ - case class MultiDenotation(denot1: Denotation, denot2: Denotation) extends Denotation(NoSymbol, NoType) with MultiPreDenotation { + case class MultiDenotation(denot1: Denotation, denot2: Denotation) extends Denotation(NoSymbol, NoType, isType = false) with MultiPreDenotation { + validFor = denot1.validFor & denot2.validFor + final def infoOrCompleter: Type = multiHasNot("info") - final def validFor: Period = denot1.validFor & denot2.validFor - final def isType: Boolean = false final def hasUniqueSym: Boolean = false final def name(using Context): Name = denot1.name final def signature(using Context): Signature = Signature.OverloadedSignature diff --git a/compiler/src/dotty/tools/dotc/core/Flags.scala b/compiler/src/dotty/tools/dotc/core/Flags.scala index 8bf65ed8288f..f23dce020f10 100644 --- a/compiler/src/dotty/tools/dotc/core/Flags.scala +++ b/compiler/src/dotty/tools/dotc/core/Flags.scala @@ -350,14 +350,14 @@ object Flags { /** Symbol is a method which should be marked ACC_SYNCHRONIZED */ val (_, Synchronized @ _, _) = newFlags(36, "") - /** Symbol is a Java-style varargs method */ - val (_, JavaVarargs @ _, _) = newFlags(37, "") + /** Symbol is a Java-style varargs method / a Java annotation */ + val (_, JavaVarargs @ _, JavaAnnotation @ _) = newFlags(37, "", "") /** Symbol is a Java default method */ val (_, DefaultMethod @ _, _) = newFlags(38, "") /** Symbol is a transparent inline method or trait */ - val (Transparent @ _, _, _) = newFlags(39, "transparent") + val (Transparent @ _, _, TransparentType @ _) = newFlags(39, "transparent") /** Symbol is an enum class or enum case (if used with case) */ val (Enum @ _, EnumVal @ _, _) = newFlags(40, "enum") @@ -477,7 +477,7 @@ object Flags { */ val AfterLoadFlags: FlagSet = commonFlags( FromStartFlags, AccessFlags, Final, AccessorOrSealed, - Abstract, LazyOrTrait, SelfName, JavaDefined, Transparent) + Abstract, LazyOrTrait, SelfName, JavaDefined, JavaAnnotation, Transparent) /** A value that's unstable unless complemented with a Stable flag */ val UnstableValueFlags: FlagSet = Mutable | Method @@ -609,5 +609,4 @@ object Flags { val SyntheticParam: FlagSet = Synthetic | Param val SyntheticTermParam: FlagSet = Synthetic | TermParam val SyntheticTypeParam: FlagSet = Synthetic | TypeParam - val TransparentTrait: FlagSet = Trait | Transparent } diff --git a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala index 53fc58595472..bb65cce84042 100644 --- a/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/GadtConstraint.scala @@ -2,36 +2,160 @@ package dotty.tools package dotc package core -import Decorators._ -import Contexts._ -import Types._ -import Symbols._ +import Contexts.*, Decorators.*, Symbols.*, Types.* +import NameKinds.UniqueName +import config.Printers.{gadts, gadtsConstr} import util.{SimpleIdentitySet, SimpleIdentityMap} -import collection.mutable import printing._ +import scala.annotation.tailrec +import scala.annotation.internal.sharable +import scala.collection.mutable + object GadtConstraint: - def apply(): GadtConstraint = empty - def empty: GadtConstraint = - new ProperGadtConstraint(OrderingConstraint.empty, SimpleIdentityMap.empty, SimpleIdentityMap.empty, false) + @sharable val empty: GadtConstraint = + GadtConstraint(OrderingConstraint.empty, SimpleIdentityMap.empty, SimpleIdentityMap.empty, false) /** Represents GADT constraints currently in scope */ -sealed trait GadtConstraint ( - private var myConstraint: 
Constraint, - private var mapping: SimpleIdentityMap[Symbol, TypeVar], - private var reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], - private var wasConstrained: Boolean -) extends Showable { - this: ConstraintHandling => - - import dotty.tools.dotc.config.Printers.{gadts, gadtsConstr} - - /** Exposes ConstraintHandling.subsumes */ - def subsumes(left: GadtConstraint, right: GadtConstraint, pre: GadtConstraint)(using Context): Boolean = { - def extractConstraint(g: GadtConstraint) = g.constraint - subsumes(extractConstraint(left), extractConstraint(right), extractConstraint(pre)) +class GadtConstraint private ( + private val myConstraint: Constraint, + private val mapping: SimpleIdentityMap[Symbol, TypeVar], + private val reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], + private val wasConstrained: Boolean, +) extends Showable: + def constraint: Constraint = myConstraint + def symbols: List[Symbol] = mapping.keys + def withConstraint(c: Constraint) = copy(myConstraint = c) + def withWasConstrained = copy(wasConstrained = true) + + def add(sym: Symbol, tv: TypeVar): GadtConstraint = copy( + mapping = mapping.updated(sym, tv), + reverseMapping = reverseMapping.updated(tv.origin, sym), + ) + + /** Is `sym1` ordered to be less than `sym2`? */ + def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = + constraint.isLess(tvarOrError(sym1).origin, tvarOrError(sym2).origin) + + /** Full bounds of `sym`, including TypeRefs to other lower/upper symbols. + * + * @note this performs subtype checks between ordered symbols. + * Using this in isSubType can lead to infinite recursion. Consider `bounds` instead. + */ + def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = mapping(sym) match + case null => null + case tv: TypeVar => fullBounds(tv.origin) // .ensuring(containsNoInternalTypes(_)) + + /** Immediate bounds of `sym`. Does not contain lower/upper symbols (see [[fullBounds]]). */ + def bounds(sym: Symbol)(using Context): TypeBounds | Null = + mapping(sym) match + case null => null + case tv: TypeVar => + def retrieveBounds: TypeBounds = externalize(constraint.bounds(tv.origin)).bounds + retrieveBounds + //.showing(i"gadt bounds $sym: $result", gadts) + //.ensuring(containsNoInternalTypes(_)) + + /** Is the symbol registered in the constraint? + * + * @note this is true even if the symbol is constrained to be equal to another type, unlike [[Constraint.contains]]. + */ + def contains(sym: Symbol)(using Context): Boolean = mapping(sym) != null + + /** GADT constraint narrows bounds of at least one variable */ + def isNarrowing: Boolean = wasConstrained + + def fullBounds(param: TypeParamRef)(using Context): TypeBounds = + nonParamBounds(param).derivedTypeBounds(fullLowerBound(param), fullUpperBound(param)) + + def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = + externalize(constraint.nonParamBounds(param)).bounds + + def fullLowerBound(param: TypeParamRef)(using Context): Type = + val self = externalize(param) + constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { (acc, p) => + externalize(p) match + // drop any lower param that is a GADT symbol + // and is upper-bounded by a non-Any super-type of the original parameter + // e.g. in pos/i14287.min + // B$1 had info <: X and fullBounds >: B$2 <: X, and + // B$2 had info <: B$1 and fullBounds <: B$1 + // We can use the info of B$2 to drop the lower-bound of B$1 + // and return non-bidirectional bounds B$1 <: X and B$2 <: B$1. 
+ case tp: TypeRef if tp.symbol.isPatternBound && self =:= tp.info.hiBound => acc + case tp => acc | tp + } + + def fullUpperBound(param: TypeParamRef)(using Context): Type = + val self = externalize(param) + constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (acc, u) => + externalize(u) match + case tp: TypeRef if tp.symbol.isPatternBound && self =:= tp.info.loBound => acc // like fullLowerBound + case tp => + // Any as the upper bound means "no bound", but if F is higher-kinded, + // Any & F = F[_]; this is wrong for us so we need to short-circuit + if acc.isAny then tp else acc & tp + } + + def externalize(tp: Type, theMap: TypeMap | Null = null)(using Context): Type = tp match + case param: TypeParamRef => reverseMapping(param) match + case sym: Symbol => sym.typeRef + case null => param + case tp: TypeAlias => tp.derivedAlias(externalize(tp.alias, theMap)) + case tp => (if theMap == null then ExternalizeMap() else theMap).mapOver(tp) + + private class ExternalizeMap(using Context) extends TypeMap: + def apply(tp: Type): Type = externalize(tp, this)(using mapCtx) + + def tvarOrError(sym: Symbol)(using Context): TypeVar = + mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN + + @tailrec final def stripInternalTypeVar(tp: Type): Type = tp match + case tv: TypeVar => + val inst = constraint.instType(tv) + if inst.exists then stripInternalTypeVar(inst) else tv + case _ => tp + + def internalize(tp: Type)(using Context): Type = tp match + case nt: NamedType => + val ntTvar = mapping(nt.symbol) + if ntTvar == null then tp + else ntTvar + case _ => tp + + private def containsNoInternalTypes(tp: Type, theAcc: TypeAccumulator[Boolean] | Null = null)(using Context): Boolean = tp match { + case tpr: TypeParamRef => !reverseMapping.contains(tpr) + case tv: TypeVar => !reverseMapping.contains(tv.origin) + case tp => + (if (theAcc != null) theAcc else new ContainsNoInternalTypesAccumulator()).foldOver(true, tp) } + private class ContainsNoInternalTypesAccumulator(using Context) extends TypeAccumulator[Boolean] { + override def apply(x: Boolean, tp: Type): Boolean = x && containsNoInternalTypes(tp, this) + } + + override def toText(printer: Printer): Texts.Text = printer.toText(this) + + /** Provides more information than toText, by showing the underlying Constraint details. */ + def debugBoundsDescription(using Context): String = i"$this\n$constraint" + + private def copy( + myConstraint: Constraint = myConstraint, + mapping: SimpleIdentityMap[Symbol, TypeVar] = mapping, + reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol] = reverseMapping, + wasConstrained: Boolean = wasConstrained, + ): GadtConstraint = GadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) +end GadtConstraint + +object GadtState: + def apply(gadt: GadtConstraint): GadtState = ProperGadtState(gadt) + +sealed trait GadtState { + this: ConstraintHandling => // Hide ConstraintHandling within GadtConstraintHandling + + def gadt: GadtConstraint + def gadt_=(g: GadtConstraint): Unit + override protected def legalBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Type = // GADT constraints never involve wildcards and are not propagated outside // the case where they're valid, so no approximating is needed. @@ -52,22 +176,19 @@ sealed trait GadtConstraint ( // and used as orderings. 
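The refactoring above makes GadtConstraint an immutable value: updates such as withConstraint, withWasConstrained and add go through a private copy, so earlier snapshots remain valid and the mutable GadtState only swaps which snapshot it points at. A stripped-down standalone sketch of that update-by-copy shape; MiniGadt and its String payloads are placeholders rather than the compiler's types.

    // Immutable snapshot: every "update" returns a new value and leaves the old one intact.
    final case class MiniGadt(
        bounds: Map[String, String] = Map.empty,
        wasConstrained: Boolean = false):
      def withBound(sym: String, bound: String): MiniGadt =
        copy(bounds = bounds.updated(sym, bound))
      def withWasConstrained: MiniGadt = copy(wasConstrained = true)

    @main def miniGadtDemo(): Unit =
      val g0 = MiniGadt()
      val g1 = g0.withBound("A", "<: Int").withWasConstrained
      println(g0) // MiniGadt(Map(),false)          -- original snapshot unchanged
      println(g1) // MiniGadt(Map(A -> <: Int),true)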
def substDependentSyms(tp: Type, isUpper: Boolean)(using Context): Type = { def loop(tp: Type) = substDependentSyms(tp, isUpper) - tp match { + tp match case tp @ AndType(tp1, tp2) if !isUpper => tp.derivedAndType(loop(tp1), loop(tp2)) case tp @ OrType(tp1, tp2) if isUpper => tp.derivedOrType(loop(tp1), loop(tp2)) case tp: NamedType => - params.indexOf(tp.symbol) match { + params.indexOf(tp.symbol) match case -1 => - mapping(tp.symbol) match { + gadt.internalize(tp) match case tv: TypeVar => tv.origin - case null => tp - } + case _ => tp case i => pt.paramRefs(i) - } case tp => tp - } } val tb = param.info.bounds @@ -81,186 +202,87 @@ sealed trait GadtConstraint ( val tvars = params.lazyZip(poly1.paramRefs).map { (sym, paramRef) => val tv = TypeVar(paramRef, creatorState = null) - mapping = mapping.updated(sym, tv) - reverseMapping = reverseMapping.updated(tv.origin, sym) + gadt = gadt.add(sym, tv) tv } // The replaced symbols are picked up here. addToConstraint(poly1, tvars) - .showing(i"added to constraint: [$poly1] $params%, % gadt = $this", gadts) + .showing(i"added to constraint: [$poly1] $params%, % gadt = $gadt", gadts) } /** Further constrain a symbol already present in the constraint. */ def addBound(sym: Symbol, bound: Type, isUpper: Boolean)(using Context): Boolean = { - @annotation.tailrec def stripInternalTypeVar(tp: Type): Type = tp match { - case tv: TypeVar => - val inst = constraint.instType(tv) - if (inst.exists) stripInternalTypeVar(inst) else tv - case _ => tp - } - - val symTvar: TypeVar = stripInternalTypeVar(tvarOrError(sym)) match { + val symTvar: TypeVar = gadt.stripInternalTypeVar(gadt.tvarOrError(sym)) match case tv: TypeVar => tv case inst => gadts.println(i"instantiated: $sym -> $inst") - return if (isUpper) isSub(inst, bound) else isSub(bound, inst) - } + return if isUpper then isSub(inst, bound) else isSub(bound, inst) - val internalizedBound = bound match { - case nt: NamedType => - val ntTvar = mapping(nt.symbol) - if (ntTvar != null) stripInternalTypeVar(ntTvar) else bound - case _ => bound - } + val internalizedBound = gadt.stripInternalTypeVar(gadt.internalize(bound)) val saved = constraint val result = internalizedBound match case boundTvar: TypeVar => - if (boundTvar eq symTvar) true - else if (isUpper) addLess(symTvar.origin, boundTvar.origin) + if boundTvar eq symTvar then true + else if isUpper + then addLess(symTvar.origin, boundTvar.origin) else addLess(boundTvar.origin, symTvar.origin) case bound => addBoundTransitively(symTvar.origin, bound, isUpper) gadts.println { - val descr = if (isUpper) "upper" else "lower" - val op = if (isUpper) "<:" else ">:" + val descr = if isUpper then "upper" else "lower" + val op = if isUpper then "<:" else ">:" i"adding $descr bound $sym $op $bound = $result" } - if constraint ne saved then wasConstrained = true + if constraint ne saved then gadt = gadt.withWasConstrained result } - /** Is `sym1` ordered to be less than `sym2`? */ - def isLess(sym1: Symbol, sym2: Symbol)(using Context): Boolean = - constraint.isLess(tvarOrError(sym1).origin, tvarOrError(sym2).origin) - - /** Full bounds of `sym`, including TypeRefs to other lower/upper symbols. - * - * @note this performs subtype checks between ordered symbols. - * Using this in isSubType can lead to infinite recursion. Consider `bounds` instead. 
- */ - def fullBounds(sym: Symbol)(using Context): TypeBounds | Null = - mapping(sym) match { - case null => null - // TODO: Improve flow typing so that ascription becomes redundant - case tv: TypeVar => - fullBounds(tv.origin) - // .ensuring(containsNoInternalTypes(_)) - } - - /** Immediate bounds of `sym`. Does not contain lower/upper symbols (see [[fullBounds]]). */ - def bounds(sym: Symbol)(using Context): TypeBounds | Null = - mapping(sym) match { - case null => null - // TODO: Improve flow typing so that ascription becomes redundant - case tv: TypeVar => - def retrieveBounds: TypeBounds = externalize(bounds(tv.origin)).bounds - retrieveBounds - //.showing(i"gadt bounds $sym: $result", gadts) - //.ensuring(containsNoInternalTypes(_)) - } - - /** Is the symbol registered in the constraint? - * - * @note this is true even if the symbol is constrained to be equal to another type, unlike [[Constraint.contains]]. - */ - def contains(sym: Symbol)(using Context): Boolean = mapping(sym) != null - - /** GADT constraint narrows bounds of at least one variable */ - def isNarrowing: Boolean = wasConstrained - /** See [[ConstraintHandling.approximation]] */ def approximation(sym: Symbol, fromBelow: Boolean, maxLevel: Int = Int.MaxValue)(using Context): Type = { - val res = - approximation(tvarOrError(sym).origin, fromBelow, maxLevel) match - case tpr: TypeParamRef => - // Here we do externalization when the returned type is a TypeParamRef, - // b/c ConstraintHandling.approximation may return internal types when - // the type variable is instantiated. See #15531. - externalize(tpr) - case tp => tp - - gadts.println(i"approximating $sym ~> $res") - res + approximation(gadt.tvarOrError(sym).origin, fromBelow, maxLevel).match + case tpr: TypeParamRef => + // Here we do externalization when the returned type is a TypeParamRef, + // b/c ConstraintHandling.approximation may return internal types when + // the type variable is instantiated. See #15531. + gadt.externalize(tpr) + case tp => tp + .showing(i"approximating $sym ~> $result", gadts) } - def symbols: List[Symbol] = mapping.keys + def fresh: GadtState = GadtState(gadt) - def fresh: GadtConstraint = new ProperGadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) + /** Restore the GadtConstraint state. 
*/ + def restore(gadt: GadtConstraint): Unit = this.gadt = gadt - /** Restore the state from other [[GadtConstraint]], probably copied using [[fresh]] */ - def restore(other: GadtConstraint): Unit = - this.myConstraint = other.myConstraint - this.mapping = other.mapping - this.reverseMapping = other.reverseMapping - this.wasConstrained = other.wasConstrained + inline def rollbackGadtUnless(inline op: Boolean): Boolean = + val saved = gadt + var result = false + try result = op + finally if !result then restore(saved) + result // ---- Protected/internal ----------------------------------------------- - override protected def constraint = myConstraint - override protected def constraint_=(c: Constraint) = myConstraint = c + override protected def constraint = gadt.constraint + override protected def constraint_=(c: Constraint) = gadt = gadt.withConstraint(c) override protected def isSub(tp1: Type, tp2: Type)(using Context): Boolean = TypeComparer.isSubType(tp1, tp2) override protected def isSame(tp1: Type, tp2: Type)(using Context): Boolean = TypeComparer.isSameType(tp1, tp2) - override def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = - externalize(constraint.nonParamBounds(param)).bounds - - override def fullLowerBound(param: TypeParamRef)(using Context): Type = - constraint.minLower(param).foldLeft(nonParamBounds(param).lo) { - (t, u) => t | externalize(u) - } - - override def fullUpperBound(param: TypeParamRef)(using Context): Type = - constraint.minUpper(param).foldLeft(nonParamBounds(param).hi) { (t, u) => - val eu = externalize(u) - // Any as the upper bound means "no bound", but if F is higher-kinded, - // Any & F = F[_]; this is wrong for us so we need to short-circuit - if t.isAny then eu else t & eu - } - - // ---- Private ---------------------------------------------------------- - - private def externalize(tp: Type, theMap: TypeMap | Null = null)(using Context): Type = tp match - case param: TypeParamRef => reverseMapping(param) match - case sym: Symbol => sym.typeRef - case null => param - case tp: TypeAlias => tp.derivedAlias(externalize(tp.alias, theMap)) - case tp => (if theMap == null then ExternalizeMap() else theMap).mapOver(tp) - - private class ExternalizeMap(using Context) extends TypeMap: - def apply(tp: Type): Type = externalize(tp, this)(using mapCtx) - - private def tvarOrError(sym: Symbol)(using Context): TypeVar = - mapping(sym).ensuring(_ != null, i"not a constrainable symbol: $sym").uncheckedNN - - private def containsNoInternalTypes(tp: Type, theAcc: TypeAccumulator[Boolean] | Null = null)(using Context): Boolean = tp match { - case tpr: TypeParamRef => !reverseMapping.contains(tpr) - case tv: TypeVar => !reverseMapping.contains(tv.origin) - case tp => - (if (theAcc != null) theAcc else new ContainsNoInternalTypesAccumulator()).foldOver(true, tp) - } - - private class ContainsNoInternalTypesAccumulator(using Context) extends TypeAccumulator[Boolean] { - override def apply(x: Boolean, tp: Type): Boolean = x && containsNoInternalTypes(tp, this) - } + override def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = gadt.nonParamBounds(param) + override def fullLowerBound(param: TypeParamRef)(using Context): Type = gadt.fullLowerBound(param) + override def fullUpperBound(param: TypeParamRef)(using Context): Type = gadt.fullUpperBound(param) // ---- Debug ------------------------------------------------------------ override def constr = gadtsConstr - - override def toText(printer: Printer): Texts.Text = printer.toText(this) - - /** 
Provides more information than toText, by showing the underlying Constraint details. */ - def debugBoundsDescription(using Context): String = i"$this\n$constraint" } -private class ProperGadtConstraint ( - myConstraint: Constraint, - mapping: SimpleIdentityMap[Symbol, TypeVar], - reverseMapping: SimpleIdentityMap[TypeParamRef, Symbol], - wasConstrained: Boolean, -) extends ConstraintHandling with GadtConstraint(myConstraint, mapping, reverseMapping, wasConstrained) +// Hide ConstraintHandling within GadtState +private class ProperGadtState(private var myGadt: GadtConstraint) extends ConstraintHandling with GadtState: + def gadt: GadtConstraint = myGadt + def gadt_=(gadt: GadtConstraint): Unit = myGadt = gadt diff --git a/compiler/src/dotty/tools/dotc/core/Mode.scala b/compiler/src/dotty/tools/dotc/core/Mode.scala index 33ac3de70767..40a45b9f4678 100644 --- a/compiler/src/dotty/tools/dotc/core/Mode.scala +++ b/compiler/src/dotty/tools/dotc/core/Mode.scala @@ -70,10 +70,14 @@ object Mode { /** We are currently unpickling Scala2 info */ val Scala2Unpickling: Mode = newMode(13, "Scala2Unpickling") - /** We are currently checking bounds to be non-empty, so we should not - * do any widening when computing members of refined types. + /** Signifies one of two possible situations: + * 1. We are currently checking bounds to be non-empty, so we should not + * do any widening when computing members of refined types. + * 2. We are currently checking self type conformance, so we should not + * ignore capture sets added to otherwise pure classes (only needed + * for capture checking). */ - val CheckBounds: Mode = newMode(14, "CheckBounds") + val CheckBoundsOrSelfType: Mode = newMode(14, "CheckBoundsOrSelfType") /** Use Scala2 scheme for overloading and implicit resolution */ val OldOverloadingResolution: Mode = newMode(15, "OldOverloadingResolution") diff --git a/compiler/src/dotty/tools/dotc/core/NameKinds.scala b/compiler/src/dotty/tools/dotc/core/NameKinds.scala index f71c16e82b70..2c968ab9446c 100644 --- a/compiler/src/dotty/tools/dotc/core/NameKinds.scala +++ b/compiler/src/dotty/tools/dotc/core/NameKinds.scala @@ -300,6 +300,7 @@ object NameKinds { val UniqueInlineName: UniqueNameKind = new UniqueNameKind("$i") val InlineScrutineeName: UniqueNameKind = new UniqueNameKind("$scrutinee") val InlineBinderName: UniqueNameKind = new UniqueNameKind("$proxy") + val MacroNames: UniqueNameKind = new UniqueNameKind("$macro$") /** A kind of unique extension methods; Unlike other unique names, these can be * unmangled. 
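rollbackGadtUnless in the GadtState trait above keeps the constraint updates made by op only when op returns true; on a false result or an exception it restores the previously saved immutable snapshot. A self-contained sketch of that save/try/restore pattern over a plain mutable cell; Box and the Map payload are illustrative, not compiler types.

    final class Box[T](var value: T):
      // Run op; if it fails (returns false or throws), put the saved value back.
      def rollbackUnless(op: => Boolean): Boolean =
        val saved = value
        var ok = false
        try ok = op
        finally if !ok then value = saved
        ok

    @main def rollbackDemo(): Unit =
      val b = Box(Map.empty[String, Int])
      b.rollbackUnless { b.value = b.value.updated("x", 1); false }
      println(b.value) // Map()        -- update rolled back
      b.rollbackUnless { b.value = b.value.updated("x", 1); true }
      println(b.value) // Map(x -> 1)  -- update kept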
@@ -324,6 +325,8 @@ object NameKinds { val LocalOptInlineLocalObj: UniqueNameKind = new UniqueNameKind("ilo") + val BoundaryName: UniqueNameKind = new UniqueNameKind("boundary") + /** The kind of names of default argument getters */ val DefaultGetterName: NumberedNameKind = new NumberedNameKind(DEFAULTGETTER, "DefaultGetter") { def mkString(underlying: TermName, info: ThisInfo) = { diff --git a/compiler/src/dotty/tools/dotc/core/NameOps.scala b/compiler/src/dotty/tools/dotc/core/NameOps.scala index 7c1073852681..4e075953d7fa 100644 --- a/compiler/src/dotty/tools/dotc/core/NameOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NameOps.scala @@ -86,11 +86,17 @@ object NameOps { def isVarPattern: Boolean = testSimple { n => n.length > 0 && { + def isLowerLetterSupplementary: Boolean = + import Character.{isHighSurrogate, isLowSurrogate, isLetter, isLowerCase, isValidCodePoint, toCodePoint} + isHighSurrogate(n(0)) && n.length > 1 && isLowSurrogate(n(1)) && { + val codepoint = toCodePoint(n(0), n(1)) + isValidCodePoint(codepoint) && isLetter(codepoint) && isLowerCase(codepoint) + } val first = n.head - (((first.isLower && first.isLetter) || first == '_') - && (n != false_) - && (n != true_) - && (n != null_)) + ((first.isLower && first.isLetter || first == '_' || isLowerLetterSupplementary) + && n != false_ + && n != true_ + && n != null_) } } || name.is(PatMatGivenVarName) @@ -98,7 +104,7 @@ object NameOps { case raw.NE | raw.LE | raw.GE | EMPTY => false case name: SimpleName => - name.length > 0 && name.last == '=' && name.head != '=' && isOperatorPart(name.head) + name.length > 0 && name.last == '=' && name.head != '=' && isOperatorPart(name.firstCodePoint) case _ => false } @@ -352,6 +358,14 @@ object NameOps { val unmangled = kinds.foldLeft(name)(_.unmangle(_)) if (unmangled eq name) name else unmangled.unmangle(kinds) } + + def firstCodePoint: Int = + val first = name.firstPart + import Character.{isHighSurrogate, isLowSurrogate, isValidCodePoint, toCodePoint} + if isHighSurrogate(first(0)) && first.length > 1 && isLowSurrogate(first(1)) then + val codepoint = toCodePoint(first(0), first(1)) + if isValidCodePoint(codepoint) then codepoint else first(0) + else first(0) } extension (name: TermName) { diff --git a/compiler/src/dotty/tools/dotc/core/NamerOps.scala b/compiler/src/dotty/tools/dotc/core/NamerOps.scala index fa0a89349b5e..db6f72590818 100644 --- a/compiler/src/dotty/tools/dotc/core/NamerOps.scala +++ b/compiler/src/dotty/tools/dotc/core/NamerOps.scala @@ -67,11 +67,11 @@ object NamerOps: completer.withSourceModule(findModuleBuddy(name.sourceModuleName, scope)) /** Find moduleClass/sourceModule in effective scope */ - def findModuleBuddy(name: Name, scope: Scope)(using Context) = { - val it = scope.lookupAll(name).filter(_.is(Module)) - if (it.hasNext) it.next() - else NoSymbol.assertingErrorsReported(s"no companion $name in $scope") - } + def findModuleBuddy(name: Name, scope: Scope, alternate: Name = EmptyTermName)(using Context): Symbol = + var it = scope.lookupAll(name).filter(_.is(Module)) + if !alternate.isEmpty then it ++= scope.lookupAll(alternate).filter(_.is(Module)) + if it.hasNext then it.next() + else NoSymbol.assertingErrorsReported(em"no companion $name in $scope") /** If a class has one of these flags, it does not get a constructor companion */ private val NoConstructorProxyNeededFlags = Abstract | Trait | Case | Synthetic | Module | Invisible @@ -212,11 +212,11 @@ object NamerOps: * by (ab?)-using GADT constraints. See pos/i941.scala. 
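The isVarPattern and firstCodePoint changes above let identifiers that start with a supplementary lower-case letter (encoded as a surrogate pair) be treated like any other lower-case name. A small standalone check built from the same java.lang.Character calls; startsWithLowerLetter and the Deseret sample are illustrative only.

    import java.lang.Character.{isHighSurrogate, isLowSurrogate, isLetter, isLowerCase, isValidCodePoint, toCodePoint}

    // Does the string start with a lower-case letter, counting a surrogate pair as one character?
    def startsWithLowerLetter(s: String): Boolean =
      s.nonEmpty && {
        val first = s.charAt(0)
        if isHighSurrogate(first) && s.length > 1 && isLowSurrogate(s.charAt(1)) then
          val cp = toCodePoint(first, s.charAt(1))
          isValidCodePoint(cp) && isLetter(cp) && isLowerCase(cp)
        else
          isLetter(first) && isLowerCase(first)
      }

    @main def codePointDemo(): Unit =
      println(startsWithLowerLetter("abc"))           // true
      println(startsWithLowerLetter("Abc"))           // false
      println(startsWithLowerLetter("\uD801\uDC28x")) // true: U+10428 is a lower-case Deseret letter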
*/ def linkConstructorParams(sym: Symbol, tparams: List[Symbol], rhsCtx: Context)(using Context): Unit = - rhsCtx.gadt.addToConstraint(tparams) + rhsCtx.gadtState.addToConstraint(tparams) tparams.lazyZip(sym.owner.typeParams).foreach { (psym, tparam) => val tr = tparam.typeRef - rhsCtx.gadt.addBound(psym, tr, isUpper = false) - rhsCtx.gadt.addBound(psym, tr, isUpper = true) + rhsCtx.gadtState.addBound(psym, tr, isUpper = false) + rhsCtx.gadtState.addBound(psym, tr, isUpper = true) } end NamerOps diff --git a/compiler/src/dotty/tools/dotc/core/Names.scala b/compiler/src/dotty/tools/dotc/core/Names.scala index f13c3a184bf9..1e08379b57f0 100644 --- a/compiler/src/dotty/tools/dotc/core/Names.scala +++ b/compiler/src/dotty/tools/dotc/core/Names.scala @@ -15,8 +15,8 @@ import scala.annotation.internal.sharable object Names { import NameKinds._ - /** Things that can be turned into names with `totermName` and `toTypeName` - * Decorators defines implements these as extension methods for strings. + /** Things that can be turned into names with `toTermName` and `toTypeName`. + * Decorators implements these as extension methods for strings. */ type PreName = Name | String @@ -25,7 +25,7 @@ object Names { */ abstract class Designator - /** A name if either a term name or a type name. Term names can be simple + /** A name is either a term name or a type name. Term names can be simple * or derived. A simple term name is essentially an interned string stored * in a name table. A derived term name adds a tag, and possibly a number * or a further simple name to some other name. diff --git a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala index 1341fac7d735..faea30390d2b 100644 --- a/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala +++ b/compiler/src/dotty/tools/dotc/core/OrderingConstraint.scala @@ -16,27 +16,34 @@ import cc.{CapturingType, derivedCapturingType} object OrderingConstraint { - type ArrayValuedMap[T] = SimpleIdentityMap[TypeLambda, Array[T]] + /** If true, use reverse dependencies in `replace` to avoid checking the bounds + * of all parameters in the constraint. This can speed things up, but there are some + * rare corner cases where reverse dependencies miss a parameter. Specifically, + * if a constraint contains a free reference to TypeParam P and afterwards the + * same P is added as a bound variable to the constraint, a backwards link would + * then become necessary at this point but is missing. This causes two CB projects + * to fail when reverse dependencies are checked (parboiled2 and perspective). + * In these rare cases `replace` could behave differently when optimized. However, + * no deviation was found in the two projects. It is not clear what the "right" + * behavior of `replace` should be in these cases. Normally, PolyTypes added + * to constraints are supposed to be fresh, so that would mean that the behavior + * with optimizeReplace = true would be correct. But the previous behavior without + * reverse dependency checking corresponds to `optimizeReplace = false`. This behavior + * makes sense if we assume that the added polytype was simply added too late, so we + * want to establish the link between newly bound variable and pre-existing reference. 
+ */ + private final val optimizeReplace = true + + private type ArrayValuedMap[T] = SimpleIdentityMap[TypeLambda, Array[T]] /** The type of `OrderingConstraint#boundsMap` */ - type ParamBounds = ArrayValuedMap[Type] + private type ParamBounds = ArrayValuedMap[Type] /** The type of `OrderingConstraint#lowerMap`, `OrderingConstraint#upperMap` */ - type ParamOrdering = ArrayValuedMap[List[TypeParamRef]] - - /** A new constraint with given maps and given set of hard typevars */ - private def newConstraint( - boundsMap: ParamBounds, lowerMap: ParamOrdering, upperMap: ParamOrdering, - hardVars: TypeVars)(using Context) : OrderingConstraint = - if boundsMap.isEmpty && lowerMap.isEmpty && upperMap.isEmpty then - empty - else - val result = new OrderingConstraint(boundsMap, lowerMap, upperMap, hardVars) - if ctx.run != null then ctx.run.nn.recordConstraintSize(result, result.boundsMap.size) - result + private type ParamOrdering = ArrayValuedMap[List[TypeParamRef]] /** A lens for updating a single entry array in one of the three constraint maps */ - abstract class ConstraintLens[T <: AnyRef: ClassTag] { + private abstract class ConstraintLens[T <: AnyRef: ClassTag] { def entries(c: OrderingConstraint, poly: TypeLambda): Array[T] | Null def updateEntries(c: OrderingConstraint, poly: TypeLambda, entries: Array[T])(using Context): OrderingConstraint def initial: T @@ -47,7 +54,7 @@ object OrderingConstraint { } /** The `current` constraint but with the entry for `param` updated to `entry`. - * `current` is used linearly. If it is different from `prev` it is + * `current` is used linearly. If it is different from `prev` then `current` is * known to be dead after the call. Hence it is OK to update destructively * parts of `current` which are not shared by `prev`. 
*/ @@ -89,27 +96,27 @@ object OrderingConstraint { map(prev, current, param.binder, param.paramNum, f) } - val boundsLens: ConstraintLens[Type] = new ConstraintLens[Type] { + private val boundsLens: ConstraintLens[Type] = new ConstraintLens[Type] { def entries(c: OrderingConstraint, poly: TypeLambda): Array[Type] | Null = c.boundsMap(poly) def updateEntries(c: OrderingConstraint, poly: TypeLambda, entries: Array[Type])(using Context): OrderingConstraint = - newConstraint(c.boundsMap.updated(poly, entries), c.lowerMap, c.upperMap, c.hardVars) + c.newConstraint(boundsMap = c.boundsMap.updated(poly, entries)) def initial = NoType } - val lowerLens: ConstraintLens[List[TypeParamRef]] = new ConstraintLens[List[TypeParamRef]] { + private val lowerLens: ConstraintLens[List[TypeParamRef]] = new ConstraintLens[List[TypeParamRef]] { def entries(c: OrderingConstraint, poly: TypeLambda): Array[List[TypeParamRef]] | Null = c.lowerMap(poly) def updateEntries(c: OrderingConstraint, poly: TypeLambda, entries: Array[List[TypeParamRef]])(using Context): OrderingConstraint = - newConstraint(c.boundsMap, c.lowerMap.updated(poly, entries), c.upperMap, c.hardVars) + c.newConstraint(lowerMap = c.lowerMap.updated(poly, entries)) def initial = Nil } - val upperLens: ConstraintLens[List[TypeParamRef]] = new ConstraintLens[List[TypeParamRef]] { + private val upperLens: ConstraintLens[List[TypeParamRef]] = new ConstraintLens[List[TypeParamRef]] { def entries(c: OrderingConstraint, poly: TypeLambda): Array[List[TypeParamRef]] | Null = c.upperMap(poly) def updateEntries(c: OrderingConstraint, poly: TypeLambda, entries: Array[List[TypeParamRef]])(using Context): OrderingConstraint = - newConstraint(c.boundsMap, c.lowerMap, c.upperMap.updated(poly, entries), c.hardVars) + c.newConstraint(upperMap = c.upperMap.updated(poly, entries)) def initial = Nil } @@ -143,11 +150,27 @@ class OrderingConstraint(private val boundsMap: ParamBounds, private val lowerMap : ParamOrdering, private val upperMap : ParamOrdering, private val hardVars : TypeVars) extends Constraint { + thisConstraint => import UnificationDirection.* type This = OrderingConstraint + /** A new constraint with given maps and given set of hard typevars */ + private def newConstraint( + boundsMap: ParamBounds = this.boundsMap, + lowerMap: ParamOrdering = this.lowerMap, + upperMap: ParamOrdering = this.upperMap, + hardVars: TypeVars = this.hardVars)(using Context) : OrderingConstraint = + if boundsMap.isEmpty && lowerMap.isEmpty && upperMap.isEmpty then + empty + else + val result = new OrderingConstraint(boundsMap, lowerMap, upperMap, hardVars) + if ctx.run != null then ctx.run.nn.recordConstraintSize(result, result.boundsMap.size) + result.coDeps = this.coDeps + result.contraDeps = this.contraDeps + result + // ----------- Basic indices -------------------------------------------------- /** The number of type parameters in the given entry array */ @@ -201,6 +224,17 @@ class OrderingConstraint(private val boundsMap: ParamBounds, def exclusiveUpper(param: TypeParamRef, butNot: TypeParamRef): List[TypeParamRef] = upper(param).filterNot(isLess(butNot, _)) + def bounds(param: TypeParamRef)(using Context): TypeBounds = { + val e = entry(param) + if (e.exists) e.bounds + else { + // TODO: should we change the type of paramInfos to nullable? 
+      val pinfos: List[param.binder.PInfo] | Null = param.binder.paramInfos
+      if (pinfos != null) pinfos(param.paramNum) // pinfos == null happens in pos/i536.scala
+      else TypeBounds.empty
+    }
+  }
+
 // ---------- Info related to TypeParamRefs -------------------------------------------

   def isLess(param1: TypeParamRef, param2: TypeParamRef): Boolean =
@@ -217,6 +251,197 @@ class OrderingConstraint(private val boundsMap: ParamBounds,
     if tvar == null then NoType
     else tvar

+// ------------- Type parameter dependencies ----------------------------------------
+
+  private type ReverseDeps = SimpleIdentityMap[TypeParamRef, SimpleIdentitySet[TypeParamRef]]
+
+  /** A map that associates type parameters of this constraint with all other type
+   * parameters that refer to them in their bounds covariantly, such that, if the
+   * type parameter is instantiated to a larger type, the constraint would be narrowed
+   * (i.e. solution set changes other than simply being made larger).
+   */
+  private var coDeps: ReverseDeps = SimpleIdentityMap.empty
+
+  /** A map that associates type parameters of this constraint with all other type
+   * parameters that refer to them in their bounds contravariantly, such that, if the
+   * type parameter is instantiated to a smaller type, the constraint would be narrowed
+   * (i.e. solution set changes other than simply being made larger).
+   */
+  private var contraDeps: ReverseDeps = SimpleIdentityMap.empty
+
+  /** Null-safe indexing */
+  extension (deps: ReverseDeps) def at(param: TypeParamRef): SimpleIdentitySet[TypeParamRef] =
+    val result = deps(param)
+    if null == result // swapped operand order important since `==` is overloaded in `SimpleIdentitySet`
+    then SimpleIdentitySet.empty
+    else result
+
+  override def dependsOn(tv: TypeVar, except: TypeVars, co: Boolean)(using Context): Boolean =
+    def origin(tv: TypeVar) =
+      assert(!instType(tv).exists)
+      tv.origin
+    val param = origin(tv)
+    val excluded = except.map(origin)
+    val qualifies: TypeParamRef => Boolean = !excluded.contains(_)
+    def test(deps: ReverseDeps, lens: ConstraintLens[List[TypeParamRef]]) =
+      deps.at(param).exists(qualifies)
+      || lens(this, tv.origin.binder, tv.origin.paramNum).exists(qualifies)
+    if co then test(coDeps, upperLens) else test(contraDeps, lowerLens)
+
+  /** Modify traversals in two respects:
+   *  - when encountering an application C[Ts], where C is a type variable or parameter
+   *    that has an instantiation in this constraint, assume the type parameters of
+   *    the instantiation instead of the type parameters of C when traversing the
+   *    arguments Ts. That can make a difference for the variance in which an argument
+   *    is traversed. Example constraint:
+   *
+   *      constrained types: C[X], A
+   *      A >: C[B]
+   *      C := Option
+   *
+   *    Here, B is traversed with variance +1 instead of 0. Test case: pos/t3152.scala
+   *
+   *  - When typing a prefix, don't avoid negative variances. This matters only for the
+   *    corner case where a parameter is instantiated to Nothing (see comment in
+   *    TypeAccumulator#applyToPrefix). When determining instantiation directions in
+   *    interpolations (which is what dependency variances are for), it can be ignored.
+   */
+  private trait ConstraintAwareTraversal[T] extends TypeAccumulator[T]:
+
+    /** Does `param` have bounds in the current constraint?
+     */
+    protected def hasBounds(param: TypeParamRef): Boolean = entry(param).isInstanceOf[TypeBounds]
+
+    override def tyconTypeParams(tp: AppliedType)(using Context): List[ParamInfo] =
+      def tparams(tycon: Type): List[ParamInfo] = tycon match
+        case tycon: TypeVar if !tycon.inst.exists => tparams(tycon.origin)
+        case tycon: TypeParamRef if !hasBounds(tycon) =>
+          val entryParams = entry(tycon).typeParams
+          if entryParams.nonEmpty then entryParams
+          else tp.tyconTypeParams
+        case _ => tp.tyconTypeParams
+      tparams(tp.tycon)
+
+    override def applyToPrefix(x: T, tp: NamedType): T =
+      this(x, tp.prefix)
+  end ConstraintAwareTraversal
+
+  /** A type traverser that adjusts dependencies originating from a given type
+   * @param ignoreBinding if not null, a parameter that is assumed to be still uninstantiated.
+   *                      This is necessary to handle replacements.
+   */
+  private class Adjuster(srcParam: TypeParamRef, ignoreBinding: TypeParamRef | Null)(using Context)
+  extends TypeTraverser, ConstraintAwareTraversal[Unit]:
+
+    var add: Boolean = compiletime.uninitialized
+    val seen = util.HashSet[LazyRef]()
+
+    override protected def hasBounds(param: TypeParamRef) =
+      (param eq ignoreBinding) || super.hasBounds(param)
+
+    def update(deps: ReverseDeps, referenced: TypeParamRef): ReverseDeps =
+      val prev = deps.at(referenced)
+      val newSet = if add then prev + srcParam else prev - srcParam
+      if newSet.isEmpty then deps.remove(referenced)
+      else deps.updated(referenced, newSet)
+
+    def traverse(t: Type) = t match
+      case param: TypeParamRef =>
+        if hasBounds(param) then
+          if variance >= 0 then coDeps = update(coDeps, param)
+          if variance <= 0 then contraDeps = update(contraDeps, param)
+        else
+          traverse(entry(param))
+      case tp: LazyRef =>
+        if !seen.contains(tp) then
+          seen += tp
+          traverse(tp.ref)
+      case _ => traverseChildren(t)
+  end Adjuster
+
+  /** Adjust dependencies to account for the delta of previous entry `prevEntry`
+   * and the new bound `entry` for the type parameter `srcParam`.
+   */
+  def adjustDeps(entry: Type | Null, prevEntry: Type | Null, srcParam: TypeParamRef, ignoreBinding: TypeParamRef | Null = null)(using Context): this.type =
+    val adjuster = new Adjuster(srcParam, ignoreBinding)
+
+    /** Adjust reverse dependencies of all type parameters referenced by `bound`
+     * @param isLower `bound` is a lower bound
+     * @param add     if true, add referenced variables to dependencies, otherwise drop them.
+     */
+    def adjustReferenced(bound: Type, isLower: Boolean, add: Boolean) =
+      adjuster.variance = if isLower then 1 else -1
+      adjuster.add = add
+      adjuster.seen.clear(resetToInitial = false)
+      adjuster.traverse(bound)
+
+    /** Use an optimized strategy to adjust dependencies to account for the delta
+     * of previous bound `prevBound` and new bound `bound`: If `prevBound` is an
+     * and/or prefix of `bound` and `baseCase` is true, just add the new parts of `bound`.
+     * @param isLower `bound` and `prevBound` are lower bounds
+     * @return true iff the delta strategy succeeded, false if it failed, in which case
+     *         the constraint is left unchanged.
+     */
+    def adjustDelta(bound: Type, prevBound: Type, isLower: Boolean, baseCase: => Boolean): Boolean =
+      if bound eq prevBound then
+        baseCase
+      else bound match
+        case bound: AndOrType =>
+          adjustDelta(bound.tp1, prevBound, isLower, baseCase) && {
+            adjustReferenced(bound.tp2, isLower, add = true)
+            true
+          }
+        case _ => false
+
+    /** Add or remove dependencies referenced in `bounds`.
+     * @param add if true, dependencies are added, otherwise they are removed
+     */
+    def adjustBounds(bounds: TypeBounds, add: Boolean) =
+      adjustReferenced(bounds.lo, isLower = true, add)
+      adjustReferenced(bounds.hi, isLower = false, add)
+
+    entry match
+      case entry @ TypeBounds(lo, hi) =>
+        prevEntry match
+          case prevEntry @ TypeBounds(plo, phi) =>
+            if !adjustDelta(lo, plo, isLower = true,
+                  adjustDelta(hi, phi, isLower = false, true))
+            then
+              adjustBounds(prevEntry, add = false)
+              adjustBounds(entry, add = true)
+          case _ =>
+            adjustBounds(entry, add = true)
+      case _ =>
+        prevEntry match
+          case prevEntry: TypeBounds =>
+            adjustBounds(prevEntry, add = false)
+          case _ =>
+        dropDeps(srcParam) // srcParam is instantiated, so its dependencies can be dropped
+    this
+  end adjustDeps
+
+  /** Adjust dependencies to account for adding or dropping all `entries` associated
+   * with `poly`.
+   * @param add if true, entries are added, otherwise they are dropped
+   */
+  def adjustDeps(poly: TypeLambda, entries: Array[Type], add: Boolean)(using Context): this.type =
+    for n <- 0 until paramCount(entries) do
+      if add
+      then adjustDeps(entries(n), NoType, poly.paramRefs(n))
+      else adjustDeps(NoType, entries(n), poly.paramRefs(n))
+    this
+
+  /** Remove all reverse dependencies of `param` */
+  def dropDeps(param: TypeParamRef)(using Context): Unit =
+    coDeps = coDeps.remove(param)
+    contraDeps = contraDeps.remove(param)
+
+  /** A string representing the two dependency maps */
+  def depsToString(using Context): String =
+    def depsStr(deps: ReverseDeps): String =
+      def depStr(param: TypeParamRef) = i"$param --> ${deps.at(param).toList}%, %"
+      if deps.isEmpty then "" else i"\n ${deps.toList.map((k, v) => depStr(k))}%\n %"
+    i" co-deps:${depsStr(coDeps)}\n contra-deps:${depsStr(contraDeps)}\n"
+
 // ---------- Adding TypeLambdas --------------------------------------------------

   /** The bound type `tp` without constrained parameters which are clearly
@@ -282,7 +507,8 @@ class OrderingConstraint(private val boundsMap: ParamBounds,
     val entries1 = new Array[Type](nparams * 2)
     poly.paramInfos.copyToArray(entries1, 0)
     tvars.copyToArray(entries1, nparams)
-    newConstraint(boundsMap.updated(poly, entries1), lowerMap, upperMap, hardVars).init(poly)
+    newConstraint(boundsMap = this.boundsMap.updated(poly, entries1))
+      .init(poly)
   }

   /** Split dependent parameters off the bounds for parameters in `poly`.
@@ -298,31 +524,23 @@ class OrderingConstraint(private val boundsMap: ParamBounds,
       val param = poly.paramRefs(i)
       val bounds = dropWildcards(nonParamBounds(param))
       val stripped = stripParams(bounds, todos, isUpper = true)
-      current = updateEntry(current, param, stripped)
+      current = boundsLens.update(this, current, param, stripped)
       while todos.nonEmpty do
         current = todos.head(current, param)
         todos.dropInPlace(1)
       i += 1
     }
-    current.checkNonCyclic()
+    current.adjustDeps(poly, current.boundsMap(poly).nn, add = true)
+      .checkWellFormed()
   }

 // ---------- Updates ------------------------------------------------------------

-  /** If `inst` is a TypeBounds, make sure it does not contain toplevel
-   * references to `param` (see `Constraint#occursAtToplevel` for a definition
-   * of "toplevel").
-   * Any such references are replaced by `Nothing` in the lower bound and `Any`
-   * in the upper bound.
-   * References can be direct or indirect through instantiations of other
-   * parameters in the constraint.
- */ - private def ensureNonCyclic(param: TypeParamRef, inst: Type)(using Context): Type = - - def recur(tp: Type, fromBelow: Boolean): Type = tp match + def validBoundFor(param: TypeParamRef, bound: Type, isUpper: Boolean)(using Context): Type = + def recur(tp: Type): Type = tp match case tp: AndOrType => - val r1 = recur(tp.tp1, fromBelow) - val r2 = recur(tp.tp2, fromBelow) + val r1 = recur(tp.tp1) + val r2 = recur(tp.tp2) if (r1 eq tp.tp1) && (r2 eq tp.tp2) then tp else tp.match case tp: OrType => @@ -331,35 +549,34 @@ class OrderingConstraint(private val boundsMap: ParamBounds, r1 & r2 case tp: TypeParamRef => if tp eq param then - if fromBelow then defn.NothingType else defn.AnyType + if isUpper then defn.AnyType else defn.NothingType else entry(tp) match case NoType => tp - case TypeBounds(lo, hi) => if lo eq hi then recur(lo, fromBelow) else tp - case inst => recur(inst, fromBelow) + case TypeBounds(lo, hi) => if lo eq hi then recur(lo) else tp + case inst => recur(inst) case tp: TypeVar => - val underlying1 = recur(tp.underlying, fromBelow) + val underlying1 = recur(tp.underlying) if underlying1 ne tp.underlying then underlying1 else tp case CapturingType(parent, refs) => - val parent1 = recur(parent, fromBelow) + val parent1 = recur(parent) if parent1 ne parent then tp.derivedCapturingType(parent1, refs) else tp case tp: AnnotatedType => - val parent1 = recur(tp.parent, fromBelow) + val parent1 = recur(tp.parent) if parent1 ne tp.parent then tp.derivedAnnotatedType(parent1, tp.annot) else tp case _ => val tp1 = tp.dealiasKeepAnnots if tp1 ne tp then - val tp2 = recur(tp1, fromBelow) + val tp2 = recur(tp1) if tp2 ne tp1 then tp2 else tp else tp - inst match - case bounds: TypeBounds => - bounds.derivedTypeBounds( - recur(bounds.lo, fromBelow = true), - recur(bounds.hi, fromBelow = false)) - case _ => - inst - end ensureNonCyclic + recur(bound) + end validBoundFor + + def validBoundsFor(param: TypeParamRef, bounds: TypeBounds)(using Context): Type = + bounds.derivedTypeBounds( + validBoundFor(param, bounds.lo, isUpper = false), + validBoundFor(param, bounds.hi, isUpper = true)) /** Add the fact `param1 <: param2` to the constraint `current` and propagate * `<:<` relationships between parameters ("edges") but not bounds. 
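
The reverse-dependency maps introduced above (coDeps and contraDeps) turn questions such as "does any other parameter's bound still mention `param`?" into a single map lookup instead of a scan over every constraint entry, which is also what the optimizeReplace path relies on. The following standalone sketch is not compiler code: it uses plain strings for parameters and ordinary immutable Maps and Sets (rather than SimpleIdentityMap and SimpleIdentitySet) purely to illustrate how such a reverse index is kept in sync when a parameter's bound changes.

// Hypothetical toy model of reverse-dependency bookkeeping; not the compiler's data structures.
type Param = String
// referenced parameter -> parameters whose bound currently mentions it
type Deps = Map[Param, Set[Param]]

/** Re-index `srcParam` after its bound changed from referencing `oldRefs` to `newRefs`. */
def adjust(deps: Deps, srcParam: Param, oldRefs: Set[Param], newRefs: Set[Param]): Deps =
  val dropped = (oldRefs -- newRefs).foldLeft(deps) { (m, p) =>
    val remaining = m.getOrElse(p, Set.empty) - srcParam
    if remaining.isEmpty then m - p else m.updated(p, remaining)
  }
  (newRefs -- oldRefs).foldLeft(dropped) { (m, p) =>
    m.updated(p, m.getOrElse(p, Set.empty) + srcParam)
  }

@main def reverseDepsDemo(): Unit =
  var deps: Deps = Map.empty
  deps = adjust(deps, "X", Set.empty, Set("A", "B")) // X's bound now mentions A and B
  deps = adjust(deps, "Y", Set.empty, Set("A"))      // Y's bound mentions A
  println(deps("A"))                                 // both X and Y depend on A
  deps = adjust(deps, "X", Set("A", "B"), Set("B"))  // X's bound is rewritten and drops A
  println(deps.get("A"))                             // Some(Set(Y))
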
@@ -418,7 +635,7 @@ class OrderingConstraint(private val boundsMap: ParamBounds, case param: TypeParamRef if contains(param) => param :: (if (isUpper) upper(param) else lower(param)) case tp: AndType if isUpper => - dependentParams(tp.tp1, isUpper) | (dependentParams(tp.tp2, isUpper)) + dependentParams(tp.tp1, isUpper).setUnion(dependentParams(tp.tp2, isUpper)) case tp: OrType if !isUpper => dependentParams(tp.tp1, isUpper).intersect(dependentParams(tp.tp2, isUpper)) case EtaExpansion(tycon) => @@ -426,10 +643,12 @@ class OrderingConstraint(private val boundsMap: ParamBounds, case _ => Nil - private def updateEntry(current: This, param: TypeParamRef, tp: Type)(using Context): This = { - if Config.checkNoWildcardsInConstraint then assert(!tp.containsWildcardTypes) - var current1 = boundsLens.update(this, current, param, tp) - tp match { + private def updateEntry(current: This, param: TypeParamRef, newEntry: Type)(using Context): This = { + if Config.checkNoWildcardsInConstraint then assert(!newEntry.containsWildcardTypes) + val oldEntry = current.entry(param) + var current1 = boundsLens.update(this, current, param, newEntry) + .adjustDeps(newEntry, oldEntry, param) + newEntry match { case TypeBounds(lo, hi) => for p <- dependentParams(lo, isUpper = false) do current1 = order(current1, p, param) @@ -440,12 +659,11 @@ class OrderingConstraint(private val boundsMap: ParamBounds, current1 } - /** The public version of `updateEntry`. Guarantees that there are no cycles */ def updateEntry(param: TypeParamRef, tp: Type)(using Context): This = - updateEntry(this, param, ensureNonCyclic(param, tp)).checkNonCyclic() + updateEntry(this, param, tp).checkWellFormed() def addLess(param1: TypeParamRef, param2: TypeParamRef, direction: UnificationDirection)(using Context): This = - order(this, param1, param2, direction).checkNonCyclic() + order(this, param1, param2, direction).checkWellFormed() // ---------- Replacements and Removals ------------------------------------- @@ -455,24 +673,80 @@ class OrderingConstraint(private val boundsMap: ParamBounds, */ def replace(param: TypeParamRef, tp: Type)(using Context): OrderingConstraint = val replacement = tp.dealiasKeepAnnots.stripTypeVar - if param == replacement then this.checkNonCyclic() + if param == replacement then this.checkWellFormed() else assert(replacement.isValueTypeOrLambda) - var current = - if isRemovable(param.binder) then remove(param.binder) - else updateEntry(this, param, replacement) - - def removeParam(ps: List[TypeParamRef]) = ps.filterConserve(param ne _) - def replaceParam(tp: Type, atPoly: TypeLambda, atIdx: Int): Type = - current.ensureNonCyclic(atPoly.paramRefs(atIdx), tp.substParam(param, replacement)) - - current.foreachParam { (p, i) => - current = boundsLens.map(this, current, p, i, replaceParam(_, p, i)) - current = lowerLens.map(this, current, p, i, removeParam) - current = upperLens.map(this, current, p, i, removeParam) - } - current.checkNonCyclic() + val replacedTypeVar = typeVarOfParam(param) + //println(i"replace $param with $replacement in $this") + + def mapReplacedTypeVarTo(to: Type) = new TypeMap: + override def apply(t: Type): Type = + if (t eq replacedTypeVar) && t.exists then to else mapOver(t) + + val coDepsOfParam = coDeps.at(param) + val contraDepsOfParam = contraDeps.at(param) + + var current = updateEntry(this, param, replacement) + // Need to update param early to avoid infinite recursion on instantiation. + // See i16311.scala for a test case. 
On the other hand, for the purposes of + // dependency adjustment, we need to pretend that `param` is still unbound. + // We achieve that by passing a `ignoreBinding = param` to `adjustDeps` below. + + def removeParamFrom(ps: List[TypeParamRef]) = + ps.filterConserve(param ne _) + + for lo <- lower(param) do + current = upperLens.map(this, current, lo, removeParamFrom) + for hi <- upper(param) do + current = lowerLens.map(this, current, hi, removeParamFrom) + + def replaceParamIn(other: TypeParamRef) = + val oldEntry = current.entry(other) + val newEntry = oldEntry.substParam(param, replacement) match + case tp: TypeBounds => current.validBoundsFor(other, tp) + case tp => tp + current = boundsLens.update(this, current, other, newEntry) + var oldDepEntry = oldEntry + var newDepEntry = newEntry + replacedTypeVar match + case tvar: TypeVar => + if tvar.inst.exists // `isInstantiated` would use ctx.typerState.constraint rather than the current constraint + then + // If the type variable has been instantiated, we need to forget about + // the instantiation for old dependencies. + // I.e. to find out what the old entry was, we should not follow + // the newly instantiated type variable but assume the type variable's origin `param`. + // An example where this happens is if `replace` is called from TypeVar's `instantiateWith`. + oldDepEntry = mapReplacedTypeVarTo(param)(oldDepEntry) + else + // If the type variable has not been instantiated, we need to replace references to it + // in the new entry by `replacement`. Otherwise we would get stuck in an uninstantiated + // type variable. + // An example where this happens is if `replace` is called from unify. + newDepEntry = mapReplacedTypeVarTo(replacement)(newDepEntry) + case _ => + if oldDepEntry ne newDepEntry then + current.adjustDeps(newDepEntry, oldDepEntry, other, ignoreBinding = param) + end replaceParamIn + + if optimizeReplace then + current.foreachParam { (p, i) => + val other = p.paramRefs(i) + entry(other) match + case _: TypeBounds => + if coDepsOfParam.contains(other) || contraDepsOfParam.contains(other) then + replaceParamIn(other) + case _ => replaceParamIn(other) + } + else + current.foreachParam { (p, i) => + val other = p.paramRefs(i) + if other != param then replaceParamIn(other) + } + if isRemovable(param.binder) then current = current.remove(param.binder) + current.dropDeps(param) + current.checkWellFormed() end replace def remove(pt: TypeLambda)(using Context): This = { @@ -485,7 +759,8 @@ class OrderingConstraint(private val boundsMap: ParamBounds, } val hardVars1 = pt.paramRefs.foldLeft(hardVars)((hvs, param) => hvs - typeVarOfParam(param)) newConstraint(boundsMap.remove(pt), removeFromOrdering(lowerMap), removeFromOrdering(upperMap), hardVars1) - .checkNonCyclic() + .adjustDeps(pt, boundsMap(pt).nn, add = false) + .checkWellFormed() } def isRemovable(pt: TypeLambda): Boolean = { @@ -511,7 +786,7 @@ class OrderingConstraint(private val boundsMap: ParamBounds, def swapKey[T](m: ArrayValuedMap[T]) = val info = m(from) if info == null then m else m.remove(from).updated(to, info) - var current = newConstraint(swapKey(boundsMap), swapKey(lowerMap), swapKey(upperMap), hardVars) + var current = newConstraint(swapKey(boundsMap), swapKey(lowerMap), swapKey(upperMap)) def subst[T <: Type](x: T): T = x.subst(from, to).asInstanceOf[T] current.foreachParam {(p, i) => current = boundsLens.map(this, current, p, i, subst) @@ -519,12 +794,12 @@ class OrderingConstraint(private val boundsMap: ParamBounds, current = upperLens.map(this, 
current, p, i, _.map(subst)) } constr.println(i"renamed $this to $current") - current.checkNonCyclic() + current.checkWellFormed() def isHard(tv: TypeVar) = hardVars.contains(tv) def withHard(tv: TypeVar)(using Context) = - newConstraint(boundsMap, lowerMap, upperMap, hardVars + tv) + newConstraint(hardVars = this.hardVars + tv) def instType(tvar: TypeVar): Type = entry(tvar.origin) match case _: TypeBounds => NoType @@ -551,6 +826,26 @@ class OrderingConstraint(private val boundsMap: ParamBounds, assert(tvar.origin == param, i"mismatch $tvar, $param") case _ => + def occursAtToplevel(param: TypeParamRef, inst: Type)(using Context): Boolean = + def occurs(tp: Type)(using Context): Boolean = tp match + case tp: AndOrType => + occurs(tp.tp1) || occurs(tp.tp2) + case tp: TypeParamRef => + (tp eq param) || entry(tp).match + case NoType => false + case TypeBounds(lo, hi) => (lo eq hi) && occurs(lo) + case inst => occurs(inst) + case tp: TypeVar => + occurs(tp.underlying) + case TypeBounds(lo, hi) => + occurs(lo) || occurs(hi) + case _ => + val tp1 = tp.dealias + (tp1 ne tp) && occurs(tp1) + + occurs(inst) + end occursAtToplevel + // ---------- Exploration -------------------------------------------------------- def domainLambdas: List[TypeLambda] = boundsMap.keys @@ -603,7 +898,57 @@ class OrderingConstraint(private val boundsMap: ParamBounds, // ---------- Checking ----------------------------------------------- - def checkNonCyclic()(using Context): this.type = + def checkWellFormed()(using Context): this.type = + + /** Check that each dependency A -> B in coDeps and contraDeps corresponds to + * a reference to A at the right variance in the entry of B. + */ + def checkBackward(deps: ReverseDeps, depsName: String, v: Int)(using Context): Unit = + deps.foreachBinding { (param, params) => + for srcParam <- params do + assert(contains(srcParam) && occursAtVariance(param, v, in = entry(srcParam)), + i"wrong $depsName backwards reference $param -> $srcParam in $thisConstraint") + } + + /** A type traverser that checks that all references bound in the constraint + * are accounted for in coDeps and/or contraDeps. + */ + def checkForward(srcParam: TypeParamRef)(using Context) = + new TypeTraverser with ConstraintAwareTraversal[Unit]: + val seen = util.HashSet[LazyRef]() + def traverse(t: Type): Unit = t match + case param: TypeParamRef if param ne srcParam => + def check(deps: ReverseDeps, directDeps: List[TypeParamRef], depsName: String) = + assert(deps.at(param).contains(srcParam) || directDeps.contains(srcParam), + i"missing $depsName backwards reference $param -> $srcParam in $thisConstraint") + entry(param) match + case _: TypeBounds => + if variance >= 0 then check(contraDeps, upper(param), "contra") + if variance <= 0 then check(coDeps, lower(param), "co") + case tp => + traverse(tp) + case tp: LazyRef => + if !seen.contains(tp) then + seen += tp + traverse(tp.ref) + case _ => traverseChildren(t) + + /** Does `param` occur at variance `v` or else at variance 0 in entry `in`? 
*/ + def occursAtVariance(param: TypeParamRef, v: Int, in: Type)(using Context): Boolean = + val test = new TypeAccumulator[Boolean] with ConstraintAwareTraversal[Boolean]: + def apply(x: Boolean, t: Type): Boolean = + if x then true + else t match + case t: TypeParamRef => + entry(t) match + case _: TypeBounds => + t == param && (variance == 0 || variance == v) + case e => + apply(x, e) + case _ => + foldOver(x, t) + test(false, in) + if Config.checkConstraintsNonCyclic then domainParams.foreach { param => val inst = entry(param) @@ -612,28 +957,13 @@ class OrderingConstraint(private val boundsMap: ParamBounds, assert(!occursAtToplevel(param, inst), s"cyclic bound for $param: ${inst.show} in ${this.show}") } - this - - def occursAtToplevel(param: TypeParamRef, inst: Type)(using Context): Boolean = - - def occurs(tp: Type)(using Context): Boolean = tp match - case tp: AndOrType => - occurs(tp.tp1) || occurs(tp.tp2) - case tp: TypeParamRef => - (tp eq param) || entry(tp).match - case NoType => false - case TypeBounds(lo, hi) => (lo eq hi) && occurs(lo) - case inst => occurs(inst) - case tp: TypeVar => - occurs(tp.underlying) - case TypeBounds(lo, hi) => - occurs(lo) || occurs(hi) - case _ => - val tp1 = tp.dealias - (tp1 ne tp) && occurs(tp1) + if Config.checkConstraintDeps || ctx.settings.YcheckConstraintDeps.value then + checkBackward(coDeps, "co", -1) + checkBackward(contraDeps, "contra", +1) + domainParams.foreach(p => if contains(p) then checkForward(p).traverse(entry(p))) - occurs(inst) - end occursAtToplevel + this + end checkWellFormed override def checkClosed()(using Context): Unit = @@ -663,13 +993,16 @@ class OrderingConstraint(private val boundsMap: ParamBounds, val constrainedText = " constrained types = " + domainLambdas.mkString("\n") val boundsText = - " bounds = " + { + "\n bounds = " + { val assocs = for (param <- domainParams) yield s"${param.binder.paramNames(param.paramNum)}: ${entryText(entry(param))}" assocs.mkString("\n") } - constrainedText + "\n" + boundsText + val depsText = + "\n coDeps = " + coDeps + + "\n contraDeps = " + contraDeps + constrainedText + boundsText + depsText } } diff --git a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala index ff9a5cd0aed7..5e8a960608e6 100644 --- a/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala +++ b/compiler/src/dotty/tools/dotc/core/PatternTypeConstrainer.scala @@ -265,26 +265,26 @@ trait PatternTypeConstrainer { self: TypeComparer => (tp, pt) match { case (AppliedType(tyconS, argsS), AppliedType(tyconP, argsP)) => val saved = state.nn.constraint - val savedGadt = ctx.gadt.fresh val result = - tyconS.typeParams.lazyZip(argsS).lazyZip(argsP).forall { (param, argS, argP) => - val variance = param.paramVarianceSign - if variance == 0 || assumeInvariantRefinement || - // As a special case, when pattern and scrutinee types have the same type constructor, - // we infer better bounds for pattern-bound abstract types. 
- argP.typeSymbol.isPatternBound && patternTp.classSymbol == scrutineeTp.classSymbol - then - val TypeBounds(loS, hiS) = argS.bounds - val TypeBounds(loP, hiP) = argP.bounds - var res = true - if variance < 1 then res &&= isSubType(loS, hiP) - if variance > -1 then res &&= isSubType(loP, hiS) - res - else true + ctx.gadtState.rollbackGadtUnless { + tyconS.typeParams.lazyZip(argsS).lazyZip(argsP).forall { (param, argS, argP) => + val variance = param.paramVarianceSign + if variance == 0 || assumeInvariantRefinement || + // As a special case, when pattern and scrutinee types have the same type constructor, + // we infer better bounds for pattern-bound abstract types. + argP.typeSymbol.isPatternBound && patternTp.classSymbol == scrutineeTp.classSymbol + then + val TypeBounds(loS, hiS) = argS.bounds + val TypeBounds(loP, hiP) = argP.bounds + var res = true + if variance < 1 then res &&= isSubType(loS, hiP) + if variance > -1 then res &&= isSubType(loP, hiS) + res + else true + } } if !result then constraint = saved - ctx.gadt.restore(savedGadt) result case _ => // Give up if we don't get AppliedType, e.g. if we upcasted to Any. diff --git a/compiler/src/dotty/tools/dotc/core/Periods.scala b/compiler/src/dotty/tools/dotc/core/Periods.scala index 44d83dcb5278..ee877fb538d4 100644 --- a/compiler/src/dotty/tools/dotc/core/Periods.scala +++ b/compiler/src/dotty/tools/dotc/core/Periods.scala @@ -20,7 +20,7 @@ object Periods { /** Are all base types in the current period guaranteed to be the same as in period `p`? */ def currentHasSameBaseTypesAs(p: Period)(using Context): Boolean = val period = ctx.period - period == p || + period.code == p.code || period.runId == p.runId && unfusedPhases(period.phaseId).sameBaseTypesStartId == unfusedPhases(p.phaseId).sameBaseTypesStartId @@ -118,7 +118,8 @@ object Periods { apply(rid, 0, PhaseMask) } - final val Nowhere: Period = new Period(0) + inline val NowhereCode = 0 + final val Nowhere: Period = new Period(NowhereCode) final val InitialPeriod: Period = Period(InitialRunId, FirstPhaseId) diff --git a/compiler/src/dotty/tools/dotc/core/Phases.scala b/compiler/src/dotty/tools/dotc/core/Phases.scala index b4a2dcac1b85..205554e418ed 100644 --- a/compiler/src/dotty/tools/dotc/core/Phases.scala +++ b/compiler/src/dotty/tools/dotc/core/Phases.scala @@ -197,6 +197,14 @@ object Phases { config.println(s"nextDenotTransformerId = ${nextDenotTransformerId.toList}") } + /** Unlink `phase` from Denot transformer chain. This means that + * any denotation transformer defined by the phase will not be executed. 
+ */ + def unlinkPhaseAsDenotTransformer(phase: Phase)(using Context) = + for i <- 0 until nextDenotTransformerId.length do + if nextDenotTransformerId(i) == phase.id then + nextDenotTransformerId(i) = nextDenotTransformerId(phase.id + 1) + private var myParserPhase: Phase = _ private var myTyperPhase: Phase = _ private var myPostTyperPhase: Phase = _ diff --git a/compiler/src/dotty/tools/dotc/core/Scopes.scala b/compiler/src/dotty/tools/dotc/core/Scopes.scala index 863ae4fa6b7f..99076b422358 100644 --- a/compiler/src/dotty/tools/dotc/core/Scopes.scala +++ b/compiler/src/dotty/tools/dotc/core/Scopes.scala @@ -467,7 +467,7 @@ object Scopes { override def size: Int = 0 override def nestingLevel: Int = 0 override def toList(using Context): List[Symbol] = Nil - override def cloneScope(using Context): MutableScope = unsupported("cloneScope") + override def cloneScope(using Context): MutableScope = newScope(nestingLevel) override def lookupEntry(name: Name)(using Context): ScopeEntry | Null = null override def lookupNextEntry(entry: ScopeEntry)(using Context): ScopeEntry | Null = null } diff --git a/compiler/src/dotty/tools/dotc/core/StdNames.scala b/compiler/src/dotty/tools/dotc/core/StdNames.scala index c0aca9d8abf4..92f2e55a49bf 100644 --- a/compiler/src/dotty/tools/dotc/core/StdNames.scala +++ b/compiler/src/dotty/tools/dotc/core/StdNames.scala @@ -3,6 +3,7 @@ package core import scala.collection.mutable import scala.annotation.switch +import scala.annotation.internal.sharable import Names._ import Symbols._ import Contexts._ @@ -40,7 +41,9 @@ object StdNames { inline val Tuple = "Tuple" inline val Product = "Product" - def sanitize(str: String): String = str.replaceAll("""[<>]""", """\$""").nn + @sharable + private val disallowed = java.util.regex.Pattern.compile("""[<>]""").nn + def sanitize(str: String): String = disallowed.matcher(str).nn.replaceAll("""\$""").nn } abstract class DefinedNames[N <: Name] { @@ -128,6 +131,7 @@ object StdNames { val EXCEPTION_RESULT_PREFIX: N = "exceptionResult" val EXPAND_SEPARATOR: N = str.EXPAND_SEPARATOR val IMPORT: N = "" + val INTO: N = "" val MODULE_SUFFIX: N = str.MODULE_SUFFIX val OPS_PACKAGE: N = "" val OVERLOADED: N = "" @@ -243,7 +247,6 @@ object StdNames { final val ToString: N = "ToString" final val Xor: N = "^" - final val ClassfileAnnotation: N = "ClassfileAnnotation" final val ClassManifest: N = "ClassManifest" final val Enum: N = "Enum" final val Group: N = "Group" @@ -420,6 +423,7 @@ object StdNames { val assert_ : N = "assert" val assume_ : N = "assume" val box: N = "box" + val break: N = "break" val build : N = "build" val bundle: N = "bundle" val bytes: N = "bytes" @@ -501,6 +505,7 @@ object StdNames { val info: N = "info" val inlinedEquals: N = "inlinedEquals" val internal: N = "internal" + val into: N = "into" val isArray: N = "isArray" val isDefinedAt: N = "isDefinedAt" val isDefinedAtImpl: N = "$isDefinedAt" @@ -510,10 +515,12 @@ object StdNames { val isInstanceOfPM: N = "$isInstanceOf$" val java: N = "java" val key: N = "key" + val label: N = "label" val lang: N = "lang" val language: N = "language" val length: N = "length" val lengthCompare: N = "lengthCompare" + val local: N = "local" val longHash: N = "longHash" val macroThis : N = "_this" val macroContext : N = "c" @@ -825,7 +832,7 @@ object StdNames { def newBitmapName(bitmapPrefix: TermName, n: Int): TermName = bitmapPrefix ++ n.toString - def selectorName(n: Int): TermName = "_" + (n + 1) + def selectorName(n: Int): TermName = productAccessorName(n + 1) object primitive { 
      val arrayApply: TermName = "[]apply"
diff --git a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala
index d0bf0f4da6dc..9d7a3945a1ca 100644
--- a/compiler/src/dotty/tools/dotc/core/SymDenotations.scala
+++ b/compiler/src/dotty/tools/dotc/core/SymDenotations.scala
@@ -24,7 +24,7 @@ import config.Config
 import reporting._
 import collection.mutable
 import transform.TypeUtils._
-import cc.{CapturingType, derivedCapturingType}
+import cc.{CapturingType, derivedCapturingType, Setup, EventuallyCapturingType, isEventuallyCapturingType}

 import scala.annotation.internal.sharable

@@ -39,7 +39,7 @@ object SymDenotations {
     final val name: Name,
     initFlags: FlagSet,
     initInfo: Type,
-    initPrivateWithin: Symbol = NoSymbol) extends SingleDenotation(symbol, initInfo) {
+    initPrivateWithin: Symbol = NoSymbol) extends SingleDenotation(symbol, initInfo, name.isTypeName) {

     //assert(symbol.id != 4940, name)

@@ -168,7 +168,8 @@ object SymDenotations {
         }
       }
       else {
-        if (myFlags.is(Touched)) throw CyclicReference(this)
+        if (myFlags.is(Touched))
+          throw CyclicReference(this)(using ctx.withOwner(symbol))
         myFlags |= Touched
         atPhase(validFor.firstPhaseId)(completer.complete(this))
       }
@@ -251,6 +252,15 @@ object SymDenotations {
     final def filterAnnotations(p: Annotation => Boolean)(using Context): Unit =
       annotations = annotations.filterConserve(p)

+    def annotationsCarrying(meta: Set[Symbol], orNoneOf: Set[Symbol] = Set.empty)(using Context): List[Annotation] =
+      annotations.filterConserve(_.hasOneOfMetaAnnotation(meta, orNoneOf = orNoneOf))
+
+    def copyAndKeepAnnotationsCarrying(phase: DenotTransformer, meta: Set[Symbol], orNoneOf: Set[Symbol] = Set.empty)(using Context): Unit =
+      if annotations.nonEmpty then
+        val cpy = copySymDenotation()
+        cpy.annotations = annotationsCarrying(meta, orNoneOf = orNoneOf)
+        cpy.installAfter(phase)
+
     /** Optionally, the annotation matching the given class symbol */
     final def getAnnotation(cls: Symbol)(using Context): Option[Annotation] =
       dropOtherAnnotations(annotations, cls) match {
@@ -273,7 +283,7 @@ object SymDenotations {
     /** Add the given annotation without parameters to the annotations of this denotation */
     final def addAnnotation(cls: ClassSymbol)(using Context): Unit =
-      addAnnotation(Annotation(cls))
+      addAnnotation(Annotation(cls, symbol.span))

     /** Remove annotation with given class from this denotation */
     final def removeAnnotation(cls: Symbol)(using Context): Unit =
@@ -505,6 +515,30 @@ object SymDenotations {
     /** `fullName` where `.' is the separator character */
     def fullName(using Context): Name = fullNameSeparated(QualifiedName)

+    /** The fully qualified name on the JVM of the class corresponding to this symbol. */
+    def binaryClassName(using Context): String =
+      val builder = new StringBuilder
+      val pkg = enclosingPackageClass
+      if !pkg.isEffectiveRoot then
+        builder.append(pkg.fullName.mangledString)
+        builder.append(".")
+      val flatName = this.flatName
+      // Some companion objects are fake (that is, they're a compiler fiction
+      // that doesn't correspond to a class that exists at runtime), this
+      // can happen in two cases:
+      // - If a Java class has static members.
+      // - If we create constructor proxies for a class (see NamerOps#addConstructorProxies).
+      //
+      // In both cases it's vital that we don't return the object name.
+ // For instance, sending it to zinc: when sbt is restarted, zinc will inspect the binary + // dependencies to see if they're still on the classpath, if it + // doesn't find them it will invalidate whatever referenced them, so + // any reference to a fake companion will lead to extra recompilations. + // Instead, use the class name since it's guaranteed to exist at runtime. + val clsFlatName = if isOneOf(JavaDefined | ConstructorProxy) then flatName.stripModuleClassSuffix else flatName + builder.append(clsFlatName.mangledString) + builder.toString + private var myTargetName: Name | Null = null private def computeTargetName(targetNameAnnot: Option[Annotation])(using Context): Name = @@ -542,9 +576,6 @@ object SymDenotations { // ----- Tests ------------------------------------------------- - /** Is this denotation a type? */ - override def isType: Boolean = name.isTypeName - /** Is this denotation a class? */ final def isClass: Boolean = isInstanceOf[ClassDenotation] @@ -748,7 +779,7 @@ object SymDenotations { * So the first call to a stable member might fail and/or produce side effects. */ final def isStableMember(using Context): Boolean = { - def isUnstableValue = isOneOf(UnstableValueFlags) || info.isInstanceOf[ExprType] + def isUnstableValue = isOneOf(UnstableValueFlags) || info.isInstanceOf[ExprType] || isAllOf(InlineParam) isType || is(StableRealizable) || exists && !isUnstableValue } @@ -808,19 +839,14 @@ object SymDenotations { /** Is this a Scala or Java annotation ? */ def isAnnotation(using Context): Boolean = - isClass && derivesFrom(defn.AnnotationClass) + isClass && (derivesFrom(defn.AnnotationClass) || is(JavaAnnotation)) /** Is this symbol a class that extends `java.io.Serializable` ? */ def isSerializable(using Context): Boolean = isClass && derivesFrom(defn.JavaSerializableClass) - /** Is this symbol a class that extends `AnyVal`? */ - final def isValueClass(using Context): Boolean = - val di = initial - di.isClass - && atPhase(di.validFor.firstPhaseId)(di.derivesFrom(defn.AnyValClass)) - // We call derivesFrom at the initial phase both because AnyVal does not exist - // after Erasure and to avoid cyclic references caused by forcing denotations + /** Is this symbol a class that extends `AnyVal`? Overridden in ClassDenotation */ + def isValueClass(using Context): Boolean = false /** Is this symbol a class of which `null` is a value? 
*/ final def isNullableClass(using Context): Boolean = @@ -960,6 +986,26 @@ object SymDenotations { def isSkolem: Boolean = name == nme.SKOLEM + // Java language spec: https://docs.oracle.com/javase/specs/jls/se11/html/jls-15.html#jls-15.12.3 + // Scala 2 spec: https://scala-lang.org/files/archive/spec/2.13/06-expressions.html#signature-polymorphic-methods + def isSignaturePolymorphic(using Context): Boolean = + containsSignaturePolymorphic + && is(JavaDefined) + && hasAnnotation(defn.NativeAnnot) + && atPhase(typerPhase)(symbol.denot).paramSymss.match + case List(List(p)) => p.info.isRepeatedParam + case _ => false + + def containsSignaturePolymorphic(using Context): Boolean = + maybeOwner == defn.MethodHandleClass + || maybeOwner == defn.VarHandleClass + + def originalSignaturePolymorphic(using Context): Denotation = + if containsSignaturePolymorphic && !isSignaturePolymorphic then + val d = owner.info.member(name) + if d.symbol.isSignaturePolymorphic then d else NoDenotation + else NoDenotation + def isInlineMethod(using Context): Boolean = isAllOf(InlineMethod, butNot = Accessor) @@ -1053,6 +1099,7 @@ object SymDenotations { case tp: Symbol => sourceOfSelf(tp.info) case tp: RefinedType => sourceOfSelf(tp.parent) case tp: AnnotatedType => sourceOfSelf(tp.parent) + case tp: ThisType => tp.cls } sourceOfSelf(selfType) case info: LazyType => @@ -1151,9 +1198,9 @@ object SymDenotations { final def isEffectivelySealed(using Context): Boolean = isOneOf(FinalOrSealed) || isClass && !isOneOf(EffectivelyOpenFlags) - final def isTransparentTrait(using Context): Boolean = - isAllOf(TransparentTrait) - || defn.assumedTransparentTraits.contains(symbol) + final def isTransparentClass(using Context): Boolean = + is(TransparentType) + || defn.isAssumedTransparent(symbol) || isClass && hasAnnotation(defn.TransparentTraitAnnot) /** The class containing this denotation which has the given effective name. */ @@ -1827,19 +1874,21 @@ object SymDenotations { super.info_=(tp) } - /** The symbols of the parent classes. */ - def parentSyms(using Context): List[Symbol] = info match { - case classInfo: ClassInfo => classInfo.declaredParents.map(_.classSymbol) + /** The types of the parent classes. */ + def parentTypes(using Context): List[Type] = info match + case classInfo: ClassInfo => classInfo.declaredParents case _ => Nil - } + + /** The symbols of the parent classes. */ + def parentSyms(using Context): List[Symbol] = + parentTypes.map(_.classSymbol) /** The symbol of the superclass, NoSymbol if no superclass exists */ - def superClass(using Context): Symbol = parentSyms match { - case parent :: _ => - if (parent.is(Trait)) NoSymbol else parent - case _ => - NoSymbol - } + def superClass(using Context): Symbol = parentTypes match + case parentType :: _ => + val parentCls = parentType.classSymbol + if parentCls.is(Trait) then NoSymbol else parentCls + case _ => NoSymbol /** The explicitly given self type (self types of modules are assumed to be * explcitly given here). 
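
As a side note to the binaryClassName helper added earlier in this file's diff: it builds the JVM-level name from the mangled package prefix plus the flat class name, and for "fake" companions (Java statics, constructor proxies) it deliberately drops the module-class suffix so the name refers to a class that actually exists at runtime. A rough, hypothetical approximation of that shape, with the package prefix, flat name and fake-companion flag passed in as plain values (in the real code they come from the denotation):

// Simplified sketch of the naming scheme; not the compiler API.
def binaryClassNameSketch(pkgPrefix: Option[String], flatName: String, isFakeCompanion: Boolean): String =
  // For fake companions, strip the module-class suffix "$" so we name the runtime class instead.
  val cls = if isFakeCompanion then flatName.stripSuffix("$") else flatName
  pkgPrefix.fold(cls)(pkg => s"$pkg.$cls")

@main def binaryClassNameDemo(): Unit =
  println(binaryClassNameSketch(Some("scala.collection.immutable"), "List$", isFakeCompanion = false)) // scala.collection.immutable.List$
  println(binaryClassNameSketch(Some("java.util"), "Arrays$", isFakeCompanion = true))                 // java.util.Arrays
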
@@ -1901,20 +1950,20 @@ object SymDenotations { def computeBaseData(implicit onBehalf: BaseData, ctx: Context): (List[ClassSymbol], BaseClassSet) = { def emptyParentsExpected = is(Package) || (symbol == defn.AnyClass) || ctx.erasedTypes && (symbol == defn.ObjectClass) - val psyms = parentSyms - if (psyms.isEmpty && !emptyParentsExpected) + val parents = parentTypes + if (parents.isEmpty && !emptyParentsExpected) onBehalf.signalProvisional() val builder = new BaseDataBuilder - def traverse(parents: List[Symbol]): Unit = parents match { + def traverse(parents: List[Type]): Unit = parents match { case p :: parents1 => - p match { + p.classSymbol match { case pcls: ClassSymbol => builder.addAll(pcls.baseClasses) case _ => assert(isRefinementClass || p.isError || ctx.mode.is(Mode.Interactive), s"$this has non-class parent: $p") } traverse(parents1) case nil => } - traverse(psyms) + traverse(parents) (classSymbol :: builder.baseClasses, builder.baseClassSet) } @@ -1951,6 +2000,17 @@ object SymDenotations { /** Hook to do a pre-enter test. Overridden in PackageDenotation */ protected def proceedWithEnter(sym: Symbol, mscope: MutableScope)(using Context): Boolean = true + final override def isValueClass(using Context): Boolean = + val di = initial.asClass + val anyVal = defn.AnyValClass + if di.baseDataCache.isValid && !ctx.erasedTypes then + // fast path that does not demand time travel + (symbol eq anyVal) || di.baseClassSet.contains(anyVal) + else + // We call derivesFrom at the initial phase both because AnyVal does not exist + // after Erasure and to avoid cyclic references caused by forcing denotations + atPhase(di.validFor.firstPhaseId)(di.derivesFrom(anyVal)) + /** Enter a symbol in current scope, and future scopes of same denotation. * Note: We require that this does not happen after the first time * someone does a findMember on a subclass. 
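
The isValueClass override above takes a fast path whenever the base-data cache is already valid: membership of AnyVal in the cached base-class set answers the question without the phase-travelling derivesFrom call. Below is a hypothetical, much simplified model of that fast-path versus slow-path split (plain strings instead of symbols, an Option-valued cache standing in for baseDataCache); it only illustrates the caching idea, not the compiler's API.

// Toy model of a cached fast path versus a recomputing slow path; not compiler code.
final class ClassInfoSketch(val name: String, computeBaseClasses: () => Set[String]):
  private var cachedBases: Option[Set[String]] = None

  def refreshCache(): Unit = cachedBases = Some(computeBaseClasses())

  // Slow path: recompute the base classes (stands in for the phase-travelling derivesFrom).
  private def derivesFrom(base: String): Boolean = computeBaseClasses().contains(base)

  def isValueClass: Boolean = cachedBases match
    case Some(bases) => name == "AnyVal" || bases.contains("AnyVal") // fast path: a set lookup only
    case None        => derivesFrom("AnyVal")                        // slow path

@main def isValueClassDemo(): Unit =
  val meters = ClassInfoSketch("Meters", () => { println("recomputing bases"); Set("Any", "AnyVal") })
  println(meters.isValueClass) // recomputes once, prints true
  meters.refreshCache()        // fills the cache (recomputes once more)
  println(meters.isValueClass) // true, answered from the cache
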
@@ -2092,7 +2152,7 @@ object SymDenotations { Stats.record("basetype cache entries") if (!baseTp.exists) Stats.record("basetype cache NoTypes") } - if (!tp.isProvisional) + if (!tp.isProvisional && !CapturingType.isUncachable(tp)) btrCache(tp) = baseTp else btrCache.remove(tp) // Remove any potential sentinel value @@ -2106,8 +2166,9 @@ object SymDenotations { def recur(tp: Type): Type = try { tp match { case tp: CachedType => - val baseTp = btrCache.lookup(tp) - if (baseTp != null) return ensureAcyclic(baseTp) + val baseTp: Type | Null = btrCache.lookup(tp) + if (baseTp != null) + return ensureAcyclic(baseTp) case _ => } if (Stats.monitored) { @@ -2251,9 +2312,11 @@ object SymDenotations { var names = Set[Name]() def maybeAdd(name: Name) = if (keepOnly(thisType, name)) names += name try { - for (p <- parentSyms if p.isClass) - for (name <- p.asClass.memberNames(keepOnly)) - maybeAdd(name) + for ptype <- parentTypes do + ptype.classSymbol match + case pcls: ClassSymbol => + for name <- pcls.memberNames(keepOnly) do + maybeAdd(name) val ownSyms = if (keepOnly eq implicitFilter) if (this.is(Package)) Iterator.empty @@ -2438,13 +2501,13 @@ object SymDenotations { val youngest = assocFiles.filter(_.lastModified == lastModDate) val chosen = youngest.head def ambiguousFilesMsg(f: AbstractFile) = - em"""Toplevel definition $name is defined in - | $chosen - |and also in - | $f""" + i"""Toplevel definition $name is defined in + | $chosen + |and also in + | $f""" if youngest.size > 1 then - throw TypeError(i"""${ambiguousFilesMsg(youngest.tail.head)} - |One of these files should be removed from the classpath.""") + throw TypeError(em"""${ambiguousFilesMsg(youngest.tail.head)} + |One of these files should be removed from the classpath.""") // Warn if one of the older files comes from a different container. // In that case picking the youngest file is not necessarily what we want, @@ -2454,15 +2517,18 @@ object SymDenotations { try f.container == chosen.container catch case NonFatal(ex) => true if !ambiguityWarningIssued then for conflicting <- assocFiles.find(!sameContainer(_)) do - report.warning(i"""${ambiguousFilesMsg(conflicting.nn)} - |Keeping only the definition in $chosen""") + report.warning(em"""${ambiguousFilesMsg(conflicting.nn)} + |Keeping only the definition in $chosen""") ambiguityWarningIssued = true multi.filterWithPredicate(_.symbol.associatedFile == chosen) end dropStale - if symbol eq defn.ScalaPackageClass then + if name == nme.CONSTRUCTOR then + NoDenotation // packages don't have constructors, even if package objects do. 
+ else if symbol eq defn.ScalaPackageClass then + // revert order: search package first, then nested package objects val denots = super.computeMembersNamed(name) - if denots.exists || name == nme.CONSTRUCTOR then denots + if denots.exists then denots else recur(packageObjs, NoDenotation) else recur(packageObjs, NoDenotation) end computeMembersNamed @@ -2505,7 +2571,6 @@ object SymDenotations { @sharable object NoDenotation extends SymDenotation(NoSymbol, NoSymbol, "".toTermName, Permanent, NoType) { - override def isType: Boolean = false override def isTerm: Boolean = false override def exists: Boolean = false override def owner: Symbol = throw new AssertionError("NoDenotation.owner") @@ -2802,7 +2867,7 @@ object SymDenotations { } def isValidAt(phase: Phase)(using Context) = - checkedPeriod == ctx.period || + checkedPeriod.code == ctx.period.code || createdAt.runId == ctx.runId && createdAt.phaseId < unfusedPhases.length && sameGroup(unfusedPhases(createdAt.phaseId), phase) && diff --git a/compiler/src/dotty/tools/dotc/core/SymbolLoaders.scala b/compiler/src/dotty/tools/dotc/core/SymbolLoaders.scala index c5ae98853061..9eb67b468cfa 100644 --- a/compiler/src/dotty/tools/dotc/core/SymbolLoaders.scala +++ b/compiler/src/dotty/tools/dotc/core/SymbolLoaders.scala @@ -88,8 +88,8 @@ object SymbolLoaders { return NoSymbol } else - throw new TypeError( - i"""$owner contains object and package with same name: $pname + throw TypeError( + em"""$owner contains object and package with same name: $pname |one of them needs to be removed from classpath""") newModuleSymbol(owner, pname, PackageCreationFlags, PackageCreationFlags, completer).entered @@ -331,8 +331,9 @@ abstract class SymbolLoader extends LazyType { self => if (ctx.debug) ex.printStackTrace() val msg = ex.getMessage() report.error( - if (msg == null) "i/o error while loading " + root.name - else "error while loading " + root.name + ",\n" + msg) + if msg == null then em"i/o error while loading ${root.name}" + else em"""error while loading ${root.name}, + |$msg""") } try { val start = System.currentTimeMillis diff --git a/compiler/src/dotty/tools/dotc/core/Symbols.scala b/compiler/src/dotty/tools/dotc/core/Symbols.scala index 775062c26b0c..aa3ae0c3c513 100644 --- a/compiler/src/dotty/tools/dotc/core/Symbols.scala +++ b/compiler/src/dotty/tools/dotc/core/Symbols.scala @@ -103,7 +103,7 @@ object Symbols { /** The current denotation of this symbol */ final def denot(using Context): SymDenotation = { util.Stats.record("Symbol.denot") - if (checkedPeriod == ctx.period) lastDenot + if checkedPeriod.code == ctx.period.code then lastDenot else computeDenot(lastDenot) } @@ -630,6 +630,32 @@ object Symbols { owner.thisType, modcls, parents, decls, TermRef(owner.thisType, module)), privateWithin, coord, assocFile) + /** Same as `newCompleteModuleSymbol` except that `parents` can be a list of arbitrary + * types which get normalized into type refs and parameter bindings. 
+ */ + def newNormalizedModuleSymbol( + owner: Symbol, + name: TermName, + modFlags: FlagSet, + clsFlags: FlagSet, + parentTypes: List[Type], + decls: Scope, + privateWithin: Symbol = NoSymbol, + coord: Coord = NoCoord, + assocFile: AbstractFile | Null = null)(using Context): TermSymbol = { + def completer(module: Symbol) = new LazyType { + def complete(denot: SymDenotation)(using Context): Unit = { + val cls = denot.asClass.classSymbol + val decls = newScope + denot.info = ClassInfo(owner.thisType, cls, parentTypes.map(_.dealias), decls, TermRef(owner.thisType, module)) + } + } + newModuleSymbol( + owner, name, modFlags, clsFlags, + (module, modcls) => completer(module), + privateWithin, coord, assocFile) + } + /** Create a package symbol with associated package class * from its non-info fields and a lazy type for loading the package's members. */ @@ -660,7 +686,7 @@ object Symbols { addToGadt: Boolean = true, flags: FlagSet = EmptyFlags)(using Context): Symbol = { val sym = newSymbol(ctx.owner, name, Case | flags, info, coord = span) - if (addToGadt && name.isTypeName) ctx.gadt.addToConstraint(sym) + if (addToGadt && name.isTypeName) ctx.gadtState.addToConstraint(sym) sym } diff --git a/compiler/src/dotty/tools/dotc/core/TypeApplications.scala b/compiler/src/dotty/tools/dotc/core/TypeApplications.scala index 81f822811456..7c25ecd21ebf 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeApplications.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeApplications.scala @@ -533,6 +533,9 @@ class TypeApplications(val self: Type) extends AnyVal { case JavaArrayType(elemtp) => elemtp case tp: OrType if tp.tp1.isBottomType => tp.tp2.elemType case tp: OrType if tp.tp2.isBottomType => tp.tp1.elemType - case _ => self.baseType(defn.SeqClass).argInfos.headOption.getOrElse(NoType) + case _ => + self.baseType(defn.SeqClass) + .orElse(self.baseType(defn.ArrayClass)) + .argInfos.headOption.getOrElse(NoType) } } diff --git a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala index 283a7e3a474e..6428c5315263 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeComparer.scala @@ -23,7 +23,7 @@ import typer.ProtoTypes.constrained import typer.Applications.productSelectorTypes import reporting.trace import annotation.constructorOnly -import cc.{CapturingType, derivedCapturingType, CaptureSet, stripCapturing, isBoxedCapturing, boxed, boxedUnlessFun, boxedIfTypeParam} +import cc.{CapturingType, derivedCapturingType, CaptureSet, stripCapturing, isBoxedCapturing, boxed, boxedUnlessFun, boxedIfTypeParam, isAlwaysPure} /** Provides methods to compare types. 
*/ @@ -60,8 +60,6 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling /** Indicates whether the subtype check used GADT bounds */ private var GADTused: Boolean = false - protected var canWidenAbstract: Boolean = true - private var myInstance: TypeComparer = this def currentInstance: TypeComparer = myInstance @@ -118,7 +116,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling private def isBottom(tp: Type) = tp.widen.isRef(NothingClass) protected def gadtBounds(sym: Symbol)(using Context) = ctx.gadt.bounds(sym) - protected def gadtAddBound(sym: Symbol, b: Type, isUpper: Boolean): Boolean = ctx.gadt.addBound(sym, b, isUpper) + protected def gadtAddBound(sym: Symbol, b: Type, isUpper: Boolean): Boolean = ctx.gadtState.addBound(sym, b, isUpper) protected def typeVarInstance(tvar: TypeVar)(using Context): Type = tvar.underlying @@ -311,6 +309,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling thirdTryNamed(tp2) else ( (tp1.name eq tp2.name) + && !sym1.is(Private) && tp2.isPrefixDependentMemberRef && isSubPrefix(tp1.prefix, tp2.prefix) && tp1.signature == tp2.signature @@ -420,16 +419,16 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling true } def compareTypeParamRef = - assumedTrue(tp1) || - tp2.match { - case tp2: TypeParamRef => constraint.isLess(tp1, tp2) - case _ => false - } || - isSubTypeWhenFrozen(bounds(tp1).hi.boxed, tp2) || { - if (canConstrain(tp1) && !approx.high) - addConstraint(tp1, tp2, fromBelow = false) && flagNothingBound - else thirdTry - } + assumedTrue(tp1) + || tp2.dealias.match + case tp2a: TypeParamRef => constraint.isLess(tp1, tp2a) + case tp2a: AndType => recur(tp1, tp2a) + case _ => false + || isSubTypeWhenFrozen(bounds(tp1).hi.boxed, tp2) + || (if canConstrain(tp1) && !approx.high then + addConstraint(tp1, tp2, fromBelow = false) && flagNothingBound + else thirdTry) + compareTypeParamRef case tp1: ThisType => val cls1 = tp1.cls @@ -522,7 +521,9 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling res case CapturingType(parent1, refs1) => - if subCaptures(refs1, tp2.captureSet, frozenConstraint).isOK && sameBoxed(tp1, tp2, refs1) + if tp2.isAny then true + else if subCaptures(refs1, tp2.captureSet, frozenConstraint).isOK && sameBoxed(tp1, tp2, refs1) + || !ctx.mode.is(Mode.CheckBoundsOrSelfType) && tp1.isAlwaysPure then recur(parent1, tp2) else thirdTry case tp1: AnnotatedType if !tp1.isRefining => @@ -585,7 +586,8 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling } def compareTypeParamRef(tp2: TypeParamRef): Boolean = - assumedTrue(tp2) || { + assumedTrue(tp2) + || { val alwaysTrue = // The following condition is carefully formulated to catch all cases // where the subtype relation is true without needing to add a constraint @@ -596,11 +598,13 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling // widening in `fourthTry` before adding to the constraint. 
if (frozenConstraint) recur(tp1, bounds(tp2).lo.boxed) else isSubTypeWhenFrozen(tp1, tp2) - alwaysTrue || { - if (canConstrain(tp2) && !approx.low) - addConstraint(tp2, tp1.widenExpr, fromBelow = true) - else fourthTry - } + alwaysTrue + || tp1.dealias.match + case tp1a: OrType => recur(tp1a, tp2) + case _ => false + || (if canConstrain(tp2) && !approx.low then + addConstraint(tp2, tp1.widenExpr, fromBelow = true) + else fourthTry) } def thirdTry: Boolean = tp2 match { @@ -826,7 +830,11 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling if refs1.isAlwaysEmpty then recur(tp1, parent2) else subCaptures(refs1, refs2, frozenConstraint).isOK && sameBoxed(tp1, tp2, refs1) - && recur(tp1.widen.stripCapturing, parent2) + && (recur(tp1.widen.stripCapturing, parent2) + || tp1.isInstanceOf[SingletonType] && recur(tp1, parent2) + // this alternative is needed in case the right hand side is a + // capturing type that contains the lhs as an alternative of a union type. + ) catch case ex: AssertionError => println(i"assertion failed while compare captured $tp1 <:< $tp2") throw ex @@ -1435,11 +1443,11 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling if tp2 eq NoType then false else if tp1 eq tp2 then true else - val saved = constraint - val savedGadt = ctx.gadt.fresh + val savedCstr = constraint + val savedGadt = ctx.gadt inline def restore() = - state.constraint = saved - ctx.gadt.restore(savedGadt) + state.constraint = savedCstr + ctx.gadtState.restore(savedGadt) val savedSuccessCount = successCount try recCount += 1 @@ -1845,16 +1853,17 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling */ private def necessaryEither(op1: => Boolean, op2: => Boolean): Boolean = val preConstraint = constraint - val preGadt = ctx.gadt.fresh + val preGadt = ctx.gadt def allSubsumes(leftGadt: GadtConstraint, rightGadt: GadtConstraint, left: Constraint, right: Constraint): Boolean = - subsumes(left, right, preConstraint) && preGadt.subsumes(leftGadt, rightGadt, preGadt) + subsumes(left, right, preConstraint) + && subsumes(leftGadt.constraint, rightGadt.constraint, preGadt.constraint) if op1 then val op1Constraint = constraint - val op1Gadt = ctx.gadt.fresh + val op1Gadt = ctx.gadt constraint = preConstraint - ctx.gadt.restore(preGadt) + ctx.gadtState.restore(preGadt) if op2 then if allSubsumes(op1Gadt, ctx.gadt, op1Constraint, constraint) then gadts.println(i"GADT CUT - prefer ${ctx.gadt} over $op1Gadt") @@ -1863,15 +1872,15 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling gadts.println(i"GADT CUT - prefer $op1Gadt over ${ctx.gadt}") constr.println(i"CUT - prefer $op1Constraint over $constraint") constraint = op1Constraint - ctx.gadt.restore(op1Gadt) + ctx.gadtState.restore(op1Gadt) else gadts.println(i"GADT CUT - no constraint is preferable, reverting to $preGadt") constr.println(i"CUT - no constraint is preferable, reverting to $preConstraint") constraint = preConstraint - ctx.gadt.restore(preGadt) + ctx.gadtState.restore(preGadt) else constraint = op1Constraint - ctx.gadt.restore(op1Gadt) + ctx.gadtState.restore(op1Gadt) true else op2 end necessaryEither @@ -2043,10 +2052,7 @@ class TypeComparer(@constructorOnly initctx: Context) extends ConstraintHandling gadts.println(i"narrow gadt bound of $tparam: ${tparam.info} from ${if (isUpper) "above" else "below"} to $bound ${bound.toString} ${bound.isRef(tparam)}") if (bound.isRef(tparam)) false else - val savedGadt = ctx.gadt.fresh - val 
success = gadtAddBound(tparam, bound, isUpper) - if !success then ctx.gadt.restore(savedGadt) - success + ctx.gadtState.rollbackGadtUnless(gadtAddBound(tparam, bound, isUpper)) } } @@ -3157,7 +3163,7 @@ class TrackingTypeComparer(initctx: Context) extends TypeComparer(initctx) { tp case Nil => val casesText = MatchTypeTrace.noMatchesText(scrut, cases) - throw new TypeError(s"Match type reduction $casesText") + throw TypeError(em"Match type reduction $casesText") inFrozenConstraint { // Empty types break the basic assumption that if a scrutinee and a diff --git a/compiler/src/dotty/tools/dotc/core/TypeErasure.scala b/compiler/src/dotty/tools/dotc/core/TypeErasure.scala index 1fc7ee5d22a8..0e67fd40991b 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeErasure.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeErasure.scala @@ -591,9 +591,9 @@ class TypeErasure(sourceLanguage: SourceLanguage, semiEraseVCs: Boolean, isConst tp case tp: TypeRef => val sym = tp.symbol - if (!sym.isClass) this(tp.translucentSuperType) - else if (semiEraseVCs && isDerivedValueClass(sym)) eraseDerivedValueClass(tp) - else if (defn.isSyntheticFunctionClass(sym)) defn.functionTypeErasure(sym) + if !sym.isClass then this(checkedSuperType(tp)) + else if semiEraseVCs && isDerivedValueClass(sym) then eraseDerivedValueClass(tp) + else if defn.isSyntheticFunctionClass(sym) then defn.functionTypeErasure(sym) else eraseNormalClassRef(tp) case tp: AppliedType => val tycon = tp.tycon @@ -601,7 +601,7 @@ class TypeErasure(sourceLanguage: SourceLanguage, semiEraseVCs: Boolean, isConst else if (tycon.isRef(defn.PairClass)) erasePair(tp) else if (tp.isRepeatedParam) apply(tp.translateFromRepeated(toArray = sourceLanguage.isJava)) else if (semiEraseVCs && isDerivedValueClass(tycon.classSymbol)) eraseDerivedValueClass(tp) - else apply(tp.translucentSuperType) + else this(checkedSuperType(tp)) case tp: TermRef => this(underlyingOfTermRef(tp)) case _: ThisType => @@ -689,6 +689,18 @@ class TypeErasure(sourceLanguage: SourceLanguage, semiEraseVCs: Boolean, isConst tp } + /** Like translucentSuperType, but issue a fatal error if it does not exist. */ + private def checkedSuperType(tp: TypeProxy)(using Context): Type = + val tp1 = tp.translucentSuperType + if !tp1.exists then + val msg = tp.typeConstructor match + case tycon: TypeRef => + MissingType(tycon.prefix, tycon.name).toMessage.message + case _ => + i"Cannot resolve reference to $tp" + throw FatalError(msg) + tp1 + /** Widen term ref, skipping any `()` parameter of an eventual getter. Used to erase a TermRef. * Since getters are introduced after erasure, one would think that erasing a TermRef * could just use `widen`. However, it's possible that the TermRef got read from a class @@ -815,7 +827,7 @@ class TypeErasure(sourceLanguage: SourceLanguage, semiEraseVCs: Boolean, isConst throw new MissingType(tp.prefix, tp.name) val sym = tp.symbol if (!sym.isClass) { - val info = tp.translucentSuperType + val info = checkedSuperType(tp) if (!info.exists) assert(false, i"undefined: $tp with symbol $sym") return sigName(info) } @@ -841,7 +853,7 @@ class TypeErasure(sourceLanguage: SourceLanguage, semiEraseVCs: Boolean, isConst sigName( // todo: what about repeatedParam? 
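// Standalone sketch of the save/try/restore pattern that the change above folds
// into a single `rollbackGadtUnless` call. StateBox, the immutable-Map snapshot
// and rollbackUnless are illustrative stand-ins, not the GadtState API.
final class StateBox[A](private var current: A):
  def get: A = current
  def set(a: A): Unit = current = a

  // Run `op`; if it reports failure, restore the state observed before `op`.
  def rollbackUnless(op: => Boolean): Boolean =
    val saved = current          // cheap snapshot: the state is an immutable value
    val ok = op
    if !ok then current = saved
    ok

@main def rollbackSketchDemo(): Unit =
  val bounds = StateBox(Map.empty[String, String])
  bounds.rollbackUnless { bounds.set(bounds.get + ("A" -> "<: Int")); true }     // kept
  bounds.rollbackUnless { bounds.set(bounds.get + ("B" -> "<: String")); false } // rolled back
  println(bounds.get)  // Map(A -> <: Int)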
if (erasureDependsOnArgs(sym)) this(tp) else if (sym.isClass) tp.underlying - else tp.translucentSuperType) + else checkedSuperType(tp)) case ErasedValueType(_, underlying) => sigName(underlying) case JavaArrayType(elem) => diff --git a/compiler/src/dotty/tools/dotc/core/TypeErrors.scala b/compiler/src/dotty/tools/dotc/core/TypeErrors.scala index a3b594eb0f09..24a207da6836 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeErrors.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeErrors.scala @@ -12,39 +12,58 @@ import Denotations._ import Decorators._ import reporting._ import ast.untpd -import config.Printers.cyclicErrors - -class TypeError(msg: String) extends Exception(msg) { - def this() = this("") - final def toMessage(using Context): Message = - withMode(Mode.Printing)(produceMessage) - def produceMessage(using Context): Message = super.getMessage.nn.toMessage - override def getMessage: String = super.getMessage.nn -} - -class MalformedType(pre: Type, denot: Denotation, absMembers: Set[Name]) extends TypeError { - override def produceMessage(using Context): Message = - i"malformed type: $pre is not a legal prefix for $denot because it contains abstract type member${if (absMembers.size == 1) "" else "s"} ${absMembers.mkString(", ")}" - .toMessage -} - -class MissingType(pre: Type, name: Name) extends TypeError { +import config.Printers.{cyclicErrors, noPrinter} + +import scala.annotation.constructorOnly + +abstract class TypeError(using creationContext: Context) extends Exception(""): + + /** Will the stack trace of this exception be filled in? + * This is expensive and only useful for debugging purposes. + */ + def computeStackTrace: Boolean = + ctx.debug || (cyclicErrors != noPrinter && this.isInstanceOf[CyclicReference] && !(ctx.mode is Mode.CheckCyclic)) + + override def fillInStackTrace(): Throwable = + if computeStackTrace then super.fillInStackTrace().nn + else this + + /** Convert to message. This takes an additional Context, so that we + * use the context when the message is first produced, i.e. when the TypeError + * is handled. This makes a difference for CyclicErrors since we need to know + * the context where the completed symbol is referenced, but the creation + * context of the CyclicReference is the completion context for the symbol. + * See i2887b for a test case, where we want to see + * "recursive or overloaded method needs result type". 
+ */ + def toMessage(using Context): Message + + /** Uses creationContext to produce the message */ + override def getMessage: String = toMessage.message + +object TypeError: + def apply(msg: Message)(using Context) = new TypeError: + def toMessage(using Context) = msg +end TypeError + +class MalformedType(pre: Type, denot: Denotation, absMembers: Set[Name])(using Context) extends TypeError: + def toMessage(using Context) = em"malformed type: $pre is not a legal prefix for $denot because it contains abstract type member${if (absMembers.size == 1) "" else "s"} ${absMembers.mkString(", ")}" + +class MissingType(pre: Type, name: Name)(using Context) extends TypeError: private def otherReason(pre: Type)(using Context): String = pre match { case pre: ThisType if pre.cls.givenSelfType.exists => i"\nor the self type of $pre might not contain all transitive dependencies" case _ => "" } - override def produceMessage(using Context): Message = { - if (ctx.debug) printStackTrace() - i"""cannot resolve reference to type $pre.$name - |the classfile defining the type might be missing from the classpath${otherReason(pre)}""" - .toMessage - } -} + override def toMessage(using Context): Message = + if ctx.debug then printStackTrace() + em"""cannot resolve reference to type $pre.$name + |the classfile defining the type might be missing from the classpath${otherReason(pre)}""" +end MissingType -class RecursionOverflow(val op: String, details: => String, val previous: Throwable, val weight: Int) -extends TypeError { +class RecursionOverflow(val op: String, details: => String, val previous: Throwable, val weight: Int)(using Context) +extends TypeError: def explanation: String = s"$op $details" @@ -71,50 +90,51 @@ extends TypeError { (rs.map(_.explanation): List[String]).mkString("\n ", "\n| ", "") } - override def produceMessage(using Context): Message = NoExplanation { + override def toMessage(using Context): Message = val mostCommon = recursions.groupBy(_.op).toList.maxBy(_._2.map(_.weight).sum)._2.reverse - s"""Recursion limit exceeded. - |Maybe there is an illegal cyclic reference? - |If that's not the case, you could also try to increase the stacksize using the -Xss JVM option. - |For the unprocessed stack trace, compile with -Yno-decode-stacktraces. - |A recurring operation is (inner to outer): - |${opsString(mostCommon)}""".stripMargin - } + em"""Recursion limit exceeded. + |Maybe there is an illegal cyclic reference? + |If that's not the case, you could also try to increase the stacksize using the -Xss JVM option. + |For the unprocessed stack trace, compile with -Yno-decode-stacktraces. + |A recurring operation is (inner to outer): + |${opsString(mostCommon).stripMargin}""" override def fillInStackTrace(): Throwable = this override def getStackTrace(): Array[StackTraceElement] = previous.getStackTrace().asInstanceOf -} +end RecursionOverflow /** Post-process exceptions that might result from StackOverflow to add * tracing information while unwalking the stack. */ // Beware: Since this object is only used when handling a StackOverflow, this code // cannot consume significant amounts of stack. 
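// Standalone sketch (not the dotc classes) of the two ideas in the reworked
// TypeError above: the message is computed only when the error is handled, and
// the costly stack-trace capture is skipped unless a debug switch asks for it.
// `sketch.debug`, LazyError and MissingThing are made-up names.
val debugEnabled = sys.props.get("sketch.debug").contains("true")

abstract class LazyError extends Exception(""):
  // Computed lazily, when someone actually renders the error.
  def toMessage: String
  override def getMessage: String = toMessage

  // Throwable fills in the stack trace during construction; only pay for it
  // when debugging.
  override def fillInStackTrace(): Throwable =
    if debugEnabled then super.fillInStackTrace() else this

class MissingThing(name: String) extends LazyError:
  def toMessage = s"cannot resolve reference to $name"

@main def lazyErrorDemo(): Unit =
  try throw MissingThing("scala.Foo")
  catch case e: LazyError => println(e.getMessage)  // cannot resolve reference to scala.Foo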
-object handleRecursive { +object handleRecursive: + inline def underlyingStackOverflowOrNull(exc: Throwable): Throwable | Null = + var e: Throwable | Null = exc + while e != null && !e.isInstanceOf[StackOverflowError] do e = e.getCause + e + def apply(op: String, details: => String, exc: Throwable, weight: Int = 1)(using Context): Nothing = - if (ctx.settings.YnoDecodeStacktraces.value) + if ctx.settings.YnoDecodeStacktraces.value then throw exc - else - exc match { - case _: RecursionOverflow => - throw new RecursionOverflow(op, details, exc, weight) - case _ => - var e: Throwable | Null = exc - while (e != null && !e.isInstanceOf[StackOverflowError]) e = e.getCause - if (e != null) throw new RecursionOverflow(op, details, e, weight) - else throw exc - } -} + else exc match + case _: RecursionOverflow => + throw new RecursionOverflow(op, details, exc, weight) + case _ => + val so = underlyingStackOverflowOrNull(exc) + if so != null then throw new RecursionOverflow(op, details, so, weight) + else throw exc +end handleRecursive /** * This TypeError signals that completing denot encountered a cycle: it asked for denot.info (or similar), * so it requires knowing denot already. * @param denot */ -class CyclicReference private (val denot: SymDenotation) extends TypeError { +class CyclicReference private (val denot: SymDenotation)(using Context) extends TypeError: var inImplicitSearch: Boolean = false - override def produceMessage(using Context): Message = { + override def toMessage(using Context): Message = val cycleSym = denot.symbol // cycleSym.flags would try completing denot and would fail, but here we can use flagsUNSAFE to detect flags @@ -151,19 +171,16 @@ class CyclicReference private (val denot: SymDenotation) extends TypeError { CyclicReferenceInvolving(denot) errorMsg(ctx) - } -} + end toMessage -object CyclicReference { - def apply(denot: SymDenotation)(using Context): CyclicReference = { +object CyclicReference: + def apply(denot: SymDenotation)(using Context): CyclicReference = val ex = new CyclicReference(denot) - if (!(ctx.mode is Mode.CheckCyclic) || ctx.settings.Ydebug.value) { + if ex.computeStackTrace then cyclicErrors.println(s"Cyclic reference involving! 
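// The getCause walk factored out above as `underlyingStackOverflowOrNull`,
// reproduced (non-inline) as a self-contained snippet with a small usage
// example. The demo exceptions are illustrative.
def underlyingStackOverflowOrNull(exc: Throwable): Throwable | Null =
  var e: Throwable | Null = exc
  while e != null && !e.isInstanceOf[StackOverflowError] do e = e.getCause
  e

@main def stackOverflowWalkDemo(): Unit =
  val wrapped = new RuntimeException("outer", new StackOverflowError())
  println(underlyingStackOverflowOrNull(wrapped))               // the nested StackOverflowError
  println(underlyingStackOverflowOrNull(new RuntimeException))  // null: no overflow in the chain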
$denot") val sts = ex.getStackTrace.asInstanceOf[Array[StackTraceElement]] for (elem <- sts take 200) cyclicErrors.println(elem.toString) - } ex - } -} +end CyclicReference diff --git a/compiler/src/dotty/tools/dotc/core/TypeEval.scala b/compiler/src/dotty/tools/dotc/core/TypeEval.scala index 7ec0f12db3b6..b5684b07f181 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeEval.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeEval.scala @@ -91,7 +91,7 @@ object TypeEval: val result = try op catch case e: Throwable => - throw new TypeError(e.getMessage.nn) + throw TypeError(em"${e.getMessage.nn}") ConstantType(Constant(result)) def constantFold1[T](extractor: Type => Option[T], op: T => Any): Option[Type] = diff --git a/compiler/src/dotty/tools/dotc/core/TypeOps.scala b/compiler/src/dotty/tools/dotc/core/TypeOps.scala index c9b2d3334f47..c91412988e82 100644 --- a/compiler/src/dotty/tools/dotc/core/TypeOps.scala +++ b/compiler/src/dotty/tools/dotc/core/TypeOps.scala @@ -2,7 +2,7 @@ package dotty.tools package dotc package core -import Contexts._, Types._, Symbols._, Names._, Flags._ +import Contexts._, Types._, Symbols._, Names._, NameKinds.*, Flags._ import SymDenotations._ import util.Spans._ import util.Stats @@ -13,6 +13,7 @@ import ast.tpd._ import reporting.trace import config.Printers.typr import config.Feature +import transform.SymUtils.* import typer.ProtoTypes._ import typer.ForceDegree import typer.Inferencing._ @@ -186,7 +187,7 @@ object TypeOps: if (normed.exists) normed else mapOver case tp: MethodicType => // See documentation of `Types#simplified` - val addTypeVars = new TypeMap: + val addTypeVars = new TypeMap with IdempotentCaptRefMap: val constraint = ctx.typerState.constraint def apply(t: Type): Type = t match case t: TypeParamRef => constraint.typeVarOfParam(t).orElse(t) @@ -504,7 +505,7 @@ object TypeOps: override def derivedSelect(tp: NamedType, pre: Type) = if (pre eq tp.prefix) tp - else tryWiden(tp, tp.prefix).orElse { + else (if pre.isSingleton then NoType else tryWiden(tp, tp.prefix)).orElse { if (tp.isTerm && variance > 0 && !pre.isSingleton) apply(tp.info.widenExpr) else if (upper(pre).member(tp.name).exists) @@ -539,6 +540,18 @@ object TypeOps: val sym = tp.symbol forbidden.contains(sym) + /** We need to split the set into upper and lower approximations + * only if it contains a local element. The idea here is that at the + * time we perform an `avoid` all local elements are already accounted for + * and no further elements will be added afterwards. So we can just keep + * the set as it is. See comment by @linyxus on #16261. 
+ */ + override def needsRangeIfInvariant(refs: CaptureSet): Boolean = + refs.elems.exists { + case ref: TermRef => toAvoid(ref) + case _ => false + } + override def apply(tp: Type): Type = tp match case tp: TypeVar if mapCtx.typerState.constraint.contains(tp) => val lo = TypeComparer.instanceType( @@ -609,7 +622,7 @@ object TypeOps: boundss: List[TypeBounds], instantiate: (Type, List[Type]) => Type, app: Type)( - using Context): List[BoundsViolation] = withMode(Mode.CheckBounds) { + using Context): List[BoundsViolation] = withMode(Mode.CheckBoundsOrSelfType) { val argTypes = args.tpes /** Replace all wildcards in `tps` with `#` where `` is the @@ -674,8 +687,8 @@ object TypeOps: val bound1 = massage(bound) if (bound1 ne bound) { if (checkCtx eq ctx) checkCtx = ctx.fresh.setFreshGADTBounds - if (!checkCtx.gadt.contains(sym)) checkCtx.gadt.addToConstraint(sym) - checkCtx.gadt.addBound(sym, bound1, fromBelow) + if (!checkCtx.gadt.contains(sym)) checkCtx.gadtState.addToConstraint(sym) + checkCtx.gadtState.addBound(sym, bound1, fromBelow) typr.println("install GADT bound $bound1 for when checking F-bounded $sym") } } @@ -726,7 +739,7 @@ object TypeOps: * If the subtyping is true, the instantiated type `p.child[Vs]` is * returned. Otherwise, `NoType` is returned. */ - def refineUsingParent(parent: Type, child: Symbol)(using Context): Type = { + def refineUsingParent(parent: Type, child: Symbol, mixins: List[Type] = Nil)(using Context): Type = { // is a place holder from Scalac, it is hopeless to instantiate it. // // Quote from scalac (from nsc/symtab/classfile/Pickler.scala): @@ -741,7 +754,7 @@ object TypeOps: val childTp = if (child.isTerm) child.termRef else child.typeRef inContext(ctx.fresh.setExploreTyperState().setFreshGADTBounds.addMode(Mode.GadtConstraintInference)) { - instantiateToSubType(childTp, parent).dealias + instantiateToSubType(childTp, parent, mixins).dealias } } @@ -752,7 +765,7 @@ object TypeOps: * * Otherwise, return NoType. */ - private def instantiateToSubType(tp1: NamedType, tp2: Type)(using Context): Type = { + private def instantiateToSubType(tp1: NamedType, tp2: Type, mixins: List[Type])(using Context): Type = { // In order for a child type S to qualify as a valid subtype of the parent // T, we need to test whether it is possible S <: T. // @@ -826,22 +839,51 @@ object TypeOps: } } - // Prefix inference, replace `p.C.this.Child` with `X.Child` where `X <: p.C` - // Note: we need to strip ThisType in `p` recursively. + /** Gather GADT symbols and `ThisType`s found in `tp2`, ie. the scrutinee. */ + object TraverseTp2 extends TypeTraverser: + val thisTypes = util.HashSet[ThisType]() + val gadtSyms = new mutable.ListBuffer[Symbol] + + def traverse(tp: Type) = { + val tpd = tp.dealias + if tpd ne tp then traverse(tpd) + else tp match + case tp: ThisType if !tp.tref.symbol.isStaticOwner && !thisTypes.contains(tp) => + thisTypes += tp + traverseChildren(tp.tref) + case tp: TypeRef if tp.symbol.isAbstractOrParamType => + gadtSyms += tp.symbol + traverseChildren(tp) + case _ => + traverseChildren(tp) + } + TraverseTp2.traverse(tp2) + val thisTypes = TraverseTp2.thisTypes + val gadtSyms = TraverseTp2.gadtSyms.toList + + // Prefix inference, given `p.C.this.Child`: + // 1. return it as is, if `C.this` is found in `tp`, i.e. the scrutinee; or + // 2. replace it with `X.Child` where `X <: p.C`, stripping ThisType in `p` recursively. 
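// Toy model of the new TraverseTp2 pass used by instantiateToSubType: a single
// traversal of the scrutinee collects both the `this`-types to preserve and the
// abstract symbols that should seed the GADT constraint. The Node ADT and
// gather function are illustrative, not the compiler's Type/TypeTraverser.
import scala.collection.mutable

sealed trait Node
case class ThisRef(owner: String)        extends Node
case class AbstractRef(sym: String)      extends Node
case class Applied(children: List[Node]) extends Node

def gather(root: Node): (List[String], List[String]) =
  val thisOwners = mutable.LinkedHashSet.empty[String]  // deduplicated, order-preserving
  val gadtSyms   = mutable.ListBuffer.empty[String]
  def traverse(n: Node): Unit = n match
    case ThisRef(owner)    => thisOwners += owner
    case AbstractRef(sym)  => gadtSyms += sym
    case Applied(children) => children.foreach(traverse)
  traverse(root)
  (thisOwners.toList, gadtSyms.toList)

@main def gatherSketchDemo(): Unit =
  val scrutinee = Applied(List(ThisRef("Outer"), AbstractRef("T"),
    Applied(List(AbstractRef("U"), ThisRef("Outer")))))
  println(gather(scrutinee))  // (List(Outer),List(T, U))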
// - // See tests/patmat/i3938.scala + // See tests/patmat/i3938.scala, tests/pos/i15029.more.scala, tests/pos/i16785.scala class InferPrefixMap extends TypeMap { var prefixTVar: Type | Null = null def apply(tp: Type): Type = tp match { - case ThisType(tref: TypeRef) if !tref.symbol.isStaticOwner => - if (tref.symbol.is(Module)) - TermRef(this(tref.prefix), tref.symbol.sourceModule) + case tp @ ThisType(tref) if !tref.symbol.isStaticOwner => + val symbol = tref.symbol + if thisTypes.contains(tp) then + prefixTVar = tp // e.g. tests/pos/i16785.scala, keep Outer.this + prefixTVar.uncheckedNN + else if symbol.is(Module) then + TermRef(this(tref.prefix), symbol.sourceModule) else if (prefixTVar != null) this(tref) else { prefixTVar = WildcardType // prevent recursive call from assigning it - val tref2 = this(tref.applyIfParameterized(tref.typeParams.map(_ => TypeBounds.empty))) - prefixTVar = newTypeVar(TypeBounds.upper(tref2)) + // e.g. tests/pos/i15029.more.scala, create a TypeVar for `Instances`' B, so we can disregard `Ints` + val tvars = tref.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds, DepParamName.fresh(tparam.paramName)) } + val tref2 = this(tref.applyIfParameterized(tvars)) + prefixTVar = newTypeVar(TypeBounds.upper(tref2), DepParamName.fresh(tref.name)) prefixTVar.uncheckedNN } case tp => mapOver(tp) @@ -849,15 +891,11 @@ object TypeOps: } val inferThisMap = new InferPrefixMap - val tvars = tp1.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds) } + val tvars = tp1.typeParams.map { tparam => newTypeVar(tparam.paramInfo.bounds, DepParamName.fresh(tparam.paramName)) } val protoTp1 = inferThisMap.apply(tp1).appliedTo(tvars) - val getAbstractSymbols = new TypeAccumulator[List[Symbol]]: - def apply(xs: List[Symbol], tp: Type) = tp.dealias match - case tp: TypeRef if tp.symbol.exists && !tp.symbol.isClass => foldOver(tp.symbol :: xs, tp) - case tp => foldOver(xs, tp) - val syms2 = getAbstractSymbols(Nil, tp2).reverse - if syms2.nonEmpty then ctx.gadt.addToConstraint(syms2) + if gadtSyms.nonEmpty then + ctx.gadtState.addToConstraint(gadtSyms) // If parent contains a reference to an abstract type, then we should // refine subtype checking to eliminate abstract types according to @@ -869,10 +907,7 @@ object TypeOps: } def instantiate(): Type = { - // if there's a change in variance in type parameters (between subtype tp1 and supertype tp2) - // then we don't want to maximise the type variables in the wrong direction. 
- // For instance 15967, A[-Z] and B[Y] extends A[Y], we don't want to maximise Y to Any - maximizeType(protoTp1.baseType(tp2.classSymbol), NoSpan) + for tp <- mixins.reverseIterator do protoTp1 <:< tp maximizeType(protoTp1, NoSpan) wildApprox(protoTp1) } diff --git a/compiler/src/dotty/tools/dotc/core/Types.scala b/compiler/src/dotty/tools/dotc/core/Types.scala index 3243bb242a56..03fc7274beaa 100644 --- a/compiler/src/dotty/tools/dotc/core/Types.scala +++ b/compiler/src/dotty/tools/dotc/core/Types.scala @@ -118,10 +118,9 @@ object Types { if t.mightBeProvisional then t.mightBeProvisional = t match case t: TypeRef => - !t.currentSymbol.isStatic && { + t.currentSymbol.isProvisional || !t.currentSymbol.isStatic && { (t: Type).mightBeProvisional = false // break cycles - t.symbol.isProvisional - || test(t.prefix, theAcc) + test(t.prefix, theAcc) || t.denot.infoOrCompleter.match case info: LazyType => true case info: AliasingBounds => test(info.alias, theAcc) @@ -397,6 +396,10 @@ object Types { def isRepeatedParam(using Context): Boolean = typeSymbol eq defn.RepeatedParamClass + /** Is this a parameter type that allows implicit argument converson? */ + def isConvertibleParam(using Context): Boolean = + typeSymbol eq defn.IntoType + /** Is this the type of a method that has a repeated parameter type as * last parameter type? */ @@ -536,7 +539,7 @@ object Types { case tp: ClassInfo => tp.cls :: Nil case AndType(l, r) => - l.parentSymbols(include) | r.parentSymbols(include) + l.parentSymbols(include).setUnion(r.parentSymbols(include)) case OrType(l, r) => l.parentSymbols(include) intersect r.parentSymbols(include) // TODO does not conform to spec case _ => @@ -745,16 +748,6 @@ object Types { // which means that we always defensively copy the type in the future. This second // measure is necessary because findMember calls might be cached, so do not // necessarily appear in nested order. - // Without the defensive copy, Typer.scala fails to compile at the line - // - // untpd.rename(lhsCore, setterName).withType(setterType), WildcardType) - // - // because the subtype check - // - // ThisTree[Untyped]#ThisTree[Typed] <: Tree[Typed] - // - // fails (in fact it thinks the underlying type of the LHS is `Tree[Untyped]`.) - // // Without the `openedTwice` trick, Typer.scala fails to Ycheck // at phase resolveSuper. val rt = @@ -775,11 +768,11 @@ object Types { val rinfo = tp.refinedInfo if (name.isTypeName && !pinfo.isInstanceOf[ClassInfo]) { // simplified case that runs more efficiently val jointInfo = - if rinfo.isInstanceOf[TypeAlias] && !ctx.mode.is(Mode.CheckBounds) then + if rinfo.isInstanceOf[TypeAlias] && !ctx.mode.is(Mode.CheckBoundsOrSelfType) then // In normal situations, the only way to "improve" on rinfo is to return an empty type bounds // So, we do not lose anything essential in "widening" to rinfo. // We need to compute the precise info only when checking for empty bounds - // which is communicated by the CheckBounds mode. + // which is communicated by the CheckBoundsOrSelfType mode. rinfo else if ctx.base.pendingMemberSearches.contains(name) then pinfo safe_& rinfo @@ -1077,12 +1070,15 @@ object Types { * @param relaxedCheck if true type `Null` becomes a subtype of non-primitive value types in TypeComparer. * @param matchLoosely if true the types `=> T` and `()T` are seen as overriding each other. * @param checkClassInfo if true we check that ClassInfos are within bounds of abstract types + * + * @param isSubType a function used for checking subtype relationships. 
*/ - final def overrides(that: Type, relaxedCheck: Boolean, matchLoosely: => Boolean, checkClassInfo: Boolean = true)(using Context): Boolean = { + final def overrides(that: Type, relaxedCheck: Boolean, matchLoosely: => Boolean, checkClassInfo: Boolean = true, + isSubType: (Type, Type) => Context ?=> Boolean = (tp1, tp2) => tp1 frozen_<:< tp2)(using Context): Boolean = { val overrideCtx = if relaxedCheck then ctx.relaxedOverrideContext else ctx inContext(overrideCtx) { !checkClassInfo && this.isInstanceOf[ClassInfo] - || (this.widenExpr frozen_<:< that.widenExpr) + || isSubType(this.widenExpr, that.widenExpr) || matchLoosely && { val this1 = this.widenNullaryMethod val that1 = that.widenNullaryMethod @@ -1287,11 +1283,14 @@ object Types { * then the top-level union isn't widened. This is needed so that type inference can infer nullable types. */ def widenUnion(using Context): Type = widen match - case tp @ OrNull(tp1): OrType => - // Don't widen `T|Null`, since otherwise we wouldn't be able to infer nullable unions. - val tp1Widen = tp1.widenUnionWithoutNull - if (tp1Widen.isRef(defn.AnyClass)) tp1Widen - else tp.derivedOrType(tp1Widen, defn.NullType) + case tp: OrType => tp match + case OrNull(tp1) => + // Don't widen `T|Null`, since otherwise we wouldn't be able to infer nullable unions. + val tp1Widen = tp1.widenUnionWithoutNull + if (tp1Widen.isRef(defn.AnyClass)) tp1Widen + else tp.derivedOrType(tp1Widen, defn.NullType) + case _ => + tp.widenUnionWithoutNull case tp => tp.widenUnionWithoutNull @@ -1871,6 +1870,11 @@ object Types { def dropRepeatedAnnot(using Context): Type = dropAnnot(defn.RepeatedAnnot) + /** A translation from types of original parameter ValDefs to the types + * of parameters in MethodTypes. + * Translates `Seq[T] @repeated` or `Array[T] @repeated` to `[T]`. + * That way, repeated arguments are made manifest without risk of dropped annotations. + */ def annotatedToRepeated(using Context): Type = this match { case tp @ ExprType(tp1) => tp.derivedExprType(tp1.annotatedToRepeated) @@ -2186,7 +2190,7 @@ object Types { def designator: Designator protected def designator_=(d: Designator): Unit - assert(prefix.isValueType || (prefix eq NoPrefix), s"invalid prefix $prefix") + assert(NamedType.validPrefix(prefix), s"invalid prefix $prefix") private var myName: Name | Null = null private var lastDenotation: Denotation | Null = null @@ -2261,15 +2265,17 @@ object Types { final def symbol(using Context): Symbol = // We can rely on checkedPeriod (unlike in the definition of `denot` below) // because SymDenotation#installAfter never changes the symbol - if (checkedPeriod == ctx.period) lastSymbol.nn else computeSymbol + if (checkedPeriod.code == ctx.period.code) lastSymbol.asInstanceOf[Symbol] + else computeSymbol private def computeSymbol(using Context): Symbol = - designator match { + val result = designator match case sym: Symbol => if (sym.isValidInCurrentRun) sym else denot.symbol case name => - (if (denotationIsCurrent) lastDenotation.nn else denot).symbol - } + (if (denotationIsCurrent) lastDenotation.asInstanceOf[Denotation] else denot).symbol + if checkedPeriod.code != NowhereCode then checkedPeriod = ctx.period + result /** There is a denotation computed which is valid (somewhere in) the * current run. @@ -2301,18 +2307,16 @@ object Types { def info(using Context): Type = denot.info - /** The denotation currently denoted by this type */ - final def denot(using Context): Denotation = { + /** The denotation currently denoted by this type. Extremely hot. 
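// Toy model (not dotc's Types) of the widenUnion change above: a union of the
// form `T | Null` keeps its `| Null` part and only widens T, so nullable unions
// stay inferable, while any other union is widened as before. Singleton/IntTy
// stand in for singleton types and their widened form; lub computation is omitted.
sealed trait Ty
case object NullTy               extends Ty
case object AnyTy                extends Ty
case object IntTy                extends Ty
case class Singleton(value: Int) extends Ty
case class Or(a: Ty, b: Ty)      extends Ty

def widenSingletons(ty: Ty): Ty = ty match
  case Singleton(_) => IntTy
  case Or(a, b)     => Or(widenSingletons(a), widenSingletons(b))
  case other        => other

def widenUnionSketch(ty: Ty): Ty = ty match
  case Or(lhs, NullTy) =>                    // the OrNull case: keep `| Null`
    val w = widenSingletons(lhs)
    if w == AnyTy then w else Or(w, NullTy)
  case other => widenSingletons(other)

@main def widenUnionDemo(): Unit =
  println(widenUnionSketch(Or(Singleton(1), NullTy)))        // Or(IntTy,NullTy)
  println(widenUnionSketch(Or(Singleton(1), Singleton(2))))  // Or(IntTy,IntTy)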
Carefully optimized + * to be as small as possible. + */ + final def denot(using Context): Denotation = util.Stats.record("NamedType.denot") - val now = ctx.period + val lastd = lastDenotation.asInstanceOf[Denotation] // Even if checkedPeriod == now we still need to recheck lastDenotation.validFor // as it may have been mutated by SymDenotation#installAfter - if (checkedPeriod != Nowhere && lastDenotation.nn.validFor.contains(now)) { - checkedPeriod = now - lastDenotation.nn - } + if checkedPeriod.code != NowhereCode && lastd.validFor.contains(ctx.period) then lastd else computeDenot - } private def computeDenot(using Context): Denotation = { util.Stats.record("NamedType.computeDenot") @@ -2348,10 +2352,11 @@ object Types { lastDenotation match { case lastd0: SingleDenotation => val lastd = lastd0.skipRemoved - if (lastd.validFor.runId == ctx.runId && (checkedPeriod != Nowhere)) finish(lastd.current) + if lastd.validFor.runId == ctx.runId && checkedPeriod.code != NowhereCode then + finish(lastd.current) else lastd match { case lastd: SymDenotation => - if (stillValid(lastd) && (checkedPeriod != Nowhere)) finish(lastd.current) + if stillValid(lastd) && checkedPeriod.code != NowhereCode then finish(lastd.current) else finish(memberDenot(lastd.initial.name, allowPrivate = false)) case _ => fromDesignator @@ -2420,12 +2425,12 @@ object Types { } else { if (!ctx.reporter.errorsReported) - throw new TypeError( - i"""bad parameter reference $this at ${ctx.phase} - |the parameter is ${param.showLocated} but the prefix $prefix - |does not define any corresponding arguments. - |idx = $idx, args = $args%, %, - |constraint = ${ctx.typerState.constraint}""") + throw TypeError( + em"""bad parameter reference $this at ${ctx.phase} + |the parameter is ${param.showLocated} but the prefix $prefix + |does not define any corresponding arguments. + |idx = $idx, args = $args%, %, + |constraint = ${ctx.typerState.constraint}""") NoDenotation } } @@ -2437,9 +2442,8 @@ object Types { setDenot(memberDenot(name, allowPrivate = !symbol.exists || symbol.is(Private))) private def setDenot(denot: Denotation)(using Context): Unit = { - if (Config.checkNoDoubleBindings) - if (ctx.settings.YnoDoubleBindings.value) - checkSymAssign(denot.symbol) + if ctx.base.checkNoDoubleBindings then + checkSymAssign(denot.symbol) lastDenotation = denot lastSymbol = denot.symbol @@ -2453,6 +2457,8 @@ object Types { } private def checkDenot()(using Context) = {} + //if name.toString == "getConstructor" then + // println(i"set denot of $this to ${denot.info}, ${denot.getClass}, ${Phases.phaseOf(denot.validFor.lastPhaseId)} at ${ctx.phase}") private def checkSymAssign(sym: Symbol)(using Context) = { def selfTypeOf(sym: Symbol) = @@ -2693,7 +2699,7 @@ object Types { this /** A reference like this one, but with the given prefix. 
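// Standalone sketch of the caching discipline on the hot NamedType.denot path:
// a cached value is reused only while the current period lies inside its
// validity interval; otherwise it is recomputed and the checked period updated.
// Periods and validity are plain Ints/Ranges here, not dotc's packed Period codes.
final class PeriodCache[A](compute: Int => (A, Range)):
  private var checkedPeriod: Int = Int.MinValue   // plays the role of NowhereCode
  private var last: A = null.asInstanceOf[A]
  private var validFor: Range = 0 until 0

  def get(currentPeriod: Int): A =
    if checkedPeriod != Int.MinValue && validFor.contains(currentPeriod) then last
    else
      val (value, valid) = compute(currentPeriod)
      last = value
      validFor = valid
      checkedPeriod = currentPeriod
      value

@main def periodCacheDemo(): Unit =
  var computations = 0
  val cache = PeriodCache[String] { period =>
    computations += 1
    (s"denot@$period", period until period + 3)   // pretend it stays valid for 3 periods
  }
  cache.get(10); cache.get(11); cache.get(12); cache.get(13)
  println(computations)  // 2: recomputed only when period 13 fell outside 10..12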
*/ - final def withPrefix(prefix: Type)(using Context): NamedType = { + final def withPrefix(prefix: Type)(using Context): Type = { def reload(): NamedType = { val lastSym = lastSymbol.nn val allowPrivate = !lastSym.exists || lastSym.is(Private) @@ -2706,6 +2712,7 @@ object Types { NamedType(prefix, name, d) } if (prefix eq this.prefix) this + else if !NamedType.validPrefix(prefix) then UnspecifiedErrorType else if (lastDenotation == null) NamedType(prefix, designator) else designator match { case sym: Symbol => @@ -2897,6 +2904,8 @@ object Types { def apply(prefix: Type, designator: Name, denot: Denotation)(using Context): NamedType = if (designator.isTermName) TermRef.apply(prefix, designator.asTermName, denot) else TypeRef.apply(prefix, designator.asTypeName, denot) + + def validPrefix(prefix: Type): Boolean = prefix.isValueType || (prefix eq NoPrefix) } object TermRef { @@ -3311,11 +3320,11 @@ object Types { final class CachedAndType(tp1: Type, tp2: Type) extends AndType(tp1, tp2) object AndType { - def apply(tp1: Type, tp2: Type)(using Context): AndType = { - assert(tp1.isValueTypeOrWildcard && - tp2.isValueTypeOrWildcard, i"$tp1 & $tp2 / " + s"$tp1 & $tp2") + def apply(tp1: Type, tp2: Type)(using Context): AndType = + def where = i"in intersection $tp1 & $tp2" + expectValueTypeOrWildcard(tp1, where) + expectValueTypeOrWildcard(tp2, where) unchecked(tp1, tp2) - } def balanced(tp1: Type, tp2: Type)(using Context): AndType = tp1 match @@ -3355,7 +3364,7 @@ object Types { TypeComparer.liftIfHK(tp1, tp2, AndType.make(_, _, checkValid = false), makeHk, _ | _) } - abstract case class OrType(tp1: Type, tp2: Type) extends AndOrType { + abstract case class OrType protected(tp1: Type, tp2: Type) extends AndOrType { def isAnd: Boolean = false def isSoft: Boolean private var myBaseClassesPeriod: Period = Nowhere @@ -3388,9 +3397,6 @@ object Types { myFactorCount else 1 - assert(tp1.isValueTypeOrWildcard && - tp2.isValueTypeOrWildcard, s"$tp1 $tp2") - private var myJoin: Type = _ private var myJoinPeriod: Period = Nowhere @@ -3423,25 +3429,29 @@ object Types { private var myAtoms: Atoms = _ private var myWidened: Type = _ + private def computeAtoms()(using Context): Atoms = + if tp1.hasClassSymbol(defn.NothingClass) then tp2.atoms + else if tp2.hasClassSymbol(defn.NothingClass) then tp1.atoms + else tp1.atoms | tp2.atoms + + private def computeWidenSingletons()(using Context): Type = + val tp1w = tp1.widenSingletons + val tp2w = tp2.widenSingletons + if ((tp1 eq tp1w) && (tp2 eq tp2w)) this else TypeComparer.lub(tp1w, tp2w, isSoft = isSoft) + private def ensureAtomsComputed()(using Context): Unit = if atomsRunId != ctx.runId then - myAtoms = - if tp1.hasClassSymbol(defn.NothingClass) then tp2.atoms - else if tp2.hasClassSymbol(defn.NothingClass) then tp1.atoms - else tp1.atoms | tp2.atoms - val tp1w = tp1.widenSingletons - val tp2w = tp2.widenSingletons - myWidened = if ((tp1 eq tp1w) && (tp2 eq tp2w)) this else TypeComparer.lub(tp1w, tp2w, isSoft = isSoft) - atomsRunId = ctx.runId + myAtoms = computeAtoms() + myWidened = computeWidenSingletons() + if !isProvisional then atomsRunId = ctx.runId override def atoms(using Context): Atoms = ensureAtomsComputed() myAtoms - override def widenSingletons(using Context): Type = { + override def widenSingletons(using Context): Type = ensureAtomsComputed() myWidened - } def derivedOrType(tp1: Type, tp2: Type, soft: Boolean = isSoft)(using Context): Type = if ((tp1 eq this.tp1) && (tp2 eq this.tp2) && soft == isSoft) this @@ -3461,6 +3471,9 @@ object Types { 
object OrType { def apply(tp1: Type, tp2: Type, soft: Boolean)(using Context): OrType = { + def where = i"in union $tp1 | $tp2" + expectValueTypeOrWildcard(tp1, where) + expectValueTypeOrWildcard(tp2, where) assertUnerased() unique(new CachedOrType(tp1, tp2, soft)) } @@ -3491,6 +3504,11 @@ object Types { TypeComparer.liftIfHK(tp1, tp2, OrType(_, _, soft = true), makeHk, _ & _) } + def expectValueTypeOrWildcard(tp: Type, where: => String)(using Context): Unit = + if !tp.isValueTypeOrWildcard then + assert(!ctx.isAfterTyper, where) // we check correct kinds at PostTyper + throw TypeError(em"$tp is not a value type, cannot be used $where") + /** An extractor object to pattern match against a nullable union. * e.g. * @@ -3948,27 +3966,48 @@ object Types { * and inline parameters: * - replace @repeated annotations on Seq or Array types by types * - add @inlineParam to inline parameters + * - add @erasedParam to erased parameters + * - wrap types of parameters that have an @allowConversions annotation with Into[_] */ - def fromSymbols(params: List[Symbol], resultType: Type)(using Context): MethodType = { - def translateInline(tp: Type): Type = tp match { - case ExprType(resType) => ExprType(AnnotatedType(resType, Annotation(defn.InlineParamAnnot))) - case _ => AnnotatedType(tp, Annotation(defn.InlineParamAnnot)) - } - def translateErased(tp: Type): Type = tp match { - case ExprType(resType) => ExprType(AnnotatedType(resType, Annotation(defn.ErasedParamAnnot))) - case _ => AnnotatedType(tp, Annotation(defn.ErasedParamAnnot)) - } - def paramInfo(param: Symbol) = { + def fromSymbols(params: List[Symbol], resultType: Type)(using Context): MethodType = + def addAnnotation(tp: Type, cls: ClassSymbol, param: Symbol): Type = tp match + case ExprType(resType) => ExprType(addAnnotation(resType, cls, param)) + case _ => AnnotatedType(tp, Annotation(cls, param.span)) + + def wrapConvertible(tp: Type) = + AppliedType(defn.IntoType.typeRef, tp :: Nil) + + /** Add `Into[..] to the type itself and if it is a function type, to all its + * curried result type(s) as well. 
+ */ + def addInto(tp: Type): Type = tp match + case tp @ AppliedType(tycon, args) if tycon.typeSymbol == defn.RepeatedParamClass => + tp.derivedAppliedType(tycon, addInto(args.head) :: Nil) + case tp @ AppliedType(tycon, args) if defn.isFunctionType(tp) => + wrapConvertible(tp.derivedAppliedType(tycon, args.init :+ addInto(args.last))) + case tp @ RefinedType(parent, rname, rinfo) if defn.isFunctionOrPolyType(tp) => + wrapConvertible(tp.derivedRefinedType(parent, rname, addInto(rinfo))) + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = addInto(tp.resType)) + case ExprType(resType) => + ExprType(addInto(resType)) + case _ => + wrapConvertible(tp) + + def paramInfo(param: Symbol) = var paramType = param.info.annotatedToRepeated - if (param.is(Inline)) paramType = translateInline(paramType) - if (param.is(Erased)) paramType = translateErased(paramType) + if param.is(Inline) then + paramType = addAnnotation(paramType, defn.InlineParamAnnot, param) + if param.is(Erased) then + paramType = addAnnotation(paramType, defn.ErasedParamAnnot, param) + if param.hasAnnotation(defn.AllowConversionsAnnot) then + paramType = addInto(paramType) paramType - } apply(params.map(_.name.asTermName))( tl => params.map(p => tl.integrate(params, paramInfo(p))), tl => tl.integrate(params, resultType)) - } + end fromSymbols final def apply(paramNames: List[TermName])(paramInfosExp: MethodType => List[Type], resultTypeExp: MethodType => Type)(using Context): MethodType = checkValid(unique(new CachedMethodType(paramNames)(paramInfosExp, resultTypeExp, self))) @@ -5291,7 +5330,12 @@ object Types { abstract class FlexType extends UncachedGroundType with ValueType abstract class ErrorType extends FlexType { + + /** An explanation of the cause of the failure */ def msg(using Context): Message + + /** An explanation of the cause of the failure as a string */ + def explanation(using Context): String = msg.message } object ErrorType: @@ -5299,18 +5343,16 @@ object Types { val et = new PreviousErrorType ctx.base.errorTypeMsg(et) = m et - def apply(s: => String)(using Context): ErrorType = - apply(s.toMessage) end ErrorType class PreviousErrorType extends ErrorType: def msg(using Context): Message = ctx.base.errorTypeMsg.get(this) match case Some(m) => m - case None => "error message from previous run no longer available".toMessage + case None => em"error message from previous run no longer available" object UnspecifiedErrorType extends ErrorType { - override def msg(using Context): Message = "unspecified error".toMessage + override def msg(using Context): Message = em"unspecified error" } /* Type used to track Select nodes that could not resolve a member and their qualifier is a scala.Dynamic. */ @@ -5497,6 +5539,14 @@ object Types { stop == StopAt.Static && tp.currentSymbol.isStatic && isStaticPrefix(tp.prefix) || stop == StopAt.Package && tp.currentSymbol.is(Package) } + + /** The type parameters of the constructor of this applied type. + * Overridden in OrderingConstraint's ConstraintAwareTraversal to take account + * of instantiations in the constraint that are not yet propagated to the + * instance types of type variables. + */ + protected def tyconTypeParams(tp: AppliedType)(using Context): List[ParamInfo] = + tp.tyconTypeParams end VariantTraversal /** A supertrait for some typemaps that are bijections. Used for capture checking. 
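// Toy model of the `addInto` wrapping performed by fromSymbols above for
// parameters that allow implicit argument conversions: the parameter type is
// wrapped in an Into marker, and for function types the wrapping is also pushed
// into each curried result. The ParamTp encoding is illustrative; the real code
// additionally handles repeated params, methodic types and by-name types.
sealed trait ParamTp
case class Ref(name: String)                          extends ParamTp
case class Func(params: List[ParamTp], res: ParamTp)  extends ParamTp
case class Into(tp: ParamTp)                          extends ParamTp

def addIntoSketch(tp: ParamTp): ParamTp = tp match
  case Func(params, res) => Into(Func(params, addIntoSketch(res)))  // wrap result, then the whole
  case other             => Into(other)

@main def addIntoDemo(): Unit =
  println(addIntoSketch(Ref("Text")))
  // Into(Ref(Text))
  println(addIntoSketch(Func(List(Ref("Int")), Func(List(Ref("Int")), Ref("Text")))))
  // the curried function gets Into on the outside and around each result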
@@ -5604,17 +5654,11 @@ object Types { case tp: NamedType => if stopBecauseStaticOrLocal(tp) then tp else - val prefix1 = atVariance(variance max 0)(this(tp.prefix)) - // A prefix is never contravariant. Even if say `p.A` is used in a contravariant - // context, we cannot assume contravariance for `p` because `p`'s lower - // bound might not have a binding for `A` (e.g. the lower bound could be `Nothing`). - // By contrast, covariance does translate to the prefix, since we have that - // if `p <: q` then `p.A <: q.A`, and well-formedness requires that `A` is a member - // of `p`'s upper bound. + val prefix1 = atVariance(variance max 0)(this(tp.prefix)) // see comment of TypeAccumulator's applyToPrefix derivedSelect(tp, prefix1) case tp: AppliedType => - derivedAppliedType(tp, this(tp.tycon), mapArgs(tp.args, tp.tyconTypeParams)) + derivedAppliedType(tp, this(tp.tycon), mapArgs(tp.args, tyconTypeParams(tp))) case tp: LambdaType => mapOverLambda(tp) @@ -5941,7 +5985,7 @@ object Types { case nil => true } - if (distributeArgs(args, tp.tyconTypeParams)) + if (distributeArgs(args, tyconTypeParams(tp))) range(tp.derivedAppliedType(tycon, loBuf.toList), tp.derivedAppliedType(tycon, hiBuf.toList)) else if tycon.isLambdaSub || args.exists(isRangeOfNonTermTypes) then @@ -6025,8 +6069,11 @@ object Types { tp.derivedLambdaType(tp.paramNames, formals, restpe) } + /** Overridden in TypeOps.avoid */ + protected def needsRangeIfInvariant(refs: CaptureSet): Boolean = true + override def mapCapturingType(tp: Type, parent: Type, refs: CaptureSet, v: Int): Type = - if v == 0 then + if v == 0 && needsRangeIfInvariant(refs) then range(mapCapturingType(tp, parent, refs, -1), mapCapturingType(tp, parent, refs, 1)) else super.mapCapturingType(tp, parent, refs, v) @@ -6037,14 +6084,10 @@ object Types { /** A range of possible types between lower bound `lo` and upper bound `hi`. * Only used internally in `ApproximatingTypeMap`. */ - case class Range(lo: Type, hi: Type) extends UncachedGroundType { + case class Range(lo: Type, hi: Type) extends UncachedGroundType: assert(!lo.isInstanceOf[Range]) assert(!hi.isInstanceOf[Range]) - override def toText(printer: Printer): Text = - lo.toText(printer) ~ ".." ~ hi.toText(printer) - } - /** Approximate wildcards by their bounds */ class AvoidWildcardsMap(using Context) extends ApproximatingTypeMap: protected def mapWild(t: WildcardType) = @@ -6063,8 +6106,17 @@ object Types { protected def applyToAnnot(x: T, annot: Annotation): T = x // don't go into annotations - protected final def applyToPrefix(x: T, tp: NamedType): T = - atVariance(variance max 0)(this(x, tp.prefix)) // see remark on NamedType case in TypeMap + /** A prefix is never contravariant. Even if say `p.A` is used in a contravariant + * context, we cannot assume contravariance for `p` because `p`'s lower + * bound might not have a binding for `A`, since the lower bound could be `Nothing`. + * By contrast, covariance does translate to the prefix, since we have that + * if `p <: q` then `p.A <: q.A`, and well-formedness requires that `A` is a member + * of `p`'s upper bound. + * Overridden in OrderingConstraint's ConstraintAwareTraversal, where a + * more relaxed scheme is used. 
+ */ + protected def applyToPrefix(x: T, tp: NamedType): T = + atVariance(variance max 0)(this(x, tp.prefix)) def foldOver(x: T, tp: Type): T = { record(s"foldOver $getClass") @@ -6087,7 +6139,7 @@ object Types { } foldArgs(acc, tparams.tail, args.tail) } - foldArgs(this(x, tycon), tp.tyconTypeParams, args) + foldArgs(this(x, tycon), tyconTypeParams(tp), args) case _: BoundType | _: ThisType => x diff --git a/compiler/src/dotty/tools/dotc/core/classfile/ClassfileConstants.scala b/compiler/src/dotty/tools/dotc/core/classfile/ClassfileConstants.scala index 3b05ee351b86..4aa60d973264 100644 --- a/compiler/src/dotty/tools/dotc/core/classfile/ClassfileConstants.scala +++ b/compiler/src/dotty/tools/dotc/core/classfile/ClassfileConstants.scala @@ -346,6 +346,7 @@ object ClassfileConstants { case JAVA_ACC_ENUM => Enum case JAVA_ACC_ABSTRACT => if (isClass) Abstract else Deferred case JAVA_ACC_INTERFACE => PureInterfaceCreationFlags | JavaDefined + case JAVA_ACC_ANNOTATION => JavaAnnotation case _ => EmptyFlags } @@ -353,18 +354,16 @@ object ClassfileConstants { if (jflag == 0) base else base | translateFlag(jflag) private def translateFlags(jflags: Int, baseFlags: FlagSet): FlagSet = { - val nflags = - if ((jflags & JAVA_ACC_ANNOTATION) == 0) jflags - else jflags & ~(JAVA_ACC_ABSTRACT | JAVA_ACC_INTERFACE) // annotations are neither abstract nor interfaces var res: FlagSet = baseFlags | JavaDefined - res = addFlag(res, nflags & JAVA_ACC_PRIVATE) - res = addFlag(res, nflags & JAVA_ACC_PROTECTED) - res = addFlag(res, nflags & JAVA_ACC_FINAL) - res = addFlag(res, nflags & JAVA_ACC_SYNTHETIC) - res = addFlag(res, nflags & JAVA_ACC_STATIC) - res = addFlag(res, nflags & JAVA_ACC_ENUM) - res = addFlag(res, nflags & JAVA_ACC_ABSTRACT) - res = addFlag(res, nflags & JAVA_ACC_INTERFACE) + res = addFlag(res, jflags & JAVA_ACC_PRIVATE) + res = addFlag(res, jflags & JAVA_ACC_PROTECTED) + res = addFlag(res, jflags & JAVA_ACC_FINAL) + res = addFlag(res, jflags & JAVA_ACC_SYNTHETIC) + res = addFlag(res, jflags & JAVA_ACC_STATIC) + res = addFlag(res, jflags & JAVA_ACC_ENUM) + res = addFlag(res, jflags & JAVA_ACC_ABSTRACT) + res = addFlag(res, jflags & JAVA_ACC_INTERFACE) + res = addFlag(res, jflags & JAVA_ACC_ANNOTATION) res } diff --git a/compiler/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala b/compiler/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala index 4763cd25ff41..7702e6a93446 100644 --- a/compiler/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala +++ b/compiler/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala @@ -165,11 +165,7 @@ class ClassfileParser( * Updates the read pointer of 'in'. */ def parseParents: List[Type] = { val superType = - if (isAnnotation) { - in.nextChar - defn.AnnotationClass.typeRef - } - else if (classRoot.symbol == defn.ComparableClass || + if (classRoot.symbol == defn.ComparableClass || classRoot.symbol == defn.JavaCloneableClass || classRoot.symbol == defn.JavaSerializableClass) { // Treat these interfaces as universal traits @@ -186,7 +182,6 @@ class ClassfileParser( // Consequently, no best implicit for the "Integral" evidence parameter of "range" // is found. Previously, this worked because of weak conformance, which has been dropped. - if (isAnnotation) ifaces = defn.ClassfileAnnotationClass.typeRef :: ifaces superType :: ifaces } @@ -331,7 +326,7 @@ class ClassfileParser( if (isEnum) { val enumClass = sym.owner.linkedClass if (!enumClass.exists) - report.warning(s"no linked class for java enum $sym in ${sym.owner}. 
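// Standalone sketch of the classfile flag translation after this change: every
// relevant access bit, including ACC_ANNOTATION, is translated on its own
// instead of being masked away up front. The bit values are the standard JVM
// ones; the flag names on the right are illustrative stand-ins for dotc flags.
object FlagSketch:
  val AccPrivate    = 0x0002
  val AccProtected  = 0x0004
  val AccStatic     = 0x0008
  val AccFinal      = 0x0010
  val AccInterface  = 0x0200
  val AccAbstract   = 0x0400
  val AccAnnotation = 0x2000
  val AccEnum       = 0x4000

  val table: List[(Int, String)] = List(
    AccPrivate -> "Private", AccProtected -> "Protected", AccFinal -> "Final",
    AccStatic -> "Static", AccEnum -> "Enum", AccAbstract -> "Abstract",
    AccInterface -> "Interface", AccAnnotation -> "JavaAnnotation")

  def translate(jflags: Int): Set[String] =
    table.collect { case (bit, flag) if (jflags & bit) != 0 => flag }.toSet

@main def flagSketchDemo(): Unit =
  // A Java annotation is compiled with ACC_INTERFACE | ACC_ABSTRACT | ACC_ANNOTATION.
  println(FlagSketch.translate(0x0200 | 0x0400 | 0x2000))
  // Set(Abstract, Interface, JavaAnnotation)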
A referencing class file might be missing an InnerClasses entry.") + report.warning(em"no linked class for java enum $sym in ${sym.owner}. A referencing class file might be missing an InnerClasses entry.") else { if (!enumClass.is(Flags.Sealed)) enumClass.setFlag(Flags.AbstractSealed) enumClass.addAnnotation(Annotation.Child(sym, NoSpan)) @@ -661,7 +656,7 @@ class ClassfileParser( case tp: TypeRef if tp.denot.infoOrCompleter.isInstanceOf[StubInfo] => // Silently ignore missing annotation classes like javac if ctx.debug then - report.warning(i"Error while parsing annotations in ${classfile}: annotation class $tp not present on classpath") + report.warning(em"Error while parsing annotations in ${classfile}: annotation class $tp not present on classpath") None case _ => if (hasError || skip) None @@ -676,7 +671,7 @@ class ClassfileParser( // the classpath would *not* end up here. A class not found is signaled // with a `FatalError` exception, handled above. Here you'd end up after a NPE (for example), // and that should never be swallowed silently. - report.warning("Caught: " + ex + " while parsing annotations in " + classfile) + report.warning(em"Caught: $ex while parsing annotations in $classfile") if (ctx.debug) ex.printStackTrace() None // ignore malformed annotations @@ -758,7 +753,7 @@ class ClassfileParser( case tpnme.ConstantValueATTR => val c = pool.getConstant(in.nextChar) if (c ne null) res.constant = c - else report.warning(s"Invalid constant in attribute of ${sym.showLocated} while parsing ${classfile}") + else report.warning(em"Invalid constant in attribute of ${sym.showLocated} while parsing ${classfile}") case tpnme.MethodParametersATTR => val paramCount = in.nextByte @@ -769,7 +764,7 @@ class ClassfileParser( res.namedParams += (i -> name.name) case tpnme.AnnotationDefaultATTR => - sym.addAnnotation(Annotation(defn.AnnotationDefaultAnnot, Nil)) + sym.addAnnotation(Annotation(defn.AnnotationDefaultAnnot, Nil, sym.span)) // Java annotations on classes / methods / fields with RetentionPolicy.RUNTIME case tpnme.RuntimeVisibleAnnotationATTR @@ -845,7 +840,7 @@ class ClassfileParser( class AnnotConstructorCompleter(classInfo: TempClassInfoType) extends LazyType { def complete(denot: SymDenotation)(using Context): Unit = { - val attrs = classInfo.decls.toList.filter(sym => sym.isTerm && sym != denot.symbol) + val attrs = classInfo.decls.toList.filter(sym => sym.isTerm && sym != denot.symbol && sym.name != nme.CONSTRUCTOR) val paramNames = attrs.map(_.name.asTermName) val paramTypes = attrs.map(_.info.resultType) denot.info = MethodType(paramNames, paramTypes, classRoot.typeRef) @@ -972,7 +967,7 @@ class ClassfileParser( } } else { - report.error(s"Could not find $path in ${classfile.underlyingSource}") + report.error(em"Could not find $path in ${classfile.underlyingSource}") Array.empty } case _ => @@ -980,7 +975,7 @@ class ClassfileParser( val name = classfile.name.stripSuffix(".class") + ".tasty" val tastyFileOrNull = dir.lookupName(name, false) if (tastyFileOrNull == null) { - report.error(s"Could not find TASTY file $name under $dir") + report.error(em"Could not find TASTY file $name under $dir") Array.empty } else tastyFileOrNull.toByteArray diff --git a/compiler/src/dotty/tools/dotc/core/tasty/CommentPickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/CommentPickler.scala index df3e4df497f8..fde6c669045d 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/CommentPickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/CommentPickler.scala @@ -9,36 +9,43 @@ import 
dotty.tools.tasty.TastyFormat.CommentsSection import java.nio.charset.StandardCharsets -class CommentPickler(pickler: TastyPickler, addrOfTree: tpd.Tree => Addr, docString: untpd.MemberDef => Option[Comment]): - private val buf = new TastyBuffer(5000) - pickler.newSection(CommentsSection, buf) - - def pickleComment(root: tpd.Tree): Unit = traverse(root) - - private def pickleComment(addr: Addr, comment: Comment): Unit = - if addr != NoAddr then - val bytes = comment.raw.getBytes(StandardCharsets.UTF_8).nn - val length = bytes.length - buf.writeAddr(addr) - buf.writeNat(length) - buf.writeBytes(bytes, length) - buf.writeLongInt(comment.span.coords) - - private def traverse(x: Any): Unit = x match - case x: untpd.Tree @unchecked => - x match - case x: tpd.MemberDef @unchecked => // at this point all MembderDefs are t(y)p(e)d. - for comment <- docString(x) do pickleComment(addrOfTree(x), comment) - case _ => - val limit = x.productArity - var n = 0 - while n < limit do - traverse(x.productElement(n)) - n += 1 - case y :: ys => - traverse(y) - traverse(ys) - case _ => - +object CommentPickler: + + def pickleComments( + pickler: TastyPickler, + addrOfTree: PositionPickler.TreeToAddr, + docString: untpd.MemberDef => Option[Comment], + root: tpd.Tree, + buf: TastyBuffer = new TastyBuffer(5000)): Unit = + + pickler.newSection(CommentsSection, buf) + + def pickleComment(addr: Addr, comment: Comment): Unit = + if addr != NoAddr then + val bytes = comment.raw.getBytes(StandardCharsets.UTF_8).nn + val length = bytes.length + buf.writeAddr(addr) + buf.writeNat(length) + buf.writeBytes(bytes, length) + buf.writeLongInt(comment.span.coords) + + def traverse(x: Any): Unit = x match + case x: untpd.Tree @unchecked => + x match + case x: tpd.MemberDef @unchecked => // at this point all MembderDefs are t(y)p(e)d. 
+ for comment <- docString(x) do pickleComment(addrOfTree(x), comment) + case _ => + val limit = x.productArity + var n = 0 + while n < limit do + traverse(x.productElement(n)) + n += 1 + case y :: ys => + traverse(y) + traverse(ys) + case _ => + + traverse(root) + end pickleComments end CommentPickler diff --git a/compiler/src/dotty/tools/dotc/core/tasty/NameBuffer.scala b/compiler/src/dotty/tools/dotc/core/tasty/NameBuffer.scala index 623508780325..1ddcf9afe1dc 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/NameBuffer.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/NameBuffer.scala @@ -49,9 +49,12 @@ class NameBuffer extends TastyBuffer(10000) { } } - private def withLength(op: => Unit, lengthWidth: Int = 1): Unit = { + private inline def withLength(inline op: Unit, lengthWidth: Int = 1): Unit = { val lengthAddr = currentAddr - for (i <- 0 until lengthWidth) writeByte(0) + var i = 0 + while i < lengthWidth do + writeByte(0) + i += 1 op val length = currentAddr.index - lengthAddr.index - lengthWidth putNat(lengthAddr, length, lengthWidth) @@ -111,11 +114,11 @@ class NameBuffer extends TastyBuffer(10000) { override def assemble(): Unit = { var i = 0 - for ((name, ref) <- nameRefs) { + for (name, ref) <- nameRefs do + val ref = nameRefs(name) assert(ref.index == i) i += 1 pickleNameContents(name) - } } } diff --git a/compiler/src/dotty/tools/dotc/core/tasty/PositionPickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/PositionPickler.scala index ad0c051e1b7b..924b87bec003 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/PositionPickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/PositionPickler.scala @@ -8,32 +8,40 @@ import dotty.tools.tasty.TastyBuffer import TastyBuffer._ import ast._ -import Trees.WithLazyField +import Trees.WithLazyFields import util.{SourceFile, NoSource} import core._ import Annotations._, Decorators._ import collection.mutable import util.Spans._ +import reporting.Message -class PositionPickler( - pickler: TastyPickler, - addrOfTree: PositionPickler.TreeToAddr, - treeAnnots: untpd.MemberDef => List[tpd.Tree], - relativePathReference: String){ - +object PositionPickler: import ast.tpd._ - val buf: TastyBuffer = new TastyBuffer(5000) - pickler.newSection(PositionsSection, buf) - - private val pickledIndices = new mutable.BitSet + // Note: This could be just TreeToAddr => Addr if functions are specialized to value classes. 
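// Standalone sketch of the `withLength` backpatching idiom made inline above in
// NameBuffer: reserve a length slot filled with zeroes, run the writing
// operation, then patch the actual length into the reserved slot.
// ByteBufferSketch is an illustrative stand-in for TastyBuffer and only
// supports one-byte lengths.
import scala.collection.mutable.ArrayBuffer

final class ByteBufferSketch:
  val bytes = ArrayBuffer.empty[Byte]
  def writeByte(b: Int): Unit = bytes += b.toByte
  def currentAddr: Int = bytes.length
  def patchByte(addr: Int, b: Int): Unit = bytes(addr) = b.toByte

  // Write a one-byte placeholder, run `op`, then backpatch the written length.
  inline def withLength(inline op: Unit): Unit =
    val lengthAddr = currentAddr
    writeByte(0)                           // placeholder
    op
    val length = currentAddr - lengthAddr - 1
    patchByte(lengthAddr, length)          // this sketch assumes length < 128

@main def withLengthDemo(): Unit =
  val buf = ByteBufferSketch()
  buf.withLength { "UTF8".getBytes.foreach(b => buf.writeByte(b)) }
  println(buf.bytes.toList)  // List(4, 85, 84, 70, 56)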
+ // We use a SAM type to avoid boxing of Addr + @FunctionalInterface + trait TreeToAddr: + def apply(x: untpd.Tree): Addr - def header(addrDelta: Int, hasStartDelta: Boolean, hasEndDelta: Boolean, hasPoint: Boolean): Int = { + def header(addrDelta: Int, hasStartDelta: Boolean, hasEndDelta: Boolean, hasPoint: Boolean): Int = def toInt(b: Boolean) = if (b) 1 else 0 (addrDelta << 3) | (toInt(hasStartDelta) << 2) | (toInt(hasEndDelta) << 1) | toInt(hasPoint) - } - def picklePositions(source: SourceFile, roots: List[Tree], warnings: mutable.ListBuffer[String]): Unit = { + def picklePositions( + pickler: TastyPickler, + addrOfTree: TreeToAddr, + treeAnnots: untpd.MemberDef => List[tpd.Tree], + relativePathReference: String, + source: SourceFile, + roots: List[Tree], + warnings: mutable.ListBuffer[Message], + buf: TastyBuffer = new TastyBuffer(5000), + pickledIndices: mutable.BitSet = new mutable.BitSet) = + + pickler.newSection(PositionsSection, buf) + /** Pickle the number of lines followed by the length of each line */ def pickleLineOffsets(): Unit = { val content = source.content() @@ -79,7 +87,7 @@ class PositionPickler( def alwaysNeedsPos(x: Positioned) = x match { case // initialSpan is inaccurate for trees with lazy field - _: WithLazyField[?] + _: WithLazyFields // A symbol is created before the corresponding tree is unpickled, // and its position cannot be changed afterwards. @@ -128,10 +136,6 @@ class PositionPickler( } for (root <- roots) traverse(root, NoSource) - } -} -object PositionPickler: - // Note: This could be just TreeToAddr => Addr if functions are specialized to value classes. - // We use a SAM type to avoid boxing of Addr - @FunctionalInterface trait TreeToAddr: - def apply(x: untpd.Tree): Addr + end picklePositions +end PositionPickler + diff --git a/compiler/src/dotty/tools/dotc/core/tasty/ScratchData.scala b/compiler/src/dotty/tools/dotc/core/tasty/ScratchData.scala new file mode 100644 index 000000000000..b36c78a77ac6 --- /dev/null +++ b/compiler/src/dotty/tools/dotc/core/tasty/ScratchData.scala @@ -0,0 +1,20 @@ +package dotty.tools.dotc.core.tasty +import dotty.tools.tasty.TastyBuffer +import collection.mutable +import java.util.Arrays + +class ScratchData: + var delta, delta1 = new Array[Int](0) + + val positionBuffer = new TastyBuffer(5000) + val pickledIndices = new mutable.BitSet + + val commentBuffer = new TastyBuffer(5000) + + def reset() = + assert(delta ne delta1) + assert(delta.length == delta1.length) + positionBuffer.reset() + pickledIndices.clear() + commentBuffer.reset() + diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TastyPickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TastyPickler.scala index aa657c393815..4f1e84ac9184 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TastyPickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TastyPickler.scala @@ -38,8 +38,9 @@ class TastyPickler(val rootCls: ClassSymbol) { nameBuffer.assemble() sections.foreach(_._2.assemble()) - val nameBufferHash = TastyHash.pjwHash64(nameBuffer.bytes) - val treeSectionHash +: otherSectionHashes = sections.map(x => TastyHash.pjwHash64(x._2.bytes)): @unchecked + val nameBufferHash = TastyHash.pjwHash64(nameBuffer.bytes, nameBuffer.length) + val treeSectionHash +: otherSectionHashes = + sections.map(x => TastyHash.pjwHash64(x._2.bytes, x._2.length)): @unchecked // Hash of name table and tree val uuidLow: Long = nameBufferHash ^ treeSectionHash diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreeBuffer.scala 
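// Standalone sketch of the ScratchData idea introduced above: keep the large
// working buffers in one object that is reset and reused between pickling runs
// instead of being reallocated every time. The names mirror the new file, but
// the buffer types here are ordinary collections.
import scala.collection.mutable

final class ScratchSketch:
  var delta, delta1 = new Array[Int](0)
  val positionBuffer = mutable.ArrayBuffer.empty[Byte]
  val pickledIndices = new mutable.BitSet

  // Ensure `arr` has at least `n` slots, reusing it (zeroed) when large enough.
  def reserve(arr: Array[Int], n: Int): Array[Int] =
    if arr.length < n then new Array[Int](n)
    else { java.util.Arrays.fill(arr, 0, n, 0); arr }

  def reset(): Unit =
    positionBuffer.clear()
    pickledIndices.clear()

@main def scratchSketchDemo(): Unit =
  val scratch = ScratchSketch()
  scratch.delta = scratch.reserve(scratch.delta, 8)
  val firstAllocation = scratch.delta
  scratch.reset()
  scratch.delta = scratch.reserve(scratch.delta, 4)
  println(scratch.delta eq firstAllocation)  // true: reused, not reallocated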
b/compiler/src/dotty/tools/dotc/core/tasty/TreeBuffer.scala index a3dedaaec685..d0f08379c114 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreeBuffer.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreeBuffer.scala @@ -10,6 +10,7 @@ import TastyBuffer.{Addr, NoAddr, AddrWidth} import util.Util.bestFit import config.Printers.pickling import ast.untpd.Tree +import java.util.Arrays class TreeBuffer extends TastyBuffer(50000) { @@ -17,7 +18,6 @@ class TreeBuffer extends TastyBuffer(50000) { private val initialOffsetSize = bytes.length / (AddrWidth * ItemsOverOffsets) private var offsets = new Array[Int](initialOffsetSize) private var isRelative = new Array[Boolean](initialOffsetSize) - private var delta: Array[Int] = _ private var numOffsets = 0 /** A map from trees to the address at which a tree is pickled. */ @@ -68,109 +68,119 @@ class TreeBuffer extends TastyBuffer(50000) { } /** The amount by which the bytes at the given address are shifted under compression */ - def deltaAt(at: Addr): Int = { + def deltaAt(at: Addr, scratch: ScratchData): Int = { val idx = bestFit(offsets, numOffsets, at.index - 1) - if (idx < 0) 0 else delta(idx) + if (idx < 0) 0 else scratch.delta(idx) } /** The address to which `x` is translated under compression */ - def adjusted(x: Addr): Addr = x - deltaAt(x) + def adjusted(x: Addr, scratch: ScratchData): Addr = x - deltaAt(x, scratch) - /** Compute all shift-deltas */ - private def computeDeltas() = { - delta = new Array[Int](numOffsets) - var lastDelta = 0 - var i = 0 - while (i < numOffsets) { - val off = offset(i) - val skippedOff = skipZeroes(off) - val skippedCount = skippedOff.index - off.index - assert(skippedCount < AddrWidth, s"unset field at position $off") - lastDelta += skippedCount - delta(i) = lastDelta - i += 1 - } - } + /** Final assembly, involving the following steps: + * - compute deltas + * - adjust deltas until additional savings are < 1% of total + * - adjust offsets according to the adjusted deltas + * - shrink buffer, skipping zeroes. 
+ */ + def compactify(scratch: ScratchData): Unit = - /** The absolute or relative adjusted address at index `i` of `offsets` array*/ - private def adjustedOffset(i: Int): Addr = { - val at = offset(i) - val original = getAddr(at) - if (isRelative(i)) { - val start = skipNat(at) - val len1 = original + delta(i) - deltaAt(original + start.index) - val len2 = adjusted(original + start.index) - adjusted(start).index - assert(len1 == len2, - s"adjusting offset #$i: $at, original = $original, len1 = $len1, len2 = $len2") - len1 + def reserve(arr: Array[Int]) = + if arr.length < numOffsets then + new Array[Int](numOffsets) + else + Arrays.fill(arr, 0, numOffsets, 0) + arr + + /** Compute all shift-deltas */ + def computeDeltas() = { + scratch.delta = reserve(scratch.delta) + var lastDelta = 0 + var i = 0 + while (i < numOffsets) { + val off = offset(i) + val skippedOff = skipZeroes(off) + val skippedCount = skippedOff.index - off.index + assert(skippedCount < AddrWidth, s"unset field at position $off") + lastDelta += skippedCount + scratch.delta(i) = lastDelta + i += 1 + } } - else adjusted(original) - } - /** Adjust all offsets according to previously computed deltas */ - private def adjustOffsets(): Unit = - for (i <- 0 until numOffsets) { - val corrected = adjustedOffset(i) - fillAddr(offset(i), corrected) + /** The absolute or relative adjusted address at index `i` of `offsets` array*/ + def adjustedOffset(i: Int): Addr = { + val at = offset(i) + val original = getAddr(at) + if (isRelative(i)) { + val start = skipNat(at) + val len1 = original + scratch.delta(i) - deltaAt(original + start.index, scratch) + val len2 = adjusted(original + start.index, scratch) - adjusted(start, scratch).index + assert(len1 == len2, + s"adjusting offset #$i: $at, original = $original, len1 = $len1, len2 = $len2") + len1 + } + else adjusted(original, scratch) } - /** Adjust deltas to also take account references that will shrink (and thereby - * generate additional zeroes that can be skipped) due to previously - * computed adjustments. - */ - private def adjustDeltas(): Int = { - val delta1 = new Array[Int](delta.length) - var lastDelta = 0 - var i = 0 - while (i < numOffsets) { - val corrected = adjustedOffset(i) - lastDelta += AddrWidth - TastyBuffer.natSize(corrected.index) - delta1(i) = lastDelta - i += 1 + /** Adjust all offsets according to previously computed deltas */ + def adjustOffsets(): Unit = + var i = 0 + while i < numOffsets do + val corrected = adjustedOffset(i) + fillAddr(offset(i), corrected) + i += 1 + + /** Adjust deltas to also take account references that will shrink (and thereby + * generate additional zeroes that can be skipped) due to previously + * computed adjustments. + */ + def adjustDeltas(): Int = { + scratch.delta1 = reserve(scratch.delta1) + var lastDelta = 0 + var i = 0 + while i < numOffsets do + val corrected = adjustedOffset(i) + lastDelta += AddrWidth - TastyBuffer.natSize(corrected.index) + scratch.delta1(i) = lastDelta + i += 1 + val saved = + if (numOffsets == 0) 0 + else scratch.delta1(numOffsets - 1) - scratch.delta(numOffsets - 1) + val tmp = scratch.delta + scratch.delta = scratch.delta1 + scratch.delta1 = tmp + saved } - val saved = - if (numOffsets == 0) 0 - else delta1(numOffsets - 1) - delta(numOffsets - 1) - delta = delta1 - saved - } - /** Compress pickle buffer, shifting bytes to close all skipped zeroes. 
*/ - private def compress(): Int = { - var lastDelta = 0 - var start = 0 - var i = 0 - var wasted = 0 - def shift(end: Int) = - System.arraycopy(bytes, start, bytes, start - lastDelta, end - start) - while (i < numOffsets) { - val next = offsets(i) - shift(next) - start = next + delta(i) - lastDelta - val pastZeroes = skipZeroes(Addr(next)).index - assert(pastZeroes >= start, s"something's wrong: eliminated non-zero") - wasted += (pastZeroes - start) - lastDelta = delta(i) - i += 1 + /** Compress pickle buffer, shifting bytes to close all skipped zeroes. */ + def compress(): Int = { + var lastDelta = 0 + var start = 0 + var i = 0 + var wasted = 0 + def shift(end: Int) = + System.arraycopy(bytes, start, bytes, start - lastDelta, end - start) + while (i < numOffsets) { + val next = offsets(i) + shift(next) + start = next + scratch.delta(i) - lastDelta + val pastZeroes = skipZeroes(Addr(next)).index + assert(pastZeroes >= start, s"something's wrong: eliminated non-zero") + wasted += (pastZeroes - start) + lastDelta = scratch.delta(i) + i += 1 + } + shift(length) + length -= lastDelta + wasted } - shift(length) - length -= lastDelta - wasted - } - def adjustTreeAddrs(): Unit = - var i = 0 - while i < treeAddrs.size do - treeAddrs.setValue(i, adjusted(Addr(treeAddrs.value(i))).index) - i += 1 + def adjustTreeAddrs(): Unit = + var i = 0 + while i < treeAddrs.size do + treeAddrs.setValue(i, adjusted(Addr(treeAddrs.value(i)), scratch).index) + i += 1 - /** Final assembly, involving the following steps: - * - compute deltas - * - adjust deltas until additional savings are < 1% of total - * - adjust offsets according to the adjusted deltas - * - shrink buffer, skipping zeroes. - */ - def compactify(): Unit = { val origLength = length computeDeltas() //println(s"offsets: ${offsets.take(numOffsets).deep}") @@ -185,5 +195,5 @@ class TreeBuffer extends TastyBuffer(50000) { adjustTreeAddrs() val wasted = compress() pickling.println(s"original length: $origLength, compressed to: $length, wasted: $wasted") // DEBUG, for now. 
- } + end compactify } diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala index 34c22439a932..bef28545592a 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreePickler.scala @@ -20,6 +20,8 @@ import collection.mutable import reporting.{Profile, NoProfile} import dotty.tools.tasty.TastyFormat.ASTsSection +object TreePickler: + class StackSizeExceeded(val mdef: tpd.MemberDef) extends Exception class TreePickler(pickler: TastyPickler) { val buf: TreeBuffer = new TreeBuffer @@ -27,6 +29,7 @@ class TreePickler(pickler: TastyPickler) { import buf._ import pickler.nameBuffer.nameIndex import tpd._ + import TreePickler.* private val symRefs = Symbols.MutableSymbolMap[Addr](256) private val forwardSymRefs = Symbols.MutableSymbolMap[List[Addr]]() @@ -53,7 +56,7 @@ class TreePickler(pickler: TastyPickler) { def docString(tree: untpd.MemberDef): Option[Comment] = Option(docStrings.lookup(tree)) - private def withLength(op: => Unit) = { + private inline def withLength(inline op: Unit) = { val lengthAddr = reserveRef(relative = true) op fillRef(lengthAddr, currentAddr, relative = true) @@ -68,15 +71,12 @@ class TreePickler(pickler: TastyPickler) { case _ => } - def registerDef(sym: Symbol): Unit = { + def registerDef(sym: Symbol): Unit = symRefs(sym) = currentAddr - forwardSymRefs.get(sym) match { - case Some(refs) => - refs.foreach(fillRef(_, currentAddr, relative = false)) - forwardSymRefs -= sym - case None => - } - } + val refs = forwardSymRefs.lookup(sym) + if refs != null then + refs.foreach(fillRef(_, currentAddr, relative = false)) + forwardSymRefs -= sym def pickleName(name: Name): Unit = writeNat(nameIndex(name).index) @@ -85,17 +85,19 @@ class TreePickler(pickler: TastyPickler) { if (sig eq Signature.NotAMethod) name else SignedName(name.toTermName, sig, target.asTermName)) - private def pickleSymRef(sym: Symbol)(using Context) = symRefs.get(sym) match { - case Some(label) => - if (label != NoAddr) writeRef(label) else pickleForwardSymRef(sym) - case None => + private def pickleSymRef(sym: Symbol)(using Context) = + val label: Addr | Null = symRefs.lookup(sym) + if label == null then // See pos/t1957.scala for an example where this can happen. // I believe it's a bug in typer: the type of an implicit argument refers // to a closure parameter outside the closure itself. TODO: track this down, so that we // can eliminate this case. report.log(i"pickling reference to as yet undefined $sym in ${sym.owner}", sym.srcPos) pickleForwardSymRef(sym) - } + else if label == NoAddr then + pickleForwardSymRef(sym) + else + writeRef(label.uncheckedNN) // !!! 
Dotty problem: Not clear why nn or uncheckedNN is needed here private def pickleForwardSymRef(sym: Symbol)(using Context) = { val ref = reserveRef(relative = false) @@ -207,7 +209,7 @@ class TreePickler(pickler: TastyPickler) { else if (tpe.prefix == NoPrefix) { writeByte(if (tpe.isType) TYPEREFdirect else TERMREFdirect) if Config.checkLevelsOnConstraints && !symRefs.contains(sym) && !sym.isPatternBound && !sym.hasAnnotation(defn.QuotedRuntimePatterns_patternTypeAnnot) then - report.error(i"pickling reference to as yet undefined $tpe with symbol ${sym}", sym.srcPos) + report.error(em"pickling reference to as yet undefined $tpe with symbol ${sym}", sym.srcPos) pickleSymRef(sym) } else tpe.designator match { @@ -328,23 +330,30 @@ class TreePickler(pickler: TastyPickler) { registerDef(sym) writeByte(tag) val addr = currentAddr - withLength { - pickleName(sym.name) - pickleParams - tpt match { - case _: Template | _: Hole => pickleTree(tpt) - case _ if tpt.isType => pickleTpt(tpt) + try + withLength { + pickleName(sym.name) + pickleParams + tpt match { + case _: Template | _: Hole => pickleTree(tpt) + case _ if tpt.isType => pickleTpt(tpt) + } + pickleTreeUnlessEmpty(rhs) + pickleModifiers(sym, mdef) } - pickleTreeUnlessEmpty(rhs) - pickleModifiers(sym, mdef) - } + catch + case ex: Throwable => + if !ctx.settings.YnoDecodeStacktraces.value + && handleRecursive.underlyingStackOverflowOrNull(ex) != null then + throw StackSizeExceeded(mdef) + else + throw ex if sym.is(Method) && sym.owner.isClass then profile.recordMethodSize(sym, currentAddr.index - addr.index, mdef.span) - for - docCtx <- ctx.docCtx - comment <- docCtx.docstring(sym) - do - docStrings(mdef) = comment + for docCtx <- ctx.docCtx do + val comment = docCtx.docstrings.lookup(sym) + if comment != null then + docStrings(mdef) = comment } def pickleParam(tree: Tree)(using Context): Unit = { @@ -426,6 +435,13 @@ class TreePickler(pickler: TastyPickler) { writeByte(THROW) pickleTree(args.head) } + else if fun.symbol.originalSignaturePolymorphic.exists then + writeByte(APPLYsigpoly) + withLength { + pickleTree(fun) + pickleType(fun.tpe.widenTermRefExpr, richTypes = true) // this widens to a MethodType, so need richTypes + args.foreach(pickleTree) + } else { writeByte(APPLY) withLength { @@ -451,7 +467,7 @@ class TreePickler(pickler: TastyPickler) { withLength { pickleTree(qual); if (!mix.isEmpty) { - // mixinType being a TypeRef when mix is non-empty is enforced by TreeChecker#checkSuper + // mixinType being a TypeRef when mix is non-empty is enforced by TreeChecker#checkSuper val SuperType(_, mixinType: TypeRef) = tree.tpe: @unchecked pickleTree(mix.withType(mixinType)) } @@ -777,18 +793,39 @@ class TreePickler(pickler: TastyPickler) { def pickle(trees: List[Tree])(using Context): Unit = { profile = Profile.current - trees.foreach(tree => if (!tree.isEmpty) pickleTree(tree)) + for tree <- trees do + try + if !tree.isEmpty then pickleTree(tree) + catch case ex: StackSizeExceeded => + report.error( + em"""Recursion limit exceeded while pickling ${ex.mdef} + |in ${ex.mdef.symbol.showLocated}. + |You could try to increase the stacksize using the -Xss JVM option. 
+ |For the unprocessed stack trace, compile with -Yno-decode-stacktraces.""", + ex.mdef.srcPos) + def missing = forwardSymRefs.keysIterator .map(sym => i"${sym.showLocated} (line ${sym.srcPos.line}) #${sym.id}") .toList assert(forwardSymRefs.isEmpty, i"unresolved symbols: $missing%, % when pickling ${ctx.source}") } - def compactify(): Unit = { - buf.compactify() + def compactify(scratch: ScratchData = new ScratchData): Unit = { + buf.compactify(scratch) def updateMapWithDeltas(mp: MutableSymbolMap[Addr]) = - for (key <- mp.keysIterator.toBuffer[Symbol]) mp(key) = adjusted(mp(key)) + val keys = new Array[Symbol](mp.size) + val it = mp.keysIterator + var i = 0 + while i < keys.length do + keys(i) = it.next + i += 1 + assert(!it.hasNext) + i = 0 + while i < keys.length do + val key = keys(i) + mp(key) = adjusted(mp(key), scratch) + i += 1 updateMapWithDeltas(symRefs) } diff --git a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala index 617a2c55a7ad..91290b4ddd41 100644 --- a/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala +++ b/compiler/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala @@ -625,7 +625,9 @@ class TreeUnpickler(reader: TastyReader, else newSymbol(ctx.owner, name, flags, completer, privateWithin, coord) } - val annots = annotFns.map(_(sym.owner)) + val annotOwner = + if sym.owner.isClass then newLocalDummy(sym.owner) else sym.owner + val annots = annotFns.map(_(annotOwner)) sym.annotations = annots if sym.isOpaqueAlias then sym.setFlag(Deferred) val isScala2MacroDefinedInScala3 = flags.is(Macro, butNot = Inline) && flags.is(Erased) @@ -957,6 +959,51 @@ class TreeUnpickler(reader: TastyReader, tree.setDefTree } + /** Read enough of parent to determine its type, without reading arguments + * of applications. This is necessary to make TreeUnpickler as lazy as Namer + * in this regard. See i16673 for a test case. + */ + private def readParentType()(using Context): Type = + readByte() match + case TYPEAPPLY => + val end = readEnd() + val tycon = readParentType() + if tycon.typeParams.isEmpty then + goto(end) + tycon + else + val args = until(end)(readTpt()) + val cls = tycon.classSymbol + assert(cls.typeParams.hasSameLengthAs(args)) + cls.typeRef.appliedTo(args.tpes) + case APPLY | BLOCK => + val end = readEnd() + try readParentType() + finally goto(end) + case SELECTin => + val end = readEnd() + readName() + readTerm() match + case nu: New => + try nu.tpe + finally goto(end) + case SHAREDterm => + forkAt(readAddr()).readParentType() + + /** Read template parents + * @param withArgs if false, only read enough of parent trees to determine their type + * but skip constructor arguments. Return any trees that were partially + * parsed in this way as InferredTypeTrees. 
+ */ + def readParents(withArgs: Boolean)(using Context): List[Tree] = + collectWhile(nextByte != SELFDEF && nextByte != DEFDEF) { + nextUnsharedTag match + case APPLY | TYPEAPPLY | BLOCK => + if withArgs then readTerm() + else InferredTypeTree().withType(readParentType()) + case _ => readTpt() + } + private def readTemplate(using Context): Template = { val start = currentAddr assert(sourcePathAt(start).isEmpty) @@ -979,12 +1026,8 @@ class TreeUnpickler(reader: TastyReader, while (bodyIndexer.reader.nextByte != DEFDEF) bodyIndexer.skipTree() bodyIndexer.indexStats(end) } - val parents = collectWhile(nextByte != SELFDEF && nextByte != DEFDEF) { - nextUnsharedTag match { - case APPLY | TYPEAPPLY | BLOCK => readTerm()(using parentCtx) - case _ => readTpt()(using parentCtx) - } - } + val parentReader = fork + val parents = readParents(withArgs = false)(using parentCtx) val parentTypes = parents.map(_.tpe.dealias) val self = if (nextByte == SELFDEF) { @@ -998,7 +1041,13 @@ class TreeUnpickler(reader: TastyReader, selfInfo = if (self.isEmpty) NoType else self.tpt.tpe) .integrateOpaqueMembers val constr = readIndexedDef().asInstanceOf[DefDef] - val mappedParents = parents.map(_.changeOwner(localDummy, constr.symbol)) + val mappedParents: LazyTreeList = + if parents.exists(_.isInstanceOf[InferredTypeTree]) then + // parents were not read fully, will need to be read again later on demand + new LazyReader(parentReader, localDummy, ctx.mode, ctx.source, + _.readParents(withArgs = true) + .map(_.changeOwner(localDummy, constr.symbol))) + else parents val lazyStats = readLater(end, rdr => { val stats = rdr.readIndexedStats(localDummy, end) @@ -1007,7 +1056,7 @@ class TreeUnpickler(reader: TastyReader, defn.patchStdLibClass(cls) NamerOps.addConstructorProxies(cls) setSpan(start, - untpd.Template(constr, mappedParents, Nil, self, lazyStats) + untpd.Template(constr, mappedParents, self, lazyStats) .withType(localDummy.termRef)) } @@ -1236,6 +1285,12 @@ class TreeUnpickler(reader: TastyReader, else tpd.Apply(fn, args) case TYPEAPPLY => tpd.TypeApply(readTerm(), until(end)(readTpt())) + case APPLYsigpoly => + val fn = readTerm() + val methType = readType() + val args = until(end)(readTerm()) + val fun2 = typer.Applications.retypeSignaturePolymorphicFn(fn, methType) + tpd.Apply(fun2, args) case TYPED => val expr = readTerm() val tpt = readTpt() diff --git a/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Erasure.scala b/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Erasure.scala index f2d25d0f34b5..cc2d7dd7ee56 100644 --- a/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Erasure.scala +++ b/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Erasure.scala @@ -39,9 +39,9 @@ object Scala2Erasure: case RefinedType(parent, _, _) => checkSupported(parent) case AnnotatedType(parent, _) if parent.dealias.isInstanceOf[Scala2RefinedType] => - throw new TypeError(i"Unsupported Scala 2 type: Component $parent of intersection is annotated.") + throw TypeError(em"Unsupported Scala 2 type: Component $parent of intersection is annotated.") case tp @ TypeRef(prefix, _) if !tp.symbol.exists && prefix.dealias.isInstanceOf[Scala2RefinedType] => - throw new TypeError(i"Unsupported Scala 2 type: Prefix $prefix of intersection component is an intersection or refinement.") + throw TypeError(em"Unsupported Scala 2 type: Prefix $prefix of intersection component is an intersection or refinement.") case _ => /** A type that would be represented as a RefinedType in Scala 2. 
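// A minimal, self-contained sketch of the laziness idea behind readParentType/readParents
// in the TreeUnpickler hunk above: read only enough of a parent application to know its
// type constructor, skip the argument payload, and keep a forked cursor so the full
// parents can be re-read later on demand. Everything below (MiniReader, readParentHead,
// the toy encoding) is illustrative only and is not part of the compiler sources.
object LazyParentsSketch:
  // Toy "pickled" encoding: a parent is `tyconName`, then its argument count,
  // then that many argument entries.
  final case class MiniReader(data: Vector[String], var pos: Int = 0):
    def fork: MiniReader = MiniReader(data, pos)   // independent cursor, analogous to `fork`
    def read(): String   = { val s = data(pos); pos += 1; s }
    def goto(p: Int): Unit = pos = p

  // Read just enough to identify the parent, skipping its arguments entirely.
  def readParentHead(r: MiniReader): String =
    val tycon    = r.read()
    val argCount = r.read().toInt
    r.goto(r.pos + argCount)                       // jump over the argument payload
    tycon

  def main(args: Array[String]): Unit =
    val r      = MiniReader(Vector("C", "3", "a1", "a2", "a3", "D", "0"))
    val forked = r.fork                            // kept around for a later full re-read
    println(readParentHead(r))                     // prints C, without touching a1..a3
    println(readParentHead(r))                     // prints D
    println(forked.pos)                            // the fork still points at the start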
diff --git a/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala b/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala index 561b1eac2391..50b0b875c1fc 100644 --- a/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala +++ b/compiler/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala @@ -89,7 +89,11 @@ object Scala2Unpickler { val sourceModule = denot.sourceModule.orElse { // For non-toplevel modules, `sourceModule` won't be set when completing // the module class, we need to go find it ourselves. - NamerOps.findModuleBuddy(cls.name.sourceModuleName, denot.owner.info.decls) + val modName = cls.name.sourceModuleName + val alternate = + if cls.privateWithin.exists && cls.owner.is(Trait) then modName.expandedName(cls.owner) + else EmptyTermName + NamerOps.findModuleBuddy(modName, denot.owner.info.decls, alternate) } denot.owner.thisType.select(sourceModule) else selfInfo diff --git a/compiler/src/dotty/tools/dotc/fromtasty/ReadTasty.scala b/compiler/src/dotty/tools/dotc/fromtasty/ReadTasty.scala index 864f5277bff3..86ae99b3e0f9 100644 --- a/compiler/src/dotty/tools/dotc/fromtasty/ReadTasty.scala +++ b/compiler/src/dotty/tools/dotc/fromtasty/ReadTasty.scala @@ -29,7 +29,7 @@ class ReadTasty extends Phase { val className = unit.className.toTypeName def cannotUnpickle(reason: String): None.type = { - report.error(s"class $className cannot be unpickled because $reason") + report.error(em"class $className cannot be unpickled because $reason") None } diff --git a/compiler/src/dotty/tools/dotc/fromtasty/TASTYRun.scala b/compiler/src/dotty/tools/dotc/fromtasty/TASTYRun.scala index 04c65a3d3882..fb0abe3332ed 100644 --- a/compiler/src/dotty/tools/dotc/fromtasty/TASTYRun.scala +++ b/compiler/src/dotty/tools/dotc/fromtasty/TASTYRun.scala @@ -6,6 +6,7 @@ import scala.language.unsafeNulls import io.{JarArchive, AbstractFile, Path} import core.Contexts._ +import core.Decorators.em import java.io.File class TASTYRun(comp: Compiler, ictx: Context) extends Run(comp, ictx) { @@ -27,7 +28,7 @@ class TASTYRun(comp: Compiler, ictx: Context) extends Run(comp, ictx) { .toList case "tasty" => TastyFileUtil.getClassName(file) case _ => - report.error(s"File extension is not `tasty` or `jar`: ${file.path}") + report.error(em"File extension is not `tasty` or `jar`: ${file.path}") Nil } classNames.map(new TASTYCompilationUnit(_)) diff --git a/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala b/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala index debf51872d5a..e1b2aaa02866 100644 --- a/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala +++ b/compiler/src/dotty/tools/dotc/inlines/InlineReducer.scala @@ -12,6 +12,8 @@ import NameKinds.{InlineAccessorName, InlineBinderName, InlineScrutineeName} import config.Printers.inlining import util.SimpleIdentityMap +import dotty.tools.dotc.transform.BetaReduce + import collection.mutable /** A utility class offering methods for rewriting inlined code */ @@ -158,35 +160,32 @@ class InlineReducer(inliner: Inliner)(using Context): * * where `def` is used for call-by-name parameters. However, we shortcut any NoPrefix * refs among the ei's directly without creating an intermediate binding. + * + * This variant of beta-reduction preserves the integrity of `Inlined` tree nodes. 
*/ def betaReduce(tree: Tree)(using Context): Tree = tree match { - case Apply(Select(cl @ closureDef(ddef), nme.apply), args) if defn.isFunctionType(cl.tpe) => - // closureDef also returns a result for closures wrapped in Inlined nodes. - // These need to be preserved. - def recur(cl: Tree): Tree = cl match - case Inlined(call, bindings, expr) => - cpy.Inlined(cl)(call, bindings, recur(expr)) - case _ => ddef.tpe.widen match - case mt: MethodType if ddef.paramss.head.length == args.length => - val bindingsBuf = new DefBuffer - val argSyms = mt.paramNames.lazyZip(mt.paramInfos).lazyZip(args).map { (name, paramtp, arg) => - arg.tpe.dealias match { - case ref @ TermRef(NoPrefix, _) => ref.symbol - case _ => - paramBindingDef(name, paramtp, arg, bindingsBuf)( - using ctx.withSource(cl.source) - ).symbol - } - } - val expander = new TreeTypeMap( - oldOwners = ddef.symbol :: Nil, - newOwners = ctx.owner :: Nil, - substFrom = ddef.paramss.head.map(_.symbol), - substTo = argSyms) - Block(bindingsBuf.toList, expander.transform(ddef.rhs)).withSpan(tree.span) - case _ => tree - recur(cl) - case _ => tree + case Apply(Select(cl, nme.apply), args) if defn.isFunctionType(cl.tpe) => + val bindingsBuf = new mutable.ListBuffer[ValDef] + def recur(cl: Tree): Option[Tree] = cl match + case Block((ddef : DefDef) :: Nil, closure: Closure) if ddef.symbol == closure.meth.symbol => + ddef.tpe.widen match + case mt: MethodType if ddef.paramss.head.length == args.length => + Some(BetaReduce.reduceApplication(ddef, args, bindingsBuf)) + case _ => None + case Block(stats, expr) if stats.forall(isPureBinding) => + recur(expr).map(cpy.Block(cl)(stats, _)) + case Inlined(call, bindings, expr) if bindings.forall(isPureBinding) => + recur(expr).map(cpy.Inlined(cl)(call, bindings, _)) + case Typed(expr, tpt) => + recur(expr) + case _ => None + recur(cl) match + case Some(reduced) => + seq(bindingsBuf.result(), reduced).withSpan(tree.span) + case None => + tree + case _ => + tree } /** The result type of reducing a match. It consists optionally of a list of bindings @@ -269,12 +268,21 @@ class InlineReducer(inliner: Inliner)(using Context): } } - // Extractors contain Bind nodes in type parameter lists, the tree looks like this: + // Extractors can contain Bind nodes in type parameter lists, + // for that case tree looks like this: // UnApply[t @ t](pats)(implicits): T[t] // Test case is pos/inline-caseclass.scala. 
+ // Alternatively, for explicitly specified type binds in type annotations like in + // case A(B): A[t] + // the tree will look like this: + // Unapply[t](pats)(implicits) : T[t @ t] + // and the binds will be found in the type tree instead + // Test case is pos-macros/i15971 + val tptBinds = getBinds(Set.empty[TypeSymbol], tpt) val binds: Set[TypeSymbol] = pat match { - case UnApply(TypeApply(_, tpts), _, _) => getBinds(Set.empty[TypeSymbol], tpts) - case _ => getBinds(Set.empty[TypeSymbol], tpt) + case UnApply(TypeApply(_, tpts), _, _) => + getBinds(Set.empty[TypeSymbol], tpts) ++ tptBinds + case _ => tptBinds } val extractBindVariance = new TypeAccumulator[TypeBindsMap] { @@ -303,11 +311,11 @@ class InlineReducer(inliner: Inliner)(using Context): def addTypeBindings(typeBinds: TypeBindsMap)(using Context): Unit = typeBinds.foreachBinding { case (sym, shouldBeMinimized) => newTypeBinding(sym, - ctx.gadt.approximation(sym, fromBelow = shouldBeMinimized, maxLevel = Int.MaxValue)) + ctx.gadtState.approximation(sym, fromBelow = shouldBeMinimized, maxLevel = Int.MaxValue)) } def registerAsGadtSyms(typeBinds: TypeBindsMap)(using Context): Unit = - if (typeBinds.size > 0) ctx.gadt.addToConstraint(typeBinds.keys) + if (typeBinds.size > 0) ctx.gadtState.addToConstraint(typeBinds.keys) pat match { case Typed(pat1, tpt) => diff --git a/compiler/src/dotty/tools/dotc/inlines/Inliner.scala b/compiler/src/dotty/tools/dotc/inlines/Inliner.scala index bea42e82ce6f..76494c1bf405 100644 --- a/compiler/src/dotty/tools/dotc/inlines/Inliner.scala +++ b/compiler/src/dotty/tools/dotc/inlines/Inliner.scala @@ -227,7 +227,7 @@ class Inliner(val call: tpd.Tree)(using Context): val binding = { var newArg = arg.changeOwner(ctx.owner, boundSym) if bindingFlags.is(Inline) && argIsBottom then - newArg = Typed(newArg, TypeTree(formal)) // type ascribe RHS to avoid type errors in expansion. See i8612.scala + newArg = Typed(newArg, TypeTree(formal.widenExpr)) // type ascribe RHS to avoid type errors in expansion. See i8612.scala if isByName then DefDef(boundSym, newArg) else ValDef(boundSym, newArg) }.withSpan(boundSym.span) @@ -253,7 +253,7 @@ class Inliner(val call: tpd.Tree)(using Context): computeParamBindings(tp.resultType, targs.drop(tp.paramNames.length), argss, formalss, buf) case tp: MethodType => if argss.isEmpty then - report.error(i"missing arguments for inline method $inlinedMethod", call.srcPos) + report.error(em"missing arguments for inline method $inlinedMethod", call.srcPos) false else tp.paramNames.lazyZip(formalss.head).lazyZip(argss.head).foreach { (name, formal, arg) => @@ -616,8 +616,8 @@ class Inliner(val call: tpd.Tree)(using Context): def issueError() = callValueArgss match { case (msgArg :: Nil) :: Nil => val message = msgArg.tpe match { - case ConstantType(Constant(msg: String)) => msg - case _ => s"A literal string is expected as an argument to `compiletime.error`. Got ${msgArg.show}" + case ConstantType(Constant(msg: String)) => msg.toMessage + case _ => em"A literal string is expected as an argument to `compiletime.error`. Got $msgArg" } // Usually `error` is called from within a rewrite method. 
In this // case we need to report the error at the point of the outermost enclosing inline @@ -749,9 +749,9 @@ class Inliner(val call: tpd.Tree)(using Context): ctx override def typedIdent(tree: untpd.Ident, pt: Type)(using Context): Tree = - val tree1 = inlineIfNeeded( - tryInlineArg(tree.asInstanceOf[tpd.Tree]) `orElse` super.typedIdent(tree, pt) - ) + val locked = ctx.typerState.ownedVars + val tree0 = tryInlineArg(tree.asInstanceOf[tpd.Tree]) `orElse` super.typedIdent(tree, pt) + val tree1 = inlineIfNeeded(tree0, pt, locked) tree1 match case id: Ident if tpd.needsSelect(id.tpe) => inlining.println(i"expanding $id to selection") @@ -760,6 +760,7 @@ class Inliner(val call: tpd.Tree)(using Context): tree1 override def typedSelect(tree: untpd.Select, pt: Type)(using Context): Tree = { + val locked = ctx.typerState.ownedVars val qual1 = typed(tree.qualifier, shallowSelectionProto(tree.name, pt, this)) val resNoReduce = untpd.cpy.Select(tree)(qual1, tree.name).withType(tree.typeOpt) val reducedProjection = reducer.reduceProjection(resNoReduce) @@ -771,7 +772,7 @@ class Inliner(val call: tpd.Tree)(using Context): if resNoReduce ne res then typed(res, pt) // redo typecheck if reduction changed something else if res.symbol.isInlineMethod then - inlineIfNeeded(res) + inlineIfNeeded(res, pt, locked) else ensureAccessible(res.tpe, tree.qualifier.isInstanceOf[untpd.Super], tree.srcPos) res @@ -809,19 +810,22 @@ class Inliner(val call: tpd.Tree)(using Context): tree match case Quoted(Spliced(inner)) => inner case _ => tree + val locked = ctx.typerState.ownedVars val res = cancelQuotes(constToLiteral(betaReduce(super.typedApply(tree, pt)))) match { case res: Apply if res.symbol == defn.QuotedRuntime_exprSplice && StagingContext.level == 0 && !hasInliningErrors => val expanded = expandMacro(res.args.head, tree.srcPos) + transform.TreeChecker.checkMacroGeneratedTree(res, expanded) typedExpr(expanded) // Inline calls and constant fold code generated by the macro case res => - specializeEq(inlineIfNeeded(res)) + specializeEq(inlineIfNeeded(res, pt, locked)) } res override def typedTypeApply(tree: untpd.TypeApply, pt: Type)(using Context): Tree = - val tree1 = inlineIfNeeded(constToLiteral(betaReduce(super.typedTypeApply(tree, pt)))) + val locked = ctx.typerState.ownedVars + val tree1 = inlineIfNeeded(constToLiteral(betaReduce(super.typedTypeApply(tree, pt))), pt, locked) if tree1.symbol.isQuote then ctx.compilationUnit.needsStaging = true tree1 @@ -889,11 +893,11 @@ class Inliner(val call: tpd.Tree)(using Context): /** True if this inline typer has already issued errors */ override def hasInliningErrors(using Context) = ctx.reporter.errorCount > initialErrorCount - private def inlineIfNeeded(tree: Tree)(using Context): Tree = + private def inlineIfNeeded(tree: Tree, pt: Type, locked: TypeVars)(using Context): Tree = val meth = tree.symbol if meth.isAllOf(DeferredInline) then - errorTree(tree, i"Deferred inline ${meth.showLocated} cannot be invoked") - else if Inlines.needsInlining(tree) then Inlines.inlineCall(tree) + errorTree(tree, em"Deferred inline ${meth.showLocated} cannot be invoked") + else if Inlines.needsInlining(tree) then Inlines.inlineCall(simplify(tree, pt, locked)) else tree override def typedUnadapted(tree: untpd.Tree, pt: Type, locked: TypeVars)(using Context): Tree = diff --git a/compiler/src/dotty/tools/dotc/inlines/Inlines.scala b/compiler/src/dotty/tools/dotc/inlines/Inlines.scala index 8be23b932e98..8110fd2de195 100644 --- a/compiler/src/dotty/tools/dotc/inlines/Inlines.scala +++ 
b/compiler/src/dotty/tools/dotc/inlines/Inlines.scala @@ -85,7 +85,10 @@ object Inlines: if (tree.symbol == defn.CompiletimeTesting_typeChecks) return Intrinsics.typeChecks(tree) if (tree.symbol == defn.CompiletimeTesting_typeCheckErrors) return Intrinsics.typeCheckErrors(tree) - CrossVersionChecks.checkExperimentalRef(tree.symbol, tree.srcPos) + if ctx.isAfterTyper then + // During typer we wait with cross version checks until PostTyper, in order + // not to provoke cyclic references. See i16116 for a test case. + CrossVersionChecks.checkExperimentalRef(tree.symbol, tree.srcPos) if tree.symbol.isConstructor then return tree // error already reported for the inline constructor definition @@ -153,9 +156,9 @@ object Inlines: else ("successive inlines", ctx.settings.XmaxInlines) errorTree( tree, - i"""|Maximal number of $reason (${setting.value}) exceeded, - |Maybe this is caused by a recursive inline method? - |You can use ${setting.name} to change the limit.""".toMessage, + em"""|Maximal number of $reason (${setting.value}) exceeded, + |Maybe this is caused by a recursive inline method? + |You can use ${setting.name} to change the limit.""", (tree :: enclosingInlineds).last.srcPos ) if ctx.base.stopInlining && enclosingInlineds.isEmpty then @@ -178,37 +181,28 @@ object Inlines: // as its right hand side. The call to the wrapper unapply serves as the signpost for pattern matching. // After pattern matching, the anonymous class is removed in phase InlinePatterns with a beta reduction step. // - // An inline unapply `P.unapply` in a plattern `P(x1,x2,...)` is transformed into - // `{ class $anon { def unapply(t0: T0)(using t1: T1, t2: T2, ...): R = P.unapply(t0)(using t1, t2, ...) }; new $anon }.unapply` - // and the call `P.unapply(x1, x2, ...)` is inlined. + // An inline unapply `P.unapply` in a pattern `P[...](using ...)(x1,x2,...)(using t1: T1, t2: T2, ...)` is transformed into + // `{ class $anon { def unapply(s: S)(using t1: T1, t2: T2, ...): R = P.unapply[...](using ...)(s)(using t1, t2, ...) }; new $anon }.unapply(using y1,y2,...)` + // and the call `P.unapply[...](using ...)(x1, x2, ...)(using t1, t2, ...)` is inlined. // This serves as a placeholder for the inlined body until the `patternMatcher` phase. After pattern matcher // transforms the patterns into terms, the `inlinePatterns` phase removes this anonymous class by β-reducing // the call to the `unapply`. 
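// A small, self-contained illustration of the inline-unapply shape discussed in the
// comment above. `Pos` and `describe` are made-up example code, not compiler sources;
// they only show the source-level pattern whose unapply call the inliner wraps in an
// anonymous class and later removes again by beta-reduction in the InlinePatterns phase.
object InlineUnapplySketch:
  object Pos:
    transparent inline def unapply(n: Int): Option[Int] =
      if n > 0 then Some(n) else None

  def describe(n: Int): String = n match
    case Pos(v) => s"positive: $v"
    case _      => "not positive"

  // Conceptually, the call generated for `case Pos(v)` is wrapped as
  //   { class $anon { def unapply(s: Int): Option[Int] = Pos.unapply(s) }; new $anon }.unapply(...)
  // and the wrapper is erased again after pattern matching, as described above.

  def main(args: Array[String]): Unit =
    println(describe(3))   // positive: 3
    println(describe(-1))  // not positive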
- object SplitFunAndGivenArgs: - def unapply(tree: Tree): (Tree, List[List[Tree]]) = tree match - case Apply(SplitFunAndGivenArgs(fn, argss), args) => (fn, argss :+ args) - case _ => (tree, Nil) - val UnApply(SplitFunAndGivenArgs(fun, leadingImplicits), trailingImplicits, patterns) = unapp - if leadingImplicits.flatten.nonEmpty then - // To support them see https://github.com/lampepfl/dotty/pull/13158 - report.error("inline unapply methods with given parameters before the scrutinee are not supported", fun) + val UnApply(fun, trailingImplicits, patterns) = unapp val sym = unapp.symbol var unapplySym1: Symbol = NoSymbol // created from within AnonClass() and used afterwards val newUnapply = AnonClass(ctx.owner, List(defn.ObjectType), sym.coord) { cls => - val targs = fun match - case TypeApply(_, targs) => targs - case _ => Nil - val unapplyInfo = sym.info match - case info: PolyType => info.instantiate(targs.map(_.tpe)) - case info => info - - val unapplySym = newSymbol(cls, sym.name.toTermName, Synthetic | Method, unapplyInfo, coord = sym.coord).entered + // `fun` is a partially applied method that contains all type applications of the method. + // The methodic type `fun.tpe.widen` is the type of the function starting from the scrutinee argument + // and its type parameters are instantiated. + val unapplySym = newSymbol(cls, sym.name.toTermName, Synthetic | Method, fun.tpe.widen, coord = sym.coord).entered val unapply = DefDef(unapplySym.asTerm, argss => - inlineCall(fun.appliedToArgss(argss).withSpan(unapp.span))(using ctx.withOwner(unapplySym)) + val body = fun.appliedToArgss(argss).withSpan(unapp.span) + if body.symbol.is(Transparent) then inlineCall(body)(using ctx.withOwner(unapplySym)) + else body ) unapplySym1 = unapplySym List(unapply) @@ -235,8 +229,8 @@ object Inlines: val retainer = meth.copy( name = BodyRetainerName(meth.name), - flags = meth.flags &~ (Inline | Macro | Override) | Private, - coord = mdef.rhs.span.startPos).asTerm + flags = (meth.flags &~ (Inline | Macro | Override | AbsOverride)) | Private, + coord = mdef.rhs.span.startPos).asTerm.entered retainer.deriveTargetNameAnnotation(meth, name => BodyRetainerName(name.asTermName)) DefDef(retainer, prefss => inlineCall( @@ -439,8 +433,7 @@ object Inlines: val evidence = evTyper.inferImplicitArg(tpt.tpe, tpt.span) evidence.tpe match case fail: Implicits.SearchFailureType => - val msg = evTyper.missingArgMsg(evidence, tpt.tpe, "") - errorTree(call, em"$msg") + errorTree(call, evTyper.missingArgMsg(evidence, tpt.tpe, "")) case _ => evidence } diff --git a/compiler/src/dotty/tools/dotc/inlines/PrepareInlineable.scala b/compiler/src/dotty/tools/dotc/inlines/PrepareInlineable.scala index 7e47bbfdfa8a..85293d4a82d7 100644 --- a/compiler/src/dotty/tools/dotc/inlines/PrepareInlineable.scala +++ b/compiler/src/dotty/tools/dotc/inlines/PrepareInlineable.scala @@ -284,7 +284,7 @@ object PrepareInlineable { private def checkInlineMethod(inlined: Symbol, body: Tree)(using Context): body.type = { if Inlines.inInlineMethod(using ctx.outer) then - report.error(ex"Implementation restriction: nested inline methods are not supported", inlined.srcPos) + report.error(em"Implementation restriction: nested inline methods are not supported", inlined.srcPos) if (inlined.is(Macro) && !ctx.isAfterTyper) { diff --git a/compiler/src/dotty/tools/dotc/interactive/Completion.scala b/compiler/src/dotty/tools/dotc/interactive/Completion.scala index 6af34dc88362..e4d0cce9f6f9 100644 --- a/compiler/src/dotty/tools/dotc/interactive/Completion.scala 
+++ b/compiler/src/dotty/tools/dotc/interactive/Completion.scala @@ -17,6 +17,7 @@ import dotty.tools.dotc.core.Symbols.{NoSymbol, Symbol, defn, newSymbol} import dotty.tools.dotc.core.StdNames.nme import dotty.tools.dotc.core.SymDenotations.SymDenotation import dotty.tools.dotc.core.TypeError +import dotty.tools.dotc.core.Phases import dotty.tools.dotc.core.Types.{AppliedType, ExprType, MethodOrPoly, NameFilter, NoType, RefinedType, TermRef, Type, TypeProxy} import dotty.tools.dotc.parsing.Tokens import dotty.tools.dotc.util.Chars diff --git a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala index 6b2237a09b3f..fd6d426f39bb 100644 --- a/compiler/src/dotty/tools/dotc/interactive/Interactive.scala +++ b/compiler/src/dotty/tools/dotc/interactive/Interactive.scala @@ -313,8 +313,8 @@ object Interactive { case _ => } localCtx - case tree @ Template(constr, parents, self, _) => - if ((constr :: self :: parents).contains(nested)) outer + case tree @ Template(constr, _, self, _) => + if ((constr :: self :: tree.parentsOrDerived).contains(nested)) outer else contextOfStat(tree.body, nested, tree.symbol, outer.inClassContext(self.symbol)) case _ => outer diff --git a/compiler/src/dotty/tools/dotc/parsing/JavaParsers.scala b/compiler/src/dotty/tools/dotc/parsing/JavaParsers.scala index 4611554a01a3..daeebcbcc17c 100644 --- a/compiler/src/dotty/tools/dotc/parsing/JavaParsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/JavaParsers.scala @@ -71,10 +71,10 @@ object JavaParsers { } } - def syntaxError(msg: String, skipIt: Boolean): Unit = + def syntaxError(msg: Message, skipIt: Boolean): Unit = syntaxError(in.offset, msg, skipIt) - def syntaxError(offset: Int, msg: String, skipIt: Boolean): Unit = { + def syntaxError(offset: Int, msg: Message, skipIt: Boolean): Unit = { if (offset > lastErrorOffset) { syntaxError(msg, offset) // no more errors on this token. @@ -178,9 +178,7 @@ object JavaParsers { if (in.token != token) { val offsetToReport = in.offset val msg = - tokenString(token) + " expected but " + - tokenString(in.token) + " found." - + em"${tokenString(token)} expected but ${tokenString(in.token)} found." 
syntaxError(offsetToReport, msg, skipIt = true) } if (in.token == token) in.nextToken() @@ -271,7 +269,7 @@ object JavaParsers { case FLOAT => in.nextToken(); TypeTree(FloatType) case DOUBLE => in.nextToken(); TypeTree(DoubleType) case BOOLEAN => in.nextToken(); TypeTree(BooleanType) - case _ => syntaxError("illegal start of type", skipIt = true); errorTypeTree + case _ => syntaxError(em"illegal start of type", skipIt = true); errorTypeTree } } @@ -762,7 +760,7 @@ object JavaParsers { accept(SEMI) val names = buf.toList if (names.length < 2) { - syntaxError(start, "illegal import", skipIt = false) + syntaxError(start, em"illegal import", skipIt = false) List() } else { @@ -822,7 +820,7 @@ object JavaParsers { val iface = atSpan(start, nameOffset) { TypeDef( name, - makeTemplate(parents, body, tparams, false)).withMods(mods | Flags.Trait | Flags.JavaInterface | Flags.Abstract) + makeTemplate(parents, body, tparams, false)).withMods(mods | Flags.JavaInterface) } addCompanionObject(statics, iface) } @@ -858,10 +856,9 @@ object JavaParsers { } (statics.toList, members.toList) } - def annotationParents: List[Select] = List( - scalaAnnotationDot(tpnme.Annotation), - Select(javaLangDot(nme.annotation), tpnme.Annotation), - scalaAnnotationDot(tpnme.ClassfileAnnotation) + def annotationParents: List[Tree] = List( + javaLangObject(), + Select(javaLangDot(nme.annotation), tpnme.Annotation) ) def annotationDecl(start: Offset, mods: Modifiers): List[Tree] = { accept(AT) @@ -877,7 +874,7 @@ object JavaParsers { List(constructorParams), TypeTree(), EmptyTree).withMods(Modifiers(Flags.JavaDefined)) val templ = makeTemplate(annotationParents, constr :: body, List(), true) val annot = atSpan(start, nameOffset) { - TypeDef(name, templ).withMods(mods | Flags.Abstract) + TypeDef(name, templ).withMods(mods | Flags.JavaInterface | Flags.JavaAnnotation) } addCompanionObject(statics, annot) } @@ -955,7 +952,7 @@ object JavaParsers { case INTERFACE => interfaceDecl(start, mods) case AT => annotationDecl(start, mods) case CLASS => classDecl(start, mods) - case _ => in.nextToken(); syntaxError("illegal start of type declaration", skipIt = true); List(errorTypeTree) + case _ => in.nextToken(); syntaxError(em"illegal start of type declaration", skipIt = true); List(errorTypeTree) } def tryConstant: Option[Constant] = { diff --git a/compiler/src/dotty/tools/dotc/parsing/JavaScanners.scala b/compiler/src/dotty/tools/dotc/parsing/JavaScanners.scala index 1be8bdae6bd1..d21d4b85b5df 100644 --- a/compiler/src/dotty/tools/dotc/parsing/JavaScanners.scala +++ b/compiler/src/dotty/tools/dotc/parsing/JavaScanners.scala @@ -10,6 +10,7 @@ import JavaTokens._ import scala.annotation.{switch, tailrec} import util.Chars._ import PartialFunction.cond +import core.Decorators.em object JavaScanners { @@ -108,7 +109,7 @@ object JavaScanners { setStrVal() nextChar() else - error("unclosed string literal") + error(em"unclosed string literal") else nextChar() if ch != '\"' then // "" empty string literal @@ -127,7 +128,7 @@ object JavaScanners { setStrVal() } else - error("unclosed character literal") + error(em"unclosed character literal") case '=' => token = EQUALS @@ -298,7 +299,7 @@ object JavaScanners { nextChar() token = DOTDOTDOT } - else error("`.` character expected") + else error(em"`.` character expected") } case ';' => @@ -336,7 +337,7 @@ object JavaScanners { case SU => if (isAtEnd) token = EOF else { - error("illegal character") + error(em"illegal character") nextChar() } @@ -347,7 +348,7 @@ object JavaScanners { getIdentRest() 
} else { - error("illegal character: " + ch.toInt) + error(em"illegal character: ${ch.toInt}") nextChar() } } @@ -360,7 +361,7 @@ object JavaScanners { case _ => nextChar(); skipLineComment() } @tailrec def skipJavaComment(): Unit = ch match { - case SU => incompleteInputError("unclosed comment") + case SU => incompleteInputError(em"unclosed comment") case '*' => nextChar(); if (ch == '/') nextChar() else skipJavaComment() case _ => nextChar(); skipJavaComment() } @@ -480,7 +481,7 @@ object JavaScanners { nextChar() } if (ch != LF && ch != CR) { // CR-LF is already normalized into LF by `JavaCharArrayReader` - error("illegal text block open delimiter sequence, missing line terminator") + error(em"illegal text block open delimiter sequence, missing line terminator") return } nextChar() @@ -529,7 +530,7 @@ object JavaScanners { // Bail out if the block never did have an end if (!blockClosed) { - error("unclosed text block") + error(em"unclosed text block") return } @@ -642,14 +643,14 @@ object JavaScanners { while (i < len) { val d = digit2int(strVal.charAt(i), base) if (d < 0) { - error("malformed integer number") + error(em"malformed integer number") return 0 } if (value < 0 || limit / (base / divider) < value || limit - (d / divider) < value * (base / divider) && !(negated && limit == value * base - 1 + d)) { - error("integer number too large") + error(em"integer number too large") return 0 } value = value * base + d @@ -666,11 +667,11 @@ object JavaScanners { try { val value: Double = java.lang.Double.valueOf(strVal.toString).nn.doubleValue() if (value > limit) - error("floating point number too large") + error(em"floating point number too large") if (negated) -value else value } catch { case _: NumberFormatException => - error("malformed floating point number") + error(em"malformed floating point number") 0.0 } } diff --git a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala index e108e2d9cbeb..6c494db78c7f 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Parsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Parsers.scala @@ -143,21 +143,12 @@ object Parsers { syntaxError(msg, Span(offset, offset + length)) lastErrorOffset = in.offset - def syntaxError(msg: => String, offset: Int): Unit = - syntaxError(msg.toMessage, offset) - - def syntaxError(msg: => String): Unit = - syntaxError(msg, in.offset) - /** Unconditionally issue an error at given span, without * updating lastErrorOffset. */ def syntaxError(msg: Message, span: Span): Unit = report.error(msg, source.atSpan(span)) - def syntaxError(msg: => String, span: Span): Unit = - syntaxError(msg.toMessage, span) - def unimplementedExpr(using Context): Select = Select(scalaDot(nme.Predef), nme.???) 
} @@ -288,9 +279,6 @@ object Parsers { syntaxError(msg, offset) skip() - def syntaxErrorOrIncomplete(msg: => String): Unit = - syntaxErrorOrIncomplete(msg.toMessage, in.offset) - def syntaxErrorOrIncomplete(msg: Message, span: Span): Unit = if in.token == EOF then incompleteInputError(msg) @@ -346,7 +334,7 @@ object Parsers { in.nextToken() recur(true, endSeen) else if in.token == END then - if endSeen then syntaxError("duplicate end marker") + if endSeen then syntaxError(em"duplicate end marker") checkEndMarker(stats) recur(sepSeen, endSeen = true) else if isStatSeqEnd || in.token == altEnd then @@ -358,7 +346,7 @@ object Parsers { val statFollows = mustStartStatTokens.contains(found) syntaxError( if noPrevStat then IllegalStartOfStatement(what, isModifier, statFollows) - else i"end of $what expected but ${showToken(found)} found".toMessage) + else em"end of $what expected but ${showToken(found)} found") if mustStartStatTokens.contains(found) then false // it's a statement that might be legal in an outer context else @@ -460,7 +448,7 @@ object Parsers { */ def convertToParam(tree: Tree, mods: Modifiers): ValDef = def fail() = - syntaxError(s"not a legal formal parameter for a function literal", tree.span) + syntaxError(em"not a legal formal parameter for a function literal", tree.span) makeParameter(nme.ERROR, tree, mods) tree match case param: ValDef => @@ -618,11 +606,11 @@ object Parsers { if in.isNewLine && !(nextIndentWidth < startIndentWidth) then warning( if startIndentWidth <= nextIndentWidth then - i"""Line is indented too far to the right, or a `{` is missing before: - | - |${t.tryToShow}""".toMessage + em"""Line is indented too far to the right, or a `{` is missing before: + | + |${t.tryToShow}""" else - in.spaceTabMismatchMsg(startIndentWidth, nextIndentWidth).toMessage, + in.spaceTabMismatchMsg(startIndentWidth, nextIndentWidth), in.next.offset ) t @@ -635,7 +623,7 @@ object Parsers { if in.isNewLine then val nextIndentWidth = in.indentWidth(in.next.offset) if in.currentRegion.indentWidth < nextIndentWidth then - warning(i"Line is indented too far to the right, or a `{` or `:` is missing".toMessage, in.next.offset) + warning(em"Line is indented too far to the right, or a `{` or `:` is missing", in.next.offset) /* -------- REWRITES ----------------------------------------------------------- */ @@ -716,7 +704,11 @@ object Parsers { val t = enclosed(INDENT, body) if needsBraces(t) then patch(source, Span(startOpening, endOpening), " {") - patch(source, Span(closingOffset(source.nextLine(in.lastOffset))), indentWidth.toPrefix ++ "}\n") + val next = in.next + def closedByEndMarker = + next.token == END && (next.offset - next.lineOffset) == indentWidth.toPrefix.size + if closedByEndMarker then patch(source, Span(next.offset), "} // ") + else patch(source, Span(closingOffset(source.nextLine(in.lastOffset))), indentWidth.toPrefix ++ "}\n") t end indentedToBraces @@ -778,7 +770,7 @@ object Parsers { } }) canRewrite &= (in.isAfterLineEnd || statCtdTokens.contains(in.token)) // test (5) - if (canRewrite && (!underColonSyntax || in.fewerBracesEnabled)) { + if canRewrite && (!underColonSyntax || Feature.fewerBracesEnabled) then val openingPatchStr = if !colonRequired then "" else if testChar(startOpening - 1, Chars.isOperatorPart(_)) then " :" @@ -786,7 +778,6 @@ object Parsers { val (startClosing, endClosing) = closingElimRegion() patch(source, Span(startOpening, endOpening), openingPatchStr) patch(source, Span(startClosing, endClosing), "") - } t } @@ -957,7 +948,7 @@ object Parsers { 
lookahead.isArrow && { lookahead.nextToken() - lookahead.token == INDENT + lookahead.token == INDENT || lookahead.token == EOF } lookahead.nextToken() if lookahead.isIdent || lookahead.token == USCORE then @@ -1025,7 +1016,7 @@ object Parsers { * body */ def isColonLambda = - in.fewerBracesEnabled && in.token == COLONfollow && followingIsLambdaAfterColon() + Feature.fewerBracesEnabled && in.token == COLONfollow && followingIsLambdaAfterColon() /** operand { infixop operand | MatchClause } [postfixop], * @@ -1082,7 +1073,7 @@ object Parsers { val name = in.name if name == nme.CONSTRUCTOR || name == nme.STATIC_CONSTRUCTOR then report.error( - i"""Illegal backquoted identifier: `` and `` are forbidden""", + em"""Illegal backquoted identifier: `` and `` are forbidden""", in.sourcePos()) in.nextToken() name @@ -1235,7 +1226,7 @@ object Parsers { null } catch { - case ex: FromDigitsException => syntaxErrorOrIncomplete(ex.getMessage) + case ex: FromDigitsException => syntaxErrorOrIncomplete(ex.getMessage.toMessage) } Literal(Constant(value)) } @@ -1353,11 +1344,16 @@ object Parsers { // note: next is defined here because current == NEWLINE if (in.token == NEWLINE && p(in.next.token)) newLineOpt() - def colonAtEOLOpt(): Unit = { + def acceptIndent() = + if in.token != INDENT then + syntaxErrorOrIncomplete(em"indented definitions expected, ${in} found") + + def colonAtEOLOpt(): Unit = possibleColonOffset = in.lastOffset in.observeColonEOL(inTemplate = false) - if in.token == COLONeol then in.nextToken() - } + if in.token == COLONeol then + in.nextToken() + acceptIndent() def argumentStart(): Unit = colonAtEOLOpt() @@ -1365,9 +1361,9 @@ object Parsers { in.nextToken() if in.indentWidth(in.offset) == in.currentRegion.indentWidth then report.errorOrMigrationWarning( - i"""This opening brace will start a new statement in Scala 3. - |It needs to be indented to the right to keep being treated as - |an argument to the previous expression.${rewriteNotice()}""", + em"""This opening brace will start a new statement in Scala 3. 
+ |It needs to be indented to the right to keep being treated as + |an argument to the previous expression.${rewriteNotice()}""", in.sourcePos(), from = `3.0`) patch(source, Span(in.offset), " ") @@ -1377,8 +1373,7 @@ object Parsers { if in.lookahead.token == END then in.token = NEWLINE else in.nextToken() - if in.token != INDENT && in.token != LBRACE then - syntaxErrorOrIncomplete(i"indented definitions expected, ${in} found") + if in.token != LBRACE then acceptIndent() else newLineOptWhenFollowedBy(LBRACE) @@ -1419,10 +1414,7 @@ object Parsers { if in.token == END then val start = in.skipToken() if stats.isEmpty || !matchesAndSetEnd(stats.last) then - syntaxError("misaligned end marker", Span(start, in.lastCharOffset)) - else if overlapsPatch(source, Span(start, start)) then - patch(source, Span(start, start), "") - patch(source, Span(start, in.lastCharOffset), s"} // end $endName") + syntaxError(em"misaligned end marker", Span(start, in.lastCharOffset)) in.token = IDENTIFIER // Leaving it as the original token can confuse newline insertion in.nextToken() end checkEndMarker @@ -1506,7 +1498,7 @@ object Parsers { TermLambdaTypeTree(params.asInstanceOf[List[ValDef]], resultType) else if imods.isOneOf(Given | Erased | Impure) then if imods.is(Given) && params.isEmpty then - syntaxError("context function types require at least one parameter", paramSpan) + syntaxError(em"context function types require at least one parameter", paramSpan) FunctionWithMods(params, resultType, imods) else if !ctx.settings.YkindProjector.isDefault then val (newParams :+ newResultType, tparams) = replaceKindProjectorPlaceholders(params :+ resultType): @unchecked @@ -1569,7 +1561,7 @@ object Parsers { if (isFunction(body)) PolyFunction(tparams, body) else { - syntaxError("Implementation restriction: polymorphic function types must have a value parameter", arrowOffset) + syntaxError(em"Implementation restriction: polymorphic function types must have a value parameter", arrowOffset) Ident(nme.ERROR.toTypeName) } } @@ -1723,7 +1715,7 @@ object Parsers { val hint = if inPattern then "Use lower cased variable name without the `$` instead" else "To use a given Type[T] in a quote just write T directly" - syntaxError(s"$msg\n\nHint: $hint", Span(start, in.lastOffset)) + syntaxError(em"$msg\n\nHint: $hint", Span(start, in.lastOffset)) Ident(nme.ERROR.toTypeName) else Splice(expr) @@ -1744,7 +1736,7 @@ object Parsers { Ident(tpnme.USCOREkw).withSpan(Span(start, in.lastOffset, start)) else if sourceVersion.isAtLeast(future) then - deprecationWarning(em"`_` is deprecated for wildcard arguments of types: use `?` instead".toMessage) + deprecationWarning(em"`_` is deprecated for wildcard arguments of types: use `?` instead") patch(source, Span(in.offset, in.offset + 1), "?") val start = in.skipToken() typeBounds().withSpan(Span(start, in.lastOffset, start)) @@ -1805,7 +1797,7 @@ object Parsers { if (!ctx.settings.YkindProjector.isDefault) { def fail(): Tree = { syntaxError( - "λ requires a single argument of the form X => ... or (X, Y) => ...", + em"λ requires a single argument of the form X => ... 
or (X, Y) => ...", Span(startOffset(t), in.lastOffset) ) AppliedTypeTree(applied, args) @@ -1900,10 +1892,10 @@ object Parsers { val tp = paramTypeOf(core) val tp1 = tp match case ImpureByNameTypeTree(tp1) => - syntaxError("explicit captureSet is superfluous for impure call-by-name type", start) + syntaxError(em"explicit captureSet is superfluous for impure call-by-name type", start) tp1 case CapturingTypeTree(_, tp1: ByNameTypeTree) => - syntaxError("only one captureSet is allowed here", start) + syntaxError(em"only one captureSet is allowed here", start) tp1 case _: ByNameTypeTree if startTpOffset > endCsOffset => report.warning( @@ -1918,6 +1910,13 @@ object Parsers { else core() + private def maybeInto(tp: () => Tree) = + if in.isIdent(nme.into) + && in.featureEnabled(Feature.into) + && canStartTypeTokens.contains(in.lookahead.token) + then atSpan(in.skipToken()) { Into(tp()) } + else tp() + /** FunArgType ::= Type * | `=>' Type * | [CaptureSet] `->' Type @@ -1930,10 +1929,10 @@ object Parsers { */ def paramType(): Tree = paramTypeOf(paramValueType) - /** ParamValueType ::= Type [`*'] + /** ParamValueType ::= [`into`] Type [`*'] */ def paramValueType(): Tree = { - val t = toplevelTyp() + val t = maybeInto(toplevelTyp) if (isIdent(nme.raw.STAR)) { in.nextToken() atSpan(startOffset(t)) { PostfixOp(t, Ident(tpnme.raw.STAR)) } @@ -1979,7 +1978,7 @@ object Parsers { } :: contextBounds(pname) else if in.token == VIEWBOUND then report.errorOrMigrationWarning( - "view bounds `<%' are no longer supported, use a context bound `:' instead", + em"view bounds `<%' are no longer supported, use a context bound `:' instead", in.sourcePos(), from = `3.0`) atSpan(in.skipToken()) { Function(Ident(pname) :: Nil, toplevelTyp()) @@ -2095,7 +2094,7 @@ object Parsers { if (isFunction(body)) PolyFunction(tparams, body) else { - syntaxError("Implementation restriction: polymorphic function literals must have a value parameter", arrowOffset) + syntaxError(em"Implementation restriction: polymorphic function literals must have a value parameter", arrowOffset) errorTermTree(arrowOffset) } } @@ -2130,8 +2129,8 @@ object Parsers { } case DO => report.errorOrMigrationWarning( - i"""`do while ` is no longer supported, - |use `while ; do ()` instead.${rewriteNotice()}""", + em"""`do while ` is no longer supported, + |use `while ; do ()` instead.${rewriteNotice()}""", in.sourcePos(), from = `3.0`) val start = in.skipToken() atSpan(start) { @@ -2319,7 +2318,7 @@ object Parsers { val t = if ((in.token == COLONop || in.token == COLONfollow) && location == Location.InBlock) { report.errorOrMigrationWarning( - s"This syntax is no longer supported; parameter needs to be enclosed in (...)${rewriteNotice(`future-migration`)}", + em"This syntax is no longer supported; parameter needs to be enclosed in (...)${rewriteNotice(`future-migration`)}", source.atSpan(Span(start, in.lastOffset)), from = future) in.nextToken() @@ -2356,7 +2355,7 @@ object Parsers { atSpan(start, in.offset) { if in.token == CTXARROW then if params.isEmpty then - syntaxError("context function literals require at least one formal parameter", Span(start, in.lastOffset)) + syntaxError(em"context function literals require at least one formal parameter", Span(start, in.lastOffset)) in.nextToken() else accept(ARROW) @@ -2370,7 +2369,7 @@ object Parsers { /** PostfixExpr ::= InfixExpr [id [nl]] * InfixExpr ::= PrefixExpr * | InfixExpr id [nl] InfixExpr - * | InfixExpr id `:` IndentedExpr + * | InfixExpr id ColonArgument * | InfixExpr MatchClause */ def 
postfixExpr(location: Location = Location.ElseWhere): Tree = @@ -2414,10 +2413,11 @@ object Parsers { * | SimpleExpr `.` MatchClause * | SimpleExpr (TypeArgs | NamedTypeArgs) * | SimpleExpr1 ArgumentExprs - * | SimpleExpr1 `:` ColonArgument -- under language.experimental.fewerBraces - * ColonArgument ::= indent (CaseClauses | Block) outdent - * | FunParams (‘=>’ | ‘?=>’) ColonArgBody - * | HkTypeParamClause ‘=>’ ColonArgBody + * | SimpleExpr1 ColonArgument + * ColonArgument ::= colon [LambdaStart] + * indent (CaseClauses | Block) outdent + * LambdaStart ::= FunParams (‘=>’ | ‘?=>’) + * | HkTypeParamClause ‘=>’ * ColonArgBody ::= indent (CaseClauses | Block) outdent * Quoted ::= ‘'’ ‘{’ Block ‘}’ * | ‘'’ ‘[’ Type ‘]’ @@ -2778,10 +2778,10 @@ object Parsers { CaseDef(pat, grd, atSpan(accept(ARROW)) { if exprOnly then if in.indentSyntax && in.isAfterLineEnd && in.token != INDENT then - warning(i"""Misleading indentation: this expression forms part of the preceding catch case. - |If this is intended, it should be indented for clarity. - |Otherwise, if the handler is intended to be empty, use a multi-line catch with - |an indented case.""".toMessage) + warning(em"""Misleading indentation: this expression forms part of the preceding catch case. + |If this is intended, it should be indented for clarity. + |Otherwise, if the handler is intended to be empty, use a multi-line catch with + |an indented case.""") expr() else block() }) @@ -2822,11 +2822,25 @@ object Parsers { if (isIdent(nme.raw.BAR)) { in.nextToken(); pattern1(location) :: patternAlts(location) } else Nil - /** Pattern1 ::= Pattern2 [Ascription] + /** Pattern1 ::= PatVar Ascription + * | [‘-’] integerLiteral Ascription + * | [‘-’] floatingPointLiteral Ascription + * | Pattern2 */ def pattern1(location: Location = Location.InPattern): Tree = val p = pattern2() if in.isColon then + val isVariableOrNumber = isVarPattern(p) || p.isInstanceOf[Number] + if !isVariableOrNumber then + report.gradualErrorOrMigrationWarning( + em"""Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + |are no longer supported. 
Remove the type ascription or move it to a separate variable pattern.""", + in.sourcePos(), + warnFrom = `3.3`, + errorFrom = future + ) in.nextToken() ascription(p, location) else p @@ -3003,7 +3017,7 @@ object Parsers { if in.token == THIS then if sourceVersion.isAtLeast(future) then deprecationWarning( - "The [this] qualifier will be deprecated in the future; it should be dropped.".toMessage) + em"The [this] qualifier will be deprecated in the future; it should be dropped.") in.nextToken() mods | Local else mods.withPrivateWithin(ident().toTypeName) @@ -3094,7 +3108,7 @@ object Parsers { def variance(vflag: FlagSet): FlagSet = if ownerKind == ParamOwner.Def || ownerKind == ParamOwner.TypeParam then - syntaxError(i"no `+/-` variance annotation allowed here") + syntaxError(em"no `+/-` variance annotation allowed here") in.nextToken() EmptyFlags else @@ -3185,7 +3199,7 @@ object Parsers { addMod(mods, mod) else if (!(mods.flags &~ (ParamAccessor | Inline | impliedMods.flags)).isEmpty) - syntaxError("`val` or `var` expected") + syntaxError(em"`val` or `var` expected") if (firstClause && ofCaseClass) mods else mods | PrivateLocal } @@ -3231,7 +3245,7 @@ object Parsers { else paramMods() if givenOnly && !impliedMods.is(Given) then - syntaxError("`using` expected") + syntaxError(em"`using` expected") val isParams = !impliedMods.is(Given) || startParamTokens.contains(in.token) @@ -3310,19 +3324,19 @@ object Parsers { in.languageImportContext = in.languageImportContext.importContext(imp, NoSymbol) for case ImportSelector(id @ Ident(imported), EmptyTree, _) <- selectors do if Feature.handleGlobalLanguageImport(prefix, imported) && !outermost then - syntaxError(i"this language import is only allowed at the toplevel", id.span) + syntaxError(em"this language import is only allowed at the toplevel", id.span) if allSourceVersionNames.contains(imported) && prefix.isEmpty then if !outermost then - syntaxError(i"source version import is only allowed at the toplevel", id.span) + syntaxError(em"source version import is only allowed at the toplevel", id.span) else if ctx.compilationUnit.sourceVersion.isDefined then - syntaxError(i"duplicate source version import", id.span) + syntaxError(em"duplicate source version import", id.span) else if illegalSourceVersionNames.contains(imported) then val candidate = val nonMigration = imported.toString.replace("-migration", "") validSourceVersionNames.find(_.show == nonMigration) - val baseMsg = i"`$imported` is not a valid source version" + val baseMsg = em"`$imported` is not a valid source version" val msg = candidate match - case Some(member) => i"$baseMsg, did you mean language.`$member`?" + case Some(member) => baseMsg.append(i", did you mean language.`$member`?") case _ => baseMsg syntaxError(msg, id.span) else @@ -3385,7 +3399,7 @@ object Parsers { case _ => if isIdent(nme.raw.STAR) then wildcardSelector() else - if !idOK then syntaxError(i"named imports cannot follow wildcard imports") + if !idOK then syntaxError(em"named imports cannot follow wildcard imports") namedSelector(termIdent()) } @@ -3485,7 +3499,7 @@ object Parsers { if sourceVersion.isAtLeast(future) then deprecationWarning( em"""`= _` has been deprecated; use `= uninitialized` instead. 
- |`uninitialized` can be imported with `scala.compiletime.uninitialized`.""".toMessage, + |`uninitialized` can be imported with `scala.compiletime.uninitialized`.""", rhsOffset) placeholderParams = placeholderParams.tail atSpan(rhs0.span) { Ident(nme.WILDCARD) } @@ -3526,7 +3540,7 @@ object Parsers { else ": Unit " // trailing space ensures that `def f()def g()` works. if migrateTo3 then report.errorOrMigrationWarning( - s"Procedure syntax no longer supported; `$toInsert` should be inserted here", + em"Procedure syntax no longer supported; `$toInsert` should be inserted here", in.sourcePos(), from = `3.0`) patch(source, Span(in.lastOffset), toInsert) true @@ -3538,7 +3552,7 @@ object Parsers { val vparamss = paramClauses(numLeadParams = numLeadParams) if (vparamss.isEmpty || vparamss.head.take(1).exists(_.mods.isOneOf(GivenOrImplicit))) in.token match { - case LBRACKET => syntaxError("no type parameters allowed here") + case LBRACKET => syntaxError(em"no type parameters allowed here") case EOF => incompleteInputError(AuxConstructorNeedsNonImplicitParameter()) case _ => syntaxError(AuxConstructorNeedsNonImplicitParameter(), nameStart) } @@ -3631,13 +3645,13 @@ object Parsers { case TypeBoundsTree(EmptyTree, upper, _) => rhs = MatchTypeTree(upper, mtt.selector, mtt.cases) case _ => - syntaxError(i"cannot combine lower bound and match type alias", eqOffset) + syntaxError(em"cannot combine lower bound and match type alias", eqOffset) } case _ => if mods.is(Opaque) then rhs = TypeBoundsTree(bounds.lo, bounds.hi, rhs) else - syntaxError(i"cannot combine bound and alias", eqOffset) + syntaxError(em"cannot combine bound and alias", eqOffset) } makeTypeDef(rhs) } @@ -3718,7 +3732,7 @@ object Parsers { private def checkAccessOnly(mods: Modifiers, where: String): Modifiers = val mods1 = mods & (AccessFlags | Enum) if mods1 ne mods then - syntaxError(s"Only access modifiers are allowed on enum $where") + syntaxError(em"Only access modifiers are allowed on enum $where") mods1 /** EnumDef ::= id ClassConstr InheritClauses EnumBody @@ -3774,17 +3788,17 @@ object Parsers { vparamss: List[List[Tree]], stat: Tree): Unit = stat match { case stat: DefDef => if stat.mods.is(ExtensionMethod) && vparamss.nonEmpty then - syntaxError(i"no extension method allowed here since leading parameter was already given", stat.span) + syntaxError(em"no extension method allowed here since leading parameter was already given", stat.span) else if !stat.mods.is(ExtensionMethod) && vparamss.isEmpty then - syntaxError(i"an extension method is required here", stat.span) + syntaxError(em"an extension method is required here", stat.span) else if tparams.nonEmpty && stat.leadingTypeParams.nonEmpty then - syntaxError(i"extension method cannot have type parameters since some were already given previously", + syntaxError(em"extension method cannot have type parameters since some were already given previously", stat.leadingTypeParams.head.span) else if stat.rhs.isEmpty then - syntaxError(i"extension method cannot be abstract", stat.span) + syntaxError(em"extension method cannot be abstract", stat.span) case EmptyTree => case stat => - syntaxError(i"extension clause can only define methods", stat.span) + syntaxError(em"extension clause can only define methods", stat.span) } /** GivenDef ::= [GivenSig] (AnnotType [‘=’ Expr] | StructuralInstance) @@ -3807,7 +3821,7 @@ object Parsers { if !(name.isEmpty && noParams) then acceptColon() val parents = if isSimpleLiteral then rejectWildcardType(annotType()) :: Nil - else constrApp() :: 
withConstrApps() + else refinedTypeRest(constrApp()) :: withConstrApps() val parentsIsType = parents.length == 1 && parents.head.isType if in.token == EQUALS && parentsIsType then accept(EQUALS) @@ -3850,7 +3864,7 @@ object Parsers { do () leadParamss ++= paramClauses(givenOnly = true, numLeadParams = nparams) if in.isColon then - syntaxError("no `:` expected here") + syntaxError(em"no `:` expected here") in.nextToken() val methods: List[Tree] = if in.token == EXPORT then @@ -3861,7 +3875,7 @@ object Parsers { in.observeIndented() newLineOptWhenFollowedBy(LBRACE) if in.isNestedStart then inDefScopeBraces(extMethods(nparams)) - else { syntaxErrorOrIncomplete("Extension without extension methods") ; Nil } + else { syntaxErrorOrIncomplete(em"Extension without extension methods") ; Nil } val result = atSpan(start)(ExtMethods(joinParams(tparams, leadParamss.toList), methods)) val comment = in.getDocComment(start) if comment.isDefined then @@ -3894,7 +3908,7 @@ object Parsers { meths += defDefOrDcl(start, mods, numLeadParams) in.token != EOF && statSepOrEnd(meths, what = "extension method") do () - if meths.isEmpty then syntaxErrorOrIncomplete("`def` expected") + if meths.isEmpty then syntaxErrorOrIncomplete(em"`def` expected") meths.toList } @@ -3940,7 +3954,7 @@ object Parsers { in.nextToken() if (in.token == LBRACE || in.token == COLONeol) { report.errorOrMigrationWarning( - "`extends` must be followed by at least one parent", + em"`extends` must be followed by at least one parent", in.sourcePos(), from = `3.0`) Nil } @@ -4082,7 +4096,7 @@ object Parsers { in.token = SELFARROW // suppresses INDENT insertion after `=>` in.nextToken() else - syntaxError("`=>` expected after self type") + syntaxError(em"`=>` expected after self type") makeSelfDef(selfName, selfTpt) } else EmptyValDef @@ -4129,24 +4143,26 @@ object Parsers { def refineStatSeq(): List[Tree] = { val stats = new ListBuffer[Tree] def checkLegal(tree: Tree): List[Tree] = - val problem = tree match + def ok = tree :: Nil + def fail(msg: Message) = + syntaxError(msg, tree.span) + Nil + tree match case tree: ValDef if tree.mods.is(Mutable) => - i"""refinement cannot be a mutable var. - |You can use an explicit getter ${tree.name} and setter ${tree.name}_= instead""" + fail(em"""refinement cannot be a mutable var. 
+ |You can use an explicit getter ${tree.name} and setter ${tree.name}_= instead""") case tree: MemberDef if !(tree.mods.flags & ModifierFlags).isEmpty => - i"refinement cannot be ${(tree.mods.flags & ModifierFlags).flagStrings().mkString("`", "`, `", "`")}" + fail(em"refinement cannot be ${(tree.mods.flags & ModifierFlags).flagStrings().mkString("`", "`, `", "`")}") case tree: DefDef if tree.termParamss.nestedExists(!_.rhs.isEmpty) => - i"refinement cannot have default arguments" + fail(em"refinement cannot have default arguments") case tree: ValOrDefDef => - if tree.rhs.isEmpty then "" - else "refinement cannot have a right-hand side" + if tree.rhs.isEmpty then ok + else fail(em"refinement cannot have a right-hand side") case tree: TypeDef => - if !tree.isClassDef then "" - else "refinement cannot be a class or trait" + if !tree.isClassDef then ok + else fail(em"refinement cannot be a class or trait") case _ => - "this kind of definition cannot be a refinement" - if problem.isEmpty then tree :: Nil - else { syntaxError(problem, tree.span); Nil } + fail(em"this kind of definition cannot be a refinement") while val dclFound = isDclIntro diff --git a/compiler/src/dotty/tools/dotc/parsing/Scanners.scala b/compiler/src/dotty/tools/dotc/parsing/Scanners.scala index a4eff045b4ac..b3d824a2efd2 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Scanners.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Scanners.scala @@ -17,9 +17,11 @@ import scala.collection.mutable import scala.collection.immutable.SortedMap import rewrites.Rewrites.patch import config.Feature -import config.Feature.migrateTo3 +import config.Feature.{migrateTo3, fewerBracesEnabled} import config.SourceVersion.`3.0` -import reporting.{NoProfile, Profile} +import reporting.{NoProfile, Profile, Message} + +import java.util.Objects object Scanners { @@ -100,19 +102,23 @@ object Scanners { */ var errOffset: Offset = NoOffset + /** Implements CharArrayReader's error method */ + protected def error(msg: String, off: Offset): Unit = + error(msg.toMessage, off) + /** Generate an error at the given offset */ - def error(msg: String, off: Offset = offset): Unit = { + def error(msg: Message, off: Offset = offset): Unit = { errorButContinue(msg, off) token = ERROR errOffset = off } - def errorButContinue(msg: String, off: Offset = offset): Unit = + def errorButContinue(msg: Message, off: Offset = offset): Unit = report.error(msg, sourcePos(off)) /** signal an error where the input ended in the middle of a token */ - def incompleteInputError(msg: String): Unit = { - report.incompleteInputError(msg.toMessage, sourcePos()) + def incompleteInputError(msg: Message): Unit = { + report.incompleteInputError(msg, sourcePos()) token = EOF errOffset = offset } @@ -122,9 +128,11 @@ object Scanners { // Setting token data ---------------------------------------------------- + protected def initialCharBufferSize = 1024 + /** A character buffer for literals */ - protected val litBuf = CharBuffer() + protected val litBuf = CharBuffer(initialCharBufferSize) /** append Unicode character to "litBuf" buffer */ @@ -159,7 +167,7 @@ object Scanners { // disallow trailing numeric separator char, but continue lexing def checkNoTrailingSeparator(): Unit = if (!litBuf.isEmpty && isNumberSeparator(litBuf.last)) - errorButContinue("trailing separator is not allowed", offset + litBuf.length - 1) + errorButContinue(em"trailing separator is not allowed", offset + litBuf.length - 1) } class Scanner(source: SourceFile, override val startFrom: Offset = 0, profile: Profile = 
NoProfile, allowIndent: Boolean = true)(using Context) extends ScannerCommon(source) { @@ -192,7 +200,7 @@ object Scanners { val rewriteTargets = List(s.newSyntax, s.oldSyntax, s.indent, s.noindent) val enabled = rewriteTargets.filter(_.value) if (enabled.length > 1) - error(s"illegal combination of -rewrite targets: ${enabled(0).name} and ${enabled(1).name}") + error(em"illegal combination of -rewrite targets: ${enabled(0).name} and ${enabled(1).name}") } private var myLanguageImportContext: Context = ctx @@ -202,25 +210,6 @@ object Scanners { def featureEnabled(name: TermName) = Feature.enabled(name)(using languageImportContext) def erasedEnabled = featureEnabled(Feature.erasedDefinitions) - private inline val fewerBracesByDefault = false - // turn on to study impact on codebase if `fewerBraces` was the default - - private var fewerBracesEnabledCache = false - private var fewerBracesEnabledCtx: Context = NoContext - - def fewerBracesEnabled = - if fewerBracesEnabledCtx ne myLanguageImportContext then - fewerBracesEnabledCache = - featureEnabled(Feature.fewerBraces) - || fewerBracesByDefault && indentSyntax && !migrateTo3 - // ensure that fewer braces is not the default for 3.0-migration since - // { x: T => - // expr - // } - // would be ambiguous - fewerBracesEnabledCtx = myLanguageImportContext - fewerBracesEnabledCache - private var postfixOpsEnabledCache = false private var postfixOpsEnabledCtx: Context = NoContext @@ -257,14 +246,14 @@ object Scanners { def getDocComment(pos: Int): Option[Comment] = docstringMap.get(pos) /** A buffer for comments */ - private val commentBuf = CharBuffer() + private val commentBuf = CharBuffer(initialCharBufferSize) def toToken(identifier: SimpleName): Token = def handleMigration(keyword: Token): Token = if scala3keywords.contains(keyword) && migrateTo3 then val what = tokenString(keyword) report.errorOrMigrationWarning( - i"$what is now a keyword, write `$what` instead of $what to keep it as an identifier", + em"$what is now a keyword, write `$what` instead of $what to keep it as an identifier", sourcePos(), from = `3.0`) patch(source, Span(offset), "`") @@ -566,7 +555,7 @@ object Scanners { // If nextWidth is an indentation level not yet seen by enclosing indentation // region, invoke `handler`. - def handleNewIndentWidth(r: Region, handler: Indented => Unit): Unit = r match + inline def handleNewIndentWidth(r: Region, inline handler: Indented => Unit): Unit = r match case r @ Indented(curWidth, prefix, outer) if curWidth < nextWidth && !r.otherIndentWidths.contains(nextWidth) && nextWidth != lastWidth => handler(r) @@ -584,7 +573,7 @@ object Scanners { * they start with `(`, `[` or `{`, or the last statement ends in a `return`. * The Scala 2 rules apply under source `3.0-migration` or under `-no-indent`. */ - def isContinuing = + inline def isContinuing = lastWidth < nextWidth && (openParensTokens.contains(token) || lastToken == RETURN) && !pastBlankLine @@ -621,10 +610,11 @@ object Scanners { case r: Indented => insert(OUTDENT, offset) handleNewIndentWidth(r.enclosing, ir => + val lw = lastWidth errorButContinue( - i"""The start of this line does not match any of the previous indentation widths. - |Indentation width of current line : $nextWidth - |This falls between previous widths: ${ir.width} and $lastWidth""")) + em"""The start of this line does not match any of the previous indentation widths. 
+ |Indentation width of current line : $nextWidth + |This falls between previous widths: ${ir.width} and $lw""")) case r => if skipping then if r.enclosing.isClosedByUndentAt(nextWidth) then @@ -640,16 +630,17 @@ object Scanners { else if lastToken == SELFARROW then currentRegion.knownWidth = nextWidth else if (lastWidth != nextWidth) - errorButContinue(spaceTabMismatchMsg(lastWidth, nextWidth)) + val lw = lastWidth + errorButContinue(spaceTabMismatchMsg(lw, nextWidth)) if token != OUTDENT then handleNewIndentWidth(currentRegion, _.otherIndentWidths += nextWidth) if next.token == EMPTY then profile.recordNewLine() end handleNewLine - def spaceTabMismatchMsg(lastWidth: IndentWidth, nextWidth: IndentWidth) = - i"""Incompatible combinations of tabs and spaces in indentation prefixes. - |Previous indent : $lastWidth + def spaceTabMismatchMsg(lastWidth: IndentWidth, nextWidth: IndentWidth): Message = + em"""Incompatible combinations of tabs and spaces in indentation prefixes. + |Previous indent : $lastWidth |Latest indent : $nextWidth""" def observeColonEOL(inTemplate: Boolean): Unit = @@ -792,22 +783,24 @@ object Scanners { private def isSupplementary(high: Char, test: Int => Boolean, strict: Boolean = true): Boolean = isHighSurrogate(high) && { var res = false - nextChar() - val low = ch + val low = lookaheadChar() if isLowSurrogate(low) then - nextChar() val codepoint = toCodePoint(high, low) - if isValidCodePoint(codepoint) && test(codepoint) then - putChar(high) - putChar(low) - res = true + if isValidCodePoint(codepoint) then + if test(codepoint) then + putChar(high) + putChar(low) + nextChar() + nextChar() + res = true else - error(s"illegal character '${toUnicode(high)}${toUnicode(low)}'") + error(em"illegal character '${toUnicode(high)}${toUnicode(low)}'") else if !strict then putChar(high) + nextChar() res = true else - error(s"illegal character '${toUnicode(high)}' missing low surrogate") + error(em"illegal character '${toUnicode(high)}' missing low surrogate") res } private def atSupplementary(ch: Char, f: Int => Boolean): Boolean = @@ -884,7 +877,7 @@ object Scanners { case _ => base = 10 ; putChar('0') } if (base != 10 && !isNumberSeparator(ch) && digit2int(ch, base) < 0) - error("invalid literal number") + error(em"invalid literal number") } fetchLeadingZero() getNumber() @@ -904,7 +897,6 @@ object Scanners { if (ch == '\"') { if (lookaheadChar() == '\"') { nextRawChar() - //offset += 3 // first part is positioned at the quote nextRawChar() stringPart(multiLine = true) } @@ -915,7 +907,6 @@ object Scanners { } } else { - //offset += 1 // first part is positioned at the quote stringPart(multiLine = false) } } @@ -950,13 +941,13 @@ object Scanners { val isEmptyCharLit = (ch == '\'') getLitChar() if ch == '\'' then - if isEmptyCharLit then error("empty character literal (use '\\'' for single quote)") - else if litBuf.length != 1 then error("illegal codepoint in Char constant: " + litBuf.toString.map(toUnicode).mkString("'", "", "'")) + if isEmptyCharLit then error(em"empty character literal (use '\\'' for single quote)") + else if litBuf.length != 1 then error(em"illegal codepoint in Char constant: ${litBuf.toString.map(toUnicode).mkString("'", "", "'")}") else finishCharLit() - else if isEmptyCharLit then error("empty character literal") - else error("unclosed character literal") + else if isEmptyCharLit then error(em"empty character literal") + else error(em"unclosed character literal") case _ => - error("unclosed character literal") + error(em"unclosed character literal") } } 
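
The reworked `isSupplementary` above now peeks at the low surrogate with `lookaheadChar()` and only consumes both characters once the pair is known to form a valid supplementary code point that passes the supplied predicate; on failure nothing is consumed, so the scanner can fall back to other token forms. A standalone sketch of that surrogate-pair composition, using only `java.lang.Character` calls (the `supplementaryIdentStart` helper and the demo are illustrative names, not compiler code):

```scala
import java.lang.Character.{isHighSurrogate, isLowSurrogate, toCodePoint, isValidCodePoint, isUnicodeIdentifierStart}

// Compose a supplementary code point from a UTF-16 surrogate pair and ask
// whether it may start an identifier, mirroring what the scanner checks via
// isSupplementary(ch, isUnicodeIdentifierStart) before consuming any input.
def supplementaryIdentStart(high: Char, low: Char): Boolean =
  isHighSurrogate(high) && isLowSurrogate(low) && {
    val codepoint = toCodePoint(high, low)
    isValidCodePoint(codepoint) && isUnicodeIdentifierStart(codepoint)
  }

@main def surrogateDemo(): Unit =
  val s = "𝔸"                                  // U+1D538, encoded as two UTF-16 chars
  println(s.length)                            // 2
  println(supplementaryIdentStart(s(0), s(1))) // true
```
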
fetchSingleQuote() @@ -987,35 +978,34 @@ object Scanners { case SU => if (isAtEnd) token = EOF else { - error("illegal character") + error(em"illegal character") nextChar() } case _ => def fetchOther() = - if (ch == '\u21D2') { + if ch == '\u21D2' then nextChar(); token = ARROW - report.deprecationWarning("The unicode arrow `⇒` is deprecated, use `=>` instead. If you still wish to display it as one character, consider using a font with programming ligatures such as Fira Code.", sourcePos(offset)) - } - else if (ch == '\u2190') { + report.deprecationWarning(em"The unicode arrow `⇒` is deprecated, use `=>` instead. If you still wish to display it as one character, consider using a font with programming ligatures such as Fira Code.", sourcePos(offset)) + else if ch == '\u2190' then nextChar(); token = LARROW - report.deprecationWarning("The unicode arrow `←` is deprecated, use `<-` instead. If you still wish to display it as one character, consider using a font with programming ligatures such as Fira Code.", sourcePos(offset)) - } - else if (Character.isUnicodeIdentifierStart(ch)) { + report.deprecationWarning(em"The unicode arrow `←` is deprecated, use `<-` instead. If you still wish to display it as one character, consider using a font with programming ligatures such as Fira Code.", sourcePos(offset)) + else if isUnicodeIdentifierStart(ch) then putChar(ch) nextChar() getIdentRest() - } - else if (isSpecial(ch)) { + if ch == '"' && token == IDENTIFIER then token = INTERPOLATIONID + else if isSpecial(ch) then putChar(ch) nextChar() getOperatorRest() - } else if isSupplementary(ch, isUnicodeIdentifierStart) then getIdentRest() - else { - error(s"illegal character '${toUnicode(ch)}'") + if ch == '"' && token == IDENTIFIER then token = INTERPOLATIONID + else if isSupplementary(ch, isSpecial) then + getOperatorRest() + else + error(em"illegal character '${toUnicode(ch)}'") nextChar() - } fetchOther() } } @@ -1043,7 +1033,7 @@ object Scanners { if (ch == '/') nextChar() else skipComment() } - else if (ch == SU) incompleteInputError("unclosed comment") + else if (ch == SU) incompleteInputError(em"unclosed comment") else { nextChar(); skipComment() } def nestedComment() = { nextChar(); skipComment() } val start = lastCharOffset @@ -1091,6 +1081,7 @@ object Scanners { next class LookaheadScanner(val allowIndent: Boolean = false) extends Scanner(source, offset, allowIndent = allowIndent) { + override protected def initialCharBufferSize = 8 override def languageImportContext = Scanner.this.languageImportContext } @@ -1123,14 +1114,14 @@ object Scanners { nextChar() finishNamedToken(BACKQUOTED_IDENT, target = this) if (name.length == 0) - error("empty quoted identifier") + error(em"empty quoted identifier") else if (name == nme.WILDCARD) - error("wildcard invalid as backquoted identifier") + error(em"wildcard invalid as backquoted identifier") } - else error("unclosed quoted identifier") + else error(em"unclosed quoted identifier") } - private def getIdentRest(): Unit = (ch: @switch) match { + @tailrec private def getIdentRest(): Unit = (ch: @switch) match { case 'A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | @@ -1165,7 +1156,7 @@ object Scanners { finishNamed() } - private def getOperatorRest(): Unit = (ch: @switch) match { + @tailrec private def getOperatorRest(): Unit = (ch: @switch) match { case '~' | '!' | '@' | '#' | '%' | '^' | '*' | '+' | '-' | '<' | '>' | '?' 
| ':' | '=' | '&' | @@ -1176,23 +1167,13 @@ object Scanners { if nxch == '/' || nxch == '*' then finishNamed() else { putChar(ch); nextChar(); getOperatorRest() } case _ => - if (isSpecial(ch)) { putChar(ch); nextChar(); getOperatorRest() } + if isSpecial(ch) then { putChar(ch); nextChar(); getOperatorRest() } + else if isSupplementary(ch, isSpecial) then getOperatorRest() else finishNamed() } private def getIdentOrOperatorRest(): Unit = - if (isIdentifierPart(ch)) - getIdentRest() - else ch match { - case '~' | '!' | '@' | '#' | '%' | - '^' | '*' | '+' | '-' | '<' | - '>' | '?' | ':' | '=' | '&' | - '|' | '\\' | '/' => - getOperatorRest() - case _ => - if (isSpecial(ch)) getOperatorRest() - else finishNamed() - } + if (isIdentifierPart(ch) || isSupplementary(ch, isIdentifierPart)) getIdentRest() else getOperatorRest() def isSoftModifier: Boolean = token == IDENTIFIER @@ -1221,7 +1202,7 @@ object Scanners { nextChar() token = STRINGLIT } - else error("unclosed string literal") + else error(em"unclosed string literal") } private def getRawStringLit(): Unit = @@ -1235,7 +1216,7 @@ object Scanners { getRawStringLit() } else if (ch == SU) - incompleteInputError("unclosed multi-line string literal") + incompleteInputError(em"unclosed multi-line string literal") else { putChar(ch) nextRawChar() @@ -1305,7 +1286,7 @@ object Scanners { else if atSupplementary(ch, isUnicodeIdentifierStart) then getInterpolatedIdentRest(hasSupplement = true) else - error("invalid string interpolation: `$$`, `$\"`, `$`ident or `$`BlockExpr expected", off = charOffset - 2) + error("invalid string interpolation: `$$`, `$\"`, `$`ident or `$`BlockExpr expected".toMessage, off = charOffset - 2) putChar('$') getStringPart(multiLine) } @@ -1313,9 +1294,9 @@ object Scanners { val isUnclosedLiteral = !isUnicodeEscape && (ch == SU || (!multiLine && (ch == CR || ch == LF))) if (isUnclosedLiteral) if (multiLine) - incompleteInputError("unclosed multi-line string literal") + incompleteInputError(em"unclosed multi-line string literal") else - error("unclosed string literal") + error(em"unclosed string literal") else { putChar(ch) nextRawChar() @@ -1467,7 +1448,7 @@ object Scanners { } def checkNoLetter(): Unit = if (isIdentifierPart(ch) && ch >= ' ') - error("Invalid literal number") + error(em"Invalid literal number") /** Read a number into strVal and set base */ @@ -1515,7 +1496,7 @@ object Scanners { if (ch == '\'') finishCharLit() else { token = op - strVal = if (name != null) name.toString else null + strVal = Objects.toString(name) litBuf.clear() } } @@ -1550,7 +1531,7 @@ object Scanners { def resume(lastTokenData: TokenData): Unit = { this.copyFrom(lastTokenData) if (next.token != EMPTY && !ctx.reporter.hasErrors) - error("unexpected end of input: possible missing '}' in XML block") + error(em"unexpected end of input: possible missing '}' in XML block") nextToken() } diff --git a/compiler/src/dotty/tools/dotc/parsing/Tokens.scala b/compiler/src/dotty/tools/dotc/parsing/Tokens.scala index 7d27b3ca82b9..dba0ad3fa2ee 100644 --- a/compiler/src/dotty/tools/dotc/parsing/Tokens.scala +++ b/compiler/src/dotty/tools/dotc/parsing/Tokens.scala @@ -231,6 +231,8 @@ object Tokens extends TokensCommon { final val canStartInfixTypeTokens: TokenSet = literalTokens | identifierTokens | BitSet( THIS, SUPER, USCORE, LPAREN, LBRACE, AT) + final val canStartTypeTokens: TokenSet = canStartInfixTypeTokens | BitSet(LBRACE) + final val templateIntroTokens: TokenSet = BitSet(CLASS, TRAIT, OBJECT, ENUM, CASECLASS, CASEOBJECT) final val dclIntroTokens: 
TokenSet = BitSet(DEF, VAL, VAR, TYPE, GIVEN) @@ -287,7 +289,7 @@ object Tokens extends TokensCommon { final val closingParens = BitSet(RPAREN, RBRACKET, RBRACE) - final val softModifierNames = Set(nme.inline, nme.opaque, nme.open, nme.transparent, nme.infix) + final val softModifierNames = Set(nme.inline, nme.into, nme.opaque, nme.open, nme.transparent, nme.infix) def showTokenDetailed(token: Int): String = debugString(token) diff --git a/compiler/src/dotty/tools/dotc/parsing/package.scala b/compiler/src/dotty/tools/dotc/parsing/package.scala index a1f9c8d73ad4..ee3ecda60aee 100644 --- a/compiler/src/dotty/tools/dotc/parsing/package.scala +++ b/compiler/src/dotty/tools/dotc/parsing/package.scala @@ -17,7 +17,7 @@ package object parsing { def precedence(operator: Name): Int = if (operator eq nme.ERROR) -1 else { - val firstCh = operator.firstPart.head + val firstCh = operator.firstCodePoint if (isScalaLetter(firstCh)) 1 else if (operator.isOpAssignmentName) 0 else firstCh match { diff --git a/compiler/src/dotty/tools/dotc/parsing/xml/MarkupParsers.scala b/compiler/src/dotty/tools/dotc/parsing/xml/MarkupParsers.scala index 3d9f5fb7ad6d..77c5a1bf376b 100644 --- a/compiler/src/dotty/tools/dotc/parsing/xml/MarkupParsers.scala +++ b/compiler/src/dotty/tools/dotc/parsing/xml/MarkupParsers.scala @@ -6,6 +6,7 @@ package xml import scala.language.unsafeNulls import scala.collection.mutable +import core.Contexts.Context import mutable.{ Buffer, ArrayBuffer, ListBuffer } import scala.util.control.ControlThrowable import util.Chars.SU @@ -13,7 +14,7 @@ import Parsers._ import util.Spans._ import core._ import Constants._ -import Decorators.toMessage +import Decorators.{em, toMessage} import util.SourceFile import Utility._ @@ -50,7 +51,7 @@ object MarkupParsers { override def getMessage: String = "input ended while parsing XML" } - class MarkupParser(parser: Parser, final val preserveWS: Boolean)(implicit src: SourceFile) extends MarkupParserCommon { + class MarkupParser(parser: Parser, final val preserveWS: Boolean)(using Context) extends MarkupParserCommon { import Tokens.{ LBRACE, RBRACE } @@ -330,9 +331,9 @@ object MarkupParsers { case c @ TruncatedXMLControl => ifTruncated(c.getMessage) case c @ (MissingEndTagControl | ConfusedAboutBracesControl) => - parser.syntaxError(c.getMessage + debugLastElem + ">", debugLastPos) + parser.syntaxError(em"${c.getMessage}$debugLastElem>", debugLastPos) case _: ArrayIndexOutOfBoundsException => - parser.syntaxError("missing end tag in XML literal for <%s>" format debugLastElem, debugLastPos) + parser.syntaxError(em"missing end tag in XML literal for <$debugLastElem>", debugLastPos) } finally parser.in.resume(saved) @@ -396,7 +397,7 @@ object MarkupParsers { tree } }, - msg => parser.syntaxError(msg, curOffset) + msg => parser.syntaxError(msg.toMessage, curOffset) ) def escapeToScala[A](op: => A, kind: String): A = { @@ -422,7 +423,7 @@ object MarkupParsers { */ def xScalaPatterns: List[Tree] = escapeToScala(parser.patterns(), "pattern") - def reportSyntaxError(offset: Int, str: String): Unit = parser.syntaxError(str, offset) + def reportSyntaxError(offset: Int, str: String): Unit = parser.syntaxError(str.toMessage, offset) def reportSyntaxError(str: String): Unit = { reportSyntaxError(curOffset, "in XML literal: " + str) nextch() diff --git a/compiler/src/dotty/tools/dotc/plugins/Plugins.scala b/compiler/src/dotty/tools/dotc/plugins/Plugins.scala index 3093a1c0460f..976b783c40f0 100644 --- a/compiler/src/dotty/tools/dotc/plugins/Plugins.scala +++ 
b/compiler/src/dotty/tools/dotc/plugins/Plugins.scala @@ -5,6 +5,7 @@ import scala.language.unsafeNulls import core._ import Contexts._ +import Decorators.em import config.{ PathResolver, Feature } import dotty.tools.io._ import Phases._ @@ -83,14 +84,14 @@ trait Plugins { // Verify required plugins are present. for (req <- ctx.settings.require.value ; if !(plugs exists (_.name == req))) - report.error("Missing required plugin: " + req) + report.error(em"Missing required plugin: $req") // Verify no non-existent plugin given with -P for { opt <- ctx.settings.pluginOptions.value if !(plugs exists (opt startsWith _.name + ":")) } - report.error("bad option: -P:" + opt) + report.error(em"bad option: -P:$opt") plugs } diff --git a/compiler/src/dotty/tools/dotc/printing/Formatting.scala b/compiler/src/dotty/tools/dotc/printing/Formatting.scala index f85845517d8c..f4bbd74842c8 100644 --- a/compiler/src/dotty/tools/dotc/printing/Formatting.scala +++ b/compiler/src/dotty/tools/dotc/printing/Formatting.scala @@ -137,236 +137,6 @@ object Formatting { } } - /** The `em` string interpolator works like the `i` string interpolator, but marks nonsensical errors - * using `...` tags. - * Note: Instead of these tags, it would be nicer to return a data structure containing the message string - * and a boolean indicating whether the message is sensical, but then we cannot use string operations - * like concatenation, stripMargin etc on the values returned by em"...", and in the current error - * message composition methods, this is crucial. - */ - def forErrorMessages(op: Context ?=> String)(using Context): String = op(using errorMessageCtx) - - private class ErrorMessagePrinter(_ctx: Context) extends RefinedPrinter(_ctx): - override def toText(tp: Type): Text = wrapNonSensical(tp, super.toText(tp)) - override def toText(sym: Symbol): Text = wrapNonSensical(sym, super.toText(sym)) - - private def wrapNonSensical(arg: Any, text: Text)(using Context): Text = { - import Message._ - def isSensical(arg: Any): Boolean = arg match { - case tpe: Type => - tpe.exists && !tpe.isErroneous - case sym: Symbol if sym.isCompleted => - sym.info match { - case _: ErrorType | TypeAlias(_: ErrorType) | NoType => false - case _ => true - } - case _ => true - } - - if (isSensical(arg)) text - else nonSensicalStartTag ~ text ~ nonSensicalEndTag - } - - private type Recorded = Symbol | ParamRef | SkolemType - - private case class SeenKey(str: String, isType: Boolean) - private class Seen extends mutable.HashMap[SeenKey, List[Recorded]] { - - override def default(key: SeenKey) = Nil - - def record(str: String, isType: Boolean, entry: Recorded)(using Context): String = { - - /** If `e1` is an alias of another class of the same name, return the other - * class symbol instead. This normalization avoids recording e.g. 
scala.List - * and scala.collection.immutable.List as two different types - */ - def followAlias(e1: Recorded): Recorded = e1 match { - case e1: Symbol if e1.isAliasType => - val underlying = e1.typeRef.underlyingClassRef(refinementOK = false).typeSymbol - if (underlying.name == e1.name) underlying else e1 - case _ => e1 - } - val key = SeenKey(str, isType) - val existing = apply(key) - lazy val dealiased = followAlias(entry) - - // alts: The alternatives in `existing` that are equal, or follow (an alias of) `entry` - var alts = existing.dropWhile(alt => dealiased ne followAlias(alt)) - if (alts.isEmpty) { - alts = entry :: existing - update(key, alts) - } - val suffix = alts.length match { - case 1 => "" - case n => n.toString.toCharArray.map { - case '0' => '⁰' - case '1' => '¹' - case '2' => '²' - case '3' => '³' - case '4' => '⁴' - case '5' => '⁵' - case '6' => '⁶' - case '7' => '⁷' - case '8' => '⁸' - case '9' => '⁹' - }.mkString - } - str + suffix - } - } - - private class ExplainingPrinter(seen: Seen)(_ctx: Context) extends ErrorMessagePrinter(_ctx) { - - /** True if printer should a source module instead of its module class */ - private def useSourceModule(sym: Symbol): Boolean = - sym.is(ModuleClass, butNot = Package) && sym.sourceModule.exists && !_ctx.settings.YdebugNames.value - - override def simpleNameString(sym: Symbol): String = - if (useSourceModule(sym)) simpleNameString(sym.sourceModule) - else seen.record(super.simpleNameString(sym), sym.isType, sym) - - override def ParamRefNameString(param: ParamRef): String = - seen.record(super.ParamRefNameString(param), param.isInstanceOf[TypeParamRef], param) - - override def toTextRef(tp: SingletonType): Text = tp match { - case tp: SkolemType => seen.record(tp.repr.toString, isType = true, tp) - case _ => super.toTextRef(tp) - } - - override def toText(tp: Type): Text = tp match { - case tp: TypeRef if useSourceModule(tp.symbol) => Str("object ") ~ super.toText(tp) - case _ => super.toText(tp) - } - } - - /** Create explanation for single `Recorded` type or symbol */ - def explanation(entry: AnyRef)(using Context): String = { - def boundStr(bound: Type, default: ClassSymbol, cmp: String) = - if (bound.isRef(default)) "" else i"$cmp $bound" - - def boundsStr(bounds: TypeBounds): String = { - val lo = boundStr(bounds.lo, defn.NothingClass, ">:") - val hi = boundStr(bounds.hi, defn.AnyClass, "<:") - if (lo.isEmpty) hi - else if (hi.isEmpty) lo - else s"$lo and $hi" - } - - def addendum(cat: String, info: Type): String = info match { - case bounds @ TypeBounds(lo, hi) if bounds ne TypeBounds.empty => - if (lo eq hi) i" which is an alias of $lo" - else i" with $cat ${boundsStr(bounds)}" - case _ => - "" - } - - entry match { - case param: TypeParamRef => - s"is a type variable${addendum("constraint", TypeComparer.bounds(param))}" - case param: TermParamRef => - s"is a reference to a value parameter" - case sym: Symbol => - val info = - if (ctx.gadt.contains(sym)) - sym.info & ctx.gadt.fullBounds(sym) - else - sym.info - s"is a ${ctx.printer.kindString(sym)}${sym.showExtendedLocation}${addendum("bounds", info)}" - case tp: SkolemType => - s"is an unknown value of type ${tp.widen.show}" - } - } - - /** Turns a `Seen` into a `String` to produce an explanation for types on the - * form `where: T is...` - * - * @return string disambiguating types - */ - private def explanations(seen: Seen)(using Context): String = { - def needsExplanation(entry: Recorded) = entry match { - case param: TypeParamRef => 
ctx.typerState.constraint.contains(param) - case param: ParamRef => false - case skolem: SkolemType => true - case sym: Symbol => - ctx.gadt.contains(sym) && ctx.gadt.fullBounds(sym) != TypeBounds.empty - } - - val toExplain: List[(String, Recorded)] = seen.toList.flatMap { kvs => - val res: List[(String, Recorded)] = kvs match { - case (key, entry :: Nil) => - if (needsExplanation(entry)) (key.str, entry) :: Nil else Nil - case (key, entries) => - for (alt <- entries) yield { - val tickedString = seen.record(key.str, key.isType, alt) - (tickedString, alt) - } - } - res // help the inferrencer out - }.sortBy(_._1) - - def columnar(parts: List[(String, String)]): List[String] = { - lazy val maxLen = parts.map(_._1.length).max - parts.map { - case (leader, trailer) => - val variable = hl(leader) - s"""$variable${" " * (maxLen - leader.length)} $trailer""" - } - } - - val explainParts = toExplain.map { case (str, entry) => (str, explanation(entry)) } - val explainLines = columnar(explainParts) - if (explainLines.isEmpty) "" else i"where: $explainLines%\n %\n" - } - - private def errorMessageCtx(using Context): Context = - val ctx1 = ctx.property(MessageLimiter) match - case Some(_: ErrorMessageLimiter) => ctx - case _ => ctx.fresh.setProperty(MessageLimiter, ErrorMessageLimiter()) - ctx1.printer match - case _: ErrorMessagePrinter => ctx1 - case _ => ctx1.fresh.setPrinterFn(ctx => ErrorMessagePrinter(ctx)) - - /** Context with correct printer set for explanations */ - private def explainCtx(seen: Seen)(using Context): Context = - val ectx = errorMessageCtx - ectx.printer match - case dp: ExplainingPrinter => - ectx // re-use outer printer and defer explanation to it - case _ => - ectx.fresh.setPrinterFn(ctx => new ExplainingPrinter(seen)(ctx)) - - /** Entrypoint for explanation string interpolator: - * - * ``` - * ex"disambiguate $tpe1 and $tpe2" - * ``` - */ - def explained(op: Context ?=> String)(using Context): String = { - val seen = new Seen - val msg = op(using explainCtx(seen)) - val addendum = explanations(seen) - if (addendum.isEmpty) msg else msg ++ "\n\n" ++ addendum - } - - /** When getting a type mismatch it is useful to disambiguate placeholders like: - * - * ``` - * found: List[Int] - * required: List[T] - * where: T is a type in the initializer of value s which is an alias of - * String - * ``` - * - * @return the `where` section as well as the printing context for the - * placeholders - `("T is a...", printCtx)` - */ - def disambiguateTypes(args: Type*)(using Context): (String, Context) = { - val seen = new Seen - val printCtx = explainCtx(seen) - args.foreach(_.show(using printCtx)) // showing each member will put it into `seen` - (explanations(seen), printCtx) - } - /** This method will produce a colored type diff from the given arguments. * The idea is to do this for known cases that are useful and then fall back * on regular syntax highlighting for the cases which are unhandled. 
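
The removed `Seen.record` above disambiguates distinct symbols that happen to share a display name by appending an occurrence counter rendered in superscript digits. A self-contained sketch of just that naming scheme (the `superscript` helper and demo are illustrative, not part of the compiler):

```scala
// Digit-to-superscript mapping as in the removed Seen.record: the first entity
// keeps its plain name, the second one sharing that name is printed with a ²
// suffix, the twelfth with ¹², and so on.
def superscript(n: Int): String =
  n.toString.map {
    case '0' => '⁰'
    case '1' => '¹'
    case '2' => '²'
    case '3' => '³'
    case '4' => '⁴'
    case '5' => '⁵'
    case '6' => '⁶'
    case '7' => '⁷'
    case '8' => '⁸'
    case '9' => '⁹'
  }

@main def superscriptDemo(): Unit =
  println("T" + superscript(2))  // T²
  println("T" + superscript(12)) // T¹²
```
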
@@ -378,16 +148,13 @@ object Formatting { * @return the (found, expected, changePercentage) with coloring to * highlight the difference */ - def typeDiff(found: Type, expected: Type)(using Context): (String, String) = { - val fnd = wrapNonSensical(found, found.toText(ctx.printer)).show - val exp = wrapNonSensical(expected, expected.toText(ctx.printer)).show - - DiffUtil.mkColoredTypeDiff(fnd, exp) match { - case _ if ctx.settings.color.value == "never" => (fnd, exp) - case (fnd, exp, change) if change < 0.5 => (fnd, exp) + def typeDiff(found: Type, expected: Type)(using Context): (String, String) = + val fnd = found.show + val exp = expected.show + DiffUtil.mkColoredTypeDiff(fnd, exp) match + case (fnd1, exp1, change) + if change < 0.5 && ctx.settings.color.value != "never" => (fnd1, exp1) case _ => (fnd, exp) - } - } /** Explicit syntax highlighting */ def hl(s: String)(using Context): String = diff --git a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala index 3c83d681e716..0da1993310c6 100644 --- a/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/PlainPrinter.scala @@ -111,8 +111,14 @@ class PlainPrinter(_ctx: Context) extends Printer { protected def refinementNameString(tp: RefinedType): String = nameString(tp.refinedName) /** String representation of a refinement */ - protected def toTextRefinement(rt: RefinedType): Closed = - (refinementNameString(rt) ~ toTextRHS(rt.refinedInfo)).close + protected def toTextRefinement(rt: RefinedType): Text = + val keyword = rt.refinedInfo match { + case _: ExprType | _: MethodOrPoly => "def " + case _: TypeBounds => "type " + case _: TypeProxy => "val " + case _ => "" + } + (keyword ~ refinementNameString(rt) ~ toTextRHS(rt.refinedInfo)).close protected def argText(arg: Type): Text = homogenizeArg(arg) match { case arg: TypeBounds => "?" ~ toText(arg) @@ -218,7 +224,7 @@ class PlainPrinter(_ctx: Context) extends Printer { case tp: PreviousErrorType if ctx.settings.XprintTypes.value => "" // do not print previously reported error message because they may try to print this error type again recuresevely case tp: ErrorType => - s"" + s"" case tp: WildcardType => if (tp.optBounds.exists) "" else "" case NoType => @@ -258,8 +264,9 @@ class PlainPrinter(_ctx: Context) extends Printer { if annot.symbol == defn.InlineParamAnnot || annot.symbol == defn.ErasedParamAnnot then toText(tpe) else toTextLocal(tpe) ~ " " ~ toText(annot) case tp: TypeVar => + def toTextCaret(tp: Type) = if printDebug then toTextLocal(tp) ~ Str("^") else toText(tp) if (tp.isInstantiated) - toTextLocal(tp.instanceOpt) ~ (Str("^") provided printDebug) + toTextCaret(tp.instanceOpt) else { val constr = ctx.typerState.constraint val bounds = @@ -267,7 +274,7 @@ class PlainPrinter(_ctx: Context) extends Printer { withMode(Mode.Printing)(TypeComparer.fullBounds(tp.origin)) else TypeBounds.empty - if (bounds.isTypeAlias) toText(bounds.lo) ~ (Str("^") provided printDebug) + if (bounds.isTypeAlias) toTextCaret(bounds.lo) else if (ctx.settings.YshowVarBounds.value) "(" ~ toText(tp.origin) ~ "?" ~ toText(bounds) ~ ")" else toText(tp.origin) } @@ -278,6 +285,8 @@ class PlainPrinter(_ctx: Context) extends Printer { case ex: Throwable => Str("...") } "LazyRef(" ~ refTxt ~ ")" + case Range(lo, hi) => + toText(lo) ~ ".." 
~ toText(hi) case _ => tp.fallbackToText(this) } @@ -607,7 +616,7 @@ class PlainPrinter(_ctx: Context) extends Printer { def toText(sc: Scope): Text = ("Scope{" ~ dclsText(sc.toList) ~ "}").close - def toText[T >: Untyped](tree: Tree[T]): Text = { + def toText[T <: Untyped](tree: Tree[T]): Text = { def toTextElem(elem: Any): Text = elem match { case elem: Showable => elem.toText(this) case elem: List[?] => "List(" ~ Text(elem map toTextElem, ",") ~ ")" @@ -689,8 +698,9 @@ class PlainPrinter(_ctx: Context) extends Printer { Text(ups.map(toText), ", ") Text(deps, "\n") } + val depsText = if Config.showConstraintDeps then c.depsToString else "" //Printer.debugPrintUnique = false - Text.lines(List(uninstVarsText, constrainedText, boundsText, orderingText)) + Text.lines(List(uninstVarsText, constrainedText, boundsText, orderingText, depsText)) finally ctx.typerState.constraint = savedConstraint diff --git a/compiler/src/dotty/tools/dotc/printing/Printer.scala b/compiler/src/dotty/tools/dotc/printing/Printer.scala index f06c70f56905..326630844dde 100644 --- a/compiler/src/dotty/tools/dotc/printing/Printer.scala +++ b/compiler/src/dotty/tools/dotc/printing/Printer.scala @@ -31,7 +31,7 @@ abstract class Printer { * ### `atPrec` vs `changePrec` * * This is to be used when changing precedence inside some sort of parentheses: - * for instance, to print T[A]` use + * for instance, to print `T[A]` use * `toText(T) ~ '[' ~ atPrec(GlobalPrec) { toText(A) } ~ ']'`. * * If the presence of the parentheses depends on precedence, inserting them manually is most certainly a bug. @@ -60,8 +60,7 @@ abstract class Printer { * A op B op' C parses as (A op B) op' C if op and op' are left-associative, and as * A op (B op' C) if they're right-associative, so we need respectively * ```scala - * val isType = ??? // is this a term or type operator? - * val prec = parsing.precedence(op, isType) + * val prec = parsing.precedence(op) * // either: * changePrec(prec) { toText(a) ~ op ~ atPrec(prec + 1) { toText(b) } } // for left-associative op and op' * // or: @@ -149,7 +148,7 @@ abstract class Printer { def toText(sc: Scope): Text /** Textual representation of tree */ - def toText[T >: Untyped](tree: Tree[T]): Text + def toText[T <: Untyped](tree: Tree[T]): Text /** Textual representation of source position */ def toText(pos: SourcePosition): Text diff --git a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala index 2a87ec9b4bbe..9a4b53d4112c 100644 --- a/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala +++ b/compiler/src/dotty/tools/dotc/printing/RefinedPrinter.scala @@ -40,7 +40,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { override def printerContext: Context = myCtx - def withEnclosingDef(enclDef: Tree[? 
>: Untyped])(op: => Text): Text = { + def withEnclosingDef(enclDef: Tree[?])(op: => Text): Text = { val savedCtx = myCtx if (enclDef.hasType && enclDef.symbol.exists) myCtx = ctx.withOwner(enclDef.symbol) @@ -223,6 +223,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { case _ => val tsym = tycon.typeSymbol if tycon.isRepeatedParam then toTextLocal(args.head) ~ "*" + else if tp.isConvertibleParam then "into " ~ toText(args.head) else if defn.isFunctionSymbol(tsym) then toTextFunction(args, tsym.name.isContextFunction, tsym.name.isErasedFunction, isPure = Feature.pureFunsEnabled && !tsym.name.isImpureFunction) @@ -308,15 +309,15 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { protected def exprToText(tp: ExprType): Text = "=> " ~ toText(tp.resType) - protected def blockToText[T >: Untyped](block: Block[T]): Text = + protected def blockToText[T <: Untyped](block: Block[T]): Text = blockText(block.stats :+ block.expr) - protected def blockText[T >: Untyped](trees: List[Tree[T]]): Text = + protected def blockText[T <: Untyped](trees: List[Tree[T]]): Text = inContextBracket { ("{" ~ toText(trees, "\n") ~ "}").close } - protected def typeApplyText[T >: Untyped](tree: TypeApply[T]): Text = { + protected def typeApplyText[T <: Untyped](tree: TypeApply[T]): Text = { val funText = toTextLocal(tree.fun) tree.fun match { case Select(New(tpt), nme.CONSTRUCTOR) if tpt.typeOpt.dealias.isInstanceOf[AppliedType] => @@ -326,7 +327,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { } } - protected def toTextCore[T >: Untyped](tree: Tree[T]): Text = { + protected def toTextCore[T <: Untyped](tree: Tree[T]): Text = { import untpd._ def isLocalThis(tree: Tree) = tree.typeOpt match { @@ -523,9 +524,10 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { case SeqLiteral(elems, elemtpt) => "[" ~ toTextGlobal(elems, ",") ~ " : " ~ toText(elemtpt) ~ "]" case tree @ Inlined(call, bindings, body) => - (("/* inlined from " ~ (if (call.isEmpty) "outside" else toText(call)) ~ " */ ") `provided` - !homogenizedView && ctx.settings.XprintInline.value) ~ - (if bindings.isEmpty then toText(body) else blockText(bindings :+ body)) + val bodyText = if bindings.isEmpty then toText(body) else blockText(bindings :+ body) + if homogenizedView || !ctx.settings.XprintInline.value then bodyText + else if call.isEmpty then stringText("{{") ~ stringText("/* inlined from outside */") ~ bodyText ~ stringText("}}") + else keywordText("{{") ~ keywordText("/* inlined from ") ~ toText(call) ~ keywordText(" */") ~ bodyText ~ keywordText("}}") case tpt: untpd.DerivedTypeTree => "" case TypeTree() => @@ -739,7 +741,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { } } - override def toText[T >: Untyped](tree: Tree[T]): Text = controlled { + override def toText[T <: Untyped](tree: Tree[T]): Text = controlled { import untpd._ var txt = toTextCore(tree) @@ -826,7 +828,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { protected def dropAnnotForModText(sym: Symbol): Boolean = sym == defn.BodyAnnot - protected def optAscription[T >: Untyped](tpt: Tree[T]): Text = optText(tpt)(": " ~ _) + protected def optAscription[T <: Untyped](tpt: Tree[T]): Text = optText(tpt)(": " ~ _) private def idText(tree: untpd.Tree): Text = (if showUniqueIds && tree.hasType && tree.symbol.exists then s"#${tree.symbol.id}" else "") ~ @@ -842,7 +844,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { private def useSymbol(tree: untpd.Tree) = 
tree.hasType && tree.symbol.exists && ctx.settings.YprintSyms.value - protected def nameIdText[T >: Untyped](tree: NameTree[T]): Text = + protected def nameIdText[T <: Untyped](tree: NameTree[T]): Text = if (tree.hasType && tree.symbol.exists) { val str = nameString(tree.symbol) tree match { @@ -856,13 +858,13 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { private def toTextOwner(tree: Tree[?]) = "[owner = " ~ tree.symbol.maybeOwner.show ~ "]" provided ctx.settings.YprintDebugOwners.value - protected def dclTextOr[T >: Untyped](tree: Tree[T])(treeText: => Text): Text = + protected def dclTextOr[T <: Untyped](tree: Tree[T])(treeText: => Text): Text = toTextOwner(tree) ~ { if (useSymbol(tree)) annotsText(tree.symbol) ~~ dclText(tree.symbol) else treeText } - def paramsText[T>: Untyped](params: ParamClause[T]): Text = (params: @unchecked) match + def paramsText[T <: Untyped](params: ParamClause[T]): Text = (params: @unchecked) match case Nil => "()" case untpd.ValDefs(vparams @ (vparam :: _)) => @@ -872,10 +874,10 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { case untpd.TypeDefs(tparams) => "[" ~ toText(tparams, ", ") ~ "]" - def addParamssText[T >: Untyped](leading: Text, paramss: List[ParamClause[T]]): Text = + def addParamssText[T <: Untyped](leading: Text, paramss: List[ParamClause[T]]): Text = paramss.foldLeft(leading)((txt, params) => txt ~ paramsText(params)) - protected def valDefToText[T >: Untyped](tree: ValDef[T]): Text = { + protected def valDefToText[T <: Untyped](tree: ValDef[T]): Text = { dclTextOr(tree) { modText(tree.mods, tree.symbol, keywordStr(if (tree.mods.is(Mutable)) "var" else "val"), isType = false) ~~ valDefText(nameIdText(tree)) ~ optAscription(tree.tpt) ~ @@ -883,7 +885,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { } } - protected def defDefToText[T >: Untyped](tree: DefDef[T]): Text = { + protected def defDefToText[T <: Untyped](tree: DefDef[T]): Text = { import untpd._ dclTextOr(tree) { val defKeyword = modText(tree.mods, tree.symbol, keywordStr("def"), isType = false) @@ -989,8 +991,8 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { ) } - protected def toTextPackageId[T >: Untyped](pid: Tree[T]): Text = - if (homogenizedView && pid.hasType) toTextLocal(pid.tpe.asInstanceOf[Showable]) + protected def toTextPackageId[T <: Untyped](pid: Tree[T]): Text = + if (homogenizedView && pid.hasType) toTextLocal(pid.typeOpt) else toTextLocal(pid) protected def packageDefText(tree: PackageDef): Text = { @@ -1044,10 +1046,10 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { def optText(name: Name)(encl: Text => Text): Text = if (name.isEmpty) "" else encl(toText(name)) - def optText[T >: Untyped](tree: Tree[T])(encl: Text => Text): Text = + def optText[T <: Untyped](tree: Tree[T])(encl: Text => Text): Text = if (tree.isEmpty) "" else encl(toText(tree)) - def optText[T >: Untyped](tree: List[Tree[T]])(encl: Text => Text): Text = + def optText[T <: Untyped](tree: List[Tree[T]])(encl: Text => Text): Text = if (tree.exists(!_.isEmpty)) encl(blockText(tree)) else "" override protected def treatAsTypeParam(sym: Symbol): Boolean = sym.is(TypeParam) @@ -1060,7 +1062,7 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { if (sym.isImport) sym.infoOrCompleter match { case info: Namer#Completer => return info.original.show - case info: ImportType => return s"import $info.expr.show" + case info: ImportType => return s"import ${info.expr.show}" case _ => } def name = diff 
--git a/compiler/src/dotty/tools/dotc/printing/Texts.scala b/compiler/src/dotty/tools/dotc/printing/Texts.scala index 17f86e766869..7c040a78de5e 100644 --- a/compiler/src/dotty/tools/dotc/printing/Texts.scala +++ b/compiler/src/dotty/tools/dotc/printing/Texts.scala @@ -1,8 +1,12 @@ package dotty.tools.dotc package printing +import scala.annotation.internal.sharable object Texts { + @sharable + private val ansi = java.util.regex.Pattern.compile("\u001b\\[\\d+m").nn + sealed abstract class Text { protected def indentMargin: Int = 2 @@ -15,12 +19,17 @@ object Texts { case Vertical(relems) => relems.isEmpty } + // Str Ver Clo Flu + // isVertical F T F F + // isClosed F T T F + // isFluid F F T T + // isSplittable F F F T def isVertical: Boolean = isInstanceOf[Vertical] def isClosed: Boolean = isVertical || isInstanceOf[Closed] def isFluid: Boolean = isInstanceOf[Fluid] def isSplittable: Boolean = isFluid && !isClosed - def close: Closed = new Closed(relems) + def close: Text = if isSplittable then Closed(relems) else this def remaining(width: Int): Int = this match { case Str(s, _) => @@ -53,7 +62,7 @@ object Texts { } private def appendIndented(that: Text)(width: Int): Text = - Vertical(that.layout(width - indentMargin).indented :: this.relems) + Fluid(that.layout(width - indentMargin).indented :: this.relems) private def append(width: Int)(that: Text): Text = if (this.isEmpty) that.layout(width) @@ -65,7 +74,7 @@ object Texts { else appendIndented(that)(width) private def lengthWithoutAnsi(str: String): Int = - str.replaceAll("\u001b\\[\\d+m", "").nn.length + ansi.matcher(str).nn.replaceAll("").nn.length def layout(width: Int): Text = this match { case Str(s, _) => @@ -113,7 +122,7 @@ object Texts { sb.append("|") } } - sb.append(s) + sb.append(s.replaceAll("[ ]+$", "")) case _ => var follow = false for (elem <- relems.reverse) { @@ -138,7 +147,13 @@ object Texts { def ~ (that: Text): Text = if (this.isEmpty) that else if (that.isEmpty) this - else Fluid(that :: this :: Nil) + else this match + case Fluid(relems1) if !isClosed => that match + case Fluid(relems2) if !that.isClosed => Fluid(relems2 ++ relems1) + case _ => Fluid(that +: relems1) + case _ => that match + case Fluid(relems2) if !that.isClosed => Fluid(relems2 :+ this) + case _ => Fluid(that :: this :: Nil) def ~~ (that: Text): Text = if (this.isEmpty) that @@ -161,9 +176,9 @@ object Texts { def apply(xs: Traversable[Text], sep: String = " "): Text = if (sep == "\n") lines(xs) else { - val ys = xs filterNot (_.isEmpty) + val ys = xs.filterNot(_.isEmpty) if (ys.isEmpty) Str("") - else ys reduce (_ ~ sep ~ _) + else ys.reduceRight((a, b) => (a ~ sep).close ~ b) } /** The given texts `xs`, each on a separate line */ @@ -176,12 +191,16 @@ object Texts { case class Str(s: String, lineRange: LineRange = EmptyLineRange) extends Text { override def relems: List[Text] = List(this) + override def toString = this match + case Str(s, EmptyLineRange) => s"Str($s)" + case Str(s, lineRange) => s"Str($s, $lineRange)" } case class Vertical(relems: List[Text]) extends Text case class Fluid(relems: List[Text]) extends Text - class Closed(relems: List[Text]) extends Fluid(relems) + class Closed(relems: List[Text]) extends Fluid(relems): + override def productPrefix = "Closed" implicit def stringToText(s: String): Text = Str(s) diff --git a/compiler/src/dotty/tools/dotc/profile/ExtendedThreadMxBean.java b/compiler/src/dotty/tools/dotc/profile/ExtendedThreadMxBean.java index 68ae4f148cfd..60f44db16add 100644 --- 
a/compiler/src/dotty/tools/dotc/profile/ExtendedThreadMxBean.java +++ b/compiler/src/dotty/tools/dotc/profile/ExtendedThreadMxBean.java @@ -248,13 +248,14 @@ public SunThreadMxBean(ThreadMXBean underlying) { super(underlying); this.real = underlying; try { - getThreadUserTimeMethod = real.getClass().getMethod("getThreadUserTime", long[].class); - isThreadAllocatedMemoryEnabledMethod = real.getClass().getMethod("isThreadAllocatedMemoryEnabled"); - setThreadAllocatedMemoryEnabledMethod = real.getClass().getMethod("setThreadAllocatedMemoryEnabled", Boolean.TYPE); - getThreadAllocatedBytesMethod1 = real.getClass().getMethod("getThreadAllocatedBytes", Long.TYPE); - getThreadAllocatedBytesMethod2 = real.getClass().getMethod("getThreadAllocatedBytes", long[].class); - isThreadAllocatedMemorySupportedMethod = real.getClass().getMethod("isThreadAllocatedMemorySupported"); - getThreadCpuTimeMethod = real.getClass().getMethod("getThreadCpuTime", long[].class); + Class cls = Class.forName("com.sun.management.ThreadMXBean"); + getThreadUserTimeMethod = cls.getMethod("getThreadUserTime", long[].class); + isThreadAllocatedMemoryEnabledMethod = cls.getMethod("isThreadAllocatedMemoryEnabled"); + setThreadAllocatedMemoryEnabledMethod = cls.getMethod("setThreadAllocatedMemoryEnabled", Boolean.TYPE); + getThreadAllocatedBytesMethod1 = cls.getMethod("getThreadAllocatedBytes", Long.TYPE); + getThreadAllocatedBytesMethod2 = cls.getMethod("getThreadAllocatedBytes", long[].class); + isThreadAllocatedMemorySupportedMethod = cls.getMethod("isThreadAllocatedMemorySupported"); + getThreadCpuTimeMethod = cls.getMethod("getThreadCpuTime", long[].class); getThreadUserTimeMethod.setAccessible(true); isThreadAllocatedMemoryEnabledMethod.setAccessible(true); diff --git a/compiler/src/dotty/tools/dotc/profile/Profiler.scala b/compiler/src/dotty/tools/dotc/profile/Profiler.scala index 25c53903c10b..64cc08160701 100644 --- a/compiler/src/dotty/tools/dotc/profile/Profiler.scala +++ b/compiler/src/dotty/tools/dotc/profile/Profiler.scala @@ -13,6 +13,7 @@ import javax.management.{Notification, NotificationEmitter, NotificationListener import dotty.tools.dotc.core.Phases.Phase import dotty.tools.dotc.core.Contexts._ import dotty.tools.io.AbstractFile +import annotation.internal.sharable object Profiler { def apply()(using Context): Profiler = @@ -217,14 +218,16 @@ sealed trait ProfileReporter { } object ConsoleProfileReporter extends ProfileReporter { - + @sharable var totalAlloc = 0L override def reportBackground(profiler: RealProfiler, threadRange: ProfileRange): Unit = - // TODO - ??? + reportCommon(EventType.BACKGROUND, profiler, threadRange) override def reportForeground(profiler: RealProfiler, threadRange: ProfileRange): Unit = - // TODO - ??? 
+ reportCommon(EventType.MAIN, profiler, threadRange) + @nowarn("cat=deprecation") + private def reportCommon(tpe:EventType, profiler: RealProfiler, threadRange: ProfileRange): Unit = + totalAlloc += threadRange.allocatedBytes + println(s"${threadRange.phase.phaseName.replace(',', ' ')},run ns = ${threadRange.runNs},idle ns = ${threadRange.idleNs},cpu ns = ${threadRange.cpuNs},user ns = ${threadRange.userNs},allocated = ${threadRange.allocatedBytes},heap at end = ${threadRange.end.heapBytes}, total allocated = $totalAlloc ") override def close(profiler: RealProfiler): Unit = () diff --git a/compiler/src/dotty/tools/dotc/quoted/Interpreter.scala b/compiler/src/dotty/tools/dotc/quoted/Interpreter.scala new file mode 100644 index 000000000000..38cecb7953b8 --- /dev/null +++ b/compiler/src/dotty/tools/dotc/quoted/Interpreter.scala @@ -0,0 +1,364 @@ +package dotty.tools.dotc +package quoted + +import scala.language.unsafeNulls + +import scala.collection.mutable +import scala.reflect.ClassTag + +import java.io.{PrintWriter, StringWriter} +import java.lang.reflect.{InvocationTargetException, Method => JLRMethod} + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.TreeMapWithImplicits +import dotty.tools.dotc.core.Annotations._ +import dotty.tools.dotc.core.Constants._ +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.core.Decorators._ +import dotty.tools.dotc.core.Denotations.staticRef +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.NameKinds.FlatName +import dotty.tools.dotc.core.Names._ +import dotty.tools.dotc.core.StagingContext._ +import dotty.tools.dotc.core.StdNames._ +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.core.TypeErasure +import dotty.tools.dotc.core.Types._ +import dotty.tools.dotc.quoted._ +import dotty.tools.dotc.transform.TreeMapWithStages._ +import dotty.tools.dotc.typer.ImportInfo.withRootImports +import dotty.tools.dotc.util.SrcPos +import dotty.tools.dotc.reporting.Message +import dotty.tools.repl.AbstractFileClassLoader + +/** Tree interpreter for metaprogramming constructs */ +class Interpreter(pos: SrcPos, classLoader: ClassLoader)(using Context): + import Interpreter._ + import tpd._ + + /** Local variable environment */ + type Env = Map[Symbol, Object] + def emptyEnv: Env = Map.empty + inline def env(using e: Env): e.type = e + + /** Returns the result of interpreting the code in the tree. + * Return Some of the result or None if the result type is not consistent with the expected type. + * Throws a StopInterpretation if the tree could not be interpreted or a runtime exception ocurred. + */ + final def interpret[T](tree: Tree)(using ct: ClassTag[T]): Option[T] = + interpretTree(tree)(using emptyEnv) match { + case obj: T => Some(obj) + case obj => + // TODO upgrade to a full type tag check or something similar + report.error(em"Interpreted tree returned a result of an unexpected type. Expected ${ct.runtimeClass} but was ${obj.getClass}", pos) + None + } + + /** Returns the result of interpreting the code in the tree. + * Throws a StopInterpretation if the tree could not be interpreted or a runtime exception ocurred. 
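For intuition, here is a minimal, self-contained sketch of the environment-threading style the interpreter above relies on: an immutable Env map passed along as a context parameter and extended for local bindings. All names in the sketch are invented; it is an illustration, not part of the patch or of the compiler's API.

object EnvInterpreterSketch:
  sealed trait Expr
  final case class Lit(value: Int) extends Expr
  final case class Ref(name: String) extends Expr
  final case class Let(name: String, rhs: Expr, body: Expr) extends Expr
  final case class Add(lhs: Expr, rhs: Expr) extends Expr

  type Env = Map[String, Int]

  def eval(e: Expr)(using env: Env): Int = e match
    case Lit(v)            => v
    case Ref(n)            => env(n)                                     // look up a local binding
    case Let(n, rhs, body) => eval(body)(using env.updated(n, eval(rhs))) // extend the environment
    case Add(l, r)         => eval(l) + eval(r)

  def main(args: Array[String]): Unit =
    // let x = 1 in x + 2  evaluates to 3
    println(eval(Let("x", Lit(1), Add(Ref("x"), Lit(2))))(using Map.empty))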
+ */ + protected def interpretTree(tree: Tree)(using Env): Object = tree match { + case Literal(Constant(value)) => + interpretLiteral(value) + + case tree: Ident if tree.symbol.is(Inline, butNot = Method) => + tree.tpe.widenTermRefExpr match + case ConstantType(c) => c.value.asInstanceOf[Object] + case _ => throw new StopInterpretation(em"${tree.symbol} could not be inlined", tree.srcPos) + + // TODO disallow interpreted method calls as arguments + case Call(fn, args) => + if (fn.symbol.isConstructor) + interpretNew(fn.symbol, args.flatten.map(interpretTree)) + else if (fn.symbol.is(Module)) + interpretModuleAccess(fn.symbol) + else if (fn.symbol.is(Method) && fn.symbol.isStatic) { + interpretedStaticMethodCall(fn.symbol.owner, fn.symbol, interpretArgs(args, fn.symbol.info)) + } + else if fn.symbol.isStatic then + assert(args.isEmpty) + interpretedStaticFieldAccess(fn.symbol) + else if (fn.qualifier.symbol.is(Module) && fn.qualifier.symbol.isStatic) + if (fn.name == nme.asInstanceOfPM) + interpretModuleAccess(fn.qualifier.symbol) + else { + interpretedStaticMethodCall(fn.qualifier.symbol.moduleClass, fn.symbol, interpretArgs(args, fn.symbol.info)) + } + else if (env.contains(fn.symbol)) + env(fn.symbol) + else if (tree.symbol.is(InlineProxy)) + interpretTree(tree.symbol.defTree.asInstanceOf[ValOrDefDef].rhs) + else + unexpectedTree(tree) + + case closureDef((ddef @ DefDef(_, ValDefs(arg :: Nil) :: Nil, _, _))) => + (obj: AnyRef) => interpretTree(ddef.rhs)(using env.updated(arg.symbol, obj)) + + // Interpret `foo(j = x, i = y)` which it is expanded to + // `val j$1 = x; val i$1 = y; foo(i = i$1, j = j$1)` + case Block(stats, expr) => interpretBlock(stats, expr) + case NamedArg(_, arg) => interpretTree(arg) + + case Inlined(_, bindings, expansion) => interpretBlock(bindings, expansion) + + case Typed(expr, _) => + interpretTree(expr) + + case SeqLiteral(elems, _) => + interpretVarargs(elems.map(e => interpretTree(e))) + + case _ => + unexpectedTree(tree) + } + + private def interpretArgs(argss: List[List[Tree]], fnType: Type)(using Env): List[Object] = { + def interpretArgsGroup(args: List[Tree], argTypes: List[Type]): List[Object] = + assert(args.size == argTypes.size) + val view = + for (arg, info) <- args.lazyZip(argTypes) yield + info match + case _: ExprType => () => interpretTree(arg) // by-name argument + case _ => interpretTree(arg) // by-value argument + view.toList + + fnType.dealias match + case fnType: MethodType if fnType.isErasedMethod => interpretArgs(argss, fnType.resType) + case fnType: MethodType => + val argTypes = fnType.paramInfos + assert(argss.head.size == argTypes.size) + interpretArgsGroup(argss.head, argTypes) ::: interpretArgs(argss.tail, fnType.resType) + case fnType: AppliedType if defn.isContextFunctionType(fnType) => + val argTypes :+ resType = fnType.args: @unchecked + interpretArgsGroup(argss.head, argTypes) ::: interpretArgs(argss.tail, resType) + case fnType: PolyType => interpretArgs(argss, fnType.resType) + case fnType: ExprType => interpretArgs(argss, fnType.resType) + case _ => + assert(argss.isEmpty) + Nil + } + + private def interpretBlock(stats: List[Tree], expr: Tree)(using Env) = { + var unexpected: Option[Object] = None + val newEnv = stats.foldLeft(env)((accEnv, stat) => stat match + case stat: ValDef => + accEnv.updated(stat.symbol, interpretTree(stat.rhs)(using accEnv)) + case stat => + if (unexpected.isEmpty) + unexpected = Some(unexpectedTree(stat)) + accEnv + ) + unexpected.getOrElse(interpretTree(expr)(using newEnv)) + } + + private def 
interpretLiteral(value: Any): Object = + value.asInstanceOf[Object] + + private def interpretVarargs(args: List[Object]): Object = + args.toSeq + + private def interpretedStaticMethodCall(moduleClass: Symbol, fn: Symbol, args: List[Object]): Object = { + val (inst, clazz) = + try + if (moduleClass.name.startsWith(str.REPL_SESSION_LINE)) + (null, loadReplLineClass(moduleClass)) + else { + val inst = loadModule(moduleClass) + (inst, inst.getClass) + } + catch + case MissingClassDefinedInCurrentRun(sym) => + suspendOnMissing(sym, pos) + + val name = fn.name.asTermName + val method = getMethod(clazz, name, paramsSig(fn)) + stopIfRuntimeException(method.invoke(inst, args: _*), method) + } + + private def interpretedStaticFieldAccess(sym: Symbol): Object = { + val clazz = loadClass(sym.owner.fullName.toString) + val field = clazz.getField(sym.name.toString) + field.get(null) + } + + private def interpretModuleAccess(fn: Symbol): Object = + loadModule(fn.moduleClass) + + private def interpretNew(fn: Symbol, args: List[Object]): Object = { + val className = fn.owner.fullName.mangledString.replaceAll("\\$\\.", "\\$") + val clazz = loadClass(className) + val constr = clazz.getConstructor(paramsSig(fn): _*) + constr.newInstance(args: _*).asInstanceOf[Object] + } + + private def unexpectedTree(tree: Tree): Object = + throw new StopInterpretation(em"Unexpected tree could not be interpreted: ${tree.toString}", tree.srcPos) + + private def loadModule(sym: Symbol): Object = + if (sym.owner.is(Package)) { + // is top level object + val moduleClass = loadClass(sym.fullName.toString) + moduleClass.getField(str.MODULE_INSTANCE_FIELD).get(null) + } + else { + // nested object in an object + val clazz = loadClass(sym.binaryClassName) + clazz.getConstructor().newInstance().asInstanceOf[Object] + } + + private def loadReplLineClass(moduleClass: Symbol): Class[?] = { + val lineClassloader = new AbstractFileClassLoader(ctx.settings.outputDir.value, classLoader) + lineClassloader.loadClass(moduleClass.name.firstPart.toString) + } + + private def loadClass(name: String): Class[?] 
= + try classLoader.loadClass(name) + catch + case MissingClassDefinedInCurrentRun(sym) => + suspendOnMissing(sym, pos) + + + private def getMethod(clazz: Class[?], name: Name, paramClasses: List[Class[?]]): JLRMethod = + try clazz.getMethod(name.toString, paramClasses: _*) + catch { + case _: NoSuchMethodException => + val msg = em"Could not find method ${clazz.getCanonicalName}.$name with parameters ($paramClasses%, %)" + throw new StopInterpretation(msg, pos) + case MissingClassDefinedInCurrentRun(sym) => + suspendOnMissing(sym, pos) + } + + private def stopIfRuntimeException[T](thunk: => T, method: JLRMethod): T = + try thunk + catch { + case ex: RuntimeException => + val sw = new StringWriter() + sw.write("A runtime exception occurred while executing macro expansion\n") + sw.write(ex.getMessage) + sw.write("\n") + ex.printStackTrace(new PrintWriter(sw)) + sw.write("\n") + throw new StopInterpretation(sw.toString.toMessage, pos) + case ex: InvocationTargetException => + ex.getTargetException match { + case ex: scala.quoted.runtime.StopMacroExpansion => + throw ex + case MissingClassDefinedInCurrentRun(sym) => + suspendOnMissing(sym, pos) + case targetException => + val sw = new StringWriter() + sw.write("Exception occurred while executing macro expansion.\n") + if (!ctx.settings.Ydebug.value) { + val end = targetException.getStackTrace.lastIndexWhere { x => + x.getClassName == method.getDeclaringClass.getCanonicalName && x.getMethodName == method.getName + } + val shortStackTrace = targetException.getStackTrace.take(end + 1) + targetException.setStackTrace(shortStackTrace) + } + targetException.printStackTrace(new PrintWriter(sw)) + sw.write("\n") + throw new StopInterpretation(sw.toString.toMessage, pos) + } + } + + /** List of classes of the parameters of the signature of `sym` */ + private def paramsSig(sym: Symbol): List[Class[?]] = { + def paramClass(param: Type): Class[?] = { + def arrayDepth(tpe: Type, depth: Int): (Type, Int) = tpe match { + case JavaArrayType(elemType) => arrayDepth(elemType, depth + 1) + case _ => (tpe, depth) + } + def javaArraySig(tpe: Type): String = { + val (elemType, depth) = arrayDepth(tpe, 0) + val sym = elemType.classSymbol + val suffix = + if (sym == defn.BooleanClass) "Z" + else if (sym == defn.ByteClass) "B" + else if (sym == defn.ShortClass) "S" + else if (sym == defn.IntClass) "I" + else if (sym == defn.LongClass) "J" + else if (sym == defn.FloatClass) "F" + else if (sym == defn.DoubleClass) "D" + else if (sym == defn.CharClass) "C" + else "L" + javaSig(elemType) + ";" + ("[" * depth) + suffix + } + def javaSig(tpe: Type): String = tpe match { + case tpe: JavaArrayType => javaArraySig(tpe) + case _ => + // Take the flatten name of the class and the full package name + val pack = tpe.classSymbol.topLevelClass.owner + val packageName = if (pack == defn.EmptyPackageClass) "" else s"${pack.fullName}." 
+ packageName + tpe.classSymbol.fullNameSeparated(FlatName).toString + } + + val sym = param.classSymbol + if (sym == defn.BooleanClass) classOf[Boolean] + else if (sym == defn.ByteClass) classOf[Byte] + else if (sym == defn.CharClass) classOf[Char] + else if (sym == defn.ShortClass) classOf[Short] + else if (sym == defn.IntClass) classOf[Int] + else if (sym == defn.LongClass) classOf[Long] + else if (sym == defn.FloatClass) classOf[Float] + else if (sym == defn.DoubleClass) classOf[Double] + else java.lang.Class.forName(javaSig(param), false, classLoader) + } + def getExtraParams(tp: Type): List[Type] = tp.widenDealias match { + case tp: AppliedType if defn.isContextFunctionType(tp) => + // Call context function type direct method + tp.args.init.map(arg => TypeErasure.erasure(arg)) ::: getExtraParams(tp.args.last) + case _ => Nil + } + val extraParams = getExtraParams(sym.info.finalResultType) + val allParams = TypeErasure.erasure(sym.info) match { + case meth: MethodType => meth.paramInfos ::: extraParams + case _ => extraParams + } + allParams.map(paramClass) + } +end Interpreter + +object Interpreter: + /** Exception that stops interpretation if some issue is found */ + class StopInterpretation(val msg: Message, val pos: SrcPos) extends Exception + + object Call: + import tpd._ + /** Matches an expression that is either a field access or an application + * It retruns a TermRef containing field accessed or a method reference and the arguments passed to it. + */ + def unapply(arg: Tree)(using Context): Option[(RefTree, List[List[Tree]])] = + Call0.unapply(arg).map((fn, args) => (fn, args.reverse)) + + private object Call0 { + def unapply(arg: Tree)(using Context): Option[(RefTree, List[List[Tree]])] = arg match { + case Select(Call0(fn, args), nme.apply) if defn.isContextFunctionType(fn.tpe.widenDealias.finalResultType) => + Some((fn, args)) + case fn: Ident => Some((tpd.desugarIdent(fn).withSpan(fn.span), Nil)) + case fn: Select => Some((fn, Nil)) + case Apply(f @ Call0(fn, args1), args2) => + if (f.tpe.widenDealias.isErasedMethod) Some((fn, args1)) + else Some((fn, args2 :: args1)) + case TypeApply(Call0(fn, args), _) => Some((fn, args)) + case _ => None + } + } + end Call + + object MissingClassDefinedInCurrentRun { + def unapply(targetException: Throwable)(using Context): Option[Symbol] = { + if !ctx.compilationUnit.isSuspendable then None + else targetException match + case _: NoClassDefFoundError | _: ClassNotFoundException => + val className = targetException.getMessage + if className eq null then None + else + val sym = staticRef(className.toTypeName).symbol + if (sym.isDefinedInCurrentRun) Some(sym) else None + case _ => None + } + } + + def suspendOnMissing(sym: Symbol, pos: SrcPos)(using Context): Nothing = + if ctx.settings.XprintSuspension.value then + report.echo(i"suspension triggered by a dependency on $sym", pos) + ctx.compilationUnit.suspend() // this throws a SuspendException diff --git a/compiler/src/dotty/tools/dotc/quoted/PickledQuotes.scala b/compiler/src/dotty/tools/dotc/quoted/PickledQuotes.scala index 41f3fd4f64f3..20bcba417a5e 100644 --- a/compiler/src/dotty/tools/dotc/quoted/PickledQuotes.scala +++ b/compiler/src/dotty/tools/dotc/quoted/PickledQuotes.scala @@ -5,6 +5,7 @@ import dotty.tools.dotc.ast.{TreeTypeMap, tpd} import dotty.tools.dotc.config.Printers._ import dotty.tools.dotc.core.Contexts._ import dotty.tools.dotc.core.Decorators._ +import dotty.tools.dotc.core.Flags._ import dotty.tools.dotc.core.Mode import dotty.tools.dotc.core.Symbols._ import 
dotty.tools.dotc.core.Types._ @@ -12,7 +13,7 @@ import dotty.tools.dotc.core.tasty.{ PositionPickler, TastyPickler, TastyPrinter import dotty.tools.dotc.core.tasty.DottyUnpickler import dotty.tools.dotc.core.tasty.TreeUnpickler.UnpickleMode import dotty.tools.dotc.report - +import dotty.tools.dotc.reporting.Message import scala.quoted.Quotes import scala.quoted.runtime.impl._ @@ -220,10 +221,10 @@ object PickledQuotes { treePkl.pickle(tree :: Nil) treePkl.compactify() if tree.span.exists then - val positionWarnings = new mutable.ListBuffer[String]() + val positionWarnings = new mutable.ListBuffer[Message]() val reference = ctx.settings.sourceroot.value - new PositionPickler(pickler, treePkl.buf.addrOfTree, treePkl.treeAnnots, reference) - .picklePositions(ctx.compilationUnit.source, tree :: Nil, positionWarnings) + PositionPickler.picklePositions(pickler, treePkl.buf.addrOfTree, treePkl.treeAnnots, reference, + ctx.compilationUnit.source, tree :: Nil, positionWarnings) positionWarnings.foreach(report.warning(_)) val pickled = pickler.assembleParts() @@ -248,23 +249,41 @@ object PickledQuotes { case pickled: String => TastyString.unpickle(pickled) case pickled: List[String] => TastyString.unpickle(pickled) - quotePickling.println(s"**** unpickling quote from TASTY\n${TastyPrinter.showContents(bytes, ctx.settings.color.value == "never")}") + val unpicklingContext = + if ctx.owner.isClass then + // When a quote is unpickled with a Quotes context that has a class `spliceOwner` + // we need to use a dummy owner to unpickle it. Otherwise any definitions defined + // in the quoted block would be accidentally entered in the class. + // When splicing this expression, this owner is replaced with the correct owner (see `quotedExprToTree` and `quotedTypeToTree` above). + // On the other hand, if the expression is used as a reflect term, the user must call `changeOwner` (same as with other expressions used within a nested owner). + // `-Xcheck-macros` will check for inconsistent owners and provide the users with hints on how to improve them. + // + // A Quotes context that has a class `spliceOwner` can come from a macro annotation + // or a user setting it explicitly using `Symbol.asQuotes`.
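The comment above describes unpickling under a throwaway owner so that definitions entered during unpickling do not leak into the enclosing class. A toy model of that idea, with invented names and no relation to the compiler's real symbol tables:

object DummyOwnerSketch:
  import scala.collection.mutable.ListBuffer

  final class Owner(val name: String):
    val entered = ListBuffer.empty[String]        // definitions owned by this symbol

  // Stand-in for unpickling: it enters every definition it finds into `owner`.
  def unpickleInto(owner: Owner)(defs: List[String]): Unit =
    defs.foreach(owner.entered += _)

  def main(args: Array[String]): Unit =
    val realClass = Owner("C")
    val scratch   = Owner("$quoteOwnedByClass$")  // the throwaway owner
    unpickleInto(scratch)(List("x", "y"))
    println(realClass.entered)                    // ListBuffer(): nothing leaked into the class
    realClass.entered ++= scratch.entered         // re-homed only when the quote is spliced
    println(realClass.entered)                    // ListBuffer(x, y)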
+ ctx.withOwner(newSymbol(ctx.owner, "$quoteOwnedByClass$".toTermName, Private, defn.AnyType, NoSymbol)) + else ctx - val mode = if (isType) UnpickleMode.TypeTree else UnpickleMode.Term - val unpickler = new DottyUnpickler(bytes, mode) - unpickler.enter(Set.empty) + inContext(unpicklingContext) { - val tree = unpickler.tree - QuotesCache(pickled) = tree + quotePickling.println(s"**** unpickling quote from TASTY\n${TastyPrinter.showContents(bytes, ctx.settings.color.value == "never")}") - // Make sure trees and positions are fully loaded - new TreeTraverser { - def traverse(tree: Tree)(using Context): Unit = traverseChildren(tree) - }.traverse(tree) + val mode = if (isType) UnpickleMode.TypeTree else UnpickleMode.Term + val unpickler = new DottyUnpickler(bytes, mode) + unpickler.enter(Set.empty) - quotePickling.println(i"**** unpickled quote\n$tree") + val tree = unpickler.tree + QuotesCache(pickled) = tree + + // Make sure trees and positions are fully loaded + new TreeTraverser { + def traverse(tree: Tree)(using Context): Unit = traverseChildren(tree) + }.traverse(tree) + + quotePickling.println(i"**** unpickled quote\n$tree") + + tree + } - tree } } diff --git a/compiler/src/dotty/tools/dotc/report.scala b/compiler/src/dotty/tools/dotc/report.scala index 00399ecbfd0a..c92fbe5daa56 100644 --- a/compiler/src/dotty/tools/dotc/report.scala +++ b/compiler/src/dotty/tools/dotc/report.scala @@ -26,30 +26,18 @@ object report: def deprecationWarning(msg: Message, pos: SrcPos)(using Context): Unit = issueWarning(new DeprecationWarning(msg, pos.sourcePos)) - def deprecationWarning(msg: => String, pos: SrcPos)(using Context): Unit = - deprecationWarning(msg.toMessage, pos) - def migrationWarning(msg: Message, pos: SrcPos)(using Context): Unit = issueWarning(new MigrationWarning(msg, pos.sourcePos)) - def migrationWarning(msg: => String, pos: SrcPos)(using Context): Unit = - migrationWarning(msg.toMessage, pos) - def uncheckedWarning(msg: Message, pos: SrcPos)(using Context): Unit = issueWarning(new UncheckedWarning(msg, pos.sourcePos)) - def uncheckedWarning(msg: => String, pos: SrcPos)(using Context): Unit = - uncheckedWarning(msg.toMessage, pos) - def featureWarning(msg: Message, pos: SrcPos)(using Context): Unit = issueWarning(new FeatureWarning(msg, pos.sourcePos)) - def featureWarning(msg: => String, pos: SrcPos)(using Context): Unit = - featureWarning(msg.toMessage, pos) - def featureWarning(feature: String, featureDescription: => String, - featureUseSite: Symbol, required: Boolean, pos: SrcPos)(using Context): Unit = { - val req = if (required) "needs to" else "should" + featureUseSite: Symbol, required: Boolean, pos: SrcPos)(using Context): Unit = + val req = if required then "needs to" else "should" val fqname = s"scala.language.$feature" val explain = @@ -60,47 +48,48 @@ object report: |See the Scala docs for value $fqname for a discussion |why the feature $req be explicitly enabled.""".stripMargin - def msg = s"""$featureDescription $req be enabled - |by adding the import clause 'import $fqname' - |or by setting the compiler option -language:$feature.$explain""".stripMargin - if (required) error(msg, pos) - else issueWarning(new FeatureWarning(msg.toMessage, pos.sourcePos)) - } + def msg = em"""$featureDescription $req be enabled + |by adding the import clause 'import $fqname' + |or by setting the compiler option -language:$feature.$explain""" + if required then error(msg, pos) + else issueWarning(new FeatureWarning(msg, pos.sourcePos)) + end featureWarning def warning(msg: Message, pos: 
SrcPos)(using Context): Unit = issueWarning(new Warning(msg, addInlineds(pos))) + def warning(msg: Message)(using Context): Unit = + warning(msg, NoSourcePosition) + def warning(msg: => String, pos: SrcPos = NoSourcePosition)(using Context): Unit = warning(msg.toMessage, pos) - def error(msg: Message, pos: SrcPos)(using Context): Unit = + def error(msg: Message, pos: SrcPos = NoSourcePosition)(using Context): Unit = val fullPos = addInlineds(pos) ctx.reporter.report(new Error(msg, fullPos)) if ctx.settings.YdebugError.value then Thread.dumpStack() - def error(msg: => String, pos: SrcPos = NoSourcePosition)(using Context): Unit = + def error(msg: => String, pos: SrcPos)(using Context): Unit = error(msg.toMessage, pos) + def error(msg: => String)(using Context): Unit = + error(msg, NoSourcePosition) + def error(ex: TypeError, pos: SrcPos)(using Context): Unit = val fullPos = addInlineds(pos) ctx.reporter.report(new StickyError(ex.toMessage, fullPos)) if ctx.settings.YdebugError.value then Thread.dumpStack() + if ctx.settings.YdebugTypeError.value then ex.printStackTrace() def errorOrMigrationWarning(msg: Message, pos: SrcPos, from: SourceVersion)(using Context): Unit = if sourceVersion.isAtLeast(from) then if sourceVersion.isMigrating && sourceVersion.ordinal <= from.ordinal then migrationWarning(msg, pos) else error(msg, pos) - def errorOrMigrationWarning(msg: => String, pos: SrcPos, from: SourceVersion)(using Context): Unit = - errorOrMigrationWarning(msg.toMessage, pos, from) - def gradualErrorOrMigrationWarning(msg: Message, pos: SrcPos, warnFrom: SourceVersion, errorFrom: SourceVersion)(using Context): Unit = if sourceVersion.isAtLeast(errorFrom) then errorOrMigrationWarning(msg, pos, errorFrom) else if sourceVersion.isAtLeast(warnFrom) then warning(msg, pos) - def gradualErrorOrMigrationWarning(msg: => String, pos: SrcPos, warnFrom: SourceVersion, errorFrom: SourceVersion)(using Context): Unit = - gradualErrorOrMigrationWarning(msg.toMessage, pos, warnFrom, errorFrom) - def restrictionError(msg: Message, pos: SrcPos = NoSourcePosition)(using Context): Unit = error(msg.mapMsg("Implementation restriction: " + _), pos) diff --git a/compiler/src/dotty/tools/dotc/reporting/Diagnostic.scala b/compiler/src/dotty/tools/dotc/reporting/Diagnostic.scala index a92da7821fab..624aa93924e8 100644 --- a/compiler/src/dotty/tools/dotc/reporting/Diagnostic.scala +++ b/compiler/src/dotty/tools/dotc/reporting/Diagnostic.scala @@ -9,7 +9,7 @@ import dotty.tools.dotc.core.Contexts._ import dotty.tools.dotc.interfaces.Diagnostic.{ERROR, INFO, WARNING} import dotty.tools.dotc.util.SourcePosition -import java.util.Optional +import java.util.{Collections, Optional, List => JList} import scala.util.chaining._ import core.Decorators.toMessage @@ -89,7 +89,7 @@ class Diagnostic( val msg: Message, val pos: SourcePosition, val level: Int -) extends Exception with interfaces.Diagnostic: +) extends interfaces.Diagnostic: private var verbose: Boolean = false def isVerbose: Boolean = verbose def setVerbose(): this.type = @@ -100,7 +100,8 @@ class Diagnostic( if (pos.exists && pos.source.exists) Optional.of(pos) else Optional.empty() override def message: String = msg.message.replaceAll("\u001B\\[[;\\d]*m", "") + override def diagnosticRelatedInformation: JList[interfaces.DiagnosticRelatedInformation] = + Collections.emptyList() override def toString: String = s"$getClass at $pos: $message" - override def getMessage(): String = message end Diagnostic diff --git 
a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala index d9140a6309b8..9f0d71645833 100644 --- a/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala +++ b/compiler/src/dotty/tools/dotc/reporting/ErrorMessageID.scala @@ -176,7 +176,7 @@ enum ErrorMessageID(val isActive: Boolean = true) extends java.lang.Enum[ErrorMe case JavaEnumParentArgsID // errorNumber: 160 case AlreadyDefinedID // errorNumber: 161 case CaseClassInInlinedCodeID // errorNumber: 162 - case OverrideTypeMismatchErrorID // errorNumber: 163 + case OverrideTypeMismatchErrorID extends ErrorMessageID(isActive = false) // errorNumber: 163 case OverrideErrorID // errorNumber: 164 case MatchableWarningID // errorNumber: 165 case CannotExtendFunctionID // errorNumber: 166 @@ -185,6 +185,10 @@ enum ErrorMessageID(val isActive: Boolean = true) extends java.lang.Enum[ErrorMe case TargetNameOnTopLevelClassID // errorNumber: 169 case NotClassTypeID // errorNumber 170 case MissingArgumentID // errorNumer 171 + case MissingImplicitArgumentID // errorNumber 172 + case CannotBeAccessedID // errorNumber 173 + case InlineGivenShouldNotBeFunctionID // errorNumber 174 + case ValueDiscardingID // errorNumber 175 def errorNumber = ordinal - 1 diff --git a/compiler/src/dotty/tools/dotc/reporting/Message.scala b/compiler/src/dotty/tools/dotc/reporting/Message.scala index 9e397d606491..a1fe6773c1d2 100644 --- a/compiler/src/dotty/tools/dotc/reporting/Message.scala +++ b/compiler/src/dotty/tools/dotc/reporting/Message.scala @@ -2,17 +2,35 @@ package dotty.tools package dotc package reporting -import core.Contexts.*, core.Decorators.*, core.Mode +import core.* +import Contexts.*, Decorators.*, Symbols.*, Types.*, Flags.* +import printing.{RefinedPrinter, MessageLimiter, ErrorMessageLimiter} +import printing.Texts.Text +import printing.Formatting.hl import config.SourceVersion import scala.language.unsafeNulls import scala.annotation.threadUnsafe -object Message { - val nonSensicalStartTag: String = "" - val nonSensicalEndTag: String = "" - +/** ## Tips for error message generation + * + * - You can use the `em` interpolator for error messages. It's defined in core.Decorators. + * - You can also use a simple string argument for `error` or `warning` (not for the other variants), + * but the string should not be interpolated or composed of objects that require a + * Context for evaluation. + * - When embedding interpolated substrings defined elsewhere in error messages, + * use `i` and make sure they are defined as def's instead of vals. That way, the + * possibly expensive interpolation will performed only in the case where the message + * is eventually printed. Note: At least during typer, it's common for messages + * to be discarded without being printed. Also, by making them defs, you ensure that + * they will be evaluated in the Message context, which makes formatting safer + * and more robust. + * - For common messages, or messages that might require explanation, prefer defining + * a new `Message` class in file `messages.scala` and use that instead. The advantage is that these + * messages have unique IDs that can be referenced elsewhere. 
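To illustrate the "defs instead of vals" advice in the tips above, here is a small self-contained sketch (invented names) showing how a by-name message body defers the possibly expensive interpolation until the message is actually rendered:

object LazyMessageSketch:
  // A by-name message body: nothing is formatted until `message` is forced.
  final class Msg(body: => String):
    lazy val message: String = body

  var renderCount = 0
  def expensiveDetails(): String =
    renderCount += 1
    "...long, costly-to-format details..."

  def main(args: Array[String]): Unit =
    def details = expensiveDetails()              // a def, so nothing is computed yet
    val warning = Msg(s"something looks off: $details")
    println(renderCount)                          // 0: the message was never rendered
    println(warning.message)                      // forces the interpolation exactly once
    println(renderCount)                          // 1

If `details` had been a val, the costly formatting would run even for messages that are discarded without ever being printed.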
+ */ +object Message: def rewriteNotice(what: String, version: SourceVersion | Null = null, options: String = "")(using Context): String = if !ctx.mode.is(Mode.Interactive) then val sourceStr = if version != null then i"-source $version" else "" @@ -22,7 +40,188 @@ object Message { else i"$sourceStr $options" i"\n$what can be rewritten automatically under -rewrite $optionStr." else "" -} + + private type Recorded = Symbol | ParamRef | SkolemType + + private case class SeenKey(str: String, isType: Boolean) + + /** A class that records printed items of one of the types in `Recorded`, + * adds superscripts for disambiguations, and can explain recorded symbols + * in ` where` clause + */ + private class Seen(disambiguate: Boolean): + + val seen = new collection.mutable.HashMap[SeenKey, List[Recorded]]: + override def default(key: SeenKey) = Nil + + var nonSensical = false + + /** If false, stop all recordings */ + private var recordOK = disambiguate + + /** Clear all entries and stop further entries to be added */ + def disable() = + seen.clear() + recordOK = false + + /** Record an entry `entry` with given String representation `str` and a + * type/term namespace identified by `isType`. + * If the entry was not yet recorded, allocate the next superscript corresponding + * to the same string in the same name space. The first recording is the string proper + * and following recordings get consecutive superscripts starting with 2. + * @return The possibly superscripted version of `str`. + */ + def record(str: String, isType: Boolean, entry: Recorded)(using Context): String = + if !recordOK then return str + //println(s"recording $str, $isType, $entry") + + /** If `e1` is an alias of another class of the same name, return the other + * class symbol instead. This normalization avoids recording e.g. 
scala.List + * and scala.collection.immutable.List as two different types + */ + def followAlias(e1: Recorded): Recorded = e1 match { + case e1: Symbol if e1.isAliasType => + val underlying = e1.typeRef.underlyingClassRef(refinementOK = false).typeSymbol + if (underlying.name == e1.name) underlying else e1 + case _ => e1 + } + val key = SeenKey(str, isType) + val existing = seen(key) + lazy val dealiased = followAlias(entry) + + // alts: The alternatives in `existing` that are equal, or follow (an alias of) `entry` + var alts = existing.dropWhile(alt => dealiased ne followAlias(alt)) + if alts.isEmpty then + alts = entry :: existing + seen(key) = alts + + val suffix = alts.length match { + case 1 => "" + case n => n.toString.toCharArray.map { + case '0' => '⁰' + case '1' => '¹' + case '2' => '²' + case '3' => '³' + case '4' => '⁴' + case '5' => '⁵' + case '6' => '⁶' + case '7' => '⁷' + case '8' => '⁸' + case '9' => '⁹' + }.mkString + } + str + suffix + end record + + /** Create explanation for single `Recorded` type or symbol */ + private def explanation(entry: AnyRef)(using Context): String = + def boundStr(bound: Type, default: ClassSymbol, cmp: String) = + if (bound.isRef(default)) "" else i"$cmp $bound" + + def boundsStr(bounds: TypeBounds): String = { + val lo = boundStr(bounds.lo, defn.NothingClass, ">:") + val hi = boundStr(bounds.hi, defn.AnyClass, "<:") + if (lo.isEmpty) hi + else if (hi.isEmpty) lo + else s"$lo and $hi" + } + + def addendum(cat: String, info: Type): String = info match { + case bounds @ TypeBounds(lo, hi) if bounds ne TypeBounds.empty => + if (lo eq hi) i" which is an alias of $lo" + else i" with $cat ${boundsStr(bounds)}" + case _ => + "" + } + + entry match { + case param: TypeParamRef => + s"is a type variable${addendum("constraint", TypeComparer.bounds(param))}" + case param: TermParamRef => + s"is a reference to a value parameter" + case sym: Symbol => + val info = + if (ctx.gadt.contains(sym)) + sym.info & ctx.gadt.fullBounds(sym) + else + sym.info + s"is a ${ctx.printer.kindString(sym)}${sym.showExtendedLocation}${addendum("bounds", info)}" + case tp: SkolemType => + s"is an unknown value of type ${tp.widen.show}" + } + end explanation + + /** Produce a where clause with explanations for recorded iterms. 
+ */ + def explanations(using Context): String = + def needsExplanation(entry: Recorded) = entry match { + case param: TypeParamRef => ctx.typerState.constraint.contains(param) + case param: ParamRef => false + case skolem: SkolemType => true + case sym: Symbol => + ctx.gadt.contains(sym) && ctx.gadt.fullBounds(sym) != TypeBounds.empty + } + + val toExplain: List[(String, Recorded)] = seen.toList.flatMap { kvs => + val res: List[(String, Recorded)] = kvs match { + case (key, entry :: Nil) => + if (needsExplanation(entry)) (key.str, entry) :: Nil else Nil + case (key, entries) => + for (alt <- entries) yield { + val tickedString = record(key.str, key.isType, alt) + (tickedString, alt) + } + } + res // help the inferrencer out + }.sortBy(_._1) + + def columnar(parts: List[(String, String)]): List[String] = { + lazy val maxLen = parts.map(_._1.length).max + parts.map { + case (leader, trailer) => + val variable = hl(leader) + s"""$variable${" " * (maxLen - leader.length)} $trailer""" + } + } + + val explainParts = toExplain.map { case (str, entry) => (str, explanation(entry)) } + val explainLines = columnar(explainParts) + if (explainLines.isEmpty) "" else i"where: $explainLines%\n %\n" + end explanations + end Seen + + /** Printer to be used when formatting messages */ + private class Printer(val seen: Seen, _ctx: Context) extends RefinedPrinter(_ctx): + + /** True if printer should a show source module instead of its module class */ + private def useSourceModule(sym: Symbol): Boolean = + sym.is(ModuleClass, butNot = Package) && sym.sourceModule.exists && !_ctx.settings.YdebugNames.value + + override def simpleNameString(sym: Symbol): String = + if useSourceModule(sym) then simpleNameString(sym.sourceModule) + else seen.record(super.simpleNameString(sym), sym.isType, sym) + + override def ParamRefNameString(param: ParamRef): String = + seen.record(super.ParamRefNameString(param), param.isInstanceOf[TypeParamRef], param) + + override def toTextRef(tp: SingletonType): Text = tp match + case tp: SkolemType => seen.record(tp.repr.toString, isType = true, tp) + case _ => super.toTextRef(tp) + + override def toText(tp: Type): Text = + if !tp.exists || tp.isErroneous then seen.nonSensical = true + tp match + case tp: TypeRef if useSourceModule(tp.symbol) => Str("object ") ~ super.toText(tp) + case _ => super.toText(tp) + + override def toText(sym: Symbol): Text = + sym.infoOrCompleter match + case _: ErrorType | TypeAlias(_: ErrorType) | NoType => seen.nonSensical = true + case _ => + super.toText(sym) + end Printer + +end Message /** A `Message` contains all semantic information necessary to easily * comprehend what caused the message to be logged. Each message can be turned @@ -39,9 +238,41 @@ object Message { * * @param errorId a unique id identifying the message, this will be * used to reference documentation online + * + * Messages modify the rendendering of interpolated strings in several ways: + * + * 1. The size of the printed code is limited with a MessafeLimiter. If the message + * would get too large or too deeply nested, a `...` is printed instead. + * 2. References to module classes are prefixed with `object ` for better recogniability. + * 3. A where clause is sometimes added which contains the following additional explanations: + * - Rerences are disambiguated: If a message contains occurrences of the same identifier + * representing different symbols, the duplicates are printed with superscripts + * and the where-clause explains where each symbol is located. 
+ * - Uninstantiated variables are explained in the where-clause with additional + * info about their bounds. + * - Skolems are explained with additional info about their underlying type. + * + * Messages inheriting from the NoDisambiguation trait or returned from the + * `noDisambiguation()` method skip point (3) above. This makes sense if the + * message already exolains where different occurrences of the same identifier + * are located. Examples are NamingMsgs such as double definition errors, + * overriding errors, and ambiguous implicit errors. + * + * We consciously made the design decision to disambiguate by default and disable + * disambiguation as an opt-in. The reason is that one usually does not consider all + * fine-grained details when writing an error message. If disambiguation is the default, + * some tests will show where clauses that look too noisy and that then can be disabled + * when needed. But if silence is the default, one usually does not realize that + * better info could be obtained by turning disambiguation on. */ -abstract class Message(val errorId: ErrorMessageID) { self => - import Message._ +abstract class Message(val errorId: ErrorMessageID)(using Context) { self => + import Message.* + + /** The kind of the error message, e.g. "Syntax" or "Type Mismatch". + * This will be printed as "$kind Error", "$kind Warning", etc, on the first + * line of the message. + */ + def kind: MessageKind /** The `msg` contains the diagnostic message e.g: * @@ -52,22 +283,27 @@ abstract class Message(val errorId: ErrorMessageID) { self => * `Diagnostic`. The message is given in raw form, with possible embedded * tags. */ - protected def msg: String - - /** The kind of the error message, e.g. "Syntax" or "Type Mismatch". - * This will be printed as "$kind Error", "$kind Warning", etc, on the first - * line of the message. - */ - def kind: MessageKind + protected def msg(using Context): String /** The explanation should provide a detailed description of why the error * occurred and use examples from the user's own code to illustrate how to * avoid these errors. It might contain embedded tags. */ - protected def explain: String + protected def explain(using Context): String - /** A message suffix that can be added for certain subclasses */ - protected def msgSuffix: String = "" + /** What gets printed after the message proper */ + protected def msgPostscript(using Context): String = + if ctx eq NoContext then "" + else ctx.printer match + case msgPrinter: Message.Printer => + myIsNonSensical = msgPrinter.seen.nonSensical + val addendum = msgPrinter.seen.explanations + msgPrinter.seen.disable() + // Clear entries and stop futher recording so that messages containing the current + // one don't repeat the explanations or use explanations from the msgPostscript. + if addendum.isEmpty then "" else "\n\n" ++ addendum + case _ => + "" /** Does this message have an explanation? * This is normally the same as `explain.nonEmpty` but can be overridden @@ -76,61 +312,69 @@ abstract class Message(val errorId: ErrorMessageID) { self => */ def canExplain: Boolean = explain.nonEmpty - private var myMsg: String | Null = null private var myIsNonSensical: Boolean = false - private def dropNonSensical(msg: String): String = - if msg.contains(nonSensicalStartTag) then - myIsNonSensical = true - // myMsg might be composed of several d"..." 
invocations -> nested - // nonsensical tags possible - msg - .replace(nonSensicalStartTag, "") - .replace(nonSensicalEndTag, "") - else msg + /** A message is non-sensical if it contains references to internally + * generated error types. Normally we want to suppress error messages + * referring to types like this because they look weird and are normally + * follow-up errors to something that was diagnosed before. + */ + def isNonSensical: Boolean = { message; myIsNonSensical } + + private var disambiguate: Boolean = true + + def withoutDisambiguation(): this.type = + disambiguate = false + this - /** The message with potential embedded tags */ - def rawMessage = message + private def inMessageContext(disambiguate: Boolean)(op: Context ?=> String): String = + if ctx eq NoContext then op + else + val msgContext = ctx.printer match + case _: Message.Printer => ctx + case _ => + val seen = Seen(disambiguate) + val ctx1 = ctx.fresh.setPrinterFn(Message.Printer(seen, _)) + if !ctx1.property(MessageLimiter).isDefined then + ctx1.setProperty(MessageLimiter, ErrorMessageLimiter()) + ctx1 + op(using msgContext) /** The message to report. tags are filtered out */ - @threadUnsafe lazy val message: String = dropNonSensical(msg + msgSuffix) + @threadUnsafe lazy val message: String = + inMessageContext(disambiguate)(msg + msgPostscript) /** The explanation to report. tags are filtered out */ - @threadUnsafe lazy val explanation: String = dropNonSensical(explain) - - /** A message is non-sensical if it contains references to - * tags. Such tags are inserted by the error diagnostic framework if a - * message contains references to internally generated error types. Normally - * we want to suppress error messages referring to types like this because - * they look weird and are normally follow-up errors to something that was - * diagnosed before. - */ - def isNonSensical: Boolean = { message; myIsNonSensical } + @threadUnsafe lazy val explanation: String = + inMessageContext(disambiguate = false)(explain) /** The implicit `Context` in messages is a large thing that we don't want * persisted. This method gets around that by duplicating the message, * forcing its `msg` and `explanation` vals and dropping the implicit context * that was captured in the original message. 
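The `persist` idea described above can be sketched independently of the compiler: force the lazily computed text while the large context is still available, then return a copy that no longer captures it. All names below are invented:

object PersistSketch:
  final class Ctx(val payload: Array[Byte])       // stand-in for a large compiler Context

  trait Diag:
    def text: String                              // may capture a Ctx
    final def persist: Diag =
      val forced = text                           // force while the Ctx is still live
      new Diag { def text = forced }              // keeps only the rendered String

  def main(args: Array[String]): Unit =
    val big = new Ctx(new Array[Byte](1 << 20))
    val live = new Diag {
      def text = s"rendered with ${big.payload.length} bytes of context at hand"
    }
    val kept = live.persist                       // safe to cache; `big` can be collected
    println(kept.text)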
*/ - def persist: Message = new Message(errorId) { - val kind = self.kind - val msg = self.msg - val explain = self.explain + def persist: Message = new Message(errorId)(using NoContext): + val kind = self.kind + private val persistedMsg = self.message + private val persistedExplain = self.explanation + def msg(using Context) = persistedMsg + def explain(using Context) = persistedExplain override val canExplain = self.canExplain - } + override def isNonSensical = self.isNonSensical def append(suffix: => String): Message = mapMsg(_ ++ suffix) + def prepend(prefix: => String): Message = mapMsg(prefix ++ _) def mapMsg(f: String => String): Message = new Message(errorId): - val kind = self.kind - def msg = f(self.msg) - def explain = self.explain + val kind = self.kind + def msg(using Context) = f(self.msg) + def explain(using Context) = self.explain override def canExplain = self.canExplain def appendExplanation(suffix: => String): Message = new Message(errorId): - val kind = self.kind - def msg = self.msg - def explain = self.explain ++ suffix + val kind = self.kind + def msg(using Context) = self.msg + def explain(using Context) = self.explain ++ suffix override def canExplain = true /** Override with `true` for messages that should always be shown even if their @@ -143,10 +387,14 @@ abstract class Message(val errorId: ErrorMessageID) { self => override def toString = msg } +/** A marker trait that suppresses generation of `where` clause for disambiguations */ +trait NoDisambiguation extends Message: + withoutDisambiguation() + /** The fallback `Message` containing no explanation and having no `kind` */ -class NoExplanation(msgFn: => String) extends Message(ErrorMessageID.NoExplanationID) { - def msg: String = msgFn - def explain: String = "" +final class NoExplanation(msgFn: Context ?=> String)(using Context) extends Message(ErrorMessageID.NoExplanationID) { + def msg(using Context): String = msgFn + def explain(using Context): String = "" val kind: MessageKind = MessageKind.NoKind override def toString(): String = msg diff --git a/compiler/src/dotty/tools/dotc/reporting/Reporter.scala b/compiler/src/dotty/tools/dotc/reporting/Reporter.scala index 497e77ae4a7c..f5aadac27296 100644 --- a/compiler/src/dotty/tools/dotc/reporting/Reporter.scala +++ b/compiler/src/dotty/tools/dotc/reporting/Reporter.scala @@ -14,7 +14,7 @@ import dotty.tools.dotc.util.NoSourcePosition import java.io.{BufferedReader, PrintWriter} import scala.annotation.internal.sharable import scala.collection.mutable -import core.Decorators.toMessage +import core.Decorators.em object Reporter { /** Convert a SimpleReporter into a real Reporter */ @@ -218,8 +218,8 @@ abstract class Reporter extends interfaces.ReporterResult { def summarizeUnreportedWarnings()(using Context): Unit = for (settingName, count) <- unreportedWarnings do val were = if count == 1 then "was" else "were" - val msg = s"there $were ${countString(count, settingName.tail + " warning")}; re-run with $settingName for details" - report(Warning(msg.toMessage, NoSourcePosition)) + val msg = em"there $were ${countString(count, settingName.tail + " warning")}; re-run with $settingName for details" + report(Warning(msg, NoSourcePosition)) /** Print the summary of warnings and errors */ def printSummary()(using Context): Unit = { diff --git a/compiler/src/dotty/tools/dotc/reporting/ThrowingReporter.scala b/compiler/src/dotty/tools/dotc/reporting/ThrowingReporter.scala index ad47a9d30536..153212522541 100644 --- 
a/compiler/src/dotty/tools/dotc/reporting/ThrowingReporter.scala +++ b/compiler/src/dotty/tools/dotc/reporting/ThrowingReporter.scala @@ -6,12 +6,16 @@ import core.Contexts._ import Diagnostic.Error /** - * This class implements a Reporter that throws all errors and sends warnings and other - * info to the underlying reporter. + * This class implements a Reporter that throws all errors as UnhandledError exceptions + * and sends warnings and other info to the underlying reporter. */ class ThrowingReporter(reportInfo: Reporter) extends Reporter { def doReport(dia: Diagnostic)(using Context): Unit = dia match { - case _: Error => throw dia + case dia: Error => throw UnhandledError(dia) case _ => reportInfo.doReport(dia) } } + +class UnhandledError(val diagnostic: Error) extends Exception: + override def getMessage = diagnostic.message + diff --git a/compiler/src/dotty/tools/dotc/reporting/WConf.scala b/compiler/src/dotty/tools/dotc/reporting/WConf.scala index 21e10e894e0b..af1a5c0f0f47 100644 --- a/compiler/src/dotty/tools/dotc/reporting/WConf.scala +++ b/compiler/src/dotty/tools/dotc/reporting/WConf.scala @@ -18,7 +18,7 @@ enum MessageFilter: case Feature => message.isInstanceOf[Diagnostic.FeatureWarning] case Unchecked => message.isInstanceOf[Diagnostic.UncheckedWarning] case MessagePattern(pattern) => - val noHighlight = message.msg.rawMessage.replaceAll("\\e\\[[\\d;]*[^\\d;]","") + val noHighlight = message.msg.message.replaceAll("\\e\\[[\\d;]*[^\\d;]","") pattern.findFirstIn(noHighlight).nonEmpty case MessageID(errorId) => message.msg.errorId == errorId case None => false diff --git a/compiler/src/dotty/tools/dotc/reporting/messages.scala b/compiler/src/dotty/tools/dotc/reporting/messages.scala index 56375d881f97..e8029d790d0a 100644 --- a/compiler/src/dotty/tools/dotc/reporting/messages.scala +++ b/compiler/src/dotty/tools/dotc/reporting/messages.scala @@ -15,9 +15,10 @@ import printing.Formatting import ErrorMessageID._ import ast.Trees import config.{Feature, ScalaVersion} -import typer.ErrorReporting.{err, matchReductionAddendum} +import typer.ErrorReporting.{err, matchReductionAddendum, substitutableTypeSymbolsInScope} import typer.ProtoTypes.ViewProto -import typer.Implicits.Candidate +import typer.Implicits.* +import typer.Inferencing import scala.util.control.NonFatal import StdNames.nme import printing.Formatting.hl @@ -25,6 +26,8 @@ import ast.Trees._ import ast.untpd import ast.tpd import transform.SymUtils._ +import scala.util.matching.Regex +import java.util.regex.Matcher.quoteReplacement import cc.CaptureSet.IdentityCaptRefMap /** Messages @@ -40,211 +43,212 @@ import cc.CaptureSet.IdentityCaptRefMap * ``` */ - abstract class SyntaxMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Syntax +abstract class SyntaxMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.Syntax - abstract class TypeMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Type +abstract class TypeMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.Type - trait ShowMatchTrace(tps: Type*)(using Context) extends Message: - override def msgSuffix: String = matchReductionAddendum(tps*) +trait ShowMatchTrace(tps: Type*)(using Context) extends Message: + override def msgPostscript(using Context): String = + super.msgPostscript ++ matchReductionAddendum(tps*) - abstract class TypeMismatchMsg(found: Type, expected: Type)(errorId: ErrorMessageID)(using Context) - extends 
Message(errorId), ShowMatchTrace(found, expected): - def kind = MessageKind.TypeMismatch - def explain = err.whyNoMatchStr(found, expected) - override def canExplain = true +abstract class TypeMismatchMsg(found: Type, expected: Type)(errorId: ErrorMessageID)(using Context) +extends Message(errorId), ShowMatchTrace(found, expected): + def kind = MessageKind.TypeMismatch + def explain(using Context) = err.whyNoMatchStr(found, expected) + override def canExplain = true - abstract class NamingMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Naming +abstract class NamingMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId), NoDisambiguation: + def kind = MessageKind.Naming - abstract class DeclarationMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Declaration +abstract class DeclarationMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.Declaration - /** A simple not found message (either for idents, or member selection. - * Messages of this class are sometimes dropped in favor of other, more - * specific messages. - */ - abstract class NotFoundMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.NotFound - def name: Name +/** A simple not found message (either for idents, or member selection. + * Messages of this class are sometimes dropped in favor of other, more + * specific messages. + */ +abstract class NotFoundMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.NotFound + def name: Name - abstract class PatternMatchMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.PatternMatch +abstract class PatternMatchMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.PatternMatch - abstract class CyclicMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Cyclic +abstract class CyclicMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.Cyclic - abstract class ReferenceMsg(errorId: ErrorMessageID) extends Message(errorId): - def kind = MessageKind.Reference +abstract class ReferenceMsg(errorId: ErrorMessageID)(using Context) extends Message(errorId): + def kind = MessageKind.Reference - abstract class EmptyCatchOrFinallyBlock(tryBody: untpd.Tree, errNo: ErrorMessageID)(using Context) - extends SyntaxMsg(errNo) { - def explain = { - val tryString = tryBody match { - case Block(Nil, untpd.EmptyTree) => "{}" - case _ => tryBody.show - } - - val code1 = - s"""|import scala.util.control.NonFatal - | - |try $tryString catch { - | case NonFatal(e) => ??? - |}""".stripMargin - - val code2 = - s"""|try $tryString finally { - | // perform your cleanup here! - |}""".stripMargin - - em"""|A ${hl("try")} expression should be followed by some mechanism to handle any exceptions - |thrown. Typically a ${hl("catch")} expression follows the ${hl("try")} and pattern matches - |on any expected exceptions. 
For example: - | - |$code1 - | - |It is also possible to follow a ${hl("try")} immediately by a ${hl("finally")} - letting the - |exception propagate - but still allowing for some clean up in ${hl("finally")}: - | - |$code2 - | - |It is recommended to use the ${hl("NonFatal")} extractor to catch all exceptions as it - |correctly handles transfer functions like ${hl("return")}.""" +abstract class EmptyCatchOrFinallyBlock(tryBody: untpd.Tree, errNo: ErrorMessageID)(using Context) +extends SyntaxMsg(errNo) { + def explain(using Context) = { + val tryString = tryBody match { + case Block(Nil, untpd.EmptyTree) => "{}" + case _ => tryBody.show } - } - - class EmptyCatchBlock(tryBody: untpd.Tree)(using Context) - extends EmptyCatchOrFinallyBlock(tryBody, EmptyCatchBlockID) { - def msg = - em"""|The ${hl("catch")} block does not contain a valid expression, try - |adding a case like - ${hl("case e: Exception =>")} to the block""" - } - - class EmptyCatchAndFinallyBlock(tryBody: untpd.Tree)(using Context) - extends EmptyCatchOrFinallyBlock(tryBody, EmptyCatchAndFinallyBlockID) { - def msg = - em"""|A ${hl("try")} without ${hl("catch")} or ${hl("finally")} is equivalent to putting - |its body in a block; no exceptions are handled.""" - } - - class DeprecatedWithOperator()(using Context) - extends SyntaxMsg(DeprecatedWithOperatorID) { - def msg = - em"""${hl("with")} as a type operator has been deprecated; use ${hl("&")} instead""" - def explain = - em"""|Dotty introduces intersection types - ${hl("&")} types. These replace the - |use of the ${hl("with")} keyword. There are a few differences in - |semantics between intersection types and using ${hl("with")}.""" - } - - class CaseClassMissingParamList(cdef: untpd.TypeDef)(using Context) - extends SyntaxMsg(CaseClassMissingParamListID) { - def msg = - em"""|A ${hl("case class")} must have at least one parameter list""" - - def explain = - em"""|${cdef.name} must have at least one parameter list, if you would rather - |have a singleton representation of ${cdef.name}, use a "${hl("case object")}". - |Or, add an explicit ${hl("()")} as a parameter list to ${cdef.name}.""" - } - class AnonymousFunctionMissingParamType(param: untpd.ValDef, - tree: untpd.Function, - pt: Type) - (using Context) - extends TypeMsg(AnonymousFunctionMissingParamTypeID) { - def msg = { - val ofFun = - if param.name.is(WildcardParamName) - || (MethodType.syntheticParamNames(tree.args.length + 1) contains param.name) - then i" of expanded function:\n$tree" - else "" + val code1 = + s"""|import scala.util.control.NonFatal + | + |try $tryString catch { + | case NonFatal(e) => ??? + |}""".stripMargin - val inferred = - if (pt == WildcardType) "" - else i"\nWhat I could infer was: $pt" + val code2 = + s"""|try $tryString finally { + | // perform your cleanup here! + |}""".stripMargin - i"""Missing parameter type - | - |I could not infer the type of the parameter ${param.name}$ofFun.$inferred""" - } + i"""|A ${hl("try")} expression should be followed by some mechanism to handle any exceptions + |thrown. Typically a ${hl("catch")} expression follows the ${hl("try")} and pattern matches + |on any expected exceptions. 
For example: + | + |$code1 + | + |It is also possible to follow a ${hl("try")} immediately by a ${hl("finally")} - letting the + |exception propagate - but still allowing for some clean up in ${hl("finally")}: + | + |$code2 + | + |It is recommended to use the ${hl("NonFatal")} extractor to catch all exceptions as it + |correctly handles transfer functions like ${hl("return")}.""" + } +} + +class EmptyCatchBlock(tryBody: untpd.Tree)(using Context) +extends EmptyCatchOrFinallyBlock(tryBody, EmptyCatchBlockID) { + def msg(using Context) = + i"""|The ${hl("catch")} block does not contain a valid expression, try + |adding a case like - ${hl("case e: Exception =>")} to the block""" +} + +class EmptyCatchAndFinallyBlock(tryBody: untpd.Tree)(using Context) +extends EmptyCatchOrFinallyBlock(tryBody, EmptyCatchAndFinallyBlockID) { + def msg(using Context) = + i"""|A ${hl("try")} without ${hl("catch")} or ${hl("finally")} is equivalent to putting + |its body in a block; no exceptions are handled.""" +} + +class DeprecatedWithOperator()(using Context) +extends SyntaxMsg(DeprecatedWithOperatorID) { + def msg(using Context) = + i"""${hl("with")} as a type operator has been deprecated; use ${hl("&")} instead""" + def explain(using Context) = + i"""|Dotty introduces intersection types - ${hl("&")} types. These replace the + |use of the ${hl("with")} keyword. There are a few differences in + |semantics between intersection types and using ${hl("with")}.""" +} + +class CaseClassMissingParamList(cdef: untpd.TypeDef)(using Context) +extends SyntaxMsg(CaseClassMissingParamListID) { + def msg(using Context) = + i"""|A ${hl("case class")} must have at least one parameter list""" + + def explain(using Context) = + i"""|${cdef.name} must have at least one parameter list, if you would rather + |have a singleton representation of ${cdef.name}, use a "${hl("case object")}". + |Or, add an explicit ${hl("()")} as a parameter list to ${cdef.name}.""" +} + +class AnonymousFunctionMissingParamType(param: untpd.ValDef, + tree: untpd.Function, + pt: Type) + (using Context) +extends TypeMsg(AnonymousFunctionMissingParamTypeID) { + def msg(using Context) = { + val ofFun = + if param.name.is(WildcardParamName) + || (MethodType.syntheticParamNames(tree.args.length + 1) contains param.name) + then i" of expanded function:\n$tree" + else "" - def explain = "" - } + val inferred = + if (pt == WildcardType) "" + else i"\nWhat I could infer was: $pt" - class WildcardOnTypeArgumentNotAllowedOnNew()(using Context) - extends SyntaxMsg(WildcardOnTypeArgumentNotAllowedOnNewID) { - def msg = "Type argument must be fully defined" - def explain = - val code1: String = - """ - |object TyperDemo { - | class Team[A] - | val team = new Team[?] - |} - """.stripMargin + i"""Missing parameter type + | + |I could not infer the type of the parameter ${param.name}$ofFun.$inferred""" + } + + def explain(using Context) = "" +} + +class WildcardOnTypeArgumentNotAllowedOnNew()(using Context) +extends SyntaxMsg(WildcardOnTypeArgumentNotAllowedOnNewID) { + def msg(using Context) = "Type argument must be fully defined" + def explain(using Context) = + val code1: String = + """ + |object TyperDemo { + | class Team[A] + | val team = new Team[?] + |} + """.stripMargin - val code2: String = - """ - |object TyperDemo { - | class Team[A] - | val team = new Team[Int] - |} - """.stripMargin - em"""|Wildcard on arguments is not allowed when declaring a new type. 
- | - |Given the following example: - | - |$code1 - | - |You must complete all the type parameters, for instance: - | - |$code2 """ - } + val code2: String = + """ + |object TyperDemo { + | class Team[A] + | val team = new Team[Int] + |} + """.stripMargin + i"""|Wildcard on arguments is not allowed when declaring a new type. + | + |Given the following example: + | + |$code1 + | + |You must complete all the type parameters, for instance: + | + |$code2 """ +} - // Type Errors ------------------------------------------------------------ // - class DuplicateBind(bind: untpd.Bind, tree: untpd.CaseDef)(using Context) - extends NamingMsg(DuplicateBindID) { - def msg = em"duplicate pattern variable: ${bind.name}" +// Type Errors ------------------------------------------------------------ // +class DuplicateBind(bind: untpd.Bind, tree: untpd.CaseDef)(using Context) +extends NamingMsg(DuplicateBindID) { + def msg(using Context) = i"duplicate pattern variable: ${bind.name}" - def explain = { - val pat = tree.pat.show - val guard = tree.guard match { - case untpd.EmptyTree => "" - case guard => s"if ${guard.show}" - } + def explain(using Context) = { + val pat = tree.pat.show + val guard = tree.guard match + case untpd.EmptyTree => "" + case guard => s"if ${guard.show}" - val body = tree.body match { - case Block(Nil, untpd.EmptyTree) => "" - case body => s" ${body.show}" - } + val body = tree.body match { + case Block(Nil, untpd.EmptyTree) => "" + case body => s" ${body.show}" + } - val caseDef = s"case $pat$guard => $body" + val caseDef = s"case $pat$guard => $body" - em"""|For each ${hl("case")} bound variable names have to be unique. In: - | - |$caseDef - | - |${bind.name} is not unique. Rename one of the bound variables!""" - } + i"""|For each ${hl("case")} bound variable names have to be unique. In: + | + |$caseDef + | + |${bind.name} is not unique. Rename one of the bound variables!""" } +} - class MissingIdent(tree: untpd.Ident, treeKind: String, val name: Name)(using Context) - extends NotFoundMsg(MissingIdentID) { - def msg = em"Not found: $treeKind$name" - def explain = { - em"""|The identifier for `$treeKind$name` is not bound, that is, - |no declaration for this identifier can be found. - |That can happen, for example, if `$name` or its declaration has either been - |misspelt or if an import is missing.""" - } +class MissingIdent(tree: untpd.Ident, treeKind: String, val name: Name)(using Context) +extends NotFoundMsg(MissingIdentID) { + def msg(using Context) = i"Not found: $treeKind$name" + def explain(using Context) = { + i"""|The identifier for `$treeKind$name` is not bound, that is, + |no declaration for this identifier can be found. + |That can happen, for example, if `$name` or its declaration has either been + |misspelt or if an import is missing.""" } +} - class TypeMismatch(found: Type, expected: Type, inTree: Option[untpd.Tree], addenda: => String*)(using Context) - extends TypeMismatchMsg(found, expected)(TypeMismatchID): +class TypeMismatch(found: Type, expected: Type, inTree: Option[untpd.Tree], addenda: => String*)(using Context) + extends TypeMismatchMsg(found, expected)(TypeMismatchID): + def msg(using Context) = // replace constrained TypeParamRefs and their typevars by their bounds where possible // and the bounds are not f-bounds. 
// The idea is that if the bounds are also not-subtypes of each other to report @@ -272,2265 +276,2522 @@ import cc.CaptureSet.IdentityCaptRefMap case _ => mapOver(tp) - def msg = - val found1 = reported(found) - reported.setVariance(-1) - val expected1 = reported(expected) - val (found2, expected2) = - if (found1 frozen_<:< expected1) || reported.fbounded then (found, expected) - else (found1, expected1) - val postScript = addenda.find(!_.isEmpty) match - case Some(p) => p - case None => - if expected.isTopType || found.isBottomType - then "" - else ctx.typer.importSuggestionAddendum(ViewProto(found.widen, expected)) - val (where, printCtx) = Formatting.disambiguateTypes(found2, expected2) - val whereSuffix = if (where.isEmpty) where else s"\n\n$where" - val (foundStr, expectedStr) = Formatting.typeDiff(found2, expected2)(using printCtx) - s"""|Found: $foundStr - |Required: $expectedStr""".stripMargin - + whereSuffix + postScript - - override def explain = - val treeStr = inTree.map(x => s"\nTree: ${x.show}").getOrElse("") - treeStr + "\n" + super.explain - - end TypeMismatch - - class NotAMember(site: Type, val name: Name, selected: String, addendum: => String = "")(using Context) - extends NotFoundMsg(NotAMemberID), ShowMatchTrace(site) { - //println(i"site = $site, decls = ${site.decls}, source = ${site.typeSymbol.sourceFile}") //DEBUG - - def msg = { - import core.Flags._ - val maxDist = 3 // maximal number of differences to be considered for a hint - val missing = name.show - - // The symbols of all non-synthetic, non-private members of `site` - // that are of the same type/term kind as the missing member. - def candidates: Set[Symbol] = - for - bc <- site.widen.baseClasses.toSet - sym <- bc.info.decls.filter(sym => - sym.isType == name.isTypeName - && !sym.isConstructor - && !sym.flagsUNSAFE.isOneOf(Synthetic | Private)) - yield sym - - // Calculate Levenshtein distance - def distance(s1: String, s2: String): Int = - val dist = Array.ofDim[Int](s2.length + 1, s1.length + 1) - for - j <- 0 to s2.length - i <- 0 to s1.length - do - dist(j)(i) = - if j == 0 then i - else if i == 0 then j - else if s2(j - 1) == s1(i - 1) then dist(j - 1)(i - 1) - else (dist(j - 1)(i) min dist(j)(i - 1) min dist(j - 1)(i - 1)) + 1 - dist(s2.length)(s1.length) - - // A list of possible candidate symbols with their Levenstein distances - // to the name of the missing member - def closest: List[(Int, Symbol)] = candidates - .toList - .map(sym => (distance(sym.name.show, missing), sym)) - .filter((d, sym) => d <= maxDist && d < missing.length && d < sym.name.show.length) - .sortBy((d, sym) => (d, sym.name.show)) // sort by distance first, alphabetically second - - val enumClause = - if ((name eq nme.values) || (name eq nme.valueOf)) && site.classSymbol.companionClass.isEnumClass then - val kind = if name eq nme.values then i"${nme.values} array" else i"${nme.valueOf} lookup method" - // an assumption is made here that the values and valueOf methods were not generated - // because the enum defines non-singleton cases - i""" - |Although ${site.classSymbol.companionClass} is an enum, it has non-singleton cases, - |meaning a $kind is not defined""" - else - "" - - def prefixEnumClause(addendum: String) = - if enumClause.nonEmpty then s".$enumClause$addendum" else addendum - - val finalAddendum = - if addendum.nonEmpty then prefixEnumClause(addendum) - else closest match - case (d, sym) :: _ => - val siteName = site match - case site: NamedType => site.name.show - case site => i"$site" - val showName = - // Add 
.type to the name if it is a module - if sym.is(ModuleClass) then s"${sym.name.show}.type" - else sym.name.show - s" - did you mean $siteName.$showName?$enumClause" - case Nil => prefixEnumClause("") - - ex"$selected $name is not a member of ${site.widen}$finalAddendum" - } - - def explain = "" - } - - class EarlyDefinitionsNotSupported()(using Context) - extends SyntaxMsg(EarlyDefinitionsNotSupportedID) { - def msg = "Early definitions are not supported; use trait parameters instead" - - def explain = { - val code1 = - """|trait Logging { - | val f: File - | f.open() - | onExit(f.close()) - | def log(msg: String) = f.write(msg) - |} - | - |class B extends Logging { - | val f = new File("log.data") // triggers a NullPointerException - |} - | - |// early definition gets around the NullPointerException - |class C extends { - | val f = new File("log.data") - |} with Logging""".stripMargin - - val code2 = - """|trait Logging(f: File) { - | f.open() - | onExit(f.close()) - | def log(msg: String) = f.write(msg) - |} - | - |class C extends Logging(new File("log.data"))""".stripMargin - - em"""|Earlier versions of Scala did not support trait parameters and "early - |definitions" (also known as "early initializers") were used as an alternative. - | - |Example of old syntax: - | - |$code1 - | - |The above code can now be written as: - | - |$code2 - |""" - } - } - - class TopLevelImplicitClass(cdef: untpd.TypeDef)(using Context) - extends SyntaxMsg(TopLevelImplicitClassID) { - def msg = em"""An ${hl("implicit class")} may not be top-level""" - - def explain = { - val TypeDef(name, impl @ Template(constr0, parents, self, _)) = cdef: @unchecked - val exampleArgs = - if(constr0.termParamss.isEmpty) "..." - else constr0.termParamss(0).map(_.withMods(untpd.Modifiers()).show).mkString(", ") - def defHasBody[T] = impl.body.exists(!_.isEmpty) - val exampleBody = if (defHasBody) "{\n ...\n }" else "" - em"""|There may not be any method, member or object in scope with the same name as - |the implicit class and a case class automatically gets a companion object with - |the same name created by the compiler which would cause a naming conflict if it - |were allowed. - | | - |To resolve the conflict declare ${cdef.name} inside of an ${hl("object")} then import the class - |from the object at the use site if needed, for example: - | - |object Implicits { - | implicit class ${cdef.name}($exampleArgs)$exampleBody - |} - | - |// At the use site: - |import Implicits.${cdef.name}""" - } - } - - class ImplicitCaseClass(cdef: untpd.TypeDef)(using Context) - extends SyntaxMsg(ImplicitCaseClassID) { - def msg = em"""A ${hl("case class")} may not be defined as ${hl("implicit")}""" - - def explain = - em"""|Implicit classes may not be case classes. Instead use a plain class: - | - |implicit class ${cdef.name}... 
- | - |""" - } + val found1 = reported(found) + reported.setVariance(-1) + val expected1 = reported(expected) + val (found2, expected2) = + if (found1 frozen_<:< expected1) || reported.fbounded then (found, expected) + else (found1, expected1) + val (foundStr, expectedStr) = Formatting.typeDiff(found2, expected2) + i"""|Found: $foundStr + |Required: $expectedStr""" + end msg + + override def msgPostscript(using Context) = + def importSuggestions = + if expected.isTopType || found.isBottomType then "" + else ctx.typer.importSuggestionAddendum(ViewProto(found.widen, expected)) + super.msgPostscript + ++ addenda.dropWhile(_.isEmpty).headOption.getOrElse(importSuggestions) + + override def explain(using Context) = + val treeStr = inTree.map(x => s"\nTree: ${x.show}").getOrElse("") + treeStr + "\n" + super.explain + +end TypeMismatch + +class NotAMember(site: Type, val name: Name, selected: String, addendum: => String = "")(using Context) +extends NotFoundMsg(NotAMemberID), ShowMatchTrace(site) { + //println(i"site = $site, decls = ${site.decls}, source = ${site.typeSymbol.sourceFile}") //DEBUG + + def msg(using Context) = { + import core.Flags._ + val maxDist = 3 // maximal number of differences to be considered for a hint + val missing = name.show + + // The symbols of all non-synthetic, non-private members of `site` + // that are of the same type/term kind as the missing member. + def candidates: Set[Symbol] = + for + bc <- site.widen.baseClasses.toSet + sym <- bc.info.decls.filter(sym => + sym.isType == name.isTypeName + && !sym.isConstructor + && !sym.flagsUNSAFE.isOneOf(Synthetic | Private)) + yield sym + + // Calculate Levenshtein distance + def distance(s1: String, s2: String): Int = + val dist = Array.ofDim[Int](s2.length + 1, s1.length + 1) + for + j <- 0 to s2.length + i <- 0 to s1.length + do + dist(j)(i) = + if j == 0 then i + else if i == 0 then j + else if s2(j - 1) == s1(i - 1) then dist(j - 1)(i - 1) + else (dist(j - 1)(i) min dist(j)(i - 1) min dist(j - 1)(i - 1)) + 1 + dist(s2.length)(s1.length) + + // A list of possible candidate symbols with their Levenstein distances + // to the name of the missing member + def closest: List[(Int, Symbol)] = candidates + .toList + .map(sym => (distance(sym.name.show, missing), sym)) + .filter((d, sym) => d <= maxDist && d < missing.length && d < sym.name.show.length) + .sortBy((d, sym) => (d, sym.name.show)) // sort by distance first, alphabetically second + + val enumClause = + if ((name eq nme.values) || (name eq nme.valueOf)) && site.classSymbol.companionClass.isEnumClass then + val kind = if name eq nme.values then i"${nme.values} array" else i"${nme.valueOf} lookup method" + // an assumption is made here that the values and valueOf methods were not generated + // because the enum defines non-singleton cases + i""" + |Although ${site.classSymbol.companionClass} is an enum, it has non-singleton cases, + |meaning a $kind is not defined""" + else + "" - class ImplicitClassPrimaryConstructorArity()(using Context) - extends SyntaxMsg(ImplicitClassPrimaryConstructorArityID){ - def msg = "Implicit classes must accept exactly one primary constructor parameter" - def explain = { - val example = "implicit class RichDate(date: java.util.Date)" - em"""Implicit classes may only take one non-implicit argument in their constructor. 
For example: + def prefixEnumClause(addendum: String) = + if enumClause.nonEmpty then s".$enumClause$addendum" else addendum + + val finalAddendum = + if addendum.nonEmpty then prefixEnumClause(addendum) + else closest match + case (d, sym) :: _ => + val siteName = site match + case site: NamedType => site.name.show + case site => i"$site" + val showName = + // Add .type to the name if it is a module + if sym.is(ModuleClass) then s"${sym.name.show}.type" + else sym.name.show + s" - did you mean $siteName.$showName?$enumClause" + case Nil => prefixEnumClause("") + + i"$selected $name is not a member of ${site.widen}$finalAddendum" + } + + def explain(using Context) = "" +} + +class EarlyDefinitionsNotSupported()(using Context) +extends SyntaxMsg(EarlyDefinitionsNotSupportedID) { + def msg(using Context) = "Early definitions are not supported; use trait parameters instead" + + def explain(using Context) = { + val code1 = + """|trait Logging { + | val f: File + | f.open() + | onExit(f.close()) + | def log(msg: String) = f.write(msg) + |} | - | $example + |class B extends Logging { + | val f = new File("log.data") // triggers a NullPointerException + |} | - |While it’s possible to create an implicit class with more than one non-implicit argument, - |such classes aren’t used during implicit lookup. - |""" - } - } - - class ObjectMayNotHaveSelfType(mdef: untpd.ModuleDef)(using Context) - extends SyntaxMsg(ObjectMayNotHaveSelfTypeID) { - def msg = em"""${hl("object")}s must not have a self ${hl("type")}""" - - def explain = { - val untpd.ModuleDef(name, tmpl) = mdef - val ValDef(_, selfTpt, _) = tmpl.self - em"""|${hl("object")}s must not have a self ${hl("type")}: - | - |Consider these alternative solutions: - | - Create a trait or a class instead of an object - | - Let the object extend a trait containing the self type: - | - | object $name extends ${selfTpt.show}""" - } - } - - class RepeatedModifier(modifier: String)(implicit ctx:Context) - extends SyntaxMsg(RepeatedModifierID) { - def msg = em"""Repeated modifier $modifier""" - - def explain = { - val code1 = em"""private private val Origin = Point(0, 0)""" - val code2 = em"""private final val Origin = Point(0, 0)""" - em"""This happens when you accidentally specify the same modifier twice. - | - |Example: - | - |$code1 - | - |instead of - | - |$code2 - | - |""" - } - } - - class InterpolatedStringError()(implicit ctx:Context) - extends SyntaxMsg(InterpolatedStringErrorID) { - def msg = "Error in interpolated string: identifier or block expected" - def explain = { - val code1 = "s\"$new Point(0, 0)\"" - val code2 = "s\"${new Point(0, 0)}\"" - em"""|This usually happens when you forget to place your expressions inside curly braces. - | - |$code1 - | - |should be written as - | - |$code2 - |""" - } - } - - class UnboundPlaceholderParameter()(implicit ctx:Context) - extends SyntaxMsg(UnboundPlaceholderParameterID) { - def msg = em"""Unbound placeholder parameter; incorrect use of ${hl("_")}""" - def explain = - em"""|The ${hl("_")} placeholder syntax was used where it could not be bound. - |Consider explicitly writing the variable binding. - | - |This can be done by replacing ${hl("_")} with a variable (eg. ${hl("x")}) - |and adding ${hl("x =>")} where applicable. 
- | - |Example before: - | - |${hl("{ _ }")} - | - |Example after: - | - |${hl("x => { x }")} - | - |Another common occurrence for this error is defining a val with ${hl("_")}: - | - |${hl("val a = _")} - | - |But this val definition isn't very useful, it can never be assigned - |another value. And thus will always remain uninitialized. - |Consider replacing the ${hl("val")} with ${hl("var")}: - | - |${hl("var a = _")} - | - |Note that this use of ${hl("_")} is not placeholder syntax, - |but an uninitialized var definition. - |Only fields can be left uninitialized in this manner; local variables - |must be initialized. - | - |Another occurrence for this error is self type definition. - |The ${hl("_")} can be replaced with ${hl("this")}. - | - |Example before: - | - |${hl("trait A { _: B => ... ")} - | - |Example after: - | - |${hl("trait A { this: B => ... ")} - |""" - } - - class IllegalStartSimpleExpr(illegalToken: String)(using Context) - extends SyntaxMsg(IllegalStartSimpleExprID) { - def msg = em"expression expected but ${Red(illegalToken)} found" - def explain = { - em"""|An expression cannot start with ${Red(illegalToken)}.""" - } - } + |// early definition gets around the NullPointerException + |class C extends { + | val f = new File("log.data") + |} with Logging""".stripMargin + + val code2 = + """|trait Logging(f: File) { + | f.open() + | onExit(f.close()) + | def log(msg: String) = f.write(msg) + |} + | + |class C extends Logging(new File("log.data"))""".stripMargin - class MissingReturnType()(implicit ctx:Context) - extends SyntaxMsg(MissingReturnTypeID) { - def msg = "Missing return type" - def explain = - em"""|An abstract declaration must have a return type. For example: - | - |trait Shape: - | ${hl("def area: Double")} // abstract declaration returning a Double""" + i"""|Earlier versions of Scala did not support trait parameters and "early + |definitions" (also known as "early initializers") were used as an alternative. + | + |Example of old syntax: + | + |$code1 + | + |The above code can now be written as: + | + |$code2 + |""" } - - class MissingReturnTypeWithReturnStatement(method: Symbol)(using Context) - extends SyntaxMsg(MissingReturnTypeWithReturnStatementID) { - def msg = em"$method has a return statement; it needs a result type" - def explain = - em"""|If a method contains a ${hl("return")} statement, it must have an - |explicit return type. For example: - | - |${hl("def good: Int /* explicit return type */ = return 1")}""" +} + +class TopLevelImplicitClass(cdef: untpd.TypeDef)(using Context) +extends SyntaxMsg(TopLevelImplicitClassID) { + def msg(using Context) = i"""An ${hl("implicit class")} may not be top-level""" + + def explain(using Context) = { + val TypeDef(name, impl @ Template(constr0, parents, self, _)) = cdef: @unchecked + val exampleArgs = + if(constr0.termParamss.isEmpty) "..." + else constr0.termParamss(0).map(_.withMods(untpd.Modifiers()).show).mkString(", ") + def defHasBody[T] = impl.body.exists(!_.isEmpty) + val exampleBody = if (defHasBody) "{\n ...\n }" else "" + i"""|There may not be any method, member or object in scope with the same name as + |the implicit class and a case class automatically gets a companion object with + |the same name created by the compiler which would cause a naming conflict if it + |were allowed. 
+ | | + |To resolve the conflict declare ${cdef.name} inside of an ${hl("object")} then import the class + |from the object at the use site if needed, for example: + | + |object Implicits { + | implicit class ${cdef.name}($exampleArgs)$exampleBody + |} + | + |// At the use site: + |import Implicits.${cdef.name}""" } +} - class YieldOrDoExpectedInForComprehension()(using Context) - extends SyntaxMsg(YieldOrDoExpectedInForComprehensionID) { - def msg = em"${hl("yield")} or ${hl("do")} expected" - - def explain = - em"""|When the enumerators in a for comprehension are not placed in parentheses or - |braces, a ${hl("do")} or ${hl("yield")} statement is required after the enumerators - |section of the comprehension. - | - |You can save some keystrokes by omitting the parentheses and writing - | - |${hl("val numbers = for i <- 1 to 3 yield i")} - | - | instead of - | - |${hl("val numbers = for (i <- 1 to 3) yield i")} - | - |but the ${hl("yield")} keyword is still required. - | - |For comprehensions that simply perform a side effect without yielding anything - |can also be written without parentheses but a ${hl("do")} keyword has to be - |included. For example, - | - |${hl("for (i <- 1 to 3) println(i)")} - | - |can be written as - | - |${hl("for i <- 1 to 3 do println(i) // notice the 'do' keyword")} - | - |""" - } +class ImplicitCaseClass(cdef: untpd.TypeDef)(using Context) +extends SyntaxMsg(ImplicitCaseClassID) { + def msg(using Context) = i"""A ${hl("case class")} may not be defined as ${hl("implicit")}""" - class ProperDefinitionNotFound()(using Context) - extends Message(ProperDefinitionNotFoundID) { - def kind = MessageKind.DocComment - def msg = em"""Proper definition was not found in ${hl("@usecase")}""" - - def explain = { - val noUsecase = - "def map[B, That](f: A => B)(implicit bf: CanBuildFrom[List[A], B, That]): That" - - val usecase = - """|/** Map from List[A] => List[B] - | * - | * @usecase def map[B](f: A => B): List[B] - | */ - |def map[B, That](f: A => B)(implicit bf: CanBuildFrom[List[A], B, That]): That - |""".stripMargin - - em"""|Usecases are only supported for ${hl("def")}s. They exist because with Scala's - |advanced type-system, we sometimes end up with seemingly scary signatures. - |The usage of these methods, however, needs not be - for instance the ${hl("map")} - |function - | - |${hl("List(1, 2, 3).map(2 * _) // res: List(2, 4, 6)")} - | - |is easy to understand and use - but has a rather bulky signature: - | - |$noUsecase - | - |to mitigate this and ease the usage of such functions we have the ${hl("@usecase")} - |annotation for docstrings. Which can be used like this: - | - |$usecase - | - |When creating the docs, the signature of the method is substituted by the - |usecase and the compiler makes sure that it is valid. Because of this, you're - |only allowed to use ${hl("def")}s when defining usecases.""" - } + def explain(using Context) = + i"""|Implicit classes may not be case classes. Instead use a plain class: + | + |implicit class ${cdef.name}... + | + |""" +} + +class ImplicitClassPrimaryConstructorArity()(using Context) +extends SyntaxMsg(ImplicitClassPrimaryConstructorArityID){ + def msg(using Context) = "Implicit classes must accept exactly one primary constructor parameter" + def explain(using Context) = { + val example = "implicit class RichDate(date: java.util.Date)" + i"""Implicit classes may only take one non-implicit argument in their constructor. 
For example: + | + | $example + | + |While it’s possible to create an implicit class with more than one non-implicit argument, + |such classes aren’t used during implicit lookup. + |""" } +} - class ByNameParameterNotSupported(tpe: untpd.Tree)(using Context) - extends SyntaxMsg(ByNameParameterNotSupportedID) { - def msg = em"By-name parameter type ${tpe} not allowed here." - - def explain = - em"""|By-name parameters act like functions that are only evaluated when referenced, - |allowing for lazy evaluation of a parameter. - | - |An example of using a by-name parameter would look like: - |${hl("def func(f: => Boolean) = f // 'f' is evaluated when referenced within the function")} - | - |An example of the syntax of passing an actual function as a parameter: - |${hl("def func(f: (Boolean => Boolean)) = f(true)")} - | - |or: - | - |${hl("def func(f: Boolean => Boolean) = f(true)")} - | - |And the usage could be as such: - |${hl("func(bool => // do something...)")} - |""" - } +class ObjectMayNotHaveSelfType(mdef: untpd.ModuleDef)(using Context) +extends SyntaxMsg(ObjectMayNotHaveSelfTypeID) { + def msg(using Context) = i"""${hl("object")}s must not have a self ${hl("type")}""" - class WrongNumberOfTypeArgs(fntpe: Type, expectedArgs: List[ParamInfo], actual: List[untpd.Tree])(using Context) - extends SyntaxMsg(WrongNumberOfTypeArgsID) { - - private val expectedCount = expectedArgs.length - private val actualCount = actual.length - private val msgPrefix = if (actualCount > expectedCount) "Too many" else "Not enough" - - def msg = - val expectedArgString = expectedArgs - .map(_.paramName.unexpandedName.show) - .mkString("[", ", ", "]") - val actualArgString = actual.map(_.show).mkString("[", ", ", "]") - val prettyName = - try fntpe.termSymbol match - case NoSymbol => fntpe.show - case symbol => symbol.showFullName - catch case NonFatal(ex) => fntpe.show - em"""|$msgPrefix type arguments for $prettyName$expectedArgString - |expected: $expectedArgString - |actual: $actualArgString""".stripMargin - - def explain = { - val tooManyTypeParams = - """|val tuple2: (Int, String) = (1, "one") - |val list: List[(Int, String)] = List(tuple2)""".stripMargin - - if (actualCount > expectedCount) - em"""|You have supplied too many type parameters - | - |For example List takes a single type parameter (List[A]) - |If you need to hold more types in a list then you need to combine them - |into another data type that can contain the number of types you need, - |In this example one solution would be to use a Tuple: - | - |${tooManyTypeParams}""" - else - em"""|You have not supplied enough type parameters - |If you specify one type parameter then you need to specify every type parameter.""" - } + def explain(using Context) = { + val untpd.ModuleDef(name, tmpl) = mdef + val ValDef(_, selfTpt, _) = tmpl.self + i"""|${hl("object")}s must not have a self ${hl("type")}: + | + |Consider these alternative solutions: + | - Create a trait or a class instead of an object + | - Let the object extend a trait containing the self type: + | + | object $name extends ${selfTpt.show}""" } +} - class IllegalVariableInPatternAlternative(name: Name)(using Context) - extends SyntaxMsg(IllegalVariableInPatternAlternativeID) { - def msg = em"Illegal variable $name in pattern alternative" - def explain = { - val varInAlternative = - """|def g(pair: (Int,Int)): Int = pair match { - | case (1, n) | (n, 1) => n - | case _ => 0 - |}""".stripMargin - - val fixedVarInAlternative = - """|def g(pair: (Int,Int)): Int = pair match { - | case (1, n) => 
n - | case (n, 1) => n - | case _ => 0 - |}""".stripMargin - - em"""|Variables are not allowed within alternate pattern matches. You can workaround - |this issue by adding additional cases for each alternative. For example, the - |illegal function: - | - |$varInAlternative - |could be implemented by moving each alternative into a separate case: - | - |$fixedVarInAlternative""" - } - } +class RepeatedModifier(modifier: String)(implicit ctx:Context) +extends SyntaxMsg(RepeatedModifierID) { + def msg(using Context) = i"""Repeated modifier $modifier""" - class IdentifierExpected(identifier: String)(using Context) - extends SyntaxMsg(IdentifierExpectedID) { - def msg = "identifier expected" - def explain = { - val wrongIdentifier = em"def foo: $identifier = {...}" - val validIdentifier = em"def foo = {...}" - em"""|An identifier expected, but $identifier found. This could be because - |$identifier is not a valid identifier. As a workaround, the compiler could - |infer the type for you. For example, instead of: - | - |$wrongIdentifier - | - |Write your code like: - | - |$validIdentifier - | - |""" - } + def explain(using Context) = { + val code1 = "private private val Origin = Point(0, 0)" + val code2 = "private final val Origin = Point(0, 0)" + i"""This happens when you accidentally specify the same modifier twice. + | + |Example: + | + |$code1 + | + |instead of + | + |$code2 + | + |""" } +} - class AuxConstructorNeedsNonImplicitParameter()(implicit ctx:Context) - extends SyntaxMsg(AuxConstructorNeedsNonImplicitParameterID) { - def msg = "Auxiliary constructor needs non-implicit parameter list" - def explain = - em"""|Only the primary constructor is allowed an ${hl("implicit")} parameter list; - |auxiliary constructors need non-implicit parameter lists. When a primary - |constructor has an implicit argslist, auxiliary constructors that call the - |primary constructor must specify the implicit value. - | - |To resolve this issue check for: - | - Forgotten parenthesis on ${hl("this")} (${hl("def this() = { ... }")}) - | - Auxiliary constructors specify the implicit value - |""" +class InterpolatedStringError()(implicit ctx:Context) +extends SyntaxMsg(InterpolatedStringErrorID) { + def msg(using Context) = "Error in interpolated string: identifier or block expected" + def explain(using Context) = { + val code1 = "s\"$new Point(0, 0)\"" + val code2 = "s\"${new Point(0, 0)}\"" + i"""|This usually happens when you forget to place your expressions inside curly braces. + | + |$code1 + | + |should be written as + | + |$code2 + |""" } +} - class IllegalLiteral()(using Context) - extends SyntaxMsg(IllegalLiteralID) { - def msg = "Illegal literal" - def explain = - em"""|Available literals can be divided into several groups: - | - Integer literals: 0, 21, 0xFFFFFFFF, -42L - | - Floating Point Literals: 0.0, 1e30f, 3.14159f, 1.0e-100, .1 - | - Boolean Literals: true, false - | - Character Literals: 'a', '\u0041', '\n' - | - String Literals: "Hello, World!" - | - null - |""" - } +class UnboundPlaceholderParameter()(implicit ctx:Context) +extends SyntaxMsg(UnboundPlaceholderParameterID) { + def msg(using Context) = i"""Unbound placeholder parameter; incorrect use of ${hl("_")}""" + def explain(using Context) = + i"""|The ${hl("_")} placeholder syntax was used where it could not be bound. + |Consider explicitly writing the variable binding. + | + |This can be done by replacing ${hl("_")} with a variable (eg. ${hl("x")}) + |and adding ${hl("x =>")} where applicable. 
+ | + |Example before: + | + |${hl("{ _ }")} + | + |Example after: + | + |${hl("x => { x }")} + | + |Another common occurrence for this error is defining a val with ${hl("_")}: + | + |${hl("val a = _")} + | + |But this val definition isn't very useful, it can never be assigned + |another value. And thus will always remain uninitialized. + |Consider replacing the ${hl("val")} with ${hl("var")}: + | + |${hl("var a = _")} + | + |Note that this use of ${hl("_")} is not placeholder syntax, + |but an uninitialized var definition. + |Only fields can be left uninitialized in this manner; local variables + |must be initialized. + | + |Another occurrence for this error is self type definition. + |The ${hl("_")} can be replaced with ${hl("this")}. + | + |Example before: + | + |${hl("trait A { _: B => ... ")} + | + |Example after: + | + |${hl("trait A { this: B => ... ")} + |""" +} - class LossyWideningConstantConversion(sourceType: Type, targetType: Type)(using Context) - extends Message(LossyWideningConstantConversionID): - def kind = MessageKind.LossyConversion - def msg = em"""|Widening conversion from $sourceType to $targetType loses precision. - |Write `.to$targetType` instead.""".stripMargin - def explain = "" - - class PatternMatchExhaustivity(uncoveredFn: => String, hasMore: Boolean)(using Context) - extends Message(PatternMatchExhaustivityID) { - def kind = MessageKind.PatternMatchExhaustivity - lazy val uncovered = uncoveredFn - def msg = - val addendum = if hasMore then "(More unmatched cases are elided)" else "" - em"""|${hl("match")} may not be exhaustive. - | - |It would fail on pattern case: $uncovered - |$addendum""" - - - def explain = - em"""|There are several ways to make the match exhaustive: - | - Add missing cases as shown in the warning - | - If an extractor always return ${hl("Some(...)")}, write ${hl("Some[X]")} for its return type - | - Add a ${hl("case _ => ...")} at the end to match all remaining cases - |""" +class IllegalStartSimpleExpr(illegalToken: String)(using Context) +extends SyntaxMsg(IllegalStartSimpleExprID) { + def msg(using Context) = i"expression expected but ${Red(illegalToken)} found" + def explain(using Context) = { + i"""|An expression cannot start with ${Red(illegalToken)}.""" } +} - class UncheckedTypePattern(msgFn: => String)(using Context) - extends PatternMatchMsg(UncheckedTypePatternID) { - def msg = msgFn - def explain = - em"""|Type arguments and type refinements are erased during compile time, thus it's - |impossible to check them at run-time. - | - |You can either replace the type arguments by ${hl("_")} or use `@unchecked`. - |""" - } +class MissingReturnType()(implicit ctx:Context) +extends SyntaxMsg(MissingReturnTypeID) { + def msg(using Context) = "Missing return type" + def explain(using Context) = + i"""|An abstract declaration must have a return type. For example: + | + |trait Shape: + | ${hl("def area: Double")} // abstract declaration returning a Double""" +} + +class MissingReturnTypeWithReturnStatement(method: Symbol)(using Context) +extends SyntaxMsg(MissingReturnTypeWithReturnStatementID) { + def msg(using Context) = i"$method has a return statement; it needs a result type" + def explain(using Context) = + i"""|If a method contains a ${hl("return")} statement, it must have an + |explicit return type. 
For example: + | + |${hl("def good: Int /* explicit return type */ = return 1")}""" +} - class MatchCaseUnreachable()(using Context) - extends Message(MatchCaseUnreachableID) { - def kind = MessageKind.MatchCaseUnreachable - def msg = "Unreachable case" - def explain = "" - } +class YieldOrDoExpectedInForComprehension()(using Context) +extends SyntaxMsg(YieldOrDoExpectedInForComprehensionID) { + def msg(using Context) = i"${hl("yield")} or ${hl("do")} expected" - class MatchCaseOnlyNullWarning()(using Context) - extends PatternMatchMsg(MatchCaseOnlyNullWarningID) { - def msg = em"""Unreachable case except for ${hl("null")} (if this is intentional, consider writing ${hl("case null =>")} instead).""" - def explain = "" - } + def explain(using Context) = + i"""|When the enumerators in a for comprehension are not placed in parentheses or + |braces, a ${hl("do")} or ${hl("yield")} statement is required after the enumerators + |section of the comprehension. + | + |You can save some keystrokes by omitting the parentheses and writing + | + |${hl("val numbers = for i <- 1 to 3 yield i")} + | + | instead of + | + |${hl("val numbers = for (i <- 1 to 3) yield i")} + | + |but the ${hl("yield")} keyword is still required. + | + |For comprehensions that simply perform a side effect without yielding anything + |can also be written without parentheses but a ${hl("do")} keyword has to be + |included. For example, + | + |${hl("for (i <- 1 to 3) println(i)")} + | + |can be written as + | + |${hl("for i <- 1 to 3 do println(i) // notice the 'do' keyword")} + | + |""" +} + +class ProperDefinitionNotFound()(using Context) +extends Message(ProperDefinitionNotFoundID) { + def kind = MessageKind.DocComment + def msg(using Context) = i"""Proper definition was not found in ${hl("@usecase")}""" + + def explain(using Context) = { + val noUsecase = + "def map[B, That](f: A => B)(implicit bf: CanBuildFrom[List[A], B, That]): That" + + val usecase = + """|/** Map from List[A] => List[B] + | * + | * @usecase def map[B](f: A => B): List[B] + | */ + |def map[B, That](f: A => B)(implicit bf: CanBuildFrom[List[A], B, That]): That + |""".stripMargin - class MatchableWarning(tp: Type, pattern: Boolean)(using Context) - extends TypeMsg(MatchableWarningID) { - def msg = - val kind = if pattern then "pattern selector" else "value" - em"""${kind} should be an instance of Matchable,, - |but it has unmatchable type $tp instead""" - - def explain = - if pattern then - em"""A value of type $tp cannot be the selector of a match expression - |since it is not constrained to be `Matchable`. Matching on unconstrained - |values is disallowed since it can uncover implementation details that - |were intended to be hidden and thereby can violate paramtetricity laws - |for reasoning about programs. - | - |The restriction can be overridden by appending `.asMatchable` to - |the selector value. `asMatchable` needs to be imported from - |scala.compiletime. Example: - | - | import compiletime.asMatchable - | def f[X](x: X) = x.asMatchable match { ... }""" - else - em"""The value can be converted to a `Matchable` by appending `.asMatchable`. - |`asMatchable` needs to be imported from scala.compiletime.""" + i"""|Usecases are only supported for ${hl("def")}s. They exist because with Scala's + |advanced type-system, we sometimes end up with seemingly scary signatures. 
+ |The usage of these methods, however, needs not be - for instance the ${hl("map")} + |function + | + |${hl("List(1, 2, 3).map(2 * _) // res: List(2, 4, 6)")} + | + |is easy to understand and use - but has a rather bulky signature: + | + |$noUsecase + | + |to mitigate this and ease the usage of such functions we have the ${hl("@usecase")} + |annotation for docstrings. Which can be used like this: + | + |$usecase + | + |When creating the docs, the signature of the method is substituted by the + |usecase and the compiler makes sure that it is valid. Because of this, you're + |only allowed to use ${hl("def")}s when defining usecases.""" } +} - class SeqWildcardPatternPos()(using Context) - extends SyntaxMsg(SeqWildcardPatternPosID) { - def msg = em"""${hl("*")} can be used only for last argument""" - def explain = { - val code = - """def sumOfTheFirstTwo(list: List[Int]): Int = list match { - | case List(first, second, x*) => first + second - | case _ => 0 - |}""" - em"""|Sequence wildcard pattern is expected at the end of an argument list. - |This pattern matches any remaining elements in a sequence. - |Consider the following example: - | - |$code - | - |Calling: - | - |${hl("sumOfTheFirstTwo(List(1, 2, 10))")} - | - |would give 3 as a result""" - } - } +class ByNameParameterNotSupported(tpe: untpd.Tree)(using Context) +extends SyntaxMsg(ByNameParameterNotSupportedID) { + def msg(using Context) = i"By-name parameter type ${tpe} not allowed here." - class IllegalStartOfSimplePattern()(using Context) - extends SyntaxMsg(IllegalStartOfSimplePatternID) { - def msg = "pattern expected" - def explain = { - val sipCode = - """def f(x: Int, y: Int) = x match { - | case `y` => ... - |} - """ - val constructorPatternsCode = - """case class Person(name: String, age: Int) + def explain(using Context) = + i"""|By-name parameters act like functions that are only evaluated when referenced, + |allowing for lazy evaluation of a parameter. 
+ | + |An example of using a by-name parameter would look like: + |${hl("def func(f: => Boolean) = f // 'f' is evaluated when referenced within the function")} + | + |An example of the syntax of passing an actual function as a parameter: + |${hl("def func(f: (Boolean => Boolean)) = f(true)")} + | + |or: + | + |${hl("def func(f: Boolean => Boolean) = f(true)")} + | + |And the usage could be as such: + |${hl("func(bool => // do something...)")} + |""" +} + +class WrongNumberOfTypeArgs(fntpe: Type, expectedArgs: List[ParamInfo], actual: List[untpd.Tree])(using Context) +extends SyntaxMsg(WrongNumberOfTypeArgsID) { + + private val expectedCount = expectedArgs.length + private val actualCount = actual.length + private val msgPrefix = if (actualCount > expectedCount) "Too many" else "Not enough" + + def msg(using Context) = + val expectedArgString = expectedArgs + .map(_.paramName.unexpandedName.show) + .mkString("[", ", ", "]") + val actualArgString = actual.map(_.show).mkString("[", ", ", "]") + val prettyName = + try fntpe.termSymbol match + case NoSymbol => fntpe.show + case symbol => symbol.showFullName + catch case NonFatal(ex) => fntpe.show + i"""|$msgPrefix type arguments for $prettyName$expectedArgString + |expected: $expectedArgString + |actual: $actualArgString""" + + def explain(using Context) = { + val tooManyTypeParams = + """|val tuple2: (Int, String) = (1, "one") + |val list: List[(Int, String)] = List(tuple2)""".stripMargin + + if (actualCount > expectedCount) + i"""|You have supplied too many type parameters | - |def test(p: Person) = p match { - | case Person(name, age) => ... - |} - """ - val tupplePatternsCode = - """def swap(tuple: (String, Int)): (Int, String) = tuple match { - | case (text, number) => (number, text) - |} - """ - val patternSequencesCode = - """def getSecondValue(list: List[Int]): Int = list match { - | case List(_, second, x:_*) => second + |For example List takes a single type parameter (List[A]) + |If you need to hold more types in a list then you need to combine them + |into another data type that can contain the number of types you need, + |In this example one solution would be to use a Tuple: + | + |${tooManyTypeParams}""" + else + i"""|You have not supplied enough type parameters + |If you specify one type parameter then you need to specify every type parameter.""" + } +} + +class IllegalVariableInPatternAlternative(name: Name)(using Context) +extends SyntaxMsg(IllegalVariableInPatternAlternativeID) { + def msg(using Context) = i"Illegal variable $name in pattern alternative" + def explain(using Context) = { + val varInAlternative = + """|def g(pair: (Int,Int)): Int = pair match { + | case (1, n) | (n, 1) => n | case _ => 0 - |}""" - em"""|Simple patterns can be divided into several groups: - |- Variable Patterns: ${hl("case x => ...")}. - | It matches any value, and binds the variable name to that value. - | A special case is the wild-card pattern _ which is treated as if it was a fresh - | variable on each occurrence. - | - |- Typed Patterns: ${hl("case x: Int => ...")} or ${hl("case _: Int => ...")}. - | This pattern matches any value matched by the specified type; it binds the variable - | name to that value. - | - |- Literal Patterns: ${hl("case 123 => ...")} or ${hl("case 'A' => ...")}. - | This type of pattern matches any value that is equal to the specified literal. - | - |- Stable Identifier Patterns: - | - | $sipCode - | - | the match succeeds only if the x argument and the y argument of f are equal. 
- | - |- Constructor Patterns: - | - | $constructorPatternsCode - | - | The pattern binds all object's fields to the variable names (name and age, in this - | case). - | - |- Tuple Patterns: - | - | $tupplePatternsCode - | - | Calling: - | - | ${hl("""swap(("Luftballons", 99)""")} - | - | would give ${hl("""(99, "Luftballons")""")} as a result. - | - |- Pattern Sequences: - | - | $patternSequencesCode - | - | Calling: - | - | ${hl("getSecondValue(List(1, 10, 2))")} - | - | would give 10 as a result. - | This pattern is possible because a companion object for the List class has a method - | with the following signature: - | - | ${hl("def unapplySeq[A](x: List[A]): Some[List[A]]")} - |""" - } - } - - class PkgDuplicateSymbol(existing: Symbol)(using Context) - extends NamingMsg(PkgDuplicateSymbolID) { - def msg = em"Trying to define package with same name as $existing" - def explain = "" - } + |}""".stripMargin - class ExistentialTypesNoLongerSupported()(using Context) - extends SyntaxMsg(ExistentialTypesNoLongerSupportedID) { - def msg = - em"""|Existential types are no longer supported - - |use a wildcard or dependent type instead""" - def explain = - em"""|The use of existential types is no longer supported. - | - |You should use a wildcard or dependent type instead. - | - |For example: - | - |Instead of using ${hl("forSome")} to specify a type variable - | - |${hl("List[T forSome { type T }]")} - | - |Try using a wildcard type variable - | - |${hl("List[?]")} - |""" - } + val fixedVarInAlternative = + """|def g(pair: (Int,Int)): Int = pair match { + | case (1, n) => n + | case (n, 1) => n + | case _ => 0 + |}""".stripMargin - class UnboundWildcardType()(using Context) - extends SyntaxMsg(UnboundWildcardTypeID) { - def msg = "Unbound wildcard type" - def explain = - em"""|The wildcard type syntax (${hl("_")}) was used where it could not be bound. - |Replace ${hl("_")} with a non-wildcard type. If the type doesn't matter, - |try replacing ${hl("_")} with ${hl("Any")}. - | - |Examples: - | - |- Parameter lists - | - | Instead of: - | ${hl("def foo(x: _) = ...")} - | - | Use ${hl("Any")} if the type doesn't matter: - | ${hl("def foo(x: Any) = ...")} - | - |- Type arguments - | - | Instead of: - | ${hl("val foo = List[?](1, 2)")} - | - | Use: - | ${hl("val foo = List[Int](1, 2)")} - | - |- Type bounds - | - | Instead of: - | ${hl("def foo[T <: _](x: T) = ...")} - | - | Remove the bounds if the type doesn't matter: - | ${hl("def foo[T](x: T) = ...")} - | - |- ${hl("val")} and ${hl("def")} types - | - | Instead of: - | ${hl("val foo: _ = 3")} - | - | Use: - | ${hl("val foo: Int = 3")} - |""" + i"""|Variables are not allowed within alternate pattern matches. You can workaround + |this issue by adding additional cases for each alternative. For example, the + |illegal function: + | + |$varInAlternative + |could be implemented by moving each alternative into a separate case: + | + |$fixedVarInAlternative""" + } +} + +class IdentifierExpected(identifier: String)(using Context) +extends SyntaxMsg(IdentifierExpectedID) { + def msg(using Context) = "identifier expected" + def explain(using Context) = { + val wrongIdentifier = i"def foo: $identifier = {...}" + val validIdentifier = i"def foo = {...}" + i"""|An identifier expected, but $identifier found. This could be because + |$identifier is not a valid identifier. As a workaround, the compiler could + |infer the type for you. 
For example, instead of: + | + |$wrongIdentifier + | + |Write your code like: + | + |$validIdentifier + | + |""" } +} - class OverridesNothing(member: Symbol)(using Context) - extends DeclarationMsg(OverridesNothingID) { - def msg = em"""${member} overrides nothing""" - - def explain = - em"""|There must be a field or method with the name ${member.name} in a super - |class of ${member.owner} to override it. Did you misspell it? - |Are you extending the right classes? - |""" - } +class AuxConstructorNeedsNonImplicitParameter()(implicit ctx:Context) +extends SyntaxMsg(AuxConstructorNeedsNonImplicitParameterID) { + def msg(using Context) = "Auxiliary constructor needs non-implicit parameter list" + def explain(using Context) = + i"""|Only the primary constructor is allowed an ${hl("implicit")} parameter list; + |auxiliary constructors need non-implicit parameter lists. When a primary + |constructor has an implicit argslist, auxiliary constructors that call the + |primary constructor must specify the implicit value. + | + |To resolve this issue check for: + | - Forgotten parenthesis on ${hl("this")} (${hl("def this() = { ... }")}) + | - Auxiliary constructors specify the implicit value + |""" +} + +class IllegalLiteral()(using Context) +extends SyntaxMsg(IllegalLiteralID) { + def msg(using Context) = "Illegal literal" + def explain(using Context) = + i"""|Available literals can be divided into several groups: + | - Integer literals: 0, 21, 0xFFFFFFFF, -42L + | - Floating Point Literals: 0.0, 1e30f, 3.14159f, 1.0e-100, .1 + | - Boolean Literals: true, false + | - Character Literals: 'a', '\u0041', '\n' + | - String Literals: "Hello, World!" + | - null + |""" +} + +class LossyWideningConstantConversion(sourceType: Type, targetType: Type)(using Context) +extends Message(LossyWideningConstantConversionID): + def kind = MessageKind.LossyConversion + def msg(using Context) = i"""|Widening conversion from $sourceType to $targetType loses precision. + |Write `.to$targetType` instead.""" + def explain(using Context) = "" + +class PatternMatchExhaustivity(uncoveredFn: => String, hasMore: Boolean)(using Context) +extends Message(PatternMatchExhaustivityID) { + def kind = MessageKind.PatternMatchExhaustivity + lazy val uncovered = uncoveredFn + def msg(using Context) = + val addendum = if hasMore then "(More unmatched cases are elided)" else "" + i"""|${hl("match")} may not be exhaustive. + | + |It would fail on pattern case: $uncovered + |$addendum""" - class OverridesNothingButNameExists(member: Symbol, existing: List[Denotations.SingleDenotation])(using Context) - extends DeclarationMsg(OverridesNothingButNameExistsID) { - def msg = - val what = - if !existing.exists(_.symbol.hasTargetName(member.targetName)) - then "target name" - else "signature" - em"""${member} has a different $what than the overridden declaration""" - def explain = - val existingDecl: String = existing.map(_.showDcl).mkString(" \n") - em"""|There must be a non-final field or method with the name ${member.name} and the - |same parameter list in a super class of ${member.owner} to override it. 
- |
- | ${member.showDcl}
- |
- |The super classes of ${member.owner} contain the following members
- |named ${member.name}:
- | ${existingDecl}
- |"""
- }
- class OverrideError(override val msg: String) extends DeclarationMsg(OverrideErrorID):
- def explain = ""
-
- class OverrideTypeMismatchError(override val msg: String, memberTp: Type, otherTp: Type)(using Context)
- extends DeclarationMsg(OverrideTypeMismatchErrorID):
- def explain = err.whyNoMatchStr(memberTp, otherTp)
- override def canExplain = true
-
- class ForwardReferenceExtendsOverDefinition(value: Symbol, definition: Symbol)(using Context)
- extends ReferenceMsg(ForwardReferenceExtendsOverDefinitionID) {
- def msg = em"${definition.name} is a forward reference extending over the definition of ${value.name}"
-
- def explain =
- em"""|${definition.name} is used before you define it, and the definition of ${value.name}
- |appears between that use and the definition of ${definition.name}.
- |
- |Forward references are allowed only, if there are no value definitions between
- |the reference and the referred method definition.
- |
- |Define ${definition.name} before it is used,
- |or move the definition of ${value.name} so it does not appear between
- |the declaration of ${definition.name} and its use,
- |or define ${value.name} as lazy.
- |""".stripMargin
+ def explain(using Context) =
+ i"""|There are several ways to make the match exhaustive:
+ | - Add missing cases as shown in the warning
+ | - If an extractor always returns ${hl("Some(...)")}, write ${hl("Some[X]")} for its return type
+ | - Add a ${hl("case _ => ...")} at the end to match all remaining cases
+ |"""
+}
+
+class UncheckedTypePattern(msgFn: => String)(using Context)
+ extends PatternMatchMsg(UncheckedTypePatternID) {
+ def msg(using Context) = msgFn
+ def explain(using Context) =
+ i"""|Type arguments and type refinements are erased during compile time, thus it's
+ |impossible to check them at run-time.
+ |
+ |You can either replace the type arguments by ${hl("_")} or use `@unchecked`.
+ |"""
+}
+
+class MatchCaseUnreachable()(using Context)
+extends Message(MatchCaseUnreachableID) {
+ def kind = MessageKind.MatchCaseUnreachable
+ def msg(using Context) = "Unreachable case"
+ def explain(using Context) = ""
+}
+
+class MatchCaseOnlyNullWarning()(using Context)
+extends PatternMatchMsg(MatchCaseOnlyNullWarningID) {
+ def msg(using Context) = i"""Unreachable case except for ${hl("null")} (if this is intentional, consider writing ${hl("case null =>")} instead)."""
+ def explain(using Context) = ""
+}
+
+class MatchableWarning(tp: Type, pattern: Boolean)(using Context)
+extends TypeMsg(MatchableWarningID) {
+ def msg(using Context) =
+ val kind = if pattern then "pattern selector" else "value"
+ i"""${kind} should be an instance of Matchable,
+ |but it has unmatchable type $tp instead"""
+
+ def explain(using Context) =
+ if pattern then
+ i"""A value of type $tp cannot be the selector of a match expression
+ |since it is not constrained to be `Matchable`. Matching on unconstrained
+ |values is disallowed since it can uncover implementation details that
+ |were intended to be hidden and thereby can violate parametricity laws
+ |for reasoning about programs.
+ |
+ |The restriction can be overridden by appending `.asMatchable` to
+ |the selector value. `asMatchable` needs to be imported from
+ |scala.compiletime. Example:
+ |
+ | import compiletime.asMatchable
+ | def f[X](x: X) = x.asMatchable match { ...
}""" + else + i"""The value can be converted to a `Matchable` by appending `.asMatchable`. + |`asMatchable` needs to be imported from scala.compiletime.""" +} + +class SeqWildcardPatternPos()(using Context) +extends SyntaxMsg(SeqWildcardPatternPosID) { + def msg(using Context) = i"""${hl("*")} can be used only for last argument""" + def explain(using Context) = { + val code = + """def sumOfTheFirstTwo(list: List[Int]): Int = list match { + | case List(first, second, x*) => first + second + | case _ => 0 + |}""" + i"""|Sequence wildcard pattern is expected at the end of an argument list. + |This pattern matches any remaining elements in a sequence. + |Consider the following example: + | + |$code + | + |Calling: + | + |${hl("sumOfTheFirstTwo(List(1, 2, 10))")} + | + |would give 3 as a result""" + } +} + +class IllegalStartOfSimplePattern()(using Context) +extends SyntaxMsg(IllegalStartOfSimplePatternID) { + def msg(using Context) = "pattern expected" + def explain(using Context) = { + val sipCode = + """def f(x: Int, y: Int) = x match { + | case `y` => ... + |} + """ + val constructorPatternsCode = + """case class Person(name: String, age: Int) + | + |def test(p: Person) = p match { + | case Person(name, age) => ... + |} + """ + val tupplePatternsCode = + """def swap(tuple: (String, Int)): (Int, String) = tuple match { + | case (text, number) => (number, text) + |} + """ + val patternSequencesCode = + """def getSecondValue(list: List[Int]): Int = list match { + | case List(_, second, x:_*) => second + | case _ => 0 + |}""" + i"""|Simple patterns can be divided into several groups: + |- Variable Patterns: ${hl("case x => ...")}. + | It matches any value, and binds the variable name to that value. + | A special case is the wild-card pattern _ which is treated as if it was a fresh + | variable on each occurrence. + | + |- Typed Patterns: ${hl("case x: Int => ...")} or ${hl("case _: Int => ...")}. + | This pattern matches any value matched by the specified type; it binds the variable + | name to that value. + | + |- Literal Patterns: ${hl("case 123 => ...")} or ${hl("case 'A' => ...")}. + | This type of pattern matches any value that is equal to the specified literal. + | + |- Stable Identifier Patterns: + | + | $sipCode + | + | the match succeeds only if the x argument and the y argument of f are equal. + | + |- Constructor Patterns: + | + | $constructorPatternsCode + | + | The pattern binds all object's fields to the variable names (name and age, in this + | case). + | + |- Tuple Patterns: + | + | $tupplePatternsCode + | + | Calling: + | + | ${hl("""swap(("Luftballons", 99)""")} + | + | would give ${hl("""(99, "Luftballons")""")} as a result. + | + |- Pattern Sequences: + | + | $patternSequencesCode + | + | Calling: + | + | ${hl("getSecondValue(List(1, 10, 2))")} + | + | would give 10 as a result. 
+ | This pattern is possible because a companion object for the List class has a method + | with the following signature: + | + | ${hl("def unapplySeq[A](x: List[A]): Some[List[A]]")} + |""" } +} + +class PkgDuplicateSymbol(existing: Symbol)(using Context) +extends NamingMsg(PkgDuplicateSymbolID) { + def msg(using Context) = i"Trying to define package with same name as $existing" + def explain(using Context) = "" +} + +class ExistentialTypesNoLongerSupported()(using Context) +extends SyntaxMsg(ExistentialTypesNoLongerSupportedID) { + def msg(using Context) = + i"""|Existential types are no longer supported - + |use a wildcard or dependent type instead""" + def explain(using Context) = + i"""|The use of existential types is no longer supported. + | + |You should use a wildcard or dependent type instead. + | + |For example: + | + |Instead of using ${hl("forSome")} to specify a type variable + | + |${hl("List[T forSome { type T }]")} + | + |Try using a wildcard type variable + | + |${hl("List[?]")} + |""" +} + +class UnboundWildcardType()(using Context) +extends SyntaxMsg(UnboundWildcardTypeID) { + def msg(using Context) = "Unbound wildcard type" + def explain(using Context) = + i"""|The wildcard type syntax (${hl("_")}) was used where it could not be bound. + |Replace ${hl("_")} with a non-wildcard type. If the type doesn't matter, + |try replacing ${hl("_")} with ${hl("Any")}. + | + |Examples: + | + |- Parameter lists + | + | Instead of: + | ${hl("def foo(x: _) = ...")} + | + | Use ${hl("Any")} if the type doesn't matter: + | ${hl("def foo(x: Any) = ...")} + | + |- Type arguments + | + | Instead of: + | ${hl("val foo = List[?](1, 2)")} + | + | Use: + | ${hl("val foo = List[Int](1, 2)")} + | + |- Type bounds + | + | Instead of: + | ${hl("def foo[T <: _](x: T) = ...")} + | + | Remove the bounds if the type doesn't matter: + | ${hl("def foo[T](x: T) = ...")} + | + |- ${hl("val")} and ${hl("def")} types + | + | Instead of: + | ${hl("val foo: _ = 3")} + | + | Use: + | ${hl("val foo: Int = 3")} + |""" +} - class ExpectedTokenButFound(expected: Token, found: Token)(using Context) - extends SyntaxMsg(ExpectedTokenButFoundID) { +class OverridesNothing(member: Symbol)(using Context) +extends DeclarationMsg(OverridesNothingID) { + def msg(using Context) = i"""${member} overrides nothing""" - private lazy val foundText = Tokens.showToken(found) + def explain(using Context) = + i"""|There must be a field or method with the name ${member.name} in a super + |class of ${member.owner} to override it. Did you misspell it? + |Are you extending the right classes? + |""" +} + +class OverridesNothingButNameExists(member: Symbol, existing: List[Denotations.SingleDenotation])(using Context) +extends DeclarationMsg(OverridesNothingButNameExistsID) { + def msg(using Context) = + val what = + if !existing.exists(_.symbol.hasTargetName(member.targetName)) + then "target name" + else "signature" + i"""${member} has a different $what than the overridden declaration""" + def explain(using Context) = + val existingDecl: String = existing.map(_.showDcl).mkString(" \n") + i"""|There must be a non-final field or method with the name ${member.name} and the + |same parameter list in a super class of ${member.owner} to override it. 
+ | + | ${member.showDcl} + | + |The super classes of ${member.owner} contain the following members + |named ${member.name}: + | ${existingDecl} + |""" +} + +class OverrideError( + core: Context ?=> String, base: Type, + member: Symbol, other: Symbol, + memberTp: Type, otherTp: Type)(using Context) +extends DeclarationMsg(OverrideErrorID), NoDisambiguation: + def msg(using Context) = + val isConcreteOverAbstract = + (other.owner isSubClass member.owner) && other.is(Deferred) && !member.is(Deferred) + def addendum = + if isConcreteOverAbstract then + i"""| + |(Note that ${err.infoStringWithLocation(other, base)} is abstract, + |and is therefore overridden by concrete ${err.infoStringWithLocation(member, base)})""" + else "" + i"""error overriding ${err.infoStringWithLocation(other, base)}; + | ${err.infoString(member, base, showLocation = member.owner != base.typeSymbol)} $core$addendum""" + override def canExplain = + memberTp.exists && otherTp.exists + def explain(using Context) = + if canExplain then err.whyNoMatchStr(memberTp, otherTp) else "" + +class ForwardReferenceExtendsOverDefinition(value: Symbol, definition: Symbol)(using Context) +extends ReferenceMsg(ForwardReferenceExtendsOverDefinitionID) { + def msg(using Context) = i"${definition.name} is a forward reference extending over the definition of ${value.name}" + + def explain(using Context) = + i"""|${definition.name} is used before you define it, and the definition of ${value.name} + |appears between that use and the definition of ${definition.name}. + | + |Forward references are allowed only, if there are no value definitions between + |the reference and the referred method definition. + | + |Define ${definition.name} before it is used, + |or move the definition of ${value.name} so it does not appear between + |the declaration of ${definition.name} and its use, + |or define ${value.name} as lazy. + |""" +} - def msg = - val expectedText = - if (Tokens.isIdentifier(expected)) "an identifier" - else Tokens.showToken(expected) - em"""${expectedText} expected, but ${foundText} found""" +class ExpectedTokenButFound(expected: Token, found: Token)(using Context) +extends SyntaxMsg(ExpectedTokenButFoundID) { - def explain = - if (Tokens.isIdentifier(expected) && Tokens.isKeyword(found)) - s""" - |If you want to use $foundText as identifier, you may put it in backticks: `${Tokens.tokenString(found)}`.""".stripMargin - else - "" - } + private def foundText = Tokens.showToken(found) - class MixedLeftAndRightAssociativeOps(op1: Name, op2: Name, op2LeftAssoc: Boolean)(using Context) - extends SyntaxMsg(MixedLeftAndRightAssociativeOpsID) { - def msg = - val op1Asso: String = if (op2LeftAssoc) "which is right-associative" else "which is left-associative" - val op2Asso: String = if (op2LeftAssoc) "which is left-associative" else "which is right-associative" - em"${op1} (${op1Asso}) and ${op2} ($op2Asso) have same precedence and may not be mixed" - def explain = - s"""|The operators ${op1} and ${op2} are used as infix operators in the same expression, - |but they bind to different sides: - |${op1} is applied to the operand to its ${if (op2LeftAssoc) "right" else "left"} - |${op2} is applied to the operand to its ${if (op2LeftAssoc) "left" else "right"} - |As both have the same precedence the compiler can't decide which to apply first. - | - |You may use parenthesis to make the application order explicit, - |or use method application syntax operand1.${op1}(operand2). - | - |Operators ending in a colon ${hl(":")} are right-associative. 
All other operators are left-associative. - | - |Infix operator precedence is determined by the operator's first character. Characters are listed - |below in increasing order of precedence, with characters on the same line having the same precedence. - | (all letters) - | | - | ^ - | & - | = ! - | < > - | : - | + - - | * / % - | (all other special characters) - |Operators starting with a letter have lowest precedence, followed by operators starting with `|`, etc. - |""".stripMargin - } + def msg(using Context) = + val expectedText = + if (Tokens.isIdentifier(expected)) "an identifier" + else Tokens.showToken(expected) + i"""${expectedText} expected, but ${foundText} found""" - class CantInstantiateAbstractClassOrTrait(cls: Symbol, isTrait: Boolean)(using Context) - extends TypeMsg(CantInstantiateAbstractClassOrTraitID) { - private val traitOrAbstract = if (isTrait) "a trait" else "abstract" - def msg = em"""${cls.name} is ${traitOrAbstract}; it cannot be instantiated""" - def explain = - em"""|Abstract classes and traits need to be extended by a concrete class or object - |to make their functionality accessible. - | - |You may want to create an anonymous class extending ${cls.name} with - | ${s"class ${cls.name} { }"} - | - |or add a companion object with - | ${s"object ${cls.name} extends ${cls.name}"} - | - |You need to implement any abstract members in both cases. - |""".stripMargin - } - - class UnreducibleApplication(tycon: Type)(using Context) extends TypeMsg(UnreducibleApplicationID): - def msg = em"unreducible application of higher-kinded type $tycon to wildcard arguments" - def explain = - em"""|An abstract type constructor cannot be applied to wildcard arguments. - |Such applications are equivalent to existential types, which are not - |supported in Scala 3.""" - - class OverloadedOrRecursiveMethodNeedsResultType(cycleSym: Symbol)(using Context) - extends CyclicMsg(OverloadedOrRecursiveMethodNeedsResultTypeID) { - def msg = em"""Overloaded or recursive $cycleSym needs return type""" - def explain = - em"""Case 1: $cycleSym is overloaded - |If there are multiple methods named $cycleSym and at least one definition of - |it calls another, you need to specify the calling method's return type. - | - |Case 2: $cycleSym is recursive - |If $cycleSym calls itself on any path (even through mutual recursion), you need to specify the return type - |of $cycleSym or of a definition it's mutually recursive with. - |""".stripMargin - } - - class RecursiveValueNeedsResultType(cycleSym: Symbol)(using Context) - extends CyclicMsg(RecursiveValueNeedsResultTypeID) { - def msg = em"""Recursive $cycleSym needs type""" - def explain = - em"""The definition of $cycleSym is recursive and you need to specify its type. - |""".stripMargin - } - - class CyclicReferenceInvolving(denot: SymDenotation)(using Context) - extends CyclicMsg(CyclicReferenceInvolvingID) { - def msg = - val where = if denot.exists then s" involving $denot" else "" - em"Cyclic reference$where" - def explain = - em"""|$denot is declared as part of a cycle which makes it impossible for the - |compiler to decide upon ${denot.name}'s type. - |To avoid this error, try giving ${denot.name} an explicit type. 
-           |""".stripMargin
-  }
+  def explain(using Context) =
+    if (Tokens.isIdentifier(expected) && Tokens.isKeyword(found))
+      s"""
+         |If you want to use $foundText as an identifier, you may put it in backticks: `${Tokens.tokenString(found)}`.""".stripMargin
+    else
+      ""
+}
+
+class MixedLeftAndRightAssociativeOps(op1: Name, op2: Name, op2LeftAssoc: Boolean)(using Context)
+extends SyntaxMsg(MixedLeftAndRightAssociativeOpsID) {
+  def msg(using Context) =
+    val op1Asso: String = if (op2LeftAssoc) "which is right-associative" else "which is left-associative"
+    val op2Asso: String = if (op2LeftAssoc) "which is left-associative" else "which is right-associative"
+    i"${op1} (${op1Asso}) and ${op2} ($op2Asso) have the same precedence and may not be mixed"
+  def explain(using Context) =
+    s"""|The operators ${op1} and ${op2} are used as infix operators in the same expression,
+        |but they bind to different sides:
+        |${op1} is applied to the operand to its ${if (op2LeftAssoc) "right" else "left"}
+        |${op2} is applied to the operand to its ${if (op2LeftAssoc) "left" else "right"}
+        |As both have the same precedence the compiler can't decide which to apply first.
+        |
+        |You may use parentheses to make the application order explicit,
+        |or use method application syntax operand1.${op1}(operand2).
+        |
+        |Operators ending in a colon ${hl(":")} are right-associative. All other operators are left-associative.
+        |
+        |Infix operator precedence is determined by the operator's first character. Characters are listed
+        |below in increasing order of precedence, with characters on the same line having the same precedence.
+        |  (all letters)
+        |  |
+        |  ^
+        |  &
+        |  = !
+        |  < >
+        |  :
+        |  + -
+        |  * / %
+        |  (all other special characters)
+        |Operators starting with a letter have lowest precedence, followed by operators starting with `|`, etc.
+        |""".stripMargin
+}
+
+class CantInstantiateAbstractClassOrTrait(cls: Symbol, isTrait: Boolean)(using Context)
+extends TypeMsg(CantInstantiateAbstractClassOrTraitID) {
+  private val traitOrAbstract = if (isTrait) "a trait" else "abstract"
+  def msg(using Context) = i"""${cls.name} is ${traitOrAbstract}; it cannot be instantiated"""
+  def explain(using Context) =
+    i"""|Abstract classes and traits need to be extended by a concrete class or object
+        |to make their functionality accessible.
+        |
+        |You may want to create an anonymous class extending ${cls.name} with
+        |  ${s"class ${cls.name} { }"}
+        |
+        |or add a companion object with
+        |  ${s"object ${cls.name} extends ${cls.name}"}
+        |
+        |You need to implement any abstract members in both cases.
+        |"""
+}
+
+class UnreducibleApplication(tycon: Type)(using Context) extends TypeMsg(UnreducibleApplicationID):
+  def msg(using Context) = i"unreducible application of higher-kinded type $tycon to wildcard arguments"
+  def explain(using Context) =
+    i"""|An abstract type constructor cannot be applied to wildcard arguments.
+        |Such applications are equivalent to existential types, which are not
+        |supported in Scala 3."""
+
+class OverloadedOrRecursiveMethodNeedsResultType(cycleSym: Symbol)(using Context)
+extends CyclicMsg(OverloadedOrRecursiveMethodNeedsResultTypeID) {
+  def msg(using Context) = i"""Overloaded or recursive $cycleSym needs return type"""
+  def explain(using Context) =
+    i"""Case 1: $cycleSym is overloaded
+       |If there are multiple methods named $cycleSym and at least one definition of
+       |it calls another, you need to specify the calling method's return type.
+ | + |Case 2: $cycleSym is recursive + |If $cycleSym calls itself on any path (even through mutual recursion), you need to specify the return type + |of $cycleSym or of a definition it's mutually recursive with. + |""" +} - class CyclicReferenceInvolvingImplicit(cycleSym: Symbol)(using Context) - extends CyclicMsg(CyclicReferenceInvolvingImplicitID) { - def msg = em"""Cyclic reference involving implicit $cycleSym""" - def explain = - em"""|$cycleSym is declared as part of a cycle which makes it impossible for the - |compiler to decide upon ${cycleSym.name}'s type. - |This might happen when the right hand-side of $cycleSym's definition involves an implicit search. - |To avoid this error, try giving ${cycleSym.name} an explicit type. - |""".stripMargin - } +class RecursiveValueNeedsResultType(cycleSym: Symbol)(using Context) +extends CyclicMsg(RecursiveValueNeedsResultTypeID) { + def msg(using Context) = i"""Recursive $cycleSym needs type""" + def explain(using Context) = + i"""The definition of $cycleSym is recursive and you need to specify its type. + |""" +} + +class CyclicReferenceInvolving(denot: SymDenotation)(using Context) +extends CyclicMsg(CyclicReferenceInvolvingID) { + def msg(using Context) = + val where = if denot.exists then s" involving $denot" else "" + i"Cyclic reference$where" + def explain(using Context) = + i"""|$denot is declared as part of a cycle which makes it impossible for the + |compiler to decide upon ${denot.name}'s type. + |To avoid this error, try giving ${denot.name} an explicit type. + |""" +} + +class CyclicReferenceInvolvingImplicit(cycleSym: Symbol)(using Context) +extends CyclicMsg(CyclicReferenceInvolvingImplicitID) { + def msg(using Context) = i"""Cyclic reference involving implicit $cycleSym""" + def explain(using Context) = + i"""|$cycleSym is declared as part of a cycle which makes it impossible for the + |compiler to decide upon ${cycleSym.name}'s type. + |This might happen when the right hand-side of $cycleSym's definition involves an implicit search. + |To avoid this error, try giving ${cycleSym.name} an explicit type. + |""" +} - class SkolemInInferred(tree: tpd.Tree, pt: Type, argument: tpd.Tree)(using Context) - extends TypeMsg(SkolemInInferredID): - private def argStr = +class SkolemInInferred(tree: tpd.Tree, pt: Type, argument: tpd.Tree)(using Context) +extends TypeMsg(SkolemInInferredID): + def msg(using Context) = + def argStr = if argument.isEmpty then "" else i" from argument of type ${argument.tpe.widen}" - def msg = - em"""Failure to generate given instance for type $pt$argStr) - | - |I found: $tree - |But the part corresponding to `` is not a reference that can be generated. - |This might be because resolution yielded as given instance a function that is not - |known to be total and side-effect free.""" - def explain = - em"""The part of given resolution that corresponds to `` produced a term that - |is not a stable reference. Therefore a given instance could not be generated. 
- | - |To trouble-shoot the problem, try to supply an explicit expression instead of - |relying on implicit search at this point.""" - - class SuperQualMustBeParent(qual: untpd.Ident, cls: ClassSymbol)(using Context) - extends ReferenceMsg(SuperQualMustBeParentID) { - def msg = em"""|$qual does not name a parent of $cls""" - def explain = - val parents: Seq[String] = (cls.info.parents map (_.typeSymbol.name.show)).sorted - em"""|When a qualifier ${hl("T")} is used in a ${hl("super")} prefix of the form ${hl("C.super[T]")}, - |${hl("T")} must be a parent type of ${hl("C")}. - | - |In this case, the parents of $cls are: - |${parents.mkString(" - ", "\n - ", "")} - |""".stripMargin - } - - class VarArgsParamMustComeLast()(using Context) - extends SyntaxMsg(VarArgsParamMustComeLastID) { - def msg = em"""${hl("varargs")} parameter must come last""" - def explain = - em"""|The ${hl("varargs")} field must be the last field in the method signature. - |Attempting to define a field in a method signature after a ${hl("varargs")} field is an error. - |""" - } - - import typer.Typer.BindingPrec - - class AmbiguousReference(name: Name, newPrec: BindingPrec, prevPrec: BindingPrec, prevCtx: Context)(using Context) - extends ReferenceMsg(AmbiguousReferenceID) { - - /** A string which explains how something was bound; Depending on `prec` this is either - * imported by - * or defined in - */ - private def bindingString(prec: BindingPrec, whereFound: Context, qualifier: String = "") = { - val howVisible = prec match { - case BindingPrec.Definition => "defined" - case BindingPrec.Inheritance => "inherited" - case BindingPrec.NamedImport => "imported by name" - case BindingPrec.WildImport => "imported" - case BindingPrec.PackageClause => "found" - case BindingPrec.NothingBound => assert(false) - } - if (prec.isImportPrec) { - ex"""$howVisible$qualifier by ${em"${whereFound.importInfo}"}""" - } else - ex"""$howVisible$qualifier in ${em"${whereFound.owner}"}""" - } - - def msg = - i"""|Reference to ${em"$name"} is ambiguous, - |it is both ${bindingString(newPrec, ctx)} - |and ${bindingString(prevPrec, prevCtx, " subsequently")}""" - - def explain = - em"""|The compiler can't decide which of the possible choices you - |are referencing with $name: A definition of lower precedence - |in an inner scope, or a definition with higher precedence in - |an outer scope. - |Note: - | - Definitions in an enclosing scope take precedence over inherited definitions - | - Definitions take precedence over imports - | - Named imports take precedence over wildcard imports - | - You may replace a name when imported using - | ${hl("import")} scala.{ $name => ${name.show + "Tick"} } - |""" - } - - class MethodDoesNotTakeParameters(tree: tpd.Tree)(using Context) - extends TypeMsg(MethodDoesNotTakeParametersId) { - def methodSymbol: Symbol = - def recur(t: tpd.Tree): Symbol = - val sym = tpd.methPart(t).symbol - if sym == defn.Any_typeCast then - t match - case TypeApply(Select(qual, _), _) => recur(qual) - case _ => sym - else sym - recur(tree) - - def msg = { - val more = if (tree.isInstanceOf[tpd.Apply]) " more" else "" - val meth = methodSymbol - val methStr = if (meth.exists) meth.showLocated else "expression" - em"$methStr does not take$more parameters" - } - - def explain = { - val isNullary = methodSymbol.info.isInstanceOf[ExprType] - val addendum = - if (isNullary) "\nNullary methods may not be called with parenthesis" - else "" - - "You have specified more parameter lists than defined in the method definition(s)." 
+ addendum + i"""Failure to generate given instance for type $pt$argStr) + | + |I found: $tree + |But the part corresponding to `` is not a reference that can be generated. + |This might be because resolution yielded as given instance a function that is not + |known to be total and side-effect free.""" + def explain(using Context) = + i"""The part of given resolution that corresponds to `` produced a term that + |is not a stable reference. Therefore a given instance could not be generated. + | + |To trouble-shoot the problem, try to supply an explicit expression instead of + |relying on implicit search at this point.""" + +class SuperQualMustBeParent(qual: untpd.Ident, cls: ClassSymbol)(using Context) +extends ReferenceMsg(SuperQualMustBeParentID) { + def msg(using Context) = i"""|$qual does not name a parent of $cls""" + def explain(using Context) = + val parents: Seq[String] = (cls.info.parents map (_.typeSymbol.name.show)).sorted + i"""|When a qualifier ${hl("T")} is used in a ${hl("super")} prefix of the form ${hl("C.super[T]")}, + |${hl("T")} must be a parent type of ${hl("C")}. + | + |In this case, the parents of $cls are: + |${parents.mkString(" - ", "\n - ", "")} + |""" +} + +class VarArgsParamMustComeLast()(using Context) +extends SyntaxMsg(VarArgsParamMustComeLastID) { + def msg(using Context) = i"""${hl("varargs")} parameter must come last""" + def explain(using Context) = + i"""|The ${hl("varargs")} field must be the last field in the method signature. + |Attempting to define a field in a method signature after a ${hl("varargs")} field is an error. + |""" +} + +import typer.Typer.BindingPrec + +class AmbiguousReference(name: Name, newPrec: BindingPrec, prevPrec: BindingPrec, prevCtx: Context)(using Context) + extends ReferenceMsg(AmbiguousReferenceID), NoDisambiguation { + + /** A string which explains how something was bound; Depending on `prec` this is either + * imported by + * or defined in + */ + private def bindingString(prec: BindingPrec, whereFound: Context, qualifier: String = "")(using Context) = { + val howVisible = prec match { + case BindingPrec.Definition => "defined" + case BindingPrec.Inheritance => "inherited" + case BindingPrec.NamedImport => "imported by name" + case BindingPrec.WildImport => "imported" + case BindingPrec.PackageClause => "found" + case BindingPrec.NothingBound => assert(false) } - - } - - class AmbiguousOverload(tree: tpd.Tree, val alternatives: List[SingleDenotation], pt: Type, addendum: String = "")( - implicit ctx: Context) - extends ReferenceMsg(AmbiguousOverloadID) { - private def all = if (alternatives.length == 2) "both" else "all" - def msg = - em"""|Ambiguous overload. The ${err.overloadedAltsStr(alternatives)} - |$all match ${err.expectedTypeStr(pt)}$addendum""".stripMargin - def explain = - em"""|There are ${alternatives.length} methods that could be referenced as the compiler knows too little - |about the expected type. - |You may specify the expected type e.g. by - |- assigning it to a value with a specified type, or - |- adding a type ascription as in ${hl("instance.myMethod: String => Int")} - |""" - } - - class ReassignmentToVal(name: Name)(using Context) - extends TypeMsg(ReassignmentToValID) { - def msg = em"""Reassignment to val $name""" - def explain = - em"""|You can not assign a new value to $name as values can't be changed. - |Keep in mind that every statement has a value, so you may e.g. 
use - | ${hl("val")} $name ${hl("= if (condition) 2 else 5")} - |In case you need a reassignable name, you can declare it as - |variable - | ${hl("var")} $name ${hl("=")} ... - |""".stripMargin - } - - class TypeDoesNotTakeParameters(tpe: Type, params: List[Trees.Tree[Trees.Untyped]])(using Context) - extends TypeMsg(TypeDoesNotTakeParametersID) { - private def fboundsAddendum = - if tpe.typeSymbol.isAllOf(Provisional | TypeParam) then - "\n(Note that F-bounds of type parameters may not be type lambdas)" - else "" - def msg = em"$tpe does not take type parameters$fboundsAddendum" - def explain = - val ps = - if (params.size == 1) s"a type parameter ${params.head}" - else s"type parameters ${params.map(_.show).mkString(", ")}" - i"""You specified ${NoColor(ps)} for ${em"$tpe"}, which is not - |declared to take any. - |""" - } - - class VarValParametersMayNotBeCallByName(name: TermName, mutable: Boolean)(using Context) - extends SyntaxMsg(VarValParametersMayNotBeCallByNameID) { - def varOrVal = if (mutable) em"${hl("var")}" else em"${hl("val")}" - def msg = s"$varOrVal parameters may not be call-by-name" - def explain = - em"""${hl("var")} and ${hl("val")} parameters of classes and traits may no be call-by-name. In case you - |want the parameter to be evaluated on demand, consider making it just a parameter - |and a ${hl("def")} in the class such as - | ${s"class MyClass(${name}Tick: => String) {"} - | ${s" def $name() = ${name}Tick"} - | ${hl("}")} - |""" - } - - class MissingTypeParameterFor(tpe: Type)(using Context) - extends SyntaxMsg(MissingTypeParameterForID) { - def msg = - if (tpe.derivesFrom(defn.AnyKindClass)) em"${tpe} cannot be used as a value type" - else em"Missing type parameter for ${tpe}" - def explain = "" - } - - class MissingTypeParameterInTypeApp(tpe: Type)(using Context) - extends TypeMsg(MissingTypeParameterInTypeAppID) { - def numParams = tpe.typeParams.length - def parameters = if (numParams == 1) "parameter" else "parameters" - def msg = em"Missing type $parameters for $tpe" - def explain = em"A fully applied type is expected but $tpe takes $numParams $parameters" - } - - class MissingArgument(pname: Name, methString: String)(using Context) - extends TypeMsg(MissingArgumentID): - def msg = - if pname.firstPart contains '$' then s"not enough arguments for $methString" - else s"missing argument for parameter $pname of $methString" - def explain = "" - - class DoesNotConformToBound(tpe: Type, which: String, bound: Type)(using Context) - extends TypeMismatchMsg( - if which == "lower" then bound else tpe, - if which == "lower" then tpe else bound)(DoesNotConformToBoundID): - private def isBounds = tpe match - case TypeBounds(lo, hi) => lo ne hi - case _ => false - override def canExplain = !isBounds - def msg = - if isBounds then - em"Type argument ${tpe} does not overlap with $which bound $bound" - else - em"Type argument ${tpe} does not conform to $which bound $bound" - - class DoesNotConformToSelfType(category: String, selfType: Type, cls: Symbol, - otherSelf: Type, relation: String, other: Symbol)( - implicit ctx: Context) - extends TypeMismatchMsg(selfType, otherSelf)(DoesNotConformToSelfTypeID) { - def msg = em"""$category: self type $selfType of $cls does not conform to self type $otherSelf - |of $relation $other""" - } - - class DoesNotConformToSelfTypeCantBeInstantiated(tp: Type, selfType: Type)( - implicit ctx: Context) - extends TypeMismatchMsg(tp, selfType)(DoesNotConformToSelfTypeCantBeInstantiatedID) { - def msg = em"""$tp does not conform to its self type 
$selfType; cannot be instantiated""" - } - - class IllegalParameterInit(found: Type, expected: Type, param: Symbol, cls: Symbol)(using Context) - extends TypeMismatchMsg(found, expected)(IllegalParameterInitID): - def msg = - em"""illegal parameter initialization of $param. - | - | The argument passed for $param has type: $found - | but $cls expects $param to have type: $expected""" - - class AbstractMemberMayNotHaveModifier(sym: Symbol, flag: FlagSet)( - implicit ctx: Context) - extends SyntaxMsg(AbstractMemberMayNotHaveModifierID) { - def msg = em"""${hl("abstract")} $sym may not have `${flag.flagsString}` modifier""" - def explain = "" - } - - class TypesAndTraitsCantBeImplicit()(using Context) - extends SyntaxMsg(TypesAndTraitsCantBeImplicitID) { - def msg = em"""${hl("implicit")} modifier cannot be used for types or traits""" - def explain = "" - } - - class OnlyClassesCanBeAbstract(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(OnlyClassesCanBeAbstractID) { - def explain = "" - def msg = em"""${hl("abstract")} modifier can be used only for classes; it should be omitted for abstract members""" - } - - class AbstractOverrideOnlyInTraits(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(AbstractOverrideOnlyInTraitsID) { - def msg = em"""${hl("abstract override")} modifier only allowed for members of traits""" - def explain = "" - } - - class TraitsMayNotBeFinal(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(TraitsMayNotBeFinalID) { - def msg = em"""$sym may not be ${hl("final")}""" - def explain = - "A trait can never be final since it is abstract and must be extended to be useful." - } - - class NativeMembersMayNotHaveImplementation(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(NativeMembersMayNotHaveImplementationID) { - def msg = em"""${hl("@native")} members may not have an implementation""" - def explain = "" - } - - class TraitMayNotDefineNativeMethod(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(TraitMayNotDefineNativeMethodID) { - def msg = em"""A trait cannot define a ${hl("@native")} method.""" - def explain = "" - } - - class OnlyClassesCanHaveDeclaredButUndefinedMembers(sym: Symbol)( - implicit ctx: Context) - extends SyntaxMsg(OnlyClassesCanHaveDeclaredButUndefinedMembersID) { - - private def varNote = - if (sym.is(Mutable)) "Note that variables need to be initialized to be defined." + if (prec.isImportPrec) { + i"""$howVisible$qualifier by ${whereFound.importInfo}""" + } else + i"""$howVisible$qualifier in ${whereFound.owner}""" + } + + def msg(using Context) = + i"""|Reference to $name is ambiguous, + |it is both ${bindingString(newPrec, ctx)} + |and ${bindingString(prevPrec, prevCtx, " subsequently")}""" + + def explain(using Context) = + i"""|The compiler can't decide which of the possible choices you + |are referencing with $name: A definition of lower precedence + |in an inner scope, or a definition with higher precedence in + |an outer scope. 
+ |Note: + | - Definitions in an enclosing scope take precedence over inherited definitions + | - Definitions take precedence over imports + | - Named imports take precedence over wildcard imports + | - You may replace a name when imported using + | ${hl("import")} scala.{ $name => ${name.show + "Tick"} } + |""" +} + +class MethodDoesNotTakeParameters(tree: tpd.Tree)(using Context) +extends TypeMsg(MethodDoesNotTakeParametersId) { + def methodSymbol(using Context): Symbol = + def recur(t: tpd.Tree): Symbol = + val sym = tpd.methPart(t).symbol + if sym == defn.Any_typeCast then + t match + case TypeApply(Select(qual, _), _) => recur(qual) + case _ => sym + else sym + recur(tree) + + def msg(using Context) = { + val more = if (tree.isInstanceOf[tpd.Apply]) " more" else "" + val meth = methodSymbol + val methStr = if (meth.exists) meth.showLocated else "expression" + i"$methStr does not take$more parameters" + } + + def explain(using Context) = { + val isNullary = methodSymbol.info.isInstanceOf[ExprType] + val addendum = + if (isNullary) "\nNullary methods may not be called with parenthesis" else "" - def msg = em"""Declaration of $sym not allowed here: only classes can have declared but undefined members""" - def explain = s"$varNote" - } - class CannotExtendAnyVal(sym: Symbol)(using Context) - extends SyntaxMsg(CannotExtendAnyValID) { - def msg = em"""$sym cannot extend ${hl("AnyVal")}""" - def explain = - em"""Only classes (not traits) are allowed to extend ${hl("AnyVal")}, but traits may extend - |${hl("Any")} to become ${Green("\"universal traits\"")} which may only have ${hl("def")} members. - |Universal traits can be mixed into classes that extend ${hl("AnyVal")}. - |""" + "You have specified more parameter lists than defined in the method definition(s)." + addendum } - class CannotExtendJavaEnum(sym: Symbol)(using Context) - extends SyntaxMsg(CannotExtendJavaEnumID) { - def msg = em"""$sym cannot extend ${hl("java.lang.Enum")}: only enums defined with the ${hl("enum")} syntax can""" - def explain = "" - } - - class CannotExtendContextFunction(sym: Symbol)(using Context) - extends SyntaxMsg(CannotExtendFunctionID) { - def msg = em"""$sym cannot extend a context function class""" - def explain = "" - } - - class JavaEnumParentArgs(parent: Type)(using Context) - extends TypeMsg(JavaEnumParentArgsID) { - def msg = em"""not enough arguments for constructor Enum: ${hl("(name: String, ordinal: Int)")}: ${hl(parent.show)}""" - def explain = "" - } - - class CannotHaveSameNameAs(sym: Symbol, cls: Symbol, reason: CannotHaveSameNameAs.Reason)(using Context) - extends SyntaxMsg(CannotHaveSameNameAsID) { - import CannotHaveSameNameAs._ - def reasonMessage: String = reason match { - case CannotBeOverridden => "class definitions cannot be overridden" - case DefinedInSelf(self) => - s"""cannot define ${sym.showKind} member with the same name as a ${cls.showKind} member in self reference ${self.name}. 
- |(Note: this can be resolved by using another name) - |""".stripMargin - } +} - def msg = em"""$sym cannot have the same name as ${cls.showLocated} -- """ + reasonMessage - def explain = "" - } - object CannotHaveSameNameAs { - sealed trait Reason - case object CannotBeOverridden extends Reason - case class DefinedInSelf(self: tpd.ValDef) extends Reason - } - - class ValueClassesMayNotDefineInner(valueClass: Symbol, inner: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotDefineInnerID) { - def msg = em"""Value classes may not define an inner class""" - def explain = "" - } - - class ValueClassesMayNotDefineNonParameterField(valueClass: Symbol, field: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotDefineNonParameterFieldID) { - def msg = em"""Value classes may not define non-parameter field""" - def explain = "" - } - - class ValueClassesMayNotDefineASecondaryConstructor(valueClass: Symbol, constructor: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotDefineASecondaryConstructorID) { - def msg = em"""Value classes may not define a secondary constructor""" - def explain = "" - } - - class ValueClassesMayNotContainInitalization(valueClass: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotContainInitalizationID) { - def msg = em"""Value classes may not contain initialization statements""" - def explain = "" - } - - class ValueClassesMayNotBeAbstract(valueClass: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotBeAbstractID) { - def msg = em"""Value classes may not be ${hl("abstract")}""" - def explain = "" - } - - class ValueClassesMayNotBeContainted(valueClass: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotBeContaintedID) { - private def localOrMember = if (valueClass.owner.isTerm) "local class" else "member of another class" - def msg = s"""Value classes may not be a $localOrMember""" - def explain = "" - } - - class ValueClassesMayNotWrapAnotherValueClass(valueClass: Symbol)(using Context) - extends SyntaxMsg(ValueClassesMayNotWrapAnotherValueClassID) { - def msg = """A value class may not wrap another user-defined value class""" - def explain = "" - } - - class ValueClassParameterMayNotBeAVar(valueClass: Symbol, param: Symbol)(using Context) - extends SyntaxMsg(ValueClassParameterMayNotBeAVarID) { - def msg = em"""A value class parameter may not be a ${hl("var")}""" - def explain = - em"""A value class must have exactly one ${hl("val")} parameter.""" - } - - class ValueClassNeedsOneValParam(valueClass: Symbol)(using Context) - extends SyntaxMsg(ValueClassNeedsExactlyOneValParamID) { - def msg = em"""Value class needs one ${hl("val")} parameter""" - def explain = "" - } - - class ValueClassParameterMayNotBeCallByName(valueClass: Symbol, param: Symbol)(using Context) - extends SyntaxMsg(ValueClassParameterMayNotBeCallByNameID) { - def msg = s"Value class parameter `${param.name}` may not be call-by-name" - def explain = "" - } - - class SuperCallsNotAllowedInlineable(symbol: Symbol)(using Context) - extends SyntaxMsg(SuperCallsNotAllowedInlineableID) { - def msg = em"Super call not allowed in inlineable $symbol" - def explain = "Method inlining prohibits calling superclass methods, as it may lead to confusion about which super is being called." 
- } - - class NotAPath(tp: Type, usage: String)(using Context) extends TypeMsg(NotAPathID): - def msg = em"$tp is not a valid $usage, since it is not an immutable path" - def explain = - i"""An immutable path is - | - a reference to an immutable value, or - | - a reference to `this`, or - | - a selection of an immutable path with an immutable value.""" - - class WrongNumberOfParameters(expected: Int)(using Context) - extends SyntaxMsg(WrongNumberOfParametersID) { - def msg = s"Wrong number of parameters, expected: $expected" - def explain = "" - } - - class DuplicatePrivateProtectedQualifier()(using Context) - extends SyntaxMsg(DuplicatePrivateProtectedQualifierID) { - def msg = "Duplicate private/protected qualifier" - def explain = - em"It is not allowed to combine `private` and `protected` modifiers even if they are qualified to different scopes" - } - - class ExpectedStartOfTopLevelDefinition()(using Context) - extends SyntaxMsg(ExpectedStartOfTopLevelDefinitionID) { - def msg = "Expected start of definition" - def explain = - em"You have to provide either ${hl("class")}, ${hl("trait")}, ${hl("object")}, or ${hl("enum")} definitions after qualifiers" - } - - class NoReturnFromInlineable(owner: Symbol)(using Context) - extends SyntaxMsg(NoReturnFromInlineableID) { - def msg = em"No explicit ${hl("return")} allowed from inlineable $owner" - def explain = - em"""Methods marked with ${hl("inline")} modifier may not use ${hl("return")} statements. - |Instead, you should rely on the last expression's value being - |returned from a method. - |""" - } - - class ReturnOutsideMethodDefinition(owner: Symbol)(using Context) - extends SyntaxMsg(ReturnOutsideMethodDefinitionID) { - def msg = em"${hl("return")} outside method definition" - def explain = - em"""You used ${hl("return")} in ${owner}. - |${hl("return")} is a keyword and may only be used within method declarations. - |""" - } - - class ExtendFinalClass(clazz:Symbol, finalClazz: Symbol)(using Context) - extends SyntaxMsg(ExtendFinalClassID) { - def msg = em"$clazz cannot extend ${hl("final")} $finalClazz" - def explain = - em"""A class marked with the ${hl("final")} keyword cannot be extended""" - } - - class ExpectedTypeBoundOrEquals(found: Token)(using Context) - extends SyntaxMsg(ExpectedTypeBoundOrEqualsID) { - def msg = em"${hl("=")}, ${hl(">:")}, or ${hl("<:")} expected, but ${Tokens.showToken(found)} found" - - def explain = - em"""Type parameters and abstract types may be constrained by a type bound. - |Such type bounds limit the concrete values of the type variables and possibly - |reveal more information about the members of such types. - | - |A lower type bound ${hl("B >: A")} expresses that the type variable ${hl("B")} - |refers to a supertype of type ${hl("A")}. - | - |An upper type bound ${hl("T <: A")} declares that type variable ${hl("T")} - |refers to a subtype of type ${hl("A")}. 
- |""" - } - - class ClassAndCompanionNameClash(cls: Symbol, other: Symbol)(using Context) - extends NamingMsg(ClassAndCompanionNameClashID) { - def msg = - val name = cls.name.stripModuleClassSuffix - em"Name clash: both ${cls.owner} and its companion object defines $name" - def explain = - em"""|A ${cls.kindString} and its companion object cannot both define a ${hl("class")}, ${hl("trait")} or ${hl("object")} with the same name: - | - ${cls.owner} defines ${cls} - | - ${other.owner} defines ${other}""" - } - - class TailrecNotApplicable(symbol: Symbol)(using Context) - extends SyntaxMsg(TailrecNotApplicableID) { - def msg = { - val reason = - if (!symbol.is(Method)) em"$symbol isn't a method" - else if (symbol.is(Deferred)) em"$symbol is abstract" - else if (!symbol.isEffectivelyFinal) em"$symbol is neither ${hl("private")} nor ${hl("final")} so can be overridden" - else em"$symbol contains no recursive calls" - - s"TailRec optimisation not applicable, $reason" - } - def explain = "" - } +class AmbiguousOverload(tree: tpd.Tree, val alternatives: List[SingleDenotation], pt: Type, addendum: String = "")( + implicit ctx: Context) +extends ReferenceMsg(AmbiguousOverloadID), NoDisambiguation { + private def all = if (alternatives.length == 2) "both" else "all" + def msg(using Context) = + i"""|Ambiguous overload. The ${err.overloadedAltsStr(alternatives)} + |$all match ${err.expectedTypeStr(pt)}$addendum""" + def explain(using Context) = + i"""|There are ${alternatives.length} methods that could be referenced as the compiler knows too little + |about the expected type. + |You may specify the expected type e.g. by + |- assigning it to a value with a specified type, or + |- adding a type ascription as in ${hl("instance.myMethod: String => Int")} + |""" +} + +class ReassignmentToVal(name: Name)(using Context) + extends TypeMsg(ReassignmentToValID) { + def msg(using Context) = i"""Reassignment to val $name""" + def explain(using Context) = + i"""|You can not assign a new value to $name as values can't be changed. + |Keep in mind that every statement has a value, so you may e.g. use + | ${hl("val")} $name ${hl("= if (condition) 2 else 5")} + |In case you need a reassignable name, you can declare it as + |variable + | ${hl("var")} $name ${hl("=")} ... + |""" +} + +class TypeDoesNotTakeParameters(tpe: Type, params: List[untpd.Tree])(using Context) + extends TypeMsg(TypeDoesNotTakeParametersID) { + private def fboundsAddendum(using Context) = + if tpe.typeSymbol.isAllOf(Provisional | TypeParam) then + "\n(Note that F-bounds of type parameters may not be type lambdas)" + else "" + def msg(using Context) = i"$tpe does not take type parameters$fboundsAddendum" + def explain(using Context) = + val ps = + if (params.size == 1) s"a type parameter ${params.head}" + else s"type parameters ${params.map(_.show).mkString(", ")}" + i"""You specified ${NoColor(ps)} for $tpe, which is not + |declared to take any. + |""" +} + +class VarValParametersMayNotBeCallByName(name: TermName, mutable: Boolean)(using Context) + extends SyntaxMsg(VarValParametersMayNotBeCallByNameID) { + def varOrVal = if mutable then hl("var") else hl("val") + def msg(using Context) = s"$varOrVal parameters may not be call-by-name" + def explain(using Context) = + i"""${hl("var")} and ${hl("val")} parameters of classes and traits may no be call-by-name. 
In case you + |want the parameter to be evaluated on demand, consider making it just a parameter + |and a ${hl("def")} in the class such as + | ${s"class MyClass(${name}Tick: => String) {"} + | ${s" def $name() = ${name}Tick"} + | ${hl("}")} + |""" +} + +class MissingTypeParameterFor(tpe: Type)(using Context) + extends SyntaxMsg(MissingTypeParameterForID) { + def msg(using Context) = + if tpe.derivesFrom(defn.AnyKindClass) + then i"$tpe cannot be used as a value type" + else i"Missing type parameter for $tpe" + def explain(using Context) = "" +} + +class MissingTypeParameterInTypeApp(tpe: Type)(using Context) + extends TypeMsg(MissingTypeParameterInTypeAppID) { + def numParams = tpe.typeParams.length + def parameters = if (numParams == 1) "parameter" else "parameters" + def msg(using Context) = i"Missing type $parameters for $tpe" + def explain(using Context) = i"A fully applied type is expected but $tpe takes $numParams $parameters" +} + +class MissingArgument(pname: Name, methString: String)(using Context) + extends TypeMsg(MissingArgumentID): + def msg(using Context) = + if pname.firstPart contains '$' then s"not enough arguments for $methString" + else s"missing argument for parameter $pname of $methString" + def explain(using Context) = "" + +class DoesNotConformToBound(tpe: Type, which: String, bound: Type)(using Context) + extends TypeMismatchMsg( + if which == "lower" then bound else tpe, + if which == "lower" then tpe else bound)(DoesNotConformToBoundID): + private def isBounds = tpe match + case TypeBounds(lo, hi) => lo ne hi + case _ => false + override def canExplain = !isBounds + def msg(using Context) = + if isBounds then + i"Type argument ${tpe} does not overlap with $which bound $bound" + else + i"Type argument ${tpe} does not conform to $which bound $bound" + +class DoesNotConformToSelfType(category: String, selfType: Type, cls: Symbol, + otherSelf: Type, relation: String, other: Symbol)( + implicit ctx: Context) + extends TypeMismatchMsg(selfType, otherSelf)(DoesNotConformToSelfTypeID) { + def msg(using Context) = i"""$category: self type $selfType of $cls does not conform to self type $otherSelf + |of $relation $other""" +} + +class DoesNotConformToSelfTypeCantBeInstantiated(tp: Type, selfType: Type)( + implicit ctx: Context) + extends TypeMismatchMsg(tp, selfType)(DoesNotConformToSelfTypeCantBeInstantiatedID) { + def msg(using Context) = i"""$tp does not conform to its self type $selfType; cannot be instantiated""" +} + +class IllegalParameterInit(found: Type, expected: Type, param: Symbol, cls: Symbol)(using Context) + extends TypeMismatchMsg(found, expected)(IllegalParameterInitID): + def msg(using Context) = + i"""illegal parameter initialization of $param. 
+ | + | The argument passed for $param has type: $found + | but $cls expects $param to have type: $expected""" + +class AbstractMemberMayNotHaveModifier(sym: Symbol, flag: FlagSet)( + implicit ctx: Context) + extends SyntaxMsg(AbstractMemberMayNotHaveModifierID) { + def msg(using Context) = i"""${hl("abstract")} $sym may not have `${flag.flagsString}` modifier""" + def explain(using Context) = "" +} + +class TypesAndTraitsCantBeImplicit()(using Context) + extends SyntaxMsg(TypesAndTraitsCantBeImplicitID) { + def msg(using Context) = i"""${hl("implicit")} modifier cannot be used for types or traits""" + def explain(using Context) = "" +} + +class OnlyClassesCanBeAbstract(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(OnlyClassesCanBeAbstractID) { + def explain(using Context) = "" + def msg(using Context) = i"""${hl("abstract")} modifier can be used only for classes; it should be omitted for abstract members""" +} + +class AbstractOverrideOnlyInTraits(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(AbstractOverrideOnlyInTraitsID) { + def msg(using Context) = i"""${hl("abstract override")} modifier only allowed for members of traits""" + def explain(using Context) = "" +} + +class TraitsMayNotBeFinal(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(TraitsMayNotBeFinalID) { + def msg(using Context) = i"""$sym may not be ${hl("final")}""" + def explain(using Context) = + "A trait can never be final since it is abstract and must be extended to be useful." +} + +class NativeMembersMayNotHaveImplementation(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(NativeMembersMayNotHaveImplementationID) { + def msg(using Context) = i"""${hl("@native")} members may not have an implementation""" + def explain(using Context) = "" +} + +class TraitMayNotDefineNativeMethod(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(TraitMayNotDefineNativeMethodID) { + def msg(using Context) = i"""A trait cannot define a ${hl("@native")} method.""" + def explain(using Context) = "" +} + +class OnlyClassesCanHaveDeclaredButUndefinedMembers(sym: Symbol)( + implicit ctx: Context) + extends SyntaxMsg(OnlyClassesCanHaveDeclaredButUndefinedMembersID) { + + def msg(using Context) = i"""Declaration of $sym not allowed here: only classes can have declared but undefined members""" + def explain(using Context) = + if sym.is(Mutable) then "Note that variables need to be initialized to be defined." + else "" +} + +class CannotExtendAnyVal(sym: Symbol)(using Context) + extends SyntaxMsg(CannotExtendAnyValID) { + def msg(using Context) = i"""$sym cannot extend ${hl("AnyVal")}""" + def explain(using Context) = + i"""Only classes (not traits) are allowed to extend ${hl("AnyVal")}, but traits may extend + |${hl("Any")} to become ${Green("\"universal traits\"")} which may only have ${hl("def")} members. + |Universal traits can be mixed into classes that extend ${hl("AnyVal")}. + |""" +} - class FailureToEliminateExistential(tp: Type, tp1: Type, tp2: Type, boundSyms: List[Symbol], classRoot: Symbol)(using Context) - extends Message(FailureToEliminateExistentialID) { - def kind = MessageKind.Compatibility - def msg = - val originalType = ctx.printer.dclsText(boundSyms, "; ").show - em"""An existential type that came from a Scala-2 classfile for $classRoot - |cannot be mapped accurately to a Scala-3 equivalent. - |original type : $tp forSome ${originalType} - |reduces to : $tp1 - |type used instead: $tp2 - |This choice can cause follow-on type errors or hide type errors. 
- |Proceed at own risk.""" - def explain = - em"""Existential types in their full generality are no longer supported. - |Scala-3 does applications of class types to wildcard type arguments. - |Other forms of existential types that come from Scala-2 classfiles - |are only approximated in a best-effort way.""" +class CannotExtendJavaEnum(sym: Symbol)(using Context) + extends SyntaxMsg(CannotExtendJavaEnumID) { + def msg(using Context) = i"""$sym cannot extend ${hl("java.lang.Enum")}: only enums defined with the ${hl("enum")} syntax can""" + def explain(using Context) = "" } - class OnlyFunctionsCanBeFollowedByUnderscore(tp: Type)(using Context) - extends SyntaxMsg(OnlyFunctionsCanBeFollowedByUnderscoreID) { - def msg = em"Only function types can be followed by ${hl("_")} but the current expression has type $tp" - def explain = - em"""The syntax ${hl("x _")} is no longer supported if ${hl("x")} is not a function. - |To convert to a function value, you need to explicitly write ${hl("() => x")}""" +class CannotExtendContextFunction(sym: Symbol)(using Context) + extends SyntaxMsg(CannotExtendFunctionID) { + def msg(using Context) = i"""$sym cannot extend a context function class""" + def explain(using Context) = "" } - class MissingEmptyArgumentList(method: String)(using Context) - extends SyntaxMsg(MissingEmptyArgumentListID) { - def msg = em"$method must be called with ${hl("()")} argument" - def explain = { - val codeExample = - """def next(): T = ... - |next // is expanded to next()""" - - em"""Previously an empty argument list () was implicitly inserted when calling a nullary method without arguments. E.g. - | - |$codeExample - | - |In Dotty, this idiom is an error. The application syntax has to follow exactly the parameter syntax. - |Excluded from this rule are methods that are defined in Java or that override methods defined in Java.""" - } +class JavaEnumParentArgs(parent: Type)(using Context) + extends TypeMsg(JavaEnumParentArgsID) { + def msg(using Context) = i"""not enough arguments for constructor Enum: ${hl("(name: String, ordinal: Int)")}: ${hl(parent.show)}""" + def explain(using Context) = "" } - class DuplicateNamedTypeParameter(name: Name)(using Context) - extends SyntaxMsg(DuplicateNamedTypeParameterID) { - def msg = em"Type parameter $name was defined multiple times." - def explain = "" - } - - class UndefinedNamedTypeParameter(undefinedName: Name, definedNames: List[Name])(using Context) - extends SyntaxMsg(UndefinedNamedTypeParameterID) { - def msg = em"Type parameter $undefinedName is undefined. Expected one of ${definedNames.map(_.show).mkString(", ")}." - def explain = "" - } - - class IllegalStartOfStatement(what: String, isModifier: Boolean, isStat: Boolean)(using Context) extends SyntaxMsg(IllegalStartOfStatementID) { - def msg = - if isStat then - "this kind of statement is not allowed here" - else - val addendum = if isModifier then ": this modifier is not allowed here" else "" - s"Illegal start of $what$addendum" - def explain = - i"""A statement is an import or export, a definition or an expression. 
- |Some statements are only allowed in certain contexts""" +class CannotHaveSameNameAs(sym: Symbol, cls: Symbol, reason: CannotHaveSameNameAs.Reason)(using Context) + extends NamingMsg(CannotHaveSameNameAsID) { + import CannotHaveSameNameAs._ + def reasonMessage(using Context): String = reason match { + case CannotBeOverridden => "class definitions cannot be overridden" + case DefinedInSelf(self) => + s"""cannot define ${sym.showKind} member with the same name as a ${cls.showKind} member in self reference ${self.name}. + |(Note: this can be resolved by using another name) + |""".stripMargin } - class TraitIsExpected(symbol: Symbol)(using Context) extends SyntaxMsg(TraitIsExpectedID) { - def msg = em"$symbol is not a trait" - def explain = { - val errorCodeExample = - """class A - |class B - | - |val a = new A with B // will fail with a compile error - class B is not a trait""".stripMargin - val codeExample = - """class A - |trait B - | - |val a = new A with B // compiles normally""".stripMargin + def msg(using Context) = i"""$sym cannot have the same name as ${cls.showLocated} -- """ + reasonMessage + def explain(using Context) = "" +} +object CannotHaveSameNameAs { + sealed trait Reason + case object CannotBeOverridden extends Reason + case class DefinedInSelf(self: tpd.ValDef) extends Reason +} + +class ValueClassesMayNotDefineInner(valueClass: Symbol, inner: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotDefineInnerID) { + def msg(using Context) = i"""Value classes may not define an inner class""" + def explain(using Context) = "" +} + +class ValueClassesMayNotDefineNonParameterField(valueClass: Symbol, field: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotDefineNonParameterFieldID) { + def msg(using Context) = i"""Value classes may not define non-parameter field""" + def explain(using Context) = "" +} + +class ValueClassesMayNotDefineASecondaryConstructor(valueClass: Symbol, constructor: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotDefineASecondaryConstructorID) { + def msg(using Context) = i"""Value classes may not define a secondary constructor""" + def explain(using Context) = "" +} + +class ValueClassesMayNotContainInitalization(valueClass: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotContainInitalizationID) { + def msg(using Context) = i"""Value classes may not contain initialization statements""" + def explain(using Context) = "" +} + +class ValueClassesMayNotBeAbstract(valueClass: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotBeAbstractID) { + def msg(using Context) = i"""Value classes may not be ${hl("abstract")}""" + def explain(using Context) = "" +} + +class ValueClassesMayNotBeContainted(valueClass: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotBeContaintedID) { + private def localOrMember = if (valueClass.owner.isTerm) "local class" else "member of another class" + def msg(using Context) = s"""Value classes may not be a $localOrMember""" + def explain(using Context) = "" +} + +class ValueClassesMayNotWrapAnotherValueClass(valueClass: Symbol)(using Context) + extends SyntaxMsg(ValueClassesMayNotWrapAnotherValueClassID) { + def msg(using Context) = """A value class may not wrap another user-defined value class""" + def explain(using Context) = "" +} + +class ValueClassParameterMayNotBeAVar(valueClass: Symbol, param: Symbol)(using Context) + extends SyntaxMsg(ValueClassParameterMayNotBeAVarID) { + def msg(using Context) = i"""A value class parameter may not be a ${hl("var")}""" + def 
explain(using Context) = + i"""A value class must have exactly one ${hl("val")} parameter.""" +} + +class ValueClassNeedsOneValParam(valueClass: Symbol)(using Context) + extends SyntaxMsg(ValueClassNeedsExactlyOneValParamID) { + def msg(using Context) = i"""Value class needs one ${hl("val")} parameter""" + def explain(using Context) = "" +} + +class ValueClassParameterMayNotBeCallByName(valueClass: Symbol, param: Symbol)(using Context) + extends SyntaxMsg(ValueClassParameterMayNotBeCallByNameID) { + def msg(using Context) = s"Value class parameter `${param.name}` may not be call-by-name" + def explain(using Context) = "" +} + +class SuperCallsNotAllowedInlineable(symbol: Symbol)(using Context) + extends SyntaxMsg(SuperCallsNotAllowedInlineableID) { + def msg(using Context) = i"Super call not allowed in inlineable $symbol" + def explain(using Context) = "Method inlining prohibits calling superclass methods, as it may lead to confusion about which super is being called." +} + +class NotAPath(tp: Type, usage: String)(using Context) extends TypeMsg(NotAPathID): + def msg(using Context) = i"$tp is not a valid $usage, since it is not an immutable path" + def explain(using Context) = + i"""An immutable path is + | - a reference to an immutable value, or + | - a reference to `this`, or + | - a selection of an immutable path with an immutable value.""" + +class WrongNumberOfParameters(expected: Int)(using Context) + extends SyntaxMsg(WrongNumberOfParametersID) { + def msg(using Context) = s"Wrong number of parameters, expected: $expected" + def explain(using Context) = "" +} + +class DuplicatePrivateProtectedQualifier()(using Context) + extends SyntaxMsg(DuplicatePrivateProtectedQualifierID) { + def msg(using Context) = "Duplicate private/protected qualifier" + def explain(using Context) = + i"It is not allowed to combine `private` and `protected` modifiers even if they are qualified to different scopes" +} + +class ExpectedStartOfTopLevelDefinition()(using Context) + extends SyntaxMsg(ExpectedStartOfTopLevelDefinitionID) { + def msg(using Context) = "Expected start of definition" + def explain(using Context) = + i"You have to provide either ${hl("class")}, ${hl("trait")}, ${hl("object")}, or ${hl("enum")} definitions after qualifiers" +} + +class NoReturnFromInlineable(owner: Symbol)(using Context) + extends SyntaxMsg(NoReturnFromInlineableID) { + def msg(using Context) = i"No explicit ${hl("return")} allowed from inlineable $owner" + def explain(using Context) = + i"""Methods marked with ${hl("inline")} modifier may not use ${hl("return")} statements. + |Instead, you should rely on the last expression's value being + |returned from a method. + |""" +} + +class ReturnOutsideMethodDefinition(owner: Symbol)(using Context) + extends SyntaxMsg(ReturnOutsideMethodDefinitionID) { + def msg(using Context) = i"${hl("return")} outside method definition" + def explain(using Context) = + i"""You used ${hl("return")} in ${owner}. + |${hl("return")} is a keyword and may only be used within method declarations. 
+ |""" +} + +class ExtendFinalClass(clazz:Symbol, finalClazz: Symbol)(using Context) + extends SyntaxMsg(ExtendFinalClassID) { + def msg(using Context) = i"$clazz cannot extend ${hl("final")} $finalClazz" + def explain(using Context) = + i"""A class marked with the ${hl("final")} keyword cannot be extended""" +} + +class ExpectedTypeBoundOrEquals(found: Token)(using Context) + extends SyntaxMsg(ExpectedTypeBoundOrEqualsID) { + def msg(using Context) = i"${hl("=")}, ${hl(">:")}, or ${hl("<:")} expected, but ${Tokens.showToken(found)} found" + + def explain(using Context) = + i"""Type parameters and abstract types may be constrained by a type bound. + |Such type bounds limit the concrete values of the type variables and possibly + |reveal more information about the members of such types. + | + |A lower type bound ${hl("B >: A")} expresses that the type variable ${hl("B")} + |refers to a supertype of type ${hl("A")}. + | + |An upper type bound ${hl("T <: A")} declares that type variable ${hl("T")} + |refers to a subtype of type ${hl("A")}. + |""" +} + +class ClassAndCompanionNameClash(cls: Symbol, other: Symbol)(using Context) + extends NamingMsg(ClassAndCompanionNameClashID) { + def msg(using Context) = + val name = cls.name.stripModuleClassSuffix + i"Name clash: both ${cls.owner} and its companion object defines $name" + def explain(using Context) = + i"""|A ${cls.kindString} and its companion object cannot both define a ${hl("class")}, ${hl("trait")} or ${hl("object")} with the same name: + | - ${cls.owner} defines ${cls} + | - ${other.owner} defines ${other}""" +} + +class TailrecNotApplicable(symbol: Symbol)(using Context) + extends SyntaxMsg(TailrecNotApplicableID) { + def msg(using Context) = { + val reason = + if !symbol.is(Method) then i"$symbol isn't a method" + else if symbol.is(Deferred) then i"$symbol is abstract" + else if !symbol.isEffectivelyFinal then i"$symbol is neither ${hl("private")} nor ${hl("final")} so can be overridden" + else i"$symbol contains no recursive calls" + + s"TailRec optimisation not applicable, $reason" + } + def explain(using Context) = "" +} + +class FailureToEliminateExistential(tp: Type, tp1: Type, tp2: Type, boundSyms: List[Symbol], classRoot: Symbol)(using Context) + extends Message(FailureToEliminateExistentialID) { + def kind = MessageKind.Compatibility + def msg(using Context) = + val originalType = ctx.printer.dclsText(boundSyms, "; ").show + i"""An existential type that came from a Scala-2 classfile for $classRoot + |cannot be mapped accurately to a Scala-3 equivalent. + |original type : $tp forSome ${originalType} + |reduces to : $tp1 + |type used instead: $tp2 + |This choice can cause follow-on type errors or hide type errors. + |Proceed at own risk.""" + def explain(using Context) = + i"""Existential types in their full generality are no longer supported. + |Scala-3 does applications of class types to wildcard type arguments. + |Other forms of existential types that come from Scala-2 classfiles + |are only approximated in a best-effort way.""" +} + +class OnlyFunctionsCanBeFollowedByUnderscore(tp: Type)(using Context) + extends SyntaxMsg(OnlyFunctionsCanBeFollowedByUnderscoreID) { + def msg(using Context) = i"Only function types can be followed by ${hl("_")} but the current expression has type $tp" + def explain(using Context) = + i"""The syntax ${hl("x _")} is no longer supported if ${hl("x")} is not a function. 
+ |To convert to a function value, you need to explicitly write ${hl("() => x")}""" +} + +class MissingEmptyArgumentList(method: String)(using Context) + extends SyntaxMsg(MissingEmptyArgumentListID) { + def msg(using Context) = i"$method must be called with ${hl("()")} argument" + def explain(using Context) = { + val codeExample = + """def next(): T = ... + |next // is expanded to next()""" + + i"""Previously an empty argument list () was implicitly inserted when calling a nullary method without arguments. E.g. + | + |$codeExample + | + |In Dotty, this idiom is an error. The application syntax has to follow exactly the parameter syntax. + |Excluded from this rule are methods that are defined in Java or that override methods defined in Java.""" + } +} + +class DuplicateNamedTypeParameter(name: Name)(using Context) + extends SyntaxMsg(DuplicateNamedTypeParameterID) { + def msg(using Context) = i"Type parameter $name was defined multiple times." + def explain(using Context) = "" +} + +class UndefinedNamedTypeParameter(undefinedName: Name, definedNames: List[Name])(using Context) + extends SyntaxMsg(UndefinedNamedTypeParameterID) { + def msg(using Context) = i"Type parameter $undefinedName is undefined. Expected one of ${definedNames.map(_.show).mkString(", ")}." + def explain(using Context) = "" +} + +class IllegalStartOfStatement(what: String, isModifier: Boolean, isStat: Boolean)(using Context) extends SyntaxMsg(IllegalStartOfStatementID) { + def msg(using Context) = + if isStat then + "this kind of statement is not allowed here" + else + val addendum = if isModifier then ": this modifier is not allowed here" else "" + s"Illegal start of $what$addendum" + def explain(using Context) = + i"""A statement is an import or export, a definition or an expression. + |Some statements are only allowed in certain contexts""" +} + +class TraitIsExpected(symbol: Symbol)(using Context) extends SyntaxMsg(TraitIsExpectedID) { + def msg(using Context) = i"$symbol is not a trait" + def explain(using Context) = { + val errorCodeExample = + """class A + |class B + | + |val a = new A with B // will fail with a compile error - class B is not a trait""".stripMargin + val codeExample = + """class A + |trait B + | + |val a = new A with B // compiles normally""".stripMargin - em"""Only traits can be mixed into classes using a ${hl("with")} keyword. - |Consider the following example: - | - |$errorCodeExample - | - |The example mentioned above would fail because B is not a trait. - |But if you make B a trait it will be compiled without any errors: - | - |$codeExample - |""" - } + i"""Only traits can be mixed into classes using a ${hl("with")} keyword. + |Consider the following example: + | + |$errorCodeExample + | + |The example mentioned above would fail because B is not a trait. + |But if you make B a trait it will be compiled without any errors: + | + |$codeExample + |""" } +} - class TraitRedefinedFinalMethodFromAnyRef(method: Symbol)(using Context) extends SyntaxMsg(TraitRedefinedFinalMethodFromAnyRefID) { - def msg = em"Traits cannot redefine final $method from ${hl("class AnyRef")}." - def explain = "" - } +class TraitRedefinedFinalMethodFromAnyRef(method: Symbol)(using Context) extends SyntaxMsg(TraitRedefinedFinalMethodFromAnyRefID) { + def msg(using Context) = i"Traits cannot redefine final $method from ${hl("class AnyRef")}." 
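// Editor's sketch, not part of the patch: the nullary-application rule behind
// MissingEmptyArgumentList above. Names are hypothetical.
class Counter:
  private var n = 0
  def next(): Int = { n += 1; n }

val c = Counter()
val first = c.next()   // ok: the empty argument list is written explicitly
// val bad = c.next    // rejected in Scala 3: method next must be called with () argument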
+ def explain(using Context) = "" +} - class AlreadyDefined(name: Name, owner: Symbol, conflicting: Symbol)(using Context) extends NamingMsg(AlreadyDefinedID): - private def where: String = +class AlreadyDefined(name: Name, owner: Symbol, conflicting: Symbol)(using Context) +extends NamingMsg(AlreadyDefinedID): + def msg(using Context) = + def where: String = if conflicting.effectiveOwner.is(Package) && conflicting.associatedFile != null then i" in ${conflicting.associatedFile}" else if conflicting.owner == owner then "" else i" in ${conflicting.owner}" - private def note = + def note = if owner.is(Method) || conflicting.is(Method) then "\n\nNote that overloaded methods must all be defined in the same group of toplevel definitions" else "" - def msg = - if conflicting.isTerm != name.isTermName then - em"$name clashes with $conflicting$where; the two must be defined together" - else - em"$name is already defined as $conflicting$where$note" - def explain = "" - - class PackageNameAlreadyDefined(pkg: Symbol)(using Context) extends NamingMsg(PackageNameAlreadyDefinedID) { - lazy val (where, or) = - if pkg.associatedFile == null then ("", "") - else (s" in ${pkg.associatedFile}", " or delete the containing class file") - def msg = em"""${pkg.name} is the name of $pkg$where. - |It cannot be used at the same time as the name of a package.""" - def explain = - em"""An ${hl("object")} or other toplevel definition cannot have the same name as an existing ${hl("package")}. - |Rename either one of them$or.""" - } - - class UnapplyInvalidNumberOfArguments(qual: untpd.Tree, argTypes: List[Type])(using Context) - extends SyntaxMsg(UnapplyInvalidNumberOfArgumentsID) { - def msg = em"Wrong number of argument patterns for $qual; expected: ($argTypes%, %)" - def explain = - em"""The Unapply method of $qual was used with incorrect number of arguments. - |Expected usage would be something like: - |case $qual(${argTypes.map(_ => '_')}%, %) => ... - | - |where subsequent arguments would have following types: ($argTypes%, %). - |""".stripMargin - } - - class UnapplyInvalidReturnType(unapplyResult: Type, unapplyName: Name)(using Context) - extends DeclarationMsg(UnapplyInvalidReturnTypeID) { - def msg = - val addendum = - if Feature.migrateTo3 && unapplyName == nme.unapplySeq - then "\nYou might want to try to rewrite the extractor to use `unapply` instead." 
- else "" - em"""| ${Red(i"$unapplyResult")} is not a valid result type of an $unapplyName method of an ${Magenta("extractor")}.$addendum""" - def explain = if (unapplyName.show == "unapply") - em""" - |To be used as an extractor, an unapply method has to return a type that either: - | - has members ${Magenta("isEmpty: Boolean")} and ${Magenta("get: S")} (usually an ${Green("Option[S]")}) - | - is a ${Green("Boolean")} - | - is a ${Green("Product")} (like a ${Magenta("Tuple2[T1, T2]")}) - | - |class A(val i: Int) - | - |object B { - | def unapply(a: A): ${Green("Option[Int]")} = Some(a.i) - |} - | - |object C { - | def unapply(a: A): ${Green("Boolean")} = a.i == 2 - |} - | - |object D { - | def unapply(a: A): ${Green("(Int, Int)")} = (a.i, a.i) - |} - | - |object Test { - | def test(a: A) = a match { - | ${Magenta("case B(1)")} => 1 - | ${Magenta("case a @ C()")} => 2 - | ${Magenta("case D(3, 3)")} => 3 - | } - |} - """.stripMargin + if conflicting.isTerm != name.isTermName then + i"$name clashes with $conflicting$where; the two must be defined together" else - em""" - |To be used as an extractor, an unapplySeq method has to return a type which has members - |${Magenta("isEmpty: Boolean")} and ${Magenta("get: S")} where ${Magenta("S <: Seq[V]")} (usually an ${Green("Option[Seq[V]]")}): - | - |object CharList { - | def unapplySeq(s: String): ${Green("Option[Seq[Char]")} = Some(s.toList) - | - | "example" match { - | ${Magenta("case CharList(c1, c2, c3, c4, _, _, _)")} => - | println(s"$$c1,$$c2,$$c3,$$c4") - | case _ => - | println("Expected *exactly* 7 characters!") - | } - |} - """.stripMargin - } - - class StaticFieldsOnlyAllowedInObjects(member: Symbol)(using Context) extends SyntaxMsg(StaticFieldsOnlyAllowedInObjectsID) { - def msg = em"${hl("@static")} $member in ${member.owner} must be defined inside a static ${hl("object")}." - def explain = - em"${hl("@static")} members are only allowed inside objects." - } - - class StaticFieldsShouldPrecedeNonStatic(member: Symbol, defns: List[tpd.Tree])(using Context) extends SyntaxMsg(StaticFieldsShouldPrecedeNonStaticID) { - def msg = em"${hl("@static")} $member in ${member.owner} must be defined before non-static fields." - def explain = { - val nonStatics = defns.takeWhile(_.symbol != member).take(3).filter(_.isInstanceOf[tpd.ValDef]) - val codeExample = s"""object ${member.owner.name.firstPart} { - | @static ${member} = ... - | ${nonStatics.map(m => s"${m.symbol} = ...").mkString("\n ")} - | ... - |}""" - em"""The fields annotated with @static should precede any non @static fields. - |This ensures that we do not introduce surprises for users in initialization order of this class. - |Static field are initialized when class loading the code of Foo. - |Non static fields are only initialized the first time that Foo is accessed. - | - |The definition of ${member.name} should have been before the non ${hl("@static val")}s: - |$codeExample + i"$name is already defined as $conflicting$where$note" + def explain(using Context) = "" + +class PackageNameAlreadyDefined(pkg: Symbol)(using Context) extends NamingMsg(PackageNameAlreadyDefinedID) { + def msg(using Context) = + def where = if pkg.associatedFile == null then "" else s" in ${pkg.associatedFile}" + i"""${pkg.name} is the name of $pkg$where. 
+ |It cannot be used at the same time as the name of a package.""" + def explain(using Context) = + def or = if pkg.associatedFile == null then "" else " or delete the containing class file" + i"""An ${hl("object")} or other toplevel definition cannot have the same name as an existing ${hl("package")}. + |Rename either one of them$or.""" +} + +class UnapplyInvalidNumberOfArguments(qual: untpd.Tree, argTypes: List[Type])(using Context) + extends SyntaxMsg(UnapplyInvalidNumberOfArgumentsID) { + def msg(using Context) = i"Wrong number of argument patterns for $qual; expected: ($argTypes%, %)" + def explain(using Context) = + i"""The Unapply method of $qual was used with incorrect number of arguments. + |Expected usage would be something like: + |case $qual(${argTypes.map(_ => '_')}%, %) => ... + | + |where subsequent arguments would have following types: ($argTypes%, %). |""" +} + +class UnapplyInvalidReturnType(unapplyResult: Type, unapplyName: Name)(using Context) + extends DeclarationMsg(UnapplyInvalidReturnTypeID) { + def msg(using Context) = + val addendum = + if Feature.migrateTo3 && unapplyName == nme.unapplySeq + then "\nYou might want to try to rewrite the extractor to use `unapply` instead." + else "" + i"""| ${Red(i"$unapplyResult")} is not a valid result type of an $unapplyName method of an ${Magenta("extractor")}.$addendum""" + def explain(using Context) = if (unapplyName.show == "unapply") + i""" + |To be used as an extractor, an unapply method has to return a type that either: + | - has members ${Magenta("isEmpty: Boolean")} and ${Magenta("get: S")} (usually an ${Green("Option[S]")}) + | - is a ${Green("Boolean")} + | - is a ${Green("Product")} (like a ${Magenta("Tuple2[T1, T2]")}) + | + |class A(val i: Int) + | + |object B { + | def unapply(a: A): ${Green("Option[Int]")} = Some(a.i) + |} + | + |object C { + | def unapply(a: A): ${Green("Boolean")} = a.i == 2 + |} + | + |object D { + | def unapply(a: A): ${Green("(Int, Int)")} = (a.i, a.i) + |} + | + |object Test { + | def test(a: A) = a match { + | ${Magenta("case B(1)")} => 1 + | ${Magenta("case a @ C()")} => 2 + | ${Magenta("case D(3, 3)")} => 3 + | } + |} + """ + else + i""" + |To be used as an extractor, an unapplySeq method has to return a type which has members + |${Magenta("isEmpty: Boolean")} and ${Magenta("get: S")} where ${Magenta("S <: Seq[V]")} (usually an ${Green("Option[Seq[V]]")}): + | + |object CharList { + | def unapplySeq(s: String): ${Green("Option[Seq[Char]")} = Some(s.toList) + | + | "example" match { + | ${Magenta("case CharList(c1, c2, c3, c4, _, _, _)")} => + | println(s"$$c1,$$c2,$$c3,$$c4") + | case _ => + | println("Expected *exactly* 7 characters!") + | } + |} + """ +} + +class StaticFieldsOnlyAllowedInObjects(member: Symbol)(using Context) extends SyntaxMsg(StaticFieldsOnlyAllowedInObjectsID) { + def msg(using Context) = i"${hl("@static")} $member in ${member.owner} must be defined inside a static ${hl("object")}." + def explain(using Context) = + i"${hl("@static")} members are only allowed inside objects." +} + +class StaticFieldsShouldPrecedeNonStatic(member: Symbol, defns: List[tpd.Tree])(using Context) extends SyntaxMsg(StaticFieldsShouldPrecedeNonStaticID) { + def msg(using Context) = i"${hl("@static")} $member in ${member.owner} must be defined before non-static fields." + def explain(using Context) = { + val nonStatics = defns.takeWhile(_.symbol != member).take(3).filter(_.isInstanceOf[tpd.ValDef]) + val codeExample = s"""object ${member.owner.name.firstPart} { + | @static ${member} = ... 
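// Editor's sketch, not part of the patch: the three accepted result shapes for an
// unapply method, as described by UnapplyInvalidReturnType above. Names are hypothetical.
class User(val name: String, val age: Int)

object Named:
  def unapply(u: User): Option[String] = Some(u.name)      // isEmpty/get shape
object Adult:
  def unapply(u: User): Boolean = u.age >= 18              // Boolean test
object Fields:
  def unapply(u: User): (String, Int) = (u.name, u.age)    // Product shape

def describe(u: User): String = u match
  case Adult()           => "an adult"
  case Fields(name, age) => s"$name, aged $age"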
+ | ${nonStatics.map(m => s"${m.symbol} = ...").mkString("\n ")} + | ... + |}""" + i"""The fields annotated with @static should precede any non @static fields. + |This ensures that we do not introduce surprises for users in initialization order of this class. + |Static field are initialized when class loading the code of Foo. + |Non static fields are only initialized the first time that Foo is accessed. + | + |The definition of ${member.name} should have been before the non ${hl("@static val")}s: + |$codeExample + |""" + } +} + +class CyclicInheritance(symbol: Symbol, addendum: => String)(using Context) extends SyntaxMsg(CyclicInheritanceID) { + def msg(using Context) = i"Cyclic inheritance: $symbol extends itself$addendum" + def explain(using Context) = { + val codeExample = "class A extends A" + + i"""Cyclic inheritance is prohibited in Dotty. + |Consider the following example: + | + |$codeExample + | + |The example mentioned above would fail because this type of inheritance hierarchy + |creates a "cycle" where a not yet defined class A extends itself which makes + |impossible to instantiate an object of this class""" + } +} + +class BadSymbolicReference(denot: SymDenotation)(using Context) +extends ReferenceMsg(BadSymbolicReferenceID) { + def msg(using Context) = { + val denotationOwner = denot.owner + val denotationName = ctx.fresh.setSetting(ctx.settings.YdebugNames, true).printer.nameString(denot.name) + val file = denot.symbol.associatedFile + val (location, src) = + if (file != null) (s" in $file", file.toString) + else ("", "the signature") + + i"""Bad symbolic reference. A signature$location + |refers to $denotationName in ${denotationOwner.showKind} ${denotationOwner.showFullName} which is not available. + |It may be completely missing from the current classpath, or the version on + |the classpath might be incompatible with the version used when compiling $src.""" + } + + def explain(using Context) = "" +} + +class UnableToExtendSealedClass(pclazz: Symbol)(using Context) extends SyntaxMsg(UnableToExtendSealedClassID) { + def msg(using Context) = i"Cannot extend ${hl("sealed")} $pclazz in a different source file" + def explain(using Context) = "A sealed class or trait can only be extended in the same file as its declaration" +} + +class SymbolHasUnparsableVersionNumber(symbol: Symbol, errorMessage: String)(using Context) +extends SyntaxMsg(SymbolHasUnparsableVersionNumberID) { + def msg(using Context) = i"${symbol.showLocated} has an unparsable version number: $errorMessage" + def explain(using Context) = + i"""The ${symbol.showLocated} is marked with ${hl("@migration")} indicating it has changed semantics + |between versions and the ${hl("-Xmigration")} settings is used to warn about constructs + |whose behavior may have changed since version change.""" +} + +class SymbolChangedSemanticsInVersion( + symbol: Symbol, + migrationVersion: ScalaVersion, + migrationMessage: String +)(using Context) extends SyntaxMsg(SymbolChangedSemanticsInVersionID) { + def msg(using Context) = i"${symbol.showLocated} has changed semantics in version $migrationVersion: $migrationMessage" + def explain(using Context) = + i"""The ${symbol.showLocated} is marked with ${hl("@migration")} indicating it has changed semantics + |between versions and the ${hl("-Xmigration")} settings is used to warn about constructs + |whose behavior may have changed since version change.""" +} + +class UnableToEmitSwitch()(using Context) +extends SyntaxMsg(UnableToEmitSwitchID) { + def msg(using Context) = i"Could not emit 
switch for ${hl("@switch")} annotated match" + def explain(using Context) = { + val codeExample = + """val ConstantB = 'B' + |final val ConstantC = 'C' + |def tokenMe(ch: Char) = (ch: @switch) match { + | case '\t' | '\n' => 1 + | case 'A' => 2 + | case ConstantB => 3 // a non-literal may prevent switch generation: this would not compile + | case ConstantC => 4 // a constant value is allowed + | case _ => 5 + |}""".stripMargin + + i"""If annotated with ${hl("@switch")}, the compiler will verify that the match has been compiled to a + |tableswitch or lookupswitch and issue an error if it instead compiles into a series of conditional + |expressions. Example usage: + | + |$codeExample + | + |The compiler will not apply the optimisation if: + |- the matched value is not of type ${hl("Int")}, ${hl("Byte")}, ${hl("Short")} or ${hl("Char")} + |- the matched value is not a constant literal + |- there are less than three cases""" + } +} + +class MissingCompanionForStatic(member: Symbol)(using Context) +extends SyntaxMsg(MissingCompanionForStaticID) { + def msg(using Context) = i"${member.owner} does not have a companion class" + def explain(using Context) = + i"An object that contains ${hl("@static")} members must have a companion class." +} + +class PolymorphicMethodMissingTypeInParent(rsym: Symbol, parentSym: Symbol)(using Context) +extends SyntaxMsg(PolymorphicMethodMissingTypeInParentID) { + def msg(using Context) = i"Polymorphic refinement $rsym without matching type in parent $parentSym is no longer allowed" + def explain(using Context) = + i"""Polymorphic $rsym is not allowed in the structural refinement of $parentSym because + |$rsym does not override any method in $parentSym. Structural refinement does not allow for + |polymorphic methods.""" +} + +class ParamsNoInline(owner: Symbol)(using Context) + extends SyntaxMsg(ParamsNoInlineID) { + def msg(using Context) = i"""${hl("inline")} modifier can only be used for parameters of inline methods""" + def explain(using Context) = "" +} + +class JavaSymbolIsNotAValue(symbol: Symbol)(using Context) extends TypeMsg(JavaSymbolIsNotAValueID) { + def msg(using Context) = + val kind = + if symbol is Package then i"$symbol" + else i"Java defined ${hl("class " + symbol.name)}" + s"$kind is not a value" + def explain(using Context) = "" +} + +class DoubleDefinition(decl: Symbol, previousDecl: Symbol, base: Symbol)(using Context) +extends NamingMsg(DoubleDefinitionID) { + def msg(using Context) = { + def nameAnd = if (decl.name != previousDecl.name) " name and" else "" + def erasedType = if ctx.erasedTypes then i" ${decl.info}" else "" + def details(using Context): String = + if (decl.isRealMethod && previousDecl.isRealMethod) { + import Signature.MatchDegree._ + + // compare the signatures when both symbols represent methods + decl.signature.matchDegree(previousDecl.signature) match { + case NoMatch => + // If the signatures don't match at all at the current phase, then + // they might match after erasure. + if ctx.phase.id <= elimErasedValueTypePhase.id then + atPhase(elimErasedValueTypePhase.next)(details) + else + "" // shouldn't be reachable + case ParamMatch => + "have matching parameter types." + case MethodNotAMethodMatch => + "neither has parameters." 
+ case FullMatch => + val hint = + if !decl.hasAnnotation(defn.TargetNameAnnot) + && !previousDecl.hasAnnotation(defn.TargetNameAnnot) + then + i""" + | + |Consider adding a @targetName annotation to one of the conflicting definitions + |for disambiguation.""" + else "" + i"have the same$nameAnd type$erasedType after erasure.$hint" + } + } + else "" + def symLocation(sym: Symbol) = { + val lineDesc = + if (sym.span.exists && sym.span != sym.owner.span) + s" at line ${sym.srcPos.line + 1}" + else "" + i"in ${sym.owner}${lineDesc}" } - } - - class CyclicInheritance(symbol: Symbol, addendum: => String)(using Context) extends SyntaxMsg(CyclicInheritanceID) { - def msg = em"Cyclic inheritance: $symbol extends itself$addendum" - def explain = { - val codeExample = "class A extends A" - - em"""Cyclic inheritance is prohibited in Dotty. - |Consider the following example: - | - |$codeExample - | - |The example mentioned above would fail because this type of inheritance hierarchy - |creates a "cycle" where a not yet defined class A extends itself which makes - |impossible to instantiate an object of this class""" - } - } - - class BadSymbolicReference(denot: SymDenotation)(using Context) - extends ReferenceMsg(BadSymbolicReferenceID) { - def msg = { - val denotationOwner = denot.owner - val denotationName = ctx.fresh.setSetting(ctx.settings.YdebugNames, true).printer.nameString(denot.name) - val file = denot.symbol.associatedFile - val (location, src) = - if (file != null) (s" in $file", file.toString) - else ("", "the signature") - - em"""Bad symbolic reference. A signature$location - |refers to $denotationName in ${denotationOwner.showKind} ${denotationOwner.showFullName} which is not available. - |It may be completely missing from the current classpath, or the version on - |the classpath might be incompatible with the version used when compiling $src.""" - } - - def explain = "" - } - - class UnableToExtendSealedClass(pclazz: Symbol)(using Context) extends SyntaxMsg(UnableToExtendSealedClassID) { - def msg = em"Cannot extend ${hl("sealed")} $pclazz in a different source file" - def explain = "A sealed class or trait can only be extended in the same file as its declaration" - } - - class SymbolHasUnparsableVersionNumber(symbol: Symbol, errorMessage: String)(using Context) - extends SyntaxMsg(SymbolHasUnparsableVersionNumberID) { - def msg = em"${symbol.showLocated} has an unparsable version number: $errorMessage" - def explain = - em"""The ${symbol.showLocated} is marked with ${hl("@migration")} indicating it has changed semantics - |between versions and the ${hl("-Xmigration")} settings is used to warn about constructs - |whose behavior may have changed since version change.""" - } - - class SymbolChangedSemanticsInVersion( - symbol: Symbol, - migrationVersion: ScalaVersion, - migrationMessage: String - )(using Context) extends SyntaxMsg(SymbolChangedSemanticsInVersionID) { - def msg = em"${symbol.showLocated} has changed semantics in version $migrationVersion: $migrationMessage" - def explain = - em"""The ${symbol.showLocated} is marked with ${hl("@migration")} indicating it has changed semantics - |between versions and the ${hl("-Xmigration")} settings is used to warn about constructs - |whose behavior may have changed since version change.""" - } - - class UnableToEmitSwitch()(using Context) - extends SyntaxMsg(UnableToEmitSwitchID) { - def msg = em"Could not emit switch for ${hl("@switch")} annotated match" - def explain = { - val codeExample = - """val ConstantB = 'B' - |final val ConstantC 
= 'C' - |def tokenMe(ch: Char) = (ch: @switch) match { - | case '\t' | '\n' => 1 - | case 'A' => 2 - | case ConstantB => 3 // a non-literal may prevent switch generation: this would not compile - | case ConstantC => 4 // a constant value is allowed - | case _ => 5 - |}""".stripMargin + val clashDescription = + if (decl.owner eq previousDecl.owner) + "Double definition" + else if ((decl.owner eq base) || (previousDecl eq base)) + "Name clash between defined and inherited member" + else + "Name clash between inherited members" - em"""If annotated with ${hl("@switch")}, the compiler will verify that the match has been compiled to a - |tableswitch or lookupswitch and issue an error if it instead compiles into a series of conditional - |expressions. Example usage: - | - |$codeExample - | - |The compiler will not apply the optimisation if: - |- the matched value is not of type ${hl("Int")}, ${hl("Byte")}, ${hl("Short")} or ${hl("Char")} - |- the matched value is not a constant literal - |- there are less than three cases""" + atPhase(typerPhase) { + i"""$clashDescription: + |${previousDecl.showDcl} ${symLocation(previousDecl)} and + |${decl.showDcl} ${symLocation(decl)} + |""" + } + details + } + def explain(using Context) = "" +} + +class ImportRenamedTwice(ident: untpd.Ident)(using Context) extends SyntaxMsg(ImportRenamedTwiceID) { + def msg(using Context) = s"${ident.show} is renamed twice on the same import line." + def explain(using Context) = "" +} + +class TypeTestAlwaysDiverges(scrutTp: Type, testTp: Type)(using Context) extends SyntaxMsg(TypeTestAlwaysDivergesID) { + def msg(using Context) = + s"This type test will never return a result since the scrutinee type ${scrutTp.show} does not contain any value." + def explain(using Context) = "" +} + +// Relative of CyclicReferenceInvolvingImplicit and RecursiveValueNeedsResultType +class TermMemberNeedsResultTypeForImplicitSearch(cycleSym: Symbol)(using Context) + extends CyclicMsg(TermMemberNeedsNeedsResultTypeForImplicitSearchID) { + def msg(using Context) = i"""$cycleSym needs result type because its right-hand side attempts implicit search""" + def explain(using Context) = + i"""|The right hand-side of $cycleSym's definition requires an implicit search at the highlighted position. + |To avoid this error, give `$cycleSym` an explicit type. + |""" +} + +class ClassCannotExtendEnum(cls: Symbol, parent: Symbol)(using Context) extends SyntaxMsg(ClassCannotExtendEnumID) { + def msg(using Context) = i"""$cls in ${cls.owner} extends enum ${parent.name}, but extending enums is prohibited.""" + def explain(using Context) = "" +} + +class NotAnExtractor(tree: untpd.Tree)(using Context) extends SyntaxMsg(NotAnExtractorID) { + def msg(using Context) = i"$tree cannot be used as an extractor in a pattern because it lacks an unapply or unapplySeq method" + def explain(using Context) = + i"""|An ${hl("unapply")} method should be defined in an ${hl("object")} as follow: + | - If it is just a test, return a ${hl("Boolean")}. For example ${hl("case even()")} + | - If it returns a single sub-value of type T, return an ${hl("Option[T]")} + | - If it returns several sub-values T1,...,Tn, group them in an optional tuple ${hl("Option[(T1,...,Tn)]")} + | + |Sometimes, the number of sub-values isn't fixed and we would like to return a sequence. + |For this reason, you can also define patterns through ${hl("unapplySeq")} which returns ${hl("Option[Seq[T]]")}. 
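// Editor's sketch, not part of the patch: two overloads that erase to the same signature,
// and the @targetName disambiguation suggested by the DoubleDefinition hint above.
import scala.annotation.targetName

object Keys:
  def keyOf(xs: List[Int]): String = xs.mkString(",")
  @targetName("keyOfStrings")                              // without this: double definition after erasure
  def keyOf(xs: List[String]): String = xs.mkString(",")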
+ |This mechanism is used for instance in pattern ${hl("case List(x1, ..., xn)")}""" +} + +class MemberWithSameNameAsStatic()(using Context) + extends SyntaxMsg(MemberWithSameNameAsStaticID) { + def msg(using Context) = i"Companion classes cannot define members with same name as a ${hl("@static")} member" + def explain(using Context) = "" +} + +class PureExpressionInStatementPosition(stat: untpd.Tree, val exprOwner: Symbol)(using Context) + extends Message(PureExpressionInStatementPositionID) { + def kind = MessageKind.PotentialIssue + def msg(using Context) = "A pure expression does nothing in statement position; you may be omitting necessary parentheses" + def explain(using Context) = + i"""The pure expression $stat doesn't have any side effect and its result is not assigned elsewhere. + |It can be removed without changing the semantics of the program. This may indicate an error.""" +} + +class TraitCompanionWithMutableStatic()(using Context) + extends SyntaxMsg(TraitCompanionWithMutableStaticID) { + def msg(using Context) = i"Companion of traits cannot define mutable @static fields" + def explain(using Context) = "" +} + +class LazyStaticField()(using Context) + extends SyntaxMsg(LazyStaticFieldID) { + def msg(using Context) = i"Lazy @static fields are not supported" + def explain(using Context) = "" +} + +class StaticOverridingNonStaticMembers()(using Context) + extends SyntaxMsg(StaticOverridingNonStaticMembersID) { + def msg(using Context) = i"${hl("@static")} members cannot override or implement non-static ones" + def explain(using Context) = "" +} + +class OverloadInRefinement(rsym: Symbol)(using Context) + extends DeclarationMsg(OverloadInRefinementID) { + def msg(using Context) = "Refinements cannot introduce overloaded definitions" + def explain(using Context) = + i"""The refinement `$rsym` introduces an overloaded definition. + |Refinements cannot contain overloaded definitions.""" +} + +class NoMatchingOverload(val alternatives: List[SingleDenotation], pt: Type)(using Context) + extends TypeMsg(NoMatchingOverloadID) { + def msg(using Context) = + i"""None of the ${err.overloadedAltsStr(alternatives)} + |match ${err.expectedTypeStr(pt)}""" + def explain(using Context) = "" +} +class StableIdentPattern(tree: untpd.Tree, pt: Type)(using Context) + extends TypeMsg(StableIdentPatternID) { + def msg(using Context) = + i"""Stable identifier required, but $tree found""" + def explain(using Context) = "" +} + +class IllegalSuperAccessor(base: Symbol, memberName: Name, targetName: Name, + acc: Symbol, accTp: Type, + other: Symbol, otherTp: Type)(using Context) extends DeclarationMsg(IllegalSuperAccessorID) { + def msg(using Context) = { + // The mixin containing a super-call that requires a super-accessor + val accMixin = acc.owner + // The class or trait that the super-accessor should resolve too in `base` + val otherMixin = other.owner + // The super-call in `accMixin` + val superCall = hl(i"super.$memberName") + // The super-call that the super-accesors in `base` forwards to + val resolvedSuperCall = hl(i"super[${otherMixin.name}].$memberName") + // The super-call that we would have called if `super` in traits behaved like it + // does in classes, i.e. followed the linearization of the trait itself. 
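// Editor's sketch, not part of the patch: a pure expression in statement position, as warned
// about by PureExpressionInStatementPosition above. Names are hypothetical.
def sumTwice(a: Int, b: Int): Int =
  a + b            // warning: this pure expression does nothing here, its result is discarded
  2 * (a + b)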
+ val staticSuperCall = { + val staticSuper = accMixin.asClass.info.parents.reverse + .find(_.nonPrivateMember(memberName) + .matchingDenotation(accMixin.thisType, acc.info, targetName).exists) + val staticSuperName = staticSuper match { + case Some(parent) => + parent.classSymbol.name.show + case None => // Might be reachable under separate compilation + "SomeParent" + } + hl(i"super[$staticSuperName].$memberName") } - } - - class MissingCompanionForStatic(member: Symbol)(using Context) - extends SyntaxMsg(MissingCompanionForStaticID) { - def msg = em"${member.owner} does not have a companion class" - def explain = - em"An object that contains ${hl("@static")} members must have a companion class." - } - - class PolymorphicMethodMissingTypeInParent(rsym: Symbol, parentSym: Symbol)(using Context) - extends SyntaxMsg(PolymorphicMethodMissingTypeInParentID) { - def msg = em"Polymorphic refinement $rsym without matching type in parent $parentSym is no longer allowed" - def explain = - em"""Polymorphic $rsym is not allowed in the structural refinement of $parentSym because - |$rsym does not override any method in $parentSym. Structural refinement does not allow for - |polymorphic methods.""" - } + i"""$base cannot be defined due to a conflict between its parents when + |implementing a super-accessor for $memberName in $accMixin: + | + |1. One of its parent (${accMixin.name}) contains a call $superCall in its body, + | and when a super-call in a trait is written without an explicit parent + | listed in brackets, it is implemented by a generated super-accessor in + | the class that extends this trait based on the linearization order of + | the class. + |2. Because ${otherMixin.name} comes before ${accMixin.name} in the linearization + | order of ${base.name}, and because ${otherMixin.name} overrides $memberName, + | the super-accessor in ${base.name} is implemented as a call to + | $resolvedSuperCall. + |3. However, + | ${otherTp.widenExpr} (the type of $resolvedSuperCall in ${base.name}) + | is not a subtype of + | ${accTp.widenExpr} (the type of $memberName in $accMixin). + | Hence, the super-accessor that needs to be generated in ${base.name} + | is illegal. + | + |Here are two possible ways to resolve this: + | + |1. Change the linearization order of ${base.name} such that + | ${accMixin.name} comes before ${otherMixin.name}. + |2. Alternatively, replace $superCall in the body of $accMixin by a + | super-call to a specific parent, e.g. $staticSuperCall + |""" + } + def explain(using Context) = "" +} + +class TraitParameterUsedAsParentPrefix(cls: Symbol)(using Context) + extends DeclarationMsg(TraitParameterUsedAsParentPrefixID) { + def msg(using Context) = + s"${cls.show} cannot extend from a parent that is derived via its own parameters" + def explain(using Context) = + i""" + |The parent class/trait that ${cls.show} extends from is obtained from + |the parameter of ${cls.show}. This is disallowed in order to prevent + |outer-related Null Pointer Exceptions in Scala. + | + |In order to fix this issue consider directly extending from the parent rather + |than obtaining it from the parameters of ${cls.show}. + |""" +} + +class UnknownNamedEnclosingClassOrObject(name: TypeName)(using Context) + extends ReferenceMsg(UnknownNamedEnclosingClassOrObjectID) { + def msg(using Context) = + i"""no enclosing class or object is named '${hl(name.show)}'""" + def explain(using Context) = + i""" + |The class or object named '${hl(name.show)}' was used as a visibility + |modifier, but could not be resolved. 
Make sure that + |'${hl(name.show)}' is not misspelled and has been imported into the + |current scope. + """ + } + +class IllegalCyclicTypeReference(sym: Symbol, where: String, lastChecked: Type)(using Context) + extends CyclicMsg(IllegalCyclicTypeReferenceID) { + def msg(using Context) = + val lastCheckedStr = + try lastChecked.show + catch case ex: CyclicReference => "..." + i"illegal cyclic type reference: ${where} ${hl(lastCheckedStr)} of $sym refers back to the type itself" + def explain(using Context) = "" +} + +class ErasedTypesCanOnlyBeFunctionTypes()(using Context) + extends SyntaxMsg(ErasedTypesCanOnlyBeFunctionTypesID) { + def msg(using Context) = "Types with erased keyword can only be function types `(erased ...) => ...`" + def explain(using Context) = "" +} + +class CaseClassMissingNonImplicitParamList(cdef: untpd.TypeDef)(using Context) + extends SyntaxMsg(CaseClassMissingNonImplicitParamListID) { + def msg(using Context) = + i"""|A ${hl("case class")} must have at least one leading non-implicit parameter list""" + + def explain(using Context) = + i"""|${cdef.name} must have at least one leading non-implicit parameter list, + | if you're aiming to have a case class parametrized only by implicit ones, you should + | add an explicit ${hl("()")} as the first parameter list to ${cdef.name}.""" +} + +class EnumerationsShouldNotBeEmpty(cdef: untpd.TypeDef)(using Context) + extends SyntaxMsg(EnumerationsShouldNotBeEmptyID) { + def msg(using Context) = "Enumerations must contain at least one case" + + def explain(using Context) = + i"""|Enumeration ${cdef.name} must contain at least one case + |Example Usage: + | ${hl("enum")} ${cdef.name} { + | ${hl("case")} Option1, Option2 + | } + |""" +} + +class TypedCaseDoesNotExplicitlyExtendTypedEnum(enumDef: Symbol, caseDef: untpd.TypeDef)(using Context) + extends SyntaxMsg(TypedCaseDoesNotExplicitlyExtendTypedEnumID) { + def msg(using Context) = i"explicit extends clause needed because both enum case and enum class have type parameters" + + def explain(using Context) = + i"""Enumerations where the enum class as well as the enum case have type parameters need + |an explicit extends. + |for example: + | ${hl("enum")} ${enumDef.name}[T] { + | ${hl("case")} ${caseDef.name}[U](u: U) ${hl("extends")} ${enumDef.name}[U] + | } + |""" +} + +class IllegalRedefinitionOfStandardKind(kindType: String, name: Name)(using Context) + extends SyntaxMsg(IllegalRedefinitionOfStandardKindID) { + def msg(using Context) = i"illegal redefinition of standard $kindType $name" + def explain(using Context) = + i"""| "$name" is a standard Scala core `$kindType` + | Please choose a different name to avoid conflicts + |""" +} + +class NoExtensionMethodAllowed(mdef: untpd.DefDef)(using Context) + extends SyntaxMsg(NoExtensionMethodAllowedID) { + def msg(using Context) = i"No extension method allowed here, since collective parameters are given" + def explain(using Context) = + i"""|Extension method: + | `${mdef}` + |is defined inside an extension clause which has collective parameters. 
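// Editor's sketch, not part of the patch: when both the enum class and an enum case carry type
// parameters, the case needs an explicit extends clause (TypedCaseDoesNotExplicitlyExtendTypedEnum).
enum Box[T]:
  case Full[U](value: U) extends Box[U]   // explicit extends required here
  case Empty extends Box[Nothing]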
+ |""" +} - class ParamsNoInline(owner: Symbol)(using Context) - extends SyntaxMsg(ParamsNoInlineID) { - def msg = em"""${hl("inline")} modifier can only be used for parameters of inline methods""" - def explain = "" - } +class ExtensionMethodCannotHaveTypeParams(mdef: untpd.DefDef)(using Context) + extends SyntaxMsg(ExtensionMethodCannotHaveTypeParamsID) { + def msg(using Context) = i"Extension method cannot have type parameters since some were already given previously" - class JavaSymbolIsNotAValue(symbol: Symbol)(using Context) extends TypeMsg(JavaSymbolIsNotAValueID) { - def msg = { - val kind = - if (symbol is Package) em"$symbol" - else em"Java defined ${hl("class " + symbol.name)}" + def explain(using Context) = + i"""|Extension method: + | `${mdef}` + |has type parameters `[${mdef.leadingTypeParams.map(_.show).mkString(",")}]`, while the extension clause has + |it's own type parameters. Please consider moving these to the extension clause's type parameter list. + |""" +} + +class ExtensionCanOnlyHaveDefs(mdef: untpd.Tree)(using Context) + extends SyntaxMsg(ExtensionCanOnlyHaveDefsID) { + def msg(using Context) = i"Only methods allowed here, since collective parameters are given" + def explain(using Context) = + i"""Extension clauses can only have `def`s + | `${mdef.show}` is not a valid expression here. + |""" +} + +class UnexpectedPatternForSummonFrom(tree: Tree[_])(using Context) + extends SyntaxMsg(UnexpectedPatternForSummonFromID) { + def msg(using Context) = i"Unexpected pattern for summonFrom. Expected ${hl("`x: T`")} or ${hl("`_`")}" + def explain(using Context) = + i"""|The pattern "${tree.show}" provided in the ${hl("case")} expression of the ${hl("summonFrom")}, + | needs to be of the form ${hl("`x: T`")} or ${hl("`_`")}. + | + | Example usage: + | inline def a = summonFrom { + | case x: T => ??? + | } + | + | or + | inline def a = summonFrom { + | case _ => ??? + | } + |""" +} + +class AnonymousInstanceCannotBeEmpty(impl: untpd.Template)(using Context) + extends SyntaxMsg(AnonymousInstanceCannotBeEmptyID) { + def msg(using Context) = i"anonymous instance must implement a type or have at least one extension method" + def explain(using Context) = + i"""|Anonymous instances cannot be defined with an empty body. The block + |`${impl.show}` should either contain an implemented type or at least one extension method. + |""" +} + +class ModifierNotAllowedForDefinition(flag: Flag)(using Context) + extends SyntaxMsg(ModifierNotAllowedForDefinitionID) { + def msg(using Context) = i"Modifier ${hl(flag.flagsString)} is not allowed for this definition" + def explain(using Context) = "" +} + +class RedundantModifier(flag: Flag)(using Context) + extends SyntaxMsg(RedundantModifierID) { + def msg(using Context) = i"Modifier ${hl(flag.flagsString)} is redundant for this definition" + def explain(using Context) = "" +} + +class InvalidReferenceInImplicitNotFoundAnnotation(typeVar: String, owner: String)(using Context) + extends ReferenceMsg(InvalidReferenceInImplicitNotFoundAnnotationID) { + def msg(using Context) = i"""|Invalid reference to a type variable ${hl(typeVar)} found in the annotation argument. + |The variable does not occur as a parameter in the scope of ${hl(owner)}. 
+ |""" + def explain(using Context) = "" +} + +class CaseClassInInlinedCode(tree: tpd.Tree)(using Context) + extends SyntaxMsg(CaseClassInInlinedCodeID) { + + def defKind = if tree.symbol.is(Module) then "object" else "class" + def msg(using Context) = s"Case $defKind definitions are not allowed in inline methods or quoted code. Use a normal $defKind instead." + def explain(using Context) = + i"""Case class/object definitions generate a considerable footprint in code size. + |Inlining such definition would multiply this footprint for each call site. + |""" +} + +class ImplicitSearchTooLargeWarning(limit: Int, openSearchPairs: List[(Candidate, Type)])(using Context) + extends TypeMsg(ImplicitSearchTooLargeID): + override def showAlways = true + def showQuery(query: (Candidate, Type))(using Context): String = + i" ${query._1.ref.symbol.showLocated} for ${query._2}}" + def msg(using Context) = + i"""Implicit search problem too large. + |an implicit search was terminated with failure after trying $limit expressions. + |The root candidate for the search was: + | + |${showQuery(openSearchPairs.last)} + | + |You can change the behavior by setting the `-Ximplicit-search-limit` value. + |Smaller values cause the search to fail faster. + |Larger values might make a very large search problem succeed. + |""" + def explain(using Context) = + i"""The overflow happened with the following lists of tried expressions and target types, + |starting with the root query: + | + |${openSearchPairs.reverse.map(showQuery)}%\n% + """ + +class TargetNameOnTopLevelClass(symbol: Symbol)(using Context) +extends SyntaxMsg(TargetNameOnTopLevelClassID): + def msg(using Context) = i"${hl("@targetName")} annotation not allowed on top-level $symbol" + def explain(using Context) = + val annot = symbol.getAnnotation(defn.TargetNameAnnot).get + i"""The @targetName annotation may be applied to a top-level ${hl("val")} or ${hl("def")}, but not + |a top-level ${hl("class")}, ${hl("trait")}, or ${hl("object")}. + | + |This restriction is due to the naming convention of Java classfiles, whose filenames + |are based on the name of the class defined within. If @targetName were permitted + |here, the name of the classfile would be based on the target name, and the compiler + |could not associate that classfile with the Scala-visible defined name of the class. + | + |If your use case requires @targetName, consider wrapping $symbol in an ${hl("object")} + |(and possibly exporting it), as in the following example: + | + |${hl("object Wrapper:")} + | $annot $symbol { ... } + | + |${hl("export")} Wrapper.${symbol.name} ${hl("// optional")}""" + +class NotClassType(tp: Type)(using Context) +extends TypeMsg(NotClassTypeID), ShowMatchTrace(tp): + def msg(using Context) = i"$tp is not a class type" + def explain(using Context) = "" + +class MissingImplicitArgument( + arg: tpd.Tree, + pt: Type, + where: String, + paramSymWithMethodCallTree: Option[(Symbol, tpd.Tree)] = None, + ignoredInstanceNormalImport: => Option[SearchSuccess] + )(using Context) extends TypeMsg(MissingImplicitArgumentID), ShowMatchTrace(pt): + + arg.tpe match + case ambi: AmbiguousImplicits => withoutDisambiguation() + case _ => + + def msg(using Context): String = + + def formatMsg(shortForm: String)(headline: String = shortForm) = arg match + case arg: Trees.SearchFailureIdent[?] => + arg.tpe match + case _: NoMatchingImplicits => headline + case tpe: SearchFailureType => + i"$headline. 
${tpe.explanation}" + case _ => headline + case _ => + arg.tpe match + case tpe: SearchFailureType => + val original = arg match + case Inlined(call, _, _) => call + case _ => arg + i"""$headline. + |I found: + | + | ${original.show.replace("\n", "\n ")} + | + |But ${tpe.explanation}.""" + case _ => headline + + /** Format `raw` implicitNotFound or implicitAmbiguous argument, replacing + * all occurrences of `${X}` where `X` is in `paramNames` with the + * corresponding shown type in `args`. + */ + def userDefinedErrorString(raw: String, paramNames: List[String], args: List[Type]): String = { + def translate(name: String): Option[String] = { + val idx = paramNames.indexOf(name) + if (idx >= 0) Some(i"${args(idx)}") else None + } - s"$kind is not a value" + """\$\{\s*([^}\s]+)\s*\}""".r.replaceAllIn(raw, (_: Regex.Match) match { + case Regex.Groups(v) => quoteReplacement(translate(v).getOrElse("")).nn + }) } - def explain = "" - } - class DoubleDefinition(decl: Symbol, previousDecl: Symbol, base: Symbol)(using Context) extends NamingMsg(DoubleDefinitionID) { - def msg = { - def nameAnd = if (decl.name != previousDecl.name) " name and" else "" - def erasedType = if ctx.erasedTypes then i" ${decl.info}" else "" - def details(using Context): String = - if (decl.isRealMethod && previousDecl.isRealMethod) { - import Signature.MatchDegree._ - - // compare the signatures when both symbols represent methods - decl.signature.matchDegree(previousDecl.signature) match { - case NoMatch => - // If the signatures don't match at all at the current phase, then - // they might match after erasure. - if ctx.phase.id <= elimErasedValueTypePhase.id then - atPhase(elimErasedValueTypePhase.next)(details) - else - "" // shouldn't be reachable - case ParamMatch => - "have matching parameter types." - case MethodNotAMethodMatch => - "neither has parameters." - case FullMatch => - val hint = - if !decl.hasAnnotation(defn.TargetNameAnnot) - && !previousDecl.hasAnnotation(defn.TargetNameAnnot) - then - i""" - | - |Consider adding a @targetName annotation to one of the conflicting definitions - |for disambiguation.""" - else "" - i"have the same$nameAnd type$erasedType after erasure.$hint" - } - } - else "" - def symLocation(sym: Symbol) = { - val lineDesc = - if (sym.span.exists && sym.span != sym.owner.span) - s" at line ${sym.srcPos.line + 1}" - else "" - i"in ${sym.owner}${lineDesc}" + /** Extract a user defined error message from a symbol `sym` + * with an annotation matching the given class symbol `cls`. + */ + def userDefinedMsg(sym: Symbol, cls: Symbol) = for { + ann <- sym.getAnnotation(cls) + msg <- ann.argumentConstantString(0) + } yield msg + + def location(preposition: String) = if (where.isEmpty) "" else s" $preposition $where" + + def defaultAmbiguousImplicitMsg(ambi: AmbiguousImplicits) = + s"Ambiguous given instances: ${ambi.explanation}${location("of")}" + + def defaultImplicitNotFoundMessage = + i"No given instance of type $pt was found${location("for")}" + + /** Construct a custom error message given an ambiguous implicit + * candidate `alt` and a user defined message `raw`. 
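// Editor's sketch, not part of the patch: how `${T}`-style placeholders in an @implicitNotFound
// message relate to the substitution performed by userDefinedErrorString above. Names are hypothetical.
import scala.annotation.implicitNotFound

@implicitNotFound("No JsonCodec available for ${T}; define or import one.")
trait JsonCodec[T]

def encode[T](value: T)(using codec: JsonCodec[T]): String = ???
// encode(42)   // reported as: No JsonCodec available for Int; define or import one.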
+ */ + def userDefinedAmbiguousImplicitMsg(alt: SearchSuccess, raw: String) = { + val params = alt.ref.underlying match { + case p: PolyType => p.paramNames.map(_.toString) + case _ => Nil } - val clashDescription = - if (decl.owner eq previousDecl.owner) - "Double definition" - else if ((decl.owner eq base) || (previousDecl eq base)) - "Name clash between defined and inherited member" - else - "Name clash between inherited members" - - atPhase(typerPhase) { - em"""$clashDescription: - |${previousDecl.showDcl} ${symLocation(previousDecl)} and - |${decl.showDcl} ${symLocation(decl)} - |""" - } + details + def resolveTypes(targs: List[tpd.Tree])(using Context) = + targs.map(a => Inferencing.fullyDefinedType(a.tpe, "type argument", a.srcPos)) + + // We can extract type arguments from: + // - a function call: + // @implicitAmbiguous("msg A=${A}") + // implicit def f[A](): String = ... + // implicitly[String] // found: f[Any]() + // + // - an eta-expanded function: + // @implicitAmbiguous("msg A=${A}") + // implicit def f[A](x: Int): String = ... + // implicitly[Int => String] // found: x => f[Any](x) + + val call = tpd.closureBody(alt.tree) // the tree itself if not a closure + val targs = tpd.typeArgss(call).flatten + val args = resolveTypes(targs)(using ctx.fresh.setTyperState(alt.tstate)) + userDefinedErrorString(raw, params, args) } - def explain = "" - } - - class ImportRenamedTwice(ident: untpd.Ident)(using Context) extends SyntaxMsg(ImportRenamedTwiceID) { - def msg = s"${ident.show} is renamed twice on the same import line." - def explain = "" - } - - class TypeTestAlwaysDiverges(scrutTp: Type, testTp: Type)(using Context) extends SyntaxMsg(TypeTestAlwaysDivergesID) { - def msg = - s"This type test will never return a result since the scrutinee type ${scrutTp.show} does not contain any value." - def explain = "" - } - - // Relative of CyclicReferenceInvolvingImplicit and RecursiveValueNeedsResultType - class TermMemberNeedsResultTypeForImplicitSearch(cycleSym: Symbol)(using Context) - extends CyclicMsg(TermMemberNeedsNeedsResultTypeForImplicitSearchID) { - def msg = em"""$cycleSym needs result type because its right-hand side attempts implicit search""" - def explain = - em"""|The right hand-side of $cycleSym's definition requires an implicit search at the highlighted position. - |To avoid this error, give `$cycleSym` an explicit type. - |""".stripMargin - } - - class ClassCannotExtendEnum(cls: Symbol, parent: Symbol)(using Context) extends SyntaxMsg(ClassCannotExtendEnumID) { - def msg = em"""$cls in ${cls.owner} extends enum ${parent.name}, but extending enums is prohibited.""" - def explain = "" - } - - class NotAnExtractor(tree: untpd.Tree)(using Context) extends SyntaxMsg(NotAnExtractorID) { - def msg = em"$tree cannot be used as an extractor in a pattern because it lacks an unapply or unapplySeq method" - def explain = - em"""|An ${hl("unapply")} method should be defined in an ${hl("object")} as follow: - | - If it is just a test, return a ${hl("Boolean")}. For example ${hl("case even()")} - | - If it returns a single sub-value of type T, return an ${hl("Option[T]")} - | - If it returns several sub-values T1,...,Tn, group them in an optional tuple ${hl("Option[(T1,...,Tn)]")} - | - |Sometimes, the number of sub-values isn't fixed and we would like to return a sequence. - |For this reason, you can also define patterns through ${hl("unapplySeq")} which returns ${hl("Option[Seq[T]]")}. 
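// Editor's sketch, not part of the patch: the @implicitAmbiguous mechanism whose message
// arguments are resolved by userDefinedAmbiguousImplicitMsg above. Names are hypothetical.
import scala.annotation.implicitAmbiguous

trait NotEq[A, B]
object NotEq:
  given ok[A, B]: NotEq[A, B] = new NotEq[A, B] {}
  @implicitAmbiguous("Both arguments have type ${A}; they must be different")
  given fail1[A]: NotEq[A, A] = ???
  given fail2[A]: NotEq[A, A] = ???

def distinct[A, B](a: A, b: B)(using NotEq[A, B]): (A, B) = (a, b)
// distinct(1, "one")   // ok
// distinct(1, 2)       // ambiguous: Both arguments have type Int; they must be different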
- |This mechanism is used for instance in pattern ${hl("case List(x1, ..., xn)")}""".stripMargin - } - - class MemberWithSameNameAsStatic()(using Context) - extends SyntaxMsg(MemberWithSameNameAsStaticID) { - def msg = em"Companion classes cannot define members with same name as a ${hl("@static")} member" - def explain = "" - } - - class PureExpressionInStatementPosition(stat: untpd.Tree, val exprOwner: Symbol)(using Context) - extends Message(PureExpressionInStatementPositionID) { - def kind = MessageKind.PotentialIssue - def msg = "A pure expression does nothing in statement position; you may be omitting necessary parentheses" - def explain = - em"""The pure expression $stat doesn't have any side effect and its result is not assigned elsewhere. - |It can be removed without changing the semantics of the program. This may indicate an error.""".stripMargin - } - - class TraitCompanionWithMutableStatic()(using Context) - extends SyntaxMsg(TraitCompanionWithMutableStaticID) { - def msg = em"Companion of traits cannot define mutable @static fields" - def explain = "" - } - class LazyStaticField()(using Context) - extends SyntaxMsg(LazyStaticFieldID) { - def msg = em"Lazy @static fields are not supported" - def explain = "" - } - - class StaticOverridingNonStaticMembers()(using Context) - extends SyntaxMsg(StaticOverridingNonStaticMembersID) { - def msg = em"${hl("@static")} members cannot override or implement non-static ones" - def explain = "" - } - - class OverloadInRefinement(rsym: Symbol)(using Context) - extends DeclarationMsg(OverloadInRefinementID) { - def msg = "Refinements cannot introduce overloaded definitions" - def explain = - em"""The refinement `$rsym` introduces an overloaded definition. - |Refinements cannot contain overloaded definitions.""".stripMargin - } - - class NoMatchingOverload(val alternatives: List[SingleDenotation], pt: Type)(using Context) - extends TypeMsg(NoMatchingOverloadID) { - def msg = - em"""None of the ${err.overloadedAltsStr(alternatives)} - |match ${err.expectedTypeStr(pt)}""" - def explain = "" - } - class StableIdentPattern(tree: untpd.Tree, pt: Type)(using Context) - extends TypeMsg(StableIdentPatternID) { - def msg = - em"""Stable identifier required, but $tree found""" - def explain = "" - } + /** @param rawMsg Message template with variables, e.g. 
"Variable A is ${A}" + * @param sym Symbol of the annotated type or of the method whose parameter was annotated + * @param substituteType Function substituting specific types for abstract types associated with variables, e.g A -> Int + */ + def formatAnnotationMessage(rawMsg: String, sym: Symbol, substituteType: Type => Type): String = { + val substitutableTypesSymbols = substitutableTypeSymbolsInScope(sym) + + userDefinedErrorString( + rawMsg, + paramNames = substitutableTypesSymbols.map(_.name.unexpandedName.toString), + args = substitutableTypesSymbols.map(_.typeRef).map(substituteType) + ) + } - class IllegalSuperAccessor(base: Symbol, memberName: Name, targetName: Name, - acc: Symbol, accTp: Type, - other: Symbol, otherTp: Type)(using Context) extends DeclarationMsg(IllegalSuperAccessorID) { - def msg = { - // The mixin containing a super-call that requires a super-accessor - val accMixin = acc.owner - // The class or trait that the super-accessor should resolve too in `base` - val otherMixin = other.owner - // The super-call in `accMixin` - val superCall = hl(i"super.$memberName") - // The super-call that the super-accesors in `base` forwards to - val resolvedSuperCall = hl(i"super[${otherMixin.name}].$memberName") - // The super-call that we would have called if `super` in traits behaved like it - // does in classes, i.e. followed the linearization of the trait itself. - val staticSuperCall = { - val staticSuper = accMixin.asClass.info.parents.reverse - .find(_.nonPrivateMember(memberName) - .matchingDenotation(accMixin.thisType, acc.info, targetName).exists) - val staticSuperName = staticSuper match { - case Some(parent) => - parent.classSymbol.name.show - case None => // Might be reachable under separate compilation - "SomeParent" - } - hl(i"super[$staticSuperName].$memberName") + /** Extracting the message from a method parameter, e.g. in + * + * trait Foo + * + * def foo(implicit @annotation.implicitNotFound("Foo is missing") foo: Foo): Any = ??? + */ + def userDefinedImplicitNotFoundParamMessage: Option[String] = paramSymWithMethodCallTree.flatMap { (sym, applTree) => + userDefinedMsg(sym, defn.ImplicitNotFoundAnnot).map { rawMsg => + val fn = tpd.funPart(applTree) + val targs = tpd.typeArgss(applTree).flatten + val methodOwner = fn.symbol.owner + val methodOwnerType = tpd.qualifier(fn).tpe + val methodTypeParams = fn.symbol.paramSymss.flatten.filter(_.isType) + val methodTypeArgs = targs.map(_.tpe) + val substituteType = (_: Type).asSeenFrom(methodOwnerType, methodOwner).subst(methodTypeParams, methodTypeArgs) + formatAnnotationMessage(rawMsg, sym.owner, substituteType) } - ex"""$base cannot be defined due to a conflict between its parents when - |implementing a super-accessor for $memberName in $accMixin: - | - |1. One of its parent (${accMixin.name}) contains a call $superCall in its body, - | and when a super-call in a trait is written without an explicit parent - | listed in brackets, it is implemented by a generated super-accessor in - | the class that extends this trait based on the linearization order of - | the class. - |2. Because ${otherMixin.name} comes before ${accMixin.name} in the linearization - | order of ${base.name}, and because ${otherMixin.name} overrides $memberName, - | the super-accessor in ${base.name} is implemented as a call to - | $resolvedSuperCall. - |3. However, - | ${otherTp.widenExpr} (the type of $resolvedSuperCall in ${base.name}) - | is not a subtype of - | ${accTp.widenExpr} (the type of $memberName in $accMixin). 
- | Hence, the super-accessor that needs to be generated in ${base.name} - | is illegal. - | - |Here are two possible ways to resolve this: - | - |1. Change the linearization order of ${base.name} such that - | ${accMixin.name} comes before ${otherMixin.name}. - |2. Alternatively, replace $superCall in the body of $accMixin by a - | super-call to a specific parent, e.g. $staticSuperCall - |""".stripMargin } - def explain = "" - } - class TraitParameterUsedAsParentPrefix(cls: Symbol)(using Context) - extends DeclarationMsg(TraitParameterUsedAsParentPrefixID) { - def msg = - s"${cls.show} cannot extend from a parent that is derived via its own parameters" - def explain = - ex""" - |The parent class/trait that ${cls.show} extends from is obtained from - |the parameter of ${cls.show}. This is disallowed in order to prevent - |outer-related Null Pointer Exceptions in Scala. - | - |In order to fix this issue consider directly extending from the parent rather - |than obtaining it from the parameters of ${cls.show}. - |""".stripMargin - } - - class UnknownNamedEnclosingClassOrObject(name: TypeName)(using Context) - extends ReferenceMsg(UnknownNamedEnclosingClassOrObjectID) { - def msg = - em"""no enclosing class or object is named '${hl(name.show)}'""" - def explain = - ex""" - |The class or object named '${hl(name.show)}' was used as a visibility - |modifier, but could not be resolved. Make sure that - |'${hl(name.show)}' is not misspelled and has been imported into the - |current scope. - """.stripMargin + /** Extracting the message from a type, e.g. in + * + * @annotation.implicitNotFound("Foo is missing") + * trait Foo + * + * def foo(implicit foo: Foo): Any = ??? + */ + def userDefinedImplicitNotFoundTypeMessage: Option[String] = + def recur(tp: Type): Option[String] = tp match + case tp: TypeRef => + val sym = tp.symbol + userDefinedImplicitNotFoundTypeMessageFor(sym).orElse(recur(tp.info)) + case tp: ClassInfo => + tp.baseClasses.iterator + .map(userDefinedImplicitNotFoundTypeMessageFor) + .find(_.isDefined).flatten + case tp: TypeProxy => + recur(tp.superType) + case tp: AndType => + recur(tp.tp1).orElse(recur(tp.tp2)) + case _ => + None + recur(pt) + + def userDefinedImplicitNotFoundTypeMessageFor(sym: Symbol): Option[String] = + for + rawMsg <- userDefinedMsg(sym, defn.ImplicitNotFoundAnnot) + if Feature.migrateTo3 || sym != defn.Function1 + // Don't inherit "No implicit view available..." message if subtypes of Function1 are not treated as implicit conversions anymore + yield + val substituteType = (_: Type).asSeenFrom(pt, sym) + formatAnnotationMessage(rawMsg, sym, substituteType) + + object AmbiguousImplicitMsg { + def unapply(search: SearchSuccess): Option[String] = + userDefinedMsg(search.ref.symbol, defn.ImplicitAmbiguousAnnot) } - class IllegalCyclicTypeReference(sym: Symbol, where: String, lastChecked: Type)(using Context) - extends CyclicMsg(IllegalCyclicTypeReferenceID) { - def msg = - val lastCheckedStr = - try lastChecked.show - catch case ex: CyclicReference => "..." - i"illegal cyclic type reference: ${where} ${hl(lastCheckedStr)} of $sym refers back to the type itself" - def explain = "" - } - - class ErasedTypesCanOnlyBeFunctionTypes()(using Context) - extends SyntaxMsg(ErasedTypesCanOnlyBeFunctionTypesID) { - def msg = "Types with erased keyword can only be function types `(erased ...) 
=> ...`" - def explain = "" - } - - class CaseClassMissingNonImplicitParamList(cdef: untpd.TypeDef)(using Context) - extends SyntaxMsg(CaseClassMissingNonImplicitParamListID) { - def msg = - em"""|A ${hl("case class")} must have at least one leading non-implicit parameter list""" - - def explain = - em"""|${cdef.name} must have at least one leading non-implicit parameter list, - | if you're aiming to have a case class parametrized only by implicit ones, you should - | add an explicit ${hl("()")} as the first parameter list to ${cdef.name}.""".stripMargin - } - - class EnumerationsShouldNotBeEmpty(cdef: untpd.TypeDef)(using Context) - extends SyntaxMsg(EnumerationsShouldNotBeEmptyID) { - def msg = "Enumerations must contain at least one case" - - def explain = - em"""|Enumeration ${cdef.name} must contain at least one case - |Example Usage: - | ${hl("enum")} ${cdef.name} { - | ${hl("case")} Option1, Option2 - | } - |""".stripMargin - } - - class TypedCaseDoesNotExplicitlyExtendTypedEnum(enumDef: Symbol, caseDef: untpd.TypeDef)(using Context) - extends SyntaxMsg(TypedCaseDoesNotExplicitlyExtendTypedEnumID) { - def msg = i"explicit extends clause needed because both enum case and enum class have type parameters" - - def explain = - em"""Enumerations where the enum class as well as the enum case have type parameters need - |an explicit extends. - |for example: - | ${hl("enum")} ${enumDef.name}[T] { - | ${hl("case")} ${caseDef.name}[U](u: U) ${hl("extends")} ${enumDef.name}[U] - | } - |""".stripMargin - } - - class IllegalRedefinitionOfStandardKind(kindType: String, name: Name)(using Context) - extends SyntaxMsg(IllegalRedefinitionOfStandardKindID) { - def msg = em"illegal redefinition of standard $kindType $name" - def explain = - em"""| "$name" is a standard Scala core `$kindType` - | Please choose a different name to avoid conflicts - |""".stripMargin - } - - class NoExtensionMethodAllowed(mdef: untpd.DefDef)(using Context) - extends SyntaxMsg(NoExtensionMethodAllowedID) { - def msg = em"No extension method allowed here, since collective parameters are given" - def explain = - em"""|Extension method: - | `${mdef}` - |is defined inside an extension clause which has collective parameters. - |""".stripMargin - } - - class ExtensionMethodCannotHaveTypeParams(mdef: untpd.DefDef)(using Context) - extends SyntaxMsg(ExtensionMethodCannotHaveTypeParamsID) { - def msg = i"Extension method cannot have type parameters since some were already given previously" - - def explain = - em"""|Extension method: - | `${mdef}` - |has type parameters `[${mdef.leadingTypeParams.map(_.show).mkString(",")}]`, while the extension clause has - |it's own type parameters. Please consider moving these to the extension clause's type parameter list. - |""".stripMargin - } - - class ExtensionCanOnlyHaveDefs(mdef: untpd.Tree)(using Context) - extends SyntaxMsg(ExtensionCanOnlyHaveDefsID) { - def msg = em"Only methods allowed here, since collective parameters are given" - def explain = - em"""Extension clauses can only have `def`s - | `${mdef.show}` is not a valid expression here. - |""".stripMargin - } - - class UnexpectedPatternForSummonFrom(tree: Tree[_])(using Context) - extends SyntaxMsg(UnexpectedPatternForSummonFromID) { - def msg = em"Unexpected pattern for summonFrom. Expected ${hl("`x: T`")} or ${hl("`_`")}" - def explain = - em"""|The pattern "${tree.show}" provided in the ${hl("case")} expression of the ${hl("summonFrom")}, - | needs to be of the form ${hl("`x: T`")} or ${hl("`_`")}. 
- | - | Example usage: - | inline def a = summonFrom { - | case x: T => ??? - | } - | - | or - | inline def a = summonFrom { - | case _ => ??? - | } - |""".stripMargin - } - - class AnonymousInstanceCannotBeEmpty(impl: untpd.Template)(using Context) - extends SyntaxMsg(AnonymousInstanceCannotBeEmptyID) { - def msg = i"anonymous instance must implement a type or have at least one extension method" - def explain = - em"""|Anonymous instances cannot be defined with an empty body. The block - |`${impl.show}` should either contain an implemented type or at least one extension method. - |""".stripMargin - } - - class ModifierNotAllowedForDefinition(flag: Flag)(using Context) - extends SyntaxMsg(ModifierNotAllowedForDefinitionID) { - def msg = em"Modifier ${hl(flag.flagsString)} is not allowed for this definition" - def explain = "" - } - - class RedundantModifier(flag: Flag)(using Context) - extends SyntaxMsg(RedundantModifierID) { - def msg = em"Modifier ${hl(flag.flagsString)} is redundant for this definition" - def explain = "" - } - - class InvalidReferenceInImplicitNotFoundAnnotation(typeVar: String, owner: String)(using Context) - extends ReferenceMsg(InvalidReferenceInImplicitNotFoundAnnotationID) { - def msg = em"""|Invalid reference to a type variable ${hl(typeVar)} found in the annotation argument. - |The variable does not occur as a parameter in the scope of ${hl(owner)}. - |""".stripMargin - def explain = "" - } - - class CaseClassInInlinedCode(tree: tpd.Tree)(using Context) - extends SyntaxMsg(CaseClassInInlinedCodeID) { - - def defKind = if tree.symbol.is(Module) then "object" else "class" - def msg = s"Case $defKind definitions are not allowed in inline methods or quoted code. Use a normal $defKind instead." - def explain = - em"""Case class/object definitions generate a considerable footprint in code size. - |Inlining such definition would multiply this footprint for each call site. - |""".stripMargin - } - - class ImplicitSearchTooLargeWarning(limit: Int, openSearchPairs: List[(Candidate, Type)])(using Context) - extends TypeMsg(ImplicitSearchTooLargeID): - override def showAlways = true - def showQuery(query: (Candidate, Type)): String = - i" ${query._1.ref.symbol.showLocated} for ${query._2}}" - def msg = - em"""Implicit search problem too large. - |an implicit search was terminated with failure after trying $limit expressions. - |The root candidate for the search was: - | - |${showQuery(openSearchPairs.last)} - | - |You can change the behavior by setting the `-Ximplicit-search-limit` value. - |Smaller values cause the search to fail faster. - |Larger values might make a very large search problem succeed. - |""" - def explain = - em"""The overflow happened with the following lists of tried expressions and target types, - |starting with the root query: - | - |${openSearchPairs.reverse.map(showQuery)}%\n% - """ - - class TargetNameOnTopLevelClass(symbol: Symbol)(using Context) - extends SyntaxMsg(TargetNameOnTopLevelClassID): - def msg = em"${hl("@targetName")} annotation not allowed on top-level $symbol" - def explain = - val annot = symbol.getAnnotation(defn.TargetNameAnnot).get - em"""The @targetName annotation may be applied to a top-level ${hl("val")} or ${hl("def")}, but not - |a top-level ${hl("class")}, ${hl("trait")}, or ${hl("object")}. - | - |This restriction is due to the naming convention of Java classfiles, whose filenames - |are based on the name of the class defined within. 
If @targetName were permitted - |here, the name of the classfile would be based on the target name, and the compiler - |could not associate that classfile with the Scala-visible defined name of the class. - | - |If your use case requires @targetName, consider wrapping $symbol in an ${hl("object")} - |(and possibly exporting it), as in the following example: - | - |${hl("object Wrapper:")} - | $annot $symbol { ... } - | - |${hl("export")} Wrapper.${symbol.name} ${hl("// optional")}""" - - class NotClassType(tp: Type)(using Context) - extends TypeMsg(NotClassTypeID), ShowMatchTrace(tp): - def msg = ex"$tp is not a class type" - def explain = "" - + arg.tpe match + case ambi: AmbiguousImplicits => + (ambi.alt1, ambi.alt2) match + case (alt @ AmbiguousImplicitMsg(msg), _) => + userDefinedAmbiguousImplicitMsg(alt, msg) + case (_, alt @ AmbiguousImplicitMsg(msg)) => + userDefinedAmbiguousImplicitMsg(alt, msg) + case _ => + defaultAmbiguousImplicitMsg(ambi) + case ambi @ TooUnspecific(target) => + i"""No implicit search was attempted${location("for")} + |since the expected type $target is not specific enough""" + case _ => + val shortMessage = userDefinedImplicitNotFoundParamMessage + .orElse(userDefinedImplicitNotFoundTypeMessage) + .getOrElse(defaultImplicitNotFoundMessage) + formatMsg(shortMessage)() + end msg + + override def msgPostscript(using Context) = + arg.tpe match + case _: AmbiguousImplicits => + "" // show no disambiguation + case _: TooUnspecific => + super.msgPostscript // show just disambigutation and match type trace + case _ => + // show all available additional info + def hiddenImplicitNote(s: SearchSuccess) = + i"\n\nNote: ${s.ref.symbol.showLocated} was not considered because it was not imported with `import given`." + super.msgPostscript + ++ ignoredInstanceNormalImport.map(hiddenImplicitNote) + .getOrElse(ctx.typer.importSuggestionAddendum(pt)) + + def explain(using Context) = "" +end MissingImplicitArgument + +class CannotBeAccessed(tpe: NamedType, superAccess: Boolean)(using Context) +extends ReferenceMsg(CannotBeAccessedID): + def msg(using Context) = + val pre = tpe.prefix + val name = tpe.name + val alts = tpe.denot.alternatives.map(_.symbol).filter(_.exists) + val whatCanNot = alts match + case Nil => + i"$name cannot" + case sym :: Nil => + i"${if (sym.owner == pre.typeSymbol) sym.show else sym.showLocated} cannot" + case _ => + i"none of the overloaded alternatives named $name can" + val where = if (ctx.owner.exists) s" from ${ctx.owner.enclosingClass}" else "" + val whyNot = new StringBuffer + alts.foreach(_.isAccessibleFrom(pre, superAccess, whyNot)) + i"$whatCanNot be accessed as a member of $pre$where.$whyNot" + def explain(using Context) = "" + +class InlineGivenShouldNotBeFunction()(using Context) +extends SyntaxMsg(InlineGivenShouldNotBeFunctionID): + def msg(using Context) = + i"""An inline given alias with a function value as right-hand side can significantly increase + |generated code size. You should either drop the `inline` or rewrite the given with an + |explicit `apply` method.""" + def explain(using Context) = + i"""A function value on the right-hand side of an inline given alias expands to + |an anonymous class. Each application of the inline given will then create a + |fresh copy of that class, which can increase code size in surprising ways. + |For that reason, functions are discouraged as right hand sides of inline given aliases. + |You should either drop `inline` or rewrite to an explicit `apply` method. E.g. 
+ | + | inline given Conversion[A, B] = x => x.toB + | + |should be re-formulated as + | + | given Conversion[A, B] with + | inline def apply(x: A) = x.toB + """ + +class ValueDiscarding(tp: Type)(using Context) + extends Message(ValueDiscardingID): + def kind = MessageKind.PotentialIssue + def msg(using Context) = i"discarded non-Unit value of type $tp" + def explain(using Context) = "" diff --git a/compiler/src/dotty/tools/dotc/reporting/trace.scala b/compiler/src/dotty/tools/dotc/reporting/trace.scala index 7c114b51ed21..8e8d3efb8b40 100644 --- a/compiler/src/dotty/tools/dotc/reporting/trace.scala +++ b/compiler/src/dotty/tools/dotc/reporting/trace.scala @@ -4,10 +4,11 @@ package reporting import scala.language.unsafeNulls -import core.Contexts._ -import config.Config -import config.Printers -import core.Mode +import core.*, Contexts.*, Decorators.* +import config.* +import printing.Formatting.* + +import scala.compiletime.* /** Exposes the {{{ trace("question") { op } }}} syntax. * @@ -51,9 +52,20 @@ trait TraceSyntax: else op inline def apply[T](inline question: String, inline printer: Printers.Printer, inline show: Boolean)(inline op: T)(using Context): T = - inline if isEnabled then - doTrace[T](question, printer, if show then showShowable(_) else alwaysToString)(op) - else op + apply(question, printer, { + val showOp: T => String = inline if show == true then + val showT = summonInline[Show[T]] + { + given Show[T] = showT + t => i"$t" + } + else + summonFrom { + case given Show[T] => t => i"$t" + case _ => alwaysToString + } + showOp + })(op) inline def apply[T](inline question: String, inline printer: Printers.Printer)(inline op: T)(using Context): T = apply[T](question, printer, false)(op) @@ -64,15 +76,11 @@ trait TraceSyntax: inline def apply[T](inline question: String)(inline op: T)(using Context): T = apply[T](question, false)(op) - private def showShowable(x: Any)(using Context) = x match - case x: printing.Showable => x.show - case _ => String.valueOf(x) - private val alwaysToString = (x: Any) => String.valueOf(x) private def doTrace[T](question: => String, printer: Printers.Printer = Printers.default, - showOp: T => String = alwaysToString) + showOp: T => String) (op: => T)(using Context): T = if ctx.mode.is(Mode.Printing) || !isForced && (printer eq Printers.noPrinter) then op else diff --git a/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala b/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala index 96e88e5c68ae..f2dfac88d464 100644 --- a/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala +++ b/compiler/src/dotty/tools/dotc/rewrites/Rewrites.scala @@ -23,10 +23,7 @@ object Rewrites { private[Rewrites] val pbuf = new mutable.ListBuffer[Patch]() def addPatch(span: Span, replacement: String): Unit = - pbuf.indexWhere(p => p.span.start == span.start && p.span.end == span.end) match { - case i if i >= 0 => pbuf.update(i, Patch(span, replacement)) - case _ => pbuf += Patch(span, replacement) - } + pbuf += Patch(span, replacement) def apply(cs: Array[Char]): Array[Char] = { val delta = pbuf.map(_.delta).sum diff --git a/compiler/src/dotty/tools/dotc/sbt/ExtractAPI.scala b/compiler/src/dotty/tools/dotc/sbt/ExtractAPI.scala index e561b26abf6d..f54baeb7256c 100644 --- a/compiler/src/dotty/tools/dotc/sbt/ExtractAPI.scala +++ b/compiler/src/dotty/tools/dotc/sbt/ExtractAPI.scala @@ -737,8 +737,7 @@ private class ExtractAPICollector(using Context) extends ThunkHolder { var h = initHash p match - case p: WithLazyField[?] 
=> - p.forceIfLazy + case p: WithLazyFields => p.forceFields() case _ => if inlineOrigin.exists then diff --git a/compiler/src/dotty/tools/dotc/sbt/ExtractDependencies.scala b/compiler/src/dotty/tools/dotc/sbt/ExtractDependencies.scala index f7b15dc21eb0..3fb7a66dc89e 100644 --- a/compiler/src/dotty/tools/dotc/sbt/ExtractDependencies.scala +++ b/compiler/src/dotty/tools/dotc/sbt/ExtractDependencies.scala @@ -143,34 +143,7 @@ class ExtractDependencies extends Phase { def allowLocal = dep.context == DependencyByInheritance || dep.context == LocalDependencyByInheritance if (depFile.extension == "class") { // Dependency is external -- source is undefined - - // The fully qualified name on the JVM of the class corresponding to `dep.to` - val binaryClassName = { - val builder = new StringBuilder - val pkg = dep.to.enclosingPackageClass - if (!pkg.isEffectiveRoot) { - builder.append(pkg.fullName.mangledString) - builder.append(".") - } - val flatName = dep.to.flatName - // Some companion objects are fake (that is, they're a compiler fiction - // that doesn't correspond to a class that exists at runtime), this - // can happen in two cases: - // - If a Java class has static members. - // - If we create constructor proxies for a class (see NamerOps#addConstructorProxies). - // - // In both cases it's vital that we don't send the object name to - // zinc: when sbt is restarted, zinc will inspect the binary - // dependencies to see if they're still on the classpath, if it - // doesn't find them it will invalidate whatever referenced them, so - // any reference to a fake companion will lead to extra recompilations. - // Instead, use the class name since it's guaranteed to exist at runtime. - val clsFlatName = if (dep.to.isOneOf(JavaDefined | ConstructorProxy)) flatName.stripModuleClassSuffix else flatName - builder.append(clsFlatName.mangledString) - builder.toString - } - - processExternalDependency(depFile, binaryClassName) + processExternalDependency(depFile, dep.to.binaryClassName) } else if (allowLocal || depFile.file != sourceFile) { // We cannot ignore dependencies coming from the same source file because // the dependency info needs to propagate. See source-dependencies/trait-trait-211. @@ -190,7 +163,7 @@ object ExtractDependencies { /** Report an internal error in incremental compilation. 
*/ def internalError(msg: => String, pos: SrcPos = NoSourcePosition)(using Context): Unit = - report.error(s"Internal error in the incremental compiler while compiling ${ctx.compilationUnit.source}: $msg", pos) + report.error(em"Internal error in the incremental compiler while compiling ${ctx.compilationUnit.source}: $msg", pos) } private case class ClassDependency(from: Symbol, to: Symbol, context: DependencyContext) diff --git a/compiler/src/dotty/tools/dotc/semanticdb/SemanticSymbolBuilder.scala b/compiler/src/dotty/tools/dotc/semanticdb/SemanticSymbolBuilder.scala index c825032373f8..c7b0dfd437db 100644 --- a/compiler/src/dotty/tools/dotc/semanticdb/SemanticSymbolBuilder.scala +++ b/compiler/src/dotty/tools/dotc/semanticdb/SemanticSymbolBuilder.scala @@ -74,7 +74,9 @@ class SemanticSymbolBuilder: def addOwner(owner: Symbol): Unit = if !owner.isRoot then addSymName(b, owner) - def addOverloadIdx(sym: Symbol): Unit = + def addOverloadIdx(initSym: Symbol): Unit = + // revert from the compiler-generated overload of the signature polymorphic method + val sym = initSym.originalSignaturePolymorphic.symbol.orElse(initSym) val decls = val decls0 = sym.owner.info.decls.lookupAll(sym.name) if sym.owner.isAllOf(JavaModule) then diff --git a/compiler/src/dotty/tools/dotc/transform/BeanProperties.scala b/compiler/src/dotty/tools/dotc/transform/BeanProperties.scala index 0d464d319848..0c1f40d4f2bd 100644 --- a/compiler/src/dotty/tools/dotc/transform/BeanProperties.scala +++ b/compiler/src/dotty/tools/dotc/transform/BeanProperties.scala @@ -5,7 +5,8 @@ import core._ import ast.tpd._ import Annotations._ import Contexts._ -import Symbols.newSymbol +import Symbols.* +import SymUtils.* import Decorators._ import Flags._ import Names._ @@ -23,8 +24,6 @@ class BeanProperties(thisPhase: DenotTransformer): } ::: origBody) def generateAccessors(valDef: ValDef)(using Context): List[Tree] = - import Symbols.defn - def generateGetter(valDef: ValDef, annot: Annotation)(using Context) : Tree = val prefix = if annot matches defn.BooleanBeanPropertyAnnot then "is" else "get" val meth = newSymbol( @@ -34,9 +33,9 @@ class BeanProperties(thisPhase: DenotTransformer): info = MethodType(Nil, valDef.denot.info), coord = annot.tree.span ).enteredAfter(thisPhase).asTerm - meth.addAnnotations(valDef.symbol.annotations) + .withAnnotationsCarrying(valDef.symbol, defn.BeanGetterMetaAnnot) val body: Tree = ref(valDef.symbol) - DefDef(meth, body) + DefDef(meth, body).withSpan(meth.span) def maybeGenerateSetter(valDef: ValDef, annot: Annotation)(using Context): Option[Tree] = Option.when(valDef.denot.asSymDenotation.flags.is(Mutable)) { @@ -48,9 +47,9 @@ class BeanProperties(thisPhase: DenotTransformer): info = MethodType(valDef.name :: Nil, valDef.denot.info :: Nil, defn.UnitType), coord = annot.tree.span ).enteredAfter(thisPhase).asTerm - meth.addAnnotations(valDef.symbol.annotations) + .withAnnotationsCarrying(valDef.symbol, defn.BeanSetterMetaAnnot) def body(params: List[List[Tree]]): Tree = Assign(ref(valDef.symbol), params.head.head) - DefDef(meth, body) + DefDef(meth, body).withSpan(meth.span) } def prefixedName(prefix: String, valName: Name) = diff --git a/compiler/src/dotty/tools/dotc/transform/BetaReduce.scala b/compiler/src/dotty/tools/dotc/transform/BetaReduce.scala index 90c0207ebb6d..7ac3dc972ad1 100644 --- a/compiler/src/dotty/tools/dotc/transform/BetaReduce.scala +++ b/compiler/src/dotty/tools/dotc/transform/BetaReduce.scala @@ -9,6 +9,8 @@ import Symbols._, Contexts._, Types._, Decorators._ import StdNames.nme 
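[Editorial aside, not part of the patch] The BetaReduce.scala hunk that follows reworks how the rewrite collects its bindings: a ListBuffer plus a new reduceApplication helper, and arguments that are pure constants no longer get a binding at all. As a reading aid, here is a minimal source-level sketch of the rewrite this phase performs; the object and method names are invented for the illustration and do not appear in the patch.

    // Minimal sketch (illustrative only): beta reduction at the source level.
    object BetaReduceSketch:
      // Before the rewrite: a closure is allocated and immediately applied.
      def before(y: Int): Int =
        ((x: Int) => x + 1)(y)

      // After the rewrite (conceptually): each parameter becomes a binding and
      // the body is used directly, so no closure is allocated. With this patch,
      // a binding whose right-hand side is a pure constant is simply dropped.
      def after(y: Int): Int =
        val x = y // synthetic binding generated for the parameter
        x + 1     // reduced body
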
import ast.TreeTypeMap +import scala.collection.mutable.ListBuffer + /** Rewrite an application * * (((x1, ..., xn) => b): T)(y1, ..., yn) @@ -70,9 +72,15 @@ object BetaReduce: original end apply - /** Beta-reduces a call to `ddef` with arguments `argSyms` */ + /** Beta-reduces a call to `ddef` with arguments `args` */ def apply(ddef: DefDef, args: List[Tree])(using Context) = - val bindings = List.newBuilder[ValDef] + val bindings = new ListBuffer[ValDef]() + val expansion1 = reduceApplication(ddef, args, bindings) + val bindings1 = bindings.result() + seq(bindings1, expansion1) + + /** Beta-reduces a call to `ddef` with arguments `args` and registers new bindings */ + def reduceApplication(ddef: DefDef, args: List[Tree], bindings: ListBuffer[ValDef])(using Context): Tree = val vparams = ddef.termParamss.iterator.flatten.toList assert(args.hasSameLengthAs(vparams)) val argSyms = @@ -84,7 +92,8 @@ object BetaReduce: val flags = Synthetic | (param.symbol.flags & Erased) val tpe = if arg.tpe.dealias.isInstanceOf[ConstantType] then arg.tpe.dealias else arg.tpe.widen val binding = ValDef(newSymbol(ctx.owner, param.name, flags, tpe, coord = arg.span), arg).withSpan(arg.span) - bindings += binding + if !(tpe.isInstanceOf[ConstantType] && isPureExpr(arg)) then + bindings += binding binding.symbol val expansion = TreeTypeMap( @@ -99,8 +108,5 @@ object BetaReduce: case ConstantType(const) if isPureExpr(tree) => cpy.Literal(tree)(const) case _ => super.transform(tree) }.transform(expansion) - val bindings1 = - bindings.result().filterNot(vdef => vdef.tpt.tpe.isInstanceOf[ConstantType] && isPureExpr(vdef.rhs)) - seq(bindings1, expansion1) - end apply + expansion1 diff --git a/compiler/src/dotty/tools/dotc/transform/CheckReentrant.scala b/compiler/src/dotty/tools/dotc/transform/CheckReentrant.scala index 6b0a4c3e9737..b63773687f74 100644 --- a/compiler/src/dotty/tools/dotc/transform/CheckReentrant.scala +++ b/compiler/src/dotty/tools/dotc/transform/CheckReentrant.scala @@ -67,8 +67,8 @@ class CheckReentrant extends MiniPhase { if (sym.isTerm && !sym.isSetter && !isIgnored(sym)) if (sym.is(Mutable)) { report.error( - i"""possible data race involving globally reachable ${sym.showLocated}: ${sym.info} - | use -Ylog:checkReentrant+ to find out more about why the variable is reachable.""") + em"""possible data race involving globally reachable ${sym.showLocated}: ${sym.info} + | use -Ylog:checkReentrant+ to find out more about why the variable is reachable.""") shared += sym } else if (!sym.is(Method) || sym.isOneOf(Accessor | ParamAccessor)) diff --git a/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala new file mode 100644 index 000000000000..69ec9f0d7b2b --- /dev/null +++ b/compiler/src/dotty/tools/dotc/transform/CheckUnused.scala @@ -0,0 +1,769 @@ +package dotty.tools.dotc.transform + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.tpd.{Inlined, TreeTraverser} +import dotty.tools.dotc.ast.untpd +import dotty.tools.dotc.ast.untpd.ImportSelector +import dotty.tools.dotc.config.ScalaSettings +import dotty.tools.dotc.core.Contexts.* +import dotty.tools.dotc.core.Decorators.{em, i} +import dotty.tools.dotc.core.Flags.* +import dotty.tools.dotc.core.Phases.Phase +import dotty.tools.dotc.core.StdNames +import dotty.tools.dotc.report +import dotty.tools.dotc.reporting.Message +import dotty.tools.dotc.typer.ImportInfo +import dotty.tools.dotc.util.{Property, SrcPos} +import dotty.tools.dotc.core.Mode +import 
dotty.tools.dotc.core.Types.{AnnotatedType, ConstantType, NoType, TermRef, Type, TypeTraverser} +import dotty.tools.dotc.core.Flags.flagsString +import dotty.tools.dotc.core.Flags +import dotty.tools.dotc.core.Names.Name +import dotty.tools.dotc.transform.MegaPhase.MiniPhase +import dotty.tools.dotc.core.Annotations +import dotty.tools.dotc.core.Definitions +import dotty.tools.dotc.core.NameKinds.WildcardParamName +import dotty.tools.dotc.core.Symbols.Symbol +import dotty.tools.dotc.core.StdNames.nme +import scala.math.Ordering + +/** + * A compiler phase that checks for unused imports or definitions + * + * Basically, it gathers definition/imports and their usage. If a + * definition/imports does not have any usage, then it is reported. + */ +class CheckUnused private (phaseMode: CheckUnused.PhaseMode, suffix: String, _key: Property.Key[CheckUnused.UnusedData]) extends MiniPhase: + import CheckUnused.* + import UnusedData.* + + private def unusedDataApply[U](f: UnusedData => U)(using Context): Context = + ctx.property(_key).foreach(f) + ctx + + override def phaseName: String = CheckUnused.phaseNamePrefix + suffix + + override def description: String = CheckUnused.description + + override def isRunnable(using Context): Boolean = + super.isRunnable && + ctx.settings.Wunused.value.nonEmpty && + !ctx.isJava + + // ========== SETUP ============ + + override def prepareForUnit(tree: tpd.Tree)(using Context): Context = + val data = UnusedData() + tree.getAttachment(_key).foreach(oldData => + data.unusedAggregate = oldData.unusedAggregate + ) + val fresh = ctx.fresh.setProperty(_key, data) + tree.putAttachment(_key, data) + fresh + + // ========== END + REPORTING ========== + + override def transformUnit(tree: tpd.Tree)(using Context): tpd.Tree = + unusedDataApply { ud => + ud.finishAggregation() + if(phaseMode == PhaseMode.Report) then + ud.unusedAggregate.foreach(reportUnused) + } + tree + + // ========== MiniPhase Prepare ========== + override def prepareForOther(tree: tpd.Tree)(using Context): Context = + // A standard tree traverser covers cases not handled by the Mega/MiniPhase + traverser.traverse(tree) + ctx + + override def prepareForInlined(tree: tpd.Inlined)(using Context): Context = + traverser.traverse(tree.call) + ctx + + override def prepareForIdent(tree: tpd.Ident)(using Context): Context = + if tree.symbol.exists then + val prefixes = LazyList.iterate(tree.typeOpt.normalizedPrefix)(_.normalizedPrefix).takeWhile(_ != NoType) + .take(10) // Failsafe for the odd case if there was an infinite cycle + for prefix <- prefixes do + unusedDataApply(_.registerUsed(prefix.classSymbol, None)) + unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) + else if tree.hasType then + unusedDataApply(_.registerUsed(tree.tpe.classSymbol, Some(tree.name))) + else + ctx + + override def prepareForSelect(tree: tpd.Select)(using Context): Context = + unusedDataApply(_.registerUsed(tree.symbol, Some(tree.name))) + + override def prepareForBlock(tree: tpd.Block)(using Context): Context = + pushInBlockTemplatePackageDef(tree) + + override def prepareForTemplate(tree: tpd.Template)(using Context): Context = + pushInBlockTemplatePackageDef(tree) + + override def prepareForPackageDef(tree: tpd.PackageDef)(using Context): Context = + pushInBlockTemplatePackageDef(tree) + + override def prepareForValDef(tree: tpd.ValDef)(using Context): Context = + unusedDataApply{ud => + // do not register the ValDef generated for `object` + traverseAnnotations(tree.symbol) + if !tree.symbol.is(Module) then + 
ud.registerDef(tree) + if tree.name.mangledString.startsWith(nme.derived.mangledString + "$") + && tree.typeOpt != NoType then + ud.registerUsed(tree.typeOpt.typeSymbol, None, true) + ud.addIgnoredUsage(tree.symbol) + } + + override def prepareForDefDef(tree: tpd.DefDef)(using Context): Context = + unusedDataApply{ ud => + if !tree.symbol.is(Private) then + tree.termParamss.flatten.foreach { p => + ud.addIgnoredParam(p.symbol) + } + import ud.registerTrivial + tree.registerTrivial + traverseAnnotations(tree.symbol) + ud.registerDef(tree) + ud.addIgnoredUsage(tree.symbol) + } + + override def prepareForTypeDef(tree: tpd.TypeDef)(using Context): Context = + unusedDataApply{ ud => + if !tree.symbol.is(Param) then // Ignore type parameter (as Scala 2) + traverseAnnotations(tree.symbol) + ud.registerDef(tree) + ud.addIgnoredUsage(tree.symbol) + } + + override def prepareForBind(tree: tpd.Bind)(using Context): Context = + traverseAnnotations(tree.symbol) + unusedDataApply(_.registerPatVar(tree)) + + override def prepareForTypeTree(tree: tpd.TypeTree)(using Context): Context = + if !tree.isInstanceOf[tpd.InferredTypeTree] then typeTraverser(unusedDataApply).traverse(tree.tpe) + ctx + + // ========== MiniPhase Transform ========== + + override def transformBlock(tree: tpd.Block)(using Context): tpd.Tree = + popOutBlockTemplatePackageDef() + tree + + override def transformTemplate(tree: tpd.Template)(using Context): tpd.Tree = + popOutBlockTemplatePackageDef() + tree + + override def transformPackageDef(tree: tpd.PackageDef)(using Context): tpd.Tree = + popOutBlockTemplatePackageDef() + tree + + override def transformValDef(tree: tpd.ValDef)(using Context): tpd.Tree = + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + tree + + override def transformDefDef(tree: tpd.DefDef)(using Context): tpd.Tree = + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + tree + + override def transformTypeDef(tree: tpd.TypeDef)(using Context): tpd.Tree = + unusedDataApply(_.removeIgnoredUsage(tree.symbol)) + tree + + // ---------- MiniPhase HELPERS ----------- + + private def pushInBlockTemplatePackageDef(tree: tpd.Block | tpd.Template | tpd.PackageDef)(using Context): Context = + unusedDataApply { ud => + ud.pushScope(UnusedData.ScopeType.fromTree(tree)) + } + ctx + + private def popOutBlockTemplatePackageDef()(using Context): Context = + unusedDataApply { ud => + ud.popScope() + } + ctx + + private def newCtx(tree: tpd.Tree)(using Context) = + if tree.symbol.exists then ctx.withOwner(tree.symbol) else ctx + + /** + * This traverse is the **main** component of this phase + * + * It traverse the tree the tree and gather the data in the + * corresponding context property + */ + private def traverser = new TreeTraverser: + import tpd._ + import UnusedData.ScopeType + + /* Register every imports, definition and usage */ + override def traverse(tree: tpd.Tree)(using Context): Unit = + val newCtx = if tree.symbol.exists then ctx.withOwner(tree.symbol) else ctx + tree match + case imp: tpd.Import => + unusedDataApply(_.registerImport(imp)) + imp.selectors.filter(_.isGiven).map(_.bound).collect { + case untpd.TypedSplice(tree1) => tree1 + }.foreach(traverse(_)(using newCtx)) + traverseChildren(tree)(using newCtx) + case ident: Ident => + prepareForIdent(ident) + traverseChildren(tree)(using newCtx) + case sel: Select => + prepareForSelect(sel) + traverseChildren(tree)(using newCtx) + case _: (tpd.Block | tpd.Template | tpd.PackageDef) => + //! 
DIFFERS FROM MINIPHASE + unusedDataApply { ud => + ud.inNewScope(ScopeType.fromTree(tree))(traverseChildren(tree)(using newCtx)) + } + case t:tpd.ValDef => + prepareForValDef(t) + traverseChildren(tree)(using newCtx) + transformValDef(t) + case t:tpd.DefDef => + prepareForDefDef(t) + traverseChildren(tree)(using newCtx) + transformDefDef(t) + case t:tpd.TypeDef => + prepareForTypeDef(t) + traverseChildren(tree)(using newCtx) + transformTypeDef(t) + case t: tpd.Bind => + prepareForBind(t) + traverseChildren(tree)(using newCtx) + case _: tpd.InferredTypeTree => + case t@tpd.TypeTree() => + //! DIFFERS FROM MINIPHASE + typeTraverser(unusedDataApply).traverse(t.tpe) + traverseChildren(tree)(using newCtx) + case _ => + //! DIFFERS FROM MINIPHASE + traverseChildren(tree)(using newCtx) + end traverse + end traverser + + /** This is a type traverser which catch some special Types not traversed by the term traverser above */ + private def typeTraverser(dt: (UnusedData => Any) => Unit)(using Context) = new TypeTraverser: + override def traverse(tp: Type): Unit = + if tp.typeSymbol.exists then dt(_.registerUsed(tp.typeSymbol, Some(tp.typeSymbol.name))) + tp match + case AnnotatedType(_, annot) => + dt(_.registerUsed(annot.symbol, None)) + traverseChildren(tp) + case _ => + traverseChildren(tp) + + /** This traverse the annotations of the symbol */ + private def traverseAnnotations(sym: Symbol)(using Context): Unit = + sym.denot.annotations.foreach(annot => traverser.traverse(annot.tree)) + + + /** Do the actual reporting given the result of the anaylsis */ + private def reportUnused(res: UnusedData.UnusedResult)(using Context): Unit = + res.warnings.toList.sortBy(_.pos.line)(using Ordering[Int]).foreach { s => + s match + case UnusedSymbol(t, _, WarnTypes.Imports) => + report.warning(s"unused import", t) + case UnusedSymbol(t, _, WarnTypes.LocalDefs) => + report.warning(s"unused local definition", t) + case UnusedSymbol(t, _, WarnTypes.ExplicitParams) => + report.warning(s"unused explicit parameter", t) + case UnusedSymbol(t, _, WarnTypes.ImplicitParams) => + report.warning(s"unused implicit parameter", t) + case UnusedSymbol(t, _, WarnTypes.PrivateMembers) => + report.warning(s"unused private member", t) + case UnusedSymbol(t, _, WarnTypes.PatVars) => + report.warning(s"unused pattern variable", t) + } + +end CheckUnused + +object CheckUnused: + val phaseNamePrefix: String = "checkUnused" + val description: String = "check for unused elements" + + enum PhaseMode: + case Aggregate + case Report + + private enum WarnTypes: + case Imports + case LocalDefs + case ExplicitParams + case ImplicitParams + case PrivateMembers + case PatVars + + /** + * The key used to retrieve the "unused entity" analysis metadata, + * from the compilation `Context` + */ + private val _key = Property.StickyKey[UnusedData] + + class PostTyper extends CheckUnused(PhaseMode.Aggregate, "PostTyper", _key) + + class PostInlining extends CheckUnused(PhaseMode.Report, "PostInlining", _key) + + /** + * A stateful class gathering the infos on : + * - imports + * - definitions + * - usage + */ + private class UnusedData: + import collection.mutable.{Set => MutSet, Map => MutMap, Stack => MutStack} + import UnusedData.* + + /** The current scope during the tree traversal */ + val currScopeType: MutStack[ScopeType] = MutStack(ScopeType.Other) + + var unusedAggregate: Option[UnusedResult] = None + + /* IMPORTS */ + private val impInScope = MutStack(MutSet[tpd.Import]()) + /** + * We store the symbol along with their accessibility without 
import. + * Accessibility to their definition in outer context/scope + * + * See the `isAccessibleAsIdent` extension method below in the file + */ + private val usedInScope = MutStack(MutSet[(Symbol,Boolean, Option[Name], Boolean)]()) + private val usedInPosition = MutSet[(SrcPos, Name)]() + /* unused import collected during traversal */ + private val unusedImport = MutSet[ImportSelector]() + + /* LOCAL DEF OR VAL / Private Def or Val / Pattern variables */ + private val localDefInScope = MutSet[tpd.MemberDef]() + private val privateDefInScope = MutSet[tpd.MemberDef]() + private val explicitParamInScope = MutSet[tpd.MemberDef]() + private val implicitParamInScope = MutSet[tpd.MemberDef]() + private val patVarsInScope = MutSet[tpd.Bind]() + + /* Unused collection collected at the end */ + private val unusedLocalDef = MutSet[tpd.MemberDef]() + private val unusedPrivateDef = MutSet[tpd.MemberDef]() + private val unusedExplicitParams = MutSet[tpd.MemberDef]() + private val unusedImplicitParams = MutSet[tpd.MemberDef]() + private val unusedPatVars = MutSet[tpd.Bind]() + + /** All used symbols */ + private val usedDef = MutSet[Symbol]() + /** Do not register as used */ + private val doNotRegister = MutSet[Symbol]() + + /** Trivial definitions, avoid registering params */ + private val trivialDefs = MutSet[Symbol]() + + private val paramsToSkip = MutSet[Symbol]() + + /** + * Push a new Scope of the given type, executes the given Unit and + * pop it back to the original type. + */ + def inNewScope(newScope: ScopeType)(execInNewScope: => Unit)(using Context): Unit = + val prev = currScopeType + pushScope(newScope) + execInNewScope + popScope() + + def finishAggregation(using Context)(): Unit = + val unusedInThisStage = this.getUnused + this.unusedAggregate match { + case None => + this.unusedAggregate = Some(unusedInThisStage) + case Some(prevUnused) => + val intersection = unusedInThisStage.warnings.intersect(prevUnused.warnings) + this.unusedAggregate = Some(UnusedResult(intersection)) + } + + + /** + * Register a found (used) symbol along with its name + * + * The optional name will be used to target the right import + * as the same element can be imported with different renaming + */ + def registerUsed(sym: Symbol, name: Option[Name], isDerived: Boolean = false)(using Context): Unit = + if !isConstructorOfSynth(sym) && !doNotRegister(sym) then + if sym.isConstructor && sym.exists then + registerUsed(sym.owner, None) // constructor are "implicitly" imported with the class + else + usedInScope.top += ((sym, sym.isAccessibleAsIdent, name, isDerived)) + usedInScope.top += ((sym.companionModule, sym.isAccessibleAsIdent, name, isDerived)) + usedInScope.top += ((sym.companionClass, sym.isAccessibleAsIdent, name, isDerived)) + if sym.sourcePos.exists then + name.map(n => usedInPosition += ((sym.sourcePos, n))) + + /** Register a symbol that should be ignored */ + def addIgnoredUsage(sym: Symbol)(using Context): Unit = + doNotRegister ++= sym.everySymbol + + /** Remove a symbol that shouldn't be ignored anymore */ + def removeIgnoredUsage(sym: Symbol)(using Context): Unit = + doNotRegister --= sym.everySymbol + + def addIgnoredParam(sym: Symbol)(using Context): Unit = + paramsToSkip += sym + + /** Register an import */ + def registerImport(imp: tpd.Import)(using Context): Unit = + if !tpd.languageImport(imp.expr).nonEmpty && !imp.isGeneratedByEnum && !isTransparentAndInline(imp) then + impInScope.top += imp + unusedImport ++= imp.selectors.filter { s => + !shouldSelectorBeReported(imp, s) && 
!isImportExclusion(s) + } + + /** Register (or not) some `val` or `def` according to the context, scope and flags */ + def registerDef(memDef: tpd.MemberDef)(using Context): Unit = + if memDef.isValidMemberDef then + if memDef.isValidParam then + if memDef.symbol.isOneOf(GivenOrImplicit) then + if !paramsToSkip.contains(memDef.symbol) then + implicitParamInScope += memDef + else if !paramsToSkip.contains(memDef.symbol) then + explicitParamInScope += memDef + else if currScopeType.top == ScopeType.Local then + localDefInScope += memDef + else if memDef.shouldReportPrivateDef then + privateDefInScope += memDef + + /** Register pattern variable */ + def registerPatVar(patvar: tpd.Bind)(using Context): Unit = + if !patvar.symbol.isUnusedAnnot then + patVarsInScope += patvar + + /** enter a new scope */ + def pushScope(newScopeType: ScopeType): Unit = + // unused imports : + currScopeType.push(newScopeType) + impInScope.push(MutSet()) + usedInScope.push(MutSet()) + + /** + * leave the current scope and do : + * + * - If there are imports in this scope check for unused ones + */ + def popScope()(using Context): Unit = + // used symbol in this scope + val used = usedInScope.pop().toSet + // used imports in this scope + val imports = impInScope.pop() + val kept = used.filterNot { (sym, isAccessible, optName, isDerived) => + // keep the symbol for outer scope, if it matches **no** import + // This is the first matching wildcard selector + var selWildCard: Option[ImportSelector] = None + + val matchedExplicitImport = imports.exists { imp => + sym.isInImport(imp, isAccessible, optName, isDerived) match + case None => false + case optSel@Some(sel) if sel.isWildcard => + if selWildCard.isEmpty then selWildCard = optSel + // We keep wildcard symbol for the end as they have the least precedence + false + case Some(sel) => + unusedImport -= sel + true + } + if !matchedExplicitImport && selWildCard.isDefined then + unusedImport -= selWildCard.get + true // a matching import exists so the symbol won't be kept for outer scope + else + matchedExplicitImport + } + + // if there's an outer scope + if usedInScope.nonEmpty then + // we keep the symbols not referencing an import in this scope + // as it can be the only reference to an outer import + usedInScope.top ++= kept + // register usage in this scope for other warnings at the end of the phase + usedDef ++= used.map(_._1) + // retrieve previous scope type + currScopeType.pop + end popScope + + /** + * Leave the scope and return a `List` of unused `ImportSelector`s + * + * The given `List` is sorted by line and then column of the position + */ + + def getUnused(using Context): UnusedResult = + popScope() + + val sortedImp = + if ctx.settings.WunusedHas.imports || ctx.settings.WunusedHas.strictNoImplicitWarn then + unusedImport.map(d => UnusedSymbol(d.srcPos, d.name, WarnTypes.Imports)).toList + else + Nil + val sortedLocalDefs = + if ctx.settings.WunusedHas.locals then + localDefInScope + .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) + .filterNot(d => containsSyntheticSuffix(d.symbol)) + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.LocalDefs)).toList + else + Nil + val sortedExplicitParams = + if ctx.settings.WunusedHas.explicits then + explicitParamInScope + .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) + .filterNot(d => 
containsSyntheticSuffix(d.symbol)) + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.ExplicitParams)).toList + else + Nil + val sortedImplicitParams = + if ctx.settings.WunusedHas.implicits then + implicitParamInScope + .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => containsSyntheticSuffix(d.symbol)) + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.ImplicitParams)).toList + else + Nil + val sortedPrivateDefs = + if ctx.settings.WunusedHas.privates then + privateDefInScope + .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => containsSyntheticSuffix(d.symbol)) + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.PrivateMembers)).toList + else + Nil + val sortedPatVars = + if ctx.settings.WunusedHas.patvars then + patVarsInScope + .filterNot(d => d.symbol.usedDefContains) + .filterNot(d => containsSyntheticSuffix(d.symbol)) + .filterNot(d => usedInPosition.exists { case (pos, name) => d.span.contains(pos.span) && name == d.symbol.name}) + .map(d => UnusedSymbol(d.namePos, d.name, WarnTypes.PatVars)).toList + else + Nil + val warnings = List(sortedImp, sortedLocalDefs, sortedExplicitParams, sortedImplicitParams, sortedPrivateDefs, sortedPatVars).flatten.sortBy { s => + val pos = s.pos.sourcePos + (pos.line, pos.column) + } + UnusedResult(warnings.toSet) + end getUnused + //============================ HELPERS ==================================== + + + /** + * Checks if import selects a def that is transparent and inline + */ + private def isTransparentAndInline(imp: tpd.Import)(using Context): Boolean = + imp.selectors.exists { sel => + val qual = imp.expr + val importedMembers = qual.tpe.member(sel.name).alternatives.map(_.symbol) + importedMembers.exists(s => s.is(Transparent) && s.is(Inline)) + } + + /** + * Heuristic to detect synthetic suffixes in names of symbols + */ + private def containsSyntheticSuffix(symbol: Symbol)(using Context): Boolean = + symbol.name.mangledString.contains("$") + + /** + * Is the the constructor of synthetic package object + * Should be ignored as it is always imported/used in package + * Trigger false negative on used import + * + * Without this check example: + * + * --- WITH PACKAGE : WRONG --- + * {{{ + * package a: + * val x: Int = 0 + * package b: + * import a._ // no warning + * }}} + * --- WITH OBJECT : OK --- + * {{{ + * object a: + * val x: Int = 0 + * object b: + * import a._ // unused warning + * }}} + */ + private def isConstructorOfSynth(sym: Symbol)(using Context): Boolean = + sym.exists && sym.isConstructor && sym.owner.isPackageObject && sym.owner.is(Synthetic) + + /** + * This is used to avoid reporting the parameters of the synthetic main method + * generated by `@main` + */ + private def isSyntheticMainParam(sym: Symbol)(using Context): Boolean = + sym.exists && ctx.platform.isMainMethod(sym.owner) && sym.owner.is(Synthetic) + + /** + * This is used to ignore exclusion imports (i.e. import `qual`.{`member` => _}) + */ + private def isImportExclusion(sel: ImportSelector): Boolean = sel.renamed match + case untpd.Ident(name) => name == StdNames.nme.WILDCARD + case _ => false + + /** + * If -Wunused:strict-no-implicit-warn import and this import selector could potentially import implicit. 
+ * return true + */ + private def shouldSelectorBeReported(imp: tpd.Import, sel: ImportSelector)(using Context): Boolean = + ctx.settings.WunusedHas.strictNoImplicitWarn && ( + sel.isWildcard || + imp.expr.tpe.member(sel.name.toTermName).alternatives.exists(_.symbol.isOneOf(GivenOrImplicit)) || + imp.expr.tpe.member(sel.name.toTypeName).alternatives.exists(_.symbol.isOneOf(GivenOrImplicit)) + ) + + extension (tree: ImportSelector) + def boundTpe: Type = tree.bound match { + case untpd.TypedSplice(tree1) => tree1.tpe + case _ => NoType + } + + extension (sym: Symbol) + /** is accessible without import in current context */ + private def isAccessibleAsIdent(using Context): Boolean = + sym.exists && + ctx.outersIterator.exists{ c => + c.owner == sym.owner + || sym.owner.isClass && c.owner.isClass + && c.owner.thisType.baseClasses.contains(sym.owner) + && c.owner.thisType.member(sym.name).alternatives.contains(sym) + } + + /** Given an import and accessibility, return selector that matches import<->symbol */ + private def isInImport(imp: tpd.Import, isAccessible: Boolean, symName: Option[Name], isDerived: Boolean)(using Context): Option[ImportSelector] = + val tpd.Import(qual, sels) = imp + val dealiasedSym = dealias(sym) + val simpleSelections = qual.tpe.member(sym.name).alternatives + val typeSelections = sels.flatMap(n => qual.tpe.member(n.name.toTypeName).alternatives) + val termSelections = sels.flatMap(n => qual.tpe.member(n.name.toTermName).alternatives) + val selectionsToDealias = typeSelections ::: termSelections + val qualHasSymbol = simpleSelections.map(_.symbol).contains(sym) || (simpleSelections ::: selectionsToDealias).map(_.symbol).map(dealias).contains(dealiasedSym) + def selector = sels.find(sel => (sel.name.toTermName == sym.name || sel.name.toTypeName == sym.name) && symName.map(n => n.toTermName == sel.rename).getOrElse(true)) + def dealiasedSelector = if(isDerived) sels.flatMap(sel => selectionsToDealias.map(m => (sel, m.symbol))).collect { + case (sel, sym) if dealias(sym) == dealiasedSym => sel + }.headOption else None + def givenSelector = if sym.is(Given) || sym.is(Implicit) + then sels.filter(sel => sel.isGiven && !sel.bound.isEmpty).find(sel => sel.boundTpe =:= sym.info) + else None + def wildcard = sels.find(sel => sel.isWildcard && ((sym.is(Given) == sel.isGiven && sel.bound.isEmpty) || sym.is(Implicit))) + if qualHasSymbol && (!isAccessible || sym.isRenamedSymbol(symName)) && sym.exists then + selector.orElse(dealiasedSelector).orElse(givenSelector).orElse(wildcard) // selector with name or wildcard (or given) + else + None + + private def isRenamedSymbol(symNameInScope: Option[Name])(using Context) = + sym.name != nme.NO_NAME && symNameInScope.exists(_.toSimpleName != sym.name.toSimpleName) + + private def dealias(symbol: Symbol)(using Context): Symbol = + if(symbol.isType && symbol.asType.denot.isAliasType) then + symbol.asType.typeRef.dealias.typeSymbol + else symbol + /** Annotated with @unused */ + private def isUnusedAnnot(using Context): Boolean = + sym.annotations.exists(a => a.symbol == ctx.definitions.UnusedAnnot) + + private def shouldNotReportParamOwner(using Context): Boolean = + if sym.exists then + val owner = sym.owner + trivialDefs(owner) || // is a trivial def + owner.isPrimaryConstructor || + owner.annotations.exists ( // @depreacated + _.symbol == ctx.definitions.DeprecatedAnnot + ) || + owner.isAllOf(Synthetic | PrivateLocal) || + owner.is(Accessor) || + owner.isOverriden + else + false + + private def usedDefContains(using Context): Boolean 
= + sym.everySymbol.exists(usedDef.apply) + + private def everySymbol(using Context): List[Symbol] = + List(sym, sym.companionClass, sym.companionModule, sym.moduleClass).filter(_.exists) + + /** A function is overriden. Either has `override flags` or parent has a matching member (type and name) */ + private def isOverriden(using Context): Boolean = + sym.is(Flags.Override) || + (sym.exists && sym.owner.thisType.parents.exists(p => sym.matchingMember(p).exists)) + + end extension + + extension (defdef: tpd.DefDef) + // so trivial that it never consumes params + private def isTrivial(using Context): Boolean = + val rhs = defdef.rhs + rhs.symbol == ctx.definitions.Predef_undefined || + rhs.tpe =:= ctx.definitions.NothingType || + defdef.symbol.is(Deferred) || + (rhs match { + case _: tpd.Literal => true + case _ => rhs.tpe match + case ConstantType(_) => true + case tp: TermRef => + // Detect Scala 2 SingleType + tp.underlying.classSymbol.is(Flags.Module) + case _ => + false + }) + def registerTrivial(using Context): Unit = + if defdef.isTrivial then + trivialDefs += defdef.symbol + + extension (memDef: tpd.MemberDef) + private def isValidMemberDef(using Context): Boolean = + memDef.symbol.exists + && !memDef.symbol.isUnusedAnnot + && !memDef.symbol.isAllOf(Flags.AccessorCreationFlags) + && !memDef.name.isWildcard + && !memDef.symbol.owner.is(ExtensionMethod) + + private def isValidParam(using Context): Boolean = + val sym = memDef.symbol + (sym.is(Param) || sym.isAllOf(PrivateParamAccessor | Local, butNot = CaseAccessor)) && + !isSyntheticMainParam(sym) && + !sym.shouldNotReportParamOwner + + + private def shouldReportPrivateDef(using Context): Boolean = + currScopeType.top == ScopeType.Template && !memDef.symbol.isConstructor && memDef.symbol.is(Private, butNot = SelfName | Synthetic | CaseAccessor) + + extension (imp: tpd.Import) + /** Enum generate an import for its cases (but outside them), which should be ignored */ + def isGeneratedByEnum(using Context): Boolean = + imp.symbol.exists && imp.symbol.owner.is(Flags.Enum, butNot = Flags.Case) + + extension (thisName: Name) + private def isWildcard: Boolean = + thisName == StdNames.nme.WILDCARD || thisName.is(WildcardParamName) + + end UnusedData + + private object UnusedData: + enum ScopeType: + case Local + case Template + case Other + + object ScopeType: + /** return the scope corresponding to the enclosing scope of the given tree */ + def fromTree(tree: tpd.Tree): ScopeType = tree match + case _:tpd.Template => Template + case _:tpd.Block => Local + case _ => Other + + case class UnusedSymbol(pos: SrcPos, name: Name, warnType: WarnTypes) + /** A container for the results of the used elements analysis */ + case class UnusedResult(warnings: Set[UnusedSymbol]) + object UnusedResult: + val Empty = UnusedResult(Set.empty) + +end CheckUnused + diff --git a/compiler/src/dotty/tools/dotc/transform/CompleteJavaEnums.scala b/compiler/src/dotty/tools/dotc/transform/CompleteJavaEnums.scala index be454281bcbb..b7e8ccf4e7e1 100644 --- a/compiler/src/dotty/tools/dotc/transform/CompleteJavaEnums.scala +++ b/compiler/src/dotty/tools/dotc/transform/CompleteJavaEnums.scala @@ -80,7 +80,7 @@ class CompleteJavaEnums extends MiniPhase with InfoTransformer { thisPhase => parents.map { case app @ Apply(fn, args0) if fn.symbol.owner == targetCls => if args0.nonEmpty && targetCls == defn.JavaEnumClass then - report.error("the constructor of java.lang.Enum cannot be called explicitly", app.sourcePos) + report.error(em"the constructor of java.lang.Enum cannot be 
called explicitly", app.sourcePos) cpy.Apply(app)(fn, args0 ++ args) case p => p } @@ -110,7 +110,7 @@ class CompleteJavaEnums extends MiniPhase with InfoTransformer { thisPhase => yield { def forwarderSym(flags: FlagSet, info: Type): Symbol { type ThisName = TermName } = val sym = newSymbol(clazz, enumValue.name.asTermName, flags, info) - sym.addAnnotation(Annotations.Annotation(defn.ScalaStaticAnnot)) + sym.addAnnotation(Annotations.Annotation(defn.ScalaStaticAnnot, sym.span)) sym val body = moduleRef.select(enumValue) if ctx.settings.scalajs.value then diff --git a/compiler/src/dotty/tools/dotc/transform/ContextFunctionResults.scala b/compiler/src/dotty/tools/dotc/transform/ContextFunctionResults.scala index be58fb41f1da..2ab910f6d06e 100644 --- a/compiler/src/dotty/tools/dotc/transform/ContextFunctionResults.scala +++ b/compiler/src/dotty/tools/dotc/transform/ContextFunctionResults.scala @@ -39,7 +39,7 @@ object ContextFunctionResults: val count = contextResultCount(mdef.rhs, mdef.tpt.tpe) if Config.flattenContextFunctionResults && count != 0 && !disabled then - val countAnnot = Annotation(defn.ContextResultCountAnnot, Literal(Constant(count))) + val countAnnot = Annotation(defn.ContextResultCountAnnot, Literal(Constant(count)), mdef.symbol.span) mdef.symbol.addAnnotation(countAnnot) end annotateContextResults diff --git a/compiler/src/dotty/tools/dotc/transform/DropBreaks.scala b/compiler/src/dotty/tools/dotc/transform/DropBreaks.scala new file mode 100644 index 000000000000..3081bd5c2b20 --- /dev/null +++ b/compiler/src/dotty/tools/dotc/transform/DropBreaks.scala @@ -0,0 +1,251 @@ +package dotty.tools +package dotc +package transform + +import ast.{Trees, tpd} +import core.* +import Decorators.* +import NameKinds.BoundaryName +import MegaPhase._ +import Types._, Contexts._, Flags._, DenotTransformers._ +import Symbols._, StdNames._, Trees._ +import util.Property +import Constants.Constant +import Flags.MethodOrLazy + +object DropBreaks: + val name: String = "dropBreaks" + val description: String = "replace local Break throws by labeled returns" + + /** Usage data and other info associated with a Label symbol. + * @param goto the return-label to use for a labeled return. + * @param enclMeth the enclosing method + */ + class LabelUsage(val goto: TermSymbol, val enclMeth: Symbol): + /** The number of references to associated label that come from labeled returns */ + var returnRefs: Int = 0 + /** The number of other references to associated label */ + var otherRefs: Int = 0 + + private val LabelUsages = new Property.Key[Map[Symbol, LabelUsage]] + private val ShadowedLabels = new Property.Key[Set[Symbol]] + +/** Rewrites local Break throws to labeled returns. + * Drops `try` statements on breaks if no other uses of its label remain. + * A Break throw with a `Label` created by some enclosing boundary is replaced + * with a labeled return if + * + * - the throw and the boundary are in the same method, and + * - there is no try expression inside the boundary that encloses the throw. + */ +class DropBreaks extends MiniPhase: + import DropBreaks.* + + import tpd._ + + override def phaseName: String = DropBreaks.name + + override def description: String = DropBreaks.description + + override def runsAfterGroupsOf: Set[String] = Set(ElimByName.name) + // we want by-name parameters to be converted to closures + + /** The number of boundary nodes enclosing the currently analized tree. 
*/ + private var enclosingBoundaries: Int = 0 + + private object LabelTry: + + object GuardedThrow: + + /** `(ex, local)` provided `expr` matches + * + * if ex.label.eq(local) then ex.value else throw ex + */ + def unapply(expr: Tree)(using Context): Option[(Symbol, Symbol)] = stripTyped(expr) match + case If( + Apply(Select(Select(ex: Ident, label), eq), (lbl @ Ident(local)) :: Nil), + Select(ex2: Ident, value), + Apply(throww, (ex3: Ident) :: Nil)) + if label == nme.label && eq == nme.eq && local == nme.local && value == nme.value + && throww.symbol == defn.throwMethod + && ex.symbol == ex2.symbol && ex.symbol == ex3.symbol => + Some((ex.symbol, lbl.symbol)) + case _ => + None + end GuardedThrow + + /** `(local, body)` provided `tree` matches + * + * try body + * catch case ex: Break => + * if ex.label.eq(local) then ex.value else throw ex + */ + def unapply(tree: Tree)(using Context): Option[(Symbol, Tree)] = stripTyped(tree) match + case Try(body, CaseDef(pat @ Bind(_, Typed(_, tpt)), EmptyTree, GuardedThrow(exc, local)) :: Nil, EmptyTree) + if tpt.tpe.isRef(defn.BreakClass) && exc == pat.symbol => + Some((local, body)) + case _ => + None + end LabelTry + + private object BreakBoundary: + + /** `(local, body)` provided `tree` matches + * + * { val local: Label[...] = ...; } + */ + def unapply(tree: Tree)(using Context): Option[(Symbol, Tree)] = stripTyped(tree) match + case Block((vd @ ValDef(nme.local, _, _)) :: Nil, LabelTry(caughtAndRhs)) + if vd.symbol.info.isRef(defn.LabelClass) && vd.symbol == caughtAndRhs._1 => + Some(caughtAndRhs) + case _ => + None + end BreakBoundary + + private object Break: + + private def isBreak(sym: Symbol)(using Context): Boolean = + sym.name == nme.break && sym.owner == defn.boundaryModule.moduleClass + + /** `(local, arg)` provided `tree` matches + * + * break[...](arg)(local) + * + * or `(local, ())` provided `tree` matches + * + * break()(local) + */ + def unapply(tree: Tree)(using Context): Option[(Symbol, Tree)] = tree match + case Apply(Apply(fn, args), id :: Nil) + if isBreak(fn.symbol) => + stripInlined(id) match + case id: Ident => + val arg = (args: @unchecked) match + case arg :: Nil => arg + case Nil => Literal(Constant(())).withSpan(tree.span) + Some((id.symbol, arg)) + case _ => None + case _ => None + end Break + + /** The LabelUsage data associated with `lbl` in the current context */ + private def labelUsage(lbl: Symbol)(using Context): Option[LabelUsage] = + for + usesMap <- ctx.property(LabelUsages) + uses <- usesMap.get(lbl) + yield + uses + + /** If `tree` is a BreakBoundary, associate a fresh `LabelUsage` with its label. */ + override def prepareForBlock(tree: Block)(using Context): Context = tree match + case BreakBoundary(label, _) => + enclosingBoundaries += 1 + val mapSoFar = ctx.property(LabelUsages).getOrElse(Map.empty) + val goto = newSymbol(ctx.owner, BoundaryName.fresh(), Synthetic | Label, tree.tpe) + ctx.fresh.setProperty(LabelUsages, + mapSoFar.updated(label, LabelUsage(goto, ctx.owner.enclosingMethod))) + case _ => + ctx + + /** Include all enclosing labels in the `ShadowedLabels` context property. + * This means that breaks to these labels will not be translated to labeled + * returns while this context is valid. 
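   * Illustrative sketch (editorial example, not part of this change; assumes
   * `scala.util.boundary`): here the label is shadowed by the intervening `try`, so the
   * break keeps its exception-based implementation instead of becoming a labeled return:
   * {{{
   *   boundary:
   *     try break(1)                 // not rewritten: a `try` encloses the throw
   *     finally println("cleanup")
   * }}}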
+ */ + private def shadowLabels(using Context): Context = + ctx.property(LabelUsages) match + case Some(usesMap) => + val setSoFar = ctx.property(ShadowedLabels).getOrElse(Set.empty) + ctx.fresh.setProperty(ShadowedLabels, setSoFar ++ usesMap.keysIterator) + case _ => ctx + + /** Need to suppress labeled returns if there is an intervening try + */ + override def prepareForTry(tree: Try)(using Context): Context = + if enclosingBoundaries == 0 then ctx + else tree match + case LabelTry(_, _) => ctx + case _ => shadowLabels + + override def prepareForValDef(tree: ValDef)(using Context): Context = + if enclosingBoundaries != 0 + && tree.symbol.is(Lazy) + && tree.symbol.owner == ctx.owner.enclosingMethod + then shadowLabels // RHS will be converted to a lambda + else ctx + + /** If `tree` is a BreakBoundary, transform it as follows: + * - Wrap it in a labeled block if its label has local uses + * - Drop the try/catch if its label has no other uses + */ + override def transformBlock(tree: Block)(using Context): Tree = tree match + case BreakBoundary(label, expr) => + enclosingBoundaries -= 1 + val uses = ctx.property(LabelUsages).get(label) + val tree1 = + if uses.otherRefs > 1 then + // one non-local ref is always in the catch clause; this one does not count + tree + else + expr + report.log(i"trans boundary block $label // ${uses.returnRefs}, ${uses.otherRefs}") + if uses.returnRefs > 0 then Labeled(uses.goto, tree1) else tree1 + case _ => + tree + + private def isBreak(sym: Symbol)(using Context): Boolean = + sym.name == nme.break && sym.owner == defn.boundaryModule.moduleClass + + private def transformBreak(tree: Tree, arg: Tree, lbl: Symbol)(using Context): Tree = + report.log(i"transform break $tree/$arg/$lbl") + labelUsage(lbl) match + case Some(uses: LabelUsage) + if uses.enclMeth == ctx.owner.enclosingMethod + && !ctx.property(ShadowedLabels).getOrElse(Set.empty).contains(lbl) + => + uses.otherRefs -= 1 + uses.returnRefs += 1 + Return(arg, ref(uses.goto)).withSpan(arg.span) + case _ => + tree + + + /** Rewrite a break call + * + * break.apply[...](value)(using lbl) + * + * where `lbl` is a label defined in the current method and is not included in + * ShadowedLabels to + * + * return[target] arg + * + * where `target` is the `goto` return label associated with `lbl`. + * Adjust associated ref counts accordingly. The local refcount is increased + * and the non-local refcount is decreased, since the `lbl` implicit argument + * to `break` is dropped. + */ + override def transformApply(tree: Apply)(using Context): Tree = + if enclosingBoundaries == 0 then tree + else tree match + case Break(lbl, arg) => + labelUsage(lbl) match + case Some(uses: LabelUsage) + if uses.enclMeth == ctx.owner.enclosingMethod + && !ctx.property(ShadowedLabels).getOrElse(Set.empty).contains(lbl) + => + uses.otherRefs -= 1 + uses.returnRefs += 1 + Return(arg, ref(uses.goto)).withSpan(arg.span) + case _ => tree + case _ => tree + + /** If `tree` refers to an enclosing label, increase its non-local refcount. + * This increase is corrected in `transformInlined` if the reference turns + * out to be part of a BreakThrow to a local, non-shadowed label.
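   * Illustrative sketch (editorial example, not part of this change; `p: Boolean` is assumed):
   * {{{
   *   boundary:                 // defines a label, say `local`
   *     if p then break(1)      // the implicit label argument is such an `Ident`
   *     0
   * }}}
   * The label `Ident` passed implicitly to `break` first bumps the non-local count here; if
   * the break is later rewritten to a labeled return, that bump is undone and the return
   * count is incremented instead.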
+ */ + override def transformIdent(tree: Ident)(using Context): Tree = + if enclosingBoundaries != 0 then + for uses <- labelUsage(tree.symbol) do + uses.otherRefs += 1 + tree + +end DropBreaks diff --git a/compiler/src/dotty/tools/dotc/transform/ElimRepeated.scala b/compiler/src/dotty/tools/dotc/transform/ElimRepeated.scala index bdc2a268c1f8..78baec70bee6 100644 --- a/compiler/src/dotty/tools/dotc/transform/ElimRepeated.scala +++ b/compiler/src/dotty/tools/dotc/transform/ElimRepeated.scala @@ -51,10 +51,10 @@ class ElimRepeated extends MiniPhase with InfoTransformer { thisPhase => // see https://github.com/scala/bug/issues/11714 val validJava = isValidJavaVarArgs(sym.info) if !validJava then - report.error("""To generate java-compatible varargs: + report.error(em"""To generate java-compatible varargs: | - there must be a single repeated parameter | - it must be the last argument in the last parameter list - |""".stripMargin, + |""", sym.sourcePos) else addVarArgsForwarder(sym, isJavaVarargsOverride, hasAnnotation, parentHasAnnotation) diff --git a/compiler/src/dotty/tools/dotc/transform/Erasure.scala b/compiler/src/dotty/tools/dotc/transform/Erasure.scala index 84005424e3ec..129964557995 100644 --- a/compiler/src/dotty/tools/dotc/transform/Erasure.scala +++ b/compiler/src/dotty/tools/dotc/transform/Erasure.scala @@ -549,28 +549,30 @@ object Erasure { /** Check that Java statics and packages can only be used in selections. */ - private def checkNotErased(tree: Tree)(using Context): tree.type = { - if (!ctx.mode.is(Mode.Type)) { + private def checkNotErased(tree: Tree)(using Context): tree.type = + if !ctx.mode.is(Mode.Type) then if isErased(tree) then val msg = if tree.symbol.is(Flags.Inline) then em"""${tree.symbol} is declared as `inline`, but was not inlined | - |Try increasing `-Xmax-inlines` above ${ctx.settings.XmaxInlines.value}""".stripMargin - else em"${tree.symbol} is declared as `erased`, but is in fact used" + |Try increasing `-Xmax-inlines` above ${ctx.settings.XmaxInlines.value}""" + else + em"${tree.symbol} is declared as `erased`, but is in fact used" report.error(msg, tree.srcPos) - tree.symbol.getAnnotation(defn.CompileTimeOnlyAnnot) match { + tree.symbol.getAnnotation(defn.CompileTimeOnlyAnnot) match case Some(annot) => - def defaultMsg = - i"""Reference to ${tree.symbol.showLocated} should not have survived, - |it should have been processed and eliminated during expansion of an enclosing macro or term erasure.""" - val message = annot.argumentConstant(0).fold(defaultMsg)(_.stringValue) + val message = annot.argumentConstant(0) match + case Some(c) => + c.stringValue.toMessage + case _ => + em"""Reference to ${tree.symbol.showLocated} should not have survived, + |it should have been processed and eliminated during expansion of an enclosing macro or term erasure.""" report.error(message, tree.srcPos) case _ => // OK - } - } + checkNotErasedClass(tree) - } + end checkNotErased private def checkNotErasedClass(tp: Type, tree: untpd.Tree)(using Context): Unit = tp match case JavaArrayType(et) => @@ -614,7 +616,7 @@ object Erasure { * are handled separately by [[typedDefDef]], [[typedValDef]] and [[typedTyped]]. 
*/ override def typedTypeTree(tree: untpd.TypeTree, pt: Type)(using Context): TypeTree = - checkNotErasedClass(tree.withType(erasure(tree.tpe))) + checkNotErasedClass(tree.withType(erasure(tree.typeOpt))) /** This override is only needed to semi-erase type ascriptions */ override def typedTyped(tree: untpd.Typed, pt: Type)(using Context): Tree = @@ -696,18 +698,20 @@ object Erasure { return tree.asInstanceOf[Tree] // we are re-typing a primitive array op val owner = mapOwner(origSym) - var sym = if (owner eq origSym.maybeOwner) origSym else owner.info.decl(tree.name).symbol - if !sym.exists then - // We fail the sym.exists test for pos/i15158.scala, where we pass an infinitely - // recurring match type to an overloaded constructor. An equivalent test - // with regular apply methods succeeds. It's at present unclear whether - // - the program should be rejected, or - // - there is another fix. - // Therefore, we apply the fix to use the pre-erasure symbol, but only - // for constructors, in order not to mask other possible bugs that would - // trigger the assert(sym.exists, ...) below. - val prevSym = tree.symbol(using preErasureCtx) - if prevSym.isConstructor then sym = prevSym + val sym = + (if (owner eq origSym.maybeOwner) origSym else owner.info.decl(tree.name).symbol) + .orElse { + // We fail the sym.exists test for pos/i15158.scala, where we pass an infinitely + // recurring match type to an overloaded constructor. An equivalent test + // with regular apply methods succeeds. It's at present unclear whether + // - the program should be rejected, or + // - there is another fix. + // Therefore, we apply the fix to use the pre-erasure symbol, but only + // for constructors, in order not to mask other possible bugs that would + // trigger the assert(sym.exists, ...) below. 
+ val prevSym = tree.symbol(using preErasureCtx) + if prevSym.isConstructor then prevSym else NoSymbol + } assert(sym.exists, i"no owner from $owner/${origSym.showLocated} in $tree") @@ -780,7 +784,7 @@ object Erasure { val tp = originalQual if tp =:= qual1.tpe.widen then return errorTree(qual1, - ex"Unable to emit reference to ${sym.showLocated}, ${sym.owner} is not accessible in ${ctx.owner.enclosingClass}") + em"Unable to emit reference to ${sym.showLocated}, ${sym.owner} is not accessible in ${ctx.owner.enclosingClass}") tp recur(cast(qual1, castTarget)) } diff --git a/compiler/src/dotty/tools/dotc/transform/ExpandSAMs.scala b/compiler/src/dotty/tools/dotc/transform/ExpandSAMs.scala index cd6753eaed69..0552fe31f8a2 100644 --- a/compiler/src/dotty/tools/dotc/transform/ExpandSAMs.scala +++ b/compiler/src/dotty/tools/dotc/transform/ExpandSAMs.scala @@ -186,7 +186,7 @@ class ExpandSAMs extends MiniPhase: private def checkRefinements(tpe: Type, tree: Tree)(using Context): Type = tpe.dealias match { case RefinedType(parent, name, _) => if (name.isTermName && tpe.member(name).symbol.ownersIterator.isEmpty) // if member defined in the refinement - report.error("Lambda does not define " + name, tree.srcPos) + report.error(em"Lambda does not define $name", tree.srcPos) checkRefinements(parent, tree) case tpe => tpe diff --git a/compiler/src/dotty/tools/dotc/transform/ExplicitOuter.scala b/compiler/src/dotty/tools/dotc/transform/ExplicitOuter.scala index 00074a6ea81a..cddfe51275c8 100644 --- a/compiler/src/dotty/tools/dotc/transform/ExplicitOuter.scala +++ b/compiler/src/dotty/tools/dotc/transform/ExplicitOuter.scala @@ -176,8 +176,9 @@ object ExplicitOuter { if prefix == NoPrefix then outerCls.typeRef.appliedTo(outerCls.typeParams.map(_ => TypeBounds.empty)) else prefix.widen) val info = if (flags.is(Method)) ExprType(target) else target + val currentNestingLevel = ctx.nestingLevel atPhaseNoEarlier(explicitOuterPhase.next) { // outer accessors are entered at explicitOuter + 1, should not be defined before. - newSymbol(owner, name, SyntheticArtifact | flags, info, coord = cls.coord) + newSymbol(owner, name, SyntheticArtifact | flags, info, coord = cls.coord, nestingLevel = currentNestingLevel) } } @@ -255,7 +256,6 @@ object ExplicitOuter { */ def referencesOuter(cls: Symbol, tree: Tree)(using Context): Boolean = - val test = new TreeAccumulator[Boolean]: private var inInline = false @@ -301,19 +301,20 @@ object ExplicitOuter { def containsOuterRefs(t: Tree): Boolean = t match case _: This | _: Ident => isOuterRef(t.tpe) case nw: New => - val newCls = nw.tpe.classSymbol + val newType = nw.tpe.dealias + val newCls = newType.classSymbol isOuterSym(newCls.owner.enclosingClass) || - hasOuterPrefix(nw.tpe) || + hasOuterPrefix(newType) || newCls.owner.isTerm && cls.isProperlyContainedIn(newCls) // newCls might get proxies for free variables. If current class is // properly contained in newCls, it needs an outer path to newCls access the // proxies and forward them to the new instance. 
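        // Editorial illustration (sketch, not part of this change): the classic case that
        // needs an outer pointer is an inner class using state of its enclosing class, e.g.
        //   class Outer { val x = 1; class Inner { def get = x } }   // Inner.get needs Outer.this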
case app: TypeApply if app.symbol.isTypeTest => // Type tests of singletons translate to `eq` tests with references, which might require outer pointers - containsOuterRefsAtTopLevel(app.args.head.tpe) + containsOuterRefsAtTopLevel(app.args.head.tpe.dealias) case t: TypeTree if inInline => // Expansions of inline methods must be able to address outer types - containsOuterRefsAnywhere(t.tpe) + containsOuterRefsAnywhere(t.tpe.dealias) case _ => false diff --git a/compiler/src/dotty/tools/dotc/transform/ExtensionMethods.scala b/compiler/src/dotty/tools/dotc/transform/ExtensionMethods.scala index 9c580235a2e4..a430f7532066 100644 --- a/compiler/src/dotty/tools/dotc/transform/ExtensionMethods.scala +++ b/compiler/src/dotty/tools/dotc/transform/ExtensionMethods.scala @@ -13,7 +13,7 @@ import core._ import Types._, Contexts._, Names._, Flags._, DenotTransformers._, Phases._ import SymDenotations._, Symbols._, StdNames._, Denotations._ import TypeErasure.{ valueErasure, ErasedValueType } -import NameKinds.ExtMethName +import NameKinds.{ExtMethName, BodyRetainerName} import Decorators._ import TypeUtils._ @@ -79,7 +79,7 @@ class ExtensionMethods extends MiniPhase with DenotTransformer with FullParamete // because it adds extension methods before pickling. if (!(valueClass.is(Scala2x))) for (decl <- valueClass.classInfo.decls) - if (isMethodWithExtension(decl)) + if isMethodWithExtension(decl) then enterInModuleClass(createExtensionMethod(decl, moduleClassSym.symbol)) // Create synthetic methods to cast values between the underlying type @@ -179,7 +179,10 @@ object ExtensionMethods { /** Name of the extension method that corresponds to given instance method `meth`. */ def extensionName(imeth: Symbol)(using Context): TermName = - ExtMethName(imeth.name.asTermName) + ExtMethName( + imeth.name.asTermName match + case BodyRetainerName(name) => name + case name => name) /** Return the extension method that corresponds to given instance method `meth`. */ def extensionMethod(imeth: Symbol)(using Context): TermSymbol = @@ -188,9 +191,17 @@ object ExtensionMethods { val companion = imeth.owner.companionModule val companionInfo = companion.info val candidates = companionInfo.decl(extensionName(imeth)).alternatives - val matching = - // See the documentation of `memberSignature` to understand why `.stripPoly.ensureMethodic` is needed here. - candidates filter (c => FullParameterization.memberSignature(c.info) == imeth.info.stripPoly.ensureMethodic.signature) + def matches(candidate: SingleDenotation) = + FullParameterization.memberSignature(candidate.info) == imeth.info.stripPoly.ensureMethodic.signature + // See the documentation of `memberSignature` to understand why `.stripPoly.ensureMethodic` is needed here. 
+ && (if imeth.targetName == imeth.name then + // imeth does not have a @targetName annotation, candidate should not have one either + candidate.symbol.targetName == candidate.symbol.name + else + // imeth has a @targetName annotation, candidate's target name must match + imeth.targetName == candidate.symbol.targetName + ) + val matching = candidates.filter(matches) assert(matching.nonEmpty, i"""no extension method found for: | | Candidates (signatures normalized): | | ${candidates.map(c => s"${c.name}:${c.info.signature}:${FullParameterization.memberSignature(c.info)}").mkString("\n")}""") + if matching.tail.nonEmpty then + // this case will report a "have the same erasure" error later at erasure phase + report.log(i"multiple extension methods match $imeth: ${candidates.map(c => i"${c.name}:${c.info}")}") matching.head.symbol.asTerm } } diff --git a/compiler/src/dotty/tools/dotc/transform/GenericSignatures.scala b/compiler/src/dotty/tools/dotc/transform/GenericSignatures.scala index 9a6ab233e239..050abf7f3cb7 100644 --- a/compiler/src/dotty/tools/dotc/transform/GenericSignatures.scala +++ b/compiler/src/dotty/tools/dotc/transform/GenericSignatures.scala @@ -5,16 +5,19 @@ package transform import core.Annotations._ import core.Contexts._ import core.Phases._ +import core.Decorators.* import core.Definitions import core.Flags._ import core.Names.Name import core.Symbols._ import core.TypeApplications.{EtaExpansion, TypeParamInfo} -import core.TypeErasure.{erasedGlb, erasure, isGenericArrayElement} +import core.TypeErasure.{erasedGlb, erasure, fullErasure, isGenericArrayElement} import core.Types._ import core.classfile.ClassfileConstants import SymUtils._ import TypeUtils._ +import config.Printers.transforms +import reporting.trace import java.lang.StringBuilder import scala.collection.mutable.ListBuffer @@ -130,12 +133,12 @@ object GenericSignatures { else Right(parent)) - def paramSig(param: LambdaParam): Unit = { - builder.append(sanitizeName(param.paramName)) + def paramSig(param: TypeParamInfo): Unit = { + builder.append(sanitizeName(param.paramName.lastPart)) boundsSig(hiBounds(param.paramInfo.bounds)) } - def polyParamSig(tparams: List[LambdaParam]): Unit = + def polyParamSig(tparams: List[TypeParamInfo]): Unit = if (tparams.nonEmpty) { builder.append('<') tparams.foreach(paramSig) @@ -236,7 +239,11 @@ object GenericSignatures { tp match { case ref @ TypeParamRef(_: PolyType, _) => - typeParamSig(ref.paramName.lastPart) + val erasedUnderlying = fullErasure(ref.underlying.bounds.hi) + // don't emit type param name if the param is upper-bounded by a primitive type (including via a value class) + if erasedUnderlying.isPrimitiveValueType then + jsig(erasedUnderlying, toplevel, primitiveOK) + else typeParamSig(ref.paramName.lastPart) case defn.ArrayOf(elemtp) => if (isGenericArrayElement(elemtp, isScala2 = false)) @@ -267,7 +274,7 @@ object GenericSignatures { else if (sym == defn.UnitClass) jsig(defn.BoxedUnitClass.typeRef) else builder.append(defn.typeTag(sym.info)) else if (ValueClasses.isDerivedValueClass(sym)) { - val erasedUnderlying = core.TypeErasure.fullErasure(tp) + val erasedUnderlying = fullErasure(tp) if (erasedUnderlying.isPrimitiveValueType && !primitiveOK) classSig(sym, pre, args) else @@ -334,15 +341,6 @@ object GenericSignatures { jsig(repr, primitiveOK = primitiveOK) case ci: ClassInfo => - def polyParamSig(tparams: List[TypeParamInfo]): Unit = - if (tparams.nonEmpty) { - builder.append('<') - tparams.foreach { tp => -
builder.append(sanitizeName(tp.paramName.lastPart)) - boundsSig(hiBounds(tp.paramInfo.bounds)) - } - builder.append('>') - } val tParams = tp.typeParams if (toplevel) polyParamSig(tParams) superSig(ci.typeSymbol, ci.parents) diff --git a/compiler/src/dotty/tools/dotc/transform/HoistSuperArgs.scala b/compiler/src/dotty/tools/dotc/transform/HoistSuperArgs.scala index edbfbd1552c4..9a36d65babe8 100644 --- a/compiler/src/dotty/tools/dotc/transform/HoistSuperArgs.scala +++ b/compiler/src/dotty/tools/dotc/transform/HoistSuperArgs.scala @@ -13,6 +13,7 @@ import collection.mutable import ast.Trees._ import core.NameKinds.SuperArgName import SymUtils._ +import core.Decorators.* object HoistSuperArgs { val name: String = "hoistSuperArgs" @@ -181,7 +182,9 @@ class HoistSuperArgs extends MiniPhase with IdentityDenotTransformer { thisPhase /** Hoist complex arguments in super call out of the class. */ def hoistSuperArgsFromCall(superCall: Tree, cdef: DefDef, lifted: mutable.ListBuffer[Symbol]): Tree = superCall match - case Block(defs, expr) => + case Block(defs, expr) if !expr.symbol.owner.is(Scala2x) => + // MO: The guard avoids the crash for #16351. + // It would be good to dig deeper, but I won't have the time myself to do it. cpy.Block(superCall)( stats = defs.mapconserve { case vdef: ValDef => diff --git a/compiler/src/dotty/tools/dotc/transform/InlineVals.scala b/compiler/src/dotty/tools/dotc/transform/InlineVals.scala index 65212ec2c0cc..047a187bad68 100644 --- a/compiler/src/dotty/tools/dotc/transform/InlineVals.scala +++ b/compiler/src/dotty/tools/dotc/transform/InlineVals.scala @@ -38,8 +38,8 @@ class InlineVals extends MiniPhase: tpt.tpe.widenTermRefExpr.dealiasKeepOpaques.normalized match case tp: ConstantType => if !isPureExpr(rhs) then - val details = if enclosingInlineds.isEmpty then "" else em"but was: $rhs" - report.error(s"inline value must be pure$details", rhs.srcPos) + def details = if enclosingInlineds.isEmpty then "" else i"but was: $rhs" + report.error(em"inline value must be pure$details", rhs.srcPos) case tp => if tp.typeSymbol.is(Opaque) then report.error(em"The type of an `inline val` cannot be an opaque type.\n\nTo inline, consider using `inline def` instead", rhs) diff --git a/compiler/src/dotty/tools/dotc/transform/Inlining.scala b/compiler/src/dotty/tools/dotc/transform/Inlining.scala index 5ddcf600c63a..f0ed7026ee91 100644 --- a/compiler/src/dotty/tools/dotc/transform/Inlining.scala +++ b/compiler/src/dotty/tools/dotc/transform/Inlining.scala @@ -7,14 +7,18 @@ import Contexts._ import Symbols._ import SymUtils._ import dotty.tools.dotc.ast.tpd - +import dotty.tools.dotc.ast.Trees._ +import dotty.tools.dotc.quoted._ import dotty.tools.dotc.core.StagingContext._ import dotty.tools.dotc.inlines.Inlines import dotty.tools.dotc.ast.TreeMapWithImplicits +import dotty.tools.dotc.core.DenotTransformers.IdentityDenotTransformer +import scala.collection.mutable.ListBuffer /** Inlines all calls to inline methods that are not in an inline method or a quote */ class Inlining extends MacroTransform { + import tpd._ override def phaseName: String = Inlining.name @@ -23,8 +27,10 @@ class Inlining extends MacroTransform { override def allowsImplicitSearch: Boolean = true + override def changesMembers: Boolean = true + override def run(using Context): Unit = - if ctx.compilationUnit.needsInlining then + if ctx.compilationUnit.needsInlining || ctx.compilationUnit.hasMacroAnnotations then try super.run catch case _: CompilationUnit.SuspendException => () @@ -57,10 +63,33 @@ class Inlining 
extends MacroTransform { } private class InliningTreeMap extends TreeMapWithImplicits { + + /** List of top level classes added by macro annotation in a package object. + * These are added to the PackageDef that owns this particular package object. + */ + private val newTopClasses = MutableSymbolMap[ListBuffer[Tree]]() + override def transform(tree: Tree)(using Context): Tree = { tree match - case tree: DefTree => + case tree: MemberDef => if tree.symbol.is(Inline) then tree + else if tree.symbol.is(Param) then super.transform(tree) + else if + !tree.symbol.isPrimaryConstructor + && StagingContext.level == 0 + && MacroAnnotations.hasMacroAnnotation(tree.symbol) + then + val trees = (new MacroAnnotations).expandAnnotations(tree) + val trees1 = trees.map(super.transform) + + // Find classes added to the top level from a package object + val (topClasses, trees2) = + if ctx.owner.isPackageObject then trees1.partition(_.symbol.owner == ctx.owner.owner) + else (Nil, trees1) + if topClasses.nonEmpty then + newTopClasses.getOrElseUpdate(ctx.owner.owner, new ListBuffer) ++= topClasses + + flatTree(trees2) else super.transform(tree) case _: Typed | _: Block => super.transform(tree) @@ -72,6 +101,16 @@ class Inlining extends MacroTransform { super.transform(tree)(using StagingContext.quoteContext) case _: GenericApply if tree.symbol.isExprSplice => super.transform(tree)(using StagingContext.spliceContext) + case _: PackageDef => + super.transform(tree) match + case tree1: PackageDef => + newTopClasses.get(tree.symbol.moduleClass) match + case Some(topClasses) => + newTopClasses.remove(tree.symbol.moduleClass) + val newStats = tree1.stats ::: topClasses.result() + cpy.PackageDef(tree1)(tree1.pid, newStats) + case _ => tree1 + case tree1 => tree1 case _ => super.transform(tree) } diff --git a/compiler/src/dotty/tools/dotc/transform/InterceptedMethods.scala b/compiler/src/dotty/tools/dotc/transform/InterceptedMethods.scala index ad068b84c041..c95500d856be 100644 --- a/compiler/src/dotty/tools/dotc/transform/InterceptedMethods.scala +++ b/compiler/src/dotty/tools/dotc/transform/InterceptedMethods.scala @@ -65,7 +65,7 @@ class InterceptedMethods extends MiniPhase { override def transformApply(tree: Apply)(using Context): Tree = { lazy val qual = tree.fun match { case Select(qual, _) => qual - case ident @ Ident(_) => + case ident: Ident => ident.tpe match { case TermRef(prefix: TermRef, _) => tpd.ref(prefix) diff --git a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala index 3b37ef130231..e4cb21a279d6 100644 --- a/compiler/src/dotty/tools/dotc/transform/LazyVals.scala +++ b/compiler/src/dotty/tools/dotc/transform/LazyVals.scala @@ -112,7 +112,7 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { appendOffsetDefs.get(cls) match { case None => template case Some(data) => - data.defs.foreach(_.symbol.addAnnotation(Annotation(defn.ScalaStaticAnnot))) + data.defs.foreach(defin => defin.symbol.addAnnotation(Annotation(defn.ScalaStaticAnnot, defin.symbol.span))) cpy.Template(template)(body = addInFront(data.defs, template.body)) } } @@ -448,10 +448,10 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { def transformMemberDefThreadSafe(x: ValOrDefDef)(using Context): Thicket = { assert(!(x.symbol is Mutable)) - if ctx.settings.YlightweightLazyVals.value then - transformMemberDefThreadSafeNew(x) - else + if ctx.settings.YlegacyLazyVals.value then transformMemberDefThreadSafeLegacy(x) + else + 
transformMemberDefThreadSafeNew(x) } def transformMemberDefThreadSafeNew(x: ValOrDefDef)(using Context): Thicket = { @@ -464,15 +464,10 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { def offsetName(id: Int) = s"${StdNames.nme.LAZY_FIELD_OFFSET}${if (x.symbol.owner.is(Module)) "_m_" else ""}$id".toTermName val containerName = LazyLocalName.fresh(x.name.asTermName) val containerSymbol = newSymbol(claz, containerName, x.symbol.flags &~ containerFlagsMask | containerFlags | Private, defn.ObjectType, coord = x.symbol.coord).enteredAfter(this) - containerSymbol.addAnnotation(Annotation(defn.VolatileAnnot)) // private @volatile var _x: AnyRef + containerSymbol.addAnnotation(Annotation(defn.VolatileAnnot, containerSymbol.span)) // private @volatile var _x: AnyRef containerSymbol.addAnnotations(x.symbol.annotations) // pass annotations from original definition - val stat = x.symbol.isStatic - if stat then - containerSymbol.setFlag(JavaStatic) + containerSymbol.removeAnnotation(defn.ScalaStaticAnnot) val getOffset = - if stat then - Select(ref(defn.LazyValsModule), lazyNme.RLazyVals.getStaticFieldOffset) - else Select(ref(defn.LazyValsModule), lazyNme.RLazyVals.getOffsetStatic) val containerTree = ValDef(containerSymbol, nullLiteral) @@ -482,7 +477,7 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { newSymbol(claz, offsetName(info.defs.size), Synthetic, defn.LongType).enteredAfter(this) case None => newSymbol(claz, offsetName(0), Synthetic, defn.LongType).enteredAfter(this) - offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot)) + offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot, offsetSymbol.nn.span)) val fieldTree = thizClass.select(lazyNme.RLazyVals.getDeclaredField).appliedTo(Literal(Constant(containerName.mangledString))) val offsetTree = ValDef(offsetSymbol.nn, getOffset.appliedTo(fieldTree)) val offsetInfo = appendOffsetDefs.getOrElseUpdate(claz, new OffsetInfo(Nil)) @@ -490,9 +485,6 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { val offset = ref(offsetSymbol.nn) val swapOver = - if stat then - tpd.clsOf(x.symbol.owner.typeRef) - else This(claz) val (accessorDef, initMethodDef) = mkThreadSafeDef(x, claz, containerSymbol, offset, swapOver) @@ -625,7 +617,7 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { .symbol.asTerm else { // need to create a new flag offsetSymbol = newSymbol(claz, offsetById, Synthetic, defn.LongType).enteredAfter(this) - offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot)) + offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot, offsetSymbol.nn.span)) val flagName = LazyBitMapName.fresh(id.toString.toTermName) val flagSymbol = newSymbol(claz, flagName, containerFlags, defn.LongType).enteredAfter(this) flag = ValDef(flagSymbol, Literal(Constant(0L))) @@ -636,7 +628,7 @@ class LazyVals extends MiniPhase with IdentityDenotTransformer { case None => offsetSymbol = newSymbol(claz, offsetName(0), Synthetic, defn.LongType).enteredAfter(this) - offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot)) + offsetSymbol.nn.addAnnotation(Annotation(defn.ScalaStaticAnnot, offsetSymbol.nn.span)) val flagName = LazyBitMapName.fresh("0".toTermName) val flagSymbol = newSymbol(claz, flagName, containerFlags, defn.LongType).enteredAfter(this) flag = ValDef(flagSymbol, Literal(Constant(0L))) @@ -682,7 +674,6 @@ object LazyVals { val cas: TermName = N.cas.toTermName val getOffset: TermName = N.getOffset.toTermName val getOffsetStatic: TermName = 
"getOffsetStatic".toTermName - val getStaticFieldOffset: TermName = "getStaticFieldOffset".toTermName val getDeclaredField: TermName = "getDeclaredField".toTermName } val flag: TermName = "flag".toTermName diff --git a/compiler/src/dotty/tools/dotc/transform/MacroAnnotations.scala b/compiler/src/dotty/tools/dotc/transform/MacroAnnotations.scala new file mode 100644 index 000000000000..cc2e6118d1fa --- /dev/null +++ b/compiler/src/dotty/tools/dotc/transform/MacroAnnotations.scala @@ -0,0 +1,142 @@ +package dotty.tools.dotc +package transform + +import scala.language.unsafeNulls + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.Trees.* +import dotty.tools.dotc.config.Printers.{macroAnnot => debug} +import dotty.tools.dotc.core.Annotations.* +import dotty.tools.dotc.core.Contexts.* +import dotty.tools.dotc.core.Decorators.* +import dotty.tools.dotc.core.DenotTransformers.DenotTransformer +import dotty.tools.dotc.core.Flags.* +import dotty.tools.dotc.core.MacroClassLoader +import dotty.tools.dotc.core.Symbols.* +import dotty.tools.dotc.core.Types._ +import dotty.tools.dotc.quoted.* +import dotty.tools.dotc.util.SrcPos +import scala.quoted.runtime.impl.{QuotesImpl, SpliceScope} + +import scala.quoted.Quotes +import scala.util.control.NonFatal + +import java.lang.reflect.InvocationTargetException + +class MacroAnnotations: + import tpd.* + import MacroAnnotations.* + + /** Expands every macro annotation that is on this tree. + * Returns a list with transformed definition and any added definitions. + */ + def expandAnnotations(tree: MemberDef)(using Context): List[DefTree] = + if !hasMacroAnnotation(tree.symbol) then + List(tree) + else if tree.symbol.is(Module) && !tree.symbol.isClass then + // only class is transformed + List(tree) + else if tree.symbol.isType && !tree.symbol.isClass then + report.error("macro annotations are not supported on type", tree) + List(tree) + else + debug.println(i"Expanding macro annotations of:\n$tree") + + val macroInterpreter = new Interpreter(tree.srcPos, MacroClassLoader.fromContext) + + val allTrees = List.newBuilder[DefTree] + var insertedAfter: List[List[DefTree]] = Nil + + // Apply all macro annotation to `tree` and collect new definitions in order + val transformedTree: DefTree = tree.symbol.annotations.foldLeft(tree) { (tree, annot) => + if isMacroAnnotation(annot) then + debug.println(i"Expanding macro annotation: ${annot}") + + // Interpret call to `new myAnnot(..).transform(using )()` + val transformedTrees = + try callMacro(macroInterpreter, tree, annot) + catch + // TODO: Replace this case when scala.annaotaion.MacroAnnotation is no longer experimental and reflectiveSelectable is not used + // Replace this case with the nested cases. + case ex0: InvocationTargetException => + ex0.getCause match + case ex: scala.quoted.runtime.StopMacroExpansion => + if !ctx.reporter.hasErrors then + report.error("Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users when aborting a macro expansion with StopMacroExpansion.", annot.tree) + List(tree) + case Interpreter.MissingClassDefinedInCurrentRun(sym) => + Interpreter.suspendOnMissing(sym, annot.tree) + case NonFatal(ex) => + val stack0 = ex.getStackTrace.takeWhile(_.getClassName != "dotty.tools.dotc.transform.MacroAnnotations") + val stack = stack0.take(1 + stack0.lastIndexWhere(_.getMethodName == "transform")) + val msg = + em"""Failed to evaluate macro. 
+ | Caused by ${ex.getClass}: ${if (ex.getMessage == null) "" else ex.getMessage} + | ${stack.mkString("\n ")} + |""" + report.error(msg, annot.tree) + List(tree) + case _ => + throw ex0 + transformedTrees.span(_.symbol != tree.symbol) match + case (prefixed, newTree :: suffixed) => + allTrees ++= prefixed + insertedAfter = suffixed :: insertedAfter + prefixed.foreach(checkMacroDef(_, tree, annot)) + suffixed.foreach(checkMacroDef(_, tree, annot)) + transform.TreeChecker.checkMacroGeneratedTree(tree, newTree) + newTree + case (Nil, Nil) => + report.error(i"Unexpected `Nil` returned by `(${annot.tree}).transform(..)` during macro expansion", annot.tree.srcPos) + tree + case (_, Nil) => + report.error(i"Transformed tree for ${tree} was not returned by `(${annot.tree}).transform(..)` during macro expansion", annot.tree.srcPos) + tree + else + tree + } + + allTrees += transformedTree + insertedAfter.foreach(allTrees.++=) + + val result = allTrees.result() + debug.println(result.map(_.show).mkString("expanded to:\n", "\n", "")) + result + + /** Interpret the code `new annot(..).transform(using <quotes>)(<tree>)` */ + private def callMacro(interpreter: Interpreter, tree: MemberDef, annot: Annotation)(using Context): List[MemberDef] = + // TODO: Remove when scala.annotation.MacroAnnotation is no longer experimental + import scala.reflect.Selectable.reflectiveSelectable + type MacroAnnotation = { + def transform(using Quotes)(tree: Object/*Erased type of quotes.reflect.Definition*/): List[MemberDef /*quotes.reflect.Definition known to be MemberDef in QuotesImpl*/] + } + + // Interpret macro annotation instantiation `new myAnnot(..)` + val annotInstance = interpreter.interpret[MacroAnnotation](annot.tree).get + // TODO: Remove when scala.annotation.MacroAnnotation is no longer experimental + assert(annotInstance.getClass.getClassLoader.loadClass("scala.annotation.MacroAnnotation").isInstance(annotInstance)) + + val quotes = QuotesImpl()(using SpliceScope.contextWithNewSpliceScope(tree.symbol.sourcePos)(using MacroExpansion.context(tree)).withOwner(tree.symbol.owner)) + annotInstance.transform(using quotes)(tree.asInstanceOf[quotes.reflect.Definition]) + + /** Check that this tree can be added by the macro annotation */ + private def checkMacroDef(newTree: DefTree, annotatedTree: Tree, annot: Annotation)(using Context) = + transform.TreeChecker.checkMacroGeneratedTree(annotatedTree, newTree) + val sym = newTree.symbol + val annotated = annotatedTree.symbol + if sym.isType && !sym.isClass then + report.error(i"macro annotation cannot return a `type`. $annot tried to add $sym", annot.tree) + else if sym.owner != annotated.owner && !(annotated.owner.isPackageObject && (sym.isClass || sym.is(Module)) && sym.owner == annotated.owner.owner) then + report.error(i"macro annotation $annot added $sym with an inconsistent owner. Expected it to be owned by ${annotated.owner} but was owned by ${sym.owner}.", annot.tree) + else if annotated.isClass && annotated.owner.is(Package) /*&& !sym.isClass*/ then + report.error(i"macro annotation cannot add top-level ${sym.showKind}.
$annot tried to add $sym.", annot.tree) + +object MacroAnnotations: + + /** Is this an annotation that implements `scala.annotation.MacroAnnotation` */ + def isMacroAnnotation(annot: Annotation)(using Context): Boolean = + annot.tree.symbol.maybeOwner.derivesFrom(defn.MacroAnnotationClass) + + /** Is this symbol annotated with an annotation that implements `scala.annotation.MacroAnnotation` */ + def hasMacroAnnotation(sym: Symbol)(using Context): Boolean = + sym.getAnnotation(defn.MacroAnnotationClass).isDefined diff --git a/compiler/src/dotty/tools/dotc/transform/MacroTransform.scala b/compiler/src/dotty/tools/dotc/transform/MacroTransform.scala index 27ccd622bc65..7bb7ed365ebe 100644 --- a/compiler/src/dotty/tools/dotc/transform/MacroTransform.scala +++ b/compiler/src/dotty/tools/dotc/transform/MacroTransform.scala @@ -38,10 +38,10 @@ abstract class MacroTransform extends Phase { tree case _: PackageDef | _: MemberDef => super.transform(tree)(using localCtx(tree)) - case impl @ Template(constr, parents, self, _) => + case impl @ Template(constr, _, self, _) => cpy.Template(tree)( transformSub(constr), - transform(parents)(using ctx.superCallContext), + transform(impl.parents)(using ctx.superCallContext), Nil, transformSelf(self), transformStats(impl.body, tree.symbol)) diff --git a/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala b/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala index 9d241216bdaa..d4dd911241d3 100644 --- a/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala +++ b/compiler/src/dotty/tools/dotc/transform/MegaPhase.scala @@ -456,7 +456,7 @@ class MegaPhase(val miniPhases: Array[MiniPhase]) extends Phase { } def transformTrees(trees: List[Tree], start: Int)(using Context): List[Tree] = - trees.mapInline(transformTree(_, start)) + trees.flattenedMapConserve(transformTree(_, start)) def transformSpecificTrees[T <: Tree](trees: List[T], start: Int)(using Context): List[T] = transformTrees(trees, start).asInstanceOf[List[T]] diff --git a/compiler/src/dotty/tools/dotc/transform/Memoize.scala b/compiler/src/dotty/tools/dotc/transform/Memoize.scala index 6456066bfdb0..5a2eda4101a4 100644 --- a/compiler/src/dotty/tools/dotc/transform/Memoize.scala +++ b/compiler/src/dotty/tools/dotc/transform/Memoize.scala @@ -4,7 +4,7 @@ package transform import core._ import DenotTransformers._ import Contexts._ -import Phases.phaseOf +import Phases.* import SymDenotations.SymDenotation import Denotations._ import Symbols._ @@ -114,26 +114,10 @@ class Memoize extends MiniPhase with IdentityDenotTransformer { thisPhase => flags = Private | (if (sym.is(StableRealizable)) EmptyFlags else Mutable), info = fieldType, coord = tree.span - ).withAnnotationsCarrying(sym, defn.FieldMetaAnnot) + ).withAnnotationsCarrying(sym, defn.FieldMetaAnnot, orNoneOf = defn.MetaAnnots) .enteredAfter(thisPhase) } - def addAnnotations(denot: Denotation): Unit = - denot match { - case fieldDenot: SymDenotation if sym.annotations.nonEmpty => - val cpy = fieldDenot.copySymDenotation() - cpy.annotations = sym.annotations - cpy.installAfter(thisPhase) - case _ => () - } - - def removeUnwantedAnnotations(denot: SymDenotation, metaAnnotSym: ClassSymbol): Unit = - if (sym.annotations.nonEmpty) { - val cpy = sym.copySymDenotation() - cpy.filterAnnotations(_.symbol.hasAnnotation(metaAnnotSym)) - cpy.installAfter(thisPhase) - } - val NoFieldNeeded = Lazy | Deferred | JavaDefined | Inline def erasedBottomTree(sym: Symbol) = @@ -183,8 +167,7 @@ class Memoize extends MiniPhase with IdentityDenotTransformer { thisPhase
=> if isErasableBottomField(field, rhsClass) then erasedBottomTree(rhsClass) else transformFollowingDeep(ref(field))(using ctx.withOwner(sym)) val getterDef = cpy.DefDef(tree)(rhs = getterRhs) - addAnnotations(fieldDef.denot) - removeUnwantedAnnotations(sym, defn.GetterMetaAnnot) + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.GetterMetaAnnot)) Thicket(fieldDef, getterDef) else if sym.isSetter then if (!sym.is(ParamAccessor)) { val Literal(Constant(())) = tree.rhs: @unchecked } // This is intended as an assertion @@ -210,7 +193,7 @@ class Memoize extends MiniPhase with IdentityDenotTransformer { thisPhase => then Literal(Constant(())) else Assign(ref(field), adaptToField(field, ref(tree.termParamss.head.head.symbol))) val setterDef = cpy.DefDef(tree)(rhs = transformFollowingDeep(initializer)(using ctx.withOwner(sym))) - removeUnwantedAnnotations(sym, defn.SetterMetaAnnot) + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.SetterMetaAnnot)) setterDef else // Curiously, some accessors from Scala2 have ' ' suffixes. diff --git a/compiler/src/dotty/tools/dotc/transform/MoveStatics.scala b/compiler/src/dotty/tools/dotc/transform/MoveStatics.scala index 99702686edf8..db96aeefe231 100644 --- a/compiler/src/dotty/tools/dotc/transform/MoveStatics.scala +++ b/compiler/src/dotty/tools/dotc/transform/MoveStatics.scala @@ -46,7 +46,7 @@ class MoveStatics extends MiniPhase with SymTransformer { if (staticFields.nonEmpty) { /* do NOT put Flags.JavaStatic here. It breaks .enclosingClass */ val staticCostructor = newSymbol(orig.symbol, nme.STATIC_CONSTRUCTOR, Flags.Synthetic | Flags.Method | Flags.Private, MethodType(Nil, defn.UnitType)) - staticCostructor.addAnnotation(Annotation(defn.ScalaStaticAnnot)) + staticCostructor.addAnnotation(Annotation(defn.ScalaStaticAnnot, staticCostructor.span)) staticCostructor.entered val staticAssigns = staticFields.map(x => Assign(ref(x.symbol), x.rhs.changeOwner(x.symbol, staticCostructor))) diff --git a/compiler/src/dotty/tools/dotc/transform/NonLocalReturns.scala b/compiler/src/dotty/tools/dotc/transform/NonLocalReturns.scala index 7e1ae9e661f6..a75d6da9dd6a 100644 --- a/compiler/src/dotty/tools/dotc/transform/NonLocalReturns.scala +++ b/compiler/src/dotty/tools/dotc/transform/NonLocalReturns.scala @@ -6,6 +6,7 @@ import Contexts._, Symbols._, Types._, Flags._, StdNames._ import MegaPhase._ import NameKinds.NonLocalReturnKeyName import config.SourceVersion.* +import Decorators.em object NonLocalReturns { import ast.tpd._ @@ -96,7 +97,7 @@ class NonLocalReturns extends MiniPhase { override def transformReturn(tree: Return)(using Context): Tree = if isNonLocalReturn(tree) then report.gradualErrorOrMigrationWarning( - "Non local returns are no longer supported; use scala.util.control.NonLocalReturns instead", + em"Non local returns are no longer supported; use `boundary` and `boundary.break` in `scala.util` instead", tree.srcPos, warnFrom = `3.2`, errorFrom = future) diff --git a/compiler/src/dotty/tools/dotc/transform/OverridingPairs.scala b/compiler/src/dotty/tools/dotc/transform/OverridingPairs.scala index b27a75436d86..48dc7c818360 100644 --- a/compiler/src/dotty/tools/dotc/transform/OverridingPairs.scala +++ b/compiler/src/dotty/tools/dotc/transform/OverridingPairs.scala @@ -200,10 +200,13 @@ object OverridingPairs: /** Let `member` and `other` be members of some common class C with types * `memberTp` and `otherTp` in C. Are the two symbols considered an overriding * pair in C? We assume that names already match so we test only the types here. 
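   * Editorial example (sketch, not part of this change; `A`, `B` and `f` are hypothetical):
   * {{{
   *   trait A { def f: Object }
   *   class B extends A { def f: String = "" }
   * }}}
   * `(B.f, A.f)` is an overriding pair: the names match and `B.f`'s type overrides `A.f`'s
   * type (covariant result).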
- * @param fallBack A function called if the initial test is false and - * `member` and `other` are term symbols. + * @param fallBack A function called if the initial test is false and + * `member` and `other` are term symbols. + * @param isSubType A function to be used for checking subtype relationships + * between term fields. */ - def isOverridingPair(member: Symbol, memberTp: Type, other: Symbol, otherTp: Type, fallBack: => Boolean = false)(using Context): Boolean = + def isOverridingPair(member: Symbol, memberTp: Type, other: Symbol, otherTp: Type, fallBack: => Boolean = false, + isSubType: (Type, Type) => Context ?=> Boolean = (tp1, tp2) => tp1 frozen_<:< tp2)(using Context): Boolean = if member.isType then // intersection of bounds to refined types must be nonempty memberTp.bounds.hi.hasSameKindAs(otherTp.bounds.hi) && ( @@ -222,6 +225,6 @@ object OverridingPairs: val relaxedOverriding = ctx.explicitNulls && (member.is(JavaDefined) || other.is(JavaDefined)) member.name.is(DefaultGetterName) // default getters are not checked for compatibility || memberTp.overrides(otherTp, relaxedOverriding, - member.matchNullaryLoosely || other.matchNullaryLoosely || fallBack) + member.matchNullaryLoosely || other.matchNullaryLoosely || fallBack, isSubType = isSubType) end OverridingPairs diff --git a/compiler/src/dotty/tools/dotc/transform/PCPCheckAndHeal.scala b/compiler/src/dotty/tools/dotc/transform/PCPCheckAndHeal.scala index 263b0040eb24..1d0ed035df09 100644 --- a/compiler/src/dotty/tools/dotc/transform/PCPCheckAndHeal.scala +++ b/compiler/src/dotty/tools/dotc/transform/PCPCheckAndHeal.scala @@ -246,13 +246,14 @@ class PCPCheckAndHeal(@constructorOnly ictx: Context) extends TreeMapWithStages( checkStable(tp, pos, "type witness") getQuoteTypeTags.getTagRef(tp) case _: SearchFailureType => - report.error(i"""Reference to $tp within quotes requires a given $reqType in scope. - |${ctx.typer.missingArgMsg(tag, reqType, "")} - | - |""", pos) + report.error( + ctx.typer.missingArgMsg(tag, reqType, "") + .prepend(i"Reference to $tp within quotes requires a given $reqType in scope.\n") + .append("\n"), + pos) tp case _ => - report.error(i"""Reference to $tp within quotes requires a given $reqType in scope. + report.error(em"""Reference to $tp within quotes requires a given $reqType in scope. | |""", pos) tp @@ -263,10 +264,16 @@ class PCPCheckAndHeal(@constructorOnly ictx: Context) extends TreeMapWithStages( if (!tp.isInstanceOf[ThisType]) sym.show else if (sym.is(ModuleClass)) sym.sourceModule.show else i"${sym.name}.this" + val hint = + if sym.is(Inline) && levelOf(sym) < level then + "\n\n" + + "Hint: Staged references to an inline definition in quotes are only inlined after the quote is spliced into level 0 code by a macro. " + + "Try moving this inline definition to a statically accessible location such as an object (this definition can be private)."
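          // Editorial sketch of the situation the hint above targets (assumed example, not part
          // of this change):
          //   def impl(using Quotes): Expr[Int] =
          //     inline def twice(i: Int): Int = i * 2   // level-0 local inline definition
          //     '{ twice(3) }                           // referenced at level 1 => this error
          // Moving `twice` into a (possibly private) object makes it statically accessible.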
+ else "" report.error( em"""access to $symStr from wrong staging level: | - the definition is at level ${levelOf(sym)}, - | - but the access is at level $level.""", pos) + | - but the access is at level $level.$hint""", pos) tp } @@ -296,7 +303,7 @@ object PCPCheckAndHeal { flags = Synthetic, info = TypeAlias(splicedTree.tpe.select(tpnme.Underlying)), coord = span).asType - local.addAnnotation(Annotation(defn.QuotedRuntime_SplicedTypeAnnot)) + local.addAnnotation(Annotation(defn.QuotedRuntime_SplicedTypeAnnot, span)) ctx.typeAssigner.assignType(untpd.TypeDef(local.name, alias), local) } diff --git a/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala b/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala index 70fa0e5cc513..63ffdffbddef 100644 --- a/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala +++ b/compiler/src/dotty/tools/dotc/transform/PatternMatcher.scala @@ -664,12 +664,12 @@ object PatternMatcher { val refCount = varRefCount(plan) val LetPlan(topSym, _) = plan: @unchecked - def toDrop(sym: Symbol) = initializer.get(sym) match { - case Some(rhs) => + def toDrop(sym: Symbol) = + val rhs = initializer.lookup(sym) + if rhs != null then isPatmatGenerated(sym) && refCount(sym) <= 1 && sym != topSym && isPureExpr(rhs) - case none => + else false - } object Inliner extends PlanTransform { override val treeMap = new TreeMap { diff --git a/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala b/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala index f3ae6a377aab..21fc27cec0dd 100644 --- a/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala +++ b/compiler/src/dotty/tools/dotc/transform/PickleQuotes.scala @@ -113,12 +113,10 @@ class PickleQuotes extends MacroTransform { case _ => val (contents, tptWithHoles) = makeHoles(tpt) PickleQuotes(quotes, tptWithHoles, contents, tpt.tpe, true) - case tree: DefDef if tree.symbol.is(Macro) => + case tree: DefDef if !tree.rhs.isEmpty && tree.symbol.isInlineMethod => // Shrink size of the tree. The methods have already been inlined. // TODO move to FirstTransform to trigger even without quotes cpy.DefDef(tree)(rhs = defaultValue(tree.rhs.tpe)) - case _: DefDef if tree.symbol.isInlineMethod => - tree case _ => super.transform(tree) } diff --git a/compiler/src/dotty/tools/dotc/transform/Pickler.scala b/compiler/src/dotty/tools/dotc/transform/Pickler.scala index 4d9b42a36fe7..f5fe34bafc2f 100644 --- a/compiler/src/dotty/tools/dotc/transform/Pickler.scala +++ b/compiler/src/dotty/tools/dotc/transform/Pickler.scala @@ -1,4 +1,5 @@ -package dotty.tools.dotc +package dotty.tools +package dotc package transform import core._ @@ -11,10 +12,10 @@ import Periods._ import Phases._ import Symbols._ import Flags.Module -import reporting.{ThrowingReporter, Profile} +import reporting.{ThrowingReporter, Profile, Message} import collection.mutable -import scala.concurrent.{Future, Await, ExecutionContext} -import scala.concurrent.duration.Duration +import util.concurrent.{Executor, Future} +import compiletime.uninitialized object Pickler { val name: String = "pickler" @@ -47,7 +48,7 @@ class Pickler extends Phase { // Maps that keep a record if -Ytest-pickler is set. 
private val beforePickling = new mutable.HashMap[ClassSymbol, String] - private val picklers = new mutable.HashMap[ClassSymbol, TastyPickler] + private val pickledBytes = new mutable.HashMap[ClassSymbol, Array[Byte]] /** Drop any elements of this list that are linked module classes of other elements in the list */ private def dropCompanionModuleClasses(clss: List[ClassSymbol])(using Context): List[ClassSymbol] = { @@ -56,6 +57,24 @@ class Pickler extends Phase { clss.filterNot(companionModuleClasses.contains) } + /** Runs given functions with a scratch data block in a serialized fashion (i.e. + * inside a synchronized block). Scratch data is re-used between calls. + * Used to conserve on memory usage by avoiding to create scratch data for each + * pickled unit. + */ + object serialized: + val scratch = new ScratchData + def run(body: ScratchData => Array[Byte]): Array[Byte] = + synchronized { + scratch.reset() + body(scratch) + } + + private val executor = Executor[Array[Byte]]() + + private def useExecutor(using Context) = + Pickler.ParallelPickling && !ctx.settings.YtestPickler.value + override def run(using Context): Unit = { val unit = ctx.compilationUnit pickling.println(i"unpickling in run ${ctx.runId}") @@ -64,25 +83,30 @@ class Pickler extends Phase { cls <- dropCompanionModuleClasses(topLevelClasses(unit.tpdTree)) tree <- sliceTopLevel(unit.tpdTree, cls) do + if ctx.settings.YtestPickler.value then beforePickling(cls) = tree.show + val pickler = new TastyPickler(cls) - if ctx.settings.YtestPickler.value then - beforePickling(cls) = tree.show - picklers(cls) = pickler val treePkl = new TreePickler(pickler) treePkl.pickle(tree :: Nil) Profile.current.recordTasty(treePkl.buf.length) - val positionWarnings = new mutable.ListBuffer[String]() - val pickledF = inContext(ctx.fresh) { - Future { - treePkl.compactify() + + val positionWarnings = new mutable.ListBuffer[Message]() + def reportPositionWarnings() = positionWarnings.foreach(report.warning(_)) + + def computePickled(): Array[Byte] = inContext(ctx.fresh) { + serialized.run { scratch => + treePkl.compactify(scratch) if tree.span.exists then val reference = ctx.settings.sourceroot.value - new PositionPickler(pickler, treePkl.buf.addrOfTree, treePkl.treeAnnots, reference) - .picklePositions(unit.source, tree :: Nil, positionWarnings) + PositionPickler.picklePositions( + pickler, treePkl.buf.addrOfTree, treePkl.treeAnnots, reference, + unit.source, tree :: Nil, positionWarnings, + scratch.positionBuffer, scratch.pickledIndices) if !ctx.settings.YdropComments.value then - new CommentPickler(pickler, treePkl.buf.addrOfTree, treePkl.docString) - .pickleComment(tree) + CommentPickler.pickleComments( + pickler, treePkl.buf.addrOfTree, treePkl.docString, tree, + scratch.commentBuffer) val pickled = pickler.assembleParts() @@ -93,26 +117,40 @@ class Pickler extends Phase { // println(i"rawBytes = \n$rawBytes%\n%") // DEBUG if pickling ne noPrinter then - pickling.synchronized { - println(i"**** pickled info of $cls") - println(TastyPrinter.showContents(pickled, ctx.settings.color.value == "never")) - } + println(i"**** pickled info of $cls") + println(TastyPrinter.showContents(pickled, ctx.settings.color.value == "never")) pickled - }(using ExecutionContext.global) + } } - def force(): Array[Byte] = - val result = Await.result(pickledF, Duration.Inf) - positionWarnings.foreach(report.warning(_)) - result - - if !Pickler.ParallelPickling || ctx.settings.YtestPickler.value then force() - unit.pickled += (cls -> force) + /** A function that 
returns the pickled bytes. Depending on `Pickler.ParallelPickling` + * either computes the pickled data in a future or eagerly before constructing the + * function value. + */ + val demandPickled: () => Array[Byte] = + if useExecutor then + val futurePickled = executor.schedule(computePickled) + () => + try futurePickled.force.get + finally reportPositionWarnings() + else + val pickled = computePickled() + reportPositionWarnings() + if ctx.settings.YtestPickler.value then pickledBytes(cls) = pickled + () => pickled + + unit.pickled += (cls -> demandPickled) end for } override def runOn(units: List[CompilationUnit])(using Context): List[CompilationUnit] = { - val result = super.runOn(units) + val result = + if useExecutor then + executor.start() + try super.runOn(units) + finally executor.close() + else + super.runOn(units) if ctx.settings.YtestPickler.value then val ctx2 = ctx.fresh.setSetting(ctx.settings.YreadComments, true) testUnpickler( @@ -128,8 +166,8 @@ class Pickler extends Phase { pickling.println(i"testing unpickler at run ${ctx.runId}") ctx.initialize() val unpicklers = - for ((cls, pickler) <- picklers) yield { - val unpickler = new DottyUnpickler(pickler.assembleParts()) + for ((cls, bytes) <- pickledBytes) yield { + val unpickler = new DottyUnpickler(bytes) unpickler.enter(roots = Set.empty) cls -> unpickler } @@ -147,8 +185,9 @@ class Pickler extends Phase { if unequal then output("before-pickling.txt", previous) output("after-pickling.txt", unpickled) - report.error(s"""pickling difference for $cls in ${cls.source}, for details: - | - | diff before-pickling.txt after-pickling.txt""".stripMargin) + //sys.process.Process("diff -u before-pickling.txt after-pickling.txt").! + report.error(em"""pickling difference for $cls in ${cls.source}, for details: + | + | diff before-pickling.txt after-pickling.txt""") end testSame } diff --git a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala index 0424b48751bc..2039a8f19558 100644 --- a/compiler/src/dotty/tools/dotc/transform/PostTyper.scala +++ b/compiler/src/dotty/tools/dotc/transform/PostTyper.scala @@ -1,4 +1,5 @@ -package dotty.tools.dotc +package dotty.tools +package dotc package transform import dotty.tools.dotc.ast.{Trees, tpd, untpd, desugar} @@ -156,12 +157,14 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase checkInferredWellFormed(tree.tpt) if sym.is(Method) then if sym.isSetter then - removeUnwantedAnnotations(sym, defn.SetterMetaAnnot, NoSymbol, keepIfNoRelevantAnnot = false) + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.SetterMetaAnnot)) else if sym.is(Param) then - removeUnwantedAnnotations(sym, defn.ParamMetaAnnot, NoSymbol, keepIfNoRelevantAnnot = true) + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.ParamMetaAnnot), orNoneOf = defn.NonBeanMetaAnnots) + else if sym.is(ParamAccessor) then + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.GetterMetaAnnot, defn.FieldMetaAnnot)) else - removeUnwantedAnnotations(sym, defn.GetterMetaAnnot, defn.FieldMetaAnnot, keepIfNoRelevantAnnot = !sym.is(ParamAccessor)) + sym.copyAndKeepAnnotationsCarrying(thisPhase, Set(defn.GetterMetaAnnot, defn.FieldMetaAnnot), orNoneOf = defn.NonBeanMetaAnnots) if sym.isScala2Macro && !ctx.settings.XignoreScala2Macros.value then if !sym.owner.unforcedDecls.exists(p => !p.isScala2Macro && p.name == sym.name && p.signature == sym.signature) // Allow scala.reflect.materializeClassTag to be able to compile 
scala/reflect/package.scala @@ -183,17 +186,6 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase => Checking.checkAppliedTypesIn(tree) case _ => - private def removeUnwantedAnnotations(sym: Symbol, metaAnnotSym: Symbol, - metaAnnotSymBackup: Symbol, keepIfNoRelevantAnnot: Boolean)(using Context): Unit = - def shouldKeep(annot: Annotation): Boolean = - val annotSym = annot.symbol - annotSym.hasAnnotation(metaAnnotSym) - || annotSym.hasAnnotation(metaAnnotSymBackup) - || (keepIfNoRelevantAnnot && { - !annotSym.annotations.exists(metaAnnot => defn.FieldAccessorMetaAnnots.contains(metaAnnot.symbol)) - }) - if sym.annotations.nonEmpty then - sym.filterAnnotations(shouldKeep(_)) private def transformSelect(tree: Select, targs: List[Tree])(using Context): Tree = { val qual = tree.qualifier @@ -269,7 +261,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase def checkNotPackage(tree: Tree)(using Context): Tree = if !tree.symbol.is(Package) then tree - else errorTree(tree, i"${tree.symbol} cannot be used as a type") + else errorTree(tree, em"${tree.symbol} cannot be used as a type") override def transform(tree: Tree)(using Context): Tree = try tree match { @@ -277,7 +269,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase case CaseDef(pat, _, _) => val gadtCtx = pat.removeAttachment(typer.Typer.InferredGadtConstraints) match - case Some(gadt) => ctx.fresh.setGadt(gadt) + case Some(gadt) => ctx.fresh.setGadtState(GadtState(gadt)) case None => ctx super.transform(tree)(using gadtCtx) @@ -302,12 +294,14 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase checkNoConstructorProxy(tree) transformSelect(tree, Nil) case tree: Apply => - val methType = tree.fun.tpe.widen + val methType = tree.fun.tpe.widen.asInstanceOf[MethodType] val app = if (methType.isErasedMethod) tpd.cpy.Apply(tree)( tree.fun, tree.args.mapConserve(arg => + if methType.isResultDependent then + Checking.checkRealizable(arg.tpe, arg.srcPos, "erased argument") if (methType.isImplicitMethod && arg.span.isSynthetic) arg match case _: RefTree | _: Apply | _: TypeApply if arg.symbol.is(Erased) => @@ -331,7 +325,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase // Check the constructor type as well; it could be an illegal singleton type // which would not be reflected as `tree.tpe` ctx.typer.checkClassType(nu.tpe, tree.srcPos, traitReq = false, stablePrefixReq = false) - Checking.checkInstantiable(tree.tpe, nu.srcPos) + Checking.checkInstantiable(tree.tpe, nu.tpe, nu.srcPos) withNoCheckNews(nu :: Nil)(app1) case _ => app1 @@ -360,6 +354,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase } case Inlined(call, bindings, expansion) if !call.isEmpty => val pos = call.sourcePos + CrossVersionChecks.checkExperimentalRef(call.symbol, pos) val callTrace = Inlines.inlineCallTrace(call.symbol, pos)(using ctx.withSource(pos.source)) cpy.Inlined(tree)(callTrace, transformSub(bindings), transform(expansion)(using inlineContext(call))) case templ: Template => @@ -372,33 +367,45 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase ) } case tree: ValDef => + registerIfHasMacroAnnotations(tree) checkErasedDef(tree) val tree1 = cpy.ValDef(tree)(rhs = normalizeErasedRhs(tree.rhs, tree.symbol)) if tree1.removeAttachment(desugar.UntupledParam).isDefined then checkStableSelection(tree.rhs) processValOrDefDef(super.transform(tree1)) case 
tree: DefDef => + registerIfHasMacroAnnotations(tree) checkErasedDef(tree) annotateContextResults(tree) val tree1 = cpy.DefDef(tree)(rhs = normalizeErasedRhs(tree.rhs, tree.symbol)) processValOrDefDef(superAcc.wrapDefDef(tree1)(super.transform(tree1).asInstanceOf[DefDef])) case tree: TypeDef => + registerIfHasMacroAnnotations(tree) val sym = tree.symbol if (sym.isClass) VarianceChecker.check(tree) annotateExperimental(sym) + checkMacroAnnotation(sym) tree.rhs match case impl: Template => for parent <- impl.parents do Checking.checkTraitInheritance(parent.tpe.classSymbol, sym.asClass, parent.srcPos) + // Constructor parameters are in scope when typing a parent. + // While they can safely appear in a parent tree, to preserve + // soundness we need to ensure they don't appear in a parent + // type (#16270). + val illegalRefs = parent.tpe.namedPartsWith(p => p.symbol.is(ParamAccessor) && (p.symbol.owner eq sym)) + if illegalRefs.nonEmpty then + report.error( + em"The type of a class parent cannot refer to constructor parameters, but ${parent.tpe} refers to ${illegalRefs.map(_.name.show).mkString(",")}", parent.srcPos) // Add SourceFile annotation to top-level classes if sym.owner.is(Package) then if ctx.compilationUnit.source.exists && sym != defn.SourceFileAnnot then val reference = ctx.settings.sourceroot.value val relativePath = util.SourceFile.relativePath(ctx.compilationUnit.source, reference) - sym.addAnnotation(Annotation.makeSourceFile(relativePath)) + sym.addAnnotation(Annotation.makeSourceFile(relativePath, tree.span)) if Feature.pureFunsEnabled && sym != defn.WithPureFunsAnnot then - sym.addAnnotation(Annotation(defn.WithPureFunsAnnot)) + sym.addAnnotation(Annotation(defn.WithPureFunsAnnot, tree.span)) else if !sym.is(Param) && !sym.owner.isOneOf(AbstractOrTrait) then Checking.checkGoodBounds(tree.symbol) @@ -414,7 +421,7 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase Checking.checkGoodBounds(tree.symbol) super.transform(tree) case tree: New if isCheckable(tree) => - Checking.checkInstantiable(tree.tpe, tree.srcPos) + Checking.checkInstantiable(tree.tpe, tree.tpe, tree.srcPos) super.transform(tree) case tree: Closure if !tree.tpt.isEmpty => Checking.checkRealizable(tree.tpt.tpe, tree.srcPos, "SAM type") @@ -434,6 +441,13 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase case SingletonTypeTree(ref) => Checking.checkRealizable(ref.tpe, ref.srcPos) super.transform(tree) + case tree: TypeBoundsTree => + val TypeBoundsTree(lo, hi, alias) = tree + if !alias.isEmpty then + val bounds = TypeBounds(lo.tpe, hi.tpe) + if !bounds.contains(alias.tpe) then + report.error(em"type ${alias.tpe} outside bounds $bounds", tree.srcPos) + super.transform(tree) case tree: TypeTree => tree.withType( tree.tpe match { @@ -480,6 +494,16 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase private def normalizeErasedRhs(rhs: Tree, sym: Symbol)(using Context) = if (sym.isEffectivelyErased) dropInlines.transform(rhs) else rhs + /** Check if the definition has macro annotation and sets `compilationUnit.hasMacroAnnotations` if needed. 
*/ + private def registerIfHasMacroAnnotations(tree: DefTree)(using Context) = + if !Inlines.inInlineMethod && MacroAnnotations.hasMacroAnnotation(tree.symbol) then + ctx.compilationUnit.hasMacroAnnotations = true + + /** Check macro annotations implementations */ + private def checkMacroAnnotation(sym: Symbol)(using Context) = + if sym.derivesFrom(defn.MacroAnnotationClass) && !sym.isStatic then + report.error("classes that extend MacroAnnotation must not be inner/local classes", sym.srcPos) + private def checkErasedDef(tree: ValOrDefDef)(using Context): Unit = if tree.symbol.is(Erased, butNot = Macro) then val tpe = tree.rhs.tpe @@ -490,8 +514,8 @@ class PostTyper extends MacroTransform with IdentityDenotTransformer { thisPhase private def annotateExperimental(sym: Symbol)(using Context): Unit = if sym.is(Module) && sym.companionClass.hasAnnotation(defn.ExperimentalAnnot) then - sym.addAnnotation(defn.ExperimentalAnnot) - sym.companionModule.addAnnotation(defn.ExperimentalAnnot) + sym.addAnnotation(Annotation(defn.ExperimentalAnnot, sym.span)) + sym.companionModule.addAnnotation(Annotation(defn.ExperimentalAnnot, sym.span)) } } diff --git a/compiler/src/dotty/tools/dotc/transform/ProtectedAccessors.scala b/compiler/src/dotty/tools/dotc/transform/ProtectedAccessors.scala index 98e835293303..6d8f7bdb32cb 100644 --- a/compiler/src/dotty/tools/dotc/transform/ProtectedAccessors.scala +++ b/compiler/src/dotty/tools/dotc/transform/ProtectedAccessors.scala @@ -70,7 +70,7 @@ class ProtectedAccessors extends MiniPhase { override def ifNoHost(reference: RefTree)(using Context): Tree = { val curCls = ctx.owner.enclosingClass transforms.println(i"${curCls.ownersIterator.toList}%, %") - report.error(i"illegal access to protected ${reference.symbol.showLocated} from $curCls", + report.error(em"illegal access to protected ${reference.symbol.showLocated} from $curCls", reference.srcPos) reference } diff --git a/compiler/src/dotty/tools/dotc/transform/Recheck.scala b/compiler/src/dotty/tools/dotc/transform/Recheck.scala index 6d783854ae35..c524bbb7702f 100644 --- a/compiler/src/dotty/tools/dotc/transform/Recheck.scala +++ b/compiler/src/dotty/tools/dotc/transform/Recheck.scala @@ -22,6 +22,7 @@ import StdNames.nme import reporting.trace import annotation.constructorOnly import cc.CaptureSet.IdempotentCaptRefMap +import dotty.tools.dotc.core.Denotations.SingleDenotation object Recheck: import tpd.* @@ -71,7 +72,7 @@ object Recheck: val symd = sym.denot symd.validFor.firstPhaseId == phase.id + 1 && (sym.originDenotation ne symd) - extension (tree: Tree) + extension [T <: Tree](tree: T) /** Remember `tpe` as the type of `tree`, which might be different from the * type stored in the tree itself, unless a type was already remembered for `tree`. @@ -86,11 +87,27 @@ object Recheck: if tpe ne tree.tpe then tree.putAttachment(RecheckedType, tpe) /** The remembered type of the tree, or if none was installed, the original type */ - def knownType = + def knownType: Type = tree.attachmentOrElse(RecheckedType, tree.tpe) def hasRememberedType: Boolean = tree.hasAttachment(RecheckedType) + def withKnownType(using Context): T = tree.getAttachment(RecheckedType) match + case Some(tpe) => tree.withType(tpe).asInstanceOf[T] + case None => tree + + extension (tpe: Type) + + /** Map ExprType => T to () ?=> T (and analogously for pure versions). + * Even though this phase runs after ElimByName, ExprTypes can still occur + * as by-name arguments of applied types. See note in doc comment for + * ElimByName phase. 
Test case is bynamefun.scala. + */ + def mapExprType(using Context): Type = tpe match + case ExprType(rt) => defn.ByNameFunction(rt) + case _ => tpe + + /** A base class that runs a simplified typer pass over an already re-typed program. The pass * does not transform trees but returns instead the re-typed type of each tree as it is * traversed. The Recheck phase must be directly preceded by a phase of type PreRecheck. @@ -116,7 +133,9 @@ abstract class Recheck extends Phase, SymTransformer: else sym def run(using Context): Unit = - newRechecker().checkUnit(ctx.compilationUnit) + val rechecker = newRechecker() + rechecker.checkUnit(ctx.compilationUnit) + rechecker.reset() def newRechecker()(using Context): Rechecker @@ -136,6 +155,12 @@ abstract class Recheck extends Phase, SymTransformer: */ def keepType(tree: Tree): Boolean = keepAllTypes + private val prevSelDenots = util.HashMap[NamedType, Denotation]() + + def reset()(using Context): Unit = + for (ref, mbr) <- prevSelDenots.iterator do + ref.withDenot(mbr) + /** Constant-folded rechecked type `tp` of tree `tree` */ protected def constFold(tree: Tree, tp: Type)(using Context): Type = val tree1 = tree.withType(tp) @@ -147,18 +172,42 @@ abstract class Recheck extends Phase, SymTransformer: def recheckSelect(tree: Select, pt: Type)(using Context): Type = val Select(qual, name) = tree - recheckSelection(tree, recheck(qual, AnySelectionProto).widenIfUnstable, name, pt) + val proto = + if tree.symbol == defn.Any_asInstanceOf then WildcardType + else AnySelectionProto + recheckSelection(tree, recheck(qual, proto).widenIfUnstable, name, pt) + + /** When we select the `apply` of a function with type such as `(=> A) => B`, + * we need to convert the parameter type `=> A` to `() ?=> A`. See doc comment + * of `mapExprType`. + */ + def normalizeByName(mbr: SingleDenotation)(using Context): SingleDenotation = mbr.info match + case mt: MethodType if mt.paramInfos.exists(_.isInstanceOf[ExprType]) => + mbr.derivedSingleDenotation(mbr.symbol, + mt.derivedLambdaType(paramInfos = mt.paramInfos.map(_.mapExprType))) + case _ => + mbr def recheckSelection(tree: Select, qualType: Type, name: Name, sharpen: Denotation => Denotation)(using Context): Type = if name.is(OuterSelectName) then tree.tpe else //val pre = ta.maybeSkolemizePrefix(qualType, name) - val mbr = sharpen( + val mbr = normalizeByName( + sharpen( qualType.findMember(name, qualType, excluded = if tree.symbol.is(Private) then EmptyFlags else Private - )).suchThat(tree.symbol == _) - constFold(tree, qualType.select(name, mbr)) + )).suchThat(tree.symbol == _)) + val newType = tree.tpe match + case prevType: NamedType => + val prevDenot = prevType.denot + val newType = qualType.select(name, mbr) + if (newType eq prevType) && (mbr.info ne prevDenot.info) && !prevSelDenots.contains(prevType) then + prevSelDenots(prevType) = prevDenot + newType + case _ => + qualType.select(name, mbr) + constFold(tree, newType) //.showing(i"recheck select $qualType . 
$name : ${mbr.info} = $result") @@ -212,7 +261,10 @@ abstract class Recheck extends Phase, SymTransformer: mt.instantiate(argTypes) def recheckApply(tree: Apply, pt: Type)(using Context): Type = - recheck(tree.fun).widen match + val funTp = recheck(tree.fun) + // reuse the tree's type on signature polymorphic methods, instead of using the (wrong) rechecked one + val funtpe = if tree.fun.symbol.originalSignaturePolymorphic.exists then tree.fun.tpe else funTp + funtpe.widen match case fntpe: MethodType => assert(fntpe.paramInfos.hasSameLengthAs(tree.args)) val formals = @@ -220,7 +272,7 @@ abstract class Recheck extends Phase, SymTransformer: else fntpe.paramInfos def recheckArgs(args: List[Tree], formals: List[Type], prefs: List[ParamRef]): List[Type] = args match case arg :: args1 => - val argType = recheck(arg, formals.head) + val argType = recheck(arg, formals.head.mapExprType) val formals1 = if fntpe.isParamDependent then formals.tail.map(_.substParam(prefs.head, argType)) @@ -232,6 +284,8 @@ abstract class Recheck extends Phase, SymTransformer: val argTypes = recheckArgs(tree.args, formals, fntpe.paramRefs) constFold(tree, instantiate(fntpe, argTypes, tree.fun.symbol)) //.showing(i"typed app $tree : $fntpe with ${tree.args}%, % : $argTypes%, % = $result") + case tp => + assert(false, i"unexpected type of ${tree.fun}: $funtpe") def recheckTypeApply(tree: TypeApply, pt: Type)(using Context): Type = recheck(tree.fun).widen match @@ -262,7 +316,7 @@ abstract class Recheck extends Phase, SymTransformer: recheckBlock(tree.stats, tree.expr, pt) def recheckInlined(tree: Inlined, pt: Type)(using Context): Type = - recheckBlock(tree.bindings, tree.expansion, pt) + recheckBlock(tree.bindings, tree.expansion, pt)(using inlineContext(tree.call)) def recheckIf(tree: If, pt: Type)(using Context): Type = recheck(tree.cond, defn.BooleanType) @@ -297,7 +351,20 @@ abstract class Recheck extends Phase, SymTransformer: val rawType = recheck(tree.expr) val ownType = avoidMap(rawType) - checkConforms(ownType, tree.from.symbol.returnProto, tree) + + // The pattern matching translation, which runs before this phase + // sometimes instantiates return types with singleton type alternatives + // but the returned expression is widened. We compensate by widening the expected + // type as well. See also `widenSkolems` in `checkConformsExpr` which fixes + // a more general problem. It turns out that pattern matching returns + // are not checked by Ycheck, that's why these problems were allowed to slip + // through. + def widened(tp: Type): Type = tp match + case tp: SingletonType => tp.widen + case tp: AndOrType => tp.derivedAndOrType(widened(tp.tp1), widened(tp.tp2)) + case tp @ AnnotatedType(tp1, ann) => tp.derivedAnnotatedType(widened(tp1), ann) + case _ => tp + checkConforms(ownType, widened(tree.from.symbol.returnProto), tree) defn.NothingType end recheckReturn @@ -423,6 +490,27 @@ abstract class Recheck extends Phase, SymTransformer: throw ex } + /** Typing and previous transforms sometiems leaves skolem types in prefixes of + * NamedTypes in `expected` that do not match the `actual` Type. -Ycheck does + * not complain (need to find out why), but a full recheck does. We compensate + * by de-skolemizing everywhere in `expected` except when variance is negative. + * @return If `tp` contains SkolemTypes in covariant or invariant positions, + * the type where these SkolemTypes are mapped to their underlying type. 
+ * Otherwise, `tp` itself + */ + def widenSkolems(tp: Type)(using Context): Type = + object widenSkolems extends TypeMap, IdempotentCaptRefMap: + var didWiden: Boolean = false + def apply(t: Type): Type = t match + case t: SkolemType if variance >= 0 => + didWiden = true + apply(t.underlying) + case t: LazyRef => t + case t @ AnnotatedType(t1, ann) => t.derivedAnnotatedType(apply(t1), ann) + case _ => mapOver(t) + val tp1 = widenSkolems(tp) + if widenSkolems.didWiden then tp1 else tp + /** If true, print info for some successful checkConforms operations (failing ones give * an error message in any case). */ @@ -438,11 +526,16 @@ abstract class Recheck extends Phase, SymTransformer: def checkConformsExpr(actual: Type, expected: Type, tree: Tree)(using Context): Unit = //println(i"check conforms $actual <:< $expected") - val isCompatible = + + def isCompatible(expected: Type): Boolean = actual <:< expected || expected.isRepeatedParam - && actual <:< expected.translateFromRepeated(toArray = tree.tpe.isRef(defn.ArrayClass)) - if !isCompatible then + && isCompatible(expected.translateFromRepeated(toArray = tree.tpe.isRef(defn.ArrayClass))) + || { + val widened = widenSkolems(expected) + (widened ne expected) && isCompatible(widened) + } + if !isCompatible(expected) then recheckr.println(i"conforms failed for ${tree}: $actual vs $expected") err.typeMismatch(tree.withType(actual), expected) else if debugSuccesses then @@ -450,6 +543,7 @@ abstract class Recheck extends Phase, SymTransformer: case _: Ident => println(i"SUCCESS $tree:\n${TypeComparer.explained(_.isSubType(actual, expected))}") case _ => + end checkConformsExpr def checkUnit(unit: CompilationUnit)(using Context): Unit = recheck(unit.tpdTree) diff --git a/compiler/src/dotty/tools/dotc/transform/RepeatableAnnotations.scala b/compiler/src/dotty/tools/dotc/transform/RepeatableAnnotations.scala index e8f8a80e1a0d..d6c11fe36748 100644 --- a/compiler/src/dotty/tools/dotc/transform/RepeatableAnnotations.scala +++ b/compiler/src/dotty/tools/dotc/transform/RepeatableAnnotations.scala @@ -10,6 +10,7 @@ import Symbols.defn import Constants._ import Types._ import Decorators._ +import Flags._ import scala.collection.mutable @@ -33,7 +34,7 @@ class RepeatableAnnotations extends MiniPhase: val annsByType = stableGroupBy(annotations, _.symbol) annsByType.flatMap { case (_, a :: Nil) => a :: Nil - case (sym, anns) if sym.derivesFrom(defn.ClassfileAnnotationClass) => + case (sym, anns) if sym.is(JavaDefined) => sym.getAnnotation(defn.JavaRepeatableAnnot).flatMap(_.argumentConstant(0)) match case Some(Constant(containerTpe: Type)) => val clashingAnns = annsByType.getOrElse(containerTpe.classSymbol, Nil) @@ -44,7 +45,7 @@ class RepeatableAnnotations extends MiniPhase: Nil else val aggregated = JavaSeqLiteral(anns.map(_.tree).toList, TypeTree(sym.typeRef)) - Annotation(containerTpe, NamedArg("value".toTermName, aggregated)) :: Nil + Annotation(containerTpe, NamedArg("value".toTermName, aggregated), sym.span) :: Nil case _ => val pos = anns.head.tree.srcPos report.error("Not repeatable annotation repeated", pos) diff --git a/compiler/src/dotty/tools/dotc/transform/Splicer.scala b/compiler/src/dotty/tools/dotc/transform/Splicer.scala index 31c28d7b1854..b936afb73dc8 100644 --- a/compiler/src/dotty/tools/dotc/transform/Splicer.scala +++ b/compiler/src/dotty/tools/dotc/transform/Splicer.scala @@ -19,6 +19,8 @@ import dotty.tools.dotc.core.Denotations.staticRef import dotty.tools.dotc.core.TypeErasure import dotty.tools.dotc.core.Constants.Constant +import 
dotty.tools.dotc.quoted.Interpreter + import scala.util.control.NonFatal import dotty.tools.dotc.util.SrcPos import dotty.tools.repl.AbstractFileClassLoader @@ -32,7 +34,8 @@ import scala.quoted.runtime.impl._ /** Utility class to splice quoted expressions */ object Splicer { - import tpd._ + import tpd.* + import Interpreter.* /** Splice the Tree for a Quoted expression. `${'{xyz}}` becomes `xyz` * and for `$xyz` the tree of `xyz` is interpreted for which the @@ -50,7 +53,7 @@ object Splicer { val oldContextClassLoader = Thread.currentThread().getContextClassLoader Thread.currentThread().setContextClassLoader(classLoader) try { - val interpreter = new Interpreter(splicePos, classLoader) + val interpreter = new SpliceInterpreter(splicePos, classLoader) // Some parts of the macro are evaluated during the unpickling performed in quotedExprToTree val interpretedExpr = interpreter.interpret[Quotes => scala.quoted.Expr[Any]](tree) @@ -66,7 +69,7 @@ object Splicer { throw ex case ex: scala.quoted.runtime.StopMacroExpansion => if !ctx.reporter.hasErrors then - report.error("Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users to facilitate debugging when aborting a macro expansion.", splicePos) + report.error("Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users when aborting a macro expansion with StopMacroExpansion.", splicePos) // errors have been emitted EmptyTree case ex: StopInterpretation => @@ -74,10 +77,10 @@ object Splicer { ref(defn.Predef_undefined).withType(ErrorType(ex.msg)) case NonFatal(ex) => val msg = - s"""Failed to evaluate macro. - | Caused by ${ex.getClass}: ${if (ex.getMessage == null) "" else ex.getMessage} - | ${ex.getStackTrace.takeWhile(_.getClassName != "dotty.tools.dotc.transform.Splicer$").drop(1).mkString("\n ")} - """.stripMargin + em"""Failed to evaluate macro. + | Caused by ${ex.getClass}: ${if (ex.getMessage == null) "" else ex.getMessage} + | ${ex.getStackTrace.takeWhile(_.getClassName != "dotty.tools.dotc.transform.Splicer$").drop(1).mkString("\n ")} + """ report.error(msg, spliceExpansionPos) ref(defn.Predef_undefined).withType(ErrorType(msg)) } @@ -219,24 +222,13 @@ object Splicer { checkIfValidStaticCall(tree)(using Set.empty) } - /** Tree interpreter that evaluates the tree */ - private class Interpreter(pos: SrcPos, classLoader: ClassLoader)(using Context) { - - type Env = Map[Symbol, Object] - - /** Returns the interpreted result of interpreting the code a call to the symbol with default arguments. - * Return Some of the result or None if some error happen during the interpretation. - */ - def interpret[T](tree: Tree)(implicit ct: ClassTag[T]): Option[T] = - interpretTree(tree)(Map.empty) match { - case obj: T => Some(obj) - case obj => - // TODO upgrade to a full type tag check or something similar - report.error(s"Interpreted tree returned a result of an unexpected type. Expected ${ct.runtimeClass} but was ${obj.getClass}", pos) - None - } + /** Tree interpreter that evaluates the tree. + * Interpreter is assumed to start at quotation level -1. 
+ */ + private class SpliceInterpreter(pos: SrcPos, classLoader: ClassLoader)(using Context) extends Interpreter(pos, classLoader) { - def interpretTree(tree: Tree)(implicit env: Env): Object = tree match { + override protected def interpretTree(tree: Tree)(implicit env: Env): Object = tree match { + // Interpret level -1 quoted code `'{...}` (assumed without level 0 splices) case Apply(Select(Apply(TypeApply(fn, _), quoted :: Nil), nme.apply), _) if fn.symbol == defn.QuotedRuntime_exprQuote => val quoted1 = quoted match { case quoted: Ident if quoted.symbol.isAllOf(InlineByNameProxy) => @@ -245,324 +237,14 @@ object Splicer { case Inlined(EmptyTree, _, quoted) => quoted case _ => quoted } - interpretQuote(quoted1) + new ExprImpl(Inlined(EmptyTree, Nil, QuoteUtils.changeOwnerOfTree(quoted1, ctx.owner)).withSpan(quoted1.span), SpliceScope.getCurrent) + // Interpret level -1 `Type.of[T]` case Apply(TypeApply(fn, quoted :: Nil), _) if fn.symbol == defn.QuotedTypeModule_of => - interpretTypeQuote(quoted) - - case Literal(Constant(value)) => - interpretLiteral(value) - - case tree: Ident if tree.symbol.is(Inline, butNot = Method) => - tree.tpe.widenTermRefExpr match - case ConstantType(c) => c.value.asInstanceOf[Object] - case _ => throw new StopInterpretation(em"${tree.symbol} could not be inlined", tree.srcPos) - - // TODO disallow interpreted method calls as arguments - case Call(fn, args) => - if (fn.symbol.isConstructor && fn.symbol.owner.owner.is(Package)) - interpretNew(fn.symbol, args.flatten.map(interpretTree)) - else if (fn.symbol.is(Module)) - interpretModuleAccess(fn.symbol) - else if (fn.symbol.is(Method) && fn.symbol.isStatic) { - val staticMethodCall = interpretedStaticMethodCall(fn.symbol.owner, fn.symbol) - staticMethodCall(interpretArgs(args, fn.symbol.info)) - } - else if fn.symbol.isStatic then - assert(args.isEmpty) - interpretedStaticFieldAccess(fn.symbol) - else if (fn.qualifier.symbol.is(Module) && fn.qualifier.symbol.isStatic) - if (fn.name == nme.asInstanceOfPM) - interpretModuleAccess(fn.qualifier.symbol) - else { - val staticMethodCall = interpretedStaticMethodCall(fn.qualifier.symbol.moduleClass, fn.symbol) - staticMethodCall(interpretArgs(args, fn.symbol.info)) - } - else if (env.contains(fn.symbol)) - env(fn.symbol) - else if (tree.symbol.is(InlineProxy)) - interpretTree(tree.symbol.defTree.asInstanceOf[ValOrDefDef].rhs) - else - unexpectedTree(tree) - - case closureDef((ddef @ DefDef(_, ValDefs(arg :: Nil) :: Nil, _, _))) => - (obj: AnyRef) => interpretTree(ddef.rhs)(using env.updated(arg.symbol, obj)) - - // Interpret `foo(j = x, i = y)` which it is expanded to - // `val j$1 = x; val i$1 = y; foo(i = i$1, j = j$1)` - case Block(stats, expr) => interpretBlock(stats, expr) - case NamedArg(_, arg) => interpretTree(arg) - - case Inlined(_, bindings, expansion) => interpretBlock(bindings, expansion) - - case Typed(expr, _) => - interpretTree(expr) - - case SeqLiteral(elems, _) => - interpretVarargs(elems.map(e => interpretTree(e))) + new TypeImpl(QuoteUtils.changeOwnerOfTree(quoted, ctx.owner), SpliceScope.getCurrent) case _ => - unexpectedTree(tree) - } - - private def interpretArgs(argss: List[List[Tree]], fnType: Type)(using Env): List[Object] = { - def interpretArgsGroup(args: List[Tree], argTypes: List[Type]): List[Object] = - assert(args.size == argTypes.size) - val view = - for (arg, info) <- args.lazyZip(argTypes) yield - info match - case _: ExprType => () => interpretTree(arg) // by-name argument - case _ => interpretTree(arg) // by-value argument - 
view.toList - - fnType.dealias match - case fnType: MethodType if fnType.isErasedMethod => interpretArgs(argss, fnType.resType) - case fnType: MethodType => - val argTypes = fnType.paramInfos - assert(argss.head.size == argTypes.size) - interpretArgsGroup(argss.head, argTypes) ::: interpretArgs(argss.tail, fnType.resType) - case fnType: AppliedType if defn.isContextFunctionType(fnType) => - val argTypes :+ resType = fnType.args: @unchecked - interpretArgsGroup(argss.head, argTypes) ::: interpretArgs(argss.tail, resType) - case fnType: PolyType => interpretArgs(argss, fnType.resType) - case fnType: ExprType => interpretArgs(argss, fnType.resType) - case _ => - assert(argss.isEmpty) - Nil - } - - private def interpretBlock(stats: List[Tree], expr: Tree)(implicit env: Env) = { - var unexpected: Option[Object] = None - val newEnv = stats.foldLeft(env)((accEnv, stat) => stat match { - case stat: ValDef => - accEnv.updated(stat.symbol, interpretTree(stat.rhs)(accEnv)) - case stat => - if (unexpected.isEmpty) - unexpected = Some(unexpectedTree(stat)) - accEnv - }) - unexpected.getOrElse(interpretTree(expr)(newEnv)) - } - - private def interpretQuote(tree: Tree)(implicit env: Env): Object = - new ExprImpl(Inlined(EmptyTree, Nil, QuoteUtils.changeOwnerOfTree(tree, ctx.owner)).withSpan(tree.span), SpliceScope.getCurrent) - - private def interpretTypeQuote(tree: Tree)(implicit env: Env): Object = - new TypeImpl(QuoteUtils.changeOwnerOfTree(tree, ctx.owner), SpliceScope.getCurrent) - - private def interpretLiteral(value: Any)(implicit env: Env): Object = - value.asInstanceOf[Object] - - private def interpretVarargs(args: List[Object])(implicit env: Env): Object = - args.toSeq - - private def interpretedStaticMethodCall(moduleClass: Symbol, fn: Symbol)(implicit env: Env): List[Object] => Object = { - val (inst, clazz) = - try - if (moduleClass.name.startsWith(str.REPL_SESSION_LINE)) - (null, loadReplLineClass(moduleClass)) - else { - val inst = loadModule(moduleClass) - (inst, inst.getClass) - } - catch - case MissingClassDefinedInCurrentRun(sym) if ctx.compilationUnit.isSuspendable => - if (ctx.settings.XprintSuspension.value) - report.echo(i"suspension triggered by a dependency on $sym", pos) - ctx.compilationUnit.suspend() // this throws a SuspendException - - val name = fn.name.asTermName - val method = getMethod(clazz, name, paramsSig(fn)) - (args: List[Object]) => stopIfRuntimeException(method.invoke(inst, args: _*), method) - } - - private def interpretedStaticFieldAccess(sym: Symbol)(implicit env: Env): Object = { - val clazz = loadClass(sym.owner.fullName.toString) - val field = clazz.getField(sym.name.toString) - field.get(null) - } - - private def interpretModuleAccess(fn: Symbol)(implicit env: Env): Object = - loadModule(fn.moduleClass) - - private def interpretNew(fn: Symbol, args: => List[Object])(implicit env: Env): Object = { - val clazz = loadClass(fn.owner.fullName.toString) - val constr = clazz.getConstructor(paramsSig(fn): _*) - constr.newInstance(args: _*).asInstanceOf[Object] - } - - private def unexpectedTree(tree: Tree)(implicit env: Env): Object = - throw new StopInterpretation("Unexpected tree could not be interpreted: " + tree, tree.srcPos) - - private def loadModule(sym: Symbol): Object = - if (sym.owner.is(Package)) { - // is top level object - val moduleClass = loadClass(sym.fullName.toString) - moduleClass.getField(str.MODULE_INSTANCE_FIELD).get(null) - } - else { - // nested object in an object - val className = { - val pack = sym.topLevelClass.owner - if (pack == 
defn.RootPackage || pack == defn.EmptyPackageClass) sym.flatName.toString - else pack.showFullName + "." + sym.flatName - } - val clazz = loadClass(className) - clazz.getConstructor().newInstance().asInstanceOf[Object] - } - - private def loadReplLineClass(moduleClass: Symbol)(implicit env: Env): Class[?] = { - val lineClassloader = new AbstractFileClassLoader(ctx.settings.outputDir.value, classLoader) - lineClassloader.loadClass(moduleClass.name.firstPart.toString) - } - - private def loadClass(name: String): Class[?] = - try classLoader.loadClass(name) - catch { - case _: ClassNotFoundException => - val msg = s"Could not find class $name in classpath" - throw new StopInterpretation(msg, pos) - } - - private def getMethod(clazz: Class[?], name: Name, paramClasses: List[Class[?]]): JLRMethod = - try clazz.getMethod(name.toString, paramClasses: _*) - catch { - case _: NoSuchMethodException => - val msg = em"Could not find method ${clazz.getCanonicalName}.$name with parameters ($paramClasses%, %)" - throw new StopInterpretation(msg, pos) - case MissingClassDefinedInCurrentRun(sym) if ctx.compilationUnit.isSuspendable => - if (ctx.settings.XprintSuspension.value) - report.echo(i"suspension triggered by a dependency on $sym", pos) - ctx.compilationUnit.suspend() // this throws a SuspendException - } - - private def stopIfRuntimeException[T](thunk: => T, method: JLRMethod): T = - try thunk - catch { - case ex: RuntimeException => - val sw = new StringWriter() - sw.write("A runtime exception occurred while executing macro expansion\n") - sw.write(ex.getMessage) - sw.write("\n") - ex.printStackTrace(new PrintWriter(sw)) - sw.write("\n") - throw new StopInterpretation(sw.toString, pos) - case ex: InvocationTargetException => - ex.getTargetException match { - case ex: scala.quoted.runtime.StopMacroExpansion => - throw ex - case MissingClassDefinedInCurrentRun(sym) if ctx.compilationUnit.isSuspendable => - if (ctx.settings.XprintSuspension.value) - report.echo(i"suspension triggered by a dependency on $sym", pos) - ctx.compilationUnit.suspend() // this throws a SuspendException - case targetException => - val sw = new StringWriter() - sw.write("Exception occurred while executing macro expansion.\n") - if (!ctx.settings.Ydebug.value) { - val end = targetException.getStackTrace.lastIndexWhere { x => - x.getClassName == method.getDeclaringClass.getCanonicalName && x.getMethodName == method.getName - } - val shortStackTrace = targetException.getStackTrace.take(end + 1) - targetException.setStackTrace(shortStackTrace) - } - targetException.printStackTrace(new PrintWriter(sw)) - sw.write("\n") - throw new StopInterpretation(sw.toString, pos) - } - } - - private object MissingClassDefinedInCurrentRun { - def unapply(targetException: NoClassDefFoundError)(using Context): Option[Symbol] = { - val className = targetException.getMessage - if (className == null) None - else { - val sym = staticRef(className.toTypeName).symbol - if (sym.isDefinedInCurrentRun) Some(sym) else None - } - } - } - - /** List of classes of the parameters of the signature of `sym` */ - private def paramsSig(sym: Symbol): List[Class[?]] = { - def paramClass(param: Type): Class[?] 
= { - def arrayDepth(tpe: Type, depth: Int): (Type, Int) = tpe match { - case JavaArrayType(elemType) => arrayDepth(elemType, depth + 1) - case _ => (tpe, depth) - } - def javaArraySig(tpe: Type): String = { - val (elemType, depth) = arrayDepth(tpe, 0) - val sym = elemType.classSymbol - val suffix = - if (sym == defn.BooleanClass) "Z" - else if (sym == defn.ByteClass) "B" - else if (sym == defn.ShortClass) "S" - else if (sym == defn.IntClass) "I" - else if (sym == defn.LongClass) "J" - else if (sym == defn.FloatClass) "F" - else if (sym == defn.DoubleClass) "D" - else if (sym == defn.CharClass) "C" - else "L" + javaSig(elemType) + ";" - ("[" * depth) + suffix - } - def javaSig(tpe: Type): String = tpe match { - case tpe: JavaArrayType => javaArraySig(tpe) - case _ => - // Take the flatten name of the class and the full package name - val pack = tpe.classSymbol.topLevelClass.owner - val packageName = if (pack == defn.EmptyPackageClass) "" else s"${pack.fullName}." - packageName + tpe.classSymbol.fullNameSeparated(FlatName).toString - } - - val sym = param.classSymbol - if (sym == defn.BooleanClass) classOf[Boolean] - else if (sym == defn.ByteClass) classOf[Byte] - else if (sym == defn.CharClass) classOf[Char] - else if (sym == defn.ShortClass) classOf[Short] - else if (sym == defn.IntClass) classOf[Int] - else if (sym == defn.LongClass) classOf[Long] - else if (sym == defn.FloatClass) classOf[Float] - else if (sym == defn.DoubleClass) classOf[Double] - else java.lang.Class.forName(javaSig(param), false, classLoader) - } - def getExtraParams(tp: Type): List[Type] = tp.widenDealias match { - case tp: AppliedType if defn.isContextFunctionType(tp) => - // Call context function type direct method - tp.args.init.map(arg => TypeErasure.erasure(arg)) ::: getExtraParams(tp.args.last) - case _ => Nil - } - val extraParams = getExtraParams(sym.info.finalResultType) - val allParams = TypeErasure.erasure(sym.info) match { - case meth: MethodType => meth.paramInfos ::: extraParams - case _ => extraParams - } - allParams.map(paramClass) - } - } - - - - /** Exception that stops interpretation if some issue is found */ - private class StopInterpretation(val msg: String, val pos: SrcPos) extends Exception - - object Call { - /** Matches an expression that is either a field access or an application - * It retruns a TermRef containing field accessed or a method reference and the arguments passed to it. 
- */ - def unapply(arg: Tree)(using Context): Option[(RefTree, List[List[Tree]])] = - Call0.unapply(arg).map((fn, args) => (fn, args.reverse)) - - private object Call0 { - def unapply(arg: Tree)(using Context): Option[(RefTree, List[List[Tree]])] = arg match { - case Select(Call0(fn, args), nme.apply) if defn.isContextFunctionType(fn.tpe.widenDealias.finalResultType) => - Some((fn, args)) - case fn: Ident => Some((tpd.desugarIdent(fn).withSpan(fn.span), Nil)) - case fn: Select => Some((fn, Nil)) - case Apply(f @ Call0(fn, args1), args2) => - if (f.tpe.widenDealias.isErasedMethod) Some((fn, args1)) - else Some((fn, args2 :: args1)) - case TypeApply(Call0(fn, args), _) => Some((fn, args)) - case _ => None - } + super.interpretTree(tree) } } } diff --git a/compiler/src/dotty/tools/dotc/transform/SuperAccessors.scala b/compiler/src/dotty/tools/dotc/transform/SuperAccessors.scala index b0c8605e7dd1..2307f759b571 100644 --- a/compiler/src/dotty/tools/dotc/transform/SuperAccessors.scala +++ b/compiler/src/dotty/tools/dotc/transform/SuperAccessors.scala @@ -88,7 +88,7 @@ class SuperAccessors(thisPhase: DenotTransformer) { // Diagnostic for SI-7091 if (!accDefs.contains(clazz)) report.error( - s"Internal error: unable to store accessor definition in ${clazz}. clazz.hasPackageFlag=${clazz.is(Package)}. Accessor required for ${sel} (${sel.show})", + em"Internal error: unable to store accessor definition in ${clazz}. clazz.hasPackageFlag=${clazz.is(Package)}. Accessor required for ${sel.toString} ($sel)", sel.srcPos) else accDefs(clazz) += DefDef(acc, EmptyTree).withSpan(accRange) acc @@ -109,16 +109,16 @@ class SuperAccessors(thisPhase: DenotTransformer) { if (sym.isTerm && !sym.is(Method, butNot = Accessor) && !ctx.owner.isAllOf(ParamForwarder)) // ParamForwaders as installed ParamForwarding.scala do use super calls to vals - report.error(s"super may be not be used on ${sym.underlyingSymbol}", sel.srcPos) + report.error(em"super may be not be used on ${sym.underlyingSymbol}", sel.srcPos) else if (isDisallowed(sym)) - report.error(s"super not allowed here: use this.${sel.name} instead", sel.srcPos) + report.error(em"super not allowed here: use this.${sel.name} instead", sel.srcPos) else if (sym.is(Deferred)) { val member = sym.overridingSymbol(clazz.asClass) if (!mix.name.isEmpty || !member.exists || !(member.is(AbsOverride) && member.isIncompleteIn(clazz))) report.error( - i"${sym.showLocated} is accessed from super. It may not be abstract unless it is overridden by a member declared `abstract' and `override'", + em"${sym.showLocated} is accessed from super. 
It may not be abstract unless it is overridden by a member declared `abstract' and `override'", sel.srcPos) else report.log(i"ok super $sel ${sym.showLocated} $member $clazz ${member.isIncompleteIn(clazz)}") } @@ -131,7 +131,7 @@ class SuperAccessors(thisPhase: DenotTransformer) { val overriding = sym.overridingSymbol(intermediateClass) if (overriding.is(Deferred, butNot = AbsOverride) && !overriding.owner.is(Trait)) report.error( - s"${sym.showLocated} cannot be directly accessed from ${clazz} because ${overriding.owner} redeclares it as abstract", + em"${sym.showLocated} cannot be directly accessed from ${clazz} because ${overriding.owner} redeclares it as abstract", sel.srcPos) } else { diff --git a/compiler/src/dotty/tools/dotc/transform/SymUtils.scala b/compiler/src/dotty/tools/dotc/transform/SymUtils.scala index 6010fe2a2a44..b945f5820523 100644 --- a/compiler/src/dotty/tools/dotc/transform/SymUtils.scala +++ b/compiler/src/dotty/tools/dotc/transform/SymUtils.scala @@ -270,11 +270,8 @@ object SymUtils: def isEnumCase(using Context): Boolean = self.isAllOf(EnumCase, butNot = JavaDefined) - def annotationsCarrying(meta: ClassSymbol)(using Context): List[Annotation] = - self.annotations.filter(_.symbol.hasAnnotation(meta)) - - def withAnnotationsCarrying(from: Symbol, meta: ClassSymbol)(using Context): self.type = { - self.addAnnotations(from.annotationsCarrying(meta)) + def withAnnotationsCarrying(from: Symbol, meta: Symbol, orNoneOf: Set[Symbol] = Set.empty)(using Context): self.type = { + self.addAnnotations(from.annotationsCarrying(Set(meta), orNoneOf)) self } @@ -384,7 +381,7 @@ object SymUtils: if original.hasAnnotation(defn.TargetNameAnnot) then self.addAnnotation( Annotation(defn.TargetNameAnnot, - Literal(Constant(nameFn(original.targetName).toString)).withSpan(original.span))) + Literal(Constant(nameFn(original.targetName).toString)).withSpan(original.span), original.span)) /** The return type as seen from the body of this definition. It is * computed from the symbol's type by replacing param refs by param symbols. 
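The PostTyper and SymUtils changes above route annotations on class parameters according to the standard meta-annotation targets (@param, @field, @getter and their bean variants). As a rough user-level sketch of the behaviour being implemented — the `logged` annotation below is hypothetical and not part of this patch:

    import scala.annotation.StaticAnnotation
    import scala.annotation.meta.{getter, param}

    // Hypothetical annotation, used only to illustrate meta-annotation targeting.
    class logged extends StaticAnnotation

    case class User(
      @(logged @getter) name: String, // kept on the generated accessor `name`
      @(logged @param)  age: Int      // kept only on the constructor parameter
    )

Roughly speaking, the copyAndKeepAnnotationsCarrying/withAnnotationsCarrying helpers in the diff perform this filtering: each derived symbol keeps only the annotations whose meta-annotations name it as a target, or — when `orNoneOf` is supplied — annotations that carry none of the listed meta-annotations.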
diff --git a/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala b/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala index 0a9a7a83948c..48bcbaab3511 100644 --- a/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala +++ b/compiler/src/dotty/tools/dotc/transform/SyntheticMembers.scala @@ -13,6 +13,7 @@ import ast.untpd import ValueClasses.isDerivedValueClass import SymUtils._ import util.Property +import util.Spans.Span import config.Printers.derive import NullOpsDecorator._ @@ -155,7 +156,7 @@ class SyntheticMembers(thisPhase: DenotTransformer) { case nme.hashCode_ => chooseHashcode case nme.toString_ => toStringBody(vrefss) case nme.equals_ => equalsBody(vrefss.head.head) - case nme.canEqual_ => canEqualBody(vrefss.head.head) + case nme.canEqual_ => canEqualBody(vrefss.head.head, synthetic.span) case nme.ordinal => ordinalRef case nme.productArity => Literal(Constant(accessors.length)) case nme.productPrefix if isEnumValue => nameRef @@ -260,13 +261,13 @@ class SyntheticMembers(thisPhase: DenotTransformer) { def equalsBody(that: Tree)(using Context): Tree = { val thatAsClazz = newSymbol(ctx.owner, nme.x_0, SyntheticCase, clazzType, coord = ctx.owner.span) // x$0 def wildcardAscription(tp: Type) = Typed(Underscore(tp), TypeTree(tp)) - val pattern = Bind(thatAsClazz, wildcardAscription(AnnotatedType(clazzType, Annotation(defn.UncheckedAnnot)))) // x$0 @ (_: C @unchecked) + val pattern = Bind(thatAsClazz, wildcardAscription(AnnotatedType(clazzType, Annotation(defn.UncheckedAnnot, thatAsClazz.span)))) // x$0 @ (_: C @unchecked) // compare primitive fields first, slow equality checks of non-primitive fields can be skipped when primitives differ val sortedAccessors = accessors.sortBy(accessor => if (accessor.info.typeSymbol.isPrimitiveValueClass) 0 else 1) val comparisons = sortedAccessors.map { accessor => This(clazz).withSpan(ctx.owner.span.focus).select(accessor).equal(ref(thatAsClazz).select(accessor)) } var rhs = // this.x == this$0.x && this.y == x$0.y && that.canEqual(this) - if comparisons.isEmpty then Literal(Constant(true)) else comparisons.reduceLeft(_ and _) + if comparisons.isEmpty then Literal(Constant(true)) else comparisons.reduceBalanced(_ and _) val canEqualMeth = existingDef(defn.Product_canEqual, clazz) if !clazz.is(Final) || canEqualMeth.exists && !canEqualMeth.is(Synthetic) then rhs = rhs.and( @@ -390,7 +391,7 @@ class SyntheticMembers(thisPhase: DenotTransformer) { * * `@unchecked` is needed for parametric case classes. 
*/ - def canEqualBody(that: Tree): Tree = that.isInstance(AnnotatedType(clazzType, Annotation(defn.UncheckedAnnot))) + def canEqualBody(that: Tree, span: Span): Tree = that.isInstance(AnnotatedType(clazzType, Annotation(defn.UncheckedAnnot, span))) symbolsToSynthesize.flatMap(syntheticDefIfMissing) } diff --git a/compiler/src/dotty/tools/dotc/transform/TailRec.scala b/compiler/src/dotty/tools/dotc/transform/TailRec.scala index 71b66c3d0da6..741b9d1627fe 100644 --- a/compiler/src/dotty/tools/dotc/transform/TailRec.scala +++ b/compiler/src/dotty/tools/dotc/transform/TailRec.scala @@ -4,7 +4,7 @@ package transform import ast.{TreeTypeMap, tpd} import config.Printers.tailrec import core.* -import Contexts.*, Flags.*, Symbols.* +import Contexts.*, Flags.*, Symbols.*, Decorators.em import Constants.Constant import NameKinds.{TailLabelName, TailLocalName, TailTempName} import StdNames.nme @@ -303,7 +303,7 @@ class TailRec extends MiniPhase { def fail(reason: String) = { if (isMandatory) { failureReported = true - report.error(s"Cannot rewrite recursive call: $reason", tree.srcPos) + report.error(em"Cannot rewrite recursive call: $reason", tree.srcPos) } else tailrec.println("Cannot rewrite recursive call at: " + tree.span + " because: " + reason) diff --git a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala index 82413e2e6733..4573c40df78b 100644 --- a/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala +++ b/compiler/src/dotty/tools/dotc/transform/TreeChecker.scala @@ -42,10 +42,6 @@ class TreeChecker extends Phase with SymTransformer { private val seenClasses = collection.mutable.HashMap[String, Symbol]() private val seenModuleVals = collection.mutable.HashMap[String, Symbol]() - def isValidJVMName(name: Name): Boolean = name.toString.forall(isValidJVMChar) - - def isValidJVMMethodName(name: Name): Boolean = name.toString.forall(isValidJVMMethodChar) - val NoSuperClassFlags: FlagSet = Trait | Package def testDuplicate(sym: Symbol, registry: mutable.Map[String, Symbol], typ: String)(using Context): Unit = { @@ -91,7 +87,7 @@ class TreeChecker extends Phase with SymTransformer { if (ctx.phaseId <= erasurePhase.id) { val initial = symd.initial assert(symd == initial || symd.signature == initial.signature, - i"""Signature of ${sym.showLocated} changed at phase ${ctx.phase.prevMega} + i"""Signature of ${sym} in ${sym.ownersIterator.toList}%, % changed at phase ${ctx.phase.prevMega} |Initial info: ${initial.info} |Initial sig : ${initial.signature} |Current info: ${symd.info} @@ -109,18 +105,6 @@ class TreeChecker extends Phase with SymTransformer { else if (ctx.phase.prev.isCheckable) check(ctx.base.allPhases.toIndexedSeq, ctx) - private def previousPhases(phases: List[Phase])(using Context): List[Phase] = phases match { - case (phase: MegaPhase) :: phases1 => - val subPhases = phase.miniPhases - val previousSubPhases = previousPhases(subPhases.toList) - if (previousSubPhases.length == subPhases.length) previousSubPhases ::: previousPhases(phases1) - else previousSubPhases - case phase :: phases1 if phase ne ctx.phase => - phase :: previousPhases(phases1) - case _ => - Nil - } - def check(phasesToRun: Seq[Phase], ctx: Context): Tree = { val fusedPhase = ctx.phase.prevMega(using ctx) report.echo(s"checking ${ctx.compilationUnit} after phase ${fusedPhase}")(using ctx) @@ -134,7 +118,6 @@ class TreeChecker extends Phase with SymTransformer { val checkingCtx = ctx .fresh - .addMode(Mode.ImplicitsEnabled) .setReporter(new 
ThrowingReporter(ctx.reporter)) val checker = inContext(ctx) { @@ -150,9 +133,80 @@ class TreeChecker extends Phase with SymTransformer { } } + /** + * Checks that `New` nodes are always wrapped inside `Select` nodes. + */ + def assertSelectWrapsNew(tree: Tree)(using Context): Unit = + (new TreeAccumulator[tpd.Tree] { + override def apply(parent: Tree, tree: Tree)(using Context): Tree = { + tree match { + case tree: New if !parent.isInstanceOf[tpd.Select] => + assert(assertion = false, i"`New` node must be wrapped in a `Select`:\n parent = ${parent.show}\n child = ${tree.show}") + case _: Annotated => + // Don't check inside annotations, since they're allowed to contain + // somewhat invalid trees. + case _ => + foldOver(tree, tree) // replace the parent when folding over the children + } + parent // return the old parent so that my siblings see it + } + })(tpd.EmptyTree, tree) +} + +object TreeChecker { + /** - Check that TypeParamRefs and MethodParams refer to an enclosing type. + * - Check that all type variables are instantiated. + */ + def checkNoOrphans(tp0: Type, tree: untpd.Tree = untpd.EmptyTree)(using Context): Type = new TypeMap() { + val definedBinders = new java.util.IdentityHashMap[Type, Any] + def apply(tp: Type): Type = { + tp match { + case tp: BindingType => + definedBinders.put(tp, tp) + mapOver(tp) + definedBinders.remove(tp) + case tp: ParamRef => + assert(definedBinders.get(tp.binder) != null, s"orphan param: ${tp.show}, hash of binder = ${System.identityHashCode(tp.binder)}, tree = ${tree.show}, type = $tp0") + case tp: TypeVar => + assert(tp.isInstantiated, s"Uninstantiated type variable: ${tp.show}, tree = ${tree.show}") + apply(tp.underlying) + case _ => + mapOver(tp) + } + tp + } + }.apply(tp0) + + /** Run some additional checks on the nodes of the trees. Specifically: + * + * - TypeTree can only appear in TypeApply args, New, Typed tpt, Closure + * tpt, SeqLiteral elemtpt, ValDef tpt, DefDef tpt, and TypeDef rhs. 
+ */ + object TreeNodeChecker extends untpd.TreeTraverser: + import untpd._ + def traverse(tree: Tree)(using Context) = tree match + case t: TypeTree => assert(assertion = false, i"TypeTree not expected: $t") + case t @ TypeApply(fun, _targs) => traverse(fun) + case t @ New(_tpt) => + case t @ Typed(expr, _tpt) => traverse(expr) + case t @ Closure(env, meth, _tpt) => traverse(env); traverse(meth) + case t @ SeqLiteral(elems, _elemtpt) => traverse(elems) + case t @ ValDef(_, _tpt, _) => traverse(t.rhs) + case t @ DefDef(_, paramss, _tpt, _) => for params <- paramss do traverse(params); traverse(t.rhs) + case t @ TypeDef(_, _rhs) => + case t @ Template(constr, _, self, _) => traverse(constr); traverse(t.parentsOrDerived); traverse(self); traverse(t.body) + case t => traverseChildren(t) + end traverse + + private[TreeChecker] def isValidJVMName(name: Name): Boolean = name.toString.forall(isValidJVMChar) + + private[TreeChecker] def isValidJVMMethodName(name: Name): Boolean = name.toString.forall(isValidJVMMethodChar) + + class Checker(phasesToCheck: Seq[Phase]) extends ReTyper with Checking { + import ast.tpd._ - private val nowDefinedSyms = util.HashSet[Symbol]() + protected val nowDefinedSyms = util.HashSet[Symbol]() private val patBoundSyms = util.HashSet[Symbol]() private val everDefinedSyms = MutableSymbolMap[untpd.Tree]() @@ -248,10 +302,9 @@ class TreeChecker extends Phase with SymTransformer { // case tree: untpd.Ident => // case tree: untpd.Select => // case tree: untpd.Bind => - case vd : ValDef => - assertIdentNotJavaClass(vd.forceIfLazy) - case dd : DefDef => - assertIdentNotJavaClass(dd.forceIfLazy) + case md: ValOrDefDef => + md.forceFields() + assertIdentNotJavaClass(md) // case tree: untpd.TypeDef => case Apply(fun, args) => assertIdentNotJavaClass(fun) @@ -376,7 +429,7 @@ class TreeChecker extends Phase with SymTransformer { override def typedIdent(tree: untpd.Ident, pt: Type)(using Context): Tree = { assert(tree.isTerm || !ctx.isAfterTyper, tree.show + " at " + ctx.phase) - assert(tree.isType || ctx.mode.is(Mode.Pattern) && untpd.isWildcardArg(tree) || !needsSelect(tree.tpe), i"bad type ${tree.tpe} for $tree # ${tree.uniqueId}") + assert(tree.isType || ctx.mode.is(Mode.Pattern) && untpd.isWildcardArg(tree) || !needsSelect(tree.typeOpt), i"bad type ${tree.tpe} for $tree # ${tree.uniqueId}") assertDefined(tree) checkNotRepeated(super.typedIdent(tree, pt)) @@ -417,11 +470,11 @@ class TreeChecker extends Phase with SymTransformer { sym == mbr || sym.overriddenSymbol(mbr.owner.asClass) == mbr || mbr.overriddenSymbol(sym.owner.asClass) == sym), - ex"""symbols differ for $tree - |was : $sym - |alternatives by type: $memberSyms%, % of types ${memberSyms.map(_.info)}%, % - |qualifier type : ${qualTpe} - |tree type : ${tree.typeOpt} of class ${tree.typeOpt.getClass}""") + i"""symbols differ for $tree + |was : $sym + |alternatives by type: $memberSyms%, % of types ${memberSyms.map(_.info)}%, % + |qualifier type : ${qualTpe} + |tree type : ${tree.typeOpt} of class ${tree.typeOpt.getClass}""") } checkNotRepeated(super.typedSelect(tree, pt)) @@ -658,68 +711,50 @@ class TreeChecker extends Phase with SymTransformer { override def simplify(tree: Tree, pt: Type, locked: TypeVars)(using Context): tree.type = tree } - /** - * Checks that `New` nodes are always wrapped inside `Select` nodes. 
- */ - def assertSelectWrapsNew(tree: Tree)(using Context): Unit = - (new TreeAccumulator[tpd.Tree] { - override def apply(parent: Tree, tree: Tree)(using Context): Tree = { - tree match { - case tree: New if !parent.isInstanceOf[tpd.Select] => - assert(assertion = false, i"`New` node must be wrapped in a `Select`:\n parent = ${parent.show}\n child = ${tree.show}") - case _: Annotated => - // Don't check inside annotations, since they're allowed to contain - // somewhat invalid trees. - case _ => - foldOver(tree, tree) // replace the parent when folding over the children - } - parent // return the old parent so that my siblings see it - } - })(tpd.EmptyTree, tree) -} + /** Tree checker that can be applied to a local tree. */ + class LocalChecker(phasesToCheck: Seq[Phase]) extends Checker(phasesToCheck: Seq[Phase]): + override def assertDefined(tree: untpd.Tree)(using Context): Unit = + // Only check definitions nested in the local tree + if nowDefinedSyms.contains(tree.symbol.maybeOwner) then + super.assertDefined(tree) -object TreeChecker { - /** - Check that TypeParamRefs and MethodParams refer to an enclosing type. - * - Check that all type variables are instantiated. - */ - def checkNoOrphans(tp0: Type, tree: untpd.Tree = untpd.EmptyTree)(using Context): Type = new TypeMap() { - val definedBinders = new java.util.IdentityHashMap[Type, Any] - def apply(tp: Type): Type = { - tp match { - case tp: BindingType => - definedBinders.put(tp, tp) - mapOver(tp) - definedBinders.remove(tp) - case tp: ParamRef => - assert(definedBinders.get(tp.binder) != null, s"orphan param: ${tp.show}, hash of binder = ${System.identityHashCode(tp.binder)}, tree = ${tree.show}, type = $tp0") - case tp: TypeVar => - assert(tp.isInstantiated, s"Uninstantiated type variable: ${tp.show}, tree = ${tree.show}") - apply(tp.underlying) - case _ => - mapOver(tp) - } - tp - } - }.apply(tp0) + def checkMacroGeneratedTree(original: tpd.Tree, expansion: tpd.Tree)(using Context): Unit = + if ctx.settings.XcheckMacros.value then + val checkingCtx = ctx + .fresh + .setReporter(new ThrowingReporter(ctx.reporter)) + val phases = ctx.base.allPhases.toList + val treeChecker = new LocalChecker(previousPhases(phases)) + + try treeChecker.typed(expansion)(using checkingCtx) + catch + case err: java.lang.AssertionError => + report.error( + s"""Malformed tree was found while expanding macro with -Xcheck-macros. + |The tree does not conform to the compiler's tree invariants. + | + |Macro was: + |${scala.quoted.runtime.impl.QuotesImpl.showDecompiledTree(original)} + | + |The macro returned: + |${scala.quoted.runtime.impl.QuotesImpl.showDecompiledTree(expansion)} + | + |Error: + |${err.getMessage} + | + |""", + original + ) - /** Run some additional checks on the nodes of the trees. Specifically: - * - * - TypeTree can only appear in TypeApply args, New, Typed tpt, Closure - * tpt, SeqLiteral elemtpt, ValDef tpt, DefDef tpt, and TypeDef rhs. 
- */ - object TreeNodeChecker extends untpd.TreeTraverser: - import untpd._ - def traverse(tree: Tree)(using Context) = tree match - case t: TypeTree => assert(assertion = false, i"TypeTree not expected: $t") - case t @ TypeApply(fun, _targs) => traverse(fun) - case t @ New(_tpt) => - case t @ Typed(expr, _tpt) => traverse(expr) - case t @ Closure(env, meth, _tpt) => traverse(env); traverse(meth) - case t @ SeqLiteral(elems, _elemtpt) => traverse(elems) - case t @ ValDef(_, _tpt, _) => traverse(t.rhs) - case t @ DefDef(_, paramss, _tpt, _) => for params <- paramss do traverse(params); traverse(t.rhs) - case t @ TypeDef(_, _rhs) => - case t @ Template(constr, parents, self, _) => traverse(constr); traverse(parents); traverse(self); traverse(t.body) - case t => traverseChildren(t) - end traverse + private[TreeChecker] def previousPhases(phases: List[Phase])(using Context): List[Phase] = phases match { + case (phase: MegaPhase) :: phases1 => + val subPhases = phase.miniPhases + val previousSubPhases = previousPhases(subPhases.toList) + if (previousSubPhases.length == subPhases.length) previousSubPhases ::: previousPhases(phases1) + else previousSubPhases + case phase :: phases1 if phase ne ctx.phase => + phase :: previousPhases(phases1) + case _ => + Nil + } } diff --git a/compiler/src/dotty/tools/dotc/transform/TupleOptimizations.scala b/compiler/src/dotty/tools/dotc/transform/TupleOptimizations.scala index 6bc2f438eb37..6fba0bca4ce3 100644 --- a/compiler/src/dotty/tools/dotc/transform/TupleOptimizations.scala +++ b/compiler/src/dotty/tools/dotc/transform/TupleOptimizations.scala @@ -145,7 +145,7 @@ class TupleOptimizations extends MiniPhase with IdentityDenotTransformer { val size = tpes.size val n = nTpe.value.intValue if (n < 0 || n >= size) { - report.error("index out of bounds: " + n, nTree.underlyingArgument.srcPos) + report.error(em"index out of bounds: $n", nTree.underlyingArgument.srcPos) tree } else if (size <= MaxTupleArity) @@ -155,7 +155,7 @@ class TupleOptimizations extends MiniPhase with IdentityDenotTransformer { // tup.asInstanceOf[TupleXXL].productElement(n) tup.asInstance(defn.TupleXXLClass.typeRef).select(nme.productElement).appliedTo(Literal(nTpe.value)) case (None, nTpe: ConstantType) if nTpe.value.intValue < 0 => - report.error("index out of bounds: " + nTpe.value.intValue, nTree.srcPos) + report.error(em"index out of bounds: ${nTpe.value.intValue}", nTree.srcPos) tree case _ => // No optimization, keep: diff --git a/compiler/src/dotty/tools/dotc/transform/TypeTestsCasts.scala b/compiler/src/dotty/tools/dotc/transform/TypeTestsCasts.scala index b2a101649457..3763af243881 100644 --- a/compiler/src/dotty/tools/dotc/transform/TypeTestsCasts.scala +++ b/compiler/src/dotty/tools/dotc/transform/TypeTestsCasts.scala @@ -241,7 +241,7 @@ object TypeTestsCasts { val foundEffectiveClass = effectiveClass(expr.tpe.widen) if foundEffectiveClass.isPrimitiveValueClass && !testCls.isPrimitiveValueClass then - report.error(i"cannot test if value of $exprType is a reference of $testCls", tree.srcPos) + report.error(em"cannot test if value of $exprType is a reference of $testCls", tree.srcPos) false else foundClasses.exists(check) end checkSensical @@ -345,7 +345,7 @@ object TypeTestsCasts { val testWidened = testType.widen defn.untestableClasses.find(testWidened.isRef(_)) match case Some(untestable) => - report.error(i"$untestable cannot be used in runtime type tests", tree.srcPos) + report.error(em"$untestable cannot be used in runtime type tests", tree.srcPos) constant(expr, 
Literal(Constant(false))) case _ => val erasedTestType = erasure(testType) @@ -359,7 +359,7 @@ object TypeTestsCasts { if !isTrusted && !isUnchecked then val whyNot = whyUncheckable(expr.tpe, argType, tree.span) if whyNot.nonEmpty then - report.uncheckedWarning(i"the type test for $argType cannot be checked at runtime because $whyNot", expr.srcPos) + report.uncheckedWarning(em"the type test for $argType cannot be checked at runtime because $whyNot", expr.srcPos) transformTypeTest(expr, argType, flagUnrelated = enclosingInlineds.isEmpty) // if test comes from inlined code, dont't flag it even if it always false } diff --git a/compiler/src/dotty/tools/dotc/transform/TypeUtils.scala b/compiler/src/dotty/tools/dotc/transform/TypeUtils.scala index 5b6e36343379..a897503ef275 100644 --- a/compiler/src/dotty/tools/dotc/transform/TypeUtils.scala +++ b/compiler/src/dotty/tools/dotc/transform/TypeUtils.scala @@ -76,7 +76,7 @@ object TypeUtils { case AndType(tp1, tp2) => // We assume that we have the following property: // (T1, T2, ..., Tn) & (U1, U2, ..., Un) = (T1 & U1, T2 & U2, ..., Tn & Un) - tp1.tupleElementTypes.zip(tp2.tupleElementTypes).map { case (t1, t2) => t1 & t2 } + tp1.tupleElementTypes.zip(tp2.tupleElementTypes).map { case (t1, t2) => t1.intersect(t2) } case OrType(tp1, tp2) => None // We can't combine the type of two tuples case _ => diff --git a/compiler/src/dotty/tools/dotc/transform/ValueClasses.scala b/compiler/src/dotty/tools/dotc/transform/ValueClasses.scala index a86bf2c48fb5..28d1255eaa72 100644 --- a/compiler/src/dotty/tools/dotc/transform/ValueClasses.scala +++ b/compiler/src/dotty/tools/dotc/transform/ValueClasses.scala @@ -22,15 +22,14 @@ object ValueClasses { } def isMethodWithExtension(sym: Symbol)(using Context): Boolean = - atPhaseNoLater(extensionMethodsPhase) { - val d = sym.denot - d.validFor.containsPhaseId(ctx.phaseId) && - d.isRealMethod && - isDerivedValueClass(d.owner) && - !d.isConstructor && - !d.symbol.isSuperAccessor && - !d.is(Macro) - } + val d = sym.denot.initial + d.validFor.firstPhaseId <= extensionMethodsPhase.id + && d.isRealMethod + && isDerivedValueClass(d.owner) + && !d.isConstructor + && !d.symbol.isSuperAccessor + && !d.isInlineMethod + && !d.is(Macro) /** The member of a derived value class that unboxes it. 
*/ def valueClassUnbox(cls: ClassSymbol)(using Context): Symbol = diff --git a/compiler/src/dotty/tools/dotc/transform/YCheckPositions.scala b/compiler/src/dotty/tools/dotc/transform/YCheckPositions.scala index ba42d826fe82..8080a7c911b3 100644 --- a/compiler/src/dotty/tools/dotc/transform/YCheckPositions.scala +++ b/compiler/src/dotty/tools/dotc/transform/YCheckPositions.scala @@ -61,6 +61,7 @@ class YCheckPositions extends Phase { private def isMacro(call: Tree)(using Context) = call.symbol.is(Macro) || + (call.symbol.isClass && call.tpe.derivesFrom(defn.MacroAnnotationClass)) || // The call of a macro after typer is encoded as a Select while other inlines are Ident // TODO remove this distinction once Inline nodes of expanded macros can be trusted (also in Inliner.inlineCallTrace) (!(ctx.phase <= postTyperPhase) && call.isInstanceOf[Select]) diff --git a/compiler/src/dotty/tools/dotc/transform/init/Semantic.scala b/compiler/src/dotty/tools/dotc/transform/init/Semantic.scala index a48aa77fe79f..eb1692e00a12 100644 --- a/compiler/src/dotty/tools/dotc/transform/init/Semantic.scala +++ b/compiler/src/dotty/tools/dotc/transform/init/Semantic.scala @@ -855,7 +855,7 @@ object Semantic: // init "fake" param fields for parameters of primary and secondary constructors def addParamsAsFields(args: List[Value], ref: Ref, ctorDef: DefDef) = val params = ctorDef.termParamss.flatten.map(_.symbol) - assert(args.size == params.size, "arguments = " + args.size + ", params = " + params.size) + assert(args.size == params.size, "arguments = " + args.size + ", params = " + params.size + ", ctor = " + ctor.show) for (param, value) <- params.zip(args) do ref.updateField(param, value) printer.println(param.show + " initialized with " + value) @@ -1663,9 +1663,14 @@ object Semantic: // term arguments to B. That can only be done in a concrete class. val tref = typeRefOf(klass.typeRef.baseType(mixin).typeConstructor) val ctor = tref.classSymbol.primaryConstructor - if ctor.exists then extendTrace(superParent) { - superCall(tref, ctor, Nil, tasks) - } + if ctor.exists then + // The parameter check of traits comes late in the mixin phase. + // To avoid crash we supply hot values for erroneous parent calls. + // See tests/neg/i16438.scala. 
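A note on the `Semantic.scala` change above: rather than tripping the `args.size == params.size` assertion in `addParamsAsFields`, the checker now passes `Hot` placeholder arguments when a parent call to a parameterized trait is erroneous. A minimal, hypothetical sketch of the kind of program involved (the actual tests/neg/i16438.scala is referenced but not reproduced in this diff):

```scala
// Hypothetical sketch only; the real test case lives in tests/neg/i16438.scala
// and is not part of this patch. A parameterized trait is extended without an
// argument list; the missing-argument error is only reported late, in the
// mixin phase, so the -Ysafe-init analysis substitutes Hot values for the
// trait's constructor parameters instead of crashing on the arity mismatch.
trait A(val x: Int)
class B extends A // error: parameterized trait A must be extended with arguments
```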
+ val args: List[ArgInfo] = ctor.info.paramInfoss.flatten.map(_ => ArgInfo(Hot, Trace.empty)) + extendTrace(superParent) { + superCall(tref, ctor, args, tasks) + } } // initialize super classes after outers are set diff --git a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala index 8e891f822255..90310a385d0c 100644 --- a/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala +++ b/compiler/src/dotty/tools/dotc/transform/patmat/Space.scala @@ -116,15 +116,15 @@ trait SpaceLogic { /** Simplify space such that a space equal to `Empty` becomes `Empty` */ def simplify(space: Space)(using Context): Space = trace(s"simplify ${show(space)} --> ", debug, show)(space match { case Prod(tp, fun, spaces) => - val sps = spaces.map(simplify(_)) + val sps = spaces.mapconserve(simplify) if (sps.contains(Empty)) Empty else if (canDecompose(tp) && decompose(tp).isEmpty) Empty - else Prod(tp, fun, sps) + else if sps eq spaces then space else Prod(tp, fun, sps) case Or(spaces) => - val spaces2 = spaces.map(simplify(_)).filter(_ != Empty) + val spaces2 = spaces.map(simplify).filter(_ != Empty) if spaces2.isEmpty then Empty - else if spaces2.lengthCompare(1) == 0 then spaces2.head - else Or(spaces2) + else if spaces2.lengthIs == 1 then spaces2.head + else if spaces2.corresponds(spaces)(_ eq _) then space else Or(spaces2) case Typ(tp, _) => if (canDecompose(tp) && decompose(tp).isEmpty) Empty else space @@ -164,12 +164,15 @@ trait SpaceLogic { List(space) } - /** Is `a` a subspace of `b`? Equivalent to `a - b == Empty`, but faster */ + /** Is `a` a subspace of `b`? Equivalent to `simplify(simplify(a) - simplify(b)) == Empty`, but faster */ def isSubspace(a: Space, b: Space)(using Context): Boolean = trace(s"isSubspace(${show(a)}, ${show(b)})", debug) { def tryDecompose1(tp: Type) = canDecompose(tp) && isSubspace(Or(decompose(tp)), b) def tryDecompose2(tp: Type) = canDecompose(tp) && isSubspace(a, Or(decompose(tp))) - (simplify(a), simplify(b)) match { + val a2 = simplify(a) + val b2 = simplify(b) + if (a ne a2) || (b ne b2) then isSubspace(a2, b2) + else (a, b) match { case (Empty, _) => true case (_, Empty) => false case (Or(ss), _) => @@ -187,7 +190,7 @@ trait SpaceLogic { case (Typ(tp1, _), Prod(tp2, fun, ss)) => isSubType(tp1, tp2) && covers(fun, tp1, ss.length) - && isSubspace(Prod(tp2, fun, signature(fun, tp2, ss.length).map(Typ(_, false))), b) + && isSubspace(Prod(tp2, fun, signature(fun, tp1, ss.length).map(Typ(_, false))), b) case (Prod(_, fun1, ss1), Prod(_, fun2, ss2)) => isSameUnapply(fun1, fun2) && ss1.zip(ss2).forall((isSubspace _).tupled) } @@ -266,9 +269,11 @@ trait SpaceLogic { tryDecompose2(tp2) else a + case (Prod(tp1, fun1, ss1), Prod(tp2, fun2, ss2)) + if (!isSameUnapply(fun1, fun2)) => a + case (Prod(tp1, fun1, ss1), Prod(tp2, fun2, ss2)) + if (fun1.symbol.name == nme.unapply && ss1.length != ss2.length) => a case (Prod(tp1, fun1, ss1), Prod(tp2, fun2, ss2)) => - if (!isSameUnapply(fun1, fun2)) return a - if (fun1.symbol.name == nme.unapply && ss1.length != ss2.length) return a val range = (0 until ss1.size).toList val cache = Array.fill[Space | Null](ss2.length)(null) @@ -317,6 +322,27 @@ object SpaceEngine { case funRef: TermRef => isIrrefutable(funRef, argLen) case _: ErrorType => false } + + /** Is this an `'{..}` or `'[..]` irrefutable quoted patterns? 
+ * @param unapp The unapply function tree + * @param implicits The implicits of the unapply + * @param pt The scrutinee type + */ + def isIrrefutableQuotedPattern(unapp: tpd.Tree, implicits: List[tpd.Tree], pt: Type)(using Context): Boolean = { + implicits.headOption match + // pattern '{ $x: T } + case Some(tpd.Apply(tpd.Select(tpd.Quoted(tpd.TypeApply(fn, List(tpt))), nme.apply), _)) + if unapp.symbol.owner.eq(defn.QuoteMatching_ExprMatchModule) + && fn.symbol.eq(defn.QuotedRuntimePatterns_patternHole) => + pt <:< defn.QuotedExprClass.typeRef.appliedTo(tpt.tpe) + + // pattern '[T] + case Some(tpd.Apply(tpd.TypeApply(fn, List(tpt)), _)) + if unapp.symbol.owner.eq(defn.QuoteMatching_TypeMatchModule) => + pt =:= defn.QuotedTypeClass.typeRef.appliedTo(tpt.tpe) + + case _ => false + } } /** Scala implementation of space logic */ @@ -597,11 +623,11 @@ class SpaceEngine(using Context) extends SpaceLogic { } /** Decompose a type into subspaces -- assume the type can be decomposed */ - def decompose(tp: Type): List[Typ] = - tp.dealias match { + def decompose(tp: Type): List[Typ] = trace(i"decompose($tp)", debug, show(_: Seq[Space])) { + def rec(tp: Type, mixins: List[Type]): List[Typ] = tp.dealias match { case AndType(tp1, tp2) => def decomposeComponent(tpA: Type, tpB: Type): List[Typ] = - decompose(tpA).flatMap { + rec(tpA, tpB :: mixins).flatMap { case Typ(tp, _) => if tp <:< tpB then Typ(tp, decomposed = true) :: Nil @@ -628,6 +654,17 @@ class SpaceEngine(using Context) extends SpaceLogic { Typ(ConstantType(Constant(())), true) :: Nil case tp if tp.classSymbol.isAllOf(JavaEnumTrait) => tp.classSymbol.children.map(sym => Typ(sym.termRef, true)) + + case tp @ AppliedType(tycon, targs) if tp.classSymbol.children.isEmpty && canDecompose(tycon) => + // It might not obvious that it's OK to apply the type arguments of a parent type to child types. + // But this is guarded by `tp.classSymbol.children.isEmpty`, + // meaning we'll decompose to the same class, just not the same type. + // For instance, from i15029, `decompose((X | Y).Field[T]) = [X.Field[T], Y.Field[T]]`. 
+ rec(tycon, Nil).map(typ => Typ(tp.derivedAppliedType(typ.tp, targs))) + + case tp: NamedType if canDecompose(tp.prefix) => + rec(tp.prefix, Nil).map(typ => Typ(tp.derivedSelect(typ.tp))) + case tp => def getChildren(sym: Symbol): List[Symbol] = sym.children.flatMap { child => @@ -642,7 +679,7 @@ class SpaceEngine(using Context) extends SpaceLogic { val parts = children.map { sym => val sym1 = if (sym.is(ModuleClass)) sym.sourceModule else sym - val refined = TypeOps.refineUsingParent(tp, sym1) + val refined = TypeOps.refineUsingParent(tp, sym1, mixins) debug.println(sym1.show + " refined to " + refined.show) @@ -663,13 +700,17 @@ class SpaceEngine(using Context) extends SpaceLogic { parts.map(Typ(_, true)) } + rec(tp, Nil) + } /** Abstract sealed types, or-types, Boolean and Java enums can be decomposed */ def canDecompose(tp: Type): Boolean = val res = tp.dealias match + case AppliedType(tycon, _) if canDecompose(tycon) => true + case tp: NamedType if canDecompose(tp.prefix) => true case _: SingletonType => false case _: OrType => true - case and: AndType => canDecompose(and.tp1) || canDecompose(and.tp2) + case AndType(tp1, tp2) => canDecompose(tp1) || canDecompose(tp2) case _ => val cls = tp.classSymbol cls.is(Sealed) diff --git a/compiler/src/dotty/tools/dotc/transform/sjs/AddLocalJSFakeNews.scala b/compiler/src/dotty/tools/dotc/transform/sjs/AddLocalJSFakeNews.scala index 8851e641122f..6471e58d4ddc 100644 --- a/compiler/src/dotty/tools/dotc/transform/sjs/AddLocalJSFakeNews.scala +++ b/compiler/src/dotty/tools/dotc/transform/sjs/AddLocalJSFakeNews.scala @@ -65,7 +65,7 @@ class AddLocalJSFakeNews extends MiniPhase { thisPhase => constant.typeValue.typeSymbol.asClass case _ => // this shouldn't happen - report.error(i"unexpected $classValueArg for the first argument to `createLocalJSClass`", classValueArg) + report.error(em"unexpected $classValueArg for the first argument to `createLocalJSClass`", classValueArg) jsdefn.JSObjectClass } diff --git a/compiler/src/dotty/tools/dotc/transform/sjs/ExplicitJSClasses.scala b/compiler/src/dotty/tools/dotc/transform/sjs/ExplicitJSClasses.scala index 3c87621413b7..705b3cc404a8 100644 --- a/compiler/src/dotty/tools/dotc/transform/sjs/ExplicitJSClasses.scala +++ b/compiler/src/dotty/tools/dotc/transform/sjs/ExplicitJSClasses.scala @@ -651,7 +651,7 @@ class ExplicitJSClasses extends MiniPhase with InfoTransformer { thisPhase => case typeRef: TypeRef => typeRef case _ => // This should not have passed the checks in PrepJSInterop - report.error(i"class type required but found $tpe0", tree) + report.error(em"class type required but found $tpe0", tree) jsdefn.JSObjectType } val cls = tpe.typeSymbol @@ -667,7 +667,7 @@ class ExplicitJSClasses extends MiniPhase with InfoTransformer { thisPhase => val jsclassAccessor = jsclassAccessorFor(cls) ref(NamedType(prefix, jsclassAccessor.name, jsclassAccessor.denot)) } else { - report.error(i"stable reference to a JS class required but $tpe found", tree) + report.error(em"stable reference to a JS class required but $tpe found", tree) ref(defn.Predef_undefined) } } else if (isLocalJSClass(cls)) { diff --git a/compiler/src/dotty/tools/dotc/transform/sjs/JUnitBootstrappers.scala b/compiler/src/dotty/tools/dotc/transform/sjs/JUnitBootstrappers.scala index 817a6c5afabc..b911d7dfab96 100644 --- a/compiler/src/dotty/tools/dotc/transform/sjs/JUnitBootstrappers.scala +++ b/compiler/src/dotty/tools/dotc/transform/sjs/JUnitBootstrappers.scala @@ -13,6 +13,7 @@ import Scopes._ import Symbols._ import StdNames._ import Types._ 
+import Decorators.em import dotty.tools.dotc.transform.MegaPhase._ @@ -238,7 +239,7 @@ class JUnitBootstrappers extends MiniPhase { case NamedArg(name, _) => name.show(using ctx) case other => other.show(using ctx) } - report.error(s"$shownName is an unsupported argument for the JUnit @Test annotation in this position", other.sourcePos) + report.error(em"$shownName is an unsupported argument for the JUnit @Test annotation in this position", other.sourcePos) None } } diff --git a/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSExports.scala b/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSExports.scala index b0de197635e9..25ab46712e70 100644 --- a/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSExports.scala +++ b/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSExports.scala @@ -189,7 +189,7 @@ object PrepJSExports { if (hasExplicitName) { annot.argumentConstantString(0).getOrElse { report.error( - s"The argument to ${annot.symbol.name} must be a literal string", + em"The argument to ${annot.symbol.name} must be a literal string", annot.arguments(0)) "dummy" } diff --git a/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala b/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala index e75769147f80..d934dc179989 100644 --- a/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala +++ b/compiler/src/dotty/tools/dotc/transform/sjs/PrepJSInterop.scala @@ -248,9 +248,9 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP if (tpeSym.isJSType) { def reportError(reasonAndExplanation: String): Unit = { report.error( - "Using an anonymous function as a SAM for the JavaScript type " + - i"${tpeSym.fullName} is not allowed because " + - reasonAndExplanation, + em"Using an anonymous function as a SAM for the JavaScript type ${ + tpeSym.fullName + } is not allowed because $reasonAndExplanation", tree) } if (!tpeSym.is(Trait) || tpeSym.asClass.superClass != jsdefn.JSFunctionClass) { @@ -318,9 +318,9 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP nameArgs match { case List(Literal(Constant(s: String))) => if (s != "apply") - report.error(i"js.Dynamic.literal does not have a method named $s", tree) + report.error(em"js.Dynamic.literal does not have a method named $s", tree) case _ => - report.error(i"js.Dynamic.literal.${tree.symbol.name} may not be called directly", tree) + report.error(em"js.Dynamic.literal.${tree.symbol.name} may not be called directly", tree) } // TODO Warn for known duplicate property names @@ -381,7 +381,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP tpe.underlyingClassRef(refinementOK = false) match { case typeRef: TypeRef if typeRef.symbol.isOneOf(Trait | ModuleClass) => - report.error(i"non-trait class type required but $tpe found", tpeArg) + report.error(em"non-trait class type required but $tpe found", tpeArg) case _ => // an error was already reported above } @@ -440,7 +440,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP * which is never valid. 
*/ report.error( - i"${sym.name} extends ${parentSym.fullName} which does not extend js.Any.", + em"${sym.name} extends ${parentSym.fullName} which does not extend js.Any.", classDef) } } @@ -502,8 +502,8 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP def emitOverrideError(msg: String): Unit = { report.error( - "error overriding %s;\n %s %s".format( - infoStringWithLocation(overridden), infoString(overriding), msg), + em"""error overriding ${infoStringWithLocation(overridden)}; + | ${infoString(overriding)} $msg""", errorPos) } @@ -559,7 +559,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP for (annot <- sym.annotations) { val annotSym = annot.symbol if (isJSNativeLoadingSpecAnnot(annotSym)) - report.error(i"Traits may not have an @${annotSym.name} annotation.", annot.tree) + report.error(em"Traits may not have an @${annotSym.name} annotation.", annot.tree) } } else { checkJSNativeLoadSpecOf(treePos, sym) @@ -571,7 +571,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP def checkGlobalRefName(globalRef: String): Unit = { if (!JSGlobalRef.isValidJSGlobalRefName(globalRef)) - report.error(s"The name of a JS global variable must be a valid JS identifier (got '$globalRef')", pos) + report.error(em"The name of a JS global variable must be a valid JS identifier (got '$globalRef')", pos) } if (enclosingOwner is OwnerKind.JSNative) { @@ -585,7 +585,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP for (annot <- sym.annotations) { val annotSym = annot.symbol if (isJSNativeLoadingSpecAnnot(annotSym)) - report.error(i"Nested JS classes and objects cannot have an @${annotSym.name} annotation.", annot.tree) + report.error(em"Nested JS classes and objects cannot have an @${annotSym.name} annotation.", annot.tree) } if (sym.owner.isStaticOwner) { @@ -731,7 +731,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP if (overriddenSymbols.hasNext) { val overridden = overriddenSymbols.next() val verb = if (overridden.is(Deferred)) "implement" else "override" - report.error(i"An @js.native member cannot $verb the inherited member ${overridden.fullName}", tree) + report.error(em"An @js.native member cannot $verb the inherited member ${overridden.fullName}", tree) } tree @@ -974,6 +974,8 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP tree.rhs match { case sel: Select if sel.symbol == jsdefn.JSPackage_native => // ok + case rhs: Ident if rhs.symbol == jsdefn.JSPackage_native => + // ok case _ => val pos = if (tree.rhs != EmptyTree) tree.rhs.srcPos else tree.srcPos report.error(s"$longKindStr may only call js.native.", pos) @@ -982,7 +984,7 @@ class PrepJSInterop extends MacroTransform with IdentityDenotTransformer { thisP // Check that the result type was explicitly specified // (This is stronger than Scala 2, which only warns, and only if it was inferred as Nothing.) 
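An aside on the `@@ -974` hunk above: besides the message changes, it adds an `Ident` case so that a bare `native` reference (imported from `scala.scalajs.js`) is accepted as the body of a JS native member, where previously only the literal selection `js.native` was recognised. A hedged Scala.js sketch of the shape this permits, with invented facade and global names:

```scala
// Hedged illustration only; the facade below and the "someGlobal" name are
// invented. With the new Ident case, `native` imported from scala.scalajs.js
// may be written unqualified as a member body. The explicit result type is
// still required, as enforced by the check that follows in the patch.
import scala.scalajs.js
import scala.scalajs.js.native
import scala.scalajs.js.annotation.JSGlobal

@js.native
@JSGlobal("someGlobal")
object SomeGlobal extends js.Object:
  def answer: Int = native // previously had to be written as js.native
```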
if (tree.tpt.isInstanceOf[InferredTypeTree]) - report.error(i"The type of ${tree.name} must be explicitly specified because it is JS native.", tree) + report.error(em"The type of ${tree.name} must be explicitly specified because it is JS native.", tree) } private def checkJSNativeSpecificAnnotsOnNonJSNative(memberDef: MemberDef)(using Context): Unit = { @@ -1319,7 +1321,7 @@ object PrepJSInterop { for (annotation <- sym.annotations) { if (isCompilerAnnotation(annotation)) { report.error( - i"@${annotation.symbol.fullName} is for compiler internal use only. Do not use it yourself.", + em"@${annotation.symbol.fullName} is for compiler internal use only. Do not use it yourself.", annotation.tree) } } diff --git a/compiler/src/dotty/tools/dotc/typer/Applications.scala b/compiler/src/dotty/tools/dotc/typer/Applications.scala index 386bae6d5338..cd33fe9cef24 100644 --- a/compiler/src/dotty/tools/dotc/typer/Applications.scala +++ b/compiler/src/dotty/tools/dotc/typer/Applications.scala @@ -6,7 +6,6 @@ import core._ import ast.{Trees, tpd, untpd, desugar} import util.Stats.record import util.{SrcPos, NoSourcePosition} -import Trees.Untyped import Contexts._ import Flags._ import Symbols._ @@ -24,7 +23,7 @@ import Inferencing._ import reporting._ import transform.TypeUtils._ import transform.SymUtils._ -import Nullables._ +import Nullables._, NullOpsDecorator.* import config.Feature import collection.mutable @@ -47,7 +46,7 @@ object Applications { def extractorMemberType(tp: Type, name: Name, errorPos: SrcPos)(using Context): Type = { val ref = extractorMember(tp, name) if (ref.isOverloaded) - errorType(i"Overloaded reference to $ref is not allowed in extractor", errorPos) + errorType(em"Overloaded reference to $ref is not allowed in extractor", errorPos) ref.info.widenExpr.annotatedToRepeated } @@ -273,6 +272,7 @@ object Applications { else def selectGetter(qual: Tree): Tree = val getterDenot = qual.tpe.member(getterName) + .accessibleFrom(qual.tpe.widenIfUnstable) // to reset Local if (getterDenot.exists) qual.select(TermRef(qual.tpe, getterName, getterDenot)) else EmptyTree if !meth.isClassConstructor then @@ -341,6 +341,12 @@ object Applications { val getter = findDefaultGetter(fn, n, testOnly) if getter.isEmpty then getter else spliceMeth(getter.withSpan(fn.span), fn) + + def retypeSignaturePolymorphicFn(fun: Tree, methType: Type)(using Context): Tree = + val sym1 = fun.symbol + val flags2 = sym1.flags | NonMember // ensures Select typing doesn't let TermRef#withPrefix revert the type + val sym2 = sym1.copy(info = methType, flags = flags2) // symbol not entered, to avoid overload resolution problems + fun.withType(sym2.termRef) } trait Applications extends Compatibility { @@ -479,7 +485,7 @@ trait Applications extends Compatibility { matchArgs(orderedArgs, methType.paramInfos, 0) case _ => if (methType.isError) ok = false - else fail(s"$methString does not take parameters".toMessage) + else fail(em"$methString does not take parameters") } /** The application was successful */ @@ -491,7 +497,7 @@ trait Applications extends Compatibility { i"${err.refStr(methRef)}$infoStr" /** Re-order arguments to correctly align named arguments */ - def reorder[T >: Untyped](args: List[Trees.Tree[T]]): List[Trees.Tree[T]] = { + def reorder[T <: Untyped](args: List[Trees.Tree[T]]): List[Trees.Tree[T]] = { /** @param pnames The list of parameter names that are missing arguments * @param args The list of arguments that are not yet passed, or that are waiting to be dropped @@ -519,10 +525,10 @@ trait Applications 
extends Compatibility { else { // name not (or no longer) available for named arg def msg = if (methodType.paramNames contains aname) - s"parameter $aname of $methString is already instantiated" + em"parameter $aname of $methString is already instantiated" else - s"$methString does not have a parameter $aname" - fail(msg.toMessage, arg.asInstanceOf[Arg]) + em"$methString does not have a parameter $aname" + fail(msg, arg.asInstanceOf[Arg]) arg :: handleNamed(pnamesRest, args1, nameToArg, toDrop) } case arg :: args1 => @@ -548,7 +554,7 @@ trait Applications extends Compatibility { /** Is `sym` a constructor of a Java-defined annotation? */ def isJavaAnnotConstr(sym: Symbol): Boolean = - sym.is(JavaDefined) && sym.isConstructor && sym.owner.derivesFrom(defn.AnnotationClass) + sym.is(JavaDefined) && sym.isConstructor && sym.owner.is(JavaAnnotation) /** Match re-ordered arguments against formal parameters * @param n The position of the first parameter in formals in `methType`. @@ -564,7 +570,7 @@ trait Applications extends Compatibility { i"it is not the only argument to be passed to the corresponding repeated parameter $formal" else i"the corresponding parameter has type $formal which is not a repeated parameter type" - fail(em"Sequence argument type annotation `*` cannot be used here:\n$addendum".toMessage, arg) + fail(em"Sequence argument type annotation `*` cannot be used here:\n$addendum", arg) /** Add result of typing argument `arg` against parameter type `formal`. * @return The remaining formal parameter types. If the method is parameter-dependent @@ -648,10 +654,10 @@ trait Applications extends Compatibility { def msg = arg match case untpd.Tuple(Nil) if applyKind == ApplyKind.InfixTuple && funType.widen.isNullaryMethod => - i"can't supply unit value with infix notation because nullary $methString takes no arguments; use dotted invocation instead: (...).${methRef.name}()" + em"can't supply unit value with infix notation because nullary $methString takes no arguments; use dotted invocation instead: (...).${methRef.name}()" case _ => - i"too many arguments for $methString" - fail(msg.toMessage, arg) + em"too many arguments for $methString" + fail(msg, arg) case nil => } } @@ -754,7 +760,7 @@ trait Applications extends Compatibility { /** Subclass of Application for type checking an Apply node, where * types of arguments are either known or unknown. */ - abstract class TypedApply[T >: Untyped]( + abstract class TypedApply[T <: Untyped]( app: untpd.Apply, fun: Tree, methRef: TermRef, args: List[Trees.Tree[T]], resultType: Type, override val applyKind: ApplyKind)(using Context) extends Application(methRef, fun.tpe, args, resultType) { @@ -937,6 +943,21 @@ trait Applications extends Compatibility { /** Type application where arguments come from prototype, and no implicits are inserted */ def simpleApply(fun1: Tree, proto: FunProto)(using Context): Tree = methPart(fun1).tpe match { + case funRef: TermRef if funRef.symbol.isSignaturePolymorphic => + // synthesize a method type based on the types at the call site. 
+ // one can imagine the original signature-polymorphic method as + // being infinitely overloaded, with each individual overload only + // being brought into existence as needed + val originalResultType = funRef.symbol.info.resultType.stripNull + val resultType = + if !originalResultType.isRef(defn.ObjectClass) then originalResultType + else AvoidWildcardsMap()(proto.resultType.deepenProtoTrans) match + case SelectionProto(nme.asInstanceOf_, PolyProto(_, resTp), _, _) => resTp + case resTp if isFullyDefined(resTp, ForceDegree.all) => resTp + case _ => defn.ObjectType + val methType = MethodType(proto.typedArgs().map(_.tpe.widen), resultType) + val fun2 = Applications.retypeSignaturePolymorphicFn(fun1, methType) + simpleApply(fun2, proto) case funRef: TermRef => val app = ApplyTo(tree, fun1, funRef, proto, pt) convertNewGenericArray( @@ -982,7 +1003,10 @@ trait Applications extends Compatibility { case TypeApply(fun, _) => !fun.isInstanceOf[Select] case _ => false } - typedDynamicApply(tree, isInsertedApply, pt) + val tree1 = fun1 match + case Select(_, nme.apply) => tree + case _ => untpd.Apply(fun1, tree.args) + typedDynamicApply(tree1, isInsertedApply, pt) case _ => if (originalProto.isDropped) fun1 else if (fun1.symbol == defn.Compiletime_summonFrom) @@ -1097,7 +1121,7 @@ trait Applications extends Compatibility { /** Overridden in ReTyper to handle primitive operations that can be generated after erasure */ protected def handleUnexpectedFunType(tree: untpd.Apply, fun: Tree)(using Context): Tree = if ctx.reporter.errorsReported then - throw TypeError(i"unexpected function type: ${methPart(fun).tpe}") + throw TypeError(em"unexpected function type: ${methPart(fun).tpe}") else throw Error(i"unexpected type.\n fun = $fun,\n methPart(fun) = ${methPart(fun)},\n methPart(fun).tpe = ${methPart(fun).tpe},\n tpe = ${fun.tpe}") @@ -1105,8 +1129,8 @@ trait Applications extends Compatibility { for (case arg @ NamedArg(id, argtpt) <- args) yield { if !Feature.namedTypeArgsEnabled then report.error( - i"""Named type arguments are experimental, - |they must be enabled with a `experimental.namedTypeArguments` language import or setting""", + em"""Named type arguments are experimental, + |they must be enabled with a `experimental.namedTypeArguments` language import or setting""", arg.srcPos) val argtpt1 = typedType(argtpt) cpy.NamedArg(arg)(id, argtpt1).withType(argtpt1.tpe) @@ -1114,14 +1138,14 @@ trait Applications extends Compatibility { def typedTypeApply(tree: untpd.TypeApply, pt: Type)(using Context): Tree = { if (ctx.mode.is(Mode.Pattern)) - return errorTree(tree, "invalid pattern") + return errorTree(tree, em"invalid pattern") val isNamed = hasNamedArg(tree.args) val typedArgs = if (isNamed) typedNamedArgs(tree.args) else tree.args.mapconserve(typedType(_)) record("typedTypeApply") typedExpr(tree.fun, PolyProto(typedArgs, pt)) match { case _: TypeApply if !ctx.isAfterTyper => - errorTree(tree, "illegal repeated type application") + errorTree(tree, em"illegal repeated type application") case typedFn => typedFn.tpe.widen match { case pt: PolyType => @@ -1397,7 +1421,7 @@ trait Applications extends Compatibility { case Apply(Apply(unapply, `dummyArg` :: Nil), args2) => assert(args2.nonEmpty); res ++= args2 case Apply(unapply, `dummyArg` :: Nil) => case Inlined(u, _, _) => loop(u) - case DynamicUnapply(_) => report.error("Structural unapply is not supported", unapplyFn.srcPos) + case DynamicUnapply(_) => report.error(em"Structural unapply is not supported", unapplyFn.srcPos) case Apply(fn, args) => 
assert(args.nonEmpty); loop(fn); res ++= args case _ => ().assertingErrorsReported } @@ -1502,11 +1526,17 @@ trait Applications extends Compatibility { } /** Drop any leading implicit parameter sections */ - def stripImplicit(tp: Type)(using Context): Type = tp match { + def stripImplicit(tp: Type, wildcardOnly: Boolean = false)(using Context): Type = tp match { case mt: MethodType if mt.isImplicitMethod => - stripImplicit(resultTypeApprox(mt)) + stripImplicit(resultTypeApprox(mt, wildcardOnly)) case pt: PolyType => - pt.derivedLambdaType(pt.paramNames, pt.paramInfos, stripImplicit(pt.resultType)).asInstanceOf[PolyType].flatten + pt.derivedLambdaType(pt.paramNames, pt.paramInfos, + stripImplicit(pt.resultType, wildcardOnly = true)) + // can't use TypeParamRefs for parameter references in `resultTypeApprox` + // since their bounds can refer to type parameters in `pt` that are not + // bound by the constraint. This can lead to hygiene violations if subsequently + // `pt` itself is added to the constraint. Test case is run/enrich-gentraversable.scala. + .asInstanceOf[PolyType].flatten case _ => tp } @@ -1897,7 +1927,9 @@ trait Applications extends Compatibility { /** The shape of given tree as a type; cannot handle named arguments. */ def typeShape(tree: untpd.Tree): Type = tree match { case untpd.Function(args, body) => - defn.FunctionOf(args map Function.const(defn.AnyType), typeShape(body)) + defn.FunctionOf( + args.map(Function.const(defn.AnyType)), typeShape(body), + isContextual = untpd.isContextualClosure(tree)) case Match(EmptyTree, _) => defn.PartialFunctionClass.typeRef.appliedTo(defn.AnyType :: defn.NothingType :: Nil) case _ => @@ -2206,7 +2238,7 @@ trait Applications extends Compatibility { false val commonFormal = if (isPartial) defn.PartialFunctionOf(commonParamTypes.head, WildcardType) - else defn.FunctionOf(commonParamTypes, WildcardType) + else defn.FunctionOf(commonParamTypes, WildcardType, isContextual = untpd.isContextualClosure(arg)) overload.println(i"pretype arg $arg with expected type $commonFormal") if (commonParamTypes.forall(isFullyDefined(_, ForceDegree.flipBottom))) withMode(Mode.ImplicitsEnabled) { @@ -2375,7 +2407,7 @@ trait Applications extends Compatibility { else None catch - case NonFatal(_) => None + case ex: UnhandledError => None def isApplicableExtensionMethod(methodRef: TermRef, receiverType: Type)(using Context): Boolean = methodRef.symbol.is(ExtensionMethod) && !receiverType.isBottomType && diff --git a/compiler/src/dotty/tools/dotc/typer/Checking.scala b/compiler/src/dotty/tools/dotc/typer/Checking.scala index c53213d7bd37..817fe6f21d24 100644 --- a/compiler/src/dotty/tools/dotc/typer/Checking.scala +++ b/compiler/src/dotty/tools/dotc/typer/Checking.scala @@ -33,7 +33,7 @@ import NameOps._ import SymDenotations.{NoCompleter, NoDenotation} import Applications.unapplyArgs import Inferencing.isFullyDefined -import transform.patmat.SpaceEngine.isIrrefutable +import transform.patmat.SpaceEngine.{isIrrefutable, isIrrefutableQuotedPattern} import config.Feature import config.Feature.sourceVersion import config.SourceVersion._ @@ -67,11 +67,12 @@ object Checking { */ def checkBounds(args: List[tpd.Tree], boundss: List[TypeBounds], instantiate: (Type, List[Type]) => Type, app: Type = NoType, tpt: Tree = EmptyTree)(using Context): Unit = - args.lazyZip(boundss).foreach { (arg, bound) => - if !bound.isLambdaSub && !arg.tpe.hasSimpleKind then - errorTree(arg, - showInferred(MissingTypeParameterInTypeApp(arg.tpe), app, tpt)) - } + if ctx.phase != 
Phases.checkCapturesPhase then + args.lazyZip(boundss).foreach { (arg, bound) => + if !bound.isLambdaSub && !arg.tpe.hasSimpleKind then + errorTree(arg, + showInferred(MissingTypeParameterInTypeApp(arg.tpe), app, tpt)) + } for (arg, which, bound) <- TypeOps.boundsViolations(args, boundss, instantiate, app) do report.error( showInferred(DoesNotConformToBound(arg.tpe, which, bound), app, tpt), @@ -154,7 +155,7 @@ object Checking { checker.traverse(tpt.tpe) def checkNoWildcard(tree: Tree)(using Context): Tree = tree.tpe match { - case tpe: TypeBounds => errorTree(tree, "no wildcard type allowed here") + case tpe: TypeBounds => errorTree(tree, em"no wildcard type allowed here") case _ => tree } @@ -184,12 +185,14 @@ object Checking { /** Check that `tp` refers to a nonAbstract class * and that the instance conforms to the self type of the created class. */ - def checkInstantiable(tp: Type, pos: SrcPos)(using Context): Unit = + def checkInstantiable(tp: Type, srcTp: Type, pos: SrcPos)(using Context): Unit = tp.underlyingClassRef(refinementOK = false) match case tref: TypeRef => val cls = tref.symbol - if (cls.isOneOf(AbstractOrTrait)) - report.error(CantInstantiateAbstractClassOrTrait(cls, isTrait = cls.is(Trait)), pos) + if (cls.isOneOf(AbstractOrTrait)) { + val srcCls = srcTp.underlyingClassRef(refinementOK = false).typeSymbol + report.error(CantInstantiateAbstractClassOrTrait(srcCls, isTrait = srcCls.is(Trait)), pos) + } if !cls.is(Module) then // Create a synthetic singleton type instance, and check whether // it conforms to the self type of the class as seen from that instance. @@ -471,11 +474,11 @@ object Checking { def checkWithDeferred(flag: FlagSet) = if (sym.isOneOf(flag)) fail(AbstractMemberMayNotHaveModifier(sym, flag)) - def checkNoConflict(flag1: FlagSet, flag2: FlagSet, msg: => String) = - if (sym.isAllOf(flag1 | flag2)) fail(msg.toMessage) + def checkNoConflict(flag1: FlagSet, flag2: FlagSet, msg: Message) = + if (sym.isAllOf(flag1 | flag2)) fail(msg) def checkCombination(flag1: FlagSet, flag2: FlagSet) = if sym.isAllOf(flag1 | flag2) then - fail(i"illegal combination of modifiers: `${flag1.flagsString}` and `${flag2.flagsString}` for: $sym".toMessage) + fail(em"illegal combination of modifiers: `${flag1.flagsString}` and `${flag2.flagsString}` for: $sym") def checkApplicable(flag: Flag, ok: Boolean) = if sym.is(flag, butNot = Synthetic) && !ok then fail(ModifierNotAllowedForDefinition(flag)) @@ -495,15 +498,15 @@ object Checking { } if sym.is(Transparent) then if sym.isType then - if !sym.is(Trait) then fail(em"`transparent` can only be used for traits".toMessage) + if !sym.isExtensibleClass then fail(em"`transparent` can only be used for extensible classes and traits") else - if !sym.isInlineMethod then fail(em"`transparent` can only be used for inline methods".toMessage) + if !sym.isInlineMethod then fail(em"`transparent` can only be used for inline methods") if (!sym.isClass && sym.is(Abstract)) fail(OnlyClassesCanBeAbstract(sym)) // note: this is not covered by the next test since terms can be abstract (which is a dual-mode flag) // but they can never be one of ClassOnlyFlags if !sym.isClass && sym.isOneOf(ClassOnlyFlags) then - fail(em"only classes can be ${(sym.flags & ClassOnlyFlags).flagsString}".toMessage) + fail(em"only classes can be ${(sym.flags & ClassOnlyFlags).flagsString}") if (sym.is(AbsOverride) && !sym.owner.is(Trait)) fail(AbstractOverrideOnlyInTraits(sym)) if sym.is(Trait) then @@ -520,7 +523,7 @@ object Checking { if !sym.isOneOf(Method | ModuleVal) then 
fail(TailrecNotApplicable(sym)) else if sym.is(Inline) then - fail("Inline methods cannot be @tailrec".toMessage) + fail(em"Inline methods cannot be @tailrec") if sym.hasAnnotation(defn.TargetNameAnnot) && sym.isClass && sym.isTopLevelClass then fail(TargetNameOnTopLevelClass(sym)) if (sym.hasAnnotation(defn.NativeAnnot)) { @@ -539,7 +542,7 @@ object Checking { fail(CannotExtendAnyVal(sym)) if (sym.isConstructor && !sym.isPrimaryConstructor && sym.owner.is(Trait, butNot = JavaDefined)) val addendum = if ctx.settings.Ydebug.value then s" ${sym.owner.flagsString}" else "" - fail(s"Traits cannot have secondary constructors$addendum".toMessage) + fail(em"Traits cannot have secondary constructors$addendum") checkApplicable(Inline, sym.isTerm && !sym.isOneOf(Mutable | Module)) checkApplicable(Lazy, !sym.isOneOf(Method | Mutable)) if (sym.isType && !sym.isOneOf(Deferred | JavaDefined)) @@ -560,7 +563,7 @@ object Checking { // The issue with `erased inline` is that the erased semantics get lost // as the code is inlined and the reference is removed before the erased usage check. checkCombination(Erased, Inline) - checkNoConflict(Lazy, ParamAccessor, s"parameter may not be `lazy`") + checkNoConflict(Lazy, ParamAccessor, em"parameter may not be `lazy`") } /** Check for illegal or redundant modifiers on modules. This is done separately @@ -599,7 +602,7 @@ object Checking { */ def checkNoPrivateLeaks(sym: Symbol)(using Context): Type = { class NotPrivate extends TypeMap { - var errors: List[() => String] = Nil + var errors: List[Message] = Nil private var inCaptureSet: Boolean = false def accessBoundary(sym: Symbol): Symbol = @@ -631,7 +634,7 @@ object Checking { var tp1 = if (isLeaked(tp.symbol)) { errors = - (() => em"non-private ${sym.showLocated} refers to private ${tp.symbol}\nin its type signature ${sym.info}") + em"non-private ${sym.showLocated} refers to private ${tp.symbol}\nin its type signature ${sym.info}" :: errors tp } @@ -672,7 +675,7 @@ object Checking { } val notPrivate = new NotPrivate val info = notPrivate(sym.info) - notPrivate.errors.foreach(error => report.errorOrMigrationWarning(error(), sym.srcPos, from = `3.0`)) + notPrivate.errors.foreach(report.errorOrMigrationWarning(_, sym.srcPos, from = `3.0`)) info } @@ -807,13 +810,13 @@ trait Checking { /** Check that type `tp` is stable. */ def checkStable(tp: Type, pos: SrcPos, kind: String)(using Context): Unit = - if !tp.isStable then report.error(NotAPath(tp, kind), pos) + if !tp.isStable && !tp.isErroneous then report.error(NotAPath(tp, kind), pos) /** Check that all type members of `tp` have realizable bounds */ def checkRealizableBounds(cls: Symbol, pos: SrcPos)(using Context): Unit = { val rstatus = boundsRealizability(cls.thisType) if (rstatus ne Realizable) - report.error(ex"$cls cannot be instantiated since it${rstatus.msg}", pos) + report.error(em"$cls cannot be instantiated since it${rstatus.msg}", pos) } /** Check that pattern `pat` is irrefutable for scrutinee type `sel.tpe`. 
@@ -834,7 +837,7 @@ trait Checking { var reportedPt = pt.dropAnnot(defn.UncheckedAnnot) if !pat.tpe.isSingleton then reportedPt = reportedPt.widen val problem = if pat.tpe <:< reportedPt then "is more specialized than" else "does not match" - ex"pattern's type ${pat.tpe} $problem the right hand side expression's type $reportedPt" + em"pattern's type ${pat.tpe} $problem the right hand side expression's type $reportedPt" case RefutableExtractor => val extractor = val UnApply(fn, _, _) = pat: @unchecked @@ -843,6 +846,10 @@ trait Checking { case _ => EmptyTree if extractor.isEmpty then em"pattern binding uses refutable extractor" + else if extractor.symbol eq defn.QuoteMatching_ExprMatch then + em"pattern binding uses refutable extractor `'{...}`" + else if extractor.symbol eq defn.QuoteMatching_TypeMatch then + em"pattern binding uses refutable extractor `'[...]`" else em"pattern binding uses refutable extractor `$extractor`" @@ -862,10 +869,11 @@ trait Checking { else pat.srcPos def rewriteMsg = Message.rewriteNotice("This patch", `3.2-migration`) report.gradualErrorOrMigrationWarning( - em"""$message - | - |If $usage is intentional, this can be communicated by $fix, - |which $addendum.$rewriteMsg""", + message.append( + i"""| + | + |If $usage is intentional, this can be communicated by $fix, + |which $addendum.$rewriteMsg"""), pos, warnFrom = `3.2`, errorFrom = `future`) false } @@ -880,9 +888,9 @@ trait Checking { pat match case Bind(_, pat1) => recur(pat1, pt) - case UnApply(fn, _, pats) => + case UnApply(fn, implicits, pats) => check(pat, pt) && - (isIrrefutable(fn, pats.length) || fail(pat, pt, Reason.RefutableExtractor)) && { + (isIrrefutable(fn, pats.length) || isIrrefutableQuotedPattern(fn, implicits, pt) || fail(pat, pt, Reason.RefutableExtractor)) && { val argPts = unapplyArgs(fn.tpe.widen.finalResultType, fn, pats, pat.srcPos) pats.corresponds(argPts)(recur) } @@ -902,7 +910,7 @@ trait Checking { private def checkLegalImportOrExportPath(path: Tree, kind: String)(using Context): Unit = { checkStable(path.tpe, path.srcPos, kind) if (!ctx.isAfterTyper) Checking.checkRealizable(path.tpe, path.srcPos) - if !isIdempotentExpr(path) then + if !isIdempotentExpr(path) && !path.tpe.isErroneous then report.error(em"import prefix is not a pure expression", path.srcPos) } @@ -934,8 +942,8 @@ trait Checking { // we restrict wildcard export from package as incremental compilation does not yet // register a dependency on "all members of a package" - see https://github.com/sbt/zinc/issues/226 report.error( - em"Implementation restriction: ${path.tpe.classSymbol} is not a valid prefix " + - "for a wildcard export, as it is a package.", path.srcPos) + em"Implementation restriction: ${path.tpe.classSymbol} is not a valid prefix for a wildcard export, as it is a package", + path.srcPos) /** Check that module `sym` does not clash with a class of the same name * that is concurrently compiled in another source file. @@ -978,14 +986,15 @@ trait Checking { sym.srcPos) /** If `tree` is an application of a new-style implicit conversion (using the apply - * method of a `scala.Conversion` instance), check that implicit conversions are - * enabled. + * method of a `scala.Conversion` instance), check that the expected type is + * a convertible formal parameter type or that implicit conversions are enabled. 
*/ - def checkImplicitConversionUseOK(tree: Tree)(using Context): Unit = + def checkImplicitConversionUseOK(tree: Tree, expected: Type)(using Context): Unit = val sym = tree.symbol if sym.name == nme.apply && sym.owner.derivesFrom(defn.ConversionClass) && !sym.info.isErroneous + && !expected.isConvertibleParam then def conv = methPart(tree) match case Select(qual, _) => qual.symbol.orElse(sym.owner) @@ -1021,8 +1030,8 @@ trait Checking { ("method", (n: Name) => s"method syntax .$n(...)") def rewriteMsg = Message.rewriteNotice("The latter", options = "-deprecation") report.deprecationWarning( - i"""Alphanumeric $kind $name is not declared ${hlAsKeyword("infix")}; it should not be used as infix operator. - |Instead, use ${alternative(name)} or backticked identifier `$name`.$rewriteMsg""", + em"""Alphanumeric $kind $name is not declared ${hlAsKeyword("infix")}; it should not be used as infix operator. + |Instead, use ${alternative(name)} or backticked identifier `$name`.$rewriteMsg""", tree.op.srcPos) if (ctx.settings.deprecation.value) { patch(Span(tree.op.span.start, tree.op.span.start), "`") @@ -1048,14 +1057,14 @@ trait Checking { def checkFeasibleParent(tp: Type, pos: SrcPos, where: => String = "")(using Context): Type = { def checkGoodBounds(tp: Type) = tp match { case tp @ TypeBounds(lo, hi) if !(lo <:< hi) => - report.error(ex"no type exists between low bound $lo and high bound $hi$where", pos) + report.error(em"no type exists between low bound $lo and high bound $hi$where", pos) TypeBounds(hi, hi) case _ => tp } tp match { case tp @ AndType(tp1, tp2) => - report.error(s"conflicting type arguments$where", pos) + report.error(em"conflicting type arguments$where", pos) tp1 case tp @ AppliedType(tycon, args) => tp.derivedAppliedType(tycon, args.mapConserve(checkGoodBounds)) @@ -1109,10 +1118,12 @@ trait Checking { def checkParentCall(call: Tree, caller: ClassSymbol)(using Context): Unit = if (!ctx.isAfterTyper) { val called = call.tpe.classSymbol + if (called.is(JavaAnnotation)) + report.error(em"${called.name} must appear without any argument to be a valid class parent because it is a Java annotation", call.srcPos) if (caller.is(Trait)) - report.error(i"$caller may not call constructor of $called", call.srcPos) + report.error(em"$caller may not call constructor of $called", call.srcPos) else if (called.is(Trait) && !caller.mixins.contains(called)) - report.error(i"""$called is already implemented by super${caller.superClass}, + report.error(em"""$called is already implemented by super${caller.superClass}, |its constructor cannot be called again""", call.srcPos) // Check that constructor call is of the form _.(args1)...(argsN). @@ -1121,7 +1132,7 @@ trait Checking { case Apply(fn, _) => checkLegalConstructorCall(fn, tree, "") case TypeApply(fn, _) => checkLegalConstructorCall(fn, tree, "type ") case Select(_, nme.CONSTRUCTOR) => // ok - case _ => report.error(s"too many ${kind}arguments in parent constructor", encl.srcPos) + case _ => report.error(em"too many ${kind}arguments in parent constructor", encl.srcPos) } call match { case Apply(fn, _) => checkLegalConstructorCall(fn, call, "") @@ -1171,7 +1182,7 @@ trait Checking { parent match { case parent: ClassSymbol => if (parent.is(Case)) - report.error(ex"""case $caseCls has case ancestor $parent, but case-to-case inheritance is prohibited. + report.error(em"""case $caseCls has case ancestor $parent, but case-to-case inheritance is prohibited. 
|To overcome this limitation, use extractors to pattern match on non-leaf nodes.""", pos) else checkCaseInheritance(parent.superClass, caseCls, pos) case _ => @@ -1185,7 +1196,7 @@ trait Checking { val check = new TreeTraverser { def traverse(tree: Tree)(using Context) = tree match { case id: Ident if vparams.exists(_.symbol == id.symbol) => - report.error("illegal forward reference to method parameter", id.srcPos) + report.error(em"illegal forward reference to method parameter", id.srcPos) case _ => traverseChildren(tree) } @@ -1228,7 +1239,7 @@ trait Checking { if (t.span.isSourceDerived && owner == badOwner) t match { case t: RefTree if allowed(t.name, checkedSym) => - case _ => report.error(i"illegal reference to $checkedSym from $where", t.srcPos) + case _ => report.error(em"illegal reference to $checkedSym from $where", t.srcPos) } val sym = t.symbol t match { @@ -1262,6 +1273,23 @@ trait Checking { if !Inlines.inInlineMethod && !ctx.isInlineContext then report.error(em"$what can only be used in an inline method", pos) + /** Check that the class corresponding to this tree is either a Scala or Java annotation. + * + * @return The original tree or an error tree in case `tree` isn't a valid + * annotation or already an error tree. + */ + def checkAnnotClass(tree: Tree)(using Context): Tree = + if tree.tpe.isError then + return tree + val cls = Annotations.annotClass(tree) + if cls.is(JavaDefined) then + if !cls.is(JavaAnnotation) then + errorTree(tree, em"$cls is not a valid Java annotation: it was not declared with `@interface`") + else tree + else if !cls.derivesFrom(defn.AnnotationClass) then + errorTree(tree, em"$cls is not a valid Scala annotation: it does not extend `scala.annotation.Annotation`") + else tree + /** Check arguments of compiler-defined annotations */ def checkAnnotArgs(tree: Tree)(using Context): tree.type = val cls = Annotations.annotClass(tree) @@ -1328,7 +1356,7 @@ trait Checking { def ensureParentDerivesFrom(enumCase: Symbol)(using Context) = val enumCls = enumCase.owner.linkedClass if !firstParent.derivesFrom(enumCls) then - report.error(i"enum case does not extend its enum $enumCls", enumCase.srcPos) + report.error(em"enum case does not extend its enum $enumCls", enumCase.srcPos) cls.info match case info: ClassInfo => cls.info = info.derivedClassInfo(declaredParents = enumCls.typeRefApplied :: info.declaredParents) @@ -1366,9 +1394,9 @@ trait Checking { if (stat.symbol.isAllOf(EnumCase)) stat match { - case TypeDef(_, Template(DefDef(_, paramss, _, _), parents, _, _)) => + case TypeDef(_, impl @ Template(DefDef(_, paramss, _, _), _, _, _)) => paramss.foreach(_.foreach(check)) - parents.foreach(check) + impl.parents.foreach(check) case vdef: ValDef => vdef.rhs match { case Block((clsDef @ TypeDef(_, impl: Template)) :: Nil, _) @@ -1516,7 +1544,7 @@ trait NoChecking extends ReChecking { override def checkStable(tp: Type, pos: SrcPos, kind: String)(using Context): Unit = () override def checkClassType(tp: Type, pos: SrcPos, traitReq: Boolean, stablePrefixReq: Boolean)(using Context): Type = tp override def checkImplicitConversionDefOK(sym: Symbol)(using Context): Unit = () - override def checkImplicitConversionUseOK(tree: Tree)(using Context): Unit = () + override def checkImplicitConversionUseOK(tree: Tree, expected: Type)(using Context): Unit = () override def checkFeasibleParent(tp: Type, pos: SrcPos, where: => String = "")(using Context): Type = tp override def checkAnnotArgs(tree: Tree)(using Context): tree.type = tree override def 
checkNoTargetNameConflict(stats: List[Tree])(using Context): Unit = () diff --git a/compiler/src/dotty/tools/dotc/typer/CrossVersionChecks.scala b/compiler/src/dotty/tools/dotc/typer/CrossVersionChecks.scala index 044dd7bb8528..ef9599be551c 100644 --- a/compiler/src/dotty/tools/dotc/typer/CrossVersionChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/CrossVersionChecks.scala @@ -67,7 +67,7 @@ class CrossVersionChecks extends MiniPhase: if !skipWarning then val msg = annot.argumentConstant(0).map(": " + _.stringValue).getOrElse("") val since = annot.argumentConstant(1).map(" since " + _.stringValue).getOrElse("") - report.deprecationWarning(s"${sym.showLocated} is deprecated${since}${msg}", pos) + report.deprecationWarning(em"${sym.showLocated} is deprecated${since}${msg}", pos) private def checkExperimentalSignature(sym: Symbol, pos: SrcPos)(using Context): Unit = class Checker extends TypeTraverser: @@ -110,20 +110,12 @@ class CrossVersionChecks extends MiniPhase: !sym.isDeprecated && !sym.is(Deferred)) if (!concrOvers.isEmpty) report.deprecationWarning( - symbol.toString + " overrides concrete, non-deprecated symbol(s):" + - concrOvers.map(_.name).mkString(" ", ", ", ""), tree.srcPos) + em"""$symbol overrides concrete, non-deprecated definition(s): + | ${concrOvers.map(_.name).mkString(", ")}""", + tree.srcPos) } } - /** Check that classes extending experimental classes or nested in experimental classes have the @experimental annotation. */ - private def checkExperimentalInheritance(cls: ClassSymbol)(using Context): Unit = - if !cls.isAnonymousClass && !cls.hasAnnotation(defn.ExperimentalAnnot) then - cls.info.parents.find(_.typeSymbol.isExperimental) match - case Some(parent) => - report.error(em"extension of experimental ${parent.typeSymbol} must have @experimental annotation", cls.srcPos) - case _ => - end checkExperimentalInheritance - override def transformValDef(tree: ValDef)(using Context): ValDef = checkDeprecatedOvers(tree) checkExperimentalAnnots(tree.symbol) @@ -136,12 +128,6 @@ class CrossVersionChecks extends MiniPhase: checkExperimentalSignature(tree.symbol, tree) tree - override def transformTemplate(tree: Template)(using Context): Tree = - val cls = ctx.owner.asClass - checkExperimentalInheritance(cls) - checkExperimentalAnnots(cls) - tree - override def transformIdent(tree: Ident)(using Context): Ident = { checkUndesiredProperties(tree.symbol, tree.srcPos) tree diff --git a/compiler/src/dotty/tools/dotc/typer/Deriving.scala b/compiler/src/dotty/tools/dotc/typer/Deriving.scala index d2165a5ca8c5..8fdc468780ba 100644 --- a/compiler/src/dotty/tools/dotc/typer/Deriving.scala +++ b/compiler/src/dotty/tools/dotc/typer/Deriving.scala @@ -44,7 +44,7 @@ trait Deriving { private def addDerivedInstance(clsName: Name, info: Type, pos: SrcPos): Unit = { val instanceName = "derived$".concat(clsName) if (ctx.denotNamed(instanceName).exists) - report.error(i"duplicate type class derivation for $clsName", pos) + report.error(em"duplicate type class derivation for $clsName", pos) else // If we set the Synthetic flag here widenGiven will widen too far and the // derived instance will have too low a priority to be selected over a freshly @@ -90,7 +90,7 @@ trait Deriving { xs.corresponds(ys)((x, y) => x.paramInfo.hasSameKindAs(y.paramInfo)) def cannotBeUnified = - report.error(i"${cls.name} cannot be unified with the type argument of ${typeClass.name}", derived.srcPos) + report.error(em"${cls.name} cannot be unified with the type argument of ${typeClass.name}", derived.srcPos) def 
addInstance(derivedParams: List[TypeSymbol], evidenceParamInfos: List[List[Type]], instanceTypes: List[Type]): Unit = { val resultType = typeClassType.appliedTo(instanceTypes) @@ -252,7 +252,7 @@ trait Deriving { if (typeClassArity == 1) deriveSingleParameter else if (typeClass == defn.CanEqualClass) deriveCanEqual else if (typeClassArity == 0) - report.error(i"type ${typeClass.name} in derives clause of ${cls.name} has no type parameters", derived.srcPos) + report.error(em"type ${typeClass.name} in derives clause of ${cls.name} has no type parameters", derived.srcPos) else cannotBeUnified } diff --git a/compiler/src/dotty/tools/dotc/typer/Docstrings.scala b/compiler/src/dotty/tools/dotc/typer/Docstrings.scala index 5fefd355d7d8..d819528ff556 100644 --- a/compiler/src/dotty/tools/dotc/typer/Docstrings.scala +++ b/compiler/src/dotty/tools/dotc/typer/Docstrings.scala @@ -37,7 +37,7 @@ object Docstrings { case List(df: tpd.DefDef) => usecase.typed(df) case _ => - report.error("`@usecase` was not a valid definition", ctx.source.atSpan(usecase.codePos)) + report.error(em"`@usecase` was not a valid definition", ctx.source.atSpan(usecase.codePos)) usecase } } diff --git a/compiler/src/dotty/tools/dotc/typer/Dynamic.scala b/compiler/src/dotty/tools/dotc/typer/Dynamic.scala index 1630ce31e4c6..b69d83b2dcd5 100644 --- a/compiler/src/dotty/tools/dotc/typer/Dynamic.scala +++ b/compiler/src/dotty/tools/dotc/typer/Dynamic.scala @@ -80,7 +80,7 @@ trait Dynamic { val args = tree.args val dynName = if (args.exists(isNamedArg)) nme.applyDynamicNamed else nme.applyDynamic if (dynName == nme.applyDynamicNamed && untpd.isWildcardStarArgList(args)) - errorTree(tree, "applyDynamicNamed does not support passing a vararg parameter") + errorTree(tree, em"applyDynamicNamed does not support passing a vararg parameter") else { def namedArgTuple(name: String, arg: untpd.Tree) = untpd.Tuple(List(Literal(Constant(name)), arg)) def namedArgs = args.map { diff --git a/compiler/src/dotty/tools/dotc/typer/ErrorReporting.scala b/compiler/src/dotty/tools/dotc/typer/ErrorReporting.scala index 3034253adb61..32b5fde689ec 100644 --- a/compiler/src/dotty/tools/dotc/typer/ErrorReporting.scala +++ b/compiler/src/dotty/tools/dotc/typer/ErrorReporting.scala @@ -10,11 +10,9 @@ import Trees._ import NameOps._ import util.SrcPos import config.Feature -import java.util.regex.Matcher.quoteReplacement import reporting._ import collection.mutable -import scala.util.matching.Regex object ErrorReporting { @@ -26,9 +24,6 @@ object ErrorReporting { def errorTree(tree: untpd.Tree, msg: Message)(using Context): tpd.Tree = errorTree(tree, msg, tree.srcPos) - def errorTree(tree: untpd.Tree, msg: => String)(using Context): tpd.Tree = - errorTree(tree, msg.toMessage) - def errorTree(tree: untpd.Tree, msg: TypeError, pos: SrcPos)(using Context): tpd.Tree = tree.withType(errorType(msg, pos)) @@ -37,9 +32,6 @@ object ErrorReporting { ErrorType(msg) } - def errorType(msg: => String, pos: SrcPos)(using Context): ErrorType = - errorType(msg.toMessage, pos) - def errorType(ex: TypeError, pos: SrcPos)(using Context): ErrorType = { report.error(ex, pos) ErrorType(ex.toMessage) @@ -87,18 +79,18 @@ object ErrorReporting { def expectedTypeStr(tp: Type): String = tp match { case tp: PolyProto => - em"type arguments [${tp.targs.tpes}%, %] and ${expectedTypeStr(revealDeepenedArgs(tp.resultType))}" + i"type arguments [${tp.targs.tpes}%, %] and ${expectedTypeStr(revealDeepenedArgs(tp.resultType))}" case tp: FunProto => def argStr(tp: FunProto): String = val result = 
revealDeepenedArgs(tp.resultType) match { case restp: FunProto => argStr(restp) case _: WildcardType | _: IgnoredProto => "" - case tp => em" and expected result type $tp" + case tp => i" and expected result type $tp" } - em"(${tp.typedArgs().tpes}%, %)$result" + i"(${tp.typedArgs().tpes}%, %)$result" s"arguments ${argStr(tp)}" case _ => - em"expected type $tp" + i"expected type $tp" } def anonymousTypeMemberStr(tpe: Type): String = { @@ -107,12 +99,12 @@ object ErrorReporting { case _: MethodOrPoly => "method" case _ => "value of type" } - em"$kind $tpe" + i"$kind $tpe" } def overloadedAltsStr(alts: List[SingleDenotation]): String = - em"overloaded alternatives of ${denotStr(alts.head)} with types\n" + - em" ${alts map (_.info)}%\n %" + i"""overloaded alternatives of ${denotStr(alts.head)} with types + | ${alts map (_.info)}%\n %""" def denotStr(denot: Denotation): String = if (denot.isOverloaded) overloadedAltsStr(denot.alternatives) @@ -130,13 +122,30 @@ object ErrorReporting { case _ => anonymousTypeMemberStr(tp) } + /** Explain info of symbol `sym` as a member of class `base`. + * @param showLocation if true also show sym's location. + */ + def infoString(sym: Symbol, base: Type, showLocation: Boolean): String = + val sym1 = sym.underlyingSymbol + def info = base.memberInfo(sym1) + val infoStr = + if sym1.isAliasType then i", which equals ${info.bounds.hi}" + else if sym1.isAbstractOrParamType && info != TypeBounds.empty then i" with bounds$info" + else if sym1.is(Module) then "" + else if sym1.isTerm then i" of type $info" + else "" + i"${if showLocation then sym1.showLocated else sym1}$infoStr" + + def infoStringWithLocation(sym: Symbol, base: Type) = + infoString(sym, base, showLocation = true) + def exprStr(tree: Tree): String = refStr(tree.tpe) - def takesNoParamsStr(tree: Tree, kind: String): String = + def takesNoParamsMsg(tree: Tree, kind: String): Message = if (tree.tpe.widen.exists) - i"${exprStr(tree)} does not take ${kind}parameters" + em"${exprStr(tree)} does not take ${kind}parameters" else { - i"undefined: $tree # ${tree.uniqueId}: ${tree.tpe.toString} at ${ctx.phase}" + em"undefined: $tree # ${tree.uniqueId}: ${tree.tpe.toString} at ${ctx.phase}" } def patternConstrStr(tree: Tree): String = ??? 
@@ -187,7 +196,9 @@ object ErrorReporting { |The tests were made under $constraintText""" def whyFailedStr(fail: FailedExtension) = - i""" failed with + i""" + | + | failed with: | |${fail.whyFailed.message.indented(8)}""" @@ -255,201 +266,9 @@ object ErrorReporting { ownerSym.typeRef.nonClassTypeMembers.map(_.symbol) }.toList - def dependentStr = + def dependentMsg = """Term-dependent types are experimental, - |they must be enabled with a `experimental.dependent` language import or setting""".stripMargin + |they must be enabled with a `experimental.dependent` language import or setting""".stripMargin.toMessage def err(using Context): Errors = new Errors } - -class ImplicitSearchError( - arg: tpd.Tree, - pt: Type, - where: String, - paramSymWithMethodCallTree: Option[(Symbol, tpd.Tree)] = None, - ignoredInstanceNormalImport: => Option[SearchSuccess], - importSuggestionAddendum: => String -)(using ctx: Context) { - - def missingArgMsg = arg.tpe match { - case ambi: AmbiguousImplicits => - (ambi.alt1, ambi.alt2) match { - case (alt @ AmbiguousImplicitMsg(msg), _) => - userDefinedAmbiguousImplicitMsg(alt, msg) - case (_, alt @ AmbiguousImplicitMsg(msg)) => - userDefinedAmbiguousImplicitMsg(alt, msg) - case _ => - defaultAmbiguousImplicitMsg(ambi) - } - case ambi @ TooUnspecific(target) => - ex"""No implicit search was attempted${location("for")} - |since the expected type $target is not specific enough""" - case _ => - val shortMessage = userDefinedImplicitNotFoundParamMessage - .orElse(userDefinedImplicitNotFoundTypeMessage) - .getOrElse(defaultImplicitNotFoundMessage) - formatMsg(shortMessage)() - ++ hiddenImplicitsAddendum - ++ ErrorReporting.matchReductionAddendum(pt) - } - - private def formatMsg(shortForm: String)(headline: String = shortForm) = arg match - case arg: Trees.SearchFailureIdent[?] => - arg.tpe match - case _: NoMatchingImplicits => headline - case tpe: SearchFailureType => - i"$headline. ${tpe.explanation}" - case _ => headline - case _ => - arg.tpe match - case tpe: SearchFailureType => - val original = arg match - case Inlined(call, _, _) => call - case _ => arg - i"""$headline. - |I found: - | - | ${original.show.replace("\n", "\n ")} - | - |But ${tpe.explanation}.""" - case _ => headline - - /** Format `raw` implicitNotFound or implicitAmbiguous argument, replacing - * all occurrences of `${X}` where `X` is in `paramNames` with the - * corresponding shown type in `args`. - */ - private def userDefinedErrorString(raw: String, paramNames: List[String], args: List[Type]): String = { - def translate(name: String): Option[String] = { - val idx = paramNames.indexOf(name) - if (idx >= 0) Some(ex"${args(idx)}") else None - } - - """\$\{\s*([^}\s]+)\s*\}""".r.replaceAllIn(raw, (_: Regex.Match) match { - case Regex.Groups(v) => quoteReplacement(translate(v).getOrElse("")).nn - }) - } - - /** Extract a user defined error message from a symbol `sym` - * with an annotation matching the given class symbol `cls`. 
- */ - private def userDefinedMsg(sym: Symbol, cls: Symbol) = for { - ann <- sym.getAnnotation(cls) - msg <- ann.argumentConstantString(0) - } yield msg - - private def location(preposition: String) = if (where.isEmpty) "" else s" $preposition $where" - - private def defaultAmbiguousImplicitMsg(ambi: AmbiguousImplicits) = - s"Ambiguous given instances: ${ambi.explanation}${location("of")}" - - private def defaultImplicitNotFoundMessage = - ex"No given instance of type $pt was found${location("for")}" - - /** Construct a custom error message given an ambiguous implicit - * candidate `alt` and a user defined message `raw`. - */ - private def userDefinedAmbiguousImplicitMsg(alt: SearchSuccess, raw: String) = { - val params = alt.ref.underlying match { - case p: PolyType => p.paramNames.map(_.toString) - case _ => Nil - } - def resolveTypes(targs: List[tpd.Tree])(using Context) = - targs.map(a => Inferencing.fullyDefinedType(a.tpe, "type argument", a.srcPos)) - - // We can extract type arguments from: - // - a function call: - // @implicitAmbiguous("msg A=${A}") - // implicit def f[A](): String = ... - // implicitly[String] // found: f[Any]() - // - // - an eta-expanded function: - // @implicitAmbiguous("msg A=${A}") - // implicit def f[A](x: Int): String = ... - // implicitly[Int => String] // found: x => f[Any](x) - - val call = tpd.closureBody(alt.tree) // the tree itself if not a closure - val targs = tpd.typeArgss(call).flatten - val args = resolveTypes(targs)(using ctx.fresh.setTyperState(alt.tstate)) - userDefinedErrorString(raw, params, args) - } - - /** @param rawMsg Message template with variables, e.g. "Variable A is ${A}" - * @param sym Symbol of the annotated type or of the method whose parameter was annotated - * @param substituteType Function substituting specific types for abstract types associated with variables, e.g A -> Int - */ - private def formatAnnotationMessage(rawMsg: String, sym: Symbol, substituteType: Type => Type): String = { - val substitutableTypesSymbols = ErrorReporting.substitutableTypeSymbolsInScope(sym) - - userDefinedErrorString( - rawMsg, - paramNames = substitutableTypesSymbols.map(_.name.unexpandedName.toString), - args = substitutableTypesSymbols.map(_.typeRef).map(substituteType) - ) - } - - /** Extracting the message from a method parameter, e.g. in - * - * trait Foo - * - * def foo(implicit @annotation.implicitNotFound("Foo is missing") foo: Foo): Any = ??? - */ - private def userDefinedImplicitNotFoundParamMessage: Option[String] = paramSymWithMethodCallTree.flatMap { (sym, applTree) => - userDefinedMsg(sym, defn.ImplicitNotFoundAnnot).map { rawMsg => - val fn = tpd.funPart(applTree) - val targs = tpd.typeArgss(applTree).flatten - val methodOwner = fn.symbol.owner - val methodOwnerType = tpd.qualifier(fn).tpe - val methodTypeParams = fn.symbol.paramSymss.flatten.filter(_.isType) - val methodTypeArgs = targs.map(_.tpe) - val substituteType = (_: Type).asSeenFrom(methodOwnerType, methodOwner).subst(methodTypeParams, methodTypeArgs) - formatAnnotationMessage(rawMsg, sym.owner, substituteType) - } - } - - /** Extracting the message from a type, e.g. in - * - * @annotation.implicitNotFound("Foo is missing") - * trait Foo - * - * def foo(implicit foo: Foo): Any = ??? 
- */ - private def userDefinedImplicitNotFoundTypeMessage: Option[String] = - def recur(tp: Type): Option[String] = tp match - case tp: TypeRef => - val sym = tp.symbol - userDefinedImplicitNotFoundTypeMessage(sym).orElse(recur(tp.info)) - case tp: ClassInfo => - tp.baseClasses.iterator - .map(userDefinedImplicitNotFoundTypeMessage) - .find(_.isDefined).flatten - case tp: TypeProxy => - recur(tp.superType) - case tp: AndType => - recur(tp.tp1).orElse(recur(tp.tp2)) - case _ => - None - recur(pt) - - private def userDefinedImplicitNotFoundTypeMessage(sym: Symbol): Option[String] = - for - rawMsg <- userDefinedMsg(sym, defn.ImplicitNotFoundAnnot) - if Feature.migrateTo3 || sym != defn.Function1 - // Don't inherit "No implicit view available..." message if subtypes of Function1 are not treated as implicit conversions anymore - yield - val substituteType = (_: Type).asSeenFrom(pt, sym) - formatAnnotationMessage(rawMsg, sym, substituteType) - - private def hiddenImplicitsAddendum: String = - def hiddenImplicitNote(s: SearchSuccess) = - em"\n\nNote: ${s.ref.symbol.showLocated} was not considered because it was not imported with `import given`." - - val normalImports = ignoredInstanceNormalImport.map(hiddenImplicitNote) - - normalImports.getOrElse(importSuggestionAddendum) - end hiddenImplicitsAddendum - - private object AmbiguousImplicitMsg { - def unapply(search: SearchSuccess): Option[String] = - userDefinedMsg(search.ref.symbol, defn.ImplicitAmbiguousAnnot) - } -} diff --git a/compiler/src/dotty/tools/dotc/typer/Implicits.scala b/compiler/src/dotty/tools/dotc/typer/Implicits.scala index 42c78dcfb32c..03d3011b4bcd 100644 --- a/compiler/src/dotty/tools/dotc/typer/Implicits.scala +++ b/compiler/src/dotty/tools/dotc/typer/Implicits.scala @@ -435,20 +435,15 @@ object Implicits: final protected def qualify(using Context): String = expectedType match { case SelectionProto(name, mproto, _, _) if !argument.isEmpty => - em"provide an extension method `$name` on ${argument.tpe}" + i"provide an extension method `$name` on ${argument.tpe}" case NoType => - if (argument.isEmpty) em"match expected type" - else em"convert from ${argument.tpe} to expected type" + if (argument.isEmpty) i"match expected type" + else i"convert from ${argument.tpe} to expected type" case _ => - if (argument.isEmpty) em"match type ${clarify(expectedType)}" - else em"convert from ${argument.tpe} to ${clarify(expectedType)}" + if (argument.isEmpty) i"match type ${clarify(expectedType)}" + else i"convert from ${argument.tpe} to ${clarify(expectedType)}" } - /** An explanation of the cause of the failure as a string */ - def explanation(using Context): String - - def msg(using Context): Message = explanation.toMessage - /** If search was for an implicit conversion, a note describing the failure * in more detail - this is either empty or starts with a '\n' */ @@ -488,8 +483,9 @@ object Implicits: map(tp) } - def explanation(using Context): String = + def msg(using Context): Message = em"no implicit values were found that $qualify" + override def toString = s"NoMatchingImplicits($expectedType, $argument)" } @@ -509,20 +505,20 @@ object Implicits: i""" |Note that implicit conversions were not tried because the result of an implicit conversion |must be more specific than $target""" - override def explanation(using Context) = - i"""${super.explanation}. 
- |The expected type $target is not specific enough, so no search was attempted""" + + override def msg(using Context) = + super.msg.append("\nThe expected type $target is not specific enough, so no search was attempted") override def toString = s"TooUnspecific" /** An ambiguous implicits failure */ class AmbiguousImplicits(val alt1: SearchSuccess, val alt2: SearchSuccess, val expectedType: Type, val argument: Tree) extends SearchFailureType { - def explanation(using Context): String = + def msg(using Context): Message = var str1 = err.refStr(alt1.ref) var str2 = err.refStr(alt2.ref) if str1 == str2 then str1 = ctx.printer.toTextRef(alt1.ref).show str2 = ctx.printer.toTextRef(alt2.ref).show - em"both $str1 and $str2 $qualify" + em"both $str1 and $str2 $qualify".withoutDisambiguation() override def whyNoConversion(using Context): String = if !argument.isEmpty && argument.tpe.widen.isRef(defn.NothingClass) then "" @@ -536,21 +532,21 @@ object Implicits: class MismatchedImplicit(ref: TermRef, val expectedType: Type, val argument: Tree) extends SearchFailureType { - def explanation(using Context): String = + def msg(using Context): Message = em"${err.refStr(ref)} does not $qualify" } class DivergingImplicit(ref: TermRef, val expectedType: Type, val argument: Tree) extends SearchFailureType { - def explanation(using Context): String = + def msg(using Context): Message = em"${err.refStr(ref)} produces a diverging implicit search when trying to $qualify" } /** A search failure type for attempted ill-typed extension method calls */ class FailedExtension(extApp: Tree, val expectedType: Type, val whyFailed: Message) extends SearchFailureType: def argument = EmptyTree - def explanation(using Context) = em"$extApp does not $qualify" + def msg(using Context) = em"$extApp does not $qualify" /** A search failure type for aborted searches of extension methods, typically * because of a cyclic reference or similar. 
@@ -558,7 +554,6 @@ object Implicits: class NestedFailure(_msg: Message, val expectedType: Type) extends SearchFailureType: def argument = EmptyTree override def msg(using Context) = _msg - def explanation(using Context) = msg.toString /** A search failure type for failed synthesis of terms for special types */ class SynthesisFailure(reasons: List[String], val expectedType: Type) extends SearchFailureType: @@ -568,9 +563,9 @@ object Implicits: if reasons.length > 1 then reasons.mkString("\n\t* ", "\n\t* ", "") else - reasons.mkString + reasons.mkString(" ", "", "") - def explanation(using Context) = em"Failed to synthesize an instance of type ${clarify(expectedType)}: ${formatReasons}" + def msg(using Context) = em"Failed to synthesize an instance of type ${clarify(expectedType)}:${formatReasons}" end Implicits @@ -635,7 +630,7 @@ trait ImplicitRunInfo: def apply(tp: Type): collection.Set[Type] = parts = mutable.LinkedHashSet() - partSeen.clear() + partSeen.clear(resetToInitial = false) traverse(tp) parts end collectParts @@ -851,7 +846,7 @@ trait Implicits: inferred match { case SearchSuccess(_, ref, _, false) if isOldStyleFunctionConversion(ref.underlying) => report.migrationWarning( - i"The conversion ${ref} will not be applied implicitly here in Scala 3 because only implicit methods and instances of Conversion class will continue to work as implicit views.", + em"The conversion ${ref} will not be applied implicitly here in Scala 3 because only implicit methods and instances of Conversion class will continue to work as implicit views.", from ) case _ => @@ -905,7 +900,7 @@ trait Implicits: pt: Type, where: String, paramSymWithMethodCallTree: Option[(Symbol, Tree)] = None - )(using Context): String = { + )(using Context): Message = { def findHiddenImplicitsCtx(c: Context): Context = if c == NoContext then c else c.freshOver(findHiddenImplicitsCtx(c.outer)).addMode(Mode.FindHiddenImplicits) @@ -928,8 +923,7 @@ trait Implicits: // example where searching for a nested type causes an infinite loop. 
None - val error = new ImplicitSearchError(arg, pt, where, paramSymWithMethodCallTree, ignoredInstanceNormalImport, importSuggestionAddendum(pt)) - error.missingArgMsg + MissingImplicitArgument(arg, pt, where, paramSymWithMethodCallTree, ignoredInstanceNormalImport) } /** A string indicating the formal parameter corresponding to a missing argument */ @@ -939,10 +933,10 @@ trait Implicits: val qt = qual.tpe.widen val qt1 = qt.dealiasKeepAnnots def addendum = if (qt1 eq qt) "" else (i"\nThe required type is an alias of: $qt1") - em"parameter of ${qual.tpe.widen}$addendum" + i"parameter of ${qual.tpe.widen}$addendum" case _ => - em"${ if paramName.is(EvidenceParamName) then "an implicit parameter" - else s"parameter $paramName" } of $methodStr" + i"${ if paramName.is(EvidenceParamName) then "an implicit parameter" + else s"parameter $paramName" } of $methodStr" } /** A CanEqual[T, U] instance is assumed @@ -1037,7 +1031,7 @@ trait Implicits: if result.tstate ne ctx.typerState then result.tstate.commit() if result.gstate ne ctx.gadt then - ctx.gadt.restore(result.gstate) + ctx.gadtState.restore(result.gstate) if hasSkolem(false, result.tree) then report.error(SkolemInInferred(result.tree, pt, argument), ctx.source.atSpan(span)) implicits.println(i"success: $result") @@ -1050,7 +1044,8 @@ trait Implicits: withMode(Mode.OldOverloadingResolution)(inferImplicit(pt, argument, span)) match { case altResult: SearchSuccess => report.migrationWarning( - s"According to new implicit resolution rules, this will be ambiguous:\n${result.reason.explanation}", + result.reason.msg + .prepend(s"According to new implicit resolution rules, this will be ambiguous:\n"), ctx.source.atSpan(span)) altResult case _ => @@ -1357,13 +1352,13 @@ trait Implicits: def warnAmbiguousNegation(ambi: AmbiguousImplicits) = report.migrationWarning( - i"""Ambiguous implicits ${ambi.alt1.ref.symbol.showLocated} and ${ambi.alt2.ref.symbol.showLocated} - |seem to be used to implement a local failure in order to negate an implicit search. - |According to the new implicit resolution rules this is no longer possible; - |the search will fail with a global ambiguity error instead. - | - |Consider using the scala.util.NotGiven class to implement similar functionality.""", - srcPos) + em"""Ambiguous implicits ${ambi.alt1.ref.symbol.showLocated} and ${ambi.alt2.ref.symbol.showLocated} + |seem to be used to implement a local failure in order to negate an implicit search. + |According to the new implicit resolution rules this is no longer possible; + |the search will fail with a global ambiguity error instead. + | + |Consider using the scala.util.NotGiven class to implement similar functionality.""", + srcPos) /** Compare the length of the baseClasses of two symbols (except for objects, * where we use the length of the companion class instead if it's bigger). 
diff --git a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala index 27b83e025cf9..3442207653d4 100644 --- a/compiler/src/dotty/tools/dotc/typer/Inferencing.scala +++ b/compiler/src/dotty/tools/dotc/typer/Inferencing.scala @@ -6,15 +6,14 @@ import core._ import ast._ import Contexts._, Types._, Flags._, Symbols._ import ProtoTypes._ -import NameKinds.{AvoidNameKind, UniqueName} +import NameKinds.UniqueName import util.Spans._ -import util.{Stats, SimpleIdentityMap, SrcPos} +import util.{Stats, SimpleIdentityMap, SimpleIdentitySet, SrcPos} import Decorators._ import config.Printers.{gadts, typr} import annotation.tailrec import reporting._ import collection.mutable - import scala.annotation.internal.sharable object Inferencing { @@ -27,12 +26,8 @@ object Inferencing { * but only if the overall result of `isFullyDefined` is `true`. * Variables that are successfully minimized do not count as uninstantiated. */ - def isFullyDefined(tp: Type, force: ForceDegree.Value)(using Context): Boolean = { - val nestedCtx = ctx.fresh.setNewTyperState() - val result = new IsFullyDefinedAccumulator(force)(using nestedCtx).process(tp) - if (result) nestedCtx.typerState.commit() - result - } + def isFullyDefined(tp: Type, force: ForceDegree.Value)(using Context): Boolean = + withFreshTyperState(new IsFullyDefinedAccumulator(force).process(tp), x => x) /** Try to fully define `tp`. Return whether constraint has changed. * Any changed constraint is kept. @@ -267,7 +262,7 @@ object Inferencing { && ctx.gadt.contains(tp.symbol) => val sym = tp.symbol - val res = ctx.gadt.approximation(sym, fromBelow = variance < 0) + val res = ctx.gadtState.approximation(sym, fromBelow = variance < 0) gadts.println(i"approximated $tp ~~ $res") res @@ -418,7 +413,7 @@ object Inferencing { if safeToInstantiate then tvar.instantiate(fromBelow = v == -1) else { val bounds = TypeComparer.fullBounds(tvar.origin) - if bounds.hi <:< bounds.lo || bounds.hi.classSymbol.is(Final) then + if (bounds.hi frozen_<:< bounds.lo) || bounds.hi.classSymbol.is(Final) then tvar.instantiate(fromBelow = false) else { // We do not add the created symbols to GADT constraint immediately, since they may have inter-dependencies. @@ -437,7 +432,7 @@ object Inferencing { } // We add the created symbols to GADT constraint here. - if (res.nonEmpty) ctx.gadt.addToConstraint(res) + if (res.nonEmpty) ctx.gadtState.addToConstraint(res) res } @@ -574,7 +569,7 @@ trait Inferencing { this: Typer => * Then `Y` also occurs co-variantly in `T` because it needs to be minimized in order to constrain * `T` the least. See `variances` for more detail. */ - def interpolateTypeVars(tree: Tree, pt: Type, locked: TypeVars)(using Context): tree.type = { + def interpolateTypeVars(tree: Tree, pt: Type, locked: TypeVars)(using Context): tree.type = val state = ctx.typerState // Note that some variables in `locked` might not be in `state.ownedVars` @@ -583,7 +578,7 @@ trait Inferencing { this: Typer => // `qualifying`. 
val ownedVars = state.ownedVars - if ((ownedVars ne locked) && !ownedVars.isEmpty) { + if (ownedVars ne locked) && !ownedVars.isEmpty then val qualifying = ownedVars -- locked if (!qualifying.isEmpty) { typr.println(i"interpolate $tree: ${tree.tpe.widen} in $state, pt = $pt, owned vars = ${state.ownedVars.toList}%, %, qualifying = ${qualifying.toList}%, %, previous = ${locked.toList}%, % / ${state.constraint}") @@ -619,44 +614,67 @@ trait Inferencing { this: Typer => if state.reporter.hasUnreportedErrors then return tree def constraint = state.constraint - type InstantiateQueue = mutable.ListBuffer[(TypeVar, Boolean)] - val toInstantiate = new InstantiateQueue - for tvar <- qualifying do - if !tvar.isInstantiated && constraint.contains(tvar) && tvar.nestingLevel >= ctx.nestingLevel then - constrainIfDependentParamRef(tvar, tree) - // Needs to be checked again, since previous interpolations could already have - // instantiated `tvar` through unification. - val v = vs(tvar) - if v == null then - // Even though `tvar` is non-occurring in `v`, the specific - // instantiation we pick still matters because `tvar` might appear - // in the bounds of a non-`qualifying` type variable in the - // constraint. - // In particular, if `tvar` was created as the upper or lower - // bound of an existing variable by `LevelAvoidMap`, we - // instantiate it in the direction corresponding to the - // original variable which might be further constrained later. - // Otherwise, we simply rely on `hasLowerBound`. - val name = tvar.origin.paramName - val fromBelow = - name.is(AvoidNameKind.UpperBound) || - !name.is(AvoidNameKind.LowerBound) && tvar.hasLowerBound - typr.println(i"interpolate non-occurring $tvar in $state in $tree: $tp, fromBelow = $fromBelow, $constraint") - toInstantiate += ((tvar, fromBelow)) - else if v.intValue != 0 then - typr.println(i"interpolate $tvar in $state in $tree: $tp, fromBelow = ${v.intValue == 1}, $constraint") - toInstantiate += ((tvar, v.intValue == 1)) - else comparing(cmp => - if !cmp.levelOK(tvar.nestingLevel, ctx.nestingLevel) then - // Invariant: The type of a tree whose enclosing scope is level - // N only contains type variables of level <= N. - typr.println(i"instantiate nonvariant $tvar of level ${tvar.nestingLevel} to a type variable of level <= ${ctx.nestingLevel}, $constraint") - cmp.atLevel(ctx.nestingLevel, tvar.origin) - else - typr.println(i"no interpolation for nonvariant $tvar in $state") - ) - /** Instantiate all type variables in `buf` in the indicated directions. + /** Values of this type report type variables to instantiate with variance indication: + * +1 variable appears covariantly, can be instantiated from lower bound + * -1 variable appears contravariantly, can be instantiated from upper bound + * 0 variable does not appear at all, can be instantiated from either bound + */ + type ToInstantiate = List[(TypeVar, Int)] + + val toInstantiate: ToInstantiate = + val buf = new mutable.ListBuffer[(TypeVar, Int)] + for tvar <- qualifying do + if !tvar.isInstantiated && constraint.contains(tvar) && tvar.nestingLevel >= ctx.nestingLevel then + constrainIfDependentParamRef(tvar, tree) + if !tvar.isInstantiated then + // isInstantiated needs to be checked again, since previous interpolations could already have + // instantiated `tvar` through unification. 
+ val v = vs(tvar) + if v == null then buf += ((tvar, 0)) + else if v.intValue != 0 then buf += ((tvar, v.intValue)) + else comparing(cmp => + if !cmp.levelOK(tvar.nestingLevel, ctx.nestingLevel) then + // Invariant: The type of a tree whose enclosing scope is level + // N only contains type variables of level <= N. + typr.println(i"instantiate nonvariant $tvar of level ${tvar.nestingLevel} to a type variable of level <= ${ctx.nestingLevel}, $constraint") + cmp.atLevel(ctx.nestingLevel, tvar.origin) + else + typr.println(i"no interpolation for nonvariant $tvar in $state") + ) + buf.toList + + def typeVarsIn(xs: ToInstantiate): TypeVars = + xs.foldLeft(SimpleIdentitySet.empty: TypeVars)((tvs, tvi) => tvs + tvi._1) + + /** Filter list of proposed instantiations so that they don't constrain further + * the current constraint. + */ + def filterByDeps(tvs0: ToInstantiate): ToInstantiate = + val excluded = // ignore dependencies from other variables that are being instantiated + typeVarsIn(tvs0) + def step(tvs: ToInstantiate): ToInstantiate = tvs match + case tvs @ (hd @ (tvar, v)) :: tvs1 => + def aboveOK = !constraint.dependsOn(tvar, excluded, co = true) + def belowOK = !constraint.dependsOn(tvar, excluded, co = false) + if v == 0 && !aboveOK then + step((tvar, 1) :: tvs1) + else if v == 0 && !belowOK then + step((tvar, -1) :: tvs1) + else if v == -1 && !aboveOK || v == 1 && !belowOK then + typr.println(i"drop $tvar, $v in $tp, $pt, qualifying = ${qualifying.toList}, tvs0 = ${tvs0.toList}%, %, excluded = ${excluded.toList}, $constraint") + step(tvs1) + else // no conflict, keep the instantiation proposal + tvs.derivedCons(hd, step(tvs1)) + case Nil => + Nil + val tvs1 = step(tvs0) + if tvs1 eq tvs0 then tvs1 + else filterByDeps(tvs1) // filter again with smaller excluded set + end filterByDeps + + /** Instantiate all type variables in `tvs` in the indicated directions, + * as described in the doc comment of `ToInstantiate`. * If a type variable A is instantiated from below, and there is another * type variable B in `buf` that is known to be smaller than A, wait and * instantiate all other type variables before trying to instantiate A again. 
@@ -685,29 +703,37 @@ trait Inferencing { this: Typer => * * V2 := V3, O2 := O3 */ - def doInstantiate(buf: InstantiateQueue): Unit = - if buf.nonEmpty then - val suspended = new InstantiateQueue - while buf.nonEmpty do - val first @ (tvar, fromBelow) = buf.head - buf.dropInPlace(1) - if !tvar.isInstantiated then - val suspend = buf.exists{ (following, _) => - if fromBelow then - constraint.isLess(following.origin, tvar.origin) - else - constraint.isLess(tvar.origin, following.origin) + def doInstantiate(tvs: ToInstantiate): Unit = + + /** Try to instantiate `tvs`, return any suspended type variables */ + def tryInstantiate(tvs: ToInstantiate): ToInstantiate = tvs match + case (hd @ (tvar, v)) :: tvs1 => + val fromBelow = v == 1 || (v == 0 && tvar.hasLowerBound) + typr.println( + i"interpolate${if v == 0 then " non-occurring" else ""} $tvar in $state in $tree: $tp, fromBelow = $fromBelow, $constraint") + if tvar.isInstantiated then + tryInstantiate(tvs1) + else + val suspend = tvs1.exists{ (following, _) => + if fromBelow + then constraint.isLess(following.origin, tvar.origin) + else constraint.isLess(tvar.origin, following.origin) } - if suspend then suspended += first else tvar.instantiate(fromBelow) - end if - end while - doInstantiate(suspended) + if suspend then + typr.println(i"suspended: $hd") + hd :: tryInstantiate(tvs1) + else + tvar.instantiate(fromBelow) + tryInstantiate(tvs1) + case Nil => Nil + if tvs.nonEmpty then doInstantiate(tryInstantiate(tvs)) end doInstantiate - doInstantiate(toInstantiate) + + doInstantiate(filterByDeps(toInstantiate)) } - } + end if tree - } + end interpolateTypeVars /** If `tvar` represents a parameter of a dependent method type in the current `call` * approximate it from below with the type of the actual argument. Skolemize that diff --git a/compiler/src/dotty/tools/dotc/typer/Namer.scala b/compiler/src/dotty/tools/dotc/typer/Namer.scala index ad8d0e50d348..6f85efb0fc8a 100644 --- a/compiler/src/dotty/tools/dotc/typer/Namer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Namer.scala @@ -201,7 +201,7 @@ class Namer { typer: Typer => case tree: MemberDef => SymDenotations.canBeLocal(tree.name, flags) case _ => false if !ok then - report.error(i"modifier(s) `${flags.flagsString}` incompatible with $kind definition", tree.srcPos) + report.error(em"modifier(s) `${flags.flagsString}` incompatible with $kind definition", tree.srcPos) if adapted.is(Private) && canBeLocal then adapted | Local else adapted } @@ -461,8 +461,8 @@ class Namer { typer: Typer => val isProvisional = parents.exists(!_.baseType(defn.AnyClass).exists) if isProvisional then typr.println(i"provisional superclass $first for $cls") - first = AnnotatedType(first, Annotation(defn.ProvisionalSuperClassAnnot)) - checkFeasibleParent(first, cls.srcPos, em" in inferred superclass $first") :: parents + first = AnnotatedType(first, Annotation(defn.ProvisionalSuperClassAnnot, cls.span)) + checkFeasibleParent(first, cls.srcPos, i" in inferred superclass $first") :: parents end ensureFirstIsClass /** Add child annotation for `child` to annotations of `cls`. 
The annotation @@ -762,7 +762,7 @@ class Namer { typer: Typer => } def missingType(sym: Symbol, modifier: String)(using Context): Unit = { - report.error(s"${modifier}type of implicit definition needs to be given explicitly", sym.srcPos) + report.error(em"${modifier}type of implicit definition needs to be given explicitly", sym.srcPos) sym.resetFlag(GivenOrImplicit) } @@ -831,9 +831,9 @@ class Namer { typer: Typer => for (annotTree <- original.mods.annotations) { val cls = typedAheadAnnotationClass(annotTree)(using annotCtx) if (cls eq sym) - report.error("An annotation class cannot be annotated with iself", annotTree.srcPos) + report.error(em"An annotation class cannot be annotated with iself", annotTree.srcPos) else { - val ann = Annotation.deferred(cls)(typedAheadAnnotation(annotTree)(using annotCtx)) + val ann = Annotation.deferred(cls)(typedAheadExpr(annotTree)(using annotCtx)) sym.addAnnotation(ann) } } @@ -1249,7 +1249,7 @@ class Namer { typer: Typer => val reason = mbrs.map(canForward(_, alias)).collect { case CanForward.No(whyNot) => i"\n$path.$name cannot be exported because it $whyNot" }.headOption.getOrElse("") - report.error(i"""no eligible member $name at $path$reason""", ctx.source.atSpan(span)) + report.error(em"""no eligible member $name at $path$reason""", ctx.source.atSpan(span)) else targets += alias @@ -1314,7 +1314,7 @@ class Namer { typer: Typer => case _ => 0 if cmp == 0 then report.error( - ex"""Clashing exports: The exported + em"""Clashing exports: The exported | ${forwarder.rhs.symbol}: ${alt1.widen} |and ${forwarder1.rhs.symbol}: ${alt2.widen} |have the same signature after erasure and overloading resolution could not disambiguate.""", @@ -1437,7 +1437,7 @@ class Namer { typer: Typer => case mt: MethodType if cls.is(Case) && mt.isParamDependent => // See issue #8073 for background report.error( - i"""Implementation restriction: case classes cannot have dependencies between parameters""", + em"""Implementation restriction: case classes cannot have dependencies between parameters""", cls.srcPos) case _ => @@ -1618,15 +1618,14 @@ class Namer { typer: Typer => def typedAheadExpr(tree: Tree, pt: Type = WildcardType)(using Context): tpd.Tree = typedAhead(tree, typer.typedExpr(_, pt)) - def typedAheadAnnotation(tree: Tree)(using Context): tpd.Tree = - typedAheadExpr(tree, defn.AnnotationClass.typeRef) - - def typedAheadAnnotationClass(tree: Tree)(using Context): Symbol = tree match { + def typedAheadAnnotationClass(tree: Tree)(using Context): Symbol = tree match case Apply(fn, _) => typedAheadAnnotationClass(fn) case TypeApply(fn, _) => typedAheadAnnotationClass(fn) case Select(qual, nme.CONSTRUCTOR) => typedAheadAnnotationClass(qual) case New(tpt) => typedAheadType(tpt).tpe.classSymbol - } + case TypedSplice(_) => + val sym = tree.symbol + if sym.isConstructor then sym.owner else sym /** Enter and typecheck parameter list */ def completeParams(params: List[MemberDef])(using Context): Unit = { @@ -1690,8 +1689,10 @@ class Namer { typer: Typer => if !Config.checkLevelsOnConstraints then val hygienicType = TypeOps.avoid(rhsType, termParamss.flatten) if (!hygienicType.isValueType || !(hygienicType <:< tpt.tpe)) - report.error(i"return type ${tpt.tpe} of lambda cannot be made hygienic;\n" + - i"it is not a supertype of the hygienic type $hygienicType", mdef.srcPos) + report.error( + em"""return type ${tpt.tpe} of lambda cannot be made hygienic + |it is not a supertype of the hygienic type $hygienicType""", + mdef.srcPos) //println(i"lifting $rhsType over $termParamss -> 
$hygienicType = ${tpt.tpe}") //println(TypeComparer.explained { implicit ctx => hygienicType <:< tpt.tpe }) case _ => @@ -1863,7 +1864,7 @@ class Namer { typer: Typer => // so we must allow constraining its type parameters // compare with typedDefDef, see tests/pos/gadt-inference.scala rhsCtx.setFreshGADTBounds - rhsCtx.gadt.addToConstraint(typeParams) + rhsCtx.gadtState.addToConstraint(typeParams) } def typedAheadRhs(pt: Type) = @@ -1882,7 +1883,7 @@ class Namer { typer: Typer => // larger choice of overrides (see `default-getter.scala`). // For justification on the use of `@uncheckedVariance`, see // `default-getter-variance.scala`. - AnnotatedType(defaultTp, Annotation(defn.UncheckedVarianceAnnot)) + AnnotatedType(defaultTp, Annotation(defn.UncheckedVarianceAnnot, sym.span)) else // don't strip @uncheckedVariance annot for default getters TypeOps.simplify(tp.widenTermRefExpr, diff --git a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala index 6fb019ee057c..8ba842ad695f 100644 --- a/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala +++ b/compiler/src/dotty/tools/dotc/typer/ProtoTypes.scala @@ -295,6 +295,8 @@ object ProtoTypes { */ @sharable object AnySelectionProto extends SelectionProto(nme.WILDCARD, WildcardType, NoViewsAllowed, true) + @sharable object SingletonTypeProto extends SelectionProto(nme.WILDCARD, WildcardType, NoViewsAllowed, true) + /** A prototype for selections in pattern constructors */ class UnapplySelectionProto(name: Name) extends SelectionProto(name, WildcardType, NoViewsAllowed, true) diff --git a/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala b/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala index fa29f450be2a..65d8abfdf6a7 100644 --- a/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala +++ b/compiler/src/dotty/tools/dotc/typer/QuotesAndSplices.scala @@ -54,7 +54,7 @@ trait QuotesAndSplices { val msg = em"""Quoted types `'[..]` can only be used in patterns. | |Hint: To get a scala.quoted.Type[T] use scala.quoted.Type.of[T] instead. - |""".stripMargin + |""" report.error(msg, tree.srcPos) EmptyTree else @@ -87,7 +87,7 @@ trait QuotesAndSplices { ref(defn.QuotedRuntime_exprSplice).appliedToType(argType).appliedTo(pat) } else { - report.error(i"Type must be fully defined.\nConsider annotating the splice using a type ascription:\n ($tree: XYZ).", tree.expr.srcPos) + report.error(em"Type must be fully defined.\nConsider annotating the splice using a type ascription:\n ($tree: XYZ).", tree.expr.srcPos) tree.withType(UnspecifiedErrorType) } else { @@ -123,7 +123,7 @@ trait QuotesAndSplices { assert(ctx.mode.is(Mode.QuotedPattern)) val untpd.Apply(splice: untpd.Splice, args) = tree: @unchecked if !isFullyDefined(pt, ForceDegree.flipBottom) then - report.error(i"Type must be fully defined.", splice.srcPos) + report.error(em"Type must be fully defined.", splice.srcPos) tree.withType(UnspecifiedErrorType) else if splice.isInBraces then // ${x}(...) match an application val typedArgs = args.map(arg => typedExpr(arg)) @@ -172,10 +172,10 @@ trait QuotesAndSplices { report.error("Splice ${...} outside quotes '{...} or inline method", tree.srcPos) else if (level < 0) report.error( - s"""Splice $${...} at level $level. - | - |Inline method may contain a splice at level 0 but the contents of this splice cannot have a splice. - |""".stripMargin, tree.srcPos + em"""Splice $${...} at level $level. 
+ | + |Inline method may contain a splice at level 0 but the contents of this splice cannot have a splice. + |""", tree.srcPos ) /** Split a typed quoted pattern is split into its type bindings, pattern expression and inner patterns. @@ -263,7 +263,7 @@ trait QuotesAndSplices { transformTypeBindingTypeDef(PatMatGivenVarName.fresh(tdef.name.toTermName), tdef, typePatBuf) else if tdef.symbol.isClass then val kind = if tdef.symbol.is(Module) then "objects" else "classes" - report.error("Implementation restriction: cannot match " + kind, tree.srcPos) + report.error(em"Implementation restriction: cannot match $kind", tree.srcPos) EmptyTree else super.transform(tree) @@ -364,7 +364,7 @@ trait QuotesAndSplices { * * ``` * case scala.internal.quoted.Expr.unapply[ - * Tuple1[t @ _], // Type binging definition + * KList[t @ _, KNil], // Type binging definition * Tuple2[Type[t], Expr[List[t]]] // Typing the result of the pattern match * ]( * Tuple2.unapply @@ -411,7 +411,7 @@ trait QuotesAndSplices { val replaceBindings = new ReplaceBindings val patType = defn.tupleType(splices.tpes.map(tpe => replaceBindings(tpe.widen))) - val typeBindingsTuple = tpd.tupleTypeTree(typeBindings.values.toList) + val typeBindingsTuple = tpd.hkNestedPairsTypeTree(typeBindings.values.toList) val replaceBindingsInTree = new TreeMap { private var bindMap = Map.empty[Symbol, Symbol] diff --git a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala index 7099234c80e1..b53b2f9ec57a 100644 --- a/compiler/src/dotty/tools/dotc/typer/ReTyper.scala +++ b/compiler/src/dotty/tools/dotc/typer/ReTyper.scala @@ -71,7 +71,7 @@ class ReTyper(nestingLevel: Int = 0) extends Typer(nestingLevel) with ReChecking promote(tree) override def typedRefinedTypeTree(tree: untpd.RefinedTypeTree)(using Context): TypTree = - promote(TypeTree(tree.tpe).withSpan(tree.span)) + promote(TypeTree(tree.typeOpt).withSpan(tree.span)) override def typedExport(exp: untpd.Export)(using Context): Export = promote(exp) @@ -87,8 +87,8 @@ class ReTyper(nestingLevel: Int = 0) extends Typer(nestingLevel) with ReChecking // retract PatternOrTypeBits like in typedExpr withoutMode(Mode.PatternOrTypeBits)(typedUnadapted(tree.fun, AnyFunctionProto)) val implicits1 = tree.implicits.map(typedExpr(_)) - val patterns1 = tree.patterns.mapconserve(pat => typed(pat, pat.tpe)) - untpd.cpy.UnApply(tree)(fun1, implicits1, patterns1).withType(tree.tpe) + val patterns1 = tree.patterns.mapconserve(pat => typed(pat, pat.typeOpt)) + untpd.cpy.UnApply(tree)(fun1, implicits1, patterns1).withType(tree.typeOpt) } override def typedUnApply(tree: untpd.Apply, selType: Type)(using Context): Tree = diff --git a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala index 1aa53d866b5e..3d53371e603e 100644 --- a/compiler/src/dotty/tools/dotc/typer/RefChecks.scala +++ b/compiler/src/dotty/tools/dotc/typer/RefChecks.scala @@ -58,11 +58,9 @@ object RefChecks { // constructors of different classes are allowed to have defaults if (haveDefaults.exists(x => !x.isConstructor) || owners.distinct.size < haveDefaults.size) report.error( - "in " + clazz + - ", multiple overloaded alternatives of " + haveDefaults.head + - " define default arguments" + ( - if (owners.forall(_ == clazz)) "." 
- else ".\nThe members with defaults are defined in " + owners.map(_.showLocated).mkString("", " and ", ".")), + em"in $clazz, multiple overloaded alternatives of ${haveDefaults.head} define default arguments${ + if owners.forall(_ == clazz) then "." + else i".\nThe members with defaults are defined in ${owners.map(_.showLocated).mkString("", " and ", ".")}"}", clazz.srcPos) } } @@ -91,24 +89,39 @@ object RefChecks { cls.thisType } + /** - Check that self type of `cls` conforms to self types of all `parents` as seen from + * `cls.thisType` + * - If self type of `cls` is explicit, check that it conforms to the self types + * of all its class symbols. + * @param deep If true and a self type of a parent is not given explicitly, recurse to + * check against the parents of the parent. This is needed when capture checking, + * since we assume (& check) that the capture set of an inferred self type + * is the intersection of the capture sets of all its parents + */ + def checkSelfAgainstParents(cls: ClassSymbol, parents: List[Symbol])(using Context): Unit = + withMode(Mode.CheckBoundsOrSelfType) { + val cinfo = cls.classInfo + + def checkSelfConforms(other: ClassSymbol) = + val otherSelf = other.declaredSelfTypeAsSeenFrom(cls.thisType) + if otherSelf.exists then + if !(cinfo.selfType <:< otherSelf) then + report.error(DoesNotConformToSelfType("illegal inheritance", cinfo.selfType, cls, otherSelf, "parent", other), + cls.srcPos) + + for psym <- parents do + checkSelfConforms(psym.asClass) + } + end checkSelfAgainstParents + /** Check that self type of this class conforms to self types of parents * and required classes. Also check that only `enum` constructs extend * `java.lang.Enum` and no user-written class extends ContextFunctionN. */ def checkParents(cls: Symbol, parentTrees: List[Tree])(using Context): Unit = cls.info match { case cinfo: ClassInfo => - def checkSelfConforms(other: ClassSymbol, category: String, relation: String) = { - val otherSelf = other.declaredSelfTypeAsSeenFrom(cls.thisType) - if otherSelf.exists && !(cinfo.selfType <:< otherSelf) then - report.error(DoesNotConformToSelfType(category, cinfo.selfType, cls, otherSelf, relation, other), - cls.srcPos) - } val psyms = cls.asClass.parentSyms - for (psym <- psyms) - checkSelfConforms(psym.asClass, "illegal inheritance", "parent") - for reqd <- cinfo.cls.givenSelfType.classSymbols do - if reqd != cls then - checkSelfConforms(reqd, "missing requirement", "required") + checkSelfAgainstParents(cls.asClass, psyms) def isClassExtendingJavaEnum = !cls.isOneOf(Enum | Trait) && psyms.contains(defn.JavaEnumClass) @@ -221,9 +234,16 @@ object RefChecks { && inLinearizationOrder(sym1, sym2, parent) && !sym2.is(AbsOverride) - def checkAll(checkOverride: (Symbol, Symbol) => Unit) = + // Checks the subtype relationship tp1 <:< tp2. + // It is passed to the `checkOverride` operation in `checkAll`, to be used for + // compatibility checking. 
+ def checkSubType(tp1: Type, tp2: Type)(using Context): Boolean = tp1 frozen_<:< tp2 + + private val subtypeChecker: (Type, Type) => Context ?=> Boolean = this.checkSubType + + def checkAll(checkOverride: ((Type, Type) => Context ?=> Boolean, Symbol, Symbol) => Unit) = while hasNext do - checkOverride(overriding, overridden) + checkOverride(subtypeChecker, overriding, overridden) next() // The OverridingPairs cursor does assume that concrete overrides abstract @@ -237,7 +257,7 @@ object RefChecks { if dcl.is(Deferred) then for other <- dcl.allOverriddenSymbols do if !other.is(Deferred) then - checkOverride(dcl, other) + checkOverride(checkSubType, dcl, other) end checkAll end OverridingPairsChecker @@ -274,8 +294,11 @@ object RefChecks { * TODO check that classes are not overridden * TODO This still needs to be cleaned up; the current version is a straight port of what was there * before, but it looks too complicated and method bodies are far too large. + * + * @param makeOverridePairsChecker A function for creating a OverridePairsChecker instance + * from the class symbol and the self type */ - def checkAllOverrides(clazz: ClassSymbol)(using Context): Unit = { + def checkAllOverrides(clazz: ClassSymbol, makeOverridingPairsChecker: ((ClassSymbol, Type) => Context ?=> OverridingPairsChecker) | Null = null)(using Context): Unit = { val self = clazz.thisType val upwardsSelf = upwardsThisType(clazz) var hasErrors = false @@ -301,25 +324,22 @@ object RefChecks { report.error(msg.append(othersMsg), clazz.srcPos) } - def infoString(sym: Symbol) = infoString0(sym, sym.owner != clazz) - def infoStringWithLocation(sym: Symbol) = infoString0(sym, true) - - def infoString0(sym: Symbol, showLocation: Boolean) = { - val sym1 = sym.underlyingSymbol - def info = self.memberInfo(sym1) - val infoStr = - if (sym1.isAliasType) i", which equals ${info.bounds.hi}" - else if (sym1.isAbstractOrParamType && info != TypeBounds.empty) i" with bounds$info" - else if (sym1.is(Module)) "" - else if (sym1.isTerm) i" of type $info" - else "" - i"${if (showLocation) sym1.showLocated else sym1}$infoStr" - } + def infoString(sym: Symbol) = + err.infoString(sym, self, showLocation = sym.owner != clazz) + def infoStringWithLocation(sym: Symbol) = + err.infoString(sym, self, showLocation = true) + + def isInheritedAccessor(mbr: Symbol, other: Symbol): Boolean = + mbr.is(ParamAccessor) + && { + val next = ParamForwarding.inheritedAccessor(mbr) + next == other || isInheritedAccessor(next, other) + } /* Check that all conditions for overriding `other` by `member` - * of class `clazz` are met. - */ - def checkOverride(member: Symbol, other: Symbol): Unit = + * of class `clazz` are met. 
+ */ + def checkOverride(checkSubType: (Type, Type) => Context ?=> Boolean, member: Symbol, other: Symbol): Unit = def memberTp(self: Type) = if (member.isClass) TypeAlias(member.typeRef.EtaExpand(member.typeParams)) else self.memberInfo(member) @@ -329,27 +349,17 @@ object RefChecks { def noErrorType = !memberTp(self).isErroneous && !otherTp(self).isErroneous - def overrideErrorMsg(msg: String, compareTypes: Boolean = false): Message = { - val isConcreteOverAbstract = - (other.owner isSubClass member.owner) && other.is(Deferred) && !member.is(Deferred) - val addendum = - if isConcreteOverAbstract then - ";\n (Note that %s is abstract,\n and is therefore overridden by concrete %s)".format( - infoStringWithLocation(other), - infoStringWithLocation(member)) - else "" - val fullMsg = - s"error overriding ${infoStringWithLocation(other)};\n ${infoString(member)} $msg$addendum" - if compareTypes then OverrideTypeMismatchError(fullMsg, memberTp(self), otherTp(self)) - else OverrideError(fullMsg) - } + def overrideErrorMsg(core: Context ?=> String, compareTypes: Boolean = false): Message = + val (mtp, otp) = if compareTypes then (memberTp(self), otherTp(self)) else (NoType, NoType) + OverrideError(core, self, member, other, mtp, otp) def compatTypes(memberTp: Type, otherTp: Type): Boolean = try isOverridingPair(member, memberTp, other, otherTp, fallBack = warnOnMigration( overrideErrorMsg("no longer has compatible type"), - (if (member.owner == clazz) member else clazz).srcPos, version = `3.0`)) + (if (member.owner == clazz) member else clazz).srcPos, version = `3.0`), + isSubType = checkSubType) catch case ex: MissingType => // can happen when called with upwardsSelf as qualifier of memberTp and otherTp, // because in that case we might access types that are not members of the qualifier. @@ -361,7 +371,16 @@ object RefChecks { * Type members are always assumed to match. */ def trueMatch: Boolean = - member.isType || memberTp(self).matches(otherTp(self)) + member.isType || withMode(Mode.IgnoreCaptures) { + // `matches` does not perform box adaptation so the result here would be + // spurious during capture checking. + // + // Instead of parameterizing `matches` with the function for subtype checking + // with box adaptation, we simply ignore capture annotations here. + // This should be safe since the compatibility under box adaptation is already + // checked. 
+ memberTp(self).matches(otherTp(self)) + } def emitOverrideError(fullmsg: Message) = if (!(hasErrors && member.is(Synthetic) && member.is(Module))) { @@ -378,7 +397,7 @@ object RefChecks { def overrideDeprecation(what: String, member: Symbol, other: Symbol, fix: String): Unit = report.deprecationWarning( - s"overriding $what${infoStringWithLocation(other)} is deprecated;\n ${infoString(member)} should be $fix.", + em"overriding $what${infoStringWithLocation(other)} is deprecated;\n ${infoString(member)} should be $fix.", if member.owner == clazz then member.srcPos else clazz.srcPos) def autoOverride(sym: Symbol) = @@ -464,7 +483,7 @@ object RefChecks { if (autoOverride(member) || other.owner.isAllOf(JavaInterface) && warnOnMigration( - "`override` modifier required when a Java 8 default method is re-implemented".toMessage, + em"`override` modifier required when a Java 8 default method is re-implemented", member.srcPos, version = `3.0`)) member.setFlag(Override) else if (member.isType && self.memberInfo(member) =:= self.memberInfo(other)) @@ -496,7 +515,7 @@ object RefChecks { else if (member.is(ModuleVal) && !other.isRealMethod && !other.isOneOf(DeferredOrLazy)) overrideError("may not override a concrete non-lazy value") else if (member.is(Lazy, butNot = Module) && !other.isRealMethod && !other.is(Lazy) && - !warnOnMigration(overrideErrorMsg("may not override a non-lazy value"), member.srcPos, version = `3.0`)) + !warnOnMigration(overrideErrorMsg("may not override a non-lazy value"), member.srcPos, version = `3.0`)) overrideError("may not override a non-lazy value") else if (other.is(Lazy) && !other.isRealMethod && !member.is(Lazy)) overrideError("must be declared lazy to override a lazy value") @@ -521,7 +540,7 @@ object RefChecks { overrideError(i"cannot override val parameter ${other.showLocated}") else report.deprecationWarning( - i"overriding val parameter ${other.showLocated} is deprecated, will be illegal in a future version", + em"overriding val parameter ${other.showLocated} is deprecated, will be illegal in a future version", member.srcPos) else if !other.isExperimental && member.hasAnnotation(defn.ExperimentalAnnot) then // (1.12) overrideError("may not override non-experimental member") @@ -529,14 +548,8 @@ object RefChecks { overrideDeprecation("", member, other, "removed or renamed") end checkOverride - def isInheritedAccessor(mbr: Symbol, other: Symbol): Boolean = - mbr.is(ParamAccessor) - && { - val next = ParamForwarding.inheritedAccessor(mbr) - next == other || isInheritedAccessor(next, other) - } - - OverridingPairsChecker(clazz, self).checkAll(checkOverride) + val checker = if makeOverridingPairsChecker == null then OverridingPairsChecker(clazz, self) else makeOverridingPairsChecker(clazz, self) + checker.checkAll(checkOverride) printMixinOverrideErrors() // Verifying a concrete class has nothing unimplemented. 
@@ -544,7 +557,7 @@ object RefChecks { val abstractErrors = new mutable.ListBuffer[String] def abstractErrorMessage = // a little formatting polish - if (abstractErrors.size <= 2) abstractErrors mkString " " + if (abstractErrors.size <= 2) abstractErrors.mkString(" ") else abstractErrors.tail.mkString(abstractErrors.head + ":\n", "\n", "") def abstractClassError(mustBeMixin: Boolean, msg: String): Unit = { @@ -580,7 +593,7 @@ object RefChecks { clazz.nonPrivateMembersNamed(mbr.name) .filterWithPredicate( impl => isConcrete(impl.symbol) - && mbrDenot.matchesLoosely(impl, alwaysCompareTypes = true)) + && withMode(Mode.IgnoreCaptures)(mbrDenot.matchesLoosely(impl, alwaysCompareTypes = true))) .exists /** The term symbols in this class and its baseclasses that are @@ -727,7 +740,7 @@ object RefChecks { def checkNoAbstractDecls(bc: Symbol): Unit = { for (decl <- bc.info.decls) if (decl.is(Deferred)) { - val impl = decl.matchingMember(clazz.thisType) + val impl = withMode(Mode.IgnoreCaptures)(decl.matchingMember(clazz.thisType)) if (impl == NoSymbol || decl.owner.isSubClass(impl.owner)) && !ignoreDeferred(decl) then @@ -774,17 +787,19 @@ object RefChecks { // For each member, check that the type of its symbol, as seen from `self` // can override the info of this member - for (name <- membersToCheck) - for (mbrd <- self.member(name).alternatives) { - val mbr = mbrd.symbol - val mbrType = mbr.info.asSeenFrom(self, mbr.owner) - if (!mbrType.overrides(mbrd.info, relaxedCheck = false, matchLoosely = true)) - report.errorOrMigrationWarning( - em"""${mbr.showLocated} is not a legal implementation of `$name` in $clazz - | its type $mbrType - | does not conform to ${mbrd.info}""", - (if (mbr.owner == clazz) mbr else clazz).srcPos, from = `3.0`) + withMode(Mode.IgnoreCaptures) { + for (name <- membersToCheck) + for (mbrd <- self.member(name).alternatives) { + val mbr = mbrd.symbol + val mbrType = mbr.info.asSeenFrom(self, mbr.owner) + if (!mbrType.overrides(mbrd.info, relaxedCheck = false, matchLoosely = true)) + report.errorOrMigrationWarning( + em"""${mbr.showLocated} is not a legal implementation of `$name` in $clazz + | its type $mbrType + | does not conform to ${mbrd.info}""", + (if (mbr.owner == clazz) mbr else clazz).srcPos, from = `3.0`) } + } } /** Check that inheriting a case class does not constitute a variant refinement @@ -796,7 +811,7 @@ object RefChecks { for (baseCls <- caseCls.info.baseClasses.tail) if (baseCls.typeParams.exists(_.paramVarianceSign != 0)) for (problem <- variantInheritanceProblems(baseCls, caseCls, "non-variant", "case ")) - report.errorOrMigrationWarning(problem(), clazz.srcPos, from = `3.0`) + report.errorOrMigrationWarning(problem, clazz.srcPos, from = `3.0`) checkNoAbstractMembers() if (abstractErrors.isEmpty) checkNoAbstractDecls(clazz) @@ -827,7 +842,7 @@ object RefChecks { if cls.paramAccessors.nonEmpty && !mixins.contains(cls) problem <- variantInheritanceProblems(cls, clazz.asClass.superClass, "parameterized", "super") } - report.error(problem(), clazz.srcPos) + report.error(problem, clazz.srcPos) } checkParameterizedTraitsOK() @@ -841,13 +856,13 @@ object RefChecks { * Return an optional by name error message if this test fails. 
*/ def variantInheritanceProblems( - baseCls: Symbol, middle: Symbol, baseStr: String, middleStr: String): Option[() => String] = { + baseCls: Symbol, middle: Symbol, baseStr: String, middleStr: String): Option[Message] = { val superBT = self.baseType(middle) val thisBT = self.baseType(baseCls) val combinedBT = superBT.baseType(baseCls) if (combinedBT =:= thisBT) None // ok else - Some(() => + Some( em"""illegal inheritance: $clazz inherits conflicting instances of $baseStr base $baseCls. | | Direct basetype: $thisBT @@ -944,7 +959,7 @@ object RefChecks { for bc <- cls.baseClasses.tail do val other = sym.matchingDecl(bc, cls.thisType) if other.exists then - report.error(i"private $sym cannot override ${other.showLocated}", sym.srcPos) + report.error(em"private $sym cannot override ${other.showLocated}", sym.srcPos) end checkNoPrivateOverrides /** Check that unary method definition do not receive parameters. diff --git a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala index e3f5382ecad7..71efc27bf673 100644 --- a/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Synthesizer.scala @@ -28,13 +28,25 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): private type SpecialHandlers = List[(ClassSymbol, SpecialHandler)] val synthesizedClassTag: SpecialHandler = (formal, span) => + def instArg(tp: Type): Type = tp.stripTypeVar match + // Special case to avoid instantiating `Int & S` to `Int & Nothing` in + // i16328.scala. The intersection comes from an earlier instantiation + // to an upper bound. + // The dual situation with unions is harder to trigger because lower + // bounds are usually widened during instantiation. + case tp: AndOrType if tp.tp1 =:= tp.tp2 => + instArg(tp.tp1) + case _ => + if isFullyDefined(tp, ForceDegree.all) then tp + else NoType // this happens in tests/neg/i15372.scala + val tag = formal.argInfos match - case arg :: Nil if isFullyDefined(arg, ForceDegree.all) => - arg match + case arg :: Nil => + instArg(arg) match case defn.ArrayOf(elemTp) => val etag = typer.inferImplicitArg(defn.ClassTagClass.typeRef.appliedTo(elemTp), span) if etag.tpe.isError then EmptyTree else etag.select(nme.wrap) - case tp if hasStableErasure(tp) && !defn.isBottomClassAfterErasure(tp.typeSymbol) => + case tp if hasStableErasure(tp) && !tp.isBottomTypeAfterErasure => val sym = tp.typeSymbol val classTagModul = ref(defn.ClassTagModule) if defn.SpecialClassTagClasses.contains(sym) then @@ -476,8 +488,8 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): val elemLabels = cls.children.map(c => ConstantType(Constant(c.name.toString))) def internalError(msg: => String)(using Context): Unit = - report.error(i"""Internal error when synthesizing sum mirror for $cls: - |$msg""".stripMargin, ctx.source.atSpan(span)) + report.error(em"""Internal error when synthesizing sum mirror for $cls: + |$msg""", ctx.source.atSpan(span)) def childPrefix(child: Symbol)(using Context): Type = val symPre = TypeOps.childPrefix(pre, cls, child) @@ -691,10 +703,11 @@ class Synthesizer(typer: Typer)(using @constructorOnly c: Context): val manifest = synthesize(fullyDefinedType(arg, "Manifest argument", ctx.source.atSpan(span)), kind, topLevel = true) if manifest != EmptyTree then report.deprecationWarning( - i"""Compiler synthesis of Manifest and OptManifest is deprecated, instead - |replace with the type `scala.reflect.ClassTag[$arg]`. 
- |Alternatively, consider using the new metaprogramming features of Scala 3, - |see https://docs.scala-lang.org/scala3/reference/metaprogramming.html""", ctx.source.atSpan(span)) + em"""Compiler synthesis of Manifest and OptManifest is deprecated, instead + |replace with the type `scala.reflect.ClassTag[$arg]`. + |Alternatively, consider using the new metaprogramming features of Scala 3, + |see https://docs.scala-lang.org/scala3/reference/metaprogramming.html""", + ctx.source.atSpan(span)) withNoErrors(manifest) case _ => EmptyTreeNoError diff --git a/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala b/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala index b90409e72364..98e9cb638c17 100644 --- a/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala +++ b/compiler/src/dotty/tools/dotc/typer/TypeAssigner.scala @@ -31,8 +31,9 @@ trait TypeAssigner { c case _ => report.error( - if (qual.isEmpty) tree.show + " can be used only in a class, object, or template" - else qual.show + " is not an enclosing class", tree.srcPos) + if qual.isEmpty then em"$tree can be used only in a class, object, or template" + else em"$qual is not an enclosing class", + tree.srcPos) NoSymbol } } @@ -127,7 +128,7 @@ trait TypeAssigner { def arrayElemType = qual1.tpe.widen match case JavaArrayType(elemtp) => elemtp case qualType => - report.error("Expected Array but was " + qualType.show, tree.srcPos) + report.error(em"Expected Array but was $qualType", tree.srcPos) defn.NothingType val name = tree.name @@ -167,26 +168,13 @@ trait TypeAssigner { case _ => false def addendum = err.selectErrorAddendum(tree, qual, qualType, importSuggestionAddendum, foundWithoutNull) val msg: Message = - if tree.name == nme.CONSTRUCTOR then ex"$qualType does not have a constructor".toMessage + if tree.name == nme.CONSTRUCTOR then em"$qualType does not have a constructor" else NotAMember(qualType, tree.name, kind, addendum) errorType(msg, tree.srcPos) def inaccessibleErrorType(tpe: NamedType, superAccess: Boolean, pos: SrcPos)(using Context): Type = - val pre = tpe.prefix - val name = tpe.name - val alts = tpe.denot.alternatives.map(_.symbol).filter(_.exists) - val whatCanNot = alts match - case Nil => - em"$name cannot" - case sym :: Nil => - em"${if (sym.owner == pre.typeSymbol) sym.show else sym.showLocated} cannot" - case _ => - em"none of the overloaded alternatives named $name can" - val where = if (ctx.owner.exists) s" from ${ctx.owner.enclosingClass}" else "" - val whyNot = new StringBuffer - alts.foreach(_.isAccessibleFrom(pre, superAccess, whyNot)) if tpe.isError then tpe - else errorType(ex"$whatCanNot be accessed as a member of $pre$where.$whyNot", pos) + else errorType(CannotBeAccessed(tpe, superAccess), pos) def processAppliedType(tree: untpd.Tree, tp: Type)(using Context): Type = tp match case AppliedType(tycon, args) => @@ -238,7 +226,7 @@ trait TypeAssigner { val cls = qualifyingClass(tree, tree.qual.name, packageOK = false) tree.withType( if (cls.isClass) cls.thisType - else errorType("not a legal qualifying class for this", tree.srcPos)) + else errorType(em"not a legal qualifying class for this", tree.srcPos)) } def superType(qualType: Type, mix: untpd.Ident, mixinClass: Symbol, pos: SrcPos)(using Context) = @@ -252,7 +240,7 @@ trait TypeAssigner { case Nil => errorType(SuperQualMustBeParent(mix, cls), pos) case p :: q :: _ => - errorType("ambiguous parent class qualifier", pos) + errorType(em"ambiguous parent class qualifier", pos) } val owntype = if (mixinClass.exists) mixinClass.typeRef @@ -291,25 +279,25 
@@ trait TypeAssigner { def safeSubstMethodParams(mt: MethodType, argTypes: List[Type])(using Context): Type = if mt.isResultDependent then safeSubstParams(mt.resultType, mt.paramRefs, argTypes) - else if mt.isCaptureDependent then mt.resultType.substParams(mt, argTypes) else mt.resultType def assignType(tree: untpd.Apply, fn: Tree, args: List[Tree])(using Context): Apply = { val ownType = fn.tpe.widen match { case fntpe: MethodType => - if (fntpe.paramInfos.hasSameLengthAs(args) || ctx.phase.prev.relaxedTyping) - safeSubstMethodParams(fntpe, args.tpes) + if fntpe.paramInfos.hasSameLengthAs(args) || ctx.phase.prev.relaxedTyping then + if fntpe.isResultDependent then safeSubstMethodParams(fntpe, args.tpes) + else fntpe.resultType // fast path optimization else - errorType(i"wrong number of arguments at ${ctx.phase.prev} for $fntpe: ${fn.tpe}, expected: ${fntpe.paramInfos.length}, found: ${args.length}", tree.srcPos) + errorType(em"wrong number of arguments at ${ctx.phase.prev} for $fntpe: ${fn.tpe}, expected: ${fntpe.paramInfos.length}, found: ${args.length}", tree.srcPos) case t => if (ctx.settings.Ydebug.value) new FatalError("").printStackTrace() - errorType(err.takesNoParamsStr(fn, ""), tree.srcPos) + errorType(err.takesNoParamsMsg(fn, ""), tree.srcPos) } ConstFold.Apply(tree.withType(ownType)) } def assignType(tree: untpd.TypeApply, fn: Tree, args: List[Tree])(using Context): TypeApply = { - def fail = tree.withType(errorType(err.takesNoParamsStr(fn, "type "), tree.srcPos)) + def fail = tree.withType(errorType(err.takesNoParamsMsg(fn, "type "), tree.srcPos)) ConstFold(fn.tpe.widen match { case pt: TypeLambda => tree.withType { diff --git a/compiler/src/dotty/tools/dotc/typer/Typer.scala b/compiler/src/dotty/tools/dotc/typer/Typer.scala index 33638df54fb1..eb09d30e60f3 100644 --- a/compiler/src/dotty/tools/dotc/typer/Typer.scala +++ b/compiler/src/dotty/tools/dotc/typer/Typer.scala @@ -73,12 +73,6 @@ object Typer { /** An attachment for GADT constraints that were inferred for a pattern. */ val InferredGadtConstraints = new Property.StickyKey[core.GadtConstraint] - /** A context property that indicates the owner of any expressions to be typed in the context - * if that owner is different from the context's owner. Typically, a context with a class - * as owner would have a local dummy as ExprOwner value. - */ - private val ExprOwner = new Property.Key[Symbol] - /** An attachment on a Select node with an `apply` field indicating that the `apply` * was inserted by the Typer. 
*/ @@ -250,15 +244,17 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer imp.importSym.info match case ImportType(expr) => val pre = expr.tpe - var denot = pre.memberBasedOnFlags(name, required, excluded) + val denot0 = pre.memberBasedOnFlags(name, required, excluded) .accessibleFrom(pre)(using refctx) // Pass refctx so that any errors are reported in the context of the // reference instead of the context of the import scope - if denot.exists then - if checkBounds then - denot = denot.filterWithPredicate { mbr => - mbr.matchesImportBound(if mbr.symbol.is(Given) then imp.givenBound else imp.wildcardBound) - } + if denot0.exists then + val denot = + if checkBounds then + denot0.filterWithPredicate { mbr => + mbr.matchesImportBound(if mbr.symbol.is(Given) then imp.givenBound else imp.wildcardBound) + } + else denot0 def isScalaJsPseudoUnion = denot.name == tpnme.raw.BAR && ctx.settings.scalajs.value && denot.symbol == JSDefinitions.jsdefn.PseudoUnionClass // Just like Scala2Unpickler reinterprets Scala.js pseudo-unions @@ -283,7 +279,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def checkUnambiguous(found: Type) = val other = recur(selectors.tail) if other.exists && found.exists && found != other then - fail(em"reference to `$name` is ambiguous; it is imported twice".toMessage) + fail(em"reference to `$name` is ambiguous; it is imported twice") found if selector.rename == termName && selector.rename != nme.WILDCARD then @@ -376,6 +372,17 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case denot => !denot.hasAltWith(isCurrent) def checkNoOuterDefs(denot: Denotation, last: Context, prevCtx: Context): Unit = + def sameTermOrType(d1: SingleDenotation, d2: Denotation) = + d2.containsSym(d1.symbol) || d2.hasUniqueSym && { + val sym1 = d1.symbol + val sym2 = d2.symbol + if sym1.isTerm then + sym1.isStableMember && + sym2.isStableMember && + sym1.termRef =:= sym2.termRef + else + (sym1.isAliasType || sym2.isAliasType) && d1.info =:= d2.info + } val outer = last.outer val owner = outer.owner if (owner eq last.owner) && (outer.scope eq last.scope) then @@ -385,7 +392,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val competing = scope.denotsNamed(name).filterWithFlags(required, excluded) if competing.exists then val symsMatch = competing - .filterWithPredicate(sd => denot.containsSym(sd.symbol)) + .filterWithPredicate(sd => sameTermOrType(sd, denot)) .exists if !symsMatch && !suppressErrors then report.errorOrMigrationWarning( @@ -476,13 +483,15 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer * (x: T | Null) => x.$asInstanceOf$[x.type & T] */ def toNotNullTermRef(tree: Tree, pt: Type)(using Context): Tree = tree.tpe match - case ref @ OrNull(tpnn) : TermRef + case ref: TermRef if pt != AssignProto && // Ensure it is not the lhs of Assign ctx.notNullInfos.impliesNotNull(ref) && // If a reference is in the context, it is already trackable at the point we add it. // Hence, we don't use isTracked in the next line, because checking use out of order is enough. !ref.usedOutOfOrder => - tree.cast(AndType(ref, tpnn)) + ref match + case OrNull(tpnn) => tree.cast(AndType(ref, tpnn)) + case _ => tree case _ => tree @@ -525,7 +534,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val found = findRef(name, pt, EmptyFlags, EmptyFlags, tree.srcPos) if foundUnderScala2.exists && !(foundUnderScala2 =:= found) then report.migrationWarning( - ex"""Name resolution will change. 
+ em"""Name resolution will change. | currently selected : $foundUnderScala2 | in the future, without -source 3.0-migration: $found""", tree.srcPos) foundUnderScala2 @@ -584,7 +593,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else if ctx.owner.isConstructor && !ctx.owner.isPrimaryConstructor && ctx.owner.owner.unforcedDecls.lookup(tree.name).exists then // we are in the arguments of a this(...) constructor call - errorTree(tree, ex"$tree is not accessible from constructor arguments") + errorTree(tree, em"$tree is not accessible from constructor arguments") else errorTree(tree, MissingIdent(tree, kind, name)) end typedIdent @@ -609,11 +618,15 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val superAccess = qual.isInstanceOf[Super] val rawType = selectionType(tree, qual) val checkedType = accessibleType(rawType, superAccess) - if checkedType.exists then + + def finish(tree: untpd.Select, qual: Tree, checkedType: Type): Tree = val select = toNotNullTermRef(assignType(tree, checkedType), pt) if selName.isTypeName then checkStable(qual.tpe, qual.srcPos, "type prefix") checkLegalValue(select, pt) ConstFold(select) + + if checkedType.exists then + finish(tree, qual, checkedType) else if selName == nme.apply && qual.tpe.widen.isInstanceOf[MethodType] then // Simplify `m.apply(...)` to `m(...)` qual @@ -625,6 +638,26 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else val tree1 = tryExtensionOrConversion( tree, pt, IgnoredProto(pt), qual, ctx.typerState.ownedVars, this, inSelect = true) + .orElse { + if ctx.gadt.isNarrowing then + // try GADT approximation if we're trying to select a member + // Member lookup cannot take GADTs into account b/c of cache, so we + // approximate types based on GADT constraints instead. For an example, + // see MemberHealing in gadt-approximation-interaction.scala. 
+ val wtp = qual.tpe.widen + gadts.println(i"Trying to heal member selection by GADT-approximating $wtp") + val gadtApprox = Inferencing.approximateGADT(wtp) + gadts.println(i"GADT-approximated $wtp ~~ $gadtApprox") + val qual1 = qual.cast(gadtApprox) + val tree1 = cpy.Select(tree0)(qual1, selName) + val checkedType1 = accessibleType(selectionType(tree1, qual1), superAccess = false) + if checkedType1.exists then + gadts.println(i"Member selection healed by GADT approximation") + finish(tree1, qual1, checkedType1) + else + tryExtensionOrConversion(tree1, pt, IgnoredProto(pt), qual1, ctx.typerState.ownedVars, this, inSelect = true) + else EmptyTree + } if !tree1.isEmpty then tree1 else if canDefineFurther(qual.tpe.widen) then @@ -673,7 +706,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer javaSelectOnType(qual2) case _ => - errorTree(tree, "cannot convert to type selection") // will never be printed due to fallback + errorTree(tree, em"cannot convert to type selection") // will never be printed due to fallback } def selectWithFallback(fallBack: Context ?=> Tree) = @@ -987,8 +1020,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def lhs1 = adapt(lhsCore, AssignProto, locked) def reassignmentToVal = - errorTree(cpy.Assign(tree)(lhsCore, typed(tree.rhs, lhs1.tpe.widen)), - ReassignmentToVal(lhsCore.symbol.name)) + report.error(ReassignmentToVal(lhsCore.symbol.name), tree.srcPos) + cpy.Assign(tree)(lhsCore, typed(tree.rhs, lhs1.tpe.widen)).withType(defn.UnitType) def canAssign(sym: Symbol) = sym.is(Mutable, butNot = Accessor) || @@ -1210,8 +1243,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer && defn.isContextFunctionType(pt1.nonPrivateMember(nme.apply).info.finalResultType) then report.error( - i"""Implementation restriction: Expected result type $pt1 - |is a curried dependent context function type. Such types are not yet supported.""", + em"""Implementation restriction: Expected result type $pt1 + |is a curried dependent context function type. 
Such types are not yet supported.""", pos) pt1 match { case tp: TypeParamRef => @@ -1322,7 +1355,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val appDef = typed(appDef0).asInstanceOf[DefDef] val mt = appDef.symbol.info.asInstanceOf[MethodType] if (mt.isParamDependent) - report.error(i"$mt is an illegal function type because it has inter-parameter dependencies", tree.srcPos) + report.error(em"$mt is an illegal function type because it has inter-parameter dependencies", tree.srcPos) val resTpt = TypeTree(mt.nonDependentResultApprox).withSpan(body.span) val typeArgs = appDef.termParamss.head.map(_.tpt) :+ resTpt val tycon = TypeTree(funSym.typeRef) @@ -1526,14 +1559,14 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Replace the underspecified expected type by one based on the closure method type defn.PartialFunctionOf(mt.firstParamTypes.head, mt.resultType) else - report.error(ex"result type of lambda is an underspecified SAM type $pt", tree.srcPos) + report.error(em"result type of lambda is an underspecified SAM type $pt", tree.srcPos) pt TypeTree(targetTpe) case _ => if (mt.isParamDependent) errorTree(tree, - i"""cannot turn method type $mt into closure - |because it has internal parameter dependencies""") + em"""cannot turn method type $mt into closure + |because it has internal parameter dependencies""") else if ((tree.tpt `eq` untpd.ContextualEmptyTree) && mt.paramNames.isEmpty) // Note implicitness of function in target type since there are no method parameters that indicate it. TypeTree(defn.FunctionOf(Nil, mt.resType, isContextual = true, isErased = false)) @@ -1650,7 +1683,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // skip exhaustivity check in later phase // TODO: move the check above to patternMatcher phase - val uncheckedTpe = AnnotatedType(sel.tpe.widen, Annotation(defn.UncheckedAnnot)) + val uncheckedTpe = AnnotatedType(sel.tpe.widen, Annotation(defn.UncheckedAnnot, tree.selector.span)) tpd.cpy.Match(result)( selector = tpd.Typed(sel, tpd.TypeTree(uncheckedTpe)), cases = result.cases @@ -1781,7 +1814,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer var body1 = typedType(cdef.body, pt) if !body1.isType then assert(ctx.reporter.errorsReported) - body1 = TypeTree(errorType("", cdef.srcPos)) + body1 = TypeTree(errorType(em"", cdef.srcPos)) assignType(cpy.CaseDef(cdef)(pat2, EmptyTree, body1), pat2, body1) } caseRest(using ctx.fresh.setFreshGADTBounds.setNewScope) @@ -1885,7 +1918,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer Typed(res, TypeTree( AnnotatedType(res.tpe, - Annotation(defn.RequiresCapabilityAnnot, cap)))) + Annotation(defn.RequiresCapabilityAnnot, cap, tree.span)))) else res def typedSeqLiteral(tree: untpd.SeqLiteral, pt: Type)(using Context): SeqLiteral = { @@ -1929,7 +1962,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer .withType( if isFullyDefined(pt, ForceDegree.flipBottom) then pt else if ctx.reporter.errorsReported then UnspecifiedErrorType - else errorType(i"cannot infer type; expected type $pt is not fully defined", tree.srcPos)) + else errorType(em"cannot infer type; expected type $pt is not fully defined", tree.srcPos)) def typedTypeTree(tree: untpd.TypeTree, pt: Type)(using Context): Tree = tree match @@ -1943,13 +1976,13 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // untyped tree is no longer accessed after all // accesses with typedTypeTree are done. 
case None => - errorTree(tree, "Something's wrong: missing original symbol for type tree") + errorTree(tree, em"Something's wrong: missing original symbol for type tree") } case _ => completeTypeTree(InferredTypeTree(), pt, tree) def typedSingletonTypeTree(tree: untpd.SingletonTypeTree)(using Context): SingletonTypeTree = { - val ref1 = typedExpr(tree.ref) + val ref1 = typedExpr(tree.ref, SingletonTypeProto) checkStable(ref1.tpe, tree.srcPos, "singleton type") assignType(cpy.SingletonTypeTree(tree)(ref1), ref1) } @@ -1984,9 +2017,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer tree.args match case arg :: _ if arg.isTerm => if Feature.dependentEnabled then - return errorTree(tree, i"Not yet implemented: T(...)") + return errorTree(tree, em"Not yet implemented: T(...)") else - return errorTree(tree, dependentStr) + return errorTree(tree, dependentMsg) case _ => val tpt1 = withoutMode(Mode.Pattern) { @@ -2096,6 +2129,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer && checkedArgs(1).tpe.derivesFrom(defn.RuntimeExceptionClass) then report.error(em"throws clause cannot be defined for RuntimeException", checkedArgs(1).srcPos) + else if tycon == defn.IntoType then + // is defined in package scala but this should be hidden from user programs + report.error(em"not found: ", tpt1.srcPos) else if (ctx.isJava) if tycon eq defn.ArrayClass then checkedArgs match { @@ -2122,9 +2158,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer def typedTermLambdaTypeTree(tree: untpd.TermLambdaTypeTree)(using Context): Tree = if Feature.dependentEnabled then - errorTree(tree, i"Not yet implemented: (...) =>> ...") + errorTree(tree, em"Not yet implemented: (...) =>> ...") else - errorTree(tree, dependentStr) + errorTree(tree, dependentMsg) def typedMatchTypeTree(tree: untpd.MatchTypeTree, pt: Type)(using Context): Tree = { val bound1 = @@ -2148,15 +2184,11 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer val alias1 = typed(alias) val lo2 = if (lo1.isEmpty) typed(untpd.TypeTree(defn.NothingType)) else lo1 val hi2 = if (hi1.isEmpty) typed(untpd.TypeTree(defn.AnyType)) else hi1 - if !alias1.isEmpty then - val bounds = TypeBounds(lo2.tpe, hi2.tpe) - if !bounds.contains(alias1.tpe) then - report.error(em"type ${alias1.tpe} outside bounds $bounds", tree.srcPos) assignType(cpy.TypeBoundsTree(tree)(lo2, hi2, alias1), lo2, hi2, alias1) def typedBind(tree: untpd.Bind, pt: Type)(using Context): Tree = { if !isFullyDefined(pt, ForceDegree.all) then - return errorTree(tree, i"expected type of $tree is not fully defined") + return errorTree(tree, em"expected type of $tree is not fully defined") val body1 = typed(tree.body, pt) body1 match { case UnApply(fn, Nil, arg :: Nil) @@ -2222,29 +2254,23 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer /** The context to be used for an annotation of `mdef`. * This should be the context enclosing `mdef`, or if `mdef` defines a parameter * the context enclosing the owner of `mdef`. - * Furthermore, we need to evaluate annotation arguments in an expression context, - * since classes defined in a such arguments should not be entered into the - * enclosing class. + * Furthermore, we need to make sure that annotation trees are evaluated + * with an owner that is not the enclosing class since otherwise locally + * defined symbols would be entered as class members. 
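As a hedged illustration of the situation this owner handling is for (names assumed, in the spirit of the i12953 test cited in the code below): an annotation argument on a parameter may refer to the enclosing method's type parameter, so it must be typed with that parameter in scope but without entering any symbols into the enclosing class.

```scala
import scala.annotation.Annotation

// Hypothetical annotation class, used only for illustration.
class sized[T](default: Option[T]) extends Annotation

// The annotation argument mentions the method's type parameter T, which is
// why the annotation context needs T in its scope.
def first[T](@sized[T](None) xs: List[T]): T = xs.head
```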
*/ - def annotContext(mdef: untpd.Tree, sym: Symbol)(using Context): Context = { + def annotContext(mdef: untpd.Tree, sym: Symbol)(using Context): Context = def isInner(owner: Symbol) = owner == sym || sym.is(Param) && owner == sym.owner val outer = ctx.outersIterator.dropWhile(c => isInner(c.owner)).next() - var adjusted = outer.property(ExprOwner) match { - case Some(exprOwner) if outer.owner.isClass => outer.exprContext(mdef, exprOwner) - case _ => outer - } + def local: FreshContext = outer.fresh.setOwner(newLocalDummy(sym.owner)) sym.owner.infoOrCompleter match - case completer: Namer#Completer if sym.is(Param) => - val tparams = completer.completerTypeParams(sym) - if tparams.nonEmpty then - // Create a new local context with a dummy owner and a scope containing the - // type parameters of the enclosing method or class. Thus annotations can see - // these type parameters. See i12953.scala for a test case. - val dummyOwner = newLocalDummy(sym.owner) - adjusted = adjusted.fresh.setOwner(dummyOwner).setScope(newScopeWith(tparams*)) + case completer: Namer#Completer + if sym.is(Param) && completer.completerTypeParams(sym).nonEmpty => + // Create a new local context with a dummy owner and a scope containing the + // type parameters of the enclosing method or class. Thus annotations can see + // these type parameters. See i12953.scala for a test case. + local.setScope(newScopeWith(completer.completerTypeParams(sym)*)) case _ => - adjusted - } + if outer.owner.isClass then local else outer def completeAnnotations(mdef: untpd.MemberDef, sym: Symbol)(using Context): Unit = { // necessary to force annotation trees to be computed. @@ -2259,7 +2285,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer } def typedAnnotation(annot: untpd.Tree)(using Context): Tree = - checkAnnotArgs(typed(annot, defn.AnnotationClass.typeRef)) + checkAnnotClass(checkAnnotArgs(typed(annot))) def registerNowarn(tree: Tree, mdef: untpd.Tree)(using Context): Unit = val annot = Annotations.Annotation(tree) @@ -2336,7 +2362,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer ctx.outer.outersIterator.takeWhile(!_.owner.is(Method)) .filter(ctx => ctx.owner.isClass && ctx.owner.typeParams.nonEmpty) .toList.reverse - .foreach(ctx => rhsCtx.gadt.addToConstraint(ctx.owner.typeParams)) + .foreach(ctx => rhsCtx.gadtState.addToConstraint(ctx.owner.typeParams)) if tparamss.nonEmpty then rhsCtx.setFreshGADTBounds @@ -2345,7 +2371,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // we're typing a polymorphic definition's body, // so we allow constraining all of its type parameters // constructors are an exception as we don't allow constraining type params of classes - rhsCtx.gadt.addToConstraint(tparamSyms) + rhsCtx.gadtState.addToConstraint(tparamSyms) else if !sym.isPrimaryConstructor then linkConstructorParams(sym, tparamSyms, rhsCtx) @@ -2358,6 +2384,10 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if sym.isInlineMethod then if StagingContext.level > 0 then report.error("inline def cannot be within quotes", sym.sourcePos) + if sym.is(Given) + && untpd.stripBlock(untpd.unsplice(ddef.rhs)).isInstanceOf[untpd.Function] + then + report.warning(InlineGivenShouldNotBeFunction(), ddef.rhs.srcPos) val rhsToInline = PrepareInlineable.wrapRHS(ddef, tpt1, rhs1) PrepareInlineable.registerInlineInfo(sym, rhsToInline) @@ -2444,7 +2474,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // error if the same parent was explicitly added in user 
code. if !tree.span.isSourceDerived then return EmptyTree - if !ctx.isAfterTyper then report.error(i"$psym is extended twice", tree.srcPos) + if !ctx.isAfterTyper then report.error(em"$psym is extended twice", tree.srcPos) else seenParents += psym val result = ensureConstrCall(cls, parent, psym)(using superCtx) @@ -2561,7 +2591,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer && effectiveOwner.is(Trait) && !effectiveOwner.derivesFrom(defn.ObjectClass) then - report.error(i"$cls cannot be defined in universal $effectiveOwner", cdef.srcPos) + report.error(em"$cls cannot be defined in universal $effectiveOwner", cdef.srcPos) // Temporarily set the typed class def as root tree so that we have at least some // information in the IDE in case we never reach `SetRootTree`. @@ -2595,6 +2625,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer */ def ensureConstrCall(cls: ClassSymbol, parent: Tree, psym: Symbol)(using Context): Tree = if parent.isType && !cls.is(Trait) && !cls.is(JavaDefined) && psym.isClass + // Annotations are represented as traits with constructors, but should + // never be called as such outside of annotation trees. + && !psym.is(JavaAnnotation) && (!psym.is(Trait) || psym.primaryConstructor.info.takesParams && !cls.superClass.isSubClass(psym)) then typed(untpd.New(untpd.TypedSplice(parent), Nil)) @@ -2665,11 +2698,11 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // Package will not exist if a duplicate type has already been entered, see `tests/neg/1708.scala` errorTree(tree, if pkg.exists then PackageNameAlreadyDefined(pkg) - else i"package ${tree.pid.name} does not exist".toMessage) + else em"package ${tree.pid.name} does not exist") end typedPackageDef def typedAnnotated(tree: untpd.Annotated, pt: Type)(using Context): Tree = { - val annot1 = typedExpr(tree.annot, defn.AnnotationClass.typeRef) + val annot1 = checkAnnotClass(typedExpr(tree.annot)) val annotCls = Annotations.annotClass(annot1) if annotCls == defn.NowarnAnnot then registerNowarn(annot1, tree) @@ -2745,8 +2778,8 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if ((prefix ++ suffix).isEmpty) "simply leave out the trailing ` _`" else s"use `$prefix$suffix` instead" report.errorOrMigrationWarning( - i"""The syntax ` _` is no longer supported; - |you can $remedy""", + em"""The syntax ` _` is no longer supported; + |you can $remedy""", tree.srcPos, from = future) if sourceVersion.isMigrating then @@ -2878,7 +2911,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer else "" val namePos = tree.sourcePos.withSpan(tree.nameSpan) report.errorOrMigrationWarning( - s"`?` is not a valid type name$addendum", namePos, from = `3.0`) + em"`?` is not a valid type name$addendum", namePos, from = `3.0`) if tree.isClassDef then typedClassDef(tree, sym.asClass)(using ctx.localContext(tree, sym)) else @@ -2930,7 +2963,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case tree: untpd.TypedSplice => typedTypedSplice(tree) case tree: untpd.UnApply => typedUnApply(tree, pt) case tree: untpd.Tuple => typedTuple(tree, pt) - case tree: untpd.DependentTypeTree => completeTypeTree(untpd.TypeTree(), pt, tree) + case tree: untpd.DependentTypeTree => completeTypeTree(untpd.InferredTypeTree(), pt, tree) case tree: untpd.InfixOp => typedInfixOp(tree, pt) case tree: untpd.ParsedTry => typedTry(tree, pt) case tree @ untpd.PostfixOp(qual, Ident(nme.WILDCARD)) => typedAsFunction(tree, pt) @@ -3082,10 +3115,6 @@ class 
Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case nil => (buf.toList, ctx) } - val localCtx = { - val exprOwnerOpt = if (exprOwner == ctx.owner) None else Some(exprOwner) - ctx.withProperty(ExprOwner, exprOwnerOpt) - } def finalize(stat: Tree)(using Context): Tree = stat match { case stat: TypeDef if stat.symbol.is(Module) => val enumContext = enumContexts(stat.symbol.linkedClass) @@ -3098,7 +3127,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case _ => stat } - val (stats0, finalCtx) = traverse(stats)(using localCtx) + val (stats0, finalCtx) = traverse(stats) val stats1 = stats0.mapConserve(finalize) if ctx.owner == exprOwner then checkNoTargetNameConflict(stats1) (stats1, finalCtx) @@ -3347,7 +3376,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case SearchSuccess(found, _, _, isExtension) => if isExtension then return found else - checkImplicitConversionUseOK(found) + checkImplicitConversionUseOK(found, selProto) return withoutMode(Mode.ImplicitsEnabled)(typedSelect(tree, pt, found)) case failure: SearchFailure => if failure.isAmbiguous then @@ -3421,42 +3450,59 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer ErrorReporting.missingArgs(tree, mt) tree.withType(mt.resultType) - def adaptOverloaded(ref: TermRef) = { + def adaptOverloaded(ref: TermRef) = + // get all the alternatives val altDenots = val allDenots = ref.denot.alternatives if pt.isExtensionApplyProto then allDenots.filter(_.symbol.is(ExtensionMethod)) else allDenots + typr.println(i"adapt overloaded $ref with alternatives ${altDenots map (_.info)}%\n\n %") + + /** Search for an alternative that does not take parameters. + * If there is one, return it, otherwise emit an error. + */ + def tryParameterless(alts: List[TermRef])(error: => tpd.Tree): Tree = + alts.filter(_.info.isParameterless) match + case alt :: Nil => readaptSimplified(tree.withType(alt)) + case _ => + if altDenots.exists(_.info.paramInfoss == ListOfNil) then + typed(untpd.Apply(untpd.TypedSplice(tree), Nil), pt, locked) + else + error + def altRef(alt: SingleDenotation) = TermRef(ref.prefix, ref.name, alt) val alts = altDenots.map(altRef) - resolveOverloaded(alts, pt) match { + + resolveOverloaded(alts, pt) match case alt :: Nil => readaptSimplified(tree.withType(alt)) case Nil => - // If alternative matches, there are still two ways to recover: + // If no alternative matches, there are still two ways to recover: // 1. If context is an application, try to insert an apply or implicit // 2. If context is not an application, pick a alternative that does // not take parameters. 
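As a hedged illustration of case 2 above (an assumed example, not the i10715 tests cited further down), an overloaded name used outside an application resolves to its parameterless alternative:

```scala
class Counter:
  def next: Int = 1               // parameterless alternative
  def next(step: Int): Int = step

def use(c: Counter): Unit =
  // Not an application, and the expected type is not a function type,
  // so the parameterless `next` is the one selected.
  val n = c.next
  println(n)
```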
- def noMatches = - errorTree(tree, NoMatchingOverload(altDenots, pt)) - def hasEmptyParams(denot: SingleDenotation) = denot.info.paramInfoss == ListOfNil - pt match { + + def errorNoMatch = errorTree(tree, NoMatchingOverload(altDenots, pt)) + + pt match case pt: FunOrPolyProto if pt.applyKind != ApplyKind.Using => // insert apply or convert qualifier, but only for a regular application - tryInsertApplyOrImplicit(tree, pt, locked)(noMatches) + tryInsertApplyOrImplicit(tree, pt, locked)(errorNoMatch) case _ => - alts.filter(_.info.isParameterless) match { - case alt :: Nil => readaptSimplified(tree.withType(alt)) - case _ => - if (altDenots exists (_.info.paramInfoss == ListOfNil)) - typed(untpd.Apply(untpd.TypedSplice(tree), Nil), pt, locked) - else - noMatches - } - } + tryParameterless(alts)(errorNoMatch) + case ambiAlts => - if tree.tpe.isErroneous || pt.isErroneous then tree.withType(UnspecifiedErrorType) - else + // If there are ambiguous alternatives, and: + // 1. the types aren't erroneous + // 2. the expected type is not a function type + // 3. there exist a parameterless alternative + // + // Then, pick the parameterless alternative. + // See tests/pos/i10715-scala and tests/pos/i10715-java. + + /** Constructs an "ambiguous overload" error */ + def errorAmbiguous = val remainingDenots = altDenots.filter(denot => ambiAlts.contains(altRef(denot))) val addendum = if ambiAlts.exists(!_.symbol.exists) then @@ -3465,8 +3511,19 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer |Note: Overloaded definitions introduced by refinements cannot be resolved""" else "" errorTree(tree, AmbiguousOverload(tree, remainingDenots, pt, addendum)) - } - } + end errorAmbiguous + + if tree.tpe.isErroneous || pt.isErroneous then + tree.withType(UnspecifiedErrorType) + else + pt match + case _: FunProto => + errorAmbiguous + case _ => + tryParameterless(alts)(errorAmbiguous) + + end match + end adaptOverloaded def adaptToArgs(wtp: Type, pt: FunProto): Tree = wtp match { case wtp: MethodOrPoly => @@ -3703,7 +3760,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer if (!defn.isFunctionType(pt)) pt match { case SAMType(_) if !pt.classSymbol.hasAnnotation(defn.FunctionalInterfaceAnnot) => - report.warning(ex"${tree.symbol} is eta-expanded even though $pt does not have the @FunctionalInterface annotation.", tree.srcPos) + report.warning(em"${tree.symbol} is eta-expanded even though $pt does not have the @FunctionalInterface annotation.", tree.srcPos) case _ => } simplify(typed(etaExpand(tree, wtp, arity), pt), pt, locked) @@ -3726,24 +3783,24 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer true } - if ((implicitFun || caseCompanion) && - !isApplyProto(pt) && - pt != AssignProto && - !ctx.mode.is(Mode.Pattern) && - !ctx.isAfterTyper && - !ctx.isInlineContext) { + if (implicitFun || caseCompanion) + && !isApplyProto(pt) + && pt != SingletonTypeProto + && pt != AssignProto + && !ctx.mode.is(Mode.Pattern) + && !ctx.isAfterTyper + && !ctx.isInlineContext + then typr.println(i"insert apply on implicit $tree") val sel = untpd.Select(untpd.TypedSplice(tree), nme.apply).withAttachment(InsertedApply, ()) try typed(sel, pt, locked) finally sel.removeAttachment(InsertedApply) - } - else if (ctx.mode is Mode.Pattern) { + else if ctx.mode is Mode.Pattern then checkEqualityEvidence(tree, pt) tree - } else val meth = methPart(tree).symbol if meth.isAllOf(DeferredInline) && !Inlines.inInlineMethod then - errorTree(tree, i"Deferred inline ${meth.showLocated} cannot be 
invoked") + errorTree(tree, em"Deferred inline ${meth.showLocated} cannot be invoked") else if Inlines.needsInlining(tree) then tree.tpe <:< wildApprox(pt) val errorCount = ctx.reporter.errorCount @@ -3763,8 +3820,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer } else { report.error( - """Scala 2 macro cannot be used in Dotty. See https://docs.scala-lang.org/scala3/reference/dropped-features/macros.html - |To turn this error into a warning, pass -Xignore-scala2-macros to the compiler""".stripMargin, tree.srcPos.startPos) + em"""Scala 2 macro cannot be used in Dotty. See https://docs.scala-lang.org/scala3/reference/dropped-features/macros.html + |To turn this error into a warning, pass -Xignore-scala2-macros to the compiler""", + tree.srcPos.startPos) tree } else TypeComparer.testSubType(tree.tpe.widenExpr, pt) match @@ -3777,7 +3835,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer adaptToSubType(wtp) case CompareResult.OKwithGADTUsed if pt.isValueType - && !inContext(ctx.fresh.setGadt(GadtConstraint.empty)) { + && !inContext(ctx.fresh.setGadtState(GadtState(GadtConstraint.empty))) { val res = (tree.tpe.widenExpr frozen_<:< pt) if res then // we overshot; a cast is not needed, after all. @@ -3892,6 +3950,9 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // so will take the code path that decides on inlining val tree1 = adapt(tree, WildcardType, locked) checkStatementPurity(tree1)(tree, ctx.owner) + if (!ctx.isAfterTyper && !tree.isInstanceOf[Inlined] && ctx.settings.WvalueDiscard.value && !isThisTypeResult(tree)) { + report.warning(ValueDiscarding(tree.tpe), tree.srcPos) + } return tpd.Block(tree1 :: Nil, Literal(Constant(()))) } @@ -3934,19 +3995,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer pt match case pt: SelectionProto => - if ctx.gadt.isNarrowing then - // try GADT approximation if we're trying to select a member - // Member lookup cannot take GADTs into account b/c of cache, so we - // approximate types based on GADT constraints instead. For an example, - // see MemberHealing in gadt-approximation-interaction.scala. - gadts.println(i"Trying to heal member selection by GADT-approximating $wtp") - val gadtApprox = Inferencing.approximateGADT(wtp) - gadts.println(i"GADT-approximated $wtp ~~ $gadtApprox") - if pt.isMatchedBy(gadtApprox) then - gadts.println(i"Member selection healed by GADT approximation") - tree.cast(gadtApprox) - else tree - else if tree.tpe.derivesFrom(defn.PairClass) && !defn.isTupleNType(tree.tpe.widenDealias) then + if tree.tpe.derivesFrom(defn.PairClass) && !defn.isTupleNType(tree.tpe.widenDealias) then // If this is a generic tuple we need to cast it to make the TupleN/ members accessible. // This works only for generic tuples of known size up to 22. defn.tupleTypes(tree.tpe.widenTermRefExpr) match @@ -3963,7 +4012,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case SearchSuccess(found, _, _, isExtension) => if isExtension then found else - checkImplicitConversionUseOK(found) + checkImplicitConversionUseOK(found, pt) withoutMode(Mode.ImplicitsEnabled)(readapt(found)) case failure: SearchFailure => if (pt.isInstanceOf[ProtoType] && !failure.isAmbiguous) then @@ -4195,11 +4244,12 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer case _ => if !config.Feature.scala2ExperimentalMacroEnabled then report.error( - """Scala 2 macro definition needs to be enabled - |by making the implicit value scala.language.experimental.macros visible. 
- |This can be achieved by adding the import clause 'import scala.language.experimental.macros' - |or by setting the compiler option -language:experimental.macros. - """.stripMargin, call.srcPos) + em"""Scala 2 macro definition needs to be enabled + |by making the implicit value scala.language.experimental.macros visible. + |This can be achieved by adding the import clause 'import scala.language.experimental.macros' + |or by setting the compiler option -language:experimental.macros. + """, + call.srcPos) call match case call: untpd.Ident => typedIdent(call, defn.AnyType) @@ -4214,7 +4264,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer typedTypeApply(call2, defn.AnyType) } case _ => - report.error("Invalid Scala 2 macro " + call.show, call.srcPos) + report.error(em"Invalid Scala 2 macro $call", call.srcPos) EmptyTree else typedExpr(call, defn.AnyType) @@ -4244,7 +4294,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer // this is needed for -Ycheck. Without the annotation Ycheck will // skolemize the result type which will lead to different types before // and after checking. See i11955.scala. - AnnotatedType(conj, Annotation(defn.UncheckedStableAnnot)) + AnnotatedType(conj, Annotation(defn.UncheckedStableAnnot, tree.symbol.span)) else conj else pt gadts.println(i"insert GADT cast from $tree to $target") diff --git a/compiler/src/dotty/tools/dotc/typer/VarianceChecker.scala b/compiler/src/dotty/tools/dotc/typer/VarianceChecker.scala index 53646558cf5c..bcfc9288d862 100644 --- a/compiler/src/dotty/tools/dotc/typer/VarianceChecker.scala +++ b/compiler/src/dotty/tools/dotc/typer/VarianceChecker.scala @@ -164,11 +164,11 @@ class VarianceChecker(using Context) { i"\n${hl("enum case")} ${towner.name} requires explicit declaration of $tvar to resolve this issue.\n$example" else "" - i"${varianceLabel(tvar.flags)} $tvar occurs in ${varianceLabel(required)} position in type ${sym.info} of $sym$enumAddendum" + em"${varianceLabel(tvar.flags)} $tvar occurs in ${varianceLabel(required)} position in type ${sym.info} of $sym$enumAddendum" if (migrateTo3 && (sym.owner.isConstructor || sym.ownersIterator.exists(_.isAllOf(ProtectedLocal)))) report.migrationWarning( - s"According to new variance rules, this is no longer accepted; need to annotate with @uncheckedVariance:\n$msg", + msg.prepend("According to new variance rules, this is no longer accepted; need to annotate with @uncheckedVariance\n"), pos) // patch(Span(pos.end), " @scala.annotation.unchecked.uncheckedVariance") // Patch is disabled until two TODOs are solved: diff --git a/compiler/src/dotty/tools/dotc/util/Chars.scala b/compiler/src/dotty/tools/dotc/util/Chars.scala index 471b68d6247e..cde1a63f5293 100644 --- a/compiler/src/dotty/tools/dotc/util/Chars.scala +++ b/compiler/src/dotty/tools/dotc/util/Chars.scala @@ -1,21 +1,20 @@ package dotty.tools.dotc.util import scala.annotation.switch -import java.lang.{Character => JCharacter} -import java.lang.Character.LETTER_NUMBER -import java.lang.Character.LOWERCASE_LETTER -import java.lang.Character.OTHER_LETTER -import java.lang.Character.TITLECASE_LETTER -import java.lang.Character.UPPERCASE_LETTER +import Character.{LETTER_NUMBER, LOWERCASE_LETTER, OTHER_LETTER, TITLECASE_LETTER, UPPERCASE_LETTER} +import Character.{MATH_SYMBOL, OTHER_SYMBOL} +import Character.{isJavaIdentifierPart, isUnicodeIdentifierStart, isUnicodeIdentifierPart} /** Contains constants and classifier methods for characters */ -object Chars { +object Chars: inline val LF = '\u000A' inline 
val FF = '\u000C' inline val CR = '\u000D' inline val SU = '\u001A' + type CodePoint = Int + /** Convert a character digit to an Int according to given base, * -1 if no success */ @@ -59,17 +58,21 @@ object Chars { '0' <= c && c <= '9' || 'A' <= c && c <= 'Z' || 'a' <= c && c <= 'z' /** Can character start an alphanumeric Scala identifier? */ - def isIdentifierStart(c: Char): Boolean = - (c == '_') || (c == '$') || JCharacter.isUnicodeIdentifierStart(c) + def isIdentifierStart(c: Char): Boolean = (c == '_') || (c == '$') || isUnicodeIdentifierStart(c) + def isIdentifierStart(c: CodePoint): Boolean = (c == '_') || (c == '$') || isUnicodeIdentifierStart(c) /** Can character form part of an alphanumeric Scala identifier? */ - def isIdentifierPart(c: Char): Boolean = - (c == '$') || JCharacter.isUnicodeIdentifierPart(c) + def isIdentifierPart(c: Char): Boolean = (c == '$') || isUnicodeIdentifierPart(c) + def isIdentifierPart(c: CodePoint) = (c == '$') || isUnicodeIdentifierPart(c) /** Is character a math or other symbol in Unicode? */ def isSpecial(c: Char): Boolean = { - val chtp = JCharacter.getType(c) - chtp == JCharacter.MATH_SYMBOL.toInt || chtp == JCharacter.OTHER_SYMBOL.toInt + val chtp = Character.getType(c) + chtp == MATH_SYMBOL.toInt || chtp == OTHER_SYMBOL.toInt + } + def isSpecial(codePoint: CodePoint) = { + val chtp = Character.getType(codePoint) + chtp == MATH_SYMBOL.toInt || chtp == OTHER_SYMBOL.toInt } def isValidJVMChar(c: Char): Boolean = @@ -78,15 +81,26 @@ object Chars { def isValidJVMMethodChar(c: Char): Boolean = !(c == '.' || c == ';' || c =='[' || c == '/' || c == '<' || c == '>') - private final val otherLetters = Set[Char]('\u0024', '\u005F') // '$' and '_' - private final val letterGroups = { - import JCharacter._ - Set[Byte](LOWERCASE_LETTER, UPPERCASE_LETTER, OTHER_LETTER, TITLECASE_LETTER, LETTER_NUMBER) - } - def isScalaLetter(ch: Char): Boolean = letterGroups(JCharacter.getType(ch).toByte) || otherLetters(ch) + def isScalaLetter(c: Char): Boolean = + Character.getType(c: @switch) match { + case LOWERCASE_LETTER | UPPERCASE_LETTER | OTHER_LETTER | TITLECASE_LETTER | LETTER_NUMBER => true + case _ => c == '$' || c == '_' + } + def isScalaLetter(c: CodePoint): Boolean = + Character.getType(c: @switch) match { + case LOWERCASE_LETTER | UPPERCASE_LETTER | OTHER_LETTER | TITLECASE_LETTER | LETTER_NUMBER => true + case _ => c == '$' || c == '_' + } /** Can character form part of a Scala operator name? */ - def isOperatorPart(c : Char) : Boolean = (c: @switch) match { + def isOperatorPart(c: Char): Boolean = (c: @switch) match { + case '~' | '!' | '@' | '#' | '%' | + '^' | '*' | '+' | '-' | '<' | + '>' | '?' | ':' | '=' | '&' | + '|' | '/' | '\\' => true + case c => isSpecial(c) + } + def isOperatorPart(c: CodePoint): Boolean = (c: @switch) match { case '~' | '!' | '@' | '#' | '%' | '^' | '*' | '+' | '-' | '<' | '>' | '?' | ':' | '=' | '&' | @@ -95,5 +109,4 @@ object Chars { } /** Would the character be encoded by `NameTransformer.encode`? 
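As a hedged aside on the new `CodePoint` overloads above, a small sketch of why classification by code point matters for characters outside the Basic Multilingual Plane (the demo itself is assumed):

```scala
import dotty.tools.dotc.util.Chars.*

@main def codePointDemo =
  // MATHEMATICAL BOLD CAPITAL A (U+1D400) needs two Chars in a String,
  // so identifier classification has to look at the whole code point.
  val s = new String(Character.toChars(0x1D400))
  val cp: CodePoint = s.codePointAt(0)
  println(s.length)              // 2: stored as a surrogate pair
  println(isIdentifierStart(cp)) // classified via the CodePoint overload
```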
*/ - def willBeEncoded(c : Char) : Boolean = !JCharacter.isJavaIdentifierPart(c) -} + def willBeEncoded(c: Char): Boolean = !isJavaIdentifierPart(c) diff --git a/compiler/src/dotty/tools/dotc/util/GenericHashMap.scala b/compiler/src/dotty/tools/dotc/util/GenericHashMap.scala index fd6518fcc15c..a21a4af37038 100644 --- a/compiler/src/dotty/tools/dotc/util/GenericHashMap.scala +++ b/compiler/src/dotty/tools/dotc/util/GenericHashMap.scala @@ -42,9 +42,10 @@ abstract class GenericHashMap[Key, Value] else 1 << (32 - Integer.numberOfLeadingZeros(n)) /** Remove all elements from this table and set back to initial configuration */ - def clear(): Unit = + def clear(resetToInitial: Boolean): Unit = used = 0 - allocate(roundToPower(initialCapacity)) + if resetToInitial then allocate(roundToPower(initialCapacity)) + else java.util.Arrays.fill(table, null) /** The number of elements in the set */ def size: Int = used diff --git a/compiler/src/dotty/tools/dotc/util/HashSet.scala b/compiler/src/dotty/tools/dotc/util/HashSet.scala index a524dd39a594..a6e1532c804f 100644 --- a/compiler/src/dotty/tools/dotc/util/HashSet.scala +++ b/compiler/src/dotty/tools/dotc/util/HashSet.scala @@ -44,11 +44,10 @@ class HashSet[T](initialCapacity: Int = 8, capacityMultiple: Int = 2) extends Mu else if Integer.bitCount(n) == 1 then n else 1 << (32 - Integer.numberOfLeadingZeros(n)) - /** Remove all elements from this set and set back to initial configuration */ - def clear(): Unit = { + def clear(resetToInitial: Boolean): Unit = used = 0 - allocate(roundToPower(initialCapacity)) - } + if resetToInitial then allocate(roundToPower(initialCapacity)) + else java.util.Arrays.fill(table, null) /** The number of elements in the set */ def size: Int = used diff --git a/compiler/src/dotty/tools/dotc/util/MutableMap.scala b/compiler/src/dotty/tools/dotc/util/MutableMap.scala index ba912a312aea..283e28e7e04f 100644 --- a/compiler/src/dotty/tools/dotc/util/MutableMap.scala +++ b/compiler/src/dotty/tools/dotc/util/MutableMap.scala @@ -13,6 +13,10 @@ abstract class MutableMap[Key, Value] extends ReadOnlyMap[Key, Value]: remove(k) this - def clear(): Unit + /** Remove all bindings from this map. + * @param resetToInitial If true, set back to initial configuration, which includes + * reallocating tables. + */ + def clear(resetToInitial: Boolean = true): Unit def getOrElseUpdate(key: Key, value: => Value): Value diff --git a/compiler/src/dotty/tools/dotc/util/MutableSet.scala b/compiler/src/dotty/tools/dotc/util/MutableSet.scala index 6e3ae7628eb6..9529262fa5ec 100644 --- a/compiler/src/dotty/tools/dotc/util/MutableSet.scala +++ b/compiler/src/dotty/tools/dotc/util/MutableSet.scala @@ -15,7 +15,11 @@ abstract class MutableSet[T] extends ReadOnlySet[T]: /** Remove element `x` from the set */ def -=(x: T): Unit - def clear(): Unit + /** Remove all elements from this set. + * @param resetToInitial If true, set back to initial configuration, which includes + * reallocating tables. 
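A hedged usage sketch of the new `resetToInitial` parameter (the reuse pattern is assumed; the API is as declared in the patch above):

```scala
import dotty.tools.dotc.util.{HashSet, MutableSet}

@main def clearDemo =
  val buffer: MutableSet[String] = HashSet[String]()
  (1 to 10000).foreach(i => buffer += i.toString)
  buffer.clear(resetToInitial = false) // drop entries, keep the grown table
  buffer.clear()                       // default: reallocate at initial capacity
```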
+ */ + def clear(resetToInitial: Boolean = true): Unit def ++= (xs: IterableOnce[T]): Unit = xs.iterator.foreach(this += _) diff --git a/compiler/src/dotty/tools/dotc/util/Spans.scala b/compiler/src/dotty/tools/dotc/util/Spans.scala index baf2cfa121b0..e1487408f36b 100644 --- a/compiler/src/dotty/tools/dotc/util/Spans.scala +++ b/compiler/src/dotty/tools/dotc/util/Spans.scala @@ -86,7 +86,6 @@ object Spans { || containsInner(this, that.end) || containsInner(that, this.start) || containsInner(that, this.end) - || this.start == that.start && this.end == that.end // exact match in one point ) } @@ -182,6 +181,7 @@ object Spans { assert(isSpan) if (this == NoCoord) NoSpan else Span(-1 - encoding) } + override def toString = if isSpan then s"$toSpan" else s"Coord(idx=$toIndex)" } /** An index coordinate */ diff --git a/compiler/src/dotty/tools/dotc/util/Stats.scala b/compiler/src/dotty/tools/dotc/util/Stats.scala index f04957f26400..e9b72015b202 100644 --- a/compiler/src/dotty/tools/dotc/util/Stats.scala +++ b/compiler/src/dotty/tools/dotc/util/Stats.scala @@ -55,15 +55,14 @@ import collection.mutable } def maybeMonitored[T](op: => T)(using Context): T = - if (ctx.settings.YdetailedStats.value && hits.nonEmpty) { + if ctx.settings.YdetailedStats.value then monitored = true try op - finally { - aggregate() - println() - println(hits.toList.sortBy(_._2).map{ case (x, y) => s"$x -> $y" } mkString "\n") - hits.clear() - } - } + finally + if hits.nonEmpty then + aggregate() + println() + println(hits.toList.sortBy(_._2).map{ case (x, y) => s"$x -> $y" } mkString "\n") + hits.clear() else op } diff --git a/compiler/src/dotty/tools/dotc/util/WeakHashSet.scala b/compiler/src/dotty/tools/dotc/util/WeakHashSet.scala index 3c23b181a041..975826a87a37 100644 --- a/compiler/src/dotty/tools/dotc/util/WeakHashSet.scala +++ b/compiler/src/dotty/tools/dotc/util/WeakHashSet.scala @@ -204,7 +204,7 @@ abstract class WeakHashSet[A <: AnyRef](initialCapacity: Int = 8, loadFactor: Do linkedListLoop(null, table(bucket)) } - def clear(): Unit = { + def clear(resetToInitial: Boolean): Unit = { table = new Array[Entry[A] | Null](table.size) threshold = computeThreshold count = 0 diff --git a/compiler/src/dotty/tools/dotc/util/concurrent.scala b/compiler/src/dotty/tools/dotc/util/concurrent.scala new file mode 100644 index 000000000000..2710aae6c9b0 --- /dev/null +++ b/compiler/src/dotty/tools/dotc/util/concurrent.scala @@ -0,0 +1,62 @@ +package dotty.tools.dotc.util +import scala.util.{Try, Failure, Success} +import scala.collection.mutable.ArrayBuffer + +object concurrent: + + class NoCompletion extends RuntimeException + + class Future[T](exec: Executor[T]): + private var result: Option[Try[T]] = None + def force: Try[T] = synchronized { + while result.isEmpty && exec.isAlive do wait(1000 /*ms*/) + result.getOrElse(Failure(NoCompletion())) + } + def complete(r: Try[T]): Unit = synchronized { + result = Some(r) + notifyAll() + } + end Future + + class Executor[T] extends Thread: + private type WorkItem = (Future[T], () => T) + + private var allScheduled = false + private val pending = new ArrayBuffer[WorkItem] + + def schedule(op: () => T): Future[T] = synchronized { + assert(!allScheduled) + val f = Future[T](this) + pending += ((f, op)) + notifyAll() + f + } + + def close(): Unit = synchronized { + allScheduled = true + notifyAll() + } + + private def nextPending(): Option[WorkItem] = synchronized { + while pending.isEmpty && !allScheduled do wait(1000 /*ms*/) + if pending.isEmpty then None + else + val item = 
pending.head + pending.dropInPlace(1) + Some(item) + } + + override def run(): Unit = + while + nextPending() match + case Some((f, op)) => + f.complete(Try(op())) + true + case None => + false + do () + end Executor +end concurrent + + + diff --git a/compiler/src/dotty/tools/io/AbstractFile.scala b/compiler/src/dotty/tools/io/AbstractFile.scala index 29bc764dcd7b..f34fe6f40b9c 100644 --- a/compiler/src/dotty/tools/io/AbstractFile.scala +++ b/compiler/src/dotty/tools/io/AbstractFile.scala @@ -260,8 +260,10 @@ abstract class AbstractFile extends Iterable[AbstractFile] { // a race condition in creating the entry after the failed lookup may throw val path = jpath.resolve(name) - if (isDir) Files.createDirectory(path) - else Files.createFile(path) + try + if (isDir) Files.createDirectory(path) + else Files.createFile(path) + catch case _: FileAlreadyExistsException => () new PlainFile(new File(path)) case lookup => lookup } diff --git a/compiler/src/dotty/tools/package.scala b/compiler/src/dotty/tools/package.scala index 57a58151acc7..f90355b1fa8e 100644 --- a/compiler/src/dotty/tools/package.scala +++ b/compiler/src/dotty/tools/package.scala @@ -1,10 +1,6 @@ package dotty package object tools { - // Ensure this object is already classloaded, since it's only actually used - // when handling stack overflows and every operation (including class loading) - // risks failing. - dotty.tools.dotc.core.handleRecursive val ListOfNil: List[Nil.type] = Nil :: Nil @@ -18,7 +14,7 @@ package object tools { * Flow-typing under explicit nulls will automatically insert many necessary * occurrences of uncheckedNN. */ - inline def uncheckedNN: T = x.asInstanceOf[T] + transparent inline def uncheckedNN: T = x.asInstanceOf[T] inline def toOption: Option[T] = if x == null then None else Some(x.asInstanceOf[T]) @@ -49,4 +45,9 @@ package object tools { val e = if msg == null then AssertionError() else AssertionError("assertion failed: " + msg) e.setStackTrace(Array()) throw e -} + + // Ensure this object is already classloaded, since it's only actually used + // when handling stack overflows and every operation (including class loading) + // risks failing. 
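A hedged usage sketch of the `Executor`/`Future` pair introduced in `util/concurrent.scala` above (the demo itself is assumed): start the worker thread, queue some work, close the queue, then block on the results.

```scala
import scala.util.Success
import dotty.tools.dotc.util.concurrent.*

@main def executorDemo =
  val exec = Executor[Int]()
  exec.start()                                  // Executor is a Thread
  val futures = (1 to 4).map(i => exec.schedule(() => i * i))
  exec.close()                                  // no further scheduling
  exec.join()
  futures.map(_.force).foreach {
    case Success(n) => println(n)
    case failure    => println(s"failed: $failure")
  }
```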
+ dotty.tools.dotc.core.handleRecursive + } diff --git a/compiler/src/dotty/tools/repl/JLineTerminal.scala b/compiler/src/dotty/tools/repl/JLineTerminal.scala index 9da12ae955d1..8e048d786ae1 100644 --- a/compiler/src/dotty/tools/repl/JLineTerminal.scala +++ b/compiler/src/dotty/tools/repl/JLineTerminal.scala @@ -16,7 +16,7 @@ import org.jline.reader.impl.history.DefaultHistory import org.jline.terminal.TerminalBuilder import org.jline.utils.AttributedString -final class JLineTerminal extends java.io.Closeable { +class JLineTerminal extends java.io.Closeable { // import java.util.logging.{Logger, Level} // Logger.getLogger("org.jline").setLevel(Level.FINEST) @@ -30,7 +30,8 @@ final class JLineTerminal extends java.io.Closeable { private def blue(str: String)(using Context) = if (ctx.settings.color.value != "never") Console.BLUE + str + Console.RESET else str - private def prompt(using Context) = blue("\nscala> ") + protected def promptStr = "scala" + private def prompt(using Context) = blue(s"\n$promptStr> ") private def newLinePrompt(using Context) = blue(" | ") /** Blockingly read line from `System.in` diff --git a/compiler/src/dotty/tools/repl/Rendering.scala b/compiler/src/dotty/tools/repl/Rendering.scala index 64e7ab72d3dd..c647ef302bb9 100644 --- a/compiler/src/dotty/tools/repl/Rendering.scala +++ b/compiler/src/dotty/tools/repl/Rendering.scala @@ -3,18 +3,14 @@ package repl import scala.language.unsafeNulls -import java.lang.{ ClassLoader, ExceptionInInitializerError } -import java.lang.reflect.InvocationTargetException - -import dotc.core.Contexts._ -import dotc.core.Denotations.Denotation -import dotc.core.Flags -import dotc.core.Flags._ -import dotc.core.Symbols.{Symbol, defn} -import dotc.core.StdNames.{nme, str} -import dotc.printing.ReplPrinter -import dotc.reporting.Diagnostic -import dotc.transform.ValueClasses +import dotc.*, core.* +import Contexts.*, Denotations.*, Flags.*, NameOps.*, StdNames.*, Symbols.* +import printing.ReplPrinter +import reporting.Diagnostic +import transform.ValueClasses +import util.StackTraceOps.* + +import scala.util.control.NonFatal /** This rendering object uses `ClassLoader`s to accomplish crossing the 4th * wall (i.e. 
fetching back values from the compiled class files put into a @@ -28,10 +24,10 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): import Rendering._ - private var myClassLoader: AbstractFileClassLoader = _ + var myClassLoader: AbstractFileClassLoader = _ /** (value, maxElements, maxCharacters) => String */ - private var myReplStringOf: (Object, Int, Int) => String = _ + var myReplStringOf: (Object, Int, Int) => String = _ /** Class loader used to load compiled code */ private[repl] def classLoader()(using Context) = @@ -131,8 +127,7 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): */ private def rewrapValueClass(sym: Symbol, value: Object)(using Context): Option[Object] = if ValueClasses.isDerivedValueClass(sym) then - val valueClassName = sym.flatName.encode.toString - val valueClass = Class.forName(valueClassName, true, classLoader()) + val valueClass = Class.forName(sym.binaryClassName, true, classLoader()) valueClass.getConstructors.headOption.map(_.newInstance(value)) else Some(value) @@ -148,7 +143,7 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): infoDiagnostic(d.symbol.showUser, d) /** Render value definition result */ - def renderVal(d: Denotation)(using Context): Either[InvocationTargetException, Option[Diagnostic]] = + def renderVal(d: Denotation)(using Context): Either[ReflectiveOperationException, Option[Diagnostic]] = val dcl = d.symbol.showUser def msg(s: String) = infoDiagnostic(s, d) try @@ -156,12 +151,11 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): if d.symbol.is(Flags.Lazy) then Some(msg(dcl)) else valueOf(d.symbol).map(value => msg(s"$dcl = $value")) ) - catch case e: InvocationTargetException => Left(e) + catch case e: ReflectiveOperationException => Left(e) end renderVal /** Force module initialization in the absence of members. */ def forceModule(sym: Symbol)(using Context): Seq[Diagnostic] = - import scala.util.control.NonFatal def load() = val objectName = sym.fullName.encode.toString Class.forName(objectName, true, classLoader()) @@ -169,14 +163,11 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): try load() catch case e: ExceptionInInitializerError => List(renderError(e, sym.denot)) - case NonFatal(e) => List(renderError(InvocationTargetException(e), sym.denot)) + case NonFatal(e) => List(renderError(e, sym.denot)) /** Render the stack trace of the underlying exception. 
*/ - def renderError(ite: InvocationTargetException | ExceptionInInitializerError, d: Denotation)(using Context): Diagnostic = - import dotty.tools.dotc.util.StackTraceOps._ - val cause = ite.getCause match - case e: ExceptionInInitializerError => e.getCause - case e => e + def renderError(thr: Throwable, d: Denotation)(using Context): Diagnostic = + val cause = rootCause(thr) // detect //at repl$.rs$line$2$.(rs$line$2:1) //at repl$.rs$line$2.res1(rs$line$2) @@ -190,7 +181,6 @@ private[repl] class Rendering(parentClassLoader: Option[ClassLoader] = None): private def infoDiagnostic(msg: String, d: Denotation)(using Context): Diagnostic = new Diagnostic.Info(msg, d.symbol.sourcePos) - object Rendering: final val REPL_WRAPPER_NAME_PREFIX = str.REPL_SESSION_LINE @@ -200,3 +190,12 @@ object Rendering: val text = printer.dclText(s) text.mkString(ctx.settings.pageWidth.value, ctx.settings.printLines.value) } + + def rootCause(x: Throwable): Throwable = x match + case _: ExceptionInInitializerError | + _: java.lang.reflect.InvocationTargetException | + _: java.lang.reflect.UndeclaredThrowableException | + _: java.util.concurrent.ExecutionException + if x.getCause != null => + rootCause(x.getCause) + case _ => x diff --git a/compiler/src/dotty/tools/repl/ReplCompiler.scala b/compiler/src/dotty/tools/repl/ReplCompiler.scala index 8db288f50aca..764695e8479b 100644 --- a/compiler/src/dotty/tools/repl/ReplCompiler.scala +++ b/compiler/src/dotty/tools/repl/ReplCompiler.scala @@ -62,8 +62,8 @@ class ReplCompiler extends Compiler: } val rootCtx = super.rootContext.fresh - .setOwner(defn.EmptyPackageClass) .withRootImports + .fresh.setOwner(defn.EmptyPackageClass): Context (state.validObjectIndexes).foldLeft(rootCtx)((ctx, id) => importPreviousRun(id)(using ctx)) } diff --git a/compiler/src/dotty/tools/repl/ReplDriver.scala b/compiler/src/dotty/tools/repl/ReplDriver.scala index 4fab4b119a08..905f4f06de08 100644 --- a/compiler/src/dotty/tools/repl/ReplDriver.scala +++ b/compiler/src/dotty/tools/repl/ReplDriver.scala @@ -37,6 +37,7 @@ import org.jline.reader._ import scala.annotation.tailrec import scala.collection.mutable import scala.jdk.CollectionConverters._ +import scala.util.control.NonFatal import scala.util.Using /** The state of the REPL contains necessary bindings instead of having to have @@ -118,7 +119,7 @@ class ReplDriver(settings: Array[String], private var rootCtx: Context = _ private var shouldStart: Boolean = _ private var compiler: ReplCompiler = _ - private var rendering: Rendering = _ + protected var rendering: Rendering = _ // initialize the REPL session as part of the constructor so that once `run` // is called, we're in business @@ -138,7 +139,7 @@ class ReplDriver(settings: Array[String], * observable outside of the CLI, for this reason, most helper methods are * `protected final` to facilitate testing. 
*/ - final def runUntilQuit(using initialState: State = initialState)(): State = { + def runUntilQuit(using initialState: State = initialState)(): State = { val terminal = new JLineTerminal out.println( @@ -176,24 +177,44 @@ class ReplDriver(settings: Array[String], interpret(ParseResult.complete(input)) } - private def runBody(body: => State): State = rendering.classLoader()(using rootCtx).asContext(withRedirectedOutput(body)) + final def runQuietly(input: String)(using State): State = runBody { + val parsed = ParseResult(input) + interpret(parsed, quiet = true) + } + + protected def runBody(body: => State): State = rendering.classLoader()(using rootCtx).asContext(withRedirectedOutput(body)) // TODO: i5069 final def bind(name: String, value: Any)(using state: State): State = state + /** + * Controls whether the `System.out` and `System.err` streams are set to the provided constructor parameter instance + * of [[java.io.PrintStream]] during the execution of the repl. On by default. + * + * Disabling this can be beneficial when executing a repl instance inside a concurrent environment, for example a + * thread pool (such as the Scala compile server in the Scala Plugin for IntelliJ IDEA). + * + * In such environments, independently executing `System.setOut` and `System.setErr` without any synchronization can + * lead to unpredictable results when restoring the original streams (dependent on the order of execution), leaving + * the Java process in an inconsistent state. + */ + protected def redirectOutput: Boolean = true + // redirecting the output allows us to test `println` in scripted tests private def withRedirectedOutput(op: => State): State = { - val savedOut = System.out - val savedErr = System.err - try { - System.setOut(out) - System.setErr(out) - op - } - finally { - System.setOut(savedOut) - System.setErr(savedErr) - } + if redirectOutput then + val savedOut = System.out + val savedErr = System.err + try { + System.setOut(out) + System.setErr(out) + op + } + finally { + System.setOut(savedOut) + System.setErr(savedErr) + } + else op } private def newRun(state: State, reporter: StoreReporter = newStoreReporter) = { @@ -236,16 +257,16 @@ class ReplDriver(settings: Array[String], unit.tpdTree = tree given Context = state.context.fresh.setCompilationUnit(unit) val srcPos = SourcePosition(file, Span(cursor)) - val (_, completions) = Completion.completions(srcPos) + val completions = try Completion.completions(srcPos)._2 catch case NonFatal(_) => Nil completions.map(_.label).distinct.map(makeCandidate) } .getOrElse(Nil) end completions - private def interpret(res: ParseResult)(using state: State): State = { + protected def interpret(res: ParseResult, quiet: Boolean = false)(using state: State): State = { res match { case parsed: Parsed if parsed.trees.nonEmpty => - compile(parsed, state) + compile(parsed, state, quiet) case SyntaxErrors(_, errs, _) => displayErrors(errs) @@ -263,7 +284,7 @@ class ReplDriver(settings: Array[String], } /** Compile `parsed` trees and evolve `state` in accordance */ - private def compile(parsed: Parsed, istate: State): State = { + private def compile(parsed: Parsed, istate: State, quiet: Boolean = false): State = { def extractNewestWrapper(tree: untpd.Tree): Name = tree match { case PackageDef(_, (obj: untpd.ModuleDef) :: Nil) => obj.name.moduleClassName case _ => nme.NO_NAME @@ -314,9 +335,11 @@ class ReplDriver(settings: Array[String], given Ordering[Diagnostic] = Ordering[(Int, Int, Int)].on(d => (d.pos.line, -d.level, d.pos.column)) - (definitions ++ 
warnings) - .sorted - .foreach(printDiagnostic) + if (!quiet) { + (definitions ++ warnings) + .sorted + .foreach(printDiagnostic) + } updatedState } diff --git a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala index d85d92de5455..7c952dbbe142 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuoteMatcher.scala @@ -121,9 +121,9 @@ object QuoteMatcher { private def withEnv[T](env: Env)(body: Env ?=> T): T = body(using env) - def treeMatch(scrutineeTerm: Tree, patternTerm: Tree)(using Context): Option[Tuple] = + def treeMatch(scrutineeTree: Tree, patternTree: Tree)(using Context): Option[Tuple] = given Env = Map.empty - scrutineeTerm =?= patternTerm + scrutineeTree =?= patternTree /** Check that all trees match with `mtch` and concatenate the results with &&& */ private def matchLists[T](l1: List[T], l2: List[T])(mtch: (T, T) => Matching): Matching = (l1, l2) match { diff --git a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala index f8e439baeb0e..d1806947fa5d 100644 --- a/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala +++ b/compiler/src/scala/quoted/runtime/impl/QuotesImpl.scala @@ -8,15 +8,16 @@ import dotty.tools.dotc.ast.tpd import dotty.tools.dotc.ast.untpd import dotty.tools.dotc.core.Annotations import dotty.tools.dotc.core.Contexts._ -import dotty.tools.dotc.core.Types +import dotty.tools.dotc.core.Decorators._ import dotty.tools.dotc.core.Flags._ import dotty.tools.dotc.core.NameKinds +import dotty.tools.dotc.core.NameOps._ import dotty.tools.dotc.core.StdNames._ -import dotty.tools.dotc.quoted.reflect._ -import dotty.tools.dotc.core.Decorators._ +import dotty.tools.dotc.core.Types import dotty.tools.dotc.NoCompilationUnit - -import dotty.tools.dotc.quoted.{MacroExpansion, PickledQuotes} +import dotty.tools.dotc.quoted.MacroExpansion +import dotty.tools.dotc.quoted.PickledQuotes +import dotty.tools.dotc.quoted.reflect._ import scala.quoted.runtime.{QuoteUnpickler, QuoteMatching} import scala.quoted.runtime.impl.printers._ @@ -242,6 +243,14 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def unapply(cdef: ClassDef): (String, DefDef, List[Tree /* Term | TypeTree */], Option[ValDef], List[Statement]) = val rhs = cdef.rhs.asInstanceOf[tpd.Template] (cdef.name.toString, cdef.constructor, cdef.parents, cdef.self, rhs.body) + + def module(module: Symbol, parents: List[Tree /* Term | TypeTree */], body: List[Statement]): (ValDef, ClassDef) = { + val cls = module.moduleClass + val clsDef = ClassDef(cls, parents, body) + val newCls = Apply(Select(New(TypeIdent(cls)), cls.primaryConstructor), Nil) + val modVal = ValDef(module, Some(newCls)) + (modVal, clsDef) + } end ClassDef given ClassDefMethods: ClassDefMethods with @@ -298,7 +307,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler object ValDef extends ValDefModule: def apply(symbol: Symbol, rhs: Option[Term]): ValDef = - tpd.ValDef(symbol.asTerm, xCheckMacroedOwners(xCheckMacroValidExpr(rhs), symbol).getOrElse(tpd.EmptyTree)) + withDefaultPos(tpd.ValDef(symbol.asTerm, xCheckMacroedOwners(xCheckMacroValidExpr(rhs), symbol).getOrElse(tpd.EmptyTree))) def copy(original: Tree)(name: String, tpt: TypeTree, rhs: Option[Term]): ValDef = tpd.cpy.ValDef(original)(name.toTermName, tpt, xCheckMacroedOwners(xCheckMacroValidExpr(rhs), original.symbol).getOrElse(tpd.EmptyTree)) 
def unapply(vdef: ValDef): (String, TypeTree, Option[Term]) = @@ -1474,7 +1483,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler object Bind extends BindModule: def apply(sym: Symbol, pattern: Tree): Bind = - tpd.Bind(sym, pattern) + withDefaultPos(tpd.Bind(sym, pattern)) def copy(original: Tree)(name: String, pattern: Tree): Bind = withDefaultPos(tpd.cpy.Bind(original)(name.toTermName, pattern)) def unapply(pattern: Bind): (String, Tree) = @@ -2395,7 +2404,13 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler object Implicits extends ImplicitsModule: def search(tpe: TypeRepr): ImplicitSearchResult = - ctx.typer.inferImplicitArg(tpe, Position.ofMacroExpansion.span) + import tpd.TreeOps + val implicitTree = ctx.typer.inferImplicitArg(tpe, Position.ofMacroExpansion.span) + // Make sure that we do not have any uninstantiated type variables. + // See tests/pos-macros/i16636. + // See tests/pos-macros/exprSummonWithTypeVar with -Xcheck-macros. + dotc.typer.Inferencing.fullyDefinedType(implicitTree.tpe, "", implicitTree) + implicitTree end Implicits type ImplicitSearchResult = Tree @@ -2481,6 +2496,21 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler for sym <- decls(cls) do cls.enter(sym) cls + def newModule(owner: Symbol, name: String, modFlags: Flags, clsFlags: Flags, parents: List[TypeRepr], decls: Symbol => List[Symbol], privateWithin: Symbol): Symbol = + assert(parents.nonEmpty && !parents.head.typeSymbol.is(dotc.core.Flags.Trait), "First parent must be a class") + val mod = dotc.core.Symbols.newNormalizedModuleSymbol( + owner, + name.toTermName, + modFlags | dotc.core.Flags.ModuleValCreationFlags, + clsFlags | dotc.core.Flags.ModuleClassCreationFlags, + parents, + dotc.core.Scopes.newScope, + privateWithin) + val cls = mod.moduleClass.asClass + cls.enter(dotc.core.Symbols.newConstructor(cls, dotc.core.Flags.Synthetic, Nil, Nil)) + for sym <- decls(cls) do cls.enter(sym) + mod + def newMethod(owner: Symbol, name: String, tpe: TypeRepr): Symbol = newMethod(owner, name, tpe, Flags.EmptyFlags, noSymbol) def newMethod(owner: Symbol, name: String, tpe: TypeRepr, flags: Flags, privateWithin: Symbol): Symbol = @@ -2490,6 +2520,9 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def newBind(owner: Symbol, name: String, flags: Flags, tpe: TypeRepr): Symbol = dotc.core.Symbols.newSymbol(owner, name.toTermName, flags | Case, tpe) def noSymbol: Symbol = dotc.core.Symbols.NoSymbol + + def freshName(prefix: String): String = + NameKinds.MacroNames.fresh(prefix.toTermName).toString end Symbol given SymbolMethods: SymbolMethods with @@ -2513,6 +2546,8 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def name: String = self.denot.name.toString def fullName: String = self.denot.fullName.toString + def info: TypeRepr = self.denot.info + def pos: Option[Position] = if self.exists then Some(self.sourcePos) else None @@ -2619,13 +2654,15 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler case sym if sym.isType => sym.asType }.toList - def memberType(name: String): Symbol = typeMember(name) + def memberType(name: String): Symbol = + self.typeRef.decls.find(sym => sym.name == name.toTypeName) def typeMember(name: String): Symbol = - self.unforcedDecls.find(sym => sym.name == name.toTypeName) + lookupPrefix.member(name.toTypeName).symbol - def memberTypes: List[Symbol] = typeMembers + def memberTypes: 
List[Symbol] = + self.typeRef.decls.filter(_.isType) def typeMembers: List[Symbol] = - self.unforcedDecls.filter(_.isType) + lookupPrefix.typeMembers.map(_.symbol).toList def declarations: List[Symbol] = self.typeRef.info.decls.toList @@ -2654,7 +2691,9 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def show(using printer: Printer[Symbol]): String = printer.show(self) - def asQuotes: Nested = new QuotesImpl(using ctx.withOwner(self)) + def asQuotes: Nested = + assert(self.ownersIterator.contains(ctx.owner), s"$self is not owned by ${ctx.owner}") + new QuotesImpl(using ctx.withOwner(self)) end extension @@ -2765,6 +2804,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def Invisible: Flags = dotc.core.Flags.Invisible def JavaDefined: Flags = dotc.core.Flags.JavaDefined def JavaStatic: Flags = dotc.core.Flags.JavaStatic + def JavaAnnotation: Flags = dotc.core.Flags.JavaAnnotation def Lazy: Flags = dotc.core.Flags.Lazy def Local: Flags = dotc.core.Flags.Local def Macro: Flags = dotc.core.Flags.Macro @@ -2784,7 +2824,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler def Scala2x: Flags = dotc.core.Flags.Scala2x def Sealed: Flags = dotc.core.Flags.Sealed def StableRealizable: Flags = dotc.core.Flags.StableRealizable - def Static: Flags = dotc.core.Flags.JavaStatic + @deprecated("Use JavaStatic instead", "3.3.0") def Static: Flags = dotc.core.Flags.JavaStatic def Synthetic: Flags = dotc.core.Flags.Synthetic def Trait: Flags = dotc.core.Flags.Trait def Transparent: Flags = dotc.core.Flags.Transparent @@ -3053,14 +3093,14 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler new TypeImpl(tree, SpliceScope.getCurrent).asInstanceOf[scala.quoted.Type[T]] object ExprMatch extends ExprMatchModule: - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: scala.quoted.Expr[Any])(using pattern: scala.quoted.Expr[Any]): Option[Tup] = + def unapply[TypeBindings, Tup <: Tuple](scrutinee: scala.quoted.Expr[Any])(using pattern: scala.quoted.Expr[Any]): Option[Tup] = val scrutineeTree = reflect.asTerm(scrutinee) val patternTree = reflect.asTerm(pattern) treeMatch(scrutineeTree, patternTree).asInstanceOf[Option[Tup]] end ExprMatch object TypeMatch extends TypeMatchModule: - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: scala.quoted.Type[?])(using pattern: scala.quoted.Type[?]): Option[Tup] = + def unapply[TypeBindings, Tup <: Tuple](scrutinee: scala.quoted.Type[?])(using pattern: scala.quoted.Type[?]): Option[Tup] = val scrutineeTree = reflect.TypeTree.of(using scrutinee) val patternTree = reflect.TypeTree.of(using pattern) treeMatch(scrutineeTree, patternTree).asInstanceOf[Option[Tup]] @@ -3090,7 +3130,7 @@ class QuotesImpl private (using val ctx: Context) extends Quotes, QuoteUnpickler if typeHoles.isEmpty then ctx else val ctx1 = ctx.fresh.setFreshGADTBounds.addMode(dotc.core.Mode.GadtConstraintInference) - ctx1.gadt.addToConstraint(typeHoles) + ctx1.gadtState.addToConstraint(typeHoles) ctx1 val matchings = QuoteMatcher.treeMatch(scrutinee, pat1)(using ctx1) diff --git a/compiler/src/scala/quoted/runtime/impl/printers/Extractors.scala b/compiler/src/scala/quoted/runtime/impl/printers/Extractors.scala index 0bea8f0ab643..c229338ad228 100644 --- a/compiler/src/scala/quoted/runtime/impl/printers/Extractors.scala +++ b/compiler/src/scala/quoted/runtime/impl/printers/Extractors.scala @@ -57,7 +57,6 @@ object Extractors { if (flags.is(Flags.Scala2x)) flagList 
+= "Flags.Scala2x" if (flags.is(Flags.Sealed)) flagList += "Flags.Sealed" if (flags.is(Flags.StableRealizable)) flagList += "Flags.StableRealizable" - if (flags.is(Flags.Static)) flagList += "Flags.javaStatic" if (flags.is(Flags.Synthetic)) flagList += "Flags.Synthetic" if (flags.is(Flags.Trait)) flagList += "Flags.Trait" if (flags.is(Flags.Transparent)) flagList += "Flags.Transparent" diff --git a/compiler/src/scala/quoted/runtime/impl/printers/SourceCode.scala b/compiler/src/scala/quoted/runtime/impl/printers/SourceCode.scala index 5d61902fbedd..e934c1930163 100644 --- a/compiler/src/scala/quoted/runtime/impl/printers/SourceCode.scala +++ b/compiler/src/scala/quoted/runtime/impl/printers/SourceCode.scala @@ -57,7 +57,6 @@ object SourceCode { if (flags.is(Flags.Scala2x)) flagList += "scala2x" if (flags.is(Flags.Sealed)) flagList += "sealed" if (flags.is(Flags.StableRealizable)) flagList += "stableRealizable" - if (flags.is(Flags.Static)) flagList += "javaStatic" if (flags.is(Flags.Synthetic)) flagList += "synthetic" if (flags.is(Flags.Trait)) flagList += "trait" if (flags.is(Flags.Transparent)) flagList += "transparent" diff --git a/compiler/test-resources/repl/i1370 b/compiler/test-resources/repl/i1370 index 6582e03b6539..4bd92b4d5f83 100644 --- a/compiler/test-resources/repl/i1370 +++ b/compiler/test-resources/repl/i1370 @@ -1,5 +1,5 @@ scala> object Lives { class Private { def foo1: Any = new Private.C1; def foo2: Any = new Private.C2 }; object Private { class C1 private {}; private class C2 {} } } --- Error: ---------------------------------------------------------------------- +-- [E173] Reference Error: ----------------------------------------------------- 1 | object Lives { class Private { def foo1: Any = new Private.C1; def foo2: Any = new Private.C2 }; object Private { class C1 private {}; private class C2 {} } } | ^^^^^^^^^^ |constructor C1 cannot be accessed as a member of Lives.Private.C1 from class Private. 
diff --git a/compiler/test-resources/repl/i15493 b/compiler/test-resources/repl/i15493 index f543f5c1d0f7..670cf8ebcbd2 100644 --- a/compiler/test-resources/repl/i15493 +++ b/compiler/test-resources/repl/i15493 @@ -142,3 +142,8 @@ val res33: Outer.Foo = Outer$Foo@17 scala> res33.toString val res34: String = Outer$Foo@17 +scala> Vector.unapplySeq(Vector(2)) +val res35: scala.collection.SeqFactory.UnapplySeqWrapper[Int] = scala.collection.SeqFactory$UnapplySeqWrapper@df507bfd + +scala> new scala.concurrent.duration.DurationInt(5) +val res36: scala.concurrent.duration.package.DurationInt = scala.concurrent.duration.package$DurationInt@5 diff --git a/compiler/test-resources/repl/i4184 b/compiler/test-resources/repl/i4184 index 2c4eb7d12a6f..06b2c81ece21 100644 --- a/compiler/test-resources/repl/i4184 +++ b/compiler/test-resources/repl/i4184 @@ -5,8 +5,11 @@ scala> object bar { class Foo } scala> implicit def eqFoo: CanEqual[foo.Foo, foo.Foo] = CanEqual.derived def eqFoo: CanEqual[foo.Foo, foo.Foo] scala> object Bar { new foo.Foo == new bar.Foo } --- Error: ---------------------------------------------------------------------- +-- [E172] Type Error: ---------------------------------------------------------- 1 | object Bar { new foo.Foo == new bar.Foo } | ^^^^^^^^^^^^^^^^^^^^^^^^^^ - | Values of types foo.Foo and bar.Foo cannot be compared with == or != -1 error found + | Values of types foo.Foo and bar.Foo² cannot be compared with == or != + | + | where: Foo is a class in object foo + | Foo² is a class in object bar +1 error found \ No newline at end of file diff --git a/compiler/test-resources/type-printer/source-compatible b/compiler/test-resources/type-printer/source-compatible new file mode 100644 index 000000000000..d0773a11a795 --- /dev/null +++ b/compiler/test-resources/type-printer/source-compatible @@ -0,0 +1,17 @@ +scala> case class Bag() extends reflect.Selectable +// defined case class Bag +scala> val m = new Bag { val f = 23; def g = 47; def h(i: Int): Int = i; var i = 101; type N = Int; val l = List(42); def p[T](t: T) = t.toString() } +val m: + Bag{ + val f: Int; def g: Int; def h(i: Int): Int; val i: Int; + def i_=(x$1: Int): Unit; type N = Int; val l: List[Int]; + def p[T](t: T): String + } = Bag() +scala> type t = Bag { val f: Int; def g: Int; def h(i: Int): Int; val i: Int; def i_=(x$1: Int): Unit; type N = Int; val l: List[Int]; val s: String @unchecked } +// defined alias type t + = + Bag{ + val f: Int; def g: Int; def h(i: Int): Int; val i: Int; + def i_=(x$1: Int): Unit; type N = Int; val l: List[Int]; + val s: String @unchecked + } diff --git a/compiler/test/dotc/comptest.scala b/compiler/test/dotc/comptest.scala index bd0d800e641c..fb53f561a94d 100644 --- a/compiler/test/dotc/comptest.scala +++ b/compiler/test/dotc/comptest.scala @@ -12,6 +12,7 @@ object comptest extends ParallelTesting { def isInteractive = true def testFilter = Nil def updateCheckFiles: Boolean = false + def failedTests = None val posDir = "./tests/pos/" val negDir = "./tests/neg/" diff --git a/compiler/test/dotc/pos-test-pickling.blacklist b/compiler/test/dotc/pos-test-pickling.blacklist index a7d8778d4c61..cdbad2160f2a 100644 --- a/compiler/test/dotc/pos-test-pickling.blacklist +++ b/compiler/test/dotc/pos-test-pickling.blacklist @@ -45,6 +45,7 @@ i9999.scala i6505.scala i15158.scala i15155.scala +i15827.scala # Opaque type i5720.scala @@ -84,7 +85,6 @@ boxmap-paper.scala # Function types print differnt after unpickling since test mispredicts Feature.preFundsEnabled caps-universal.scala - # GADT 
cast applied to singleton type difference i4176-gadt.scala diff --git a/compiler/test/dotc/run-lazy-vals-tests.allowlist b/compiler/test/dotc/run-lazy-vals-tests.allowlist index 98973dc2893d..361795bcc5fd 100644 --- a/compiler/test/dotc/run-lazy-vals-tests.allowlist +++ b/compiler/test/dotc/run-lazy-vals-tests.allowlist @@ -38,7 +38,6 @@ null-lazy-val.scala patmatch-classtag.scala priorityQueue.scala serialization-new-legacy.scala -serialization-new.scala singletons.scala statics.scala stream_flatmap_odds.scala diff --git a/compiler/test/dotty/Properties.scala b/compiler/test/dotty/Properties.scala index f4e0ed5f615f..cc47303d5468 100644 --- a/compiler/test/dotty/Properties.scala +++ b/compiler/test/dotty/Properties.scala @@ -13,6 +13,10 @@ object Properties { prop == null || prop == "TRUE" } + /** If property is unset or FALSE we consider it `false` */ + private def propIsTrue(name: String): Boolean = + sys.props.getOrElse(name, "FALSE") == "TRUE" + /** Are we running on the CI? */ val isRunByCI: Boolean = sys.env.isDefinedAt("DOTTY_CI_RUN") || sys.env.isDefinedAt("DRONE") // TODO remove this when we drop Drone @@ -30,9 +34,11 @@ object Properties { */ val testsFilter: List[String] = sys.props.get("dotty.tests.filter").fold(Nil)(_.split(',').toList) + /** Run only failed tests */ + val rerunFailed: Boolean = propIsTrue("dotty.tests.rerunFailed") + /** Tests should override the checkfiles with the current output */ - val testsUpdateCheckfile: Boolean = - sys.props.getOrElse("dotty.tests.updateCheckfiles", "FALSE") == "TRUE" + val testsUpdateCheckfile: Boolean = propIsTrue("dotty.tests.updateCheckfiles") /** When set, the run tests are only compiled - not run, a warning will be * issued @@ -85,6 +91,9 @@ object Properties { /** jline-reader jar */ def jlineReader: String = sys.props("dotty.tests.classes.jlineReader") + /** scalajs-javalib jar */ + def scalaJSJavalib: String = sys.props("dotty.tests.classes.scalaJSJavalib") + /** scalajs-library jar */ def scalaJSLibrary: String = sys.props("dotty.tests.classes.scalaJSLibrary") } diff --git a/compiler/test/dotty/tools/backend/jvm/DottyBytecodeTests.scala b/compiler/test/dotty/tools/backend/jvm/DottyBytecodeTests.scala index 2c618ea91e96..71bf530fcda5 100644 --- a/compiler/test/dotty/tools/backend/jvm/DottyBytecodeTests.scala +++ b/compiler/test/dotty/tools/backend/jvm/DottyBytecodeTests.scala @@ -597,7 +597,7 @@ class DottyBytecodeTests extends DottyBytecodeTest { val clsIn = dir.lookupName("Test.class", directory = false).input val clsNode = loadClassNode(clsIn) val method = getMethod(clsNode, "test") - assertEquals(88, instructionsFromMethod(method).size) + assertEquals(23, instructionsFromMethod(method).size) } } diff --git a/compiler/test/dotty/tools/backend/jvm/InlineBytecodeTests.scala b/compiler/test/dotty/tools/backend/jvm/InlineBytecodeTests.scala index ea9009de1d9e..33e898718b33 100644 --- a/compiler/test/dotty/tools/backend/jvm/InlineBytecodeTests.scala +++ b/compiler/test/dotty/tools/backend/jvm/InlineBytecodeTests.scala @@ -600,13 +600,7 @@ class InlineBytecodeTests extends DottyBytecodeTest { val instructions = instructionsFromMethod(fun) val expected = // TODO room for constant folding List( - Op(ICONST_1), - VarOp(ISTORE, 1), - Op(ICONST_2), - VarOp(ILOAD, 1), - Op(IADD), - Op(ICONST_3), - Op(IADD), + IntOp(BIPUSH, 6), Op(IRETURN), ) assert(instructions == expected, diff --git a/compiler/test/dotty/tools/backend/jvm/LabelBytecodeTests.scala b/compiler/test/dotty/tools/backend/jvm/LabelBytecodeTests.scala new file mode 100644 
index 000000000000..aea567b87f91 --- /dev/null +++ b/compiler/test/dotty/tools/backend/jvm/LabelBytecodeTests.scala @@ -0,0 +1,166 @@ +package dotty.tools.backend.jvm + +import scala.language.unsafeNulls + +import org.junit.Assert._ +import org.junit.Test + +import scala.tools.asm +import asm._ +import asm.tree._ + +import scala.tools.asm.Opcodes +import scala.jdk.CollectionConverters._ +import Opcodes._ + +class LabelBytecodeTests extends DottyBytecodeTest { + import ASMConverters._ + + @Test def localLabelBreak = { + testLabelBytecodeEquals( + """val local = boundary.Label[Long]() + |try break(5L)(using local) + |catch case ex: boundary.Break[Long] @unchecked => + | if ex.label eq local then ex.value + | else throw ex + """.stripMargin, + "Long", + Ldc(LDC, 5), + Op(LRETURN) + ) + } + + @Test def simpleBoundaryBreak = { + testLabelBytecodeEquals( + """boundary: l ?=> + | break(2)(using l) + """.stripMargin, + "Int", + Op(ICONST_2), + Op(IRETURN) + ) + + testLabelBytecodeEquals( + """boundary: + | break(3) + """.stripMargin, + "Int", + Op(ICONST_3), + Op(IRETURN) + ) + + testLabelBytecodeEquals( + """boundary: + | break() + """.stripMargin, + "Unit", + Op(RETURN) + ) + } + + @Test def labelExtraction = { + // Test extra Inlined around the label + testLabelBytecodeEquals( + """boundary: + | break(2)(using summon[boundary.Label[Int]]) + """.stripMargin, + "Int", + Op(ICONST_2), + Op(IRETURN) + ) + + // Test extra Block around the label + testLabelBytecodeEquals( + """boundary: l ?=> + | break(2)(using { l }) + """.stripMargin, + "Int", + Op(ICONST_2), + Op(IRETURN) + ) + } + + @Test def boundaryLocalBreak = { + testLabelBytecodeExpect( + """val x: Boolean = true + |boundary[Unit]: + | var i = 0 + | while true do + | i += 1 + | if i > 10 then break() + """.stripMargin, + "Unit", + !throws(_) + ) + } + + @Test def boundaryNonLocalBreak = { + testLabelBytecodeExpect( + """boundary[Unit]: + | nonLocalBreak() + """.stripMargin, + "Unit", + throws + ) + + testLabelBytecodeExpect( + """boundary[Unit]: + | def f() = break() + | f() + """.stripMargin, + "Unit", + throws + ) + } + + @Test def boundaryLocalAndNonLocalBreak = { + testLabelBytecodeExpect( + """boundary[Unit]: l ?=> + | break() + | nonLocalBreak() + """.stripMargin, + "Unit", + throws + ) + } + + private def throws(instructions: List[Instruction]): Boolean = + instructions.exists { + case Op(ATHROW) => true + case _ => false + } + + private def testLabelBytecodeEquals(code: String, tpe: String, expected: Instruction*): Unit = + checkLabelBytecodeInstructions(code, tpe) { instructions => + val expectedList = expected.toList + assert(instructions == expectedList, + "`test` was not properly generated\n" + diffInstructions(instructions, expectedList)) + } + + private def testLabelBytecodeExpect(code: String, tpe: String, expected: List[Instruction] => Boolean): Unit = + checkLabelBytecodeInstructions(code, tpe) { instructions => + assert(expected(instructions), + "`test` was not properly generated\n" + instructions) + } + + private def checkLabelBytecodeInstructions(code: String, tpe: String)(checkOutput: List[Instruction] => Unit): Unit = { + val source = + s"""import scala.util.boundary, boundary.break + |class Test: + | def test: $tpe = { + | ${code.linesIterator.toList.mkString("", "\n ", "")} + | } + | def nonLocalBreak[T](value: T)(using boundary.Label[T]): Nothing = break(value) + | def nonLocalBreak()(using boundary.Label[Unit]): Nothing = break(()) + """.stripMargin + + checkBCode(source) { dir => + val clsIn = 
dir.lookupName("Test.class", directory = false).input + val clsNode = loadClassNode(clsIn) + val method = getMethod(clsNode, "test") + + checkOutput(instructionsFromMethod(method)) + } + } + +} diff --git a/compiler/test/dotty/tools/dotc/BootstrappedOnlyCompilationTests.scala b/compiler/test/dotty/tools/dotc/BootstrappedOnlyCompilationTests.scala index cce23cb5c9a6..e9d0e26f33b0 100644 --- a/compiler/test/dotty/tools/dotc/BootstrappedOnlyCompilationTests.scala +++ b/compiler/test/dotty/tools/dotc/BootstrappedOnlyCompilationTests.scala @@ -10,6 +10,7 @@ import org.junit.Assume._ import org.junit.experimental.categories.Category import scala.concurrent.duration._ +import reporting.TestReporter import vulpix._ import java.nio.file._ @@ -35,6 +36,12 @@ class BootstrappedOnlyCompilationTests { ).checkCompile() } + @Test def posWithCompilerCC: Unit = + implicit val testGroup: TestGroup = TestGroup("compilePosWithCompilerCC") + aggregateTests( + compileDir("tests/pos-with-compiler-cc/dotc", withCompilerOptions.and("-language:experimental.captureChecking")) + ).checkCompile() + @Test def posWithCompiler: Unit = { implicit val testGroup: TestGroup = TestGroup("compilePosWithCompiler") aggregateTests( @@ -214,6 +221,7 @@ object BootstrappedOnlyCompilationTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests implicit val summaryReport: SummaryReporting = new SummaryReport @AfterClass def tearDown(): Unit = { diff --git a/compiler/test/dotty/tools/dotc/CompilationTests.scala b/compiler/test/dotty/tools/dotc/CompilationTests.scala index 261e6af21927..8d9e28d415c1 100644 --- a/compiler/test/dotty/tools/dotc/CompilationTests.scala +++ b/compiler/test/dotty/tools/dotc/CompilationTests.scala @@ -16,6 +16,7 @@ import scala.jdk.CollectionConverters._ import scala.util.matching.Regex import scala.concurrent.duration._ import TestSources.sources +import reporting.TestReporter import vulpix._ class CompilationTests { @@ -43,8 +44,8 @@ class CompilationTests { compileFilesInDir("tests/pos-custom-args/captures", defaultOptions.and("-language:experimental.captureChecking")), compileFilesInDir("tests/pos-custom-args/erased", defaultOptions.and("-language:experimental.erasedDefinitions")), compileFilesInDir("tests/pos", defaultOptions.and("-Ysafe-init")), - // Run tests for experimental lightweight lazy vals - compileFilesInDir("tests/pos", defaultOptions.and("-Ysafe-init", "-Ylightweight-lazy-vals"), FileFilter.include(TestSources.posLazyValsAllowlist)), + // Run tests for legacy lazy vals + compileFilesInDir("tests/pos", defaultOptions.and("-Ysafe-init", "-Ylegacy-lazy-vals", "-Ycheck-constraint-deps"), FileFilter.include(TestSources.posLazyValsAllowlist)), compileFilesInDir("tests/pos-deep-subtype", allowDeepSubtypes), compileFilesInDir("tests/pos-custom-args/no-experimental", defaultOptions.and("-Yno-experimental")), compileDir("tests/pos-special/java-param-names", defaultOptions.withJavacOnlyOptions("-parameters")), @@ -82,6 +83,7 @@ class CompilationTests { compileFile("tests/rewrites/i9632.scala", defaultOptions.and("-indent", "-rewrite")), compileFile("tests/rewrites/i11895.scala", defaultOptions.and("-indent", "-rewrite")), compileFile("tests/rewrites/i12340.scala", unindentOptions.and("-rewrite")), + compileFile("tests/rewrites/i17187.scala", unindentOptions.and("-rewrite")), ).checkRewrites() } @@ -141,21 +143,15 @@ class 
CompilationTests { compileFilesInDir("tests/neg-custom-args/erased", defaultOptions.and("-language:experimental.erasedDefinitions")), compileFilesInDir("tests/neg-custom-args/allow-double-bindings", allowDoubleBindings), compileFilesInDir("tests/neg-custom-args/allow-deep-subtypes", allowDeepSubtypes), + compileFilesInDir("tests/neg-custom-args/feature", defaultOptions.and("-Xfatal-warnings", "-feature")), compileFilesInDir("tests/neg-custom-args/no-experimental", defaultOptions.and("-Yno-experimental")), compileFilesInDir("tests/neg-custom-args/captures", defaultOptions.and("-language:experimental.captureChecking")), - compileDir("tests/neg-custom-args/impl-conv", defaultOptions.and("-Xfatal-warnings", "-feature")), - compileDir("tests/neg-custom-args/i13946", defaultOptions.and("-Xfatal-warnings", "-feature")), compileFile("tests/neg-custom-args/avoid-warn-deprecation.scala", defaultOptions.and("-Xfatal-warnings", "-feature")), - compileFile("tests/neg-custom-args/implicit-conversions.scala", defaultOptions.and("-Xfatal-warnings", "-feature")), - compileFile("tests/neg-custom-args/implicit-conversions-old.scala", defaultOptions.and("-Xfatal-warnings", "-feature")), compileFile("tests/neg-custom-args/i3246.scala", scala2CompatMode), compileFile("tests/neg-custom-args/overrideClass.scala", scala2CompatMode), compileFile("tests/neg-custom-args/ovlazy.scala", scala2CompatMode.and("-Xfatal-warnings")), compileFile("tests/neg-custom-args/newline-braces.scala", scala2CompatMode.and("-Xfatal-warnings")), compileFile("tests/neg-custom-args/autoTuplingTest.scala", defaultOptions.andLanguageFeature("noAutoTupling")), - compileFile("tests/neg-custom-args/nopredef.scala", defaultOptions.and("-Yno-predef")), - compileFile("tests/neg-custom-args/noimports.scala", defaultOptions.and("-Yno-imports")), - compileFile("tests/neg-custom-args/noimports2.scala", defaultOptions.and("-Yno-imports")), compileFile("tests/neg-custom-args/i1650.scala", allowDeepSubtypes), compileFile("tests/neg-custom-args/i3882.scala", allowDeepSubtypes), compileFile("tests/neg-custom-args/i4372.scala", allowDeepSubtypes), @@ -164,6 +160,7 @@ class CompilationTests { compileFile("tests/neg-custom-args/i9517.scala", defaultOptions.and("-Xprint-types")), compileFile("tests/neg-custom-args/i11637.scala", defaultOptions.and("-explain")), compileFile("tests/neg-custom-args/i15575.scala", defaultOptions.and("-explain")), + compileFile("tests/neg-custom-args/i16601a.scala", defaultOptions.and("-explain")), compileFile("tests/neg-custom-args/interop-polytypes.scala", allowDeepSubtypes.and("-Yexplicit-nulls")), compileFile("tests/neg-custom-args/conditionalWarnings.scala", allowDeepSubtypes.and("-deprecation").and("-Xfatal-warnings")), compileFilesInDir("tests/neg-custom-args/isInstanceOf", allowDeepSubtypes and "-Xfatal-warnings"), @@ -188,11 +185,11 @@ class CompilationTests { compileFile("tests/neg-custom-args/matchable.scala", defaultOptions.and("-Xfatal-warnings", "-source", "future")), compileFile("tests/neg-custom-args/i7314.scala", defaultOptions.and("-Xfatal-warnings", "-source", "future")), compileFile("tests/neg-custom-args/capt-wf.scala", defaultOptions.and("-language:experimental.captureChecking", "-Xfatal-warnings")), - compileFile("tests/neg-custom-args/feature-shadowing.scala", defaultOptions.and("-Xfatal-warnings", "-feature")), compileDir("tests/neg-custom-args/hidden-type-errors", defaultOptions.and("-explain")), compileFile("tests/neg-custom-args/i13026.scala", defaultOptions.and("-print-lines")), 
compileFile("tests/neg-custom-args/i13838.scala", defaultOptions.and("-Ximplicit-search-limit", "1000")), compileFile("tests/neg-custom-args/jdk-9-app.scala", defaultOptions.and("-release:8")), + compileFile("tests/neg-custom-args/i10994.scala", defaultOptions.and("-source", "future")), ).checkExpectedErrors() } @@ -217,9 +214,9 @@ class CompilationTests { compileDir("tests/run-custom-args/Xmacro-settings/compileTimeEnv", defaultOptions.and("-Xmacro-settings:a,b=1,c.b.a=x.y.z=1,myLogger.level=INFO")), compileFilesInDir("tests/run-custom-args/captures", allowDeepSubtypes.and("-language:experimental.captureChecking")), compileFilesInDir("tests/run-deep-subtype", allowDeepSubtypes), - compileFilesInDir("tests/run", defaultOptions.and("-Ysafe-init"), FileFilter.exclude("serialization-new.scala")), - // Run tests for experimental lightweight lazy vals and stable lazy vals. - compileFilesInDir("tests/run", defaultOptions.and("-Ysafe-init", "-Ylightweight-lazy-vals"), FileFilter.include(TestSources.runLazyValsAllowlist)), + compileFilesInDir("tests/run", defaultOptions.and("-Ysafe-init")), + // Run tests for legacy lazy vals. + compileFilesInDir("tests/run", defaultOptions.and("-Ysafe-init", "-Ylegacy-lazy-vals", "-Ycheck-constraint-deps"), FileFilter.include(TestSources.runLazyValsAllowlist)), ).checkRuns() } @@ -241,7 +238,8 @@ class CompilationTests { ).checkCompile() } - @Test def recheck: Unit = + //@Test disabled in favor of posWithCompilerCC to save time. + def recheck: Unit = given TestGroup = TestGroup("recheck") aggregateTests( compileFilesInDir("tests/new", recheckOptions), @@ -317,6 +315,7 @@ object CompilationTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests implicit val summaryReport: SummaryReporting = new SummaryReport @AfterClass def tearDown(): Unit = { diff --git a/compiler/test/dotty/tools/dotc/FromTastyTests.scala b/compiler/test/dotty/tools/dotc/FromTastyTests.scala index 2684a47b870c..1d46cbbce95c 100644 --- a/compiler/test/dotty/tools/dotc/FromTastyTests.scala +++ b/compiler/test/dotty/tools/dotc/FromTastyTests.scala @@ -5,6 +5,7 @@ package dotc import scala.language.unsafeNulls import org.junit.{AfterClass, Test} +import reporting.TestReporter import vulpix._ import java.io.{File => JFile} @@ -48,6 +49,7 @@ object FromTastyTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests implicit val summaryReport: SummaryReporting = new SummaryReport @AfterClass def tearDown(): Unit = { diff --git a/compiler/test/dotty/tools/dotc/IdempotencyTests.scala b/compiler/test/dotty/tools/dotc/IdempotencyTests.scala index 84b3f1f8a48f..b515ebb05f96 100644 --- a/compiler/test/dotty/tools/dotc/IdempotencyTests.scala +++ b/compiler/test/dotty/tools/dotc/IdempotencyTests.scala @@ -12,6 +12,7 @@ import org.junit.{AfterClass, Test} import org.junit.experimental.categories.Category import scala.concurrent.duration._ +import reporting.TestReporter import vulpix._ @@ -76,6 +77,7 @@ object IdempotencyTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests implicit 
val summaryReport: SummaryReporting = new SummaryReport @AfterClass def tearDown(): Unit = { diff --git a/compiler/test/dotty/tools/dotc/SettingsTests.scala b/compiler/test/dotty/tools/dotc/SettingsTests.scala index e3076f055d51..8c571a321548 100644 --- a/compiler/test/dotty/tools/dotc/SettingsTests.scala +++ b/compiler/test/dotty/tools/dotc/SettingsTests.scala @@ -179,6 +179,25 @@ class SettingsTests { assertEquals(100, foo.value) } + @Test def `Set BooleanSettings correctly`: Unit = + object Settings extends SettingGroup: + val foo = BooleanSetting("-foo", "foo", false) + val bar = BooleanSetting("-bar", "bar", true) + val baz = BooleanSetting("-baz", "baz", false) + val qux = BooleanSetting("-qux", "qux", false) + import Settings._ + + val args = List("-foo:true", "-bar:false", "-baz", "-qux:true", "-qux:false") + val summary = processArguments(args, processAll = true) + assertTrue(s"Setting args errors:\n ${summary.errors.take(5).mkString("\n ")}", summary.errors.isEmpty) + withProcessedArgs(summary) { + assertEquals(true, foo.value) + assertEquals(false, bar.value) + assertEquals(true, baz.value) + assertEquals(false, qux.value) + assertEquals(List("Flag -qux set repeatedly"), summary.warnings) + } + private def withProcessedArgs(summary: ArgsSummary)(f: SettingsState ?=> Unit) = f(using summary.sstate) extension [T](setting: Setting[T]) diff --git a/compiler/test/dotty/tools/dotc/StringFormatterTest.scala b/compiler/test/dotty/tools/dotc/StringFormatterTest.scala index e745fa515443..4dfc08cc7e9b 100644 --- a/compiler/test/dotty/tools/dotc/StringFormatterTest.scala +++ b/compiler/test/dotty/tools/dotc/StringFormatterTest.scala @@ -39,51 +39,6 @@ class StringFormatterTest extends AbstractStringFormatterTest: assertEquals("flags=private final ", store.string) end StringFormatterTest -class EmStringFormatterTest extends AbstractStringFormatterTest: - @Test def seq = check("[Any, String]", em"${Seq(defn.AnyType, defn.StringType)}") - @Test def seqSeq = check("Any; String", em"${Seq(defn.AnyType, defn.StringType)}%; %") - @Test def ellipsis = assert(em"$Big".contains("...")) - @Test def err = check("type Err", em"$Err") - @Test def ambig = check("Foo vs Foo", em"$Foo vs $Foo") - @Test def cstrd = check("Foo; Bar", em"$mkCstrd%; %") - @Test def seqErr = check("[class Any, type Err]", em"${Seq(defn.AnyClass, Err)}") - @Test def seqSeqErr = check("class Any; type Err", em"${Seq(defn.AnyClass, Err)}%; %") - @Test def tupleErr = check("(1,type Err)", em"${(1, Err)}") - @Test def tupleAmb = check("(Foo,Foo)", em"${(Foo, Foo)}") - @Test def tupleFlags = check("(Foo,abstract)", em"${(Foo, Abstract)}") - @Test def seqOfTupleFlags = check("[(Foo,abstract)]", em"${Seq((Foo, Abstract))}") -end EmStringFormatterTest - -class ExStringFormatterTest extends AbstractStringFormatterTest: - @Test def seq = check("[Any, String]", ex"${Seq(defn.AnyType, defn.StringType)}") - @Test def seqSeq = check("Any; String", ex"${Seq(defn.AnyType, defn.StringType)}%; %") - @Test def ellipsis = assert(ex"$Big".contains("...")) - @Test def err = check("type Err", ex"$Err") - @Test def ambig = check("""Foo vs Foo² - | - |where: Foo is a type - | Foo² is a type - |""".stripMargin, ex"$Foo vs $Foo") - @Test def cstrd = check("""Foo; Bar - | - |where: Bar is a type variable with constraint <: String - | Foo is a type variable with constraint <: Int - |""".stripMargin, ex"$mkCstrd%; %") - @Test def seqErr = check("[class Any, type Err]", ex"${Seq(defn.AnyClass, Err)}") - @Test def seqSeqErr = check("class Any; type Err", 
ex"${Seq(defn.AnyClass, Err)}%; %") - @Test def tupleErr = check("(1,type Err)", ex"${(1, Err)}") - @Test def tupleAmb = check("""(Foo,Foo²) - | - |where: Foo is a type - | Foo² is a type - |""".stripMargin, ex"${(Foo, Foo)}") - @Test def seqOfTup3Amb = check("""[(Foo,Foo²,type Err)] - | - |where: Foo is a type - | Foo² is a type - |""".stripMargin, ex"${Seq((Foo, Foo, Err))}") -end ExStringFormatterTest - abstract class AbstractStringFormatterTest extends DottyTest: override def initializeCtx(fc: FreshContext) = super.initializeCtx(fc.setSetting(fc.settings.color, "never")) diff --git a/compiler/test/dotty/tools/dotc/TastyBootstrapTests.scala b/compiler/test/dotty/tools/dotc/TastyBootstrapTests.scala index 9e71b10b206d..50e07f388dc4 100644 --- a/compiler/test/dotty/tools/dotc/TastyBootstrapTests.scala +++ b/compiler/test/dotty/tools/dotc/TastyBootstrapTests.scala @@ -17,6 +17,7 @@ import scala.util.matching.Regex import scala.concurrent.duration._ import TestSources.sources import vulpix._ +import reporting.TestReporter class TastyBootstrapTests { import ParallelTesting._ @@ -114,6 +115,7 @@ object TastyBootstrapTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests implicit val summaryReport: SummaryReporting = new SummaryReport @AfterClass def tearDown(): Unit = { diff --git a/compiler/test/dotty/tools/dotc/TupleShowTests.scala b/compiler/test/dotty/tools/dotc/TupleShowTests.scala new file mode 100644 index 000000000000..2d76c480b001 --- /dev/null +++ b/compiler/test/dotty/tools/dotc/TupleShowTests.scala @@ -0,0 +1,96 @@ +package dotty.tools +package dotc + +import core.*, Decorators.*, Symbols.* +import printing.Texts.* + +import java.lang.System.{ lineSeparator => EOL } +import org.junit.Test + +class TupleShowTests extends DottyTest: + def IntType = defn.IntType + def LongType = defn.LongType + def ShortType = defn.ShortType + def Types_10 = List.fill(5)(IntType) ::: List.fill(5)(LongType) + def Types_20 = Types_10 ::: Types_10 + + val tup0 = defn.tupleType(Nil) + val tup1 = defn.tupleType(IntType :: Nil) + val tup2 = defn.tupleType(IntType :: LongType :: Nil) + val tup3 = defn.tupleType(IntType :: LongType :: ShortType :: Nil) + val tup21 = defn.tupleType(Types_20 ::: IntType :: Nil) + val tup22 = defn.tupleType(Types_20 ::: IntType :: LongType :: Nil) + val tup23 = defn.tupleType(Types_20 ::: IntType :: LongType :: ShortType :: Nil) + val tup24 = defn.tupleType(Types_20 ::: IntType :: LongType :: ShortType :: ShortType :: Nil) + + @Test def tup0_show = chkEq("EmptyTuple.type", i"$tup0") + @Test def tup1_show = chkEq("Tuple1[Int]", i"$tup1") + @Test def tup2_show = chkEq("(Int, Long)", i"$tup2") + @Test def tup3_show = chkEq("(Int, Long, Short)", i"$tup3") + @Test def tup21_show = chkEq(res21, i"$tup21") + @Test def tup22_show = chkEq(res22, i"$tup22") + @Test def tup23_show = chkEq(res23, i"$tup23") + @Test def tup24_show = chkEq(res24, i"$tup24") + + @Test def tup3_text = + val obt = tup3.toText(ctx.printer) + val exp = Fluid(List( + Str(")"), + Str("Short"), + Closed(List(Str(", "), Str("Long"))), + Closed(List(Str(", "), Str("Int"))), + Str("("), + )) + chkEq(exp, obt) + + @Test def tup3_layout10 = + val obt = tup3.toText(ctx.printer).layout(10) + val exp = Fluid(List( + Str(" Short)"), + Str(" Long, "), + Str("(Int, "), + )) + chkEq(exp, obt) + + @Test def tup3_show10 = chkEq("(Int,\n Long,\n 
Short)".normEOL, tup3.toText(ctx.printer).mkString(10, false)) + + val res21 = """|(Int, Int, Int, Int, Int, Long, Long, Long, Long, Long, Int, Int, Int, Int, + | Int, Long, Long, Long, Long, Long, Int)""".stripMargin.normEOL + + val res22 = """|(Int, Int, Int, Int, Int, Long, Long, Long, Long, Long, Int, Int, Int, Int, + | Int, Long, Long, Long, Long, Long, Int, Long)""".stripMargin.normEOL + + val res23 = """|(Int, Int, Int, Int, Int, Long, Long, Long, Long, Long, Int, Int, Int, Int, + | Int, Long, Long, Long, Long, Long, Int, Long, Short)""".stripMargin.normEOL + + val res24 = """|(Int, Int, Int, Int, Int, Long, Long, Long, Long, Long, Int, Int, Int, Int, + | Int, Long, Long, Long, Long, Long, Int, Long, Short, Short)""".stripMargin.normEOL + + def chkEq[A](expected: A, obtained: A) = assert(expected == obtained, diff(s"$expected", s"$obtained")) + + /** On Windows the string literal in this test source file will be read with `\n` (b/c of "-encoding UTF8") + * but the compiler will correctly emit \r\n as the line separator. + * So we align the expected result to faithfully compare test results. */ + extension (str: String) def normEOL = if EOL == "\n" then str else str.replace("\n", EOL).nn + + def diff(exp: String, obt: String) = + val min = math.min(exp.length, obt.length) + val pre = + var i = 0 + while i < min && exp(i) == obt(i) do i += 1 + exp.take(i) + val suf = + val max = min - pre.length - 1 + var i = 0 + while i <= max && exp(exp.length - 1 - i) == obt(obt.length - 1 - i) do i += 1 + exp.drop(exp.length - 1) + + import scala.io.AnsiColor.* + val ellip = BLACK + BOLD + "..." + RESET + val compactPre = if pre.length <= 20 then pre else ellip + pre.drop(pre.length - 20) + val compactSuf = if suf.length <= 20 then suf else suf.take(20) + ellip + def extractDiff(s: String) = s.slice(pre.length, s.length - suf.length) + s"""|Comparison Failure: + | expected: $compactPre${CYAN }${extractDiff(exp)}$RESET$compactSuf + | obtained: $compactPre$MAGENTA${extractDiff(obt)}$RESET$compactSuf + |""".stripMargin diff --git a/compiler/test/dotty/tools/dotc/core/ConstraintsTest.scala b/compiler/test/dotty/tools/dotc/core/ConstraintsTest.scala index 5ab162b9f05c..9ae3fda8c6b9 100644 --- a/compiler/test/dotty/tools/dotc/core/ConstraintsTest.scala +++ b/compiler/test/dotty/tools/dotc/core/ConstraintsTest.scala @@ -53,3 +53,41 @@ class ConstraintsTest: i"Merging constraints `?S <: ?T` and `Int <: ?S` should result in `Int <:< ?T`: ${ctx.typerState.constraint}") } end mergeBoundsTransitivity + + @Test def validBoundsInit: Unit = inCompilerContext( + TestConfiguration.basicClasspath, + scalaSources = "trait A { def foo[S >: T <: T | Int, T <: String]: Any }") { + val tvars = constrained(requiredClass("A").typeRef.select("foo".toTermName).info.asInstanceOf[TypeLambda], EmptyTree, alwaysAddTypeVars = true)._2 + val List(s, t) = tvars.tpes + + val TypeBounds(lo, hi) = ctx.typerState.constraint.entry(t.asInstanceOf[TypeVar].origin): @unchecked + assert(lo =:= defn.NothingType, i"Unexpected lower bound $lo for $t: ${ctx.typerState.constraint}") + assert(hi =:= defn.StringType, i"Unexpected upper bound $hi for $t: ${ctx.typerState.constraint}") // used to be Any + } + + @Test def validBoundsUnify: Unit = inCompilerContext( + TestConfiguration.basicClasspath, + scalaSources = "trait A { def foo[S >: T <: T | Int, T <: String | Int]: Any }") { + val tvars = constrained(requiredClass("A").typeRef.select("foo".toTermName).info.asInstanceOf[TypeLambda], EmptyTree, alwaysAddTypeVars = true)._2 + val List(s, t) = 
tvars.tpes + + s <:< t + + val TypeBounds(lo, hi) = ctx.typerState.constraint.entry(t.asInstanceOf[TypeVar].origin): @unchecked + assert(lo =:= defn.NothingType, i"Unexpected lower bound $lo for $t: ${ctx.typerState.constraint}") + assert(hi =:= (defn.StringType | defn.IntType), i"Unexpected upper bound $hi for $t: ${ctx.typerState.constraint}") + } + + @Test def validBoundsReplace: Unit = inCompilerContext( + TestConfiguration.basicClasspath, + scalaSources = "trait X; trait A { def foo[S <: U | X, T, U]: Any }") { + val tvarTrees = constrained(requiredClass("A").typeRef.select("foo".toTermName).info.asInstanceOf[TypeLambda], EmptyTree, alwaysAddTypeVars = true)._2 + val tvars @ List(s, t, u) = tvarTrees.tpes.asInstanceOf[List[TypeVar]] + s =:= t + t =:= u + + for tvar <- tvars do + val entry = ctx.typerState.constraint.entry(tvar.origin) + assert(!ctx.typerState.constraint.occursAtToplevel(tvar.origin, entry), + i"cyclic bound for ${tvar.origin}: ${entry} in ${ctx.typerState.constraint}") + } diff --git a/compiler/test/dotty/tools/dotc/coverage/CoverageTests.scala b/compiler/test/dotty/tools/dotc/coverage/CoverageTests.scala index 5d9458fe95c9..77e172f61167 100644 --- a/compiler/test/dotty/tools/dotc/coverage/CoverageTests.scala +++ b/compiler/test/dotty/tools/dotc/coverage/CoverageTests.scala @@ -4,13 +4,13 @@ import org.junit.Test import org.junit.AfterClass import org.junit.Assert.* import org.junit.experimental.categories.Category - import dotty.{BootstrappedOnlyTests, Properties} import dotty.tools.vulpix.* import dotty.tools.vulpix.TestConfiguration.* import dotty.tools.dotc.Main +import dotty.tools.dotc.reporting.TestReporter -import java.nio.file.{Files, FileSystems, Path, Paths, StandardCopyOption} +import java.nio.file.{FileSystems, Files, Path, Paths, StandardCopyOption} import scala.jdk.CollectionConverters.* import scala.util.Properties.userDir import scala.language.unsafeNulls @@ -85,6 +85,7 @@ object CoverageTests extends ParallelTesting: def testFilter = Properties.testsFilter def isInteractive = SummaryReport.isInteractive def updateCheckFiles = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests given summaryReport: SummaryReporting = SummaryReport() @AfterClass def tearDown(): Unit = diff --git a/compiler/test/dotty/tools/dotc/parsing/DeSugarTest.scala b/compiler/test/dotty/tools/dotc/parsing/DeSugarTest.scala index a54880326704..bb2797c5d034 100644 --- a/compiler/test/dotty/tools/dotc/parsing/DeSugarTest.scala +++ b/compiler/test/dotty/tools/dotc/parsing/DeSugarTest.scala @@ -59,8 +59,8 @@ class DeSugarTest extends ParserTest { cpy.DefDef(tree1)(name, transformParamss(paramss), transform(tpt, Type), transform(tree1.rhs)) case tree1 @ TypeDef(name, rhs) => cpy.TypeDef(tree1)(name, transform(rhs, Type)) - case impl @ Template(constr, parents, self, _) => - cpy.Template(tree1)(transformSub(constr), transform(parents), Nil, transformSub(self), transform(impl.body, Expr)) + case impl @ Template(constr, _, self, _) => + cpy.Template(tree1)(transformSub(constr), transform(impl.parentsOrDerived), Nil, transformSub(self), transform(impl.body, Expr)) case Thicket(trees) => Thicket(flatten(trees mapConserve super.transform)) case tree1 => diff --git a/compiler/test/dotty/tools/dotc/printing/PrintingTest.scala b/compiler/test/dotty/tools/dotc/printing/PrintingTest.scala index 639b04089abc..2c970e93f573 100644 --- a/compiler/test/dotty/tools/dotc/printing/PrintingTest.scala +++ b/compiler/test/dotty/tools/dotc/printing/PrintingTest.scala @@ -21,6 
+21,7 @@ import scala.io.Source import org.junit.Test import scala.util.Using import java.io.File + class PrintingTest { def options(phase: String, flags: List[String]) = @@ -45,7 +46,7 @@ class PrintingTest { } val actualLines = byteStream.toString(StandardCharsets.UTF_8.name).linesIterator - FileDiff.checkAndDump(path.toString, actualLines.toIndexedSeq, checkFilePath) + FileDiff.checkAndDumpOrUpdate(path.toString, actualLines.toIndexedSeq, checkFilePath) } def testIn(testsDir: String, phase: String) = diff --git a/compiler/test/dotty/tools/dotc/reporting/TestReporter.scala b/compiler/test/dotty/tools/dotc/reporting/TestReporter.scala index 475cd1160296..940fc875a021 100644 --- a/compiler/test/dotty/tools/dotc/reporting/TestReporter.scala +++ b/compiler/test/dotty/tools/dotc/reporting/TestReporter.scala @@ -3,18 +3,20 @@ package dotc package reporting import scala.language.unsafeNulls - -import java.io.{ PrintStream, PrintWriter, File => JFile, FileOutputStream, StringWriter } +import java.io.{BufferedReader, FileInputStream, FileOutputStream, FileReader, PrintStream, PrintWriter, StringReader, StringWriter, File as JFile} import java.text.SimpleDateFormat import java.util.Date -import core.Decorators._ +import core.Decorators.* import scala.collection.mutable - +import scala.jdk.CollectionConverters.* import util.SourcePosition -import core.Contexts._ -import Diagnostic._ -import interfaces.Diagnostic.{ ERROR, WARNING } +import core.Contexts.* +import Diagnostic.* +import dotty.Properties +import interfaces.Diagnostic.{ERROR, WARNING} + +import scala.io.Codec class TestReporter protected (outWriter: PrintWriter, filePrintln: String => Unit, logLevel: Int) extends Reporter with UniqueMessagePositions with HideNonSensicalMessages with MessageRendering { @@ -84,17 +86,23 @@ extends Reporter with UniqueMessagePositions with HideNonSensicalMessages with M } object TestReporter { + private val testLogsDirName: String = "testlogs" + private val failedTestsFileName: String = "last-failed.log" + private val failedTestsFile: JFile = new JFile(s"$testLogsDirName/$failedTestsFileName") + private var outFile: JFile = _ private var logWriter: PrintWriter = _ + private var failedTestsWriter: PrintWriter = _ private def initLog() = if (logWriter eq null) { val date = new Date val df0 = new SimpleDateFormat("yyyy-MM-dd") val df1 = new SimpleDateFormat("yyyy-MM-dd-'T'HH-mm-ss") - val folder = s"testlogs/tests-${df0.format(date)}" + val folder = s"$testLogsDirName/tests-${df0.format(date)}" new JFile(folder).mkdirs() outFile = new JFile(s"$folder/tests-${df1.format(date)}.log") logWriter = new PrintWriter(new FileOutputStream(outFile, true)) + failedTestsWriter = new PrintWriter(new FileOutputStream(failedTestsFile, false)) } def logPrintln(str: String) = { @@ -144,4 +152,16 @@ object TestReporter { } rep } + + def lastRunFailedTests: Option[List[String]] = + Option.when( + Properties.rerunFailed && + failedTestsFile.exists() && + failedTestsFile.isFile + )(java.nio.file.Files.readAllLines(failedTestsFile.toPath).asScala.toList) + + def writeFailedTests(tests: List[String]): Unit = + initLog() + tests.foreach(failed => failedTestsWriter.println(failed)) + failedTestsWriter.flush() } diff --git a/compiler/test/dotty/tools/dotc/reporting/UserDefinedErrorMessages.scala b/compiler/test/dotty/tools/dotc/reporting/UserDefinedErrorMessages.scala index 4d73b0d88b55..807d3a19f8f3 100644 --- a/compiler/test/dotty/tools/dotc/reporting/UserDefinedErrorMessages.scala +++ 
b/compiler/test/dotty/tools/dotc/reporting/UserDefinedErrorMessages.scala @@ -26,9 +26,9 @@ class UserDefinedErrorMessages extends ErrorMessagesTest { given Context = itcx assertMessageCount(1, messages) - val (m: NoExplanation) :: Nil = messages: @unchecked + val (m: TypeMsg) :: Nil = messages: @unchecked - assertEquals(m.msg, "Could not prove Int =!= Int") + assertEquals(m.message, "Could not prove Int =!= Int") } @Test def userDefinedImplicitAmbiguous2 = @@ -50,9 +50,9 @@ class UserDefinedErrorMessages extends ErrorMessagesTest { given Context = itcx assertMessageCount(1, messages) - val (m: NoExplanation) :: Nil = messages: @unchecked + val (m: TypeMsg) :: Nil = messages: @unchecked - assertEquals(m.msg, "Could not prove Int =!= Int") + assertEquals(m.message, "Could not prove Int =!= Int") } @Test def userDefinedImplicitAmbiguous3 = @@ -75,9 +75,9 @@ class UserDefinedErrorMessages extends ErrorMessagesTest { given Context = itcx assertMessageCount(1, messages) - val (m: NoExplanation) :: Nil = messages: @unchecked + val (m: TypeMsg) :: Nil = messages: @unchecked - assertEquals(m.msg, "Could not prove Int =!= Int") + assertEquals(m.message, "Could not prove Int =!= Int") } @Test def userDefinedImplicitAmbiguous4 = @@ -97,9 +97,9 @@ class UserDefinedErrorMessages extends ErrorMessagesTest { given Context = itcx assertMessageCount(1, messages) - val (m: NoExplanation) :: Nil = messages: @unchecked + val (m: TypeMsg) :: Nil = messages: @unchecked - assertEquals(m.msg, "msg A=Any") + assertEquals(m.message, "msg A=Any") } @Test def userDefinedImplicitAmbiguous5 = @@ -119,8 +119,8 @@ class UserDefinedErrorMessages extends ErrorMessagesTest { given Context = itcx assertMessageCount(1, messages) - val (m: NoExplanation) :: Nil = messages: @unchecked + val (m: TypeMsg) :: Nil = messages: @unchecked - assertEquals(m.msg, "msg A=Any") + assertEquals(m.message, "msg A=Any") } } diff --git a/compiler/test/dotty/tools/dotc/transform/PatmatExhaustivityTest.scala b/compiler/test/dotty/tools/dotc/transform/PatmatExhaustivityTest.scala index eb6ab8e8fb5f..1e7d7ef2c708 100644 --- a/compiler/test/dotty/tools/dotc/transform/PatmatExhaustivityTest.scala +++ b/compiler/test/dotty/tools/dotc/transform/PatmatExhaustivityTest.scala @@ -20,7 +20,7 @@ class PatmatExhaustivityTest { val testsDir = "tests/patmat" // pagewidth/color: for a stable diff as the defaults are based on the terminal (e.g size) // stop-after: patmatexhaust-huge.scala crash compiler (but also hides other warnings..) 
- val options = List("-pagewidth", "80", "-color:never", "-Ystop-after:explicitSelf", "-classpath", TestConfiguration.basicClasspath) + val options = List("-pagewidth", "80", "-color:never", "-Ystop-after:explicitSelf", "-Ycheck-constraint-deps", "-classpath", TestConfiguration.basicClasspath) private def compile(files: List[JPath]): Seq[String] = { val opts = toolArgsFor(files).get(ToolName.Scalac).getOrElse(Nil) diff --git a/compiler/test/dotty/tools/dotc/transform/TypeTestsCastsTest.scala b/compiler/test/dotty/tools/dotc/transform/TypeTestsCastsTest.scala index 0db7a6072579..9f6f155a2ac2 100644 --- a/compiler/test/dotty/tools/dotc/transform/TypeTestsCastsTest.scala +++ b/compiler/test/dotty/tools/dotc/transform/TypeTestsCastsTest.scala @@ -6,6 +6,8 @@ import core.* import Contexts.*, Decorators.*, Denotations.*, SymDenotations.*, Symbols.*, Types.* import Annotations.* +import dotty.tools.dotc.util.Spans.Span + import org.junit.Test import org.junit.Assert.* @@ -15,7 +17,7 @@ class TypeTestsCastsTest extends DottyTest: @Test def orL = checkFound(List(StringType, LongType), OrType(LongType, StringType, false)) @Test def orR = checkFound(List(LongType, StringType), OrType(StringType, LongType, false)) - @Test def annot = checkFound(List(StringType, LongType), AnnotatedType(OrType(LongType, StringType, false), Annotation(defn.UncheckedAnnot))) + @Test def annot = checkFound(List(StringType, LongType), AnnotatedType(OrType(LongType, StringType, false), Annotation(defn.UncheckedAnnot, Span(0)))) @Test def andL = checkFound(List(StringType), AndType(StringType, AnyType)) @Test def andR = checkFound(List(StringType), AndType(AnyType, StringType)) diff --git a/compiler/test/dotty/tools/dotc/transform/patmat/SpaceEngineTest.scala b/compiler/test/dotty/tools/dotc/transform/patmat/SpaceEngineTest.scala new file mode 100644 index 000000000000..699b36caa508 --- /dev/null +++ b/compiler/test/dotty/tools/dotc/transform/patmat/SpaceEngineTest.scala @@ -0,0 +1,64 @@ +package dotty.tools +package dotc +package transform +package patmat + +import core.*, Annotations.*, Contexts.*, Decorators.*, Flags.*, Names.*, StdNames.*, Symbols.*, Types.* +import ast.*, tpd.* + +import vulpix.TestConfiguration, TestConfiguration.basicClasspath + +import org.junit, junit.Test, junit.Assert.* + +class SpaceEngineTest: + @Test def isSubspaceTest1: Unit = inCompilerContext(basicClasspath) { + // Testing the property of `isSubspace` that: + // isSubspace(a, b) <=> simplify(simplify(a) - simplify(a)) == Empty + // Previously there were no simplify calls, + // and this is a counter-example, + // for which you need either to simplify(b) or simplify the minus result. 
+ val engine = patmat.SpaceEngine() + import engine.* + + val tp = defn.ConsClass.typeRef.appliedTo(defn.AnyType) + val unappTp = requiredMethod("scala.collection.immutable.::.unapply").termRef + val params = List(Empty, Typ(tp)) + + val a = Prod(tp, unappTp, params) + val b = Empty + + val res1 = isSubspace(a, b) + + val a2 = simplify(a) + val b2 = simplify(b) + val rem1 = minus(a2, b2) + val rem2 = simplify(rem1) + val res2 = rem2 == Empty + + assertEquals( + i"""|isSubspace: + | + |isSubspace(a, b) = $res1 + | + |Should be equivalent to: + |simplify(simplify(a) - simplify(b)) == Empty + |simplify(a2 - b2) == Empty + |simplify(rem1) == Empty + |rem2 == Empty + | + |a = ${show(a)} + |b = ${show(b)} + |a2 = ${show(a2)} + |b2 = ${show(b2)} + |rem1 = ${show(rem1)} + |rem2 = ${show(rem2)} + | + |a = ${a.toString} + |b = ${b.toString} + |a2 = ${a2.toString} + |b2 = ${b2.toString} + |rem1 = ${rem1.toString} + |rem2 = ${rem2.toString} + | + |""".stripMargin, res1, res2) + } diff --git a/compiler/test/dotty/tools/repl/ReplCompilerTests.scala b/compiler/test/dotty/tools/repl/ReplCompilerTests.scala index 866647476888..bcb08cd232d7 100644 --- a/compiler/test/dotty/tools/repl/ReplCompilerTests.scala +++ b/compiler/test/dotty/tools/repl/ReplCompilerTests.scala @@ -347,27 +347,6 @@ class ReplCompilerTests extends ReplTest: assertEquals("java.lang.AssertionError: assertion failed", all.head) } - @Test def i14491 = - initially { - run("import language.experimental.fewerBraces") - } andThen { - run("""|val x = Seq(7,8,9).apply: - | 1 - |""".stripMargin) - assertEquals("val x: Int = 8", storedOutput().trim) - } - initially { - run("""|import language.experimental.fewerBraces - |import language.experimental.fewerBraces as _ - |""".stripMargin) - } andThen { - run("""|val x = Seq(7,8,9).apply: - | 1 - |""".stripMargin) - assert("expected error if fewerBraces is unimported", - lines().exists(_.contains("missing arguments for method apply"))) - } - object ReplCompilerTests: private val pattern = Pattern.compile("\\r[\\n]?|\\n"); diff --git a/compiler/test/dotty/tools/repl/ShadowingBatchTests.scala b/compiler/test/dotty/tools/repl/ShadowingBatchTests.scala index 5a96976bd867..7272c10aa003 100644 --- a/compiler/test/dotty/tools/repl/ShadowingBatchTests.scala +++ b/compiler/test/dotty/tools/repl/ShadowingBatchTests.scala @@ -32,6 +32,20 @@ class ShadowingBatchTests extends ErrorMessagesTest: ictx.setSetting(classpath, classpath.value + File.pathSeparator + dir.jpath.toAbsolutePath) } + @Test def io = + val lib = """|package io.foo + | + |object Bar { + | def baz: Int = 42 + |} + |""".stripMargin + val app = """|object Main: + | def main(args: Array[String]): Unit = + | println(io.foo.Bar.baz) + |""".stripMargin + checkMessages(lib).expectNoErrors + checkMessages(app).expectNoErrors + @Test def file = checkMessages("class C(val c: Int)").expectNoErrors checkMessages("object rsline1 {\n def line1 = new C().c\n}").expect { (_, msgs) => diff --git a/compiler/test/dotty/tools/repl/ShadowingTests.scala b/compiler/test/dotty/tools/repl/ShadowingTests.scala index 62a2322e38f0..98aa58a62a15 100644 --- a/compiler/test/dotty/tools/repl/ShadowingTests.scala +++ b/compiler/test/dotty/tools/repl/ShadowingTests.scala @@ -76,6 +76,18 @@ class ShadowingTests extends ReplTest(options = ShadowingTests.options): Files.delete(file) end compileShadowed + @Test def io = shadowedScriptedTest(name = "io", + shadowed = """|package io.foo + | + |object Bar { + | def baz: Int = 42 + |} + |""".stripMargin, + script = """|scala> 
io.foo.Bar.baz + |val res0: Int = 42 + |""".stripMargin + ) + @Test def i7635 = shadowedScriptedTest(name = "", shadowed = "class C(val c: Int)", script = @@ -122,13 +134,18 @@ class ShadowingTests extends ReplTest(options = ShadowingTests.options): |val y: String = foo | |scala> if (true) x else y - |val res0: Matchable = 42 + |val res0: Int | String = 42 |""".stripMargin.linesIterator.toList ) ShadowingTests.createSubDir("util") testScript(name = "", """|scala> import util.Try + |-- [E008] Not Found Error: ----------------------------------------------------- + |1 | import util.Try + | | ^^^ + | | value Try is not a member of util + |1 error found | |scala> object util { class Try { override def toString = "you've gotta try!" } } |// defined object util diff --git a/compiler/test/dotty/tools/repl/TabcompleteTests.scala b/compiler/test/dotty/tools/repl/TabcompleteTests.scala index 9cdb896963f1..910584a9b5e7 100644 --- a/compiler/test/dotty/tools/repl/TabcompleteTests.scala +++ b/compiler/test/dotty/tools/repl/TabcompleteTests.scala @@ -233,4 +233,8 @@ class TabcompleteTests extends ReplTest { val comp = tabComplete("BigInt(1).") assertTrue(comp.distinct.nonEmpty) } + + @Test def i9334 = initially { + assert(tabComplete("class Foo[T]; classOf[Foo].").contains("getName")) + } } diff --git a/compiler/test/dotty/tools/vulpix/FailedTestInfo.scala b/compiler/test/dotty/tools/vulpix/FailedTestInfo.scala new file mode 100644 index 000000000000..c7172f54aadc --- /dev/null +++ b/compiler/test/dotty/tools/vulpix/FailedTestInfo.scala @@ -0,0 +1,3 @@ +package dotty.tools.vulpix + +case class FailedTestInfo(title: String, extra: String) diff --git a/compiler/test/dotty/tools/vulpix/FileDiff.scala b/compiler/test/dotty/tools/vulpix/FileDiff.scala index c060c4d3938c..5e882be6425a 100644 --- a/compiler/test/dotty/tools/vulpix/FileDiff.scala +++ b/compiler/test/dotty/tools/vulpix/FileDiff.scala @@ -50,21 +50,6 @@ object FileDiff { outFile.writeAll(content.mkString("", EOL, EOL)) } - def checkAndDump(sourceTitle: String, actualLines: Seq[String], checkFilePath: String): Boolean = { - val outFilePath = checkFilePath + ".out" - FileDiff.check(sourceTitle, actualLines, checkFilePath) match { - case Some(msg) => - FileDiff.dump(outFilePath, actualLines) - println(msg) - println(FileDiff.diffMessage(checkFilePath, outFilePath)) - false - case _ => - val jOutFilePath = Paths.get(outFilePath) - Files.deleteIfExists(jOutFilePath) - true - } - } - def checkAndDumpOrUpdate(sourceTitle: String, actualLines: Seq[String], checkFilePath: String): Boolean = { val outFilePath = checkFilePath + ".out" FileDiff.check(sourceTitle, actualLines, checkFilePath) match { diff --git a/compiler/test/dotty/tools/vulpix/ParallelTesting.scala b/compiler/test/dotty/tools/vulpix/ParallelTesting.scala index 44565c44b681..62b1b88984bc 100644 --- a/compiler/test/dotty/tools/vulpix/ParallelTesting.scala +++ b/compiler/test/dotty/tools/vulpix/ParallelTesting.scala @@ -57,6 +57,9 @@ trait ParallelTesting extends RunnerOrchestration { self => /** Tests should override the checkfiles with the current output */ def updateCheckFiles: Boolean + /** Contains a list of failed tests to run, if list is empty no tests will run */ + def failedTests: Option[List[String]] + /** A test source whose files or directory of files is to be compiled * in a specific way defined by the `Test` */ @@ -204,6 +207,14 @@ trait ParallelTesting extends RunnerOrchestration { self => protected def shouldSkipTestSource(testSource: TestSource): Boolean = false + protected def 
shouldReRun(testSource: TestSource): Boolean = + failedTests.forall(rerun => testSource match { + case JointCompilationSource(_, files, _, _, _, _) => + rerun.exists(filter => files.exists(file => file.getPath.contains(filter))) + case SeparateCompilationSource(_, dir, _, _) => + rerun.exists(dir.getPath.contains) + }) + private trait CompilationLogic { this: Test => def suppressErrors = false @@ -359,7 +370,7 @@ trait ParallelTesting extends RunnerOrchestration { self => case SeparateCompilationSource(_, dir, _, _) => testFilter.exists(dir.getPath.contains) } - filteredByName.filterNot(shouldSkipTestSource(_)) + filteredByName.filterNot(shouldSkipTestSource(_)).filter(shouldReRun(_)) /** Total amount of test sources being compiled by this test */ val sourceCount = filteredSources.length @@ -409,14 +420,14 @@ trait ParallelTesting extends RunnerOrchestration { self => synchronized { reproduceInstructions.append(ins) } /** The test sources that failed according to the implementing subclass */ - private val failedTestSources = mutable.ArrayBuffer.empty[String] + private val failedTestSources = mutable.ArrayBuffer.empty[FailedTestInfo] protected final def failTestSource(testSource: TestSource, reason: Failure = Generic) = synchronized { val extra = reason match { case TimeoutFailure(title) => s", test '$title' timed out" case JavaCompilationFailure(msg) => s", java test sources failed to compile with: \n$msg" case Generic => "" } - failedTestSources.append(testSource.title + s" failed" + extra) + failedTestSources.append(FailedTestInfo(testSource.title, s" failed" + extra)) fail(reason) } @@ -550,7 +561,7 @@ trait ParallelTesting extends RunnerOrchestration { self => def addToLast(str: String): Unit = diagnostics match case head :: tail => - diagnostics = Diagnostic.Error(s"${head.msg.rawMessage}$str", head.pos) :: tail + diagnostics = Diagnostic.Error(s"${head.msg.message}$str", head.pos) :: tail case Nil => var inError = false for line <- errorsText.linesIterator do diff --git a/compiler/test/dotty/tools/vulpix/SummaryReport.scala b/compiler/test/dotty/tools/vulpix/SummaryReport.scala index e216ac1c5d4f..74612387015f 100644 --- a/compiler/test/dotty/tools/vulpix/SummaryReport.scala +++ b/compiler/test/dotty/tools/vulpix/SummaryReport.scala @@ -3,7 +3,6 @@ package tools package vulpix import scala.language.unsafeNulls - import scala.collection.mutable import dotc.reporting.TestReporter @@ -23,7 +22,7 @@ trait SummaryReporting { def reportPassed(): Unit /** Add the name of the failed test */ - def addFailedTest(msg: String): Unit + def addFailedTest(msg: FailedTestInfo): Unit /** Add instructions to reproduce the error */ def addReproduceInstruction(instr: String): Unit @@ -49,7 +48,7 @@ trait SummaryReporting { final class NoSummaryReport extends SummaryReporting { def reportFailed(): Unit = () def reportPassed(): Unit = () - def addFailedTest(msg: String): Unit = () + def addFailedTest(msg: FailedTestInfo): Unit = () def addReproduceInstruction(instr: String): Unit = () def addStartingMessage(msg: String): Unit = () def addCleanup(f: () => Unit): Unit = () @@ -66,7 +65,7 @@ final class SummaryReport extends SummaryReporting { import scala.jdk.CollectionConverters._ private val startingMessages = new java.util.concurrent.ConcurrentLinkedDeque[String] - private val failedTests = new java.util.concurrent.ConcurrentLinkedDeque[String] + private val failedTests = new java.util.concurrent.ConcurrentLinkedDeque[FailedTestInfo] private val reproduceInstructions = new 
java.util.concurrent.ConcurrentLinkedDeque[String] private val cleanUps = new java.util.concurrent.ConcurrentLinkedDeque[() => Unit] @@ -79,7 +78,7 @@ final class SummaryReport extends SummaryReporting { def reportPassed(): Unit = passed += 1 - def addFailedTest(msg: String): Unit = + def addFailedTest(msg: FailedTestInfo): Unit = failedTests.add(msg) def addReproduceInstruction(instr: String): Unit = @@ -108,7 +107,8 @@ final class SummaryReport extends SummaryReporting { startingMessages.asScala.foreach(rep.append) - failedTests.asScala.map(x => s" $x\n").foreach(rep.append) + failedTests.asScala.map(x => s" ${x.title}${x.extra}\n").foreach(rep.append) + TestReporter.writeFailedTests(failedTests.asScala.toList.map(_.title)) // If we're compiling locally, we don't need instructions on how to // reproduce failures diff --git a/compiler/test/dotty/tools/vulpix/TestConfiguration.scala b/compiler/test/dotty/tools/vulpix/TestConfiguration.scala index 3ea364cc3a68..5d2992b50a09 100644 --- a/compiler/test/dotty/tools/vulpix/TestConfiguration.scala +++ b/compiler/test/dotty/tools/vulpix/TestConfiguration.scala @@ -49,6 +49,7 @@ object TestConfiguration { withCompilerClasspath + File.pathSeparator + mkClasspath(List(Properties.dottyTastyInspector)) lazy val scalaJSClasspath = mkClasspath(List( + Properties.scalaJSJavalib, Properties.scalaJSLibrary, Properties.dottyLibraryJS )) diff --git a/compiler/test/dotty/tools/vulpix/VulpixMetaTests.scala b/compiler/test/dotty/tools/vulpix/VulpixMetaTests.scala index 75af0aa94893..0044ab8a94e5 100644 --- a/compiler/test/dotty/tools/vulpix/VulpixMetaTests.scala +++ b/compiler/test/dotty/tools/vulpix/VulpixMetaTests.scala @@ -30,6 +30,7 @@ object VulpixMetaTests extends ParallelTesting { def isInteractive = false // Don't beautify output for interactive use. def testFilter = Nil // Run all the tests. 
def updateCheckFiles: Boolean = false + def failedTests = None @AfterClass def tearDown() = this.cleanup() diff --git a/compiler/test/dotty/tools/vulpix/VulpixUnitTests.scala b/compiler/test/dotty/tools/vulpix/VulpixUnitTests.scala index 8a32fd636e76..baf61c845d96 100644 --- a/compiler/test/dotty/tools/vulpix/VulpixUnitTests.scala +++ b/compiler/test/dotty/tools/vulpix/VulpixUnitTests.scala @@ -108,6 +108,7 @@ object VulpixUnitTests extends ParallelTesting { def isInteractive = !sys.env.contains("DRONE") def testFilter = Nil def updateCheckFiles: Boolean = false + def failedTests = None @AfterClass def tearDown() = this.cleanup() diff --git a/docs/_assets/css/color-brewer.css b/docs/_assets/css/color-brewer.css deleted file mode 100644 index b832a05ebc51..000000000000 --- a/docs/_assets/css/color-brewer.css +++ /dev/null @@ -1,66 +0,0 @@ -/* - -Colorbrewer theme -Original: https://github.com/mbostock/colorbrewer-theme (c) Mike Bostock -Ported by Fabrício Tavares de Oliveira - -*/ - -/* .hljs { - background: transparent; -} - -.hljs, -.hljs-subst { - color: #000; -} */ - -/*.hljs-string, -.hljs-meta, -.hljs-symbol, -.hljs-template-tag, -.hljs-template-variable, -.hljs-addition { - color: #756bb1; -}*/ - -/* .hljs-comment, -.hljs-quote { - color: #636363; -} - -.hljs-number, -.hljs-regexp, -.hljs-literal, -.hljs-bullet, -.hljs-link { - color: #31a354; -} - -.hljs-deletion, -.hljs-variable { - color: #88f; -} */ - -/*.hljs-keyword, -.hljs-selector-tag, -.hljs-title, -.hljs-section, -.hljs-built_in, -.hljs-doctag, -.hljs-type, -.hljs-tag, -.hljs-name, -.hljs-selector-id, -.hljs-selector-class, -.hljs-strong { - color: #3182bd; -}*/ - -/* .hljs-emphasis { - font-style: italic; -} - -.hljs-attribute { - color: #e6550d; -} */ diff --git a/docs/_docs/internals/syntax.md b/docs/_docs/internals/syntax.md index bae8e6d3ec8d..8e7de0efe19e 100644 --- a/docs/_docs/internals/syntax.md +++ b/docs/_docs/internals/syntax.md @@ -211,7 +211,8 @@ FunArgType ::= Type | ‘=>’ Type PrefixOp(=>, t) FunArgTypes ::= FunArgType { ‘,’ FunArgType } ParamType ::= [‘=>’] ParamValueType -ParamValueType ::= Type [‘*’] PostfixOp(t, "*") +ParamValueType ::= [‘into’] ExactParamType Into(t) +ExactParamType ::= ParamValueType [‘*’] PostfixOp(t, "*") TypeArgs ::= ‘[’ Types ‘]’ ts Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> ds TypeBounds ::= [‘>:’ Type] [‘<:’ Type] TypeBoundsTree(lo, hi) @@ -318,7 +319,10 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } Alternative(pats) -Pattern1 ::= Pattern2 [‘:’ RefinedType] Bind(name, Typed(Ident(wildcard), tpe)) +Pattern1 ::= PatVar ‘:’ RefinedType Bind(name, Typed(Ident(wildcard), tpe)) + | [‘-’] integerLiteral ‘:’ RefinedType Typed(pat, tpe) + | [‘-’] floatingPointLiteral ‘:’ RefinedType Typed(pat, tpe) + | Pattern2 Pattern2 ::= [id ‘@’] InfixPattern [‘*’] Bind(name, pat) InfixPattern ::= SimplePattern { id [nl] SimplePattern } InfixOp(pat, op, pat) SimplePattern ::= PatVar Ident(wildcard) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index a4da7c470e3c..f073c339ec6f 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -34,6 +34,9 @@ given [T: Ordering]: Ordering[Option[T]] = Ordering.derived It is discouraged to directly refer to the `derived` member if you can use a `derives` clause instead. +All data types can have a `derives` clause. 
This document focuses primarily on data types which also have a given instance +of the `Mirror` type class available. + ## Exact mechanism In the following, when type arguments are enumerated and the first index evaluates to a larger value than the last, then there are actually no arguments, for example: `A[T_2, ..., T_1]` means `A`. @@ -281,7 +284,7 @@ Note the following properties of `Mirror` types, + The methods `ordinal` and `fromProduct` are defined in terms of `MirroredMonoType` which is the type of kind-`*` which is obtained from `MirroredType` by wildcarding its type parameters. -### Implementing `derived` with `Mirror` +## Implementing `derived` with `Mirror` As seen before, the signature and implementation of a `derived` method for a type class `TC[_]` are arbitrary, but we expect it to typically be of the following form: @@ -507,7 +510,7 @@ The framework described here enables all three of these approaches without manda For a brief discussion on how to use macros to write a type class `derived` method please read more at [How to write a type class `derived` method using macros](./derivation-macro.md). -### Syntax +## Syntax ``` Template ::= InheritClauses [TemplateBody] diff --git a/docs/_docs/reference/experimental/erased-defs-spec.md b/docs/_docs/reference/experimental/erased-defs-spec.md index 5395a8468399..24ae89c7e28b 100644 --- a/docs/_docs/reference/experimental/erased-defs-spec.md +++ b/docs/_docs/reference/experimental/erased-defs-spec.md @@ -62,3 +62,9 @@ TODO: complete 7. Overriding * Member definitions overriding each other must both be `erased` or not be `erased` * `def foo(x: T): U` cannot be overridden by `def foo(erased x: T): U` and vice-versa + * + + +8. Type Restrictions + * For dependent functions, `erased` parameters are limited to realizable types, that is, types that are inhabited by non-null values. + This restriction stops us from using a bad bound introduced by an erased value, which leads to unsoundness (see #4060). diff --git a/docs/_docs/reference/experimental/into-modifier.md b/docs/_docs/reference/experimental/into-modifier.md new file mode 100644 index 000000000000..2ee4c74539b3 --- /dev/null +++ b/docs/_docs/reference/experimental/into-modifier.md @@ -0,0 +1,81 @@ +--- +layout: doc-page +title: "The `into` Type Modifier" +redirectFrom: /docs/reference/other-new-features/into-modifier.html +nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/into-modifier.html +--- + +Scala 3's implicit conversions of the `scala.Conversion` class require a language import +``` +import scala.language.implicitConversions +``` +in any code that uses them as implicit conversions (code that calls conversions explicitly is not affected). If the import is missing, a feature warning is currently issued, and this will become an error in a future version of Scala 3. The motivation for this restriction is that code with hidden implicit conversions is hard to understand and might have correctness or performance problems that go undetected. + +There is one broad use case, however, where implicit conversions are very hard to replace. This is the case where an implicit conversion is used to adapt a method argument to its formal parameter type. 
An example from the standard library: +```scala +scala> val xs = List(0, 1) +scala> val ys = Array(2, 3) +scala> xs ++ ys +val res0: List[Int] = List(0, 1, 2, 3) +``` +The last input made use of an implicit conversion from `Array[Int]` to `IterableOnce[Int]` which is defined as a Scala 2 style implicit conversion in the standard library. Once the standard library is rewritten with Scala 3 conversions, this will +require a language import at the use site, which is clearly unacceptable. It is possible to avoid the need for implicit conversions using method overloading or type classes, but this often leads to longer and more complicated code, and neither of these alternatives work for vararg parameters. + +This is where the `into` modifier on parameter types comes in. Here is a signature of the `++` method on `List[A]` that uses it: +```scala + def ++ (elems: into IterableOnce[A]): List[A] +``` +The `into` modifier on the type of `elems` means that implicit conversions can be applied to convert the actual argument to an `IterableOnce` value, and this without needing a language import. + +## Function arguments + +`into` also allows conversions on the results of function arguments. For instance, consider the new proposed signature of the `flatMap` method on `List[A]`: + +```scala + def flatMap[B](f: into A => IterableOnce[B]): List[B] +``` +This allows a conversion of the actual argument to the function type `A => IterableOnce[B]`. Crucially, it also allows that conversion to be applied to +the function result. So the following would work: +```scala +scala> val xs = List(1, 2, 3) +scala> xs.flatMap(x => x.toString * x) +val res2: List[Char] = List(1, 2, 2, 3, 3, 3) +``` +Here, the conversion from `String` to `Iterable[Char]` is applied on the results of `flatMap`'s function argument when it is applied to the elements of `xs`. + +## Vararg arguments + +When applied to a vararg parameter, `into` allows a conversion on each argument value individually. For example, consider a method `concatAll` that concatenates a variable +number of `IterableOnce[Char]` arguments, and also allows implicit conversions into `IterableOnce[Char]`: + +```scala +def concatAll(xss: into IterableOnce[Char]*): List[Char] = + xss.foldLeft(List[Char]())(_ ++ _) +``` +Here, the call +```scala +concatAll(List('a'), "bc", Array('d', 'e')) +``` +would apply two _different_ implicit conversions: the conversion from `String` to `Iterable[Char]` gets applied to the second argument and the conversion from `Array[Char]` to `Iterable[Char]` gets applied to the third argument. + +## Retrofitting Scala 2 libraries + +A new annotation `allowConversions` has the same effect as an `into` modifier. It is defined as an `@experimental` class in package `scala.annotation`. It is intended to be used for retrofitting Scala 2 library code so that Scala 3 conversions can be applied to arguments without language imports. For instance, the definitions of +`++` and `flatMap` in the Scala 2.13 `List` class could be retrofitted as follows. +```scala + def ++ (@allowConversions elems: IterableOnce[A]): List[A] + def flatMap[B](@allowConversions f: A => IterableOnce[B]): List[B] +``` +For Scala 3 code, the `into` modifier is preferred. First, because it is shorter, +and second, because it adheres to the principle that annotations should not influence +typing and type inference in Scala. 
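As an editorial illustration of the feature described above (not part of this patch): the following self-contained sketch assumes a compiler in which the experimental `into` modifier is available; the `Chars` class, the `reversed` method, and the given `Conversion` are made-up names used only for the example. It shows a user-defined `into` parameter accepting a `String` argument through a `scala.Conversion` without `import scala.language.implicitConversions` at the call site.

```scala
// Hypothetical target type and conversion (illustration only).
class Chars(val underlying: List[Char])

given Conversion[String, Chars] = s => Chars(s.toList)

// Because the parameter type is marked `into`, implicit conversions may be
// applied to the argument without a language import at the call site.
def reversed(cs: into Chars): List[Char] = cs.underlying.reverse

@main def demo(): Unit =
  println(reversed("abc"))   // List(c, b, a)
```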
+ +## Syntax changes + +The addition to the grammar is: +``` +ParamType ::= [‘=>’] ParamValueType +ParamValueType ::= [‘into‘] ExactParamType +ExactParamType ::= Type [‘*’] +``` +As the grammar shows, `into` can only be applied to the type of a parameter; it is illegal in other positions. diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 077f06b2b4db..131bb100a91b 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -23,8 +23,11 @@ The default Scala language syntax version currently supported by the Dotty compi - [`3.2-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/2-migration$.html): the same as `3.2`, but in conjunction with `-rewrite`, offer code rewrites from Scala `3.0/3.1` to `3.2`. - [`future`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future$.html): A preview of changes that will be introduced in `3.x` versions after `3.2`. Some Scala 2 specific idioms are dropped in this version. The feature set supported by this version may grow over time as features become stabilised for preview. +- [`3.3`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3$.html): the same as `3.2`, but in addition: + -[Fewer braces syntax](https://docs.scala-lang.org/scala3/reference/other-new-features/indentation.html#optional-braces-for-method-arguments-1) is enabled by default. +- [`3.3-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$3/3-migration$.html): the same as `3.3` -- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.2`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. +- [`future-migration`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$future-migration$.html): Same as `future` but with additional helpers to migrate from `3.3`. Similarly to the helpers available under `3.0-migration`, these include migration warnings and optional rewrites. There are two ways to specify a language version : diff --git a/docs/_docs/reference/new-types/union-types-spec.md b/docs/_docs/reference/new-types/union-types-spec.md index d250d3f11713..1093631e7c63 100644 --- a/docs/_docs/reference/new-types/union-types-spec.md +++ b/docs/_docs/reference/new-types/union-types-spec.md @@ -72,6 +72,10 @@ a non-union type, for this purpose we define the _join_ of a union type `T1 | `T1`,...,`Tn`. Note that union types might still appear as type arguments in the resulting type, this guarantees that the join is always finite. +The _visible join_ of a union type is its join where all operands of the intersection that +are instances of [transparent](../other-new-features/transparent-traits.md) traits or classes are removed. + + ### Example Given @@ -80,31 +84,50 @@ Given trait C[+T] trait D trait E -class A extends C[A] with D -class B extends C[B] with D with E +transparent trait X +class A extends C[A], D, X +class B extends C[B], D, E, X ``` -The join of `A | B` is `C[A | B] & D` +The join of `A | B` is `C[A | B] & D & X` and the visible join of `A | B` is `C[A | B] & D`. + +## Hard and Soft Union Types + +We distinguish between hard and soft union types. 
A _hard_ union type is a union type that's explicitly +written in the source. For instance, in +```scala +val x: Int | String = ... +``` +`Int | String` would be a hard union type. A _soft_ union type is a type that arises from type checking +an alternative of expressions. For instance, the type of the expression +```scala +val x = 1 +val y = "abc" +if cond then x else y +``` +is the soft union type `Int | String`. Similarly for match expressions. The type of +```scala +x match + case 1 => x + case 2 => "abc" + case 3 => List(1, 2, 3) +``` +is the soft union type `Int | "abc" | List[Int]`. + ## Type inference When inferring the result type of a definition (`val`, `var`, or `def`) and the -type we are about to infer is a union type, then we replace it by its join. +type we are about to infer is a soft union type, then we replace it by its visible join, +provided it is not empty. Similarly, when instantiating a type argument, if the corresponding type parameter is not upper-bounded by a union type and the type we are about to -instantiate is a union type, we replace it by its join. This mirrors the +instantiate is a soft union type, we replace it by its visible join, provided it is not empty. +This mirrors the treatment of singleton types which are also widened to their underlying type unless explicitly specified. The motivation is the same: inferring types which are "too precise" can lead to unintuitive typechecking issues later on. -**Note:** Since this behavior limits the usability of union types, it might -be changed in the future. For example by not widening unions that have been -explicitly written down by the user and not inferred, or by not widening a type -argument when the corresponding type parameter is covariant. - -See [PR #2330](https://github.com/lampepfl/dotty/pull/2330) and -[Issue #4867](https://github.com/lampepfl/dotty/issues/4867) for further discussions. - ### Example ```scala diff --git a/docs/_docs/reference/new-types/union-types.md b/docs/_docs/reference/new-types/union-types.md index 76c0ac6e674c..3729cbf09848 100644 --- a/docs/_docs/reference/new-types/union-types.md +++ b/docs/_docs/reference/new-types/union-types.md @@ -8,8 +8,9 @@ A union type `A | B` includes all values of both types. ```scala -case class UserName(name: String) -case class Password(hash: Hash) +trait ID +case class UserName(name: String) extends ID +case class Password(hash: Hash) extends ID def help(id: UserName | Password) = val user = id match @@ -22,7 +23,10 @@ Union types are duals of intersection types. `|` is _commutative_: `A | B` is the same type as `B | A`. The compiler will assign a union type to an expression only if such a -type is explicitly given. This can be seen in the following [REPL](https://docs.scala-lang.org/overviews/repl/overview.html) transcript: +type is explicitly given or if the common supertype of all alternatives is [transparent](../other-new-features/transparent-traits.md). 
+ + +This can be seen in the following [REPL](https://docs.scala-lang.org/overviews/repl/overview.html) transcript: ```scala scala> val password = Password(123) @@ -32,15 +36,36 @@ scala> val name = UserName("Eve") val name: UserName = UserName(Eve) scala> if true then name else password -val res2: Object = UserName(Eve) +val res1: ID = UserName(Eve) scala> val either: Password | UserName = if true then name else password -val either: Password | UserName = UserName(Eve) +val either: UserName | Password = UserName(Eve) ``` - -The type of `res2` is `Object & Product`, which is a supertype of -`UserName` and `Password`, but not the least supertype `Password | -UserName`. If we want the least supertype, we have to give it +The type of `res1` is `ID`, which is a supertype of +`UserName` and `Password`, but not the least supertype `UserName | Password`. +If we want the least supertype, we have to give it explicitly, as is done for the type of `either`. +The inference behavior changes if the common supertrait `ID` is declared `transparent`: +```scala +transparent trait ID +``` +In that case the union type is not widened. +```scala +scala> if true then name else password +val res2: UserName | Password = UserName(Eve) +``` +The more precise union type is also inferred if `UserName` and `Password` are declared without an explicit +parent, since in that case their implied superclass is `Object`, which is among the classes that are +assumed to be transparent. See [Transparent Traits and Classes](../other-new-features/transparent-traits.md) +for a list of such classes. +```scala +case class UserName(name: String) +case class Password(hash: Hash) + +scala> if true then UserName("Eve") else Password(123) +val res3: UserName | Password = UserName(Eve) +``` + + [More details](./union-types-spec.md) diff --git a/docs/_docs/reference/other-new-features/experimental-defs.md b/docs/_docs/reference/other-new-features/experimental-defs.md index 225b61161652..88815ad1e136 100644 --- a/docs/_docs/reference/other-new-features/experimental-defs.md +++ b/docs/_docs/reference/other-new-features/experimental-defs.md @@ -216,7 +216,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi
Example 1 - + ```scala import scala.annotation.experimental @@ -242,7 +242,7 @@ Experimental definitions can only be referenced in an experimental scope. Experi } } ``` - +
5. Annotations of an experimental definition are in experimental scopes. Examples: @@ -270,13 +270,6 @@ Can use the `-Yno-experimental` compiler flag to disable it and run as a proper In any other situation, a reference to an experimental definition will cause a compilation error. -## Experimental inheritance - -All subclasses of an experimental `class` or `trait` must be marked as [`@experimental`](https://scala-lang.org/api/3.x/scala/annotation/experimental.html) even if they are in an experimental scope. -Anonymous classes and SAMs of experimental classes are considered experimental. - -We require explicit annotations to make sure we do not have completion or cycles issues with nested classes. This restriction could be relaxed in the future. - ## Experimental overriding For an overriding member `M` and overridden member `O`, if `O` is non-experimental then `M` must be non-experimental. diff --git a/docs/_docs/reference/other-new-features/indentation.md b/docs/_docs/reference/other-new-features/indentation.md index f60d2d462c82..40e2fc6fb38c 100644 --- a/docs/_docs/reference/other-new-features/indentation.md +++ b/docs/_docs/reference/other-new-features/indentation.md @@ -186,6 +186,60 @@ Refinement ::= :<<< [RefineDcl] {semi [RefineDcl]} >>> Packaging ::= ‘package’ QualId :<<< TopStats >>> ``` +## Optional Braces for Method Arguments + +Starting with Scala 3.3, a `` token is also recognized where a function argument would be expected. Examples: + +```scala +times(10): + println("ah") + println("ha") +``` + +or + +```scala +credentials `++`: + val file = Path.userHome / ".credentials" + if file.exists + then Seq(Credentials(file)) + else Seq() +``` + +or + +```scala +xs.map: + x => + val y = x - 1 + y * y +``` +What's more, a `:` in these settings can also be followed on the same line by the parameter part and arrow of a lambda. So the last example could be compressed to this: + +```scala +xs.map: x => + val y = x - 1 + y * y +``` +and the following would also be legal: +```scala +xs.foldLeft(0): (x, y) => + x + y +``` + +The grammar changes for optional braces around arguments are as follows. + +``` +SimpleExpr ::= ... + | SimpleExpr ColonArgument +InfixExpr ::= ... + | InfixExpr id ColonArgument +ColonArgument ::= colon [LambdaStart] + indent (CaseClauses | Block) outdent +LambdaStart ::= FunParams (‘=>’ | ‘?=>’) + | HkTypeParamClause ‘=>’ +``` + ## Spaces vs Tabs Indentation prefixes can consist of spaces and/or tabs. Indentation widths are the indentation prefixes themselves, ordered by the string prefix relation. So, so for instance "2 tabs, followed by 4 spaces" is strictly less than "2 tabs, followed by 5 spaces", but "2 tabs, followed by 4 spaces" is incomparable to "6 tabs" or to "4 spaces, followed by 2 tabs". It is an error if the indentation width of some line is incomparable with the indentation width of the region that's current at that point. To avoid such errors, it is a good idea not to mix spaces and tabs in the same source file. @@ -448,62 +502,3 @@ indented regions where possible. When invoked with options `-rewrite -no-indent` The `-indent` option only works on [new-style syntax](./control-syntax.md). So to go from old-style syntax to new-style indented code one has to invoke the compiler twice, first with options `-rewrite -new-syntax`, then again with options `-rewrite -indent`. To go in the opposite direction, from indented code to old-style syntax, it's `-rewrite -no-indent`, followed by `-rewrite -old-syntax`. 
-## Variant: Indentation Marker `:` for Arguments - -Generally, the possible indentation regions coincide with those regions where braces `{...}` are also legal, no matter whether the braces enclose an expression or a set of definitions. There is one exception, though: Arguments to functions can be enclosed in braces but they cannot be simply indented instead. Making indentation always significant for function arguments would be too restrictive and fragile. - -To allow such arguments to be written without braces, a variant of the indentation scheme is implemented under language import -```scala -import language.experimental.fewerBraces -``` -In this variant, a `` token is also recognized where function argument would be expected. Examples: - -```scala -times(10): - println("ah") - println("ha") -``` - -or - -```scala -credentials `++`: - val file = Path.userHome / ".credentials" - if file.exists - then Seq(Credentials(file)) - else Seq() -``` - -or - -```scala -xs.map: - x => - val y = x - 1 - y * y -``` -What's more, a `:` in these settings can also be followed on the same line by the parameter part and arrow of a lambda. So the last example could be compressed to this: - -```scala -xs.map: x => - val y = x - 1 - y * y -``` -and the following would also be legal: -```scala -xs.foldLeft(0): (x, y) => - x + y -``` - -The grammar changes for this variant are as follows. - -``` -SimpleExpr ::= ... - | SimpleExpr ColonArgument -InfixExpr ::= ... - | InfixExpr id ColonArgument -ColonArgument ::= colon [LambdaStart] - indent (CaseClauses | Block) outdent -LambdaStart ::= FunParams (‘=>’ | ‘?=>’) - | HkTypeParamClause ‘=>’ -``` diff --git a/docs/_docs/reference/other-new-features/transparent-traits.md b/docs/_docs/reference/other-new-features/transparent-traits.md index 699ce0b9ddd8..b930ffbfde00 100644 --- a/docs/_docs/reference/other-new-features/transparent-traits.md +++ b/docs/_docs/reference/other-new-features/transparent-traits.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: "Transparent Traits" +title: "Transparent Traits and Classes" nightlyOf: https://docs.scala-lang.org/scala3/reference/other-new-features/transparent-traits.html --- @@ -20,12 +20,13 @@ val x = Set(if condition then Val else Var) Here, the inferred type of `x` is `Set[Kind & Product & Serializable]` whereas one would have hoped it to be `Set[Kind]`. The reasoning for this particular type to be inferred is as follows: -- The type of the conditional above is the [union type](../new-types/union-types.md) `Val | Var`. -- A union type is widened in type inference to the least supertype that is not a union type. - In the example, this type is `Kind & Product & Serializable` since all three traits are traits of both `Val` and `Var`. +- The type of the conditional above is the [union type](../new-types/union-types.md) `Val | Var`. This union type is treated as "soft", which means it was not explicitly written in the source program, but came from forming an upper bound of the types of +some alternatives. +- A soft union type is widened in type inference to the least product of class or trait types that is a supertype of the union type. + In the example, this type is `Kind & Product & Serializable` since all three traits are super-traits of both `Val` and `Var`. So that type becomes the inferred element type of the set. -Scala 3 allows one to mark a mixin trait as `transparent`, which means that it can be suppressed in type inference. 
Here's an example that follows the lines of the code above, but now with a new transparent trait `S` instead of `Product`: +Scala 3 allows one to mark a trait or class as `transparent`, which means that it can be suppressed in type inference. Here's an example that follows the lines of the code above, but now with a new transparent trait `S` instead of `Product`: ```scala transparent trait S @@ -38,13 +39,40 @@ val x = Set(if condition then Val else Var) Now `x` has inferred type `Set[Kind]`. The common transparent trait `S` does not appear in the inferred type. -## Transparent Traits +In the previous example, one could also declare `Kind` as `transparent`: +```scala +transparent trait Kind +``` +The widened union type of `if condition then Val else Var` would then +_only_ contain the transparent traits `Kind` and `S`. In this case, +the widening is not performed at all, so `x` would have type `Set[Val | Var]`. + +The root classes and traits `Any`, `AnyVal`, `Object`, and `Matchable` are +considered to be transparent. This means that an expression such +as +```scala +if condition then 1 else "hello" +``` +will have type `Int | String` instead of the widened type `Any`. + -The traits [`scala.Product`](https://scala-lang.org/api/3.x/scala/Product.html), [`java.io.Serializable`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/io/Serializable.html) and [`java.lang.Comparable`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/lang/Comparable.html) -are treated automatically as transparent. Other traits are turned into transparent traits using the modifier `transparent`. Scala 2 traits can also be made transparent -by adding a [`@transparentTrait`](https://scala-lang.org/api/3.x/scala/annotation/transparentTrait.html) annotation. This annotation is defined in [`scala.annotation`](https://scala-lang.org/api/3.x/scala/annotation.html). It will be deprecated and phased out once Scala 2/3 interoperability is no longer needed. -Typically, transparent traits are traits +## Which Traits and Classes Are Transparent? + +Traits and classes are declared transparent by adding the modifier `transparent`. Scala 2 traits and classes can also be declared transparent by adding a [`@transparentTrait`](https://scala-lang.org/api/3.x/scala/annotation/transparentTrait.html) annotation. This annotation is defined in [`scala.annotation`](https://scala-lang.org/api/3.x/scala/annotation.html). It will be deprecated and phased out once Scala 2/3 interoperability is no longer needed. + +The following classes and traits are automatically treated as transparent: +```scala + scala.Any + scala.AnyVal + scala.Matchable + scala.Product + java.lang.Object + java.lang.Comparable + java.io.Serializable +``` + +Typically, transparent types other than the root classes are traits that influence the implementation of inheriting classes and traits that are not usually used as types by themselves. Two examples from the standard collection library are: - [`IterableOps`](https://scala-lang.org/api/3.x/scala/collection/IterableOps.html), which provides method implementations for an [`Iterable`](https://scala-lang.org/api/3.x/scala/collection/Iterable.html). @@ -55,7 +83,10 @@ declared transparent. ## Rules for Inference -Transparent traits can be given as explicit types as usual. But they are often elided when types are inferred. Roughly, the rules for type inference say that transparent traits are dropped from intersections where possible. 
+Transparent traits and classes can be given as explicit types as usual. But they are often elided when types are inferred. Roughly, the rules for type inference imply the following. + + - Transparent traits are dropped from intersections where possible. + - Union types are not widened if widening would result in only transparent supertypes. The precise rules are as follows: @@ -63,8 +94,8 @@ The precise rules are as follows: - where that type is not higher-kinded, - and where `B` is its known upper bound or `Any` if none exists: - If the type inferred so far is of the form `T1 & ... & Tn` where - `n >= 1`, replace the maximal number of transparent `Ti`s by `Any`, while ensuring that + `n >= 1`, replace the maximal number of transparent traits `Ti`s by `Any`, while ensuring that the resulting type is still a subtype of the bound `B`. -- However, do not perform this widening if all transparent traits `Ti` can get replaced in that way. +- However, do not perform this widening if all types `Ti` can get replaced in that way. This clause ensures that a single transparent trait instance such as [`Product`](https://scala-lang.org/api/3.x/scala/Product.html) is not widened to [`Any`](https://scala-lang.org/api/3.x/scala/Any.html). Transparent trait instances are only dropped when they appear in conjunction with some other type. -The last clause ensures that a single transparent trait instance such as [`Product`](https://scala-lang.org/api/3.x/scala/Product.html) is not widened to [`Any`](https://scala-lang.org/api/3.x/scala/Any.html). Transparent trait instances are only dropped when they appear in conjunction with some other type. +- If the original type was a union type that got widened in a previous step to a product consisting only of transparent traits and classes, keep the original union type instead of its widened form. 
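As an editorial aside (not part of this patch), here is a small sketch of how the two bullet points above play out; the traits and classes below are made up, and the inferred types assume a compiler that implements the rules described on this page.

```scala
transparent trait Pretty
trait Label
class Info extends Label, Pretty
class Warn extends Label, Pretty

def cond: Boolean = true

// The soft union Info | Warn widens to an intersection containing Label and Pretty,
// and the transparent Pretty is then dropped:
val a = if cond then Info() else Warn()   // inferred type: Label

transparent trait Tag
class Red extends Tag, Pretty
class Blue extends Tag, Pretty

// Widening Red | Blue would leave only transparent supertypes,
// so the soft union type is kept instead:
val b = if cond then Red() else Blue()    // inferred type: Red | Blue
```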
\ No newline at end of file diff --git a/docs/_docs/reference/syntax.md b/docs/_docs/reference/syntax.md index e11629c8eaf9..bc709fb1f870 100644 --- a/docs/_docs/reference/syntax.md +++ b/docs/_docs/reference/syntax.md @@ -249,6 +249,7 @@ Catches ::= ‘catch’ (Expr | ExprCaseClause) PostfixExpr ::= InfixExpr [id] -- only if language.postfixOperators is enabled InfixExpr ::= PrefixExpr | InfixExpr id [nl] InfixExpr + | InfixExpr id ColonArgument | InfixExpr MatchClause MatchClause ::= ‘match’ <<< CaseClauses >>> PrefixExpr ::= [PrefixOperator] SimpleExpr @@ -267,6 +268,11 @@ SimpleExpr ::= SimpleRef | SimpleExpr ‘.’ MatchClause | SimpleExpr TypeArgs | SimpleExpr ArgumentExprs + | SimpleExpr ColonArgument +ColonArgument ::= colon [LambdaStart] + indent (CaseClauses | Block) outdent +LambdaStart ::= FunParams (‘=>’ | ‘?=>’) + | HkTypeParamClause ‘=>’ Quoted ::= ‘'’ ‘{’ Block ‘}’ | ‘'’ ‘[’ Type ‘]’ ExprSplice ::= spliceId -- if inside quoted block @@ -306,7 +312,10 @@ TypeCaseClauses ::= TypeCaseClause { TypeCaseClause } TypeCaseClause ::= ‘case’ (InfixType | ‘_’) ‘=>’ Type [semi] Pattern ::= Pattern1 { ‘|’ Pattern1 } -Pattern1 ::= Pattern2 [‘:’ RefinedType] +Pattern1 ::= PatVar ‘:’ RefinedType + | [‘-’] integerLiteral ‘:’ RefinedType + | [‘-’] floatingPointLiteral ‘:’ RefinedType + | Pattern2 Pattern2 ::= [id ‘@’] InfixPattern [‘*’] InfixPattern ::= SimplePattern { id [nl] SimplePattern } SimplePattern ::= PatVar diff --git a/docs/_layouts/base.html b/docs/_layouts/base.html index 62823d08c751..feb79d1590a0 100644 --- a/docs/_layouts/base.html +++ b/docs/_layouts/base.html @@ -1,7 +1,3 @@ ---- -extraCSS: - - css/color-brewer.css ---- diff --git a/docs/sidebar.yml b/docs/sidebar.yml index f2ef04ad8d25..1e791472bceb 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -150,6 +150,7 @@ subsection: - page: reference/experimental/numeric-literals.md - page: reference/experimental/explicit-nulls.md - page: reference/experimental/main-annotation.md + - page: reference/experimental/into-modifier.md - page: reference/experimental/cc.md - page: reference/experimental/purefuns.md - page: reference/experimental/tupled-function.md diff --git a/interfaces/src/dotty/tools/dotc/interfaces/Diagnostic.java b/interfaces/src/dotty/tools/dotc/interfaces/Diagnostic.java index c46360afaa3d..19878a2fa105 100644 --- a/interfaces/src/dotty/tools/dotc/interfaces/Diagnostic.java +++ b/interfaces/src/dotty/tools/dotc/interfaces/Diagnostic.java @@ -1,6 +1,7 @@ package dotty.tools.dotc.interfaces; import java.util.Optional; +import java.util.List; /** A diagnostic is a message emitted during the compilation process. * @@ -23,4 +24,7 @@ public interface Diagnostic { /** @return The position in a source file of the code that caused this diagnostic * to be emitted. 
*/ Optional<SourcePosition> position(); + + /** @return A list of additional messages together with their code positions */ + List<DiagnosticRelatedInformation> diagnosticRelatedInformation(); } diff --git a/interfaces/src/dotty/tools/dotc/interfaces/DiagnosticRelatedInformation.java b/interfaces/src/dotty/tools/dotc/interfaces/DiagnosticRelatedInformation.java new file mode 100644 index 000000000000..3ebea03f4362 --- /dev/null +++ b/interfaces/src/dotty/tools/dotc/interfaces/DiagnosticRelatedInformation.java @@ -0,0 +1,6 @@ +package dotty.tools.dotc.interfaces; + +public interface DiagnosticRelatedInformation { + SourcePosition position(); + String message(); +} diff --git a/library/src/scala/CanEqual.scala b/library/src/scala/CanEqual.scala index dfb4ec7d2bfc..8c331bb21b43 100644 --- a/library/src/scala/CanEqual.scala +++ b/library/src/scala/CanEqual.scala @@ -1,7 +1,7 @@ package scala import annotation.implicitNotFound -import scala.collection.{Seq, Set} +import scala.collection.{Seq, Set, Map} /** A marker trait indicating that values of type `L` can be compared to values of type `R`. */ @implicitNotFound("Values of types ${L} and ${R} cannot be compared with == or !=") @@ -26,7 +26,7 @@ object CanEqual { given canEqualNumber: CanEqual[Number, Number] = derived given canEqualString: CanEqual[String, String] = derived - // The next 6 definitions can go into the companion objects of their corresponding + // The following definitions can go into the companion objects of their corresponding // classes. For now they are here in order not to have to touch the // source code of these classes given canEqualSeqs[T, U](using eq: CanEqual[T, U]): CanEqual[Seq[T], Seq[U]] = derived @@ -34,6 +34,10 @@ object CanEqual { given canEqualSet[T, U](using eq: CanEqual[T, U]): CanEqual[Set[T], Set[U]] = derived + given canEqualMap[K1, V1, K2, V2]( + using eqK: CanEqual[K1, K2], eqV: CanEqual[V1, V2] + ): CanEqual[Map[K1, V1], Map[K2, V2]] = derived + given canEqualOptions[T, U](using eq: CanEqual[T, U]): CanEqual[Option[T], Option[U]] = derived given canEqualOption[T](using eq: CanEqual[T, T]): CanEqual[Option[T], Option[T]] = derived // for `case None` in pattern matching diff --git a/library/src/scala/Tuple.scala b/library/src/scala/Tuple.scala index 703f8a1e2992..fa72e320b560 100644 --- a/library/src/scala/Tuple.scala +++ b/library/src/scala/Tuple.scala @@ -83,7 +83,7 @@ sealed trait Tuple extends Product { object Tuple { /** Type of a tuple with an element appended */ - type Append[X <: Tuple, Y] <: Tuple = X match { + type Append[X <: Tuple, Y] <: NonEmptyTuple = X match { case EmptyTuple => Y *: EmptyTuple case x *: xs => x *: Append[xs, Y] } diff --git a/library/src/scala/annotation/MacroAnnotation.scala b/library/src/scala/annotation/MacroAnnotation.scala new file mode 100644 index 000000000000..5c39ef45f417 --- /dev/null +++ b/library/src/scala/annotation/MacroAnnotation.scala @@ -0,0 +1,212 @@ +// TODO in which package should this class be located? +package scala +package annotation + +import scala.quoted._ + +/** Base trait for macro annotation implementation. + * Macro annotations can transform definitions and add new definitions. + * + * See: `MacroAnnotation.transform` + * + * @syntax markdown + */ +@experimental +trait MacroAnnotation extends StaticAnnotation: + + /** Transform the `tree` definition and add new definitions + * + * This method takes as argument the annotated definition. + * It returns a non-empty list containing the modified version of the annotated definition. 
+ * The new tree for the definition must use the original symbol. + * New definitions can be added to the list before or after the transformed definitions, this order + * will be retained. New definitions will not be visible from outside the macro expansion. + * + * #### Restrictions + * - All definitions in the result must have the same owner. The owner can be recovered from `Symbol.spliceOwner`. + * - Special case: an annotated top-level `def`, `val`, `var`, `lazy val` can return a `class`/`object` +definition that is owned by the package or package object. + * - Can not return a `type`. + * - Annotated top-level `class`/`object` can not return top-level `def`, `val`, `var`, `lazy val`. + * - Can not see new definition in user written code. + * + * #### Good practices + * - Make your new definitions private if you can. + * - New definitions added as class members should use a fresh name (`Symbol.freshName`) to avoid collisions. + * - New top-level definitions should use a fresh name (`Symbol.freshName`) that includes the name of the annotated + * member as a prefix to avoid collisions of definitions added in other files. + * + * **IMPORTANT**: When developing and testing a macro annotation, you must enable `-Xcheck-macros` and `-Ycheck:all`. + * + * #### Example 1 + * This example shows how to modify a `def` and add a `val` next to it using a macro annotation. + * ```scala + * import scala.quoted.* + * import scala.collection.mutable + * + * class memoize extends MacroAnnotation: + * def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + * import quotes.reflect._ + * tree match + * case DefDef(name, TermParamClause(param :: Nil) :: Nil, tpt, Some(rhsTree)) => + * (param.tpt.tpe.asType, tpt.tpe.asType) match + * case ('[t], '[u]) => + * val cacheName = Symbol.freshName(name + "Cache") + * val cacheSymbol = Symbol.newVal(Symbol.spliceOwner, cacheName, TypeRepr.of[mutable.Map[t, u]], Flags.Private, Symbol.noSymbol) + * val cacheRhs = + * given Quotes = cacheSymbol.asQuotes + * '{ mutable.Map.empty[t, u] }.asTerm + * val cacheVal = ValDef(cacheSymbol, Some(cacheRhs)) + * val newRhs = + * given Quotes = tree.symbol.asQuotes + * val cacheRefExpr = Ref(cacheSymbol).asExprOf[mutable.Map[t, u]] + * val paramRefExpr = Ref(param.symbol).asExprOf[t] + * val rhsExpr = rhsTree.asExprOf[u] + * '{ $cacheRefExpr.getOrElseUpdate($paramRefExpr, $rhsExpr) }.asTerm + * val newTree = DefDef.copy(tree)(name, TermParamClause(param :: Nil) :: Nil, tpt, Some(newRhs)) + * List(cacheVal, newTree) + * case _ => + * report.error("Annotation only supported on `def` with a single argument are supported") + * List(tree) + * ``` + * with this macro annotation a user can write + * ```scala + * //{ + * class memoize extends scala.annotation.StaticAnnotation + * //} + * @memoize + * def fib(n: Int): Int = + * println(s"compute fib of $n") + * if n <= 1 then n else fib(n - 1) + fib(n - 2) + * ``` + * and the macro will modify the definition to create + * ```scala + * val fibCache$macro$1 = + * scala.collection.mutable.Map.empty[Int, Int] + * def fib(n: Int): Int = + * fibCache$macro$1.getOrElseUpdate( + * n, + * { + * println(s"compute fib of $n") + * if n <= 1 then n else fib(n - 1) + fib(n - 2) + * } + * ) + * ``` + * + * #### Example 2 + * This example shows how to modify a `class` using a macro annotation. + * It shows how to override inherited members and add new ones. 
+ * ```scala + * import scala.annotation.{experimental, MacroAnnotation} + * import scala.quoted.* + * + * @experimental + * class equals extends MacroAnnotation: + * def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + * import quotes.reflect.* + * tree match + * case ClassDef(className, ctr, parents, self, body) => + * val cls = tree.symbol + * + * val constructorParameters = ctr.paramss.collect { case clause: TermParamClause => clause } + * if constructorParameters.size != 1 || constructorParameters.head.params.isEmpty then + * report.errorAndAbort("@equals class must have a single argument list with at least one argument", ctr.pos) + * def checkNotOverridden(sym: Symbol): Unit = + * if sym.overridingSymbol(cls).exists then + * report.error(s"Cannot override ${sym.name} in a @equals class") + * + * val fields = body.collect { + * case vdef: ValDef if vdef.symbol.flags.is(Flags.ParamAccessor) => + * Select(This(cls), vdef.symbol).asExpr + * } + * + * val equalsSym = Symbol.requiredMethod("java.lang.Object.equals") + * checkNotOverridden(equalsSym) + * val equalsOverrideSym = Symbol.newMethod(cls, "equals", equalsSym.info, Flags.Override, Symbol.noSymbol) + * def equalsOverrideDefBody(argss: List[List[Tree]]): Option[Term] = + * given Quotes = equalsOverrideSym.asQuotes + * cls.typeRef.asType match + * case '[c] => + * Some(equalsExpr[c](argss.head.head.asExpr, fields).asTerm) + * val equalsOverrideDef = DefDef(equalsOverrideSym, equalsOverrideDefBody) + * + * val hashSym = Symbol.newVal(cls, Symbol.freshName("hash"), TypeRepr.of[Int], Flags.Private | Flags.Lazy, Symbol.noSymbol) + * val hashVal = ValDef(hashSym, Some(hashCodeExpr(className, fields)(using hashSym.asQuotes).asTerm)) + * + * val hashCodeSym = Symbol.requiredMethod("java.lang.Object.hashCode") + * checkNotOverridden(hashCodeSym) + * val hashCodeOverrideSym = Symbol.newMethod(cls, "hashCode", hashCodeSym.info, Flags.Override, Symbol.noSymbol) + * val hashCodeOverrideDef = DefDef(hashCodeOverrideSym, _ => Some(Ref(hashSym))) + * + * val newBody = equalsOverrideDef :: hashVal :: hashCodeOverrideDef :: body + * List(ClassDef.copy(tree)(className, ctr, parents, self, newBody)) + * case _ => + * report.error("Annotation only supports `class`") + * List(tree) + * + * private def equalsExpr[T: Type](that: Expr[Any], thisFields: List[Expr[Any]])(using Quotes): Expr[Boolean] = + * '{ + * $that match + * case that: T @unchecked => + * ${ + * val thatFields: List[Expr[Any]] = + * import quotes.reflect.* + * thisFields.map(field => Select('{that}.asTerm, field.asTerm.symbol).asExpr) + * thisFields.zip(thatFields) + * .map { case (thisField, thatField) => '{ $thisField == $thatField } } + * .reduce { case (pred1, pred2) => '{ $pred1 && $pred2 } } + * } + * case _ => false + * } + * + * private def hashCodeExpr(className: String, thisFields: List[Expr[Any]])(using Quotes): Expr[Int] = + * '{ + * var acc: Int = ${ Expr(scala.runtime.Statics.mix(-889275714, className.hashCode)) } + * ${ + * Expr.block( + * thisFields.map { + * case '{ $field: Boolean } => '{ if $field then 1231 else 1237 } + * case '{ $field: Byte } => '{ $field.toInt } + * case '{ $field: Char } => '{ $field.toInt } + * case '{ $field: Short } => '{ $field.toInt } + * case '{ $field: Int } => field + * case '{ $field: Long } => '{ scala.runtime.Statics.longHash($field) } + * case '{ $field: Double } => '{ scala.runtime.Statics.doubleHash($field) } + * case '{ $field: Float } => '{ scala.runtime.Statics.floatHash($field) } + * 
case '{ $field: Null } => '{ 0 } + * case '{ $field: Unit } => '{ 0 } + * case field => '{ scala.runtime.Statics.anyHash($field) } + * }.map(hash => '{ acc = scala.runtime.Statics.mix(acc, $hash) }), + * '{ scala.runtime.Statics.finalizeHash(acc, ${Expr(thisFields.size)}) } + * ) + * } + * } + * ``` + * with this macro annotation a user can write + * ```scala + * //{ + * class equals extends scala.annotation.StaticAnnotation + * //} + * @equals class User(val name: String, val id: Int) + * ``` + * and the macro will modify the class definition to generate the following code + * ```scala + * class User(val name: String, val id: Int): + * override def equals(that: Any): Boolean = + * that match + * case that: User => this.name == that.name && this.id == that.id + * case _ => false + * private lazy val hash$macro$1: Int = + * var acc = 515782504 // scala.runtime.Statics.mix(-889275714, "User".hashCode) + * acc = scala.runtime.Statics.mix(acc, scala.runtime.Statics.anyHash(name)) + * acc = scala.runtime.Statics.mix(acc, id) + * scala.runtime.Statics.finalizeHash(acc, 2) + * override def hashCode(): Int = hash$macro$1 + * ``` + * + * @param Quotes Implicit instance of Quotes used for tree reflection + * @param tree Tree that will be transformed + * + * @syntax markdown + */ + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] diff --git a/library/src/scala/annotation/allowConversions.scala b/library/src/scala/annotation/allowConversions.scala new file mode 100644 index 000000000000..9d752ee26d21 --- /dev/null +++ b/library/src/scala/annotation/allowConversions.scala @@ -0,0 +1,10 @@ +package scala.annotation +import annotation.experimental + +/** An annotation on a parameter type that allows implicit conversions + * for its arguments. Intended for use by Scala 2, to annotate Scala 2 + * libraries. Scala 3 uses the `into` modifier on the parameter + * type instead. + */ +@experimental +class allowConversions extends scala.annotation.StaticAnnotation diff --git a/library/src/scala/caps.scala b/library/src/scala/caps.scala index 21b2f7a4dece..fb1721f98b35 100644 --- a/library/src/scala/caps.scala +++ b/library/src/scala/caps.scala @@ -7,12 +7,26 @@ import annotation.experimental /** The universal capture reference */ val `*`: Any = () - /** If argument is of type `cs T`, converts to type `box cs T`. This - * avoids the error that would be raised when boxing `*`. - */ - extension [T](x: T) def unsafeBox: T = x + object unsafe: + + /** If argument is of type `cs T`, converts to type `box cs T`. This + * avoids the error that would be raised when boxing `*`. + */ + extension [T](x: T) def unsafeBox: T = x + + /** If argument is of type `box cs T`, converts to type `cs T`. This + * avoids the error that would be raised when unboxing `*`. + */ + extension [T](x: T) def unsafeUnbox: T = x + + /** If argument is of type `box cs T`, converts to type `cs T`. This + * avoids the error that would be raised when unboxing `*`. + */ + extension [T, U](f: T => U) def unsafeBoxFunArg: T => U = f + end unsafe - /** If argument is of type `box cs T`, converts to type `cs T`. This - * avoids the error that would be raised when unboxing `*`. + /** Mixing in this trait forces a trait or class to be pure, i.e. + * have no capabilities retained in its self type. 
*/ - extension [T](x: T) def unsafeUnbox: T = x + trait Pure: + this: Pure => diff --git a/library/src/scala/deriving/Mirror.scala b/library/src/scala/deriving/Mirror.scala index 5de219dfe5c4..57453a516567 100644 --- a/library/src/scala/deriving/Mirror.scala +++ b/library/src/scala/deriving/Mirror.scala @@ -52,7 +52,6 @@ object Mirror { extension [T](p: ProductOf[T]) /** Create a new instance of type `T` with elements taken from product `a`. */ - @annotation.experimental def fromProductTyped[A <: scala.Product, Elems <: p.MirroredElemTypes](a: A)(using m: ProductOf[A] { type MirroredElemTypes = Elems }): T = p.fromProduct(a) diff --git a/library/src/scala/quoted/Quotes.scala b/library/src/scala/quoted/Quotes.scala index 3e2863f2260b..48a387e64169 100644 --- a/library/src/scala/quoted/Quotes.scala +++ b/library/src/scala/quoted/Quotes.scala @@ -14,7 +14,7 @@ import scala.reflect.TypeTest * } * ``` */ -transparent inline def quotes(using inline q: Quotes): q.type = q +transparent inline def quotes(using q: Quotes): q.type = q /** Quotation context provided by a macro expansion or in the scope of `scala.quoted.staging.run`. * Used to perform all operations on quoted `Expr` or `Type`. @@ -467,9 +467,33 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => * otherwise the can be `Term` containing the `New` applied to the parameters of the extended class. * @param body List of members of the class. The members must align with the members of `cls`. */ + // TODO add selfOpt: Option[ValDef]? @experimental def apply(cls: Symbol, parents: List[Tree /* Term | TypeTree */], body: List[Statement]): ClassDef def copy(original: Tree)(name: String, constr: DefDef, parents: List[Tree /* Term | TypeTree */], selfOpt: Option[ValDef], body: List[Statement]): ClassDef def unapply(cdef: ClassDef): (String, DefDef, List[Tree /* Term | TypeTree */], Option[ValDef], List[Statement]) + + + /** Create the ValDef and ClassDef of a module (equivalent to an `object` declaration in source code). + * + * Equivalent to + * ``` + * def module(module: Symbol, parents: List[Tree], body: List[Statement]): (ValDef, ClassDef) = + * val modCls = module.moduleClass + * val modClassDef = ClassDef(modCls, parents, body) + * val modValDef = ValDef(module, Some(Apply(Select(New(TypeIdent(modCls)), cls.primaryConstructor), Nil))) + * List(modValDef, modClassDef) + * ``` + * + * @param module the module symbol (created using `Symbol.newModule`) + * @param parents parents of the module class + * @param body body of the module class + * @return The module lazy val definition and module class definition. + * These should be added one after the other (in that order) in the body of a class or statements of a block. + * + * @syntax markdown + */ + // TODO add selfOpt: Option[ValDef]? + @experimental def module(module: Symbol, parents: List[Tree /* Term | TypeTree */], body: List[Statement]): (ValDef, ClassDef) } /** Makes extension methods on `ClassDef` available without any imports */ @@ -3638,8 +3662,67 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => * @note As a macro can only splice code into the point at which it is expanded, all generated symbols must be * direct or indirect children of the reflection context's owner. 
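Since `fromProductTyped` loses its `@experimental` flag above, a small usage sketch (with illustrative case classes): unlike `fromProduct`, the element types of the source product are checked against the target's at compile time.

```scala
import scala.deriving.Mirror

case class Point(x: Int, y: Int)
case class Vec(x: Int, y: Int)

// Accepted because Vec's MirroredElemTypes (Int, Int) match Point's;
// a mismatch would be a compile-time error rather than a runtime cast failure.
val p: Point = summon[Mirror.ProductOf[Point]].fromProductTyped(Vec(1, 2))
```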
*/ + // TODO: add flags and privateWithin @experimental def newClass(parent: Symbol, name: String, parents: List[TypeRepr], decls: Symbol => List[Symbol], selfType: Option[TypeRepr]): Symbol + /** Generates a new module symbol with an associated module class symbol, + * this is equivalent to an `object` declaration in source code. + * This method returns the module symbol. The module class can be accessed calling `moduleClass` on this symbol. + * + * Example usage: + * ```scala + * //{ + * given Quotes = ??? + * import quotes.reflect._ + * //} + * val moduleName: String = Symbol.freshName("MyModule") + * val parents = List(TypeTree.of[Object]) + * def decls(cls: Symbol): List[Symbol] = + * List(Symbol.newMethod(cls, "run", MethodType(Nil)(_ => Nil, _ => TypeRepr.of[Unit]), Flags.EmptyFlags, Symbol.noSymbol)) + * + * val mod = Symbol.newModule(Symbol.spliceOwner, moduleName, Flags.EmptyFlags, Flags.EmptyFlags, parents.map(_.tpe), decls, Symbol.noSymbol) + * val cls = mod.moduleClass + * val runSym = cls.declaredMethod("run").head + * + * val runDef = DefDef(runSym, _ => Some('{ println("run") }.asTerm)) + * val modDef = ClassDef.module(mod, parents, body = List(runDef)) + * + * val callRun = Apply(Select(Ref(mod), runSym), Nil) + * + * Block(modDef.toList, callRun) + * ``` + * constructs the equivalent to + * ```scala + * //{ + * given Quotes = ??? + * import quotes.reflect._ + * //} + * '{ + * object MyModule$macro$1 extends Object: + * def run(): Unit = println("run") + * MyModule$macro$1.run() + * } + * ``` + * + * @param parent The owner of the class + * @param name The name of the class + * @param modFlags extra flags with which the module symbol should be constructed + * @param clsFlags extra flags with which the module class symbol should be constructed + * @param parents The parent classes of the class. The first parent must not be a trait. + * @param decls A function that takes the symbol of the module class as input and return the symbols of its declared members + * @param privateWithin the symbol within which this new method symbol should be private. May be noSymbol. + * + * This symbol starts without an accompanying definition. + * It is the meta-programmer's responsibility to provide exactly one corresponding definition by passing + * this symbol to `ClassDef.module`. + * + * @note As a macro can only splice code into the point at which it is expanded, all generated symbols must be + * direct or indirect children of the reflection context's owner. + * + * @syntax markdown + */ + @experimental def newModule(owner: Symbol, name: String, modFlags: Flags, clsFlags: Flags, parents: List[TypeRepr], decls: Symbol => List[Symbol], privateWithin: Symbol): Symbol + /** Generates a new method symbol with the given parent, name and type. * * To define a member method of a class, use the `newMethod` within the `decls` function of `newClass`. @@ -3675,7 +3758,7 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => * It is the meta-programmer's responsibility to provide exactly one corresponding definition by passing * this symbol to the ValDef constructor. * - * Note: Also see reflect.let + * Note: Also see ValDef.let * * @param parent The owner of the val/var/lazy val * @param name The name of the val/var/lazy val @@ -3704,6 +3787,18 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => /** Definition not available */ def noSymbol: Symbol + + /** A fresh name for class or member symbol names. 
+ * + * Fresh names are constructed using the following format `prefix + "$macro$" + freshIndex`. + * The `freshIndex` are unique within the current source file. + * + * Examples: See `scala.annotation.MacroAnnotation` + * + * @param prefix Prefix of the fresh name + */ + @experimental + def freshName(prefix: String): String } /** Makes extension methods on `Symbol` available without any imports */ @@ -3734,6 +3829,10 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => /** The full name of this symbol up to the root package */ def fullName: String + /** Type of the definition */ + @experimental + def info: TypeRepr + /** The position of this symbol */ def pos: Option[Position] @@ -3879,17 +3978,17 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => def declaredTypes: List[Symbol] /** Type member with the given name directly declared in the class */ - @deprecated("Use typeMember", "3.1.0") + @deprecated("Use declaredType or typeMember", "3.1.0") def memberType(name: String): Symbol - /** Type member with the given name directly declared in the class */ + /** Type member with the given name declared or inherited in the class */ def typeMember(name: String): Symbol /** Type member directly declared in the class */ - @deprecated("Use typeMembers", "3.1.0") + @deprecated("Use declaredTypes or typeMembers", "3.1.0") def memberTypes: List[Symbol] - /** Type member directly declared in the class */ + /** Type member directly declared or inherited in the class */ def typeMembers: List[Symbol] /** All members directly declared in the class */ @@ -4201,7 +4300,7 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => // FLAGS // /////////////// - /** FlagSet of a Symbol */ + /** Flags of a Symbol */ type Flags /** Module object of `type Flags` */ @@ -4278,6 +4377,9 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => /** Is implemented as a Java static */ def JavaStatic: Flags + /** Is this an annotation defined in Java */ + @experimental def JavaAnnotation: Flags + /** Is this symbol `lazy` */ def Lazy: Flags @@ -4336,7 +4438,7 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => def StableRealizable: Flags /** Is this symbol marked as static. 
Mapped to static Java member */ - def Static: Flags + @deprecated("Use JavaStatic instead", "3.3.0") def Static: Flags /** Is this symbol to be tagged Java Synthetic */ def Synthetic: Flags @@ -4370,6 +4472,7 @@ trait Quotes { self: runtime.QuoteUnpickler & runtime.QuoteMatching => end extension } + /////////////// // POSITIONS // /////////////// diff --git a/library/src/scala/quoted/runtime/QuoteMatching.scala b/library/src/scala/quoted/runtime/QuoteMatching.scala index 2a76143e9868..c95ffe87b5dc 100644 --- a/library/src/scala/quoted/runtime/QuoteMatching.scala +++ b/library/src/scala/quoted/runtime/QuoteMatching.scala @@ -17,7 +17,7 @@ trait QuoteMatching: * - `ExprMatch.unapply('{ f(0, myInt) })('{ f(patternHole[Int], patternHole[Int]) }, _)` * will return `Some(Tuple2('{0}, '{ myInt }))` * - `ExprMatch.unapply('{ f(0, "abc") })('{ f(0, patternHole[Int]) }, _)` - * will return `None` due to the missmatch of types in the hole + * will return `None` due to the mismatch of types in the hole * * Holes: * - scala.quoted.runtime.Patterns.patternHole[T]: hole that matches an expression `x` of type `Expr[U]` @@ -27,7 +27,7 @@ trait QuoteMatching: * @param pattern `Expr[Any]` containing the pattern tree * @return None if it did not match, `Some(tup)` if it matched where `tup` contains `Expr[Ti]`` */ - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: Expr[Any])(using pattern: Expr[Any]): Option[Tup] + def unapply[TypeBindings, Tup <: Tuple](scrutinee: Expr[Any])(using pattern: Expr[Any]): Option[Tup] } val TypeMatch: TypeMatchModule @@ -40,5 +40,10 @@ trait QuoteMatching: * @param pattern `Type[?]` containing the pattern tree * @return None if it did not match, `Some(tup)` if it matched where `tup` contains `Type[Ti]`` */ - def unapply[TypeBindings <: Tuple, Tup <: Tuple](scrutinee: Type[?])(using pattern: Type[?]): Option[Tup] + def unapply[TypeBindings, Tup <: Tuple](scrutinee: Type[?])(using pattern: Type[?]): Option[Tup] } + +object QuoteMatching: + type KList + type KCons[+H <: AnyKind, +T <: KList] <: KList + type KNil <: KList diff --git a/library/src/scala/runtime/LazyVals.scala b/library/src/scala/runtime/LazyVals.scala index 0bb78aee94ad..d8c89c7abf28 100644 --- a/library/src/scala/runtime/LazyVals.scala +++ b/library/src/scala/runtime/LazyVals.scala @@ -9,19 +9,21 @@ import scala.annotation.* */ object LazyVals { @nowarn - private[this] val unsafe: sun.misc.Unsafe = - classOf[sun.misc.Unsafe].getDeclaredFields.nn.find { field => - field.nn.getType == classOf[sun.misc.Unsafe] && { - field.nn.setAccessible(true) - true - } - } - .map(_.nn.get(null).asInstanceOf[sun.misc.Unsafe]) - .getOrElse { - throw new ExceptionInInitializerError { - new IllegalStateException("Can't find instance of sun.misc.Unsafe") - } - } + private[this] val unsafe: sun.misc.Unsafe = { + def throwInitializationException() = + throw new ExceptionInInitializerError( + new IllegalStateException("Can't find instance of sun.misc.Unsafe") + ) + try + val unsafeField = classOf[sun.misc.Unsafe].getDeclaredField("theUnsafe").nn + if unsafeField.getType == classOf[sun.misc.Unsafe] then + unsafeField.setAccessible(true) + unsafeField.get(null).asInstanceOf[sun.misc.Unsafe] + else + throwInitializationException() + catch case _: NoSuchFieldException => + throwInitializationException() + } private[this] val base: Int = { val processors = java.lang.Runtime.getRuntime.nn.availableProcessors() @@ -43,28 +45,25 @@ object LazyVals { /* ------------- Start of public API ------------- */ - @experimental - sealed trait 
LazyValControlState + // This trait extends Serializable to fix #16806 that caused a race condition + sealed trait LazyValControlState extends Serializable /** * Used to indicate the state of a lazy val that is being * evaluated and of which other threads await the result. */ - @experimental final class Waiting extends CountDownLatch(1) with LazyValControlState /** * Used to indicate the state of a lazy val that is currently being * evaluated with no other thread awaiting its result. */ - @experimental object Evaluating extends LazyValControlState /** * Used to indicate the state of a lazy val that has been evaluated to * `null`. */ - @experimental object NullValue extends LazyValControlState final val BITS_PER_LAZY_VAL = 2L @@ -84,7 +83,6 @@ object LazyVals { unsafe.compareAndSwapLong(t, offset, e, n) } - @experimental def objCAS(t: Object, offset: Long, exp: Object, n: Object): Boolean = { if (debug) println(s"objCAS($t, $exp, $n)") @@ -145,7 +143,6 @@ object LazyVals { r } - @experimental def getStaticFieldOffset(field: java.lang.reflect.Field): Long = { @nowarn val r = unsafe.staticFieldOffset(field) diff --git a/library/src/scala/runtime/stdLibPatches/Predef.scala b/library/src/scala/runtime/stdLibPatches/Predef.scala index 3b7d009ff6f3..09feaf11c31d 100644 --- a/library/src/scala/runtime/stdLibPatches/Predef.scala +++ b/library/src/scala/runtime/stdLibPatches/Predef.scala @@ -31,7 +31,7 @@ object Predef: * @tparam T the type of the value to be summoned * @return the given value typed: the provided type parameter */ - transparent inline def summon[T](using inline x: T): x.type = x + transparent inline def summon[T](using x: T): x.type = x // Extension methods for working with explicit nulls diff --git a/library/src/scala/runtime/stdLibPatches/language.scala b/library/src/scala/runtime/stdLibPatches/language.scala index 5c01f66ffd46..401926dbab4d 100644 --- a/library/src/scala/runtime/stdLibPatches/language.scala +++ b/library/src/scala/runtime/stdLibPatches/language.scala @@ -51,6 +51,7 @@ object language: /** Experimental support for using indentation for arguments */ @compileTimeOnly("`fewerBraces` can only be used at compile time in import statements") + @deprecated("`fewerBraces` is now standard, no language import is needed", since = "3.3") object fewerBraces /** Experimental support for typechecked exception capabilities @@ -73,6 +74,14 @@ object language: */ @compileTimeOnly("`captureChecking` can only be used at compile time in import statements") object captureChecking + + /** Experimental support for automatic conversions of arguments, without requiring + * a langauge import `import scala.language.implicitConversions`. + * + * @see [[https://dotty.epfl.ch/docs/reference/experimental/into-modifier]] + */ + @compileTimeOnly("`into` can only be used at compile time in import statements") + object into end experimental /** The deprecated object contains features that are no longer officially suypported in Scala. @@ -192,7 +201,6 @@ object language: @compileTimeOnly("`3.2` can only be used at compile time in import statements") object `3.2` -/* This can be added when we go to 3.3 /** Set source version to 3.3-migration. 
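For the `summon` signature touched above (`transparent inline`, result type `x.type`), a self-contained sketch with an illustrative `Show` type class, showing that the result is typed with the singleton type of the resolved given:

```scala
trait Show[A]:
  def show(a: A): String

given intShow: Show[Int] with
  def show(a: Int): String = a.toString

// Because summon is transparent inline and returns `x.type`, the summoned
// value is typed as `intShow.type`, not merely `Show[Int]`.
val s: intShow.type = summon[Show[Int]]
```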
* * @see [[https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html]] @@ -206,5 +214,5 @@ object language: */ @compileTimeOnly("`3.3` can only be used at compile time in import statements") object `3.3` -*/ + end language diff --git a/library/src/scala/util/boundary.scala b/library/src/scala/util/boundary.scala new file mode 100644 index 000000000000..3c6c6982c7ee --- /dev/null +++ b/library/src/scala/util/boundary.scala @@ -0,0 +1,62 @@ +package scala.util + +/** A boundary that can be exited by `break` calls. + * `boundary` and `break` represent a unified and superior alternative for the + * `scala.util.control.NonLocalReturns` and `scala.util.control.Breaks` APIs. + * The main differences are: + * + * - Unified names: `boundary` to establish a scope, `break` to leave it. + * `break` can optionally return a value. + * - Integration with exceptions. `break`s are logically non-fatal exceptions. + * The `Break` exception class extends `RuntimeException` and is optimized so + * that stack trace generation is suppressed. + * - Better performance: breaks to enclosing scopes in the same method can + * be rewritten to jumps. + * + * Example usage: + * + * import scala.util.boundary, boundary.break + * + * def firstIndex[T](xs: List[T], elem: T): Int = + * boundary: + * for (x, i) <- xs.zipWithIndex do + * if x == elem then break(i) + * -1 + */ +object boundary: + + /** User code should call `break.apply` instead of throwing this exception + * directly. + */ + final class Break[T] private[boundary](val label: Label[T], val value: T) + extends RuntimeException( + /*message*/ null, /*cause*/ null, /*enableSuppression=*/ false, /*writableStackTrace*/ false) + + /** Labels are targets indicating which boundary will be exited by a `break`. + */ + final class Label[-T] + + /** Abort current computation and instead return `value` as the value of + * the enclosing `boundary` call that created `label`. + */ + def break[T](value: T)(using label: Label[T]): Nothing = + throw Break(label, value) + + /** Abort current computation and instead continue after the `boundary` call that + * created `label`. + */ + def break()(using label: Label[Unit]): Nothing = + throw Break(label, ()) + + /** Run `body` with freshly generated label as implicit argument. Catch any + * breaks associated with that label and return their results instead of + * `body`'s result. + */ + inline def apply[T](inline body: Label[T] ?=> T): T = + val local = Label[T]() + try body(using local) + catch case ex: Break[T] @unchecked => + if ex.label eq local then ex.value + else throw ex + +end boundary diff --git a/library/src/scala/util/control/NonLocalReturns.scala b/library/src/scala/util/control/NonLocalReturns.scala index c32e0ff16457..ad4dc05f36ac 100644 --- a/library/src/scala/util/control/NonLocalReturns.scala +++ b/library/src/scala/util/control/NonLocalReturns.scala @@ -7,8 +7,19 @@ package scala.util.control * import scala.util.control.NonLocalReturns.* * * returning { ... throwReturn(x) ... } + * + * This API has been deprecated. Its functionality is better served by + * + * - `scala.util.boundary` in place of `returning` + * - `scala.util.break` in place of `throwReturn` + * + * The new abstractions work with plain `RuntimeExceptions` and are more + * performant, since returns within the scope of the same method can be + * rewritten by the compiler to jumps. 
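A migration sketch for the deprecation described above, assuming a simple search helper originally written with `returning`/`throwReturn`:

```scala
import scala.util.boundary, boundary.break

// Before: returning { xs.foreach(x => if x < 0 then throwReturn(Some(x))); None }
def firstNegative(xs: List[Int]): Option[Int] =
  boundary:
    for x <- xs do
      if x < 0 then break(Some(x))
    None
```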
*/ +@deprecated("Use scala.util.boundary instead", "3.3") object NonLocalReturns { + @deprecated("Use scala.util.boundary.Break instead", "3.3") class ReturnThrowable[T] extends ControlThrowable { private var myResult: T = _ def throwReturn(result: T): Nothing = { @@ -19,10 +30,12 @@ object NonLocalReturns { } /** Performs a nonlocal return by throwing an exception. */ + @deprecated("Use scala.util.boundary.break instead", "3.3") def throwReturn[T](result: T)(using returner: ReturnThrowable[? >: T]): Nothing = returner.throwReturn(result) /** Enable nonlocal returns in `op`. */ + @deprecated("Use scala.util.boundary instead", "3.3") def returning[T](op: ReturnThrowable[T] ?=> T): T = { val returner = new ReturnThrowable[T] try op(using returner) diff --git a/project/Build.scala b/project/Build.scala index 5fab2b80229a..5aca4ace8d6a 100644 --- a/project/Build.scala +++ b/project/Build.scala @@ -80,9 +80,9 @@ object DottyJSPlugin extends AutoPlugin { object Build { import ScaladocConfigs._ - val referenceVersion = "3.2.1" + val referenceVersion = "3.2.2" - val baseVersion = "3.2.2" + val baseVersion = "3.3.0" // Versions used by the vscode extension to create a new project // This should be the latest published releases. @@ -98,7 +98,7 @@ object Build { * set to 3.1.3. If it is going to be 3.1.0, it must be set to the latest * 3.0.x release. */ - val previousDottyVersion = "3.2.1" + val previousDottyVersion = "3.2.2" object CompatMode { final val BinaryCompatible = 0 @@ -489,7 +489,8 @@ object Build { settings(commonJavaSettings). settings(commonMiMaSettings). settings( - versionScheme := Some("semver-spec") + versionScheme := Some("semver-spec"), + mimaBinaryIssueFilters ++= MiMaFilters.Interfaces ) /** Find an artifact with the given `name` in `classpath` */ @@ -545,7 +546,7 @@ object Build { // get libraries onboard libraryDependencies ++= Seq( - "org.scala-lang.modules" % "scala-asm" % "9.3.0-scala-1", // used by the backend + "org.scala-lang.modules" % "scala-asm" % "9.4.0-scala-1", // used by the backend Dependencies.oldCompilerInterface, // we stick to the old version to avoid deprecation warnings "org.jline" % "jline-reader" % "3.19.0", // used by the REPL "org.jline" % "jline-terminal" % "3.19.0", @@ -607,7 +608,7 @@ object Build { if (args.contains("--help")) { println( s""" - |usage: testCompilation [--help] [--from-tasty] [--update-checkfiles] [] + |usage: testCompilation [--help] [--from-tasty] [--update-checkfiles] [--failed] [] | |By default runs tests in dotty.tools.dotc.*CompilationTests and dotty.tools.dotc.coverage.*, |excluding tests tagged with dotty.SlowTests. 
@@ -615,6 +616,7 @@ object Build { | --help show this message | --from-tasty runs tests in dotty.tools.dotc.FromTastyTests | --update-checkfiles override the checkfiles that did not match with the current output + | --failed re-run only failed tests | substring of the path of the tests file | """.stripMargin @@ -623,11 +625,13 @@ object Build { } else { val updateCheckfile = args.contains("--update-checkfiles") + val rerunFailed = args.contains("--failed") val fromTasty = args.contains("--from-tasty") - val args1 = if (updateCheckfile | fromTasty) args.filter(x => x != "--update-checkfiles" && x != "--from-tasty") else args + val args1 = if (updateCheckfile | fromTasty | rerunFailed) args.filter(x => x != "--update-checkfiles" && x != "--from-tasty" && x != "--failed") else args val test = if (fromTasty) "dotty.tools.dotc.FromTastyTests" else "dotty.tools.dotc.*CompilationTests dotty.tools.dotc.coverage.*" val cmd = s" $test -- --exclude-categories=dotty.SlowTests" + (if (updateCheckfile) " -Ddotty.tests.updateCheckfiles=TRUE" else "") + + (if (rerunFailed) " -Ddotty.tests.rerunFailed=TRUE" else "") + (if (args1.nonEmpty) " -Ddotty.tests.filter=" + args1.mkString(" ") else "") (Test / testOnly).toTask(cmd) } @@ -1051,15 +1055,13 @@ object Build { // with the bootstrapped library on the classpath. lazy val `scala3-sbt-bridge-tests` = project.in(file("sbt-bridge/test")). dependsOn(dottyCompiler(Bootstrapped) % Test). + dependsOn(`scala3-sbt-bridge`). settings(commonBootstrappedSettings). settings( Compile / sources := Seq(), Test / scalaSource := baseDirectory.value, Test / javaSource := baseDirectory.value, - - // Tests disabled until zinc-api-info cross-compiles with 2.13, - // alternatively we could just copy in sources the part of zinc-api-info we need. - Test / sources := Seq() + libraryDependencies += ("org.scala-sbt" %% "zinc-apiinfo" % "1.8.0" % Test).cross(CrossVersion.for3Use2_13) ) lazy val `scala3-language-server` = project.in(file("language-server")). @@ -1189,6 +1191,9 @@ object Build { "isFullOpt" -> (stage == FullOptStage), "compliantAsInstanceOfs" -> (sems.asInstanceOfs == CheckedBehavior.Compliant), "compliantArrayIndexOutOfBounds" -> (sems.arrayIndexOutOfBounds == CheckedBehavior.Compliant), + "compliantArrayStores" -> (sems.arrayStores == CheckedBehavior.Compliant), + "compliantNegativeArraySizes" -> (sems.negativeArraySizes == CheckedBehavior.Compliant), + "compliantStringIndexOutOfBounds" -> (sems.stringIndexOutOfBounds == CheckedBehavior.Compliant), "compliantModuleInit" -> (sems.moduleInit == CheckedBehavior.Compliant), "strictFloats" -> sems.strictFloats, "productionMode" -> sems.productionMode, @@ -1265,6 +1270,14 @@ object Build { ) }, + /* For some reason, in Scala 3, the implementation of IterableDefaultTest + * resolves to `scala.collection.ArrayOps.ArrayIterator`, whose `next()` + * method is not compliant when called past the last element on Scala.js. + * It relies on catching an `ArrayIndexOutOfBoundsException`. + * We have to ignore it here. 
+ */ + Test / testOptions := Seq(Tests.Filter(_ != "org.scalajs.testsuite.javalib.lang.IterableDefaultTest")), + Test / managedResources ++= { val testDir = fetchScalaJSSource.value / "test-suite/js/src/test" @@ -1300,6 +1313,7 @@ object Build { Seq( "-Ddotty.tests.classes.dottyLibraryJS=" + dottyLibraryJSJar, + "-Ddotty.tests.classes.scalaJSJavalib=" + findArtifactPath(externalJSDeps, "scalajs-javalib"), "-Ddotty.tests.classes.scalaJSLibrary=" + findArtifactPath(externalJSDeps, "scalajs-library_2.13"), ) }, @@ -1820,9 +1834,10 @@ object Build { settings(disableDocSetting). settings( versionScheme := Some("semver-spec"), - if (mode == Bootstrapped) { - commonMiMaSettings - } else { + if (mode == Bootstrapped) Def.settings( + commonMiMaSettings, + mimaBinaryIssueFilters ++= MiMaFilters.TastyCore, + ) else { Nil } ) diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 3708ec528c79..1dbf732a5b6e 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -25,6 +25,6 @@ object Dependencies { "com.vladsch.flexmark" % "flexmark-ext-yaml-front-matter" % flexmarkVersion, ) - val newCompilerInterface = "org.scala-sbt" % "compiler-interface" % "1.7.1" + val newCompilerInterface = "org.scala-sbt" % "compiler-interface" % "1.8.0" val oldCompilerInterface = "org.scala-sbt" % "compiler-interface" % "1.3.5" } diff --git a/project/DocumentationWebsite.scala b/project/DocumentationWebsite.scala index ec32144ac0a5..e24917a60803 100644 --- a/project/DocumentationWebsite.scala +++ b/project/DocumentationWebsite.scala @@ -42,7 +42,7 @@ object DocumentationWebsite { import _root_.scala.concurrent._ import _root_.scala.concurrent.duration.Duration import ExecutionContext.Implicits.global - val inkuireVersion = "1.0.0-M3" + val inkuireVersion = "v1.0.0-M7" val inkuireLink = s"https://github.com/VirtusLab/Inkuire/releases/download/$inkuireVersion/inkuire.js" val inkuireDestinationFile = baseDest / "dotty_res" / "scripts" / "inkuire.js" sbt.IO.touch(inkuireDestinationFile) diff --git a/project/MiMaFilters.scala b/project/MiMaFilters.scala index 81510d22d2c2..689a4b8f1614 100644 --- a/project/MiMaFilters.scala +++ b/project/MiMaFilters.scala @@ -3,21 +3,12 @@ import com.typesafe.tools.mima.core._ object MiMaFilters { val Library: Seq[ProblemFilter] = Seq( - ProblemFilters.exclude[MissingClassProblem]("scala.annotation.internal.MappedAlternative"), - - ProblemFilters.exclude[DirectMissingMethodProblem]("scala.runtime.LazyVals.getStaticFieldOffset"), - ProblemFilters.exclude[DirectMissingMethodProblem]("scala.runtime.LazyVals.objCAS"), - ProblemFilters.exclude[MissingClassProblem]("scala.runtime.LazyVals$LazyValControlState"), - ProblemFilters.exclude[MissingClassProblem]("scala.runtime.LazyVals$Evaluating$"), - ProblemFilters.exclude[MissingClassProblem]("scala.runtime.LazyVals$NullValue$"), - ProblemFilters.exclude[MissingClassProblem]("scala.runtime.LazyVals$Waiting"), - ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.LazyVals.Evaluating"), - ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.LazyVals.NullValue"), - - ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.stdLibPatches.language#experimental.pureFunctions"), - ProblemFilters.exclude[MissingFieldProblem]("scala.runtime.stdLibPatches.language#experimental.captureChecking"), - ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$pureFunctions$"), - 
ProblemFilters.exclude[MissingClassProblem]("scala.runtime.stdLibPatches.language$experimental$captureChecking$"), - ProblemFilters.exclude[MissingClassProblem]("scala.caps"), + ProblemFilters.exclude[DirectMissingMethodProblem]("scala.caps.unsafeBox"), + ProblemFilters.exclude[DirectMissingMethodProblem]("scala.caps.unsafeUnbox"), + ) + val TastyCore: Seq[ProblemFilter] = Seq() + val Interfaces: Seq[ProblemFilter] = Seq( + ProblemFilters.exclude[MissingClassProblem]("dotty.tools.dotc.interfaces.DiagnosticRelatedInformation"), + ProblemFilters.exclude[ReversedMissingMethodProblem]("dotty.tools.dotc.interfaces.Diagnostic.diagnosticRelatedInformation") ) } diff --git a/project/build.properties b/project/build.properties index 22af2628c413..8b9a0b0ab037 100644 --- a/project/build.properties +++ b/project/build.properties @@ -1 +1 @@ -sbt.version=1.7.1 +sbt.version=1.8.0 diff --git a/project/plugins.sbt b/project/plugins.sbt index b6bc5f1184b6..aba843ca2c3c 100644 --- a/project/plugins.sbt +++ b/project/plugins.sbt @@ -2,7 +2,11 @@ // // e.g. addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.1.0") -addSbtPlugin("org.scala-js" % "sbt-scalajs" % "1.10.1") +// some plugins haven't moved to scala-xml 2.x yet +libraryDependencySchemes += + "org.scala-lang.modules" %% "scala-xml" % VersionScheme.Always + +addSbtPlugin("org.scala-js" % "sbt-scalajs" % "1.12.0") addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "3.9.10") diff --git a/project/scripts/bisect.scala b/project/scripts/bisect.scala new file mode 100755 index 000000000000..e60e632a7feb --- /dev/null +++ b/project/scripts/bisect.scala @@ -0,0 +1,240 @@ +/* +This script will bisect a problem with the compiler based on success/failure of the validation script passed as an argument. +It starts with a fast bisection on released nightly builds. +Then it will bisect the commits between the last nightly that worked and the first nightly that failed. +Look at the `usageMessage` below for more details. +*/ + + +import sys.process._ +import scala.io.Source +import java.io.File +import java.nio.file.attribute.PosixFilePermissions +import java.nio.charset.StandardCharsets +import java.nio.file.Files + +val usageMessage = """ + |Usage: + | > scala-cli project/scripts/bisect.scala -- [] + | + |The should be one of: + |* compile ... + |* run ... + |* + | + |The arguments for 'compile' and 'run' should be paths to the source file(s) and optionally additional options passed directly to scala-cli. + | + |A custom validation script should be executable and accept a single parameter, which will be the scala version to validate. + |Look at bisect-cli-example.sh and bisect-expect-example.exp for reference. + |If you want to use one of the example scripts - use a copy of the file instead of modifying it in place because that might mess up the checkout. + | + |The optional may be any combination of: + |* --dry-run + | Don't try to bisect - just make sure the validation command works correctly + |* --releases + | Bisect only releases from the given range (defaults to all releases). + | The range format is ..., where both and are optional, e.g. + | * 3.1.0-RC1-bin-20210827-427d313-NIGHTLY..3.2.1-RC1-bin-20220716-bb9c8ff-NIGHTLY + | * 3.2.1-RC1-bin-20220620-de3a82c-NIGHTLY.. + | * ..3.3.0-RC1-bin-20221124-e25362d-NIGHTLY + | The ranges are treated as inclusive. 
+ |* --bootstrapped + | Publish locally and test a bootstrapped compiler rather than a nonboostrapped one + | + |Warning: The bisect script should not be run multiple times in parallel because of a potential race condition while publishing artifacts locally. + +""".stripMargin + +@main def run(args: String*): Unit = + val scriptOptions = + try ScriptOptions.fromArgs(args) + catch + case _ => + sys.error(s"Wrong script parameters.\n${usageMessage}") + + val validationScript = scriptOptions.validationCommand.validationScript + val releases = Releases.fromRange(scriptOptions.releasesRange) + val releaseBisect = ReleaseBisect(validationScript, releases) + + releaseBisect.verifyEdgeReleases() + + if (!scriptOptions.dryRun) then + val (lastGoodRelease, firstBadRelease) = releaseBisect.bisectedGoodAndBadReleases() + println(s"Last good release: ${lastGoodRelease.version}") + println(s"First bad release: ${firstBadRelease.version}") + println("\nFinished bisecting releases\n") + + val commitBisect = CommitBisect(validationScript, bootstrapped = scriptOptions.bootstrapped, lastGoodRelease.hash, firstBadRelease.hash) + commitBisect.bisect() + + +case class ScriptOptions(validationCommand: ValidationCommand, dryRun: Boolean, bootstrapped: Boolean, releasesRange: ReleasesRange) +object ScriptOptions: + def fromArgs(args: Seq[String]) = + val defaultOptions = ScriptOptions( + validationCommand = null, + dryRun = false, + bootstrapped = false, + ReleasesRange(first = None, last = None) + ) + parseArgs(args, defaultOptions) + + private def parseArgs(args: Seq[String], options: ScriptOptions): ScriptOptions = + args match + case "--dry-run" :: argsRest => parseArgs(argsRest, options.copy(dryRun = true)) + case "--bootstrapped" :: argsRest => parseArgs(argsRest, options.copy(bootstrapped = true)) + case "--releases" :: argsRest => + val range = ReleasesRange.tryParse(argsRest.head).get + parseArgs(argsRest.tail, options.copy(releasesRange = range)) + case _ => + val command = ValidationCommand.fromArgs(args) + options.copy(validationCommand = command) + +enum ValidationCommand: + case Compile(args: Seq[String]) + case Run(args: Seq[String]) + case CustomValidationScript(scriptFile: File) + + def validationScript: File = this match + case Compile(args) => + ValidationScript.tmpScalaCliScript(command = "compile", args) + case Run(args) => + ValidationScript.tmpScalaCliScript(command = "run", args) + case CustomValidationScript(scriptFile) => + ValidationScript.copiedFrom(scriptFile) + +object ValidationCommand: + def fromArgs(args: Seq[String]) = args match + case Seq("compile", commandArgs*) => Compile(commandArgs) + case Seq("run", commandArgs*) => Run(commandArgs) + case Seq(path) => CustomValidationScript(new File(path)) + + +object ValidationScript: + def copiedFrom(file: File): File = + val fileContent = scala.io.Source.fromFile(file).mkString + tmpScript(fileContent) + + def tmpScalaCliScript(command: String, args: Seq[String]): File = tmpScript(s""" + |#!/usr/bin/env bash + |scala-cli ${command} -S "$$1" --server=false ${args.mkString(" ")} + |""".stripMargin + ) + + private def tmpScript(content: String): File = + val executableAttr = PosixFilePermissions.asFileAttribute(PosixFilePermissions.fromString("rwxr-xr-x")) + val tmpPath = Files.createTempFile("scala-bisect-validator", "", executableAttr) + val tmpFile = tmpPath.toFile + + print(s"Bisecting with validation script: ${tmpPath.toAbsolutePath}\n") + print("#####################################\n") + print(s"${content}\n\n") + 
print("#####################################\n\n") + + tmpFile.deleteOnExit() + Files.write(tmpPath, content.getBytes(StandardCharsets.UTF_8)) + tmpFile + + +case class ReleasesRange(first: Option[String], last: Option[String]): + def filter(releases: Seq[Release]) = + def releaseIndex(version: String): Int = + val index = releases.indexWhere(_.version == version) + assert(index > 0, s"${version} matches no nightly compiler release") + index + + val startIdx = first.map(releaseIndex(_)).getOrElse(0) + val endIdx = last.map(releaseIndex(_) + 1).getOrElse(releases.length) + val filtered = releases.slice(startIdx, endIdx).toVector + assert(filtered.nonEmpty, "No matching releases") + filtered + +object ReleasesRange: + def all = ReleasesRange(None, None) + def tryParse(range: String): Option[ReleasesRange] = range match + case s"${first}...${last}" => Some(ReleasesRange( + Some(first).filter(_.nonEmpty), + Some(last).filter(_.nonEmpty) + )) + case _ => None + +class Releases(val releases: Vector[Release]) + +object Releases: + lazy val allReleases: Vector[Release] = + val re = raw"""(?<=title=")(.+-bin-\d{8}-\w{7}-NIGHTLY)(?=/")""".r + val html = Source.fromURL("https://repo1.maven.org/maven2/org/scala-lang/scala3-compiler_3/") + re.findAllIn(html.mkString).map(Release.apply).toVector + + def fromRange(range: ReleasesRange): Vector[Release] = range.filter(allReleases) + +case class Release(version: String): + private val re = raw".+-bin-(\d{8})-(\w{7})-NIGHTLY".r + def date: String = + version match + case re(date, _) => date + case _ => sys.error(s"Could not extract date from release name: $version") + def hash: String = + version match + case re(_, hash) => hash + case _ => sys.error(s"Could not extract hash from release name: $version") + + override def toString: String = version + + +class ReleaseBisect(validationScript: File, allReleases: Vector[Release]): + assert(allReleases.length > 1, "Need at least 2 releases to bisect") + + private val isGoodReleaseCache = collection.mutable.Map.empty[Release, Boolean] + + def verifyEdgeReleases(): Unit = + println(s"Verifying the first release: ${allReleases.head.version}") + assert(isGoodRelease(allReleases.head), s"The evaluation script unexpectedly failed for the first checked release") + println(s"Verifying the last release: ${allReleases.last.version}") + assert(!isGoodRelease(allReleases.last), s"The evaluation script unexpectedly succeeded for the last checked release") + + def bisectedGoodAndBadReleases(): (Release, Release) = + val firstBadRelease = bisect(allReleases) + assert(!isGoodRelease(firstBadRelease), s"Bisection error: the 'first bad release' ${firstBadRelease.version} is not a bad release") + val lastGoodRelease = firstBadRelease.previous + assert(isGoodRelease(lastGoodRelease), s"Bisection error: the 'last good release' ${lastGoodRelease.version} is not a good release") + (lastGoodRelease, firstBadRelease) + + extension (release: Release) private def previous: Release = + val idx = allReleases.indexOf(release) + allReleases(idx - 1) + + private def bisect(releases: Vector[Release]): Release = + if releases.length == 2 then + if isGoodRelease(releases.head) then releases.last + else releases.head + else + val mid = releases(releases.length / 2) + if isGoodRelease(mid) then bisect(releases.drop(releases.length / 2)) + else bisect(releases.take(releases.length / 2 + 1)) + + private def isGoodRelease(release: Release): Boolean = + isGoodReleaseCache.getOrElseUpdate(release, { + println(s"Testing ${release.version}") + val result = 
Seq(validationScript.getAbsolutePath, release.version).! + val isGood = result == 0 + println(s"Test result: ${release.version} is a ${if isGood then "good" else "bad"} release\n") + isGood + }) + +class CommitBisect(validationScript: File, bootstrapped: Boolean, lastGoodHash: String, fistBadHash: String): + def bisect(): Unit = + println(s"Starting bisecting commits $lastGoodHash..$fistBadHash\n") + val scala3CompilerProject = if bootstrapped then "scala3-compiler-bootstrapped" else "scala3-compiler" + val scala3Project = if bootstrapped then "scala3-bootstrapped" else "scala3" + val bisectRunScript = s""" + |scalaVersion=$$(sbt "print ${scala3CompilerProject}/version" | tail -n1) + |rm -r out + |sbt "clean; ${scala3Project}/publishLocal" + |${validationScript.getAbsolutePath} "$$scalaVersion" + """.stripMargin + "git bisect start".! + s"git bisect bad $fistBadHash".! + s"git bisect good $lastGoodHash".! + Seq("git", "bisect", "run", "sh", "-c", bisectRunScript).! + s"git bisect reset".! diff --git a/project/scripts/cmdScaladocTests b/project/scripts/cmdScaladocTests index 5d33e3bd7b37..2168e3e8e334 100755 --- a/project/scripts/cmdScaladocTests +++ b/project/scripts/cmdScaladocTests @@ -17,7 +17,7 @@ DOTTY_BOOTSTRAPPED_VERSION_COMMAND="$SBT \"eval println(Build.dottyVersion)\"" DOTTY_BOOTSTRAPPED_VERSION=$(eval $DOTTY_BOOTSTRAPPED_VERSION_COMMAND | tail -n 2 | head -n 1) SOURCE_LINKS_REPOSITORY="lampepfl/dotty" -SOURCE_LINKS_VERSION="$DOTTY_BOOTSTRAPPED_VERSION" +SOURCE_LINKS_VERSION="${GITHUB_SHA:-$DOTTY_BOOTSTRAPPED_VERSION}" "$SBT" "scaladoc/generateTestcasesDocumentation" > "$tmp" 2>&1 || echo "generated testcases project with sbt" dist/target/pack/bin/scaladoc \ @@ -37,9 +37,9 @@ dist/target/pack/bin/scaladoc \ "-snippet-compiler:scaladoc-testcases/docs=compile" \ "-comment-syntax:scaladoc-testcases/src/example/comment-md=markdown,scaladoc-testcases/src/example/comment-wiki=wiki" \ -siteroot scaladoc-testcases/docs \ - -project-footer "Copyright (c) 2002-2022, LAMP/EPFL" \ + -project-footer "Copyright (c) 2002-2023, LAMP/EPFL" \ -default-template static-site-main \ -author -groups -revision main -project-version "${DOTTY_BOOTSTRAPPED_VERSION}" \ - "-quick-links:Learn::https://docs.scala-lang.org/,Install::https://www.scala-lang.org/download/,Playground::https://scastie.scala-lang.org,Find A Library::https://index.scala-lang.org,Community::https://www.scala-lang.org/community/,Blog::https://www.scala-lang.org/blog/" \ + "-quick-links:Learn::https://docs.scala-lang.org/,Install::https://www.scala-lang.org/download/,Playground::https://scastie.scala-lang.org,Find A Library::https://index.scala-lang.org,Community::https://www.scala-lang.org/community/,Blog::https://www.scala-lang.org/blog/," \ out/bootstrap/scaladoc-testcases/scala-"${DOTTY_NONBOOTSTRAPPED_VERSION}"/classes > "$tmp" 2>&1 || echo "generated testcases project with scripts" diff -rq "$OUT1" "scaladoc/output/testcases" diff --git a/project/scripts/dottyCompileBisect.scala b/project/scripts/dottyCompileBisect.scala deleted file mode 100644 index fc61e63bdb78..000000000000 --- a/project/scripts/dottyCompileBisect.scala +++ /dev/null @@ -1,73 +0,0 @@ -// Usage -// > scala-cli project/scripts/dottyCompileBisect.scala -- File1.scala File2.scala -// -// This script will bisect the compilation failure starting with a fast bisection on released nightly builds. -// Then it will bisect the commits between the last nightly that worked and the first nightly that failed. 
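For reference, a standalone sketch of the nightly version scheme that both the new and the old bisect scripts parse (the sample version is taken from the usage message above):

```scala
@main def nightlyVersionDemo(): Unit =
  // Nightly versions follow <base>-bin-<yyyymmdd>-<sha7>-NIGHTLY; the short
  // commit hash is what seeds `git bisect` once the release range is narrowed.
  val Nightly = raw".+-bin-(\d{8})-(\w{7})-NIGHTLY".r
  "3.3.0-RC1-bin-20221124-e25362d-NIGHTLY" match
    case Nightly(date, hash) => println(s"date=$date hash=$hash")
    case _                   => println("not a nightly version")
```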
- - -import sys.process._ -import scala.io.Source -import Releases.Release - -@main def dottyCompileBisect(files: String*): Unit = - val releaseBisect = ReleaseBisect(files.toList) - val fistBadRelease = releaseBisect.bisect(Releases.allReleases) - println("\nFinished bisecting releases\n") - fistBadRelease.previous match - case Some(lastGoodRelease) => - println(s"Last good release: $lastGoodRelease\nFirst bad release: $fistBadRelease\n") - val commitBisect = CommitBisect(files.toList) - commitBisect.bisect(lastGoodRelease.hash, fistBadRelease.hash) - case None => - println(s"No good release found") - -class ReleaseBisect(files: List[String]): - - def bisect(releases: Vector[Release]): Release = - assert(releases.length > 1, "Need at least 2 releases to bisect") - if releases.length == 2 then - if isGoodRelease(releases.head) then releases.last - else releases.head - else - val mid = releases(releases.length / 2) - if isGoodRelease(mid) then bisect(releases.drop(releases.length / 2)) - else bisect(releases.take(releases.length / 2 + 1)) - - private def isGoodRelease(release: Release): Boolean = - println(s"Testing ${release.version}") - val res = s"""scala-cli compile ${files.mkString(" ")} -S "${release.version}"""".! - val isGood = res == 0 - println(s"Test result: ${release.version} is a ${if isGood then "good" else "bad"} release\n") - isGood - -object Releases: - lazy val allReleases: Vector[Release] = - val re = raw"(?<=title=$")(.+-bin-\d{8}-\w{7}-NIGHTLY)(?=/$")".r - val html = Source.fromURL("https://repo1.maven.org/maven2/org/scala-lang/scala3-compiler_3/") - re.findAllIn(html.mkString).map(Release.apply).toVector - - case class Release(version: String): - private val re = raw".+-bin-(\d{8})-(\w{7})-NIGHTLY".r - def date: String = - version match - case re(date, _) => date - case _ => sys.error(s"Could not extract date from version $version") - def hash: String = - version match - case re(_, hash) => hash - case _ => sys.error(s"Could not extract hash from version $version") - - def previous: Option[Release] = - val idx = allReleases.indexOf(this) - if idx == 0 then None - else Some(allReleases(idx - 1)) - - override def toString: String = version - -class CommitBisect(files: List[String]): - def bisect(lastGoodHash: String, fistBadHash: String): Unit = - println(s"Starting bisecting commits $lastGoodHash..$fistBadHash\n") - "git bisect start".! - s"git bisect bad $fistBadHash".! - s"git bisect good $lastGoodHash".! - s"git bisect run sh project/scripts/dottyCompileBisect.sh ${files.mkString(" ")}".! diff --git a/project/scripts/dottyCompileBisect.sh b/project/scripts/dottyCompileBisect.sh deleted file mode 100644 index 1cead7a8aefd..000000000000 --- a/project/scripts/dottyCompileBisect.sh +++ /dev/null @@ -1,16 +0,0 @@ -# Usage -# > git bisect start -# > git bisect bad -# > git bisect good -# > git bisect run project/scripts/dottyCompileBisect.sh -# -# Note: Use dottyCompileBisect.scala for faster bisection over commits that spans several days - -files=$@ -shift - -rm -r out -mkdir out -mkdir out/bisect - -sbt "clean; scalac -d out/bisect $files" diff --git a/project/scripts/examples/bisect-cli-example.sh b/project/scripts/examples/bisect-cli-example.sh new file mode 100755 index 000000000000..6eb010cbe8bc --- /dev/null +++ b/project/scripts/examples/bisect-cli-example.sh @@ -0,0 +1,6 @@ +#!/usr/bin/env bash + +# Don't use this example script modified in place as it might disappear from the repo during a checkout. +# Instead copy it to a different location first. 
+ +scala-cli compile -S "$1" --server=false file1.scala file2.scala diff --git a/project/scripts/examples/bisect-expect-example.exp b/project/scripts/examples/bisect-expect-example.exp new file mode 100755 index 000000000000..4c094c373d30 --- /dev/null +++ b/project/scripts/examples/bisect-expect-example.exp @@ -0,0 +1,17 @@ +#!/usr/local/bin/expect -f + +# Don't use this example script modified in place as it might disappear from the repo during a checkout. +# Instead copy it to a different location first. + +set scalaVersion [lindex $argv 0] ;# Get the script argument + +set timeout 30 ;# Give scala-cli some time to download the compiler +spawn scala-cli repl -S "$scalaVersion" --server=false ;# Start the REPL +expect "scala>" ;# REPL has started +set timeout 5 +send -- "Seq.empty.len\t" ;# Tab pressed to trigger code completion +expect { + "length" { exit 0 } ;# Exit with success if the expected string appeared somewhere in stdout +} + +exit 1 ;# Otherwise fail - the timeout was exceeded or the REPL crashed diff --git a/sbt-bridge/src/dotty/tools/xsbt/DelegatingReporter.java b/sbt-bridge/src/dotty/tools/xsbt/DelegatingReporter.java index 20cdfb720538..3e1e291ab7d1 100644 --- a/sbt-bridge/src/dotty/tools/xsbt/DelegatingReporter.java +++ b/sbt-bridge/src/dotty/tools/xsbt/DelegatingReporter.java @@ -45,7 +45,7 @@ public void doReport(Diagnostic dia, Context ctx) { rendered.append(explanation(message, ctx)); } - delegate.log(new Problem(position, message.msg(), severity, rendered.toString(), diagnosticCode)); + delegate.log(new Problem(position, message.message(), severity, rendered.toString(), diagnosticCode)); } private static Severity severityOf(int level) { diff --git a/sbt-bridge/src/dotty/tools/xsbt/Problem.java b/sbt-bridge/src/dotty/tools/xsbt/Problem.java index b88b2637d314..29d64cc26c4a 100644 --- a/sbt-bridge/src/dotty/tools/xsbt/Problem.java +++ b/sbt-bridge/src/dotty/tools/xsbt/Problem.java @@ -41,12 +41,19 @@ public Optional rendered() { } public Optional diagnosticCode() { - // NOTE: It's important for compatibility that we only construct a - // DiagnosticCode here to maintain compatibility with older versions of - // zinc while using this newer version of the compiler. If we would - // contstruct it earlier, you'd end up with ClassNotFoundExceptions for - // DiagnosticCode. - return Optional.of(new DiagnosticCode(_diagnosticCode, Optional.empty())); + // We don't forward the code if it's -1 since some tools will assume that this is actually + // the diagnostic code and show it or attempt to use it. This will ensure tools consuming + // this don't all have to be adding checks for -1. + if (_diagnosticCode == "-1") { + return Optional.empty(); + } else { + // NOTE: It's important for compatibility that we only construct a + // DiagnosticCode here to maintain compatibility with older versions of + // zinc while using this newer version of the compiler. If we would + // contstruct it earlier, you'd end up with ClassNotFoundExceptions for + // DiagnosticCode. 
+ return Optional.of(new DiagnosticCode(_diagnosticCode, Optional.empty())); + } } @Override diff --git a/sbt-bridge/test/xsbt/ExtractAPISpecification.scala b/sbt-bridge/test/xsbt/ExtractAPISpecification.scala index dfbcbf2181a2..e85cf8989b0f 100644 --- a/sbt-bridge/test/xsbt/ExtractAPISpecification.scala +++ b/sbt-bridge/test/xsbt/ExtractAPISpecification.scala @@ -147,9 +147,8 @@ class ExtractAPISpecification { |""".stripMargin val compilerForTesting = new ScalaCompilerForUnitTesting val apis = - compilerForTesting.extractApisFromSrcs(reuseCompilerInstance = false)(List(src1, src2), - List(src2)) - val _ :: src2Api1 :: src2Api2 :: Nil = apis.toList + compilerForTesting.extractApisFromSrcs(List(src1, src2), List(src2)) + val _ :: src2Api1 :: src2Api2 :: Nil = apis.toList: @unchecked val namerApi1 = selectNamer(src2Api1) val namerApi2 = selectNamer(src2Api2) assertTrue(SameAPI(namerApi1, namerApi2)) @@ -202,7 +201,7 @@ class ExtractAPISpecification { val srcC8 = "class C8 { self => }" val compilerForTesting = new ScalaCompilerForUnitTesting val apis = compilerForTesting - .extractApisFromSrcs(reuseCompilerInstance = true)( + .extractApisFromSrcs( List(srcX, srcY, srcC1, srcC2, srcC3, srcC4, srcC5, srcC6, srcC8) ) .map(_.head) diff --git a/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala b/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala index ee50b3717213..819bedec3cbc 100644 --- a/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala +++ b/sbt-bridge/test/xsbt/ExtractUsedNamesSpecification.scala @@ -79,10 +79,10 @@ class ExtractUsedNamesSpecification { val usedNames = compilerForTesting.extractUsedNamesFromSrc(srcA, srcB, srcC, srcD) val scalaVersion = scala.util.Properties.versionNumberString val namesA = standardNames ++ Set("Nothing", "Any") - val namesAX = standardNames ++ objectStandardNames ++ Set("x", "T", "A", "Nothing", "Any", "scala") + val namesAX = standardNames ++ Set("x", "T", "A", "Nothing", "Any") val namesB = Set("A", "Int", "A;init;", "Unit") - val namesC = objectStandardNames ++ Set("B;init;", "B", "Unit") - val namesD = standardNames ++ objectStandardNames ++ Set("C", "X", "foo", "Int", "T") + val namesC = Set("B;init;", "B", "Unit") + val namesD = standardNames ++ Set("C", "X", "foo", "Int", "T") assertEquals(namesA, usedNames("A")) assertEquals(namesAX, usedNames("A.X")) assertEquals(namesB, usedNames("B")) @@ -131,13 +131,13 @@ class ExtractUsedNamesSpecification { val compilerForTesting = new ScalaCompilerForUnitTesting val usedNames = compilerForTesting.extractUsedNamesFromSrc(src1, src2) val expectedNames_lista = - standardNames ++ objectStandardNames ++ Set("B", "lista", "List", "A") + standardNames ++ Set("B", "lista", "List", "A") val expectedNames_at = - standardNames ++ objectStandardNames ++ Set("B", "at", "A", "T", "X0", "X1") + standardNames ++ Set("B", "at", "A", "T", "X0", "X1") val expectedNames_as = - standardNames ++ objectStandardNames ++ Set("B", "as", "S", "Y") + standardNames ++ Set("B", "as", "S", "Y") val expectedNames_foo = - standardNames ++ objectStandardNames ++ + standardNames ++ Set("B", "foo", "M", @@ -146,7 +146,7 @@ class ExtractUsedNamesSpecification { "???", "Nothing") val expectedNames_bar = - standardNames ++ objectStandardNames ++ + standardNames ++ Set("B", "bar", "P1", @@ -174,7 +174,7 @@ class ExtractUsedNamesSpecification { |""".stripMargin val compilerForTesting = new ScalaCompilerForUnitTesting val usedNames = compilerForTesting.extractUsedNamesFromSrc(srcFoo, srcBar) - val expectedNames = standardNames ++ 
objectStandardNames ++ Set("Outer", "TypeInner", "Inner", "Int") + val expectedNames = standardNames ++ Set("Outer", "TypeInner", "Inner", "Int") assertEquals(expectedNames, usedNames("Bar")) } @@ -227,7 +227,7 @@ class ExtractUsedNamesSpecification { def findPatMatUsages(in: String): Set[String] = { val compilerForTesting = new ScalaCompilerForUnitTesting val (_, callback) = - compilerForTesting.compileSrcs(List(List(sealedClass, in)), reuseCompilerInstance = false) + compilerForTesting.compileSrcs(List(List(sealedClass, in))) val clientNames = callback.usedNamesAndScopes.view.filterKeys(!_.startsWith("base.")) val names: Set[String] = clientNames.flatMap { @@ -309,9 +309,4 @@ class ExtractUsedNamesSpecification { // the return type of the default constructor is Unit "Unit" ) - - private val objectStandardNames = Set( - // all Dotty objects extend scala.Serializable - "scala", "Serializable" - ) } diff --git a/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala b/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala index e81d58a07744..e58f9fefd92d 100644 --- a/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala +++ b/sbt-bridge/test/xsbt/ScalaCompilerForUnitTesting.scala @@ -1,7 +1,7 @@ /** Adapted from https://github.com/sbt/sbt/blob/0.13/compile/interface/src/test/scala/xsbt/ScalaCompilerForUnitTesting.scala */ package xsbt -import xsbti.compile.SingleOutput +import xsbti.compile.{CompileProgress, SingleOutput} import java.io.File import xsbti._ import sbt.io.IO @@ -9,6 +9,8 @@ import xsbti.api.{ ClassLike, Def, DependencyContext } import DependencyContext._ import xsbt.api.SameAPI import sbt.internal.util.ConsoleLogger +import dotty.tools.io.PlainFile.toPlainFile +import dotty.tools.xsbt.CompilerBridge import TestCallback.ExtractedClassDependencies @@ -32,8 +34,8 @@ class ScalaCompilerForUnitTesting { * Compiles given source code using Scala compiler and returns API representation * extracted by ExtractAPI class. */ - def extractApisFromSrcs(reuseCompilerInstance: Boolean)(srcs: List[String]*): Seq[Seq[ClassLike]] = { - val (tempSrcFiles, analysisCallback) = compileSrcs(srcs.toList, reuseCompilerInstance) + def extractApisFromSrcs(srcs: List[String]*): Seq[Seq[ClassLike]] = { + val (tempSrcFiles, analysisCallback) = compileSrcs(srcs.toList) tempSrcFiles.map(analysisCallback.apis) } @@ -91,7 +93,7 @@ class ScalaCompilerForUnitTesting { * file system-independent way of testing dependencies between source code "files". */ def extractDependenciesFromSrcs(srcs: List[List[String]]): ExtractedClassDependencies = { - val (_, testCallback) = compileSrcs(srcs, reuseCompilerInstance = true) + val (_, testCallback) = compileSrcs(srcs) val memberRefDeps = testCallback.classDependencies collect { case (target, src, DependencyByMemberRef) => (src, target) @@ -117,50 +119,47 @@ class ScalaCompilerForUnitTesting { * useful to compile macros, which cannot be used in the same compilation run that * defines them. * - * The `reuseCompilerInstance` parameter controls whether the same Scala compiler instance - * is reused between compiling source groups. Separate compiler instances can be used to - * test stability of API representation (with respect to pickling) or to test handling of - * binary dependencies. - * * The sequence of temporary files corresponding to passed snippets and analysis * callback is returned as a result. 
*/ - def compileSrcs(groupedSrcs: List[List[String]], - reuseCompilerInstance: Boolean): (Seq[File], TestCallback) = { - // withTemporaryDirectory { temp => - { + def compileSrcs(groupedSrcs: List[List[String]]): (Seq[File], TestCallback) = { val temp = IO.createTemporaryDirectory val analysisCallback = new TestCallback val classesDir = new File(temp, "classes") classesDir.mkdir() - lazy val commonCompilerInstanceAndCtx = prepareCompiler(classesDir, analysisCallback, classesDir.toString) + val bridge = new CompilerBridge val files = for ((compilationUnit, unitId) <- groupedSrcs.zipWithIndex) yield { - // use a separate instance of the compiler for each group of sources to - // have an ability to test for bugs in instability between source and pickled - // representation of types - val (compiler, ctx) = if (reuseCompilerInstance) commonCompilerInstanceAndCtx else - prepareCompiler(classesDir, analysisCallback, classesDir.toString) - val run = compiler.newRun(ctx) - val srcFiles = compilationUnit.toSeq.zipWithIndex map { - case (src, i) => + val srcFiles = compilationUnit.toSeq.zipWithIndex.map { + (src, i) => val fileName = s"Test-$unitId-$i.scala" prepareSrcFile(temp, fileName, src) } - val srcFilePaths = srcFiles.map(srcFile => srcFile.getAbsolutePath).toList - run.compile(srcFilePaths) + val virtualSrcFiles = srcFiles.map(file => TestVirtualFile(file.toPath)).toArray + val classesDirPath = classesDir.getAbsolutePath.toString + val output = new SingleOutput: + def getOutputDirectory() = classesDir + + bridge.run( + virtualSrcFiles.toArray, + new TestDependencyChanges, + Array("-Yforce-sbt-phases", "-classpath", classesDirPath, "-usejavacp", "-d", classesDirPath), + output, + analysisCallback, + new TestReporter, + new CompileProgress {}, + new TestLogger + ) - // srcFilePaths.foreach(f => new File(f).delete) srcFiles } (files.flatten.toSeq, analysisCallback) - } } def compileSrcs(srcs: String*): (Seq[File], TestCallback) = { - compileSrcs(List(srcs.toList), reuseCompilerInstance = true) + compileSrcs(List(srcs.toList)) } private def prepareSrcFile(baseDir: File, fileName: String, src: String): File = { @@ -168,28 +167,5 @@ class ScalaCompilerForUnitTesting { IO.write(srcFile, src) srcFile } - - private def prepareCompiler(outputDir: File, analysisCallback: AnalysisCallback, classpath: String = ".") = { - val args = Array.empty[String] - - import dotty.tools.dotc.{Compiler, Driver} - import dotty.tools.dotc.core.Contexts._ - - val driver = new TestDriver - val ctx = (new ContextBase).initialCtx.fresh.setSbtCallback(analysisCallback) - driver.getCompiler(Array("-classpath", classpath, "-usejavacp", "-d", outputDir.getAbsolutePath), ctx) - } - - private object ConsoleReporter extends Reporter { - def reset(): Unit = () - def hasErrors: Boolean = false - def hasWarnings: Boolean = false - def printWarnings(): Unit = () - def problems(): Array[xsbti.Problem] = Array.empty - def log(problem: xsbti.Problem): Unit = println(problem.message) - def comment(pos: Position, msg: String): Unit = () - def printSummary(): Unit = () - } - } diff --git a/sbt-bridge/test/xsbt/TestDependencyChanges.scala b/sbt-bridge/test/xsbt/TestDependencyChanges.scala new file mode 100644 index 000000000000..f31a314ba036 --- /dev/null +++ b/sbt-bridge/test/xsbt/TestDependencyChanges.scala @@ -0,0 +1,9 @@ +package xsbt + +import xsbti.compile.* + +class TestDependencyChanges extends DependencyChanges: + def isEmpty(): Boolean = ??? + def modifiedBinaries(): Array[java.io.File] = ??? 
+ def modifiedClasses(): Array[String] = ??? + def modifiedLibraries(): Array[xsbti.VirtualFileRef] = ??? diff --git a/sbt-bridge/test/xsbt/TestDriver.scala b/sbt-bridge/test/xsbt/TestDriver.scala deleted file mode 100644 index 790c14f4b912..000000000000 --- a/sbt-bridge/test/xsbt/TestDriver.scala +++ /dev/null @@ -1,13 +0,0 @@ -package xsbt - -import dotty.tools.dotc._ -import core.Contexts._ - -class TestDriver extends Driver { - override protected def sourcesRequired = false - - def getCompiler(args: Array[String], rootCtx: Context) = { - val (fileNames, ctx) = setup(args, rootCtx) - (newCompiler(ctx), ctx) - } -} diff --git a/sbt-bridge/test/xsbt/TestLogger.scala b/sbt-bridge/test/xsbt/TestLogger.scala new file mode 100644 index 000000000000..598887e3f8e6 --- /dev/null +++ b/sbt-bridge/test/xsbt/TestLogger.scala @@ -0,0 +1,12 @@ +package xsbt + +import java.util.function.Supplier + +import xsbti.* + +class TestLogger extends Logger: + override def debug(msg: Supplier[String]): Unit = () + override def error(msg: Supplier[String]): Unit = () + override def info(msg: Supplier[String]): Unit = () + override def warn(msg: Supplier[String]): Unit = () + override def trace(exception: Supplier[Throwable]): Unit = () diff --git a/sbt-bridge/test/xsbt/TestReporter.scala b/sbt-bridge/test/xsbt/TestReporter.scala new file mode 100644 index 000000000000..cab9823813a6 --- /dev/null +++ b/sbt-bridge/test/xsbt/TestReporter.scala @@ -0,0 +1,13 @@ +package xsbt + +import xsbti.* + +class TestReporter extends Reporter: + private val allProblems = collection.mutable.ListBuffer.empty[Problem] + def comment(position: Position, msg: String): Unit = () + def hasErrors(): Boolean = allProblems.exists(_.severity == Severity.Error) + def hasWarnings(): Boolean = allProblems.exists(_.severity == Severity.Warn) + def log(problem: Problem): Unit = allProblems.append(problem) + def printSummary(): Unit = () + def problems(): Array[Problem] = allProblems.toArray + def reset(): Unit = allProblems.clear() diff --git a/sbt-bridge/test/xsbt/TestVirtualFile.scala b/sbt-bridge/test/xsbt/TestVirtualFile.scala new file mode 100644 index 000000000000..db00038272a8 --- /dev/null +++ b/sbt-bridge/test/xsbt/TestVirtualFile.scala @@ -0,0 +1,14 @@ +package xsbt + +import xsbti.PathBasedFile +import java.nio.file.{Files, Path} +import scala.io.Source +import scala.io.Codec + +class TestVirtualFile(path: Path) extends PathBasedFile: + override def contentHash(): Long = ??? + override def input(): java.io.InputStream = Files.newInputStream(path) + override def id(): String = name() + override def name(): String = path.toFile.getName + override def names(): Array[String] = ??? + override def toPath(): Path = path diff --git a/sbt-bridge/test/xsbti/TestCallback.scala b/sbt-bridge/test/xsbti/TestCallback.scala index 3348fd2d90f3..a0919dc69bc4 100644 --- a/sbt-bridge/test/xsbti/TestCallback.scala +++ b/sbt-bridge/test/xsbti/TestCallback.scala @@ -2,7 +2,9 @@ package xsbti import java.io.File +import java.nio.file.Path import scala.collection.mutable.ArrayBuffer +import xsbti.VirtualFileRef import xsbti.api.ClassLike import xsbti.api.DependencyContext import DependencyContext._ @@ -24,12 +26,14 @@ class TestCallback extends AnalysisCallback assert(!apis.contains(source), s"startSource can be called only once per source file: $source") apis(source) = Seq.empty } + override def startSource(source: VirtualFile): Unit = ??? 
override def binaryDependency(binary: File, name: String, fromClassName: String, source: File, context: DependencyContext): Unit = { binaryDependencies += ((binary, name, fromClassName, source, context)) } + override def binaryDependency(binary: Path, name: String, fromClassName: String, source: VirtualFileRef, context: DependencyContext): Unit = ??? - def generatedNonLocalClass(source: File, + override def generatedNonLocalClass(source: File, module: File, binaryClassName: String, srcClassName: String): Unit = { @@ -37,12 +41,13 @@ class TestCallback extends AnalysisCallback classNames(source) += ((srcClassName, binaryClassName)) () } + override def generatedNonLocalClass(source: VirtualFileRef, module: Path, binaryClassName: String, srcClassName: String): Unit = ??? - def generatedLocalClass(source: File, module: File): Unit = { + override def generatedLocalClass(source: File, module: File): Unit = { products += ((source, module)) () } - + override def generatedLocalClass(source: VirtualFileRef, module: Path): Unit = ??? override def classDependency(onClassName: String, sourceClassName: String, context: DependencyContext): Unit = { if (onClassName != sourceClassName) classDependencies += ((onClassName, sourceClassName, context)) @@ -51,15 +56,23 @@ class TestCallback extends AnalysisCallback override def usedName(className: String, name: String, scopes: EnumSet[UseScope]): Unit = { usedNamesAndScopes(className) += TestUsedName(name, scopes) } + override def api(source: File, classApi: ClassLike): Unit = { apis(source) = classApi +: apis(source) } + override def api(source: VirtualFileRef, classApi: ClassLike): Unit = ??? + override def problem(category: String, pos: xsbti.Position, message: String, severity: xsbti.Severity, reported: Boolean): Unit = () override def dependencyPhaseCompleted(): Unit = () override def apiPhaseCompleted(): Unit = () override def enabled(): Boolean = true - def mainClass(source: File, className: String): Unit = () + override def mainClass(source: File, className: String): Unit = () + override def mainClass(source: VirtualFileRef, className: String): Unit = ??? + + override def classesInOutputJar(): java.util.Set[String] = ??? + override def getPickleJarPair(): java.util.Optional[xsbti.T2[Path, Path]] = ??? + override def isPickleJava(): Boolean = ??? 
} object TestCallback { @@ -78,14 +91,8 @@ object TestCallback { } private def pairsToMultiMap[A, B](pairs: collection.Seq[(A, B)]): Map[A, Set[B]] = { - import scala.collection.mutable.{ HashMap, MultiMap } - val emptyMultiMap = new HashMap[A, scala.collection.mutable.Set[B]] with MultiMap[A, B] - val multiMap = pairs.foldLeft(emptyMultiMap) { - case (acc, (key, value)) => - acc.addBinding(key, value) - } - // convert all collections to immutable variants - multiMap.toMap.view.mapValues(_.toSet).toMap.withDefaultValue(Set.empty) + pairs.groupBy(_._1).view.mapValues(values => values.map(_._2).toSet) + .toMap.withDefaultValue(Set.empty) } } } diff --git a/sbt-test/sbt-dotty/dotty-knowledge.i17/project/build.properties b/sbt-test/sbt-dotty/dotty-knowledge.i17/project/build.properties index 22af2628c413..8b9a0b0ab037 100644 --- a/sbt-test/sbt-dotty/dotty-knowledge.i17/project/build.properties +++ b/sbt-test/sbt-dotty/dotty-knowledge.i17/project/build.properties @@ -1 +1 @@ -sbt.version=1.7.1 +sbt.version=1.8.0 diff --git a/sbt-test/scala2-compat/i16351/app/App.scala b/sbt-test/scala2-compat/i16351/app/App.scala new file mode 100644 index 000000000000..5c152f515ada --- /dev/null +++ b/sbt-test/scala2-compat/i16351/app/App.scala @@ -0,0 +1,8 @@ +package app + +import lib.* + +object App { + def main(args: Array[String]): Unit = + new Lib(Value("Foo"), b = 2) {} +} diff --git a/sbt-test/scala2-compat/i16351/build.sbt b/sbt-test/scala2-compat/i16351/build.sbt new file mode 100644 index 000000000000..433a5e8baddf --- /dev/null +++ b/sbt-test/scala2-compat/i16351/build.sbt @@ -0,0 +1,13 @@ +val scala3Version = sys.props("plugin.scalaVersion") +val scala2Version = sys.props("plugin.scala2Version") + +lazy val lib = project.in(file("lib")) + .settings( + scalaVersion := scala2Version + ) + +lazy val app = project.in(file("app")) + .dependsOn(lib) + .settings( + scalaVersion := scala3Version + ) diff --git a/sbt-test/scala2-compat/i16351/lib/lib.scala b/sbt-test/scala2-compat/i16351/lib/lib.scala new file mode 100644 index 000000000000..cfc3c6c780d9 --- /dev/null +++ b/sbt-test/scala2-compat/i16351/lib/lib.scala @@ -0,0 +1,10 @@ +// Should be compiled with 2.13 +package lib + +class Value(val value: String) + +class Lib( + value: => Value, + a: Int = 0, + b: Int +) diff --git a/sbt-test/scala2-compat/i16351/test b/sbt-test/scala2-compat/i16351/test new file mode 100644 index 000000000000..63092ffa4a03 --- /dev/null +++ b/sbt-test/scala2-compat/i16351/test @@ -0,0 +1 @@ +> app/run diff --git a/sbt-test/source-dependencies/inline-rec-change-inline/B.scala b/sbt-test/source-dependencies/inline-rec-change-inline/B.scala index 61e61a620957..eaeef8d57ece 100644 --- a/sbt-test/source-dependencies/inline-rec-change-inline/B.scala +++ b/sbt-test/source-dependencies/inline-rec-change-inline/B.scala @@ -1,5 +1,5 @@ object B { - inline def inlinedAny(x: String): x.type = x + inline def inlinedAny(x: String): String = x } diff --git a/sbt-test/source-dependencies/inline-rec-change-inline/changes/B1.scala b/sbt-test/source-dependencies/inline-rec-change-inline/changes/B1.scala index 4a1c47d38572..63104570fed4 100644 --- a/sbt-test/source-dependencies/inline-rec-change-inline/changes/B1.scala +++ b/sbt-test/source-dependencies/inline-rec-change-inline/changes/B1.scala @@ -1,5 +1,5 @@ object B { - inline def inlinedAny(inline x: String): x.type = x + inline def inlinedAny(inline x: String): String = x } diff --git a/scaladoc-testcases/src/tests/extensionParams.scala 
b/scaladoc-testcases/src/tests/extensionParams.scala index 231a8a1fefbf..7892676af2c4 100644 --- a/scaladoc-testcases/src/tests/extensionParams.scala +++ b/scaladoc-testcases/src/tests/extensionParams.scala @@ -1,22 +1,54 @@ package tests.extensionParams +trait Animal + extension [A](thiz: A) - def toTuple2[B](that: B): (A, B) = thiz -> that + def toTuple2[B](that: B): (A, B) + = thiz -> that extension [A](a: A)(using Int) - def f[B](b: B): (A, B) = ??? + def f1[B](b: B): (A, B) + = ??? extension [A](a: A)(using Int) - def ff(b: A): (A, A) = ??? + def f2(b: A): (A, A) + = ??? extension [A](a: A)(using Int) - def fff(using String)(b: A): (A, A) = ??? + def f3(using String)(b: A): (A, A) + = ??? extension (a: Char)(using Int) - def ffff(using String)(b: Int): Unit = ??? + def f4(using String)(b: Int): Unit + = ??? extension (a: Char)(using Int) - def fffff[B](using String)(b: B): Unit = ??? + def f5[B](using String)(b: B): Unit + = ??? + +extension [A <: List[Char]](a: Int)(using Int) + def f6[B](b: B): (A, B) + = ??? + +extension [A <: List[Char]](using String)(using Unit)(a: A)(using Int)(using Number) + def f7[B, C](b: B)(c: C): (A, B) + = ??? + +extension [A <: List[Char]](using String)(using Unit)(a: A)(using Int)(using Number) + def f8(b: Any)(c: Any): Any + = ??? + +extension [A <: List[Char]](using String)(using Unit)(a: A)(using Int)(using Number) + def f9[B, C](using Int)(b: B)(c: C): (A, B) + = ??? + +extension [A <: List[Char]](using String)(using Unit)(a: A)(using Int)(using Number) + def f10(using Int)(b: Any)(c: Any): Any + = ??? + + def f12(using Int)(b: A)(c: String): Number + = ??? -extension [A <: List[Char]](a: A)(using Int) - def ffffff[B](b: B): (A, B) = ??? +extension (using String)(using Unit)(a: Animal)(using Int)(using Number) + def f11(b: Any)(c: Any): Any + = ??? diff --git a/scaladoc-testcases/src/tests/nonScala3Parent.scala b/scaladoc-testcases/src/tests/nonScala3Parent.scala new file mode 100644 index 000000000000..91183d25b583 --- /dev/null +++ b/scaladoc-testcases/src/tests/nonScala3Parent.scala @@ -0,0 +1,13 @@ +package tests +package nonScala3Parent + +import javax.swing.JPanel +import javax.swing.JFrame + +// https://github.com/lampepfl/dotty/issues/15927 + +trait Foo1 extends Numeric[Any] +trait Foo2 extends JPanel +trait Foo3 extends JFrame +trait Foo4 extends Ordering[Any] +trait Foo5 extends Enumeration diff --git a/scaladoc/README.md b/scaladoc/README.md index 5f7560372976..774543996c7a 100644 --- a/scaladoc/README.md +++ b/scaladoc/README.md @@ -40,7 +40,7 @@ the documentation won't work completely if you don't. ## CLI and SBT Documentation The preferred way to use scaladoc is calling it from sbt `Compile/doc` task or to use CLI interface provided inside `dotty/bin/scaladoc` bash script. 
-More information about specific scaladoc flags you can find inside [Usage docs](https://dotty.epfl.ch/docs/usage/scaladoc/settings.html) +More information about specific scaladoc flags you can find inside [Usage docs](https://docs.scala-lang.org/scala3/guides/scaladoc/settings.html) ## Developing diff --git a/scaladoc/resources/dotty_res/styles/scalastyle.css b/scaladoc/resources/dotty_res/styles/scalastyle.css index a14af7f7ae2d..3efcfddd370a 100644 --- a/scaladoc/resources/dotty_res/styles/scalastyle.css +++ b/scaladoc/resources/dotty_res/styles/scalastyle.css @@ -962,8 +962,7 @@ footer .socials { color: var(--type); } -.signature *[t="t"] { - /* Types with links */ +.signature *[t="t"] { /* Types with links */ color: var(--type-link); } diff --git a/scaladoc/src/dotty/tools/scaladoc/renderers/MemberRenderer.scala b/scaladoc/src/dotty/tools/scaladoc/renderers/MemberRenderer.scala index e50d87e99837..5d5f3e9b20d5 100644 --- a/scaladoc/src/dotty/tools/scaladoc/renderers/MemberRenderer.scala +++ b/scaladoc/src/dotty/tools/scaladoc/renderers/MemberRenderer.scala @@ -415,8 +415,8 @@ class MemberRenderer(signatureRenderer: SignatureRenderer)(using DocContext) ext .functionParameters(on.argsLists) .content val sig = typeSig ++ Signature(Plain(s"(${on.name}: ")) ++ on.signature ++ Signature(Plain(")")) ++ argsSig - MGroup(span(cls := "groupHeader")(sig.map(renderElement(_))), members.sortBy(_.name).toSeq, on.name) - }.toSeq + MGroup(span(cls := "groupHeader")(sig.map(renderElement(_))), members.sortBy(_.name).toSeq, on.name) -> on.position + }.toSeq.sortBy(_._2).map(_._1) div(cls := "membersList expand")( renderTabs( diff --git a/scaladoc/src/dotty/tools/scaladoc/renderers/Resources.scala b/scaladoc/src/dotty/tools/scaladoc/renderers/Resources.scala index d6cd701225ba..bae43980a11d 100644 --- a/scaladoc/src/dotty/tools/scaladoc/renderers/Resources.scala +++ b/scaladoc/src/dotty/tools/scaladoc/renderers/Resources.scala @@ -177,7 +177,15 @@ trait Resources(using ctx: DocContext) extends Locations, Writer: def extensionTarget(member: Member): String = member.kind match - case Kind.Extension(on, _) => flattenToText(on.signature) + case Kind.Extension(on, _) => + val typeSig = SignatureBuilder() + .keyword("extension ") + .generics(on.typeParams) + .content + val argsSig = SignatureBuilder() + .functionParameters(on.argsLists) + .content + flattenToText(typeSig ++ argsSig) case _ => "" def docPartRenderPlain(d: DocPart): String = diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala index 7ecc4827836a..38cc90330265 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/ClassLikeSupport.scala @@ -266,7 +266,8 @@ trait ClassLikeSupport: def getParentsAsTreeSymbolTuples: List[(Tree, Symbol)] = if noPosClassDefs.contains(c.symbol) then Nil else for - parentTree <- c.parents if parentTree.pos.start != parentTree.pos.end // We assume here that order is correct + // TODO: add exists function to position methods in Quotes and replace the condition here for checking the JPath + parentTree <- c.parents if parentTree.pos.sourceFile.getJPath.isDefined && parentTree.pos.start != parentTree.pos.end // We assume here that order is correct parentSymbol = parentTree match case t: TypeTree => t.tpe.typeSymbol case tree if tree.symbol.isClassConstructor => tree.symbol.owner @@ -539,14 +540,13 @@ trait ClassLikeSupport: // Documenting method slightly different then 
its definition is withing the 'undefiend behaviour'. symbol.paramSymss.flatten.find(_.name == name).exists(_.flags.is(Flags.Implicit)) - def handlePolyType(polyType: PolyType): MemberInfo = - MemberInfo(polyType.paramNames.zip(polyType.paramBounds).toMap, List.empty, polyType.resType) + def handlePolyType(memberInfo: MemberInfo, polyType: PolyType): MemberInfo = + MemberInfo(polyType.paramNames.zip(polyType.paramBounds).toMap, memberInfo.paramLists, polyType.resType) def handleMethodType(memberInfo: MemberInfo, methodType: MethodType): MemberInfo = val rawParams = methodType.paramNames.zip(methodType.paramTypes).toMap val (evidences, notEvidences) = rawParams.partition(e => isSyntheticEvidence(e._1)) - def findParamRefs(t: TypeRepr): Seq[ParamRef] = t match case paramRef: ParamRef => Seq(paramRef) case AppliedType(_, args) => args.flatMap(findParamRefs) @@ -583,7 +583,7 @@ trait ClassLikeSupport: MemberInfo(memberInfo.genericTypes, memberInfo.paramLists, byNameType.underlying) def recursivelyCalculateMemberInfo(memberInfo: MemberInfo): MemberInfo = memberInfo.res match - case p: PolyType => recursivelyCalculateMemberInfo(handlePolyType(p)) + case p: PolyType => recursivelyCalculateMemberInfo(handlePolyType(memberInfo, p)) case m: MethodType => recursivelyCalculateMemberInfo(handleMethodType(memberInfo, m)) case b: ByNameType => handleByNameType(memberInfo, b) case _ => memberInfo diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/SymOps.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/SymOps.scala index ca3dac7e12f8..b4a1fc197d9a 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/SymOps.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/SymOps.scala @@ -147,28 +147,43 @@ object SymOps: def extendedSymbol: Option[reflect.ValDef] = import reflect.* - Option.when(sym.isExtensionMethod){ - val termParamss = sym.tree.asInstanceOf[DefDef].termParamss - if sym.isLeftAssoc || termParamss.size == 1 then termParamss(0).params(0) - else termParamss(1).params(0) + if sym.isExtensionMethod then + sym.extendedTermParamLists.find(param => !param.isImplicit && !param.isGiven).flatMap(_.params.headOption) + else None + + def splitExtensionParamList: (List[reflect.ParamClause], List[reflect.ParamClause]) = + import reflect.* + + def getPositionStartOption(pos: Option[Position]): Option[Int] = pos.flatMap { + case dotty.tools.dotc.util.NoSourcePosition => None + case pos: Position => Some(pos.start) } + def comparePositionStarts(posA: Option[Position], posB: Option[Position]): Option[Boolean] = + for { + startA <- getPositionStartOption(posA) + startB <- getPositionStartOption(posB) + } yield startA < startB + + sym.tree match + case tree: DefDef => + tree.paramss.partition(_.params.headOption.flatMap(param => + comparePositionStarts(param.symbol.pos, tree.symbol.pos)).getOrElse(false) + ) + case _ => Nil -> Nil + def extendedTypeParams: List[reflect.TypeDef] = import reflect.* - val method = sym.tree.asInstanceOf[DefDef] - method.leadingTypeParams + sym.tree match + case tree: DefDef => + tree.leadingTypeParams + case _ => Nil def extendedTermParamLists: List[reflect.TermParamClause] = import reflect.* - if sym.nonExtensionLeadingTypeParams.nonEmpty then - sym.nonExtensionParamLists.takeWhile { - case _: TypeParamClause => false - case _ => true - }.collect { - case tpc: TermParamClause => tpc - } - else - List.empty + sym.splitExtensionParamList._1.collect { + case tpc: TermParamClause => tpc + } def nonExtensionTermParamLists: List[reflect.TermParamClause] = import reflect.* @@ -185,14 +200,7 @@ 
object SymOps: } def nonExtensionParamLists: List[reflect.ParamClause] = - import reflect.* - val method = sym.tree.asInstanceOf[DefDef] - if sym.isExtensionMethod then - val params = method.paramss - val toDrop = if method.leadingTypeParams.nonEmpty then 2 else 1 - if sym.isLeftAssoc || params.size == 1 then params.drop(toDrop) - else params.head :: params.tail.drop(toDrop) - else method.paramss + sym.splitExtensionParamList._2 def nonExtensionLeadingTypeParams: List[reflect.TypeDef] = import reflect.* diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/SyntheticSupport.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/SyntheticSupport.scala index dabc6468d4c9..b33d5f61faac 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/SyntheticSupport.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/SyntheticSupport.scala @@ -49,7 +49,7 @@ object SyntheticsSupport: c.symbol.typeRef.baseClasses.map(b => b -> c.symbol.typeRef.baseType(b)).tail def typeForClass(using Quotes)(c: reflect.ClassDef): reflect.TypeRepr = - c.symbol.typeRef.appliedTo(c.symbol.typeMembers.filter(_.isTypeParam).map(_.typeRef)) + c.symbol.typeRef.appliedTo(c.symbol.declaredTypes.filter(_.isTypeParam).map(_.typeRef)) /* We need there to filter out symbols with certain flagsets, because these symbols come from compiler and TASTY can't handle them well. They are valdefs that describe case companion objects and cases from enum. diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/TastyParser.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/TastyParser.scala index f8be9e766fa8..cd1bed42f485 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/TastyParser.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/TastyParser.scala @@ -228,6 +228,6 @@ case class TastyParser( try Traverser.traverseTree(root)(Symbol.spliceOwner) catch case e: Throwable => println(s"Problem parsing ${root.pos}, documentation may not be generated.") - e.printStackTrace() + // e.printStackTrace() docs.result() diff --git a/scaladoc/src/dotty/tools/scaladoc/tasty/reflect.scala b/scaladoc/src/dotty/tools/scaladoc/tasty/reflect.scala index b48519e29d28..419beac50134 100644 --- a/scaladoc/src/dotty/tools/scaladoc/tasty/reflect.scala +++ b/scaladoc/src/dotty/tools/scaladoc/tasty/reflect.scala @@ -4,4 +4,4 @@ package tasty import scala.quoted._ /** Shorthand for `quotes.reflect` */ -transparent inline def reflect(using inline q: Quotes): q.reflect.type = q.reflect +transparent inline def reflect(using q: Quotes): q.reflect.type = q.reflect diff --git a/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala b/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala index 7da1bb9b7e03..49316b08dbc0 100644 --- a/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala +++ b/scaladoc/test/dotty/tools/scaladoc/signatures/TranslatableSignaturesTestCases.scala @@ -41,6 +41,8 @@ class MergedPackageSignatures extends SignatureTest("mergedPackage", SignatureTe class ExtensionMethodSignature extends SignatureTest("extensionMethodSignatures", SignatureTest.all) +class ExtensionMethodParamsSignature extends SignatureTest("extensionParams", SignatureTest.all) + class ClassModifiers extends SignatureTest("classModifiers", SignatureTest.classlikeKinds) class EnumSignatures extends SignatureTest("enumSignatures", SignatureTest.all) @@ -104,3 +106,5 @@ class ImplicitMembers extends SignatureTest( Seq("def"), filterFunc = _.toString.endsWith("OuterClass$ImplicitMemberTarget.html") ) + +class 
NonScala3Parent extends SignatureTest("nonScala3Parent", SignatureTest.all) diff --git a/scaladoc/test/dotty/tools/scaladoc/testUtils.scala b/scaladoc/test/dotty/tools/scaladoc/testUtils.scala index 21ed7398f74e..2ba78c321eab 100644 --- a/scaladoc/test/dotty/tools/scaladoc/testUtils.scala +++ b/scaladoc/test/dotty/tools/scaladoc/testUtils.scala @@ -11,9 +11,9 @@ import java.nio.file.Paths case class ReportedDiagnostics(errors: List[Diagnostic], warnings: List[Diagnostic], infos: List[Diagnostic]): - def errorMsgs = errors.map(_.msg.rawMessage) - def warningMsgs = warnings.map(_.msg.rawMessage) - def infoMsgs = infos.map(_.msg.rawMessage) + def errorMsgs = errors.map(_.msg.message) + def warningMsgs = warnings.map(_.msg.message) + def infoMsgs = infos.map(_.msg.message) extension (c: CompilerContext) def reportedDiagnostics: ReportedDiagnostics = diff --git a/semanticdb/project/build.properties b/semanticdb/project/build.properties index 22af2628c413..8b9a0b0ab037 100644 --- a/semanticdb/project/build.properties +++ b/semanticdb/project/build.properties @@ -1 +1 @@ -sbt.version=1.7.1 +sbt.version=1.8.0 diff --git a/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSCompilationTests.scala b/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSCompilationTests.scala index ca4f292568bb..0f4eb633b770 100644 --- a/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSCompilationTests.scala +++ b/sjs-compiler-tests/test/scala/dotty/tools/dotc/ScalaJSCompilationTests.scala @@ -6,6 +6,7 @@ import org.junit.{ Test, BeforeClass, AfterClass } import org.junit.experimental.categories.Category import scala.concurrent.duration._ +import reporting.TestReporter import vulpix._ @Category(Array(classOf[ScalaJSCompilationTests])) @@ -23,6 +24,7 @@ class ScalaJSCompilationTests extends ParallelTesting { def isInteractive = SummaryReport.isInteractive def testFilter = Properties.testsFilter def updateCheckFiles: Boolean = Properties.testsUpdateCheckfile + def failedTests = TestReporter.lastRunFailedTests // Negative tests ------------------------------------------------------------ diff --git a/tasty/src/dotty/tools/tasty/TastyBuffer.scala b/tasty/src/dotty/tools/tasty/TastyBuffer.scala index 1d48027087f5..f9266cf23617 100644 --- a/tasty/src/dotty/tools/tasty/TastyBuffer.scala +++ b/tasty/src/dotty/tools/tasty/TastyBuffer.scala @@ -193,4 +193,9 @@ class TastyBuffer(initialSize: Int) { * After `assemble` no more output actions to this buffer are permitted. */ def assemble(): Unit = () + + def reset(): Unit = { + java.util.Arrays.fill(bytes, 0, length, 0.toByte) + length = 0 + } } diff --git a/tasty/src/dotty/tools/tasty/TastyFormat.scala b/tasty/src/dotty/tools/tasty/TastyFormat.scala index 98cba90bdccf..ac0357068c55 100644 --- a/tasty/src/dotty/tools/tasty/TastyFormat.scala +++ b/tasty/src/dotty/tools/tasty/TastyFormat.scala @@ -91,6 +91,7 @@ Standard-Section: "ASTs" TopLevelStat* THROW throwableExpr_Term -- throw throwableExpr NAMEDARG paramName_NameRef arg_Term -- paramName = arg APPLY Length fn_Term arg_Term* -- fn(args) + APPLYsigpoly Length fn_Term meth_Type arg_Term* -- The application of a signature-polymorphic method TYPEAPPLY Length fn_Term arg_Type* -- fn[args] SUPER Length this_Term mixinTypeIdent_Tree? -- super[mixin] TYPED Length expr_Term ascriptionType_Term -- expr: ascription @@ -288,7 +289,7 @@ object TastyFormat { * compatibility, but remains backwards compatible, with all * preceeding `MinorVersion`. 
*/ - final val MinorVersion: Int = 2 + final val MinorVersion: Int = 3 /** Natural Number. The `ExperimentalVersion` allows for * experimentation with changes to TASTy without committing @@ -578,6 +579,7 @@ object TastyFormat { // final val ??? = 178 // final val ??? = 179 final val METHODtype = 180 + final val APPLYsigpoly = 181 final val MATCHtype = 190 final val MATCHtpt = 191 @@ -744,6 +746,7 @@ object TastyFormat { case BOUNDED => "BOUNDED" case APPLY => "APPLY" case TYPEAPPLY => "TYPEAPPLY" + case APPLYsigpoly => "APPLYsigpoly" case NEW => "NEW" case THROW => "THROW" case TYPED => "TYPED" diff --git a/tasty/src/dotty/tools/tasty/TastyHash.scala b/tasty/src/dotty/tools/tasty/TastyHash.scala index aff663f42a8d..701328d578a3 100644 --- a/tasty/src/dotty/tools/tasty/TastyHash.scala +++ b/tasty/src/dotty/tools/tasty/TastyHash.scala @@ -6,10 +6,10 @@ object TastyHash { * * from https://en.wikipedia.org/wiki/PJW_hash_function#Algorithm */ - def pjwHash64(data: Array[Byte]): Long = { + def pjwHash64(data: Array[Byte], length: Int): Long = { var h = 0L var i = 0 - while (i < data.length) { + while (i < length) { val d = data(i) & 0xFFL // Interpret byte as unsigned byte h = (h << 8) + d val high = h & 0xFF00000000000000L @@ -19,4 +19,6 @@ object TastyHash { } h } + def pjwHash64(data: Array[Byte]): Long = + pjwHash64(data, data.length) } diff --git a/tasty/src/dotty/tools/tasty/util/Util.scala b/tasty/src/dotty/tools/tasty/util/Util.scala index 5726e65773b0..750f5956c5cc 100644 --- a/tasty/src/dotty/tools/tasty/util/Util.scala +++ b/tasty/src/dotty/tools/tasty/util/Util.scala @@ -11,4 +11,17 @@ object Util { arr1 } + /** Specialized version for bytes */ + def dble(arr: Array[Byte]): Array[Byte] = { + val arr1 = new Array[Byte](arr.length * 2) + System.arraycopy(arr, 0, arr1, 0, arr.length) + arr1 + } + + /** Specialized version for ints */ + def dble(arr: Array[Int]): Array[Int] = { + val arr1 = new Array[Int](arr.length * 2) + System.arraycopy(arr, 0, arr1, 0, arr.length) + arr1 + } } diff --git a/tests/cmdTest-sbt-tests/sourcepath-with-inline-api-hash/project/build.properties b/tests/cmdTest-sbt-tests/sourcepath-with-inline-api-hash/project/build.properties index 22af2628c413..8b9a0b0ab037 100644 --- a/tests/cmdTest-sbt-tests/sourcepath-with-inline-api-hash/project/build.properties +++ b/tests/cmdTest-sbt-tests/sourcepath-with-inline-api-hash/project/build.properties @@ -1 +1 @@ -sbt.version=1.7.1 +sbt.version=1.8.0 diff --git a/tests/cmdTest-sbt-tests/sourcepath-with-inline/project/build.properties b/tests/cmdTest-sbt-tests/sourcepath-with-inline/project/build.properties index 22af2628c413..8b9a0b0ab037 100644 --- a/tests/cmdTest-sbt-tests/sourcepath-with-inline/project/build.properties +++ b/tests/cmdTest-sbt-tests/sourcepath-with-inline/project/build.properties @@ -1 +1 @@ -sbt.version=1.7.1 +sbt.version=1.8.0 diff --git a/tests/explicit-nulls/run/i11332.scala b/tests/explicit-nulls/run/i11332.scala new file mode 100644 index 000000000000..73fb48839c16 --- /dev/null +++ b/tests/explicit-nulls/run/i11332.scala @@ -0,0 +1,22 @@ +// scalajs: --skip +import scala.language.unsafeNulls + +import java.lang.invoke._, MethodType.methodType + +// A copy of tests/run/i11332.scala +// to test the bootstrap minimisation which failed +// (because bootstrap runs under explicit nulls) +class Foo: + def neg(x: Int): Int = -x + +object Test: + def main(args: Array[String]): Unit = + val l = MethodHandles.lookup() + val self = new Foo() + + val res4 = { + l // explicit chain method call - previously 
derivedSelect broke the type + .findVirtual(classOf[Foo], "neg", methodType(classOf[Int], classOf[Int])) + .invokeExact(self, 4): Int + } + assert(-4 == res4) diff --git a/tests/generic-java-signatures/i15385/Lib.scala b/tests/generic-java-signatures/i15385/Lib.scala new file mode 100644 index 000000000000..81b00d964b3f --- /dev/null +++ b/tests/generic-java-signatures/i15385/Lib.scala @@ -0,0 +1,24 @@ +class Loc(val idx: Int) extends AnyVal + +class Foo: + def testNoParam[A <: Int]: A = 1.asInstanceOf[A] + def testSingleParam[A <: Int](a: A): A = 2.asInstanceOf[A] // (I)I + def testSingleParam2[A <: Int](a: A): Box[A] = new Box[A](a) // (I)LBox; + def testSingleParam3[A <: Int](box: Box[A]): A = box.value // (LBox;)I + def testOtherReturn[A <: Int](a: A): String = "3" + def testNoErasure[A <: String](a: A): A = "4".asInstanceOf[A] + def testMultiParam[A <: Int, B <: String](a: A, b: B): A = 5.asInstanceOf[A] + + def testVCNoParam[A <: Loc]: A = Loc(1).asInstanceOf[A] + def testVCSingleParam[A <: Loc](a: A): A = Loc(2).asInstanceOf[A] + def testVCOtherReturn[A <: Loc](a: A): String = "3" + def testVCNoErasure[A <: String](a: A): A = "4".asInstanceOf[A] + def testVCMultiParam[A <: Loc, B <: String](a: A, b: B): A = Loc(5).asInstanceOf[A] + +class Box[T](val value: T) + +class BarParent[X, Y] +trait BarInterface[F, G] +abstract class Bar[A <: Int](a: A) extends BarParent[A, String] with BarInterface[Int, A]: + def getMap: Map[String, A] + def bar[B](a: A, b: B): (A, B, Int) diff --git a/tests/generic-java-signatures/i15385/Test.java b/tests/generic-java-signatures/i15385/Test.java new file mode 100644 index 000000000000..184f104d0fb0 --- /dev/null +++ b/tests/generic-java-signatures/i15385/Test.java @@ -0,0 +1,18 @@ +public class Test { + public static void main(String[] args) throws Exception { + Foo foo = new Foo(); + System.out.println(foo.testNoParam()); + System.out.println(foo.testSingleParam(2)); + System.out.println(foo.testSingleParam2(21).value()); + System.out.println(foo.testSingleParam3(new Box(22))); + System.out.println(foo.testOtherReturn(3)); + System.out.println(foo.testNoErasure("4")); + System.out.println(foo.testMultiParam(5, "5")); + + System.out.println(foo.testVCNoParam()); + System.out.println(foo.testVCSingleParam(2)); + System.out.println(foo.testVCOtherReturn(3)); + System.out.println(foo.testVCNoErasure("4")); + System.out.println(foo.testVCMultiParam(5, "5")); + } +} diff --git a/tests/init/neg/early-promote4.scala b/tests/init/neg/early-promote4.scala index 65f917553974..487a75c5516f 100644 --- a/tests/init/neg/early-promote4.scala +++ b/tests/init/neg/early-promote4.scala @@ -8,13 +8,13 @@ class Outer { trait B { def bar() = assert(a == 5) } -} -class M(val o: Outer) extends A with o.B { - val n: Int = 10 + class M extends A with B { + val n: Int = 10 + } } class Dummy { val m: Int = n + 4 val n: Int = 10 // error -} \ No newline at end of file +} diff --git a/tests/init/neg/early-promote5.scala b/tests/init/neg/early-promote5.scala index 404f6fdb8d70..3f850b623ea3 100644 --- a/tests/init/neg/early-promote5.scala +++ b/tests/init/neg/early-promote5.scala @@ -8,13 +8,13 @@ class Outer { trait B { def bar(x: A) = println(a) } -} -class M(val o: Outer, c: Container) extends A with o.B + class M(c: Container) extends A with B +} class Container { val o = new Outer - val m = new M(o, this) // error + val m = new o.M(this) // error val s = "hello" } diff --git a/tests/pos-custom-args/captures/boxmap.scala b/tests/neg-custom-args/boxmap.scala similarity index 67% 
rename from tests/pos-custom-args/captures/boxmap.scala rename to tests/neg-custom-args/boxmap.scala index 18baabd4e584..e66b0a8ec808 100644 --- a/tests/pos-custom-args/captures/boxmap.scala +++ b/tests/neg-custom-args/boxmap.scala @@ -15,5 +15,7 @@ def lazymap[A <: Top, B <: Top](b: Box[A])(f: A => B): {f} (() -> Box[B]) = def test[A <: Top, B <: Top] = def lazymap[A <: Top, B <: Top](b: Box[A])(f: A => B) = () => b[Box[B]]((x: A) => box(f(x))) - val x: (b: Box[A]) -> (f: A => B) -> (() -> Box[B]) = lazymap[A, B] + val x0: (b: Box[A]) -> (f: A => B) -> (() -> Box[B]) = lazymap[A, B] // error + val x: (b: Box[A]) -> (f: A => B) -> {b, f} (() -> Box[B]) = lazymap[A, B] // works + val y: (b: Box[A]) -> (f: A => B) -> {*} (() -> Box[B]) = lazymap[A, B] // works () diff --git a/tests/neg-custom-args/captures/boundschecks.scala b/tests/neg-custom-args/captures/boundschecks.scala new file mode 100644 index 000000000000..cf4eab28f19d --- /dev/null +++ b/tests/neg-custom-args/captures/boundschecks.scala @@ -0,0 +1,18 @@ +object test { + + class Tree + + def f[X <: Tree](x: X): Unit = () + + class C[X <: Tree](x: X) + + def foo(t: {*} Tree) = + f(t) // error + f[{*} Tree](t) // error + f[Tree](t) // error + val c1 = C(t) // error + val c2 = C[{*} Tree](t) // error + val c3 = C[Tree](t) // error + + val foo: C[{*} Tree] = ??? +} diff --git a/tests/neg-custom-args/captures/boundschecks2.scala b/tests/neg-custom-args/captures/boundschecks2.scala new file mode 100644 index 000000000000..f6927b04931b --- /dev/null +++ b/tests/neg-custom-args/captures/boundschecks2.scala @@ -0,0 +1,13 @@ +object test { + + class Tree + + def f[X <: Tree](x: X): Unit = () + + class C[X <: Tree](x: X) + + val foo: C[{*} Tree] = ??? // error + type T = C[{*} Tree] // error + val bar: T -> T = ??? + val baz: C[{*} Tree] -> Unit = ??? // error +} diff --git a/tests/neg-custom-args/captures/byname.check b/tests/neg-custom-args/captures/byname.check index 486f94d599ac..90cf6c145c33 100644 --- a/tests/neg-custom-args/captures/byname.check +++ b/tests/neg-custom-args/captures/byname.check @@ -8,7 +8,7 @@ 10 | h(f2()) // error | ^^^^ | Found: {cap1} (x$0: Int) -> Int - | Required: {cap2} Int -> Int + | Required: {cap2} (x$0: Int) -> Int | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/byname.scala:19:5 ---------------------------------------- diff --git a/tests/neg-custom-args/captures/capt-depfun.scala b/tests/neg-custom-args/captures/capt-depfun.scala index a74764f432c7..c01eed7c4b25 100644 --- a/tests/neg-custom-args/captures/capt-depfun.scala +++ b/tests/neg-custom-args/captures/capt-depfun.scala @@ -1,8 +1,9 @@ import annotation.retains class C type Cap = C @retains(caps.*) +class Str def f(y: Cap, z: Cap) = def g(): C @retains(y, z) = ??? - val ac: ((x: Cap) => String @retains(x) => String @retains(x)) = ??? - val dc: (({y, z} String) => {y, z} String) = ac(g()) // error + val ac: ((x: Cap) => Str @retains(x) => Str @retains(x)) = ??? + val dc: (({y, z} Str) => {y, z} Str) = ac(g()) // error diff --git a/tests/neg-custom-args/captures/capt-depfun2.scala b/tests/neg-custom-args/captures/capt-depfun2.scala index 74b9441593c1..52dd74aabf9f 100644 --- a/tests/neg-custom-args/captures/capt-depfun2.scala +++ b/tests/neg-custom-args/captures/capt-depfun2.scala @@ -1,11 +1,12 @@ import annotation.retains class C type Cap = C @retains(caps.*) +class Str def f(y: Cap, z: Cap) = def g(): C @retains(y, z) = ??? 
- val ac: ((x: Cap) => Array[String @retains(x)]) = ??? - val dc = ac(g()) // error: Needs explicit type Array[? >: String <: {y, z} String] + val ac: ((x: Cap) => Array[Str @retains(x)]) = ??? + val dc = ac(g()) // error: Needs explicit type Array[? >: Str <: {y, z} Str] // This is a shortcoming of rechecking since the originally inferred - // type is `Array[String]` and the actual type after rechecking - // cannot be expressed as `Array[C String]` for any capture set C \ No newline at end of file + // type is `Array[Str]` and the actual type after rechecking + // cannot be expressed as `Array[C Str]` for any capture set C \ No newline at end of file diff --git a/tests/neg-custom-args/captures/cc-depfun.scala b/tests/neg-custom-args/captures/cc-depfun.scala new file mode 100644 index 000000000000..c4ef303f4712 --- /dev/null +++ b/tests/neg-custom-args/captures/cc-depfun.scala @@ -0,0 +1,9 @@ +trait Cap { def use(): Unit } + +def main() = { + val f: (io: {*} Cap) -> {} () -> Unit = + io => () => io.use() // error + + val g: ({*} Cap) -> {} () -> Unit = + io => () => io.use() // error +} diff --git a/tests/neg-custom-args/captures/cc-this.check b/tests/neg-custom-args/captures/cc-this.check index c492df15078f..0049f42a5db5 100644 --- a/tests/neg-custom-args/captures/cc-this.check +++ b/tests/neg-custom-args/captures/cc-this.check @@ -9,10 +9,7 @@ 10 | class C2(val x: () => Int): // error | ^ | reference (C2.this.x : () => Int) is not included in allowed capture set {} of the self type of class C2 --- [E058] Type Mismatch Error: tests/neg-custom-args/captures/cc-this.scala:17:8 --------------------------------------- +-- Error: tests/neg-custom-args/captures/cc-this.scala:17:8 ------------------------------------------------------------ 17 | class C4(val f: () => Int) extends C3 // error - | ^ - | illegal inheritance: self type {C4.this.f} C4 of class C4 does not conform to self type C3 - | of parent class C3 - | - | longer explanation available when compiling with `-explain` + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | reference (C4.this.f : () => Int) is not included in allowed capture set {} of pure base class class C3 diff --git a/tests/neg-custom-args/captures/cc-this2.check b/tests/neg-custom-args/captures/cc-this2.check index d10519636ca8..086524d307a2 100644 --- a/tests/neg-custom-args/captures/cc-this2.check +++ b/tests/neg-custom-args/captures/cc-this2.check @@ -1,8 +1,6 @@ --- [E058] Type Mismatch Error: tests/neg-custom-args/captures/cc-this2/D_2.scala:2:6 ----------------------------------- +-- Error: tests/neg-custom-args/captures/cc-this2/D_2.scala:2:6 -------------------------------------------------------- 2 |class D extends C: // error - | ^ - | illegal inheritance: self type {*} D of class D does not conform to self type C - | of parent class C - | - | longer explanation available when compiling with `-explain` + |^ + |reference (scala.caps.* : Any) is not included in allowed capture set {} of pure base class class C +3 | this: {*} D => diff --git a/tests/neg-custom-args/captures/exception-definitions.check b/tests/neg-custom-args/captures/exception-definitions.check new file mode 100644 index 000000000000..aca5d9217d64 --- /dev/null +++ b/tests/neg-custom-args/captures/exception-definitions.check @@ -0,0 +1,17 @@ +-- Error: tests/neg-custom-args/captures/exception-definitions.scala:2:6 ----------------------------------------------- +2 |class Err extends Exception: // error + |^ + |reference (scala.caps.* : Any) is not included in allowed capture set {} of pure base 
class class Throwable +3 | self: {*} Err => +-- Error: tests/neg-custom-args/captures/exception-definitions.scala:10:6 ---------------------------------------------- +10 |class Err4(c: {*} Any) extends AnyVal // error + |^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + |reference (Err4.this.c : {*} Any) is not included in allowed capture set {} of pure base class class AnyVal +-- Error: tests/neg-custom-args/captures/exception-definitions.scala:7:12 ---------------------------------------------- +7 | val x = c // error + | ^ + |(c : {*} Any) cannot be referenced here; it is not included in the allowed capture set {} of pure base class class Throwable +-- Error: tests/neg-custom-args/captures/exception-definitions.scala:8:8 ----------------------------------------------- +8 | class Err3(c: {*} Any) extends Exception // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | reference (Err3.this.c : {*} Any) is not included in allowed capture set {} of pure base class class Throwable diff --git a/tests/neg-custom-args/captures/exception-definitions.scala b/tests/neg-custom-args/captures/exception-definitions.scala new file mode 100644 index 000000000000..9f3539b7febf --- /dev/null +++ b/tests/neg-custom-args/captures/exception-definitions.scala @@ -0,0 +1,12 @@ + +class Err extends Exception: // error + self: {*} Err => + +def test(c: {*} Any) = + class Err2 extends Exception: + val x = c // error + class Err3(c: {*} Any) extends Exception // error + +class Err4(c: {*} Any) extends AnyVal // error + + diff --git a/tests/neg-custom-args/captures/heal-tparam-cs.scala b/tests/neg-custom-args/captures/heal-tparam-cs.scala new file mode 100644 index 000000000000..3ff34d0a8a42 --- /dev/null +++ b/tests/neg-custom-args/captures/heal-tparam-cs.scala @@ -0,0 +1,33 @@ +import language.experimental.captureChecking + +trait Cap { def use(): Unit } + +def localCap[T](op: (cap: {*} Cap) => T): T = ??? + +def main(io: {*} Cap, net: {*} Cap): Unit = { + val test1 = localCap { cap => // error + () => { cap.use() } + } + + val test2: (cap: {*} Cap) -> {cap} () -> Unit = + localCap { cap => // should work + (cap1: {*} Cap) => () => { cap1.use() } + } + + val test3: (cap: {io} Cap) -> {io} () -> Unit = + localCap { cap => // should work + (cap1: {io} Cap) => () => { cap1.use() } + } + + val test4: (cap: {io} Cap) -> {net} () -> Unit = + localCap { cap => // error + (cap1: {io} Cap) => () => { cap1.use() } + } + + def localCap2[T](op: (cap: {io} Cap) => T): T = ??? + + val test5: {io} () -> Unit = + localCap2 { cap => // ok + () => { cap.use() } + } +} diff --git a/tests/neg-custom-args/captures/i15116.check b/tests/neg-custom-args/captures/i15116.check index 83c552087646..7c73a7ff52ff 100644 --- a/tests/neg-custom-args/captures/i15116.check +++ b/tests/neg-custom-args/captures/i15116.check @@ -2,27 +2,27 @@ 3 | val x = Foo(m) // error | ^^^^^^^^^^^^^^ | Non-local value x cannot have an inferred type - | {Bar.this.m} Foo{m: {Bar.this.m} String} + | {Bar.this.m} Foo{val m: {Bar.this.m} String} | with non-empty capture set {Bar.this.m}. | The type needs to be declared explicitly. -- Error: tests/neg-custom-args/captures/i15116.scala:5:6 -------------------------------------------------------------- 5 | val x = Foo(m) // error | ^^^^^^^^^^^^^^ | Non-local value x cannot have an inferred type - | {Baz.this} Foo{m: {Baz.this} String} + | {Baz.this} Foo{val m: {*} String} | with non-empty capture set {Baz.this}. | The type needs to be declared explicitly. 
-- Error: tests/neg-custom-args/captures/i15116.scala:7:6 -------------------------------------------------------------- 7 | val x = Foo(m) // error | ^^^^^^^^^^^^^^ | Non-local value x cannot have an inferred type - | {Bar1.this.m} Foo{m: {Bar1.this.m} String} + | {Bar1.this.m} Foo{val m: {Bar1.this.m} String} | with non-empty capture set {Bar1.this.m}. | The type needs to be declared explicitly. -- Error: tests/neg-custom-args/captures/i15116.scala:9:6 -------------------------------------------------------------- 9 | val x = Foo(m) // error | ^^^^^^^^^^^^^^ | Non-local value x cannot have an inferred type - | {Baz2.this} Foo{m: {Baz2.this} String} + | {Baz2.this} Foo{val m: {*} String} | with non-empty capture set {Baz2.this}. | The type needs to be declared explicitly. diff --git a/tests/neg-custom-args/captures/i15772.check b/tests/neg-custom-args/captures/i15772.check index 765586ac5e27..a587f2d262ed 100644 --- a/tests/neg-custom-args/captures/i15772.check +++ b/tests/neg-custom-args/captures/i15772.check @@ -1,14 +1,14 @@ -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/i15772.scala:20:49 --------------------------------------- 20 | val boxed1 : (({*} C) => Unit) -> Unit = box1(c) // error | ^^^^^^^ - | Found: {c} ({*} ({c} C{arg: {*} C}) -> Unit) -> Unit + | Found: {c} ({*} ({c} C{val arg: {*} C}) -> Unit) -> Unit | Required: (({*} C) => Unit) -> Unit | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/i15772.scala:27:38 --------------------------------------- 27 | val boxed2 : Observe[{*} C] = box2(c) // error | ^^^^^^^ - | Found: {c} ({*} ({c} C{arg: {*} C}) -> Unit) -> Unit + | Found: {c} ({*} ({c} C{val arg: {*} C}) -> Unit) -> Unit | Required: Observe[{*} C] | | longer explanation available when compiling with `-explain` @@ -16,7 +16,7 @@ 33 | val boxed2 : Observe[{*} C] = box2(c) // error | ^ | Found: {*} C - | Required: box {*} C{arg: ? C} + | Required: box {*} C{val arg: ? C} | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/i15772.scala:44:2 ---------------------------------------- diff --git a/tests/neg-custom-args/captures/i15921.scala b/tests/neg-custom-args/captures/i15921.scala new file mode 100644 index 000000000000..291673746e33 --- /dev/null +++ b/tests/neg-custom-args/captures/i15921.scala @@ -0,0 +1,12 @@ +trait Stream { def close(): Unit = (); def write(x: Any): Unit = () } + +object Test { + def usingLogFile[T](op: (c: {*} Stream) => T): T = + val logFile = new Stream { } + val result = op(logFile) + logFile.close() + result + + val later = usingLogFile { f => () => f.write(0) } // error + later() // writing to closed file! 
+} diff --git a/tests/neg-custom-args/captures/i15925.scala b/tests/neg-custom-args/captures/i15925.scala new file mode 100644 index 000000000000..433d27a98414 --- /dev/null +++ b/tests/neg-custom-args/captures/i15925.scala @@ -0,0 +1,13 @@ +import language.experimental.captureChecking + +class Unit +object unit extends Unit + +type Foo[X] = [T] -> (op: X => T) -> T +type Lazy[X] = Unit => X + +def force[X](fx: Foo[Lazy[X]]): X = + fx[X](f => f(unit)) // error + +def force2[X](fx: Foo[Unit => X]): X = + fx[X](f => f(unit)) // error diff --git a/tests/neg-custom-args/captures/lazylist.check b/tests/neg-custom-args/captures/lazylist.check index e43538ad97f7..471e2a038450 100644 --- a/tests/neg-custom-args/captures/lazylist.check +++ b/tests/neg-custom-args/captures/lazylist.check @@ -1,15 +1,15 @@ --- [E163] Declaration Error: tests/neg-custom-args/captures/lazylist.scala:22:6 ---------------------------------------- -22 | def tail: {*} LazyList[Nothing] = ??? // error overriding - | ^ - | error overriding method tail in class LazyList of type -> lazylists.LazyList[Nothing]; - | method tail of type -> {*} lazylists.LazyList[Nothing] has incompatible type +-- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylist.scala:17:15 ------------------------------------- +17 | def tail = xs() // error + | ^^^^ + | Found: {LazyCons.this.xs} lazylists.LazyList[T] + | Required: lazylists.LazyList[T] | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylist.scala:35:29 ------------------------------------- 35 | val ref1c: LazyList[Int] = ref1 // error | ^^^^ - | Found: (ref1 : {cap1} lazylists.LazyCons[Int]{xs: {cap1} () -> {*} lazylists.LazyList[Int]}) - | Required: lazylists.LazyList[Int] + | Found: (ref1 : {cap1} lazylists.LazyCons[Int]{val xs: {cap1} () -> {*} lazylists.LazyList[Int]}) + | Required: lazylists.LazyList[Int] | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylist.scala:37:36 ------------------------------------- @@ -33,10 +33,10 @@ | Required: {cap1, ref3, cap3} lazylists.LazyList[Int] | | longer explanation available when compiling with `-explain` --- Error: tests/neg-custom-args/captures/lazylist.scala:17:6 ----------------------------------------------------------- -17 | def tail = xs() // error: cannot have an inferred type - | ^^^^^^^^^^^^^^^ - | Non-local method tail cannot have an inferred result type - | {LazyCons.this.xs} lazylists.LazyList[? T] - | with non-empty capture set {LazyCons.this.xs}. - | The type needs to be declared explicitly. +-- [E164] Declaration Error: tests/neg-custom-args/captures/lazylist.scala:22:6 ---------------------------------------- +22 | def tail: {*} LazyList[Nothing] = ??? 
// error overriding + | ^ + | error overriding method tail in class LazyList of type -> lazylists.LazyList[Nothing]; + | method tail of type -> {*} lazylists.LazyList[Nothing] has incompatible type + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg-custom-args/captures/lazylist.scala b/tests/neg-custom-args/captures/lazylist.scala index 56bfc3ea6da2..2674f15a0ee3 100644 --- a/tests/neg-custom-args/captures/lazylist.scala +++ b/tests/neg-custom-args/captures/lazylist.scala @@ -14,7 +14,7 @@ abstract class LazyList[+T]: class LazyCons[+T](val x: T, val xs: () => {*} LazyList[T]) extends LazyList[T]: def isEmpty = false def head = x - def tail = xs() // error: cannot have an inferred type + def tail = xs() // error object LazyNil extends LazyList[Nothing]: def isEmpty = true diff --git a/tests/neg-custom-args/captures/lazylists2.check b/tests/neg-custom-args/captures/lazylists2.check index 41881b57da24..812170aabdfe 100644 --- a/tests/neg-custom-args/captures/lazylists2.check +++ b/tests/neg-custom-args/captures/lazylists2.check @@ -1,10 +1,3 @@ --- [E163] Declaration Error: tests/neg-custom-args/captures/lazylists2.scala:50:10 ------------------------------------- -50 | def tail: {xs, f} LazyList[B] = xs.tail.map(f) // error - | ^ - | error overriding method tail in trait LazyList of type -> {Mapped.this} LazyList[B]; - | method tail of type -> {xs, f} LazyList[B] has incompatible type - | - | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylists2.scala:18:4 ------------------------------------ 18 | final class Mapped extends LazyList[B]: // error | ^ @@ -37,6 +30,18 @@ 41 | def tail: {this} LazyList[B] = xs.tail.map(f) // error | ^ |(f : A => B) cannot be referenced here; it is not included in the allowed capture set {xs} of the self type of class Mapped +-- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazylists2.scala:45:4 ------------------------------------ +45 | final class Mapped extends LazyList[B]: // error + | ^ + | Found: {f, xs} LazyList[? 
B] + | Required: {xs} LazyList[B] +46 | this: ({xs, f} Mapped) => +47 | def isEmpty = false +48 | def head: B = f(xs.head) +49 | def tail: {xs, f} LazyList[B] = xs.tail.map(f) +50 | new Mapped + | + | longer explanation available when compiling with `-explain` -- Error: tests/neg-custom-args/captures/lazylists2.scala:60:10 -------------------------------------------------------- 60 | class Mapped2 extends Mapped: // error | ^ diff --git a/tests/neg-custom-args/captures/lazylists2.scala b/tests/neg-custom-args/captures/lazylists2.scala index 7b661e931441..574fb5a1a488 100644 --- a/tests/neg-custom-args/captures/lazylists2.scala +++ b/tests/neg-custom-args/captures/lazylists2.scala @@ -42,12 +42,12 @@ extension [A](xs: {*} LazyList[A]) new Mapped def map4[B](f: A => B): {xs} LazyList[B] = - final class Mapped extends LazyList[B]: + final class Mapped extends LazyList[B]: // error this: ({xs, f} Mapped) => def isEmpty = false def head: B = f(xs.head) - def tail: {xs, f} LazyList[B] = xs.tail.map(f) // error + def tail: {xs, f} LazyList[B] = xs.tail.map(f) new Mapped def map5[B](f: A => B): LazyList[B] = diff --git a/tests/neg-custom-args/captures/lazyref.check b/tests/neg-custom-args/captures/lazyref.check index fcd98d0d67bd..7471f8f4f686 100644 --- a/tests/neg-custom-args/captures/lazyref.check +++ b/tests/neg-custom-args/captures/lazyref.check @@ -1,28 +1,28 @@ -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazyref.scala:19:28 -------------------------------------- 19 | val ref1c: LazyRef[Int] = ref1 // error | ^^^^ - | Found: (ref1 : {cap1} LazyRef[Int]{elem: {cap1} () -> Int}) + | Found: (ref1 : {cap1} LazyRef[Int]{val elem: {cap1} () -> Int}) | Required: LazyRef[Int] | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazyref.scala:21:35 -------------------------------------- 21 | val ref2c: {cap2} LazyRef[Int] = ref2 // error | ^^^^ - | Found: (ref2 : {cap2, ref1} LazyRef[Int]{elem: {*} () -> Int}) + | Found: (ref2 : {cap2, ref1} LazyRef[Int]{val elem: {*} () -> Int}) | Required: {cap2} LazyRef[Int] | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazyref.scala:23:35 -------------------------------------- 23 | val ref3c: {ref1} LazyRef[Int] = ref3 // error | ^^^^ - | Found: (ref3 : {cap2, ref1} LazyRef[Int]{elem: {*} () -> Int}) + | Found: (ref3 : {cap2, ref1} LazyRef[Int]{val elem: {*} () -> Int}) | Required: {ref1} LazyRef[Int] | | longer explanation available when compiling with `-explain` -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/lazyref.scala:25:35 -------------------------------------- 25 | val ref4c: {cap1} LazyRef[Int] = ref4 // error | ^^^^ - | Found: (ref4 : {cap2, cap1} LazyRef[Int]{elem: {*} () -> Int}) + | Found: (ref4 : {cap2, cap1} LazyRef[Int]{val elem: {*} () -> Int}) | Required: {cap1} LazyRef[Int] | | longer explanation available when compiling with `-explain` diff --git a/tests/neg-custom-args/captures/override-adapt-box-selftype.scala b/tests/neg-custom-args/captures/override-adapt-box-selftype.scala new file mode 100644 index 000000000000..a4dc92429192 --- /dev/null +++ b/tests/neg-custom-args/captures/override-adapt-box-selftype.scala @@ -0,0 +1,48 @@ +import language.experimental.captureChecking + +class IO +class C + +object Test1 { + abstract class A[X] { this: {} A[X] => + def foo(x: X): X + } + + def test(io: {*} IO) = { + class B extends A[{io} C] { // X =:= {io} C // error + 
override def foo(x: {io} C): {io} C = ??? + } + } +} + +def Test2(io: {*} IO, fs: {io} IO, ct: {*} IO) = { + abstract class A[X] { this: {io} A[X] => + def foo(x: X): X + } + + class B1 extends A[{io} C] { + override def foo(x: {io} C): {io} C = ??? + } + + class B2 extends A[{ct} C] { // error + override def foo(x: {ct} C): {ct} C = ??? + } + + class B3 extends A[{fs} C] { + override def foo(x: {fs} C): {fs} C = ??? + } +} + +def Test3(io: {*} IO, ct: {*} IO) = { + abstract class A[X] { this: {*} A[X] => + def foo(x: X): X + } + + class B1 extends A[{io} C] { + override def foo(x: {io} C): {io} C = ??? + } + + class B2 extends A[{io, ct} C] { + override def foo(x: {io, ct} C): {io, ct} C = ??? + } +} diff --git a/tests/neg-custom-args/captures/override-adapt-box.scala b/tests/neg-custom-args/captures/override-adapt-box.scala new file mode 100644 index 000000000000..64ba8743bf91 --- /dev/null +++ b/tests/neg-custom-args/captures/override-adapt-box.scala @@ -0,0 +1,14 @@ +import language.experimental.captureChecking + +abstract class A[X] { this: ({} A[X]) => + def foo(x: X): X +} + +class IO +class C + +def test(io: {*} IO) = { + class B extends A[{io} C] { // X =:= {io} C // error + override def foo(x: {io} C): {io} C = ??? + } +} diff --git a/tests/neg-custom-args/captures/selftype.scala b/tests/neg-custom-args/captures/selftype.scala new file mode 100644 index 000000000000..21148f625a7a --- /dev/null +++ b/tests/neg-custom-args/captures/selftype.scala @@ -0,0 +1,4 @@ +@annotation.experimental class C(x: () => Unit) extends caps.Pure // error + +@annotation.experimental class D(@annotation.constructorOnly x: () => Unit) extends caps.Pure // ok + diff --git a/tests/neg-custom-args/captures/try.check b/tests/neg-custom-args/captures/try.check index 30ebb910d34d..d4bcc859d256 100644 --- a/tests/neg-custom-args/captures/try.check +++ b/tests/neg-custom-args/captures/try.check @@ -1,8 +1,8 @@ -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/try.scala:23:49 ------------------------------------------ 23 | val a = handle[Exception, CanThrow[Exception]] { // error | ^ - | Found: ? ({*} CT[Exception]) -> {*} CT[? >: ? Exception <: ? Exception] - | Required: CanThrow[Exception] => box {*} CT[Exception] + | Found: ? ({*} CT[Exception]) -> CanThrow[Exception] + | Required: {*} CanThrow[Exception] -> box {*} CT[Exception] 24 | (x: CanThrow[Exception]) => x 25 | }{ | @@ -10,12 +10,26 @@ -- [E007] Type Mismatch Error: tests/neg-custom-args/captures/try.scala:29:43 ------------------------------------------ 29 | val b = handle[Exception, () -> Nothing] { // error | ^ - | Found: ? (x: {*} CT[Exception]) -> {x} () -> ? Nothing - | Required: CanThrow[Exception] => () -> Nothing + | Found: ? 
(x: {*} CT[Exception]) -> {x} () -> Nothing + | Required: {*} (x$0: CanThrow[Exception]) -> () -> Nothing 30 | (x: CanThrow[Exception]) => () => raise(new Exception)(using x) 31 | } { | | longer explanation available when compiling with `-explain` +-- [E007] Type Mismatch Error: tests/neg-custom-args/captures/try.scala:52:2 ------------------------------------------- +47 |val global: () -> Int = handle { +48 | (x: CanThrow[Exception]) => +49 | () => +50 | raise(new Exception)(using x) +51 | 22 +52 |} { // error + | ^ + | Found: {x$0} () -> Int + | Required: () -> Int +53 | (ex: Exception) => () => 22 +54 |} + | + | longer explanation available when compiling with `-explain` -- Error: tests/neg-custom-args/captures/try.scala:40:4 ---------------------------------------------------------------- 35 | val xx = handle { 36 | (x: CanThrow[Exception]) => @@ -24,19 +38,7 @@ 39 | 22 40 | } { // error | ^ - | The expression's type box {*} () -> Int is not allowed to capture the root capability `*`. + | The expression's type box {x$0, *} () -> Int is not allowed to capture the root capability `*`. | This usually means that a capability persists longer than its allowed lifetime. 41 | (ex: Exception) => () => 22 42 | } --- Error: tests/neg-custom-args/captures/try.scala:52:2 ---------------------------------------------------------------- -47 |val global = handle { -48 | (x: CanThrow[Exception]) => -49 | () => -50 | raise(new Exception)(using x) -51 | 22 -52 |} { // error - | ^ - | The expression's type box {*} () -> Int is not allowed to capture the root capability `*`. - | This usually means that a capability persists longer than its allowed lifetime. -53 | (ex: Exception) => () => 22 -54 |} diff --git a/tests/neg-custom-args/captures/try.scala b/tests/neg-custom-args/captures/try.scala index df7930f76af8..9489766d41be 100644 --- a/tests/neg-custom-args/captures/try.scala +++ b/tests/neg-custom-args/captures/try.scala @@ -44,11 +44,11 @@ def test = yy // OK -val global = handle { +val global: () -> Int = handle { (x: CanThrow[Exception]) => () => raise(new Exception)(using x) 22 } { // error (ex: Exception) => () => 22 -} \ No newline at end of file +} diff --git a/tests/neg-custom-args/captures/usingLogFile.check b/tests/neg-custom-args/captures/usingLogFile.check index beb7ac23ed44..05fb385a64f7 100644 --- a/tests/neg-custom-args/captures/usingLogFile.check +++ b/tests/neg-custom-args/captures/usingLogFile.check @@ -1,13 +1,3 @@ --- Error: tests/neg-custom-args/captures/usingLogFile.scala:23:27 ------------------------------------------------------ -23 | val later = usingLogFile { f => () => f.write(0) } // error - | ^^^^^^^^^^^^^^^^^^^^^^^^^ - | {f} () -> Unit cannot be box-converted to box ? () -> Unit - | since one of their capture sets contains the root capability `*` --- Error: tests/neg-custom-args/captures/usingLogFile.scala:29:9 ------------------------------------------------------- -29 | later2.x() // error - | ^^^^^^^^ - | The expression's type box {*} () -> Unit is not allowed to capture the root capability `*`. - | This usually means that a capability persists longer than its allowed lifetime. -- Error: tests/neg-custom-args/captures/usingLogFile.scala:33:2 ------------------------------------------------------- 33 | later3() // error | ^^^^^^ @@ -18,18 +8,32 @@ | ^^^^^^^^ | The expression's type box {*} () -> Unit is not allowed to capture the root capability `*`. | This usually means that a capability persists longer than its allowed lifetime. 
--- Error: tests/neg-custom-args/captures/usingLogFile.scala:47:27 ------------------------------------------------------ +-- Error: tests/neg-custom-args/captures/usingLogFile.scala:23:6 ------------------------------------------------------- +23 | val later = usingLogFile { f => () => f.write(0) } // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | Non-local value later cannot have an inferred type + | {x$0} () -> Unit + | with non-empty capture set {x$0}. + | The type needs to be declared explicitly. +-- Error: tests/neg-custom-args/captures/usingLogFile.scala:29:9 ------------------------------------------------------- +29 | later2.x() // error + | ^^^^^^^^ + | The expression's type box {x$0, *} () -> Unit is not allowed to capture the root capability `*`. + | This usually means that a capability persists longer than its allowed lifetime. +-- Error: tests/neg-custom-args/captures/usingLogFile.scala:47:6 ------------------------------------------------------- 47 | val later = usingLogFile { f => () => f.write(0) } // error - | ^^^^^^^^^^^^^^^^^^^^^^^^^ - | {f} () -> Unit cannot be box-converted to box ? () -> Unit - | since one of their capture sets contains the root capability `*` --- Error: tests/neg-custom-args/captures/usingLogFile.scala:62:33 ------------------------------------------------------ + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | Non-local value later cannot have an inferred type + | {x$0} () -> Unit + | with non-empty capture set {x$0}. + | The type needs to be declared explicitly. +-- Error: tests/neg-custom-args/captures/usingLogFile.scala:62:25 ------------------------------------------------------ 62 | val later = usingFile("out", f => (y: Int) => xs.foreach(x => f.write(x + y))) // error - | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - | {f} (x$0: Int) -> Unit cannot be box-converted to box ? (x$0: Int) -> Unit - | since one of their capture sets contains the root capability `*` --- Error: tests/neg-custom-args/captures/usingLogFile.scala:71:37 ------------------------------------------------------ + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | The expression's type box {x$0, *} (x$0: Int) -> Unit is not allowed to capture the root capability `*`. + | This usually means that a capability persists longer than its allowed lifetime. +-- Error: tests/neg-custom-args/captures/usingLogFile.scala:71:25 ------------------------------------------------------ 71 | val later = usingFile("logfile", usingLogger(_, l => () => l.log("test"))) // error - | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - | {_$1} () -> Unit cannot be box-converted to box ? () -> Unit - | since one of their capture sets contains the root capability `*` + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | The expression's type box {x$0, *} () -> Unit is not allowed to capture the root capability `*`. + | This usually means that a capability persists longer than its allowed lifetime. diff --git a/tests/neg-custom-args/captures/vars.check b/tests/neg-custom-args/captures/vars.check index e4f28fd45e93..4b9ab5723ce6 100644 --- a/tests/neg-custom-args/captures/vars.check +++ b/tests/neg-custom-args/captures/vars.check @@ -9,7 +9,7 @@ 15 | val u = a // error | ^ | Found: (a : box {*} String -> String) - | Required: {*} (x$0: ? String) -> ? 
String + | Required: {*} (x$0: String) -> String | | longer explanation available when compiling with `-explain` -- Error: tests/neg-custom-args/captures/vars.scala:16:2 --------------------------------------------------------------- @@ -25,7 +25,7 @@ -- Error: tests/neg-custom-args/captures/vars.scala:32:8 --------------------------------------------------------------- 32 | local { cap3 => // error | ^ - | The expression's type box {*} (x$0: ? String) -> ? String is not allowed to capture the root capability `*`. + | The expression's type box {x$0, *} (x$0: String) -> String is not allowed to capture the root capability `*`. | This usually means that a capability persists longer than its allowed lifetime. 33 | def g(x: String): String = if cap3 == cap3 then "" else "a" 34 | g diff --git a/tests/neg-custom-args/erased/i4060.scala b/tests/neg-custom-args/erased/i4060.scala new file mode 100644 index 000000000000..a1a2eee68dc0 --- /dev/null +++ b/tests/neg-custom-args/erased/i4060.scala @@ -0,0 +1,21 @@ +// See https://github.com/lampepfl/dotty/issues/4060#issuecomment-445808377 + +object App { + trait A { type L >: Any} + def upcast(erased a: A)(x: Any): a.L = x + erased val p: A { type L <: Nothing } = p + def coerce(x: Any): Int = upcast(p)(x) // error + + def coerceInline(x: Any): Int = upcast(compiletime.erasedValue[A {type L <: Nothing}])(x) // error + + trait B { type L <: Nothing } + def upcast_dep_parameter(erased a: B)(x: a.L) : Int = x + erased val q : B { type L >: Any } = compiletime.erasedValue + + def coerceInlineWithB(x: Any): Int = upcast_dep_parameter(q)(x) // error + + def main(args: Array[String]): Unit = { + println(coerce("Uh oh!")) + println(coerceInlineWithB("Uh oh!")) + } +} diff --git a/tests/neg-custom-args/fatal-warnings/i10994.scala b/tests/neg-custom-args/fatal-warnings/i10994.scala new file mode 100644 index 000000000000..ce5cb2cf3df9 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i10994.scala @@ -0,0 +1,2 @@ +def foo = true match + case (b: Boolean): Boolean => () // error diff --git a/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala new file mode 100644 index 000000000000..13d540dc2a5d --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503-scala2/scala2-t11681.scala @@ -0,0 +1,110 @@ +// scalac: -Wunused:params +// + +import Answers._ + +trait InterFace { + /** Call something. */ + def call(a: Int, b: String, c: Double): Int +} + +trait BadAPI extends InterFace { + private def f(a: Int, + b: String, // error + c: Double): Int = { + println(c) + a + } + @deprecated("no warn in deprecated API", since="yesterday") + def g(a: Int, + b: String, // OK + c: Double): Int = { + println(c) + a + } + override def call(a: Int, + b: String, // OK + c: Double): Int = { + println(c) + a + } + + def meth(x: Int) = x + + override def equals(other: Any): Boolean = true // OK + + def i(implicit s: String) = answer // ok + + /* + def future(x: Int): Int = { + val y = 42 + val x = y // maybe option to warn only if shadowed + x + } + */ +} + +// mustn't alter warnings in super +trait PoorClient extends BadAPI { + override def meth(x: Int) = ??? // OK + override def f(a: Int, b: String, c: Double): Int = a + b.toInt + c.toInt +} + +class Unusing(u: Int) { // error + def f = ??? 
+} + +class Valuing(val u: Int) // OK + +class Revaluing(u: Int) { def f = u } // OK + +case class CaseyKasem(k: Int) // OK + +case class CaseyAtTheBat(k: Int)(s: String) // ok + +trait Ignorance { + def f(readResolve: Int) = answer // ok +} + +class Reusing(u: Int) extends Unusing(u) // OK + +// TODO: check +// class Main { +// def main(args: Array[String]): Unit = println("hello, args") // OK +// } + +trait Unimplementation { + def f(u: Int): Int = ??? // OK +} + +trait DumbStuff { + def f(implicit dummy: DummyImplicit) = answer // ok + def g(dummy: DummyImplicit) = answer // ok +} +trait Proofs { + def f[A, B](implicit ev: A =:= B) = answer // ok + def g[A, B](implicit ev: A <:< B) = answer // ok + def f2[A, B](ev: A =:= B) = answer // ok + def g2[A, B](ev: A <:< B) = answer // ok +} + +trait Anonymous { + def f = (i: Int) => answer // ok + + def f1 = (_: Int) => answer // OK + + def f2: Int => Int = _ + 1 // OK + + def g = for (i <- List(1)) yield answer // ok +} +trait Context[A] +trait Implicits { + def f[A](implicit ctx: Context[A]) = answer // ok + def g[A: Context] = answer // OK +} +class Bound[A: Context] // OK +object Answers { + def answer: Int = 42 +} + +val a$1 = 2 \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i15503a.scala b/tests/neg-custom-args/fatal-warnings/i15503a.scala new file mode 100644 index 000000000000..cd7282490fc9 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503a.scala @@ -0,0 +1,268 @@ +// scalac: -Wunused:imports + + +object FooUnused: + import collection.mutable.Set // error + import collection.mutable.{Map => MutMap} // error + import collection.mutable._ // error + +object FooWildcardUnused: + import collection.mutable._ // error + +object Foo: + import collection.mutable.Set // OK + import collection.mutable.{Map => MutMap} // OK + + val bar = Set() // OK + val baz = MutMap() // OK + +object FooWildcard: + import collection.mutable._ // OK + + val bar = Set() // OK + +object FooNestedUnused: + import collection.mutable.Set // error + object Nested: + def hello = 1 + +object FooNested: + import collection.mutable.Set // OK + object Nested: + def hello = Set() + +object FooGivenUnused: + import SomeGivenImports.given // error + +object FooGiven: + import SomeGivenImports.given // OK + import SomeGivenImports._ // error + + val foo = summon[Int] + +/** + * Import used as type name are considered + * as used. + * + * Import here are only used as types, not as + * Term + */ +object FooTypeName: + import collection.mutable.Set // OK + import collection.mutable.Map // OK + import collection.mutable.Seq // OK + import collection.mutable.ArrayBuilder // OK + import collection.mutable.ListBuffer // error + + def checkImplicit[A](using Set[A]) = () + def checkParamType[B](a: Map[B,B]): Seq[B] = ??? 
+ def checkTypeParam[A] = () + + checkTypeParam[ArrayBuilder[Int]] + + +object InlineChecks: + object InlineFoo: + import collection.mutable.Set // ok + import collection.mutable.Map // error + inline def getSet = Set(1) + + object InlinedBar: + import collection.mutable.Set // ok + import collection.mutable.Map // error + val a = InlineFoo.getSet + +object MacroChecks: + object StringInterpol: + import collection.mutable.Set // OK + import collection.mutable.Map // OK + println(s"This is a mutableSet : ${Set[Map[Int,Int]]()}") + + +object InnerMostCheck: + import collection.mutable.* // error + def check = + import collection.mutable.* //OK + val a = Set(1) + +object IgnoreExclusion: + import collection.mutable.{Set => _} // OK + import collection.mutable.{Map => _} // OK + import collection.mutable.{ListBuffer} // error + def check = + val a = Set(1) + val b = Map(1 -> 2) +/** + * Some given values for the test + */ +object SomeGivenImports: + given Int = 0 + given String = "foo" + +/* BEGIN : Check on packages*/ +package testsamepackageimport: + package p { + class C + } + + package p { + import p._ // error + package q { + class U { + def f = new C + } + } + } +// ----------------------- + +package testpackageimport: + package a: + val x: Int = 0 + + package b: + import a._ // error + + +/* END : Check on packages*/ + +/* BEGIN : tests on meta-language features */ +object TestGivenCoversionScala2: + /* note: scala3 Conversion[U,T] do not require an import */ + import language.implicitConversions // OK + + implicit def doubleToInt(d:Double):Int = d.toInt + + def idInt(i:Int):Int = i + val someInt = idInt(1.0) + +object TestTailrecImport: + import annotation.tailrec // OK + @tailrec + def fac(x:Int, acc:Int = 1): Int = + if x == 0 then acc else fac(x - 1, acc * x) +/* END : tests on meta-language features */ + +/* BEGIN : tests on given import order */ +object GivenImportOrderAtoB: + class X + class Y extends X + object A { implicit val x: X = new X } + object B { implicit val y: Y = new Y } + class C { + import A._ // error + import B._ // OK + def t = implicitly[X] + } + +object GivenImportOrderBtoA: + class X + class Y extends X + object A { implicit val x: X = new X } + object B { implicit val y: Y = new Y } + class C { + import B._ // OK + import A._ // error + def t = implicitly[X] + } +/* END : tests on given import order */ + +/* Scala 2 implicits */ +object Scala2ImplicitsGiven: + object A: + implicit val x: Int = 1 + object B: + import A.given // OK + val b = summon[Int] + object C: + import A.given // error + val b = 1 + object D: + import A._ // OK + val b = summon[Int] + object E: + import A._ // error + val b = 1 + object F: + import A.x // OK + val b = summon[Int] + object G: + import A.x // error + val b = 1 + +// ------------------------------------- +object TestNewKeyword: + object Foo: + class Aa[T](val x: T) + object Bar: + import Foo.Aa // OK + val v = 1 + val a = new Aa(v) + +// ------------------------------------- +object testAnnotatedType: + import annotation.switch // OK + val a = (??? : @switch) match + case _ => ??? + + +//------------------------------------- +package testImportsInImports: + package a: + package b: + val x = 1 + package c: + import a.b // OK + import b.x // OK + val y = x + +//------------------------------------- +package testOnOverloadedMethodsImports: + package a: + trait A + trait B + trait C: + def foo(x: A):A = ??? + def foo(x: B):B = ??? 
+ package b: + object D extends a.C + package c: + import b.D.foo // error + package d: + import b.D.foo // OK + def bar = foo((??? : a.A)) + package e: + import b.D.foo // OK + def bar = foo((??? : a.B)) + package f: + import b.D.foo // OK + def bar = foo((??? : a.A)) + def baz = foo((??? : a.B)) + +//------------------------------------- +package foo.testing.rename.imports: + import collection.mutable.{Set => MutSet1} // OK + import collection.mutable.{Set => MutSet2} // OK + import collection.mutable.{Set => MutSet3} // error + type A[X] = MutSet1[X] + val a = MutSet2(1) + +//------------------------------------- +package foo.testing.imports.precedence: + import scala.collection.immutable.{BitSet => _, _} // error + import scala.collection.immutable.BitSet // OK + def t = BitSet.empty + +package foo.test.enums: + enum A: // OK + case B extends A // OK + case C extends A // OK + +package foo.test.typeapply.hklamdba.i16680: + package foo: + trait IO[A] + + package bar: + import foo.IO // OK + + def f[F[_]]: String = "hello" + def go = f[IO] \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i15503b.scala b/tests/neg-custom-args/fatal-warnings/i15503b.scala new file mode 100644 index 000000000000..8a4a055150f9 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503b.scala @@ -0,0 +1,102 @@ +// scalac: -Wunused:locals + +val a = 1 // OK + +val b = // OK + val e1 = 1 // error + def e2 = 2 // error + 1 + +val c = // OK + val e1 = 1 // OK + def e2 = e1 // OK + e2 + +def d = 1 // OK + +def e = // OK + val e1 = 1 // error + def e2 = 2 // error + 1 + +def f = // OK + val f1 = 1 // OK + def f2 = f1 // OK + f2 + +class Foo { + val b = // OK + val e1 = 1 // error + def e2 = 2 // error + 1 + + val c = // OK + val e1 = 1 // OK + def e2 = e1 // OK + e2 + + def d = 1 // OK + + def e = // OK + val e1 = 1 // error + def e2 = 2 // error + 1 + + def f = // OK + val f1 = 1 // OK + def f2 = f1 // OK + f2 +} + +// ---- SCALA 2 tests ---- + +package foo.scala2.tests: + class Outer { + class Inner + } + + trait Locals { + def f0 = { + var x = 1 // error + var y = 2 // OK + y = 3 + y + y + } + def f1 = { + val a = new Outer // OK + val b = new Outer // error + new a.Inner + } + def f2 = { + var x = 100 + x + } + } + + object Types { + def l1() = { + object HiObject { def f = this } // OK + class Hi { // error + def f1: Hi = new Hi + def f2(x: Hi) = x + } + class DingDongDoobie // error + class Bippy // OK + type Something = Bippy // OK + type OtherThing = String // error + (new Bippy): Something + } + } + +package test.foo.twisted.i16682: + def myPackage = + object IntExtractor: // OK + def unapply(s: String): Option[Int] = s.toIntOption + + def isInt(s: String) = s match { // OK + case IntExtractor(i) => println(s"Number $i") + case _ => println("NaN") + } + isInt + + def f = myPackage("42") diff --git a/tests/neg-custom-args/fatal-warnings/i15503c.scala b/tests/neg-custom-args/fatal-warnings/i15503c.scala new file mode 100644 index 000000000000..630846df4e5d --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503c.scala @@ -0,0 +1,40 @@ +// scalac: -Wunused:privates +trait C +class A: + self: C => // OK + class B: + private[A] val a = 1 // OK + private[B] val b = 1 // OK + private[this] val c = 1 // error + private val d = 1 // error + + private[A] val e = 1 // OK + private[this] val f = e // OK + private val g = f // OK + + private def fac(x: Int): Int = // error + if x == 0 then 1 else x * fac(x - 1) + + val x = 1 // OK + def y = 2 // OK + def z = g // OK + +package 
foo.test.contructors: + case class A private (x:Int) // OK + class B private (val x: Int) // OK + class C private (private val x: Int) // error + class D private (private val x: Int): // OK + def y = x + + +package test.foo.i16682: + object myPackage: + private object IntExtractor: // OK + def unapply(s: String): Option[Int] = s.toIntOption + + def isInt(s: String) = s match { + case IntExtractor(i) => println(s"Number $i") + case _ => println("NaN") + } + + def f = myPackage.isInt("42") diff --git a/tests/neg-custom-args/fatal-warnings/i15503d.scala b/tests/neg-custom-args/fatal-warnings/i15503d.scala new file mode 100644 index 000000000000..6c5973c66a3a --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503d.scala @@ -0,0 +1,30 @@ +// scalac: -Wunused:unsafe-warn-patvars +// todo : change to :patvars + +sealed trait Calc +sealed trait Const extends Calc +case class Sum(a: Calc, b: Calc) extends Calc +case class S(pred: Const) extends Const +case object Z extends Const + +val a = Sum(S(S(Z)),Z) match { + case Sum(a,Z) => Z // error + // case Sum(a @ _,Z) => Z // todo : this should pass in the future + case Sum(a@S(_),Z) => Z // error + case Sum(a@S(_),Z) => a // OK + case Sum(a@S(b@S(_)), Z) => a // error + case Sum(a@S(b@S(_)), Z) => a // error + case Sum(a@S(b@(S(_))), Z) => Sum(a,b) // OK + case Sum(_,_) => Z // OK + case _ => Z // OK +} + +// todo : This should pass in the future +// val b = for { +// case Some(x) <- Option(Option(1)) +// } println(s"$x") + +// todo : This should *NOT* pass in the future +// val c = for { +// case Some(x) <- Option(Option(1)) +// } println(s"hello world") diff --git a/tests/neg-custom-args/fatal-warnings/i15503e.scala b/tests/neg-custom-args/fatal-warnings/i15503e.scala new file mode 100644 index 000000000000..57664cd08dcd --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503e.scala @@ -0,0 +1,71 @@ +// scalac: -Wunused:explicits + +object Foo { + /* This goes around the "trivial method" detection */ + val default_val = 1 + + private def f1(a: Int) = a // OK + private def f2(a: Int) = default_val // error + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_val // error + private def f6(a: Int)(using Int) = summon[Int] // error + private def f7(a: Int)(using Int) = summon[Int] + a // OK +} + +package scala2main.unused.args: + object happyBirthday { + def main(args: Array[String]): Unit = println("Hello World") // ok + } + +package scala2main: + object happyBirthday { + def main(args: Array[String]): Unit = // OK + println(s"Hello World, there are ${args.size} arguments") + } + +package scala3main: + /* This goes around the "trivial method" detection */ + val default_unit = () + @main def hello = println("Hello World") // OK + +package foo.test.lambda.param: + val default_val = 1 + val a = (i: Int) => i // OK + val b = (i: Int) => default_val // OK + val c = (_: Int) => default_val // OK + +package foo.test.trivial: + /* A twisted test from Scala 2 */ + class C { + def answer: 42 = 42 + object X + private def g0(x: Int) = ??? 
// OK + private def f0(x: Int) = () // OK + private def f1(x: Int) = throw new RuntimeException // OK + private def f2(x: Int) = 42 // OK + private def f3(x: Int): Option[Int] = None // OK + private def f4(x: Int) = classOf[Int] // OK + private def f5(x: Int) = answer + 27 // OK + private def f6(x: Int) = X // OK + private def f7(x: Int) = Y // OK + private def f8(x: Int): List[C] = Nil // OK + private def f9(x: Int): List[Int] = List(1,2,3,4) // error + private def foo:Int = 32 // OK + private def f77(x: Int) = foo // error + } + object Y + +package foo.test.i16955: + class S(var r: String) // OK + +package foo.test.i16865: + trait Foo: + def fn(a: Int, b: Int): Int // OK + trait Bar extends Foo + + object Ex extends Bar: + def fn(a: Int, b: Int): Int = b + 3 // OK + + object Ex2 extends Bar: + override def fn(a: Int, b: Int): Int = b + 3 // OK + diff --git a/tests/neg-custom-args/fatal-warnings/i15503f.scala b/tests/neg-custom-args/fatal-warnings/i15503f.scala new file mode 100644 index 000000000000..f909272af732 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503f.scala @@ -0,0 +1,14 @@ +// scalac: -Wunused:implicits + +/* This goes around the "trivial method" detection */ +val default_int = 1 + +object Xd { + private def f1(a: Int) = a // OK + private def f2(a: Int) = 1 // OK + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_int // OK + private def f6(a: Int)(using Int) = summon[Int] // OK + private def f7(a: Int)(using Int) = summon[Int] + a // OK + private def f8(a: Int)(using foo: Int) = a // error +} diff --git a/tests/neg-custom-args/fatal-warnings/i15503g.scala b/tests/neg-custom-args/fatal-warnings/i15503g.scala new file mode 100644 index 000000000000..2185bfed711d --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503g.scala @@ -0,0 +1,23 @@ +// scalac: -Wunused:params + +/* This goes around the "trivial method" detection */ +object Foo { + val default_int = 1 + + private def f1(a: Int) = a // OK + private def f2(a: Int) = default_int // error + private def f3(a: Int)(using Int) = a // OK + private def f4(a: Int)(using Int) = default_int // error + private def f6(a: Int)(using Int) = summon[Int] // error + private def f7(a: Int)(using Int) = summon[Int] + a // OK + /* --- Trivial method check --- */ + private def g1(x: Int) = 1 // OK + private def g2(x: Int) = ??? 
// OK +} + +package foo.test.i17101: + type Test[A] = A + extension[A] (x: Test[A]) { // OK + def value: A = x + def causesIssue: Unit = println("oh no") + } diff --git a/tests/neg-custom-args/fatal-warnings/i15503h.scala b/tests/neg-custom-args/fatal-warnings/i15503h.scala new file mode 100644 index 000000000000..3bab6cdbd098 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503h.scala @@ -0,0 +1,20 @@ +// scalac: -Wunused:linted + +import collection.mutable.Set // error + +class A { + private val a = 1 // error + val b = 2 // OK + + private def c = 2 // error + def d(using x:Int): Int = b // ok + def e(x: Int) = 1 // OK + def f = + val x = 1 // error + def f = 2 // error + 3 + + def g(x: Int): Int = x match + case x:1 => 0 // OK + case _ => 1 +} \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i15503i.scala b/tests/neg-custom-args/fatal-warnings/i15503i.scala new file mode 100644 index 000000000000..fefead7f01a3 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503i.scala @@ -0,0 +1,315 @@ +// scalac: -Wunused:all + +import collection.mutable.{Map => MutMap} // error +import collection.mutable.Set // error + +class A { + import collection.mutable.{Map => MutMap} // OK + private val a = 1 // error + val b = 2 // OK + + /* This goes around the trivial method detection */ + val default_int = 12 + + val someMap = MutMap() + + private def c1 = 2 // error + private def c2 = 2 // OK + def c3 = c2 + + def d1(using x:Int): Int = default_int // ok + def d2(using x:Int): Int = x // OK + + def e1(x: Int) = default_int // ok + def e2(x: Int) = x // OK + def f = + val x = 1 // error + def f = 2 // error + val y = 3 // OK + def g = 4 // OK + y + g + + // todo : uncomment once patvars is fixed + // def g(x: Int): Int = x match + // case x:1 => 0 // ?error + // case x:2 => x // ?OK + // case _ => 1 // ?OK +} + +/* ---- CHECK scala.annotation.unused ---- */ +package foo.test.scala.annotation: + import annotation.unused // OK + + /* This goes around the trivial method detection */ + val default_int = 12 + + def a1(a: Int) = a // OK + def a2(a: Int) = default_int // ok + + def a3(@unused a: Int) = default_int //OK + + def b1 = + def f = 1 // error + 1 + + def b2 = + def f = 1 // OK + f + + def b3 = + @unused def f = 1 // OK + 1 + + object Foo: + private def a = 1 // error + private def b = 2 // OK + @unused private def c = 3 // OK + + def other = b + +package foo.test.companionprivate: + class A: + import A.b // OK + def a = b // OK + + object A: + private def b = c // OK + def c = List(1,2,3) // OK + +package foo.test.i16678: + def foo(func: Int => String, value: Int): String = func(value) // OK + + def run = + println(foo(number => number.toString, value = 5)) // OK + println(foo(number => "", value = 5)) // error + println(foo(func = number => "", value = 5)) // error + println(foo(func = number => number.toString, value = 5)) // OK + println(foo(func = _.toString, value = 5)) // OK + +package foo.test.possibleclasses: + case class AllCaseClass( + k: Int, // OK + private val y: Int // OK /* Kept as it can be taken from pattern */ + )( + s: Int, + val t: Int, // OK + private val z: Int // error + ) + + case class AllCaseUsed( + k: Int, // OK + private val y: Int // OK + )( + s: Int, // OK + val t: Int, // OK + private val z: Int // OK + ) { + def a = k + y + s + t + z + } + + class AllClass( + k: Int, // error + private val y: Int // error + )( + s: Int, // error + val t: Int, // OK + private val z: Int // error + ) + + class AllUsed( + k: Int, // OK + 
private val y: Int // OK + )( + s: Int, // OK + val t: Int, // OK + private val z: Int // OK + ) { + def a = k + y + s + t + z + } + +package foo.test.possibleclasses.withvar: + case class AllCaseClass( + k: Int, // OK + private var y: Int // OK /* Kept as it can be taken from pattern */ + )( + s: Int, + var t: Int, // OK + private var z: Int // error + ) + + case class AllCaseUsed( + k: Int, // OK + private var y: Int // OK + )( + s: Int, // OK + var t: Int, // OK + private var z: Int // OK + ) { + def a = k + y + s + t + z + } + + class AllClass( + k: Int, // error + private var y: Int // error + )( + s: Int, // error + var t: Int, // OK + private var z: Int // error + ) + + class AllUsed( + k: Int, // OK + private var y: Int // OK + )( + s: Int, // OK + var t: Int, // OK + private var z: Int // OK + ) { + def a = k + y + s + t + z + } + + + +package foo.test.from.i16675: + case class PositiveNumber private (i: Int) // OK + object PositiveNumber: + def make(i: Int): Option[PositiveNumber] = //OK + Option.when(i >= 0)(PositiveNumber(i)) // OK + +package foo.test.i16822: + enum ExampleEnum { + case Build(context: String) // OK + case List // OK + } + + def demo = { + val x = ExampleEnum.List // OK + println(x) // OK + } + +package foo.test.i16877: + import scala.collection.immutable.HashMap // OK + import scala.annotation.StaticAnnotation // OK + + class ExampleAnnotation(val a: Object) extends StaticAnnotation // OK + + @ExampleAnnotation(new HashMap()) // OK + class Test //OK + +package foo.test.i16926: + def hello(): Unit = + for { + i <- (0 to 10).toList + (a, b) = "hello" -> "world" // OK + } yield println(s"$a $b") + +package foo.test.i16925: + def hello = + for { + i <- 1 to 2 if true + _ = println(i) // OK + } yield () + +package foo.test.i16863a: + import scala.quoted.* + def fn(using Quotes) = + val x = Expr(1) + '{ $x + 2 } // OK + +package foo.test.i16863b: + import scala.quoted.* + def fn[A](using Quotes, Type[A]) = // OK + val numeric = Expr.summon[Numeric[A]].getOrElse(???) 
+ '{ $numeric.fromInt(3) } // OK + +package foo.test.i16863c: + import scala.quoted.* + def fn[A](expr: Expr[Any])(using Quotes) = + val imp = expr match + case '{ ${ _ }: a } => Expr.summon[Numeric[a]] // OK + println(imp) + +package foo.test.i16863d: + import scala.quoted.* + import scala.compiletime.asMatchable // OK + def fn[A](using Quotes, Type[A]) = + import quotes.reflect.* + val imp = TypeRepr.of[A].widen.asMatchable match + case Refinement(_,_,_) => () + println(imp) + +package foo.test.i16679a: + object myPackage: + trait CaseClassName[A]: + def name: String + object CaseClassName: + trait CaseClassByStringName[A] extends CaseClassName[A] + import scala.deriving.Mirror + object CaseClassByStringName: + inline final def derived[A](using inline A: Mirror.Of[A]): CaseClassByStringName[A] = + new CaseClassByStringName[A]: + def name: String = A.toString + + object secondPackage: + import myPackage.CaseClassName // OK + case class CoolClass(i: Int) derives CaseClassName.CaseClassByStringName + println(summon[CaseClassName[CoolClass]].name) + +package foo.test.i16679b: + object myPackage: + trait CaseClassName[A]: + def name: String + + object CaseClassName: + import scala.deriving.Mirror + inline final def derived[A](using inline A: Mirror.Of[A]): CaseClassName[A] = + new CaseClassName[A]: + def name: String = A.toString + + object Foo: + given x: myPackage.CaseClassName[secondPackage.CoolClass] = null + + object secondPackage: + import myPackage.CaseClassName // OK + import Foo.x + case class CoolClass(i: Int) + println(summon[myPackage.CaseClassName[CoolClass]]) + +package foo.test.i17156: + package a: + trait Foo[A] + object Foo: + inline def derived[T]: Foo[T] = new Foo{} + + package b: + import a.Foo + type Xd[A] = Foo[A] + + package c: + import b.Xd + trait Z derives Xd + + +package foo.test.i17175: + val continue = true + def foo = + for { + i <- 1.until(10) // OK + if continue + } { + println(i) + } + +package foo.test.i17117: + package example { + object test1 { + val test = "test" + } + + object test2 { + + import example.test1 as t1 + + val test = t1.test + } + } diff --git a/tests/neg-custom-args/fatal-warnings/i15503j.scala b/tests/neg-custom-args/fatal-warnings/i15503j.scala new file mode 100644 index 000000000000..51c1fa6fda0c --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15503j.scala @@ -0,0 +1,59 @@ +// scalac: -Wunused:strict-no-implicit-warn + +package foo.unused.strict.test: + package a: + given x: Int = 0 + implicit val y: Int = 1 + val z: Int = 2 + def f: Int = 3 + package b: + import a.given // OK + import a._ // OK + import a.* // OK + import a.x // OK + import a.y // OK + import a.z // error + import a.f // error + package c: + import a.given // OK + import a.x // OK + import a.y // OK + import a.z // OK + import a.f // OK + def g = f + z + y + x + +package foo.implicits.resolution: + class X + class Y extends X + object A { implicit val x: X = new X } + object B { implicit val y: Y = new Y } + class C { + import A._ // OK + import B._ // OK + def t = implicitly[X] + } + +package foo.unused.summon.inlines: + package lib: + trait A + trait B + trait C + trait X + + given willBeUnused: (A & X) = new A with X {} + given willBeUsed: (A & B) = new A with B {} + + package use: + import lib.{A, B, C, willBeUnused, willBeUsed} //OK + import compiletime.summonInline //OK + + transparent inline given conflictInside: C = + summonInline[A] + new {} + + transparent inline given potentialConflict: C = + summonInline[B] + new {} + + val b: B = summon[B] + val c: 
C = summon[C] \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i15893.scala b/tests/neg-custom-args/fatal-warnings/i15893.scala new file mode 100644 index 000000000000..f23e6150106a --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i15893.scala @@ -0,0 +1,61 @@ +sealed trait NatT +case class Zero() extends NatT +case class Succ[+N <: NatT](n: N) extends NatT + +type Mod2[N <: NatT] <: NatT = N match + case Zero => Zero + case Succ[Zero] => Succ[Zero] + case Succ[Succ[predPredN]] => Mod2[predPredN] + +def mod2(n: NatT): NatT = n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => mod2(predPredN) + +inline def inlineMod2(inline n: NatT): NatT = inline n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => inlineMod2(predPredN) + +transparent inline def transparentInlineMod2(inline n: NatT): NatT = inline n match + case Zero() => Zero() + case Succ(Zero()) => Succ(Zero()) + case Succ(Succ(predPredN)) => transparentInlineMod2(predPredN) + +def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match + case Zero(): Zero => Zero() // error + case Succ(Zero()): Succ[Zero] => Succ(Zero()) // error + case Succ(Succ(predPredN)): Succ[Succ[_]] => dependentlyTypedMod2(predPredN) // error + +inline def inlineDependentlyTypedMod2[N <: NatT](inline n: N): Mod2[N] = inline n match + case Zero(): Zero => Zero() // error + case Succ(Zero()): Succ[Zero] => Succ(Zero()) // error + case Succ(Succ(predPredN)): Succ[Succ[_]] => inlineDependentlyTypedMod2(predPredN) // error + +transparent inline def transparentInlineDependentlyTypedMod2[N <: NatT](inline n: N): Mod2[N] = inline n match + case Zero(): Zero => Zero() // error + case Succ(Zero()): Succ[Zero] => Succ(Zero()) // error + case Succ(Succ(predPredN)): Succ[Succ[_]] => transparentInlineDependentlyTypedMod2(predPredN) // error + +def foo(n: NatT): NatT = mod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +inline def inlineFoo(inline n: NatT): NatT = inline inlineMod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +inline def transparentInlineFoo(inline n: NatT): NatT = inline transparentInlineMod2(n) match + case Succ(Zero()) => Zero() + case _ => n + +@main def main(): Unit = + println(mod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(foo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected + println(inlineMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(inlineFoo(Succ(Succ(Succ(Zero()))))) // prints Succ(Succ(Succ(Zero()))); unexpected + println(transparentInlineMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(transparentInlineFoo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected + println(dependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // runtime error; unexpected + println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected + println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected diff --git a/tests/neg-custom-args/fatal-warnings/i16649-refutable.check b/tests/neg-custom-args/fatal-warnings/i16649-refutable.check new file mode 100644 index 000000000000..5b3d460c7f09 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16649-refutable.check @@ -0,0 +1,8 @@ +-- Error: tests/neg-custom-args/fatal-warnings/i16649-refutable.scala:4:6 ---------------------------------------------- +4 | val '{ ($y: Int) + ($z: Int) } = x // 
error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | pattern binding uses refutable extractor `'{...}` + | + | If this usage is intentional, this can be communicated by adding `: @unchecked` after the expression, + | which may result in a MatchError at runtime. + | This patch can be rewritten automatically under -rewrite -source 3.2-migration. diff --git a/tests/neg-custom-args/fatal-warnings/i16649-refutable.scala b/tests/neg-custom-args/fatal-warnings/i16649-refutable.scala new file mode 100644 index 000000000000..2a42f652e093 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16649-refutable.scala @@ -0,0 +1,4 @@ +import quoted.* + +def foo(using Quotes)(x: Expr[Int]) = + val '{ ($y: Int) + ($z: Int) } = x // error diff --git a/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala b/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala new file mode 100644 index 000000000000..2823de1f72c5 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16876/Macro.scala @@ -0,0 +1,23 @@ +import scala.quoted.* + +def findMethodSymbol(using q: Quotes)(s: quotes.reflect.Symbol): quotes.reflect.Symbol = + if s.isDefDef then + s + else + findMethodSymbol(using q)(s.owner) +end findMethodSymbol + + +inline def adder: Int = ${ + adderImpl +} + +def adderImpl(using q: Quotes): Expr[Int] = + import quotes.reflect.* + + val inputs = findMethodSymbol(using q)(q.reflect.Symbol.spliceOwner).tree match + case DefDef(_, params, _, _) => + params.last match + case TermParamClause(valDefs) => + valDefs.map(vd => Ref(vd.symbol).asExprOf[Int]) + inputs.reduce((exp1, exp2) => '{ $exp1 + $exp2 }) \ No newline at end of file diff --git a/tests/neg-custom-args/fatal-warnings/i16876/Test.scala b/tests/neg-custom-args/fatal-warnings/i16876/Test.scala new file mode 100644 index 000000000000..d9229d31cd6d --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16876/Test.scala @@ -0,0 +1,11 @@ +// scalac: -Wunused:all + +object Foo { + private def myMethod(a: Int, b: Int, c: Int) = adder // ok + myMethod(1, 2, 3) + + private def myMethodFailing(a: Int, b: Int, c: Int) = a + 0 // error // error + myMethodFailing(1, 2, 3) +} + + diff --git a/tests/neg-custom-args/fatal-warnings/i16930.scala b/tests/neg-custom-args/fatal-warnings/i16930.scala new file mode 100644 index 000000000000..1f6c5bf1a09f --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i16930.scala @@ -0,0 +1,22 @@ +// scalac: -Wunused:imports + +trait Outer: + trait Used + trait Unused + +object Test { + val outer: Outer = ??? + import outer.{Used, Unused} // error + def foo(x: Any): Used = x.asInstanceOf[Used] +} + +trait Outer1: + trait UnusedToo1 + trait Unused1 + def unusedToo1: UnusedToo1 + +object Test1 { + val outer1: Outer1 = ??? + import outer1.{Unused1, UnusedToo1} // error // error + def foo() = outer1.unusedToo1 // in this case UnusedToo1 is not used explicitly, only inferred +} diff --git a/tests/neg-custom-args/fatal-warnings/i17314b.scala b/tests/neg-custom-args/fatal-warnings/i17314b.scala new file mode 100644 index 000000000000..384767765cf4 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i17314b.scala @@ -0,0 +1,14 @@ +// scalac: -Wunused:all + +package foo: + class Foo[T] + given Foo[Int] = new Foo[Int] + + +package bar: + import foo.{given foo.Foo[Int]} // error + import foo.Foo + + given Foo[Int] = ??? 
+ + val repro: Foo[Int] = summon[Foo[Int]] diff --git a/tests/neg-custom-args/fatal-warnings/i17335.scala b/tests/neg-custom-args/fatal-warnings/i17335.scala new file mode 100644 index 000000000000..6629e2f151c9 --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/i17335.scala @@ -0,0 +1,4 @@ +// scalac: -Wunused:all + +def aMethod() = + doStuff { (x) => x } // error diff --git a/tests/neg-custom-args/fatal-warnings/inline-givens.scala b/tests/neg-custom-args/fatal-warnings/inline-givens.scala new file mode 100644 index 000000000000..eae50bca45cf --- /dev/null +++ b/tests/neg-custom-args/fatal-warnings/inline-givens.scala @@ -0,0 +1,15 @@ + +class Item(x: String) + +inline given a: Conversion[String, Item] = + Item(_) // error + +inline given b: Conversion[String, Item] = + (x => Item(x)) // error + +inline given c: Conversion[String, Item] = + { x => Item(x) } // error + +inline given d: Conversion[String, Item] with + def apply(x: String) = Item(x) // ok + diff --git a/tests/neg-custom-args/feature/convertible.scala b/tests/neg-custom-args/feature/convertible.scala new file mode 100644 index 000000000000..1b9e1c79f011 --- /dev/null +++ b/tests/neg-custom-args/feature/convertible.scala @@ -0,0 +1,29 @@ +import language.experimental.into + +class Text(val str: String) + +object Test: + + given Conversion[String, Text] = Text(_) + + def f(x: Text, y: => Text, zs: Text*) = + println(s"${x.str} ${y.str} ${zs.map(_.str).mkString(" ")}") + + f("abc", "def") // error // error + f("abc", "def", "xyz", "uvw") // error // error // error // error + f("abc", "def", "xyz", Text("uvw")) // error // error // error + + def g(x: into Text) = + println(x.str) + + + g("abc") // OK + val gg = g + gg("abc") // straight eta expansion is also OK + + def h1[X](x: X)(y: X): Unit = () + + def h(x: into Text) = + val y = h1(x) + y("abc") // error, inference through type variable does not propagate + diff --git a/tests/neg-custom-args/feature-shadowing.scala b/tests/neg-custom-args/feature/feature-shadowing.scala similarity index 100% rename from tests/neg-custom-args/feature-shadowing.scala rename to tests/neg-custom-args/feature/feature-shadowing.scala diff --git a/tests/neg-custom-args/i13946/BadPrinter.scala b/tests/neg-custom-args/feature/i13946/BadPrinter.scala similarity index 100% rename from tests/neg-custom-args/i13946/BadPrinter.scala rename to tests/neg-custom-args/feature/i13946/BadPrinter.scala diff --git a/tests/neg-custom-args/i13946/Printer.scala b/tests/neg-custom-args/feature/i13946/Printer.scala similarity index 100% rename from tests/neg-custom-args/i13946/Printer.scala rename to tests/neg-custom-args/feature/i13946/Printer.scala diff --git a/tests/neg-custom-args/impl-conv/A.scala b/tests/neg-custom-args/feature/impl-conv/A.scala similarity index 100% rename from tests/neg-custom-args/impl-conv/A.scala rename to tests/neg-custom-args/feature/impl-conv/A.scala diff --git a/tests/neg-custom-args/impl-conv/B.scala b/tests/neg-custom-args/feature/impl-conv/B.scala similarity index 100% rename from tests/neg-custom-args/impl-conv/B.scala rename to tests/neg-custom-args/feature/impl-conv/B.scala diff --git a/tests/neg-custom-args/implicit-conversions-old.scala b/tests/neg-custom-args/feature/implicit-conversions-old.scala similarity index 100% rename from tests/neg-custom-args/implicit-conversions-old.scala rename to tests/neg-custom-args/feature/implicit-conversions-old.scala diff --git a/tests/neg-custom-args/implicit-conversions.scala b/tests/neg-custom-args/feature/implicit-conversions.scala 
similarity index 100% rename from tests/neg-custom-args/implicit-conversions.scala rename to tests/neg-custom-args/feature/implicit-conversions.scala diff --git a/tests/neg-custom-args/i10994.check b/tests/neg-custom-args/i10994.check new file mode 100644 index 000000000000..c540a04657c3 --- /dev/null +++ b/tests/neg-custom-args/i10994.check @@ -0,0 +1,7 @@ +-- Error: tests/neg-custom-args/i10994.scala:2:19 ---------------------------------------------------------------------- +2 | case (b: Boolean): Boolean => () // error + | ^ + | Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + | are no longer supported. Remove the type ascription or move it to a separate variable pattern. diff --git a/tests/neg-custom-args/i10994.scala b/tests/neg-custom-args/i10994.scala new file mode 100644 index 000000000000..65695ccf4352 --- /dev/null +++ b/tests/neg-custom-args/i10994.scala @@ -0,0 +1,2 @@ +def foo = true match + case (b: Boolean): Boolean => () // error diff --git a/tests/neg-custom-args/i13838.check b/tests/neg-custom-args/i13838.check index 2c93e4001461..5e62779f3238 100644 --- a/tests/neg-custom-args/i13838.check +++ b/tests/neg-custom-args/i13838.check @@ -1,15 +1,15 @@ --- Error: tests/neg-custom-args/i13838.scala:10:5 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg-custom-args/i13838.scala:10:5 ---------------------------------------------------------- 10 | foo // error | ^ - |No given instance of type Order[X] was found for parameter x$1 of method foo in object FooT - | - |where: X is a type variable - |. + |No given instance of type Order[X] was found for parameter x$1 of method foo in object FooT. |I found: | | FooT.OrderFFooA[F, A](FooT.OrderFFooA[F, A](/* missing */summon[Order[F[Foo[A]]]])) | - |But given instance OrderFFooA in object FooT produces a diverging implicit search when trying to match type Order[F[Foo[A]]]. + |But given instance OrderFFooA in object FooT produces a diverging implicit search when trying to match type Order[F[Foo[A]]] + | + |where: X is a type variable + |. -- [E168] Type Warning: tests/neg-custom-args/i13838.scala:10:5 -------------------------------------------------------- 10 | foo // error | ^ diff --git a/tests/neg-custom-args/i16601a.check b/tests/neg-custom-args/i16601a.check new file mode 100644 index 000000000000..604f71993ada --- /dev/null +++ b/tests/neg-custom-args/i16601a.check @@ -0,0 +1,18 @@ +-- [E042] Type Error: tests/neg-custom-args/i16601a.scala:1:27 --------------------------------------------------------- +1 |@main def Test: Unit = new concurrent.ExecutionContext // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | ExecutionContext is a trait; it cannot be instantiated + |--------------------------------------------------------------------------------------------------------------------- + | Explanation (enabled by `-explain`) + |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + | Abstract classes and traits need to be extended by a concrete class or object + | to make their functionality accessible. + | + | You may want to create an anonymous class extending ExecutionContext with + | class ExecutionContext { } + | + | or add a companion object with + | object ExecutionContext extends ExecutionContext + | + | You need to implement any abstract members in both cases. 
+ --------------------------------------------------------------------------------------------------------------------- diff --git a/tests/neg-custom-args/i16601a.scala b/tests/neg-custom-args/i16601a.scala new file mode 100644 index 000000000000..2e058db0093c --- /dev/null +++ b/tests/neg-custom-args/i16601a.scala @@ -0,0 +1 @@ +@main def Test: Unit = new concurrent.ExecutionContext // error \ No newline at end of file diff --git a/tests/neg-custom-args/i4060.scala b/tests/neg-custom-args/i4060.scala deleted file mode 100644 index 3d5c180b5d7b..000000000000 --- a/tests/neg-custom-args/i4060.scala +++ /dev/null @@ -1,22 +0,0 @@ -class X { type R } -class T(erased val a: X)(val value: a.R) - -object App { - def coerce[U, V](u: U): V = { - trait X { type R >: U } - trait Y { type R = V } - - class T[A <: X](erased val a: A)(val value: a.R) // error - - object O { lazy val x : Y & X = ??? } - - val a = new T[Y & X](O.x)(u) - a.value - } - - def main(args: Array[String]): Unit = { - val x: Int = coerce[String, Int]("a") - println(x + 1) - - } -} diff --git a/tests/neg-custom-args/no-experimental/experimental-nested-imports-2.scala b/tests/neg-custom-args/no-experimental/experimental-nested-imports-2.scala index 85076cca723a..a4962c6153a0 100644 --- a/tests/neg-custom-args/no-experimental/experimental-nested-imports-2.scala +++ b/tests/neg-custom-args/no-experimental/experimental-nested-imports-2.scala @@ -1,7 +1,6 @@ import annotation.experimental class Class1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @@ -9,7 +8,6 @@ class Class1: def g = 1 object Object1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @@ -17,7 +15,6 @@ object Object1: def g = 1 def fun1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @@ -25,7 +22,6 @@ def fun1 = def g = 1 val value1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition diff --git a/tests/neg-custom-args/no-experimental/experimental-nested-imports-3.scala b/tests/neg-custom-args/no-experimental/experimental-nested-imports-3.scala index 1af04918b1d9..77fbe41479d2 100644 --- a/tests/neg-custom-args/no-experimental/experimental-nested-imports-3.scala +++ b/tests/neg-custom-args/no-experimental/experimental-nested-imports-3.scala @@ -1,25 +1,21 @@ import annotation.experimental class Class1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition object Object1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import 
language.experimental.erasedDefinitions // ok: only check at erased definition def fun1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition val value1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition diff --git a/tests/neg-custom-args/no-experimental/experimental-nested-imports.scala b/tests/neg-custom-args/no-experimental/experimental-nested-imports.scala index b9fc38dc4915..180c43b9f671 100644 --- a/tests/neg-custom-args/no-experimental/experimental-nested-imports.scala +++ b/tests/neg-custom-args/no-experimental/experimental-nested-imports.scala @@ -1,28 +1,24 @@ import annotation.experimental class Class1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @experimental def f = 1 object Object1: - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @experimental def f = 1 def fun1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @experimental def f = 1 val value1 = - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition diff --git a/tests/neg-custom-args/no-experimental/experimental-package-imports.scala b/tests/neg-custom-args/no-experimental/experimental-package-imports.scala index 90ec387b1036..047b3eb61e82 100644 --- a/tests/neg-custom-args/no-experimental/experimental-package-imports.scala +++ b/tests/neg-custom-args/no-experimental/experimental-package-imports.scala @@ -1,7 +1,6 @@ import annotation.experimental package foo { - import language.experimental.fewerBraces // error import language.experimental.namedTypeArguments // error import language.experimental.genericNumberLiterals // error import language.experimental.erasedDefinitions // ok: only check at erased definition @@ -13,7 +12,6 @@ package foo { package foo2 { // ok: all definitions are top-level @experimental - import language.experimental.fewerBraces import language.experimental.namedTypeArguments import language.experimental.genericNumberLiterals import language.experimental.erasedDefinitions diff --git a/tests/neg-custom-args/no-experimental/experimentalInheritance.scala b/tests/neg-custom-args/no-experimental/experimentalInheritance.scala deleted file mode 100644 index f6eab1224310..000000000000 --- a/tests/neg-custom-args/no-experimental/experimentalInheritance.scala +++ /dev/null @@ -1,14 +0,0 @@ -import scala.annotation.experimental - -@experimental def x = 2 - -@experimental class A1(x: Any) -class A2(x: 
Any) - - -@experimental class B1 extends A1(1) -class B2 // error: extension of experimental class A1 must have @experimental annotation -extends A1(1) // error: class A1 is marked @experimental ... - -@experimental class C1 extends A2(x) -class C2 extends A2(x) // error def x is marked @experimental and therefore diff --git a/tests/neg-custom-args/no-experimental/experimentalInline.scala b/tests/neg-custom-args/no-experimental/experimentalInline.scala index 8827fd42e36a..eb49bf15d11a 100644 --- a/tests/neg-custom-args/no-experimental/experimentalInline.scala +++ b/tests/neg-custom-args/no-experimental/experimentalInline.scala @@ -4,5 +4,5 @@ import scala.annotation.experimental inline def g() = () def test: Unit = - g() // errors + g() // error () diff --git a/tests/neg-custom-args/no-experimental/experimentalInline2.scala b/tests/neg-custom-args/no-experimental/experimentalInline2.scala new file mode 100644 index 000000000000..c40eb050a832 --- /dev/null +++ b/tests/neg-custom-args/no-experimental/experimentalInline2.scala @@ -0,0 +1,8 @@ +import scala.annotation.experimental + +@experimental +transparent inline def g() = () + +def test: Unit = + g() // error + () diff --git a/tests/neg-macros/annot-MacroAnnotation-direct.check b/tests/neg-macros/annot-MacroAnnotation-direct.check new file mode 100644 index 000000000000..580b2bcc7639 --- /dev/null +++ b/tests/neg-macros/annot-MacroAnnotation-direct.check @@ -0,0 +1,6 @@ +-- [E042] Type Error: tests/neg-macros/annot-MacroAnnotation-direct.scala:3:0 ------------------------------------------ +3 |@MacroAnnotation // error + |^^^^^^^^^^^^^^^^ + |MacroAnnotation is a trait; it cannot be instantiated + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg-macros/annot-MacroAnnotation-direct.scala b/tests/neg-macros/annot-MacroAnnotation-direct.scala new file mode 100644 index 000000000000..a0024457dc48 --- /dev/null +++ b/tests/neg-macros/annot-MacroAnnotation-direct.scala @@ -0,0 +1,4 @@ +import scala.annotation.MacroAnnotation + +@MacroAnnotation // error +def test = () diff --git a/tests/neg-macros/annot-accessIndirect/Macro_1.scala b/tests/neg-macros/annot-accessIndirect/Macro_1.scala new file mode 100644 index 000000000000..8679edcfc0c3 --- /dev/null +++ b/tests/neg-macros/annot-accessIndirect/Macro_1.scala @@ -0,0 +1,11 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class hello extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + val helloSymbol = Symbol.newVal(Symbol.spliceOwner, Symbol.freshName("hello"), TypeRepr.of[String], Flags.EmptyFlags, Symbol.noSymbol) + val helloVal = ValDef(helloSymbol, Some(Literal(StringConstant("Hello, World!")))) + List(helloVal, tree) +} diff --git a/tests/neg-macros/annot-accessIndirect/Macro_2.scala b/tests/neg-macros/annot-accessIndirect/Macro_2.scala new file mode 100644 index 000000000000..d069175ce166 --- /dev/null +++ b/tests/neg-macros/annot-accessIndirect/Macro_2.scala @@ -0,0 +1,18 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class foo extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + val s = '{@hello def foo1(x: Int): Int = x + 1;()}.asTerm + val fooDef = s.asInstanceOf[Inlined].body.asInstanceOf[Block].statements.head.asInstanceOf[DefDef] + val hello = 
Ref(Symbol.spliceOwner.declaredFields("hello").head).asExprOf[String] // error + tree match + case DefDef(name, params, tpt, Some(t)) => + val rhs = '{ + ${t.asExprOf[String]} + $hello + }.asTerm + val newDef = DefDef.copy(tree)(name, params, tpt, Some(rhs)) + List(fooDef, newDef) +} diff --git a/tests/neg-macros/annot-accessIndirect/Test.scala b/tests/neg-macros/annot-accessIndirect/Test.scala new file mode 100644 index 000000000000..6e2bbd3d3361 --- /dev/null +++ b/tests/neg-macros/annot-accessIndirect/Test.scala @@ -0,0 +1,3 @@ +class Bar: + @foo def bar(x: String): String = x // error + bar("a") diff --git a/tests/neg-macros/annot-crash.check b/tests/neg-macros/annot-crash.check new file mode 100644 index 000000000000..16eb0f68bc44 --- /dev/null +++ b/tests/neg-macros/annot-crash.check @@ -0,0 +1,8 @@ + +-- Error: tests/neg-macros/annot-crash/Test_2.scala:1:0 ---------------------------------------------------------------- +1 |@crash // error + |^^^^^^ + |Failed to evaluate macro. + | Caused by class scala.NotImplementedError: an implementation is missing + | scala.Predef$.$qmark$qmark$qmark(Predef.scala:344) + | crash.transform(Macro_1.scala:7) diff --git a/tests/neg-macros/annot-crash/Macro_1.scala b/tests/neg-macros/annot-crash/Macro_1.scala new file mode 100644 index 000000000000..f3d5b3f602f8 --- /dev/null +++ b/tests/neg-macros/annot-crash/Macro_1.scala @@ -0,0 +1,8 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class crash extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + ??? +} diff --git a/tests/neg-macros/annot-crash/Test_2.scala b/tests/neg-macros/annot-crash/Test_2.scala new file mode 100644 index 000000000000..3e8fd3cf785f --- /dev/null +++ b/tests/neg-macros/annot-crash/Test_2.scala @@ -0,0 +1,2 @@ +@crash // error +def test = () diff --git a/tests/neg-macros/annot-empty-result.check b/tests/neg-macros/annot-empty-result.check new file mode 100644 index 000000000000..6d43c19664cb --- /dev/null +++ b/tests/neg-macros/annot-empty-result.check @@ -0,0 +1,13 @@ + +-- Error: tests/neg-macros/annot-empty-result/Test_2.scala:5:2 --------------------------------------------------------- +5 | @nilAnnot // error + | ^^^^^^^^^ + | Unexpected `Nil` returned by `(new nilAnnot()).transform(..)` during macro expansion +-- Error: tests/neg-macros/annot-empty-result/Test_2.scala:9:4 --------------------------------------------------------- +9 | @nilAnnot // error + | ^^^^^^^^^ + | Unexpected `Nil` returned by `(new nilAnnot()).transform(..)` during macro expansion +-- Error: tests/neg-macros/annot-empty-result/Test_2.scala:1:0 --------------------------------------------------------- +1 |@nilAnnot // error + |^^^^^^^^^ + |Unexpected `Nil` returned by `(new nilAnnot()).transform(..)` during macro expansion diff --git a/tests/neg-macros/annot-empty-result/Macro_1.scala b/tests/neg-macros/annot-empty-result/Macro_1.scala new file mode 100644 index 000000000000..ff3be61c05d2 --- /dev/null +++ b/tests/neg-macros/annot-empty-result/Macro_1.scala @@ -0,0 +1,8 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class nilAnnot extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + Nil +} diff --git a/tests/neg-macros/annot-empty-result/Test_2.scala b/tests/neg-macros/annot-empty-result/Test_2.scala new file mode 100644 index 000000000000..84beeafecc24 
--- /dev/null +++ b/tests/neg-macros/annot-empty-result/Test_2.scala @@ -0,0 +1,11 @@ +@nilAnnot // error +def f1 = 1 + +class B: + @nilAnnot // error + def f2 = 2 + + def test = + @nilAnnot // error + def f3 = 2 + () diff --git a/tests/neg-macros/annot-error-annot.check b/tests/neg-macros/annot-error-annot.check new file mode 100644 index 000000000000..f150b4561e2c --- /dev/null +++ b/tests/neg-macros/annot-error-annot.check @@ -0,0 +1,127 @@ + +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:17:6 --------------------------------------------------------- +16 |@error +17 |class cGlobal // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:20:7 --------------------------------------------------------- +19 |@error +20 |object oGlobal // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:24:6 --------------------------------------------------------- +23 | @error +24 | val vMember: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:26:11 -------------------------------------------------------- +25 | @error +26 | lazy val lvMember: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:28:6 --------------------------------------------------------- +27 | @error +28 | def dMember: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:30:8 --------------------------------------------------------- +29 | @error +30 | given gMember: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:32:8 --------------------------------------------------------- +31 | @error +32 | given gMember2: Num[Int] with // error + | ^ + | MACRO ERROR +33 | def zero = 0 +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:35:8 --------------------------------------------------------- +34 | @error +35 | given gMember3(using DummyImplicit): Num[Int] with // error + | ^ + | MACRO ERROR +36 | def zero = 0 +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:39:8 --------------------------------------------------------- +38 | @error +39 | class cMember // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:42:9 --------------------------------------------------------- +41 | @error +42 | object oMember // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:46:8 --------------------------------------------------------- +45 | @error +46 | val vLocal: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:48:13 -------------------------------------------------------- +47 | @error +48 | lazy val lvLocal: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:50:8 --------------------------------------------------------- +49 | @error +50 | def dLocal: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:52:10 -------------------------------------------------------- +51 | @error +52 | given gLocal: Int = 1 // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:54:10 -------------------------------------------------------- +53 | @error +54 | given gLocal2: Num[Int] with // error + | ^ + | MACRO ERROR +55 | def zero = 0 +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:57:10 -------------------------------------------------------- +56 | 
@error +57 | given gLocal3(using DummyImplicit): Num[Int] with // error + | ^ + | MACRO ERROR +58 | def zero = 0 +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:61:10 -------------------------------------------------------- +60 | @error +61 | class cLocal // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:63:11 -------------------------------------------------------- +62 | @error +63 | object oLocal // error + | ^ + | MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:2:4 ---------------------------------------------------------- +1 |@error +2 |val vGlobal: Int = 1 // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:4:9 ---------------------------------------------------------- +3 |@error +4 |lazy val lvGlobal: Int = 1 // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:6:4 ---------------------------------------------------------- +5 |@error +6 |def dGlobal: Int = 1 // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:8:6 ---------------------------------------------------------- +7 |@error +8 |given gGlobal: Int = 1 // error + |^ + |MACRO ERROR +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:10:6 --------------------------------------------------------- + 9 |@error +10 |given gGlobal2: Num[Int] with // error + |^ + |MACRO ERROR +11 | def zero = 0 +-- Error: tests/neg-macros/annot-error-annot/Test_2.scala:13:6 --------------------------------------------------------- +12 |@error +13 |given gGlobal3(using DummyImplicit): Num[Int] with // error + |^ + |MACRO ERROR +14 | def zero = 0 diff --git a/tests/neg-macros/annot-error-annot/Macro_1.scala b/tests/neg-macros/annot-error-annot/Macro_1.scala new file mode 100644 index 000000000000..d54b69903e02 --- /dev/null +++ b/tests/neg-macros/annot-error-annot/Macro_1.scala @@ -0,0 +1,9 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class error extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + quotes.reflect.report.error("MACRO ERROR", tree.pos) + List(tree) +} diff --git a/tests/neg-macros/annot-error-annot/Test_2.scala b/tests/neg-macros/annot-error-annot/Test_2.scala new file mode 100644 index 000000000000..3325ba431127 --- /dev/null +++ b/tests/neg-macros/annot-error-annot/Test_2.scala @@ -0,0 +1,67 @@ +@error +val vGlobal: Int = 1 // error +@error +lazy val lvGlobal: Int = 1 // error +@error +def dGlobal: Int = 1 // error +@error +given gGlobal: Int = 1 // error +@error +given gGlobal2: Num[Int] with // error + def zero = 0 +@error +given gGlobal3(using DummyImplicit): Num[Int] with // error + def zero = 0 + +@error +class cGlobal // error + +@error +object oGlobal // error + +class B: + @error + val vMember: Int = 1 // error + @error + lazy val lvMember: Int = 1 // error + @error + def dMember: Int = 1 // error + @error + given gMember: Int = 1 // error + @error + given gMember2: Num[Int] with // error + def zero = 0 + @error + given gMember3(using DummyImplicit): Num[Int] with // error + def zero = 0 + + @error + class cMember // error + + @error + object oMember // error + + def locals: Unit = + @error + val vLocal: Int = 1 // error + @error + lazy val lvLocal: Int = 1 // error + @error + def dLocal: Int = 1 // error + @error + given gLocal: Int = 1 // error + @error + given gLocal2: Num[Int] with // error + def zero = 0 + @error 
+ given gLocal3(using DummyImplicit): Num[Int] with // error + def zero = 0 + + @error + class cLocal // error + @error + object oLocal // error + () + +trait Num[T]: + def zero: T diff --git a/tests/neg-macros/annot-ill-abort.check b/tests/neg-macros/annot-ill-abort.check new file mode 100644 index 000000000000..b969b3ad4313 --- /dev/null +++ b/tests/neg-macros/annot-ill-abort.check @@ -0,0 +1,5 @@ + +-- Error: tests/neg-macros/annot-ill-abort/Test_2.scala:1:0 ------------------------------------------------------------ +1 |@crash // error + |^^^^^^ + |Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users when aborting a macro expansion with StopMacroExpansion. diff --git a/tests/neg-macros/annot-ill-abort/Macro_1.scala b/tests/neg-macros/annot-ill-abort/Macro_1.scala new file mode 100644 index 000000000000..446ce0a5331b --- /dev/null +++ b/tests/neg-macros/annot-ill-abort/Macro_1.scala @@ -0,0 +1,8 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class crash extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + throw new scala.quoted.runtime.StopMacroExpansion +} diff --git a/tests/neg-macros/annot-ill-abort/Test_2.scala b/tests/neg-macros/annot-ill-abort/Test_2.scala new file mode 100644 index 000000000000..3e8fd3cf785f --- /dev/null +++ b/tests/neg-macros/annot-ill-abort/Test_2.scala @@ -0,0 +1,2 @@ +@crash // error +def test = () diff --git a/tests/neg-macros/annot-mod-class-add-top-method.check b/tests/neg-macros/annot-mod-class-add-top-method.check new file mode 100644 index 000000000000..28fb93bb29db --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-method.check @@ -0,0 +1,9 @@ + +-- Error: tests/neg-macros/annot-mod-class-add-top-method/Test_2.scala:1:0 --------------------------------------------- +1 |@addTopLevelMethod // error + |^^^^^^^^^^^^^^^^^^ + |macro annotation can not add top-level method. @addTopLevelMethod tried to add method toLevelMethod$macro$1. +-- Error: tests/neg-macros/annot-mod-class-add-top-method/Test_2.scala:4:0 --------------------------------------------- +4 |@addTopLevelMethod // error + |^^^^^^^^^^^^^^^^^^ + |macro annotation can not add top-level method. @addTopLevelMethod tried to add method toLevelMethod$macro$2. 
diff --git a/tests/neg-macros/annot-mod-class-add-top-method/Macro_1.scala b/tests/neg-macros/annot-mod-class-add-top-method/Macro_1.scala new file mode 100644 index 000000000000..b5c49695ad2a --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-method/Macro_1.scala @@ -0,0 +1,17 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ +import scala.collection.mutable + +@experimental +class addTopLevelMethod extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + tree match + case ClassDef(name, ctr, parents, self, body) => + val methType = MethodType(Nil)(_ => Nil, _ => TypeRepr.of[Int]) + val methSym = Symbol.newMethod(Symbol.spliceOwner, Symbol.freshName("toLevelMethod"), methType, Flags.EmptyFlags, Symbol.noSymbol) + val methDef = DefDef(methSym, _ => Some(Literal(IntConstant(1)))) + List(methDef, tree) + case _ => + report.error("Annotation only supports `class`") + List(tree) diff --git a/tests/neg-macros/annot-mod-class-add-top-method/Test_2.scala b/tests/neg-macros/annot-mod-class-add-top-method/Test_2.scala new file mode 100644 index 000000000000..eadeff0f060c --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-method/Test_2.scala @@ -0,0 +1,5 @@ +@addTopLevelMethod // error +class Foo + +@addTopLevelMethod // error +object Foo diff --git a/tests/neg-macros/annot-mod-class-add-top-val.check b/tests/neg-macros/annot-mod-class-add-top-val.check new file mode 100644 index 000000000000..bc21173923f1 --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-val.check @@ -0,0 +1,9 @@ + +-- Error: tests/neg-macros/annot-mod-class-add-top-val/Test_2.scala:1:0 ------------------------------------------------ +1 |@addTopLevelVal // error + |^^^^^^^^^^^^^^^ + |macro annotation can not add top-level value. @addTopLevelVal tried to add value toLevelVal$macro$1. +-- Error: tests/neg-macros/annot-mod-class-add-top-val/Test_2.scala:4:0 ------------------------------------------------ +4 |@addTopLevelVal // error + |^^^^^^^^^^^^^^^ + |macro annotation can not add top-level value. @addTopLevelVal tried to add value toLevelVal$macro$2. 
diff --git a/tests/neg-macros/annot-mod-class-add-top-val/Macro_1.scala b/tests/neg-macros/annot-mod-class-add-top-val/Macro_1.scala new file mode 100644 index 000000000000..c6f21e181879 --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-val/Macro_1.scala @@ -0,0 +1,16 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ +import scala.collection.mutable + +@experimental +class addTopLevelVal extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + tree match + case ClassDef(name, ctr, parents, self, body) => + val valSym = Symbol.newVal(Symbol.spliceOwner, Symbol.freshName("toLevelVal"), TypeRepr.of[Int], Flags.EmptyFlags, Symbol.noSymbol) + val valDef = ValDef(valSym, Some(Literal(IntConstant(1)))) + List(valDef, tree) + case _ => + report.error("Annotation only supports `class`") + List(tree) diff --git a/tests/neg-macros/annot-mod-class-add-top-val/Test_2.scala b/tests/neg-macros/annot-mod-class-add-top-val/Test_2.scala new file mode 100644 index 000000000000..440e90bc1652 --- /dev/null +++ b/tests/neg-macros/annot-mod-class-add-top-val/Test_2.scala @@ -0,0 +1,5 @@ +@addTopLevelVal // error +class Foo + +@addTopLevelVal // error +object Foo diff --git a/tests/neg-macros/annot-mod-top-method-add-top-method/Macro_1.scala b/tests/neg-macros/annot-mod-top-method-add-top-method/Macro_1.scala new file mode 100644 index 000000000000..45679b65c03b --- /dev/null +++ b/tests/neg-macros/annot-mod-top-method-add-top-method/Macro_1.scala @@ -0,0 +1,13 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ +import scala.collection.mutable + +@experimental +// Assumes annotation is on top level def or val +class addTopLevelMethodOutsidePackageObject extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + val methType = MethodType(Nil)(_ => Nil, _ => TypeRepr.of[Int]) + val methSym = Symbol.newMethod(Symbol.spliceOwner.owner, Symbol.freshName("toLevelMethod"), methType, Flags.EmptyFlags, Symbol.noSymbol) + val methDef = DefDef(methSym, _ => Some(Literal(IntConstant(1)))) + List(methDef, tree) diff --git a/tests/neg-macros/annot-mod-top-method-add-top-method/Test_2.scala b/tests/neg-macros/annot-mod-top-method-add-top-method/Test_2.scala new file mode 100644 index 000000000000..151b722a0dda --- /dev/null +++ b/tests/neg-macros/annot-mod-top-method-add-top-method/Test_2.scala @@ -0,0 +1,5 @@ +@addTopLevelMethodOutsidePackageObject // error +def foo = 1 + +@addTopLevelMethodOutsidePackageObject // error +val bar = 1 diff --git a/tests/neg-macros/annot-mod-top-method-add-top-val/Macro_1.scala b/tests/neg-macros/annot-mod-top-method-add-top-val/Macro_1.scala new file mode 100644 index 000000000000..c6c4c32afcb8 --- /dev/null +++ b/tests/neg-macros/annot-mod-top-method-add-top-val/Macro_1.scala @@ -0,0 +1,12 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ +import scala.collection.mutable + +@experimental +// Assumes annotation is on top level def or val +class addTopLevelValOutsidePackageObject extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + val valSym = Symbol.newVal(Symbol.spliceOwner.owner, Symbol.freshName("toLevelVal"), TypeRepr.of[Int], Flags.EmptyFlags, Symbol.noSymbol) + val valDef = ValDef(valSym, 
Some(Literal(IntConstant(1)))) + List(valDef, tree) diff --git a/tests/neg-macros/annot-mod-top-method-add-top-val/Test_2.scala b/tests/neg-macros/annot-mod-top-method-add-top-val/Test_2.scala new file mode 100644 index 000000000000..076a636267ab --- /dev/null +++ b/tests/neg-macros/annot-mod-top-method-add-top-val/Test_2.scala @@ -0,0 +1,5 @@ +@addTopLevelValOutsidePackageObject // error +def foo = 1 + +@addTopLevelValOutsidePackageObject // error +val bar = 1 diff --git a/tests/neg-macros/annot-nested.scala b/tests/neg-macros/annot-nested.scala new file mode 100644 index 000000000000..4365e41eefff --- /dev/null +++ b/tests/neg-macros/annot-nested.scala @@ -0,0 +1,42 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +class Foo: + @experimental + class void extends MacroAnnotation: // error + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) + + object Bar: + @experimental + class void extends MacroAnnotation: // error + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) + +class Foo2: + @experimental + trait void extends MacroAnnotation // error + + object Bar: + @experimental + trait void extends MacroAnnotation // error + +def test: Unit = + @experimental + class void extends MacroAnnotation: // error + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) + + trait void2 extends MacroAnnotation // error + + new MacroAnnotation {} // error + + () + +val test2: Unit = + @experimental + class void extends MacroAnnotation: // error + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) + + trait void2 extends MacroAnnotation // error + + new MacroAnnotation {} // error + + () diff --git a/tests/neg-macros/annot-on-type.check b/tests/neg-macros/annot-on-type.check new file mode 100644 index 000000000000..3844c3eeebe9 --- /dev/null +++ b/tests/neg-macros/annot-on-type.check @@ -0,0 +1,16 @@ + +-- Error: tests/neg-macros/annot-on-type/Test_2.scala:6:7 -------------------------------------------------------------- +5 | @voidAnnot +6 | type C // error + | ^ + | macro annotations are not supported on type +-- Error: tests/neg-macros/annot-on-type/Test_2.scala:10:9 ------------------------------------------------------------- + 9 | @voidAnnot +10 | type D // error + | ^ + | macro annotations are not supported on type +-- Error: tests/neg-macros/annot-on-type/Test_2.scala:2:5 -------------------------------------------------------------- +1 |@voidAnnot +2 |type A // error + |^ + |macro annotations are not supported on type diff --git a/tests/neg-macros/annot-on-type/Macro_1.scala b/tests/neg-macros/annot-on-type/Macro_1.scala new file mode 100644 index 000000000000..7468c5a200a6 --- /dev/null +++ b/tests/neg-macros/annot-on-type/Macro_1.scala @@ -0,0 +1,8 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class voidAnnot extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + List(tree) +} diff --git a/tests/neg-macros/annot-on-type/Test_2.scala b/tests/neg-macros/annot-on-type/Test_2.scala new file mode 100644 index 000000000000..4dfe1cc76d42 --- /dev/null +++ b/tests/neg-macros/annot-on-type/Test_2.scala @@ -0,0 +1,11 @@ +@voidAnnot +type A // error + +object B: + @voidAnnot + type C // error + + def test = + @voidAnnot + type 
D // error + () diff --git a/tests/neg-macros/annot-result-owner.check b/tests/neg-macros/annot-result-owner.check new file mode 100644 index 000000000000..5d67be058fdf --- /dev/null +++ b/tests/neg-macros/annot-result-owner.check @@ -0,0 +1,9 @@ + +-- Error: tests/neg-macros/annot-result-owner/Test_2.scala:1:0 --------------------------------------------------------- +1 |@insertVal // error + |^^^^^^^^^^ + |macro annotation @insertVal added value definitionWithWrongOwner$macro$1 with an inconsistent owner. Expected it to be owned by package object Test_2$package but was owned by method foo. +-- Error: tests/neg-macros/annot-result-owner/Test_2.scala:5:2 --------------------------------------------------------- +5 | @insertVal // error + | ^^^^^^^^^^ + |macro annotation @insertVal added value definitionWithWrongOwner$macro$2 with an inconsistent owner. Expected it to be owned by method bar but was owned by method foo. diff --git a/tests/neg-macros/annot-result-owner/Macro_1.scala b/tests/neg-macros/annot-result-owner/Macro_1.scala new file mode 100644 index 000000000000..34f7541f726b --- /dev/null +++ b/tests/neg-macros/annot-result-owner/Macro_1.scala @@ -0,0 +1,11 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class insertVal extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect._ + // Use of wrong owner + val valSym = Symbol.newVal(tree.symbol, Symbol.freshName("definitionWithWrongOwner"), TypeRepr.of[Unit], Flags.Private, Symbol.noSymbol) + val valDef = ValDef(valSym, Some('{}.asTerm)) + List(valDef, tree) diff --git a/tests/neg-macros/annot-result-owner/Test_2.scala b/tests/neg-macros/annot-result-owner/Test_2.scala new file mode 100644 index 000000000000..5bcebb1ecf76 --- /dev/null +++ b/tests/neg-macros/annot-result-owner/Test_2.scala @@ -0,0 +1,6 @@ +@insertVal // error +def foo(): Unit = () + +def bar = + @insertVal // error + def foo(): Unit = () diff --git a/tests/neg-macros/annot-suspend-cycle.check b/tests/neg-macros/annot-suspend-cycle.check new file mode 100644 index 000000000000..237cbe4188b2 --- /dev/null +++ b/tests/neg-macros/annot-suspend-cycle.check @@ -0,0 +1,12 @@ +-- [E129] Potential Issue Warning: tests/neg-macros/annot-suspend-cycle/Macro.scala:7:4 -------------------------------- +7 | new Foo + | ^^^^^^^ + | A pure expression does nothing in statement position; you may be omitting necessary parentheses + | + | longer explanation available when compiling with `-explain` +Cyclic macro dependencies in tests/neg-macros/annot-suspend-cycle/Test.scala. +Compilation stopped since no further progress can be made. + +To fix this, place macros in one set of files and their callers in another. + +Compiling with -Xprint-suspension gives more information. 
diff --git a/tests/neg-macros/annot-suspend-cycle/Macro.scala b/tests/neg-macros/annot-suspend-cycle/Macro.scala new file mode 100644 index 000000000000..4143e2c32062 --- /dev/null +++ b/tests/neg-macros/annot-suspend-cycle/Macro.scala @@ -0,0 +1,9 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class cycle extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + new Foo + List(tree) +} diff --git a/tests/neg-macros/annot-suspend-cycle/Test.scala b/tests/neg-macros/annot-suspend-cycle/Test.scala new file mode 100644 index 000000000000..c1e1289742c1 --- /dev/null +++ b/tests/neg-macros/annot-suspend-cycle/Test.scala @@ -0,0 +1,5 @@ +// nopos-error +class Foo + +@cycle +def test = () diff --git a/tests/neg-macros/i16532.check b/tests/neg-macros/i16532.check new file mode 100644 index 000000000000..45dc9d07dcaf --- /dev/null +++ b/tests/neg-macros/i16532.check @@ -0,0 +1,8 @@ +-- Error: tests/neg-macros/i16532.scala:7:13 --------------------------------------------------------------------------- +7 | val x2 = recurseII($a, $b) // error + | ^^^^^^^^^ + |access to method recurseII from wrong staging level: + | - the definition is at level 0, + | - but the access is at level 1. + | + |Hint: Staged references to inline definition in quotes are only inlined after the quote is spliced into level 0 code by a macro. Try moving this inline definition in a statically accessible location such as an object (this definition can be private). diff --git a/tests/neg-macros/i16532.scala b/tests/neg-macros/i16532.scala new file mode 100644 index 000000000000..d1edfdd80088 --- /dev/null +++ b/tests/neg-macros/i16532.scala @@ -0,0 +1,9 @@ +import scala.quoted.* + +def power0Impl(a: Expr[Int], b: Expr[Int])(using Quotes): Expr[Int] = + inline def recurseII(a:Int, n:Int): Int = ??? + + '{ + val x2 = recurseII($a, $b) // error + x2 + } diff --git a/tests/neg-macros/i6436.check b/tests/neg-macros/i6436.check index 9c422fa11b99..43e93b2e64e5 100644 --- a/tests/neg-macros/i6436.check +++ b/tests/neg-macros/i6436.check @@ -1,4 +1,4 @@ --- Error: tests/neg-macros/i6436.scala:5:9 ----------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg-macros/i6436.scala:5:9 ----------------------------------------------------------------- 5 | case '{ StringContext(${Varargs(parts)}*) } => // error | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | No given instance of type scala.quoted.Quotes was found diff --git a/tests/neg-macros/i9014b.check b/tests/neg-macros/i9014b.check index 0d972e123a30..de0be2d5c1fa 100644 --- a/tests/neg-macros/i9014b.check +++ b/tests/neg-macros/i9014b.check @@ -1,5 +1,5 @@ --- Error: tests/neg-macros/i9014b/Test_2.scala:1:23 -------------------------------------------------------------------- +-- [E172] Type Error: tests/neg-macros/i9014b/Test_2.scala:1:23 -------------------------------------------------------- 1 |val tests = summon[Bar] // error | ^ | No given instance of type Bar was found for parameter x of method summon in object Predef. 
diff --git a/tests/neg-macros/ill-abort.check b/tests/neg-macros/ill-abort.check index 2f76c89d88dd..c267c2e79ecf 100644 --- a/tests/neg-macros/ill-abort.check +++ b/tests/neg-macros/ill-abort.check @@ -2,7 +2,7 @@ -- Error: tests/neg-macros/ill-abort/quoted_2.scala:1:15 --------------------------------------------------------------- 1 |def test = fail() // error | ^^^^^^ - |Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users to facilitate debugging when aborting a macro expansion. + |Macro expansion was aborted by the macro without any errors reported. Macros should issue errors to end-users when aborting a macro expansion with StopMacroExpansion. |--------------------------------------------------------------------------------------------------------------------- |Inline stack trace |- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - diff --git a/tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.check b/tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.check index 301111860aa7..7687543ea75f 100644 --- a/tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.check +++ b/tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.check @@ -13,7 +13,7 @@ -- [E170] Type Error: tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.scala:17:27 ----------------------------- 17 | val d = js.constructorOf[NativeJSClass { def bar: Int }] // error | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - | NativeJSClass{bar: Int} is not a class type + | NativeJSClass{def bar: Int} is not a class type -- Error: tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.scala:19:27 ----------------------------------------- 19 | val e = js.constructorOf[JSTrait] // error | ^^^^^^^ @@ -29,7 +29,7 @@ -- [E170] Type Error: tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.scala:23:27 ----------------------------- 23 | val h = js.constructorOf[JSClass { def bar: Int }] // error | ^^^^^^^^^^^^^^^^^^^^^^^^ - | JSClass{bar: Int} is not a class type + | JSClass{def bar: Int} is not a class type -- [E170] Type Error: tests/neg-scalajs/jsconstructorof-error-in-prepjsinterop.scala:25:42 ----------------------------- 25 | def foo[A <: js.Any] = js.constructorOf[A] // error | ^ diff --git a/tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.check b/tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.check index c4ce18b2e57c..142de318efd3 100644 --- a/tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.check +++ b/tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.check @@ -9,11 +9,11 @@ -- [E170] Type Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:16:61 ---------------------------- 16 | val c = js.constructorTag[NativeJSClass with NativeJSTrait] // error | ^ - | (NativeJSClass & NativeJSTrait) is not a class type + | NativeJSClass & NativeJSTrait is not a class type -- [E170] Type Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:17:59 ---------------------------- 17 | val d = js.constructorTag[NativeJSClass { def bar: Int }] // error | ^ - | NativeJSClass{bar: Int} is not a class type + | NativeJSClass{def bar: Int} is not a class type -- Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:19:36 ---------------------------------------- 19 | val e = js.constructorTag[JSTrait] // error | ^ @@ -25,11 +25,11 @@ -- [E170] Type Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:22:49 
---------------------------- 22 | val g = js.constructorTag[JSClass with JSTrait] // error | ^ - | (JSClass & JSTrait) is not a class type + | JSClass & JSTrait is not a class type -- [E170] Type Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:23:53 ---------------------------- 23 | val h = js.constructorTag[JSClass { def bar: Int }] // error | ^ - | JSClass{bar: Int} is not a class type + | JSClass{def bar: Int} is not a class type -- [E170] Type Error: tests/neg-scalajs/jsconstructortag-error-in-prepjsinterop.scala:25:45 ---------------------------- 25 | def foo[A <: js.Any] = js.constructorTag[A] // error | ^ diff --git a/tests/neg-scalajs/jsconstructortag-error-in-typer.check b/tests/neg-scalajs/jsconstructortag-error-in-typer.check index ba845de39231..888fa163e81c 100644 --- a/tests/neg-scalajs/jsconstructortag-error-in-typer.check +++ b/tests/neg-scalajs/jsconstructortag-error-in-typer.check @@ -1,4 +1,4 @@ --- Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:9:39 ------------------------------------------------- +-- [E172] Type Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:9:39 ------------------------------------- 9 | val a = js.constructorTag[ScalaClass] // error | ^ |No given instance of type scala.scalajs.js.ConstructorTag[ScalaClass] was found for parameter tag of method constructorTag in package scala.scalajs.js. @@ -7,7 +7,7 @@ | scala.scalajs.js.ConstructorTag.materialize[T] | |But method materialize in object ConstructorTag does not match type scala.scalajs.js.ConstructorTag[ScalaClass]. --- Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:10:39 ------------------------------------------------ +-- [E172] Type Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:10:39 ------------------------------------ 10 | val b = js.constructorTag[ScalaTrait] // error | ^ |No given instance of type scala.scalajs.js.ConstructorTag[ScalaTrait] was found for parameter tag of method constructorTag in package scala.scalajs.js. @@ -16,7 +16,7 @@ | scala.scalajs.js.ConstructorTag.materialize[T] | |But method materialize in object ConstructorTag does not match type scala.scalajs.js.ConstructorTag[ScalaTrait]. --- Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:11:45 ------------------------------------------------ +-- [E172] Type Error: tests/neg-scalajs/jsconstructortag-error-in-typer.scala:11:45 ------------------------------------ 11 | val c = js.constructorTag[ScalaObject.type] // error | ^ |No given instance of type scala.scalajs.js.ConstructorTag[ScalaObject.type] was found for parameter tag of method constructorTag in package scala.scalajs.js. 
diff --git a/tests/neg/abstract-givens.check b/tests/neg/abstract-givens.check index a74d0097b091..022c454c31f1 100644 --- a/tests/neg/abstract-givens.check +++ b/tests/neg/abstract-givens.check @@ -7,7 +7,7 @@ | ^ | error overriding given instance y in trait T of type (using x$1: Int): String; | given instance y of type (using x$1: Int): String cannot override final member given instance y in trait T --- [E163] Declaration Error: tests/neg/abstract-givens.scala:9:8 ------------------------------------------------------- +-- [E164] Declaration Error: tests/neg/abstract-givens.scala:9:8 ------------------------------------------------------- 9 | given z[T](using T): Seq[T] = List(summon[T]) // error | ^ | error overriding given instance z in trait T of type [T](using x$1: T): List[T]; diff --git a/tests/neg/classOf.check b/tests/neg/classOf.check index c3873aff7391..e3be3ca17026 100644 --- a/tests/neg/classOf.check +++ b/tests/neg/classOf.check @@ -11,4 +11,4 @@ -- [E170] Type Error: tests/neg/classOf.scala:9:18 --------------------------------------------------------------------- 9 | val y = classOf[C { type I = String }] // error | ^^^^^^^^^^^^^^^^^^^^^ - | Test.C{I = String} is not a class type + | Test.C{type I = String} is not a class type diff --git a/tests/neg/closure-args.scala b/tests/neg/closure-args.scala index 3b166c81c61c..76e590ad28b9 100644 --- a/tests/neg/closure-args.scala +++ b/tests/neg/closure-args.scala @@ -1,4 +1,4 @@ -import language.experimental.fewerBraces +import language.`3.3` val x = List(1).map: (x: => Int) => // error ??? diff --git a/tests/neg/enum-values.check b/tests/neg/enum-values.check index 84df5889b500..37990e8f312e 100644 --- a/tests/neg/enum-values.check +++ b/tests/neg/enum-values.check @@ -6,7 +6,9 @@ | meaning a values array is not defined. | An extension method was tried, but could not be fully constructed: | - | example.Extensions.values(Tag) failed with + | example.Extensions.values(Tag) + | + | failed with: | | Found: example.Tag.type | Required: Nothing @@ -18,7 +20,9 @@ | meaning a values array is not defined. | An extension method was tried, but could not be fully constructed: | - | example.Extensions.values(ListLike) failed with + | example.Extensions.values(ListLike) + | + | failed with: | | Found: Array[example.Tag[?]] | Required: Array[example.ListLike[?]] @@ -30,7 +34,9 @@ | meaning a values array is not defined. | An extension method was tried, but could not be fully constructed: | - | example.Extensions.values(TypeCtorsK) failed with + | example.Extensions.values(TypeCtorsK) + | + | failed with: | | Found: Array[example.Tag[?]] | Required: Array[example.TypeCtorsK[?[_$1]]] @@ -63,7 +69,9 @@ | value values is not a member of object example.NotAnEnum. 
| An extension method was tried, but could not be fully constructed: | - | example.Extensions.values(NotAnEnum) failed with + | example.Extensions.values(NotAnEnum) + | + | failed with: | | Found: example.NotAnEnum.type | Required: Nothing diff --git a/tests/neg/equality1.scala b/tests/neg/equality1.scala index 74bd45b18c12..cbd962a32bf6 100644 --- a/tests/neg/equality1.scala +++ b/tests/neg/equality1.scala @@ -132,4 +132,9 @@ object equality1 { println("empty") } + Map("k1" -> 1) == Map("k2" -> 2, "k3" -> 3) + Map(Color.Red -> Status.Inactive) == Map(Color.Green -> Status.Active(5)) + + Map("k1" -> 5) == Map('k' -> 5) // error + Map("k1" -> new A) == Map("k2" -> new B) // error } diff --git a/tests/neg/experimentalInheritance.scala b/tests/neg/experimentalInheritance.scala deleted file mode 100644 index 8b6c0b11afa3..000000000000 --- a/tests/neg/experimentalInheritance.scala +++ /dev/null @@ -1,44 +0,0 @@ -import scala.annotation.experimental - -@experimental -class A - -@experimental -trait T - -class B extends A // error - -@experimental -class B2 extends A - -class C extends T // error - -@experimental -class C2 extends T - -@experimental -class O: - class X - - @experimental - class Y - - object Z - -@experimental -object O: - class A - - @experimental - class B - - object C - -class OA extends O.A // error -class OB extends O.B // error - -@experimental -class OA2 extends O.A - -@experimental -class OB2 extends O.B diff --git a/tests/neg/experimentalInheritance2.scala b/tests/neg/experimentalInheritance2.scala deleted file mode 100644 index 84668ac5850f..000000000000 --- a/tests/neg/experimentalInheritance2.scala +++ /dev/null @@ -1,6 +0,0 @@ -import scala.annotation.experimental - -@experimental class A - -class B // // error: extension of experimental class A1 must have @experimental annotation - extends A diff --git a/tests/neg/exports.check b/tests/neg/exports.check index 49d8cdf0654b..79951cebfc39 100644 --- a/tests/neg/exports.check +++ b/tests/neg/exports.check @@ -11,7 +11,7 @@ 25 | export printUnit.bitmap // error: no eligible member | ^ | non-private given instance bitmap in class Copier refers to private value printUnit - | in its type signature => Copier.this.printUnit.bitmap + | in its type signature => object Copier.this.printUnit.bitmap -- [E120] Naming Error: tests/neg/exports.scala:23:33 ------------------------------------------------------------------ 23 | export printUnit.{stat => _, _} // error: double definition | ^ diff --git a/tests/neg/harmonize.scala b/tests/neg/harmonize.scala index 0fe03d2d7600..72275a8f68fc 100644 --- a/tests/neg/harmonize.scala +++ b/tests/neg/harmonize.scala @@ -79,9 +79,9 @@ object Test { val a4 = ArrayBuffer(1.0f, 1L) val b4: ArrayBuffer[Double] = a4 // error: no widening val a5 = ArrayBuffer(1.0f, 1L, f()) - val b5: ArrayBuffer[AnyVal] = a5 + val b5: ArrayBuffer[Float | Long | Int] = a5 val a6 = ArrayBuffer(1.0f, 1234567890) - val b6: ArrayBuffer[AnyVal] = a6 + val b6: ArrayBuffer[Float | Int] = a6 def totalDuration(results: List[Long], cond: Boolean): Long = results.map(r => if (cond) r else 0).sum diff --git a/tests/neg/i10098.check b/tests/neg/i10098.check index 06d0c62b69c0..94cc911b7753 100644 --- a/tests/neg/i10098.check +++ b/tests/neg/i10098.check @@ -1,16 +1,16 @@ --- Error: tests/neg/i10098.scala:20:32 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10098.scala:20:32 --------------------------------------------------------------------- 20 | 
implicitly[Bar12[Int, String]] // error | ^ | There's no Foo2[String, Int] --- Error: tests/neg/i10098.scala:21:32 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10098.scala:21:32 --------------------------------------------------------------------- 21 | implicitly[Bar21[Int, String]] // error | ^ | There's no Foo1[String, Int] --- Error: tests/neg/i10098.scala:22:32 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10098.scala:22:32 --------------------------------------------------------------------- 22 | implicitly[Baz12[Int, String]] // error | ^ | There's no Baz12[Int, String] --- Error: tests/neg/i10098.scala:23:32 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10098.scala:23:32 --------------------------------------------------------------------- 23 | implicitly[Baz21[Int, String]] // error | ^ | There's no Baz21[Int, String] diff --git a/tests/neg/i10603a.check b/tests/neg/i10603a.check index 578b942f6023..1d885dfdb762 100644 --- a/tests/neg/i10603a.check +++ b/tests/neg/i10603a.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i10603a.scala:2:35 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10603a.scala:2:35 --------------------------------------------------------------------- 2 | val x = implicitly[List[Boolean]] // error | ^ | No given instance of type List[Boolean] was found for parameter e of method implicitly in object Predef diff --git a/tests/neg/i10603b.check b/tests/neg/i10603b.check index 14a03fc9d3d7..cd230c44538b 100644 --- a/tests/neg/i10603b.check +++ b/tests/neg/i10603b.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i10603b.scala:4:35 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i10603b.scala:4:35 --------------------------------------------------------------------- 4 | val x = implicitly[List[Boolean]] // error | ^ | No implicit view available from Int => Boolean. diff --git a/tests/neg/i10715a.scala b/tests/neg/i10715a.scala new file mode 100644 index 000000000000..b5794c46d22c --- /dev/null +++ b/tests/neg/i10715a.scala @@ -0,0 +1,22 @@ +class Parent: + def f(x: Int): Parent = ??? + def f: Int = 0 + + def g[A](x: Int): Parent = ??? + def g[A]: Int = 0 + +class Sub extends Parent: + override def f(x: Int): Parent = ??? + override def g[A](x: Int): Parent = ??? + +def bad(c: Sub): Unit = + c.f: String // error + c.g: String // error + c.f.bad // error + c.g.bad // error + + c.f("") // error + c.g("") // error + c.g[Int]("") // error + c.g[Int]: (String => String) // error + c.g[Int]: (Int => Parent) // ok diff --git a/tests/neg/i10715b.scala b/tests/neg/i10715b.scala new file mode 100644 index 000000000000..922b80cf727b --- /dev/null +++ b/tests/neg/i10715b.scala @@ -0,0 +1,10 @@ +class Parent: + def f(x: Int): Unit = () + def f: Int = 0 + +class Sub extends Parent: + override def f(x: Int): Unit = () + def f(x: Int)(using String): Unit = () + +def bad(c: Sub): Unit = + c.f(1) // error: ambiguous overload diff --git a/tests/neg/i10901.check b/tests/neg/i10901.check index 26270ced338b..e055bed7dd3a 100644 --- a/tests/neg/i10901.check +++ b/tests/neg/i10901.check @@ -4,7 +4,9 @@ | value º is not a member of object BugExp4Point2D.IntT. 
| An extension method was tried, but could not be fully constructed: | - | º(x) failed with + | º(x) + | + | failed with: | | Ambiguous overload. The overloaded alternatives of method º in object dsl with types | [T1, T2] @@ -22,7 +24,9 @@ |value º is not a member of object BugExp4Point2D.IntT. |An extension method was tried, but could not be fully constructed: | - | º(x) failed with + | º(x) + | + | failed with: | | Ambiguous overload. The overloaded alternatives of method º in object dsl with types | [T1, T2] @@ -36,6 +40,8 @@ | value foo is not a member of String. | An extension method was tried, but could not be fully constructed: | - | Test.foo("abc")(/* missing */summon[C]) failed with + | Test.foo("abc")(/* missing */summon[C]) + | + | failed with: | | No given instance of type C was found for parameter x$2 of method foo in object Test diff --git a/tests/neg/i10943.scala b/tests/neg/i10943.scala index 4a9697c31874..09a42ce66cc4 100644 --- a/tests/neg/i10943.scala +++ b/tests/neg/i10943.scala @@ -1,4 +1,4 @@ -import language.experimental.fewerBraces +import language.`3.3` object T: class A diff --git a/tests/neg/i11066.check b/tests/neg/i11066.check index a73cb5566439..24b91b2f69ee 100644 --- a/tests/neg/i11066.check +++ b/tests/neg/i11066.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i11066.scala:15:37 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11066.scala:15:37 --------------------------------------------------------------------- 15 |val x = Greeter.greet("Who's there?") // error | ^ |Ambiguous given instances: both given instance joesPrompt in object JoesPrefs and given instance jillsPrompt in object JillsPrefs match type PreferredPrompt of parameter prompt of method greet in object Greeter diff --git a/tests/neg/i11797.check b/tests/neg/i11797.check index 80090b6b2faf..62b8a8828069 100644 --- a/tests/neg/i11797.check +++ b/tests/neg/i11797.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i11797.scala:6:17 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11797.scala:6:17 ---------------------------------------------------------------------- 6 | summon[Foo.Bar] // error | ^ | Oops diff --git a/tests/neg/i11897.check b/tests/neg/i11897.check index 4b001fadc606..67de6dbab37d 100644 --- a/tests/neg/i11897.check +++ b/tests/neg/i11897.check @@ -23,23 +23,23 @@ | ^^^^^^^^^^^ | given patterns are not allowed in a val definition, | please bind to an identifier and use an alias given. 
--- Error: tests/neg/i11897.scala:16:18 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11897.scala:16:18 --------------------------------------------------------------------- 16 | assert(summon[A] == A(23)) // error | ^ | No given instance of type A was found for parameter x of method summon in object Predef --- Error: tests/neg/i11897.scala:17:18 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11897.scala:17:18 --------------------------------------------------------------------- 17 | assert(summon[B] == B(false)) // error | ^ | No given instance of type B was found for parameter x of method summon in object Predef --- Error: tests/neg/i11897.scala:18:18 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11897.scala:18:18 --------------------------------------------------------------------- 18 | assert(summon[C] == C("c")) // error | ^ | No given instance of type C was found for parameter x of method summon in object Predef --- Error: tests/neg/i11897.scala:19:18 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11897.scala:19:18 --------------------------------------------------------------------- 19 | assert(summon[E] == E(93)) // error | ^ | No given instance of type E was found for parameter x of method summon in object Predef --- Error: tests/neg/i11897.scala:20:18 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11897.scala:20:18 --------------------------------------------------------------------- 20 | assert(summon[G] == G(101)) // error | ^ | No given instance of type G was found for parameter x of method summon in object Predef diff --git a/tests/neg/i11982.check b/tests/neg/i11982.check index 48ec252a4410..304accbf0269 100644 --- a/tests/neg/i11982.check +++ b/tests/neg/i11982.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i11982.scala:22:38 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i11982.scala:22:38 --------------------------------------------------------------------- 22 | val p1: ("msg", 42) = unpair[Tshape] // error: no singleton value for Any | ^ |No singleton value available for Any; eligible singleton types for `ValueOf` synthesis include literals and stable paths. 
diff --git a/tests/neg/i12049.check b/tests/neg/i12049.check index edf76a0823b9..a58624ec6778 100644 --- a/tests/neg/i12049.check +++ b/tests/neg/i12049.check @@ -31,7 +31,7 @@ | | case t1 *: t2 *: ts => Tuple.Concat[Reverse[ts], (t2, t1)] | case EmptyTuple => EmptyTuple --- Error: tests/neg/i12049.scala:24:20 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i12049.scala:24:20 --------------------------------------------------------------------- 24 |val _ = summon[M[B]] // error | ^ | No given instance of type M[B] was found for parameter x of method summon in object Predef diff --git a/tests/neg/i12232.check b/tests/neg/i12232.check index 8de9b4317a31..eb4875ab77c3 100644 --- a/tests/neg/i12232.check +++ b/tests/neg/i12232.check @@ -1,10 +1,10 @@ --- Error: tests/neg/i12232.scala:17:15 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i12232.scala:17:15 --------------------------------------------------------------------- 17 | foo(min(3, 4)) // error: works in Scala 2, not in 3 | ^ | No given instance of type Op[Int, Int, V] was found for parameter op of method min in object Foo | | where: V is a type variable with constraint <: Double --- Error: tests/neg/i12232.scala:19:16 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i12232.scala:19:16 --------------------------------------------------------------------- 19 | foo(minR(3, 4)) // error: works in Scala 2, not in 3 | ^ | No given instance of type Op[Int, Int, R] was found for parameter op of method minR in object Foo diff --git a/tests/neg/i12448.scala b/tests/neg/i12448.scala new file mode 100644 index 000000000000..e495cfd19f1d --- /dev/null +++ b/tests/neg/i12448.scala @@ -0,0 +1,5 @@ +object Main { + def mkArray[T <: A]: T#AType // error // error + mkArray[Array] // was: "assertion failed: invalid prefix HKTypeLambda..." + val x = mkArray[Array] +} diff --git a/tests/neg/i12591.check b/tests/neg/i12591.check index e050038659b5..17d418713aa2 100644 --- a/tests/neg/i12591.check +++ b/tests/neg/i12591.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i12591/Inner.scala:12:31 --------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i12591/Inner.scala:12:31 --------------------------------------------------------------- 12 |val badSummon = summon[TC[Bar]] // error here | ^ |Ambiguous given instances: both outer.inner.Foo.ofFoo and outer.Foo.ofFoo match type outer.inner.Foo.TC[outer.Bar] of parameter x of method summon in object Predef diff --git a/tests/neg/i12991.scala b/tests/neg/i12991.scala deleted file mode 100644 index 90e037424c49..000000000000 --- a/tests/neg/i12991.scala +++ /dev/null @@ -1,7 +0,0 @@ -object Foo: - inline def unapply(using String)(i: Int): Some[Int] = Some(i) - -given String = "" - -val i = 10 match - case Foo(x) => x // error diff --git a/tests/neg/i13558.check b/tests/neg/i13558.check index 4c468a854781..ab10a42cdd32 100644 --- a/tests/neg/i13558.check +++ b/tests/neg/i13558.check @@ -4,7 +4,9 @@ | value id is not a member of testcode.A. | An extension method was tried, but could not be fully constructed: | - | testcode.ExtensionA.id(a) failed with + | testcode.ExtensionA.id(a) + | + | failed with: | | Reference to id is ambiguous, | it is both imported by import testcode.ExtensionB._ @@ -15,7 +17,9 @@ | value id is not a member of testcode.A. 
| An extension method was tried, but could not be fully constructed: | - | testcode.ExtensionB.id(a) failed with + | testcode.ExtensionB.id(a) + | + | failed with: | | Reference to id is ambiguous, | it is both imported by import testcode.ExtensionA._ diff --git a/tests/neg/i13846.check b/tests/neg/i13846.check index 69ea0f0e51ac..a57db35ef6dd 100644 --- a/tests/neg/i13846.check +++ b/tests/neg/i13846.check @@ -2,7 +2,7 @@ 3 |def foo(): Int throws ArithmeticException = 1 / 0 // error | ^^^^^^^^^^^^^^^^^^^ | throws clause cannot be defined for RuntimeException --- Error: tests/neg/i13846.scala:7:9 ----------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i13846.scala:7:9 ----------------------------------------------------------------------- 7 | foo() // error | ^ | The capability to throw exception ArithmeticException is missing. diff --git a/tests/neg/i13864.check b/tests/neg/i13864.check index 54e81ea82774..6020ff8c6086 100644 --- a/tests/neg/i13864.check +++ b/tests/neg/i13864.check @@ -3,7 +3,7 @@ | ^^^^^^^^^^ | Implementation restriction: cannot generate CanThrow capability for this kind of catch. | CanThrow capabilities can only be generated for cases of the form `ex: T` where `T` is fully defined. --- Error: tests/neg/i13864.scala:9:10 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i13864.scala:9:10 ---------------------------------------------------------------------- 9 | foo(1) // error | ^ | The capability to throw exception Ex[Int] is missing. diff --git a/tests/neg/i13991.check b/tests/neg/i13991.check index bd1cda58c046..4c24e14a85c6 100644 --- a/tests/neg/i13991.check +++ b/tests/neg/i13991.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i13991.scala:5:7 ----------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i13991.scala:5:7 ----------------------------------------------------------------------- 5 | first[String] // error // before line 10 to test alignment of the error message `|` | ^^^^^^^^^^^^^ | No given instance of type Foo[String] was found diff --git a/tests/neg/i14025.check b/tests/neg/i14025.check index 3c67b954297b..a44cdc67c1f8 100644 --- a/tests/neg/i14025.check +++ b/tests/neg/i14025.check @@ -1,8 +1,8 @@ --- Error: tests/neg/i14025.scala:1:88 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14025.scala:1:88 ---------------------------------------------------------------------- 1 |val foo = summon[deriving.Mirror.Product { type MirroredType = [X] =>> [Y] =>> (X, Y) }] // error | ^ - |No given instance of type deriving.Mirror.Product{MirroredType[X] = [Y] =>> (X, Y)} was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Product{MirroredType[X] = [Y] =>> (X, Y)}: type `[X] =>> [Y] =>> (X, Y)` is not a generic product because its subpart `[X] =>> [Y] =>> (X, Y)` is not a supported kind (either `*` or `* -> *`) --- Error: tests/neg/i14025.scala:2:90 ---------------------------------------------------------------------------------- + |No given instance of type deriving.Mirror.Product{type MirroredType[X] = [Y] =>> (X, Y)} was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.Product{type MirroredType[X] = [Y] =>> (X, Y)}: type `[X] =>> [Y] =>> (X, Y)` is not a generic product because its subpart `[X] =>> [Y] =>> (X, Y)` is not a supported kind (either `*` or `* -> *`) +-- [E172] Type Error: tests/neg/i14025.scala:2:90 ---------------------------------------------------------------------- 2 |val bar = summon[deriving.Mirror.Sum { type MirroredType = [X] =>> [Y] =>> List[(X, Y)] }] // error | ^ - |No given instance of type deriving.Mirror.Sum{MirroredType[X] = [Y] =>> List[(X, Y)]} was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Sum{MirroredType[X] = [Y] =>> List[(X, Y)]}: type `[X] =>> [Y] =>> List[(X, Y)]` is not a generic sum because its subpart `[X] =>> [Y] =>> List[(X, Y)]` is not a supported kind (either `*` or `* -> *`) + |No given instance of type deriving.Mirror.Sum{type MirroredType[X] = [Y] =>> List[(X, Y)]} was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Sum{type MirroredType[X] = [Y] =>> List[(X, Y)]}: type `[X] =>> [Y] =>> List[(X, Y)]` is not a generic sum because its subpart `[X] =>> [Y] =>> List[(X, Y)]` is not a supported kind (either `*` or `* -> *`) diff --git a/tests/neg/i14127.check b/tests/neg/i14127.check index 969092401012..15babe8b2775 100644 --- a/tests/neg/i14127.check +++ b/tests/neg/i14127.check @@ -1,10 +1,8 @@ --- Error: tests/neg/i14127.scala:6:55 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14127.scala:6:55 ---------------------------------------------------------------------- 6 | *: Int *: Int *: Int *: Int *: Int *: EmptyTuple)]] // error | ^ - |No given instance of type deriving.Mirror.Of[(Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, - | Int - |, Int, Int)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, - | Int - |, Int, Int)]: + |No given instance of type deriving.Mirror.Of[(Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, + | Int, Int, Int)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, Int, + | Int, Int, Int)]: | * class *: is not a generic product because it reduces to a tuple with arity 23, expected arity <= 22 | * class *: is not a generic sum because it does not have subclasses diff --git a/tests/neg/i14432.check b/tests/neg/i14432.check index 793ade82212b..d19d952b0153 100644 --- a/tests/neg/i14432.check +++ b/tests/neg/i14432.check @@ -1,6 +1,6 @@ --- Error: tests/neg/i14432.scala:13:33 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14432.scala:13:33 --------------------------------------------------------------------- 13 |val mFoo = summon[Mirror.Of[Foo]] // error: no mirror found | ^ - |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: + |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: | * class Foo is not a generic product because the constructor of class Foo is innaccessible from the calling scope. | * class Foo is not a generic sum because it is not a sealed class diff --git a/tests/neg/i14432a.check b/tests/neg/i14432a.check index 5f847ce30a38..705a7ed0e88b 100644 --- a/tests/neg/i14432a.check +++ b/tests/neg/i14432a.check @@ -1,6 +1,6 @@ --- Error: tests/neg/i14432a.scala:14:43 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14432a.scala:14:43 -------------------------------------------------------------------- 14 | val mFoo = summon[Mirror.Of[example.Foo]] // error: no mirror found | ^ - |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: + |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: | * class Foo is not a generic product because the constructor of class Foo is innaccessible from the calling scope. | * class Foo is not a generic sum because it is not a sealed class diff --git a/tests/neg/i14432b.check b/tests/neg/i14432b.check index 24cb04b731ca..5b0dac3e6ad0 100644 --- a/tests/neg/i14432b.check +++ b/tests/neg/i14432b.check @@ -1,6 +1,6 @@ --- Error: tests/neg/i14432b.scala:15:43 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14432b.scala:15:43 -------------------------------------------------------------------- 15 | val mFoo = summon[Mirror.Of[example.Foo]] // error: no mirror found | ^ - |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: + |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: | * class Foo is not a generic product because the constructor of class Foo is innaccessible from the calling scope. | * class Foo is not a generic sum because it is not a sealed class diff --git a/tests/neg/i14432c.check b/tests/neg/i14432c.check index 384235e5d379..a61e100ceb98 100644 --- a/tests/neg/i14432c.check +++ b/tests/neg/i14432c.check @@ -1,10 +1,10 @@ --- Error: tests/neg/i14432c.scala:12:18 -------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i14432c.scala:12:18 --------------------------------------------------------------- 12 |class Bar extends example.Foo(23) { // error: cant access private[example] ctor | ^^^^^^^^^^^ | constructor Foo cannot be accessed as a member of example.Foo from class Bar. 
--- Error: tests/neg/i14432c.scala:16:43 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14432c.scala:16:43 -------------------------------------------------------------------- 16 | val mFoo = summon[Mirror.Of[example.Foo]] // error: no mirror | ^ - |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: + |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: | * class Foo is not a generic product because the constructor of class Foo is innaccessible from the calling scope. | * class Foo is not a generic sum because it is not a sealed class diff --git a/tests/neg/i14432d.check b/tests/neg/i14432d.check index 0701fb02ea19..aff070d90192 100644 --- a/tests/neg/i14432d.check +++ b/tests/neg/i14432d.check @@ -1,6 +1,6 @@ --- Error: tests/neg/i14432d.scala:17:45 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14432d.scala:17:45 -------------------------------------------------------------------- 17 | val mFoo = summon[Mirror.Of[example.Foo]] // error | ^ - |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: + |No given instance of type deriving.Mirror.Of[example.Foo] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[example.Foo]: | * class Foo is not a generic product because the constructor of class Foo is innaccessible from the calling scope. | * class Foo is not a generic sum because it is not a sealed class diff --git a/tests/neg/i14823.check b/tests/neg/i14823.check index 4d5a64680882..47b15f04e2da 100644 --- a/tests/neg/i14823.check +++ b/tests/neg/i14823.check @@ -1,6 +1,6 @@ --- Error: tests/neg/i14823.scala:8:50 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14823.scala:8:50 ---------------------------------------------------------------------- 8 |val baz = summon[Mirror.Of[SubA[Int] | SubB[Int]]] // error | ^ - |No given instance of type deriving.Mirror.Of[SubA[Int] | SubB[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[SubA[Int] | SubB[Int]]: + |No given instance of type deriving.Mirror.Of[SubA[Int] | SubB[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[SubA[Int] | SubB[Int]]: | * type `SubA[Int] | SubB[Int]` is not a generic product because its subpart `SubA[Int] | SubB[Int]` is a top-level union type. | * type `SubA[Int] | SubB[Int]` is not a generic sum because its subpart `SubA[Int] | SubB[Int]` is a top-level union type. 
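Editor's note: the mirror-synthesis checkfiles above likewise gain the E172 tag while keeping the explanation that a top-level union type has no generic product or sum. A hedged sketch of the contrast, using an invented `Shape` hierarchy rather than the test's own types:

```scala
import scala.deriving.Mirror

// Hypothetical sealed hierarchy, not taken from the test files.
sealed trait Shape
case class Circle(r: Double) extends Shape
case class Square(side: Double) extends Shape

object MirrorDemo:
  // A sealed trait gets a synthesized Mirror.SumOf instance:
  val shapeMirror = summon[Mirror.SumOf[Shape]]

  // A top-level union of unrelated case classes does not; enabling the next line
  // would trigger the E172 failure quoted above ("... is a top-level union type"):
  // val unionMirror = summon[Mirror.Of[Circle | Square]]
```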
diff --git a/tests/neg/i14823a.check b/tests/neg/i14823a.check index 9c917548d9bf..3c9b749780e0 100644 --- a/tests/neg/i14823a.check +++ b/tests/neg/i14823a.check @@ -1,24 +1,24 @@ --- Error: tests/neg/i14823a.scala:16:48 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14823a.scala:16:48 -------------------------------------------------------------------- 16 |val foo = summon[Mirror.Of[Box[Int] | Box[Int]]] // error | ^ - |No given instance of type deriving.Mirror.Of[Box[Int] | Box[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Box[Int] | Box[Int]]: + |No given instance of type deriving.Mirror.Of[Box[Int] | Box[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Box[Int] | Box[Int]]: | * type `Box[Int] | Box[Int]` is not a generic product because its subpart `Box[Int] | Box[Int]` is a top-level union type. | * type `Box[Int] | Box[Int]` is not a generic sum because its subpart `Box[Int] | Box[Int]` is a top-level union type. --- Error: tests/neg/i14823a.scala:17:58 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14823a.scala:17:58 -------------------------------------------------------------------- 17 |val bar = summon[MirrorK1.Of[[X] =>> Box[Int] | Box[Int]]] // error | ^ - |No given instance of type MirrorK1.Of[[X] =>> Box[Int] | Box[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type MirrorK1.Of[[X] =>> Box[Int] | Box[Int]]: + |No given instance of type MirrorK1.Of[[X] =>> Box[Int] | Box[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type MirrorK1.Of[[X] =>> Box[Int] | Box[Int]]: | * type `[A] =>> Box[Int] | Box[Int]` is not a generic product because its subpart `Box[Int] | Box[Int]` is a top-level union type. | * type `[A] =>> Box[Int] | Box[Int]` is not a generic sum because its subpart `Box[Int] | Box[Int]` is a top-level union type. --- Error: tests/neg/i14823a.scala:18:63 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14823a.scala:18:63 -------------------------------------------------------------------- 18 |def baz = summon[deriving.Mirror.Of[Foo[String] | Foo[String]]] // error | ^ - |No given instance of type deriving.Mirror.Of[Foo[String] | Foo[String]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Foo[String] | Foo[String]]: + |No given instance of type deriving.Mirror.Of[Foo[String] | Foo[String]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Foo[String] | Foo[String]]: | * type `Foo[String] | Foo[String]` is not a generic product because its subpart `Foo[String] | Foo[String]` is a top-level union type. | * type `Foo[String] | Foo[String]` is not a generic sum because its subpart `Foo[String] | Foo[String]` is a top-level union type. 
--- Error: tests/neg/i14823a.scala:20:66 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i14823a.scala:20:66 -------------------------------------------------------------------- 20 |def qux = summon[deriving.Mirror.Of[Option[Int] | Option[String]]] // error | ^ - |No given instance of type deriving.Mirror.Of[Option[Int] | Option[String]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Option[Int] | Option[String]]: + |No given instance of type deriving.Mirror.Of[Option[Int] | Option[String]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Option[Int] | Option[String]]: | * type `Option[Int] | Option[String]` is not a generic product because its subpart `Option[Int] | Option[String]` is a top-level union type. | * type `Option[Int] | Option[String]` is not a generic sum because its subpart `Option[Int] | Option[String]` is a top-level union type. diff --git a/tests/neg/i15000.check b/tests/neg/i15000.check index 1a1e8e1b973b..64c222b2a52e 100644 --- a/tests/neg/i15000.check +++ b/tests/neg/i15000.check @@ -16,7 +16,9 @@ |value apply is not a member of object ExtensionMethodReproduction.c. |An extension method was tried, but could not be fully constructed: | - | apply(ExtensionMethodReproduction.c) failed with + | apply(ExtensionMethodReproduction.c) + | + | failed with: | | Ambiguous overload. The overloaded alternatives of method apply in object ExtensionMethodReproduction with types | (c: ExtensionMethodReproduction.C)(x: Int, y: Int): String diff --git a/tests/neg/i15507.check b/tests/neg/i15507.check new file mode 100644 index 000000000000..3786d559c306 --- /dev/null +++ b/tests/neg/i15507.check @@ -0,0 +1,40 @@ +-- Error: tests/neg/i15507.scala:2:40 ---------------------------------------------------------------------------------- +2 | type _NestedSet1[X] = Set[_NestedSet1[?]] // error + | ^ + | no wildcard type allowed here +-- Error: tests/neg/i15507.scala:3:41 ---------------------------------------------------------------------------------- +3 | type _NestedSet2[X] <: Set[_NestedSet2[?]] // error + | ^ + | no wildcard type allowed here +-- [E140] Cyclic Error: tests/neg/i15507.scala:5:7 --------------------------------------------------------------------- +5 | type _NestedSet4[X] >: Set[_NestedSet4[X]] // error + | ^ + | illegal cyclic type reference: lower bound ... of type _NestedSet4 refers back to the type itself +-- [E140] Cyclic Error: tests/neg/i15507.scala:6:7 --------------------------------------------------------------------- +6 | type _NestedSet5[X] = Set[_NestedSet5[X]] // error + | ^ + | illegal cyclic type reference: alias ... of type _NestedSet5 refers back to the type itself +-- [E140] Cyclic Error: tests/neg/i15507.scala:7:7 --------------------------------------------------------------------- +7 | type _NestedSet6[X] = Set[_NestedSet6[Int]] // error + | ^ + | illegal cyclic type reference: alias ... 
of type _NestedSet6 refers back to the type itself +-- Error: tests/neg/i15507.scala:9:43 ---------------------------------------------------------------------------------- +9 | type _NestedList1[X] = List[_NestedList1[?]] // error + | ^ + | no wildcard type allowed here +-- Error: tests/neg/i15507.scala:10:44 --------------------------------------------------------------------------------- +10 | type _NestedList2[X] <: List[_NestedList2[?]] // error + | ^ + | no wildcard type allowed here +-- [E140] Cyclic Error: tests/neg/i15507.scala:12:7 -------------------------------------------------------------------- +12 | type _NestedList4[X] >: List[_NestedList4[X]] // error + | ^ + | illegal cyclic type reference: lower bound ... of type _NestedList4 refers back to the type itself +-- [E140] Cyclic Error: tests/neg/i15507.scala:13:7 -------------------------------------------------------------------- +13 | type _NestedList5[X] = List[_NestedList5[X]] // error + | ^ + | illegal cyclic type reference: alias ... of type _NestedList5 refers back to the type itself +-- [E140] Cyclic Error: tests/neg/i15507.scala:14:7 -------------------------------------------------------------------- +14 | type _NestedList6[X] = List[_NestedList6[Int]] // error + | ^ + | illegal cyclic type reference: alias ... of type _NestedList6 refers back to the type itself diff --git a/tests/neg/i15507.scala b/tests/neg/i15507.scala index 3c45f2a8d9f6..f65d216ba6a2 100644 --- a/tests/neg/i15507.scala +++ b/tests/neg/i15507.scala @@ -1,12 +1,12 @@ object TestNested: - type _NestedSet1[X] = Set[_NestedSet1[?]] // error // error + type _NestedSet1[X] = Set[_NestedSet1[?]] // error type _NestedSet2[X] <: Set[_NestedSet2[?]] // error type _NestedSet3[X] <: Set[_NestedSet3[X]] // ok type _NestedSet4[X] >: Set[_NestedSet4[X]] // error type _NestedSet5[X] = Set[_NestedSet5[X]] // error type _NestedSet6[X] = Set[_NestedSet6[Int]] // error - type _NestedList1[X] = List[_NestedList1[?]] // error // error + type _NestedList1[X] = List[_NestedList1[?]] // error type _NestedList2[X] <: List[_NestedList2[?]] // error type _NestedList3[X] <: List[_NestedList3[X]] // ok type _NestedList4[X] >: List[_NestedList4[X]] // error diff --git a/tests/neg/i15618.check b/tests/neg/i15618.check index 0853da26c27a..91f557b12dcf 100644 --- a/tests/neg/i15618.check +++ b/tests/neg/i15618.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i15618.scala:17:44 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i15618.scala:17:44 --------------------------------------------------------------------- 17 | def toArray: Array[ScalaType[T]] = Array() // error | ^ | No ClassTag available for ScalaType[T] diff --git a/tests/neg/i15998.check b/tests/neg/i15998.check index c745c7a84309..1f25946624cf 100644 --- a/tests/neg/i15998.check +++ b/tests/neg/i15998.check @@ -11,7 +11,7 @@ | must be more specific than CC[A] | | longer explanation available when compiling with `-explain` --- Error: tests/neg/i15998.scala:11:11 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i15998.scala:11:11 --------------------------------------------------------------------- 11 |val _ = bar // error | ^ | No implicit search was attempted for parameter x of method bar diff --git a/tests/neg/i16270a.scala b/tests/neg/i16270a.scala new file mode 100644 index 000000000000..b4a5016aaa08 --- /dev/null +++ b/tests/neg/i16270a.scala @@ -0,0 +1,25 @@ +class Outer { + type Smuggler 
+ var smuggler: Option[Smuggler] = None +} +class Foo[T](var unpack: T) +class Evil(val outer: Outer, extract: outer.type => Unit) extends Foo[outer.type](outer) { // error + def doExtract(): Unit = extract(unpack) +} + +object Test { + def main(args: Array[String]): Unit = { + val outer1 = new Outer { type Smuggler = Int } + outer1.smuggler = Some(5) + val evil1 = new Evil(outer1, _ => ()) + + val outer2 = new Outer { type Smuggler = String } + var extractedOuter2: Option[outer2.type] = None + val evil2 = new Evil(outer2, x => extractedOuter2 = Some(x)) + + evil2.unpack = evil1.unpack + evil2.doExtract() + val smuggled: String = extractedOuter2.get.smuggler.get + println(smuggled) + } +} diff --git a/tests/neg/i16270b.scala b/tests/neg/i16270b.scala new file mode 100644 index 000000000000..d520bf7516e2 --- /dev/null +++ b/tests/neg/i16270b.scala @@ -0,0 +1,9 @@ +class Outer { + class Foo(var unpack: Outer.this.type) + + type Smuggler + var smuggler: Option[Smuggler] = None +} +class Evil(val outer: Outer, extract: outer.type => Unit) extends outer.Foo(outer) { // error + def doExtract(): Unit = extract(unpack) +} diff --git a/tests/neg/i16270c.scala b/tests/neg/i16270c.scala new file mode 100644 index 000000000000..e1d51913c1ce --- /dev/null +++ b/tests/neg/i16270c.scala @@ -0,0 +1,3 @@ +class Foo[T <: Singleton](x: T) +class Outer +class Evil(val outer: Outer) extends Foo(outer) // error (because outer.type appears in the inferred type) diff --git a/tests/neg/i16343.scala b/tests/neg/i16343.scala new file mode 100644 index 000000000000..d09ffcbe32c7 --- /dev/null +++ b/tests/neg/i16343.scala @@ -0,0 +1,2 @@ +class Issue16343: + class MyWorker extends javax.swing.SwingWorker[Unit, Unit] // error diff --git a/tests/neg/i16407.check b/tests/neg/i16407.check new file mode 100644 index 000000000000..5c6bd19ca8c1 --- /dev/null +++ b/tests/neg/i16407.check @@ -0,0 +1,12 @@ +-- Error: tests/neg/i16407.scala:2:2 ----------------------------------------------------------------------------------- +2 | f(g()) // error // error + | ^ + | cannot resolve reference to type (X.this : Y & X).A + | the classfile defining the type might be missing from the classpath + | or the self type of (X.this : Y & X) might not contain all transitive dependencies +-- Error: tests/neg/i16407.scala:2:4 ----------------------------------------------------------------------------------- +2 | f(g()) // error // error + | ^ + | cannot resolve reference to type (X.this : Y & X).A + | the classfile defining the type might be missing from the classpath + | or the self type of (X.this : Y & X) might not contain all transitive dependencies diff --git a/tests/neg/i16407.scala b/tests/neg/i16407.scala new file mode 100644 index 000000000000..ff7192390eef --- /dev/null +++ b/tests/neg/i16407.scala @@ -0,0 +1,11 @@ +trait X { self: Y => + f(g()) // error // error +} +trait Y { self: Z => + type B = A + def f(a: B): Unit = () + def g(): A = ??? 
+} +trait Z { + type A +} diff --git a/tests/neg/i16438.scala b/tests/neg/i16438.scala new file mode 100644 index 000000000000..33873b13384b --- /dev/null +++ b/tests/neg/i16438.scala @@ -0,0 +1,4 @@ +// scalac: -Ysafe-init +trait ATrait(val string: String, val int: Int) +trait AnotherTrait( override val string: String, override val int: Int) extends ATrait +case class ACaseClass(override val string: String) extends AnotherTrait(string, 3) // error diff --git a/tests/neg/i16452.check b/tests/neg/i16452.check new file mode 100644 index 000000000000..df4247aabc12 --- /dev/null +++ b/tests/neg/i16452.check @@ -0,0 +1,4 @@ +-- Error: tests/neg/i16452.scala:2:8 ----------------------------------------------------------------------------------- +2 |// error + | ^ + | indented definitions expected, eof found diff --git a/tests/neg/i16452.scala b/tests/neg/i16452.scala new file mode 100644 index 000000000000..d2b6c565a684 --- /dev/null +++ b/tests/neg/i16452.scala @@ -0,0 +1,2 @@ +val x = Seq(1, 2, 3).map: +// error \ No newline at end of file diff --git a/tests/neg/i16464.scala b/tests/neg/i16464.scala new file mode 100644 index 000000000000..dfc4cd3da3c3 --- /dev/null +++ b/tests/neg/i16464.scala @@ -0,0 +1,6 @@ + +implicit final class SomeOps(e: Int) extends AnyVal: + def -(other: Seq[Int]) = List(1) + def -(other: Seq[Long]) = List(2) // error: double definition + +def main(): Unit = 1 - Seq.empty[Int] diff --git a/tests/neg/i16601.check b/tests/neg/i16601.check new file mode 100644 index 000000000000..25baef04e479 --- /dev/null +++ b/tests/neg/i16601.check @@ -0,0 +1,6 @@ +-- [E042] Type Error: tests/neg/i16601.scala:1:27 ---------------------------------------------------------------------- +1 |@main def Test: Unit = new concurrent.ExecutionContext // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | ExecutionContext is a trait; it cannot be instantiated + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i16601.scala b/tests/neg/i16601.scala new file mode 100644 index 000000000000..2e058db0093c --- /dev/null +++ b/tests/neg/i16601.scala @@ -0,0 +1 @@ +@main def Test: Unit = new concurrent.ExecutionContext // error \ No newline at end of file diff --git a/tests/neg/i16653.check b/tests/neg/i16653.check new file mode 100644 index 000000000000..dd5c756f6f79 --- /dev/null +++ b/tests/neg/i16653.check @@ -0,0 +1,6 @@ +-- [E006] Not Found Error: tests/neg/i16653.scala:1:7 ------------------------------------------------------------------ +1 |import demo.implicits._ // error + | ^^^^ + | Not found: demo + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i16653.scala b/tests/neg/i16653.scala new file mode 100644 index 000000000000..3be14d1bc6bf --- /dev/null +++ b/tests/neg/i16653.scala @@ -0,0 +1,3 @@ +import demo.implicits._ // error +import demo._ +object Demo {} \ No newline at end of file diff --git a/tests/neg/i16655.check b/tests/neg/i16655.check new file mode 100644 index 000000000000..e1335b624244 --- /dev/null +++ b/tests/neg/i16655.check @@ -0,0 +1,6 @@ +-- [E052] Type Error: tests/neg/i16655.scala:3:4 ----------------------------------------------------------------------- +3 | x = 5 // error + | ^^^^^ + | Reassignment to val x + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i16655.scala b/tests/neg/i16655.scala new file mode 100644 index 000000000000..c758678d9896 --- /dev/null +++ b/tests/neg/i16655.scala @@ -0,0 +1,3 @@ +object Test: + val x = "MyString" + x = 5 // error 
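Editor's note: the new i16655 test and checkfile above pin down the E052 "Reassignment to val" message. As a hedged, minimal sketch of the error and its usual fix (names invented here):

```scala
object ReassignmentSketch:
  // A `val` cannot be reassigned; uncommenting the next two lines reproduces E052:
  // val x = "MyString"
  // x = 5            // error: Reassignment to val x

  // Declaring the binding as a `var` with a type wide enough for both values compiles:
  var y: Int | String = "MyString"
  def update(): Unit = y = 5
```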
diff --git a/tests/neg/i16696.check b/tests/neg/i16696.check new file mode 100644 index 000000000000..2cac6a9c595a --- /dev/null +++ b/tests/neg/i16696.check @@ -0,0 +1,12 @@ +-- Error: tests/neg/i16696.scala:7:29 ---------------------------------------------------------------------------------- +7 | val boom1 = BoxMaker[Some].make1 // error + | ^ + | Some is not a value type, cannot be used in intersection Some & Int +-- Error: tests/neg/i16696.scala:8:29 ---------------------------------------------------------------------------------- +8 | val boom2 = BoxMaker[Some].make2 // error + | ^ + | Some is not a value type, cannot be used in union Some | Int +-- Error: tests/neg/i16696.scala:20:27 --------------------------------------------------------------------------------- +20 | val boom = BoxMaker[Foo].make(_.foo) // error + | ^ + | test2.Foo is not a value type, cannot be used in intersection R & test2.Foo diff --git a/tests/neg/i16696.scala b/tests/neg/i16696.scala new file mode 100644 index 000000000000..f54b884960fa --- /dev/null +++ b/tests/neg/i16696.scala @@ -0,0 +1,20 @@ +object test1: + class BoxMaker[T] { + def make1: T & Int = ??? + def make2: T | Int = ??? + } + + val boom1 = BoxMaker[Some].make1 // error + val boom2 = BoxMaker[Some].make2 // error + +object test2: + class Box[R] + + class BoxMaker[T] { + def make[R <: T](f: T => Box[R]): Box[R & T] = ??? + } + + trait Foo[A]{ + def foo: Box[Foo[Unit]] + } + val boom = BoxMaker[Foo].make(_.foo) // error diff --git a/tests/neg/i16850.check b/tests/neg/i16850.check new file mode 100644 index 000000000000..6c9c7f7e0eac --- /dev/null +++ b/tests/neg/i16850.check @@ -0,0 +1,10 @@ +-- [E007] Type Mismatch Error: tests/neg/i16850.scala:7:33 ------------------------------------------------------------- +7 | def add(elm: Y): Unit = list = elm :: list // error + | ^^^ + | Found: (elm : Y) + | Required: Class.this.Y² + | + | where: Y is a type in class Class + | Y² is a type in trait Trait + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i16850.scala b/tests/neg/i16850.scala new file mode 100644 index 000000000000..e7904fcd44e7 --- /dev/null +++ b/tests/neg/i16850.scala @@ -0,0 +1,10 @@ + +trait Trait : + type Y + var list: List[Y] = Nil + +class Class[Y] extends Trait : + def add(elm: Y): Unit = list = elm :: list // error + +object Object extends Class[Int] : + add(42) diff --git a/tests/neg/i1730.scala b/tests/neg/i1730.scala new file mode 100644 index 000000000000..d88d3c007002 --- /dev/null +++ b/tests/neg/i1730.scala @@ -0,0 +1,7 @@ +import scala.reflect.ClassTag + +@main def Test = + val x: Array[? <: String] = Array[Int & Nothing]() // error: No ClassTag available for Int & Nothing + // (was: ClassCastException: [I cannot be cast to [Ljava.lang.String) + val y: Array[? <: Int] = Array[String & Nothing]() // error: No ClassTag available for String & Nothing + // (was: ClassCastException: [Lscala.runtime.Nothing$; cannot be cast to [I) diff --git a/tests/neg/i3935.scala b/tests/neg/i3935.scala new file mode 100644 index 000000000000..07515a4c9ff9 --- /dev/null +++ b/tests/neg/i3935.scala @@ -0,0 +1,10 @@ +enum Foo3[T](x: T) { + case Bar[S, T](y: T) extends Foo3[y.type](y) // error +} + +// val foo: Foo3.Bar[Nothing, 3] = Foo3.Bar(3) +// val bar = foo + +// def baz[T](f: Foo3[T]): f.type = f + +// val qux = baz(bar) // existentials are back in Dotty? 
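Editor's note: the i16850 checkfile above exercises the new "where: Y is a type in class Class / Y² is a type in trait Trait" footnote, which disambiguates two same-named types. A hedged sketch of why the two `Y`s differ and one way to reconcile them; the names below only loosely mirror the test:

```scala
trait Box:
  type Elem
  var list: List[Elem] = Nil

// In the test, the class reuses the trait's member name for its type parameter,
// so the compiler prints Y and Y². Here the parameter is named differently and
// explicitly equated with the trait's member, which makes the assignment typecheck.
class GoodBox[E] extends Box:
  type Elem = E
  def add(elm: E): Unit = list = elm :: list
```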
diff --git a/tests/neg/i4373b.scala b/tests/neg/i4373b.scala index 45b60a46c721..93d967ef7778 100644 --- a/tests/neg/i4373b.scala +++ b/tests/neg/i4373b.scala @@ -1,5 +1,5 @@ // ==> 05bef7805687ba94da37177f7568e3ba7da1f91c.scala <== class x0 { - x1: - x0 | _ // error -// error \ No newline at end of file + x1: // error + x0 | _ + // error \ No newline at end of file diff --git a/tests/neg/i4986a.check b/tests/neg/i4986a.check index 3aac0a7b2cf3..141f3fa8aacb 100644 --- a/tests/neg/i4986a.check +++ b/tests/neg/i4986a.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i4986a.scala:6:57 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986a.scala:6:57 ---------------------------------------------------------------------- 6 | def foo(l: Lst[Int]) = l.map[Int, List[String]](x => 1) // error | ^ |Cannot construct a collection of type List[String] with elements of type Int based on a collection of type List[Int].. diff --git a/tests/neg/i4986c.check b/tests/neg/i4986c.check index a5fe0cee26bf..8befc30f5a60 100644 --- a/tests/neg/i4986c.check +++ b/tests/neg/i4986c.check @@ -1,64 +1,64 @@ --- Error: tests/neg/i4986c.scala:38:8 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:38:8 ---------------------------------------------------------------------- 38 | test.f // error | ^ | Missing X$Y for Test[Char] --- Error: tests/neg/i4986c.scala:39:13 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:39:13 --------------------------------------------------------------------- 39 | test.g[Int] // error | ^ | Missing Outer[Int] with OuterMember = pkg.Outer[Int]#OuterMember --- Error: tests/neg/i4986c.scala:40:13 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:40:13 --------------------------------------------------------------------- 40 | test.h[X$Y] // error | ^ | Missing Outer[pkg.X$Y] with OuterMember = pkg.Outer[pkg.X$Y]#OuterMember --- Error: tests/neg/i4986c.scala:41:24 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:41:24 --------------------------------------------------------------------- 41 | test.i[Option[String]] // error | ^ | Missing implicit outer param of type Outer[Option[String]] for Test[Char] --- Error: tests/neg/i4986c.scala:42:43 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:42:43 --------------------------------------------------------------------- 42 | test.j[(Long, Long), Int | String, Array] // error | ^ |Missing Inner[Int | String, Array] with InnerMember = pkg.Outer[(Long, Long)]#Inner[Int | String, Array]#InnerMember from Outer[(Long, Long)] with OuterMember = pkg.Outer[(Long, Long)]#OuterMember --- Error: tests/neg/i4986c.scala:43:53 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:43:53 --------------------------------------------------------------------- 43 | test.k[Either[String, Any], Seq[Seq[Char]], Vector] // error | ^ | Missing implicit inner param of type Outer[Either[String, Any]]#Inner[Seq[Seq[Char]], Vector] for Test[Char] --- Error: tests/neg/i4986c.scala:45:87 
--------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:45:87 --------------------------------------------------------------------- 45 | implicitly[Outer[Option[String] | List[Iterable[Char]]] { type MyType = BigDecimal }] // error | ^ - |Missing Outer[Option[String] | List[Iterable[Char]]] with OuterMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{MyType = BigDecimal}#OuterMember --- Error: tests/neg/i4986c.scala:46:106 -------------------------------------------------------------------------------- + |Missing Outer[Option[String] | List[Iterable[Char]]] with OuterMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{type MyType = BigDecimal}#OuterMember +-- [E172] Type Error: tests/neg/i4986c.scala:46:106 -------------------------------------------------------------------- 46 | implicitly[(Outer[Option[String] | List[Iterable[Char]]] { type MyType = BigDecimal })#Inner[Byte, Seq]] // error | ^ - |Missing Inner[Byte, Seq] with InnerMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{MyType = BigDecimal}#Inner[Byte, Seq]#InnerMember from Outer[Option[String] | List[Iterable[Char]]] with OuterMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{MyType = BigDecimal}#OuterMember --- Error: tests/neg/i4986c.scala:47:33 --------------------------------------------------------------------------------- + |Missing Inner[Byte, Seq] with InnerMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{type MyType = BigDecimal}#Inner[Byte, Seq]#InnerMember from Outer[Option[String] | List[Iterable[Char]]] with OuterMember = pkg.Outer[Option[String] | List[Iterable[Char]]]{type MyType = BigDecimal}#OuterMember +-- [E172] Type Error: tests/neg/i4986c.scala:47:33 --------------------------------------------------------------------- 47 | implicitly[Outer[Int] @myAnnot] // error | ^ | Missing Outer[Int] with OuterMember = pkg.Outer[Int] @myAnnot#OuterMember --- Error: tests/neg/i4986c.scala:52:52 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:52:52 --------------------------------------------------------------------- 52 | implicitly[Outer[Int] { type OuterMember = Long }] // error | ^ | Missing Outer[Int] with OuterMember = Long --- Error: tests/neg/i4986c.scala:53:24 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:53:24 --------------------------------------------------------------------- 53 | implicitly[outer.type] // error | ^ | Missing Outer[Int] with OuterMember = pkg.Test.outer.OuterMember --- Error: tests/neg/i4986c.scala:54:104 -------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:54:104 -------------------------------------------------------------------- 54 | implicitly[(Outer[Int] { type OuterMember = Long })#Inner[Long, Iterator] { type InnerMember = Byte }] // error | ^ | Missing Inner[Long, Iterator] with InnerMember = Byte from Outer[Int] with OuterMember = Long --- Error: tests/neg/i4986c.scala:55:69 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:55:69 --------------------------------------------------------------------- 55 | implicitly[outer.Inner[Long, Iterator] { type InnerMember = Byte }] // error | ^ |Missing Inner[Long, Iterator] with InnerMember = Byte from Outer[Int] with 
OuterMember = pkg.Test.outer.OuterMember --- Error: tests/neg/i4986c.scala:56:24 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:56:24 --------------------------------------------------------------------- 56 | implicitly[inner.type] // error | ^ |Missing Inner[Long, Iterator] with InnerMember = pkg.Test.inner.InnerMember from Outer[Int] with OuterMember = pkg.Test.outer.OuterMember --- Error: tests/neg/i4986c.scala:58:33 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:58:33 --------------------------------------------------------------------- 58 | implicitly[U[Int, Option, Map]] // error | ^ | There's no U[Int, Option, Map] --- Error: tests/neg/i4986c.scala:62:19 --------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i4986c.scala:62:19 --------------------------------------------------------------------- 62 | i.m[Option[Long]] // error | ^ | String; List; [A, _] =>> List[Option[?]]; Int; Option[Long]; diff --git a/tests/neg/i5498-postfixOps.check b/tests/neg/i5498-postfixOps.check index 59568a7fd9f3..d41862364270 100644 --- a/tests/neg/i5498-postfixOps.check +++ b/tests/neg/i5498-postfixOps.check @@ -10,7 +10,7 @@ | expression expected but ')' found | | longer explanation available when compiling with `-explain` --- Error: tests/neg/i5498-postfixOps.scala:6:0 ------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i5498-postfixOps.scala:6:0 ------------------------------------------------------------- 6 | Seq(1, 2).filter(List(1,2) contains) // error: usage of postfix operator // error |^ |No given instance of type scala.concurrent.duration.DurationConversions.Classifier[Null] was found for parameter ev of method second in trait DurationConversions diff --git a/tests/pos/i5636.scala b/tests/neg/i5636.scala similarity index 72% rename from tests/pos/i5636.scala rename to tests/neg/i5636.scala index 0a38439d718e..9c3b30af801a 100644 --- a/tests/pos/i5636.scala +++ b/tests/neg/i5636.scala @@ -4,6 +4,6 @@ trait Bar[X] { def foo: X = ??? } // same for `class Foo(...)...` -trait Foo(val a: A) extends Bar[a.type] { +trait Foo(val a: A) extends Bar[a.type] { // error val same: a.type = foo } diff --git a/tests/pending/neg/i5690.scala b/tests/neg/i5690.scala similarity index 100% rename from tests/pending/neg/i5690.scala rename to tests/neg/i5690.scala diff --git a/tests/neg/i6056.scala b/tests/neg/i6056.scala index ad68616eecc2..8e39b0e4631c 100644 --- a/tests/neg/i6056.scala +++ b/tests/neg/i6056.scala @@ -2,6 +2,6 @@ object i0{ import i0.i0 // error // error def i0={ import _ // error - import // error + import } // error } \ No newline at end of file diff --git a/tests/neg/i6183.check b/tests/neg/i6183.check index 70c1afaae621..6c7e96f1088a 100644 --- a/tests/neg/i6183.check +++ b/tests/neg/i6183.check @@ -4,7 +4,9 @@ | value render is not a member of Int. | An extension method was tried, but could not be fully constructed: | - | render(42) failed with + | render(42) + | + | failed with: | | Ambiguous overload. The overloaded alternatives of method render in object Test with types | [B](b: B)(using x$2: DummyImplicit): Char diff --git a/tests/neg/i6779.check b/tests/neg/i6779.check index d895203221ec..8e05c22eb640 100644 --- a/tests/neg/i6779.check +++ b/tests/neg/i6779.check @@ -11,7 +11,9 @@ | value f is not a member of T. 
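Editor's note: the i5498 checkfile above combines the "usage of postfix operator" error with a failed implicit search in `DurationConversions`. As a hedged sketch, not taken from the tests, of the language import that allows deliberate postfix syntax:

```scala
// Minimal sketch, assuming the standard duration DSL; without the language
// import below, a postfix call such as `5 seconds` is rejected in Scala 3.
import scala.language.postfixOps
import scala.concurrent.duration.*

val pause = 5 seconds // parses as postfix application of `seconds` to 5
```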
| An extension method was tried, but could not be fully constructed: | - | Test.f[G[T]](x)(given_Stuff) failed with + | Test.f[G[T]](x)(given_Stuff) + | + | failed with: | | Found: (x : T) | Required: G[T] diff --git a/tests/neg/i7613.check b/tests/neg/i7613.check index 85d73b5c88f3..8ce12426c90c 100644 --- a/tests/neg/i7613.check +++ b/tests/neg/i7613.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i7613.scala:10:16 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i7613.scala:10:16 ---------------------------------------------------------------------- 10 | new BazLaws[A] {} // error | ^ | No given instance of type Baz[A] was found for parameter x$1 of constructor BazLaws in trait BazLaws diff --git a/tests/neg/i7709.check b/tests/neg/i7709.check index 20ecc4adce5f..180cf1939d16 100644 --- a/tests/neg/i7709.check +++ b/tests/neg/i7709.check @@ -1,46 +1,46 @@ --- Error: tests/neg/i7709.scala:5:20 ----------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:5:20 ------------------------------------------------------------------ 5 | class B extends X.Y // error | ^^^ | class Y cannot be accessed as a member of X.type from class B. | Access to protected class Y not permitted because enclosing object A | is not a subclass of object X where target is defined --- Error: tests/neg/i7709.scala:6:21 ----------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:6:21 ------------------------------------------------------------------ 6 | class B2 extends X.Y: // error | ^^^ | class Y cannot be accessed as a member of X.type from class B2. | Access to protected class Y not permitted because enclosing object A | is not a subclass of object X where target is defined --- Error: tests/neg/i7709.scala:9:28 ----------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:9:28 ------------------------------------------------------------------ 9 | class B4 extends B3(new X.Y) // error | ^^^ | class Y cannot be accessed as a member of X.type from class B4. | Access to protected class Y not permitted because enclosing object A | is not a subclass of object X where target is defined --- Error: tests/neg/i7709.scala:11:34 ---------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:11:34 ----------------------------------------------------------------- 11 | def this(n: Int) = this(new X.Y().toString) // error | ^^^ | class Y cannot be accessed as a member of X.type from class B5. | Access to protected class Y not permitted because enclosing object A | is not a subclass of object X where target is defined --- Error: tests/neg/i7709.scala:13:20 ---------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:13:20 ----------------------------------------------------------------- 13 | class B extends X.Y // error | ^^^ | class Y cannot be accessed as a member of X.type from class B. 
| Access to protected class Y not permitted because enclosing trait T | is not a subclass of object X where target is defined --- Error: tests/neg/i7709.scala:18:18 ---------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:18:18 ----------------------------------------------------------------- 18 | def y = new xx.Y // error | ^^^^ | class Y cannot be accessed as a member of XX from class C. | Access to protected class Y not permitted because enclosing class C | is not a subclass of class XX where target is defined --- Error: tests/neg/i7709.scala:23:20 ---------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:23:20 ----------------------------------------------------------------- 23 | def y = new xx.Y // error | ^^^^ | class Y cannot be accessed as a member of XX from class D. | Access to protected class Y not permitted because enclosing class D | is not a subclass of class XX where target is defined --- Error: tests/neg/i7709.scala:31:20 ---------------------------------------------------------------------------------- +-- [E173] Reference Error: tests/neg/i7709.scala:31:20 ----------------------------------------------------------------- 31 | class Q extends X.Y // error | ^^^ | class Y cannot be accessed as a member of p.X.type from class Q. diff --git a/tests/neg/i7751.scala b/tests/neg/i7751.scala index 4c835a533704..978ed860574f 100644 --- a/tests/neg/i7751.scala +++ b/tests/neg/i7751.scala @@ -1,3 +1,3 @@ -import language.experimental.fewerBraces +import language.`3.3` val a = Some(a=a,)=> // error // error val a = Some(x=y,)=> diff --git a/tests/neg/i827.check b/tests/neg/i827.check new file mode 100644 index 000000000000..825aefbb480b --- /dev/null +++ b/tests/neg/i827.check @@ -0,0 +1,11 @@ +-- [E069] Naming Error: tests/neg/i827.scala:3:8 ----------------------------------------------------------------------- +3 | trait Inner extends self.Inner // error: cannot merge trait Inner in trait A with trait Inner in trait B as members of type (A & B)(B.this) + | ^ + |trait Inner cannot have the same name as trait Inner in trait A -- cannot define trait member with the same name as a trait member in self reference self. + |(Note: this can be resolved by using another name) +-- [E110] Syntax Error: tests/neg/i827.scala:7:16 ---------------------------------------------------------------------- +7 |class C extends C // error: cyclic inheritance: class C extends itself + | ^ + | Cyclic inheritance: class C extends itself + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i8337.scala b/tests/neg/i8337.scala index 7955c471fb70..6e42b96c2855 100644 --- a/tests/neg/i8337.scala +++ b/tests/neg/i8337.scala @@ -1,6 +1,6 @@ trait Foo[F <: Foo[F]] class Bar extends Foo[Bar] -object Q { // error: recursion limit exceeded - opaque type X <: Foo[X] = Bar // error: out of bounds // error +object Q { // error: cyclic reference + opaque type X <: Foo[X] = Bar // error: cyclic reference } \ No newline at end of file diff --git a/tests/neg/i8623.check b/tests/neg/i8623.check index b9d6e244e70e..39337a7839d8 100644 --- a/tests/neg/i8623.check +++ b/tests/neg/i8623.check @@ -8,4 +8,6 @@ | This might be because resolution yielded as given instance a function that is not | known to be total and side-effect free. 
| + | where: ?1 is an unknown value of type QC + | | longer explanation available when compiling with `-explain` diff --git a/tests/neg/i9185.check b/tests/neg/i9185.check index ffeed7e2fb2d..1616e739c473 100644 --- a/tests/neg/i9185.check +++ b/tests/neg/i9185.check @@ -5,11 +5,12 @@ |An extension method was tried, but could not be fully constructed: | | M.pure[A, F]("ola")( - | /* ambiguous: both object listMonad in object M and object optionMonad in object M match type M[F] */summon[M[F]] - | ) failed with + | /* ambiguous: both object listMonad in object M and object optionMonad in object M match type M[F] */summon[M[F]]) + | + | failed with: | | Ambiguous given instances: both object listMonad in object M and object optionMonad in object M match type M[F] of parameter m of method pure in object M --- Error: tests/neg/i9185.scala:8:28 ----------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i9185.scala:8:28 ----------------------------------------------------------------------- 8 | val value3 = M.pure("ola") // error | ^ |Ambiguous given instances: both object listMonad in object M and object optionMonad in object M match type M[F] of parameter m of method pure in object M @@ -19,7 +20,9 @@ | value len is not a member of String. | An extension method was tried, but could not be fully constructed: | - | M.len("abc") failed with + | M.len("abc") + | + | failed with: | | Found: ("abc" : String) | Required: Int diff --git a/tests/neg/i9568.check b/tests/neg/i9568.check index b6e20bdaf1be..3f318d0b0111 100644 --- a/tests/neg/i9568.check +++ b/tests/neg/i9568.check @@ -1,12 +1,16 @@ --- Error: tests/neg/i9568.scala:13:10 ---------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i9568.scala:13:10 ---------------------------------------------------------------------- 13 | blaMonad.foo(bla) // error: diverges | ^ - | No given instance of type => Monad[F] was found for parameter ev of method blaMonad in object Test - | - | where: F is a type variable with constraint <: [_] =>> Any - | . + | No given instance of type => Monad[F] was found for parameter ev of method blaMonad in object Test. | I found: | - | Test.blaMonad[F, S](Test.blaMonad[F, S]) + | Test.blaMonad[F², S](Test.blaMonad[F³, S²]) | - | But method blaMonad in object Test does not match type => Monad[F]. + | But method blaMonad in object Test does not match type => Monad[F²] + | + | where: F is a type variable with constraint <: [_] =>> Any + | F² is a type variable with constraint <: [_] =>> Any + | F³ is a type variable with constraint <: [_] =>> Any + | S is a type variable + | S² is a type variable + | . 
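Editor's note: the i9185 and i9568 checkfiles above show the reworked output for ambiguous and diverging given searches, including the "where:" footnote that numbers same-named type variables (F, F², F³). A hedged, minimal reproduction of the ambiguity case only, with an invented `Show` type class placed in its companion so both instances are in implicit scope:

```scala
trait Show[A]:
  def show(a: A): String

object Show:
  given int1: Show[Int] with
    def show(a: Int) = a.toString
  given int2: Show[Int] with
    def show(a: Int) = s"Int($a)"

def ambiguityDemo =
  // Two equally specific givens match, so enabling the next line is rejected with
  // "Ambiguous given instances: both given instance int1 in object Show and
  //  given instance int2 in object Show match type Show[Int] ...".
  // summon[Show[Int]]
  ()
```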
diff --git a/tests/neg/i9958.check b/tests/neg/i9958.check index d8b37b996ec1..3f65286eebc2 100644 --- a/tests/neg/i9958.check +++ b/tests/neg/i9958.check @@ -1,4 +1,4 @@ --- Error: tests/neg/i9958.scala:1:30 ----------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/i9958.scala:1:30 ----------------------------------------------------------------------- 1 |val x = summon[[X] =>> (X, X)] // error | ^ | No given instance of type [X] =>> (X, X) was found for parameter x of method summon in object Predef diff --git a/tests/neg/implicitSearch.check b/tests/neg/implicitSearch.check index d7ea6c01801c..e8efc744ac0a 100644 --- a/tests/neg/implicitSearch.check +++ b/tests/neg/implicitSearch.check @@ -1,4 +1,4 @@ --- Error: tests/neg/implicitSearch.scala:13:12 ------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/implicitSearch.scala:13:12 ------------------------------------------------------------- 13 | sort(xs) // error (with a partially constructed implicit argument shown) | ^ | No given instance of type Test.Ord[List[List[T]]] was found for parameter o of method sort in object Test. @@ -7,7 +7,7 @@ | Test.listOrd[List[T]](Test.listOrd[T](/* missing */summon[Test.Ord[T]])) | | But no implicit values were found that match type Test.Ord[T]. --- Error: tests/neg/implicitSearch.scala:15:38 ------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/implicitSearch.scala:15:38 ------------------------------------------------------------- 15 | listOrd(listOrd(implicitly[Ord[T]] /*not found*/)) // error | ^ | No given instance of type Test.Ord[T] was found for parameter e of method implicitly in object Predef diff --git a/tests/neg/indent-colons.check b/tests/neg/indent-colons.check index 06bd7a31b079..102d41592014 100644 --- a/tests/neg/indent-colons.check +++ b/tests/neg/indent-colons.check @@ -1,29 +1,29 @@ --- Error: tests/neg/indent-colons.scala:6:4 ---------------------------------------------------------------------------- -6 | : // error +-- Error: tests/neg/indent-colons.scala:7:4 ---------------------------------------------------------------------------- +7 | : // error | ^ | end of statement expected but ':' found --- Error: tests/neg/indent-colons.scala:12:2 --------------------------------------------------------------------------- -12 | : // error +-- Error: tests/neg/indent-colons.scala:13:2 --------------------------------------------------------------------------- +13 | : // error | ^ | end of statement expected but ':' found --- Error: tests/neg/indent-colons.scala:19:2 --------------------------------------------------------------------------- -19 | : // error +-- Error: tests/neg/indent-colons.scala:20:2 --------------------------------------------------------------------------- +20 | : // error | ^ | end of statement expected but ':' found --- [E018] Syntax Error: tests/neg/indent-colons.scala:26:14 ------------------------------------------------------------ -26 | val y = 1 + : // error +-- [E018] Syntax Error: tests/neg/indent-colons.scala:27:14 ------------------------------------------------------------ +27 | val y = 1 + : // error | ^ | expression expected but : found | | longer explanation available when compiling with `-explain` --- [E018] Syntax Error: tests/neg/indent-colons.scala:30:27 ------------------------------------------------------------ -30 | val all = credentials ++ : // error +-- [E018] Syntax Error: 
tests/neg/indent-colons.scala:31:27 ------------------------------------------------------------ +31 | val all = credentials ++ : // error | ^ | expression expected but : found | | longer explanation available when compiling with `-explain` --- [E134] Type Error: tests/neg/indent-colons.scala:23:12 -------------------------------------------------------------- -23 | val x = 1.+ : // error +-- [E134] Type Error: tests/neg/indent-colons.scala:24:12 -------------------------------------------------------------- +24 | val x = 1.+ : // error | ^^^ | None of the overloaded alternatives of method + in class Int with types | (x: Double): Double @@ -35,27 +35,27 @@ | (x: Byte): Int | (x: String): String | match expected type (2 : Int) --- [E006] Not Found Error: tests/neg/indent-colons.scala:32:7 ---------------------------------------------------------- -32 | if file.isEmpty // error +-- [E006] Not Found Error: tests/neg/indent-colons.scala:33:7 ---------------------------------------------------------- +33 | if file.isEmpty // error | ^^^^ | Not found: file | | longer explanation available when compiling with `-explain` --- [E006] Not Found Error: tests/neg/indent-colons.scala:34:13 --------------------------------------------------------- -34 | else Seq(file) // error +-- [E006] Not Found Error: tests/neg/indent-colons.scala:35:13 --------------------------------------------------------- +35 | else Seq(file) // error | ^^^^ | Not found: file | | longer explanation available when compiling with `-explain` --- Error: tests/neg/indent-colons.scala:4:2 ---------------------------------------------------------------------------- -4 | tryEither: // error +-- Error: tests/neg/indent-colons.scala:5:2 ---------------------------------------------------------------------------- +5 | tryEither: // error | ^^^^^^^^^ | missing arguments for method tryEither --- Error: tests/neg/indent-colons.scala:10:2 --------------------------------------------------------------------------- -10 | tryEither: // error +-- Error: tests/neg/indent-colons.scala:11:2 --------------------------------------------------------------------------- +11 | tryEither: // error | ^^^^^^^^^ | missing arguments for method tryEither --- Error: tests/neg/indent-colons.scala:17:2 --------------------------------------------------------------------------- -17 | Some(3).fold: // error +-- Error: tests/neg/indent-colons.scala:18:2 --------------------------------------------------------------------------- +18 | Some(3).fold: // error | ^^^^^^^^^^^^ | missing arguments for method fold in class Option diff --git a/tests/neg/indent-colons.scala b/tests/neg/indent-colons.scala index 5364713dd4aa..240012f5489b 100644 --- a/tests/neg/indent-colons.scala +++ b/tests/neg/indent-colons.scala @@ -1,3 +1,4 @@ +import language.`3.2` def tryEither[T](x: T)(y: Int => T): T = ??? 
def test1 = diff --git a/tests/neg/indent-experimental.scala b/tests/neg/indent-experimental.scala index e945e172d1de..34ea5633010c 100644 --- a/tests/neg/indent-experimental.scala +++ b/tests/neg/indent-experimental.scala @@ -1,4 +1,4 @@ -import language.experimental.fewerBraces +import language.`3.3` val x = if true then: // error diff --git a/tests/neg/inline-param-unstable-path.scala b/tests/neg/inline-param-unstable-path.scala new file mode 100644 index 000000000000..be2d7142bc2f --- /dev/null +++ b/tests/neg/inline-param-unstable-path.scala @@ -0,0 +1,6 @@ +inline val a = 3 +inline def f(inline x: Int, y: Int, z: => Int): Unit = + val x2: x.type = x // error: (x : Int) is not a valid singleton type, since it is not an immutable path + val y2: y.type = y + val z2: z.type = z // error: (z : Int) is not a valid singleton type, since it is not an immutable path + val a2: a.type = a diff --git a/tests/neg/inline-val-in-inline-method.scala b/tests/neg/inline-val-in-inline-method.scala new file mode 100644 index 000000000000..fbd0f69ff2d5 --- /dev/null +++ b/tests/neg/inline-val-in-inline-method.scala @@ -0,0 +1,8 @@ +inline def f(inline x: Int): Unit = + inline val b = x + val c: b.type = b + +def test = + f(1) + def a = 1 + f(a) // error: inline value must have a literal constant type diff --git a/tests/neg/java-ann-extends-separate/Ann_1.java b/tests/neg/java-ann-extends-separate/Ann_1.java new file mode 100644 index 000000000000..97184df24c83 --- /dev/null +++ b/tests/neg/java-ann-extends-separate/Ann_1.java @@ -0,0 +1,3 @@ +public @interface Ann_1 { + int value(); +} diff --git a/tests/neg/java-ann-extends-separate/Test_2.scala b/tests/neg/java-ann-extends-separate/Test_2.scala new file mode 100644 index 000000000000..4e73b71679f6 --- /dev/null +++ b/tests/neg/java-ann-extends-separate/Test_2.scala @@ -0,0 +1,2 @@ +def test(x: Ann_1) = + val y: scala.annotation.Annotation = x // error diff --git a/tests/neg/java-ann-extends/Ann.java b/tests/neg/java-ann-extends/Ann.java new file mode 100644 index 000000000000..9ae845a8af63 --- /dev/null +++ b/tests/neg/java-ann-extends/Ann.java @@ -0,0 +1,3 @@ +public @interface Ann { + int value(); +} diff --git a/tests/neg/java-ann-extends/Test.scala b/tests/neg/java-ann-extends/Test.scala new file mode 100644 index 000000000000..629f1daa9acc --- /dev/null +++ b/tests/neg/java-ann-extends/Test.scala @@ -0,0 +1,2 @@ +def test(x: Ann) = + val y: scala.annotation.Annotation = x // error diff --git a/tests/neg/java-ann-super-class/Ann.java b/tests/neg/java-ann-super-class/Ann.java new file mode 100644 index 000000000000..9ae845a8af63 --- /dev/null +++ b/tests/neg/java-ann-super-class/Ann.java @@ -0,0 +1,3 @@ +public @interface Ann { + int value(); +} diff --git a/tests/neg/java-ann-super-class/Test.scala b/tests/neg/java-ann-super-class/Test.scala new file mode 100644 index 000000000000..cf2f72d2f633 --- /dev/null +++ b/tests/neg/java-ann-super-class/Test.scala @@ -0,0 +1,9 @@ +class Bar extends Ann(1) { // error + def value = 1 + def annotationType = classOf[Ann] +} + +def test = + // Typer errors + new Ann // error + new Ann(1) {} // error diff --git a/tests/neg/java-ann-super-class2/Ann.java b/tests/neg/java-ann-super-class2/Ann.java new file mode 100644 index 000000000000..9ae845a8af63 --- /dev/null +++ b/tests/neg/java-ann-super-class2/Ann.java @@ -0,0 +1,3 @@ +public @interface Ann { + int value(); +} diff --git a/tests/neg/java-ann-super-class2/Test.scala b/tests/neg/java-ann-super-class2/Test.scala new file mode 100644 index 
000000000000..d5c22860899c --- /dev/null +++ b/tests/neg/java-ann-super-class2/Test.scala @@ -0,0 +1,3 @@ +def test = + // Posttyper errors + new Ann(1) // error diff --git a/tests/neg/java-ann-super-class3/Ann.java b/tests/neg/java-ann-super-class3/Ann.java new file mode 100644 index 000000000000..9ae845a8af63 --- /dev/null +++ b/tests/neg/java-ann-super-class3/Ann.java @@ -0,0 +1,3 @@ +public @interface Ann { + int value(); +} diff --git a/tests/neg/java-ann-super-class3/Test.scala b/tests/neg/java-ann-super-class3/Test.scala new file mode 100644 index 000000000000..8fd9791e6fe3 --- /dev/null +++ b/tests/neg/java-ann-super-class3/Test.scala @@ -0,0 +1,3 @@ +def test = + // Refchecks error + new Ann {} // error diff --git a/tests/neg/java-fake-ann-separate/FakeAnn_1.java b/tests/neg/java-fake-ann-separate/FakeAnn_1.java new file mode 100644 index 000000000000..597ea980585d --- /dev/null +++ b/tests/neg/java-fake-ann-separate/FakeAnn_1.java @@ -0,0 +1 @@ +interface FakeAnn_1 extends java.lang.annotation.Annotation { } diff --git a/tests/neg/java-fake-ann-separate/Test_2.scala b/tests/neg/java-fake-ann-separate/Test_2.scala new file mode 100644 index 000000000000..becc8babdaa0 --- /dev/null +++ b/tests/neg/java-fake-ann-separate/Test_2.scala @@ -0,0 +1,3 @@ +@FakeAnn_1 def test = // error + (1: @FakeAnn_1) // error + diff --git a/tests/neg/java-fake-ann/FakeAnn.java b/tests/neg/java-fake-ann/FakeAnn.java new file mode 100644 index 000000000000..2b055f782d42 --- /dev/null +++ b/tests/neg/java-fake-ann/FakeAnn.java @@ -0,0 +1 @@ +interface FakeAnn extends java.lang.annotation.Annotation { } diff --git a/tests/neg/java-fake-ann/Test.scala b/tests/neg/java-fake-ann/Test.scala new file mode 100644 index 000000000000..827527cb80bf --- /dev/null +++ b/tests/neg/java-fake-ann/Test.scala @@ -0,0 +1,2 @@ +@FakeAnn def test = // error + (1: @FakeAnn) // error diff --git a/tests/neg/mirror-synthesis-errors-b.check b/tests/neg/mirror-synthesis-errors-b.check index ea41d14da296..d9e394617c9d 100644 --- a/tests/neg/mirror-synthesis-errors-b.check +++ b/tests/neg/mirror-synthesis-errors-b.check @@ -1,40 +1,40 @@ --- Error: tests/neg/mirror-synthesis-errors-b.scala:21:56 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:21:56 -------------------------------------------------- 21 |val testA = summon[Mirror.ProductOf[Cns[Int] & Sm[Int]]] // error: unreleated | ^ |No given instance of type deriving.Mirror.ProductOf[Cns[Int] & Sm[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.ProductOf[Cns[Int] & Sm[Int]]: type `Cns[Int] & Sm[Int]` is not a generic product because its subpart `Cns[Int] & Sm[Int]` is an intersection of unrelated definitions class Cns and class Sm. --- Error: tests/neg/mirror-synthesis-errors-b.scala:22:56 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:22:56 -------------------------------------------------- 22 |val testB = summon[Mirror.ProductOf[Sm[Int] & Cns[Int]]] // error: unreleated | ^ |No given instance of type deriving.Mirror.ProductOf[Sm[Int] & Cns[Int]] was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.ProductOf[Sm[Int] & Cns[Int]]: type `Sm[Int] & Cns[Int]` is not a generic product because its subpart `Sm[Int] & Cns[Int]` is an intersection of unrelated definitions class Sm and class Cns. --- Error: tests/neg/mirror-synthesis-errors-b.scala:23:49 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:23:49 -------------------------------------------------- 23 |val testC = summon[Mirror.Of[Cns[Int] & Sm[Int]]] // error: unreleated | ^ - |No given instance of type deriving.Mirror.Of[Cns[Int] & Sm[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Cns[Int] & Sm[Int]]: + |No given instance of type deriving.Mirror.Of[Cns[Int] & Sm[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Cns[Int] & Sm[Int]]: | * type `Cns[Int] & Sm[Int]` is not a generic product because its subpart `Cns[Int] & Sm[Int]` is an intersection of unrelated definitions class Cns and class Sm. | * type `Cns[Int] & Sm[Int]` is not a generic sum because its subpart `Cns[Int] & Sm[Int]` is an intersection of unrelated definitions class Cns and class Sm. --- Error: tests/neg/mirror-synthesis-errors-b.scala:24:49 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:24:49 -------------------------------------------------- 24 |val testD = summon[Mirror.Of[Sm[Int] & Cns[Int]]] // error: unreleated | ^ - |No given instance of type deriving.Mirror.Of[Sm[Int] & Cns[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Sm[Int] & Cns[Int]]: + |No given instance of type deriving.Mirror.Of[Sm[Int] & Cns[Int]] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Sm[Int] & Cns[Int]]: | * type `Sm[Int] & Cns[Int]` is not a generic product because its subpart `Sm[Int] & Cns[Int]` is an intersection of unrelated definitions class Sm and class Cns. | * type `Sm[Int] & Cns[Int]` is not a generic sum because its subpart `Sm[Int] & Cns[Int]` is an intersection of unrelated definitions class Sm and class Cns. --- Error: tests/neg/mirror-synthesis-errors-b.scala:25:55 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:25:55 -------------------------------------------------- 25 |val testE = summon[Mirror.ProductOf[Sm[Int] & Nn.type]] // error: unreleated | ^ |No given instance of type deriving.Mirror.ProductOf[Sm[Int] & Nn.type] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.ProductOf[Sm[Int] & Nn.type]: type `Sm[Int] & Nn.type` is not a generic product because its subpart `Sm[Int] & Nn.type` is an intersection of unrelated definitions class Sm and object Nn. --- Error: tests/neg/mirror-synthesis-errors-b.scala:26:55 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:26:55 -------------------------------------------------- 26 |val testF = summon[Mirror.ProductOf[Nn.type & Sm[Int]]] // error: unreleated | ^ |No given instance of type deriving.Mirror.ProductOf[Nn.type & Sm[Int]] was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.ProductOf[Nn.type & Sm[Int]]: type `Nn.type & Sm[Int]` is not a generic product because its subpart `Nn.type & Sm[Int]` is an intersection of unrelated definitions object Nn and class Sm. --- Error: tests/neg/mirror-synthesis-errors-b.scala:27:54 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:27:54 -------------------------------------------------- 27 |val testG = summon[Mirror.Of[Foo.A.type & Foo.B.type]] // error: unreleated | ^ - |No given instance of type deriving.Mirror.Of[(Foo.A : Foo) & (Foo.B : Foo)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Foo.A : Foo) & (Foo.B : Foo)]: + |No given instance of type deriving.Mirror.Of[(Foo.A : Foo) & (Foo.B : Foo)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Foo.A : Foo) & (Foo.B : Foo)]: | * type `(Foo.A : Foo) & (Foo.B : Foo)` is not a generic product because its subpart `(Foo.A : Foo) & (Foo.B : Foo)` is an intersection of unrelated definitions value A and value B. | * type `(Foo.A : Foo) & (Foo.B : Foo)` is not a generic sum because its subpart `(Foo.A : Foo) & (Foo.B : Foo)` is an intersection of unrelated definitions value A and value B. --- Error: tests/neg/mirror-synthesis-errors-b.scala:28:54 -------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors-b.scala:28:54 -------------------------------------------------- 28 |val testH = summon[Mirror.Of[Foo.B.type & Foo.A.type]] // error: unreleated | ^ - |No given instance of type deriving.Mirror.Of[(Foo.B : Foo) & (Foo.A : Foo)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Foo.B : Foo) & (Foo.A : Foo)]: + |No given instance of type deriving.Mirror.Of[(Foo.B : Foo) & (Foo.A : Foo)] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[(Foo.B : Foo) & (Foo.A : Foo)]: | * type `(Foo.B : Foo) & (Foo.A : Foo)` is not a generic product because its subpart `(Foo.B : Foo) & (Foo.A : Foo)` is an intersection of unrelated definitions value B and value A. | * type `(Foo.B : Foo) & (Foo.A : Foo)` is not a generic sum because its subpart `(Foo.B : Foo) & (Foo.A : Foo)` is an intersection of unrelated definitions value B and value A. diff --git a/tests/neg/mirror-synthesis-errors.check b/tests/neg/mirror-synthesis-errors.check index d108c99280ae..da795e80bf51 100644 --- a/tests/neg/mirror-synthesis-errors.check +++ b/tests/neg/mirror-synthesis-errors.check @@ -1,42 +1,42 @@ --- Error: tests/neg/mirror-synthesis-errors.scala:21:32 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:21:32 ---------------------------------------------------- 21 |val testA = summon[Mirror.Of[A]] // error: Not a sealed trait | ^ - |No given instance of type deriving.Mirror.Of[A] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[A]: + |No given instance of type deriving.Mirror.Of[A] was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.Of[A]: | * trait A is not a generic product because it is not a case class | * trait A is not a generic sum because it is not a sealed trait --- Error: tests/neg/mirror-synthesis-errors.scala:22:32 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:22:32 ---------------------------------------------------- 22 |val testC = summon[Mirror.Of[C]] // error: Does not have subclasses | ^ - |No given instance of type deriving.Mirror.Of[C] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[C]: + |No given instance of type deriving.Mirror.Of[C] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[C]: | * trait C is not a generic product because it is not a case class | * trait C is not a generic sum because it does not have subclasses --- Error: tests/neg/mirror-synthesis-errors.scala:23:32 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:23:32 ---------------------------------------------------- 23 |val testD = summon[Mirror.Of[D]] // error: child SubD takes more than one parameter list | ^ - |No given instance of type deriving.Mirror.Of[D] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[D]: + |No given instance of type deriving.Mirror.Of[D] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[D]: | * class D is not a generic product because it is not a case class | * class D is not a generic sum because its child class SubD is not a generic product because it takes more than one parameter list --- Error: tests/neg/mirror-synthesis-errors.scala:24:38 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:24:38 ---------------------------------------------------- 24 |val testSubD = summon[Mirror.Of[SubD]] // error: takes more than one parameter list | ^ - |No given instance of type deriving.Mirror.Of[SubD] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[SubD]: + |No given instance of type deriving.Mirror.Of[SubD] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[SubD]: | * class SubD is not a generic product because it takes more than one parameter list | * class SubD is not a generic sum because it is not a sealed class --- Error: tests/neg/mirror-synthesis-errors.scala:25:32 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:25:32 ---------------------------------------------------- 25 |val testE = summon[Mirror.Of[E]] // error: Not an abstract class | ^ - |No given instance of type deriving.Mirror.Of[E] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[E]: + |No given instance of type deriving.Mirror.Of[E] was found for parameter x of method summon in object Predef. 
Failed to synthesize an instance of type deriving.Mirror.Of[E]: | * class E is not a generic product because it is not a case class | * class E is not a generic sum because it is not an abstract class --- Error: tests/neg/mirror-synthesis-errors.scala:26:32 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:26:32 ---------------------------------------------------- 26 |val testF = summon[Mirror.Of[F]] // error: No children | ^ - |No given instance of type deriving.Mirror.Of[F] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[F]: + |No given instance of type deriving.Mirror.Of[F] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[F]: | * trait F is not a generic product because it is not a case class | * trait F is not a generic sum because it does not have subclasses --- Error: tests/neg/mirror-synthesis-errors.scala:27:36 ---------------------------------------------------------------- +-- [E172] Type Error: tests/neg/mirror-synthesis-errors.scala:27:36 ---------------------------------------------------- 27 |val testG = summon[Mirror.Of[Foo.G]] // error: Has anonymous subclasses | ^ - |No given instance of type deriving.Mirror.Of[Foo.G] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Foo.G]: + |No given instance of type deriving.Mirror.Of[Foo.G] was found for parameter x of method summon in object Predef. Failed to synthesize an instance of type deriving.Mirror.Of[Foo.G]: | * trait G is not a generic product because it is not a case class | * trait G is not a generic sum because it has anonymous or inaccessible subclasses diff --git a/tests/neg/missing-implicit-2.check b/tests/neg/missing-implicit-2.check index e1994c4bf02d..10f0192d1459 100644 --- a/tests/neg/missing-implicit-2.check +++ b/tests/neg/missing-implicit-2.check @@ -1,4 +1,4 @@ --- Error: tests/neg/missing-implicit-2.scala:4:24 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit-2.scala:4:24 ---------------------------------------------------------- 4 |val f = Future[Unit] { } // error | ^ | Cannot find an implicit ExecutionContext. 
You might add diff --git a/tests/neg/missing-implicit1.check b/tests/neg/missing-implicit1.check index ccba4b0fa018..c94225aaf0a6 100644 --- a/tests/neg/missing-implicit1.check +++ b/tests/neg/missing-implicit1.check @@ -1,4 +1,4 @@ --- Error: tests/neg/missing-implicit1.scala:17:4 ----------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit1.scala:17:4 ----------------------------------------------------------- 17 | ff // error | ^ |No given instance of type testObjectInstance.Zip[Option] was found for parameter xs of method ff in object testObjectInstance @@ -16,7 +16,7 @@ | | import testObjectInstance.instances.traverseList | --- Error: tests/neg/missing-implicit1.scala:23:42 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit1.scala:23:42 ---------------------------------------------------------- 23 | List(1, 2, 3).traverse(x => Option(x)) // error | ^ |No given instance of type testObjectInstance.Zip[Option] was found for an implicit parameter of method traverse in trait Traverse diff --git a/tests/neg/missing-implicit2.check b/tests/neg/missing-implicit2.check index 705e052c0a43..103c098f5798 100644 --- a/tests/neg/missing-implicit2.check +++ b/tests/neg/missing-implicit2.check @@ -1,4 +1,4 @@ --- Error: tests/neg/missing-implicit2.scala:10:18 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit2.scala:10:18 ---------------------------------------------------------- 10 | f(using xFromY) // error | ^ | No given instance of type Y was found for parameter y of given instance xFromY @@ -7,7 +7,7 @@ | | import test.instances.y | --- Error: tests/neg/missing-implicit2.scala:16:5 ----------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit2.scala:16:5 ----------------------------------------------------------- 16 | f // error | ^ | No given instance of type X was found for parameter x of method f in object test diff --git a/tests/neg/missing-implicit3.check b/tests/neg/missing-implicit3.check index 3cf3b101f3ca..ab87bf99a32a 100644 --- a/tests/neg/missing-implicit3.check +++ b/tests/neg/missing-implicit3.check @@ -1,4 +1,4 @@ --- Error: tests/neg/missing-implicit3.scala:13:36 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit3.scala:13:36 ---------------------------------------------------------- 13 |val sortedFoos = sort(List(new Foo)) // error | ^ | No given instance of type ord.Ord[ord.Foo] was found for an implicit parameter of method sort in package ord. 
diff --git a/tests/neg/missing-implicit4.check b/tests/neg/missing-implicit4.check index 4cc8a2182b8d..e243c208ecdf 100644 --- a/tests/neg/missing-implicit4.check +++ b/tests/neg/missing-implicit4.check @@ -1,4 +1,4 @@ --- Error: tests/neg/missing-implicit4.scala:14:4 ----------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit4.scala:14:4 ----------------------------------------------------------- 14 | ff // error | ^ | No given instance of type Zip[Option] was found for parameter xs of method ff @@ -16,7 +16,7 @@ | | import instances.traverseList | --- Error: tests/neg/missing-implicit4.scala:20:42 ---------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/missing-implicit4.scala:20:42 ---------------------------------------------------------- 20 | List(1, 2, 3).traverse(x => Option(x)) // error | ^ | No given instance of type Zip[Option] was found for an implicit parameter of method traverse in trait Traverse diff --git a/tests/neg/missing-import.scala b/tests/neg/missing-import.scala new file mode 100644 index 000000000000..8af26030435a --- /dev/null +++ b/tests/neg/missing-import.scala @@ -0,0 +1,3 @@ +class annotation extends Annotation // error +val s: String = "str" +val regex: Regex = s.r // error diff --git a/tests/neg/noimports-additional.scala b/tests/neg/noimports-additional.scala new file mode 100644 index 000000000000..e726db5b9b0a --- /dev/null +++ b/tests/neg/noimports-additional.scala @@ -0,0 +1,4 @@ +// scalac: -Yno-imports -Yimports:scala.annotation,scala.util.matching +class annotation extends Annotation +val s: String = "str" // error +val regex: Regex = new Regex("str") diff --git a/tests/neg-custom-args/noimports.scala b/tests/neg/noimports.scala similarity index 70% rename from tests/neg-custom-args/noimports.scala rename to tests/neg/noimports.scala index 6cef8dee8843..720d111757cd 100644 --- a/tests/neg-custom-args/noimports.scala +++ b/tests/neg/noimports.scala @@ -1,3 +1,4 @@ +// scalac: -Yno-imports object Test { val t: Int = 1 // error: not found Int } diff --git a/tests/neg-custom-args/noimports2.scala b/tests/neg/noimports2.scala similarity index 74% rename from tests/neg-custom-args/noimports2.scala rename to tests/neg/noimports2.scala index b75f1361ddb9..deee773c35c6 100644 --- a/tests/neg-custom-args/noimports2.scala +++ b/tests/neg/noimports2.scala @@ -1,3 +1,4 @@ +// scalac: -Yno-imports object Test { assert("asdf" == "asdf") // error: not found assert } diff --git a/tests/neg/nopredef-additional.scala b/tests/neg/nopredef-additional.scala new file mode 100644 index 000000000000..0b6a71ca7c53 --- /dev/null +++ b/tests/neg/nopredef-additional.scala @@ -0,0 +1,4 @@ +// scalac: -Yno-predef -Yimports:java.lang,scala.annotation,scala.util.matching +class annotation extends Annotation +val s: String = "str" +val regex: Regex = s.r // error diff --git a/tests/neg/nopredef.scala b/tests/neg/nopredef.scala index 0a22e200805a..fa9a344772a6 100644 --- a/tests/neg/nopredef.scala +++ b/tests/neg/nopredef.scala @@ -1,5 +1,4 @@ -import Predef.{assert as _} - +// scalac: -Yno-predef object Test { assert("asdf" == "asdf") // error: not found assert } diff --git a/tests/neg/opaque-bounds-1.scala b/tests/neg/opaque-bounds-1.scala new file mode 100644 index 000000000000..e05cd56ae71c --- /dev/null +++ b/tests/neg/opaque-bounds-1.scala @@ -0,0 +1,13 @@ +abstract class Test { + opaque type FlagSet = Int + + opaque type Flag <: FlagSet = String // error: type String 
outside bounds <: Test.this.FlagSet + + object Flag { + def make(s: String): Flag = s + } + + val f: Flag = Flag.make("hello") + val g: FlagSet = f + +} \ No newline at end of file diff --git a/tests/neg/opaque-bounds.scala b/tests/neg/opaque-bounds.scala index 3eb03117e469..c39f184e2008 100644 --- a/tests/neg/opaque-bounds.scala +++ b/tests/neg/opaque-bounds.scala @@ -2,7 +2,7 @@ class Test { // error: class Test cannot be instantiated opaque type FlagSet = Int - opaque type Flag <: FlagSet = String // error: type String outside bounds <: Test.this.FlagSet + opaque type Flag <: FlagSet = String object Flag { def make(s: String): Flag = s diff --git a/tests/neg/parser-stability-9.scala b/tests/neg/parser-stability-9.scala index aaa77f216f37..932f6a15ad52 100644 --- a/tests/neg/parser-stability-9.scala +++ b/tests/neg/parser-stability-9.scala @@ -1,2 +1,2 @@ -import // error +import // error \ No newline at end of file diff --git a/tests/neg/repeatable/Test_1.scala b/tests/neg/repeatable/Test_1.scala index 3779b6ffa4a8..6466da95dfa8 100644 --- a/tests/neg/repeatable/Test_1.scala +++ b/tests/neg/repeatable/Test_1.scala @@ -6,11 +6,11 @@ import repeatable._ @FirstLevel_0(Array()) // error trait U -@FirstLevel_0(Array(Plain_0(4), Plain_0(5))) -@FirstLevel_0(Array(Plain_0(6), Plain_0(7))) +@FirstLevel_0(Array(new Plain_0(4), new Plain_0(5))) +@FirstLevel_0(Array(new Plain_0(6), new Plain_0(7))) @SecondLevel_0(Array()) // error trait T @SecondLevel_0(Array()) @SecondLevel_0(Array()) // error -trait S \ No newline at end of file +trait S diff --git a/tests/neg/safeThrowsStrawman.check b/tests/neg/safeThrowsStrawman.check index 6bf1ecdae513..0885404bbb76 100644 --- a/tests/neg/safeThrowsStrawman.check +++ b/tests/neg/safeThrowsStrawman.check @@ -1,4 +1,4 @@ --- Error: tests/neg/safeThrowsStrawman.scala:17:32 --------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/safeThrowsStrawman.scala:17:32 --------------------------------------------------------- 17 | if x then 1 else raise(Fail()) // error | ^ | The capability to throw exception scalax.Fail is missing. @@ -6,7 +6,7 @@ | - A using clause `(using CanThrow[scalax.Fail])` | - A raises clause in a result type such as `X raises scalax.Fail` | - an enclosing `try` that catches scalax.Fail --- Error: tests/neg/safeThrowsStrawman.scala:27:15 --------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/safeThrowsStrawman.scala:27:15 --------------------------------------------------------- 27 | println(bar) // error | ^ | The capability to throw exception Exception is missing. diff --git a/tests/neg/saferExceptions.check b/tests/neg/saferExceptions.check index 5f51ce08d6db..77859e940b2d 100644 --- a/tests/neg/saferExceptions.check +++ b/tests/neg/saferExceptions.check @@ -1,4 +1,4 @@ --- Error: tests/neg/saferExceptions.scala:12:16 ------------------------------------------------------------------------ +-- [E172] Type Error: tests/neg/saferExceptions.scala:12:16 ------------------------------------------------------------ 12 | case 4 => throw Exception() // error | ^^^^^^^^^^^^^^^^^ | The capability to throw exception Exception is missing. 
@@ -11,7 +11,7 @@ | | import unsafeExceptions.canThrowAny | --- Error: tests/neg/saferExceptions.scala:17:46 ------------------------------------------------------------------------ +-- [E172] Type Error: tests/neg/saferExceptions.scala:17:46 ------------------------------------------------------------ 17 | def baz(x: Int): Int throws Failure = bar(x) // error | ^ | The capability to throw exception java.io.IOException is missing. diff --git a/tests/neg/selfInheritance.scala b/tests/neg/selfInheritance.scala index 073316de008c..e8eb2bab5624 100644 --- a/tests/neg/selfInheritance.scala +++ b/tests/neg/selfInheritance.scala @@ -26,7 +26,3 @@ object Test { object M extends C // error: illegal inheritance: self type Test.M.type of object M$ does not conform to self type B of parent class C } - -trait X { self: Y => } // error: missing requirement: self type Y & X of trait X does not conform to self type Z of required trait Y -trait Y { self: Z => } -trait Z diff --git a/tests/neg/subtyping.check b/tests/neg/subtyping.check index 832ff6296c52..c0ae1c71e007 100644 --- a/tests/neg/subtyping.check +++ b/tests/neg/subtyping.check @@ -1,8 +1,8 @@ --- Error: tests/neg/subtyping.scala:8:27 ------------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/subtyping.scala:8:27 ------------------------------------------------------------------- 8 | implicitly[B#X <:< A#X] // error: no implicit argument | ^ | Cannot prove that B#X <:< A#X. --- Error: tests/neg/subtyping.scala:12:27 ------------------------------------------------------------------------------ +-- [E172] Type Error: tests/neg/subtyping.scala:12:27 ------------------------------------------------------------------ 12 | implicitly[a.T <:< a.U] // error: no implicit argument | ^ | Cannot prove that a.T <:< a.U. 
diff --git a/tests/neg/summon-function.check b/tests/neg/summon-function.check index 863d1429d33f..b6ff4feea047 100644 --- a/tests/neg/summon-function.check +++ b/tests/neg/summon-function.check @@ -1,4 +1,4 @@ --- Error: tests/neg/summon-function.scala:2:23 ------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/summon-function.scala:2:23 ------------------------------------------------------------- 2 | summon[Int => String] // error | ^ | No given instance of type Int => String was found for parameter x of method summon in object Predef diff --git a/tests/neg/summonInline.check b/tests/neg/summonInline.check index 6c3839266ce4..e317ed53f8e2 100644 --- a/tests/neg/summonInline.check +++ b/tests/neg/summonInline.check @@ -1,4 +1,4 @@ --- Error: tests/neg/summonInline.scala:19:32 --------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/summonInline.scala:19:32 --------------------------------------------------------------- 19 |val missing1 = summonInlineCheck(1) // error | ^^^^^^^^^^^^^^^^^^^^ | Missing One @@ -9,7 +9,7 @@ 15 | case 1 => summonInline[Missing1] | ^^^^^^^^^^^^^^^^^^^^^^ -------------------------------------------------------------------------------------------------------------------- --- Error: tests/neg/summonInline.scala:20:32 --------------------------------------------------------------------------- +-- [E172] Type Error: tests/neg/summonInline.scala:20:32 --------------------------------------------------------------- 20 |val missing2 = summonInlineCheck(2) // error | ^^^^^^^^^^^^^^^^^^^^ | Missing Two diff --git a/tests/neg/supertraits.scala b/tests/neg/supertraits.scala index 2fc79ca30f1d..6952c7640529 100644 --- a/tests/neg/supertraits.scala +++ b/tests/neg/supertraits.scala @@ -6,19 +6,20 @@ class C extends A, S val x = if ??? then B() else C() val x1: S = x // error -case object a -case object b +class Top +case object a extends Top +case object b extends Top val y = if ??? then a else b val y1: Product = y // error val y2: Serializable = y // error -enum Color { +enum Color extends Top { case Red, Green, Blue } -enum Nucleobase { +enum Nucleobase extends Top { case A, C, G, T } val z = if ??? then Color.Red else Nucleobase.G -val z1: reflect.Enum = z // error: Found: (z : Object) Required: reflect.Enum +val z1: reflect.Enum = z // error: Found: (z : Top) Required: reflect.Enum diff --git a/tests/neg/t5702-neg-bad-and-wild.check b/tests/neg/t5702-neg-bad-and-wild.check index f6d761a6726f..c461b76ea70b 100644 --- a/tests/neg/t5702-neg-bad-and-wild.check +++ b/tests/neg/t5702-neg-bad-and-wild.check @@ -56,6 +56,13 @@ | Recursive value $1$ needs type | | longer explanation available when compiling with `-explain` +-- Warning: tests/neg/t5702-neg-bad-and-wild.scala:13:22 --------------------------------------------------------------- +13 | case List(1, _*3:) => // error // error + | ^ + | Type ascriptions after patterns other than: + | * variable pattern, e.g. `case x: String =>` + | * number literal pattern, e.g. `case 10.5: Double =>` + | are no longer supported. Remove the type ascription or move it to a separate variable pattern. 
-- Warning: tests/neg/t5702-neg-bad-and-wild.scala:22:20 --------------------------------------------------------------- 22 | val K(x @ _*) = k | ^ diff --git a/tests/neg/transparent.scala b/tests/neg/transparent.scala index b4d89478b0ac..95899bfa0b33 100644 --- a/tests/neg/transparent.scala +++ b/tests/neg/transparent.scala @@ -1,7 +1,8 @@ transparent def foo = 1 // error transparent inline def bar = 2 // ok transparent inline val x = 2 // error -transparent class c // error +transparent class c // ok +transparent final class d // error transparent object y // error transparent trait t // ok transparent type T = c // error diff --git a/tests/neg-custom-args/nopredef.scala b/tests/neg/unimport-Predef-assert.scala similarity index 70% rename from tests/neg-custom-args/nopredef.scala rename to tests/neg/unimport-Predef-assert.scala index b75f1361ddb9..0a22e200805a 100644 --- a/tests/neg-custom-args/nopredef.scala +++ b/tests/neg/unimport-Predef-assert.scala @@ -1,3 +1,5 @@ +import Predef.{assert as _} + object Test { assert("asdf" == "asdf") // error: not found assert } diff --git a/tests/neg/union.scala b/tests/neg/union.scala index 0a702ab70058..c6fd42e6629e 100644 --- a/tests/neg/union.scala +++ b/tests/neg/union.scala @@ -11,8 +11,9 @@ object Test { } object O { - class A - class B + class Top + class A extends Top + class B extends Top def f[T](x: T, y: T): T = x val x: A = f(new A { }, new A) diff --git a/tests/neg/warn-value-discard.check b/tests/neg/warn-value-discard.check new file mode 100644 index 000000000000..ab6539dd5cd8 --- /dev/null +++ b/tests/neg/warn-value-discard.check @@ -0,0 +1,20 @@ +-- [E175] Potential Issue Error: tests/neg/warn-value-discard.scala:15:35 ---------------------------------------------- +15 | firstThing().map(_ => secondThing()) // error + | ^^^^^^^^^^^^^ + | discarded non-Unit value of type Either[Failed, Unit] +-- [E175] Potential Issue Error: tests/neg/warn-value-discard.scala:18:35 ---------------------------------------------- +18 | firstThing().map(_ => secondThing()) // error + | ^^^^^^^^^^^^^ + | discarded non-Unit value of type Either[Failed, Unit] +-- [E175] Potential Issue Error: tests/neg/warn-value-discard.scala:27:36 ---------------------------------------------- +27 | mutable.Set.empty[String].remove("") // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | discarded non-Unit value of type Boolean +-- [E175] Potential Issue Error: tests/neg/warn-value-discard.scala:39:41 ---------------------------------------------- +39 | mutable.Set.empty[String].subtractOne("") // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | discarded non-Unit value of type scala.collection.mutable.Set[String] +-- [E175] Potential Issue Error: tests/neg/warn-value-discard.scala:59:4 ----------------------------------------------- +59 | mutable.Set.empty[String] += "" // error + | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + | discarded non-Unit value of type scala.collection.mutable.Set[String] diff --git a/tests/neg/warn-value-discard.scala b/tests/neg/warn-value-discard.scala new file mode 100644 index 000000000000..149433395cc5 --- /dev/null +++ b/tests/neg/warn-value-discard.scala @@ -0,0 +1,66 @@ +// scalac: -Wvalue-discard -Werror + +import scala.util.{Either, Right, Left} +import scala.collection.mutable + +case class Failed(msg: String) + +def firstThing(): Either[Failed, Unit] = + Right(()) + +def secondThing(): Either[Failed, Unit] = + Left(Failed("whoops you should have flatMapped me")) + +def singleExpr(): Either[Failed, Unit] = + firstThing().map(_ => 
secondThing()) // error + +def block(): Either[Failed, Unit] = { + firstThing().map(_ => secondThing()) // error +} + +class ValueDiscardTest: + val field = mutable.Set.empty[String] + + def remove(): Unit = + // Set#remove returns a Boolean, not this.type + // --> Warning + mutable.Set.empty[String].remove("") // error + + // TODO IMHO we don't need to support this, + // as it's just as easy to add a @nowarn annotation as a Unit ascription + //def removeAscribed(): Unit = { + // mutable.Set.empty[String].remove(""): Unit // nowarn + //} + + def subtract(): Unit = + // - Set#subtractOne returns this.type + // - receiver is not a field or a local variable (not quite sure what you'd call it) + // --> Warning + mutable.Set.empty[String].subtractOne("") // error + + def mutateLocalVariable(): Unit = { + // - Set#subtractOne returns this.type + // - receiver is a local variable + // --> No warning + val s: mutable.Set[String] = mutable.Set.empty[String] + s.subtractOne("") + } + + def mutateField(): Unit = + // - Set#subtractOne returns this.type + // - receiver is a local variable + // --> No warning + field.subtractOne("") + + def assignmentOperator(): Unit = + // - += returns this.type + // - receiver is not a field or a local variable + // --> Warning + mutable.Set.empty[String] += "" // error + + def assignmentOperatorLocalVariable(): Unit = + // - += returns this.type + // - receiver is a local variable + // --> No warning + val s: mutable.Set[String] = mutable.Set.empty[String] + s += "" diff --git a/tests/neg/yimports-custom.check b/tests/neg/yimports-custom.check new file mode 100644 index 000000000000..6ed2eb8b1df3 --- /dev/null +++ b/tests/neg/yimports-custom.check @@ -0,0 +1,7 @@ + +-- [E006] Not Found Error: tests/neg/yimports-custom/C_2.scala:5:16 ---------------------------------------------------- +5 | def greet() = println("hello, world!") // error + | ^^^^^^^ + | Not found: println + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/yimports-custom/C_2.scala b/tests/neg/yimports-custom/C_2.scala new file mode 100644 index 000000000000..6ba25ad2963c --- /dev/null +++ b/tests/neg/yimports-custom/C_2.scala @@ -0,0 +1,6 @@ +// scalac: -Yimports:hello.world.minidef + +class C { + val v: Numb = Magic + def greet() = println("hello, world!") // error +} diff --git a/tests/neg/yimports-custom/minidef_1.scala b/tests/neg/yimports-custom/minidef_1.scala new file mode 100644 index 000000000000..5d18d0a39584 --- /dev/null +++ b/tests/neg/yimports-custom/minidef_1.scala @@ -0,0 +1,7 @@ + +package hello.world + +object minidef { + type Numb = Int + final val Magic = 42 +} diff --git a/tests/neg/yimports-nojava.check b/tests/neg/yimports-nojava.check new file mode 100644 index 000000000000..8aef6786ca21 --- /dev/null +++ b/tests/neg/yimports-nojava.check @@ -0,0 +1,12 @@ +-- [E006] Not Found Error: tests/neg/yimports-nojava.scala:5:16 -------------------------------------------------------- +5 | def g() = new Integer(42) // error + | ^^^^^^^ + | Not found: type Integer + | + | longer explanation available when compiling with `-explain` +-- [E006] Not Found Error: tests/neg/yimports-nojava.scala:6:16 -------------------------------------------------------- +6 | def sleep() = Thread.sleep(42000L) // error + | ^^^^^^ + | Not found: Thread + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/yimports-nojava.scala b/tests/neg/yimports-nojava.scala new file mode 100644 index 000000000000..35233e37a775 --- /dev/null +++ 
b/tests/neg/yimports-nojava.scala @@ -0,0 +1,7 @@ +// scalac: -Yimports:scala,scala.Predef + +trait T { + def f() = println("hello, world!") + def g() = new Integer(42) // error + def sleep() = Thread.sleep(42000L) // error +} diff --git a/tests/neg/yimports-nosuch.check b/tests/neg/yimports-nosuch.check new file mode 100644 index 000000000000..5a77d7f8d016 --- /dev/null +++ b/tests/neg/yimports-nosuch.check @@ -0,0 +1,2 @@ +error: bad preamble import skala +error: bad preamble import scala.Predeff diff --git a/tests/neg/yimports-nosuch.scala b/tests/neg/yimports-nosuch.scala new file mode 100644 index 000000000000..431daf39a180 --- /dev/null +++ b/tests/neg/yimports-nosuch.scala @@ -0,0 +1,5 @@ +// scalac: -Yimports:skala,scala.Predeff +// +class C +// nopos-error +// nopos-error diff --git a/tests/neg/yimports-order.check b/tests/neg/yimports-order.check new file mode 100644 index 000000000000..b49503f75e01 --- /dev/null +++ b/tests/neg/yimports-order.check @@ -0,0 +1,16 @@ +-- [E006] Not Found Error: tests/neg/yimports-order.scala:9:16 --------------------------------------------------------- +9 | def f() = Map("hello" -> "world") // error // error + | ^^^ + | Not found: Map + | + | longer explanation available when compiling with `-explain` +-- [E008] Not Found Error: tests/neg/yimports-order.scala:9:28 --------------------------------------------------------- +9 | def f() = Map("hello" -> "world") // error // error + | ^^^^^^^^^^ + | value -> is not a member of String +-- [E006] Not Found Error: tests/neg/yimports-order.scala:10:16 -------------------------------------------------------- +10 | def g() = println(f()) // error + | ^^^^^^^ + | Not found: println + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/yimports-order.scala b/tests/neg/yimports-order.scala new file mode 100644 index 000000000000..9cba91385b8a --- /dev/null +++ b/tests/neg/yimports-order.scala @@ -0,0 +1,13 @@ + +package top { + package middle { + class C { + def c() = println("hello, world") + } + import Predef.{Map => _} + object Test { + def f() = Map("hello" -> "world") // error // error + def g() = println(f()) // error + } + } +} diff --git a/tests/neg/yimports-predef.check b/tests/neg/yimports-predef.check new file mode 100644 index 000000000000..eb8881e04223 --- /dev/null +++ b/tests/neg/yimports-predef.check @@ -0,0 +1,4 @@ +-- [E008] Not Found Error: tests/neg/yimports-predef.scala:6:21 -------------------------------------------------------- +6 | def f[A](x: A) = x + 42 // error + | ^^^ + | value + is not a member of A diff --git a/tests/neg/yimports-predef.scala b/tests/neg/yimports-predef.scala new file mode 100644 index 000000000000..8bfe89b08cd8 --- /dev/null +++ b/tests/neg/yimports-predef.scala @@ -0,0 +1,7 @@ +// scalac: -Yimports:scala,scala.Predef +// +import Predef.{any2stringadd => _, _} + +class classic { + def f[A](x: A) = x + 42 // error +} diff --git a/tests/neg/yimports-stable.check b/tests/neg/yimports-stable.check new file mode 100644 index 000000000000..c5bfd914ae07 --- /dev/null +++ b/tests/neg/yimports-stable.check @@ -0,0 +1,14 @@ + +error: bad preamble import hello.world.potions +-- [E006] Not Found Error: tests/neg/yimports-stable/C_2.scala:4:9 ----------------------------------------------------- +4 | val v: Numb = magic // error // error + | ^^^^ + | Not found: type Numb + | + | longer explanation available when compiling with `-explain` +-- [E006] Not Found Error: tests/neg/yimports-stable/C_2.scala:4:16 
---------------------------------------------------- +4 | val v: Numb = magic // error // error + | ^^^^^ + | Not found: magic + | + | longer explanation available when compiling with `-explain` diff --git a/tests/neg/yimports-stable/C_2.scala b/tests/neg/yimports-stable/C_2.scala new file mode 100644 index 000000000000..0b97775f1a01 --- /dev/null +++ b/tests/neg/yimports-stable/C_2.scala @@ -0,0 +1,7 @@ +// scalac: -Yimports:scala,scala.Predef,hello.world.potions +// +class C { + val v: Numb = magic // error // error + def greet() = println("hello, world!") +} +// nopos-error diff --git a/tests/neg/yimports-stable/minidef_1.scala b/tests/neg/yimports-stable/minidef_1.scala new file mode 100644 index 000000000000..b3ea7445df24 --- /dev/null +++ b/tests/neg/yimports-stable/minidef_1.scala @@ -0,0 +1,11 @@ + +package hello + +trait stuff { + type Numb = Int + val magic = 42 +} + +object world { + val potions = new stuff {} +} diff --git a/tests/patmat/aliasing.check b/tests/patmat/aliasing.check index d7c21e8d0605..c367626d6f1e 100644 --- a/tests/patmat/aliasing.check +++ b/tests/patmat/aliasing.check @@ -1,3 +1,3 @@ 14: Pattern Match Exhaustivity: _: Trait & Test.Alias1, _: Clazz & Test.Alias1 19: Pattern Match Exhaustivity: _: Trait & Test.Alias2 -23: Pattern Match Exhaustivity: _: Trait & (Test.Alias2 & OpenTrait2){x: Int} +23: Pattern Match Exhaustivity: _: Trait & (Test.Alias2 & OpenTrait2){val x: Int} diff --git a/tests/patmat/andtype-refinedtype-interaction.check b/tests/patmat/andtype-refinedtype-interaction.check index 9f57c5ba4867..d9512b5cb3e4 100644 --- a/tests/patmat/andtype-refinedtype-interaction.check +++ b/tests/patmat/andtype-refinedtype-interaction.check @@ -1,9 +1,9 @@ -32: Pattern Match Exhaustivity: _: Trait & C1{x: Int} -48: Pattern Match Exhaustivity: _: Trait & (C1 | (C2 | T1)){x: Int} & (C3 | (C4 | T2)){x: Int}, _: Clazz & (C1 | (C2 | T1)){x: Int} & (C3 | (C4 | T2)){x: Int} -54: Pattern Match Exhaustivity: _: Trait & (C1 | (C2 | T1)){x: Int} & C3{x: Int} -59: Pattern Match Exhaustivity: _: Trait & (C1 & C2){x: Int} -65: Pattern Match Exhaustivity: _: Trait & (C1 | C2){x: Int} & (C3 | SubC1){x: Int} -72: Pattern Match Exhaustivity: _: Trait & (T1 & (C1 | SubC2)){x: Int} & (T2 & (C2 | C3 | SubC1)){x: Int} & - SubSubC1{x: Int} -79: Pattern Match Exhaustivity: _: Trait & (T1 & (C1 | SubC2)){x: Int} & (T2 & (C2 | C3 | SubC1)){x: Int} & - SubSubC2{x: Int} +32: Pattern Match Exhaustivity: _: Trait & C1{val x: Int} +48: Pattern Match Exhaustivity: _: Trait & (C1 | (C2 | T1)){val x: Int} & (C3 | (C4 | T2)){val x: Int}, _: Clazz & (C1 | (C2 | T1)){val x: Int} & (C3 | (C4 | T2)){val x: Int} +54: Pattern Match Exhaustivity: _: Trait & (C1 | (C2 | T1)){val x: Int} & C3{val x: Int} +59: Pattern Match Exhaustivity: _: Trait & (C1 & C2){val x: Int} +65: Pattern Match Exhaustivity: _: Trait & (C1 | C2){val x: Int} & (C3 | SubC1){val x: Int} +72: Pattern Match Exhaustivity: _: Trait & (T1 & (C1 | SubC2)){val x: Int} & (T2 & (C2 | C3 | SubC1)){val x: Int} & + SubSubC1{val x: Int} +79: Pattern Match Exhaustivity: _: Trait & (T1 & (C1 | SubC2)){val x: Int} & (T2 & (C2 | C3 | SubC1)){val x: Int} & + SubSubC2{val x: Int} diff --git a/tests/patmat/isSubspace-Typ-Prod.scala b/tests/patmat/isSubspace-Typ-Prod.scala new file mode 100644 index 000000000000..df17c99d67be --- /dev/null +++ b/tests/patmat/isSubspace-Typ-Prod.scala @@ -0,0 +1,7 @@ +case class Foo[T](x: T) +class Bar extends Foo[String]("") + +def test(x: Any) = x match + case Foo(1) => + case _: Bar => // used to warn about 
unreachable case + // case _: Foo[_] => // still warns, something else is wrong diff --git a/tests/pending/neg/yimports-custom-b.check b/tests/pending/neg/yimports-custom-b.check new file mode 100644 index 000000000000..d046a1d8f6cc --- /dev/null +++ b/tests/pending/neg/yimports-custom-b.check @@ -0,0 +1,10 @@ + +C_2.scala:8: error: not found: type Numb + val v: Numb = Answer + ^ +-- [E006] Not Found Error: tests/neg/yimports-custom-b/C_2.scala:9:16 -------------------------------------------------- +9 | def greet() = println("hello, world!") // error + | ^^^^^^^ + | Not found: println + | + | longer explanation available when compiling with `-explain` diff --git a/tests/pending/neg/yimports-custom-b/C_2.scala b/tests/pending/neg/yimports-custom-b/C_2.scala new file mode 100644 index 000000000000..8da798e80b0d --- /dev/null +++ b/tests/pending/neg/yimports-custom-b/C_2.scala @@ -0,0 +1,10 @@ +// scalac: -Yimports:hello.world.minidef + +import hello.{world => hw} +import hw.minidef.{Magic => Answer} + +// Finds the answer, but dumb to forget Numb +class C { + val v: Numb = Answer // error + def greet() = println("hello, world!") // error +} diff --git a/tests/pending/neg/yimports-custom-b/minidef_1.scala b/tests/pending/neg/yimports-custom-b/minidef_1.scala new file mode 100644 index 000000000000..befc137b6ab6 --- /dev/null +++ b/tests/pending/neg/yimports-custom-b/minidef_1.scala @@ -0,0 +1,8 @@ +// scalac: -Yimports:scala + +package hello.world + +object minidef { + type Numb = Int + final val Magic = 42 +} diff --git a/tests/pending/neg/yimports-masked.check b/tests/pending/neg/yimports-masked.check new file mode 100644 index 000000000000..ae715313392a --- /dev/null +++ b/tests/pending/neg/yimports-masked.check @@ -0,0 +1,10 @@ + +C_2.scala:11: error: not found: type Numb + val v: Numb = Answer + ^ +-- [E006] Not Found Error: tests/neg/yimports-masked/C_2.scala:12:18 --------------------------------------------------- +12 | def greet() = println("hello, world!") // error + | ^^^^^^^ + | Not found: println + | + | longer explanation available when compiling with `-explain` diff --git a/tests/pending/neg/yimports-masked/C_2.scala b/tests/pending/neg/yimports-masked/C_2.scala new file mode 100644 index 000000000000..1b6c736bad7b --- /dev/null +++ b/tests/pending/neg/yimports-masked/C_2.scala @@ -0,0 +1,14 @@ +// scalac: -Yimports:scala,hello.world.minidef + +// import at top level or top of package disables implicit import. +// the import can appear at any statement position, here, end of package. +// Update: with new trick, the import has to be completed before usages. 
+ +import hello.world.minidef.{Magic => Answer} + +package p { + class C { + val v: Numb = Answer // error + def greet() = println("hello, world!") // error + } +} diff --git a/tests/pending/neg/yimports-masked/minidef_1.scala b/tests/pending/neg/yimports-masked/minidef_1.scala new file mode 100644 index 000000000000..5d18d0a39584 --- /dev/null +++ b/tests/pending/neg/yimports-masked/minidef_1.scala @@ -0,0 +1,7 @@ + +package hello.world + +object minidef { + type Numb = Int + final val Magic = 42 +} diff --git a/tests/pending/pos/i16268.scala b/tests/pending/pos/i16268.scala new file mode 100644 index 000000000000..6b44e71a2247 --- /dev/null +++ b/tests/pending/pos/i16268.scala @@ -0,0 +1,25 @@ +import language.experimental.captureChecking +class Tree +case class Thicket(trees: List[Tree]) extends Tree + +def test1(segments: List[{*} Tree]) = + val elems = segments flatMap { (t: {*} Tree) => t match // error + case ts: Thicket => ts.trees.tail + case t => Nil + } + elems + +def test2(segments: List[{*} Tree]) = + val f = (t: {*} Tree) => t match + case ts: Thicket => ts.trees.tail + case t => Nil + val elems = segments.flatMap(f) // error + elems + +def test3(c: {*} Any)(segments: List[{c} Tree]) = + val elems = segments flatMap { (t: {c} Tree) => t match + case ts: Thicket => ts.trees.tail + case t => Nil + } + elems + diff --git a/tests/pending/run/i15893.scala b/tests/pending/run/i15893.scala index dedec2138f2a..d9cd2822e971 100644 --- a/tests/pending/run/i15893.scala +++ b/tests/pending/run/i15893.scala @@ -24,7 +24,7 @@ transparent inline def transparentInlineMod2(inline n: NatT): NatT = inline n m case Succ(Zero()) => Succ(Zero()) case Succ(Succ(predPredN)) => transparentInlineMod2(predPredN) */ -def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match // exhaustivity warning; unexpected +def dependentlyTypedMod2[N <: NatT](n: N): Mod2[N] = n match case Zero(): Zero => Zero() case Succ(Zero()): Succ[Zero] => Succ(Zero()) case Succ(Succ(predPredN)): Succ[Succ[_]] => dependentlyTypedMod2(predPredN) @@ -61,5 +61,5 @@ inline def transparentInlineFoo(inline n: NatT): NatT = inline transparentInline println(transparentInlineFoo(Succ(Succ(Succ(Zero()))))) // prints Zero(), as expected */ println(dependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // runtime error; unexpected -// println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected -// println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // doesn't compile; unexpected +// println(inlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected +// println(transparentInlineDependentlyTypedMod2(Succ(Succ(Succ(Zero()))))) // prints Succ(Zero()), as expected diff --git a/tests/pos-custom-args/captures/bynamefun.scala b/tests/pos-custom-args/captures/bynamefun.scala new file mode 100644 index 000000000000..86bad201ffc3 --- /dev/null +++ b/tests/pos-custom-args/captures/bynamefun.scala @@ -0,0 +1,11 @@ +object test: + class Plan(elem: Plan) + object SomePlan extends Plan(???) + def f1(expr: (-> Plan) -> Plan): Plan = expr(SomePlan) + f1 { onf => Plan(onf) } + def f2(expr: (=> Plan) -> Plan): Plan = ??? + f2 { onf => Plan(onf) } + def f3(expr: (-> Plan) => Plan): Plan = ??? + f1 { onf => Plan(onf) } + def f4(expr: (=> Plan) => Plan): Plan = ??? 
+ f2 { onf => Plan(onf) } diff --git a/tests/pos-custom-args/captures/capt-test.scala b/tests/pos-custom-args/captures/capt-test.scala index 6ee0d2a4d9f4..c61577e96eb1 100644 --- a/tests/pos-custom-args/captures/capt-test.scala +++ b/tests/pos-custom-args/captures/capt-test.scala @@ -21,6 +21,9 @@ def map[A, B](f: A => B)(xs: LIST[A]): LIST[B] = class C type Cap = {*} C +class Foo(x: Cap): + this: {x} Foo => + def test(c: Cap, d: Cap) = def f(x: Cap): Unit = if c == x then () def g(x: Cap): Unit = if d == x then () diff --git a/tests/pos-custom-args/captures/cmp-singleton-2.scala b/tests/pos-custom-args/captures/cmp-singleton-2.scala new file mode 100644 index 000000000000..daaa4add3858 --- /dev/null +++ b/tests/pos-custom-args/captures/cmp-singleton-2.scala @@ -0,0 +1,11 @@ +class T +class A extends T +class B extends T + +def test(tp: T) = + val mapping: Map[A, String] = ??? + + tp match + case a: A => mapping(a) match + case s: String => B() + case null => a diff --git a/tests/pos-custom-args/captures/cmp-singleton.scala b/tests/pos-custom-args/captures/cmp-singleton.scala new file mode 100644 index 000000000000..45b4009f5e89 --- /dev/null +++ b/tests/pos-custom-args/captures/cmp-singleton.scala @@ -0,0 +1,10 @@ +class Denotation +abstract class SingleDenotation extends Denotation +def goRefined: Denotation = + val foo: Denotation = ??? + val joint = foo + joint match + case joint: SingleDenotation => + joint + case _ => + joint \ No newline at end of file diff --git a/tests/pos-custom-args/captures/foreach.scala b/tests/pos-custom-args/captures/foreach.scala new file mode 100644 index 000000000000..b7dfc49272a9 --- /dev/null +++ b/tests/pos-custom-args/captures/foreach.scala @@ -0,0 +1,4 @@ +import caps.unsafe.* +def test = + val tasks = new collection.mutable.ArrayBuffer[() => Unit] + val _: Unit = tasks.foreach(((task: () => Unit) => task()).unsafeBoxFunArg) diff --git a/tests/pos-custom-args/captures/gadt-ycheck.scala b/tests/pos-custom-args/captures/gadt-ycheck.scala new file mode 100644 index 000000000000..946763b53e7e --- /dev/null +++ b/tests/pos-custom-args/captures/gadt-ycheck.scala @@ -0,0 +1,14 @@ +package test + +import reflect.ClassTag +import language.experimental.pureFunctions + +object Settings: + val OptionTag: ClassTag[Option[?]] = ClassTag(classOf[Option[?]]) + + class Setting[T: ClassTag](propertyClass: Option[Class[?]]): + def tryToSet() = + def update(value: Any): String = ??? + implicitly[ClassTag[T]] match + case OptionTag => + update(Some(propertyClass.get.getConstructor().newInstance())) diff --git a/tests/pos-custom-args/captures/i16116.scala b/tests/pos-custom-args/captures/i16116.scala new file mode 100644 index 000000000000..2f5d5304dca5 --- /dev/null +++ b/tests/pos-custom-args/captures/i16116.scala @@ -0,0 +1,39 @@ +package x + +import scala.annotation.* +import scala.concurrent.* + +trait CpsMonad[F[_]] { + type Context +} + +object CpsMonad { + type Aux[F[_],C] = CpsMonad[F] { type Context = C } + given CpsMonad[Future] with {} +} + +@experimental +object Test { + + @capability + class CpsTransform[F[_]] { + def await[T](ft: F[T]): { this } T = ??? + } + + transparent inline def cpsAsync[F[_]](using m:CpsMonad[F]) = + new Test.InfernAsyncArg + + class InfernAsyncArg[F[_],C](using am:CpsMonad.Aux[F,C]) { + def apply[A](expr: (CpsTransform[F], C) ?=> A): F[A] = ??? 
+ } + + def asyncPlus[F[_]](a:Int, b:F[Int])(using cps: CpsTransform[F]): { cps } Int = + a + (cps.await(b).asInstanceOf[Int]) + + def testExample1Future(): Unit = + val fr = cpsAsync[Future] { + val y = asyncPlus(1,Future successful 2).asInstanceOf[Int] + y+1 + } + +} diff --git a/tests/pos-custom-args/captures/i16226.scala b/tests/pos-custom-args/captures/i16226.scala new file mode 100644 index 000000000000..8edf3f54d739 --- /dev/null +++ b/tests/pos-custom-args/captures/i16226.scala @@ -0,0 +1,14 @@ +@annotation.capability class Cap + +class LazyRef[T](val elem: () => T): + val get: {elem} () -> T = elem + def map[U](f: T => U): {f, this} LazyRef[U] = + new LazyRef(() => f(elem())) + +def map[A, B](ref: {*} LazyRef[A], f: A => B): {f, ref} LazyRef[B] = + new LazyRef(() => f(ref.elem())) + +def main(io: Cap) = { + def mapd[A, B]: ({io} LazyRef[A], A => B) => {*} LazyRef[B] = + (ref1, f1) => map[A, B](ref1, f1) +} diff --git a/tests/pos-custom-args/captures/i16226a.scala b/tests/pos-custom-args/captures/i16226a.scala new file mode 100644 index 000000000000..444d7f2ed0d7 --- /dev/null +++ b/tests/pos-custom-args/captures/i16226a.scala @@ -0,0 +1,13 @@ +class Name +class TermName extends Name +class TypeName extends Name + +trait ParamInfo: + type ThisName <: Name + def variance: Long +object ParamInfo: + type Of[N <: Name] = ParamInfo { type ThisName = N } + +def test(tparams1: List[ParamInfo{ type ThisName = TypeName }], tparams2: List[ParamInfo.Of[TypeName]]) = + tparams1.lazyZip(tparams2).map((p1, p2) => p1.variance + p2.variance) + diff --git a/tests/pos-custom-args/captures/matchtypes.scala b/tests/pos-custom-args/captures/matchtypes.scala new file mode 100644 index 000000000000..b2442277f1f7 --- /dev/null +++ b/tests/pos-custom-args/captures/matchtypes.scala @@ -0,0 +1,10 @@ +type HEAD[X <: NonEmptyTuple] = X match { + case x *: (_ <: NonEmptyTuple) => x +} + +inline def head[A <: NonEmptyTuple](x: A): HEAD[A] = null.asInstanceOf[HEAD[A]] + +def show[A, T <: Tuple](x: A *: T) = + show1(head(x)) + show1(x.head) +def show1[A](x: A): String = ??? \ No newline at end of file diff --git a/tests/pos-custom-args/captures/override-adapt-box-pos-alt.scala b/tests/pos-custom-args/captures/override-adapt-box-pos-alt.scala new file mode 100644 index 000000000000..c7e4d38723d7 --- /dev/null +++ b/tests/pos-custom-args/captures/override-adapt-box-pos-alt.scala @@ -0,0 +1,17 @@ +import language.experimental.captureChecking + +class IO + +abstract class A[X] { + def foo(x: Unit): X + def bar(op: X => Int): Int +} + +class C + +def test(io: {*} IO) = { + class B extends A[{io} C] { // X =:= {io} C + def foo(x: Unit): {io} C = ??? + def bar(op: ({io} C) => Int): Int = 0 + } +} diff --git a/tests/pos-custom-args/captures/override-adapt-box-pos.scala b/tests/pos-custom-args/captures/override-adapt-box-pos.scala new file mode 100644 index 000000000000..7496a138070d --- /dev/null +++ b/tests/pos-custom-args/captures/override-adapt-box-pos.scala @@ -0,0 +1,19 @@ +import language.experimental.captureChecking + +class IO + +abstract class A[X, Y] { + def foo(x: Unit): X + def bar(x: Int, y: {} IO): X + def baz(x: Y): X +} + +class C + +def test(io: {*} IO) = { + class B extends A[{io} C, {} C] { // X =:= {io} C + override def foo(x: Unit): {io} C = ??? + override def bar(x: Int, y: {} IO): {io} C = ??? + override def baz(x: {} C): {io} C = ??? 
+ } +} diff --git a/tests/pos-custom-args/captures/overrides.scala b/tests/pos-custom-args/captures/overrides.scala index 66f19726ffa7..7e70afe7a327 100644 --- a/tests/pos-custom-args/captures/overrides.scala +++ b/tests/pos-custom-args/captures/overrides.scala @@ -12,15 +12,3 @@ class Bar extends Foo: class Baz extends Bar: override def foo = () => println("baz") override def bar = "baz" - //override def toString = bar - -abstract class Message: - protected def msg: String - override def toString = msg - -abstract class SyntaxMsg extends Message - -class CyclicInheritance extends SyntaxMsg: - def msg = "cyclic" - - diff --git a/tests/pos-custom-args/captures/overrides/A.scala b/tests/pos-custom-args/captures/overrides/A.scala new file mode 100644 index 000000000000..6a81f8562164 --- /dev/null +++ b/tests/pos-custom-args/captures/overrides/A.scala @@ -0,0 +1,4 @@ +abstract class Message: + lazy val message: String = ??? + def rawMessage = message + diff --git a/tests/pos-custom-args/captures/overrides/B.scala b/tests/pos-custom-args/captures/overrides/B.scala new file mode 100644 index 000000000000..ce4a3f20f1d2 --- /dev/null +++ b/tests/pos-custom-args/captures/overrides/B.scala @@ -0,0 +1,6 @@ + +abstract class SyntaxMsg extends Message + +class CyclicInheritance extends SyntaxMsg + + diff --git a/tests/pos-custom-args/captures/selftypes.scala b/tests/pos-custom-args/captures/selftypes.scala new file mode 100644 index 000000000000..c1b8eefce506 --- /dev/null +++ b/tests/pos-custom-args/captures/selftypes.scala @@ -0,0 +1,15 @@ + import annotation.constructorOnly + trait A: + self: A => + def foo: Int + + abstract class B extends A: + def foo: Int + + class C extends B: + def foo = 1 + def derived = this + + class D(@constructorOnly op: Int => Int) extends C: + val x = 1//op(1) + diff --git a/tests/pos-custom-args/captures/unsafe-unbox.scala b/tests/pos-custom-args/captures/unsafe-unbox.scala index e846a7db1b69..b228d8c07925 100644 --- a/tests/pos-custom-args/captures/unsafe-unbox.scala +++ b/tests/pos-custom-args/captures/unsafe-unbox.scala @@ -1,4 +1,4 @@ -import caps.* +import caps.unsafe.* def test = var finalizeActions = collection.mutable.ListBuffer[() => Unit]() val action = finalizeActions.remove(0).unsafeUnbox diff --git a/tests/pos-custom-args/captures/vars1.scala b/tests/pos-custom-args/captures/vars1.scala index 8c2f2cb8b5d5..c008bac2e72f 100644 --- a/tests/pos-custom-args/captures/vars1.scala +++ b/tests/pos-custom-args/captures/vars1.scala @@ -1,4 +1,4 @@ -import caps.* +import caps.unsafe.* object Test: type ErrorHandler = (Int, String) => Unit @@ -11,15 +11,11 @@ object Test: def defaultIncompleteHandler1(): ErrorHandler = ??? val defaultIncompleteHandler2: ErrorHandler = ??? 
- var incompleteHandler1: ErrorHandler = defaultIncompleteHandler1() - var incompleteHandler2: ErrorHandler = defaultIncompleteHandler2 - var incompleteHandler3: ErrorHandler = defaultIncompleteHandler1().unsafeBox - var incompleteHandler4: ErrorHandler = defaultIncompleteHandler2.unsafeBox - private var incompleteHandler5 = defaultIncompleteHandler1() - private var incompleteHandler6 = defaultIncompleteHandler2 + var incompleteHandler1: ErrorHandler = defaultIncompleteHandler1().unsafeBox + var incompleteHandler2: ErrorHandler = defaultIncompleteHandler2.unsafeBox private var incompleteHandler7 = defaultIncompleteHandler1().unsafeBox private var incompleteHandler8 = defaultIncompleteHandler2.unsafeBox - incompleteHandler1 = defaultIncompleteHandler2 + incompleteHandler1 = defaultIncompleteHandler2.unsafeBox incompleteHandler1 = defaultIncompleteHandler2.unsafeBox val saved = incompleteHandler1.unsafeUnbox diff --git a/tests/pos-custom-args/no-experimental/dotty-experimental.scala b/tests/pos-custom-args/no-experimental/dotty-experimental.scala index 74e79c85eaaa..320c68dbea50 100644 --- a/tests/pos-custom-args/no-experimental/dotty-experimental.scala +++ b/tests/pos-custom-args/no-experimental/dotty-experimental.scala @@ -1,6 +1,6 @@ package dotty.tools object test { - val x = caps.unsafeBox + val x = caps.unsafe.unsafeBox } diff --git a/tests/pos-custom-args/no-experimental/experimental-imports-empty.scala b/tests/pos-custom-args/no-experimental/experimental-imports-empty.scala index bb27629a6062..998086c5d9a4 100644 --- a/tests/pos-custom-args/no-experimental/experimental-imports-empty.scala +++ b/tests/pos-custom-args/no-experimental/experimental-imports-empty.scala @@ -1,5 +1,4 @@ import annotation.experimental -import language.experimental.fewerBraces import language.experimental.namedTypeArguments import language.experimental.genericNumberLiterals import language.experimental.erasedDefinitions diff --git a/tests/pos-macros/annot-in-object/Macro_1.scala b/tests/pos-macros/annot-in-object/Macro_1.scala new file mode 100644 index 000000000000..52c5daec1f29 --- /dev/null +++ b/tests/pos-macros/annot-in-object/Macro_1.scala @@ -0,0 +1,12 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +object Foo: + @experimental + class void extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) + + object Bar: + @experimental + class void extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = List(tree) diff --git a/tests/pos-macros/annot-in-object/Test_2.scala b/tests/pos-macros/annot-in-object/Test_2.scala new file mode 100644 index 000000000000..4fc43d4f2e41 --- /dev/null +++ b/tests/pos-macros/annot-in-object/Test_2.scala @@ -0,0 +1,3 @@ +@Foo.void +@Foo.Bar.void +def test = 0 diff --git a/tests/pos-macros/annot-suspend/Macro_1.scala b/tests/pos-macros/annot-suspend/Macro_1.scala new file mode 100644 index 000000000000..afbf05e568c7 --- /dev/null +++ b/tests/pos-macros/annot-suspend/Macro_1.scala @@ -0,0 +1,7 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class void extends MacroAnnotation: + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + List(tree) diff --git a/tests/pos-macros/annot-suspend/Test_2.scala b/tests/pos-macros/annot-suspend/Test_2.scala new file mode 100644 index 000000000000..ee8529fa4414 --- 
/dev/null +++ b/tests/pos-macros/annot-suspend/Test_2.scala @@ -0,0 +1,2 @@ +@void +def test = 0 diff --git a/tests/pos-macros/annot-then-inline/Macro_1.scala b/tests/pos-macros/annot-then-inline/Macro_1.scala new file mode 100644 index 000000000000..8e966be862cd --- /dev/null +++ b/tests/pos-macros/annot-then-inline/Macro_1.scala @@ -0,0 +1,16 @@ +import scala.annotation.{experimental, MacroAnnotation} +import scala.quoted._ + +@experimental +class useInlinedIdentity extends MacroAnnotation { + def transform(using Quotes)(tree: quotes.reflect.Definition): List[quotes.reflect.Definition] = + import quotes.reflect.* + tree match + case DefDef(name, params, tpt, Some(rhs)) => + val newRhs = + given Quotes = tree.symbol.asQuotes + '{ inlinedIdentity(${rhs.asExpr}) }.asTerm + List(DefDef.copy(tree)(name, params, tpt, Some(newRhs))) +} + +inline def inlinedIdentity(x: Any): x.type = x diff --git a/tests/pos-macros/annot-then-inline/Test_2.scala b/tests/pos-macros/annot-then-inline/Test_2.scala new file mode 100644 index 000000000000..3e72fcaaae1d --- /dev/null +++ b/tests/pos-macros/annot-then-inline/Test_2.scala @@ -0,0 +1,2 @@ +@useInlinedIdentity +def test = 0 diff --git a/tests/pos-macros/exprSummonWithTypeVar/Macro_1.scala b/tests/pos-macros/exprSummonWithTypeVar/Macro_1.scala new file mode 100644 index 000000000000..72bcbe8b6515 --- /dev/null +++ b/tests/pos-macros/exprSummonWithTypeVar/Macro_1.scala @@ -0,0 +1,13 @@ +import scala.compiletime.{erasedValue, summonFrom} + +import scala.quoted._ + +inline given summonAfterTypeMatch[T]: Any = + ${ summonAfterTypeMatchExpr[T] } + +private def summonAfterTypeMatchExpr[T: Type](using Quotes): Expr[Any] = + Expr.summon[Foo[T]].get + +trait Foo[T] + +given IntFoo[T <: Int]: Foo[T] = ??? diff --git a/tests/pos-macros/exprSummonWithTypeVar/Test_2.scala b/tests/pos-macros/exprSummonWithTypeVar/Test_2.scala new file mode 100644 index 000000000000..dbf2fd88fe24 --- /dev/null +++ b/tests/pos-macros/exprSummonWithTypeVar/Test_2.scala @@ -0,0 +1 @@ +def test: Unit = summonAfterTypeMatch[Int] diff --git a/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala b/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala new file mode 100644 index 000000000000..0d2df1504918 --- /dev/null +++ b/tests/pos-macros/hk-quoted-type-patterns/Macro_1.scala @@ -0,0 +1,17 @@ +import scala.quoted._ + +private def impl(x: Expr[Any])(using Quotes): Expr[Unit] = { + x match + case '{ foo[x] } => + assert(Type.show[x] == "scala.Int", Type.show[x]) + case '{ type f[X]; foo[`f`] } => + assert(Type.show[f] == "[A >: scala.Nothing <: scala.Any] => scala.collection.immutable.List[A]", Type.show[f]) + case '{ type f <: AnyKind; foo[`f`] } => + assert(Type.show[f] == "[K >: scala.Nothing <: scala.Any, V >: scala.Nothing <: scala.Any] => scala.collection.immutable.Map[K, V]", Type.show[f]) + case x => throw MatchError(x.show) + '{} +} + +inline def test(inline x: Any): Unit = ${ impl('x) } + +def foo[T <: AnyKind]: Any = ??? 
diff --git a/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala b/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala new file mode 100644 index 000000000000..3cb9113f2452 --- /dev/null +++ b/tests/pos-macros/hk-quoted-type-patterns/Test_2.scala @@ -0,0 +1,5 @@ +@main +def Test = + test(foo[Int]) + test(foo[List]) + test(foo[Map]) diff --git a/tests/pos-macros/i11211.scala b/tests/pos-macros/i11211.scala index 2650fa754193..154d8df174e7 100644 --- a/tests/pos-macros/i11211.scala +++ b/tests/pos-macros/i11211.scala @@ -12,7 +12,7 @@ def takeOptionImpl2[T](using Quotes, Type[T]): Unit = '{ def takeOptionImpl[T](o: Expr[Option[T]], default: Expr[T])(using Quotes, Type[T]): Expr[T] = '{ $o match { case Some(t1) => t1 - case None: Option[T] => $default + case None => $default } } diff --git a/tests/pos-macros/i15779/Macro_1.scala b/tests/pos-macros/i15779/Macro_1.scala new file mode 100644 index 000000000000..8bb98ab31553 --- /dev/null +++ b/tests/pos-macros/i15779/Macro_1.scala @@ -0,0 +1,30 @@ +import scala.quoted._ +import scala.deriving.Mirror + +trait Encoder[-A] + +trait PrimitiveEncoder[A] extends Encoder[A] + +given intOpt: PrimitiveEncoder[Option[Int]] with {} + +given primitiveNotNull[T](using e: Encoder[Option[T]]): PrimitiveEncoder[T] = + new PrimitiveEncoder[T] {} + +transparent inline given fromMirror[A]: Any = ${ fromMirrorImpl[A] } + +def fromMirrorImpl[A : Type](using q: Quotes): Expr[Any] = + Expr.summon[Mirror.Of[A]].get match + case '{ ${mirror}: Mirror.ProductOf[A] { type MirroredElemTypes = elementTypes } } => + val encoder = Type.of[elementTypes] match + case '[tpe *: EmptyTuple] => + Expr.summon[Encoder[tpe]].get + + encoder match + case '{ ${encoder}: Encoder[tpe] } => // ok + case _ => ??? + + encoder match + case '{ ${encoder}: Encoder[tpe] } => // ok + case _ => ??? + + encoder diff --git a/tests/pos-macros/i15779/Test_2.scala b/tests/pos-macros/i15779/Test_2.scala new file mode 100644 index 000000000000..c7223d849a86 --- /dev/null +++ b/tests/pos-macros/i15779/Test_2.scala @@ -0,0 +1,3 @@ +case class JustInt(i: Int) + +val x = fromMirror[JustInt] diff --git a/tests/pos-macros/i15985.scala b/tests/pos-macros/i15985.scala new file mode 100644 index 000000000000..cd8a726647f9 --- /dev/null +++ b/tests/pos-macros/i15985.scala @@ -0,0 +1,28 @@ +package anorm.macros +sealed trait Row +sealed trait SqlResult[A] + +import scala.quoted.{ Expr, Quotes, Type } + +private[anorm] object RowParserImpl { + def apply[A](using q:Quotes)(using a: Type[A]): Expr[Row => SqlResult[A]] = { + import q.reflect.* + + inline def f1: Expr[SqlResult[A]] = + Match(???, ???).asExprOf[SqlResult[A]] // (using Type.of[anorm.macros.SqlResult[A]] }) + + inline def f2: Expr[SqlResult[A]] = + Match(???, ???).asExprOf[SqlResult[A]](using Type.of[SqlResult[A]]) + // In Staging phase it becomes + // ..asExprOf[..](using Type.of[{ @SplicedType type a$_$3 = a.Underlying; anorm.macros.SqlResult[a$_$3] }]) + + inline def f3(using Type[SqlResult[A]]): Expr[SqlResult[A]] = + Match(???, ???).asExprOf[SqlResult[A]] + + f1 + f2 + f3 + + ??? + } +} diff --git a/tests/pos-macros/i16265.scala b/tests/pos-macros/i16265.scala new file mode 100644 index 000000000000..db75fbfa307c --- /dev/null +++ b/tests/pos-macros/i16265.scala @@ -0,0 +1,9 @@ +import scala.quoted.* + +class Foo(val value: Int) + +def foo(exprs: Expr[Any])(using Quotes): Any = + exprs match + case '{ $tuple: (Foo *: tail) } => + val x = '{ ${tuple}.head.value } + ??? 
diff --git a/tests/pos-macros/i16318/Macro_1.scala b/tests/pos-macros/i16318/Macro_1.scala new file mode 100644 index 000000000000..d66cebfd68b6 --- /dev/null +++ b/tests/pos-macros/i16318/Macro_1.scala @@ -0,0 +1,11 @@ +import scala.quoted.* + +final case class Record(a: String, b: Int) + +transparent inline def ann[T]: List[Any] = ${ annsImpl[T] } + +def annsImpl[T: Type](using Quotes): Expr[List[Any]] = { + import quotes.reflect.* + val annExpr = TypeRepr.of[T].typeSymbol.annotations.head.asExpr + '{ List($annExpr) } +} \ No newline at end of file diff --git a/tests/pos-macros/i16318/Test_2.scala b/tests/pos-macros/i16318/Test_2.scala new file mode 100644 index 000000000000..80eed17d26ba --- /dev/null +++ b/tests/pos-macros/i16318/Test_2.scala @@ -0,0 +1,2 @@ +def Test = + val a = ann[Record] \ No newline at end of file diff --git a/tests/pos-macros/i16420/Macro.scala b/tests/pos-macros/i16420/Macro.scala new file mode 100644 index 000000000000..1ea9406a0b9b --- /dev/null +++ b/tests/pos-macros/i16420/Macro.scala @@ -0,0 +1,21 @@ +import scala.quoted.{Expr, Quotes, Type} + +object Converter { + private def handleUnit[R](f: Expr[Int ?=> R])(using q: Quotes, rt: Type[R]): Expr[Unit] = '{} + + class UnitConverter[R] extends Converter[EmptyTuple, R, Int ?=> R] { + inline def convert(inline f: Int ?=> R): Unit = ${ handleUnit[R]('f) } + } + + inline given unitHandler[R]: UnitConverter[R] = new UnitConverter[R] +} + + +trait Converter[T <: Tuple, R, F] { + inline def convert(inline fn: F): Unit +} + +abstract class Directive[R <: Tuple] { + inline def apply[O, F](using inline c: Converter[R, O, F])(inline fn: F): Unit = + c.convert(fn) +} diff --git a/tests/pos-macros/i16420/Test.scala b/tests/pos-macros/i16420/Test.scala new file mode 100644 index 000000000000..f63cc62306af --- /dev/null +++ b/tests/pos-macros/i16420/Test.scala @@ -0,0 +1,8 @@ +object Meow extends App { + case class Meow(s: String, i: Int) + + val dir: Directive[EmptyTuple] = ??? 
+ dir { + Meow("asd", 123) + } +} diff --git a/tests/pos-macros/i16636/Macro_1.scala b/tests/pos-macros/i16636/Macro_1.scala new file mode 100644 index 000000000000..78a3f6ef7b9b --- /dev/null +++ b/tests/pos-macros/i16636/Macro_1.scala @@ -0,0 +1,30 @@ +import scala.quoted.* + +trait ReproTransformer[A, B] { + def transform(from: A): B +} + +object ReproTransformer { + final class Identity[A, B >: A] extends ReproTransformer[A, B] { + def transform(from: A): B = from + } + + given identity[A, B >: A]: Identity[A, B] = Identity[A, B] + + inline def getTransformer[A, B]: ReproTransformer[A, B] = ${ getTransformerMacro[A, B] } + + def getTransformerMacro[A, B](using quotes: Quotes, A: Type[A], B: Type[B]) = { + import quotes.reflect.* + + val transformer = (A -> B) match { + case '[a] -> '[b] => + val summoned = Expr.summon[ReproTransformer[a, b]].get +// ----------- INTERESTING STUFF STARTS HERE + summoned match { + case '{ $t: ReproTransformer[src, dest] } => t + } +// ----------- INTERESTING STUFF ENDS HERE + } + transformer.asExprOf[ReproTransformer[A, B]] + } +} diff --git a/tests/pos-macros/i16636/Test_2.scala b/tests/pos-macros/i16636/Test_2.scala new file mode 100644 index 000000000000..eb8891ea7bf8 --- /dev/null +++ b/tests/pos-macros/i16636/Test_2.scala @@ -0,0 +1,9 @@ +object A { + case class AnotherCaseClass(name: String) + + val errorsOut1 = ReproTransformer.getTransformer[A.AnotherCaseClass, AnotherCaseClass] + val errorsOu2 = ReproTransformer.getTransformer[AnotherCaseClass, A.AnotherCaseClass] + + val works1 = ReproTransformer.getTransformer[A.AnotherCaseClass, A.AnotherCaseClass] + val works2 = ReproTransformer.getTransformer[AnotherCaseClass, AnotherCaseClass] +} diff --git a/tests/pos-macros/i8577a/Macro_1.scala b/tests/pos-macros/i8577a/Macro_1.scala new file mode 100644 index 000000000000..3831f060f918 --- /dev/null +++ b/tests/pos-macros/i8577a/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply(sc: Expr[Macro.StrCtx], input: Expr[Int])(using Quotes): Expr[Option[Seq[Int]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577a/Main_2.scala b/tests/pos-macros/i8577a/Main_2.scala new file mode 100644 index 000000000000..5a0f6b609f81 --- /dev/null +++ b/tests/pos-macros/i8577a/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension (inline ctx: Macro.StrCtx) inline def unapplySeq(inline input: Int): Option[Seq[Int]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577b/Macro_1.scala b/tests/pos-macros/i8577b/Macro_1.scala new file mode 100644 index 000000000000..464d9894fa1c --- /dev/null +++ b/tests/pos-macros/i8577b/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[U](sc: Expr[Macro.StrCtx], input: Expr[U])(using Type[U])(using Quotes): Expr[Option[Seq[U]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577b/Main_2.scala b/tests/pos-macros/i8577b/Main_2.scala new file mode 100644 index 000000000000..789e572bd5aa --- /dev/null +++ b/tests/pos-macros/i8577b/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: 
Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: U): Option[Seq[U]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577c/Macro_1.scala b/tests/pos-macros/i8577c/Macro_1.scala new file mode 100644 index 000000000000..45986b34d48d --- /dev/null +++ b/tests/pos-macros/i8577c/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T](sc: Expr[Macro.StrCtx], input: Expr[T])(using Type[T])(using Quotes): Expr[Option[Seq[T]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577c/Main_2.scala b/tests/pos-macros/i8577c/Main_2.scala new file mode 100644 index 000000000000..4f42c7635ec5 --- /dev/null +++ b/tests/pos-macros/i8577c/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq(inline input: T): Option[Seq[T]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577d/Macro_1.scala b/tests/pos-macros/i8577d/Macro_1.scala new file mode 100644 index 000000000000..45986b34d48d --- /dev/null +++ b/tests/pos-macros/i8577d/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T](sc: Expr[Macro.StrCtx], input: Expr[T])(using Type[T])(using Quotes): Expr[Option[Seq[T]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577d/Main_2.scala b/tests/pos-macros/i8577d/Main_2.scala new file mode 100644 index 000000000000..a87f06503b31 --- /dev/null +++ b/tests/pos-macros/i8577d/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: T): Option[Seq[T]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577e/Macro_1.scala b/tests/pos-macros/i8577e/Macro_1.scala new file mode 100644 index 000000000000..cf133d33a100 --- /dev/null +++ b/tests/pos-macros/i8577e/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T, U](sc: Expr[Macro.StrCtx], input: Expr[U])(using Type[U])(using Quotes): Expr[Option[Seq[U]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577e/Main_2.scala b/tests/pos-macros/i8577e/Main_2.scala new file mode 100644 index 000000000000..598d18d2faec --- /dev/null +++ b/tests/pos-macros/i8577e/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: U): Option[Seq[U]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577f/Macro_1.scala b/tests/pos-macros/i8577f/Macro_1.scala new file mode 100644 index 
000000000000..7d3b5df28701 --- /dev/null +++ b/tests/pos-macros/i8577f/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T, U](sc: Expr[Macro.StrCtx], input: Expr[(T, U)])(using Type[T], Type[U])(using Quotes): Expr[Option[Seq[(T, U)]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577f/Main_2.scala b/tests/pos-macros/i8577f/Main_2.scala new file mode 100644 index 000000000000..fd1bb3e6186f --- /dev/null +++ b/tests/pos-macros/i8577f/Main_2.scala @@ -0,0 +1,12 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: (T, U)): Option[Seq[(T, U)]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = (1, 2) + assert(x == (1, 2)) + + val mac"$y" = (1, "a") + assert(y == (1, "a")) diff --git a/tests/pos-macros/i8577g/Macro_1.scala b/tests/pos-macros/i8577g/Macro_1.scala new file mode 100644 index 000000000000..2da12d6e23fd --- /dev/null +++ b/tests/pos-macros/i8577g/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T, U](sc: Expr[Macro.StrCtx], input: Expr[T | U])(using Type[T], Type[U])(using Quotes): Expr[Option[Seq[T | U]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577g/Main_2.scala b/tests/pos-macros/i8577g/Main_2.scala new file mode 100644 index 000000000000..4998b9962802 --- /dev/null +++ b/tests/pos-macros/i8577g/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: T | U): Option[Seq[T | U]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8577h/Macro_1.scala b/tests/pos-macros/i8577h/Macro_1.scala new file mode 100644 index 000000000000..2da12d6e23fd --- /dev/null +++ b/tests/pos-macros/i8577h/Macro_1.scala @@ -0,0 +1,11 @@ +package i8577 + +import scala.quoted._ + +object Macro: + opaque type StrCtx = StringContext + def apply(ctx: StringContext): StrCtx = ctx + def unapply(ctx: StrCtx): Option[StringContext] = Some(ctx) + +def implUnapply[T, U](sc: Expr[Macro.StrCtx], input: Expr[T | U])(using Type[T], Type[U])(using Quotes): Expr[Option[Seq[T | U]]] = + '{ Some(Seq(${input})) } diff --git a/tests/pos-macros/i8577h/Main_2.scala b/tests/pos-macros/i8577h/Main_2.scala new file mode 100644 index 000000000000..9fe2565a0ec3 --- /dev/null +++ b/tests/pos-macros/i8577h/Main_2.scala @@ -0,0 +1,9 @@ +package i8577 + +def main: Unit = + extension (ctx: StringContext) def mac: Macro.StrCtx = Macro(ctx) + extension [T] (inline ctx: Macro.StrCtx) inline def unapplySeq[U](inline input: U | T): Option[Seq[T | U]] = + ${ implUnapply('ctx, 'input) } + + val mac"$x" = 1 + assert(x == 1) diff --git a/tests/pos-macros/i8858/Macro_1.scala b/tests/pos-macros/i8858/Macro_1.scala index 8eb0182c2779..d1647b3dbba6 100644 --- a/tests/pos-macros/i8858/Macro_1.scala +++ b/tests/pos-macros/i8858/Macro_1.scala @@ -5,5 +5,5 @@ def mcrImpl(expr: Expr[Any])(using Quotes): Expr[Any] = import quotes.reflect._ expr.asTerm match case Inlined(_, _, id1) 
=> - println(id1.tpe.widen.show) + id1.tpe.widen.show '{()} diff --git a/tests/pos-macros/i9684/Macro_1.scala b/tests/pos-macros/i9684/Macro_1.scala index 7b47efefdfd8..2fef3ac99817 100644 --- a/tests/pos-macros/i9684/Macro_1.scala +++ b/tests/pos-macros/i9684/Macro_1.scala @@ -9,7 +9,6 @@ object X { def printTypeImpl[A:Type](x:Expr[A])(using Quotes): Expr[String] = { import quotes.reflect._ val value: String = x.asTerm.tpe.show - println(value) Expr( value ) } diff --git a/tests/pos-macros/macro-inline-by-name-cast/Macro_1.scala b/tests/pos-macros/macro-inline-by-name-cast/Macro_1.scala new file mode 100644 index 000000000000..7d9e186ed94f --- /dev/null +++ b/tests/pos-macros/macro-inline-by-name-cast/Macro_1.scala @@ -0,0 +1,7 @@ +import scala.quoted.* + +inline def f[T](inline code: =>T): Any = + ${ create[T]('{ () => code }) } + +def create[T: Type](code: Expr[() => T])(using Quotes): Expr[Any] = + '{ identity($code) } diff --git a/tests/pos-macros/macro-inline-by-name-cast/Test_2.scala b/tests/pos-macros/macro-inline-by-name-cast/Test_2.scala new file mode 100644 index 000000000000..161f58748342 --- /dev/null +++ b/tests/pos-macros/macro-inline-by-name-cast/Test_2.scala @@ -0,0 +1 @@ +def test: Unit = f[Unit](???) diff --git a/tests/pos-special/fatal-warnings/i10994.scala b/tests/pos-special/fatal-warnings/i10994.scala deleted file mode 100644 index 99ae647466b1..000000000000 --- a/tests/pos-special/fatal-warnings/i10994.scala +++ /dev/null @@ -1,2 +0,0 @@ -def foo = true match - case (b: Boolean): Boolean => () diff --git a/tests/pos-special/fatal-warnings/i16649-irrefutable.scala b/tests/pos-special/fatal-warnings/i16649-irrefutable.scala new file mode 100644 index 000000000000..b9aa6d2acf52 --- /dev/null +++ b/tests/pos-special/fatal-warnings/i16649-irrefutable.scala @@ -0,0 +1,7 @@ +import quoted.* + +def foo(using Quotes)(x: Expr[Int]) = + val '{ $y } = x + val '{ $a: Any } = x + val '{ $b: Int } = x + val '[List[Int]] = Type.of[List[Int]] diff --git a/tests/pos-special/fatal-warnings/i17314.scala b/tests/pos-special/fatal-warnings/i17314.scala new file mode 100644 index 000000000000..23f988741bed --- /dev/null +++ b/tests/pos-special/fatal-warnings/i17314.scala @@ -0,0 +1,33 @@ +// scalac: "-Wunused:all" + +import java.net.URI + +object circelike { + import scala.compiletime.summonInline + import scala.deriving.Mirror + + type Codec[T] + type Configuration + trait ConfiguredCodec[T] + object ConfiguredCodec: + inline final def derived[A](using conf: Configuration)(using + inline mirror: Mirror.Of[A] + ): ConfiguredCodec[A] = + new ConfiguredCodec[A]: + val codec = summonInline[Codec[URI]] // simplification +} + +object foo { + import circelike.{Codec, Configuration} + + given Configuration = ??? + given Codec[URI] = ??? 
+} + +object bar { + import circelike.Codec + import circelike.{Configuration, ConfiguredCodec} + import foo.{given Configuration, given Codec[URI]} + + case class Operator(url: URI) derives ConfiguredCodec +} diff --git a/tests/pos-special/fatal-warnings/i17314a.scala b/tests/pos-special/fatal-warnings/i17314a.scala new file mode 100644 index 000000000000..468b956fb04c --- /dev/null +++ b/tests/pos-special/fatal-warnings/i17314a.scala @@ -0,0 +1,12 @@ +// scalac: -Wunused:all + +package foo: + class Foo[T] + given Foo[Int] = new Foo[Int] + + +package bar: + import foo.{given foo.Foo[Int]} + import foo.Foo + + val repro: Foo[Int] = summon[Foo[Int]] diff --git a/tests/pos-with-compiler-cc/backend/ScalaPrimitivesOps.scala b/tests/pos-with-compiler-cc/backend/ScalaPrimitivesOps.scala new file mode 100644 index 000000000000..6b5bfbc3e00e --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/ScalaPrimitivesOps.scala @@ -0,0 +1,232 @@ +package dotty.tools +package backend + +object ScalaPrimitivesOps extends ScalaPrimitivesOps + +class ScalaPrimitivesOps { + // Arithmetic unary operations + inline val POS = 1 // +x + inline val NEG = 2 // -x + inline val NOT = 3 // ~x + + // Arithmetic binary operations + inline val ADD = 10 // x + y + inline val SUB = 11 // x - y + inline val MUL = 12 // x * y + inline val DIV = 13 // x / y + inline val MOD = 14 // x % y + + // Bitwise operations + inline val OR = 20 // x | y + inline val XOR = 21 // x ^ y + inline val AND = 22 // x & y + + // Shift operations + inline val LSL = 30 // x << y + inline val LSR = 31 // x >>> y + inline val ASR = 32 // x >> y + + // Comparison operations + inline val ID = 40 // x eq y + inline val NI = 41 // x ne y + inline val EQ = 42 // x == y + inline val NE = 43 // x != y + inline val LT = 44 // x < y + inline val LE = 45 // x <= y + inline val GT = 46 // x > y + inline val GE = 47 // x >= y + + // Boolean unary operations + inline val ZNOT = 50 // !x + + // Boolean binary operations + inline val ZOR = 60 // x || y + inline val ZAND = 61 // x && y + + // Array operations + inline val LENGTH = 70 // x.length + inline val APPLY = 71 // x(y) + inline val UPDATE = 72 // x(y) = z + + // Any operations + inline val IS = 80 // x.is[y] + inline val AS = 81 // x.as[y] + inline val HASH = 87 // x.## + + // AnyRef operations + inline val SYNCHRONIZED = 90 // x.synchronized(y) + + // String operations + inline val CONCAT = 100 // String.valueOf(x)+String.valueOf(y) + + // coercions + inline val COERCE = 101 + + // RunTime operations + inline val BOX = 110 // RunTime.box_(x) + inline val UNBOX = 111 // RunTime.unbox_(x) + inline val NEW_ZARRAY = 112 // RunTime.zarray(x) + inline val NEW_BARRAY = 113 // RunTime.barray(x) + inline val NEW_SARRAY = 114 // RunTime.sarray(x) + inline val NEW_CARRAY = 115 // RunTime.carray(x) + inline val NEW_IARRAY = 116 // RunTime.iarray(x) + inline val NEW_LARRAY = 117 // RunTime.larray(x) + inline val NEW_FARRAY = 118 // RunTime.farray(x) + inline val NEW_DARRAY = 119 // RunTime.darray(x) + inline val NEW_OARRAY = 120 // RunTime.oarray(x) + + inline val ZARRAY_LENGTH = 131 // RunTime.zarray_length(x) + inline val BARRAY_LENGTH = 132 // RunTime.barray_length(x) + inline val SARRAY_LENGTH = 133 // RunTime.sarray_length(x) + inline val CARRAY_LENGTH = 134 // RunTime.carray_length(x) + inline val IARRAY_LENGTH = 135 // RunTime.iarray_length(x) + inline val LARRAY_LENGTH = 136 // RunTime.larray_length(x) + inline val FARRAY_LENGTH = 137 // RunTime.farray_length(x) + inline val DARRAY_LENGTH = 138 // 
RunTime.darray_length(x) + inline val OARRAY_LENGTH = 139 // RunTime.oarray_length(x) + + inline val ZARRAY_GET = 140 // RunTime.zarray_get(x,y) + inline val BARRAY_GET = 141 // RunTime.barray_get(x,y) + inline val SARRAY_GET = 142 // RunTime.sarray_get(x,y) + inline val CARRAY_GET = 143 // RunTime.carray_get(x,y) + inline val IARRAY_GET = 144 // RunTime.iarray_get(x,y) + inline val LARRAY_GET = 145 // RunTime.larray_get(x,y) + inline val FARRAY_GET = 146 // RunTime.farray_get(x,y) + inline val DARRAY_GET = 147 // RunTime.darray_get(x,y) + inline val OARRAY_GET = 148 // RunTime.oarray_get(x,y) + + inline val ZARRAY_SET = 150 // RunTime.zarray(x,y,z) + inline val BARRAY_SET = 151 // RunTime.barray(x,y,z) + inline val SARRAY_SET = 152 // RunTime.sarray(x,y,z) + inline val CARRAY_SET = 153 // RunTime.carray(x,y,z) + inline val IARRAY_SET = 154 // RunTime.iarray(x,y,z) + inline val LARRAY_SET = 155 // RunTime.larray(x,y,z) + inline val FARRAY_SET = 156 // RunTime.farray(x,y,z) + inline val DARRAY_SET = 157 // RunTime.darray(x,y,z) + inline val OARRAY_SET = 158 // RunTime.oarray(x,y,z) + + inline val B2B = 200 // RunTime.b2b(x) + inline val B2S = 201 // RunTime.b2s(x) + inline val B2C = 202 // RunTime.b2c(x) + inline val B2I = 203 // RunTime.b2i(x) + inline val B2L = 204 // RunTime.b2l(x) + inline val B2F = 205 // RunTime.b2f(x) + inline val B2D = 206 // RunTime.b2d(x) + + inline val S2B = 210 // RunTime.s2b(x) + inline val S2S = 211 // RunTime.s2s(x) + inline val S2C = 212 // RunTime.s2c(x) + inline val S2I = 213 // RunTime.s2i(x) + inline val S2L = 214 // RunTime.s2l(x) + inline val S2F = 215 // RunTime.s2f(x) + inline val S2D = 216 // RunTime.s2d(x) + + inline val C2B = 220 // RunTime.c2b(x) + inline val C2S = 221 // RunTime.c2s(x) + inline val C2C = 222 // RunTime.c2c(x) + inline val C2I = 223 // RunTime.c2i(x) + inline val C2L = 224 // RunTime.c2l(x) + inline val C2F = 225 // RunTime.c2f(x) + inline val C2D = 226 // RunTime.c2d(x) + + inline val I2B = 230 // RunTime.i2b(x) + inline val I2S = 231 // RunTime.i2s(x) + inline val I2C = 232 // RunTime.i2c(x) + inline val I2I = 233 // RunTime.i2i(x) + inline val I2L = 234 // RunTime.i2l(x) + inline val I2F = 235 // RunTime.i2f(x) + inline val I2D = 236 // RunTime.i2d(x) + + inline val L2B = 240 // RunTime.l2b(x) + inline val L2S = 241 // RunTime.l2s(x) + inline val L2C = 242 // RunTime.l2c(x) + inline val L2I = 243 // RunTime.l2i(x) + inline val L2L = 244 // RunTime.l2l(x) + inline val L2F = 245 // RunTime.l2f(x) + inline val L2D = 246 // RunTime.l2d(x) + + inline val F2B = 250 // RunTime.f2b(x) + inline val F2S = 251 // RunTime.f2s(x) + inline val F2C = 252 // RunTime.f2c(x) + inline val F2I = 253 // RunTime.f2i(x) + inline val F2L = 254 // RunTime.f2l(x) + inline val F2F = 255 // RunTime.f2f(x) + inline val F2D = 256 // RunTime.f2d(x) + + inline val D2B = 260 // RunTime.d2b(x) + inline val D2S = 261 // RunTime.d2s(x) + inline val D2C = 262 // RunTime.d2c(x) + inline val D2I = 263 // RunTime.d2i(x) + inline val D2L = 264 // RunTime.d2l(x) + inline val D2F = 265 // RunTime.d2f(x) + inline val D2D = 266 // RunTime.d2d(x) + + /** Check whether the given operation code is an array operation. 
*/ + def isArrayOp(code: Int): Boolean = + isArrayNew(code) | isArrayLength(code) | isArrayGet(code) | isArraySet(code) + + def isArrayNew(code: Int): Boolean = code match { + case NEW_ZARRAY | NEW_BARRAY | NEW_SARRAY | NEW_CARRAY | + NEW_IARRAY | NEW_LARRAY | NEW_FARRAY | NEW_DARRAY | + NEW_OARRAY => true + case _ => false + } + + def isArrayLength(code: Int): Boolean = code match { + case ZARRAY_LENGTH | BARRAY_LENGTH | SARRAY_LENGTH | CARRAY_LENGTH | + IARRAY_LENGTH | LARRAY_LENGTH | FARRAY_LENGTH | DARRAY_LENGTH | + OARRAY_LENGTH | LENGTH => true + case _ => false + } + + def isArrayGet(code: Int): Boolean = code match { + case ZARRAY_GET | BARRAY_GET | SARRAY_GET | CARRAY_GET | + IARRAY_GET | LARRAY_GET | FARRAY_GET | DARRAY_GET | + OARRAY_GET | APPLY => true + case _ => false + } + + def isArraySet(code: Int): Boolean = code match { + case ZARRAY_SET | BARRAY_SET | SARRAY_SET | CARRAY_SET | + IARRAY_SET | LARRAY_SET | FARRAY_SET | DARRAY_SET | + OARRAY_SET | UPDATE => true + case _ => false + } + + /** Check whether the given code is a comparison operator */ + def isComparisonOp(code: Int): Boolean = code match { + case ID | NI | EQ | NE | + LT | LE | GT | GE => true + + case _ => false + } + def isUniversalEqualityOp(code: Int): Boolean = (code == EQ) || (code == NE) + def isReferenceEqualityOp(code: Int): Boolean = (code == ID) || (code == NI) + + def isArithmeticOp(code: Int): Boolean = code match { + case POS | NEG | NOT => true; // unary + case ADD | SUB | MUL | + DIV | MOD => true; // binary + case OR | XOR | AND | + LSL | LSR | ASR => true; // bitwise + case _ => false + } + + def isLogicalOp(code: Int): Boolean = code match { + case ZNOT | ZAND | ZOR => true + case _ => false + } + + def isShiftOp(code: Int): Boolean = code match { + case LSL | LSR | ASR => true + case _ => false + } + + def isBitwiseOp(code: Int): Boolean = code match { + case OR | XOR | AND => true + case _ => false + } + + def isCoercion(code: Int): Boolean = (code >= B2B) && (code <= D2D) + +} diff --git a/tests/pos-with-compiler-cc/backend/WorklistAlgorithm.scala b/tests/pos-with-compiler-cc/backend/WorklistAlgorithm.scala new file mode 100644 index 000000000000..b3d98d425b2a --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/WorklistAlgorithm.scala @@ -0,0 +1,57 @@ +package dotty.tools +package backend + +/** + * Simple implementation of a worklist algorithm. A processing + * function is applied repeatedly to the first element in the + * worklist, as long as the stack is not empty. + * + * The client class should mix-in this class and initialize the worklist + * field and define the `processElement` method. Then call the `run` method + * providing a function that initializes the worklist. + * + * @author Martin Odersky + * @version 1.0 + * @see [[scala.tools.nsc.backend.icode.Linearizers]] + */ +trait WorklistAlgorithm { + type Elem + class WList { + private var list: List[Elem] = Nil + def isEmpty = list.isEmpty + def nonEmpty = !isEmpty + def push(e: Elem): Unit = { list = e :: list } + def pop(): Elem = { + val head = list.head + list = list.tail + head + } + def pushAll(xs: Iterable[Elem]): Unit = xs.foreach(push) + def clear(): Unit = list = Nil + + } + + val worklist: WList + + /** + * Run the iterative algorithm until the worklist remains empty. + * The initializer is run once before the loop starts and should + * initialize the worklist. 
+ */ + def run(initWorklist: => Unit) = { + initWorklist + + while (worklist.nonEmpty) + processElement(dequeue) + } + + /** + * Process the current element from the worklist. + */ + def processElement(e: Elem): Unit + + /** + * Remove and return the first element to be processed from the worklist. + */ + def dequeue: Elem +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/AsmUtils.scala b/tests/pos-with-compiler-cc/backend/jvm/AsmUtils.scala new file mode 100644 index 000000000000..e6393ce82054 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/AsmUtils.scala @@ -0,0 +1,65 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.tools.asm.tree.{AbstractInsnNode} +import java.io.PrintWriter +import scala.tools.asm.util.{TraceClassVisitor, TraceMethodVisitor, Textifier} +import scala.tools.asm.ClassReader + +object AsmUtils { + + /** + * Print the bytecode of methods generated by GenBCode to the standard output. Only methods + * whose name contains `traceMethodPattern` are traced. + */ + final val traceMethodEnabled = sys.env.contains("printBCODE") + final val traceMethodPattern = sys.env.getOrElse("printBCODE", "") + + /** + * Print the bytecode of classes generated by GenBCode to the standard output. + */ + inline val traceClassEnabled = false + inline val traceClassPattern = "" + + /** + * Print the bytedcode of classes as they are serialized by the ASM library. The serialization + * performed by `asm.ClassWriter` can change the code generated by GenBCode. For example, it + * introduces stack map frames, it computes the maximal stack sizes, and it replaces dead + * code by NOPs (see also https://github.com/scala/scala/pull/3726#issuecomment-42861780). + */ + inline val traceSerializedClassEnabled = false + inline val traceSerializedClassPattern = "" + + def traceMethod(mnode: MethodNode1): Unit = { + println(s"Bytecode for method ${mnode.name}") + val p = new Textifier + val tracer = new TraceMethodVisitor(p) + mnode.accept(tracer) + val w = new PrintWriter(System.out) + p.print(w) + w.flush() + } + + def traceClass(cnode: ClassNode1): Unit = { + println(s"Bytecode for class ${cnode.name}") + val w = new PrintWriter(System.out) + cnode.accept(new TraceClassVisitor(w)) + w.flush() + } + + def traceClass(bytes: Array[Byte]): Unit = traceClass(readClass(bytes)) + + def readClass(bytes: Array[Byte]): ClassNode1 = { + val node = new ClassNode1() + new ClassReader(bytes).accept(node, 0) + node + } + + def instructionString(instruction: AbstractInsnNode): String = instruction.getOpcode match { + case -1 => instruction.toString + case op => scala.tools.asm.util.Printer.OPCODES(op) + } +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeAsmCommon.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeAsmCommon.scala new file mode 100644 index 000000000000..d95638be2695 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeAsmCommon.scala @@ -0,0 +1,158 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.report + +/** + * This trait contains code shared between GenBCode and GenASM that depends on types defined in + * the compiler cake (Global). + */ +final class BCodeAsmCommon[I <: DottyBackendInterface](val interface: I) { + import interface.given + import DottyBackendInterface.symExtensions + + /** + * True if `classSym` is an anonymous class or a local class. 
I.e., false if `classSym` is a + * member class. This method is used to decide if we should emit an EnclosingMethod attribute. + * It is also used to decide whether the "owner" field in the InnerClass attribute should be + * null. + */ + def isAnonymousOrLocalClass(classSym: Symbol): Boolean = { + assert(classSym.isClass, s"not a class: $classSym") + // Here used to be an `assert(!classSym.isDelambdafyFunction)`: delambdafy lambda classes are + // always top-level. However, SI-8900 shows an example where the weak name-based implementation + // of isDelambdafyFunction failed (for a function declared in a package named "lambda"). + classSym.isAnonymousClass || { + val originalOwner = classSym.originalOwner + originalOwner != NoSymbol && !originalOwner.isClass + } + } + + /** + * Returns the enclosing method for non-member classes. In the following example + * + * class A { + * def f = { + * class B { + * class C + * } + * } + * } + * + * the method returns Some(f) for B, but None for C, because C is a member class. For non-member + * classes that are not enclosed by a method, it returns None: + * + * class A { + * { class B } + * } + * + * In this case, for B, we return None. + * + * The EnclosingMethod attribute needs to be added to non-member classes (see doc in BTypes). + * This is a source-level property, so we need to use the originalOwner chain to reconstruct it. + */ + private def enclosingMethodForEnclosingMethodAttribute(classSym: Symbol): Option[Symbol] = { + assert(classSym.isClass, classSym) + def enclosingMethod(sym: Symbol): Option[Symbol] = { + if (sym.isClass || sym == NoSymbol) None + else if (sym.is(Method)) Some(sym) + else enclosingMethod(sym.originalOwner) + } + enclosingMethod(classSym.originalOwner) + } + + /** + * The enclosing class for emitting the EnclosingMethod attribute. Since this is a source-level + * property, this method looks at the originalOwner chain. See doc in BTypes. + */ + private def enclosingClassForEnclosingMethodAttribute(classSym: Symbol): Symbol = { + assert(classSym.isClass, classSym) + def enclosingClass(sym: Symbol): Symbol = { + if (sym.isClass) sym + else enclosingClass(sym.originalOwner.originalLexicallyEnclosingClass) + } + enclosingClass(classSym.originalOwner.originalLexicallyEnclosingClass) + } + + /*final*/ case class EnclosingMethodEntry(owner: String, name: String, methodDescriptor: String) + + /** + * Data for emitting an EnclosingMethod attribute. None if `classSym` is a member class (not + * an anonymous or local class). See doc in BTypes. + * + * The class is parametrized by two functions to obtain a bytecode class descriptor for a class + * symbol, and to obtain a method signature descriptor fro a method symbol. These function depend + * on the implementation of GenASM / GenBCode, so they need to be passed in. 
+ */ + def enclosingMethodAttribute(classSym: Symbol, classDesc: Symbol => String, methodDesc: Symbol => String): Option[EnclosingMethodEntry] = { + if (isAnonymousOrLocalClass(classSym)) { + val methodOpt = enclosingMethodForEnclosingMethodAttribute(classSym) + report.debuglog(s"enclosing method for $classSym is $methodOpt (in ${methodOpt.map(_.enclosingClass)})") + Some(EnclosingMethodEntry( + classDesc(enclosingClassForEnclosingMethodAttribute(classSym)), + methodOpt.map(_.javaSimpleName).orNull, + methodOpt.map(methodDesc).orNull)) + } else { + None + } + } +} + +object BCodeAsmCommon{ + def ubytesToCharArray(bytes: Array[Byte]): Array[Char] = { + val ca = new Array[Char](bytes.length) + var idx = 0 + while(idx < bytes.length) { + val b: Byte = bytes(idx) + assert((b & ~0x7f) == 0) + ca(idx) = b.asInstanceOf[Char] + idx += 1 + } + + ca + } + + final def arrEncode(bSeven: Array[Byte]): Array[String] = { + var strs: List[String] = Nil + // chop into slices of at most 65535 bytes, counting 0x00 as taking two bytes (as per JVMS 4.4.7 The CONSTANT_Utf8_info Structure) + var prevOffset = 0 + var offset = 0 + var encLength = 0 + while(offset < bSeven.length) { + val deltaEncLength = (if(bSeven(offset) == 0) 2 else 1) + val newEncLength = encLength.toLong + deltaEncLength + if(newEncLength >= 65535) { + val ba = bSeven.slice(prevOffset, offset) + strs ::= new java.lang.String(ubytesToCharArray(ba)) + encLength = 0 + prevOffset = offset + } else { + encLength += deltaEncLength + offset += 1 + } + } + if(prevOffset < offset) { + assert(offset == bSeven.length) + val ba = bSeven.slice(prevOffset, offset) + strs ::= new java.lang.String(ubytesToCharArray(ba)) + } + assert(strs.size > 1, "encode instead as one String via strEncode()") // TODO too strict? + strs.reverse.toArray + } + + + def strEncode(bSeven: Array[Byte]): String = { + val ca = ubytesToCharArray(bSeven) + new java.lang.String(ca) + // debug val bvA = new asm.ByteVector; bvA.putUTF8(s) + // debug val enc: Array[Byte] = scala.reflect.internal.pickling.ByteCodecs.encode(bytes) + // debug assert(enc(idx) == bvA.getByte(idx + 2)) + // debug assert(bvA.getLength == enc.size + 2) + } + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeBodyBuilder.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeBodyBuilder.scala new file mode 100644 index 000000000000..bf10e37943a8 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeBodyBuilder.scala @@ -0,0 +1,1776 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.annotation.switch +import scala.collection.mutable.SortedMap + +import scala.tools.asm +import scala.tools.asm.{Handle, Opcodes} +import BCodeHelpers.InvokeStyle + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.core.Constants._ +import dotty.tools.dotc.core.Flags.{Label => LabelFlag, _} +import dotty.tools.dotc.core.Types._ +import dotty.tools.dotc.core.StdNames.{nme, str} +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.transform.Erasure +import dotty.tools.dotc.transform.SymUtils._ +import dotty.tools.dotc.util.Spans._ +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.core.Phases._ +import dotty.tools.dotc.core.Decorators.em +import dotty.tools.dotc.report + +/* + * + * @author Miguel Garcia, http://lamp.epfl.ch/~magarcia/ScalaCompilerCornerReloaded/ + * @version 1.0 + * + */ +trait BCodeBodyBuilder extends BCodeSkelBuilder { + // import global._ + // import definitions._ + import 
tpd._ + import int.{_, given} + import DottyBackendInterface.symExtensions + import bTypes._ + import coreBTypes._ + + protected val primitives: DottyPrimitives + + /* + * Functionality to build the body of ASM MethodNode, except for `synchronized` and `try` expressions. + */ + abstract class PlainBodyBuilder(cunit: CompilationUnit) extends PlainSkelBuilder(cunit) { + + import Primitives.TestOp + + /* ---------------- helper utils for generating methods and code ---------------- */ + + def emit(opc: Int): Unit = { mnode.visitInsn(opc) } + + def emitZeroOf(tk: BType): Unit = { + tk match { + case BOOL => bc.boolconst(false) + case BYTE | + SHORT | + CHAR | + INT => bc.iconst(0) + case LONG => bc.lconst(0) + case FLOAT => bc.fconst(0) + case DOUBLE => bc.dconst(0) + case UNIT => () + case _ => emit(asm.Opcodes.ACONST_NULL) + } + } + + /* + * Emits code that adds nothing to the operand stack. + * Two main cases: `tree` is an assignment, + * otherwise an `adapt()` to UNIT is performed if needed. + */ + def genStat(tree: Tree): Unit = { + lineNumber(tree) + + tree match { + case Assign(lhs @ DesugaredSelect(qual, _), rhs) => + val isStatic = lhs.symbol.isStaticMember + if (!isStatic) { genLoadQualifier(lhs) } + genLoad(rhs, symInfoTK(lhs.symbol)) + lineNumber(tree) + // receiverClass is used in the bytecode to access the field. using sym.owner may lead to IllegalAccessError + val receiverClass = qual.tpe.typeSymbol + fieldStore(lhs.symbol, receiverClass) + + case Assign(lhs, rhs) => + val s = lhs.symbol + val Local(tk, _, idx, _) = locals.getOrMakeLocal(s) + + rhs match { + case Apply(Select(larg: Ident, nme.ADD), Literal(x) :: Nil) + if larg.symbol == s && tk.isIntSizedType && x.isShortRange => + lineNumber(tree) + bc.iinc(idx, x.intValue) + + case Apply(Select(larg: Ident, nme.SUB), Literal(x) :: Nil) + if larg.symbol == s && tk.isIntSizedType && Constant(-x.intValue).isShortRange => + lineNumber(tree) + bc.iinc(idx, -x.intValue) + + case _ => + genLoad(rhs, tk) + lineNumber(tree) + bc.store(idx, tk) + } + + case _ => + genLoad(tree, UNIT) + } + } + + /* Generate code for primitive arithmetic operations. 
*/ + def genArithmeticOp(tree: Tree, code: Int): BType = tree match{ + case Apply(fun @ DesugaredSelect(larg, _), args) => + var resKind = tpeTK(larg) + + assert(resKind.isNumericType || (resKind == BOOL), + s"$resKind is not a numeric or boolean type [operation: ${fun.symbol}]") + + import ScalaPrimitivesOps._ + + args match { + // unary operation + case Nil => + genLoad(larg, resKind) + code match { + case POS => () // nothing + case NEG => bc.neg(resKind) + case NOT => bc.genPrimitiveArithmetic(Primitives.NOT, resKind) + case _ => abort(s"Unknown unary operation: ${fun.symbol.showFullName} code: $code") + } + + // binary operation + case rarg :: Nil => + val isShift = isShiftOp(code) + resKind = tpeTK(larg).maxType(if (isShift) INT else tpeTK(rarg)) + + if (isShift || isBitwiseOp(code)) { + assert(resKind.isIntegralType || (resKind == BOOL), + s"$resKind incompatible with arithmetic modulo operation.") + } + + genLoad(larg, resKind) + genLoad(rarg, if (isShift) INT else resKind) + + (code: @switch) match { + case ADD => bc add resKind + case SUB => bc sub resKind + case MUL => bc mul resKind + case DIV => bc div resKind + case MOD => bc rem resKind + + case OR | XOR | AND => bc.genPrimitiveLogical(code, resKind) + + case LSL | LSR | ASR => bc.genPrimitiveShift(code, resKind) + + case _ => abort(s"Unknown primitive: ${fun.symbol}[$code]") + } + + case _ => + abort(s"Too many arguments for primitive function: $tree") + } + lineNumber(tree) + resKind + } + + /* Generate primitive array operations. */ + def genArrayOp(tree: Tree, code: Int, expectedType: BType): BType = tree match{ + + case Apply(DesugaredSelect(arrayObj, _), args) => + import ScalaPrimitivesOps._ + val k = tpeTK(arrayObj) + genLoad(arrayObj, k) + val elementType = typeOfArrayOp.getOrElse[bTypes.BType](code, abort(s"Unknown operation on arrays: $tree code: $code")) + + var generatedType = expectedType + + if (isArrayGet(code)) { + // load argument on stack + assert(args.length == 1, s"Too many arguments for array get operation: $tree"); + genLoad(args.head, INT) + generatedType = k.asArrayBType.componentType + bc.aload(elementType) + } + else if (isArraySet(code)) { + val List(a1, a2) = args + genLoad(a1, INT) + genLoad(a2) + generatedType = UNIT + bc.astore(elementType) + } else { + generatedType = INT + emit(asm.Opcodes.ARRAYLENGTH) + } + lineNumber(tree) + + generatedType + } + + def genLoadIfTo(tree: If, expectedType: BType, dest: LoadDestination): BType = tree match{ + case If(condp, thenp, elsep) => + + val success = new asm.Label + val failure = new asm.Label + + val hasElse = !elsep.isEmpty && (elsep match { + case Literal(value) if value.tag == UnitTag => false + case _ => true + }) + + genCond(condp, success, failure, targetIfNoJump = success) + markProgramPoint(success) + + if dest == LoadDestination.FallThrough then + if hasElse then + val thenKind = tpeTK(thenp) + val elseKind = tpeTK(elsep) + def hasUnitBranch = (thenKind == UNIT || elseKind == UNIT) && expectedType == UNIT + val resKind = if (hasUnitBranch) UNIT else tpeTK(tree) + + val postIf = new asm.Label + genLoadTo(thenp, resKind, LoadDestination.Jump(postIf)) + markProgramPoint(failure) + genLoadTo(elsep, resKind, LoadDestination.FallThrough) + markProgramPoint(postIf) + resKind + else + genLoad(thenp, UNIT) + markProgramPoint(failure) + UNIT + end if + else + genLoadTo(thenp, expectedType, dest) + markProgramPoint(failure) + if hasElse then + genLoadTo(elsep, expectedType, dest) + else + genAdaptAndSendToDest(UNIT, expectedType, dest) + expectedType + 
end if + } + + def genPrimitiveOp(tree: Apply, expectedType: BType): BType = (tree: @unchecked) match { + case Apply(fun @ DesugaredSelect(receiver, _), _) => + val sym = tree.symbol + + val code = primitives.getPrimitive(tree, receiver.tpe) + + import ScalaPrimitivesOps._ + + if (isArithmeticOp(code)) genArithmeticOp(tree, code) + else if (code == CONCAT) genStringConcat(tree) + else if (code == HASH) genScalaHash(receiver) + else if (isArrayOp(code)) genArrayOp(tree, code, expectedType) + else if (isLogicalOp(code) || isComparisonOp(code)) { + val success, failure, after = new asm.Label + genCond(tree, success, failure, targetIfNoJump = success) + // success block + markProgramPoint(success) + bc boolconst true + bc goTo after + // failure block + markProgramPoint(failure) + bc boolconst false + // after + markProgramPoint(after) + + BOOL + } + else if (isCoercion(code)) { + genLoad(receiver) + lineNumber(tree) + genCoercion(code) + coercionTo(code) + } + else abort( + s"Primitive operation not handled yet: ${sym.showFullName}(${fun.symbol.name}) at: ${tree.span}" + ) + } + + def genLoad(tree: Tree): Unit = { + genLoad(tree, tpeTK(tree)) + } + + /* Generate code for trees that produce values on the stack */ + def genLoad(tree: Tree, expectedType: BType): Unit = + genLoadTo(tree, expectedType, LoadDestination.FallThrough) + + /* Generate code for trees that produce values, sent to a given `LoadDestination`. */ + def genLoadTo(tree: Tree, expectedType: BType, dest: LoadDestination): Unit = + var generatedType = expectedType + var generatedDest = LoadDestination.FallThrough + + lineNumber(tree) + + tree match { + case tree@ValDef(_, _, _) => + val sym = tree.symbol + /* most of the time, !locals.contains(sym), unless the current activation of genLoad() is being called + while duplicating a finalizer that contains this ValDef. 
*/ + val loc = locals.getOrMakeLocal(sym) + val Local(tk, _, idx, isSynth) = loc + if (tree.rhs == tpd.EmptyTree) { emitZeroOf(tk) } + else { genLoad(tree.rhs, tk) } + bc.store(idx, tk) + val localVarStart = currProgramPoint() + if (!isSynth) { // there are case ValDef's emitted by patmat + varsInScope ::= (sym -> localVarStart) + } + generatedType = UNIT + + case t @ If(_, _, _) => + generatedType = genLoadIfTo(t, expectedType, dest) + generatedDest = dest + + case t @ Labeled(_, _) => + generatedType = genLabeledTo(t, expectedType, dest) + generatedDest = dest + + case r: Return => + genReturn(r) + generatedDest = LoadDestination.Return + + case t @ WhileDo(_, _) => + generatedDest = genWhileDo(t) + generatedType = UNIT + + case t @ Try(_, _, _) => + generatedType = genLoadTry(t) + + case t: Apply if t.fun.symbol eq defn.throwMethod => + val thrownExpr = t.args.head + val thrownKind = tpeTK(thrownExpr) + genLoadTo(thrownExpr, thrownKind, LoadDestination.Throw) + generatedDest = LoadDestination.Throw + + case New(tpt) => + abort(s"Unexpected New(${tpt.tpe.showSummary()}/$tpt) reached GenBCode.\n" + + " Call was genLoad" + ((tree, expectedType))) + + case t @ Closure(env, call, tpt) => + val functionalInterface: Symbol = + if !tpt.isEmpty then tpt.tpe.classSymbol + else t.tpe.classSymbol + val (fun, args) = call match { + case Apply(fun, args) => (fun, args) + case t @ DesugaredSelect(_, _) => (t, Nil) // TODO: use Select + case t @ Ident(_) => (t, Nil) + } + + if (!fun.symbol.isStaticMember) { + // load receiver of non-static implementation of lambda + + // darkdimius: I haven't found in spec `this` reference should go + // but I was able to derrive it by reading + // AbstractValidatingLambdaMetafactory.validateMetafactoryArgs + + val DesugaredSelect(prefix, _) = fun: @unchecked + genLoad(prefix) + } + + genLoadArguments(env, fun.symbol.info.firstParamTypes map toTypeKind) + generatedType = genInvokeDynamicLambda(NoSymbol, fun.symbol, env.size, functionalInterface) + + case app @ Apply(_, _) => + generatedType = genApply(app, expectedType) + + case This(qual) => + val symIsModuleClass = tree.symbol.is(ModuleClass) + assert(tree.symbol == claszSymbol || symIsModuleClass, + s"Trying to access the this of another class: tree.symbol = ${tree.symbol}, class symbol = $claszSymbol compilation unit: $cunit") + if (symIsModuleClass && tree.symbol != claszSymbol) { + generatedType = genLoadModule(tree) + } + else { + mnode.visitVarInsn(asm.Opcodes.ALOAD, 0) + // When compiling Array.scala, the constructor invokes `Array.this.super.`. The expectedType + // is `[Object` (computed by typeToBType, the type of This(Array) is `Array[T]`). If we would set + // the generatedType to `Array` below, the call to adapt at the end would fail. The situation is + // similar for primitives (`I` vs `Int`). + if (tree.symbol != defn.ArrayClass && !tree.symbol.isPrimitiveValueClass) { + generatedType = classBTypeFromSymbol(claszSymbol) + } + } + + case DesugaredSelect(Ident(nme.EMPTY_PACKAGE), module) => + assert(tree.symbol.is(Module), s"Selection of non-module from empty package: $tree sym: ${tree.symbol} at: ${tree.span}") + genLoadModule(tree) + + case DesugaredSelect(qualifier, _) => + val sym = tree.symbol + generatedType = symInfoTK(sym) + val qualSafeToElide = tpd.isIdempotentExpr(qualifier) + + def genLoadQualUnlessElidable(): Unit = { if (!qualSafeToElide) { genLoadQualifier(tree) } } + + // receiverClass is used in the bytecode to access the field. 
using sym.owner may lead to IllegalAccessError + def receiverClass = qualifier.tpe.typeSymbol + if (sym.is(Module)) { + genLoadQualUnlessElidable() + genLoadModule(tree) + } else if (sym.isStaticMember) { + genLoadQualUnlessElidable() + fieldLoad(sym, receiverClass) + } else { + genLoadQualifier(tree) + fieldLoad(sym, receiverClass) + } + + case t @ Ident(name) => + val sym = tree.symbol + val tk = symInfoTK(sym) + generatedType = tk + + val desugared = cachedDesugarIdent(t) + desugared match { + case None => + if (!sym.is(Package)) { + if (sym.is(Module)) genLoadModule(sym) + else locals.load(sym) + } + case Some(t) => + genLoad(t, generatedType) + } + + case Literal(value) => + if (value.tag != UnitTag) (value.tag, expectedType) match { + case (IntTag, LONG ) => bc.lconst(value.longValue); generatedType = LONG + case (FloatTag, DOUBLE) => bc.dconst(value.doubleValue); generatedType = DOUBLE + case (NullTag, _ ) => bc.emit(asm.Opcodes.ACONST_NULL); generatedType = srNullRef + case _ => genConstant(value); generatedType = tpeTK(tree) + } + + case blck @ Block(stats, expr) => + if(stats.isEmpty) + genLoadTo(expr, expectedType, dest) + else + genBlockTo(blck, expectedType, dest) + generatedDest = dest + + case Typed(Super(_, _), _) => + genLoadTo(tpd.This(claszSymbol.asClass), expectedType, dest) + generatedDest = dest + + case Typed(expr, _) => + genLoadTo(expr, expectedType, dest) + generatedDest = dest + + case Assign(_, _) => + generatedType = UNIT + genStat(tree) + + case av @ ArrayValue(_, _) => + generatedType = genArrayValue(av) + + case mtch @ Match(_, _) => + generatedType = genMatchTo(mtch, expectedType, dest) + generatedDest = dest + + case tpd.EmptyTree => if (expectedType != UNIT) { emitZeroOf(expectedType) } + + + case t: TypeApply => // dotty specific + generatedType = genTypeApply(t) + + case _ => abort(s"Unexpected tree in genLoad: $tree/${tree.getClass} at: ${tree.span}") + } + + // emit conversion and send to the right destination + if generatedDest == LoadDestination.FallThrough then + genAdaptAndSendToDest(generatedType, expectedType, dest) + end genLoadTo + + def genAdaptAndSendToDest(generatedType: BType, expectedType: BType, dest: LoadDestination): Unit = + if generatedType != expectedType then + adapt(generatedType, expectedType) + + dest match + case LoadDestination.FallThrough => + () + case LoadDestination.Jump(label) => + bc goTo label + case LoadDestination.Return => + bc emitRETURN returnType + case LoadDestination.Throw => + val thrownType = expectedType + // `throw null` is valid although scala.Null (as defined in src/libray-aux) isn't a subtype of Throwable. + // Similarly for scala.Nothing (again, as defined in src/libray-aux). 
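+          // At the JVM level, ATHROW on a null reference throws a NullPointerException at run
+          // time, which is why a thrown expression of type Null is accepted by the assert below.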
+ assert(thrownType.isNullType || thrownType.isNothingType || thrownType.asClassBType.isSubtypeOf(jlThrowableRef)) + emit(asm.Opcodes.ATHROW) + end genAdaptAndSendToDest + + // ---------------- field load and store ---------------- + + /* + * must-single-thread + */ + def fieldLoad( field: Symbol, hostClass: Symbol = null): Unit = fieldOp(field, isLoad = true, hostClass) + + /* + * must-single-thread + */ + def fieldStore(field: Symbol, hostClass: Symbol = null): Unit = fieldOp(field, isLoad = false, hostClass) + + /* + * must-single-thread + */ + private def fieldOp(field: Symbol, isLoad: Boolean, specificReceiver: Symbol): Unit = { + val useSpecificReceiver = specificReceiver != null && !field.isScalaStatic + + val owner = internalName(if (useSpecificReceiver) specificReceiver else field.owner) + val fieldJName = field.javaSimpleName + val fieldDescr = symInfoTK(field).descriptor + val isStatic = field.isStaticMember + val opc = + if (isLoad) { if (isStatic) asm.Opcodes.GETSTATIC else asm.Opcodes.GETFIELD } + else { if (isStatic) asm.Opcodes.PUTSTATIC else asm.Opcodes.PUTFIELD } + mnode.visitFieldInsn(opc, owner, fieldJName, fieldDescr) + + } + + // ---------------- emitting constant values ---------------- + + /* + * For ClazzTag: + * must-single-thread + * Otherwise it's safe to call from multiple threads. + */ + def genConstant(const: Constant): Unit = { + (const.tag/*: @switch*/) match { + + case BooleanTag => bc.boolconst(const.booleanValue) + + case ByteTag => bc.iconst(const.byteValue) + case ShortTag => bc.iconst(const.shortValue) + case CharTag => bc.iconst(const.charValue) + case IntTag => bc.iconst(const.intValue) + + case LongTag => bc.lconst(const.longValue) + case FloatTag => bc.fconst(const.floatValue) + case DoubleTag => bc.dconst(const.doubleValue) + + case UnitTag => () + + case StringTag => + assert(const.value != null, const) // TODO this invariant isn't documented in `case class Constant` + mnode.visitLdcInsn(const.stringValue) // `stringValue` special-cases null, but not for a const with StringTag + + case NullTag => emit(asm.Opcodes.ACONST_NULL) + + case ClazzTag => + val tp = toTypeKind(const.typeValue) + if tp.isPrimitive then + val boxedClass = boxedClassOfPrimitive(tp.asPrimitiveBType) + mnode.visitFieldInsn( + asm.Opcodes.GETSTATIC, + boxedClass.internalName, + "TYPE", // field name + jlClassRef.descriptor + ) + else + mnode.visitLdcInsn(tp.toASMType) + + case _ => abort(s"Unknown constant value: $const") + } + } + + private def genLabeledTo(tree: Labeled, expectedType: BType, dest: LoadDestination): BType = tree match { + case Labeled(bind, expr) => + + val labelSym = bind.symbol + + if dest == LoadDestination.FallThrough then + val resKind = tpeTK(tree) + val jumpTarget = new asm.Label + registerJumpDest(labelSym, resKind, LoadDestination.Jump(jumpTarget)) + genLoad(expr, resKind) + markProgramPoint(jumpTarget) + resKind + else + registerJumpDest(labelSym, expectedType, dest) + genLoadTo(expr, expectedType, dest) + expectedType + end if + } + + private def genReturn(r: Return): Unit = { + val expr: Tree = r.expr + val fromSym: Symbol = if (r.from.symbol.is(LabelFlag)) r.from.symbol else NoSymbol + + if (NoSymbol == fromSym) { + // return from enclosing method + cleanups match { + case Nil => + // not an assertion: !shouldEmitCleanup (at least not yet, pendingCleanups() may still have to run, and reset `shouldEmitCleanup`. 
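+              // No enclosing cleanups: load the expression and return it directly.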
+ genLoadTo(expr, returnType, LoadDestination.Return) + case nextCleanup :: rest => + genLoad(expr, returnType) + lineNumber(r) + val saveReturnValue = (returnType != UNIT) + if (saveReturnValue) { + // regarding return value, the protocol is: in place of a `return-stmt`, a sequence of `adapt, store, jump` are inserted. + if (earlyReturnVar == null) { + earlyReturnVar = locals.makeLocal(returnType, "earlyReturnVar", expr.tpe, expr.span) + } + locals.store(earlyReturnVar) + } + bc goTo nextCleanup + shouldEmitCleanup = true + } + } else { + // return from labeled + assert(fromSym.is(LabelFlag), fromSym) + assert(!fromSym.is(Method), fromSym) + + /* TODO At the moment, we disregard cleanups, because by construction we don't have return-from-labels + * that cross cleanup boundaries. However, in theory such crossings are valid, so we should take care + * of them. + */ + val (exprExpectedType, exprDest) = findJumpDest(fromSym) + genLoadTo(expr, exprExpectedType, exprDest) + } + } // end of genReturn() + + def genWhileDo(tree: WhileDo): LoadDestination = tree match{ + case WhileDo(cond, body) => + + val isInfinite = cond == tpd.EmptyTree + + val loop = new asm.Label + markProgramPoint(loop) + + if isInfinite then + val dest = LoadDestination.Jump(loop) + genLoadTo(body, UNIT, dest) + dest + else + body match + case Literal(value) if value.tag == UnitTag => + // this is the shape of do..while loops + val exitLoop = new asm.Label + genCond(cond, loop, exitLoop, targetIfNoJump = exitLoop) + markProgramPoint(exitLoop) + case _ => + val success = new asm.Label + val failure = new asm.Label + genCond(cond, success, failure, targetIfNoJump = success) + markProgramPoint(success) + genLoadTo(body, UNIT, LoadDestination.Jump(loop)) + markProgramPoint(failure) + end match + LoadDestination.FallThrough + } + + def genTypeApply(t: TypeApply): BType = (t: @unchecked) match { + case TypeApply(fun@DesugaredSelect(obj, _), targs) => + + val sym = fun.symbol + val cast = + if (sym == defn.Any_isInstanceOf) false + else if (sym == defn.Any_asInstanceOf) true + else abort(s"Unexpected type application $fun[sym: ${sym.showFullName}] in: $t") + val l = tpeTK(obj) + val r = tpeTK(targs.head) + genLoadQualifier(fun) + + // TODO @lry make pattern match + if (l.isPrimitive && r.isPrimitive) + genConversion(l, r, cast) + else if (l.isPrimitive) { + bc drop l + if (cast) { + mnode.visitTypeInsn(asm.Opcodes.NEW, jlClassCastExceptionRef.internalName) + bc dup ObjectRef + emit(asm.Opcodes.ATHROW) + } else { + bc boolconst false + } + } + else if (r.isPrimitive && cast) { + abort(s"Erasure should have added an unboxing operation to prevent this cast. Tree: $t") + } + else if (r.isPrimitive) { + bc isInstance boxedClassOfPrimitive(r.asPrimitiveBType) + } + else { + assert(r.isRef, r) // ensure that it's not a method + genCast(r.asRefBType, cast) + } + + if (cast) r else BOOL + } // end of genTypeApply() + + + private def mkArrayConstructorCall(arr: ArrayBType, app: Apply, args: List[Tree]) = { + val dims = arr.dimension + var elemKind = arr.elementType + val argsSize = args.length + if (argsSize > dims) { + report.error(em"too many arguments for array constructor: found ${args.length} but array has only $dims dimension(s)", ctx.source.atSpan(app.span)) + } + if (argsSize < dims) { + /* In one step: + * elemKind = new BType(BType.ARRAY, arr.off + argsSize, arr.len - argsSize) + * however the above does not enter a TypeName for each nested arrays in chrs. 
+ */ + for (i <- args.length until dims) elemKind = ArrayBType(elemKind) + } + genLoadArguments(args, List.fill(args.size)(INT)) + (argsSize /*: @switch*/) match { + case 1 => bc newarray elemKind + case _ => + val descr = ("[" * argsSize) + elemKind.descriptor // denotes the same as: arrayN(elemKind, argsSize).descriptor + mnode.visitMultiANewArrayInsn(descr, argsSize) + } + } + + + private def genApply(app: Apply, expectedType: BType): BType = { + var generatedType = expectedType + lineNumber(app) + app match { + case Apply(_, args) if app.symbol eq defn.newArrayMethod => + val List(elemClaz, Literal(c: Constant), ArrayValue(_, dims)) = args: @unchecked + + generatedType = toTypeKind(c.typeValue) + mkArrayConstructorCall(generatedType.asArrayBType, app, dims) + case Apply(t :TypeApply, _) => + generatedType = + if (t.symbol ne defn.Object_synchronized) genTypeApply(t) + else genSynchronized(app, expectedType) + + case Apply(fun @ DesugaredSelect(Super(superQual, _), _), args) => + // 'super' call: Note: since constructors are supposed to + // return an instance of what they construct, we have to take + // special care. On JVM they are 'void', and Scala forbids (syntactically) + // to call super constructors explicitly and/or use their 'returned' value. + // therefore, we can ignore this fact, and generate code that leaves nothing + // on the stack (contrary to what the type in the AST says). + + // scala/bug#10290: qual can be `this.$outer()` (not just `this`), so we call genLoad (not just ALOAD_0) + genLoad(superQual) + genLoadArguments(args, paramTKs(app)) + generatedType = genCallMethod(fun.symbol, InvokeStyle.Super, app.span) + + // 'new' constructor call: Note: since constructors are + // thought to return an instance of what they construct, + // we have to 'simulate' it by DUPlicating the freshly created + // instance (on JVM, methods return VOID). 
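+      // Illustrative shape of the code emitted below for a (non-array) constructor call
+      // like `new C(x)`:
+      //   NEW C; DUP; <load x>; INVOKESPECIAL C.<init>
+      // leaving the freshly constructed instance on the stack.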
+ case Apply(fun @ DesugaredSelect(New(tpt), nme.CONSTRUCTOR), args) => + val ctor = fun.symbol + assert(ctor.isClassConstructor, s"'new' call to non-constructor: ${ctor.name}") + + generatedType = toTypeKind(tpt.tpe) + assert(generatedType.isRef, s"Non reference type cannot be instantiated: $generatedType") + + generatedType match { + case arr: ArrayBType => + mkArrayConstructorCall(arr, app, args) + + case rt: ClassBType => + assert(classBTypeFromSymbol(ctor.owner) == rt, s"Symbol ${ctor.owner.showFullName} is different from $rt") + mnode.visitTypeInsn(asm.Opcodes.NEW, rt.internalName) + bc dup generatedType + genLoadArguments(args, paramTKs(app)) + genCallMethod(ctor, InvokeStyle.Special, app.span) + + case _ => + abort(s"Cannot instantiate $tpt of kind: $generatedType") + } + + case Apply(fun, List(expr)) if Erasure.Boxing.isBox(fun.symbol) && fun.symbol.denot.owner != defn.UnitModuleClass => + val nativeKind = tpeTK(expr) + genLoad(expr, nativeKind) + val MethodNameAndType(mname, methodType) = asmBoxTo(nativeKind) + bc.invokestatic(srBoxesRuntimeRef.internalName, mname, methodType.descriptor, itf = false) + generatedType = boxResultType(fun.symbol) // was toTypeKind(fun.symbol.tpe.resultType) + + case Apply(fun, List(expr)) if Erasure.Boxing.isUnbox(fun.symbol) && fun.symbol.denot.owner != defn.UnitModuleClass => + genLoad(expr) + val boxType = unboxResultType(fun.symbol) // was toTypeKind(fun.symbol.owner.linkedClassOfClass.tpe) + generatedType = boxType + val MethodNameAndType(mname, methodType) = asmUnboxTo(boxType) + bc.invokestatic(srBoxesRuntimeRef.internalName, mname, methodType.descriptor, itf = false) + + case app @ Apply(fun, args) => + val sym = fun.symbol + + if (isPrimitive(fun)) { // primitive method call + generatedType = genPrimitiveOp(app, expectedType) + } else { // normal method call + val invokeStyle = + if (sym.isStaticMember) InvokeStyle.Static + else if (sym.is(Private) || sym.isClassConstructor) InvokeStyle.Special + else if (app.hasAttachment(BCodeHelpers.UseInvokeSpecial)) InvokeStyle.Special + else InvokeStyle.Virtual + + if (invokeStyle.hasInstance) genLoadQualifier(fun) + genLoadArguments(args, paramTKs(app)) + + val DesugaredSelect(qual, name) = fun: @unchecked // fun is a Select, also checked in genLoadQualifier + val isArrayClone = name == nme.clone_ && qual.tpe.widen.isInstanceOf[JavaArrayType] + if (isArrayClone) { + // Special-case Array.clone, introduced in 36ef60e. The goal is to generate this call + // as "[I.clone" instead of "java/lang/Object.clone". This is consistent with javac. + // Arrays have a public method `clone` (jls 10.7). + // + // The JVMS is not explicit about this, but that receiver type can be an array type + // descriptor (instead of a class internal name): + // invokevirtual #2; //Method "[I".clone:()Ljava/lang/Object + // + // Note that using `Object.clone()` would work as well, but only because the JVM + // relaxes protected access specifically if the receiver is an array: + // http://hg.openjdk.java.net/jdk8/jdk8/hotspot/file/87ee5ee27509/src/share/vm/interpreter/linkResolver.cpp#l439 + // Example: `class C { override def clone(): Object = "hi" }` + // Emitting `def f(c: C) = c.clone()` as `Object.clone()` gives a VerifyError. 
+ val target: String = tpeTK(qual).asRefBType.classOrArrayType + val methodBType = asmMethodType(sym) + bc.invokevirtual(target, sym.javaSimpleName, methodBType.descriptor) + generatedType = methodBType.returnType + } else { + val receiverClass = if (!invokeStyle.isVirtual) null else { + // receiverClass is used in the bytecode to as the method receiver. using sym.owner + // may lead to IllegalAccessErrors, see 9954eaf / aladdin bug 455. + val qualSym = qual.tpe.typeSymbol + if (qualSym == defn.ArrayClass) { + // For invocations like `Array(1).hashCode` or `.wait()`, use Object as receiver + // in the bytecode. Using the array descriptor (like we do for clone above) seems + // to work as well, but it seems safer not to change this. Javac also uses Object. + // Note that array apply/update/length are handled by isPrimitive (above). + assert(sym.owner == defn.ObjectClass, s"unexpected array call: $app") + defn.ObjectClass + } else qualSym + } + generatedType = genCallMethod(sym, invokeStyle, app.span, receiverClass) + } + } + } + + generatedType + } // end of genApply() + + private def genArrayValue(av: tpd.JavaSeqLiteral): BType = { + val ArrayValue(tpt, elems) = av: @unchecked + + lineNumber(av) + genArray(elems, tpt) + } + + private def genArray(elems: List[Tree], elemType: Type): BType = { + val elmKind = toTypeKind(elemType) + val generatedType = ArrayBType(elmKind) + + bc iconst elems.length + bc newarray elmKind + + var i = 0 + var rest = elems + while (!rest.isEmpty) { + bc dup generatedType + bc iconst i + genLoad(rest.head, elmKind) + bc astore elmKind + rest = rest.tail + i = i + 1 + } + + generatedType + } + + /* A Match node contains one or more case clauses, each case clause lists one or more + * Int/String values to use as keys, and a code block. The exception is the "default" case + * clause which doesn't list any key (there is exactly one of these per match). + */ + private def genMatchTo(tree: Match, expectedType: BType, dest: LoadDestination): BType = tree match { + case Match(selector, cases) => + lineNumber(tree) + + val (generatedType, postMatch, postMatchDest) = + if dest == LoadDestination.FallThrough then + val postMatch = new asm.Label + (tpeTK(tree), postMatch, LoadDestination.Jump(postMatch)) + else + (expectedType, null, dest) + + // Only two possible selector types exist in `Match` trees at this point: Int and String + if (tpeTK(selector) == INT) { + + /* On a first pass over the case clauses, we flatten the keys and their + * targets (the latter represented with asm.Labels). That representation + * allows JCodeMethodV to emit a lookupswitch or a tableswitch. + * + * On a second pass, we emit the switch blocks, one for each different target. + */ + + var flatKeys: List[Int] = Nil + var targets: List[asm.Label] = Nil + var default: asm.Label = null + var switchBlocks: List[(asm.Label, Tree)] = Nil + + genLoad(selector, INT) + + // collect switch blocks and their keys, but don't emit yet any switch-block. 
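+          // For example, `x match { case 1 => a; case 2 | 3 => b; case _ => c }` contributes
+          // the flat keys 1, 2, 3 with their switch-block labels and a default label for the
+          // wildcard; emitSWITCH below then emits a lookupswitch or tableswitch from them.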
+ for (caze @ CaseDef(pat, guard, body) <- cases) { + assert(guard == tpd.EmptyTree, guard) + val switchBlockPoint = new asm.Label + switchBlocks ::= (switchBlockPoint, body) + pat match { + case Literal(value) => + flatKeys ::= value.intValue + targets ::= switchBlockPoint + case Ident(nme.WILDCARD) => + assert(default == null, s"multiple default targets in a Match node, at ${tree.span}") + default = switchBlockPoint + case Alternative(alts) => + alts foreach { + case Literal(value) => + flatKeys ::= value.intValue + targets ::= switchBlockPoint + case _ => + abort(s"Invalid alternative in alternative pattern in Match node: $tree at: ${tree.span}") + } + case _ => + abort(s"Invalid pattern in Match node: $tree at: ${tree.span}") + } + } + + bc.emitSWITCH(mkArrayReverse(flatKeys), mkArrayL(targets.reverse), default, MIN_SWITCH_DENSITY) + + // emit switch-blocks. + for (sb <- switchBlocks.reverse) { + val (caseLabel, caseBody) = sb + markProgramPoint(caseLabel) + genLoadTo(caseBody, generatedType, postMatchDest) + } + } else { + + /* Since the JVM doesn't have a way to switch on a string, we switch + * on the `hashCode` of the string then do an `equals` check (with a + * possible second set of jumps if blocks can be reach from multiple + * string alternatives). + * + * This mirrors the way that Java compiles `switch` on Strings. + */ + + var default: asm.Label = null + var indirectBlocks: List[(asm.Label, Tree)] = Nil + + + // Cases grouped by their hashCode + val casesByHash = SortedMap.empty[Int, List[(String, Either[asm.Label, Tree])]] + var caseFallback: Tree = null + + for (caze @ CaseDef(pat, guard, body) <- cases) { + assert(guard == tpd.EmptyTree, guard) + pat match { + case Literal(value) => + val strValue = value.stringValue + casesByHash.updateWith(strValue.##) { existingCasesOpt => + val newCase = (strValue, Right(body)) + Some(newCase :: existingCasesOpt.getOrElse(Nil)) + } + case Ident(nme.WILDCARD) => + assert(default == null, s"multiple default targets in a Match node, at ${tree.span}") + default = new asm.Label + indirectBlocks ::= (default, body) + case Alternative(alts) => + // We need an extra basic block since multiple strings can lead to this code + val indirectCaseGroupLabel = new asm.Label + indirectBlocks ::= (indirectCaseGroupLabel, body) + alts foreach { + case Literal(value) => + val strValue = value.stringValue + casesByHash.updateWith(strValue.##) { existingCasesOpt => + val newCase = (strValue, Left(indirectCaseGroupLabel)) + Some(newCase :: existingCasesOpt.getOrElse(Nil)) + } + case _ => + abort(s"Invalid alternative in alternative pattern in Match node: $tree at: ${tree.span}") + } + + case _ => + abort(s"Invalid pattern in Match node: $tree at: ${tree.span}") + } + } + + // Organize the hashCode options into switch cases + var flatKeys: List[Int] = Nil + var targets: List[asm.Label] = Nil + var hashBlocks: List[(asm.Label, List[(String, Either[asm.Label, Tree])])] = Nil + for ((hashValue, hashCases) <- casesByHash) { + val switchBlockPoint = new asm.Label + hashBlocks ::= (switchBlockPoint, hashCases) + flatKeys ::= hashValue + targets ::= switchBlockPoint + } + + // Push the hashCode of the string (or `0` it is `null`) onto the stack and switch on it + genLoadIfTo( + If( + tree.selector.select(defn.Any_==).appliedTo(nullLiteral), + Literal(Constant(0)), + tree.selector.select(defn.Any_hashCode).appliedToNone + ), + INT, + LoadDestination.FallThrough + ) + bc.emitSWITCH(mkArrayReverse(flatKeys), mkArrayL(targets.reverse), default, MIN_SWITCH_DENSITY) + + 
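+            // At this point the switch over (if (s == null) 0 else s.hashCode) has been emitted;
+            // the hash blocks below compare the candidate strings with `==`/`equals` before
+            // jumping into the corresponding case body.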
// emit blocks for each hash case + for ((hashLabel, caseAlternatives) <- hashBlocks.reverse) { + markProgramPoint(hashLabel) + for ((caseString, indirectLblOrBody) <- caseAlternatives) { + val comparison = if (caseString == null) defn.Any_== else defn.Any_equals + val condp = Literal(Constant(caseString)).select(defn.Any_==).appliedTo(tree.selector) + val keepGoing = new asm.Label + indirectLblOrBody match { + case Left(jump) => + genCond(condp, jump, keepGoing, targetIfNoJump = keepGoing) + + case Right(caseBody) => + val thisCaseMatches = new asm.Label + genCond(condp, thisCaseMatches, keepGoing, targetIfNoJump = thisCaseMatches) + markProgramPoint(thisCaseMatches) + genLoadTo(caseBody, generatedType, postMatchDest) + } + markProgramPoint(keepGoing) + } + bc goTo default + } + + // emit blocks for common patterns + for ((caseLabel, caseBody) <- indirectBlocks.reverse) { + markProgramPoint(caseLabel) + genLoadTo(caseBody, generatedType, postMatchDest) + } + } + + if postMatch != null then + markProgramPoint(postMatch) + generatedType + } + + def genBlockTo(tree: Block, expectedType: BType, dest: LoadDestination): Unit = tree match { + case Block(stats, expr) => + + val savedScope = varsInScope + varsInScope = Nil + stats foreach genStat + genLoadTo(expr, expectedType, dest) + emitLocalVarScopes() + varsInScope = savedScope + } + + /** Add entries to the `LocalVariableTable` JVM attribute for all the vars in + * `varsInScope`, ending at the current program point. + */ + def emitLocalVarScopes(): Unit = + if (emitVars) { + val end = currProgramPoint() + for ((sym, start) <- varsInScope.reverse) { + emitLocalVarScope(sym, start, end) + } + } + end emitLocalVarScopes + + def adapt(from: BType, to: BType): Unit = { + if (!from.conformsTo(to)) { + to match { + case UNIT => bc drop from + case _ => bc.emitT2T(from, to) + } + } else if (from.isNothingType) { + /* There are two possibilities for from.isNothingType: emitting a "throw e" expressions and + * loading a (phantom) value of type Nothing. + * + * The Nothing type in Scala's type system does not exist in the JVM. In bytecode, Nothing + * is mapped to scala.runtime.Nothing$. To the JVM, a call to Predef.??? looks like it would + * return an object of type Nothing$. We need to do something with that phantom object on + * the stack. "Phantom" because it never exists: such methods always throw, but the JVM does + * not know that. + * + * Note: The two verifiers (old: type inference, new: type checking) have different + * requirements. Very briefly: + * + * Old (http://docs.oracle.com/javase/specs/jvms/se8/html/jvms-4.html#jvms-4.10.2.1): at + * each program point, no matter what branches were taken to get there + * - Stack is same size and has same typed values + * - Local and stack values need to have consistent types + * - In practice, the old verifier seems to ignore unreachable code and accept any + * instructions after an ATHROW. For example, there can be another ATHROW (without + * loading another throwable first). + * + * New (http://docs.oracle.com/javase/specs/jvms/se8/html/jvms-4.html#jvms-4.10.1) + * - Requires consistent stack map frames. GenBCode generates stack frames if -target:jvm-1.6 + * or higher. + * - In practice: the ASM library computes stack map frames for us (ClassWriter). Emitting + * correct frames after an ATHROW is probably complex, so ASM uses the following strategy: + * - Every time when generating an ATHROW, a new basic block is started. 
+ * - During classfile writing, such basic blocks are found to be dead: no branches go there + * - Eliminating dead code would probably require complex shifts in the output byte buffer + * - But there's an easy solution: replace all code in the dead block with with + * `nop; nop; ... nop; athrow`, making sure the bytecode size stays the same + * - The corresponding stack frame can be easily generated: on entering a dead the block, + * the frame requires a single Throwable on the stack. + * - Since there are no branches to the dead block, the frame requirements are never violated. + * + * To summarize the above: it does matter what we emit after an ATHROW. + * + * NOW: if we end up here because we emitted a load of a (phantom) value of type Nothing$, + * there was no ATHROW emitted. So, we have to make the verifier happy and do something + * with that value. Since Nothing$ extends Throwable, the easiest is to just emit an ATHROW. + * + * If we ended up here because we generated a "throw e" expression, we know the last + * emitted instruction was an ATHROW. As explained above, it is OK to emit a second ATHROW, + * the verifiers will be happy. + */ + if (lastInsn.getOpcode != asm.Opcodes.ATHROW) + emit(asm.Opcodes.ATHROW) + } else if (from.isNullType) { + /* After loading an expression of type `scala.runtime.Null$`, introduce POP; ACONST_NULL. + * This is required to pass the verifier: in Scala's type system, Null conforms to any + * reference type. In bytecode, the type Null is represented by scala.runtime.Null$, which + * is not a subtype of all reference types. Example: + * + * def nl: Null = null // in bytecode, nl has return type scala.runtime.Null$ + * val a: String = nl // OK for Scala but not for the JVM, scala.runtime.Null$ does not conform to String + * + * In order to fix the above problem, the value returned by nl is dropped and ACONST_NULL is + * inserted instead - after all, an expression of type scala.runtime.Null$ can only be null. + */ + if (lastInsn.getOpcode != asm.Opcodes.ACONST_NULL) { + bc drop from + emit(asm.Opcodes.ACONST_NULL) + } + } + else (from, to) match { + case (BYTE, LONG) | (SHORT, LONG) | (CHAR, LONG) | (INT, LONG) => bc.emitT2T(INT, LONG) + case _ => () + } + } + + /* Emit code to Load the qualifier of `tree` on top of the stack. 
*/ + def genLoadQualifier(tree: Tree): Unit = { + lineNumber(tree) + tree match { + case DesugaredSelect(qualifier, _) => genLoad(qualifier) + case t: Ident => // dotty specific + cachedDesugarIdent(t) match { + case Some(sel) => genLoadQualifier(sel) + case None => + assert(t.symbol.owner == this.claszSymbol) + } + case _ => abort(s"Unknown qualifier $tree") + } + } + + def genLoadArguments(args: List[Tree], btpes: List[BType]): Unit = + args match + case arg :: args1 => + btpes match + case btpe :: btpes1 => + genLoad(arg, btpe) + genLoadArguments(args1, btpes1) + case _ => + case _ => + + def genLoadModule(tree: Tree): BType = { + val module = ( + if (!tree.symbol.is(PackageClass)) tree.symbol + else tree.symbol.info.member(nme.PACKAGE).symbol match { + case NoSymbol => abort(s"SI-5604: Cannot use package as value: $tree") + case s => abort(s"SI-5604: found package class where package object expected: $tree") + } + ) + lineNumber(tree) + genLoadModule(module) + symInfoTK(module) + } + + def genLoadModule(module: Symbol): Unit = { + def inStaticMethod = methSymbol != null && methSymbol.isStaticMember + if (claszSymbol == module.moduleClass && jMethodName != "readResolve" && !inStaticMethod) { + mnode.visitVarInsn(asm.Opcodes.ALOAD, 0) + } else { + val mbt = symInfoTK(module).asClassBType + mnode.visitFieldInsn( + asm.Opcodes.GETSTATIC, + mbt.internalName /* + "$" */ , + str.MODULE_INSTANCE_FIELD, + mbt.descriptor // for nostalgics: toTypeKind(module.tpe).descriptor + ) + } + } + + def genConversion(from: BType, to: BType, cast: Boolean): Unit = { + if (cast) { bc.emitT2T(from, to) } + else { + bc drop from + bc boolconst (from == to) + } + } + + def genCast(to: RefBType, cast: Boolean): Unit = { + if (cast) { bc checkCast to } + else { bc isInstance to } + } + + /* Is the given symbol a primitive operation? */ + def isPrimitive(fun: Tree): Boolean = { + primitives.isPrimitive(fun) + } + + /* Generate coercion denoted by "code" */ + def genCoercion(code: Int): Unit = { + import ScalaPrimitivesOps._ + (code: @switch) match { + case B2B | S2S | C2C | I2I | L2L | F2F | D2D => () + case _ => + val from = coercionFrom(code) + val to = coercionTo(code) + bc.emitT2T(from, to) + } + } + + /* Generate string concatenation + * + * On JDK 8: create and append using `StringBuilder` + * On JDK 9+: use `invokedynamic` with `StringConcatFactory` + */ + def genStringConcat(tree: Tree): BType = { + lineNumber(tree) + liftStringConcat(tree) match { + // Optimization for expressions of the form "" + x + case List(Literal(Constant("")), arg) => + genLoad(arg, ObjectRef) + genCallMethod(defn.String_valueOf_Object, InvokeStyle.Static) + + case concatenations => + val concatArguments = concatenations.view + .filter { + case Literal(Constant("")) => false // empty strings are no-ops in concatenation + case _ => true + } + .map { + case Apply(boxOp, value :: Nil) if Erasure.Boxing.isBox(boxOp.symbol) && boxOp.symbol.denot.owner != defn.UnitModuleClass => + // Eliminate boxing of primitive values. Boxing is introduced by erasure because + // there's only a single synthetic `+` method "added" to the string class. 
+ value + case other => other + } + .toList + + // `StringConcatFactory` only got added in JDK 9, so use `StringBuilder` for lower + if (classfileVersion < asm.Opcodes.V9) { + + // Estimate capacity needed for the string builder + val approxBuilderSize = concatArguments.view.map { + case Literal(Constant(s: String)) => s.length + case Literal(c @ Constant(_)) if c.isNonUnitAnyVal => String.valueOf(c).length + case _ => 0 + }.sum + bc.genNewStringBuilder(approxBuilderSize) + + for (elem <- concatArguments) { + val elemType = tpeTK(elem) + genLoad(elem, elemType) + bc.genStringBuilderAppend(elemType) + } + bc.genStringBuilderEnd + } else { + + /* `StringConcatFactory#makeConcatWithConstants` accepts max 200 argument slots. If + * the string concatenation is longer (unlikely), we spill into multiple calls + */ + val MaxIndySlots = 200 + val TagArg = '\u0001' // indicates a hole (in the recipe string) for an argument + val TagConst = '\u0002' // indicates a hole (in the recipe string) for a constant + + val recipe = new StringBuilder() + val argTypes = Seq.newBuilder[asm.Type] + val constVals = Seq.newBuilder[String] + var totalArgSlots = 0 + var countConcats = 1 // ie. 1 + how many times we spilled + + for (elem <- concatArguments) { + val tpe = tpeTK(elem) + val elemSlots = tpe.size + + // Unlikely spill case + if (totalArgSlots + elemSlots >= MaxIndySlots) { + bc.genIndyStringConcat(recipe.toString, argTypes.result(), constVals.result()) + countConcats += 1 + totalArgSlots = 0 + recipe.setLength(0) + argTypes.clear() + constVals.clear() + } + + elem match { + case Literal(Constant(s: String)) => + if (s.contains(TagArg) || s.contains(TagConst)) { + totalArgSlots += elemSlots + recipe.append(TagConst) + constVals += s + } else { + recipe.append(s) + } + + case other => + totalArgSlots += elemSlots + recipe.append(TagArg) + val tpe = tpeTK(elem) + argTypes += tpe.toASMType + genLoad(elem, tpe) + } + } + bc.genIndyStringConcat(recipe.toString, argTypes.result(), constVals.result()) + + // If we spilled, generate one final concat + if (countConcats > 1) { + bc.genIndyStringConcat( + TagArg.toString * countConcats, + Seq.fill(countConcats)(StringRef.toASMType), + Seq.empty + ) + } + } + } + StringRef + } + + /** + * Generate a method invocation. If `specificReceiver != null`, it is used as receiver in the + * invocation instruction, otherwise `method.owner`. A specific receiver class is needed to + * prevent an IllegalAccessError, (aladdin bug 455). + */ + def genCallMethod(method: Symbol, style: InvokeStyle, pos: Span = NoSpan, specificReceiver: Symbol = null): BType = { + val methodOwner = method.owner + + // the class used in the invocation's method descriptor in the classfile + val receiverClass = { + if (specificReceiver != null) + assert(style.isVirtual || specificReceiver == methodOwner, s"specificReceiver can only be specified for virtual calls. $method - $specificReceiver") + + val useSpecificReceiver = specificReceiver != null && !defn.isBottomClass(specificReceiver) && !method.isScalaStatic + val receiver = if (useSpecificReceiver) specificReceiver else methodOwner + + // workaround for a JVM bug: https://bugs.openjdk.java.net/browse/JDK-8154587 + // when an interface method overrides a member of Object (note that all interfaces implicitly + // have superclass Object), the receiver needs to be the interface declaring the override (and + // not a sub-interface that inherits it). 
example: + // trait T { override def clone(): Object = "" } + // trait U extends T + // class C extends U + // class D { def f(u: U) = u.clone() } + // The invocation `u.clone()` needs `T` as a receiver: + // - using Object is illegal, as Object.clone is protected + // - using U results in a `NoSuchMethodError: U.clone. This is the JVM bug. + // Note that a mixin forwarder is generated, so the correct method is executed in the end: + // class C { override def clone(): Object = super[T].clone() } + val isTraitMethodOverridingObjectMember = { + receiver != methodOwner && // fast path - the boolean is used to pick either of these two, if they are the same it does not matter + style.isVirtual && + isEmittedInterface(receiver) && + defn.ObjectType.decl(method.name).symbol.exists && { // fast path - compute overrideChain on the next line only if necessary + val syms = method.allOverriddenSymbols.toList + !syms.isEmpty && syms.last.owner == defn.ObjectClass + } + } + if (isTraitMethodOverridingObjectMember) methodOwner else receiver + } + + receiverClass.info // ensure types the type is up to date; erasure may add lateINTERFACE to traits + val receiverName = internalName(receiverClass) + + val jname = method.javaSimpleName + val bmType = asmMethodType(method) + val mdescr = bmType.descriptor + + val isInterface = isEmittedInterface(receiverClass) + import InvokeStyle._ + if (style == Super) { + if (isInterface && !method.is(JavaDefined)) { + val args = new Array[BType](bmType.argumentTypes.length + 1) + val ownerBType = toTypeKind(method.owner.info) + bmType.argumentTypes.copyToArray(args, 1) + val staticDesc = MethodBType(ownerBType :: bmType.argumentTypes, bmType.returnType).descriptor + val staticName = traitSuperAccessorName(method) + bc.invokestatic(receiverName, staticName, staticDesc, isInterface) + } else { + bc.invokespecial(receiverName, jname, mdescr, isInterface) + } + } else { + val opc = style match { + case Static => Opcodes.INVOKESTATIC + case Special => Opcodes.INVOKESPECIAL + case Virtual => if (isInterface) Opcodes.INVOKEINTERFACE else Opcodes.INVOKEVIRTUAL + } + bc.emitInvoke(opc, receiverName, jname, mdescr, isInterface) + } + + bmType.returnType + } // end of genCallMethod() + + /* Generate the scala ## method. */ + def genScalaHash(tree: Tree): BType = { + genLoad(tree, ObjectRef) + genCallMethod(NoSymbol, InvokeStyle.Static) // used to dispatch ## on primitives to ScalaRuntime.hash. Should be implemented by a miniphase + } + + /* + * Returns a list of trees that each should be concatenated, from left to right. + * It turns a chained call like "a".+("b").+("c") into a list of arguments. + */ + def liftStringConcat(tree: Tree): List[Tree] = tree match { + case tree @ Apply(fun @ DesugaredSelect(larg, method), rarg) => + if (isPrimitive(fun) && + primitives.getPrimitive(tree, larg.tpe) == ScalaPrimitivesOps.CONCAT) + liftStringConcat(larg) ::: rarg + else + tree :: Nil + case _ => + tree :: Nil + } + + /* Emit code to compare the two top-most stack values using the 'op' operator. 
*/ + private def genCJUMP(success: asm.Label, failure: asm.Label, op: TestOp, tk: BType, targetIfNoJump: asm.Label, negated: Boolean = false): Unit = { + if (targetIfNoJump == success) genCJUMP(failure, success, op.negate(), tk, targetIfNoJump, negated = !negated) + else { + if (tk.isIntSizedType) { // BOOL, BYTE, CHAR, SHORT, or INT + bc.emitIF_ICMP(op, success) + } else if (tk.isRef) { // REFERENCE(_) | ARRAY(_) + bc.emitIF_ACMP(op, success) + } else { + import Primitives._ + def useCmpG = if (negated) op == GT || op == GE else op == LT || op == LE + (tk: @unchecked) match { + case LONG => emit(asm.Opcodes.LCMP) + case FLOAT => emit(if (useCmpG) asm.Opcodes.FCMPG else asm.Opcodes.FCMPL) + case DOUBLE => emit(if (useCmpG) asm.Opcodes.DCMPG else asm.Opcodes.DCMPL) + } + bc.emitIF(op, success) + } + if (targetIfNoJump != failure) bc goTo failure + } + } + + /* Emits code to compare (and consume) stack-top and zero using the 'op' operator */ + private def genCZJUMP(success: asm.Label, failure: asm.Label, op: TestOp, tk: BType, targetIfNoJump: asm.Label, negated: Boolean = false): Unit = { + import Primitives._ + if (targetIfNoJump == success) genCZJUMP(failure, success, op.negate(), tk, targetIfNoJump, negated = !negated) + else { + if (tk.isIntSizedType) { // BOOL, BYTE, CHAR, SHORT, or INT + bc.emitIF(op, success) + } else if (tk.isRef) { // REFERENCE(_) | ARRAY(_) + (op: @unchecked) match { // references are only compared with EQ and NE + case EQ => bc emitIFNULL success + case NE => bc emitIFNONNULL success + } + } else { + def useCmpG = if (negated) op == GT || op == GE else op == LT || op == LE + (tk: @unchecked) match { + case LONG => + emit(asm.Opcodes.LCONST_0) + emit(asm.Opcodes.LCMP) + case FLOAT => + emit(asm.Opcodes.FCONST_0) + emit(if (useCmpG) asm.Opcodes.FCMPG else asm.Opcodes.FCMPL) + case DOUBLE => + emit(asm.Opcodes.DCONST_0) + emit(if (useCmpG) asm.Opcodes.DCMPG else asm.Opcodes.DCMPL) + } + bc.emitIF(op, success) + } + if (targetIfNoJump != failure) bc goTo failure + } + } + + def testOpForPrimitive(primitiveCode: Int) = (primitiveCode: @switch) match { + case ScalaPrimitivesOps.ID => Primitives.EQ + case ScalaPrimitivesOps.NI => Primitives.NE + case ScalaPrimitivesOps.EQ => Primitives.EQ + case ScalaPrimitivesOps.NE => Primitives.NE + case ScalaPrimitivesOps.LT => Primitives.LT + case ScalaPrimitivesOps.LE => Primitives.LE + case ScalaPrimitivesOps.GT => Primitives.GT + case ScalaPrimitivesOps.GE => Primitives.GE + } + + /* + * Generate code for conditional expressions. + * The jump targets success/failure of the test are `then-target` and `else-target` resp. 
+ */ + private def genCond(tree: Tree, success: asm.Label, failure: asm.Label, targetIfNoJump: asm.Label): Unit = { + + def genComparisonOp(l: Tree, r: Tree, code: Int): Unit = { + val op = testOpForPrimitive(code) + def isNull(t: Tree): Boolean = t match { + case Literal(Constant(null)) => true + case _ => false + } + def ifOneIsNull(l: Tree, r: Tree): Tree = if (isNull(l)) r else if (isNull(r)) l else null + val nonNullSide = if (ScalaPrimitivesOps.isReferenceEqualityOp(code)) ifOneIsNull(l, r) else null + if (nonNullSide != null) { + // special-case reference (in)equality test for null (null eq x, x eq null) + genLoad(nonNullSide, ObjectRef) + genCZJUMP(success, failure, op, ObjectRef, targetIfNoJump) + } else { + val tk = tpeTK(l).maxType(tpeTK(r)) + genLoad(l, tk) + genLoad(r, tk) + genCJUMP(success, failure, op, tk, targetIfNoJump) + } + } + + def loadAndTestBoolean() = { + genLoad(tree, BOOL) + genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) + } + + lineNumber(tree) + tree match { + + case tree @ Apply(fun, args) if primitives.isPrimitive(fun.symbol) => + import ScalaPrimitivesOps.{ ZNOT, ZAND, ZOR, EQ } + + // lhs and rhs of test + lazy val DesugaredSelect(lhs, _) = fun: @unchecked + val rhs = if (args.isEmpty) tpd.EmptyTree else args.head // args.isEmpty only for ZNOT + + def genZandOrZor(and: Boolean): Unit = { + // reaching "keepGoing" indicates the rhs should be evaluated too (ie not short-circuited). + val keepGoing = new asm.Label + + if (and) genCond(lhs, keepGoing, failure, targetIfNoJump = keepGoing) + else genCond(lhs, success, keepGoing, targetIfNoJump = keepGoing) + + markProgramPoint(keepGoing) + genCond(rhs, success, failure, targetIfNoJump) + } + + primitives.getPrimitive(fun.symbol) match { + case ZNOT => genCond(lhs, failure, success, targetIfNoJump) + case ZAND => genZandOrZor(and = true) + case ZOR => genZandOrZor(and = false) + case code => + if (ScalaPrimitivesOps.isUniversalEqualityOp(code) && tpeTK(lhs).isClass) { + // rewrite `==` to null tests and `equals`. not needed for arrays (`equals` is reference equality). + if (code == EQ) genEqEqPrimitive(lhs, rhs, success, failure, targetIfNoJump) + else genEqEqPrimitive(lhs, rhs, failure, success, targetIfNoJump) + } else if (ScalaPrimitivesOps.isComparisonOp(code)) { + genComparisonOp(lhs, rhs, code) + } else + loadAndTestBoolean() + } + + case Block(stats, expr) => + /* Push the decision further down the `expr`. + * This is particularly effective for the shape of do..while loops. + */ + val savedScope = varsInScope + varsInScope = Nil + stats foreach genStat + genCond(expr, success, failure, targetIfNoJump) + emitLocalVarScopes() + varsInScope = savedScope + + case If(condp, thenp, elsep) => + val innerSuccess = new asm.Label + val innerFailure = new asm.Label + genCond(condp, innerSuccess, innerFailure, targetIfNoJump = innerSuccess) + markProgramPoint(innerSuccess) + genCond(thenp, success, failure, targetIfNoJump = innerFailure) + markProgramPoint(innerFailure) + genCond(elsep, success, failure, targetIfNoJump) + + case _ => loadAndTestBoolean() + } + + } // end of genCond() + + /* + * Generate the "==" code for object references. 
It is equivalent of + * if (l eq null) r eq null else l.equals(r); + * + * @param l left-hand-side of the '==' + * @param r right-hand-side of the '==' + */ + def genEqEqPrimitive(l: Tree, r: Tree, success: asm.Label, failure: asm.Label, targetIfNoJump: asm.Label): Unit = { + + /* True if the equality comparison is between values that require the use of the rich equality + * comparator (scala.runtime.Comparator.equals). This is the case when either side of the + * comparison might have a run-time type subtype of java.lang.Number or java.lang.Character. + * When it is statically known that both sides are equal and subtypes of Number of Character, + * not using the rich equality is possible (their own equals method will do ok.) + */ + val mustUseAnyComparator: Boolean = { + val areSameFinals = l.tpe.typeSymbol.is(Final) && r.tpe.typeSymbol.is(Final) && (l.tpe =:= r.tpe) + // todo: remove + def isMaybeBoxed(sym: Symbol): Boolean = { + (sym == defn.ObjectClass) || + (sym == defn.JavaSerializableClass) || + (sym == defn.ComparableClass) || + (sym derivesFrom defn.BoxedNumberClass) || + (sym derivesFrom defn.BoxedCharClass) || + (sym derivesFrom defn.BoxedBooleanClass) + } + !areSameFinals && isMaybeBoxed(l.tpe.typeSymbol) && isMaybeBoxed(r.tpe.typeSymbol) + } + def isNull(t: Tree): Boolean = t match { + case Literal(Constant(null)) => true + case _ => false + } + def isNonNullExpr(t: Tree): Boolean = t.isInstanceOf[Literal] || ((t.symbol ne null) && t.symbol.is(Module)) + + if (mustUseAnyComparator) { + val equalsMethod: Symbol = { + if (l.tpe <:< defn.BoxedNumberClass.info) { + if (r.tpe <:< defn.BoxedNumberClass.info) defn.BoxesRunTimeModule.requiredMethod(nme.equalsNumNum) + else if (r.tpe <:< defn.BoxedCharClass.info) NoSymbol // ctx.requiredMethod(BoxesRunTimeTypeRef, nme.equalsNumChar) // this method is private + else defn.BoxesRunTimeModule.requiredMethod(nme.equalsNumObject) + } else defn.BoxesRunTimeModule_externalEquals + } + + genLoad(l, ObjectRef) + genLoad(r, ObjectRef) + genCallMethod(equalsMethod, InvokeStyle.Static) + genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) + } + else { + if (isNull(l)) { + // null == expr -> expr eq null + genLoad(r, ObjectRef) + genCZJUMP(success, failure, Primitives.EQ, ObjectRef, targetIfNoJump) + } else if (isNull(r)) { + // expr == null -> expr eq null + genLoad(l, ObjectRef) + genCZJUMP(success, failure, Primitives.EQ, ObjectRef, targetIfNoJump) + } else if (isNonNullExpr(l)) { + // SI-7852 Avoid null check if L is statically non-null. 
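+        // so `l == r` is emitted directly as `l.equals(r)`, without the `l eq null` guard.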
+ genLoad(l, ObjectRef) + genLoad(r, ObjectRef) + genCallMethod(defn.Any_equals, InvokeStyle.Virtual) + genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) + } else { + // l == r -> if (l eq null) r eq null else l.equals(r) + val eqEqTempLocal = locals.makeLocal(ObjectRef, nme.EQEQ_LOCAL_VAR.mangledString, defn.ObjectType, r.span) + val lNull = new asm.Label + val lNonNull = new asm.Label + + genLoad(l, ObjectRef) + genLoad(r, ObjectRef) + locals.store(eqEqTempLocal) + bc dup ObjectRef + genCZJUMP(lNull, lNonNull, Primitives.EQ, ObjectRef, targetIfNoJump = lNull) + + markProgramPoint(lNull) + bc drop ObjectRef + locals.load(eqEqTempLocal) + genCZJUMP(success, failure, Primitives.EQ, ObjectRef, targetIfNoJump = lNonNull) + + markProgramPoint(lNonNull) + locals.load(eqEqTempLocal) + genCallMethod(defn.Any_equals, InvokeStyle.Virtual) + genCZJUMP(success, failure, Primitives.NE, BOOL, targetIfNoJump) + } + } + } + + + def genSynchronized(tree: Apply, expectedType: BType): BType + def genLoadTry(tree: Try): BType + + def genInvokeDynamicLambda(ctor: Symbol, lambdaTarget: Symbol, environmentSize: Int, functionalInterface: Symbol): BType = { + import java.lang.invoke.LambdaMetafactory.{FLAG_BRIDGES, FLAG_SERIALIZABLE} + + report.debuglog(s"Using invokedynamic rather than `new ${ctor.owner}`") + val generatedType = classBTypeFromSymbol(functionalInterface) + // Lambdas should be serializable if they implement a SAM that extends Serializable or if they + // implement a scala.Function* class. + val isSerializable = functionalInterface.isSerializable || defn.isFunctionClass(functionalInterface) + val isInterface = isEmittedInterface(lambdaTarget.owner) + val invokeStyle = + if (lambdaTarget.isStaticMember) asm.Opcodes.H_INVOKESTATIC + else if (lambdaTarget.is(Private) || lambdaTarget.isClassConstructor) asm.Opcodes.H_INVOKESPECIAL + else if (isInterface) asm.Opcodes.H_INVOKEINTERFACE + else asm.Opcodes.H_INVOKEVIRTUAL + + val targetHandle = + new asm.Handle(invokeStyle, + classBTypeFromSymbol(lambdaTarget.owner).internalName, + lambdaTarget.javaSimpleName, + asmMethodType(lambdaTarget).descriptor, + /* itf = */ isInterface) + + val (a,b) = lambdaTarget.info.firstParamTypes.splitAt(environmentSize) + var (capturedParamsTypes, lambdaParamTypes) = (a,b) + + if (invokeStyle != asm.Opcodes.H_INVOKESTATIC) capturedParamsTypes = lambdaTarget.owner.info :: capturedParamsTypes + + // Requires https://github.com/scala/scala-java8-compat on the runtime classpath + val returnUnit = lambdaTarget.info.resultType.typeSymbol == defn.UnitClass + val functionalInterfaceDesc: String = generatedType.descriptor + val desc = capturedParamsTypes.map(tpe => toTypeKind(tpe)).mkString(("("), "", ")") + functionalInterfaceDesc + // TODO specialization + val instantiatedMethodType = new MethodBType(lambdaParamTypes.map(p => toTypeKind(p)), toTypeKind(lambdaTarget.info.resultType)).toASMType + + val samMethod = atPhase(erasurePhase) { + val samMethods = toDenot(functionalInterface).info.possibleSamMethods.toList + samMethods match { + case x :: Nil => x.symbol + case Nil => abort(s"${functionalInterface.show} is not a functional interface. It doesn't have abstract methods") + case xs => abort(s"${functionalInterface.show} is not a functional interface. 
" + + s"It has the following abstract methods: ${xs.map(_.name).mkString(", ")}") + } + } + + val methodName = samMethod.javaSimpleName + val samMethodType = asmMethodType(samMethod).toASMType + // scala/bug#10334: make sure that a lambda object for `T => U` has a method `apply(T)U`, not only the `(Object)Object` + // version. Using the lambda a structural type `{def apply(t: T): U}` causes a reflective lookup for this method. + val needsGenericBridge = samMethodType != instantiatedMethodType + val bridgeMethods = atPhase(erasurePhase){ + samMethod.allOverriddenSymbols.toList + } + val overriddenMethodTypes = bridgeMethods.map(b => asmMethodType(b).toASMType) + + // any methods which `samMethod` overrides need bridges made for them + // this is done automatically during erasure for classes we generate, but LMF needs to have them explicitly mentioned + // so we have to compute them at this relatively late point. + val bridgeTypes = ( + if (needsGenericBridge) + instantiatedMethodType +: overriddenMethodTypes + else + overriddenMethodTypes + ).distinct.filterNot(_ == samMethodType) + + val needsBridges = bridgeTypes.nonEmpty + + def flagIf(b: Boolean, flag: Int): Int = if (b) flag else 0 + val flags = flagIf(isSerializable, FLAG_SERIALIZABLE) | flagIf(needsBridges, FLAG_BRIDGES) + + val bsmArgs0 = Seq(samMethodType, targetHandle, instantiatedMethodType) + val bsmArgs1 = if (flags != 0) Seq(Int.box(flags)) else Seq.empty + val bsmArgs2 = if needsBridges then bridgeTypes.length +: bridgeTypes else Seq.empty + + val bsmArgs = bsmArgs0 ++ bsmArgs1 ++ bsmArgs2 + + val metafactory = + if (flags != 0) + jliLambdaMetaFactoryAltMetafactoryHandle // altMetafactory required to be able to pass the flags and additional arguments if needed + else + jliLambdaMetaFactoryMetafactoryHandle + + bc.jmethod.visitInvokeDynamicInsn(methodName, desc, metafactory, bsmArgs: _*) + + generatedType + } + } + + /** Does this symbol actually correspond to an interface that will be emitted? + * In the backend, this should be preferred over `isInterface` because it + * also returns true for the symbols of the fake companion objects we + * create for Java-defined classes as well as for Java annotations + * which we represent as classes. 
+ */ + private def isEmittedInterface(sym: Symbol): Boolean = sym.isInterface || + sym.is(JavaDefined) && (toDenot(sym).isAnnotation || sym.is(ModuleClass) && (sym.companionClass.is(PureInterface)) || sym.companionClass.is(Trait)) + + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala new file mode 100644 index 000000000000..6f83af540bea --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeHelpers.scala @@ -0,0 +1,960 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.annotation.threadUnsafe +import scala.tools.asm +import scala.tools.asm.AnnotationVisitor +import scala.tools.asm.ClassWriter +import scala.collection.mutable + +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.Trees +import dotty.tools.dotc.core.Annotations._ +import dotty.tools.dotc.core.Constants._ +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.core.Phases._ +import dotty.tools.dotc.core.Decorators._ +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.Names.Name +import dotty.tools.dotc.core.NameKinds.ExpandedName +import dotty.tools.dotc.core.Signature +import dotty.tools.dotc.core.StdNames._ +import dotty.tools.dotc.core.NameKinds +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.core.Types +import dotty.tools.dotc.core.Types._ +import dotty.tools.dotc.core.TypeErasure +import dotty.tools.dotc.transform.GenericSignatures +import dotty.tools.dotc.transform.ElimErasedValueType +import dotty.tools.io.AbstractFile +import dotty.tools.dotc.report + +import dotty.tools.backend.jvm.DottyBackendInterface.symExtensions + +/* + * Traits encapsulating functionality to convert Scala AST Trees into ASM ClassNodes. + * + * @author Miguel Garcia, http://lamp.epfl.ch/~magarcia/ScalaCompilerCornerReloaded + * @version 1.0 + * + */ +trait BCodeHelpers extends BCodeIdiomatic with BytecodeWriters { + // for some reason singleton types aren't allowed in constructor calls. 
will need several casts in code to enforce + + //import global._ + //import bTypes._ + //import coreBTypes._ + import bTypes._ + import tpd._ + import coreBTypes._ + import int.{_, given} + import DottyBackendInterface._ + + def ScalaATTRName: String = "Scala" + def ScalaSignatureATTRName: String = "ScalaSig" + + @threadUnsafe lazy val AnnotationRetentionAttr: ClassSymbol = requiredClass("java.lang.annotation.Retention") + @threadUnsafe lazy val AnnotationRetentionSourceAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("SOURCE") + @threadUnsafe lazy val AnnotationRetentionClassAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("CLASS") + @threadUnsafe lazy val AnnotationRetentionRuntimeAttr: TermSymbol = requiredClass("java.lang.annotation.RetentionPolicy").linkedClass.requiredValue("RUNTIME") + + val bCodeAsmCommon: BCodeAsmCommon[int.type] = new BCodeAsmCommon(int) + + /* + * must-single-thread + */ + def getFileForClassfile(base: AbstractFile, clsName: String, suffix: String): AbstractFile = { + getFile(base, clsName, suffix) + } + + /* + * must-single-thread + */ + def getOutFolder(csym: Symbol, cName: String): AbstractFile = { + try { + outputDirectory + } catch { + case ex: Throwable => + report.error(em"Couldn't create file for class $cName\n${ex.getMessage}", ctx.source.atSpan(csym.span)) + null + } + } + + final def traitSuperAccessorName(sym: Symbol): String = { + val nameString = sym.javaSimpleName.toString + if (sym.name == nme.TRAIT_CONSTRUCTOR) nameString + else nameString + "$" + } + + // ----------------------------------------------------------------------------------------- + // finding the least upper bound in agreement with the bytecode verifier (given two internal names handed by ASM) + // Background: + // http://gallium.inria.fr/~xleroy/publi/bytecode-verification-JAR.pdf + // http://comments.gmane.org/gmane.comp.java.vm.languages/2293 + // https://issues.scala-lang.org/browse/SI-3872 + // ----------------------------------------------------------------------------------------- + + /* An `asm.ClassWriter` that uses `jvmWiseLUB()` + * The internal name of the least common ancestor of the types given by inameA and inameB. + * It's what ASM needs to know in order to compute stack map frames, http://asm.ow2.org/doc/developer-guide.html#controlflow + */ + final class CClassWriter(flags: Int) extends asm.ClassWriter(flags) { + + /** + * This method is thread-safe: it depends only on the BTypes component, which does not depend + * on global. TODO @lry move to a different place where no global is in scope, on bTypes. + */ + override def getCommonSuperClass(inameA: String, inameB: String): String = { + val a = classBTypeFromInternalName(inameA) + val b = classBTypeFromInternalName(inameB) + val lub = a.jvmWiseLUB(b) + val lubName = lub.internalName + assert(lubName != "scala/Any") + lubName // ASM caches the answer during the lifetime of a ClassWriter. We outlive that. Not sure whether caching on our side would improve things. + } + } + + /* + * must-single-thread + */ + def initBytecodeWriter(): BytecodeWriter = { + (None: Option[AbstractFile] /*getSingleOutput*/) match { // todo: implement + case Some(f) if f.hasExtension("jar") => + new DirectToJarfileWriter(f.file) + case _ => + factoryNonJarBytecodeWriter() + } + } + + /* + * Populates the InnerClasses JVM attribute with `refedInnerClasses`. See also the doc on inner + * classes in BTypes.scala. 
+ * + * `refedInnerClasses` may contain duplicates, need not contain the enclosing inner classes of + * each inner class it lists (those are looked up and included). + * + * This method serializes in the InnerClasses JVM attribute in an appropriate order, + * not necessarily that given by `refedInnerClasses`. + * + * can-multi-thread + */ + final def addInnerClasses(jclass: asm.ClassVisitor, declaredInnerClasses: List[ClassBType], refedInnerClasses: List[ClassBType]): Unit = { + // sorting ensures nested classes are listed after their enclosing class thus satisfying the Eclipse Java compiler + val allNestedClasses = new mutable.TreeSet[ClassBType]()(Ordering.by(_.internalName)) + allNestedClasses ++= declaredInnerClasses + refedInnerClasses.foreach(allNestedClasses ++= _.enclosingNestedClassesChain) + for nestedClass <- allNestedClasses + do { + // Extract the innerClassEntry - we know it exists, enclosingNestedClassesChain only returns nested classes. + val Some(e) = nestedClass.innerClassAttributeEntry: @unchecked + jclass.visitInnerClass(e.name, e.outerName, e.innerName, e.flags) + } + } + + /* + * can-multi-thread + */ + def createJAttribute(name: String, b: Array[Byte], offset: Int, len: Int): asm.Attribute = { + new asm.Attribute(name) { + override def write(classWriter: ClassWriter, code: Array[Byte], + codeLength: Int, maxStack: Int, maxLocals: Int): asm.ByteVector = { + val byteVector = new asm.ByteVector(len) + byteVector.putByteArray(b, offset, len) + byteVector + } + } + } + + /* + * Custom attribute (JVMS 4.7.1) "ScalaSig" used as marker only + * i.e., the pickle is contained in a custom annotation, see: + * (1) `addAnnotations()`, + * (2) SID # 10 (draft) - Storage of pickled Scala signatures in class files, http://www.scala-lang.org/sid/10 + * (3) SID # 5 - Internals of Scala Annotations, http://www.scala-lang.org/sid/5 + * That annotation in turn is not related to the "java-generic-signature" (JVMS 4.7.9) + * other than both ending up encoded as attributes (JVMS 4.7) + * (with the caveat that the "ScalaSig" attribute is associated to some classes, + * while the "Signature" attribute can be associated to classes, methods, and fields.) + * + */ + trait BCPickles { + + import dotty.tools.dotc.core.unpickleScala2.{ PickleFormat, PickleBuffer } + + val versionPickle = { + val vp = new PickleBuffer(new Array[Byte](16), -1, 0) + assert(vp.writeIndex == 0, vp) + vp writeNat PickleFormat.MajorVersion + vp writeNat PickleFormat.MinorVersion + vp writeNat 0 + vp + } + + /* + * can-multi-thread + */ + def pickleMarkerLocal = { + createJAttribute(ScalaSignatureATTRName, versionPickle.bytes, 0, versionPickle.writeIndex) + } + + /* + * can-multi-thread + */ + def pickleMarkerForeign = { + createJAttribute(ScalaATTRName, new Array[Byte](0), 0, 0) + } + } // end of trait BCPickles + + trait BCInnerClassGen extends caps.Pure { + + def debugLevel = 3 // 0 -> no debug info; 1-> filename; 2-> lines; 3-> varnames + + final val emitSource = debugLevel >= 1 + final val emitLines = debugLevel >= 2 + final val emitVars = debugLevel >= 3 + + /** + * The class internal name for a given class symbol. + */ + final def internalName(sym: Symbol): String = { + // For each java class, the scala compiler creates a class and a module (thus a module class). + // If the `sym` is a java module class, we use the java class instead. This ensures that the + // ClassBType is created from the main class (instead of the module class). + // The two symbols have the same name, so the resulting internalName is the same. 
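+      // (Illustrative note, not part of the original change:) e.g. for a Java-defined
+      // class such as java.util.Map, both the class symbol and its synthetic module
+      // class resolve to the same internal name, "java/util/Map".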
+ val classSym = if (sym.is(JavaDefined) && sym.is(ModuleClass)) sym.linkedClass else sym + getClassBType(classSym).internalName + } + + private def assertClassNotArray(sym: Symbol): Unit = { + assert(sym.isClass, sym) + assert(sym != defn.ArrayClass || compilingArray, sym) + } + + private def assertClassNotArrayNotPrimitive(sym: Symbol): Unit = { + assertClassNotArray(sym) + assert(!primitiveTypeMap.contains(sym) || isCompilingPrimitive, sym) + } + + /** + * The ClassBType for a class symbol. + * + * The class symbol scala.Nothing is mapped to the class scala.runtime.Nothing$. Similarly, + * scala.Null is mapped to scala.runtime.Null$. This is because there exist no class files + * for the Nothing / Null. If used for example as a parameter type, we use the runtime classes + * in the classfile method signature. + * + * Note that the referenced class symbol may be an implementation class. For example when + * compiling a mixed-in method that forwards to the static method in the implementation class, + * the class descriptor of the receiver (the implementation class) is obtained by creating the + * ClassBType. + */ + final def getClassBType(sym: Symbol): ClassBType = { + assertClassNotArrayNotPrimitive(sym) + + if (sym == defn.NothingClass) srNothingRef + else if (sym == defn.NullClass) srNullRef + else classBTypeFromSymbol(sym) + } + + /* + * must-single-thread + */ + final def asmMethodType(msym: Symbol): MethodBType = { + assert(msym.is(Method), s"not a method-symbol: $msym") + val resT: BType = + if (msym.isClassConstructor || msym.isConstructor) UNIT + else toTypeKind(msym.info.resultType) + MethodBType(msym.info.firstParamTypes map toTypeKind, resT) + } + + /** + * The jvm descriptor of a type. + */ + final def typeDescriptor(t: Type): String = { toTypeKind(t).descriptor } + + /** + * The jvm descriptor for a symbol. 
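+     * For example (illustrative note): the symbol of java.lang.String yields the
+     * descriptor "Ljava/lang/String;".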
+ */ + final def symDescriptor(sym: Symbol): String = getClassBType(sym).descriptor + + final def toTypeKind(tp: Type): BType = typeToTypeKind(tp)(BCodeHelpers.this)(this) + + } // end of trait BCInnerClassGen + + trait BCAnnotGen extends BCInnerClassGen { + + /* + * must-single-thread + */ + def emitAnnotations(cw: asm.ClassVisitor, annotations: List[Annotation]): Unit = + for(annot <- annotations; if shouldEmitAnnotation(annot)) { + val typ = annot.tree.tpe + val assocs = assocsFromApply(annot.tree) + val av = cw.visitAnnotation(typeDescriptor(typ), isRuntimeVisible(annot)) + emitAssocs(av, assocs, BCodeHelpers.this)(this) + } + + /* + * must-single-thread + */ + def emitAnnotations(mw: asm.MethodVisitor, annotations: List[Annotation]): Unit = + for(annot <- annotations; if shouldEmitAnnotation(annot)) { + val typ = annot.tree.tpe + val assocs = assocsFromApply(annot.tree) + val av = mw.visitAnnotation(typeDescriptor(typ), isRuntimeVisible(annot)) + emitAssocs(av, assocs, BCodeHelpers.this)(this) + } + + /* + * must-single-thread + */ + def emitAnnotations(fw: asm.FieldVisitor, annotations: List[Annotation]): Unit = + for(annot <- annotations; if shouldEmitAnnotation(annot)) { + val typ = annot.tree.tpe + val assocs = assocsFromApply(annot.tree) + val av = fw.visitAnnotation(typeDescriptor(typ), isRuntimeVisible(annot)) + emitAssocs(av, assocs, BCodeHelpers.this)(this) + } + + /* + * must-single-thread + */ + def emitParamNames(jmethod: asm.MethodVisitor, params: List[Symbol]) = + for param <- params do + var access = asm.Opcodes.ACC_FINAL + if param.is(Artifact) then access |= asm.Opcodes.ACC_SYNTHETIC + jmethod.visitParameter(param.name.mangledString, access) + + /* + * must-single-thread + */ + def emitParamAnnotations(jmethod: asm.MethodVisitor, pannotss: List[List[Annotation]]): Unit = + val annotationss = pannotss map (_ filter shouldEmitAnnotation) + if (annotationss forall (_.isEmpty)) return + for ((annots, idx) <- annotationss.zipWithIndex; + annot <- annots) { + val typ = annot.tree.tpe + val assocs = assocsFromApply(annot.tree) + val pannVisitor: asm.AnnotationVisitor = jmethod.visitParameterAnnotation(idx, typeDescriptor(typ.asInstanceOf[Type]), isRuntimeVisible(annot)) + emitAssocs(pannVisitor, assocs, BCodeHelpers.this)(this) + } + + + private def shouldEmitAnnotation(annot: Annotation): Boolean = { + annot.symbol.is(JavaDefined) && + retentionPolicyOf(annot) != AnnotationRetentionSourceAttr + } + + private def emitAssocs(av: asm.AnnotationVisitor, assocs: List[(Name, Object)], bcodeStore: BCodeHelpers) + (innerClasesStore: bcodeStore.BCInnerClassGen) = { + for ((name, value) <- assocs) + emitArgument(av, name.mangledString, value.asInstanceOf[Tree], bcodeStore)(innerClasesStore) + av.visitEnd() + } + + private def emitArgument(av: AnnotationVisitor, + name: String, + arg: Tree, bcodeStore: BCodeHelpers)(innerClasesStore: bcodeStore.BCInnerClassGen): Unit = { + val narg = normalizeArgument(arg) + // Transformation phases are not run on annotation trees, so we need to run + // `constToLiteral` at this point. 
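+      // (Illustrative note, not part of the original change:) an argument whose type is a
+      // constant type, e.g. the expression `1 + 2` with type ConstantType(3), is folded
+      // into the corresponding Literal so the match below can emit it directly.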
+ val t = atPhase(erasurePhase)(constToLiteral(narg)) + t match { + case Literal(const @ Constant(_)) => + const.tag match { + case BooleanTag | ByteTag | ShortTag | CharTag | IntTag | LongTag | FloatTag | DoubleTag => av.visit(name, const.value) + case StringTag => + assert(const.value != null, const) // TODO this invariant isn't documented in `case class Constant` + av.visit(name, const.stringValue) // `stringValue` special-cases null, but that execution path isn't exercised for a const with StringTag + case ClazzTag => av.visit(name, typeToTypeKind(TypeErasure.erasure(const.typeValue))(bcodeStore)(innerClasesStore).toASMType) + } + case Ident(nme.WILDCARD) => + // An underscore argument indicates that we want to use the default value for this parameter, so do not emit anything + case t: tpd.RefTree if t.symbol.owner.linkedClass.isAllOf(JavaEnumTrait) => + val edesc = innerClasesStore.typeDescriptor(t.tpe) // the class descriptor of the enumeration class. + val evalue = t.symbol.javaSimpleName // value the actual enumeration value. + av.visitEnum(name, edesc, evalue) + case t: SeqLiteral => + val arrAnnotV: AnnotationVisitor = av.visitArray(name) + for (arg <- t.elems) { emitArgument(arrAnnotV, null, arg, bcodeStore)(innerClasesStore) } + arrAnnotV.visitEnd() + + case Apply(fun, args) if fun.symbol == defn.ArrayClass.primaryConstructor || + toDenot(fun.symbol).owner == defn.ArrayClass.linkedClass && fun.symbol.name == nme.apply => + val arrAnnotV: AnnotationVisitor = av.visitArray(name) + + var actualArgs = if (fun.tpe.isImplicitMethod) { + // generic array method, need to get implicit argument out of the way + fun.asInstanceOf[Apply].args + } else args + + val flatArgs = actualArgs.flatMap { arg => + normalizeArgument(arg) match { + case t: tpd.SeqLiteral => t.elems + case e => List(e) + } + } + for(arg <- flatArgs) { + emitArgument(arrAnnotV, null, arg, bcodeStore)(innerClasesStore) + } + arrAnnotV.visitEnd() + /* + case sb @ ScalaSigBytes(bytes) => + // see http://www.scala-lang.org/sid/10 (Storage of pickled Scala signatures in class files) + // also JVMS Sec. 4.7.16.1 The element_value structure and JVMS Sec. 4.4.7 The CONSTANT_Utf8_info Structure. + if (sb.fitsInOneString) { + av.visit(name, BCodeAsmCommon.strEncode(sb)) + } else { + val arrAnnotV: asm.AnnotationVisitor = av.visitArray(name) + for(arg <- BCodeAsmCommon.arrEncode(sb)) { arrAnnotV.visit(name, arg) } + arrAnnotV.visitEnd() + } // for the lazy val in ScalaSigBytes to be GC'ed, the invoker of emitAnnotations() should hold the ScalaSigBytes in a method-local var that doesn't escape. 
+ */ + case t @ Apply(constr, args) if t.tpe.classSymbol.is(JavaAnnotation) => + val typ = t.tpe.classSymbol.denot.info + val assocs = assocsFromApply(t) + val desc = innerClasesStore.typeDescriptor(typ) // the class descriptor of the nested annotation class + val nestedVisitor = av.visitAnnotation(name, desc) + emitAssocs(nestedVisitor, assocs, bcodeStore)(innerClasesStore) + + case t => + report.error(em"Annotation argument is not a constant", t.sourcePos) + } + } + + private def normalizeArgument(arg: Tree): Tree = arg match { + case Trees.NamedArg(_, arg1) => normalizeArgument(arg1) + case Trees.Typed(arg1, _) => normalizeArgument(arg1) + case _ => arg + } + + private def isRuntimeVisible(annot: Annotation): Boolean = + if (toDenot(annot.tree.tpe.typeSymbol).hasAnnotation(AnnotationRetentionAttr)) + retentionPolicyOf(annot) == AnnotationRetentionRuntimeAttr + else { + // SI-8926: if the annotation class symbol doesn't have a @RetentionPolicy annotation, the + // annotation is emitted with visibility `RUNTIME` + // dotty bug: #389 + true + } + + private def retentionPolicyOf(annot: Annotation): Symbol = + annot.tree.tpe.typeSymbol.getAnnotation(AnnotationRetentionAttr). + flatMap(_.argument(0).map(_.tpe.termSymbol)).getOrElse(AnnotationRetentionClassAttr) + + private def assocsFromApply(tree: Tree): List[(Name, Tree)] = { + tree match { + case Block(_, expr) => assocsFromApply(expr) + case Apply(fun, args) => + fun.tpe.widen match { + case MethodType(names) => + (names zip args).filter { + case (_, t: tpd.Ident) if (t.tpe.normalizedPrefix eq NoPrefix) => false + case _ => true + } + } + } + } + } // end of trait BCAnnotGen + + trait BCJGenSigGen { + import int.given + + def getCurrentCUnit(): CompilationUnit + + /** + * Generates the generic signature for `sym` before erasure. + * + * @param sym The symbol for which to generate a signature. + * @param owner The owner of `sym`. + * @return The generic signature of `sym` before erasure, as specified in the Java Virtual + * Machine Specification, §4.3.4, or `null` if `sym` doesn't need a generic signature. + * @see https://docs.oracle.com/javase/specs/jvms/se7/html/jvms-4.html#jvms-4.3.4 + */ + def getGenericSignature(sym: Symbol, owner: Symbol): String = { + atPhase(erasurePhase) { + val memberTpe = + if (sym.is(Method)) sym.denot.info + else owner.denot.thisType.memberInfo(sym) + getGenericSignatureHelper(sym, owner, memberTpe).orNull + } + } + + } // end of trait BCJGenSigGen + + trait BCForwardersGen extends BCAnnotGen with BCJGenSigGen { + + /* Add a forwarder for method m. Used only from addForwarders(). + * + * must-single-thread + */ + private def addForwarder(jclass: asm.ClassVisitor, module: Symbol, m: Symbol, isSynthetic: Boolean): Unit = { + val moduleName = internalName(module) + val methodInfo = module.thisType.memberInfo(m) + val paramJavaTypes: List[BType] = methodInfo.firstParamTypes map toTypeKind + // val paramNames = 0 until paramJavaTypes.length map ("x_" + _) + + /* Forwarders must not be marked final, + * as the JVM will not allow redefinition of a final static method, + * and we don't know what classes might be subclassing the companion class. See SI-4827. + */ + // TODO: evaluate the other flags we might be dropping on the floor here. + val flags = GenBCodeOps.PublicStatic | ( + if (m.is(JavaVarargs)) asm.Opcodes.ACC_VARARGS else 0 + ) | ( + if (isSynthetic) asm.Opcodes.ACC_SYNTHETIC else 0 + ) + + // TODO needed? 
for(ann <- m.annotations) { ann.symbol.initialize } + val jgensig = getStaticForwarderGenericSignature(m, module) + val (throws, others) = m.annotations.partition(_.symbol eq defn.ThrowsAnnot) + val thrownExceptions: List[String] = getExceptions(throws) + + val jReturnType = toTypeKind(methodInfo.resultType) + val mdesc = MethodBType(paramJavaTypes, jReturnType).descriptor + val mirrorMethodName = m.javaSimpleName + val mirrorMethod: asm.MethodVisitor = jclass.visitMethod( + flags, + mirrorMethodName, + mdesc, + jgensig, + mkArrayS(thrownExceptions) + ) + + emitAnnotations(mirrorMethod, others) + val params: List[Symbol] = Nil // backend uses this to emit annotations on parameter lists of forwarders + // to static methods of companion class + // Old assumption: in Dotty this link does not exists: there is no way to get from method type + // to inner symbols of DefDef + // TODO: now we have paramSymss and could use it here. + emitParamAnnotations(mirrorMethod, params.map(_.annotations)) + + mirrorMethod.visitCode() + + mirrorMethod.visitFieldInsn(asm.Opcodes.GETSTATIC, moduleName, str.MODULE_INSTANCE_FIELD, symDescriptor(module)) + + var index = 0 + for(jparamType <- paramJavaTypes) { + mirrorMethod.visitVarInsn(jparamType.typedOpcode(asm.Opcodes.ILOAD), index) + assert(!jparamType.isInstanceOf[MethodBType], jparamType) + index += jparamType.size + } + + mirrorMethod.visitMethodInsn(asm.Opcodes.INVOKEVIRTUAL, moduleName, mirrorMethodName, asmMethodType(m).descriptor, false) + mirrorMethod.visitInsn(jReturnType.typedOpcode(asm.Opcodes.IRETURN)) + + mirrorMethod.visitMaxs(0, 0) // just to follow protocol, dummy arguments + mirrorMethod.visitEnd() + + } + + /* Add forwarders for all methods defined in `module` that don't conflict + * with methods in the companion class of `module`. A conflict arises when + * a method with the same name is defined both in a class and its companion object: + * method signature is not taken into account. + * + * must-single-thread + */ + def addForwarders(jclass: asm.ClassVisitor, jclassName: String, moduleClass: Symbol): Unit = { + assert(moduleClass.is(ModuleClass), moduleClass) + report.debuglog(s"Dumping mirror class for object: $moduleClass") + + val linkedClass = moduleClass.companionClass + lazy val conflictingNames: Set[Name] = { + (linkedClass.info.allMembers.collect { case d if d.name.isTermName => d.name }).toSet + } + report.debuglog(s"Potentially conflicting names for forwarders: $conflictingNames") + + for (m0 <- sortedMembersBasedOnFlags(moduleClass.info, required = Method, excluded = ExcludedForwarder)) { + val m = if (m0.is(Bridge)) m0.nextOverriddenSymbol else m0 + if (m == NoSymbol) + report.log(s"$m0 is a bridge method that overrides nothing, something went wrong in a previous phase.") + else if (m.isType || m.is(Deferred) || (m.owner eq defn.ObjectClass) || m.isConstructor || m.name.is(ExpandedName)) + report.debuglog(s"No forwarder for '$m' from $jclassName to '$moduleClass'") + else if (conflictingNames(m.name)) + report.log(s"No forwarder for $m due to conflict with ${linkedClass.info.member(m.name)}") + else if (m.accessBoundary(defn.RootClass) ne defn.RootClass) + report.log(s"No forwarder for non-public member $m") + else { + report.log(s"Adding static forwarder for '$m' from $jclassName to '$moduleClass'") + // It would be simpler to not generate forwarders for these methods, + // but that wouldn't be binary-compatible with Scala 3.0.0, so instead + // we generate ACC_SYNTHETIC forwarders so Java compilers ignore them. 
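+          // (Illustrative note, not part of the original change:) ACC_SYNTHETIC keeps the
+          // forwarder in the classfile for binary compatibility while making it invisible
+          // to Java source code, which cannot reference synthetic members.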
+ val isSynthetic = + m0.name.is(NameKinds.SyntheticSetterName) || + // Only hide bridges generated at Erasure, mixin forwarders are also + // marked as bridge but shouldn't be hidden since they don't have a + // non-bridge overload. + m0.is(Bridge) && m0.initial.validFor.firstPhaseId == erasurePhase.next.id + addForwarder(jclass, moduleClass, m, isSynthetic) + } + } + } + + /** The members of this type that have all of `required` flags but none of `excluded` flags set. + * The members are sorted by name and signature to guarantee a stable ordering. + */ + private def sortedMembersBasedOnFlags(tp: Type, required: Flag, excluded: FlagSet): List[Symbol] = { + // The output of `memberNames` is a Set, sort it to guarantee a stable ordering. + val names = tp.memberNames(takeAllFilter).toSeq.sorted + val buffer = mutable.ListBuffer[Symbol]() + names.foreach { name => + buffer ++= tp.memberBasedOnFlags(name, required, excluded) + .alternatives.sortBy(_.signature)(Signature.lexicographicOrdering).map(_.symbol) + } + buffer.toList + } + + /* + * Quoting from JVMS 4.7.5 The Exceptions Attribute + * "The Exceptions attribute indicates which checked exceptions a method may throw. + * There may be at most one Exceptions attribute in each method_info structure." + * + * The contents of that attribute are determined by the `String[] exceptions` argument to ASM's ClassVisitor.visitMethod() + * This method returns such list of internal names. + * + * must-single-thread + */ + def getExceptions(excs: List[Annotation]): List[String] = { + for (case ThrownException(exc) <- excs.distinct) + yield internalName(TypeErasure.erasure(exc).classSymbol) + } + } // end of trait BCForwardersGen + + trait BCClassGen extends BCInnerClassGen { + + // Used as threshold above which a tableswitch bytecode instruction is preferred over a lookupswitch. + // There's a space tradeoff between these multi-branch instructions (details in the JVM spec). + // The particular value in use for `MIN_SWITCH_DENSITY` reflects a heuristic. + val MIN_SWITCH_DENSITY = 0.7 + + /* + * Add public static final field serialVersionUID with value `id` + * + * can-multi-thread + */ + def addSerialVUID(id: Long, jclass: asm.ClassVisitor): Unit = { + // add static serialVersionUID field if `clasz` annotated with `@SerialVersionUID(uid: Long)` + jclass.visitField( + GenBCodeOps.PrivateStaticFinal, + "serialVersionUID", + "J", + null, // no java-generic-signature + java.lang.Long.valueOf(id) + ).visitEnd() + } + } // end of trait BCClassGen + + /* functionality for building plain and mirror classes */ + abstract class JCommonBuilder + extends BCInnerClassGen + with BCAnnotGen + with BCForwardersGen + with BCPickles { } + + /* builder of mirror classes */ + class JMirrorBuilder extends JCommonBuilder { + + private var cunit: CompilationUnit = _ + def getCurrentCUnit(): CompilationUnit = cunit; + + /* Generate a mirror class for a top-level module. A mirror class is a class + * containing only static methods that forward to the corresponding method + * on the MODULE instance of the given Scala object. It will only be + * generated if there is no companion class: if there is, an attempt will + * instead be made to add the forwarder methods to the companion class. 
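+     * For example (illustrative note, not part of the original change): a top-level
+     * `object Foo` with no companion class gets a mirror class `Foo` whose static
+     * methods delegate to `Foo$.MODULE$`.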
+ * + * must-single-thread + */ + def genMirrorClass(moduleClass: Symbol, cunit: CompilationUnit): asm.tree.ClassNode = { + assert(moduleClass.is(ModuleClass)) + assert(moduleClass.companionClass == NoSymbol, moduleClass) + this.cunit = cunit + val bType = mirrorClassBTypeFromSymbol(moduleClass) + val moduleName = internalName(moduleClass) // + "$" + val mirrorName = bType.internalName + + val mirrorClass = new asm.tree.ClassNode + mirrorClass.visit( + classfileVersion, + bType.info.flags, + mirrorName, + null /* no java-generic-signature */, + ObjectRef.internalName, + EMPTY_STRING_ARRAY + ) + + if (emitSource) { + mirrorClass.visitSource("" + cunit.source.file.name, + null /* SourceDebugExtension */) + } + + val ssa = None // getAnnotPickle(mirrorName, if (moduleClass.is(Module)) moduleClass.companionClass else moduleClass.companionModule) + mirrorClass.visitAttribute(if (ssa.isDefined) pickleMarkerLocal else pickleMarkerForeign) + emitAnnotations(mirrorClass, moduleClass.annotations ++ ssa) + + addForwarders(mirrorClass, mirrorName, moduleClass) + mirrorClass.visitEnd() + + moduleClass.name // this side-effect is necessary, really. + + mirrorClass + } + + } // end of class JMirrorBuilder + + trait JAndroidBuilder { + self: BCInnerClassGen => + + /* From the reference documentation of the Android SDK: + * The `Parcelable` interface identifies classes whose instances can be written to and restored from a `Parcel`. + * Classes implementing the `Parcelable` interface must also have a static field called `CREATOR`, + * which is an object implementing the `Parcelable.Creator` interface. + */ + val androidFieldName = "CREATOR".toTermName + + lazy val AndroidParcelableInterface : Symbol = NoSymbol // getClassIfDefined("android.os.Parcelable") + lazy val AndroidCreatorClass : Symbol = NoSymbol // getClassIfDefined("android.os.Parcelable$Creator") + + /* + * must-single-thread + */ + def isAndroidParcelableClass(sym: Symbol) = + (AndroidParcelableInterface != NoSymbol) && + (sym.info.parents.map(_.typeSymbol) contains AndroidParcelableInterface) + + /* + * must-single-thread + */ + def legacyAddCreatorCode(clinit: asm.MethodVisitor, cnode: asm.tree.ClassNode, thisName: String): Unit = { + val androidCreatorType = getClassBType(AndroidCreatorClass) + val tdesc_creator = androidCreatorType.descriptor + + cnode.visitField( + GenBCodeOps.PublicStaticFinal, + "CREATOR", + tdesc_creator, + null, // no java-generic-signature + null // no initial value + ).visitEnd() + + val moduleName = (thisName + "$") + + // GETSTATIC `moduleName`.MODULE$ : `moduleName`; + clinit.visitFieldInsn( + asm.Opcodes.GETSTATIC, + moduleName, + str.MODULE_INSTANCE_FIELD, + "L" + moduleName + ";" + ) + + // INVOKEVIRTUAL `moduleName`.CREATOR() : android.os.Parcelable$Creator; + val bt = MethodBType(Nil, androidCreatorType) + clinit.visitMethodInsn( + asm.Opcodes.INVOKEVIRTUAL, + moduleName, + "CREATOR", + bt.descriptor, + false + ) + + // PUTSTATIC `thisName`.CREATOR; + clinit.visitFieldInsn( + asm.Opcodes.PUTSTATIC, + thisName, + "CREATOR", + tdesc_creator + ) + } + + } // end of trait JAndroidBuilder + + /** + * This method returns the BType for a type reference, for example a parameter type. + * + * If the result is a ClassBType for a nested class, it is added to the innerClassBufferASM. + * + * If `t` references a class, toTypeKind ensures that the class is not an implementation class. + * See also comment on getClassBTypeAndRegisterInnerClass, which is invoked for implementation + * classes. 
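+   * For example (illustrative note): scala.Int maps to the primitive BType INT,
+   * while java.lang.String maps to a ClassBType with internal name "java/lang/String".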
+ */ + private def typeToTypeKind(tp: Type)(ct: BCodeHelpers)(storage: ct.BCInnerClassGen): ct.bTypes.BType = { + import ct.bTypes._ + val defn = ctx.definitions + import coreBTypes._ + import Types._ + /** + * Primitive types are represented as TypeRefs to the class symbol of, for example, scala.Int. + * The `primitiveTypeMap` maps those class symbols to the corresponding PrimitiveBType. + */ + def primitiveOrClassToBType(sym: Symbol): BType = { + assert(sym.isClass, sym) + assert(sym != defn.ArrayClass || compilingArray, sym) + primitiveTypeMap.getOrElse(sym, storage.getClassBType(sym)).asInstanceOf[BType] + } + + /** + * When compiling Array.scala, the type parameter T is not erased and shows up in method + * signatures, e.g. `def apply(i: Int): T`. A TyperRef to T is replaced by ObjectReference. + */ + def nonClassTypeRefToBType(sym: Symbol): ClassBType = { + assert(sym.isType && compilingArray, sym) + ObjectRef.asInstanceOf[ct.bTypes.ClassBType] + } + + tp.widenDealias match { + case JavaArrayType(el) =>ArrayBType(typeToTypeKind(el)(ct)(storage)) // Array type such as Array[Int] (kept by erasure) + case t: TypeRef => + t.info match { + + case _ => + if (!t.symbol.isClass) nonClassTypeRefToBType(t.symbol) // See comment on nonClassTypeRefToBType + else primitiveOrClassToBType(t.symbol) // Common reference to a type such as scala.Int or java.lang.String + } + case Types.ClassInfo(_, sym, _, _, _) => primitiveOrClassToBType(sym) // We get here, for example, for genLoadModule, which invokes toTypeKind(moduleClassSymbol.info) + + /* AnnotatedType should (probably) be eliminated by erasure. However we know it happens for + * meta-annotated annotations (@(ann @getter) val x = 0), so we don't emit a warning. + * The type in the AnnotationInfo is an AnnotatedTpe. Tested in jvm/annotations.scala. + */ + case a @ AnnotatedType(t, _) => + report.debuglog(s"typeKind of annotated type $a") + typeToTypeKind(t)(ct)(storage) + + /* The cases below should probably never occur. They are kept for now to avoid introducing + * new compiler crashes, but we added a warning. The compiler / library bootstrap and the + * test suite don't produce any warning. + */ + + case tp => + report.warning( + s"an unexpected type representation reached the compiler backend while compiling ${ctx.compilationUnit}: $tp. " + + "If possible, please file a bug on https://github.com/lampepfl/dotty/issues") + + tp match { + case tp: ThisType if tp.cls == defn.ArrayClass => ObjectRef.asInstanceOf[ct.bTypes.ClassBType] // was introduced in 9b17332f11 to fix SI-999, but this code is not reached in its test, or any other test + case tp: ThisType => storage.getClassBType(tp.cls) + // case t: SingletonType => primitiveOrClassToBType(t.classSymbol) + case t: SingletonType => typeToTypeKind(t.underlying)(ct)(storage) + case t: RefinedType => typeToTypeKind(t.parent)(ct)(storage) //parents.map(_.toTypeKind(ct)(storage).asClassBType).reduceLeft((a, b) => a.jvmWiseLUB(b)) + } + } + } + + private def getGenericSignatureHelper(sym: Symbol, owner: Symbol, memberTpe: Type)(using Context): Option[String] = { + if (needsGenericSignature(sym)) { + val erasedTypeSym = TypeErasure.fullErasure(sym.denot.info).typeSymbol + if (erasedTypeSym.isPrimitiveValueClass) { + // Suppress signatures for symbols whose types erase in the end to primitive + // value types. This is needed to fix #7416. 
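+        // (Illustrative note, not part of the original change:) e.g. a member whose declared
+        // type is a value class over Int fully erases to the primitive int, so no Signature
+        // attribute is emitted for it.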
+ None + } else { + val jsOpt = GenericSignatures.javaSig(sym, memberTpe) + if (ctx.settings.XverifySignatures.value) { + jsOpt.foreach(verifySignature(sym, _)) + } + + jsOpt + } + } else { + None + } + } + + private def verifySignature(sym: Symbol, sig: String)(using Context): Unit = { + import scala.tools.asm.util.CheckClassAdapter + def wrap(body: => Unit): Unit = { + try body + catch { + case ex: Throwable => + report.error( + em"""|compiler bug: created invalid generic signature for $sym in ${sym.denot.owner.showFullName} + |signature: $sig + |if this is reproducible, please report bug at https://github.com/lampepfl/dotty/issues + """, sym.sourcePos) + throw ex + } + } + + wrap { + if (sym.is(Method)) { + CheckClassAdapter.checkMethodSignature(sig) + } + else if (sym.isTerm) { + CheckClassAdapter.checkFieldSignature(sig) + } + else { + CheckClassAdapter.checkClassSignature(sig) + } + } + } + + // @M don't generate java generics sigs for (members of) implementation + // classes, as they are monomorphic (TODO: ok?) + private final def needsGenericSignature(sym: Symbol): Boolean = !( + // pp: this condition used to include sym.hasexpandedname, but this leads + // to the total loss of generic information if a private member is + // accessed from a closure: both the field and the accessor were generated + // without it. This is particularly bad because the availability of + // generic information could disappear as a consequence of a seemingly + // unrelated change. + ctx.base.settings.YnoGenericSig.value + || sym.is(Artifact) + || sym.isAllOf(LiftedMethod) + || sym.is(Bridge) + ) + + private def getStaticForwarderGenericSignature(sym: Symbol, moduleClass: Symbol): String = { + // scala/bug#3452 Static forwarder generation uses the same erased signature as the method if forwards to. + // By rights, it should use the signature as-seen-from the module class, and add suitable + // primitive and value-class boxing/unboxing. + // But for now, just like we did in mixin, we just avoid writing a wrong generic signature + // (one that doesn't erase to the actual signature). See run/t3452b for a test case. + + val memberTpe = atPhase(erasurePhase) { moduleClass.denot.thisType.memberInfo(sym) } + val erasedMemberType = ElimErasedValueType.elimEVT(TypeErasure.transformInfo(sym, memberTpe)) + if (erasedMemberType =:= sym.denot.info) + getGenericSignatureHelper(sym, moduleClass, memberTpe).orNull + else null + } + + def abort(msg: String): Nothing = { + report.error(msg) + throw new RuntimeException(msg) + } + + private def compilingArray(using Context) = + ctx.compilationUnit.source.file.name == "Array.scala" +} + +object BCodeHelpers { + + class InvokeStyle(val style: Int) extends AnyVal { + import InvokeStyle._ + def isVirtual: Boolean = this == Virtual + def isStatic : Boolean = this == Static + def isSpecial: Boolean = this == Special + def isSuper : Boolean = this == Super + + def hasInstance = this != Static + } + + object InvokeStyle { + val Virtual = new InvokeStyle(0) // InvokeVirtual or InvokeInterface + val Static = new InvokeStyle(1) // InvokeStatic + val Special = new InvokeStyle(2) // InvokeSpecial (private methods, constructors) + val Super = new InvokeStyle(3) // InvokeSpecial (super calls) + } + + /** An attachment on Apply nodes indicating that it should be compiled with + * `invokespecial` instead of `invokevirtual`. This is used for static + * forwarders. + * See BCodeSkelBuilder.makeStaticForwarder for more details. 
+ */ + val UseInvokeSpecial = new dotc.util.Property.Key[Unit] + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeIdiomatic.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeIdiomatic.scala new file mode 100644 index 000000000000..77e58440aa97 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeIdiomatic.scala @@ -0,0 +1,725 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.tools.asm +import scala.annotation.switch +import Primitives.{NE, EQ, TestOp, ArithmeticOp} +import scala.tools.asm.tree.MethodInsnNode +import dotty.tools.dotc.report + +/* + * A high-level facade to the ASM API for bytecode generation. + * + * @author Miguel Garcia, http://lamp.epfl.ch/~magarcia/ScalaCompilerCornerReloaded + * @version 1.0 + * + */ +trait BCodeIdiomatic extends caps.Pure { + val int: DottyBackendInterface + final lazy val bTypes = new BTypesFromSymbols[int.type](int) + + import int.{_, given} + import bTypes._ + import coreBTypes._ + + + + lazy val target = + val releaseValue = Option(ctx.settings.javaOutputVersion.value).filter(_.nonEmpty) + val targetValue = Option(ctx.settings.XuncheckedJavaOutputVersion.value).filter(_.nonEmpty) + val defaultTarget = "8" + (releaseValue, targetValue) match + case (Some(release), None) => release + case (None, Some(target)) => target + case (Some(release), Some(_)) => + report.warning(s"The value of ${ctx.settings.XuncheckedJavaOutputVersion.name} was overridden by ${ctx.settings.javaOutputVersion.name}") + release + case (None, None) => "8" // least supported version by default + + + // Keep synchronized with `minTargetVersion` and `maxTargetVersion` in ScalaSettings + lazy val classfileVersion: Int = target match { + case "8" => asm.Opcodes.V1_8 + case "9" => asm.Opcodes.V9 + case "10" => asm.Opcodes.V10 + case "11" => asm.Opcodes.V11 + case "12" => asm.Opcodes.V12 + case "13" => asm.Opcodes.V13 + case "14" => asm.Opcodes.V14 + case "15" => asm.Opcodes.V15/* + case "16" => asm.Opcodes.V16 + case "17" => asm.Opcodes.V17 + case "18" => asm.Opcodes.V18 + case "19" => asm.Opcodes.V19 + case "20" => asm.Opcodes.V20*/ + } + + lazy val majorVersion: Int = (classfileVersion & 0xFF) + lazy val emitStackMapFrame = (majorVersion >= 50) + + val extraProc: Int = + import GenBCodeOps.addFlagIf + asm.ClassWriter.COMPUTE_MAXS + .addFlagIf(emitStackMapFrame, asm.ClassWriter.COMPUTE_FRAMES) + + lazy val JavaStringBuilderClassName = jlStringBuilderRef.internalName + + val CLASS_CONSTRUCTOR_NAME = "" + val INSTANCE_CONSTRUCTOR_NAME = "" + + val EMPTY_STRING_ARRAY = Array.empty[String] + val EMPTY_INT_ARRAY = Array.empty[Int] + val EMPTY_LABEL_ARRAY = Array.empty[asm.Label] + val EMPTY_BTYPE_ARRAY = Array.empty[BType] + + /* can-multi-thread */ + final def mkArrayB(xs: List[BType]): Array[BType] = { + if (xs.isEmpty) { return EMPTY_BTYPE_ARRAY } + val a = new Array[BType](xs.size); xs.copyToArray(a); a + } + /* can-multi-thread */ + final def mkArrayS(xs: List[String]): Array[String] = { + if (xs.isEmpty) { return EMPTY_STRING_ARRAY } + val a = new Array[String](xs.size); xs.copyToArray(a); a + } + /* can-multi-thread */ + final def mkArrayL(xs: List[asm.Label]): Array[asm.Label] = { + if (xs.isEmpty) { return EMPTY_LABEL_ARRAY } + val a = new Array[asm.Label](xs.size); xs.copyToArray(a); a + } + + /* + * can-multi-thread + */ + final def mkArrayReverse(xs: List[String]): Array[String] = { + val len = xs.size + if (len == 0) { return EMPTY_STRING_ARRAY } + val a = new Array[String](len) + var i = len - 
1 + var rest = xs + while (!rest.isEmpty) { + a(i) = rest.head + rest = rest.tail + i -= 1 + } + a + } + + /* + * can-multi-thread + */ + final def mkArrayReverse(xs: List[Int]): Array[Int] = { + val len = xs.size + if (len == 0) { return EMPTY_INT_ARRAY } + val a = new Array[Int](len) + var i = len - 1 + var rest = xs + while (!rest.isEmpty) { + a(i) = rest.head + rest = rest.tail + i -= 1 + } + a + } + + /* Just a namespace for utilities that encapsulate MethodVisitor idioms. + * In the ASM world, org.objectweb.asm.commons.InstructionAdapter plays a similar role, + * but the methods here allow choosing when to transition from ICode to ASM types + * (including not at all, e.g. for performance). + */ + abstract class JCodeMethodN { + + def jmethod: asm.tree.MethodNode + + import asm.Opcodes; + + final def emit(opc: Int): Unit = { jmethod.visitInsn(opc) } + + /* + * can-multi-thread + */ + final def genPrimitiveArithmetic(op: ArithmeticOp, kind: BType): Unit = { + + import Primitives.{ ADD, SUB, MUL, DIV, REM, NOT } + + op match { + + case ADD => add(kind) + case SUB => sub(kind) + case MUL => mul(kind) + case DIV => div(kind) + case REM => rem(kind) + + case NOT => + if (kind.isIntSizedType) { + emit(Opcodes.ICONST_M1) + emit(Opcodes.IXOR) + } else if (kind == LONG) { + jmethod.visitLdcInsn(java.lang.Long.valueOf(-1)) + jmethod.visitInsn(Opcodes.LXOR) + } else { + abort(s"Impossible to negate an $kind") + } + + case _ => + abort(s"Unknown arithmetic primitive $op") + } + + } // end of method genPrimitiveArithmetic() + + /* + * can-multi-thread + */ + final def genPrimitiveLogical(op: /* LogicalOp */ Int, kind: BType): Unit = { + + import ScalaPrimitivesOps.{ AND, OR, XOR } + + ((op, kind): @unchecked) match { + case (AND, LONG) => emit(Opcodes.LAND) + case (AND, INT) => emit(Opcodes.IAND) + case (AND, _) => + emit(Opcodes.IAND) + if (kind != BOOL) { emitT2T(INT, kind) } + + case (OR, LONG) => emit(Opcodes.LOR) + case (OR, INT) => emit(Opcodes.IOR) + case (OR, _) => + emit(Opcodes.IOR) + if (kind != BOOL) { emitT2T(INT, kind) } + + case (XOR, LONG) => emit(Opcodes.LXOR) + case (XOR, INT) => emit(Opcodes.IXOR) + case (XOR, _) => + emit(Opcodes.IXOR) + if (kind != BOOL) { emitT2T(INT, kind) } + } + + } // end of method genPrimitiveLogical() + + /* + * can-multi-thread + */ + final def genPrimitiveShift(op: /* ShiftOp */ Int, kind: BType): Unit = { + + import ScalaPrimitivesOps.{ LSL, ASR, LSR } + + ((op, kind): @unchecked) match { + case (LSL, LONG) => emit(Opcodes.LSHL) + case (LSL, INT) => emit(Opcodes.ISHL) + case (LSL, _) => + emit(Opcodes.ISHL) + emitT2T(INT, kind) + + case (ASR, LONG) => emit(Opcodes.LSHR) + case (ASR, INT) => emit(Opcodes.ISHR) + case (ASR, _) => + emit(Opcodes.ISHR) + emitT2T(INT, kind) + + case (LSR, LONG) => emit(Opcodes.LUSHR) + case (LSR, INT) => emit(Opcodes.IUSHR) + case (LSR, _) => + emit(Opcodes.IUSHR) + emitT2T(INT, kind) + } + + } // end of method genPrimitiveShift() + + /* Creates a new `StringBuilder` instance with the requested capacity + * + * can-multi-thread + */ + final def genNewStringBuilder(size: Int): Unit = { + jmethod.visitTypeInsn(Opcodes.NEW, JavaStringBuilderClassName) + jmethod.visitInsn(Opcodes.DUP) + jmethod.visitLdcInsn(Integer.valueOf(size)) + invokespecial( + JavaStringBuilderClassName, + INSTANCE_CONSTRUCTOR_NAME, + "(I)V", + itf = false + ) + } + + /* Issue a call to `StringBuilder#append` for the right element type + * + * can-multi-thread + */ + final def genStringBuilderAppend(elemType: BType): Unit = { + val paramType = elemType 
match { + case ct: ClassBType if ct.isSubtypeOf(StringRef) => StringRef + case ct: ClassBType if ct.isSubtypeOf(jlStringBufferRef) => jlStringBufferRef + case ct: ClassBType if ct.isSubtypeOf(jlCharSequenceRef) => jlCharSequenceRef + // Don't match for `ArrayBType(CHAR)`, even though StringBuilder has such an overload: + // `"a" + Array('b')` should NOT be "ab", but "a[C@...". + case _: RefBType => ObjectRef + // jlStringBuilder does not have overloads for byte and short, but we can just use the int version + case BYTE | SHORT => INT + case pt: PrimitiveBType => pt + } + val bt = MethodBType(List(paramType), jlStringBuilderRef) + invokevirtual(JavaStringBuilderClassName, "append", bt.descriptor) + } + + /* Extract the built `String` from the `StringBuilder` + * + * can-multi-thread + */ + final def genStringBuilderEnd: Unit = { + invokevirtual(JavaStringBuilderClassName, "toString", genStringBuilderEndDesc) + } + // Use ClassBType refs instead of plain string literal to make sure that needed ClassBTypes are initialized and reachable + private lazy val genStringBuilderEndDesc = MethodBType(Nil, StringRef).descriptor + + /* Concatenate top N arguments on the stack with `StringConcatFactory#makeConcatWithConstants` + * (only works for JDK 9+) + * + * can-multi-thread + */ + final def genIndyStringConcat( + recipe: String, + argTypes: Seq[asm.Type], + constants: Seq[String] + ): Unit = { + jmethod.visitInvokeDynamicInsn( + "makeConcatWithConstants", + asm.Type.getMethodDescriptor(StringRef.toASMType, argTypes:_*), + coreBTypes.jliStringConcatFactoryMakeConcatWithConstantsHandle, + (recipe +: constants):_* + ) + } + + /* + * Emits one or more conversion instructions based on the types given as arguments. + * + * @param from The type of the value to be converted into another type. + * @param to The type the value will be converted into. 
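+     * For example (illustrative note, not part of the original change):
+     * INT -> LONG emits a single I2L, while FLOAT -> SHORT emits F2I followed by I2S.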
+ * + * can-multi-thread + */ + final def emitT2T(from: BType, to: BType): Unit = { + + assert( + from.isNonVoidPrimitiveType && to.isNonVoidPrimitiveType, + s"Cannot emit primitive conversion from $from to $to" + ) + + def pickOne(opcs: Array[Int]): Unit = { // TODO index on to.sort + val chosen = (to: @unchecked) match { + case BYTE => opcs(0) + case SHORT => opcs(1) + case CHAR => opcs(2) + case INT => opcs(3) + case LONG => opcs(4) + case FLOAT => opcs(5) + case DOUBLE => opcs(6) + } + if (chosen != -1) { emit(chosen) } + } + + if (from == to) { return } + // the only conversion involving BOOL that is allowed is (BOOL -> BOOL) + assert(from != BOOL && to != BOOL, s"inconvertible types : $from -> $to") + + // We're done with BOOL already + from match { + + // using `asm.Type.SHORT` instead of `BType.SHORT` because otherwise "warning: could not emit switch for @switch annotated match" + + case BYTE => pickOne(JCodeMethodN.fromByteT2T) + case SHORT => pickOne(JCodeMethodN.fromShortT2T) + case CHAR => pickOne(JCodeMethodN.fromCharT2T) + case INT => pickOne(JCodeMethodN.fromIntT2T) + + case FLOAT => + import asm.Opcodes.{ F2L, F2D, F2I } + to match { + case LONG => emit(F2L) + case DOUBLE => emit(F2D) + case _ => emit(F2I); emitT2T(INT, to) + } + + case LONG => + import asm.Opcodes.{ L2F, L2D, L2I } + to match { + case FLOAT => emit(L2F) + case DOUBLE => emit(L2D) + case _ => emit(L2I); emitT2T(INT, to) + } + + case DOUBLE => + import asm.Opcodes.{ D2L, D2F, D2I } + to match { + case FLOAT => emit(D2F) + case LONG => emit(D2L) + case _ => emit(D2I); emitT2T(INT, to) + } + } + } // end of emitT2T() + + // can-multi-thread + final def boolconst(b: Boolean): Unit = { iconst(if (b) 1 else 0) } + + // can-multi-thread + final def iconst(cst: Int): Unit = { + if (cst >= -1 && cst <= 5) { + emit(Opcodes.ICONST_0 + cst) + } else if (cst >= java.lang.Byte.MIN_VALUE && cst <= java.lang.Byte.MAX_VALUE) { + jmethod.visitIntInsn(Opcodes.BIPUSH, cst) + } else if (cst >= java.lang.Short.MIN_VALUE && cst <= java.lang.Short.MAX_VALUE) { + jmethod.visitIntInsn(Opcodes.SIPUSH, cst) + } else { + jmethod.visitLdcInsn(Integer.valueOf(cst)) + } + } + + // can-multi-thread + final def lconst(cst: Long): Unit = { + if (cst == 0L || cst == 1L) { + emit(Opcodes.LCONST_0 + cst.asInstanceOf[Int]) + } else { + jmethod.visitLdcInsn(java.lang.Long.valueOf(cst)) + } + } + + // can-multi-thread + final def fconst(cst: Float): Unit = { + val bits: Int = java.lang.Float.floatToIntBits(cst) + if (bits == 0L || bits == 0x3f800000 || bits == 0x40000000) { // 0..2 + emit(Opcodes.FCONST_0 + cst.asInstanceOf[Int]) + } else { + jmethod.visitLdcInsn(java.lang.Float.valueOf(cst)) + } + } + + // can-multi-thread + final def dconst(cst: Double): Unit = { + val bits: Long = java.lang.Double.doubleToLongBits(cst) + if (bits == 0L || bits == 0x3ff0000000000000L) { // +0.0d and 1.0d + emit(Opcodes.DCONST_0 + cst.asInstanceOf[Int]) + } else { + jmethod.visitLdcInsn(java.lang.Double.valueOf(cst)) + } + } + + // can-multi-thread + final def newarray(elem: BType): Unit = { + elem match { + case c: RefBType => + /* phantom type at play in `Array(null)`, SI-1513. On the other hand, Array(()) has element type `scala.runtime.BoxedUnit` which isObject. 
*/ + jmethod.visitTypeInsn(Opcodes.ANEWARRAY, c.classOrArrayType) + case _ => + assert(elem.isNonVoidPrimitiveType) + val rand = { + // using `asm.Type.SHORT` instead of `BType.SHORT` because otherwise "warning: could not emit switch for @switch annotated match" + elem match { + case BOOL => Opcodes.T_BOOLEAN + case BYTE => Opcodes.T_BYTE + case SHORT => Opcodes.T_SHORT + case CHAR => Opcodes.T_CHAR + case INT => Opcodes.T_INT + case LONG => Opcodes.T_LONG + case FLOAT => Opcodes.T_FLOAT + case DOUBLE => Opcodes.T_DOUBLE + } + } + jmethod.visitIntInsn(Opcodes.NEWARRAY, rand) + } + } + + + final def load( idx: Int, tk: BType): Unit = { emitVarInsn(Opcodes.ILOAD, idx, tk) } // can-multi-thread + final def store(idx: Int, tk: BType): Unit = { emitVarInsn(Opcodes.ISTORE, idx, tk) } // can-multi-thread + final def iinc( idx: Int, increment: Int): Unit = jmethod.visitIincInsn(idx, increment) // can-multi-thread + + final def aload( tk: BType): Unit = { emitTypeBased(JCodeMethodN.aloadOpcodes, tk) } // can-multi-thread + final def astore(tk: BType): Unit = { emitTypeBased(JCodeMethodN.astoreOpcodes, tk) } // can-multi-thread + + final def neg(tk: BType): Unit = { emitPrimitive(JCodeMethodN.negOpcodes, tk) } // can-multi-thread + final def add(tk: BType): Unit = { emitPrimitive(JCodeMethodN.addOpcodes, tk) } // can-multi-thread + final def sub(tk: BType): Unit = { emitPrimitive(JCodeMethodN.subOpcodes, tk) } // can-multi-thread + final def mul(tk: BType): Unit = { emitPrimitive(JCodeMethodN.mulOpcodes, tk) } // can-multi-thread + final def div(tk: BType): Unit = { emitPrimitive(JCodeMethodN.divOpcodes, tk) } // can-multi-thread + final def rem(tk: BType): Unit = { emitPrimitive(JCodeMethodN.remOpcodes, tk) } // can-multi-thread + + // can-multi-thread + final def invokespecial(owner: String, name: String, desc: String, itf: Boolean): Unit = { + emitInvoke(Opcodes.INVOKESPECIAL, owner, name, desc, itf) + } + // can-multi-thread + final def invokestatic(owner: String, name: String, desc: String, itf: Boolean): Unit = { + emitInvoke(Opcodes.INVOKESTATIC, owner, name, desc, itf) + } + // can-multi-thread + final def invokeinterface(owner: String, name: String, desc: String): Unit = { + emitInvoke(Opcodes.INVOKEINTERFACE, owner, name, desc, itf = true) + } + // can-multi-thread + final def invokevirtual(owner: String, name: String, desc: String): Unit = { + emitInvoke(Opcodes.INVOKEVIRTUAL, owner, name, desc, itf = false) + } + + def emitInvoke(opcode: Int, owner: String, name: String, desc: String, itf: Boolean): Unit = { + val node = new MethodInsnNode(opcode, owner, name, desc, itf) + jmethod.instructions.add(node) + } + + + // can-multi-thread + final def goTo(label: asm.Label): Unit = { jmethod.visitJumpInsn(Opcodes.GOTO, label) } + // can-multi-thread + final def emitIF(cond: TestOp, label: asm.Label): Unit = { jmethod.visitJumpInsn(cond.opcodeIF(), label) } + // can-multi-thread + final def emitIF_ICMP(cond: TestOp, label: asm.Label): Unit = { jmethod.visitJumpInsn(cond.opcodeIFICMP(), label) } + // can-multi-thread + final def emitIF_ACMP(cond: TestOp, label: asm.Label): Unit = { + assert((cond == EQ) || (cond == NE), cond) + val opc = (if (cond == EQ) Opcodes.IF_ACMPEQ else Opcodes.IF_ACMPNE) + jmethod.visitJumpInsn(opc, label) + } + // can-multi-thread + final def emitIFNONNULL(label: asm.Label): Unit = { jmethod.visitJumpInsn(Opcodes.IFNONNULL, label) } + // can-multi-thread + final def emitIFNULL (label: asm.Label): Unit = { jmethod.visitJumpInsn(Opcodes.IFNULL, label) } + + // 
can-multi-thread + final def emitRETURN(tk: BType): Unit = { + if (tk == UNIT) { emit(Opcodes.RETURN) } + else { emitTypeBased(JCodeMethodN.returnOpcodes, tk) } + } + + /* Emits one of tableswitch or lookoupswitch. + * + * can-multi-thread + */ + final def emitSWITCH(keys: Array[Int], branches: Array[asm.Label], defaultBranch: asm.Label, minDensity: Double): Unit = { + assert(keys.length == branches.length) + + // For empty keys, it makes sense emitting LOOKUPSWITCH with defaultBranch only. + // Similar to what javac emits for a switch statement consisting only of a default case. + if (keys.length == 0) { + jmethod.visitLookupSwitchInsn(defaultBranch, keys, branches) + return + } + + // sort `keys` by increasing key, keeping `branches` in sync. TODO FIXME use quicksort + var i = 1 + while (i < keys.length) { + var j = 1 + while (j <= keys.length - i) { + if (keys(j) < keys(j - 1)) { + val tmp = keys(j) + keys(j) = keys(j - 1) + keys(j - 1) = tmp + val tmpL = branches(j) + branches(j) = branches(j - 1) + branches(j - 1) = tmpL + } + j += 1 + } + i += 1 + } + + // check for duplicate keys to avoid "VerifyError: unsorted lookupswitch" (SI-6011) + i = 1 + while (i < keys.length) { + if (keys(i-1) == keys(i)) { + abort("duplicate keys in SWITCH, can't pick arbitrarily one of them to evict, see SI-6011.") + } + i += 1 + } + + val keyMin = keys(0) + val keyMax = keys(keys.length - 1) + + val isDenseEnough: Boolean = { + /* Calculate in long to guard against overflow. TODO what overflow? */ + val keyRangeD: Double = (keyMax.asInstanceOf[Long] - keyMin + 1).asInstanceOf[Double] + val klenD: Double = keys.length + val kdensity: Double = (klenD / keyRangeD) + + kdensity >= minDensity + } + + if (isDenseEnough) { + // use a table in which holes are filled with defaultBranch. 
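+        // (Illustrative note, not part of the original change:) e.g. with keys {1, 3, 4},
+        // keyRange is 4 and newBranches becomes [branch(1), defaultBranch, branch(3),
+        // branch(4)], emitted as a TABLESWITCH covering keys 1 to 4.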
+ val keyRange = (keyMax - keyMin + 1) + val newBranches = new Array[asm.Label](keyRange) + var oldPos = 0 + var i = 0 + while (i < keyRange) { + val key = keyMin + i; + if (keys(oldPos) == key) { + newBranches(i) = branches(oldPos) + oldPos += 1 + } else { + newBranches(i) = defaultBranch + } + i += 1 + } + assert(oldPos == keys.length, "emitSWITCH") + jmethod.visitTableSwitchInsn(keyMin, keyMax, defaultBranch, newBranches: _*) + } else { + jmethod.visitLookupSwitchInsn(defaultBranch, keys, branches) + } + } + + // internal helpers -- not part of the public API of `jcode` + // don't make private otherwise inlining will suffer + + // can-multi-thread + final def emitVarInsn(opc: Int, idx: Int, tk: BType): Unit = { + assert((opc == Opcodes.ILOAD) || (opc == Opcodes.ISTORE), opc) + jmethod.visitVarInsn(tk.typedOpcode(opc), idx) + } + + // ---------------- array load and store ---------------- + + // can-multi-thread + final def emitTypeBased(opcs: Array[Int], tk: BType): Unit = { + assert(tk != UNIT, tk) + val opc = { + if (tk.isRef) { opcs(0) } + else if (tk.isIntSizedType) { + (tk: @unchecked) match { + case BOOL | BYTE => opcs(1) + case SHORT => opcs(2) + case CHAR => opcs(3) + case INT => opcs(4) + } + } else { + (tk: @unchecked) match { + case LONG => opcs(5) + case FLOAT => opcs(6) + case DOUBLE => opcs(7) + } + } + } + emit(opc) + } + + // ---------------- primitive operations ---------------- + + // can-multi-thread + final def emitPrimitive(opcs: Array[Int], tk: BType): Unit = { + val opc = { + // using `asm.Type.SHORT` instead of `BType.SHORT` because otherwise "warning: could not emit switch for @switch annotated match" + tk match { + case LONG => opcs(1) + case FLOAT => opcs(2) + case DOUBLE => opcs(3) + case _ => opcs(0) + } + } + emit(opc) + } + + // can-multi-thread + final def drop(tk: BType): Unit = { emit(if (tk.isWideType) Opcodes.POP2 else Opcodes.POP) } + + // can-multi-thread + final def dup(tk: BType): Unit = { emit(if (tk.isWideType) Opcodes.DUP2 else Opcodes.DUP) } + + // ---------------- type checks and casts ---------------- + + // can-multi-thread + final def isInstance(tk: RefBType): Unit = { + jmethod.visitTypeInsn(Opcodes.INSTANCEOF, tk.classOrArrayType) + } + + // can-multi-thread + final def checkCast(tk: RefBType): Unit = { + // TODO ICode also requires: but that's too much, right? assert(!isBoxedType(tk), "checkcast on boxed type: " + tk) + jmethod.visitTypeInsn(Opcodes.CHECKCAST, tk.classOrArrayType) + } + + def abort(msg: String): Nothing = { + report.error(msg) + throw new RuntimeException(msg) + } + + } // end of class JCodeMethodN + + /* Constant-valued val-members of JCodeMethodN at the companion object, so as to avoid re-initializing them multiple times. 
*/ + object JCodeMethodN { + + import asm.Opcodes._ + + // ---------------- conversions ---------------- + + val fromByteT2T = { Array( -1, -1, I2C, -1, I2L, I2F, I2D) } // do nothing for (BYTE -> SHORT) and for (BYTE -> INT) + val fromCharT2T = { Array(I2B, I2S, -1, -1, I2L, I2F, I2D) } // for (CHAR -> INT) do nothing + val fromShortT2T = { Array(I2B, -1, I2C, -1, I2L, I2F, I2D) } // for (SHORT -> INT) do nothing + val fromIntT2T = { Array(I2B, I2S, I2C, -1, I2L, I2F, I2D) } + + // ---------------- array load and store ---------------- + + val aloadOpcodes = { Array(AALOAD, BALOAD, SALOAD, CALOAD, IALOAD, LALOAD, FALOAD, DALOAD) } + val astoreOpcodes = { Array(AASTORE, BASTORE, SASTORE, CASTORE, IASTORE, LASTORE, FASTORE, DASTORE) } + val returnOpcodes = { Array(ARETURN, IRETURN, IRETURN, IRETURN, IRETURN, LRETURN, FRETURN, DRETURN) } + + // ---------------- primitive operations ---------------- + + val negOpcodes: Array[Int] = { Array(INEG, LNEG, FNEG, DNEG) } + val addOpcodes: Array[Int] = { Array(IADD, LADD, FADD, DADD) } + val subOpcodes: Array[Int] = { Array(ISUB, LSUB, FSUB, DSUB) } + val mulOpcodes: Array[Int] = { Array(IMUL, LMUL, FMUL, DMUL) } + val divOpcodes: Array[Int] = { Array(IDIV, LDIV, FDIV, DDIV) } + val remOpcodes: Array[Int] = { Array(IREM, LREM, FREM, DREM) } + + } // end of object JCodeMethodN + + // ---------------- adapted from scalaPrimitives ---------------- + + /* Given `code` reports the src TypeKind of the coercion indicated by `code`. + * To find the dst TypeKind, `ScalaPrimitivesOps.generatedKind(code)` can be used. + * + * can-multi-thread + */ + final def coercionFrom(code: Int): BType = { + import ScalaPrimitivesOps._ + (code: @switch) match { + case B2B | B2C | B2S | B2I | B2L | B2F | B2D => BYTE + case S2B | S2S | S2C | S2I | S2L | S2F | S2D => SHORT + case C2B | C2S | C2C | C2I | C2L | C2F | C2D => CHAR + case I2B | I2S | I2C | I2I | I2L | I2F | I2D => INT + case L2B | L2S | L2C | L2I | L2L | L2F | L2D => LONG + case F2B | F2S | F2C | F2I | F2L | F2F | F2D => FLOAT + case D2B | D2S | D2C | D2I | D2L | D2F | D2D => DOUBLE + } + } + + /* If code is a coercion primitive, the result type. 
+ * + * can-multi-thread + */ + final def coercionTo(code: Int): BType = { + import ScalaPrimitivesOps._ + (code: @switch) match { + case B2B | C2B | S2B | I2B | L2B | F2B | D2B => BYTE + case B2C | C2C | S2C | I2C | L2C | F2C | D2C => CHAR + case B2S | C2S | S2S | I2S | L2S | F2S | D2S => SHORT + case B2I | C2I | S2I | I2I | L2I | F2I | D2I => INT + case B2L | C2L | S2L | I2L | L2L | F2L | D2L => LONG + case B2F | C2F | S2F | I2F | L2F | F2F | D2F => FLOAT + case B2D | C2D | S2D | I2D | L2D | F2D | D2D => DOUBLE + } + } + + implicit class InsnIterMethodNode(mnode: asm.tree.MethodNode) { + @`inline` final def foreachInsn(f: (asm.tree.AbstractInsnNode) => Unit): Unit = { mnode.instructions.foreachInsn(f) } + } + + implicit class InsnIterInsnList(lst: asm.tree.InsnList) { + + @`inline` final def foreachInsn(f: (asm.tree.AbstractInsnNode) => Unit): Unit = { + val insnIter = lst.iterator() + while (insnIter.hasNext) { + f(insnIter.next()) + } + } + } +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala new file mode 100644 index 000000000000..1d8a9c579cb9 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeSkelBuilder.scala @@ -0,0 +1,908 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.annotation.tailrec + +import scala.collection.{ mutable, immutable } + +import scala.tools.asm +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.TreeTypeMap +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.core.Decorators._ +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.StdNames._ +import dotty.tools.dotc.core.NameKinds._ +import dotty.tools.dotc.core.Names.TermName +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.core.Types._ +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.util.Spans._ +import dotty.tools.dotc.report +import dotty.tools.dotc.transform.SymUtils._ + +/* + * + * @author Miguel Garcia, http://lamp.epfl.ch/~magarcia/ScalaCompilerCornerReloaded/ + * @version 1.0 + * + */ +trait BCodeSkelBuilder extends BCodeHelpers { + import int.{_, given} + import DottyBackendInterface.{symExtensions, _} + import tpd._ + import bTypes._ + import coreBTypes._ + import bCodeAsmCommon._ + + lazy val NativeAttr: Symbol = requiredClass[scala.native] + + /** The destination of a value generated by `genLoadTo`. */ + enum LoadDestination: + /** The value is put on the stack, and control flows through to the next opcode. */ + case FallThrough + /** The value is put on the stack, and control flow is transferred to the given `label`. */ + case Jump(label: asm.Label) + /** The value is RETURN'ed from the enclosing method. */ + case Return + /** The value is ATHROW'n. */ + case Throw + end LoadDestination + + /* + * There's a dedicated PlainClassBuilder for each CompilationUnit, + * which simplifies the initialization of per-class data structures in `genPlainClass()` which in turn delegates to `initJClass()` + * + * The entry-point to emitting bytecode instructions is `genDefDef()` where the per-method data structures are initialized, + * including `resetMethodBookkeeping()` and `initJMethod()`. + * Once that's been done, and assuming the method being visited isn't abstract, `emitNormalMethodBody()` populates + * the ASM MethodNode instance with ASM AbstractInsnNodes. 
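+   *
+   * Schematically, the flow through this builder is:
+   *   genPlainClass() -> initJClass() and addClassFields(), then gen() -> genDefDef() ->
+   *   initJMethod() -> emitNormalMethodBody() -> genLoadTo(),
+   * as can be traced in the methods below.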
+ * + * Given that CleanUp delivers trees that produce values on the stack, + * the entry-point to all-things instruction-emit is `genLoad()`. + * There, an operation taking N arguments results in recursively emitting instructions to lead each of them, + * followed by emitting instructions to process those arguments (to be found at run-time on the operand-stack). + * + * In a few cases the above recipe deserves more details, as provided in the documentation for: + * - `genLoadTry()` + * - `genSynchronized() + * - `jumpDest` , `cleanups` , `labelDefsAtOrUnder` + */ + abstract class PlainSkelBuilder(cunit: CompilationUnit) + extends BCClassGen + with BCAnnotGen + with BCInnerClassGen + with JAndroidBuilder + with BCForwardersGen + with BCPickles + with BCJGenSigGen { + + // Strangely I can't find this in the asm code 255, but reserving 1 for "this" + inline val MaximumJvmParameters = 254 + + // current class + var cnode: ClassNode1 = null + var thisName: String = null // the internal name of the class being emitted + + var claszSymbol: Symbol = null + var isCZParcelable = false + var isCZStaticModule = false + + /* ---------------- idiomatic way to ask questions to typer ---------------- */ + + def paramTKs(app: Apply, take: Int = -1): List[BType] = app match { + case Apply(fun, _) => + val funSym = fun.symbol + (funSym.info.firstParamTypes map toTypeKind) // this tracks mentioned inner classes (in innerClassBufferASM) + } + + def symInfoTK(sym: Symbol): BType = { + toTypeKind(sym.info) // this tracks mentioned inner classes (in innerClassBufferASM) + } + + def tpeTK(tree: Tree): BType = { toTypeKind(tree.tpe) } + + override def getCurrentCUnit(): CompilationUnit = { cunit } + + /* ---------------- helper utils for generating classes and fields ---------------- */ + + def genPlainClass(cd0: TypeDef) = cd0 match { + case TypeDef(_, impl: Template) => + assert(cnode == null, "GenBCode detected nested methods.") + + claszSymbol = cd0.symbol + isCZParcelable = isAndroidParcelableClass(claszSymbol) + isCZStaticModule = claszSymbol.isStaticModuleClass + thisName = internalName(claszSymbol) + + cnode = new ClassNode1() + + initJClass(cnode) + + val cd = if (isCZStaticModule) { + // Move statements from the primary constructor following the superclass constructor call to + // a newly synthesised tree representing the "", which also assigns the MODULE$ field. + // Because the assigments to both the module instance fields, and the fields of the module itself + // are in the , these fields can be static + final. + + // Should we do this transformation earlier, say in Constructors? Or would that just cause + // pain for scala-{js, native}? + // + // @sjrd (https://github.com/lampepfl/dotty/pull/9181#discussion_r457458205): + // moving that before the back-end would make things significantly more complicated for + // Scala.js and Native. Both have a first-class concept of ModuleClass, and encode the + // singleton pattern of MODULE$ in a completely different way. In the Scala.js IR, there + // even isn't anything that corresponds to MODULE$ per se. + // + // So if you move this before the back-end, then Scala.js and Scala Native will have to + // reverse all the effects of this transformation, which would be counter-productive. + + + // TODO: remove `!f.name.is(LazyBitMapName)` once we change lazy val encoding + // https://github.com/lampepfl/dotty/issues/7140 + // + // Lazy val encoding assumes bitmap fields are non-static + // + // See `tests/run/given-var.scala` + // + + // !!! 
Part of this logic is duplicated in JSCodeGen.genCompilationUnit + claszSymbol.info.decls.foreach { f => + if f.isField && !f.name.is(LazyBitMapName) then + f.setFlag(JavaStatic) + } + + val (clinits, body) = impl.body.partition(stat => stat.isInstanceOf[DefDef] && stat.symbol.isStaticConstructor) + + val (uptoSuperStats, remainingConstrStats) = splitAtSuper(impl.constr.rhs.asInstanceOf[Block].stats) + val clInitSymbol: TermSymbol = + if (clinits.nonEmpty) clinits.head.symbol.asTerm + else newSymbol( + claszSymbol, + nme.STATIC_CONSTRUCTOR, + JavaStatic | Method, + MethodType(Nil)(_ => Nil, _ => defn.UnitType), + privateWithin = NoSymbol, + coord = claszSymbol.coord + ) + + val moduleField = newSymbol( + claszSymbol, + str.MODULE_INSTANCE_FIELD.toTermName, + JavaStatic | Final, + claszSymbol.typeRef, + privateWithin = NoSymbol, + coord = claszSymbol.coord + ).entered + + val thisMap = new TreeMap { + override def transform(tree: Tree)(using Context) = { + val tp = tree.tpe.substThis(claszSymbol.asClass, claszSymbol.sourceModule.termRef) + tree.withType(tp) match { + case tree: This if tree.symbol == claszSymbol => + ref(claszSymbol.sourceModule) + case tree => + super.transform(tree) + } + } + } + + def rewire(stat: Tree) = thisMap.transform(stat).changeOwner(claszSymbol.primaryConstructor, clInitSymbol) + + val callConstructor = New(claszSymbol.typeRef).select(claszSymbol.primaryConstructor).appliedToTermArgs(Nil) + val assignModuleField = Assign(ref(moduleField), callConstructor) + val remainingConstrStatsSubst = remainingConstrStats.map(rewire) + val clinit = clinits match { + case (ddef: DefDef) :: _ => + cpy.DefDef(ddef)(rhs = Block(ddef.rhs :: assignModuleField :: remainingConstrStatsSubst, unitLiteral)) + case _ => + DefDef(clInitSymbol, Block(assignModuleField :: remainingConstrStatsSubst, unitLiteral)) + } + + val constr2 = { + val rhs = Block(uptoSuperStats, impl.constr.rhs.asInstanceOf[Block].expr) + cpy.DefDef(impl.constr)(rhs = rhs) + } + + val impl2 = cpy.Template(impl)(constr = constr2, body = clinit :: body) + cpy.TypeDef(cd0)(rhs = impl2) + } else cd0 + + val hasStaticCtor = isCZStaticModule || cd.symbol.info.decls.exists(_.isStaticConstructor) + if (!hasStaticCtor && isCZParcelable) fabricateStaticInitAndroid() + + val optSerial: Option[Long] = + claszSymbol.getAnnotation(defn.SerialVersionUIDAnnot).flatMap { annot => + if (claszSymbol.is(Trait)) { + report.warning("@SerialVersionUID does nothing on a trait", annot.tree.sourcePos) + None + } else { + val vuid = annot.argumentConstant(0).map(_.longValue) + if (vuid.isEmpty) + report.error("The argument passed to @SerialVersionUID must be a constant", + annot.argument(0).getOrElse(annot.tree).sourcePos) + vuid + } + } + if (optSerial.isDefined) { addSerialVUID(optSerial.get, cnode)} + + addClassFields() + gen(cd.rhs) + + if (AsmUtils.traceClassEnabled && cnode.name.contains(AsmUtils.traceClassPattern)) + AsmUtils.traceClass(cnode) + + cnode.innerClasses + assert(cd.symbol == claszSymbol, "Someone messed up BCodePhase.claszSymbol during genPlainClass().") + + } // end of method genPlainClass() + + /* + * must-single-thread + */ + private def initJClass(jclass: asm.ClassVisitor): Unit = { + + val ps = claszSymbol.info.parents + val superClass: String = if (ps.isEmpty) ObjectRef.internalName else internalName(ps.head.typeSymbol) + val interfaceNames0 = classBTypeFromSymbol(claszSymbol).info.interfaces.map(_.internalName) + /* To avoid deadlocks when combining objects, lambdas and multi-threading, + * lambdas in objects are 
compiled to instance methods of the module class + * instead of static methods (see tests/run/deadlock.scala and + * https://github.com/scala/scala-dev/issues/195 for details). + * This has worked well for us so far but this is problematic for + * serialization: serializing a lambda requires serializing all the values + * it captures, if this lambda is in an object, this means serializing the + * enclosing object, which fails if the object does not extend + * Serializable. + * Because serializing objects is basically free since #5775, it seems like + * the simplest solution is to simply make all objects Serializable, this + * certainly seems preferable to deadlocks. + * This cannot be done earlier because Scala.js would not like it (#9596). + */ + val interfaceNames = + if (claszSymbol.is(ModuleClass) && !interfaceNames0.contains("java/io/Serializable")) + interfaceNames0 :+ "java/io/Serializable" + else + interfaceNames0 + + val flags = javaFlags(claszSymbol) + + val thisSignature = getGenericSignature(claszSymbol, claszSymbol.owner) + cnode.visit(classfileVersion, flags, + thisName, thisSignature, + superClass, interfaceNames.toArray) + + if (emitSource) { + cnode.visitSource(cunit.source.file.name, null /* SourceDebugExtension */) + } + + enclosingMethodAttribute(claszSymbol, internalName, asmMethodType(_).descriptor) match { + case Some(EnclosingMethodEntry(className, methodName, methodDescriptor)) => + cnode.visitOuterClass(className, methodName, methodDescriptor) + case _ => () + } + + val ssa = None // TODO: inlined form `getAnnotPickle(thisName, claszSymbol)`. Should something be done on Dotty? + cnode.visitAttribute(if (ssa.isDefined) pickleMarkerLocal else pickleMarkerForeign) + emitAnnotations(cnode, claszSymbol.annotations ++ ssa) + + if (!isCZStaticModule && !isCZParcelable) { + val skipStaticForwarders = (claszSymbol.is(Module) || ctx.settings.XnoForwarders.value) + if (!skipStaticForwarders) { + val lmoc = claszSymbol.companionModule + // add static forwarders if there are no name conflicts; see bugs #363 and #1735 + if (lmoc != NoSymbol) { + // it must be a top level class (name contains no $s) + val isCandidateForForwarders = (lmoc.is(Module)) && lmoc.isStatic + if (isCandidateForForwarders) { + report.log(s"Adding static forwarders from '$claszSymbol' to implementations in '$lmoc'") + addForwarders(cnode, thisName, lmoc.moduleClass) + } + } + } + + } + + // the invoker is responsible for adding a class-static constructor. + + } // end of method initJClass + + /* + * must-single-thread + */ + private def fabricateStaticInitAndroid(): Unit = { + + val clinit: asm.MethodVisitor = cnode.visitMethod( + GenBCodeOps.PublicStatic, // TODO confirm whether we really don't want ACC_SYNTHETIC nor ACC_DEPRECATED + CLASS_CONSTRUCTOR_NAME, + "()V", + null, // no java-generic-signature + null // no throwable exceptions + ) + clinit.visitCode() + + legacyAddCreatorCode(clinit, cnode, thisName) + + clinit.visitInsn(asm.Opcodes.RETURN) + clinit.visitMaxs(0, 0) // just to follow protocol, dummy arguments + clinit.visitEnd() + } + + def addClassFields(): Unit = { + /* Non-method term members are fields, except for module members. Module + * members can only happen on .NET (no flatten) for inner traits. There, + * a module symbol is generated (transformInfo in mixin) which is used + * as owner for the members of the implementation class (so that the + * backend emits them as static). + * No code is needed for this module symbol. 
+ */ + for (f <- claszSymbol.info.decls.filter(p => p.isTerm && !p.is(Method))) { + val javagensig = getGenericSignature(f, claszSymbol) + val flags = javaFieldFlags(f) + + assert(!f.isStaticMember || !claszSymbol.isInterface || !f.is(Mutable), + s"interface $claszSymbol cannot have non-final static field $f") + + val jfield = new asm.tree.FieldNode( + flags, + f.javaSimpleName, + symInfoTK(f).descriptor, + javagensig, + null // no initial value + ) + cnode.fields.add(jfield) + emitAnnotations(jfield, f.annotations) + } + + } // end of method addClassFields() + + // current method + var mnode: MethodNode1 = null + var jMethodName: String = null + var isMethSymStaticCtor = false + var returnType: BType = null + var methSymbol: Symbol = null + // used by genLoadTry() and genSynchronized() + var earlyReturnVar: Symbol = null + var shouldEmitCleanup = false + // line numbers + var lastEmittedLineNr = -1 + + object bc extends JCodeMethodN { + override def jmethod = PlainSkelBuilder.this.mnode + } + + /* ---------------- Part 1 of program points, ie Labels in the ASM world ---------------- */ + + /* + * A jump is represented as a Return node whose `from` symbol denotes a Labeled's Bind node, the target of the jump. + * The `jumpDest` map is used to find the `LoadDestination` at the end of the `Labeled` block, as well as the + * corresponding expected type. The `LoadDestination` can never be `FallThrough` here. + */ + var jumpDest: immutable.Map[ /* Labeled */ Symbol, (BType, LoadDestination) ] = null + def registerJumpDest(labelSym: Symbol, expectedType: BType, dest: LoadDestination): Unit = { + assert(labelSym.is(Label), s"trying to register a jump-dest for a non-label symbol, at: ${labelSym.span}") + assert(dest != LoadDestination.FallThrough, s"trying to register a FallThrough dest for label, at: ${labelSym.span}") + assert(!jumpDest.contains(labelSym), s"trying to register a second jump-dest for label, at: ${labelSym.span}") + jumpDest += (labelSym -> (expectedType, dest)) + } + def findJumpDest(labelSym: Symbol): (BType, LoadDestination) = { + assert(labelSym.is(Label), s"trying to map a non-label symbol to an asm.Label, at: ${labelSym.span}") + jumpDest.getOrElse(labelSym, { + abort(s"unknown label symbol, for label at: ${labelSym.span}") + }) + } + + /* + * A program point may be lexically nested (at some depth) + * (a) in the try-clause of a try-with-finally expression + * (b) in a synchronized block. + * Each of the constructs above establishes a "cleanup block" to execute upon + * both normal-exit, early-return, and abrupt-termination of the instructions it encloses. + * + * The `cleanups` LIFO queue represents the nesting of active (for the current program point) + * pending cleanups. For each such cleanup an asm.Label indicates the start of its cleanup-block. + * At any given time during traversal of the method body, + * the head of `cleanups` denotes the cleanup-block for the closest enclosing try-with-finally or synchronized-expression. + * + * `cleanups` is used: + * + * (1) upon visiting a Return statement. + * In case of pending cleanups, we can't just emit a RETURN instruction, but must instead: + * - store the result (if any) in `earlyReturnVar`, and + * - jump to the next pending cleanup. + * See `genReturn()` + * + * (2) upon emitting a try-with-finally or a synchronized-expr, + * In these cases, the targets of the above jumps are emitted, + * provided an early exit was actually encountered somewhere in the protected clauses. 
+ * See `genLoadTry()` and `genSynchronized()` + * + * The code thus emitted for jumps and targets covers the early-return case. + * The case of abrupt (ie exceptional) termination is covered by exception handlers + * emitted for that purpose as described in `genLoadTry()` and `genSynchronized()`. + */ + var cleanups: List[asm.Label] = Nil + def registerCleanup(finCleanup: asm.Label): Unit = { + if (finCleanup != null) { cleanups = finCleanup :: cleanups } + } + def unregisterCleanup(finCleanup: asm.Label): Unit = { + if (finCleanup != null) { + assert(cleanups.head eq finCleanup, + s"Bad nesting of cleanup operations: $cleanups trying to unregister: $finCleanup") + cleanups = cleanups.tail + } + } + + /* ---------------- local variables and params ---------------- */ + + case class Local(tk: BType, name: String, idx: Int, isSynth: Boolean) + + /* + * Bookkeeping for method-local vars and method-params. + * + * TODO: use fewer slots. local variable slots are never re-used in separate blocks. + * In the following example, x and y could use the same slot. + * def foo() = { + * { val x = 1 } + * { val y = "a" } + * } + */ + object locals { + + private val slots = mutable.AnyRefMap.empty[Symbol, Local] // (local-or-param-sym -> Local(BType, name, idx, isSynth)) + + private var nxtIdx = -1 // next available index for local-var + + def reset(isStaticMethod: Boolean): Unit = { + slots.clear() + nxtIdx = if (isStaticMethod) 0 else 1 + } + + def contains(locSym: Symbol): Boolean = { slots.contains(locSym) } + + def apply(locSym: Symbol): Local = { slots.apply(locSym) } + + /* Make a fresh local variable, ensuring a unique name. + * The invoker must make sure inner classes are tracked for the sym's tpe. + */ + def makeLocal(tk: BType, name: String, tpe: Type, pos: Span): Symbol = { + + val locSym = newSymbol(methSymbol, name.toTermName, Synthetic, tpe, NoSymbol, pos) + makeLocal(locSym, tk) + locSym + } + + def makeLocal(locSym: Symbol): Local = { + makeLocal(locSym, symInfoTK(locSym)) + } + + def getOrMakeLocal(locSym: Symbol): Local = { + // `getOrElse` below has the same effect as `getOrElseUpdate` because `makeLocal()` adds an entry to the `locals` map. + slots.getOrElse(locSym, makeLocal(locSym)) + } + + def reuseLocal(sym: Symbol, loc: Local): Unit = + val existing = slots.put(sym, loc) + if (existing.isDefined) + report.error("attempt to create duplicate local var.", ctx.source.atSpan(sym.span)) + + def reuseThisSlot(sym: Symbol): Unit = + reuseLocal(sym, Local(symInfoTK(sym), sym.javaSimpleName, 0, sym.is(Synthetic))) + + private def makeLocal(sym: Symbol, tk: BType): Local = { + assert(nxtIdx != -1, "not a valid start index") + val loc = Local(tk, sym.javaSimpleName, nxtIdx, sym.is(Synthetic)) + val existing = slots.put(sym, loc) + if (existing.isDefined) + report.error("attempt to create duplicate local var.", ctx.source.atSpan(sym.span)) + assert(tk.size > 0, "makeLocal called for a symbol whose type is Unit.") + nxtIdx += tk.size + loc + } + + // not to be confused with `fieldStore` and `fieldLoad` which also take a symbol but a field-symbol. + def store(locSym: Symbol): Unit = { + val Local(tk, _, idx, _) = slots(locSym) + bc.store(idx, tk) + } + + def load(locSym: Symbol): Unit = { + val Local(tk, _, idx, _) = slots(locSym) + bc.load(idx, tk) + } + + } + + /* ---------------- Part 2 of program points, ie Labels in the ASM world ---------------- */ + + // bookkeeping the scopes of non-synthetic local vars, to emit debug info (`emitVars`). 
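+    // (Each entry pairs a local-var symbol with the label at which its scope opens; entries are later
+    //  turned into LocalVariableTable entries via `emitLocalVarScope()` further below.)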
+ var varsInScope: List[(Symbol, asm.Label)] = null // (local-var-sym -> start-of-scope) + + // helpers around program-points. + def lastInsn: asm.tree.AbstractInsnNode = mnode.instructions.getLast + def currProgramPoint(): asm.Label = { + lastInsn match { + case labnode: asm.tree.LabelNode => labnode.getLabel + case _ => + val pp = new asm.Label + mnode visitLabel pp + pp + } + } + def markProgramPoint(lbl: asm.Label): Unit = { + val skip = (lbl == null) || isAtProgramPoint(lbl) + if (!skip) { mnode visitLabel lbl } + } + def isAtProgramPoint(lbl: asm.Label): Boolean = { + def getNonLineNumberNode(a: asm.tree.AbstractInsnNode): asm.tree.AbstractInsnNode = a match { + case a: asm.tree.LineNumberNode => getNonLineNumberNode(a.getPrevious) // line numbers aren't part of code itself + case _ => a + } + (getNonLineNumberNode(lastInsn) match { + case labnode: asm.tree.LabelNode => (labnode.getLabel == lbl); + case _ => false } ) + } + def lineNumber(tree: Tree): Unit = { + if (!emitLines || !tree.span.exists) return; + val nr = ctx.source.offsetToLine(tree.span.point) + 1 + if (nr != lastEmittedLineNr) { + lastEmittedLineNr = nr + lastInsn match { + case lnn: asm.tree.LineNumberNode => + // overwrite previous landmark as no instructions have been emitted for it + lnn.line = nr + case _ => + mnode.visitLineNumber(nr, currProgramPoint()) + } + } + } + + // on entering a method + def resetMethodBookkeeping(dd: DefDef) = { + val rhs = dd.rhs + locals.reset(isStaticMethod = methSymbol.isStaticMember) + jumpDest = immutable.Map.empty + + // check previous invocation of genDefDef exited as many varsInScope as it entered. + assert(varsInScope == null, "Unbalanced entering/exiting of GenBCode's genBlock().") + // check previous invocation of genDefDef unregistered as many cleanups as it registered. + assert(cleanups == Nil, "Previous invocation of genDefDef didn't unregister as many cleanups as it registered.") + earlyReturnVar = null + shouldEmitCleanup = false + + lastEmittedLineNr = -1 + } + + /* ---------------- top-down traversal invoking ASM Tree API along the way ---------------- */ + + def gen(tree: Tree): Unit = { + tree match { + case tpd.EmptyTree => () + + case ValDef(name, tpt, rhs) => () // fields are added in `genPlainClass()`, via `addClassFields()` + + case dd: DefDef => + /* First generate a static forwarder if this is a non-private trait + * trait method. This is required for super calls to this method, which + * go through the static forwarder in order to work around limitations + * of the JVM. + * + * For the $init$ method, we must not leave it as a default method, but + * instead we must put the whole body in the static method. If we leave + * it as a default method, Java classes cannot extend Scala classes that + * extend several Scala traits, since they then inherit unrelated default + * $init$ methods. See #8599. scalac does the same thing. + * + * In theory, this would go in a separate MiniPhase, but it would have to + * sit in a MegaPhase of its own between GenSJSIR and GenBCode, so the cost + * is not worth it. We directly do it in this back-end instead, which also + * kind of makes sense because it is JVM-specific. 
+ */ + val sym = dd.symbol + val needsStaticImplMethod = + claszSymbol.isInterface && !dd.rhs.isEmpty && !sym.isPrivate && !sym.isStaticMember + if needsStaticImplMethod then + if sym.name == nme.TRAIT_CONSTRUCTOR then + genTraitConstructorDefDef(dd) + else + genStaticForwarderForDefDef(dd) + genDefDef(dd) + else + genDefDef(dd) + + case tree: Template => + val body = + if (tree.constr.rhs.isEmpty) tree.body + else tree.constr :: tree.body + body foreach gen + + case _ => abort(s"Illegal tree in gen: $tree") + } + } + + /* + * must-single-thread + */ + def initJMethod(flags: Int, params: List[Symbol]): Unit = { + + val jgensig = getGenericSignature(methSymbol, claszSymbol) + val (excs, others) = methSymbol.annotations.partition(_.symbol eq defn.ThrowsAnnot) + val thrownExceptions: List[String] = getExceptions(excs) + + val bytecodeName = + if (isMethSymStaticCtor) CLASS_CONSTRUCTOR_NAME + else jMethodName + + val mdesc = asmMethodType(methSymbol).descriptor + mnode = cnode.visitMethod( + flags, + bytecodeName, + mdesc, + jgensig, + mkArrayS(thrownExceptions) + ).asInstanceOf[MethodNode1] + + // TODO param names: (m.params map (p => javaName(p.sym))) + + emitAnnotations(mnode, others) + emitParamNames(mnode, params) + emitParamAnnotations(mnode, params.map(_.annotations)) + + } // end of method initJMethod + + private def genTraitConstructorDefDef(dd: DefDef): Unit = + val statifiedDef = makeStatifiedDefDef(dd) + genDefDef(statifiedDef) + + /** Creates a copy of the given DefDef that is static and where an explicit + * self parameter represents the original `this` value. + * + * Example: from + * {{{ + * trait Enclosing { + * def foo(x: Int): String = this.toString() + x + * } + * }}} + * the statified version of `foo` would be + * {{{ + * static def foo($self: Enclosing, x: Int): String = $self.toString() + x + * }}} + */ + private def makeStatifiedDefDef(dd: DefDef): DefDef = + val origSym = dd.symbol.asTerm + val newSym = makeStatifiedDefSymbol(origSym, origSym.name) + tpd.DefDef(newSym, { paramRefss => + val selfParamRef :: regularParamRefs = paramRefss.head: @unchecked + val enclosingClass = origSym.owner.asClass + new TreeTypeMap( + typeMap = _.substThis(enclosingClass, selfParamRef.symbol.termRef) + .subst(dd.termParamss.head.map(_.symbol), regularParamRefs.map(_.symbol.termRef)), + treeMap = { + case tree: This if tree.symbol == enclosingClass => selfParamRef + case tree => tree + }, + oldOwners = origSym :: Nil, + newOwners = newSym :: Nil + ).transform(dd.rhs) + }) + + private def genStaticForwarderForDefDef(dd: DefDef): Unit = + val forwarderDef = makeStaticForwarder(dd) + genDefDef(forwarderDef) + + /* Generates a synthetic static forwarder for a trait method. + * For a method such as + * def foo(...args: Ts): R + * in trait X, we generate the following method: + * static def foo$($this: X, ...args: Ts): R = + * invokespecial $this.X::foo(...args) + * We force an invokespecial with the attachment UseInvokeSpecial. It is + * necessary to make sure that the call will not follow overrides of foo() + * in subtraits and subclasses, since the whole point of this forward is to + * encode super calls. 
+ */ + private def makeStaticForwarder(dd: DefDef): DefDef = + val origSym = dd.symbol.asTerm + val name = traitSuperAccessorName(origSym).toTermName + val sym = makeStatifiedDefSymbol(origSym, name) + tpd.DefDef(sym, { paramss => + val params = paramss.head + tpd.Apply(params.head.select(origSym), params.tail) + .withAttachment(BCodeHelpers.UseInvokeSpecial, ()) + }) + + private def makeStatifiedDefSymbol(origSym: TermSymbol, name: TermName): TermSymbol = + val info = origSym.info match + case mt: MethodType => + MethodType(nme.SELF :: mt.paramNames, origSym.owner.typeRef :: mt.paramInfos, mt.resType) + origSym.copy( + name = name.toTermName, + flags = Method | JavaStatic, + info = info + ).asTerm + + def genDefDef(dd: DefDef): Unit = { + val rhs = dd.rhs + val vparamss = dd.termParamss + // the only method whose implementation is not emitted: getClass() + if (dd.symbol eq defn.Any_getClass) { return } + assert(mnode == null, "GenBCode detected nested method.") + + methSymbol = dd.symbol + jMethodName = methSymbol.javaSimpleName + returnType = asmMethodType(dd.symbol).returnType + isMethSymStaticCtor = methSymbol.isStaticConstructor + + resetMethodBookkeeping(dd) + + // add method-local vars for params + + assert(vparamss.isEmpty || vparamss.tail.isEmpty, s"Malformed parameter list: $vparamss") + val params = if (vparamss.isEmpty) Nil else vparamss.head + for (p <- params) { locals.makeLocal(p.symbol) } + // debug assert((params.map(p => locals(p.symbol).tk)) == asmMethodType(methSymbol).getArgumentTypes.toList, "debug") + + if (params.size > MaximumJvmParameters) { + // SI-7324 + report.error(em"Platform restriction: a parameter list's length cannot exceed $MaximumJvmParameters.", ctx.source.atSpan(methSymbol.span)) + return + } + + val isNative = methSymbol.hasAnnotation(NativeAttr) + val isAbstractMethod = (methSymbol.is(Deferred) || (methSymbol.owner.isInterface && ((methSymbol.is(Deferred)) || methSymbol.isClassConstructor))) + val flags = + import GenBCodeOps.addFlagIf + javaFlags(methSymbol) + .addFlagIf(isAbstractMethod, asm.Opcodes.ACC_ABSTRACT) + .addFlagIf(false /*methSymbol.isStrictFP*/, asm.Opcodes.ACC_STRICT) + .addFlagIf(isNative, asm.Opcodes.ACC_NATIVE) // native methods of objects are generated in mirror classes + + // TODO needed? 
for(ann <- m.symbol.annotations) { ann.symbol.initialize } + val paramSyms = params.map(_.symbol) + initJMethod(flags, paramSyms) + + + if (!isAbstractMethod && !isNative) { + // #14773 Reuse locals slots for tailrec-generated mutable vars + val trimmedRhs: Tree = + @tailrec def loop(stats: List[Tree]): List[Tree] = + stats match + case (tree @ ValDef(TailLocalName(_, _), _, _)) :: rest if tree.symbol.isAllOf(Mutable | Synthetic) => + tree.rhs match + case This(_) => + locals.reuseThisSlot(tree.symbol) + loop(rest) + case rhs: Ident if paramSyms.contains(rhs.symbol) => + locals.reuseLocal(tree.symbol, locals(rhs.symbol)) + loop(rest) + case _ => + stats + case _ => + stats + end loop + + rhs match + case Block(stats, expr) => + val trimmedStats = loop(stats) + if trimmedStats eq stats then + rhs + else + Block(trimmedStats, expr) + case _ => + rhs + end trimmedRhs + + def emitNormalMethodBody(): Unit = { + val veryFirstProgramPoint = currProgramPoint() + + if trimmedRhs == tpd.EmptyTree then + report.error( + em"Concrete method has no definition: $dd${ + if (ctx.settings.Ydebug.value) "(found: " + methSymbol.owner.info.decls.toList.mkString(", ") + ")" + else ""}", + ctx.source.atSpan(NoSpan) + ) + else + genLoadTo(trimmedRhs, returnType, LoadDestination.Return) + + if (emitVars) { + // add entries to LocalVariableTable JVM attribute + val onePastLastProgramPoint = currProgramPoint() + val hasStaticBitSet = ((flags & asm.Opcodes.ACC_STATIC) != 0) + if (!hasStaticBitSet) { + mnode.visitLocalVariable( + "this", + "L" + thisName + ";", + null, + veryFirstProgramPoint, + onePastLastProgramPoint, + 0 + ) + } + for (p <- params) { emitLocalVarScope(p.symbol, veryFirstProgramPoint, onePastLastProgramPoint, force = true) } + } + + if (isMethSymStaticCtor) { appendToStaticCtor(dd) } + } // end of emitNormalMethodBody() + + lineNumber(rhs) + emitNormalMethodBody() + + // Note we don't invoke visitMax, thus there are no FrameNode among mnode.instructions. + // The only non-instruction nodes to be found are LabelNode and LineNumberNode. + } + + if (AsmUtils.traceMethodEnabled && mnode.name.contains(AsmUtils.traceMethodPattern)) + AsmUtils.traceMethod(mnode) + + mnode = null + } // end of method genDefDef() + + /* + * must-single-thread + * + * TODO document, explain interplay with `fabricateStaticInitAndroid()` + */ + private def appendToStaticCtor(dd: DefDef): Unit = { + + def insertBefore( + location: asm.tree.AbstractInsnNode, + i0: asm.tree.AbstractInsnNode, + i1: asm.tree.AbstractInsnNode): Unit = { + if (i0 != null) { + mnode.instructions.insertBefore(location, i0.clone(null)) + mnode.instructions.insertBefore(location, i1.clone(null)) + } + } + + // collect all return instructions + var rets: List[asm.tree.AbstractInsnNode] = Nil + mnode foreachInsn { i => if (i.getOpcode() == asm.Opcodes.RETURN) { rets ::= i } } + if (rets.isEmpty) { return } + + var insnParcA: asm.tree.AbstractInsnNode = null + var insnParcB: asm.tree.AbstractInsnNode = null + // android creator code + if (isCZParcelable) { + // add a static field ("CREATOR") to this class to cache android.os.Parcelable$Creator + val andrFieldDescr = classBTypeFromSymbol(AndroidCreatorClass).descriptor + cnode.visitField( + asm.Opcodes.ACC_STATIC | asm.Opcodes.ACC_FINAL, + "CREATOR", + andrFieldDescr, + null, + null + ) + // INVOKESTATIC CREATOR(): android.os.Parcelable$Creator; -- TODO where does this Android method come from? 
+ val callee = claszSymbol.companionModule.info.member(androidFieldName).symbol + val jowner = internalName(callee.owner) + val jname = callee.javaSimpleName + val jtype = asmMethodType(callee).descriptor + insnParcA = new asm.tree.MethodInsnNode(asm.Opcodes.INVOKESTATIC, jowner, jname, jtype, false) + // PUTSTATIC `thisName`.CREATOR; + insnParcB = new asm.tree.FieldInsnNode(asm.Opcodes.PUTSTATIC, thisName, "CREATOR", andrFieldDescr) + } + + // insert a few instructions for initialization before each return instruction + for(r <- rets) { + insertBefore(r, insnParcA, insnParcB) + } + + } + + def emitLocalVarScope(sym: Symbol, start: asm.Label, end: asm.Label, force: Boolean = false): Unit = { + val Local(tk, name, idx, isSynth) = locals(sym) + if (force || !isSynth) { + mnode.visitLocalVariable(name, tk.descriptor, null, start, end, idx) + } + } + + def genLoadTo(tree: Tree, expectedType: BType, dest: LoadDestination): Unit + + } // end of class PlainSkelBuilder + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BCodeSyncAndTry.scala b/tests/pos-with-compiler-cc/backend/jvm/BCodeSyncAndTry.scala new file mode 100644 index 000000000000..b5ed27511e7e --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BCodeSyncAndTry.scala @@ -0,0 +1,426 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.collection.immutable +import scala.tools.asm + +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.core.StdNames.nme +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.ast.tpd + +/* + * + * @author Miguel Garcia, http://lamp.epfl.ch/~magarcia/ScalaCompilerCornerReloaded/ + * @version 1.0 + * + */ +trait BCodeSyncAndTry extends BCodeBodyBuilder { + import int.given + import tpd._ + import bTypes._ + import coreBTypes._ + /* + * Functionality to lower `synchronized` and `try` expressions. + */ + abstract class SyncAndTryBuilder(cunit: CompilationUnit) extends PlainBodyBuilder(cunit) { + + def genSynchronized(tree: Apply, expectedType: BType): BType = (tree: @unchecked) match { + case Apply(TypeApply(fun, _), args) => + val monitor = locals.makeLocal(ObjectRef, "monitor", defn.ObjectType, tree.span) + val monCleanup = new asm.Label + + // if the synchronized block returns a result, store it in a local variable. + // Just leaving it on the stack is not valid in MSIL (stack is cleaned when leaving try-blocks). + val hasResult = (expectedType != UNIT) + val monitorResult: Symbol = if (hasResult) locals.makeLocal(tpeTK(args.head), "monitorResult", defn.ObjectType, tree.span) else null + + /* ------ (1) pushing and entering the monitor, also keeping a reference to it in a local var. ------ */ + genLoadQualifier(fun) + bc dup ObjectRef + locals.store(monitor) + emit(asm.Opcodes.MONITORENTER) + + /* ------ (2) Synchronized block. + * Reached by fall-through from (1). + * Protected by: + * (2.a) the EH-version of the monitor-exit, and + * (2.b) whatever protects the whole synchronized expression. + * ------ + */ + val startProtected = currProgramPoint() + registerCleanup(monCleanup) + genLoad(args.head, expectedType /* toTypeKind(tree.tpe.resultType) */) + unregisterCleanup(monCleanup) + if (hasResult) { locals.store(monitorResult) } + nopIfNeeded(startProtected) + val endProtected = currProgramPoint() + + /* ------ (3) monitor-exit after normal, non-early-return, termination of (2). + * Reached by fall-through from (2). + * Protected by whatever protects the whole synchronized expression. 
+ * ------ + */ + locals.load(monitor) + emit(asm.Opcodes.MONITOREXIT) + if (hasResult) { locals.load(monitorResult) } + val postHandler = new asm.Label + bc goTo postHandler + + /* ------ (4) exception-handler version of monitor-exit code. + * Reached upon abrupt termination of (2). + * Protected by whatever protects the whole synchronized expression. + * null => "any" exception in bytecode, like we emit for finally. + * Important not to use j/l/Throwable which dooms the method to a life of interpretation! (SD-233) + * ------ + */ + protect(startProtected, endProtected, currProgramPoint(), null) + locals.load(monitor) + emit(asm.Opcodes.MONITOREXIT) + emit(asm.Opcodes.ATHROW) + + /* ------ (5) cleanup version of monitor-exit code. + * Reached upon early-return from (2). + * Protected by whatever protects the whole synchronized expression. + * ------ + */ + if (shouldEmitCleanup) { + markProgramPoint(monCleanup) + locals.load(monitor) + emit(asm.Opcodes.MONITOREXIT) + pendingCleanups() + } + + /* ------ (6) normal exit of the synchronized expression. + * Reached after normal, non-early-return, termination of (3). + * Protected by whatever protects the whole synchronized expression. + * ------ + */ + mnode visitLabel postHandler + + lineNumber(tree) + + expectedType + } + + /* + * Detects whether no instructions have been emitted since label `lbl` and if so emits a NOP. + * Useful to avoid emitting an empty try-block being protected by exception handlers, + * which results in "java.lang.ClassFormatError: Illegal exception table range". See SI-6102. + */ + def nopIfNeeded(lbl: asm.Label): Unit = { + val noInstructionEmitted = isAtProgramPoint(lbl) + if (noInstructionEmitted) { emit(asm.Opcodes.NOP) } + } + + /* + * Emitting try-catch is easy, emitting try-catch-finally not quite so. + * A finally-block (which always has type Unit, thus leaving the operand stack unchanged) + * affects control-transfer from protected regions, as follows: + * + * (a) `return` statement: + * + * First, the value to return (if any) is evaluated. + * Afterwards, all enclosing finally-blocks are run, from innermost to outermost. + * Only then is the return value (if any) returned. + * + * Some terminology: + * (a.1) Executing a return statement that is protected + * by one or more finally-blocks is called "early return" + * (a.2) the chain of code sections (a code section for each enclosing finally-block) + * to run upon early returns is called "cleanup chain" + * + * As an additional spin, consider a return statement in a finally-block. + * In this case, the value to return depends on how control arrived at that statement: + * in case it arrived via a previous return, the previous return enjoys priority: + * the value to return is given by that statement. + * + * (b) A finally-block protects both the try-clause and the catch-clauses. + * + * Sidenote: + * A try-clause may contain an empty block. On CLR, a finally-block has special semantics + * regarding Abort interruptions; but on the JVM it's safe to elide an exception-handler + * that protects an "empty" range ("empty" as in "containing NOPs only", + * see `asm.optimiz.DanglingExcHandlers` and SI-6720). + * + * This means a finally-block indicates instructions that can be reached: + * (b.1) Upon normal (non-early-returning) completion of the try-clause or a catch-clause + * In this case, the next-program-point is that following the try-catch-finally expression. 
+ * (b.2) Upon early-return initiated in the try-clause or a catch-clause + * In this case, the next-program-point is the enclosing cleanup section (if any), otherwise return. + * (b.3) Upon abrupt termination (due to unhandled exception) of the try-clause or a catch-clause + * In this case, the unhandled exception must be re-thrown after running the finally-block. + * + * (c) finally-blocks are implicit to `synchronized` (a finally-block is added to just release the lock) + * that's why `genSynchronized()` too emits cleanup-sections. + * + * A number of code patterns can be emitted to realize the intended semantics. + * + * A popular alternative (GenICode, javac) consists in duplicating the cleanup-chain at each early-return position. + * The principle at work being that once control is transferred to a cleanup-section, + * control will always stay within the cleanup-chain. + * That is, barring an exception being thrown in a cleanup-section, in which case the enclosing try-block + * (reached via abrupt termination) takes over. + * + * The observations above hint at another code layout, less verbose, for the cleanup-chain. + * + * The code layout that GenBCode emits takes into account that once a cleanup section has been reached, + * jumping to the next cleanup-section (and so on, until the outermost one) realizes the correct semantics. + * + * There is still code duplication in that two cleanup-chains are needed (but this is unavoidable, anyway): + * one for normal control flow and another chain consisting of exception handlers. + * The in-line comments below refer to them as + * - "early-return-cleanups" and + * - "exception-handler-version-of-finally-block" respectively. + * + */ + def genLoadTry(tree: Try): BType = tree match { + case Try(block, catches, finalizer) => + val kind = tpeTK(tree) + + val caseHandlers: List[EHClause] = + for (CaseDef(pat, _, caseBody) <- catches) yield { + pat match { + case Typed(Ident(nme.WILDCARD), tpt) => NamelessEH(tpeTK(tpt).asClassBType, caseBody) + case Ident(nme.WILDCARD) => NamelessEH(jlThrowableRef, caseBody) + case Bind(_, _) => BoundEH (pat.symbol, caseBody) + } + } + + // ------ (0) locals used later ------ + + /* + * `postHandlers` is a program point denoting: + * (a) the finally-clause conceptually reached via fall-through from try-catch-finally + * (in case a finally-block is present); or + * (b) the program point right after the try-catch + * (in case there's no finally-block). + * The name choice emphasizes that the code section lies "after all exception handlers", + * where "all exception handlers" includes those derived from catch-clauses as well as from finally-blocks. + */ + val postHandlers = new asm.Label + + val hasFinally = (finalizer != tpd.EmptyTree) + + /* + * used in the finally-clause reached via fall-through from try-catch, if any. + */ + val guardResult = hasFinally && (kind != UNIT) && mayCleanStack(finalizer) + + /* + * please notice `tmp` has type tree.tpe, while `earlyReturnVar` has the method return type. + * Because those two types can be different, dedicated vars are needed. + */ + val tmp = if (guardResult) locals.makeLocal(tpeTK(tree), "tmp", tree.tpe, tree.span) else null + + /* + * upon early return from the try-body or one of its EHs (but not the EH-version of the finally-clause) + * AND hasFinally, a cleanup is needed. 
+ */ + val finCleanup = if (hasFinally) new asm.Label else null + + /* ------ (1) try-block, protected by: + * (1.a) the EHs due to case-clauses, emitted in (2), + * (1.b) the EH due to finally-clause, emitted in (3.A) + * (1.c) whatever protects the whole try-catch-finally expression. + * ------ + */ + + val startTryBody = currProgramPoint() + registerCleanup(finCleanup) + genLoad(block, kind) + unregisterCleanup(finCleanup) + nopIfNeeded(startTryBody) + val endTryBody = currProgramPoint() + bc goTo postHandlers + + /** + * A return within a `try` or `catch` block where a `finally` is present ("early return") + * emits a store of the result to a local, jump to a "cleanup" version of the `finally` block, + * and sets `shouldEmitCleanup = true` (see [[PlainBodyBuilder.genReturn]]). + * + * If the try-catch is nested, outer `finally` blocks need to be emitted in a cleanup version + * as well, so the `shouldEmitCleanup` variable remains `true` until the outermost `finally`. + * Nested cleanup `finally` blocks jump to the next enclosing one. For the outermost, we emit + * a read of the local variable, a return, and we set `shouldEmitCleanup = false` (see + * [[pendingCleanups]]). + * + * Now, assume we have + * + * try { return 1 } finally { + * try { println() } finally { println() } + * } + * + * Here, the outer `finally` needs a cleanup version, but the inner one does not. The method + * here makes sure that `shouldEmitCleanup` is only propagated outwards, not inwards to + * nested `finally` blocks. + */ + def withFreshCleanupScope(body: => Unit) = { + val savedShouldEmitCleanup = shouldEmitCleanup + shouldEmitCleanup = false + body + shouldEmitCleanup = savedShouldEmitCleanup || shouldEmitCleanup + } + + /* ------ (2) One EH for each case-clause (this does not include the EH-version of the finally-clause) + * An EH in (2) is reached upon abrupt termination of (1). + * An EH in (2) is protected by: + * (2.a) the EH-version of the finally-clause, if any. + * (2.b) whatever protects the whole try-catch-finally expression. + * ------ + */ + + for (ch <- caseHandlers) withFreshCleanupScope { + + // (2.a) emit case clause proper + val startHandler = currProgramPoint() + var endHandler: asm.Label = null + var excType: ClassBType = null + registerCleanup(finCleanup) + ch match { + case NamelessEH(typeToDrop, caseBody) => + bc drop typeToDrop + genLoad(caseBody, kind) // adapts caseBody to `kind`, thus it can be stored, if `guardResult`, in `tmp`. + nopIfNeeded(startHandler) + endHandler = currProgramPoint() + excType = typeToDrop + + case BoundEH (patSymbol, caseBody) => + // test/files/run/contrib674.scala , a local-var already exists for patSymbol. + // rather than creating on first-access, we do it right away to emit debug-info for the created local var. + val Local(patTK, _, patIdx, _) = locals.getOrMakeLocal(patSymbol) + bc.store(patIdx, patTK) + genLoad(caseBody, kind) + nopIfNeeded(startHandler) + endHandler = currProgramPoint() + emitLocalVarScope(patSymbol, startHandler, endHandler) + excType = patTK.asClassBType + } + unregisterCleanup(finCleanup) + // (2.b) mark the try-body as protected by this case clause. + protect(startTryBody, endTryBody, startHandler, excType) + // (2.c) emit jump to the program point where the finally-clause-for-normal-exit starts, or in effect `after` if no finally-clause was given. 
+ bc goTo postHandlers + + } + + // Need to save the state of `shouldEmitCleanup` at this point: while emitting the first + // version of the `finally` block below, the variable may become true. But this does not mean + // that we need a cleanup version for the current block, only for the enclosing ones. + val currentFinallyBlockNeedsCleanup = shouldEmitCleanup + + /* ------ (3.A) The exception-handler-version of the finally-clause. + * Reached upon abrupt termination of (1) or one of the EHs in (2). + * Protected only by whatever protects the whole try-catch-finally expression. + * ------ + */ + + // a note on terminology: this is not "postHandlers", despite appearances. + // "postHandlers" as in the source-code view. And from that perspective, both (3.A) and (3.B) are invisible implementation artifacts. + if (hasFinally) withFreshCleanupScope { + nopIfNeeded(startTryBody) + val finalHandler = currProgramPoint() // version of the finally-clause reached via unhandled exception. + protect(startTryBody, finalHandler, finalHandler, null) + val Local(eTK, _, eIdx, _) = locals(locals.makeLocal(jlThrowableRef, "exc", defn.ThrowableType, finalizer.span)) + bc.store(eIdx, eTK) + emitFinalizer(finalizer, null, isDuplicate = true) + bc.load(eIdx, eTK) + emit(asm.Opcodes.ATHROW) + } + + /* ------ (3.B) Cleanup-version of the finally-clause. + * Reached upon early RETURN from (1) or upon early RETURN from one of the EHs in (2) + * (and only from there, ie reached only upon early RETURN from + * program regions bracketed by registerCleanup/unregisterCleanup). + * Protected only by whatever protects the whole try-catch-finally expression. + * + * Given that control arrives to a cleanup section only upon early RETURN, + * the value to return (if any) is always available. Therefore, a further RETURN + * found in a cleanup section is always ignored (a warning is displayed, @see `genReturn()`). + * In order for `genReturn()` to know whether the return statement is enclosed in a cleanup section, + * the variable `insideCleanupBlock` is used. + * ------ + */ + + // this is not "postHandlers" either. + // `shouldEmitCleanup` can be set, and at the same time this try expression may lack a finally-clause. + // In other words, all combinations of (hasFinally, shouldEmitCleanup) are valid. + if (hasFinally && currentFinallyBlockNeedsCleanup) { + markProgramPoint(finCleanup) + // regarding return value, the protocol is: in place of a `return-stmt`, a sequence of `adapt, store, jump` are inserted. + emitFinalizer(finalizer, null, isDuplicate = true) + pendingCleanups() + } + + /* ------ (4) finally-clause-for-normal-nonEarlyReturn-exit + * Reached upon normal, non-early-return termination of (1) or of an EH in (2). + * Protected only by whatever protects the whole try-catch-finally expression. + * TODO explain what happens upon RETURN contained in (4) + * ------ + */ + + markProgramPoint(postHandlers) + if (hasFinally) { + emitFinalizer(finalizer, tmp, isDuplicate = false) // the only invocation of emitFinalizer with `isDuplicate == false` + } + + kind + } // end of genLoadTry() + + /* if no more pending cleanups, all that remains to do is return. Otherwise jump to the next (outer) pending cleanup. 
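+   * With nested cleanups this forms a chain: each cleanup section ends by calling `pendingCleanups()`,
+   * which either jumps to the next enclosing cleanup label or, at the outermost level, loads
+   * `earlyReturnVar` (if there is one), emits the RETURN and resets `shouldEmitCleanup`.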
*/ + private def pendingCleanups(): Unit = { + cleanups match { + case Nil => + if (earlyReturnVar != null) { + locals.load(earlyReturnVar) + bc.emitRETURN(locals(earlyReturnVar).tk) + } else { + bc emitRETURN UNIT + } + shouldEmitCleanup = false + + case nextCleanup :: _ => + bc goTo nextCleanup + } + } + + def protect(start: asm.Label, end: asm.Label, handler: asm.Label, excType: ClassBType): Unit = { + val excInternalName: String = + if (excType == null) null + else excType.internalName + assert(start != end, "protecting a range of zero instructions leads to illegal class format. Solution: add a NOP to that range.") + mnode.visitTryCatchBlock(start, end, handler, excInternalName) + } + + /* `tmp` (if non-null) is the symbol of the local-var used to preserve the result of the try-body, see `guardResult` */ + def emitFinalizer(finalizer: Tree, tmp: Symbol, isDuplicate: Boolean): Unit = { + var saved: immutable.Map[ /* Labeled */ Symbol, (BType, LoadDestination) ] = null + if (isDuplicate) { + saved = jumpDest + } + // when duplicating, the above guarantees new asm.Labels are used for LabelDefs contained in the finalizer (their vars are reused, that's ok) + if (tmp != null) { locals.store(tmp) } + genLoad(finalizer, UNIT) + if (tmp != null) { locals.load(tmp) } + if (isDuplicate) { + jumpDest = saved + } + } + + /* Does this tree have a try-catch block? */ + def mayCleanStack(tree: Tree): Boolean = tree.find { t => t match { // TODO: use existsSubTree + case Try(_, _, _) => true + case _ => false + } + }.isDefined + + trait EHClause + case class NamelessEH(typeToDrop: ClassBType, caseBody: Tree) extends EHClause + case class BoundEH (patSymbol: Symbol, caseBody: Tree) extends EHClause + + } + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BTypes.scala b/tests/pos-with-compiler-cc/backend/jvm/BTypes.scala new file mode 100644 index 000000000000..dda85e2d5616 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BTypes.scala @@ -0,0 +1,864 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import scala.tools.asm + +/** + * The BTypes component defines The BType class hierarchy. BTypes encapsulates all type information + * that is required after building the ASM nodes. This includes optimizations, geneartion of + * InnerClass attributes and generation of stack map frames. + * + * This representation is immutable and independent of the compiler data structures, hence it can + * be queried by concurrent threads. + */ +abstract class BTypes extends caps.Pure { + + val int: DottyBackendInterface + import int.given + /** + * A map from internal names to ClassBTypes. Every ClassBType is added to this map on its + * construction. + * + * This map is used when computing stack map frames. The asm.ClassWriter invokes the method + * `getCommonSuperClass`. In this method we need to obtain the ClassBType for a given internal + * name. The method assumes that every class type that appears in the bytecode exists in the map. + * + * Concurrent because stack map frames are computed when in the class writer, which might run + * on multiple classes concurrently. + */ + protected def classBTypeFromInternalNameMap: collection.concurrent.Map[String, ClassBType] + // NOTE: Should be a lazy val but scalac does not allow abstract lazy vals (dotty does) + + /** + * Obtain a previously constructed ClassBType for a given internal name. 
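+   * For example, once the ClassBType for "java/lang/String" has been constructed, this returns that
+   * cached instance; asking for an internal name that was never constructed simply fails, in line with
+   * the assumption documented on `classBTypeFromInternalNameMap` above.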
+ */ + def classBTypeFromInternalName(internalName: String) = classBTypeFromInternalNameMap(internalName) + + // Some core BTypes are required here, in class BType, where no Global instance is available. + // The Global is only available in the subclass BTypesFromSymbols. We cannot depend on the actual + // implementation (CoreBTypesProxy) here because it has members that refer to global.Symbol. + val coreBTypes: CoreBTypesProxyGlobalIndependent[this.type] + import coreBTypes._ + + /** + * A BType is either a primitve type, a ClassBType, an ArrayBType of one of these, or a MethodType + * referring to BTypes. + */ + /*sealed*/ trait BType extends caps.Pure { // Not sealed for now due to SI-8546 + final override def toString: String = this match { + case UNIT => "V" + case BOOL => "Z" + case CHAR => "C" + case BYTE => "B" + case SHORT => "S" + case INT => "I" + case FLOAT => "F" + case LONG => "J" + case DOUBLE => "D" + case ClassBType(internalName) => "L" + internalName + ";" + case ArrayBType(component) => "[" + component + case MethodBType(args, res) => args.mkString("(", "", ")" + res) + } + + /** + * @return The Java descriptor of this type. Examples: + * - int: I + * - java.lang.String: Ljava/lang/String; + * - int[]: [I + * - Object m(String s, double d): (Ljava/lang/String;D)Ljava/lang/Object; + */ + final def descriptor = toString + + /** + * @return 0 for void, 2 for long and double, 1 otherwise + */ + final def size: Int = this match { + case UNIT => 0 + case LONG | DOUBLE => 2 + case _ => 1 + } + + final def isPrimitive: Boolean = this.isInstanceOf[PrimitiveBType] + final def isRef: Boolean = this.isInstanceOf[RefBType] + final def isArray: Boolean = this.isInstanceOf[ArrayBType] + final def isClass: Boolean = this.isInstanceOf[ClassBType] + final def isMethod: Boolean = this.isInstanceOf[MethodBType] + + final def isNonVoidPrimitiveType = isPrimitive && this != UNIT + + final def isNullType = this == srNullRef + final def isNothingType = this == srNothingRef + + final def isBoxed = this.isClass && boxedClasses(this.asClassBType) + + final def isIntSizedType = this == BOOL || this == CHAR || this == BYTE || + this == SHORT || this == INT + final def isIntegralType = this == INT || this == BYTE || this == LONG || + this == CHAR || this == SHORT + final def isRealType = this == FLOAT || this == DOUBLE + final def isNumericType = isIntegralType || isRealType + final def isWideType = size == 2 + + /* + * Subtype check `this <:< other` on BTypes that takes into account the JVM built-in numeric + * promotions (e.g. BYTE to INT). Its operation can be visualized more easily in terms of the + * Java bytecode type hierarchy. 
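+   * For instance, per the cases below: BYTE conforms to INT and to LONG (sub-int values are accepted
+   * where INT or LONG is expected) and an array type conforms to ObjectRef, jlCloneableRef and
+   * jiSerializableRef, but INT does not conform to LONG: outside the sub-int cases, a primitive only
+   * conforms to itself.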
+ */ + final def conformsTo(other: BType): Boolean = { + assert(isRef || isPrimitive, s"conformsTo cannot handle $this") + assert(other.isRef || other.isPrimitive, s"conformsTo cannot handle $other") + + this match { + case ArrayBType(component) => + if (other == ObjectRef || other == jlCloneableRef || other == jiSerializableRef) true + else other match { + case ArrayBType(otherComponoent) => component.conformsTo(otherComponoent) + case _ => false + } + + case classType: ClassBType => + if (isBoxed) { + if (other.isBoxed) this == other + else if (other == ObjectRef) true + else other match { + case otherClassType: ClassBType => classType.isSubtypeOf(otherClassType) // e.g., java/lang/Double conforms to java/lang/Number + case _ => false + } + } else if (isNullType) { + if (other.isNothingType) false + else if (other.isPrimitive) false + else true // Null conforms to all classes (except Nothing) and arrays. + } else if (isNothingType) { + true + } else other match { + case otherClassType: ClassBType => classType.isSubtypeOf(otherClassType) + // case ArrayBType(_) => this.isNullType // documentation only, because `if (isNullType)` above covers this case + case _ => + // isNothingType || // documentation only, because `if (isNothingType)` above covers this case + false + } + + case UNIT => + other == UNIT + case BOOL | BYTE | SHORT | CHAR => + this == other || other == INT || other == LONG // TODO Actually, BOOL does NOT conform to LONG. Even with adapt(). + case _ => + assert(isPrimitive && other.isPrimitive, s"Expected primitive types $this - $other") + this == other + } + } + + /** + * Compute the upper bound of two types. + * Takes promotions of numeric primitives into account. + */ + final def maxType(other: BType): BType = this match { + case pt: PrimitiveBType => pt.maxValueType(other) + + case _: ArrayBType | _: ClassBType => + if (isNothingType) return other + if (other.isNothingType) return this + if (this == other) return this + + assert(other.isRef, s"Cannot compute maxType: $this, $other") + // Approximate `lub`. The common type of two references is always ObjectReference. + ObjectRef + } + + /** + * See documentation of [[typedOpcode]]. + * The numbers are taken from asm.Type.VOID_TYPE ff., the values are those shifted by << 8. + */ + private def loadStoreOpcodeOffset: Int = this match { + case UNIT | INT => 0 + case BOOL | BYTE => 5 + case CHAR => 6 + case SHORT => 7 + case FLOAT => 2 + case LONG => 1 + case DOUBLE => 3 + case _ => 4 + } + + /** + * See documentation of [[typedOpcode]]. + * The numbers are taken from asm.Type.VOID_TYPE ff., the values are those shifted by << 16. + */ + private def typedOpcodeOffset: Int = this match { + case UNIT => 5 + case BOOL | CHAR | BYTE | SHORT | INT => 0 + case FLOAT => 2 + case LONG => 1 + case DOUBLE => 3 + case _ => 4 + } + + /** + * Some JVM opcodes have typed variants. This method returns the correct opcode according to + * the type. + * + * @param opcode A JVM instruction opcode. This opcode must be one of ILOAD, ISTORE, IALOAD, + * IASTORE, IADD, ISUB, IMUL, IDIV, IREM, INEG, ISHL, ISHR, IUSHR, IAND, IOR + * IXOR and IRETURN. + * @return The opcode adapted to this java type. For example, if this type is `float` and + * `opcode` is `IRETURN`, this method returns `FRETURN`. + */ + final def typedOpcode(opcode: Int): Int = { + if (opcode == asm.Opcodes.IALOAD || opcode == asm.Opcodes.IASTORE) + opcode + loadStoreOpcodeOffset + else + opcode + typedOpcodeOffset + } + + /** + * The asm.Type corresponding to this BType. 
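+   * (For primitives this is the corresponding asm.Type constant, e.g. INT maps to asm.Type.INT_TYPE;
+   *  class, array and method types go through asm.Type.getObjectType / asm.Type.getMethodType, see the
+   *  note below.)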
+ * + * Note about asm.Type.getObjectType (*): For class types, the method expects the internal + * name, i.e. without the surrounding 'L' and ';'. For array types on the other hand, the + * method expects a full descriptor, for example "[Ljava/lang/String;". + * + * See method asm.Type.getType that creates a asm.Type from a type descriptor + * - for an OBJECT type, the 'L' and ';' are not part of the range of the created Type + * - for an ARRAY type, the full descriptor is part of the range + */ + def toASMType: asm.Type = this match { + case UNIT => asm.Type.VOID_TYPE + case BOOL => asm.Type.BOOLEAN_TYPE + case CHAR => asm.Type.CHAR_TYPE + case BYTE => asm.Type.BYTE_TYPE + case SHORT => asm.Type.SHORT_TYPE + case INT => asm.Type.INT_TYPE + case FLOAT => asm.Type.FLOAT_TYPE + case LONG => asm.Type.LONG_TYPE + case DOUBLE => asm.Type.DOUBLE_TYPE + case ClassBType(internalName) => asm.Type.getObjectType(internalName) // see (*) above + case a: ArrayBType => asm.Type.getObjectType(a.descriptor) + case m: MethodBType => asm.Type.getMethodType(m.descriptor) + } + + def asRefBType : RefBType = this.asInstanceOf[RefBType] + def asArrayBType : ArrayBType = this.asInstanceOf[ArrayBType] + def asClassBType : ClassBType = this.asInstanceOf[ClassBType] + def asPrimitiveBType : PrimitiveBType = this.asInstanceOf[PrimitiveBType] + } + + sealed trait PrimitiveBType extends BType { + + /** + * The upper bound of two primitive types. The `other` type has to be either a primitive + * type or Nothing. + * + * The maxValueType of (Char, Byte) and of (Char, Short) is Int, to encompass the negative + * values of Byte and Short. See ticket #2087. + */ + final def maxValueType(other: BType): BType = { + + def uncomparable: Nothing = throw new AssertionError(s"Cannot compute maxValueType: $this, $other") + + if (!other.isPrimitive && !other.isNothingType) uncomparable + + if (other.isNothingType) return this + if (this == other) return this + + this match { + case BYTE => + if (other == CHAR) INT + else if (other.isNumericType) other + else uncomparable + + case SHORT => + other match { + case BYTE => SHORT + case CHAR => INT + case INT | LONG | FLOAT | DOUBLE => other + case _ => uncomparable + } + + case CHAR => + other match { + case BYTE | SHORT => INT + case INT | LONG | FLOAT | DOUBLE => other + case _ => uncomparable + } + + case INT => + other match { + case BYTE | SHORT | CHAR => INT + case LONG | FLOAT | DOUBLE => other + case _ => uncomparable + } + + case LONG => + other match { + case INT | BYTE | LONG | CHAR | SHORT => LONG + case DOUBLE => DOUBLE + case FLOAT => FLOAT + case _ => uncomparable + } + + case FLOAT => + if (other == DOUBLE) DOUBLE + else if (other.isNumericType) FLOAT + else uncomparable + + case DOUBLE => + if (other.isNumericType) DOUBLE + else uncomparable + + case UNIT | BOOL => uncomparable + } + } + } + + case object UNIT extends PrimitiveBType + case object BOOL extends PrimitiveBType + case object CHAR extends PrimitiveBType + case object BYTE extends PrimitiveBType + case object SHORT extends PrimitiveBType + case object INT extends PrimitiveBType + case object FLOAT extends PrimitiveBType + case object LONG extends PrimitiveBType + case object DOUBLE extends PrimitiveBType + + sealed trait RefBType extends BType { + /** + * The class or array type of this reference type. Used for ANEWARRAY, MULTIANEWARRAY, + * INSTANCEOF and CHECKCAST instructions. Also used for emitting invokevirtual calls to + * (a: Array[T]).clone() for any T, see genApply. 
+ * + * In contrast to the descriptor, this string does not contain the surrounding 'L' and ';' for + * class types, for example "java/lang/String". + * However, for array types, the full descriptor is used, for example "[Ljava/lang/String;". + * + * This can be verified for example using javap or ASMifier. + */ + def classOrArrayType: String = this match { + case ClassBType(internalName) => internalName + case a: ArrayBType => a.descriptor + } + } + + /** + * InnerClass and EnclosingMethod attributes (EnclosingMethod is displayed as OUTERCLASS in asm). + * + * In this summary, "class" means "class or interface". + * + * JLS: http://docs.oracle.com/javase/specs/jls/se8/html/index.html + * JVMS: http://docs.oracle.com/javase/specs/jvms/se8/html/index.html + * + * Terminology + * ----------- + * + * - Nested class (JLS 8): class whose declaration occurs within the body of another class + * + * - Top-level class (JLS 8): non-nested class + * + * - Inner class (JLS 8.1.3): nested class that is not (explicitly or implicitly) static + * + * - Member class (JLS 8.5): class directly enclosed in the body of a class (and not, for + * example, defined in a method). Member classes cannot be anonymous. May be static. + * + * - Local class (JLS 14.3): nested, non-anonymous class that is not a member of a class + * - cannot be static (therefore they are "inner" classes) + * - can be defined in a method, a constructor or in an initializer block + * + * - Initializer block (JLS 8.6 / 8.7): block of statements in a java class + * - static initializer: executed before constructor body + * - instance initializer: executed when class is initialized (instance creation, static + * field access, ...) + * + * - A static nested class can be defined as + * - a static member class (explicitly static), or + * - a member class of an interface (implicitly static) + * - local classes are never static, even if they are defined in a static method. + * + * Note: it is NOT the case that all inner classes (non-static) have an outer pointer. Example: + * class C { static void foo { class D {} } } + * The class D is an inner class (non-static), but javac does not add an outer pointer to it. + * + * InnerClass + * ---------- + * + * The JVMS 4.7.6 requires an entry for every class mentioned in a CONSTANT_Class_info in the + * constant pool (CP) that is not a member of a package (JLS 7.1). + * + * The JLS 13.1, points 9. / 10. requires: a class must reference (in the CP) + * - its immediately enclosing class + * - all of its member classes + * - all local and anonymous classes that are referenced (or declared) elsewhere (method, + * constructor, initializer block, field initializer) + * + * In a comment, the 4.7.6 spec says: this implies an entry in the InnerClass attribute for + * - All enclosing classes (except the outermost, which is top-level) + * - My comment: not sure how this is implied, below (*) a Java counter-example. + * In any case, the Java compiler seems to add all enclosing classes, even if they are not + * otherwise mentioned in the CP. So we should do the same. + * - All nested classes (including anonymous and local, but not transitively) + * + * Fields in the InnerClass entries: + * - inner class: the (nested) class C we are talking about + * - outer class: the class of which C is a member. Has to be null for non-members, i.e. for + * local and anonymous classes. 
NOTE: this co-incides with the presence of an + * EnclosingMethod attribute (see below) + * - inner name: A string with the simple name of the inner class. Null for anonymous classes. + * - flags: access property flags, details in JVMS, table in 4.7.6. Static flag: see + * discussion below. + * + * + * Note 1: when a nested class is present in the InnerClass attribute, all of its enclosing + * classes have to be present as well (by the rules above). Example: + * + * class Outer { class I1 { class I2 { } } } + * class User { Outer.I1.I2 foo() { } } + * + * The return type "Outer.I1.I2" puts "Outer$I1$I2" in the CP, therefore the class is added to the + * InnerClass attribute. For this entry, the "outer class" field will be "Outer$I1". This in turn + * adds "Outer$I1" to the CP, which requires adding that class to the InnerClass attribute. + * (For local / anonymous classes this would not be the case, since the "outer class" attribute + * would be empty. However, no class (other than the enclosing class) can refer to them, as they + * have no name.) + * + * In the current implementation of the Scala compiler, when adding a class to the InnerClass + * attribute, all of its enclosing classes will be added as well. Javac seems to do the same, + * see (*). + * + * + * Note 2: If a class name is mentioned only in a CONSTANT_Utf8_info, but not in a + * CONSTANT_Class_info, the JVMS does not require an entry in the InnerClass attribute. However, + * the Java compiler seems to add such classes anyway. For example, when using an annotation, the + * annotation class is stored as a CONSTANT_Utf8_info in the CP: + * + * @O.Ann void foo() { } + * + * adds "const #13 = Asciz LO$Ann;;" in the constant pool. The "RuntimeInvisibleAnnotations" + * attribute refers to that constant pool entry. Even though there is no other reference to + * `O.Ann`, the java compiler adds an entry for that class to the InnerClass attribute (which + * entails adding a CONSTANT_Class_info for the class). + * + * + * + * EnclosingMethod + * --------------- + * + * JVMS 4.7.7: the attribute must be present "if and only if it represents a local class + * or an anonymous class" (i.e. not for member classes). + * + * The attribute is mis-named, it should be called "EnclosingClass". It has to be defined for all + * local and anonymous classes, no matter if there is an enclosing method or not. Accordingly, the + * "class" field (see below) must be always defined, while the "method" field may be null. + * + * NOTE: When a EnclosingMethod attribute is required (local and anonymous classes), the "outer" + * field in the InnerClass table must be null. + * + * Fields: + * - class: the enclosing class + * - method: the enclosing method (or constructor). Null if the class is not enclosed by a + * method, i.e. for + * - local or anonymous classes defined in (static or non-static) initializer blocks + * - anonymous classes defined in initializer blocks or field initializers + * + * Note: the field is required for anonymous classes defined within local variable + * initializers (within a method), Java example below (**). + * + * For local and anonymous classes in initializer blocks or field initializers, and + * class-level anonymous classes, the scala compiler sets the "method" field to null. 
+ * + * + * (*) + * public class Test { + * void foo() { + * class Foo1 { + * // constructor statement block + * { + * class Foo2 { + * class Foo3 { } + * } + * } + * } + * } + * } + * + * The class file Test$1Foo1$1Foo2$Foo3 has no reference to the class Test$1Foo1, however it + * still contains an InnerClass attribute for Test$1Foo1. + * Maybe this is just because the Java compiler follows the JVMS comment ("InnerClasses + * information for each enclosing class"). + * + * + * (**) + * void foo() { + * // anonymous class defined in local variable initializer expression. + * Runnable x = true ? (new Runnable() { + * public void run() { return; } + * }) : null; + * } + * + * The EnclosingMethod attribute of the anonymous class mentions "foo" in the "method" field. + * + * + * Java Compatibility + * ------------------ + * + * In the InnerClass entry for classes in top-level modules, the "outer class" is emitted as the + * mirror class (or the existing companion class), i.e. C1 is nested in T (not T$). + * For classes nested in a nested object, the "outer class" is the module class: C2 is nested in T$N$ + * object T { + * class C1 + * object N { class C2 } + * } + * + * Reason: java compat. It's a "best effort" "solution". If you want to use "C1" from Java, you + * can write "T.C1", and the Java compiler will translate that to the classfile T$C1. + * + * If we would emit the "outer class" of C1 as "T$", then in Java you'd need to write "T$.C1" + * because the java compiler looks at the InnerClass attribute to find if an inner class exists. + * However, the Java compiler would then translate the '.' to '$' and you'd get the class name + * "T$$C1". This class file obviously does not exist. + * + * Directly using the encoded class name "T$C1" in Java does not work: since the classfile + * describes a nested class, the Java compiler hides it from the classpath and will report + * "cannot find symbol T$C1". This means that the class T.N.C2 cannot be referenced from a + * Java source file in any way. + * + * + * STATIC flag + * ----------- + * + * Java: static member classes have the static flag in the InnerClass attribute, for example B in + * class A { static class B { } } + * + * The spec is not very clear about when the static flag should be emitted. It says: "Marked or + * implicitly static in source." + * + * The presence of the static flag does NOT coincide with the absence of an "outer" field in the + * class. The java compiler never puts the static flag for local classes, even if they don't have + * an outer pointer: + * + * class A { + * void f() { class B {} } + * static void g() { calss C {} } + * } + * + * B has an outer pointer, C doesn't. Both B and C are NOT marked static in the InnerClass table. + * + * It seems sane to follow the same principle in the Scala compiler. So: + * + * package p + * object O1 { + * class C1 // static inner class + * object O2 { // static inner module + * def f = { + * class C2 { // non-static inner class, even though there's no outer pointer + * class C3 // non-static, has an outer pointer + * } + * } + * } + * } + * + * Mirror Classes + * -------------- + * + * TODO: innerclass attributes on mirror class + */ + + /** + * A ClassBType represents a class or interface type. The necessary information to build a + * ClassBType is extracted from compiler symbols and types, see BTypesFromSymbols. + * + * The `offset` and `length` fields are used to represent the internal name of the class. They + * are indices into some character array. 
The internal name can be obtained through the method + * `internalNameString`, which is abstract in this component. Name creation is assumed to be + * hash-consed, so if two ClassBTypes have the same internal name, they NEED to have the same + * `offset` and `length`. + * + * The actual implementation in subclass BTypesFromSymbols uses the global `chrs` array from the + * name table. This representation is efficient because the JVM class name is obtained through + * `classSymbol.javaBinaryName`. This already adds the necessary string to the `chrs` array, + * so it makes sense to reuse the same name table in the backend. + * + * ClassBType is not a case class because we want a custom equals method, and because the + * extractor extracts the internalName, which is what you typically need. + */ + final class ClassBType(val internalName: String) extends RefBType { + /** + * Write-once variable allows initializing a cyclic graph of infos. This is required for + * nested classes. Example: for the definition `class A { class B }` we have + * + * B.info.nestedInfo.outerClass == A + * A.info.memberClasses contains B + */ + private var _info: ClassInfo = null + + def info: ClassInfo = { + assert(_info != null, s"ClassBType.info not yet assigned: $this") + _info + } + + def info_=(i: ClassInfo): Unit = { + assert(_info == null, s"Cannot set ClassBType.info multiple times: $this") + _info = i + checkInfoConsistency() + } + + classBTypeFromInternalNameMap(internalName) = this + + private def checkInfoConsistency(): Unit = { + // we assert some properties. however, some of the linked ClassBType (members, superClass, + // interfaces) may not yet have an `_info` (initialization of cyclic structures). so we do a + // best-effort verification. + def ifInit(c: ClassBType)(p: ClassBType => Boolean): Boolean = c._info == null || p(c) + + def isJLO(t: ClassBType) = t.internalName == "java/lang/Object" + + assert(!ClassBType.isInternalPhantomType(internalName), s"Cannot create ClassBType for phantom type $this") + + assert( + if (info.superClass.isEmpty) { isJLO(this) || (DottyBackendInterface.isCompilingPrimitive && ClassBType.hasNoSuper(internalName)) } + else if (isInterface) isJLO(info.superClass.get) + else !isJLO(this) && ifInit(info.superClass.get)(!_.isInterface), + s"Invalid superClass in $this: ${info.superClass}" + ) + assert( + info.interfaces.forall(c => ifInit(c)(_.isInterface)), + s"Invalid interfaces in $this: ${info.interfaces}" + ) + + assert(info.memberClasses.forall(c => ifInit(c)(_.isNestedClass)), info.memberClasses) + } + + /** + * The internal name of a class is the string returned by java.lang.Class.getName, with all '.' + * replaced by '/'. For example "java/lang/String". 
+ */ + //def internalName: String = internalNameString(offset, length) + + /** + * @return The class name without the package prefix + */ + def simpleName: String = internalName.split("/").last + + def isInterface = (info.flags & asm.Opcodes.ACC_INTERFACE) != 0 + + def superClassesTransitive: List[ClassBType] = info.superClass match { + case None => Nil + case Some(sc) => sc :: sc.superClassesTransitive + } + + def isNestedClass = info.nestedInfo.isDefined + + def enclosingNestedClassesChain: List[ClassBType] = + if (isNestedClass) this :: info.nestedInfo.get.enclosingClass.enclosingNestedClassesChain + else Nil + + def innerClassAttributeEntry: Option[InnerClassEntry] = info.nestedInfo map { + case NestedInfo(_, outerName, innerName, isStaticNestedClass) => + import GenBCodeOps.addFlagIf + InnerClassEntry( + internalName, + outerName.orNull, + innerName.orNull, + info.flags.addFlagIf(isStaticNestedClass, asm.Opcodes.ACC_STATIC) + & ClassBType.INNER_CLASSES_FLAGS + ) + } + + def isSubtypeOf(other: ClassBType): Boolean = { + if (this == other) return true + + if (isInterface) { + if (other == ObjectRef) return true // interfaces conform to Object + if (!other.isInterface) return false // this is an interface, the other is some class other than object. interfaces cannot extend classes, so the result is false. + // else: this and other are both interfaces. continue to (*) + } else { + val sc = info.superClass + if (sc.isDefined && sc.get.isSubtypeOf(other)) return true // the superclass of this class conforms to other + if (!other.isInterface) return false // this and other are both classes, and the superclass of this does not conform + // else: this is a class, the other is an interface. continue to (*) + } + + // (*) check if some interface of this class conforms to other. + info.interfaces.exists(_.isSubtypeOf(other)) + } + + /** + * Finding the least upper bound in agreement with the bytecode verifier + * Background: + * http://gallium.inria.fr/~xleroy/publi/bytecode-verification-JAR.pdf + * http://comments.gmane.org/gmane.comp.java.vm.languages/2293 + * https://issues.scala-lang.org/browse/SI-3872 + */ + def jvmWiseLUB(other: ClassBType): ClassBType = { + def isNotNullOrNothing(c: ClassBType) = !c.isNullType && !c.isNothingType + assert(isNotNullOrNothing(this) && isNotNullOrNothing(other), s"jvmWiseLub for null or nothing: $this - $other") + + val res: ClassBType = (this.isInterface, other.isInterface) match { + case (true, true) => + // exercised by test/files/run/t4761.scala + if (other.isSubtypeOf(this)) this + else if (this.isSubtypeOf(other)) other + else ObjectRef + + case (true, false) => + if (other.isSubtypeOf(this)) this else ObjectRef + + case (false, true) => + if (this.isSubtypeOf(other)) other else ObjectRef + + case _ => + // TODO @lry I don't really understand the reasoning here. + // Both this and other are classes. The code takes (transitively) all superclasses and + // finds the first common one. 
+ // MOST LIKELY the answer can be found here, see the comments and links by Miguel: + // - https://issues.scala-lang.org/browse/SI-3872 + firstCommonSuffix(this :: this.superClassesTransitive, other :: other.superClassesTransitive) + } + + assert(isNotNullOrNothing(res), s"jvmWiseLub computed: $res") + res + } + + private def firstCommonSuffix(as: List[ClassBType], bs: List[ClassBType]): ClassBType = { + var chainA = as + var chainB = bs + var fcs: ClassBType = null + while { + if (chainB contains chainA.head) fcs = chainA.head + else if (chainA contains chainB.head) fcs = chainB.head + else { + chainA = chainA.tail + chainB = chainB.tail + } + fcs == null + } do () + fcs + } + + /** + * Custom equals / hashCode: we only compare the name (offset / length) + */ + override def equals(o: Any): Boolean = (this eq o.asInstanceOf[Object]) || (o match { + case c: ClassBType @unchecked => c.internalName == this.internalName + case _ => false + }) + + override def hashCode: Int = { + import scala.runtime.Statics + var acc: Int = -889275714 + acc = Statics.mix(acc, internalName.hashCode) + Statics.finalizeHash(acc, 2) + } + } + + object ClassBType { + /** + * Pattern matching on a ClassBType extracts the `internalName` of the class. + */ + def unapply(c: ClassBType): Some[String] = Some(c.internalName) + + /** + * Valid flags for InnerClass attribute entry. + * See http://docs.oracle.com/javase/specs/jvms/se8/html/jvms-4.html#jvms-4.7.6 + */ + private val INNER_CLASSES_FLAGS = { + asm.Opcodes.ACC_PUBLIC | asm.Opcodes.ACC_PRIVATE | asm.Opcodes.ACC_PROTECTED | + asm.Opcodes.ACC_STATIC | asm.Opcodes.ACC_FINAL | asm.Opcodes.ACC_INTERFACE | + asm.Opcodes.ACC_ABSTRACT | asm.Opcodes.ACC_SYNTHETIC | asm.Opcodes.ACC_ANNOTATION | + asm.Opcodes.ACC_ENUM + } + + // Primitive classes have no super class. A ClassBType for those is only created when + // they are actually being compiled (e.g., when compiling scala/Boolean.scala). + private val hasNoSuper = Set( + "scala/Unit", + "scala/Boolean", + "scala/Char", + "scala/Byte", + "scala/Short", + "scala/Int", + "scala/Float", + "scala/Long", + "scala/Double" + ) + + private val isInternalPhantomType = Set( + "scala/Null", + "scala/Nothing" + ) + } + + /** + * The type info for a class. Used for symboltable-independent subtype checks in the backend. + * + * @param superClass The super class, not defined for class java/lang/Object. + * @param interfaces All transitively implemented interfaces, except for those inherited + * through the superclass. + * @param flags The java flags, obtained through `javaFlags`. Used also to derive + * the flags for InnerClass entries. + * @param memberClasses Classes nested in this class. Those need to be added to the + * InnerClass table, see the InnerClass spec summary above. + * @param nestedInfo If this describes a nested class, information for the InnerClass table. + */ + case class ClassInfo(superClass: Option[ClassBType], interfaces: List[ClassBType], flags: Int, + memberClasses: List[ClassBType], nestedInfo: Option[NestedInfo]) + + /** + * Information required to add a class to an InnerClass table. + * The spec summary above explains what information is required for the InnerClass entry. + * + * @param enclosingClass The enclosing class, if it is also nested. When adding a class + * to the InnerClass table, enclosing nested classes are also added. + * @param outerName The outerName field in the InnerClass entry, may be None. + * @param innerName The innerName field, may be None. 
+ * @param isStaticNestedClass True if this is a static nested class (not inner class) (*) + * + * (*) Note that the STATIC flag in ClassInfo.flags, obtained through javaFlags(classSym), is not + * correct for the InnerClass entry, see javaFlags. The static flag in the InnerClass describes + * a source-level propety: if the class is in a static context (does not have an outer pointer). + * This is checked when building the NestedInfo. + */ + case class NestedInfo(enclosingClass: ClassBType, + outerName: Option[String], + innerName: Option[String], + isStaticNestedClass: Boolean) + + /** + * This class holds the data for an entry in the InnerClass table. See the InnerClass summary + * above in this file. + * + * There's some overlap with the class NestedInfo, but it's not exactly the same and cleaner to + * keep separate. + * @param name The internal name of the class. + * @param outerName The internal name of the outer class, may be null. + * @param innerName The simple name of the inner class, may be null. + * @param flags The flags for this class in the InnerClass entry. + */ + case class InnerClassEntry(name: String, outerName: String, innerName: String, flags: Int) + + case class ArrayBType(componentType: BType) extends RefBType { + def dimension: Int = componentType match { + case a: ArrayBType => 1 + a.dimension + case _ => 1 + } + + def elementType: BType = componentType match { + case a: ArrayBType => a.elementType + case t => t + } + } + + case class MethodBType(argumentTypes: List[BType], returnType: BType) extends BType + + /* Some definitions that are required for the implementation of BTypes. They are abstract because + * initializing them requires information from types / symbols, which is not accessible here in + * BTypes. + * + * They are defs (not vals) because they are implemented using vars (see comment on CoreBTypes). + */ + + /** + * Just a named pair, used in CoreBTypes.asmBoxTo/asmUnboxTo. + */ + /*final*/ case class MethodNameAndType(name: String, methodType: MethodBType) +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BTypesFromSymbols.scala b/tests/pos-with-compiler-cc/backend/jvm/BTypesFromSymbols.scala new file mode 100644 index 000000000000..54dafe6f0032 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BTypesFromSymbols.scala @@ -0,0 +1,348 @@ +package dotty.tools +package backend +package jvm + +import scala.tools.asm +import scala.annotation.threadUnsafe +import scala.collection.mutable +import scala.collection.mutable.Clearable + +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.core.Phases._ +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.core.Phases.Phase +import dotty.tools.dotc.transform.SymUtils._ +import dotty.tools.dotc.core.StdNames + +/** + * This class mainly contains the method classBTypeFromSymbol, which extracts the necessary + * information from a symbol and its type to create the corresponding ClassBType. It requires + * access to the compiler (global parameter). + * + * The mixin CoreBTypes defines core BTypes that are used in the backend. Building these BTypes + * uses classBTypeFromSymbol, hence requires access to the compiler (global). + * + * BTypesFromSymbols extends BTypes because the implementation of BTypes requires access to some + * of the core btypes. They are declared in BTypes as abstract members. Note that BTypes does + * not have access to the compiler instance. 
+ */ +class BTypesFromSymbols[I <: DottyBackendInterface](val int: I) extends BTypes { + import int.{_, given} + import DottyBackendInterface.{symExtensions, _} + + lazy val TransientAttr = requiredClass[scala.transient] + lazy val VolatileAttr = requiredClass[scala.volatile] + + val bCodeAsmCommon: BCodeAsmCommon[int.type ] = new BCodeAsmCommon(int) + import bCodeAsmCommon._ + + // Why the proxy, see documentation of class [[CoreBTypes]]. + val coreBTypes: CoreBTypesProxy[this.type] = new CoreBTypesProxy[this.type](this) + import coreBTypes._ + + final def intializeCoreBTypes(): Unit = { + coreBTypes.setBTypes(new CoreBTypes[this.type](this)) + } + + private[this] val perRunCaches: Caches = new Caches { + def newAnyRefMap[K <: AnyRef, V](): mutable.AnyRefMap[K, V] = new mutable.AnyRefMap[K, V]() + def newWeakMap[K, V](): mutable.WeakHashMap[K, V] = new mutable.WeakHashMap[K, V]() + def recordCache[T <: Clearable](cache: T): T = cache + def newMap[K, V](): mutable.HashMap[K, V] = new mutable.HashMap[K, V]() + def newSet[K](): mutable.Set[K] = new mutable.HashSet[K] + } + + // TODO remove abstraction + private abstract class Caches { + def recordCache[T <: Clearable](cache: T): T + def newWeakMap[K, V](): collection.mutable.WeakHashMap[K, V] + def newMap[K, V](): collection.mutable.HashMap[K, V] + def newSet[K](): collection.mutable.Set[K] + def newAnyRefMap[K <: AnyRef, V](): collection.mutable.AnyRefMap[K, V] + } + + @threadUnsafe protected lazy val classBTypeFromInternalNameMap = { + perRunCaches.recordCache(collection.concurrent.TrieMap.empty[String, ClassBType]) + } + + /** + * Cache for the method classBTypeFromSymbol. + */ + @threadUnsafe private lazy val convertedClasses = perRunCaches.newMap[Symbol, ClassBType]() + + /** + * The ClassBType for a class symbol `sym`. + */ + final def classBTypeFromSymbol(classSym: Symbol): ClassBType = { + assert(classSym != NoSymbol, "Cannot create ClassBType from NoSymbol") + assert(classSym.isClass, s"Cannot create ClassBType from non-class symbol $classSym") + assert( + (!primitiveTypeMap.contains(classSym) || isCompilingPrimitive) && + (classSym != defn.NothingClass && classSym != defn.NullClass), + s"Cannot create ClassBType for special class symbol ${classSym.showFullName}") + + convertedClasses.getOrElse(classSym, { + val internalName = classSym.javaBinaryName + // We first create and add the ClassBType to the hash map before computing its info. This + // allows initializing cylic dependencies, see the comment on variable ClassBType._info. 
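+      // For example (illustration, echoing the ClassBType._info comment): for `class A { class B }`,
+      // A's ClassInfo refers to B's ClassBType via memberClasses while B's NestedInfo refers back to A,
+      // so both entries must already be in the map before either info is computed.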
+ val classBType = new ClassBType(internalName) + convertedClasses(classSym) = classBType + setClassInfo(classSym, classBType) + }) + } + + final def mirrorClassBTypeFromSymbol(moduleClassSym: Symbol): ClassBType = { + assert(moduleClassSym.isTopLevelModuleClass, s"not a top-level module class: $moduleClassSym") + val internalName = moduleClassSym.javaBinaryName.stripSuffix(StdNames.str.MODULE_SUFFIX) + val bType = ClassBType(internalName) + bType.info = ClassInfo( + superClass = Some(ObjectRef), + interfaces = Nil, + flags = asm.Opcodes.ACC_SUPER | asm.Opcodes.ACC_PUBLIC | asm.Opcodes.ACC_FINAL, + memberClasses = getMemberClasses(moduleClassSym).map(classBTypeFromSymbol), + nestedInfo = None + ) + bType + } + + private def setClassInfo(classSym: Symbol, classBType: ClassBType): ClassBType = { + val superClassSym: Symbol = { + val t = classSym.asClass.superClass + if (t.exists) t + else if (classSym.is(ModuleClass)) { + // workaround #371 + + println(s"Warning: mocking up superclass for $classSym") + defn.ObjectClass + } + else t + } + assert( + if (classSym == defn.ObjectClass) + superClassSym == NoSymbol + else if (classSym.isInterface) + superClassSym == defn.ObjectClass + else + // A ClassBType for a primitive class (scala.Boolean et al) is only created when compiling these classes. + ((superClassSym != NoSymbol) && !superClassSym.isInterface) || (isCompilingPrimitive && primitiveTypeMap.contains(classSym)), + s"Bad superClass for $classSym: $superClassSym" + ) + val superClass = if (superClassSym == NoSymbol) None + else Some(classBTypeFromSymbol(superClassSym)) + + /** + * All interfaces implemented by a class, except for those inherited through the superclass. + * Redundant interfaces are removed unless there is a super call to them. + */ + extension (sym: Symbol) def superInterfaces: List[Symbol] = { + val directlyInheritedTraits = sym.directlyInheritedTraits + val directlyInheritedTraitsSet = directlyInheritedTraits.toSet + val allBaseClasses = directlyInheritedTraits.iterator.flatMap(_.asClass.baseClasses.drop(1)).toSet + val superCalls = superCallsMap.getOrElse(sym, Set.empty) + val additional = (superCalls -- directlyInheritedTraitsSet).filter(_.is(Trait)) +// if (additional.nonEmpty) +// println(s"$fullName: adding supertraits $additional") + directlyInheritedTraits.filter(t => !allBaseClasses(t) || superCalls(t)) ++ additional + } + + val interfaces = classSym.superInterfaces.map(classBTypeFromSymbol) + + val flags = javaFlags(classSym) + + /* The InnerClass table of a class C must contain all nested classes of C, even if they are only + * declared but not otherwise referenced in C (from the bytecode or a method / field signature). + * We collect them here. + */ + val nestedClassSymbols = { + // The lambdalift phase lifts all nested classes to the enclosing class, so if we collect + // member classes right after lambdalift, we obtain all nested classes, including local and + // anonymous ones. + val nestedClasses = getNestedClasses(classSym) + + // If this is a top-level class, and it has a companion object, the member classes of the + // companion are added as members of the class. For example: + // class C { } + // object C { + // class D + // def f = { class E } + // } + // The class D is added as a member of class C. The reason is that the InnerClass attribute + // for D will containt class "C" and NOT the module class "C$" as the outer class of D. + // This is done by buildNestedInfo, the reason is Java compatibility, see comment in BTypes. 
+      // For consistency, the InnerClass entry for D needs to be present in C - to Java it looks
+      // like D is a member of C, not C$.
+      val linkedClass = classSym.linkedClass
+      val companionModuleMembers = {
+        if (classSym.linkedClass.isTopLevelModuleClass) getMemberClasses(classSym.linkedClass)
+        else Nil
+      }
+
+      nestedClasses ++ companionModuleMembers
+    }
+
+    /**
+     * For nested java classes, the scala compiler creates both a class and a module (and therefore
+     * a module class) symbol. For example, in `class A { class B {} }`, the nestedClassSymbols
+     * for A contain both the class B and the module class B.
+     * Here we get rid of the module class B, making sure that the class B is present.
+     */
+    val nestedClassSymbolsNoJavaModuleClasses = nestedClassSymbols.filter(s => {
+      if (s.is(JavaDefined) && s.is(ModuleClass)) {
+        // We could also search in nestedClassSymbols for s.linkedClassOfClass, but sometimes that
+        // returns NoSymbol, so it doesn't work.
+        val nb = nestedClassSymbols.count(mc => mc.name == s.name && mc.owner == s.owner)
+        // this assertion is specific to how ScalaC works. It doesn't apply to dotty, as in dotty there will be B & B$
+        // assert(nb == 2, s"Java member module without member class: $s - $nestedClassSymbols")
+        false
+      } else true
+    })
+
+    val memberClasses = nestedClassSymbolsNoJavaModuleClasses.map(classBTypeFromSymbol)
+
+    val nestedInfo = buildNestedInfo(classSym)
+
+    classBType.info = ClassInfo(superClass, interfaces, flags, memberClasses, nestedInfo)
+    classBType
+  }
+
+  /** For currently compiled classes: All locally defined classes including local classes.
+   *  The empty list for classes that are not currently compiled.
+   */
+  private def getNestedClasses(sym: Symbol): List[Symbol] = definedClasses(sym, flattenPhase)
+
+  /** For currently compiled classes: All classes that are declared as members of this class
+   *  (but not inherited ones). The empty list for classes that are not currently compiled.
+   */
+  private def getMemberClasses(sym: Symbol): List[Symbol] = definedClasses(sym, lambdaLiftPhase)
+
+  private def definedClasses(sym: Symbol, phase: Phase) =
+    if (sym.isDefinedInCurrentRun)
+      atPhase(phase) {
+        toDenot(sym).info.decls.filter(sym => sym.isClass && !sym.isEffectivelyErased)
+      }
+    else Nil
+
+  private def buildNestedInfo(innerClassSym: Symbol): Option[NestedInfo] = {
+    assert(innerClassSym.isClass, s"Cannot build NestedInfo for non-class symbol $innerClassSym")
+
+    val isNested = !innerClassSym.originalOwner.originalLexicallyEnclosingClass.is(PackageClass)
+    if (!isNested) None
+    else {
+      // See the comment in BTypes about when a class is marked static in the InnerClass table.
+      val isStaticNestedClass = innerClassSym.originalOwner.originalLexicallyEnclosingClass.isOriginallyStaticOwner
+
+      // After lambdalift (which is where we are), the rawowner field contains the enclosing class.
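+      // For example (illustration only): for a local class defined inside a method of class A,
+      // `owner.enclosingClass` at this point yields A, which is the enclosing class recorded in
+      // the InnerClass / EnclosingMethod attributes.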
+ val enclosingClassSym = { + if (innerClassSym.isClass) { + atPhase(flattenPhase.prev) { + toDenot(innerClassSym).owner.enclosingClass + } + } + else atPhase(flattenPhase.prev)(innerClassSym.enclosingClass) + } //todo is handled specially for JavaDefined symbols in scalac + + val enclosingClass: ClassBType = classBTypeFromSymbol(enclosingClassSym) + + val outerName: Option[String] = { + if (isAnonymousOrLocalClass(innerClassSym)) { + None + } else { + val outerName = innerClassSym.originalOwner.originalLexicallyEnclosingClass.javaBinaryName + def dropModule(str: String): String = + if (!str.isEmpty && str.last == '$') str.take(str.length - 1) else str + // Java compatibility. See the big comment in BTypes that summarizes the InnerClass spec. + val outerNameModule = + if (innerClassSym.originalOwner.originalLexicallyEnclosingClass.isTopLevelModuleClass) dropModule(outerName) + else outerName + Some(outerNameModule.toString) + } + } + + val innerName: Option[String] = { + if (innerClassSym.isAnonymousClass || innerClassSym.isAnonymousFunction) None + else { + val original = innerClassSym.initial + Some(atPhase(original.validFor.phaseId)(innerClassSym.name).mangledString) // moduleSuffix for module classes + } + } + + Some(NestedInfo(enclosingClass, outerName, innerName, isStaticNestedClass)) + } + } + + /** + * This is basically a re-implementation of sym.isStaticOwner, but using the originalOwner chain. + * + * The problem is that we are interested in a source-level property. Various phases changed the + * symbol's properties in the meantime, mostly lambdalift modified (destructively) the owner. + * Therefore, `sym.isStatic` is not what we want. For example, in + * object T { def f { object U } } + * the owner of U is T, so UModuleClass.isStatic is true. Phase travel does not help here. + */ + extension (sym: Symbol) + private def isOriginallyStaticOwner: Boolean = + sym.is(PackageClass) || sym.is(ModuleClass) && sym.originalOwner.originalLexicallyEnclosingClass.isOriginallyStaticOwner + + /** + * Return the Java modifiers for the given symbol. + * Java modifiers for classes: + * - public, abstract, final, strictfp (not used) + * for interfaces: + * - the same as for classes, without 'final' + * for fields: + * - public, private (*) + * - static, final + * for methods: + * - the same as for fields, plus: + * - abstract, synchronized (not used), strictfp (not used), native (not used) + * for all: + * - deprecated + * + * (*) protected cannot be used, since inner classes 'see' protected members, + * and they would fail verification after lifted. + */ + final def javaFlags(sym: Symbol): Int = { + + // Classes are always emitted as public. This matches the behavior of Scala 2 + // and is necessary for object deserialization to work properly, otherwise + // ModuleSerializationProxy may fail with an accessiblity error (see + // tests/run/serialize.scala and https://github.com/typelevel/cats-effect/pull/2360). 
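+    // For example (illustration): a private method keeps ACC_PRIVATE via privateFlag below,
+    // whereas a private nested class is still emitted with ACC_PUBLIC, because privateFlag
+    // excludes class symbols.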
+ val privateFlag = !sym.isClass && (sym.is(Private) || (sym.isPrimaryConstructor && sym.owner.isTopLevelModuleClass)) + + val finalFlag = sym.is(Final) && !toDenot(sym).isClassConstructor && !sym.is(Mutable, butNot = Accessor) && !sym.enclosingClass.is(Trait) + + import asm.Opcodes._ + import GenBCodeOps.addFlagIf + 0 .addFlagIf(privateFlag, ACC_PRIVATE) + .addFlagIf(!privateFlag, ACC_PUBLIC) + .addFlagIf(sym.is(Deferred) || sym.isOneOf(AbstractOrTrait), ACC_ABSTRACT) + .addFlagIf(sym.isInterface, ACC_INTERFACE) + .addFlagIf(finalFlag + // Primitives are "abstract final" to prohibit instantiation + // without having to provide any implementations, but that is an + // illegal combination of modifiers at the bytecode level so + // suppress final if abstract if present. + && !sym.isOneOf(AbstractOrTrait) + // Mixin forwarders are bridges and can be final, but final bridges confuse some frameworks + && !sym.is(Bridge), ACC_FINAL) + .addFlagIf(sym.isStaticMember, ACC_STATIC) + .addFlagIf(sym.is(Bridge), ACC_BRIDGE | ACC_SYNTHETIC) + .addFlagIf(sym.is(Artifact), ACC_SYNTHETIC) + .addFlagIf(sym.isClass && !sym.isInterface, ACC_SUPER) + .addFlagIf(sym.isAllOf(JavaEnumTrait), ACC_ENUM) + .addFlagIf(sym.is(JavaVarargs), ACC_VARARGS) + .addFlagIf(sym.is(Synchronized), ACC_SYNCHRONIZED) + .addFlagIf(sym.isDeprecated, ACC_DEPRECATED) + .addFlagIf(sym.is(Enum), ACC_ENUM) + } + + def javaFieldFlags(sym: Symbol) = { + import asm.Opcodes._ + import GenBCodeOps.addFlagIf + javaFlags(sym) + .addFlagIf(sym.hasAnnotation(TransientAttr), ACC_TRANSIENT) + .addFlagIf(sym.hasAnnotation(VolatileAttr), ACC_VOLATILE) + .addFlagIf(!sym.is(Mutable), ACC_FINAL) + } +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/BytecodeWriters.scala b/tests/pos-with-compiler-cc/backend/jvm/BytecodeWriters.scala new file mode 100644 index 000000000000..551d4f8d809e --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/BytecodeWriters.scala @@ -0,0 +1,147 @@ +package dotty.tools +package backend +package jvm + +import scala.language.unsafeNulls + +import java.io.{ DataOutputStream, FileOutputStream, IOException, File as JFile } +import java.nio.channels.ClosedByInterruptException +import dotty.tools.io._ +import dotty.tools.dotc.report + + +/** Can't output a file due to the state of the file system. */ +class FileConflictException(msg: String, val file: AbstractFile) extends IOException(msg) + +/** For the last mile: turning generated bytecode in memory into + * something you can use. Has implementations for writing to class + * files, jars, and disassembled/javap output. 
+ */
+trait BytecodeWriters {
+  val int: DottyBackendInterface
+  import int.{_, given}
+
+  /**
+   * @param clsName cls.getName
+   */
+  def getFile(base: AbstractFile, clsName: String, suffix: String): AbstractFile = {
+    def ensureDirectory(dir: AbstractFile): AbstractFile =
+      if (dir.isDirectory) dir
+      else throw new FileConflictException(s"${base.path}/$clsName$suffix: ${dir.path} is not a directory", dir)
+    var dir = base
+    val pathParts = clsName.split("[./]").toList
+    for (part <- pathParts.init) dir = ensureDirectory(dir) subdirectoryNamed part
+    ensureDirectory(dir) fileNamed pathParts.last + suffix
+  }
+  def getFile(sym: Symbol, clsName: String, suffix: String): AbstractFile =
+    getFile(outputDirectory, clsName, suffix)
+
+  def factoryNonJarBytecodeWriter(): BytecodeWriter = {
+    val emitAsmp = None
+    val doDump = dumpClasses
+    (emitAsmp.isDefined, doDump.isDefined) match {
+      case (false, false) => new ClassBytecodeWriter { }
+      case (false, true ) => new ClassBytecodeWriter with DumpBytecodeWriter { }
+      case (true, false) => new ClassBytecodeWriter with AsmpBytecodeWriter
+      case (true, true ) => new ClassBytecodeWriter with AsmpBytecodeWriter with DumpBytecodeWriter { }
+    }
+  }
+
+  trait BytecodeWriter {
+    def writeClass(label: String, jclassName: String, jclassBytes: Array[Byte], outfile: AbstractFile): Unit
+    def close(): Unit = ()
+  }
+
+  class DirectToJarfileWriter(jfile: JFile) extends BytecodeWriter {
+    val writer = new Jar(jfile).jarWriter()
+
+    def writeClass(label: String, jclassName: String, jclassBytes: Array[Byte], outfile: AbstractFile): Unit = {
+      assert(outfile == null,
+        "The outfile formal param is there just because ClassBytecodeWriter overrides this method and uses it.")
+      val path = jclassName + ".class"
+      val out = writer.newOutputStream(path)
+
+      try out.write(jclassBytes, 0, jclassBytes.length)
+      finally out.flush()
+
+      report.informProgress("added " + label + path + " to jar")
+    }
+    override def close() = writer.close()
+  }
+
+  /*
+   * The ASM textual representation for bytecode overcomes disadvantages of javap output in three areas:
+   * (a) pickle dingbats undecipherable to the naked eye;
+   * (b) two constant pools, while having identical contents, are displayed differently due to physical layout.
+   * (c) stack maps (classfile version 50 and up) are displayed in encoded form by javap,
+   *     their expansion by ASM is more readable.
+   *
+   * */
+  trait AsmpBytecodeWriter extends BytecodeWriter {
+    import scala.tools.asm
+
+    private val baseDir = new Directory(None.get).createDirectory() // FIXME missing directory
+    // new needed here since resolution of user-defined `apply` methods is ambiguous, and we want the constructor.
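+
+    // For example (illustration, hypothetical class name): the bytecode of a class
+    // `com/example/Foo` ends up in <baseDir>/com/example/Foo.asmp, see writeClass below.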
+ + private def emitAsmp(jclassBytes: Array[Byte], asmpFile: dotty.tools.io.File): Unit = { + val pw = asmpFile.printWriter() + try { + val cnode = new ClassNode1() + val cr = new asm.ClassReader(jclassBytes) + cr.accept(cnode, 0) + val trace = new scala.tools.asm.util.TraceClassVisitor(new java.io.PrintWriter(new java.io.StringWriter())) + cnode.accept(trace) + trace.p.print(pw) + } + finally pw.close() + } + + abstract override def writeClass(label: String, jclassName: String, jclassBytes: Array[Byte], outfile: AbstractFile): Unit = { + super.writeClass(label, jclassName, jclassBytes, outfile) + + val segments = jclassName.split("[./]") + val asmpFile = segments.foldLeft(baseDir: Path)(_ / _).changeExtension("asmp").toFile + + asmpFile.parent.createDirectory() + emitAsmp(jclassBytes, asmpFile) + } + } + + trait ClassBytecodeWriter extends BytecodeWriter { + def writeClass(label: String, jclassName: String, jclassBytes: Array[Byte], outfile: AbstractFile): Unit = { + assert(outfile != null, + "Precisely this override requires its invoker to hand out a non-null AbstractFile.") + val outstream = new DataOutputStream(outfile.bufferedOutput) + + try outstream.write(jclassBytes, 0, jclassBytes.length) + catch case ex: ClosedByInterruptException => + try + outfile.delete() // don't leave an empty or half-written classfile around after an interrupt + catch + case _: Throwable => + throw ex + finally outstream.close() + report.informProgress("wrote '" + label + "' to " + outfile) + } + } + + trait DumpBytecodeWriter extends BytecodeWriter { + val baseDir = Directory(dumpClasses.get).createDirectory() + + abstract override def writeClass(label: String, jclassName: String, jclassBytes: Array[Byte], outfile: AbstractFile): Unit = { + super.writeClass(label, jclassName, jclassBytes, outfile) + + val pathName = jclassName + val dumpFile = pathName.split("[./]").foldLeft(baseDir: Path) (_ / _).changeExtension("class").toFile + dumpFile.parent.createDirectory() + val outstream = new DataOutputStream(new FileOutputStream(dumpFile.path)) + + try outstream.write(jclassBytes, 0, jclassBytes.length) + finally outstream.close() + } + } + + private def dumpClasses: Option[String] = + if (ctx.settings.Ydumpclasses.isDefault) None + else Some(ctx.settings.Ydumpclasses.value) +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/ClassNode1.java b/tests/pos-with-compiler-cc/backend/jvm/ClassNode1.java new file mode 100644 index 000000000000..c5594ae3dea6 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/ClassNode1.java @@ -0,0 +1,39 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package dotty.tools.backend.jvm; + +import scala.tools.asm.MethodVisitor; +import scala.tools.asm.Opcodes; +import scala.tools.asm.tree.ClassNode; +import scala.tools.asm.tree.MethodNode; + +/** + * A subclass of {@link ClassNode} to customize the representation of + * label nodes with {@link LabelNode1}. 
+ */ +public class ClassNode1 extends ClassNode { + public ClassNode1() { + this(Opcodes.ASM6); + } + + public ClassNode1(int api) { + super(api); + } + + @Override + public MethodVisitor visitMethod(int access, String name, String descriptor, String signature, String[] exceptions) { + MethodNode method = new MethodNode1(access, name, descriptor, signature, exceptions); + methods.add(method); + return method; + } +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/CollectSuperCalls.scala b/tests/pos-with-compiler-cc/backend/jvm/CollectSuperCalls.scala new file mode 100644 index 000000000000..299c1c75d6cf --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/CollectSuperCalls.scala @@ -0,0 +1,48 @@ +package dotty.tools.backend.jvm + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.core.Phases._ +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.core.Flags.Trait +import dotty.tools.dotc.transform.MegaPhase.MiniPhase + +/** Collect all super calls to trait members. + * + * For each super reference to trait member, register a call from the current class to the + * owner of the referenced member. + * + * This information is used to know if it is safe to remove a redundant mixin class. + * A redundant mixin class is one that is implemented by another mixin class. As the + * methods in a redundant mixin class could be implemented with a default abstract method, + * the redundant mixin class could be required as a parent by the JVM. + */ +class CollectSuperCalls extends MiniPhase { + import tpd._ + + override def phaseName: String = CollectSuperCalls.name + + override def description: String = CollectSuperCalls.description + + override def transformSelect(tree: Select)(using Context): Tree = { + tree.qualifier match { + case sup: Super => + if (tree.symbol.owner.is(Trait)) + registerSuperCall(ctx.owner.enclosingClass.asClass, tree.symbol.owner.asClass) + case _ => + } + tree + } + + private def registerSuperCall(sym: ClassSymbol, calls: ClassSymbol)(using Context) = { + genBCodePhase match { + case genBCodePhase: GenBCode => + genBCodePhase.registerSuperCall(sym, calls) + case _ => + } + } +} + +object CollectSuperCalls: + val name: String = "collectSuperCalls" + val description: String = "find classes that are called with super" diff --git a/tests/pos-with-compiler-cc/backend/jvm/CoreBTypes.scala b/tests/pos-with-compiler-cc/backend/jvm/CoreBTypes.scala new file mode 100644 index 000000000000..d5fce3f53627 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/CoreBTypes.scala @@ -0,0 +1,294 @@ +package dotty.tools +package backend +package jvm + + +import dotty.tools.dotc.core.Symbols._ +import dotty.tools.dotc.transform.Erasure +import scala.tools.asm.{Handle, Opcodes} +import dotty.tools.dotc.core.StdNames + +/** + * Core BTypes and some other definitions. The initialization of these definitions requies access + * to symbols / types (global). + * + * The symbols used to initialize the ClassBTypes may change from one compiler run to the next. To + * make sure the definitions are consistent with the symbols in the current run, the + * `intializeCoreBTypes` method in BTypesFromSymbols creates a new instance of CoreBTypes in each + * compiler run. + * + * The class BTypesFromSymbols does not directly reference CoreBTypes, but CoreBTypesProxy. The + * reason is that having a `var bTypes: CoreBTypes` would not allow `import bTypes._`. 
Instead, the
+ * proxy class holds a `CoreBTypes` in a variable field and forwards to this instance.
+ *
+ * The definitions in `CoreBTypes` need to be lazy vals to break an initialization cycle. When
+ * creating a new instance to assign to the proxy, the `classBTypeFromSymbol` invoked in the
+ * constructor will actually go through the proxy. The lazy vals make sure the instance is assigned
+ * in the proxy before the fields are initialized.
+ *
+ * Note: if we did not re-create the core BTypes on each compiler run, BType.classBTypeFromInternalNameMap
+ * could not be a perRunCache anymore: the classes defined here need to be in that map; they are
+ * added when the ClassBTypes are created. The per run cache removes them, so they would be missing
+ * in the second run.
+ */
+class CoreBTypes[BTFS <: BTypesFromSymbols[_ <: DottyBackendInterface]](val bTypes: BTFS) {
+  import bTypes._
+  import int.given
+  import DottyBackendInterface._
+
+  //import global._
+  //import rootMirror.{requiredClass, getClassIfDefined}
+  //import definitions._
+
+  /**
+   * Maps primitive types to their corresponding PrimitiveBType. The map is defined lexically above
+   * the first use of `classBTypeFromSymbol` because that method looks at the map.
+   */
+  lazy val primitiveTypeMap: Map[Symbol, PrimitiveBType] = Map(
+    defn.UnitClass    -> UNIT,
+    defn.BooleanClass -> BOOL,
+    defn.CharClass    -> CHAR,
+    defn.ByteClass    -> BYTE,
+    defn.ShortClass   -> SHORT,
+    defn.IntClass     -> INT,
+    defn.LongClass    -> LONG,
+    defn.FloatClass   -> FLOAT,
+    defn.DoubleClass  -> DOUBLE
+  )
+
+  private lazy val BOXED_UNIT    : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Void])
+  private lazy val BOXED_BOOLEAN : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Boolean])
+  private lazy val BOXED_BYTE    : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Byte])
+  private lazy val BOXED_SHORT   : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Short])
+  private lazy val BOXED_CHAR    : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Character])
+  private lazy val BOXED_INT     : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Integer])
+  private lazy val BOXED_LONG    : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Long])
+  private lazy val BOXED_FLOAT   : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Float])
+  private lazy val BOXED_DOUBLE  : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Double])
+
+  /**
+   * Map from primitive types to their boxed class type. Useful when pushing class literals onto the
+   * operand stack (ldc instruction taking a class literal), see genConstant.
+   */
+  lazy val boxedClassOfPrimitive: Map[PrimitiveBType, ClassBType] = Map(
+    UNIT   -> BOXED_UNIT,
+    BOOL   -> BOXED_BOOLEAN,
+    BYTE   -> BOXED_BYTE,
+    SHORT  -> BOXED_SHORT,
+    CHAR   -> BOXED_CHAR,
+    INT    -> BOXED_INT,
+    LONG   -> BOXED_LONG,
+    FLOAT  -> BOXED_FLOAT,
+    DOUBLE -> BOXED_DOUBLE
+  )
+
+  lazy val boxedClasses: Set[ClassBType] = boxedClassOfPrimitive.values.toSet
+
+  /**
+   * Maps the method symbol for a box method to the boxed type of the result. For example, the
+   * method symbol for `Byte.box()` is mapped to the ClassBType `java/lang/Byte`.
+   */
+  lazy val boxResultType: Map[Symbol, ClassBType] = {
+    val boxMethods = defn.ScalaValueClasses().map{x => // @darkdimius Are you sure this should be a def?
+ (x, Erasure.Boxing.boxMethod(x.asClass)) + }.toMap + for ((valueClassSym, boxMethodSym) <- boxMethods) + yield boxMethodSym -> boxedClassOfPrimitive(primitiveTypeMap(valueClassSym)) + } + + /** + * Maps the method symbol for an unbox method to the primitive type of the result. + * For example, the method symbol for `Byte.unbox()`) is mapped to the PrimitiveBType BYTE. */ + lazy val unboxResultType: Map[Symbol, PrimitiveBType] = { + val unboxMethods: Map[Symbol, Symbol] = + defn.ScalaValueClasses().map(x => (x, Erasure.Boxing.unboxMethod(x.asClass))).toMap + for ((valueClassSym, unboxMethodSym) <- unboxMethods) + yield unboxMethodSym -> primitiveTypeMap(valueClassSym) + } + + /* + * srNothingRef and srNullRef exist at run-time only. They are the bytecode-level manifestation (in + * method signatures only) of what shows up as NothingClass (scala.Nothing) resp. NullClass (scala.Null) in Scala ASTs. + * + * Therefore, when srNothingRef or srNullRef are to be emitted, a mapping is needed: the internal + * names of NothingClass and NullClass can't be emitted as-is. + * TODO @lry Once there's a 2.11.3 starr, use the commented argument list. The current starr crashes on the type literal `scala.runtime.Nothing$` + */ + lazy val srNothingRef : ClassBType = classBTypeFromSymbol(requiredClass("scala.runtime.Nothing$")) // (requiredClass[scala.runtime.Nothing$]) + lazy val srNullRef : ClassBType = classBTypeFromSymbol(requiredClass("scala.runtime.Null$")) // (requiredClass[scala.runtime.Null$]) + + lazy val ObjectRef : ClassBType = classBTypeFromSymbol(defn.ObjectClass) + lazy val StringRef : ClassBType = classBTypeFromSymbol(defn.StringClass) + lazy val jlStringBuilderRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.StringBuilder]) + lazy val jlStringBufferRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.StringBuffer]) + lazy val jlCharSequenceRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.CharSequence]) + lazy val jlClassRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.Class[_]]) + lazy val jlThrowableRef : ClassBType = classBTypeFromSymbol(defn.ThrowableClass) + lazy val jlCloneableRef : ClassBType = classBTypeFromSymbol(defn.JavaCloneableClass) // java/lang/Cloneable + lazy val jioSerializableRef : ClassBType = classBTypeFromSymbol(requiredClass[java.io.Serializable]) // java/io/Serializable + lazy val jlClassCastExceptionRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.ClassCastException]) // java/lang/ClassCastException + lazy val jlIllegalArgExceptionRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.IllegalArgumentException]) + lazy val jliSerializedLambdaRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.SerializedLambda]) + + lazy val srBoxesRunTimeRef: ClassBType = classBTypeFromSymbol(requiredClass[scala.runtime.BoxesRunTime]) + + private lazy val jliCallSiteRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.CallSite]) + private lazy val jliLambdaMetafactoryRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.LambdaMetafactory]) + private lazy val jliMethodHandleRef : ClassBType = classBTypeFromSymbol(defn.MethodHandleClass) + private lazy val jliMethodHandlesLookupRef : ClassBType = classBTypeFromSymbol(defn.MethodHandlesLookupClass) + private lazy val jliMethodTypeRef : ClassBType = classBTypeFromSymbol(requiredClass[java.lang.invoke.MethodType]) + private lazy val jliStringConcatFactoryRef : ClassBType = 
classBTypeFromSymbol(requiredClass("java.lang.invoke.StringConcatFactory")) // since JDK 9 + private lazy val srLambdaDeserialize : ClassBType = classBTypeFromSymbol(requiredClass[scala.runtime.LambdaDeserialize]) + + lazy val jliLambdaMetaFactoryMetafactoryHandle: Handle = new Handle( + Opcodes.H_INVOKESTATIC, + jliLambdaMetafactoryRef.internalName, + "metafactory", + MethodBType( + List(jliMethodHandlesLookupRef, StringRef, jliMethodTypeRef, jliMethodTypeRef, jliMethodHandleRef, jliMethodTypeRef), + jliCallSiteRef + ).descriptor, + /* itf = */ false) + + lazy val jliLambdaMetaFactoryAltMetafactoryHandle: Handle = new Handle( + Opcodes.H_INVOKESTATIC, + jliLambdaMetafactoryRef.internalName, + "altMetafactory", + MethodBType( + List(jliMethodHandlesLookupRef, StringRef, jliMethodTypeRef, ArrayBType(ObjectRef)), + jliCallSiteRef + ).descriptor, + /* itf = */ false) + + lazy val jliLambdaDeserializeBootstrapHandle: Handle = new Handle( + Opcodes.H_INVOKESTATIC, + srLambdaDeserialize.internalName, + "bootstrap", + MethodBType( + List(jliMethodHandlesLookupRef, StringRef, jliMethodTypeRef, ArrayBType(jliMethodHandleRef)), + jliCallSiteRef + ).descriptor, + /* itf = */ false) + + lazy val jliStringConcatFactoryMakeConcatWithConstantsHandle = new Handle( + Opcodes.H_INVOKESTATIC, + jliStringConcatFactoryRef.internalName, + "makeConcatWithConstants", + MethodBType( + List(jliMethodHandlesLookupRef, StringRef, jliMethodTypeRef, StringRef, ArrayBType(ObjectRef)), + jliCallSiteRef + ).descriptor, + /* itf = */ false) + + /** + * Methods in scala.runtime.BoxesRuntime + */ + lazy val asmBoxTo : Map[BType, MethodNameAndType] = Map( + BOOL -> MethodNameAndType("boxToBoolean", MethodBType(List(BOOL), BOXED_BOOLEAN)), + BYTE -> MethodNameAndType("boxToByte", MethodBType(List(BYTE), BOXED_BYTE)), + CHAR -> MethodNameAndType("boxToCharacter", MethodBType(List(CHAR), BOXED_CHAR)), + SHORT -> MethodNameAndType("boxToShort", MethodBType(List(SHORT), BOXED_SHORT)), + INT -> MethodNameAndType("boxToInteger", MethodBType(List(INT), BOXED_INT)), + LONG -> MethodNameAndType("boxToLong", MethodBType(List(LONG), BOXED_LONG)), + FLOAT -> MethodNameAndType("boxToFloat", MethodBType(List(FLOAT), BOXED_FLOAT)), + DOUBLE -> MethodNameAndType("boxToDouble", MethodBType(List(DOUBLE), BOXED_DOUBLE)) + ) + + lazy val asmUnboxTo: Map[BType, MethodNameAndType] = Map( + BOOL -> MethodNameAndType("unboxToBoolean", MethodBType(List(ObjectRef), BOOL)), + BYTE -> MethodNameAndType("unboxToByte", MethodBType(List(ObjectRef), BYTE)), + CHAR -> MethodNameAndType("unboxToChar", MethodBType(List(ObjectRef), CHAR)), + SHORT -> MethodNameAndType("unboxToShort", MethodBType(List(ObjectRef), SHORT)), + INT -> MethodNameAndType("unboxToInt", MethodBType(List(ObjectRef), INT)), + LONG -> MethodNameAndType("unboxToLong", MethodBType(List(ObjectRef), LONG)), + FLOAT -> MethodNameAndType("unboxToFloat", MethodBType(List(ObjectRef), FLOAT)), + DOUBLE -> MethodNameAndType("unboxToDouble", MethodBType(List(ObjectRef), DOUBLE)) + ) + + lazy val typeOfArrayOp: Map[Int, BType] = { + import dotty.tools.backend.ScalaPrimitivesOps._ + Map( + (List(ZARRAY_LENGTH, ZARRAY_GET, ZARRAY_SET) map (_ -> BOOL)) ++ + (List(BARRAY_LENGTH, BARRAY_GET, BARRAY_SET) map (_ -> BYTE)) ++ + (List(SARRAY_LENGTH, SARRAY_GET, SARRAY_SET) map (_ -> SHORT)) ++ + (List(CARRAY_LENGTH, CARRAY_GET, CARRAY_SET) map (_ -> CHAR)) ++ + (List(IARRAY_LENGTH, IARRAY_GET, IARRAY_SET) map (_ -> INT)) ++ + (List(LARRAY_LENGTH, LARRAY_GET, LARRAY_SET) map (_ -> LONG)) ++ + 
(List(FARRAY_LENGTH, FARRAY_GET, FARRAY_SET) map (_ -> FLOAT)) ++ + (List(DARRAY_LENGTH, DARRAY_GET, DARRAY_SET) map (_ -> DOUBLE)) ++ + (List(OARRAY_LENGTH, OARRAY_GET, OARRAY_SET) map (_ -> ObjectRef)) : _* + ) + } +} + +/** + * This trait make some core BTypes availalbe that don't depend on a Global instance. Some core + * BTypes are required to be accessible in the BTypes trait, which does not have access to Global. + * + * BTypes cannot refer to CoreBTypesProxy because some of its members depend on global, for example + * the type Symbol in + * def primitiveTypeMap: Map[Symbol, PrimitiveBType] + */ +trait CoreBTypesProxyGlobalIndependent[BTS <: BTypes] { + val bTypes: BTS + import bTypes._ + + def boxedClasses: Set[ClassBType] + + def srNothingRef : ClassBType + def srNullRef : ClassBType + + def ObjectRef : ClassBType + def jlCloneableRef : ClassBType + def jiSerializableRef : ClassBType +} + +/** + * See comment in class [[CoreBTypes]]. + */ +final class CoreBTypesProxy[BTFS <: BTypesFromSymbols[_ <: DottyBackendInterface]](val bTypes: BTFS) extends CoreBTypesProxyGlobalIndependent[BTFS] { + import bTypes._ + + private var _coreBTypes: CoreBTypes[bTypes.type] = _ + def setBTypes(coreBTypes: CoreBTypes[BTFS]): Unit = { + _coreBTypes = coreBTypes.asInstanceOf[CoreBTypes[bTypes.type]] + } + + def primitiveTypeMap: Map[Symbol, PrimitiveBType] = _coreBTypes.primitiveTypeMap + + def boxedClasses: Set[ClassBType] = _coreBTypes.boxedClasses + + def boxedClassOfPrimitive: Map[PrimitiveBType, ClassBType] = _coreBTypes.boxedClassOfPrimitive + + def boxResultType: Map[Symbol, ClassBType] = _coreBTypes.boxResultType + + def unboxResultType: Map[Symbol, PrimitiveBType] = _coreBTypes.unboxResultType + + def srNothingRef : ClassBType = _coreBTypes.srNothingRef + def srNullRef : ClassBType = _coreBTypes.srNullRef + + def ObjectRef : ClassBType = _coreBTypes.ObjectRef + def StringRef : ClassBType = _coreBTypes.StringRef + def jlStringBuilderRef : ClassBType = _coreBTypes.jlStringBuilderRef + def jlStringBufferRef : ClassBType = _coreBTypes.jlStringBufferRef + def jlCharSequenceRef : ClassBType = _coreBTypes.jlCharSequenceRef + def jlClassRef : ClassBType = _coreBTypes.jlClassRef + def jlThrowableRef : ClassBType = _coreBTypes.jlThrowableRef + def jlCloneableRef : ClassBType = _coreBTypes.jlCloneableRef + def jiSerializableRef : ClassBType = _coreBTypes.jioSerializableRef + def jlClassCastExceptionRef : ClassBType = _coreBTypes.jlClassCastExceptionRef + def jlIllegalArgExceptionRef : ClassBType = _coreBTypes.jlIllegalArgExceptionRef + def jliSerializedLambdaRef : ClassBType = _coreBTypes.jliSerializedLambdaRef + + def srBoxesRuntimeRef: ClassBType = _coreBTypes.srBoxesRunTimeRef + + def jliLambdaMetaFactoryMetafactoryHandle : Handle = _coreBTypes.jliLambdaMetaFactoryMetafactoryHandle + def jliLambdaMetaFactoryAltMetafactoryHandle : Handle = _coreBTypes.jliLambdaMetaFactoryAltMetafactoryHandle + def jliLambdaDeserializeBootstrapHandle : Handle = _coreBTypes.jliLambdaDeserializeBootstrapHandle + def jliStringConcatFactoryMakeConcatWithConstantsHandle: Handle = _coreBTypes.jliStringConcatFactoryMakeConcatWithConstantsHandle + + def asmBoxTo : Map[BType, MethodNameAndType] = _coreBTypes.asmBoxTo + def asmUnboxTo: Map[BType, MethodNameAndType] = _coreBTypes.asmUnboxTo + + def typeOfArrayOp: Map[Int, BType] = _coreBTypes.typeOfArrayOp +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala b/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala new file mode 100644 
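For illustration, a minimal, self-contained sketch of the proxy-plus-lazy-vals pattern that the `CoreBTypes`/`CoreBTypesProxy` comment above describes. The names `Proxy`, `Core` and `classRef` are hypothetical stand-ins for `CoreBTypesProxy`, `CoreBTypes` and `classBTypeFromSymbol`; the point is only that the new instance's members must be lazy, because evaluating them goes back through a proxy that is wired up after the constructor returns.

```scala
object ProxyLazinessDemo {
  final class Proxy {
    private var core: Core = null
    def setCore(c: Core): Unit = core = c
    // Every lookup goes through the proxy, including lookups made by Core itself.
    def classRef(internalName: String): String = core.classRef(internalName)
  }

  final class Core(proxy: Proxy) {
    private val cache = scala.collection.mutable.Map.empty[String, String]
    def classRef(internalName: String): String =
      cache.getOrElseUpdate(internalName, "L" + internalName + ";")
    // Must be lazy: the right-hand side routes through `proxy`, which is wired up
    // only after `new Core(...)` returns. A strict `val` would hit a null proxy slot.
    lazy val ObjectRef: String = proxy.classRef("java/lang/Object")
  }

  def main(args: Array[String]): Unit = {
    val proxy = new Proxy
    val core  = new Core(proxy) // constructor completes without forcing ObjectRef
    proxy.setCore(core)         // wire the proxy up for this run
    println(core.ObjectRef)     // prints: Ljava/lang/Object;
  }
}
```

On each new run a fresh `Core` would be created and handed to the same long-lived proxy, mirroring how `setBTypes` installs a fresh `CoreBTypes` per compiler run so that per-run caches can be cleared safely.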
index 000000000000..a70d671f9c63 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/DottyBackendInterface.scala @@ -0,0 +1,204 @@ +package dotty.tools.backend.jvm + +import scala.language.unsafeNulls + +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.core.Flags._ +import dotty.tools.dotc.transform.SymUtils._ +import java.io.{File => _} + +import scala.reflect.ClassTag +import dotty.tools.io.AbstractFile +import dotty.tools.dotc.core._ +import Contexts._ +import Types._ +import Symbols._ +import Phases._ +import Decorators.em + +import dotty.tools.dotc.util.ReadOnlyMap +import dotty.tools.dotc.report + +import tpd._ + +import StdNames.nme +import NameKinds.LazyBitMapName +import Names.Name + +class DottyBackendInterface(val outputDirectory: AbstractFile, val superCallsMap: ReadOnlyMap[Symbol, Set[ClassSymbol]])(using val ctx: DetachedContext) { + + private val desugared = new java.util.IdentityHashMap[Type, tpd.Select] + + def cachedDesugarIdent(i: Ident): Option[tpd.Select] = { + var found = desugared.get(i.tpe) + if (found == null) { + tpd.desugarIdent(i) match { + case sel: tpd.Select => + desugared.put(i.tpe, sel) + found = sel + case _ => + } + } + if (found == null) None else Some(found) + } + + object DesugaredSelect extends DeconstructorCommon[tpd.Tree] { + + var desugared: tpd.Select = null + + override def isEmpty: Boolean = + desugared eq null + + def _1: Tree = desugared.qualifier + + def _2: Name = desugared.name + + override def unapply(s: tpd.Tree): this.type = { + s match { + case t: tpd.Select => desugared = t + case t: Ident => + cachedDesugarIdent(t) match { + case Some(t) => desugared = t + case None => desugared = null + } + case _ => desugared = null + } + + this + } + } + + object ArrayValue extends DeconstructorCommon[tpd.JavaSeqLiteral] { + def _1: Type = field.tpe match { + case JavaArrayType(elem) => elem + case _ => + report.error(em"JavaSeqArray with type ${field.tpe} reached backend: $field", ctx.source.atSpan(field.span)) + UnspecifiedErrorType + } + def _2: List[Tree] = field.elems + } + + abstract class DeconstructorCommon[T >: Null <: AnyRef] { + var field: T = null + def get: this.type = this + def isEmpty: Boolean = field eq null + def isDefined = !isEmpty + def unapply(s: T): this.type ={ + field = s + this + } + } + +} + +object DottyBackendInterface { + + private def erasureString(clazz: Class[_]): String = { + if (clazz.isArray) "Array[" + erasureString(clazz.getComponentType) + "]" + else clazz.getName + } + + def requiredClass(str: String)(using Context): ClassSymbol = + Symbols.requiredClass(str) + + def requiredClass[T](using evidence: ClassTag[T], ctx: Context): Symbol = + requiredClass(erasureString(evidence.runtimeClass)) + + def requiredModule(str: String)(using Context): Symbol = + Symbols.requiredModule(str) + + def requiredModule[T](using evidence: ClassTag[T], ctx: Context): Symbol = { + val moduleName = erasureString(evidence.runtimeClass) + val className = if (moduleName.endsWith("$")) moduleName.dropRight(1) else moduleName + requiredModule(className) + } + + given symExtensions: AnyRef with + extension (sym: Symbol) + + def isInterface(using Context): Boolean = (sym.is(PureInterface)) || sym.is(Trait) + + def isStaticConstructor(using Context): Boolean = (sym.isStaticMember && sym.isClassConstructor) || (sym.name eq nme.STATIC_CONSTRUCTOR) + + /** Fields of static modules will be static at backend + * + * Note that lazy val encoding assumes bitmap fields are non-static. 
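For illustration, a minimal, self-contained sketch of the allocation-free, name-based extractor idiom behind `DeconstructorCommon`/`DesugaredSelect` above: the extractor stores the match candidate in a single mutable slot and returns `this.type` with `isEmpty`/`get`/`_1`/`_2`, so no tuple or `Option` is allocated per match. `Node` and `Add` are hypothetical.

```scala
object NameBasedExtractorDemo {
  final case class Node(op: String, lhs: Int, rhs: Int)

  object Add {
    private var field: Node = null
    def isEmpty: Boolean = field eq null
    def get: this.type = this
    def _1: Int = field.lhs
    def _2: Int = field.rhs
    def unapply(n: Node): this.type = {
      field = if (n.op == "+") n else null
      this
    }
  }

  def main(args: Array[String]): Unit =
    Node("+", 1, 2) match {
      case Add(l, r) => println(s"sum = ${l + r}") // prints: sum = 3
      case _         => println("not an addition")
    }
}
```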
+ * See also `genPlainClass` in `BCodeSkelBuilder.scala`. + * + * TODO: remove the special handing of `LazyBitMapName` once we swtich to + * the new lazy val encoding: https://github.com/lampepfl/dotty/issues/7140 + */ + def isStaticModuleField(using Context): Boolean = + sym.owner.isStaticModuleClass && sym.isField && !sym.name.is(LazyBitMapName) + + def isStaticMember(using Context): Boolean = (sym ne NoSymbol) && + (sym.is(JavaStatic) || sym.isScalaStatic || sym.isStaticModuleField) + // guard against no sumbol cause this code is executed to select which call type(static\dynamic) to use to call array.clone + + /** + * True for module classes of modules that are top-level or owned only by objects. Module classes + * for such objects will get a MODULE$ flag and a corresponding static initializer. + */ + def isStaticModuleClass(using Context): Boolean = + (sym.is(Module)) && { + // scalac uses atPickling here + // this would not work if modules are created after pickling + // for example by specialization + val original = toDenot(sym).initial + val validity = original.validFor + atPhase(validity.phaseId) { + toDenot(sym).isStatic + } + } + + + + def originalLexicallyEnclosingClass(using Context): Symbol = + // used to populate the EnclosingMethod attribute. + // it is very tricky in presence of classes(and annonymous classes) defined inside supper calls. + if (sym.exists) { + val validity = toDenot(sym).initial.validFor + atPhase(validity.phaseId) { + toDenot(sym).lexicallyEnclosingClass + } + } else NoSymbol + + /** + * True for module classes of package level objects. The backend will generate a mirror class for + * such objects. + */ + def isTopLevelModuleClass(using Context): Boolean = + sym.is(ModuleClass) && + atPhase(flattenPhase) { + toDenot(sym).owner.is(PackageClass) + } + + def javaSimpleName(using Context): String = toDenot(sym).name.mangledString + def javaClassName(using Context): String = toDenot(sym).fullName.mangledString + def javaBinaryName(using Context): String = javaClassName.replace('.', '/') + + end extension + + end symExtensions + + private val primitiveCompilationUnits = Set( + "Unit.scala", + "Boolean.scala", + "Char.scala", + "Byte.scala", + "Short.scala", + "Int.scala", + "Float.scala", + "Long.scala", + "Double.scala" + ) + + /** + * True if the current compilation unit is of a primitive class (scala.Boolean et al). + * Used only in assertions. 
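For illustration, a minimal, self-contained sketch of the `given ... with` bundle of extension methods used by `symExtensions` above, on a hypothetical `Sym` type. Once the given is in scope (here it is a member of the enclosing object; callers of the real bundle import it), the extension methods become available on the receiver.

```scala
object ExtensionBundleDemo {
  final case class Sym(name: String, flags: Set[String])

  given symOps: AnyRef with
    extension (sym: Sym)
      def isInterface: Boolean = sym.flags("trait") || sym.flags("interface")
      def javaBinaryName: String = sym.name.replace('.', '/')
    end extension
  end symOps

  def main(args: Array[String]): Unit = {
    val s = Sym("scala.collection.Seq", Set("trait"))
    println(s.isInterface)    // true
    println(s.javaBinaryName) // scala/collection/Seq
  }
}
```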
+ */ + def isCompilingPrimitive(using Context) = { + primitiveCompilationUnits(ctx.compilationUnit.source.file.name) + } + +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/GenBCode.scala b/tests/pos-with-compiler-cc/backend/jvm/GenBCode.scala new file mode 100644 index 000000000000..71d007370fe7 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/GenBCode.scala @@ -0,0 +1,671 @@ +package dotty.tools.backend.jvm + +import scala.language.unsafeNulls + +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.ast.Trees.{PackageDef, ValDef} +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.core.Phases.Phase + +import scala.collection.mutable +import scala.jdk.CollectionConverters._ +import dotty.tools.dotc.transform.SymUtils._ +import dotty.tools.dotc.interfaces +import dotty.tools.dotc.report + +import dotty.tools.dotc.util.SourceFile +import java.util.Optional + +import dotty.tools.dotc.core._ +import dotty.tools.dotc.sbt.ExtractDependencies +import Contexts._ +import Phases._ +import Symbols._ +import Decorators.em + +import java.io.DataOutputStream +import java.nio.channels.ClosedByInterruptException + +import dotty.tools.tasty.{ TastyBuffer, TastyHeaderUnpickler } + +import scala.tools.asm +import scala.tools.asm.Handle +import scala.tools.asm.tree._ +import tpd._ +import StdNames._ +import dotty.tools.io._ +import scala.tools.asm.MethodTooLargeException +import scala.tools.asm.ClassTooLargeException + +class GenBCode extends Phase { + + override def phaseName: String = GenBCode.name + + override def description: String = GenBCode.description + + private val superCallsMap = new MutableSymbolMap[Set[ClassSymbol]] + def registerSuperCall(sym: Symbol, calls: ClassSymbol): Unit = { + val old = superCallsMap.getOrElse(sym, Set.empty) + superCallsMap.update(sym, old + calls) + } + + private val entryPoints = new mutable.HashSet[String]() + def registerEntryPoint(s: String): Unit = entryPoints += s + + private var myOutput: AbstractFile = _ + + private def outputDir(using Context): AbstractFile = { + if (myOutput eq null) + myOutput = ctx.settings.outputDir.value + myOutput + } + + private var myPrimitives: DottyPrimitives = null + + override def run(using Context): Unit = + inDetachedContext: ctx ?=> + if myPrimitives == null then myPrimitives = new DottyPrimitives(ctx) + new GenBCodePipeline( + DottyBackendInterface(outputDir, superCallsMap), + myPrimitives + ).run(ctx.compilationUnit.tpdTree) + + + override def runOn(units: List[CompilationUnit])(using Context): List[CompilationUnit] = { + outputDir match + case jar: JarArchive => + updateJarManifestWithMainClass(jar, entryPoints.toList) + case _ => + try super.runOn(units) + finally outputDir match { + case jar: JarArchive => + if (ctx.run.nn.suspendedUnits.nonEmpty) + // If we close the jar the next run will not be able to write on the jar. + // But if we do not close it we cannot use it as part of the macro classpath of the suspended files. + report.error("Can not suspend and output to a jar at the same time. 
See suspension with -Xprint-suspension.") + + jar.close() + case _ => + } + } + + private def updateJarManifestWithMainClass(jarArchive: JarArchive, entryPoints: List[String])(using Context): Unit = + val mainClass = Option.when(!ctx.settings.XmainClass.isDefault)(ctx.settings.XmainClass.value).orElse { + entryPoints match + case List(mainClass) => + Some(mainClass) + case Nil => + report.warning("No Main-Class designated or discovered.") + None + case mcs => + report.warning(s"No Main-Class due to multiple entry points:\n ${mcs.mkString("\n ")}") + None + } + mainClass.map { mc => + val manifest = Jar.WManifest() + manifest.mainClass = mc + val file = jarArchive.subdirectoryNamed("META-INF").fileNamed("MANIFEST.MF") + val os = file.output + manifest.underlying.write(os) + os.close() + } + end updateJarManifestWithMainClass +} + +object GenBCode { + val name: String = "genBCode" + val description: String = "generate JVM bytecode" +} + +class GenBCodePipeline(val int: DottyBackendInterface, val primitives: DottyPrimitives)(using DetachedContext) extends BCodeSyncAndTry { + import DottyBackendInterface.symExtensions + + private var tree: Tree = _ + + private val sourceFile: SourceFile = ctx.compilationUnit.source + + /** Convert a `dotty.tools.io.AbstractFile` into a + * `dotty.tools.dotc.interfaces.AbstractFile`. + */ + private def convertAbstractFile(absfile: dotty.tools.io.AbstractFile): interfaces.AbstractFile = + new interfaces.AbstractFile { + override def name = absfile.name + override def path = absfile.path + override def jfile = Optional.ofNullable(absfile.file) + } + + final class PlainClassBuilder(cunit: CompilationUnit) extends SyncAndTryBuilder(cunit) + +// class BCodePhase() { + + private var bytecodeWriter : BytecodeWriter = null + private var mirrorCodeGen : JMirrorBuilder = null + + /* ---------------- q1 ---------------- */ + + case class Item1(arrivalPos: Int, cd: TypeDef, cunit: CompilationUnit) { + def isPoison: Boolean = { arrivalPos == Int.MaxValue } + } + private val poison1 = Item1(Int.MaxValue, null, ctx.compilationUnit) + private val q1 = new java.util.LinkedList[Item1] + + /* ---------------- q2 ---------------- */ + + case class SubItem2(classNode: asm.tree.ClassNode, + file: dotty.tools.io.AbstractFile) + + case class Item2(arrivalPos: Int, + mirror: SubItem2, + plain: SubItem2) { + def isPoison: Boolean = { arrivalPos == Int.MaxValue } + } + + private val poison2 = Item2(Int.MaxValue, null, null) + private val q2 = new _root_.java.util.LinkedList[Item2] + + /* ---------------- q3 ---------------- */ + + /* + * An item of queue-3 (the last queue before serializing to disk) contains three of these + * (one for each of mirror and plain classes). 
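For illustration, a minimal, self-contained sketch of the poison-pill hand-off used between the queues above, reduced to a single producer/consumer stage. `Item` and the string payloads are hypothetical; the real `Item1`/`Item2`/`Item3` carry class definitions, ASM `ClassNode`s and byte arrays, and the sentinel is an item whose `arrivalPos` is `Int.MaxValue`.

```scala
object PoisonPillDemo {
  final case class Item(arrivalPos: Int, payload: String) {
    def isPoison: Boolean = arrivalPos == Int.MaxValue
  }
  private val poison = Item(Int.MaxValue, null)

  def main(args: Array[String]): Unit = {
    val q1 = new java.util.LinkedList[Item]
    List("A.scala", "B.scala", "C.scala").zipWithIndex.foreach {
      case (src, i) => q1.add(Item(i, src))
    }
    q1.add(poison)             // the producer signals "no more work"

    var item = q1.poll()
    while (!item.isPoison) {   // the consumer stops at the poison pill
      println(s"${item.arrivalPos}: ${item.payload}")
      item = q1.poll()
    }
  }
}
```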
+ * + * @param jclassName internal name of the class + * @param jclassBytes bytecode emitted for the class SubItem3 represents + */ + case class SubItem3( + jclassName: String, + jclassBytes: Array[Byte], + jclassFile: dotty.tools.io.AbstractFile + ) + + case class Item3(arrivalPos: Int, + mirror: SubItem3, + plain: SubItem3) { + + def isPoison: Boolean = { arrivalPos == Int.MaxValue } + } + private val i3comparator = new java.util.Comparator[Item3] { + override def compare(a: Item3, b: Item3) = { + if (a.arrivalPos < b.arrivalPos) -1 + else if (a.arrivalPos == b.arrivalPos) 0 + else 1 + } + } + private val poison3 = Item3(Int.MaxValue, null, null) + private val q3 = new java.util.PriorityQueue[Item3](1000, i3comparator) + + /* + * Pipeline that takes ClassDefs from queue-1, lowers them into an intermediate form, placing them on queue-2 + */ + class Worker1(needsOutFolder: Boolean) { + + private val lowerCaseNames = mutable.HashMap.empty[String, Symbol] + private def checkForCaseConflict(javaClassName: String, classSymbol: Symbol) = { + val lowerCaseName = javaClassName.toLowerCase + lowerCaseNames.get(lowerCaseName) match { + case None => + lowerCaseNames.put(lowerCaseName, classSymbol) + case Some(dupClassSym) => + // Order is not deterministic so we enforce lexicographic order between the duplicates for error-reporting + val (cl1, cl2) = + if (classSymbol.effectiveName.toString < dupClassSym.effectiveName.toString) (classSymbol, dupClassSym) + else (dupClassSym, classSymbol) + val same = classSymbol.effectiveName.toString == dupClassSym.effectiveName.toString + atPhase(typerPhase) { + if (same) + report.warning( // FIXME: This should really be an error, but then FromTasty tests fail + em"$cl1 and ${cl2.showLocated} produce classes that overwrite one another", cl1.sourcePos) + else + report.warning( + em"""$cl1 differs only in case from ${cl2.showLocated}. + |uch classes will overwrite one another on case-insensitive filesystems.""", cl1.sourcePos) + } + } + } + + def run(): Unit = { + while (true) { + val item = q1.poll + if (item.isPoison) { + q2 add poison2 + return + } + else { + try { /*withCurrentUnit(item.cunit)*/(visit(item)) } + catch { + case ex: InterruptedException => + throw ex + case ex: Throwable => + println(s"Error while emitting ${item.cunit.source.file.name}") + throw ex + } + } + } + } + + /* + * Checks for duplicate internal names case-insensitively, + * builds ASM ClassNodes for mirror and plain classes; + * enqueues them in queue-2. + * + */ + def visit(item: Item1): Boolean = { + val Item1(arrivalPos, cd, cunit) = item + val claszSymbol = cd.symbol + + // -------------- mirror class, if needed -------------- + val mirrorC = + if (claszSymbol.isTopLevelModuleClass) { + if (claszSymbol.companionClass == NoSymbol) { + mirrorCodeGen.genMirrorClass(claszSymbol, cunit) + } else { + report.log(s"No mirror class for module with linked class: ${claszSymbol.showFullName}") + null + } + } else null + + // -------------- "plain" class -------------- + val pcb = new PlainClassBuilder(cunit) + pcb.genPlainClass(cd) + val outF = if (needsOutFolder) getOutFolder(claszSymbol, pcb.thisName) else null; + val plainC = pcb.cnode + + if (claszSymbol.isClass) // @DarkDimius is this test needed here? 
+ for (binary <- ctx.compilationUnit.pickled.get(claszSymbol.asClass)) { + val store = if (mirrorC ne null) mirrorC else plainC + val tasty = + val outTastyFile = getFileForClassfile(outF, store.name, ".tasty") + val outstream = new DataOutputStream(outTastyFile.bufferedOutput) + try outstream.write(binary()) + catch case ex: ClosedByInterruptException => + try + outTastyFile.delete() // don't leave an empty or half-written tastyfile around after an interrupt + catch + case _: Throwable => + throw ex + finally outstream.close() + + val uuid = new TastyHeaderUnpickler(binary()).readHeader() + val lo = uuid.getMostSignificantBits + val hi = uuid.getLeastSignificantBits + + // TASTY attribute is created but only the UUID bytes are stored in it. + // A TASTY attribute has length 16 if and only if the .tasty file exists. + val buffer = new TastyBuffer(16) + buffer.writeUncompressedLong(lo) + buffer.writeUncompressedLong(hi) + buffer.bytes + + val dataAttr = createJAttribute(nme.TASTYATTR.mangledString, tasty, 0, tasty.length) + store.visitAttribute(dataAttr) + } + + + // ----------- create files + + val classNodes = List(mirrorC, plainC) + val classFiles = classNodes.map(cls => + if (outF != null && cls != null) { + try { + checkForCaseConflict(cls.name, claszSymbol) + getFileForClassfile(outF, cls.name, ".class") + } catch { + case e: FileConflictException => + report.error(em"error writing ${cls.name}: ${e.getMessage}") + null + } + } else null + ) + + // ----------- compiler and sbt's callbacks + + val (fullClassName, isLocal) = atPhase(sbtExtractDependenciesPhase) { + (ExtractDependencies.classNameAsString(claszSymbol), claszSymbol.isLocal) + } + + for ((cls, clsFile) <- classNodes.zip(classFiles)) { + if (cls != null) { + val className = cls.name.replace('/', '.') + if (ctx.compilerCallback != null) + ctx.compilerCallback.onClassGenerated(sourceFile, convertAbstractFile(clsFile), className) + if (ctx.sbtCallback != null) { + if (isLocal) + ctx.sbtCallback.generatedLocalClass(sourceFile.jfile.orElse(null), clsFile.file) + else { + ctx.sbtCallback.generatedNonLocalClass(sourceFile.jfile.orElse(null), clsFile.file, + className, fullClassName) + } + } + } + } + + // ----------- hand over to pipeline-2 + + val item2 = + Item2(arrivalPos, + SubItem2(mirrorC, classFiles(0)), + SubItem2(plainC, classFiles(1))) + + q2 add item2 // at the very end of this method so that no Worker2 thread starts mutating before we're done. + + } // end of method visit(Item1) + + } // end of class BCodePhase.Worker1 + + /* + * Pipeline that takes ClassNodes from queue-2. 
The unit of work depends on the optimization level: + * + * (a) no optimization involves: + * - converting the plain ClassNode to byte array and placing it on queue-3 + */ + class Worker2 { + import bTypes.ClassBType + import bTypes.coreBTypes.jliLambdaMetaFactoryAltMetafactoryHandle + // lazy val localOpt = new LocalOpt(new Settings()) + + private def localOptimizations(classNode: ClassNode): Unit = { + // BackendStats.timed(BackendStats.methodOptTimer)(localOpt.methodOptimizations(classNode)) + } + + + /* Return an array of all serializable lambdas in this class */ + private def collectSerializableLambdas(classNode: ClassNode): Array[Handle] = { + val indyLambdaBodyMethods = new mutable.ArrayBuffer[Handle] + for (m <- classNode.methods.asScala) { + val iter = m.instructions.iterator + while (iter.hasNext) { + val insn = iter.next() + insn match { + case indy: InvokeDynamicInsnNode + if indy.bsm == jliLambdaMetaFactoryAltMetafactoryHandle => + import java.lang.invoke.LambdaMetafactory.FLAG_SERIALIZABLE + val metafactoryFlags = indy.bsmArgs(3).asInstanceOf[Integer].toInt + val isSerializable = (metafactoryFlags & FLAG_SERIALIZABLE) != 0 + if isSerializable then + val implMethod = indy.bsmArgs(1).asInstanceOf[Handle] + indyLambdaBodyMethods += implMethod + case _ => + } + } + } + indyLambdaBodyMethods.toArray + } + + /* + * Add: + * + * private static Object $deserializeLambda$(SerializedLambda l) { + * try return indy[scala.runtime.LambdaDeserialize.bootstrap, targetMethodGroup$0](l) + * catch { + * case i: IllegalArgumentException => + * try return indy[scala.runtime.LambdaDeserialize.bootstrap, targetMethodGroup$1](l) + * catch { + * case i: IllegalArgumentException => + * ... + * return indy[scala.runtime.LambdaDeserialize.bootstrap, targetMethodGroup${NUM_GROUPS-1}](l) + * } + * + * We use invokedynamic here to enable caching within the deserializer without needing to + * host a static field in the enclosing class. This allows us to add this method to interfaces + * that define lambdas in default methods. + * + * SI-10232 we can't pass arbitrary number of method handles to the final varargs parameter of the bootstrap + * method due to a limitation in the JVM. Instead, we emit a separate invokedynamic bytecode for each group of target + * methods. + */ + private def addLambdaDeserialize(classNode: ClassNode, implMethodsArray: Array[Handle]): Unit = { + import asm.Opcodes._ + import bTypes._ + import coreBTypes._ + + val cw = classNode + + // Make sure to reference the ClassBTypes of all types that are used in the code generated + // here (e.g. java/util/Map) are initialized. Initializing a ClassBType adds it to + // `classBTypeFromInternalNameMap`. When writing the classfile, the asm ClassWriter computes + // stack map frames and invokes the `getCommonSuperClass` method. This method expects all + // ClassBTypes mentioned in the source code to exist in the map. + + val serlamObjDesc = MethodBType(jliSerializedLambdaRef :: Nil, ObjectRef).descriptor + + val mv = cw.visitMethod(ACC_PRIVATE + ACC_STATIC + ACC_SYNTHETIC, "$deserializeLambda$", serlamObjDesc, null, null) + def emitLambdaDeserializeIndy(targetMethods: Seq[Handle]): Unit = { + mv.visitVarInsn(ALOAD, 0) + mv.visitInvokeDynamicInsn("lambdaDeserialize", serlamObjDesc, jliLambdaDeserializeBootstrapHandle, targetMethods: _*) + } + + val targetMethodGroupLimit = 255 - 1 - 3 // JVM limit. 
See See MAX_MH_ARITY in CallSite.java + val groups: Array[Array[Handle]] = implMethodsArray.grouped(targetMethodGroupLimit).toArray + val numGroups = groups.length + + import scala.tools.asm.Label + val initialLabels = Array.fill(numGroups - 1)(new Label()) + val terminalLabel = new Label + def nextLabel(i: Int) = if (i == numGroups - 2) terminalLabel else initialLabels(i + 1) + + for ((label, i) <- initialLabels.iterator.zipWithIndex) { + mv.visitTryCatchBlock(label, nextLabel(i), nextLabel(i), jlIllegalArgExceptionRef.internalName) + } + for ((label, i) <- initialLabels.iterator.zipWithIndex) { + mv.visitLabel(label) + emitLambdaDeserializeIndy(groups(i).toIndexedSeq) + mv.visitInsn(ARETURN) + } + mv.visitLabel(terminalLabel) + emitLambdaDeserializeIndy(groups(numGroups - 1).toIndexedSeq) + mv.visitInsn(ARETURN) + } + + private def setInnerClasses(classNode: ClassNode): Unit = if (classNode != null) { + classNode.innerClasses.clear() + val (declared, referred) = collectNestedClasses(classNode) + addInnerClasses(classNode, declared, referred) + } + + /** + * Visit the class node and collect all referenced nested classes. + */ + private def collectNestedClasses(classNode: ClassNode): (List[ClassBType], List[ClassBType]) = { + // type InternalName = String + val c = new NestedClassesCollector[ClassBType](nestedOnly = true) { + def declaredNestedClasses(internalName: InternalName): List[ClassBType] = + bTypes.classBTypeFromInternalName(internalName).info.memberClasses + + def getClassIfNested(internalName: InternalName): Option[ClassBType] = { + val c = bTypes.classBTypeFromInternalName(internalName) + Option.when(c.isNestedClass)(c) + } + + def raiseError(msg: String, sig: String, e: Option[Throwable]): Unit = { + // don't crash on invalid generic signatures + } + } + c.visit(classNode) + (c.declaredInnerClasses.toList, c.referredInnerClasses.toList) + } + + def run(): Unit = { + while (true) { + val item = q2.poll + if (item.isPoison) { + q3 add poison3 + return + } + else { + try { + val plainNode = item.plain.classNode + localOptimizations(plainNode) + val serializableLambdas = collectSerializableLambdas(plainNode) + if (serializableLambdas.nonEmpty) + addLambdaDeserialize(plainNode, serializableLambdas) + setInnerClasses(plainNode) + setInnerClasses(item.mirror.classNode) + addToQ3(item) + } catch { + case ex: InterruptedException => + throw ex + case ex: Throwable => + println(s"Error while emitting ${item.plain.classNode.name}") + throw ex + } + } + } + } + + private def addToQ3(item: Item2) = { + + def getByteArray(cn: asm.tree.ClassNode): Array[Byte] = { + val cw = new CClassWriter(extraProc) + cn.accept(cw) + cw.toByteArray + } + + val Item2(arrivalPos, SubItem2(mirror, mirrorFile), SubItem2(plain, plainFile)) = item + + val mirrorC = if (mirror == null) null else SubItem3(mirror.name, getByteArray(mirror), mirrorFile) + val plainC = SubItem3(plain.name, getByteArray(plain), plainFile) + + if (AsmUtils.traceSerializedClassEnabled && plain.name.contains(AsmUtils.traceSerializedClassPattern)) { + if (mirrorC != null) AsmUtils.traceClass(mirrorC.jclassBytes) + AsmUtils.traceClass(plainC.jclassBytes) + } + + q3 add Item3(arrivalPos, mirrorC, plainC) + } + + } // end of class BCodePhase.Worker2 + + var arrivalPos: Int = 0 + + /* + * A run of the BCodePhase phase comprises: + * + * (a) set-up steps (most notably supporting maps in `BCodeTypes`, + * but also "the" writer where class files in byte-array form go) + * + * (b) building of ASM ClassNodes, their optimization and serialization. 
+ * + * (c) tear down (closing the classfile-writer and clearing maps) + * + */ + def run(t: Tree)(using Context): Unit = { + this.tree = t + + // val bcodeStart = Statistics.startTimer(BackendStats.bcodeTimer) + + // val initStart = Statistics.startTimer(BackendStats.bcodeInitTimer) + arrivalPos = 0 // just in case + // scalaPrimitives.init() + bTypes.intializeCoreBTypes() + // Statistics.stopTimer(BackendStats.bcodeInitTimer, initStart) + + // initBytecodeWriter invokes fullName, thus we have to run it before the typer-dependent thread is activated. + bytecodeWriter = initBytecodeWriter() + mirrorCodeGen = new JMirrorBuilder + + val needsOutfileForSymbol = bytecodeWriter.isInstanceOf[ClassBytecodeWriter] + buildAndSendToDisk(needsOutfileForSymbol) + + // closing output files. + bytecodeWriter.close() + // Statistics.stopTimer(BackendStats.bcodeTimer, bcodeStart) + + if (ctx.compilerCallback != null) + ctx.compilerCallback.onSourceCompiled(sourceFile) + + /* TODO Bytecode can be verified (now that all classfiles have been written to disk) + * + * (1) asm.util.CheckAdapter.verify() + * public static void verify(ClassReader cr, ClassLoader loader, boolean dump, PrintWriter pw) + * passing a custom ClassLoader to verify inter-dependent classes. + * Alternatively, + * - an offline-bytecode verifier could be used (e.g. Maxine brings one as separate tool). + * - -Xverify:all + * + * (2) if requested, check-java-signatures, over and beyond the syntactic checks in `getGenericSignature()` + * + */ + } + + /* + * Sequentially: + * (a) place all ClassDefs in queue-1 + * (b) dequeue one at a time from queue-1, convert it to ASM ClassNode, place in queue-2 + * (c) dequeue one at a time from queue-2, convert it to byte-array, place in queue-3 + * (d) serialize to disk by draining queue-3. + */ + private def buildAndSendToDisk(needsOutFolder: Boolean)(using Context) = { + try + feedPipeline1() + // val genStart = Statistics.startTimer(BackendStats.bcodeGenStat) + (new Worker1(needsOutFolder)).run() + // Statistics.stopTimer(BackendStats.bcodeGenStat, genStart) + + (new Worker2).run() + + // val writeStart = Statistics.startTimer(BackendStats.bcodeWriteTimer) + drainQ3() + // Statistics.stopTimer(BackendStats.bcodeWriteTimer, writeStart) + catch + case e: MethodTooLargeException => + val method = + s"${e.getClassName.replaceAll("/", ".")}.${e.getMethodName}" + val msg = + em"Generated bytecode for method '$method' is too large. Size: ${e.getCodeSize} bytes. Limit is 64KB" + report.error(msg) + case e: ClassTooLargeException => + val msg = + em"Class '${e.getClassName.replaceAll("/", ".")}' is too large. Constant pool size: ${e.getConstantPoolCount}. Limit is 64K entries" + report.error(msg) + + } + + /* Feed pipeline-1: place all ClassDefs on q1, recording their arrival position. */ + private def feedPipeline1() = { + def gen(tree: Tree): Unit = { + tree match { + case EmptyTree => () + case PackageDef(_, stats) => stats foreach gen + case ValDef(name, tpt, rhs) => () // module val not emitted + case cd: TypeDef => + q1 add Item1(arrivalPos, cd, int.ctx.compilationUnit) + arrivalPos += 1 + } + } + gen(tree) + q1 add poison1 + } + + /* Pipeline that writes classfile representations to disk. 
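For illustration, a minimal, self-contained sketch of why `q3` above is a `java.util.PriorityQueue` ordered by `arrivalPos`: workers may enqueue items out of order, but draining the priority queue (as `drainQ3` below does) yields them in arrival order, so classfiles are written deterministically. `Item` and the names are hypothetical.

```scala
object ArrivalOrderDemo {
  final case class Item(arrivalPos: Int, name: String)

  def main(args: Array[String]): Unit = {
    val byArrival = new java.util.Comparator[Item] {
      override def compare(a: Item, b: Item): Int =
        java.lang.Integer.compare(a.arrivalPos, b.arrivalPos)
    }
    val q3 = new java.util.PriorityQueue[Item](16, byArrival)

    // Enqueue out of order, as concurrent workers might.
    List(Item(2, "C.class"), Item(0, "A.class"), Item(1, "B.class")).foreach(q3.add)

    while (!q3.isEmpty)
      println(q3.poll().name) // prints A.class, B.class, C.class
  }
}
```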
*/ + private def drainQ3() = { + + def sendToDisk(cfr: SubItem3): Unit = { + if (cfr != null){ + val SubItem3(jclassName, jclassBytes, jclassFile) = cfr + bytecodeWriter.writeClass(jclassName, jclassName, jclassBytes, jclassFile) + } + } + + var moreComing = true + // `expected` denotes the arrivalPos whose Item3 should be serialized next + var expected = 0 + + while (moreComing) { + val incoming = q3.poll + moreComing = !incoming.isPoison + if (moreComing) { + val item = incoming + sendToDisk(item.mirror) + sendToDisk(item.plain) + expected += 1 + } + } + + // we're done + assert(q1.isEmpty, s"Some ClassDefs remained in the first queue: $q1") + assert(q2.isEmpty, s"Some classfiles remained in the second queue: $q2") + assert(q3.isEmpty, s"Some classfiles weren't written to disk: $q3") + + } + //} // end of class BCodePhase +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/GenBCodeOps.scala b/tests/pos-with-compiler-cc/backend/jvm/GenBCodeOps.scala new file mode 100644 index 000000000000..210e47566cb9 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/GenBCodeOps.scala @@ -0,0 +1,16 @@ +package dotty.tools +package backend +package jvm + +import scala.tools.asm + +object GenBCodeOps extends GenBCodeOps + +class GenBCodeOps { + extension (flags: Int) + def addFlagIf(cond: Boolean, flag: Int): Int = if cond then flags | flag else flags + + final val PublicStatic = asm.Opcodes.ACC_PUBLIC | asm.Opcodes.ACC_STATIC + final val PublicStaticFinal = asm.Opcodes.ACC_PUBLIC | asm.Opcodes.ACC_STATIC | asm.Opcodes.ACC_FINAL + final val PrivateStaticFinal = asm.Opcodes.ACC_PRIVATE | asm.Opcodes.ACC_STATIC | asm.Opcodes.ACC_FINAL +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/GenericSignatureVisitor.scala b/tests/pos-with-compiler-cc/backend/jvm/GenericSignatureVisitor.scala new file mode 100644 index 000000000000..e9e532933290 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/GenericSignatureVisitor.scala @@ -0,0 +1,326 @@ +package dotty.tools.backend.jvm + +import scala.language.unsafeNulls + +import scala.tools.asm.{ClassReader, Type, Handle } +import scala.tools.asm.tree._ + +import scala.collection.mutable +import scala.util.control.{NoStackTrace, NonFatal} +import scala.annotation._ +import scala.jdk.CollectionConverters._ + +// Backported from scala/scala, commit sha: 724be0e9425b9ad07c244d25efdad695d75abbcf +// https://github.com/scala/scala/blob/724be0e9425b9ad07c244d25efdad695d75abbcf/src/compiler/scala/tools/nsc/backend/jvm/analysis/BackendUtils.scala#L928 +abstract class GenericSignatureVisitor(nestedOnly: Boolean) { + // For performance (`Char => Boolean` is not specialized) + private trait CharBooleanFunction { def apply(c: Char): Boolean } + + final def visitInternalName(internalName: String): Unit = visitInternalName(internalName, 0, if (internalName eq null) 0 else internalName.length) + def visitInternalName(internalName: String, offset: Int, length: Int): Unit + + def raiseError(msg: String, sig: String, e: Option[Throwable] = None): Unit + + def visitClassSignature(sig: String): Unit = if (sig != null) { + val p = new Parser(sig, nestedOnly) + p.safely { p.classSignature() } + } + + def visitMethodSignature(sig: String): Unit = if (sig != null) { + val p = new Parser(sig, nestedOnly) + p.safely { p.methodSignature() } + } + + def visitFieldSignature(sig: String): Unit = if (sig != null) { + val p = new Parser(sig, nestedOnly) + p.safely { p.fieldSignature() } + } + + private final class Parser(sig: String, nestedOnly: Boolean) { + + private var 
index = 0 + private val end = sig.length + + private val Aborted: Throwable = new NoStackTrace { } + private def abort(): Nothing = throw Aborted + + @inline def safely(f: => Unit): Unit = try f catch { + case Aborted => + case NonFatal(e) => raiseError(s"Exception thrown during signature parsing", sig, Some(e)) + } + + private def current = { + if (index >= end) { + raiseError(s"Out of bounds, $index >= $end", sig) + abort() // Don't continue, even if `notifyInvalidSignature` returns + } + sig.charAt(index) + } + + private def accept(c: Char): Unit = { + if (current != c) { + raiseError(s"Expected $c at $index, found $current", sig) + abort() + } + index += 1 + } + + private def skip(): Unit = { index += 1 } + private def getCurrentAndSkip(): Char = { val c = current; skip(); c } + + private def skipUntil(isDelimiter: CharBooleanFunction): Unit = { + while (!isDelimiter(current)) { index += 1 } + } + private def skipUntilDelimiter(delimiter: Char): Unit = { + sig.indexOf(delimiter, index) match { + case -1 => + raiseError(s"Out of bounds", sig) + abort() // Don't continue, even if `notifyInvalidSignature` returns + case i => + index = i + } + } + + private def appendUntil(builder: java.lang.StringBuilder, isDelimiter: CharBooleanFunction): Unit = { + val start = index + skipUntil(isDelimiter) + builder.append(sig, start, index) + } + + def isBaseType(c: Char): Boolean = c match { + case 'B' | 'C' | 'D' | 'F' | 'I' | 'J' | 'S' | 'Z' => true + case _ => false + } + + private val isClassNameEnd: CharBooleanFunction = (c: Char) => c == '<' || c == '.' || c == ';' + + private def typeArguments(): Unit = if (current == '<') { + skip() + while (current != '>') current match { + case '*' | '+' | '-' => + skip() + case _ => + referenceTypeSignature() + } + accept('>') + } + + @tailrec private def referenceTypeSignature(): Unit = getCurrentAndSkip() match { + case 'L' => + var names: java.lang.StringBuilder = null + + val start = index + var seenDollar = false + while (!isClassNameEnd(current)) { + seenDollar ||= current == '$' + index += 1 + } + if ((current == '.' || seenDollar) || !nestedOnly) { + // OPT: avoid allocations when only a top-level class is encountered + names = new java.lang.StringBuilder(32) + names.append(sig, start, index) + visitInternalName(names.toString) + } + typeArguments() + + while (current == '.') { + skip() + names.append('$') + appendUntil(names, isClassNameEnd) + visitInternalName(names.toString) + typeArguments() + } + accept(';') + + case 'T' => + skipUntilDelimiter(';') + skip() + + case '[' => + if (isBaseType(current)) skip() + else referenceTypeSignature() + } + + private def typeParameters(): Unit = if (current == '<') { + skip() + while (current != '>') { + skipUntilDelimiter(':'); skip() + val c = current + // The ClassBound can be missing, but only if there's an InterfaceBound after. 
+ // This is an assumption that's not in the spec, see https://stackoverflow.com/q/44284928 + if (c != ':' && c != '>') { referenceTypeSignature() } + while (current == ':') { skip(); referenceTypeSignature() } + } + accept('>') + } + + def classSignature(): Unit = { + typeParameters() + while (index < end) referenceTypeSignature() + } + + def methodSignature(): Unit = { + typeParameters() + + accept('(') + while (current != ')') { + if (isBaseType(current)) skip() + else referenceTypeSignature() + } + accept(')') + + if (current == 'V' || isBaseType(current)) skip() + else referenceTypeSignature() + + while (index < end) { + accept('^') + referenceTypeSignature() + } + } + + def fieldSignature(): Unit = if (sig != null) safely { + referenceTypeSignature() + } + } +} + +// Backported from scala/scala, commit sha: 724be0e9425b9ad07c244d25efdad695d75abbcf +// https://github.com/scala/scala/blob/724be0e9425b9ad07c244d25efdad695d75abbcf/src/compiler/scala/tools/nsc/backend/jvm/analysis/BackendUtils.scala#L790 +abstract class NestedClassesCollector[T](nestedOnly: Boolean) extends GenericSignatureVisitor(nestedOnly) { + type InternalName = String + + def declaredNestedClasses(internalName: InternalName): List[T] + def getClassIfNested(internalName: InternalName): Option[T] + + val declaredInnerClasses = mutable.Set.empty[T] + val referredInnerClasses = mutable.Set.empty[T] + + def innerClasses: collection.Set[T] = declaredInnerClasses ++ referredInnerClasses + def clear(): Unit = { + declaredInnerClasses.clear() + referredInnerClasses.clear() + } + + def visit(classNode: ClassNode): Unit = { + visitInternalName(classNode.name) + declaredInnerClasses ++= declaredNestedClasses(classNode.name) + + visitInternalName(classNode.superName) + classNode.interfaces.asScala foreach visitInternalName + visitInternalName(classNode.outerClass) + + visitAnnotations(classNode.visibleAnnotations) + visitAnnotations(classNode.visibleTypeAnnotations) + visitAnnotations(classNode.invisibleAnnotations) + visitAnnotations(classNode.invisibleTypeAnnotations) + + visitClassSignature(classNode.signature) + + for (f <- classNode.fields.asScala) { + visitDescriptor(f.desc) + visitAnnotations(f.visibleAnnotations) + visitAnnotations(f.visibleTypeAnnotations) + visitAnnotations(f.invisibleAnnotations) + visitAnnotations(f.invisibleTypeAnnotations) + visitFieldSignature(f.signature) + } + + for (m <- classNode.methods.asScala) { + visitDescriptor(m.desc) + + visitAnnotations(m.visibleAnnotations) + visitAnnotations(m.visibleTypeAnnotations) + visitAnnotations(m.invisibleAnnotations) + visitAnnotations(m.invisibleTypeAnnotations) + visitAnnotationss(m.visibleParameterAnnotations) + visitAnnotationss(m.invisibleParameterAnnotations) + visitAnnotations(m.visibleLocalVariableAnnotations) + visitAnnotations(m.invisibleLocalVariableAnnotations) + + m.exceptions.asScala foreach visitInternalName + for (tcb <- m.tryCatchBlocks.asScala) visitInternalName(tcb.`type`) + + val iter = m.instructions.iterator + while (iter.hasNext) iter.next() match { + case ti: TypeInsnNode => visitInternalNameOrArrayReference(ti.desc) + case fi: FieldInsnNode => visitInternalNameOrArrayReference(fi.owner); visitDescriptor(fi.desc) + case mi: MethodInsnNode => visitInternalNameOrArrayReference(mi.owner); visitDescriptor(mi.desc) + case id: InvokeDynamicInsnNode => visitDescriptor(id.desc); visitHandle(id.bsm); id.bsmArgs foreach visitConstant + case ci: LdcInsnNode => visitConstant(ci.cst) + case ma: MultiANewArrayInsnNode => 
visitDescriptor(ma.desc) + case _ => + } + + visitMethodSignature(m.signature) + } + } + + private def containsChar(s: String, offset: Int, length: Int, char: Char): Boolean = { + val ix = s.indexOf(char, offset) + !(ix == -1 || ix >= offset + length) + } + + def visitInternalName(internalName: String, offset: Int, length: Int): Unit = if (internalName != null && containsChar(internalName, offset, length, '$')) { + for (c <- getClassIfNested(internalName.substring(offset, length))) + if (!declaredInnerClasses.contains(c)) + referredInnerClasses += c + } + + // either an internal/Name or [[Linternal/Name; -- there are certain references in classfiles + // that are either an internal name (without the surrounding `L;`) or an array descriptor + // `[Linternal/Name;`. + def visitInternalNameOrArrayReference(ref: String): Unit = if (ref != null) { + val bracket = ref.lastIndexOf('[') + if (bracket == -1) visitInternalName(ref) + else if (ref.charAt(bracket + 1) == 'L') visitInternalName(ref, bracket + 2, ref.length - 1) + } + + // we are only interested in the class references in the descriptor, so we can skip over + // primitives and the brackets of array descriptors + def visitDescriptor(desc: String): Unit = (desc.charAt(0): @switch) match { + case '(' => + var i = 1 + while (i < desc.length) { + if (desc.charAt(i) == 'L') { + val start = i + 1 // skip the L + var seenDollar = false + while ({val ch = desc.charAt(i); seenDollar ||= (ch == '$'); ch != ';'}) i += 1 + if (seenDollar) + visitInternalName(desc, start, i) + } + // skips over '[', ')', primitives + i += 1 + } + + case 'L' => + visitInternalName(desc, 1, desc.length - 1) + + case '[' => + visitInternalNameOrArrayReference(desc) + + case _ => // skip over primitive types + } + + def visitConstant(const: AnyRef): Unit = const match { + case t: Type => visitDescriptor(t.getDescriptor) + case _ => + } + + // in principle we could references to annotation types, as they only end up as strings in the + // constant pool, not as class references. however, the java compiler still includes nested + // annotation classes in the innerClass table, so we do the same. explained in detail in the + // large comment in class BTypes. + def visitAnnotation(annot: AnnotationNode): Unit = { + visitDescriptor(annot.desc) + if (annot.values != null) annot.values.asScala foreach visitConstant + } + + def visitAnnotations(annots: java.util.List[_ <: AnnotationNode]) = if (annots != null) annots.asScala foreach visitAnnotation + def visitAnnotationss(annotss: Array[java.util.List[AnnotationNode]]) = if (annotss != null) annotss foreach visitAnnotations + + def visitHandle(handle: Handle): Unit = { + visitInternalNameOrArrayReference(handle.getOwner) + visitDescriptor(handle.getDesc) + } +} + diff --git a/tests/pos-with-compiler-cc/backend/jvm/LabelNode1.java b/tests/pos-with-compiler-cc/backend/jvm/LabelNode1.java new file mode 100644 index 000000000000..cf91fe619f5d --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/LabelNode1.java @@ -0,0 +1,31 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. 
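For illustration, a minimal, self-contained sketch of the descriptor-scanning idea behind `visitDescriptor` above, written as a standalone function that returns every internal name referenced by a JVM method descriptor. The real visitor avoids allocations, honours the `nestedOnly` mode (it only records names that look nested, i.e. contain `$`), and forwards names to `visitInternalName`; this sketch simply collects them all.

```scala
object DescriptorScanDemo {
  def classRefs(desc: String): List[String] = {
    val out = List.newBuilder[String]
    var i = 0
    while (i < desc.length) {
      desc.charAt(i) match {
        case 'L' =>                       // object type: L<internal name>;
          val end = desc.indexOf(';', i)
          out += desc.substring(i + 1, end)
          i = end + 1
        case _ =>                         // '(', ')', '[' and primitive descriptors
          i += 1
      }
    }
    out.result()
  }

  def main(args: Array[String]): Unit = {
    // (ILjava/lang/String;[Ljava/util/Map$Entry;)V  ~  (int, String, Map.Entry[]) -> void
    println(classRefs("(ILjava/lang/String;[Ljava/util/Map$Entry;)V"))
    // prints: List(java/lang/String, java/util/Map$Entry)
  }
}
```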
+ */ + +package dotty.tools.backend.jvm; + +import scala.tools.asm.Label; +import scala.tools.asm.tree.ClassNode; +import scala.tools.asm.tree.LabelNode; + +/** + * A subclass of {@link LabelNode} to add user-definable flags. + */ +public class LabelNode1 extends LabelNode { + public LabelNode1() { + } + + public LabelNode1(Label label) { + super(label); + } + + public int flags; +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/MethodNode1.java b/tests/pos-with-compiler-cc/backend/jvm/MethodNode1.java new file mode 100644 index 000000000000..bfa4401830ba --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/MethodNode1.java @@ -0,0 +1,47 @@ +/* + * Scala (https://www.scala-lang.org) + * + * Copyright EPFL and Lightbend, Inc. + * + * Licensed under Apache License 2.0 + * (http://www.apache.org/licenses/LICENSE-2.0). + * + * See the NOTICE file distributed with this work for + * additional information regarding copyright ownership. + */ + +package dotty.tools.backend.jvm; + +import scala.tools.asm.Label; +import scala.tools.asm.Opcodes; +import scala.tools.asm.tree.LabelNode; +import scala.tools.asm.tree.MethodNode; +/** + * A subclass of {@link MethodNode} to customize the representation of + * label nodes with {@link LabelNode1}. + */ +public class MethodNode1 extends MethodNode { + public MethodNode1(int api, int access, String name, String descriptor, String signature, String[] exceptions) { + super(api, access, name, descriptor, signature, exceptions); + } + + public MethodNode1(int access, String name, String descriptor, String signature, String[] exceptions) { + this(Opcodes.ASM6, access, name, descriptor, signature, exceptions); + } + + public MethodNode1(int api) { + super(api); + } + + public MethodNode1() { + this(Opcodes.ASM6); + } + + @Override + protected LabelNode getLabelNode(Label label) { + if (!(label.info instanceof LabelNode)) { + label.info = new LabelNode1(label); + } + return (LabelNode) label.info; + } +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/Primitives.scala b/tests/pos-with-compiler-cc/backend/jvm/Primitives.scala new file mode 100644 index 000000000000..c9ddfeab24e1 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/Primitives.scala @@ -0,0 +1,191 @@ +package dotty.tools +package backend +package jvm + +import java.io.PrintWriter + +object Primitives { + /** This class represents a primitive operation. */ + class Primitive { + } + + /** This class represents a test operation. */ + sealed abstract class TestOp { + + /** Returns the negation of this operation. */ + def negate(): TestOp + + /** Returns a string representation of this operation. 
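For illustration, a minimal, self-contained sketch of how a `TestOp`-style operation with `negate()` is typically used by a bytecode emitter: when compiling `if (cond) thenBranch`, the emitter jumps to the else/end label on the negated comparison. The names and the string output are hypothetical; the real code maps operations to ASM opcodes via `opcodeIF()`/`opcodeIFICMP()`.

```scala
object NegatedJumpDemo {
  sealed trait TestOp { def negate: TestOp }
  case object EQ extends TestOp { def negate = NE }
  case object NE extends TestOp { def negate = EQ }
  case object LT extends TestOp { def negate = GE }
  case object GE extends TestOp { def negate = LT }

  // `if (a < b) thenBranch` is emitted as: jump to `elseLabel` when NOT (a < b).
  def emitConditional(op: TestOp, elseLabel: String): String =
    s"IF_ICMP${op.negate} $elseLabel"

  def main(args: Array[String]): Unit =
    println(emitConditional(LT, "L_else")) // prints: IF_ICMPGE L_else
}
```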
*/ + override def toString(): String + + /** used only from GenASM */ + def opcodeIF(): Int + + /** used only from GenASM */ + def opcodeIFICMP(): Int + + } + + /** An equality test */ + case object EQ extends TestOp { + def negate() = NE + override def toString() = "EQ" + override def opcodeIF() = scala.tools.asm.Opcodes.IFEQ + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPEQ + } + + /** A non-equality test */ + case object NE extends TestOp { + def negate() = EQ + override def toString() = "NE" + override def opcodeIF() = scala.tools.asm.Opcodes.IFNE + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPNE + } + + /** A less-than test */ + case object LT extends TestOp { + def negate() = GE + override def toString() = "LT" + override def opcodeIF() = scala.tools.asm.Opcodes.IFLT + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPLT + } + + /** A greater-than-or-equal test */ + case object GE extends TestOp { + def negate() = LT + override def toString() = "GE" + override def opcodeIF() = scala.tools.asm.Opcodes.IFGE + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPGE + } + + /** A less-than-or-equal test */ + case object LE extends TestOp { + def negate() = GT + override def toString() = "LE" + override def opcodeIF() = scala.tools.asm.Opcodes.IFLE + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPLE + } + + /** A greater-than test */ + case object GT extends TestOp { + def negate() = LE + override def toString() = "GT" + override def opcodeIF() = scala.tools.asm.Opcodes.IFGT + override def opcodeIFICMP() = scala.tools.asm.Opcodes.IF_ICMPGT + } + + /** This class represents an arithmetic operation. */ + class ArithmeticOp { + + /** Returns a string representation of this operation. */ + override def toString(): String = this match { + case ADD => "ADD" + case SUB => "SUB" + case MUL => "MUL" + case DIV => "DIV" + case REM => "REM" + case NOT => "NOT" + case _ => throw new RuntimeException("ArithmeticOp unknown case") + } + } + + /** An arithmetic addition operation */ + case object ADD extends ArithmeticOp + + /** An arithmetic subtraction operation */ + case object SUB extends ArithmeticOp + + /** An arithmetic multiplication operation */ + case object MUL extends ArithmeticOp + + /** An arithmetic division operation */ + case object DIV extends ArithmeticOp + + /** An arithmetic remainder operation */ + case object REM extends ArithmeticOp + + /** Bitwise negation. */ + case object NOT extends ArithmeticOp + + /** This class represents a shift operation. */ + class ShiftOp { + + /** Returns a string representation of this operation. */ + override def toString(): String = this match { + case LSL => "LSL" + case ASR => "ASR" + case LSR => "LSR" + case _ => throw new RuntimeException("ShitOp unknown case") + } + } + + /** A logical shift to the left */ + case object LSL extends ShiftOp + + /** An arithmetic shift to the right */ + case object ASR extends ShiftOp + + /** A logical shift to the right */ + case object LSR extends ShiftOp + + /** This class represents a logical operation. */ + class LogicalOp { + + /** Returns a string representation of this operation. 
*/ + override def toString(): String = this match { + case AND => "AND" + case OR => "OR" + case XOR => "XOR" + case _ => throw new RuntimeException("LogicalOp unknown case") + } + } + + /** A bitwise AND operation */ + case object AND extends LogicalOp + + /** A bitwise OR operation */ + case object OR extends LogicalOp + + /** A bitwise XOR operation */ + case object XOR extends LogicalOp + + /** Signals the beginning of a series of concatenations. + * On the JVM platform, it should create a new StringBuffer + */ + case object StartConcat extends Primitive + + /** + * type: (buf) => STR + * jvm : It should turn the StringBuffer into a String. + */ + case object EndConcat extends Primitive + + /** Pretty printer for primitives */ + class PrimitivePrinter(out: PrintWriter) { + def print(s: String): PrimitivePrinter = { + out.print(s) + this + } + } + + /** This class represents a comparison operation. */ + class ComparisonOp { + + /** Returns a string representation of this operation. */ + override def toString(): String = this match { + case CMPL => "CMPL" + case CMP => "CMP" + case CMPG => "CMPG" + case _ => throw new RuntimeException("ComparisonOp unknown case") + } + } + + /** A comparison operation with -1 default for NaNs */ + case object CMPL extends ComparisonOp + + /** A comparison operation with no default for NaNs */ + case object CMP extends ComparisonOp + + /** A comparison operation with +1 default for NaNs */ + case object CMPG extends ComparisonOp +} diff --git a/tests/pos-with-compiler-cc/backend/jvm/scalaPrimitives.scala b/tests/pos-with-compiler-cc/backend/jvm/scalaPrimitives.scala new file mode 100644 index 000000000000..420ff7b20423 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/jvm/scalaPrimitives.scala @@ -0,0 +1,412 @@ +package dotty.tools +package backend.jvm + +import dotc.ast.Trees.Select +import dotc.ast.tpd._ +import dotc.core._ +import Contexts._ +import Names.TermName, StdNames._ +import Types.{JavaArrayType, UnspecifiedErrorType, Type} +import Symbols.{Symbol, NoSymbol} +import Decorators.em +import dotc.report +import dotc.util.ReadOnlyMap + +import scala.annotation.threadUnsafe + +/** Scala primitive operations are represented as methods in `Any` and + * `AnyVal` subclasses. Here we demultiplex them by providing a mapping + * from their symbols to integers. Different methods exist for + * different value types, but with the same meaning (like plus, minus, + * etc.). They will all be mapped to the same int. + * + * Note: The three equal methods have the following semantics: + * - `"=="` checks for `null`, and if non-null, calls + * `java.lang.Object.equals` + * `(class: Any; modifier: final)`. Primitive: `EQ` + * - `"eq"` usual reference comparison + * `(class: AnyRef; modifier: final)`. Primitive: `ID` + * - `"equals"` user-defined equality (Java semantics) + * `(class: Object; modifier: none)`. Primitive: `EQUALS` + * + * Inspired from the `scalac` compiler. + */ +class DottyPrimitives(ictx: DetachedContext) { + import dotty.tools.backend.ScalaPrimitivesOps._ + + @threadUnsafe private lazy val primitives: ReadOnlyMap[Symbol, Int] = init + + /** Return the code for the given symbol. */ + def getPrimitive(sym: Symbol): Int = { + primitives(sym) + } + + /** + * Return the primitive code of the given operation. If the + * operation is an array get/set, we inspect the type of the receiver + * to demux the operation. + * + * @param fun The method symbol + * @param tpe The type of the receiver object. 
It is used only for array + * operations + */ + def getPrimitive(app: Apply, tpe: Type)(using Context): Int = { + val fun = app.fun.symbol + val defn = ctx.definitions + val code = app.fun match { + case Select(_, nme.primitive.arrayLength) => + LENGTH + case Select(_, nme.primitive.arrayUpdate) => + UPDATE + case Select(_, nme.primitive.arrayApply) => + APPLY + case _ => getPrimitive(fun) + } + + def elementType: Type = tpe.widenDealias match { + case defn.ArrayOf(el) => el + case JavaArrayType(el) => el + case _ => + report.error(em"expected Array $tpe") + UnspecifiedErrorType + } + + code match { + + case APPLY => + defn.scalaClassName(elementType) match { + case tpnme.Boolean => ZARRAY_GET + case tpnme.Byte => BARRAY_GET + case tpnme.Short => SARRAY_GET + case tpnme.Char => CARRAY_GET + case tpnme.Int => IARRAY_GET + case tpnme.Long => LARRAY_GET + case tpnme.Float => FARRAY_GET + case tpnme.Double => DARRAY_GET + case _ => OARRAY_GET + } + + case UPDATE => + defn.scalaClassName(elementType) match { + case tpnme.Boolean => ZARRAY_SET + case tpnme.Byte => BARRAY_SET + case tpnme.Short => SARRAY_SET + case tpnme.Char => CARRAY_SET + case tpnme.Int => IARRAY_SET + case tpnme.Long => LARRAY_SET + case tpnme.Float => FARRAY_SET + case tpnme.Double => DARRAY_SET + case _ => OARRAY_SET + } + + case LENGTH => + defn.scalaClassName(elementType) match { + case tpnme.Boolean => ZARRAY_LENGTH + case tpnme.Byte => BARRAY_LENGTH + case tpnme.Short => SARRAY_LENGTH + case tpnme.Char => CARRAY_LENGTH + case tpnme.Int => IARRAY_LENGTH + case tpnme.Long => LARRAY_LENGTH + case tpnme.Float => FARRAY_LENGTH + case tpnme.Double => DARRAY_LENGTH + case _ => OARRAY_LENGTH + } + + case _ => + code + } + } + + /** Initialize the primitive map */ + private def init: ReadOnlyMap[Symbol, Int] = { + + given Context = ictx + + import Symbols.defn + val primitives = Symbols.MutableSymbolMap[Int](512) + + /** Add a primitive operation to the map */ + def addPrimitive(s: Symbol, code: Int): Unit = { + assert(!(primitives contains s), "Duplicate primitive " + s) + primitives(s) = code + } + + def addPrimitives(cls: Symbol, method: TermName, code: Int)(using Context): Unit = { + val alts = cls.info.member(method).alternatives.map(_.symbol) + if (alts.isEmpty) + report.error(em"Unknown primitive method $cls.$method") + else alts foreach (s => + addPrimitive(s, + s.info.paramInfoss match { + case List(tp :: _) if code == ADD && tp =:= ctx.definitions.StringType => CONCAT + case _ => code + } + ) + ) + } + + // scala.Any + addPrimitive(defn.Any_==, EQ) + addPrimitive(defn.Any_!=, NE) + addPrimitive(defn.Any_isInstanceOf, IS) + addPrimitive(defn.Any_asInstanceOf, AS) + addPrimitive(defn.Any_##, HASH) + + // java.lang.Object + addPrimitive(defn.Object_eq, ID) + addPrimitive(defn.Object_ne, NI) + /* addPrimitive(defn.Any_==, EQ) + addPrimitive(defn.Any_!=, NE)*/ + addPrimitive(defn.Object_synchronized, SYNCHRONIZED) + /*addPrimitive(defn.Any_isInstanceOf, IS) + addPrimitive(defn.Any_asInstanceOf, AS)*/ + + // java.lang.String + addPrimitive(defn.String_+, CONCAT) + + // scala.Array + lazy val ArrayClass = defn.ArrayClass + addPrimitives(ArrayClass, nme.length, LENGTH) + addPrimitives(ArrayClass, nme.apply, APPLY) + addPrimitives(ArrayClass, nme.update, UPDATE) + + // scala.Boolean + lazy val BooleanClass = defn.BooleanClass + addPrimitives(BooleanClass, nme.EQ, EQ) + addPrimitives(BooleanClass, nme.NE, NE) + addPrimitives(BooleanClass, nme.UNARY_!, ZNOT) + addPrimitives(BooleanClass, nme.ZOR, ZOR) + 
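As an aside on the demultiplexing performed by `getPrimitive(app, tpe)` above: the idea can be shown with a small standalone sketch (hypothetical names only, not the compiler's API). A generic array operation (`apply`/`update`/`length`) plus the element kind of the receiver yields one type-specific code, exactly like the `ZARRAY_GET`/`IARRAY_SET`/`OARRAY_LENGTH` constants used here.

```scala
object ArrayDemuxSketch {
  /** Maps a Scala element-type name to the single-letter prefix used by the
   *  primitive codes (ZARRAY_GET, IARRAY_SET, OARRAY_LENGTH, ...).
   */
  private def prefixOf(elemTypeName: String): String = elemTypeName match {
    case "Boolean" => "Z"
    case "Byte"    => "B"
    case "Short"   => "S"
    case "Char"    => "C"
    case "Int"     => "I"
    case "Long"    => "L"
    case "Float"   => "F"
    case "Double"  => "D"
    case _         => "O" // any reference type
  }

  /** Demultiplexes a generic array operation into a type-specific code. */
  def demux(op: String, elemTypeName: String): String = op match {
    case "apply"  => prefixOf(elemTypeName) + "ARRAY_GET"
    case "update" => prefixOf(elemTypeName) + "ARRAY_SET"
    case "length" => prefixOf(elemTypeName) + "ARRAY_LENGTH"
    case other    => sys.error(s"not an array primitive: $other")
  }

  def main(args: Array[String]): Unit = {
    println(demux("apply", "Int"))     // IARRAY_GET
    println(demux("length", "String")) // OARRAY_LENGTH
  }
}
```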
addPrimitives(BooleanClass, nme.ZAND, ZAND) + addPrimitives(BooleanClass, nme.OR, OR) + addPrimitives(BooleanClass, nme.AND, AND) + addPrimitives(BooleanClass, nme.XOR, XOR) + + // scala.Byte + lazy val ByteClass = defn.ByteClass + addPrimitives(ByteClass, nme.EQ, EQ) + addPrimitives(ByteClass, nme.NE, NE) + addPrimitives(ByteClass, nme.ADD, ADD) + addPrimitives(ByteClass, nme.SUB, SUB) + addPrimitives(ByteClass, nme.MUL, MUL) + addPrimitives(ByteClass, nme.DIV, DIV) + addPrimitives(ByteClass, nme.MOD, MOD) + addPrimitives(ByteClass, nme.LT, LT) + addPrimitives(ByteClass, nme.LE, LE) + addPrimitives(ByteClass, nme.GT, GT) + addPrimitives(ByteClass, nme.GE, GE) + addPrimitives(ByteClass, nme.XOR, XOR) + addPrimitives(ByteClass, nme.OR, OR) + addPrimitives(ByteClass, nme.AND, AND) + addPrimitives(ByteClass, nme.LSL, LSL) + addPrimitives(ByteClass, nme.LSR, LSR) + addPrimitives(ByteClass, nme.ASR, ASR) + // conversions + addPrimitives(ByteClass, nme.toByte, B2B) + addPrimitives(ByteClass, nme.toShort, B2S) + addPrimitives(ByteClass, nme.toChar, B2C) + addPrimitives(ByteClass, nme.toInt, B2I) + addPrimitives(ByteClass, nme.toLong, B2L) + // unary methods + addPrimitives(ByteClass, nme.UNARY_+, POS) + addPrimitives(ByteClass, nme.UNARY_-, NEG) + addPrimitives(ByteClass, nme.UNARY_~, NOT) + + addPrimitives(ByteClass, nme.toFloat, B2F) + addPrimitives(ByteClass, nme.toDouble, B2D) + + // scala.Short + lazy val ShortClass = defn.ShortClass + addPrimitives(ShortClass, nme.EQ, EQ) + addPrimitives(ShortClass, nme.NE, NE) + addPrimitives(ShortClass, nme.ADD, ADD) + addPrimitives(ShortClass, nme.SUB, SUB) + addPrimitives(ShortClass, nme.MUL, MUL) + addPrimitives(ShortClass, nme.DIV, DIV) + addPrimitives(ShortClass, nme.MOD, MOD) + addPrimitives(ShortClass, nme.LT, LT) + addPrimitives(ShortClass, nme.LE, LE) + addPrimitives(ShortClass, nme.GT, GT) + addPrimitives(ShortClass, nme.GE, GE) + addPrimitives(ShortClass, nme.XOR, XOR) + addPrimitives(ShortClass, nme.OR, OR) + addPrimitives(ShortClass, nme.AND, AND) + addPrimitives(ShortClass, nme.LSL, LSL) + addPrimitives(ShortClass, nme.LSR, LSR) + addPrimitives(ShortClass, nme.ASR, ASR) + // conversions + addPrimitives(ShortClass, nme.toByte, S2B) + addPrimitives(ShortClass, nme.toShort, S2S) + addPrimitives(ShortClass, nme.toChar, S2C) + addPrimitives(ShortClass, nme.toInt, S2I) + addPrimitives(ShortClass, nme.toLong, S2L) + // unary methods + addPrimitives(ShortClass, nme.UNARY_+, POS) + addPrimitives(ShortClass, nme.UNARY_-, NEG) + addPrimitives(ShortClass, nme.UNARY_~, NOT) + + addPrimitives(ShortClass, nme.toFloat, S2F) + addPrimitives(ShortClass, nme.toDouble, S2D) + + // scala.Char + lazy val CharClass = defn.CharClass + addPrimitives(CharClass, nme.EQ, EQ) + addPrimitives(CharClass, nme.NE, NE) + addPrimitives(CharClass, nme.ADD, ADD) + addPrimitives(CharClass, nme.SUB, SUB) + addPrimitives(CharClass, nme.MUL, MUL) + addPrimitives(CharClass, nme.DIV, DIV) + addPrimitives(CharClass, nme.MOD, MOD) + addPrimitives(CharClass, nme.LT, LT) + addPrimitives(CharClass, nme.LE, LE) + addPrimitives(CharClass, nme.GT, GT) + addPrimitives(CharClass, nme.GE, GE) + addPrimitives(CharClass, nme.XOR, XOR) + addPrimitives(CharClass, nme.OR, OR) + addPrimitives(CharClass, nme.AND, AND) + addPrimitives(CharClass, nme.LSL, LSL) + addPrimitives(CharClass, nme.LSR, LSR) + addPrimitives(CharClass, nme.ASR, ASR) + // conversions + addPrimitives(CharClass, nme.toByte, C2B) + addPrimitives(CharClass, nme.toShort, C2S) + addPrimitives(CharClass, nme.toChar, C2C) + 
addPrimitives(CharClass, nme.toInt, C2I) + addPrimitives(CharClass, nme.toLong, C2L) + // unary methods + addPrimitives(CharClass, nme.UNARY_+, POS) + addPrimitives(CharClass, nme.UNARY_-, NEG) + addPrimitives(CharClass, nme.UNARY_~, NOT) + addPrimitives(CharClass, nme.toFloat, C2F) + addPrimitives(CharClass, nme.toDouble, C2D) + + // scala.Int + lazy val IntClass = defn.IntClass + addPrimitives(IntClass, nme.EQ, EQ) + addPrimitives(IntClass, nme.NE, NE) + addPrimitives(IntClass, nme.ADD, ADD) + addPrimitives(IntClass, nme.SUB, SUB) + addPrimitives(IntClass, nme.MUL, MUL) + addPrimitives(IntClass, nme.DIV, DIV) + addPrimitives(IntClass, nme.MOD, MOD) + addPrimitives(IntClass, nme.LT, LT) + addPrimitives(IntClass, nme.LE, LE) + addPrimitives(IntClass, nme.GT, GT) + addPrimitives(IntClass, nme.GE, GE) + addPrimitives(IntClass, nme.XOR, XOR) + addPrimitives(IntClass, nme.OR, OR) + addPrimitives(IntClass, nme.AND, AND) + addPrimitives(IntClass, nme.LSL, LSL) + addPrimitives(IntClass, nme.LSR, LSR) + addPrimitives(IntClass, nme.ASR, ASR) + // conversions + addPrimitives(IntClass, nme.toByte, I2B) + addPrimitives(IntClass, nme.toShort, I2S) + addPrimitives(IntClass, nme.toChar, I2C) + addPrimitives(IntClass, nme.toInt, I2I) + addPrimitives(IntClass, nme.toLong, I2L) + // unary methods + addPrimitives(IntClass, nme.UNARY_+, POS) + addPrimitives(IntClass, nme.UNARY_-, NEG) + addPrimitives(IntClass, nme.UNARY_~, NOT) + addPrimitives(IntClass, nme.toFloat, I2F) + addPrimitives(IntClass, nme.toDouble, I2D) + + // scala.Long + lazy val LongClass = defn.LongClass + addPrimitives(LongClass, nme.EQ, EQ) + addPrimitives(LongClass, nme.NE, NE) + addPrimitives(LongClass, nme.ADD, ADD) + addPrimitives(LongClass, nme.SUB, SUB) + addPrimitives(LongClass, nme.MUL, MUL) + addPrimitives(LongClass, nme.DIV, DIV) + addPrimitives(LongClass, nme.MOD, MOD) + addPrimitives(LongClass, nme.LT, LT) + addPrimitives(LongClass, nme.LE, LE) + addPrimitives(LongClass, nme.GT, GT) + addPrimitives(LongClass, nme.GE, GE) + addPrimitives(LongClass, nme.XOR, XOR) + addPrimitives(LongClass, nme.OR, OR) + addPrimitives(LongClass, nme.AND, AND) + addPrimitives(LongClass, nme.LSL, LSL) + addPrimitives(LongClass, nme.LSR, LSR) + addPrimitives(LongClass, nme.ASR, ASR) + // conversions + addPrimitives(LongClass, nme.toByte, L2B) + addPrimitives(LongClass, nme.toShort, L2S) + addPrimitives(LongClass, nme.toChar, L2C) + addPrimitives(LongClass, nme.toInt, L2I) + addPrimitives(LongClass, nme.toLong, L2L) + // unary methods + addPrimitives(LongClass, nme.UNARY_+, POS) + addPrimitives(LongClass, nme.UNARY_-, NEG) + addPrimitives(LongClass, nme.UNARY_~, NOT) + addPrimitives(LongClass, nme.toFloat, L2F) + addPrimitives(LongClass, nme.toDouble, L2D) + + // scala.Float + lazy val FloatClass = defn.FloatClass + addPrimitives(FloatClass, nme.EQ, EQ) + addPrimitives(FloatClass, nme.NE, NE) + addPrimitives(FloatClass, nme.ADD, ADD) + addPrimitives(FloatClass, nme.SUB, SUB) + addPrimitives(FloatClass, nme.MUL, MUL) + addPrimitives(FloatClass, nme.DIV, DIV) + addPrimitives(FloatClass, nme.MOD, MOD) + addPrimitives(FloatClass, nme.LT, LT) + addPrimitives(FloatClass, nme.LE, LE) + addPrimitives(FloatClass, nme.GT, GT) + addPrimitives(FloatClass, nme.GE, GE) + // conversions + addPrimitives(FloatClass, nme.toByte, F2B) + addPrimitives(FloatClass, nme.toShort, F2S) + addPrimitives(FloatClass, nme.toChar, F2C) + addPrimitives(FloatClass, nme.toInt, F2I) + addPrimitives(FloatClass, nme.toLong, F2L) + addPrimitives(FloatClass, nme.toFloat, F2F) + 
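The registrations above deliberately map the same method name on every numeric value class to the same code (for example `nme.ADD` on `Byte`, `Short`, `Char`, `Int` and `Long` all become `ADD`); it is then up to the backend to combine the code with the operand kind when picking an instruction. A rough, purely illustrative sketch of that last step, using JVM mnemonics only (not the actual GenBCode logic):

```scala
object OpcodeSketch {
  /** Illustrative only: picks a JVM mnemonic from a primitive code plus the
   *  operand kind. Sub-int types (Byte/Short/Char) widen to Int, as on the JVM;
   *  the real backend works on typed IR and ASM opcodes instead of strings.
   */
  def mnemonic(code: String, kind: String): String = {
    val prefix = kind match {
      case "Byte" | "Short" | "Char" | "Int" => "I"
      case "Long"                            => "L"
      case "Float"                           => "F"
      case "Double"                          => "D"
      case other => sys.error(s"no arithmetic primitive for $other")
    }
    code match {
      case "ADD" | "SUB" | "MUL" | "DIV" | "REM" => prefix + code
      case other => sys.error(s"not an arithmetic code: $other")
    }
  }

  def main(args: Array[String]): Unit = {
    println(mnemonic("ADD", "Int"))    // IADD
    println(mnemonic("REM", "Double")) // DREM
  }
}
```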
addPrimitives(FloatClass, nme.toDouble, F2D) + // unary methods + addPrimitives(FloatClass, nme.UNARY_+, POS) + addPrimitives(FloatClass, nme.UNARY_-, NEG) + + // scala.Double + lazy val DoubleClass = defn.DoubleClass + addPrimitives(DoubleClass, nme.EQ, EQ) + addPrimitives(DoubleClass, nme.NE, NE) + addPrimitives(DoubleClass, nme.ADD, ADD) + addPrimitives(DoubleClass, nme.SUB, SUB) + addPrimitives(DoubleClass, nme.MUL, MUL) + addPrimitives(DoubleClass, nme.DIV, DIV) + addPrimitives(DoubleClass, nme.MOD, MOD) + addPrimitives(DoubleClass, nme.LT, LT) + addPrimitives(DoubleClass, nme.LE, LE) + addPrimitives(DoubleClass, nme.GT, GT) + addPrimitives(DoubleClass, nme.GE, GE) + // conversions + addPrimitives(DoubleClass, nme.toByte, D2B) + addPrimitives(DoubleClass, nme.toShort, D2S) + addPrimitives(DoubleClass, nme.toChar, D2C) + addPrimitives(DoubleClass, nme.toInt, D2I) + addPrimitives(DoubleClass, nme.toLong, D2L) + addPrimitives(DoubleClass, nme.toFloat, D2F) + addPrimitives(DoubleClass, nme.toDouble, D2D) + // unary methods + addPrimitives(DoubleClass, nme.UNARY_+, POS) + addPrimitives(DoubleClass, nme.UNARY_-, NEG) + + + primitives + } + + def isPrimitive(sym: Symbol): Boolean = + primitives.contains(sym) + + def isPrimitive(fun: Tree): Boolean = + given Context = ictx + primitives.contains(fun.symbol) + || (fun.symbol == NoSymbol // the only trees that do not have a symbol assigned are array.{update,select,length,clone}} + && { + fun match + case Select(_, StdNames.nme.clone_) => false // but array.clone is NOT a primitive op. + case _ => true + }) +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/GenSJSIR.scala b/tests/pos-with-compiler-cc/backend/sjs/GenSJSIR.scala new file mode 100644 index 000000000000..1579b4577933 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/GenSJSIR.scala @@ -0,0 +1,23 @@ +package dotty.tools.backend.sjs + +import dotty.tools.dotc.core._ +import Contexts._ +import Phases._ + +/** Generates Scala.js IR files for the compilation unit. 
*/ +class GenSJSIR extends Phase { + + override def phaseName: String = GenSJSIR.name + + override def description: String = GenSJSIR.description + + override def isRunnable(using Context): Boolean = + super.isRunnable && ctx.settings.scalajs.value + + def run(using Context): Unit = + new JSCodeGen().run() +} + +object GenSJSIR: + val name: String = "genSJSIR" + val description: String = "generate .sjsir files for Scala.js" diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSCodeGen.scala b/tests/pos-with-compiler-cc/backend/sjs/JSCodeGen.scala new file mode 100644 index 000000000000..87d816e56192 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSCodeGen.scala @@ -0,0 +1,4897 @@ +package dotty.tools.backend.sjs + +import scala.language.unsafeNulls + +import scala.annotation.switch +import scala.collection.mutable + +import dotty.tools.FatalError +import dotty.tools.dotc.CompilationUnit +import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.core._ +import Contexts._ +import Decorators._ +import Flags._ +import Names._ +import NameKinds.DefaultGetterName +import Types._ +import Symbols._ +import Phases._ +import StdNames._ +import TypeErasure.ErasedValueType + +import dotty.tools.dotc.transform.{Erasure, ValueClasses} +import dotty.tools.dotc.transform.SymUtils._ +import dotty.tools.dotc.util.SourcePosition +import dotty.tools.dotc.report + +import org.scalajs.ir +import org.scalajs.ir.{ClassKind, Position, Names => jsNames, Trees => js, Types => jstpe} +import org.scalajs.ir.Names.{ClassName, MethodName, SimpleMethodName} +import org.scalajs.ir.OriginalName +import org.scalajs.ir.OriginalName.NoOriginalName +import org.scalajs.ir.Trees.OptimizerHints + +import dotty.tools.dotc.transform.sjs.JSSymUtils._ + +import JSEncoding._ +import ScopedVar.withScopedVars +import annotation.retains + +/** Main codegen for Scala.js IR. + * + * [[GenSJSIR]] creates one instance of `JSCodeGen` per compilation unit. + * The `run()` method processes the whole compilation unit and generates + * `.sjsir` files for it. + * + * There are 4 main levels of translation: + * + * - `genCompilationUnit()` iterates through all the type definitions in the + * compilation unit. Each generated `js.ClassDef` is serialized to an + * `.sjsir` file. + * - `genScalaClass()` and other similar methods generate the skeleton of + * classes. + * - `genMethod()` and similar methods generate the declarations of methods. + * - `genStatOrExpr()` and everything else generate the bodies of methods. 
+ */ +class JSCodeGen()(using genCtx: DetachedContext) { + import JSCodeGen._ + import tpd._ + + val sjsPlatform = dotty.tools.dotc.config.SJSPlatform.sjsPlatform + val jsdefn = JSDefinitions.jsdefn + private val primitives = new JSPrimitives(genCtx) + + val positionConversions = new JSPositions()(using genCtx) + import positionConversions._ + + private val jsExportsGen = new JSExportsGen(this) + + // Some state -------------------------------------------------------------- + + private val lazilyGeneratedAnonClasses = new MutableSymbolMap[TypeDef] + private val generatedClasses = mutable.ListBuffer.empty[js.ClassDef] + private val generatedStaticForwarderClasses = mutable.ListBuffer.empty[(Symbol, js.ClassDef)] + + val currentClassSym: ScopedVar[Symbol] = new ScopedVar[Symbol] + private val currentMethodSym = new ScopedVar[Symbol] + private val localNames = new ScopedVar[LocalNameGenerator] + private val thisLocalVarIdent = new ScopedVar[Option[js.LocalIdent]] + private val isModuleInitialized = new ScopedVar[ScopedVar.VarBox[Boolean]] + private val undefinedDefaultParams = new ScopedVar[mutable.Set[Symbol]] + + /* Contextual JS class value for some operations of nested JS classes that need one. */ + private val contextualJSClassValue = new ScopedVar[Option[js.Tree]](None) + + /** Resets all of the scoped state in the context of `body`. */ + private def resetAllScopedVars[T](body: => T): T = { + withScopedVars( + currentClassSym := null, + currentMethodSym := null, + localNames := null, + thisLocalVarIdent := null, + isModuleInitialized := null, + undefinedDefaultParams := null + ) { + body + } + } + + private def withPerMethodBodyState[A](methodSym: Symbol)(body: => A): A = { + withScopedVars( + currentMethodSym := methodSym, + thisLocalVarIdent := None, + isModuleInitialized := new ScopedVar.VarBox(false), + undefinedDefaultParams := mutable.Set.empty, + ) { + body + } + } + + private def acquireContextualJSClassValue[A](f: Option[js.Tree] => A): A = { + val jsClassValue = contextualJSClassValue.get + withScopedVars( + contextualJSClassValue := None + ) { + f(jsClassValue) + } + } + + def withNewLocalNameScope[A](body: => A): A = { + withScopedVars(localNames := new LocalNameGenerator) { + body + } + } + + /** Implicitly materializes the current local name generator. */ + implicit def implicitLocalNames: LocalNameGenerator = localNames.get + + def currentThisType: jstpe.Type = { + encodeClassType(currentClassSym) match { + case tpe @ jstpe.ClassType(cls) => + jstpe.BoxedClassToPrimType.getOrElse(cls, tpe) + case tpe => + tpe + } + } + + /** Returns a new fresh local identifier. */ + private def freshLocalIdent()(implicit pos: Position): js.LocalIdent = + localNames.get.freshLocalIdent() + + /** Returns a new fresh local identifier. */ + def freshLocalIdent(base: String)(implicit pos: Position): js.LocalIdent = + localNames.get.freshLocalIdent(base) + + /** Returns a new fresh local identifier. 
*/ + private def freshLocalIdent(base: TermName)(implicit pos: Position): js.LocalIdent = + localNames.get.freshLocalIdent(base) + + private def consumeLazilyGeneratedAnonClass(sym: Symbol): TypeDef = { + val typeDef = lazilyGeneratedAnonClasses.remove(sym) + if (typeDef == null) { + throw new FatalError( + i"Could not find tree for lazily generated anonymous class ${sym.fullName} at ${sym.sourcePos}") + } else { + typeDef + } + } + + // Compilation unit -------------------------------------------------------- + + def run(): Unit = { + try { + genCompilationUnit(ctx.compilationUnit) + } finally { + generatedClasses.clear() + generatedStaticForwarderClasses.clear() + } + } + + /** Generates the Scala.js IR for a compilation unit + * This method iterates over all the class and interface definitions + * found in the compilation unit and emits their IR (.sjsir). + * + * Some classes are never actually emitted: + * - Classes representing primitive types + * - The scala.Array class + * + * TODO Some classes representing anonymous functions are not actually emitted. + * Instead, a temporary representation of their `apply` method is built + * and recorded, so that it can be inlined as a JavaScript anonymous + * function in the method that instantiates it. + * + * Other ClassDefs are emitted according to their nature: + * * Non-native JS class -> `genNonNativeJSClass()` + * * Other JS type (<: js.Any) -> `genRawJSClassData()` + * * Interface -> `genInterface()` + * * Normal class -> `genClass()` + */ + private def genCompilationUnit(cunit: CompilationUnit): Unit = { + def collectTypeDefs(tree: Tree): List[TypeDef] = { + tree match { + case EmptyTree => Nil + case PackageDef(_, stats) => stats.flatMap(collectTypeDefs) + case cd: TypeDef => cd :: Nil + case _: ValDef => Nil // module instance + } + } + val allTypeDefs = collectTypeDefs(cunit.tpdTree) + + /* #13221 Set JavaStatic on all the Module fields of static module classes. + * This is necessary for `desugarIdent` not to crash in some obscure + * scenarios. + * + * !!! Part of this logic is duplicated in BCodeSkelBuilder.genPlainClass + * + * However, here we only do this for Module fields, not all fields. + */ + for (typeDef <- allTypeDefs) { + if (typeDef.symbol.is(ModuleClass)) { + typeDef.symbol.info.decls.foreach { f => + if (f.isField && f.is(Module)) + f.setFlag(JavaStatic) + } + } + } + + val (anonJSClassTypeDefs, otherTypeDefs) = + allTypeDefs.partition(td => td.symbol.isAnonymousClass && td.symbol.isJSType) + + // Record the TypeDefs of anonymous JS classes to be lazily generated + for (td <- anonJSClassTypeDefs) + lazilyGeneratedAnonClasses(td.symbol) = td + + /* Finally, we emit true code for the remaining class defs. */ + for (td <- otherTypeDefs) { + val sym = td.symbol + implicit val pos: Position = sym.span + + /* Do not actually emit code for primitive types nor scala.Array. */ + val isPrimitive = + sym.isPrimitiveValueClass || sym == defn.ArrayClass + + if (!isPrimitive) { + withScopedVars( + currentClassSym := sym + ) { + val tree = if (sym.isJSType) { + if (!sym.is(Trait) && sym.isNonNativeJSClass) + genNonNativeJSClass(td) + else + genRawJSClassData(td) + } else if (sym.is(Trait)) { + genInterface(td) + } else { + genScalaClass(td) + } + + generatedClasses += tree + } + } + } + + for (tree <- generatedClasses) + genIRFile(cunit, tree) + + if (generatedStaticForwarderClasses.nonEmpty) { + /* #4148 Add generated static forwarder classes, except those that + * would collide with regular classes on case insensitive file systems. 
+ */ + + /* I could not find any reference anywhere about what locale is used + * by case insensitive file systems to compare case-insensitively. + * In doubt, force the English locale, which is probably going to do + * the right thing in virtually all cases (especially if users stick + * to ASCII class names), and it has the merit of being deterministic, + * as opposed to using the OS' default locale. + * The JVM backend performs a similar test to emit a warning for + * conflicting top-level classes. However, it uses `toLowerCase()` + * without argument, which is not deterministic. + */ + def caseInsensitiveNameOf(classDef: js.ClassDef): String = + classDef.name.name.nameString.toLowerCase(java.util.Locale.ENGLISH) + + val generatedCaseInsensitiveNames = + generatedClasses.map(caseInsensitiveNameOf).toSet + + for ((site, classDef) <- generatedStaticForwarderClasses) { + if (!generatedCaseInsensitiveNames.contains(caseInsensitiveNameOf(classDef))) { + genIRFile(cunit, classDef) + } else { + report.warning( + s"Not generating the static forwarders of ${classDef.name.name.nameString} " + + "because its name differs only in case from the name of another class or trait in this compilation unit.", + site.srcPos) + } + } + } + } + + private def genIRFile(cunit: CompilationUnit, tree: ir.Trees.ClassDef): Unit = { + val outfile = getFileFor(cunit, tree.name.name, ".sjsir") + val output = outfile.bufferedOutput + try { + ir.Serializers.serialize(output, tree) + } finally { + output.close() + } + } + + private def getFileFor(cunit: CompilationUnit, className: ClassName, + suffix: String): dotty.tools.io.AbstractFile = { + val outputDirectory = ctx.settings.outputDir.value + val pathParts = className.nameString.split('.') + val dir = pathParts.init.foldLeft(outputDirectory)(_.subdirectoryNamed(_)) + val filename = pathParts.last + dir.fileNamed(filename + suffix) + } + + // Generate a class -------------------------------------------------------- + + /** Gen the IR ClassDef for a Scala class definition (maybe a module class). 
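A concrete illustration of the locale point made above (JDK only): with the default locale, `toLowerCase()` is not stable across machines, which is exactly what pinning `Locale.ENGLISH` avoids.

```scala
import java.util.Locale

object CaseFoldSketch {
  def main(args: Array[String]): Unit = {
    val name = "TopLevelImpl"
    // Locale-independent result, as used by caseInsensitiveNameOf:
    println(name.toLowerCase(Locale.ENGLISH))                 // toplevelimpl
    // Locale-dependent: under a Turkish locale, uppercase 'I' folds to dotless U+0131.
    println(name.toLowerCase(Locale.forLanguageTag("tr-TR"))) // topleveli?mpl (dotless i)
  }
}
```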
+ */ + private def genScalaClass(td: TypeDef): js.ClassDef = { + val sym = td.symbol.asClass + implicit val pos: SourcePosition = sym.sourcePos + + assert(!sym.is(Trait), + "genScalaClass() must be called only for normal classes: "+sym) + assert(sym.superClass != NoSymbol, sym) + + if (hasDefaultCtorArgsAndJSModule(sym)) { + report.error( + "Implementation restriction: " + + "constructors of Scala classes cannot have default parameters if their companion module is JS native.", + td) + } + + val classIdent = encodeClassNameIdent(sym) + val originalName = originalNameOfClass(sym) + val isHijacked = false //isHijackedBoxedClass(sym) + + // Optimizer hints + + val isDynamicImportThunk = sym.isSubClass(jsdefn.DynamicImportThunkClass) + + def isStdLibClassWithAdHocInlineAnnot(sym: Symbol): Boolean = { + val fullName = sym.fullName.toString + (fullName.startsWith("scala.Tuple") && !fullName.endsWith("$")) || + (fullName.startsWith("scala.collection.mutable.ArrayOps$of")) + } + + val shouldMarkInline = ( + isDynamicImportThunk || + sym.hasAnnotation(jsdefn.InlineAnnot) || + (sym.isAnonymousFunction && !sym.isSubClass(defn.PartialFunctionClass)) || + isStdLibClassWithAdHocInlineAnnot(sym)) + + val optimizerHints = { + OptimizerHints.empty + .withInline(shouldMarkInline) + .withNoinline(sym.hasAnnotation(jsdefn.NoinlineAnnot)) + } + + // Generate members (constructor + methods) + + val generatedNonFieldMembers = new mutable.ListBuffer[js.MemberDef] + + val tpl = td.rhs.asInstanceOf[Template] + for (tree <- tpl.constr :: tpl.body) { + tree match { + case EmptyTree => () + + case vd: ValDef => + // fields are added via genClassFields(), but we need to generate the JS native members + val sym = vd.symbol + if (!sym.is(Module) && sym.hasAnnotation(jsdefn.JSNativeAnnot)) + generatedNonFieldMembers += genJSNativeMemberDef(vd) + + case dd: DefDef => + val sym = dd.symbol + if sym.hasAnnotation(jsdefn.JSNativeAnnot) then + if !sym.is(Accessor) then + generatedNonFieldMembers += genJSNativeMemberDef(dd) + else + generatedNonFieldMembers ++= genMethod(dd) + + case _ => + throw new FatalError("Illegal tree in body of genScalaClass(): " + tree) + } + } + + // Generate fields and add to methods + ctors + val generatedMembers = genClassFields(td) ++ generatedNonFieldMembers.toList + + // Generate member exports + val memberExports = jsExportsGen.genMemberExports(sym) + + // Generate top-level export definitions + val topLevelExportDefs = jsExportsGen.genTopLevelExports(sym) + + // Static initializer + val optStaticInitializer = { + // Initialization of reflection data, if required + val reflectInit = { + val enableReflectiveInstantiation = { + sym.baseClasses.exists { ancestor => + ancestor.hasAnnotation(jsdefn.EnableReflectiveInstantiationAnnot) + } + } + if (enableReflectiveInstantiation) + genRegisterReflectiveInstantiation(sym).toList + else + Nil + } + + // Initialization of the module because of field exports + val needsStaticModuleInit = + topLevelExportDefs.exists(_.isInstanceOf[js.TopLevelFieldExportDef]) + val staticModuleInit = + if (!needsStaticModuleInit) Nil + else List(genLoadModule(sym)) + + val staticInitializerStats = reflectInit ::: staticModuleInit + if (staticInitializerStats.nonEmpty) + List(genStaticConstructorWithStats(ir.Names.StaticInitializerName, js.Block(staticInitializerStats))) + else + Nil + } + + val optDynamicImportForwarder = + if (isDynamicImportThunk) List(genDynamicImportForwarder(sym)) + else Nil + + val allMemberDefsExceptStaticForwarders = + generatedMembers ::: 
memberExports ::: optStaticInitializer ::: optDynamicImportForwarder + + // Add static forwarders + val allMemberDefs = if (!isCandidateForForwarders(sym)) { + allMemberDefsExceptStaticForwarders + } else { + if (isStaticModule(sym)) { + /* If the module class has no linked class, we must create one to + * hold the static forwarders. Otherwise, this is going to be handled + * when generating the companion class. + */ + if (!sym.linkedClass.exists) { + val forwarders = genStaticForwardersFromModuleClass(Nil, sym) + if (forwarders.nonEmpty) { + val forwardersClassDef = js.ClassDef( + js.ClassIdent(ClassName(classIdent.name.nameString.stripSuffix("$"))), + originalName, + ClassKind.Class, + None, + Some(js.ClassIdent(ir.Names.ObjectClass)), + Nil, + None, + None, + forwarders, + Nil + )(js.OptimizerHints.empty) + generatedStaticForwarderClasses += sym -> forwardersClassDef + } + } + allMemberDefsExceptStaticForwarders + } else { + val forwarders = genStaticForwardersForClassOrInterface( + allMemberDefsExceptStaticForwarders, sym) + allMemberDefsExceptStaticForwarders ::: forwarders + } + } + + // Hashed definitions of the class + val hashedDefs = ir.Hashers.hashMemberDefs(allMemberDefs) + + // The complete class definition + val kind = + if (isStaticModule(sym)) ClassKind.ModuleClass + else if (isHijacked) ClassKind.HijackedClass + else ClassKind.Class + + val classDefinition = js.ClassDef( + classIdent, + originalName, + kind, + None, + Some(encodeClassNameIdent(sym.superClass)), + genClassInterfaces(sym, forJSClass = false), + None, + None, + hashedDefs, + topLevelExportDefs)( + optimizerHints) + + classDefinition + } + + /** Gen the IR ClassDef for a Scala.js-defined JS class. */ + private def genNonNativeJSClass(td: TypeDef): js.ClassDef = { + val sym = td.symbol.asClass + implicit val pos: SourcePosition = sym.sourcePos + + assert(sym.isNonNativeJSClass, + i"genNonNativeJSClass() must be called only for non-native JS classes: $sym") + assert(sym.superClass != NoSymbol, sym) + + if (hasDefaultCtorArgsAndJSModule(sym)) { + report.error( + "Implementation restriction: " + + "constructors of non-native JS classes cannot have default parameters if their companion module is JS native.", + td) + } + + val classIdent = encodeClassNameIdent(sym) + val originalName = originalNameOfClass(sym) + + // Generate members (constructor + methods) + + val constructorTrees = new mutable.ListBuffer[DefDef] + val generatedMethods = new mutable.ListBuffer[js.MethodDef] + val dispatchMethodNames = new mutable.ListBuffer[JSName] + + val tpl = td.rhs.asInstanceOf[Template] + for (tree <- tpl.constr :: tpl.body) { + tree match { + case EmptyTree => () + + case _: ValDef => + () // fields are added via genClassFields() + + case dd: DefDef => + val sym = dd.symbol + val exposed = sym.isJSExposed + + if (sym.isClassConstructor) { + constructorTrees += dd + } else if (exposed && sym.is(Accessor, butNot = Lazy)) { + // Exposed accessors must not be emitted, since the field they access is enough. + } else if (sym.hasAnnotation(jsdefn.JSOptionalAnnot)) { + // Optional methods must not be emitted + } else { + generatedMethods ++= genMethod(dd) + + // Collect the names of the dispatchers we have to create + if (exposed && !sym.is(Deferred)) { + /* We add symbols that we have to expose here. This way we also + * get inherited stuff that is implemented in this class. 
+ */ + dispatchMethodNames += sym.jsName + } + } + + case _ => + throw new FatalError("Illegal tree in gen of genNonNativeJSClass(): " + tree) + } + } + + // Static members (exported from the companion object) + val staticMembers = { + val module = sym.companionModule + if (!module.exists) { + Nil + } else { + val companionModuleClass = module.moduleClass + val exports = withScopedVars(currentClassSym := companionModuleClass) { + jsExportsGen.genStaticExports(companionModuleClass) + } + if (exports.exists(_.isInstanceOf[js.JSFieldDef])) { + val classInitializer = + genStaticConstructorWithStats(ir.Names.ClassInitializerName, genLoadModule(companionModuleClass)) + exports :+ classInitializer + } else { + exports + } + } + } + + val topLevelExports = jsExportsGen.genTopLevelExports(sym) + + val (generatedConstructor, jsClassCaptures) = withNewLocalNameScope { + val isNested = sym.isNestedJSClass + + if (isNested) + localNames.reserveLocalName(JSSuperClassParamName) + + val (captures, ctor) = genJSClassCapturesAndConstructor(constructorTrees.toList) + + val jsClassCaptures = if (isNested) { + val superParam = js.ParamDef(js.LocalIdent(JSSuperClassParamName), + NoOriginalName, jstpe.AnyType, mutable = false) + Some(superParam :: captures) + } else { + assert(captures.isEmpty, s"found non nested JS class with captures $captures at $pos") + None + } + + (ctor, jsClassCaptures) + } + + // Generate fields (and add to methods + ctors) + val generatedMembers = { + genClassFields(td) ::: + generatedConstructor :: + jsExportsGen.genJSClassDispatchers(sym, dispatchMethodNames.result().distinct) ::: + generatedMethods.toList ::: + staticMembers + } + + // Hashed definitions of the class + val hashedMemberDefs = ir.Hashers.hashMemberDefs(generatedMembers) + + // The complete class definition + val kind = + if (isStaticModule(sym)) ClassKind.JSModuleClass + else ClassKind.JSClass + + val classDefinition = js.ClassDef( + classIdent, + originalNameOfClass(sym), + kind, + jsClassCaptures, + Some(encodeClassNameIdent(sym.superClass)), + genClassInterfaces(sym, forJSClass = true), + jsSuperClass = jsClassCaptures.map(_.head.ref), + None, + hashedMemberDefs, + topLevelExports)( + OptimizerHints.empty) + + classDefinition + } + + /** Gen the IR ClassDef for a raw JS class or trait. + */ + private def genRawJSClassData(td: TypeDef): js.ClassDef = { + val sym = td.symbol.asClass + implicit val pos: Position = sym.span + + val classIdent = encodeClassNameIdent(sym) + val kind = { + if (sym.is(Trait)) ClassKind.AbstractJSType + else if (sym.is(ModuleClass)) ClassKind.NativeJSModuleClass + else ClassKind.NativeJSClass + } + val superClass = + if (sym.is(Trait)) None + else Some(encodeClassNameIdent(sym.superClass)) + val jsNativeLoadSpec = computeJSNativeLoadSpecOfClass(sym) + + js.ClassDef( + classIdent, + originalNameOfClass(sym), + kind, + None, + superClass, + genClassInterfaces(sym, forJSClass = false), + None, + jsNativeLoadSpec, + Nil, + Nil)( + OptimizerHints.empty) + } + + /** Gen the IR ClassDef for an interface definition. 
+ */ + private def genInterface(td: TypeDef): js.ClassDef = { + val sym = td.symbol.asClass + implicit val pos: SourcePosition = sym.sourcePos + + val classIdent = encodeClassNameIdent(sym) + + val generatedMethods = new mutable.ListBuffer[js.MethodDef] + + val tpl = td.rhs.asInstanceOf[Template] + for (tree <- tpl.constr :: tpl.body) { + tree match { + case EmptyTree => () + case dd: DefDef => generatedMethods ++= genMethod(dd) + case _ => + throw new FatalError( + i"""Illegal tree in gen of genInterface(): $tree + |class = $td + |in ${ctx.compilationUnit}""") + } + } + + val superInterfaces = genClassInterfaces(sym, forJSClass = false) + + val genMethodsList = generatedMethods.toList + val allMemberDefs = + if (!isCandidateForForwarders(sym)) genMethodsList + else genMethodsList ::: genStaticForwardersForClassOrInterface(genMethodsList, sym) + + // Hashed definitions of the interface + val hashedDefs = ir.Hashers.hashMemberDefs(allMemberDefs) + + js.ClassDef( + classIdent, + originalNameOfClass(sym), + ClassKind.Interface, + None, + None, + superInterfaces, + None, + None, + hashedDefs, + Nil)( + OptimizerHints.empty) + } + + private def genClassInterfaces(sym: ClassSymbol, forJSClass: Boolean)( + implicit pos: Position): List[js.ClassIdent] = { + for { + intf <- sym.directlyInheritedTraits + if !(forJSClass && intf == defn.DynamicClass) + } yield { + encodeClassNameIdent(intf) + } + } + + // Static forwarders ------------------------------------------------------- + + /* This mimics the logic in BCodeHelpers.addForwarders and the code that + * calls it, except that we never have collisions with existing methods in + * the companion class. This is because in the IR, only methods with the + * same `MethodName` (including signature) and that are also + * `PublicStatic` would collide. There should never be an actual collision + * because the only `PublicStatic` methods that are otherwise generated are + * the bodies of SAMs, which have mangled names. If that assumption is + * broken, an error message is emitted asking the user to report a bug. + * + * It is important that we always emit forwarders, because some Java APIs + * actually have a public static method and a public instance method with + * the same name. For example the class `Integer` has a + * `def hashCode(): Int` and a `static def hashCode(Int): Int`. The JVM + * back-end considers them as colliding because they have the same name, + * but we must not. + * + * By default, we only emit forwarders for top-level objects, like the JVM + * back-end. However, if requested via a compiler option, we enable them + * for all static objects. This is important so we can implement static + * methods of nested static classes of JDK APIs (see scala-js/#3950). + */ + + /** Is the given Scala class, interface or module class a candidate for + * static forwarders? + * + * - the flag `-XnoForwarders` is not set to true, and + * - the symbol is static, and + * - either of both of the following is true: + * - the flag `-scalajsGenStaticForwardersForNonTopLevelObjects` is set to true, or + * - the symbol was originally at the package level + * + * Other than the Scala.js-specific flag, and the fact that we also consider + * interfaces, this performs the same tests as the JVM back-end. 
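A user-level view of what these forwarders buy (illustrative; the forwarder described in the comment is synthesized by the backend, not written by the user):

```scala
object Main {
  def main(args: Array[String]): Unit =
    println("hello")
}
// Besides the module class Main$, the backend emits a class `Main` containing a
// public static `main(String[])` that forwards to `Main$.MODULE$.main(args)`.
// On the JVM this is what lets `java Main` (or Java code in general) call into a
// Scala object without knowing the module encoding; the Scala.js backend mirrors
// the same convention in its IR, subject to the candidate test above.
```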
+ */ + def isCandidateForForwarders(sym: Symbol): Boolean = { + !ctx.settings.XnoForwarders.value && sym.isStatic && { + ctx.settings.scalajsGenStaticForwardersForNonTopLevelObjects.value || { + atPhase(flattenPhase) { + toDenot(sym).owner.is(PackageClass) + } + } + } + } + + /** Gen the static forwarders to the members of a class or interface for + * methods of its companion object. + * + * This is only done if there exists a companion object and it is not a JS + * type. + * + * Precondition: `isCandidateForForwarders(sym)` is true + */ + def genStaticForwardersForClassOrInterface( + existingMembers: List[js.MemberDef], sym: Symbol)( + implicit pos: SourcePosition): List[js.MemberDef] = { + val module = sym.companionModule + if (!module.exists) { + Nil + } else { + val moduleClass = module.moduleClass + if (!moduleClass.isJSType) + genStaticForwardersFromModuleClass(existingMembers, moduleClass) + else + Nil + } + } + + /** Gen the static forwarders for the methods of a module class. + * + * Precondition: `isCandidateForForwarders(moduleClass)` is true + */ + def genStaticForwardersFromModuleClass(existingMembers: List[js.MemberDef], + moduleClass: Symbol)( + implicit pos: SourcePosition): List[js.MemberDef] = { + + assert(moduleClass.is(ModuleClass), moduleClass) + + val existingPublicStaticMethodNames = existingMembers.collect { + case js.MethodDef(flags, name, _, _, _, _) + if flags.namespace == js.MemberNamespace.PublicStatic => + name.name + }.toSet + + val members = { + moduleClass.info.membersBasedOnFlags(required = Flags.Method, + excluded = Flags.ExcludedForwarder).map(_.symbol) + } + + def isExcluded(m: Symbol): Boolean = { + def hasAccessBoundary = m.accessBoundary(defn.RootClass) ne defn.RootClass + + def isOfJLObject: Boolean = m.owner eq defn.ObjectClass + + def isDefaultParamOfJSNativeDef: Boolean = { + m.name.is(DefaultGetterName) && { + val info = new DefaultParamInfo(m) + !info.isForConstructor && info.attachedMethod.hasAnnotation(jsdefn.JSNativeAnnot) + } + } + + m.is(Deferred) + || m.isConstructor + || hasAccessBoundary + || isOfJLObject + || m.hasAnnotation(jsdefn.JSNativeAnnot) || isDefaultParamOfJSNativeDef // #4557 + } + + val forwarders = for { + m <- members + if !isExcluded(m) + } yield { + withNewLocalNameScope { + val flags = js.MemberFlags.empty.withNamespace(js.MemberNamespace.PublicStatic) + val methodIdent = encodeMethodSym(m) + val originalName = originalNameOfMethod(m) + val jsParams = for { + (paramName, paramInfo) <- m.info.paramNamess.flatten.zip(m.info.paramInfoss.flatten) + } yield { + js.ParamDef(freshLocalIdent(paramName), NoOriginalName, + toIRType(paramInfo), mutable = false) + } + val resultType = toIRType(m.info.resultType) + + if (existingPublicStaticMethodNames.contains(methodIdent.name)) { + report.error( + "Unexpected situation: found existing public static method " + + s"${methodIdent.name.nameString} in the companion class of " + + s"${moduleClass.fullName}; cannot generate a static forwarder " + + "the method of the same name in the object." + + "Please report this as a bug in the Scala.js support in dotty.", + pos) + } + + js.MethodDef(flags, methodIdent, originalName, jsParams, resultType, Some { + genApplyMethod(genLoadModule(moduleClass), m, jsParams.map(_.ref)) + })(OptimizerHints.empty, None) + } + } + + forwarders.toList + } + + // Generate the fields of a class ------------------------------------------ + + /** Gen definitions for the fields of a class. 
*/ + private def genClassFields(td: TypeDef): List[js.MemberDef] = { + val classSym = td.symbol.asClass + assert(currentClassSym.get == classSym, + "genClassFields called with a ClassDef other than the current one") + + val isJSClass = classSym.isNonNativeJSClass + + // Term members that are neither methods nor modules are fields + classSym.info.decls.filter { f => + !f.isOneOf(MethodOrModule) && f.isTerm + && !f.hasAnnotation(jsdefn.JSNativeAnnot) + && !f.hasAnnotation(jsdefn.JSOptionalAnnot) + && !f.hasAnnotation(jsdefn.JSExportStaticAnnot) + }.flatMap({ f => + implicit val pos = f.span + + val isTopLevelExport = f.hasAnnotation(jsdefn.JSExportTopLevelAnnot) + val isJavaStatic = f.is(JavaStatic) + assert(!(isTopLevelExport && isJavaStatic), + em"found ${f.fullName} which is both a top-level export and a Java static") + val isStaticField = isTopLevelExport || isJavaStatic + + val namespace = if isStaticField then js.MemberNamespace.PublicStatic else js.MemberNamespace.Public + val mutable = isStaticField || f.is(Mutable) + + val flags = js.MemberFlags.empty.withMutable(mutable).withNamespace(namespace) + + val irTpe0 = + if (isJSClass) genExposedFieldIRType(f) + else if (isTopLevelExport) jstpe.AnyType + else toIRType(f.info) + + // scala-js/#4370 Fields cannot have type NothingType + val irTpe = + if (irTpe0 == jstpe.NothingType) encodeClassType(defn.NothingClass) + else irTpe0 + + if (isJSClass && f.isJSExposed) + js.JSFieldDef(flags, genExpr(f.jsName)(f.sourcePos), irTpe) :: Nil + else + val fieldIdent = encodeFieldSym(f) + val originalName = originalNameOfField(f) + val fieldDef = js.FieldDef(flags, fieldIdent, originalName, irTpe) + val optionalStaticFieldGetter = + if isJavaStatic then + // Here we are generating a public static getter for the static field, + // this is its API for other units. This is necessary for singleton + // enum values, which are backed by static fields. + val className = encodeClassName(classSym) + val body = js.Block( + js.LoadModule(className), + js.SelectStatic(className, fieldIdent)(irTpe)) + js.MethodDef(js.MemberFlags.empty.withNamespace(js.MemberNamespace.PublicStatic), + encodeStaticMemberSym(f), originalName, Nil, irTpe, + Some(body))( + OptimizerHints.empty, None) :: Nil + else + Nil + fieldDef :: optionalStaticFieldGetter + }).toList + } + + def genExposedFieldIRType(f: Symbol): jstpe.Type = { + val tpeEnteringPosterasure = atPhase(elimErasedValueTypePhase)(f.info) + tpeEnteringPosterasure match { + case tpe: ErasedValueType => + /* Here, we must store the field as the boxed representation of + * the value class. The default value of that field, as + * initialized at the time the instance is created, will + * therefore be null. This will not match the behavior we would + * get in a Scala class. To match the behavior, we would need to + * initialized to an instance of the boxed representation, with + * an underlying value set to the zero of its type. However we + * cannot implement that, so we live with the discrepancy. + * + * In dotc this is usually not an issue, because it unboxes `null` to + * the zero of the underlying type, unlike scalac which throws an NPE. + */ + jstpe.ClassType(encodeClassName(tpe.tycon.typeSymbol)) + + case _ => + // Other types are not boxed, so we can initialized them to their true zero. 
+ toIRType(f.info) + } + } + + // Static initializers ----------------------------------------------------- + + private def genStaticConstructorWithStats(name: MethodName, stats: js.Tree)( + implicit pos: Position): js.MethodDef = { + js.MethodDef( + js.MemberFlags.empty.withNamespace(js.MemberNamespace.StaticConstructor), + js.MethodIdent(name), + NoOriginalName, + Nil, + jstpe.NoType, + Some(stats))( + OptimizerHints.empty, None) + } + + private def genRegisterReflectiveInstantiation(sym: Symbol)( + implicit pos: SourcePosition): Option[js.Tree] = { + if (isStaticModule(sym)) + genRegisterReflectiveInstantiationForModuleClass(sym) + else if (sym.is(ModuleClass)) + None // scala-js#3228 + else if (sym.is(Lifted) && !sym.originalOwner.isClass) + None // scala-js#3227 + else + genRegisterReflectiveInstantiationForNormalClass(sym) + } + + private def genRegisterReflectiveInstantiationForModuleClass(sym: Symbol)( + implicit pos: SourcePosition): Option[js.Tree] = { + val fqcnArg = js.StringLiteral(sym.fullName.toString) + val runtimeClassArg = js.ClassOf(toTypeRef(sym.info)) + val loadModuleFunArg = + js.Closure(arrow = true, Nil, Nil, None, genLoadModule(sym), Nil) + + val stat = genApplyMethod( + genLoadModule(jsdefn.ReflectModule), + jsdefn.Reflect_registerLoadableModuleClass, + List(fqcnArg, runtimeClassArg, loadModuleFunArg)) + + Some(stat) + } + + private def genRegisterReflectiveInstantiationForNormalClass(sym: Symbol)( + implicit pos: SourcePosition): Option[js.Tree] = { + val ctors = + if (sym.is(Abstract)) Nil + else sym.info.member(nme.CONSTRUCTOR).alternatives.map(_.symbol).filter(m => !m.isOneOf(Private | Protected)) + + if (ctors.isEmpty) { + None + } else { + val constructorsInfos = for { + ctor <- ctors + } yield { + withNewLocalNameScope { + val (parameterTypes, formalParams, actualParams) = (for { + (paramName, paramInfo) <- ctor.info.paramNamess.flatten.zip(ctor.info.paramInfoss.flatten) + } yield { + val paramType = js.ClassOf(toTypeRef(paramInfo)) + val paramDef = js.ParamDef(freshLocalIdent(paramName), + NoOriginalName, jstpe.AnyType, mutable = false) + val actualParam = unbox(paramDef.ref, paramInfo) + (paramType, paramDef, actualParam) + }).unzip3 + + val paramTypesArray = js.JSArrayConstr(parameterTypes) + + val newInstanceFun = js.Closure(arrow = true, Nil, formalParams, None, { + js.New(encodeClassName(sym), encodeMethodSym(ctor), actualParams) + }, Nil) + + js.JSArrayConstr(List(paramTypesArray, newInstanceFun)) + } + } + + val fqcnArg = js.StringLiteral(sym.fullName.toString) + val runtimeClassArg = js.ClassOf(toTypeRef(sym.info)) + val ctorsInfosArg = js.JSArrayConstr(constructorsInfos) + + val stat = genApplyMethod( + genLoadModule(jsdefn.ReflectModule), + jsdefn.Reflect_registerInstantiatableClass, + List(fqcnArg, runtimeClassArg, ctorsInfosArg)) + + Some(stat) + } + } + + // Constructor of a non-native JS class ------------------------------------ + + def genJSClassCapturesAndConstructor(constructorTrees: List[DefDef])( + implicit pos: SourcePosition): (List[js.ParamDef], js.JSConstructorDef) = { + /* We need to merge all Scala constructors into a single one because the + * IR, like JavaScript, only allows a single one. + * + * We do this by applying: + * 1. Applying runtime type based dispatch, just like exports. + * 2. Splitting secondary ctors into parts before and after the `this` call. + * 3. Topo-sorting all constructor statements and including/excluding + * them based on the overload that was chosen. 
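A small user-level example of the situation being handled (assumes a Scala.js-enabled build with the Scala.js library on the classpath):

```scala
import scala.scalajs.js

// A non-native JS class with a primary and a secondary constructor.
class Point(val x: Double, val y: Double) extends js.Object {
  def this(x: Double) = this(x, 0.0)
}
// JavaScript classes only have a single `constructor`, so both Scala constructors
// are merged into one. The generated constructor dispatches on its arguments to
// pick an overload number (step 1), runs the secondary constructor's statements
// around the delegated `this(...)` call (step 2), and guards each statement with
// the range of overload numbers it belongs to (step 3).
```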
+ */ + + val (primaryTree :: Nil, secondaryTrees) = + constructorTrees.partition(_.symbol.isPrimaryConstructor): @unchecked + + val primaryCtor = genPrimaryJSClassCtor(primaryTree) + val secondaryCtors = secondaryTrees.map(genSecondaryJSClassCtor(_)) + + // VarDefs for the parameters of all constructors. + val paramVarDefs = for { + vparam <- constructorTrees.flatMap(_.paramss.flatten) + } yield { + val sym = vparam.symbol + val tpe = toIRType(sym.info) + js.VarDef(encodeLocalSym(sym), originalNameOfLocal(sym), tpe, mutable = true, jstpe.zeroOf(tpe))(vparam.span) + } + + /* organize constructors in a called-by tree + * (the implicit root is the primary constructor) + */ + val ctorTree = { + val ctorToChildren = secondaryCtors + .groupBy(_.targetCtor) + .withDefaultValue(Nil) + + /* when constructing the call-by tree, we use pre-order traversal to + * assign overload numbers. + * this puts all descendants of a ctor in a range of overloads numbers. + * + * this property is useful, later, when we need to make statements + * conditional based on the chosen overload. + */ + var nextOverloadNum = 0 + def subTree[T <: JSCtor](ctor: T): ConstructorTree[T] = { + val overloadNum = nextOverloadNum + nextOverloadNum += 1 + val subtrees = ctorToChildren(ctor.sym).map(subTree(_)) + new ConstructorTree(overloadNum, ctor, subtrees) + } + + subTree(primaryCtor) + } + + /* prepare overload dispatch for all constructors. + * as a side-product, we retrieve the capture parameters. + */ + val (exports, jsClassCaptures) = { + val exports = List.newBuilder[jsExportsGen.Exported] + val jsClassCaptures = List.newBuilder[js.ParamDef] + + def add(tree: ConstructorTree[_ <: JSCtor]): Unit = { + val (e, c) = genJSClassCtorDispatch(tree.ctor.sym, + tree.ctor.paramsAndInfo, tree.overloadNum) + exports += e + jsClassCaptures ++= c + tree.subCtors.foreach(add(_)) + } + + add(ctorTree) + + (exports.result(), jsClassCaptures.result()) + } + + // The name 'constructor' is used for error reporting here + val (formalArgs, restParam, overloadDispatchBody) = + jsExportsGen.genOverloadDispatch(JSName.Literal("constructor"), exports, jstpe.IntType) + + val overloadVar = js.VarDef(freshLocalIdent("overload"), NoOriginalName, + jstpe.IntType, mutable = false, overloadDispatchBody) + + val constructorBody = wrapJSCtorBody( + paramVarDefs :+ overloadVar, + genJSClassCtorBody(overloadVar.ref, ctorTree), + js.Undefined() :: Nil + ) + + val constructorDef = js.JSConstructorDef( + js.MemberFlags.empty.withNamespace(js.MemberNamespace.Constructor), + formalArgs, restParam, constructorBody)(OptimizerHints.empty, None) + + (jsClassCaptures, constructorDef) + } + + private def genPrimaryJSClassCtor(dd: DefDef): PrimaryJSCtor = { + val sym = dd.symbol + val Block(stats, _) = dd.rhs: @unchecked + assert(sym.isPrimaryConstructor, s"called with non-primary ctor: $sym") + + var jsSuperCall: Option[js.JSSuperConstructorCall] = None + val jsStats = List.newBuilder[js.Tree] + + /* Move all statements after the super constructor call since JS + * cannot access `this` before the super constructor call. + * + * dotc inserts statements before the super constructor call for param + * accessor initializers (including val's and var's declared in the + * params). We move those after the super constructor call, and are + * therefore executed later than for a Scala class. 
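The practical consequence of this reordering, sketched at the user level (illustrative; assumes a Scala.js-enabled build):

```scala
import scala.scalajs.js

class Label(val text: String) extends js.Object {
  println(s"text = $text")
}
// In a plain Scala class, the parameter accessor `text` is assigned before the
// superclass constructor body runs; in a non-native JS class it can only be
// assigned after `super()` returns, because JavaScript forbids touching `this`
// earlier. Statements that dotc places before the super constructor call are
// therefore moved after it, and run later than they would in a Scala class.
```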
+ */ + withPerMethodBodyState(sym) { + stats.foreach { + case tree @ Apply(fun @ Select(Super(This(_), _), _), args) + if fun.symbol.isClassConstructor => + assert(jsSuperCall.isEmpty, s"Found 2 JS Super calls at ${dd.sourcePos}") + implicit val pos: Position = tree.span + jsSuperCall = Some(js.JSSuperConstructorCall(genActualJSArgs(fun.symbol, args))) + + case stat => + val jsStat = genStat(stat) + assert(jsSuperCall.isDefined || !jsStat.isInstanceOf[js.VarDef], + "Trying to move a local VarDef after the super constructor call of a non-native JS class at " + + dd.sourcePos) + jsStats += jsStat + } + } + + assert(jsSuperCall.isDefined, + s"Did not find Super call in primary JS construtor at ${dd.sourcePos}") + + new PrimaryJSCtor(sym, genParamsAndInfo(sym, dd.paramss), + js.JSConstructorBody(Nil, jsSuperCall.get, jsStats.result())(dd.span)) + } + + private def genSecondaryJSClassCtor(dd: DefDef): SplitSecondaryJSCtor = { + val sym = dd.symbol + assert(!sym.isPrimaryConstructor, s"called with primary ctor $sym") + + def flattenBlocks(t: Tree): List[Tree] = t match { + case Block(stats, expr) => (stats :+ expr).flatMap(flattenBlocks) + case _ => t :: Nil + } + val stats = flattenBlocks(dd.rhs) + + val beforeThisCall = List.newBuilder[js.Tree] + var thisCall: Option[(Symbol, List[js.Tree])] = None + val afterThisCall = List.newBuilder[js.Tree] + + withPerMethodBodyState(sym) { + stats.foreach { + case tree @ Apply(fun @ Select(This(_), _), args) + if fun.symbol.isClassConstructor => + assert(thisCall.isEmpty, + s"duplicate this() call in secondary JS constructor at ${dd.sourcePos}") + + implicit val pos: Position = tree.span + val sym = fun.symbol + thisCall = Some((sym, genActualArgs(sym, args))) + + case stat => + val jsStat = genStat(stat) + if (thisCall.isEmpty) + beforeThisCall += jsStat + else + afterThisCall += jsStat + } + } + + assert(thisCall.isDefined, + i"could not find the this() call in secondary JS constructor at ${dd.sourcePos}:\n${stats.map(_.show).mkString("\n")}") + val Some((targetCtor, ctorArgs)) = thisCall: @unchecked + + new SplitSecondaryJSCtor(sym, genParamsAndInfo(sym, dd.paramss), + beforeThisCall.result(), targetCtor, ctorArgs, afterThisCall.result()) + } + + private def genParamsAndInfo(ctorSym: Symbol, + vparamss: List[ParamClause]): List[(Symbol, JSParamInfo)] = { + implicit val pos: SourcePosition = ctorSym.sourcePos + + val paramSyms = if (vparamss.isEmpty) Nil else vparamss.head.map(_.symbol) + paramSyms.zip(ctorSym.jsParamInfos) + } + + private def genJSClassCtorDispatch(ctorSym: Symbol, + allParamsAndInfos: List[(Symbol, JSParamInfo)], + overloadNum: Int): (jsExportsGen.Exported, List[js.ParamDef]) = { + + implicit val pos: SourcePosition = ctorSym.sourcePos + + /* `allParams` are the parameters as seen from inside the constructor body, + * i.e., the ones generated by the trees in the constructor body. + */ + val (captureParamsAndInfos, normalParamsAndInfos) = + allParamsAndInfos.partition(_._2.capture) + + /* For class captures, we need to generate different names than the ones + * used by the constructor body. This is necessary so that we can forward + * captures properly between constructor delegation calls. 
+ */ + val (jsClassCaptures, captureAssigns) = (for { + (param, info) <- captureParamsAndInfos + } yield { + val ident = freshLocalIdent(param.name.toTermName) + val jsClassCapture = + js.ParamDef(ident, originalNameOfLocal(param), toIRType(info.info), mutable = false) + val captureAssign = + js.Assign(genVarRef(param), jsClassCapture.ref) + (jsClassCapture, captureAssign) + }).unzip + + val normalInfos = normalParamsAndInfos.map(_._2).toIndexedSeq + + val jsExport = new jsExportsGen.Exported(ctorSym, normalInfos) { + def genBody(formalArgsRegistry: jsExportsGen.FormalArgsRegistry): js.Tree = { + val paramAssigns = for { + ((param, info), i) <- normalParamsAndInfos.zipWithIndex + } yield { + val rhs = jsExportsGen.genScalaArg(this, i, formalArgsRegistry, info, static = true, + captures = captureParamsAndInfos.map(pi => genVarRef(pi._1)))( + prevArgsCount => normalParamsAndInfos.take(prevArgsCount).map(pi => genVarRef(pi._1))) + + js.Assign(genVarRef(param), rhs) + } + + js.Block(captureAssigns ::: paramAssigns, js.IntLiteral(overloadNum)) + } + } + + (jsExport, jsClassCaptures) + } + + /** Generates a JS constructor body based on a constructor tree. */ + private def genJSClassCtorBody(overloadVar: js.VarRef, + ctorTree: ConstructorTree[PrimaryJSCtor])(implicit pos: Position): js.JSConstructorBody = { + + /* generates a statement that conditionally executes body iff the chosen + * overload is any of the descendants of `tree` (including itself). + * + * here we use the property from building the trees, that a set of + * descendants always has a range of overload numbers. + */ + def ifOverload(tree: ConstructorTree[_], body: js.Tree): js.Tree = body match { + case js.Skip() => js.Skip() + + case body => + val x = overloadVar + val cond = { + import tree.{lo, hi} + + if (lo == hi) { + js.BinaryOp(js.BinaryOp.Int_==, js.IntLiteral(lo), x) + } else { + val lhs = js.BinaryOp(js.BinaryOp.Int_<=, js.IntLiteral(lo), x) + val rhs = js.BinaryOp(js.BinaryOp.Int_<=, x, js.IntLiteral(hi)) + js.If(lhs, rhs, js.BooleanLiteral(false))(jstpe.BooleanType) + } + } + + js.If(cond, body, js.Skip())(jstpe.NoType) + } + + /* preStats / postStats use pre/post order traversal respectively to + * generate a topo-sorted sequence of statements. + */ + + def preStats(tree: ConstructorTree[SplitSecondaryJSCtor], + nextParamsAndInfo: List[(Symbol, JSParamInfo)]): js.Tree = { + val inner = tree.subCtors.map(preStats(_, tree.ctor.paramsAndInfo)) + + assert(tree.ctor.ctorArgs.size == nextParamsAndInfo.size, "param count mismatch") + val paramsInfosAndArgs = nextParamsAndInfo.zip(tree.ctor.ctorArgs) + + val (captureParamsInfosAndArgs, normalParamsInfosAndArgs) = + paramsInfosAndArgs.partition(_._1._2.capture) + + val captureAssigns = for { + ((param, _), arg) <- captureParamsInfosAndArgs + } yield { + js.Assign(genVarRef(param), arg) + } + + val normalAssigns = for { + (((param, info), arg), i) <- normalParamsInfosAndArgs.zipWithIndex + } yield { + val newArg = arg match { + case js.Transient(UndefinedParam) => + /* Go full circle: We have ignored the default param getter for + * this, we'll create it again. + * + * This seems not optimal: We could simply not ignore the calls to + * default param getters in the first place. + * + * However, this proves to be difficult: Because of translations in + * earlier phases, calls to default param getters may be assigned + * to temporary variables first (see the undefinedDefaultParams + * ScopedVar). 
If this happens, it becomes increasingly difficult + * to distinguish a default param getter call for a constructor + * call of *this* instance (in which case we would want to keep + * the default param getter call) from one for a *different* + * instance (in which case we would want to discard the default + * param getter call) + * + * Because of this, it ends up being easier to just re-create the + * default param getter call if necessary. + */ + implicit val pos: SourcePosition = tree.ctor.sym.sourcePos + jsExportsGen.genCallDefaultGetter(tree.ctor.sym, i, static = false, + captures = captureParamsInfosAndArgs.map(p => genVarRef(p._1._1)))( + prevArgsCount => normalParamsInfosAndArgs.take(prevArgsCount).map(p => genVarRef(p._1._1))) + + case arg => arg + } + + js.Assign(genVarRef(param), newArg) + } + + ifOverload(tree, js.Block( + inner ++ tree.ctor.beforeCall ++ captureAssigns ++ normalAssigns)) + } + + def postStats(tree: ConstructorTree[SplitSecondaryJSCtor]): js.Tree = { + val inner = tree.subCtors.map(postStats(_)) + ifOverload(tree, js.Block(tree.ctor.afterCall ++ inner)) + } + + val primaryCtor = ctorTree.ctor + val secondaryCtorTrees = ctorTree.subCtors + + wrapJSCtorBody( + secondaryCtorTrees.map(preStats(_, primaryCtor.paramsAndInfo)), + primaryCtor.body, + secondaryCtorTrees.map(postStats(_)) + ) + } + + private def wrapJSCtorBody(before: List[js.Tree], body: js.JSConstructorBody, + after: List[js.Tree]): js.JSConstructorBody = { + js.JSConstructorBody(before ::: body.beforeSuper, body.superCall, + body.afterSuper ::: after)(body.pos) + } + + private sealed trait JSCtor { + val sym: Symbol + val paramsAndInfo: List[(Symbol, JSParamInfo)] + } + + private class PrimaryJSCtor(val sym: Symbol, + val paramsAndInfo: List[(Symbol, JSParamInfo)], + val body: js.JSConstructorBody) extends JSCtor + + private class SplitSecondaryJSCtor(val sym: Symbol, + val paramsAndInfo: List[(Symbol, JSParamInfo)], + val beforeCall: List[js.Tree], + val targetCtor: Symbol, val ctorArgs: List[js.Tree], + val afterCall: List[js.Tree]) extends JSCtor + + private class ConstructorTree[Ctor <: JSCtor]( + val overloadNum: Int, val ctor: Ctor, + val subCtors: List[ConstructorTree[SplitSecondaryJSCtor]]) { + val lo: Int = overloadNum + val hi: Int = subCtors.lastOption.fold(lo)(_.hi) + + assert(lo <= hi, "bad overload range") + } + + // Generate a method ------------------------------------------------------- + + /** Generates the JSNativeMemberDef. */ + def genJSNativeMemberDef(tree: ValOrDefDef): js.JSNativeMemberDef = { + implicit val pos = tree.span + + val sym = tree.symbol + val flags = js.MemberFlags.empty.withNamespace(js.MemberNamespace.PublicStatic) + val methodName = encodeJSNativeMemberSym(sym) + val jsNativeLoadSpec = computeJSNativeLoadSpecOfValDef(sym) + js.JSNativeMemberDef(flags, methodName, jsNativeLoadSpec) + } + + private def genMethod(dd: DefDef): Option[js.MethodDef] = { + withScopedVars( + localNames := new LocalNameGenerator + ) { + genMethodWithCurrentLocalNameScope(dd) + } + } + + /** Gen JS code for a method definition in a class or in an impl class. + * On the JS side, method names are mangled to encode the full signature + * of the Scala method, as described in `JSEncoding`, to support + * overloading. + * + * Some methods are not emitted at all: + * - Primitives, since they are never actually called + * - Constructors of hijacked classes + * + * Constructors are emitted by generating their body as a statement. + * + * Other (normal) methods are emitted with `genMethodBody()`. 
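+ * For example, two overloads such as `def f(x: Int): Unit` and
+ * `def f(x: String): Unit` (illustrative names) receive distinct IR method
+ * names, because the parameter and result types are part of the encoded
+ * name; no runtime dispatch on argument types is ever needed.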
+ */
+ private def genMethodWithCurrentLocalNameScope(dd: DefDef): Option[js.MethodDef] = {
+ implicit val pos = dd.span
+ val sym = dd.symbol
+ val vparamss = dd.termParamss
+ val rhs = dd.rhs
+
+ /* Is this method a default accessor that should be ignored?
+ *
+ * This is the case iff one of the following applies:
+ * - It is a constructor default accessor and the linked class is a
+ * native JS class.
+ * - It is a default accessor for a native JS def, but with the caveat
+ * that its rhs must be `js.native` because of #4553.
+ *
+ * Both of those conditions can only happen if the default accessor is in
+ * a module class, so we use that as a fast way out. (But omitting that
+ * condition would not change the result.)
+ *
+ * This is different than `isJSDefaultParam` in `genApply`: we do not
+ * ignore default accessors of *non-native* JS types. Neither for
+ * constructor default accessor nor regular default accessors. We also
+ * do not need to worry about non-constructor members of native JS types,
+ * since for those, the entire member list is ignored in `genJSClassData`.
+ */
+ def isIgnorableDefaultParam: Boolean = {
+ sym.name.is(DefaultGetterName) && sym.owner.is(ModuleClass) && {
+ val info = new DefaultParamInfo(sym)
+ if (info.isForConstructor) {
+ /* This is a default accessor for a constructor parameter. Check
+ * whether the attached constructor is a native JS constructor,
+ * which is the case iff the linked class is a native JS type.
+ */
+ info.constructorOwner.hasAnnotation(jsdefn.JSNativeAnnot)
+ } else {
+ /* #4553 We need to ignore default accessors for JS native defs.
+ * However, because Scala.js <= 1.7.0 actually emitted code calling
+ * those accessors, we must keep default accessors that would
+ * compile. The only accessors we can actually get rid of are those
+ * that are `= js.native`.
+ */
+ !sym.owner.isJSType &&
+ info.attachedMethod.hasAnnotation(jsdefn.JSNativeAnnot) && {
+ dd.rhs match {
+ case MaybeAsInstanceOf(Apply(fun, _)) =>
+ fun.symbol == jsdefn.JSPackage_native
+ case _ =>
+ false
+ }
+ }
+ }
+ }
+ }
+
+ withPerMethodBodyState(sym) {
+ assert(vparamss.isEmpty || vparamss.tail.isEmpty,
+ "Malformed parameter list: " + vparamss)
+ val params = if (vparamss.isEmpty) Nil else vparamss.head.map(_.symbol)
+
+ val methodName = encodeMethodSym(sym)
+ val originalName = originalNameOfMethod(sym)
+
+ def jsParams = params.map(genParamDef(_))
+
+ if (primitives.isPrimitive(sym)) {
+ None
+ } else if (sym.is(Deferred) && currentClassSym.isNonNativeJSClass) {
+ // scala-js/#4409: Do not emit abstract methods in non-native JS classes
+ None
+ } else if (sym.is(Deferred)) {
+ Some(js.MethodDef(js.MemberFlags.empty, methodName, originalName,
+ jsParams, toIRType(patchedResultType(sym)), None)(
+ OptimizerHints.empty, None))
+ } else if (isIgnorableDefaultParam) {
+ // #11592
+ None
+ } else if (sym.is(Bridge) && sym.name.is(DefaultGetterName) && currentClassSym.isNonNativeJSClass) {
+ /* #12572 Bridges for default accessors in non-native JS classes must not be emitted,
+ * because they call another default accessor, making their entire body an
+ * <undefined-param> that cannot be eliminated.
+ * Such methods are never called anyway, because they are filtered out in
+ * JSExportsGen.defaultGetterDenot().
+ */ + None + } else /*if (sym.isClassConstructor && isHijackedBoxedClass(sym.owner)) { + None + } else*/ { + /*def isTraitImplForwarder = dd.rhs match { + case app: Apply => foreignIsImplClass(app.symbol.owner) + case _ => false + }*/ + + val shouldMarkInline = { + sym.hasAnnotation(jsdefn.InlineAnnot) || + sym.isAnonymousFunction + } + + val shouldMarkNoinline = { + sym.hasAnnotation(jsdefn.NoinlineAnnot) /*&& + !isTraitImplForwarder*/ + } + + val optimizerHints = { + OptimizerHints.empty + .withInline(shouldMarkInline) + .withNoinline(shouldMarkNoinline) + } + + val methodDef = { + if (sym.isClassConstructor) { + val namespace = js.MemberNamespace.Constructor + js.MethodDef(js.MemberFlags.empty.withNamespace(namespace), + methodName, originalName, jsParams, jstpe.NoType, Some(genStat(rhs)))( + optimizerHints, None) + } else { + val namespace = if (isMethodStaticInIR(sym)) { + if (sym.isPrivate) js.MemberNamespace.PrivateStatic + else js.MemberNamespace.PublicStatic + } else { + if (sym.isPrivate) js.MemberNamespace.Private + else js.MemberNamespace.Public + } + val resultIRType = toIRType(patchedResultType(sym)) + genMethodDef(namespace, methodName, originalName, + params, resultIRType, rhs, optimizerHints) + } + } + + Some(methodDef) + } + } + } + + /** Generates the MethodDef of a (non-constructor) method + * + * Most normal methods are emitted straightforwardly. If the result + * type is Unit, then the body is emitted as a statement. Otherwise, it is + * emitted as an expression. + * + * Instance methods in non-native JS classes are compiled as static methods + * taking an explicit parameter for their `this` value. Static methods in + * non-native JS classes are compiled as is, like methods in Scala classes. + */ + private def genMethodDef(namespace: js.MemberNamespace, methodName: js.MethodIdent, + originalName: OriginalName, paramsSyms: List[Symbol], resultIRType: jstpe.Type, + tree: Tree, optimizerHints: OptimizerHints): js.MethodDef = { + implicit val pos = tree.span + + val jsParams = paramsSyms.map(genParamDef(_)) + + def genBody() = localNames.makeLabeledIfRequiresEnclosingReturn(resultIRType) { + if (resultIRType == jstpe.NoType) genStat(tree) + else genExpr(tree) + } + + if (namespace.isStatic || !currentClassSym.isNonNativeJSClass) { + val flags = js.MemberFlags.empty.withNamespace(namespace) + js.MethodDef(flags, methodName, originalName, jsParams, resultIRType, Some(genBody()))( + optimizerHints, None) + } else { + val thisLocalIdent = freshLocalIdent("this") + withScopedVars( + thisLocalVarIdent := Some(thisLocalIdent) + ) { + val staticNamespace = + if (namespace.isPrivate) js.MemberNamespace.PrivateStatic + else js.MemberNamespace.PublicStatic + val flags = + js.MemberFlags.empty.withNamespace(staticNamespace) + val thisParamDef = js.ParamDef(thisLocalIdent, thisOriginalName, + jstpe.AnyType, mutable = false) + + js.MethodDef(flags, methodName, originalName, + thisParamDef :: jsParams, resultIRType, Some(genBody()))( + optimizerHints, None) + } + } + } + + // ParamDefs --------------------------------------------------------------- + + def genParamDef(sym: Symbol): js.ParamDef = + genParamDef(sym, toIRType(sym.info)) + + private def genParamDef(sym: Symbol, ptpe: jstpe.Type): js.ParamDef = + genParamDef(sym, ptpe, sym.span) + + private def genParamDef(sym: Symbol, pos: Position): js.ParamDef = + genParamDef(sym, toIRType(sym.info), pos) + + private def genParamDef(sym: Symbol, ptpe: jstpe.Type, pos: Position): js.ParamDef = { + 
js.ParamDef(encodeLocalSym(sym)(implicitly, pos, implicitly), + originalNameOfLocal(sym), ptpe, mutable = false)(pos) + } + + // Generate statements and expressions ------------------------------------- + + /** Gen JS code for a tree in statement position (in the IR). + */ + private def genStat(tree: Tree): js.Tree = { + exprToStat(genStatOrExpr(tree, isStat = true)) + } + + /** Turn a JavaScript expression of type Unit into a statement */ + private def exprToStat(tree: js.Tree): js.Tree = { + /* Any JavaScript expression is also a statement, but at least we get rid + * of some pure expressions that come from our own codegen. + */ + implicit val pos = tree.pos + tree match { + case js.Block(stats :+ expr) => + js.Block(stats :+ exprToStat(expr)) + case _:js.Literal | _:js.This | _:js.VarRef => + js.Skip() + case _ => + tree + } + } + + /** Gen JS code for a tree in expression position (in the IR). + */ + private def genExpr(tree: Tree): js.Tree = { + val result = genStatOrExpr(tree, isStat = false) + assert(result.tpe != jstpe.NoType, + s"genExpr($tree) returned a tree with type NoType at pos ${tree.span}") + result + } + + def genExpr(name: JSName)(implicit pos: SourcePosition): js.Tree = name match { + case JSName.Literal(name) => js.StringLiteral(name) + case JSName.Computed(sym) => genComputedJSName(sym) + } + + private def genComputedJSName(sym: Symbol)(implicit pos: SourcePosition): js.Tree = { + /* By construction (i.e. restriction in PrepJSInterop), we know that sym + * must be a static method. + * Therefore, at this point, we can invoke it by loading its owner and + * calling it. + */ + def moduleOrGlobalScope = genLoadModuleOrGlobalScope(sym.owner) + def module = genLoadModule(sym.owner) + + if (sym.owner.isJSType) { + if (!sym.owner.isNonNativeJSClass || sym.isJSExposed) + genApplyJSMethodGeneric(sym, moduleOrGlobalScope, args = Nil, isStat = false) + else + genApplyJSClassMethod(module, sym, arguments = Nil) + } else { + genApplyMethod(module, sym, arguments = Nil) + } + } + + /** Gen JS code for a tree in expression position (in the IR) or the + * global scope. + */ + def genExprOrGlobalScope(tree: Tree): MaybeGlobalScope = { + implicit def pos: SourcePosition = tree.sourcePos + + tree match { + case _: This => + val sym = tree.symbol + if (sym != currentClassSym.get && sym.is(Module)) + genLoadModuleOrGlobalScope(sym) + else + MaybeGlobalScope.NotGlobalScope(genExpr(tree)) + + case _:Ident | _:Select => + val sym = tree.symbol + if (sym.is(Module)) { + assert(!sym.is(PackageClass), "Cannot use package as value: " + tree) + genLoadModuleOrGlobalScope(sym) + } else { + MaybeGlobalScope.NotGlobalScope(genExpr(tree)) + } + + case Apply(fun, _) => + if (fun.symbol == jsdefn.JSDynamic_global) + MaybeGlobalScope.GlobalScope(pos) + else + MaybeGlobalScope.NotGlobalScope(genExpr(tree)) + + case _ => + MaybeGlobalScope.NotGlobalScope(genExpr(tree)) + } + } + + /** Gen JS code for a tree in statement or expression position (in the IR). + * + * This is the main transformation method. Each node of the Scala AST + * is transformed into an equivalent portion of the JS AST. 
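+ * For example, an `if` used in expression position, as in
+ * `val y = if (cond) 1 else 2` (illustrative), becomes a `js.If` typed with
+ * the IR type of the expression, whereas the same `if` in statement position
+ * is emitted with `NoType` and both branches are compiled as statements.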
+ */ + private def genStatOrExpr(tree: Tree, isStat: Boolean): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + + report.debuglog(" " + tree) + report.debuglog("") + + tree match { + /** Local val or var declaration */ + case tree @ ValDef(name, _, _) => + val sym = tree.symbol + val rhs = tree.rhs + val rhsTree = genExpr(rhs) + + rhsTree match { + case js.Transient(UndefinedParam) => + /* This is an intermediate assignment for default params on a + * js.Any. Add the symbol to the corresponding set to inform + * the Ident resolver how to replace it and don't emit the symbol. + */ + undefinedDefaultParams += sym + js.Skip() + case _ => + js.VarDef(encodeLocalSym(sym), originalNameOfLocal(sym), + toIRType(sym.info), sym.is(Mutable), rhsTree) + } + + case If(cond, thenp, elsep) => + val tpe = + if (isStat) jstpe.NoType + else toIRType(tree.tpe) + + js.If(genExpr(cond), genStatOrExpr(thenp, isStat), + genStatOrExpr(elsep, isStat))(tpe) + + case Labeled(bind, expr) => + js.Labeled(encodeLabelSym(bind.symbol), toIRType(tree.tpe), genStatOrExpr(expr, isStat)) + + case Return(expr, from) => + val fromSym = from.symbol + val label = + if (fromSym.is(Label)) encodeLabelSym(fromSym) + else localNames.get.getEnclosingReturnLabel() + js.Return(toIRType(expr.tpe) match { + case jstpe.NoType => js.Block(genStat(expr), js.Undefined()) + case _ => genExpr(expr) + }, label) + + case WhileDo(cond, body) => + val genCond = + if (cond == EmptyTree) js.BooleanLiteral(true) + else genExpr(cond) + js.While(genCond, genStat(body)) + + case t: Try => + genTry(t, isStat) + + case app: Apply => + genApply(app, isStat) + + case app: TypeApply => + genTypeApply(app) + + /*case app: ApplyDynamic => + genApplyDynamic(app)*/ + + case tree: This => + val currentClass = currentClassSym.get + val symIsModuleClass = tree.symbol.is(ModuleClass) + assert(tree.symbol == currentClass || symIsModuleClass, + s"Trying to access the this of another class: tree.symbol = ${tree.symbol}, class symbol = $currentClass") + if (symIsModuleClass && tree.symbol != currentClass) + genLoadModule(tree.symbol) + else + genThis() + + case Select(qualifier, _) => + val sym = tree.symbol + if (sym.is(Module)) { + assert(!sym.is(Package), "Cannot use package as value: " + tree) + genLoadModule(sym) + } else if (sym.is(JavaStatic)) { + genLoadStaticField(sym) + } else if (sym.hasAnnotation(jsdefn.JSNativeAnnot)) { + genJSNativeMemberSelect(tree) + } else { + val (field, boxed) = genAssignableField(sym, qualifier) + if (boxed) unbox(field, atPhase(elimErasedValueTypePhase)(sym.info)) + else field + } + + case tree: Ident => + desugarIdent(tree).fold[js.Tree] { + val sym = tree.symbol + assert(!sym.is(Package), "Cannot use package as value: " + tree) + if (sym.is(Module)) { + genLoadModule(sym) + } else if (undefinedDefaultParams.contains(sym)) { + /* This is a default parameter whose assignment was moved to + * a local variable. Put an undefined param instead. 
+ */
+ js.Transient(UndefinedParam)
+ } else {
+ genVarRef(sym)
+ }
+ } { select =>
+ genStatOrExpr(select, isStat)
+ }
+
+ case Literal(value) =>
+ import Constants._
+ value.tag match {
+ case UnitTag =>
+ js.Skip()
+ case BooleanTag =>
+ js.BooleanLiteral(value.booleanValue)
+ case ByteTag =>
+ js.ByteLiteral(value.byteValue)
+ case ShortTag =>
+ js.ShortLiteral(value.shortValue)
+ case CharTag =>
+ js.CharLiteral(value.charValue)
+ case IntTag =>
+ js.IntLiteral(value.intValue)
+ case LongTag =>
+ js.LongLiteral(value.longValue)
+ case FloatTag =>
+ js.FloatLiteral(value.floatValue)
+ case DoubleTag =>
+ js.DoubleLiteral(value.doubleValue)
+ case StringTag =>
+ js.StringLiteral(value.stringValue)
+ case NullTag =>
+ js.Null()
+ case ClazzTag =>
+ genClassConstant(value.typeValue)
+ }
+
+ case Block(stats, expr) =>
+ // #15419 Collapse { <undefined-param>; BoxedUnit } to <undefined-param>
+ val genStatsAndExpr0 = stats.map(genStat(_)) :+ genStatOrExpr(expr, isStat)
+ val genStatsAndExpr = genStatsAndExpr0 match {
+ case (undefParam @ js.Transient(UndefinedParam)) :: js.Undefined() :: Nil =>
+ undefParam :: Nil
+ case _ =>
+ genStatsAndExpr0
+ }
+ js.Block(genStatsAndExpr)
+
+ case Typed(expr, _) =>
+ expr match {
+ case _: Super => genThis()
+ case _ => genExpr(expr)
+ }
+
+ case Assign(lhs0, rhs) =>
+ val sym = lhs0.symbol
+ if (sym.is(JavaStaticTerm) && sym.source != ctx.compilationUnit.source)
+ throw new FatalError(s"Assignment to static member ${sym.fullName} not supported")
+ def genRhs = genExpr(rhs)
+ val lhs = lhs0 match {
+ case lhs: Ident => desugarIdent(lhs).getOrElse(lhs)
+ case lhs => lhs
+ }
+ lhs match {
+ case lhs: Select =>
+ val qualifier = lhs.qualifier
+
+ def ctorAssignment = (
+ currentMethodSym.get.name == nme.CONSTRUCTOR &&
+ currentMethodSym.get.owner == qualifier.symbol &&
+ qualifier.isInstanceOf[This]
+ )
+ // TODO This fails for OFFSET$x fields. Re-enable when we can.
+ /*if (!sym.is(Mutable) && !ctorAssignment)
+ throw new FatalError(s"Assigning to immutable field ${sym.fullName} at $pos")*/
+
+ if (sym.hasAnnotation(jsdefn.JSNativeAnnot)) {
+ /* This is an assignment to a @js.native field. Since we reject
+ * `@js.native var`s as compile errors, this can only happen in
+ * the constructor of the enclosing object.
+ * We simply ignore the assignment, since the field will not be
+ * emitted at all.
+ */
+ js.Skip()
+ } else {
+ val (field, boxed) = genAssignableField(sym, qualifier)
+ if (boxed) {
+ val genBoxedRhs = box(genRhs, atPhase(elimErasedValueTypePhase)(sym.info))
+ js.Assign(field, genBoxedRhs)
+ } else {
+ js.Assign(field, genRhs)
+ }
+ }
+
+ case _ =>
+ js.Assign(genVarRef(sym), genRhs)
+ }
+
+ /** Array constructor */
+ case javaSeqLiteral: JavaSeqLiteral =>
+ genJavaSeqLiteral(javaSeqLiteral)
+
+ /** A Match reaching the backend is supposed to be optimized as a switch */
+ case mtch: Match =>
+ genMatch(mtch, isStat)
+
+ case tree: Closure =>
+ genClosure(tree)
+
+ case EmptyTree =>
+ js.Skip()
+
+ case _ =>
+ throw new FatalError("Unexpected tree in genExpr: " +
+ tree + "/" + tree.getClass + " at: " + (tree.span: Position))
+ }
+ } // end of genStatOrExpr()
+
+ private def qualifierOf(fun: Tree): Tree = fun match {
+ case fun: Ident =>
+ fun.tpe match {
+ case TermRef(prefix: TermRef, _) => tpd.ref(prefix)
+ case TermRef(prefix: ThisType, _) => tpd.This(prefix.cls)
+ }
+ case Select(qualifier, _) =>
+ qualifier
+ case TypeApply(fun, _) =>
+ qualifierOf(fun)
+ }
+
+ /** Gen JS this of the current class.
+ * Normally encoded straightforwardly as a JS this.
+ * But must be replaced by the `thisLocalVarIdent` local variable if there + * is one. + */ + private def genThis()(implicit pos: Position): js.Tree = { + /*if (tryingToGenMethodAsJSFunction) { + throw new CancelGenMethodAsJSFunction( + "Trying to generate `this` inside the body") + }*/ + + thisLocalVarIdent.fold[js.Tree] { + js.This()(currentThisType) + } { thisLocalIdent => + js.VarRef(thisLocalIdent)(currentThisType) + } + } + + /** Gen IR code for a `try..catch` or `try..finally` block. + * + * `try..finally` blocks are compiled straightforwardly to `try..finally` + * blocks of the IR. + * + * `try..catch` blocks are a bit more subtle, as the IR does not have + * type-based selection of exceptions to catch. We thus encode explicitly + * the type tests, like in: + * + * ``` + * try { ... } + * catch (e) { + * if (e.isInstanceOf[IOException]) { ... } + * else if (e.isInstanceOf[Exception]) { ... } + * else { + * throw e; // default, re-throw + * } + * } + * ``` + * + * In addition, there are provisions to handle catching JavaScript + * exceptions (which do not extend `Throwable`) as wrapped in a + * `js.JavaScriptException`. + */ + private def genTry(tree: Try, isStat: Boolean): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + val Try(block, catches, finalizer) = tree + + val blockAST = genStatOrExpr(block, isStat) + + val resultType = + if (isStat) jstpe.NoType + else toIRType(tree.tpe) + + val handled = + if (catches.isEmpty) blockAST + else genTryCatch(blockAST, catches, resultType, isStat) + + genStat(finalizer) match { + case js.Skip() => handled + case ast => js.TryFinally(handled, ast) + } + } + + private def genTryCatch(body: js.Tree, catches: List[CaseDef], + resultType: jstpe.Type, + isStat: Boolean)(implicit pos: SourcePosition): js.Tree = { + val exceptIdent = freshLocalIdent("e") + val origExceptVar = js.VarRef(exceptIdent)(jstpe.AnyType) + + val mightCatchJavaScriptException = catches.exists { caseDef => + caseDef.pat match { + case Typed(Ident(nme.WILDCARD), tpt) => + isMaybeJavaScriptException(tpt.tpe) + case Ident(nme.WILDCARD) => + true + case pat @ Bind(_, _) => + isMaybeJavaScriptException(pat.symbol.info) + } + } + + val (exceptValDef, exceptVar) = if (mightCatchJavaScriptException) { + val valDef = js.VarDef(freshLocalIdent("e"), NoOriginalName, + encodeClassType(defn.ThrowableClass), mutable = false, js.WrapAsThrowable(origExceptVar)) + (valDef, valDef.ref) + } else { + (js.Skip(), origExceptVar) + } + + val elseHandler: js.Tree = js.Throw(origExceptVar) + + val handler = catches.foldRight(elseHandler) { (caseDef, elsep) => + implicit val pos: SourcePosition = caseDef.sourcePos + val CaseDef(pat, _, body) = caseDef + + // Extract exception type and variable + val (tpe, boundVar) = (pat match { + case Typed(Ident(nme.WILDCARD), tpt) => + (tpt.tpe, None) + case Ident(nme.WILDCARD) => + (defn.ThrowableType, None) + case Bind(_, _) => + val ident = encodeLocalSym(pat.symbol) + val origName = originalNameOfLocal(pat.symbol) + (pat.symbol.info, Some(ident, origName)) + }) + + // Generate the body that must be executed if the exception matches + val bodyWithBoundVar = (boundVar match { + case None => + genStatOrExpr(body, isStat) + case Some((boundVarIdent, boundVarOriginalName)) => + val castException = genAsInstanceOf(exceptVar, tpe) + js.Block( + js.VarDef(boundVarIdent, boundVarOriginalName, toIRType(tpe), + mutable = false, castException), + genStatOrExpr(body, isStat)) + }) + + // Generate the test + if (tpe =:= defn.ThrowableType) { + 
bodyWithBoundVar + } else { + val cond = genIsInstanceOf(exceptVar, tpe) + js.If(cond, bodyWithBoundVar, elsep)(resultType) + } + } + + js.TryCatch(body, exceptIdent, NoOriginalName, + js.Block(exceptValDef, handler))(resultType) + } + + /** Gen JS code for an Apply node (method call) + * + * There's a whole bunch of varieties of Apply nodes: regular method + * calls, super calls, constructor calls, isInstanceOf/asInstanceOf, + * primitives, JS calls, etc. They are further dispatched in here. + */ + private def genApply(tree: Apply, isStat: Boolean): js.Tree = { + implicit val pos = tree.span + val args = tree.args + val sym = tree.fun.symbol + + /* Is the method a JS default accessor, which should become an + * `UndefinedParam` rather than being compiled normally. + * + * This is true iff one of the following conditions apply: + * - It is a constructor default param for the constructor of a JS class. + * - It is a default param of an instance method of a native JS type. + * - It is a default param of an instance method of a non-native JS type + * and the attached method is exposed. + * - It is a default param for a native JS def. + * + * This is different than `isIgnorableDefaultParam` in + * `genMethodWithCurrentLocalNameScope`: we include here the default + * accessors of *non-native* JS types (unless the corresponding methods are + * not exposed). We also need to handle non-constructor members of native + * JS types. + */ + def isJSDefaultParam: Boolean = { + sym.name.is(DefaultGetterName) && { + val info = new DefaultParamInfo(sym) + if (info.isForConstructor) { + /* This is a default accessor for a constructor parameter. Check + * whether the attached constructor is a JS constructor, which is + * the case iff the linked class is a JS type. + */ + info.constructorOwner.isJSType + } else { + if (sym.owner.isJSType) { + /* The default accessor is in a JS type. It is a JS default + * param iff the enclosing class is native or the attached method + * is exposed. + */ + !sym.owner.isNonNativeJSClass || info.attachedMethod.isJSExposed + } else { + /* The default accessor is in a Scala type. It is a JS default + * param iff the attached method is a native JS def. This can + * only happen if the owner is a module class, which we test + * first as a fast way out. + */ + sym.owner.is(ModuleClass) && info.attachedMethod.hasAnnotation(jsdefn.JSNativeAnnot) + } + } + } + } + + tree.fun match { + case _ if isJSDefaultParam => + js.Transient(UndefinedParam) + + case Select(Super(_, _), _) => + genSuperCall(tree, isStat) + + case Select(New(_), nme.CONSTRUCTOR) => + genApplyNew(tree) + + case _ => + if (primitives.isPrimitive(tree)) { + genPrimitiveOp(tree, isStat) + } else if (Erasure.Boxing.isBox(sym)) { + // Box a primitive value (cannot be Unit) + val arg = args.head + makePrimitiveBox(genExpr(arg), arg.tpe) + } else if (Erasure.Boxing.isUnbox(sym)) { + // Unbox a primitive value (cannot be Unit) + val arg = args.head + makePrimitiveUnbox(genExpr(arg), tree.tpe) + } else { + genNormalApply(tree, isStat) + } + } + } + + /** Gen JS code for a super call, of the form Class.super[mix].fun(args). + * + * This does not include calls defined in mixin traits, as these are + * already desugared by the 'mixin' phase. Only calls to super classes + * remain. + * + * Since a class has exactly one direct superclass, and calling a method + * two classes above the current one is invalid in Scala, the `mix` item is + * irrelevant. 
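+ * For example, in a hypothetical `class B extends A`, both `super.m()` and
+ * `super[A].m()` written inside `B` compile to the same statically resolved
+ * call to `A.m` (via `genApplyMethodStatically`).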
+ */ + private def genSuperCall(tree: Apply, isStat: Boolean): js.Tree = { + implicit val pos = tree.span + val Apply(fun @ Select(sup @ Super(qual, _), _), args) = tree: @unchecked + val sym = fun.symbol + + if (sym == defn.Any_getClass) { + // The only primitive that is also callable as super call + js.GetClass(genThis()) + } else if (currentClassSym.isNonNativeJSClass) { + genJSSuperCall(tree, isStat) + } else { + /* #3013 `qual` can be `this.$outer()` in some cases since Scala 2.12, + * so we call `genExpr(qual)`, not just `genThis()`. + */ + val superCall = genApplyMethodStatically( + genExpr(qual), sym, genActualArgs(sym, args)) + + // Initialize the module instance just after the super constructor call. + if (isStaticModule(currentClassSym) && !isModuleInitialized.get.value && + currentMethodSym.get.isClassConstructor) { + isModuleInitialized.get.value = true + val className = encodeClassName(currentClassSym) + val thisType = jstpe.ClassType(className) + val initModule = js.StoreModule(className, js.This()(thisType)) + js.Block(superCall, initModule) + } else { + superCall + } + } + } + + /** Gen JS code for a constructor call (new). + * Further refined into: + * * new String(...) + * * new of a hijacked boxed class + * * new of an anonymous function class that was recorded as JS function + * * new of a raw JS class + * * new Array + * * regular new + */ + private def genApplyNew(tree: Apply): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + + val Apply(fun @ Select(New(tpt), nme.CONSTRUCTOR), args) = tree: @unchecked + val ctor = fun.symbol + val tpe = tpt.tpe + + assert(ctor.isClassConstructor, + "'new' call to non-constructor: " + ctor.name) + + val clsSym = tpe.typeSymbol + + if (isHijackedClass(clsSym)) { + genNewHijackedClass(clsSym, ctor, args.map(genExpr)) + } else /*if (translatedAnonFunctions contains tpe.typeSymbol) { + val functionMaker = translatedAnonFunctions(tpe.typeSymbol) + functionMaker(args map genExpr) + } else*/ if (clsSym.isJSType) { + genNewJSClass(tree) + } else { + toTypeRef(tpe) match { + case jstpe.ClassRef(className) => + js.New(className, encodeMethodSym(ctor), genActualArgs(ctor, args)) + + case other => + throw new FatalError(s"Non ClassRef cannot be instantiated: $other") + } + } + } + + /** Gen JS code for a call to a constructor of a hijacked class. + * Reroute them to the `new` method with the same signature in the + * companion object. + */ + private def genNewHijackedClass(clazz: Symbol, ctor: Symbol, + args: List[js.Tree])(implicit pos: SourcePosition): js.Tree = { + + val className = encodeClassName(clazz) + val initName = encodeMethodSym(ctor).name + val newName = MethodName(newSimpleMethodName, initName.paramTypeRefs, + jstpe.ClassRef(className)) + val newMethodIdent = js.MethodIdent(newName) + + js.ApplyStatic(js.ApplyFlags.empty, className, newMethodIdent, args)( + jstpe.ClassType(className)) + } + + /** Gen JS code for a new of a JS class (subclass of `js.Any`). 
*/ + private def genNewJSClass(tree: Apply): js.Tree = { + acquireContextualJSClassValue { jsClassValue => + implicit val pos: Position = tree.span + + val Apply(fun @ Select(New(tpt), _), args) = tree: @unchecked + val cls = tpt.tpe.typeSymbol + val ctor = fun.symbol + + val nestedJSClass = cls.isNestedJSClass + assert(jsClassValue.isDefined == nestedJSClass, + s"$cls at $pos: jsClassValue.isDefined = ${jsClassValue.isDefined} " + + s"but isInnerNonNativeJSClass = $nestedJSClass") + + def genArgs: List[js.TreeOrJSSpread] = genActualJSArgs(ctor, args) + def genArgsAsClassCaptures: List[js.Tree] = args.map(genExpr) + + jsClassValue.fold { + // Static JS class (by construction, it cannot be a module class, as their News do not reach the back-end) + if (cls == jsdefn.JSObjectClass && args.isEmpty) + js.JSObjectConstr(Nil) + else if (cls == jsdefn.JSArrayClass && args.isEmpty) + js.JSArrayConstr(Nil) + else + js.JSNew(genLoadJSConstructor(cls), genArgs) + } { jsClassVal => + // Nested JS class + if (cls.isAnonymousClass) + genNewAnonJSClass(cls, jsClassVal, genArgsAsClassCaptures)(fun.span) + else if (atPhase(erasurePhase)(cls.is(ModuleClass))) // LambdaLift removes the ModuleClass flag of lifted classes + js.JSNew(js.CreateJSClass(encodeClassName(cls), jsClassVal :: genArgsAsClassCaptures), Nil) + else + js.JSNew(jsClassVal, genArgs) + } + } + } + + /** Generate an instance of an anonymous (non-lambda) JS class inline + * + * @param sym Class to generate the instance of + * @param jsSuperClassValue JS class value of the super class + * @param args Arguments to the Scala constructor, which map to JS class captures + * @param pos Position of the original New tree + */ + private def genNewAnonJSClass(sym: Symbol, jsSuperClassValue: js.Tree, args: List[js.Tree])( + implicit pos: Position): js.Tree = { + assert(sym.isAnonymousClass, + s"Generating AnonJSClassNew of non anonymous JS class ${sym.fullName}") + + // Find the TypeDef for this anonymous class and generate it + val typeDef = consumeLazilyGeneratedAnonClass(sym) + val originalClassDef = resetAllScopedVars { + withScopedVars( + currentClassSym := sym + ) { + genNonNativeJSClass(typeDef) + } + } + + // Partition class members. 
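+ // Static members stay in a synthetic `AbstractJSType` class def that is
+ // emitted separately, JS fields, exported methods and properties become
+ // assignments (or `Object.defineProperty` calls) on the freshly created
+ // instance, and private `FieldDef`s are stored in a hidden object installed
+ // on that instance further below.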
+ val privateFieldDefs = mutable.ListBuffer.empty[js.FieldDef] + val classDefMembers = mutable.ListBuffer.empty[js.MemberDef] + val instanceMembers = mutable.ListBuffer.empty[js.MemberDef] + var constructor: Option[js.JSConstructorDef] = None + + originalClassDef.memberDefs.foreach { + case fdef: js.FieldDef => + privateFieldDefs += fdef + + case fdef: js.JSFieldDef => + instanceMembers += fdef + + case mdef: js.MethodDef => + assert(mdef.flags.namespace.isStatic, + "Non-static, unexported method in non-native JS class") + classDefMembers += mdef + + case cdef: js.JSConstructorDef => + assert(constructor.isEmpty, "two ctors in class") + constructor = Some(cdef) + + case mdef: js.JSMethodDef => + assert(!mdef.flags.namespace.isStatic, "Exported static method") + instanceMembers += mdef + + case property: js.JSPropertyDef => + instanceMembers += property + + case nativeMemberDef: js.JSNativeMemberDef => + throw new FatalError("illegal native JS member in JS class at " + nativeMemberDef.pos) + } + + assert(originalClassDef.topLevelExportDefs.isEmpty, + "Found top-level exports in anonymous JS class at " + pos) + + // Make new class def with static members + val newClassDef = { + implicit val pos = originalClassDef.pos + val parent = js.ClassIdent(jsNames.ObjectClass) + js.ClassDef(originalClassDef.name, originalClassDef.originalName, + ClassKind.AbstractJSType, None, Some(parent), interfaces = Nil, + jsSuperClass = None, jsNativeLoadSpec = None, + classDefMembers.toList, Nil)( + originalClassDef.optimizerHints) + } + + generatedClasses += newClassDef + + // Construct inline class definition + + val jsClassCaptures = originalClassDef.jsClassCaptures.getOrElse { + throw new AssertionError(s"no class captures for anonymous JS class at $pos") + } + val js.JSConstructorDef(_, ctorParams, ctorRestParam, ctorBody) = constructor.getOrElse { + throw new AssertionError("No ctor found") + } + assert(ctorParams.isEmpty && ctorRestParam.isEmpty, + s"non-empty constructor params for anonymous JS class at $pos") + + /* The first class capture is always a reference to the super class. + * This is enforced by genJSClassCapturesAndConstructor. + */ + def jsSuperClassRef(implicit pos: ir.Position): js.VarRef = + jsClassCaptures.head.ref + + /* The `this` reference. + * FIXME This could clash with a local variable of the constructor or a JS + * class capture. It seems Scala 2 has the same vulnerability. How do we + * avoid this? 
+ */ + val selfName = freshLocalIdent("this")(pos) + def selfRef(implicit pos: ir.Position) = + js.VarRef(selfName)(jstpe.AnyType) + + def memberLambda(params: List[js.ParamDef], restParam: Option[js.ParamDef], body: js.Tree)(implicit pos: ir.Position): js.Closure = + js.Closure(arrow = false, captureParams = Nil, params, restParam, body, captureValues = Nil) + + val memberDefinitions0 = instanceMembers.toList.map { + case fdef: js.FieldDef => + throw new AssertionError("unexpected FieldDef") + + case fdef: js.JSFieldDef => + implicit val pos = fdef.pos + js.Assign(js.JSSelect(selfRef, fdef.name), jstpe.zeroOf(fdef.ftpe)) + + case mdef: js.MethodDef => + throw new AssertionError("unexpected MethodDef") + + case cdef: js.JSConstructorDef => + throw new AssertionError("unexpected JSConstructorDef") + + case mdef: js.JSMethodDef => + implicit val pos = mdef.pos + val impl = memberLambda(mdef.args, mdef.restParam, mdef.body) + js.Assign(js.JSSelect(selfRef, mdef.name), impl) + + case pdef: js.JSPropertyDef => + implicit val pos = pdef.pos + val optGetter = pdef.getterBody.map { body => + js.StringLiteral("get") -> memberLambda(params = Nil, restParam = None, body) + } + val optSetter = pdef.setterArgAndBody.map { case (arg, body) => + js.StringLiteral("set") -> memberLambda(params = arg :: Nil, restParam = None, body) + } + val descriptor = js.JSObjectConstr( + optGetter.toList ::: + optSetter.toList ::: + List(js.StringLiteral("configurable") -> js.BooleanLiteral(true)) + ) + js.JSMethodApply(js.JSGlobalRef("Object"), + js.StringLiteral("defineProperty"), + List(selfRef, pdef.name, descriptor)) + + case nativeMemberDef: js.JSNativeMemberDef => + throw new FatalError("illegal native JS member in JS class at " + nativeMemberDef.pos) + } + + val memberDefinitions = if (privateFieldDefs.isEmpty) { + memberDefinitions0 + } else { + /* Private fields, declared in FieldDefs, are stored in a separate + * object, itself stored as a non-enumerable field of the `selfRef`. + * The name of that field is retrieved at + * `scala.scalajs.runtime.privateFieldsSymbol()`, and is a Symbol if + * supported, or a randomly generated string that has the same enthropy + * as a UUID (i.e., 128 random bits). + * + * This encoding solves two issues: + * + * - Hide private fields in anonymous JS classes from `JSON.stringify` + * and other cursory inspections in JS (#2748). + * - Get around the fact that abstract JS types cannot declare + * FieldDefs (#3777). + */ + val fieldsObjValue = { + js.JSObjectConstr(privateFieldDefs.toList.map { fdef => + implicit val pos = fdef.pos + js.StringLiteral(fdef.name.name.nameString) -> jstpe.zeroOf(fdef.ftpe) + }) + } + val definePrivateFieldsObj = { + /* Object.defineProperty(selfRef, privateFieldsSymbol, { + * value: fieldsObjValue + * }); + * + * `writable`, `configurable` and `enumerable` are false by default. + */ + js.JSMethodApply( + js.JSGlobalRef("Object"), + js.StringLiteral("defineProperty"), + List( + selfRef, + genPrivateFieldsSymbol()(using sym.sourcePos), + js.JSObjectConstr(List( + js.StringLiteral("value") -> fieldsObjValue + )) + ) + ) + } + definePrivateFieldsObj :: memberDefinitions0 + } + + // Transform the constructor body. 
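+ // The super constructor call becomes a `js.JSNew` of the captured super
+ // class value (or a plain object literal when the super class is `js.Object`
+ // and there are no arguments), bound to a fresh `this` local; in the
+ // statements after the super call, every `This()` is then rewritten to that
+ // local (without descending into nested closures).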
+ val inlinedCtorStats: List[js.Tree] = { + val beforeSuper = ctorBody.beforeSuper + + val superCall = { + implicit val pos = ctorBody.superCall.pos + val js.JSSuperConstructorCall(args) = ctorBody.superCall + + val newTree = { + val ident = originalClassDef.superClass.getOrElse(throw new FatalError("No superclass")) + if (args.isEmpty && ident.name == JSObjectClassName) + js.JSObjectConstr(Nil) + else + js.JSNew(jsSuperClassRef, args) + } + + val selfVarDef = js.VarDef(selfName, thisOriginalName, jstpe.AnyType, mutable = false, newTree) + selfVarDef :: memberDefinitions + } + + // After the super call, substitute `selfRef` for `This()` + val afterSuper = new ir.Transformers.Transformer { + override def transform(tree: js.Tree, isStat: Boolean): js.Tree = tree match { + case js.This() => + selfRef(tree.pos) + + // Don't traverse closure boundaries + case closure: js.Closure => + val newCaptureValues = closure.captureValues.map(transformExpr) + closure.copy(captureValues = newCaptureValues)(closure.pos) + + case tree => + super.transform(tree, isStat) + } + }.transformStats(ctorBody.afterSuper) + + beforeSuper ::: superCall ::: afterSuper + } + + val closure = js.Closure(arrow = true, jsClassCaptures, Nil, None, + js.Block(inlinedCtorStats, selfRef), jsSuperClassValue :: args) + js.JSFunctionApply(closure, Nil) + } + + /** Gen JS code for a primitive method call. */ + private def genPrimitiveOp(tree: Apply, isStat: Boolean): js.Tree = { + import dotty.tools.backend.ScalaPrimitivesOps._ + + implicit val pos = tree.span + + val Apply(fun, args) = tree + val receiver = qualifierOf(fun) + + val code = primitives.getPrimitive(tree, receiver.tpe) + + if (isArithmeticOp(code) || isLogicalOp(code) || isComparisonOp(code)) + genSimpleOp(tree, receiver :: args, code) + else if (code == CONCAT) + genStringConcat(tree, receiver, args) + else if (code == HASH) + genScalaHash(tree, receiver) + else if (isArrayOp(code)) + genArrayOp(tree, code) + else if (code == SYNCHRONIZED) + genSynchronized(tree, isStat) + else if (isCoercion(code)) + genCoercion(tree, receiver, code) + else if (code == JSPrimitives.THROW) + genThrow(tree, args) + else if (JSPrimitives.isJSPrimitive(code)) + genJSPrimitive(tree, args, code, isStat) + else + throw new FatalError(s"Unknown primitive: ${tree.symbol.fullName} at: $pos") + } + + /** Gen JS code for a simple operation (arithmetic, logical, or comparison) */ + private def genSimpleOp(tree: Apply, args: List[Tree], code: Int): js.Tree = { + args match { + case List(arg) => genSimpleUnaryOp(tree, arg, code) + case List(lhs, rhs) => genSimpleBinaryOp(tree, lhs, rhs, code) + case _ => throw new FatalError("Incorrect arity for primitive") + } + } + + /** Gen JS code for a simple unary operation. 
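+ * For example, integer negation `-x` is emitted as `0 - x`
+ * (`js.BinaryOp.Int_-`) and bitwise negation `~x` as an xor with `-1`, since
+ * the IR has no dedicated unary operations for these.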
*/ + private def genSimpleUnaryOp(tree: Apply, arg: Tree, code: Int): js.Tree = { + import dotty.tools.backend.ScalaPrimitivesOps._ + + implicit val pos = tree.span + + val resultIRType = toIRType(tree.tpe) + val genArg = adaptPrimitive(genExpr(arg), resultIRType) + + (code: @switch) match { + case POS => + genArg + + case NEG => + (resultIRType: @unchecked) match { + case jstpe.IntType => + js.BinaryOp(js.BinaryOp.Int_-, js.IntLiteral(0), genArg) + case jstpe.LongType => + js.BinaryOp(js.BinaryOp.Long_-, js.LongLiteral(0), genArg) + case jstpe.FloatType => + js.BinaryOp(js.BinaryOp.Float_*, js.FloatLiteral(-1.0f), genArg) + case jstpe.DoubleType => + js.BinaryOp(js.BinaryOp.Double_*, js.DoubleLiteral(-1.0), genArg) + } + + case NOT => + (resultIRType: @unchecked) match { + case jstpe.IntType => + js.BinaryOp(js.BinaryOp.Int_^, js.IntLiteral(-1), genArg) + case jstpe.LongType => + js.BinaryOp(js.BinaryOp.Long_^, js.LongLiteral(-1), genArg) + } + + case ZNOT => + js.UnaryOp(js.UnaryOp.Boolean_!, genArg) + + case _ => + throw new FatalError("Unknown unary operation code: " + code) + } + } + + /** Gen JS code for a simple binary operation. */ + private def genSimpleBinaryOp(tree: Apply, lhs: Tree, rhs: Tree, code: Int): js.Tree = { + import dotty.tools.backend.ScalaPrimitivesOps._ + + implicit val pos: SourcePosition = tree.sourcePos + + val lhsIRType = toIRType(lhs.tpe) + val rhsIRType = toIRType(rhs.tpe) + + val isShift = isShiftOp(code) + + val opType = { + if (isShift) { + if (lhsIRType == jstpe.LongType) jstpe.LongType + else jstpe.IntType + } else { + (lhsIRType, rhsIRType) match { + case (jstpe.DoubleType, _) | (_, jstpe.DoubleType) => jstpe.DoubleType + case (jstpe.FloatType, _) | (_, jstpe.FloatType) => jstpe.FloatType + case (jstpe.LongType, _) | (_, jstpe.LongType) => jstpe.LongType + case (jstpe.IntType | jstpe.ByteType | jstpe.ShortType | jstpe.CharType, _) => jstpe.IntType + case (_, jstpe.IntType | jstpe.ByteType | jstpe.ShortType | jstpe.CharType) => jstpe.IntType + case (jstpe.BooleanType, _) | (_, jstpe.BooleanType) => jstpe.BooleanType + case _ => jstpe.AnyType + } + } + } + + val lsrc = + if (opType == jstpe.AnyType) genExpr(lhs) + else adaptPrimitive(genExpr(lhs), opType) + val rsrc = + if (opType == jstpe.AnyType) genExpr(rhs) + else adaptPrimitive(genExpr(rhs), if (isShift) jstpe.IntType else opType) + + if (opType == jstpe.AnyType && isUniversalEqualityOp(code)) { + genUniversalEqualityOp(lhs.tpe, rhs.tpe, lsrc, rsrc, code) + } else if (code == ZOR) { + js.If(lsrc, js.BooleanLiteral(true), rsrc)(jstpe.BooleanType) + } else if (code == ZAND) { + js.If(lsrc, rsrc, js.BooleanLiteral(false))(jstpe.BooleanType) + } else { + import js.BinaryOp._ + + (opType: @unchecked) match { + case jstpe.IntType => + val op = (code: @switch) match { + case ADD => Int_+ + case SUB => Int_- + case MUL => Int_* + case DIV => Int_/ + case MOD => Int_% + case OR => Int_| + case AND => Int_& + case XOR => Int_^ + case LSL => Int_<< + case LSR => Int_>>> + case ASR => Int_>> + + case EQ => Int_== + case NE => Int_!= + case LT => Int_< + case LE => Int_<= + case GT => Int_> + case GE => Int_>= + } + js.BinaryOp(op, lsrc, rsrc) + + case jstpe.FloatType => + def withFloats(op: Int): js.Tree = + js.BinaryOp(op, lsrc, rsrc) + + def toDouble(value: js.Tree): js.Tree = + js.UnaryOp(js.UnaryOp.FloatToDouble, value) + + def withDoubles(op: Int): js.Tree = + js.BinaryOp(op, toDouble(lsrc), toDouble(rsrc)) + + (code: @switch) match { + case ADD => withFloats(Float_+) + case SUB => withFloats(Float_-) + 
case MUL => withFloats(Float_*) + case DIV => withFloats(Float_/) + case MOD => withFloats(Float_%) + + case EQ => withDoubles(Double_==) + case NE => withDoubles(Double_!=) + case LT => withDoubles(Double_<) + case LE => withDoubles(Double_<=) + case GT => withDoubles(Double_>) + case GE => withDoubles(Double_>=) + } + + case jstpe.DoubleType => + val op = (code: @switch) match { + case ADD => Double_+ + case SUB => Double_- + case MUL => Double_* + case DIV => Double_/ + case MOD => Double_% + + case EQ => Double_== + case NE => Double_!= + case LT => Double_< + case LE => Double_<= + case GT => Double_> + case GE => Double_>= + } + js.BinaryOp(op, lsrc, rsrc) + + case jstpe.LongType => + val op = (code: @switch) match { + case ADD => Long_+ + case SUB => Long_- + case MUL => Long_* + case DIV => Long_/ + case MOD => Long_% + case OR => Long_| + case XOR => Long_^ + case AND => Long_& + case LSL => Long_<< + case LSR => Long_>>> + case ASR => Long_>> + + case EQ => Long_== + case NE => Long_!= + case LT => Long_< + case LE => Long_<= + case GT => Long_> + case GE => Long_>= + } + js.BinaryOp(op, lsrc, rsrc) + + case jstpe.BooleanType => + val op = (code: @switch) match { + case EQ => Boolean_== + case NE => Boolean_!= + case OR => Boolean_| + case AND => Boolean_& + case XOR => Boolean_!= + } + js.BinaryOp(op, lsrc, rsrc) + + case jstpe.AnyType => + val op = code match { + case ID => === + case NI => !== + } + js.BinaryOp(op, lsrc, rsrc) + } + } + } + + private def adaptPrimitive(value: js.Tree, to: jstpe.Type)( + implicit pos: Position): js.Tree = { + genConversion(value.tpe, to, value) + } + + /* This method corresponds to the method of the same name in + * BCodeBodyBuilder of the JVM back-end. It ends up calling the method + * BCodeIdiomatic.emitT2T, whose logic we replicate here. + */ + private def genConversion(from: jstpe.Type, to: jstpe.Type, value: js.Tree)( + implicit pos: Position): js.Tree = { + import js.UnaryOp._ + + if (from == to || from == jstpe.NothingType) { + value + } else if (from == jstpe.BooleanType || to == jstpe.BooleanType) { + throw new AssertionError(s"Invalid genConversion from $from to $to") + } else { + def intValue = (from: @unchecked) match { + case jstpe.IntType => value + case jstpe.CharType => js.UnaryOp(CharToInt, value) + case jstpe.ByteType => js.UnaryOp(ByteToInt, value) + case jstpe.ShortType => js.UnaryOp(ShortToInt, value) + case jstpe.LongType => js.UnaryOp(LongToInt, value) + case jstpe.FloatType => js.UnaryOp(DoubleToInt, js.UnaryOp(FloatToDouble, value)) + case jstpe.DoubleType => js.UnaryOp(DoubleToInt, value) + } + + def doubleValue = from match { + case jstpe.DoubleType => value + case jstpe.FloatType => js.UnaryOp(FloatToDouble, value) + case jstpe.LongType => js.UnaryOp(LongToDouble, value) + case _ => js.UnaryOp(IntToDouble, intValue) + } + + (to: @unchecked) match { + case jstpe.CharType => + js.UnaryOp(IntToChar, intValue) + case jstpe.ByteType => + js.UnaryOp(IntToByte, intValue) + case jstpe.ShortType => + js.UnaryOp(IntToShort, intValue) + case jstpe.IntType => + intValue + case jstpe.LongType => + from match { + case jstpe.FloatType | jstpe.DoubleType => + js.UnaryOp(DoubleToLong, doubleValue) + case _ => + js.UnaryOp(IntToLong, intValue) + } + case jstpe.FloatType => + if (from == jstpe.LongType) + js.UnaryOp(js.UnaryOp.LongToFloat, value) + else + js.UnaryOp(js.UnaryOp.DoubleToFloat, doubleValue) + case jstpe.DoubleType => + doubleValue + } + } + } + + /** Gen JS code for a universal equality test. 
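+ * For example, a comparison with a literal `null` operand, such as
+ * `x == null`, is compiled to a plain reference equality test (`===`),
+ * whereas comparisons that may involve boxed numbers go through the rich
+ * `BoxesRunTime` equality.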
*/ + private def genUniversalEqualityOp(ltpe: Type, rtpe: Type, lhs: js.Tree, rhs: js.Tree, code: Int)( + implicit pos: SourcePosition): js.Tree = { + + import dotty.tools.backend.ScalaPrimitivesOps._ + + val bypassEqEq = { + // Do not call equals if we have a literal null at either side. + lhs.isInstanceOf[js.Null] || + rhs.isInstanceOf[js.Null] + } + + if (bypassEqEq) { + js.BinaryOp( + if (code == EQ) js.BinaryOp.=== else js.BinaryOp.!==, + lhs, rhs) + } else { + val body = genEqEqPrimitive(ltpe, rtpe, lhs, rhs) + if (code == EQ) body + else js.UnaryOp(js.UnaryOp.Boolean_!, body) + } + } + + private lazy val externalEqualsNumNum: Symbol = + defn.BoxesRunTimeModule.requiredMethod(nme.equalsNumNum) + private lazy val externalEqualsNumChar: Symbol = + NoSymbol // requiredMethod(BoxesRunTimeTypeRef, nme.equalsNumChar) // this method is private + private lazy val externalEqualsNumObject: Symbol = + defn.BoxesRunTimeModule.requiredMethod(nme.equalsNumObject) + private lazy val externalEquals: Symbol = + defn.BoxesRunTimeModule.info.decl(nme.equals_).suchThat(toDenot(_).info.firstParamTypes.size == 2).symbol + + /** Gen JS code for a call to Any.== */ + private def genEqEqPrimitive(ltpe: Type, rtpe: Type, lsrc: js.Tree, rsrc: js.Tree)( + implicit pos: SourcePosition): js.Tree = { + report.debuglog(s"$ltpe == $rtpe") + val lsym = ltpe.typeSymbol.asClass + val rsym = rtpe.typeSymbol.asClass + + /* True if the equality comparison is between values that require the + * use of the rich equality comparator + * (scala.runtime.BoxesRunTime.equals). + * This is the case when either side of the comparison might have a + * run-time type subtype of java.lang.Number or java.lang.Character, + * **which includes when either is a JS type**. + * When it is statically known that both sides are equal and subtypes of + * Number or Character, not using the rich equality is possible (their + * own equals method will do ok), except for java.lang.Float and + * java.lang.Double: their `equals` have different behavior around `NaN` + * and `-0.0`, see Javadoc (scala-dev#329, scala-js#2799). + */ + val mustUseAnyComparator: Boolean = { + lsym.isJSType || rsym.isJSType || { + val p = ctx.platform + p.isMaybeBoxed(lsym) && p.isMaybeBoxed(rsym) && { + val areSameFinals = lsym.is(Final) && rsym.is(Final) && (ltpe =:= rtpe) + !areSameFinals || lsym == defn.BoxedFloatClass || lsym == defn.BoxedDoubleClass + } + } + } + + if (mustUseAnyComparator) { + val equalsMethod: Symbol = { + val ptfm = ctx.platform + if (lsym.derivesFrom(defn.BoxedNumberClass)) { + if (rsym.derivesFrom(defn.BoxedNumberClass)) externalEqualsNumNum + else if (rsym.derivesFrom(defn.BoxedCharClass)) externalEqualsNumObject // will be externalEqualsNumChar in 2.12, SI-9030 + else externalEqualsNumObject + } else externalEquals + } + genApplyStatic(equalsMethod, List(lsrc, rsrc)) + } else { + // if (lsrc eq null) rsrc eq null else lsrc.equals(rsrc) + if (lsym == defn.StringClass) { + // String.equals(that) === (this eq that) + js.BinaryOp(js.BinaryOp.===, lsrc, rsrc) + } else { + /* This requires to evaluate both operands in local values first. + * The optimizer will eliminate them if possible. 
+ */ + val ltemp = js.VarDef(freshLocalIdent(), NoOriginalName, lsrc.tpe, mutable = false, lsrc) + val rtemp = js.VarDef(freshLocalIdent(), NoOriginalName, rsrc.tpe, mutable = false, rsrc) + js.Block( + ltemp, + rtemp, + js.If(js.BinaryOp(js.BinaryOp.===, ltemp.ref, js.Null()), + js.BinaryOp(js.BinaryOp.===, rtemp.ref, js.Null()), + genApplyMethod(ltemp.ref, defn.Any_equals, List(rtemp.ref)))( + jstpe.BooleanType)) + } + } + } + + /** Gen JS code for string concatenation. + */ + private def genStringConcat(tree: Apply, receiver: Tree, + args: List[Tree]): js.Tree = { + implicit val pos = tree.span + + js.BinaryOp(js.BinaryOp.String_+, genExpr(receiver), genExpr(args.head)) + } + + /** Gen JS code for a call to Any.## */ + private def genScalaHash(tree: Apply, receiver: Tree): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + + genModuleApplyMethod(defn.ScalaRuntimeModule.requiredMethod(nme.hash_), + List(genExpr(receiver))) + } + + /** Gen JS code for an array operation (get, set or length) */ + private def genArrayOp(tree: Tree, code: Int): js.Tree = { + import dotty.tools.backend.ScalaPrimitivesOps._ + + implicit val pos = tree.span + + val Apply(fun, args) = tree: @unchecked + val arrayObj = qualifierOf(fun) + + val genArray = genExpr(arrayObj) + val genArgs = args.map(genExpr) + + def elementType: Type = arrayObj.tpe.widenDealias match { + case defn.ArrayOf(el) => el + case JavaArrayType(el) => el + case tpe => + val msg = em"expected Array $tpe" + report.error(msg) + ErrorType(msg) + } + + def genSelect(): js.AssignLhs = + js.ArraySelect(genArray, genArgs(0))(toIRType(elementType)) + + if (isArrayGet(code)) { + // get an item of the array + assert(args.length == 1, + s"Array get requires 1 argument, found ${args.length} in $tree") + genSelect() + } else if (isArraySet(code)) { + // set an item of the array + assert(args.length == 2, + s"Array set requires 2 arguments, found ${args.length} in $tree") + js.Assign(genSelect(), genArgs(1)) + } else { + // length of the array + js.ArrayLength(genArray) + } + } + + /** Gen JS code for a call to AnyRef.synchronized */ + private def genSynchronized(tree: Apply, isStat: Boolean): js.Tree = { + /* JavaScript is single-threaded, so we can drop the + * synchronization altogether. + */ + val Apply(fun, List(arg)) = tree + val receiver = qualifierOf(fun) + + val genReceiver = genExpr(receiver) + val genArg = genStatOrExpr(arg, isStat) + + genReceiver match { + case js.This() => + // common case for which there is no side-effect nor NPE + genArg + case _ => + implicit val pos = tree.span + js.Block( + js.If(js.BinaryOp(js.BinaryOp.===, genReceiver, js.Null()), + js.Throw(js.New(NullPointerExceptionClass, js.MethodIdent(jsNames.NoArgConstructorName), Nil)), + js.Skip())(jstpe.NoType), + genArg) + } + } + + /** Gen JS code for a coercion */ + private def genCoercion(tree: Apply, receiver: Tree, code: Int): js.Tree = { + implicit val pos = tree.span + + val source = genExpr(receiver) + val resultType = toIRType(tree.tpe) + adaptPrimitive(source, resultType) + } + + /** Gen a call to the special `throw` method. 
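+ * For example, `throw new IllegalArgumentException(msg)` (illustrative) is
+ * emitted as a direct `js.Throw` of the freshly constructed exception,
+ * whereas `throw e` for an arbitrary expression first unwraps a potential
+ * `js.JavaScriptException` so that the original JavaScript value is rethrown.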
*/ + private def genThrow(tree: Apply, args: List[Tree]): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + val exception = args.head + val genException = genExpr(exception) + genException match { + case js.New(cls, _, _) if cls != JavaScriptExceptionClassName => + // Common case where ex is neither null nor a js.JavaScriptException + js.Throw(genException) + case _ => + js.Throw(js.UnwrapFromThrowable(genException)) + } + } + + /** Gen a "normal" apply (to a true method). + * + * But even these are further refined into: + * * Methods of java.lang.String, which are redirected to the + * RuntimeString trait implementation. + * * Calls to methods of raw JS types (Scala.js -> JS interop) + * * Calls to methods in impl classes of Scala2 traits. + * * Regular method call + */ + private def genNormalApply(tree: Apply, isStat: Boolean): js.Tree = { + implicit val pos = tree.span + + val fun = tree.fun match { + case fun: Ident => desugarIdent(fun).get + case fun: Select => fun + } + val receiver = fun.qualifier + val args = tree.args + val sym = fun.symbol + + def isStringMethodFromObject: Boolean = sym.name match { + case nme.toString_ | nme.equals_ | nme.hashCode_ => true + case _ => false + } + + if (isMethodStaticInIR(sym)) { + genApplyStatic(sym, genActualArgs(sym, args)) + } else if (sym.owner.isJSType) { + if (!sym.owner.isNonNativeJSClass || sym.isJSExposed) + genApplyJSMethodGeneric(sym, genExprOrGlobalScope(receiver), genActualJSArgs(sym, args), isStat)(tree.sourcePos) + else + genApplyJSClassMethod(genExpr(receiver), sym, genActualArgs(sym, args)) + } else if (sym.hasAnnotation(jsdefn.JSNativeAnnot)) { + genJSNativeMemberCall(tree) + } else { + genApplyMethodMaybeStatically(genExpr(receiver), sym, genActualArgs(sym, args)) + } + } + + /** Gen JS code for a call to a JS method (of a subclass of `js.Any`). + * + * Basically it boils down to calling the method as a `JSBracketSelect`, + * without name mangling. 
But other aspects come into play: + * + * - Operator methods are translated to JS operators (not method calls) + * - `apply` is translated as a function call, i.e., `o()` instead of `o.apply()` + * - Scala varargs are turned into JS varargs (see `genPrimitiveJSArgs()`) + * - Getters and parameterless methods are translated as `JSBracketSelect` + * - Setters are translated to `Assign` to `JSBracketSelect` + */ + private def genApplyJSMethodGeneric(sym: Symbol, + receiver: MaybeGlobalScope, args: List[js.TreeOrJSSpread], isStat: Boolean, + jsSuperClassValue: Option[js.Tree] = None)( + implicit pos: SourcePosition): js.Tree = { + + def argsNoSpread: List[js.Tree] = { + assert(!args.exists(_.isInstanceOf[js.JSSpread]), s"Unexpected spread at $pos") + args.asInstanceOf[List[js.Tree]] + } + + val argc = args.size // meaningful only for methods that don't have varargs + + def requireNotSuper(): Unit = { + if (jsSuperClassValue.isDefined) + report.error("Illegal super call in Scala.js-defined JS class", pos) + } + + def requireNotSpread(arg: js.TreeOrJSSpread): js.Tree = + arg.asInstanceOf[js.Tree] + + def genSuperReference(propName: js.Tree): js.AssignLhs = { + jsSuperClassValue.fold[js.AssignLhs] { + genJSSelectOrGlobalRef(receiver, propName) + } { superClassValue => + js.JSSuperSelect(superClassValue, ruleOutGlobalScope(receiver), propName) + } + } + + def genSelectGet(propName: js.Tree): js.Tree = + genSuperReference(propName) + + def genSelectSet(propName: js.Tree, value: js.Tree): js.Tree = + js.Assign(genSuperReference(propName), value) + + def genCall(methodName: js.Tree, args: List[js.TreeOrJSSpread]): js.Tree = { + jsSuperClassValue.fold[js.Tree] { + genJSMethodApplyOrGlobalRefApply(receiver, methodName, args) + } { superClassValue => + js.JSSuperMethodCall(superClassValue, ruleOutGlobalScope(receiver), methodName, args) + } + } + + val boxedResult = sym.jsCallingConvention match { + case JSCallingConvention.UnaryOp(code) => + requireNotSuper() + assert(argc == 0, s"bad argument count ($argc) for unary op at $pos") + js.JSUnaryOp(code, ruleOutGlobalScope(receiver)) + + case JSCallingConvention.BinaryOp(code) => + requireNotSuper() + assert(argc == 1, s"bad argument count ($argc) for binary op at $pos") + js.JSBinaryOp(code, ruleOutGlobalScope(receiver), requireNotSpread(args.head)) + + case JSCallingConvention.Call => + requireNotSuper() + if (sym.owner.isSubClass(jsdefn.JSThisFunctionClass)) + js.JSMethodApply(ruleOutGlobalScope(receiver), js.StringLiteral("call"), args) + else + js.JSFunctionApply(ruleOutGlobalScope(receiver), args) + + case JSCallingConvention.Property(jsName) => + argsNoSpread match { + case Nil => + genSelectGet(genExpr(jsName)) + case value :: Nil => + genSelectSet(genExpr(jsName), value) + case _ => + throw new AssertionError(s"property methods should have 0 or 1 non-varargs arguments at $pos") + } + + case JSCallingConvention.BracketAccess => + argsNoSpread match { + case keyArg :: Nil => + genSelectGet(keyArg) + case keyArg :: valueArg :: Nil => + genSelectSet(keyArg, valueArg) + case _ => + throw new AssertionError(s"@JSBracketAccess methods should have 1 or 2 non-varargs arguments at $pos") + } + + case JSCallingConvention.BracketCall => + val (methodName, actualArgs) = extractFirstArg(args) + genCall(methodName, actualArgs) + + case JSCallingConvention.Method(jsName) => + genCall(genExpr(jsName), args) + } + + if (isStat) { + boxedResult + } else { + val tpe = atPhase(elimErasedValueTypePhase) { + sym.info.finalResultType + } + if 
(tpe.isRef(defn.BoxedUnitClass) && sym.isGetter) { + /* Work around to reclaim Scala 2 erasure behavior, assumed by the test + * NonNativeJSTypeTest.defaultValuesForFields. + * Scala 2 erases getters of `Unit`-typed fields as returning `Unit` + * (not `BoxedUnit`). Therefore, when called in expression position, + * the call site introduces an explicit `BoxedUnit.UNIT`. Even if the + * field has not been initialized at all (with `= _`), this results in + * an actual `()` value. + * In Scala 3, the same pattern returns `null`, as a `BoxedUnit`, so we + * introduce here an explicit `()` value. + * TODO We should remove this branch if the upstream test is updated + * not to assume such a strict interpretation of erasure. + */ + js.Block(boxedResult, js.Undefined()) + } else { + unbox(boxedResult, tpe) + } + } + } + + /** Extract the first argument in a list of actual arguments. + * + * This is nothing else than decomposing into head and tail, except that + * we assert that the first element is not a JSSpread. + */ + private def extractFirstArg(args: List[js.TreeOrJSSpread]): (js.Tree, List[js.TreeOrJSSpread]) = { + assert(args.nonEmpty, + "Trying to extract the first argument of an empty argument list") + val firstArg = args.head + assert(!firstArg.isInstanceOf[js.JSSpread], + "Trying to extract the first argument of an argument list starting " + + "with a Spread argument: " + firstArg) + (firstArg.asInstanceOf[js.Tree], args.tail) + } + + /** Gen JS code for a call to a native JS def or val. */ + private def genJSNativeMemberSelect(tree: Tree): js.Tree = + genJSNativeMemberSelectOrCall(tree, Nil) + + /** Gen JS code for a call to a native JS def or val. */ + private def genJSNativeMemberCall(tree: Apply): js.Tree = + genJSNativeMemberSelectOrCall(tree, tree.args) + + /** Gen JS code for a call to a native JS def or val. 
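  /* For illustration, a sketch of the user-level shape that reaches
   * `genJSNativeMemberSelectOrCall`: `@js.native` members declared inside a
   * regular Scala object. `Math.max` and `Math.PI` are just example targets.
   * {{{
   *   import scala.scalajs.js
   *   import scala.scalajs.js.annotation._
   *
   *   object JSMath {
   *     // A native JS def: selected via SelectJSNativeMember, then applied.
   *     @js.native @JSGlobal("Math.max")
   *     def max(a: Double, b: Double): Double = js.native
   *
   *     // A native JS val: only the selection is emitted, with no application.
   *     @js.native @JSGlobal("Math.PI")
   *     val pi: Double = js.native
   *   }
   *
   *   def atLeastPi(x: Double): Double = JSMath.max(x, JSMath.pi)
   * }}}
   */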
*/ + private def genJSNativeMemberSelectOrCall(tree: Tree, args: List[Tree]): js.Tree = { + val sym = tree.symbol + + implicit val pos = tree.span + + val jsNativeMemberValue = + js.SelectJSNativeMember(encodeClassName(sym.owner), encodeJSNativeMemberSym(sym)) + + val boxedResult = + if (sym.isJSGetter) jsNativeMemberValue + else js.JSFunctionApply(jsNativeMemberValue, genActualJSArgs(sym, args)) + + unbox(boxedResult, atPhase(elimErasedValueTypePhase) { + sym.info.resultType + }) + } + + private def genJSSuperCall(tree: Apply, isStat: Boolean): js.Tree = { + acquireContextualJSClassValue { explicitJSSuperClassValue => + implicit val pos = tree.span + val Apply(fun @ Select(sup @ Super(qual, _), _), args) = tree: @unchecked + val sym = fun.symbol + + val genReceiver = genExpr(qual) + def genScalaArgs = genActualArgs(sym, args) + def genJSArgs = genActualJSArgs(sym, args) + + if (sym.owner == defn.ObjectClass) { + // Normal call anyway + assert(!sym.isClassConstructor, + s"Trying to call the super constructor of Object in a non-native JS class at $pos") + genApplyMethod(genReceiver, sym, genScalaArgs) + } else if (sym.isClassConstructor) { + throw new AssertionError( + s"calling a JS super constructor should have happened in genPrimaryJSClassCtor at $pos") + } else if (sym.owner.isNonNativeJSClass && !sym.isJSExposed) { + // Reroute to the static method + genApplyJSClassMethod(genReceiver, sym, genScalaArgs) + } else { + val jsSuperClassValue = explicitJSSuperClassValue.orElse { + Some(genLoadJSConstructor(currentClassSym.get.asClass.superClass)) + } + genApplyJSMethodGeneric(sym, MaybeGlobalScope.NotGlobalScope(genReceiver), + genJSArgs, isStat, jsSuperClassValue)(tree.sourcePos) + } + } + } + + /** Gen JS code for a call to a polymorphic method. + * + * The only methods that reach the back-end as polymorphic are + * `isInstanceOf` and `asInstanceOf`. + * + * (Well, in fact `DottyRunTime.newRefArray` too, but it is handled as a + * primitive instead.) + */ + private def genTypeApply(tree: TypeApply): js.Tree = { + implicit val pos: SourcePosition = tree.sourcePos + + val TypeApply(fun, targs) = tree + + val sym = fun.symbol + val receiver = qualifierOf(fun) + + val to = targs.head.tpe + + assert(!isPrimitiveValueType(receiver.tpe), + s"Found receiver of type test with primitive type ${receiver.tpe} at $pos") + assert(!isPrimitiveValueType(to), + s"Found target type of type test with primitive type ${receiver.tpe} at $pos") + + val genReceiver = genExpr(receiver) + + if (sym == defn.Any_asInstanceOf) { + genAsInstanceOf(genReceiver, to) + } else if (sym == defn.Any_isInstanceOf) { + genIsInstanceOf(genReceiver, to) + } else { + throw new FatalError( + s"Unexpected type application $fun with symbol ${sym.fullName}") + } + } + + /** Gen JS code for a Java Seq literal. */ + private def genJavaSeqLiteral(tree: JavaSeqLiteral): js.Tree = { + implicit val pos = tree.span + + val genElems = tree.elems.map(genExpr) + val arrayTypeRef = toTypeRef(tree.tpe).asInstanceOf[jstpe.ArrayTypeRef] + js.ArrayValue(arrayTypeRef, genElems) + } + + /** Gen JS code for a switch-`Match`, which is translated into an IR `js.Match`. 
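  /* For illustration, a source-level shape that this translation targets,
   * assuming the pattern matcher chooses its switch encoding (its own heuristics
   * decide that, so this is only a sketch). A match with a single alternative
   * would instead be simplified to an `If`, as explained in the code below.
   * {{{
   *   import scala.annotation.switch
   *
   *   def describe(tag: Int): String = (tag: @switch) match {
   *     case 0         => "zero"
   *     case 1 | 2     => "small"      // one clause with two alternatives
   *     case 3 | 4 | 5 => "medium"
   *     case _         => "other"      // default clause
   *   }
   * }}}
   */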
*/ + def genMatch(tree: Tree, isStat: Boolean): js.Tree = { + implicit val pos = tree.span + val Match(selector, cases) = tree: @unchecked + + def abortMatch(msg: String): Nothing = + throw new FatalError(s"$msg in switch-like pattern match at ${tree.span}: $tree") + + val genSelector = genExpr(selector) + + // Sanity check: we can handle Ints and Strings (including `null`s), but nothing else + genSelector.tpe match { + case jstpe.IntType | jstpe.ClassType(jsNames.BoxedStringClass) | jstpe.NullType | jstpe.NothingType => + // ok + case _ => + abortMatch(s"Invalid selector type ${genSelector.tpe}") + } + + val resultType = toIRType(tree.tpe) match { + case jstpe.NothingType => jstpe.NothingType // must take priority over NoType below + case _ if isStat => jstpe.NoType + case resType => resType + } + + var clauses: List[(List[js.MatchableLiteral], js.Tree)] = Nil + var optDefaultClause: Option[js.Tree] = None + + for (caze @ CaseDef(pat, guard, body) <- cases) { + if (guard != EmptyTree) + abortMatch("Found a case guard") + + val genBody = genStatOrExpr(body, isStat) + + def invalidCase(): Nothing = + abortMatch("Invalid case") + + def genMatchableLiteral(tree: Literal): js.MatchableLiteral = { + genExpr(tree) match { + case matchableLiteral: js.MatchableLiteral => matchableLiteral + case otherExpr => invalidCase() + } + } + + pat match { + case lit: Literal => + clauses = (List(genMatchableLiteral(lit)), genBody) :: clauses + case Ident(nme.WILDCARD) => + optDefaultClause = Some(genBody) + case Alternative(alts) => + val genAlts = alts.map { + case lit: Literal => genMatchableLiteral(lit) + case _ => invalidCase() + } + clauses = (genAlts, genBody) :: clauses + case _ => + invalidCase() + } + } + + clauses = clauses.reverse + val defaultClause = optDefaultClause.getOrElse { + throw new AssertionError("No elseClause in pattern match") + } + + /* Builds a `js.Match`, but simplifies it to a `js.If` if there is only + * one case with one alternative, and to a `js.Block` if there is no case + * at all. This happens in practice in the standard library. Having no + * case is a typical product of `match`es that are full of + * `case n if ... =>`, which are used instead of `if` chains for + * convenience and/or readability. + */ + def isInt(tree: js.Tree): Boolean = tree.tpe == jstpe.IntType + + clauses match { + case Nil => + // Completely remove the Match. Preserve the side-effects of `genSelector`. + js.Block(exprToStat(genSelector), defaultClause) + + case (uniqueAlt :: Nil, caseRhs) :: Nil => + /* Simplify the `match` as an `if`, so that the optimizer has less + * work to do, and we emit less code at the end of the day. + * Use `Int_==` instead of `===` if possible, since it is a common case. + */ + val op = + if (isInt(genSelector) && isInt(uniqueAlt)) js.BinaryOp.Int_== + else js.BinaryOp.=== + js.If(js.BinaryOp(op, genSelector, uniqueAlt), caseRhs, defaultClause)(resultType) + + case _ => + // We have more than one case: use a js.Match + js.Match(genSelector, clauses, defaultClause)(resultType) + } + } + + /** Gen JS code for a closure. + * + * Input: a `Closure` tree of the form + * {{{ + * Closure(env, call, functionalInterface) + * }}} + * representing the pseudo-syntax + * {{{ + * { (p1, ..., pm) => call(env1, ..., envn, p1, ..., pm) }: functionInterface + * }}} + * where `envi` are identifiers in the local scope. The qualifier of `call` + * is also implicitly captured. 
+ * + * Output: a `js.Closure` tree of the form + * {{{ + * js.Closure(formalCaptures, formalParams, body, actualCaptures) + * }}} + * representing the pseudo-syntax + * {{{ + * lambda( + * formalParam1, ..., formalParamM) = body + * }}} + * where the `actualCaptures` and `body` are, in general, arbitrary + * expressions. But in this case, `actualCaptures` will be identifiers from + * `env`, and the `body` will be of the form + * {{{ + * call(formalCapture1.ref, ..., formalCaptureN.ref, + * formalParam1.ref, ...formalParamM.ref) + * }}} + * + * When the `js.Closure` node is evaluated, i.e., when the closure value is + * created, the expressions of the `actualCaptures` are evaluated, and the + * results of those evaluations is "stored" in the environment of the + * closure as the corresponding `formalCapture`. + * + * When we later *call* the closure, the `formalCaptures` already have their + * values from the environment, and they are available in the `body`. The + * `formalParams` of the created closure receive their values from the + * actual arguments at the call-site of the closure, and they are also + * available in the `body`. + */ + private def genClosure(tree: Closure): js.Tree = { + implicit val pos = tree.span + val Closure(env, call, functionalInterface) = tree + + val envSize = env.size + + val (fun, args) = call match { + // case Apply(fun, args) => (fun, args) // Conjectured not to happen + case t @ Select(_, _) => (t, Nil) + case t @ Ident(_) => (t, Nil) + } + val sym = fun.symbol + val isStaticCall = isMethodStaticInIR(sym) + + val qualifier = qualifierOf(fun) + val allCaptureValues = + if (isStaticCall) env + else qualifier :: env + + val formalAndActualCaptures = allCaptureValues.map { value => + implicit val pos = value.span + val (formalIdent, originalName) = value match { + case Ident(name) => (freshLocalIdent(name.toTermName), OriginalName(name.toString)) + case This(_) => (freshLocalIdent("this"), thisOriginalName) + case _ => (freshLocalIdent(), NoOriginalName) + } + val formalCapture = js.ParamDef(formalIdent, originalName, + toIRType(value.tpe), mutable = false) + val actualCapture = genExpr(value) + (formalCapture, actualCapture) + } + val (formalCaptures, actualCaptures) = formalAndActualCaptures.unzip + + val funInterfaceSym = functionalInterface.tpe.typeSymbol + val hasRepeatedParam = { + funInterfaceSym.exists && { + val Seq(samMethodDenot) = funInterfaceSym.info.possibleSamMethods + val samMethod = samMethodDenot.symbol + atPhase(elimRepeatedPhase)(samMethod.info.paramInfoss.flatten.exists(_.isRepeatedParam)) + } + } + + val formalParamNames = sym.info.paramNamess.flatten.drop(envSize) + val formalParamTypes = sym.info.paramInfoss.flatten.drop(envSize) + val formalParamRepeateds = + if (hasRepeatedParam) (0 until (formalParamTypes.size - 1)).map(_ => false) :+ true + else (0 until formalParamTypes.size).map(_ => false) + + val formalAndActualParams = formalParamNames.lazyZip(formalParamTypes).lazyZip(formalParamRepeateds).map { + (name, tpe, repeated) => + val formalParam = js.ParamDef(freshLocalIdent(name), + OriginalName(name.toString), jstpe.AnyType, mutable = false) + val actualParam = + if (repeated) genJSArrayToVarArgs(formalParam.ref)(tree.sourcePos) + else unbox(formalParam.ref, tpe) + (formalParam, actualParam) + } + val (formalAndRestParams, actualParams) = formalAndActualParams.unzip + + val (formalParams, restParam) = + if (hasRepeatedParam) (formalAndRestParams.init, Some(formalAndRestParams.last)) + else (formalAndRestParams, None) + + val 
genBody = { + val call = if (isStaticCall) { + genApplyStatic(sym, formalCaptures.map(_.ref) ::: actualParams) + } else { + val thisCaptureRef :: argCaptureRefs = formalCaptures.map(_.ref): @unchecked + if (!sym.owner.isNonNativeJSClass || sym.isJSExposed) + genApplyMethodMaybeStatically(thisCaptureRef, sym, argCaptureRefs ::: actualParams) + else + genApplyJSClassMethod(thisCaptureRef, sym, argCaptureRefs ::: actualParams) + } + box(call, sym.info.finalResultType) + } + + val isThisFunction = funInterfaceSym.isSubClass(jsdefn.JSThisFunctionClass) && { + val ok = formalParams.nonEmpty + if (!ok) + report.error("The SAM or apply method for a js.ThisFunction must have a leading non-varargs parameter", tree) + ok + } + + if (isThisFunction) { + val thisParam :: otherParams = formalParams: @unchecked + js.Closure( + arrow = false, + formalCaptures, + otherParams, + restParam, + js.Block( + js.VarDef(thisParam.name, thisParam.originalName, + thisParam.ptpe, mutable = false, + js.This()(thisParam.ptpe)(thisParam.pos))(thisParam.pos), + genBody), + actualCaptures) + } else { + val closure = js.Closure(arrow = true, formalCaptures, formalParams, restParam, genBody, actualCaptures) + + if (!funInterfaceSym.exists || defn.isFunctionClass(funInterfaceSym)) { + assert(!funInterfaceSym.exists || defn.isFunctionClass(funInterfaceSym), + s"Invalid functional interface $funInterfaceSym reached the back-end") + val formalCount = formalParams.size + val cls = ClassName("scala.scalajs.runtime.AnonFunction" + formalCount) + val ctorName = MethodName.constructor( + jstpe.ClassRef(ClassName("scala.scalajs.js.Function" + formalCount)) :: Nil) + js.New(cls, js.MethodIdent(ctorName), List(closure)) + } else { + assert(funInterfaceSym.isJSType, + s"Invalid functional interface $funInterfaceSym reached the back-end") + closure + } + } + } + + /** Generates a static method instantiating and calling this + * DynamicImportThunk's `apply`: + * + * {{{ + * static def dynamicImport$;;Ljava.lang.Object(): any = { + * new .;:V().apply;Ljava.lang.Object() + * } + * }}} + */ + private def genDynamicImportForwarder(clsSym: Symbol)(using Position): js.MethodDef = { + withNewLocalNameScope { + val ctor = clsSym.primaryConstructor + val paramSyms = ctor.paramSymss.flatten + val paramDefs = paramSyms.map(genParamDef(_)) + + val body = { + val inst = js.New(encodeClassName(clsSym), encodeMethodSym(ctor), paramDefs.map(_.ref)) + genApplyMethod(inst, jsdefn.DynamicImportThunkClass_apply, Nil) + } + + js.MethodDef( + js.MemberFlags.empty.withNamespace(js.MemberNamespace.PublicStatic), + encodeDynamicImportForwarderIdent(paramSyms), + NoOriginalName, + paramDefs, + jstpe.AnyType, + Some(body))(OptimizerHints.empty, None) + } + } + + /** Boxes a value of the given type before `elimErasedValueType`. + * + * This should be used when sending values to a JavaScript context, which + * is erased/boxed at the IR level, although it is not erased at the + * dotty/JVM level. + * + * @param expr Tree to be boxed if needed. + * @param tpeEnteringElimErasedValueType The type of `expr` as it was + * entering the `elimErasedValueType` phase. 
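  /* A user-level sketch, for illustration, of closures that reach `genClosure`
   * above: an ordinary Scala function value, a SAM for a JS function type, and a
   * `js.ThisFunction` whose leading parameter receives the JS `this`.
   * {{{
   *   import scala.scalajs.js
   *
   *   // Wrapped in a scala.scalajs.runtime.AnonFunction1 around the js.Closure.
   *   val inc: Int => Int = x => x + 1
   *
   *   // An arrow closure; no wrapper is needed for JS function types.
   *   val add: js.Function2[Int, Int, Int] = (x, y) => x + y
   *
   *   // A non-arrow closure: `this` is rebound to the leading parameter.
   *   val self: js.ThisFunction0[js.Dynamic, js.Dynamic] = (thiz: js.Dynamic) => thiz
   * }}}
   */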
+ */ + def box(expr: js.Tree, tpeEnteringElimErasedValueType: Type)(implicit pos: Position): js.Tree = { + tpeEnteringElimErasedValueType match { + case tpe if isPrimitiveValueType(tpe) => + makePrimitiveBox(expr, tpe) + + case tpe: ErasedValueType => + val boxedClass = tpe.tycon.typeSymbol + val ctor = boxedClass.primaryConstructor + js.New(encodeClassName(boxedClass), encodeMethodSym(ctor), List(expr)) + + case _ => + expr + } + } + + /** Unboxes a value typed as Any to the given type before `elimErasedValueType`. + * + * This should be used when receiving values from a JavaScript context, + * which is erased/boxed at the IR level, although it is not erased at the + * dotty/JVM level. + * + * @param expr Tree to be extracted. + * @param tpeEnteringElimErasedValueType The type of `expr` as it was + * entering the `elimErasedValueType` phase. + */ + def unbox(expr: js.Tree, tpeEnteringElimErasedValueType: Type)(implicit pos: Position): js.Tree = { + tpeEnteringElimErasedValueType match { + case tpe if isPrimitiveValueType(tpe) => + makePrimitiveUnbox(expr, tpe) + + case tpe: ErasedValueType => + val boxedClass = tpe.tycon.typeSymbol.asClass + val unboxMethod = ValueClasses.valueClassUnbox(boxedClass) + val content = genApplyMethod( + js.AsInstanceOf(expr, encodeClassType(boxedClass)), unboxMethod, Nil) + if (unboxMethod.info.resultType <:< tpe.erasedUnderlying) + content + else + unbox(content, tpe.erasedUnderlying) + + case tpe => + genAsInstanceOf(expr, tpe) + } + } + + /** Gen JS code for an asInstanceOf cast (for reference types only) */ + private def genAsInstanceOf(value: js.Tree, to: Type)(implicit pos: Position): js.Tree = + genAsInstanceOf(value, toIRType(to)) + + /** Gen JS code for an asInstanceOf cast (for reference types only) */ + private def genAsInstanceOf(value: js.Tree, to: jstpe.Type)(implicit pos: Position): js.Tree = { + to match { + case jstpe.AnyType => + value + case jstpe.NullType => + js.If( + js.BinaryOp(js.BinaryOp.===, value, js.Null()), + js.Null(), + genThrowClassCastException())( + jstpe.NullType) + case jstpe.NothingType => + js.Block(value, genThrowClassCastException()) + case _ => + js.AsInstanceOf(value, to) + } + } + + private def genThrowClassCastException()(implicit pos: Position): js.Tree = { + js.Throw(js.New(jsNames.ClassCastExceptionClass, + js.MethodIdent(jsNames.NoArgConstructorName), Nil)) + } + + /** Gen JS code for an isInstanceOf test (for reference types only) */ + def genIsInstanceOf(value: js.Tree, to: Type)( + implicit pos: SourcePosition): js.Tree = { + val sym = to.typeSymbol + + if (sym == defn.ObjectClass) { + js.BinaryOp(js.BinaryOp.!==, value, js.Null()) + } else if (sym.isJSType) { + if (sym.is(Trait)) { + report.error( + em"isInstanceOf[${sym.fullName}] not supported because it is a JS trait", + pos) + js.BooleanLiteral(true) + } else { + js.AsInstanceOf(js.JSBinaryOp( + js.JSBinaryOp.instanceof, value, genLoadJSConstructor(sym)), + jstpe.BooleanType) + } + } else { + // The Scala type system prevents x.isInstanceOf[Null] and ...[Nothing] + assert(sym != defn.NullClass && sym != defn.NothingClass, + s"Found a .isInstanceOf[$sym] at $pos") + js.IsInstanceOf(value, toIRType(to)) + } + } + + /** Gen a statically linked call to an instance method. 
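  /* For illustration, a sketch of the value-class boxing that `box`/`unbox`
   * above perform around a JS boundary. `Meters` and `Widget` are hypothetical
   * names.
   * {{{
   *   import scala.scalajs.js
   *
   *   class Meters(val value: Double) extends AnyVal
   *
   *   @js.native
   *   trait Widget extends js.Object {
   *     def resize(height: Meters): Meters = js.native
   *   }
   *
   *   // The argument is boxed to an actual `Meters` instance before the JS call,
   *   // and the result coming back from JS is unboxed again.
   *   def grow(w: Widget, h: Meters): Meters = w.resize(h)
   * }}}
   */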
*/ + def genApplyMethodMaybeStatically(receiver: js.Tree, method: Symbol, + arguments: List[js.Tree])(implicit pos: Position): js.Tree = { + if (method.isPrivate || method.isClassConstructor) + genApplyMethodStatically(receiver, method, arguments) + else + genApplyMethod(receiver, method, arguments) + } + + /** Gen a dynamically linked call to a Scala method. */ + def genApplyMethod(receiver: js.Tree, method: Symbol, arguments: List[js.Tree])( + implicit pos: Position): js.Tree = { + assert(!method.isPrivate, + s"Cannot generate a dynamic call to private method $method at $pos") + js.Apply(js.ApplyFlags.empty, receiver, encodeMethodSym(method), arguments)( + toIRType(patchedResultType(method))) + } + + /** Gen a statically linked call to an instance method. */ + def genApplyMethodStatically(receiver: js.Tree, method: Symbol, arguments: List[js.Tree])( + implicit pos: Position): js.Tree = { + val flags = js.ApplyFlags.empty + .withPrivate(method.isPrivate && !method.isClassConstructor) + .withConstructor(method.isClassConstructor) + js.ApplyStatically(flags, receiver, encodeClassName(method.owner), + encodeMethodSym(method), arguments)( + toIRType(patchedResultType(method))) + } + + /** Gen a call to a static method. */ + private def genApplyStatic(method: Symbol, arguments: List[js.Tree])( + implicit pos: Position): js.Tree = { + js.ApplyStatic(js.ApplyFlags.empty.withPrivate(method.isPrivate), + encodeClassName(method.owner), encodeMethodSym(method), arguments)( + toIRType(patchedResultType(method))) + } + + /** Gen a call to a non-exposed method of a non-native JS class. */ + def genApplyJSClassMethod(receiver: js.Tree, method: Symbol, arguments: List[js.Tree])( + implicit pos: Position): js.Tree = { + genApplyStatic(method, receiver :: arguments) + } + + /** Gen a call to a method of a Scala top-level module. 
*/ + private def genModuleApplyMethod(methodSym: Symbol, arguments: List[js.Tree])( + implicit pos: SourcePosition): js.Tree = { + genApplyMethod(genLoadModule(methodSym.owner), methodSym, arguments) + } + + /** Gen a boxing operation (tpe is the primitive type) */ + private def makePrimitiveBox(expr: js.Tree, tpe: Type)( + implicit pos: Position): js.Tree = { + toIRType(tpe) match { + case jstpe.NoType => // for JS interop cases + js.Block(expr, js.Undefined()) + case jstpe.BooleanType | jstpe.CharType | jstpe.ByteType | + jstpe.ShortType | jstpe.IntType | jstpe.LongType | jstpe.FloatType | + jstpe.DoubleType => + expr // box is identity for all those primitive types + case typeRef => + throw new FatalError( + s"makePrimitiveBox requires a primitive type, found $typeRef for $tpe at $pos") + } + } + + /** Gen an unboxing operation (tpe is the primitive type) */ + private def makePrimitiveUnbox(expr: js.Tree, tpe: Type)( + implicit pos: Position): js.Tree = { + toIRType(tpe) match { + case jstpe.NoType => expr // for JS interop cases + case irTpe => js.AsInstanceOf(expr, irTpe) + } + } + + /** Gen JS code for a Scala.js-specific primitive method */ + private def genJSPrimitive(tree: Apply, args: List[Tree], code: Int, + isStat: Boolean): js.Tree = { + + import JSPrimitives._ + + implicit val pos = tree.span + + def genArgs1: js.Tree = { + assert(args.size == 1, + s"Expected exactly 1 argument for JS primitive $code but got " + + s"${args.size} at $pos") + genExpr(args.head) + } + + def genArgs2: (js.Tree, js.Tree) = { + assert(args.size == 2, + s"Expected exactly 2 arguments for JS primitive $code but got " + + s"${args.size} at $pos") + (genExpr(args.head), genExpr(args.tail.head)) + } + + def genArgsVarLength: List[js.TreeOrJSSpread] = + genActualJSArgs(tree.symbol, args) + + def resolveReifiedJSClassSym(arg: Tree): Symbol = { + def fail(): Symbol = { + report.error( + tree.symbol.name.toString + " must be called with a constant " + + "classOf[T] representing a class extending js.Any " + + "(not a trait nor an object)", + tree.sourcePos) + NoSymbol + } + arg match { + case Literal(value) if value.tag == Constants.ClazzTag => + val classSym = value.typeValue.typeSymbol + if (classSym.isJSType && !classSym.is(Trait) && !classSym.is(ModuleClass)) + classSym + else + fail() + case _ => + fail() + } + } + + (code: @switch) match { + case DYNNEW => + // js.Dynamic.newInstance(clazz)(actualArgs: _*) + val (jsClass, actualArgs) = extractFirstArg(genArgsVarLength) + js.JSNew(jsClass, actualArgs) + + case ARR_CREATE => + // js.Array(elements: _*) + js.JSArrayConstr(genArgsVarLength) + + case CONSTRUCTOROF => + // runtime.constructorOf(clazz) + val classSym = resolveReifiedJSClassSym(args.head) + if (classSym == NoSymbol) + js.Undefined() // compile error emitted by resolveReifiedJSClassSym + else + genLoadJSConstructor(classSym) + + case CREATE_INNER_JS_CLASS | CREATE_LOCAL_JS_CLASS => + // runtime.createInnerJSClass(clazz, superClass) + // runtime.createLocalJSClass(clazz, superClass, fakeNewInstances) + val classSym = resolveReifiedJSClassSym(args(0)) + val superClassValue = genExpr(args(1)) + if (classSym == NoSymbol) { + js.Undefined() // compile error emitted by resolveReifiedJSClassSym + } else { + val captureValues = { + if (code == CREATE_INNER_JS_CLASS) { + /* Private inner classes that do not actually access their outer + * pointer do not receive an outer argument. 
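  /* For illustration, user-level forms of the first few primitives handled
   * above. `Point` is a hypothetical non-native JS class used as the `classOf`
   * argument of `js.constructorOf`.
   * {{{
   *   import scala.scalajs.js
   *
   *   class Point(val x: Int, val y: Int) extends js.Object
   *
   *   val date = js.Dynamic.newInstance(js.Dynamic.global.Date)()  // DYNNEW
   *   val arr  = js.Array(1, 2, 3)                                 // ARR_CREATE
   *   val ctor = js.constructorOf[Point]                           // CONSTRUCTOROF
   * }}}
   */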
We therefore count + * the number of constructors that have non-empty param list to + * know how many times we need to pass `this`. + */ + val requiredThisParams = + classSym.info.decls.lookupAll(nme.CONSTRUCTOR).count(_.info.paramInfoss.head.nonEmpty) + val outer = genThis() + List.fill(requiredThisParams)(outer) + } else { + val fakeNewInstances = args(2).asInstanceOf[JavaSeqLiteral].elems + fakeNewInstances.flatMap(genCaptureValuesFromFakeNewInstance(_)) + } + } + js.CreateJSClass(encodeClassName(classSym), superClassValue :: captureValues) + } + + case WITH_CONTEXTUAL_JS_CLASS_VALUE => + // withContextualJSClassValue(jsclass, inner) + val jsClassValue = genExpr(args(0)) + withScopedVars( + contextualJSClassValue := Some(jsClassValue) + ) { + genStatOrExpr(args(1), isStat) + } + + case LINKING_INFO => + // runtime.linkingInfo + js.JSLinkingInfo() + + case DEBUGGER => + // js.special.debugger() + js.Debugger() + + case UNITVAL => + // BoxedUnit.UNIT, which is the boxed version of () + js.Undefined() + + case JS_NEW_TARGET => + // js.new.target + val valid = currentMethodSym.get.isClassConstructor && currentClassSym.isNonNativeJSClass + if (!valid) { + report.error( + "Illegal use of js.`new`.target.\n" + + "It can only be used in the constructor of a JS class, " + + "as a statement or in the rhs of a val or var.\n" + + "It cannot be used inside a lambda or by-name parameter, nor in any other location.", + tree.sourcePos) + } + js.JSNewTarget() + + case JS_IMPORT => + // js.import(arg) + val arg = genArgs1 + js.JSImportCall(arg) + + case JS_IMPORT_META => + // js.import.meta + js.JSImportMeta() + + case DYNAMIC_IMPORT => + // runtime.dynamicImport + assert(args.size == 1, + s"Expected exactly 1 argument for JS primitive $code but got " + + s"${args.size} at $pos") + + args.head match { + case Block(stats, expr @ Typed(Apply(fun @ Select(New(tpt), _), args), _)) => + /* stats is always empty if no other compiler plugin is present. + * However, code instrumentation (notably scoverage) might add + * statements here. If this is the case, the thunk anonymous class + * has already been created when the other plugin runs (i.e. the + * plugin ran after jsinterop). + * + * Therefore, it is OK to leave the statements on our side of the + * dynamic loading boundary. 
+ */ + + val clsSym = tpt.symbol + val ctor = fun.symbol + + assert(clsSym.isSubClass(jsdefn.DynamicImportThunkClass), + s"expected subclass of DynamicImportThunk, got: $clsSym at: ${expr.sourcePos}") + assert(ctor.isPrimaryConstructor, + s"expected primary constructor, got: $ctor at: ${expr.sourcePos}") + + js.Block( + stats.map(genStat(_)), + js.ApplyDynamicImport( + js.ApplyFlags.empty, + encodeClassName(clsSym), + encodeDynamicImportForwarderIdent(ctor.paramSymss.flatten), + genActualArgs(ctor, args)) + ) + + case tree => + throw new FatalError( + s"Unexpected argument tree in dynamicImport: $tree/${tree.getClass} at: $pos") + } + + case JS_NATIVE => + // js.native + report.error( + "js.native may only be used as stub implementation in facade types", + tree.sourcePos) + js.Undefined() + + case TYPEOF => + // js.typeOf(arg) + val arg = genArgs1 + val typeofExpr = arg match { + case arg: js.JSGlobalRef => js.JSTypeOfGlobalRef(arg) + case _ => js.JSUnaryOp(js.JSUnaryOp.typeof, arg) + } + js.AsInstanceOf(typeofExpr, jstpe.ClassType(jsNames.BoxedStringClass)) + + case STRICT_EQ => + // js.special.strictEquals(arg1, arg2) + val (arg1, arg2) = genArgs2 + js.JSBinaryOp(js.JSBinaryOp.===, arg1, arg2) + + case IN => + // js.special.in(arg1, arg2) + val (arg1, arg2) = genArgs2 + js.AsInstanceOf(js.JSBinaryOp(js.JSBinaryOp.in, arg1, arg2), + jstpe.BooleanType) + + case INSTANCEOF => + // js.special.instanceof(arg1, arg2) + val (arg1, arg2) = genArgs2 + js.AsInstanceOf(js.JSBinaryOp(js.JSBinaryOp.instanceof, arg1, arg2), + jstpe.BooleanType) + + case DELETE => + // js.special.delete(arg1, arg2) + val (arg1, arg2) = genArgs2 + js.JSDelete(arg1, arg2) + + case FORIN => + /* js.special.forin(arg1, arg2) + * + * We must generate: + * + * val obj = arg1 + * val f = arg2 + * for (val key in obj) { + * f(key) + * } + * + * with temporary vals, because `arg2` must be evaluated only + * once, and after `arg1`. + */ + val (arg1, arg2) = genArgs2 + val objVarDef = js.VarDef(freshLocalIdent("obj"), NoOriginalName, + jstpe.AnyType, mutable = false, arg1) + val fVarDef = js.VarDef(freshLocalIdent("f"), NoOriginalName, + jstpe.AnyType, mutable = false, arg2) + val keyVarIdent = freshLocalIdent("key") + val keyVarRef = js.VarRef(keyVarIdent)(jstpe.AnyType) + js.Block( + objVarDef, + fVarDef, + js.ForIn(objVarDef.ref, keyVarIdent, NoOriginalName, { + js.JSFunctionApply(fVarDef.ref, List(keyVarRef)) + })) + + case JS_THROW => + // js.special.throw(arg) + js.Throw(genArgs1) + + case JS_TRY_CATCH => + /* js.special.tryCatch(arg1, arg2) + * + * We must generate: + * + * val body = arg1 + * val handler = arg2 + * try { + * body() + * } catch (e) { + * handler(e) + * } + * + * with temporary vals, because `arg2` must be evaluated before + * `body` executes. Moreover, exceptions thrown while evaluating + * the function values `arg1` and `arg2` must not be caught. 
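  /* For illustration, user-level calls corresponding to some of the
   * `js.special` primitives handled above; `obj` is an arbitrary example value.
   * {{{
   *   import scala.scalajs.js
   *
   *   val obj = js.Dynamic.literal(a = 1, b = 2)
   *
   *   val tpe  = js.typeOf(obj)                   // TYPEOF
   *   val hasA = js.special.in("a", obj)          // IN
   *   js.special.delete(obj, "a")                 // DELETE
   *   val same = js.special.strictEquals(1, 1.0)  // STRICT_EQ
   *   js.special.debugger()                       // DEBUGGER
   * }}}
   */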
+ */ + val (arg1, arg2) = genArgs2 + val bodyVarDef = js.VarDef(freshLocalIdent("body"), NoOriginalName, + jstpe.AnyType, mutable = false, arg1) + val handlerVarDef = js.VarDef(freshLocalIdent("handler"), NoOriginalName, + jstpe.AnyType, mutable = false, arg2) + val exceptionVarIdent = freshLocalIdent("e") + val exceptionVarRef = js.VarRef(exceptionVarIdent)(jstpe.AnyType) + js.Block( + bodyVarDef, + handlerVarDef, + js.TryCatch( + js.JSFunctionApply(bodyVarDef.ref, Nil), + exceptionVarIdent, + NoOriginalName, + js.JSFunctionApply(handlerVarDef.ref, List(exceptionVarRef)) + )(jstpe.AnyType) + ) + + case WRAP_AS_THROWABLE => + // js.special.wrapAsThrowable(arg) + js.WrapAsThrowable(genArgs1) + + case UNWRAP_FROM_THROWABLE => + // js.special.unwrapFromThrowable(arg) + js.UnwrapFromThrowable(genArgs1) + + case UNION_FROM | UNION_FROM_TYPE_CONSTRUCTOR => + /* js.|.from and js.|.fromTypeConstructor + * We should not have to deal with those. They have a perfectly valid + * user-space implementation. However, the Dotty type checker inserts + * way too many of those, even when they are completely unnecessary. + * That still wouldn't be an issue ... if only it did not insert them + * around the default getters to their parameters! But even there it + * does it (although the types are, by construction, *equivalent*!), + * and that kills our `UndefinedParam` treatment. So we have to handle + * those two methods as primitives to completely eliminate them. + * + * Hopefully this will become unnecessary when/if we manage to + * reinterpret js.| as a true Dotty union type. + */ + genArgs2._1 + + case REFLECT_SELECTABLE_SELECTDYN => + // scala.reflect.Selectable.selectDynamic + genReflectiveCall(tree, isSelectDynamic = true) + case REFLECT_SELECTABLE_APPLYDYN => + // scala.reflect.Selectable.applyDynamic + genReflectiveCall(tree, isSelectDynamic = false) + } + } + + /** Gen the SJSIR for a reflective call. + * + * Reflective calls are calls to a structural type field or method that + * involve a reflective Selectable. They look like the following in source + * code: + * {{{ + * import scala.reflect.Selectable.reflectiveSelectable + * + * type Structural = { + * val foo: Int + * def bar(x: Int, y: String): String + * } + * + * val structural: Structural = new { + * val foo: Int = 5 + * def bar(x: Int, y: String): String = x.toString + y + * } + * + * structural.foo + * structural.bar(6, "hello") + * }}} + * + * After expansion by the Scala 3 rules for structural member selections and + * calls, they look like + * + * {{{ + * reflectiveSelectable(structural).selectDynamic("foo") + * reflectiveSelectable(structural).applyDynamic("bar", + * classOf[Int], classOf[String] + * )( + * 6, "hello" + * ) + * }}} + * + * When the original `structural` value is already of a subtype of + * `scala.reflect.Selectable`, there is no conversion involved. There could + * also be any other arbitrary conversion, such as the deprecated bridge for + * Scala 2's `import scala.language.reflectiveCalls`. 
In general, the shape + * is therefore the following, for some `selectable: reflect.Selectable`: + * + * {{{ + * selectable.selectDynamic("foo") + * selectable.applyDynamic("bar", + * classOf[Int], classOf[String] + * )( + * 6, "hello" + * ) + * }}} + * + * and eventually reaches the back-end as + * + * {{{ + * selectable.selectDynamic("foo") // same as above + * selectable.applyDynamic("bar", + * wrapRefArray([ classOf[Int], classOf[String] : jl.Class ] + * )( + * genericWrapArray([ Int.box(6), "hello" : Object ]) + * ) + * }}} + * + * In SJSIR, they must be encoded as follows: + * + * {{{ + * selectable.selectedValue;O().foo;R() + * selectable.selectedValue;O().bar;I;Ljava.lang.String;R( + * Int.box(6).asInstanceOf[int], + * "hello".asInstanceOf[java.lang.String] + * ) + * }}} + * + * where `selectedValue;O()` is declared in `scala.reflect.Selectable` and + * holds the actual instance on which to perform the reflective operations. + * For the typical use case from the first snippet, it returns `structural`. + * + * This means that we must deconstruct the elaborated calls to recover: + * + * - the method name as a compile-time string `foo` or `bar` + * - the `tp: Type`s that have been wrapped in `classOf[tp]`, as a + * compile-time List[Type], from which we'll derive `jstpe.Type`s for the + * `asInstanceOf`s and `jstpe.TypeRef`s for the `MethodName.reflectiveProxy` + * - the actual arguments as a compile-time `List[Tree]` + * + * Virtually all of the code in `genReflectiveCall` deals with recovering + * those elements. Constructing the IR Tree is the easy part after that. + */ + private def genReflectiveCall(tree: Apply, isSelectDynamic: Boolean): js.Tree = { + implicit val pos = tree.span + val Apply(fun @ Select(receiver, _), args) = tree: @unchecked + + val selectedValueTree = js.Apply(js.ApplyFlags.empty, genExpr(receiver), + js.MethodIdent(selectedValueMethodName), Nil)(jstpe.AnyType) + + // Extract the method name as a String + val methodNameStr = args.head match { + case Literal(Constants.Constant(name: String)) => + name + case _ => + report.error( + "The method name given to Selectable.selectDynamic or Selectable.applyDynamic " + + "must be a literal string. " + + "Other uses are not supported in Scala.js.", + args.head.sourcePos) + "erroneous" + } + + val (formalParamTypeRefs, actualArgs) = if (isSelectDynamic) { + (Nil, Nil) + } else { + // Extract the param type refs and actual args from the 2nd and 3rd argument to applyDynamic + args.tail match { + case WrapArray(classOfsArray: JavaSeqLiteral) :: WrapArray(actualArgsAnyArray: JavaSeqLiteral) :: Nil => + // Extract jstpe.Type's and jstpe.TypeRef's from the classOf[_] trees + val formalParamTypesAndTypeRefs = classOfsArray.elems.map { + // classOf[tp] -> tp + case Literal(const) if const.tag == Constants.ClazzTag => + toIRTypeAndTypeRef(const.typeValue) + // Anything else is invalid + case otherTree => + report.error( + "The java.lang.Class[_] arguments passed to Selectable.applyDynamic must be " + + "literal classOf[T] expressions (typically compiler-generated). 
" + + "Other uses are not supported in Scala.js.", + otherTree.sourcePos) + (jstpe.AnyType, jstpe.ClassRef(jsNames.ObjectClass)) + } + + // Gen the actual args, downcasting them to the formal param types + val actualArgs = actualArgsAnyArray.elems.zip(formalParamTypesAndTypeRefs).map { + (actualArgAny, formalParamTypeAndTypeRef) => + val genActualArgAny = genExpr(actualArgAny) + genAsInstanceOf(genActualArgAny, formalParamTypeAndTypeRef._1)(genActualArgAny.pos) + } + + (formalParamTypesAndTypeRefs.map(pair => toParamOrResultTypeRef(pair._2)), actualArgs) + + case _ => + report.error( + "Passing the varargs of Selectable.applyDynamic with `: _*` " + + "is not supported in Scala.js.", + tree.sourcePos) + (Nil, Nil) + } + } + + val methodName = MethodName.reflectiveProxy(methodNameStr, formalParamTypeRefs) + + js.Apply(js.ApplyFlags.empty, selectedValueTree, js.MethodIdent(methodName), actualArgs)(jstpe.AnyType) + } + + /** Gen actual actual arguments to Scala method call. + * Returns a list of the transformed arguments. + * + * This tries to optimize repeated arguments (varargs) by turning them + * into js.WrappedArray instead of Scala wrapped arrays. + */ + private def genActualArgs(sym: Symbol, args: List[Tree])( + implicit pos: Position): List[js.Tree] = { + args.map(genExpr) + /*val wereRepeated = exitingPhase(currentRun.typerPhase) { + sym.tpe.params.map(p => isScalaRepeatedParamType(p.tpe)) + } + + if (wereRepeated.size > args.size) { + // Should not happen, but let's not crash + args.map(genExpr) + } else { + /* Arguments that are in excess compared to the type signature after + * erasure are lambda-lifted arguments. They cannot be repeated, hence + * the extension to `false`. + */ + for ((arg, wasRepeated) <- args.zipAll(wereRepeated, EmptyTree, false)) yield { + if (wasRepeated) { + tryGenRepeatedParamAsJSArray(arg, handleNil = false).fold { + genExpr(arg) + } { genArgs => + genNew(WrappedArrayClass, WrappedArray_ctor, + List(js.JSArrayConstr(genArgs))) + } + } else { + genExpr(arg) + } + } + }*/ + } + + /** Gen actual actual arguments to a JS method call. + * Returns a list of the transformed arguments. + * + * - TODO Repeated arguments (varargs) are expanded + * - Default arguments are omitted or replaced by undefined + * - All arguments are boxed + * + * Repeated arguments that cannot be expanded at compile time (i.e., if a + * Seq is passed to a varargs parameter with the syntax `seq: _*`) will be + * wrapped in a [[js.JSSpread]] node to be expanded at runtime. + */ + private def genActualJSArgs(sym: Symbol, args: List[Tree])( + implicit pos: Position): List[js.TreeOrJSSpread] = { + + var reversedArgs: List[js.TreeOrJSSpread] = Nil + + for ((arg, info) <- args.zip(sym.jsParamInfos)) { + if (info.repeated) { + reversedArgs = genJSRepeatedParam(arg) reverse_::: reversedArgs + } else if (info.capture) { + // Ignore captures + assert(sym.isClassConstructor, + i"Found a capture param in method ${sym.fullName}, which is not a class constructor, at $pos") + } else { + val unboxedArg = genExpr(arg) + val boxedArg = unboxedArg match { + case js.Transient(UndefinedParam) => + unboxedArg + case _ => + box(unboxedArg, info.info) + } + reversedArgs ::= boxedArg + } + } + + /* Remove all consecutive UndefinedParam's at the end of the argument + * list. No check is performed whether they may be there, since they will + * only be placed where default arguments can be anyway. 
+ */ + reversedArgs = reversedArgs.dropWhile(_.isInstanceOf[js.Transient]) + + /* Find remaining UndefinedParam and replace by js.Undefined. This can + * happen with named arguments or with multiple argument lists. + */ + reversedArgs = reversedArgs map { + case js.Transient(UndefinedParam) => js.Undefined() + case arg => arg + } + + reversedArgs.reverse + } + + /** Gen JS code for a repeated param of a JS method. + * + * In this case `arg` has type `Seq[T]` for some `T`, but the result should + * be an expanded list of the elements in the sequence. So this method + * takes care of the conversion. + * + * It is specialized for the shapes of tree generated by the desugaring + * of repeated params in Scala, so that these are actually expanded at + * compile-time. + * + * Otherwise, it returns a `JSSpread` with the `Seq` converted to a + * `js.Array`. + */ + private def genJSRepeatedParam(arg: Tree): List[js.TreeOrJSSpread] = { + tryGenRepeatedParamAsJSArray(arg, handleNil = true).getOrElse { + /* Fall back to calling runtime.genTraversableOnce2jsArray + * to perform the conversion to js.Array, then wrap in a Spread + * operator. + */ + implicit val pos: SourcePosition = arg.sourcePos + val jsArrayArg = genModuleApplyMethod( + jsdefn.Runtime_toJSVarArgs, + List(genExpr(arg))) + List(js.JSSpread(jsArrayArg)) + } + } + + /** Try and expand an actual argument to a repeated param `(xs: T*)`. + * + * This method recognizes the shapes of tree generated by the desugaring + * of repeated params in Scala, and expands them. + * If `arg` does not have the shape of a generated repeated param, this + * method returns `None`. + */ + private def tryGenRepeatedParamAsJSArray(arg: Tree, + handleNil: Boolean): Option[List[js.Tree]] = { + implicit val pos = arg.span + + // Given a method `def foo(args: T*)` + arg match { + // foo(arg1, arg2, ..., argN) where N > 0 + case MaybeAsInstanceOf(WrapArray(MaybeAsInstanceOf(array: JavaSeqLiteral))) => + /* Value classes in arrays are already boxed, so no need to use + * the type before erasure. + * TODO Is this true in dotty? + */ + Some(array.elems.map(e => box(genExpr(e), e.tpe))) + + // foo() + case Ident(_) if handleNil && arg.symbol == defn.NilModule => + Some(Nil) + + // foo(argSeq: _*) - cannot be optimized + case _ => + None + } + } + + private object MaybeAsInstanceOf { + def unapply(tree: Tree): Some[Tree] = tree match { + case TypeApply(asInstanceOf_? @ Select(base, _), _) + if asInstanceOf_?.symbol == defn.Any_asInstanceOf => + Some(base) + case _ => + Some(tree) + } + } + + private object WrapArray { + lazy val isWrapArray: Set[Symbol] = { + val names0 = defn.ScalaValueClasses().map(sym => nme.wrapXArray(sym.name)) + val names1 = names0 ++ Set(nme.wrapRefArray, nme.genericWrapArray) + val symsInPredef = names1.map(defn.ScalaPredefModule.requiredMethod(_)) + val symsInScalaRunTime = names1.map(defn.ScalaRuntimeModule.requiredMethod(_)) + (symsInPredef ++ symsInScalaRunTime).toSet + } + + def unapply(tree: Apply): Option[Tree] = tree match { + case Apply(wrapArray_?, List(wrapped)) if isWrapArray(wrapArray_?.symbol) => + Some(wrapped) + case _ => + None + } + } + + /** Wraps a `js.Array` to use as varargs. */ + def genJSArrayToVarArgs(arrayRef: js.Tree)(implicit pos: SourcePosition): js.Tree = + genModuleApplyMethod(jsdefn.Runtime_toScalaVarArgs, List(arrayRef)) + + /** Gen the actual capture values for a JS constructor based on its fake `new` invocation. 
*/ + private def genCaptureValuesFromFakeNewInstance(tree: Tree): List[js.Tree] = { + implicit val pos: Position = tree.span + + val Apply(fun @ Select(New(_), _), args) = tree: @unchecked + val sym = fun.symbol + + /* We use the same strategy as genActualJSArgs to detect which parameters were + * introduced by explicitouter or lambdalift (but reversed, of course). + */ + + val existedBeforeUncurry = atPhase(elimRepeatedPhase) { + sym.info.paramNamess.flatten.toSet + } + + for { + (arg, paramName) <- args.zip(sym.info.paramNamess.flatten) + if !existedBeforeUncurry(paramName) + } yield { + genExpr(arg) + } + } + + private def genVarRef(sym: Symbol)(implicit pos: Position): js.VarRef = + js.VarRef(encodeLocalSym(sym))(toIRType(sym.info)) + + private def genAssignableField(sym: Symbol, qualifier: Tree)(implicit pos: SourcePosition): (js.AssignLhs, Boolean) = { + def qual = genExpr(qualifier) + + if (sym.owner.isNonNativeJSClass) { + val f = if (sym.isJSExposed) { + js.JSSelect(qual, genExpr(sym.jsName)) + } else if (sym.owner.isAnonymousClass) { + js.JSSelect( + js.JSSelect(qual, genPrivateFieldsSymbol()), + encodeFieldSymAsStringLiteral(sym)) + } else { + js.JSPrivateSelect(qual, encodeClassName(sym.owner), + encodeFieldSym(sym)) + } + + (f, true) + } else if (sym.hasAnnotation(jsdefn.JSExportTopLevelAnnot)) { + val f = js.SelectStatic(encodeClassName(sym.owner), encodeFieldSym(sym))(jstpe.AnyType) + (f, true) + } else if (sym.hasAnnotation(jsdefn.JSExportStaticAnnot)) { + val jsName = sym.getAnnotation(jsdefn.JSExportStaticAnnot).get.argumentConstantString(0).getOrElse { + sym.defaultJSName + } + val companionClass = sym.owner.linkedClass + val f = js.JSSelect(genLoadJSConstructor(companionClass), js.StringLiteral(jsName)) + (f, true) + } else { + val className = encodeClassName(sym.owner) + val fieldIdent = encodeFieldSym(sym) + + /* #4370 Fields cannot have type NothingType, so we box them as + * scala.runtime.Nothing$ instead. They will be initialized with + * `null`, and any attempt to access them will throw a + * `ClassCastException` (generated in the unboxing code). + */ + val (irType, boxed) = toIRType(sym.info) match + case jstpe.NothingType => + (encodeClassType(defn.NothingClass), true) + case ftpe => + (ftpe, false) + + val f = + if sym.is(JavaStatic) then + js.SelectStatic(className, fieldIdent)(irType) + else + js.Select(qual, className, fieldIdent)(irType) + + (f, boxed) + } + } + + /** Gen JS code for loading a Java static field. + */ + private def genLoadStaticField(sym: Symbol)(implicit pos: SourcePosition): js.Tree = { + /* Actually, there is no static member in Scala.js. If we come here, that + * is because we found the symbol in a Java-emitted .class in the + * classpath. But the corresponding implementation in Scala.js will + * actually be a val in the companion module. + */ + + if (sym == defn.BoxedUnit_UNIT) { + js.Undefined() + } else if (sym == defn.BoxedUnit_TYPE) { + js.ClassOf(jstpe.VoidRef) + } else { + val className = encodeClassName(sym.owner) + val method = encodeStaticMemberSym(sym) + js.ApplyStatic(js.ApplyFlags.empty, className, method, Nil)(toIRType(sym.info)) + } + } + + /** Generates a call to `runtime.privateFieldsSymbol()` */ + private def genPrivateFieldsSymbol()(implicit pos: SourcePosition): js.Tree = + genModuleApplyMethod(jsdefn.Runtime_privateFieldsSymbol, Nil) + + /** Generate loading of a module value. + * + * Can be given either the module symbol or its module class symbol. 
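  /* For illustration, a sketch of the global-scope rule described below.
   * `encodeURIComponent` is just an example of a global function.
   * {{{
   *   import scala.scalajs.js
   *   import scala.scalajs.js.annotation._
   *
   *   @js.native @JSGlobalScope
   *   object Globals extends js.Any {
   *     def encodeURIComponent(s: String): String = js.native
   *   }
   *
   *   val ok = Globals.encodeURIComponent("a b") // fine: qualifier of a selection
   *   // val bad = Globals                       // rejected: the global scope is not a value
   * }}}
   */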
+ * + * If the module we load refers to the global scope (i.e., it is + * annotated with `@JSGlobalScope`), report a compile error specifying + * that a global scope object should only be used as the qualifier of a + * `.`-selection. + */ + def genLoadModule(sym: Symbol)(implicit pos: SourcePosition): js.Tree = + ruleOutGlobalScope(genLoadModuleOrGlobalScope(sym)) + + /** Generate loading of a module value or the global scope. + * + * Can be given either the module symbol of its module class symbol. + * + * Unlike `genLoadModule`, this method does not fail if the module we load + * refers to the global scope. + */ + def genLoadModuleOrGlobalScope(sym0: Symbol)( + implicit pos: SourcePosition): MaybeGlobalScope = { + + require(sym0.is(Module), + "genLoadModule called with non-module symbol: " + sym0) + val sym = if (sym0.isTerm) sym0.moduleClass else sym0 + + // Does that module refer to the global scope? + if (sym.hasAnnotation(jsdefn.JSGlobalScopeAnnot)) { + MaybeGlobalScope.GlobalScope(pos) + } else { + val cls = encodeClassName(sym) + val tree = + if (sym.isJSType) js.LoadJSModule(cls) + else js.LoadModule(cls) + MaybeGlobalScope.NotGlobalScope(tree) + } + } + + /** Gen JS code representing the constructor of a JS class. */ + private def genLoadJSConstructor(sym: Symbol)( + implicit pos: Position): js.Tree = { + assert(!isStaticModule(sym) && !sym.is(Trait), + s"genLoadJSConstructor called with non-class $sym") + js.LoadJSConstructor(encodeClassName(sym)) + } + + private inline val GenericGlobalObjectInformationMsg = { + "\n " + + "See https://www.scala-js.org/doc/interoperability/global-scope.html " + + "for further information." + } + + /** Rule out the `GlobalScope` case of a `MaybeGlobalScope` and extract the + * value tree. + * + * If `tree` represents the global scope, report a compile error. + */ + private def ruleOutGlobalScope(tree: MaybeGlobalScope): js.Tree = { + tree match { + case MaybeGlobalScope.NotGlobalScope(t) => + t + case MaybeGlobalScope.GlobalScope(pos) => + reportErrorLoadGlobalScope()(pos) + } + } + + /** Report a compile error specifying that the global scope cannot be + * loaded as a value. + */ + private def reportErrorLoadGlobalScope()(implicit pos: SourcePosition): js.Tree = { + report.error( + "Loading the global scope as a value (anywhere but as the " + + "left-hand-side of a `.`-selection) is not allowed." + + GenericGlobalObjectInformationMsg, + pos) + js.Undefined() + } + + /** Gen a JS bracket select or a `JSGlobalRef`. + * + * If the receiver is a normal value, i.e., not the global scope, then + * emit a `JSSelect`. + * + * Otherwise, if the `item` is a constant string that is a valid + * JavaScript identifier, emit a `JSGlobalRef`. + * + * Otherwise, report a compile error. + */ + private def genJSSelectOrGlobalRef(qual: MaybeGlobalScope, item: js.Tree)( + implicit pos: SourcePosition): js.AssignLhs = { + qual match { + case MaybeGlobalScope.NotGlobalScope(qualTree) => + js.JSSelect(qualTree, item) + + case MaybeGlobalScope.GlobalScope(_) => + item match { + case js.StringLiteral(value) => + if (js.JSGlobalRef.isValidJSGlobalRefName(value)) { + js.JSGlobalRef(value) + } else if (js.JSGlobalRef.ReservedJSIdentifierNames.contains(value)) { + report.error( + "Invalid selection in the global scope of the reserved " + + s"identifier name `$value`." 
+ + GenericGlobalObjectInformationMsg, + pos) + js.JSGlobalRef("erroneous") + } else { + report.error( + "Selecting a field of the global scope whose name is " + + "not a valid JavaScript identifier is not allowed." + + GenericGlobalObjectInformationMsg, + pos) + js.JSGlobalRef("erroneous") + } + + case _ => + report.error( + "Selecting a field of the global scope with a dynamic " + + "name is not allowed." + + GenericGlobalObjectInformationMsg, + pos) + js.JSGlobalRef("erroneous") + } + } + } + + /** Gen a JS bracket method apply or an apply of a `GlobalRef`. + * + * If the receiver is a normal value, i.e., not the global scope, then + * emit a `JSMethodApply`. + * + * Otherwise, if the `method` is a constant string that is a valid + * JavaScript identifier, emit a `JSFunctionApply(JSGlobalRef(...), ...)`. + * + * Otherwise, report a compile error. + */ + private def genJSMethodApplyOrGlobalRefApply( + receiver: MaybeGlobalScope, method: js.Tree, args: List[js.TreeOrJSSpread])( + implicit pos: SourcePosition): js.Tree = { + receiver match { + case MaybeGlobalScope.NotGlobalScope(receiverTree) => + js.JSMethodApply(receiverTree, method, args) + + case MaybeGlobalScope.GlobalScope(_) => + method match { + case js.StringLiteral(value) => + if (js.JSGlobalRef.isValidJSGlobalRefName(value)) { + js.JSFunctionApply(js.JSGlobalRef(value), args) + } else if (js.JSGlobalRef.ReservedJSIdentifierNames.contains(value)) { + report.error( + "Invalid call in the global scope of the reserved " + + s"identifier name `$value`." + + GenericGlobalObjectInformationMsg, + pos) + js.Undefined() + } else { + report.error( + "Calling a method of the global scope whose name is not " + + "a valid JavaScript identifier is not allowed." + + GenericGlobalObjectInformationMsg, + pos) + js.Undefined() + } + + case _ => + report.error( + "Calling a method of the global scope with a dynamic " + + "name is not allowed." + + GenericGlobalObjectInformationMsg, + pos) + js.Undefined() + } + } + } + + private def computeJSNativeLoadSpecOfValDef(sym: Symbol): js.JSNativeLoadSpec = { + atPhaseBeforeTransforms { + computeJSNativeLoadSpecOfInPhase(sym) + } + } + + private def computeJSNativeLoadSpecOfClass(sym: Symbol): Option[js.JSNativeLoadSpec] = { + if (sym.is(Trait) || sym.hasAnnotation(jsdefn.JSGlobalScopeAnnot)) { + None + } else { + atPhaseBeforeTransforms { + if (sym.owner.isStaticOwner) + Some(computeJSNativeLoadSpecOfInPhase(sym)) + else + None + } + } + } + + private def computeJSNativeLoadSpecOfInPhase(sym: Symbol)(using Context): js.JSNativeLoadSpec = { + import js.JSNativeLoadSpec._ + + val symOwner = sym.owner + + // Marks a code path as unexpected because it should have been reported as an error in `PrepJSInterop`. 
+ def unexpected(msg: String): Nothing = + throw new FatalError(i"$msg for ${sym.fullName} at ${sym.srcPos}") + + if (symOwner.hasAnnotation(jsdefn.JSNativeAnnot)) { + val jsName = sym.jsName match { + case JSName.Literal(jsName) => jsName + case JSName.Computed(_) => unexpected("could not read the simple JS name as a string literal") + } + + if (symOwner.hasAnnotation(jsdefn.JSGlobalScopeAnnot)) { + Global(jsName, Nil) + } else { + val ownerLoadSpec = computeJSNativeLoadSpecOfInPhase(symOwner) + ownerLoadSpec match { + case Global(globalRef, path) => + Global(globalRef, path :+ jsName) + case Import(module, path) => + Import(module, path :+ jsName) + case ImportWithGlobalFallback(Import(module, modulePath), Global(globalRef, globalPath)) => + ImportWithGlobalFallback( + Import(module, modulePath :+ jsName), + Global(globalRef, globalPath :+ jsName)) + } + } + } else { + def parsePath(pathName: String): List[String] = + pathName.split('.').toList + + def parseGlobalPath(pathName: String): Global = { + val globalRef :: path = parsePath(pathName): @unchecked + Global(globalRef, path) + } + + val annot = sym.annotations.find { annot => + annot.symbol == jsdefn.JSGlobalAnnot || annot.symbol == jsdefn.JSImportAnnot + }.getOrElse { + unexpected("could not find the JS native load spec annotation") + } + + if (annot.symbol == jsdefn.JSGlobalAnnot) { + val pathName = annot.argumentConstantString(0).getOrElse { + sym.defaultJSName + } + parseGlobalPath(pathName) + } else { // annot.symbol == jsdefn.JSImportAnnot + val module = annot.argumentConstantString(0).getOrElse { + unexpected("could not read the module argument as a string literal") + } + val path = annot.argumentConstantString(1).fold { + if (annot.arguments.sizeIs < 2) + parsePath(sym.defaultJSName) + else + Nil + } { pathName => + parsePath(pathName) + } + val importSpec = Import(module, path) + annot.argumentConstantString(2).fold[js.JSNativeLoadSpec] { + importSpec + } { globalPathName => + ImportWithGlobalFallback(importSpec, parseGlobalPath(globalPathName)) + } + } + } + } + + private def isMethodStaticInIR(sym: Symbol): Boolean = + sym.is(JavaStatic) + + /** Generate a Class[_] value (e.g. coming from classOf[T]) */ + private def genClassConstant(tpe: Type)(implicit pos: Position): js.Tree = + js.ClassOf(toTypeRef(tpe)) + + private def isStaticModule(sym: Symbol): Boolean = + sym.is(Module) && sym.isStatic + + private def isPrimitiveValueType(tpe: Type): Boolean = { + tpe.widenDealias match { + case JavaArrayType(_) => false + case _: ErasedValueType => false + case t => t.typeSymbol.asClass.isPrimitiveValueClass + } + } + + protected lazy val isHijackedClass: Set[Symbol] = { + /* This list is a duplicate of ir.Definitions.HijackedClasses, but + * with global.Symbol's instead of IR encoded names as Strings. + * We also add java.lang.Void, which BoxedUnit "erases" to. 
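  /* For illustration, the three annotation shapes from which a JS native load
   * spec is computed above. The module name "some-module" and the class names
   * are arbitrary examples.
   * {{{
   *   import scala.scalajs.js
   *   import scala.scalajs.js.annotation._
   *
   *   @js.native @JSGlobal("foo.bar.Baz")          // Global("foo", List("bar", "Baz"))
   *   class Baz extends js.Object
   *
   *   @js.native @JSImport("some-module", "Qux")   // Import("some-module", List("Qux"))
   *   class Qux extends js.Object
   *
   *   @js.native @JSImport("some-module", "Quux", globalFallback = "Quux")
   *   class Quux extends js.Object                 // ImportWithGlobalFallback(...)
   * }}}
   */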
+ */ + Set[Symbol]( + defn.BoxedUnitClass, defn.BoxedBooleanClass, defn.BoxedCharClass, defn.BoxedByteClass, + defn.BoxedShortClass, defn.BoxedIntClass, defn.BoxedLongClass, defn.BoxedFloatClass, + defn.BoxedDoubleClass, defn.StringClass, jsdefn.JavaLangVoidClass + ) + } + + private def isMaybeJavaScriptException(tpe: Type): Boolean = + jsdefn.JavaScriptExceptionClass.isSubClass(tpe.typeSymbol) + + private def hasDefaultCtorArgsAndJSModule(classSym: Symbol): Boolean = { + def hasNativeCompanion = + classSym.companionModule.moduleClass.hasAnnotation(jsdefn.JSNativeAnnot) + def hasDefaultParameters = + classSym.info.decls.exists(sym => sym.isClassConstructor && sym.hasDefaultParams) + + hasNativeCompanion && hasDefaultParameters + } + + // Copied from DottyBackendInterface + + private val desugared = new java.util.IdentityHashMap[Type, tpd.Select] + + def desugarIdent(i: Ident): Option[tpd.Select] = { + var found = desugared.get(i.tpe) + if (found == null) { + tpd.desugarIdent(i) match { + case sel: tpd.Select => + desugared.put(i.tpe, sel) + found = sel + case _ => + } + } + if (found == null) None else Some(found) + } +} + +object JSCodeGen { + + private val NullPointerExceptionClass = ClassName("java.lang.NullPointerException") + private val JSObjectClassName = ClassName("scala.scalajs.js.Object") + private val JavaScriptExceptionClassName = ClassName("scala.scalajs.js.JavaScriptException") + + private val ObjectClassRef = jstpe.ClassRef(ir.Names.ObjectClass) + + private val newSimpleMethodName = SimpleMethodName("new") + + private val selectedValueMethodName = MethodName("selectedValue", Nil, ObjectClassRef) + + private val ObjectArgConstructorName = MethodName.constructor(List(ObjectClassRef)) + + private val thisOriginalName = OriginalName("this") + + sealed abstract class MaybeGlobalScope + + object MaybeGlobalScope { + final case class NotGlobalScope(tree: js.Tree) extends MaybeGlobalScope + + final case class GlobalScope(pos: SourcePosition) extends MaybeGlobalScope + } + + /** Marker object for undefined parameters in JavaScript semantic calls. + * + * To be used inside a `js.Transient` node. + */ + case object UndefinedParam extends js.Transient.Value { + val tpe: jstpe.Type = jstpe.UndefType + + def traverse(traverser: ir.Traversers.Traverser): Unit = () + + def transform(transformer: ir.Transformers.Transformer, isStat: Boolean)( + implicit pos: ir.Position): js.Tree = { + js.Transient(this) + } + + def printIR(out: ir.Printers.IRTreePrinter): Unit = + out.print("") + } + + /** Info about a default param accessor. + * + * The method must have a default getter name for this class to make sense. + */ + private class DefaultParamInfo(sym: Symbol)(using Context) { + private val methodName = sym.name.exclude(DefaultGetterName) + + def isForConstructor: Boolean = methodName == nme.CONSTRUCTOR + + /** When `isForConstructor` is true, returns the owner of the attached + * constructor. + */ + def constructorOwner: Symbol = sym.owner.linkedClass + + /** When `isForConstructor` is false, returns the method attached to the + * specified default accessor. + */ + def attachedMethod: Symbol = { + // If there are overloads, we need to find the one that has default params. 
+ val overloads = sym.owner.info.decl(methodName) + if (!overloads.isOverloaded) + overloads.symbol + else + overloads.suchThat(_.is(HasDefaultParams, butNot = Bridge)).symbol + } + } + +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSDefinitions.scala b/tests/pos-with-compiler-cc/backend/sjs/JSDefinitions.scala new file mode 100644 index 000000000000..964811c69e19 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSDefinitions.scala @@ -0,0 +1,340 @@ +package dotty.tools.backend.sjs + +import scala.language.unsafeNulls + +import scala.annotation.threadUnsafe + +import dotty.tools.dotc.core._ +import Names._ +import Types._ +import Contexts._ +import Symbols._ +import StdNames._ + +import dotty.tools.dotc.config.SJSPlatform + +object JSDefinitions { + /** The Scala.js-specific definitions for the current context. */ + def jsdefn(using Context): JSDefinitions = + ctx.platform.asInstanceOf[SJSPlatform].jsDefinitions +} + +final class JSDefinitions()(using DetachedContext) { + + @threadUnsafe lazy val InlineAnnotType: TypeRef = requiredClassRef("scala.inline") + def InlineAnnot(using Context) = InlineAnnotType.symbol.asClass + @threadUnsafe lazy val NoinlineAnnotType: TypeRef = requiredClassRef("scala.noinline") + def NoinlineAnnot(using Context) = NoinlineAnnotType.symbol.asClass + + @threadUnsafe lazy val JavaLangVoidType: TypeRef = requiredClassRef("java.lang.Void") + def JavaLangVoidClass(using Context) = JavaLangVoidType.symbol.asClass + + @threadUnsafe lazy val ScalaJSJSPackageVal = requiredPackage("scala.scalajs.js") + @threadUnsafe lazy val ScalaJSJSPackageClass = ScalaJSJSPackageVal.moduleClass.asClass + @threadUnsafe lazy val JSPackage_typeOfR = ScalaJSJSPackageClass.requiredMethodRef("typeOf") + def JSPackage_typeOf(using Context) = JSPackage_typeOfR.symbol + @threadUnsafe lazy val JSPackage_constructorOfR = ScalaJSJSPackageClass.requiredMethodRef("constructorOf") + def JSPackage_constructorOf(using Context) = JSPackage_constructorOfR.symbol + @threadUnsafe lazy val JSPackage_nativeR = ScalaJSJSPackageClass.requiredMethodRef("native") + def JSPackage_native(using Context) = JSPackage_nativeR.symbol + @threadUnsafe lazy val JSPackage_undefinedR = ScalaJSJSPackageClass.requiredMethodRef("undefined") + def JSPackage_undefined(using Context) = JSPackage_undefinedR.symbol + @threadUnsafe lazy val JSPackage_dynamicImportR = ScalaJSJSPackageClass.requiredMethodRef("dynamicImport") + def JSPackage_dynamicImport(using Context) = JSPackage_dynamicImportR.symbol + + @threadUnsafe lazy val JSNativeAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.native") + def JSNativeAnnot(using Context) = JSNativeAnnotType.symbol.asClass + + @threadUnsafe lazy val JSAnyType: TypeRef = requiredClassRef("scala.scalajs.js.Any") + def JSAnyClass(using Context) = JSAnyType.symbol.asClass + @threadUnsafe lazy val JSObjectType: TypeRef = requiredClassRef("scala.scalajs.js.Object") + def JSObjectClass(using Context) = JSObjectType.symbol.asClass + @threadUnsafe lazy val JSFunctionType: TypeRef = requiredClassRef("scala.scalajs.js.Function") + def JSFunctionClass(using Context) = JSFunctionType.symbol.asClass + @threadUnsafe lazy val JSThisFunctionType: TypeRef = requiredClassRef("scala.scalajs.js.ThisFunction") + def JSThisFunctionClass(using Context) = JSThisFunctionType.symbol.asClass + + @threadUnsafe lazy val PseudoUnionType: TypeRef = requiredClassRef("scala.scalajs.js.|") + def PseudoUnionClass(using Context) = PseudoUnionType.symbol.asClass + + @threadUnsafe lazy val 
PseudoUnionModuleRef = requiredModuleRef("scala.scalajs.js.|") + def PseudoUnionModule(using Context) = PseudoUnionModuleRef.symbol + @threadUnsafe lazy val PseudoUnion_fromR = PseudoUnionModule.requiredMethodRef("from") + def PseudoUnion_from(using Context) = PseudoUnion_fromR.symbol + @threadUnsafe lazy val PseudoUnion_fromTypeConstructorR = PseudoUnionModule.requiredMethodRef("fromTypeConstructor") + def PseudoUnion_fromTypeConstructor(using Context) = PseudoUnion_fromTypeConstructorR.symbol + + @threadUnsafe lazy val UnionOpsModuleRef = requiredModuleRef("scala.scalajs.js.internal.UnitOps") + + @threadUnsafe lazy val JSArrayType: TypeRef = requiredClassRef("scala.scalajs.js.Array") + def JSArrayClass(using Context) = JSArrayType.symbol.asClass + @threadUnsafe lazy val JSDynamicType: TypeRef = requiredClassRef("scala.scalajs.js.Dynamic") + def JSDynamicClass(using Context) = JSDynamicType.symbol.asClass + + @threadUnsafe lazy val RuntimeExceptionType: TypeRef = requiredClassRef("java.lang.RuntimeException") + def RuntimeExceptionClass(using Context) = RuntimeExceptionType.symbol.asClass + @threadUnsafe lazy val JavaScriptExceptionType: TypeRef = requiredClassRef("scala.scalajs.js.JavaScriptException") + def JavaScriptExceptionClass(using Context) = JavaScriptExceptionType.symbol.asClass + + @threadUnsafe lazy val JSGlobalAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSGlobal") + def JSGlobalAnnot(using Context) = JSGlobalAnnotType.symbol.asClass + @threadUnsafe lazy val JSImportAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSImport") + def JSImportAnnot(using Context) = JSImportAnnotType.symbol.asClass + @threadUnsafe lazy val JSGlobalScopeAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSGlobalScope") + def JSGlobalScopeAnnot(using Context) = JSGlobalScopeAnnotType.symbol.asClass + @threadUnsafe lazy val JSNameAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSName") + def JSNameAnnot(using Context) = JSNameAnnotType.symbol.asClass + @threadUnsafe lazy val JSFullNameAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSFullName") + def JSFullNameAnnot(using Context) = JSFullNameAnnotType.symbol.asClass + @threadUnsafe lazy val JSBracketAccessAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSBracketAccess") + def JSBracketAccessAnnot(using Context) = JSBracketAccessAnnotType.symbol.asClass + @threadUnsafe lazy val JSBracketCallAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSBracketCall") + def JSBracketCallAnnot(using Context) = JSBracketCallAnnotType.symbol.asClass + @threadUnsafe lazy val JSExportTopLevelAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSExportTopLevel") + def JSExportTopLevelAnnot(using Context) = JSExportTopLevelAnnotType.symbol.asClass + @threadUnsafe lazy val JSExportAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSExport") + def JSExportAnnot(using Context) = JSExportAnnotType.symbol.asClass + @threadUnsafe lazy val JSExportStaticAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSExportStatic") + def JSExportStaticAnnot(using Context) = JSExportStaticAnnotType.symbol.asClass + @threadUnsafe lazy val JSExportAllAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.JSExportAll") + def JSExportAllAnnot(using Context) = JSExportAllAnnotType.symbol.asClass + + def JSAnnotPackage(using Context) = JSGlobalAnnot.owner.asClass + + @threadUnsafe lazy val JSTypeAnnotType: 
TypeRef = requiredClassRef("scala.scalajs.js.annotation.internal.JSType") + def JSTypeAnnot(using Context) = JSTypeAnnotType.symbol.asClass + @threadUnsafe lazy val JSOptionalAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.internal.JSOptional") + def JSOptionalAnnot(using Context) = JSOptionalAnnotType.symbol.asClass + @threadUnsafe lazy val ExposedJSMemberAnnotType: TypeRef = requiredClassRef("scala.scalajs.js.annotation.internal.ExposedJSMember") + def ExposedJSMemberAnnot(using Context) = ExposedJSMemberAnnotType.symbol.asClass + + @threadUnsafe lazy val JSImportNamespaceModuleRef = requiredModuleRef("scala.scalajs.js.annotation.JSImport.Namespace") + def JSImportNamespaceModule(using Context) = JSImportNamespaceModuleRef.symbol + + @threadUnsafe lazy val JSAnyModuleRef = requiredModuleRef("scala.scalajs.js.Any") + def JSAnyModule(using Context) = JSAnyModuleRef.symbol + @threadUnsafe lazy val JSAny_fromFunctionR = (0 to 22).map(n => JSAnyModule.requiredMethodRef("fromFunction" + n)).toArray + def JSAny_fromFunction(n: Int)(using Context) = JSAny_fromFunctionR(n).symbol + + @threadUnsafe lazy val JSDynamicModuleRef = requiredModuleRef("scala.scalajs.js.Dynamic") + def JSDynamicModule(using Context) = JSDynamicModuleRef.symbol + @threadUnsafe lazy val JSDynamic_globalR = JSDynamicModule.requiredMethodRef("global") + def JSDynamic_global(using Context) = JSDynamic_globalR.symbol + @threadUnsafe lazy val JSDynamic_newInstanceR = JSDynamicModule.requiredMethodRef("newInstance") + def JSDynamic_newInstance(using Context) = JSDynamic_newInstanceR.symbol + + @threadUnsafe lazy val JSDynamicLiteralModuleRef = JSDynamicModule.moduleClass.requiredValueRef("literal") + def JSDynamicLiteralModule(using Context) = JSDynamicLiteralModuleRef.symbol + @threadUnsafe lazy val JSDynamicLiteral_applyDynamicNamedR = JSDynamicLiteralModule.requiredMethodRef("applyDynamicNamed") + def JSDynamicLiteral_applyDynamicNamed(using Context) = JSDynamicLiteral_applyDynamicNamedR.symbol + @threadUnsafe lazy val JSDynamicLiteral_applyDynamicR = JSDynamicLiteralModule.requiredMethodRef("applyDynamic") + def JSDynamicLiteral_applyDynamic(using Context) = JSDynamicLiteral_applyDynamicR.symbol + + @threadUnsafe lazy val JSObjectModuleRef = requiredModuleRef("scala.scalajs.js.Object") + def JSObjectModule(using Context) = JSObjectModuleRef.symbol + + @threadUnsafe lazy val JSArrayModuleRef = requiredModuleRef("scala.scalajs.js.Array") + def JSArrayModule(using Context) = JSArrayModuleRef.symbol + @threadUnsafe lazy val JSArray_applyR = JSArrayModule.requiredMethodRef(nme.apply) + def JSArray_apply(using Context) = JSArray_applyR.symbol + + @threadUnsafe lazy val JSThisFunctionModuleRef = requiredModuleRef("scala.scalajs.js.ThisFunction") + def JSThisFunctionModule(using Context) = JSThisFunctionModuleRef.symbol + @threadUnsafe lazy val JSThisFunction_fromFunctionR = (1 to 22).map(n => JSThisFunctionModule.requiredMethodRef("fromFunction" + n)).toArray + def JSThisFunction_fromFunction(n: Int)(using Context) = JSThisFunction_fromFunctionR(n - 1).symbol + + @threadUnsafe lazy val JSConstructorTagModuleRef = requiredModuleRef("scala.scalajs.js.ConstructorTag") + def JSConstructorTagModule(using Context) = JSConstructorTagModuleRef.symbol + @threadUnsafe lazy val JSConstructorTag_materializeR = JSConstructorTagModule.requiredMethodRef("materialize") + def JSConstructorTag_materialize(using Context) = JSConstructorTag_materializeR.symbol + + @threadUnsafe lazy val JSNewModuleRef = 
requiredModuleRef("scala.scalajs.js.new") + def JSNewModule(using Context) = JSNewModuleRef.symbol + @threadUnsafe lazy val JSNew_targetR = JSNewModule.requiredMethodRef("target") + def JSNew_target(using Context) = JSNew_targetR.symbol + + @threadUnsafe lazy val JSImportModuleRef = requiredModuleRef("scala.scalajs.js.import") + def JSImportModule(using Context) = JSImportModuleRef.symbol + @threadUnsafe lazy val JSImport_applyR = JSImportModule.requiredMethodRef(nme.apply) + def JSImport_apply(using Context) = JSImport_applyR.symbol + @threadUnsafe lazy val JSImport_metaR = JSImportModule.requiredMethodRef("meta") + def JSImport_meta(using Context) = JSImport_metaR.symbol + + @threadUnsafe lazy val RuntimePackageVal = requiredPackage("scala.scalajs.runtime") + @threadUnsafe lazy val RuntimePackageClass = RuntimePackageVal.moduleClass.asClass + @threadUnsafe lazy val Runtime_toScalaVarArgsR = RuntimePackageClass.requiredMethodRef("toScalaVarArgs") + def Runtime_toScalaVarArgs(using Context) = Runtime_toScalaVarArgsR.symbol + @threadUnsafe lazy val Runtime_toJSVarArgsR = RuntimePackageClass.requiredMethodRef("toJSVarArgs") + def Runtime_toJSVarArgs(using Context) = Runtime_toJSVarArgsR.symbol + @threadUnsafe lazy val Runtime_privateFieldsSymbolR = RuntimePackageClass.requiredMethodRef("privateFieldsSymbol") + def Runtime_privateFieldsSymbol(using Context) = Runtime_privateFieldsSymbolR.symbol + @threadUnsafe lazy val Runtime_constructorOfR = RuntimePackageClass.requiredMethodRef("constructorOf") + def Runtime_constructorOf(using Context) = Runtime_constructorOfR.symbol + @threadUnsafe lazy val Runtime_newConstructorTagR = RuntimePackageClass.requiredMethodRef("newConstructorTag") + def Runtime_newConstructorTag(using Context) = Runtime_newConstructorTagR.symbol + @threadUnsafe lazy val Runtime_createInnerJSClassR = RuntimePackageClass.requiredMethodRef("createInnerJSClass") + def Runtime_createInnerJSClass(using Context) = Runtime_createInnerJSClassR.symbol + @threadUnsafe lazy val Runtime_createLocalJSClassR = RuntimePackageClass.requiredMethodRef("createLocalJSClass") + def Runtime_createLocalJSClass(using Context) = Runtime_createLocalJSClassR.symbol + @threadUnsafe lazy val Runtime_withContextualJSClassValueR = RuntimePackageClass.requiredMethodRef("withContextualJSClassValue") + def Runtime_withContextualJSClassValue(using Context) = Runtime_withContextualJSClassValueR.symbol + @threadUnsafe lazy val Runtime_linkingInfoR = RuntimePackageClass.requiredMethodRef("linkingInfo") + def Runtime_linkingInfo(using Context) = Runtime_linkingInfoR.symbol + @threadUnsafe lazy val Runtime_dynamicImportR = RuntimePackageClass.requiredMethodRef("dynamicImport") + def Runtime_dynamicImport(using Context) = Runtime_dynamicImportR.symbol + + @threadUnsafe lazy val DynamicImportThunkType: TypeRef = requiredClassRef("scala.scalajs.runtime.DynamicImportThunk") + def DynamicImportThunkClass(using Context) = DynamicImportThunkType.symbol.asClass + @threadUnsafe lazy val DynamicImportThunkClass_applyR = DynamicImportThunkClass.requiredMethodRef(nme.apply) + def DynamicImportThunkClass_apply(using Context) = DynamicImportThunkClass_applyR.symbol + + @threadUnsafe lazy val SpecialPackageVal = requiredPackage("scala.scalajs.js.special") + @threadUnsafe lazy val SpecialPackageClass = SpecialPackageVal.moduleClass.asClass + @threadUnsafe lazy val Special_debuggerR = SpecialPackageClass.requiredMethodRef("debugger") + def Special_debugger(using Context) = Special_debuggerR.symbol + @threadUnsafe lazy val 
Special_deleteR = SpecialPackageClass.requiredMethodRef("delete") + def Special_delete(using Context) = Special_deleteR.symbol + @threadUnsafe lazy val Special_forinR = SpecialPackageClass.requiredMethodRef("forin") + def Special_forin(using Context) = Special_forinR.symbol + @threadUnsafe lazy val Special_inR = SpecialPackageClass.requiredMethodRef("in") + def Special_in(using Context) = Special_inR.symbol + @threadUnsafe lazy val Special_instanceofR = SpecialPackageClass.requiredMethodRef("instanceof") + def Special_instanceof(using Context) = Special_instanceofR.symbol + @threadUnsafe lazy val Special_strictEqualsR = SpecialPackageClass.requiredMethodRef("strictEquals") + def Special_strictEquals(using Context) = Special_strictEqualsR.symbol + @threadUnsafe lazy val Special_throwR = SpecialPackageClass.requiredMethodRef("throw") + def Special_throw(using Context) = Special_throwR.symbol + @threadUnsafe lazy val Special_tryCatchR = SpecialPackageClass.requiredMethodRef("tryCatch") + def Special_tryCatch(using Context) = Special_tryCatchR.symbol + @threadUnsafe lazy val Special_wrapAsThrowableR = SpecialPackageClass.requiredMethodRef("wrapAsThrowable") + def Special_wrapAsThrowable(using Context) = Special_wrapAsThrowableR.symbol + @threadUnsafe lazy val Special_unwrapFromThrowableR = SpecialPackageClass.requiredMethodRef("unwrapFromThrowable") + def Special_unwrapFromThrowable(using Context) = Special_unwrapFromThrowableR.symbol + + @threadUnsafe lazy val WrappedArrayType: TypeRef = requiredClassRef("scala.scalajs.js.WrappedArray") + def WrappedArrayClass(using Context) = WrappedArrayType.symbol.asClass + + @threadUnsafe lazy val ScalaRunTime_isArrayR = defn.ScalaRuntimeModule.requiredMethodRef("isArray", List(???, ???)) + def ScalaRunTime_isArray(using Context): Symbol = ScalaRunTime_isArrayR.symbol + + @threadUnsafe lazy val BoxesRunTime_boxToCharacterR = defn.BoxesRunTimeModule.requiredMethodRef("boxToCharacter") + def BoxesRunTime_boxToCharacter(using Context): Symbol = BoxesRunTime_boxToCharacterR.symbol + @threadUnsafe lazy val BoxesRunTime_unboxToCharR = defn.BoxesRunTimeModule.requiredMethodRef("unboxToChar") + def BoxesRunTime_unboxToChar(using Context): Symbol = BoxesRunTime_unboxToCharR.symbol + + @threadUnsafe lazy val EnableReflectiveInstantiationAnnotType: TypeRef = requiredClassRef("scala.scalajs.reflect.annotation.EnableReflectiveInstantiation") + def EnableReflectiveInstantiationAnnot(using Context) = EnableReflectiveInstantiationAnnotType.symbol.asClass + + @threadUnsafe lazy val ReflectModuleRef = requiredModuleRef("scala.scalajs.reflect.Reflect") + def ReflectModule(using Context) = ReflectModuleRef.symbol + @threadUnsafe lazy val Reflect_registerLoadableModuleClassR = ReflectModule.requiredMethodRef("registerLoadableModuleClass") + def Reflect_registerLoadableModuleClass(using Context) = Reflect_registerLoadableModuleClassR.symbol + @threadUnsafe lazy val Reflect_registerInstantiatableClassR = ReflectModule.requiredMethodRef("registerInstantiatableClass") + def Reflect_registerInstantiatableClass(using Context) = Reflect_registerInstantiatableClassR.symbol + + @threadUnsafe lazy val ReflectSelectableType: TypeRef = requiredClassRef("scala.reflect.Selectable") + def ReflectSelectableClass(using Context) = ReflectSelectableType.symbol.asClass + @threadUnsafe lazy val ReflectSelectable_selectDynamicR = ReflectSelectableClass.requiredMethodRef("selectDynamic") + def ReflectSelectable_selectDynamic(using Context) = ReflectSelectable_selectDynamicR.symbol + @threadUnsafe 
lazy val ReflectSelectable_applyDynamicR = ReflectSelectableClass.requiredMethodRef("applyDynamic") + def ReflectSelectable_applyDynamic(using Context) = ReflectSelectable_applyDynamicR.symbol + + @threadUnsafe lazy val ReflectSelectableModuleRef = requiredModuleRef("scala.reflect.Selectable") + def ReflectSelectableModule(using Context) = ReflectSelectableModuleRef.symbol + @threadUnsafe lazy val ReflectSelectable_reflectiveSelectableR = ReflectSelectableModule.requiredMethodRef("reflectiveSelectable") + def ReflectSelectable_reflectiveSelectable(using Context) = ReflectSelectable_reflectiveSelectableR.symbol + + @threadUnsafe lazy val SelectableModuleRef = requiredModuleRef("scala.Selectable") + def SelectableModule(using Context) = SelectableModuleRef.symbol + @threadUnsafe lazy val Selectable_reflectiveSelectableFromLangReflectiveCallsR = SelectableModule.requiredMethodRef("reflectiveSelectableFromLangReflectiveCalls") + def Selectable_reflectiveSelectableFromLangReflectiveCalls(using Context) = Selectable_reflectiveSelectableFromLangReflectiveCallsR.symbol + + private var allRefClassesCache: Set[Symbol] = _ + def allRefClasses(using Context): Set[Symbol] = { + if (allRefClassesCache == null) { + val baseNames = List("Object", "Boolean", "Character", "Byte", "Short", + "Int", "Long", "Float", "Double") + val fullNames = baseNames.flatMap { base => + List(s"scala.runtime.${base}Ref", s"scala.runtime.Volatile${base}Ref") + } + allRefClassesCache = fullNames.map(name => requiredClass(name)).toSet + } + allRefClassesCache + } + + /** Definitions related to scala.Enumeration. */ + object scalaEnumeration { + val nmeValue = termName("Value") + val nmeVal = termName("Val") + val hasNext = termName("hasNext") + val next = termName("next") + + @threadUnsafe lazy val EnumerationClass = requiredClass("scala.Enumeration") + @threadUnsafe lazy val Enumeration_Value_NoArg = EnumerationClass.requiredValue(nmeValue) + @threadUnsafe lazy val Enumeration_Value_IntArg = EnumerationClass.requiredMethod(nmeValue, List(defn.IntType)) + @threadUnsafe lazy val Enumeration_Value_StringArg = EnumerationClass.requiredMethod(nmeValue, List(defn.StringType)) + @threadUnsafe lazy val Enumeration_Value_IntStringArg = EnumerationClass.requiredMethod(nmeValue, List(defn.IntType, defn.StringType)) + @threadUnsafe lazy val Enumeration_nextName = EnumerationClass.requiredMethod(termName("nextName")) + + @threadUnsafe lazy val EnumerationValClass = EnumerationClass.requiredClass("Val") + @threadUnsafe lazy val Enumeration_Val_NoArg = EnumerationValClass.requiredMethod(nme.CONSTRUCTOR, Nil) + @threadUnsafe lazy val Enumeration_Val_IntArg = EnumerationValClass.requiredMethod(nme.CONSTRUCTOR, List(defn.IntType)) + @threadUnsafe lazy val Enumeration_Val_StringArg = EnumerationValClass.requiredMethod(nme.CONSTRUCTOR, List(defn.StringType)) + @threadUnsafe lazy val Enumeration_Val_IntStringArg = EnumerationValClass.requiredMethod(nme.CONSTRUCTOR, List(defn.IntType, defn.StringType)) + + def isValueMethod(sym: Symbol)(using Context): Boolean = + sym.name == nmeValue && sym.owner == EnumerationClass + + def isValueMethodNoName(sym: Symbol)(using Context): Boolean = + isValueMethod(sym) && (sym == Enumeration_Value_NoArg || sym == Enumeration_Value_IntArg) + + def isValueMethodName(sym: Symbol)(using Context): Boolean = + isValueMethod(sym) && (sym == Enumeration_Value_StringArg || sym == Enumeration_Value_IntStringArg) + + def isValCtor(sym: Symbol)(using Context): Boolean = + sym.isClassConstructor && sym.owner == 
EnumerationValClass + + def isValCtorNoName(sym: Symbol)(using Context): Boolean = + isValCtor(sym) && (sym == Enumeration_Val_NoArg || sym == Enumeration_Val_IntArg) + + def isValCtorName(sym: Symbol)(using Context): Boolean = + isValCtor(sym) && (sym == Enumeration_Val_StringArg || sym == Enumeration_Val_IntStringArg) + } + + /** Definitions related to the treatment of JUnit bootstrappers. */ + object junit { + @threadUnsafe lazy val TestAnnotType: TypeRef = requiredClassRef("org.junit.Test") + def TestAnnotClass(using Context): ClassSymbol = TestAnnotType.symbol.asClass + + @threadUnsafe lazy val BeforeAnnotType: TypeRef = requiredClassRef("org.junit.Before") + def BeforeAnnotClass(using Context): ClassSymbol = BeforeAnnotType.symbol.asClass + + @threadUnsafe lazy val AfterAnnotType: TypeRef = requiredClassRef("org.junit.After") + def AfterAnnotClass(using Context): ClassSymbol = AfterAnnotType.symbol.asClass + + @threadUnsafe lazy val BeforeClassAnnotType: TypeRef = requiredClassRef("org.junit.BeforeClass") + def BeforeClassAnnotClass(using Context): ClassSymbol = BeforeClassAnnotType.symbol.asClass + + @threadUnsafe lazy val AfterClassAnnotType: TypeRef = requiredClassRef("org.junit.AfterClass") + def AfterClassAnnotClass(using Context): ClassSymbol = AfterClassAnnotType.symbol.asClass + + @threadUnsafe lazy val IgnoreAnnotType: TypeRef = requiredClassRef("org.junit.Ignore") + def IgnoreAnnotClass(using Context): ClassSymbol = IgnoreAnnotType.symbol.asClass + + @threadUnsafe lazy val BootstrapperType: TypeRef = requiredClassRef("org.scalajs.junit.Bootstrapper") + + @threadUnsafe lazy val TestMetadataType: TypeRef = requiredClassRef("org.scalajs.junit.TestMetadata") + + @threadUnsafe lazy val NoSuchMethodExceptionType: TypeRef = requiredClassRef("java.lang.NoSuchMethodException") + + @threadUnsafe lazy val FutureType: TypeRef = requiredClassRef("scala.concurrent.Future") + def FutureClass(using Context): ClassSymbol = FutureType.symbol.asClass + + @threadUnsafe private lazy val FutureModule_successfulR = requiredModule("scala.concurrent.Future").requiredMethodRef("successful") + def FutureModule_successful(using Context): Symbol = FutureModule_successfulR.symbol + + @threadUnsafe private lazy val SuccessModule_applyR = requiredModule("scala.util.Success").requiredMethodRef(nme.apply) + def SuccessModule_apply(using Context): Symbol = SuccessModule_applyR.symbol + } + +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSEncoding.scala b/tests/pos-with-compiler-cc/backend/sjs/JSEncoding.scala new file mode 100644 index 000000000000..73a150c60290 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSEncoding.scala @@ -0,0 +1,428 @@ +package dotty.tools.backend.sjs + +import scala.language.unsafeNulls + +import scala.collection.mutable + +import dotty.tools.dotc.core._ +import Contexts._ +import Flags._ +import Types._ +import Symbols._ +import NameOps._ +import Names._ +import StdNames._ + +import dotty.tools.dotc.transform.sjs.JSSymUtils._ + +import org.scalajs.ir +import org.scalajs.ir.{Trees => js, Types => jstpe} +import org.scalajs.ir.Names.{LocalName, LabelName, FieldName, SimpleMethodName, MethodName, ClassName} +import org.scalajs.ir.OriginalName +import org.scalajs.ir.OriginalName.NoOriginalName +import org.scalajs.ir.UTF8String + +import dotty.tools.backend.jvm.DottyBackendInterface.symExtensions + +import JSDefinitions.jsdefn + +/** Encoding of symbol names for JavaScript + * + * Some issues that this encoding solves: + * * Overloading: encode the full signature in 
the JS name + * * Same scope for fields and methods of a class + * * Global access to classes and modules (by their full name) + * + * @author Sébastien Doeraene + */ +object JSEncoding { + + /** Name of the capture param storing the JS super class. + * + * This is used by the dispatchers of exposed JS methods and properties of + * nested JS classes when they need to perform a super call. Other super + * calls (in the actual bodies of the methods, not in the dispatchers) do + * not use this value, since they are implemented as static methods that do + * not have access to it. Instead, they get the JS super class value through + * the magic method inserted by `ExplicitLocalJS`, leveraging `lambdalift` + * to ensure that it is properly captured. + * + * Using this identifier is only allowed if it was reserved in the current + * local name scope using [[reserveLocalName]]. Otherwise, this name can + * clash with another local identifier. + */ + final val JSSuperClassParamName = LocalName("superClass$") + + private val ScalaRuntimeNothingClassName = ClassName("scala.runtime.Nothing$") + private val ScalaRuntimeNullClassName = ClassName("scala.runtime.Null$") + + private val dynamicImportForwarderSimpleName = SimpleMethodName("dynamicImport$") + + // Fresh local name generator ---------------------------------------------- + + class LocalNameGenerator { + import LocalNameGenerator._ + + private val usedLocalNames = mutable.Set.empty[LocalName] + private val localSymbolNames = mutable.Map.empty[Symbol, LocalName] + private val usedLabelNames = mutable.Set.empty[LabelName] + private val labelSymbolNames = mutable.Map.empty[Symbol, LabelName] + private var returnLabelName: Option[LabelName] = None + + def reserveLocalName(name: LocalName): Unit = { + require(usedLocalNames.isEmpty, + s"Trying to reserve the name '$name' but names have already been allocated") + usedLocalNames += name + } + + private def freshNameGeneric[N <: ir.Names.Name](base: N, usedNamesSet: mutable.Set[N])( + withSuffix: (N, String) => N): N = { + + var suffix = 1 + var result = base + while (usedNamesSet(result)) { + suffix += 1 + result = withSuffix(base, "$" + suffix) + } + usedNamesSet += result + result + } + + def freshName(base: LocalName): LocalName = + freshNameGeneric(base, usedLocalNames)(_.withSuffix(_)) + + def freshName(base: String): LocalName = + freshName(LocalName(base)) + + def freshLocalIdent()(implicit pos: ir.Position): js.LocalIdent = + js.LocalIdent(freshName(xLocalName)) + + def freshLocalIdent(base: LocalName)(implicit pos: ir.Position): js.LocalIdent = + js.LocalIdent(freshName(base)) + + def freshLocalIdent(base: String)(implicit pos: ir.Position): js.LocalIdent = + freshLocalIdent(LocalName(base)) + + def freshLocalIdent(base: TermName)(implicit pos: ir.Position): js.LocalIdent = + freshLocalIdent(base.mangledString) + + def localSymbolName(sym: Symbol)(using Context): LocalName = { + localSymbolNames.getOrElseUpdate(sym, { + /* The emitter does not like local variables that start with a '$', + * because it needs to encode them not to clash with emitter-generated + * names. There are two common cases, caused by scalac-generated names: + * - the `$this` parameter of tailrec methods and "extension" methods of + * AnyVals, which scalac knows as `nme.SELF`, and + * - the `$outer` parameter of inner class constructors, which scalac + * knows as `nme.OUTER`. 
+ * We choose different base names for those two cases instead, so that + * the avoidance mechanism of the emitter doesn't happen as a common + * case. It can still happen for user-defined variables, but in that case + * the emitter will deal with it. + */ + val base = sym.name match { + case nme.SELF => "this$" // instead of $this + case nme.OUTER => "outer" // instead of $outer + case name => name.mangledString + } + freshName(base) + }) + } + + def freshLabelName(base: LabelName): LabelName = + freshNameGeneric(base, usedLabelNames)(_.withSuffix(_)) + + def freshLabelName(base: String): LabelName = + freshLabelName(LabelName(base)) + + def freshLabelIdent(base: String)(implicit pos: ir.Position): js.LabelIdent = + js.LabelIdent(freshLabelName(base)) + + def labelSymbolName(sym: Symbol)(using Context): LabelName = + labelSymbolNames.getOrElseUpdate(sym, freshLabelName(sym.javaSimpleName)) + + def getEnclosingReturnLabel()(implicit pos: ir.Position): js.LabelIdent = { + if (returnLabelName.isEmpty) + returnLabelName = Some(freshLabelName("_return")) + js.LabelIdent(returnLabelName.get) + } + + /* If this `LocalNameGenerator` has a `returnLabelName` (often added in the + * construction of the `body` argument), wrap the resulting js.Tree to use that label. + */ + def makeLabeledIfRequiresEnclosingReturn(tpe: jstpe.Type)(body: js.Tree)(implicit pos: ir.Position): js.Tree = { + returnLabelName match { + case None => + body + case Some(labelName) => + js.Labeled(js.LabelIdent(labelName), tpe, body) + } + } + } + + private object LocalNameGenerator { + private val xLocalName = LocalName("x") + } + + // Encoding methods ---------------------------------------------------------- + + def encodeLabelSym(sym: Symbol)( + implicit ctx: Context, pos: ir.Position, localNames: LocalNameGenerator): js.LabelIdent = { + require(sym.is(Flags.Label), "encodeLabelSym called with non-label symbol: " + sym) + js.LabelIdent(localNames.labelSymbolName(sym)) + } + + def encodeFieldSym(sym: Symbol)(implicit ctx: Context, pos: ir.Position): js.FieldIdent = + js.FieldIdent(FieldName(encodeFieldSymAsString(sym))) + + def encodeFieldSymAsStringLiteral(sym: Symbol)(implicit ctx: Context, pos: ir.Position): js.StringLiteral = + js.StringLiteral(encodeFieldSymAsString(sym)) + + private def encodeFieldSymAsString(sym: Symbol)(using Context): String = { + require(sym.owner.isClass && sym.isTerm && !sym.isOneOf(MethodOrModule), + "encodeFieldSym called with non-field symbol: " + sym) + + val name0 = sym.javaSimpleName + if (name0.charAt(name0.length() - 1) != ' ') name0 + else name0.substring(0, name0.length() - 1) + } + + def encodeMethodSym(sym: Symbol, reflProxy: Boolean = false)( + implicit ctx: Context, pos: ir.Position): js.MethodIdent = { + require(sym.is(Flags.Method), "encodeMethodSym called with non-method symbol: " + sym) + + val tpe = sym.info + + val paramTypeRefs0 = tpe.firstParamTypes.map(paramOrResultTypeRef(_)) + + val hasExplicitThisParameter = !sym.is(JavaStatic) && sym.owner.isNonNativeJSClass + val paramTypeRefs = + if (!hasExplicitThisParameter) paramTypeRefs0 + else encodeClassRef(sym.owner) :: paramTypeRefs0 + + val name = sym.name + val simpleName = SimpleMethodName(name.mangledString) + + val methodName = { + if (sym.isClassConstructor) + MethodName.constructor(paramTypeRefs) + else if (reflProxy) + MethodName.reflectiveProxy(simpleName, paramTypeRefs) + else + MethodName(simpleName, paramTypeRefs, paramOrResultTypeRef(patchedResultType(sym))) + } + + js.MethodIdent(methodName) + } + + def 
encodeJSNativeMemberSym(sym: Symbol)(using Context, ir.Position): js.MethodIdent = { + require(sym.hasAnnotation(jsdefn.JSNativeAnnot), + "encodeJSNativeMemberSym called with non-native symbol: " + sym) + if (sym.is(Method)) + encodeMethodSym(sym) + else + encodeFieldSymAsMethod(sym) + } + + def encodeStaticMemberSym(sym: Symbol)(using Context, ir.Position): js.MethodIdent = { + require(sym.is(Flags.JavaStaticTerm), + "encodeStaticMemberSym called with non-static symbol: " + sym) + encodeFieldSymAsMethod(sym) + } + + private def encodeFieldSymAsMethod(sym: Symbol)(using Context, ir.Position): js.MethodIdent = { + val name = sym.name + val resultTypeRef = paramOrResultTypeRef(sym.info) + val methodName = MethodName(name.mangledString, Nil, resultTypeRef) + js.MethodIdent(methodName) + } + + def encodeDynamicImportForwarderIdent(params: List[Symbol])(using Context, ir.Position): js.MethodIdent = { + val paramTypeRefs = params.map(sym => paramOrResultTypeRef(sym.info)) + val resultTypeRef = jstpe.ClassRef(ir.Names.ObjectClass) + val methodName = MethodName(dynamicImportForwarderSimpleName, paramTypeRefs, resultTypeRef) + js.MethodIdent(methodName) + } + + /** Computes the type ref for a type, to be used in a method signature. */ + private def paramOrResultTypeRef(tpe: Type)(using Context): jstpe.TypeRef = + toParamOrResultTypeRef(toTypeRef(tpe)) + + def encodeLocalSym(sym: Symbol)( + implicit ctx: Context, pos: ir.Position, localNames: LocalNameGenerator): js.LocalIdent = { + require(!sym.owner.isClass && sym.isTerm && !sym.is(Flags.Method) && !sym.is(Flags.Module), + "encodeLocalSym called with non-local symbol: " + sym) + js.LocalIdent(localNames.localSymbolName(sym)) + } + + def encodeClassType(sym: Symbol)(using Context): jstpe.Type = { + if (sym == defn.ObjectClass) jstpe.AnyType + else if (sym.isJSType) jstpe.AnyType + else { + assert(sym != defn.ArrayClass, + "encodeClassType() cannot be called with ArrayClass") + jstpe.ClassType(encodeClassName(sym)) + } + } + + def encodeClassRef(sym: Symbol)(using Context): jstpe.ClassRef = + jstpe.ClassRef(encodeClassName(sym)) + + def encodeClassNameIdent(sym: Symbol)( + implicit ctx: Context, pos: ir.Position): js.ClassIdent = + js.ClassIdent(encodeClassName(sym)) + + def encodeClassName(sym: Symbol)(using Context): ClassName = { + val sym1 = + if (sym.isAllOf(ModuleClass | JavaDefined)) sym.linkedClass + else sym + + /* Some rewirings: + * - scala.runtime.BoxedUnit to java.lang.Void, as the IR expects. + * BoxedUnit$ is a JVM artifact. + * - scala.Nothing to scala.runtime.Nothing$. + * - scala.Null to scala.runtime.Null$. + */ + if (sym1 == defn.BoxedUnitClass) + ir.Names.BoxedUnitClass + else if (sym1 == defn.NothingClass) + ScalaRuntimeNothingClassName + else if (sym1 == defn.NullClass) + ScalaRuntimeNullClassName + else + ClassName(sym1.javaClassName) + } + + /** Converts a general TypeRef to a TypeRef to be used in a method signature. 
*/ + def toParamOrResultTypeRef(typeRef: jstpe.TypeRef): jstpe.TypeRef = { + typeRef match { + case jstpe.ClassRef(ScalaRuntimeNullClassName) => jstpe.NullRef + case jstpe.ClassRef(ScalaRuntimeNothingClassName) => jstpe.NothingRef + case _ => typeRef + } + } + + def toIRTypeAndTypeRef(tp: Type)(using Context): (jstpe.Type, jstpe.TypeRef) = { + val typeRefInternal = toTypeRefInternal(tp) + (toIRTypeInternal(typeRefInternal), typeRefInternal._1) + } + + def toIRType(tp: Type)(using Context): jstpe.Type = + toIRTypeInternal(toTypeRefInternal(tp)) + + private def toIRTypeInternal(typeRefInternal: (jstpe.TypeRef, Symbol))(using Context): jstpe.Type = { + typeRefInternal._1 match { + case jstpe.PrimRef(irTpe) => + irTpe + + case typeRef: jstpe.ClassRef => + val sym = typeRefInternal._2 + if (sym == defn.ObjectClass || sym.isJSType) + jstpe.AnyType + else if (sym == defn.NothingClass) + jstpe.NothingType + else if (sym == defn.NullClass) + jstpe.NullType + else + jstpe.ClassType(typeRef.className) + + case typeRef: jstpe.ArrayTypeRef => + jstpe.ArrayType(typeRef) + } + } + + def toTypeRef(tp: Type)(using Context): jstpe.TypeRef = + toTypeRefInternal(tp)._1 + + private def toTypeRefInternal(tp: Type)(using Context): (jstpe.TypeRef, Symbol) = { + def primitiveOrClassToTypeRef(sym: Symbol): (jstpe.TypeRef, Symbol) = { + assert(sym.isClass, sym) + //assert(sym != defn.ArrayClass || isCompilingArray, sym) + val typeRef = if (sym.isPrimitiveValueClass) { + if (sym == defn.UnitClass) jstpe.VoidRef + else if (sym == defn.BooleanClass) jstpe.BooleanRef + else if (sym == defn.CharClass) jstpe.CharRef + else if (sym == defn.ByteClass) jstpe.ByteRef + else if (sym == defn.ShortClass) jstpe.ShortRef + else if (sym == defn.IntClass) jstpe.IntRef + else if (sym == defn.LongClass) jstpe.LongRef + else if (sym == defn.FloatClass) jstpe.FloatRef + else if (sym == defn.DoubleClass) jstpe.DoubleRef + else throw new Exception(s"unknown primitive value class $sym") + } else { + encodeClassRef(sym) + } + (typeRef, sym) + } + + /** + * When compiling Array.scala, the type parameter T is not erased and shows up in method + * signatures, e.g. `def apply(i: Int): T`. A TyperRef to T is replaced by ObjectReference. + */ + def nonClassTypeRefToTypeRef(sym: Symbol): (jstpe.TypeRef, Symbol) = { + //assert(sym.isType && isCompilingArray, sym) + (jstpe.ClassRef(ir.Names.ObjectClass), defn.ObjectClass) + } + + tp.widenDealias match { + // Array type such as Array[Int] (kept by erasure) + case JavaArrayType(el) => + val elTypeRef = toTypeRefInternal(el) + (jstpe.ArrayTypeRef.of(elTypeRef._1), elTypeRef._2) + + case t: TypeRef => + if (!t.symbol.isClass) nonClassTypeRefToTypeRef(t.symbol) // See comment on nonClassTypeRefToBType + else primitiveOrClassToTypeRef(t.symbol) // Common reference to a type such as scala.Int or java.lang.String + + case Types.ClassInfo(_, sym, _, _, _) => + /* We get here, for example, for genLoadModule, which invokes + * toTypeKind(moduleClassSymbol.info) + */ + primitiveOrClassToTypeRef(sym) + + /* AnnotatedType should (probably) be eliminated by erasure. However we know it happens for + * meta-annotated annotations (@(ann @getter) val x = 0), so we don't emit a warning. + * The type in the AnnotationInfo is an AnnotatedTpe. Tested in jvm/annotations.scala. + */ + case a @ AnnotatedType(t, _) => + //debuglog(s"typeKind of annotated type $a") + toTypeRefInternal(t) + } + } + + /** Patches the result type of a method symbol to sanitize it. 
+ * + * For some reason, dotc thinks that the `info.resultType`of an + * `isConstructor` method (for classes or traits) is the enclosing class + * or trait, but the bodies and usages act as if the result type was `Unit`. + * + * This method returns `UnitType` for constructor methods, and otherwise + * `sym.info.resultType`. + */ + def patchedResultType(sym: Symbol)(using Context): Type = + if (sym.isConstructor) defn.UnitType + else sym.info.resultType + + def originalNameOfLocal(sym: Symbol)( + implicit ctx: Context, localNames: LocalNameGenerator): OriginalName = { + val irName = localNames.localSymbolName(sym) + val originalName = UTF8String(sym.name.unexpandedName.toString) + if (UTF8String.equals(originalName, irName.encoded)) NoOriginalName + else OriginalName(originalName) + } + + def originalNameOfField(sym: Symbol)(using Context): OriginalName = + originalNameOf(sym.name) + + def originalNameOfMethod(sym: Symbol)(using Context): OriginalName = + originalNameOf(sym.name) + + def originalNameOfClass(sym: Symbol)(using Context): OriginalName = + originalNameOf(sym.fullName) + + private def originalNameOf(name: Name): OriginalName = { + val originalName = name.unexpandedName.toString + if (originalName == name.mangledString) NoOriginalName + else OriginalName(originalName) + } +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSExportsGen.scala b/tests/pos-with-compiler-cc/backend/sjs/JSExportsGen.scala new file mode 100644 index 000000000000..78412999bb34 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSExportsGen.scala @@ -0,0 +1,1025 @@ +package dotty.tools.backend.sjs + +import scala.language.unsafeNulls + +import scala.annotation.tailrec +import scala.collection.mutable + +import dotty.tools.dotc.core._ + +import Contexts._ +import Decorators._ +import Denotations._ +import Flags._ +import Names._ +import NameKinds.DefaultGetterName +import NameOps._ +import Phases._ +import Symbols._ +import Types._ +import TypeErasure.ErasedValueType + +import dotty.tools.dotc.util.{SourcePosition, SrcPos} +import dotty.tools.dotc.report + +import org.scalajs.ir.{Position, Names => jsNames, Trees => js, Types => jstpe} +import org.scalajs.ir.Names.DefaultModuleID +import org.scalajs.ir.OriginalName.NoOriginalName +import org.scalajs.ir.Position.NoPosition +import org.scalajs.ir.Trees.OptimizerHints + +import dotty.tools.dotc.transform.sjs.JSExportUtils._ +import dotty.tools.dotc.transform.sjs.JSSymUtils._ + +import JSEncoding._ + +final class JSExportsGen(jsCodeGen: JSCodeGen)(using Context) { + import jsCodeGen._ + import positionConversions._ + + /** Info for a non-member export. 
*/ + sealed trait ExportInfo { + val pos: SourcePosition + } + + final case class TopLevelExportInfo(moduleID: String, jsName: String)(val pos: SourcePosition) extends ExportInfo + final case class StaticExportInfo(jsName: String)(val pos: SourcePosition) extends ExportInfo + + private sealed trait ExportKind + + private object ExportKind { + case object Module extends ExportKind + case object JSClass extends ExportKind + case object Constructor extends ExportKind + case object Method extends ExportKind + case object Property extends ExportKind + case object Field extends ExportKind + + def apply(sym: Symbol): ExportKind = { + if (sym.is(Flags.Module) && sym.isStatic) Module + else if (sym.isClass) JSClass + else if (sym.isConstructor) Constructor + else if (!sym.is(Flags.Method)) Field + else if (sym.isJSProperty) Property + else Method + } + } + + private def topLevelExportsOf(sym: Symbol): List[TopLevelExportInfo] = { + def isScalaClass(sym: Symbol): Boolean = + sym.isClass && !sym.isOneOf(Module | Trait) && !sym.isJSType + + if (isScalaClass(sym)) { + // Scala classes are never exported; their constructors are + Nil + } else if (sym.is(Accessor) || sym.is(Module, butNot = ModuleClass)) { + /* - Accessors receive the `@JSExportTopLevel` annotation of their associated field, + * but only the field is really exported. + * - Module values are not exported; their module class takes care of the export. + */ + Nil + } else { + val symForAnnot = + if (sym.isConstructor && isScalaClass(sym.owner)) sym.owner + else sym + + symForAnnot.annotations.collect { + case annot if annot.symbol == jsdefn.JSExportTopLevelAnnot => + val jsName = annot.argumentConstantString(0).get + val moduleID = annot.argumentConstantString(1).getOrElse(DefaultModuleID) + TopLevelExportInfo(moduleID, jsName)(annot.tree.sourcePos) + } + } + } + + private def staticExportsOf(sym: Symbol): List[StaticExportInfo] = { + if (sym.is(Accessor)) { + Nil + } else { + sym.annotations.collect { + case annot if annot.symbol == jsdefn.JSExportStaticAnnot => + val jsName = annot.argumentConstantString(0).getOrElse { + sym.defaultJSName + } + StaticExportInfo(jsName)(annot.tree.sourcePos) + } + } + } + + private def checkSameKind(tups: List[(ExportInfo, Symbol)]): Option[ExportKind] = { + assert(tups.nonEmpty, "must have at least one export") + + val firstSym = tups.head._2 + val overallKind = ExportKind(firstSym) + var bad = false + + for ((info, sym) <- tups.tail) { + val kind = ExportKind(sym) + + if (kind != overallKind) { + bad = true + report.error( + em"export overload conflicts with export of $firstSym: they are of different types (${kind.tryToShow} / ${overallKind.tryToShow})", + info.pos) + } + } + + if (bad) None + else Some(overallKind) + } + + private def checkSingleField(tups: List[(ExportInfo, Symbol)]): Symbol = { + assert(tups.nonEmpty, "must have at least one export") + + val firstSym = tups.head._2 + + for ((info, _) <- tups.tail) { + report.error( + em"export overload conflicts with export of $firstSym: a field may not share its exported name with another export", + info.pos) + } + + firstSym + } + + def genTopLevelExports(classSym: ClassSymbol): List[js.TopLevelExportDef] = { + val exports = for { + sym <- classSym :: classSym.info.decls.toList + info <- topLevelExportsOf(sym) + } yield { + (info, sym) + } + + (for { + (info, tups) <- exports.groupBy(_._1) + kind <- checkSameKind(tups) + } yield { + import ExportKind._ + + implicit val pos = info.pos + + kind match { + case Module => + 
js.TopLevelModuleExportDef(info.moduleID, info.jsName) + + case JSClass => + assert(classSym.isNonNativeJSClass, "found export on non-JS class") + js.TopLevelJSClassExportDef(info.moduleID, info.jsName) + + case Constructor | Method => + val exported = tups.map(_._2) + + val methodDef = withNewLocalNameScope { + genExportMethod(exported, JSName.Literal(info.jsName), static = true) + } + + js.TopLevelMethodExportDef(info.moduleID, methodDef) + + case Property => + throw new AssertionError("found top-level exported property") + + case Field => + val sym = checkSingleField(tups) + js.TopLevelFieldExportDef(info.moduleID, info.jsName, encodeFieldSym(sym)) + } + }).toList + } + + def genStaticExports(classSym: Symbol): List[js.MemberDef] = { + val exports = for { + sym <- classSym.info.decls.toList + info <- staticExportsOf(sym) + } yield { + (info, sym) + } + + (for { + (info, tups) <- exports.groupBy(_._1) + kind <- checkSameKind(tups) + } yield { + def alts = tups.map(_._2) + + implicit val pos = info.pos + + import ExportKind._ + + kind match { + case Method => + genMemberExportOrDispatcher(JSName.Literal(info.jsName), isProp = false, alts, static = true) + + case Property => + genMemberExportOrDispatcher(JSName.Literal(info.jsName), isProp = true, alts, static = true) + + case Field => + val sym = checkSingleField(tups) + + // static fields must always be mutable + val flags = js.MemberFlags.empty + .withNamespace(js.MemberNamespace.PublicStatic) + .withMutable(true) + val name = js.StringLiteral(info.jsName) + val irTpe = genExposedFieldIRType(sym) + js.JSFieldDef(flags, name, irTpe) + + case kind => + throw new AssertionError(s"unexpected static export kind: $kind") + } + }).toList + } + + /** Generates exported methods and properties for a class. + * + * @param classSym symbol of the class we export for + */ + def genMemberExports(classSym: ClassSymbol): List[js.MemberDef] = { + val classInfo = classSym.info + val allExports = classInfo.memberDenots(takeAllFilter, { (name, buf) => + if (isExportName(name)) + buf ++= classInfo.member(name).alternatives + }) + + val newlyDeclaredExports = if (classSym.superClass == NoSymbol) { + allExports + } else { + allExports.filterNot { denot => + classSym.superClass.info.member(denot.name).hasAltWith(_.info =:= denot.info) + } + } + + val newlyDeclaredExportNames = newlyDeclaredExports.map(_.name.toTermName).toList.distinct + + newlyDeclaredExportNames.map(genMemberExport(classSym, _)) + } + + private def genMemberExport(classSym: ClassSymbol, name: TermName): js.MemberDef = { + /* This used to be `.member(name)`, but it caused #3538, since we were + * sometimes selecting mixin forwarders, whose type history does not go + * far enough back in time to see varargs. We now explicitly exclude + * mixed-in members in addition to bridge methods (the latter are always + * excluded by `.member(name)`). + */ + val alts = classSym + .findMemberNoShadowingBasedOnFlags(name, classSym.appliedRef, required = Method, excluded = Bridge | MixedIn) + .alternatives + + assert(!alts.isEmpty, + em"""Ended up with no alternatives for ${classSym.fullName}::$name. 
+ |Original set was ${alts} with types ${alts.map(_.info)}""") + + val (jsName, isProp) = exportNameInfo(name) + + // Check if we have a conflicting export of the other kind + val conflicting = classSym.info.member(makeExportName(jsName, !isProp)) + + if (conflicting.exists) { + val kind = if (isProp) "property" else "method" + val conflictingMember = conflicting.alternatives.head.symbol.fullName + val errorPos: SrcPos = alts.map(_.symbol).filter(_.owner == classSym) match { + case Nil => classSym + case altsInClass => altsInClass.minBy(_.span.point) + } + report.error(em"Exported $kind $jsName conflicts with $conflictingMember", errorPos) + } + + genMemberExportOrDispatcher(JSName.Literal(jsName), isProp, alts.map(_.symbol), static = false) + } + + def genJSClassDispatchers(classSym: Symbol, dispatchMethodsNames: List[JSName]): List[js.MemberDef] = { + dispatchMethodsNames.map(genJSClassDispatcher(classSym, _)) + } + + private def genJSClassDispatcher(classSym: Symbol, name: JSName): js.MemberDef = { + val alts = classSym.info.membersBasedOnFlags(required = Method, excluded = Bridge) + .map(_.symbol) + .filter { sym => + /* scala-js#3939: Object is not a "real" superclass of JS types. + * as such, its methods do not participate in overload resolution. + * An exception is toString, which is handled specially in genExportMethod. + */ + sym.owner != defn.ObjectClass && sym.jsName == name + } + .toList + + assert(!alts.isEmpty, s"Ended up with no alternatives for ${classSym.fullName}::$name.") + + val (propSyms, methodSyms) = alts.partition(_.isJSProperty) + val isProp = propSyms.nonEmpty + + if (isProp && methodSyms.nonEmpty) { + val firstAlt = alts.head + report.error( + em"Conflicting properties and methods for ${classSym.fullName}::$name.", + firstAlt.srcPos) + implicit val pos = firstAlt.span + js.JSPropertyDef(js.MemberFlags.empty, genExpr(name)(firstAlt.sourcePos), None, None) + } else { + genMemberExportOrDispatcher(name, isProp, alts, static = false) + } + } + + private def genMemberExportOrDispatcher(jsName: JSName, isProp: Boolean, + alts: List[Symbol], static: Boolean): js.MemberDef = { + withNewLocalNameScope { + if (isProp) + genExportProperty(alts, jsName, static) + else + genExportMethod(alts, jsName, static) + } + } + + private def genExportProperty(alts: List[Symbol], jsName: JSName, static: Boolean): js.JSPropertyDef = { + assert(!alts.isEmpty, s"genExportProperty with empty alternatives for $jsName") + + implicit val pos: Position = alts.head.span + + val namespace = + if (static) js.MemberNamespace.PublicStatic + else js.MemberNamespace.Public + val flags = js.MemberFlags.empty.withNamespace(namespace) + + /* Separate getters and setters. Since we only have getters and setters, we + * simply test the param list size, which is faster than using the full isJSGetter. 
+ */ + val (getter, setters) = alts.partition(_.info.paramInfoss.head.isEmpty) + + // We can have at most one getter + if (getter.sizeIs > 1) + reportCannotDisambiguateError(jsName, alts) + + val getterBody = getter.headOption.map { getterSym => + genApplyForSingleExported(new FormalArgsRegistry(0, false), new ExportedSymbol(getterSym, static), static) + } + + val setterArgAndBody = { + if (setters.isEmpty) { + None + } else { + val formalArgsRegistry = new FormalArgsRegistry(1, false) + val (List(arg), None) = formalArgsRegistry.genFormalArgs(): @unchecked + val body = genOverloadDispatchSameArgc(jsName, formalArgsRegistry, + setters.map(new ExportedSymbol(_, static)), jstpe.AnyType, None) + Some((arg, body)) + } + } + + js.JSPropertyDef(flags, genExpr(jsName)(alts.head.sourcePos), getterBody, setterArgAndBody) + } + + private def genExportMethod(alts0: List[Symbol], jsName: JSName, static: Boolean)(using Context): js.JSMethodDef = { + assert(alts0.nonEmpty, "need at least one alternative to generate exporter method") + + implicit val pos: SourcePosition = alts0.head.sourcePos + + val namespace = + if (static) js.MemberNamespace.PublicStatic + else js.MemberNamespace.Public + val flags = js.MemberFlags.empty.withNamespace(namespace) + + // toString() is always exported. We might need to add it here to get correct overloading. + val alts = jsName match { + case JSName.Literal("toString") if alts0.forall(_.info.paramInfoss.exists(_.nonEmpty)) => + defn.Any_toString :: alts0 + case _ => + alts0 + } + + val overloads = alts.map(new ExportedSymbol(_, static)) + + val (formalArgs, restParam, body) = + genOverloadDispatch(jsName, overloads, jstpe.AnyType) + + js.JSMethodDef(flags, genExpr(jsName), formalArgs, restParam, body)( + OptimizerHints.empty, None) + } + + def genOverloadDispatch(jsName: JSName, alts: List[Exported], tpe: jstpe.Type)( + using pos: SourcePosition): (List[js.ParamDef], Option[js.ParamDef], js.Tree) = { + + // Create the formal args registry + val hasVarArg = alts.exists(_.hasRepeatedParam) + val minArgc = alts.map(_.minArgc).min + val maxNonRepeatedArgc = alts.map(_.maxNonRepeatedArgc).max + val needsRestParam = maxNonRepeatedArgc != minArgc || hasVarArg + val formalArgsRegistry = new FormalArgsRegistry(minArgc, needsRestParam) + + // Generate the list of formal parameters + val (formalArgs, restParam) = formalArgsRegistry.genFormalArgs() + + /* Generate the body + * We have a fast-path for methods that are not overloaded. In addition to + * being a fast path, it does a better job than `genExportMethodMultiAlts` + * when the only alternative has default parameters, because it avoids a + * spurious dispatch. + * In scalac, the spurious dispatch was avoided by a more elaborate case + * generation in `genExportMethod`, which was very convoluted and was not + * ported to dotc. 
+ */ + val body = + if (alts.tail.isEmpty) alts.head.genBody(formalArgsRegistry) + else genExportMethodMultiAlts(formalArgsRegistry, maxNonRepeatedArgc, alts, tpe, jsName) + + (formalArgs, restParam, body) + } + + private def genExportMethodMultiAlts(formalArgsRegistry: FormalArgsRegistry, + maxNonRepeatedArgc: Int, alts: List[Exported], tpe: jstpe.Type, jsName: JSName)( + implicit pos: SourcePosition): js.Tree = { + + // Generate tuples (argc, method) + val methodArgCounts = for { + alt <- alts + argc <- alt.minArgc to (if (alt.hasRepeatedParam) maxNonRepeatedArgc else alt.maxNonRepeatedArgc) + } yield { + (argc, alt) + } + + // Create a list of (argCount -> methods), sorted by argCount (methods may appear multiple times) + val methodsByArgCount: List[(Int, List[Exported])] = + methodArgCounts.groupMap(_._1)(_._2).toList.sortBy(_._1) // sort for determinism + + val altsWithVarArgs = alts.filter(_.hasRepeatedParam) + + // Generate a case block for each (argCount, methods) tuple + // TODO? We could optimize this a bit by putting together all the `argCount`s that have the same methods + // (Scala.js for scalac does that, but the code is very convoluted and it's not clear that it is worth it). + val cases = for { + (argc, methods) <- methodsByArgCount + if methods != altsWithVarArgs // exclude default case we're generating anyways for varargs + } yield { + // body of case to disambiguates methods with current count + val caseBody = genOverloadDispatchSameArgc(jsName, formalArgsRegistry, methods, tpe, Some(argc)) + List(js.IntLiteral(argc - formalArgsRegistry.minArgc)) -> caseBody + } + + def defaultCase = { + if (altsWithVarArgs.isEmpty) + genThrowTypeError() + else + genOverloadDispatchSameArgc(jsName, formalArgsRegistry, altsWithVarArgs, tpe, None) + } + + val body = { + if (cases.isEmpty) { + defaultCase + } else if (cases.tail.isEmpty && altsWithVarArgs.isEmpty) { + cases.head._2 + } else { + val restArgRef = formalArgsRegistry.genRestArgRef() + js.Match( + js.AsInstanceOf(js.JSSelect(restArgRef, js.StringLiteral("length")), jstpe.IntType), + cases, + defaultCase)( + tpe) + } + } + + body + } + + /** Resolves method calls to [[alts]] while assuming they have the same parameter count. + * + * @param jsName + * The JS name of the method, for error reporting + * @param formalArgsRegistry + * The registry of all the formal arguments + * @param alts + * Alternative methods + * @param tpe + * Result type + * @param maxArgc + * Maximum number of arguments to use for disambiguation + */ + private def genOverloadDispatchSameArgc(jsName: JSName, formalArgsRegistry: FormalArgsRegistry, + alts: List[Exported], tpe: jstpe.Type, maxArgc: Option[Int]): js.Tree = { + genOverloadDispatchSameArgcRec(jsName, formalArgsRegistry, alts, tpe, paramIndex = 0, maxArgc) + } + + /** Resolves method calls to [[alts]] while assuming they have the same parameter count. 
+ * + * @param jsName + * The JS name of the method, for error reporting + * @param formalArgsRegistry + * The registry of all the formal arguments + * @param alts + * Alternative methods + * @param tpe + * Result type + * @param paramIndex + * Index where to start disambiguation (starts at 0, increases through recursion) + * @param maxArgc + * Maximum number of arguments to use for disambiguation + */ + private def genOverloadDispatchSameArgcRec(jsName: JSName, formalArgsRegistry: FormalArgsRegistry, + alts: List[Exported], tpe: jstpe.Type, paramIndex: Int, maxArgc: Option[Int]): js.Tree = { + + implicit val pos = alts.head.pos + + if (alts.sizeIs == 1) { + alts.head.genBody(formalArgsRegistry) + } else if (maxArgc.exists(_ <= paramIndex) || !alts.exists(_.params.size > paramIndex)) { + // We reach here in three cases: + // 1. The parameter list has been exhausted + // 2. The optional argument count restriction has triggered + // 3. We only have (more than once) repeated parameters left + // Therefore, we should fail + reportCannotDisambiguateError(jsName, alts.map(_.sym)) + js.Undefined() + } else { + val altsByTypeTest = groupByWithoutHashCode(alts) { exported => + typeTestForTpe(exported.exportArgTypeAt(paramIndex)) + } + + if (altsByTypeTest.size == 1) { + // Testing this parameter is not doing any us good + genOverloadDispatchSameArgcRec(jsName, formalArgsRegistry, alts, tpe, paramIndex + 1, maxArgc) + } else { + // Sort them so that, e.g., isInstanceOf[String] comes before isInstanceOf[Object] + val sortedAltsByTypeTest = topoSortDistinctsWith(altsByTypeTest) { (lhs, rhs) => + (lhs._1, rhs._1) match { + // NoTypeTest is always last + case (_, NoTypeTest) => true + case (NoTypeTest, _) => false + + case (PrimitiveTypeTest(_, rank1), PrimitiveTypeTest(_, rank2)) => + rank1 <= rank2 + + case (InstanceOfTypeTest(t1), InstanceOfTypeTest(t2)) => + t1 <:< t2 + + case (_: PrimitiveTypeTest, _: InstanceOfTypeTest) => true + case (_: InstanceOfTypeTest, _: PrimitiveTypeTest) => false + } + } + + val defaultCase = genThrowTypeError() + + sortedAltsByTypeTest.foldRight[js.Tree](defaultCase) { (elem, elsep) => + val (typeTest, subAlts) = elem + implicit val pos = subAlts.head.pos + + val paramRef = formalArgsRegistry.genArgRef(paramIndex) + val genSubAlts = genOverloadDispatchSameArgcRec(jsName, formalArgsRegistry, + subAlts, tpe, paramIndex + 1, maxArgc) + + def hasDefaultParam = subAlts.exists(_.hasDefaultAt(paramIndex)) + + val optCond = typeTest match { + case PrimitiveTypeTest(tpe, _) => Some(js.IsInstanceOf(paramRef, tpe)) + case InstanceOfTypeTest(tpe) => Some(genIsInstanceOf(paramRef, tpe)) + case NoTypeTest => None + } + + optCond.fold[js.Tree] { + genSubAlts // note: elsep is discarded, obviously + } { cond => + val condOrUndef = if (!hasDefaultParam) cond else { + js.If(cond, js.BooleanLiteral(true), + js.BinaryOp(js.BinaryOp.===, paramRef, js.Undefined()))( + jstpe.BooleanType) + } + js.If(condOrUndef, genSubAlts, elsep)(tpe) + } + } + } + } + } + + private def reportCannotDisambiguateError(jsName: JSName, alts: List[Symbol]): Unit = { + val currentClass = currentClassSym.get + + /* Find a position that is in the current class for decent error reporting. + * If there are more than one, always use the "highest" one (i.e., the + * one coming last in the source text) so that we reliably display the + * same error in all compilers. 
+ */ + val validPositions = alts.collect { + case alt if alt.owner == currentClass => alt.sourcePos + } + val pos: SourcePosition = + if (validPositions.isEmpty) currentClass.sourcePos + else validPositions.maxBy(_.point) + + val kind = + if (alts.head.isJSGetter) "getter" + else if (alts.head.isJSSetter) "setter" + else "method" + + val fullKind = + if (currentClass.isJSType) kind + else "exported " + kind + + val displayName = jsName.displayName + val altsTypesInfo = alts.map(_.info.show).sorted.mkString("\n ") + + report.error( + em"Cannot disambiguate overloads for $fullKind $displayName with types\n $altsTypesInfo", + pos) + } + + /** Generates a call to the method represented by the given `exported` while using the formalArguments + * and potentially the argument array. + * + * Also inserts default parameters if required. + */ + private def genApplyForSingleExported(formalArgsRegistry: FormalArgsRegistry, + exported: Exported, static: Boolean): js.Tree = { + if (currentClassSym.isJSType && exported.sym.owner != currentClassSym.get) { + assert(!static, s"nonsensical JS super call in static export of ${exported.sym}") + genApplyForSingleExportedJSSuperCall(formalArgsRegistry, exported) + } else { + genApplyForSingleExportedNonJSSuperCall(formalArgsRegistry, exported, static) + } + } + + private def genApplyForSingleExportedJSSuperCall( + formalArgsRegistry: FormalArgsRegistry, exported: Exported): js.Tree = { + implicit val pos = exported.pos + + val sym = exported.sym + assert(!sym.isClassConstructor, + s"Trying to genApplyForSingleExportedJSSuperCall for the constructor ${sym.fullName}") + + val allArgs = formalArgsRegistry.genAllArgsRefsForForwarder() + + val superClass = { + val superClassSym = currentClassSym.asClass.superClass + if (superClassSym.isNestedJSClass) + js.VarRef(js.LocalIdent(JSSuperClassParamName))(jstpe.AnyType) + else + js.LoadJSConstructor(encodeClassName(superClassSym)) + } + + val receiver = js.This()(currentThisType) + val nameTree = genExpr(sym.jsName) + + if (sym.isJSGetter) { + assert(allArgs.isEmpty, + s"getter symbol $sym does not have a getter signature") + js.JSSuperSelect(superClass, receiver, nameTree) + } else if (sym.isJSSetter) { + assert(allArgs.size == 1 && allArgs.head.isInstanceOf[js.Tree], + s"setter symbol $sym does not have a setter signature") + js.Assign(js.JSSuperSelect(superClass, receiver, nameTree), + allArgs.head.asInstanceOf[js.Tree]) + } else { + js.JSSuperMethodCall(superClass, receiver, nameTree, allArgs) + } + } + + private def genApplyForSingleExportedNonJSSuperCall( + formalArgsRegistry: FormalArgsRegistry, exported: Exported, static: Boolean): js.Tree = { + + implicit val pos = exported.pos + + val varDefs = new mutable.ListBuffer[js.VarDef] + + for ((param, i) <- exported.params.zipWithIndex) { + val rhs = genScalaArg(exported, i, formalArgsRegistry, param, static, captures = Nil)( + prevArgsCount => varDefs.take(prevArgsCount).toList.map(_.ref)) + + varDefs += js.VarDef(freshLocalIdent("prep" + i), NoOriginalName, rhs.tpe, mutable = false, rhs) + } + + val builtVarDefs = varDefs.result() + + val jsResult = genResult(exported, builtVarDefs.map(_.ref), static) + + js.Block(builtVarDefs :+ jsResult) + } + + /** Generates a Scala argument from dispatched JavaScript arguments + * (unboxing and default parameter handling). 
+ */ + def genScalaArg(exported: Exported, paramIndex: Int, formalArgsRegistry: FormalArgsRegistry, + param: JSParamInfo, static: Boolean, captures: List[js.Tree])( + previousArgsValues: Int => List[js.Tree])( + implicit pos: SourcePosition): js.Tree = { + + if (param.repeated) { + genJSArrayToVarArgs(formalArgsRegistry.genVarargRef(paramIndex)) + } else { + val jsArg = formalArgsRegistry.genArgRef(paramIndex) + + // Unboxed argument (if it is defined) + val unboxedArg = unbox(jsArg, param.info) + + if (exported.hasDefaultAt(paramIndex)) { + // If argument is undefined and there is a default getter, call it + js.If(js.BinaryOp(js.BinaryOp.===, jsArg, js.Undefined()), { + genCallDefaultGetter(exported.sym, paramIndex, static, captures)(previousArgsValues) + }, { + unboxedArg + })(unboxedArg.tpe) + } else { + // Otherwise, it is always the unboxed argument + unboxedArg + } + } + } + + def genCallDefaultGetter(sym: Symbol, paramIndex: Int, + static: Boolean, captures: List[js.Tree])( + previousArgsValues: Int => List[js.Tree])( + implicit pos: SourcePosition): js.Tree = { + + val targetSym = targetSymForDefaultGetter(sym) + val defaultGetterDenot = this.defaultGetterDenot(targetSym, sym, paramIndex) + + assert(defaultGetterDenot.exists, s"need default getter for method ${sym.fullName}") + assert(!defaultGetterDenot.isOverloaded, i"found overloaded default getter $defaultGetterDenot") + val defaultGetter = defaultGetterDenot.symbol + + val targetTree = { + if (sym.isClassConstructor || static) { + if (targetSym.isStatic) { + assert(captures.isEmpty, i"expected empty captures for ${targetSym.fullName} at $pos") + genLoadModule(targetSym) + } else { + assert(captures.sizeIs == 1, "expected exactly one capture") + + // Find the module accessor. We cannot use memberBasedOnFlags because of scala-js/scala-js#4526. + val outer = targetSym.originalOwner + val name = atPhase(typerPhase)(targetSym.name.unexpandedName).sourceModuleName + val modAccessor = outer.info.allMembers.find { denot => + denot.symbol.is(Module) && denot.name.unexpandedName == name + }.getOrElse { + throw new AssertionError(i"could not find module accessor for ${targetSym.fullName} at $pos") + }.symbol + + val receiver = captures.head + if (outer.isJSType) + genApplyJSClassMethod(receiver, modAccessor, Nil) + else + genApplyMethodMaybeStatically(receiver, modAccessor, Nil) + } + } else { + js.This()(currentThisType) + } + } + + // Pass previous arguments to defaultGetter + val defaultGetterArgs = previousArgsValues(defaultGetter.info.paramInfoss.head.size) + + val callGetter = if (targetSym.isJSType) { + if (defaultGetter.owner.isNonNativeJSClass) { + if (defaultGetter.hasAnnotation(jsdefn.JSOptionalAnnot)) + js.Undefined() + else + genApplyJSClassMethod(targetTree, defaultGetter, defaultGetterArgs) + } else if (defaultGetter.owner == targetSym) { + /* We get here if a non-native constructor has a native companion. + * This is reported on a per-class level. 
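The interaction of `genScalaArg` with `genCallDefaultGetter` boils down to: if the incoming JS argument is `undefined` and the parameter has a default, call the synthetic default getter instead of unboxing. The sketch below is a made-up, plain-Scala rendering of that logic: `greetFromJS` and `defaultName` are invented names, `null` stands in for JS `undefined`, and in reality the forwarder is emitted as IR and calls the compiler-generated `greet$default$1`.

```scala
// Illustration of the default-getter fallback performed by exported forwarders.
class Greeter {
  def greet(name: String): String = s"hello, $name"
  private def defaultName: String = "world" // stand-in for the synthetic `greet$default$1`

  def greetFromJS(arg: Any): String = {
    val name =
      if (arg == null) defaultName           // the real test compares against JS `undefined`
      else arg.asInstanceOf[String]          // otherwise use the unboxed argument
    greet(name)
  }
}

object Greeter {
  def main(args: Array[String]): Unit = {
    val g = new Greeter
    println(g.greetFromJS(null))    // hello, world   (argument missing on the JS side)
    println(g.greetFromJS("dotty")) // hello, dotty
  }
}
```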
+ */ + assert(sym.isClassConstructor, + s"got non-constructor method $sym with default method in JS native companion") + js.Undefined() + } else { + report.error( + "When overriding a native method with default arguments, " + + "the overriding method must explicitly repeat the default arguments.", + sym.srcPos) + js.Undefined() + } + } else { + genApplyMethod(targetTree, defaultGetter, defaultGetterArgs) + } + + // #15419 If the getter returns void, we must "box" it by returning undefined + if (callGetter.tpe == jstpe.NoType) + js.Block(callGetter, js.Undefined()) + else + callGetter + } + + private def targetSymForDefaultGetter(sym: Symbol): Symbol = + if (sym.isClassConstructor) sym.owner.companionModule.moduleClass + else sym.owner + + private def defaultGetterDenot(targetSym: Symbol, sym: Symbol, paramIndex: Int): Denotation = + targetSym.info.memberBasedOnFlags(DefaultGetterName(sym.name.asTermName, paramIndex), excluded = Bridge) + + private def defaultGetterDenot(sym: Symbol, paramIndex: Int): Denotation = + defaultGetterDenot(targetSymForDefaultGetter(sym), sym, paramIndex) + + /** Generate the final forwarding call to the exported method. */ + private def genResult(exported: Exported, args: List[js.Tree], static: Boolean)( + implicit pos: SourcePosition): js.Tree = { + + val sym = exported.sym + val currentClass = currentClassSym.get + + def receiver = + if (static) genLoadModule(sym.owner) + else js.This()(currentThisType) + + def boxIfNeeded(call: js.Tree): js.Tree = + box(call, atPhase(elimErasedValueTypePhase)(sym.info.resultType)) + + if (currentClass.isNonNativeJSClass) { + assert(sym.owner == currentClass, sym.fullName) + boxIfNeeded(genApplyJSClassMethod(receiver, sym, args)) + } else { + if (sym.isClassConstructor) + js.New(encodeClassName(currentClass), encodeMethodSym(sym), args) + else if (sym.isPrivate) + boxIfNeeded(genApplyMethodStatically(receiver, sym, args)) + else + boxIfNeeded(genApplyMethod(receiver, sym, args)) + } + } + + private def genThrowTypeError(msg: String = "No matching overload")(implicit pos: Position): js.Tree = + js.Throw(js.JSNew(js.JSGlobalRef("TypeError"), js.StringLiteral(msg) :: Nil)) + + abstract class Exported( + val sym: Symbol, + // Parameters participating in overload resolution. 
+ val params: scala.collection.immutable.IndexedSeq[JSParamInfo] + ) { + assert(!params.exists(_.capture), "illegal capture params in Exported") + + private val paramsHasDefault = { + if (!atPhase(elimRepeatedPhase)(sym.hasDefaultParams)) { + Vector.empty + } else { + val targetSym = targetSymForDefaultGetter(sym) + params.indices.map(i => defaultGetterDenot(targetSym, sym, i).exists) + } + } + + def hasDefaultAt(paramIndex: Int): Boolean = + paramIndex < paramsHasDefault.size && paramsHasDefault(paramIndex) + + val hasRepeatedParam = params.nonEmpty && params.last.repeated + + val minArgc = { + // Find the first default param or repeated param + params + .indices + .find(i => hasDefaultAt(i) || params(i).repeated) + .getOrElse(params.size) + } + + val maxNonRepeatedArgc = if (hasRepeatedParam) params.size - 1 else params.size + + def pos: SourcePosition = sym.sourcePos + + def exportArgTypeAt(paramIndex: Int): Type = { + if (paramIndex < params.length) { + params(paramIndex).info + } else { + assert(hasRepeatedParam, i"$sym does not have varargs nor enough params for $paramIndex") + params.last.info + } + } + + def typeInfo: String = sym.info.toString + + def genBody(formalArgsRegistry: FormalArgsRegistry): js.Tree + } + + private class ExportedSymbol(sym: Symbol, static: Boolean) + extends Exported(sym, sym.jsParamInfos.toIndexedSeq) { + + def genBody(formalArgsRegistry: FormalArgsRegistry): js.Tree = + genApplyForSingleExported(formalArgsRegistry, this, static) + } + + // !!! Hash codes of RTTypeTest are meaningless because of InstanceOfTypeTest + private sealed abstract class RTTypeTest + + private case class PrimitiveTypeTest(tpe: jstpe.Type, rank: Int) extends RTTypeTest + + // !!! This class does not have a meaningful hash code + private case class InstanceOfTypeTest(tpe: Type) extends RTTypeTest { + override def equals(that: Any): Boolean = { + that match { + case InstanceOfTypeTest(thatTpe) => tpe =:= thatTpe + case _ => false + } + } + } + + private case object NoTypeTest extends RTTypeTest + + /** Very simple O(n²) topological sort for elements assumed to be distinct. 
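A quick worked example of the `minArgc` / `maxNonRepeatedArgc` computations in `Exported` may help. The sketch below uses a made-up `Param` record in place of `JSParamInfo`, but the three expressions mirror the definitions above; for a signature like `def m(a: Int, b: Int = 1, rest: Int*)` it prints `(1,2,true)`.

```scala
// Toy computation of minArgc / maxNonRepeatedArgc / hasRepeatedParam.
object ArgcExample {
  final case class Param(hasDefault: Boolean, repeated: Boolean)

  def main(args: Array[String]): Unit = {
    // corresponds to `def m(a: Int, b: Int = 1, rest: Int*)`
    val params = Vector(Param(false, false), Param(true, false), Param(false, true))

    val hasRepeatedParam = params.nonEmpty && params.last.repeated
    val minArgc = params.indices
      .find(i => params(i).hasDefault || params(i).repeated) // first default or repeated param
      .getOrElse(params.size)
    val maxNonRepeatedArgc = if (hasRepeatedParam) params.size - 1 else params.size

    println((minArgc, maxNonRepeatedArgc, hasRepeatedParam)) // prints (1,2,true)
  }
}
```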
*/ + private def topoSortDistinctsWith[A <: AnyRef](coll: List[A])(lteq: (A, A) => Boolean): List[A] = { + @tailrec + def loop(coll: List[A], acc: List[A]): List[A] = { + if (coll.isEmpty) acc + else if (coll.tail.isEmpty) coll.head :: acc + else { + val (lhs, rhs) = coll.span(x => !coll.forall(y => (x eq y) || !lteq(x, y))) + assert(!rhs.isEmpty, s"cycle while ordering $coll") + loop(lhs ::: rhs.tail, rhs.head :: acc) + } + } + + loop(coll, Nil) + } + + private def typeTestForTpe(tpe: Type): RTTypeTest = { + tpe match { + case tpe: ErasedValueType => + InstanceOfTypeTest(tpe.tycon.typeSymbol.typeRef) + + case _ => + import org.scalajs.ir.Names + + (toIRType(tpe): @unchecked) match { + case jstpe.AnyType => NoTypeTest + + case jstpe.NoType => PrimitiveTypeTest(jstpe.UndefType, 0) + case jstpe.BooleanType => PrimitiveTypeTest(jstpe.BooleanType, 1) + case jstpe.CharType => PrimitiveTypeTest(jstpe.CharType, 2) + case jstpe.ByteType => PrimitiveTypeTest(jstpe.ByteType, 3) + case jstpe.ShortType => PrimitiveTypeTest(jstpe.ShortType, 4) + case jstpe.IntType => PrimitiveTypeTest(jstpe.IntType, 5) + case jstpe.LongType => PrimitiveTypeTest(jstpe.LongType, 6) + case jstpe.FloatType => PrimitiveTypeTest(jstpe.FloatType, 7) + case jstpe.DoubleType => PrimitiveTypeTest(jstpe.DoubleType, 8) + + case jstpe.ClassType(Names.BoxedUnitClass) => PrimitiveTypeTest(jstpe.UndefType, 0) + case jstpe.ClassType(Names.BoxedStringClass) => PrimitiveTypeTest(jstpe.StringType, 9) + case jstpe.ClassType(_) => InstanceOfTypeTest(tpe) + + case jstpe.ArrayType(_) => InstanceOfTypeTest(tpe) + } + } + } + + // Group-by that does not rely on hashCode(), only equals() - O(n²) + private def groupByWithoutHashCode[A, B](coll: List[A])(f: A => B): List[(B, List[A])] = { + val m = new mutable.ArrayBuffer[(B, List[A])] + m.sizeHint(coll.length) + + for (elem <- coll) { + val key = f(elem) + val index = m.indexWhere(_._1 == key) + if (index < 0) + m += ((key, List(elem))) + else + m(index) = (key, elem :: m(index)._2) + } + + m.toList + } + + class FormalArgsRegistry(val minArgc: Int, needsRestParam: Boolean) { + private val fixedParamNames: scala.collection.immutable.IndexedSeq[jsNames.LocalName] = + (0 until minArgc).toIndexedSeq.map(_ => freshLocalIdent("arg")(NoPosition).name) + + private val restParamName: jsNames.LocalName = + if (needsRestParam) freshLocalIdent("rest")(NoPosition).name + else null + + def genFormalArgs()(implicit pos: Position): (List[js.ParamDef], Option[js.ParamDef]) = { + val fixedParamDefs = fixedParamNames.toList.map { paramName => + js.ParamDef(js.LocalIdent(paramName), NoOriginalName, jstpe.AnyType, mutable = false) + } + + val restParam = { + if (needsRestParam) + Some(js.ParamDef(js.LocalIdent(restParamName), NoOriginalName, jstpe.AnyType, mutable = false)) + else + None + } + + (fixedParamDefs, restParam) + } + + def genArgRef(index: Int)(implicit pos: Position): js.Tree = { + if (index < minArgc) + js.VarRef(js.LocalIdent(fixedParamNames(index)))(jstpe.AnyType) + else + js.JSSelect(genRestArgRef(), js.IntLiteral(index - minArgc)) + } + + def genVarargRef(fixedParamCount: Int)(implicit pos: Position): js.Tree = { + assert(fixedParamCount >= minArgc, s"genVarargRef($fixedParamCount) with minArgc = $minArgc at $pos") + val restParam = genRestArgRef() + if (fixedParamCount == minArgc) + restParam + else + js.JSMethodApply(restParam, js.StringLiteral("slice"), List(js.IntLiteral(fixedParamCount - minArgc))) + } + + def genRestArgRef()(implicit pos: Position): js.Tree = { + assert(needsRestParam, 
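The reason `groupByWithoutHashCode` avoids hash-based grouping is visible in `InstanceOfTypeTest` above: it overrides `equals` (via `=:=`) but keeps a meaningless `hashCode`, so a `HashMap`-backed `groupBy` could split equal keys into separate buckets. The self-contained toy below reproduces the same quadratic scan with a made-up `Key` type that has the same equals-without-hashCode property.

```scala
import scala.collection.mutable

// Toy demonstration of an equals-only group-by, mirroring groupByWithoutHashCode.
object EqualsOnlyGroupBy {
  final class Key(val s: String) {
    override def equals(that: Any): Boolean = that match {
      case k: Key => k.s.equalsIgnoreCase(s)
      case _      => false
    }
    override def toString: String = s
    // hashCode deliberately NOT overridden, like InstanceOfTypeTest above
  }

  def groupByWithoutHashCode[A, B](coll: List[A])(f: A => B): List[(B, List[A])] = {
    val m = new mutable.ArrayBuffer[(B, List[A])]
    for (elem <- coll) {
      val key = f(elem)
      val index = m.indexWhere(_._1 == key) // uses equals only, never hashCode
      if (index < 0) m += ((key, List(elem)))
      else m(index) = (key, elem :: m(index)._2)
    }
    m.toList
  }

  def main(args: Array[String]): Unit = {
    val groups = groupByWithoutHashCode(List("a", "A", "b"))(s => new Key(s))
    println(groups.map { case (k, vs) => k.toString -> vs })
    // two groups: one containing "a" and "A", one containing "b"
  }
}
```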
s"trying to generate a reference to non-existent rest param at $pos") + js.VarRef(js.LocalIdent(restParamName))(jstpe.AnyType) + } + + def genAllArgsRefsForForwarder()(implicit pos: Position): List[js.TreeOrJSSpread] = { + val fixedArgRefs = fixedParamNames.toList.map { paramName => + js.VarRef(js.LocalIdent(paramName))(jstpe.AnyType) + } + + if (needsRestParam) { + val restArgRef = js.VarRef(js.LocalIdent(restParamName))(jstpe.AnyType) + fixedArgRefs :+ js.JSSpread(restArgRef) + } else { + fixedArgRefs + } + } + } +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSPositions.scala b/tests/pos-with-compiler-cc/backend/sjs/JSPositions.scala new file mode 100644 index 000000000000..2fd007165952 --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSPositions.scala @@ -0,0 +1,102 @@ +package dotty.tools.backend.sjs + +import scala.language.unsafeNulls + +import java.net.{URI, URISyntaxException} + +import dotty.tools.dotc.core._ +import Contexts._ +import Decorators.em + +import dotty.tools.dotc.report + +import dotty.tools.dotc.util.{SourceFile, SourcePosition} +import dotty.tools.dotc.util.Spans.Span + +import org.scalajs.ir + +/** Conversion utilities from dotty Positions to IR Positions. */ +class JSPositions()(using Context) { + import JSPositions._ + + private val sourceURIMaps: List[URIMap] = { + ctx.settings.scalajsMapSourceURI.value.flatMap { option => + val uris = option.split("->") + if (uris.length != 1 && uris.length != 2) { + report.error("-scalajs-mapSourceURI needs one or two URIs as argument (separated by '->').") + Nil + } else { + try { + val from = new URI(uris.head) + val to = uris.lift(1).map(str => new URI(str)) + URIMap(from, to) :: Nil + } catch { + case e: URISyntaxException => + report.error(em"${e.getInput} is not a valid URI") + Nil + } + } + } + } + + private def sourceAndSpan2irPos(source: SourceFile, span: Span): ir.Position = { + if (!span.exists) ir.Position.NoPosition + else { + // dotty positions and IR positions are both 0-based + val irSource = span2irPosCache.toIRSource(source) + val point = span.point + val line = source.offsetToLine(point) + val column = source.column(point) + ir.Position(irSource, line, column) + } + } + + /** Implicit conversion from dotty Span to ir.Position. */ + implicit def span2irPos(span: Span): ir.Position = + sourceAndSpan2irPos(ctx.compilationUnit.source, span) + + /** Implicitly materializes an ir.Position from an implicit dotty Span. */ + implicit def implicitSpan2irPos(implicit span: Span): ir.Position = + span2irPos(span) + + /** Implicitly materializes an ir.Position from an implicit dotty SourcePosition. 
*/ + implicit def implicitSourcePos2irPos(implicit sourcePos: SourcePosition): ir.Position = + sourceAndSpan2irPos(sourcePos.source, sourcePos.span) + + private object span2irPosCache { + import dotty.tools.dotc.util._ + + private var lastDotcSource: SourceFile = null + private var lastIRSource: ir.Position.SourceFile = null + + def toIRSource(dotcSource: SourceFile): ir.Position.SourceFile = { + if (dotcSource != lastDotcSource) { + lastIRSource = convert(dotcSource) + lastDotcSource = dotcSource + } + lastIRSource + } + + private def convert(dotcSource: SourceFile): ir.Position.SourceFile = { + dotcSource.file.file match { + case null => + new java.net.URI( + "virtualfile", // Pseudo-Scheme + dotcSource.file.path, // Scheme specific part + null // Fragment + ) + case file => + val srcURI = file.toURI + sourceURIMaps.collectFirst { + case URIMap(from, to) if from.relativize(srcURI) != srcURI => + val relURI = from.relativize(srcURI) + to.fold(relURI)(_.resolve(relURI)) + }.getOrElse(srcURI) + } + } + } +} + +object JSPositions { + final case class URIMap(from: URI, to: Option[URI]) +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/JSPrimitives.scala b/tests/pos-with-compiler-cc/backend/sjs/JSPrimitives.scala new file mode 100644 index 000000000000..ce83f5e9e83b --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/JSPrimitives.scala @@ -0,0 +1,150 @@ +package dotty.tools.backend.sjs + +import dotty.tools.dotc.core._ +import Names.TermName +import Types._ +import Contexts._ +import Symbols._ +import Decorators.em + +import dotty.tools.dotc.ast.tpd._ +import dotty.tools.backend.jvm.DottyPrimitives +import dotty.tools.dotc.report +import dotty.tools.dotc.util.ReadOnlyMap + +object JSPrimitives { + + inline val FirstJSPrimitiveCode = 300 + + inline val DYNNEW = FirstJSPrimitiveCode + 1 // Instantiate a new JavaScript object + + inline val ARR_CREATE = DYNNEW + 1 // js.Array.apply (array literal syntax) + + inline val TYPEOF = ARR_CREATE + 1 // typeof x + inline val JS_NATIVE = TYPEOF + 1 // js.native. Marker method. Fails if tried to be emitted. 
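The `-scalajs-mapSourceURI` rewriting in `JSPositions` above reduces to plain `java.net.URI` operations: `relativize` against the `from` URI (which returns its input unchanged when it cannot relativize, exactly the guard used in `convert`), then `resolve` against the optional `to` URI. A quick standalone check, with made-up paths:

```scala
import java.net.URI

// Standalone check of the URI rewriting behind `-scalajs-mapSourceURI from->to`.
object SourceURIMapDemo {
  def main(args: Array[String]): Unit = {
    val from   = new URI("file:/home/alice/project/src/")
    val to     = new URI("https://example.org/sources/")
    val srcURI = new URI("file:/home/alice/project/src/Foo.scala")

    val relURI = from.relativize(srcURI)
    assert(relURI != srcURI) // relativization succeeded, so the mapping applies
    println(to.resolve(relURI)) // https://example.org/sources/Foo.scala
  }
}
```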
+ + inline val UNITVAL = JS_NATIVE + 1 // () value, which is undefined + + inline val JS_NEW_TARGET = UNITVAL + 1 // js.new.target + + inline val JS_IMPORT = JS_NEW_TARGET + 1 // js.import.apply(specifier) + inline val JS_IMPORT_META = JS_IMPORT + 1 // js.import.meta + + inline val CONSTRUCTOROF = JS_IMPORT_META + 1 // runtime.constructorOf(clazz) + inline val CREATE_INNER_JS_CLASS = CONSTRUCTOROF + 1 // runtime.createInnerJSClass + inline val CREATE_LOCAL_JS_CLASS = CREATE_INNER_JS_CLASS + 1 // runtime.createLocalJSClass + inline val WITH_CONTEXTUAL_JS_CLASS_VALUE = CREATE_LOCAL_JS_CLASS + 1 // runtime.withContextualJSClassValue + inline val LINKING_INFO = WITH_CONTEXTUAL_JS_CLASS_VALUE + 1 // runtime.linkingInfo + inline val DYNAMIC_IMPORT = LINKING_INFO + 1 // runtime.dynamicImport + + inline val STRICT_EQ = DYNAMIC_IMPORT + 1 // js.special.strictEquals + inline val IN = STRICT_EQ + 1 // js.special.in + inline val INSTANCEOF = IN + 1 // js.special.instanceof + inline val DELETE = INSTANCEOF + 1 // js.special.delete + inline val FORIN = DELETE + 1 // js.special.forin + inline val JS_THROW = FORIN + 1 // js.special.throw + inline val JS_TRY_CATCH = JS_THROW + 1 // js.special.tryCatch + inline val WRAP_AS_THROWABLE = JS_TRY_CATCH + 1 // js.special.wrapAsThrowable + inline val UNWRAP_FROM_THROWABLE = WRAP_AS_THROWABLE + 1 // js.special.unwrapFromThrowable + inline val DEBUGGER = UNWRAP_FROM_THROWABLE + 1 // js.special.debugger + + inline val THROW = DEBUGGER + 1 + + inline val UNION_FROM = THROW + 1 // js.|.from + inline val UNION_FROM_TYPE_CONSTRUCTOR = UNION_FROM + 1 // js.|.fromTypeConstructor + + inline val REFLECT_SELECTABLE_SELECTDYN = UNION_FROM_TYPE_CONSTRUCTOR + 1 // scala.reflect.Selectable.selectDynamic + inline val REFLECT_SELECTABLE_APPLYDYN = REFLECT_SELECTABLE_SELECTDYN + 1 // scala.reflect.Selectable.applyDynamic + + inline val LastJSPrimitiveCode = REFLECT_SELECTABLE_APPLYDYN + + def isJSPrimitive(code: Int): Boolean = + code >= FirstJSPrimitiveCode && code <= LastJSPrimitiveCode + +} + +class JSPrimitives(ictx: DetachedContext) extends DottyPrimitives(ictx) { + import JSPrimitives._ + + private lazy val jsPrimitives: ReadOnlyMap[Symbol, Int] = initJSPrimitives(using ictx) + + override def getPrimitive(sym: Symbol): Int = + jsPrimitives.getOrElse(sym, super.getPrimitive(sym)) + + override def getPrimitive(app: Apply, tpe: Type)(using Context): Int = + jsPrimitives.getOrElse(app.fun.symbol, super.getPrimitive(app, tpe)) + + override def isPrimitive(sym: Symbol): Boolean = + jsPrimitives.contains(sym) || super.isPrimitive(sym) + + override def isPrimitive(fun: Tree): Boolean = + jsPrimitives.contains(fun.symbol(using ictx)) || super.isPrimitive(fun) + + /** Initialize the primitive map */ + private def initJSPrimitives(using Context): ReadOnlyMap[Symbol, Int] = { + + val primitives = MutableSymbolMap[Int]() + + // !!! 
Code duplicate with DottyPrimitives + /** Add a primitive operation to the map */ + def addPrimitive(s: Symbol, code: Int): Unit = { + assert(!(primitives contains s), "Duplicate primitive " + s) + primitives(s) = code + } + + def addPrimitives(cls: Symbol, method: TermName, code: Int)(using Context): Unit = { + val alts = cls.info.member(method).alternatives.map(_.symbol) + if (alts.isEmpty) { + report.error(em"Unknown primitive method $cls.$method") + } else { + for (s <- alts) + addPrimitive(s, code) + } + } + + val jsdefn = JSDefinitions.jsdefn + + addPrimitive(jsdefn.JSDynamic_newInstance, DYNNEW) + + addPrimitive(jsdefn.JSArray_apply, ARR_CREATE) + + addPrimitive(jsdefn.JSPackage_typeOf, TYPEOF) + addPrimitive(jsdefn.JSPackage_native, JS_NATIVE) + + addPrimitive(defn.BoxedUnit_UNIT, UNITVAL) + + addPrimitive(jsdefn.JSNew_target, JS_NEW_TARGET) + + addPrimitive(jsdefn.JSImport_apply, JS_IMPORT) + addPrimitive(jsdefn.JSImport_meta, JS_IMPORT_META) + + addPrimitive(jsdefn.Runtime_constructorOf, CONSTRUCTOROF) + addPrimitive(jsdefn.Runtime_createInnerJSClass, CREATE_INNER_JS_CLASS) + addPrimitive(jsdefn.Runtime_createLocalJSClass, CREATE_LOCAL_JS_CLASS) + addPrimitive(jsdefn.Runtime_withContextualJSClassValue, WITH_CONTEXTUAL_JS_CLASS_VALUE) + addPrimitive(jsdefn.Runtime_linkingInfo, LINKING_INFO) + addPrimitive(jsdefn.Runtime_dynamicImport, DYNAMIC_IMPORT) + + addPrimitive(jsdefn.Special_strictEquals, STRICT_EQ) + addPrimitive(jsdefn.Special_in, IN) + addPrimitive(jsdefn.Special_instanceof, INSTANCEOF) + addPrimitive(jsdefn.Special_delete, DELETE) + addPrimitive(jsdefn.Special_forin, FORIN) + addPrimitive(jsdefn.Special_throw, JS_THROW) + addPrimitive(jsdefn.Special_tryCatch, JS_TRY_CATCH) + addPrimitive(jsdefn.Special_wrapAsThrowable, WRAP_AS_THROWABLE) + addPrimitive(jsdefn.Special_unwrapFromThrowable, UNWRAP_FROM_THROWABLE) + addPrimitive(jsdefn.Special_debugger, DEBUGGER) + + addPrimitive(defn.throwMethod, THROW) + + addPrimitive(jsdefn.PseudoUnion_from, UNION_FROM) + addPrimitive(jsdefn.PseudoUnion_fromTypeConstructor, UNION_FROM_TYPE_CONSTRUCTOR) + + addPrimitive(jsdefn.ReflectSelectable_selectDynamic, REFLECT_SELECTABLE_SELECTDYN) + addPrimitive(jsdefn.ReflectSelectable_applyDynamic, REFLECT_SELECTABLE_APPLYDYN) + + primitives + } + +} diff --git a/tests/pos-with-compiler-cc/backend/sjs/ScopedVar.scala b/tests/pos-with-compiler-cc/backend/sjs/ScopedVar.scala new file mode 100644 index 000000000000..21462929833c --- /dev/null +++ b/tests/pos-with-compiler-cc/backend/sjs/ScopedVar.scala @@ -0,0 +1,38 @@ +package dotty.tools.backend.sjs + +class ScopedVar[A](init: A) extends caps.Pure { + import ScopedVar.Assignment + + private[ScopedVar] var value = init + + def this()(implicit ev: Null <:< A) = this(ev(null)) + + def get: A = value + def :=(newValue: A): Assignment[A] = new Assignment(this, newValue) +} + +object ScopedVar { + class Assignment[T](scVar: ScopedVar[T], value: T) { + private[ScopedVar] def push(): AssignmentStackElement[T] = { + val stack = new AssignmentStackElement(scVar, scVar.value) + scVar.value = value + stack + } + } + + private class AssignmentStackElement[T](scVar: ScopedVar[T], oldValue: T) { + private[ScopedVar] def pop(): Unit = { + scVar.value = oldValue + } + } + + implicit def toValue[T](scVar: ScopedVar[T]): T = scVar.get + + def withScopedVars[T](ass: Assignment[_]*)(body: => T): T = { + val stack = ass.map(_.push()) + try body + finally stack.reverse.foreach(_.pop()) + } + + final class VarBox[A](var value: A) +} diff --git 
a/tests/pos-with-compiler-cc/dotc/Bench.scala b/tests/pos-with-compiler-cc/dotc/Bench.scala new file mode 100644 index 000000000000..c9c032b0ae7d --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/Bench.scala @@ -0,0 +1,64 @@ +package dotty.tools +package dotc + +import core.Contexts._ +import reporting.Reporter +import io.AbstractFile + +import scala.annotation.internal.sharable + +/** A main class for running compiler benchmarks. Can instantiate a given + * number of compilers and run each (sequentially) a given number of times + * on the same sources. + */ +object Bench extends Driver: + + @sharable private var numRuns = 1 + + private def ntimes(n: Int)(op: => Reporter): Reporter = + (0 until n).foldLeft(emptyReporter)((_, _) => op) + + @sharable private var times: Array[Int] = _ + + override def doCompile(compiler: Compiler, files: List[AbstractFile])(using Context): Reporter = + times = new Array[Int](numRuns) + var reporter: Reporter = emptyReporter + for i <- 0 until numRuns do + val start = System.nanoTime() + reporter = super.doCompile(compiler, files) + times(i) = ((System.nanoTime - start) / 1000000).toInt + println(s"time elapsed: ${times(i)}ms") + if ctx.settings.Xprompt.value then + print("hit to continue >") + System.in.nn.read() + println() + reporter + + def extractNumArg(args: Array[String], name: String, default: Int = 1): (Int, Array[String]) = { + val pos = args indexOf name + if (pos < 0) (default, args) + else (args(pos + 1).toInt, (args take pos) ++ (args drop (pos + 2))) + } + + def reportTimes() = + val best = times.sorted + val measured = numRuns / 3 + val avgBest = best.take(measured).sum / measured + val avgLast = times.reverse.take(measured).sum / measured + println(s"best out of $numRuns runs: ${best(0)}") + println(s"average out of best $measured: $avgBest") + println(s"average out of last $measured: $avgLast") + + override def process(args: Array[String], rootCtx: Context): Reporter = + val (numCompilers, args1) = extractNumArg(args, "#compilers") + val (numRuns, args2) = extractNumArg(args1, "#runs") + this.numRuns = numRuns + var reporter: Reporter = emptyReporter + for i <- 0 until numCompilers do + reporter = super.process(args2, rootCtx) + reportTimes() + reporter + +end Bench + + diff --git a/tests/pos-with-compiler-cc/dotc/CompilationUnit.scala b/tests/pos-with-compiler-cc/dotc/CompilationUnit.scala new file mode 100644 index 000000000000..f70bda947129 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/CompilationUnit.scala @@ -0,0 +1,167 @@ +package dotty.tools +package dotc + +import core._ +import Contexts._ +import SymDenotations.ClassDenotation +import Symbols._ +import util.{FreshNameCreator, SourceFile, NoSource} +import util.Spans.Span +import ast.{tpd, untpd} +import tpd.{Tree, TreeTraverser} +import ast.Trees.{Import, Ident} +import typer.Nullables +import transform.SymUtils._ +import core.Decorators._ +import config.{SourceVersion, Feature} +import StdNames.nme +import scala.annotation.internal.sharable +import language.experimental.pureFunctions + +class CompilationUnit protected (val source: SourceFile) { + + override def toString: String = source.toString + + var untpdTree: untpd.Tree = untpd.EmptyTree + + var tpdTree: tpd.Tree = tpd.EmptyTree + + /** Is this the compilation unit of a Java file */ + def isJava: Boolean = source.file.name.endsWith(".java") + + /** The source version for this unit, as determined by a language import */ + var sourceVersion: Option[SourceVersion] = None + + /** Pickled TASTY binaries, indexed by class. 
*/ + var pickled: Map[ClassSymbol, () -> Array[Byte]] = Map() + + /** The fresh name creator for the current unit. + * FIXME(#7661): This is not fine-grained enough to enable reproducible builds, + * see https://github.com/scala/scala/commit/f50ec3c866263448d803139e119b33afb04ec2bc + */ + val freshNames: FreshNameCreator = new FreshNameCreator.Default + + /** Will be set to `true` if there are inline call that must be inlined after typer. + * The information is used in phase `Inlining` in order to avoid traversing trees that need no transformations. + */ + var needsInlining: Boolean = false + + /** Set to `true` if inliner added anonymous mirrors that need to be completed */ + var needsMirrorSupport: Boolean = false + + /** Will be set to `true` if contains `Quote`. + * The information is used in phase `Staging`/`Splicing`/`PickleQuotes` in order to avoid traversing trees that need no transformations. + */ + var needsStaging: Boolean = false + + /** Will be set to true if the unit contains a captureChecking language import */ + var needsCaptureChecking: Boolean = false + + /** Will be set to true if the unit contains a pureFunctions language import */ + var knowsPureFuns: Boolean = false + + var suspended: Boolean = false + var suspendedAtInliningPhase: Boolean = false + + /** Can this compilation unit be suspended */ + def isSuspendable: Boolean = true + + /** Suspends the compilation unit by thowing a SuspendException + * and recording the suspended compilation unit + */ + def suspend()(using Context): Nothing = + assert(isSuspendable) + if !suspended then + if (ctx.settings.XprintSuspension.value) + report.echo(i"suspended: $this") + suspended = true + ctx.run.nn.suspendedUnits += this + if ctx.phase == Phases.inliningPhase then + suspendedAtInliningPhase = true + throw CompilationUnit.SuspendException() + + private var myAssignmentSpans: Map[Int, List[Span]] | Null = null + + /** A map from (name-) offsets of all local variables in this compilation unit + * that can be tracked for being not null to the list of spans of assignments + * to these variables. + */ + def assignmentSpans(using Context): Map[Int, List[Span]] = + if myAssignmentSpans == null then myAssignmentSpans = Nullables.assignmentSpans + myAssignmentSpans.nn +} + +@sharable object NoCompilationUnit extends CompilationUnit(NoSource) { + + override def isJava: Boolean = false + + override def suspend()(using Context): Nothing = + throw CompilationUnit.SuspendException() + + override def assignmentSpans(using Context): Map[Int, List[Span]] = Map.empty +} + +object CompilationUnit { + + class SuspendException extends Exception + + /** Make a compilation unit for top class `clsd` with the contents of the `unpickled` tree */ + def apply(clsd: ClassDenotation, unpickled: Tree, forceTrees: Boolean)(using Context): CompilationUnit = + val file = clsd.symbol.associatedFile.nn + apply(SourceFile(file, Array.empty[Char]), unpickled, forceTrees) + + /** Make a compilation unit, given picked bytes and unpickled tree */ + def apply(source: SourceFile, unpickled: Tree, forceTrees: Boolean)(using Context): CompilationUnit = { + assert(!unpickled.isEmpty, unpickled) + val unit1 = new CompilationUnit(source) + unit1.tpdTree = unpickled + if (forceTrees) { + val force = new Force + force.traverse(unit1.tpdTree) + unit1.needsStaging = force.containsQuote + unit1.needsInlining = force.containsInline + } + unit1 + } + + /** Create a compilation unit corresponding to `source`. 
+ * If `mustExist` is true, this will fail if `source` does not exist. + */ + def apply(source: SourceFile, mustExist: Boolean = true)(using Context): CompilationUnit = { + val src = + if (!mustExist) + source + else if (source.file.isDirectory) { + report.error(em"expected file, received directory '${source.file.path}'") + NoSource + } + else if (!source.file.exists) { + report.error(em"source file not found: ${source.file.path}") + NoSource + } + else source + new CompilationUnit(src) + } + + /** Force the tree to be loaded */ + private class Force extends TreeTraverser { + var containsQuote = false + var containsInline = false + var containsCaptureChecking = false + def traverse(tree: Tree)(using Context): Unit = { + if (tree.symbol.isQuote) + containsQuote = true + if tree.symbol.is(Flags.Inline) then + containsInline = true + tree match + case Import(qual, selectors) => + tpd.languageImport(qual) match + case Some(prefix) => + for case untpd.ImportSelector(untpd.Ident(imported), untpd.EmptyTree, _) <- selectors do + Feature.handleGlobalLanguageImport(prefix, imported) + case _ => + case _ => + traverseChildren(tree) + } + } +} diff --git a/tests/pos-with-compiler-cc/dotc/Compiler.scala b/tests/pos-with-compiler-cc/dotc/Compiler.scala new file mode 100644 index 000000000000..b121a47781e1 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/Compiler.scala @@ -0,0 +1,171 @@ +package dotty.tools +package dotc + +import core._ +import Contexts._ +import typer.{TyperPhase, RefChecks} +import cc.CheckCaptures +import parsing.Parser +import Phases.Phase +import transform._ +import dotty.tools.backend +import backend.jvm.{CollectSuperCalls, GenBCode} +import localopt.StringInterpolatorOpt + +/** The central class of the dotc compiler. The job of a compiler is to create + * runs, which process given `phases` in a given `rootContext`. + */ +class Compiler { + + /** Meta-ordering constraint: + * + * DenotTransformers that change the signature of their denotation's info must go + * after erasure. The reason is that denotations are permanently referred to by + * TermRefs which contain a signature. If the signature of a symbol would change, + * all refs to it would become outdated - they could not be dereferenced in the + * new phase. + * + * After erasure, signature changing denot-transformers are OK because signatures + * are never recomputed later than erasure. 
+ */ + def phases: List[List[Phase]] = + frontendPhases ::: picklerPhases ::: transformPhases ::: backendPhases + + /** Phases dealing with the frontend up to trees ready for TASTY pickling */ + protected def frontendPhases: List[List[Phase]] = + List(new Parser) :: // Compiler frontend: scanner, parser + List(new TyperPhase) :: // Compiler frontend: namer, typer + List(new YCheckPositions) :: // YCheck positions + List(new sbt.ExtractDependencies) :: // Sends information on classes' dependencies to sbt via callbacks + List(new semanticdb.ExtractSemanticDB) :: // Extract info into .semanticdb files + List(new PostTyper) :: // Additional checks and cleanups after type checking + List(new sjs.PrepJSInterop) :: // Additional checks and transformations for Scala.js (Scala.js only) + List(new sbt.ExtractAPI) :: // Sends a representation of the API of classes to sbt via callbacks + List(new SetRootTree) :: // Set the `rootTreeOrProvider` on class symbols + Nil + + /** Phases dealing with TASTY tree pickling and unpickling */ + protected def picklerPhases: List[List[Phase]] = + List(new Pickler) :: // Generate TASTY info + List(new Inlining) :: // Inline and execute macros + List(new PostInlining) :: // Add mirror support for inlined code + List(new Staging) :: // Check staging levels and heal staged types + List(new Splicing) :: // Replace level 1 splices with holes + List(new PickleQuotes) :: // Turn quoted trees into explicit run-time data structures + Nil + + /** Phases dealing with the transformation from pickled trees to backend trees */ + protected def transformPhases: List[List[Phase]] = + List(new InstrumentCoverage) :: // Perform instrumentation for code coverage (if -coverage-out is set) + List(new FirstTransform, // Some transformations to put trees into a canonical form + new CheckReentrant, // Internal use only: Check that compiled program has no data races involving global vars + new ElimPackagePrefixes, // Eliminate references to package prefixes in Select nodes + new CookComments, // Cook the comments: expand variables, doc, etc. 
+ new CheckStatic, // Check restrictions that apply to @static members + new CheckLoopingImplicits, // Check that implicit defs do not call themselves in an infinite loop + new BetaReduce, // Reduce closure applications + new InlineVals, // Check right hand-sides of an `inline val`s + new ExpandSAMs, // Expand single abstract method closures to anonymous classes + new ElimRepeated, // Rewrite vararg parameters and arguments + new RefChecks) :: // Various checks mostly related to abstract members and overriding + List(new init.Checker) :: // Check initialization of objects + List(new CrossVersionChecks, // Check issues related to deprecated and experimental + new ProtectedAccessors, // Add accessors for protected members + new ExtensionMethods, // Expand methods of value classes with extension methods + new UncacheGivenAliases, // Avoid caching RHS of simple parameterless given aliases + new ElimByName, // Map by-name parameters to functions + new HoistSuperArgs, // Hoist complex arguments of supercalls to enclosing scope + new ForwardDepChecks, // Check that there are no forward references to local vals + new SpecializeApplyMethods, // Adds specialized methods to FunctionN + new TryCatchPatterns, // Compile cases in try/catch + new PatternMatcher) :: // Compile pattern matches + List(new TestRecheck.Pre) :: // Test only: run rechecker, enabled under -Yrecheck-test + List(new TestRecheck) :: // Test only: run rechecker, enabled under -Yrecheck-test + List(new CheckCaptures.Pre) :: // Preparations for check captures phase, enabled under captureChecking + List(new CheckCaptures) :: // Check captures, enabled under captureChecking + List(new ElimOpaque, // Turn opaque into normal aliases + new sjs.ExplicitJSClasses, // Make all JS classes explicit (Scala.js only) + new ExplicitOuter, // Add accessors to outer classes from nested ones. + new ExplicitSelf, // Make references to non-trivial self types explicit as casts + new StringInterpolatorOpt) :: // Optimizes raw and s and f string interpolators by rewriting them to string concatenations or formats + List(new PruneErasedDefs, // Drop erased definitions from scopes and simplify erased expressions + new UninitializedDefs, // Replaces `compiletime.uninitialized` by `_` + new InlinePatterns, // Remove placeholders of inlined patterns + new VCInlineMethods, // Inlines calls to value class methods + new SeqLiterals, // Express vararg arguments as arrays + new InterceptedMethods, // Special handling of `==`, `|=`, `getClass` methods + new Getters, // Replace non-private vals and vars with getter defs (fields are added later) + new SpecializeFunctions, // Specialized Function{0,1,2} by replacing super with specialized super + new SpecializeTuples, // Specializes Tuples by replacing tuple construction and selection trees + new LiftTry, // Put try expressions that might execute on non-empty stacks into their own methods + new CollectNullableFields, // Collect fields that can be nulled out after use in lazy initialization + new ElimOuterSelect, // Expand outer selections + new ResolveSuper, // Implement super accessors + new FunctionXXLForwarders, // Add forwarders for FunctionXXL apply method + new ParamForwarding, // Add forwarders for aliases of superclass parameters + new TupleOptimizations, // Optimize generic operations on tuples + new LetOverApply, // Lift blocks from receivers of applications + new ArrayConstructors) :: // Intercept creation of (non-generic) arrays and intrinsify. 
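Because `frontendPhases`, `picklerPhases`, `transformPhases` and `backendPhases` are protected methods returning `List[List[Phase]]`, a downstream compiler can splice an extra phase into one of these groups by overriding it. The sketch below is not part of this patch: `DemoPhase` is invented for illustration and assumes the usual `Phase` interface (`phaseName`, `run`).

```scala
import dotty.tools.dotc.Compiler
import dotty.tools.dotc.core.Contexts.Context
import dotty.tools.dotc.core.Phases.Phase

// A no-op phase; a real phase would inspect or transform the compilation unit in `run`.
class DemoPhase extends Phase {
  def phaseName: String = "demoPhase"
  def run(using Context): Unit = ()
}

// Appends the demo phase as its own (unfused) group at the end of the transform pipeline.
class DemoCompiler extends Compiler {
  override protected def transformPhases: List[List[Phase]] =
    super.transformPhases :+ List(new DemoPhase)
}
```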
+ List(new Erasure) :: // Rewrite types to JVM model, erasing all type parameters, abstract types and refinements. + List(new ElimErasedValueType, // Expand erased value types to their underlying implmementation types + new PureStats, // Remove pure stats from blocks + new VCElideAllocations, // Peep-hole optimization to eliminate unnecessary value class allocations + new EtaReduce, // Reduce eta expansions of pure paths to the underlying function reference + new ArrayApply, // Optimize `scala.Array.apply([....])` and `scala.Array.apply(..., [....])` into `[...]` + new sjs.AddLocalJSFakeNews, // Adds fake new invocations to local JS classes in calls to `createLocalJSClass` + new ElimPolyFunction, // Rewrite PolyFunction subclasses to FunctionN subclasses + new TailRec, // Rewrite tail recursion to loops + new CompleteJavaEnums, // Fill in constructors for Java enums + new Mixin, // Expand trait fields and trait initializers + new LazyVals, // Expand lazy vals + new Memoize, // Add private fields to getters and setters + new NonLocalReturns, // Expand non-local returns + new CapturedVars) :: // Represent vars captured by closures as heap objects + List(new Constructors, // Collect initialization code in primary constructors + // Note: constructors changes decls in transformTemplate, no InfoTransformers should be added after it + new Instrumentation) :: // Count calls and allocations under -Yinstrument + List(new LambdaLift, // Lifts out nested functions to class scope, storing free variables in environments + // Note: in this mini-phase block scopes are incorrect. No phases that rely on scopes should be here + new ElimStaticThis, // Replace `this` references to static objects by global identifiers + new CountOuterAccesses) :: // Identify outer accessors that can be dropped + List(new DropOuterAccessors, // Drop unused outer accessors + new CheckNoSuperThis, // Check that supercalls don't contain references to `this` + new Flatten, // Lift all inner classes to package scope + new TransformWildcards, // Replace wildcards with default values + new MoveStatics, // Move static methods from companion to the class itself + new ExpandPrivate, // Widen private definitions accessed from nested classes + new RestoreScopes, // Repair scopes rendered invalid by moving definitions in prior phases of the group + new SelectStatic, // get rid of selects that would be compiled into GetStatic + new sjs.JUnitBootstrappers, // Generate JUnit-specific bootstrapper classes for Scala.js (not enabled by default) + new CollectEntryPoints, // Collect all entry points and save them in the context + new CollectSuperCalls, // Find classes that are called with super + new RepeatableAnnotations) :: // Aggregate repeatable annotations + Nil + + /** Generate the output of the compilation */ + protected def backendPhases: List[List[Phase]] = + List(new backend.sjs.GenSJSIR) :: // Generate .sjsir files for Scala.js (not enabled by default) + List(new GenBCode) :: // Generate JVM bytecode + Nil + + var runId: Int = 1 + def nextRunId: Int = { + runId += 1; runId + } + + def reset()(using Context): Unit = { + ctx.base.reset() + val run = ctx.run + if (run != null) run.reset() + } + + def newRun(using Context): Run = { + reset() + val rctx = + if ctx.settings.Xsemanticdb.value then + ctx.addMode(Mode.ReadPositions) + else + ctx + new Run(this, rctx) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/Driver.scala b/tests/pos-with-compiler-cc/dotc/Driver.scala new file mode 100644 index 000000000000..b85f1365243b --- /dev/null +++ 
b/tests/pos-with-compiler-cc/dotc/Driver.scala @@ -0,0 +1,207 @@ +package dotty.tools.dotc + +import dotty.tools.FatalError +import config.CompilerCommand +import core.Comments.{ContextDoc, ContextDocstrings} +import core.Contexts._ +import core.{MacroClassLoader, TypeError} +import dotty.tools.dotc.ast.Positioned +import dotty.tools.io.AbstractFile +import reporting._ +import core.Decorators._ +import config.Feature + +import scala.util.control.NonFatal +import fromtasty.{TASTYCompiler, TastyFileUtil} + +/** Run the Dotty compiler. + * + * Extending this class lets you customize many aspect of the compilation + * process, but in most cases you only need to call [[process]] on the + * existing object [[Main]]. + */ +class Driver { + + protected def newCompiler(using Context): Compiler = + if (ctx.settings.fromTasty.value) new TASTYCompiler + else new Compiler + + protected def emptyReporter: Reporter = new StoreReporter(null) + + protected def doCompile(compiler: Compiler, files: List[AbstractFile])(using Context): Reporter = + if files.nonEmpty then + try + val run = compiler.newRun + run.compile(files) + finish(compiler, run) + catch + case ex: FatalError => + report.error(ex.getMessage.nn) // signals that we should fail compilation. + case ex: TypeError => + println(s"${ex.toMessage} while compiling ${files.map(_.path).mkString(", ")}") + throw ex + case ex: Throwable => + println(s"$ex while compiling ${files.map(_.path).mkString(", ")}") + throw ex + ctx.reporter + + protected def finish(compiler: Compiler, run: Run)(using Context): Unit = + run.printSummary() + if !ctx.reporter.errorsReported && run.suspendedUnits.nonEmpty then + val suspendedUnits = run.suspendedUnits.toList + if (ctx.settings.XprintSuspension.value) + report.echo(i"compiling suspended $suspendedUnits%, %") + val run1 = compiler.newRun + for unit <- suspendedUnits do unit.suspended = false + run1.compileUnits(suspendedUnits) + finish(compiler, run1)(using MacroClassLoader.init(ctx.fresh)) + + protected def initCtx: Context = (new ContextBase).initialCtx + + protected def sourcesRequired: Boolean = true + + protected def command: CompilerCommand = ScalacCommand + + /** Setup context with initialized settings from CLI arguments, then check if there are any settings that + * would change the default behaviour of the compiler. + * + * @return If there is no setting like `-help` preventing us from continuing compilation, + * this method returns a list of files to compile and an updated Context. + * If compilation should be interrupted, this method returns None. 
+ */ + def setup(args: Array[String], rootCtx: Context): Option[(List[AbstractFile], DetachedContext)] = { + val ictx = rootCtx.fresh + val summary = command.distill(args, ictx.settings)(ictx.settingsState)(using ictx) + ictx.setSettings(summary.sstate) + Feature.checkExperimentalSettings(using ictx) + MacroClassLoader.init(ictx) + Positioned.init(using ictx) + + inContext(ictx) { + if !ctx.settings.YdropComments.value || ctx.settings.YreadComments.value then + ictx.setProperty(ContextDoc, new ContextDocstrings) + val fileNamesOrNone = command.checkUsage(summary, sourcesRequired)(using ctx.settings)(using ctx.settingsState) + fileNamesOrNone.map { fileNames => + val files = fileNames.map(ctx.getFile) + (files, fromTastySetup(files).detach) + } + } + } + + /** Setup extra classpath of tasty and jar files */ + protected def fromTastySetup(files: List[AbstractFile])(using Context): Context = + if ctx.settings.fromTasty.value then + val newEntries: List[String] = files + .flatMap { file => + if !file.exists then + report.error(em"File does not exist: ${file.path}") + None + else file.extension match + case "jar" => Some(file.path) + case "tasty" => + TastyFileUtil.getClassPath(file) match + case Some(classpath) => Some(classpath) + case _ => + report.error(em"Could not load classname from: ${file.path}") + None + case _ => + report.error(em"File extension is not `tasty` or `jar`: ${file.path}") + None + } + .distinct + val ctx1 = ctx.fresh + val fullClassPath = + (newEntries :+ ctx.settings.classpath.value).mkString(java.io.File.pathSeparator.nn) + ctx1.setSetting(ctx1.settings.classpath, fullClassPath) + else ctx + + /** Entry point to the compiler that can be conveniently used with Java reflection. + * + * This entry point can easily be used without depending on the `dotty` package, + * you only need to depend on `dotty-interfaces` and call this method using + * reflection. This allows you to write code that will work against multiple + * versions of dotty without recompilation. + * + * The trade-off is that you can only pass a SimpleReporter to this method + * and not a normal Reporter which is more powerful. + * + * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/InterfaceEntryPointTest.scala]] + * + * @param args Arguments to pass to the compiler. + * @param simple Used to log errors, warnings, and info messages. + * The default reporter is used if this is `null`. + * @param callback Used to execute custom code during the compilation + * process. No callbacks will be executed if this is `null`. + * @return + */ + final def process(args: Array[String], simple: interfaces.SimpleReporter | Null, + callback: interfaces.CompilerCallback | Null): interfaces.ReporterResult = { + val reporter = if (simple == null) null else Reporter.fromSimpleReporter(simple) + process(args, reporter, callback) + } + + /** Principal entry point to the compiler. + * + * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] + * in method `runCompiler` + * + * @param args Arguments to pass to the compiler. + * @param reporter Used to log errors, warnings, and info messages. + * The default reporter is used if this is `null`. + * @param callback Used to execute custom code during the compilation + * process. No callbacks will be executed if this is `null`. + * @return The `Reporter` used. Use `Reporter#hasErrors` to check + * if compilation succeeded. 
+ */ + final def process(args: Array[String], reporter: Reporter | Null = null, + callback: interfaces.CompilerCallback | Null = null): Reporter = { + val compileCtx = initCtx.fresh + if (reporter != null) + compileCtx.setReporter(reporter) + if (callback != null) + compileCtx.setCompilerCallback(callback) + process(args, compileCtx) + } + + /** Entry point to the compiler with no optional arguments. + * + * This overload is provided for compatibility reasons: the + * `RawCompiler` of sbt expects this method to exist and calls + * it using reflection. Keeping it means that we can change + * the other overloads without worrying about breaking compatibility + * with sbt. + */ + final def process(args: Array[String]): Reporter = + process(args, null: Reporter | Null, null: interfaces.CompilerCallback | Null) + + /** Entry point to the compiler using a custom `Context`. + * + * In most cases, you do not need a custom `Context` and should + * instead use one of the other overloads of `process`. However, + * the other overloads cannot be overridden, instead you + * should override this one which they call internally. + * + * Usage example: [[https://github.com/lampepfl/dotty/tree/master/compiler/test/dotty/tools/dotc/EntryPointsTest.scala.disabled]] + * in method `runCompilerWithContext` + * + * @param args Arguments to pass to the compiler. + * @param rootCtx The root Context to use. + * @return The `Reporter` used. Use `Reporter#hasErrors` to check + * if compilation succeeded. + */ + def process(args: Array[String], rootCtx: Context): Reporter = { + setup(args, rootCtx) match + case Some((files, compileCtx)) => + doCompile(newCompiler(using compileCtx), files)(using compileCtx) + case None => + rootCtx.reporter + } + + def main(args: Array[String]): Unit = { + // Preload scala.util.control.NonFatal. Otherwise, when trying to catch a StackOverflowError, + // we may try to load it but fail with another StackOverflowError and lose the original exception, + // see . + val _ = NonFatal + sys.exit(if (process(args).hasErrors) 1 else 0) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/Main.scala b/tests/pos-with-compiler-cc/dotc/Main.scala new file mode 100644 index 000000000000..3288fded52a2 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/Main.scala @@ -0,0 +1,5 @@ +package dotty.tools +package dotc + +/** Main class of the `dotc` batch compiler. */ +object Main extends Driver diff --git a/tests/pos-with-compiler-cc/dotc/MissingCoreLibraryException.scala b/tests/pos-with-compiler-cc/dotc/MissingCoreLibraryException.scala new file mode 100644 index 000000000000..ae20d81226c9 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/MissingCoreLibraryException.scala @@ -0,0 +1,9 @@ +package dotty.tools.dotc + +import dotty.tools.FatalError + +class MissingCoreLibraryException(rootPackage: String) extends FatalError( + s"""Could not find package $rootPackage from compiler core libraries. + |Make sure the compiler core libraries are on the classpath. + """.stripMargin +) diff --git a/tests/pos-with-compiler-cc/dotc/Resident.scala b/tests/pos-with-compiler-cc/dotc/Resident.scala new file mode 100644 index 000000000000..9ebeaaaeb1c2 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/Resident.scala @@ -0,0 +1,61 @@ +package dotty.tools +package dotc + +import core.Contexts._ +import reporting.Reporter +import java.io.EOFException +import scala.annotation.tailrec + +/** A compiler which stays resident between runs. 
This is more of a PoC than + * something that's expected to be used often + * + * Usage: + * + * > scala dotty.tools.dotc.Resident + * + * dotc> "more options and files to compile" + * + * ... + * + * dotc> :reset // reset all options to the ones passed on the command line + * + * ... + * + * dotc> :q // quit + */ +class Resident extends Driver { + + object residentCompiler extends Compiler + + override def sourcesRequired: Boolean = false + + private val quit = ":q" + private val reset = ":reset" + private val prompt = "dotc> " + + private def getLine() = { + Console.print(prompt) + try scala.io.StdIn.readLine() catch { case _: EOFException => quit } + } + + final override def process(args: Array[String], rootCtx: Context): Reporter = { + @tailrec def loop(args: Array[String], prevCtx: Context): Reporter = { + setup(args, prevCtx) match + case Some((files, ctx)) => + inContext(ctx) { + doCompile(residentCompiler, files) + } + var nextCtx: DetachedContext = ctx + var line = getLine() + while (line == reset) { + nextCtx = rootCtx.detach + line = getLine() + } + if line.startsWith(quit) then ctx.reporter + else loop((line split "\\s+").asInstanceOf[Array[String]], nextCtx) + case None => + prevCtx.reporter + } + loop(args, rootCtx) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/Run.scala b/tests/pos-with-compiler-cc/dotc/Run.scala new file mode 100644 index 000000000000..16a955afca1a --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/Run.scala @@ -0,0 +1,403 @@ +package dotty.tools +package dotc + +import core._ +import Contexts._ +import Periods._ +import Symbols._ +import Scopes._ +import Names.Name +import Denotations.Denotation +import typer.Typer +import typer.ImportInfo.withRootImports +import Decorators._ +import io.AbstractFile +import Phases.unfusedPhases + +import util._ +import reporting.{Suppression, Action, Profile, ActiveProfile, NoProfile} +import reporting.Diagnostic +import reporting.Diagnostic.Warning +import rewrites.Rewrites +import profile.Profiler +import printing.XprintMode +import typer.ImplicitRunInfo +import config.Feature +import StdNames.nme + +import java.io.{BufferedWriter, OutputStreamWriter} +import java.nio.charset.StandardCharsets + +import scala.collection.mutable +import scala.util.control.NonFatal +import scala.io.Codec +import annotation.constructorOnly +import caps.unsafe.unsafeUnbox + +/** A compiler run. Exports various methods to compile source files */ +class Run(comp: Compiler, @constructorOnly ictx0: Context) extends ImplicitRunInfo with ConstraintRunInfo { + + val ictx = ictx0.detach + + /** Default timeout to stop looking for further implicit suggestions, in ms. + * This is usually for the first import suggestion; subsequent suggestions + * may get smaller timeouts. @see ImportSuggestions.reduceTimeBudget + */ + private var myImportSuggestionBudget: Int = + Int.MinValue // sentinel value; means whatever is set in command line option + + def importSuggestionBudget = + if myImportSuggestionBudget == Int.MinValue then ictx.settings.XimportSuggestionTimeout.value + else myImportSuggestionBudget + + def importSuggestionBudget_=(x: Int) = + myImportSuggestionBudget = x + + /** If this variable is set to `true`, some core typer operations will + * return immediately. Currently these early abort operations are + * `Typer.typed` and `Implicits.typedImplicit`. 
+ */ + @volatile var isCancelled = false + + private var compiling = false + + private var myUnits: List[CompilationUnit] = Nil + private var myUnitsCached: List[CompilationUnit] = Nil + private var myFiles: Set[AbstractFile] = _ + + // `@nowarn` annotations by source file, populated during typer + private val mySuppressions: mutable.LinkedHashMap[SourceFile, mutable.ListBuffer[Suppression]] = mutable.LinkedHashMap.empty + // source files whose `@nowarn` annotations are processed + private val mySuppressionsComplete: mutable.Set[SourceFile] = mutable.Set.empty + // warnings issued before a source file's `@nowarn` annotations are processed, suspended so that `@nowarn` can filter them + private val mySuspendedMessages: mutable.LinkedHashMap[SourceFile, mutable.LinkedHashSet[Warning]] = mutable.LinkedHashMap.empty + + object suppressions: + // When the REPL creates a new run (ReplDriver.compile), parsing is already done in the old context, with the + // previous Run. Parser warnings were suspended in the old run and need to be copied over so they are not lost. + // Same as scala/scala/commit/79ca1408c7. + def initSuspendedMessages(oldRun: Run | Null) = if oldRun != null then + mySuspendedMessages.clear() + mySuspendedMessages ++= oldRun.mySuspendedMessages + + def suppressionsComplete(source: SourceFile) = source == NoSource || mySuppressionsComplete(source) + + def addSuspendedMessage(warning: Warning) = + mySuspendedMessages.getOrElseUpdate(warning.pos.source, mutable.LinkedHashSet.empty) += warning + + def nowarnAction(dia: Diagnostic): Action.Warning.type | Action.Verbose.type | Action.Silent.type = + mySuppressions.getOrElse(dia.pos.source, Nil).find(_.matches(dia)) match { + case Some(s) => + s.markUsed() + if (s.verbose) Action.Verbose + else Action.Silent + case _ => + Action.Warning + } + + def addSuppression(sup: Suppression): Unit = + val source = sup.annotPos.source + mySuppressions.getOrElseUpdate(source, mutable.ListBuffer.empty) += sup + + def reportSuspendedMessages(source: SourceFile)(using Context): Unit = { + // sort suppressions. they are not added in any particular order because of lazy type completion + for (sups <- mySuppressions.get(source)) + mySuppressions(source) = sups.sortBy(sup => 0 - sup.start) + mySuppressionsComplete += source + mySuspendedMessages.remove(source).foreach(_.foreach(ctx.reporter.issueIfNotSuppressed)) + } + + def runFinished(hasErrors: Boolean): Unit = + // report suspended messages (in case the run finished before typer) + mySuspendedMessages.keysIterator.toList.foreach(reportSuspendedMessages) + // report unused nowarns only if all all phases are done + if !hasErrors && ctx.settings.WunusedHas.nowarn then + for { + source <- mySuppressions.keysIterator.toList + sups <- mySuppressions.remove(source) + sup <- sups.reverse + } if (!sup.used) + report.warning("@nowarn annotation does not suppress any warnings", sup.annotPos) + + /** The compilation units currently being compiled, this may return different + * results over time. + */ + def units: List[CompilationUnit] = myUnits + + private def units_=(us: List[CompilationUnit]): Unit = + myUnits = us + + var suspendedUnits: mutable.ListBuffer[CompilationUnit] = mutable.ListBuffer() + + def checkSuspendedUnits(newUnits: List[CompilationUnit])(using Context): Unit = + if newUnits.isEmpty && suspendedUnits.nonEmpty && !ctx.reporter.errorsReported then + val where = + if suspendedUnits.size == 1 then i"in ${suspendedUnits.head}." 
+ else i"""among + | + | ${suspendedUnits.toList}%, % + |""" + val enableXprintSuspensionHint = + if ctx.settings.XprintSuspension.value then "" + else "\n\nCompiling with -Xprint-suspension gives more information." + report.error(em"""Cyclic macro dependencies $where + |Compilation stopped since no further progress can be made. + | + |To fix this, place macros in one set of files and their callers in another.$enableXprintSuspensionHint""") + + /** The files currently being compiled (active or suspended). + * This may return different results over time. + * These files do not have to be source files since it's possible to compile + * from TASTY. + */ + def files: Set[AbstractFile] = { + if (myUnits ne myUnitsCached) { + myUnitsCached = myUnits + myFiles = (myUnits ++ suspendedUnits).map(_.source.file).toSet + } + myFiles + } + + /** The source files of all late entered symbols, as a set */ + private var lateFiles = mutable.Set[AbstractFile]() + + /** A cache for static references to packages and classes */ + val staticRefs = util.EqHashMap[Name, Denotation](initialCapacity = 1024) + + /** Actions that need to be performed at the end of the current compilation run */ + private var finalizeActions = mutable.ListBuffer[() => Unit]() + + /** Will be set to true if any of the compiled compilation units contains + * a pureFunctions language import. + */ + var pureFunsImportEncountered = false + + /** Will be set to true if any of the compiled compilation units contains + * a captureChecking language import. + */ + var ccImportEncountered = false + + def compile(files: List[AbstractFile]): Unit = + try + val codec = Codec(runContext.settings.encoding.value) + val sources = files.map(runContext.getSource(_, codec)) + compileSources(sources) + catch + case NonFatal(ex) => + if units.nonEmpty then report.echo(i"exception occurred while compiling $units%, %") + else report.echo(s"exception occurred while compiling ${files.map(_.name).mkString(", ")}") + throw ex + + /** TODO: There's a fundamental design problem here: We assemble phases using `fusePhases` + * when we first build the compiler. But we modify them with -Yskip, -Ystop + * on each run. That modification needs to either transform the tree structure, + * or we need to assemble phases on each run, and take -Yskip, -Ystop into + * account. I think the latter would be preferable. 
+ */ + def compileSources(sources: List[SourceFile]): Unit = + if (sources forall (_.exists)) { + units = sources.map(CompilationUnit(_)) + compileUnits() + } + + + def compileUnits(us: List[CompilationUnit]): Unit = { + units = us + compileUnits() + } + + def compileUnits(us: List[CompilationUnit], ctx: Context): Unit = { + units = us + compileUnits()(using ctx) + } + + var profile: Profile = NoProfile + + private def compileUnits()(using Context) = Stats.maybeMonitored { + if (!ctx.mode.is(Mode.Interactive)) // IDEs might have multi-threaded access, accesses are synchronized + ctx.base.checkSingleThreaded() + + compiling = true + + profile = + if ctx.settings.Vprofile.value + || !ctx.settings.VprofileSortedBy.value.isEmpty + || ctx.settings.VprofileDetails.value != 0 + then ActiveProfile(ctx.settings.VprofileDetails.value.max(0).min(1000)) + else NoProfile + + // If testing pickler, make sure to stop after pickling phase: + val stopAfter = + if (ctx.settings.YtestPickler.value) List("pickler") + else ctx.settings.YstopAfter.value + + val pluginPlan = ctx.base.addPluginPhases(ctx.base.phasePlan) + val phases = ctx.base.fusePhases(pluginPlan, + ctx.settings.Yskip.value, ctx.settings.YstopBefore.value, stopAfter, ctx.settings.Ycheck.value) + ctx.base.usePhases(phases) + + def runPhases(using Context) = { + var lastPrintedTree: PrintedTree = NoPrintedTree + val profiler = ctx.profiler + var phasesWereAdjusted = false + + for (phase <- ctx.base.allPhases) + if (phase.isRunnable) + Stats.trackTime(s"$phase ms ") { + val start = System.currentTimeMillis + val profileBefore = profiler.beforePhase(phase) + units = phase.runOn(units) + profiler.afterPhase(phase, profileBefore) + if (ctx.settings.Xprint.value.containsPhase(phase)) + for (unit <- units) + lastPrintedTree = + printTree(lastPrintedTree)(using ctx.fresh.setPhase(phase.next).setCompilationUnit(unit)) + report.informTime(s"$phase ", start) + Stats.record(s"total trees at end of $phase", ast.Trees.ntrees) + for (unit <- units) + Stats.record(s"retained typed trees at end of $phase", unit.tpdTree.treeSize) + ctx.typerState.gc() + } + if !phasesWereAdjusted then + phasesWereAdjusted = true + if !Feature.ccEnabledSomewhere then + ctx.base.unlinkPhaseAsDenotTransformer(Phases.checkCapturesPhase.prev) + ctx.base.unlinkPhaseAsDenotTransformer(Phases.checkCapturesPhase) + + profiler.finished() + } + + val runCtx = ctx.fresh + runCtx.setProfiler(Profiler()) + unfusedPhases.foreach(_.initContext(runCtx)) + runPhases(using runCtx) + if (!ctx.reporter.hasErrors) + Rewrites.writeBack() + suppressions.runFinished(hasErrors = ctx.reporter.hasErrors) + while (finalizeActions.nonEmpty) { + val action = finalizeActions.remove(0).unsafeUnbox + action() + } + compiling = false + } + + /** Enter top-level definitions of classes and objects contained in source file `file`. + * The newly added symbols replace any previously entered symbols. + * If `typeCheck = true`, also run typer on the compilation unit, and set + * `rootTreeOrProvider`. 
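+   *
+   *  Illustrative call (a sketch; assumes an `AbstractFile` and a suitable
+   *  `Context` are already in scope, which is not shown here):
+   *
+   *    run.lateCompile(file, typeCheck = true)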
+ */ + def lateCompile(file: AbstractFile, typeCheck: Boolean)(using Context): Unit = + if (!files.contains(file) && !lateFiles.contains(file)) { + lateFiles += file + + val codec = Codec(ctx.settings.encoding.value) + val unit = CompilationUnit(ctx.getSource(file, codec)) + val unitCtx = runContext.fresh + .setCompilationUnit(unit) + .withRootImports + + def process()(using Context) = + ctx.typer.lateEnterUnit(doTypeCheck => + if typeCheck then + if compiling then finalizeActions += doTypeCheck + else doTypeCheck() + ) + + process()(using unitCtx) + } + + private sealed trait PrintedTree + private /*final*/ case class SomePrintedTree(phase: String, tree: String) extends PrintedTree + private object NoPrintedTree extends PrintedTree + + private def printTree(last: PrintedTree)(using Context): PrintedTree = { + val unit = ctx.compilationUnit + val fusedPhase = ctx.phase.prevMega + val echoHeader = f"[[syntax trees at end of $fusedPhase%25s]] // ${unit.source}" + val tree = if ctx.isAfterTyper then unit.tpdTree else unit.untpdTree + val treeString = fusedPhase.show(tree) + + last match { + case SomePrintedTree(phase, lastTreeString) if lastTreeString == treeString => + report.echo(s"$echoHeader: unchanged since $phase") + last + + case SomePrintedTree(phase, lastTreeString) if ctx.settings.XprintDiff.value || ctx.settings.XprintDiffDel.value => + val diff = DiffUtil.mkColoredCodeDiff(treeString, lastTreeString, ctx.settings.XprintDiffDel.value) + report.echo(s"$echoHeader\n$diff\n") + SomePrintedTree(fusedPhase.phaseName, treeString) + + case _ => + report.echo(s"$echoHeader\n$treeString\n") + SomePrintedTree(fusedPhase.phaseName, treeString) + } + } + + def compileFromStrings(scalaSources: List[String], javaSources: List[String] = Nil): Unit = { + def sourceFile(source: String, isJava: Boolean): SourceFile = { + val uuid = java.util.UUID.randomUUID().toString + val ext = if (isJava) "java" else "scala" + val name = s"compileFromString-$uuid.$ext" + SourceFile.virtual(name, source) + } + val sources = + scalaSources.map(sourceFile(_, isJava = false)) ++ + javaSources.map(sourceFile(_, isJava = true)) + + compileSources(sources) + } + + /** Print summary of warnings and errors encountered */ + def printSummary(): Unit = { + printMaxConstraint() + val r = runContext.reporter + if !r.errorsReported then + profile.printSummary() + r.summarizeUnreportedWarnings() + r.printSummary() + } + + override def reset(): Unit = { + super[ImplicitRunInfo].reset() + super[ConstraintRunInfo].reset() + myCtx = null + myUnits = Nil + myUnitsCached = Nil + } + + /** Produces the following contexts, from outermost to innermost + * + * bootStrap: A context with next available runId and a scope consisting of + * the RootPackage _root_ + * start A context with RootClass as owner and the necessary initializations + * for type checking. 
+ * imports For each element of RootImports, an import context + */ + protected def rootContext(using Context): DetachedContext = { + ctx.initialize() + ctx.base.setPhasePlan(comp.phases) + val rootScope = new MutableScope(0) + val bootstrap = ctx.fresh + .setPeriod(Period(comp.nextRunId, FirstPhaseId)) + .setScope(rootScope) + rootScope.enter(ctx.definitions.RootPackage)(using bootstrap) + var start = bootstrap.fresh + .setOwner(defn.RootClass) + .setTyper(new Typer) + .addMode(Mode.ImplicitsEnabled) + .setTyperState(ctx.typerState.fresh(ctx.reporter)) + if ctx.settings.YexplicitNulls.value && !Feature.enabledBySetting(nme.unsafeNulls) then + start = start.addMode(Mode.SafeNulls) + ctx.initialize()(using start) // re-initialize the base context with start + + // `this` must be unchecked for safe initialization because by being passed to setRun during + // initialization, it is not yet considered fully initialized by the initialization checker + start.setRun(this: @unchecked).detach + } + + private var myCtx: DetachedContext | Null = rootContext(using ictx) + + /** The context created for this run */ + given runContext[Dummy_so_its_a_def]: DetachedContext = myCtx.nn + assert(runContext.runId <= Periods.MaxPossibleRunId) +} diff --git a/tests/pos-with-compiler-cc/dotc/ScalacCommand.scala b/tests/pos-with-compiler-cc/dotc/ScalacCommand.scala new file mode 100644 index 000000000000..2e0d9a08f25d --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ScalacCommand.scala @@ -0,0 +1,9 @@ +package dotty.tools.dotc + +import config.Properties._ +import config.CompilerCommand + +object ScalacCommand extends CompilerCommand: + override def cmdName: String = "scalac" + override def versionMsg: String = s"Scala compiler $versionString -- $copyrightString" + override def ifErrorsMsg: String = " scalac -help gives more information" diff --git a/tests/pos-with-compiler-cc/dotc/ast/CheckTrees.scala.disabled b/tests/pos-with-compiler-cc/dotc/ast/CheckTrees.scala.disabled new file mode 100644 index 000000000000..6bf7530faf24 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/CheckTrees.scala.disabled @@ -0,0 +1,258 @@ +package dotty.tools +package dotc +package ast + +import core._ +import util.Spans._, Types._, Contexts._, Constants._, Names._, Flags._ +import SymDenotations._, Symbols._, StdNames._, Annotations._, Trees._ + +// TODO: revise, integrate in a checking phase. 
+object CheckTrees { + + import tpd._ + + def check(p: Boolean, msg: => String = "")(using Context): Unit = assert(p, msg) + + def checkTypeArg(arg: Tree, bounds: TypeBounds)(using Context): Unit = { + check(arg.isValueType) + check(bounds contains arg.tpe) + } + + def escapingRefs(block: Block)(using Context): collection.Set[NamedType] = { + var hoisted: Set[Symbol] = Set() + lazy val locals = ctx.typeAssigner.localSyms(block.stats).toSet + def isLocal(sym: Symbol): Boolean = + (locals contains sym) && !isHoistableClass(sym) + def isHoistableClass(sym: Symbol) = + sym.isClass && { + (hoisted contains sym) || { + hoisted += sym + !classLeaks(sym.asClass) + } + } + def leakingTypes(tp: Type): collection.Set[NamedType] = + tp namedPartsWith (tp => isLocal(tp.symbol)) + def typeLeaks(tp: Type): Boolean = leakingTypes(tp).nonEmpty + def classLeaks(sym: ClassSymbol): Boolean = + (ctx.owner is Method) || // can't hoist classes out of method bodies + (sym.info.parents exists typeLeaks) || + (sym.decls.toList exists (t => typeLeaks(t.info))) + leakingTypes(block.tpe) + } + + def checkType(tree: Tree)(using Context): Unit = tree match { + case Ident(name) => + case Select(qualifier, name) => + check(qualifier.isValue) + check(qualifier.tpe =:= tree.tpe.normalizedPrefix) + val denot = qualifier.tpe.member(name) + check(denot.exists) + check(denot.hasAltWith(_.symbol == tree.symbol)) + case This(cls) => + case Super(qual, mixin) => + check(qual.isValue) + val cls = qual.tpe.typeSymbol + check(cls.isClass) + case Apply(fn, args) => + def checkArg(arg: Tree, name: Name, formal: Type): Unit = { + arg match { + case NamedArg(argName, _) => + check(argName == name) + case _ => + check(arg.isValue) + } + check(arg.tpe <:< formal) + } + val MethodType(paramNames, paramTypes) = fn.tpe.widen // checked already at construction + args.lazyZip(paramNames).lazyZip(paramTypes) foreach checkArg + case TypeApply(fn, args) => + val pt @ PolyType(_) = fn.tpe.widen // checked already at construction + args.lazyZip(pt.instantiateBounds(args map (_.tpe))) foreach checkTypeArg + case Literal(const: Constant) => + case New(tpt) => + check(tpt.isValueType) + val cls = tpt.tpe.typeSymbol + check(cls.isClass) + check(!(cls is AbstractOrTrait)) + case Pair(left, right) => + check(left.isValue) + check(right.isValue) + case Typed(expr, tpt) => + check(tpt.isValueType) + expr.tpe.widen match { + case tp: MethodType => + val cls = tpt.tpe.typeSymbol + check(cls.isClass) + check((cls is Trait) || + cls.primaryConstructor.info.paramTypess.flatten.isEmpty) + val absMembers = tpt.tpe.abstractTermMembers + check(absMembers.size == 1) + check(tp <:< absMembers.head.info) + case _ => + check(expr.isValueOrPattern) + check(expr.tpe <:< tpt.tpe.translateParameterized(defn.RepeatedParamClass, defn.SeqClass)) + } + case NamedArg(name, arg) => + case Assign(lhs, rhs) => + check(lhs.isValue); check(rhs.isValue) + lhs.tpe match { + case ltpe: TermRef => + check(ltpe.symbol is Mutable) + case _ => + check(false) + } + check(rhs.tpe <:< lhs.tpe.widen) + case tree @ Block(stats, expr) => + check(expr.isValue) + check(escapingRefs(tree).isEmpty) + case If(cond, thenp, elsep) => + check(cond.isValue); check(thenp.isValue); check(elsep.isValue) + check(cond.tpe isRef defn.BooleanClass) + case Closure(env, meth, target) => + meth.tpe.widen match { + case mt @ MethodType(_, paramTypes) => + if (target.isEmpty) { + check(env.length < paramTypes.length) + for ((arg, formal) <- env zip paramTypes) + check(arg.tpe <:< formal) + } + else + // env is stored 
in class, not method + target.tpe match { + case SAMType(targetMeth) => + check(mt <:< targetMeth.info) + } + } + case Match(selector, cases) => + check(selector.isValue) + // are any checks that relate selector and patterns desirable? + case CaseDef(pat, guard, body) => + check(pat.isValueOrPattern); check(guard.isValue); check(body.isValue) + check(guard.tpe.derivesFrom(defn.BooleanClass)) + case Return(expr, from) => + check(expr.isValue); check(from.isTerm) + check(from.tpe.termSymbol.isRealMethod) + case Try(block, handler, finalizer) => + check(block.isTerm) + check(finalizer.isTerm) + check(handler.isTerm) + check(handler.tpe derivesFrom defn.FunctionClass(1)) + check(handler.tpe.baseArgInfos(defn.FunctionClass(1)).head <:< defn.ThrowableType) + case Throw(expr) => + check(expr.isValue) + check(expr.tpe.derivesFrom(defn.ThrowableClass)) + case SeqLiteral(elems) => + val elemtp = tree.tpe.elemType + for (elem <- elems) { + check(elem.isValue) + check(elem.tpe <:< elemtp) + } + case TypeTree(original) => + if (!original.isEmpty) { + check(original.isValueType) + check(original.tpe == tree.tpe) + } + case SingletonTypeTree(ref) => + check(ref.isValue) + check(ref.symbol.isStable) + case SelectFromTypeTree(qualifier, name) => + check(qualifier.isValueType) + check(qualifier.tpe =:= tree.tpe.normalizedPrefix) + val denot = qualifier.tpe.member(name) + check(denot.exists) + check(denot.symbol == tree.symbol) + case AndTypeTree(left, right) => + check(left.isValueType); check(right.isValueType) + case OrTypeTree(left, right) => + check(left.isValueType); check(right.isValueType) + case RefinedTypeTree(tpt, refinements) => + check(tpt.isValueType) + def checkRefinements(forbidden: Set[Symbol], rs: List[Tree]): Unit = rs match { + case r :: rs1 => + val rsym = r.symbol + check(rsym.isTerm || rsym.isAbstractOrAliasType) + if (rsym.isAbstractType) check(tpt.tpe.member(rsym.name).exists) + check(rsym.info forallParts { + case nt: NamedType => !(forbidden contains nt.symbol) + case _ => true + }) + checkRefinements(forbidden - rsym, rs1) + case nil => + } + checkRefinements(ctx.typeAssigner.localSyms(refinements).toSet, refinements) + case AppliedTypeTree(tpt, args) => + check(tpt.isValueType) + val tparams = tpt.tpe.typeParams + check(sameLength(tparams, args)) + args.lazyZip(tparams map (_.info.bounds)) foreach checkTypeArg + case TypeBoundsTree(lo, hi) => + check(lo.isValueType); check(hi.isValueType) + check(lo.tpe <:< hi.tpe) + case Bind(sym, body) => + check(body.isValueOrPattern) + check(!(tree.symbol is Method)) + body match { + case Ident(nme.WILDCARD) => + case _ => check(body.tpe.widen =:= tree.symbol.info) + } + case Alternative(alts) => + for (alt <- alts) check(alt.isValueOrPattern) + case UnApply(fun, implicits, args) => // todo: review + check(fun.isTerm) + for (arg <- args) check(arg.isValueOrPattern) + val funtpe @ MethodType(_, _) = fun.tpe.widen + fun.symbol.name match { // check arg arity + case nme.unapplySeq => + // args need to be wrapped in (...: _*) + check(args.length == 1) + check(args.head.isInstanceOf[SeqLiteral]) + case nme.unapply => + val rtp = funtpe.resultType + if (rtp isRef defn.BooleanClass) + check(args.isEmpty) + else { + check(rtp isRef defn.OptionClass) + val normArgs = rtp.argTypesHi match { + case optionArg :: Nil => + optionArg.argTypesHi match { + case Nil => + optionArg :: Nil + case tupleArgs if defn.isTupleNType(optionArg) => + tupleArgs + } + case _ => + check(false) + Nil + } + check(sameLength(normArgs, args)) + } + } + case ValDef(mods, name, 
tpt, rhs) => + check(!(tree.symbol is Method)) + if (!rhs.isEmpty) { + check(rhs.isValue) + check(rhs.tpe <:< tpt.tpe) + } + case DefDef(mods, name, tparams, vparamss, tpt, rhs) => + check(tree.symbol is Method) + if (!rhs.isEmpty) { + check(rhs.isValue) + check(rhs.tpe <:< tpt.tpe) + } + case TypeDef(mods, name, tpt) => + check(tpt.isInstanceOf[Template] || tpt.tpe.isInstanceOf[TypeBounds]) + case Template(constr, parents, selfType, body) => + case Import(expr, selectors) => + check(expr.isValue) + check(expr.tpe.termSymbol.isStable) + case PackageDef(pid, stats) => + check(pid.isTerm) + check(pid.symbol is Package) + case Annotated(annot, arg) => + check(annot.isInstantiation) + check(annot.symbol.owner.isSubClass(defn.AnnotationClass)) + check(arg.isValueType || arg.isValue) + case EmptyTree => + } +} + diff --git a/tests/pos-with-compiler-cc/dotc/ast/Desugar.scala b/tests/pos-with-compiler-cc/dotc/ast/Desugar.scala new file mode 100644 index 000000000000..390e58d89245 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/Desugar.scala @@ -0,0 +1,1979 @@ +package dotty.tools +package dotc +package ast + +import core._ +import util.Spans._, Types._, Contexts._, Constants._, Names._, NameOps._, Flags._ +import Symbols._, StdNames._, Trees._, ContextOps._ +import Decorators._, transform.SymUtils._ +import Annotations.Annotation +import NameKinds.{UniqueName, EvidenceParamName, DefaultGetterName, WildcardParamName} +import typer.{Namer, Checking} +import util.{Property, SourceFile, SourcePosition, Chars} +import config.Feature.{sourceVersion, migrateTo3, enabled} +import config.SourceVersion._ +import collection.mutable.ListBuffer +import reporting._ +import annotation.constructorOnly +import printing.Formatting.hl +import config.Printers + +import scala.annotation.internal.sharable + +object desugar { + import untpd._ + import DesugarEnums._ + + /** An attachment for companion modules of classes that have a `derives` clause. + * The position value indicates the start position of the template of the + * deriving class. + */ + val DerivingCompanion: Property.Key[SourcePosition] = Property.Key() + + /** An attachment for match expressions generated from a PatDef or GenFrom. + * Value of key == one of IrrefutablePatDef, IrrefutableGenFrom + */ + val CheckIrrefutable: Property.Key[MatchCheck] = Property.StickyKey() + + /** A multi-line infix operation with the infix operator starting a new line. + * Used for explaining potential errors. + */ + val MultiLineInfix: Property.Key[Unit] = Property.StickyKey() + + /** An attachment key to indicate that a ValDef originated from parameter untupling. + */ + val UntupledParam: Property.Key[Unit] = Property.StickyKey() + + /** What static check should be applied to a Match? */ + enum MatchCheck { + case None, Exhaustive, IrrefutablePatDef, IrrefutableGenFrom + } + + /** Is `name` the name of a method that can be invalidated as a compiler-generated + * case class method if it clashes with a user-defined method? + */ + def isRetractableCaseClassMethodName(name: Name)(using Context): Boolean = name match { + case nme.apply | nme.unapply | nme.unapplySeq | nme.copy => true + case DefaultGetterName(nme.copy, _) => true + case _ => false + } + + /** Is `name` the name of a method that is added unconditionally to case classes? 
*/ + def isDesugaredCaseClassMethodName(name: Name)(using Context): Boolean = + isRetractableCaseClassMethodName(name) || name.isSelectorName + +// ----- DerivedTypeTrees ----------------------------------- + + class SetterParamTree(implicit @constructorOnly src: SourceFile) extends DerivedTypeTree { + def derivedTree(sym: Symbol)(using Context): tpd.TypeTree = tpd.TypeTree(sym.info.resultType) + } + + class TypeRefTree(implicit @constructorOnly src: SourceFile) extends DerivedTypeTree { + def derivedTree(sym: Symbol)(using Context): tpd.TypeTree = tpd.TypeTree(sym.typeRef) + } + + class TermRefTree(implicit @constructorOnly src: SourceFile) extends DerivedTypeTree { + def derivedTree(sym: Symbol)(using Context): tpd.Tree = tpd.ref(sym) + } + + /** A type tree that computes its type from an existing parameter. */ + class DerivedFromParamTree()(implicit @constructorOnly src: SourceFile) extends DerivedTypeTree { + + /** Complete the appropriate constructors so that OriginalSymbol attachments are + * pushed to DerivedTypeTrees. + */ + override def ensureCompletions(using Context): Unit = { + def completeConstructor(sym: Symbol) = + sym.infoOrCompleter match { + case completer: Namer#ClassCompleter => + completer.completeConstructor(sym) + case _ => + } + + if (!ctx.owner.is(Package)) + if (ctx.owner.isClass) { + completeConstructor(ctx.owner) + if (ctx.owner.is(ModuleClass)) + completeConstructor(ctx.owner.linkedClass) + } + else ensureCompletions(using ctx.outer) + } + + /** Return info of original symbol, where all references to siblings of the + * original symbol (i.e. sibling and original symbol have the same owner) + * are rewired to same-named parameters or accessors in the scope enclosing + * the current scope. The current scope is the scope owned by the defined symbol + * itself, that's why we have to look one scope further out. If the resulting + * type is an alias type, dealias it. This is necessary because the + * accessor of a type parameter is a private type alias that cannot be accessed + * from subclasses. 
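+   *
+   *  Schematic example (for illustration only): for
+   *
+   *    class C[T](val x: T)
+   *
+   *  the tpt of the generated parameter accessor `x` is derived from the
+   *  constructor parameter `x`, and its reference to the constructor's `T`
+   *  is rewired to the same-named type parameter accessor of class `C`.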
+ */ + def derivedTree(sym: Symbol)(using Context): tpd.TypeTree = { + val dctx = ctx.detach + val relocate = new TypeMap(using dctx) { + val originalOwner = sym.owner + def apply(tp: Type) = tp match { + case tp: NamedType if tp.symbol.exists && (tp.symbol.owner eq originalOwner) => + val defctx = mapCtx.detach.outersIterator.dropWhile(_.scope eq mapCtx.scope).next() + var local = defctx.denotNamed(tp.name).suchThat(_.isParamOrAccessor).symbol + if (local.exists) (defctx.owner.thisType select local).dealiasKeepAnnots + else { + def msg = + em"no matching symbol for ${tp.symbol.showLocated} in ${defctx.owner} / ${defctx.effectiveScope.toList}" + ErrorType(msg).assertingErrorsReported(msg) + } + case _ => + mapOver(tp) + } + } + tpd.TypeTree(relocate(sym.info)) + } + } + + /** A type definition copied from `tdef` with a rhs typetree derived from it */ + def derivedTypeParam(tdef: TypeDef)(using Context): TypeDef = + cpy.TypeDef(tdef)( + rhs = DerivedFromParamTree().withSpan(tdef.rhs.span).watching(tdef) + ) + + /** A derived type definition watching `sym` */ + def derivedTypeParamWithVariance(sym: TypeSymbol)(using Context): TypeDef = + val variance = VarianceFlags & sym.flags + TypeDef(sym.name, DerivedFromParamTree().watching(sym)).withFlags(TypeParam | Synthetic | variance) + + /** A value definition copied from `vdef` with a tpt typetree derived from it */ + def derivedTermParam(vdef: ValDef)(using Context): ValDef = + cpy.ValDef(vdef)( + tpt = DerivedFromParamTree().withSpan(vdef.tpt.span).watching(vdef)) + +// ----- Desugar methods ------------------------------------------------- + + /** Setter generation is needed for: + * - non-private class members + * - all trait members + * - all package object members + */ + def isSetterNeeded(valDef: ValDef)(using Context): Boolean = { + val mods = valDef.mods + mods.is(Mutable) + && ctx.owner.isClass + && (!mods.is(Private) || ctx.owner.is(Trait) || ctx.owner.isPackageObject) + } + + /** var x: Int = expr + * ==> + * def x: Int = expr + * def x_=($1: ): Unit = () + * + * Generate setter where needed + */ + def valDef(vdef0: ValDef)(using Context): Tree = + val vdef @ ValDef(_, tpt, rhs) = vdef0 + val valName = normalizeName(vdef, tpt).asTermName + var mods1 = vdef.mods + + def dropInto(tpt: Tree): Tree = tpt match + case Into(tpt1) => + mods1 = vdef.mods.withAddedAnnotation( + TypedSplice( + Annotation(defn.AllowConversionsAnnot).tree.withSpan(tpt.span.startPos))) + tpt1 + case ByNameTypeTree(tpt1) => + cpy.ByNameTypeTree(tpt)(dropInto(tpt1)) + case PostfixOp(tpt1, op) if op.name == tpnme.raw.STAR => + cpy.PostfixOp(tpt)(dropInto(tpt1), op) + case _ => + tpt + + val vdef1 = cpy.ValDef(vdef)(name = valName, tpt = dropInto(tpt)) + .withMods(mods1) + + if isSetterNeeded(vdef) then + val setterParam = makeSyntheticParameter(tpt = SetterParamTree().watching(vdef)) + // The rhs gets filled in later, when field is generated and getter has parameters (see Memoize miniphase) + val setterRhs = if (vdef.rhs.isEmpty) EmptyTree else unitLiteral + val setter = cpy.DefDef(vdef)( + name = valName.setterName, + paramss = (setterParam :: Nil) :: Nil, + tpt = TypeTree(defn.UnitType), + rhs = setterRhs + ).withMods((vdef.mods | Accessor) &~ (CaseAccessor | GivenOrImplicit | Lazy)) + .dropEndMarker() // the end marker should only appear on the getter definition + Thicket(vdef1, setter) + else vdef1 + end valDef + + def makeImplicitParameters(tpts: List[Tree], implicitFlag: FlagSet, forPrimaryConstructor: Boolean = false)(using Context): List[ValDef] = + for (tpt 
<- tpts) yield { + val paramFlags: FlagSet = if (forPrimaryConstructor) LocalParamAccessor else Param + val epname = EvidenceParamName.fresh() + ValDef(epname, tpt, EmptyTree).withFlags(paramFlags | implicitFlag) + } + + def mapParamss(paramss: List[ParamClause]) + (mapTypeParam: TypeDef => TypeDef) + (mapTermParam: ValDef => ValDef)(using Context): List[ParamClause] = + paramss.mapConserve { + case TypeDefs(tparams) => tparams.mapConserve(mapTypeParam) + case ValDefs(vparams) => vparams.mapConserve(mapTermParam) + case _ => unreachable() + } + + /** 1. Expand context bounds to evidence params. E.g., + * + * def f[T >: L <: H : B](params) + * ==> + * def f[T >: L <: H](params)(implicit evidence$0: B[T]) + * + * 2. Expand default arguments to default getters. E.g, + * + * def f[T: B](x: Int = 1)(y: String = x + "m") = ... + * ==> + * def f[T](x: Int)(y: String)(implicit evidence$0: B[T]) = ... + * def f$default$1[T] = 1 + * def f$default$2[T](x: Int) = x + "m" + */ + private def defDef(meth: DefDef, isPrimaryConstructor: Boolean = false)(using Context): Tree = + addDefaultGetters(elimContextBounds(meth, isPrimaryConstructor)) + + private def elimContextBounds(meth: DefDef, isPrimaryConstructor: Boolean)(using Context): DefDef = + val DefDef(_, paramss, tpt, rhs) = meth + val evidenceParamBuf = ListBuffer[ValDef]() + + def desugarContextBounds(rhs: Tree): Tree = rhs match + case ContextBounds(tbounds, cxbounds) => + val iflag = if sourceVersion.isAtLeast(`future`) then Given else Implicit + evidenceParamBuf ++= makeImplicitParameters( + cxbounds, iflag, forPrimaryConstructor = isPrimaryConstructor) + tbounds + case LambdaTypeTree(tparams, body) => + cpy.LambdaTypeTree(rhs)(tparams, desugarContextBounds(body)) + case _ => + rhs + + val paramssNoContextBounds = + mapParamss(paramss) { + tparam => cpy.TypeDef(tparam)(rhs = desugarContextBounds(tparam.rhs)) + }(identity) + + rhs match + case MacroTree(call) => + cpy.DefDef(meth)(rhs = call).withMods(meth.mods | Macro | Erased) + case _ => + addEvidenceParams( + cpy.DefDef(meth)( + name = normalizeName(meth, tpt).asTermName, + paramss = paramssNoContextBounds), + evidenceParamBuf.toList) + end elimContextBounds + + def addDefaultGetters(meth: DefDef)(using Context): Tree = + + /** The longest prefix of parameter lists in paramss whose total number of + * ValDefs does not exceed `n` + */ + def takeUpTo(paramss: List[ParamClause], n: Int): List[ParamClause] = paramss match + case ValDefs(vparams) :: paramss1 => + val len = vparams.length + if len <= n then vparams :: takeUpTo(paramss1, n - len) else Nil + case TypeDefs(tparams) :: paramss1 => + tparams :: takeUpTo(paramss1, n) + case _ => + Nil + + def dropContextBounds(tparam: TypeDef): TypeDef = + def dropInRhs(rhs: Tree): Tree = rhs match + case ContextBounds(tbounds, _) => + tbounds + case rhs @ LambdaTypeTree(tparams, body) => + cpy.LambdaTypeTree(rhs)(tparams, dropInRhs(body)) + case _ => + rhs + cpy.TypeDef(tparam)(rhs = dropInRhs(tparam.rhs)) + + def paramssNoRHS = mapParamss(meth.paramss)(identity) { + vparam => + if vparam.rhs.isEmpty then vparam + else cpy.ValDef(vparam)(rhs = EmptyTree).withMods(vparam.mods | HasDefault) + } + + def getterParamss(n: Int): List[ParamClause] = + mapParamss(takeUpTo(paramssNoRHS, n)) { + tparam => dropContextBounds(toDefParam(tparam, keepAnnotations = true)) + } { + vparam => toDefParam(vparam, keepAnnotations = true, keepDefault = false) + } + + def defaultGetters(paramss: List[ParamClause], n: Int): List[DefDef] = paramss match + case ValDefs(vparam :: 
vparams) :: paramss1 => + def defaultGetter: DefDef = + DefDef( + name = DefaultGetterName(meth.name, n), + paramss = getterParamss(n), + tpt = TypeTree(), + rhs = vparam.rhs + ) + .withMods(Modifiers( + meth.mods.flags & (AccessFlags | Synthetic) | (vparam.mods.flags & Inline), + meth.mods.privateWithin)) + val rest = defaultGetters(vparams :: paramss1, n + 1) + if vparam.rhs.isEmpty then rest else defaultGetter :: rest + case _ :: paramss1 => // skip empty parameter lists and type parameters + defaultGetters(paramss1, n) + case Nil => + Nil + + val defGetters = defaultGetters(meth.paramss, 0) + if defGetters.isEmpty then meth + else Thicket(cpy.DefDef(meth)(paramss = paramssNoRHS) :: defGetters) + end addDefaultGetters + + /** Add an explicit ascription to the `expectedTpt` to every tail splice. + * + * - `'{ x }` -> `'{ x }` + * - `'{ $x }` -> `'{ $x: T }` + * - `'{ if (...) $x else $y }` -> `'{ if (...) ($x: T) else ($y: T) }` + * + * Note that the splice `$t: T` will be typed as `${t: Expr[T]}` + */ + def quotedPattern(tree: untpd.Tree, expectedTpt: untpd.Tree)(using Context): untpd.Tree = { + def adaptToExpectedTpt(tree: untpd.Tree): untpd.Tree = tree match { + // Add the expected type as an ascription + case _: untpd.Splice => + untpd.Typed(tree, expectedTpt).withSpan(tree.span) + case Typed(expr: untpd.Splice, tpt) => + cpy.Typed(tree)(expr, untpd.makeAndType(tpt, expectedTpt).withSpan(tpt.span)) + + // Propagate down the expected type to the leafs of the expression + case Block(stats, expr) => + cpy.Block(tree)(stats, adaptToExpectedTpt(expr)) + case If(cond, thenp, elsep) => + cpy.If(tree)(cond, adaptToExpectedTpt(thenp), adaptToExpectedTpt(elsep)) + case untpd.Parens(expr) => + cpy.Parens(tree)(adaptToExpectedTpt(expr)) + case Match(selector, cases) => + val newCases = cases.map(cdef => cpy.CaseDef(cdef)(body = adaptToExpectedTpt(cdef.body))) + cpy.Match(tree)(selector, newCases) + case untpd.ParsedTry(expr, handler, finalizer) => + cpy.ParsedTry(tree)(adaptToExpectedTpt(expr), adaptToExpectedTpt(handler), finalizer) + + // Tree does not need to be ascribed + case _ => + tree + } + adaptToExpectedTpt(tree) + } + + /** Add all evidence parameters in `params` as implicit parameters to `meth`. + * If the parameters of `meth` end in an implicit parameter list or using clause, + * evidence parameters are added in front of that list. Otherwise they are added + * as a separate parameter clause. 
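+   *
+   *  Schematically (using the `==>` convention of this file; names are
+   *  illustrative):
+   *
+   *    def f(x: Int)(implicit y: Y)   with evidence param   evidence$0: B[T]
+   *    ==>
+   *    def f(x: Int)(implicit evidence$0: B[T], y: Y)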
+ */ + private def addEvidenceParams(meth: DefDef, params: List[ValDef])(using Context): DefDef = + params match + case Nil => + meth + case evidenceParams => + val paramss1 = meth.paramss.reverse match + case ValDefs(vparams @ (vparam :: _)) :: rparamss if vparam.mods.isOneOf(GivenOrImplicit) => + ((evidenceParams ++ vparams) :: rparamss).reverse + case _ => + meth.paramss :+ evidenceParams + cpy.DefDef(meth)(paramss = paramss1) + + /** The implicit evidence parameters of `meth`, as generated by `desugar.defDef` */ + private def evidenceParams(meth: DefDef)(using Context): List[ValDef] = + meth.paramss.reverse match { + case ValDefs(vparams @ (vparam :: _)) :: _ if vparam.mods.isOneOf(GivenOrImplicit) => + vparams.takeWhile(_.name.is(EvidenceParamName)) + case _ => + Nil + } + + @sharable private val synthetic = Modifiers(Synthetic) + + private def toDefParam(tparam: TypeDef, keepAnnotations: Boolean): TypeDef = { + var mods = tparam.rawMods + if (!keepAnnotations) mods = mods.withAnnotations(Nil) + tparam.withMods(mods & EmptyFlags | Param) + } + private def toDefParam(vparam: ValDef, keepAnnotations: Boolean, keepDefault: Boolean): ValDef = { + var mods = vparam.rawMods + if (!keepAnnotations) mods = mods.withAnnotations(Nil) + val hasDefault = if keepDefault then HasDefault else EmptyFlags + vparam.withMods(mods & (GivenOrImplicit | Erased | hasDefault) | Param) + } + + def mkApply(fn: Tree, paramss: List[ParamClause])(using Context): Tree = + paramss.foldLeft(fn) { (fn, params) => params match + case TypeDefs(params) => + TypeApply(fn, params.map(refOfDef)) + case (vparam: ValDef) :: _ if vparam.mods.is(Given) => + Apply(fn, params.map(refOfDef)).setApplyKind(ApplyKind.Using) + case _ => + Apply(fn, params.map(refOfDef)) + } + + /** The expansion of a class definition. See inline comments for what is involved */ + def classDef(cdef: TypeDef)(using Context): Tree = { + val impl @ Template(constr0, _, self, _) = cdef.rhs: @unchecked + val className = normalizeName(cdef, impl).asTypeName + val parents = impl.parents + val mods = cdef.mods + val companionMods = mods + .withFlags((mods.flags & (AccessFlags | Final)).toCommonFlags) + .withMods(Nil) + .withAnnotations(Nil) + + var defaultGetters: List[Tree] = Nil + + def decompose(ddef: Tree): DefDef = ddef match { + case meth: DefDef => meth + case Thicket((meth: DefDef) :: defaults) => + defaultGetters = defaults + meth + } + + val constr1 = decompose(defDef(impl.constr, isPrimaryConstructor = true)) + + // The original type and value parameters in the constructor already have the flags + // needed to be type members (i.e. param, and possibly also private and local unless + // prefixed by type or val). `tparams` and `vparamss` are the type parameters that + // go in `constr`, the constructor after desugaring. + + /** Does `tree' look like a reference to AnyVal? 
Temporary test before we have inline classes */ + def isAnyVal(tree: Tree): Boolean = tree match { + case Ident(tpnme.AnyVal) => true + case Select(qual, tpnme.AnyVal) => isScala(qual) + case _ => false + } + def isScala(tree: Tree): Boolean = tree match { + case Ident(nme.scala) => true + case Select(Ident(nme.ROOTPKG), nme.scala) => true + case _ => false + } + + def namePos = cdef.sourcePos.withSpan(cdef.nameSpan) + + val isObject = mods.is(Module) + val isCaseClass = mods.is(Case) && !isObject + val isCaseObject = mods.is(Case) && isObject + val isEnum = mods.isEnumClass && !mods.is(Module) + def isEnumCase = mods.isEnumCase + def isNonEnumCase = !isEnumCase && (isCaseClass || isCaseObject) + val isValueClass = parents.nonEmpty && isAnyVal(parents.head) + // This is not watertight, but `extends AnyVal` will be replaced by `inline` later. + + val originalTparams = constr1.leadingTypeParams + val originalVparamss = asTermOnly(constr1.trailingParamss) + lazy val derivedEnumParams = enumClass.typeParams.map(derivedTypeParamWithVariance) + val impliedTparams = + if (isEnumCase) { + val tparamReferenced = typeParamIsReferenced( + enumClass.typeParams, originalTparams, originalVparamss, parents) + if (originalTparams.isEmpty && (parents.isEmpty || tparamReferenced)) + derivedEnumParams.map(tdef => tdef.withFlags(tdef.mods.flags | PrivateLocal)) + else originalTparams + } + else originalTparams + + if mods.is(Trait) then + for vparams <- originalVparamss; vparam <- vparams do + if isByNameType(vparam.tpt) then + report.error(em"implementation restriction: traits cannot have by name parameters", vparam.srcPos) + + // Annotations on class _type_ parameters are set on the derived parameters + // but not on the constructor parameters. The reverse is true for + // annotations on class _value_ parameters. + val constrTparams = impliedTparams.map(toDefParam(_, keepAnnotations = false)) + val constrVparamss = + if (originalVparamss.isEmpty) { // ensure parameter list is non-empty + if (isCaseClass) + report.error(CaseClassMissingParamList(cdef), namePos) + ListOfNil + } + else if (isCaseClass && originalVparamss.head.exists(_.mods.isOneOf(GivenOrImplicit))) { + report.error(CaseClassMissingNonImplicitParamList(cdef), namePos) + ListOfNil + } + else originalVparamss.nestedMap(toDefParam(_, keepAnnotations = true, keepDefault = true)) + val derivedTparams = + constrTparams.zipWithConserve(impliedTparams)((tparam, impliedParam) => + derivedTypeParam(tparam).withAnnotations(impliedParam.mods.annotations)) + val derivedVparamss = + constrVparamss.nestedMap(vparam => + derivedTermParam(vparam).withAnnotations(Nil)) + + val constr = cpy.DefDef(constr1)(paramss = joinParams(constrTparams, constrVparamss)) + + val (normalizedBody, enumCases, enumCompanionRef) = { + // Add constructor type parameters and evidence implicit parameters + // to auxiliary constructors; set defaultGetters as a side effect. 
+ def expandConstructor(tree: Tree) = tree match { + case ddef: DefDef if ddef.name.isConstructorName => + decompose( + defDef( + addEvidenceParams( + cpy.DefDef(ddef)(paramss = joinParams(constrTparams, ddef.paramss)), + evidenceParams(constr1).map(toDefParam(_, keepAnnotations = false, keepDefault = false))))) + case stat => + stat + } + // The Identifiers defined by a case + def caseIds(tree: Tree): List[Ident] = tree match { + case tree: MemberDef => Ident(tree.name.toTermName) :: Nil + case PatDef(_, ids: List[Ident] @ unchecked, _, _) => ids + } + + val stats0 = impl.body.map(expandConstructor) + val stats = + if (ctx.owner eq defn.ScalaPackageClass) && defn.hasProblematicGetClass(className) then + stats0.filterConserve { + case ddef: DefDef => + ddef.name ne nme.getClass_ + case _ => + true + } + else + stats0 + + if (isEnum) { + val (enumCases, enumStats) = stats.partition(DesugarEnums.isEnumCase) + if (enumCases.isEmpty) + report.error(EnumerationsShouldNotBeEmpty(cdef), namePos) + else + enumCases.last.pushAttachment(DesugarEnums.DefinesEnumLookupMethods, ()) + val enumCompanionRef = TermRefTree() + val enumImport = + Import(enumCompanionRef, enumCases.flatMap(caseIds).map( + enumCase => + ImportSelector(enumCase.withSpan(enumCase.span.startPos)) + ) + ) + (enumImport :: enumStats, enumCases, enumCompanionRef) + } + else (stats, Nil, EmptyTree) + } + + def anyRef = ref(defn.AnyRefAlias.typeRef) + + val arity = constrVparamss.head.length + + val classTycon: Tree = TypeRefTree() // watching is set at end of method + + def appliedTypeTree(tycon: Tree, args: List[Tree]) = + (if (args.isEmpty) tycon else AppliedTypeTree(tycon, args)) + .withSpan(cdef.span.startPos) + + def isHK(tparam: Tree): Boolean = tparam match { + case TypeDef(_, LambdaTypeTree(tparams, body)) => true + case TypeDef(_, rhs: DerivedTypeTree) => isHK(rhs.watched) + case _ => false + } + + def appliedRef(tycon: Tree, tparams: List[TypeDef] = constrTparams, widenHK: Boolean = false) = { + val targs = for (tparam <- tparams) yield { + val targ = refOfDef(tparam) + def fullyApplied(tparam: Tree): Tree = tparam match { + case TypeDef(_, LambdaTypeTree(tparams, body)) => + AppliedTypeTree(targ, tparams.map(_ => WildcardTypeBoundsTree())) + case TypeDef(_, rhs: DerivedTypeTree) => + fullyApplied(rhs.watched) + case _ => + targ + } + if (widenHK) fullyApplied(tparam) else targ + } + appliedTypeTree(tycon, targs) + } + + def isRepeated(tree: Tree): Boolean = stripByNameType(tree) match { + case PostfixOp(_, Ident(tpnme.raw.STAR)) => true + case _ => false + } + + // a reference to the class type bound by `cdef`, with type parameters coming from the constructor + val classTypeRef = appliedRef(classTycon) + + // a reference to `enumClass`, with type parameters coming from the case constructor + lazy val enumClassTypeRef = + if (enumClass.typeParams.isEmpty) + enumClassRef + else if (originalTparams.isEmpty) + appliedRef(enumClassRef) + else { + report.error(TypedCaseDoesNotExplicitlyExtendTypedEnum(enumClass, cdef) + , cdef.srcPos.startPos) + appliedTypeTree(enumClassRef, constrTparams map (_ => anyRef)) + } + + // new C[Ts](paramss) + lazy val creatorExpr = + val vparamss = constrVparamss match + case (vparam :: _) :: _ if vparam.mods.is(Implicit) => // add a leading () to match class parameters + Nil :: constrVparamss + case _ => + if constrVparamss.nonEmpty && constrVparamss.forall { + case vparam :: _ => vparam.mods.is(Given) + case _ => false + } + then constrVparamss :+ Nil // add a trailing () to match class parameters 
+ else constrVparamss + val nu = vparamss.foldLeft(makeNew(classTypeRef)) { (nu, vparams) => + val app = Apply(nu, vparams.map(refOfDef)) + vparams match { + case vparam :: _ if vparam.mods.is(Given) => app.setApplyKind(ApplyKind.Using) + case _ => app + } + } + ensureApplied(nu) + + val copiedAccessFlags = if migrateTo3 then EmptyFlags else AccessFlags + + // Methods to add to a case class C[..](p1: T1, ..., pN: Tn)(moreParams) + // def _1: T1 = this.p1 + // ... + // def _N: TN = this.pN (unless already given as valdef or parameterless defdef) + // def copy(p1: T1 = p1..., pN: TN = pN)(moreParams) = + // new C[...](p1, ..., pN)(moreParams) + val (caseClassMeths, enumScaffolding) = { + def syntheticProperty(name: TermName, tpt: Tree, rhs: Tree) = + DefDef(name, Nil, tpt, rhs).withMods(synthetic) + + def productElemMeths = + val caseParams = derivedVparamss.head.toArray + val selectorNamesInBody = normalizedBody.collect { + case vdef: ValDef if vdef.name.isSelectorName => + vdef.name + case ddef: DefDef if ddef.name.isSelectorName && ddef.paramss.isEmpty => + ddef.name + } + for i <- List.range(0, arity) + selName = nme.selectorName(i) + if (selName ne caseParams(i).name) && !selectorNamesInBody.contains(selName) + yield syntheticProperty(selName, caseParams(i).tpt, + Select(This(EmptyTypeIdent), caseParams(i).name)) + + def enumCaseMeths = + if isEnumCase then + val (ordinal, scaffolding) = nextOrdinal(className, CaseKind.Class, definesEnumLookupMethods(cdef)) + (ordinalMethLit(ordinal) :: Nil, scaffolding) + else (Nil, Nil) + def copyMeths = { + val hasRepeatedParam = constrVparamss.nestedExists { + case ValDef(_, tpt, _) => isRepeated(tpt) + } + if (mods.is(Abstract) || hasRepeatedParam) Nil // cannot have default arguments for repeated parameters, hence copy method is not issued + else { + val copyFirstParams = derivedVparamss.head.map(vparam => + cpy.ValDef(vparam)(rhs = refOfDef(vparam))) + val copyRestParamss = derivedVparamss.tail.nestedMap(vparam => + cpy.ValDef(vparam)(rhs = EmptyTree)) + DefDef( + nme.copy, + joinParams(derivedTparams, copyFirstParams :: copyRestParamss), + TypeTree(), + creatorExpr + ).withMods(Modifiers(Synthetic | constr1.mods.flags & copiedAccessFlags, constr1.mods.privateWithin)) :: Nil + } + } + + if isCaseClass then + val (enumMeths, enumScaffolding) = enumCaseMeths + (copyMeths ::: enumMeths ::: productElemMeths, enumScaffolding) + else (Nil, Nil) + } + + var parents1: List[untpd.Tree] = parents // !cc! 
need explicit type to make capture checking pass + if (isEnumCase && parents.isEmpty) + parents1 = enumClassTypeRef :: Nil + if (isNonEnumCase) + parents1 = parents1 :+ scalaDot(str.Product.toTypeName) :+ scalaDot(nme.Serializable.toTypeName) + if (isEnum) + parents1 = parents1 :+ ref(defn.EnumClass) + + // derived type classes of non-module classes go to their companions + val (clsDerived, companionDerived) = + if (mods.is(Module)) (impl.derived, Nil) else (Nil, impl.derived) + + // The thicket which is the desugared version of the companion object + // synthetic object C extends parentTpt derives class-derived { defs } + def companionDefs(parentTpt: Tree, defs: List[Tree]) = { + val mdefs = moduleDef( + ModuleDef( + className.toTermName, Template(emptyConstructor, parentTpt :: Nil, companionDerived, EmptyValDef, defs)) + .withMods(companionMods | Synthetic)) + .withSpan(cdef.span).toList + if (companionDerived.nonEmpty) + for (case modClsDef @ TypeDef(_, _) <- mdefs) + modClsDef.putAttachment(DerivingCompanion, impl.srcPos.startPos) + mdefs + } + + val companionMembers = defaultGetters ::: enumCases + + // The companion object definitions, if a companion is needed, Nil otherwise. + // companion definitions include: + // 1. If class is a case class case class C[Ts](p1: T1, ..., pN: TN)(moreParams): + // def apply[Ts](p1: T1, ..., pN: TN)(moreParams) = new C[Ts](p1, ..., pN)(moreParams) (unless C is abstract) + // def unapply[Ts]($1: C[Ts]) = $1 // if not repeated + // def unapplySeq[Ts]($1: C[Ts]) = $1 // if repeated + // 2. The default getters of the constructor + // The parent of the companion object of a non-parameterized case class + // (T11, ..., T1N) => ... => (TM1, ..., TMN) => C + // For all other classes, the parent is AnyRef. + val companions = + if (isCaseClass) { + val applyMeths = + if (mods.is(Abstract)) Nil + else { + val appMods = + Modifiers(Synthetic | constr1.mods.flags & copiedAccessFlags).withPrivateWithin(constr1.mods.privateWithin) + val appParamss = + derivedVparamss.nestedZipWithConserve(constrVparamss)((ap, cp) => + ap.withMods(ap.mods | (cp.mods.flags & HasDefault))) + DefDef(nme.apply, joinParams(derivedTparams, appParamss), TypeTree(), creatorExpr) + .withMods(appMods) :: Nil + } + val unapplyMeth = { + val hasRepeatedParam = constrVparamss.head.exists { + case ValDef(_, tpt, _) => isRepeated(tpt) + } + val methName = if (hasRepeatedParam) nme.unapplySeq else nme.unapply + val unapplyParam = makeSyntheticParameter(tpt = classTypeRef) + val unapplyRHS = if (arity == 0) Literal(Constant(true)) else Ident(unapplyParam.name) + val unapplyResTp = if (arity == 0) Literal(Constant(true)) else TypeTree() + DefDef( + methName, + joinParams(derivedTparams, (unapplyParam :: Nil) :: Nil), + unapplyResTp, + unapplyRHS + ).withMods(synthetic) + } + val toStringMeth = + DefDef(nme.toString_, Nil, TypeTree(), Literal(Constant(className.toString))).withMods(Modifiers(Override | Synthetic)) + + companionDefs(anyRef, applyMeths ::: unapplyMeth :: toStringMeth :: companionMembers) + } + else if (companionMembers.nonEmpty || companionDerived.nonEmpty || isEnum) + companionDefs(anyRef, companionMembers) + else if (isValueClass) + companionDefs(anyRef, Nil) + else Nil + + enumCompanionRef match { + case ref: TermRefTree => // have the enum import watch the companion object + val (modVal: ValDef) :: _ = companions: @unchecked + ref.watching(modVal) + case _ => + } + + // For an implicit class C[Ts](p11: T11, ..., p1N: T1N) ... 
(pM1: TM1, .., pMN: TMN), the method + // synthetic implicit C[Ts](p11: T11, ..., p1N: T1N) ... (pM1: TM1, ..., pMN: TMN): C[Ts] = + // new C[Ts](p11, ..., p1N) ... (pM1, ..., pMN) = + val implicitWrappers = + if (!mods.isOneOf(GivenOrImplicit)) + Nil + else if (ctx.owner.is(Package)) { + report.error(TopLevelImplicitClass(cdef), cdef.srcPos) + Nil + } + else if (mods.is(Trait)) { + report.error(TypesAndTraitsCantBeImplicit(), cdef.srcPos) + Nil + } + else if (isCaseClass) { + report.error(ImplicitCaseClass(cdef), cdef.srcPos) + Nil + } + else if (arity != 1 && !mods.is(Given)) { + report.error(ImplicitClassPrimaryConstructorArity(), cdef.srcPos) + Nil + } + else { + val defParamss = constrVparamss match { + case Nil :: paramss => + paramss // drop leading () that got inserted by class + // TODO: drop this once we do not silently insert empty class parameters anymore + case paramss => paramss + } + // implicit wrapper is typechecked in same scope as constructor, so + // we can reuse the constructor parameters; no derived params are needed. + DefDef( + className.toTermName, joinParams(constrTparams, defParamss), + classTypeRef, creatorExpr) + .withMods(companionMods | mods.flags.toTermFlags & (GivenOrImplicit | Inline) | Final) + .withSpan(cdef.span) :: Nil + } + + val self1 = { + val selfType = if (self.tpt.isEmpty) classTypeRef else self.tpt + if (self.isEmpty) self + else cpy.ValDef(self)(tpt = selfType).withMods(self.mods | SelfName) + } + + val cdef1 = addEnumFlags { + val tparamAccessors = { + val impliedTparamsIt = impliedTparams.iterator + derivedTparams.map(_.withMods(impliedTparamsIt.next().mods)) + } + val caseAccessor = if (isCaseClass) CaseAccessor else EmptyFlags + val vparamAccessors = { + val originalVparamsIt = originalVparamss.iterator.flatten + derivedVparamss match { + case first :: rest => + first.map(_.withMods(originalVparamsIt.next().mods | caseAccessor)) ++ + rest.flatten.map(_.withMods(originalVparamsIt.next().mods)) + case _ => + Nil + } + } + if mods.isAllOf(Given | Inline | Transparent) then + report.error("inline given instances cannot be trasparent", cdef) + val classMods = if mods.is(Given) then mods &~ (Inline | Transparent) | Synthetic else mods + cpy.TypeDef(cdef: TypeDef)( + name = className, + rhs = cpy.Template(impl)(constr, parents1, clsDerived, self1, + tparamAccessors ::: vparamAccessors ::: normalizedBody ::: caseClassMeths) + ).withMods(classMods) + } + + // install the watch on classTycon + classTycon match { + case tycon: DerivedTypeTree => tycon.watching(cdef1) + case _ => + } + + flatTree(cdef1 :: companions ::: implicitWrappers ::: enumScaffolding) + }.showing(i"desugared: $cdef --> $result", Printers.desugar) + + /** Expand + * + * package object name { body } + * + * to: + * + * package name { + * object `package` { body } + * } + */ + def packageModuleDef(mdef: ModuleDef)(using Context): Tree = + val impl = mdef.impl + val mods = mdef.mods + val moduleName = normalizeName(mdef, impl).asTermName + if mods.is(Package) then + checkPackageName(mdef) + PackageDef(Ident(moduleName), + cpy.ModuleDef(mdef)(nme.PACKAGE, impl).withMods(mods &~ Package) :: Nil) + else + mdef + + /** Expand + * + * object name extends parents { self => body } + * + * to: + * + * val name: name$ = New(name$) + * final class name$ extends parents { self: name.type => body } + */ + def moduleDef(mdef: ModuleDef)(using Context): Tree = { + val impl = mdef.impl + val mods = mdef.mods + val moduleName = normalizeName(mdef, impl).asTermName + def isEnumCase = mods.isEnumCase 
+ Checking.checkWellFormedModule(mdef) + + if (mods.is(Package)) + packageModuleDef(mdef) + else if (isEnumCase) { + typeParamIsReferenced(enumClass.typeParams, Nil, Nil, impl.parents) + // used to check there are no illegal references to enum's type parameters in parents + expandEnumModule(moduleName, impl, mods, definesEnumLookupMethods(mdef), mdef.span) + } + else { + val clsName = moduleName.moduleClassName + val clsRef = Ident(clsName) + val modul = ValDef(moduleName, clsRef, New(clsRef, Nil)) + .withMods(mods.toTermFlags & RetainedModuleValFlags | ModuleValCreationFlags) + .withSpan(mdef.span.startPos) + val ValDef(selfName, selfTpt, _) = impl.self + val selfMods = impl.self.mods + if (!selfTpt.isEmpty) report.error(ObjectMayNotHaveSelfType(mdef), impl.self.srcPos) + val clsSelf = ValDef(selfName, SingletonTypeTree(Ident(moduleName)), impl.self.rhs) + .withMods(selfMods) + .withSpan(impl.self.span.orElse(impl.span.startPos)) + val clsTmpl = cpy.Template(impl)(self = clsSelf, body = impl.body) + val cls = TypeDef(clsName, clsTmpl) + .withMods(mods.toTypeFlags & RetainedModuleClassFlags | ModuleClassCreationFlags) + .withEndMarker(copyFrom = mdef) // copy over the end marker position to the module class def + Thicket(modul, classDef(cls).withSpan(mdef.span)) + } + } + + def extMethod(mdef: DefDef, extParamss: List[ParamClause])(using Context): DefDef = + cpy.DefDef(mdef)( + name = normalizeName(mdef, mdef.tpt).asTermName, + paramss = + if mdef.name.isRightAssocOperatorName then + val (typaramss, paramss) = mdef.paramss.span(isTypeParamClause) // first extract type parameters + + paramss match + case params :: paramss1 => // `params` must have a single parameter and without `given` flag + + def badRightAssoc(problem: String) = + report.error(em"right-associative extension method $problem", mdef.srcPos) + extParamss ++ mdef.paramss + + params match + case ValDefs(vparam :: Nil) => + if !vparam.mods.is(Given) then + // we merge the extension parameters with the method parameters, + // swapping the operator arguments: + // e.g. + // extension [A](using B)(c: C)(using D) + // def %:[E](f: F)(g: G)(using H): Res = ??? + // will be encoded as + // def %:[A](using B)[E](f: F)(c: C)(using D)(g: G)(using H): Res = ??? + val (leadingUsing, otherExtParamss) = extParamss.span(isUsingOrTypeParamClause) + leadingUsing ::: typaramss ::: params :: otherExtParamss ::: paramss1 + else + badRightAssoc("cannot start with using clause") + case _ => + badRightAssoc("must start with a single parameter") + case _ => + // no value parameters, so not an infix operator. 
+ extParamss ++ mdef.paramss
+ else
+ extParamss ++ mdef.paramss
+ ).withMods(mdef.mods | ExtensionMethod)
+
+ /** Transform extension construct to list of extension methods */
+ def extMethods(ext: ExtMethods)(using Context): Tree = flatTree {
+ ext.methods map {
+ case exp: Export => exp
+ case mdef: DefDef => defDef(extMethod(mdef, ext.paramss))
+ }
+ }
+ /** Transforms
+ *
+ * type t >: Low <: Hi
+ * to
+ *
+ * @patternType type $T >: Low <: Hi
+ *
+ * if the type has a pattern variable name
+ */
+ def quotedPatternTypeDef(tree: TypeDef)(using Context): TypeDef = {
+ assert(ctx.mode.is(Mode.QuotedPattern))
+ if tree.name.isVarPattern && !tree.isBackquoted then
+ val patternTypeAnnot = New(ref(defn.QuotedRuntimePatterns_patternTypeAnnot.typeRef)).withSpan(tree.span)
+ val mods = tree.mods.withAddedAnnotation(patternTypeAnnot)
+ tree.withMods(mods)
+ else if tree.name.startsWith("$") && !tree.isBackquoted then
+ report.error(
+ """Quoted pattern variable names starting with $ are not supported anymore.
+ |Use a lower case type pattern name instead.
+ |""".stripMargin,
+ tree.srcPos)
+ tree
+ else tree
+ }
+
+ def checkPackageName(mdef: ModuleDef | PackageDef)(using Context): Unit =
+
+ def check(name: Name, errSpan: Span): Unit = name match
+ case name: SimpleName if !errSpan.isSynthetic && name.exists(Chars.willBeEncoded) =>
+ report.warning(em"The package name `$name` will be encoded on the classpath, and can lead to undefined behaviour.", mdef.source.atSpan(errSpan))
+ case _ =>
+
+ def loop(part: RefTree): Unit = part match
+ case part @ Ident(name) => check(name, part.span)
+ case part @ Select(qual: RefTree, name) =>
+ check(name, part.nameSpan)
+ loop(qual)
+ case _ =>
+
+ mdef match
+ case pdef: PackageDef => loop(pdef.pid)
+ case mdef: ModuleDef if mdef.mods.is(Package) => check(mdef.name, mdef.nameSpan)
+ case _ =>
+ end checkPackageName
+
+ /** The normalized name of `mdef`. This means
+ * 1. Check that the name does not redefine a Scala core class.
+ * If it does redefine, issue an error and return a mangled name instead
+ * of the original one.
+ * 2. If the name is missing (this can be the case for instance definitions),
+ * invent one instead.
+ */
+ def normalizeName(mdef: MemberDef, impl: Tree)(using Context): Name = {
+ var name = mdef.name
+ if (name.isEmpty) name = name.likeSpaced(inventGivenOrExtensionName(impl))
+ def errPos = mdef.source.atSpan(mdef.nameSpan)
+ if (ctx.owner == defn.ScalaPackageClass && defn.reservedScalaClassNames.contains(name.toTypeName)) {
+ val kind = if (name.isTypeName) "class" else "object"
+ report.error(IllegalRedefinitionOfStandardKind(kind, name), errPos)
+ name = name.errorName
+ }
+ name
+ }
+
+ /** Invent a name for an anonymous given of type or template `impl`.
*/ + def inventGivenOrExtensionName(impl: Tree)(using Context): SimpleName = + val str = impl match + case impl: Template => + if impl.parents.isEmpty then + report.error(AnonymousInstanceCannotBeEmpty(impl), impl.srcPos) + nme.ERROR.toString + else + impl.parents.map(inventTypeName(_)).mkString("given_", "_", "") + case impl: Tree => + "given_" ++ inventTypeName(impl) + str.toTermName.asSimpleName + + private class NameExtractor(followArgs: Boolean) extends UntypedTreeAccumulator[String] { + private def extractArgs(args: List[Tree])(using Context): String = + args.map(argNameExtractor.apply("", _)).mkString("_") + override def apply(x: String, tree: Tree)(using Context): String = + if (x.isEmpty) + tree match { + case Select(pre, nme.CONSTRUCTOR) => foldOver(x, pre) + case tree: RefTree => + if tree.name.isTypeName then tree.name.toString + else s"${tree.name}_type" + case tree: TypeDef => tree.name.toString + case tree: AppliedTypeTree if followArgs && tree.args.nonEmpty => + s"${apply(x, tree.tpt)}_${extractArgs(tree.args)}" + case InfixOp(left, op, right) => + if followArgs then s"${op.name}_${extractArgs(List(left, right))}" + else op.name.toString + case tree: LambdaTypeTree => + apply(x, tree.body) + case tree: Tuple => + extractArgs(tree.trees) + case tree: Function if tree.args.nonEmpty => + if followArgs then s"${extractArgs(tree.args)}_to_${apply("", tree.body)}" + else "Function" + case _ => foldOver(x, tree) + } + else x + } + private val typeNameExtractor = NameExtractor(followArgs = true) + private val argNameExtractor = NameExtractor(followArgs = false) + + private def inventTypeName(tree: Tree)(using Context): String = typeNameExtractor("", tree) + + /**This will check if this def tree is marked to define enum lookup methods, + * this is not recommended to call more than once per tree + */ + private def definesEnumLookupMethods(ddef: DefTree): Boolean = + ddef.removeAttachment(DefinesEnumLookupMethods).isDefined + + /** val p1, ..., pN: T = E + * ==> + * makePatDef[[val p1: T1 = E]]; ...; makePatDef[[val pN: TN = E]] + * + * case e1, ..., eN + * ==> + * expandSimpleEnumCase([case e1]); ...; expandSimpleEnumCase([case eN]) + */ + def patDef(pdef: PatDef)(using Context): Tree = flatTree { + val PatDef(mods, pats, tpt, rhs) = pdef + if mods.isEnumCase then + def expand(id: Ident, definesLookups: Boolean) = + expandSimpleEnumCase(id.name.asTermName, mods, definesLookups, + Span(id.span.start, id.span.end, id.span.start)) + + val ids = pats.asInstanceOf[List[Ident]] + if definesEnumLookupMethods(pdef) then + ids.init.map(expand(_, false)) ::: expand(ids.last, true) :: Nil + else + ids.map(expand(_, false)) + else { + val pats1 = if (tpt.isEmpty) pats else pats map (Typed(_, tpt)) + pats1 map (makePatDef(pdef, mods, _, rhs)) + } + } + + /** The selector of a match, which depends of the given `checkMode`. 
+ * @param sel the original selector + * @return if `checkMode` is + * - None : sel @unchecked + * - Exhaustive : sel + * - IrrefutablePatDef, + * IrrefutableGenFrom: sel with attachment `CheckIrrefutable -> checkMode` + */ + def makeSelector(sel: Tree, checkMode: MatchCheck)(using Context): Tree = + checkMode match + case MatchCheck.None => + Annotated(sel, New(ref(defn.UncheckedAnnot.typeRef))) + + case MatchCheck.Exhaustive => + sel + + case MatchCheck.IrrefutablePatDef | MatchCheck.IrrefutableGenFrom => + // TODO: use `pushAttachment` and investigate duplicate attachment + sel.withAttachment(CheckIrrefutable, checkMode) + sel + end match + + /** If `pat` is a variable pattern, + * + * val/var/lazy val p = e + * + * Otherwise, in case there is exactly one variable x_1 in pattern + * val/var/lazy val p = e ==> val/var/lazy val x_1 = (e: @unchecked) match (case p => (x_1)) + * + * in case there are zero or more than one variables in pattern + * val/var/lazy p = e ==> private[this] synthetic [lazy] val t$ = (e: @unchecked) match (case p => (x_1, ..., x_N)) + * val/var/def x_1 = t$._1 + * ... + * val/var/def x_N = t$._N + * If the original pattern variable carries a type annotation, so does the corresponding + * ValDef or DefDef. + */ + def makePatDef(original: Tree, mods: Modifiers, pat: Tree, rhs: Tree)(using Context): Tree = pat match { + case IdPattern(id, tpt) => + val id1 = + if id.name == nme.WILDCARD + then cpy.Ident(id)(WildcardParamName.fresh()) + else id + derivedValDef(original, id1, tpt, rhs, mods) + case _ => + + def filterWildcardGivenBinding(givenPat: Bind): Boolean = + givenPat.name != nme.WILDCARD + + def errorOnGivenBinding(bind: Bind)(using Context): Boolean = + report.error( + em"""${hl("given")} patterns are not allowed in a ${hl("val")} definition, + |please bind to an identifier and use an alias given.""", bind) + false + + def isTuplePattern(arity: Int): Boolean = pat match { + case Tuple(pats) if pats.size == arity => + pats.forall(isVarPattern) + case _ => false + } + val isMatchingTuple: Tree => Boolean = { + case Tuple(es) => isTuplePattern(es.length) + case _ => false + } + + // We can only optimize `val pat = if (...) 
e1 else e2` if: + // - `e1` and `e2` are both tuples of arity N + // - `pat` is a tuple of N variables or wildcard patterns like `(x1, x2, ..., xN)` + val tupleOptimizable = forallResults(rhs, isMatchingTuple) + + val inAliasGenerator = original match + case _: GenAlias => true + case _ => false + + val vars = + if (tupleOptimizable) // include `_` + pat match + case Tuple(pats) => pats.map { case id: Ident => id -> TypeTree() } + else + getVariables( + tree = pat, + shouldAddGiven = + if inAliasGenerator then + filterWildcardGivenBinding + else + errorOnGivenBinding + ) // no `_` + + val ids = for ((named, _) <- vars) yield Ident(named.name) + val matchExpr = + if (tupleOptimizable) rhs + else + val caseDef = CaseDef(pat, EmptyTree, makeTuple(ids)) + Match(makeSelector(rhs, MatchCheck.IrrefutablePatDef), caseDef :: Nil) + vars match { + case Nil if !mods.is(Lazy) => + matchExpr + case (named, tpt) :: Nil => + derivedValDef(original, named, tpt, matchExpr, mods) + case _ => + val tmpName = UniqueName.fresh() + val patMods = + mods & Lazy | Synthetic | (if (ctx.owner.isClass) PrivateLocal else EmptyFlags) + val firstDef = + ValDef(tmpName, TypeTree(), matchExpr) + .withSpan(pat.span.union(rhs.span)).withMods(patMods) + val useSelectors = vars.length <= 22 + def selector(n: Int) = + if useSelectors then Select(Ident(tmpName), nme.selectorName(n)) + else Apply(Select(Ident(tmpName), nme.apply), Literal(Constant(n)) :: Nil) + val restDefs = + for (((named, tpt), n) <- vars.zipWithIndex if named.name != nme.WILDCARD) + yield + if mods.is(Lazy) then + DefDef(named.name.asTermName, Nil, tpt, selector(n)) + .withMods(mods &~ Lazy) + .withSpan(named.span) + else + valDef( + ValDef(named.name.asTermName, tpt, selector(n)) + .withMods(mods) + .withSpan(named.span) + ) + flatTree(firstDef :: restDefs) + } + } + + /** Expand variable identifier x to x @ _ */ + def patternVar(tree: Tree)(using Context): Bind = { + val Ident(name) = unsplice(tree): @unchecked + Bind(name, Ident(nme.WILDCARD)).withSpan(tree.span) + } + + /** The type of tests that check whether a MemberDef is OK for some flag. + * The test succeeds if the partial function is defined and returns true. + */ + type MemberDefTest = PartialFunction[MemberDef, Boolean] + + val legalOpaque: MemberDefTest = { + case TypeDef(_, rhs) => + def rhsOK(tree: Tree): Boolean = tree match { + case bounds: TypeBoundsTree => !bounds.alias.isEmpty + case _: Template | _: MatchTypeTree => false + case LambdaTypeTree(_, body) => rhsOK(body) + case _ => true + } + rhsOK(rhs) + } + + def checkOpaqueAlias(tree: MemberDef)(using Context): MemberDef = + def check(rhs: Tree): MemberDef = rhs match + case bounds: TypeBoundsTree if bounds.alias.isEmpty => + report.error(em"opaque type must have a right-hand side", tree.srcPos) + tree.withMods(tree.mods.withoutFlags(Opaque)) + case LambdaTypeTree(_, body) => check(body) + case _ => tree + if !tree.mods.is(Opaque) then tree + else tree match + case TypeDef(_, rhs) => check(rhs) + case _ => tree + + /** Check that modifiers are legal for the definition `tree`. + * Right now, we only check for `opaque`. TODO: Move other modifier checks here. 
+ */ + def checkModifiers(tree: Tree)(using Context): Tree = tree match { + case tree: MemberDef => + var tested: MemberDef = tree + def checkApplicable(flag: Flag, test: MemberDefTest): MemberDef = + if (tested.mods.is(flag) && !test.applyOrElse(tree, (md: MemberDef) => false)) { + report.error(ModifierNotAllowedForDefinition(flag), tree.srcPos) + tested.withMods(tested.mods.withoutFlags(flag)) + } else tested + tested = checkOpaqueAlias(tested) + tested = checkApplicable(Opaque, legalOpaque) + tested + case _ => + tree + } + + def defTree(tree: Tree)(using Context): Tree = + checkModifiers(tree) match { + case tree: ValDef => valDef(tree) + case tree: TypeDef => + if (tree.isClassDef) classDef(tree) + else if (ctx.mode.is(Mode.QuotedPattern)) quotedPatternTypeDef(tree) + else tree + case tree: DefDef => + if (tree.name.isConstructorName) tree // was already handled by enclosing classDef + else defDef(tree) + case tree: ModuleDef => moduleDef(tree) + case tree: PatDef => patDef(tree) + } + + /** { stats; } + * ==> + * { stats; () } + */ + def block(tree: Block)(using Context): Block = tree.expr match { + case EmptyTree => + cpy.Block(tree)(tree.stats, + unitLiteral.withSpan(if (tree.stats.isEmpty) tree.span else tree.span.endPos)) + case _ => + tree + } + + /** Translate infix operation expression + * + * l op r ==> l.op(r) if op is left-associative + * ==> r.op(l) if op is right-associative + */ + def binop(left: Tree, op: Ident, right: Tree)(using Context): Apply = { + def assignToNamedArg(arg: Tree) = arg match { + case Assign(Ident(name), rhs) => cpy.NamedArg(arg)(name, rhs) + case _ => arg + } + def makeOp(fn: Tree, arg: Tree, selectPos: Span) = + val sel = Select(fn, op.name).withSpan(selectPos) + if (left.sourcePos.endLine < op.sourcePos.startLine) + sel.pushAttachment(MultiLineInfix, ()) + arg match + case Parens(arg) => + Apply(sel, assignToNamedArg(arg) :: Nil) + case Tuple(args) if args.exists(_.isInstanceOf[Assign]) => + Apply(sel, args.mapConserve(assignToNamedArg)) + case Tuple(args) => + Apply(sel, arg :: Nil).setApplyKind(ApplyKind.InfixTuple) + case _ => + Apply(sel, arg :: Nil) + + if op.name.isRightAssocOperatorName then + makeOp(right, left, Span(op.span.start, right.span.end)) + else + makeOp(left, right, Span(left.span.start, op.span.end, op.span.start)) + } + + /** Translate throws type `A throws E1 | ... | En` to + * $throws[... $throws[A, E1] ... , En]. 
+ */
+ def throws(tpt: Tree, op: Ident, excepts: Tree)(using Context): AppliedTypeTree = excepts match
+ case Parens(excepts1) =>
+ throws(tpt, op, excepts1)
+ case InfixOp(l, bar @ Ident(tpnme.raw.BAR), r) =>
+ throws(throws(tpt, op, l), bar, r)
+ case e =>
+ AppliedTypeTree(
+ TypeTree(defn.throwsAlias.typeRef).withSpan(op.span), tpt :: excepts :: Nil)
+
+ /** Translate tuple expressions of arity <= 22
+ *
+ * () ==> ()
+ * (t) ==> t
+ * (t1, ..., tN) ==> TupleN(t1, ..., tN)
+ */
+ def smallTuple(tree: Tuple)(using Context): Tree = {
+ val ts = tree.trees
+ val arity = ts.length
+ assert(arity <= Definitions.MaxTupleArity)
+ def tupleTypeRef = defn.TupleType(arity).nn
+ if (arity == 0)
+ if (ctx.mode is Mode.Type) TypeTree(defn.UnitType) else unitLiteral
+ else if (ctx.mode is Mode.Type) AppliedTypeTree(ref(tupleTypeRef), ts)
+ else Apply(ref(tupleTypeRef.classSymbol.companionModule.termRef), ts)
+ }
+
+ private def isTopLevelDef(stat: Tree)(using Context): Boolean = stat match
+ case _: ValDef | _: PatDef | _: DefDef | _: Export | _: ExtMethods => true
+ case stat: ModuleDef =>
+ stat.mods.isOneOf(GivenOrImplicit)
+ case stat: TypeDef =>
+ !stat.isClassDef || stat.mods.isOneOf(GivenOrImplicit)
+ case _ =>
+ false
+
+ /** Assuming `src` contains top-level definitions, returns the name that should
+ * be used for the package object that will wrap them.
+ */
+ def packageObjectName(src: SourceFile): TermName =
+ val fileName = src.file.name
+ val sourceName = fileName.take(fileName.lastIndexOf('.'))
+ (sourceName ++ str.TOPLEVEL_SUFFIX).toTermName
+
+ /** Group all definitions that can't be at the toplevel in
+ * an object named `<source>$package` where `<source>` is the name of the source file.
+ * Definitions that can't be at the toplevel are:
+ *
+ * - all pattern, value and method definitions
+ * - non-class type definitions
+ * - implicit classes and objects
+ * - "companion objects" of wrapped type definitions
+ * (i.e. objects having the same name as a wrapped type)
+ */
+ def packageDef(pdef: PackageDef)(using Context): PackageDef = {
+ checkPackageName(pdef)
+ val wrappedTypeNames = pdef.stats.collectCC {
+ case stat: TypeDef if isTopLevelDef(stat) => stat.name
+ }
+ def inPackageObject(stat: Tree) =
+ isTopLevelDef(stat) || {
+ stat match
+ case stat: ModuleDef =>
+ wrappedTypeNames.contains(stat.name.stripModuleClassSuffix.toTypeName)
+ case _ =>
+ false
+ }
+ val (nestedStats, topStats) = pdef.stats.partition(inPackageObject)
+ if (nestedStats.isEmpty) pdef
+ else {
+ val name = packageObjectName(ctx.source)
+ val grouped =
+ ModuleDef(name, Template(emptyConstructor, Nil, Nil, EmptyValDef, nestedStats))
+ .withMods(Modifiers(Synthetic))
+ cpy.PackageDef(pdef)(pdef.pid, topStats :+ grouped)
+ }
+ }
+
+ /** Make closure corresponding to function.
+ * params => body
+ * ==>
+ * def $anonfun(params) = body
+ * Closure($anonfun)
+ */
+ def makeClosure(params: List[ValDef], body: Tree, tpt: Tree | Null = null, isContextual: Boolean, span: Span)(using Context): Block =
+ Block(
+ DefDef(nme.ANON_FUN, params :: Nil, if (tpt == null) TypeTree() else tpt, body)
+ .withSpan(span)
+ .withMods(synthetic | Artifact),
+ Closure(Nil, Ident(nme.ANON_FUN), if (isContextual) ContextualEmptyTree else EmptyTree))
+
+ /** If `nparams` == 1, expand partial function
+ *
+ * { cases }
+ * ==>
+ * x$1 => (x$1 @unchecked?) match { cases }
+ *
+ * If `nparams` != 1, expand instead to
+ *
+ * (x$1, ..., x$n) => (x$0, ..., x${n-1} @unchecked?)
match { cases } + */ + def makeCaseLambda(cases: List[CaseDef], checkMode: MatchCheck, nparams: Int = 1)(using Context): Function = { + val params = (1 to nparams).toList.map(makeSyntheticParameter(_)) + val selector = makeTuple(params.map(p => Ident(p.name))) + Function(params, Match(makeSelector(selector, checkMode), cases)) + } + + /** Map n-ary function `(x1: T1, ..., xn: Tn) => body` where n != 1 to unary function as follows: + * + * (x$1: (T1, ..., Tn)) => { + * def x1: T1 = x$1._1 + * ... + * def xn: Tn = x$1._n + * body + * } + * + * or if `isGenericTuple` + * + * (x$1: (T1, ... Tn) => { + * def x1: T1 = x$1.apply(0) + * ... + * def xn: Tn = x$1.apply(n-1) + * body + * } + * + * If some of the Ti's are absent, omit the : (T1, ..., Tn) type ascription + * in the selector. + */ + def makeTupledFunction(params: List[ValDef], body: Tree, isGenericTuple: Boolean)(using Context): Tree = { + val param = makeSyntheticParameter( + tpt = + if params.exists(_.tpt.isEmpty) then TypeTree() + else Tuple(params.map(_.tpt))) + def selector(n: Int) = + if (isGenericTuple) Apply(Select(refOfDef(param), nme.apply), Literal(Constant(n))) + else Select(refOfDef(param), nme.selectorName(n)) + val vdefs = + params.zipWithIndex.map { + case (param, idx) => + ValDef(param.name, param.tpt, selector(idx)) + .withSpan(param.span) + .withAttachment(UntupledParam, ()) + .withFlags(Synthetic) + } + Function(param :: Nil, Block(vdefs, body)) + } + + /** Convert a tuple pattern with given `elems` to a sequence of `ValDefs`, + * skipping elements that are not convertible. + */ + def patternsToParams(elems: List[Tree])(using Context): List[ValDef] = + def toParam(elem: Tree, tpt: Tree): Tree = + elem match + case Annotated(elem1, _) => toParam(elem1, tpt) + case Typed(elem1, tpt1) => toParam(elem1, tpt1) + case Ident(id: TermName) => ValDef(id, tpt, EmptyTree).withFlags(Param) + case _ => EmptyTree + elems.map(param => toParam(param, TypeTree()).withSpan(param.span)).collect { + case vd: ValDef => vd + } + + def makeContextualFunction(formals: List[Tree], body: Tree, isErased: Boolean)(using Context): Function = { + val mods = if (isErased) Given | Erased else Given + val params = makeImplicitParameters(formals, mods) + FunctionWithMods(params, body, Modifiers(mods)) + } + + private def derivedValDef(original: Tree, named: NameTree, tpt: Tree, rhs: Tree, mods: Modifiers)(using Context) = { + val vdef = ValDef(named.name.asTermName, tpt, rhs) + .withMods(mods) + .withSpan(original.span.withPoint(named.span.start)) + val mayNeedSetter = valDef(vdef) + mayNeedSetter + } + + private def derivedDefDef(original: Tree, named: NameTree, tpt: Tree, rhs: Tree, mods: Modifiers)(implicit src: SourceFile) = + DefDef(named.name.asTermName, Nil, tpt, rhs) + .withMods(mods) + .withSpan(original.span.withPoint(named.span.start)) + + /** Main desugaring method */ + def apply(tree: Tree, pt: Type = NoType)(using Context): Tree = { + + /** Create tree for for-comprehension `` or + * `` where mapName and flatMapName are chosen + * corresponding to whether this is a for-do or a for-yield. + * The creation performs the following rewrite rules: + * + * 1. + * + * for (P <- G) E ==> G.foreach (P => E) + * + * Here and in the following (P => E) is interpreted as the function (P => E) + * if P is a variable pattern and as the partial function { case P => E } otherwise. + * + * 2. + * + * for (P <- G) yield E ==> G.map (P => E) + * + * 3. + * + * for (P_1 <- G_1; P_2 <- G_2; ...) ... + * ==> + * G_1.flatMap (P_1 => for (P_2 <- G_2; ...) ...) 
+ * + * 4. + * + * for (P <- G; E; ...) ... + * => + * for (P <- G.filter (P => E); ...) ... + * + * 5. For any N: + * + * for (P_1 <- G; P_2 = E_2; val P_N = E_N; ...) + * ==> + * for (TupleN(P_1, P_2, ... P_N) <- + * for (x_1 @ P_1 <- G) yield { + * val x_2 @ P_2 = E_2 + * ... + * val x_N & P_N = E_N + * TupleN(x_1, ..., x_N) + * } ...) + * + * If any of the P_i are variable patterns, the corresponding `x_i @ P_i` is not generated + * and the variable constituting P_i is used instead of x_i + * + * @param mapName The name to be used for maps (either map or foreach) + * @param flatMapName The name to be used for flatMaps (either flatMap or foreach) + * @param enums The enumerators in the for expression + * @param body The body of the for expression + */ + def makeFor(mapName: TermName, flatMapName: TermName, enums: List[Tree], body: Tree): Tree = trace(i"make for ${ForYield(enums, body)}", show = true) { + + /** Let `pat` be `gen`'s pattern. Make a function value `pat => body`. + * If `pat` is a var pattern `id: T` then this gives `(id: T) => body`. + * Otherwise this gives `{ case pat => body }`, where `pat` is checked to be + * irrefutable if `gen`'s checkMode is GenCheckMode.Check. + */ + def makeLambda(gen: GenFrom, body: Tree): Tree = gen.pat match { + case IdPattern(named, tpt) if gen.checkMode != GenCheckMode.FilterAlways => + Function(derivedValDef(gen.pat, named, tpt, EmptyTree, Modifiers(Param)) :: Nil, body) + case _ => + val matchCheckMode = + if (gen.checkMode == GenCheckMode.Check || gen.checkMode == GenCheckMode.CheckAndFilter) MatchCheck.IrrefutableGenFrom + else MatchCheck.None + makeCaseLambda(CaseDef(gen.pat, EmptyTree, body) :: Nil, matchCheckMode) + } + + /** If `pat` is not an Identifier, a Typed(Ident, _), or a Bind, wrap + * it in a Bind with a fresh name. Return the transformed pattern, and the identifier + * that refers to the bound variable for the pattern. Wildcard Binds are + * also replaced by Binds with fresh names. + */ + def makeIdPat(pat: Tree): (Tree, Ident) = pat match { + case bind @ Bind(name, pat1) => + if name == nme.WILDCARD then + val name = UniqueName.fresh() + (cpy.Bind(pat)(name, pat1).withMods(bind.mods), Ident(name)) + else (pat, Ident(name)) + case id: Ident if isVarPattern(id) && id.name != nme.WILDCARD => (id, id) + case Typed(id: Ident, _) if isVarPattern(id) && id.name != nme.WILDCARD => (pat, id) + case _ => + val name = UniqueName.fresh() + (Bind(name, pat), Ident(name)) + } + + /** Make a pattern filter: + * rhs.withFilter { case pat => true case _ => false } + * + * On handling irrefutable patterns: + * The idea is to wait until the pattern matcher sees a call + * + * xs withFilter { cases } + * + * where cases can be proven to be refutable i.e. cases would be + * equivalent to { case _ => true } + * + * In that case, compile to + * + * xs withFilter alwaysTrue + * + * where `alwaysTrue` is a predefined function value: + * + * val alwaysTrue: Any => Boolean = true + * + * In the libraries operations can take advantage of alwaysTrue to shortcircuit the + * withFilter call. 
+ * + * def withFilter(f: Elem => Boolean) = + * if (f eq alwaysTrue) this // or rather identity filter monadic applied to this + * else real withFilter + */ + def makePatFilter(rhs: Tree, pat: Tree): Tree = { + val cases = List( + CaseDef(pat, EmptyTree, Literal(Constant(true))), + CaseDef(Ident(nme.WILDCARD), EmptyTree, Literal(Constant(false)))) + Apply(Select(rhs, nme.withFilter), makeCaseLambda(cases, MatchCheck.None)) + } + + /** Is pattern `pat` irrefutable when matched against `rhs`? + * We only can do a simple syntactic check here; a more refined check + * is done later in the pattern matcher (see discussion in @makePatFilter). + */ + def isIrrefutable(pat: Tree, rhs: Tree): Boolean = { + def matchesTuple(pats: List[Tree], rhs: Tree): Boolean = rhs match { + case Tuple(trees) => (pats corresponds trees)(isIrrefutable) + case Parens(rhs1) => matchesTuple(pats, rhs1) + case Block(_, rhs1) => matchesTuple(pats, rhs1) + case If(_, thenp, elsep) => matchesTuple(pats, thenp) && matchesTuple(pats, elsep) + case Match(_, cases) => cases forall (matchesTuple(pats, _)) + case CaseDef(_, _, rhs1) => matchesTuple(pats, rhs1) + case Throw(_) => true + case _ => false + } + pat match { + case Bind(_, pat1) => isIrrefutable(pat1, rhs) + case Parens(pat1) => isIrrefutable(pat1, rhs) + case Tuple(pats) => matchesTuple(pats, rhs) + case _ => isVarPattern(pat) + } + } + + /** Is `pat` of the form `x`, `x T`, or `given T`? when used as the lhs of a generator, + * these are all considered irrefutable. + */ + def isVarBinding(pat: Tree): Boolean = pat match + case pat @ Bind(_, pat1) if pat.mods.is(Given) => isVarBinding(pat1) + case IdPattern(_) => true + case _ => false + + def needsNoFilter(gen: GenFrom): Boolean = gen.checkMode match + case GenCheckMode.FilterAlways => false // pattern was prefixed by `case` + case GenCheckMode.FilterNow | GenCheckMode.CheckAndFilter => isVarBinding(gen.pat) || isIrrefutable(gen.pat, gen.expr) + case GenCheckMode.Check => true + case GenCheckMode.Ignore => true + + /** rhs.name with a pattern filter on rhs unless `pat` is irrefutable when + * matched against `rhs`. 
+ */ + def rhsSelect(gen: GenFrom, name: TermName) = { + val rhs = if (needsNoFilter(gen)) gen.expr else makePatFilter(gen.expr, gen.pat) + Select(rhs, name) + } + + enums match { + case (gen: GenFrom) :: Nil => + Apply(rhsSelect(gen, mapName), makeLambda(gen, body)) + case (gen: GenFrom) :: (rest @ (GenFrom(_, _, _) :: _)) => + val cont = makeFor(mapName, flatMapName, rest, body) + Apply(rhsSelect(gen, flatMapName), makeLambda(gen, cont)) + case (gen: GenFrom) :: (rest @ GenAlias(_, _) :: _) => + val (valeqs, rest1) = rest.span(_.isInstanceOf[GenAlias]) + val pats = valeqs map { case GenAlias(pat, _) => pat } + val rhss = valeqs map { case GenAlias(_, rhs) => rhs } + val (defpat0, id0) = makeIdPat(gen.pat) + val (defpats, ids) = (pats map makeIdPat).unzip + val pdefs = valeqs.lazyZip(defpats).lazyZip(rhss).map { (valeq, defpat, rhs) => + val mods = defpat match + case defTree: DefTree => defTree.mods + case _ => Modifiers() + makePatDef(valeq, mods, defpat, rhs) + } + val rhs1 = makeFor(nme.map, nme.flatMap, GenFrom(defpat0, gen.expr, gen.checkMode) :: Nil, Block(pdefs, makeTuple(id0 :: ids))) + val allpats = gen.pat :: pats + val vfrom1 = GenFrom(makeTuple(allpats), rhs1, GenCheckMode.Ignore) + makeFor(mapName, flatMapName, vfrom1 :: rest1, body) + case (gen: GenFrom) :: test :: rest => + val filtered = Apply(rhsSelect(gen, nme.withFilter), makeLambda(gen, test)) + val genFrom = GenFrom(gen.pat, filtered, GenCheckMode.Ignore) + makeFor(mapName, flatMapName, genFrom :: rest, body) + case _ => + EmptyTree //may happen for erroneous input + } + } + + def makePolyFunction(targs: List[Tree], body: Tree, pt: Type): Tree = body match { + case Parens(body1) => + makePolyFunction(targs, body1, pt) + case Block(Nil, body1) => + makePolyFunction(targs, body1, pt) + case Function(vargs, res) => + assert(targs.nonEmpty) + // TODO: Figure out if we need a `PolyFunctionWithMods` instead. + val mods = body match { + case body: FunctionWithMods => body.mods + case _ => untpd.EmptyModifiers + } + val polyFunctionTpt = ref(defn.PolyFunctionType) + val applyTParams = targs.asInstanceOf[List[TypeDef]] + if (ctx.mode.is(Mode.Type)) { + // Desugar [T_1, ..., T_M] -> (P_1, ..., P_N) => R + // Into scala.PolyFunction { def apply[T_1, ..., T_M](x$1: P_1, ..., x$N: P_N): R } + + val applyVParams = vargs.zipWithIndex.map { + case (p: ValDef, _) => p.withAddedFlags(mods.flags) + case (p, n) => makeSyntheticParameter(n + 1, p).withAddedFlags(mods.flags) + } + RefinedTypeTree(polyFunctionTpt, List( + DefDef(nme.apply, applyTParams :: applyVParams :: Nil, res, EmptyTree).withFlags(Synthetic) + )) + } + else { + // Desugar [T_1, ..., T_M] -> (x_1: P_1, ..., x_N: P_N) => body + // with pt [S_1, ..., S_M] -> (O_1, ..., O_N) => R + // Into new scala.PolyFunction { def apply[T_1, ..., T_M](x_1: P_1, ..., x_N: P_N): R2 = body } + // where R2 is R, with all references to S_1..S_M replaced with T1..T_M. 
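// Editorial aside, not part of this patch: a tiny, hand-written illustration of the
// PolyFunction encoding described in the comment above. `PolyFunctionSketch` and
// `singletonList` are illustrative names only.
object PolyFunctionSketch {
  // A polymorphic function literal; its type is written with the surface syntax
  // `[T] => T => List[T]`, which itself stands for the refinement
  // `PolyFunction { def apply[T](x: T): List[T] }`.
  val singletonList: [T] => T => List[T] = [T] => (x: T) => List(x)
  // The desugaring above turns the literal into (roughly)
  //   new PolyFunction { def apply[T](x: T): List[T] = List(x) }
}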
+ + def typeTree(tp: Type) = tp match + case RefinedType(parent, nme.apply, PolyType(_, mt)) if parent.typeSymbol eq defn.PolyFunctionClass => + var bail = false + def mapper(tp: Type, topLevel: Boolean = false): Tree = tp match + case tp: TypeRef => ref(tp) + case tp: TypeParamRef => Ident(applyTParams(tp.paramNum).name) + case AppliedType(tycon, args) => AppliedTypeTree(mapper(tycon), args.map(mapper(_))) + case _ => if topLevel then TypeTree() else { bail = true; genericEmptyTree } + val mapped = mapper(mt.resultType, topLevel = true) + if bail then TypeTree() else mapped + case _ => TypeTree() + + val applyVParams = vargs.asInstanceOf[List[ValDef]] + .map(varg => varg.withAddedFlags(mods.flags | Param)) + New(Template(emptyConstructor, List(polyFunctionTpt), Nil, EmptyValDef, + List(DefDef(nme.apply, applyTParams :: applyVParams :: Nil, typeTree(pt), res)) + )) + } + case _ => + // may happen for erroneous input. An error will already have been reported. + assert(ctx.reporter.errorsReported) + EmptyTree + } + + // begin desugar + + // Special case for `Parens` desugaring: unlike all the desugarings below, + // its output is not a new tree but an existing one whose position should + // be preserved, so we shouldn't call `withPos` on it. + tree match { + case Parens(t) => + return t + case _ => + } + + val desugared = tree match { + case PolyFunction(targs, body) => + makePolyFunction(targs, body, pt) orElse tree + case SymbolLit(str) => + Apply( + ref(defn.ScalaSymbolClass.companionModule.termRef), + Literal(Constant(str)) :: Nil) + case InterpolatedString(id, segments) => + val strs = segments map { + case ts: Thicket => ts.trees.head + case t => t + } + val elems = segments flatMap { + case ts: Thicket => ts.trees.tail + case t => Nil + } map { (t: Tree) => t match + // !cc! explicitly typed parameter (t: Tree) is needed since otherwise + // we get an error similar to #16268. (The explicit type constrains the type of `segments` + // which is otherwise List[{*} tree]) + case Block(Nil, EmptyTree) => Literal(Constant(())) // for s"... ${} ..." 
+ case Block(Nil, expr) => expr // important for interpolated string as patterns, see i1773.scala
+ case t => t
+ }
+ // This is a deliberate departure from scalac, where StringContext is not rooted (See #4732)
+ Apply(Select(Apply(scalaDot(nme.StringContext), strs), id).withSpan(tree.span), elems)
+ case PostfixOp(t, op) =>
+ if (ctx.mode is Mode.Type) && !isBackquoted(op) && op.name == tpnme.raw.STAR then
+ if ctx.isJava then
+ AppliedTypeTree(ref(defn.RepeatedParamType), t)
+ else
+ Annotated(
+ AppliedTypeTree(ref(defn.SeqType), t),
+ New(ref(defn.RepeatedAnnot.typeRef), Nil :: Nil))
+ else
+ assert(ctx.mode.isExpr || ctx.reporter.errorsReported || ctx.mode.is(Mode.Interactive), ctx.mode)
+ Select(t, op.name)
+ case PrefixOp(op, t) =>
+ val nspace = if (ctx.mode.is(Mode.Type)) tpnme else nme
+ Select(t, nspace.UNARY_PREFIX ++ op.name)
+ case ForDo(enums, body) =>
+ makeFor(nme.foreach, nme.foreach, enums, body) orElse tree
+ case ForYield(enums, body) =>
+ makeFor(nme.map, nme.flatMap, enums, body) orElse tree
+ case PatDef(mods, pats, tpt, rhs) =>
+ val pats1 = if (tpt.isEmpty) pats else pats map (Typed(_, tpt))
+ flatTree(pats1 map (makePatDef(tree, mods, _, rhs)))
+ case ext: ExtMethods =>
+ Block(List(ext), Literal(Constant(())).withSpan(ext.span))
+ case CapturingTypeTree(refs, parent) =>
+ // convert `{refs} T` to `T @retains refs`
+ // `{refs}-> T` to `-> (T @retainsByName refs)`
+ def annotate(annotName: TypeName, tp: Tree) =
+ Annotated(tp, New(scalaAnnotationDot(annotName), List(refs)))
+ parent match
+ case ByNameTypeTree(restpt) =>
+ cpy.ByNameTypeTree(parent)(annotate(tpnme.retainsByName, restpt))
+ case _ =>
+ annotate(tpnme.retains, parent)
+ }
+ desugared.withSpan(tree.span)
+ }
+
+ /** Turn a function value `handlerFun` into a catch case for a try.
+ * If `handlerFun` is a partial function, translate to
+ *
+ * case ex =>
+ * val ev$1 = handlerFun
+ * if ev$1.isDefinedAt(ex) then ev$1.apply(ex) else throw ex
+ *
+ * Otherwise translate to
+ *
+ * case ex => handlerFun.apply(ex)
+ */
+ def makeTryCase(handlerFun: tpd.Tree)(using Context): CaseDef =
+ val handler = TypedSplice(handlerFun)
+ val excId = Ident(nme.DEFAULT_EXCEPTION_NAME)
+ val rhs =
+ if handlerFun.tpe.widen.isRef(defn.PartialFunctionClass) then
+ val tmpName = UniqueName.fresh()
+ val tmpId = Ident(tmpName)
+ val init = ValDef(tmpName, TypeTree(), handler)
+ val test = If(
+ Apply(Select(tmpId, nme.isDefinedAt), excId),
+ Apply(Select(tmpId, nme.apply), excId),
+ Throw(excId))
+ Block(init :: Nil, test)
+ else
+ Apply(Select(handler, nme.apply), excId)
+ CaseDef(excId, EmptyTree, rhs)
+
+ /** Create a class definition with the same info as the refined type given by `parent`
+ * and `refinements`.
+ *
+ * parent { refinements }
+ * ==>
+ * trait <refinement> extends core { this: self => refinements }
+ *
+ * Here, `core` is the (possibly parameterized) class part of `parent`.
+ * If `parent` is the same as `core`, self is empty. Otherwise `self` is `parent`.
+ *
+ * Example: Given
+ *
+ * class C
+ * type T1 = C { type T <: A }
+ *
+ * the refined type
+ *
+ * T1 { type T <: B }
+ *
+ * is expanded to
+ *
+ * trait <refinement> extends C { this: T1 => type T <: B }
+ *
+ * The result of this method is used for validity checking, and is thrown away afterwards.
+ * @param parent The type of `parent` + */ + def refinedTypeToClass(parent: tpd.Tree, refinements: List[Tree])(using Context): TypeDef = { + def stripToCore(tp: Type): List[Type] = tp match { + case tp: AppliedType => tp :: Nil + case tp: TypeRef if tp.symbol.isClass => tp :: Nil // monomorphic class type + case tp: TypeProxy => stripToCore(tp.underlying) + case AndType(tp1, tp2) => stripToCore(tp1) ::: stripToCore(tp2) + case _ => defn.AnyType :: Nil + } + val parentCores = stripToCore(parent.tpe) + val untpdParent = TypedSplice(parent) + val (classParents, self) = + if (parentCores.length == 1 && (parent.tpe eq parentCores.head)) (untpdParent :: Nil, EmptyValDef) + else (parentCores map TypeTree, ValDef(nme.WILDCARD, untpdParent, EmptyTree)) + val impl = Template(emptyConstructor, classParents, Nil, self, refinements) + TypeDef(tpnme.REFINE_CLASS, impl).withFlags(Trait) + } + + /** Returns list of all pattern variables, possibly with their types, + * without duplicates + */ + private def getVariables(tree: Tree, shouldAddGiven: Context ?=> Bind => Boolean)(using Context): List[VarInfo] = { + val buf = ListBuffer[VarInfo]() + def seenName(name: Name) = buf exists (_._1.name == name) + def add(named: NameTree, t: Tree): Unit = + if (!seenName(named.name) && named.name.isTermName) buf += ((named, t)) + def collect(tree: Tree): Unit = tree match { + case tree @ Bind(nme.WILDCARD, tree1) => + if tree.mods.is(Given) then + val Typed(_, tpt) = tree1: @unchecked + if shouldAddGiven(tree) then + add(tree, tpt) + collect(tree1) + case tree @ Bind(_, Typed(tree1, tpt)) => + if !(tree.mods.is(Given) && !shouldAddGiven(tree)) then + add(tree, tpt) + collect(tree1) + case tree @ Bind(_, tree1) => + add(tree, TypeTree()) + collect(tree1) + case Typed(id: Ident, t) if isVarPattern(id) && id.name != nme.WILDCARD && !isWildcardStarArg(tree) => + add(id, t) + case id: Ident if isVarPattern(id) && id.name != nme.WILDCARD => + add(id, TypeTree()) + case Apply(_, args) => + args foreach collect + case Typed(expr, _) => + collect(expr) + case NamedArg(_, arg) => + collect(arg) + case SeqLiteral(elems, _) => + elems foreach collect + case Alternative(trees) => + for (tree <- trees; (vble, _) <- getVariables(tree, shouldAddGiven)) + report.error(IllegalVariableInPatternAlternative(vble.symbol.name), vble.srcPos) + case Annotated(arg, _) => + collect(arg) + case InterpolatedString(_, segments) => + segments foreach collect + case InfixOp(left, _, right) => + collect(left) + collect(right) + case PrefixOp(_, od) => + collect(od) + case Parens(tree) => + collect(tree) + case Tuple(trees) => + trees foreach collect + case Thicket(trees) => + trees foreach collect + case Block(Nil, expr) => + collect(expr) + case Quote(expr) => + new UntypedTreeTraverser { + def traverse(tree: untpd.Tree)(using Context): Unit = tree match { + case Splice(expr) => collect(expr) + case _ => traverseChildren(tree) + } + }.traverse(expr) + case CapturingTypeTree(refs, parent) => + collect(parent) + case _ => + } + collect(tree) + buf.toList + } +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/DesugarEnums.scala b/tests/pos-with-compiler-cc/dotc/ast/DesugarEnums.scala new file mode 100644 index 000000000000..a1c3c0ed0775 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/DesugarEnums.scala @@ -0,0 +1,310 @@ +package dotty.tools +package dotc +package ast + +import core._ +import util.Spans._, Types._, Contexts._, Constants._, Names._, Flags._ +import Symbols._, StdNames._, Trees._ +import Decorators._ +import util.{Property, 
SourceFile}
+import typer.ErrorReporting._
+import transform.SyntheticMembers.ExtendsSingletonMirror
+
+import scala.annotation.internal.sharable
+
+/** Helper methods to desugar enums */
+object DesugarEnums {
+ import untpd._
+
+ enum CaseKind:
+ case Simple, Object, Class
+
+ final case class EnumConstraints(minKind: CaseKind, maxKind: CaseKind, enumCases: List[(Int, RefTree)]):
+ require(minKind.ordinal <= maxKind.ordinal && !(cached && enumCases.isEmpty))
+ def requiresCreator = minKind == CaseKind.Simple
+ def isEnumeration = maxKind.ordinal < CaseKind.Class.ordinal
+ def cached = minKind.ordinal < CaseKind.Class.ordinal
+ end EnumConstraints
+
+ /** Attachment containing the number of enum cases, the smallest and largest case kinds
+ * that were seen so far, and a list of all the value cases with their ordinals.
+ */
+ val EnumCaseCount: Property.Key[(Int, CaseKind, CaseKind, List[(Int, TermName)])] = Property.Key()
+
+ /** Attachment signalling that when this definition is desugared, it should add any additional
+ * lookup methods for enums.
+ */
+ val DefinesEnumLookupMethods: Property.Key[Unit] = Property.Key()
+
+ /** The enumeration class that belongs to an enum case. This works no matter
+ * whether the case is still in the enum class or it has been transferred to the
+ * companion object.
+ */
+ def enumClass(using Context): Symbol = {
+ val cls = ctx.owner
+ if (cls.is(Module)) cls.linkedClass else cls
+ }
+
+ def enumCompanion(using Context): Symbol = {
+ val cls = ctx.owner
+ if (cls.is(Module)) cls.sourceModule else cls.linkedClass.sourceModule
+ }
+
+ /** Is `tree` an (untyped) enum case? */
+ def isEnumCase(tree: Tree)(using Context): Boolean = tree match {
+ case tree: MemberDef => tree.mods.isEnumCase
+ case PatDef(mods, _, _, _) => mods.isEnumCase
+ case _ => false
+ }
+
+ /** A reference to the enum class `E`, possibly followed by type arguments.
+ * Each covariant type parameter is approximated by its lower bound.
+ * Each contravariant type parameter is approximated by its upper bound.
+ * It is an error if a type parameter is non-variant, or if its approximation
+ * refers to other type parameters.
+ */ + def interpolatedEnumParent(span: Span)(using Context): Tree = { + val tparams = enumClass.typeParams + def isGround(tp: Type) = tp.subst(tparams, tparams.map(_ => NoType)) eq tp + val targs = tparams map { tparam => + if (tparam.is(Covariant) && isGround(tparam.info.bounds.lo)) + tparam.info.bounds.lo + else if (tparam.is(Contravariant) && isGround(tparam.info.bounds.hi)) + tparam.info.bounds.hi + else { + def problem = + if (!tparam.isOneOf(VarianceFlags)) "is invariant" + else "has bounds that depend on a type parameter in the same parameter list" + errorType(em"""cannot determine type argument for enum parent $enumClass, + |type parameter $tparam $problem""", ctx.source.atSpan(span)) + } + } + TypeTree(enumClass.typeRef.appliedTo(targs)).withSpan(span) + } + + /** A type tree referring to `enumClass` */ + def enumClassRef(using Context): Tree = + if (enumClass.exists) TypeTree(enumClass.typeRef) else TypeTree() + + /** Add implied flags to an enum class or an enum case */ + def addEnumFlags(cdef: TypeDef)(using Context): TypeDef = + if (cdef.mods.isEnumClass) cdef.withMods(cdef.mods.withAddedFlags(Abstract | Sealed, cdef.span)) + else if (isEnumCase(cdef)) cdef.withMods(cdef.mods.withAddedFlags(Final, cdef.span)) + else cdef + + private def valuesDot(name: PreName)(implicit src: SourceFile) = + Select(Ident(nme.DOLLAR_VALUES), name.toTermName) + + private def ArrayLiteral(values: List[Tree], tpt: Tree)(using Context): Tree = + val clazzOf = TypeApply(ref(defn.Predef_classOf.termRef), tpt :: Nil) + val ctag = Apply(TypeApply(ref(defn.ClassTagModule_apply.termRef), tpt :: Nil), clazzOf :: Nil) + val apply = Select(ref(defn.ArrayModule.termRef), nme.apply) + Apply(Apply(TypeApply(apply, tpt :: Nil), values), ctag :: Nil) + + /** The following lists of definitions for an enum type E and known value cases e_0, ..., e_n: + * + * private val $values = Array[E](this.e_0,...,this.e_n)(ClassTag[E](classOf[E])) + * def values = $values.clone + * def valueOf($name: String) = $name match { + * case "e_0" => this.e_0 + * ... 
+ * case "e_n" => this.e_n + * case _ => throw new IllegalArgumentException("case not found: " + $name) + * } + */ + private def enumScaffolding(enumValues: List[RefTree])(using Context): List[Tree] = { + val rawEnumClassRef = rawRef(enumClass.typeRef) + extension (tpe: NamedType) def ofRawEnum = AppliedTypeTree(ref(tpe), rawEnumClassRef) + + val privateValuesDef = + ValDef(nme.DOLLAR_VALUES, TypeTree(), ArrayLiteral(enumValues, rawEnumClassRef)) + .withFlags(Private | Synthetic) + + val valuesDef = + DefDef(nme.values, Nil, defn.ArrayType.ofRawEnum, valuesDot(nme.clone_)) + .withFlags(Synthetic) + + val valuesOfBody: Tree = + val defaultCase = + val msg = Apply(Select(Literal(Constant("enum case not found: ")), nme.PLUS), Ident(nme.nameDollar)) + CaseDef(Ident(nme.WILDCARD), EmptyTree, + Throw(New(TypeTree(defn.IllegalArgumentExceptionType), List(msg :: Nil)))) + val stringCases = enumValues.map(enumValue => + CaseDef(Literal(Constant(enumValue.name.toString)), EmptyTree, enumValue) + ) ::: defaultCase :: Nil + Match(Ident(nme.nameDollar), stringCases) + val valueOfDef = DefDef(nme.valueOf, List(param(nme.nameDollar, defn.StringType) :: Nil), + TypeTree(), valuesOfBody) + .withFlags(Synthetic) + + privateValuesDef :: + valuesDef :: + valueOfDef :: Nil + } + + private def enumLookupMethods(constraints: EnumConstraints)(using Context): List[Tree] = + def scaffolding: List[Tree] = + if constraints.isEnumeration then enumScaffolding(constraints.enumCases.map(_._2)) else Nil + def valueCtor: List[Tree] = if constraints.requiresCreator then enumValueCreator :: Nil else Nil + def fromOrdinal: Tree = + def throwArg(ordinal: Tree) = + Throw(New(TypeTree(defn.NoSuchElementExceptionType), List(Select(ordinal, nme.toString_) :: Nil))) + if !constraints.cached then + fromOrdinalMeth(throwArg) + else + def default(ordinal: Tree) = + CaseDef(Ident(nme.WILDCARD), EmptyTree, throwArg(ordinal)) + if constraints.isEnumeration then + fromOrdinalMeth(ordinal => + Try(Apply(valuesDot(nme.apply), ordinal), default(ordinal) :: Nil, EmptyTree)) + else + fromOrdinalMeth(ordinal => + Match(ordinal, + constraints.enumCases.map((i, enumValue) => CaseDef(Literal(Constant(i)), EmptyTree, enumValue)) + :+ default(ordinal))) + + if !enumClass.exists then + // in the case of a double definition of an enum that only defines class cases (see tests/neg/i4470c.scala) + // it seems `enumClass` might be `NoSymbol`; in this case we provide no scaffolding. + Nil + else + scaffolding ::: valueCtor ::: fromOrdinal :: Nil + end enumLookupMethods + + /** A creation method for a value of enum type `E`, which is defined as follows: + * + * private def $new(_$ordinal: Int, $name: String) = new E with scala.runtime.EnumValue { + * def ordinal = _$ordinal // if `E` does not derive from `java.lang.Enum` + * } + */ + private def enumValueCreator(using Context) = { + val creator = New(Template( + constr = emptyConstructor, + parents = enumClassRef :: scalaRuntimeDot(tpnme.EnumValue) :: Nil, + derived = Nil, + self = EmptyValDef, + body = Nil + ).withAttachment(ExtendsSingletonMirror, ())) + DefDef(nme.DOLLAR_NEW, + List(List(param(nme.ordinalDollar_, defn.IntType), param(nme.nameDollar, defn.StringType))), + TypeTree(), creator).withFlags(Private | Synthetic) + } + + /** Is a type parameter in `enumTypeParams` referenced from an enum class case that has + * given type parameters `caseTypeParams`, value parameters `vparamss` and parents `parents`? + * Issues an error if that is the case but the reference is illegal. 
+ * The reference could be illegal for two reasons: + * - explicit type parameters are given + * - it's a value case, i.e. no value parameters are given + */ + def typeParamIsReferenced( + enumTypeParams: List[TypeSymbol], + caseTypeParams: List[TypeDef], + vparamss: List[List[ValDef]], + parents: List[Tree])(using Context): Boolean = { + + object searchRef extends UntypedTreeAccumulator[Boolean] { + var tparamNames = enumTypeParams.map(_.name).toSet[Name] + def underBinders(binders: List[MemberDef], op: => Boolean): Boolean = { + val saved = tparamNames + tparamNames = tparamNames -- binders.map(_.name) + try op + finally tparamNames = saved + } + def apply(x: Boolean, tree: Tree)(using Context): Boolean = x || { + tree match { + case Ident(name) => + val matches = tparamNames.contains(name) + if (matches && (caseTypeParams.nonEmpty || vparamss.isEmpty)) + report.error(em"illegal reference to type parameter $name from enum case", tree.srcPos) + matches + case LambdaTypeTree(lambdaParams, body) => + underBinders(lambdaParams, foldOver(x, tree)) + case RefinedTypeTree(parent, refinements) => + val refinementDefs = refinements collect { case r: MemberDef => r } + underBinders(refinementDefs, foldOver(x, tree)) + case _ => foldOver(x, tree) + } + } + def apply(tree: Tree)(using Context): Boolean = + underBinders(caseTypeParams, apply(false, tree)) + } + + def typeHasRef(tpt: Tree) = searchRef(tpt) + def valDefHasRef(vd: ValDef) = typeHasRef(vd.tpt) + def parentHasRef(parent: Tree): Boolean = parent match { + case Apply(fn, _) => parentHasRef(fn) + case TypeApply(_, targs) => targs.exists(typeHasRef) + case Select(nu, nme.CONSTRUCTOR) => parentHasRef(nu) + case New(tpt) => typeHasRef(tpt) + case parent => parent.isType && typeHasRef(parent) + } + + vparamss.nestedExists(valDefHasRef) || parents.exists(parentHasRef) + } + + /** A pair consisting of + * - the next enum tag + * - scaffolding containing the necessary definitions for singleton enum cases + * unless that scaffolding was already generated by a previous call to `nextEnumKind`. 
+ */ + def nextOrdinal(name: Name, kind: CaseKind, definesLookups: Boolean)(using Context): (Int, List[Tree]) = { + val (ordinal, seenMinKind, seenMaxKind, seenCases) = + ctx.tree.removeAttachment(EnumCaseCount).getOrElse((0, CaseKind.Class, CaseKind.Simple, Nil)) + val minKind = if kind.ordinal < seenMinKind.ordinal then kind else seenMinKind + val maxKind = if kind.ordinal > seenMaxKind.ordinal then kind else seenMaxKind + val cases = name match + case name: TermName => (ordinal, name) :: seenCases + case _ => seenCases + if definesLookups then + val thisRef = This(EmptyTypeIdent) + val cachedValues = cases.reverse.map((i, name) => (i, Select(thisRef, name))) + (ordinal, enumLookupMethods(EnumConstraints(minKind, maxKind, cachedValues))) + else + ctx.tree.pushAttachment(EnumCaseCount, (ordinal + 1, minKind, maxKind, cases)) + (ordinal, Nil) + } + + def param(name: TermName, typ: Type)(using Context): ValDef = param(name, TypeTree(typ)) + def param(name: TermName, tpt: Tree)(using Context): ValDef = ValDef(name, tpt, EmptyTree).withFlags(Param) + + def ordinalMeth(body: Tree)(using Context): DefDef = + DefDef(nme.ordinal, Nil, TypeTree(defn.IntType), body).withAddedFlags(Synthetic) + + def ordinalMethLit(ord: Int)(using Context): DefDef = + ordinalMeth(Literal(Constant(ord))) + + def fromOrdinalMeth(body: Tree => Tree)(using Context): DefDef = + DefDef(nme.fromOrdinal, (param(nme.ordinal, defn.IntType) :: Nil) :: Nil, + rawRef(enumClass.typeRef), body(Ident(nme.ordinal))).withFlags(Synthetic) + + /** Expand a module definition representing a parameterless enum case */ + def expandEnumModule(name: TermName, impl: Template, mods: Modifiers, definesLookups: Boolean, span: Span)(using Context): Tree = { + assert(impl.body.isEmpty) + if (!enumClass.exists) EmptyTree + else if (impl.parents.isEmpty) + expandSimpleEnumCase(name, mods, definesLookups, span) + else { + val (tag, scaffolding) = nextOrdinal(name, CaseKind.Object, definesLookups) + val impl1 = cpy.Template(impl)(parents = impl.parents :+ scalaRuntimeDot(tpnme.EnumValue), body = Nil) + .withAttachment(ExtendsSingletonMirror, ()) + val vdef = ValDef(name, TypeTree(), New(impl1)).withMods(mods.withAddedFlags(EnumValue, span)) + flatTree(vdef :: scaffolding).withSpan(span) + } + } + + /** Expand a simple enum case */ + def expandSimpleEnumCase(name: TermName, mods: Modifiers, definesLookups: Boolean, span: Span)(using Context): Tree = + if (!enumClass.exists) EmptyTree + else if (enumClass.typeParams.nonEmpty) { + val parent = interpolatedEnumParent(span) + val impl = Template(emptyConstructor, parent :: Nil, Nil, EmptyValDef, Nil) + expandEnumModule(name, impl, mods, definesLookups, span) + } + else { + val (tag, scaffolding) = nextOrdinal(name, CaseKind.Simple, definesLookups) + val creator = Apply(Ident(nme.DOLLAR_NEW), List(Literal(Constant(tag)), Literal(Constant(name.toString)))) + val vdef = ValDef(name, enumClassRef, creator).withMods(mods.withAddedFlags(EnumValue, span)) + flatTree(vdef :: scaffolding).withSpan(span) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/MainProxies.scala b/tests/pos-with-compiler-cc/dotc/ast/MainProxies.scala new file mode 100644 index 000000000000..c0cf2c0d1b81 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/MainProxies.scala @@ -0,0 +1,449 @@ +package dotty.tools.dotc +package ast + +import core._ +import Symbols._, Types._, Contexts._, Decorators._, util.Spans._, Flags._, Constants._ +import StdNames.{nme, tpnme} +import ast.Trees._ +import Names.Name +import Comments.Comment 
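// Editorial aside, not part of this patch: a compact illustration of the enum
// expansion implemented above (DesugarEnums). `EnumSketch` and `Color` are
// illustrative names only.
object EnumSketch {
  enum Color { case Red, Green }
  // Conceptually, `Color` expands to (simplified):
  //   sealed abstract class Color extends scala.reflect.Enum
  //   object Color {
  //     val Red: Color = $new(0, "Red")   // via the synthetic creator from enumValueCreator
  //     val Green: Color = $new(1, "Green")
  //     def values: Array[Color] = ...             // from enumScaffolding
  //     def valueOf($name: String): Color = ...
  //     def fromOrdinal(ordinal: Int): Color = ... // from enumLookupMethods
  //   }
  val secondOrdinal: Int = Color.Green.ordinal // 1, using the synthesized ordinal
}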
+import NameKinds.DefaultGetterName +import Annotations.Annotation + +object MainProxies { + + /** Generate proxy classes for @main functions and @myMain functions where myMain <:< MainAnnotation */ + def proxies(stats: List[tpd.Tree])(using Context): List[untpd.Tree] = { + mainAnnotationProxies(stats) ++ mainProxies(stats) + } + + /** Generate proxy classes for @main functions. + * A function like + * + * @main def f(x: S, ys: T*) = ... + * + * would be translated to something like + * + * import CommandLineParser._ + * class f { + * @static def main(args: Array[String]): Unit = + * try + * f( + * parseArgument[S](args, 0), + * parseRemainingArguments[T](args, 1): _* + * ) + * catch case err: ParseError => showError(err) + * } + */ + private def mainProxies(stats: List[tpd.Tree])(using Context): List[untpd.Tree] = { + import tpd._ + def mainMethods(stats: List[Tree]): List[Symbol] = stats.flatMap { + case stat: DefDef if stat.symbol.hasAnnotation(defn.MainAnnot) => + stat.symbol :: Nil + case stat @ TypeDef(name, impl: Template) if stat.symbol.is(Module) => + mainMethods(impl.body) + case _ => + Nil + } + mainMethods(stats).flatMap(mainProxy) + } + + import untpd._ + private def mainProxy(mainFun: Symbol)(using Context): List[TypeDef] = { + val mainAnnotSpan = mainFun.getAnnotation(defn.MainAnnot).get.tree.span + def pos = mainFun.sourcePos + val argsRef = Ident(nme.args) + + def addArgs(call: untpd.Tree, mt: MethodType, idx: Int): untpd.Tree = + if (mt.isImplicitMethod) { + report.error(em"@main method cannot have implicit parameters", pos) + call + } + else { + val args = mt.paramInfos.zipWithIndex map { + (formal, n) => + val (parserSym, formalElem) = + if (formal.isRepeatedParam) (defn.CLP_parseRemainingArguments, formal.argTypes.head) + else (defn.CLP_parseArgument, formal) + val arg = Apply( + TypeApply(ref(parserSym.termRef), TypeTree(formalElem) :: Nil), + argsRef :: Literal(Constant(idx + n)) :: Nil) + if (formal.isRepeatedParam) repeated(arg) else arg + } + val call1 = Apply(call, args) + mt.resType match { + case restpe: MethodType => + if (mt.paramInfos.lastOption.getOrElse(NoType).isRepeatedParam) + report.error(em"varargs parameter of @main method must come last", pos) + addArgs(call1, restpe, idx + args.length) + case _ => + call1 + } + } + + var result: List[TypeDef] = Nil + if (!mainFun.owner.isStaticOwner) + report.error(em"@main method is not statically accessible", pos) + else { + var call = ref(mainFun.termRef) + mainFun.info match { + case _: ExprType => + case mt: MethodType => + call = addArgs(call, mt, 0) + case _: PolyType => + report.error(em"@main method cannot have type parameters", pos) + case _ => + report.error(em"@main can only annotate a method", pos) + } + val errVar = Ident(nme.error) + val handler = CaseDef( + Typed(errVar, TypeTree(defn.CLP_ParseError.typeRef)), + EmptyTree, + Apply(ref(defn.CLP_showError.termRef), errVar :: Nil)) + val body = Try(call, handler :: Nil, EmptyTree) + val mainArg = ValDef(nme.args, TypeTree(defn.ArrayType.appliedTo(defn.StringType)), EmptyTree) + .withFlags(Param) + /** Replace typed `Ident`s that have been typed with a TypeSplice with the reference to the symbol. + * The annotations will be retype-checked in another scope that may not have the same imports. 
+ */ + def insertTypeSplices = new TreeMap { + override def transform(tree: Tree)(using Context): Tree = tree match + case tree: tpd.Ident @unchecked => TypedSplice(tree) + case tree => super.transform(tree) + } + val annots = mainFun.annotations + .filterNot(_.matches(defn.MainAnnot)) + .map(annot => insertTypeSplices.transform(annot.tree)) + val mainMeth = DefDef(nme.main, (mainArg :: Nil) :: Nil, TypeTree(defn.UnitType), body) + .withFlags(JavaStatic | Synthetic) + .withAnnotations(annots) + val mainTempl = Template(emptyConstructor, Nil, Nil, EmptyValDef, mainMeth :: Nil) + val mainCls = TypeDef(mainFun.name.toTypeName, mainTempl) + .withFlags(Final | Invisible) + + if (!ctx.reporter.hasErrors) + result = mainCls.withSpan(mainAnnotSpan.toSynthetic) :: Nil + } + result + } + + private type DefaultValueSymbols = Map[Int, Symbol] + private type ParameterAnnotationss = Seq[Seq[Annotation]] + + /** + * Generate proxy classes for main functions. + * A function like + * + * /** + * * Lorem ipsum dolor sit amet + * * consectetur adipiscing elit. + * * + * * @param x my param x + * * @param ys all my params y + * */ + * @myMain(80) def f( + * @myMain.Alias("myX") x: S, + * y: S, + * ys: T* + * ) = ... + * + * would be translated to something like + * + * final class f { + * static def main(args: Array[String]): Unit = { + * val annotation = new myMain(80) + * val info = new Info( + * name = "f", + * documentation = "Lorem ipsum dolor sit amet consectetur adipiscing elit.", + * parameters = Seq( + * new scala.annotation.MainAnnotation.Parameter("x", "S", false, false, "my param x", Seq(new scala.main.Alias("myX"))), + * new scala.annotation.MainAnnotation.Parameter("y", "S", true, false, "", Seq()), + * new scala.annotation.MainAnnotation.Parameter("ys", "T", false, true, "all my params y", Seq()) + * ) + * ), + * val command = annotation.command(info, args) + * if command.isDefined then + * val cmd = command.get + * val args0: () => S = annotation.argGetter[S](info.parameters(0), cmd(0), None) + * val args1: () => S = annotation.argGetter[S](info.parameters(1), mainArgs(1), Some(() => sum$default$1())) + * val args2: () => Seq[T] = annotation.varargGetter[T](info.parameters(2), cmd.drop(2)) + * annotation.run(() => f(args0(), args1(), args2()*)) + * } + * } + */ + private def mainAnnotationProxies(stats: List[tpd.Tree])(using Context): List[untpd.Tree] = { + import tpd._ + + /** + * Computes the symbols of the default values of the function. Since they cannot be inferred anymore at this + * point of the compilation, they must be explicitly passed by [[mainProxy]]. + */ + def defaultValueSymbols(scope: Tree, funSymbol: Symbol): DefaultValueSymbols = + scope match { + case TypeDef(_, template: Template) => + template.body.flatMap((_: Tree) match { + case dd: DefDef if dd.name.is(DefaultGetterName) && dd.name.firstPart == funSymbol.name => + val DefaultGetterName.NumberedInfo(index) = dd.name.info: @unchecked + List(index -> dd.symbol) + case _ => Nil + }).toMap + case _ => Map.empty + } + + /** Computes the list of main methods present in the code. 
*/ + def mainMethods(scope: Tree, stats: List[Tree]): List[(Symbol, ParameterAnnotationss, DefaultValueSymbols, Option[Comment])] = stats.flatMap { + case stat: DefDef => + val sym = stat.symbol + sym.annotations.filter(_.matches(defn.MainAnnotationClass)) match { + case Nil => + Nil + case _ :: Nil => + val paramAnnotations = stat.paramss.flatMap(_.map( + valdef => valdef.symbol.annotations.filter(_.matches(defn.MainAnnotationParameterAnnotation)) + )) + (sym, paramAnnotations.toVector, defaultValueSymbols(scope, sym), stat.rawComment) :: Nil + case mainAnnot :: others => + report.error(em"method cannot have multiple main annotations", mainAnnot.tree) + Nil + } + case stat @ TypeDef(_, impl: Template) if stat.symbol.is(Module) => + mainMethods(stat, impl.body) + case _ => + Nil + } + + // Assuming that the top-level object was already generated, all main methods will have a scope + mainMethods(EmptyTree, stats).flatMap(mainAnnotationProxy) + } + + private def mainAnnotationProxy(mainFun: Symbol, paramAnnotations: ParameterAnnotationss, defaultValueSymbols: DefaultValueSymbols, docComment: Option[Comment])(using Context): Option[TypeDef] = { + val mainAnnot = mainFun.getAnnotation(defn.MainAnnotationClass).get + def pos = mainFun.sourcePos + + val documentation = new Documentation(docComment) + + /** () => value */ + def unitToValue(value: Tree): Tree = + val defDef = DefDef(nme.ANON_FUN, List(Nil), TypeTree(), value) + Block(defDef, Closure(Nil, Ident(nme.ANON_FUN), EmptyTree)) + + /** Generate a list of trees containing the ParamInfo instantiations. + * + * A ParamInfo has the following shape + * ``` + * new scala.annotation.MainAnnotation.Parameter("x", "S", false, false, "my param x", Seq(new scala.main.Alias("myX"))) + * ``` + */ + def parameterInfos(mt: MethodType): List[Tree] = + extension (tree: Tree) def withProperty(sym: Symbol, args: List[Tree]) = + Apply(Select(tree, sym.name), args) + + for ((formal, paramName), idx) <- mt.paramInfos.zip(mt.paramNames).zipWithIndex yield + val param = paramName.toString + val paramType0 = if formal.isRepeatedParam then formal.argTypes.head.dealias else formal.dealias + val paramType = paramType0.dealias + val paramTypeOwner = paramType.typeSymbol.owner + val paramTypeStr = + if paramTypeOwner == defn.EmptyPackageClass then paramType.show + else paramTypeOwner.showFullName + "." + paramType.show + val hasDefault = defaultValueSymbols.contains(idx) + val isRepeated = formal.isRepeatedParam + val paramDoc = documentation.argDocs.getOrElse(param, "") + val paramAnnots = + val annotationTrees = paramAnnotations(idx).map(instantiateAnnotation).toList + Apply(ref(defn.SeqModule.termRef), annotationTrees) + + val constructorArgs = List(param, paramTypeStr, hasDefault, isRepeated, paramDoc) + .map(value => Literal(Constant(value))) + + New(TypeTree(defn.MainAnnotationParameter.typeRef), List(constructorArgs :+ paramAnnots)) + + end parameterInfos + + /** + * Creates a list of references and definitions of arguments. + * The goal is to create the + * `val args0: () => S = annotation.argGetter[S](0, cmd(0), None)` + * part of the code. 
+ */ + def argValDefs(mt: MethodType): List[ValDef] = + for ((formal, paramName), idx) <- mt.paramInfos.zip(mt.paramNames).zipWithIndex yield + val argName = nme.args ++ idx.toString + val isRepeated = formal.isRepeatedParam + val formalType = if isRepeated then formal.argTypes.head else formal + val getterName = if isRepeated then nme.varargGetter else nme.argGetter + val defaultValueGetterOpt = defaultValueSymbols.get(idx) match + case None => ref(defn.NoneModule.termRef) + case Some(dvSym) => + val value = unitToValue(ref(dvSym.termRef)) + Apply(ref(defn.SomeClass.companionModule.termRef), value) + val argGetter0 = TypeApply(Select(Ident(nme.annotation), getterName), TypeTree(formalType) :: Nil) + val index = Literal(Constant(idx)) + val paramInfo = Apply(Select(Ident(nme.info), nme.parameters), index) + val argGetter = + if isRepeated then Apply(argGetter0, List(paramInfo, Apply(Select(Ident(nme.cmd), nme.drop), List(index)))) + else Apply(argGetter0, List(paramInfo, Apply(Ident(nme.cmd), List(index)), defaultValueGetterOpt)) + ValDef(argName, TypeTree(), argGetter) + end argValDefs + + + /** Create a list of argument references that will be passed as argument to the main method. + * `args0`, ...`argn*` + */ + def argRefs(mt: MethodType): List[Tree] = + for ((formal, paramName), idx) <- mt.paramInfos.zip(mt.paramNames).zipWithIndex yield + val argRef = Apply(Ident(nme.args ++ idx.toString), Nil) + if formal.isRepeatedParam then repeated(argRef) else argRef + end argRefs + + + /** Turns an annotation (e.g. `@main(40)`) into an instance of the class (e.g. `new scala.main(40)`). */ + def instantiateAnnotation(annot: Annotation): Tree = + val argss = { + def recurse(t: tpd.Tree, acc: List[List[Tree]]): List[List[Tree]] = t match { + case Apply(t, args: List[tpd.Tree]) => recurse(t, extractArgs(args) :: acc) + case _ => acc + } + + def extractArgs(args: List[tpd.Tree]): List[Tree] = + args.flatMap { + case Typed(SeqLiteral(varargs, _), _) => varargs.map(arg => TypedSplice(arg)) + case arg: Select if arg.name.is(DefaultGetterName) => Nil // Ignore default values, they will be added later by the compiler + case arg => List(TypedSplice(arg)) + } + + recurse(annot.tree, Nil) + } + + New(TypeTree(annot.symbol.typeRef), argss) + end instantiateAnnotation + + def generateMainClass(mainCall: Tree, args: List[Tree], parameterInfos: List[Tree]): TypeDef = + val cmdInfo = + val nameTree = Literal(Constant(mainFun.showName)) + val docTree = Literal(Constant(documentation.mainDoc)) + val paramInfos = Apply(ref(defn.SeqModule.termRef), parameterInfos) + New(TypeTree(defn.MainAnnotationInfo.typeRef), List(List(nameTree, docTree, paramInfos))) + + val annotVal = ValDef( + nme.annotation, + TypeTree(), + instantiateAnnotation(mainAnnot) + ) + val infoVal = ValDef( + nme.info, + TypeTree(), + cmdInfo + ) + val command = ValDef( + nme.command, + TypeTree(), + Apply( + Select(Ident(nme.annotation), nme.command), + List(Ident(nme.info), Ident(nme.args)) + ) + ) + val argsVal = ValDef( + nme.cmd, + TypeTree(), + Select(Ident(nme.command), nme.get) + ) + val run = Apply(Select(Ident(nme.annotation), nme.run), mainCall) + val body0 = If( + Select(Ident(nme.command), nme.isDefined), + Block(argsVal :: args, run), + EmptyTree + ) + val body = Block(List(annotVal, infoVal, command), body0) // TODO add `if (cmd.nonEmpty)` + + val mainArg = ValDef(nme.args, TypeTree(defn.ArrayType.appliedTo(defn.StringType)), EmptyTree) + .withFlags(Param) + /** Replace typed `Ident`s that have been typed with a TypeSplice with the 
reference to the symbol. + * The annotations will be retype-checked in another scope that may not have the same imports. + */ + def insertTypeSplices = new TreeMap { + override def transform(tree: Tree)(using Context): Tree = tree match + case tree: tpd.Ident @unchecked => TypedSplice(tree) + case tree => super.transform(tree) + } + val annots = mainFun.annotations + .filterNot(_.matches(defn.MainAnnotationClass)) + .map(annot => insertTypeSplices.transform(annot.tree)) + val mainMeth = DefDef(nme.main, (mainArg :: Nil) :: Nil, TypeTree(defn.UnitType), body) + .withFlags(JavaStatic) + .withAnnotations(annots) + val mainTempl = Template(emptyConstructor, Nil, Nil, EmptyValDef, mainMeth :: Nil) + val mainCls = TypeDef(mainFun.name.toTypeName, mainTempl) + .withFlags(Final | Invisible) + mainCls.withSpan(mainAnnot.tree.span.toSynthetic) + end generateMainClass + + if (!mainFun.owner.isStaticOwner) + report.error(em"main method is not statically accessible", pos) + None + else mainFun.info match { + case _: ExprType => + Some(generateMainClass(unitToValue(ref(mainFun.termRef)), Nil, Nil)) + case mt: MethodType => + if (mt.isImplicitMethod) + report.error(em"main method cannot have implicit parameters", pos) + None + else mt.resType match + case restpe: MethodType => + report.error(em"main method cannot be curried", pos) + None + case _ => + Some(generateMainClass(unitToValue(Apply(ref(mainFun.termRef), argRefs(mt))), argValDefs(mt), parameterInfos(mt))) + case _: PolyType => + report.error(em"main method cannot have type parameters", pos) + None + case _ => + report.error(em"main can only annotate a method", pos) + None + } + } + + /** A class responsible for extracting the docstrings of a method. */ + private class Documentation(docComment: Option[Comment]): + import util.CommentParsing._ + + /** The main part of the documentation. */ + lazy val mainDoc: String = _mainDoc + /** The parameters identified by @param. Maps from parameter name to its documentation. 
*/ + lazy val argDocs: Map[String, String] = _argDocs + + private var _mainDoc: String = "" + private var _argDocs: Map[String, String] = Map() + + docComment match { + case Some(comment) => if comment.isDocComment then parseDocComment(comment.raw) else _mainDoc = comment.raw + case None => + } + + private def cleanComment(raw: String): String = + var lines: Seq[String] = raw.trim.nn.split('\n').nn.toSeq + lines = lines.map(l => l.substring(skipLineLead(l, -1), l.length).nn.trim.nn) + var s = lines.foldLeft("") { + case ("", s2) => s2 + case (s1, "") if s1.last == '\n' => s1 // Multiple newlines are kept as single newlines + case (s1, "") => s1 + '\n' + case (s1, s2) if s1.last == '\n' => s1 + s2 + case (s1, s2) => s1 + ' ' + s2 + } + s.replaceAll(raw"\[\[", "").nn.replaceAll(raw"\]\]", "").nn.trim.nn + + private def parseDocComment(raw: String): Unit = + // Positions of the sections (@) in the docstring + val tidx: List[(Int, Int)] = tagIndex(raw) + + // Parse main comment + var mainComment: String = raw.substring(skipLineLead(raw, 0), startTag(raw, tidx)).nn + _mainDoc = cleanComment(mainComment) + + // Parse arguments comments + val argsCommentsSpans: Map[String, (Int, Int)] = paramDocs(raw, "@param", tidx) + val argsCommentsTextSpans = argsCommentsSpans.view.mapValues(extractSectionText(raw, _)) + val argsCommentsTexts = argsCommentsTextSpans.mapValues({ case (beg, end) => raw.substring(beg, end).nn }) + _argDocs = argsCommentsTexts.mapValues(cleanComment(_)).toMap + end Documentation +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/NavigateAST.scala b/tests/pos-with-compiler-cc/dotc/ast/NavigateAST.scala new file mode 100644 index 000000000000..054ffe66f323 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/NavigateAST.scala @@ -0,0 +1,129 @@ +package dotty.tools.dotc +package ast + +import core.Contexts._ +import core.Decorators._ +import util.Spans._ +import Trees.{MemberDef, DefTree, WithLazyField} +import dotty.tools.dotc.core.Types.AnnotatedType +import dotty.tools.dotc.core.Types.ImportType +import dotty.tools.dotc.core.Types.Type + +/** Utility functions to go from typed to untyped ASTs */ +// TODO: Handle trees with mixed source files +object NavigateAST { + + /** The untyped tree corresponding to typed tree `tree` in the compilation + * unit specified by `ctx` + */ + def toUntyped(tree: tpd.Tree)(using Context): untpd.Tree = + untypedPath(tree, exactMatch = true) match { + case (utree: untpd.Tree) :: _ => + utree + case _ => + val loosePath = untypedPath(tree, exactMatch = false) + throw new + Error(i"""no untyped tree for $tree, pos = ${tree.sourcePos} + |best matching path =\n$loosePath%\n====\n% + |path positions = ${loosePath.map(_.sourcePos)}""") + } + + /** The reverse path of untyped trees starting with a tree that closest matches + * `tree` and ending in the untyped tree at the root of the compilation unit + * specified by `ctx`. + * @param exactMatch If `true`, the path must start with a node that exactly + * matches `tree`, or `Nil` is returned. + * If `false` the path might start with a node enclosing + * the logical position of `tree`. + * Note: A complication concerns member definitions. ValDefs and DefDefs + * have after desugaring a position that spans just the name of the symbol being + * defined and nothing else. So we look instead for an untyped tree approximating the + * envelope of the definition, and declare success if we find another DefTree. 
+   */
+  def untypedPath(tree: tpd.Tree, exactMatch: Boolean = false)(using Context): List[Positioned] =
+    tree match {
+      case tree: MemberDef[?] =>
+        untypedPath(tree.span) match {
+          case path @ (last: DefTree[?]) :: _ => path
+          case path if !exactMatch => path
+          case _ => Nil
+        }
+      case _ =>
+        untypedPath(tree.span) match {
+          case (path @ last :: _) if last.span == tree.span || !exactMatch => path
+          case _ => Nil
+        }
+    }
+
+  /** The reverse path from the untyped root of the compilation unit of `ctx` to
+   *  the given `span`.
+   */
+  def untypedPath(span: Span)(using Context): List[Positioned] =
+    pathTo(span, List(ctx.compilationUnit.untpdTree))
+
+
+  /** The reverse path from any node in `from` to the node that closest encloses `span`,
+   *  or `Nil` if no such path exists. If a non-empty path is returned it starts with
+   *  the node closest enclosing `span` and ends with one of the nodes in `from`.
+   *
+   *  @param skipZeroExtent If true, skip over zero-extent nodes in the search. These nodes
+   *                        do not correspond to code the user wrote since their start and
+   *                        end point are the same, so this is useful when trying to reconcile
+   *                        nodes with source code.
+   */
+  def pathTo(span: Span, from: List[Positioned], skipZeroExtent: Boolean = false)(using Context): List[Positioned] = {
+    def childPath(it: Iterator[Any], path: List[Positioned]): List[Positioned] = {
+      var bestFit: List[Positioned] = path
+      while (it.hasNext) {
+        val path1 = it.next() match {
+          case p: Positioned => singlePath(p, path)
+          case m: untpd.Modifiers => childPath(m.productIterator, path)
+          case xs: List[?] => childPath(xs.iterator, path)
+          case _ => path
+        }
+        if ((path1 ne path) &&
+            ((bestFit eq path) ||
+             bestFit.head.span != path1.head.span &&
+             bestFit.head.span.contains(path1.head.span)))
+          bestFit = path1
+      }
+      bestFit
+    }
+    /*
+     * Annotation trees are located in the Type
+     */
+    def unpackAnnotations(t: Type, path: List[Positioned]): List[Positioned] =
+      t match {
+        case ann: AnnotatedType =>
+          unpackAnnotations(ann.parent, childPath(ann.annot.tree.productIterator, path))
+        case imp: ImportType =>
+          childPath(imp.expr.productIterator, path)
+        case other =>
+          path
+      }
+    def singlePath(p: Positioned, path: List[Positioned]): List[Positioned] =
+      if (p.span.exists && !(skipZeroExtent && p.span.isZeroExtent) && p.span.contains(span)) {
+        // FIXME: We shouldn't be manually forcing trees here, we should replace
+        // our usage of `productIterator` by something in `Positioned` that takes
+        // care of low-level details like this for us.
+        p match {
+          case p: WithLazyField[?] =>
+            p.forceIfLazy
+          case _ =>
+        }
+        val iterator = p match
+          case defdef: DefTree[?]
=> + p.productIterator ++ defdef.mods.productIterator + case _ => + p.productIterator + childPath(iterator, p :: path) + } + else { + p match { + case t: untpd.TypeTree => unpackAnnotations(t.typeOpt, path) + case _ => path + } + } + childPath(from.iterator, Nil) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/Positioned.scala b/tests/pos-with-compiler-cc/dotc/ast/Positioned.scala new file mode 100644 index 000000000000..fd30d441a6ee --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/Positioned.scala @@ -0,0 +1,246 @@ +package dotty.tools +package dotc +package ast + +import util.Spans._ +import util.{SourceFile, SourcePosition, SrcPos} +import core.Contexts._ +import core.Decorators._ +import core.NameOps._ +import core.Flags.{JavaDefined, ExtensionMethod} +import core.StdNames.nme +import ast.Trees.mods +import annotation.constructorOnly +import annotation.internal.sharable + +/** A base class for things that have positions (currently: modifiers and trees) + */ +abstract class Positioned(implicit @constructorOnly src: SourceFile) extends SrcPos, Product, Cloneable, caps.Pure { + import Positioned.{ids, nextId, debugId} + + private var mySpan: Span = _ + + private var mySource: SourceFile = src + + /** A unique identifier in case -Yshow-tree-ids, or -Ydebug-tree-with-id + * is set, -1 otherwise. + */ + def uniqueId: Int = + if ids != null && ids.nn.containsKey(this) then ids.nn.get(this).nn else -1 + + private def allocateId() = + if ids != null then + val ownId = nextId + nextId += 1 + ids.nn.put(this: @unchecked, ownId) + if ownId == debugId then + println(s"Debug tree (id=$debugId) creation \n${this: @unchecked}\n") + Thread.dumpStack() + + allocateId() + + /** The span part of the item's position */ + def span: Span = mySpan + + def span_=(span: Span): Unit = + mySpan = span + + span = envelope(src) + + def source: SourceFile = mySource + + def sourcePos(using Context): SourcePosition = source.atSpan(span) + + /** This positioned item, widened to `SrcPos`. Used to make clear we only need the + * position, typically for error reporting. + */ + final def srcPos: SrcPos = this + + /** A positioned item like this one with given `span`. + * If the positioned item is source-derived, a clone is returned. + * If the positioned item is synthetic, the position is updated + * destructively and the item itself is returned. + */ + def withSpan(span: Span): this.type = + if (span == mySpan) this + else { + val newpd: this.type = + if !mySpan.exists then + if span.exists then envelope(source, span.startPos) // fill in children spans + this + else + cloneIn(source) + newpd.span = span + newpd + } + + /** The union of startSpan and the spans of all positioned children that + * have the same source as this node, except that Inlined nodes only + * consider their `call` child. + * + * Side effect: Any descendants without spans have but with the same source as this + * node have their span set to the end position of the envelope of all children to + * the left, or, if that one does not exist, to the start position of the envelope + * of all children to the right. 
+ */ + def envelope(src: SourceFile, startSpan: Span = NoSpan): Span = (this: @unchecked) match { + case Trees.Inlined(call, _, _) => + call.span + case _ => + def include(span: Span, x: Any): Span = x match { + case p: Positioned => + if (p.source != src) span + else if (p.span.exists) span.union(p.span) + else if (span.exists) { + if (span.end != MaxOffset) + p.span = p.envelope(src, span.endPos) + span + } + else // No span available to assign yet, signal this by returning a span with MaxOffset end + Span(MaxOffset, MaxOffset) + case m: untpd.Modifiers => + include(include(span, m.mods), m.annotations) + case y :: ys => + include(include(span, y), ys) + case _ => span + } + val limit = productArity + def includeChildren(span: Span, n: Int): Span = + if (n < limit) includeChildren(include(span, productElement(n): @unchecked), n + 1) + else span + val span1 = includeChildren(startSpan, 0) + val span2 = + if (!span1.exists || span1.end != MaxOffset) + span1 + else if (span1.start == MaxOffset) + // No positioned child was found + NoSpan + else + ///println(s"revisit $uniqueId with $span1") + // We have some children left whose span could not be assigned. + // Go through it again with the known start position. + includeChildren(span1.startPos, 0) + span2.toSynthetic + } + + /** Clone this node but assign it a fresh id which marks it as a node in `file`. */ + def cloneIn(src: SourceFile): this.type = { + val newpd: this.type = clone.asInstanceOf[this.type] + newpd.allocateId() + newpd.mySource = src + newpd + } + + def contains(that: Positioned): Boolean = { + def isParent(x: Any): Boolean = x match { + case x: Positioned => + x.contains(that) + case m: untpd.Modifiers => + m.mods.exists(isParent) || m.annotations.exists(isParent) + case xs: List[?] => + xs.exists(isParent) + case _ => + false + } + (this eq that) || + (this.span contains that.span) && { + var n = productArity + var found = false + while (!found && n > 0) { + n -= 1 + found = isParent(productElement(n)) + } + found + } + } + + /** Check that all positioned items in this tree satisfy the following conditions: + * - Parent spans contain child spans + * - If item is a non-empty tree, it has a position + */ + def checkPos(nonOverlapping: Boolean)(using Context): Unit = try { + import untpd._ + var lastPositioned: Positioned | Null = null + var lastSpan = NoSpan + def check(p: Any): Unit = p match { + case p: Positioned => + assert(span contains p.span, + i"""position error, parent span does not contain child span + |parent = $this # $uniqueId, + |parent span = $span, + |child = $p # ${p.uniqueId}, + |child span = ${p.span}""".stripMargin) + p match { + case tree: Tree if !tree.isEmpty => + assert(tree.span.exists, + s"position error: position not set for $tree # ${tree.uniqueId}") + case _ => + } + if nonOverlapping then + this match { + case _: XMLBlock => + // FIXME: Trees generated by the XML parser do not satisfy `checkPos` + case _: WildcardFunction + if lastPositioned.isInstanceOf[ValDef] && !p.isInstanceOf[ValDef] => + // ignore transition from last wildcard parameter to body + case _ => + assert(!lastSpan.exists || !p.span.exists || lastSpan.end <= p.span.start, + i"""position error, child positions overlap or in wrong order + |parent = $this + |1st child = $lastPositioned + |1st child span = $lastSpan + |2nd child = $p + |2nd child span = ${p.span}""".stripMargin) + } + lastPositioned = p + lastSpan = p.span + p.checkPos(nonOverlapping) + case m: untpd.Modifiers => + m.annotations.foreach(check) + m.mods.foreach(check) 
+      case xs: List[?] =>
+        xs.foreach(check)
+      case _ =>
+    }
+    this match {
+      case tree: DefDef if tree.name == nme.CONSTRUCTOR && tree.mods.is(JavaDefined) =>
+        // Special treatment for constructors coming from Java:
+        // Leave out leading type params, they are copied with wrong positions from parent class
+        check(tree.mods)
+        check(tree.trailingParamss)
+      case tree: DefDef if tree.mods.is(ExtensionMethod) =>
+        tree.paramss match
+          case vparams1 :: vparams2 :: rest if tree.name.isRightAssocOperatorName =>
+            // omit check for right-associative extension methods; their parameters were swapped
+          case _ =>
+            check(tree.paramss)
+        check(tree.tpt)
+        check(tree.rhs)
+      case _ =>
+        val end = productArity
+        var n = 0
+        while (n < end) {
+          check(productElement(n))
+          n += 1
+        }
+    }
+  }
+  catch {
+    case ex: AssertionError =>
+      println(i"error while checking $this")
+      throw ex
+  }
+}
+
+object Positioned {
+  @sharable private var debugId = Int.MinValue
+  @sharable private var ids: java.util.WeakHashMap[Positioned, Int] | Null = null
+  @sharable private var nextId: Int = 0
+
+  def init(using Context): Unit =
+    debugId = ctx.settings.YdebugTreeWithId.value
+    if ids == null && ctx.settings.YshowTreeIds.value
+       || debugId != ctx.settings.YdebugTreeWithId.default
+    then
+      ids = java.util.WeakHashMap()
+}
diff --git a/tests/pos-with-compiler-cc/dotc/ast/TreeInfo.scala b/tests/pos-with-compiler-cc/dotc/ast/TreeInfo.scala
new file mode 100644
index 000000000000..b650a0088de4
--- /dev/null
+++ b/tests/pos-with-compiler-cc/dotc/ast/TreeInfo.scala
@@ -0,0 +1,1070 @@
+package dotty.tools
+package dotc
+package ast
+
+import core._
+import Flags._, Trees._, Types._, Contexts._
+import Names._, StdNames._, NameOps._, Symbols._
+import typer.ConstFold
+import reporting.trace
+import dotty.tools.dotc.transform.SymUtils._
+import Decorators._
+import Constants.Constant
+import scala.collection.mutable
+
+import scala.annotation.tailrec
+
+trait TreeInfo[T <: Untyped] { self: Trees.Instance[T] =>
+
+  def unsplice(tree: Trees.Tree[T]): Trees.Tree[T] = tree
+
+  def isDeclarationOrTypeDef(tree: Tree): Boolean = unsplice(tree) match {
+    case DefDef(_, _, _, EmptyTree)
+      | ValDef(_, _, EmptyTree)
+      | TypeDef(_, _) => true
+    case _ => false
+  }
+
+  def isOpAssign(tree: Tree): Boolean = unsplice(tree) match {
+    case Apply(fn, _ :: _) =>
+      unsplice(fn) match {
+        case Select(_, name) if name.isOpAssignmentName => true
+        case _ => false
+      }
+    case _ => false
+  }
+
+  class MatchingArgs(params: List[Symbol], args: List[Tree])(using Context) {
+    def foreach(f: (Symbol, Tree) => Unit): Boolean = {
+      def recur(params: List[Symbol], args: List[Tree]): Boolean = params match {
+        case Nil => args.isEmpty
+        case param :: params1 =>
+          if (param.info.isRepeatedParam) {
+            for (arg <- args) f(param, arg)
+            true
+          }
+          else args match {
+            case Nil => false
+            case arg :: args1 =>
+              f(param, args.head)
+              recur(params1, args1)
+          }
+      }
+      recur(params, args)
+    }
+    def zipped: List[(Symbol, Tree)] = map((_, _))
+    def map[R](f: (Symbol, Tree) => R): List[R] = {
+      val b = List.newBuilder[R]
+      foreach(b += f(_, _))
+      b.result()
+    }
+  }
+
+  /** The method part of an application node, possibly enclosed in a block
+   *  with only valdefs as statements. The reason for also considering blocks
+   *  is that named arguments can transform a call into a block, e.g.
+ * (b = foo, a = bar) + * is transformed to + * { val x$1 = foo + * val x$2 = bar + * (x$2, x$1) + * } + */ + def methPart(tree: Tree): Tree = stripApply(tree) match { + case TypeApply(fn, _) => methPart(fn) + case AppliedTypeTree(fn, _) => methPart(fn) // !!! should not be needed + case Block(stats, expr) => methPart(expr) + case mp => mp + } + + /** If this is an application, its function part, stripping all + * Apply nodes (but leaving TypeApply nodes in). Otherwise the tree itself. + */ + def stripApply(tree: Tree): Tree = unsplice(tree) match { + case Apply(fn, _) => stripApply(fn) + case _ => tree + } + + /** If this is a block, its expression part */ + def stripBlock(tree: Tree): Tree = unsplice(tree) match { + case Block(_, expr) => stripBlock(expr) + case Inlined(_, _, expr) => stripBlock(expr) + case _ => tree + } + + def stripInlined(tree: Tree): Tree = unsplice(tree) match { + case Inlined(_, _, expr) => stripInlined(expr) + case _ => tree + } + + def stripAnnotated(tree: Tree): Tree = tree match { + case Annotated(arg, _) => arg + case _ => tree + } + + /** The number of arguments in an application */ + def numArgs(tree: Tree): Int = unsplice(tree) match { + case Apply(fn, args) => numArgs(fn) + args.length + case TypeApply(fn, _) => numArgs(fn) + case Block(_, expr) => numArgs(expr) + case _ => 0 + } + + /** All term arguments of an application in a single flattened list */ + def allArguments(tree: Tree): List[Tree] = unsplice(tree) match { + case Apply(fn, args) => allArguments(fn) ::: args + case TypeApply(fn, _) => allArguments(fn) + case Block(_, expr) => allArguments(expr) + case _ => Nil + } + + /** Is tree explicitly parameterized with type arguments? */ + def hasExplicitTypeArgs(tree: Tree): Boolean = tree match + case TypeApply(tycon, args) => + args.exists(arg => !arg.span.isZeroExtent && !tycon.span.contains(arg.span)) + case _ => false + + /** Is tree a path? */ + def isPath(tree: Tree): Boolean = unsplice(tree) match { + case Ident(_) | This(_) | Super(_, _) => true + case Select(qual, _) => isPath(qual) + case _ => false + } + + /** Is tree a self constructor call this(...)? I.e. a call to a constructor of the + * same object? + */ + def isSelfConstrCall(tree: Tree): Boolean = methPart(tree) match { + case Ident(nme.CONSTRUCTOR) | Select(This(_), nme.CONSTRUCTOR) => true + case _ => false + } + + /** Is tree a super constructor call? + */ + def isSuperConstrCall(tree: Tree): Boolean = methPart(tree) match { + case Select(Super(_, _), nme.CONSTRUCTOR) => true + case _ => false + } + + def isSuperSelection(tree: Tree): Boolean = unsplice(tree) match { + case Select(Super(_, _), _) => true + case _ => false + } + + def isSelfOrSuperConstrCall(tree: Tree): Boolean = methPart(tree) match { + case Ident(nme.CONSTRUCTOR) + | Select(This(_), nme.CONSTRUCTOR) + | Select(Super(_, _), nme.CONSTRUCTOR) => true + case _ => false + } + + /** Is tree a backquoted identifier or definition */ + def isBackquoted(tree: Tree): Boolean = tree.hasAttachment(Backquoted) + + /** Is tree a variable pattern? */ + def isVarPattern(pat: Tree): Boolean = unsplice(pat) match { + case x: Ident => x.name.isVarPattern && !isBackquoted(x) + case _ => false + } + + /** The first constructor definition in `stats` */ + def firstConstructor(stats: List[Tree]): Tree = stats match { + case (meth: DefDef) :: _ if meth.name.isConstructorName => meth + case stat :: stats => firstConstructor(stats) + case nil => EmptyTree + } + + /** Is tpt a vararg type of the form T* or => T*? 
*/ + def isRepeatedParamType(tpt: Tree)(using Context): Boolean = stripByNameType(tpt) match { + case tpt: TypeTree => tpt.typeOpt.isRepeatedParam + case AppliedTypeTree(Select(_, tpnme.REPEATED_PARAM_CLASS), _) => true + case _ => false + } + + /** Is this argument node of the form *, or is it a reference to + * such an argument ? The latter case can happen when an argument is lifted. + */ + def isWildcardStarArg(tree: Tree)(using Context): Boolean = unbind(tree) match { + case Typed(Ident(nme.WILDCARD_STAR), _) => true + case Typed(_, Ident(tpnme.WILDCARD_STAR)) => true + case Typed(_, tpt: TypeTree) => tpt.typeOpt.isRepeatedParam + case NamedArg(_, arg) => isWildcardStarArg(arg) + case arg => arg.typeOpt.widen.isRepeatedParam + } + + /** Is tree a type tree of the form `=> T` or (under pureFunctions) `{refs}-> T`? */ + def isByNameType(tree: Tree)(using Context): Boolean = + stripByNameType(tree) ne tree + + /** Strip `=> T` to `T` and (under pureFunctions) `{refs}-> T` to `T` */ + def stripByNameType(tree: Tree)(using Context): Tree = unsplice(tree) match + case ByNameTypeTree(t1) => t1 + case untpd.CapturingTypeTree(_, parent) => + val parent1 = stripByNameType(parent) + if parent1 eq parent then tree else parent1 + case _ => tree + + /** All type and value parameter symbols of this DefDef */ + def allParamSyms(ddef: DefDef)(using Context): List[Symbol] = + ddef.paramss.flatten.map(_.symbol) + + /** Does this argument list end with an argument of the form : _* ? */ + def isWildcardStarArgList(trees: List[Tree])(using Context): Boolean = + trees.nonEmpty && isWildcardStarArg(trees.last) + + /** Is the argument a wildcard argument of the form `_` or `x @ _`? + */ + def isWildcardArg(tree: Tree): Boolean = unbind(tree) match { + case Ident(nme.WILDCARD) => true + case _ => false + } + + /** Does this list contain a named argument tree? */ + def hasNamedArg(args: List[Any]): Boolean = args exists isNamedArg + val isNamedArg: Any => Boolean = (arg: Any) => arg.isInstanceOf[Trees.NamedArg[_]] + + /** Is this pattern node a catch-all (wildcard or variable) pattern? */ + def isDefaultCase(cdef: CaseDef): Boolean = cdef match { + case CaseDef(pat, EmptyTree, _) => isWildcardArg(pat) + case _ => false + } + + /** Does this CaseDef catch Throwable? */ + def catchesThrowable(cdef: CaseDef)(using Context): Boolean = + catchesAllOf(cdef, defn.ThrowableType) + + /** Does this CaseDef catch everything of a certain Type? */ + def catchesAllOf(cdef: CaseDef, threshold: Type)(using Context): Boolean = + isDefaultCase(cdef) || + cdef.guard.isEmpty && { + unbind(cdef.pat) match { + case Typed(Ident(nme.WILDCARD), tpt) => threshold <:< tpt.typeOpt + case _ => false + } + } + + /** Is this case guarded? */ + def isGuardedCase(cdef: CaseDef): Boolean = cdef.guard ne EmptyTree + + /** Is this parameter list a using clause? 
*/ + def isUsingClause(params: ParamClause)(using Context): Boolean = params match + case ValDefs(vparam :: _) => + val sym = vparam.symbol + if sym.exists then sym.is(Given) else vparam.mods.is(Given) + case _ => + false + + def isUsingOrTypeParamClause(params: ParamClause)(using Context): Boolean = params match + case TypeDefs(_) => true + case _ => isUsingClause(params) + + def isTypeParamClause(params: ParamClause)(using Context): Boolean = params match + case TypeDefs(_) => true + case _ => false + + private val languageSubCategories = Set(nme.experimental, nme.deprecated) + + /** If `path` looks like a language import, `Some(name)` where name + * is `experimental` if that sub-module is imported, and the empty + * term name otherwise. + */ + def languageImport(path: Tree): Option[TermName] = path match + case Select(p1, name: TermName) if languageSubCategories.contains(name) => + languageImport(p1) match + case Some(EmptyTermName) => Some(name) + case _ => None + case p1: RefTree if p1.name == nme.language => + p1.qualifier match + case EmptyTree => Some(EmptyTermName) + case p2: RefTree if p2.name == nme.scala => + p2.qualifier match + case EmptyTree => Some(EmptyTermName) + case Ident(nme.ROOTPKG) => Some(EmptyTermName) + case _ => None + case _ => None + case _ => None + + /** The underlying pattern ignoring any bindings */ + def unbind(x: Tree): Tree = unsplice(x) match { + case Bind(_, y) => unbind(y) + case y => y + } + + /** The largest subset of {NoInits, PureInterface} that a + * trait or class with these parents can have as flags. + */ + def parentsKind(parents: List[Tree])(using Context): FlagSet = parents match { + case Nil => NoInitsInterface + case Apply(_, _ :: _) :: _ => EmptyFlags + case _ :: parents1 => parentsKind(parents1) + } + + /** Checks whether predicate `p` is true for all result parts of this expression, + * where we zoom into Ifs, Matches, and Blocks. + */ + def forallResults(tree: Tree, p: Tree => Boolean): Boolean = tree match { + case If(_, thenp, elsep) => forallResults(thenp, p) && forallResults(elsep, p) + case Match(_, cases) => cases forall (c => forallResults(c.body, p)) + case Block(_, expr) => forallResults(expr, p) + case _ => p(tree) + } +} + +trait UntypedTreeInfo extends TreeInfo[Untyped] { self: Trees.Instance[Untyped] => + import untpd._ + + /** The underlying tree when stripping any TypedSplice or Parens nodes */ + override def unsplice(tree: Tree): Tree = tree match { + case TypedSplice(tree1) => tree1 + case Parens(tree1) => unsplice(tree1) + case _ => tree + } + + def functionWithUnknownParamType(tree: Tree): Option[Tree] = tree match { + case Function(args, _) => + if (args.exists { + case ValDef(_, tpt, _) => tpt.isEmpty + case _ => false + }) Some(tree) + else None + case Match(EmptyTree, _) => + Some(tree) + case Block(Nil, expr) => + functionWithUnknownParamType(expr) + case _ => + None + } + + def isFunctionWithUnknownParamType(tree: Tree): Boolean = + functionWithUnknownParamType(tree).isDefined + + def isFunction(tree: Tree): Boolean = tree match + case Function(_, _) | Match(EmptyTree, _) => true + case Block(Nil, expr) => isFunction(expr) + case _ => false + + /** Is `tree` an context function or closure, possibly nested in a block? 
*/ + def isContextualClosure(tree: Tree)(using Context): Boolean = unsplice(tree) match { + case tree: FunctionWithMods => tree.mods.is(Given) + case Function((param: untpd.ValDef) :: _, _) => param.mods.is(Given) + case Closure(_, meth, _) => true + case Block(Nil, expr) => isContextualClosure(expr) + case Block(DefDef(nme.ANON_FUN, params :: _, _, _) :: Nil, cl: Closure) => + if params.isEmpty then + cl.tpt.eq(untpd.ContextualEmptyTree) || defn.isContextFunctionType(cl.tpt.typeOpt) + else + isUsingClause(params) + case _ => false + } + + /** The largest subset of {NoInits, PureInterface} that a + * trait or class enclosing this statement can have as flags. + */ + private def defKind(tree: Tree)(using Context): FlagSet = unsplice(tree) match { + case EmptyTree | _: Import => NoInitsInterface + case tree: TypeDef => if (tree.isClassDef) NoInits else NoInitsInterface + case tree: DefDef => + if tree.unforcedRhs == EmptyTree + && tree.paramss.forall { + case ValDefs(vparams) => vparams.forall(_.rhs.isEmpty) + case _ => true + } + then + NoInitsInterface + else if tree.mods.is(Given) && tree.paramss.isEmpty then + EmptyFlags // might become a lazy val: TODO: check whether we need to suppress NoInits once we have new lazy val impl + else + NoInits + case tree: ValDef => if (tree.unforcedRhs == EmptyTree) NoInitsInterface else EmptyFlags + case _ => EmptyFlags + } + + /** The largest subset of {NoInits, PureInterface} that a + * trait or class with this body can have as flags. + */ + def bodyKind(body: List[Tree])(using Context): FlagSet = + body.foldLeft(NoInitsInterface)((fs, stat) => fs & defKind(stat)) + + /** Info of a variable in a pattern: The named tree and its type */ + type VarInfo = (NameTree, Tree) + + /** An extractor for trees of the form `id` or `id: T` */ + object IdPattern { + def unapply(tree: Tree)(using Context): Option[VarInfo] = tree match { + case id: Ident if id.name != nme.WILDCARD => Some(id, TypeTree()) + case Typed(id: Ident, tpt) => Some((id, tpt)) + case _ => None + } + } + + /** Under pureFunctions: A builder and extractor for `=> T`, which is an alias for `{*}-> T`. + * Only trees of the form `=> T` are matched; trees written directly as `{*}-> T` + * are ignored by the extractor. + */ + object ImpureByNameTypeTree: + + def apply(tp: ByNameTypeTree)(using Context): untpd.CapturingTypeTree = + untpd.CapturingTypeTree( + untpd.captureRoot.withSpan(tp.span.startPos) :: Nil, tp) + + def unapply(tp: Tree)(using Context): Option[ByNameTypeTree] = tp match + case untpd.CapturingTypeTree(id @ Select(_, nme.CAPTURE_ROOT) :: Nil, bntp: ByNameTypeTree) + if id.span == bntp.span.startPos => Some(bntp) + case _ => None + end ImpureByNameTypeTree +} + +trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] => + import TreeInfo._ + import tpd._ + + /** The purity level of this statement. + * @return Pure if statement has no side effects + * Idempotent if running the statement a second time has no side effects + * Impure otherwise + */ + def statPurity(tree: Tree)(using Context): PurityLevel = unsplice(tree) match { + case EmptyTree + | TypeDef(_, _) + | Import(_, _) + | DefDef(_, _, _, _) => + Pure + case vdef @ ValDef(_, _, _) => + if (vdef.symbol.flags is Mutable) Impure else exprPurity(vdef.rhs) `min` Pure + case _ => + Impure + // TODO: It seem like this should be exprPurity(tree) + // But if we do that the repl/vars test break. Need to figure out why that's the case. + } + + /** The purity level of this expression. 
See docs for PurityLevel for what that means + * + * Note that purity and idempotency are treated differently. + * References to modules and lazy vals are impure (side-effecting) both because + * side-effecting code may be executed and because the first reference + * takes a different code path than all to follow; but they are idempotent + * because running the expression a second time gives the cached result. + */ + def exprPurity(tree: Tree)(using Context): PurityLevel = unsplice(tree) match { + case EmptyTree + | This(_) + | Super(_, _) + | Literal(_) => + PurePath + case Ident(_) => + refPurity(tree) + case Select(qual, _) => + if (tree.symbol.is(Erased)) Pure + else refPurity(tree) `min` exprPurity(qual) + case New(_) | Closure(_, _, _) => + Pure + case TypeApply(fn, _) => + if (fn.symbol.is(Erased) || fn.symbol == defn.QuotedTypeModule_of || fn.symbol == defn.Predef_classOf) Pure else exprPurity(fn) + case Apply(fn, args) => + if isPureApply(tree, fn) then + minOf(exprPurity(fn), args.map(exprPurity)) `min` Pure + else if fn.symbol.is(Erased) then + Pure + else if fn.symbol.isStableMember /* && fn.symbol.is(Lazy) */ then + minOf(exprPurity(fn), args.map(exprPurity)) `min` Idempotent + else + Impure + case Typed(expr, _) => + exprPurity(expr) + case Block(stats, expr) => + minOf(exprPurity(expr), stats.map(statPurity)) + case Inlined(_, bindings, expr) => + minOf(exprPurity(expr), bindings.map(statPurity)) + case NamedArg(_, expr) => + exprPurity(expr) + case _ => + Impure + } + + private def minOf(l0: PurityLevel, ls: List[PurityLevel]) = ls.foldLeft(l0)(_ `min` _) + + def isPurePath(tree: Tree)(using Context): Boolean = tree.tpe match { + case tpe: ConstantType => exprPurity(tree) >= Pure + case _ => exprPurity(tree) == PurePath + } + + def isPureExpr(tree: Tree)(using Context): Boolean = + exprPurity(tree) >= Pure + + def isIdempotentPath(tree: Tree)(using Context): Boolean = tree.tpe match { + case tpe: ConstantType => exprPurity(tree) >= Idempotent + case _ => exprPurity(tree) >= IdempotentPath + } + + def isIdempotentExpr(tree: Tree)(using Context): Boolean = + exprPurity(tree) >= Idempotent + + def isPureBinding(tree: Tree)(using Context): Boolean = statPurity(tree) >= Pure + + /** Is the application `tree` with function part `fn` known to be pure? + * Function value and arguments can still be impure. + */ + def isPureApply(tree: Tree, fn: Tree)(using Context): Boolean = + def isKnownPureOp(sym: Symbol) = + sym.owner.isPrimitiveValueClass + || sym.owner == defn.StringClass + || defn.pureMethods.contains(sym) + tree.tpe.isInstanceOf[ConstantType] && tree.symbol != NoSymbol && isKnownPureOp(tree.symbol) // A constant expression with pure arguments is pure. + || fn.symbol.isStableMember && !fn.symbol.is(Lazy) // constructors of no-inits classes are stable + + /** The purity level of this reference. + * @return + * PurePath if reference is (nonlazy and stable) + * or to a parameterized function + * or its type is a constant type + * IdempotentPath if reference is lazy and stable + * Impure otherwise + * @DarkDimius: need to make sure that lazy accessor methods have Lazy and Stable + * flags set. 
+ */ + def refPurity(tree: Tree)(using Context): PurityLevel = { + val sym = tree.symbol + if (!tree.hasType) Impure + else if !tree.tpe.widen.isParameterless then PurePath + else if sym.is(Erased) then PurePath + else if tree.tpe.isInstanceOf[ConstantType] then PurePath + else if (!sym.isStableMember) Impure + else if (sym.is(Module)) + if (sym.moduleClass.isNoInitsRealClass) PurePath else IdempotentPath + else if (sym.is(Lazy)) IdempotentPath + else if sym.isAllOf(InlineParam) then Impure + else PurePath + } + + def isPureRef(tree: Tree)(using Context): Boolean = + refPurity(tree) == PurePath + def isIdempotentRef(tree: Tree)(using Context): Boolean = + refPurity(tree) >= IdempotentPath + + /** (1) If `tree` is a constant expression, its value as a Literal, + * or `tree` itself otherwise. + * + * Note: Demanding idempotency instead of purity in literalize is strictly speaking too loose. + * Example + * + * object O { final val x = 42; println("43") } + * O.x + * + * Strictly speaking we can't replace `O.x` with `42`. But this would make + * most expressions non-constant. Maybe we can change the spec to accept this + * kind of eliding behavior. Or else enforce true purity in the compiler. + * The choice will be affected by what we will do with `inline` and with + * Singleton type bounds (see SIP 23). Presumably + * + * object O1 { val x: Singleton = 42; println("43") } + * object O2 { inline val x = 42; println("43") } + * + * should behave differently. + * + * O1.x should have the same effect as { println("43"); 42 } + * + * whereas + * + * O2.x = 42 + * + * Revisit this issue once we have standardized on `inline`. Then we can demand + * purity of the prefix unless the selection goes to a inline val. + * + * Note: This method should be applied to all term tree nodes that are not literals, + * that can be idempotent, and that can have constant types. So far, only nodes + * of the following classes qualify: + * + * Ident + * Select + * TypeApply + * + * (2) A primitive unary operator expression `pre.op` where `op` is one of `+`, `-`, `~`, `!` + * that has a constant type `ConstantType(v)` but that is not a constant expression + * (i.e. `pre` has side-effects) is translated to + * + * { pre; v } + * + * (3) An expression `pre.getClass[..]()` that has a constant type `ConstantType(v)` but where + * `pre` has side-effects is translated to: + * + * { pre; v } + * + * This avoids the situation where we have a Select node that does not have a symbol. + */ + def constToLiteral(tree: Tree)(using Context): Tree = { + assert(!tree.isType) + val tree1 = ConstFold(tree) + tree1.tpe.widenTermRefExpr.dealias.normalized match { + case ConstantType(Constant(_: Type)) if tree.isInstanceOf[Block] => + // We can't rewrite `{ class A; classOf[A] }` to `classOf[A]`, so we leave + // blocks returning a class literal alone, even if they're idempotent. 
+ tree1 + case ConstantType(value) => + def dropOp(t: Tree): Tree = t match + case Select(pre, _) if t.tpe.isInstanceOf[ConstantType] => + // it's a primitive unary operator + pre + case Apply(TypeApply(Select(pre, nme.getClass_), _), Nil) => + pre + case _ => + tree1 + + val countsAsPure = + if dropOp(tree1).symbol.isInlineVal + then isIdempotentExpr(tree1) + else isPureExpr(tree1) + + if countsAsPure then Literal(value).withSpan(tree.span) + else + val pre = dropOp(tree1) + if pre eq tree1 then tree1 + else + // it's a primitive unary operator or getClass call; + // Simplify `pre.op` to `{ pre; v }` where `v` is the value of `pre.op` + Block(pre :: Nil, Literal(value)).withSpan(tree.span) + case _ => tree1 + } + } + + def isExtMethodApply(tree: Tree)(using Context): Boolean = methPart(tree) match + case Inlined(call, _, _) => isExtMethodApply(call) + case tree @ Select(qual, nme.apply) => tree.symbol.is(ExtensionMethod) || isExtMethodApply(qual) + case tree => tree.symbol.is(ExtensionMethod) + + /** Is symbol potentially a getter of a mutable variable? + */ + def mayBeVarGetter(sym: Symbol)(using Context): Boolean = { + def maybeGetterType(tpe: Type): Boolean = tpe match { + case _: ExprType => true + case tpe: MethodType => tpe.isImplicitMethod + case tpe: PolyType => maybeGetterType(tpe.resultType) + case _ => false + } + sym.owner.isClass && !sym.isStableMember && maybeGetterType(sym.info) + } + + /** Is tree a reference to a mutable variable, or to a potential getter + * that has a setter in the same class? + */ + def isVariableOrGetter(tree: Tree)(using Context): Boolean = { + def sym = tree.symbol + def isVar = sym.is(Mutable) + def isGetter = + mayBeVarGetter(sym) && sym.owner.info.member(sym.name.asTermName.setterName).exists + + unsplice(tree) match { + case Ident(_) => isVar + case Select(_, _) => isVar || isGetter + case Apply(_, _) => + methPart(tree) match { + case Select(qual, nme.apply) => qual.tpe.member(nme.update).exists + case _ => false + } + case _ => false + } + } + + /** Is tree a `this` node which belongs to `enclClass`? 
*/ + def isSelf(tree: Tree, enclClass: Symbol)(using Context): Boolean = unsplice(tree) match { + case This(_) => tree.symbol == enclClass + case _ => false + } + + /** Strips layers of `.asInstanceOf[T]` / `_.$asInstanceOf[T]()` from an expression */ + def stripCast(tree: Tree)(using Context): Tree = { + def isCast(sel: Tree) = sel.symbol.isTypeCast + unsplice(tree) match { + case TypeApply(sel @ Select(inner, _), _) if isCast(sel) => + stripCast(inner) + case Apply(TypeApply(sel @ Select(inner, _), _), Nil) if isCast(sel) => + stripCast(inner) + case t => + t + } + } + + /** The type arguments of a possibly curried call */ + def typeArgss(tree: Tree): List[List[Tree]] = + @tailrec + def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match + case TypeApply(fn, args) => loop(fn, args :: argss) + case Apply(fn, args) => loop(fn, argss) + case _ => argss + loop(tree, Nil) + + /** The term arguments of a possibly curried call */ + def termArgss(tree: Tree): List[List[Tree]] = + @tailrec + def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match + case Apply(fn, args) => loop(fn, args :: argss) + case TypeApply(fn, args) => loop(fn, argss) + case _ => argss + loop(tree, Nil) + + /** The type and term arguments of a possibly curried call, in the order they are given */ + def allArgss(tree: Tree): List[List[Tree]] = + @tailrec + def loop(tree: Tree, argss: List[List[Tree]]): List[List[Tree]] = tree match + case tree: GenericApply => loop(tree.fun, tree.args :: argss) + case _ => argss + loop(tree, Nil) + + /** The function part of a possibly curried call. Unlike `methPart` this one does + * not decompose blocks + */ + def funPart(tree: Tree): Tree = tree match + case tree: GenericApply => funPart(tree.fun) + case tree => tree + + /** Decompose a template body into parameters and other statements */ + def decomposeTemplateBody(body: List[Tree])(using Context): (List[Tree], List[Tree]) = + body.partition { + case stat: TypeDef => stat.symbol is Flags.Param + case stat: ValOrDefDef => + stat.symbol.is(Flags.ParamAccessor) && !stat.symbol.isSetter + case _ => false + } + + /** An extractor for closures, either contained in a block or standalone. + */ + object closure { + def unapply(tree: Tree): Option[(List[Tree], Tree, Tree)] = tree match { + case Block(_, expr) => unapply(expr) + case Closure(env, meth, tpt) => Some(env, meth, tpt) + case Typed(expr, _) => unapply(expr) + case _ => None + } + } + + /** An extractor for def of a closure contained the block of the closure. */ + object closureDef { + def unapply(tree: Tree)(using Context): Option[DefDef] = tree match { + case Block((meth : DefDef) :: Nil, closure: Closure) if meth.symbol == closure.meth.symbol => + Some(meth) + case Block(Nil, expr) => + unapply(expr) + case _ => + None + } + } + + /** If tree is a closure, its body, otherwise tree itself */ + def closureBody(tree: Tree)(using Context): Tree = tree match { + case closureDef(meth) => meth.rhs + case _ => tree + } + + /** The variables defined by a pattern, in reverse order of their appearance. */ + def patVars(tree: Tree)(using Context): List[Symbol] = { + val acc = new TreeAccumulator[List[Symbol]] { + def apply(syms: List[Symbol], tree: Tree)(using Context) = tree match { + case Bind(_, body) => apply(tree.symbol :: syms, body) + case Annotated(tree, id @ Ident(tpnme.BOUNDTYPE_ANNOT)) => apply(id.symbol :: syms, tree) + case _ => foldOver(syms, tree) + } + } + acc(Nil, tree) + } + + /** Is this pattern node a catch-all or type-test pattern? 
*/ + def isCatchCase(cdef: CaseDef)(using Context): Boolean = cdef match { + case CaseDef(Typed(Ident(nme.WILDCARD), tpt), EmptyTree, _) => + isSimpleThrowable(tpt.tpe) + case CaseDef(Bind(_, Typed(Ident(nme.WILDCARD), tpt)), EmptyTree, _) => + isSimpleThrowable(tpt.tpe) + case _ => + isDefaultCase(cdef) + } + + private def isSimpleThrowable(tp: Type)(using Context): Boolean = tp match { + case tp @ TypeRef(pre, _) => + (pre == NoPrefix || pre.typeSymbol.isStatic) && + (tp.symbol derivesFrom defn.ThrowableClass) && !tp.symbol.is(Trait) + case _ => + false + } + + /** The symbols defined locally in a statement list */ + def localSyms(stats: List[Tree])(using Context): List[Symbol] = + val locals = new mutable.ListBuffer[Symbol] + for stat <- stats do + if stat.isDef && stat.symbol.exists then locals += stat.symbol + locals.toList + + /** If `tree` is a DefTree, the symbol defined by it, otherwise NoSymbol */ + def definedSym(tree: Tree)(using Context): Symbol = + if (tree.isDef) tree.symbol else NoSymbol + + /** Going from child to parent, the path of tree nodes that starts + * with a definition of symbol `sym` and ends with `root`, or Nil + * if no such path exists. + * Pre: `sym` must have a position. + */ + def defPath(sym: Symbol, root: Tree)(using Context): List[Tree] = trace.onDebug(s"defpath($sym with position ${sym.span}, ${root.show})") { + require(sym.span.exists, sym) + object accum extends TreeAccumulator[List[Tree]] { + def apply(x: List[Tree], tree: Tree)(using Context): List[Tree] = + if (tree.span.contains(sym.span)) + if (definedSym(tree) == sym) tree :: x + else { + val x1 = foldOver(x, tree) + if (x1 ne x) tree :: x1 else x1 + } + else x + } + accum(Nil, root) + } + + /** The top level classes in this tree, including only those module classes that + * are not a linked class of some other class in the result. + */ + def topLevelClasses(tree: Tree)(using Context): List[ClassSymbol] = tree match { + case PackageDef(_, stats) => stats.flatMap(topLevelClasses) + case tdef: TypeDef if tdef.symbol.isClass => tdef.symbol.asClass :: Nil + case _ => Nil + } + + /** The tree containing only the top-level classes and objects matching either `cls` or its companion object */ + def sliceTopLevel(tree: Tree, cls: ClassSymbol)(using Context): List[Tree] = tree match { + case PackageDef(pid, stats) => + val slicedStats = stats.flatMap(sliceTopLevel(_, cls)) + val isEffectivelyEmpty = slicedStats.forall(_.isInstanceOf[Import]) + if isEffectivelyEmpty then Nil + else cpy.PackageDef(tree)(pid, slicedStats) :: Nil + case tdef: TypeDef => + val sym = tdef.symbol + assert(sym.isClass) + if (cls == sym || cls == sym.linkedClass) tdef :: Nil + else Nil + case vdef: ValDef => + val sym = vdef.symbol + assert(sym.is(Module)) + if (cls == sym.companionClass || cls == sym.moduleClass) vdef :: Nil + else Nil + case tree => + tree :: Nil + } + + /** The statement sequence that contains a definition of `sym`, or Nil + * if none was found. + * For a tree to be found, The symbol must have a position and its definition + * tree must be reachable from come tree stored in an enclosing context. 
+ */ + def definingStats(sym: Symbol)(using Context): List[Tree] = + if (!sym.span.exists || (ctx eq NoContext) || (ctx.compilationUnit eq NoCompilationUnit)) Nil + else defPath(sym, ctx.compilationUnit.tpdTree) match { + case defn :: encl :: _ => + def verify(stats: List[Tree]) = + if (stats exists (definedSym(_) == sym)) stats else Nil + encl match { + case Block(stats, _) => verify(stats) + case encl: Template => verify(encl.body) + case PackageDef(_, stats) => verify(stats) + case _ => Nil + } + case nil => + Nil + } + + /** If `tree` is an instance of `TupleN[...](e1, ..., eN)`, the arguments `e1, ..., eN` + * otherwise the empty list. + */ + def tupleArgs(tree: Tree)(using Context): List[Tree] = tree match { + case Block(Nil, expr) => tupleArgs(expr) + case Inlined(_, Nil, expr) => tupleArgs(expr) + case Apply(fn: NameTree, args) + if fn.name == nme.apply && + fn.symbol.owner.is(Module) && + defn.isTupleClass(fn.symbol.owner.companionClass) => args + case _ => Nil + } + + /** The qualifier part of a Select or Ident. + * For an Ident, this is the `This` of the current class. + */ + def qualifier(tree: Tree)(using Context): Tree = tree match { + case Select(qual, _) => qual + case tree: Ident => desugarIdentPrefix(tree) + case _ => This(ctx.owner.enclosingClass.asClass) + } + + /** Is this a (potentially applied) selection of a member of a structural type + * that is not a member of an underlying class or trait? + */ + def isStructuralTermSelectOrApply(tree: Tree)(using Context): Boolean = { + def isStructuralTermSelect(tree: Select) = + def hasRefinement(qualtpe: Type): Boolean = qualtpe.dealias match + case RefinedType(parent, rname, rinfo) => + rname == tree.name || hasRefinement(parent) + case tp: TypeProxy => + hasRefinement(tp.superType) + case tp: AndType => + hasRefinement(tp.tp1) || hasRefinement(tp.tp2) + case tp: OrType => + hasRefinement(tp.tp1) || hasRefinement(tp.tp2) + case _ => + false + !tree.symbol.exists + && tree.isTerm + && { + val qualType = tree.qualifier.tpe + hasRefinement(qualType) && !qualType.derivesFrom(defn.PolyFunctionClass) + } + def loop(tree: Tree): Boolean = tree match + case TypeApply(fun, _) => + loop(fun) + case Apply(fun, _) => + loop(fun) + case tree: Select => + isStructuralTermSelect(tree) + case _ => + false + loop(tree) + } + + /** Return a pair consisting of (supercall, rest) + * + * - supercall: the superclass call, excluding trait constr calls + * + * The supercall is always the first statement (if it exists) + */ + final def splitAtSuper(constrStats: List[Tree])(implicit ctx: Context): (List[Tree], List[Tree]) = + constrStats.toList match { + case (sc: Apply) :: rest if sc.symbol.isConstructor => (sc :: Nil, rest) + case (block @ Block(_, sc: Apply)) :: rest if sc.symbol.isConstructor => (block :: Nil, rest) + case stats => (Nil, stats) + } + + /** Structural tree comparison (since == on trees is reference equality). 
+ * For the moment, only Ident, Select, Literal, Apply and TypeApply are supported + */ + extension (t1: Tree) { + def === (t2: Tree)(using Context): Boolean = (t1, t2) match { + case (t1: Ident, t2: Ident) => + t1.symbol == t2.symbol + case (t1 @ Select(q1, _), t2 @ Select(q2, _)) => + t1.symbol == t2.symbol && q1 === q2 + case (Literal(c1), Literal(c2)) => + c1 == c2 + case (Apply(f1, as1), Apply(f2, as2)) => + f1 === f2 && as1.corresponds(as2)(_ === _) + case (TypeApply(f1, ts1), TypeApply(f2, ts2)) => + f1 === f2 && ts1.tpes.corresponds(ts2.tpes)(_ =:= _) + case _ => + false + } + def hash(using Context): Int = + t1.getClass.hashCode * 37 + { + t1 match { + case t1: Ident => t1.symbol.hashCode + case t1 @ Select(q1, _) => t1.symbol.hashCode * 41 + q1.hash + case Literal(c1) => c1.hashCode + case Apply(f1, as1) => as1.foldLeft(f1.hash)((h, arg) => h * 41 + arg.hash) + case TypeApply(f1, ts1) => ts1.foldLeft(f1.hash)((h, arg) => h * 41 + arg.tpe.hash) + case _ => t1.hashCode + } + } + } + + def assertAllPositioned(tree: Tree)(using Context): Unit = + tree.foreachSubTree { + case t: WithoutTypeOrPos[_] => + case t => assert(t.span.exists, i"$t") + } + + /** Extractors for quotes */ + object Quoted { + /** Extracts the content of a quoted tree. + * The result can be the contents of a term or type quote, which + * will return a term or type tree respectively. + */ + def unapply(tree: tpd.Apply)(using Context): Option[tpd.Tree] = + if tree.symbol == defn.QuotedRuntime_exprQuote then + // quoted.runtime.Expr.quote[T]() + Some(tree.args.head) + else if tree.symbol == defn.QuotedTypeModule_of then + // quoted.Type.of[](quotes) + val TypeApply(_, body :: _) = tree.fun: @unchecked + Some(body) + else None + } + + /** Extractors for splices */ + object Spliced { + /** Extracts the content of a spliced expression tree. + * The result can be the contents of a term splice, which + * will return a term tree. + */ + def unapply(tree: tpd.Apply)(using Context): Option[tpd.Tree] = + if tree.symbol.isExprSplice then Some(tree.args.head) else None + } + + /** Extractors for type splices */ + object SplicedType { + /** Extracts the content of a spliced type tree. + * The result can be the contents of a type splice, which + * will return a type tree. + */ + def unapply(tree: tpd.Select)(using Context): Option[tpd.Tree] = + if tree.symbol.isTypeSplice then Some(tree.qualifier) else None + } + + /** Extractor for not-null assertions. + * A not-null assertion for reference `x` has the form `x.$asInstanceOf$[x.type & T]`. 
+ */ + object AssertNotNull : + def apply(tree: tpd.Tree, tpnn: Type)(using Context): tpd.Tree = + tree.select(defn.Any_typeCast).appliedToType(AndType(tree.tpe, tpnn)) + + def unapply(tree: tpd.TypeApply)(using Context): Option[tpd.Tree] = tree match + case TypeApply(Select(qual: RefTree, nme.asInstanceOfPM), arg :: Nil) => + arg.tpe match + case AndType(ref, nn1) if qual.tpe eq ref => + qual.tpe.widen match + case OrNull(nn2) if nn1 eq nn2 => + Some(qual) + case _ => None + case _ => None + case _ => None + end AssertNotNull + + object ConstantValue { + def unapply(tree: Tree)(using Context): Option[Any] = + tree match + case Typed(expr, _) => unapply(expr) + case Inlined(_, Nil, expr) => unapply(expr) + case Block(Nil, expr) => unapply(expr) + case _ => + tree.tpe.widenTermRefExpr.normalized match + case ConstantType(Constant(x)) => Some(x) + case _ => None + } +} + +object TreeInfo { + /** A purity level is represented as a bitset (expressed as an Int) */ + class PurityLevel(val x: Int) extends AnyVal { + /** `this` contains the bits of `that` */ + def >= (that: PurityLevel): Boolean = (x & that.x) == that.x + + /** The intersection of the bits of `this` and `that` */ + def min(that: PurityLevel): PurityLevel = new PurityLevel(x & that.x) + } + + /** An expression is a stable path. Requires that expression is at least idempotent */ + val Path: PurityLevel = new PurityLevel(4) + + /** The expression has no side effects */ + val Pure: PurityLevel = new PurityLevel(3) + + /** Running the expression a second time has no side effects. Implied by `Pure`. */ + val Idempotent: PurityLevel = new PurityLevel(1) + + val Impure: PurityLevel = new PurityLevel(0) + + /** A stable path that is evaluated without side effects */ + val PurePath: PurityLevel = new PurityLevel(Pure.x | Path.x) + + /** A stable path that is also idempotent */ + val IdempotentPath: PurityLevel = new PurityLevel(Idempotent.x | Path.x) +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/TreeMapWithImplicits.scala b/tests/pos-with-compiler-cc/dotc/ast/TreeMapWithImplicits.scala new file mode 100644 index 000000000000..caf8d68442f6 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/TreeMapWithImplicits.scala @@ -0,0 +1,82 @@ +package dotty.tools.dotc +package ast + +import Trees._ +import core.Contexts._ +import core.ContextOps.enter +import core.Flags._ +import core.Symbols._ +import core.TypeError + +/** A TreeMap that maintains the necessary infrastructure to support + * contextual implicit searches (type-scope implicits are supported anyway). + * + * This includes implicits defined in scope as well as imported implicits.
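+ * For example, a `given` or `implicit val` defined among the statements of a block is entered into a fresh nested context, so that implicit searches performed while transforming the block can still find it.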
+ */ +class TreeMapWithImplicits extends tpd.TreeMapWithPreciseStatContexts { + import tpd._ + + def transformSelf(vd: ValDef)(using Context): ValDef = + cpy.ValDef(vd)(tpt = transform(vd.tpt)) + + private def nestedScopeCtx(defs: List[Tree])(using Context): Context = { + val nestedCtx = ctx.fresh.setNewScope + defs foreach { + case d: DefTree if d.symbol.isOneOf(GivenOrImplicitVal) => nestedCtx.enter(d.symbol) + case _ => + } + nestedCtx + } + + private def patternScopeCtx(pattern: Tree)(using Context): Context = { + val nestedCtx = ctx.fresh.setNewScope + new TreeTraverser { + def traverse(tree: Tree)(using Context): Unit = { + tree match { + case d: DefTree if d.symbol.isOneOf(GivenOrImplicitVal) => + nestedCtx.enter(d.symbol) + case _ => + } + traverseChildren(tree) + } + }.traverse(pattern) + nestedCtx + } + + override def transform(tree: Tree)(using Context): Tree = { + try tree match { + case Block(stats, expr) => + super.transform(tree)(using nestedScopeCtx(stats)) + case tree: DefDef => + inContext(localCtx(tree)) { + cpy.DefDef(tree)( + tree.name, + transformParamss(tree.paramss), + transform(tree.tpt), + transform(tree.rhs)(using nestedScopeCtx(tree.paramss.flatten))) + } + case impl @ Template(constr, parents, self, _) => + cpy.Template(tree)( + transformSub(constr), + transform(parents)(using ctx.superCallContext), + Nil, + transformSelf(self), + transformStats(impl.body, tree.symbol)) + case tree: CaseDef => + val patCtx = patternScopeCtx(tree.pat)(using ctx) + cpy.CaseDef(tree)( + transform(tree.pat), + transform(tree.guard)(using patCtx), + transform(tree.body)(using patCtx) + ) + case _ => + super.transform(tree) + } + catch { + case ex: TypeError => + report.error(ex, tree.srcPos) + tree + } + } +} + diff --git a/tests/pos-with-compiler-cc/dotc/ast/TreeTypeMap.scala b/tests/pos-with-compiler-cc/dotc/ast/TreeTypeMap.scala new file mode 100644 index 000000000000..3b250118f9b3 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/TreeTypeMap.scala @@ -0,0 +1,232 @@ +package dotty.tools +package dotc +package ast + +import core._ +import Types._, Contexts._, Flags._ +import Symbols._, Annotations._, Trees._, Symbols._, Constants.Constant +import Decorators._ +import dotty.tools.dotc.transform.SymUtils._ +import language.experimental.pureFunctions + +/** A map that applies three functions and a substitution together to a tree and + * makes sure they are coordinated so that the result is well-typed. The functions are + * @param typeMap A function from Type to Type that gets applied to the + * type of every tree node and to all locally defined symbols, + * followed by the substitution [substFrom := substTo]. + * @param treeMap A transformer that translates all encountered subtrees in + * prefix traversal orders + * @param oldOwners Previous owners. If a top-level local symbol in the mapped tree + * has one of these as an owner, the owner is replaced by the corresponding + * symbol in `newOwners`. + * @param newOwners New owners, replacing previous owners. + * @param substFrom The symbols that need to be substituted. + * @param substTo The substitution targets. + * + * The reason the substitution is broken out from the rest of the type map is + * that all symbols have to be substituted at the same time. If we do not do this, + * we risk data races on named types. Example: Say we have `outer#1.inner#2` and we + * have two substitutions S1 = [outer#1 := outer#3], S2 = [inner#2 := inner#4] where + * hashtags precede symbol ids. If we do S1 first, we get outer#2.inner#3. 
If we then + * do S2 we get outer#2.inner#4. But that means that the named type outer#2.inner + * gets two different denotations in the same period. Hence, if -Yno-double-bindings is + * set, we would get a data race assertion error. + */ +class TreeTypeMap( + val typeMap: Type -> Type = IdentityTypeMap, + val treeMap: tpd.Tree -> tpd.Tree = identity[tpd.Tree](_), // !cc! need explicit instantiation of default argument + val oldOwners: List[Symbol] = Nil, + val newOwners: List[Symbol] = Nil, + val substFrom: List[Symbol] = Nil, + val substTo: List[Symbol] = Nil, + cpy: tpd.TreeCopier = tpd.cpy)(using DetachedContext) extends tpd.TreeMap(cpy) { + import tpd._ + + def copy( + typeMap: Type -> Type, + treeMap: tpd.Tree -> tpd.Tree, + oldOwners: List[Symbol], + newOwners: List[Symbol], + substFrom: List[Symbol], + substTo: List[Symbol])(using Context): TreeTypeMap = + new TreeTypeMap(typeMap, treeMap, oldOwners, newOwners, substFrom, substTo) + + /** If `sym` is one of `oldOwners`, replace by corresponding symbol in `newOwners` */ + def mapOwner(sym: Symbol): Symbol = sym.subst(oldOwners, newOwners) + + /** Replace occurrences of `This(oldOwner)` in some prefix of a type + * by the corresponding `This(newOwner)`. + */ + private val mapOwnerThis = new TypeMap with cc.CaptureSet.IdempotentCaptRefMap { + private def mapPrefix(from: List[Symbol], to: List[Symbol], tp: Type): Type = from match { + case Nil => tp + case (cls: ClassSymbol) :: from1 => mapPrefix(from1, to.tail, tp.substThis(cls, to.head.thisType)) + case _ :: from1 => mapPrefix(from1, to.tail, tp) + } + def apply(tp: Type): Type = tp match { + case tp: NamedType => tp.derivedSelect(mapPrefix(oldOwners, newOwners, tp.prefix)) + case _ => mapOver(tp) + } + } + + def mapType(tp: Type): Type = + mapOwnerThis(typeMap(tp).substSym(substFrom, substTo)) + + private def updateDecls(prevStats: List[Tree], newStats: List[Tree]): Unit = + if (prevStats.isEmpty) assert(newStats.isEmpty) + else { + prevStats.head match { + case pdef: MemberDef => + val prevSym = pdef.symbol + val newSym = newStats.head.symbol + val newCls = newSym.owner.asClass + if (prevSym != newSym) newCls.replace(prevSym, newSym) + case _ => + } + updateDecls(prevStats.tail, newStats.tail) + } + + def transformInlined(tree: tpd.Inlined)(using Context): tpd.Tree = + val Inlined(call, bindings, expanded) = tree + val (tmap1, bindings1) = transformDefs(bindings) + val expanded1 = tmap1.transform(expanded) + cpy.Inlined(tree)(call, bindings1, expanded1) + + override def transform(tree: tpd.Tree)(using Context): tpd.Tree = treeMap(tree) match { + case impl @ Template(constr, parents, self, _) => + val tmap = withMappedSyms(localSyms(impl :: self :: Nil)) + cpy.Template(impl)( + constr = tmap.transformSub(constr), + parents = parents.mapconserve(transform), + self = tmap.transformSub(self), + body = impl.body mapconserve + (tmap.transform(_)(using ctx.withOwner(mapOwner(impl.symbol.owner)))) + ).withType(tmap.mapType(impl.tpe)) + case tree1 => + tree1.withType(mapType(tree1.tpe)) match { + case id: Ident if tpd.needsSelect(id.tpe) => + ref(id.tpe.asInstanceOf[TermRef]).withSpan(id.span) + case ddef @ DefDef(name, paramss, tpt, _) => + val (tmap1, paramss1) = transformAllParamss(paramss) + val res = cpy.DefDef(ddef)(name, paramss1, tmap1.transform(tpt), tmap1.transform(ddef.rhs)) + res.symbol.setParamssFromDefs(paramss1) + res.symbol.transformAnnotations { + case ann: BodyAnnotation => ann.derivedAnnotation(transform(ann.tree)) + case ann => ann + } + res + case tdef @ 
LambdaTypeTree(tparams, body) => + val (tmap1, tparams1) = transformDefs(tparams) + cpy.LambdaTypeTree(tdef)(tparams1, tmap1.transform(body)) + case blk @ Block(stats, expr) => + val (tmap1, stats1) = transformDefs(stats) + val expr1 = tmap1.transform(expr) + cpy.Block(blk)(stats1, expr1) + case inlined: Inlined => + transformInlined(inlined) + case cdef @ CaseDef(pat, guard, rhs) => + val tmap = withMappedSyms(patVars(pat)) + val pat1 = tmap.transform(pat) + val guard1 = tmap.transform(guard) + val rhs1 = tmap.transform(rhs) + cpy.CaseDef(cdef)(pat1, guard1, rhs1) + case labeled @ Labeled(bind, expr) => + val tmap = withMappedSyms(bind.symbol :: Nil) + val bind1 = tmap.transformSub(bind) + val expr1 = tmap.transform(expr) + cpy.Labeled(labeled)(bind1, expr1) + case tree @ Hole(_, _, args, content, tpt) => + val args1 = args.mapConserve(transform) + val content1 = transform(content) + val tpt1 = transform(tpt) + cpy.Hole(tree)(args = args1, content = content1, tpt = tpt1) + case lit @ Literal(Constant(tpe: Type)) => + cpy.Literal(lit)(Constant(mapType(tpe))) + case tree1 => + super.transform(tree1) + } + } + + override def transformStats(trees: List[tpd.Tree], exprOwner: Symbol)(using Context): List[Tree] = + transformDefs(trees)._2 + + def transformDefs[TT <: tpd.Tree](trees: List[TT])(using Context): (TreeTypeMap, List[TT]) = { + val tmap = withMappedSyms(tpd.localSyms(trees)) + (tmap, tmap.transformSub(trees)) + } + + private def transformAllParamss(paramss: List[ParamClause]): (TreeTypeMap, List[ParamClause]) = paramss match + case params :: paramss1 => + val (tmap1, params1: ParamClause) = ((params: @unchecked) match + case ValDefs(vparams) => transformDefs(vparams) + case TypeDefs(tparams) => transformDefs(tparams) + ): @unchecked + val (tmap2, paramss2) = tmap1.transformAllParamss(paramss1) + (tmap2, params1 :: paramss2) + case nil => + (this, paramss) + + def apply[ThisTree <: tpd.Tree](tree: ThisTree): ThisTree = transform(tree).asInstanceOf[ThisTree] + + def apply(annot: Annotation): Annotation = annot.derivedAnnotation(apply(annot.tree)) + + /** The current tree map composed with a substitution [from -> to] */ + def withSubstitution(from: List[Symbol], to: List[Symbol]): TreeTypeMap = + if (from eq to) this + else { + // assert that substitution stays idempotent, assuming its parts are + // TODO: It might be better to cater for the asserted-away conditions, by + // setting up a proper substitution abstraction with a compose operator that + // guarantees idempotence. But this might be too inefficient in some cases. + // We'll cross that bridge when we need to. + assert(!from.exists(substTo contains _)) + assert(!to.exists(substFrom contains _)) + assert(!from.exists(newOwners contains _)) + assert(!to.exists(oldOwners contains _)) + copy( + typeMap, + treeMap, + from ++ oldOwners, + to ++ newOwners, + from ++ substFrom, + to ++ substTo) + } + + /** Apply `typeMap` and `ownerMap` to given symbols `syms` + * and return a treemap that contains the substitution + * between original and mapped symbols. + */ + def withMappedSyms(syms: List[Symbol]): TreeTypeMap = + withMappedSyms(syms, mapSymbols(syms, this)) + + /** The tree map with the substitution between originals `syms` + * and mapped symbols `mapped`. Also goes into mapped classes + * and substitutes their declarations. 
+ */ + def withMappedSyms(syms: List[Symbol], mapped: List[Symbol]): TreeTypeMap = + if syms eq mapped then this + else + val substMap = withSubstitution(syms, mapped) + lazy val origCls = mapped.zip(syms).filter(_._1.isClass).toMap + mapped.filter(_.isClass).foldLeft(substMap) { (tmap, cls) => + val origDcls = cls.info.decls.toList.filterNot(_.is(TypeParam)) + val tmap0 = tmap.withSubstitution(origCls(cls).typeParams, cls.typeParams) + val mappedDcls = mapSymbols(origDcls, tmap0, mapAlways = true) + val tmap1 = tmap.withMappedSyms( + origCls(cls).typeParams ::: origDcls, + cls.typeParams ::: mappedDcls) + origDcls.lazyZip(mappedDcls).foreach(cls.asClass.replace) + tmap1 + } + + override def toString = + def showSyms(syms: List[Symbol]) = + syms.map(sym => s"$sym#${sym.id}").mkString(", ") + s"""TreeTypeMap( + |typeMap = $typeMap + |treeMap = $treeMap + |oldOwners = ${showSyms(oldOwners)} + |newOwners = ${showSyms(newOwners)} + |substFrom = ${showSyms(substFrom)} + |substTo = ${showSyms(substTo)}""".stripMargin +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/Trees.scala b/tests/pos-with-compiler-cc/dotc/ast/Trees.scala new file mode 100644 index 000000000000..aa1c06a7ca85 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/Trees.scala @@ -0,0 +1,1787 @@ +package dotty.tools +package dotc +package ast + +import core._ +import Types._, Names._, NameOps._, Flags._, util.Spans._, Contexts._, Constants._ +import typer.{ ConstFold, ProtoTypes } +import SymDenotations._, Symbols._, Denotations._, StdNames._, Comments._ +import collection.mutable.ListBuffer +import printing.Printer +import printing.Texts.Text +import util.{Stats, Attachment, Property, SourceFile, NoSource, SrcPos, SourcePosition} +import config.Config +import config.Printers.overload +import annotation.internal.sharable +import annotation.unchecked.uncheckedVariance +import annotation.constructorOnly +import compiletime.uninitialized +import Decorators._ +import annotation.retains +import language.experimental.pureFunctions + +object Trees { + + type Untyped = Type | Null + + /** The total number of created tree nodes, maintained if Stats.enabled */ + @sharable var ntrees: Int = 0 + + /** Property key for trees with documentation strings attached */ + val DocComment: Property.StickyKey[Comments.Comment] = Property.StickyKey() + + /** Property key for backquoted identifiers and definitions */ + val Backquoted: Property.StickyKey[Unit] = Property.StickyKey() + + /** Trees take a parameter indicating what the type of their `tpe` field + * is. Two choices: `Type` or `Untyped`. + * Untyped trees have type `Tree[Untyped]`. + * + * Tree typing uses a copy-on-write implementation: + * + * - You can never observe a `tpe` which is `null` (throws an exception) + * - So when creating a typed tree with `withType` we can re-use + * the existing tree transparently, assigning its `tpe` field. + * - It is impossible to embed untyped trees in typed ones. + * - Typed trees can be embedded in untyped ones provided they are rooted + * in a TypedSplice node. + * - Type checking an untyped tree should remove all embedded `TypedSplice` + * nodes. + */ + abstract class Tree[+T <: Untyped](implicit @constructorOnly src: SourceFile) + extends Positioned, SrcPos, Product, Attachment.Container, printing.Showable { + + if (Stats.enabled) ntrees += 1 + + /** The type constructor at the root of the tree */ + type ThisTree[T <: Untyped] <: Tree[T] + + protected var myTpe: T @uncheckedVariance = uninitialized + + /** Destructively set the type of the tree. 
This should be called only when it is known that + * it is safe under sharing to do so. One use-case is in the withType method below + * which implements copy-on-write. Another use-case is in method interpolateAndAdapt in Typer, + * where we overwrite with a simplified version of the type itself. + */ + private[dotc] def overwriteType(tpe: T @uncheckedVariance): Unit = + myTpe = tpe + + /** The type of the tree. In case of an untyped tree, + * an UnAssignedTypeException is thrown. (Overridden by empty trees) + */ + final def tpe: T = + if myTpe == null then throw UnAssignedTypeException(this) + myTpe.uncheckedNN + + /** Copy `tpe` attribute from tree `from` into this tree, independently + * whether it is null or not. + final def copyAttr[U <: Untyped](from: Tree[U]): ThisTree[T] = { + val t1 = this.withSpan(from.span) + val t2 = + if (from.myTpe != null) t1.withType(from.myTpe.asInstanceOf[Type]) + else t1 + t2.asInstanceOf[ThisTree[T]] + } + */ + + /** Return a typed tree that's isomorphic to this tree, but has given + * type. (Overridden by empty trees) + */ + def withType(tpe: Type)(using Context): ThisTree[Type] = { + if (tpe.isInstanceOf[ErrorType]) + assert(!Config.checkUnreportedErrors || + ctx.reporter.errorsReported || + ctx.settings.YshowPrintErrors.value + // under -Yshow-print-errors, errors might arise during printing, but they do not count as reported + ) + else if (Config.checkTreesConsistent) + checkChildrenTyped(productIterator) + withTypeUnchecked(tpe) + } + + /** Check that typed trees don't refer to untyped ones, except if + * - the parent tree is an import, or + * - the child tree is an identifier, or + * - errors were reported + */ + private def checkChildrenTyped(it: Iterator[Any])(using Context): Unit = + if (!this.isInstanceOf[Import[?]]) + while (it.hasNext) + it.next() match { + case x: Ident[?] => // untyped idents are used in a number of places in typed trees + case x: Tree[?] => + assert(x.hasType || ctx.reporter.errorsReported, + s"$this has untyped child $x") + case xs: List[?] => checkChildrenTyped(xs.iterator) + case _ => + } + + def withTypeUnchecked(tpe: Type): ThisTree[Type] = { + val tree = + (if (myTpe == null || + (myTpe.asInstanceOf[AnyRef] eq tpe.asInstanceOf[AnyRef])) this + else cloneIn(source)).asInstanceOf[Tree[Type]] + tree overwriteType tpe + tree.asInstanceOf[ThisTree[Type]] + } + + /** Does the tree have its type field set? Note: this operation is not + * referentially transparent, because it can observe the withType + * modifications. Should be used only in special circumstances (we + * need it for printing trees with optional type info). + */ + final def hasType: Boolean = myTpe != null + + final def typeOpt: Type = myTpe match + case tp: Type => tp + case null => NoType + + /** The denotation referred to by this tree. + * Defined for `DenotingTree`s and `ProxyTree`s, NoDenotation for other + * kinds of trees + */ + def denot(using Context): Denotation = NoDenotation + + /** Shorthand for `denot.symbol`. */ + final def symbol(using Context): Symbol = denot.symbol + + /** Does this tree represent a type? */ + def isType: Boolean = false + + /** Does this tree represent a term? */ + def isTerm: Boolean = false + + /** Is this a legal part of a pattern which is not at the same time a term? */ + def isPattern: Boolean = false + + /** Does this tree define a new symbol that is not defined elsewhere? */ + def isDef: Boolean = false + + /** Is this tree either the empty tree or the empty ValDef or an empty type ident? 
*/ + def isEmpty: Boolean = false + + /** Convert tree to a list. Gives a singleton list, except + * for thickets which return their element trees. + */ + def toList: List[Tree[T]] = this :: Nil + + /** if this tree is the empty tree, the alternative, else this tree */ + inline def orElse[U >: T <: Untyped](inline that: Tree[U]): Tree[U] = + if (this eq genericEmptyTree) that else this + + /** The number of nodes in this tree */ + def treeSize: Int = { + var s = 1 + def addSize(elem: Any): Unit = elem match { + case t: Tree[?] => s += t.treeSize + case ts: List[?] => ts foreach addSize + case _ => + } + productIterator foreach addSize + s + } + + /** If this is a thicket, perform `op` on each of its trees + * otherwise, perform `op` ion tree itself. + */ + def foreachInThicket(op: Tree[T] => Unit): Unit = op(this) + + override def toText(printer: Printer): Text = printer.toText(this) + + def sameTree(that: Tree[?]): Boolean = { + def isSame(x: Any, y: Any): Boolean = + x.asInstanceOf[AnyRef].eq(y.asInstanceOf[AnyRef]) || { + x match { + case x: Tree[?] => + y match { + case y: Tree[?] => x.sameTree(y) + case _ => false + } + case x: List[?] => + y match { + case y: List[?] => x.corresponds(y)(isSame) + case _ => false + } + case _ => + false + } + } + this.getClass == that.getClass && { + val it1 = this.productIterator + val it2 = that.productIterator + it1.corresponds(it2)(isSame) + } + } + + override def hashCode(): Int = System.identityHashCode(this) + override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef] + } + + class UnAssignedTypeException[T <: Untyped](tree: Tree[T]) extends RuntimeException { + override def getMessage: String = s"type of $tree is not assigned" + } + + type LazyTree[+T <: Untyped] = Tree[T] | Lazy[Tree[T]] + type LazyTreeList[+T <: Untyped] = List[Tree[T]] | Lazy[List[Tree[T]]] + + // ------ Categories of trees ----------------------------------- + + /** Instances of this class are trees for which isType is definitely true. + * Note that some trees have isType = true without being TypTrees (e.g. Ident, Annotated) + */ + trait TypTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: TypTree[T] + override def isType: Boolean = true + } + + /** Instances of this class are trees for which isTerm is definitely true. + * Note that some trees have isTerm = true without being TermTrees (e.g. Ident, Annotated) + */ + trait TermTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: TermTree[T] + override def isTerm: Boolean = true + } + + /** Instances of this class are trees which are not terms but are legal + * parts of patterns. + */ + trait PatternTree[+T <: Untyped] extends Tree[T] { + type ThisTree[+T <: Untyped] <: PatternTree[T] + override def isPattern: Boolean = true + } + + /** Tree's denotation can be derived from its type */ + abstract class DenotingTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: DenotingTree[T] + override def denot(using Context): Denotation = typeOpt.stripped match + case tpe: NamedType => tpe.denot + case tpe: ThisType => tpe.cls.denot + case _ => NoDenotation + } + + /** Tree's denot/isType/isTerm properties come from a subtree + * identified by `forwardTo`. 
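+   *  For example, an `Apply` node forwards to its function part `fun`, and a `Typed` node forwards to its expression `expr`.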
+ */ + abstract class ProxyTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: ProxyTree[T] + def forwardTo: Tree[T] + override def denot(using Context): Denotation = forwardTo.denot + override def isTerm: Boolean = forwardTo.isTerm + override def isType: Boolean = forwardTo.isType + } + + /** Tree has a name */ + abstract class NameTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends DenotingTree[T] { + type ThisTree[+T <: Untyped] <: NameTree[T] + def name: Name + } + + /** Tree refers by name to a denotation */ + abstract class RefTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends NameTree[T] { + type ThisTree[+T <: Untyped] <: RefTree[T] + def qualifier: Tree[T] + override def isType: Boolean = name.isTypeName + override def isTerm: Boolean = name.isTermName + } + + /** Tree defines a new symbol */ + trait DefTree[+T <: Untyped] extends DenotingTree[T] { + type ThisTree[+T <: Untyped] <: DefTree[T] + + private var myMods: untpd.Modifiers | Null = uninitialized + + private[dotc] def rawMods: untpd.Modifiers = + if (myMods == null) untpd.EmptyModifiers else myMods.uncheckedNN + + def withAnnotations(annots: List[untpd.Tree]): ThisTree[Untyped] = withMods(rawMods.withAnnotations(annots)) + + def withMods(mods: untpd.Modifiers): ThisTree[Untyped] = { + val tree = if (myMods == null || (myMods == mods)) this else cloneIn(source) + tree.setMods(mods) + tree.asInstanceOf[ThisTree[Untyped]] + } + + def withFlags(flags: FlagSet): ThisTree[Untyped] = withMods(untpd.Modifiers(flags)) + def withAddedFlags(flags: FlagSet): ThisTree[Untyped] = withMods(rawMods | flags) + + /** Destructively update modifiers. To be used with care. */ + def setMods(mods: untpd.Modifiers): Unit = myMods = mods + + override def isDef: Boolean = true + def namedType: NamedType = tpe.asInstanceOf[NamedType] + } + + extension (mdef: untpd.DefTree) def mods: untpd.Modifiers = mdef.rawMods + + sealed trait WithEndMarker[+T <: Untyped]: + self: PackageDef[T] | NamedDefTree[T] => + + import WithEndMarker.* + + final def endSpan(using Context): Span = + if hasEndMarker then + val realName = srcName.stripModuleClassSuffix.lastPart + span.withStart(span.end - realName.length) + else + NoSpan + + /** The name in source code that represents this construct, + * and is the name that the user must write to create a valid + * end marker. + * e.g. a constructor definition is terminated in the source + * code by `end this`, so it's `srcName` should return `this`. 
+ */ + protected def srcName(using Context): Name + + final def withEndMarker(): self.type = + self.withAttachment(HasEndMarker, ()) + + final def withEndMarker(copyFrom: WithEndMarker[?]): self.type = + if copyFrom.hasEndMarker then + this.withEndMarker() + else + this + + final def dropEndMarker(): self.type = + self.removeAttachment(HasEndMarker) + this + + protected def hasEndMarker: Boolean = self.hasAttachment(HasEndMarker) + + object WithEndMarker: + /** Property key that signals the tree was terminated + * with an `end` marker in the source code + */ + private val HasEndMarker: Property.StickyKey[Unit] = Property.StickyKey() + + end WithEndMarker + + abstract class NamedDefTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) + extends NameTree[T] with DefTree[T] with WithEndMarker[T] { + type ThisTree[+T <: Untyped] <: NamedDefTree[T] + + protected def srcName(using Context): Name = + if name == nme.CONSTRUCTOR then nme.this_ + else if symbol.isPackageObject then symbol.owner.name + else name + + /** The position of the name defined by this definition. + * This is a point position if the definition is synthetic, or a range position + * if the definition comes from source. + * It might also be that the definition does not have a position (for instance when synthesized by + * a calling chain from `viewExists`), in that case the return position is NoSpan. + * Overridden in Bind + */ + def nameSpan(using Context): Span = + if (span.exists) { + val point = span.point + if (rawMods.is(Synthetic) || span.isSynthetic || name.toTermName == nme.ERROR) Span(point) + else { + val realName = srcName.stripModuleClassSuffix.lastPart + Span(point, point + realName.length, point) + } + } + else span + + /** The source position of the name defined by this definition. + * This is a point position if the definition is synthetic, or a range position + * if the definition comes from source. + */ + def namePos(using Context): SourcePosition = source.atSpan(nameSpan) + } + + /** Tree defines a new symbol and carries modifiers. + * The position of a MemberDef contains only the defined identifier or pattern. + * The envelope of a MemberDef contains the whole definition and has its point + * on the opening keyword (or the next token after that if keyword is missing). 
+ */ + abstract class MemberDef[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends NamedDefTree[T] { + type ThisTree[+T <: Untyped] <: MemberDef[T] + + def rawComment: Option[Comment] = getAttachment(DocComment) + + def setComment(comment: Option[Comment]): this.type = { + comment.map(putAttachment(DocComment, _)) + this + } + + def name: Name + } + + /** A ValDef or DefDef tree */ + abstract class ValOrDefDef[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends MemberDef[T] with WithLazyField[Tree[T]] { + type ThisTree[+T <: Untyped] <: ValOrDefDef[T] + def name: TermName + def tpt: Tree[T] + def unforcedRhs: LazyTree[T] = unforced + def rhs(using Context): Tree[T] = forceIfLazy + } + + trait ValOrTypeDef[+T <: Untyped] extends MemberDef[T]: + type ThisTree[+T <: Untyped] <: ValOrTypeDef[T] + + type ParamClause[T <: Untyped] = List[ValDef[T]] | List[TypeDef[T]] + + // ----------- Tree case classes ------------------------------------ + + /** name */ + case class Ident[+T <: Untyped] private[ast] (name: Name)(implicit @constructorOnly src: SourceFile) + extends RefTree[T] { + type ThisTree[+T <: Untyped] = Ident[T] + def qualifier: Tree[T] = genericEmptyTree + + def isBackquoted: Boolean = hasAttachment(Backquoted) + } + + class SearchFailureIdent[+T <: Untyped] private[ast] (name: Name, expl: -> String)(implicit @constructorOnly src: SourceFile) + extends Ident[T](name) { + def explanation = expl + override def toString: String = s"SearchFailureIdent($explanation)" + } + + /** qualifier.name, or qualifier#name, if qualifier is a type */ + case class Select[+T <: Untyped] private[ast] (qualifier: Tree[T], name: Name)(implicit @constructorOnly src: SourceFile) + extends RefTree[T] { + type ThisTree[+T <: Untyped] = Select[T] + + override def denot(using Context): Denotation = typeOpt match + case ConstantType(_) if ConstFold.foldedUnops.contains(name) => + // Recover the denotation of a constant-folded selection + qualifier.typeOpt.member(name).atSignature(Signature.NotAMethod, name) + case _ => + super.denot + + def nameSpan(using Context): Span = + if span.exists then + val point = span.point + if name.toTermName == nme.ERROR then + Span(point) + else if qualifier.span.start > span.start then // right associative + val realName = name.stripModuleClassSuffix.lastPart + Span(span.start, span.start + realName.length, point) + else + Span(point, span.end, point) + else span + } + + class SelectWithSig[+T <: Untyped] private[ast] (qualifier: Tree[T], name: Name, val sig: Signature)(implicit @constructorOnly src: SourceFile) + extends Select[T](qualifier, name) { + override def toString: String = s"SelectWithSig($qualifier, $name, $sig)" + } + + /** qual.this */ + case class This[+T <: Untyped] private[ast] (qual: untpd.Ident)(implicit @constructorOnly src: SourceFile) + extends DenotingTree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] = This[T] + // Denotation of a This tree is always the underlying class; needs correction for modules. 
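+    // For an object `O`, for instance, `O.this` has a `TermRef` type, so we use the denotation of `O`'s module class, as seen from the prefix.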
+ override def denot(using Context): Denotation = + typeOpt match { + case tpe @ TermRef(pre, _) if tpe.symbol.is(Module) => + tpe.symbol.moduleClass.denot.asSeenFrom(pre) + case _ => + super.denot + } + } + + /** C.super[mix], where qual = C.this */ + case class Super[+T <: Untyped] private[ast] (qual: Tree[T], mix: untpd.Ident)(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] = Super[T] + def forwardTo: Tree[T] = qual + } + + abstract class GenericApply[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends ProxyTree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] <: GenericApply[T] + val fun: Tree[T] + val args: List[Tree[T]] + def forwardTo: Tree[T] = fun + } + + object GenericApply: + def unapply[T <: Untyped](tree: Tree[T]): Option[(Tree[T], List[Tree[T]])] = tree match + case tree: GenericApply[T] => Some((tree.fun, tree.args)) + case _ => None + + /** The kind of application */ + enum ApplyKind: + case Regular // r.f(x) + case Using // r.f(using x) + case InfixTuple // r f (x1, ..., xN) where N != 1; needs to be treated specially for an error message in typedApply + + /** fun(args) */ + case class Apply[+T <: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends GenericApply[T] { + type ThisTree[+T <: Untyped] = Apply[T] + + def setApplyKind(kind: ApplyKind) = + putAttachment(untpd.KindOfApply, kind) + this + + /** The kind of this application. Works reliably only for untyped trees; typed trees + * are under no obligation to update it correctly. + */ + def applyKind: ApplyKind = + attachmentOrElse(untpd.KindOfApply, ApplyKind.Regular) + } + + /** fun[args] */ + case class TypeApply[+T <: Untyped] private[ast] (fun: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends GenericApply[T] { + type ThisTree[+T <: Untyped] = TypeApply[T] + } + + /** const */ + case class Literal[+T <: Untyped] private[ast] (const: Constant)(implicit @constructorOnly src: SourceFile) + extends Tree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] = Literal[T] + } + + /** new tpt, but no constructor call */ + case class New[+T <: Untyped] private[ast] (tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] = New[T] + } + + /** expr : tpt */ + case class Typed[+T <: Untyped] private[ast] (expr: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with TermTree[T] { + type ThisTree[+T <: Untyped] = Typed[T] + def forwardTo: Tree[T] = expr + } + + /** name = arg, in a parameter list */ + case class NamedArg[+T <: Untyped] private[ast] (name: Name, arg: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] { + type ThisTree[+T <: Untyped] = NamedArg[T] + } + + /** name = arg, outside a parameter list */ + case class Assign[+T <: Untyped] private[ast] (lhs: Tree[T], rhs: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = Assign[T] + } + + /** { stats; expr } */ + case class Block[+T <: Untyped] private[ast] (stats: List[Tree[T]], expr: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] { + type ThisTree[+T <: Untyped] = Block[T] + override def isType: Boolean = expr.isType + override def isTerm: Boolean = !isType // this will classify empty trees as terms, which is necessary + } + + /** if cond then thenp else elsep */ + case class If[+T <: 
Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = If[T] + def isInline = false + } + class InlineIf[+T <: Untyped] private[ast] (cond: Tree[T], thenp: Tree[T], elsep: Tree[T])(implicit @constructorOnly src: SourceFile) + extends If(cond, thenp, elsep) { + override def isInline = true + override def toString = s"InlineIf($cond, $thenp, $elsep)" + } + + /** A closure with an environment and a reference to a method. + * @param env The captured parameters of the closure + * @param meth A ref tree that refers to the method of the closure. + * The first (env.length) parameters of that method are filled + * with env values. + * @param tpt Either EmptyTree or a TypeTree. If tpt is EmptyTree the type + * of the closure is a function type, otherwise it is the type + * given in `tpt`, which must be a SAM type. + */ + case class Closure[+T <: Untyped] private[ast] (env: List[Tree[T]], meth: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = Closure[T] + } + + /** selector match { cases } */ + case class Match[+T <: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = Match[T] + def isInline = false + } + class InlineMatch[+T <: Untyped] private[ast] (selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + extends Match(selector, cases) { + override def isInline = true + override def toString = s"InlineMatch($selector, $cases)" + } + + /** case pat if guard => body */ + case class CaseDef[+T <: Untyped] private[ast] (pat: Tree[T], guard: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] { + type ThisTree[+T <: Untyped] = CaseDef[T] + } + + /** label[tpt]: { expr } */ + case class Labeled[+T <: Untyped] private[ast] (bind: Bind[T], expr: Tree[T])(implicit @constructorOnly src: SourceFile) + extends NameTree[T] { + type ThisTree[+T <: Untyped] = Labeled[T] + def name: Name = bind.name + } + + /** return expr + * where `from` refers to the method or label from which the return takes place + * After program transformations this is not necessarily the enclosing method, because + * closures can intervene. + */ + case class Return[+T <: Untyped] private[ast] (expr: Tree[T], from: Tree[T] = genericEmptyTree)(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = Return[T] + } + + /** while (cond) { body } */ + case class WhileDo[+T <: Untyped] private[ast] (cond: Tree[T], body: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = WhileDo[T] + } + + /** try block catch cases finally finalizer */ + case class Try[+T <: Untyped] private[ast] (expr: Tree[T], cases: List[CaseDef[T]], finalizer: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TermTree[T] { + type ThisTree[+T <: Untyped] = Try[T] + } + + /** Seq(elems) + * @param tpt The element type of the sequence. 
+ */ + case class SeqLiteral[+T <: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] { + type ThisTree[+T <: Untyped] = SeqLiteral[T] + } + + /** Array(elems) */ + class JavaSeqLiteral[+T <: Untyped] private[ast] (elems: List[Tree[T]], elemtpt: Tree[T])(implicit @constructorOnly src: SourceFile) + extends SeqLiteral(elems, elemtpt) { + override def toString: String = s"JavaSeqLiteral($elems, $elemtpt)" + } + + /** A tree representing inlined code. + * + * @param call Info about the original call that was inlined + * Until PostTyper, this is the full call, afterwards only + * a reference to the toplevel class from which the call was inlined. + * @param bindings Bindings for proxies to be used in the inlined code + * @param expansion The inlined tree, minus bindings. + * + * The full inlined code is equivalent to + * + * { bindings; expansion } + * + * The reason to keep `bindings` separate is because they are typed in a + * different context: `bindings` represent the arguments to the inlined + * call, whereas `expansion` represents the body of the inlined function. + */ + case class Inlined[+T <: Untyped] private[ast] (call: tpd.Tree, bindings: List[MemberDef[T]], expansion: Tree[T])(implicit @constructorOnly src: SourceFile) + extends Tree[T] { + type ThisTree[+T <: Untyped] = Inlined[T] + override def isTerm = expansion.isTerm + override def isType = expansion.isType + } + + /** A type tree that represents an existing or inferred type */ + case class TypeTree[+T <: Untyped]()(implicit @constructorOnly src: SourceFile) + extends DenotingTree[T] with TypTree[T] { + type ThisTree[+T <: Untyped] = TypeTree[T] + override def isEmpty: Boolean = !hasType + override def toString: String = + s"TypeTree${if (hasType) s"[$typeOpt]" else ""}" + } + + /** A type tree whose type is inferred. These trees appear in two contexts + * - as an argument of a TypeApply. In that case its type is always a TypeVar + * - as a (result-)type of an inferred ValDef or DefDef. + * Every TypeVar is created as the type of one InferredTypeTree. + */ + class InferredTypeTree[+T <: Untyped](implicit @constructorOnly src: SourceFile) extends TypeTree[T] + + /** ref.type */ + case class SingletonTypeTree[+T <: Untyped] private[ast] (ref: Tree[T])(implicit @constructorOnly src: SourceFile) + extends DenotingTree[T] with TypTree[T] { + type ThisTree[+T <: Untyped] = SingletonTypeTree[T] + } + + /** tpt { refinements } */ + case class RefinedTypeTree[+T <: Untyped] private[ast] (tpt: Tree[T], refinements: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with TypTree[T] { + type ThisTree[+T <: Untyped] = RefinedTypeTree[T] + def forwardTo: Tree[T] = tpt + } + + /** tpt[args] */ + case class AppliedTypeTree[+T <: Untyped] private[ast] (tpt: Tree[T], args: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with TypTree[T] { + type ThisTree[+T <: Untyped] = AppliedTypeTree[T] + def forwardTo: Tree[T] = tpt + } + + /** [typeparams] -> tpt + * + * Note: the type of such a tree is not necessarily a `HKTypeLambda`, it can + * also be a `TypeBounds` where the upper bound is an `HKTypeLambda`, and the + * lower bound is either a reference to `Nothing` or an `HKTypeLambda`, + * this happens because these trees are typed by `HKTypeLambda#fromParams` which + * makes sure to move bounds outside of the type lambda itself to simplify their + * handling in the compiler. 
+ * + * You may ask: why not normalize the trees too? That way, + * + * LambdaTypeTree(X, TypeBoundsTree(A, B)) + * + * would become, + * + * TypeBoundsTree(LambdaTypeTree(X, A), LambdaTypeTree(X, B)) + * + * which would maintain consistency between a tree and its type. The problem + * with this definition is that the same tree `X` appears twice, therefore + * we'd have to create two symbols for it which makes it harder to relate the + * source code written by the user with the trees used by the compiler (for + * example, to make "find all references" work in the IDE). + */ + case class LambdaTypeTree[+T <: Untyped] private[ast] (tparams: List[TypeDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TypTree[T] { + type ThisTree[+T <: Untyped] = LambdaTypeTree[T] + } + + case class TermLambdaTypeTree[+T <: Untyped] private[ast] (params: List[ValDef[T]], body: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TypTree[T] { + type ThisTree[+T <: Untyped] = TermLambdaTypeTree[T] + } + + /** [bound] selector match { cases } */ + case class MatchTypeTree[+T <: Untyped] private[ast] (bound: Tree[T], selector: Tree[T], cases: List[CaseDef[T]])(implicit @constructorOnly src: SourceFile) + extends TypTree[T] { + type ThisTree[+T <: Untyped] = MatchTypeTree[T] + } + + /** => T */ + case class ByNameTypeTree[+T <: Untyped] private[ast] (result: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TypTree[T] { + type ThisTree[+T <: Untyped] = ByNameTypeTree[T] + } + + /** >: lo <: hi + * >: lo <: hi = alias for RHS of bounded opaque type + */ + case class TypeBoundsTree[+T <: Untyped] private[ast] (lo: Tree[T], hi: Tree[T], alias: Tree[T])(implicit @constructorOnly src: SourceFile) + extends TypTree[T] { + type ThisTree[+T <: Untyped] = TypeBoundsTree[T] + } + + /** name @ body */ + case class Bind[+T <: Untyped] private[ast] (name: Name, body: Tree[T])(implicit @constructorOnly src: SourceFile) + extends NamedDefTree[T] with PatternTree[T] { + type ThisTree[+T <: Untyped] = Bind[T] + override def isType: Boolean = name.isTypeName + override def isTerm: Boolean = name.isTermName + + override def nameSpan(using Context): Span = + if span.exists then Span(span.start, span.start + name.toString.length) else span + } + + /** tree_1 | ... | tree_n */ + case class Alternative[+T <: Untyped] private[ast] (trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends PatternTree[T] { + type ThisTree[+T <: Untyped] = Alternative[T] + } + + /** The typed translation of `extractor(patterns)` in a pattern. The translation has the following + * components: + * + * @param fun is `extractor.unapply` (or, for backwards compatibility, `extractor.unapplySeq`) + * possibly with type parameters + * @param implicits Any implicit parameters passed to the unapply after the selector + * @param patterns The argument patterns in the pattern match. 
+ * + * It is typed with same type as first `fun` argument + * Given a match selector `sel` a pattern UnApply(fun, implicits, patterns) is roughly translated as follows + * + * val result = fun(sel)(implicits) + * if (result.isDefined) "match patterns against result" + */ + case class UnApply[+T <: Untyped] private[ast] (fun: Tree[T], implicits: List[Tree[T]], patterns: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with PatternTree[T] { + type ThisTree[+T <: Untyped] = UnApply[T] + def forwardTo = fun + } + + /** mods val name: tpt = rhs */ + case class ValDef[+T <: Untyped] private[ast] (name: TermName, tpt: Tree[T], private var preRhs: LazyTree[T])(implicit @constructorOnly src: SourceFile) + extends ValOrDefDef[T], ValOrTypeDef[T] { + type ThisTree[+T <: Untyped] = ValDef[T] + assert(isEmpty || (tpt ne genericEmptyTree)) + def unforced: LazyTree[T] = preRhs + protected def force(x: Tree[T @uncheckedVariance]): Unit = preRhs = x + } + + /** mods def name[tparams](vparams_1)...(vparams_n): tpt = rhs */ + case class DefDef[+T <: Untyped] private[ast] (name: TermName, + paramss: List[ParamClause[T]], tpt: Tree[T], private var preRhs: LazyTree[T])(implicit @constructorOnly src: SourceFile) + extends ValOrDefDef[T] { + type ThisTree[+T <: Untyped] = DefDef[T] + assert(tpt ne genericEmptyTree) + def unforced: LazyTree[T] = preRhs + protected def force(x: Tree[T @uncheckedVariance]): Unit = preRhs = x + + def leadingTypeParams(using Context): List[TypeDef[T]] = paramss match + case (tparams @ (tparam: TypeDef[_]) :: _) :: _ => tparams.asInstanceOf[List[TypeDef[T]]] + case _ => Nil + + def trailingParamss(using Context): List[ParamClause[T]] = paramss match + case ((tparam: TypeDef[_]) :: _) :: paramss1 => paramss1 + case _ => paramss + + def termParamss(using Context): List[List[ValDef[T]]] = + (if ctx.erasedTypes then paramss else untpd.termParamssIn(paramss)) + .asInstanceOf[List[List[ValDef[T]]]] + } + + /** mods class name template or + * mods trait name template or + * mods type name = rhs or + * mods type name >: lo <: hi, if rhs = TypeBoundsTree(lo, hi) or + * mods type name >: lo <: hi = rhs if rhs = TypeBoundsTree(lo, hi, alias) and opaque in mods + */ + case class TypeDef[+T <: Untyped] private[ast] (name: TypeName, rhs: Tree[T])(implicit @constructorOnly src: SourceFile) + extends MemberDef[T], ValOrTypeDef[T] { + type ThisTree[+T <: Untyped] = TypeDef[T] + + /** Is this a definition of a class? */ + def isClassDef: Boolean = rhs.isInstanceOf[Template[?]] + + def isBackquoted: Boolean = hasAttachment(Backquoted) + } + + /** extends parents { self => body } + * @param parentsOrDerived A list of parents followed by a list of derived classes, + * if this is of class untpd.DerivingTemplate. + * Typed templates only have parents. 
+ */ + case class Template[+T <: Untyped] private[ast] (constr: DefDef[T], parentsOrDerived: List[Tree[T]], self: ValDef[T], private var preBody: LazyTreeList[T])(implicit @constructorOnly src: SourceFile) + extends DefTree[T] with WithLazyField[List[Tree[T]]] { + type ThisTree[+T <: Untyped] = Template[T] + def unforcedBody: LazyTreeList[T] = unforced + def unforced: LazyTreeList[T] = preBody + protected def force(x: List[Tree[T @uncheckedVariance]]): Unit = preBody = x + def body(using Context): List[Tree[T]] = forceIfLazy + + def parents: List[Tree[T]] = parentsOrDerived // overridden by DerivingTemplate + def derived: List[untpd.Tree] = Nil // overridden by DerivingTemplate + } + + + abstract class ImportOrExport[+T <: Untyped](implicit @constructorOnly src: SourceFile) + extends DenotingTree[T] { + type ThisTree[+T <: Untyped] <: ImportOrExport[T] + val expr: Tree[T] + val selectors: List[untpd.ImportSelector] + } + + /** import expr.selectors + * where a selector is either an untyped `Ident`, `name` or + * an untyped thicket consisting of `name` and `rename`. + */ + case class Import[+T <: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) + extends ImportOrExport[T] { + type ThisTree[+T <: Untyped] = Import[T] + } + + /** export expr.selectors + * where a selector is either an untyped `Ident`, `name` or + * an untyped thicket consisting of `name` and `rename`. + */ + case class Export[+T <: Untyped] private[ast] (expr: Tree[T], selectors: List[untpd.ImportSelector])(implicit @constructorOnly src: SourceFile) + extends ImportOrExport[T] { + type ThisTree[+T <: Untyped] = Export[T] + } + + /** package pid { stats } */ + case class PackageDef[+T <: Untyped] private[ast] (pid: RefTree[T], stats: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] with WithEndMarker[T] { + type ThisTree[+T <: Untyped] = PackageDef[T] + def forwardTo: RefTree[T] = pid + protected def srcName(using Context): Name = pid.name + } + + /** arg @annot */ + case class Annotated[+T <: Untyped] private[ast] (arg: Tree[T], annot: Tree[T])(implicit @constructorOnly src: SourceFile) + extends ProxyTree[T] { + type ThisTree[+T <: Untyped] = Annotated[T] + def forwardTo: Tree[T] = arg + } + + trait WithoutTypeOrPos[+T <: Untyped] extends Tree[T] { + override def withTypeUnchecked(tpe: Type): ThisTree[Type] = this.asInstanceOf[ThisTree[Type]] + override def span: Span = NoSpan + override def span_=(span: Span): Unit = {} + } + + /** Temporary class that results from translation of ModuleDefs + * (and possibly other statements). + * The contained trees will be integrated when transformed with + * a `transform(List[Tree])` call. 
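+   *  For instance, an untyped `object O` definition desugars into a thicket holding the module value together with its module class; `flatten` below splices such thickets back into the enclosing statement list.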
+ */ + case class Thicket[+T <: Untyped](trees: List[Tree[T]])(implicit @constructorOnly src: SourceFile) + extends Tree[T] with WithoutTypeOrPos[T] { + myTpe = NoType.asInstanceOf[T] + type ThisTree[+T <: Untyped] = Thicket[T] + + def mapElems[U >: T <: Untyped](op: Tree[T] => Tree[U]): Thicket[U] = { + val newTrees = trees.mapConserve(op) + if (trees eq newTrees) + this + else + Thicket[U](newTrees)(source).asInstanceOf[this.type] + } + + override def foreachInThicket(op: Tree[T] => Unit): Unit = + trees foreach (_.foreachInThicket(op)) + + override def isEmpty: Boolean = trees.isEmpty + override def toList: List[Tree[T]] = flatten(trees) + override def toString: String = if (isEmpty) "EmptyTree" else "Thicket(" + trees.mkString(", ") + ")" + override def span: Span = + def combine(s: Span, ts: List[Tree[T]]): Span = ts match + case t :: ts1 => combine(s.union(t.span), ts1) + case nil => s + combine(NoSpan, trees) + + override def withSpan(span: Span): this.type = + mapElems(_.withSpan(span)).asInstanceOf[this.type] + } + + class EmptyTree[T <: Untyped] extends Thicket(Nil)(NoSource) { + // assert(uniqueId != 1492) + override def withSpan(span: Span) = throw AssertionError("Cannot change span of EmptyTree") + } + + class EmptyValDef[T <: Untyped] extends ValDef[T]( + nme.WILDCARD, genericEmptyTree[T], genericEmptyTree[T])(NoSource) with WithoutTypeOrPos[T] { + myTpe = NoType.asInstanceOf[T] + setMods(untpd.Modifiers(PrivateLocal)) + override def isEmpty: Boolean = true + override def withSpan(span: Span) = throw AssertionError("Cannot change span of EmptyValDef") + } + + @sharable val theEmptyTree = new EmptyTree[Type]() + @sharable val theEmptyValDef = new EmptyValDef[Type]() + + def genericEmptyValDef[T <: Untyped]: ValDef[T] = theEmptyValDef.asInstanceOf[ValDef[T]] + def genericEmptyTree[T <: Untyped]: Thicket[T] = theEmptyTree.asInstanceOf[Thicket[T]] + + /** Tree that replaces a level 1 splice in pickled (level 0) quotes. + * It is only used when pickling quotes (will never be in a TASTy file). + * + * @param isTermHole If true, this hole is a term, otherwise it is a type hole. + * @param idx The index of the hole in its enclosing level 0 quote. + * @param args The arguments of the splice to compute its content + * @param content Lambda that computes the content of the hole. This tree is empty when in a quote pickle. + * @param tpt Type of the hole + */ + case class Hole[+T <: Untyped](isTermHole: Boolean, idx: Int, args: List[Tree[T]], content: Tree[T], tpt: Tree[T])(implicit @constructorOnly src: SourceFile) extends Tree[T] { + type ThisTree[+T <: Untyped] <: Hole[T] + override def isTerm: Boolean = isTermHole + override def isType: Boolean = !isTermHole + } + + def flatten[T <: Untyped](trees: List[Tree[T]]): List[Tree[T]] = { + def recur(buf: ListBuffer[Tree[T]] | Null, remaining: List[Tree[T]]): ListBuffer[Tree[T]] | Null = + remaining match { + case Thicket(elems) :: remaining1 => + var buf1 = buf + if (buf1 == null) { + buf1 = new ListBuffer[Tree[T]] + var scanned = trees + while (scanned `ne` remaining) { + buf1 += scanned.head + scanned = scanned.tail + } + } + recur(recur(buf1, elems), remaining1) + case tree :: remaining1 => + if (buf != null) buf += tree + recur(buf, remaining1) + case nil => + buf + } + val buf = recur(null, trees) + if (buf != null) buf.toList else trees + } + + // ----- Lazy trees and tree sequences + + /** A tree that can have a lazy field. + * The field is represented by some private `var` which is + * accessed by `unforced` and `force`.
Forcing the field will + * set the `var` to the underlying value. + */ + trait WithLazyField[+T <: AnyRef] { + def unforced: T | Lazy[T] + protected def force(x: T @uncheckedVariance): Unit + def forceIfLazy(using Context): T = unforced match { + case lzy: Lazy[T @unchecked] => + val x = lzy.complete + force(x) + x + case x: T @ unchecked => x + } + } + + /** A base trait for lazy tree fields. + * These can be instantiated with Lazy instances which + * can delay tree construction until the field is first demanded. + */ + trait Lazy[+T <: AnyRef] { + def complete(using Context): T + } + + // ----- Generic Tree Instances, inherited from `tpt` and `untpd`. + + abstract class Instance[T <: Untyped] { inst => + + type Tree = Trees.Tree[T] + type TypTree = Trees.TypTree[T] + type TermTree = Trees.TermTree[T] + type PatternTree = Trees.PatternTree[T] + type DenotingTree = Trees.DenotingTree[T] + type ProxyTree = Trees.ProxyTree[T] + type NameTree = Trees.NameTree[T] + type RefTree = Trees.RefTree[T] + type DefTree = Trees.DefTree[T] + type NamedDefTree = Trees.NamedDefTree[T] + type MemberDef = Trees.MemberDef[T] + type ValOrDefDef = Trees.ValOrDefDef[T] + type ValOrTypeDef = Trees.ValOrTypeDef[T] + type LazyTree = Trees.LazyTree[T] + type LazyTreeList = Trees.LazyTreeList[T] + type ParamClause = Trees.ParamClause[T] + + type Ident = Trees.Ident[T] + type SearchFailureIdent = Trees.SearchFailureIdent[T] + type Select = Trees.Select[T] + type SelectWithSig = Trees.SelectWithSig[T] + type This = Trees.This[T] + type Super = Trees.Super[T] + type Apply = Trees.Apply[T] + type TypeApply = Trees.TypeApply[T] + type GenericApply = Trees.GenericApply[T] + type Literal = Trees.Literal[T] + type New = Trees.New[T] + type Typed = Trees.Typed[T] + type NamedArg = Trees.NamedArg[T] + type Assign = Trees.Assign[T] + type Block = Trees.Block[T] + type If = Trees.If[T] + type InlineIf = Trees.InlineIf[T] + type Closure = Trees.Closure[T] + type Match = Trees.Match[T] + type InlineMatch = Trees.InlineMatch[T] + type CaseDef = Trees.CaseDef[T] + type Labeled = Trees.Labeled[T] + type Return = Trees.Return[T] + type WhileDo = Trees.WhileDo[T] + type Try = Trees.Try[T] + type SeqLiteral = Trees.SeqLiteral[T] + type JavaSeqLiteral = Trees.JavaSeqLiteral[T] + type Inlined = Trees.Inlined[T] + type TypeTree = Trees.TypeTree[T] + type InferredTypeTree = Trees.InferredTypeTree[T] + type SingletonTypeTree = Trees.SingletonTypeTree[T] + type RefinedTypeTree = Trees.RefinedTypeTree[T] + type AppliedTypeTree = Trees.AppliedTypeTree[T] + type LambdaTypeTree = Trees.LambdaTypeTree[T] + type TermLambdaTypeTree = Trees.TermLambdaTypeTree[T] + type MatchTypeTree = Trees.MatchTypeTree[T] + type ByNameTypeTree = Trees.ByNameTypeTree[T] + type TypeBoundsTree = Trees.TypeBoundsTree[T] + type Bind = Trees.Bind[T] + type Alternative = Trees.Alternative[T] + type UnApply = Trees.UnApply[T] + type ValDef = Trees.ValDef[T] + type DefDef = Trees.DefDef[T] + type TypeDef = Trees.TypeDef[T] + type Template = Trees.Template[T] + type Import = Trees.Import[T] + type Export = Trees.Export[T] + type ImportOrExport = Trees.ImportOrExport[T] + type PackageDef = Trees.PackageDef[T] + type Annotated = Trees.Annotated[T] + type Thicket = Trees.Thicket[T] + + type Hole = Trees.Hole[T] + + @sharable val EmptyTree: Thicket = genericEmptyTree + @sharable val EmptyValDef: ValDef = genericEmptyValDef + @sharable val ContextualEmptyTree: Thicket = new EmptyTree() // an empty tree marking a contextual closure + + // ----- Auxiliary creation methods 
------------------ + + def Thicket(): Thicket = EmptyTree + def Thicket(x1: Tree, x2: Tree)(implicit src: SourceFile): Thicket = new Thicket(x1 :: x2 :: Nil) + def Thicket(x1: Tree, x2: Tree, x3: Tree)(implicit src: SourceFile): Thicket = new Thicket(x1 :: x2 :: x3 :: Nil) + def Thicket(xs: List[Tree])(implicit src: SourceFile) = new Thicket(xs) + + def flatTree(xs: List[Tree])(implicit src: SourceFile): Tree = flatten(xs) match { + case x :: Nil => x + case ys => Thicket(ys) + } + + // ----- Helper classes for copying, transforming, accumulating ----------------- + + val cpy: TreeCopier + + /** A class for copying trees. The copy methods avoid creating a new tree + * if all arguments stay the same. + * + * Note: Some of the copy methods take a context. + * These are exactly those methods that are overridden in TypedTreeCopier + * so that they selectively retype themselves. Retyping needs a context. + */ + abstract class TreeCopier { + protected def postProcess(tree: Tree, copied: untpd.Tree): copied.ThisTree[T] + protected def postProcess(tree: Tree, copied: untpd.MemberDef): copied.ThisTree[T] + + /** Source of the copied tree */ + protected def sourceFile(tree: Tree): SourceFile = tree.source + + protected def finalize(tree: Tree, copied: untpd.Tree): copied.ThisTree[T] = + Stats.record(s"TreeCopier.finalize/${tree.getClass == copied.getClass}") + postProcess(tree, copied.withSpan(tree.span).withAttachmentsFrom(tree)) + + protected def finalize(tree: Tree, copied: untpd.MemberDef): copied.ThisTree[T] = + Stats.record(s"TreeCopier.finalize/${tree.getClass == copied.getClass}") + postProcess(tree, copied.withSpan(tree.span).withAttachmentsFrom(tree)) + + def Ident(tree: Tree)(name: Name)(using Context): Ident = tree match { + case tree: Ident if name == tree.name => tree + case _ => finalize(tree, untpd.Ident(name)(sourceFile(tree))) + } + def Select(tree: Tree)(qualifier: Tree, name: Name)(using Context): Select = tree match { + case tree: SelectWithSig => + if ((qualifier eq tree.qualifier) && (name == tree.name)) tree + else finalize(tree, SelectWithSig(qualifier, name, tree.sig)(sourceFile(tree))) + case tree: Select if (qualifier eq tree.qualifier) && (name == tree.name) => tree + case _ => finalize(tree, untpd.Select(qualifier, name)(sourceFile(tree))) + } + /** Copy Ident or Select trees */ + def Ref(tree: RefTree)(name: Name)(using Context): RefTree = tree match { + case Ident(_) => Ident(tree)(name) + case Select(qual, _) => Select(tree)(qual, name) + } + def This(tree: Tree)(qual: untpd.Ident)(using Context): This = tree match { + case tree: This if (qual eq tree.qual) => tree + case _ => finalize(tree, untpd.This(qual)(sourceFile(tree))) + } + def Super(tree: Tree)(qual: Tree, mix: untpd.Ident)(using Context): Super = tree match { + case tree: Super if (qual eq tree.qual) && (mix eq tree.mix) => tree + case _ => finalize(tree, untpd.Super(qual, mix)(sourceFile(tree))) + } + def Apply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): Apply = tree match { + case tree: Apply if (fun eq tree.fun) && (args eq tree.args) => tree + case _ => finalize(tree, untpd.Apply(fun, args)(sourceFile(tree))) + //.ensuring(res => res.uniqueId != 2213, s"source = $tree, ${tree.uniqueId}, ${tree.span}") + } + def TypeApply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): TypeApply = tree match { + case tree: TypeApply if (fun eq tree.fun) && (args eq tree.args) => tree + case _ => finalize(tree, untpd.TypeApply(fun, args)(sourceFile(tree))) + } + def Literal(tree: Tree)(const: 
Constant)(using Context): Literal = tree match { + case tree: Literal if const == tree.const => tree + case _ => finalize(tree, untpd.Literal(const)(sourceFile(tree))) + } + def New(tree: Tree)(tpt: Tree)(using Context): New = tree match { + case tree: New if (tpt eq tree.tpt) => tree + case _ => finalize(tree, untpd.New(tpt)(sourceFile(tree))) + } + def Typed(tree: Tree)(expr: Tree, tpt: Tree)(using Context): Typed = tree match { + case tree: Typed if (expr eq tree.expr) && (tpt eq tree.tpt) => tree + case tree => finalize(tree, untpd.Typed(expr, tpt)(sourceFile(tree))) + } + def NamedArg(tree: Tree)(name: Name, arg: Tree)(using Context): NamedArg = tree match { + case tree: NamedArg if (name == tree.name) && (arg eq tree.arg) => tree + case _ => finalize(tree, untpd.NamedArg(name, arg)(sourceFile(tree))) + } + def Assign(tree: Tree)(lhs: Tree, rhs: Tree)(using Context): Assign = tree match { + case tree: Assign if (lhs eq tree.lhs) && (rhs eq tree.rhs) => tree + case _ => finalize(tree, untpd.Assign(lhs, rhs)(sourceFile(tree))) + } + def Block(tree: Tree)(stats: List[Tree], expr: Tree)(using Context): Block = tree match { + case tree: Block if (stats eq tree.stats) && (expr eq tree.expr) => tree + case _ => finalize(tree, untpd.Block(stats, expr)(sourceFile(tree))) + } + def If(tree: Tree)(cond: Tree, thenp: Tree, elsep: Tree)(using Context): If = tree match { + case tree: If if (cond eq tree.cond) && (thenp eq tree.thenp) && (elsep eq tree.elsep) => tree + case tree: InlineIf => finalize(tree, untpd.InlineIf(cond, thenp, elsep)(sourceFile(tree))) + case _ => finalize(tree, untpd.If(cond, thenp, elsep)(sourceFile(tree))) + } + def Closure(tree: Tree)(env: List[Tree], meth: Tree, tpt: Tree)(using Context): Closure = tree match { + case tree: Closure if (env eq tree.env) && (meth eq tree.meth) && (tpt eq tree.tpt) => tree + case _ => finalize(tree, untpd.Closure(env, meth, tpt)(sourceFile(tree))) + } + def Match(tree: Tree)(selector: Tree, cases: List[CaseDef])(using Context): Match = tree match { + case tree: Match if (selector eq tree.selector) && (cases eq tree.cases) => tree + case tree: InlineMatch => finalize(tree, untpd.InlineMatch(selector, cases)(sourceFile(tree))) + case _ => finalize(tree, untpd.Match(selector, cases)(sourceFile(tree))) + } + def CaseDef(tree: Tree)(pat: Tree, guard: Tree, body: Tree)(using Context): CaseDef = tree match { + case tree: CaseDef if (pat eq tree.pat) && (guard eq tree.guard) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.CaseDef(pat, guard, body)(sourceFile(tree))) + } + def Labeled(tree: Tree)(bind: Bind, expr: Tree)(using Context): Labeled = tree match { + case tree: Labeled if (bind eq tree.bind) && (expr eq tree.expr) => tree + case _ => finalize(tree, untpd.Labeled(bind, expr)(sourceFile(tree))) + } + def Return(tree: Tree)(expr: Tree, from: Tree)(using Context): Return = tree match { + case tree: Return if (expr eq tree.expr) && (from eq tree.from) => tree + case _ => finalize(tree, untpd.Return(expr, from)(sourceFile(tree))) + } + def WhileDo(tree: Tree)(cond: Tree, body: Tree)(using Context): WhileDo = tree match { + case tree: WhileDo if (cond eq tree.cond) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.WhileDo(cond, body)(sourceFile(tree))) + } + def Try(tree: Tree)(expr: Tree, cases: List[CaseDef], finalizer: Tree)(using Context): Try = tree match { + case tree: Try if (expr eq tree.expr) && (cases eq tree.cases) && (finalizer eq tree.finalizer) => tree + case _ => finalize(tree, untpd.Try(expr, cases, 
finalizer)(sourceFile(tree))) + } + def SeqLiteral(tree: Tree)(elems: List[Tree], elemtpt: Tree)(using Context): SeqLiteral = tree match { + case tree: JavaSeqLiteral => + if ((elems eq tree.elems) && (elemtpt eq tree.elemtpt)) tree + else finalize(tree, untpd.JavaSeqLiteral(elems, elemtpt)) + case tree: SeqLiteral if (elems eq tree.elems) && (elemtpt eq tree.elemtpt) => tree + case _ => finalize(tree, untpd.SeqLiteral(elems, elemtpt)(sourceFile(tree))) + } + def Inlined(tree: Tree)(call: tpd.Tree, bindings: List[MemberDef], expansion: Tree)(using Context): Inlined = tree match { + case tree: Inlined if (call eq tree.call) && (bindings eq tree.bindings) && (expansion eq tree.expansion) => tree + case _ => finalize(tree, untpd.Inlined(call, bindings, expansion)(sourceFile(tree))) + } + def SingletonTypeTree(tree: Tree)(ref: Tree)(using Context): SingletonTypeTree = tree match { + case tree: SingletonTypeTree if (ref eq tree.ref) => tree + case _ => finalize(tree, untpd.SingletonTypeTree(ref)(sourceFile(tree))) + } + def RefinedTypeTree(tree: Tree)(tpt: Tree, refinements: List[Tree])(using Context): RefinedTypeTree = tree match { + case tree: RefinedTypeTree if (tpt eq tree.tpt) && (refinements eq tree.refinements) => tree + case _ => finalize(tree, untpd.RefinedTypeTree(tpt, refinements)(sourceFile(tree))) + } + def AppliedTypeTree(tree: Tree)(tpt: Tree, args: List[Tree])(using Context): AppliedTypeTree = tree match { + case tree: AppliedTypeTree if (tpt eq tree.tpt) && (args eq tree.args) => tree + case _ => finalize(tree, untpd.AppliedTypeTree(tpt, args)(sourceFile(tree))) + } + def LambdaTypeTree(tree: Tree)(tparams: List[TypeDef], body: Tree)(using Context): LambdaTypeTree = tree match { + case tree: LambdaTypeTree if (tparams eq tree.tparams) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.LambdaTypeTree(tparams, body)(sourceFile(tree))) + } + def TermLambdaTypeTree(tree: Tree)(params: List[ValDef], body: Tree)(using Context): TermLambdaTypeTree = tree match { + case tree: TermLambdaTypeTree if (params eq tree.params) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.TermLambdaTypeTree(params, body)(sourceFile(tree))) + } + def MatchTypeTree(tree: Tree)(bound: Tree, selector: Tree, cases: List[CaseDef])(using Context): MatchTypeTree = tree match { + case tree: MatchTypeTree if (bound eq tree.bound) && (selector eq tree.selector) && (cases eq tree.cases) => tree + case _ => finalize(tree, untpd.MatchTypeTree(bound, selector, cases)(sourceFile(tree))) + } + def ByNameTypeTree(tree: Tree)(result: Tree)(using Context): ByNameTypeTree = tree match { + case tree: ByNameTypeTree if (result eq tree.result) => tree + case _ => finalize(tree, untpd.ByNameTypeTree(result)(sourceFile(tree))) + } + def TypeBoundsTree(tree: Tree)(lo: Tree, hi: Tree, alias: Tree)(using Context): TypeBoundsTree = tree match { + case tree: TypeBoundsTree if (lo eq tree.lo) && (hi eq tree.hi) && (alias eq tree.alias) => tree + case _ => finalize(tree, untpd.TypeBoundsTree(lo, hi, alias)(sourceFile(tree))) + } + def Bind(tree: Tree)(name: Name, body: Tree)(using Context): Bind = tree match { + case tree: Bind if (name eq tree.name) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.Bind(name, body)(sourceFile(tree))) + } + def Alternative(tree: Tree)(trees: List[Tree])(using Context): Alternative = tree match { + case tree: Alternative if (trees eq tree.trees) => tree + case _ => finalize(tree, untpd.Alternative(trees)(sourceFile(tree))) + } + def UnApply(tree: Tree)(fun: Tree, 
implicits: List[Tree], patterns: List[Tree])(using Context): UnApply = tree match { + case tree: UnApply if (fun eq tree.fun) && (implicits eq tree.implicits) && (patterns eq tree.patterns) => tree + case _ => finalize(tree, untpd.UnApply(fun, implicits, patterns)(sourceFile(tree))) + } + def ValDef(tree: Tree)(name: TermName, tpt: Tree, rhs: LazyTree)(using Context): ValDef = tree match { + case tree: ValDef if (name == tree.name) && (tpt eq tree.tpt) && (rhs eq tree.unforcedRhs) => tree + case _ => finalize(tree, untpd.ValDef(name, tpt, rhs)(sourceFile(tree))) + } + def DefDef(tree: Tree)(name: TermName, paramss: List[ParamClause], tpt: Tree, rhs: LazyTree)(using Context): DefDef = tree match { + case tree: DefDef if (name == tree.name) && (paramss eq tree.paramss) && (tpt eq tree.tpt) && (rhs eq tree.unforcedRhs) => tree + case _ => finalize(tree, untpd.DefDef(name, paramss, tpt, rhs)(sourceFile(tree))) + } + def TypeDef(tree: Tree)(name: TypeName, rhs: Tree)(using Context): TypeDef = tree match { + case tree: TypeDef if (name == tree.name) && (rhs eq tree.rhs) => tree + case _ => finalize(tree, untpd.TypeDef(name, rhs)(sourceFile(tree))) + } + def Template(tree: Tree)(constr: DefDef, parents: List[Tree], derived: List[untpd.Tree], self: ValDef, body: LazyTreeList)(using Context): Template = tree match { + case tree: Template if (constr eq tree.constr) && (parents eq tree.parents) && (derived eq tree.derived) && (self eq tree.self) && (body eq tree.unforcedBody) => tree + case tree => finalize(tree, untpd.Template(constr, parents, derived, self, body)(sourceFile(tree))) + } + def Import(tree: Tree)(expr: Tree, selectors: List[untpd.ImportSelector])(using Context): Import = tree match { + case tree: Import if (expr eq tree.expr) && (selectors eq tree.selectors) => tree + case _ => finalize(tree, untpd.Import(expr, selectors)(sourceFile(tree))) + } + def Export(tree: Tree)(expr: Tree, selectors: List[untpd.ImportSelector])(using Context): Export = tree match { + case tree: Export if (expr eq tree.expr) && (selectors eq tree.selectors) => tree + case _ => finalize(tree, untpd.Export(expr, selectors)(sourceFile(tree))) + } + def PackageDef(tree: Tree)(pid: RefTree, stats: List[Tree])(using Context): PackageDef = tree match { + case tree: PackageDef if (pid eq tree.pid) && (stats eq tree.stats) => tree + case _ => finalize(tree, untpd.PackageDef(pid, stats)(sourceFile(tree))) + } + def Annotated(tree: Tree)(arg: Tree, annot: Tree)(using Context): Annotated = tree match { + case tree: Annotated if (arg eq tree.arg) && (annot eq tree.annot) => tree + case _ => finalize(tree, untpd.Annotated(arg, annot)(sourceFile(tree))) + } + def Thicket(tree: Tree)(trees: List[Tree])(using Context): Thicket = tree match { + case tree: Thicket if (trees eq tree.trees) => tree + case _ => finalize(tree, untpd.Thicket(trees)(sourceFile(tree))) + } + def Hole(tree: Tree)(isTerm: Boolean, idx: Int, args: List[Tree], content: Tree, tpt: Tree)(using Context): Hole = tree match { + case tree: Hole if isTerm == tree.isTerm && idx == tree.idx && args.eq(tree.args) && content.eq(tree.content) && tpt.eq(tree.tpt) => tree + case _ => finalize(tree, untpd.Hole(isTerm, idx, args, content, tpt)(sourceFile(tree))) + } + + // Copier methods with default arguments; these demand that the original tree + // is of the same class as the copy. We only include trees with more than 2 elements here. 
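+ // A sketch of how these default-argument copiers are typically used (illustrative only;
+ // `tree: If` and `newThen: Tree` are assumed to be in scope):
+ //
+ //   val updated: If = cpy.If(tree)(thenp = newThen)   // cond and elsep are reused from `tree`
+ //
+ // Only the field that changed needs to be spelled out, and if nothing changed at all
+ // the underlying copier returns the original tree unmodified.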
+ def If(tree: If)(cond: Tree = tree.cond, thenp: Tree = tree.thenp, elsep: Tree = tree.elsep)(using Context): If = + If(tree: Tree)(cond, thenp, elsep) + def Closure(tree: Closure)(env: List[Tree] = tree.env, meth: Tree = tree.meth, tpt: Tree = tree.tpt)(using Context): Closure = + Closure(tree: Tree)(env, meth, tpt) + def CaseDef(tree: CaseDef)(pat: Tree = tree.pat, guard: Tree = tree.guard, body: Tree = tree.body)(using Context): CaseDef = + CaseDef(tree: Tree)(pat, guard, body) + def Try(tree: Try)(expr: Tree = tree.expr, cases: List[CaseDef] = tree.cases, finalizer: Tree = tree.finalizer)(using Context): Try = + Try(tree: Tree)(expr, cases, finalizer) + def UnApply(tree: UnApply)(fun: Tree = tree.fun, implicits: List[Tree] = tree.implicits, patterns: List[Tree] = tree.patterns)(using Context): UnApply = + UnApply(tree: Tree)(fun, implicits, patterns) + def ValDef(tree: ValDef)(name: TermName = tree.name, tpt: Tree = tree.tpt, rhs: LazyTree = tree.unforcedRhs)(using Context): ValDef = + ValDef(tree: Tree)(name, tpt, rhs) + def DefDef(tree: DefDef)(name: TermName = tree.name, paramss: List[ParamClause] = tree.paramss, tpt: Tree = tree.tpt, rhs: LazyTree = tree.unforcedRhs)(using Context): DefDef = + DefDef(tree: Tree)(name, paramss, tpt, rhs) + def TypeDef(tree: TypeDef)(name: TypeName = tree.name, rhs: Tree = tree.rhs)(using Context): TypeDef = + TypeDef(tree: Tree)(name, rhs) + def Template(tree: Template)(constr: DefDef = tree.constr, parents: List[Tree] = tree.parents, derived: List[untpd.Tree] = tree.derived, self: ValDef = tree.self, body: LazyTreeList = tree.unforcedBody)(using Context): Template = + Template(tree: Tree)(constr, parents, derived, self, body) + def Hole(tree: Hole)(isTerm: Boolean = tree.isTerm, idx: Int = tree.idx, args: List[Tree] = tree.args, content: Tree = tree.content, tpt: Tree = tree.tpt)(using Context): Hole = + Hole(tree: Tree)(isTerm, idx, args, content, tpt) + + } + + /** Hook to indicate that a transform of some subtree should be skipped */ + protected def skipTransform(tree: Tree)(using Context): Boolean = false + + /** For untyped trees, this is just the identity. + * For typed trees, a context derived form `ctx` that records `call` as the + * innermost enclosing call for which the inlined version is currently + * processed. + */ + protected def inlineContext(call: tpd.Tree)(using Context): Context = ctx + + /** The context to use when mapping or accumulating over a tree */ + def localCtx(tree: Tree)(using Context): Context + + /** The context to use when transforming a tree. + * It ensures that the source is correct, and that the local context is used if + * that's necessary for transforming the whole tree. 
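+ * For example, `MemberDef` nodes such as `ValDef` or `DefDef` are transformed in the local
+ * context of their own symbol (via `localCtx`), whereas a node like `If` is transformed in
+ * the enclosing context, adjusted only for its source.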
+ * TODO: ensure transform is always called with the correct context as argument + * @see https://github.com/lampepfl/dotty/pull/13880#discussion_r836395977 + */ + def transformCtx(tree: Tree)(using Context): Context = + val sourced = + if tree.source.exists && tree.source != ctx.source + then ctx.withSource(tree.source) + else ctx + tree match + case t: (MemberDef | PackageDef | LambdaTypeTree | TermLambdaTypeTree) => + localCtx(t)(using sourced) + case _ => + sourced + + abstract class TreeMap(val cpy: TreeCopier = inst.cpy) { self: TreeMap @retains(caps.*) => + def transform(tree: Tree)(using Context): Tree = { + inContext(transformCtx(tree)) { + Stats.record(s"TreeMap.transform/$getClass") + if (skipTransform(tree)) tree + else tree match { + case Ident(name) => + tree + case Select(qualifier, name) => + cpy.Select(tree)(transform(qualifier), name) + case This(qual) => + tree + case Super(qual, mix) => + cpy.Super(tree)(transform(qual), mix) + case Apply(fun, args) => + cpy.Apply(tree)(transform(fun), transform(args)) + case TypeApply(fun, args) => + cpy.TypeApply(tree)(transform(fun), transform(args)) + case Literal(const) => + tree + case New(tpt) => + cpy.New(tree)(transform(tpt)) + case Typed(expr, tpt) => + cpy.Typed(tree)(transform(expr), transform(tpt)) + case NamedArg(name, arg) => + cpy.NamedArg(tree)(name, transform(arg)) + case Assign(lhs, rhs) => + cpy.Assign(tree)(transform(lhs), transform(rhs)) + case blk: Block => + transformBlock(blk) + case If(cond, thenp, elsep) => + cpy.If(tree)(transform(cond), transform(thenp), transform(elsep)) + case Closure(env, meth, tpt) => + cpy.Closure(tree)(transform(env), transform(meth), transform(tpt)) + case Match(selector, cases) => + cpy.Match(tree)(transform(selector), transformSub(cases)) + case CaseDef(pat, guard, body) => + cpy.CaseDef(tree)(transform(pat), transform(guard), transform(body)) + case Labeled(bind, expr) => + cpy.Labeled(tree)(transformSub(bind), transform(expr)) + case Return(expr, from) => + cpy.Return(tree)(transform(expr), transformSub(from)) + case WhileDo(cond, body) => + cpy.WhileDo(tree)(transform(cond), transform(body)) + case Try(block, cases, finalizer) => + cpy.Try(tree)(transform(block), transformSub(cases), transform(finalizer)) + case SeqLiteral(elems, elemtpt) => + cpy.SeqLiteral(tree)(transform(elems), transform(elemtpt)) + case Inlined(call, bindings, expansion) => + cpy.Inlined(tree)(call, transformSub(bindings), transform(expansion)(using inlineContext(call))) + case TypeTree() => + tree + case SingletonTypeTree(ref) => + cpy.SingletonTypeTree(tree)(transform(ref)) + case RefinedTypeTree(tpt, refinements) => + cpy.RefinedTypeTree(tree)(transform(tpt), transformSub(refinements)) + case AppliedTypeTree(tpt, args) => + cpy.AppliedTypeTree(tree)(transform(tpt), transform(args)) + case LambdaTypeTree(tparams, body) => + cpy.LambdaTypeTree(tree)(transformSub(tparams), transform(body)) + case TermLambdaTypeTree(params, body) => + cpy.TermLambdaTypeTree(tree)(transformSub(params), transform(body)) + case MatchTypeTree(bound, selector, cases) => + cpy.MatchTypeTree(tree)(transform(bound), transform(selector), transformSub(cases)) + case ByNameTypeTree(result) => + cpy.ByNameTypeTree(tree)(transform(result)) + case TypeBoundsTree(lo, hi, alias) => + cpy.TypeBoundsTree(tree)(transform(lo), transform(hi), transform(alias)) + case Bind(name, body) => + cpy.Bind(tree)(name, transform(body)) + case Alternative(trees) => + cpy.Alternative(tree)(transform(trees)) + case UnApply(fun, implicits, patterns) 
=> + cpy.UnApply(tree)(transform(fun), transform(implicits), transform(patterns)) + case EmptyValDef => + tree + case tree @ ValDef(name, tpt, _) => + val tpt1 = transform(tpt) + val rhs1 = transform(tree.rhs) + cpy.ValDef(tree)(name, tpt1, rhs1) + case tree @ DefDef(name, paramss, tpt, _) => + cpy.DefDef(tree)(name, transformParamss(paramss), transform(tpt), transform(tree.rhs)) + case tree @ TypeDef(name, rhs) => + cpy.TypeDef(tree)(name, transform(rhs)) + case tree @ Template(constr, parents, self, _) if tree.derived.isEmpty => + cpy.Template(tree)(transformSub(constr), transform(tree.parents), Nil, transformSub(self), transformStats(tree.body, tree.symbol)) + case Import(expr, selectors) => + cpy.Import(tree)(transform(expr), selectors) + case Export(expr, selectors) => + cpy.Export(tree)(transform(expr), selectors) + case PackageDef(pid, stats) => + cpy.PackageDef(tree)(transformSub(pid), transformStats(stats, ctx.owner)) + case Annotated(arg, annot) => + cpy.Annotated(tree)(transform(arg), transform(annot)) + case Thicket(trees) => + val trees1 = transform(trees) + if (trees1 eq trees) tree else Thicket(trees1) + case tree @ Hole(_, _, args, content, tpt) => + cpy.Hole(tree)(args = transform(args), content = transform(content), tpt = transform(tpt)) + case _ => + transformMoreCases(tree) + } + } + } + + def transformStats(trees: List[Tree], exprOwner: Symbol)(using Context): List[Tree] = + transform(trees) + def transformBlock(blk: Block)(using Context): Block = + cpy.Block(blk)(transformStats(blk.stats, ctx.owner), transform(blk.expr)) + def transform(trees: List[Tree])(using Context): List[Tree] = + flatten(trees mapConserve (transform(_))) + def transformSub[Tr <: Tree](tree: Tr)(using Context): Tr = + transform(tree).asInstanceOf[Tr] + def transformSub[Tr <: Tree](trees: List[Tr])(using Context): List[Tr] = + transform(trees).asInstanceOf[List[Tr]] + def transformParams(params: ParamClause)(using Context): ParamClause = + transform(params).asInstanceOf[ParamClause] + def transformParamss(paramss: List[ParamClause])(using Context): List[ParamClause] = + paramss.mapConserve(transformParams) + + protected def transformMoreCases(tree: Tree)(using Context): Tree = { + assert(ctx.reporter.errorsReported) + tree + } + } + + abstract class TreeAccumulator[X] { self: TreeAccumulator[X] @retains(caps.*) => + // Ties the knot of the traversal: call `foldOver(x, tree))` to dive in the `tree` node. 
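+ // For example, a minimal accumulator that collects the names of all identifiers in a tree
+ // (a sketch; a given Context and some `someTree: Tree` are assumed to be in scope):
+ //
+ //   val collectIdentNames = new TreeAccumulator[List[Name]] {
+ //     def apply(x: List[Name], tree: Tree)(using Context): List[Name] = tree match
+ //       case Ident(name) => name :: x
+ //       case _           => foldOver(x, tree)
+ //   }
+ //   val names = collectIdentNames(Nil, someTree)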
+ def apply(x: X, tree: Tree)(using Context): X + + def apply(x: X, trees: List[Tree])(using Context): X = + def fold(x: X, trees: List[Tree]): X = trees match + case tree :: rest => fold(apply(x, tree), rest) + case Nil => x + fold(x, trees) + + def foldOver(x: X, tree: Tree)(using Context): X = + if (tree.source != ctx.source && tree.source.exists) + foldOver(x, tree)(using ctx.withSource(tree.source)) + else { + Stats.record(s"TreeAccumulator.foldOver/$getClass") + tree match { + case Ident(name) => + x + case Select(qualifier, name) => + this(x, qualifier) + case This(qual) => + x + case Super(qual, mix) => + this(x, qual) + case Apply(fun, args) => + this(this(x, fun), args) + case TypeApply(fun, args) => + this(this(x, fun), args) + case Literal(const) => + x + case New(tpt) => + this(x, tpt) + case Typed(expr, tpt) => + this(this(x, expr), tpt) + case NamedArg(name, arg) => + this(x, arg) + case Assign(lhs, rhs) => + this(this(x, lhs), rhs) + case Block(stats, expr) => + this(this(x, stats), expr) + case If(cond, thenp, elsep) => + this(this(this(x, cond), thenp), elsep) + case Closure(env, meth, tpt) => + this(this(this(x, env), meth), tpt) + case Match(selector, cases) => + this(this(x, selector), cases) + case CaseDef(pat, guard, body) => + this(this(this(x, pat), guard), body) + case Labeled(bind, expr) => + this(this(x, bind), expr) + case Return(expr, from) => + this(this(x, expr), from) + case WhileDo(cond, body) => + this(this(x, cond), body) + case Try(block, handler, finalizer) => + this(this(this(x, block), handler), finalizer) + case SeqLiteral(elems, elemtpt) => + this(this(x, elems), elemtpt) + case Inlined(call, bindings, expansion) => + this(this(x, bindings), expansion)(using inlineContext(call)) + case TypeTree() => + x + case SingletonTypeTree(ref) => + this(x, ref) + case RefinedTypeTree(tpt, refinements) => + this(this(x, tpt), refinements) + case AppliedTypeTree(tpt, args) => + this(this(x, tpt), args) + case LambdaTypeTree(tparams, body) => + inContext(localCtx(tree)) { + this(this(x, tparams), body) + } + case TermLambdaTypeTree(params, body) => + inContext(localCtx(tree)) { + this(this(x, params), body) + } + case MatchTypeTree(bound, selector, cases) => + this(this(this(x, bound), selector), cases) + case ByNameTypeTree(result) => + this(x, result) + case TypeBoundsTree(lo, hi, alias) => + this(this(this(x, lo), hi), alias) + case Bind(name, body) => + this(x, body) + case Alternative(trees) => + this(x, trees) + case UnApply(fun, implicits, patterns) => + this(this(this(x, fun), implicits), patterns) + case tree @ ValDef(_, tpt, _) => + inContext(localCtx(tree)) { + this(this(x, tpt), tree.rhs) + } + case tree @ DefDef(_, paramss, tpt, _) => + inContext(localCtx(tree)) { + this(this(paramss.foldLeft(x)(apply), tpt), tree.rhs) + } + case TypeDef(_, rhs) => + inContext(localCtx(tree)) { + this(x, rhs) + } + case tree @ Template(constr, parents, self, _) if tree.derived.isEmpty => + this(this(this(this(x, constr), parents), self), tree.body) + case Import(expr, _) => + this(x, expr) + case Export(expr, _) => + this(x, expr) + case PackageDef(pid, stats) => + this(this(x, pid), stats)(using localCtx(tree)) + case Annotated(arg, annot) => + this(this(x, arg), annot) + case Thicket(ts) => + this(x, ts) + case Hole(_, _, args, content, tpt) => + this(this(this(x, args), content), tpt) + case _ => + foldMoreCases(x, tree) + } + } + + def foldMoreCases(x: X, tree: Tree)(using Context): X = { + assert(ctx.reporter.hasUnreportedErrors + || ctx.reporter.errorsReported 
+ || ctx.mode.is(Mode.Interactive), tree) + // In interactive mode, errors might come from previous runs. + // In case of errors it may be that typed trees point to untyped ones. + // The IDE can still traverse inside such trees, either in the run where errors + // are reported, or in subsequent ones. + x + } + } + + abstract class TreeTraverser extends TreeAccumulator[Unit] { + def traverse(tree: Tree)(using Context): Unit + def traverse(trees: List[Tree])(using Context) = apply((), trees) + def apply(x: Unit, tree: Tree)(using Context): Unit = traverse(tree) + protected def traverseChildren(tree: Tree)(using Context): Unit = foldOver((), tree) + } + + /** Fold `f` over all tree nodes, in depth-first, prefix order */ + class DeepFolder[X](f: (X, Tree) => X) extends TreeAccumulator[X] { + def apply(x: X, tree: Tree)(using Context): X = foldOver(f(x, tree), tree) + } + + /** Fold `f` over all tree nodes, in depth-first, prefix order, but don't visit + * subtrees where `f` returns a different result for the root, i.e. `f(x, root) ne x`. + */ + class ShallowFolder[X](f: (X, Tree) => X) extends TreeAccumulator[X] { + def apply(x: X, tree: Tree)(using Context): X = { + val x1 = f(x, tree) + if (x1.asInstanceOf[AnyRef] ne x.asInstanceOf[AnyRef]) x1 + else foldOver(x1, tree) + } + } + + def rename(tree: NameTree, newName: Name)(using Context): tree.ThisTree[T] = { + tree match { + case tree: Ident => cpy.Ident(tree)(newName) + case tree: Select => cpy.Select(tree)(tree.qualifier, newName) + case tree: Bind => cpy.Bind(tree)(newName, tree.body) + case tree: ValDef => cpy.ValDef(tree)(name = newName.asTermName) + case tree: DefDef => cpy.DefDef(tree)(name = newName.asTermName) + case tree: TypeDef => cpy.TypeDef(tree)(name = newName.asTypeName) + } + }.asInstanceOf[tree.ThisTree[T]] + + object TypeDefs: + def unapply(xs: List[Tree]): Option[List[TypeDef]] = xs match + case (x: TypeDef) :: _ => Some(xs.asInstanceOf[List[TypeDef]]) + case _ => None + + object ValDefs: + def unapply(xs: List[Tree]): Option[List[ValDef]] = xs match + case Nil => Some(Nil) + case (x: ValDef) :: _ => Some(xs.asInstanceOf[List[ValDef]]) + case _ => None + + def termParamssIn(paramss: List[ParamClause]): List[List[ValDef]] = paramss match + case ValDefs(vparams) :: paramss1 => + val paramss2 = termParamssIn(paramss1) + if paramss2 eq paramss1 then paramss.asInstanceOf[List[List[ValDef]]] + else vparams :: paramss2 + case _ :: paramss1 => + termParamssIn(paramss1) + case nil => + Nil + + /** If `tparams` is non-empty, add it to the left `paramss`, merging + * it with a leading type parameter list of `paramss`, if one exists. + */ + def joinParams(tparams: List[TypeDef], paramss: List[ParamClause]): List[ParamClause] = + if tparams.isEmpty then paramss + else paramss match + case TypeDefs(tparams1) :: paramss1 => (tparams ++ tparams1) :: paramss1 + case _ => tparams :: paramss + + def isTermOnly(paramss: List[ParamClause]): Boolean = paramss match + case Nil => true + case params :: paramss1 => + params match + case (param: untpd.TypeDef) :: _ => false + case _ => isTermOnly(paramss1) + + def asTermOnly(paramss: List[ParamClause]): List[List[ValDef]] = + assert(isTermOnly(paramss)) + paramss.asInstanceOf[List[List[ValDef]]] + + /** Delegate to FunProto or FunProtoTyped depending on whether the prefix is `untpd` or `tpd`. 
*/ + protected def FunProto(args: List[Tree], resType: Type)(using Context): ProtoTypes.FunProto + + /** Construct the application `$receiver.$method[$targs]($args)` using overloading resolution + * to find a matching overload of `$method` if necessary. + * This is useful when overloading resolution needs to be performed in a phase after typer. + * Note that this will not perform any kind of implicit search. + * + * @param expectedType An expected type of the application used to guide overloading resolution + */ + def applyOverloaded( + receiver: tpd.Tree, method: TermName, args: List[Tree], targs: List[Type], + expectedType: Type)(using parentCtx: Context): tpd.Tree = { + given ctx: Context = parentCtx.retractMode(Mode.ImplicitsEnabled) + import dotty.tools.dotc.ast.tpd.TreeOps + + val typer = ctx.typer + val proto = FunProto(args, expectedType) + val denot = receiver.tpe.member(method) + if !denot.exists then + overload.println(i"members = ${receiver.tpe.decls}") + report.error(em"no member $receiver . $method", receiver.srcPos) + val selected = + if (denot.isOverloaded) { + def typeParamCount(tp: Type) = tp.widen match { + case tp: PolyType => tp.paramInfos.length + case _ => 0 + } + val allAlts = denot.alternatives + .map(denot => TermRef(receiver.tpe, denot.symbol)) + .filter(tr => typeParamCount(tr) == targs.length) + .filter { _.widen match { + case MethodTpe(_, _, x: MethodType) => !x.isImplicitMethod + case _ => true + }} + val alternatives = ctx.typer.resolveOverloaded(allAlts, proto) + assert(alternatives.size == 1, + i"${if (alternatives.isEmpty) "no" else "multiple"} overloads available for " + + i"$method on ${receiver.tpe.widenDealiasKeepAnnots} with targs: $targs%, %; args: $args%, %; expectedType: $expectedType." + + i"all alternatives: ${allAlts.map(_.symbol.showDcl).mkString(", ")}\n" + + i"matching alternatives: ${alternatives.map(_.symbol.showDcl).mkString(", ")}.") // this is parsed from bytecode tree. 
there's nothing user can do about it + alternatives.head + } + else TermRef(receiver.tpe, denot.symbol) + val fun = receiver.select(selected).appliedToTypes(targs) + + val apply = untpd.Apply(fun, args) + typer.ApplyTo(apply, fun, selected, proto, expectedType) + } + + + def resolveConstructor(atp: Type, args: List[Tree])(using Context): tpd.Tree = { + val targs = atp.argTypes + withoutMode(Mode.PatternOrTypeBits) { + applyOverloaded(tpd.New(atp.typeConstructor), nme.CONSTRUCTOR, args, targs, atp) + } + } + } +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/tpd.scala b/tests/pos-with-compiler-cc/dotc/ast/tpd.scala new file mode 100644 index 000000000000..f778824a18d3 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/tpd.scala @@ -0,0 +1,1546 @@ +package dotty.tools +package dotc +package ast + +import dotty.tools.dotc.transform.{ExplicitOuter, Erasure} +import typer.ProtoTypes +import transform.SymUtils._ +import transform.TypeUtils._ +import core._ +import Scopes.newScope +import util.Spans._, Types._, Contexts._, Constants._, Names._, Flags._, NameOps._ +import Symbols._, StdNames._, Annotations._, Trees._, Symbols._ +import Decorators._, DenotTransformers._ +import collection.{immutable, mutable} +import util.{Property, SourceFile} +import NameKinds.{TempResultName, OuterSelectName} +import typer.ConstFold + +import scala.annotation.tailrec +import scala.collection.mutable.ListBuffer +import language.experimental.pureFunctions + +/** Some creators for typed trees */ +object tpd extends Trees.Instance[Type] with TypedTreeInfo { + + private def ta(using Context) = ctx.typeAssigner + + def Ident(tp: NamedType)(using Context): Ident = + ta.assignType(untpd.Ident(tp.name), tp) + + def Select(qualifier: Tree, name: Name)(using Context): Select = + ta.assignType(untpd.Select(qualifier, name), qualifier) + + def Select(qualifier: Tree, tp: NamedType)(using Context): Select = + untpd.Select(qualifier, tp.name).withType(tp) + + def This(cls: ClassSymbol)(using Context): This = + untpd.This(untpd.Ident(cls.name)).withType(cls.thisType) + + def Super(qual: Tree, mix: untpd.Ident, mixinClass: Symbol)(using Context): Super = + ta.assignType(untpd.Super(qual, mix), qual, mixinClass) + + def Super(qual: Tree, mixName: TypeName, mixinClass: Symbol = NoSymbol)(using Context): Super = + Super(qual, if (mixName.isEmpty) untpd.EmptyTypeIdent else untpd.Ident(mixName), mixinClass) + + def Apply(fn: Tree, args: List[Tree])(using Context): Apply = fn match + case Block(Nil, expr) => + Apply(expr, args) + case _: RefTree | _: GenericApply | _: Inlined | _: Hole => + ta.assignType(untpd.Apply(fn, args), fn, args) + + def TypeApply(fn: Tree, args: List[Tree])(using Context): TypeApply = fn match + case Block(Nil, expr) => + TypeApply(expr, args) + case _: RefTree | _: GenericApply => + ta.assignType(untpd.TypeApply(fn, args), fn, args) + + def Literal(const: Constant)(using Context): Literal = + ta.assignType(untpd.Literal(const)) + + def unitLiteral(using Context): Literal = + Literal(Constant(())) + + def nullLiteral(using Context): Literal = + Literal(Constant(null)) + + def New(tpt: Tree)(using Context): New = + ta.assignType(untpd.New(tpt), tpt) + + def New(tp: Type)(using Context): New = New(TypeTree(tp)) + + def Typed(expr: Tree, tpt: Tree)(using Context): Typed = + ta.assignType(untpd.Typed(expr, tpt), tpt) + + def NamedArg(name: Name, arg: Tree)(using Context): NamedArg = + ta.assignType(untpd.NamedArg(name, arg), arg) + + def Assign(lhs: Tree, rhs: Tree)(using Context): Assign = + 
ta.assignType(untpd.Assign(lhs, rhs)) + + def Block(stats: List[Tree], expr: Tree)(using Context): Block = + ta.assignType(untpd.Block(stats, expr), stats, expr) + + /** Join `stats` in front of `expr` creating a new block if necessary */ + def seq(stats: List[Tree], expr: Tree)(using Context): Tree = + if (stats.isEmpty) expr + else expr match { + case Block(_, _: Closure) => + Block(stats, expr) // leave closures in their own block + case Block(estats, eexpr) => + cpy.Block(expr)(stats ::: estats, eexpr).withType(ta.avoidingType(eexpr, stats)) + case _ => + Block(stats, expr) + } + + def If(cond: Tree, thenp: Tree, elsep: Tree)(using Context): If = + ta.assignType(untpd.If(cond, thenp, elsep), thenp, elsep) + + def InlineIf(cond: Tree, thenp: Tree, elsep: Tree)(using Context): If = + ta.assignType(untpd.InlineIf(cond, thenp, elsep), thenp, elsep) + + def Closure(env: List[Tree], meth: Tree, tpt: Tree)(using Context): Closure = + ta.assignType(untpd.Closure(env, meth, tpt), meth, tpt) + + /** A function def + * + * vparams => expr + * + * gets expanded to + * + * { def $anonfun(vparams) = expr; Closure($anonfun) } + * + * where the closure's type is the target type of the expression (FunctionN, unless + * otherwise specified). + */ + def Closure(meth: TermSymbol, rhsFn: List[List[Tree]] => Tree, targs: List[Tree] = Nil, targetType: Type = NoType)(using Context): Block = { + val targetTpt = if (targetType.exists) TypeTree(targetType) else EmptyTree + val call = + if (targs.isEmpty) Ident(TermRef(NoPrefix, meth)) + else TypeApply(Ident(TermRef(NoPrefix, meth)), targs) + Block( + DefDef(meth, rhsFn) :: Nil, + Closure(Nil, call, targetTpt)) + } + + /** A closure whose anonymous function has the given method type */ + def Lambda(tpe: MethodType, rhsFn: List[Tree] => Tree)(using Context): Block = { + val meth = newAnonFun(ctx.owner, tpe) + Closure(meth, tss => rhsFn(tss.head).changeOwner(ctx.owner, meth)) + } + + def CaseDef(pat: Tree, guard: Tree, body: Tree)(using Context): CaseDef = + ta.assignType(untpd.CaseDef(pat, guard, body), pat, body) + + def Match(selector: Tree, cases: List[CaseDef])(using Context): Match = + ta.assignType(untpd.Match(selector, cases), selector, cases) + + def InlineMatch(selector: Tree, cases: List[CaseDef])(using Context): Match = + ta.assignType(untpd.InlineMatch(selector, cases), selector, cases) + + def Labeled(bind: Bind, expr: Tree)(using Context): Labeled = + ta.assignType(untpd.Labeled(bind, expr)) + + def Labeled(sym: TermSymbol, expr: Tree)(using Context): Labeled = + Labeled(Bind(sym, EmptyTree), expr) + + def Return(expr: Tree, from: Tree)(using Context): Return = + ta.assignType(untpd.Return(expr, from)) + + def Return(expr: Tree, from: Symbol)(using Context): Return = + Return(expr, Ident(from.termRef)) + + def WhileDo(cond: Tree, body: Tree)(using Context): WhileDo = + ta.assignType(untpd.WhileDo(cond, body)) + + def Try(block: Tree, cases: List[CaseDef], finalizer: Tree)(using Context): Try = + ta.assignType(untpd.Try(block, cases, finalizer), block, cases) + + def SeqLiteral(elems: List[Tree], elemtpt: Tree)(using Context): SeqLiteral = + ta.assignType(untpd.SeqLiteral(elems, elemtpt), elems, elemtpt) + + def JavaSeqLiteral(elems: List[Tree], elemtpt: Tree)(using Context): JavaSeqLiteral = + ta.assignType(untpd.JavaSeqLiteral(elems, elemtpt), elems, elemtpt).asInstanceOf[JavaSeqLiteral] + + def Inlined(call: Tree, bindings: List[MemberDef], expansion: Tree)(using Context): Inlined = + ta.assignType(untpd.Inlined(call, bindings, expansion), 
bindings, expansion) + + def TypeTree(tp: Type, inferred: Boolean = false)(using Context): TypeTree = + (if inferred then untpd.InferredTypeTree() else untpd.TypeTree()).withType(tp) + + def SingletonTypeTree(ref: Tree)(using Context): SingletonTypeTree = + ta.assignType(untpd.SingletonTypeTree(ref), ref) + + def RefinedTypeTree(parent: Tree, refinements: List[Tree], refineCls: ClassSymbol)(using Context): Tree = + ta.assignType(untpd.RefinedTypeTree(parent, refinements), parent, refinements, refineCls) + + def AppliedTypeTree(tycon: Tree, args: List[Tree])(using Context): AppliedTypeTree = + ta.assignType(untpd.AppliedTypeTree(tycon, args), tycon, args) + + def ByNameTypeTree(result: Tree)(using Context): ByNameTypeTree = + ta.assignType(untpd.ByNameTypeTree(result), result) + + def LambdaTypeTree(tparams: List[TypeDef], body: Tree)(using Context): LambdaTypeTree = + ta.assignType(untpd.LambdaTypeTree(tparams, body), tparams, body) + + def MatchTypeTree(bound: Tree, selector: Tree, cases: List[CaseDef])(using Context): MatchTypeTree = + ta.assignType(untpd.MatchTypeTree(bound, selector, cases), bound, selector, cases) + + def TypeBoundsTree(lo: Tree, hi: Tree, alias: Tree = EmptyTree)(using Context): TypeBoundsTree = + ta.assignType(untpd.TypeBoundsTree(lo, hi, alias), lo, hi, alias) + + def Bind(sym: Symbol, body: Tree)(using Context): Bind = + ta.assignType(untpd.Bind(sym.name, body), sym) + + /** A pattern corresponding to `sym: tpe` */ + def BindTyped(sym: TermSymbol, tpe: Type)(using Context): Bind = + Bind(sym, Typed(Underscore(tpe), TypeTree(tpe))) + + def Alternative(trees: List[Tree])(using Context): Alternative = + ta.assignType(untpd.Alternative(trees), trees) + + def UnApply(fun: Tree, implicits: List[Tree], patterns: List[Tree], proto: Type)(using Context): UnApply = { + assert(fun.isInstanceOf[RefTree] || fun.isInstanceOf[GenericApply]) + ta.assignType(untpd.UnApply(fun, implicits, patterns), proto) + } + + def ValDef(sym: TermSymbol, rhs: LazyTree = EmptyTree, inferred: Boolean = false)(using Context): ValDef = + ta.assignType(untpd.ValDef(sym.name, TypeTree(sym.info, inferred), rhs), sym) + + def SyntheticValDef(name: TermName, rhs: Tree, flags: FlagSet = EmptyFlags)(using Context): ValDef = + ValDef(newSymbol(ctx.owner, name, Synthetic | flags, rhs.tpe.widen, coord = rhs.span), rhs) + + def DefDef(sym: TermSymbol, paramss: List[List[Symbol]], + resultType: Type, rhs: Tree)(using Context): DefDef = + sym.setParamss(paramss) + ta.assignType( + untpd.DefDef( + sym.name, + paramss.map { + case TypeSymbols(params) => params.map(param => TypeDef(param).withSpan(param.span)) + case TermSymbols(params) => params.map(param => ValDef(param).withSpan(param.span)) + case _ => unreachable() + }, + TypeTree(resultType), + rhs), + sym) + + def DefDef(sym: TermSymbol, rhs: Tree = EmptyTree)(using Context): DefDef = + ta.assignType(DefDef(sym, Function.const(rhs) _), sym) + + /** A DefDef with given method symbol `sym`. + * @rhsFn A function from parameter references + * to the method's right-hand side. + * Parameter symbols are taken from the `rawParamss` field of `sym`, or + * are freshly generated if `rawParamss` is empty. 
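+ * For example, a forwarder that simply re-applies its parameters to another method
+ * could be built as follows (an illustrative sketch; `target` stands for some term
+ * symbol already in scope):
+ *
+ *    DefDef(sym, paramRefss => ref(target).appliedToArgss(paramRefss))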
+ */ + def DefDef(sym: TermSymbol, rhsFn: List[List[Tree]] => Tree)(using Context): DefDef = + + // Map method type `tp` with remaining parameters stored in rawParamss to + // final result type and all (given or synthesized) parameters + def recur(tp: Type, remaining: List[List[Symbol]]): (Type, List[List[Symbol]]) = tp match + case tp: PolyType => + val (tparams: List[TypeSymbol], remaining1) = remaining match + case tparams :: remaining1 => + assert(tparams.hasSameLengthAs(tp.paramNames) && tparams.head.isType) + (tparams.asInstanceOf[List[TypeSymbol]], remaining1) + case nil => + (newTypeParams(sym, tp.paramNames, EmptyFlags, tp.instantiateParamInfos(_)), Nil) + val (rtp, paramss) = recur(tp.instantiate(tparams.map(_.typeRef)), remaining1) + (rtp, tparams :: paramss) + case tp: MethodType => + val isParamDependent = tp.isParamDependent + val previousParamRefs: ListBuffer[TermRef] = + // It is ok to assign `null` here. + // If `isParamDependent == false`, the value of `previousParamRefs` is not used. + if isParamDependent then mutable.ListBuffer[TermRef]() else (null: ListBuffer[TermRef] | Null).uncheckedNN + + def valueParam(name: TermName, origInfo: Type): TermSymbol = + val maybeImplicit = + if tp.isContextualMethod then Given + else if tp.isImplicitMethod then Implicit + else EmptyFlags + val maybeErased = if tp.isErasedMethod then Erased else EmptyFlags + + def makeSym(info: Type) = newSymbol(sym, name, TermParam | maybeImplicit | maybeErased, info, coord = sym.coord) + + if isParamDependent then + val sym = makeSym(origInfo.substParams(tp, previousParamRefs.toList)) + previousParamRefs += sym.termRef + sym + else makeSym(origInfo) + end valueParam + + val (vparams: List[TermSymbol], remaining1) = + if tp.paramNames.isEmpty then (Nil, remaining) + else remaining match + case vparams :: remaining1 => + assert(vparams.hasSameLengthAs(tp.paramNames) && vparams.head.isTerm) + (vparams.asInstanceOf[List[TermSymbol]], remaining1) + case nil => + (tp.paramNames.lazyZip(tp.paramInfos).map(valueParam), Nil) + val (rtp, paramss) = recur(tp.instantiate(vparams.map(_.termRef)), remaining1) + (rtp, vparams :: paramss) + case _ => + assert(remaining.isEmpty) + (tp.widenExpr, Nil) + end recur + + val (rtp, paramss) = recur(sym.info, sym.rawParamss) + DefDef(sym, paramss, rtp, rhsFn(paramss.nestedMap(ref))) + end DefDef + + def TypeDef(sym: TypeSymbol)(using Context): TypeDef = + ta.assignType(untpd.TypeDef(sym.name, TypeTree(sym.info)), sym) + + def ClassDef(cls: ClassSymbol, constr: DefDef, body: List[Tree], superArgs: List[Tree] = Nil)(using Context): TypeDef = { + val firstParent :: otherParents = cls.info.parents: @unchecked + val superRef = + if (cls.is(Trait)) TypeTree(firstParent) + else { + def isApplicable(ctpe: Type): Boolean = ctpe match { + case ctpe: PolyType => + isApplicable(ctpe.instantiate(firstParent.argTypes)) + case ctpe: MethodType => + (superArgs corresponds ctpe.paramInfos)(_.tpe <:< _) + case _ => + false + } + val constr = firstParent.decl(nme.CONSTRUCTOR).suchThat(constr => isApplicable(constr.info)) + New(firstParent, constr.symbol.asTerm, superArgs) + } + ClassDefWithParents(cls, constr, superRef :: otherParents.map(TypeTree(_)), body) + } + + def ClassDefWithParents(cls: ClassSymbol, constr: DefDef, parents: List[Tree], body: List[Tree])(using Context): TypeDef = { + val selfType = + if (cls.classInfo.selfInfo ne NoType) ValDef(newSelfSym(cls)) + else EmptyValDef + def isOwnTypeParam(stat: Tree) = + stat.symbol.is(TypeParam) && stat.symbol.owner == cls + val 
bodyTypeParams = body filter isOwnTypeParam map (_.symbol) + val newTypeParams = + for (tparam <- cls.typeParams if !(bodyTypeParams contains tparam)) + yield TypeDef(tparam) + val findLocalDummy = FindLocalDummyAccumulator(cls) + val localDummy = body.foldLeft(NoSymbol: Symbol)(findLocalDummy.apply) + .orElse(newLocalDummy(cls)) + val impl = untpd.Template(constr, parents, Nil, selfType, newTypeParams ++ body) + .withType(localDummy.termRef) + ta.assignType(untpd.TypeDef(cls.name, impl), cls) + } + + /** An anonymous class + * + * new parents { forwarders } + * + * where `forwarders` contains forwarders for all functions in `fns`. + * @param parents a non-empty list of class types + * @param fns a non-empty of functions for which forwarders should be defined in the class. + * The class has the same owner as the first function in `fns`. + * Its position is the union of all functions in `fns`. + */ + def AnonClass(parents: List[Type], fns: List[TermSymbol], methNames: List[TermName])(using Context): Block = { + AnonClass(fns.head.owner, parents, fns.map(_.span).reduceLeft(_ union _)) { cls => + def forwarder(fn: TermSymbol, name: TermName) = { + val fwdMeth = fn.copy(cls, name, Synthetic | Method | Final).entered.asTerm + for overridden <- fwdMeth.allOverriddenSymbols do + if overridden.is(Extension) then fwdMeth.setFlag(Extension) + if !overridden.is(Deferred) then fwdMeth.setFlag(Override) + DefDef(fwdMeth, ref(fn).appliedToArgss(_)) + } + fns.lazyZip(methNames).map(forwarder) + } + } + + /** An anonymous class + * + * new parents { body } + * + * with the specified owner and position. + */ + def AnonClass(owner: Symbol, parents: List[Type], coord: Coord)(body: ClassSymbol => List[Tree])(using Context): Block = + val parents1 = + if (parents.head.classSymbol.is(Trait)) { + val head = parents.head.parents.head + if (head.isRef(defn.AnyClass)) defn.AnyRefType :: parents else head :: parents + } + else parents + val cls = newNormalizedClassSymbol(owner, tpnme.ANON_CLASS, Synthetic | Final, parents1, coord = coord) + val constr = newConstructor(cls, Synthetic, Nil, Nil).entered + val cdef = ClassDef(cls, DefDef(constr), body(cls)) + Block(cdef :: Nil, New(cls.typeRef, Nil)) + + def Import(expr: Tree, selectors: List[untpd.ImportSelector])(using Context): Import = + ta.assignType(untpd.Import(expr, selectors), newImportSymbol(ctx.owner, expr)) + + def Export(expr: Tree, selectors: List[untpd.ImportSelector])(using Context): Export = + ta.assignType(untpd.Export(expr, selectors)) + + def PackageDef(pid: RefTree, stats: List[Tree])(using Context): PackageDef = + ta.assignType(untpd.PackageDef(pid, stats), pid) + + def Annotated(arg: Tree, annot: Tree)(using Context): Annotated = + ta.assignType(untpd.Annotated(arg, annot), arg, annot) + + def Throw(expr: Tree)(using Context): Tree = + ref(defn.throwMethod).appliedTo(expr) + + def Hole(isTermHole: Boolean, idx: Int, args: List[Tree], content: Tree, tpt: Tree)(using Context): Hole = + ta.assignType(untpd.Hole(isTermHole, idx, args, content, tpt), tpt) + + // ------ Making references ------------------------------------------------------ + + def prefixIsElidable(tp: NamedType)(using Context): Boolean = { + val typeIsElidable = tp.prefix match { + case pre: ThisType => + tp.isType || + pre.cls.isStaticOwner || + tp.symbol.isParamOrAccessor && !pre.cls.is(Trait) && ctx.owner.enclosingClass == pre.cls + // was ctx.owner.enclosingClass.derivesFrom(pre.cls) which was not tight enough + // and was spuriously triggered in case inner class would inherit 
from outer one + // eg anonymous TypeMap inside TypeMap.andThen + case pre: TermRef => + pre.symbol.is(Module) && pre.symbol.isStatic + case pre => + pre `eq` NoPrefix + } + typeIsElidable || + tp.symbol.is(JavaStatic) || + tp.symbol.hasAnnotation(defn.ScalaStaticAnnot) + } + + def needsSelect(tp: Type)(using Context): Boolean = tp match { + case tp: TermRef => !prefixIsElidable(tp) + case _ => false + } + + /** A tree representing the same reference as the given type */ + def ref(tp: NamedType, needLoad: Boolean = true)(using Context): Tree = + if (tp.isType) TypeTree(tp) + else if (prefixIsElidable(tp)) Ident(tp) + else if (tp.symbol.is(Module) && ctx.owner.isContainedIn(tp.symbol.moduleClass)) + followOuterLinks(This(tp.symbol.moduleClass.asClass)) + else if (tp.symbol hasAnnotation defn.ScalaStaticAnnot) + Ident(tp) + else + val pre = tp.prefix + if (pre.isSingleton) followOuterLinks(singleton(pre.dealias, needLoad)).select(tp) + else + val res = Select(TypeTree(pre), tp) + if needLoad && !res.symbol.isStatic then + throw TypeError(em"cannot establish a reference to $res") + res + + def ref(sym: Symbol)(using Context): Tree = + ref(NamedType(sym.owner.thisType, sym.name, sym.denot)) + + private def followOuterLinks(t: Tree)(using Context) = t match { + case t: This if ctx.erasedTypes && !(t.symbol == ctx.owner.enclosingClass || t.symbol.isStaticOwner) => + // after erasure outer paths should be respected + ExplicitOuter.OuterOps(ctx.detach).path(toCls = t.tpe.classSymbol) + case t => + t + } + + def singleton(tp: Type, needLoad: Boolean = true)(using Context): Tree = tp.dealias match { + case tp: TermRef => ref(tp, needLoad) + case tp: ThisType => This(tp.cls) + case tp: SkolemType => singleton(tp.narrow, needLoad) + case SuperType(qual, _) => singleton(qual, needLoad) + case ConstantType(value) => Literal(value) + } + + /** A path that corresponds to the given type `tp`. Error if `tp` is not a refinement + * of an addressable singleton type. + */ + def pathFor(tp: Type)(using Context): Tree = { + def recur(tp: Type): Tree = tp match { + case tp: NamedType => + tp.info match { + case TypeAlias(alias) => recur(alias) + case _: TypeBounds => EmptyTree + case _ => singleton(tp) + } + case tp: TypeProxy => recur(tp.superType) + case _ => EmptyTree + } + recur(tp).orElse { + report.error(em"$tp is not an addressable singleton type") + TypeTree(tp) + } + } + + /** A tree representing a `newXYZArray` operation of the right + * kind for the given element type in `elemTpe`. No type arguments or + * `length` arguments are given. + */ + def newArray(elemTpe: Type, returnTpe: Type, span: Span, dims: JavaSeqLiteral)(using Context): Tree = { + val elemClass = elemTpe.classSymbol + def newArr = + ref(defn.DottyArraysModule).select(defn.newArrayMethod).withSpan(span) + + if (!ctx.erasedTypes) { + assert(!TypeErasure.isGeneric(elemTpe), elemTpe) //needs to be done during typer. 
See Applications.convertNewGenericArray + newArr.appliedToTypeTrees(TypeTree(returnTpe) :: Nil).appliedToTermArgs(clsOf(elemTpe) :: clsOf(returnTpe) :: dims :: Nil).withSpan(span) + } + else // after erasure + newArr.appliedToTermArgs(clsOf(elemTpe) :: clsOf(returnTpe) :: dims :: Nil).withSpan(span) + } + + /** The wrapped array method name for an array of type elemtp */ + def wrapArrayMethodName(elemtp: Type)(using Context): TermName = { + val elemCls = elemtp.classSymbol + if (elemCls.isPrimitiveValueClass) nme.wrapXArray(elemCls.name) + else if (elemCls.derivesFrom(defn.ObjectClass) && !elemCls.isNotRuntimeClass) nme.wrapRefArray + else nme.genericWrapArray + } + + /** A tree representing a `wrapXYZArray(tree)` operation of the right + * kind for the given element type in `elemTpe`. + */ + def wrapArray(tree: Tree, elemtp: Type)(using Context): Tree = + val wrapper = ref(defn.getWrapVarargsArrayModule) + .select(wrapArrayMethodName(elemtp)) + .appliedToTypes(if (elemtp.isPrimitiveValueType) Nil else elemtp :: Nil) + val actualElem = wrapper.tpe.widen.firstParamTypes.head + wrapper.appliedTo(tree.ensureConforms(actualElem)) + + // ------ Creating typed equivalents of trees that exist only in untyped form ------- + + /** new C(args), calling the primary constructor of C */ + def New(tp: Type, args: List[Tree])(using Context): Apply = + New(tp, tp.dealias.typeSymbol.primaryConstructor.asTerm, args) + + /** new C(args), calling given constructor `constr` of C */ + def New(tp: Type, constr: TermSymbol, args: List[Tree])(using Context): Apply = { + val targs = tp.argTypes + val tycon = tp.typeConstructor + New(tycon) + .select(TermRef(tycon, constr)) + .appliedToTypes(targs) + .appliedToTermArgs(args) + } + + /** An object def + * + * object obs extends parents { decls } + * + * gets expanded to + * + * val obj = new obj$ + * class obj$ extends parents { this: obj.type => decls } + * + * (The following no longer applies: + * What's interesting here is that the block is well typed + * (because class obj$ is hoistable), but the type of the `obj` val is + * not expressible. What needs to happen in general when + * inferring the type of a val from its RHS, is: if the type contains + * a class that has the val itself as owner, then that class + * is remapped to have the val's owner as owner. Remapping could be + * done by cloning the class with the new owner and substituting + * everywhere in the tree. We know that remapping is safe + * because the only way a local class can appear in the RHS of a val is + * by being hoisted outside of a block, and the necessary checks are + * done at this point already. + * + * On the other hand, for method result type inference, if the type of + * the RHS of a method contains a class owned by the method, this would be + * an error.) 
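+ * Note that the result is packaged as a `Thicket(valdef, clsdef)`, so when it is spliced
+ * into a statement list the usual `Thicket` flattening (`flatten`/`flatTree`) separates it
+ * back into the two definitions.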
+ */ + def ModuleDef(sym: TermSymbol, body: List[Tree])(using Context): tpd.Thicket = { + val modcls = sym.moduleClass.asClass + val constrSym = modcls.primaryConstructor orElse newDefaultConstructor(modcls).entered + val constr = DefDef(constrSym.asTerm, EmptyTree) + val clsdef = ClassDef(modcls, constr, body) + val valdef = ValDef(sym, New(modcls.typeRef).select(constrSym).appliedToNone) + Thicket(valdef, clsdef) + } + + /** A `_` with given type */ + def Underscore(tp: Type)(using Context): Ident = untpd.Ident(nme.WILDCARD).withType(tp) + + def defaultValue(tpe: Type)(using Context): Tree = { + val tpw = tpe.widen + + if (tpw isRef defn.IntClass) Literal(Constant(0)) + else if (tpw isRef defn.LongClass) Literal(Constant(0L)) + else if (tpw isRef defn.BooleanClass) Literal(Constant(false)) + else if (tpw isRef defn.CharClass) Literal(Constant('\u0000')) + else if (tpw isRef defn.FloatClass) Literal(Constant(0f)) + else if (tpw isRef defn.DoubleClass) Literal(Constant(0d)) + else if (tpw isRef defn.ByteClass) Literal(Constant(0.toByte)) + else if (tpw isRef defn.ShortClass) Literal(Constant(0.toShort)) + else nullLiteral.select(defn.Any_asInstanceOf).appliedToType(tpe) + } + + private class FindLocalDummyAccumulator(cls: ClassSymbol)(using Context) extends TreeAccumulator[Symbol] { + def apply(sym: Symbol, tree: Tree)(using Context) = + if (sym.exists) sym + else if (tree.isDef) { + val owner = tree.symbol.owner + if (owner.isLocalDummy && owner.owner == cls) owner + else if (owner == cls) foldOver(sym, tree) + else sym + } + else foldOver(sym, tree) + } + + /** The owner to be used in a local context when traversing a tree */ + def localOwner(tree: Tree)(using Context): Symbol = + val sym = tree.symbol + (if sym.is(PackageVal) then sym.moduleClass else sym).orElse(ctx.owner) + + /** The local context to use when traversing trees */ + def localCtx(tree: Tree)(using Context): Context = ctx.withOwner(localOwner(tree)) + + override val cpy: TypedTreeCopier = // Type ascription needed to pick up any new members in TreeCopier (currently there are none) + TypedTreeCopier() + + val cpyBetweenPhases: TimeTravellingTreeCopier = TimeTravellingTreeCopier() + + class TypedTreeCopier extends TreeCopier { + def postProcess(tree: Tree, copied: untpd.Tree): copied.ThisTree[Type] = + copied.withTypeUnchecked(tree.tpe) + def postProcess(tree: Tree, copied: untpd.MemberDef): copied.ThisTree[Type] = + copied.withTypeUnchecked(tree.tpe) + + protected val untpdCpy = untpd.cpy + + override def Select(tree: Tree)(qualifier: Tree, name: Name)(using Context): Select = { + val tree1 = untpdCpy.Select(tree)(qualifier, name) + tree match { + case tree: Select if qualifier.tpe eq tree.qualifier.tpe => + tree1.withTypeUnchecked(tree.tpe) + case _ => + val tree2: Select = tree.tpe match { + case tpe: NamedType => + val qualType = qualifier.tpe.widenIfUnstable + if qualType.isExactlyNothing then tree1.withTypeUnchecked(tree.tpe) + else tree1.withType(tpe.derivedSelect(qualType)) + case _ => tree1.withTypeUnchecked(tree.tpe) + } + ConstFold.Select(tree2) + } + } + + override def Apply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): Apply = { + val tree1 = untpdCpy.Apply(tree)(fun, args) + tree match { + case tree: Apply + if (fun.tpe eq tree.fun.tpe) && sameTypes(args, tree.args) => + tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, fun, args) + } + } + + override def TypeApply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): TypeApply = { + val tree1 = untpdCpy.TypeApply(tree)(fun, 
args) + tree match { + case tree: TypeApply + if (fun.tpe eq tree.fun.tpe) && sameTypes(args, tree.args) => + tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, fun, args) + } + } + + override def Literal(tree: Tree)(const: Constant)(using Context): Literal = + ta.assignType(untpdCpy.Literal(tree)(const)) + + override def New(tree: Tree)(tpt: Tree)(using Context): New = + ta.assignType(untpdCpy.New(tree)(tpt), tpt) + + override def Typed(tree: Tree)(expr: Tree, tpt: Tree)(using Context): Typed = + ta.assignType(untpdCpy.Typed(tree)(expr, tpt), tpt) + + override def NamedArg(tree: Tree)(name: Name, arg: Tree)(using Context): NamedArg = + ta.assignType(untpdCpy.NamedArg(tree)(name, arg), arg) + + override def Assign(tree: Tree)(lhs: Tree, rhs: Tree)(using Context): Assign = + ta.assignType(untpdCpy.Assign(tree)(lhs, rhs)) + + override def Block(tree: Tree)(stats: List[Tree], expr: Tree)(using Context): Block = { + val tree1 = untpdCpy.Block(tree)(stats, expr) + tree match { + case tree: Block if (expr.tpe eq tree.expr.tpe) && (expr.tpe eq tree.tpe) => + // The last guard is a conservative check: if `tree.tpe` is different from `expr.tpe`, then + // it was computed from widening `expr.tpe`, and tree transforms might cause `expr.tpe.widen` + // to change even if `expr.tpe` itself didn't change, e.g: + // { val s = ...; s } + // If the type of `s` changed, then the type of the block might have changed, even though `expr.tpe` + // will still be `TermRef(NoPrefix, s)` + tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, stats, expr) + } + } + + override def If(tree: Tree)(cond: Tree, thenp: Tree, elsep: Tree)(using Context): If = { + val tree1 = untpdCpy.If(tree)(cond, thenp, elsep) + tree match { + case tree: If if (thenp.tpe eq tree.thenp.tpe) && (elsep.tpe eq tree.elsep.tpe) && + ((tree.tpe eq thenp.tpe) || (tree.tpe eq elsep.tpe)) => + // The last guard is a conservative check similar to the one done in `Block` above, + // if `tree.tpe` is not identical to the type of one of its branch, it might have been + // computed from the widened type of the branches, so the same reasoning than + // in `Block` applies. 
+ tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, thenp, elsep) + } + } + + override def Closure(tree: Tree)(env: List[Tree], meth: Tree, tpt: Tree)(using Context): Closure = { + val tree1 = untpdCpy.Closure(tree)(env, meth, tpt) + tree match { + case tree: Closure if sameTypes(env, tree.env) && (meth.tpe eq tree.meth.tpe) && (tpt.tpe eq tree.tpt.tpe) => + tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, meth, tpt) + } + } + + override def Match(tree: Tree)(selector: Tree, cases: List[CaseDef])(using Context): Match = { + val tree1 = untpdCpy.Match(tree)(selector, cases) + tree match { + case tree: Match if sameTypes(cases, tree.cases) => tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, selector, cases) + } + } + + override def CaseDef(tree: Tree)(pat: Tree, guard: Tree, body: Tree)(using Context): CaseDef = { + val tree1 = untpdCpy.CaseDef(tree)(pat, guard, body) + tree match { + case tree: CaseDef if body.tpe eq tree.body.tpe => tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, pat, body) + } + } + + override def Labeled(tree: Tree)(bind: Bind, expr: Tree)(using Context): Labeled = + ta.assignType(untpdCpy.Labeled(tree)(bind, expr)) + + override def Return(tree: Tree)(expr: Tree, from: Tree)(using Context): Return = + ta.assignType(untpdCpy.Return(tree)(expr, from)) + + override def WhileDo(tree: Tree)(cond: Tree, body: Tree)(using Context): WhileDo = + ta.assignType(untpdCpy.WhileDo(tree)(cond, body)) + + override def Try(tree: Tree)(expr: Tree, cases: List[CaseDef], finalizer: Tree)(using Context): Try = { + val tree1 = untpdCpy.Try(tree)(expr, cases, finalizer) + tree match { + case tree: Try if (expr.tpe eq tree.expr.tpe) && sameTypes(cases, tree.cases) => tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, expr, cases) + } + } + + override def Inlined(tree: Tree)(call: Tree, bindings: List[MemberDef], expansion: Tree)(using Context): Inlined = { + val tree1 = untpdCpy.Inlined(tree)(call, bindings, expansion) + tree match { + case tree: Inlined if sameTypes(bindings, tree.bindings) && (expansion.tpe eq tree.expansion.tpe) => + tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, bindings, expansion) + } + } + + override def SeqLiteral(tree: Tree)(elems: List[Tree], elemtpt: Tree)(using Context): SeqLiteral = { + val tree1 = untpdCpy.SeqLiteral(tree)(elems, elemtpt) + tree match { + case tree: SeqLiteral + if sameTypes(elems, tree.elems) && (elemtpt.tpe eq tree.elemtpt.tpe) => + tree1.withTypeUnchecked(tree.tpe) + case _ => + ta.assignType(tree1, elems, elemtpt) + } + } + + override def Annotated(tree: Tree)(arg: Tree, annot: Tree)(using Context): Annotated = { + val tree1 = untpdCpy.Annotated(tree)(arg, annot) + tree match { + case tree: Annotated if (arg.tpe eq tree.arg.tpe) && (annot eq tree.annot) => tree1.withTypeUnchecked(tree.tpe) + case _ => ta.assignType(tree1, arg, annot) + } + } + + override def If(tree: If)(cond: Tree = tree.cond, thenp: Tree = tree.thenp, elsep: Tree = tree.elsep)(using Context): If = + If(tree: Tree)(cond, thenp, elsep) + override def Closure(tree: Closure)(env: List[Tree] = tree.env, meth: Tree = tree.meth, tpt: Tree = tree.tpt)(using Context): Closure = + Closure(tree: Tree)(env, meth, tpt) + override def CaseDef(tree: CaseDef)(pat: Tree = tree.pat, guard: Tree = tree.guard, body: Tree = tree.body)(using Context): CaseDef = + CaseDef(tree: Tree)(pat, guard, body) + override def Try(tree: Try)(expr: Tree = tree.expr, cases: List[CaseDef] = tree.cases, 
finalizer: Tree = tree.finalizer)(using Context): Try = + Try(tree: Tree)(expr, cases, finalizer) + } + + class TimeTravellingTreeCopier extends TypedTreeCopier { + override def Apply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): Apply = + tree match + case tree: Apply + if (tree.fun eq fun) && (tree.args eq args) + && tree.tpe.isInstanceOf[ConstantType] + && isPureExpr(tree) => tree + case _ => + ta.assignType(untpdCpy.Apply(tree)(fun, args), fun, args) + // Note: Reassigning the original type if `fun` and `args` have the same types as before + // does not work here in general: The computed type depends on the widened function type, not + // the function type itself. A tree transform may keep the function type the + // same but its widened type might change. + // However, we keep constant types of pure expressions. This uses the underlying assumptions + // that pure functions yielding a constant will not change in later phases. + + override def TypeApply(tree: Tree)(fun: Tree, args: List[Tree])(using Context): TypeApply = + ta.assignType(untpdCpy.TypeApply(tree)(fun, args), fun, args) + // Same remark as for Apply + + override def Closure(tree: Tree)(env: List[Tree], meth: Tree, tpt: Tree)(using Context): Closure = + ta.assignType(untpdCpy.Closure(tree)(env, meth, tpt), meth, tpt) + + override def Closure(tree: Closure)(env: List[Tree] = tree.env, meth: Tree = tree.meth, tpt: Tree = tree.tpt)(using Context): Closure = + Closure(tree: Tree)(env, meth, tpt) + } + + override def skipTransform(tree: Tree)(using Context): Boolean = tree.tpe.isError + + implicit class TreeOps[ThisTree <: tpd.Tree](private val tree: ThisTree) extends AnyVal { + + def isValue(using Context): Boolean = + tree.isTerm && tree.tpe.widen.isValueType + + def isValueOrPattern(using Context): Boolean = + tree.isValue || tree.isPattern + + def isValueType: Boolean = + tree.isType && tree.tpe.isValueType + + def isInstantiation: Boolean = tree match { + case Apply(Select(New(_), nme.CONSTRUCTOR), _) => true + case _ => false + } + + def shallowFold[T](z: T)(op: (T, tpd.Tree) => T)(using Context): T = + ShallowFolder(op).apply(z, tree) + + def deepFold[T](z: T)(op: (T, tpd.Tree) => T)(using Context): T = + DeepFolder(op).apply(z, tree) + + def find[T](pred: (tpd.Tree) => Boolean)(using Context): Option[tpd.Tree] = + shallowFold[Option[tpd.Tree]](None)((accum, tree) => if (pred(tree)) Some(tree) else accum) + + def subst(from: List[Symbol], to: List[Symbol])(using Context): ThisTree = + TreeTypeMap(substFrom = from, substTo = to).apply(tree) + + /** Change owner from `from` to `to`. If `from` is a weak owner, also change its + * owner to `to`, and continue until a non-weak owner is reached. 
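+     *
+     *  A typical use (the symbol names here are purely illustrative) is re-homing a
+     *  right-hand side that is moved into a new method, e.g.
+     *  `rhs.changeOwner(oldMethodSym, newMethodSym)`, so that definitions nested in
+     *  `rhs` are afterwards owned by `newMethodSym`.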
+ */ + def changeOwner(from: Symbol, to: Symbol)(using Context): ThisTree = { + @tailrec def loop(from: Symbol, froms: List[Symbol], tos: List[Symbol]): ThisTree = + if (from.isWeakOwner && !from.owner.isClass) + loop(from.owner, from :: froms, to :: tos) + else + //println(i"change owner ${from :: froms}%, % ==> $tos of $tree") + TreeTypeMap(oldOwners = from :: froms, newOwners = tos).apply(tree) + if (from == to) tree else loop(from, Nil, to :: Nil) + } + + /** + * Set the owner of every definition in this tree which is not itself contained in this + * tree to be `newowner` + */ + def changeNonLocalOwners(newOwner: Symbol)(using Context): Tree = { + val ownerAcc = new TreeAccumulator[immutable.Set[Symbol]] { + def apply(ss: immutable.Set[Symbol], tree: Tree)(using Context) = tree match { + case tree: DefTree => + val sym = tree.symbol + if sym.exists && !sym.owner.is(Package) then ss + sym.owner else ss + case _ => + foldOver(ss, tree) + } + } + val owners = ownerAcc(immutable.Set.empty[Symbol], tree).toList + val newOwners = List.fill(owners.size)(newOwner) + TreeTypeMap(oldOwners = owners, newOwners = newOwners).apply(tree) + } + + /** After phase `trans`, set the owner of every definition in this tree that was formerly + * owner by `from` to `to`. + */ + def changeOwnerAfter(from: Symbol, to: Symbol, trans: DenotTransformer)(using Context): ThisTree = + if (ctx.phase == trans.next) { + val traverser = new TreeTraverser { + def traverse(tree: Tree)(using Context) = tree match { + case tree: DefTree => + val sym = tree.symbol + val prevDenot = atPhase(trans)(sym.denot) + if (prevDenot.effectiveOwner == from.skipWeakOwner) { + val d = sym.copySymDenotation(owner = to) + d.installAfter(trans) + d.transformAfter(trans, d => if (d.owner eq from) d.copySymDenotation(owner = to) else d) + } + if (sym.isWeakOwner) traverseChildren(tree) + case _ => + traverseChildren(tree) + } + } + traverser.traverse(tree) + tree + } + else atPhase(trans.next)(changeOwnerAfter(from, to, trans)) + + /** A select node with the given selector name and a computed type */ + def select(name: Name)(using Context): Select = + Select(tree, name) + + /** A select node with the given selector name such that the designated + * member satisfies predicate `p`. Useful for disambiguating overloaded members. + */ + def select(name: Name, p: Symbol => Boolean)(using Context): Select = + select(tree.tpe.member(name).suchThat(p).symbol) + + /** A select node with the given type */ + def select(tp: NamedType)(using Context): Select = + untpd.Select(tree, tp.name).withType(tp) + + /** A select node that selects the given symbol. Note: Need to make sure this + * is in fact the symbol you would get when you select with the symbol's name, + * otherwise a data race may occur which would be flagged by -Yno-double-bindings. + */ + def select(sym: Symbol)(using Context): Select = { + val tp = + if (sym.isType) { + assert(!sym.is(TypeParam)) + TypeRef(tree.tpe, sym.asType) + } + else + TermRef(tree.tpe, sym.name.asTermName, sym.denot.asSeenFrom(tree.tpe)) + untpd.Select(tree, sym.name).withType(tp) + } + + /** A select node with the given selector name and signature and a computed type */ + def selectWithSig(name: Name, sig: Signature, target: Name)(using Context): Tree = + untpd.SelectWithSig(tree, name, sig).withType(tree.tpe.select(name.asTermName, sig, target)) + + /** A select node with selector name and signature taken from `sym`. 
+ * Note: Use this method instead of select(sym) if the referenced symbol + * might be overridden in the type of the qualifier prefix. See note + * on select(sym: Symbol). + */ + def selectWithSig(sym: Symbol)(using Context): Tree = + selectWithSig(sym.name, sym.signature, sym.targetName) + + /** A unary apply node with given argument: `tree(arg)` */ + def appliedTo(arg: Tree)(using Context): Apply = + appliedToTermArgs(arg :: Nil) + + /** An apply node with given arguments: `tree(arg, args0, ..., argsN)` */ + def appliedTo(arg: Tree, args: Tree*)(using Context): Apply = + appliedToTermArgs(arg :: args.toList) + + /** An apply node with given argument list `tree(args(0), ..., args(args.length - 1))` */ + def appliedToTermArgs(args: List[Tree])(using Context): Apply = + Apply(tree, args) + + /** An applied node that accepts only varargs as arguments */ + def appliedToVarargs(args: List[Tree], tpt: Tree)(using Context): Apply = + appliedTo(repeated(args, tpt)) + + /** An apply or type apply node with given argument list */ + def appliedToArgs(args: List[Tree])(using Context): GenericApply = args match + case arg :: args1 if arg.isType => TypeApply(tree, args) + case _ => Apply(tree, args) + + /** The current tree applied to given argument lists: + * `tree (argss(0)) ... (argss(argss.length -1))` + */ + def appliedToArgss(argss: List[List[Tree]])(using Context): Tree = + argss.foldLeft(tree: Tree)(_.appliedToArgs(_)) + + /** The current tree applied to (): `tree()` */ + def appliedToNone(using Context): Apply = Apply(tree, Nil) + + /** The current tree applied to given type argument: `tree[targ]` */ + def appliedToType(targ: Type)(using Context): Tree = + appliedToTypes(targ :: Nil) + + /** The current tree applied to given type arguments: `tree[targ0, ..., targN]` */ + def appliedToTypes(targs: List[Type])(using Context): Tree = + appliedToTypeTrees(targs map (TypeTree(_))) + + /** The current tree applied to given type argument: `tree[targ]` */ + def appliedToTypeTree(targ: Tree)(using Context): Tree = + appliedToTypeTrees(targ :: Nil) + + /** The current tree applied to given type argument list: `tree[targs(0), ..., targs(targs.length - 1)]` */ + def appliedToTypeTrees(targs: List[Tree])(using Context): Tree = + if targs.isEmpty then tree else TypeApply(tree, targs) + + /** Apply to `()` unless tree's widened type is parameterless */ + def ensureApplied(using Context): Tree = + if (tree.tpe.widen.isParameterless) tree else tree.appliedToNone + + /** `tree == that` */ + def equal(that: Tree)(using Context): Tree = + if (that.tpe.widen.isRef(defn.NothingClass)) + Literal(Constant(false)) + else + applyOverloaded(tree, nme.EQ, that :: Nil, Nil, defn.BooleanType) + + /** `tree.isInstanceOf[tp]`, with special treatment of singleton types */ + def isInstance(tp: Type)(using Context): Tree = tp.dealias match { + case ConstantType(c) if c.tag == StringTag => + singleton(tp).equal(tree) + case tp: SingletonType => + if tp.widen.derivesFrom(defn.ObjectClass) then + tree.ensureConforms(defn.ObjectType).select(defn.Object_eq).appliedTo(singleton(tp)) + else + singleton(tp).equal(tree) + case _ => + tree.select(defn.Any_isInstanceOf).appliedToType(tp) + } + + /** tree.asInstanceOf[`tp`] */ + def asInstance(tp: Type)(using Context): Tree = { + assert(tp.isValueType, i"bad cast: $tree.asInstanceOf[$tp]") + tree.select(defn.Any_asInstanceOf).appliedToType(tp) + } + + /** cast tree to `tp`, assuming no exception is raised, i.e the operation is pure */ + def cast(tp: Type)(using Context): Tree = 
cast(TypeTree(tp)) + + /** cast tree to `tp`, assuming no exception is raised, i.e the operation is pure */ + def cast(tpt: TypeTree)(using Context): Tree = + assert(tpt.tpe.isValueType, i"bad cast: $tree.asInstanceOf[$tpt]") + tree.select(if (ctx.erasedTypes) defn.Any_asInstanceOf else defn.Any_typeCast) + .appliedToTypeTree(tpt) + + /** cast `tree` to `tp` (or its box/unbox/cast equivalent when after + * erasure and value and non-value types are mixed), + * unless tree's type already conforms to `tp`. + */ + def ensureConforms(tp: Type)(using Context): Tree = + if (tree.tpe <:< tp) tree + else if (!ctx.erasedTypes) cast(tp) + else Erasure.Boxing.adaptToType(tree, tp) + + /** `tree ne null` (might need a cast to be type correct) */ + def testNotNull(using Context): Tree = { + // If the receiver is of type `Nothing` or `Null`, add an ascription or cast + // so that the selection succeeds. + // e.g. `null.ne(null)` doesn't type, but `(null: AnyRef).ne(null)` does. + val receiver = + if tree.tpe.isBottomType then + if ctx.explicitNulls then tree.cast(defn.AnyRefType) + else Typed(tree, TypeTree(defn.AnyRefType)) + else tree.ensureConforms(defn.ObjectType) + // also need to cast the null literal to AnyRef in explicit nulls + val nullLit = if ctx.explicitNulls then nullLiteral.cast(defn.AnyRefType) else nullLiteral + receiver.select(defn.Object_ne).appliedTo(nullLit).withSpan(tree.span) + } + + /** If inititializer tree is `_`, the default value of its type, + * otherwise the tree itself. + */ + def wildcardToDefault(using Context): Tree = + if (isWildcardArg(tree)) defaultValue(tree.tpe) else tree + + /** `this && that`, for boolean trees `this`, `that` */ + def and(that: Tree)(using Context): Tree = + tree.select(defn.Boolean_&&).appliedTo(that) + + /** `this || that`, for boolean trees `this`, `that` */ + def or(that: Tree)(using Context): Tree = + tree.select(defn.Boolean_||).appliedTo(that) + + /** The translation of `tree = rhs`. + * This is either the tree as an assignment, or a setter call. + */ + def becomes(rhs: Tree)(using Context): Tree = { + val sym = tree.symbol + if (sym.is(Method)) { + val setter = sym.setter.orElse { + assert(sym.name.isSetterName && sym.info.firstParamTypes.nonEmpty, sym) + sym + } + val qual = tree match { + case id: Ident => desugarIdentPrefix(id) + case Select(qual, _) => qual + } + qual.select(setter).appliedTo(rhs) + } + else Assign(tree, rhs) + } + + /** tree @annot + * + * works differently for type trees and term trees + */ + def annotated(annot: Tree)(using Context): Tree = + if (tree.isTerm) + Typed(tree, TypeTree(AnnotatedType(tree.tpe.widenIfUnstable, Annotation(annot)))) + else + Annotated(tree, annot) + + /** A synthetic select with that will be turned into an outer path by ExplicitOuter. + * @param levels How many outer levels to select + * @param tp The type of the destination of the outer path. 
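+     *
+     *  For example, `tree.outerSelect(1, tp)` (with `tp` some expected type) stands
+     *  for the immediately enclosing outer instance of `tree`, viewed at type `tp`;
+     *  the synthetic selection is only resolved to a real outer path by ExplicitOuter.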
+ */ + def outerSelect(levels: Int, tp: Type)(using Context): Tree = + untpd.Select(tree, OuterSelectName(EmptyTermName, levels)).withType(SkolemType(tp)) + + /** Replace Inlined nodes and InlineProxy references to underlying arguments */ + def underlyingArgument(using Context): Tree = { + val mapToUnderlying = new MapToUnderlying { + /** Should get the rhs of this binding + * Returns true if the symbol is a val or def generated by eta-expansion/inline + */ + override protected def skipLocal(sym: Symbol): Boolean = + sym.isOneOf(InlineProxy | Synthetic) + } + mapToUnderlying.transform(tree) + } + + /** Replace Ident nodes references to the underlying tree that defined them */ + def underlying(using Context): Tree = MapToUnderlying().transform(tree) + + // --- Higher order traversal methods ------------------------------- + + /** Apply `f` to each subtree of this tree */ + def foreachSubTree(f: Tree => Unit)(using Context): Unit = { + val traverser = new TreeTraverser { + def traverse(tree: Tree)(using Context) = foldOver(f(tree), tree) + } + traverser.traverse(tree) + } + + /** Is there a subtree of this tree that satisfies predicate `p`? */ + def existsSubTree(p: Tree => Boolean)(using Context): Boolean = { + val acc = new TreeAccumulator[Boolean] { + def apply(x: Boolean, t: Tree)(using Context) = x || p(t) || foldOver(x, t) + } + acc(false, tree) + } + + /** All subtrees of this tree that satisfy predicate `p`. */ + def filterSubTrees(f: Tree => Boolean)(using Context): List[Tree] = { + val buf = mutable.ListBuffer[Tree]() + foreachSubTree { tree => if (f(tree)) buf += tree } + buf.toList + } + + /** Set this tree as the `defTree` of its symbol and return this tree */ + def setDefTree(using Context): ThisTree = { + val sym = tree.symbol + if (sym.exists) sym.defTree = tree + tree + } + + def etaExpandCFT(using Context): Tree = + def expand(target: Tree, tp: Type)(using Context): Tree = tp match + case defn.ContextFunctionType(argTypes, resType, isErased) => + val anonFun = newAnonFun( + ctx.owner, + MethodType.companion(isContextual = true, isErased = isErased)(argTypes, resType), + coord = ctx.owner.coord) + def lambdaBody(refss: List[List[Tree]]) = + expand(target.select(nme.apply).appliedToArgss(refss), resType)( + using ctx.withOwner(anonFun)) + Closure(anonFun, lambdaBody) + case _ => + target + expand(tree, tree.tpe.widen) + } + + inline val MapRecursionLimit = 10 + + extension (trees: List[Tree]) + + /** A map that expands to a recursive function. It's equivalent to + * + * flatten(trees.mapConserve(op)) + * + * and falls back to it after `MaxRecursionLimit` recursions. + * Before that it uses a simpler method that uses stackspace + * instead of heap. + * Note `op` is duplicated in the generated code, so it should be + * kept small. + */ + inline def mapInline(inline op: Tree => Tree): List[Tree] = + def recur(trees: List[Tree], count: Int): List[Tree] = + if count > MapRecursionLimit then + // use a slower implementation that avoids stack overflows + flatten(trees.mapConserve(op)) + else trees match + case tree :: rest => + val tree1 = op(tree) + val rest1 = recur(rest, count + 1) + if (tree1 eq tree) && (rest1 eq rest) then trees + else tree1 match + case Thicket(elems1) => elems1 ::: rest1 + case _ => tree1 :: rest1 + case nil => nil + recur(trees, 0) + + /** Transform statements while maintaining import contexts and expression contexts + * in the same way as Typer does. 
The code addresses additional concerns: + * - be tail-recursive where possible + * - don't re-allocate trees where nothing has changed + */ + inline def mapStatements[T]( + exprOwner: Symbol, + inline op: Tree => Context ?=> Tree, + inline wrapResult: List[Tree] => Context ?=> T)(using Context): T = + @tailrec + def loop(mapped: mutable.ListBuffer[Tree] | Null, unchanged: List[Tree], pending: List[Tree])(using Context): T = + pending match + case stat :: rest => + val statCtx = stat match + case _: DefTree | _: ImportOrExport => ctx + case _ => ctx.exprContext(stat, exprOwner) + val stat1 = op(stat)(using statCtx) + val restCtx = stat match + case stat: Import => ctx.importContext(stat, stat.symbol) + case _ => ctx + if stat1 eq stat then + loop(mapped, unchanged, rest)(using restCtx) + else + val buf = if mapped == null then new mutable.ListBuffer[Tree] else mapped + var xc = unchanged + while xc ne pending do + buf += xc.head + xc = xc.tail + stat1 match + case Thicket(stats1) => buf ++= stats1 + case _ => buf += stat1 + loop(buf, rest, rest)(using restCtx) + case nil => + wrapResult( + if mapped == null then unchanged + else mapped.prependToList(unchanged)) + + loop(null, trees, trees) + end mapStatements + end extension + + /** A treemap that generates the same contexts as the original typer for statements. + * This means: + * - statements that are not definitions get the exprOwner as owner + * - imports are reflected in the contexts of subsequent statements + */ + class TreeMapWithPreciseStatContexts(cpy: TreeCopier = tpd.cpy) extends TreeMap(cpy): + def transformStats[T](trees: List[Tree], exprOwner: Symbol, wrapResult: List[Tree] => Context ?=> T)(using Context): T = + trees.mapStatements(exprOwner, transform(_), wrapResult) + final override def transformStats(trees: List[Tree], exprOwner: Symbol)(using Context): List[Tree] = + transformStats(trees, exprOwner, sameStats) + override def transformBlock(blk: Block)(using Context) = + transformStats(blk.stats, ctx.owner, + stats1 => ctx ?=> cpy.Block(blk)(stats1, transform(blk.expr))) + + val sameStats: List[Tree] => Context ?=> List[Tree] = stats => stats + + /** Map Inlined nodes, NamedArgs, Blocks with no statements and local references to underlying arguments. + * Also drops Inline and Block with no statements. + */ + private class MapToUnderlying extends TreeMap { + override def transform(tree: Tree)(using Context): Tree = tree match { + case tree: Ident if isBinding(tree.symbol) && skipLocal(tree.symbol) => + tree.symbol.defTree match { + case defTree: ValOrDefDef => + val rhs = defTree.rhs + assert(!rhs.isEmpty) + transform(rhs) + case _ => tree + } + case Inlined(_, Nil, arg) => transform(arg) + case Block(Nil, arg) => transform(arg) + case NamedArg(_, arg) => transform(arg) + case tree => super.transform(tree) + } + + /** Should get the rhs of this binding */ + protected def skipLocal(sym: Symbol): Boolean = true + + /** Is this a symbol that of a local val or parameterless def for which we could get the rhs */ + private def isBinding(sym: Symbol)(using Context): Boolean = + sym.isTerm && !sym.is(Param) && !sym.owner.isClass && + !(sym.is(Method) && sym.info.isInstanceOf[MethodOrPoly]) // if is a method it is parameterless + } + + extension (xs: List[tpd.Tree]) + def tpes: List[Type] = xs match { + case x :: xs1 => x.tpe :: xs1.tpes + case nil => Nil + } + + /** A trait for loaders that compute trees. Currently implemented just by DottyUnpickler. 
   */
+  trait TreeProvider {
+    protected def computeRootTrees(using Context): List[Tree]
+
+    private var myTrees: List[Tree] | Null = _
+
+    /** Get trees defined by this provider. Cache them if -Yretain-trees is set. */
+    def rootTrees(using Context): List[Tree] =
+      if (ctx.settings.YretainTrees.value) {
+        if (myTrees == null) myTrees = computeRootTrees
+        myTrees.uncheckedNN
+      }
+      else computeRootTrees
+
+    /** Get first tree defined by this provider, or EmptyTree if none exists */
+    def tree(using Context): Tree =
+      rootTrees.headOption.getOrElse(EmptyTree)
+
+    /** Is it possible that the tree to load contains a definition of or reference to `id`? */
+    def mightContain(id: String)(using Context): Boolean = true
+  }
+
+  // convert a numeric with a toXXX method
+  def primitiveConversion(tree: Tree, numericCls: Symbol)(using Context): Tree = {
+    val mname = "to".concat(numericCls.name)
+    val conversion = tree.tpe member(mname)
+    if (conversion.symbol.exists)
+      tree.select(conversion.symbol.termRef).ensureApplied
+    else if (tree.tpe.widen isRef numericCls)
+      tree
+    else {
+      report.warning(em"conversion from ${tree.tpe.widen} to ${numericCls.typeRef} will always fail at runtime.")
+      Throw(New(defn.ClassCastExceptionClass.typeRef, Nil)).withSpan(tree.span)
+    }
+  }
+
+  /** A tree that corresponds to `Predef.classOf[$tp]` in source */
+  def clsOf(tp: Type)(using Context): Tree =
+    if ctx.erasedTypes && !tp.isRef(defn.UnitClass) then
+      Literal(Constant(TypeErasure.erasure(tp)))
+    else
+      Literal(Constant(tp))
+
+  @tailrec
+  def sameTypes(trees: List[tpd.Tree], trees1: List[tpd.Tree]): Boolean =
+    if (trees.isEmpty) trees1.isEmpty
+    else if (trees1.isEmpty) trees.isEmpty
+    else (trees.head.tpe eq trees1.head.tpe) && sameTypes(trees.tail, trees1.tail)
+
+  /** If `tree`'s purity level is less than `level`, let-bind it so that it gets evaluated
+   *  only once. I.e.
produce a + * + * { val x = 'tree ; ~within('x) } + * + * instead of otherwise + * + * ~within('tree) + */ + def letBindUnless(level: TreeInfo.PurityLevel, tree: Tree)(within: Tree => Tree)(using Context): Tree = + if (exprPurity(tree) >= level) within(tree) + else { + val vdef = SyntheticValDef(TempResultName.fresh(), tree) + Block(vdef :: Nil, within(Ident(vdef.namedType))) + } + + /** Let bind `tree` unless `tree` is at least idempotent */ + def evalOnce(tree: Tree)(within: Tree => Tree)(using Context): Tree = + letBindUnless(TreeInfo.Idempotent, tree)(within) + + def runtimeCall(name: TermName, args: List[Tree])(using Context): Tree = + Ident(defn.ScalaRuntimeModule.requiredMethod(name).termRef).appliedToTermArgs(args) + + /** An extractor that pulls out type arguments */ + object MaybePoly: + def unapply(tree: Tree): Option[(Tree, List[Tree])] = tree match + case TypeApply(tree, targs) => Some(tree, targs) + case _ => Some(tree, Nil) + + object TypeArgs: + def unapply(ts: List[Tree]): Option[List[Tree]] = + if ts.nonEmpty && ts.head.isType then Some(ts) else None + + /** Split argument clauses into a leading type argument clause if it exists and + * remaining clauses + */ + def splitArgs(argss: List[List[Tree]]): (List[Tree], List[List[Tree]]) = argss match + case TypeArgs(targs) :: argss1 => (targs, argss1) + case _ => (Nil, argss) + + def joinArgs(targs: List[Tree], argss: List[List[Tree]]): List[List[Tree]] = + if targs.isEmpty then argss else targs :: argss + + /** A key to be used in a context property that tracks enclosing inlined calls */ + private val InlinedCalls = Property.Key[List[Tree]]() + + /** A key to be used in a context property that tracks the number of inlined trees */ + private val InlinedTrees = Property.Key[Counter]() + final class Counter { + var count: Int = 0 + } + + /** Record an enclosing inlined call. + * EmptyTree calls (for parameters) cancel the next-enclosing call in the list instead of being added to it. + * We assume parameters are never nested inside parameters. + */ + override def inlineContext(call: Tree)(using Context): Context = { + // We assume enclosingInlineds is already normalized, and only process the new call with the head. + val oldIC = enclosingInlineds + + val newIC = + if call.isEmpty then + oldIC match + case t1 :: ts2 => ts2 + case _ => oldIC + else + call :: oldIC + + val ctx1 = ctx.fresh.setProperty(InlinedCalls, newIC) + if oldIC.isEmpty then ctx1.setProperty(InlinedTrees, new Counter) else ctx1 + } + + /** All enclosing calls that are currently inlined, from innermost to outermost. + */ + def enclosingInlineds(using Context): List[Tree] = + ctx.property(InlinedCalls).getOrElse(Nil) + + /** Record inlined trees */ + def addInlinedTrees(n: Int)(using Context): Unit = + ctx.property(InlinedTrees).foreach(_.count += n) + + /** Check if the limit on the number of inlined trees has been reached */ + def reachedInlinedTreesLimit(using Context): Boolean = + ctx.property(InlinedTrees) match + case Some(c) => c.count > ctx.settings.XmaxInlinedTrees.value + case None => false + + /** The source file where the symbol of the `inline` method referred to by `call` + * is defined + */ + def sourceFile(call: Tree)(using Context): SourceFile = call.symbol.source + + /** Desugar identifier into a select node. Return the tree itself if not possible */ + def desugarIdent(tree: Ident)(using Context): RefTree = { + val qual = desugarIdentPrefix(tree) + if (qual.isEmpty) tree + else qual.select(tree.symbol) + } + + /** Recover identifier prefix (e.g. 
this) if it exists */ + def desugarIdentPrefix(tree: Ident)(using Context): Tree = tree.tpe match { + case TermRef(prefix: TermRef, _) => + prefix.info match + case mt: MethodType if mt.paramInfos.isEmpty && mt.resultType.typeSymbol.is(Module) => + ref(mt.resultType.typeSymbol.sourceModule) + case _ => + ref(prefix) + case TermRef(prefix: ThisType, _) => + This(prefix.cls) + case _ => + EmptyTree + } + + /** + * The symbols that are imported with `expr.name` + * + * @param expr The base of the import statement + * @param name The name that is being imported. + * @return All the symbols that would be imported with `expr.name`. + */ + def importedSymbols(expr: Tree, name: Name)(using Context): List[Symbol] = { + def lookup(name: Name): Symbol = expr.tpe.member(name).symbol + val symbols = + List(lookup(name.toTermName), + lookup(name.toTypeName), + lookup(name.moduleClassName), + lookup(name.sourceModuleName)) + + symbols.map(_.sourceSymbol).filter(_.exists).distinct + } + + /** + * All the symbols that are imported by the first selector of `imp` that matches + * `selectorPredicate`. + * + * @param imp The import statement to analyze + * @param selectorPredicate A test to find the selector to use. + * @return The symbols imported. + */ + def importedSymbols(imp: Import, + selectorPredicate: untpd.ImportSelector -> Boolean = util.common.alwaysTrue) + (using Context): List[Symbol] = + imp.selectors.find(selectorPredicate) match + case Some(sel) => importedSymbols(imp.expr, sel.name) + case _ => Nil + + /** + * The list of select trees that resolve to the same symbols as the ones that are imported + * by `imp`. + */ + def importSelections(imp: Import)(using Context): List[Select] = { + def imported(sym: Symbol, id: untpd.Ident, rename: Option[untpd.Ident]): List[Select] = { + // Give a zero-extent position to the qualifier to prevent it from being included several + // times in results in the language server. + val noPosExpr = focusPositions(imp.expr) + val selectTree = Select(noPosExpr, sym.name).withSpan(id.span) + rename match { + case None => + selectTree :: Nil + case Some(rename) => + // Get the type of the symbol that is actually selected, and construct a select + // node with the new name and the type of the real symbol. 
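+          // For instance, a renaming import `import p.{A as B}` yields both a tree
+          // for `p.A` and a tree named `B` that is given the type of `p.A`.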
+          val name = if (sym.name.isTypeName) rename.name.toTypeName else rename.name
+          val actual = Select(noPosExpr, sym.name)
+          val renameTree = Select(noPosExpr, name).withSpan(rename.span).withType(actual.tpe)
+          selectTree :: renameTree :: Nil
+      }
+    }
+
+    imp.selectors.flatMap { sel =>
+      if sel.isWildcard then Nil
+      else
+        val renamedOpt = sel.renamed match
+          case renamed: untpd.Ident => Some(renamed)
+          case untpd.EmptyTree => None
+        importedSymbols(imp.expr, sel.name).flatMap { sym =>
+          imported(sym, sel.imported, renamedOpt)
+        }
+    }
+  }
+
+  /** Creates the tuple type tree representation of the type trees in `elems` */
+  def tupleTypeTree(elems: List[Tree])(using Context): Tree = {
+    val arity = elems.length
+    if arity <= Definitions.MaxTupleArity then
+      val tupleTp = defn.TupleType(arity)
+      if tupleTp != null then
+        AppliedTypeTree(TypeTree(tupleTp), elems)
+      else nestedPairsTypeTree(elems)
+    else nestedPairsTypeTree(elems)
+  }
+
+  /** Creates the nested pairs type tree representation of the type trees in `ts` */
+  def nestedPairsTypeTree(ts: List[Tree])(using Context): Tree =
+    ts.foldRight[Tree](TypeTree(defn.EmptyTupleModule.termRef))((x, acc) => AppliedTypeTree(TypeTree(defn.PairClass.typeRef), x :: acc :: Nil))
+
+  /** Replaces all positions in `tree` with zero-extent positions */
+  private def focusPositions(tree: Tree)(using Context): Tree = {
+    val transformer = new tpd.TreeMap {
+      override def transform(tree: Tree)(using Context): Tree =
+        super.transform(tree).withSpan(tree.span.focus)
+    }
+    transformer.transform(tree)
+  }
+
+  /** Convert a list of trees to a vararg-compatible tree.
+   *  Used to make arguments for methods that accept varargs.
+   */
+  def repeated(trees: List[Tree], tpt: Tree)(using Context): Tree =
+    ctx.typeAssigner.arrayToRepeated(JavaSeqLiteral(trees, tpt))
+
+  /** Create a tree representing a list containing all
+   *  the elements of the argument list. A "list of tree to
+   *  tree of list" conversion.
+   *
+   *  @param trees  the elements the list represented by
+   *                the resulting tree should contain.
+   *  @param tpt    the type of the elements of the resulting list.
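+   *
+   *  For instance, a call like `mkList(args, TypeTree(defn.IntType))`, with `args`
+   *  a list of `Int` literal trees, produces a tree for `List.apply[Int](...)` whose
+   *  arguments are wrapped in a repeated-argument literal.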
+ * + */ + def mkList(trees: List[Tree], tpt: Tree)(using Context): Tree = + ref(defn.ListModule).select(nme.apply) + .appliedToTypeTree(tpt) + .appliedToVarargs(trees, tpt) + + + protected def FunProto(args: List[Tree], resType: Type)(using Context) = + ProtoTypes.FunProtoTyped(args, resType)(ctx.typer, ApplyKind.Regular) +} diff --git a/tests/pos-with-compiler-cc/dotc/ast/untpd.scala b/tests/pos-with-compiler-cc/dotc/ast/untpd.scala new file mode 100644 index 000000000000..b4dc6d0622c0 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/ast/untpd.scala @@ -0,0 +1,829 @@ +package dotty.tools +package dotc +package ast + +import core._ +import Types._, Contexts._, Constants._, Names._, Flags._ +import dotty.tools.dotc.typer.ProtoTypes +import Symbols._, StdNames._, Trees._ +import util.{Property, SourceFile, NoSource} +import util.Spans.Span +import annotation.constructorOnly +import annotation.internal.sharable +import Decorators._ +import annotation.retains +import language.experimental.pureFunctions + +object untpd extends Trees.Instance[Untyped] with UntypedTreeInfo { + + // ----- Tree cases that exist in untyped form only ------------------ + + abstract class OpTree(implicit @constructorOnly src: SourceFile) extends Tree { + def op: Ident + override def isTerm: Boolean = op.isTerm + override def isType: Boolean = op.isType + } + + /** A typed subtree of an untyped tree needs to be wrapped in a TypedSplice + * @param owner The current owner at the time the tree was defined + * @param isExtensionReceiver The splice was created from the receiver `e` in an extension + * method call `e.f(...)` + */ + abstract case class TypedSplice(splice: tpd.Tree)(val owner: Symbol, val isExtensionReceiver: Boolean)(implicit @constructorOnly src: SourceFile) extends ProxyTree { + def forwardTo: tpd.Tree = splice + override def toString = + def ext = if isExtensionReceiver then ", isExtensionReceiver = true" else "" + s"TypedSplice($splice$ext)" + } + + object TypedSplice { + def apply(tree: tpd.Tree, isExtensionReceiver: Boolean = false)(using Context): TypedSplice = + val owner = ctx.owner + given SourceFile = ctx.source + new TypedSplice(tree)(owner, isExtensionReceiver) {} + } + + /** mods object name impl */ + case class ModuleDef(name: TermName, impl: Template)(implicit @constructorOnly src: SourceFile) + extends MemberDef { + type ThisTree[+T <: Untyped] <: Trees.NameTree[T] with Trees.MemberDef[T] with ModuleDef + def withName(name: Name)(using Context): ModuleDef = cpy.ModuleDef(this)(name.toTermName, impl) + } + + /** An untyped template with a derives clause. Derived parents are added to the end + * of the `parents` list. `derivedCount` keeps track of how many there are. 
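+   *  (For example, for `class C extends A derives Show, Eq` the combined list is
+   *  `A, Show, Eq` with `derivedCount == 2`, so `parents` is `A` and `derived` is
+   *  `Show, Eq`.)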
+ * This representation was chosen because it balances two concerns: + * - maximize overlap between DerivingTemplate and Template for code streamlining + * - keep invariant that elements of untyped trees align with source positions + */ + class DerivingTemplate(constr: DefDef, parentsOrDerived: List[Tree], self: ValDef, preBody: LazyTreeList, derivedCount: Int)(implicit @constructorOnly src: SourceFile) + extends Template(constr, parentsOrDerived, self, preBody) { + override val parents = parentsOrDerived.dropRight(derivedCount) + override val derived = parentsOrDerived.takeRight(derivedCount) + } + + case class ParsedTry(expr: Tree, handler: Tree, finalizer: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree + + case class SymbolLit(str: String)(implicit @constructorOnly src: SourceFile) extends TermTree + + /** An interpolated string + * @param segments a list of two element tickets consisting of string literal and argument tree, + * possibly with a simple string literal as last element of the list + */ + case class InterpolatedString(id: TermName, segments: List[Tree])(implicit @constructorOnly src: SourceFile) + extends TermTree + + /** A function type or closure */ + case class Function(args: List[Tree], body: Tree)(implicit @constructorOnly src: SourceFile) extends Tree { + override def isTerm: Boolean = body.isTerm + override def isType: Boolean = body.isType + } + + /** A function type or closure with `implicit`, `erased`, or `given` modifiers */ + class FunctionWithMods(args: List[Tree], body: Tree, val mods: Modifiers)(implicit @constructorOnly src: SourceFile) + extends Function(args, body) + + /** A polymorphic function type */ + case class PolyFunction(targs: List[Tree], body: Tree)(implicit @constructorOnly src: SourceFile) extends Tree { + override def isTerm = body.isTerm + override def isType = body.isType + } + + /** A function created from a wildcard expression + * @param placeholderParams a list of definitions of synthetic parameters. + * @param body the function body where wildcards are replaced by + * references to synthetic parameters. + * This is equivalent to Function, except that forms a special case for the overlapping + * positions tests. 
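+   *  For example, the expression `_ + 1` is represented roughly as a
+   *  WildcardFunction with a single synthetic placeholder parameter and a body in
+   *  which the wildcard is replaced by a reference to that parameter.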
+ */ + class WildcardFunction(placeholderParams: List[ValDef], body: Tree)(implicit @constructorOnly src: SourceFile) + extends Function(placeholderParams, body) + + case class InfixOp(left: Tree, op: Ident, right: Tree)(implicit @constructorOnly src: SourceFile) extends OpTree + case class PostfixOp(od: Tree, op: Ident)(implicit @constructorOnly src: SourceFile) extends OpTree + case class PrefixOp(op: Ident, od: Tree)(implicit @constructorOnly src: SourceFile) extends OpTree + case class Parens(t: Tree)(implicit @constructorOnly src: SourceFile) extends ProxyTree { + def forwardTo: Tree = t + } + case class Tuple(trees: List[Tree])(implicit @constructorOnly src: SourceFile) extends Tree { + override def isTerm: Boolean = trees.isEmpty || trees.head.isTerm + override def isType: Boolean = !isTerm + } + case class Throw(expr: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree + case class Quote(quoted: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree + case class Splice(expr: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree { + def isInBraces: Boolean = span.end != expr.span.end + } + case class ForYield(enums: List[Tree], expr: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree + case class ForDo(enums: List[Tree], body: Tree)(implicit @constructorOnly src: SourceFile) extends TermTree + case class GenFrom(pat: Tree, expr: Tree, checkMode: GenCheckMode)(implicit @constructorOnly src: SourceFile) extends Tree + case class GenAlias(pat: Tree, expr: Tree)(implicit @constructorOnly src: SourceFile) extends Tree + case class ContextBounds(bounds: TypeBoundsTree, cxBounds: List[Tree])(implicit @constructorOnly src: SourceFile) extends TypTree + case class PatDef(mods: Modifiers, pats: List[Tree], tpt: Tree, rhs: Tree)(implicit @constructorOnly src: SourceFile) extends DefTree + case class ExtMethods(paramss: List[ParamClause], methods: List[Tree])(implicit @constructorOnly src: SourceFile) extends Tree + case class Into(tpt: Tree)(implicit @constructorOnly src: SourceFile) extends Tree + case class MacroTree(expr: Tree)(implicit @constructorOnly src: SourceFile) extends Tree + + case class ImportSelector(imported: Ident, renamed: Tree = EmptyTree, bound: Tree = EmptyTree)(implicit @constructorOnly src: SourceFile) extends Tree { + // TODO: Make bound a typed tree? 
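+
+    // Informally: `import p.A` gives imported = Ident(A) and no rename;
+    // `import p.{A as B}` gives imported = Ident(A), renamed = Ident(B);
+    // `import p.*` is a wildcard selector and `import p.given` a given selector,
+    // recognized below by an empty imported name.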
+ + /** It's a `given` selector */ + val isGiven: Boolean = imported.name.isEmpty + + /** It's a `given` or `_` selector */ + val isWildcard: Boolean = isGiven || imported.name == nme.WILDCARD + + /** The imported name, EmptyTermName if it's a given selector */ + val name: TermName = imported.name.asInstanceOf[TermName] + + /** The renamed part (which might be `_`), if present, or `name`, if missing */ + val rename: TermName = renamed match + case Ident(rename: TermName) => rename + case _ => name + } + + case class Number(digits: String, kind: NumberKind)(implicit @constructorOnly src: SourceFile) extends TermTree + + enum NumberKind { + case Whole(radix: Int) + case Decimal + case Floating + } + + /** {x1, ..., xN} T (only relevant under captureChecking) */ + case class CapturingTypeTree(refs: List[Tree], parent: Tree)(implicit @constructorOnly src: SourceFile) extends TypTree + + /** Short-lived usage in typer, does not need copy/transform/fold infrastructure */ + case class DependentTypeTree(tp: List[Symbol] -> Context ?-> Type)(implicit @constructorOnly src: SourceFile) extends Tree + + @sharable object EmptyTypeIdent extends Ident(tpnme.EMPTY)(NoSource) with WithoutTypeOrPos[Untyped] { + override def isEmpty: Boolean = true + } + + def WildcardTypeBoundsTree()(using src: SourceFile): TypeBoundsTree = TypeBoundsTree(EmptyTree, EmptyTree, EmptyTree) + object WildcardTypeBoundsTree: + def unapply(tree: untpd.Tree): Boolean = tree match + case TypeBoundsTree(EmptyTree, EmptyTree, _) => true + case _ => false + + + /** A block generated by the XML parser, only treated specially by + * `Positioned#checkPos` */ + class XMLBlock(stats: List[Tree], expr: Tree)(implicit @constructorOnly src: SourceFile) extends Block(stats, expr) + + /** An enum to control checking or filtering of patterns in GenFrom trees */ + enum GenCheckMode { + case Ignore // neither filter nor check since filtering was done before + case Check // check that pattern is irrefutable + case CheckAndFilter // both check and filter (transitional period starting with 3.2) + case FilterNow // filter out non-matching elements if we are not in 3.2 or later + case FilterAlways // filter out non-matching elements since pattern is prefixed by `case` + } + + // ----- Modifiers ----------------------------------------------------- + /** Mod is intended to record syntactic information about modifiers, it's + * NOT a replacement of FlagSet. + * + * For any query about semantic information, check `flags` instead. 
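+   *
+   *  For example, a source-level `private` modifier is recorded as `Mod.Private()`,
+   *  while the corresponding semantic flag lives in the enclosing `Modifiers.flags`.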
+   */
+  sealed abstract class Mod(val flags: FlagSet)(implicit @constructorOnly src: SourceFile)
+    extends Positioned
+
+  object Mod {
+    case class Private()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Private)
+
+    case class Protected()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Protected)
+
+    case class Var()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Mutable)
+
+    case class Implicit()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Implicit)
+
+    case class Given()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Given)
+
+    case class Erased()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Erased)
+
+    case class Final()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Final)
+
+    case class Sealed()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Sealed)
+
+    case class Opaque()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Opaque)
+
+    case class Open()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Open)
+
+    case class Override()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Override)
+
+    case class Abstract()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Abstract)
+
+    case class Lazy()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Lazy)
+
+    case class Inline()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Inline)
+
+    case class Transparent()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Transparent)
+
+    case class Infix()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Infix)
+
+    /** Used under pureFunctions to mark impure function types `A => B` in `FunctionWithMods` */
+    case class Impure()(implicit @constructorOnly src: SourceFile) extends Mod(Flags.Impure)
+  }
+
+  /** Modifiers and annotations for definitions
+   *
+   *  @param flags          The set flags
+   *  @param privateWithin  If a private or protected modifier is followed by a
+   *                        qualifier [q], the name q, "" as a typename otherwise.
+ * @param annotations The annotations preceding the modifiers + */ + case class Modifiers ( + flags: FlagSet = EmptyFlags, + privateWithin: TypeName = tpnme.EMPTY, + annotations: List[Tree] = Nil, + mods: List[Mod] = Nil) { + + def is(flag: Flag): Boolean = flags.is(flag) + def is(flag: Flag, butNot: FlagSet): Boolean = flags.is(flag, butNot = butNot) + def isOneOf(fs: FlagSet): Boolean = flags.isOneOf(fs) + def isOneOf(fs: FlagSet, butNot: FlagSet): Boolean = flags.isOneOf(fs, butNot = butNot) + def isAllOf(fc: FlagSet): Boolean = flags.isAllOf(fc) + + def | (fs: FlagSet): Modifiers = withFlags(flags | fs) + def & (fs: FlagSet): Modifiers = withFlags(flags & fs) + def &~(fs: FlagSet): Modifiers = withFlags(flags &~ fs) + + def toTypeFlags: Modifiers = withFlags(flags.toTypeFlags) + def toTermFlags: Modifiers = withFlags(flags.toTermFlags) + + def withFlags(flags: FlagSet): Modifiers = + if (this.flags == flags) this + else copy(flags = flags) + + def withoutFlags(flags: FlagSet): Modifiers = + if (this.isOneOf(flags)) + Modifiers(this.flags &~ flags, this.privateWithin, this.annotations, this.mods.filterNot(_.flags.isOneOf(flags))) + else this + + def withAddedMod(mod: Mod): Modifiers = + if (mods.exists(_ eq mod)) this + else withMods(mods :+ mod) + + private def compatible(flags1: FlagSet, flags2: FlagSet): Boolean = + flags1.isEmpty || flags2.isEmpty + || flags1.isTermFlags && flags2.isTermFlags + || flags1.isTypeFlags && flags2.isTypeFlags + + /** Add `flags` to thos modifier set, checking that there are no type/term conflicts. + * If there are conflicts, issue an error and return the modifiers consisting of + * the added flags only. The reason to do it this way is that the added flags usually + * describe the core of a construct whereas the existing set are the modifiers + * given in the source. + */ + def withAddedFlags(flags: FlagSet, span: Span)(using Context): Modifiers = + if this.flags.isAllOf(flags) then this + else if compatible(this.flags, flags) then this | flags + else + val what = if flags.isTermFlags then "values" else "types" + report.error(em"${(flags & ModifierFlags).flagsString} $what cannot be ${this.flags.flagsString}", ctx.source.atSpan(span)) + Modifiers(flags) + + /** Modifiers with given list of Mods. It is checked that + * all modifiers are already accounted for in `flags` and `privateWithin`. 
+ */ + def withMods(ms: List[Mod]): Modifiers = + if (mods eq ms) this + else { + if (ms.nonEmpty) + for (m <- ms) + assert(flags.isAllOf(m.flags) + || m.isInstanceOf[Mod.Private] && !privateWithin.isEmpty + || (m.isInstanceOf[Mod.Abstract] || m.isInstanceOf[Mod.Override]) && flags.is(AbsOverride), + s"unaccounted modifier: $m in $this with flags ${flags.flagsString} when adding $ms") + copy(mods = ms) + } + + def withAddedAnnotation(annot: Tree): Modifiers = + if (annotations.exists(_ eq annot)) this + else withAnnotations(annotations :+ annot) + + def withAnnotations(annots: List[Tree]): Modifiers = + if (annots eq annotations) this + else copy(annotations = annots) + + def withPrivateWithin(pw: TypeName): Modifiers = + if (pw.isEmpty) this + else copy(privateWithin = pw) + + def hasFlags: Boolean = flags != EmptyFlags + def hasAnnotations: Boolean = annotations.nonEmpty + def hasPrivateWithin: Boolean = privateWithin != tpnme.EMPTY + def hasMod(cls: Class[?]) = mods.exists(_.getClass == cls) + + private def isEnum = is(Enum, butNot = JavaDefined) + + def isEnumCase: Boolean = isEnum && is(Case) + def isEnumClass: Boolean = isEnum && !is(Case) + } + + @sharable val EmptyModifiers: Modifiers = Modifiers() + + // ----- TypeTrees that refer to other tree's symbols ------------------- + + /** A type tree that gets its type from some other tree's symbol. Enters the + * type tree in the References attachment of the `from` tree as a side effect. + */ + abstract class DerivedTypeTree(implicit @constructorOnly src: SourceFile) extends TypeTree { + + private var myWatched: Tree = EmptyTree + + /** The watched tree; used only for printing */ + def watched: Tree = myWatched + + /** Install the derived type tree as a dependency on `original` */ + def watching(original: DefTree): this.type = { + myWatched = original + val existing = original.attachmentOrElse(References, Nil) + original.putAttachment(References, this :: existing) + this + } + + /** Install the derived type tree as a dependency on `sym` */ + def watching(sym: Symbol): this.type = withAttachment(OriginalSymbol, sym) + + /** A hook to ensure that all necessary symbols are completed so that + * OriginalSymbol attachments are propagated to this tree + */ + def ensureCompletions(using Context): Unit = () + + /** The method that computes the tree with the derived type */ + def derivedTree(originalSym: Symbol)(using Context): tpd.Tree + } + + /** Property key containing TypeTrees whose type is computed + * from the symbol in this type. These type trees have marker trees + * TypeRefOfSym or InfoOfSym as their originals. + */ + val References: Property.Key[List[DerivedTypeTree]] = Property.Key() + + /** Property key for TypeTrees marked with TypeRefOfSym or InfoOfSym + * which contains the symbol of the original tree from which this + * TypeTree is derived. 
+ */ + val OriginalSymbol: Property.Key[Symbol] = Property.Key() + + /** Property key for contextual Apply trees of the form `fn given arg` */ + val KindOfApply: Property.StickyKey[ApplyKind] = Property.StickyKey() + + // ------ Creation methods for untyped only ----------------- + + def Ident(name: Name)(implicit src: SourceFile): Ident = new Ident(name) + def SearchFailureIdent(name: Name, explanation: -> String)(implicit src: SourceFile): SearchFailureIdent = new SearchFailureIdent(name, explanation) + def Select(qualifier: Tree, name: Name)(implicit src: SourceFile): Select = new Select(qualifier, name) + def SelectWithSig(qualifier: Tree, name: Name, sig: Signature)(implicit src: SourceFile): Select = new SelectWithSig(qualifier, name, sig) + def This(qual: Ident)(implicit src: SourceFile): This = new This(qual) + def Super(qual: Tree, mix: Ident)(implicit src: SourceFile): Super = new Super(qual, mix) + def Apply(fun: Tree, args: List[Tree])(implicit src: SourceFile): Apply = new Apply(fun, args) + def TypeApply(fun: Tree, args: List[Tree])(implicit src: SourceFile): TypeApply = new TypeApply(fun, args) + def Literal(const: Constant)(implicit src: SourceFile): Literal = new Literal(const) + def New(tpt: Tree)(implicit src: SourceFile): New = new New(tpt) + def Typed(expr: Tree, tpt: Tree)(implicit src: SourceFile): Typed = new Typed(expr, tpt) + def NamedArg(name: Name, arg: Tree)(implicit src: SourceFile): NamedArg = new NamedArg(name, arg) + def Assign(lhs: Tree, rhs: Tree)(implicit src: SourceFile): Assign = new Assign(lhs, rhs) + def Block(stats: List[Tree], expr: Tree)(implicit src: SourceFile): Block = new Block(stats, expr) + def If(cond: Tree, thenp: Tree, elsep: Tree)(implicit src: SourceFile): If = new If(cond, thenp, elsep) + def InlineIf(cond: Tree, thenp: Tree, elsep: Tree)(implicit src: SourceFile): If = new InlineIf(cond, thenp, elsep) + def Closure(env: List[Tree], meth: Tree, tpt: Tree)(implicit src: SourceFile): Closure = new Closure(env, meth, tpt) + def Match(selector: Tree, cases: List[CaseDef])(implicit src: SourceFile): Match = new Match(selector, cases) + def InlineMatch(selector: Tree, cases: List[CaseDef])(implicit src: SourceFile): Match = new InlineMatch(selector, cases) + def CaseDef(pat: Tree, guard: Tree, body: Tree)(implicit src: SourceFile): CaseDef = new CaseDef(pat, guard, body) + def Labeled(bind: Bind, expr: Tree)(implicit src: SourceFile): Labeled = new Labeled(bind, expr) + def Return(expr: Tree, from: Tree)(implicit src: SourceFile): Return = new Return(expr, from) + def WhileDo(cond: Tree, body: Tree)(implicit src: SourceFile): WhileDo = new WhileDo(cond, body) + def Try(expr: Tree, cases: List[CaseDef], finalizer: Tree)(implicit src: SourceFile): Try = new Try(expr, cases, finalizer) + def SeqLiteral(elems: List[Tree], elemtpt: Tree)(implicit src: SourceFile): SeqLiteral = new SeqLiteral(elems, elemtpt) + def JavaSeqLiteral(elems: List[Tree], elemtpt: Tree)(implicit src: SourceFile): JavaSeqLiteral = new JavaSeqLiteral(elems, elemtpt) + def Inlined(call: tpd.Tree, bindings: List[MemberDef], expansion: Tree)(implicit src: SourceFile): Inlined = new Inlined(call, bindings, expansion) + def TypeTree()(implicit src: SourceFile): TypeTree = new TypeTree() + def InferredTypeTree()(implicit src: SourceFile): TypeTree = new InferredTypeTree() + def SingletonTypeTree(ref: Tree)(implicit src: SourceFile): SingletonTypeTree = new SingletonTypeTree(ref) + def RefinedTypeTree(tpt: Tree, refinements: List[Tree])(implicit src: SourceFile): RefinedTypeTree = 
new RefinedTypeTree(tpt, refinements) + def AppliedTypeTree(tpt: Tree, args: List[Tree])(implicit src: SourceFile): AppliedTypeTree = new AppliedTypeTree(tpt, args) + def LambdaTypeTree(tparams: List[TypeDef], body: Tree)(implicit src: SourceFile): LambdaTypeTree = new LambdaTypeTree(tparams, body) + def TermLambdaTypeTree(params: List[ValDef], body: Tree)(implicit src: SourceFile): TermLambdaTypeTree = new TermLambdaTypeTree(params, body) + def MatchTypeTree(bound: Tree, selector: Tree, cases: List[CaseDef])(implicit src: SourceFile): MatchTypeTree = new MatchTypeTree(bound, selector, cases) + def ByNameTypeTree(result: Tree)(implicit src: SourceFile): ByNameTypeTree = new ByNameTypeTree(result) + def TypeBoundsTree(lo: Tree, hi: Tree, alias: Tree = EmptyTree)(implicit src: SourceFile): TypeBoundsTree = new TypeBoundsTree(lo, hi, alias) + def Bind(name: Name, body: Tree)(implicit src: SourceFile): Bind = new Bind(name, body) + def Alternative(trees: List[Tree])(implicit src: SourceFile): Alternative = new Alternative(trees) + def UnApply(fun: Tree, implicits: List[Tree], patterns: List[Tree])(implicit src: SourceFile): UnApply = new UnApply(fun, implicits, patterns) + def ValDef(name: TermName, tpt: Tree, rhs: LazyTree)(implicit src: SourceFile): ValDef = new ValDef(name, tpt, rhs) + def DefDef(name: TermName, paramss: List[ParamClause], tpt: Tree, rhs: LazyTree)(implicit src: SourceFile): DefDef = new DefDef(name, paramss, tpt, rhs) + def TypeDef(name: TypeName, rhs: Tree)(implicit src: SourceFile): TypeDef = new TypeDef(name, rhs) + def Template(constr: DefDef, parents: List[Tree], derived: List[Tree], self: ValDef, body: LazyTreeList)(implicit src: SourceFile): Template = + if (derived.isEmpty) new Template(constr, parents, self, body) + else new DerivingTemplate(constr, parents ++ derived, self, body, derived.length) + def Import(expr: Tree, selectors: List[ImportSelector])(implicit src: SourceFile): Import = new Import(expr, selectors) + def Export(expr: Tree, selectors: List[ImportSelector])(implicit src: SourceFile): Export = new Export(expr, selectors) + def PackageDef(pid: RefTree, stats: List[Tree])(implicit src: SourceFile): PackageDef = new PackageDef(pid, stats) + def Annotated(arg: Tree, annot: Tree)(implicit src: SourceFile): Annotated = new Annotated(arg, annot) + def Hole(isTermHole: Boolean, idx: Int, args: List[Tree], content: Tree, tpt: Tree)(implicit src: SourceFile): Hole = new Hole(isTermHole, idx, args, content, tpt) + + // ------ Additional creation methods for untyped only ----------------- + + /** new T(args1)...(args_n) + * ==> + * new T.[Ts](args1)...(args_n) + * + * where `Ts` are the class type arguments of `T` or its class type alias. + * Note: we also keep any type arguments as parts of `T`. This is necessary to allow + * navigation into these arguments from the IDE, and to do the right thing in + * PrepareInlineable. + */ + def New(tpt: Tree, argss: List[List[Tree]])(using Context): Tree = + ensureApplied(argss.foldLeft(makeNew(tpt))(Apply(_, _))) + + /** A new expression with constrictor and possibly type arguments. See + * `New(tpt, argss)` for details. 
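+   *
+   *  For instance, applied to a tree for `C[T]` this yields, roughly,
+   *  `TypeApply(Select(New(C), nme.CONSTRUCTOR), List(T))`, i.e. a constructor
+   *  selection that is not yet applied to any value arguments.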
+ */ + def makeNew(tpt: Tree)(using Context): Tree = { + val (tycon, targs) = tpt match { + case AppliedTypeTree(tycon, targs) => + (tycon, targs) + case TypedSplice(tpt1: tpd.Tree) => + val argTypes = tpt1.tpe.dealias.argTypesLo + def wrap(tpe: Type) = TypeTree(tpe).withSpan(tpt.span) + (tpt, argTypes.map(wrap)) + case _ => + (tpt, Nil) + } + val nu: Tree = Select(New(tycon), nme.CONSTRUCTOR) + if (targs.nonEmpty) TypeApply(nu, targs) else nu + } + + def Block(stat: Tree, expr: Tree)(implicit src: SourceFile): Block = + Block(stat :: Nil, expr) + + def Apply(fn: Tree, arg: Tree)(implicit src: SourceFile): Apply = + Apply(fn, arg :: Nil) + + def ensureApplied(tpt: Tree)(implicit src: SourceFile): Tree = tpt match { + case _: Apply => tpt + case _ => Apply(tpt, Nil) + } + + def AppliedTypeTree(tpt: Tree, arg: Tree)(implicit src: SourceFile): AppliedTypeTree = + AppliedTypeTree(tpt, arg :: Nil) + + def TypeTree(tpe: Type)(using Context): TypedSplice = + TypedSplice(TypeTree().withTypeUnchecked(tpe)) + + def InferredTypeTree(tpe: Type)(using Context): TypedSplice = + TypedSplice(new InferredTypeTree().withTypeUnchecked(tpe)) + + def unitLiteral(implicit src: SourceFile): Literal = Literal(Constant(())) + + def ref(tp: NamedType)(using Context): Tree = + TypedSplice(tpd.ref(tp)) + + def ref(sym: Symbol)(using Context): Tree = + TypedSplice(tpd.ref(sym)) + + def rawRef(tp: NamedType)(using Context): Tree = + if tp.typeParams.isEmpty then ref(tp) + else AppliedTypeTree(ref(tp), tp.typeParams.map(_ => WildcardTypeBoundsTree())) + + def rootDot(name: Name)(implicit src: SourceFile): Select = Select(Ident(nme.ROOTPKG), name) + def scalaDot(name: Name)(implicit src: SourceFile): Select = Select(rootDot(nme.scala), name) + def scalaAnnotationDot(name: Name)(using SourceFile): Select = Select(scalaDot(nme.annotation), name) + def scalaRuntimeDot(name: Name)(using SourceFile): Select = Select(scalaDot(nme.runtime), name) + def scalaUnit(implicit src: SourceFile): Select = scalaDot(tpnme.Unit) + def scalaAny(implicit src: SourceFile): Select = scalaDot(tpnme.Any) + def javaDotLangDot(name: Name)(implicit src: SourceFile): Select = Select(Select(Ident(nme.java), nme.lang), name) + + def captureRoot(using Context): Select = + Select(scalaDot(nme.caps), nme.CAPTURE_ROOT) + + def makeConstructor(tparams: List[TypeDef], vparamss: List[List[ValDef]], rhs: Tree = EmptyTree)(using Context): DefDef = + DefDef(nme.CONSTRUCTOR, joinParams(tparams, vparamss), TypeTree(), rhs) + + def emptyConstructor(using Context): DefDef = + makeConstructor(Nil, Nil) + + def makeSelfDef(name: TermName, tpt: Tree)(using Context): ValDef = + ValDef(name, tpt, EmptyTree).withFlags(PrivateLocal) + + def makeTupleOrParens(ts: List[Tree])(using Context): Tree = ts match { + case t :: Nil => Parens(t) + case _ => Tuple(ts) + } + + def makeTuple(ts: List[Tree])(using Context): Tree = ts match { + case t :: Nil => t + case _ => Tuple(ts) + } + + def makeAndType(left: Tree, right: Tree)(using Context): AppliedTypeTree = + AppliedTypeTree(ref(defn.andType.typeRef), left :: right :: Nil) + + def makeParameter(pname: TermName, tpe: Tree, mods: Modifiers, isBackquoted: Boolean = false)(using Context): ValDef = { + val vdef = ValDef(pname, tpe, EmptyTree) + if (isBackquoted) vdef.pushAttachment(Backquoted, ()) + vdef.withMods(mods | Param) + } + + def makeSyntheticParameter(n: Int = 1, tpt: Tree | Null = null, flags: FlagSet = SyntheticTermParam)(using Context): ValDef = + ValDef(nme.syntheticParamName(n), if (tpt == null) TypeTree() else tpt, 
EmptyTree) + .withFlags(flags) + + def lambdaAbstract(params: List[ValDef] | List[TypeDef], tpt: Tree)(using Context): Tree = + params match + case Nil => tpt + case (vd: ValDef) :: _ => TermLambdaTypeTree(params.asInstanceOf[List[ValDef]], tpt) + case _ => LambdaTypeTree(params.asInstanceOf[List[TypeDef]], tpt) + + def lambdaAbstractAll(paramss: List[List[ValDef] | List[TypeDef]], tpt: Tree)(using Context): Tree = + paramss.foldRight(tpt)(lambdaAbstract) + + /** A reference to given definition. If definition is a repeated + * parameter, the reference will be a repeated argument. + */ + def refOfDef(tree: MemberDef)(using Context): Tree = tree match { + case ValDef(_, PostfixOp(_, Ident(tpnme.raw.STAR)), _) => repeated(Ident(tree.name)) + case _ => Ident(tree.name) + } + + /** A repeated argument such as `arg: _*` */ + def repeated(arg: Tree)(using Context): Typed = Typed(arg, Ident(tpnme.WILDCARD_STAR)) + + +// --------- Copier/Transformer/Accumulator classes for untyped trees ----- + + def localCtx(tree: Tree)(using Context): Context = ctx + + override val cpy: UntypedTreeCopier = UntypedTreeCopier() + + class UntypedTreeCopier extends TreeCopier { + + def postProcess(tree: Tree, copied: Tree): copied.ThisTree[Untyped] = + copied.asInstanceOf[copied.ThisTree[Untyped]] + + def postProcess(tree: Tree, copied: MemberDef): copied.ThisTree[Untyped] = { + tree match { + case tree: MemberDef => copied.withMods(tree.rawMods) + case _ => copied + } + }.asInstanceOf[copied.ThisTree[Untyped]] + + def ModuleDef(tree: Tree)(name: TermName, impl: Template)(using Context): ModuleDef = tree match { + case tree: ModuleDef if (name eq tree.name) && (impl eq tree.impl) => tree + case _ => finalize(tree, untpd.ModuleDef(name, impl)(tree.source)) + } + def ParsedTry(tree: Tree)(expr: Tree, handler: Tree, finalizer: Tree)(using Context): TermTree = tree match { + case tree: ParsedTry if (expr eq tree.expr) && (handler eq tree.handler) && (finalizer eq tree.finalizer) => tree + case _ => finalize(tree, untpd.ParsedTry(expr, handler, finalizer)(tree.source)) + } + def SymbolLit(tree: Tree)(str: String)(using Context): TermTree = tree match { + case tree: SymbolLit if str == tree.str => tree + case _ => finalize(tree, untpd.SymbolLit(str)(tree.source)) + } + def InterpolatedString(tree: Tree)(id: TermName, segments: List[Tree])(using Context): TermTree = tree match { + case tree: InterpolatedString if (id eq tree.id) && (segments eq tree.segments) => tree + case _ => finalize(tree, untpd.InterpolatedString(id, segments)(tree.source)) + } + def Function(tree: Tree)(args: List[Tree], body: Tree)(using Context): Tree = tree match { + case tree: Function if (args eq tree.args) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.Function(args, body)(tree.source)) + } + def PolyFunction(tree: Tree)(targs: List[Tree], body: Tree)(using Context): Tree = tree match { + case tree: PolyFunction if (targs eq tree.targs) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.PolyFunction(targs, body)(tree.source)) + } + def InfixOp(tree: Tree)(left: Tree, op: Ident, right: Tree)(using Context): Tree = tree match { + case tree: InfixOp if (left eq tree.left) && (op eq tree.op) && (right eq tree.right) => tree + case _ => finalize(tree, untpd.InfixOp(left, op, right)(tree.source)) + } + def PostfixOp(tree: Tree)(od: Tree, op: Ident)(using Context): Tree = tree match { + case tree: PostfixOp if (od eq tree.od) && (op eq tree.op) => tree + case _ => finalize(tree, untpd.PostfixOp(od, op)(tree.source)) + } + 
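Each copier method above follows the same shape: compare the prospective children against the current ones with reference equality (`eq`) and hand back the original node untouched when nothing changed, so repeated transformation passes avoid needless allocation. A minimal standalone sketch of that idiom, using a hypothetical two-node expression language (`Num`, `Add`, `ExprCopier`) rather than the compiler's `Tree` types:

```scala
// Hypothetical two-node expression language standing in for compiler trees.
sealed trait Expr
final case class Num(value: Int) extends Expr
final case class Add(left: Expr, right: Expr) extends Expr

object ExprCopier:
  /** Reuse `tree` when the prospective children are the same references;
   *  allocate a fresh node only if something actually changed.
   */
  def Add(tree: Expr)(left: Expr, right: Expr): Expr = tree match
    case t: Add if (left eq t.left) && (right eq t.right) => t
    case _ => new Add(left, right)

@main def copierDemo(): Unit =
  val e = Add(Num(1), Num(2))
  println(ExprCopier.Add(e)(e.left, e.right) eq e)  // true: node reused
  println(ExprCopier.Add(e)(Num(3), e.right) eq e)  // false: left child changed
```

The real `UntypedTreeCopier` additionally runs `postProcess` to carry modifiers over to the copy, which the toy version leaves out.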
def PrefixOp(tree: Tree)(op: Ident, od: Tree)(using Context): Tree = tree match { + case tree: PrefixOp if (op eq tree.op) && (od eq tree.od) => tree + case _ => finalize(tree, untpd.PrefixOp(op, od)(tree.source)) + } + def Parens(tree: Tree)(t: Tree)(using Context): ProxyTree = tree match { + case tree: Parens if t eq tree.t => tree + case _ => finalize(tree, untpd.Parens(t)(tree.source)) + } + def Tuple(tree: Tree)(trees: List[Tree])(using Context): Tree = tree match { + case tree: Tuple if trees eq tree.trees => tree + case _ => finalize(tree, untpd.Tuple(trees)(tree.source)) + } + def Throw(tree: Tree)(expr: Tree)(using Context): TermTree = tree match { + case tree: Throw if expr eq tree.expr => tree + case _ => finalize(tree, untpd.Throw(expr)(tree.source)) + } + def Quote(tree: Tree)(quoted: Tree)(using Context): Tree = tree match { + case tree: Quote if quoted eq tree.quoted => tree + case _ => finalize(tree, untpd.Quote(quoted)(tree.source)) + } + def Splice(tree: Tree)(expr: Tree)(using Context): Tree = tree match { + case tree: Splice if expr eq tree.expr => tree + case _ => finalize(tree, untpd.Splice(expr)(tree.source)) + } + def ForYield(tree: Tree)(enums: List[Tree], expr: Tree)(using Context): TermTree = tree match { + case tree: ForYield if (enums eq tree.enums) && (expr eq tree.expr) => tree + case _ => finalize(tree, untpd.ForYield(enums, expr)(tree.source)) + } + def ForDo(tree: Tree)(enums: List[Tree], body: Tree)(using Context): TermTree = tree match { + case tree: ForDo if (enums eq tree.enums) && (body eq tree.body) => tree + case _ => finalize(tree, untpd.ForDo(enums, body)(tree.source)) + } + def GenFrom(tree: Tree)(pat: Tree, expr: Tree, checkMode: GenCheckMode)(using Context): Tree = tree match { + case tree: GenFrom if (pat eq tree.pat) && (expr eq tree.expr) && (checkMode == tree.checkMode) => tree + case _ => finalize(tree, untpd.GenFrom(pat, expr, checkMode)(tree.source)) + } + def GenAlias(tree: Tree)(pat: Tree, expr: Tree)(using Context): Tree = tree match { + case tree: GenAlias if (pat eq tree.pat) && (expr eq tree.expr) => tree + case _ => finalize(tree, untpd.GenAlias(pat, expr)(tree.source)) + } + def ContextBounds(tree: Tree)(bounds: TypeBoundsTree, cxBounds: List[Tree])(using Context): TypTree = tree match { + case tree: ContextBounds if (bounds eq tree.bounds) && (cxBounds eq tree.cxBounds) => tree + case _ => finalize(tree, untpd.ContextBounds(bounds, cxBounds)(tree.source)) + } + def PatDef(tree: Tree)(mods: Modifiers, pats: List[Tree], tpt: Tree, rhs: Tree)(using Context): Tree = tree match { + case tree: PatDef if (mods eq tree.mods) && (pats eq tree.pats) && (tpt eq tree.tpt) && (rhs eq tree.rhs) => tree + case _ => finalize(tree, untpd.PatDef(mods, pats, tpt, rhs)(tree.source)) + } + def ExtMethods(tree: Tree)(paramss: List[ParamClause], methods: List[Tree])(using Context): Tree = tree match + case tree: ExtMethods if (paramss eq tree.paramss) && (methods == tree.methods) => tree + case _ => finalize(tree, untpd.ExtMethods(paramss, methods)(tree.source)) + def Into(tree: Tree)(tpt: Tree)(using Context): Tree = tree match + case tree: Into if tpt eq tree.tpt => tree + case _ => finalize(tree, untpd.Into(tpt)(tree.source)) + def ImportSelector(tree: Tree)(imported: Ident, renamed: Tree, bound: Tree)(using Context): Tree = tree match { + case tree: ImportSelector if (imported eq tree.imported) && (renamed eq tree.renamed) && (bound eq tree.bound) => tree + case _ => finalize(tree, untpd.ImportSelector(imported, renamed, bound)(tree.source)) + } + 
def Number(tree: Tree)(digits: String, kind: NumberKind)(using Context): Tree = tree match { + case tree: Number if (digits == tree.digits) && (kind == tree.kind) => tree + case _ => finalize(tree, untpd.Number(digits, kind)) + } + def CapturingTypeTree(tree: Tree)(refs: List[Tree], parent: Tree)(using Context): Tree = tree match + case tree: CapturingTypeTree if (refs eq tree.refs) && (parent eq tree.parent) => tree + case _ => finalize(tree, untpd.CapturingTypeTree(refs, parent)) + + def TypedSplice(tree: Tree)(splice: tpd.Tree)(using Context): ProxyTree = tree match { + case tree: TypedSplice if splice `eq` tree.splice => tree + case _ => finalize(tree, untpd.TypedSplice(splice)(using ctx)) + } + def MacroTree(tree: Tree)(expr: Tree)(using Context): Tree = tree match { + case tree: MacroTree if expr `eq` tree.expr => tree + case _ => finalize(tree, untpd.MacroTree(expr)(tree.source)) + } + } + + abstract class UntypedTreeMap(cpy: UntypedTreeCopier = untpd.cpy) extends TreeMap(cpy) { + override def transformMoreCases(tree: Tree)(using Context): Tree = tree match { + case ModuleDef(name, impl) => + cpy.ModuleDef(tree)(name, transformSub(impl)) + case tree: DerivingTemplate => + cpy.Template(tree)(transformSub(tree.constr), transform(tree.parents), + transform(tree.derived), transformSub(tree.self), transformStats(tree.body, tree.symbol)) + case ParsedTry(expr, handler, finalizer) => + cpy.ParsedTry(tree)(transform(expr), transform(handler), transform(finalizer)) + case SymbolLit(str) => + cpy.SymbolLit(tree)(str) + case InterpolatedString(id, segments) => + cpy.InterpolatedString(tree)(id, segments.mapConserve(transform)) + case Function(args, body) => + cpy.Function(tree)(transform(args), transform(body)) + case PolyFunction(targs, body) => + cpy.PolyFunction(tree)(transform(targs), transform(body)) + case InfixOp(left, op, right) => + cpy.InfixOp(tree)(transform(left), op, transform(right)) + case PostfixOp(od, op) => + cpy.PostfixOp(tree)(transform(od), op) + case PrefixOp(op, od) => + cpy.PrefixOp(tree)(op, transform(od)) + case Parens(t) => + cpy.Parens(tree)(transform(t)) + case Tuple(trees) => + cpy.Tuple(tree)(transform(trees)) + case Throw(expr) => + cpy.Throw(tree)(transform(expr)) + case Quote(t) => + cpy.Quote(tree)(transform(t)) + case Splice(expr) => + cpy.Splice(tree)(transform(expr)) + case ForYield(enums, expr) => + cpy.ForYield(tree)(transform(enums), transform(expr)) + case ForDo(enums, body) => + cpy.ForDo(tree)(transform(enums), transform(body)) + case GenFrom(pat, expr, checkMode) => + cpy.GenFrom(tree)(transform(pat), transform(expr), checkMode) + case GenAlias(pat, expr) => + cpy.GenAlias(tree)(transform(pat), transform(expr)) + case ContextBounds(bounds, cxBounds) => + cpy.ContextBounds(tree)(transformSub(bounds), transform(cxBounds)) + case PatDef(mods, pats, tpt, rhs) => + cpy.PatDef(tree)(mods, transform(pats), transform(tpt), transform(rhs)) + case ExtMethods(paramss, methods) => + cpy.ExtMethods(tree)(transformParamss(paramss), transformSub(methods)) + case Into(tpt) => + cpy.Into(tree)(transform(tpt)) + case ImportSelector(imported, renamed, bound) => + cpy.ImportSelector(tree)(transformSub(imported), transform(renamed), transform(bound)) + case Number(_, _) | TypedSplice(_) => + tree + case MacroTree(expr) => + cpy.MacroTree(tree)(transform(expr)) + case CapturingTypeTree(refs, parent) => + cpy.CapturingTypeTree(tree)(transform(refs), transform(parent)) + case _ => + super.transformMoreCases(tree) + } + } + + abstract class UntypedTreeAccumulator[X] extends 
TreeAccumulator[X] { + self: UntypedTreeAccumulator[X] @retains(caps.*) => + override def foldMoreCases(x: X, tree: Tree)(using Context): X = tree match { + case ModuleDef(name, impl) => + this(x, impl) + case tree: DerivingTemplate => + this(this(this(this(this(x, tree.constr), tree.parents), tree.derived), tree.self), tree.body) + case ParsedTry(expr, handler, finalizer) => + this(this(this(x, expr), handler), finalizer) + case SymbolLit(str) => + x + case InterpolatedString(id, segments) => + this(x, segments) + case Function(args, body) => + this(this(x, args), body) + case PolyFunction(targs, body) => + this(this(x, targs), body) + case InfixOp(left, op, right) => + this(this(this(x, left), op), right) + case PostfixOp(od, op) => + this(this(x, od), op) + case PrefixOp(op, od) => + this(this(x, op), od) + case Parens(t) => + this(x, t) + case Tuple(trees) => + this(x, trees) + case Throw(expr) => + this(x, expr) + case Quote(t) => + this(x, t) + case Splice(expr) => + this(x, expr) + case ForYield(enums, expr) => + this(this(x, enums), expr) + case ForDo(enums, body) => + this(this(x, enums), body) + case GenFrom(pat, expr, _) => + this(this(x, pat), expr) + case GenAlias(pat, expr) => + this(this(x, pat), expr) + case ContextBounds(bounds, cxBounds) => + this(this(x, bounds), cxBounds) + case PatDef(mods, pats, tpt, rhs) => + this(this(this(x, pats), tpt), rhs) + case ExtMethods(paramss, methods) => + this(paramss.foldLeft(x)(apply), methods) + case Into(tpt) => + this(x, tpt) + case ImportSelector(imported, renamed, bound) => + this(this(this(x, imported), renamed), bound) + case Number(_, _) => + x + case TypedSplice(splice) => + this(x, splice) + case MacroTree(expr) => + this(x, expr) + case CapturingTypeTree(refs, parent) => + this(this(x, refs), parent) + case _ => + super.foldMoreCases(x, tree) + } + } + + abstract class UntypedTreeTraverser extends UntypedTreeAccumulator[Unit] { + def traverse(tree: Tree)(using Context): Unit + def apply(x: Unit, tree: Tree)(using Context): Unit = traverse(tree) + protected def traverseChildren(tree: Tree)(using Context): Unit = foldOver((), tree) + } + + /** Fold `f` over all tree nodes, in depth-first, prefix order */ + class UntypedDeepFolder[X](f: (X, Tree) => X) extends UntypedTreeAccumulator[X] { + def apply(x: X, tree: Tree)(using Context): X = foldOver(f(x, tree), tree) + } + + /** Is there a subtree of this tree that satisfies predicate `p`? 
*/ + extension (tree: Tree) def existsSubTree(p: Tree => Boolean)(using Context): Boolean = { + val acc = new UntypedTreeAccumulator[Boolean] { + def apply(x: Boolean, t: Tree)(using Context) = x || p(t) || foldOver(x, t) + } + acc(false, tree) + } + + protected def FunProto(args: List[Tree], resType: Type)(using Context) = + ProtoTypes.FunProto(args, resType)(ctx.typer, ApplyKind.Regular) +} diff --git a/tests/pos-with-compiler-cc/dotc/cc/BoxedTypeCache.scala b/tests/pos-with-compiler-cc/dotc/cc/BoxedTypeCache.scala new file mode 100644 index 000000000000..56b3f5ba5047 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/BoxedTypeCache.scala @@ -0,0 +1,19 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Types.*, Symbols.*, Contexts.* + +/** A one-element cache for the boxed version of an unboxed capturing type */ +class BoxedTypeCache: + private var boxed: Type = compiletime.uninitialized + private var unboxed: Type = NoType + + def apply(tp: AnnotatedType)(using Context): Type = + if tp ne unboxed then + unboxed = tp + val CapturingType(parent, refs) = tp: @unchecked + boxed = CapturingType(parent, refs, boxed = true) + boxed +end BoxedTypeCache \ No newline at end of file diff --git a/tests/pos-with-compiler-cc/dotc/cc/CaptureAnnotation.scala b/tests/pos-with-compiler-cc/dotc/cc/CaptureAnnotation.scala new file mode 100644 index 000000000000..2e750865f407 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/CaptureAnnotation.scala @@ -0,0 +1,77 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Types.*, Symbols.*, Contexts.*, Annotations.* +import ast.Trees.* +import ast.{tpd, untpd} +import Decorators.* +import config.Printers.capt +import printing.Printer +import printing.Texts.Text +import annotation.retains + +/** An annotation representing a capture set and whether it is boxed. + * It simulates a normal @retains annotation except that it is more efficient, + * supports variables as capture sets, and adds a `boxed` flag. + * These annotations are created during capture checking. Before that + * there are only regular @retains and @retainsByName annotations. 
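The `BoxedTypeCache` added above is a one-element memo: it remembers the last unboxed capturing type it was asked about and recomputes the boxed variant only when a different type arrives. A rough standalone model of the same idea, generic in key and value; the hypothetical `OneElementCache` below checks the key by reference as the compiler's version does, but stores its slot in `Option`s and ignores the `Context` the real computation needs:

```scala
/** Hypothetical generic one-element cache: recompute `f(key)` only when a
 *  different key (by reference) arrives, as BoxedTypeCache does for types.
 */
final class OneElementCache[K <: AnyRef, V](f: K => V):
  private var lastKey: Option[K] = None
  private var lastValue: Option[V] = None

  def apply(key: K): V =
    if !lastKey.exists(_ eq key) then   // a new key invalidates the single slot
      lastKey = Some(key)
      lastValue = Some(f(key))
    lastValue.get

@main def cacheDemo(): Unit =
  var computations = 0
  val cache = new OneElementCache[String, Int](s => { computations += 1; s.length })
  val key = "capture"
  println(cache(key)); println(cache(key))  // 7 twice, but computed only once
  println(computations)                     // 1
```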
+ * @param refs the capture set + * @param boxed whether the type carrying the annotation is boxed + * @param cls the underlying class (either annotation.retains or annotation.retainsByName) + */ +case class CaptureAnnotation(refs: CaptureSet, boxed: Boolean)(cls: Symbol) extends Annotation: + import CaptureAnnotation.* + import tpd.* + + /** A cache for boxed version of a capturing type with this annotation */ + val boxedType = BoxedTypeCache() + + /** Reconstitute annotation tree from capture set */ + override def tree(using Context) = + val elems = refs.elems.toList.map { + case cr: TermRef => ref(cr) + case cr: TermParamRef => untpd.Ident(cr.paramName).withType(cr) + case cr: ThisType => This(cr.cls) + } + val arg = repeated(elems, TypeTree(defn.AnyType)) + New(symbol.typeRef, arg :: Nil) + + override def symbol(using Context) = cls + + override def derivedAnnotation(tree: Tree)(using Context): Annotation = this + + def derivedAnnotation(refs: CaptureSet, boxed: Boolean)(using Context): Annotation = + if (this.refs eq refs) && (this.boxed == boxed) then this + else CaptureAnnotation(refs, boxed)(cls) + + override def sameAnnotation(that: Annotation)(using Context): Boolean = that match + case CaptureAnnotation(refs, boxed) => + this.refs == refs && this.boxed == boxed && this.symbol == that.symbol + case _ => false + + override def mapWith(tm: TypeMap @retains(caps.*))(using Context) = + val elems = refs.elems.toList + val elems1 = elems.mapConserve(tm) + if elems1 eq elems then this + else if elems1.forall(_.isInstanceOf[CaptureRef]) + then derivedAnnotation(CaptureSet(elems1.asInstanceOf[List[CaptureRef]]*), boxed) + else EmptyAnnotation + + override def refersToParamOf(tl: TermLambda)(using Context): Boolean = + refs.elems.exists { + case TermParamRef(tl1, _) => tl eq tl1 + case _ => false + } + + override def toText(printer: Printer): Text = refs.toText(printer) + + override def hash: Int = + (refs.hashCode << 1) | (if boxed then 1 else 0) + + override def eql(that: Annotation) = that match + case that: CaptureAnnotation => (this.refs eq that.refs) && (this.boxed == that.boxed) + case _ => false + +end CaptureAnnotation diff --git a/tests/pos-with-compiler-cc/dotc/cc/CaptureOps.scala b/tests/pos-with-compiler-cc/dotc/cc/CaptureOps.scala new file mode 100644 index 000000000000..0ede1825e611 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/CaptureOps.scala @@ -0,0 +1,256 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Types.*, Symbols.*, Contexts.*, Annotations.*, Flags.* +import ast.{tpd, untpd} +import Decorators.*, NameOps.* +import config.Printers.capt +import util.Property.Key +import tpd.* +import config.Feature + +private val Captures: Key[CaptureSet] = Key() +private val BoxedType: Key[BoxedTypeCache] = Key() + +/** The arguments of a @retains or @retainsByName annotation */ +private[cc] def retainedElems(tree: Tree)(using Context): List[Tree] = tree match + case Apply(_, Typed(SeqLiteral(elems, _), _) :: Nil) => elems + case _ => Nil + +/** An exception thrown if a @retains argument is not syntactically a CaptureRef */ +class IllegalCaptureRef(tpe: Type) extends Exception + +extension (tree: Tree) + + /** Map tree with CaptureRef type to its type, throw IllegalCaptureRef otherwise */ + def toCaptureRef(using Context): CaptureRef = tree.tpe match + case ref: CaptureRef => ref + case tpe => throw IllegalCaptureRef(tpe) + + /** Convert a @retains or @retainsByName annotation tree to the capture set it represents. 
+ * For efficience, the result is cached as an Attachment on the tree. + */ + def toCaptureSet(using Context): CaptureSet = + tree.getAttachment(Captures) match + case Some(refs) => refs + case None => + val refs = CaptureSet(retainedElems(tree).map(_.toCaptureRef)*) + .showing(i"toCaptureSet $tree --> $result", capt) + tree.putAttachment(Captures, refs) + refs + + /** Under pureFunctions, add a @retainsByName(*)` annotation to the argument of + * a by name parameter type, turning the latter into an impure by name parameter type. + */ + def adaptByNameArgUnderPureFuns(using Context): Tree = + if Feature.pureFunsEnabledSomewhere then + val rbn = defn.RetainsByNameAnnot + Annotated(tree, + New(rbn.typeRef).select(rbn.primaryConstructor).appliedTo( + Typed( + SeqLiteral(ref(defn.captureRoot) :: Nil, TypeTree(defn.AnyType)), + TypeTree(defn.RepeatedParamType.appliedTo(defn.AnyType)) + ) + ) + ) + else tree + +extension (tp: Type) + + /** @pre `tp` is a CapturingType */ + def derivedCapturingType(parent: Type, refs: CaptureSet)(using Context): Type = tp match + case tp @ CapturingType(p, r) => + if (parent eq p) && (refs eq r) then tp + else CapturingType(parent, refs, tp.isBoxed) + + /** If this is a unboxed capturing type with nonempty capture set, its boxed version. + * Or, if type is a TypeBounds of capturing types, the version where the bounds are boxed. + * The identity for all other types. + */ + def boxed(using Context): Type = tp.dealias match + case tp @ CapturingType(parent, refs) if !tp.isBoxed && !refs.isAlwaysEmpty => + tp.annot match + case ann: CaptureAnnotation => + ann.boxedType(tp) + case ann => + ann.tree.getAttachment(BoxedType) match + case None => ann.tree.putAttachment(BoxedType, BoxedTypeCache()) + case _ => + ann.tree.attachment(BoxedType)(tp) + case tp: RealTypeBounds => + tp.derivedTypeBounds(tp.lo.boxed, tp.hi.boxed) + case _ => + tp + + /** If `sym` is a type parameter, the boxed version of `tp`, otherwise `tp` */ + def boxedIfTypeParam(sym: Symbol)(using Context) = + if sym.is(TypeParam) then tp.boxed else tp + + /** The boxed version of `tp`, unless `tycon` is a function symbol */ + def boxedUnlessFun(tycon: Type)(using Context) = + if ctx.phase != Phases.checkCapturesPhase || defn.isFunctionSymbol(tycon.typeSymbol) + then tp + else tp.boxed + + /** The capture set consisting of all top-level captures of `tp` that appear under a box. + * Unlike for `boxed` this also considers parents of capture types, unions and + * intersections, and type proxies other than abstract types. + */ + def boxedCaptureSet(using Context): CaptureSet = + def getBoxed(tp: Type): CaptureSet = tp match + case tp @ CapturingType(parent, refs) => + val pcs = getBoxed(parent) + if tp.isBoxed then refs ++ pcs else pcs + case tp: TypeRef if tp.symbol.isAbstractType => CaptureSet.empty + case tp: TypeProxy => getBoxed(tp.superType) + case tp: AndType => getBoxed(tp.tp1) ** getBoxed(tp.tp2) + case tp: OrType => getBoxed(tp.tp1) ++ getBoxed(tp.tp2) + case _ => CaptureSet.empty + getBoxed(tp) + + /** Is the boxedCaptureSet of this type nonempty? */ + def isBoxedCapturing(using Context) = !tp.boxedCaptureSet.isAlwaysEmpty + + /** If this type is a capturing type, the version with boxed statues as given by `boxed`. + * If it is a TermRef of a capturing type, and the box status flips, widen to a capturing + * type that captures the TermRef. 
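The `boxedCaptureSet` helper a few lines above collects captures by structural recursion: it descends into the parents of capturing types, intersects the results for `AndType` and unions them for `OrType`. A hedged sketch of that recursion shape over a hypothetical miniature type language (`Tp`), with plain `Set[String]` standing in for capture sets and ordinary set intersection approximating `**`:

```scala
// Hypothetical miniature type language; Set[String] stands in for a capture set.
enum Tp:
  case Capturing(parent: Tp, refs: Set[String], boxed: Boolean)
  case And(tp1: Tp, tp2: Tp)
  case Or(tp1: Tp, tp2: Tp)
  case Leaf

/** Top-level captures appearing under a box: search the parent of a capturing
 *  type, intersect the two sides of an intersection, union those of a union.
 */
def boxedCaptures(tp: Tp): Set[String] = tp match
  case Tp.Capturing(parent, refs, boxed) =>
    val fromParent = boxedCaptures(parent)
    if boxed then refs ++ fromParent else fromParent
  case Tp.And(t1, t2) => boxedCaptures(t1) & boxedCaptures(t2)
  case Tp.Or(t1, t2)  => boxedCaptures(t1) ++ boxedCaptures(t2)
  case Tp.Leaf        => Set.empty

@main def boxedDemo(): Unit =
  import Tp.*
  val t = And(Capturing(Leaf, Set("a", "b"), boxed = true),
              Capturing(Leaf, Set("b"), boxed = true))
  println(boxedCaptures(t))  // Set(b)
```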
+ */ + def forceBoxStatus(boxed: Boolean)(using Context): Type = tp.widenDealias match + case tp @ CapturingType(parent, refs) if tp.isBoxed != boxed => + val refs1 = tp match + case ref: CaptureRef if ref.isTracked => ref.singletonCaptureSet + case _ => refs + CapturingType(parent, refs1, boxed) + case _ => + tp + + /** Map capturing type to their parents. Capturing types accessible + * via dealising are also stripped. + */ + def stripCapturing(using Context): Type = tp.dealiasKeepAnnots match + case CapturingType(parent, _) => + parent.stripCapturing + case atd @ AnnotatedType(parent, annot) => + atd.derivedAnnotatedType(parent.stripCapturing, annot) + case _ => + tp + + /** Under pureFunctions, map regular function type to impure function type + */ + def adaptFunctionTypeUnderPureFuns(using Context): Type = tp match + case AppliedType(fn, args) + if Feature.pureFunsEnabledSomewhere && defn.isFunctionClass(fn.typeSymbol) => + val fname = fn.typeSymbol.name + defn.FunctionType( + fname.functionArity, + isContextual = fname.isContextFunction, + isErased = fname.isErasedFunction, + isImpure = true).appliedTo(args) + case _ => + tp + + /** Under pureFunctions, add a @retainsByName(*)` annotation to the argument of + * a by name parameter type, turning the latter into an impure by name parameter type. + */ + def adaptByNameArgUnderPureFuns(using Context): Type = + if Feature.pureFunsEnabledSomewhere then + AnnotatedType(tp, + CaptureAnnotation(CaptureSet.universal, boxed = false)(defn.RetainsByNameAnnot)) + else + tp + + def isCapturingType(using Context): Boolean = + tp match + case CapturingType(_, _) => true + case _ => false + + /** Is type known to be always pure by its class structure, + * so that adding a capture set to it would not make sense? + */ + def isAlwaysPure(using Context): Boolean = tp.dealias match + case tp: (TypeRef | AppliedType) => + val sym = tp.typeSymbol + if sym.isClass then sym.isPureClass + else tp.superType.isAlwaysPure + case CapturingType(parent, refs) => + parent.isAlwaysPure || refs.isAlwaysEmpty + case tp: TypeProxy => + tp.superType.isAlwaysPure + case tp: AndType => + tp.tp1.isAlwaysPure || tp.tp2.isAlwaysPure + case tp: OrType => + tp.tp1.isAlwaysPure && tp.tp2.isAlwaysPure + case _ => + false + +extension (cls: ClassSymbol) + + def pureBaseClass(using Context): Option[Symbol] = + cls.baseClasses.find(bc => + defn.pureBaseClasses.contains(bc) + || { + val selfType = bc.givenSelfType + selfType.exists && selfType.captureSet.isAlwaysEmpty + }) + +extension (sym: Symbol) + + /** A class is pure if: + * - one its base types has an explicitly declared self type with an empty capture set + * - or it is a value class + * - or it is an exception + * - or it is one of Nothing, Null, or String + */ + def isPureClass(using Context): Boolean = sym match + case cls: ClassSymbol => + cls.pureBaseClass.isDefined || defn.pureSimpleClasses.contains(cls) + case _ => + false + + /** Does this symbol allow results carrying the universal capability? + * Currently this is true only for function type applies (since their + * results are unboxed) and `erasedValue` since this function is magic in + * that is allows to conjure global capabilies from nothing (aside: can we find a + * more controlled way to achieve this?). + * But it could be generalized to other functions that so that they can take capability + * classes as arguments. 
+ */ + def allowsRootCapture(using Context): Boolean = + sym == defn.Compiletime_erasedValue + || defn.isFunctionClass(sym.maybeOwner) + + /** When applying `sym`, would the result type be unboxed? + * This is the case if the result type contains a top-level reference to an enclosing + * class or method type parameter and the method does not allow root capture. + * If the type parameter is instantiated to a boxed type, that type would + * have to be unboxed in the method's result. + */ + def unboxesResult(using Context): Boolean = + def containsEnclTypeParam(tp: Type): Boolean = tp.strippedDealias match + case tp @ TypeRef(pre: ThisType, _) => tp.symbol.is(Param) + case tp: TypeParamRef => true + case tp: AndOrType => containsEnclTypeParam(tp.tp1) || containsEnclTypeParam(tp.tp2) + case tp: RefinedType => containsEnclTypeParam(tp.parent) || containsEnclTypeParam(tp.refinedInfo) + case _ => false + containsEnclTypeParam(sym.info.finalResultType) + && !sym.allowsRootCapture + && sym != defn.Caps_unsafeBox + && sym != defn.Caps_unsafeUnbox + +extension (tp: AnnotatedType) + /** Is this a boxed capturing type? */ + def isBoxed(using Context): Boolean = tp.annot match + case ann: CaptureAnnotation => ann.boxed + case _ => false + +extension (ts: List[Type]) + /** Equivalent to ts.mapconserve(_.boxedUnlessFun(tycon)) but more efficient where + * it is the identity. + */ + def boxedUnlessFun(tycon: Type)(using Context) = + if ctx.phase != Phases.checkCapturesPhase || defn.isFunctionClass(tycon.typeSymbol) + then ts + else ts.mapconserve(_.boxed) + diff --git a/tests/pos-with-compiler-cc/dotc/cc/CaptureSet.scala b/tests/pos-with-compiler-cc/dotc/cc/CaptureSet.scala new file mode 100644 index 000000000000..c31bcb76c2c7 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/CaptureSet.scala @@ -0,0 +1,902 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Types.*, Symbols.*, Flags.*, Contexts.*, Decorators.* +import config.Printers.capt +import Annotations.Annotation +import annotation.threadUnsafe +import annotation.constructorOnly +import annotation.internal.sharable +import reporting.trace +import printing.{Showable, Printer} +import printing.Texts.* +import util.{SimpleIdentitySet, Property} +import util.common.alwaysTrue +import scala.collection.mutable +import config.Config.ccAllowUnsoundMaps +import language.experimental.pureFunctions +import annotation.retains + +/** A class for capture sets. Capture sets can be constants or variables. + * Capture sets support inclusion constraints <:< where <:< is subcapturing. + * + * They also allow + * - mapping with functions from elements to capture sets + * - filtering with predicates on elements + * - intersecting wo capture sets + * + * That is, constraints can be of the forms + * + * cs1 <:< cs2 + * cs1 = ∪ {f(x) | x ∈ cs2} where f is a function from capture references to capture sets. + * cs1 = ∪ {x | x ∈ cs2, p(x)} where p is a predicate on capture references + * cs1 = cs2 ∩ cs2 + * + * We call the resulting constraint system "monadic set constraints". + * To support capture propagation across maps, mappings are supported only + * if the mapped function is either a bijection or if it is idempotent + * on capture references (c.f. doc comment on `map` below). + */ +sealed abstract class CaptureSet extends Showable, caps.Pure: + import CaptureSet.* + + /** The elements of this capture set. For capture variables, + * the elements known so far. + */ + def elems: Refs + + /** Is this capture set constant (i.e. 
not an unsolved capture variable)? + * Solved capture variables count as constant. + */ + def isConst: Boolean + + /** Is this capture set always empty? For unsolved capture veriables, returns + * always false. + */ + def isAlwaysEmpty: Boolean + + /** Is this capture set definitely non-empty? */ + final def isNotEmpty: Boolean = !elems.isEmpty + + /** Convert to Const. @pre: isConst */ + def asConst: Const = this match + case c: Const => c + case v: Var => + assert(v.isConst) + Const(v.elems) + + /** Cast to variable. @pre: !isConst */ + def asVar: Var = + assert(!isConst) + asInstanceOf[Var] + + /** Does this capture set contain the root reference `*` as element? */ + final def isUniversal(using Context) = + elems.exists { + case ref: TermRef => ref.symbol == defn.captureRoot + case _ => false + } + + /** Add new elements to this capture set if allowed. + * @pre `newElems` is not empty and does not overlap with `this.elems`. + * Constant capture sets never allow to add new elements. + * Variables allow it if and only if the new elements can be included + * in all their dependent sets. + * @param origin The set where the elements come from, or `empty` if not known. + * @return CompareResult.OK if elements were added, or a conflicting + * capture set that prevents addition otherwise. + */ + protected def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult + + /** If this is a variable, add `cs` as a dependent set */ + protected def addDependent(cs: CaptureSet)(using Context, VarState): CompareResult + + /** If `cs` is a variable, add this capture set as one of its dependent sets */ + protected def addAsDependentTo(cs: CaptureSet)(using Context): this.type = + cs.addDependent(this)(using ctx, UnrecordedState) + this + + /** Try to include all references of `elems` that are not yet accounted for by this + * capture set. Inclusion is via `addNewElems`. + * @param origin The set where the elements come from, or `empty` if not known. + * @return CompareResult.OK if all unaccounted elements could be added, + * capture set that prevents addition otherwise. + */ + protected final def tryInclude(elems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + val unaccounted = elems.filter(!accountsFor(_)) + if unaccounted.isEmpty then CompareResult.OK + else addNewElems(unaccounted, origin) + + /** Equivalent to `tryInclude({elem}, origin)`, but more efficient */ + protected final def tryInclude(elem: CaptureRef, origin: CaptureSet)(using Context, VarState): CompareResult = + if accountsFor(elem) then CompareResult.OK + else addNewElems(elem.singletonCaptureSet.elems, origin) + + /* x subsumes y if x is the same as y, or x is a this reference and y refers to a field of x */ + extension (x: CaptureRef) private def subsumes(y: CaptureRef) = + (x eq y) + || y.match + case y: TermRef => y.prefix eq x + case _ => false + + /** {x} <:< this where <:< is subcapturing, but treating all variables + * as frozen. + */ + def accountsFor(x: CaptureRef)(using Context): Boolean = + reporting.trace(i"$this accountsFor $x, ${x.captureSetOfInfo}?", show = true) { + elems.exists(_.subsumes(x)) + || !x.isRootCapability && x.captureSetOfInfo.subCaptures(this, frozen = true).isOK + } + + /** A more optimistic version of accountsFor, which does not take variable supersets + * of the `x` reference into account. A set might account for `x` if it accounts + * for `x` in a state where we assume all supersets of `x` have just the elements + * known at this point. 
On the other hand if x's capture set has no known elements, + * a set `cs` might account for `x` only if it subsumes `x` or it contains the + * root capability `*`. + */ + def mightAccountFor(x: CaptureRef)(using Context): Boolean = + reporting.trace(i"$this mightAccountFor $x, ${x.captureSetOfInfo}?", show = true) { + elems.exists(elem => elem.subsumes(x) || elem.isRootCapability) + || !x.isRootCapability + && { + val elems = x.captureSetOfInfo.elems + !elems.isEmpty && elems.forall(mightAccountFor) + } + } + + /** A more optimistic version of subCaptures used to choose one of two typing rules + * for selections and applications. `cs1 mightSubcapture cs2` if `cs2` might account for + * every element currently known to be in `cs1`. + */ + def mightSubcapture(that: CaptureSet)(using Context): Boolean = + elems.forall(that.mightAccountFor) + + /** The subcapturing test. + * @param frozen if true, no new variables or dependent sets are allowed to + * be added when making this test. An attempt to add either + * will result in failure. + */ + final def subCaptures(that: CaptureSet, frozen: Boolean)(using Context): CompareResult = + subCaptures(that)(using ctx, if frozen then FrozenState else VarState()) + + /** The subcapturing test, using a given VarState */ + private def subCaptures(that: CaptureSet)(using Context, VarState): CompareResult = + def recur(elems: List[CaptureRef]): CompareResult = elems match + case elem :: elems1 => + var result = that.tryInclude(elem, this) + if !result.isOK && !elem.isRootCapability && summon[VarState] != FrozenState then + result = elem.captureSetOfInfo.subCaptures(that) + if result.isOK then + recur(elems1) + else + varState.rollBack() + result + case Nil => + addDependent(that) + recur(elems.toList) + .showing(i"subcaptures $this <:< $that = $result", capt)(using null) + + /** Two capture sets are considered =:= equal if they mutually subcapture each other + * in a frozen state. + */ + def =:= (that: CaptureSet)(using Context): Boolean = + this.subCaptures(that, frozen = true).isOK + && that.subCaptures(this, frozen = true).isOK + + /** The smallest capture set (via <:<) that is a superset of both + * `this` and `that` + */ + def ++ (that: CaptureSet)(using Context): CaptureSet = + if this.subCaptures(that, frozen = true).isOK then that + else if that.subCaptures(this, frozen = true).isOK then this + else if this.isConst && that.isConst then Const(this.elems ++ that.elems) + else Var(this.elems ++ that.elems).addAsDependentTo(this).addAsDependentTo(that) + + /** The smallest superset (via <:<) of this capture set that also contains `ref`. 
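The `++` defined just above tries the cheap cases first: if one operand already subcaptures the other in the frozen sense, that operand is returned as-is; only when both sides are constants is a fresh constant union built, and otherwise a new dependent variable is created. A toy version of that decision ladder over plain sets, with `subsetOf` standing in for the frozen subcapture test and no variable case:

```scala
/** Union with the short-circuits of `++`: reuse an operand that already
 *  covers the other one, and only build a new set when necessary.
 */
def unionSets[A](a: Set[A], b: Set[A]): Set[A] =
  if a.subsetOf(b) then b        // `a` is already accounted for by `b`
  else if b.subsetOf(a) then a   // `b` is already accounted for by `a`
  else a ++ b                    // genuinely new combination

@main def unionDemo(): Unit =
  val small = Set("x")
  val big   = Set("x", "y")
  println(unionSets(small, big) eq big)   // true: `big` returned unchanged
  println(unionSets(Set("x"), Set("y")))  // Set(x, y)
```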
+ */ + def + (ref: CaptureRef)(using Context): CaptureSet = + this ++ ref.singletonCaptureSet + + /** The largest capture set (via <:<) that is a subset of both `this` and `that` + */ + def **(that: CaptureSet)(using Context): CaptureSet = + if this.subCaptures(that, frozen = true).isOK then this + else if that.subCaptures(this, frozen = true).isOK then that + else if this.isConst && that.isConst then Const(elemIntersection(this, that)) + else Intersected(this, that) + + /** The largest subset (via <:<) of this capture set that does not account for + * any of the elements in the constant capture set `that` + */ + def -- (that: CaptureSet.Const)(using Context): CaptureSet = + val elems1 = elems.filter(!that.accountsFor(_)) + if elems1.size == elems.size then this + else if this.isConst then Const(elems1) + else Diff(asVar, that) + + /** The largest subset (via <:<) of this capture set that does not account for `ref` */ + def - (ref: CaptureRef)(using Context): CaptureSet = + this -- ref.singletonCaptureSet + + /** The largest subset (via <:<) of this capture set that only contains elements + * for which `p` is true. + */ + def filter(p: (c: Context) ?-> (CaptureRef -> Boolean) @retains(c))(using Context): CaptureSet = + if this.isConst then + val elems1 = elems.filter(p) + if elems1 == elems then this + else Const(elems.filter(p)) + else Filtered(asVar, p) + + /** Capture set obtained by applying `tm` to all elements of the current capture set + * and joining the results. If the current capture set is a variable, the same + * transformation is applied to all future additions of new elements. + * + * Note: We have a problem how we handle the situation where we have a mapped set + * + * cs2 = tm(cs1) + * + * and then the propagation solver adds a new element `x` to `cs2`. What do we + * know in this case about `cs1`? We can answer this question in a sound way only + * if `tm` is a bijection on capture references or it is idempotent on capture references. + * (see definition in IdempotentCapRefMap). + * If `tm` is a bijection we know that `tm^-1(x)` must be in `cs1`. If `tm` is idempotent + * one possible solution is that `x` is in `cs1`, which is what we assume in this case. + * That strategy is sound but not complete. + * + * If `tm` is some other map, we don't know how to handle this case. For now, + * we simply refuse to handle other maps. If they do need to be handled, + * `OtherMapped` provides some approximation to a solution, but it is neither + * sound nor complete. + */ + def map(tm: TypeMap)(using Context): CaptureSet = tm match + case tm: BiTypeMap => + val mappedElems = elems.map(tm.forward) + if isConst then + if mappedElems == elems then this + else Const(mappedElems) + else BiMapped(asVar, tm, mappedElems) + case tm: IdentityCaptRefMap => + this + case _ => + val mapped = mapRefs(elems, tm, tm.variance) + if isConst then + if mapped.isConst && mapped.elems == elems then this + else mapped + else Mapped(asVar, tm, tm.variance, mapped) + + /** A mapping resulting from substituting parameters of a BindingType to a list of types */ + def substParams(tl: BindingType, to: List[Type])(using Context) = + map(Substituters.SubstParamsMap(tl, to).detach) + + /** Invoke handler if this set has (or later aquires) the root capability `*` */ + def disallowRootCapability(handler: () -> Context ?-> Unit)(using Context): this.type = + if isUniversal then handler() + this + + /** An upper approximation of this capture set, i.e. a constant set that is + * subcaptured by this set. 
If the current set is a variable + * it is the intersection of all upper approximations of known supersets + * of the variable. + * The upper approximation is meaningful only if it is constant. If not, + * `upperApprox` can return an arbitrary capture set variable. + * `upperApprox` is used in `solve`. + */ + protected def upperApprox(origin: CaptureSet)(using Context): CaptureSet + + /** Assuming set this set dependds on was just solved to be constant, propagate this info + * to this set. This might result in the set being solved to be constant + * itself. + */ + protected def propagateSolved()(using Context): Unit = () + + /** This capture set with a description that tells where it comes from */ + def withDescription(description: String): CaptureSet + + /** The provided description (using `withDescription`) for this capture set or else "" */ + def description: String + + /** A regular @retains or @retainsByName annotation with the elements of this set as arguments. */ + def toRegularAnnotation(cls: Symbol)(using Context): Annotation = + Annotation(CaptureAnnotation(this, boxed = false)(cls).tree) + + override def toText(printer: Printer): Text = + Str("{") ~ Text(elems.toList.map(printer.toTextCaptureRef), ", ") ~ Str("}") ~~ description + +object CaptureSet: + type Refs = SimpleIdentitySet[CaptureRef] + type Vars = SimpleIdentitySet[Var] + type Deps = SimpleIdentitySet[CaptureSet] + + @sharable private var varId = 0 + + /** If set to `true`, capture stack traces that tell us where sets are created */ + private final val debugSets = false + + private val emptySet = SimpleIdentitySet.empty + + /** The empty capture set `{}` */ + val empty: CaptureSet.Const = Const(emptySet) + + /** The universal capture set `{*}` */ + def universal(using Context): CaptureSet = + defn.captureRoot.termRef.singletonCaptureSet + + /** Used as a recursion brake */ + @sharable private[dotc] val Pending = Const(SimpleIdentitySet.empty) + + def apply(elems: CaptureRef*)(using Context): CaptureSet.Const = + if elems.isEmpty then empty + else Const(SimpleIdentitySet(elems.map(_.normalizedRef)*)) + + def apply(elems: Refs)(using Context): CaptureSet.Const = + if elems.isEmpty then empty else Const(elems) + + /** The subclass of constant capture sets with given elements `elems` */ + class Const private[CaptureSet] (val elems: Refs, val description: String = "") extends CaptureSet: + def isConst = true + def isAlwaysEmpty = elems.isEmpty + + def addNewElems(elems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + CompareResult.fail(this) + + def addDependent(cs: CaptureSet)(using Context, VarState) = CompareResult.OK + + def upperApprox(origin: CaptureSet)(using Context): CaptureSet = this + + def withDescription(description: String): Const = Const(elems, description) + + override def toString = elems.toString + end Const + + /** The subclass of captureset variables with given initial elements */ + class Var(initialElems: Refs = emptySet) extends CaptureSet: + + /** A unique identification number for diagnostics */ + val id = + varId += 1 + varId + + /** A variable is solved if it is aproximated to a from-then-on constant set. */ + private var isSolved: Boolean = false + + /** The elements currently known to be in the set */ + var elems: Refs = initialElems + + /** The sets currently known to be dependent sets (i.e. new additions to this set + * are propagated to these dependent sets.) 
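A `Var` pairs the elements discovered so far with the sets that depend on it, and every later addition has to be pushed into each dependent set; this is how the inclusion constraints described at the top of the file are solved incrementally. A small self-contained model of that propagation step, using a hypothetical `SetVar` over strings with no rollback, failure handling, or mapping:

```scala
import scala.collection.mutable

/** Hypothetical constraint variable: `elems` only grows, and every addition
 *  is pushed into all registered dependent variables, mirroring the roles of
 *  `elems` and `deps` in `CaptureSet.Var`.
 */
final class SetVar(val name: String):
  val elems = mutable.Set.empty[String]
  private val deps = mutable.ListBuffer.empty[SetVar]

  /** Record the constraint `this <:< sup`: whatever lands here must reach `sup`. */
  def addDependent(sup: SetVar): Unit =
    deps += sup
    sup.include(elems.toSet)   // elements known so far flow immediately

  def include(newElems: Set[String]): Unit =
    val unaccounted = newElems -- elems
    if unaccounted.nonEmpty then
      elems ++= unaccounted
      deps.foreach(_.include(unaccounted))  // propagate transitively

@main def propagationDemo(): Unit =
  val a = SetVar("a")
  val b = SetVar("b")
  a.addDependent(b)       // a <:< b
  a.include(Set("io"))
  println(b.elems)        // HashSet(io): propagated from `a`
```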
+ */ + var deps: Deps = emptySet + + def isConst = isSolved + def isAlwaysEmpty = false + + /** A handler to be invoked if the root reference `*` is added to this set + * The handler is pure in the sense that it will only output diagnostics. + */ + var rootAddedHandler: () -> Context ?-> Unit = () => () + + var description: String = "" + + /** Record current elements in given VarState provided it does not yet + * contain an entry for this variable. + */ + private def recordElemsState()(using VarState): Boolean = + varState.getElems(this) match + case None => varState.putElems(this, elems) + case _ => true + + /** Record current dependent sets in given VarState provided it does not yet + * contain an entry for this variable. + */ + private[CaptureSet] def recordDepsState()(using VarState): Boolean = + varState.getDeps(this) match + case None => varState.putDeps(this, deps) + case _ => true + + /** Reset elements to what was recorded in `state` */ + def resetElems()(using state: VarState): Unit = + elems = state.elems(this) + + /** Reset dependent sets to what was recorded in `state` */ + def resetDeps()(using state: VarState): Unit = + deps = state.deps(this) + + def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + if !isConst && recordElemsState() then + elems ++= newElems + if isUniversal then rootAddedHandler() + // assert(id != 2 || elems.size != 2, this) + (CompareResult.OK /: deps) { (r, dep) => + r.andAlso(dep.tryInclude(newElems, this)) + } + else // fail if variable is solved or given VarState is frozen + CompareResult.fail(this) + + def addDependent(cs: CaptureSet)(using Context, VarState): CompareResult = + if (cs eq this) || cs.isUniversal || isConst then + CompareResult.OK + else if recordDepsState() then + deps += cs + CompareResult.OK + else + CompareResult.fail(this) + + override def disallowRootCapability(handler: () -> Context ?-> Unit)(using Context): this.type = + rootAddedHandler = handler + super.disallowRootCapability(handler) + + private var computingApprox = false + + /** Roughly: the intersection of all constant known supersets of this set. + * The aim is to find an as-good-as-possible constant set that is a superset + * of this set. The universal set {*} is a sound fallback. + */ + final def upperApprox(origin: CaptureSet)(using Context): CaptureSet = + if computingApprox then universal + else if isConst then this + else + computingApprox = true + try computeApprox(origin).ensuring(_.isConst) + finally computingApprox = false + + /** The intersection of all upper approximations of dependent sets */ + protected def computeApprox(origin: CaptureSet)(using Context): CaptureSet = + (universal /: deps) { (acc, sup) => acc ** sup.upperApprox(this) } + + /** Widen the variable's elements to its upper approximation and + * mark it as constant from now on. This is used for contra-variant type variables + * in the results of defs and vals. 
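`recordElemsState`, `recordDepsState` and the matching `reset` methods above give additions a transactional flavour: the state of each touched variable is recorded in the ambient `VarState` before it is mutated, and a failed comparison rolls everything back. A rough sketch of that record-then-rollback discipline for a single toy variable; `TxVar`, `Snapshot` and `tryAdd` are hypothetical stand-ins, and the success condition is just a predicate:

```scala
import scala.collection.mutable

final class TxVar(var elems: Set[String])

/** Hypothetical snapshot log: keep the first recorded state of each variable
 *  so a failed operation can restore it, loosely mirroring `VarState`.
 */
final class Snapshot:
  private val saved = mutable.Map.empty[TxVar, Set[String]]
  def record(v: TxVar): Unit = { saved.getOrElseUpdate(v, v.elems); () }
  def rollBack(): Unit = saved.foreach((v, old) => v.elems = old)

/** Attempt an addition and undo it when the (hypothetical) check fails. */
def tryAdd(v: TxVar, elem: String)(check: Set[String] => Boolean): Boolean =
  val state = Snapshot()
  state.record(v)
  v.elems += elem
  if check(v.elems) then true
  else { state.rollBack(); false }

@main def snapshotDemo(): Unit =
  val v = TxVar(Set("a"))
  println(tryAdd(v, "b")(_.size <= 1))  // false: the addition was rejected
  println(v.elems)                      // Set(a): rolled back
```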
+ */ + def solve()(using Context): Unit = + if !isConst then + val approx = upperApprox(empty) + //println(i"solving var $this $approx ${approx.isConst} deps = ${deps.toList}") + val newElems = approx.elems -- elems + if newElems.isEmpty || addNewElems(newElems, empty)(using ctx, VarState()).isOK then + markSolved() + + /** Mark set as solved and propagate this info to all dependent sets */ + def markSolved()(using Context): Unit = + isSolved = true + deps.foreach(_.propagateSolved()) + + def withDescription(description: String): this.type = + this.description = + if this.description.isEmpty then description + else s"${this.description} and $description" + this + + /** Used for diagnostics and debugging: A string that traces the creation + * history of a variable by following source links. Each variable on the + * path is characterized by the variable's id and the first letter of the + * variable's class name. The path ends in a plain variable with letter `V` that + * is not derived from some other variable. + */ + protected def ids(using Context): String = + val trail = this.match + case dv: DerivedVar => dv.source.ids + case _ => "" + s"$id${getClass.getSimpleName.nn.take(1)}$trail" + + /** Adds variables to the ShownVars context property if that exists, which + * establishes a record of all variables printed in an error message. + * Prints variables wih ids under -Ycc-debug. + */ + override def toText(printer: Printer): Text = inContext(printer.printerContext) { + for vars <- ctx.property(ShownVars) do vars += this + super.toText(printer) ~ (Str(ids) provided !isConst && ctx.settings.YccDebug.value) + } + + override def toString = s"Var$id$elems" + end Var + + /** A variable that is derived from some other variable via a map or filter. */ + abstract class DerivedVar(initialElems: Refs)(using @constructorOnly ctx: Context) + extends Var(initialElems): + + // For debugging: A trace where a set was created. Note that logically it would make more + // sense to place this variable in Mapped, but that runs afoul of the initializatuon checker. + val stack = if debugSets && this.isInstanceOf[Mapped] then (new Throwable).getStackTrace().nn.take(20) else null + + /** The variable from which this variable is derived */ + def source: Var + + addAsDependentTo(source) + + override def propagateSolved()(using Context) = + if source.isConst && !isConst then markSolved() + end DerivedVar + + /** A variable that changes when `source` changes, where all additional new elements are mapped + * using ∪ { tm(x) | x <- source.elems }. + * @param source the original set that is mapped + * @param tm the type map, which is assumed to be idempotent on capture refs + * (except if ccUnsoundMaps is enabled) + * @param variance the assumed variance with which types with capturesets of size >= 2 are approximated + * (i.e. co: full capture set, contra: empty set, nonvariant is not allowed.) + * @param initial The initial mappings of source's elements at the point the Mapped set is created. 
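`solve()` above widens a variable to its upper approximation and then freezes it, so that from then on it behaves like a constant and rejects genuinely new elements. A compressed toy version of that idea; the hypothetical `Solvable` takes the approximation as an argument instead of computing it from dependent sets:

```scala
/** Toy `solve()`: adopt a given upper approximation, then freeze the
 *  variable so that genuinely new elements are rejected afterwards.
 */
final class Solvable(var elems: Set[String]):
  private var solved = false

  def include(xs: Set[String]): Boolean =
    if solved && !xs.subsetOf(elems) then false  // constants reject new elements
    else { elems ++= xs; true }

  def solve(upperApprox: Set[String]): Unit =
    include(upperApprox -- elems)  // widen to the approximation
    solved = true                  // behave like a constant from now on

@main def solveDemo(): Unit =
  val v = Solvable(Set("a"))
  v.solve(Set("a", "b"))
  println(v.elems)              // Set(a, b)
  println(v.include(Set("c")))  // false: already solved
```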
+ */ + class Mapped private[CaptureSet] + (val source: Var, tm: TypeMap, variance: Int, initial: CaptureSet)(using @constructorOnly ctx: Context) + extends DerivedVar(initial.elems): + addAsDependentTo(initial) // initial mappings could change by propagation + + private def mapIsIdempotent = tm.isInstanceOf[IdempotentCaptRefMap] + + assert(ccAllowUnsoundMaps || mapIsIdempotent, tm.getClass) + + private def whereCreated(using Context): String = + if stack == null then "" + else i""" + |Stack trace of variable creation:" + |${stack.mkString("\n")}""" + + override def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + val added = + if origin eq source then // elements have to be mapped + mapRefs(newElems, tm, variance) + else + // elements are added by subcapturing propagation with this Mapped set + // as superset; no mapping is necessary or allowed. + Const(newElems) + super.addNewElems(added.elems, origin) + .andAlso { + if added.isConst then CompareResult.OK + else if added.asVar.recordDepsState() then { addAsDependentTo(added); CompareResult.OK } + else CompareResult.fail(this) + } + .andAlso { + if (origin ne source) && (origin ne initial) && mapIsIdempotent then + // `tm` is idempotent, propagate back elems from image set. + // This is sound, since we know that for `r in newElems: tm(r) = r`, hence + // `r` is _one_ possible solution in `source` that would make an `r` appear in this set. + // It's not necessarily the only possible solution, so the scheme is incomplete. + source.tryInclude(newElems, this) + else if !mapIsIdempotent && variance <= 0 && !origin.isConst && (origin ne initial) && (origin ne source) then + // The map is neither a BiTypeMap nor an idempotent type map. + // In that case there's no much we can do. + // The scheme then does not propagate added elements back to source and rejects adding + // elements from variable sources in contra- and non-variant positions. In essence, + // we approximate types resulting from such maps by returning a possible super type + // from the actual type. But this is neither sound nor complete. + report.warning(em"trying to add elems ${CaptureSet(newElems)} from unrecognized source $origin of mapped set $this$whereCreated") + CompareResult.fail(this) + else + CompareResult.OK + } + + override def computeApprox(origin: CaptureSet)(using Context): CaptureSet = + if source eq origin then + // it's a mapping of origin, so not a superset of `origin`, + // therefore don't contribute to the intersection. + universal + else + source.upperApprox(this).map(tm) + + override def propagateSolved()(using Context) = + if initial.isConst then super.propagateSolved() + + override def toString = s"Mapped$id($source, elems = $elems)" + end Mapped + + /** A mapping where the type map is required to be a bijection. + * Parameters as in Mapped. 
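`Mapped` above is a derived variable: it registers itself as a dependent of `source`, so whenever the source learns a new element the mapped set learns its image under the type map (with extra care, visible in its `addNewElems`, for elements arriving from other origins). A standalone sketch of the forward direction only, with a plain `String => String` function in place of a `TypeMap`; `SourceVar` and `MappedVar` are hypothetical:

```scala
import scala.collection.mutable

/** Hypothetical source variable that notifies listeners of each new element. */
final class SourceVar:
  val elems = mutable.Set.empty[String]
  private val listeners = mutable.ListBuffer.empty[Set[String] => Unit]
  def onNewElems(f: Set[String] => Unit): Unit = listeners += f
  def include(xs: Set[String]): Unit =
    val fresh = xs.filterNot(elems)
    if fresh.nonEmpty then
      elems ++= fresh
      listeners.foreach(_(fresh))

/** Hypothetical mapped set: keeps `f` applied to everything the source holds. */
final class MappedVar(source: SourceVar, f: String => String):
  val elems = mutable.Set.empty[String]
  elems ++= source.elems.map(f)                       // initial image
  source.onNewElems(fresh => elems ++= fresh.map(f))  // forward propagation

@main def mappedDemo(): Unit =
  val src = SourceVar()
  val mapped = MappedVar(src, _.toUpperCase)
  src.include(Set("io"))
  println(mapped.elems)  // HashSet(IO)
```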
+ */ + final class BiMapped private[CaptureSet] + (val source: Var, bimap: BiTypeMap, initialElems: Refs)(using @constructorOnly ctx: Context) + extends DerivedVar(initialElems): + + override def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + if origin eq source then + super.addNewElems(newElems.map(bimap.forward), origin) + else + super.addNewElems(newElems, origin) + .andAlso { + source.tryInclude(newElems.map(bimap.backward), this) + .showing(i"propagating new elems ${CaptureSet(newElems)} backward from $this to $source", capt)(using null) + } + + /** For a BiTypeMap, supertypes of the mapped type also constrain + * the source via the inverse type mapping and vice versa. That is, if + * B = f(A) and B <: C, then A <: f^-1(C), so C should flow into + * the upper approximation of A. + * Conversely if A <: C2, then we also know that B <: f(C2). + * These situations are modeled by the two branches of the conditional below. + */ + override def computeApprox(origin: CaptureSet)(using Context): CaptureSet = + val supApprox = super.computeApprox(this) + if source eq origin then supApprox.map(bimap.inverseTypeMap.detach) + else source.upperApprox(this).map(bimap) ** supApprox + + override def toString = s"BiMapped$id($source, elems = $elems)" + end BiMapped + + /** A variable with elements given at any time as { x <- source.elems | p(x) } */ + class Filtered private[CaptureSet] + (val source: Var, p: (c: Context) ?-> (CaptureRef -> Boolean) @retains(c))(using @constructorOnly ctx: Context) + extends DerivedVar(source.elems.filter(p)): + + override def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + val filtered = newElems.filter(p) + if origin eq source then + super.addNewElems(filtered, origin) + else + // Filtered elements have to be back-propagated to source. + // Elements that don't satisfy `p` are not allowed. + super.addNewElems(newElems, origin) + .andAlso { + if filtered.size == newElems.size then source.tryInclude(newElems, this) + else CompareResult.fail(this) + } + + override def computeApprox(origin: CaptureSet)(using Context): CaptureSet = + if source eq origin then + // it's a filter of origin, so not a superset of `origin`, + // therefore don't contribute to the intersection. + universal + else + source.upperApprox(this).filter(p) + + override def toString = s"${getClass.getSimpleName}$id($source, elems = $elems)" + end Filtered + + /** A variable with elements given at any time as { x <- source.elems | !other.accountsFor(x) } */ + class Diff(source: Var, other: Const)(using @constructorOnly ctx: Context) + extends Filtered(source, !other.accountsFor(_)) + + class Intersected(cs1: CaptureSet, cs2: CaptureSet)(using @constructorOnly ctx: Context) + extends Var(elemIntersection(cs1, cs2)): + addAsDependentTo(cs1) + addAsDependentTo(cs2) + deps += cs1 + deps += cs2 + + override def addNewElems(newElems: Refs, origin: CaptureSet)(using Context, VarState): CompareResult = + val added = + if origin eq cs1 then newElems.filter(cs2.accountsFor) + else if origin eq cs2 then newElems.filter(cs1.accountsFor) + else newElems + // If origin is not cs1 or cs2, then newElems will be propagated to + // cs1, cs2 since they are in deps. 
+ super.addNewElems(added, origin) + + override def computeApprox(origin: CaptureSet)(using Context): CaptureSet = + if (origin eq cs1) || (origin eq cs2) then + // it's a combination of origin with some other set, so not a superset of `origin`, + // therefore don't contribute to the intersection. + universal + else + CaptureSet(elemIntersection(cs1.upperApprox(this), cs2.upperApprox(this))) + + override def propagateSolved()(using Context) = + if cs1.isConst && cs2.isConst && !isConst then markSolved() + end Intersected + + def elemIntersection(cs1: CaptureSet, cs2: CaptureSet)(using Context): Refs = + cs1.elems.filter(cs2.mightAccountFor) ++ cs2.elems.filter(cs1.mightAccountFor) + + /** Extrapolate tm(r) according to `variance`. Let r1 be the result of tm(r). + * - If r1 is a tracked CaptureRef, return {r1} + * - If r1 has an empty capture set, return {} + * - Otherwise, + * - if the variance is covariant, return r1's capture set + * - if the variance is contravariant, return {} + * - Otherwise assertion failure + */ + def extrapolateCaptureRef(r: CaptureRef, tm: TypeMap, variance: Int)(using Context): CaptureSet = + val r1 = tm(r) + val upper = r1.captureSet + def isExact = + upper.isAlwaysEmpty || upper.isConst && upper.elems.size == 1 && upper.elems.contains(r1) + if variance > 0 || isExact then upper + else if variance < 0 then CaptureSet.empty + else assert(false, i"trying to add $upper from $r via ${tm.getClass} in a non-variant setting") + + /** Apply `f` to each element in `xs`, and join result sets with `++` */ + def mapRefs(xs: Refs, f: CaptureRef => CaptureSet)(using Context): CaptureSet = + ((empty: CaptureSet) /: xs)((cs, x) => cs ++ f(x)) + + /** Apply extrapolated `tm` to each element in `xs`, and join result sets with `++` */ + def mapRefs(xs: Refs, tm: TypeMap, variance: Int)(using Context): CaptureSet = + mapRefs(xs, extrapolateCaptureRef(_, tm, variance)) + + /** Return true iff + * - arg1 is a TypeBounds >: CL T <: CH T of two capturing types with equal parents. + * - arg2 is a capturing type CA U + * - CH <: CA <: CL + * In other words, we can unify CL, CH and CA. + */ + def subCapturesRange(arg1: TypeBounds, arg2: Type)(using Context): Boolean = arg1 match + case TypeBounds(CapturingType(lo, loRefs), CapturingType(hi, hiRefs)) if lo =:= hi => + given VarState = VarState() + val cs2 = arg2.captureSet + hiRefs.subCaptures(cs2).isOK && cs2.subCaptures(loRefs).isOK + case _ => + false + + /** A TypeMap with the property that every capture reference in the image + * of the map is mapped to itself. I.e. for all capture references r1, r2, + * if M(r1) == r2 then M(r2) == r2. + */ + trait IdempotentCaptRefMap extends TypeMap + + /** A TypeMap that is the identity on capture references */ + trait IdentityCaptRefMap extends TypeMap + + type CompareResult = CompareResult.TYPE + + /** The result of subcapturing comparisons is an opaque type CompareResult.TYPE. + * This is either OK, indicating success, or + * another capture set, indicating failure. The failure capture set + * is the one that did not allow propagaton of elements into it. 
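+   *
+   *  For illustration, results are typically chained with `andAlso`, which evaluates
+   *  the second comparison only if the first one succeeded, e.g. (sketch only, with
+   *  hypothetical sets cs1, cs2, cs3):
+   *
+   *    cs1.subCaptures(cs2).andAlso(cs1.subCaptures(cs3))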
+ */ + object CompareResult: + opaque type TYPE = CaptureSet + val OK: TYPE = Const(emptySet) + def fail(cs: CaptureSet): TYPE = cs + + extension (result: TYPE) + /** The result is OK */ + def isOK: Boolean = result eq OK + /** If not isOK, the blocking capture set */ + def blocking: CaptureSet = result + inline def andAlso(op: Context ?=> TYPE)(using Context): TYPE = if result.isOK then op else result + def show(using Context): String = if result.isOK then "OK" else i"$result" + end CompareResult + + /** A VarState serves as a snapshot mechanism that can undo + * additions of elements or super sets if an operation fails + */ + class VarState: + + /** A map from captureset variables to their elements at the time of the snapshot. */ + private val elemsMap: util.EqHashMap[Var, Refs] = new util.EqHashMap + + /** A map from captureset variables to their dependent sets at the time of the snapshot. */ + private val depsMap: util.EqHashMap[Var, Deps] = new util.EqHashMap + + /** The recorded elements of `v` (it's required that a recording was made) */ + def elems(v: Var): Refs = elemsMap(v) + + /** Optionally the recorded elements of `v`, None if nothing was recorded for `v` */ + def getElems(v: Var): Option[Refs] = elemsMap.get(v) + + /** Record elements, return whether this was allowed. + * By default, recording is allowed but the special state FrozenState + * overrides this. + */ + def putElems(v: Var, elems: Refs): Boolean = { elemsMap(v) = elems; true } + + /** The recorded dependent sets of `v` (it's required that a recording was made) */ + def deps(v: Var): Deps = depsMap(v) + + /** Optionally the recorded dependent sets of `v`, None if nothing was recorded for `v` */ + def getDeps(v: Var): Option[Deps] = depsMap.get(v) + + /** Record dependent sets, return whether this was allowed. + * By default, recording is allowed but the special state FrozenState + * overrides this. + */ + def putDeps(v: Var, deps: Deps): Boolean = { depsMap(v) = deps; true } + + /** Roll back global state to what was recorded in this VarState */ + def rollBack(): Unit = + elemsMap.keysIterator.foreach(_.resetElems()(using this)) + depsMap.keysIterator.foreach(_.resetDeps()(using this)) + end VarState + + /** A special state that does not allow to record elements or dependent sets. + * In effect this means that no new elements or dependent sets can be added + * in this state (since the previous state cannot be recorded in a snapshot) + */ + @sharable + object FrozenState extends VarState: + override def putElems(v: Var, refs: Refs) = false + override def putDeps(v: Var, deps: Deps) = false + override def rollBack(): Unit = () + + @sharable + /** A special state that turns off recording of elements. Used only + * in `addSub` to prevent cycles in recordings. 
+ */ + private object UnrecordedState extends VarState: + override def putElems(v: Var, refs: Refs) = true + override def putDeps(v: Var, deps: Deps) = true + override def rollBack(): Unit = () + + /** The current VarState, as passed by the implicit context */ + def varState(using state: VarState): VarState = state + + /* Not needed: + def ofClass(cinfo: ClassInfo, argTypes: List[Type])(using Context): CaptureSet = + CaptureSet.empty + def captureSetOf(tp: Type): CaptureSet = tp match + case tp: TypeRef if tp.symbol.is(ParamAccessor) => + def mapArg(accs: List[Symbol], tps: List[Type]): CaptureSet = accs match + case acc :: accs1 if tps.nonEmpty => + if acc == tp.symbol then tps.head.captureSet + else mapArg(accs1, tps.tail) + case _ => + empty + mapArg(cinfo.cls.paramAccessors, argTypes) + case _ => + tp.captureSet + val css = + for + parent <- cinfo.parents if parent.classSymbol == defn.RetainingClass + arg <- parent.argInfos + yield captureSetOf(arg) + css.foldLeft(empty)(_ ++ _) + */ + + /** The capture set of the type underlying a CaptureRef */ + def ofInfo(ref: CaptureRef)(using Context): CaptureSet = ref match + case ref: TermRef if ref.isRootCapability => ref.singletonCaptureSet + case _ => ofType(ref.underlying) + + /** Capture set of a type */ + def ofType(tp: Type)(using Context): CaptureSet = + def recur(tp: Type): CaptureSet = tp.dealias match + case tp: TermRef => + tp.captureSet + case tp: TermParamRef => + tp.captureSet + case _: TypeRef => + if tp.classSymbol.hasAnnotation(defn.CapabilityAnnot) then universal else empty + case _: TypeParamRef => + empty + case CapturingType(parent, refs) => + recur(parent) ++ refs + case AppliedType(tycon, args) => + val cs = recur(tycon) + tycon.typeParams match + case tparams @ (LambdaParam(tl, _) :: _) => cs.substParams(tl, args) + case _ => cs + case tp: TypeProxy => + recur(tp.underlying) + case AndType(tp1, tp2) => + recur(tp1) ** recur(tp2) + case OrType(tp1, tp2) => + recur(tp1) ++ recur(tp2) + case _ => + empty + recur(tp) + .showing(i"capture set of $tp = $result", capt) + + private val ShownVars: Property.Key[mutable.Set[Var]] = Property.Key() + + /** Perform `op`. Under -Ycc-debug, collect and print info about all variables reachable + * via `(_.deps)*` from the variables that were shown in `op`. + */ + def withCaptureSetsExplained[T](op: Context ?=> T)(using ctx: Context): T = + if ctx.settings.YccDebug.value then + val shownVars = mutable.Set[Var]() + inContext(ctx.withProperty(ShownVars, Some(shownVars))) { + try op + finally + val reachable = mutable.Set[Var]() + val todo = mutable.Queue[Var]() ++= shownVars + def incl(cv: Var): Unit = + if !reachable.contains(cv) then todo += cv + while todo.nonEmpty do + val cv = todo.dequeue() + if !reachable.contains(cv) then + reachable += cv + cv.deps.foreach { + case cv: Var => incl(cv) + case _ => + } + cv match + case cv: DerivedVar => incl(cv.source) + case _ => + val allVars = reachable.toArray.sortBy(_.id) + println(i"Capture set dependencies:") + for cv <- allVars do + println(i" ${cv.show.padTo(20, ' ')} :: ${cv.deps.toList}%, %") + } + else op +end CaptureSet diff --git a/tests/pos-with-compiler-cc/dotc/cc/CapturingType.scala b/tests/pos-with-compiler-cc/dotc/cc/CapturingType.scala new file mode 100644 index 000000000000..e9862f1f20b8 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/CapturingType.scala @@ -0,0 +1,72 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Types.*, Symbols.*, Contexts.* + +/** A (possibly boxed) capturing type. 
This is internally represented as an annotated type with a @retains
+ *  or @retainsByName annotation, but the extractor will succeed only at phase CheckCaptures.
+ *  That way, we can ignore capturing information until phase CheckCaptures since it is
+ *  wrapped in a plain annotation.
+ *
+ *  The same trick does not work for the boxing information. Boxing is context dependent, so
+ *  we have to add that information in the Setup step preceding CheckCaptures. Boxes are
+ *  added for all type arguments of methods. For type arguments of applied types a different
+ *  strategy is used where we box arguments of applied types that are not functions when
+ *  accessing the argument.
+ *
+ *  An alternative strategy would add boxes also to arguments of applied types during setup.
+ *  But this would have to be done for all possibly accessible types from the compiled units
+ *  as well as their dependencies. It's difficult to do this in a DenotationTransformer without
+ *  accidentally forcing symbol infos. That's why this alternative was not implemented.
+ *  If we were to go back on this, it would make sense to also treat capturing types differently
+ *  from annotations and to generate them all during Setup and in DenotationTransformers.
+ */
+object CapturingType:
+
+  /** Smart constructor that drops empty capture sets and fuses compatible capturing types.
+   *  An outer capturing type A can be fused with an inner capturing type B if their
+   *  boxing status is the same or if A is boxed.
+   */
+  def apply(parent: Type, refs: CaptureSet, boxed: Boolean = false)(using Context): Type =
+    if refs.isAlwaysEmpty then parent
+    else parent match
+      case parent @ CapturingType(parent1, refs1) if boxed || !parent.isBoxed =>
+        apply(parent1, refs ++ refs1, boxed)
+      case _ =>
+        AnnotatedType(parent, CaptureAnnotation(refs, boxed)(defn.RetainsAnnot))
+
+  /** An extractor that succeeds only during the CheckCaptures phase. Boxing status is
+   *  returned separately by CaptureOps.isBoxed.
+   */
+  def unapply(tp: AnnotatedType)(using Context): Option[(Type, CaptureSet)] =
+    if ctx.phase == Phases.checkCapturesPhase
+        && tp.annot.symbol == defn.RetainsAnnot
+        && !ctx.mode.is(Mode.IgnoreCaptures)
+    then
+      EventuallyCapturingType.unapply(tp)
+    else None
+
+end CapturingType
+
+/** An extractor for types that will be capturing types at phase CheckCaptures. Also
+ *  included are types that indicate captures on enclosing call-by-name parameters
+ *  before phase ElimByName.
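+ *
+ *  For illustration: a capturing type written as `{x} T` in source is represented as
+ *  `T @retains(x)` until CheckCaptures, and this extractor decomposes it into the
+ *  pair `(T, {x})`.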
+ */
+object EventuallyCapturingType:
+
+  def unapply(tp: AnnotatedType)(using Context): Option[(Type, CaptureSet)] =
+    val sym = tp.annot.symbol
+    if sym == defn.RetainsAnnot || sym == defn.RetainsByNameAnnot then
+      tp.annot match
+        case ann: CaptureAnnotation =>
+          Some((tp.parent, ann.refs))
+        case ann =>
+          try Some((tp.parent, ann.tree.toCaptureSet))
+          catch case ex: IllegalCaptureRef => None
+    else None
+
+end EventuallyCapturingType
+
+
diff --git a/tests/pos-with-compiler-cc/dotc/cc/CheckCaptures.scala b/tests/pos-with-compiler-cc/dotc/cc/CheckCaptures.scala
new file mode 100644
index 000000000000..ce3f788202b6
--- /dev/null
+++ b/tests/pos-with-compiler-cc/dotc/cc/CheckCaptures.scala
@@ -0,0 +1,1039 @@
+package dotty.tools
+package dotc
+package cc
+
+import core.*
+import Phases.*, DenotTransformers.*, SymDenotations.*
+import Contexts.*, Names.*, Flags.*, Symbols.*, Decorators.*
+import Types.*, StdNames.*, Denotations.*
+import config.Printers.{capt, recheckr}
+import config.{Config, Feature}
+import ast.{tpd, untpd, Trees}
+import Trees.*
+import typer.RefChecks.{checkAllOverrides, checkSelfAgainstParents}
+import typer.Checking.{checkBounds, checkAppliedTypesIn}
+import util.{SimpleIdentitySet, EqHashMap, SrcPos}
+import transform.SymUtils.*
+import transform.{Recheck, PreRecheck}
+import Recheck.*
+import scala.collection.mutable
+import CaptureSet.{withCaptureSetsExplained, IdempotentCaptRefMap}
+import StdNames.nme
+import NameKinds.DefaultGetterName
+import reporting.trace
+import language.experimental.pureFunctions
+
+/** The capture checker */
+object CheckCaptures:
+  import ast.tpd.*
+
+  class Pre extends PreRecheck, SymTransformer:
+
+    override def isEnabled(using Context) = true
+
+    /** Reset `private` flags of parameter accessors so that we can refine them
+     *  in Setup if they have non-empty capture sets. Special handling of some
+     *  symbols defined for case classes.
+     */
+    def transformSym(sym: SymDenotation)(using Context): SymDenotation =
+      if sym.isAllOf(PrivateParamAccessor) && !sym.hasAnnotation(defn.ConstructorOnlyAnnot) then
+        sym.copySymDenotation(initFlags = sym.flags &~ Private | Recheck.ResetPrivate)
+      else if Synthetics.needsTransform(sym) then
+        Synthetics.transformToCC(sym)
+      else
+        sym
+  end Pre
+
+  /** A class describing environments.
+   *  @param owner         the current owner
+   *  @param nestedInOwner true if the environment is a temporary one nested in the owner's environment,
+   *                       and does not have a different actual owner symbol (this happens when doing box adaptation).
+   *  @param captured      the capture set containing all references to tracked free variables outside of boxes
+   *  @param isBoxed       true if the environment is inside a box (in which case references are not counted)
+   *  @param outer0        the next enclosing environment
+   */
+  case class Env(
+    owner: Symbol,
+    nestedInOwner: Boolean,
+    captured: CaptureSet,
+    isBoxed: Boolean,
+    outer0: Env | Null
+  ):
+    def outer = outer0.nn
+
+    def isOutermost = outer0 == null
+
+    /** If an environment is open it tracks free references */
+    def isOpen = !captured.isAlwaysEmpty && !isBoxed
+  end Env
+
+  /** Similar to normal substParams, but this is an approximating type map that
+   *  maps parameters in contravariant capture sets to the empty set.
+   *  TODO: check what happens with non-variant.
+ */ + final class SubstParamsMap(from: BindingType, to: List[Type])(using DetachedContext) + extends ApproximatingTypeMap, IdempotentCaptRefMap: + def apply(tp: Type): Type = tp match + case tp: ParamRef => + if tp.binder == from then to(tp.paramNum) else tp + case tp: NamedType => + if tp.prefix `eq` NoPrefix then tp + else tp.derivedSelect(apply(tp.prefix)) + case _: ThisType => + tp + case _ => + mapOver(tp) + + /** Check that a @retains annotation only mentions references that can be tracked. + * This check is performed at Typer. + */ + def checkWellformed(ann: Tree)(using Context): Unit = + for elem <- retainedElems(ann) do + elem.tpe match + case ref: CaptureRef => + if !ref.canBeTracked then + report.error(em"$elem cannot be tracked since it is not a parameter or local value", elem.srcPos) + case tpe => + report.error(em"$elem: $tpe is not a legal element of a capture set", elem.srcPos) + + /** If `tp` is a capturing type, check that all references it mentions have non-empty + * capture sets. Also: warn about redundant capture annotations. + * This check is performed after capture sets are computed in phase cc. + */ + def checkWellformedPost(tp: Type, pos: SrcPos)(using Context): Unit = tp match + case CapturingType(parent, refs) => + for ref <- refs.elems do + if ref.captureSetOfInfo.elems.isEmpty then + report.error(em"$ref cannot be tracked since its capture set is empty", pos) + else if parent.captureSet.accountsFor(ref) then + report.warning(em"redundant capture: $parent already accounts for $ref", pos) + case _ => + + /** Warn if `ann`, which is a tree of a @retains annotation, defines some elements that + * are already accounted for by other elements of the same annotation. + * Note: We need to perform the check on the original annotation rather than its + * capture set since the conversion to a capture set already eliminates redundant elements. + */ + def warnIfRedundantCaptureSet(ann: Tree)(using Context): Unit = + // The lists `elems(i) :: prev.reverse :: elems(0),...,elems(i-1),elems(i+1),elems(n)` + // where `n == elems.length-1`, i <- 0..n`. + // I.e. + // choices(Nil, elems) = [[elems(i), elems(0), ..., elems(i-1), elems(i+1), .... elems(n)] | i <- 0..n] + def choices(prev: List[Tree], elems: List[Tree]): List[List[Tree]] = elems match + case Nil => Nil + case elem :: elems => + List(elem :: (prev reverse_::: elems)) ++ choices(elem :: prev, elems) + for case first :: others <- choices(Nil, retainedElems(ann)) do + val firstRef = first.toCaptureRef + val remaining = CaptureSet(others.map(_.toCaptureRef)*) + if remaining.accountsFor(firstRef) then + report.warning(em"redundant capture: $remaining already accounts for $firstRef", ann.srcPos) + +class CheckCaptures extends Recheck, SymTransformer: + thisPhase => + + import ast.tpd.* + import CheckCaptures.* + + def phaseName: String = "cc" + override def isEnabled(using Context) = true + + def newRechecker()(using Context) = CaptureChecker(ctx.detach) + + override def run(using Context): Unit = + if Feature.ccEnabled then + checkOverrides.traverse(ctx.compilationUnit.tpdTree) + super.run + + override def transformSym(sym: SymDenotation)(using Context): SymDenotation = + if Synthetics.needsTransform(sym) then Synthetics.transformFromCC(sym) + else super.transformSym(sym) + + /** Check overrides again, taking capture sets into account. + * TODO: Can we avoid doing overrides checks twice? 
+   *  We need to do them here since CaptureTypes are only relevant at this phase.
+   *  But maybe we can then elide the check during the RefChecks phase under captureChecking?
+   */
+  def checkOverrides = new TreeTraverser:
+    def traverse(t: Tree)(using Context) =
+      t match
+        case t: Template => checkAllOverrides(ctx.owner.asClass)
+        case _ =>
+      traverseChildren(t)
+
+  class CaptureChecker(ictx: DetachedContext) extends Rechecker(ictx):
+    import ast.tpd.*
+
+    override def keepType(tree: Tree) =
+      super.keepType(tree)
+      || tree.isInstanceOf[Try]  // type of `try` needs to be checked for * escapes
+
+    /** Instantiate capture set variables appearing contra-variantly to their
+     *  upper approximation.
+     */
+    private def interpolator(startingVariance: Int = 1)(using Context) = new TypeTraverser:
+      variance = startingVariance
+      override def traverse(t: Type) =
+        t match
+          case CapturingType(parent, refs: CaptureSet.Var) =>
+            if variance < 0 then
+              capt.println(i"solving $t")
+              refs.solve()
+            traverse(parent)
+          case t @ RefinedType(_, nme.apply, rinfo) if defn.isFunctionOrPolyType(t) =>
+            traverse(rinfo)
+          case tp: TypeVar =>
+          case tp: TypeRef =>
+            traverse(tp.prefix)
+          case _ =>
+            traverseChildren(t)
+
+    /** If `tpt` is an inferred type, interpolate capture set variables appearing contra-
+     *  variantly in it.
+     */
+    private def interpolateVarsIn(tpt: Tree)(using Context): Unit =
+      if tpt.isInstanceOf[InferredTypeTree] then
+        interpolator().traverse(tpt.knownType)
+          .showing(i"solved vars in ${tpt.knownType}", capt)(using null)
+
+    /** Assert subcapturing `cs1 <: cs2` */
+    def assertSub(cs1: CaptureSet, cs2: CaptureSet)(using Context) =
+      assert(cs1.subCaptures(cs2, frozen = false).isOK, i"$cs1 is not a subset of $cs2")
+
+    /** Check subcapturing `{elem} <: cs`, report error on failure */
+    def checkElem(elem: CaptureRef, cs: CaptureSet, pos: SrcPos)(using Context) =
+      val res = elem.singletonCaptureSet.subCaptures(cs, frozen = false)
+      if !res.isOK then
+        report.error(em"$elem cannot be referenced here; it is not included in the allowed capture set ${res.blocking}", pos)
+
+    /** Check subcapturing `cs1 <: cs2`, report error on failure */
+    def checkSubset(cs1: CaptureSet, cs2: CaptureSet, pos: SrcPos)(using Context) =
+      val res = cs1.subCaptures(cs2, frozen = false)
+      if !res.isOK then
+        def header =
+          if cs1.elems.size == 1 then i"reference ${cs1.elems.toList}%, % is not"
+          else i"references $cs1 are not all"
+        report.error(em"$header included in allowed capture set ${res.blocking}", pos)
+
+    /** The current environment */
+    private var curEnv: Env = Env(NoSymbol, nestedInOwner = false, CaptureSet.empty, isBoxed = false, null)
+
+    private val myCapturedVars: util.EqHashMap[Symbol, CaptureSet] = EqHashMap()
+
+    /** If `sym` is a class or method nested inside a term, a capture set variable representing
+     *  the captured variables of the environment associated with `sym`.
+     */
+    def capturedVars(sym: Symbol)(using Context) =
+      myCapturedVars.getOrElseUpdate(sym,
+        if sym.ownersIterator.exists(_.isTerm) then CaptureSet.Var()
+        else CaptureSet.empty)
+
+    /** For all nested environments up to `limit` perform `op` */
+    def forallOuterEnvsUpTo(limit: Symbol)(op: Env => Unit)(using Context): Unit =
+      def recur(env: Env): Unit =
+        if env.isOpen && env.owner != limit then
+          op(env)
+          if !env.isOutermost then
+            var nextEnv = env.outer
+            if env.owner.isConstructor then
+              if nextEnv.owner != limit && !nextEnv.isOutermost then
+                recur(nextEnv.outer)
+            else recur(nextEnv)
+      recur(curEnv)
+
+    /** Include `sym` in the capture sets of all enclosing environments nested in
+     *  the environment in which `sym` is defined.
+     */
+    def markFree(sym: Symbol, pos: SrcPos)(using Context): Unit =
+      if sym.exists then
+        val ref = sym.termRef
+        if ref.isTracked then
+          forallOuterEnvsUpTo(sym.enclosure) { env =>
+            capt.println(i"Mark $sym with cs ${ref.captureSet} free in ${env.owner}")
+            checkElem(ref, env.captured, pos)
+          }
+
+    /** Make sure (projected) `cs` is a subset of the capture sets of all enclosing
+     *  environments. At each stage, only include references from `cs` that are outside
+     *  the environment's owner.
+     */
+    def markFree(cs: CaptureSet, pos: SrcPos)(using Context): Unit =
+      if !cs.isAlwaysEmpty then
+        forallOuterEnvsUpTo(ctx.owner.topLevelClass) { env =>
+          val included = cs.filter {
+            case ref: TermRef =>
+              (env.nestedInOwner || env.owner != ref.symbol.owner)
+              && env.owner.isContainedIn(ref.symbol.owner)
+            case ref: ThisType =>
+              (env.nestedInOwner || env.owner != ref.cls)
+              && env.owner.isContainedIn(ref.cls)
+            case _ => false
+          }
+          capt.println(i"Include call capture $included in ${env.owner}")
+          checkSubset(included, env.captured, pos)
+        }
+
+    /** Include references captured by the called method in the current environment stack */
+    def includeCallCaptures(sym: Symbol, pos: SrcPos)(using Context): Unit =
+      if sym.exists && curEnv.isOpen then markFree(capturedVars(sym), pos)
+
+    override def recheckIdent(tree: Ident)(using Context): Type =
+      if tree.symbol.is(Method) then includeCallCaptures(tree.symbol, tree.srcPos)
+      else markFree(tree.symbol, tree.srcPos)
+      super.recheckIdent(tree)
+
+    /** A specialized implementation of the selection rule.
+     *
+     *  E |- f: Cf f { m: Cr R }
+     *  ------------------------
+     *  E |- f.m: C R
+     *
+     *  The implementation picks as `C` one of `{f}` or `Cr`, depending on the
+     *  outcome of a `mightSubcapture` test. It picks `{f}` if this might subcapture Cr
+     *  and Cr otherwise.
+     */
+    override def recheckSelection(tree: Select, qualType: Type, name: Name, pt: Type)(using Context) = {
+      def disambiguate(denot: Denotation): Denotation = denot match
+        case MultiDenotation(denot1, denot2) =>
+          // This case can arise when we try to merge multiple types that have different
+          // capture sets on some part. For instance an asSeenFrom might produce
+          // a bi-mapped capture set arising from a substitution. Applying the same substitution
+          // to the same type twice will nevertheless produce different capture sets, which can
+          // lead to a failure in disambiguation since neither alternative is better than the
+          // other in a frozen constraint. An example test case is disambiguate-select.scala.
+          // We address the problem by disambiguating while ignoring all capture sets as a fallback.
+ withMode(Mode.IgnoreCaptures) { + disambiguate(denot1).meet(disambiguate(denot2), qualType) + } + case _ => denot + + val selType = recheckSelection(tree, qualType, name, disambiguate) + val selCs = selType.widen.captureSet + if selCs.isAlwaysEmpty || selType.widen.isBoxedCapturing || qualType.isBoxedCapturing then + selType + else + val qualCs = qualType.captureSet + capt.println(i"intersect $qualType, ${selType.widen}, $qualCs, $selCs in $tree") + if qualCs.mightSubcapture(selCs) + && !selCs.mightSubcapture(qualCs) + && !pt.stripCapturing.isInstanceOf[SingletonType] + then + selType.widen.stripCapturing.capturing(qualCs) + .showing(i"alternate type for select $tree: $selType --> $result, $qualCs / $selCs", capt) + else + selType + }//.showing(i"recheck sel $tree, $qualType = $result") + + /** A specialized implementation of the apply rule. + * + * E |- f: Cf (Ra -> Cr Rr) + * E |- a: Ca Ra + * ------------------------ + * E |- f a: C Rr + * + * The implementation picks as `C` one of `{f, a}` or `Cr`, depending on the + * outcome of a `mightSubcapture` test. It picks `{f, a}` if this might subcapture Cr + * and Cr otherwise. + */ + override def recheckApply(tree: Apply, pt: Type)(using Context): Type = + val meth = tree.fun.symbol + includeCallCaptures(meth, tree.srcPos) + def mapArgUsing(f: Type => Type) = + val arg :: Nil = tree.args: @unchecked + val argType0 = f(recheckStart(arg, pt)) + val argType = super.recheckFinish(argType0, arg, pt) + super.recheckFinish(argType, tree, pt) + + if meth == defn.Caps_unsafeBox then + mapArgUsing(_.forceBoxStatus(true)) + else if meth == defn.Caps_unsafeUnbox then + mapArgUsing(_.forceBoxStatus(false)) + else if meth == defn.Caps_unsafeBoxFunArg then + mapArgUsing { + case defn.FunctionOf(paramtpe :: Nil, restpe, isContectual, isErased) => + defn.FunctionOf(paramtpe.forceBoxStatus(true) :: Nil, restpe, isContectual, isErased) + } + else + super.recheckApply(tree, pt) match + case appType @ CapturingType(appType1, refs) => + tree.fun match + case Select(qual, _) + if !tree.fun.symbol.isConstructor + && !qual.tpe.isBoxedCapturing + && !tree.args.exists(_.tpe.isBoxedCapturing) + && qual.tpe.captureSet.mightSubcapture(refs) + && tree.args.forall(_.tpe.captureSet.mightSubcapture(refs)) + => + val callCaptures = tree.args.foldLeft(qual.tpe.captureSet)((cs, arg) => + cs ++ arg.tpe.captureSet) + appType.derivedCapturingType(appType1, callCaptures) + .showing(i"narrow $tree: $appType, refs = $refs, qual = ${qual.tpe.captureSet} --> $result", capt) + case _ => appType + case appType => appType + end recheckApply + + /** Handle an application of method `sym` with type `mt` to arguments of types `argTypes`. + * This means: + * - Instantiate result type with actual arguments + * - If call is to a constructor: + * - remember types of arguments corresponding to tracked + * parameters in refinements. + * - add capture set of instantiated class to capture set of result type. + */ + override def instantiate(mt: MethodType, argTypes: List[Type], sym: Symbol)(using Context): Type = + val ownType = + if mt.isResultDependent then SubstParamsMap(mt, argTypes)(mt.resType) + else mt.resType + + if sym.isConstructor then + val cls = sym.owner.asClass + + /** First half of result pair: + * Refine the type of a constructor call `new C(t_1, ..., t_n)` + * to C{val x_1: T_1, ..., x_m: T_m} where x_1, ..., x_m are the tracked + * parameters of C and T_1, ..., T_m are the types of the corresponding arguments. 
+         *
+         *  Second half: union of all capture sets of arguments to tracked parameters.
+         */
+        def addParamArgRefinements(core: Type, initCs: CaptureSet): (Type, CaptureSet) =
+          mt.paramNames.lazyZip(argTypes).foldLeft((core, initCs)) { (acc, refine) =>
+            val (core, allCaptures) = acc
+            val (getterName, argType) = refine
+            val getter = cls.info.member(getterName).suchThat(_.is(ParamAccessor)).symbol
+            if getter.termRef.isTracked && !getter.is(Private)
+            then (RefinedType(core, getterName, argType), allCaptures ++ argType.captureSet)
+            else (core, allCaptures)
+          }
+
+        def augmentConstructorType(core: Type, initCs: CaptureSet): Type = core match
+          case core: MethodType =>
+            // more parameters to follow; augment result type
+            core.derivedLambdaType(resType = augmentConstructorType(core.resType, initCs))
+          case CapturingType(parent, refs) =>
+            // can happen for curried constructors if `instantiate` in a previous step
+            // added a capture set to the result.
+            augmentConstructorType(parent, initCs ++ refs)
+          case _ =>
+            val (refined, cs) = addParamArgRefinements(core, initCs)
+            refined.capturing(cs)
+
+        augmentConstructorType(ownType, CaptureSet.empty) match
+          case augmented: MethodType =>
+            augmented
+          case augmented =>
+            // add capture sets of class and constructor to final result of constructor call
+            augmented.capturing(capturedVars(cls) ++ capturedVars(sym))
+              .showing(i"constr type $mt with $argTypes%, % in $cls = $result", capt)
+      else ownType
+    end instantiate
+
+    override def recheckClosure(tree: Closure, pt: Type)(using Context): Type =
+      val cs = capturedVars(tree.meth.symbol)
+      capt.println(i"typing closure $tree with cvs $cs")
+      super.recheckClosure(tree, pt).capturing(cs)
+        .showing(i"rechecked $tree / $pt = $result", capt)
+
+    /** In addition to normal processing, update types of closures if the expected type
+     *  is a function with only pure parameters. In that case, make the anonymous function
+     *  also have the same parameters as the prototype.
+     *  TODO: Develop a clearer rationale for this.
+     *  TODO: Can we generalize this to arbitrary parameters?
+     *        Currently some tests fail if we do this. (e.g. neg.../stackAlloc.scala, others)
+     */
+    override def recheckBlock(block: Block, pt: Type)(using Context): Type =
+      block match
+        case closureDef(mdef) =>
+          pt.dealias match
+            case defn.FunctionOf(ptformals, _, _, _)
+            if ptformals.nonEmpty && ptformals.forall(_.captureSet.isAlwaysEmpty) =>
+              // Redo setup of the anonymous function so that formal parameters don't
+              // get capture sets. This is important to avoid false widenings to `*`
+              // when taking the base type of the actual closure's dependent function
+              // type so that it conforms to the expected non-dependent function type.
+              // See withLogFile.scala for a test case.
+              val meth = mdef.symbol
+              // First, undo the previous setup which installed a completer for `meth`.
+ atPhase(preRecheckPhase.prev)(meth.denot.copySymDenotation()) + .installAfter(preRecheckPhase) + + // Next, update all parameter symbols to match expected formals + meth.paramSymss.head.lazyZip(ptformals).foreach { (psym, pformal) => + psym.updateInfoBetween(preRecheckPhase, thisPhase, pformal.mapExprType) + } + // Next, update types of parameter ValDefs + mdef.paramss.head.lazyZip(ptformals).foreach { (param, pformal) => + val ValDef(_, tpt, _) = param: @unchecked + tpt.rememberTypeAlways(pformal) + } + // Next, install a new completer reflecting the new parameters for the anonymous method + val mt = meth.info.asInstanceOf[MethodType] + val completer = new LazyType: + def complete(denot: SymDenotation)(using Context) = + denot.info = mt.companion(ptformals, mdef.tpt.knownType) + .showing(i"simplify info of $meth to $result", capt) + recheckDef(mdef, meth) + meth.updateInfoBetween(preRecheckPhase, thisPhase, completer) + case _ => + case _ => + super.recheckBlock(block, pt) + + override def recheckValDef(tree: ValDef, sym: Symbol)(using Context): Unit = + try + if !sym.is(Module) then // Modules are checked by checking the module class + super.recheckValDef(tree, sym) + finally + if !sym.is(Param) then + // Parameters with inferred types belong to anonymous methods. We need to wait + // for more info from the context, so we cannot interpolate. Note that we cannot + // expect to have all necessary info available at the point where the anonymous + // function is compiled since we do not propagate expected types into blocks. + interpolateVarsIn(tree.tpt) + + override def recheckDefDef(tree: DefDef, sym: Symbol)(using Context): Unit = + if !Synthetics.isExcluded(sym) then + val saved = curEnv + val localSet = capturedVars(sym) + if !localSet.isAlwaysEmpty then curEnv = Env(sym, nestedInOwner = false, localSet, isBoxed = false, curEnv) + try super.recheckDefDef(tree, sym) + finally + interpolateVarsIn(tree.tpt) + curEnv = saved + + /** Class-specific capture set relations: + * 1. The capture set of a class includes the capture sets of its parents. + * 2. The capture set of the self type of a class includes the capture set of the class. + * 3. The capture set of the self type of a class includes the capture set of every class parameter, + * unless the parameter is marked @constructorOnly. + */ + override def recheckClassDef(tree: TypeDef, impl: Template, cls: ClassSymbol)(using Context): Type = + val saved = curEnv + val localSet = capturedVars(cls) + for parent <- impl.parents do // (1) + checkSubset(capturedVars(parent.tpe.classSymbol), localSet, parent.srcPos) + if !localSet.isAlwaysEmpty then curEnv = Env(cls, nestedInOwner = false, localSet, isBoxed = false, curEnv) + try + val thisSet = cls.classInfo.selfType.captureSet.withDescription(i"of the self type of $cls") + checkSubset(localSet, thisSet, tree.srcPos) // (2) + for param <- cls.paramGetters do + if !param.hasAnnotation(defn.ConstructorOnlyAnnot) then + checkSubset(param.termRef.captureSet, thisSet, param.srcPos) // (3) + for pureBase <- cls.pureBaseClass do + checkSubset(thisSet, + CaptureSet.empty.withDescription(i"of pure base class $pureBase"), + tree.srcPos) + super.recheckClassDef(tree, impl, cls) + finally + curEnv = saved + + /** If type is of the form `T @requiresCapability(x)`, + * mark `x` as free in the current environment. This is used to require the + * correct `CanThrow` capability when encountering a `throw`. 
+ */ + override def recheckTyped(tree: Typed)(using Context): Type = + tree.tpt.tpe match + case AnnotatedType(_, annot) if annot.symbol == defn.RequiresCapabilityAnnot => + annot.tree match + case Apply(_, cap :: Nil) => + markFree(cap.symbol, tree.srcPos) + case _ => + case _ => + super.recheckTyped(tree) + + /* Currently not needed, since capture checking takes place after ElimByName. + * Keep around in case we need to get back to it + def recheckByNameArg(tree: Tree, pt: Type)(using Context): Type = + val closureDef(mdef) = tree: @unchecked + val arg = mdef.rhs + val localSet = CaptureSet.Var() + curEnv = Env(mdef.symbol, localSet, isBoxed = false, curEnv) + val result = + try + inContext(ctx.withOwner(mdef.symbol)) { + recheckStart(arg, pt).capturing(localSet) + } + finally curEnv = curEnv.outer + recheckFinish(result, arg, pt) + */ + + /** If expected type `pt` is boxed and the tree is a function or a reference, + * don't propagate free variables. + * Otherwise, if the result type is boxed, simulate an unboxing by + * adding all references in the boxed capture set to the current environment. + */ + override def recheck(tree: Tree, pt: Type = WildcardType)(using Context): Type = + if tree.isTerm && pt.isBoxedCapturing then + val saved = curEnv + + tree match + case _: RefTree | closureDef(_) => + curEnv = Env(curEnv.owner, nestedInOwner = false, CaptureSet.Var(), isBoxed = true, curEnv) + case _ => + + try super.recheck(tree, pt) + finally curEnv = saved + else + val res = super.recheck(tree, pt) + if tree.isTerm then markFree(res.boxedCaptureSet, tree.srcPos) + res + + /** If `tree` is a reference or an application where the result type refers + * to an enclosing class or method parameter of the reference, check that the result type + * does not capture the universal capability. This is justified since the + * result type would have to be implicitly unboxed. + * TODO: Can we find a cleaner way to achieve this? Logically, this should be part + * of simulated boxing and unboxing. + */ + override def recheckFinish(tpe: Type, tree: Tree, pt: Type)(using Context): Type = + val typeToCheck = tree match + case _: Ident | _: Select | _: Apply | _: TypeApply if tree.symbol.unboxesResult => + tpe + case _: Try => + tpe + case _ => + NoType + def checkNotUniversal(tp: Type): Unit = tp.widenDealias match + case wtp @ CapturingType(parent, refs) => + refs.disallowRootCapability { () => + val kind = if tree.isInstanceOf[ValDef] then "mutable variable" else "expression" + report.error( + em"""The $kind's type $wtp is not allowed to capture the root capability `*`. 
+                  |This usually means that a capability persists longer than its allowed lifetime.""",
+              tree.srcPos)
+          }
+          checkNotUniversal(parent)
+        case _ =>
+      checkNotUniversal(typeToCheck)
+      super.recheckFinish(tpe, tree, pt)
+
+    /** Massage `actual` and `expected` types using the methods below before checking conformance */
+    override def checkConformsExpr(actual: Type, expected: Type, tree: Tree)(using Context): Unit =
+      val expected1 = alignDependentFunction(addOuterRefs(expected, actual), actual.stripCapturing)
+      val actual1 = adaptBoxed(actual, expected1, tree.srcPos)
+      //println(i"check conforms $actual1 <<< $expected1")
+      super.checkConformsExpr(actual1, expected1, tree)
+
+    private def toDepFun(args: List[Type], resultType: Type, isContextual: Boolean, isErased: Boolean)(using Context): Type =
+      MethodType.companion(isContextual = isContextual, isErased = isErased)(args, resultType)
+        .toFunctionType(isJava = false, alwaysDependent = true)
+
+    /** Turn `expected` into a dependent function when `actual` is dependent. */
+    private def alignDependentFunction(expected: Type, actual: Type)(using Context): Type =
+      def recur(expected: Type): Type = expected.dealias match
+        case expected @ CapturingType(eparent, refs) =>
+          CapturingType(recur(eparent), refs, boxed = expected.isBoxed)
+        case expected @ defn.FunctionOf(args, resultType, isContextual, isErased)
+        if defn.isNonRefinedFunction(expected) && defn.isFunctionType(actual) && !defn.isNonRefinedFunction(actual) =>
+          val expected1 = toDepFun(args, resultType, isContextual, isErased)
+          expected1
+        case _ =>
+          expected
+      recur(expected)
+
+    /** For the expected type, implement the rule outlined in #14390:
+     *  - when checking an expression `a: Ca Ta` against an expected type `Ce Te`,
+     *  - where the capture set `Ce` contains Cls.this,
+     *  - and where all method definitions enclosing `a` inside class `Cls`
+     *    have only pure parameters,
+     *  - add to `Ce` all references to variables or this-references in `Ca`
+     *    that are outside `Cls`. These are all accessed through `Cls.this`,
+     *    so we can assume they are already accounted for by `Ce` and adding
+     *    them explicitly to `Ce` changes nothing.
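+     *
+     *  As an illustrative sketch: when an actual capture set `{Cls.this, x}` (with `x`
+     *  defined outside `Cls`) is checked against an expected set containing `Cls.this`,
+     *  `x` may be added to the expected set, since under the purity conditions above it
+     *  can only have been accessed through `Cls.this` anyway.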
+ */ + private def addOuterRefs(expected: Type, actual: Type)(using Context): Type = + def isPure(info: Type): Boolean = info match + case info: PolyType => isPure(info.resType) + case info: MethodType => info.paramInfos.forall(_.captureSet.isAlwaysEmpty) && isPure(info.resType) + case _ => true + def isPureContext(owner: Symbol, limit: Symbol): Boolean = + if owner == limit then true + else if !owner.exists then false + else isPure(owner.info) && isPureContext(owner.owner, limit) + def augment(erefs: CaptureSet, arefs: CaptureSet): CaptureSet = + (erefs /: erefs.elems) { (erefs, eref) => + eref match + case eref: ThisType if isPureContext(ctx.owner, eref.cls) => + erefs ++ arefs.filter { + case aref: TermRef => eref.cls.isProperlyContainedIn(aref.symbol.owner) + case aref: ThisType => eref.cls.isProperlyContainedIn(aref.cls) + case _ => false + } + case _ => + erefs + } + expected match + case CapturingType(ecore, erefs) => + val erefs1 = augment(erefs, actual.captureSet) + if erefs1 ne erefs then + capt.println(i"augmented $expected from ${actual.captureSet} --> $erefs1") + expected.derivedCapturingType(ecore, erefs1) + case _ => + expected + + /** Adapt `actual` type to `expected` type by inserting boxing and unboxing conversions */ + def adaptBoxed(actual: Type, expected: Type, pos: SrcPos)(using Context): Type = + + /** Adapt function type `actual`, which is `aargs -> ares` (possibly with dependencies) + * to `expected` type. + * It returns the adapted type along with the additionally captured variable + * during adaptation. + * @param reconstruct how to rebuild the adapted function type + */ + def adaptFun(actual: Type, aargs: List[Type], ares: Type, expected: Type, + covariant: Boolean, boxed: Boolean, + reconstruct: (List[Type], Type) => Type): (Type, CaptureSet) = + val saved = curEnv + curEnv = Env(curEnv.owner, nestedInOwner = true, CaptureSet.Var(), isBoxed = false, if boxed then null else curEnv) + + try + val (eargs, eres) = expected.dealias.stripCapturing match + case defn.FunctionOf(eargs, eres, _, _) => (eargs, eres) + case expected: MethodType => (expected.paramInfos, expected.resType) + case expected @ RefinedType(_, _, rinfo: MethodType) if defn.isFunctionType(expected) => (rinfo.paramInfos, rinfo.resType) + case _ => (aargs.map(_ => WildcardType), WildcardType) + val aargs1 = aargs.zipWithConserve(eargs) { (aarg, earg) => adapt(aarg, earg, !covariant) } + val ares1 = adapt(ares, eres, covariant) + + val resTp = + if (ares1 eq ares) && (aargs1 eq aargs) then actual + else reconstruct(aargs1, ares1) + + (resTp, curEnv.captured) + finally + curEnv = saved + + /** Adapt type function type `actual` to the expected type. 
+ * @see [[adaptFun]] + */ + def adaptTypeFun( + actual: Type, ares: Type, expected: Type, + covariant: Boolean, boxed: Boolean, + reconstruct: Type => Type): (Type, CaptureSet) = + val saved = curEnv + curEnv = Env(curEnv.owner, nestedInOwner = true, CaptureSet.Var(), isBoxed = false, if boxed then null else curEnv) + + try + val eres = expected.dealias.stripCapturing match + case RefinedType(_, _, rinfo: PolyType) => rinfo.resType + case expected: PolyType => expected.resType + case _ => WildcardType + + val ares1 = adapt(ares, eres, covariant) + + val resTp = + if ares1 eq ares then actual + else reconstruct(ares1) + + (resTp, curEnv.captured) + finally + curEnv = saved + end adaptTypeFun + + def adaptInfo(actual: Type, expected: Type, covariant: Boolean): String = + val arrow = if covariant then "~~>" else "<~~" + i"adapting $actual $arrow $expected" + + /** Destruct a capturing type `tp` to a tuple (cs, tp0, boxed), + * where `tp0` is not a capturing type. + * + * If `tp` is a nested capturing type, the return tuple always represents + * the innermost capturing type. The outer capture annotations can be + * reconstructed with the returned function. + */ + def destructCapturingType(tp: Type, reconstruct: Type -> Context ?-> Type = (x: Type) => x) // !cc! need monomorphic default argument + : (Type, CaptureSet, Boolean, Type -> Context ?-> Type) = + tp.dealias match + case tp @ CapturingType(parent, cs) => + if parent.dealias.isCapturingType then + destructCapturingType(parent, res => reconstruct(tp.derivedCapturingType(res, cs))) + else + (parent, cs, tp.isBoxed, reconstruct) + case actual => + (actual, CaptureSet(), false, reconstruct) + + def adapt(actual: Type, expected: Type, covariant: Boolean): Type = trace(adaptInfo(actual, expected, covariant), recheckr, show = true) { + if expected.isInstanceOf[WildcardType] then actual + else + val (parent, cs, actualIsBoxed, recon: (Type -> Context ?-> Type)) = destructCapturingType(actual) + + val needsAdaptation = actualIsBoxed != expected.isBoxedCapturing + val insertBox = needsAdaptation && covariant != actualIsBoxed + + val (parent1, cs1) = parent match { + case actual @ AppliedType(tycon, args) if defn.isNonRefinedFunction(actual) => + val (parent1, leaked) = adaptFun(parent, args.init, args.last, expected, covariant, insertBox, + (aargs1, ares1) => actual.derivedAppliedType(tycon, aargs1 :+ ares1)) + (parent1, leaked ++ cs) + case actual @ RefinedType(_, _, rinfo: MethodType) if defn.isFunctionType(actual) => + // TODO Find a way to combine handling of generic and dependent function types (here and elsewhere) + val (parent1, leaked) = adaptFun(parent, rinfo.paramInfos, rinfo.resType, expected, covariant, insertBox, + (aargs1, ares1) => + rinfo.derivedLambdaType(paramInfos = aargs1, resType = ares1) + .toFunctionType(isJava = false, alwaysDependent = true)) + (parent1, leaked ++ cs) + case actual: MethodType => + val (parent1, leaked) = adaptFun(parent, actual.paramInfos, actual.resType, expected, covariant, insertBox, + (aargs1, ares1) => + actual.derivedLambdaType(paramInfos = aargs1, resType = ares1)) + (parent1, leaked ++ cs) + case actual @ RefinedType(p, nme, rinfo: PolyType) if defn.isFunctionOrPolyType(actual) => + val (parent1, leaked) = adaptTypeFun(parent, rinfo.resType, expected, covariant, insertBox, + ares1 => + val rinfo1 = rinfo.derivedLambdaType(rinfo.paramNames, rinfo.paramInfos, ares1) + val actual1 = actual.derivedRefinedType(p, nme, rinfo1) + actual1 + ) + (parent1, leaked ++ cs) + case _ => + (parent, cs) + } + + 
if needsAdaptation then + val criticalSet = // the set which is not allowed to have `*` + if covariant then cs1 // can't box with `*` + else expected.captureSet // can't unbox with `*` + if criticalSet.isUniversal && expected.isValueType then + // We can't box/unbox the universal capability. Leave `actual` as it is + // so we get an error in checkConforms. This tends to give better error + // messages than disallowing the root capability in `criticalSet`. + if ctx.settings.YccDebug.value then + println(i"cannot box/unbox $actual vs $expected") + actual + else + // Disallow future addition of `*` to `criticalSet`. + criticalSet.disallowRootCapability { () => + report.error( + em"""$actual cannot be box-converted to $expected + |since one of their capture sets contains the root capability `*`""", + pos) + } + if !insertBox then // unboxing + markFree(criticalSet, pos) + recon(CapturingType(parent1, cs1, !actualIsBoxed)) + else + recon(CapturingType(parent1, cs1, actualIsBoxed)) + } + + var actualw = actual.widenDealias + actual match + case ref: CaptureRef if ref.isTracked => + actualw match + case CapturingType(p, refs) => + actualw = actualw.derivedCapturingType(p, ref.singletonCaptureSet) + // given `a: C T`, improve `C T` to `{a} T` + case _ => + case _ => + val adapted = adapt(actualw, expected, covariant = true) + if adapted ne actualw then + capt.println(i"adapt boxed $actual vs $expected ===> $adapted") + adapted + else actual + end adaptBoxed + + override def checkUnit(unit: CompilationUnit)(using Context): Unit = + Setup(preRecheckPhase, thisPhase, recheckDef) + .traverse(ctx.compilationUnit.tpdTree) + //println(i"SETUP:\n${Recheck.addRecheckedTypes.transform(ctx.compilationUnit.tpdTree)}") + withCaptureSetsExplained { + super.checkUnit(unit) + checkSelfTypes(unit.tpdTree) + postCheck(unit.tpdTree) + if ctx.settings.YccDebug.value then + show(unit.tpdTree) // this does not print tree, but makes its variables visible for dependency printing + } + + /** Check that self types of subclasses conform to self types of super classes. + * (See comment below how this is achieved). The check assumes that classes + * without an explicit self type have the universal capture set `{*}` on the + * self type. If a class without explicit self type is not `effectivelyFinal` + * it is checked that the inferred self type is universal, in order to assure + * that joint and separate compilation give the same result. + */ + def checkSelfTypes(unit: tpd.Tree)(using Context): Unit = + val parentTrees = mutable.HashMap[Symbol, List[Tree]]() + unit.foreachSubTree { + case cdef @ TypeDef(_, impl: Template) => parentTrees(cdef.symbol) = impl.parents + case _ => + } + // Perform self type checking. The problem here is that `checkParents` compares a + // self type of a subclass with the result of an asSeenFrom of the self type of the + // superclass. That's no good. We need to constrain the original superclass self type + // capture set, not the set mapped by asSeenFrom. + // + // Instead, we proceed from parent classes to child classes. For every class + // we first check its parents, and then interpolate the self type to an + // upper approximation that satisfies all constraints on its capture set. + // That means all capture sets of parent self types are constants, so mapping + // them with asSeenFrom is OK. 
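+      // (Here a "root" in the loop below is a class none of whose parents still await
+      // checking, so processing roots first realizes this parents-before-children order.)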
+ while parentTrees.nonEmpty do + val roots = parentTrees.keysIterator.filter { + cls => !parentTrees(cls).exists(ptree => parentTrees.contains(ptree.tpe.classSymbol)) + } + assert(roots.nonEmpty) + for case root: ClassSymbol <- roots do + checkSelfAgainstParents(root, root.baseClasses) + val selfType = root.asClass.classInfo.selfType + interpolator(startingVariance = -1).traverse(selfType) + if !root.isEffectivelySealed then + def matchesExplicitRefsInBaseClass(refs: CaptureSet, cls: ClassSymbol): Boolean = + cls.baseClasses.tail.exists { psym => + val selfType = psym.asClass.givenSelfType + selfType.exists && selfType.captureSet.elems == refs.elems + } + selfType match + case CapturingType(_, refs: CaptureSet.Var) + if !refs.isUniversal && !matchesExplicitRefsInBaseClass(refs, root) => + // Forbid inferred self types unless they are already implied by an explicit + // self type in a parent. + report.error( + em"""$root needs an explicitly declared self type since its + |inferred self type $selfType + |is not visible in other compilation units that define subclasses.""", + root.srcPos) + case _ => + parentTrees -= root + capt.println(i"checked $root with $selfType") + end checkSelfTypes + + /** Heal ill-formed capture sets in the type parameter. + * + * We can push parameter refs into a capture set in type parameters + * that this type parameter can't see. + * For example, when capture checking the following expression: + * + * def usingLogFile[T](op: (f: {*} File) => T): T = ... + * + * usingLogFile[box ?1 () -> Unit] { (f: {*} File) => () => { f.write(0) } } + * + * We may propagate `f` into ?1, making ?1 ill-formed. + * This also causes soundness issues, since `f` in ?1 should be widened to `*`, + * giving rise to an error that `*` cannot be included in a boxed capture set. + * + * To solve this, we still allow ?1 to capture parameter refs like `f`, but + * compensate this by pushing the widened capture set of `f` into ?1. + * This solves the soundness issue caused by the ill-formness of ?1. + */ + private def healTypeParam(tree: Tree)(using Context): Unit = + val checker = new TypeTraverser: + private def isAllowed(ref: CaptureRef): Boolean = ref match + case ref: TermParamRef => allowed.contains(ref) + case _ => true + + // Widen the given term parameter refs x₁ : C₁ S₁ , ⋯ , xₙ : Cₙ Sₙ to their capture sets C₁ , ⋯ , Cₙ. + // + // If in these capture sets there are any capture references that are term parameter references we should avoid, + // we will widen them recursively. 
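+      // For example (illustrative): if x₁'s capture set is {x₂, a} and x₂ is itself a
+      // parameter to avoid, then x₂ is widened in turn while `a` is kept as is.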
+      private def widenParamRefs(refs: List[TermParamRef]): List[CaptureSet] =
+        @scala.annotation.tailrec
+        def recur(todos: List[TermParamRef], acc: List[CaptureSet]): List[CaptureSet] =
+          todos match
+            case Nil => acc
+            case ref :: rem =>
+              val cs = ref.captureSetOfInfo
+              val nextAcc = cs.filter(isAllowed(_)) :: acc
+              val nextRem: List[TermParamRef] = (cs.elems.toList.filter(!isAllowed(_)) ++ rem).asInstanceOf
+              recur(nextRem, nextAcc)
+        recur(refs, Nil)
+
+      private def healCaptureSet(cs: CaptureSet): Unit =
+        val toInclude = widenParamRefs(cs.elems.toList.filter(!isAllowed(_)).asInstanceOf)
+        toInclude.foreach(checkSubset(_, cs, tree.srcPos))
+
+      private var allowed: SimpleIdentitySet[TermParamRef] = SimpleIdentitySet.empty
+
+      def traverse(tp: Type) =
+        tp match
+          case CapturingType(parent, refs) =>
+            healCaptureSet(refs)
+            traverse(parent)
+          case tp @ RefinedType(parent, rname, rinfo: MethodType) if defn.isFunctionType(tp) =>
+            traverse(rinfo)
+          case tp: TermLambda =>
+            val saved = allowed
+            try
+              tp.paramRefs.foreach(allowed += _)
+              traverseChildren(tp)
+            finally allowed = saved
+          case _ =>
+            traverseChildren(tp)
+
+      if tree.isInstanceOf[InferredTypeTree] then
+        checker.traverse(tree.knownType)
+    end healTypeParam
+
+    /** Perform the following kinds of checks:
+     *  - Check all explicitly written capturing types for well-formedness using `checkWellFormedPost`.
+     *  - Check that externally visible `val`s or `def`s have empty capture sets. If not,
+     *    suggest an explicit type. This is so that separate compilation (where external
+     *    symbols have empty capture sets) gives the same results as joint compilation.
+     *  - Check that arguments of TypeApplys and AppliedTypes conform to their bounds.
+     *  - Heal ill-formed capture sets of type parameters. See `healTypeParam`.
+     */
+    def postCheck(unit: tpd.Tree)(using Context): Unit =
+      unit.foreachSubTree {
+        case _: InferredTypeTree =>
+        case tree: TypeTree if !tree.span.isZeroExtent =>
+          tree.knownType.foreachPart { tp =>
+            checkWellformedPost(tp, tree.srcPos)
+            tp match
+              case AnnotatedType(_, annot) if annot.symbol == defn.RetainsAnnot =>
+                warnIfRedundantCaptureSet(annot.tree)
+              case _ =>
+          }
+        case t: ValOrDefDef
+        if t.tpt.isInstanceOf[InferredTypeTree] && !Synthetics.isExcluded(t.symbol) =>
+          val sym = t.symbol
+          val isLocal =
+            sym.owner.ownersIterator.exists(_.isTerm)
+            || sym.accessBoundary(defn.RootClass).isContainedIn(sym.topLevelClass)
+          def canUseInferred =    // If canUseInferred is false, all capturing types in the type of `sym` need to be given explicitly
+            sym.is(Private)                   // private symbols can always have inferred types
+            || sym.name.is(DefaultGetterName) // default getters are exempted since otherwise it would be
+                                              // too annoying. This is a hole since a default getter's result type
+                                              // might leak into a type variable.
+            ||                                // non-local symbols cannot have inferred types since external capture types are not inferred
+              isLocal                         // local symbols still need explicit types if
+              && !sym.owner.is(Trait)         // they are defined in a trait, since we do OverridingPairs checking before capture inference
+          def isNotPureThis(ref: CaptureRef) = ref match {
+            case ref: ThisType => !ref.cls.isPureClass
+            case _ => true
+          }
+          if !canUseInferred then
+            val inferred = t.tpt.knownType
+            def checkPure(tp: Type) = tp match
+              case CapturingType(_, refs)
+              if !refs.elems.filter(isNotPureThis).isEmpty =>
+                val resultStr = if t.isInstanceOf[DefDef] then " result" else ""
+                report.error(
+                  em"""Non-local $sym cannot have an inferred$resultStr type
+                      |$inferred
+                      |with non-empty capture set $refs.
+                      |The type needs to be declared explicitly.""".withoutDisambiguation(),
+                  t.srcPos)
+              case _ =>
+            inferred.foreachPart(checkPure, StopAt.Static)
+        case t @ TypeApply(fun, args) =>
+          fun.knownType.widen match
+            case tl: PolyType =>
+              val normArgs = args.lazyZip(tl.paramInfos).map { (arg, bounds) =>
+                arg.withType(arg.knownType.forceBoxStatus(
+                  bounds.hi.isBoxedCapturing | bounds.lo.isBoxedCapturing))
+              }
+              checkBounds(normArgs, tl)
+            case _ =>
+
+          args.foreach(healTypeParam(_))
+        case _ =>
+      }
+      if !ctx.reporter.errorsReported then
+        // We don't report errors here if previous errors were reported, because other
+        // errors often result in bad applied types, but flagging these bad types often
+        // gives worse error messages than the original errors.
+        val checkApplied = new TreeTraverser:
+          def traverse(t: Tree)(using Context) = t match
+            case tree: InferredTypeTree =>
+            case tree: New =>
+            case tree: TypeTree => checkAppliedTypesIn(tree.withKnownType)
+            case _ => traverseChildren(t)
+        checkApplied.traverse(unit)
+  end CaptureChecker
+end CheckCaptures
diff --git a/tests/pos-with-compiler-cc/dotc/cc/Setup.scala b/tests/pos-with-compiler-cc/dotc/cc/Setup.scala
new file mode 100644
index 000000000000..a91831022984
--- /dev/null
+++ b/tests/pos-with-compiler-cc/dotc/cc/Setup.scala
@@ -0,0 +1,482 @@
+package dotty.tools
+package dotc
+package cc
+
+import core._
+import Phases.*, DenotTransformers.*, SymDenotations.*
+import Contexts.*, Names.*, Flags.*, Symbols.*, Decorators.*
+import Types.*, StdNames.*
+import config.Printers.capt
+import ast.tpd
+import transform.Recheck.*
+import CaptureSet.IdentityCaptRefMap
+import Synthetics.isExcluded
+
+/** A tree traverser that prepares a compilation unit to be capture checked.
+ *  It does the following:
+ *  - For every inferred type, drop any retains annotations,
+ *    add capture sets to all its parts, add refinements to class types and function types.
+ *    (cf. mapInferred)
+ *  - For explicit capturing types, expand throws aliases to the underlying (pure) function,
+ *    and add some implied capture sets to curried functions (cf. expandThrowsAlias, expandAbbreviations).
+ *  - Add capture sets to self types of classes and objects, unless the self type was written explicitly.
+ *  - Box the types of mutable variables and type arguments to methods (type arguments of types
+ *    are boxed on access).
+ *  - Link the external types of val and def symbols with the inferred types based on their parameter symbols.
+ */ +class Setup( + preRecheckPhase: DenotTransformer, + thisPhase: DenotTransformer, + recheckDef: (tpd.ValOrDefDef, Symbol) => Context ?=> Unit) +extends tpd.TreeTraverser: + import tpd.* + + /** Create dependent function with underlying function class `tycon` and given + * arguments `argTypes` and result `resType`. + */ + private def depFun(tycon: Type, argTypes: List[Type], resType: Type)(using Context): Type = + MethodType.companion( + isContextual = defn.isContextFunctionClass(tycon.classSymbol), + isErased = defn.isErasedFunctionClass(tycon.classSymbol) + )(argTypes, resType) + .toFunctionType(isJava = false, alwaysDependent = true) + + /** If `tp` is an unboxed capturing type or a function returning an unboxed capturing type, + * convert it to be boxed. + */ + private def box(tp: Type)(using Context): Type = + def recur(tp: Type): Type = tp.dealias match + case tp @ CapturingType(parent, refs) if !tp.isBoxed => + tp.boxed + case tp1 @ AppliedType(tycon, args) if defn.isNonRefinedFunction(tp1) => + val res = args.last + val boxedRes = recur(res) + if boxedRes eq res then tp + else tp1.derivedAppliedType(tycon, args.init :+ boxedRes) + case tp1 @ RefinedType(_, _, rinfo) if defn.isFunctionType(tp1) => + val boxedRinfo = recur(rinfo) + if boxedRinfo eq rinfo then tp + else boxedRinfo.toFunctionType(isJava = false, alwaysDependent = true) + case tp1: MethodOrPoly => + val res = tp1.resType + val boxedRes = recur(res) + if boxedRes eq res then tp + else tp1.derivedLambdaType(resType = boxedRes) + case _ => tp + tp match + case tp: MethodOrPoly => tp // don't box results of methods outside refinements + case _ => recur(tp) + + /** Perform the following transformation steps everywhere in a type: + * 1. Drop retains annotations + * 2. Turn plain function types into dependent function types, so that + * we can refer to their parameters in capture sets. Currently this is + * only done at the toplevel, i.e. for function types that are not + * themselves argument types of other function types. Without this restriction + * pos.../lists.scala and pos/...curried-shorthands.scala fail. + * Need to figure out why. + * 3. Refine other class types C by adding capture set variables to their parameter getters + * (see addCaptureRefinements) + * 4. Add capture set variables to all types that can be tracked + * + * Polytype bounds are only cleaned using step 1, but not otherwise transformed. + */ + private def mapInferred(using DetachedContext) = new TypeMap: + + /** Drop @retains annotations everywhere */ + object cleanup extends TypeMap: + def apply(t: Type) = t match + case AnnotatedType(parent, annot) if annot.symbol == defn.RetainsAnnot => + apply(parent) + case _ => + mapOver(t) + + /** Refine a possibly applied class type C where the class has tracked parameters + * x_1: T_1, ..., x_n: T_n to C { val x_1: CV_1 T_1, ..., val x_n: CV_n T_n } + * where CV_1, ..., CV_n are fresh capture sets. + */ + def addCaptureRefinements(tp: Type): Type = tp match + case _: TypeRef | _: AppliedType if tp.typeParams.isEmpty => + tp.typeSymbol match + case cls: ClassSymbol + if !defn.isFunctionClass(cls) && !cls.is(JavaDefined) => + // We assume that Java classes can refer to capturing Scala types only indirectly, + // using type parameters. Hence, no need to refine them. 
+ cls.paramGetters.foldLeft(tp) { (core, getter) => + if getter.termRef.isTracked then + val getterType = tp.memberInfo(getter).strippedDealias + RefinedType(core, getter.name, CapturingType(getterType, CaptureSet.Var())) + .showing(i"add capture refinement $tp --> $result", capt) + else + core + } + case _ => tp + case _ => tp + + private def superTypeIsImpure(tp: Type): Boolean = { + tp.dealias match + case CapturingType(_, refs) => + !refs.isAlwaysEmpty + case tp: (TypeRef | AppliedType) => + val sym = tp.typeSymbol + if sym.isClass then + sym == defn.AnyClass + // we assume Any is a shorthand of {*} Any, so if Any is an upper + // bound, the type is taken to be impure. + else superTypeIsImpure(tp.superType) + case tp: (RefinedOrRecType | MatchType) => + superTypeIsImpure(tp.underlying) + case tp: AndType => + superTypeIsImpure(tp.tp1) || needsVariable(tp.tp2) + case tp: OrType => + superTypeIsImpure(tp.tp1) && superTypeIsImpure(tp.tp2) + case _ => + false + }.showing(i"super type is impure $tp = $result", capt) + + /** Should a capture set variable be added on type `tp`? */ + def needsVariable(tp: Type): Boolean = { + tp.typeParams.isEmpty && tp.match + case tp: (TypeRef | AppliedType) => + val tp1 = tp.dealias + if tp1 ne tp then needsVariable(tp1) + else + val sym = tp1.typeSymbol + if sym.isClass then + !sym.isPureClass && sym != defn.AnyClass + else superTypeIsImpure(tp1) + case tp: (RefinedOrRecType | MatchType) => + needsVariable(tp.underlying) + case tp: AndType => + needsVariable(tp.tp1) && needsVariable(tp.tp2) + case tp: OrType => + needsVariable(tp.tp1) || needsVariable(tp.tp2) + case CapturingType(parent, refs) => + needsVariable(parent) + && refs.isConst // if refs is a variable, no need to add another + && !refs.isUniversal // if refs is {*}, an added variable would not change anything + case _ => + false + }.showing(i"can have inferred capture $tp = $result", capt) + + /** Add a capture set variable to `tp` if necessary, or maybe pull out + * an embedded capture set variable from a part of `tp`. + */ + def addVar(tp: Type) = tp match + case tp @ RefinedType(parent @ CapturingType(parent1, refs), rname, rinfo) => + CapturingType(tp.derivedRefinedType(parent1, rname, rinfo), refs, parent.isBoxed) + case tp: RecType => + tp.parent match + case parent @ CapturingType(parent1, refs) => + CapturingType(tp.derivedRecType(parent1), refs, parent.isBoxed) + case _ => + tp // can return `tp` here since unlike RefinedTypes, RecTypes are never created + // by `mapInferred`. Hence if the underlying type admits capture variables + // a variable was already added, and the first case above would apply. 
+ case AndType(tp1 @ CapturingType(parent1, refs1), tp2 @ CapturingType(parent2, refs2)) => + assert(refs1.asVar.elems.isEmpty) + assert(refs2.asVar.elems.isEmpty) + assert(tp1.isBoxed == tp2.isBoxed) + CapturingType(AndType(parent1, parent2), refs1 ** refs2, tp1.isBoxed) + case tp @ OrType(tp1 @ CapturingType(parent1, refs1), tp2 @ CapturingType(parent2, refs2)) => + assert(refs1.asVar.elems.isEmpty) + assert(refs2.asVar.elems.isEmpty) + assert(tp1.isBoxed == tp2.isBoxed) + CapturingType(OrType(parent1, parent2, tp.isSoft), refs1 ++ refs2, tp1.isBoxed) + case tp @ OrType(tp1 @ CapturingType(parent1, refs1), tp2) => + CapturingType(OrType(parent1, tp2, tp.isSoft), refs1, tp1.isBoxed) + case tp @ OrType(tp1, tp2 @ CapturingType(parent2, refs2)) => + CapturingType(OrType(tp1, parent2, tp.isSoft), refs2, tp2.isBoxed) + case _ if needsVariable(tp) => + val cs = tp.dealias match + case CapturingType(_, refs) => CaptureSet.Var(refs.elems) + case _ => CaptureSet.Var() + CapturingType(tp, cs) + case _ => + tp + + private var isTopLevel = true + + private def mapNested(ts: List[Type]): List[Type] = + val saved = isTopLevel + isTopLevel = false + try ts.mapConserve(this) finally isTopLevel = saved + + def apply(t: Type) = + val tp = expandThrowsAlias(t) + val tp1 = tp match + case AnnotatedType(parent, annot) if annot.symbol == defn.RetainsAnnot => + // Drop explicit retains annotations + apply(parent) + case tp @ AppliedType(tycon, args) => + val tycon1 = this(tycon) + if defn.isNonRefinedFunction(tp) then + // Convert toplevel generic function types to dependent functions + val args0 = args.init + var res0 = args.last + val args1 = mapNested(args0) + val res1 = this(res0) + if isTopLevel then + depFun(tycon1, args1, res1) + .showing(i"add function refinement $tp --> $result", capt) + else if (tycon1 eq tycon) && (args1 eq args0) && (res1 eq res0) then + tp + else + tp.derivedAppliedType(tycon1, args1 :+ res1) + else + tp.derivedAppliedType(tycon1, args.mapConserve(arg => this(arg))) + case tp @ RefinedType(core, rname, rinfo) if defn.isFunctionType(tp) => + val rinfo1 = apply(rinfo) + if rinfo1 ne rinfo then rinfo1.toFunctionType(isJava = false, alwaysDependent = true) + else tp + case tp: MethodType => + tp.derivedLambdaType( + paramInfos = mapNested(tp.paramInfos), + resType = this(tp.resType)) + case tp: TypeLambda => + // Don't recurse into parameter bounds, just cleanup any stray retains annotations + tp.derivedLambdaType( + paramInfos = tp.paramInfos.mapConserve(cleanup(_).bounds), + resType = this(tp.resType)) + case _ => + mapOver(tp) + addVar(addCaptureRefinements(tp1)) + end apply + end mapInferred + + private def transformInferredType(tp: Type, boxed: Boolean)(using Context): Type = + val tp1 = mapInferred(tp) + if boxed then box(tp1) else tp1 + + /** Expand some aliases of function types to the underlying functions. + * Right now, these are only $throws aliases, but this could be generalized. 
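A rough standalone sketch of that expansion idea, using a made-up Tpe ADT rather than the compiler's Types API ($throws and CanThrow appear only as plain strings here, and the toy names are illustrative):

// Toy model of the $throws expansion: `R $throws E` becomes a (contextual)
// function from a CanThrow[E] capability to R, instead of keeping the alias.
object ThrowsAliasSketch:
  sealed trait Tpe
  final case class Ref(name: String) extends Tpe
  final case class Applied(tycon: String, args: List[Tpe]) extends Tpe
  final case class Fun(params: List[Tpe], res: Tpe, isContextual: Boolean) extends Tpe

  def expandThrowsAlias(tp: Tpe): Tpe = tp match
    case Applied("$throws", res :: exc :: Nil) =>
      // hand in a CanThrow[exc] capability rather than keeping the alias
      Fun(Applied("CanThrow", exc :: Nil) :: Nil, res, isContextual = true)
    case other => other

  @main def demoThrows(): Unit =
    val t = Applied("$throws", Ref("Int") :: Ref("IOException") :: Nil)
    println(expandThrowsAlias(t))
    // prints a Fun taking CanThrow[IOException] and returning Int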
+ */ + private def expandThrowsAlias(tp: Type)(using Context) = tp match + case AppliedType(tycon, res :: exc :: Nil) if tycon.typeSymbol == defn.throwsAlias => + // hard-coded expansion since $throws aliases in stdlib are defined with `?=>` rather than `?->` + defn.FunctionOf(defn.CanThrowClass.typeRef.appliedTo(exc) :: Nil, res, isContextual = true, isErased = true) + case _ => tp + + private def expandThrowsAliases(using DetachedContext) = new TypeMap: + def apply(t: Type) = t match + case _: AppliedType => + val t1 = expandThrowsAlias(t) + if t1 ne t then apply(t1) else mapOver(t) + case _: LazyRef => + t + case t @ AnnotatedType(t1, ann) => + // Don't map capture sets, since that would implicitly normalize sets that + // are not well-formed. + t.derivedAnnotatedType(apply(t1), ann) + case _ => + mapOver(t) + + /** Fill in capture sets of curried function types from left to right, using + * a combination of the following two rules: + * + * 1. Expand `{c} (x: A) -> (y: B) -> C` + * to `{c} (x: A) -> {c} (y: B) -> C` + * 2. Expand `(x: A) -> (y: B) -> C` where `x` is tracked + * to `(x: A) -> {x} (y: B) -> C` + * + * TODO: Should we also propagate capture sets to the left? + */ + private def expandAbbreviations(using DetachedContext) = new TypeMap: + + /** Propagate `outerCs` as well as all tracked parameters as capture set to the result type + * of the dependent function type `tp`. + */ + def propagateDepFunctionResult(tp: Type, outerCs: CaptureSet): Type = tp match + case RefinedType(parent, nme.apply, rinfo: MethodType) => + val localCs = CaptureSet(rinfo.paramRefs.filter(_.isTracked)*) + val rinfo1 = rinfo.derivedLambdaType( + resType = propagateEnclosing(rinfo.resType, CaptureSet.empty, outerCs ++ localCs)) + if rinfo1 ne rinfo then rinfo1.toFunctionType(isJava = false, alwaysDependent = true) + else tp + + /** If `tp` is a function type: + * - add `outerCs` as its capture set, + * - propagate `currentCs`, `outerCs`, and all tracked parameters of `tp` to the right. 
+ */ + def propagateEnclosing(tp: Type, currentCs: CaptureSet, outerCs: CaptureSet): Type = tp match + case tp @ AppliedType(tycon, args) if defn.isFunctionClass(tycon.typeSymbol) => + val tycon1 = this(tycon) + val args1 = args.init.mapConserve(this) + val tp1 = + if args1.exists(!_.captureSet.isAlwaysEmpty) then + val propagated = propagateDepFunctionResult( + depFun(tycon, args1, args.last), currentCs ++ outerCs) + propagated match + case RefinedType(_, _, mt: MethodType) => + if mt.isCaptureDependent then propagated + else + // No need to introduce dependent type, switch back to generic function type + tp.derivedAppliedType(tycon1, args1 :+ mt.resType) + else + val resType1 = propagateEnclosing( + args.last, CaptureSet.empty, currentCs ++ outerCs) + tp.derivedAppliedType(tycon1, args1 :+ resType1) + tp1.capturing(outerCs) + case tp @ RefinedType(parent, nme.apply, rinfo: MethodType) if defn.isFunctionType(tp) => + propagateDepFunctionResult(mapOver(tp), currentCs ++ outerCs) + .capturing(outerCs) + case _ => + mapOver(tp) + + def apply(tp: Type): Type = tp match + case CapturingType(parent, cs) => + tp.derivedCapturingType(propagateEnclosing(parent, cs, CaptureSet.empty), cs) + case _ => + propagateEnclosing(tp, CaptureSet.empty, CaptureSet.empty) + end expandAbbreviations + + private def transformExplicitType(tp: Type, boxed: Boolean)(using Context): Type = + val tp1 = expandThrowsAliases(if boxed then box(tp) else tp) + if tp1 ne tp then capt.println(i"expanded: $tp --> $tp1") + if ctx.settings.YccNoAbbrev.value then tp1 + else expandAbbreviations(tp1) + + /** Transform type of type tree, and remember the transformed type as the type the tree */ + private def transformTT(tree: TypeTree, boxed: Boolean, exact: Boolean)(using Context): Unit = + if !tree.hasRememberedType then + tree.rememberType( + if tree.isInstanceOf[InferredTypeTree] && !exact + then transformInferredType(tree.tpe, boxed) + else transformExplicitType(tree.tpe, boxed)) + + /** Substitute parameter symbols in `from` to paramRefs in corresponding + * method or poly types `to`. We use a single BiTypeMap to do everything. 
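A toy sketch of the two-way mapping behind that single BiTypeMap, with parameter names standing in for symbols and indices standing in for paramRefs (illustrative only, not the compiler's data structures):

// Toy two-way substitution: forward maps a parameter "symbol" (name) to its
// paramRef (index); inverse goes back from index to name.
object BiSubstSketch:
  final class BiSubst(params: List[String]):
    private val toIndex = params.zipWithIndex.toMap
    def apply(name: String): Option[Int] = toIndex.get(name)      // symbol -> paramRef
    def inverse(index: Int): Option[String] = params.lift(index)  // paramRef -> symbol

  @main def demoBiSubst(): Unit =
    val subst = BiSubst(List("x", "y"))
    println(subst("y"))        // Some(1)
    println(subst.inverse(0))  // Some(x)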
+ * @param from a list of lists of type or term parameter symbols of a curried method + * @param to a list of method or poly types corresponding one-to-one to the parameter lists + */ + private class SubstParams(from: List[List[Symbol]], to: List[LambdaType])(using DetachedContext) + extends DeepTypeMap, BiTypeMap: + + def apply(t: Type): Type = t match + case t: NamedType => + val sym = t.symbol + def outer(froms: List[List[Symbol]], tos: List[LambdaType]): Type = + def inner(from: List[Symbol], to: List[ParamRef]): Type = + if from.isEmpty then outer(froms.tail, tos.tail) + else if sym eq from.head then to.head + else inner(from.tail, to.tail) + if tos.isEmpty then t + else inner(froms.head, tos.head.paramRefs) + outer(from, to) + case _ => + mapOver(t) + + def inverse(t: Type): Type = t match + case t: ParamRef => + def recur(from: List[LambdaType], to: List[List[Symbol]]): Type = + if from.isEmpty then t + else if t.binder eq from.head then to.head(t.paramNum).namedType + else recur(from.tail, to.tail) + recur(to, from) + case _ => + mapOver(t) + end SubstParams + + /** Update info of `sym` for CheckCaptures phase only */ + private def updateInfo(sym: Symbol, info: Type)(using Context) = + sym.updateInfoBetween(preRecheckPhase, thisPhase, info) + + def traverse(tree: Tree)(using Context): Unit = + tree match + case tree: DefDef => + if isExcluded(tree.symbol) then + return + tree.tpt match + case tpt: TypeTree if tree.symbol.allOverriddenSymbols.hasNext => + tree.paramss.foreach(traverse) + transformTT(tpt, boxed = false, exact = true) + traverse(tree.rhs) + //println(i"TYPE of ${tree.symbol.showLocated} = ${tpt.knownType}") + case _ => + traverseChildren(tree) + case tree @ ValDef(_, tpt: TypeTree, _) => + transformTT(tpt, + boxed = tree.symbol.is(Mutable), // types of mutable variables are boxed + exact = tree.symbol.allOverriddenSymbols.hasNext // types of symbols that override a parent don't get a capture set + ) + traverse(tree.rhs) + case tree @ TypeApply(fn, args) => + traverse(fn) + for case arg: TypeTree <- args do + transformTT(arg, boxed = true, exact = false) // type arguments in type applications are boxed + case _ => + traverseChildren(tree) + tree match + case tree: TypeTree => + transformTT(tree, boxed = false, exact = false) // other types are not boxed + case tree: ValOrDefDef => + val sym = tree.symbol + + // replace an existing symbol info with inferred types where capture sets of + // TypeParamRefs and TermParamRefs put in correspondence by BiTypeMaps with the + // capture sets of the types of the method's parameter symbols and result type. 
+ def integrateRT( + info: Type, // symbol info to replace + psymss: List[List[Symbol]], // the local (type and term) parameter symbols corresponding to `info` + prevPsymss: List[List[Symbol]], // the local parameter symbols seen previously in reverse order + prevLambdas: List[LambdaType] // the outer method and polytypes generated previously in reverse order + ): Type = + info match + case mt: MethodOrPoly => + val psyms = psymss.head + mt.companion(mt.paramNames)( + mt1 => + if !psyms.exists(_.isUpdatedAfter(preRecheckPhase)) && !mt.isParamDependent && prevLambdas.isEmpty then + mt.paramInfos + else + val subst = SubstParams(psyms :: prevPsymss, mt1 :: prevLambdas) + psyms.map(psym => subst(psym.info).asInstanceOf[mt.PInfo]), + mt1 => + integrateRT(mt.resType, psymss.tail, psyms :: prevPsymss, mt1 :: prevLambdas) + ) + case info: ExprType => + info.derivedExprType(resType = + integrateRT(info.resType, psymss, prevPsymss, prevLambdas)) + case _ => + val restp = tree.tpt.knownType + if prevLambdas.isEmpty then restp + else SubstParams(prevPsymss, prevLambdas)(restp) + + if tree.tpt.hasRememberedType && !sym.isConstructor then + val newInfo = integrateRT(sym.info, sym.paramSymss, Nil, Nil) + .showing(i"update info $sym: ${sym.info} --> $result", capt) + if newInfo ne sym.info then + val completer = new LazyType: + def complete(denot: SymDenotation)(using Context) = + denot.info = newInfo + recheckDef(tree, sym) + updateInfo(sym, completer) + case tree: Bind => + val sym = tree.symbol + updateInfo(sym, transformInferredType(sym.info, boxed = false)) + case tree: TypeDef => + tree.symbol match + case cls: ClassSymbol => + val cinfo @ ClassInfo(prefix, _, ps, decls, selfInfo) = cls.classInfo + if (selfInfo eq NoType) || cls.is(ModuleClass) && !cls.isStatic then + // add capture set to self type of nested classes if no self type is given explicitly + val localRefs = CaptureSet.Var() + val newInfo = ClassInfo(prefix, cls, ps, decls, + CapturingType(cinfo.selfType, localRefs) + .showing(i"inferred self type for $cls: $result", capt)) + updateInfo(cls, newInfo) + cls.thisType.asInstanceOf[ThisType].invalidateCaches() + if cls.is(ModuleClass) then + // if it's a module, the capture set of the module reference is the capture set of the self type + val modul = cls.sourceModule + updateInfo(modul, CapturingType(modul.info, localRefs)) + modul.termRef.invalidateCaches() + case _ => + val info = atPhase(preRecheckPhase)(tree.symbol.info) + val newInfo = transformExplicitType(info, boxed = false) + if newInfo ne info then + updateInfo(tree.symbol, newInfo) + capt.println(i"update info of ${tree.symbol} from $info to $newInfo") + case _ => + end traverse +end Setup diff --git a/tests/pos-with-compiler-cc/dotc/cc/Synthetics.scala b/tests/pos-with-compiler-cc/dotc/cc/Synthetics.scala new file mode 100644 index 000000000000..dacbd27e0f35 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/cc/Synthetics.scala @@ -0,0 +1,189 @@ +package dotty.tools +package dotc +package cc + +import core.* +import Symbols.*, SymDenotations.*, Contexts.*, Flags.*, Types.*, Decorators.* +import StdNames.nme +import Names.Name +import NameKinds.DefaultGetterName +import Phases.checkCapturesPhase +import config.Printers.capt + +/** Classification and transformation methods for synthetic + * case class methods that need to be treated specially. + * In particular, compute capturing types for some of these methods which + * have inferred (result-)types that need to be established under separate + * compilation. 
+ */ +object Synthetics: + private def isSyntheticCopyMethod(sym: SymDenotation)(using Context) = + sym.name == nme.copy && sym.is(Synthetic) && sym.owner.isClass && sym.owner.is(Case) + + private def isSyntheticCompanionMethod(sym: SymDenotation, names: Name*)(using Context): Boolean = + names.contains(sym.name) && sym.is(Synthetic) && sym.owner.is(Module) && sym.owner.companionClass.is(Case) + + private def isSyntheticCopyDefaultGetterMethod(sym: SymDenotation)(using Context) = sym.name match + case DefaultGetterName(nme.copy, _) => sym.is(Synthetic) && sym.owner.isClass && sym.owner.is(Case) + case _ => false + + /** Is `sym` a synthetic apply, copy, or copy default getter method? + * The types of these symbols are transformed in a special way without + * looking at the definitions's RHS + */ + def needsTransform(symd: SymDenotation)(using Context): Boolean = + isSyntheticCopyMethod(symd) + || isSyntheticCompanionMethod(symd, nme.apply, nme.unapply) + || isSyntheticCopyDefaultGetterMethod(symd) + || (symd.symbol eq defn.Object_eq) + || (symd.symbol eq defn.Object_ne) + + /** Method is excluded from regular capture checking. + * Excluded are synthetic class members + * - that override a synthesized case class symbol, or + * - the fromProduct method, or + * - members transformed specially as indicated by `needsTransform`. + */ + def isExcluded(sym: Symbol)(using Context): Boolean = + sym.is(Synthetic) + && sym.owner.isClass + && ( defn.caseClassSynthesized.exists( + ccsym => sym.overriddenSymbol(ccsym.owner.asClass) == ccsym) + || isSyntheticCompanionMethod(sym, nme.fromProduct) + || needsTransform(sym)) + + /** Add capture dependencies to the type of the `apply` or `copy` method of a case class. + * An apply method in a case class like this: + * case class CC(a: {d} A, b: B, {*} c: C) + * would get type + * def apply(a': {d} A, b: B, {*} c': C): {a', c'} CC { val a = {a'} A, val c = {c'} C } + * where `'` is used to indicate the difference between parameter symbol and refinement name. + * Analogous for the copy method. 
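A minimal self-contained sketch of that result-type shape (strings stand in for real types, and the parameter/refinement-name distinction marked by `'` above is dropped for brevity):

// Toy model of the capture-dependent apply/copy result type: the result
// captures exactly the tracked parameters and gets one refinement per
// tracked parameter. Purely illustrative; no compiler internals involved.
object CaptureDepsSketch:
  final case class Param(name: String, tracked: Boolean)

  def applyResultType(cls: String, params: List[Param]): String =
    val tracked = params.filter(_.tracked).map(_.name)
    if tracked.isEmpty then cls
    else
      val capt  = tracked.mkString("{", ", ", "}")
      val refts = tracked.map(p => s"val $p = {$p} ...").mkString(" { ", ", ", " }")
      s"$capt $cls$refts"

  @main def demoCaptureDeps(): Unit =
    val params = List(Param("a", tracked = true), Param("b", tracked = false), Param("c", tracked = true))
    println(applyResultType("CC", params))
    // {a, c} CC { val a = {a} ..., val c = {c} ... }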
+ */ + private def addCaptureDeps(info: Type)(using Context): Type = info match + case info: MethodType => + val trackedParams = info.paramRefs.filter(atPhase(checkCapturesPhase)(_.isTracked)) + def augmentResult(tp: Type): Type = tp match + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = augmentResult(tp.resType)) + case _ => + val refined = trackedParams.foldLeft(tp) { (parent, pref) => + RefinedType(parent, pref.paramName, + CapturingType( + atPhase(ctx.phase.next)(pref.underlying.stripCapturing), + CaptureSet(pref))) + } + CapturingType(refined, CaptureSet(trackedParams*)) + if trackedParams.isEmpty then info + else augmentResult(info).showing(i"augment apply/copy type $info to $result", capt) + case info: PolyType => + info.derivedLambdaType(resType = addCaptureDeps(info.resType)) + case _ => + info + + /** Drop capture dependencies from the type of `apply` or `copy` method of a case class */ + private def dropCaptureDeps(tp: Type)(using Context): Type = tp match + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = dropCaptureDeps(tp.resType)) + case CapturingType(parent, _) => + dropCaptureDeps(parent) + case RefinedType(parent, _, _) => + dropCaptureDeps(parent) + case _ => + tp + + /** Add capture information to the type of the default getter of a case class copy method */ + private def addDefaultGetterCapture(info: Type, owner: Symbol, idx: Int)(using Context): Type = info match + case info: MethodOrPoly => + info.derivedLambdaType(resType = addDefaultGetterCapture(info.resType, owner, idx)) + case info: ExprType => + info.derivedExprType(addDefaultGetterCapture(info.resType, owner, idx)) + case EventuallyCapturingType(parent, _) => + addDefaultGetterCapture(parent, owner, idx) + case info @ AnnotatedType(parent, annot) => + info.derivedAnnotatedType(addDefaultGetterCapture(parent, owner, idx), annot) + case _ if idx < owner.asClass.paramGetters.length => + val param = owner.asClass.paramGetters(idx) + val pinfo = param.info + atPhase(ctx.phase.next) { + if pinfo.captureSet.isAlwaysEmpty then info + else CapturingType(pinfo.stripCapturing, CaptureSet(param.termRef)) + } + case _ => + info + + /** Drop capture information from the type of the default getter of a case class copy method */ + private def dropDefaultGetterCapture(info: Type)(using Context): Type = info match + case info: MethodOrPoly => + info.derivedLambdaType(resType = dropDefaultGetterCapture(info.resType)) + case CapturingType(parent, _) => + parent + case info @ AnnotatedType(parent, annot) => + info.derivedAnnotatedType(dropDefaultGetterCapture(parent), annot) + case _ => + info + + /** Augment an unapply of type `(x: C): D` to `(x: {*} C): {x} D` */ + private def addUnapplyCaptures(info: Type)(using Context): Type = info match + case info: MethodType => + val paramInfo :: Nil = info.paramInfos: @unchecked + val newParamInfo = + CapturingType(paramInfo, CaptureSet.universal) + val trackedParam = info.paramRefs.head + def newResult(tp: Type): Type = tp match + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = newResult(tp.resType)) + case _ => + CapturingType(tp, CaptureSet(trackedParam)) + info.derivedLambdaType(paramInfos = newParamInfo :: Nil, resType = newResult(info.resType)) + .showing(i"augment unapply type $info to $result", capt) + case info: PolyType => + info.derivedLambdaType(resType = addUnapplyCaptures(info.resType)) + + /** Drop added capture information from the type of an `unapply` */ + private def dropUnapplyCaptures(info: Type)(using Context): Type = info match + case 
info: MethodType => + info.paramInfos match + case CapturingType(oldParamInfo, _) :: Nil => + def oldResult(tp: Type): Type = tp match + case tp: MethodOrPoly => + tp.derivedLambdaType(resType = oldResult(tp.resType)) + case CapturingType(tp, _) => + tp + info.derivedLambdaType(paramInfos = oldParamInfo :: Nil, resType = oldResult(info.resType)) + case _ => + info + case info: PolyType => + info.derivedLambdaType(resType = dropUnapplyCaptures(info.resType)) + + /** If `sym` refers to a synthetic apply, unapply, copy, or copy default getter method + * of a case class, transform it to account for capture information. + * The method is run in phase CheckCaptures.Pre + * @pre needsTransform(sym) + */ + def transformToCC(sym: SymDenotation)(using Context): SymDenotation = sym.name match + case DefaultGetterName(nme.copy, n) => + sym.copySymDenotation(info = addDefaultGetterCapture(sym.info, sym.owner, n)) + case nme.unapply => + sym.copySymDenotation(info = addUnapplyCaptures(sym.info)) + case nme.apply | nme.copy => + sym.copySymDenotation(info = addCaptureDeps(sym.info)) + case n if n == nme.eq || n == nme.ne => + sym.copySymDenotation(info = + MethodType(defn.ObjectType.capturing(CaptureSet.universal) :: Nil, defn.BooleanType)) + + /** If `sym` refers to a synthetic apply, unapply, copy, or copy default getter method + * of a case class, transform it back to what it was before the CC phase. + * @pre needsTransform(sym) + */ + def transformFromCC(sym: SymDenotation)(using Context): SymDenotation = sym.name match + case DefaultGetterName(nme.copy, n) => + sym.copySymDenotation(info = dropDefaultGetterCapture(sym.info)) + case nme.unapply => + sym.copySymDenotation(info = dropUnapplyCaptures(sym.info)) + case nme.apply | nme.copy => + sym.copySymDenotation(info = dropCaptureDeps(sym.info)) + case n if n == nme.eq || n == nme.ne => + sym.copySymDenotation(info = defn.methOfAnyRef(defn.BooleanType)) + +end Synthetics \ No newline at end of file diff --git a/tests/pos-with-compiler-cc/dotc/classpath/AggregateClassPath.scala b/tests/pos-with-compiler-cc/dotc/classpath/AggregateClassPath.scala new file mode 100644 index 000000000000..51b261583feb --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/AggregateClassPath.scala @@ -0,0 +1,162 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools +package dotc.classpath + +import scala.language.unsafeNulls + +import java.net.URL +import scala.collection.mutable.ArrayBuffer +import scala.collection.immutable.ArraySeq +import dotc.util + +import dotty.tools.io.{ AbstractFile, ClassPath, ClassRepresentation, EfficientClassPath } + +/** + * A classpath unifying multiple class- and sourcepath entries. + * The Classpath can obtain entries for classes and sources independently + * so it tries to do operations quite optimally - iterating only these collections + * which are needed in the given moment and only as far as it's necessary. 
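A small self-contained sketch of that lazy, first-match lookup, with plain Maps standing in for the aggregated ClassPath instances (names are illustrative):

// Toy aggregate lookup: walk the delegates lazily and return the first hit,
// so later classpath entries are not touched once a match is found.
object AggregateLookupSketch:
  type ToyClassPath = Map[String, String]   // class name -> location of its class file

  def findFirst(aggregates: Seq[ToyClassPath], className: String): Option[String] =
    aggregates.iterator.map(_.get(className)).collectFirst { case Some(loc) => loc }

  @main def demoAggregate(): Unit =
    val cp1 = Map("a.A" -> "cp1/a/A.class")
    val cp2 = Map("a.A" -> "cp2/a/A.class", "b.B" -> "cp2/b/B.class")
    println(findFirst(Seq(cp1, cp2), "a.A"))  // Some(cp1/a/A.class): first entry wins
    println(findFirst(Seq(cp1, cp2), "b.B"))  // Some(cp2/b/B.class)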
+ * + * @param aggregates classpath instances containing entries which this class processes + */ +case class AggregateClassPath(aggregates: Seq[ClassPath]) extends ClassPath { + override def findClassFile(className: String): Option[AbstractFile] = { + val (pkg, _) = PackageNameUtils.separatePkgAndClassNames(className) + aggregatesForPackage(PackageName(pkg)).iterator.map(_.findClassFile(className)).collectFirst { + case Some(x) => x + } + } + private val packageIndex: collection.mutable.Map[String, Seq[ClassPath]] = collection.mutable.Map() + private def aggregatesForPackage(pkg: PackageName): Seq[ClassPath] = packageIndex.synchronized { + packageIndex.getOrElseUpdate(pkg.dottedString, aggregates.filter(_.hasPackage(pkg))) + } + + override def findClass(className: String): Option[ClassRepresentation] = { + val (pkg, _) = PackageNameUtils.separatePkgAndClassNames(className) + + def findEntry(isSource: Boolean): Option[ClassRepresentation] = + aggregatesForPackage(PackageName(pkg)).iterator.map(_.findClass(className)).collectFirst { + case Some(s: SourceFileEntry) if isSource => s + case Some(s: ClassFileEntry) if !isSource => s + } + + val classEntry = findEntry(isSource = false) + val sourceEntry = findEntry(isSource = true) + + (classEntry, sourceEntry) match { + case (Some(c: ClassFileEntry), Some(s: SourceFileEntry)) => Some(ClassAndSourceFilesEntry(c.file, s.file)) + case (c @ Some(_), _) => c + case (_, s) => s + } + } + + override def asURLs: Seq[URL] = aggregates.flatMap(_.asURLs) + + override def asClassPathStrings: Seq[String] = aggregates.map(_.asClassPathString).distinct + + override def asSourcePathString: String = ClassPath.join(aggregates map (_.asSourcePathString): _*) + + override private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = { + val aggregatedPackages = aggregates.flatMap(_.packages(inPackage)).distinct + aggregatedPackages + } + + override private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = + getDistinctEntries(_.classes(inPackage)) + + override private[dotty] def sources(inPackage: PackageName): Seq[SourceFileEntry] = + getDistinctEntries(_.sources(inPackage)) + + override private[dotty] def hasPackage(pkg: PackageName): Boolean = aggregates.exists(_.hasPackage(pkg)) + override private[dotty] def list(inPackage: PackageName): ClassPathEntries = { + val packages: java.util.HashSet[PackageEntry] = new java.util.HashSet[PackageEntry]() + val classesAndSourcesBuffer = collection.mutable.ArrayBuffer[ClassRepresentation]() + val onPackage: PackageEntry => Unit = packages.add(_) + val onClassesAndSources: ClassRepresentation => Unit = classesAndSourcesBuffer += _ + + aggregates.foreach { cp => + try { + cp match { + case ecp: EfficientClassPath => + ecp.list(inPackage, onPackage, onClassesAndSources) + case _ => + val entries = cp.list(inPackage) + entries._1.foreach(entry => packages.add(entry)) + classesAndSourcesBuffer ++= entries._2 + } + } catch { + case ex: java.io.IOException => + val e = FatalError(ex.getMessage) + e.initCause(ex) + throw e + } + } + + val distinctPackages: Seq[PackageEntry] = { + val arr = packages.toArray(new Array[PackageEntry](packages.size())) + ArraySeq.unsafeWrapArray(arr) + } + val distinctClassesAndSources = mergeClassesAndSources(classesAndSourcesBuffer) + ClassPathEntries(distinctPackages, distinctClassesAndSources) + } + + /** + * Returns only one entry for each name. If there's both a source and a class entry, it + * creates an entry containing both of them. 
If there would be more than one class or source + * entries for the same class it always would use the first entry of each type found on a classpath. + */ + private def mergeClassesAndSources(entries: scala.collection.Seq[ClassRepresentation]): Seq[ClassRepresentation] = { + // based on the implementation from MergedClassPath + var count = 0 + val indices = util.HashMap[String, Int]() + val mergedEntries = new ArrayBuffer[ClassRepresentation](entries.size) + for { + entry <- entries + } { + val name = entry.name + if (indices.contains(name)) { + val index = indices(name) + val existing = mergedEntries(index) + + if (existing.binary.isEmpty && entry.binary.isDefined) + mergedEntries(index) = ClassAndSourceFilesEntry(entry.binary.get, existing.source.get) + if (existing.source.isEmpty && entry.source.isDefined) + mergedEntries(index) = ClassAndSourceFilesEntry(existing.binary.get, entry.source.get) + } + else { + indices(name) = count + mergedEntries += entry + count += 1 + } + } + if (mergedEntries.isEmpty) Nil else mergedEntries.toIndexedSeq + } + + private def getDistinctEntries[EntryType <: ClassRepresentation](getEntries: ClassPath => Seq[EntryType]): Seq[EntryType] = { + val seenNames = util.HashSet[String]() + val entriesBuffer = new ArrayBuffer[EntryType](1024) + for { + cp <- aggregates + entry <- getEntries(cp) if !seenNames.contains(entry.name) + } + { + entriesBuffer += entry + seenNames += entry.name + } + entriesBuffer.toIndexedSeq + } +} + +object AggregateClassPath { + def createAggregate(parts: ClassPath*): ClassPath = { + val elems = new ArrayBuffer[ClassPath]() + parts foreach { + case AggregateClassPath(ps) => elems ++= ps + case p => elems += p + } + if (elems.size == 1) elems.head + else AggregateClassPath(elems.toIndexedSeq) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/ClassPath.scala b/tests/pos-with-compiler-cc/dotc/classpath/ClassPath.scala new file mode 100644 index 000000000000..176b6acf9c6c --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/ClassPath.scala @@ -0,0 +1,85 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. 
+ */ +package dotty.tools.dotc.classpath + +import dotty.tools.io.AbstractFile +import dotty.tools.io.ClassRepresentation + +case class ClassPathEntries(packages: scala.collection.Seq[PackageEntry], classesAndSources: scala.collection.Seq[ClassRepresentation]) { + def toTuple: (scala.collection.Seq[PackageEntry], scala.collection.Seq[ClassRepresentation]) = (packages, classesAndSources) +} + +object ClassPathEntries { + val empty = ClassPathEntries(Seq.empty, Seq.empty) +} + +trait ClassFileEntry extends ClassRepresentation { + def file: AbstractFile +} + +trait SourceFileEntry extends ClassRepresentation { + def file: AbstractFile +} + +case class PackageName(dottedString: String) { + val dirPathTrailingSlashJar: String = FileUtils.dirPathInJar(dottedString) + "/" + + val dirPathTrailingSlash: String = + if (java.io.File.separatorChar == '/') + dirPathTrailingSlashJar + else + FileUtils.dirPath(dottedString) + java.io.File.separator + + def isRoot: Boolean = dottedString.isEmpty + + def entryName(entry: String): String = { + if (isRoot) entry else { + val builder = new java.lang.StringBuilder(dottedString.length + 1 + entry.length) + builder.append(dottedString) + builder.append('.') + builder.append(entry) + builder.toString + } + } +} + +trait PackageEntry { + def name: String +} + +private[dotty] case class ClassFileEntryImpl(file: AbstractFile) extends ClassFileEntry { + final def fileName: String = file.name + def name: String = FileUtils.stripClassExtension(file.name) // class name + + def binary: Option[AbstractFile] = Some(file) + def source: Option[AbstractFile] = None +} + +private[dotty] case class SourceFileEntryImpl(file: AbstractFile) extends SourceFileEntry { + final def fileName: String = file.name + def name: String = FileUtils.stripSourceExtension(file.name) + + def binary: Option[AbstractFile] = None + def source: Option[AbstractFile] = Some(file) +} + +private[dotty] case class ClassAndSourceFilesEntry(classFile: AbstractFile, srcFile: AbstractFile) extends ClassRepresentation { + final def fileName: String = classFile.name + def name: String = FileUtils.stripClassExtension(classFile.name) + + def binary: Option[AbstractFile] = Some(classFile) + def source: Option[AbstractFile] = Some(srcFile) +} + +private[dotty] case class PackageEntryImpl(name: String) extends PackageEntry + +private[dotty] trait NoSourcePaths { + def asSourcePathString: String = "" + private[dotty] def sources(inPackage: PackageName): Seq[SourceFileEntry] = Seq.empty +} + +private[dotty] trait NoClassPaths { + def findClassFile(className: String): Option[AbstractFile] = None + private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = Seq.empty +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/ClassPathFactory.scala b/tests/pos-with-compiler-cc/dotc/classpath/ClassPathFactory.scala new file mode 100644 index 000000000000..ac8b69381938 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/ClassPathFactory.scala @@ -0,0 +1,84 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools.dotc.classpath + +import dotty.tools.io.{AbstractFile, VirtualDirectory} +import FileUtils._ +import dotty.tools.io.ClassPath +import dotty.tools.dotc.core.Contexts._ + +/** + * Provides factory methods for classpath. When creating classpath instances for a given path, + * it uses proper type of classpath depending on a types of particular files containing sources or classes. 
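A compact sketch of that dispatch, simplified to string checks on the path; the real factory inspects an AbstractFile and also covers virtual directories and source paths:

// Toy factory: pick a classpath implementation from the shape of the path.
object ClassPathFactorySketch:
  sealed trait ToyClassPath
  final case class ZipClassPath(path: String) extends ToyClassPath
  final case class DirClassPath(path: String) extends ToyClassPath

  def newClassPath(path: String): ToyClassPath =
    if path.endsWith(".jar") || path.endsWith(".zip") then ZipClassPath(path)
    else DirClassPath(path)

  @main def demoFactory(): Unit =
    println(newClassPath("lib/scala-library.jar"))  // ZipClassPath(lib/scala-library.jar)
    println(newClassPath("target/classes"))         // DirClassPath(target/classes)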
+ */ +class ClassPathFactory { + /** + * Create a new classpath based on the abstract file. + */ + def newClassPath(file: AbstractFile)(using Context): ClassPath = ClassPathFactory.newClassPath(file) + + /** + * Creators for sub classpaths which preserve this context. + */ + def sourcesInPath(path: String)(using Context): List[ClassPath] = + for { + file <- expandPath(path, expandStar = false) + dir <- Option(AbstractFile getDirectory file) + } + yield createSourcePath(dir) + + + def expandPath(path: String, expandStar: Boolean = true): List[String] = dotty.tools.io.ClassPath.expandPath(path, expandStar) + + def expandDir(extdir: String): List[String] = dotty.tools.io.ClassPath.expandDir(extdir) + + def contentsOfDirsInPath(path: String)(using Context): List[ClassPath] = + for { + dir <- expandPath(path, expandStar = false) + name <- expandDir(dir) + entry <- Option(AbstractFile.getDirectory(name)) + } + yield newClassPath(entry) + + def classesInExpandedPath(path: String)(using Context): IndexedSeq[ClassPath] = + classesInPathImpl(path, expand = true).toIndexedSeq + + def classesInPath(path: String)(using Context): List[ClassPath] = classesInPathImpl(path, expand = false) + + def classesInManifest(useManifestClassPath: Boolean)(using Context): List[ClassPath] = + if (useManifestClassPath) dotty.tools.io.ClassPath.manifests.map(url => newClassPath(AbstractFile getResources url)) + else Nil + + // Internal + protected def classesInPathImpl(path: String, expand: Boolean)(using Context): List[ClassPath] = + for { + file <- expandPath(path, expand) + dir <- { + def asImage = if (file.endsWith(".jimage")) Some(AbstractFile.getFile(file)) else None + Option(AbstractFile.getDirectory(file)).orElse(asImage) + } + } + yield newClassPath(dir) + + private def createSourcePath(file: AbstractFile)(using Context): ClassPath = + if (file.isJarOrZip) + ZipAndJarSourcePathFactory.create(file) + else if (file.isDirectory) + new DirectorySourcePath(file.file) + else + sys.error(s"Unsupported sourcepath element: $file") +} + +object ClassPathFactory { + def newClassPath(file: AbstractFile)(using Context): ClassPath = file match { + case vd: VirtualDirectory => VirtualDirectoryClassPath(vd) + case _ => + if (file.isJarOrZip) + ZipAndJarClassPathFactory.create(file) + else if (file.isDirectory) + new DirectoryClassPath(file.file) + else + sys.error(s"Unsupported classpath element: $file") + } +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/DirectoryClassPath.scala b/tests/pos-with-compiler-cc/dotc/classpath/DirectoryClassPath.scala new file mode 100644 index 000000000000..a5678970411b --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/DirectoryClassPath.scala @@ -0,0 +1,313 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools.dotc.classpath + +import scala.language.unsafeNulls + +import java.io.{File => JFile} +import java.net.URL +import java.nio.file.{FileSystems, Files} + +import dotty.tools.dotc.classpath.PackageNameUtils.{packageContains, separatePkgAndClassNames} +import dotty.tools.io.{AbstractFile, PlainFile, ClassPath, ClassRepresentation, EfficientClassPath, JDK9Reflectors} +import FileUtils._ +import PlainFile.toPlainFile + +import scala.jdk.CollectionConverters._ +import scala.collection.immutable.ArraySeq +import scala.util.control.NonFatal +import language.experimental.pureFunctions + +/** + * A trait allowing to look for classpath entries in directories. It provides common logic for + * classes handling class and source files. 
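A self-contained sketch of the directory-based lookup idea: a dotted package name maps onto a nested directory, so lookup is a path resolution followed by a filtered listing (java.io.File here; the directory used in the demo is hypothetical):

// Toy directory lookup: resolve the package directory, then list its entries.
import java.io.File

object DirectoryLookupSketch:
  def packageDir(root: File, pkg: String): Option[File] =
    val dir = if pkg.isEmpty then root else new File(root, pkg.replace('.', File.separatorChar))
    if dir.isDirectory then Some(dir) else None

  def classFiles(root: File, pkg: String): Seq[String] =
    packageDir(root, pkg) match
      case Some(dir) => dir.listFiles().toSeq.map(_.getName).filter(_.endsWith(".class")).sorted
      case None      => Nil

  @main def demoDirLookup(): Unit =
    // assumes some compiled output directory exists at this (hypothetical) path
    println(classFiles(new File("target/classes"), "a.b"))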
+ * It makes use of the fact that in the case of nested directories it's easy to find a file + * when we have a name of a package. + * It abstracts over the file representation to work with both JFile and AbstractFile. + */ +trait DirectoryLookup[FileEntryType <: ClassRepresentation] extends EfficientClassPath { + type F + + val dir: F + + protected def emptyFiles: Array[F] // avoids reifying ClassTag[F] + protected def getSubDir(dirName: String): Option[F] + protected def listChildren(dir: F, filter: Option[F -> Boolean] = (None: Option[F -> Boolean])): Array[F] // !cc! need explicit typing of default argument + protected def getName(f: F): String + protected def toAbstractFile(f: F): AbstractFile + protected def isPackage(f: F): Boolean + + protected def createFileEntry(file: AbstractFile): FileEntryType + protected def isMatchingFile(f: F): Boolean + + private def getDirectory(forPackage: PackageName): Option[F] = + if (forPackage.isRoot) + Some(dir) + else + getSubDir(forPackage.dirPathTrailingSlash) + + override private[dotty] def hasPackage(pkg: PackageName): Boolean = getDirectory(pkg).isDefined + + private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = { + val dirForPackage = getDirectory(inPackage) + val nestedDirs: Array[F] = dirForPackage match { + case None => emptyFiles + case Some(directory) => listChildren(directory, Some(isPackage)) + } + ArraySeq.unsafeWrapArray(nestedDirs).map(f => PackageEntryImpl(inPackage.entryName(getName(f)))) + } + + protected def files(inPackage: PackageName): Seq[FileEntryType] = { + val dirForPackage = getDirectory(inPackage) + val files: Array[F] = dirForPackage match { + case None => emptyFiles + case Some(directory) => listChildren(directory, Some(isMatchingFile)) + } + files.iterator.map(f => createFileEntry(toAbstractFile(f))).toSeq + } + + override def list(inPackage: PackageName, onPackageEntry: PackageEntry => Unit, onClassesAndSources: ClassRepresentation => Unit): Unit = { + val dirForPackage = getDirectory(inPackage) + dirForPackage match { + case None => + case Some(directory) => + for (file <- listChildren(directory)) { + if (isPackage(file)) + onPackageEntry(PackageEntryImpl(inPackage.entryName(getName(file)))) + else if (isMatchingFile(file)) + onClassesAndSources(createFileEntry(toAbstractFile(file))) + } + } + } +} + +trait JFileDirectoryLookup[FileEntryType <: ClassRepresentation] extends DirectoryLookup[FileEntryType] { + type F = JFile + + protected def emptyFiles: Array[JFile] = Array.empty + protected def getSubDir(packageDirName: String): Option[JFile] = { + val packageDir = new JFile(dir, packageDirName) + if (packageDir.exists && packageDir.isDirectory) Some(packageDir) + else None + } + protected def listChildren(dir: JFile, filter: Option[JFile -> Boolean]): Array[JFile] = { + val listing = filter match { + case Some(f) => dir.listFiles(mkFileFilter(f)) + case None => dir.listFiles() + } + + if (listing != null) { + // Sort by file name for stable order of directory .class entries in package scope. + // This gives stable results ordering of base type sequences for unrelated classes + // with the same base type depth. + // + // Notably, this will stably infer`Product with Serializable` + // as the type of `case class C(); case class D(); List(C(), D()).head`, rather than the opposite order. + // On Mac, the HFS performs this sorting transparently, but on Linux the order is unspecified. 
+ // + // Note this behaviour can be enabled in javac with `javac -XDsortfiles`, but that's only + // intended to improve determinism of the compiler for compiler hackers. + java.util.Arrays.sort(listing, + new java.util.Comparator[JFile] { + def compare(o1: JFile, o2: JFile) = o1.getName.compareTo(o2.getName) + }) + listing + } + else Array() + } + protected def getName(f: JFile): String = f.getName + protected def toAbstractFile(f: JFile): AbstractFile = f.toPath.toPlainFile + protected def isPackage(f: JFile): Boolean = f.isPackage + + assert(dir != null, "Directory file in DirectoryFileLookup cannot be null") + + def asURLs: Seq[URL] = Seq(dir.toURI.toURL) + def asClassPathStrings: Seq[String] = Seq(dir.getPath) +} + +object JrtClassPath { + import java.nio.file._, java.net.URI + def apply(release: Option[String]): Option[ClassPath] = { + import scala.util.Properties._ + if (!isJavaAtLeast("9")) None + else { + // Longer term we'd like an official API for this in the JDK + // Discussion: http://mail.openjdk.java.net/pipermail/compiler-dev/2018-March/thread.html#11738 + + val currentMajorVersion: Int = JDK9Reflectors.runtimeVersionMajor(JDK9Reflectors.runtimeVersion()).intValue() + release match { + case Some(v) if v.toInt < currentMajorVersion => + try { + val ctSym = Paths.get(javaHome).resolve("lib").resolve("ct.sym") + if (Files.notExists(ctSym)) None + else Some(new CtSymClassPath(ctSym, v.toInt)) + } catch { + case NonFatal(_) => None + } + case _ => + try { + val fs = FileSystems.getFileSystem(URI.create("jrt:/")) + Some(new JrtClassPath(fs)) + } catch { + case _: ProviderNotFoundException | _: FileSystemNotFoundException => None + } + } + } + } +} + +/** + * Implementation `ClassPath` based on the JDK 9 encapsulated runtime modules (JEP-220) + * + * https://bugs.openjdk.java.net/browse/JDK-8066492 is the most up to date reference + * for the structure of the jrt:// filesystem. + * + * The implementation assumes that no classes exist in the empty package. + */ +final class JrtClassPath(fs: java.nio.file.FileSystem) extends ClassPath with NoSourcePaths { + import java.nio.file.Path, java.nio.file._ + type F = Path + private val dir: Path = fs.getPath("/packages") + + // e.g. 
"java.lang" -> Seq("/modules/java.base") + private val packageToModuleBases: Map[String, Seq[Path]] = { + val ps = Files.newDirectoryStream(dir).iterator().asScala + def lookup(pack: Path): Seq[Path] = + Files.list(pack).iterator().asScala.map(l => if (Files.isSymbolicLink(l)) Files.readSymbolicLink(l) else l).toList + ps.map(p => (p.toString.stripPrefix("/packages/"), lookup(p))).toMap + } + + /** Empty string represents root package */ + override private[dotty] def hasPackage(pkg: PackageName): Boolean = packageToModuleBases.contains(pkg.dottedString) + + override private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = + packageToModuleBases.keysIterator.filter(pack => packageContains(inPackage.dottedString, pack)).map(PackageEntryImpl(_)).toVector + + private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = + if (inPackage.isRoot) Nil + else + packageToModuleBases.getOrElse(inPackage.dottedString, Nil).flatMap(x => + Files.list(x.resolve(inPackage.dirPathTrailingSlash)).iterator().asScala.filter(_.getFileName.toString.endsWith(".class"))).map(x => + ClassFileEntryImpl(x.toPlainFile)).toVector + + override private[dotty] def list(inPackage: PackageName): ClassPathEntries = + if (inPackage.isRoot) ClassPathEntries(packages(inPackage), Nil) + else ClassPathEntries(packages(inPackage), classes(inPackage)) + + def asURLs: Seq[URL] = Seq(new URL("jrt:/")) + // We don't yet have a scheme to represent the JDK modules in our `-classpath`. + // java models them as entries in the new "module path", we'll probably need to follow this. + def asClassPathStrings: Seq[String] = Nil + + def findClassFile(className: String): Option[AbstractFile] = + if (!className.contains(".")) None + else { + val (inPackage, _) = separatePkgAndClassNames(className) + packageToModuleBases.getOrElse(inPackage, Nil).iterator.flatMap{ x => + val file = x.resolve(FileUtils.dirPath(className) + ".class") + if (Files.exists(file)) file.toPlainFile :: Nil else Nil + }.take(1).toList.headOption + } +} + +/** + * Implementation `ClassPath` based on the \$JAVA_HOME/lib/ct.sym backing http://openjdk.java.net/jeps/247 + */ +final class CtSymClassPath(ctSym: java.nio.file.Path, release: Int) extends ClassPath with NoSourcePaths { + import java.nio.file.Path, java.nio.file._ + + private val fileSystem: FileSystem = FileSystems.newFileSystem(ctSym, null: ClassLoader) + private val root: Path = fileSystem.getRootDirectories.iterator.next + private val roots = Files.newDirectoryStream(root).iterator.asScala.toList + + // http://mail.openjdk.java.net/pipermail/compiler-dev/2018-March/011737.html + private def codeFor(major: Int): String = if (major < 10) major.toString else ('A' + (major - 10)).toChar.toString + + private val releaseCode: String = codeFor(release) + private def fileNameMatchesRelease(fileName: String) = !fileName.contains("-") && fileName.contains(releaseCode) // exclude `9-modules` + private val rootsForRelease: List[Path] = roots.filter(root => fileNameMatchesRelease(root.getFileName.toString)) + + // e.g. 
"java.lang" -> Seq(/876/java/lang, /87/java/lang, /8/java/lang)) + private val packageIndex: scala.collection.Map[String, scala.collection.Seq[Path]] = { + val index = collection.mutable.AnyRefMap[String, collection.mutable.ListBuffer[Path]]() + val isJava12OrHigher = scala.util.Properties.isJavaAtLeast("12") + rootsForRelease.foreach(root => Files.walk(root).iterator().asScala.filter(Files.isDirectory(_)).foreach { p => + val moduleNamePathElementCount = if (isJava12OrHigher) 1 else 0 + if (p.getNameCount > root.getNameCount + moduleNamePathElementCount) { + val packageDotted = p.subpath(moduleNamePathElementCount + root.getNameCount, p.getNameCount).toString.replace('/', '.') + index.getOrElseUpdate(packageDotted, new collection.mutable.ListBuffer) += p + } + }) + index + } + + /** Empty string represents root package */ + override private[dotty] def hasPackage(pkg: PackageName) = packageIndex.contains(pkg.dottedString) + override private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = { + packageIndex.keysIterator.filter(pack => packageContains(inPackage.dottedString, pack)).map(PackageEntryImpl(_)).toVector + } + private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = { + if (inPackage.isRoot) Nil + else { + val sigFiles = packageIndex.getOrElse(inPackage.dottedString, Nil).iterator.flatMap(p => + Files.list(p).iterator.asScala.filter(_.getFileName.toString.endsWith(".sig"))) + sigFiles.map(f => ClassFileEntryImpl(f.toPlainFile)).toVector + } + } + + override private[dotty] def list(inPackage: PackageName): ClassPathEntries = + if (inPackage.isRoot) ClassPathEntries(packages(inPackage), Nil) + else ClassPathEntries(packages(inPackage), classes(inPackage)) + + def asURLs: Seq[URL] = Nil + def asClassPathStrings: Seq[String] = Nil + def findClassFile(className: String): Option[AbstractFile] = { + if (!className.contains(".")) None + else { + val (inPackage, classSimpleName) = separatePkgAndClassNames(className) + packageIndex.getOrElse(inPackage, Nil).iterator.flatMap { p => + val path = p.resolve(classSimpleName + ".sig") + if (Files.exists(path)) path.toPlainFile :: Nil else Nil + }.take(1).toList.headOption + } + } +} + +case class DirectoryClassPath(dir: JFile) extends JFileDirectoryLookup[ClassFileEntryImpl] with NoSourcePaths { + override def findClass(className: String): Option[ClassRepresentation] = findClassFile(className) map ClassFileEntryImpl.apply + + def findClassFile(className: String): Option[AbstractFile] = { + val relativePath = FileUtils.dirPath(className) + val classFile = new JFile(dir, relativePath + ".class") + if (classFile.exists) { + Some(classFile.toPath.toPlainFile) + } + else None + } + + protected def createFileEntry(file: AbstractFile): ClassFileEntryImpl = ClassFileEntryImpl(file) + protected def isMatchingFile(f: JFile): Boolean = f.isClass + + private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = files(inPackage) +} + +case class DirectorySourcePath(dir: JFile) extends JFileDirectoryLookup[SourceFileEntryImpl] with NoClassPaths { + def asSourcePathString: String = asClassPathString + + protected def createFileEntry(file: AbstractFile): SourceFileEntryImpl = SourceFileEntryImpl(file) + protected def isMatchingFile(f: JFile): Boolean = endsScalaOrJava(f.getName) + + override def findClass(className: String): Option[ClassRepresentation] = findSourceFile(className) map SourceFileEntryImpl.apply + + private def findSourceFile(className: String): Option[AbstractFile] = { + val relativePath = 
FileUtils.dirPath(className) + val sourceFile = LazyList("scala", "java") + .map(ext => new JFile(dir, relativePath + "." + ext)) + .collectFirst { case file if file.exists() => file } + + sourceFile.map(_.toPath.toPlainFile) + } + + private[dotty] def sources(inPackage: PackageName): Seq[SourceFileEntry] = files(inPackage) +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/FileUtils.scala b/tests/pos-with-compiler-cc/dotc/classpath/FileUtils.scala new file mode 100644 index 000000000000..0f5ac16b40bf --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/FileUtils.scala @@ -0,0 +1,85 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools +package dotc.classpath + +import scala.language.unsafeNulls + +import java.io.{File => JFile, FileFilter} +import java.net.URL +import dotty.tools.io.AbstractFile +import language.experimental.pureFunctions + +/** + * Common methods related to Java files and abstract files used in the context of classpath + */ +object FileUtils { + extension (file: AbstractFile) { + def isPackage: Boolean = file.isDirectory && mayBeValidPackage(file.name) + + def isClass: Boolean = !file.isDirectory && file.hasExtension("class") && !file.name.endsWith("$class.class") + // FIXME: drop last condition when we stop being compatible with Scala 2.11 + + def isScalaOrJavaSource: Boolean = !file.isDirectory && (file.hasExtension("scala") || file.hasExtension("java")) + + // TODO do we need to check also other files using ZipMagicNumber like in scala.tools.nsc.io.Jar.isJarOrZip? + def isJarOrZip: Boolean = file.hasExtension("jar") || file.hasExtension("zip") + + /** + * Safe method returning a sequence containing one URL representing this file, when underlying file exists, + * and returning given default value in other case + */ + def toURLs(default: => Seq[URL] = Seq.empty): Seq[URL] = if (file.file == null) default else Seq(file.toURL) + } + + extension (file: JFile) { + def isPackage: Boolean = file.isDirectory && mayBeValidPackage(file.getName) + + def isClass: Boolean = file.isFile && file.getName.endsWith(".class") && !file.getName.endsWith("$class.class") + // FIXME: drop last condition when we stop being compatible with Scala 2.11 + } + + private val SUFFIX_CLASS = ".class" + private val SUFFIX_SCALA = ".scala" + private val SUFFIX_JAVA = ".java" + private val SUFFIX_SIG = ".sig" + + def stripSourceExtension(fileName: String): String = + if (endsScala(fileName)) stripClassExtension(fileName) + else if (endsJava(fileName)) stripJavaExtension(fileName) + else throw new FatalError("Unexpected source file ending: " + fileName) + + def dirPath(forPackage: String): String = forPackage.replace('.', JFile.separatorChar) + + def dirPathInJar(forPackage: String): String = forPackage.replace('.', '/') + + inline private def ends (filename:String, suffix:String) = filename.endsWith(suffix) && filename.length > suffix.length + + def endsClass(fileName: String): Boolean = + ends (fileName, SUFFIX_CLASS) || fileName.endsWith(SUFFIX_SIG) + + def endsScalaOrJava(fileName: String): Boolean = + endsScala(fileName) || endsJava(fileName) + + def endsJava(fileName: String): Boolean = + ends (fileName, SUFFIX_JAVA) + + def endsScala(fileName: String): Boolean = + ends (fileName, SUFFIX_SCALA) + + def stripClassExtension(fileName: String): String = + fileName.substring(0, fileName.lastIndexOf('.')) + + def stripJavaExtension(fileName: String): String = + fileName.substring(0, fileName.length - 5) // equivalent of fileName.length - 
SUFFIX_JAVA.length + + // probably it should match a pattern like [a-z_]{1}[a-z0-9_]* but it cannot be changed + // because then some tests in partest don't pass + def mayBeValidPackage(dirName: String): Boolean = + (dirName != "META-INF") && (dirName != "") && (dirName.charAt(0) != '.') + + def mkFileFilter(f: JFile -> Boolean): FileFilter = new FileFilter { + def accept(pathname: JFile): Boolean = f(pathname) + } +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/PackageNameUtils.scala b/tests/pos-with-compiler-cc/dotc/classpath/PackageNameUtils.scala new file mode 100644 index 000000000000..ea7412f15d8a --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/PackageNameUtils.scala @@ -0,0 +1,37 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools.dotc.classpath + +import dotty.tools.io.ClassPath.RootPackage + +/** + * Common methods related to package names represented as String + */ +object PackageNameUtils { + + /** + * @param fullClassName full class name with package + * @return (package, simple class name) + */ + inline def separatePkgAndClassNames(fullClassName: String): (String, String) = { + val lastDotIndex = fullClassName.lastIndexOf('.') + if (lastDotIndex == -1) + (RootPackage, fullClassName) + else + (fullClassName.substring(0, lastDotIndex).nn, fullClassName.substring(lastDotIndex + 1).nn) + } + + def packagePrefix(inPackage: String): String = if (inPackage == RootPackage) "" else inPackage + "." + + /** + * `true` if `packageDottedName` is a package directly nested in `inPackage`, for example: + * - `packageContains("scala", "scala.collection")` + * - `packageContains("", "scala")` + */ + def packageContains(inPackage: String, packageDottedName: String) = { + if (packageDottedName.contains(".")) + packageDottedName.startsWith(inPackage) && packageDottedName.lastIndexOf('.') == inPackage.length + else inPackage == "" + } +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/VirtualDirectoryClassPath.scala b/tests/pos-with-compiler-cc/dotc/classpath/VirtualDirectoryClassPath.scala new file mode 100644 index 000000000000..ac80d543b539 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/VirtualDirectoryClassPath.scala @@ -0,0 +1,55 @@ +package dotty.tools.dotc.classpath + +import scala.language.unsafeNulls + +import dotty.tools.io.ClassRepresentation +import dotty.tools.io.{AbstractFile, VirtualDirectory} +import FileUtils._ +import java.net.URL + +import dotty.tools.io.ClassPath +import language.experimental.pureFunctions + +case class VirtualDirectoryClassPath(dir: VirtualDirectory) extends ClassPath with DirectoryLookup[ClassFileEntryImpl] with NoSourcePaths { + type F = AbstractFile + + // From AbstractFileClassLoader + private final def lookupPath(base: AbstractFile)(pathParts: Seq[String], directory: Boolean): AbstractFile = { + var file: AbstractFile = base + val dirParts = pathParts.init.iterator + while (dirParts.hasNext) { + val dirPart = dirParts.next + file = file.lookupName(dirPart, directory = true) + if (file == null) + return null + } + file.lookupName(pathParts.last, directory = directory) + } + + protected def emptyFiles: Array[AbstractFile] = Array.empty + protected def getSubDir(packageDirName: String): Option[AbstractFile] = + Option(lookupPath(dir)(packageDirName.split(java.io.File.separator).toIndexedSeq, directory = true)) + protected def listChildren(dir: AbstractFile, filter: Option[AbstractFile -> Boolean]): Array[F] = filter match { + case Some(f) => dir.iterator.filter(f).toArray + 
case _ => dir.toArray + } + def getName(f: AbstractFile): String = f.name + def toAbstractFile(f: AbstractFile): AbstractFile = f + def isPackage(f: AbstractFile): Boolean = f.isPackage + + // mimic the behavior of the old nsc.util.DirectoryClassPath + def asURLs: Seq[URL] = Seq(new URL(dir.name)) + def asClassPathStrings: Seq[String] = Seq(dir.path) + + override def findClass(className: String): Option[ClassRepresentation] = findClassFile(className) map ClassFileEntryImpl.apply + + def findClassFile(className: String): Option[AbstractFile] = { + val relativePath = FileUtils.dirPath(className) + ".class" + Option(lookupPath(dir)(relativePath.split(java.io.File.separator).toIndexedSeq, directory = false)) + } + + private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = files(inPackage) + + protected def createFileEntry(file: AbstractFile): ClassFileEntryImpl = ClassFileEntryImpl(file) + protected def isMatchingFile(f: AbstractFile): Boolean = f.isClass +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/ZipAndJarFileLookupFactory.scala b/tests/pos-with-compiler-cc/dotc/classpath/ZipAndJarFileLookupFactory.scala new file mode 100644 index 000000000000..865f95551a0b --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/ZipAndJarFileLookupFactory.scala @@ -0,0 +1,205 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools.dotc +package classpath + +import scala.language.unsafeNulls + +import java.io.File +import java.net.URL +import java.nio.file.Files +import java.nio.file.attribute.{BasicFileAttributes, FileTime} + +import scala.annotation.tailrec +import dotty.tools.io.{AbstractFile, ClassPath, ClassRepresentation, FileZipArchive, ManifestResources} +import dotty.tools.dotc.core.Contexts._ +import FileUtils._ + +/** + * A trait providing an optional cache for classpath entries obtained from zip and jar files. + * It allows us to e.g. reduce significantly memory used by PresentationCompilers in Scala IDE + * when there are a lot of projects having a lot of common dependencies. + */ +sealed trait ZipAndJarFileLookupFactory { + private val cache = new FileBasedCache[ClassPath] + + def create(zipFile: AbstractFile)(using Context): ClassPath = + val release = Option(ctx.settings.javaOutputVersion.value).filter(_.nonEmpty) + if (ctx.settings.YdisableFlatCpCaching.value || zipFile.file == null) createForZipFile(zipFile, release) + else createUsingCache(zipFile, release) + + protected def createForZipFile(zipFile: AbstractFile, release: Option[String]): ClassPath + + private def createUsingCache(zipFile: AbstractFile, release: Option[String]): ClassPath = + cache.getOrCreate(zipFile.file.toPath, () => createForZipFile(zipFile, release)) +} + +/** + * Manages creation of classpath for class files placed in zip and jar files. + * It should be the only way of creating them as it provides caching. + */ +object ZipAndJarClassPathFactory extends ZipAndJarFileLookupFactory { + private case class ZipArchiveClassPath(zipFile: File, override val release: Option[String]) + extends ZipArchiveFileLookup[ClassFileEntryImpl] + with NoSourcePaths { + + override def findClassFile(className: String): Option[AbstractFile] = { + val (pkg, simpleClassName) = PackageNameUtils.separatePkgAndClassNames(className) + file(PackageName(pkg), simpleClassName + ".class").map(_.file) + } + + // This method is performance sensitive as it is used by SBT's ExtractDependencies phase. 
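// Illustrative sketch (not part of the patch): how findClassFile above maps a fully
// qualified class name onto a classpath entry. dirPath and separatePkgAndClassNames are
// the real helpers; the simplified names below are stand-ins for this sketch only.
object ClassNameToPathSketch:
  def dirPath(pkg: String): String = pkg.replace('.', java.io.File.separatorChar)

  def splitName(fqcn: String): (String, String) =
    fqcn.lastIndexOf('.') match
      case -1 => ("", fqcn)                        // class in the root package
      case i  => (fqcn.take(i), fqcn.drop(i + 1))
// e.g. splitName("scala.collection.Seq") == ("scala.collection", "Seq"), and the entry
// looked up in the directory or archive is dirPath("scala.collection") + "/Seq.class".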
+ override def findClass(className: String): Option[ClassRepresentation] = { + val (pkg, simpleClassName) = PackageNameUtils.separatePkgAndClassNames(className) + file(PackageName(pkg), simpleClassName + ".class") + } + + override private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = files(inPackage) + + override protected def createFileEntry(file: FileZipArchive#Entry): ClassFileEntryImpl = ClassFileEntryImpl(file) + override protected def isRequiredFileType(file: AbstractFile): Boolean = file.isClass + } + + /** + * This type of classpath is closely related to the support for JSR-223. + * Its usage can be observed e.g. when running: + * jrunscript -classpath scala-compiler.jar;scala-reflect.jar;scala-library.jar -l scala + * with a particularly prepared scala-library.jar. It should have all classes listed in the manifest like e.g. this entry: + * Name: scala/Function2$mcFJD$sp.class + */ + private case class ManifestResourcesClassPath(file: ManifestResources) extends ClassPath with NoSourcePaths { + override def findClassFile(className: String): Option[AbstractFile] = { + val (pkg, simpleClassName) = PackageNameUtils.separatePkgAndClassNames(className) + classes(PackageName(pkg)).find(_.name == simpleClassName).map(_.file) + } + + override def asClassPathStrings: Seq[String] = Seq(file.path) + + override def asURLs: Seq[URL] = file.toURLs() + + import ManifestResourcesClassPath.PackageFileInfo + import ManifestResourcesClassPath.PackageInfo + + /** + * A cache mapping package name to abstract file for package directory and subpackages of given package. + * + * ManifestResources can iterate through the collections of entries from e.g. remote jar file. + * We can't just specify the path to the concrete directory etc. so we can't just 'jump' into + * given package, when it's needed. On the other hand we can iterate over entries to get + * AbstractFiles, iterate over entries of these files etc. + * + * Instead of traversing a tree of AbstractFiles once and caching all entries or traversing each time, + * when we need subpackages of a given package or its classes, we traverse once and cache only packages. + * Classes for given package can be then easily loaded when they are needed. + */ + private lazy val cachedPackages: util.HashMap[String, PackageFileInfo] = { + val packages = util.HashMap[String, PackageFileInfo]() + + def getSubpackages(dir: AbstractFile): List[AbstractFile] = + (for (file <- dir if file.isPackage) yield file).toList + + @tailrec + def traverse(packagePrefix: String, + filesForPrefix: List[AbstractFile], + subpackagesQueue: collection.mutable.Queue[PackageInfo]): Unit = filesForPrefix match { + case pkgFile :: remainingFiles => + val subpackages = getSubpackages(pkgFile) + val fullPkgName = packagePrefix + pkgFile.name + packages(fullPkgName) = PackageFileInfo(pkgFile, subpackages) + val newPackagePrefix = fullPkgName + "." 
+ subpackagesQueue.enqueue(PackageInfo(newPackagePrefix, subpackages)) + traverse(packagePrefix, remainingFiles, subpackagesQueue) + case Nil if subpackagesQueue.nonEmpty => + val PackageInfo(packagePrefix, filesForPrefix) = subpackagesQueue.dequeue() + traverse(packagePrefix, filesForPrefix, subpackagesQueue) + case _ => + } + + val subpackages = getSubpackages(file) + packages(ClassPath.RootPackage) = PackageFileInfo(file, subpackages) + traverse(ClassPath.RootPackage, subpackages, collection.mutable.Queue()) + packages + } + + override private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = cachedPackages.get(inPackage.dottedString) match { + case None => Seq.empty + case Some(PackageFileInfo(_, subpackages)) => + subpackages.map(packageFile => PackageEntryImpl(inPackage.entryName(packageFile.name))) + } + + override private[dotty] def classes(inPackage: PackageName): Seq[ClassFileEntry] = cachedPackages.get(inPackage.dottedString) match { + case None => Seq.empty + case Some(PackageFileInfo(pkg, _)) => + (for (file <- pkg if file.isClass) yield ClassFileEntryImpl(file)).toSeq + } + + override private[dotty] def hasPackage(pkg: PackageName) = cachedPackages.contains(pkg.dottedString) + override private[dotty] def list(inPackage: PackageName): ClassPathEntries = ClassPathEntries(packages(inPackage), classes(inPackage)) + } + + private object ManifestResourcesClassPath { + case class PackageFileInfo(packageFile: AbstractFile, subpackages: Seq[AbstractFile]) + case class PackageInfo(packageName: String, subpackages: List[AbstractFile]) + } + + override protected def createForZipFile(zipFile: AbstractFile, release: Option[String]): ClassPath = + if (zipFile.file == null) createWithoutUnderlyingFile(zipFile) + else ZipArchiveClassPath(zipFile.file, release) + + private def createWithoutUnderlyingFile(zipFile: AbstractFile) = zipFile match { + case manifestRes: ManifestResources => + ManifestResourcesClassPath(manifestRes) + case _ => + val errorMsg = s"Abstract files which don't have an underlying file and are not ManifestResources are not supported. There was $zipFile" + throw new IllegalArgumentException(errorMsg) + } +} + +/** + * Manages creation of classpath for source files placed in zip and jar files. + * It should be the only way of creating them as it provides caching. 
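// Illustrative sketch of the queue-based package scan in `cachedPackages` above, with a
// hypothetical in-memory `Dir` standing in for AbstractFile: packages are recorded
// breadth-first, and classes are only resolved later, on demand.
import scala.annotation.tailrec
import scala.collection.mutable

final case class Dir(name: String, subdirs: List[Dir])

def allPackages(root: Dir): Map[String, Dir] =
  val found = mutable.Map.empty[String, Dir]
  @tailrec
  def walk(prefix: String, dirs: List[Dir], queue: mutable.Queue[(String, List[Dir])]): Unit =
    dirs match
      case d :: rest =>
        val full = prefix + d.name
        found(full) = d
        queue.enqueue((full + ".", d.subdirs))  // visit subpackages later, breadth-first
        walk(prefix, rest, queue)
      case Nil if queue.nonEmpty =>
        val (nextPrefix, nextDirs) = queue.dequeue()
        walk(nextPrefix, nextDirs, queue)
      case _ => ()
  walk("", root.subdirs, mutable.Queue.empty)
  found.toMap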
+ */ +object ZipAndJarSourcePathFactory extends ZipAndJarFileLookupFactory { + private case class ZipArchiveSourcePath(zipFile: File) + extends ZipArchiveFileLookup[SourceFileEntryImpl] + with NoClassPaths { + + def release: Option[String] = None + + override def asSourcePathString: String = asClassPathString + + override private[dotty] def sources(inPackage: PackageName): Seq[SourceFileEntry] = files(inPackage) + + override protected def createFileEntry(file: FileZipArchive#Entry): SourceFileEntryImpl = SourceFileEntryImpl(file) + override protected def isRequiredFileType(file: AbstractFile): Boolean = file.isScalaOrJavaSource + } + + override protected def createForZipFile(zipFile: AbstractFile, release: Option[String]): ClassPath = ZipArchiveSourcePath(zipFile.file) +} + +final class FileBasedCache[T] { + private case class Stamp(lastModified: FileTime, fileKey: Object) + private val cache = collection.mutable.Map.empty[java.nio.file.Path, (Stamp, T)] + + def getOrCreate(path: java.nio.file.Path, create: () => T): T = cache.synchronized { + val attrs = Files.readAttributes(path, classOf[BasicFileAttributes]) + val lastModified = attrs.lastModifiedTime() + // only null on some platforms, but that's okay, we just use the last modified timestamp as our stamp + val fileKey = attrs.fileKey() + val stamp = Stamp(lastModified, fileKey) + cache.get(path) match { + case Some((cachedStamp, cached)) if cachedStamp == stamp => cached + case _ => + val value = create() + cache.put(path, (stamp, value)) + value + } + } + + def clear(): Unit = cache.synchronized { + // TODO support closing + // cache.valuesIterator.foreach(_.close()) + cache.clear() + } +} diff --git a/tests/pos-with-compiler-cc/dotc/classpath/ZipArchiveFileLookup.scala b/tests/pos-with-compiler-cc/dotc/classpath/ZipArchiveFileLookup.scala new file mode 100644 index 000000000000..e241feee8244 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/classpath/ZipArchiveFileLookup.scala @@ -0,0 +1,72 @@ +/* + * Copyright (c) 2014 Contributor. All rights reserved. + */ +package dotty.tools.dotc.classpath + +import scala.language.unsafeNulls + +import java.io.File +import java.net.URL + +import dotty.tools.io.{ AbstractFile, FileZipArchive } +import FileUtils._ +import dotty.tools.io.{EfficientClassPath, ClassRepresentation} + +/** + * A trait allowing to look for classpath entries of given type in zip and jar files. + * It provides common logic for classes handling class and source files. + * It's aware of things like e.g. META-INF directory which is correctly skipped. 
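// Usage sketch for the FileBasedCache defined above (the jar path is hypothetical and
// must exist on disk, since getOrCreate reads its file attributes): as long as the
// file's (lastModifiedTime, fileKey) stamp is unchanged, the same value is reused.
def fileBasedCacheDemo(): Unit =
  val cache = new FileBasedCache[String]
  val jar = java.nio.file.Paths.get("example.jar")
  val first = cache.getOrCreate(jar, () => "computed once")
  val again = cache.getOrCreate(jar, () => "not recomputed while the stamp is unchanged")
  assert(first eq again)  // the second lookup hits the cache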
+ */ +trait ZipArchiveFileLookup[FileEntryType <: ClassRepresentation] extends EfficientClassPath { + val zipFile: File + def release: Option[String] + + assert(zipFile != null, "Zip file in ZipArchiveFileLookup cannot be null") + + override def asURLs: Seq[URL] = Seq(zipFile.toURI.toURL) + override def asClassPathStrings: Seq[String] = Seq(zipFile.getPath) + + private val archive = new FileZipArchive(zipFile.toPath, release) + + override private[dotty] def packages(inPackage: PackageName): Seq[PackageEntry] = { + for { + dirEntry <- findDirEntry(inPackage).toSeq + entry <- dirEntry.iterator if entry.isPackage + } + yield PackageEntryImpl(inPackage.entryName(entry.name)) + } + + protected def files(inPackage: PackageName): Seq[FileEntryType] = + for { + dirEntry <- findDirEntry(inPackage).toSeq + entry <- dirEntry.iterator if isRequiredFileType(entry) + } + yield createFileEntry(entry) + + protected def file(inPackage: PackageName, name: String): Option[FileEntryType] = + for { + dirEntry <- findDirEntry(inPackage) + entry <- Option(dirEntry.lookupName(name, directory = false)) + if isRequiredFileType(entry) + } + yield createFileEntry(entry) + + override def hasPackage(pkg: PackageName) = findDirEntry(pkg).isDefined + def list(inPackage: PackageName, onPackageEntry: PackageEntry => Unit, onClassesAndSources: ClassRepresentation => Unit): Unit = + findDirEntry(inPackage) match { + case Some(dirEntry) => + for (entry <- dirEntry.iterator) { + if (entry.isPackage) + onPackageEntry(PackageEntryImpl(inPackage.entryName(entry.name))) + else if (isRequiredFileType(entry)) + onClassesAndSources(createFileEntry(entry)) + } + case None => + } + + private def findDirEntry(pkg: PackageName): Option[archive.DirEntry] = + archive.allDirs.get(pkg.dirPathTrailingSlashJar) + + protected def createFileEntry(file: FileZipArchive#Entry): FileEntryType + protected def isRequiredFileType(file: AbstractFile): Boolean +} diff --git a/tests/pos-with-compiler-cc/dotc/config/CliCommand.scala b/tests/pos-with-compiler-cc/dotc/config/CliCommand.scala new file mode 100644 index 000000000000..68c900e405da --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/CliCommand.scala @@ -0,0 +1,198 @@ +package dotty.tools.dotc +package config + +import scala.language.unsafeNulls + +import Settings._ +import core.Contexts._ +import printing.Highlighting + +import scala.util.chaining.given +import scala.PartialFunction.cond + +trait CliCommand: + + type ConcreteSettings <: CommonScalaSettings with Settings.SettingGroup + + def versionMsg: String + + def ifErrorsMsg: String + + /** The name of the command */ + def cmdName: String + + def isHelpFlag(using settings: ConcreteSettings)(using SettingsState): Boolean + + def helpMsg(using settings: ConcreteSettings)(using SettingsState, Context): String + + private def explainAdvanced = """ + |-- Notes on option parsing -- + |Boolean settings are always false unless set. + |Where multiple values are accepted, they should be comma-separated. + | example: -Xplugin:plugin1,plugin2 + | means one or a comma-separated list of: + | - (partial) phase names with an optional "+" suffix to include the next phase + | - the string "all" + | example: -Xprint:all prints all phases. + | example: -Xprint:typer,mixin prints the typer and mixin phases. + | example: -Ylog:erasure+ logs the erasure phase and the phase after the erasure phase. + | This is useful because during the tree transform of phase X, we often + | already are in phase X + 1. 
+ """ + + /** Distill arguments into summary detailing settings, errors and files to main */ + def distill(args: Array[String], sg: Settings.SettingGroup)(ss: SettingsState = sg.defaultState)(using Context): ArgsSummary = + + // expand out @filename to the contents of that filename + def expandedArguments = args.toList flatMap { + case x if x startsWith "@" => CommandLineParser.expandArg(x) + case x => List(x) + } + + sg.processArguments(expandedArguments, processAll = true, settingsState = ss) + end distill + + /** Creates a help message for a subset of options based on cond */ + protected def availableOptionsMsg(p: Setting[?] => Boolean)(using settings: ConcreteSettings)(using SettingsState): String = + // result is (Option Name, descrption\ndefault: value\nchoices: x, y, z + def help(s: Setting[?]): (String, String) = + // For now, skip the default values that do not make sense for the end user, such as 'false' for the version command. + def defaultValue = s.default match + case _: Int | _: String => s.default.toString + case _ => "" + val info = List(shortHelp(s), if defaultValue.nonEmpty then s"Default $defaultValue" else "", if s.legalChoices.nonEmpty then s"Choices ${s.legalChoices}" else "") + (s.name, info.filter(_.nonEmpty).mkString("\n")) + end help + + val ss = settings.allSettings.filter(p).toList.sortBy(_.name) + val formatter = Columnator("", "", maxField = 30) + val fresh = ContextBase().initialCtx.fresh.setSettings(summon[SettingsState]) + formatter(List(ss.map(help) :+ ("@", "A text file containing compiler arguments (options and source files).")))(using fresh) + end availableOptionsMsg + + protected def shortUsage: String = s"Usage: $cmdName " + + protected def createUsageMsg(label: String, shouldExplain: Boolean, cond: Setting[?] => Boolean)(using settings: ConcreteSettings)(using SettingsState): String = + val prefix = List( + Some(shortUsage), + Some(explainAdvanced).filter(_ => shouldExplain), + Some(label + " options include:") + ).flatten.mkString("\n") + + prefix + "\n" + availableOptionsMsg(cond) + + protected def isStandard(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + !isVerbose(s) && !isWarning(s) && !isAdvanced(s) && !isPrivate(s) || s.name == "-Werror" || s.name == "-Wconf" + protected def isVerbose(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + s.name.startsWith("-V") && s.name != "-V" + protected def isWarning(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + s.name.startsWith("-W") && s.name != "-W" || s.name == "-Xlint" + protected def isAdvanced(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + s.name.startsWith("-X") && s.name != "-X" + protected def isPrivate(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + s.name.startsWith("-Y") && s.name != "-Y" + protected def shortHelp(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): String = + s.description.linesIterator.next() + protected def isHelping(s: Setting[?])(using settings: ConcreteSettings)(using SettingsState): Boolean = + cond(s.value) { + case ss: List[?] 
if s.isMultivalue => ss.contains("help") + case s: String => "help" == s + } + + /** Messages explaining usage and options */ + protected def usageMessage(using settings: ConcreteSettings)(using SettingsState) = + createUsageMsg("where possible standard", shouldExplain = false, isStandard) + protected def vusageMessage(using settings: ConcreteSettings)(using SettingsState) = + createUsageMsg("Possible verbose", shouldExplain = true, isVerbose) + protected def wusageMessage(using settings: ConcreteSettings)(using SettingsState) = + createUsageMsg("Possible warning", shouldExplain = true, isWarning) + protected def xusageMessage(using settings: ConcreteSettings)(using SettingsState) = + createUsageMsg("Possible advanced", shouldExplain = true, isAdvanced) + protected def yusageMessage(using settings: ConcreteSettings)(using SettingsState) = + createUsageMsg("Possible private", shouldExplain = true, isPrivate) + + /** Used for the formatted output of -Xshow-phases */ + protected def phasesMessage(using Context): String = + val phases = new Compiler().phases + val formatter = Columnator("phase name", "description", maxField = 25) + formatter(phases.map(mega => mega.map(p => (p.phaseName, p.description)))) + + /** Provide usage feedback on argument summary, assuming that all settings + * are already applied in context. + * @return Either Some list of files passed as arguments or None if further processing should be interrupted. + */ + def checkUsage(summary: ArgsSummary, sourcesRequired: Boolean)(using settings: ConcreteSettings)(using SettingsState, Context): Option[List[String]] = + // Print all warnings encountered during arguments parsing + summary.warnings.foreach(report.warning(_)) + + if summary.errors.nonEmpty then + summary.errors foreach (report.error(_)) + report.echo(ifErrorsMsg) + None + else if settings.version.value then + report.echo(versionMsg) + None + else if isHelpFlag then + report.echo(helpMsg) + None + else if (sourcesRequired && summary.arguments.isEmpty) + report.echo(usageMessage) + None + else + Some(summary.arguments) + + extension [T](setting: Setting[T]) + protected def value(using ss: SettingsState): T = setting.valueIn(ss) + + extension (s: String) + def padLeft(width: Int): String = String.format(s"%${width}s", s) + + // Formatting for -help and -Vphases in two columns, handling long field1 and wrapping long field2 + class Columnator(heading1: String, heading2: String, maxField: Int, separation: Int = 2): + def apply(texts: List[List[(String, String)]])(using Context): String = StringBuilder().tap(columnate(_, texts)).toString + + private def columnate(sb: StringBuilder, texts: List[List[(String, String)]])(using Context): Unit = + import Highlighting.* + val colors = Seq(Green(_), Yellow(_), Magenta(_), Cyan(_), Red(_)) + val nocolor = texts.length == 1 + def color(index: Int): String => Highlight = if nocolor then NoColor(_) else colors(index % colors.length) + val maxCol = ctx.settings.pageWidth.value + val field1 = maxField.min(texts.flatten.map(_._1.length).filter(_ < maxField).max) // widest field under maxField + val field2 = if field1 + separation + maxField < maxCol then maxCol - field1 - separation else 0 // skinny window -> terminal wrap + val separator = " " * separation + val EOL = "\n" + def formatField1(text: String): String = if text.length <= field1 then text.padLeft(field1) else text + EOL + "".padLeft(field1) + def formatField2(text: String): String = + def loopOverField2(fld: String): List[String] = + if field2 == 0 || fld.length <= field2 
then List(fld) + else + fld.lastIndexOf(" ", field2) match + case -1 => List(fld) + case i => val (prefix, rest) = fld.splitAt(i) ; prefix :: loopOverField2(rest.trim) + text.split("\n").toList.flatMap(loopOverField2).filter(_.nonEmpty).mkString(EOL + "".padLeft(field1) + separator) + end formatField2 + def format(first: String, second: String, index: Int, colorPicker: Int => String => Highlight) = + sb.append(colorPicker(index)(formatField1(first)).show) + .append(separator) + .append(formatField2(second)) + .append(EOL): Unit + def fancy(first: String, second: String, index: Int) = format(first, second, index, color) + def plain(first: String, second: String) = format(first, second, 0, _ => NoColor(_)) + + if heading1.nonEmpty then + plain(heading1, heading2) + plain("-" * heading1.length, "-" * heading2.length) + + def emit(index: Int)(textPair: (String, String)): Unit = fancy(textPair._1, textPair._2, index) + def group(index: Int)(body: Int => Unit): Unit = + if !ctx.useColors then plain(s"{", "") + body(index) + if !ctx.useColors then plain(s"}", "") + + texts.zipWithIndex.foreach { (text, index) => + text match + case List(single) => emit(index)(single) + case Nil => + case mega => group(index)(i => mega.foreach(emit(i))) + } + end Columnator diff --git a/tests/pos-with-compiler-cc/dotc/config/CommandLineParser.scala b/tests/pos-with-compiler-cc/dotc/config/CommandLineParser.scala new file mode 100644 index 000000000000..2e76561c9913 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/CommandLineParser.scala @@ -0,0 +1,125 @@ +package dotty.tools.dotc.config + +import java.lang.Character.isWhitespace +import java.nio.file.{Files, Paths} +import scala.annotation.tailrec +import scala.collection.mutable.ArrayBuffer +import scala.jdk.CollectionConverters.* + +/** Split a line of text using shell conventions. + */ +object CommandLineParser: + inline private val DQ = '"' + inline private val SQ = '\'' + inline private val EOF = -1 + + /** Split the line into tokens separated by whitespace. + * + * Single or double quotes can be embedded to preserve internal whitespace: + * + * `""" echo "hello, world!" """` => "echo" :: "hello, world!" :: Nil + * `""" echo hello,' 'world! """` => "echo" :: "hello, world!" :: Nil + * `""" echo \"hello, world!\" """` => "echo" :: "\"hello," :: "world!\"" :: Nil + * + * The embedded quotes are stripped. Escaping backslash is not stripped. + * + * Invoke `errorFn` with a descriptive message if an end quote is missing. + */ + def tokenize(line: String, errorFn: String => Unit): List[String] = + + var accum: List[String] = Nil + + var pos = 0 + var start = 0 + val qpos = new ArrayBuffer[Int](16) // positions of paired quotes in current token + + inline def cur = if done then EOF else line.charAt(pos): Int + inline def bump() = pos += 1 + inline def done = pos >= line.length + + // Skip to the given unescaped end quote; false on no more input. + def skipToEndQuote(q: Int): Boolean = + var escaped = false + def terminal = cur match + case _ if escaped => escaped = false ; false + case '\\' => escaped = true ; false + case `q` | EOF => true + case _ => false + while !terminal do bump() + !done + + // Skip to the next whitespace word boundary; record unescaped embedded quotes; false on missing quote. 
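// Usage sketch exercising the quoting rules documented for `tokenize` above; the error
// handler is only invoked on an unmatched quote.
import dotty.tools.dotc.config.CommandLineParser

def tokenizeDemo(): Unit =
  def fail(msg: String): Unit = sys.error(msg)
  assert(CommandLineParser.tokenize(""" echo "hello, world!" """, fail) ==
    List("echo", "hello, world!"))
  assert(CommandLineParser.tokenize(""" echo hello,' 'world! """, fail) ==
    List("echo", "hello, world!"))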
+ def skipToDelim(): Boolean = + var escaped = false + inline def quote() = { qpos += pos ; bump() } + @tailrec def advance(): Boolean = cur match + case _ if escaped => escaped = false ; bump() ; advance() + case '\\' => escaped = true ; bump() ; advance() + case q @ (DQ | SQ) => { quote() ; skipToEndQuote(q) } && { quote() ; advance() } + case EOF => true + case c if isWhitespace(c) => true + case _ => bump(); advance() + advance() + + def copyText(): String = + val buf = new java.lang.StringBuilder + var p = start + var i = 0 + while p < pos do + if i >= qpos.size then + buf.append(line, p, pos) + p = pos + else if p == qpos(i) then + buf.append(line, qpos(i)+1, qpos(i+1)) + p = qpos(i+1)+1 + i += 2 + else + buf.append(line, p, qpos(i)) + p = qpos(i) + buf.toString + + // the current token, stripped of any embedded quotes. + def text(): String = + val res = + if qpos.isEmpty then line.substring(start, pos) + else if qpos(0) == start && qpos(1) == pos then line.substring(start+1, pos-1) + else copyText() + qpos.clear() + res.nn + + inline def badquote() = errorFn(s"Unmatched quote [${qpos.last}](${line.charAt(qpos.last)})") + + inline def skipWhitespace() = while isWhitespace(cur) do bump() + + @tailrec def loop(): List[String] = + skipWhitespace() + start = pos + if done then + accum.reverse + else if !skipToDelim() then + badquote() + Nil + else + accum ::= text() + loop() + end loop + + loop() + end tokenize + + def tokenize(line: String): List[String] = tokenize(line, x => throw new ParseException(x)) + + /** Expands all arguments starting with @ to the contents of the file named like each argument. + */ + def expandArg(arg: String): List[String] = + val path = Paths.get(arg.stripPrefix("@")) + if !Files.exists(path) then + System.err.nn.println(s"Argument file ${path.nn.getFileName} could not be found") + Nil + else + def stripComment(s: String) = s.indexOf('#') match { case -1 => s case i => s.substring(0, i) } + val lines = Files.readAllLines(path).nn + val params = lines.asScala.map(stripComment).filter(!_.nn.isEmpty).mkString(" ") + tokenize(params) + + class ParseException(msg: String) extends RuntimeException(msg) diff --git a/tests/pos-with-compiler-cc/dotc/config/CompilerCommand.scala b/tests/pos-with-compiler-cc/dotc/config/CompilerCommand.scala new file mode 100644 index 000000000000..41e123472a75 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/CompilerCommand.scala @@ -0,0 +1,26 @@ +package dotty.tools.dotc +package config + +import Settings._ +import core.Contexts._ + +abstract class CompilerCommand extends CliCommand: + type ConcreteSettings = ScalaSettings + + final def helpMsg(using settings: ScalaSettings)(using SettingsState, Context): String = + settings.allSettings.find(isHelping) match + case Some(s) => s.description + case _ => + if (settings.help.value) usageMessage + else if (settings.Vhelp.value) vusageMessage + else if (settings.Whelp.value) wusageMessage + else if (settings.Xhelp.value) xusageMessage + else if (settings.Yhelp.value) yusageMessage + else if (settings.showPlugins.value) ctx.base.pluginDescriptions + else if (settings.XshowPhases.value) phasesMessage + else "" + + final def isHelpFlag(using settings: ScalaSettings)(using SettingsState): Boolean = + import settings._ + val flags = Set(help, Vhelp, Whelp, Xhelp, Yhelp, showPlugins, XshowPhases) + flags.exists(_.value) || allSettings.exists(isHelping) diff --git a/tests/pos-with-compiler-cc/dotc/config/Config.scala b/tests/pos-with-compiler-cc/dotc/config/Config.scala new file mode 
100644 index 000000000000..cbd50429492e --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Config.scala @@ -0,0 +1,256 @@ +package dotty.tools.dotc.config + +object Config { + + inline val cacheMembersNamed = true + inline val cacheAsSeenFrom = true + inline val cacheMemberNames = true + inline val cacheImplicitScopes = true + inline val cacheMatchReduced = true + + /** If true, the `runWithOwner` operation uses a re-usable context, + * similar to explore. This requires that the context does not escape + * the call. If false, `runWithOwner` runs its operation argument + * in a fresh context. + */ + inline val reuseOwnerContexts = true + + inline val checkCacheMembersNamed = false + + /** When updating a constraint bound, check that the constrained parameter + * does not appear at the top-level of either of its bounds. + */ + inline val checkConstraintsNonCyclic = false + + /** Check that reverse dependencies in constraints are correct and complete. + * Can also be enabled using -Ycheck-constraint-deps. + */ + inline val checkConstraintDeps = false + + /** Check that each constraint resulting from a subtype test + * is satisfiable. Also check that a type variable instantiation + * satisfies its constraints. + * Note that this can fail when bad bounds are in scope, like in + * tests/neg/i4721a.scala. + */ + inline val checkConstraintsSatisfiable = false + + /** Check that each constraint is fully propagated. i.e. + * If P <: Q then the upper bound of P is a subtype of the upper bound of Q + * and the lower bound of Q is a subtype of the lower bound of P. + */ + inline val checkConstraintsPropagated = false + + /** Check that constraint bounds do not contain wildcard types */ + inline val checkNoWildcardsInConstraint = false + + /** If a constraint is over a type lambda `tl` and `tvar` is one of + * the type variables associated with `tl` in the constraint, check + * that the origin of `tvar` is a parameter of `tl`. + */ + inline val checkConsistentVars = false + + /** Check that constraints of globally committable typer states are closed. + * NOTE: When enabled, the check can cause CyclicReference errors because + * it traverses all elements of a type. Such failures were observed when + * compiling all of dotty together (source seems to be in GenBCode which + * accesses javac's settings.) + * + * It is recommended to turn this option on only when chasing down + * a TypeParamRef instantiation error. See comment in Types.TypeVar.instantiate. + */ + inline val debugCheckConstraintsClosed = false + + /** Check that no type appearing as the info of a SymDenotation contains + * skolem types. + */ + inline val checkNoSkolemsInInfo = false + + /** Check that Name#toString is not called directly from backend by analyzing + * the stack trace of each toString call on names. This is very expensive, + * so not suitable for continuous testing. But it can be used to find a problem + * when running a specific test. + */ + inline val checkBackendNames = false + + /** Check that re-used type comparers are in their initialization state */ + inline val checkTypeComparerReset = false + + /** Type comparer will fail with an assert if the upper bound + * of a constrained parameter becomes Nothing. This should be turned + * on only for specific debugging as normally instantiation to Nothing + * is not an error condition. + */ + inline val failOnInstantiationToNothing = false + + /** Enable noDoubleDef checking if option "-YnoDoubleDefs" is set. 
+ * The reason to have an option as well as the present global switch is + * that the noDoubleDef checking is done in a hotspot, and we do not + * want to incur the overhead of checking an option each time. + */ + inline val checkNoDoubleBindings = true + + /** Check positions for consistency after parsing */ + inline val checkPositions = true + + /** Check that typed trees don't point to untyped ones */ + inline val checkTreesConsistent = false + + /** Show subtype traces for all deep subtype recursions */ + inline val traceDeepSubTypeRecursions = false + + /** When explaining subtypes and this flag is set, also show the classes of the compared types. */ + inline val verboseExplainSubtype = false + + /** If this flag is set, take the fast path when comparing same-named type-aliases and types */ + inline val fastPathForRefinedSubtype = true + + /** If this flag is set, and we compute `T1[X1]` & `T2[X2]` as a new + * upper bound of a constrained parameter, try to align the arguments by computing + * `S1 =:= S2` (which might instantiate type parameters). + * This rule is contentious because it cuts the constraint set. + * + * For more info, see the comment in `TypeComparer#glbArgs`. + */ + inline val alignArgsInAnd = true + + /** If this flag is set, higher-kinded applications are checked for validity + */ + inline val checkHKApplications = false + + /** If this flag is set, method types are checked for valid parameter references + */ + inline val checkMethodTypes = false + + /** If this flag is set, it is checked that TypeRefs don't refer directly + * to themselves. + */ + inline val checkTypeRefCycles = false + + /** If this flag is set, we check that types assigned to trees are error types only + * if some error was already reported. There are complicicated scenarios where this + * is not true. An example is TestNonCyclic in posTwice. If we remove the + * first (unused) import `import dotty.tools.dotc.core.Types.Type` in `CompilationUnit`, + * we end up assigning a CyclicReference error type to an import expression `annotation` + * before the cyclic reference is reported. What happens is that the error was reported + * as a result of a completion in a not-yet committed typerstate. So we cannot enforce + * this in all circumstances. But since it is almost always true it is useful to + * keep the Config option for debugging. + */ + inline val checkUnreportedErrors = false + + /** If this flag is set, it is checked that class type parameters are + * only references with NoPrefix or ThisTypes as prefixes. This option + * is usually disabled, because there are still some legitimate cases where + * this can arise (e.g. for pos/Map.scala, in LambdaType.integrate). + */ + inline val checkTypeParamRefs = false + + /** The recursion depth for showing a summarized string */ + inline val summarizeDepth = 2 + + /** Check that variances of lambda arguments match the + * variance of the underlying lambda class. + */ + inline val checkLambdaVariance = false + + /** Check that certain types cannot be created in erasedTypes phases. + * Note: Turning this option on will get some false negatives, since it is + * possible that And/Or types are still created during erasure as the result + * of some operation on an existing type. + */ + inline val checkUnerased = false + + /** Check that atoms-based comparisons match regular comparisons that do not + * take atoms into account. The two have to give the same results, since + * atoms comparison is intended to be just an optimization. 
+ */ + inline val checkAtomsComparisons = false + + /** In `derivedSelect`, rewrite + * + * (S & T)#A --> S#A & T#A + * (S | T)#A --> S#A | T#A + * + * Not sure whether this is useful. Preliminary measurements show a slowdown of about + * 7% for the build when this option is enabled. + */ + inline val splitProjections = false + + /** If this flag is on, always rewrite an application `S[Ts]` where `S` is an alias for + * `[Xs] -> U` to `[Xs := Ts]U`. + * Turning this flag on was observed to give a ~6% speedup on the JUnit test suite. + */ + inline val simplifyApplications = true + + /** Assume -indent by default */ + inline val defaultIndent = true + + /** If set, prints a trace of all symbol completions */ + inline val showCompletions = false + + /** If set, show variable/variable reverse dependencies when printing constraints. */ + inline val showConstraintDeps = true + + /** If set, method results that are context functions are flattened by adding + * the parameters of the context function results to the methods themselves. + * This is an optimization that reduces closure allocations. + */ + inline val flattenContextFunctionResults = true + + /** If set, enables tracing */ + inline val tracingEnabled = false + + /** Initial capacity of the uniques HashMap. + * Note: This should be a power of two to work with util.HashSet + */ + inline val initialUniquesCapacity = 0x8000 + + /** How many recursive calls to NamedType#underlying are performed before logging starts. */ + inline val LogPendingUnderlyingThreshold = 50 + + /** How many recursive calls to isSubType are performed before logging starts. */ + inline val LogPendingSubTypesThreshold = 50 + + /** How many recursive calls to findMember are performed before logging names starts + * Note: this threshold has to be chosen carefully. Too large, and programs + * like tests/pos/IterableSelfRec go into polynomial (or even exponential?) + * compile time slowdown. Too small and normal programs will cause the compiler to + * do inefficient operations on findMember. The current value is determined + * so that (1) IterableSelfRec still compiles in reasonable time (< 10sec) (2) Compiling + * dotty itself only causes small pending names lists to be generated (we measured + * at max 6 elements) and these lists are never searched with contains. + */ + inline val LogPendingFindMemberThreshold = 9 + + /** When in IDE, turn StaleSymbol errors into warnings instead of crashing */ + inline val ignoreStaleInIDE = true + + /** If true, `Denotation#asSeenFrom` is allowed to return an existing + * `SymDenotation` instead of allocating a new `SingleDenotation` if + * the two would only differ in their `prefix` (SymDenotation always + * have `NoPrefix` as their prefix). + * This is done for performance reasons: when compiling Dotty itself this + * reduces the number of allocated denotations by ~50%. + */ + inline val reuseSymDenotations = true + + /** If `checkLevelsOnConstraints` is true, check levels of type variables + * and create fresh ones as needed when bounds are first entered intot he constraint. + * If `checkLevelsOnInstantiation` is true, allow level-incorrect constraints but + * fix levels on type variable instantiation. + */ + inline val checkLevelsOnConstraints = false + inline val checkLevelsOnInstantiation = true + + /** If true, print capturing types in the form `{c} T`. + * If false, print them in the form `T @retains(c)`. 
+ */ + inline val printCaptureSetsAsPrefix = true + + /** If true, allow mappping capture set variables under captureChecking with maps that are neither + * bijective nor idempotent. We currently do now know how to do this correctly in all + * cases, though. + */ + inline val ccAllowUnsoundMaps = false +} diff --git a/tests/pos-with-compiler-cc/dotc/config/Feature.scala b/tests/pos-with-compiler-cc/dotc/config/Feature.scala new file mode 100644 index 000000000000..1637c9268e30 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Feature.scala @@ -0,0 +1,173 @@ +package dotty.tools +package dotc +package config + +import core._ +import Contexts._, Symbols._, Names._ +import StdNames.nme +import Decorators.* +import util.{SrcPos, NoSourcePosition} +import SourceVersion._ +import reporting.Message +import NameKinds.QualifiedName +import language.experimental.pureFunctions + +object Feature: + + def experimental(str: PreName): TermName = + QualifiedName(nme.experimental, str.toTermName) + + private def deprecated(str: PreName): TermName = + QualifiedName(nme.deprecated, str.toTermName) + + private val namedTypeArguments = experimental("namedTypeArguments") + private val genericNumberLiterals = experimental("genericNumberLiterals") + val scala2macros = experimental("macros") + + val dependent = experimental("dependent") + val erasedDefinitions = experimental("erasedDefinitions") + val symbolLiterals = deprecated("symbolLiterals") + val fewerBraces = experimental("fewerBraces") + val saferExceptions = experimental("saferExceptions") + val pureFunctions = experimental("pureFunctions") + val captureChecking = experimental("captureChecking") + val into = experimental("into") + + val globalOnlyImports: Set[TermName] = Set(pureFunctions, captureChecking) + + /** Is `feature` enabled by by a command-line setting? The enabling setting is + * + * -language:feature + * + * where is the fully qualified name of `owner`, followed by a ".", + * but subtracting the prefix `scala.language.` at the front. + */ + def enabledBySetting(feature: TermName)(using Context): Boolean = + ctx.base.settings.language.value.contains(feature.toString) + + /** Is `feature` enabled by by an import? This is the case if the feature + * is imported by a named import + * + * import owner.feature + * + * and there is no visible nested import that excludes the feature, as in + * + * import owner.{ feature => _ } + */ + def enabledByImport(feature: TermName)(using Context): Boolean = + //atPhase(typerPhase) { + val info = ctx.importInfo + info != null && info.featureImported(feature) + //} + + /** Is `feature` enabled by either a command line setting or an import? + * @param feature The name of the feature + * @param owner The prefix symbol (nested in `scala.language`) where the + * feature is defined. + */ + def enabled(feature: TermName)(using Context): Boolean = + enabledBySetting(feature) || enabledByImport(feature) + + /** Is auto-tupling enabled? */ + def autoTuplingEnabled(using Context): Boolean = !enabled(nme.noAutoTupling) + + def dynamicsEnabled(using Context): Boolean = enabled(nme.dynamics) + + def dependentEnabled(using Context) = enabled(dependent) + + def namedTypeArgsEnabled(using Context) = enabled(namedTypeArguments) + + def genericNumberLiteralsEnabled(using Context) = enabled(genericNumberLiterals) + + def scala2ExperimentalMacroEnabled(using Context) = enabled(scala2macros) + + /** Is pureFunctions enabled for this compilation unit? 
*/ + def pureFunsEnabled(using Context) = + enabledBySetting(pureFunctions) + || ctx.compilationUnit.knowsPureFuns + || ccEnabled + + /** Is captureChecking enabled for this compilation unit? */ + def ccEnabled(using Context) = + enabledBySetting(captureChecking) + || ctx.compilationUnit.needsCaptureChecking + + /** Is pureFunctions enabled for any of the currently compiled compilation units? */ + def pureFunsEnabledSomewhere(using Context) = + enabledBySetting(pureFunctions) + || ctx.run != null && ctx.run.nn.pureFunsImportEncountered + || ccEnabledSomewhere + + /** Is captureChecking enabled for any of the currently compiled compilation units? */ + def ccEnabledSomewhere(using Context) = + enabledBySetting(captureChecking) + || ctx.run != null && ctx.run.nn.ccImportEncountered + + def sourceVersionSetting(using Context): SourceVersion = + SourceVersion.valueOf(ctx.settings.source.value) + + def sourceVersion(using Context): SourceVersion = + ctx.compilationUnit.sourceVersion match + case Some(v) => v + case none => sourceVersionSetting + + def migrateTo3(using Context): Boolean = + sourceVersion == `3.0-migration` + + def fewerBracesEnabled(using Context) = + sourceVersion.isAtLeast(`3.3`) || enabled(fewerBraces) + + /** If current source migrates to `version`, issue given warning message + * and return `true`, otherwise return `false`. + */ + def warnOnMigration(msg: Message, pos: SrcPos, version: SourceVersion)(using Context): Boolean = + if sourceVersion.isMigrating && sourceVersion.stable == version + || (version == `3.0` || version == `3.1`) && migrateTo3 + then + report.migrationWarning(msg, pos) + true + else + false + + def checkExperimentalFeature(which: String, srcPos: SrcPos, note: -> String = "")(using Context) = + if !isExperimentalEnabled then + report.error(em"Experimental $which may only be used with a nightly or snapshot version of the compiler$note", srcPos) + + def checkExperimentalDef(sym: Symbol, srcPos: SrcPos)(using Context) = + if !isExperimentalEnabled then + val symMsg = + if sym.hasAnnotation(defn.ExperimentalAnnot) then + i"$sym is marked @experimental" + else if sym.owner.hasAnnotation(defn.ExperimentalAnnot) then + i"${sym.owner} is marked @experimental" + else + i"$sym inherits @experimental" + report.error(em"$symMsg and therefore may only be used in an experimental scope.", srcPos) + + /** Check that experimental compiler options are only set for snapshot or nightly compiler versions. */ + def checkExperimentalSettings(using Context): Unit = + for setting <- ctx.settings.language.value + if setting.startsWith("experimental.") && setting != "experimental.macros" + do checkExperimentalFeature(s"feature $setting", NoSourcePosition) + + def isExperimentalEnabled(using Context): Boolean = + Properties.experimental && !ctx.settings.YnoExperimental.value + + /** Handle language import `import language..` if it is one + * of the global imports `pureFunctions` or `captureChecking`. In this case + * make the compilation unit's and current run's fields accordingly. 
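// Simplified model (plain strings instead of TermName/Context; illustrative only) of the
// two ways a feature is switched on above: a -language:<feature> setting, or a named
// `import scala.language...` recorded for the compilation unit.
def featureEnabledSketch(languageSettings: List[String], importedFeatures: Set[String])(feature: String): Boolean =
  languageSettings.contains(feature) || importedFeatures.contains(feature)
// e.g. featureEnabledSketch(List("experimental.macros"), Set.empty)("experimental.macros") == true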
+ * @return true iff import that was handled + */ + def handleGlobalLanguageImport(prefix: TermName, imported: Name)(using Context): Boolean = + val fullFeatureName = QualifiedName(prefix, imported.asTermName) + if fullFeatureName == pureFunctions then + ctx.compilationUnit.knowsPureFuns = true + if ctx.run != null then ctx.run.nn.pureFunsImportEncountered = true + true + else if fullFeatureName == captureChecking then + ctx.compilationUnit.needsCaptureChecking = true + if ctx.run != null then ctx.run.nn.ccImportEncountered = true + true + else + false +end Feature diff --git a/tests/pos-with-compiler-cc/dotc/config/JavaPlatform.scala b/tests/pos-with-compiler-cc/dotc/config/JavaPlatform.scala new file mode 100644 index 000000000000..2b2f35e49451 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/JavaPlatform.scala @@ -0,0 +1,69 @@ +package dotty.tools +package dotc +package config + +import io._ +import classpath.AggregateClassPath +import core._ +import Symbols._, Types._, Contexts._, StdNames._ +import Flags._ +import transform.ExplicitOuter, transform.SymUtils._ + +class JavaPlatform extends Platform { + + private var currentClassPath: Option[ClassPath] = None + + def classPath(using Context): ClassPath = { + if (currentClassPath.isEmpty) + currentClassPath = Some(new PathResolver().result) + val cp = currentClassPath.get + cp + } + + // The given symbol is a method with the right name and signature to be a runnable java program. + def isMainMethod(sym: Symbol)(using Context): Boolean = + (sym.name == nme.main) && (sym.info match { + case MethodTpe(_, defn.ArrayOf(el) :: Nil, restpe) => el =:= defn.StringType && (restpe isRef defn.UnitClass) + case _ => false + }) + + /** Update classpath with a substituted subentry */ + def updateClassPath(subst: Map[ClassPath, ClassPath]): Unit = currentClassPath.get match { + case AggregateClassPath(entries) => + currentClassPath = Some(AggregateClassPath(entries map (e => subst.getOrElse(e, e)))) + case cp: ClassPath => + currentClassPath = Some(subst.getOrElse(cp, cp)) + } + + def rootLoader(root: TermSymbol)(using Context): SymbolLoader = new SymbolLoaders.PackageLoader(root, classPath) + + /** Is the SAMType `cls` also a SAM under the rules of the JVM? */ + def isSam(cls: ClassSymbol)(using Context): Boolean = + cls.isAllOf(NoInitsTrait) && + cls.superClass == defn.ObjectClass && + cls.directlyInheritedTraits.forall(_.is(NoInits)) && + !ExplicitOuter.needsOuterIfReferenced(cls) && + cls.typeRef.fields.isEmpty // Superaccessors already show up as abstract methods here, so no test necessary + + /** We could get away with excluding BoxedBooleanClass for the + * purpose of equality testing since it need not compare equal + * to anything but other booleans, but it should be present in + * case this is put to other uses. 
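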
+ */ + def isMaybeBoxed(sym: ClassSymbol)(using Context): Boolean = { + val d = defn + import d._ + (sym == ObjectClass) || + (sym == JavaSerializableClass) || + (sym == ComparableClass) || + (sym derivesFrom BoxedNumberClass) || + (sym derivesFrom BoxedCharClass) || + (sym derivesFrom BoxedBooleanClass) + } + + def shouldReceiveJavaSerializationMethods(sym: ClassSymbol)(using Context): Boolean = + true + + def newClassLoader(bin: AbstractFile)(using Context): SymbolLoader = + new ClassfileLoader(bin) +} diff --git a/tests/pos-with-compiler-cc/dotc/config/OutputDirs.scala b/tests/pos-with-compiler-cc/dotc/config/OutputDirs.scala new file mode 100644 index 000000000000..0411c5604768 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/OutputDirs.scala @@ -0,0 +1,117 @@ +package dotty.tools +package dotc +package config + +import scala.language.unsafeNulls + +import io._ + +/** A class for holding mappings from source directories to + * their output location. This functionality can be accessed + * only programmatically. The command line compiler uses a + * single output location, but tools may use this functionality + * to set output location per source directory. + */ +class OutputDirs { + /** Pairs of source directory - destination directory. */ + private var outputDirs: List[(AbstractFile, AbstractFile)] = Nil + + /** If this is not None, the output location where all + * classes should go. + */ + private var singleOutDir: Option[AbstractFile] = None + + /** Add a destination directory for sources found under srcdir. + * Both directories should exist. + */ + def add(srcDir: String, outDir: String): Unit = + add(checkDir(AbstractFile.getDirectory(srcDir), srcDir), + checkDir(AbstractFile.getDirectory(outDir), outDir)) + + /** Check that dir exists and is a directory. */ + private def checkDir(dir: AbstractFile, name: String, allowJar: Boolean = false): AbstractFile = ( + if (dir != null && dir.isDirectory) + dir + // was: else if (allowJar && dir == null && Path.isJarOrZip(name, false)) + else if (allowJar && dir == null && Jar.isJarOrZip(File(name), false)) + new PlainFile(Path(name)) + else + throw new FatalError(name + " does not exist or is not a directory")) + + /** Set the single output directory. From now on, all files will + * be dumped in there, regardless of previous calls to 'add'. + */ + def setSingleOutput(outDir: String): Unit = { + val dst = AbstractFile.getDirectory(outDir) + setSingleOutput(checkDir(dst, outDir, true)) + } + + def getSingleOutput: Option[AbstractFile] = singleOutDir + + /** Set the single output directory. From now on, all files will + * be dumped in there, regardless of previous calls to 'add'. + */ + def setSingleOutput(dir: AbstractFile): Unit = + singleOutDir = Some(dir) + + def add(src: AbstractFile, dst: AbstractFile): Unit = { + singleOutDir = None + outputDirs ::= ((src, dst)) + } + + /** Return the list of source-destination directory pairs. */ + def outputs: List[(AbstractFile, AbstractFile)] = outputDirs + + /** Return the output directory for the given file.
+ */ + def outputDirFor(src: AbstractFile): AbstractFile = { + def isBelow(srcDir: AbstractFile, outDir: AbstractFile) = + src.path.startsWith(srcDir.path) + + singleOutDir match { + case Some(d) => d + case None => + (outputs find (isBelow _).tupled) match { + case Some((_, d)) => d + case _ => + throw new FatalError("Could not find an output directory for " + + src.path + " in " + outputs) + } + } + } + + /** Return the source file path(s) which correspond to the given + * classfile path and SourceFile attribute value, subject to the + * condition that source files are arranged in the filesystem + * according to Java package layout conventions. + * + * The given classfile path must be contained in at least one of + * the specified output directories. If it does not then this + * method returns Nil. + * + * Note that the source file is not required to exist, so assuming + * a valid classfile path this method will always return a list + * containing at least one element. + * + * Also that if two or more source path elements target the same + * output directory there will be two or more candidate source file + * paths. + */ + def srcFilesFor(classFile: AbstractFile, srcPath: String): List[AbstractFile] = { + def isBelow(srcDir: AbstractFile, outDir: AbstractFile) = + classFile.path.startsWith(outDir.path) + + singleOutDir match { + case Some(d) => + d match { + case _: VirtualDirectory | _: io.ZipArchive => Nil + case _ => List(d.lookupPathUnchecked(srcPath, false)) + } + case None => + (outputs filter (isBelow _).tupled) match { + case Nil => Nil + case matches => matches.map(_._1.lookupPathUnchecked(srcPath, false)) + } + } + } +} diff --git a/tests/pos-with-compiler-cc/dotc/config/PathResolver.scala b/tests/pos-with-compiler-cc/dotc/config/PathResolver.scala new file mode 100644 index 000000000000..afa30e38dc2a --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/PathResolver.scala @@ -0,0 +1,268 @@ +package dotty.tools +package dotc +package config + +import scala.language.unsafeNulls + +import WrappedProperties.AccessControl +import io.{ClassPath, Directory, Path} +import classpath.{AggregateClassPath, ClassPathFactory, JrtClassPath} +import ClassPath.split +import PartialFunction.condOpt +import core.Contexts._ +import Settings._ +import dotty.tools.io.File + +object PathResolver { + + // Imports property/environment functions which suppress + // security exceptions. + import AccessControl._ + + def firstNonEmpty(xs: String*): String = xs find (_ != "") getOrElse "" + + /** Map all classpath elements to absolute paths and reconstruct the classpath. + */ + def makeAbsolute(cp: String): String = ClassPath.map(cp, x => Path(x).toAbsolute.path) + + /** pretty print class path + */ + def ppcp(s: String): String = split(s) match { + case Nil => "" + case Seq(x) => x + case xs => xs.map("\n" + _).mkString + } + + /** Values found solely by inspecting environment or property variables. + */ + object Environment { + private def searchForBootClasspath = ( + systemProperties find (_._1 endsWith ".boot.class.path") map (_._2) getOrElse "" + ) + + /** Environment variables which java pays attention to so it + * seems we do as well. 
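// Self-contained sketch of the lookup rule in `outputDirFor` above: a configured single
// output directory wins, otherwise the first (sourceDir, outputDir) pair whose source
// path is a prefix of the file's path is used. Plain strings stand in for AbstractFile,
// and None is returned where the real code throws a FatalError.
def outputDirForSketch(singleOut: Option[String], pairs: List[(String, String)], srcFilePath: String): Option[String] =
  singleOut.orElse(pairs.collectFirst {
    case (srcDir, outDir) if srcFilePath.startsWith(srcDir) => outDir
  })
// outputDirForSketch(None, List("src/main" -> "out/main", "src/test" -> "out/test"), "src/test/Foo.scala")
//   returns Some("out/test")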
+ */ + def classPathEnv: String = envOrElse("CLASSPATH", "") + def sourcePathEnv: String = envOrElse("SOURCEPATH", "") + + def javaBootClassPath: String = propOrElse("sun.boot.class.path", searchForBootClasspath) + + def javaExtDirs: String = propOrEmpty("java.ext.dirs") + def scalaHome: String = propOrEmpty("scala.home") + def scalaExtDirs: String = propOrEmpty("scala.ext.dirs") + + /** The java classpath and whether to use it. + */ + def javaUserClassPath: String = propOrElse("java.class.path", "") + def useJavaClassPath: Boolean = propOrFalse("scala.usejavacp") + + override def toString: String = s""" + |object Environment { + | scalaHome = $scalaHome (useJavaClassPath = $useJavaClassPath) + | javaBootClassPath = <${javaBootClassPath.length} chars> + | javaExtDirs = ${ppcp(javaExtDirs)} + | javaUserClassPath = ${ppcp(javaUserClassPath)} + | scalaExtDirs = ${ppcp(scalaExtDirs)} + |}""".trim.stripMargin + } + + /** Default values based on those in Environment as interpreted according + * to the path resolution specification. + */ + object Defaults { + def scalaSourcePath: String = Environment.sourcePathEnv + def javaBootClassPath: String = Environment.javaBootClassPath + def javaUserClassPath: String = Environment.javaUserClassPath + def javaExtDirs: String = Environment.javaExtDirs + def useJavaClassPath: Boolean = Environment.useJavaClassPath + + def scalaHome: String = Environment.scalaHome + def scalaHomeDir: Directory = Directory(scalaHome) + def scalaHomeExists: Boolean = scalaHomeDir.isDirectory + def scalaLibDir: Directory = (scalaHomeDir / "lib").toDirectory + def scalaClassesDir: Directory = (scalaHomeDir / "classes").toDirectory + + def scalaLibAsJar: File = (scalaLibDir / "scala-library.jar").toFile + def scalaLibAsDir: Directory = (scalaClassesDir / "library").toDirectory + + def scalaLibDirFound: Option[Directory] = + if (scalaLibAsJar.isFile) Some(scalaLibDir) + else if (scalaLibAsDir.isDirectory) Some(scalaClassesDir) + else None + + def scalaLibFound: String = + if (scalaLibAsJar.isFile) scalaLibAsJar.path + else if (scalaLibAsDir.isDirectory) scalaLibAsDir.path + else "" + + // XXX It must be time for someone to figure out what all these things + // are intended to do. This is disabled here because it was causing all + // the scala jars to end up on the classpath twice: one on the boot + // classpath as set up by the runner (or regular classpath under -nobootcp) + // and then again here. + def scalaBootClassPath: String = "" + // scalaLibDirFound match { + // case Some(dir) if scalaHomeExists => + // val paths = ClassPath expandDir dir.path + // join(paths: _*) + // case _ => "" + // } + + def scalaExtDirs: String = Environment.scalaExtDirs + + def scalaPluginPath: String = (scalaHomeDir / "misc" / "scala-devel" / "plugins").path + + override def toString: String = """ + |object Defaults { + | scalaHome = %s + | javaBootClassPath = %s + | scalaLibDirFound = %s + | scalaLibFound = %s + | scalaBootClassPath = %s + | scalaPluginPath = %s + |}""".trim.stripMargin.format( + scalaHome, + ppcp(javaBootClassPath), + scalaLibDirFound, scalaLibFound, + ppcp(scalaBootClassPath), ppcp(scalaPluginPath) + ) + } + + def fromPathString(path: String)(using Context): ClassPath = { + val settings = ctx.settings.classpath.update(path) + inContext(ctx.fresh.setSettings(settings)) { + new PathResolver().result + } + } + + /** Show values in Environment and Defaults when no argument is provided. + * Otherwise, show values in Calculated as if those options had been given + * to a scala runner. 
+ */ + def main(args: Array[String]): Unit = + if (args.isEmpty) { + println(Environment) + println(Defaults) + } + else inContext(ContextBase().initialCtx) { + val ArgsSummary(sstate, rest, errors, warnings) = + ctx.settings.processArguments(args.toList, true, ctx.settingsState) + errors.foreach(println) + val pr = inContext(ctx.fresh.setSettings(sstate)) { + new PathResolver() + } + println(" COMMAND: 'scala %s'".format(args.mkString(" "))) + println("RESIDUAL: 'scala %s'\n".format(rest.mkString(" "))) + + pr.result match { + case cp: AggregateClassPath => + println(s"ClassPath has ${cp.aggregates.size} entries and results in:\n${cp.asClassPathStrings}") + } + } +} + +import PathResolver.{Defaults, ppcp} + +class PathResolver(using c: Context) { + import c.base.settings + + private val classPathFactory = new ClassPathFactory + + private def cmdLineOrElse(name: String, alt: String) = + commandLineFor(name) match { + case Some("") | None => alt + case Some(x) => x + } + + private def commandLineFor(s: String): Option[String] = condOpt(s) { + case "javabootclasspath" => settings.javabootclasspath.value + case "javaextdirs" => settings.javaextdirs.value + case "bootclasspath" => settings.bootclasspath.value + case "extdirs" => settings.extdirs.value + case "classpath" | "cp" => settings.classpath.value + case "sourcepath" => settings.sourcepath.value + } + + /** Calculated values based on any given command line options, falling back on + * those in Defaults. + */ + object Calculated { + def scalaHome: String = Defaults.scalaHome + def useJavaClassPath: Boolean = settings.usejavacp.value || Defaults.useJavaClassPath + def javaBootClassPath: String = cmdLineOrElse("javabootclasspath", Defaults.javaBootClassPath) + def javaExtDirs: String = cmdLineOrElse("javaextdirs", Defaults.javaExtDirs) + def javaUserClassPath: String = if (useJavaClassPath) Defaults.javaUserClassPath else "" + def scalaBootClassPath: String = cmdLineOrElse("bootclasspath", Defaults.scalaBootClassPath) + def scalaExtDirs: String = cmdLineOrElse("extdirs", Defaults.scalaExtDirs) + /** Scaladoc doesn't need any bootstrapping, otherwise will create errors such as: + * [scaladoc] ../scala-trunk/src/reflect/scala/reflect/macros/Reifiers.scala:89: error: object api is not a member of package reflect + * [scaladoc] case class ReificationException(val pos: reflect.api.PositionApi, val msg: String) extends Throwable(msg) + * [scaladoc] ^ + * Because bootstrapping looks at the sourcepath and creates the package "reflect" in "" it will cause the + * typedIdentifier to pick .reflect instead of the .scala.reflect package. Thus, no bootstrapping for scaladoc! + */ + def sourcePath: String = cmdLineOrElse("sourcepath", Defaults.scalaSourcePath) + + def userClassPath: String = + if (!settings.classpath.isDefault) settings.classpath.value + else sys.env.getOrElse("CLASSPATH", ".") + + import classPathFactory._ + + // Assemble the elements! + def basis: List[Traversable[ClassPath]] = + val release = Option(ctx.settings.javaOutputVersion.value).filter(_.nonEmpty) + + List( + JrtClassPath(release), // 1. The Java 9+ classpath (backed by the jrt:/ virtual system, if available) + classesInPath(javaBootClassPath), // 2. The Java bootstrap class path. + contentsOfDirsInPath(javaExtDirs), // 3. The Java extension class path. + classesInExpandedPath(javaUserClassPath), // 4. The Java application class path. + classesInPath(scalaBootClassPath), // 5. The Scala boot class path. + contentsOfDirsInPath(scalaExtDirs), // 6. 
The Scala extension class path. + classesInExpandedPath(userClassPath), // 7. The Scala application class path. + sourcesInPath(sourcePath) // 8. The Scala source path. + ) + + lazy val containers: List[ClassPath] = basis.flatten.distinct + + override def toString: String = """ + |object Calculated { + | scalaHome = %s + | javaBootClassPath = %s + | javaExtDirs = %s + | javaUserClassPath = %s + | useJavaClassPath = %s + | scalaBootClassPath = %s + | scalaExtDirs = %s + | userClassPath = %s + | sourcePath = %s + |}""".trim.stripMargin.format( + scalaHome, + ppcp(javaBootClassPath), ppcp(javaExtDirs), ppcp(javaUserClassPath), + useJavaClassPath, + ppcp(scalaBootClassPath), ppcp(scalaExtDirs), ppcp(userClassPath), + ppcp(sourcePath) + ) + } + + def containers: List[ClassPath] = Calculated.containers + + lazy val result: ClassPath = { + val cp = AggregateClassPath(containers.toIndexedSeq) + + if (settings.YlogClasspath.value) { + Console.println("Classpath built from " + settings.toConciseString(ctx.settingsState)) + Console.println("Defaults: " + PathResolver.Defaults) + Console.println("Calculated: " + Calculated) + + val xs = (Calculated.basis drop 2).flatten.distinct + println("After java boot/extdirs classpath has %d entries:" format xs.size) + xs foreach (x => println(" " + x)) + } + cp + } + + def asURLs: Seq[java.net.URL] = result.asURLs +} diff --git a/tests/pos-with-compiler-cc/dotc/config/Platform.scala b/tests/pos-with-compiler-cc/dotc/config/Platform.scala new file mode 100644 index 000000000000..0faacf1bcebb --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Platform.scala @@ -0,0 +1,46 @@ +package dotty.tools +package dotc +package config + +import io.{ClassPath, AbstractFile} +import core.Contexts._, core.Symbols._ +import core.SymbolLoader +import core.StdNames.nme +import core.Flags.Module + +/** The platform dependent pieces of Global. + */ +abstract class Platform { + + /** The root symbol loader. */ + def rootLoader(root: TermSymbol)(using Context): SymbolLoader + + /** The compiler classpath. */ + def classPath(using Context): ClassPath + + /** Update classpath with a substitution that maps entries to entries */ + def updateClassPath(subst: Map[ClassPath, ClassPath]): Unit + + /** Any platform-specific phases. */ + //def platformPhases: List[SubComponent] + + /** Is the SAMType `cls` also a SAM under the rules of the platform? */ + def isSam(cls: ClassSymbol)(using Context): Boolean + + /** The various ways a boxed primitive might materialize at runtime. */ + def isMaybeBoxed(sym: ClassSymbol)(using Context): Boolean + + /** Is the given class symbol eligible for Java serialization-specific methods? */ + def shouldReceiveJavaSerializationMethods(sym: ClassSymbol)(using Context): Boolean + + /** Create a new class loader to load class file `bin` */ + def newClassLoader(bin: AbstractFile)(using Context): SymbolLoader + + /** The given symbol is a method with the right name and signature to be a runnable program. */ + def isMainMethod(sym: Symbol)(using Context): Boolean + + /** The given class has a main method. 
*/ + final def hasMainMethod(sym: Symbol)(using Context): Boolean = + sym.info.member(nme.main).hasAltWith(d => + isMainMethod(d.symbol) && (sym.is(Module) || d.symbol.isStatic)) +} diff --git a/tests/pos-with-compiler-cc/dotc/config/Printers.scala b/tests/pos-with-compiler-cc/dotc/config/Printers.scala new file mode 100644 index 000000000000..ecb189de9bb3 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Printers.scala @@ -0,0 +1,52 @@ +package dotty.tools.dotc.config + +object Printers { + + class Printer { + def println(msg: => String): Unit = System.out.nn.println(msg) + } + + object noPrinter extends Printer { + inline override def println(msg: => String): Unit = () + } + + val default = new Printer + + val capt = noPrinter + val constr = noPrinter + val core = noPrinter + val checks = noPrinter + val config = noPrinter + val cyclicErrors = noPrinter + val debug = noPrinter + val derive = noPrinter + val desugar = noPrinter + val scaladoc = noPrinter + val exhaustivity = noPrinter + val gadts = noPrinter + val gadtsConstr = noPrinter + val hk = noPrinter + val implicits = noPrinter + val implicitsDetailed = noPrinter + val lexical = noPrinter + val init = noPrinter + val inlining = noPrinter + val interactiv = noPrinter + val matchTypes = noPrinter + val nullables = noPrinter + val overload = noPrinter + val patmatch = noPrinter + val pickling = noPrinter + val quotePickling = noPrinter + val plugins = noPrinter + val recheckr = noPrinter + val refcheck = noPrinter + val simplify = noPrinter + val staging = noPrinter + val subtyping = noPrinter + val tailrec = noPrinter + val transforms = noPrinter + val typr = noPrinter + val unapp = noPrinter + val variances = noPrinter +} diff --git a/tests/pos-with-compiler-cc/dotc/config/Properties.scala b/tests/pos-with-compiler-cc/dotc/config/Properties.scala new file mode 100644 index 000000000000..1e9cc82112af --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Properties.scala @@ -0,0 +1,142 @@ +package dotty.tools +package dotc +package config + +import scala.language.unsafeNulls + +import scala.annotation.internal.sharable + +import java.io.IOException +import java.util.jar.Attributes.{ Name => AttributeName } +import java.nio.charset.StandardCharsets + +/** Loads `library.properties` from the jar. */ +object Properties extends PropertiesTrait { + protected def propCategory: String = "compiler" + protected def pickJarBasedOn: Class[PropertiesTrait] = classOf[PropertiesTrait] + + /** Scala manifest attributes. + */ + @sharable val ScalaCompilerVersion: AttributeName = new AttributeName("Scala-Compiler-Version") +} + +trait PropertiesTrait { + protected def propCategory: String // specializes the remainder of the values + protected def pickJarBasedOn: Class[?] 
// props file comes from jar containing this + + /** The name of the properties file */ + protected val propFilename: String = "/" + propCategory + ".properties" + + /** The loaded properties */ + @sharable protected lazy val scalaProps: java.util.Properties = { + val props = new java.util.Properties + val stream = pickJarBasedOn getResourceAsStream propFilename + if (stream ne null) + quietlyDispose(props load stream, stream.close) + + props + } + + private def quietlyDispose(action: => Unit, disposal: => Unit) = + try { action } + finally + try { disposal } + catch { case _: IOException => } + + def propIsSet(name: String): Boolean = System.getProperty(name) != null + def propIsSetTo(name: String, value: String): Boolean = propOrNull(name) == value + def propOrElse(name: String, alt: String): String = System.getProperty(name, alt) + def propOrEmpty(name: String): String = propOrElse(name, "") + def propOrNull(name: String): String = propOrElse(name, null) + def propOrNone(name: String): Option[String] = Option(propOrNull(name)) + def propOrFalse(name: String): Boolean = propOrNone(name) exists (x => List("yes", "on", "true") contains x.toLowerCase) + def setProp(name: String, value: String): String = System.setProperty(name, value) + def clearProp(name: String): String = System.clearProperty(name) + + def envOrElse(name: String, alt: String): String = Option(System getenv name) getOrElse alt + def envOrNone(name: String): Option[String] = Option(System getenv name) + + // for values based on propFilename + def scalaPropOrElse(name: String, alt: String): String = scalaProps.getProperty(name, alt) + def scalaPropOrEmpty(name: String): String = scalaPropOrElse(name, "") + def scalaPropOrNone(name: String): Option[String] = Option(scalaProps.getProperty(name)) + + /** Either the development or release version if known, otherwise + * the empty string. + */ + def versionNumberString: String = scalaPropOrEmpty("version.number") + + /** The version number of the jar this was loaded from, + * or `"(unknown)"` if it cannot be determined. + */ + val simpleVersionString: String = { + val v = scalaPropOrElse("version.number", "(unknown)") + v + ( + if (v.contains("SNAPSHOT") || v.contains("NIGHTLY")) + "-git-" + scalaPropOrElse("git.hash", "(unknown)") + else + "" + ) + } + + /** The version number of the jar this was loaded from plus `"version "` prefix, + * or `"version (unknown)"` if it cannot be determined. + */ + val versionString: String = "version " + simpleVersionString + + /** Whether the current version of compiler is experimental + * + * 1. Snapshot, nightly releases and non-bootstrapped compiler are experimental. + * 2. Features supported by experimental versions of the compiler: + * - research plugins + */ + val experimental: Boolean = versionString.contains("SNAPSHOT") || versionString.contains("NIGHTLY") || versionString.contains("nonbootstrapped") + + val copyrightString: String = scalaPropOrElse("copyright.string", "(c) 2002-2017 LAMP/EPFL") + + /** This is the encoding to use reading in source files, overridden with -encoding + * Note that it uses "prop" i.e. looks in the scala jar, not the system properties. 
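+ * For example (a sketch), compiling with `-encoding ISO-8859-1` overrides this
+ * default for the sources being read.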
+ */ + def sourceEncoding: String = scalaPropOrElse("file.encoding", StandardCharsets.UTF_8.name) + def sourceReader: String = scalaPropOrElse("source.reader", "scala.tools.nsc.io.SourceReader") + + /** This is the default text encoding, overridden (unreliably) with + * `JAVA_OPTS="-Dfile.encoding=Foo"` + */ + def encodingString: String = propOrElse("file.encoding", StandardCharsets.UTF_8.name) + + /** The default end of line character. + */ + def lineSeparator: String = propOrElse("line.separator", "\n") + + /** Various well-known properties. + */ + def javaClassPath: String = propOrEmpty("java.class.path") + def javaHome: String = propOrEmpty("java.home") + def javaVendor: String = propOrEmpty("java.vendor") + def javaVersion: String = propOrEmpty("java.version") + def javaVmInfo: String = propOrEmpty("java.vm.info") + def javaVmName: String = propOrEmpty("java.vm.name") + def javaVmVendor: String = propOrEmpty("java.vm.vendor") + def javaVmVersion: String = propOrEmpty("java.vm.version") + def osName: String = propOrEmpty("os.name") + def scalaHome: String = propOrEmpty("scala.home") + def tmpDir: String = propOrEmpty("java.io.tmpdir") + def userDir: String = propOrEmpty("user.dir") + def userHome: String = propOrEmpty("user.home") + def userName: String = propOrEmpty("user.name") + + /** Some derived values. + */ + def isWin: Boolean = osName startsWith "Windows" + def isMac: Boolean = javaVendor startsWith "Apple" + + // This is looking for javac, tools.jar, etc. + // Tries JDK_HOME first, then the more common but likely jre JAVA_HOME, + // and finally the system property based javaHome. + def jdkHome: String = envOrElse("JDK_HOME", envOrElse("JAVA_HOME", javaHome)) + + def versionMsg: String = "Scala %s %s -- %s".format(propCategory, versionString, copyrightString) + def scalaCmd: String = if (isWin) "scala.bat" else "scala" + def scalacCmd: String = if (isWin) "scalac.bat" else "scalac" +} diff --git a/tests/pos-with-compiler-cc/dotc/config/SJSPlatform.scala b/tests/pos-with-compiler-cc/dotc/config/SJSPlatform.scala new file mode 100644 index 000000000000..ae417b717ca3 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/SJSPlatform.scala @@ -0,0 +1,35 @@ +package dotty.tools.dotc.config + +import dotty.tools.dotc.core._ +import Contexts._ +import Symbols._ + +import dotty.tools.backend.sjs.JSDefinitions + +object SJSPlatform { + /** The `SJSPlatform` for the current context. */ + def sjsPlatform(using Context): SJSPlatform = + ctx.platform.asInstanceOf[SJSPlatform] +} + +class SJSPlatform()(using DetachedContext) extends JavaPlatform { + + /** Scala.js-specific definitions. */ + val jsDefinitions: JSDefinitions = new JSDefinitions() + + /** Is the SAMType `cls` also a SAM under the rules of the Scala.js back-end? */ + override def isSam(cls: ClassSymbol)(using Context): Boolean = + defn.isFunctionClass(cls) + || cls.superClass == jsDefinitions.JSFunctionClass + + /** Is the given class symbol eligible for Java serialization-specific methods? + * + * This is not simply false because we still want to add them to Scala classes + * and objects. They might be transitively used by macros and other compile-time + * code. It feels safer to have them be somewhat equivalent to the ones we would + * get in a JVM project. The JVM back-end will slap an extends `java.io.Serializable` + * to them, so we should be consistent and also emit the proper serialization methods. 
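+ * (Concretely, as implemented below: only classes that are not subclasses of
+ * `JSAnyClass` receive them.)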
+ */ + override def shouldReceiveJavaSerializationMethods(sym: ClassSymbol)(using Context): Boolean = + !sym.isSubClass(jsDefinitions.JSAnyClass) +} diff --git a/tests/pos-with-compiler-cc/dotc/config/ScalaRelease.scala b/tests/pos-with-compiler-cc/dotc/config/ScalaRelease.scala new file mode 100644 index 000000000000..407171f1a0dd --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/ScalaRelease.scala @@ -0,0 +1,21 @@ +package dotty.tools.dotc.config + +enum ScalaRelease(val majorVersion: Int, val minorVersion: Int) extends Ordered[ScalaRelease]: + case Release3_0 extends ScalaRelease(3, 0) + case Release3_1 extends ScalaRelease(3, 1) + case Release3_2 extends ScalaRelease(3, 2) + + def show = s"$majorVersion.$minorVersion" + + def compare(that: ScalaRelease) = + val ord = summon[Ordering[(Int, Int)]] + ord.compare((majorVersion, minorVersion), (that.majorVersion, that.minorVersion)) + +object ScalaRelease: + def latest = Release3_1 + + def parse(name: String) = name match + case "3.0" => Some(Release3_0) + case "3.1" => Some(Release3_1) + case "3.2" => Some(Release3_2) + case _ => None diff --git a/tests/pos-with-compiler-cc/dotc/config/ScalaSettings.scala b/tests/pos-with-compiler-cc/dotc/config/ScalaSettings.scala new file mode 100644 index 000000000000..f7743dddda4e --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/ScalaSettings.scala @@ -0,0 +1,347 @@ +package dotty.tools.dotc +package config + +import scala.language.unsafeNulls + +import dotty.tools.dotc.config.PathResolver.Defaults +import dotty.tools.dotc.config.Settings.{Setting, SettingGroup} +import dotty.tools.dotc.config.SourceVersion +import dotty.tools.dotc.core.Contexts._ +import dotty.tools.dotc.rewrites.Rewrites +import dotty.tools.io.{AbstractFile, Directory, JDK9Reflectors, PlainDirectory} + +import scala.util.chaining._ + +class ScalaSettings extends SettingGroup with AllScalaSettings + +object ScalaSettings: + // Keep synchronized with `classfileVersion` in `BCodeIdiomatic` + private val minTargetVersion = 8 + private val maxTargetVersion = 20 + + def supportedTargetVersions: List[String] = + (minTargetVersion to maxTargetVersion).toList.map(_.toString) + + def supportedReleaseVersions: List[String] = + if scala.util.Properties.isJavaAtLeast("9") then + val jdkVersion = JDK9Reflectors.runtimeVersionMajor(JDK9Reflectors.runtimeVersion()).intValue() + val maxVersion = Math.min(jdkVersion, maxTargetVersion) + (minTargetVersion to maxVersion).toList.map(_.toString) + else List(minTargetVersion).map(_.toString) + + def supportedScalaReleaseVersions: List[String] = + ScalaRelease.values.toList.map(_.show) + + def supportedSourceVersions: List[String] = + SourceVersion.values.toList.map(_.toString) + + def defaultClasspath: String = sys.env.getOrElse("CLASSPATH", ".") + + def defaultPageWidth: Int = { + val defaultWidth = 80 + val columnsVar = System.getenv("COLUMNS") + if columnsVar != null then columnsVar.toInt + else if Properties.isWin then + val ansiconVar = System.getenv("ANSICON") // eg. 
"142x32766 (142x26)" + if ansiconVar != null && ansiconVar.matches("[0-9]+x.*") then + ansiconVar.substring(0, ansiconVar.indexOf("x")).toInt + else defaultWidth + else defaultWidth + } + +trait AllScalaSettings extends CommonScalaSettings, PluginSettings, VerboseSettings, WarningSettings, XSettings, YSettings: + self: SettingGroup => + + /* Path related settings */ + val semanticdbTarget: Setting[String] = PathSetting("-semanticdb-target", "Specify an alternative output directory for SemanticDB files.", "") + + val source: Setting[String] = ChoiceSetting("-source", "source version", "source version", ScalaSettings.supportedSourceVersions, SourceVersion.defaultSourceVersion.toString, aliases = List("--source")) + val uniqid: Setting[Boolean] = BooleanSetting("-uniqid", "Uniquely tag all identifiers in debugging output.", aliases = List("--unique-id")) + val rewrite: Setting[Option[Rewrites]] = OptionSetting[Rewrites]("-rewrite", "When used in conjunction with a `...-migration` source version, rewrites sources to migrate to new version.", aliases = List("--rewrite")) + val fromTasty: Setting[Boolean] = BooleanSetting("-from-tasty", "Compile classes from tasty files. The arguments are .tasty or .jar files.", aliases = List("--from-tasty")) + + val newSyntax: Setting[Boolean] = BooleanSetting("-new-syntax", "Require `then` and `do` in control expressions.") + val oldSyntax: Setting[Boolean] = BooleanSetting("-old-syntax", "Require `(...)` around conditions.") + val indent: Setting[Boolean] = BooleanSetting("-indent", "Together with -rewrite, remove {...} syntax when possible due to significant indentation.") + val noindent: Setting[Boolean] = BooleanSetting("-no-indent", "Require classical {...} syntax, indentation is not significant.", aliases = List("-noindent")) + + /* Decompiler settings */ + val printTasty: Setting[Boolean] = BooleanSetting("-print-tasty", "Prints the raw tasty.", aliases = List("--print-tasty")) + val printLines: Setting[Boolean] = BooleanSetting("-print-lines", "Show source code line numbers.", aliases = List("--print-lines")) + + /* Scala.js-related settings */ + val scalajsGenStaticForwardersForNonTopLevelObjects: Setting[Boolean] = BooleanSetting("-scalajs-genStaticForwardersForNonTopLevelObjects", "Generate static forwarders even for non-top-level objects (Scala.js only)") + val scalajsMapSourceURI: Setting[List[String]] = MultiStringSetting("-scalajs-mapSourceURI", "uri1[->uri2]", "rebases source URIs from uri1 to uri2 (or to a relative URI) for source maps (Scala.js only)") + + val projectUrl: Setting[String] = StringSetting ( + "-project-url", + "project repository homepage", + "The source repository of your project.", + "" + ) + + val wikiSyntax: Setting[Boolean] = BooleanSetting("-Xwiki-syntax", "Retains the Scala2 behavior of using Wiki Syntax in Scaladoc.") + + val jvmargs = PrefixSetting("-J", "-J", "Pass directly to the runtime system.") + val defines = PrefixSetting("-Dproperty=value", "-D", "Pass -Dproperty=value directly to the runtime system.") +end AllScalaSettings + +/** Settings shared by compiler and scaladoc */ +trait CommonScalaSettings: + self: SettingGroup => + + /* Path related settings */ + val bootclasspath: Setting[String] = PathSetting("-bootclasspath", "Override location of bootstrap class files.", Defaults.scalaBootClassPath, aliases = List("--boot-class-path")) + val extdirs: Setting[String] = PathSetting("-extdirs", "Override location of installed extensions.", Defaults.scalaExtDirs, aliases = List("--extension-directories")) + val 
javabootclasspath: Setting[String] = PathSetting("-javabootclasspath", "Override java boot classpath.", Defaults.javaBootClassPath, aliases = List("--java-boot-class-path")) + val javaextdirs: Setting[String] = PathSetting("-javaextdirs", "Override java extdirs classpath.", Defaults.javaExtDirs, aliases = List("--java-extension-directories")) + val sourcepath: Setting[String] = PathSetting("-sourcepath", "Specify location(s) of source files.", Defaults.scalaSourcePath, aliases = List("--source-path")) + val sourceroot: Setting[String] = PathSetting("-sourceroot", "Specify workspace root directory.", ".") + + val classpath: Setting[String] = PathSetting("-classpath", "Specify where to find user class files.", ScalaSettings.defaultClasspath, aliases = List("-cp", "--class-path")) + val outputDir: Setting[AbstractFile] = OutputSetting("-d", "directory|jar", "Destination for generated classfiles.", + new PlainDirectory(Directory("."))) + val color: Setting[String] = ChoiceSetting("-color", "mode", "Colored output", List("always", "never"/*, "auto"*/), "always"/* "auto"*/, aliases = List("--color")) + val verbose: Setting[Boolean] = BooleanSetting("-verbose", "Output messages about what the compiler is doing.", aliases = List("--verbose")) + val version: Setting[Boolean] = BooleanSetting("-version", "Print product version and exit.", aliases = List("--version")) + val help: Setting[Boolean] = BooleanSetting("-help", "Print a synopsis of standard options.", aliases = List("--help", "-h")) + val pageWidth: Setting[Int] = IntSetting("-pagewidth", "Set page width", ScalaSettings.defaultPageWidth, aliases = List("--page-width")) + val silentWarnings: Setting[Boolean] = BooleanSetting("-nowarn", "Silence all warnings.", aliases = List("--no-warnings")) + + val javaOutputVersion: Setting[String] = ChoiceSetting("-java-output-version", "version", "Compile code with classes specific to the given version of the Java platform available on the classpath and emit bytecode for this version. Corresponds to -release flag in javac.", ScalaSettings.supportedReleaseVersions, "", aliases = List("-release", "--release")) + + val deprecation: Setting[Boolean] = BooleanSetting("-deprecation", "Emit warning and location for usages of deprecated APIs.", aliases = List("--deprecation")) + val feature: Setting[Boolean] = BooleanSetting("-feature", "Emit warning and location for usages of features that should be imported explicitly.", aliases = List("--feature")) + val explain: Setting[Boolean] = BooleanSetting("-explain", "Explain errors in more detail.", aliases = List("--explain")) + // -explain-types setting is necessary for cross compilation, since it is mentioned in sbt-tpolecat, for instance + // it is otherwise subsumed by -explain, and should be dropped as soon as we can. 
+ val explainTypes: Setting[Boolean] = BooleanSetting("-explain-types", "Explain type errors in more detail (deprecated, use -explain instead).", aliases = List("--explain-types", "-explaintypes")) + val unchecked: Setting[Boolean] = BooleanSetting("-unchecked", "Enable additional warnings where generated code depends on assumptions.", initialValue = true, aliases = List("--unchecked")) + val language: Setting[List[String]] = MultiStringSetting("-language", "feature", "Enable one or more language features.", aliases = List("--language")) + + /* Coverage settings */ + val coverageOutputDir = PathSetting("-coverage-out", "Destination for coverage classfiles and instrumentation data.", "", aliases = List("--coverage-out")) + + /* Other settings */ + val encoding: Setting[String] = StringSetting("-encoding", "encoding", "Specify character encoding used by source files.", Properties.sourceEncoding, aliases = List("--encoding")) + val usejavacp: Setting[Boolean] = BooleanSetting("-usejavacp", "Utilize the java.class.path in classpath resolution.", aliases = List("--use-java-class-path")) + val scalajs: Setting[Boolean] = BooleanSetting("-scalajs", "Compile in Scala.js mode (requires scalajs-library.jar on the classpath).", aliases = List("--scalajs")) +end CommonScalaSettings + +/** -P "plugin" settings. Various tools might support plugins. */ +private sealed trait PluginSettings: + self: SettingGroup => + val plugin: Setting[List[String]] = MultiStringSetting ("-Xplugin", "paths", "Load a plugin from each classpath.") + val disable: Setting[List[String]] = MultiStringSetting ("-Xplugin-disable", "plugin", "Disable plugins by name.") + val require: Setting[List[String]] = MultiStringSetting ("-Xplugin-require", "plugin", "Abort if a named plugin is not loaded.") + val showPlugins: Setting[Boolean] = BooleanSetting ("-Xplugin-list", "Print a synopsis of loaded plugins.") + val pluginsDir: Setting[String] = StringSetting ("-Xpluginsdir", "path", "Path to search for plugin archives.", Defaults.scalaPluginPath) + val pluginOptions: Setting[List[String]] = MultiStringSetting ("-P", "plugin:opt", "Pass an option to a plugin, e.g. 
-P::") + +/** -V "Verbose" settings */ +private sealed trait VerboseSettings: + self: SettingGroup => + val Vhelp: Setting[Boolean] = BooleanSetting("-V", "Print a synopsis of verbose options.") + val Xprint: Setting[List[String]] = PhasesSetting("-Vprint", "Print out program after", aliases = List("-Xprint")) + val XshowPhases: Setting[Boolean] = BooleanSetting("-Vphases", "List compiler phases.", aliases = List("-Xshow-phases")) + + val Vprofile: Setting[Boolean] = BooleanSetting("-Vprofile", "Show metrics about sources and internal representations to estimate compile-time complexity.") + val VprofileSortedBy = ChoiceSetting("-Vprofile-sorted-by", "key", "Show metrics about sources and internal representations sorted by given column name", List("name", "path", "lines", "tokens", "tasty", "complexity"), "") + val VprofileDetails = IntSetting("-Vprofile-details", "Show metrics about sources and internal representations of the most complex methods", 0) + val VreplMaxPrintElements: Setting[Int] = IntSetting("-Vrepl-max-print-elements", "Number of elements to be printed before output is truncated.", 1000) + val VreplMaxPrintCharacters: Setting[Int] = IntSetting("-Vrepl-max-print-characters", "Number of characters to be printed before output is truncated.", 50000) + +/** -W "Warnings" settings + */ +private sealed trait WarningSettings: + self: SettingGroup => + val Whelp: Setting[Boolean] = BooleanSetting("-W", "Print a synopsis of warning options.") + val XfatalWarnings: Setting[Boolean] = BooleanSetting("-Werror", "Fail the compilation if there are any warnings.", aliases = List("-Xfatal-warnings")) + + val Wunused: Setting[List[String]] = MultiChoiceSetting( + name = "-Wunused", + helpArg = "warning", + descr = "Enable or disable specific `unused` warnings", + choices = List("nowarn", "all"), + default = Nil + ) + object WunusedHas: + def allOr(s: String)(using Context) = Wunused.value.pipe(us => us.contains("all") || us.contains(s)) + def nowarn(using Context) = allOr("nowarn") + + val Wconf: Setting[List[String]] = MultiStringSetting( + "-Wconf", + "patterns", + default = List(), + descr = + s"""Configure compiler warnings. + |Syntax: -Wconf::,:,... + |multiple are combined with &, i.e., &...& + | + | + | - Any message: any + | + | - Message categories: cat=deprecation, cat=feature, cat=unchecked + | + | - Message content: msg=regex + | The regex need only match some part of the message, not all of it. + | + | - Message id: id=E129 + | The message id is printed with the warning. + | + | - Message name: name=PureExpressionInStatementPosition + | The message name is printed with the warning in verbose warning mode. + | + |In verbose warning mode the compiler prints matching filters for warnings. + |Verbose mode can be enabled globally using `-Wconf:any:verbose`, or locally + |using the @nowarn annotation (example: `@nowarn("v") def test = try 1`). + | + | + | - error / e + | - warning / w + | - verbose / v (emit warning, show additional help for writing `-Wconf` filters) + | - info / i (infos are not counted as warnings and not affected by `-Werror`) + | - silent / s + | + |The default configuration is empty. + | + |User-defined configurations are added to the left. The leftmost rule matching + |a warning message defines the action. 
+ | + |Examples: + | - change every warning into an error: -Wconf:any:error + | - silence deprecations: -Wconf:cat=deprecation:s + | + |Note: on the command-line you might need to quote configurations containing `*` or `&` + |to prevent the shell from expanding patterns.""".stripMargin, + ) + +/** -X "Extended" or "Advanced" settings */ +private sealed trait XSettings: + self: SettingGroup => + + val Xhelp: Setting[Boolean] = BooleanSetting("-X", "Print a synopsis of advanced options.") + val XnoForwarders: Setting[Boolean] = BooleanSetting("-Xno-forwarders", "Do not generate static forwarders in mirror classes.") + val XmaxInlines: Setting[Int] = IntSetting("-Xmax-inlines", "Maximal number of successive inlines.", 32) + val XmaxInlinedTrees: Setting[Int] = IntSetting("-Xmax-inlined-trees", "Maximal number of inlined trees.", 2_000_000) + val Xmigration: Setting[ScalaVersion] = VersionSetting("-Xmigration", "Warn about constructs whose behavior may have changed since version.") + val XprintTypes: Setting[Boolean] = BooleanSetting("-Xprint-types", "Print tree types (debugging option).") + val XprintDiff: Setting[Boolean] = BooleanSetting("-Xprint-diff", "Print changed parts of the tree since last print.") + val XprintDiffDel: Setting[Boolean] = BooleanSetting("-Xprint-diff-del", "Print changed parts of the tree since last print including deleted parts.") + val XprintInline: Setting[Boolean] = BooleanSetting("-Xprint-inline", "Show where inlined code comes from.") + val XprintSuspension: Setting[Boolean] = BooleanSetting("-Xprint-suspension", "Show when code is suspended until macros are compiled.") + val Xprompt: Setting[Boolean] = BooleanSetting("-Xprompt", "Display a prompt after each error (debugging option).") + val XreplDisableDisplay: Setting[Boolean] = BooleanSetting("-Xrepl-disable-display", "Do not display definitions in REPL.") + val XverifySignatures: Setting[Boolean] = BooleanSetting("-Xverify-signatures", "Verify generic signatures in generated bytecode.") + val XignoreScala2Macros: Setting[Boolean] = BooleanSetting("-Xignore-scala2-macros", "Ignore errors when compiling code that calls Scala2 macros, these will fail at runtime.") + val XimportSuggestionTimeout: Setting[Int] = IntSetting("-Ximport-suggestion-timeout", "Timeout (in ms) for searching for import suggestions when errors are reported.", 8000) + val Xsemanticdb: Setting[Boolean] = BooleanSetting("-Xsemanticdb", "Store information in SemanticDB.", aliases = List("-Ysemanticdb")) + val XuncheckedJavaOutputVersion: Setting[String] = ChoiceSetting("-Xunchecked-java-output-version", "target", "Emit bytecode for the specified version of the Java platform. This might produce bytecode that will break at runtime. Corresponds to -target flag in javac. 
When on JDK 9+, consider -java-output-version as a safer alternative.", ScalaSettings.supportedTargetVersions, "", aliases = List("-Xtarget", "--Xtarget")) + val XcheckMacros: Setting[Boolean] = BooleanSetting("-Xcheck-macros", "Check some invariants of macro generated code while expanding macros", aliases = List("--Xcheck-macros")) + val XmainClass: Setting[String] = StringSetting("-Xmain-class", "path", "Class for manifest's Main-Class entry (only useful with -d )", "") + val XimplicitSearchLimit: Setting[Int] = IntSetting("-Ximplicit-search-limit", "Maximal number of expressions to be generated in an implicit search", 50000) + + val XmixinForceForwarders = ChoiceSetting( + name = "-Xmixin-force-forwarders", + helpArg = "mode", + descr = "Generate forwarder methods in classes inhering concrete methods from traits.", + choices = List("true", "junit", "false"), + default = "true") + + object mixinForwarderChoices { + def isTruthy(using Context) = XmixinForceForwarders.value == "true" + def isAtLeastJunit(using Context) = isTruthy || XmixinForceForwarders.value == "junit" + } + + val XmacroSettings: Setting[List[String]] = MultiStringSetting("-Xmacro-settings", "setting1,setting2,..settingN", "List of settings which exposed to the macros") +end XSettings + +/** -Y "Forking" as in forked tongue or "Private" settings */ +private sealed trait YSettings: + self: SettingGroup => + + val Yhelp: Setting[Boolean] = BooleanSetting("-Y", "Print a synopsis of private options.") + val Ycheck: Setting[List[String]] = PhasesSetting("-Ycheck", "Check the tree at the end of") + val YcheckMods: Setting[Boolean] = BooleanSetting("-Ycheck-mods", "Check that symbols and their defining trees have modifiers in sync.") + val Ydebug: Setting[Boolean] = BooleanSetting("-Ydebug", "Increase the quantity of debugging output.") + val YdebugTrace: Setting[Boolean] = BooleanSetting("-Ydebug-trace", "Trace core operations.") + val YdebugFlags: Setting[Boolean] = BooleanSetting("-Ydebug-flags", "Print all flags of definitions.") + val YdebugMissingRefs: Setting[Boolean] = BooleanSetting("-Ydebug-missing-refs", "Print a stacktrace when a required symbol is missing.") + val YdebugNames: Setting[Boolean] = BooleanSetting("-Ydebug-names", "Show internal representation of names.") + val YdebugPos: Setting[Boolean] = BooleanSetting("-Ydebug-pos", "Show full source positions including spans.") + val YdebugTreeWithId: Setting[Int] = IntSetting("-Ydebug-tree-with-id", "Print the stack trace when the tree with the given id is created.", Int.MinValue) + val YdebugTypeError: Setting[Boolean] = BooleanSetting("-Ydebug-type-error", "Print the stack trace when a TypeError is caught", false) + val YdebugError: Setting[Boolean] = BooleanSetting("-Ydebug-error", "Print the stack trace when any error is caught.", false) + val YdebugUnpickling: Setting[Boolean] = BooleanSetting("-Ydebug-unpickling", "Print the stack trace when an error occurs when reading Tasty.", false) + val YtermConflict: Setting[String] = ChoiceSetting("-Yresolve-term-conflict", "strategy", "Resolve term conflicts", List("package", "object", "error"), "error") + val Ylog: Setting[List[String]] = PhasesSetting("-Ylog", "Log operations during") + val YlogClasspath: Setting[Boolean] = BooleanSetting("-Ylog-classpath", "Output information about what classpath is being applied.") + val YdisableFlatCpCaching: Setting[Boolean] = BooleanSetting("-YdisableFlatCpCaching", "Do not cache flat classpath representation of classpath elements from jars across compiler instances.") + + 
val Yscala2Unpickler: Setting[String] = StringSetting("-Yscala2-unpickler", "", "Control where we may get Scala 2 symbols from. This is either \"always\", \"never\", or a classpath.", "always") + + val YnoImports: Setting[Boolean] = BooleanSetting("-Yno-imports", "Compile without importing scala.*, java.lang.*, or Predef.") + val YnoGenericSig: Setting[Boolean] = BooleanSetting("-Yno-generic-signatures", "Suppress generation of generic signatures for Java.") + val YnoPredef: Setting[Boolean] = BooleanSetting("-Yno-predef", "Compile without importing Predef.") + val Yskip: Setting[List[String]] = PhasesSetting("-Yskip", "Skip") + val Ydumpclasses: Setting[String] = StringSetting("-Ydump-classes", "dir", "Dump the generated bytecode to .class files (useful for reflective compilation that utilizes in-memory classloaders).", "") + val YstopAfter: Setting[List[String]] = PhasesSetting("-Ystop-after", "Stop after", aliases = List("-stop")) // backward compat + val YstopBefore: Setting[List[String]] = PhasesSetting("-Ystop-before", "Stop before") // stop before erasure as long as we have not debugged it fully + val YshowSuppressedErrors: Setting[Boolean] = BooleanSetting("-Yshow-suppressed-errors", "Also show follow-on errors and warnings that are normally suppressed.") + val YdetailedStats: Setting[Boolean] = BooleanSetting("-Ydetailed-stats", "Show detailed internal compiler stats (needs Stats.enabled to be set to true).") + val YkindProjector: Setting[String] = ChoiceSetting("-Ykind-projector", "[underscores, disable]", "Allow `*` as type lambda placeholder to be compatible with kind projector. When invoked as -Ykind-projector:underscores will repurpose `_` to be a type parameter placeholder, this will disable usage of underscore as a wildcard.", List("disable", "", "underscores"), "disable") + val YprintPos: Setting[Boolean] = BooleanSetting("-Yprint-pos", "Show tree positions.") + val YprintPosSyms: Setting[Boolean] = BooleanSetting("-Yprint-pos-syms", "Show symbol definitions positions.") + val YnoDeepSubtypes: Setting[Boolean] = BooleanSetting("-Yno-deep-subtypes", "Throw an exception on deep subtyping call stacks.") + val YnoPatmatOpt: Setting[Boolean] = BooleanSetting("-Yno-patmat-opt", "Disable all pattern matching optimizations.") + val YplainPrinter: Setting[Boolean] = BooleanSetting("-Yplain-printer", "Pretty-print using a plain printer.") + val YprintSyms: Setting[Boolean] = BooleanSetting("-Yprint-syms", "When printing trees print info in symbols instead of corresponding info in trees.") + val YprintDebug: Setting[Boolean] = BooleanSetting("-Yprint-debug", "When printing trees, print some extra information useful for debugging.") + val YprintDebugOwners: Setting[Boolean] = BooleanSetting("-Yprint-debug-owners", "When printing trees, print owners of definitions.") + val YprintLevel: Setting[Boolean] = BooleanSetting("-Yprint-level", "print nesting levels of symbols and type variables.") + val YshowPrintErrors: Setting[Boolean] = BooleanSetting("-Yshow-print-errors", "Don't suppress exceptions thrown during tree printing.") + val YtestPickler: Setting[Boolean] = BooleanSetting("-Ytest-pickler", "Self-test for pickling functionality; should be used with -Ystop-after:pickler.") + val YcheckReentrant: Setting[Boolean] = BooleanSetting("-Ycheck-reentrant", "Check that compiled program does not contain vars that can be accessed from a global root.") + val YdropComments: Setting[Boolean] = BooleanSetting("-Ydrop-docs", "Drop documentation when scanning source files.", aliases = 
List("-Ydrop-comments")) + val YcookComments: Setting[Boolean] = BooleanSetting("-Ycook-docs", "Cook the documentation (type check `@usecase`, etc.)", aliases = List("-Ycook-comments")) + val YreadComments: Setting[Boolean] = BooleanSetting("-Yread-docs", "Read documentation from tasty.") + val YforceSbtPhases: Setting[Boolean] = BooleanSetting("-Yforce-sbt-phases", "Run the phases used by sbt for incremental compilation (ExtractDependencies and ExtractAPI) even if the compiler is ran outside of sbt, for debugging.") + val YdumpSbtInc: Setting[Boolean] = BooleanSetting("-Ydump-sbt-inc", "For every compiled foo.scala, output the API representation and dependencies used for sbt incremental compilation in foo.inc, implies -Yforce-sbt-phases.") + val YcheckAllPatmat: Setting[Boolean] = BooleanSetting("-Ycheck-all-patmat", "Check exhaustivity and redundancy of all pattern matching (used for testing the algorithm).") + val YcheckConstraintDeps: Setting[Boolean] = BooleanSetting("-Ycheck-constraint-deps", "Check dependency tracking in constraints (used for testing the algorithm).") + val YretainTrees: Setting[Boolean] = BooleanSetting("-Yretain-trees", "Retain trees for top-level classes, accessible from ClassSymbol#tree") + val YshowTreeIds: Setting[Boolean] = BooleanSetting("-Yshow-tree-ids", "Uniquely tag all tree nodes in debugging output.") + val YfromTastyIgnoreList: Setting[List[String]] = MultiStringSetting("-Yfrom-tasty-ignore-list", "file", "List of `tasty` files in jar files that will not be loaded when using -from-tasty") + val YnoExperimental: Setting[Boolean] = BooleanSetting("-Yno-experimental", "Disable experimental language features") + + val YprofileEnabled: Setting[Boolean] = BooleanSetting("-Yprofile-enabled", "Enable profiling.") + val YprofileDestination: Setting[String] = StringSetting("-Yprofile-destination", "file", "Where to send profiling output - specify a file, default is to the console.", "") + //.withPostSetHook( _ => YprofileEnabled.value = true ) + val YprofileExternalTool: Setting[List[String]] = PhasesSetting("-Yprofile-external-tool", "Enable profiling for a phase using an external tool hook. Generally only useful for a single phase.", "typer") + //.withPostSetHook( _ => YprofileEnabled.value = true ) + val YprofileRunGcBetweenPhases: Setting[List[String]] = PhasesSetting("-Yprofile-run-gc", "Run a GC between phases - this allows heap size to be accurate at the expense of more time. Specify a list of phases, or *", "_") + //.withPostSetHook( _ => YprofileEnabled.value = true ) + + // Experimental language features + val YnoKindPolymorphism: Setting[Boolean] = BooleanSetting("-Yno-kind-polymorphism", "Disable kind polymorphism.") + val YexplicitNulls: Setting[Boolean] = BooleanSetting("-Yexplicit-nulls", "Make reference types non-nullable. Nullable types can be expressed with unions: e.g. 
String|Null.") + val YcheckInit: Setting[Boolean] = BooleanSetting("-Ysafe-init", "Ensure safe initialization of objects") + val YrequireTargetName: Setting[Boolean] = BooleanSetting("-Yrequire-targetName", "Warn if an operator is defined without a @targetName annotation") + val YrecheckTest: Setting[Boolean] = BooleanSetting("-Yrecheck-test", "Run basic rechecking (internal test only)") + val YccDebug: Setting[Boolean] = BooleanSetting("-Ycc-debug", "Used in conjunction with captureChecking language import, debug info for captured references") + val YccNoAbbrev: Setting[Boolean] = BooleanSetting("-Ycc-no-abbrev", "Used in conjunction with captureChecking language import, suppress type abbreviations") + val YlightweightLazyVals: Setting[Boolean] = BooleanSetting("-Ylightweight-lazy-vals", "Use experimental lightweight implementation of lazy vals") + + /** Area-specific debug output */ + val YexplainLowlevel: Setting[Boolean] = BooleanSetting("-Yexplain-lowlevel", "When explaining type errors, show types at a lower level.") + val YnoDoubleBindings: Setting[Boolean] = BooleanSetting("-Yno-double-bindings", "Assert no namedtype is bound twice (should be enabled only if program is error-free).") + val YshowVarBounds: Setting[Boolean] = BooleanSetting("-Yshow-var-bounds", "Print type variables with their bounds.") + + val YnoDecodeStacktraces: Setting[Boolean] = BooleanSetting("-Yno-decode-stacktraces", "Show raw StackOverflow stacktraces, instead of decoding them into triggering operations.") + + val Yinstrument: Setting[Boolean] = BooleanSetting("-Yinstrument", "Add instrumentation code that counts allocations and closure creations.") + val YinstrumentDefs: Setting[Boolean] = BooleanSetting("-Yinstrument-defs", "Add instrumentation code that counts method calls; needs -Yinstrument to be set, too.") + + val YforceInlineWhileTyping: Setting[Boolean] = BooleanSetting("-Yforce-inline-while-typing", "Make non-transparent inline methods inline when typing. Emulates the old inlining behavior of 3.0.0-M3.") +end YSettings + diff --git a/tests/pos-with-compiler-cc/dotc/config/ScalaVersion.scala b/tests/pos-with-compiler-cc/dotc/config/ScalaVersion.scala new file mode 100644 index 000000000000..7fdf57478f1a --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/ScalaVersion.scala @@ -0,0 +1,188 @@ +/* @author James Iry + */ +package dotty.tools +package dotc.config + +import scala.language.unsafeNulls + +import scala.annotation.internal.sharable +import scala.util.{Try, Success, Failure} + +/** + * Represents a single Scala version in a manner that + * supports easy comparison and sorting. + */ +sealed abstract class ScalaVersion extends Ordered[ScalaVersion] { + def unparse: String +} + +/** + * A scala version that sorts higher than all actual versions + */ +@sharable case object NoScalaVersion extends ScalaVersion { + def unparse: String = "none" + + def compare(that: ScalaVersion): Int = that match { + case NoScalaVersion => 0 + case _ => 1 + } +} + +/** + * A specific Scala version, not one of the magic min/max versions. An SpecificScalaVersion + * may or may not be a released version - i.e. this same class is used to represent + * final, release candidate, milestone, and development builds. 
The build argument is used + * to segregate builds + */ +case class SpecificScalaVersion(major: Int, minor: Int, rev: Int, build: ScalaBuild) extends ScalaVersion { + def unparse: String = s"${major}.${minor}.${rev}.${build.unparse}" + + def compare(that: ScalaVersion): Int = that match { + case SpecificScalaVersion(thatMajor, thatMinor, thatRev, thatBuild) => + // this could be done more cleanly by importing scala.math.Ordering.Implicits, but we have to do these + // comparisons a lot so I'm using brute force direct style code + if (major < thatMajor) -1 + else if (major > thatMajor) 1 + else if (minor < thatMinor) -1 + else if (minor > thatMinor) 1 + else if (rev < thatRev) -1 + else if (rev > thatRev) 1 + else build compare thatBuild + case AnyScalaVersion => 1 + case NoScalaVersion => -1 + } +} + +/** + * A Scala version that sorts lower than all actual versions + */ +@sharable case object AnyScalaVersion extends ScalaVersion { + def unparse: String = "any" + + def compare(that: ScalaVersion): Int = that match { + case AnyScalaVersion => 0 + case _ => -1 + } +} + +/** + * Methods for parsing ScalaVersions + */ +@sharable object ScalaVersion { + private val dot = "\\." + private val dash = "\\-" + private def not(s:String) = s"[^${s}]" + private val R = s"((${not(dot)}*)(${dot}(${not(dot)}*)(${dot}(${not(dash)}*)(${dash}(.*))?)?)?)".r + + def parse(versionString : String): Try[ScalaVersion] = { + def failure = Failure(new NumberFormatException( + s"There was a problem parsing ${versionString}. " + + "Versions should be in the form major[.minor[.revision]] " + + "where each part is a positive number, as in 2.10.1. " + + "The minor and revision parts are optional." + )) + + def toInt(s: String) = s match { + case null | "" => 0 + case _ => s.toInt + } + + def isInt(s: String) = Try(toInt(s)).isSuccess + + import ScalaBuild._ + + def toBuild(s: String) = s match { + case null | "FINAL" => Final + case s if (s.toUpperCase.startsWith("RC") && isInt(s.substring(2))) => RC(toInt(s.substring(2))) + case s if (s.toUpperCase.startsWith("M") && isInt(s.substring(1))) => Milestone(toInt(s.substring(1))) + case _ => Development(s) + } + + try versionString match { + case "" | "any" => Success(AnyScalaVersion) + case "none" => Success(NoScalaVersion) + case R(_, majorS, _, minorS, _, revS, _, buildS) => + Success(SpecificScalaVersion(toInt(majorS), toInt(minorS), toInt(revS), toBuild(buildS))) + case _ => failure + } + catch { + case e: NumberFormatException => failure + } + } + + /** + * The version of the compiler running now + */ + val current: ScalaVersion = parse(util.Properties.versionNumberString).get +} + +/** + * Represents the data after the dash in major.minor.rev-build + */ +abstract class ScalaBuild extends Ordered[ScalaBuild] { + /** + * Return a version of this build information that can be parsed back into the + * same ScalaBuild + */ + def unparse: String +} + +object ScalaBuild { + + /** A development, test, nightly, snapshot or other "unofficial" build + */ + case class Development(id: String) extends ScalaBuild { + def unparse: String = s"-${id}" + + def compare(that: ScalaBuild): Int = that match { + // sorting two development builds based on id is reasonably valid for two versions created with the same schema + // otherwise it's not correct, but since it's impossible to put a total ordering on development build versions + // this is a pragmatic compromise + case Development(thatId) => id compare thatId + // assume a development build is newer than anything else, that's not 
really true, but good luck + // mapping development build versions to other build types + case _ => 1 + } + } + + /** A final build + */ + case object Final extends ScalaBuild { + def unparse: String = "" + + def compare(that: ScalaBuild): Int = that match { + case Final => 0 + // a final is newer than anything other than a development build or another final + case Development(_) => -1 + case _ => 1 + } + } + + /** A candidate for final release + */ + case class RC(n: Int) extends ScalaBuild { + def unparse: String = s"-RC${n}" + + def compare(that: ScalaBuild): Int = that match { + // compare two rcs based on their RC numbers + case RC(thatN) => n - thatN + // an rc is older than anything other than a milestone or another rc + case Milestone(_) => 1 + case _ => -1 + } + } + + /** An intermediate release + */ + case class Milestone(n: Int) extends ScalaBuild { + def unparse: String = s"-M${n}" + + def compare(that: ScalaBuild): Int = that match { + // compare two milestones based on their milestone numbers + case Milestone(thatN) => n - thatN + // a milestone is older than anything other than another milestone + case _ => -1 + } + } +} + diff --git a/tests/pos-with-compiler-cc/dotc/config/Settings.scala b/tests/pos-with-compiler-cc/dotc/config/Settings.scala new file mode 100644 index 000000000000..277833afbd5d --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/Settings.scala @@ -0,0 +1,295 @@ +package dotty.tools.dotc +package config + +import scala.language.unsafeNulls + +import core.Contexts._ + +import dotty.tools.io.{AbstractFile, Directory, JarArchive, PlainDirectory} + +import annotation.tailrec +import collection.mutable.ArrayBuffer +import reflect.ClassTag +import scala.util.{Success, Failure} + +object Settings: + + val BooleanTag: ClassTag[Boolean] = ClassTag.Boolean + val IntTag: ClassTag[Int] = ClassTag.Int + val StringTag: ClassTag[String] = ClassTag(classOf[String]) + val ListTag: ClassTag[List[?]] = ClassTag(classOf[List[?]]) + val VersionTag: ClassTag[ScalaVersion] = ClassTag(classOf[ScalaVersion]) + val OptionTag: ClassTag[Option[?]] = ClassTag(classOf[Option[?]]) + val OutputTag: ClassTag[AbstractFile] = ClassTag(classOf[AbstractFile]) + + class SettingsState(initialValues: Seq[Any]): + private val values = ArrayBuffer(initialValues: _*) + private var _wasRead: Boolean = false + + override def toString: String = s"SettingsState(values: ${values.toList})" + + def value(idx: Int): Any = + _wasRead = true + values(idx) + + def update(idx: Int, x: Any): SettingsState = + if (_wasRead) then SettingsState(values.toSeq).update(idx, x) + else + values(idx) = x + this + end SettingsState + + case class ArgsSummary( + sstate: SettingsState, + arguments: List[String], + errors: List[String], + warnings: List[String]) { + + def fail(msg: String): Settings.ArgsSummary = + ArgsSummary(sstate, arguments.tail, errors :+ msg, warnings) + + def warn(msg: String): Settings.ArgsSummary = + ArgsSummary(sstate, arguments.tail, errors, warnings :+ msg) + } + + case class Setting[T: ClassTag] private[Settings] ( + name: String, + description: String, + default: T, + helpArg: String = "", + choices: Option[Seq[?]] = None, + prefix: String = "", + aliases: List[String] = Nil, + depends: List[(Setting[?], Any)] = Nil, + propertyClass: Option[Class[?]] = None)(private[Settings] val idx: Int) { + + private var changed: Boolean = false + + def valueIn(state: SettingsState): T = state.value(idx).asInstanceOf[T] + + def updateIn(state: SettingsState, x: Any): SettingsState = x match + case _: T 
=> state.update(idx, x) + case _ => throw IllegalArgumentException(s"found: $x of type ${x.getClass.getName}, required: ${implicitly[ClassTag[T]]}") + + def isDefaultIn(state: SettingsState): Boolean = valueIn(state) == default + + def isMultivalue: Boolean = implicitly[ClassTag[T]] == ListTag + + def legalChoices: String = + choices match { + case Some(xs) if xs.isEmpty => "" + case Some(r: Range) => s"${r.head}..${r.last}" + case Some(xs) => xs.mkString(", ") + case None => "" + } + + def tryToSet(state: ArgsSummary): ArgsSummary = { + val ArgsSummary(sstate, arg :: args, errors, warnings) = state: @unchecked + def update(value: Any, args: List[String]): ArgsSummary = + var dangers = warnings + val value1 = + if changed && isMultivalue then + val value0 = value.asInstanceOf[List[String]] + val current = valueIn(sstate).asInstanceOf[List[String]] + value0.filter(current.contains).foreach(s => dangers :+= s"Setting $name set to $s redundantly") + current ++ value0 + else + if changed then dangers :+= s"Flag $name set repeatedly" + value + changed = true + ArgsSummary(updateIn(sstate, value1), args, errors, dangers) + end update + + def fail(msg: String, args: List[String]) = + ArgsSummary(sstate, args, errors :+ msg, warnings) + + def missingArg = + fail(s"missing argument for option $name", args) + + def setString(argValue: String, args: List[String]) = + choices match + case Some(xs) if !xs.contains(argValue) => + fail(s"$argValue is not a valid choice for $name", args) + case _ => + update(argValue, args) + + def setInt(argValue: String, args: List[String]) = + try + val x = argValue.toInt + choices match + case Some(r: Range) if x < r.head || r.last < x => + fail(s"$argValue is out of legal range ${r.head}..${r.last} for $name", args) + case Some(xs) if !xs.contains(x) => + fail(s"$argValue is not a valid choice for $name", args) + case _ => + update(x, args) + catch case _: NumberFormatException => + fail(s"$argValue is not an integer argument for $name", args) + + def doSet(argRest: String) = ((implicitly[ClassTag[T]], args): @unchecked) match { + case (BooleanTag, _) => + update(true, args) + case (OptionTag, _) => + update(Some(propertyClass.get.getConstructor().newInstance()), args) + case (ListTag, _) => + if (argRest.isEmpty) missingArg + else + val strings = argRest.split(",").toList + choices match + case Some(valid) => strings.filterNot(valid.contains) match + case Nil => update(strings, args) + case invalid => fail(s"invalid choice(s) for $name: ${invalid.mkString(",")}", args) + case _ => update(strings, args) + case (StringTag, _) if argRest.nonEmpty || choices.exists(_.contains("")) => + setString(argRest, args) + case (StringTag, arg2 :: args2) => + if (arg2 startsWith "-") missingArg + else setString(arg2, args2) + case (OutputTag, arg :: args) => + val path = Directory(arg) + val isJar = path.extension == "jar" + if (!isJar && !path.isDirectory) + fail(s"'$arg' does not exist or is not a directory or .jar file", args) + else { + val output = if (isJar) JarArchive.create(path) else new PlainDirectory(path) + update(output, args) + } + case (IntTag, args) if argRest.nonEmpty => + setInt(argRest, args) + case (IntTag, arg2 :: args2) => + setInt(arg2, args2) + case (VersionTag, _) => + ScalaVersion.parse(argRest) match { + case Success(v) => update(v, args) + case Failure(ex) => fail(ex.getMessage, args) + } + case (_, Nil) => + missingArg + } + + def matches(argName: String) = (name :: aliases).exists(_ == argName) + + if (prefix != "" && arg.startsWith(prefix)) + 
doSet(arg drop prefix.length) + else if (prefix == "" && matches(arg.takeWhile(_ != ':'))) + doSet(arg.dropWhile(_ != ':').drop(1)) + else + state + } + } + + object Setting: + extension [T](setting: Setting[T]) + def value(using Context): T = setting.valueIn(ctx.settingsState) + def update(x: T)(using Context): SettingsState = setting.updateIn(ctx.settingsState, x) + def isDefault(using Context): Boolean = setting.isDefaultIn(ctx.settingsState) + + class SettingGroup { + + private val _allSettings = new ArrayBuffer[Setting[?]] + def allSettings: Seq[Setting[?]] = _allSettings.toSeq + + def defaultState: SettingsState = new SettingsState(allSettings map (_.default)) + + def userSetSettings(state: SettingsState): Seq[Setting[?]] = + allSettings filterNot (_.isDefaultIn(state)) + + def toConciseString(state: SettingsState): String = + userSetSettings(state).mkString("(", " ", ")") + + private def checkDependencies(state: ArgsSummary): ArgsSummary = + userSetSettings(state.sstate).foldLeft(state)(checkDependenciesOfSetting) + + private def checkDependenciesOfSetting(state: ArgsSummary, setting: Setting[?]) = + setting.depends.foldLeft(state) { (s, dep) => + val (depSetting, reqValue) = dep + if (depSetting.valueIn(state.sstate) == reqValue) s + else s.fail(s"incomplete option ${setting.name} (requires ${depSetting.name})") + } + + /** Iterates over the arguments applying them to settings where applicable. + * Then verifies setting dependencies are met. + * + * This takes a boolean indicating whether to keep + * processing if an argument is seen which is not a command line option. + * This is an expedience for the moment so that you can say + * + * scalac -d /tmp foo.scala -optimise + * + * while also allowing + * + * scala Program opt opt + * + * to get their arguments. 
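+   *
+   *  As a rough usage sketch (the setting group and the `-verbose` flag below
+   *  are made up for illustration; they are not defined in this file):
+   *
+   *    object MySettings extends SettingGroup:
+   *      val verbose = BooleanSetting("-verbose", "Enable verbose output")
+   *
+   *    val summary = MySettings.processArguments(List("-verbose", "in.scala"), processAll = true)
+   *    // summary.errors == Nil
+   *    // summary.arguments == List("in.scala")              (non-option arguments are kept)
+   *    // MySettings.verbose.valueIn(summary.sstate) == true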
+ */ + @tailrec + final def processArguments(state: ArgsSummary, processAll: Boolean, skipped: List[String]): ArgsSummary = + def stateWithArgs(args: List[String]) = ArgsSummary(state.sstate, args, state.errors, state.warnings) + state.arguments match + case Nil => + checkDependencies(stateWithArgs(skipped)) + case "--" :: args => + checkDependencies(stateWithArgs(skipped ++ args)) + case x :: _ if x startsWith "-" => + @tailrec def loop(settings: List[Setting[?]]): ArgsSummary = settings match + case setting :: settings1 => + val state1 = setting.tryToSet(state) + if state1 ne state then state1 + else loop(settings1) + case Nil => + state.warn(s"bad option '$x' was ignored") + processArguments(loop(allSettings.toList), processAll, skipped) + case arg :: args => + if processAll then processArguments(stateWithArgs(args), processAll, skipped :+ arg) + else state + end processArguments + + def processArguments(arguments: List[String], processAll: Boolean, settingsState: SettingsState = defaultState): ArgsSummary = + processArguments(ArgsSummary(settingsState, arguments, Nil, Nil), processAll, Nil) + + def publish[T](settingf: Int => Setting[T]): Setting[T] = { + val setting = settingf(_allSettings.length) + _allSettings += setting + setting + } + + def BooleanSetting(name: String, descr: String, initialValue: Boolean = false, aliases: List[String] = Nil): Setting[Boolean] = + publish(Setting(name, descr, initialValue, aliases = aliases)) + + def StringSetting(name: String, helpArg: String, descr: String, default: String, aliases: List[String] = Nil): Setting[String] = + publish(Setting(name, descr, default, helpArg, aliases = aliases)) + + def ChoiceSetting(name: String, helpArg: String, descr: String, choices: List[String], default: String, aliases: List[String] = Nil): Setting[String] = + publish(Setting(name, descr, default, helpArg, Some(choices), aliases = aliases)) + + def MultiChoiceSetting(name: String, helpArg: String, descr: String, choices: List[String], default: List[String], aliases: List[String] = Nil): Setting[List[String]] = + publish(Setting(name, descr, default, helpArg, Some(choices), aliases = aliases)) + + def IntSetting(name: String, descr: String, default: Int, aliases: List[String] = Nil): Setting[Int] = + publish(Setting(name, descr, default, aliases = aliases)) + + def IntChoiceSetting(name: String, descr: String, choices: Seq[Int], default: Int): Setting[Int] = + publish(Setting(name, descr, default, choices = Some(choices))) + + def MultiStringSetting(name: String, helpArg: String, descr: String, default: List[String] = Nil, aliases: List[String] = Nil): Setting[List[String]] = + publish(Setting(name, descr, default, helpArg, aliases = aliases)) + + def OutputSetting(name: String, helpArg: String, descr: String, default: AbstractFile): Setting[AbstractFile] = + publish(Setting(name, descr, default, helpArg)) + + def PathSetting(name: String, descr: String, default: String, aliases: List[String] = Nil): Setting[String] = + publish(Setting(name, descr, default, aliases = aliases)) + + def PhasesSetting(name: String, descr: String, default: String = "", aliases: List[String] = Nil): Setting[List[String]] = + publish(Setting(name, descr, if (default.isEmpty) Nil else List(default), aliases = aliases)) + + def PrefixSetting(name: String, pre: String, descr: String): Setting[List[String]] = + publish(Setting(name, descr, Nil, prefix = pre)) + + def VersionSetting(name: String, descr: String, default: ScalaVersion = NoScalaVersion): Setting[ScalaVersion] = + 
publish(Setting(name, descr, default)) + + def OptionSetting[T: ClassTag](name: String, descr: String, aliases: List[String] = Nil): Setting[Option[T]] = + publish(Setting(name, descr, None, propertyClass = Some(implicitly[ClassTag[T]].runtimeClass), aliases = aliases)) + } +end Settings diff --git a/tests/pos-with-compiler-cc/dotc/config/SourceVersion.scala b/tests/pos-with-compiler-cc/dotc/config/SourceVersion.scala new file mode 100644 index 000000000000..4b9b1b247856 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/SourceVersion.scala @@ -0,0 +1,32 @@ +package dotty.tools +package dotc +package config + +import core.Decorators.* +import util.Property + +enum SourceVersion: + case `3.0-migration`, `3.0`, `3.1` // Note: do not add `3.1-migration` here, 3.1 is the same language as 3.0. + case `3.2-migration`, `3.2` + case `3.3-migration`, `3.3` + case `future-migration`, `future` + + val isMigrating: Boolean = toString.endsWith("-migration") + + def stable: SourceVersion = + if isMigrating then SourceVersion.values(ordinal + 1) else this + + def isAtLeast(v: SourceVersion) = stable.ordinal >= v.ordinal + +object SourceVersion extends Property.Key[SourceVersion]: + def defaultSourceVersion = `3.3` + + /** language versions that may appear in a language import, are deprecated, but not removed from the standard library. */ + val illegalSourceVersionNames = List("3.1-migration").map(_.toTermName) + + /** language versions that the compiler recognises. */ + val validSourceVersionNames = values.toList.map(_.toString.toTermName) + + /** All source versions that can be recognised from a language import. e.g. `import language.3.1` */ + val allSourceVersionNames = validSourceVersionNames ::: illegalSourceVersionNames +end SourceVersion diff --git a/tests/pos-with-compiler-cc/dotc/config/WrappedProperties.scala b/tests/pos-with-compiler-cc/dotc/config/WrappedProperties.scala new file mode 100644 index 000000000000..5b79432a97e7 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/config/WrappedProperties.scala @@ -0,0 +1,42 @@ +package dotty.tools +package dotc +package config + +import scala.language.unsafeNulls + +/** For placing a wrapper function around property functions. + * Motivated by places like google app engine throwing exceptions + * on property lookups. + */ +trait WrappedProperties extends PropertiesTrait { + def wrap[T](body: => T): Option[T] + + protected def propCategory: String = "wrapped" + protected def pickJarBasedOn: Class[?] 
= this.getClass + + override def propIsSet(name: String): Boolean = wrap(super.propIsSet(name)) exists (x => x) + override def propOrElse(name: String, alt: String): String = wrap(super.propOrElse(name, alt)) getOrElse alt + override def setProp(name: String, value: String): String = wrap(super.setProp(name, value)).orNull + override def clearProp(name: String): String = wrap(super.clearProp(name)).orNull + override def envOrElse(name: String, alt: String): String = wrap(super.envOrElse(name, alt)) getOrElse alt + override def envOrNone(name: String): Option[String] = wrap(super.envOrNone(name)).flatten + + def systemProperties: Iterator[(String, String)] = { + import scala.jdk.CollectionConverters._ + wrap(System.getProperties.asScala.iterator) getOrElse Iterator.empty + } +} + +object WrappedProperties { + object AccessControl extends WrappedProperties { + def wrap[T](body: => T): Option[T] = + try Some(body) + catch { + // the actual exception we are concerned with is AccessControlException, + // but that's deprecated on JDK 17, so catching its superclass is a convenient + // way to avoid a deprecation warning + case _: SecurityException => + None + } + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Annotations.scala b/tests/pos-with-compiler-cc/dotc/core/Annotations.scala new file mode 100644 index 000000000000..f3fee3da78ec --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Annotations.scala @@ -0,0 +1,274 @@ +package dotty.tools +package dotc +package core + +import Symbols._, Types._, Contexts._, Constants._ +import dotty.tools.dotc.ast.tpd, tpd.* +import util.Spans.Span +import printing.{Showable, Printer} +import printing.Texts.Text +import annotation.internal.sharable +import language.experimental.pureFunctions +import annotation.retains + +object Annotations { + + def annotClass(tree: Tree)(using Context) = + if (tree.symbol.isConstructor) tree.symbol.owner + else tree.tpe.typeSymbol + + abstract class Annotation extends Showable, caps.Pure { + + def tree(using Context): Tree + + def symbol(using Context): Symbol = annotClass(tree) + + def hasSymbol(sym: Symbol)(using Context) = symbol == sym + + def matches(cls: Symbol)(using Context): Boolean = symbol.derivesFrom(cls) + + def appliesToModule: Boolean = true // for now; see remark in SymDenotations + + def derivedAnnotation(tree: Tree)(using Context): Annotation = + if (tree eq this.tree) this else Annotation(tree) + + /** All arguments to this annotation in a single flat list */ + def arguments(using Context): List[Tree] = tpd.allArguments(tree) + + def argument(i: Int)(using Context): Option[Tree] = { + val args = arguments + if (i < args.length) Some(args(i)) else None + } + def argumentConstant(i: Int)(using Context): Option[Constant] = + for (case ConstantType(c) <- argument(i) map (_.tpe.widenTermRefExpr.normalized)) yield c + + def argumentConstantString(i: Int)(using Context): Option[String] = + for (case Constant(s: String) <- argumentConstant(i)) yield s + + /** The tree evaluaton is in progress. */ + def isEvaluating: Boolean = false + + /** The tree evaluation has finished. */ + def isEvaluated: Boolean = true + + /** Normally, applies a type map to all tree nodes of this annotation, but can + * be overridden. Returns EmptyAnnotation if type type map produces a range + * type, since ranges cannot be types of trees. 
+ */ + def mapWith(tm: TypeMap @retains(caps.*))(using Context) = + val args = arguments + if args.isEmpty then this + else + val findDiff = new TreeAccumulator[Type]: + def apply(x: Type, tree: Tree)(using Context): Type = + if tm.isRange(x) then x + else + val tp1 = tm(tree.tpe) + foldOver(if tp1 frozen_=:= tree.tpe then x else tp1, tree) + val diff = findDiff(NoType, args) + if tm.isRange(diff) then EmptyAnnotation + else if diff.exists then derivedAnnotation(tm.mapOver(tree)) + else this + + /** Does this annotation refer to a parameter of `tl`? */ + def refersToParamOf(tl: TermLambda)(using Context): Boolean = + val args = arguments + if args.isEmpty then false + else tree.existsSubTree { + case id: Ident => id.tpe.stripped match + case TermParamRef(tl1, _) => tl eq tl1 + case _ => false + case _ => false + } + + /** A string representation of the annotation. Overridden in BodyAnnotation. + */ + def toText(printer: Printer): Text = printer.annotText(this) + + def ensureCompleted(using Context): Unit = tree + + def sameAnnotation(that: Annotation)(using Context): Boolean = + symbol == that.symbol && tree.sameTree(that.tree) + + /** Operations for hash-consing, can be overridden */ + def hash: Int = System.identityHashCode(this) + def eql(that: Annotation) = this eq that + } + + case class ConcreteAnnotation(t: Tree) extends Annotation: + def tree(using Context): Tree = t + + abstract class LazyAnnotation extends Annotation { + protected var mySym: Symbol | (Context ?-> Symbol) | Null + override def symbol(using parentCtx: Context): Symbol = + assert(mySym != null) + mySym match { + case symFn: (Context ?-> Symbol) @unchecked => + mySym = null + mySym = atPhaseBeforeTransforms(symFn) + // We should always produce the same annotation tree, no matter when the + // annotation is evaluated. Setting the phase to a pre-transformation phase + // seems to be enough to ensure this (note that after erasure, `ctx.typer` + // will be the Erasure typer, but that doesn't seem to affect the annotation + // trees we create, so we leave it as is) + case sym: Symbol if sym.defRunId != parentCtx.runId => + mySym = sym.denot.current.symbol + case _ => + } + mySym.asInstanceOf[Symbol] + + protected var myTree: Tree | (Context ?-> Tree) | Null + def tree(using Context): Tree = + assert(myTree != null) + myTree match { + case treeFn: (Context ?-> Tree) @unchecked => + myTree = null + myTree = atPhaseBeforeTransforms(treeFn) + case _ => + } + myTree.asInstanceOf[Tree] + + override def isEvaluating: Boolean = myTree == null + override def isEvaluated: Boolean = myTree.isInstanceOf[Tree @unchecked] + } + + class DeferredSymAndTree(symFn: Context ?-> Symbol, treeFn: Context ?-> Tree) + extends LazyAnnotation: + protected var mySym: Symbol | (Context ?-> Symbol) | Null = ctx ?=> symFn(using ctx) + protected var myTree: Tree | (Context ?-> Tree) | Null = ctx ?=> treeFn(using ctx) + + /** An annotation indicating the body of a right-hand side, + * typically of an inline method. 
Treated specially in + * pickling/unpickling and TypeTreeMaps + */ + abstract class BodyAnnotation extends Annotation { + override def symbol(using Context): ClassSymbol = defn.BodyAnnot + override def derivedAnnotation(tree: Tree)(using Context): Annotation = + if (tree eq this.tree) this else ConcreteBodyAnnotation(tree) + override def arguments(using Context): List[Tree] = Nil + override def ensureCompleted(using Context): Unit = () + override def toText(printer: Printer): Text = "@Body" + } + + class ConcreteBodyAnnotation(body: Tree) extends BodyAnnotation { + def tree(using Context): Tree = body + } + + abstract class LazyBodyAnnotation extends BodyAnnotation { + // Copy-pasted from LazyAnnotation to avoid having to turn it into a trait + protected var myTree: Tree | (Context ?-> Tree) | Null + def tree(using Context): Tree = + assert(myTree != null) + myTree match { + case treeFn: (Context ?-> Tree) @unchecked => + myTree = null + myTree = atPhaseBeforeTransforms(treeFn) + case _ => + } + myTree.asInstanceOf[Tree] + + override def isEvaluating: Boolean = myTree == null + override def isEvaluated: Boolean = myTree.isInstanceOf[Tree @unchecked] + } + + object LazyBodyAnnotation { + def apply(bodyFn: Context ?-> Tree): LazyBodyAnnotation = + new LazyBodyAnnotation: + protected var myTree: Tree | (Context ?-> Tree) | Null = ctx ?=> bodyFn(using ctx) + } + + object Annotation { + + def apply(tree: Tree): ConcreteAnnotation = ConcreteAnnotation(tree) + + def apply(cls: ClassSymbol)(using Context): Annotation = + apply(cls, Nil) + + def apply(cls: ClassSymbol, arg: Tree)(using Context): Annotation = + apply(cls, arg :: Nil) + + def apply(cls: ClassSymbol, arg1: Tree, arg2: Tree)(using Context): Annotation = + apply(cls, arg1 :: arg2 :: Nil) + + def apply(cls: ClassSymbol, args: List[Tree])(using Context): Annotation = + apply(cls.typeRef, args) + + def apply(atp: Type, arg: Tree)(using Context): Annotation = + apply(atp, arg :: Nil) + + def apply(atp: Type, arg1: Tree, arg2: Tree)(using Context): Annotation = + apply(atp, arg1 :: arg2 :: Nil) + + def apply(atp: Type, args: List[Tree])(using Context): Annotation = + apply(New(atp, args)) + + /** Create an annotation where the tree is computed lazily. */ + def deferred(sym: Symbol)(treeFn: Context ?-> Tree): Annotation = + new LazyAnnotation { + protected var myTree: Tree | (Context ?-> Tree) | Null = ctx ?=> treeFn(using ctx) + protected var mySym: Symbol | (Context ?-> Symbol) | Null = sym + } + + /** Create an annotation where the symbol and the tree are computed lazily. 
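+   *  A rough sketch (both arguments below are placeholders, not code from this
+   *  file): `deferredSymAndTree(resolveAnnotSym())(readAnnotTree())` delays both
+   *  computations until the annotation is first queried for its symbol or tree.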
*/ + def deferredSymAndTree(symFn: Context ?-> Symbol)(treeFn: Context ?-> Tree): Annotation = + DeferredSymAndTree(symFn, treeFn) + + /** Extractor for child annotations */ + object Child { + + /** A deferred annotation to the result of a given child computation */ + def later(delayedSym: Context ?-> Symbol, span: Span)(using Context): Annotation = { + def makeChildLater(using Context) = { + val sym = delayedSym + New(defn.ChildAnnot.typeRef.appliedTo(sym.owner.thisType.select(sym.name, sym)), Nil) + .withSpan(span) + } + deferred(defn.ChildAnnot)(makeChildLater) + } + + /** A regular, non-deferred Child annotation */ + def apply(sym: Symbol, span: Span)(using Context): Annotation = later(sym, span) + + def unapply(ann: Annotation)(using Context): Option[Symbol] = + if (ann.symbol == defn.ChildAnnot) { + val AppliedType(_, (arg: NamedType) :: Nil) = ann.tree.tpe: @unchecked + Some(arg.symbol) + } + else None + } + + def makeSourceFile(path: String)(using Context): Annotation = + apply(defn.SourceFileAnnot, Literal(Constant(path))) + } + + @sharable val EmptyAnnotation = Annotation(EmptyTree) + + def ThrowsAnnotation(cls: ClassSymbol)(using Context): Annotation = { + val tref = cls.typeRef + Annotation(defn.ThrowsAnnot.typeRef.appliedTo(tref), Ident(tref)) + } + + /** Extracts the type of the thrown exception from an annotation. + * + * Supports both "old-style" `@throws(classOf[Exception])` + * as well as "new-style" `@throws[Exception]("cause")` annotations. + */ + object ThrownException { + def unapply(a: Annotation)(using Context): Option[Type] = + if (a.symbol ne defn.ThrowsAnnot) + None + else a.argumentConstant(0) match { + // old-style: @throws(classOf[Exception]) (which is throws[T](classOf[Exception])) + case Some(Constant(tpe: Type)) => + Some(tpe) + // new-style: @throws[Exception], @throws[Exception]("cause") + case _ => + stripApply(a.tree) match { + case TypeApply(_, List(tpt)) => + Some(tpt.tpe) + case _ => + None + } + } + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Atoms.scala b/tests/pos-with-compiler-cc/dotc/core/Atoms.scala new file mode 100644 index 000000000000..bcaaf6794107 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Atoms.scala @@ -0,0 +1,36 @@ +package dotty.tools +package dotc +package core + +import Types._ + +/** Indicates the singleton types that a type must or may consist of. + * @param lo The lower bound: singleton types in this set are guaranteed + * to be in the carrier type. + * @param hi The upper bound: all singleton types in the carrier type are + * guaranteed to be in this set + * If the underlying type of a singleton type is another singleton type, + * only the latter type ends up in the sets. 
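+ *
+ *  As a rough illustration (not normative): for a union of constant singleton
+ *  types such as `1 | 2`, both `lo` and `hi` contain exactly those two types;
+ *  for an abstract type that is only known to be a subtype of `1 | 2`, `lo`
+ *  may be empty while `hi` still contains both. `Unknown` is used when
+ *  nothing is known about the atoms of a type.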
+ */ +enum Atoms: + case Range(lo: Set[Type], hi: Set[Type]) + case Unknown + + def & (that: Atoms): Atoms = this match + case Range(lo1, hi1) => + that match + case Range(lo2, hi2) => Range(lo1 & lo2, hi1 & hi2) + case Unknown => Range(Set.empty, hi1) + case Unknown => + that match + case Range(lo2, hi2) => Range(Set.empty, hi2) + case Unknown => Unknown + + def | (that: Atoms): Atoms = this match + case Range(lo1, hi1) => + that match + case Range(lo2, hi2) => Range(lo1 | lo2, hi1 | hi2) + case Unknown => Unknown + case Unknown => Unknown + +end Atoms diff --git a/tests/pos-with-compiler-cc/dotc/core/CheckRealizable.scala b/tests/pos-with-compiler-cc/dotc/core/CheckRealizable.scala new file mode 100644 index 000000000000..47fa84b467d8 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/CheckRealizable.scala @@ -0,0 +1,216 @@ +package dotty.tools +package dotc +package core + +import Contexts._, Types._, Symbols._, Names._, Flags._ +import Denotations.SingleDenotation +import Decorators._ +import collection.mutable +import config.SourceVersion.future +import config.Feature.sourceVersion +import annotation.constructorOnly + +/** Realizability status */ +object CheckRealizable { + + sealed abstract class Realizability(val msg: String) extends caps.Pure { + def andAlso(other: => Realizability): Realizability = + if (this == Realizable) other else this + def mapError(f: Realizability -> Context ?-> Realizability)(using Context): Realizability = + if (this == Realizable) this else f(this) + } + + object Realizable extends Realizability("") + + object NotConcrete extends Realizability(" is not a concrete type") + + class NotFinal(sym: Symbol)(using @constructorOnly ctx: Context) + extends Realizability(i" refers to nonfinal $sym") + + class HasProblemBounds(name: Name, info: Type)(using @constructorOnly ctx: Context) + extends Realizability(i" has a member $name with possibly conflicting bounds ${info.bounds.lo} <: ... <: ${info.bounds.hi}") + + class HasProblemBaseArg(typ: Type, argBounds: TypeBounds)(using @constructorOnly ctx: Context) + extends Realizability(i" has a base type $typ with possibly conflicting parameter bounds ${argBounds.lo} <: ... <: ${argBounds.hi}") + + class HasProblemBase(base1: Type, base2: Type)(using @constructorOnly ctx: Context) + extends Realizability(i" has conflicting base types $base1 and $base2") + + class HasProblemField(fld: SingleDenotation, problem: Realizability)(using @constructorOnly ctx: Context) + extends Realizability(i" has a member $fld which is not a legal path\nsince ${fld.symbol.name}: ${fld.info}${problem.msg}") + + class ProblemInUnderlying(tp: Type, problem: Realizability)(using @constructorOnly ctx: Context) + extends Realizability(i"s underlying type ${tp}${problem.msg}") { + assert(problem != Realizable) + } + + def realizability(tp: Type)(using Context): Realizability = + new CheckRealizable().realizability(tp) + + def boundsRealizability(tp: Type)(using Context): Realizability = + new CheckRealizable().boundsRealizability(tp) + + private val LateInitializedFlags = Lazy | Erased +} + +/** Compute realizability status. + * + * A type T is realizable iff it is inhabited by non-null values. This ensures that its type members have good bounds + * (in the sense from DOT papers). A type projection T#L is legal if T is realizable, and can be understood as + * Scala 2's `v.L forSome { val v: T }`. + * + * In general, a realizable type can have multiple inhabitants, hence it need not be stable (in the sense of + * Type.isStable). 
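+ *
+ *  A small, non-normative illustration: a concrete class type such as
+ *  `Option[Int]` is realizable, whereas a refinement with conflicting bounds,
+ *  e.g. `AnyRef { type T >: Int <: String }`, is not, since no member `T`
+ *  with good bounds can exist in a value of that type.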
+ */ +class CheckRealizable(using Context) { + import CheckRealizable._ + + /** A set of all fields that have already been checked. Used + * to avoid infinite recursions when analyzing recursive types. + */ + private val checkedFields: mutable.Set[Symbol] = mutable.LinkedHashSet[Symbol]() + + /** Is symbol's definitition a lazy or erased val? + * (note we exclude modules here, because their realizability is ensured separately) + */ + private def isLateInitialized(sym: Symbol) = sym.isOneOf(LateInitializedFlags, butNot = Module) + + /** The realizability status of given type `tp`*/ + def realizability(tp: Type): Realizability = tp.dealias match { + /* + * A `TermRef` for a path `p` is realizable if + * - `p`'s type is stable and realizable, or + * - its underlying path is idempotent (that is, *stable*), total, and not null. + * We don't check yet the "not null" clause: that will require null-safety checking. + * + * We assume that stability of tp.prefix is checked elsewhere, since that's necessary for the path to be legal in + * the first place. + */ + case tp: TermRef => + val sym = tp.symbol + lazy val tpInfoRealizable = realizability(tp.info) + if (sym.is(StableRealizable)) realizability(tp.prefix) + else { + val r = + if (sym.isStableMember && !isLateInitialized(sym)) + // it's realizable because we know that a value of type `tp` has been created at run-time + Realizable + else if (!sym.isEffectivelyFinal) + // it's potentially not realizable since it might be overridden with a member of nonrealizable type + new NotFinal(sym) + else + // otherwise we need to look at the info to determine realizability + // roughly: it's realizable if the info does not have bad bounds + tpInfoRealizable.mapError(r => new ProblemInUnderlying(tp, r)) + r andAlso { + if (sym.isStableMember) sym.setFlag(StableRealizable) // it's known to be stable and realizable + realizability(tp.prefix) + } mapError { r => + // A mutable path is in fact stable and realizable if it has a realizable singleton type. + if (tp.info.isStable && tpInfoRealizable == Realizable) { + sym.setFlag(StableRealizable) + Realizable + } + else r + } + } + case _: SingletonType | NoPrefix => + Realizable + case tp => + def isConcrete(tp: Type): Boolean = tp.dealias match { + case tp: TypeRef => tp.symbol.isClass + case tp: TypeParamRef => false + case tp: TypeProxy => isConcrete(tp.underlying) + case tp: AndType => isConcrete(tp.tp1) && isConcrete(tp.tp2) + case tp: OrType => isConcrete(tp.tp1) && isConcrete(tp.tp2) + case _ => false + } + if (!isConcrete(tp)) NotConcrete + else boundsRealizability(tp).andAlso(memberRealizability(tp)) + } + + private def refinedNames(tp: Type): Set[Name] = tp.dealias match { + case tp: RefinedType => refinedNames(tp.parent) + tp.refinedName + case tp: AndType => refinedNames(tp.tp1) ++ refinedNames(tp.tp2) + case tp: OrType => refinedNames(tp.tp1) ++ refinedNames(tp.tp2) + case tp: TypeProxy => refinedNames(tp.superType) + case _ => Set.empty + } + + /** `Realizable` if `tp` has good bounds, a `HasProblem...` instance + * pointing to a bad bounds member otherwise. "Has good bounds" means: + * + * - all type members have good bounds (except for opaque helpers) + * - all refinements of the underlying type have good bounds (except for opaque companions) + * - all base types are class types, and if their arguments are wildcards + * they have good bounds. + * - base types do not appear in multiple instances with different arguments. 
+ * (depending on the simplification scheme for AndTypes employed, this could + * also lead to base types with bad bounds). + */ + private def boundsRealizability(tp: Type) = { + + val memberProblems = withMode(Mode.CheckBoundsOrSelfType) { + for { + mbr <- tp.nonClassTypeMembers + if !(mbr.info.loBound <:< mbr.info.hiBound) + } + yield new HasProblemBounds(mbr.name, mbr.info) + } + + val refinementProblems = withMode(Mode.CheckBoundsOrSelfType) { + for { + name <- refinedNames(tp) + if (name.isTypeName) + mbr <- tp.member(name).alternatives + if !(mbr.info.loBound <:< mbr.info.hiBound) + } + yield + new HasProblemBounds(name, mbr.info) + } + + def baseTypeProblems(base: Type) = base match { + case AndType(base1, base2) => + new HasProblemBase(base1, base2) :: Nil + case base => + base.argInfos.collect { + case bounds @ TypeBounds(lo, hi) if !(lo <:< hi) => + new HasProblemBaseArg(base, bounds) + } + } + val baseProblems = + tp.baseClasses.map(_.baseTypeOf(tp)).flatMap(baseTypeProblems) + + baseProblems.foldLeft( + refinementProblems.foldLeft( + memberProblems.foldLeft( + Realizable: Realizability)(_ andAlso _))(_ andAlso _))(_ andAlso _) + } + + /** `Realizable` if all of `tp`'s non-strict fields have realizable types, + * a `HasProblemField` instance pointing to a bad field otherwise. + */ + private def memberRealizability(tp: Type) = { + def checkField(sofar: Realizability, fld: SingleDenotation): Realizability = + sofar andAlso { + if (checkedFields.contains(fld.symbol) || fld.symbol.isOneOf(Private | Mutable | LateInitializedFlags)) + // if field is private it cannot be part of a visible path + // if field is mutable it cannot be part of a path + // if field is lazy or erased it does not need to be initialized when the owning object is + // so in all cases the field does not influence realizability of the enclosing object. + Realizable + else { + checkedFields += fld.symbol + realizability(fld.info).mapError(r => new HasProblemField(fld, r)) + } + } + if sourceVersion.isAtLeast(future) then + // check fields only from version 3.x. + // Reason: An embedded field could well be nullable, which means it + // should not be part of a path and need not be checked; but we cannot recognize + // this situation until we have a typesystem that tracks nullability. 
+ tp.fields.foldLeft(Realizable: Realizability)(checkField) + else + Realizable + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Comments.scala b/tests/pos-with-compiler-cc/dotc/core/Comments.scala new file mode 100644 index 000000000000..1b20b75ad8ac --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Comments.scala @@ -0,0 +1,462 @@ +package dotty.tools +package dotc +package core + +import scala.language.unsafeNulls + +import ast.{ untpd, tpd } +import Symbols._, Contexts._ +import util.{SourceFile, ReadOnlyMap} +import util.Spans._ +import util.CommentParsing._ +import util.Property.Key +import parsing.Parsers.Parser +import reporting.ProperDefinitionNotFound + +object Comments { + val ContextDoc: Key[ContextDocstrings] = new Key[ContextDocstrings] + + /** Decorator for getting docbase out of context */ + given CommentsContext: AnyRef with + extension (c: Context) def docCtx: Option[ContextDocstrings] = c.property(ContextDoc) + + /** Context for Docstrings, contains basic functionality for getting + * docstrings via `Symbol` and expanding templates + */ + class ContextDocstrings { + + private val _docstrings: MutableSymbolMap[Comment] = MutableSymbolMap[Comment](512) // FIXME: 2nd [Comment] needed or "not a class type" + + val templateExpander: CommentExpander = new CommentExpander + + def docstrings: ReadOnlyMap[Symbol, Comment] = _docstrings + + def docstring(sym: Symbol): Option[Comment] = _docstrings.get(sym) + + def addDocstring(sym: Symbol, doc: Option[Comment]): Unit = + doc.foreach(d => _docstrings.update(sym, d)) + } + + /** + * A `Comment` contains the unformatted docstring, it's position and potentially more + * information that is populated when the comment is "cooked". + * + * @param span The position span of this `Comment`. + * @param raw The raw comment, as seen in the source code, without any expansion. + * @param expanded If this comment has been expanded, it's expansion, otherwise `None`. + * @param usecases The usecases for this comment. + */ + final case class Comment( + span: Span, + raw: String, + expanded: Option[String], + usecases: List[UseCase], + variables: Map[String, String], + ) { + + /** Has this comment been cooked or expanded? */ + def isExpanded: Boolean = expanded.isDefined + + /** The body of this comment, without the `@usecase` and `@define` sections, after expansion. */ + lazy val expandedBody: Option[String] = + expanded.map(removeSections(_, "@usecase", "@define")) + + val isDocComment: Boolean = Comment.isDocComment(raw) + + /** + * Expands this comment by giving its content to `f`, and then parsing the `@usecase` sections. + * Typically, `f` will take care of expanding the variables. + * + * @param f The expansion function. + * @return The expanded comment, with the `usecases` populated. 
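+     *
+     * A minimal sketch of a call (the substitution done by `f` here is made up
+     * for illustration):
+     *
+     * {{{
+     * val cooked = comment.expand(raw => raw.replace("$VERSION", "3.3"))
+     * // cooked.isExpanded == true
+     * // cooked.usecases has one entry per @usecase section (when the raw text
+     * // is a doc comment)
+     * }}}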
+ */ + def expand(f: String => String)(using Context): Comment = { + val expandedComment = f(raw) + val useCases = Comment.parseUsecases(expandedComment, span) + Comment(span, raw, Some(expandedComment), useCases, Map.empty) + } + } + + object Comment { + + def isDocComment(comment: String): Boolean = comment.startsWith("/**") + + def apply(span: Span, raw: String): Comment = + Comment(span, raw, None, Nil, Map.empty) + + private def parseUsecases(expandedComment: String, span: Span)(using Context): List[UseCase] = + if (!isDocComment(expandedComment)) + Nil + else + tagIndex(expandedComment) + .filter { startsWithTag(expandedComment, _, "@usecase") } + .map { case (start, end) => decomposeUseCase(expandedComment, span, start, end) } + + /** Turns a usecase section into a UseCase, with code changed to: + * {{{ + * // From: + * def foo: A + * // To: + * def foo: A = ??? + * }}} + */ + private def decomposeUseCase(body: String, span: Span, start: Int, end: Int)(using Context): UseCase = { + def subPos(start: Int, end: Int) = + if (span == NoSpan) NoSpan + else { + val start1 = span.start + start + val end1 = span.end + end + span withStart start1 withPoint start1 withEnd end1 + } + + val codeStart = skipWhitespace(body, start + "@usecase".length) + val codeEnd = skipToEol(body, codeStart) + val code = body.substring(codeStart, codeEnd) + " = ???" + val codePos = subPos(codeStart, codeEnd) + + UseCase(code, codePos) + } + } + + final case class UseCase(code: String, codePos: Span, untpdCode: untpd.Tree, tpdCode: Option[tpd.DefDef]) { + def typed(tpdCode: tpd.DefDef): UseCase = copy(tpdCode = Some(tpdCode)) + } + + object UseCase { + def apply(code: String, codePos: Span)(using Context): UseCase = { + val tree = { + val tree = new Parser(SourceFile.virtual("", code)).localDef(codePos.start) + tree match { + case tree: untpd.DefDef => + val newName = ctx.compilationUnit.freshNames.newName(tree.name, NameKinds.DocArtifactName) + untpd.cpy.DefDef(tree)(name = newName) + case _ => + report.error(ProperDefinitionNotFound(), ctx.source.atSpan(codePos)) + tree + } + } + UseCase(code, codePos, tree, None) + } + } + + /** + * Port of DocComment.scala from nsc + * @author Martin Odersky + * @author Felix Mulder + */ + class CommentExpander { + import dotc.config.Printers.scaladoc + import scala.collection.mutable + + def expand(sym: Symbol, site: Symbol)(using Context): String = { + val parent = if (site != NoSymbol) site else sym + defineVariables(parent) + expandedDocComment(sym, parent) + } + + /** The cooked doc comment of symbol `sym` after variable expansion, or "" if missing. + * + * @param sym The symbol for which doc comment is returned + * @param site The class for which doc comments are generated + * @throws ExpansionLimitExceeded when more than 10 successive expansions + * of the same string are done, which is + * interpreted as a recursive variable definition. 
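+     *
+     *  For illustration (the docstrings here are hypothetical): if a base class
+     *  comment contains `@define mood happy` and an overriding member's comment
+     *  contains `This method is $mood.`, expanding the latter yields
+     *  `This method is happy.`.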
+ */ + def expandedDocComment(sym: Symbol, site: Symbol, docStr: String = "")(using Context): String = { + // when parsing a top level class or module, use the (module-)class itself to look up variable definitions + val parent = if ((sym.is(Flags.Module) || sym.isClass) && site.is(Flags.Package)) sym + else site + expandVariables(cookedDocComment(sym, docStr), sym, parent) + } + + private def template(raw: String): String = + removeSections(raw, "@define") + + private def defines(raw: String): List[String] = { + val sections = tagIndex(raw) + val defines = sections filter { startsWithTag(raw, _, "@define") } + val usecases = sections filter { startsWithTag(raw, _, "@usecase") } + val end = startTag(raw, (defines ::: usecases).sortBy(_._1)) + + defines map { case (start, end) => raw.substring(start, end) } + } + + private def replaceInheritDocToInheritdoc(docStr: String): String = + docStr.replaceAll("""\{@inheritDoc\p{Zs}*\}""", "@inheritdoc") + + /** The cooked doc comment of an overridden symbol */ + protected def superComment(sym: Symbol)(using Context): Option[String] = + allInheritedOverriddenSymbols(sym).iterator map (x => cookedDocComment(x)) find (_ != "") + + private val cookedDocComments = MutableSymbolMap[String]() + + /** The raw doc comment of symbol `sym`, minus usecase and define sections, augmented by + * missing sections of an inherited doc comment. + * If a symbol does not have a doc comment but some overridden version of it does, + * the doc comment of the overridden version is copied instead. + */ + def cookedDocComment(sym: Symbol, docStr: String = "")(using Context): String = cookedDocComments.getOrElseUpdate(sym, { + var ownComment = + if (docStr.length == 0) ctx.docCtx.flatMap(_.docstring(sym).map(c => template(c.raw))).getOrElse("") + else template(docStr) + ownComment = replaceInheritDocToInheritdoc(ownComment) + + superComment(sym) match { + case None => + // SI-8210 - The warning would be false negative when this symbol is a setter + if (ownComment.indexOf("@inheritdoc") != -1 && ! 
sym.isSetter) + scaladoc.println(s"${sym.span}: the comment for ${sym} contains @inheritdoc, but no parent comment is available to inherit from.") + ownComment.replace("@inheritdoc", "") + case Some(sc) => + if (ownComment == "") sc + else expandInheritdoc(sc, merge(sc, ownComment, sym), sym) + } + }) + + private def isMovable(str: String, sec: (Int, Int)): Boolean = + startsWithTag(str, sec, "@param") || + startsWithTag(str, sec, "@tparam") || + startsWithTag(str, sec, "@return") + + def merge(src: String, dst: String, sym: Symbol, copyFirstPara: Boolean = false): String = { + val srcSections = tagIndex(src) + val dstSections = tagIndex(dst) + val srcParams = paramDocs(src, "@param", srcSections) + val dstParams = paramDocs(dst, "@param", dstSections) + val srcTParams = paramDocs(src, "@tparam", srcSections) + val dstTParams = paramDocs(dst, "@tparam", dstSections) + val out = new StringBuilder + var copied = 0 + var tocopy = startTag(dst, dstSections dropWhile (!isMovable(dst, _))) + + if (copyFirstPara) { + val eop = // end of comment body (first para), which is delimited by blank line, or tag, or end of comment + (findNext(src, 0)(src.charAt(_) == '\n')) min startTag(src, srcSections) + out append src.substring(0, eop).trim + copied = 3 + tocopy = 3 + } + + def mergeSection(srcSec: Option[(Int, Int)], dstSec: Option[(Int, Int)]) = dstSec match { + case Some((start, end)) => + if (end > tocopy) tocopy = end + case None => + srcSec match { + case Some((start1, end1)) => + out append dst.substring(copied, tocopy).trim + out append "\n" + copied = tocopy + out append src.substring(start1, end1).trim + case None => + } + } + + //TODO: enable this once you know how to get `sym.paramss` + /* + for (params <- sym.paramss; param <- params) + mergeSection(srcParams get param.name.toString, dstParams get param.name.toString) + for (tparam <- sym.typeParams) + mergeSection(srcTParams get tparam.name.toString, dstTParams get tparam.name.toString) + + mergeSection(returnDoc(src, srcSections), returnDoc(dst, dstSections)) + mergeSection(groupDoc(src, srcSections), groupDoc(dst, dstSections)) + */ + + if (out.length == 0) dst + else { + out append dst.substring(copied) + out.toString + } + } + + /** + * Expand inheritdoc tags + * - for the main comment we transform the inheritdoc into the super variable, + * and the variable expansion can expand it further + * - for the param, tparam and throws sections we must replace comments on the spot + * + * This is done separately, for two reasons: + * 1. It takes longer to run compared to merge + * 2. 
The inheritdoc annotation should not be used very often, as building the comment from pieces severely + * impacts performance + * + * @param parent The source (or parent) comment + * @param child The child (overriding member or usecase) comment + * @param sym The child symbol + * @return The child comment with the inheritdoc sections expanded + */ + def expandInheritdoc(parent: String, child: String, sym: Symbol): String = + if (child.indexOf("@inheritdoc") == -1) + child + else { + val parentSections = tagIndex(parent) + val childSections = tagIndex(child) + val parentTagMap = sectionTagMap(parent, parentSections) + val parentNamedParams = Map() + + ("@param" -> paramDocs(parent, "@param", parentSections)) + + ("@tparam" -> paramDocs(parent, "@tparam", parentSections)) + + ("@throws" -> paramDocs(parent, "@throws", parentSections)) + + val out = new StringBuilder + + def replaceInheritdoc(childSection: String, parentSection: => String) = + if (childSection.indexOf("@inheritdoc") == -1) + childSection + else + childSection.replace("@inheritdoc", parentSection) + + def getParentSection(section: (Int, Int)): String = { + + def getSectionHeader = extractSectionTag(child, section) match { + case param@("@param"|"@tparam"|"@throws") => param + " " + extractSectionParam(child, section) + case other => other + } + + def sectionString(param: String, paramMap: Map[String, (Int, Int)]): String = + paramMap.get(param) match { + case Some(section) => + // Cleanup the section tag and parameter + val sectionTextBounds = extractSectionText(parent, section) + cleanupSectionText(parent.substring(sectionTextBounds._1, sectionTextBounds._2)) + case None => + scaladoc.println(s"""${sym.span}: the """" + getSectionHeader + "\" annotation of the " + sym + + " comment contains @inheritdoc, but the corresponding section in the parent is not defined.") + "" + } + + child.substring(section._1, section._1 + 7) match { + case param@("@param "|"@tparam"|"@throws") => + sectionString(extractSectionParam(child, section), parentNamedParams(param.trim)) + case _ => + sectionString(extractSectionTag(child, section), parentTagMap) + } + } + + def mainComment(str: String, sections: List[(Int, Int)]): String = + if (str.trim.length > 3) + str.trim.substring(3, startTag(str, sections)) + else + "" + + // Append main comment + out.append("/**") + out.append(replaceInheritdoc(mainComment(child, childSections), mainComment(parent, parentSections))) + + // Append sections + for (section <- childSections) + out.append(replaceInheritdoc(child.substring(section._1, section._2), getParentSection(section))) + + out.append("*/") + out.toString + } + + protected def expandVariables(initialStr: String, sym: Symbol, site: Symbol)(using Context): String = { + val expandLimit = 10 + + def expandInternal(str: String, depth: Int): String = { + if (depth >= expandLimit) + throw new ExpansionLimitExceeded(str) + + val out = new StringBuilder + var copied, idx = 0 + // excluding variables written as \$foo so we can use them when + // necessary to document things like Symbol#decode + def isEscaped = idx > 0 && str.charAt(idx - 1) == '\\' + while (idx < str.length) + if ((str charAt idx) != '$' || isEscaped) + idx += 1 + else { + val vstart = idx + idx = skipVariable(str, idx + 1) + def replaceWith(repl: String) = { + out append str.substring(copied, vstart) + out append repl + copied = idx + } + variableName(str.substring(vstart + 1, idx)) match { + case "super" => + superComment(sym) foreach { sc => + val superSections = tagIndex(sc) + 
replaceWith(sc.substring(3, startTag(sc, superSections))) + for (sec @ (start, end) <- superSections) + if (!isMovable(sc, sec)) out append sc.substring(start, end) + } + case "" => idx += 1 + case vname => + lookupVariable(vname, site) match { + case Some(replacement) => replaceWith(replacement) + case None => + scaladoc.println(s"Variable $vname undefined in comment for $sym in $site") + } + } + } + if (out.length == 0) str + else { + out append str.substring(copied) + expandInternal(out.toString, depth + 1) + } + } + + // We suppressed expanding \$ throughout the recursion, and now we + // need to replace \$ with $ so it looks as intended. + expandInternal(initialStr, 0).replace("""\$""", "$") + } + + def defineVariables(sym: Symbol)(using Context): Unit = { + val Trim = "(?s)^[\\s&&[^\n\r]]*(.*?)\\s*$".r + + val raw = ctx.docCtx.flatMap(_.docstring(sym).map(_.raw)).getOrElse("") + defs(sym) ++= defines(raw).map { + str => { + val start = skipWhitespace(str, "@define".length) + val (key, value) = str.splitAt(skipVariable(str, start)) + key.drop(start) -> value + } + } map { + case (key, Trim(value)) => + variableName(key) -> value.replaceAll("\\s+\\*+$", "") + } + } + + /** Maps symbols to the variable -> replacement maps that are defined + * in their doc comments + */ + private val defs = mutable.HashMap[Symbol, Map[String, String]]() withDefaultValue Map() + + /** Lookup definition of variable. + * + * @param vble The variable for which a definition is searched + * @param site The class for which doc comments are generated + */ + def lookupVariable(vble: String, site: Symbol)(using Context): Option[String] = site match { + case NoSymbol => None + case _ => + val searchList = + if (site.flags.is(Flags.Module)) site :: site.info.baseClasses + else site.info.baseClasses + + searchList collectFirst { case x if defs(x) contains vble => defs(x)(vble) } match { + case Some(str) if str startsWith "$" => lookupVariable(str.tail, site) + case res => res orElse lookupVariable(vble, site.owner) + } + } + + /** The position of the raw doc comment of symbol `sym`, or NoPosition if missing + * If a symbol does not have a doc comment but some overridden version of it does, + * the position of the doc comment of the overridden version is returned instead. + */ + def docCommentPos(sym: Symbol)(using Context): Span = + ctx.docCtx.flatMap(_.docstring(sym).map(_.span)).getOrElse(NoSpan) + + /** A version which doesn't consider self types, as a temporary measure: + * an infinite loop has broken out between superComment and cookedDocComment + * since r23926. 
+ */ + private def allInheritedOverriddenSymbols(sym: Symbol)(using Context): List[Symbol] = + if (!sym.owner.isClass) Nil + else sym.allOverriddenSymbols.toList.filter(_ != NoSymbol) //TODO: could also be `sym.owner.allOverrid..` + //else sym.owner.ancestors map (sym overriddenSymbol _) filter (_ != NoSymbol) + + class ExpansionLimitExceeded(str: String) extends Exception + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Constants.scala b/tests/pos-with-compiler-cc/dotc/core/Constants.scala new file mode 100644 index 000000000000..f45e9e5217de --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Constants.scala @@ -0,0 +1,261 @@ +package dotty.tools +package dotc +package core + +import Types._, Symbols._, Contexts._ +import printing.Printer +import printing.Texts.Text + +object Constants { + + inline val NoTag = 0 + inline val UnitTag = 1 + inline val BooleanTag = 2 + inline val ByteTag = 3 + inline val ShortTag = 4 + inline val CharTag = 5 + inline val IntTag = 6 + inline val LongTag = 7 + inline val FloatTag = 8 + inline val DoubleTag = 9 + inline val StringTag = 10 + inline val NullTag = 11 + inline val ClazzTag = 12 + + class Constant(val value: Any, val tag: Int) extends printing.Showable with Product1[Any] { + import java.lang.Double.doubleToRawLongBits + import java.lang.Float.floatToRawIntBits + + def isByteRange: Boolean = isIntRange && Byte.MinValue <= intValue && intValue <= Byte.MaxValue + def isShortRange: Boolean = isIntRange && Short.MinValue <= intValue && intValue <= Short.MaxValue + def isCharRange: Boolean = isIntRange && Char.MinValue <= intValue && intValue <= Char.MaxValue + def isIntRange: Boolean = ByteTag <= tag && tag <= IntTag + def isLongRange: Boolean = ByteTag <= tag && tag <= LongTag + def isFloatRange: Boolean = ByteTag <= tag && tag <= FloatTag + def isNumeric: Boolean = ByteTag <= tag && tag <= DoubleTag + def isNonUnitAnyVal: Boolean = BooleanTag <= tag && tag <= DoubleTag + def isAnyVal: Boolean = UnitTag <= tag && tag <= DoubleTag + + def tpe(using Context): Type = tag match { + case UnitTag => defn.UnitType + case BooleanTag => defn.BooleanType + case ByteTag => defn.ByteType + case ShortTag => defn.ShortType + case CharTag => defn.CharType + case IntTag => defn.IntType + case LongTag => defn.LongType + case FloatTag => defn.FloatType + case DoubleTag => defn.DoubleType + case StringTag => defn.StringType + case NullTag => defn.NullType + case ClazzTag => defn.ClassType(typeValue) + } + + /** We need the equals method to take account of tags as well as values. 
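+   *  For example, `Constant(0) == Constant(0L)` is false because the tags
+   *  (IntTag vs LongTag) differ even though the values compare equal, and
+   *  `Constant(0.0) == Constant(-0.0)` is false because raw floating-point
+   *  bits are compared (see `equalHashValue` below).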
+ */ + override def equals(other: Any): Boolean = other match { + case that: Constant => + this.tag == that.tag && equalHashValue == that.equalHashValue + case _ => false + } + + def isNaN: Boolean = value match { + case f: Float => f.isNaN + case d: Double => d.isNaN + case _ => false + } + + def booleanValue: Boolean = + if (tag == BooleanTag) value.asInstanceOf[Boolean] + else throw new Error("value " + value + " is not a boolean") + + def byteValue: Byte = tag match { + case ByteTag => value.asInstanceOf[Byte] + case ShortTag => value.asInstanceOf[Short].toByte + case CharTag => value.asInstanceOf[Char].toByte + case IntTag => value.asInstanceOf[Int].toByte + case LongTag => value.asInstanceOf[Long].toByte + case FloatTag => value.asInstanceOf[Float].toByte + case DoubleTag => value.asInstanceOf[Double].toByte + case _ => throw new Error("value " + value + " is not a Byte") + } + + def shortValue: Short = tag match { + case ByteTag => value.asInstanceOf[Byte].toShort + case ShortTag => value.asInstanceOf[Short] + case CharTag => value.asInstanceOf[Char].toShort + case IntTag => value.asInstanceOf[Int].toShort + case LongTag => value.asInstanceOf[Long].toShort + case FloatTag => value.asInstanceOf[Float].toShort + case DoubleTag => value.asInstanceOf[Double].toShort + case _ => throw new Error("value " + value + " is not a Short") + } + + def charValue: Char = tag match { + case ByteTag => value.asInstanceOf[Byte].toChar + case ShortTag => value.asInstanceOf[Short].toChar + case CharTag => value.asInstanceOf[Char] + case IntTag => value.asInstanceOf[Int].toChar + case LongTag => value.asInstanceOf[Long].toChar + case FloatTag => value.asInstanceOf[Float].toChar + case DoubleTag => value.asInstanceOf[Double].toChar + case _ => throw new Error("value " + value + " is not a Char") + } + + def intValue: Int = tag match { + case ByteTag => value.asInstanceOf[Byte].toInt + case ShortTag => value.asInstanceOf[Short].toInt + case CharTag => value.asInstanceOf[Char].toInt + case IntTag => value.asInstanceOf[Int] + case LongTag => value.asInstanceOf[Long].toInt + case FloatTag => value.asInstanceOf[Float].toInt + case DoubleTag => value.asInstanceOf[Double].toInt + case _ => throw new Error("value " + value + " is not an Int") + } + + def longValue: Long = tag match { + case ByteTag => value.asInstanceOf[Byte].toLong + case ShortTag => value.asInstanceOf[Short].toLong + case CharTag => value.asInstanceOf[Char].toLong + case IntTag => value.asInstanceOf[Int].toLong + case LongTag => value.asInstanceOf[Long] + case FloatTag => value.asInstanceOf[Float].toLong + case DoubleTag => value.asInstanceOf[Double].toLong + case _ => throw new Error("value " + value + " is not a Long") + } + + def floatValue: Float = tag match { + case ByteTag => value.asInstanceOf[Byte].toFloat + case ShortTag => value.asInstanceOf[Short].toFloat + case CharTag => value.asInstanceOf[Char].toFloat + case IntTag => value.asInstanceOf[Int].toFloat + case LongTag => value.asInstanceOf[Long].toFloat + case FloatTag => value.asInstanceOf[Float] + case DoubleTag => value.asInstanceOf[Double].toFloat + case _ => throw new Error("value " + value + " is not a Float") + } + + def doubleValue: Double = tag match { + case ByteTag => value.asInstanceOf[Byte].toDouble + case ShortTag => value.asInstanceOf[Short].toDouble + case CharTag => value.asInstanceOf[Char].toDouble + case IntTag => value.asInstanceOf[Int].toDouble + case LongTag => value.asInstanceOf[Long].toDouble + case FloatTag => value.asInstanceOf[Float].toDouble + case 
DoubleTag => value.asInstanceOf[Double] + case _ => throw new Error("value " + value + " is not a Double") + } + + /** Convert constant value to conform to given type. + */ + def convertTo(pt: Type)(using Context): Constant | Null = { + def classBound(pt: Type): Type = pt.dealias.stripTypeVar match { + case tref: TypeRef if !tref.symbol.isClass && tref.info.exists => + classBound(tref.info.bounds.lo) + case param: TypeParamRef => + ctx.typerState.constraint.entry(param) match { + case TypeBounds(lo, hi) => + if (hi.classSymbol.isPrimitiveValueClass) hi //constrain further with high bound + else classBound(lo) + case NoType => classBound(param.binder.paramInfos(param.paramNum).lo) + case inst => classBound(inst) + } + case pt => pt + } + pt match + case ConstantType(value) if value == this => this + case _: SingletonType => null + case _ => + val target = classBound(pt).typeSymbol + if (target == tpe.typeSymbol) + this + else if ((target == defn.ByteClass) && isByteRange) + Constant(byteValue) + else if (target == defn.ShortClass && isShortRange) + Constant(shortValue) + else if (target == defn.CharClass && isCharRange) + Constant(charValue) + else if (target == defn.IntClass && isIntRange) + Constant(intValue) + else if (target == defn.LongClass && isLongRange) + Constant(longValue) + else if (target == defn.FloatClass && isFloatRange) + Constant(floatValue) + else if (target == defn.DoubleClass && isNumeric) + Constant(doubleValue) + else + null + } + + def stringValue: String = value.toString + + def toText(printer: Printer): Text = printer.toText(this) + + def typeValue: Type = value.asInstanceOf[Type] + + /** + * Consider two `NaN`s to be identical, despite non-equality + * Consider -0d to be distinct from 0d, despite equality + * + * We use the raw versions (i.e. `floatToRawIntBits` rather than `floatToIntBits`) + * to avoid treating different encodings of `NaN` as the same constant. + * You probably can't express different `NaN` varieties as compile time + * constants in regular Scala code, but it is conceivable that you could + * conjure them with a macro. + */ + private def equalHashValue: Any = value match { + case f: Float => floatToRawIntBits(f) + case d: Double => doubleToRawLongBits(d) + case v => v + } + + override def hashCode: Int = { + import scala.util.hashing.MurmurHash3._ + val seed = 17 + var h = seed + h = mix(h, tag.##) // include tag in the hash, otherwise 0, 0d, 0L, 0f collide. 
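+      // mix in the value via equalHashValue so hashing stays consistent with
+      // equals (which compares raw bits for Float/Double constants)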
+ h = mix(h, equalHashValue.##) + finalizeHash(h, length = 2) + } + + override def toString: String = s"Constant($value)" + def canEqual(x: Any): Boolean = true + def get: Any = value + def isEmpty: Boolean = false + def _1: Any = value + } + + object Constant { + def apply(x: Null): Constant = new Constant(x, NullTag) + def apply(x: Unit): Constant = new Constant(x, UnitTag) + def apply(x: Boolean): Constant = new Constant(x, BooleanTag) + def apply(x: Byte): Constant = new Constant(x, ByteTag) + def apply(x: Short): Constant = new Constant(x, ShortTag) + def apply(x: Int): Constant = new Constant(x, IntTag) + def apply(x: Long): Constant = new Constant(x, LongTag) + def apply(x: Float): Constant = new Constant(x, FloatTag) + def apply(x: Double): Constant = new Constant(x, DoubleTag) + def apply(x: String): Constant = new Constant(x, StringTag) + def apply(x: Char): Constant = new Constant(x, CharTag) + def apply(x: Type): Constant = new Constant(x, ClazzTag) + def apply(value: Any): Constant = + new Constant(value, + value match { + case null => NullTag + case x: Unit => UnitTag + case x: Boolean => BooleanTag + case x: Byte => ByteTag + case x: Short => ShortTag + case x: Int => IntTag + case x: Long => LongTag + case x: Float => FloatTag + case x: Double => DoubleTag + case x: String => StringTag + case x: Char => CharTag + case x: Type => ClazzTag + } + ) + + def unapply(c: Constant): Constant = c + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Constraint.scala b/tests/pos-with-compiler-cc/dotc/core/Constraint.scala new file mode 100644 index 000000000000..fb87aed77c41 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Constraint.scala @@ -0,0 +1,214 @@ +package dotty.tools +package dotc +package core + +import Types._, Contexts._ +import printing.Showable +import util.{SimpleIdentitySet, SimpleIdentityMap} + +/** Constraint over undetermined type parameters. Constraints are built + * over values of the following types: + * + * - TypeLambda A constraint constrains the type parameters of a set of TypeLambdas + * - TypeParamRef The parameters of the constrained type lambdas + * - TypeVar Every constrained parameter might be associated with a TypeVar + * that has the TypeParamRef as origin. + */ +abstract class Constraint extends Showable { + + type This <: Constraint + + /** Does the constraint's domain contain the type parameters of `tl`? */ + def contains(tl: TypeLambda): Boolean + + /** Does the constraint's domain contain the type parameter `param`? */ + def contains(param: TypeParamRef): Boolean + + /** Does this constraint contain the type variable `tvar` and is it uninstantiated? */ + def contains(tvar: TypeVar): Boolean + + /** The constraint entry for given type parameter `param`, or NoType if `param` is not part of + * the constraint domain. Note: Low level, implementation dependent. + */ + def entry(param: TypeParamRef): Type + + /** The type variable corresponding to parameter `param`, or + * NoType, if `param` is not in constrained or is not paired with a type variable. + */ + def typeVarOfParam(param: TypeParamRef): Type + + /** Is it known that `param1 <:< param2`? */ + def isLess(param1: TypeParamRef, param2: TypeParamRef): Boolean + + /** The parameters that are known to be smaller wrt <: than `param` */ + def lower(param: TypeParamRef): List[TypeParamRef] + + /** The parameters that are known to be greater wrt <: than `param` */ + def upper(param: TypeParamRef): List[TypeParamRef] + + /** The lower dominator set. 
+ * + * This is like `lower`, except that each parameter returned is no smaller than every other returned parameter. + */ + def minLower(param: TypeParamRef): List[TypeParamRef] + + /** The upper dominator set. + * + * This is like `upper`, except that each parameter returned is no greater than every other returned parameter. + */ + def minUpper(param: TypeParamRef): List[TypeParamRef] + + /** lower(param) \ lower(butNot) */ + def exclusiveLower(param: TypeParamRef, butNot: TypeParamRef): List[TypeParamRef] + + /** upper(param) \ upper(butNot) */ + def exclusiveUpper(param: TypeParamRef, butNot: TypeParamRef): List[TypeParamRef] + + /** The constraint bounds for given type parameter `param`. + * Poly params that are known to be smaller or greater than `param` + * are not contained in the return bounds. + * @pre `param` is not part of the constraint domain. + */ + def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds + + /** A new constraint which is derived from this constraint by adding + * entries for all type parameters of `poly`. + * @param tvars A list of type variables associated with the params, + * or Nil if the constraint will just be checked for + * satisfiability but will solved to give instances of + * type variables. + */ + def add(poly: TypeLambda, tvars: List[TypeVar])(using Context): This + + /** A new constraint which is derived from this constraint by updating + * the entry for parameter `param` to `tp`. + * `tp` can be one of the following: + * + * - A TypeBounds value, indicating new constraint bounds + * - Another type, indicating a solution for the parameter + * + * @pre `this contains param`. + */ + def updateEntry(param: TypeParamRef, tp: Type)(using Context): This + + /** A constraint that includes the relationship `p1 <: p2`. + * `<:` relationships between parameters ("edges") are propagated, but + * non-parameter bounds are left alone. + * + * @param direction Must be set to `KeepParam1` or `KeepParam2` when + * `p2 <: p1` is already true depending on which parameter + * the caller intends to keep. This will avoid propagating + * bounds that will be redundant after `p1` and `p2` are + * unified. + */ + def addLess(p1: TypeParamRef, p2: TypeParamRef, + direction: UnificationDirection = UnificationDirection.NoUnification)(using Context): This + + /** A new constraint which is derived from this constraint by removing + * the type parameter `param` from the domain and replacing all top-level occurrences + * of the parameter elsewhere in the constraint by type `tp`, or a conservative + * approximation of it if that is needed to avoid cycles. + * Occurrences nested inside a refinement or prefix are not affected. + */ + def replace(param: TypeParamRef, tp: Type)(using Context): This + + /** Is entry associated with `tl` removable? This is the case if + * all type parameters of the entry are associated with type variables + * which have their `inst` fields set. + */ + def isRemovable(tl: TypeLambda): Boolean + + /** A new constraint with all entries coming from `tl` removed. */ + def remove(tl: TypeLambda)(using Context): This + + /** A new constraint with entry `from` replaced with `to` + * Rerences to `from` from within other constraint bounds are updated to `to`. + * Type variables are left alone. + */ + def subst(from: TypeLambda, to: TypeLambda)(using Context): This + + /** Is `tv` marked as hard in the constraint? */ + def isHard(tv: TypeVar): Boolean + + /** The same as this constraint, but with `tv` marked as hard. 
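// Illustrative sketch (not part of this patch): a toy model of `lower`,
// `minLower` and `exclusiveLower` over an explicit "known to be smaller"
// relation, to make the set algebra in the comments above concrete. The names
// and the Map representation are assumptions of this sketch; the real
// constraint stores this information far more compactly.
object ToyParamOrdering:
  type Param = String

  // For each parameter, every parameter known to be strictly smaller than it.
  val lowerSets: Map[Param, Set[Param]] =
    Map("A" -> Set("B", "C", "D"), "B" -> Set("C"), "E" -> Set("C"))

  def lower(p: Param): Set[Param] = lowerSets.getOrElse(p, Set.empty)

  // lower(param) \ lower(butNot)
  def exclusiveLower(p: Param, butNot: Param): Set[Param] =
    lower(p) -- lower(butNot)

  // Keep only parameters that are not below some other returned parameter,
  // i.e. the "lower dominator set".
  def minLower(p: Param): Set[Param] =
    val ls = lower(p)
    ls.filter(x => !ls.exists(y => y != x && lower(y).contains(x)))

  // minLower("A") == Set("B", "D"): "C" is dropped because it is below "B".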
*/ + def withHard(tv: TypeVar)(using Context): This + + /** Gives for each instantiated type var that does not yet have its `inst` field + * set, the instance value stored in the constraint. Storing instances in constraints + * is done only in a temporary way for contexts that may be retracted + * without also retracting the type var as a whole. + */ + def instType(tvar: TypeVar): Type + + /** The given `tl` in case it is not contained in this constraint, + * a fresh copy of `tl` otherwise. + */ + def ensureFresh(tl: TypeLambda)(using Context): TypeLambda + + /** The type lambdas constrained by this constraint */ + def domainLambdas: List[TypeLambda] + + /** The type lambda parameters constrained by this constraint */ + def domainParams: List[TypeParamRef] + + /** Check whether predicate holds for all parameters in constraint */ + def forallParams(p: TypeParamRef => Boolean): Boolean + + /** Perform operation `op` on all typevars that do not have their `inst` field set. */ + def foreachTypeVar(op: TypeVar => Unit): Unit + + /** The uninstantiated typevars of this constraint, which still have a bounds constraint + */ + def uninstVars: collection.Seq[TypeVar] + + /** Whether `tl` is present in both `this` and `that` but is associated with + * different TypeVars there, meaning that the constraints cannot be merged. + */ + def hasConflictingTypeVarsFor(tl: TypeLambda, that: Constraint): Boolean + + /** Does `param` occur at the toplevel in `tp` ? + * Toplevel means: the type itself or a factor in some + * combination of `&` or `|` types. + */ + def occursAtToplevel(param: TypeParamRef, tp: Type)(using Context): Boolean + + /** A string that shows the reverse dependencies maintained by this constraint + * (coDeps and contraDeps for OrderingConstraints). + */ + def depsToString(using Context): String + + /** Does the constraint restricted to variables outside `except` depend on `tv` + * in the given direction `co`? + * @param `co` If true, test whether the constraint would change if the variable is made larger + * otherwise, test whether the constraint would change if the variable is made smaller. + */ + def dependsOn(tv: TypeVar, except: TypeVars, co: Boolean)(using Context): Boolean + + /** Depending on Config settngs: + * - Under `checkConstraintsNonCyclic`, check that no constrained + * parameter contains itself as a bound. + * - Under `checkConstraintDeps`, check hat reverse dependencies in + * constraints are correct and complete. + */ + def checkWellFormed()(using Context): this.type + + /** Check that constraint only refers to TypeParamRefs bound by itself */ + def checkClosed()(using Context): Unit + + /** Check that every typevar om this constraint has as origin a type parameter + * of athe type lambda that is associated with the typevar itself. + */ + def checkConsistentVars()(using Context): Unit +} + +/** When calling `Constraint#addLess(p1, p2, ...)`, the caller might end up + * unifying one parameter with the other, this enum lets `addLess` know which + * direction the unification will take. + */ +enum UnificationDirection: + /** Neither p1 nor p2 will be instantiated. */ + case NoUnification + /** `p2 := p1`, p1 left uninstantiated. */ + case KeepParam1 + /** `p1 := p2`, p2 left uninstantiated. 
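// Illustrative sketch (not part of this patch): "occurs at toplevel", as
// documented for `occursAtToplevel` above, on a toy type ADT. The parameter
// occurs at the top level if it is the type itself or a direct operand of a
// combination of `&` / `|`; occurrences under any other constructor do not
// count. `ToyType` is hypothetical.
enum ToyType:
  case Param(name: String)
  case And(tp1: ToyType, tp2: ToyType)
  case Or(tp1: ToyType, tp2: ToyType)
  case Applied(tycon: String, arg: ToyType)

def occursAtToplevelSketch(p: String, tp: ToyType): Boolean = tp match
  case ToyType.Param(n)      => n == p
  case ToyType.And(t1, t2)   => occursAtToplevelSketch(p, t1) || occursAtToplevelSketch(p, t2)
  case ToyType.Or(t1, t2)    => occursAtToplevelSketch(p, t1) || occursAtToplevelSketch(p, t2)
  case ToyType.Applied(_, _) => false   // nested occurrence: not toplevel

// occursAtToplevelSketch("X", ToyType.And(ToyType.Param("X"), ToyType.Param("Y"))) == true
// occursAtToplevelSketch("X", ToyType.Applied("List", ToyType.Param("X")))         == false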
*/ + case KeepParam2 diff --git a/tests/pos-with-compiler-cc/dotc/core/ConstraintHandling.scala b/tests/pos-with-compiler-cc/dotc/core/ConstraintHandling.scala new file mode 100644 index 000000000000..8bf671931260 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/ConstraintHandling.scala @@ -0,0 +1,891 @@ +package dotty.tools +package dotc +package core + +import Types._ +import Contexts._ +import Symbols._ +import Decorators._ +import Flags._ +import config.Config +import config.Printers.typr +import typer.ProtoTypes.{newTypeVar, representedParamRef} +import UnificationDirection.* +import NameKinds.AvoidNameKind +import util.SimpleIdentitySet +import NullOpsDecorator.stripNull + +/** Methods for adding constraints and solving them. + * + * What goes into a Constraint as opposed to a ConstrainHandler? + * + * Constraint code is purely functional: Operations get constraints and produce new ones. + * Constraint code does not have access to a type-comparer. Anything regarding lubs and glbs has to be done + * elsewhere. + * + * By comparison: Constraint handlers are parts of type comparers and can use their functionality. + * Constraint handlers update the current constraint as a side effect. + */ +trait ConstraintHandling { + + def constr: config.Printers.Printer = config.Printers.constr + + protected def isSub(tp1: Type, tp2: Type)(using Context): Boolean + protected def isSame(tp1: Type, tp2: Type)(using Context): Boolean + + protected def constraint: Constraint + protected def constraint_=(c: Constraint): Unit + + private var addConstraintInvocations = 0 + + /** If the constraint is frozen we cannot add new bounds to the constraint. */ + protected var frozenConstraint: Boolean = false + + /** Potentially a type lambda that is still instantiatable, even though the constraint + * is generally frozen. + */ + protected var caseLambda: Type = NoType + + /** If set, align arguments `S1`, `S2`when taking the glb + * `T1 { X = S1 } & T2 { X = S2 }` of a constraint upper bound for some type parameter. + * Aligning means computing `S1 =:= S2` which may change the current constraint. + * See note in TypeComparer#distributeAnd. + */ + protected var homogenizeArgs: Boolean = false + + /** We are currently comparing type lambdas. Used as a flag for + * optimization: when `false`, no need to do an expensive `pruneLambdaParams` + */ + protected var comparedTypeLambdas: Set[TypeLambda] = Set.empty + + /** Used for match type reduction: If false, we don't recognize an abstract type + * to be a subtype type of any of its base classes. This is in place only at the + * toplevel; it is turned on again when we add parts of the scrutinee to the constraint. + */ + protected var canWidenAbstract: Boolean = true + + protected var myNecessaryConstraintsOnly = false + /** When collecting the constraints needed for a particular subtyping + * judgment to be true, we sometimes need to approximate the constraint + * set (see `TypeComparer#either` for example). + * + * Normally, this means adding extra constraints which may not be necessary + * for the subtyping judgment to be true, but if this variable is set to true + * we will instead under-approximate and keep only the constraints that must + * always be present for the subtyping judgment to hold. 
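// Illustrative sketch (not part of this patch): the division of labour
// described in the ConstraintHandling trait comment above, reduced to a toy.
// The constraint itself is purely functional (operations return a new
// constraint), while the handler keeps a mutable reference to the current
// constraint and updates it as a side effect. `ToyConstraint` and
// `ToyConstraintHandling` are hypothetical.
final case class ToyConstraint(bounds: Map[String, (String, String)]):
  // purely functional: never mutates, always returns a new constraint
  def updateEntry(param: String, lo: String, hi: String): ToyConstraint =
    copy(bounds = bounds.updated(param, (lo, hi)))

trait ToyConstraintHandling:
  protected var constraint: ToyConstraint = ToyConstraint(Map.empty)
  // handler side: replaces the current constraint in place
  protected def addOneBound(param: String, lo: String, hi: String): Unit =
    constraint = constraint.updateEntry(param, lo, hi)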
+ * + * This is needed for GADT bounds inference to be sound, but it is also used + * when constraining a method call based on its expected type to avoid adding + * constraints that would later prevent us from typechecking method + * arguments, see or-inf.scala and and-inf.scala for examples. + */ + protected def necessaryConstraintsOnly(using Context): Boolean = + ctx.mode.is(Mode.GadtConstraintInference) || myNecessaryConstraintsOnly + + /** If `trustBounds = false` we perform comparisons in a pessimistic way as follows: + * Given an abstract type `A >: L <: H`, a subtype comparison of any type + * with `A` will compare against both `L` and `H`. E.g. + * + * T <:< A if T <:< L and T <:< H + * A <:< T if L <:< T and H <:< T + * + * This restricted form makes sure we don't "forget" types when forming + * unions and intersections with abstract types that have bad bounds. E.g. + * the following example from neg/i8900.scala that @smarter came up with: + * We have a type variable X with constraints + * + * X >: 1, X >: x.M + * + * where `x` is a locally nested variable and `x.M` has bad bounds + * + * x.M >: Int | String <: Int & String + * + * If we trust bounds, then the lower bound of `X` is `x.M` since `x.M >: 1`. + * Then even if we correct levels on instantiation to eliminate the local `x`, + * it is alreay too late, we'd get `Int & String` as instance, which does not + * satisfy the original constraint `X >: 1`. + * + * But if `trustBounds` is false, we do not conclude the `x.M >: 1` since + * we compare both bounds and the upper bound `Int & String` is not a supertype + * of `1`. So the lower bound is `1 | x.M` and when we level-avoid that we + * get `1 | Int & String`, which simplifies to `Int`. + */ + private var myTrustBounds = true + + inline def withUntrustedBounds(op: => Type): Type = + val saved = myTrustBounds + myTrustBounds = false + try op finally myTrustBounds = saved + + def trustBounds: Boolean = + !Config.checkLevelsOnInstantiation || myTrustBounds + + def checkReset() = + assert(addConstraintInvocations == 0) + assert(frozenConstraint == false) + assert(caseLambda == NoType) + assert(homogenizeArgs == false) + assert(comparedTypeLambdas == Set.empty) + + def nestingLevel(param: TypeParamRef)(using Context) = constraint.typeVarOfParam(param) match + case tv: TypeVar => tv.nestingLevel + case _ => + // This should only happen when reducing match types (in + // TrackingTypeComparer#matchCases) or in uncommitable TyperStates (as + // asserted in ProtoTypes.constrained) and is special-cased in `levelOK` + // below. + Int.MaxValue + + /** Is `level` <= `maxLevel` or legal in the current context? */ + def levelOK(level: Int, maxLevel: Int)(using Context): Boolean = + level <= maxLevel + || ctx.isAfterTyper || !ctx.typerState.isCommittable // Leaks in these cases shouldn't break soundness + || level == Int.MaxValue // See `nestingLevel` above. + || !Config.checkLevelsOnConstraints + + /** If `param` is nested deeper than `maxLevel`, try to instantiate it to a + * fresh type variable of level `maxLevel` and return the new variable. + * If this isn't possible, throw a TypeError. 
+ */ + def atLevel(maxLevel: Int, param: TypeParamRef)(using Context): TypeParamRef = + if levelOK(nestingLevel(param), maxLevel) then + return param + LevelAvoidMap(0, maxLevel)(param) match + case freshVar: TypeVar => freshVar.origin + case _ => throw TypeError( + em"Could not decrease the nesting level of ${param} from ${nestingLevel(param)} to $maxLevel in $constraint") + + def nonParamBounds(param: TypeParamRef)(using Context): TypeBounds = constraint.nonParamBounds(param) + + /** The full lower bound of `param` includes both the `nonParamBounds` and the + * params in the constraint known to be `<: param`, except that + * params with a `nestingLevel` higher than `param` will be instantiated + * to a fresh param at a legal level. See the documentation of `TypeVar` + * for details. + */ + def fullLowerBound(param: TypeParamRef)(using Context): Type = + val maxLevel = nestingLevel(param) + var loParams = constraint.minLower(param) + if maxLevel != Int.MaxValue then + loParams = loParams.mapConserve(atLevel(maxLevel, _)) + loParams.foldLeft(nonParamBounds(param).lo)(_ | _) + + /** The full upper bound of `param`, see the documentation of `fullLowerBounds` above. */ + def fullUpperBound(param: TypeParamRef)(using Context): Type = + val maxLevel = nestingLevel(param) + var hiParams = constraint.minUpper(param) + if maxLevel != Int.MaxValue then + hiParams = hiParams.mapConserve(atLevel(maxLevel, _)) + hiParams.foldLeft(nonParamBounds(param).hi)(_ & _) + + /** Full bounds of `param`, including other lower/upper params. + * + * Note that underlying operations perform subtype checks - for this reason, recursing on `fullBounds` + * of some param when comparing types might lead to infinite recursion. Consider `bounds` instead. + */ + def fullBounds(param: TypeParamRef)(using Context): TypeBounds = + nonParamBounds(param).derivedTypeBounds(fullLowerBound(param), fullUpperBound(param)) + + /** An approximating map that prevents types nested deeper than maxLevel as + * well as WildcardTypes from leaking into the constraint. + */ + class LevelAvoidMap(topLevelVariance: Int, maxLevel: Int)(using Context) extends TypeOps.AvoidMap: + variance = topLevelVariance + + def toAvoid(tp: NamedType): Boolean = + tp.prefix == NoPrefix && !tp.symbol.isStatic && !levelOK(tp.symbol.nestingLevel, maxLevel) + + /** Return a (possibly fresh) type variable of a level no greater than `maxLevel` which is: + * - lower-bounded by `tp` if variance >= 0 + * - upper-bounded by `tp` if variance <= 0 + * If this isn't possible, return the empty range. + */ + def legalVar(tp: TypeVar): Type = + val oldParam = tp.origin + val nameKind = + if variance > 0 then AvoidNameKind.UpperBound + else if variance < 0 then AvoidNameKind.LowerBound + else AvoidNameKind.BothBounds + + /** If it exists, return the first param in the list created in a previous call to `legalVar(tp)` + * with the appropriate level and variance. 
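// Illustrative sketch (not part of this patch): how the "full" bounds above
// are assembled, with strings standing in for types. The non-param bound is
// combined with the parameters known to be smaller (folded with `|`) or
// greater (folded with `&`); the level adjustment step is omitted here.
def fullLowerBoundSketch(nonParamLo: String, lowerParams: List[String]): String =
  lowerParams.foldLeft(nonParamLo)((acc, p) => s"($acc | $p)")

def fullUpperBoundSketch(nonParamHi: String, upperParams: List[String]): String =
  upperParams.foldLeft(nonParamHi)((acc, p) => s"($acc & $p)")

// fullLowerBoundSketch("Nothing", List("P1", "P2")) == "((Nothing | P1) | P2)"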
+ */ + def findParam(params: List[TypeParamRef]): Option[TypeParamRef] = + params.find(p => + nestingLevel(p) <= maxLevel && representedParamRef(p) == oldParam && + (p.paramName.is(AvoidNameKind.BothBounds) || + variance != 0 && p.paramName.is(nameKind))) + + // First, check if we can reuse an existing parameter, this is more than an optimization + // since it avoids an infinite loop in tests/pos/i8900-cycle.scala + findParam(constraint.lower(oldParam)).orElse(findParam(constraint.upper(oldParam))) match + case Some(param) => + constraint.typeVarOfParam(param) + case _ => + // Otherwise, try to return a fresh type variable at `maxLevel` with + // the appropriate constraints. + val name = nameKind(oldParam.paramName.toTermName).toTypeName + val freshVar = newTypeVar(TypeBounds.upper(tp.topType), name, + nestingLevel = maxLevel, represents = oldParam) + val ok = + if variance < 0 then + addLess(freshVar.origin, oldParam) + else if variance > 0 then + addLess(oldParam, freshVar.origin) + else + unify(freshVar.origin, oldParam) + if ok then freshVar else emptyRange + end legalVar + + override def apply(tp: Type): Type = tp match + case tp: TypeVar if !tp.isInstantiated && !levelOK(tp.nestingLevel, maxLevel) => + legalVar(tp) + // TypeParamRef can occur in tl bounds + case tp: TypeParamRef => + constraint.typeVarOfParam(tp) match + case tvar: TypeVar => + apply(tvar) + case _ => super.apply(tp) + case _ => + super.apply(tp) + + override def mapWild(t: WildcardType) = + if ctx.mode.is(Mode.TypevarsMissContext) then super.mapWild(t) + else + val tvar = newTypeVar(apply(t.effectiveBounds).toBounds, nestingLevel = maxLevel) + tvar + end LevelAvoidMap + + /** Approximate `rawBound` if needed to make it a legal bound of `param` by + * avoiding wildcards and types with a level strictly greater than its + * `nestingLevel`. + * + * Note that level-checking must be performed here and cannot be delayed + * until instantiation because if we allow level-incorrect bounds, then we + * might end up reasoning with bad bounds outside of the scope where they are + * defined. This can lead to level-correct but unsound instantiations as + * demonstrated by tests/neg/i8900.scala. + */ + protected def legalBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Type = + // Over-approximate for soundness. + var variance = if isUpper then -1 else 1 + // ...unless we can only infer necessary constraints, in which case we + // flip the variance to under-approximate. + if necessaryConstraintsOnly then variance = -variance + + val approx = new LevelAvoidMap(variance, nestingLevel(param)): + override def legalVar(tp: TypeVar): Type = + // `legalVar` will create a type variable whose bounds depend on + // `variance`, but whether the variance is positive or negative, + // we can still infer necessary constraints since just creating a + // type variable doesn't reduce the set of possible solutions. + // Therefore, we can safely "unflip" the variance flipped above. + // This is necessary for i8900-unflip.scala to typecheck. + val v = if necessaryConstraintsOnly then -this.variance else this.variance + atVariance(v)(super.legalVar(tp)) + approx(rawBound) + end legalBound + + protected def addOneBound(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Boolean = + if !constraint.contains(param) then true + else if !isUpper && param.occursIn(rawBound) then + // We don't allow recursive lower bounds when defining a type, + // so we shouldn't allow them as constraints either. 
+ false + else + val bound = legalBound(param, rawBound, isUpper) + val oldBounds @ TypeBounds(lo, hi) = constraint.nonParamBounds(param) + val equalBounds = (if isUpper then lo else hi) eq bound + if equalBounds && !bound.existsPart(_ eq param, StopAt.Static) then + // The narrowed bounds are equal and not recursive, + // so we can remove `param` from the constraint. + constraint = constraint.replace(param, bound) + true + else + // Narrow one of the bounds of type parameter `param` + // If `isUpper` is true, ensure that `param <: `bound`, otherwise ensure + // that `param >: bound`. + val narrowedBounds = + val saved = homogenizeArgs + homogenizeArgs = Config.alignArgsInAnd + try + withUntrustedBounds( + if isUpper then oldBounds.derivedTypeBounds(lo, hi & bound) + else oldBounds.derivedTypeBounds(lo | bound, hi)) + finally + homogenizeArgs = saved + //println(i"narrow bounds for $param from $oldBounds to $narrowedBounds") + val c1 = constraint.updateEntry(param, narrowedBounds) + (c1 eq constraint) + || { + constraint = c1 + val TypeBounds(lo, hi) = constraint.entry(param): @unchecked + isSub(lo, hi) + } + end addOneBound + + protected def addBoundTransitively(param: TypeParamRef, rawBound: Type, isUpper: Boolean)(using Context): Boolean = + + /** Adjust the bound `tp` in the following ways: + * + * 1. Toplevel occurrences of TypeRefs that are instantiated in the current + * constraint are also dereferenced. + * 2. Toplevel occurrences of ExprTypes lead to a `NoType` return, which + * causes the addOneBound operation to fail. + * + * An occurrence is toplevel if it is the bound itself, or a term in some + * combination of `&` or `|` types. + */ + def adjust(tp: Type): Type = tp match + case tp: AndOrType => + val p1 = adjust(tp.tp1) + val p2 = adjust(tp.tp2) + if p1.exists && p2.exists then tp.derivedAndOrType(p1, p2) else NoType + case tp: TypeVar if constraint.contains(tp.origin) => + adjust(tp.underlying) + case tp: ExprType => + // ExprTypes are not value types, so type parameters should not + // be instantiated to ExprTypes. A scenario where such an attempted + // instantiation can happen is if we unify (=> T) => () with A => () + // where A is a TypeParamRef. See the comment on EtaExpansion.etaExpand + // why types such as (=> T) => () can be constructed and i7969.scala + // as a test where this happens. + // Note that scalac by contrast allows such instantiations. But letting + // type variables be ExprTypes has its own problems (e.g. you can't write + // the resulting types down) and is largely unknown terrain. + NoType + case _ => + tp + + def description = i"constraint $param ${if isUpper then "<:" else ":>"} $rawBound to\n$constraint" + constr.println(i"adding $description$location") + if isUpper && rawBound.isRef(defn.NothingClass) && ctx.typerState.isGlobalCommittable then + def msg = i"!!! 
instantiated to Nothing: $param, constraint = $constraint" + if Config.failOnInstantiationToNothing + then assert(false, msg) + else report.log(msg) + def others = if isUpper then constraint.lower(param) else constraint.upper(param) + val bound = adjust(rawBound) + bound.exists + && addOneBound(param, bound, isUpper) && others.forall(addOneBound(_, bound, isUpper)) + .showing(i"added $description = $result$location", constr) + end addBoundTransitively + + protected def addLess(p1: TypeParamRef, p2: TypeParamRef)(using Context): Boolean = { + def description = i"ordering $p1 <: $p2 to\n$constraint" + val res = + if (constraint.isLess(p2, p1)) unify(p2, p1) + else { + val down1 = p1 :: constraint.exclusiveLower(p1, p2) + val up2 = p2 :: constraint.exclusiveUpper(p2, p1) + val lo1 = constraint.nonParamBounds(p1).lo + val hi2 = constraint.nonParamBounds(p2).hi + constr.println(i"adding $description down1 = $down1, up2 = $up2$location") + constraint = constraint.addLess(p1, p2) + down1.forall(addOneBound(_, hi2, isUpper = true)) && + up2.forall(addOneBound(_, lo1, isUpper = false)) + } + constr.println(i"added $description = $res$location") + res + } + + def location(using Context) = "" // i"in ${ctx.typerState.stateChainStr}" // use for debugging + + /** Unify p1 with p2: one parameter will be kept in the constraint, the + * other will be removed and its bounds transferred to the remaining one. + * + * If p1 and p2 have different `nestingLevel`, the parameter with the lowest + * level will be kept and the transferred bounds from the other parameter + * will be adjusted for level-correctness. + */ + private def unify(p1: TypeParamRef, p2: TypeParamRef)(using Context): Boolean = { + constr.println(s"unifying $p1 $p2") + if !constraint.isLess(p1, p2) then + constraint = constraint.addLess(p1, p2) + + val level1 = nestingLevel(p1) + val level2 = nestingLevel(p2) + val pKept = if level1 <= level2 then p1 else p2 + val pRemoved = if level1 <= level2 then p2 else p1 + + val down = constraint.exclusiveLower(p2, p1) + val up = constraint.exclusiveUpper(p1, p2) + + constraint = constraint.addLess(p2, p1, direction = if pKept eq p1 then KeepParam2 else KeepParam1) + + val boundKept = constraint.nonParamBounds(pKept).substParam(pRemoved, pKept) + var boundRemoved = constraint.nonParamBounds(pRemoved).substParam(pRemoved, pKept) + + if level1 != level2 then + boundRemoved = LevelAvoidMap(-1, math.min(level1, level2))(boundRemoved) + val TypeBounds(lo, hi) = boundRemoved: @unchecked + // After avoidance, the interval might be empty, e.g. 
in + // tests/pos/i8900-promote.scala: + // >: x.type <: Singleton + // becomes: + // >: Int <: Singleton + // In that case, we can still get a legal constraint + // by replacing the lower-bound to get: + // >: Int & Singleton <: Singleton + if !isSub(lo, hi) then + boundRemoved = TypeBounds(lo & hi, hi) + + val newBounds = (boundKept & boundRemoved).bounds + constraint = constraint.updateEntry(pKept, newBounds).replace(pRemoved, pKept) + + val lo = newBounds.lo + val hi = newBounds.hi + isSub(lo, hi) && + down.forall(addOneBound(_, hi, isUpper = true)) && + up.forall(addOneBound(_, lo, isUpper = false)) + } + + protected def isSubType(tp1: Type, tp2: Type, whenFrozen: Boolean)(using Context): Boolean = + if (whenFrozen) + isSubTypeWhenFrozen(tp1, tp2) + else + isSub(tp1, tp2) + + inline final def inFrozenConstraint[T](op: => T): T = { + val savedFrozen = frozenConstraint + val savedLambda = caseLambda + frozenConstraint = true + caseLambda = NoType + try op + finally { + frozenConstraint = savedFrozen + caseLambda = savedLambda + } + } + + final def isSubTypeWhenFrozen(tp1: Type, tp2: Type)(using Context): Boolean = inFrozenConstraint(isSub(tp1, tp2)) + final def isSameTypeWhenFrozen(tp1: Type, tp2: Type)(using Context): Boolean = inFrozenConstraint(isSame(tp1, tp2)) + + /** Test whether the lower bounds of all parameters in this + * constraint are a solution to the constraint. + */ + protected final def isSatisfiable(using Context): Boolean = + constraint.forallParams { param => + val TypeBounds(lo, hi) = constraint.entry(param): @unchecked + isSub(lo, hi) || { + report.log(i"sub fail $lo <:< $hi") + false + } + } + + /** Fix instance type `tp` by avoidance so that it does not contain references + * to types at level > `maxLevel`. + * @param tp the type to be fixed + * @param fromBelow whether type was obtained from lower bound + * @param maxLevel the maximum level of references allowed + * @param param the parameter that was instantiated + */ + private def fixLevels(tp: Type, fromBelow: Boolean, maxLevel: Int, param: TypeParamRef)(using Context) = + + def needsFix(tp: NamedType)(using Context) = + (tp.prefix eq NoPrefix) && tp.symbol.nestingLevel > maxLevel + + /** An accumulator that determines whether levels need to be fixed + * and computes on the side sets of nested type variables that need + * to be instantiated. + */ + def needsLeveling = new TypeAccumulator[Boolean]: + if !fromBelow then variance = -1 + + def apply(need: Boolean, tp: Type) = + need || tp.match + case tp: NamedType => + needsFix(tp) + || !stopBecauseStaticOrLocal(tp) && apply(need, tp.prefix) + case tp: TypeVar => + val inst = tp.instanceOpt + if inst.exists then apply(need, inst) + else if tp.nestingLevel > maxLevel then + // Change the nesting level of inner type variable to `maxLevel`. + // This means that the type variable will be instantiated later to a + // less nested type. If there are other references to the same type variable + // that do not come from the type undergoing `fixLevels`, this could lead + // to coarser types than intended. An alternative is to instantiate the + // type variable right away, but this also loses information. See + // i15934.scala for a test where the current strategey works but an early instantiation + // of `tp` would fail. 
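// Illustrative sketch (not part of this patch): the bound-repair step of
// `unify` above, with types modelled as sets of values so that "subtype"
// becomes "subset". If avoidance leaves an empty interval (lo is no longer a
// subtype of hi), intersecting the lower bound with the upper bound restores
// lo <: hi without weakening the upper bound.
def repairBoundsSketch(lo: Set[Int], hi: Set[Int]): (Set[Int], Set[Int]) =
  if lo.subsetOf(hi) then (lo, hi)     // already a legal interval
  else (lo.intersect(hi), hi)          // (lo & hi) <: hi holds by construction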
+ constr.println(i"widening nesting level of type variable $tp from ${tp.nestingLevel} to $maxLevel") + ctx.typerState.setNestingLevel(tp, maxLevel) + true + else false + case _ => + foldOver(need, tp) + end needsLeveling + + def levelAvoid = new TypeOps.AvoidMap: + if !fromBelow then variance = -1 + def toAvoid(tp: NamedType) = needsFix(tp) + + if Config.checkLevelsOnInstantiation && !ctx.isAfterTyper && needsLeveling(false, tp) then + typr.println(i"instance $tp for $param needs leveling to $maxLevel") + levelAvoid(tp) + else tp + end fixLevels + + /** Solve constraint set for given type parameter `param`. + * If `fromBelow` is true the parameter is approximated by its lower bound, + * otherwise it is approximated by its upper bound, unless the upper bound + * contains a reference to the parameter itself (such occurrences can arise + * for F-bounded types, `addOneBound` ensures that they never occur in the + * lower bound). + * The solved type is not allowed to contain references to types nested deeper + * than `maxLevel`. + * Wildcard types in bounds are approximated by their upper or lower bounds. + * The constraint is left unchanged. + * @return the instantiating type + * @pre `param` is in the constraint's domain. + */ + final def approximation(param: TypeParamRef, fromBelow: Boolean, maxLevel: Int)(using Context): Type = + constraint.entry(param) match + case entry: TypeBounds => + val useLowerBound = fromBelow || param.occursIn(entry.hi) + val rawInst = withUntrustedBounds( + if useLowerBound then fullLowerBound(param) else fullUpperBound(param)) + val levelInst = fixLevels(rawInst, fromBelow, maxLevel, param) + if levelInst ne rawInst then + typr.println(i"level avoid for $maxLevel: $rawInst --> $levelInst") + typr.println(i"approx $param, from below = $fromBelow, inst = $levelInst") + levelInst + case inst => + assert(inst.exists, i"param = $param\nconstraint = $constraint") + inst + end approximation + + private def isTransparent(tp: Type, traitOnly: Boolean)(using Context): Boolean = tp match + case AndType(tp1, tp2) => + isTransparent(tp1, traitOnly) && isTransparent(tp2, traitOnly) + case _ => + val cls = tp.underlyingClassRef(refinementOK = false).typeSymbol + cls.isTransparentClass && (!traitOnly || cls.is(Trait)) + + /** If `tp` is an intersection such that some operands are transparent trait instances + * and others are not, replace as many transparent trait instances as possible with Any + * as long as the result is still a subtype of `bound`. But fall back to the + * original type if the resulting widened type is a supertype of all dropped + * types (since in this case the type was not a true intersection of transparent traits + * and other types to start with). 
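// Illustrative sketch (not part of this patch): the greedy loop described in
// the comment above, modelling an intersection as a set of named components.
// Transparent components are dropped one at a time while the remaining
// intersection still satisfies the bound; a component whose removal would
// break the bound is marked as kept and never tried again. The final
// "fall back to the original type" check and the constraint rollback are
// omitted from this sketch.
def dropTransparentSketch(
    components: Set[String],
    isTransparent: String => Boolean,
    satisfiesBound: Set[String] => Boolean): Set[String] =
  @annotation.tailrec
  def loop(current: Set[String], kept: Set[String]): Set[String] =
    current.find(c => isTransparent(c) && !kept(c)) match
      case None => current                                   // nothing droppable left
      case Some(c) =>
        val widened = current - c
        if satisfiesBound(widened) then loop(widened, kept)  // drop succeeded
        else loop(current, kept + c)                         // must keep c, try others
  loop(components, Set.empty)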
+ */ + def dropTransparentTraits(tp: Type, bound: Type)(using Context): Type = + var kept: Set[Type] = Set() // types to keep since otherwise bound would not fit + var dropped: List[Type] = List() // the types dropped so far, last one on top + + def dropOneTransparentTrait(tp: Type): Type = + if isTransparent(tp, traitOnly = true) && !kept.contains(tp) then + dropped = tp :: dropped + defn.AnyType + else tp match + case AndType(tp1, tp2) => + val tp1w = dropOneTransparentTrait(tp1) + if tp1w ne tp1 then tp1w & tp2 + else + val tp2w = dropOneTransparentTrait(tp2) + if tp2w ne tp2 then tp1 & tp2w + else tp + case _ => + tp + + def recur(tp: Type): Type = + val tpw = dropOneTransparentTrait(tp) + if tpw eq tp then tp + else if tpw <:< bound then recur(tpw) + else + kept += dropped.head + dropped = dropped.tail + recur(tp) + + val saved = ctx.typerState.snapshot() + val tpw = recur(tp) + if (tpw eq tp) || dropped.forall(_ frozen_<:< tpw) then + // Rollback any constraint change that would lead to `tp` no longer + // being a valid solution. + ctx.typerState.resetTo(saved) + tp + else + tpw + end dropTransparentTraits + + /** If `tp` is an applied match type alias which is also an unreducible application + * of a higher-kinded type to a wildcard argument, widen to the match type's bound, + * in order to avoid an unreducible application of higher-kinded type ... in inferred type" + * error in PostTyper. Fixes #11246. + */ + def widenIrreducible(tp: Type)(using Context): Type = tp match + case tp @ AppliedType(tycon, _) if tycon.isLambdaSub && tp.hasWildcardArg => + tp.superType match + case MatchType(bound, _, _) => bound + case _ => tp + case _ => + tp + + /** Widen inferred type `inst` with upper `bound`, according to the following rules: + * 1. If `inst` is a singleton type, or a union containing some singleton types, + * widen (all) the singleton type(s), provided the result is a subtype of `bound` + * (i.e. `inst.widenSingletons <:< bound` succeeds with satisfiable constraint) and + * is not transparent according to `isTransparent`. + * 2a. If `inst` is a union type and `widenUnions` is true, approximate the union type + * from above by an intersection of all common base types, provided the result + * is a subtype of `bound`. + * 2b. If `inst` is a union type and `widenUnions` is false, turn it into a hard + * union type (except for unions | Null, which are kept in the state they were). + * 3. Widen some irreducible applications of higher-kinded types to wildcard arguments + * (see @widenIrreducible). + * 4. Drop transparent traits from intersections (see @dropTransparentTraits). + * + * Don't do these widenings if `bound` is a subtype of `scala.Singleton`. + * Also, if the result of these widenings is a TypeRef to a module class, + * and this type ref is different from `inst`, replace by a TermRef to + * its source module instead. + * + * At this point we also drop the @Repeated annotation to avoid inferring type arguments with it, + * as those could leak the annotation to users (see run/inferred-repeated-result). 
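// Illustrative sketch (not part of this patch): the order of the widenings
// described above, expressed as plain function composition over an opaque
// type `T`. The four step functions stand in for widenSingletons,
// widenUnion/hardenUnions, dropTransparentTraits and widenIrreducible and are
// parameters of the sketch; a Singleton upper bound suppresses all widening.
// The final module-TermRef and @Repeated adjustments are omitted.
def widenInferredSketch[T](
    inst: T, boundIsSingleton: Boolean,
    widenSingle: T => T, widenOr: T => T,
    dropTransparent: T => T, widenIrreducible: T => T): T =
  if boundIsSingleton then inst
  else widenIrreducible(dropTransparent(widenOr(widenSingle(inst))))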
+ */ + def widenInferred(inst: Type, bound: Type, widenUnions: Boolean)(using Context): Type = + def widenOr(tp: Type) = + if widenUnions then + val tpw = tp.widenUnion + if (tpw ne tp) && !isTransparent(tpw, traitOnly = false) && (tpw <:< bound) then tpw else tp + else tp.hardenUnions + + def widenSingle(tp: Type) = + val tpw = tp.widenSingletons + if (tpw ne tp) && (tpw <:< bound) then tpw else tp + + def isSingleton(tp: Type): Boolean = tp match + case WildcardType(optBounds) => optBounds.exists && isSingleton(optBounds.bounds.hi) + case _ => isSubTypeWhenFrozen(tp, defn.SingletonType) + + val wideInst = + if isSingleton(bound) then inst + else + val widenedFromSingle = widenSingle(inst) + val widenedFromUnion = widenOr(widenedFromSingle) + val widened = dropTransparentTraits(widenedFromUnion, bound) + widenIrreducible(widened) + + wideInst match + case wideInst: TypeRef if wideInst.symbol.is(Module) => + TermRef(wideInst.prefix, wideInst.symbol.sourceModule) + case _ => + wideInst.dropRepeatedAnnot + end widenInferred + + /** Convert all toplevel union types in `tp` to hard unions */ + extension (tp: Type) private def hardenUnions(using Context): Type = tp.widen match + case tp: AndType => + tp.derivedAndType(tp.tp1.hardenUnions, tp.tp2.hardenUnions) + case tp: RefinedType => + tp.derivedRefinedType(tp.parent.hardenUnions, tp.refinedName, tp.refinedInfo) + case tp: RecType => + tp.rebind(tp.parent.hardenUnions) + case tp: HKTypeLambda => + tp.derivedLambdaType(resType = tp.resType.hardenUnions) + case tp: OrType => + val tp1 = tp.stripNull + if tp1 ne tp then tp.derivedOrType(tp1.hardenUnions, defn.NullType) + else tp.derivedOrType(tp.tp1.hardenUnions, tp.tp2.hardenUnions, soft = false) + case _ => + tp + + /** The instance type of `param` in the current constraint (which contains `param`). + * If `fromBelow` is true, the instance type is the lub of the parameter's + * lower bounds; otherwise it is the glb of its upper bounds. However, + * a lower bound instantiation can be a singleton type only if the upper bound + * is also a singleton type. + * The instance type is not allowed to contain references to types nested deeper + * than `maxLevel`. + */ + def instanceType(param: TypeParamRef, fromBelow: Boolean, widenUnions: Boolean, maxLevel: Int)(using Context): Type = { + val approx = approximation(param, fromBelow, maxLevel).simplified + if fromBelow then + val widened = widenInferred(approx, param, widenUnions) + // Widening can add extra constraints, in particular the widened type might + // be a type variable which is now instantiated to `param`, and therefore + // cannot be used as an instantiation of `param` without creating a loop. + // If that happens, we run `instanceType` again to find a new instantation. + // (we do not check for non-toplevel occurences: those should never occur + // since `addOneBound` disallows recursive lower bounds). + if constraint.occursAtToplevel(param, widened) then + instanceType(param, fromBelow, widenUnions, maxLevel) + else + widened + else + approx + } + + /** Constraint `c1` subsumes constraint `c2`, if under `c2` as constraint we have + * for all poly params `p` defined in `c2` as `p >: L2 <: U2`: + * + * c1 defines p with bounds p >: L1 <: U1, and + * L2 <: L1, and + * U1 <: U2 + * + * Both `c1` and `c2` are required to derive from constraint `pre`, without adding + * any new type variables but possibly narrowing already registered ones with further bounds. 
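// Illustrative sketch (not part of this patch): the subsumption condition
// documented above, on toy constraints whose bounds are integer intervals.
// c1 subsumes c2 when every parameter bound in c2 is also bound in c1 with a
// narrower (or equal) interval there. The parameter-ordering part of the real
// check is omitted.
type ToyBoundsMap = Map[String, (Int, Int)]   // param -> (lo, hi)

def subsumesSketch(c1: ToyBoundsMap, c2: ToyBoundsMap): Boolean =
  c2.forall { case (p, (lo2, hi2)) =>
    c1.get(p) match
      case Some((lo1, hi1)) => lo2 <= lo1 && hi1 <= hi2   // L2 <: L1 and U1 <: U2
      case None             => false
  }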
+ */ + protected final def subsumes(c1: Constraint, c2: Constraint, pre: Constraint)(using Context): Boolean = + if (c2 eq pre) true + else if (c1 eq pre) false + else { + val saved = constraint + try + // We iterate over params of `pre`, instead of `c2` as the documentation may suggest. + // As neither `c1` nor `c2` can have more params than `pre`, this only matters in one edge case. + // Constraint#forallParams only iterates over params that can be directly constrained. + // If `c2` has, compared to `pre`, instantiated a param and we iterated over params of `c2`, + // we could miss that param being instantiated to an incompatible type in `c1`. + pre.forallParams(p => + c1.entry(p).exists + && c2.upper(p).forall(c1.isLess(p, _)) + && isSubTypeWhenFrozen(c1.nonParamBounds(p), c2.nonParamBounds(p)) + ) + finally constraint = saved + } + + /** The current bounds of type parameter `param` */ + def bounds(param: TypeParamRef)(using Context): TypeBounds = { + val e = constraint.entry(param) + if (e.exists) e.bounds + else { + // TODO: should we change the type of paramInfos to nullable? + val pinfos: List[param.binder.PInfo] | Null = param.binder.paramInfos + if (pinfos != null) pinfos(param.paramNum) // pinfos == null happens in pos/i536.scala + else TypeBounds.empty + } + } + + /** Add type lambda `tl`, possibly with type variables `tvars`, to current constraint + * and propagate all bounds. + * @param tvars See Constraint#add + */ + def addToConstraint(tl: TypeLambda, tvars: List[TypeVar])(using Context): Boolean = + checkPropagated(i"initialized $tl") { + constraint = constraint.add(tl, tvars) + tl.paramRefs.forall { param => + val lower = constraint.lower(param) + val upper = constraint.upper(param) + constraint.entry(param) match { + case bounds: TypeBounds => + if lower.nonEmpty && !bounds.lo.isRef(defn.NothingClass) + || upper.nonEmpty && !bounds.hi.isAny + then constr.println(i"INIT*** $tl") + lower.forall(addOneBound(_, bounds.hi, isUpper = true)) && + upper.forall(addOneBound(_, bounds.lo, isUpper = false)) + case x => + // Happens if param was already solved while processing earlier params of the same TypeLambda. + // See #4720. + true + } + } + } + + /** Can `param` be constrained with new bounds? */ + final def canConstrain(param: TypeParamRef): Boolean = + (!frozenConstraint || (caseLambda `eq` param.binder)) && constraint.contains(param) + + /** Is `param` assumed to be a sub- and super-type of any other type? + * This holds if `TypeVarsMissContext` is set unless `param` is a part + * of a MatchType that is currently normalized. + */ + final def assumedTrue(param: TypeParamRef)(using Context): Boolean = + ctx.mode.is(Mode.TypevarsMissContext) && (caseLambda `ne` param.binder) + + /** Add constraint `param <: bound` if `fromBelow` is false, `param >: bound` otherwise. + * `bound` is assumed to be in normalized form, as specified in `firstTry` and + * `secondTry` of `TypeComparer`. In particular, it should not be an alias type, + * lazy ref, typevar, wildcard type, error type. In addition, upper bounds may + * not be AndTypes and lower bounds may not be OrTypes. This is assured by the + * way isSubType is organized. + */ + protected def addConstraint(param: TypeParamRef, bound: Type, fromBelow: Boolean)(using Context): Boolean = + if !bound.isValueTypeOrLambda then return false + + /** When comparing lambdas we might get constraints such as + * `A <: X0` or `A = List[X0]` where `A` is a constrained parameter + * and `X0` is a lambda parameter. 
The constraint for `A` is not allowed + * to refer to such a lambda parameter because the lambda parameter is + * not visible where `A` is defined. Consequently, we need to + * approximate the bound so that the lambda parameter does not appear in it. + * If `tp` is an upper bound, we need to approximate with something smaller, + * otherwise something larger. + * Test case in pos/i94-nada.scala. This test crashes with an illegal instance + * error in Test2 when the rest of the SI-2712 fix is applied but `pruneLambdaParams` is + * missing. + */ + def avoidLambdaParams(tp: Type) = + if comparedTypeLambdas.nonEmpty then + val approx = new ApproximatingTypeMap { + if (!fromBelow) variance = -1 + def apply(t: Type): Type = t match { + case t @ TypeParamRef(tl: TypeLambda, n) if comparedTypeLambdas contains tl => + val bounds = tl.paramInfos(n) + range(bounds.lo, bounds.hi) + case tl: TypeLambda => + val saved = comparedTypeLambdas + comparedTypeLambdas -= tl + try mapOver(tl) + finally comparedTypeLambdas = saved + case _ => + mapOver(t) + } + } + approx(tp) + else tp + + def addParamBound(bound: TypeParamRef) = + constraint.entry(param) match { + case _: TypeBounds => + if (fromBelow) addLess(bound, param) else addLess(param, bound) + case tp => + if (fromBelow) isSub(bound, tp) else isSub(tp, bound) + } + + def kindCompatible(tp1: Type, tp2: Type): Boolean = + val tparams1 = tp1.typeParams + val tparams2 = tp2.typeParams + tparams1.corresponds(tparams2)((p1, p2) => kindCompatible(p1.paramInfo, p2.paramInfo)) + && (tparams1.isEmpty || kindCompatible(tp1.hkResult, tp2.hkResult)) + || tp1.hasAnyKind + || tp2.hasAnyKind + + def description = i"constr $param ${if (fromBelow) ">:" else "<:"} $bound:\n$constraint" + + //checkPropagated(s"adding $description")(true) // DEBUG in case following fails + checkPropagated(s"added $description") { + addConstraintInvocations += 1 + val saved = canWidenAbstract + canWidenAbstract = true + try bound match + case bound: TypeParamRef if constraint contains bound => + addParamBound(bound) + case _ => + val pbound = avoidLambdaParams(bound) + kindCompatible(param, pbound) && addBoundTransitively(param, pbound, !fromBelow) + finally + canWidenAbstract = saved + addConstraintInvocations -= 1 + } + end addConstraint + + /** Check that constraint is fully propagated. 
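// Illustrative sketch (not part of this patch): the kind-compatibility check
// used by `addConstraint` above, as a recursive arity check on a toy kind
// representation. `ToyKind` is hypothetical; the AnyKind escape hatch of the
// real check is modelled by a dedicated case.
enum ToyKind:
  case Proper                                        // a plain type
  case Ctor(params: List[ToyKind], result: ToyKind)  // a type constructor
  case AnyKind                                       // compatible with everything

def kindCompatibleSketch(k1: ToyKind, k2: ToyKind): Boolean = (k1, k2) match
  case (ToyKind.AnyKind, _) | (_, ToyKind.AnyKind) => true
  case (ToyKind.Proper, ToyKind.Proper)            => true
  case (ToyKind.Ctor(ps1, r1), ToyKind.Ctor(ps2, r2)) =>
    ps1.corresponds(ps2)(kindCompatibleSketch) && kindCompatibleSketch(r1, r2)
  case _ => false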
See comment in Config.checkConstraintsPropagated */ + def checkPropagated(msg: => String)(result: Boolean)(using Context): Boolean = { + if (Config.checkConstraintsPropagated && result && addConstraintInvocations == 0) + inFrozenConstraint { + for (p <- constraint.domainParams) { + def check(cond: => Boolean, q: TypeParamRef, ordering: String, explanation: String): Unit = + assert(cond, i"propagation failure for $p $ordering $q: $explanation\n$msg") + for (u <- constraint.upper(p)) + check(bounds(p).hi <:< bounds(u).hi, u, "<:", "upper bound not propagated") + for (l <- constraint.lower(p)) { + check(bounds(l).lo <:< bounds(p).hi, l, ">:", "lower bound not propagated") + check(constraint.isLess(l, p), l, ">:", "reverse ordering (<:) missing") + } + } + } + result + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/ConstraintRunInfo.scala b/tests/pos-with-compiler-cc/dotc/core/ConstraintRunInfo.scala new file mode 100644 index 000000000000..d2b1246a8149 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/ConstraintRunInfo.scala @@ -0,0 +1,23 @@ +package dotty.tools.dotc +package core + +import Contexts._ +import config.Printers.{default, typr} + +trait ConstraintRunInfo { self: Run => + private var maxSize = 0 + private var maxConstraint: Constraint | Null = _ + def recordConstraintSize(c: Constraint, size: Int): Unit = + if (size > maxSize) { + maxSize = size + maxConstraint = c + } + def printMaxConstraint()(using Context): Unit = + if maxSize > 0 then + val printer = if ctx.settings.YdetailedStats.value then default else typr + printer.println(s"max constraint size: $maxSize") + try printer.println(s"max constraint = ${maxConstraint.nn.show}") + catch case ex: StackOverflowError => printer.println("max constraint cannot be printed due to stack overflow") + + protected def reset(): Unit = maxConstraint = null +} diff --git a/tests/pos-with-compiler-cc/dotc/core/ContextOps.scala b/tests/pos-with-compiler-cc/dotc/core/ContextOps.scala new file mode 100644 index 000000000000..20687dc1663a --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/ContextOps.scala @@ -0,0 +1,115 @@ +package dotty.tools.dotc +package core + +import Contexts._, Symbols._, Types._, Flags._ +import Denotations._, SymDenotations._ +import Names.Name, StdNames.nme +import ast.untpd +import caps.unsafe.unsafeBoxFunArg + +/** Extension methods for contexts where we want to keep the ctx. syntax */ +object ContextOps: + + extension (ctx: Context) + + /** Enter symbol into current class, if current class is owner of current context, + * or into current scope, if not. Should always be called instead of scope.enter + * in order to make sure that updates to class members are reflected in + * finger prints. 
+ */ + def enter(sym: Symbol): Symbol = inContext(ctx) { + ctx.owner match + case cls: ClassSymbol => cls.classDenot.enter(sym) + case _ => ctx.scope.openForMutations.enter(sym) + sym + } + + /** The denotation with the given `name` and all `required` flags in current context + */ + def denotNamed(name: Name, required: FlagSet = EmptyFlags, excluded: FlagSet = EmptyFlags): Denotation = + inContext(ctx) { + if (ctx.owner.isClass) + if (ctx.outer.owner == ctx.owner) { // inner class scope; check whether we are referring to self + if (ctx.scope.size == 1) { + val elem = ctx.scope.lastEntry.nn + if (elem.name == name) return elem.sym.denot // return self + } + val pre = ctx.owner.thisType + if ctx.isJava then javaFindMember(name, pre, required, excluded) + else pre.findMember(name, pre, required, excluded) + } + else // we are in the outermost context belonging to a class; self is invisible here. See inClassContext. + ctx.owner.findMember(name, ctx.owner.thisType, required, excluded) + else + ctx.scope.denotsNamed(name).filterWithFlags(required, excluded).toDenot(NoPrefix) + } + + final def javaFindMember(name: Name, pre: Type, required: FlagSet = EmptyFlags, excluded: FlagSet = EmptyFlags): Denotation = + assert(ctx.isJava) + inContext(ctx) { + + val preSym = pre.typeSymbol + + // 1. Try to search in current type and parents. + val directSearch = pre.findMember(name, pre, required, excluded) + + // 2. Try to search in companion class if current is an object. + def searchCompanionClass = if preSym.is(Flags.Module) then + preSym.companionClass.thisType.findMember(name, pre, required, excluded) + else NoDenotation + + // 3. Try to search in companion objects of super classes. + // In Java code, static inner classes, which we model as members of the companion object, + // can be referenced from an ident in a subclass or by a selection prefixed by the subclass. + def searchSuperCompanionObjects = + val toSearch = if preSym.is(Flags.Module) then + if preSym.companionClass.exists then + preSym.companionClass.asClass.baseClasses + else Nil + else + preSym.asClass.baseClasses + + toSearch.iterator.map { bc => + val pre1 = bc.companionModule.namedType + pre1.findMember(name, pre1, required, excluded) + }.find(_.exists).getOrElse(NoDenotation) + + if preSym.isClass then + directSearch orElse searchCompanionClass orElse searchSuperCompanionObjects + else + directSearch + } + + /** A fresh local context with given tree and owner. + * Owner might not exist (can happen for self valdefs), in which case + * no owner is set in result context + */ + def localContext(tree: untpd.Tree, owner: Symbol): FreshContext = inContext(ctx) { + val freshCtx = ctx.fresh.setTree(tree) + if owner.exists then freshCtx.setOwner(owner) else freshCtx + } + + /** Context where `sym` is defined, assuming we are in a nested context. 
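// Illustrative sketch (not part of this patch): the outer-context walk used by
// `defContext` (whose definition follows), on a plain list of owner symbols
// instead of a chain of contexts. We skip every context until we reach those
// owned by `sym`, then skip those as well; the next one is the context in
// which `sym` itself was defined.
def defContextSketch(outerOwners: List[String], sym: String): Option[String] =
  outerOwners
    .dropWhile(_ != sym)
    .dropWhile(_ == sym)
    .headOption

// defContextSketch(List("meth", "meth", "Cls", "pkg"), "meth") == Some("Cls")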
*/ + def defContext(sym: Symbol): Context = inContext(ctx) { + ctx.outersIterator + .dropWhile(((ctx: Context) => ctx.owner != sym).unsafeBoxFunArg) + .dropWhile(((ctx: Context) => ctx.owner == sym).unsafeBoxFunArg) + .next() + } + + /** A new context for the interior of a class */ + def inClassContext(selfInfo: TypeOrSymbol): Context = + inline def op(using Context): Context = + val localCtx: Context = ctx.fresh.setNewScope + selfInfo match { + case sym: Symbol if sym.exists && sym.name != nme.WILDCARD => localCtx.scope.openForMutations.enter(sym) + case _ => + } + localCtx + op(using ctx) + + def packageContext(tree: untpd.PackageDef, pkg: Symbol): Context = inContext(ctx) { + if (pkg.is(Package)) ctx.fresh.setOwner(pkg.moduleClass).setTree(tree) + else ctx + } +end ContextOps diff --git a/tests/pos-with-compiler-cc/dotc/core/Contexts.scala b/tests/pos-with-compiler-cc/dotc/core/Contexts.scala new file mode 100644 index 000000000000..2ce714937f97 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Contexts.scala @@ -0,0 +1,1041 @@ +package dotty.tools +package dotc +package core + +import interfaces.CompilerCallback +import Decorators._ +import Periods._ +import Names._ +import Phases._ +import Types._ +import Symbols._ +import Scopes._ +import Uniques._ +import ast.Trees._ +import ast.untpd +import util.{NoSource, SimpleIdentityMap, SourceFile, HashSet, ReusableInstance} +import typer.{Implicits, ImportInfo, SearchHistory, SearchRoot, TypeAssigner, Typer, Nullables} +import inlines.Inliner +import Nullables._ +import Implicits.ContextualImplicits +import config.Settings._ +import config.Config +import reporting._ +import io.{AbstractFile, NoAbstractFile, PlainFile, Path} +import scala.io.Codec +import collection.mutable +import printing._ +import config.{JavaPlatform, SJSPlatform, Platform, ScalaSettings} +import classfile.ReusableDataReader +import StdNames.nme +import compiletime.uninitialized + +import annotation.internal.sharable +import annotation.retains + +import DenotTransformers.DenotTransformer +import dotty.tools.dotc.profile.Profiler +import util.Property.Key +import util.Store +import xsbti.AnalysisCallback +import plugins._ +import java.util.concurrent.atomic.AtomicInteger +import java.nio.file.InvalidPathException +import language.experimental.pureFunctions + +object Contexts { + + //@sharable var nextId = 0 + + private val (compilerCallbackLoc, store1) = Store.empty.newLocation[CompilerCallback]() + private val (sbtCallbackLoc, store2) = store1.newLocation[AnalysisCallback]() + private val (printerFnLoc, store3) = store2.newLocation[DetachedContext -> Printer](new RefinedPrinter(_)) + private val (settingsStateLoc, store4) = store3.newLocation[SettingsState]() + private val (compilationUnitLoc, store5) = store4.newLocation[CompilationUnit]() + private val (runLoc, store6) = store5.newLocation[Run | Null]() + private val (profilerLoc, store7) = store6.newLocation[Profiler]() + private val (notNullInfosLoc, store8) = store7.newLocation[List[NotNullInfo]]() + private val (importInfoLoc, store9) = store8.newLocation[ImportInfo | Null]() + private val (typeAssignerLoc, store10) = store9.newLocation[TypeAssigner](TypeAssigner) + + private val initialStore = store10 + + /** The current context */ + inline def ctx(using ctx: Context): Context = ctx + + /** Run `op` with given context */ + inline def inContext[T](c: Context)(inline op: Context ?-> T): T = + op(using c) + + /** Execute `op` at given period */ + inline def atPeriod[T](pd: Period)(inline op: Context ?-> 
T)(using Context): T = + op(using ctx.fresh.setPeriod(pd)) + + /** Execute `op` at given phase id */ + inline def atPhase[T](pid: PhaseId)(inline op: Context ?-> T)(using Context): T = + op(using ctx.withPhase(pid)) + + /** Execute `op` at given phase */ + inline def atPhase[T](phase: Phase)(inline op: Context ?-> T)(using Context): T = + op(using ctx.withPhase(phase)) + + inline def atNextPhase[T](inline op: Context ?-> T)(using Context): T = + atPhase(ctx.phase.next)(op) + + /** Execute `op` at the current phase if it's before the first transform phase, + * otherwise at the last phase before the first transform phase. + * + * Note: this should be used instead of `atPhaseNoLater(ctx.picklerPhase)` + * because the later won't work if the `Pickler` phase is not present (for example, + * when using `QuoteCompiler`). + */ + inline def atPhaseBeforeTransforms[T](inline op: Context ?-> T)(using Context): T = + atPhaseNoLater(firstTransformPhase.prev)(op) + + inline def atPhaseNoLater[T](limit: Phase)(inline op: Context ?-> T)(using Context): T = + op(using if !limit.exists || ctx.phase <= limit then ctx else ctx.withPhase(limit)) + + inline def atPhaseNoEarlier[T](limit: Phase)(inline op: Context ?-> T)(using Context): T = + op(using if !limit.exists || limit <= ctx.phase then ctx else ctx.withPhase(limit)) + + inline def inMode[T](mode: Mode)(inline op: Context ?-> T)(using ctx: Context): T = + op(using if mode != ctx.mode then ctx.fresh.setMode(mode) else ctx) + + inline def withMode[T](mode: Mode)(inline op: Context ?-> T)(using ctx: Context): T = + inMode(ctx.mode | mode)(op) + + inline def withoutMode[T](mode: Mode)(inline op: Context ?-> T)(using ctx: Context): T = + inMode(ctx.mode &~ mode)(op) + + inline def inDetachedContext[T](inline op: DetachedContext ?-> T)(using ctx: Context): T = + op(using ctx.detach) + + type Context = ContextCls @retains(caps.*) + + /** A context is passed basically everywhere in dotc. + * This is convenient but carries the risk of captured contexts in + * objects that turn into space leaks. To combat this risk, here are some + * conventions to follow: + * + * - Never let an implicit context be an argument of a class whose instances + * live longer than the context. + * - Classes that need contexts for their initialization take an explicit parameter + * named `initctx`. They pass initctx to all positions where it is needed + * (and these positions should all be part of the intialization sequence of the class). + * - Classes that need contexts that survive initialization are instead passed + * a "condensed context", typically named `cctx` (or they create one). Condensed contexts + * just add some basic information to the context base without the + * risk of capturing complete trees. + * - To make sure these rules are kept, it would be good to do a sanity + * check using bytecode inspection with javap or scalap: Keep track + * of all class fields of type context; allow them only in whitelisted + * classes (which should be short-lived). + */ + abstract class ContextCls(val base: ContextBase) { + + //val id = nextId + //nextId += 1 + //assert(id != 35599) + + protected given Context = this + + def outer: ContextCls @retains(this) + def period: Period + def mode: Mode + def owner: Symbol + def tree: Tree[?] 
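// Illustrative sketch (not part of this patch): the mode plumbing above, with
// an Int bit set standing in for `Mode` and for the context that carries it.
// `withMode` runs an operation with extra bits switched on, `withoutMode` with
// bits switched off; a changed mode is only passed on when it actually differs
// from the current one.
def inModeSketch[T](currentMode: Int, newMode: Int)(op: Int => T): T =
  op(if newMode != currentMode then newMode else currentMode)

def withModeSketch[T](currentMode: Int, mode: Int)(op: Int => T): T =
  inModeSketch(currentMode, currentMode | mode)(op)

def withoutModeSketch[T](currentMode: Int, mode: Int)(op: Int => T): T =
  inModeSketch(currentMode, currentMode & ~mode)(op)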
+ def scope: Scope + def typerState: TyperState + def gadt: GadtConstraint + def searchHistory: SearchHistory + def source: SourceFile + + /** All outer contexts, ending in `base.initialCtx` and then `NoContext` */ + def outersIterator: Iterator[ContextCls @retains(this)] + + /** A map in which more contextual properties can be stored + * Typically used for attributes that are read and written only in special situations. + */ + def moreProperties: Map[Key[Any], Any] + + def property[T](key: Key[T]): Option[T] = + moreProperties.get(key).asInstanceOf[Option[T]] + + /** A store that can be used by sub-components. + * Typically used for attributes that are defined only once per compilation unit. + * Access to store entries is much faster than access to properties, and only + * slightly slower than a normal field access would be. + */ + def store: Store + + /** The compiler callback implementation, or null if no callback will be called. */ + def compilerCallback: CompilerCallback = store(compilerCallbackLoc) + + /** The sbt callback implementation if we are run from sbt, null otherwise */ + def sbtCallback: AnalysisCallback = store(sbtCallbackLoc) + + /** The current plain printer */ + def printerFn: DetachedContext -> Printer = store(printerFnLoc) + + /** A function creating a printer */ + def printer: Printer = + val pr = printerFn(detach) + if this.settings.YplainPrinter.value then pr.plain else pr + + /** The current settings values */ + def settingsState: SettingsState = store(settingsStateLoc) + + /** The current compilation unit */ + def compilationUnit: CompilationUnit = store(compilationUnitLoc) + + /** The current compiler-run */ + def run: Run | Null = store(runLoc) + + /** The current compiler-run profiler */ + def profiler: Profiler = store(profilerLoc) + + /** The paths currently known to be not null */ + def notNullInfos: List[NotNullInfo] = store(notNullInfosLoc) + + /** The currently active import info */ + def importInfo: ImportInfo | Null = store(importInfoLoc) + + /** The current type assigner or typer */ + def typeAssigner: TypeAssigner = store(typeAssignerLoc) + + /** The new implicit references that are introduced by this scope */ + private var implicitsCache: ContextualImplicits | Null = null + def implicits: ContextualImplicits = { + if (implicitsCache == null) + implicitsCache = { + val implicitRefs: List[ImplicitRef] = + if (isClassDefContext) + try owner.thisType.implicitMembers + catch { + case ex: CyclicReference => Nil + } + else if (isImportContext) importInfo.nn.importedImplicits + else if (isNonEmptyScopeContext) scope.implicitDecls + else Nil + val outerImplicits = + if (isImportContext && importInfo.nn.unimported.exists) + outer.implicits exclude importInfo.nn.unimported + else + outer.implicits + if (implicitRefs.isEmpty) outerImplicits + else new ContextualImplicits(implicitRefs, outerImplicits, isImportContext)(detach) + } + implicitsCache.nn + } + + /** Either the current scope, or, if the current context owner is a class, + * the declarations of the current class. 
+ */ + def effectiveScope(using Context): Scope = + val myOwner: Symbol | Null = owner + if myOwner != null && myOwner.isClass then myOwner.asClass.unforcedDecls + else scope + + def nestingLevel: Int = effectiveScope.nestingLevel + + /** Sourcefile corresponding to given abstract file, memoized */ + def getSource(file: AbstractFile, codec: -> Codec = Codec(settings.encoding.value)) = { + util.Stats.record("Context.getSource") + base.sources.getOrElseUpdate(file, SourceFile(file, codec)) + } + + /** SourceFile with given path name, memoized */ + def getSource(path: TermName): SourceFile = getFile(path) match + case NoAbstractFile => NoSource + case file => getSource(file) + + /** SourceFile with given path, memoized */ + def getSource(path: String): SourceFile = getSource(path.toTermName) + + /** AbstraFile with given path name, memoized */ + def getFile(name: TermName): AbstractFile = base.files.get(name) match + case Some(file) => + file + case None => + try + val file = new PlainFile(Path(name.toString)) + base.files(name) = file + file + catch + case ex: InvalidPathException => + report.error(em"invalid file path: ${ex.getMessage}") + NoAbstractFile + + /** AbstractFile with given path, memoized */ + def getFile(name: String): AbstractFile = getFile(name.toTermName) + + final def withPhase(phase: Phase): Context = ctx.fresh.setPhase(phase.id) + final def withPhase(pid: PhaseId): Context = ctx.fresh.setPhase(pid) + + private var related: SimpleIdentityMap[SourceFile, DetachedContext] | Null = null + + private def lookup(key: SourceFile): DetachedContext | Null = + util.Stats.record("Context.related.lookup") + if related == null then + related = SimpleIdentityMap.empty + null + else + related.nn(key) + + final def withSource(source: SourceFile): Context = + util.Stats.record("Context.withSource") + if this.source eq source then + this + else + var ctx1 = lookup(source) + if ctx1 == null then + util.Stats.record("Context.withSource.new") + val ctx2 = fresh.setSource(source) + if ctx2.compilationUnit eq NoCompilationUnit then + // `source` might correspond to a file not necessarily + // in the current project (e.g. when inlining library code), + // so set `mustExist` to false. + ctx2.setCompilationUnit(CompilationUnit(source, mustExist = false)) + val dctx = ctx2.detach + ctx1 = dctx + related = related.nn.updated(source, dctx) + ctx1 + + // `creationTrace`-related code. To enable, uncomment the code below and the + // call to `setCreationTrace()` in this file. + /* + /** If -Ydebug is on, the top of the stack trace where this context + * was created, otherwise `null`. + */ + private var creationTrace: Array[StackTraceElement] = uninitialized + + private def setCreationTrace() = + creationTrace = (new Throwable).getStackTrace().take(20) + + /** Print all enclosing context's creation stacktraces */ + def printCreationTraces() = { + println("=== context creation trace =======") + for (ctx <- outersIterator) { + println(s">>>>>>>>> $ctx") + if (ctx.creationTrace != null) println(ctx.creationTrace.mkString("\n")) + } + println("=== end context creation trace ===") + } + */ + + /** The current reporter */ + def reporter: Reporter = typerState.reporter + + final def phase: Phase = base.phases(period.firstPhaseId) + final def runId = period.runId + final def phaseId = period.phaseId + + final def lastPhaseId = base.phases.length - 1 + + /** Does current phase use an erased types interpretation? */ + final def erasedTypes = phase.erasedTypes + + /** Are we in a Java compilation unit? 
*/ + final def isJava: Boolean = compilationUnit.isJava + + /** Is current phase after TyperPhase? */ + final def isAfterTyper = base.isAfterTyper(phase) + final def isTyper = base.isTyper(phase) + + /** Is this a context for the members of a class definition? */ + def isClassDefContext: Boolean = + owner.isClass && (owner ne outer.owner) + + /** Is this a context that introduces an import clause? */ + def isImportContext: Boolean = + (this ne NoContext) + && (outer ne NoContext) + && (this.importInfo nen outer.importInfo) + + /** Is this a context that introduces a non-empty scope? */ + def isNonEmptyScopeContext: Boolean = + (this.scope ne outer.scope) && !this.scope.isEmpty + + /** Is this a context for typechecking an inlined body? */ + def isInlineContext: Boolean = + typer.isInstanceOf[Inliner#InlineTyper] + + /** The next outer context whose tree is a template or package definition + * Note: Currently unused + def enclTemplate: Context = { + var c = this + while (c != NoContext && !c.tree.isInstanceOf[Template[?]] && !c.tree.isInstanceOf[PackageDef[?]]) + c = c.outer + c + }*/ + + /** The context for a supercall. This context is used for elaborating + * the parents of a class and their arguments. + * The context is computed from the current class context. It has + * + * - as owner: The primary constructor of the class + * - as outer context: The context enclosing the class context + * - as scope: The parameter accessors in the class context + * + * The reasons for this peculiar choice of attributes are as follows: + * + * - The constructor must be the owner, because that's where any local methods or closures + * should go. + * - The context may not see any class members (inherited or defined), and should + * instead see definitions defined in the outer context which might be shadowed by + * such class members. That's why the outer context must be the outer context of the class. + * - At the same time the context should see the parameter accessors of the current class, + * that's why they get added to the local scope. An alternative would have been to have the + * context see the constructor parameters instead, but then we'd need a final substitution step + * from constructor parameters to class parameter accessors. + */ + def superCallContext: Context = { + val locals = newScopeWith(owner.typeParams ++ owner.asClass.paramAccessors: _*) + superOrThisCallContext(owner.primaryConstructor, locals) + } + + /** The context for the arguments of a this(...) constructor call. + * The context is computed from the local auxiliary constructor context. + * It has + * + * - as owner: The auxiliary constructor + * - as outer context: The context enclosing the enclosing class context + * - as scope: The parameters of the auxiliary constructor. + */ + def thisCallArgContext: Context = { + val constrCtx = detach.outersIterator.dropWhile(_.outer.owner == owner).next() + superOrThisCallContext(owner, constrCtx.scope) + .setTyperState(typerState) + .setGadt(gadt) + .fresh + .setScope(this.scope) + } + + /** The super- or this-call context with given owner and locals. 
*/ + private def superOrThisCallContext(owner: Symbol, locals: Scope): FreshContext = { + var classCtx = detach.outersIterator.dropWhile(!_.isClassDefContext).next() + classCtx.outer.fresh.setOwner(owner) + .setScope(locals) + .setMode(classCtx.mode) + } + + /** The context of expression `expr` seen as a member of a statement sequence */ + def exprContext(stat: Tree[?], exprOwner: Symbol): Context = + if (exprOwner == this.owner) this + else if (untpd.isSuperConstrCall(stat) && this.owner.isClass) superCallContext + else fresh.setOwner(exprOwner) + + /** A new context that summarizes an import statement */ + def importContext(imp: Import[?], sym: Symbol): FreshContext = + fresh.setImportInfo(ImportInfo(sym, imp.selectors, imp.expr)) + + /** Is the debug option set? */ + def debug: Boolean = base.settings.Ydebug.value + + /** Is the verbose option set? */ + def verbose: Boolean = base.settings.verbose.value + + /** Should use colors when printing? */ + def useColors: Boolean = + base.settings.color.value == "always" + + /** Is the explicit nulls option set? */ + def explicitNulls: Boolean = base.settings.YexplicitNulls.value + + /** A fresh clone of this context embedded in this context. */ + def fresh: FreshContext = freshOver(this) + + /** A fresh clone of this context embedded in the specified `outer` context. */ + def freshOver(outer: Context): FreshContext = + util.Stats.record("Context.fresh") + FreshContext(base).init(outer, this).setTyperState(this.typerState) + + final def withOwner(owner: Symbol): Context = + if (owner ne this.owner) fresh.setOwner(owner) else this + + final def withTyperState(typerState: TyperState): Context = + if typerState ne this.typerState then fresh.setTyperState(typerState) else this + + final def withUncommittedTyperState: Context = + withTyperState(typerState.uncommittedAncestor) + + final def withProperty[T](key: Key[T], value: Option[T]): Context = + if (property(key) == value) this + else value match { + case Some(v) => fresh.setProperty(key, v) + case None => fresh.dropProperty(key) + } + + def typer: Typer = this.typeAssigner match { + case typer: Typer => typer + case _ => new Typer + } + + override def toString: String = + //if true then + // outersIterator.map { ctx => + // i"${ctx.id} / ${ctx.owner} / ${ctx.moreProperties.valuesIterator.map(_.getClass).toList.mkString(", ")}" + // }.mkString("\n") + //else + def iinfo(using Context) = + val info = ctx.importInfo + if (info == null) "" else i"${info.selectors}%, %" + def cinfo(using Context) = + val core = s" owner = ${ctx.owner}, scope = ${ctx.scope}, import = $iinfo" + if (ctx ne NoContext) && (ctx.implicits ne ctx.outer.implicits) then + s"$core, implicits = ${ctx.implicits}" + else + core + s"""Context( + |${outersIterator.map(ctx => cinfo(using ctx)).mkString("\n\n")})""".stripMargin + + def settings: ScalaSettings = base.settings + def definitions: Definitions = base.definitions + def platform: Platform = base.platform + def pendingUnderlying: util.HashSet[Type] = base.pendingUnderlying + def uniqueNamedTypes: Uniques.NamedTypeUniques = base.uniqueNamedTypes + def uniques: util.WeakHashSet[Type] = base.uniques + + def initialize()(using Context): Unit = base.initialize() + + protected def resetCaches(): Unit = + implicitsCache = null + related = null + + /** Reuse this context as a fresh context nested inside `outer` */ + def reuseIn(outer: Context): this.type + + def detach: DetachedContext + } + + object detached: + opaque type DetachedContext <: ContextCls = ContextCls + inline def 
apply(c: ContextCls): DetachedContext = c + + type DetachedContext = detached.DetachedContext + + /** A condensed context provides only a small memory footprint over + * a Context base, and therefore can be stored without problems in + * long-lived objects. + abstract class CondensedContext extends Context { + override def condensed = this + } + */ + + /** A fresh context allows selective modification + * of its attributes using the with... methods. + */ + class FreshContext(base: ContextBase) extends ContextCls(base) { thiscontext => + + private var _outer: DetachedContext = uninitialized + def outer: DetachedContext = _outer + + def outersIterator: Iterator[ContextCls] = new Iterator[ContextCls] { + var current: ContextCls = thiscontext + def hasNext = current != NoContext + def next = { val c = current; current = current.outer; c } + } + + private var _period: Period = uninitialized + final def period: Period = _period + + private var _mode: Mode = uninitialized + final def mode: Mode = _mode + + private var _owner: Symbol = uninitialized + final def owner: Symbol = _owner + + private var _tree: Tree[?]= _ + final def tree: Tree[?] = _tree + + private var _scope: Scope = uninitialized + final def scope: Scope = _scope + + private var _typerState: TyperState = uninitialized + final def typerState: TyperState = _typerState + + private var _gadt: GadtConstraint = uninitialized + final def gadt: GadtConstraint = _gadt + + private var _searchHistory: SearchHistory = uninitialized + final def searchHistory: SearchHistory = _searchHistory + + private var _source: SourceFile = uninitialized + final def source: SourceFile = _source + + private var _moreProperties: Map[Key[Any], Any] = uninitialized + final def moreProperties: Map[Key[Any], Any] = _moreProperties + + private var _store: Store = uninitialized + final def store: Store = _store + + /** Initialize all context fields, except typerState, which has to be set separately + * @param outer The outer context + * @param origin The context from which fields are copied + */ + private[Contexts] def init(outer: Context, origin: Context): this.type = { + _outer = outer.asInstanceOf[DetachedContext] + _period = origin.period + _mode = origin.mode + _owner = origin.owner + _tree = origin.tree + _scope = origin.scope + _gadt = origin.gadt + _searchHistory = origin.searchHistory + _source = origin.source + _moreProperties = origin.moreProperties + _store = origin.store + this + } + + def reuseIn(outer: Context): this.type = + resetCaches() + init(outer, outer) + + def detach: DetachedContext = detached(this) + + def setPeriod(period: Period): this.type = + util.Stats.record("Context.setPeriod") + assert(period.firstPhaseId == period.lastPhaseId, period) + this._period = period + this + + def setMode(mode: Mode): this.type = + util.Stats.record("Context.setMode") + this._mode = mode + this + + def setOwner(owner: Symbol): this.type = + util.Stats.record("Context.setOwner") + assert(owner != NoSymbol) + this._owner = owner + this + + def setTree(tree: Tree[?]): this.type = + util.Stats.record("Context.setTree") + this._tree = tree + this + + def setScope(scope: Scope): this.type = + this._scope = scope + this + + def setNewScope: this.type = + util.Stats.record("Context.setScope") + this._scope = newScope + this + + def setTyperState(typerState: TyperState): this.type = + this._typerState = typerState + this + def setNewTyperState(): this.type = + setTyperState(typerState.fresh(committable = true)) + def setExploreTyperState(): this.type = + 
setTyperState(typerState.fresh(committable = false)) + def setReporter(reporter: Reporter): this.type = + setTyperState(typerState.fresh().setReporter(reporter)) + + def setTyper(typer: Typer): this.type = + this._scope = typer.scope + setTypeAssigner(typer) + + def setGadt(gadt: GadtConstraint): this.type = + util.Stats.record("Context.setGadt") + this._gadt = gadt + this + def setFreshGADTBounds: this.type = + setGadt(gadt.fresh) + + def setSearchHistory(searchHistory: SearchHistory): this.type = + util.Stats.record("Context.setSearchHistory") + this._searchHistory = searchHistory + this + + def setSource(source: SourceFile): this.type = + util.Stats.record("Context.setSource") + this._source = source + this + + private def setMoreProperties(moreProperties: Map[Key[Any], Any]): this.type = + util.Stats.record("Context.setMoreProperties") + this._moreProperties = moreProperties + this + + private def setStore(store: Store): this.type = + util.Stats.record("Context.setStore") + this._store = store + this + + def setCompilationUnit(compilationUnit: CompilationUnit): this.type = { + setSource(compilationUnit.source) + updateStore(compilationUnitLoc, compilationUnit) + } + + def setCompilerCallback(callback: CompilerCallback): this.type = updateStore(compilerCallbackLoc, callback) + def setSbtCallback(callback: AnalysisCallback): this.type = updateStore(sbtCallbackLoc, callback) + def setPrinterFn(printer: DetachedContext -> Printer): this.type = updateStore(printerFnLoc, printer) + def setSettings(settingsState: SettingsState): this.type = updateStore(settingsStateLoc, settingsState) + def setRun(run: Run | Null): this.type = updateStore(runLoc, run) + def setProfiler(profiler: Profiler): this.type = updateStore(profilerLoc, profiler) + def setNotNullInfos(notNullInfos: List[NotNullInfo]): this.type = updateStore(notNullInfosLoc, notNullInfos) + def setImportInfo(importInfo: ImportInfo): this.type = + importInfo.mentionsFeature(nme.unsafeNulls) match + case Some(true) => + setMode(this.mode &~ Mode.SafeNulls) + case Some(false) if ctx.settings.YexplicitNulls.value => + setMode(this.mode | Mode.SafeNulls) + case _ => + updateStore(importInfoLoc, importInfo) + def setTypeAssigner(typeAssigner: TypeAssigner): this.type = updateStore(typeAssignerLoc, typeAssigner) + + def setProperty[T](key: Key[T], value: T): this.type = + setMoreProperties(moreProperties.updated(key, value)) + + def dropProperty(key: Key[?]): this.type = + setMoreProperties(moreProperties - key) + + def addLocation[T](initial: T): Store.Location[T] = { + val (loc, store1) = store.newLocation(initial) + setStore(store1) + loc + } + + def addLocation[T](): Store.Location[T] = { + val (loc, store1) = store.newLocation[T]() + setStore(store1) + loc + } + + def updateStore[T](loc: Store.Location[T], value: T): this.type = + setStore(store.updated(loc, value)) + + def setPhase(pid: PhaseId): this.type = setPeriod(Period(runId, pid)) + def setPhase(phase: Phase): this.type = setPeriod(Period(runId, phase.start, phase.end)) + + def setSetting[T](setting: Setting[T], value: T): this.type = + setSettings(setting.updateIn(settingsState, value)) + + def setDebug: this.type = setSetting(base.settings.Ydebug, true) + } + + object FreshContext: + /** Defines an initial context with given context base and possible settings. 
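+   *
+   *  For illustration (hypothetical usage; relies only on `ContextBase` defined below):
+   *  {{{
+   *    val base = new ContextBase
+   *    val rootCtx: Context = base.initialCtx   // built through FreshContext.initial
+   *  }}}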
*/ + def initial(base: ContextBase, settingsGroup: SettingGroup): Context = + val c = new FreshContext(base) + c._outer = NoContext + c._period = InitialPeriod + c._mode = Mode.None + c._typerState = TyperState.initialState() + c._owner = NoSymbol + c._tree = untpd.EmptyTree + c._moreProperties = Map(MessageLimiter -> DefaultMessageLimiter()) + c._scope = EmptyScope + c._source = NoSource + c._store = initialStore + .updated(settingsStateLoc, settingsGroup.defaultState) + .updated(notNullInfosLoc, Nil) + .updated(compilationUnitLoc, NoCompilationUnit) + c._searchHistory = new SearchRoot + c._gadt = GadtConstraint.empty + c + end FreshContext + + given detachedCtx(using c: Context): DetachedContext = c.detach + + given ops: AnyRef with + extension (c: Context) + def addNotNullInfo(info: NotNullInfo): Context = + c.withNotNullInfos(c.notNullInfos.extendWith(info)) + + def addNotNullRefs(refs: Set[TermRef]): Context = + c.addNotNullInfo(NotNullInfo(refs, Set())) + + def withNotNullInfos(infos: List[NotNullInfo]): Context = + if c.notNullInfos eq infos then c else c.fresh.setNotNullInfos(infos) + + def relaxedOverrideContext: Context = + c.withModeBits(c.mode &~ Mode.SafeNulls | Mode.RelaxedOverriding) + end ops + + // TODO: Fix issue when converting ModeChanges and FreshModeChanges to extension givens + extension (c: Context) { + final def withModeBits(mode: Mode): Context = + if (mode != c.mode) c.fresh.setMode(mode) else c + + final def addMode(mode: Mode): Context = withModeBits(c.mode | mode) + final def retractMode(mode: Mode): Context = withModeBits(c.mode &~ mode) + } + + extension (c: FreshContext) { + final def addMode(mode: Mode): c.type = c.setMode(c.mode | mode) + final def retractMode(mode: Mode): c.type = c.setMode(c.mode &~ mode) + } + + private def exploreCtx(using Context): FreshContext = + util.Stats.record("explore") + val base = ctx.base + import base._ + val nestedCtx = + if exploresInUse < exploreContexts.size then + exploreContexts(exploresInUse).reuseIn(ctx) + else + val ts = TyperState() + .setReporter(ExploringReporter()) + .setCommittable(false) + val c = FreshContext(ctx.base).init(ctx, ctx).setTyperState(ts) + exploreContexts += c + c + exploresInUse += 1 + val nestedTS = nestedCtx.typerState + nestedTS.init(ctx.typerState, ctx.typerState.constraint) + nestedCtx + + private def wrapUpExplore(ectx: Context) = + ectx.reporter.asInstanceOf[ExploringReporter].reset() + ectx.base.exploresInUse -= 1 + + inline def explore[T](inline op: Context ?=> T)(using Context): T = + val ectx = exploreCtx + try op(using ectx) finally wrapUpExplore(ectx) + + inline def exploreInFreshCtx[T](inline op: FreshContext ?=> T)(using Context): T = + val ectx = exploreCtx + try op(using ectx) finally wrapUpExplore(ectx) + + private def changeOwnerCtx(owner: Symbol)(using Context): Context = + val base = ctx.base + import base._ + val nestedCtx = + if changeOwnersInUse < changeOwnerContexts.size then + changeOwnerContexts(changeOwnersInUse).reuseIn(ctx) + else + val c = FreshContext(ctx.base).init(ctx, ctx) + changeOwnerContexts += c + c + changeOwnersInUse += 1 + nestedCtx.setOwner(owner).setTyperState(ctx.typerState) + + /** Run `op` in current context, with a mode is temporarily set as specified. 
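+   *  (More precisely, it is the owner that is temporarily set to `owner`; when
+   *  `Config.reuseOwnerContexts` is enabled a pooled context is reused for this.)
+   *
+   *  For illustration (hypothetical sketch; `newOwner` is a placeholder symbol):
+   *  {{{
+   *    runWithOwner(newOwner) { ctx.owner }   // evaluates to newOwner
+   *  }}}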
+ */ + inline def runWithOwner[T](owner: Symbol)(inline op: Context ?=> T)(using Context): T = + if Config.reuseOwnerContexts then + try op(using changeOwnerCtx(owner)) + finally ctx.base.changeOwnersInUse -= 1 + else + op(using ctx.fresh.setOwner(owner)) + + /** The type comparer of the kind created by `maker` to be used. + * This is the currently active type comparer CMP if + * - CMP is associated with the current context, and + * - CMP is of the kind created by maker or maker creates a plain type comparer. + * Note: plain TypeComparers always take on the kind of the outer comparer if they are in the same context. + * In other words: tracking or explaining is a sticky property in the same context. + */ + private def comparer(using Context): TypeComparer = + util.Stats.record("comparing") + val base = ctx.base + if base.comparersInUse > 0 + && (base.comparers(base.comparersInUse - 1).comparerContext eq ctx) + then + base.comparers(base.comparersInUse - 1).currentInstance + else + val result = + if base.comparersInUse < base.comparers.size then + base.comparers(base.comparersInUse) + else + val result = TypeComparer(ctx) + base.comparers += result + result + base.comparersInUse += 1 + result.init(ctx) + result + + inline def comparing[T](inline op: TypeComparer => T)(using Context): T = + util.Stats.record("comparing") + val saved = ctx.base.comparersInUse + try op(comparer) + finally ctx.base.comparersInUse = saved + end comparing + + @sharable val NoContext: DetachedContext = detached( + new FreshContext((null: ContextBase | Null).uncheckedNN) { + override val implicits: ContextualImplicits = new ContextualImplicits(Nil, null, false)(detached(this: @unchecked)) + setSource(NoSource) + } + ) + + /** A context base defines state and associated methods that exist once per + * compiler run. + */ + class ContextBase extends ContextState + with Phases.PhasesBase + with Plugins { + + /** The applicable settings */ + val settings: ScalaSettings = new ScalaSettings + + /** The initial context */ + val initialCtx: Context = FreshContext.initial(this: @unchecked, settings) + + /** The platform, initialized by `initPlatform()`. */ + private var _platform: Platform | Null = uninitialized + + /** The platform */ + def platform: Platform = { + val p = _platform + if p == null then + throw new IllegalStateException( + "initialize() must be called before accessing platform") + p + } + + protected def newPlatform(using Context): Platform = + if (settings.scalajs.value) new SJSPlatform + else new JavaPlatform + + /** The loader that loads the members of _root_ */ + def rootLoader(root: TermSymbol)(using Context): SymbolLoader = platform.rootLoader(root) + + /** The standard definitions */ + val definitions: Definitions = new Definitions + + // Set up some phases to get started */ + usePhases(List(SomePhase)) + + /** Initializes the `ContextBase` with a starting context. + * This initializes the `platform` and the `definitions`. 
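+   *  It must be called before `platform` is accessed, for example (hypothetical sketch):
+   *  {{{
+   *    val base = new ContextBase
+   *    given Context = base.initialCtx
+   *    base.initialize()   // sets up platform and definitions
+   *  }}}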
+ */ + def initialize()(using Context): Unit = { + _platform = newPlatform + definitions.init() + } + + def fusedContaining(p: Phase): Phase = + allPhases.find(_.period.containsPhaseId(p.id)).getOrElse(NoPhase) + } + + /** The essential mutable state of a context base, collected into a common class */ + class ContextState { + // Symbols state + + /** Counter for unique symbol ids */ + private var _nextSymId: Int = 0 + def nextSymId: Int = { _nextSymId += 1; _nextSymId } + + /** Sources and Files that were loaded */ + val sources: util.HashMap[AbstractFile, SourceFile] = util.HashMap[AbstractFile, SourceFile]() + val files: util.HashMap[TermName, AbstractFile] = util.HashMap() + + // Types state + /** A table for hash consing unique types */ + private[core] val uniques: Uniques = Uniques() + + /** A table for hash consing unique applied types */ + private[dotc] val uniqueAppliedTypes: AppliedUniques = AppliedUniques() + + /** A table for hash consing unique named types */ + private[core] val uniqueNamedTypes: NamedTypeUniques = NamedTypeUniques() + + var emptyTypeBounds: TypeBounds | Null = null + var emptyWildcardBounds: WildcardType | Null = null + + /** Number of findMember calls on stack */ + private[core] var findMemberCount: Int = 0 + + /** List of names which have a findMemberCall on stack, + * after Config.LogPendingFindMemberThreshold is reached. + */ + private[core] var pendingMemberSearches: List[Name] = Nil + + /** The number of recursive invocation of underlying on a NamedType + * during a controlled operation. + */ + private[core] var underlyingRecursions: Int = 0 + + /** The set of named types on which a currently active invocation + * of underlying during a controlled operation exists. */ + private[core] val pendingUnderlying: util.HashSet[Type] = util.HashSet[Type]() + + /** A map from ErrorType to associated message. We use this map + * instead of storing messages directly in ErrorTypes in order + * to avoid space leaks - the message usually captures a context. + */ + private[core] val errorTypeMsg: mutable.Map[Types.ErrorType, Message] = mutable.Map() + + // Phases state + + private[core] var phasesPlan: List[List[Phase]] = uninitialized + + /** Phases by id */ + private[dotc] var phases: Array[Phase] = uninitialized + + /** Phases with consecutive Transforms grouped into a single phase, Empty array if fusion is disabled */ + private[core] var fusedPhases: Array[Phase] = Array.empty[Phase] + + /** Next denotation transformer id */ + private[core] var nextDenotTransformerId: Array[Int] = uninitialized + + private[core] var denotTransformers: Array[DenotTransformer] = uninitialized + + /** Flag to suppress inlining, set after overflow */ + private[dotc] var stopInlining: Boolean = false + + /** A variable that records that some error was reported in a globally committable context. + * The error will not necessarlily be emitted, since it could still be that + * the enclosing context will be aborted. The variable is used as a smoke test + * to turn off assertions that might be wrong if the program is erroneous. To + * just test for `ctx.reporter.errorsReported` is not always enough, since it + * could be that the context in which the assertion is tested is a completer context + * that's different from the context where the error was reported. See i13218.scala + * for a test. 
+ */ + private[dotc] var errorsToBeReported = false + + // Reporters state + private[dotc] var indent: Int = 0 + + protected[dotc] val indentTab: String = " " + + private[Contexts] val exploreContexts = new mutable.ArrayBuffer[FreshContext] + private[Contexts] var exploresInUse: Int = 0 + + private[Contexts] val changeOwnerContexts = new mutable.ArrayBuffer[FreshContext] + private[Contexts] var changeOwnersInUse: Int = 0 + + private[Contexts] val comparers = new mutable.ArrayBuffer[TypeComparer] + private[Contexts] var comparersInUse: Int = 0 + + private var charArray = new Array[Char](256) + + private[core] val reusableDataReader = ReusableInstance(new ReusableDataReader()) + + private[dotc] var wConfCache: (List[String], WConf) = uninitialized + + def sharedCharArray(len: Int): Array[Char] = + while len > charArray.length do + charArray = new Array[Char](charArray.length * 2) + charArray + + def reset(): Unit = + uniques.clear() + uniqueAppliedTypes.clear() + uniqueNamedTypes.clear() + emptyTypeBounds = null + emptyWildcardBounds = null + errorsToBeReported = false + errorTypeMsg.clear() + sources.clear() + files.clear() + comparers.clear() // forces re-evaluation of top and bottom classes in TypeComparer + + // Test that access is single threaded + + /** The thread on which `checkSingleThreaded was invoked last */ + @sharable private var thread: Thread | Null = null + + /** Check that we are on the same thread as before */ + def checkSingleThreaded(): Unit = + if (thread == null) thread = Thread.currentThread() + else assert(thread == Thread.currentThread(), "illegal multithreaded access to ContextBase") + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Decorators.scala b/tests/pos-with-compiler-cc/dotc/core/Decorators.scala new file mode 100644 index 000000000000..a4c3938a0909 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Decorators.scala @@ -0,0 +1,322 @@ +package dotty.tools +package dotc +package core + +import scala.annotation.tailrec +import scala.collection.mutable.ListBuffer +import scala.util.control.NonFatal + +import Contexts._, Names._, Phases._, Symbols._ +import printing.{ Printer, Showable }, printing.Formatting._, printing.Texts._ +import transform.MegaPhase +import reporting.{Message, NoExplanation} +import language.experimental.pureFunctions +import annotation.retains + +/** This object provides useful extension methods for types defined elsewhere */ +object Decorators { + + /** Extension methods for toType/TermName methods on PreNames. + */ + extension (pn: PreName) + def toTermName: TermName = pn match + case s: String => termName(s) + case n: Name => n.toTermName + def toTypeName: TypeName = pn match + case s: String => typeName(s) + case n: Name => n.toTypeName + + extension (s: String) + def splitWhere(f: Char => Boolean, doDropIndex: Boolean): Option[(String, String)] = + def splitAt(idx: Int, doDropIndex: Boolean): Option[(String, String)] = + if (idx == -1) None + else Some((s.take(idx), s.drop(if (doDropIndex) idx + 1 else idx))) + splitAt(s.indexWhere(f), doDropIndex) + + /** Create a term name from a string slice, using a common buffer. 
+ * This avoids some allocation relative to `termName(s)` + */ + def sliceToTermName(start: Int, end: Int)(using Context): SimpleName = + val len = end - start + val chars = ctx.base.sharedCharArray(len) + s.getChars(start, end, chars, 0) + termName(chars, 0, len) + + def sliceToTypeName(start: Int, end: Int)(using Context): TypeName = + sliceToTermName(start, end).toTypeName + + def concat(name: Name)(using Context): SimpleName = name match + case name: SimpleName => + val len = s.length + name.length + var chars = ctx.base.sharedCharArray(len) + s.getChars(0, s.length, chars, 0) + if name.length != 0 then name.getChars(0, name.length, chars, s.length) + termName(chars, 0, len) + case name: TypeName => s.concat(name.toTermName) + case _ => termName(s.concat(name.toString).nn) + + def indented(width: Int): String = + val padding = " " * width + padding + s.replace("\n", "\n" + padding) + end extension + + /** Convert lazy string to message. To be with caution, since no message-defined + * formatting will be done on the string. + */ + extension (str: -> String) + def toMessage: Message = NoExplanation(str)(using NoContext) + + /** Implements a findSymbol method on iterators of Symbols that + * works like find but avoids Option, replacing None with NoSymbol. + */ + extension (it: Iterator[Symbol]) + final def findSymbol(p: Symbol => Boolean): Symbol = { + while (it.hasNext) { + val sym = it.next() + if (p(sym)) return sym + } + NoSymbol + } + + inline val MaxFilterRecursions = 10 + + /** Implements filterConserve, zipWithConserve methods + * on lists that avoid duplication of list nodes where feasible. + */ + extension [T](xs: List[T]) + final def collectCC[U](pf: PartialFunction[T, U] @retains(caps.*)): List[U] = + xs.collect(pf.asInstanceOf) + + final def mapconserve[U](f: T => U): List[U] = { + @tailrec + def loop(mapped: ListBuffer[U] | Null, unchanged: List[U], pending: List[T]): List[U] = + if (pending.isEmpty) + if (mapped == null) unchanged + else mapped.prependToList(unchanged) + else { + val head0 = pending.head + val head1 = f(head0) + + if (head1.asInstanceOf[AnyRef] eq head0.asInstanceOf[AnyRef]) + loop(mapped, unchanged, pending.tail) + else { + val b = if (mapped == null) new ListBuffer[U] else mapped + var xc = unchanged + while (xc ne pending) { + b += xc.head + xc = xc.tail + } + b += head1 + val tail0 = pending.tail + loop(b, tail0.asInstanceOf[List[U]], tail0) + } + } + loop(null, xs.asInstanceOf[List[U]], xs) + } + + /** Like `xs filter p` but returns list `xs` itself - instead of a copy - + * if `p` is true for all elements. 
+ */ + def filterConserve(p: T => Boolean): List[T] = + + def addAll(buf: ListBuffer[T], from: List[T], until: List[T]): ListBuffer[T] = + if from eq until then buf else addAll(buf += from.head, from.tail, until) + + def loopWithBuffer(buf: ListBuffer[T], xs: List[T]): List[T] = xs match + case x :: xs1 => + if p(x) then buf += x + loopWithBuffer(buf, xs1) + case nil => buf.toList + + def loop(keep: List[T], explore: List[T], keepCount: Int, recCount: Int): List[T] = + explore match + case x :: rest => + if p(x) then + loop(keep, rest, keepCount + 1, recCount) + else if keepCount <= 3 && recCount <= MaxFilterRecursions then + val rest1 = loop(rest, rest, 0, recCount + 1) + keepCount match + case 0 => rest1 + case 1 => keep.head :: rest1 + case 2 => keep.head :: keep.tail.head :: rest1 + case 3 => val tl = keep.tail; keep.head :: tl.head :: tl.tail.head :: rest1 + else + loopWithBuffer(addAll(new ListBuffer[T], keep, explore), rest) + case nil => + keep + + loop(xs, xs, 0, 0) + end filterConserve + + /** Like `xs.lazyZip(ys).map(f)`, but returns list `xs` itself + * - instead of a copy - if function `f` maps all elements of + * `xs` to themselves. Also, it is required that `ys` is at least + * as long as `xs`. + */ + def zipWithConserve[U, V <: T](ys: List[U])(f: (T, U) => V): List[V] = + if (xs.isEmpty || ys.isEmpty) Nil + else { + val x1 = f(xs.head, ys.head) + val xs1 = xs.tail.zipWithConserve(ys.tail)(f) + if (x1.asInstanceOf[AnyRef] eq xs.head.asInstanceOf[AnyRef]) && (xs1 eq xs.tail) + then xs.asInstanceOf[List[V]] + else x1 :: xs1 + } + + /** Like `xs.lazyZip(xs.indices).map(f)`, but returns list `xs` itself + * - instead of a copy - if function `f` maps all elements of + * `xs` to themselves. + */ + def mapWithIndexConserve[U <: T](f: (T, Int) => U): List[U] = + + @tailrec + def addAll(buf: ListBuffer[T], from: List[T], until: List[T]): ListBuffer[T] = + if from eq until then buf else addAll(buf += from.head, from.tail, until) + + @tailrec + def loopWithBuffer(buf: ListBuffer[U], explore: List[T], idx: Int): List[U] = explore match + case Nil => buf.toList + case t :: rest => loopWithBuffer(buf += f(t, idx), rest, idx + 1) + + @tailrec + def loop(keep: List[T], explore: List[T], idx: Int): List[U] = explore match + case Nil => keep.asInstanceOf[List[U]] + case t :: rest => + val u = f(t, idx) + if u.asInstanceOf[AnyRef] eq t.asInstanceOf[AnyRef] then + loop(keep, rest, idx + 1) + else + val buf = addAll(new ListBuffer[T], keep, explore).asInstanceOf[ListBuffer[U]] + loopWithBuffer(buf += u, rest, idx + 1) + + loop(xs, xs, 0) + end mapWithIndexConserve + + /** True if two lists have the same length. Since calling length on linear sequences + * is Θ(n), it is an inadvisable way to test length equality. This method is Θ(n min m). 
+ */ + final def hasSameLengthAs[U](ys: List[U]): Boolean = { + @tailrec def loop(xs: List[T], ys: List[U]): Boolean = + if (xs.isEmpty) ys.isEmpty + else ys.nonEmpty && loop(xs.tail, ys.tail) + loop(xs, ys) + } + + @tailrec final def eqElements(ys: List[AnyRef]): Boolean = xs match { + case x :: _ => + ys match { + case y :: _ => + x.asInstanceOf[AnyRef].eq(y) && + xs.tail.eqElements(ys.tail) + case _ => false + } + case nil => ys.isEmpty + } + + /** Union on lists seen as sets */ + def setUnion (ys: List[T]): List[T] = xs ::: ys.filterNot(xs contains _) + + extension [T, U](xss: List[List[T]]) + def nestedMap(f: T => U): List[List[U]] = xss match + case xs :: xss1 => xs.map(f) :: xss1.nestedMap(f) + case nil => Nil + def nestedMapConserve(f: T => U): List[List[U]] = + xss.mapconserve(_.mapconserve(f)) + def nestedZipWithConserve(yss: List[List[U]])(f: (T, U) => T): List[List[T]] = + xss.zipWithConserve(yss)((xs, ys) => xs.zipWithConserve(ys)(f)) + def nestedExists(p: T => Boolean): Boolean = xss match + case xs :: xss1 => xs.exists(p) || xss1.nestedExists(p) + case nil => false + end extension + + extension [T](xs: Seq[T]) + final def collectCC[U](pf: PartialFunction[T, U] @retains(caps.*)): Seq[U] = + xs.collect(pf.asInstanceOf) + + extension [A, B](f: PartialFunction[A, B] @retains(caps.*)) + def orElseCC(g: PartialFunction[A, B] @retains(caps.*)): PartialFunction[A, B] @retains(f, g) = + f.orElse(g.asInstanceOf).asInstanceOf + + extension (text: Text) + def show(using Context): String = text.mkString(ctx.settings.pageWidth.value, ctx.settings.printLines.value) + + /** Test whether a list of strings representing phases contains + * a given phase. See [[config.CompilerCommand#explainAdvanced]] for the + * exact meaning of "contains" here. + */ + extension (names: List[String]) + def containsPhase(phase: Phase): Boolean = + names.nonEmpty && { + phase match { + case phase: MegaPhase => phase.miniPhases.exists(x => names.containsPhase(x)) + case _ => + names exists { name => + name == "all" || { + val strippedName = name.stripSuffix("+") + val logNextPhase = name != strippedName + phase.phaseName.startsWith(strippedName) || + (logNextPhase && phase.prev.phaseName.startsWith(strippedName)) + } + } + } + } + + extension [T](x: T) + def showing[U]( + op: WrappedResult[U] ?=> String, + printer: config.Printers.Printer = config.Printers.default)(using c: Conversion[T, U] | Null = null): T = { + // either the use of `$result` was driven by the expected type of `Shown` + // which led to the summoning of `Conversion[T, Shown]` (which we'll invoke) + // or no such conversion was found so we'll consume the result as it is instead + val obj = if c == null then x.asInstanceOf[U] else c(x) + printer.println(op(using WrappedResult(obj))) + x + } + + /** Instead of `toString` call `show` on `Showable` values, falling back to `toString` if an exception is raised. */ + def tryToShow(using Context): String = x match + case x: Showable => + try x.show + catch + case ex: CyclicReference => "... (caught cyclic reference) ..." + case NonFatal(ex) + if !ctx.mode.is(Mode.PrintShowExceptions) && !ctx.settings.YshowPrintErrors.value => + val msg = ex match + case te: TypeError => te.toMessage.message + case _ => ex.getMessage + s"[cannot display due to $msg, raw string = $x]" + case _ => String.valueOf(x).nn + + /** Returns the simple class name of `x`. 
*/ + def className: String = getClass.getSimpleName.nn + + extension [T](x: T) + def assertingErrorsReported(using Context): T = { + assert(ctx.reporter.errorsReported) + x + } + def assertingErrorsReported(msg: Message)(using Context): T = { + assert(ctx.reporter.errorsReported, msg) + x + } + + extension [T <: AnyRef](xs: ::[T]) + def derivedCons(x1: T, xs1: List[T]) = + if (xs.head eq x1) && (xs.tail eq xs1) then xs else x1 :: xs1 + + extension (sc: StringContext) + + /** General purpose string formatting */ + def i(args: Shown*)(using Context): String = + new StringFormatter(sc).assemble(args) + + /** Interpolator yielding an error message, which undergoes + * the formatting defined in Message. + */ + def em(args: Shown*)(using Context): NoExplanation = + NoExplanation(i(args*)) + + extension [T <: AnyRef](arr: Array[T]) + def binarySearch(x: T | Null): Int = java.util.Arrays.binarySearch(arr.asInstanceOf[Array[Object | Null]], x) + +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Definitions.scala b/tests/pos-with-compiler-cc/dotc/core/Definitions.scala new file mode 100644 index 000000000000..603088dd8f26 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Definitions.scala @@ -0,0 +1,2434 @@ +package dotty.tools +package dotc +package core + +import scala.annotation.{threadUnsafe => tu} +import Types._, Contexts._, Symbols._, SymDenotations._, StdNames._, Names._, Phases._ +import Flags._, Scopes._, Decorators._, NameOps._, Periods._, NullOpsDecorator._ +import unpickleScala2.Scala2Unpickler.ensureConstructor +import scala.collection.mutable +import collection.mutable +import Denotations.{SingleDenotation, staticRef} +import util.{SimpleIdentityMap, SourceFile, NoSource} +import typer.ImportInfo.RootRef +import Comments.CommentsContext +import Comments.Comment +import util.Spans.NoSpan +import config.Feature +import Symbols.requiredModuleRef +import cc.{CapturingType, CaptureSet, EventuallyCapturingType} + +import scala.annotation.tailrec +import language.experimental.pureFunctions + +object Definitions { + + /** The maximum number of elements in a tuple or product. + * This should be removed once we go to hlists. + */ + val MaxTupleArity: Int = 22 + + /** The maximum arity N of a function type that's implemented + * as a trait `scala.FunctionN`. Functions of higher arity are possible, + * but are mapped in erasure to functions taking a single parameter of type + * Object[]. + * The limit 22 is chosen for Scala2x interop. It could be something + * else without affecting the set of programs that can be compiled. 
+ */ + val MaxImplementedFunctionArity: Int = MaxTupleArity +} + +/** A class defining symbols and types of standard definitions + * + */ +class Definitions { + import Definitions._ + + private var initCtx: DetachedContext = _ + private given currentContext[Dummy_so_its_a_def]: DetachedContext = initCtx + + private def newPermanentSymbol[N <: Name](owner: Symbol, name: N, flags: FlagSet, info: Type) = + newSymbol(owner, name, flags | Permanent, info) + + private def newPermanentClassSymbol(owner: Symbol, name: TypeName, flags: FlagSet, infoFn: ClassSymbol => Type) = + newClassSymbol(owner, name, flags | Permanent | NoInits | Open, infoFn) + + private def enterCompleteClassSymbol(owner: Symbol, name: TypeName, flags: FlagSet, parents: List[TypeRef]): ClassSymbol = + enterCompleteClassSymbol(owner, name, flags, parents, newScope(owner.nestingLevel + 1)) + + private def enterCompleteClassSymbol(owner: Symbol, name: TypeName, flags: FlagSet, parents: List[TypeRef], decls: Scope) = + newCompleteClassSymbol(owner, name, flags | Permanent | NoInits | Open, parents, decls).entered + + private def enterTypeField(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope) = + scope.enter(newPermanentSymbol(cls, name, flags, TypeBounds.empty)) + + private def enterTypeParam(cls: ClassSymbol, name: TypeName, flags: FlagSet, scope: MutableScope) = + enterTypeField(cls, name, flags | ClassTypeParamCreationFlags, scope) + + private def enterSyntheticTypeParam(cls: ClassSymbol, paramFlags: FlagSet, scope: MutableScope, suffix: String = "T0") = + enterTypeParam(cls, suffix.toTypeName, paramFlags, scope) + + // NOTE: Ideally we would write `parentConstrs: => Type*` but SIP-24 is only + // implemented in Dotty and not in Scala 2. + // See . + private def enterSpecialPolyClass(name: TypeName, paramFlags: FlagSet, parentConstrs: -> Seq[Type]): ClassSymbol = { + val completer = new LazyType { + def complete(denot: SymDenotation)(using Context): Unit = { + val cls = denot.asClass.classSymbol + val paramDecls = newScope + val typeParam = enterSyntheticTypeParam(cls, paramFlags, paramDecls) + def instantiate(tpe: Type) = + if (tpe.typeParams.nonEmpty) tpe.appliedTo(typeParam.typeRef) + else tpe + val parents = parentConstrs.toList map instantiate + denot.info = ClassInfo(ScalaPackageClass.thisType, cls, parents, paramDecls) + } + } + newPermanentClassSymbol(ScalaPackageClass, name, Artifact, completer).entered + } + + /** The trait FunctionN, ContextFunctionN, ErasedFunctionN or ErasedContextFunction, for some N + * @param name The name of the trait to be created + * + * FunctionN traits follow this template: + * + * trait FunctionN[-T0,...-T{N-1}, +R] extends Object { + * def apply($x0: T0, ..., $x{N_1}: T{N-1}): R + * } + * + * That is, they follow the template given for Function2..Function22 in the + * standard library, but without `tupled` and `curried` methods and without + * a `toString`. 
+ * + * ContextFunctionN traits follow this template: + * + * trait ContextFunctionN[-T0,...,-T{N-1}, +R] extends Object { + * def apply(using $x0: T0, ..., $x{N_1}: T{N-1}): R + * } + * + * ErasedFunctionN traits follow this template: + * + * trait ErasedFunctionN[-T0,...,-T{N-1}, +R] extends Object { + * def apply(erased $x0: T0, ..., $x{N_1}: T{N-1}): R + * } + * + * ErasedContextFunctionN traits follow this template: + * + * trait ErasedContextFunctionN[-T0,...,-T{N-1}, +R] extends Object { + * def apply(using erased $x0: T0, ..., $x{N_1}: T{N-1}): R + * } + * + * ErasedFunctionN and ErasedContextFunctionN erase to Function0. + * + * ImpureXYZFunctionN follow this template: + * + * type ImpureXYZFunctionN[-T0,...,-T{N-1}, +R] = {*} XYZFunctionN[T0,...,T{N-1}, R] + */ + private def newFunctionNType(name: TypeName): Symbol = { + val impure = name.startsWith("Impure") + val completer = new LazyType { + def complete(denot: SymDenotation)(using Context): Unit = { + val arity = name.functionArity + if impure then + val argParamNames = List.tabulate(arity)(tpnme.syntheticTypeParamName) + val argVariances = List.fill(arity)(Contravariant) + val underlyingName = name.asSimpleName.drop(6) + val underlyingClass = ScalaPackageVal.requiredClass(underlyingName) + denot.info = TypeAlias( + HKTypeLambda(argParamNames :+ "R".toTypeName, argVariances :+ Covariant)( + tl => List.fill(arity + 1)(TypeBounds.empty), + tl => CapturingType(underlyingClass.typeRef.appliedTo(tl.paramRefs), + CaptureSet.universal) + )) + else + val cls = denot.asClass.classSymbol + val decls = newScope + val paramNamePrefix = tpnme.scala ++ str.NAME_JOIN ++ name ++ str.EXPAND_SEPARATOR + val argParamRefs = List.tabulate(arity) { i => + enterTypeParam(cls, paramNamePrefix ++ "T" ++ (i + 1).toString, Contravariant, decls).typeRef + } + val resParamRef = enterTypeParam(cls, paramNamePrefix ++ "R", Covariant, decls).typeRef + val methodType = MethodType.companion( + isContextual = name.isContextFunction, + isImplicit = false, + isErased = name.isErasedFunction) + decls.enter(newMethod(cls, nme.apply, methodType(argParamRefs, resParamRef), Deferred)) + denot.info = + ClassInfo(ScalaPackageClass.thisType, cls, ObjectType :: Nil, decls) + } + } + if impure then + newPermanentSymbol(ScalaPackageClass, name, EmptyFlags, completer) + else + newPermanentClassSymbol(ScalaPackageClass, name, Trait | NoInits, completer) + } + + private def newMethod(cls: ClassSymbol, name: TermName, info: Type, flags: FlagSet = EmptyFlags): TermSymbol = + newPermanentSymbol(cls, name, flags | Method, info).asTerm + + private def enterMethod(cls: ClassSymbol, name: TermName, info: Type, flags: FlagSet = EmptyFlags): TermSymbol = + newMethod(cls, name, info, flags).entered + + private def enterPermanentSymbol(name: Name, info: Type, flags: FlagSet = EmptyFlags): Symbol = + val sym = newPermanentSymbol(ScalaPackageClass, name, flags, info) + ScalaPackageClass.currentPackageDecls.enter(sym) + sym + + private def enterAliasType(name: TypeName, tpe: Type, flags: FlagSet = EmptyFlags): TypeSymbol = + enterPermanentSymbol(name, TypeAlias(tpe), flags).asType + + private def enterBinaryAlias(name: TypeName, op: (Type, Type) => Type): TypeSymbol = + enterAliasType(name, + HKTypeLambda(TypeBounds.empty :: TypeBounds.empty :: Nil)( + tl => op(tl.paramRefs(0), tl.paramRefs(1)))) + + private def enterPolyMethod(cls: ClassSymbol, name: TermName, typeParamCount: Int, + resultTypeFn: PolyType -> Type, + flags: FlagSet = EmptyFlags, + bounds: TypeBounds = TypeBounds.empty, + 
useCompleter: Boolean = false) = { + val tparamNames = PolyType.syntheticParamNames(typeParamCount) + val tparamInfos = tparamNames map (_ => bounds) + def ptype = PolyType(tparamNames)(_ => tparamInfos, resultTypeFn) + val info = + if (useCompleter) + new LazyType { + def complete(denot: SymDenotation)(using Context): Unit = + denot.info = ptype + } + else ptype + enterMethod(cls, name, info, flags) + } + + private def enterT1ParameterlessMethod(cls: ClassSymbol, name: TermName, resultTypeFn: PolyType -> Type, flags: FlagSet) = + enterPolyMethod(cls, name, 1, resultTypeFn, flags) + + private def mkArityArray(name: String, arity: Int, countFrom: Int): Array[TypeRef | Null] = { + val arr = new Array[TypeRef | Null](arity + 1) + for (i <- countFrom to arity) arr(i) = requiredClassRef(name + i) + arr + } + + private def completeClass(cls: ClassSymbol, ensureCtor: Boolean = true): ClassSymbol = { + if (ensureCtor) ensureConstructor(cls, cls.denot.asClass, EmptyScope) + if (cls.linkedClass.exists) cls.linkedClass.markAbsent() + cls + } + + @tu lazy val RootClass: ClassSymbol = newPackageSymbol( + NoSymbol, nme.ROOT, (root, rootcls) => ctx.base.rootLoader(root)).moduleClass.asClass + @tu lazy val RootPackage: TermSymbol = newSymbol( + NoSymbol, nme.ROOTPKG, PackageCreationFlags, TypeRef(NoPrefix, RootClass)) + + @tu lazy val EmptyPackageVal: TermSymbol = newPackageSymbol( + RootClass, nme.EMPTY_PACKAGE, (emptypkg, emptycls) => ctx.base.rootLoader(emptypkg)).entered + @tu lazy val EmptyPackageClass: ClassSymbol = EmptyPackageVal.moduleClass.asClass + + /** A package in which we can place all methods and types that are interpreted specially by the compiler */ + @tu lazy val OpsPackageVal: TermSymbol = newCompletePackageSymbol(RootClass, nme.OPS_PACKAGE).entered + @tu lazy val OpsPackageClass: ClassSymbol = OpsPackageVal.moduleClass.asClass + + @tu lazy val ScalaPackageVal: TermSymbol = requiredPackage(nme.scala) + @tu lazy val ScalaMathPackageVal: TermSymbol = requiredPackage("scala.math") + @tu lazy val ScalaPackageClass: ClassSymbol = { + val cls = ScalaPackageVal.moduleClass.asClass + cls.info.decls.openForMutations.useSynthesizer( + name => + if (name.isTypeName && name.isSyntheticFunction) newFunctionNType(name.asTypeName) + else NoSymbol) + cls + } + @tu lazy val ScalaPackageObject: Symbol = requiredModule("scala.package") + @tu lazy val ScalaRuntimePackageVal: TermSymbol = requiredPackage("scala.runtime") + @tu lazy val ScalaRuntimePackageClass: ClassSymbol = ScalaRuntimePackageVal.moduleClass.asClass + @tu lazy val JavaPackageVal: TermSymbol = requiredPackage(nme.java) + @tu lazy val JavaPackageClass: ClassSymbol = JavaPackageVal.moduleClass.asClass + @tu lazy val JavaLangPackageVal: TermSymbol = requiredPackage(jnme.JavaLang) + @tu lazy val JavaLangPackageClass: ClassSymbol = JavaLangPackageVal.moduleClass.asClass + + // fundamental modules + @tu lazy val SysPackage : Symbol = requiredModule("scala.sys.package") + @tu lazy val Sys_error: Symbol = SysPackage.moduleClass.requiredMethod(nme.error) + + @tu lazy val ScalaXmlPackageClass: Symbol = getPackageClassIfDefined("scala.xml") + + @tu lazy val CompiletimePackageClass: Symbol = requiredPackage("scala.compiletime").moduleClass + @tu lazy val Compiletime_codeOf: Symbol = CompiletimePackageClass.requiredMethod("codeOf") + @tu lazy val Compiletime_erasedValue : Symbol = CompiletimePackageClass.requiredMethod("erasedValue") + @tu lazy val Compiletime_uninitialized: Symbol = CompiletimePackageClass.requiredMethod("uninitialized") + @tu lazy 
val Compiletime_error : Symbol = CompiletimePackageClass.requiredMethod(nme.error) + @tu lazy val Compiletime_requireConst : Symbol = CompiletimePackageClass.requiredMethod("requireConst") + @tu lazy val Compiletime_constValue : Symbol = CompiletimePackageClass.requiredMethod("constValue") + @tu lazy val Compiletime_constValueOpt: Symbol = CompiletimePackageClass.requiredMethod("constValueOpt") + @tu lazy val Compiletime_summonFrom : Symbol = CompiletimePackageClass.requiredMethod("summonFrom") + @tu lazy val Compiletime_summonInline : Symbol = CompiletimePackageClass.requiredMethod("summonInline") + @tu lazy val CompiletimeTestingPackage: Symbol = requiredPackage("scala.compiletime.testing") + @tu lazy val CompiletimeTesting_typeChecks: Symbol = CompiletimeTestingPackage.requiredMethod("typeChecks") + @tu lazy val CompiletimeTesting_typeCheckErrors: Symbol = CompiletimeTestingPackage.requiredMethod("typeCheckErrors") + @tu lazy val CompiletimeTesting_ErrorClass: ClassSymbol = requiredClass("scala.compiletime.testing.Error") + @tu lazy val CompiletimeTesting_Error: Symbol = requiredModule("scala.compiletime.testing.Error") + @tu lazy val CompiletimeTesting_Error_apply = CompiletimeTesting_Error.requiredMethod(nme.apply) + @tu lazy val CompiletimeTesting_ErrorKind: Symbol = requiredModule("scala.compiletime.testing.ErrorKind") + @tu lazy val CompiletimeTesting_ErrorKind_Parser: Symbol = CompiletimeTesting_ErrorKind.requiredMethod("Parser") + @tu lazy val CompiletimeTesting_ErrorKind_Typer: Symbol = CompiletimeTesting_ErrorKind.requiredMethod("Typer") + @tu lazy val CompiletimeOpsPackage: Symbol = requiredPackage("scala.compiletime.ops") + @tu lazy val CompiletimeOpsAnyModuleClass: Symbol = requiredModule("scala.compiletime.ops.any").moduleClass + @tu lazy val CompiletimeOpsIntModuleClass: Symbol = requiredModule("scala.compiletime.ops.int").moduleClass + @tu lazy val CompiletimeOpsLongModuleClass: Symbol = requiredModule("scala.compiletime.ops.long").moduleClass + @tu lazy val CompiletimeOpsFloatModuleClass: Symbol = requiredModule("scala.compiletime.ops.float").moduleClass + @tu lazy val CompiletimeOpsDoubleModuleClass: Symbol = requiredModule("scala.compiletime.ops.double").moduleClass + @tu lazy val CompiletimeOpsStringModuleClass: Symbol = requiredModule("scala.compiletime.ops.string").moduleClass + @tu lazy val CompiletimeOpsBooleanModuleClass: Symbol = requiredModule("scala.compiletime.ops.boolean").moduleClass + + /** Note: We cannot have same named methods defined in Object and Any (and AnyVal, for that matter) + * because after erasure the Any and AnyVal references get remapped to the Object methods + * which would result in a double binding assertion failure. + * Instead we do the following: + * + * - Have some methods exist only in Any, and remap them with the Erasure denotation + * transformer to be owned by Object. + * - Have other methods exist only in Object. + * To achieve this, we synthesize all Any and Object methods; Object methods no longer get + * loaded from a classfile. 
+ */ + @tu lazy val AnyClass: ClassSymbol = completeClass(enterCompleteClassSymbol(ScalaPackageClass, tpnme.Any, Abstract, Nil), ensureCtor = false) + def AnyType: TypeRef = AnyClass.typeRef + @tu lazy val MatchableClass: ClassSymbol = completeClass(enterCompleteClassSymbol(ScalaPackageClass, tpnme.Matchable, Trait, AnyType :: Nil), ensureCtor = false) + def MatchableType: TypeRef = MatchableClass.typeRef + @tu lazy val AnyValClass: ClassSymbol = + val res = completeClass(enterCompleteClassSymbol(ScalaPackageClass, tpnme.AnyVal, Abstract, List(AnyType, MatchableType))) + // Mark companion as absent, so that class does not get re-completed + val companion = ScalaPackageVal.info.decl(nme.AnyVal).symbol + companion.moduleClass.markAbsent() + companion.markAbsent() + res + + def AnyValType: TypeRef = AnyValClass.typeRef + + @tu lazy val Any_== : TermSymbol = enterMethod(AnyClass, nme.EQ, methOfAny(BooleanType), Final) + @tu lazy val Any_!= : TermSymbol = enterMethod(AnyClass, nme.NE, methOfAny(BooleanType), Final) + @tu lazy val Any_equals: TermSymbol = enterMethod(AnyClass, nme.equals_, methOfAny(BooleanType)) + @tu lazy val Any_hashCode: TermSymbol = enterMethod(AnyClass, nme.hashCode_, MethodType(Nil, IntType)) + @tu lazy val Any_toString: TermSymbol = enterMethod(AnyClass, nme.toString_, MethodType(Nil, StringType)) + @tu lazy val Any_## : TermSymbol = enterMethod(AnyClass, nme.HASHHASH, ExprType(IntType), Final) + @tu lazy val Any_isInstanceOf: TermSymbol = enterT1ParameterlessMethod(AnyClass, nme.isInstanceOf_, _ => BooleanType, Final) + @tu lazy val Any_asInstanceOf: TermSymbol = enterT1ParameterlessMethod(AnyClass, nme.asInstanceOf_, _.paramRefs(0), Final) + @tu lazy val Any_typeTest: TermSymbol = enterT1ParameterlessMethod(AnyClass, nme.isInstanceOfPM, _ => BooleanType, Final | SyntheticArtifact) + @tu lazy val Any_typeCast: TermSymbol = enterT1ParameterlessMethod(AnyClass, nme.asInstanceOfPM, _.paramRefs(0), Final | SyntheticArtifact | StableRealizable) + // generated by pattern matcher and explicit nulls, eliminated by erasure + + /** def getClass[A >: this.type](): Class[? <: A] */ + @tu lazy val Any_getClass: TermSymbol = + enterPolyMethod( + AnyClass, nme.getClass_, 1, + pt => MethodType(Nil, ClassClass.typeRef.appliedTo(TypeBounds.upper(pt.paramRefs(0)))), + Final, + bounds = TypeBounds.lower(AnyClass.thisType)) + + def AnyMethods: List[TermSymbol] = List(Any_==, Any_!=, Any_equals, Any_hashCode, + Any_toString, Any_##, Any_getClass, Any_isInstanceOf, Any_asInstanceOf, Any_typeTest, Any_typeCast) + + @tu lazy val ObjectClass: ClassSymbol = { + val cls = requiredClass("java.lang.Object") + assert(!cls.isCompleted, "race for completing java.lang.Object") + cls.info = ClassInfo(cls.owner.thisType, cls, List(AnyType, MatchableType), newScope) + cls.setFlag(NoInits | JavaDefined) + + ensureConstructor(cls, cls.denot.asClass, EmptyScope) + val companion = JavaLangPackageVal.info.decl(nme.Object).symbol.asTerm + NamerOps.makeConstructorCompanion(companion, cls) + cls + } + def ObjectType: TypeRef = ObjectClass.typeRef + + /** A type alias of Object used to represent any reference to Object in a Java + * signature, the secret sauce is that subtype checking treats it specially: + * + * tp <:< FromJavaObject + * + * is equivalent to: + * + * tp <:< Any + * + * This is useful to avoid usability problems when interacting with Java + * code where Object is the top type. 
This is safe because this type will + * only appear in signatures of Java definitions in positions where `Object` + * might appear; let's enumerate all possible cases this gives us: + * + * 1. At the top level: + * + * // A.java + * void meth1(Object arg) {} + * <T> void meth2(T arg) {} // T implicitly extends Object + * + * // B.scala + * meth1(1) // OK + * meth2(1) // OK + * + * This is safe even though Int is not a subtype of Object, because Erasure + * will detect the mismatch and box the value type. + * + * 2. In a class type parameter: + * + * // A.java + * void meth3(scala.List<Object> arg) {} + * <T> void meth4(scala.List<T> arg) {} + * + * // B.scala + * meth3(List[Int](1)) // OK + * meth4(List[Int](1)) // OK + * + * At erasure, type parameters are removed and value types are boxed. + * + * 3. As the type parameter of an array: + * + * // A.java + * void meth5(Object[] arg) {} + * <T> void meth6(T[] arg) {} + * + * // B.scala + * meth5(Array[Int](1)) // error: Array[Int] is not a subtype of Array[Object] + * meth6(Array[Int](1)) // error: Array[Int] is not a subtype of Array[T & Object] + * + * + * This is a bit more subtle: at erasure, Arrays keep their type parameter, + * and primitive Arrays are not subtypes of reference Arrays on the JVM, + * so we can't pass an Array of Int where a reference Array is expected. + * Array is invariant in Scala, so `meth5` is safe even if we use `FromJavaObject`, + * but generic Arrays are treated specially: we always add `& Object` (and here + * we mean the normal java.lang.Object type) to these types when they come from + * Java signatures (see `translateJavaArrayElementType`); this ensures that `meth6` + * is safe to use. + * + * 4. As the repeated argument of a varargs method: + * + * // A.java + * void meth7(Object... args) {} + * <T> void meth8(T... args) {} + * + * // B.scala + * meth7(1) // OK (creates a reference array) + * meth8(1) // OK (creates a primitive array and copies it into a reference array at Erasure) + * val ai = Array[Int](1) + * meth7(ai: _*) // OK (will copy the array at Erasure) + * meth8(ai: _*) // OK (will copy the array at Erasure) + * + * Java repeated arguments are erased to arrays, so it would be safe to treat + * them in the same way: add an `& Object` to the parameter type to disallow + * passing primitives, but that would be very inconvenient as it is common to + * want to pass a primitive to an Object repeated argument (e.g. + * `String.format("foo: %d", 1)`). So instead we type them _without_ adding the + * `& Object` and let `ElimRepeated` and `Erasure` take care of doing any necessary adaptation + * (note that adapting a primitive array to a reference array requires + * copying the whole array, so this transformation only preserves semantics + * if the callee does not try to mutate the varargs array which is a reasonable + * assumption to make). + * + * + * This mechanism is similar to `ObjectTpeJavaRef` in Scala 2, except that we + * create a new symbol with its own name; this is needed because this type + * can show up in inferred types and therefore needs to be preserved when + * pickling so that unpickled trees pass `-Ycheck`. + * + * Note that by default we pretty-print `FromJavaObject` as `Object` or simply omit it + * if it's the sole upper-bound of a type parameter; use `-Yprint-debug` to explicitly + * display it.
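 *
 *  As a compact restatement of the rule (illustrative sketch only):
 *
 *      // Java signature              type seen from Scala
 *      void meth1(Object arg)         def meth1(arg: FromJavaObject): Unit
 *
 *      // so a call meth1(e) typechecks iff e.tpe <:< FromJavaObject, which the
 *      // subtype checker treats exactly like e.tpe <:< Any, while FromJavaObject
 *      // still erases to java.lang.Object in the generated bytecode.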
+ */ + @tu lazy val FromJavaObjectSymbol: TypeSymbol = + newPermanentSymbol(OpsPackageClass, tpnme.FromJavaObject, JavaDefined, TypeAlias(ObjectType)).entered + def FromJavaObjectType: TypeRef = FromJavaObjectSymbol.typeRef + + @tu lazy val AnyRefAlias: TypeSymbol = enterAliasType(tpnme.AnyRef, ObjectType) + def AnyRefType: TypeRef = AnyRefAlias.typeRef + + @tu lazy val Object_eq: TermSymbol = enterMethod(ObjectClass, nme.eq, methOfAnyRef(BooleanType), Final) + @tu lazy val Object_ne: TermSymbol = enterMethod(ObjectClass, nme.ne, methOfAnyRef(BooleanType), Final) + @tu lazy val Object_synchronized: TermSymbol = enterPolyMethod(ObjectClass, nme.synchronized_, 1, + pt => MethodType(List(pt.paramRefs(0)), pt.paramRefs(0)), Final) + @tu lazy val Object_clone: TermSymbol = enterMethod(ObjectClass, nme.clone_, MethodType(Nil, ObjectType), Protected) + @tu lazy val Object_finalize: TermSymbol = enterMethod(ObjectClass, nme.finalize_, MethodType(Nil, UnitType), Protected) + @tu lazy val Object_notify: TermSymbol = enterMethod(ObjectClass, nme.notify_, MethodType(Nil, UnitType), Final) + @tu lazy val Object_notifyAll: TermSymbol = enterMethod(ObjectClass, nme.notifyAll_, MethodType(Nil, UnitType), Final) + @tu lazy val Object_wait: TermSymbol = enterMethod(ObjectClass, nme.wait_, MethodType(Nil, UnitType), Final) + @tu lazy val Object_waitL: TermSymbol = enterMethod(ObjectClass, nme.wait_, MethodType(LongType :: Nil, UnitType), Final) + @tu lazy val Object_waitLI: TermSymbol = enterMethod(ObjectClass, nme.wait_, MethodType(LongType :: IntType :: Nil, UnitType), Final) + + def ObjectMethods: List[TermSymbol] = List(Object_eq, Object_ne, Object_synchronized, Object_clone, + Object_finalize, Object_notify, Object_notifyAll, Object_wait, Object_waitL, Object_waitLI) + + /** Methods in Object and Any that do not have a side effect */ + @tu lazy val pureMethods: List[TermSymbol] = List(Any_==, Any_!=, Any_equals, Any_hashCode, + Any_toString, Any_##, Any_getClass, Any_isInstanceOf, Any_typeTest, Object_eq, Object_ne) + + @tu lazy val AnyKindClass: ClassSymbol = { + val cls = newCompleteClassSymbol(ScalaPackageClass, tpnme.AnyKind, AbstractFinal | Permanent, Nil, newScope(0)) + if (!ctx.settings.YnoKindPolymorphism.value) + // Enable kind-polymorphism by exposing scala.AnyKind + cls.entered + cls + } + def AnyKindType: TypeRef = AnyKindClass.typeRef + + @tu lazy val andType: TypeSymbol = enterBinaryAlias(tpnme.AND, AndType(_, _)) + @tu lazy val orType: TypeSymbol = enterBinaryAlias(tpnme.OR, OrType(_, _, soft = false)) + + /** Method representing a throw */ + @tu lazy val throwMethod: TermSymbol = enterMethod(OpsPackageClass, nme.THROWkw, + MethodType(List(ThrowableType), NothingType)) + + @tu lazy val NothingClass: ClassSymbol = enterCompleteClassSymbol( + ScalaPackageClass, tpnme.Nothing, AbstractFinal, List(AnyType)) + def NothingType: TypeRef = NothingClass.typeRef + @tu lazy val NullClass: ClassSymbol = { + // When explicit-nulls is enabled, Null becomes a direct subtype of Any and Matchable + val parents = if ctx.explicitNulls then AnyType :: MatchableType :: Nil else ObjectType :: Nil + enterCompleteClassSymbol(ScalaPackageClass, tpnme.Null, AbstractFinal, parents) + } + def NullType: TypeRef = NullClass.typeRef + + @tu lazy val InvokerModule = requiredModule("scala.runtime.coverage.Invoker") + @tu lazy val InvokedMethodRef = InvokerModule.requiredMethodRef("invoked") + + @tu lazy val ImplicitScrutineeTypeSym = + newPermanentSymbol(ScalaPackageClass, tpnme.IMPLICITkw, EmptyFlags, 
TypeBounds.empty).entered + def ImplicitScrutineeTypeRef: TypeRef = ImplicitScrutineeTypeSym.typeRef + + @tu lazy val ScalaPredefModule: Symbol = requiredModule("scala.Predef") + @tu lazy val Predef_conforms : Symbol = ScalaPredefModule.requiredMethod(nme.conforms_) + @tu lazy val Predef_classOf : Symbol = ScalaPredefModule.requiredMethod(nme.classOf) + @tu lazy val Predef_identity : Symbol = ScalaPredefModule.requiredMethod(nme.identity) + @tu lazy val Predef_undefined: Symbol = ScalaPredefModule.requiredMethod(nme.???) + @tu lazy val ScalaPredefModuleClass: ClassSymbol = ScalaPredefModule.moduleClass.asClass + + @tu lazy val SubTypeClass: ClassSymbol = requiredClass("scala.<:<") + @tu lazy val SubType_refl: Symbol = SubTypeClass.companionModule.requiredMethod(nme.refl) + + @tu lazy val DummyImplicitClass: ClassSymbol = requiredClass("scala.DummyImplicit") + + @tu lazy val ScalaRuntimeModule: Symbol = requiredModule("scala.runtime.ScalaRunTime") + def runtimeMethodRef(name: PreName): TermRef = ScalaRuntimeModule.requiredMethodRef(name) + def ScalaRuntime_drop: Symbol = runtimeMethodRef(nme.drop).symbol + @tu lazy val ScalaRuntime__hashCode: Symbol = ScalaRuntimeModule.requiredMethod(nme._hashCode_) + @tu lazy val ScalaRuntime_toArray: Symbol = ScalaRuntimeModule.requiredMethod(nme.toArray) + @tu lazy val ScalaRuntime_toObjectArray: Symbol = ScalaRuntimeModule.requiredMethod(nme.toObjectArray) + + @tu lazy val BoxesRunTimeModule: Symbol = requiredModule("scala.runtime.BoxesRunTime") + @tu lazy val BoxesRunTimeModule_externalEquals: Symbol = BoxesRunTimeModule.info.decl(nme.equals_).suchThat(toDenot(_).info.firstParamTypes.size == 2).symbol + @tu lazy val ScalaStaticsModule: Symbol = requiredModule("scala.runtime.Statics") + def staticsMethodRef(name: PreName): TermRef = ScalaStaticsModule.requiredMethodRef(name) + def staticsMethod(name: PreName): TermSymbol = ScalaStaticsModule.requiredMethod(name) + + @tu lazy val DottyArraysModule: Symbol = requiredModule("scala.runtime.Arrays") + def newGenericArrayMethod(using Context): TermSymbol = DottyArraysModule.requiredMethod("newGenericArray") + def newArrayMethod(using Context): TermSymbol = DottyArraysModule.requiredMethod("newArray") + + def getWrapVarargsArrayModule: Symbol = ScalaRuntimeModule + + // The set of all wrap{X, Ref}Array methods, where X is a value type + val WrapArrayMethods: PerRun[collection.Set[Symbol]] = new PerRun({ + val methodNames = ScalaValueTypes.map(ast.tpd.wrapArrayMethodName) `union` Set(nme.wrapRefArray) + methodNames.map(getWrapVarargsArrayModule.requiredMethod(_)) + }) + + @tu lazy val ListClass: Symbol = requiredClass("scala.collection.immutable.List") + @tu lazy val ListModule: Symbol = requiredModule("scala.collection.immutable.List") + @tu lazy val NilModule: Symbol = requiredModule("scala.collection.immutable.Nil") + @tu lazy val ConsClass: Symbol = requiredClass("scala.collection.immutable.::") + @tu lazy val SeqFactoryClass: Symbol = requiredClass("scala.collection.SeqFactory") + + @tu lazy val SingletonClass: ClassSymbol = + // needed as a synthetic class because Scala 2.x refers to it in classfiles + // but does not define it as an explicit class. 
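    // For illustration (hypothetical user code): an upper bound on Singleton is what
    // prevents widening during type inference, e.g.
    //   def narrow[T <: Int & Singleton](x: T): T = x
    //   narrow(42)   // inferred as narrow[42](42): 42, not narrow[Int](42)
    // Scala 2 classfiles refer to scala.Singleton in exactly such bounds.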
+ enterCompleteClassSymbol( + ScalaPackageClass, tpnme.Singleton, PureInterfaceCreationFlags | Final, + List(AnyType), EmptyScope) + @tu lazy val SingletonType: TypeRef = SingletonClass.typeRef + + @tu lazy val CollectionSeqType: TypeRef = requiredClassRef("scala.collection.Seq") + @tu lazy val SeqType: TypeRef = requiredClassRef("scala.collection.immutable.Seq") + def SeqClass(using Context): ClassSymbol = SeqType.symbol.asClass + @tu lazy val Seq_apply : Symbol = SeqClass.requiredMethod(nme.apply) + @tu lazy val Seq_head : Symbol = SeqClass.requiredMethod(nme.head) + @tu lazy val Seq_drop : Symbol = SeqClass.requiredMethod(nme.drop) + @tu lazy val Seq_lengthCompare: Symbol = SeqClass.requiredMethod(nme.lengthCompare, List(IntType)) + @tu lazy val Seq_length : Symbol = SeqClass.requiredMethod(nme.length) + @tu lazy val Seq_toSeq : Symbol = SeqClass.requiredMethod(nme.toSeq) + @tu lazy val SeqModule: Symbol = requiredModule("scala.collection.immutable.Seq") + + + @tu lazy val StringOps: Symbol = requiredClass("scala.collection.StringOps") + @tu lazy val StringOps_format: Symbol = StringOps.requiredMethod(nme.format) + + @tu lazy val ArrayType: TypeRef = requiredClassRef("scala.Array") + def ArrayClass(using Context): ClassSymbol = ArrayType.symbol.asClass + @tu lazy val Array_apply : Symbol = ArrayClass.requiredMethod(nme.apply) + @tu lazy val Array_update : Symbol = ArrayClass.requiredMethod(nme.update) + @tu lazy val Array_length : Symbol = ArrayClass.requiredMethod(nme.length) + @tu lazy val Array_clone : Symbol = ArrayClass.requiredMethod(nme.clone_) + @tu lazy val ArrayConstructor: Symbol = ArrayClass.requiredMethod(nme.CONSTRUCTOR) + + @tu lazy val ArrayModule: Symbol = requiredModule("scala.Array") + def ArrayModuleClass: Symbol = ArrayModule.moduleClass + + @tu lazy val IArrayModule: Symbol = requiredModule("scala.IArray") + def IArrayModuleClass: Symbol = IArrayModule.moduleClass + + @tu lazy val UnitType: TypeRef = valueTypeRef("scala.Unit", java.lang.Void.TYPE, UnitEnc, nme.specializedTypeNames.Void) + def UnitClass(using Context): ClassSymbol = UnitType.symbol.asClass + def UnitModuleClass(using Context): Symbol = UnitType.symbol.asClass.linkedClass + @tu lazy val BooleanType: TypeRef = valueTypeRef("scala.Boolean", java.lang.Boolean.TYPE, BooleanEnc, nme.specializedTypeNames.Boolean) + def BooleanClass(using Context): ClassSymbol = BooleanType.symbol.asClass + @tu lazy val Boolean_! : Symbol = BooleanClass.requiredMethod(nme.UNARY_!) + @tu lazy val Boolean_&& : Symbol = BooleanClass.requiredMethod(nme.ZAND) // ### harmonize required... 
calls + @tu lazy val Boolean_|| : Symbol = BooleanClass.requiredMethod(nme.ZOR) + @tu lazy val Boolean_== : Symbol = + BooleanClass.info.member(nme.EQ).suchThat(_.info.firstParamTypes match { + case List(pt) => pt.isRef(BooleanClass) + case _ => false + }).symbol + @tu lazy val Boolean_!= : Symbol = + BooleanClass.info.member(nme.NE).suchThat(_.info.firstParamTypes match { + case List(pt) => pt.isRef(BooleanClass) + case _ => false + }).symbol + + @tu lazy val ByteType: TypeRef = valueTypeRef("scala.Byte", java.lang.Byte.TYPE, ByteEnc, nme.specializedTypeNames.Byte) + def ByteClass(using Context): ClassSymbol = ByteType.symbol.asClass + @tu lazy val ShortType: TypeRef = valueTypeRef("scala.Short", java.lang.Short.TYPE, ShortEnc, nme.specializedTypeNames.Short) + def ShortClass(using Context): ClassSymbol = ShortType.symbol.asClass + @tu lazy val CharType: TypeRef = valueTypeRef("scala.Char", java.lang.Character.TYPE, CharEnc, nme.specializedTypeNames.Char) + def CharClass(using Context): ClassSymbol = CharType.symbol.asClass + @tu lazy val IntType: TypeRef = valueTypeRef("scala.Int", java.lang.Integer.TYPE, IntEnc, nme.specializedTypeNames.Int) + def IntClass(using Context): ClassSymbol = IntType.symbol.asClass + @tu lazy val Int_- : Symbol = IntClass.requiredMethod(nme.MINUS, List(IntType)) + @tu lazy val Int_+ : Symbol = IntClass.requiredMethod(nme.PLUS, List(IntType)) + @tu lazy val Int_/ : Symbol = IntClass.requiredMethod(nme.DIV, List(IntType)) + @tu lazy val Int_* : Symbol = IntClass.requiredMethod(nme.MUL, List(IntType)) + @tu lazy val Int_== : Symbol = IntClass.requiredMethod(nme.EQ, List(IntType)) + @tu lazy val Int_>= : Symbol = IntClass.requiredMethod(nme.GE, List(IntType)) + @tu lazy val Int_<= : Symbol = IntClass.requiredMethod(nme.LE, List(IntType)) + @tu lazy val LongType: TypeRef = valueTypeRef("scala.Long", java.lang.Long.TYPE, LongEnc, nme.specializedTypeNames.Long) + def LongClass(using Context): ClassSymbol = LongType.symbol.asClass + @tu lazy val Long_+ : Symbol = LongClass.requiredMethod(nme.PLUS, List(LongType)) + @tu lazy val Long_* : Symbol = LongClass.requiredMethod(nme.MUL, List(LongType)) + @tu lazy val Long_/ : Symbol = LongClass.requiredMethod(nme.DIV, List(LongType)) + + @tu lazy val FloatType: TypeRef = valueTypeRef("scala.Float", java.lang.Float.TYPE, FloatEnc, nme.specializedTypeNames.Float) + def FloatClass(using Context): ClassSymbol = FloatType.symbol.asClass + @tu lazy val DoubleType: TypeRef = valueTypeRef("scala.Double", java.lang.Double.TYPE, DoubleEnc, nme.specializedTypeNames.Double) + def DoubleClass(using Context): ClassSymbol = DoubleType.symbol.asClass + + @tu lazy val BoxedUnitClass: ClassSymbol = requiredClass("scala.runtime.BoxedUnit") + def BoxedUnit_UNIT(using Context): TermSymbol = BoxedUnitClass.linkedClass.requiredValue("UNIT") + def BoxedUnit_TYPE(using Context): TermSymbol = BoxedUnitClass.linkedClass.requiredValue("TYPE") + + @tu lazy val BoxedBooleanClass: ClassSymbol = requiredClass("java.lang.Boolean") + @tu lazy val BoxedByteClass : ClassSymbol = requiredClass("java.lang.Byte") + @tu lazy val BoxedShortClass : ClassSymbol = requiredClass("java.lang.Short") + @tu lazy val BoxedCharClass : ClassSymbol = requiredClass("java.lang.Character") + @tu lazy val BoxedIntClass : ClassSymbol = requiredClass("java.lang.Integer") + @tu lazy val BoxedLongClass : ClassSymbol = requiredClass("java.lang.Long") + @tu lazy val BoxedFloatClass : ClassSymbol = requiredClass("java.lang.Float") + @tu lazy val BoxedDoubleClass : ClassSymbol = 
requiredClass("java.lang.Double") + + @tu lazy val BoxedBooleanModule: TermSymbol = requiredModule("java.lang.Boolean") + @tu lazy val BoxedByteModule : TermSymbol = requiredModule("java.lang.Byte") + @tu lazy val BoxedShortModule : TermSymbol = requiredModule("java.lang.Short") + @tu lazy val BoxedCharModule : TermSymbol = requiredModule("java.lang.Character") + @tu lazy val BoxedIntModule : TermSymbol = requiredModule("java.lang.Integer") + @tu lazy val BoxedLongModule : TermSymbol = requiredModule("java.lang.Long") + @tu lazy val BoxedFloatModule : TermSymbol = requiredModule("java.lang.Float") + @tu lazy val BoxedDoubleModule : TermSymbol = requiredModule("java.lang.Double") + @tu lazy val BoxedUnitModule : TermSymbol = requiredModule("java.lang.Void") + + @tu lazy val ByNameParamClass2x: ClassSymbol = enterSpecialPolyClass(tpnme.BYNAME_PARAM_CLASS, Covariant, Seq(AnyType)) + + @tu lazy val RepeatedParamClass: ClassSymbol = enterSpecialPolyClass(tpnme.REPEATED_PARAM_CLASS, Covariant, Seq(ObjectType, SeqType)) + + @tu lazy val IntoType: TypeSymbol = enterAliasType(tpnme.INTO, HKTypeLambda(TypeBounds.empty :: Nil)(_.paramRefs(0))) + + // fundamental classes + @tu lazy val StringClass: ClassSymbol = requiredClass("java.lang.String") + def StringType: Type = StringClass.typeRef + @tu lazy val StringModule: Symbol = StringClass.linkedClass + @tu lazy val String_+ : TermSymbol = enterMethod(StringClass, nme.raw.PLUS, methOfAny(StringType), Final) + @tu lazy val String_valueOf_Object: Symbol = StringModule.info.member(nme.valueOf).suchThat(_.info.firstParamTypes match { + case List(pt) => pt.isAny || pt.stripNull.isAnyRef + case _ => false + }).symbol + + @tu lazy val JavaCloneableClass: ClassSymbol = requiredClass("java.lang.Cloneable") + @tu lazy val NullPointerExceptionClass: ClassSymbol = requiredClass("java.lang.NullPointerException") + @tu lazy val IndexOutOfBoundsException: ClassSymbol = requiredClass("java.lang.IndexOutOfBoundsException") + @tu lazy val ClassClass: ClassSymbol = requiredClass("java.lang.Class") + @tu lazy val BoxedNumberClass: ClassSymbol = requiredClass("java.lang.Number") + @tu lazy val ClassCastExceptionClass: ClassSymbol = requiredClass("java.lang.ClassCastException") + @tu lazy val ClassCastExceptionClass_stringConstructor: TermSymbol = ClassCastExceptionClass.info.member(nme.CONSTRUCTOR).suchThat(_.info.firstParamTypes match { + case List(pt) => + pt.stripNull.isRef(StringClass) + case _ => false + }).symbol.asTerm + @tu lazy val ArithmeticExceptionClass: ClassSymbol = requiredClass("java.lang.ArithmeticException") + @tu lazy val ArithmeticExceptionClass_stringConstructor: TermSymbol = ArithmeticExceptionClass.info.member(nme.CONSTRUCTOR).suchThat(_.info.firstParamTypes match { + case List(pt) => + pt.stripNull.isRef(StringClass) + case _ => false + }).symbol.asTerm + + @tu lazy val JavaSerializableClass: ClassSymbol = requiredClass("java.io.Serializable") + + @tu lazy val ComparableClass: ClassSymbol = requiredClass("java.lang.Comparable") + + @tu lazy val SystemClass: ClassSymbol = requiredClass("java.lang.System") + @tu lazy val SystemModule: Symbol = SystemClass.linkedClass + + @tu lazy val NoSuchElementExceptionClass = requiredClass("java.util.NoSuchElementException") + def NoSuchElementExceptionType = NoSuchElementExceptionClass.typeRef + @tu lazy val IllegalArgumentExceptionClass = requiredClass("java.lang.IllegalArgumentException") + def IllegalArgumentExceptionType = IllegalArgumentExceptionClass.typeRef + + // in scalac modified to have Any as parent + 
+ @tu lazy val ThrowableType: TypeRef = requiredClassRef("java.lang.Throwable") + def ThrowableClass(using Context): ClassSymbol = ThrowableType.symbol.asClass + @tu lazy val ExceptionClass: ClassSymbol = requiredClass("java.lang.Exception") + @tu lazy val RuntimeExceptionClass: ClassSymbol = requiredClass("java.lang.RuntimeException") + + @tu lazy val SerializableType: TypeRef = JavaSerializableClass.typeRef + def SerializableClass(using Context): ClassSymbol = SerializableType.symbol.asClass + + @tu lazy val JavaBigIntegerClass: ClassSymbol = requiredClass("java.math.BigInteger") + @tu lazy val JavaBigDecimalClass: ClassSymbol = requiredClass("java.math.BigDecimal") + @tu lazy val JavaCalendarClass: ClassSymbol = requiredClass("java.util.Calendar") + @tu lazy val JavaDateClass: ClassSymbol = requiredClass("java.util.Date") + @tu lazy val JavaFormattableClass: ClassSymbol = requiredClass("java.util.Formattable") + + @tu lazy val JavaEnumClass: ClassSymbol = { + val cls = requiredClass("java.lang.Enum") + // jl.Enum has a single constructor protected(name: String, ordinal: Int). + // We remove the arguments from the primary constructor, and enter + // a new constructor symbol with 2 arguments, so that both + // `X extends jl.Enum[X]` and `X extends jl.Enum[X](name, ordinal)` + // pass typer and go through jl.Enum-specific checks in RefChecks. + cls.infoOrCompleter match { + case completer: ClassfileLoader => + cls.info = new ClassfileLoader(completer.classfile) { + override def complete(root: SymDenotation)(using Context): Unit = { + super.complete(root) + val constr = cls.primaryConstructor + val noArgInfo = constr.info match { + case info: PolyType => + info.resType match { + case meth: MethodType => + info.derivedLambdaType( + resType = meth.derivedLambdaType( + paramNames = Nil, paramInfos = Nil)) + } + } + val argConstr = constr.copy().entered + constr.info = noArgInfo + constr.termRef.recomputeDenot() + } + } + cls + } + } + def JavaEnumType = JavaEnumClass.typeRef + + @tu lazy val MethodHandleClass: ClassSymbol = requiredClass("java.lang.invoke.MethodHandle") + @tu lazy val MethodHandlesLookupClass: ClassSymbol = requiredClass("java.lang.invoke.MethodHandles.Lookup") + @tu lazy val VarHandleClass: ClassSymbol = requiredClass("java.lang.invoke.VarHandle") + + @tu lazy val StringBuilderClass: ClassSymbol = requiredClass("scala.collection.mutable.StringBuilder") + @tu lazy val MatchErrorClass : ClassSymbol = requiredClass("scala.MatchError") + @tu lazy val ConversionClass : ClassSymbol = requiredClass("scala.Conversion").typeRef.symbol.asClass + + @tu lazy val StringAddClass : ClassSymbol = requiredClass("scala.runtime.StringAdd") + @tu lazy val StringAdd_+ : Symbol = StringAddClass.requiredMethod(nme.raw.PLUS) + + @tu lazy val StringContextClass: ClassSymbol = requiredClass("scala.StringContext") + @tu lazy val StringContext_s : Symbol = StringContextClass.requiredMethod(nme.s) + @tu lazy val StringContext_raw: Symbol = StringContextClass.requiredMethod(nme.raw_) + @tu lazy val StringContext_f : Symbol = StringContextClass.requiredMethod(nme.f) + @tu lazy val StringContext_parts: Symbol = StringContextClass.requiredMethod(nme.parts) + @tu lazy val StringContextModule: Symbol = StringContextClass.companionModule + @tu lazy val StringContextModule_apply: Symbol = StringContextModule.requiredMethod(nme.apply) + @tu lazy val StringContextModule_standardInterpolator: Symbol = StringContextModule.requiredMethod(nme.standardInterpolator) + @tu lazy val StringContextModule_processEscapes: 
Symbol = StringContextModule.requiredMethod(nme.processEscapes) + + @tu lazy val PartialFunctionClass: ClassSymbol = requiredClass("scala.PartialFunction") + @tu lazy val PartialFunction_isDefinedAt: Symbol = PartialFunctionClass.requiredMethod(nme.isDefinedAt) + @tu lazy val PartialFunction_applyOrElse: Symbol = PartialFunctionClass.requiredMethod(nme.applyOrElse) + + @tu lazy val AbstractPartialFunctionClass: ClassSymbol = requiredClass("scala.runtime.AbstractPartialFunction") + @tu lazy val FunctionXXLClass: ClassSymbol = requiredClass("scala.runtime.FunctionXXL") + @tu lazy val ScalaSymbolClass: ClassSymbol = requiredClass("scala.Symbol") + @tu lazy val DynamicClass: ClassSymbol = requiredClass("scala.Dynamic") + @tu lazy val OptionClass: ClassSymbol = requiredClass("scala.Option") + @tu lazy val SomeClass: ClassSymbol = requiredClass("scala.Some") + @tu lazy val NoneModule: Symbol = requiredModule("scala.None") + + @tu lazy val EnumClass: ClassSymbol = requiredClass("scala.reflect.Enum") + @tu lazy val Enum_ordinal: Symbol = EnumClass.requiredMethod(nme.ordinal) + + @tu lazy val EnumValueSerializationProxyClass: ClassSymbol = requiredClass("scala.runtime.EnumValueSerializationProxy") + @tu lazy val EnumValueSerializationProxyConstructor: TermSymbol = + EnumValueSerializationProxyClass.requiredMethod(nme.CONSTRUCTOR, List(ClassType(TypeBounds.empty), IntType)) + + @tu lazy val ProductClass: ClassSymbol = requiredClass("scala.Product") + @tu lazy val Product_canEqual : Symbol = ProductClass.requiredMethod(nme.canEqual_) + @tu lazy val Product_productArity : Symbol = ProductClass.requiredMethod(nme.productArity) + @tu lazy val Product_productElement : Symbol = ProductClass.requiredMethod(nme.productElement) + @tu lazy val Product_productElementName: Symbol = ProductClass.requiredMethod(nme.productElementName) + @tu lazy val Product_productPrefix : Symbol = ProductClass.requiredMethod(nme.productPrefix) + + @tu lazy val IteratorClass: ClassSymbol = requiredClass("scala.collection.Iterator") + def IteratorModule(using Context): Symbol = IteratorClass.companionModule + + @tu lazy val ModuleSerializationProxyClass: ClassSymbol = requiredClass("scala.runtime.ModuleSerializationProxy") + @tu lazy val ModuleSerializationProxyConstructor: TermSymbol = + ModuleSerializationProxyClass.requiredMethod(nme.CONSTRUCTOR, List(ClassType(TypeBounds.empty))) + + @tu lazy val MirrorClass: ClassSymbol = requiredClass("scala.deriving.Mirror") + @tu lazy val Mirror_ProductClass: ClassSymbol = requiredClass("scala.deriving.Mirror.Product") + @tu lazy val Mirror_Product_fromProduct: Symbol = Mirror_ProductClass.requiredMethod(nme.fromProduct) + @tu lazy val Mirror_SumClass: ClassSymbol = requiredClass("scala.deriving.Mirror.Sum") + @tu lazy val Mirror_SingletonClass: ClassSymbol = requiredClass("scala.deriving.Mirror.Singleton") + @tu lazy val Mirror_SingletonProxyClass: ClassSymbol = requiredClass("scala.deriving.Mirror.SingletonProxy") + + @tu lazy val LanguageModule: Symbol = requiredModule("scala.language") + @tu lazy val LanguageModuleClass: Symbol = LanguageModule.moduleClass.asClass + @tu lazy val LanguageExperimentalModule: Symbol = requiredModule("scala.language.experimental") + @tu lazy val LanguageDeprecatedModule: Symbol = requiredModule("scala.language.deprecated") + @tu lazy val NonLocalReturnControlClass: ClassSymbol = requiredClass("scala.runtime.NonLocalReturnControl") + @tu lazy val SelectableClass: ClassSymbol = requiredClass("scala.Selectable") + @tu lazy val 
WithoutPreciseParameterTypesClass: Symbol = requiredClass("scala.Selectable.WithoutPreciseParameterTypes") + + @tu lazy val ManifestClass: ClassSymbol = requiredClass("scala.reflect.Manifest") + @tu lazy val ManifestFactoryModule: Symbol = requiredModule("scala.reflect.ManifestFactory") + @tu lazy val ClassManifestFactoryModule: Symbol = requiredModule("scala.reflect.ClassManifestFactory") + @tu lazy val OptManifestClass: ClassSymbol = requiredClass("scala.reflect.OptManifest") + @tu lazy val NoManifestModule: Symbol = requiredModule("scala.reflect.NoManifest") + + @tu lazy val ReflectPackageClass: Symbol = requiredPackage("scala.reflect.package").moduleClass + @tu lazy val ClassTagClass: ClassSymbol = requiredClass("scala.reflect.ClassTag") + @tu lazy val ClassTagModule: Symbol = ClassTagClass.companionModule + @tu lazy val ClassTagModule_apply: Symbol = ClassTagModule.requiredMethod(nme.apply) + + @tu lazy val TypeTestClass: ClassSymbol = requiredClass("scala.reflect.TypeTest") + @tu lazy val TypeTest_unapply: Symbol = TypeTestClass.requiredMethod(nme.unapply) + @tu lazy val TypeTestModule_identity: Symbol = TypeTestClass.companionModule.requiredMethod(nme.identity) + + @tu lazy val QuotedExprClass: ClassSymbol = requiredClass("scala.quoted.Expr") + + @tu lazy val QuotesClass: ClassSymbol = requiredClass("scala.quoted.Quotes") + @tu lazy val Quotes_reflect: Symbol = QuotesClass.requiredValue("reflect") + @tu lazy val Quotes_reflect_asTerm: Symbol = Quotes_reflect.requiredMethod("asTerm") + @tu lazy val Quotes_reflect_Apply: Symbol = Quotes_reflect.requiredValue("Apply") + @tu lazy val Quotes_reflect_Apply_apply: Symbol = Quotes_reflect_Apply.requiredMethod(nme.apply) + @tu lazy val Quotes_reflect_TypeApply: Symbol = Quotes_reflect.requiredValue("TypeApply") + @tu lazy val Quotes_reflect_TypeApply_apply: Symbol = Quotes_reflect_TypeApply.requiredMethod(nme.apply) + @tu lazy val Quotes_reflect_Assign: Symbol = Quotes_reflect.requiredValue("Assign") + @tu lazy val Quotes_reflect_Assign_apply: Symbol = Quotes_reflect_Assign.requiredMethod(nme.apply) + @tu lazy val Quotes_reflect_Inferred: Symbol = Quotes_reflect.requiredValue("Inferred") + @tu lazy val Quotes_reflect_Inferred_apply: Symbol = Quotes_reflect_Inferred.requiredMethod(nme.apply) + @tu lazy val Quotes_reflect_Literal: Symbol = Quotes_reflect.requiredValue("Literal") + @tu lazy val Quotes_reflect_Literal_apply: Symbol = Quotes_reflect_Literal.requiredMethod(nme.apply) + @tu lazy val Quotes_reflect_TreeMethods: Symbol = Quotes_reflect.requiredMethod("TreeMethods") + @tu lazy val Quotes_reflect_TreeMethods_asExpr: Symbol = Quotes_reflect_TreeMethods.requiredMethod("asExpr") + @tu lazy val Quotes_reflect_TypeRepr: Symbol = Quotes_reflect.requiredValue("TypeRepr") + @tu lazy val Quotes_reflect_TypeRepr_of: Symbol = Quotes_reflect_TypeRepr.requiredMethod("of") + @tu lazy val Quotes_reflect_TypeRepr_typeConstructorOf: Symbol = Quotes_reflect_TypeRepr.requiredMethod("typeConstructorOf") + @tu lazy val Quotes_reflect_TypeReprMethods: Symbol = Quotes_reflect.requiredValue("TypeReprMethods") + @tu lazy val Quotes_reflect_TypeReprMethods_asType: Symbol = Quotes_reflect_TypeReprMethods.requiredMethod("asType") + @tu lazy val Quotes_reflect_TypeTreeType: Symbol = Quotes_reflect.requiredType("TypeTree") + @tu lazy val Quotes_reflect_TermType: Symbol = Quotes_reflect.requiredType("Term") + @tu lazy val Quotes_reflect_BooleanConstant: Symbol = Quotes_reflect.requiredValue("BooleanConstant") + @tu lazy val Quotes_reflect_ByteConstant: Symbol = 
Quotes_reflect.requiredValue("ByteConstant") + @tu lazy val Quotes_reflect_ShortConstant: Symbol = Quotes_reflect.requiredValue("ShortConstant") + @tu lazy val Quotes_reflect_IntConstant: Symbol = Quotes_reflect.requiredValue("IntConstant") + @tu lazy val Quotes_reflect_LongConstant: Symbol = Quotes_reflect.requiredValue("LongConstant") + @tu lazy val Quotes_reflect_FloatConstant: Symbol = Quotes_reflect.requiredValue("FloatConstant") + @tu lazy val Quotes_reflect_DoubleConstant: Symbol = Quotes_reflect.requiredValue("DoubleConstant") + @tu lazy val Quotes_reflect_CharConstant: Symbol = Quotes_reflect.requiredValue("CharConstant") + @tu lazy val Quotes_reflect_StringConstant: Symbol = Quotes_reflect.requiredValue("StringConstant") + @tu lazy val Quotes_reflect_UnitConstant: Symbol = Quotes_reflect.requiredValue("UnitConstant") + @tu lazy val Quotes_reflect_NullConstant: Symbol = Quotes_reflect.requiredValue("NullConstant") + @tu lazy val Quotes_reflect_ClassOfConstant: Symbol = Quotes_reflect.requiredValue("ClassOfConstant") + + + @tu lazy val QuoteUnpicklerClass: ClassSymbol = requiredClass("scala.quoted.runtime.QuoteUnpickler") + @tu lazy val QuoteUnpickler_unpickleExprV2: Symbol = QuoteUnpicklerClass.requiredMethod("unpickleExprV2") + @tu lazy val QuoteUnpickler_unpickleTypeV2: Symbol = QuoteUnpicklerClass.requiredMethod("unpickleTypeV2") + + @tu lazy val QuoteMatchingClass: ClassSymbol = requiredClass("scala.quoted.runtime.QuoteMatching") + @tu lazy val QuoteMatching_ExprMatch: Symbol = QuoteMatchingClass.requiredMethod("ExprMatch") + @tu lazy val QuoteMatching_TypeMatch: Symbol = QuoteMatchingClass.requiredMethod("TypeMatch") + + @tu lazy val ToExprModule: Symbol = requiredModule("scala.quoted.ToExpr") + @tu lazy val ToExprModule_BooleanToExpr: Symbol = ToExprModule.requiredMethod("BooleanToExpr") + @tu lazy val ToExprModule_ByteToExpr: Symbol = ToExprModule.requiredMethod("ByteToExpr") + @tu lazy val ToExprModule_ShortToExpr: Symbol = ToExprModule.requiredMethod("ShortToExpr") + @tu lazy val ToExprModule_IntToExpr: Symbol = ToExprModule.requiredMethod("IntToExpr") + @tu lazy val ToExprModule_LongToExpr: Symbol = ToExprModule.requiredMethod("LongToExpr") + @tu lazy val ToExprModule_FloatToExpr: Symbol = ToExprModule.requiredMethod("FloatToExpr") + @tu lazy val ToExprModule_DoubleToExpr: Symbol = ToExprModule.requiredMethod("DoubleToExpr") + @tu lazy val ToExprModule_CharToExpr: Symbol = ToExprModule.requiredMethod("CharToExpr") + @tu lazy val ToExprModule_StringToExpr: Symbol = ToExprModule.requiredMethod("StringToExpr") + + @tu lazy val QuotedRuntimeModule: Symbol = requiredModule("scala.quoted.runtime.Expr") + @tu lazy val QuotedRuntime_exprQuote : Symbol = QuotedRuntimeModule.requiredMethod("quote") + @tu lazy val QuotedRuntime_exprSplice : Symbol = QuotedRuntimeModule.requiredMethod("splice") + @tu lazy val QuotedRuntime_exprNestedSplice : Symbol = QuotedRuntimeModule.requiredMethod("nestedSplice") + + @tu lazy val QuotedRuntime_SplicedTypeAnnot: ClassSymbol = requiredClass("scala.quoted.runtime.SplicedType") + + @tu lazy val QuotedRuntimePatterns: Symbol = requiredModule("scala.quoted.runtime.Patterns") + @tu lazy val QuotedRuntimePatterns_patternHole: Symbol = QuotedRuntimePatterns.requiredMethod("patternHole") + @tu lazy val QuotedRuntimePatterns_patternHigherOrderHole: Symbol = QuotedRuntimePatterns.requiredMethod("patternHigherOrderHole") + @tu lazy val QuotedRuntimePatterns_higherOrderHole: Symbol = QuotedRuntimePatterns.requiredMethod("higherOrderHole") + @tu lazy val 
QuotedRuntimePatterns_patternTypeAnnot: ClassSymbol = QuotedRuntimePatterns.requiredClass("patternType") + @tu lazy val QuotedRuntimePatterns_fromAboveAnnot: ClassSymbol = QuotedRuntimePatterns.requiredClass("fromAbove") + + @tu lazy val QuotedTypeClass: ClassSymbol = requiredClass("scala.quoted.Type") + @tu lazy val QuotedType_splice: Symbol = QuotedTypeClass.requiredType(tpnme.Underlying) + + @tu lazy val QuotedTypeModule: Symbol = QuotedTypeClass.companionModule + @tu lazy val QuotedTypeModule_of: Symbol = QuotedTypeModule.requiredMethod("of") + + @tu lazy val CanEqualClass: ClassSymbol = getClassIfDefined("scala.Eql").orElse(requiredClass("scala.CanEqual")).asClass + def CanEqual_canEqualAny(using Context): TermSymbol = + val methodName = if CanEqualClass.name == tpnme.Eql then nme.eqlAny else nme.canEqualAny + CanEqualClass.companionModule.requiredMethod(methodName) + + @tu lazy val CanThrowClass: ClassSymbol = requiredClass("scala.CanThrow") + @tu lazy val throwsAlias: Symbol = ScalaRuntimePackageVal.requiredType(tpnme.THROWS) + + @tu lazy val TypeBoxClass: ClassSymbol = requiredClass("scala.runtime.TypeBox") + @tu lazy val TypeBox_CAP: TypeSymbol = TypeBoxClass.requiredType(tpnme.CAP) + + @tu lazy val MatchCaseClass: ClassSymbol = requiredClass("scala.runtime.MatchCase") + @tu lazy val NotGivenClass: ClassSymbol = requiredClass("scala.util.NotGiven") + @tu lazy val NotGiven_value: Symbol = NotGivenClass.companionModule.requiredMethod(nme.value) + + @tu lazy val ValueOfClass: ClassSymbol = requiredClass("scala.ValueOf") + + @tu lazy val FromDigitsClass: ClassSymbol = requiredClass("scala.util.FromDigits") + @tu lazy val FromDigits_WithRadixClass: ClassSymbol = requiredClass("scala.util.FromDigits.WithRadix") + @tu lazy val FromDigits_DecimalClass: ClassSymbol = requiredClass("scala.util.FromDigits.Decimal") + @tu lazy val FromDigits_FloatingClass: ClassSymbol = requiredClass("scala.util.FromDigits.Floating") + + @tu lazy val XMLTopScopeModule: Symbol = requiredModule("scala.xml.TopScope") + + @tu lazy val MainAnnotationClass: ClassSymbol = requiredClass("scala.annotation.MainAnnotation") + @tu lazy val MainAnnotationInfo: ClassSymbol = requiredClass("scala.annotation.MainAnnotation.Info") + @tu lazy val MainAnnotationParameter: ClassSymbol = requiredClass("scala.annotation.MainAnnotation.Parameter") + @tu lazy val MainAnnotationParameterAnnotation: ClassSymbol = requiredClass("scala.annotation.MainAnnotation.ParameterAnnotation") + @tu lazy val MainAnnotationCommand: ClassSymbol = requiredClass("scala.annotation.MainAnnotation.Command") + + @tu lazy val CommandLineParserModule: Symbol = requiredModule("scala.util.CommandLineParser") + @tu lazy val CLP_ParseError: ClassSymbol = CommandLineParserModule.requiredClass("ParseError").typeRef.symbol.asClass + @tu lazy val CLP_parseArgument: Symbol = CommandLineParserModule.requiredMethod("parseArgument") + @tu lazy val CLP_parseRemainingArguments: Symbol = CommandLineParserModule.requiredMethod("parseRemainingArguments") + @tu lazy val CLP_showError: Symbol = CommandLineParserModule.requiredMethod("showError") + + @tu lazy val TupleTypeRef: TypeRef = requiredClassRef("scala.Tuple") + def TupleClass(using Context): ClassSymbol = TupleTypeRef.symbol.asClass + @tu lazy val Tuple_cons: Symbol = TupleClass.requiredMethod("*:") + @tu lazy val EmptyTupleModule: Symbol = requiredModule("scala.EmptyTuple") + @tu lazy val NonEmptyTupleTypeRef: TypeRef = requiredClassRef("scala.NonEmptyTuple") + def NonEmptyTupleClass(using Context): ClassSymbol = 
NonEmptyTupleTypeRef.symbol.asClass + lazy val NonEmptyTuple_tail: Symbol = NonEmptyTupleClass.requiredMethod("tail") + @tu lazy val PairClass: ClassSymbol = requiredClass("scala.*:") + + @tu lazy val TupleXXLClass: ClassSymbol = requiredClass("scala.runtime.TupleXXL") + def TupleXXLModule(using Context): Symbol = TupleXXLClass.companionModule + + def TupleXXL_fromIterator(using Context): Symbol = TupleXXLModule.requiredMethod("fromIterator") + + @tu lazy val RuntimeTupleMirrorTypeRef: TypeRef = requiredClassRef("scala.runtime.TupleMirror") + + @tu lazy val RuntimeTuplesModule: Symbol = requiredModule("scala.runtime.Tuples") + @tu lazy val RuntimeTuplesModuleClass: Symbol = RuntimeTuplesModule.moduleClass + @tu lazy val RuntimeTuples_consIterator: Symbol = RuntimeTuplesModule.requiredMethod("consIterator") + @tu lazy val RuntimeTuples_concatIterator: Symbol = RuntimeTuplesModule.requiredMethod("concatIterator") + @tu lazy val RuntimeTuples_apply: Symbol = RuntimeTuplesModule.requiredMethod("apply") + @tu lazy val RuntimeTuples_cons: Symbol = RuntimeTuplesModule.requiredMethod("cons") + @tu lazy val RuntimeTuples_size: Symbol = RuntimeTuplesModule.requiredMethod("size") + @tu lazy val RuntimeTuples_tail: Symbol = RuntimeTuplesModule.requiredMethod("tail") + @tu lazy val RuntimeTuples_concat: Symbol = RuntimeTuplesModule.requiredMethod("concat") + @tu lazy val RuntimeTuples_toArray: Symbol = RuntimeTuplesModule.requiredMethod("toArray") + @tu lazy val RuntimeTuples_productToArray: Symbol = RuntimeTuplesModule.requiredMethod("productToArray") + @tu lazy val RuntimeTuples_isInstanceOfTuple: Symbol = RuntimeTuplesModule.requiredMethod("isInstanceOfTuple") + @tu lazy val RuntimeTuples_isInstanceOfEmptyTuple: Symbol = RuntimeTuplesModule.requiredMethod("isInstanceOfEmptyTuple") + @tu lazy val RuntimeTuples_isInstanceOfNonEmptyTuple: Symbol = RuntimeTuplesModule.requiredMethod("isInstanceOfNonEmptyTuple") + + @tu lazy val TupledFunctionTypeRef: TypeRef = requiredClassRef("scala.util.TupledFunction") + def TupledFunctionClass(using Context): ClassSymbol = TupledFunctionTypeRef.symbol.asClass + def RuntimeTupleFunctionsModule(using Context): Symbol = requiredModule("scala.runtime.TupledFunctions") + + @tu lazy val CapsModule: Symbol = requiredModule("scala.caps") + @tu lazy val captureRoot: TermSymbol = CapsModule.requiredValue("*") + @tu lazy val CapsUnsafeModule: Symbol = requiredModule("scala.caps.unsafe") + @tu lazy val Caps_unsafeBox: Symbol = CapsUnsafeModule.requiredMethod("unsafeBox") + @tu lazy val Caps_unsafeUnbox: Symbol = CapsUnsafeModule.requiredMethod("unsafeUnbox") + @tu lazy val Caps_unsafeBoxFunArg: Symbol = CapsUnsafeModule.requiredMethod("unsafeBoxFunArg") + + // Annotation base classes + @tu lazy val AnnotationClass: ClassSymbol = requiredClass("scala.annotation.Annotation") + @tu lazy val StaticAnnotationClass: ClassSymbol = requiredClass("scala.annotation.StaticAnnotation") + @tu lazy val RefiningAnnotationClass: ClassSymbol = requiredClass("scala.annotation.RefiningAnnotation") + + // Annotation classes + @tu lazy val AllowConversionsAnnot: ClassSymbol = requiredClass("scala.annotation.allowConversions") + @tu lazy val AnnotationDefaultAnnot: ClassSymbol = requiredClass("scala.annotation.internal.AnnotationDefault") + @tu lazy val BeanPropertyAnnot: ClassSymbol = requiredClass("scala.beans.BeanProperty") + @tu lazy val BooleanBeanPropertyAnnot: ClassSymbol = requiredClass("scala.beans.BooleanBeanProperty") + @tu lazy val BodyAnnot: ClassSymbol = 
requiredClass("scala.annotation.internal.Body") + @tu lazy val CapabilityAnnot: ClassSymbol = requiredClass("scala.annotation.capability") + @tu lazy val ChildAnnot: ClassSymbol = requiredClass("scala.annotation.internal.Child") + @tu lazy val ContextResultCountAnnot: ClassSymbol = requiredClass("scala.annotation.internal.ContextResultCount") + @tu lazy val ProvisionalSuperClassAnnot: ClassSymbol = requiredClass("scala.annotation.internal.ProvisionalSuperClass") + @tu lazy val DeprecatedAnnot: ClassSymbol = requiredClass("scala.deprecated") + @tu lazy val DeprecatedOverridingAnnot: ClassSymbol = requiredClass("scala.deprecatedOverriding") + @tu lazy val ImplicitAmbiguousAnnot: ClassSymbol = requiredClass("scala.annotation.implicitAmbiguous") + @tu lazy val ImplicitNotFoundAnnot: ClassSymbol = requiredClass("scala.annotation.implicitNotFound") + @tu lazy val InlineParamAnnot: ClassSymbol = requiredClass("scala.annotation.internal.InlineParam") + @tu lazy val ErasedParamAnnot: ClassSymbol = requiredClass("scala.annotation.internal.ErasedParam") + @tu lazy val InvariantBetweenAnnot: ClassSymbol = requiredClass("scala.annotation.internal.InvariantBetween") + @tu lazy val MainAnnot: ClassSymbol = requiredClass("scala.main") + @tu lazy val MappedAlternativeAnnot: ClassSymbol = requiredClass("scala.annotation.internal.MappedAlternative") + @tu lazy val MigrationAnnot: ClassSymbol = requiredClass("scala.annotation.migration") + @tu lazy val NowarnAnnot: ClassSymbol = requiredClass("scala.annotation.nowarn") + @tu lazy val TransparentTraitAnnot: ClassSymbol = requiredClass("scala.annotation.transparentTrait") + @tu lazy val NativeAnnot: ClassSymbol = requiredClass("scala.native") + @tu lazy val RepeatedAnnot: ClassSymbol = requiredClass("scala.annotation.internal.Repeated") + @tu lazy val SourceFileAnnot: ClassSymbol = requiredClass("scala.annotation.internal.SourceFile") + @tu lazy val ScalaSignatureAnnot: ClassSymbol = requiredClass("scala.reflect.ScalaSignature") + @tu lazy val ScalaLongSignatureAnnot: ClassSymbol = requiredClass("scala.reflect.ScalaLongSignature") + @tu lazy val ScalaStrictFPAnnot: ClassSymbol = requiredClass("scala.annotation.strictfp") + @tu lazy val ScalaStaticAnnot: ClassSymbol = requiredClass("scala.annotation.static") + @tu lazy val SerialVersionUIDAnnot: ClassSymbol = requiredClass("scala.SerialVersionUID") + @tu lazy val TailrecAnnot: ClassSymbol = requiredClass("scala.annotation.tailrec") + @tu lazy val ThreadUnsafeAnnot: ClassSymbol = requiredClass("scala.annotation.threadUnsafe") + @tu lazy val ConstructorOnlyAnnot: ClassSymbol = requiredClass("scala.annotation.constructorOnly") + @tu lazy val CompileTimeOnlyAnnot: ClassSymbol = requiredClass("scala.annotation.compileTimeOnly") + @tu lazy val SwitchAnnot: ClassSymbol = requiredClass("scala.annotation.switch") + @tu lazy val ExperimentalAnnot: ClassSymbol = requiredClass("scala.annotation.experimental") + @tu lazy val ThrowsAnnot: ClassSymbol = requiredClass("scala.throws") + @tu lazy val TransientAnnot: ClassSymbol = requiredClass("scala.transient") + @tu lazy val UncheckedAnnot: ClassSymbol = requiredClass("scala.unchecked") + @tu lazy val UncheckedStableAnnot: ClassSymbol = requiredClass("scala.annotation.unchecked.uncheckedStable") + @tu lazy val UncheckedVarianceAnnot: ClassSymbol = requiredClass("scala.annotation.unchecked.uncheckedVariance") + @tu lazy val VolatileAnnot: ClassSymbol = requiredClass("scala.volatile") + @tu lazy val WithPureFunsAnnot: ClassSymbol = 
requiredClass("scala.annotation.internal.WithPureFuns") + @tu lazy val FieldMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.field") + @tu lazy val GetterMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.getter") + @tu lazy val ParamMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.param") + @tu lazy val SetterMetaAnnot: ClassSymbol = requiredClass("scala.annotation.meta.setter") + @tu lazy val ShowAsInfixAnnot: ClassSymbol = requiredClass("scala.annotation.showAsInfix") + @tu lazy val FunctionalInterfaceAnnot: ClassSymbol = requiredClass("java.lang.FunctionalInterface") + @tu lazy val TargetNameAnnot: ClassSymbol = requiredClass("scala.annotation.targetName") + @tu lazy val VarargsAnnot: ClassSymbol = requiredClass("scala.annotation.varargs") + @tu lazy val SinceAnnot: ClassSymbol = requiredClass("scala.annotation.since") + @tu lazy val RequiresCapabilityAnnot: ClassSymbol = requiredClass("scala.annotation.internal.requiresCapability") + @tu lazy val RetainsAnnot: ClassSymbol = requiredClass("scala.annotation.retains") + @tu lazy val RetainsByNameAnnot: ClassSymbol = requiredClass("scala.annotation.retainsByName") + + @tu lazy val JavaRepeatableAnnot: ClassSymbol = requiredClass("java.lang.annotation.Repeatable") + + // A list of meta-annotations that are relevant for fields and accessors + @tu lazy val FieldAccessorMetaAnnots: Set[Symbol] = + Set(FieldMetaAnnot, GetterMetaAnnot, ParamMetaAnnot, SetterMetaAnnot) + + // A list of annotations that are commonly used to indicate that a field/method argument or return + // type is not null. These annotations are used by the nullification logic in JavaNullInterop to + // improve the precision of type nullification. + // We don't require that any of these annotations be present in the class path, but we want to + // create Symbols for the ones that are present, so they can be checked during nullification. 
+ @tu lazy val NotNullAnnots: List[ClassSymbol] = getClassesIfDefined( + "javax.annotation.Nonnull" :: + "javax.validation.constraints.NotNull" :: + "androidx.annotation.NonNull" :: + "android.support.annotation.NonNull" :: + "android.annotation.NonNull" :: + "com.android.annotations.NonNull" :: + "org.eclipse.jdt.annotation.NonNull" :: + "edu.umd.cs.findbugs.annotations.NonNull" :: + "org.checkerframework.checker.nullness.qual.NonNull" :: + "org.checkerframework.checker.nullness.compatqual.NonNullDecl" :: + "org.jetbrains.annotations.NotNull" :: + "org.springframework.lang.NonNull" :: + "org.springframework.lang.NonNullApi" :: + "org.springframework.lang.NonNullFields" :: + "lombok.NonNull" :: + "reactor.util.annotation.NonNull" :: + "reactor.util.annotation.NonNullApi" :: + "io.reactivex.annotations.NonNull" :: Nil) + + // convenient one-parameter method types + def methOfAny(tp: Type): MethodType = MethodType(List(AnyType), tp) + def methOfAnyVal(tp: Type): MethodType = MethodType(List(AnyValType), tp) + def methOfAnyRef(tp: Type): MethodType = MethodType(List(ObjectType), tp) + + // Derived types + + def RepeatedParamType: TypeRef = RepeatedParamClass.typeRef + + def ClassType(arg: Type)(using Context): Type = { + val ctype = ClassClass.typeRef + if (ctx.phase.erasedTypes) ctype else ctype.appliedTo(arg) + } + + /** The enumeration type, given a value of the enumeration */ + def EnumType(sym: Symbol)(using Context): TypeRef = + // given (in java): "class A { enum E { VAL1 } }" + // - sym: the symbol of the actual enumeration value (VAL1) + // - .owner: the ModuleClassSymbol of the enumeration (object E) + // - .linkedClass: the ClassSymbol of the enumeration (class E) + sym.owner.linkedClass.typeRef + + object FunctionOf { + def apply(args: List[Type], resultType: Type, isContextual: Boolean = false, isErased: Boolean = false)(using Context): Type = + FunctionType(args.length, isContextual, isErased).appliedTo(args ::: resultType :: Nil) + def unapply(ft: Type)(using Context): Option[(List[Type], Type, Boolean, Boolean)] = { + val tsym = ft.typeSymbol + if isFunctionClass(tsym) && ft.isRef(tsym) then + val targs = ft.dealias.argInfos + if (targs.isEmpty) None + else Some(targs.init, targs.last, tsym.name.isContextFunction, tsym.name.isErasedFunction) + else None + } + } + + object PartialFunctionOf { + def apply(arg: Type, result: Type)(using Context): Type = + PartialFunctionClass.typeRef.appliedTo(arg :: result :: Nil) + def unapply(pft: Type)(using Context): Option[(Type, List[Type])] = + if (pft.isRef(PartialFunctionClass)) { + val targs = pft.dealias.argInfos + if (targs.length == 2) Some((targs.head, targs.tail)) else None + } + else None + } + + object ArrayOf { + def apply(elem: Type)(using Context): Type = + if (ctx.erasedTypes) JavaArrayType(elem) + else ArrayType.appliedTo(elem :: Nil) + def unapply(tp: Type)(using Context): Option[Type] = tp.dealias match { + case AppliedType(at, arg :: Nil) if at.isRef(ArrayType.symbol) => Some(arg) + case JavaArrayType(tp) if ctx.erasedTypes => Some(tp) + case _ => None + } + } + + object MatchCase { + def apply(pat: Type, body: Type)(using Context): Type = + MatchCaseClass.typeRef.appliedTo(pat, body) + def unapply(tp: Type)(using Context): Option[(Type, Type)] = tp match { + case AppliedType(tycon, pat :: body :: Nil) if tycon.isRef(MatchCaseClass) => + Some((pat, body)) + case _ => + None + } + def isInstance(tp: Type)(using Context): Boolean = tp match { + case AppliedType(tycon: TypeRef, _) => + tycon.name == tpnme.MatchCase && //
necessary pre-filter to avoid forcing symbols + tycon.isRef(MatchCaseClass) + case _ => false + } + } + + /** An extractor for multi-dimensional arrays. + * Note that this will also extract the high bound if an + * element type is a wildcard upper-bounded by an array. E.g. + * + * Array[? <: Array[? <: Number]] + * + * would match + * + * MultiArrayOf(<Number>, 2) + */ + object MultiArrayOf { + def apply(elem: Type, ndims: Int)(using Context): Type = + if (ndims == 0) elem else ArrayOf(apply(elem, ndims - 1)) + def unapply(tp: Type)(using Context): Option[(Type, Int)] = tp match { + case ArrayOf(elemtp) => + def recur(elemtp: Type): Option[(Type, Int)] = elemtp.dealias match { + case tp @ TypeBounds(lo, hi @ MultiArrayOf(finalElemTp, n)) => + Some(finalElemTp, n) + case MultiArrayOf(finalElemTp, n) => Some(finalElemTp, n + 1) + case _ => Some(elemtp, 1) + } + recur(elemtp) + case _ => + None + } + } + + /** Extractor for context function types representing by-name parameters, of the form + * `() ?=> T`. + * Under purefunctions, this becomes `() ?-> T` or `{r1, ..., rN} () ?-> T`. + */ + object ByNameFunction: + def apply(tp: Type)(using Context): Type = tp match + case tp @ EventuallyCapturingType(tp1, refs) if tp.annot.symbol == RetainsByNameAnnot => + CapturingType(apply(tp1), refs) + case _ => + defn.ContextFunction0.typeRef.appliedTo(tp :: Nil) + def unapply(tp: Type)(using Context): Option[Type] = tp match + case tp @ AppliedType(tycon, arg :: Nil) if defn.isByNameFunctionClass(tycon.typeSymbol) => + Some(arg) + case tp @ AnnotatedType(parent, _) => + unapply(parent) + case _ => + None + + final def isByNameFunctionClass(sym: Symbol): Boolean = + sym eq ContextFunction0 + + def isByNameFunction(tp: Type)(using Context): Boolean = tp match + case ByNameFunction(_) => true + case _ => false + + final def isCompiletime_S(sym: Symbol)(using Context): Boolean = + sym.name == tpnme.S && sym.owner == CompiletimeOpsIntModuleClass + + private val compiletimePackageAnyTypes: Set[Name] = Set( + tpnme.Equals, tpnme.NotEquals, tpnme.IsConst, tpnme.ToString + ) + private val compiletimePackageNumericTypes: Set[Name] = Set( + tpnme.Plus, tpnme.Minus, tpnme.Times, tpnme.Div, tpnme.Mod, + tpnme.Lt, tpnme.Gt, tpnme.Ge, tpnme.Le, + tpnme.Abs, tpnme.Negate, tpnme.Min, tpnme.Max + ) + private val compiletimePackageIntTypes: Set[Name] = compiletimePackageNumericTypes ++ Set[Name]( + tpnme.ToString, // ToString is moved to ops.any and deprecated for ops.int + tpnme.NumberOfLeadingZeros, tpnme.ToLong, tpnme.ToFloat, tpnme.ToDouble, + tpnme.Xor, tpnme.BitwiseAnd, tpnme.BitwiseOr, tpnme.ASR, tpnme.LSL, tpnme.LSR + ) + private val compiletimePackageLongTypes: Set[Name] = compiletimePackageNumericTypes ++ Set[Name]( + tpnme.NumberOfLeadingZeros, tpnme.ToInt, tpnme.ToFloat, tpnme.ToDouble, + tpnme.Xor, tpnme.BitwiseAnd, tpnme.BitwiseOr, tpnme.ASR, tpnme.LSL, tpnme.LSR + ) + private val compiletimePackageFloatTypes: Set[Name] = compiletimePackageNumericTypes ++ Set[Name]( + tpnme.ToInt, tpnme.ToLong, tpnme.ToDouble + ) + private val compiletimePackageDoubleTypes: Set[Name] = compiletimePackageNumericTypes ++ Set[Name]( + tpnme.ToInt, tpnme.ToLong, tpnme.ToFloat + ) + private val compiletimePackageBooleanTypes: Set[Name] = Set(tpnme.Not, tpnme.Xor, tpnme.And, tpnme.Or) + private val compiletimePackageStringTypes: Set[Name] = Set( + tpnme.Plus, tpnme.Length, tpnme.Substring, tpnme.Matches, tpnme.CharAt + ) + private val compiletimePackageOpTypes: Set[Name] = + Set(tpnme.S) + ++ compiletimePackageAnyTypes + ++
compiletimePackageIntTypes + ++ compiletimePackageLongTypes + ++ compiletimePackageFloatTypes + ++ compiletimePackageDoubleTypes + ++ compiletimePackageBooleanTypes + ++ compiletimePackageStringTypes + + final def isCompiletimeAppliedType(sym: Symbol)(using Context): Boolean = + compiletimePackageOpTypes.contains(sym.name) + && ( + isCompiletime_S(sym) + || sym.owner == CompiletimeOpsAnyModuleClass && compiletimePackageAnyTypes.contains(sym.name) + || sym.owner == CompiletimeOpsIntModuleClass && compiletimePackageIntTypes.contains(sym.name) + || sym.owner == CompiletimeOpsLongModuleClass && compiletimePackageLongTypes.contains(sym.name) + || sym.owner == CompiletimeOpsFloatModuleClass && compiletimePackageFloatTypes.contains(sym.name) + || sym.owner == CompiletimeOpsDoubleModuleClass && compiletimePackageDoubleTypes.contains(sym.name) + || sym.owner == CompiletimeOpsBooleanModuleClass && compiletimePackageBooleanTypes.contains(sym.name) + || sym.owner == CompiletimeOpsStringModuleClass && compiletimePackageStringTypes.contains(sym.name) + ) + + // ----- Scala-2 library patches -------------------------------------- + + /** The `scala.runtime.stdLibPatches` package contains objects + * that contain definitions that get added as members to standard library + * objects with the same name. + */ + @tu lazy val StdLibPatchesPackage: TermSymbol = requiredPackage("scala.runtime.stdLibPatches") + @tu private lazy val ScalaPredefModuleClassPatch: Symbol = getModuleIfDefined("scala.runtime.stdLibPatches.Predef").moduleClass + @tu private lazy val LanguageModuleClassPatch: Symbol = getModuleIfDefined("scala.runtime.stdLibPatches.language").moduleClass + + /** If `sym` is a patched library class, the source file of its patch class, + * otherwise `NoSource` + */ + def patchSource(sym: Symbol)(using Context): SourceFile = + if sym == ScalaPredefModuleClass then ScalaPredefModuleClassPatch.source + else if sym == LanguageModuleClass then LanguageModuleClassPatch.source + else NoSource + + /** A finalizer that patches standard library classes. + * It copies all non-private, non-synthetic definitions from `patchCls` + * to `denot` while changing their owners to `denot`. Before that it deletes + * any definitions of `denot` that have the same name as one of the copied + * definitions. + * + * If an object is present in both the original class and the patch class, + * it is not overwritten. Instead its members are copied recursively. + * + * To avoid running into cycles on bootstrap, patching happens only if `patchCls` + * is read from a classfile.
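 *
 *  Sketch of the effect (member name hypothetical): if `scala.runtime.stdLibPatches.Predef`
 *  declares `def someHelper: Int`, that definition is entered into `scala.Predef` with its
 *  owner changed to Predef, replacing any existing member of the same name; objects present
 *  in both the original and the patch are instead merged member by member.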
+ */ + def patchStdLibClass(denot: ClassDenotation)(using Context): Unit = + def patch2(denot: ClassDenotation, patchCls: Symbol): Unit = + val scope = denot.info.decls.openForMutations + + def recurse(patch: Symbol) = patch.is(Module) && scope.lookup(patch.name).exists + + def makeClassSymbol(patch: Symbol, parents: List[Type], selfInfo: TypeOrSymbol) = + newClassSymbol( + owner = denot.symbol, + name = patch.name.asTypeName, + flags = patch.flags, + // need to rebuild a fresh ClassInfo + infoFn = cls => ClassInfo( + prefix = denot.symbol.thisType, + cls = cls, + declaredParents = parents, // assume parents in patch don't refer to symbols in the patch + decls = newScope, + selfInfo = + if patch.is(Module) + then TermRef(denot.symbol.thisType, patch.name.sourceModuleName) + else selfInfo // assume patch self type annotation does not refer to symbols in the patch + ), + privateWithin = patch.privateWithin, + coord = denot.symbol.coord, + assocFile = denot.symbol.associatedFile + ) + + def makeNonClassSymbol(patch: Symbol) = + if patch.is(Inline) then + // Inline symbols contain trees in annotations, which is coupled + // with the underlying symbol. + // Changing owner for inline symbols is a simple workaround. + patch.denot = patch.denot.copySymDenotation(owner = denot.symbol) + patch + else + // change `info` which might contain reference to the patch + patch.copy( + owner = denot.symbol, + info = + if patch.is(Module) + then TypeRef(denot.symbol.thisType, patch.name.moduleClassName) + else patch.info // assume non-object info does not refer to symbols in the patch + ) + + if patchCls.exists then + val patches = patchCls.info.decls.filter(patch => + !patch.isConstructor && !patch.isOneOf(PrivateOrSynthetic)) + for patch <- patches if !recurse(patch) do + val e = scope.lookupEntry(patch.name) + if e != null then scope.unlink(e) + for patch <- patches do + patch.ensureCompleted() + if !recurse(patch) then + val sym = + patch.info match + case ClassInfo(_, _, parents, _, selfInfo) => + makeClassSymbol(patch, parents, selfInfo) + case _ => + makeNonClassSymbol(patch) + end match + sym.annotations = patch.annotations + scope.enter(sym) + if patch.isClass then + patch2(scope.lookup(patch.name).asClass, patch) + + def patchWith(patchCls: Symbol) = + denot.sourceModule.info = denot.typeRef // we run into a cyclic reference when patching if this line is omitted + patch2(denot, patchCls) + + if denot.name == tpnme.Predef.moduleClassName && denot.symbol == ScalaPredefModuleClass then + patchWith(ScalaPredefModuleClassPatch) + else if denot.name == tpnme.language.moduleClassName && denot.symbol == LanguageModuleClass then + patchWith(LanguageModuleClassPatch) + end patchStdLibClass + + // ----- Symbol sets --------------------------------------------------- + + @tu lazy val topClasses: Set[Symbol] = Set(AnyClass, MatchableClass, ObjectClass, AnyValClass) + + @tu lazy val untestableClasses: Set[Symbol] = Set(NothingClass, NullClass, SingletonClass) + + /** Base classes that are assumed to be pure for the purposes of capture checking. + * Every class inheriting from a pure baseclass is pure. 
+ */ + @tu lazy val pureBaseClasses = Set(defn.AnyValClass, defn.ThrowableClass) + + /** Non-inheritable classes that are assumed to be pure for the purposes of capture checking. + */ + @tu lazy val pureSimpleClasses = Set(StringClass, NothingClass, NullClass) + + @tu lazy val AbstractFunctionType: Array[TypeRef] = mkArityArray("scala.runtime.AbstractFunction", MaxImplementedFunctionArity, 0).asInstanceOf[Array[TypeRef]] + val AbstractFunctionClassPerRun: PerRun[Array[Symbol]] = new PerRun(AbstractFunctionType.map(_.symbol.asClass)) + def AbstractFunctionClass(n: Int)(using Context): Symbol = AbstractFunctionClassPerRun()(using ctx)(n) + + @tu lazy val caseClassSynthesized: List[Symbol] = List( + Any_hashCode, Any_equals, Any_toString, Product_canEqual, Product_productArity, + Product_productPrefix, Product_productElement, Product_productElementName) + + val LazyHolder: PerRun[Map[Symbol, Symbol]] = new PerRun({ + def holderImpl(holderType: String) = requiredClass("scala.runtime." + holderType) + Map[Symbol, Symbol]( + IntClass -> holderImpl("LazyInt"), + LongClass -> holderImpl("LazyLong"), + BooleanClass -> holderImpl("LazyBoolean"), + FloatClass -> holderImpl("LazyFloat"), + DoubleClass -> holderImpl("LazyDouble"), + ByteClass -> holderImpl("LazyByte"), + CharClass -> holderImpl("LazyChar"), + ShortClass -> holderImpl("LazyShort") + ) + .withDefaultValue(holderImpl("LazyRef")) + }) + + @tu lazy val TupleType: Array[TypeRef | Null] = mkArityArray("scala.Tuple", MaxTupleArity, 1) + + def isSpecializedTuple(cls: Symbol)(using Context): Boolean = + cls.isClass && TupleSpecializedClasses.exists(tupleCls => cls.name.isSpecializedNameOf(tupleCls.name)) + + def SpecializedTuple(base: Symbol, args: List[Type])(using Context): Symbol = + base.owner.requiredClass(base.name.specializedName(args)) + + /** Cached function types of arbitrary arities. + * Function types are created on demand with newFunctionNTrait, which is + * called from a synthesizer installed in ScalaPackageClass.
+ */ + private class FunType(prefix: String): + private var classRefs: Array[TypeRef | Null] = new Array(22) + def apply(n: Int): TypeRef = + while n >= classRefs.length do + val classRefs1 = new Array[TypeRef | Null](classRefs.length * 2) + Array.copy(classRefs, 0, classRefs1, 0, classRefs.length) + classRefs = classRefs1 + val funName = s"scala.$prefix$n" + if classRefs(n) == null then + classRefs(n) = + if prefix.startsWith("Impure") + then staticRef(funName.toTypeName).symbol.typeRef + else requiredClassRef(funName) + classRefs(n).nn + end FunType + + private def funTypeIdx(isContextual: Boolean, isErased: Boolean, isImpure: Boolean): Int = + (if isContextual then 1 else 0) + + (if isErased then 2 else 0) + + (if isImpure then 4 else 0) + + private val funTypeArray: IArray[FunType] = + val arr = Array.ofDim[FunType](8) + val choices = List(false, true) + for contxt <- choices; erasd <- choices; impure <- choices do + var str = "Function" + if contxt then str = "Context" + str + if erasd then str = "Erased" + str + if impure then str = "Impure" + str + arr(funTypeIdx(contxt, erasd, impure)) = FunType(str) + IArray.unsafeFromArray(arr) + + def FunctionSymbol(n: Int, isContextual: Boolean = false, isErased: Boolean = false, isImpure: Boolean = false)(using Context): Symbol = + funTypeArray(funTypeIdx(isContextual, isErased, isImpure))(n).symbol + + @tu lazy val Function0_apply: Symbol = Function0.requiredMethod(nme.apply) + @tu lazy val ContextFunction0_apply: Symbol = ContextFunction0.requiredMethod(nme.apply) + + @tu lazy val Function0: Symbol = FunctionSymbol(0) + @tu lazy val Function1: Symbol = FunctionSymbol(1) + @tu lazy val Function2: Symbol = FunctionSymbol(2) + @tu lazy val ContextFunction0: Symbol = FunctionSymbol(0, isContextual = true) + + def FunctionType(n: Int, isContextual: Boolean = false, isErased: Boolean = false, isImpure: Boolean = false)(using Context): TypeRef = + FunctionSymbol(n, isContextual && !ctx.erasedTypes, isErased, isImpure).typeRef + + lazy val PolyFunctionClass = requiredClass("scala.PolyFunction") + def PolyFunctionType = PolyFunctionClass.typeRef + + /** If `cls` is a class in the scala package, its name, otherwise EmptyTypeName */ + def scalaClassName(cls: Symbol)(using Context): TypeName = cls.denot match + case clsd: ClassDenotation if clsd.owner eq ScalaPackageClass => + clsd.name.asInstanceOf[TypeName] + case _ => + EmptyTypeName + + /** If type `ref` refers to a class in the scala package, its name, otherwise EmptyTypeName */ + def scalaClassName(ref: Type)(using Context): TypeName = scalaClassName(ref.classSymbol) + + private def isVarArityClass(cls: Symbol, prefix: String) = + cls.isClass + && cls.owner.eq(ScalaPackageClass) + && cls.name.testSimple(name => + name.startsWith(prefix) + && name.length > prefix.length + && digitsOnlyAfter(name, prefix.length)) + + private def digitsOnlyAfter(name: SimpleName, idx: Int): Boolean = + idx == name.length || name(idx).isDigit && digitsOnlyAfter(name, idx + 1) + + def isBottomClass(cls: Symbol): Boolean = + if ctx.mode.is(Mode.SafeNulls) && !ctx.phase.erasedTypes + then cls == NothingClass + else isBottomClassAfterErasure(cls) + + def isBottomClassAfterErasure(cls: Symbol): Boolean = cls == NothingClass || cls == NullClass + + /** Is any function class where + * - FunctionXXL + * - FunctionN for N >= 0 + * - ContextFunctionN for N >= 0 + * - ErasedFunctionN for N > 0 + * - ErasedContextFunctionN for N > 0 + */ + def isFunctionClass(cls: Symbol): Boolean = scalaClassName(cls).isFunction + + /** Is a 
function class, or an impure function type alias */ + def isFunctionSymbol(sym: Symbol): Boolean = + sym.isType && (sym.owner eq ScalaPackageClass) && sym.name.isFunction + + /** Is a function class where + * - FunctionN for N >= 0 and N != XXL + */ + def isPlainFunctionClass(cls: Symbol) = isVarArityClass(cls, str.Function) + + /** Is an context function class. + * - ContextFunctionN for N >= 0 + * - ErasedContextFunctionN for N > 0 + */ + def isContextFunctionClass(cls: Symbol): Boolean = scalaClassName(cls).isContextFunction + + /** Is an erased function class. + * - ErasedFunctionN for N > 0 + * - ErasedContextFunctionN for N > 0 + */ + def isErasedFunctionClass(cls: Symbol): Boolean = scalaClassName(cls).isErasedFunction + + /** Is either FunctionXXL or a class that will be erased to FunctionXXL + * - FunctionXXL + * - FunctionN for N >= 22 + * - ContextFunctionN for N >= 22 + */ + def isXXLFunctionClass(cls: Symbol): Boolean = { + val name = scalaClassName(cls) + (name eq tpnme.FunctionXXL) || name.functionArity > MaxImplementedFunctionArity + } + + /** Is a synthetic function class + * - FunctionN for N > 22 + * - ContextFunctionN for N >= 0 + * - ErasedFunctionN for N > 0 + * - ErasedContextFunctionN for N > 0 + */ + def isSyntheticFunctionClass(cls: Symbol): Boolean = scalaClassName(cls).isSyntheticFunction + + def isAbstractFunctionClass(cls: Symbol): Boolean = isVarArityClass(cls, str.AbstractFunction) + def isTupleClass(cls: Symbol): Boolean = isVarArityClass(cls, str.Tuple) + def isProductClass(cls: Symbol): Boolean = isVarArityClass(cls, str.Product) + + def isBoxedUnitClass(cls: Symbol): Boolean = + cls.isClass && (cls.owner eq ScalaRuntimePackageClass) && cls.name == tpnme.BoxedUnit + + /** Returns the erased type of the function class `cls` + * - FunctionN for N > 22 becomes FunctionXXL + * - FunctionN for 22 > N >= 0 remains as FunctionN + * - ContextFunctionN for N > 22 becomes FunctionXXL + * - ContextFunctionN for N <= 22 becomes FunctionN + * - ErasedFunctionN becomes Function0 + * - ImplicitErasedFunctionN becomes Function0 + * - anything else becomes a NoType + */ + def functionTypeErasure(cls: Symbol): Type = + val arity = scalaClassName(cls).functionArity + if cls.name.isErasedFunction then FunctionType(0) + else if arity > 22 then FunctionXXLClass.typeRef + else if arity >= 0 then FunctionType(arity) + else NoType + + private val JavaImportFns: List[RootRef] = List( + RootRef(() => JavaLangPackageVal.termRef) + ) + + private val ScalaImportFns: List[RootRef] = + JavaImportFns :+ + RootRef(() => ScalaPackageVal.termRef) + + private val PredefImportFns: RootRef = + RootRef(() => ScalaPredefModule.termRef, isPredef=true) + + @tu private lazy val JavaRootImportFns: List[RootRef] = + if ctx.settings.YnoImports.value then Nil + else JavaImportFns + + @tu private lazy val ScalaRootImportFns: List[RootRef] = + if ctx.settings.YnoImports.value then Nil + else if ctx.settings.YnoPredef.value then ScalaImportFns + else ScalaImportFns :+ PredefImportFns + + @tu private lazy val JavaRootImportTypes: List[TermRef] = JavaRootImportFns.map(_.refFn()) + @tu private lazy val ScalaRootImportTypes: List[TermRef] = ScalaRootImportFns.map(_.refFn()) + @tu private lazy val JavaUnqualifiedOwnerTypes: Set[NamedType] = unqualifiedTypes(JavaRootImportTypes) + @tu private lazy val ScalaUnqualifiedOwnerTypes: Set[NamedType] = unqualifiedTypes(ScalaRootImportTypes) + + /** Are we compiling a java source file? 
*/ + private def isJavaContext(using Context): Boolean = + ctx.compilationUnit.isJava + + private def unqualifiedTypes(refs: List[TermRef]) = + val types = refs.toSet[NamedType] + types ++ types.map(_.symbol.moduleClass.typeRef) + + /** Lazy references to the root imports */ + def rootImportFns(using Context): List[RootRef] = + if isJavaContext then JavaRootImportFns + else ScalaRootImportFns + + /** Root types imported by default */ + def rootImportTypes(using Context): List[TermRef] = + if isJavaContext then JavaRootImportTypes + else ScalaRootImportTypes + + /** Modules whose members are in the default namespace and their module classes */ + def unqualifiedOwnerTypes(using Context): Set[NamedType] = + if isJavaContext then JavaUnqualifiedOwnerTypes + else ScalaUnqualifiedOwnerTypes + + /** Names of the root import symbols that can be hidden by other imports */ + @tu lazy val ShadowableImportNames: Set[TermName] = Set("Predef".toTermName) + + /** Class symbols for which no class exist at runtime */ + @tu lazy val NotRuntimeClasses: Set[Symbol] = Set(AnyClass, MatchableClass, AnyValClass, NullClass, NothingClass) + + @tu lazy val SpecialClassTagClasses: Set[Symbol] = Set(UnitClass, AnyClass, AnyValClass) + + @tu lazy val SpecialManifestClasses: Set[Symbol] = Set(AnyClass, AnyValClass, ObjectClass, NullClass, NothingClass) + + /** Classes that are known not to have an initializer irrespective of + * whether NoInits is set. Note: FunctionXXLClass is in this set + * because if it is compiled by Scala2, it does not get a NoInit flag. + * But since it is introduced only at erasure, there's no chance + * for augmentScala2Traits to do anything on a class that inherits it. So + * it also misses an implementation class, which means that the usual scheme + * of calling a superclass init in the implementation class of a Scala2 + * trait gets screwed up. Therefore, it is mandatory that FunctionXXL + * is treated as a NoInit trait. + */ + @tu lazy val NoInitClasses: Set[Symbol] = NotRuntimeClasses + FunctionXXLClass + + def isPolymorphicAfterErasure(sym: Symbol): Boolean = + (sym eq Any_isInstanceOf) || (sym eq Any_asInstanceOf) || (sym eq Object_synchronized) + + /** Is this type a `TupleN` type? 
+ * + * @return true if the dealiased type of `tp` is `TupleN[T1, T2, ..., Tn]` + */ + def isTupleNType(tp: Type)(using Context): Boolean = { + val tp1 = tp.dealias + val arity = tp1.argInfos.length + arity <= MaxTupleArity && { + val tupletp = TupleType(arity) + tupletp != null && tp1.isRef(tupletp.symbol) + } + } + + def tupleType(elems: List[Type]): Type = { + val arity = elems.length + if 0 < arity && arity <= MaxTupleArity then + val tupletp = TupleType(arity) + if tupletp != null then tupletp.appliedTo(elems) + else TypeOps.nestedPairs(elems) + else TypeOps.nestedPairs(elems) + } + + def tupleTypes(tp: Type, bound: Int = Int.MaxValue)(using Context): Option[List[Type]] = { + @tailrec def rec(tp: Type, acc: List[Type], bound: Int): Option[List[Type]] = tp.normalized.dealias match { + case _ if bound < 0 => Some(acc.reverse) + case tp: AppliedType if PairClass == tp.classSymbol => rec(tp.args(1), tp.args.head :: acc, bound - 1) + case tp: AppliedType if isTupleNType(tp) => Some(acc.reverse ::: tp.args) + case tp: TermRef if tp.symbol == defn.EmptyTupleModule => Some(acc.reverse) + case _ => None + } + rec(tp.stripTypeVar, Nil, bound) + } + + def isProductSubType(tp: Type)(using Context): Boolean = tp.derivesFrom(ProductClass) + + /** Is `tp` (an alias) of either a scala.FunctionN or a scala.ContextFunctionN + * instance? + */ + def isNonRefinedFunction(tp: Type)(using Context): Boolean = + val arity = functionArity(tp) + val sym = tp.dealias.typeSymbol + + arity >= 0 + && isFunctionClass(sym) + && tp.isRef( + FunctionType(arity, sym.name.isContextFunction, sym.name.isErasedFunction).typeSymbol, + skipRefined = false) + end isNonRefinedFunction + + /** Is `tp` a representation of a (possibly dependent) function type or an alias of such? */ + def isFunctionType(tp: Type)(using Context): Boolean = + isNonRefinedFunction(tp.dropDependentRefinement) + + def isFunctionOrPolyType(tp: Type)(using Context): Boolean = + isFunctionType(tp) || (tp.typeSymbol eq defn.PolyFunctionClass) + + private def withSpecMethods(cls: ClassSymbol, bases: List[Name], paramTypes: Set[TypeRef]) = + for base <- bases; tp <- paramTypes do + cls.enter(newSymbol(cls, base.specializedName(List(tp)), Method, ExprType(tp))) + cls + + @tu lazy val Tuple1: ClassSymbol = withSpecMethods(requiredClass("scala.Tuple1"), List(nme._1), Tuple1SpecializedParamTypes) + @tu lazy val Tuple2: ClassSymbol = withSpecMethods(requiredClass("scala.Tuple2"), List(nme._1, nme._2), Tuple2SpecializedParamTypes) + + @tu lazy val TupleSpecializedClasses: Set[Symbol] = Set(Tuple1, Tuple2) + @tu lazy val Tuple1SpecializedParamTypes: Set[TypeRef] = Set(IntType, LongType, DoubleType) + @tu lazy val Tuple2SpecializedParamTypes: Set[TypeRef] = Set(IntType, LongType, DoubleType, CharType, BooleanType) + @tu lazy val Tuple1SpecializedParamClasses: PerRun[Set[Symbol]] = new PerRun(Tuple1SpecializedParamTypes.map(_.symbol)) + @tu lazy val Tuple2SpecializedParamClasses: PerRun[Set[Symbol]] = new PerRun(Tuple2SpecializedParamTypes.map(_.symbol)) + + // Specialized type parameters defined for scala.Function{0,1,2}. 
+ @tu lazy val Function1SpecializedParamTypes: collection.Set[TypeRef] = + Set(IntType, LongType, FloatType, DoubleType) + @tu lazy val Function2SpecializedParamTypes: collection.Set[TypeRef] = + Set(IntType, LongType, DoubleType) + @tu lazy val Function0SpecializedReturnTypes: collection.Set[TypeRef] = + ScalaNumericValueTypeList.toSet + UnitType + BooleanType + @tu lazy val Function1SpecializedReturnTypes: collection.Set[TypeRef] = + Set(UnitType, BooleanType, IntType, FloatType, LongType, DoubleType) + @tu lazy val Function2SpecializedReturnTypes: collection.Set[TypeRef] = + Function1SpecializedReturnTypes + + @tu lazy val Function1SpecializedParamClasses: PerRun[collection.Set[Symbol]] = + new PerRun(Function1SpecializedParamTypes.map(_.symbol)) + @tu lazy val Function2SpecializedParamClasses: PerRun[collection.Set[Symbol]] = + new PerRun(Function2SpecializedParamTypes.map(_.symbol)) + @tu lazy val Function0SpecializedReturnClasses: PerRun[collection.Set[Symbol]] = + new PerRun(Function0SpecializedReturnTypes.map(_.symbol)) + @tu lazy val Function1SpecializedReturnClasses: PerRun[collection.Set[Symbol]] = + new PerRun(Function1SpecializedReturnTypes.map(_.symbol)) + @tu lazy val Function2SpecializedReturnClasses: PerRun[collection.Set[Symbol]] = + new PerRun(Function2SpecializedReturnTypes.map(_.symbol)) + + def isSpecializableTuple(base: Symbol, args: List[Type])(using Context): Boolean = + args.length <= 2 && base.isClass && TupleSpecializedClasses.exists(base.asClass.derivesFrom) && args.match + case List(x) => Tuple1SpecializedParamClasses().contains(x.classSymbol) + case List(x, y) => Tuple2SpecializedParamClasses().contains(x.classSymbol) && Tuple2SpecializedParamClasses().contains(y.classSymbol) + case _ => false + && base.owner.denot.info.member(base.name.specializedName(args)).exists // when dotc compiles the stdlib there are no specialised classes + + def isSpecializableFunction(cls: ClassSymbol, paramTypes: List[Type], retType: Type)(using Context): Boolean = + paramTypes.length <= 2 + && (cls.derivesFrom(FunctionSymbol(paramTypes.length)) || isByNameFunctionClass(cls)) + && isSpecializableFunctionSAM(paramTypes, retType) + + /** If the Single Abstract Method of a Function class has this type, is it specializable? 
*/ + def isSpecializableFunctionSAM(paramTypes: List[Type], retType: Type)(using Context): Boolean = + paramTypes.length <= 2 && (paramTypes match { + case Nil => + Function0SpecializedReturnClasses().contains(retType.typeSymbol) + case List(paramType0) => + Function1SpecializedParamClasses().contains(paramType0.typeSymbol) && + Function1SpecializedReturnClasses().contains(retType.typeSymbol) + case List(paramType0, paramType1) => + Function2SpecializedParamClasses().contains(paramType0.typeSymbol) && + Function2SpecializedParamClasses().contains(paramType1.typeSymbol) && + Function2SpecializedReturnClasses().contains(retType.typeSymbol) + case _ => + false + }) + + @tu lazy val Function0SpecializedApplyNames: collection.Set[TermName] = + for r <- Function0SpecializedReturnTypes + yield nme.apply.specializedFunction(r, Nil).asTermName + + @tu lazy val Function1SpecializedApplyNames: collection.Set[TermName] = + for + r <- Function1SpecializedReturnTypes + t1 <- Function1SpecializedParamTypes + yield + nme.apply.specializedFunction(r, List(t1)).asTermName + + @tu lazy val Function2SpecializedApplyNames: collection.Set[TermName] = + for + r <- Function2SpecializedReturnTypes + t1 <- Function2SpecializedParamTypes + t2 <- Function2SpecializedParamTypes + yield + nme.apply.specializedFunction(r, List(t1, t2)).asTermName + + @tu lazy val FunctionSpecializedApplyNames: collection.Set[Name] = + Function0SpecializedApplyNames ++ Function1SpecializedApplyNames ++ Function2SpecializedApplyNames + + def functionArity(tp: Type)(using Context): Int = tp.dropDependentRefinement.dealias.argInfos.length - 1 + + /** Return the underlying context function type (i.e. an instance of a ContextFunctionN class) + * or NoType if none exists. The following types are considered as underlying types: + * - the alias of an alias type + * - the instance or origin of a TypeVar (i.e. the result of a stripTypeVar) + * - the upper bound of a TypeParamRef in the current constraint + */ + def asContextFunctionType(tp: Type)(using Context): Type = + tp.stripTypeVar.dealias match + case tp1: TypeParamRef if ctx.typerState.constraint.contains(tp1) => + asContextFunctionType(TypeComparer.bounds(tp1).hiBound) + case tp1 => + if tp1.typeSymbol.name.isContextFunction && isFunctionType(tp1) then tp1 + else NoType + + /** Is `tp` a context function type? */ + def isContextFunctionType(tp: Type)(using Context): Boolean = + asContextFunctionType(tp).exists + + /** An extractor for context function types `As ?=> B`, possibly with + * dependent refinements. Optionally returns a triple consisting of the argument + * types `As`, the result type `B`, and whether the type is an erased context function.
+ */ + object ContextFunctionType: + def unapply(tp: Type)(using Context): Option[(List[Type], Type, Boolean)] = + if ctx.erasedTypes then + atPhase(erasurePhase)(unapply(tp)) + else + val tp1 = asContextFunctionType(tp) + if tp1.exists then + val args = tp1.dropDependentRefinement.argInfos + Some((args.init, args.last, tp1.typeSymbol.name.isErasedFunction)) + else None + + def isErasedFunctionType(tp: Type)(using Context): Boolean = + tp.dealias.typeSymbol.name.isErasedFunction && isFunctionType(tp) + + /** A whitelist of Scala-2 classes that are known to be pure */ + def isAssuredNoInits(sym: Symbol): Boolean = + (sym `eq` SomeClass) || isTupleClass(sym) + + /** If `cls` is Tuple1..Tuple22, add the corresponding *: type as last parent to `parents` */ + def adjustForTuple(cls: ClassSymbol, tparams: List[TypeSymbol], parents: List[Type]): List[Type] = { + if !isTupleClass(cls) then parents + else if tparams.isEmpty then parents :+ TupleTypeRef + else + assert(parents.head.typeSymbol == ObjectClass) + TypeOps.nestedPairs(tparams.map(_.typeRef)) :: parents.tail + } + + /** If it is BoxedUnit, remove `java.io.Serializable` from `parents`. */ + def adjustForBoxedUnit(cls: ClassSymbol, parents: List[Type]): List[Type] = + if (isBoxedUnitClass(cls)) parents.filter(_.typeSymbol != JavaSerializableClass) + else parents + + private val HasProblematicGetClass: Set[Name] = Set( + tpnme.AnyVal, tpnme.Byte, tpnme.Short, tpnme.Char, tpnme.Int, tpnme.Long, tpnme.Float, tpnme.Double, + tpnme.Unit, tpnme.Boolean) + + /** When typing a primitive value class or AnyVal, we ignore the `getClass` + * member: it's supposed to be an override of the `getClass` defined on `Any`, + * but in dotty `Any#getClass` is polymorphic so it ends up being an overload. + * This is especially problematic because it means that when writing: + * + * 1.asInstanceOf[Int & AnyRef].getClass + * + * the `getClass` that returns `Class[Int]` defined in Int can be selected, + * but this call is specified to return `classOf[Integer]`, see + * tests/run/t5568.scala. + * + * FIXME: remove all the `getClass` methods defined in the standard library + * so we don't have to hot-patch it like this. + */ + def hasProblematicGetClass(className: Name): Boolean = + HasProblematicGetClass.contains(className) + + /** Is a synthesized symbol with an alphanumeric name allowed to be used as an infix operator? */ + def isInfix(sym: Symbol)(using Context): Boolean = + (sym eq Object_eq) || (sym eq Object_ne) + + @tu lazy val assumedTransparentNames: Map[Name, Set[Symbol]] = + // add these for now, until we have had a chance to retrofit the 2.13 stdlib + // we should do a more thorough sweep through it then.
+ val strs = Map( + "Any" -> Set("scala"), + "AnyVal" -> Set("scala"), + "Matchable" -> Set("scala"), + "Product" -> Set("scala"), + "Object" -> Set("java.lang"), + "Comparable" -> Set("java.lang"), + "Serializable" -> Set("java.io"), + "BitSetOps" -> Set("scala.collection"), + "IndexedSeqOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "IterableOnceOps" -> Set("scala.collection"), + "IterableOps" -> Set("scala.collection"), + "LinearSeqOps" -> Set("scala.collection", "scala.collection.immutable"), + "MapOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SeqOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SetOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SortedMapOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "SortedOps" -> Set("scala.collection"), + "SortedSetOps" -> Set("scala.collection", "scala.collection.mutable", "scala.collection.immutable"), + "StrictOptimizedIterableOps" -> Set("scala.collection"), + "StrictOptimizedLinearSeqOps" -> Set("scala.collection"), + "StrictOptimizedMapOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSeqOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSetOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSortedMapOps" -> Set("scala.collection", "scala.collection.immutable"), + "StrictOptimizedSortedSetOps" -> Set("scala.collection", "scala.collection.immutable"), + "ArrayDequeOps" -> Set("scala.collection.mutable"), + "DefaultSerializable" -> Set("scala.collection.generic"), + "IsIterable" -> Set("scala.collection.generic"), + "IsIterableLowPriority" -> Set("scala.collection.generic"), + "IsIterableOnce" -> Set("scala.collection.generic"), + "IsIterableOnceLowPriority" -> Set("scala.collection.generic"), + "IsMap" -> Set("scala.collection.generic"), + "IsSeq" -> Set("scala.collection.generic")) + strs.map { case (simple, pkgs) => ( + simple.toTypeName, + pkgs.map(pkg => staticRef(pkg.toTermName, isPackage = true).symbol.moduleClass) + ) + } + + def isAssumedTransparent(sym: Symbol): Boolean = + assumedTransparentNames.get(sym.name) match + case Some(pkgs) => pkgs.contains(sym.owner) + case none => false + + // ----- primitive value class machinery ------------------------------------------ + + class PerRun[T](generate: Context ?=> T) { + private var current: RunId = NoRunId + private var cached: T = _ + def apply()(using Context): T = { + if (current != ctx.runId) { + cached = generate + current = ctx.runId + } + cached + } + } + + @tu lazy val ScalaNumericValueTypeList: List[TypeRef] = List( + ByteType, ShortType, CharType, IntType, LongType, FloatType, DoubleType) + + @tu private lazy val ScalaNumericValueTypes: collection.Set[TypeRef] = ScalaNumericValueTypeList.toSet + @tu private lazy val ScalaValueTypes: collection.Set[TypeRef] = ScalaNumericValueTypes `union` Set(UnitType, BooleanType) + + val ScalaNumericValueClasses: PerRun[collection.Set[Symbol]] = new PerRun(ScalaNumericValueTypes.map(_.symbol)) + val ScalaValueClasses: PerRun[collection.Set[Symbol]] = new PerRun(ScalaValueTypes.map(_.symbol)) + + val ScalaBoxedClasses: PerRun[collection.Set[Symbol]] = new PerRun( + Set(BoxedByteClass, BoxedShortClass, BoxedCharClass, BoxedIntClass, BoxedLongClass, BoxedFloatClass, BoxedDoubleClass, BoxedUnitClass, BoxedBooleanClass) + ) + + 
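The `PerRun` helper defined above memoizes a value for the duration of a single compiler run and recomputes it when `ctx.runId` changes; `ScalaValueClasses`, `ScalaBoxedClasses` and the other `PerRun` vals are built on it. As a rough, standalone sketch of that caching pattern (not part of the patch; `RunContext` and `runId` here are made-up stand-ins for the compiler's `Context` and `RunId`):

    // Standalone sketch of the per-run memoization used by `PerRun` above.
    final case class RunContext(runId: Int)

    final class PerRunCache[T](generate: RunContext => T):
      private var currentRun: Int = -1      // sentinel, analogous to NoRunId
      private var cached: Option[T] = None
      def apply(ctx: RunContext): T =
        if ctx.runId != currentRun then     // recompute at most once per run
          cached = Some(generate(ctx))
          currentRun = ctx.runId
        cached.get

    @main def perRunDemo(): Unit =
      val boxedNames = PerRunCache(ctx => List("BoxedInt", "BoxedLong")) // hypothetical payload
      println(boxedNames(RunContext(1)))    // computed for run 1
      println(boxedNames(RunContext(1)))    // cached result reused within the same run
      println(boxedNames(RunContext(2)))    // recomputed for the new run

The real class regenerates via a `Context ?=>` thunk instead of an explicit parameter, but the invalidation rule is the same: the cache is keyed on the run id, never on the individual context.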
private val valueTypeEnc = mutable.Map[TypeName, PrimitiveClassEnc]() + private val typeTags = mutable.Map[TypeName, Name]().withDefaultValue(nme.specializedTypeNames.Object) + +// private val unboxedTypeRef = mutable.Map[TypeName, TypeRef]() +// private val javaTypeToValueTypeRef = mutable.Map[Class[?], TypeRef]() +// private val valueTypeNamesToJavaType = mutable.Map[TypeName, Class[?]]() + + private def valueTypeRef(name: String, jtype: Class[?], enc: Int, tag: Name): TypeRef = { + val vcls = requiredClassRef(name) + valueTypeEnc(vcls.name) = enc + typeTags(vcls.name) = tag +// unboxedTypeRef(boxed.name) = vcls +// javaTypeToValueTypeRef(jtype) = vcls +// valueTypeNamesToJavaType(vcls.name) = jtype + vcls + } + + /** The type of the boxed class corresponding to primitive value type `tp`. */ + def boxedType(tp: Type)(using Context): TypeRef = { + val cls = tp.classSymbol + if (cls eq ByteClass) BoxedByteClass + else if (cls eq ShortClass) BoxedShortClass + else if (cls eq CharClass) BoxedCharClass + else if (cls eq IntClass) BoxedIntClass + else if (cls eq LongClass) BoxedLongClass + else if (cls eq FloatClass) BoxedFloatClass + else if (cls eq DoubleClass) BoxedDoubleClass + else if (cls eq UnitClass) BoxedUnitClass + else if (cls eq BooleanClass) BoxedBooleanClass + else sys.error(s"Not a primitive value type: $tp") + }.typeRef + + def unboxedType(tp: Type)(using Context): TypeRef = { + val cls = tp.classSymbol + if (cls eq BoxedByteClass) ByteType + else if (cls eq BoxedShortClass) ShortType + else if (cls eq BoxedCharClass) CharType + else if (cls eq BoxedIntClass) IntType + else if (cls eq BoxedLongClass) LongType + else if (cls eq BoxedFloatClass) FloatType + else if (cls eq BoxedDoubleClass) DoubleType + else if (cls eq BoxedUnitClass) UnitType + else if (cls eq BoxedBooleanClass) BooleanType + else sys.error(s"Not a boxed primitive value type: $tp") + } + + /** The JVM tag for `tp` if it's a primitive, `java.lang.Object` otherwise. 
*/ + def typeTag(tp: Type)(using Context): Name = typeTags(scalaClassName(tp)) + +// /** The `Class[?]` of a primitive value type name */ +// def valueTypeNameToJavaType(name: TypeName)(using Context): Option[Class[?]] = +// valueTypeNamesToJavaType.get(if (name.firstPart eq nme.scala) name.lastPart.toTypeName else name) + + type PrimitiveClassEnc = Int + + val ByteEnc: Int = 2 + val ShortEnc: Int = ByteEnc * 3 + val CharEnc: Int = 5 + val IntEnc: Int = ShortEnc * CharEnc + val LongEnc: Int = IntEnc * 7 + val FloatEnc: Int = LongEnc * 11 + val DoubleEnc: Int = FloatEnc * 13 + val BooleanEnc: Int = 17 + val UnitEnc: Int = 19 + + def isValueSubType(tref1: TypeRef, tref2: TypeRef)(using Context): Boolean = + valueTypeEnc(tref2.name) % valueTypeEnc(tref1.name) == 0 + def isValueSubClass(sym1: Symbol, sym2: Symbol): Boolean = + valueTypeEnc(sym2.asClass.name) % valueTypeEnc(sym1.asClass.name) == 0 + + @tu lazy val specialErasure: SimpleIdentityMap[Symbol, ClassSymbol] = + SimpleIdentityMap.empty[Symbol] + .updated(AnyClass, ObjectClass) + .updated(MatchableClass, ObjectClass) + .updated(AnyValClass, ObjectClass) + .updated(SingletonClass, ObjectClass) + .updated(TupleClass, ProductClass) + .updated(NonEmptyTupleClass, ProductClass) + .updated(PairClass, ObjectClass) + + // ----- Initialization --------------------------------------------------- + + /** Lists core classes that don't have underlying bytecode, but are synthesized on-the-fly in every reflection universe */ + @tu lazy val syntheticScalaClasses: List[TypeSymbol] = + List( + AnyClass, + MatchableClass, + AnyRefAlias, + AnyKindClass, + andType, + orType, + RepeatedParamClass, + ByNameParamClass2x, + IntoType, + AnyValClass, + NullClass, + NothingClass, + SingletonClass) + + @tu lazy val syntheticCoreClasses: List[Symbol] = syntheticScalaClasses ++ List( + EmptyPackageVal, + OpsPackageClass) + + /** Lists core methods that don't have underlying bytecode, but are synthesized on-the-fly in every reflection universe */ + @tu lazy val syntheticCoreMethods: List[TermSymbol] = + AnyMethods ++ ObjectMethods ++ List(String_+, throwMethod) + + @tu lazy val reservedScalaClassNames: Set[Name] = syntheticScalaClasses.map(_.name).toSet + + private var isInitialized = false + + def init()(using ctx: DetachedContext): Unit = { + this.initCtx = ctx + if (!isInitialized) { + // force initialization of every symbol that is synthesized or hijacked by the compiler + val forced = + syntheticCoreClasses ++ syntheticCoreMethods ++ ScalaValueClasses() :+ JavaEnumClass + isInitialized = true + } + addSyntheticSymbolsComments + } + + /** Definitions used in Lazy Vals implementation */ + val LazyValsModuleName = "scala.runtime.LazyVals" + @tu lazy val LazyValsModule = requiredModule(LazyValsModuleName) + @tu lazy val LazyValsWaitingState = requiredClass(s"$LazyValsModuleName.Waiting") + @tu lazy val LazyValsControlState = requiredClass(s"$LazyValsModuleName.LazyValControlState") + + def addSyntheticSymbolsComments(using Context): Unit = + def add(sym: Symbol, doc: String) = ctx.docCtx.foreach(_.addDocstring(sym, Some(Comment(NoSpan, doc)))) + + add(AnyClass, + """/** Class `Any` is the root of the Scala class hierarchy. Every class in a Scala + | * execution environment inherits directly or indirectly from this class. + | * + | * Starting with Scala 2.10 it is possible to directly extend `Any` using ''universal traits''. + | * A ''universal trait'' is a trait that extends `Any`, only has `def`s as members, and does no initialization. 
+ | * + | * The main use case for universal traits is to allow basic inheritance of methods for [[scala.AnyVal value classes]]. + | * For example, + | * + | * {{{ + | * trait Printable extends Any { + | * def print(): Unit = println(this) + | * } + | * class Wrapper(val underlying: Int) extends AnyVal with Printable + | * + | * val w = new Wrapper(3) + | * w.print() + | * }}} + | * + | * See the [[https://docs.scala-lang.org/overviews/core/value-classes.html Value Classes and Universal Traits]] for more + | * details on the interplay of universal traits and value classes. + | */ + """.stripMargin) + + add(Any_==, + """/** Test two objects for equality. + | * The expression `x == that` is equivalent to `if (x eq null) that eq null else x.equals(that)`. + | * + | * @param that the object to compare against this object for equality. + | * @return `true` if the receiver object is equivalent to the argument; `false` otherwise. + | */ + """.stripMargin) + + add(Any_!=, + """/** Test two objects for inequality. + | * + | * @param that the object to compare against this object for equality. + | * @return `true` if !(this == that), `false` otherwise. + | */ + """.stripMargin) + + add(Any_equals, + """/** Compares the receiver object (`this`) with the argument object (`that`) for equivalence. + | * + | * Any implementation of this method should be an [[https://en.wikipedia.org/wiki/Equivalence_relation equivalence relation]]: + | * + | * - It is reflexive: for any instance `x` of type `Any`, `x.equals(x)` should return `true`. + | * - It is symmetric: for any instances `x` and `y` of type `Any`, `x.equals(y)` should return `true` if and + | * only if `y.equals(x)` returns `true`. + | * - It is transitive: for any instances `x`, `y`, and `z` of type `Any` if `x.equals(y)` returns `true` and + | * `y.equals(z)` returns `true`, then `x.equals(z)` should return `true`. + | * + | * If you override this method, you should verify that your implementation remains an equivalence relation. + | * Additionally, when overriding this method it is usually necessary to override `hashCode` to ensure that + | * objects which are "equal" (`o1.equals(o2)` returns `true`) hash to the same [[scala.Int]]. + | * (`o1.hashCode.equals(o2.hashCode)`). + | * + | * @param that the object to compare against this object for equality. + | * @return `true` if the receiver object is equivalent to the argument; `false` otherwise. + | */ + """.stripMargin) + + add(Any_hashCode, + """/** Calculate a hash code value for the object. + | * + | * The default hashing algorithm is platform dependent. + | * + | * Note that it is allowed for two objects to have identical hash codes (`o1.hashCode.equals(o2.hashCode)`) yet + | * not be equal (`o1.equals(o2)` returns `false`). A degenerate implementation could always return `0`. + | * However, it is required that if two objects are equal (`o1.equals(o2)` returns `true`) that they have + | * identical hash codes (`o1.hashCode.equals(o2.hashCode)`). Therefore, when overriding this method, be sure + | * to verify that the behavior is consistent with the `equals` method. + | * + | * @return the hash code value for this object. + | */ + """.stripMargin) + + add(Any_toString, + """/** Returns a string representation of the object. + | * + | * The default representation is platform dependent. + | * + | * @return a string representation of the object. + | */ + """.stripMargin) + + add(Any_##, + """/** Equivalent to `x.hashCode` except for boxed numeric types and `null`. 
+ | * For numerics, it returns a hash value which is consistent + | * with value equality: if two value type instances compare + | * as true, then ## will produce the same hash value for each + | * of them. + | * For `null` returns a hashcode where `null.hashCode` throws a + | * `NullPointerException`. + | * + | * @return a hash value consistent with == + | */ + """.stripMargin) + + add(Any_isInstanceOf, + """/** Test whether the dynamic type of the receiver object is `T0`. + | * + | * Note that the result of the test is modulo Scala's erasure semantics. + | * Therefore the expression `1.isInstanceOf[String]` will return `false`, while the + | * expression `List(1).isInstanceOf[List[String]]` will return `true`. + | * In the latter example, because the type argument is erased as part of compilation it is + | * not possible to check whether the contents of the list are of the specified type. + | * + | * @return `true` if the receiver object is an instance of erasure of type `T0`; `false` otherwise. + | */ + """.stripMargin) + + add(Any_asInstanceOf, + """/** Cast the receiver object to be of type `T0`. + | * + | * Note that the success of a cast at runtime is modulo Scala's erasure semantics. + | * Therefore the expression `1.asInstanceOf[String]` will throw a `ClassCastException` at + | * runtime, while the expression `List(1).asInstanceOf[List[String]]` will not. + | * In the latter example, because the type argument is erased as part of compilation it is + | * not possible to check whether the contents of the list are of the requested type. + | * + | * @throws ClassCastException if the receiver object is not an instance of the erasure of type `T0`. + | * @return the receiver object. + | */ + """.stripMargin) + + add(Any_getClass, + """/** Returns the runtime class representation of the object. + | * + | * @return a class object corresponding to the runtime type of the receiver. + | */ + """.stripMargin) + + add(MatchableClass, + """/** The base trait of types that can be safely pattern matched against. + | * + | * See [[https://docs.scala-lang.org/scala3/reference/other-new-features/matchable.html]]. + | */ + """.stripMargin) + + add(AnyRefAlias, + """/** Class `AnyRef` is the root class of all ''reference types''. + | * All types except the value types descend from this class. + | */ + """.stripMargin) + + add(Object_eq, + """/** Tests whether the argument (`that`) is a reference to the receiver object (`this`). + | * + | * The `eq` method implements an [[https://en.wikipedia.org/wiki/Equivalence_relation equivalence relation]] on + | * non-null instances of `AnyRef`, and has three additional properties: + | * + | * - It is consistent: for any non-null instances `x` and `y` of type `AnyRef`, multiple invocations of + | * `x.eq(y)` consistently returns `true` or consistently returns `false`. + | * - For any non-null instance `x` of type `AnyRef`, `x.eq(null)` and `null.eq(x)` returns `false`. + | * - `null.eq(null)` returns `true`. + | * + | * When overriding the `equals` or `hashCode` methods, it is important to ensure that their behavior is + | * consistent with reference equality. Therefore, if two objects are references to each other (`o1 eq o2`), they + | * should be equal to each other (`o1 == o2`) and they should hash to the same value (`o1.hashCode == o2.hashCode`). + | * + | * @param that the object to compare against this object for reference equality. + | * @return `true` if the argument is a reference to the receiver object; `false` otherwise. 
+ | */ + """.stripMargin) + + add(Object_ne, + """/** Equivalent to `!(this eq that)`. + | * + | * @param that the object to compare against this object for reference equality. + | * @return `true` if the argument is not a reference to the receiver object; `false` otherwise. + | */ + """.stripMargin) + + add(Object_synchronized, + """/** Executes the code in `body` with an exclusive lock on `this`. + | * + | * @param body the code to execute + | * @return the result of `body` + | */ + """.stripMargin) + + add(Object_clone, + """/** Create a copy of the receiver object. + | * + | * The default implementation of the `clone` method is platform dependent. + | * + | * @note not specified by SLS as a member of AnyRef + | * @return a copy of the receiver object. + | */ + """.stripMargin) + + add(Object_finalize, + """/** Called by the garbage collector on the receiver object when there + | * are no more references to the object. + | * + | * The details of when and if the `finalize` method is invoked, as + | * well as the interaction between `finalize` and non-local returns + | * and exceptions, are all platform dependent. + | * + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(Object_notify, + """/** Wakes up a single thread that is waiting on the receiver object's monitor. + | * + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(Object_notifyAll, + """/** Wakes up all threads that are waiting on the receiver object's monitor. + | * + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(Object_wait, + """/** See [[https://docs.oracle.com/javase/8/docs/api/java/lang/Object.html#wait--]]. + | * + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(Object_waitL, + """/** See [[https://docs.oracle.com/javase/8/docs/api/java/lang/Object.html#wait-long-]]. + | * + | * @param timeout the maximum time to wait in milliseconds. + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(Object_waitLI, + """/** See [[https://docs.oracle.com/javase/8/docs/api/java/lang/Object.html#wait-long-int-]] + | * + | * @param timeout the maximum time to wait in milliseconds. + | * @param nanos additional time, in nanoseconds range 0-999999. + | * @note not specified by SLS as a member of AnyRef + | */ + """.stripMargin) + + add(AnyKindClass, + """/** The super-type of all types. + | * + | * See [[https://docs.scala-lang.org/scala3/reference/other-new-features/kind-polymorphism.html]]. + | */ + """.stripMargin) + + add(andType, + """/** The intersection of two types. + | * + | * See [[https://docs.scala-lang.org/scala3/reference/new-types/intersection-types.html]]. + | */ + """.stripMargin) + + add(orType, + """/** The union of two types. + | * + | * See [[https://docs.scala-lang.org/scala3/reference/new-types/union-types.html]]. + | */ + """.stripMargin) + + add(AnyValClass, + """/** `AnyVal` is the root class of all ''value types'', which describe values + | * not implemented as objects in the underlying host system. Value classes + | * are specified in Scala Language Specification, section 12.2. + | * + | * The standard implementation includes nine `AnyVal` subtypes: + | * + | * [[scala.Double]], [[scala.Float]], [[scala.Long]], [[scala.Int]], [[scala.Char]], + | * [[scala.Short]], and [[scala.Byte]] are the ''numeric value types''. + | * + | * [[scala.Unit]] and [[scala.Boolean]] are the ''non-numeric value types''. 
+ | * + | * Other groupings: + | * + | * - The ''subrange types'' are [[scala.Byte]], [[scala.Short]], and [[scala.Char]]. + | * - The ''integer types'' include the subrange types as well as [[scala.Int]] and [[scala.Long]]. + | * - The ''floating point types'' are [[scala.Float]] and [[scala.Double]]. + | * + | * Prior to Scala 2.10, `AnyVal` was a sealed trait. Beginning with Scala 2.10, + | * however, it is possible to define a subclass of `AnyVal` called a ''user-defined value class'' + | * which is treated specially by the compiler. Properly-defined user value classes provide a way + | * to improve performance on user-defined types by avoiding object allocation at runtime, and by + | * replacing virtual method invocations with static method invocations. + | * + | * User-defined value classes which avoid object allocation... + | * + | * - must have a single `val` parameter that is the underlying runtime representation. + | * - can define `def`s, but no `val`s, `var`s, or nested `traits`s, `class`es or `object`s. + | * - typically extend no other trait apart from `AnyVal`. + | * - cannot be used in type tests or pattern matching. + | * - may not override `equals` or `hashCode` methods. + | * + | * A minimal example: + | * {{{ + | * class Wrapper(val underlying: Int) extends AnyVal { + | * def foo: Wrapper = new Wrapper(underlying * 19) + | * } + | * }}} + | * + | * It's important to note that user-defined value classes are limited, and in some circumstances, + | * still must allocate a value class instance at runtime. These limitations and circumstances are + | * explained in greater detail in the [[https://docs.scala-lang.org/overviews/core/value-classes.html Value Classes and Universal Traits]]. + | */ + """.stripMargin) + + add(NullClass, + """/** `Null` is - together with [[scala.Nothing]] - at the bottom of the Scala type hierarchy. + | * + | * `Null` is the type of the `null` literal. It is a subtype of every type + | * except those of value classes. Value classes are subclasses of [[AnyVal]], which includes + | * primitive types such as [[Int]], [[Boolean]], and user-defined value classes. + | * + | * Since `Null` is not a subtype of value types, `null` is not a member of any such type. + | * For instance, it is not possible to assign `null` to a variable of type [[scala.Int]]. + | */ + """.stripMargin) + + add(NothingClass, + """/** `Nothing` is - together with [[scala.Null]] - at the bottom of Scala's type hierarchy. + | * + | * `Nothing` is a subtype of every other type (including [[scala.Null]]); there exist + | * ''no instances'' of this type. Although type `Nothing` is uninhabited, it is + | * nevertheless useful in several ways. For instance, the Scala library defines a value + | * [[scala.collection.immutable.Nil]] of type `List[Nothing]`. Because lists are covariant in Scala, + | * this makes [[scala.collection.immutable.Nil]] an instance of `List[T]`, for any element of type `T`. + | * + | * Another usage for Nothing is the return type for methods which never return normally. + | * One example is method error in [[scala.sys]], which always throws an exception. + | */ + """.stripMargin) + + add(SingletonClass, + """/** `Singleton` is used by the compiler as a supertype for singleton types. This includes literal types, + | * as they are also singleton types. 
+ | * + | * {{{ + | * scala> object A { val x = 42 } + | * defined object A + | * + | * scala> implicitly[A.type <:< Singleton] + | * res12: A.type <:< Singleton = generalized constraint + | * + | * scala> implicitly[A.x.type <:< Singleton] + | * res13: A.x.type <:< Singleton = generalized constraint + | * + | * scala> implicitly[42 <:< Singleton] + | * res14: 42 <:< Singleton = generalized constraint + | * + | * scala> implicitly[Int <:< Singleton] + | * ^ + | * error: Cannot prove that Int <:< Singleton. + | * }}} + | * + | * `Singleton` has a special meaning when it appears as an upper bound on a formal type + | * parameter. Normally, type inference in Scala widens singleton types to the underlying + | * non-singleton type. When a type parameter has an explicit upper bound of `Singleton`, + | * the compiler infers a singleton type. + | * + | * {{{ + | * scala> def check42[T](x: T)(implicit ev: T =:= 42): T = x + | * check42: [T](x: T)(implicit ev: T =:= 42)T + | * + | * scala> val x1 = check42(42) + | * ^ + | * error: Cannot prove that Int =:= 42. + | * + | * scala> def singleCheck42[T <: Singleton](x: T)(implicit ev: T =:= 42): T = x + | * singleCheck42: [T <: Singleton](x: T)(implicit ev: T =:= 42)T + | * + | * scala> val x2 = singleCheck42(42) + | * x2: Int = 42 + | * }}} + | * + | * See also [[https://docs.scala-lang.org/sips/42.type.html SIP-23 about Literal-based Singleton Types]]. + | */ + """.stripMargin) +} diff --git a/tests/pos-with-compiler-cc/dotc/core/DenotTransformers.scala b/tests/pos-with-compiler-cc/dotc/core/DenotTransformers.scala new file mode 100644 index 000000000000..6690cae3a142 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/DenotTransformers.scala @@ -0,0 +1,82 @@ +package dotty.tools.dotc +package core + +import Periods._ +import SymDenotations._ +import Contexts._ +import Types._ +import Symbols._ +import Denotations._ +import Phases._ + +object DenotTransformers { + + /** A transformer group contains a sequence of transformers, + * ordered by the phase where they apply. Transformers are added + * to a group via `install`. + */ + + /** A transformer transforms denotations at a given phase */ + trait DenotTransformer extends Phase { + + /** The last phase during which the transformed denotations are valid */ + def lastPhaseId(using Context): Int = ctx.base.nextDenotTransformerId(id + 1) + + /** The validity period of the transformed denotations in the given context */ + def validFor(using Context): Period = + Period(ctx.runId, id + 1, lastPhaseId) + + /** The transformation method */ + def transform(ref: SingleDenotation)(using Context): SingleDenotation + } + + /** A transformer that only transforms the info field of denotations */ + trait InfoTransformer extends DenotTransformer { + + def transformInfo(tp: Type, sym: Symbol)(using Context): Type + + def transform(ref: SingleDenotation)(using Context): SingleDenotation = { + val sym = ref.symbol + if (sym.exists && !infoMayChange(sym)) ref + else { + val info1 = transformInfo(ref.info, ref.symbol) + if (info1 eq ref.info) ref + else ref match { + case ref: SymDenotation => + ref.copySymDenotation(info = info1).copyCaches(ref, ctx.phase.next) + case _ => + ref.derivedSingleDenotation(ref.symbol, info1) + } + } + } + + /** Denotations with a symbol where `infoMayChange` is false are guaranteed to be + * unaffected by this transform, so `transformInfo` need not be run. This + * can save time, and more importantly, can help avoid forcing symbol completers. 
+ */ + protected def infoMayChange(sym: Symbol)(using Context): Boolean = true + } + + /** A transformer that only transforms SymDenotations. + * Note: Infos of non-sym denotations are left as is. So the transformer should + * be used before erasure only if this is not a problem. After erasure, all + * denotations are SymDenotations, so SymTransformers can be used freely. + */ + trait SymTransformer extends DenotTransformer { + + def transformSym(sym: SymDenotation)(using Context): SymDenotation + + def transform(ref: SingleDenotation)(using Context): SingleDenotation = ref match { + case ref: SymDenotation => transformSym(ref) + case _ => ref + } + } + + /** A `DenotTransformer` trait that has the identity as its `transform` method. + * You might want to inherit from this trait so that new denotations can be + * installed using `installAfter` and `enteredAfter` at the end of the phase. + */ + trait IdentityDenotTransformer extends DenotTransformer { + def transform(ref: SingleDenotation)(using Context): SingleDenotation = ref + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Denotations.scala b/tests/pos-with-compiler-cc/dotc/core/Denotations.scala new file mode 100644 index 000000000000..246e359f0597 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Denotations.scala @@ -0,0 +1,1376 @@ +package dotty.tools +package dotc +package core + +import SymDenotations.{ SymDenotation, ClassDenotation, NoDenotation, LazyType, stillValid, acceptStale, traceInvalid } +import Contexts._ +import Names._ +import NameKinds._ +import StdNames._ +import Symbols.NoSymbol +import Symbols._ +import Types._ +import Periods._ +import Flags._ +import DenotTransformers._ +import Decorators._ +import Signature.MatchDegree._ +import printing.Texts._ +import printing.Printer +import io.AbstractFile +import config.Config +import config.Printers.overload +import util.common._ +import typer.ProtoTypes.NoViewsAllowed +import collection.mutable.ListBuffer +import language.experimental.pureFunctions + +/** Denotations represent the meaning of symbols and named types. + * The following diagram shows how the principal types of denotations + * and their denoting entities relate to each other. Lines ending in + * a down-arrow `v` are member methods. The two methods shown in the diagram are + * "symbol" and "deref". Both methods are parameterized by the current context, + * and are effectively indexed by current period. + * + * Lines ending in a horizontal line mean subtyping (right is a subtype of left). + * + * NamedType + * | Symbol---------ClassSymbol + * | | | + * | denot | denot | denot + * v v v + * Denotation-+-----SingleDenotation-+------SymDenotation-+----ClassDenotation + * | | + * +-----MultiDenotation | + * | + * +--UniqueRefDenotation + * +--JointRefDenotation + * + * Here's a short summary of the classes in this diagram. 
+ * + * NamedType A type consisting of a prefix type and a name, with fields + * prefix: Type + * name: Name + * It has two subtypes: TermRef and TypeRef + * Symbol A label for a definition or declaration in one compiler run + * ClassSymbol A symbol representing a class + * Denotation The meaning of a named type or symbol during a period + * MultiDenotation A denotation representing several overloaded members + * SingleDenotation A denotation representing a non-overloaded member or definition, with main fields + * symbol: Symbol + * info: Type + * UniqueRefDenotation A denotation referring to a single definition with some member type + * JointRefDenotation A denotation referring to a member that could resolve to several definitions + * SymDenotation A denotation representing a single definition with its original type, with main fields + * name: Name + * owner: Symbol + * flags: Flags + * privateWithin: Symbol + * annotations: List[Annotation] + * ClassDenotation A denotation representing a single class definition. + */ +object Denotations { + + implicit def eqDenotation: CanEqual[Denotation, Denotation] = CanEqual.derived + + /** A PreDenotation represents a group of single denotations or a single multi-denotation + * It is used as an optimization to avoid forming MultiDenotations too eagerly. + */ + abstract class PreDenotation extends caps.Pure { + + /** A denotation in the group exists */ + def exists: Boolean + + /** First/last denotation in the group */ + def first: Denotation + def last: Denotation + + /** Convert to full denotation by &-ing all elements */ + def toDenot(pre: Type)(using Context): Denotation + + /** Group contains a denotation that refers to given symbol */ + def containsSym(sym: Symbol): Boolean + + /** Group contains a denotation with the same signature as `other` */ + def matches(other: SingleDenotation)(using Context): Boolean + + /** Keep only those denotations in this group which satisfy predicate `p`. */ + def filterWithPredicate(p: SingleDenotation => Boolean): PreDenotation + + /** Keep only those denotations in this group which have a signature + * that's not already defined by `denots`. + */ + def filterDisjoint(denots: PreDenotation)(using Context): PreDenotation + + /** Keep only those inherited members M of this predenotation for which the following is true + * - M is not marked Private + * - If M has a unique symbol, it does not appear in `prevDenots`. + * - M's signature as seen from prefix `pre` does not appear in `ownDenots` + * Return the denotation as seen from `pre`. + * Called from SymDenotations.computeMember. There, `ownDenots` are the denotations found in + * the base class, which shadow any inherited denotations with the same signature. + * `prevDenots` are the denotations that are defined in the class or inherited from + * a base type which comes earlier in the linearization. + */ + def mapInherited(ownDenots: PreDenotation, prevDenots: PreDenotation, pre: Type)(using Context): PreDenotation + + /** Keep only those denotations in this group that have all of the flags in `required`, + * but none of the flags in `excluded`. + */ + def filterWithFlags(required: FlagSet, excluded: FlagSet)(using Context): PreDenotation + + /** Map `f` over all single denotations and aggregate the results with `g`. 
*/ + def aggregate[T](f: SingleDenotation => T, g: (T, T) => T): T + + private var cachedPrefix: Type = _ + private var cachedAsSeenFrom: AsSeenFromResult = _ + private var validAsSeenFrom: Period = Nowhere + + type AsSeenFromResult <: PreDenotation + + /** The denotation with info(s) as seen from prefix type */ + def asSeenFrom(pre: Type)(using Context): AsSeenFromResult = + if (Config.cacheAsSeenFrom) { + if ((cachedPrefix ne pre) || ctx.period != validAsSeenFrom) { + cachedAsSeenFrom = computeAsSeenFrom(pre) + cachedPrefix = pre + validAsSeenFrom = if (pre.isProvisional) Nowhere else ctx.period + } + cachedAsSeenFrom + } + else computeAsSeenFrom(pre) + + protected def computeAsSeenFrom(pre: Type)(using Context): AsSeenFromResult + + /** The union of two groups. */ + def union(that: PreDenotation): PreDenotation = + if (!this.exists) that + else if (!that.exists) this + else DenotUnion(this, that) + } + + /** A denotation is the result of resolving + * a name (either simple identifier or select) during a given period. + * + * Denotations can be combined with `&` and `|`. + * & is conjunction, | is disjunction. + * + * `&` will create an overloaded denotation from two + * non-overloaded denotations if their signatures differ. + * Analogously `|` of two denotations with different signatures will give + * an empty denotation `NoDenotation`. + * + * A denotation might refer to `NoSymbol`. This is the case if the denotation + * was produced from a disjunction of two denotations with different symbols + * and there was no common symbol in a superclass that could substitute for + * both symbols. Here is an example: + * + * Say, we have: + * + * class A { def f: A } + * class B { def f: B } + * val x: A | B = if (test) new A else new B + * val y = x.f + * + * Then the denotation of `y` is `SingleDenotation(NoSymbol, A | B)`. + * + * @param symbol The referencing symbol, or NoSymbol is none exists + */ + abstract class Denotation(val symbol: Symbol, protected var myInfo: Type) extends PreDenotation with printing.Showable { + type AsSeenFromResult <: Denotation + + /** The type info. + * The info is an instance of TypeType iff this is a type denotation + * Uncompleted denotations set myInfo to a LazyType. + */ + final def info(using Context): Type = { + def completeInfo = { // Written this way so that `info` is small enough to be inlined + this.asInstanceOf[SymDenotation].completeFrom(myInfo.asInstanceOf[LazyType]); info + } + if (myInfo.isInstanceOf[LazyType]) completeInfo else myInfo + } + + /** The type info, or, if this is a SymDenotation where the symbol + * is not yet completed, the completer + */ + def infoOrCompleter: Type + + /** The period during which this denotation is valid. */ + def validFor: Period + + /** Is this a reference to a type symbol? */ + def isType: Boolean + + /** Is this a reference to a term symbol? */ + def isTerm: Boolean = !isType + + /** Is this denotation overloaded? */ + final def isOverloaded: Boolean = isInstanceOf[MultiDenotation] + + /** Denotation points to unique symbol; false for overloaded denotations + * and JointRef denotations. + */ + def hasUniqueSym: Boolean + + /** The name of the denotation */ + def name(using Context): Name + + /** The signature of the denotation. */ + def signature(using Context): Signature + + /** Resolve overloaded denotation to pick the ones with the given signature + * when seen from prefix `site`. + * @param relaxed When true, consider only parameter signatures for a match. 
+ */ + def atSignature(sig: Signature, targetName: Name, site: Type = NoPrefix, relaxed: Boolean = false)(using Context): Denotation + + /** The variant of this denotation that's current in the given context. + * If no such denotation exists, returns the denotation with each alternative + * at its first point of definition. + */ + def current(using Context): Denotation + + /** Is this denotation different from NoDenotation or an ErrorDenotation? */ + def exists: Boolean = true + + /** A denotation with the info of this denotation transformed using `f` */ + def mapInfo(f: Type => Type)(using Context): Denotation + + /** If this denotation does not exist, fallback to alternative */ + inline def orElse(inline that: Denotation): Denotation = if (this.exists) this else that + + /** The set of alternative single-denotations making up this denotation */ + final def alternatives: List[SingleDenotation] = altsWith(alwaysTrue) + + /** The alternatives of this denotation that satisfy the predicate `p`. */ + def altsWith(p: Symbol => Boolean): List[SingleDenotation] + + /** The unique alternative of this denotation that satisfies the predicate `p`, + * or NoDenotation if no satisfying alternative exists. + * @throws TypeError if there is more than one alternative that satisfies `p`. + */ + def suchThat(p: Symbol => Boolean)(using Context): SingleDenotation + + override def filterWithPredicate(p: SingleDenotation => Boolean): Denotation + + /** If this is a SingleDenotation, return it, otherwise throw a TypeError */ + def checkUnique(using Context): SingleDenotation = suchThat(alwaysTrue) + + /** Does this denotation have an alternative that satisfies the predicate `p`? */ + def hasAltWith(p: SingleDenotation => Boolean): Boolean + + /** The denotation made up from the alternatives of this denotation that + * are accessible from prefix `pre`, or NoDenotation if no accessible alternative exists. + */ + def accessibleFrom(pre: Type, superAccess: Boolean = false)(using Context): Denotation + + /** Find member of this denotation with given `name`, all `required` + * flags and no `excluded` flag, and produce a denotation that contains the type of the member + * as seen from given prefix `pre`. + */ + def findMember(name: Name, pre: Type, required: FlagSet, excluded: FlagSet)(using Context): Denotation = + info.findMember(name, pre, required, excluded) + + /** If this denotation is overloaded, filter with given predicate. + * If result is still overloaded throw a TypeError. + * Note: disambiguate is slightly different from suchThat in that + * single-denotations that do not satisfy the predicate are left alone + * (whereas suchThat would map them to NoDenotation). + */ + inline def disambiguate(inline p: Symbol => Boolean)(using Context): SingleDenotation = this match { + case sdenot: SingleDenotation => sdenot + case mdenot => suchThat(p) orElse NoQualifyingRef(alternatives) + } + + /** Return the symbol in this denotation that satisfies the given predicate. + * If generateStubs is specified, return a stub symbol if the denotation is a missing ref. + * Throw a `TypeError` if the predicate fails to disambiguate the symbol or no alternative matches.
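+ * Illustrative sketch (hypothetical example; `someClassDenot` stands for an arbitrary class denotation): + * + * val applySym = someClassDenot.requiredMethod(nme.apply) // TermSymbol, or a TypeError if missing or ambiguous + * val applyRef = someClassDenot.requiredMethodRef(nme.apply) // the corresponding TermRef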
+ */ + def requiredSymbol(kind: String, + name: Name, + site: Denotation = NoDenotation, + args: List[Type] = Nil, + source: AbstractFile | Null = null, + generateStubs: Boolean = true) + (p: Symbol => Boolean) + (using Context): Symbol = + disambiguate(p) match { + case m @ MissingRef(ownerd, name) if generateStubs => + if ctx.settings.YdebugMissingRefs.value then m.ex.printStackTrace() + newStubSymbol(ownerd.symbol, name, source) + case NoDenotation | _: NoQualifyingRef | _: MissingRef => + def argStr = if (args.isEmpty) "" else i" matching ($args%, %)" + val msg = + if site.exists then em"$site does not have a member $kind $name$argStr" + else em"missing: $kind $name$argStr" + throw TypeError(msg) + case denot => + denot.symbol + } + + def requiredMethod(pname: PreName)(using Context): TermSymbol = { + val name = pname.toTermName + info.member(name).requiredSymbol("method", name, this)(_.is(Method)).asTerm + } + def requiredMethodRef(name: PreName)(using Context): TermRef = + requiredMethod(name).termRef + + def requiredMethod(pname: PreName, argTypes: List[Type])(using Context): TermSymbol = { + val name = pname.toTermName + info.member(name).requiredSymbol("method", name, this, argTypes) { x => + x.is(Method) && { + x.info.paramInfoss match { + case paramInfos :: Nil => paramInfos.corresponds(argTypes)(_ =:= _) + case _ => false + } + } + }.asTerm + } + def requiredMethodRef(name: PreName, argTypes: List[Type])(using Context): TermRef = + requiredMethod(name, argTypes).termRef + + def requiredValue(pname: PreName)(using Context): TermSymbol = { + val name = pname.toTermName + info.member(name).requiredSymbol("field or getter", name, this)(_.info.isParameterless).asTerm + } + def requiredValueRef(name: PreName)(using Context): TermRef = + requiredValue(name).termRef + + def requiredClass(pname: PreName)(using Context): ClassSymbol = { + val name = pname.toTypeName + info.member(name).requiredSymbol("class", name, this)(_.isClass).asClass + } + + def requiredType(pname: PreName)(using Context): TypeSymbol = { + val name = pname.toTypeName + info.member(name).requiredSymbol("type", name, this)(_.isType).asType + } + + /** The alternative of this denotation that has a type matching `targetType` when seen + * as a member of type `site` and that has a target name matching `targetName`, or + * `NoDenotation` if none exists. + */ + def matchingDenotation(site: Type, targetType: Type, targetName: Name)(using Context): SingleDenotation = { + def qualifies(sym: Symbol) = + site.memberInfo(sym).matchesLoosely(targetType) && sym.hasTargetName(targetName) + if (isOverloaded) + atSignature(targetType.signature, targetName, site, relaxed = true) match { + case sd: SingleDenotation => sd.matchingDenotation(site, targetType, targetName) + case md => md.suchThat(qualifies(_)) + } + else if (exists && !qualifies(symbol)) NoDenotation + else asSingleDenotation + } + + /** Form a denotation by conjoining with denotation `that`. + * + * NoDenotations are dropped. MultiDenotations are handled by merging + * parts with same signatures. SingleDenotations with equal signatures + * are joined by following this sequence of steps: + * + * 1. If exactly one the denotations has an inaccessible symbol, pick the other one. + * 2. Otherwise, if one of the infos overrides the other one, and the associated + * symbol does not score strictly lower than the other one, + * pick the associated denotation. + * 3. 
Otherwise, if the two infos can be combined with `infoMeet`, pick that as + * result info, and pick the symbol that scores higher as result symbol, + * or pick `sym1` as a tie breaker. The picked info and symbol are combined + * in a JointDenotation. + * 4. Otherwise, if one of the two symbols scores strongly higher than the + * other one, pick the associated denotation. + * 5. Otherwise return a multi-denotation consisting of both denotations. + * + * Symbol scoring is determined according to the following ranking + * where earlier criteria trump later ones. Cases marked with (*) + * give a strong score advantage, the others a weak one. + * + * 1. The symbol exists, and the other one does not. (*) + * 2. The symbol is not a bridge, but the other one is. (*) + * 3. The symbol is concrete, and the other one is deferred + * 4. The symbol appears before the other in the linearization of `pre` + * 5. The symbol's visibility is strictly greater than the other one's. + * 6. The symbol is a method, but the other one is not. + */ + def meet(that: Denotation, pre: Type, safeIntersection: Boolean = false)(using Context): Denotation = { + /** Try to merge denot1 and denot2 without adding a new signature. */ + def mergeDenot(denot1: Denotation, denot2: SingleDenotation): Denotation = denot1 match { + case denot1 @ MultiDenotation(denot11, denot12) => + val d1 = mergeDenot(denot11, denot2) + if (d1.exists) denot1.derivedUnionDenotation(d1, denot12) + else { + val d2 = mergeDenot(denot12, denot2) + if (d2.exists) denot1.derivedUnionDenotation(denot11, d2) + else NoDenotation + } + case denot1: SingleDenotation => + if (denot1 eq denot2) denot1 + else if denot1.matches(denot2) then mergeSingleDenot(denot1, denot2) + else NoDenotation + } + + /** Try to merge single-denotations. */ + def mergeSingleDenot(denot1: SingleDenotation, denot2: SingleDenotation): Denotation = + val info1 = denot1.info + val info2 = denot2.info + val sym1 = denot1.symbol + val sym2 = denot2.symbol + + /** Does `owner1` come before `owner2` in the linearization of `pre`? */ + def linearScore(owner1: Symbol, owner2: Symbol): Int = + + def searchBaseClasses(bcs: List[ClassSymbol]): Int = bcs match + case bc :: bcs1 => + if bc eq owner1 then 1 + else if bc eq owner2 then -1 + else searchBaseClasses(bcs1) + case Nil => 0 + + if owner1 eq owner2 then 0 + else if owner1.derivesFrom(owner2) then 1 + else if owner2.derivesFrom(owner1) then -1 + else searchBaseClasses(pre.baseClasses) + end linearScore + + /** Similar to SymDenotation#accessBoundary, but without the special cases. */ + def accessBoundary(sym: Symbol) = + if (sym.is(Private)) sym.owner + else sym.privateWithin.orElse( + if (sym.is(Protected)) sym.owner.enclosingPackageClass + else defn.RootClass) + + def isHidden(sym: Symbol) = sym.exists && !sym.isAccessibleFrom(pre) + // In typer phase filter out denotations with symbols that are not + // accessible. After typer, this is not possible since we cannot guarantee + // that the current owner is set correctly. See pos/14660.scala. + val hidden1 = isHidden(sym1) && ctx.isTyper + val hidden2 = isHidden(sym2) && ctx.isTyper + if hidden1 && !hidden2 then denot2 + else if hidden2 && !hidden1 then denot1 + else + // The score that determines which symbol to pick for the result denotation. + // A value > 0 means pick `sym1`, < 0 means pick `sym2`. + // A value of +/- 2 means pick one of the denotations as a tie-breaker + // if a common info does not exist. 
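+ // Illustrative example (hypothetical case): if `sym1` is a bridge and `sym2` is not, + // symScore is -2, so if no common info can be computed below, `denot2` is picked + // outright; a difference that only yields +/- 1 (e.g. visibility or method-ness) + // results in a MultiDenotation instead.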
+ val symScore: Int = + if !sym1.exists then -2 + else if !sym2.exists then 2 + else if sym1.is(Bridge) && !sym2.is(Bridge) then -2 + else if sym2.is(Bridge) && !sym1.is(Bridge) then 2 + else if !sym1.isAsConcrete(sym2) then -1 + else if !sym2.isAsConcrete(sym1) then 1 + else + val linScore = linearScore(sym1.owner, sym2.owner) + if linScore != 0 then linScore + else + val boundary1 = accessBoundary(sym1) + val boundary2 = accessBoundary(sym2) + if boundary1.isProperlyContainedIn(boundary2) then -1 + else if boundary2.isProperlyContainedIn(boundary1) then 1 + else if sym2.is(Method) && !sym1.is(Method) then -1 + else if sym1.is(Method) && !sym2.is(Method) then 1 + else 0 + + val relaxedOverriding = ctx.explicitNulls && (sym1.is(JavaDefined) || sym2.is(JavaDefined)) + val matchLoosely = sym1.matchNullaryLoosely || sym2.matchNullaryLoosely + + if symScore <= 0 && info2.overrides(info1, relaxedOverriding, matchLoosely, checkClassInfo = false) then + denot2 + else if symScore >= 0 && info1.overrides(info2, relaxedOverriding, matchLoosely, checkClassInfo = false) then + denot1 + else + val jointInfo = infoMeet(info1, info2, safeIntersection) + if jointInfo.exists then + val sym = if symScore >= 0 then sym1 else sym2 + JointRefDenotation(sym, jointInfo, denot1.validFor & denot2.validFor, pre, denot1.isRefinedMethod || denot2.isRefinedMethod) + else if symScore == 2 then denot1 + else if symScore == -2 then denot2 + else + overload.println(i"overloaded with same signature: ${sym1.showLocated}: $info1 / ${sym2.showLocated}: $info2, info = ${info1.getClass}, ${info2.getClass}, $jointInfo") + MultiDenotation(denot1, denot2) + end mergeSingleDenot + + if (this eq that) this + else if (!this.exists) that + else if (!that.exists) this + else that match { + case that: SingleDenotation => + val r = mergeDenot(this, that) + if (r.exists) r else MultiDenotation(this, that) + case that @ MultiDenotation(denot1, denot2) => + this.meet(denot1, pre).meet(denot2, pre) + } + } + + final def asSingleDenotation: SingleDenotation = asInstanceOf[SingleDenotation] + final def asSymDenotation: SymDenotation = asInstanceOf[SymDenotation] + + def toText(printer: Printer): Text = printer.toText(this) + + // ------ PreDenotation ops ---------------------------------------------- + + final def toDenot(pre: Type)(using Context): Denotation = this + final def containsSym(sym: Symbol): Boolean = hasUniqueSym && (symbol eq sym) + } + + // ------ Info meets ---------------------------------------------------- + + /** Merge parameter names of lambda types. If names in corresponding positions match, keep them, + * otherwise generate new synthetic names. + */ + private def mergeParamNames(tp1: LambdaType, tp2: LambdaType): List[tp1.ThisName] = + (for ((name1, name2, idx) <- tp1.paramNames.lazyZip(tp2.paramNames).lazyZip(tp1.paramNames.indices)) + yield if (name1 == name2) name1 else tp1.companion.syntheticParamName(idx)).toList + + /** Normally, `tp1 & tp2`, with extra care taken to return `tp1` or `tp2` directly if that's + * a valid answer. Special cases for matching methods and classes, with + * the possibility of returning NoType. Special handling of ExprTypes, where mixed + * intersections widen the ExprType away. 
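+ * Illustrative sketch (schematic notation, not actual code): + * + * infoMeet(ExprType(A), ExprType(B), _) ~~> ExprType(A & B) + * infoMeet((x: Int): A, (x: Int): B, _) ~~> (x: Int): A & B // matching method parameters + * infoMeet((x: Int): A, PolyType(...), _) ~~> NoType // a method type and a poly type do not merge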
+ */ + def infoMeet(tp1: Type, tp2: Type, safeIntersection: Boolean)(using Context): Type = + if tp1 eq tp2 then tp1 + else tp1 match + case tp1: TypeBounds => + tp2 match + case tp2: TypeBounds => if safeIntersection then tp1 safe_& tp2 else tp1 & tp2 + case tp2: ClassInfo => tp2 + case _ => NoType + case tp1: ClassInfo => + tp2 match + case tp2: ClassInfo if tp1.cls eq tp2.cls => tp1.derivedClassInfo(tp1.prefix & tp2.prefix) + case tp2: TypeBounds => tp1 + case _ => NoType + case tp1: MethodType => + tp2 match + case tp2: MethodType + if TypeComparer.matchingMethodParams(tp1, tp2) + && tp1.isImplicitMethod == tp2.isImplicitMethod + && tp1.isErasedMethod == tp2.isErasedMethod => + val resType = infoMeet(tp1.resType, tp2.resType.subst(tp2, tp1), safeIntersection) + if resType.exists then + tp1.derivedLambdaType(mergeParamNames(tp1, tp2), tp1.paramInfos, resType) + else NoType + case _ => NoType + case tp1: PolyType => + tp2 match + case tp2: PolyType if tp1.paramNames.hasSameLengthAs(tp2.paramNames) => + val resType = infoMeet(tp1.resType, tp2.resType.subst(tp2, tp1), safeIntersection) + if resType.exists then + tp1.derivedLambdaType( + mergeParamNames(tp1, tp2), + tp1.paramInfos.zipWithConserve(tp2.paramInfos)( _ & _ ), + resType) + else NoType + case _ => NoType + case ExprType(rtp1) => + tp2 match + case ExprType(rtp2) => ExprType(rtp1 & rtp2) + case _ => infoMeet(rtp1, tp2, safeIntersection) + case _ => + tp2 match + case _: MethodType | _: PolyType => NoType + case _ => tp1 & tp2.widenExpr + end infoMeet + + /** A non-overloaded denotation */ + abstract class SingleDenotation(symbol: Symbol, initInfo: Type) extends Denotation(symbol, initInfo) { + protected def newLikeThis(symbol: Symbol, info: Type, pre: Type, isRefinedMethod: Boolean): SingleDenotation + + final def name(using Context): Name = symbol.name + + /** For SymDenotation, this is NoPrefix. For other denotations this is the prefix + * under which the denotation was constructed. + * + * Note that `asSeenFrom` might return a `SymDenotation` and therefore in + * general one cannot rely on `prefix` being set, see + * `Config.reuseSymDenotations` for details. + */ + def prefix: Type = NoPrefix + + /** True if the info of this denotation comes from a refinement. */ + def isRefinedMethod: Boolean = false + + /** For SymDenotations, the language-specific signature of the info, depending on + * where the symbol is defined. For non-SymDenotations, the Scala 3 + * signature. + * + * Invariants: + * - Before erasure, the signature of a denotation is always equal to the + * signature of its corresponding initial denotation. + * - Two distinct overloads will have SymDenotations with distinct + * signatures (the SELECTin tag in Tasty relies on this to refer to an + * overload unambiguously). Note that this only applies to + * SymDenotations, in general we cannot assume that distinct + * SingleDenotations will have distinct signatures (cf #9050). + */ + final def signature(using Context): Signature = + signature(sourceLanguage = if isType || !this.isInstanceOf[SymDenotation] then SourceLanguage.Scala3 else SourceLanguage(symbol)) + + /** Overload of `signature` which lets the caller pick the language used + * to compute the signature of the info. Useful to match denotations defined in + * different classes (see `matchesLoosely`). 
+ */ + def signature(sourceLanguage: SourceLanguage)(using Context): Signature = + if (isType) Signature.NotAMethod // don't force info if this is a type denotation + else info match { + case info: MethodOrPoly => + try info.signature(sourceLanguage) + catch { // !!! DEBUG + case scala.util.control.NonFatal(ex) => + report.echo(s"cannot take signature of $info") + throw ex + } + case _ => Signature.NotAMethod + } + + def derivedSingleDenotation(symbol: Symbol, info: Type, pre: Type = this.prefix, isRefinedMethod: Boolean = this.isRefinedMethod)(using Context): SingleDenotation = + if ((symbol eq this.symbol) && (info eq this.info) && (pre eq this.prefix) && (isRefinedMethod == this.isRefinedMethod)) this + else newLikeThis(symbol, info, pre, isRefinedMethod) + + def mapInfo(f: Type => Type)(using Context): SingleDenotation = + derivedSingleDenotation(symbol, f(info)) + + inline def orElse(inline that: SingleDenotation): SingleDenotation = if (this.exists) this else that + + def altsWith(p: Symbol => Boolean): List[SingleDenotation] = + if (exists && p(symbol)) this :: Nil else Nil + + def suchThat(p: Symbol => Boolean)(using Context): SingleDenotation = + if (exists && p(symbol)) this else NoDenotation + + def hasAltWith(p: SingleDenotation => Boolean): Boolean = + exists && p(this) + + def accessibleFrom(pre: Type, superAccess: Boolean)(using Context): Denotation = + if (!symbol.exists || symbol.isAccessibleFrom(pre, superAccess)) this else NoDenotation + + def atSignature(sig: Signature, targetName: Name, site: Type, relaxed: Boolean)(using Context): SingleDenotation = + val situated = if site == NoPrefix then this else asSeenFrom(site) + val sigMatches = sig.matchDegree(situated.signature) match + case FullMatch => + true + case MethodNotAMethodMatch => + // See comment in `matches` + relaxed && !symbol.is(JavaDefined) + case ParamMatch => + relaxed + case noMatch => + false + if sigMatches && symbol.hasTargetName(targetName) then this else NoDenotation + + def matchesImportBound(bound: Type)(using Context): Boolean = + if bound.isRef(defn.NothingClass) then false + else if bound.isAny then true + else NoViewsAllowed.normalizedCompatible(info, bound, keepConstraint = false) + + // ------ Transformations ----------------------------------------- + + private var myValidFor: Period = Nowhere + + def validFor: Period = myValidFor + def validFor_=(p: Period): Unit = { + myValidFor = p + symbol.invalidateDenotCache() + } + + /** The next SingleDenotation in this run, with wrap-around from last to first. + * + * There may be several `SingleDenotation`s with different validity + * representing the same underlying definition at different phases. + * These are called a "flock". Flock members are generated by + * @See current. Flock members are connected in a ring + * with their `nextInRun` fields. + * + * There are the following invariants concerning flock members + * + * 1) validity periods are non-overlapping + * 2) the union of all validity periods is a contiguous + * interval. + */ + protected var nextInRun: SingleDenotation = this + + /** The version of this SingleDenotation that was valid in the first phase + * of this run. 
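+ * Illustrative example (phase numbers are hypothetical): a flock may hold one denotation + * valid for phases 1..27 of the run and another valid for phases 28..N after a transformer + * changed the info; `initial` returns the member that covers the first phase.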
+ */ + def initial: SingleDenotation = + if (validFor.firstPhaseId <= 1) this + else { + var current = nextInRun + while (current.validFor.code > this.myValidFor.code) current = current.nextInRun + current + } + + def history: List[SingleDenotation] = { + val b = new ListBuffer[SingleDenotation] + var current = initial + while ({ + b += (current) + current = current.nextInRun + current ne initial + }) + () + b.toList + } + + /** Invalidate all caches and fields that depend on base classes and their contents */ + def invalidateInheritedInfo(): Unit = () + + private def updateValidity()(using Context): this.type = { + assert( + ctx.runId >= validFor.runId + || ctx.settings.YtestPickler.value // mixing test pickler with debug printing can travel back in time + || ctx.mode.is(Mode.Printing) // no use to be picky when printing error messages + || symbol.isOneOf(ValidForeverFlags), + s"denotation $this invalid in run ${ctx.runId}. ValidFor: $validFor") + var d: SingleDenotation = this + while ({ + d.validFor = Period(ctx.runId, d.validFor.firstPhaseId, d.validFor.lastPhaseId) + d.invalidateInheritedInfo() + d = d.nextInRun + d ne this + }) + () + this + } + + /** Move the validity period of this denotation to a new run. Throw a StaleSymbol error + * if the denotation is no longer valid. + * However, a StaleSymbol error is not thrown in the following situations: + * + * - If acceptStale returns true (e.g. because we are in the IDE), + * update the symbol to the new version if it exists, or return + * the old version otherwise. + * - If the symbol did not have a denotation that was defined at the current phase, + * return a NoDenotation instead. + */ + private def bringForward()(using Context): SingleDenotation = { + this match { + case symd: SymDenotation => + if (stillValid(symd)) return updateValidity() + if acceptStale(symd) && symd.initial.validFor.firstPhaseId <= ctx.lastPhaseId then + // New run might have fewer phases than old, so symbol might no longer be + // visible at all. TabCompleteTests have examples where this happens. + return symd.currentSymbol.denot.orElse(symd).updateValidity() + case _ => + } + if (!symbol.exists) return updateValidity() + if (!coveredInterval.containsPhaseId(ctx.phaseId)) return NoDenotation + if (ctx.debug) traceInvalid(this) + staleSymbolError + } + + /** The next defined denotation (following `nextInRun`) or an arbitrary + * undefined denotation, if all denotations in a `nextInRun` cycle are + * undefined. + */ + private def nextDefined: SingleDenotation = { + var p1 = this + var p2 = nextInRun + while (p1.validFor == Nowhere && (p1 ne p2)) { + p1 = p1.nextInRun + p2 = p2.nextInRun.nextInRun + } + p1 + } + + /** Skip any denotations that have been removed by an installAfter or that + * are otherwise undefined. + */ + def skipRemoved(using Context): SingleDenotation = + if (myValidFor.code <= 0) nextDefined else this + + /** Produce a denotation that is valid for the given context. + * Usually called when !(validFor contains ctx.period) + * (even though this is not a precondition). + * If the runId of the context is the same as the runId of this denotation, + * the right flock member is located, or, if it does not exist yet, + * created by invoking a transformer (@See Transformers). + * If the runIds differ, but this denotation is a SymDenotation + * and its toplevel owner class or module + * is still a member of its enclosing package, then the whole flock + * is brought forward to be valid in the new runId.
Otherwise + * the symbol is stale, which constitutes an internal error. + */ + def current(using Context): SingleDenotation = + util.Stats.record("current") + val currentPeriod = ctx.period + val valid = myValidFor + + def assertNotPackage(d: SingleDenotation, transformer: DenotTransformer) = d match + case d: ClassDenotation => + assert(!d.is(Package), s"illegal transformation of package denotation by transformer $transformer") + case _ => + + def escapeToNext = nextDefined.ensuring(_.validFor != Nowhere) + + def toNewRun = + util.Stats.record("current.bringForward") + if exists then initial.bringForward().current else this + + def goForward = + var cur = this + // search for containing period as long as nextInRun increases. + var next = nextInRun + while next.validFor.code > valid.code && !(next.validFor contains currentPeriod) do + cur = next + next = next.nextInRun + if next.validFor.code > valid.code then + // in this case, next.validFor contains currentPeriod + cur = next + cur + else + //println(s"might need new denot for $cur, valid for ${cur.validFor} at $currentPeriod") + // not found, cur points to highest existing variant + val nextTransformerId = ctx.base.nextDenotTransformerId(cur.validFor.lastPhaseId) + if currentPeriod.lastPhaseId <= nextTransformerId then + cur.validFor = Period(currentPeriod.runId, cur.validFor.firstPhaseId, nextTransformerId) + else + var startPid = nextTransformerId + 1 + val transformer = ctx.base.denotTransformers(nextTransformerId) + //println(s"transforming $this with $transformer") + val savedPeriod = ctx.period + val mutCtx = ctx.asInstanceOf[FreshContext] + try + mutCtx.setPhase(transformer) + next = transformer.transform(cur) + // We temporarily update the context with the new phase instead of creating a + // new one. This is done for performance. We cut down on about 30% of context + // creations that way, and also avoid phase caches in contexts to get large. + // To work correctly, we need to demand that the context with the new phase + // is not retained in the result. + catch case ex: CyclicReference => + // println(s"error while transforming $this") + throw ex + finally + mutCtx.setPeriod(savedPeriod) + if next eq cur then + startPid = cur.validFor.firstPhaseId + else + assertNotPackage(next, transformer) + next.insertAfter(cur) + cur = next + cur.validFor = Period(currentPeriod.runId, startPid, transformer.lastPhaseId) + //printPeriods(cur) + //println(s"new denot: $cur, valid for ${cur.validFor}") + cur.current // multiple transformations could be required + end goForward + + def goBack: SingleDenotation = + // currentPeriod < end of valid; in this case a version must exist + // but to be defensive we check for infinite loop anyway + var cur = this + var cnt = 0 + while !(cur.validFor contains currentPeriod) do + //println(s"searching: $cur at $currentPeriod, valid for ${cur.validFor}") + cur = cur.nextInRun + // Note: One might be tempted to add a `prev` field to get to the new denotation + // more directly here. I tried that, but it degrades rather than improves + // performance: Test setup: Compile everything in dotc and immediate subdirectories + // 10 times. Best out of 10: 18154ms with `prev` field, 17777ms without. + cnt += 1 + if cnt > MaxPossiblePhaseId then + return atPhase(coveredInterval.firstPhaseId)(current) + cur + end goBack + + if valid.code <= 0 then + // can happen if we sit on a stale denotation which has been replaced + // wholesale by an installAfter; in this case, proceed to the next + // denotation and try again. 
+ escapeToNext + else if valid.runId != currentPeriod.runId then + toNewRun + else if currentPeriod.code > valid.code then + goForward + else + goBack + end current + + private def demandOutsideDefinedMsg(using Context): String = + s"demanding denotation of $this at phase ${ctx.phase}(${ctx.phaseId}) outside defined interval: defined periods are${definedPeriodsString}" + + /** Install this denotation to be the result of the given denotation transformer. + * This is the implementation of the same-named method in SymDenotations. + * It's placed here because it needs access to private fields of SingleDenotation. + * @pre Can only be called in `phase.next`. + */ + protected def installAfter(phase: DenotTransformer)(using Context): Unit = { + val targetId = phase.next.id + if (ctx.phaseId != targetId) atPhase(phase.next)(installAfter(phase)) + else { + val current = symbol.current + // println(s"installing $this after $phase/${phase.id}, valid = ${current.validFor}") + // printPeriods(current) + this.validFor = Period(ctx.runId, targetId, current.validFor.lastPhaseId) + if (current.validFor.firstPhaseId >= targetId) + current.replaceWith(this) + else { + current.validFor = Period(ctx.runId, current.validFor.firstPhaseId, targetId - 1) + insertAfter(current) + } + } + // printPeriods(this) + } + + /** Apply a transformation `f` to all denotations in this group that start at or after + * given phase. Denotations are replaced while keeping the same validity periods. + */ + protected def transformAfter(phase: DenotTransformer, f: SymDenotation => SymDenotation)(using Context): Unit = { + var current = symbol.current + while (current.validFor.firstPhaseId < phase.id && (current.nextInRun.validFor.code > current.validFor.code)) + current = current.nextInRun + var hasNext = true + while ((current.validFor.firstPhaseId >= phase.id) && hasNext) { + val current1: SingleDenotation = f(current.asSymDenotation) + if (current1 ne current) { + current1.validFor = current.validFor + current.replaceWith(current1) + } + hasNext = current1.nextInRun.validFor.code > current1.validFor.code + current = current1.nextInRun + } + } + + /** Insert this denotation so that it follows `prev`. */ + private def insertAfter(prev: SingleDenotation) = { + this.nextInRun = prev.nextInRun + prev.nextInRun = this + } + + /** Insert this denotation instead of `old`. + * Also ensure that `old` refers with `nextInRun` to this denotation + * and set its `validFor` field to `Nowhere`. This is necessary so that + * references to the old denotation can be brought forward via `current` + * to a valid denotation. + * + * The code to achieve this is subtle in that it works correctly + * whether the replaced denotation is the only one in its cycle or not. + */ + private[dotc] def replaceWith(newd: SingleDenotation): Unit = { + var prev = this + while (prev.nextInRun ne this) prev = prev.nextInRun + // order of next two assignments is important! + prev.nextInRun = newd + newd.nextInRun = nextInRun + validFor = Nowhere + nextInRun = newd + } + + def staleSymbolError(using Context): Nothing = + inDetachedContext: + throw new StaleSymbol(staleSymbolMsg) + + def staleSymbolMsg(using Context): String = { + def ownerMsg = this match { + case denot: SymDenotation => s"in ${denot.owner}" + case _ => "" + } + s"stale symbol; $this#${symbol.id} $ownerMsg, defined in ${myValidFor}, is referred to in run ${ctx.period}" + } + + /** The period (interval of phases) for which there exists + * a valid denotation in this flock. 
+ */ + def coveredInterval(using Context): Period = { + var cur = this + var cnt = 0 + var interval = validFor + while ({ + cur = cur.nextInRun + cnt += 1 + assert(cnt <= MaxPossiblePhaseId, demandOutsideDefinedMsg) + interval |= cur.validFor + cur ne this + }) + () + interval + } + + /** Show declaration string; useful for showing declarations + * as seen from subclasses. + */ + def showDcl(using Context): String = ctx.printer.dclText(this).show + + override def toString: String = + if (symbol == NoSymbol) symbol.toString + else s"<SingleDenotation of type $infoOrCompleter>" + + def definedPeriodsString: String = { + var sb = new StringBuilder() + var cur = this + var cnt = 0 + while ({ + sb.append(" " + cur.validFor) + cur = cur.nextInRun + cnt += 1 + if (cnt > MaxPossiblePhaseId) { sb.append(" ..."); cur = this } + cur ne this + }) + () + sb.toString + } + + // ------ PreDenotation ops ---------------------------------------------- + + final def first: SingleDenotation = this + final def last: SingleDenotation = this + + def matches(other: SingleDenotation)(using Context): Boolean = + symbol.hasTargetName(other.symbol.targetName) + && matchesLoosely(other) + + /** `matches` without a target name check. + * + * For definitions coming from different languages, we pick a common + * language to compute their signatures. This allows us for example to + * override some Java definitions from Scala even if they have a different + * erasure (see i8615b, i9109b); Erasure takes care of adding any necessary + * bridge to make this work at runtime. + */ + def matchesLoosely(other: SingleDenotation, alwaysCompareTypes: Boolean = false)(using Context): Boolean = + if isType then true + else + val thisLanguage = SourceLanguage(symbol) + val otherLanguage = SourceLanguage(other.symbol) + val commonLanguage = SourceLanguage.commonLanguage(thisLanguage, otherLanguage) + val sig = signature(commonLanguage) + val otherSig = other.signature(commonLanguage) + sig.matchDegree(otherSig) match + case FullMatch => + !alwaysCompareTypes || info.matches(other.info) + case MethodNotAMethodMatch => + !ctx.erasedTypes && { + // A Scala zero-parameter method and a Scala non-method always match. + if !thisLanguage.isJava && !otherLanguage.isJava then + true + // Java allows defining both a field and a zero-parameter method with the same name, + // so they must not match. + else if thisLanguage.isJava && otherLanguage.isJava then + false + // A Java field never matches a Scala method.
+ else if thisLanguage.isJava then + symbol.is(Method) + else // otherLanguage.isJava + other.symbol.is(Method) + } + case ParamMatch => + // The signatures do not tell us enough to be sure about matching + !ctx.erasedTypes && info.matches(other.info) + case noMatch => + false + + def mapInherited(ownDenots: PreDenotation, prevDenots: PreDenotation, pre: Type)(using Context): SingleDenotation = + if hasUniqueSym && prevDenots.containsSym(symbol) then NoDenotation + else if isType then filterDisjoint(ownDenots).asSeenFrom(pre) + else asSeenFrom(pre).filterDisjoint(ownDenots) + + def filterWithPredicate(p: SingleDenotation => Boolean): SingleDenotation = + if (p(this)) this else NoDenotation + def filterDisjoint(denots: PreDenotation)(using Context): SingleDenotation = + if (denots.exists && denots.matches(this)) NoDenotation else this + def filterWithFlags(required: FlagSet, excluded: FlagSet)(using Context): SingleDenotation = + val realExcluded = if ctx.isAfterTyper then excluded else excluded | Invisible + def symd: SymDenotation = this match + case symd: SymDenotation => symd + case _ => symbol.denot + if !required.isEmpty && !symd.isAllOf(required) + || symd.isOneOf(realExcluded) then NoDenotation + else this + def aggregate[T](f: SingleDenotation => T, g: (T, T) => T): T = f(this) + + type AsSeenFromResult = SingleDenotation + + protected def computeAsSeenFrom(pre: Type)(using Context): SingleDenotation = { + val symbol = this.symbol + val owner = this match { + case thisd: SymDenotation => thisd.owner + case _ => if (symbol.exists) symbol.owner else NoSymbol + } + + /** The derived denotation with the given `info` transformed with `asSeenFrom`. + * + * As a performance hack, we might reuse an existing SymDenotation, + * instead of creating a new denotation with a given `prefix`, + * see `Config.reuseSymDenotations`. + */ + def derived(info: Type) = + /** Do we need to return a denotation with a prefix set? */ + def needsPrefix = + // For opaque types, the prefix is used in `ElimOpaques#transform`, + // without this i7159.scala would fail when compiled from tasty. + symbol.is(Opaque) + + val derivedInfo = info.asSeenFrom(pre, owner) + if Config.reuseSymDenotations && this.isInstanceOf[SymDenotation] + && (derivedInfo eq info) && !needsPrefix then + this + else + derivedSingleDenotation(symbol, derivedInfo, pre) + end derived + + // It could happen that we see the symbol with prefix `this` as a member of a different class + // through a self type and that it then has a different info. In this case we have to go + // through the asSeenFrom to switch the type back. Test case is pos/i9352.scala. + def hasOriginalInfo: Boolean = this match + case sd: SymDenotation => true + case _ => info eq symbol.info + + def ownerIsPrefix = pre match + case pre: ThisType => pre.sameThis(owner.thisType) + case _ => false + + if !owner.membersNeedAsSeenFrom(pre) && (!ownerIsPrefix || hasOriginalInfo) + || symbol.is(NonMember) + then this + else if symbol.isAllOf(ClassTypeParam) then + val arg = symbol.typeRef.argForParam(pre, widenAbstract = true) + if arg.exists + then derivedSingleDenotation(symbol, normalizedArgBounds(arg.bounds), pre) + else derived(symbol.info) + else derived(symbol.info) + } + + /** The argument bounds, possibly intersected with the parameter's info TypeBounds, + * if the latter is not F-bounded and does not refer to other type parameters + * of the same class, and the intersection is provably nonempty.
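+ * Illustrative example (hypothetical): for `class C[T <: AnyRef]` seen from the prefix `C[String]`, + * the argument bounds `>: String <: String` are intersected with the parameter's `<: AnyRef`, + * giving `>: String <: String & AnyRef`; since that range is provably nonempty, the combined + * bounds are used, otherwise the plain argument bounds would be kept.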
+ */ + private def normalizedArgBounds(argBounds: TypeBounds)(using Context): TypeBounds = + if symbol.isCompleted && !hasBoundsDependingOnParamsOf(symbol.owner) then + val combined @ TypeBounds(lo, hi) = symbol.info.bounds & argBounds + if (lo frozen_<:< hi) then combined + else argBounds + else argBounds + + private def hasBoundsDependingOnParamsOf(cls: Symbol)(using Context): Boolean = + val acc = new TypeAccumulator[Boolean]: + def apply(x: Boolean, tp: Type): Boolean = tp match + case _: LazyRef => true + case tp: TypeRef + if tp.symbol.isAllOf(ClassTypeParam) && tp.symbol.owner == cls => true + case _ => foldOver(x, tp) + acc(false, symbol.info) + } + + abstract class NonSymSingleDenotation(symbol: Symbol, initInfo: Type, override val prefix: Type) extends SingleDenotation(symbol, initInfo) { + def infoOrCompleter: Type = initInfo + def isType: Boolean = infoOrCompleter.isInstanceOf[TypeType] + } + + class UniqueRefDenotation( + symbol: Symbol, + initInfo: Type, + initValidFor: Period, + prefix: Type) extends NonSymSingleDenotation(symbol, initInfo, prefix) { + validFor = initValidFor + override def hasUniqueSym: Boolean = true + protected def newLikeThis(s: Symbol, i: Type, pre: Type, isRefinedMethod: Boolean): SingleDenotation = + if isRefinedMethod then + new JointRefDenotation(s, i, validFor, pre, isRefinedMethod) + else + new UniqueRefDenotation(s, i, validFor, pre) + } + + class JointRefDenotation( + symbol: Symbol, + initInfo: Type, + initValidFor: Period, + prefix: Type, + override val isRefinedMethod: Boolean) extends NonSymSingleDenotation(symbol, initInfo, prefix) { + validFor = initValidFor + override def hasUniqueSym: Boolean = false + protected def newLikeThis(s: Symbol, i: Type, pre: Type, isRefinedMethod: Boolean): SingleDenotation = + new JointRefDenotation(s, i, validFor, pre, isRefinedMethod) + } + + class ErrorDenotation(using DetachedContext) extends NonSymSingleDenotation(NoSymbol, NoType, NoType) { + override def exists: Boolean = false + override def hasUniqueSym: Boolean = false + validFor = Period.allInRun(ctx.runId) + protected def newLikeThis(s: Symbol, i: Type, pre: Type, isRefinedMethod: Boolean): SingleDenotation = + this + } + + /** An error denotation that provides more info about the missing reference. + * Produced by staticRef, consumed by requiredSymbol. + */ + case class MissingRef(val owner: SingleDenotation, name: Name)(using DetachedContext) extends ErrorDenotation { + val ex: Exception = new Exception // DEBUG + } + + /** An error denotation that provides more info about alternatives + * that were found but that do not qualify. + * Produced by staticRef, consumed by requiredSymbol. 
+ */ + case class NoQualifyingRef(alts: List[SingleDenotation])(using DetachedContext) extends ErrorDenotation + + /** A double definition + */ + def isDoubleDef(sym1: Symbol, sym2: Symbol)(using Context): Boolean = + (sym1.exists && sym2.exists && + (sym1 `ne` sym2) && (sym1.effectiveOwner `eq` sym2.effectiveOwner) && + !sym1.is(Bridge) && !sym2.is(Bridge)) + + // --- Overloaded denotations and predenotations ------------------------------------------------- + + trait MultiPreDenotation extends PreDenotation { + def denot1: PreDenotation + def denot2: PreDenotation + + assert(denot1.exists && denot2.exists, s"Union of non-existing denotations ($denot1) and ($denot2)") + def first: Denotation = denot1.first + def last: Denotation = denot2.last + def matches(other: SingleDenotation)(using Context): Boolean = + denot1.matches(other) || denot2.matches(other) + def mapInherited(owndenot: PreDenotation, prevdenot: PreDenotation, pre: Type)(using Context): PreDenotation = + derivedUnion(denot1.mapInherited(owndenot, prevdenot, pre), denot2.mapInherited(owndenot, prevdenot, pre)) + def filterWithPredicate(p: SingleDenotation => Boolean): PreDenotation = + derivedUnion(denot1 filterWithPredicate p, denot2 filterWithPredicate p) + def filterDisjoint(denot: PreDenotation)(using Context): PreDenotation = + derivedUnion(denot1 filterDisjoint denot, denot2 filterDisjoint denot) + def filterWithFlags(required: FlagSet, excluded: FlagSet)(using Context): PreDenotation = + derivedUnion(denot1.filterWithFlags(required, excluded), denot2.filterWithFlags(required, excluded)) + def aggregate[T](f: SingleDenotation => T, g: (T, T) => T): T = + g(denot1.aggregate(f, g), denot2.aggregate(f, g)) + protected def derivedUnion(denot1: PreDenotation, denot2: PreDenotation) = + if ((denot1 eq this.denot1) && (denot2 eq this.denot2)) this + else denot1 union denot2 + } + + final case class DenotUnion(denot1: PreDenotation, denot2: PreDenotation) extends MultiPreDenotation { + def exists: Boolean = true + def toDenot(pre: Type)(using Context): Denotation = + denot1.toDenot(pre).meet(denot2.toDenot(pre), pre) + def containsSym(sym: Symbol): Boolean = + (denot1 containsSym sym) || (denot2 containsSym sym) + type AsSeenFromResult = PreDenotation + def computeAsSeenFrom(pre: Type)(using Context): PreDenotation = + derivedUnion(denot1.asSeenFrom(pre), denot2.asSeenFrom(pre)) + } + + /** An overloaded denotation consisting of the alternatives of both given denotations. 
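+ * For example, given `class C { def f(x: Int): Int; def f(x: String): String }`, the member + * `f` of `C` is a MultiDenotation whose `alternatives` are the two single denotations; + * callers select one of them via `atSignature`, `suchThat`, or `disambiguate`.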
+ */ + case class MultiDenotation(denot1: Denotation, denot2: Denotation) extends Denotation(NoSymbol, NoType) with MultiPreDenotation { + final def infoOrCompleter: Type = multiHasNot("info") + final def validFor: Period = denot1.validFor & denot2.validFor + final def isType: Boolean = false + final def hasUniqueSym: Boolean = false + final def name(using Context): Name = denot1.name + final def signature(using Context): Signature = Signature.OverloadedSignature + def atSignature(sig: Signature, targetName: Name, site: Type, relaxed: Boolean)(using Context): Denotation = + if (sig eq Signature.OverloadedSignature) this + else derivedUnionDenotation( + denot1.atSignature(sig, targetName, site, relaxed), + denot2.atSignature(sig, targetName, site, relaxed)) + def current(using Context): Denotation = + derivedUnionDenotation(denot1.current, denot2.current) + def altsWith(p: Symbol => Boolean): List[SingleDenotation] = + denot1.altsWith(p) ++ denot2.altsWith(p) + def suchThat(p: Symbol => Boolean)(using Context): SingleDenotation = { + val sd1 = denot1.suchThat(p) + val sd2 = denot2.suchThat(p) + if sd1.exists then + if sd2.exists then + throw TypeError( + em"""Failure to disambiguate overloaded reference with + | ${denot1.symbol.showLocated}: ${denot1.info} and + | ${denot2.symbol.showLocated}: ${denot2.info}""") + else sd1 + else sd2 + } + override def filterWithPredicate(p: SingleDenotation => Boolean): Denotation = + derivedUnionDenotation(denot1.filterWithPredicate(p), denot2.filterWithPredicate(p)) + def hasAltWith(p: SingleDenotation => Boolean): Boolean = + denot1.hasAltWith(p) || denot2.hasAltWith(p) + def accessibleFrom(pre: Type, superAccess: Boolean)(using Context): Denotation = { + val d1 = denot1 accessibleFrom (pre, superAccess) + val d2 = denot2 accessibleFrom (pre, superAccess) + if (!d1.exists) d2 + else if (!d2.exists) d1 + else derivedUnionDenotation(d1, d2) + } + def mapInfo(f: Type => Type)(using Context): Denotation = + derivedUnionDenotation(denot1.mapInfo(f), denot2.mapInfo(f)) + def derivedUnionDenotation(d1: Denotation, d2: Denotation): Denotation = + if ((d1 eq denot1) && (d2 eq denot2)) this + else if (!d1.exists) d2 + else if (!d2.exists) d1 + else MultiDenotation(d1, d2) + type AsSeenFromResult = Denotation + def computeAsSeenFrom(pre: Type)(using Context): Denotation = + derivedUnionDenotation(denot1.asSeenFrom(pre), denot2.asSeenFrom(pre)) + override def toString: String = alternatives.mkString(" <and> ") + + private def multiHasNot(op: String): Nothing = + throw new UnsupportedOperationException( + s"multi-denotation with alternatives $alternatives does not implement operation $op") + } + + /** The current denotation of the static reference given by path, + * or a MissingRef or NoQualifyingRef instance, if it does not exist.
+ * if generateStubs is set, generates stubs for missing top-level symbols + */ + def staticRef(path: Name, generateStubs: Boolean = true, isPackage: Boolean = false)(using Context): Denotation = { + def select(prefix: Denotation, selector: Name): Denotation = { + val owner = prefix.disambiguate(_.info.isParameterless) + def isPackageFromCoreLibMissing: Boolean = + // if the scala package is missing, the stdlib must be missing + owner.symbol == defn.RootClass && selector == nme.scala + if (owner.exists) { + val result = if (isPackage) owner.info.decl(selector) else owner.info.member(selector) + if (result.exists) result + else if (isPackageFromCoreLibMissing) throw new MissingCoreLibraryException(selector.toString) + else { + val alt = + if (generateStubs) missingHook(owner.symbol.moduleClass, selector) + else NoSymbol + if (alt.exists) alt.denot + else MissingRef(owner, selector) + } + } + else owner + } + def recur( + path: Name, + wrap: TermName -> Name = identity[Name] // !cc! default argument needs to be instantiated, error if [Name] is dropped + ): Denotation = path match { + case path: TypeName => + recur(path.toTermName, n => n.toTypeName) + case ModuleClassName(underlying) => + recur(underlying, n => wrap(ModuleClassName(n))) + case QualifiedName(prefix, selector) => + select(recur(prefix), wrap(selector)) + case qn @ AnyQualifiedName(prefix, _) => + recur(prefix, n => wrap(qn.info.mkString(n).toTermName)) + case path: SimpleName => + def recurSimple(len: Int, wrap: TermName -> Name): Denotation = { + val point = path.lastIndexOf('.', len - 1) + val selector = wrap(path.slice(point + 1, len).asTermName) + val prefix = + if (point > 0) recurSimple(point, identity) + else if (selector.isTermName) defn.RootClass.denot + else defn.EmptyPackageClass.denot + select(prefix, selector) + } + recurSimple(path.length, wrap) + } + + val run = ctx.run + if run == null then recur(path) + else run.staticRefs.getOrElseUpdate(path, recur(path)) + } + + /** If we are looking for a non-existing term name in a package, + * assume it is a package for which we do not have a directory and + * enter it. + */ + def missingHook(owner: Symbol, name: Name)(using Context): Symbol = + if (owner.is(Package) && name.isTermName) + newCompletePackageSymbol(owner, name.asTermName).entered + else + NoSymbol + + /** An exception for accessing symbols that are no longer valid in current run */ + class StaleSymbol(msg: -> String) extends Exception { + util.Stats.record("stale symbol") + override def getMessage(): String = msg + } +} diff --git a/tests/pos-with-compiler-cc/dotc/core/Flags.scala b/tests/pos-with-compiler-cc/dotc/core/Flags.scala new file mode 100644 index 000000000000..f23dce020f10 --- /dev/null +++ b/tests/pos-with-compiler-cc/dotc/core/Flags.scala @@ -0,0 +1,612 @@ +package dotty.tools.dotc +package core + +object Flags { + + object opaques { + + /** A FlagSet represents a set of flags. Flags are encoded as follows: + * The first two bits indicate whether a flag set applies to terms, + * to types, or to both. Bits 2..63 are available for properties + * and can be doubly used for terms and types. 
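+ * Illustrative encoding (in terms of the constants defined further down): + * + * Private (applies to both kinds) == KINDFLAGS | (1L << 2) // bits 0, 1 and 2 set + * Method (term flag only) == TERMS | (1L << 7) // bits 0 and 7 set + * Trait (type flag only) == TYPES | (1L << 10) // bits 1 and 10 set + * + * A test such as `x.is(Method)` requires both the kind bits and the property bits + * of the intersection to be non-empty.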
+ */ + opaque type FlagSet = Long + def FlagSet(bits: Long): FlagSet = bits + def toBits(fs: FlagSet): Long = fs + + /** A flag set consisting of a single flag */ + opaque type Flag <: FlagSet = Long + private[Flags] def Flag(bits: Long): Flag = bits + } + export opaques.FlagSet + + type Flag = opaques.Flag + + extension (x: FlagSet) { + + inline def bits: Long = opaques.toBits(x) + + /** The union of the given flag sets. + * Combining two FlagSets with `|` will give a FlagSet + * that has the intersection of the applicability to terms/types + * of the two flag sets. It is checked that the intersection is not empty. + */ + def | (y: FlagSet): FlagSet = + if (x.bits == 0) y + else if (y.bits == 0) x + else { + val tbits = x.bits & y.bits & KINDFLAGS + if (tbits == 0) + assert(false, s"illegal flagset combination: ${x.flagsString} and ${y.flagsString}") + FlagSet(tbits | ((x.bits | y.bits) & ~KINDFLAGS)) + } + + /** The intersection of the given flag sets */ + def & (y: FlagSet): FlagSet = FlagSet(x.bits & y.bits) + + /** The intersection of a flag set with the complement of another flag set */ + def &~ (y: FlagSet): FlagSet = { + val tbits = x.bits & KINDFLAGS + if ((tbits & y.bits) == 0) x + else FlagSet(tbits | ((x.bits & ~y.bits) & ~KINDFLAGS)) + } + + def ^ (y: FlagSet) = + FlagSet((x.bits | y.bits) & KINDFLAGS | (x.bits ^ y.bits) & ~KINDFLAGS) + + /** Does the given flag set contain the given flag? + * This means that both the kind flags and the carrier bits have non-empty intersection. + */ + def is (flag: Flag): Boolean = { + val fs = x.bits & flag.bits + (fs & KINDFLAGS) != 0 && (fs & ~KINDFLAGS) != 0 + } + + /** Does the given flag set contain the given flag + * and at the same time contain none of the flags in the `butNot` set? + */ + def is (flag: Flag, butNot: FlagSet): Boolean = x.is(flag) && !x.isOneOf(butNot) + + /** Does the given flag set have a non-empty intersection with another flag set? + * This means that both the kind flags and the carrier bits have non-empty intersection. + */ + def isOneOf (flags: FlagSet): Boolean = { + val fs = x.bits & flags.bits + (fs & KINDFLAGS) != 0 && (fs & ~KINDFLAGS) != 0 + } + + /** Does the given flag set have a non-empty intersection with another flag set, + * and at the same time contain none of the flags in the `butNot` set? + */ + def isOneOf (flags: FlagSet, butNot: FlagSet): Boolean = x.isOneOf(flags) && !x.isOneOf(butNot) + + /** Does a given flag set have all of the flags of another flag set? + * Pre: The intersection of the term/type flags of both sets must be non-empty. + */ + def isAllOf (flags: FlagSet): Boolean = { + val fs = x.bits & flags.bits + ((fs & KINDFLAGS) != 0 || flags.bits == 0) && + (fs >>> TYPESHIFT) == (flags.bits >>> TYPESHIFT) + } + + /** Does a given flag set have all of the flags in another flag set + * and at the same time contain none of the flags in the `butNot` set? + * Pre: The intersection of the term/type flags of both sets must be non-empty. + */ + def isAllOf (flags: FlagSet, butNot: FlagSet): Boolean = x.isAllOf(flags) && !x.isOneOf(butNot) + + def isEmpty: Boolean = (x.bits & ~KINDFLAGS) == 0 + + /** Is a given flag set a subset of another flag set? */ + def <= (y: FlagSet): Boolean = (x.bits & y.bits) == x.bits + + /** Does the given flag set apply to terms? */ + def isTermFlags: Boolean = (x.bits & TERMS) != 0 + + /** Does the given flag set apply to types?
*/ + def isTypeFlags: Boolean = (x.bits & TYPES) != 0 + + /** The given flag set with all flags transposed to be type flags */ + def toTypeFlags: FlagSet = if (x.bits == 0) x else FlagSet(x.bits & ~KINDFLAGS | TYPES) + + /** The given flag set with all flags transposed to be term flags */ + def toTermFlags: FlagSet = if (x.bits == 0) x else FlagSet(x.bits & ~KINDFLAGS | TERMS) + + /** The given flag set with all flags transposed to be common flags */ + def toCommonFlags: FlagSet = if (x.bits == 0) x else FlagSet(x.bits | KINDFLAGS) + + /** The number of non-kind flags in the given flag set */ + def numFlags: Int = java.lang.Long.bitCount(x.bits & ~KINDFLAGS) + + /** The lowest non-kind bit set in the given flag set */ + def firstBit: Int = java.lang.Long.numberOfTrailingZeros(x.bits & ~KINDFLAGS) + + /** The list of non-empty names of flags with given index idx that are set in the given flag set */ + private def flagString(idx: Int): List[String] = + if ((x.bits & (1L << idx)) == 0) Nil + else { + def halfString(kind: Int) = + if ((x.bits & (1L << kind)) != 0) flagName(idx)(kind) else "" + val termFS = halfString(TERMindex) + val typeFS = halfString(TYPEindex) + val strs = termFS :: (if (termFS == typeFS) Nil else typeFS :: Nil) + strs filter (_.nonEmpty) + } + + /** The list of non-empty names of flags that are set in the given flag set */ + def flagStrings(privateWithin: String = ""): Seq[String] = { + var rawStrings = (2 to MaxFlag).flatMap(x.flagString(_)) // DOTTY problem: cannot drop with (_) + if (!privateWithin.isEmpty && !x.is(Protected)) + rawStrings = rawStrings :+ "private" + val scopeStr = if (x.is(Local)) "this" else privateWithin + if (scopeStr != "") + rawStrings.filter(_ != "").map { + case "private" => s"private[$scopeStr]" + case "protected" => s"protected[$scopeStr]" + case str => str + } + else rawStrings + } + + /** The string representation of the given flag set */ + def flagsString: String = x.flagStrings("").mkString(" ") + } + + // Temporary while extension names are in flux + def or(x1: FlagSet, x2: FlagSet) = x1 | x2 + def and(x1: FlagSet, x2: FlagSet) = x1 & x2 + + def termFlagSet(x: Long) = FlagSet(TERMS | x) + + private inline val TYPESHIFT = 2 + private inline val TERMindex = 0 + private inline val TYPEindex = 1 + private inline val TERMS = 1 << TERMindex + private inline val TYPES = 1 << TYPEindex + private inline val KINDFLAGS = TERMS | TYPES + + private inline val FirstFlag = 2 + private inline val FirstNotPickledFlag = 48 + private inline val MaxFlag = 63 + + private val flagName = Array.fill(64, 2)("") + + private def isDefinedAsFlag(idx: Int) = flagName(idx).exists(_.nonEmpty) + + /** The flag set containing all defined flags of either kind whose bits + * lie in the given range + */ + private def flagRange(start: Int, end: Int) = + FlagSet((start until end).foldLeft(KINDFLAGS.toLong) ((bits, idx) => + if (isDefinedAsFlag(idx)) bits | (1L << idx) else bits)) + + /** The union of all flags in given flag set */ + def union(flagss: FlagSet*): FlagSet = { + var flag = EmptyFlags + for (f <- flagss) + flag |= f + flag + } + + def commonFlags(flagss: FlagSet*): FlagSet = union(flagss.map(_.toCommonFlags): _*) + + /** The empty flag set */ + val EmptyFlags: FlagSet = FlagSet(0) + + /** The undefined flag set */ + val UndefinedFlags: FlagSet = FlagSet(~KINDFLAGS) + + /** Three flags with given index between 2 and 63. + * The first applies to both terms and types. the second is a term flag, and + * the third is a type flag. 
Installs given name(s) as the name(s) of the flags. + * @param name The name to be used for the term flag + * @param typeName The name to be used for the type flag, if it is different from `name`. + */ + private def newFlags(index: Int, name: String, typeName: String = ""): (Flag, Flag, Flag) = { + flagName(index)(TERMindex) = name + flagName(index)(TYPEindex) = if (typeName.isEmpty) name else typeName + val bits = 1L << index + (opaques.Flag(KINDFLAGS | bits), opaques.Flag(TERMS | bits), opaques.Flag(TYPES | bits)) + } + + // ----------------- Available flags ----------------------------------------------------- + + /** Labeled with `private` modifier */ + val (Private @ _, PrivateTerm @ _, PrivateType @ _) = newFlags(2, "private") + + /** Labeled with `protected` modifier */ + val (Protected @ _, _, _) = newFlags(3, "protected") + + /** Labeled with `override` modifier */ + val (Override @ _, _, _) = newFlags(4, "override") + + /** A declared, but not defined member */ + val (Deferred @ _, DeferredTerm @ _, DeferredType @ _) = newFlags(5, "<deferred>") + + /** Labeled with `final` modifier */ + val (Final @ _, _, _) = newFlags(6, "final") + + /** A method symbol / a super trait */ + val (_, Method @ _, _) = newFlags(7, "<method>") + + /** A (term or type) parameter to a class or method */ + val (Param @ _, TermParam @ _, TypeParam @ _) = newFlags(8, "<param>") + + /** Labeled with `implicit` modifier (implicit value) */ + val (Implicit @ _, ImplicitVal @ _, _) = newFlags(9, "implicit") + + /** Labeled with `lazy` (a lazy val) / a trait */ + val (LazyOrTrait @ _, Lazy @ _, Trait @ _) = newFlags(10, "lazy", "<trait>") + + /** A value or variable accessor (getter or setter) */ + val (AccessorOrSealed @ _, Accessor @ _, Sealed @ _) = newFlags(11, "<accessor>", "sealed") + + /** A mutable var, an open class */ + val (MutableOrOpen @ __, Mutable @ _, Open @ _) = newFlags(12, "mutable", "open") + + /** Symbol is local to current class (i.e. private[this] or protected[this]) + * pre: Private or Protected are also set + */ + val (Local @ _, _, _) = newFlags(13, "<local>") + + /** A field generated for a primary constructor parameter (no matter if it's a 'val' or not), + * or an accessor of such a field. + */ + val (_, ParamAccessor @ _, _) = newFlags(14, "<paramaccessor>") + + /** A value or class implementing a module */ + val (Module @ _, ModuleVal @ _, ModuleClass @ _) = newFlags(15, "module") + + /** A value or class representing a package */ + val (Package @ _, PackageVal @ _, PackageClass @ _) = newFlags(16, "<package>") + + /** A case class or its companion object + * Note: Case is also used to indicate that a symbol is bound by a pattern. + */ + val (Case @ _, CaseVal @ _, CaseClass @ _) = newFlags(17, "case") + + /** A compiler-generated symbol, which is visible for type-checking + * (compare with artifact) + */ + val (Synthetic @ _, _, _) = newFlags(18, "<synthetic>") + + /** Labelled with `inline` modifier */ + val (Inline @ _, _, _) = newFlags(19, "inline") + + /** An outer accessor / a covariant type variable */ + val (OuterOrCovariant @ _, OuterAccessor @ _, Covariant @ _) = newFlags(20, "<outer accessor>", "<covariant>") + + /** The label of a labeled block / a contravariant type variable */ + val (LabelOrContravariant @ _, Label @ _, Contravariant @ _) = newFlags(21, "