From 982c51fc8afc2466bbfc2718849dbdaf0c13800b Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Fri, 7 Oct 2022 17:46:17 +0200 Subject: [PATCH 1/9] Add small improvements to the Reference page --- .../changed-features/compiler-plugins.md | 12 ++++---- .../changed-features/eta-expansion-spec.md | 4 +-- .../changed-features/main-functions.md | 2 +- .../contextual/by-name-context-parameters.md | 2 +- docs/_docs/reference/contextual/givens.md | 9 +++--- .../contextual/multiversal-equality.md | 7 +++-- .../reference/contextual/type-classes.md | 4 +-- .../dropped-features/delayed-init.md | 2 +- docs/_docs/reference/experimental/canthrow.md | 2 +- docs/_docs/reference/experimental/cc.md | 2 +- .../reference/experimental/explicit-nulls.md | 2 +- .../experimental/numeric-literals.md | 2 +- .../language-versions/source-compatibility.md | 2 +- .../metaprogramming/compiletime-ops.md | 22 +++++++------- .../metaprogramming/metaprogramming.md | 2 +- .../reference/new-types/type-lambdas-spec.md | 4 +-- .../reference/other-new-features/export.md | 8 ++--- .../other-new-features/kind-polymorphism.md | 2 +- .../other-new-features/opaques-details.md | 2 +- .../parameter-untupling-spec.md | 30 ------------------- .../other-new-features/parameter-untupling.md | 7 +++-- .../other-new-features/targetName.md | 2 +- 22 files changed, 53 insertions(+), 78 deletions(-) diff --git a/docs/_docs/reference/changed-features/compiler-plugins.md b/docs/_docs/reference/changed-features/compiler-plugins.md index 20bdb7f49836..82d38bd44d96 100644 --- a/docs/_docs/reference/changed-features/compiler-plugins.md +++ b/docs/_docs/reference/changed-features/compiler-plugins.md @@ -4,18 +4,18 @@ title: "Changes in Compiler Plugins" nightlyOf: https://docs.scala-lang.org/scala3/reference/changed-features/compiler-plugins.html --- -Compiler plugins are supported by Dotty (and Scala 3) since 0.9. There are two notable changes -compared to `scalac`: +Compiler plugins are supported in Scala 3 since Dotty 0.9. There are two notable changes +compared to Scala 2: - No support for analyzer plugins - Added support for research plugins -[Analyzer plugins][1] in `scalac` run during type checking and may influence +[Analyzer plugins][1] run in Scala 2 during type checking and may influence normal type checking. This is a very powerful feature but for production usages, a predictable and consistent type checker is more important. For experimentation and research, Scala 3 introduces _research plugin_. Research plugins -are more powerful than `scalac` analyzer plugins as they let plugin authors customize +are more powerful than Scala 2 analyzer plugins as they let plugin authors customize the whole compiler pipeline. One can easily replace the standard typer by a custom one or create a parser for a domain-specific language. However, research plugins are only enabled for nightly or snaphot releases of Scala 3. @@ -26,7 +26,7 @@ _standard plugins_ in Scala 3. In terms of features, they are similar to ## Using Compiler Plugins -Both standard and research plugins can be used with `scalac` by adding the `-Xplugin:` option: +In Scala 3, both standard and research plugins can be used with `scalac` by adding the `-Xplugin:` option: ```shell scalac -Xplugin:pluginA.jar -Xplugin:pluginB.jar Test.scala @@ -40,7 +40,7 @@ the fully qualified plugin class name. The format of a property file is as follo pluginClass=dividezero.DivideZero ``` -This is different from `scalac` plugins that required a `scalac-plugin.xml` file. 
+This is different from Scala 2 plugins that require a `scalac-plugin.xml` file. Starting from 1.1.5, `sbt` also supports Scala 3 compiler plugins. Please refer to the [`sbt` documentation][2] for more information. diff --git a/docs/_docs/reference/changed-features/eta-expansion-spec.md b/docs/_docs/reference/changed-features/eta-expansion-spec.md index a62d45df9e11..714ab37ae11a 100644 --- a/docs/_docs/reference/changed-features/eta-expansion-spec.md +++ b/docs/_docs/reference/changed-features/eta-expansion-spec.md @@ -51,7 +51,7 @@ implicit val bla: Double = 1.0 val bar = foo // val bar: Int => Float = ... ``` -## Automatic Eta-Expansion and query types +## Automatic Eta-Expansion and context types A method with context parameters can be expanded to a value of a context type by writing the expected context type explicitly. @@ -66,7 +66,7 @@ val bar: Double ?=> Float = foo(3) - If `m` is has an empty argument list (i.e. has type `()R`): 1. If the expected type is of the form `() => T`, we eta expand. 2. If m is defined by Java, or overrides a Java defined method, we insert `()`. - 3. Otherwise we issue an error of the form: + 3. Otherwise we issue an error of the form: `method must be called with () argument` Thus, an unapplied method with an empty argument list is only converted to a function when a function type is expected. It is considered best practice to either explicitly apply the method to `()`, or convert it to a function with `() => m()`. diff --git a/docs/_docs/reference/changed-features/main-functions.md b/docs/_docs/reference/changed-features/main-functions.md index 049dda58aecc..8b035053ad63 100644 --- a/docs/_docs/reference/changed-features/main-functions.md +++ b/docs/_docs/reference/changed-features/main-functions.md @@ -72,7 +72,7 @@ final class happyBirthday: case error: CLP.ParseError => CLP.showError(error) ``` -**Note**: The `` modifier above expresses that the `main` method is generated +**Note:** The `` modifier above expresses that the `main` method is generated as a static method of class `happyBirthday`. It is not available for user programs in Scala. Regular "static" members are generated in Scala using objects instead. [`@main`](https://scala-lang.org/api/3.x/scala/main.html) methods are the recommended scheme to generate programs that can be invoked from the command line in Scala 3. They replace the previous scheme to write program as objects with a special `App` parent class. In Scala 2, `happyBirthday` could be written also like this: diff --git a/docs/_docs/reference/contextual/by-name-context-parameters.md b/docs/_docs/reference/contextual/by-name-context-parameters.md index 3004bfb2c4c2..3515efd78fa5 100644 --- a/docs/_docs/reference/contextual/by-name-context-parameters.md +++ b/docs/_docs/reference/contextual/by-name-context-parameters.md @@ -53,7 +53,7 @@ In the example above, the definition of `s` would be expanded as follows. 
```scala val s = summon[Test.Codec[Option[Int]]]( - optionCodec[Int](using intCodec) + using optionCodec[Int](using intCodec) ) ``` diff --git a/docs/_docs/reference/contextual/givens.md b/docs/_docs/reference/contextual/givens.md index 411d50ba63ea..1bfffbc5bf6f 100644 --- a/docs/_docs/reference/contextual/givens.md +++ b/docs/_docs/reference/contextual/givens.md @@ -10,8 +10,9 @@ that serve for synthesizing arguments to [context parameters](./using-clauses.md ```scala trait Ord[T]: def compare(x: T, y: T): Int - extension (x: T) def < (y: T) = compare(x, y) < 0 - extension (x: T) def > (y: T) = compare(x, y) > 0 + extension (x: T) + def < (y: T) = compare(x, y) < 0 + def > (y: T) = compare(x, y) > 0 given intOrd: Ord[Int] with def compare(x: Int, y: Int) = @@ -51,7 +52,7 @@ given [T](using Ord[T]): Ord[List[T]] with If the name of a given is missing, the compiler will synthesize a name from the implemented type(s). -**Note** The name synthesized by the compiler is chosen to be readable and reasonably concise. For instance, the two instances above would get the names: +**Note:** The name synthesized by the compiler is chosen to be readable and reasonably concise. For instance, the two instances above would get the names: ```scala given_Ord_Int @@ -62,7 +63,7 @@ The precise rules for synthesizing names are found [here](./relationship-implici given instances of types that are "too similar". To avoid conflicts one can use named instances. -**Note** To ensure robust binary compatibility, publicly available libraries should prefer named instances. +**Note:** To ensure robust binary compatibility, publicly available libraries should prefer named instances. ## Alias Givens diff --git a/docs/_docs/reference/contextual/multiversal-equality.md b/docs/_docs/reference/contextual/multiversal-equality.md index e9a81b95f472..b51d03b10963 100644 --- a/docs/_docs/reference/contextual/multiversal-equality.md +++ b/docs/_docs/reference/contextual/multiversal-equality.md @@ -33,6 +33,7 @@ that derives `CanEqual`, e.g. ```scala class T derives CanEqual ``` +> Normally a [derives clause](./derivation.md) accepts only type classes with one parameter, however there is a special case for `CanEqual`. Alternatively, one can also provide a `CanEqual` given instance directly, like this: @@ -82,7 +83,7 @@ def canEqualAny[L, R]: CanEqual[L, R] = CanEqual.derived ``` Even though `canEqualAny` is not declared as `given`, the compiler will still -construct an `canEqualAny` instance as answer to an implicit search for the +construct a `canEqualAny` instance as answer to an implicit search for the type `CanEqual[L, R]`, unless `L` or `R` have `CanEqual` instances defined on them, or the language feature `strictEquality` is enabled. @@ -156,10 +157,10 @@ Instances are defined so that every one of these types has a _reflexive_ `CanEqu - Primitive numeric types can be compared with subtypes of `java.lang.Number` (and _vice versa_). - `Boolean` can be compared with `java.lang.Boolean` (and _vice versa_). - `Char` can be compared with `java.lang.Character` (and _vice versa_). - - Two sequences (of arbitrary subtypes of `scala.collection.Seq`) can be compared + - Two sequences (arbitrary subtypes of `scala.collection.Seq`) can be compared with each other if their element types can be compared. The two sequence types need not be the same. 
- - Two sets (of arbitrary subtypes of `scala.collection.Set`) can be compared + - Two sets (arbitrary subtypes of `scala.collection.Set`) can be compared with each other if their element types can be compared. The two set types need not be the same. - Any subtype of `AnyRef` can be compared with `Null` (and _vice versa_). diff --git a/docs/_docs/reference/contextual/type-classes.md b/docs/_docs/reference/contextual/type-classes.md index 9fc0d2eec864..6a15ac3a83d4 100644 --- a/docs/_docs/reference/contextual/type-classes.md +++ b/docs/_docs/reference/contextual/type-classes.md @@ -82,7 +82,7 @@ given Functor[List] with x.map(f) // List already has a `map` method ``` -With this `given` instance in scope, everywhere a `Functor` is expected, the compiler will accept a `List` to be used. +With this `given` instance in scope, everywhere a type with a `Functor` context bound is expected, the compiler will accept a `List` to be used. For instance, we may write such a testing method: @@ -214,7 +214,7 @@ instead of show(compute(i)(config))(config) ``` -Let's define this m then. First, we are going to define a type named `ConfigDependent` representing a function that when passed a `Config` produces a `Result`. +Let's define this `flatMap` then. First, we are going to define a type named `ConfigDependent` representing a function that when passed a `Config` produces a `Result`. ```scala type ConfigDependent[Result] = Config => Result diff --git a/docs/_docs/reference/dropped-features/delayed-init.md b/docs/_docs/reference/dropped-features/delayed-init.md index 5d4f614ce951..2694c3374f1c 100644 --- a/docs/_docs/reference/dropped-features/delayed-init.md +++ b/docs/_docs/reference/dropped-features/delayed-init.md @@ -18,7 +18,7 @@ object HelloWorld extends App { ``` However, the code is now run in the initializer of the object, which on -some JVM's means that it will only be interpreted. So, better not use it +some JVMs means that it will only be interpreted. So, better not use it for benchmarking! Also, if you want to access the command line arguments, you need to use an explicit `main` method for that. diff --git a/docs/_docs/reference/experimental/canthrow.md b/docs/_docs/reference/experimental/canthrow.md index 025a0ed1c686..064d928fe26c 100644 --- a/docs/_docs/reference/experimental/canthrow.md +++ b/docs/_docs/reference/experimental/canthrow.md @@ -124,7 +124,7 @@ try body catch ... ``` -Note that the right-hand side of the synthesized given is `???` (undefined). This is OK since +Note that the right-hand side of the synthesized given is `compiletime.erasedValue`. This is OK since this given is erased; it will not be executed at runtime. **Note 1:** The [`saferExceptions`](https://scala-lang.org/api/3.x/scala/runtime/stdLibPatches/language$$experimental$$saferExceptions$.html) feature is designed to work only with checked exceptions. An exception type is _checked_ if it is a subtype of diff --git a/docs/_docs/reference/experimental/cc.md b/docs/_docs/reference/experimental/cc.md index 878bc0a64ed6..c6aa795cc09b 100644 --- a/docs/_docs/reference/experimental/cc.md +++ b/docs/_docs/reference/experimental/cc.md @@ -176,7 +176,7 @@ def f(x: {c}-> Int): Int ``` Here, the actual argument to `f` is allowed to use the `c` capability but no others. 
-**Note**: It is strongly recommended to write the capability set and the arrow `->` without intervening spaces, +**Note:** It is strongly recommended to write the capability set and the arrow `->` without intervening spaces, as otherwise the notation would look confusingly like a function type. ## Subtyping and Subcapturing diff --git a/docs/_docs/reference/experimental/explicit-nulls.md b/docs/_docs/reference/experimental/explicit-nulls.md index b3fa53429cfe..f8f9ac8e11be 100644 --- a/docs/_docs/reference/experimental/explicit-nulls.md +++ b/docs/_docs/reference/experimental/explicit-nulls.md @@ -540,4 +540,4 @@ Our strategy for binary compatibility with Scala binaries that predate explicit and new libraries compiled without `-Yexplicit-nulls` is to leave the types unchanged and be compatible but unsound. -[More details](https://dotty.epfl.ch/docs/internals/explicit-nulls.html) +[Implementation details](https://dotty.epfl.ch/docs/internals/explicit-nulls.html) diff --git a/docs/_docs/reference/experimental/numeric-literals.md b/docs/_docs/reference/experimental/numeric-literals.md index f493ef459265..8b7aaa23f9e0 100644 --- a/docs/_docs/reference/experimental/numeric-literals.md +++ b/docs/_docs/reference/experimental/numeric-literals.md @@ -4,7 +4,7 @@ title: "Numeric Literals" nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/numeric-literals.html --- -**Note**: This feature is not yet part of the Scala 3 language definition. It can be made available by a language import: +This feature is not yet part of the Scala 3 language definition. It can be made available by a language import: ```scala import scala.language.experimental.genericNumberLiterals diff --git a/docs/_docs/reference/language-versions/source-compatibility.md b/docs/_docs/reference/language-versions/source-compatibility.md index 4d5b468ac8f2..699ba0d5c75d 100644 --- a/docs/_docs/reference/language-versions/source-compatibility.md +++ b/docs/_docs/reference/language-versions/source-compatibility.md @@ -40,4 +40,4 @@ class C { ... } Language imports supersede command-line settings in the source files where they are specified. Only one language import specifying a source version is allowed in a source file, and it must come before any definitions in that file. -**Note**: The [Scala 3 Migration Guide](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html) gives further information to help the Scala programmer moving from Scala 2.13 to Scala 3. +**Note:** The [Scala 3 Migration Guide](https://docs.scala-lang.org/scala3/guides/migration/compatibility-intro.html) gives further information to help the Scala programmer moving from Scala 2.13 to Scala 3. diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md index a43c941ae943..db30786c3af3 100644 --- a/docs/_docs/reference/metaprogramming/compiletime-ops.md +++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md @@ -11,7 +11,7 @@ The [`scala.compiletime`](https://scala-lang.org/api/3.x/scala/compiletime.html) ### `constValue` and `constValueOpt` `constValue` is a function that produces the constant value represented by a -type. +type, or a compile time error if the type is not a constant type. ```scala import scala.compiletime.constValue @@ -30,6 +30,8 @@ enabling us to handle situations where a value is not present. Note that `S` is the type of the successor of some singleton type. For example the type `S[1]` is the singleton type `2`. 
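For instance, a minimal sketch of how `constValue` and `constValueOpt` behave (the literal types below are chosen purely for illustration):

```scala
import scala.compiletime.{constValue, constValueOpt}
import scala.compiletime.ops.int.S

val two: Int = constValue[S[1]]            // 2, computed at compile time
val some: Option[Int] = constValueOpt[42]  // Some(42)
val none: Option[Int] = constValueOpt[Int] // None: Int is not a constant type
```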
+Since tuples are not constant types, even if their constituants are, there is `constValueTuple`, which given a tuple type `(X1, ..., Xn)`, returns a tuple value `(constValue[X1], ..., constValue[Xn])`. + ### `erasedValue` So far we have seen inline methods that take terms (tuples and integers) as @@ -170,7 +172,7 @@ val concat: "a" + "b" = "ab" val addition: 1 + 1 = 2 ``` -## Summoning Implicits Selectively +## Summoning Givens Selectively It is foreseen that many areas of typelevel programming can be done with rewrite methods instead of implicits. But sometimes implicits are unavoidable. The @@ -178,16 +180,16 @@ problem so far was that the Prolog-like programming style of implicit search becomes viral: Once some construct depends on implicit search it has to be written as a logic program itself. Consider for instance the problem of creating a `TreeSet[T]` or a `HashSet[T]` depending on whether `T` has an `Ordering` or -not. We can create a set of implicit definitions like this: +not. We can create a set of given instances like this: ```scala trait SetFor[T, S <: Set[T]] class LowPriority: - implicit def hashSetFor[T]: SetFor[T, HashSet[T]] = ... + given hashSetFor[T]: SetFor[T, HashSet[T]] = ... object SetsFor extends LowPriority: - implicit def treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ... + given treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ... ``` Clearly, this is not pretty. Besides all the usual indirection of implicit @@ -236,18 +238,18 @@ inline def setFor[T]: Set[T] = summonFrom { `summonFrom` applications must be reduced at compile time. -Consequently, if we summon an `Ordering[String]` the code above will return a -new instance of `TreeSet[String]`. +Consequently, if a given instance of `Ordering[String]` is in the implicit scope, the code above will return a +new instance of `TreeSet[String]`. Such an instance is defined in `Ordering`'s companion object, so there will always be one. ```scala -summon[Ordering[String]] +summon[Ordering[String]] // Proves that an Ordering[String] is in scope println(setFor[String].getClass) // prints class scala.collection.immutable.TreeSet ``` -**Note** `summonFrom` applications can raise ambiguity errors. Consider the following +**Note:** `summonFrom` applications can raise ambiguity errors. Consider the following code with two givens in scope of type `A`. The pattern match in `f` will raise -an ambiguity error of `f` is applied. +an ambiguity error if `f` is applied. ```scala class A diff --git a/docs/_docs/reference/metaprogramming/metaprogramming.md b/docs/_docs/reference/metaprogramming/metaprogramming.md index 3bce2d7c922e..af7206eff34e 100644 --- a/docs/_docs/reference/metaprogramming/metaprogramming.md +++ b/docs/_docs/reference/metaprogramming/metaprogramming.md @@ -39,7 +39,7 @@ introduce the following fundamental facilities: representation of code. They can be parameterized and composed using splices, but their structure cannot be analyzed from the outside. TASTy reflection gives a way to analyze code structure by partly revealing the representation type of a piece of code in a standard API. The representation - type is a form of typed abstract syntax tree, which gives rise to the `TASTy` + type is a form of **t**yped **a**bstract **s**yntax **t**ree, which gives rise to the `TASTy` moniker. 6. 
[TASTy Inspection](./tasty-inspect.md) Typed abstract syntax trees are serialized diff --git a/docs/_docs/reference/new-types/type-lambdas-spec.md b/docs/_docs/reference/new-types/type-lambdas-spec.md index 52f88dab4217..76937e5160f7 100644 --- a/docs/_docs/reference/new-types/type-lambdas-spec.md +++ b/docs/_docs/reference/new-types/type-lambdas-spec.md @@ -103,9 +103,9 @@ type O2[X] = List[X] ``` would be treated as covariant, `X` is used covariantly on its right-hand side. -**Note**: The decision to treat `Nothing` as universal bottom type is provisional, and might be changed after further discussion. +**Note:** The decision to treat `Nothing` as universal bottom type is provisional, and might be changed after further discussion. -**Note**: Scala 2 and 3 differ in that Scala 2 also treats `Any` as universal top-type. This is not done in Scala 3. See also the discussion on [kind polymorphism](../other-new-features/kind-polymorphism.md) +**Note:** Scala 2 and 3 differ in that Scala 2 also treats `Any` as universal top-type. This is not done in Scala 3. See also the discussion on [kind polymorphism](../other-new-features/kind-polymorphism.md) ## Curried Type Parameters diff --git a/docs/_docs/reference/other-new-features/export.md b/docs/_docs/reference/other-new-features/export.md index 40e2ad9df248..41104a54e4a6 100644 --- a/docs/_docs/reference/other-new-features/export.md +++ b/docs/_docs/reference/other-new-features/export.md @@ -201,16 +201,16 @@ Consider the following example: ```scala class B { val c: Int } -object a { val b = new B } -export a.* +object A { val b = new B } +export A.* export b.* ``` -Is the `export b.*` clause legal? If yes, what does it export? Is it equivalent to `export a.b.*`? What about if we swap the last two clauses? +Is the `export b.*` clause legal? If yes, what does it export? Is it equivalent to `export A.b.*`? What about if we swap the last two clauses? ``` export b.* -export a.* +export A.* ``` To avoid tricky questions like these, we fix the elaboration order of exports as follows. diff --git a/docs/_docs/reference/other-new-features/kind-polymorphism.md b/docs/_docs/reference/other-new-features/kind-polymorphism.md index 8f0172c4c04b..e452ee8384f9 100644 --- a/docs/_docs/reference/other-new-features/kind-polymorphism.md +++ b/docs/_docs/reference/other-new-features/kind-polymorphism.md @@ -43,5 +43,5 @@ It is declared `abstract` and `final`, so it can be neither instantiated nor ext `AnyKind` plays a special role in Scala's subtype system: It is a supertype of all other types no matter what their kind is. It is also assumed to be kind-compatible with all other types. Furthermore, `AnyKind` is treated as a higher-kinded type (so it cannot be used as a type of values), but at the same time it has no type parameters (so it cannot be instantiated). -**Note**: This feature is considered experimental but stable and it can be disabled under compiler flag +**Note:** This feature is considered experimental but stable and it can be disabled under compiler flag (i.e. `-Yno-kind-polymorphism`). 
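For instance, a minimal sketch of a kind-polymorphic definition (the `Wrapper` trait below is only an illustration, not part of the standard library):

```scala
trait Wrapper[T <: AnyKind]   // T may be instantiated with a type of any kind

val w1 = new Wrapper[Int] {}  // proper type
val w2 = new Wrapper[List] {} // unary type constructor
val w3 = new Wrapper[Map] {}  // binary type constructor
```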
diff --git a/docs/_docs/reference/other-new-features/opaques-details.md b/docs/_docs/reference/other-new-features/opaques-details.md index d7305a249089..ec3170b36bd8 100644 --- a/docs/_docs/reference/other-new-features/opaques-details.md +++ b/docs/_docs/reference/other-new-features/opaques-details.md @@ -65,7 +65,7 @@ opaque type G = [T] =>> List[T] but the following are not: ```scala opaque type BadF[T] = [U] =>> (T, U) -opaque type BadG = [T] =>> [U] => (T, U) +opaque type BadG = [T] =>> [U] =>> (T, U) ``` ## Translation of Equality diff --git a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md index e5165550fc0d..fd462dd610c8 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling-spec.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling-spec.md @@ -4,37 +4,7 @@ title: "Parameter Untupling - More Details" nightlyOf: https://docs.scala-lang.org/scala3/reference/other-new-features/parameter-untupling-spec.html --- -## Motivation -Say you have a list of pairs - -```scala -val xs: List[(Int, Int)] -``` - -and you want to map `xs` to a list of `Int`s so that each pair of numbers is mapped to their sum. -Previously, the best way to do this was with a pattern-matching decomposition: - -```scala -xs.map { - case (x, y) => x + y -} -``` -While correct, this is inconvenient. Instead, we propose to write it the following way: - -```scala -xs.map { - (x, y) => x + y -} -``` - -or, equivalently: - -```scala -xs.map(_ + _) -``` - -Generally, a function value with `n > 1` parameters can be converted to a function with tupled arguments if the expected type is a unary function type of the form `((T_1, ..., T_n)) => U`. ## Type Checking diff --git a/docs/_docs/reference/other-new-features/parameter-untupling.md b/docs/_docs/reference/other-new-features/parameter-untupling.md index fcc1fa11d519..e1e7afcad8fe 100644 --- a/docs/_docs/reference/other-new-features/parameter-untupling.md +++ b/docs/_docs/reference/other-new-features/parameter-untupling.md @@ -57,12 +57,13 @@ The function value must be explicitly tupled, rather than the parameters untuple xs.map(combiner.tupled) ``` -A conversion may be provided in user code: +Though strongly discouraged, to have the same effect, an implicit conversion may be provided in user code: ```scala import scala.language.implicitConversions -transparent inline implicit def `fallback untupling`(f: (Int, Int) => Int): ((Int, Int)) => Int = - p => f(p._1, p._2) // use specialized apply instead of unspecialized `tupled` + +transparent inline given `fallback untupling`: Conversion[(Int, Int) => Int, ((Int, Int)) => Int] = _.tupled + xs.map(combiner) ``` diff --git a/docs/_docs/reference/other-new-features/targetName.md b/docs/_docs/reference/other-new-features/targetName.md index 63c4cf1ec0df..717ce4247a1f 100644 --- a/docs/_docs/reference/other-new-features/targetName.md +++ b/docs/_docs/reference/other-new-features/targetName.md @@ -93,7 +93,7 @@ The relevant overriding rules can be summarized as follows: - If two members override, then both their erased names and their types must be the same. As usual, any overriding relationship in the generated code must also -be present in the original code. So the following example would also be in error: +be present in the original code. 
So the following example would also be an error: ```scala import annotation.targetName From 7b6a9ded70851162f491b84f4a099bf0edf6bc6f Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:26:35 +0200 Subject: [PATCH 2/9] Update pattern-matching.md --- .../changed-features/pattern-matching.md | 156 +++++++++++------- 1 file changed, 95 insertions(+), 61 deletions(-) diff --git a/docs/_docs/reference/changed-features/pattern-matching.md b/docs/_docs/reference/changed-features/pattern-matching.md index 30ae5d9dc104..1106a0774121 100644 --- a/docs/_docs/reference/changed-features/pattern-matching.md +++ b/docs/_docs/reference/changed-features/pattern-matching.md @@ -13,26 +13,38 @@ Scala 3 supports a superset of Scala 2 [extractors](https://www.scala-lang.org/f Extractors are objects that expose a method `unapply` or `unapplySeq`: ```scala -def unapply[A](x: T)(implicit x: B): U -def unapplySeq[A](x: T)(implicit x: B): U +def unapply(x: T): U +def unapplySeq(x: T): U +``` + +Where `T` is an arbitrary type, if it is a subtype of the scrutinee's type `Scrut`, a [type test](../other-new-features/type-test.md) is performed before calling the method. +`U` follows rules described in [Fixed Arity Extractors](#fixed-arity-extractors) and [Variadic Extractors](#variadic-extractors). + +**Note:** `U` can be the type of the extractor object. + +`unapply` and `unapplySeq` can actually have a more general signature, allowing for a leading type clause, as well as arbitrarily many using clauses, both before and after the regular term clause, and at most one implicit clause at the end, for example: + +```scala +def unapply[A, B](using C)(using D)(x: T)(using E)(using F)(implicit y: G): U = ??? ``` Extractors that expose the method `unapply` are called fixed-arity extractors, which work with patterns of fixed arity. Extractors that expose the method `unapplySeq` are called variadic extractors, which enables variadic patterns. -### Fixed-Arity Extractors +## Fixed-Arity Extractors + +Fixed-arity extractors expose the following signature (with potential type, using and implicit clauses): -Fixed-arity extractors expose the following signature: ```scala -def unapply[A](x: T)(implicit x: B): U +def unapply(x: T): U ``` The type `U` conforms to one of the following matches: -- Boolean match -- Product match +- [Boolean match](#boolean-match) +- [Product match](#product-match) Or `U` conforms to the type `R`: @@ -45,53 +57,24 @@ type R = { and `S` conforms to one of the following matches: -- single match -- name-based match +- [single match](#single-match) +- [name-based match](#name-based-match) The former form of `unapply` has higher precedence, and _single match_ has higher precedence over _name-based match_. +**Note:** the `S` in `R` can be `U`. + A usage of a fixed-arity extractor is irrefutable if one of the following condition holds: - `U = true` - the extractor is used as a product match -- `U = Some[T]` (for Scala 2 compatibility) - `U <: R` and `U <: { def isEmpty: false }` +- `U = Some[T]` -### Variadic Extractors - -Variadic extractors expose the following signature: - -```scala -def unapplySeq[A](x: T)(implicit x: B): U -``` - -The type `U` conforms to one of the following matches: - -- sequence match -- product-sequence match - -Or `U` conforms to the type `R`: - -```scala -type R = { - def isEmpty: Boolean - def get: S -} -``` - -and `S` conforms to one of the two matches above. 
- -The former form of `unapplySeq` has higher priority, and _sequence match_ has higher -precedence over _product-sequence match_. - -A usage of a variadic extractor is irrefutable if one of the following conditions holds: - -- the extractor is used directly as a sequence match or product-sequence match -- `U = Some[T]` (for Scala 2 compatibility) -- `U <: R` and `U <: { def isEmpty: false }` +**Note:** The last rule is necessary because, for compatibility reasons, `isEmpty` on `Some` has return type `Boolean` rather than `false`, even though it always returns `false`. -## Boolean Match +### Boolean Match - `U =:= Boolean` - Pattern-matching on exactly `0` patterns @@ -111,10 +94,10 @@ object Even: // even has an even number of characters ``` -## Product Match +### Product Match - `U <: Product` -- `N > 0` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1` ... `_N: PN` members in `U` +- `N > 0` is the maximum number of consecutive (`val` or parameterless `def`) `_1: P1` ... `_N: PN` members in `U` - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` For example: @@ -141,9 +124,11 @@ object FirstChars: // First: H; Second: i ``` -## Single Match +### Single Match -- If there is exactly `1` pattern, pattern-matching on `1` pattern with type `U` +- Pattern-matching on `1` pattern with type `S` + +For example, where `Nat <: R`, `S = Int`: @@ -162,27 +147,72 @@ object Nat: // 5 is a natural number ``` -## Name-based Match +### Name-based Match -- `N > 1` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1 ... _N: PN` members in `U` +- `S` has `N > 1` members such that they are each `val`s or parameterless `def`s, and named from `_1` with type `P1` to `_N` with type `PN` +- `S` doesn't have `N+1` members satisfying the previous point, i.e. `N` is maximal - Pattern-matching on exactly `N` patterns with types `P1, P2, ..., PN` +For example, where `U = AlwaysEmpty.type <: R`, `S = NameBased`: ```scala -object ProdEmpty: +object MyPatternMatcher: + def unapply(s: String) = AlwaysEmpty + +object AlwaysEmpty: + def isEmpty = true + def get = NameBased + +object NameBased: def _1: Int = ??? def _2: String = ??? - def isEmpty = true - def unapply(s: String): this.type = this - def get = this "" match - case ProdEmpty(_, _) => ??? + case MyPatternMatcher(_, _) => ??? case _ => () ``` -## Sequence Match +## Variadic Extractors + +Variadic extractors expose the following signature (with potential type, using and implicit clauses): + +```scala +def unapplySeq(x: T): U +``` + +Where `U` has to fullfill the following: + +1. Set `V := U` +2. `V` is valid if `V` conforms to one of the following matches: +- [sequence match](#sequence-match) +- [product-sequence match](#product-sequence-match) +3. Otherwise `U` has to conform to the type `R`: +```scala +type R = { + def isEmpty: Boolean + def get: S +} +``` +4. Set `V := S`, and reattempt 2., if it fails `U` is not valid. + +The `V := U` form of `unapplySeq` has higher priority, and _sequence match_ has higher +precedence over _product-sequence match_. 
+ +**Note:** This means `isEmpty` is disregarded if the `V := U` form is valid + +A usage of a variadic extractor is irrefutable if one of the following conditions holds: + +- the extractor is used directly as a sequence match or product-sequence match +- `U <: R` and `U <: { def isEmpty: false }` +- `U = Some[T]` + +**Note:** The last rule is necessary because, for compatibility reasons, `isEmpty` on `Some` has return type `Boolean` rather than `false`, even though it always returns `false`. + +**Note:** Be careful, by the first condition and the note above, it is possible to define an irrefutable extractor with a `def isEmpty: true`. +This is strongly discouraged and, if found in the wild, is almost certainly a bug. + +### Sequence Match -- `U <: X`, `T2` and `T3` conform to `T1` +- `V <: X` ```scala type X = { @@ -192,10 +222,12 @@ type X = { def toSeq: scala.Seq[T3] } ``` - +- `T2` and `T3` conform to `T1` - Pattern-matching on _exactly_ `N` simple patterns with types `T1, T1, ..., T1`, where `N` is the runtime size of the sequence, or - Pattern-matching on `>= N` simple patterns and _a vararg pattern_ (e.g., `xs: _*`) with types `T1, T1, ..., T1, Seq[T1]`, where `N` is the minimum size of the sequence. +For example, where `V = S`, `U = Option[S] <: R`, `S = Seq[Char]` + ```scala @@ -211,14 +243,16 @@ object CharList: // e,x,a,m ``` -## Product-Sequence Match +### Product-Sequence Match -- `U <: Product` -- `N > 0` is the maximum number of consecutive (parameterless `def` or `val`) `_1: P1` ... `_N: PN` members in `U` +- `V <: Product` +- `N > 0` is the maximum number of consecutive (`val` or parameterless `def`) `_1: P1` ... `_N: PN` members in `V` - `PN` conforms to the signature `X` defined in Seq Pattern - Pattern-matching on exactly `>= N` patterns, the first `N - 1` patterns have types `P1, P2, ... P(N-1)`, the type of the remaining patterns are determined as in Seq Pattern. +For example, where `V = S`, `U = Option[S] <: R`, `S = (String, PN) <: Product`, `PN = Seq[Int]` + ```scala class Foo(val name: String, val children: Int*) object Foo: @@ -227,7 +261,7 @@ object Foo: def foo(f: Foo) = f match case Foo(name, x, y, ns*) => ">= two children." - case Foo(name, ns*) => => "< two children." + case Foo(name, ns*) => "< two children." ``` There are plans for further simplification, in particular to factor out _product match_ From a5127461971acb82c3b90c173945a596585650cf Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:27:12 +0200 Subject: [PATCH 3/9] Update structural-types.md --- .../changed-features/structural-types.md | 98 ++++++++++--------- 1 file changed, 54 insertions(+), 44 deletions(-) diff --git a/docs/_docs/reference/changed-features/structural-types.md b/docs/_docs/reference/changed-features/structural-types.md index 37e583332cf1..d8cd4f867092 100644 --- a/docs/_docs/reference/changed-features/structural-types.md +++ b/docs/_docs/reference/changed-features/structural-types.md @@ -35,19 +35,41 @@ configure how fields and methods should be resolved. Here's an example of a structural type `Person`: ```scala - class Record(elems: (String, Any)*) extends Selectable: - private val fields = elems.toMap - def selectDynamic(name: String): Any = fields(name) +type Person = Record { val name: String; val age: Int } +``` + +The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. 
But they exist nevertheless as members of type `Person`. - type Person = Record { val name: String; val age: Int } - ``` +This allows us to check at compiletime if accesses are valid: -The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following -program would print "Emma is 42 years old.": +```scala +val person: Person = ??? +println(s"${person.name} is ${person.age} years old.") // works +println(person.email) // error: value email is not a member of Person +``` +How is `Record` defined, and how does `person.name` resolve ? + +`Record` is a class that extends the marker trait [`scala.Selectable`](https://scala-lang.org/api/3.x/scala/Selectable.html) and defines +a method `selectDynamic`, which maps a field name to its value. +Selecting a member of a structural type is syntactic sugar for a call to this method. +The selections `person.name` and `person.age` are translated by +the Scala compiler to: ```scala - val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person] - println(s"${person.name} is ${person.age} years old.") +person.selectDynamic("name").asInstanceOf[String] +person.selectDynamic("age").asInstanceOf[Int] +``` + +For example, `Record` could be defined as follows: + +```scala +class Record(elems: (String, Any)*) extends Selectable: + private val fields = elems.toMap + def selectDynamic(name: String): Any = fields(name) +``` +Which allows us to create instances of `Person` like so: +```scala +val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person] ``` The parent type `Record` in this example is a generic class that can represent arbitrary records in its `elems` argument. This argument is a @@ -59,52 +81,45 @@ help from the user. In practice, the connection between a structural type and its underlying generic representation would most likely be done by a database layer, and therefore would not be a concern of the end user. -`Record` extends the marker trait [`scala.Selectable`](https://scala-lang.org/api/3.x/scala/Selectable.html) and defines -a method `selectDynamic`, which maps a field name to its value. -Selecting a structural type member is done by calling this method. -The `person.name` and `person.age` selections are translated by -the Scala compiler to: - -```scala - person.selectDynamic("name").asInstanceOf[String] - person.selectDynamic("age").asInstanceOf[Int] -``` - Besides `selectDynamic`, a `Selectable` class sometimes also defines a method `applyDynamic`. This can then be used to translate function calls of structural members. So, if `a` is an instance of `Selectable`, a structural call like `a.f(b, c)` would translate to ```scala - a.applyDynamic("f")(b, c) +a.applyDynamic("f")(b, c) ``` ## Using Java Reflection -Structural types can also be accessed using [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html). Example: +Using `Selectable` and [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html), we can select a member from unrelated classes. + +> Before resorting to structural calls with Java reflection one should consider alternatives. For instance, sometimes a more a modular _and_ efficient architecture can be obtained using [type classes](../contextual/type-classes.md). 
+ +For example, we would like to provide behavior for both [`FileInputStream`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/io/FileInputStream.html#%3Cinit%3E(java.io.File)) and [`Channel`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/channels/Channel.html) classes by calling their `close` method, however, these classes are unrelated, i.e. have no common supertype with a `close` method. Therefore, below we define a structural type `Closeable` that defines a `close` method. ```scala - type Closeable = { def close(): Unit } +type Closeable = { def close(): Unit } - class FileInputStream: - def close(): Unit +class FileInputStream: + def close(): Unit - class Channel: - def close(): Unit +class Channel: + def close(): Unit ``` -Here, we define a structural type `Closeable` that defines a `close` method. There are various classes that have `close` methods, we just list [`FileInputStream`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/io/FileInputStream.html#%3Cinit%3E(java.io.File)) and [`Channel`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/channels/Channel.html) as two examples. It would be easiest if the two classes shared a common interface that factors out the `close` method. But such factorings are often not possible if different libraries are combined in one application. Yet, we can still have methods that work on -all classes with a `close` method by using the `Closeable` type. For instance, +Ideally we would add a common interface to both these classes to define the `close` method, however they are defined in libraries outside of our control. As a compromise we can use the structural type to define a single implementation for an `autoClose` method: + + ```scala - import scala.reflect.Selectable.reflectiveSelectable +import scala.reflect.Selectable.reflectiveSelectable - def autoClose(f: Closeable)(op: Closeable => Unit): Unit = - try op(f) finally f.close() +def autoClose(f: Closeable)(op: Closeable => Unit): Unit = + try op(f) finally f.close() ``` -The call `f.close()` has to use Java reflection to identify and call the `close` method in the receiver `f`. This needs to be enabled by an import -of `reflectiveSelectable` shown above. What happens "under the hood" is then the following: +The call `f.close()` requires `Closeable` to extend `Selectable` to identify and call the `close` method in the receiver `f`. A universal implicit conversion to `Selectable` is enabled by an import +of `reflectiveSelectable` shown above, based on [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html). What happens "under the hood" is then the following: - - The import makes available an implicit conversion that turns any type into a - `Selectable`. `f` is wrapped in this conversion. + - The implicit conversion wraps `f` in an instance of `scala.reflect.Selectable` (which is a subtype of `Selectable`). - The compiler then transforms the `close` call on the wrapped `f` to an `applyDynamic` call. The end result is: @@ -113,7 +128,7 @@ of `reflectiveSelectable` shown above. What happens "under the hood" is then the reflectiveSelectable(f).applyDynamic("close")() ``` - The implementation of `applyDynamic` in `reflectiveSelectable`'s result -uses Java reflection to find and call a method `close` with zero parameters in the value referenced by `f` at runtime. 
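Putting the pieces together, a small self-contained sketch (the stub `FileInputStream` below just prints on `close`, purely for illustration):

```scala
import scala.reflect.Selectable.reflectiveSelectable

type Closeable = { def close(): Unit }

class FileInputStream:
  def close(): Unit = println("stream closed")

def autoClose(f: Closeable)(op: Closeable => Unit): Unit =
  try op(f) finally f.close()

@main def run() =
  autoClose(FileInputStream()) { _ =>
    println("using the stream")
  }
  // prints "using the stream", then "stream closed"
```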
+uses [Java reflection](https://www.oracle.com/technical-resources/articles/java/javareflection.html) to find and call a method `close` with zero parameters in the value referenced by `f` at runtime. Structural calls like this tend to be much slower than normal method calls. The mandatory import of `reflectiveSelectable` serves as a signpost that something inefficient is going on. @@ -121,8 +136,6 @@ Structural calls like this tend to be much slower than normal method calls. The `reflectiveSelectable` conversion. However, to warn against inefficient dispatch, Scala 2 requires a language import `import scala.language.reflectiveCalls`. -Before resorting to structural calls with Java reflection one should consider alternatives. For instance, sometimes a more modular _and_ efficient architecture can be obtained using type classes. - ## Extensibility New instances of `Selectable` can be defined to support means of @@ -179,13 +192,10 @@ differences. is, as long as the correspondence of the structural type with the underlying value is as stated. -- [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) is just a marker trait, which gives more leeway where and - how to define reflective access operations. By contrast - `Selectable` is a trait which declares the access operations. - - Two access operations, `selectDynamic` and `applyDynamic` are shared between both approaches. In `Selectable`, `applyDynamic` also may also take [`java.lang.Class`](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/lang/Class.html) arguments indicating the method's formal parameter types. - [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) comes with `updateDynamic`. + +- `updateDynamic` is unique to [`Dynamic`](https://scala-lang.org/api/3.x/scala/Dynamic.html) but as mentionned before, this fact is subject to change, and shouldn't be used as an assumption. [More details](structural-types-spec.md) From 9c4eb25c24ff456e161569b850f35ed2739f6807 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:28:27 +0200 Subject: [PATCH 4/9] Update #class-context-parameters in using-clauses.md --- .../reference/contextual/using-clauses.md | 29 ++++++++++++------- 1 file changed, 18 insertions(+), 11 deletions(-) diff --git a/docs/_docs/reference/contextual/using-clauses.md b/docs/_docs/reference/contextual/using-clauses.md index 9187e1916e7d..8f410ebc026e 100644 --- a/docs/_docs/reference/contextual/using-clauses.md +++ b/docs/_docs/reference/contextual/using-clauses.md @@ -50,29 +50,36 @@ Generally, context parameters may be defined either as a full parameter list `(p ## Class Context Parameters -If a class context parameter is made a member by adding a `val` or `var` modifier, -then that member is available as a given instance. +To make a class context parameter visible from outside the class body, it can be made into a member by adding a `val` or `var` modifier. 
+```scala +class GivenIntBox(using val usingParameter: Int): + def myInt = summon[Int] -Compare the following examples, where the attempt to supply an explicit `given` member induces an ambiguity: +val b = GivenIntBox(using 23) +import b.usingParameter +summon[Int] // 23 +``` +This is preferable to creating an explicit `given` member, as the latter creates ambiguity inside the class body: ```scala -class GivenIntBox(using val givenInt: Int): - def n = summon[Int] - -class GivenIntBox2(using givenInt: Int): - given Int = givenInt - //def n = summon[Int] // ambiguous +class GivenIntBox2(using usingParameter: Int): + given givenMember: Int = usingParameter + def n = summon[Int] // ambiguous given instances: both usingParameter and givenMember match type Int ``` -The `given` member is importable as explained in the section on [importing `given`s](./given-imports.md): +From the outside of `GivenIntBox`, `usingParameter` appears as if it were defined in the class as `given usingParameter: Int`, in particular it must be imported as described in the section on [importing `given`s](./given-imports.md). ```scala val b = GivenIntBox(using 23) +// Works: import b.given summon[Int] // 23 +usingParameter // 23 +// Fails: import b.* -//givenInt // Not found +summon[Int] // No given instance found +usingParameter // Not found ``` ## Inferring Complex Arguments From 2f7b6f86e17c79197e44d301c9000cccf3555b25 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 13:35:12 +0200 Subject: [PATCH 5/9] Update macros.md --- .../_docs/reference/metaprogramming/macros.md | 81 ++++++++++++------- 1 file changed, 53 insertions(+), 28 deletions(-) diff --git a/docs/_docs/reference/metaprogramming/macros.md b/docs/_docs/reference/metaprogramming/macros.md index 8045794d1143..c8da00323582 100644 --- a/docs/_docs/reference/metaprogramming/macros.md +++ b/docs/_docs/reference/metaprogramming/macros.md @@ -170,17 +170,10 @@ describing a function into a function mapping trees to trees. ```scala object Expr: ... - def betaReduce[...](...)(...): ... = ... + def betaReduce[T](expr: Expr[T])(using Quotes): Expr[T] ``` -The definition of `Expr.betaReduce(f)(x)` is assumed to be functionally the same as -`'{($f)($x)}`, however it should optimize this call by returning the -result of beta-reducing `f(x)` if `f` is a known lambda expression. -`Expr.betaReduce` distributes applications of `Expr` over function arrows: - -```scala -Expr.betaReduce(_): Expr[(T1, ..., Tn) => R] => ((Expr[T1], ..., Expr[Tn]) => Expr[R]) -``` +`Expr.betaReduce` returns an expression that is functionally equivalent to e, however if e is of the form `((y1, ..., yn) => e2)(e1, ..., en)` then it optimizes the top most call by returning the result of beta-reducing the application. Otherwise returns expr. ## Lifting Types @@ -192,7 +185,7 @@ quote but no splice between the parameter binding of `T` and its usage. 
But the code can be rewritten by adding an explicit binding of a `Type[T]`: ```scala -def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T])(using Type[R], Quotes): Expr[T => R] = +def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T], r: Type[R])(using Quotes): Expr[T => R] = '{ (x: t.Underlying) => ${ f('x) } } ``` @@ -217,14 +210,13 @@ would be rewritten to ```scala def to[T, R](f: Expr[T] => Expr[R])(using t: Type[T], r: Type[R])(using Quotes): Expr[T => R] = '{ - type T = t.Underlying + type T = summon[Type[T]].Underlying (x: T) => ${ f('x) } } ``` -The `summon` query succeeds because there is a given instance of -type `Type[T]` available (namely the given parameter corresponding -to the context bound `: Type`), and the reference to that value is +The `summon` query succeeds because there is a using parameter of +type `Type[T]`, and the reference to that value is phase-correct. If that was not the case, the phase inconsistency for `T` would be reported as an error. @@ -526,8 +518,8 @@ the code it runs produces one. ## Example Expansion -Assume we have two methods, one `map` that takes an `Expr[Array[T]]` and a -function `f` and one `sum` that performs a sum by delegating to `map`. +Assume we have two methods, `map` that takes an `Expr[Array[T]]` and a +function `f`, and `sum` that performs a sum by delegating to `map`. ```scala object Macros: @@ -552,38 +544,66 @@ object Macros: end Macros ``` -A call to `sum_m(Array(1,2,3))` will first inline `sum_m`: +A call to `sum_m(Array(1, 2, 3))` will first inline `sum_m`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) -${_root_.Macros.sum('arr)} +val arr: Array[Int] = Array.apply(1, 2, 3) +${ _root_.Macros.sum('arr) } ``` -then it will splice `sum`: +then it will call `sum`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) +${ '{ + var sum = 0 + ${ map('arr, x => '{sum += $x}) } + sum +} } +``` + +and cancel the `${'{...}}`: + +```scala +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 ${ map('arr, x => '{sum += $x}) } sum ``` -then it will inline `map`: +then it will extract `x => '{sum += $x}` into `f`, to have a value: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 val f = x => '{sum += $x} -${ _root_.Macros.map('arr, 'f)(Type.of[Int])} +${ _root_.Macros.map('arr, 'f)(Type.of[Int]) } sum ``` -then it will expand and splice inside quotes `map`: +and then call `map`: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) + +var sum = 0 +val f = x => '{sum += $x} +${ '{ + var i: Int = 0 + while i < arr.length do + val element: Int = (arr)(i) + sum += element + i += 1 + sum +} } +``` + +and cancel the `${'{...}}` again: + +```scala +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 val f = x => '{sum += $x} @@ -598,7 +618,7 @@ sum Finally cleanups and dead code elimination: ```scala -val arr: Array[Int] = Array.apply(1, [2,3 : Int]:Int*) +val arr: Array[Int] = Array.apply(1, 2, 3) var sum = 0 var i: Int = 0 while i < arr.length do @@ -662,7 +682,7 @@ It is possible to deconstruct or extract values out of `Expr` using pattern matc `scala.quoted` contains objects that can help extracting values from `Expr`. -- `scala.quoted.Expr`/`scala.quoted.Exprs`: matches an expression of a value (or list of values) and returns the value (or list of values). +- `scala.quoted.Expr`/`scala.quoted.Exprs`: matches an expression of a value (resp. 
list of values) and returns the value (resp. list of values). - `scala.quoted.Const`/`scala.quoted.Consts`: Same as `Expr`/`Exprs` but only works on primitive values. - `scala.quoted.Varargs`: matches an explicit sequence of expressions and returns them. These sequences are useful to get individual `Expr[T]` out of a varargs expression of type `Expr[Seq[T]]`. @@ -682,6 +702,11 @@ private def sumExpr(argsExpr: Expr[Seq[Int]])(using Quotes): Expr[Int] = dynamicSum.foldLeft(Expr(staticSum))((acc, arg) => '{ $acc + $arg }) case _ => '{ $argsExpr.sum } + +sum(1, 2, 3) // gets matched by Varargs + +val xs = List(1, 2, 3) +sum(xs*) // doesn't get matched by Varargs ``` ### Quoted patterns From e033487f7a2fd3cd3399604fdfb0f4ba89d3ae70 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:30:35 +0200 Subject: [PATCH 6/9] Update reflection.md --- .../reference/metaprogramming/reflection.md | 18 +++++++++++------- 1 file changed, 11 insertions(+), 7 deletions(-) diff --git a/docs/_docs/reference/metaprogramming/reflection.md b/docs/_docs/reference/metaprogramming/reflection.md index b2d492657a4e..68cb7dafcfbb 100644 --- a/docs/_docs/reference/metaprogramming/reflection.md +++ b/docs/_docs/reference/metaprogramming/reflection.md @@ -98,10 +98,11 @@ def macroImpl()(quotes: Quotes): Expr[Unit] = `quotes.reflect` contains three facilities for tree traversal and transformation. -`TreeAccumulator` ties the knot of a traversal. By calling `foldOver(x, tree)(owner)` -we can dive into the `tree` node and start accumulating values of type `X` (e.g., -of type `List[Symbol]` if we want to collect symbols). The code below, for -example, collects the `val` definitions in the tree. +`TreeAccumulator[X]` allows you to traverse the tree and aggregate data of type `X` along the way, by overriding its method `foldTree(x: X, tree: Tree)(owner: Symbol): X`. + +`foldOverTree(x: X, tree: Tree)(owner: Symbol): X` calls `foldTree` on each children of `tree` (using `fold` to give each call the value of the previous one). + +The code below, for example, collects the `val` definitions in the tree. ```scala def collectPatternVariables(tree: Tree)(using ctx: Context): List[Symbol] = @@ -115,12 +116,15 @@ def collectPatternVariables(tree: Tree)(using ctx: Context): List[Symbol] = acc(Nil, tree) ``` -A `TreeTraverser` extends a `TreeAccumulator` and performs the same traversal -but without returning any value. Finally, a `TreeMap` performs a transformation. +A `TreeTraverser` extends a `TreeAccumulator[Unit]` and performs the same traversal +but without returning any value. + +`TreeMap` transforms trees along the traversal, through overloading its methods it is possible to transform only trees of specific types, for example `transformStatement` only transforms `Statement`s. + #### ValDef.let -`quotes.reflect.ValDef` also offers a method `let` that allows us to bind the `rhs` (right-hand side) to a `val` and use it in `body`. +The object `quotes.reflect.ValDef` also offers a method `let` that allows us to bind the `rhs` (right-hand side) to a `val` and use it in `body`. Additionally, `lets` binds the given `terms` to names and allows to use them in the `body`. 
Their type definitions are shown below: From a487d9a1a06894d504f3fefb4f52388ed9f46f83 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:31:04 +0200 Subject: [PATCH 7/9] Hide simple-smp page and Move it after macros-spec --- docs/sidebar.yml | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/sidebar.yml b/docs/sidebar.yml index 65fd07031290..dfcc36cccf1b 100644 --- a/docs/sidebar.yml +++ b/docs/sidebar.yml @@ -55,10 +55,11 @@ subsection: - page: reference/metaprogramming/macros.md - page: reference/metaprogramming/macros-spec.md hidden: true + - page: reference/metaprogramming/simple-smp.md # description of a simplified metaprogramming language, this might not be the best place for it + hidden: true - page: reference/metaprogramming/staging.md - page: reference/metaprogramming/reflection.md - page: reference/metaprogramming/tasty-inspect.md - - page: reference/metaprogramming/simple-smp.md - title: Other New Features index: reference/other-new-features/other-new-features.md subsection: From dc14f8129409662511a8f60f6b04864ee5d45767 Mon Sep 17 00:00:00 2001 From: Quentin Bernet Date: Tue, 11 Oct 2022 15:32:05 +0200 Subject: [PATCH 8/9] Update derivation.md --- docs/_docs/reference/contextual/derivation.md | 40 ++++++++----------- 1 file changed, 16 insertions(+), 24 deletions(-) diff --git a/docs/_docs/reference/contextual/derivation.md b/docs/_docs/reference/contextual/derivation.md index bad47dcb0096..3fab53adc557 100644 --- a/docs/_docs/reference/contextual/derivation.md +++ b/docs/_docs/reference/contextual/derivation.md @@ -7,7 +7,7 @@ nightlyOf: https://docs.scala-lang.org/scala3/reference/contextual/derivation.ht Type class derivation is a way to automatically generate given instances for type classes which satisfy some simple conditions. A type class in this sense is any trait or class with a type parameter determining the type being operated on. Common examples are `Eq`, `Ordering`, or `Show`. For example, given the following `Tree` algebraic data type -(ADT), +(ADT): ```scala enum Tree[T] derives Eq, Ordering, Show: @@ -16,7 +16,7 @@ enum Tree[T] derives Eq, Ordering, Show: ``` The `derives` clause generates the following given instances for the `Eq`, `Ordering` and `Show` type classes in the -companion object of `Tree`, +companion object of `Tree`: ```scala given [T: Eq] : Eq[Tree[T]] = Eq.derived @@ -26,11 +26,21 @@ given [T: Show] : Show[Tree[T]] = Show.derived We say that `Tree` is the _deriving type_ and that the `Eq`, `Ordering` and `Show` instances are _derived instances_. -## Types supporting `derives` clauses +**Note:** The access to `derived` above is a normal access, therefore if there are multiple definitions of `derived` in the type class, overloading resolution applies. + +**Note:** `derived` can be used manually, this is useful when you do not have control over the definition. For example we can implement an `Ordering` for `Option`s like so: + +```scala +given [T: Ordering]: Ordering[Option[T]] = Ordering.derived +``` + +It is discouraged to directly refer to the `derived` member if you can use a `derives` clause instead. All data types can have a `derives` clause. This document focuses primarily on data types which also have a given instance of the `Mirror` type class available. +## `Mirror` + `Mirror` type class instances provide information at the type level about the components and labelling of the type. 
 They also provide minimal term level infrastructure to allow higher level libraries to provide comprehensive
 derivation support.
@@ -158,15 +168,11 @@ Note the following properties of `Mirror` types,
 + The methods `ordinal` and `fromProduct` are defined in terms of `MirroredMonoType` which is the type of kind-`*`
   which is obtained from `MirroredType` by wildcarding its type parameters.
 
-## Type classes supporting automatic deriving
+### Implementing `derived` with `Mirror`
 
-A trait or class can appear in a `derives` clause if its companion object defines a method named `derived`. The
-signature and implementation of a `derived` method for a type class `TC[_]` are arbitrary but it is typically of the
-following form,
+As seen before, the signature and implementation of a `derived` method for a type class `TC[_]` are arbitrary, but they are typically expected to be of the following form:
 
 ```scala
-import scala.deriving.Mirror
-
 inline def derived[T](using Mirror.Of[T]): TC[T] = ...
 ```
 
@@ -360,21 +366,7 @@ The framework described here enables all three of these approaches without manda
 For a brief discussion on how to use macros to write a type class `derived` method please read more at [How to write a
 type class `derived` method using macros](./derivation-macro.md).
 
-## Deriving instances elsewhere
-
-Sometimes one would like to derive a type class instance for an ADT after the ADT is defined, without being able to
-change the code of the ADT itself. To do this, simply define an instance using the `derived` method of the type class
-as right-hand side. E.g, to implement `Ordering` for `Option` define,
-
-```scala
-given [T: Ordering]: Ordering[Option[T]] = Ordering.derived
-```
-
-Assuming the `Ordering.derived` method has a context parameter of type `Mirror[T]` it will be satisfied by the
-compiler generated `Mirror` instance for `Option` and the derivation of the instance will be expanded on the right
-hand side of this definition in the same way as an instance defined in ADT companion objects.
-
-## Syntax
+### Syntax
 
 ```
 Template          ::=  InheritClauses [TemplateBody]

From 5455c89c0b9769efaf13a9c4e37270ee98456189 Mon Sep 17 00:00:00 2001
From: Quentin Bernet
Date: Wed, 19 Oct 2022 13:19:18 +0200
Subject: [PATCH 9/9] Remove motivation from description of `summonFrom`

---
 .../metaprogramming/compiletime-ops.md | 37 +------------------------------------
 1 file changed, 2 insertions(+), 35 deletions(-)

diff --git a/docs/_docs/reference/metaprogramming/compiletime-ops.md b/docs/_docs/reference/metaprogramming/compiletime-ops.md
index db30786c3af3..038935badc0b 100644
--- a/docs/_docs/reference/metaprogramming/compiletime-ops.md
+++ b/docs/_docs/reference/metaprogramming/compiletime-ops.md
@@ -174,40 +174,7 @@ val addition: 1 + 1 = 2
 
 ## Summoning Givens Selectively
 
-It is foreseen that many areas of typelevel programming can be done with rewrite
-methods instead of implicits. But sometimes implicits are unavoidable. The
-problem so far was that the Prolog-like programming style of implicit search
-becomes viral: Once some construct depends on implicit search it has to be
-written as a logic program itself. Consider for instance the problem of creating
-a `TreeSet[T]` or a `HashSet[T]` depending on whether `T` has an `Ordering` or
-not. We can create a set of given instances like this:
-
-```scala
-trait SetFor[T, S <: Set[T]]
-
-class LowPriority:
-  given hashSetFor[T]: SetFor[T, HashSet[T]] = ...
-
-object SetsFor extends LowPriority:
-  given treeSetFor[T: Ordering]: SetFor[T, TreeSet[T]] = ...
-```
-
-Clearly, this is not pretty. Besides all the usual indirection of implicit
-search, we face the problem of rule prioritization where we have to ensure that
-`treeSetFor` takes priority over `hashSetFor` if the element type has an
-ordering. This is solved (clumsily) by putting `hashSetFor` in a superclass
-`LowPriority` of the object `SetsFor` where `treeSetFor` is defined. Maybe the
-boilerplate would still be acceptable if the crufty code could be contained.
-However, this is not the case. Every user of the abstraction has to be
-parameterized itself with a `SetFor` implicit. Considering the simple task _"I
-want a `TreeSet[T]` if `T` has an ordering and a `HashSet[T]` otherwise"_, this
-seems like a lot of ceremony.
-
-There are some proposals to improve the situation in specific areas, for
-instance by allowing more elaborate schemes to specify priorities. But they all
-keep the viral nature of implicit search programs based on logic programming.
-
-By contrast, the new `summonFrom` construct makes implicit search available
+The new `summonFrom` construct makes implicit search available
 in a functional context. To solve the problem of creating the right set, one
 would use it as follows:
 
@@ -223,7 +190,7 @@ inline def setFor[T]: Set[T] = summonFrom {
 
 A `summonFrom` call takes a pattern matching closure as argument. All patterns in the closure are type ascriptions of the form `identifier : Type`.
 
-Patterns are tried in sequence. The first case with a pattern `x: T` such that an implicit value of type `T` can be summoned is chosen.
+Patterns are tried in sequence. The first case with a pattern `x: T` such that a contextual value of type `T` can be summoned is chosen.
 
 Alternatively, one can also use a pattern-bound given instance, which avoids the explicit using clause. For instance, `setFor` could also be formulated as follows: