
Blog post on Scala 3 macros #871


Merged · 17 commits · Apr 30, 2018

Conversation

@odersky (Contributor) commented Apr 10, 2018

No description provided.

@odersky (Contributor Author) commented Apr 10, 2018

/cc @olafurpg

@odersky changed the title from "Block post on Scala 3 macros" to "Blog post on Scala 3 macros" on Apr 10, 2018
@odersky (Contributor Author) commented Apr 10, 2018

/cc @nicolasstucki

Do we have an image for Nicolas to join?


It turns out that Tasty also makes an excellent foundation for a new generation of reflection-based macros.

The problem with the current `scala.reflect` macros is that they that they are completely dependent on the current Scala compiler (internally named `nsc`). In fact, `scala.reflect` macros are nothing but a thin veneer on top of `nsc` internals. This makes them very powerful but also fragile and hard to use. Because if this, they have had "experimental" status for their whole lifetime. Since Scala 3 uses a different compiler (`dotc`), the old reflect macros cannot be ported to it, so we need something different, and hopefully better.
Member

typo: if this -> of this


Another way to look at `scala.reflect` macros was that they were lacking _foundations_. Scala 3 has already a meta programming facility, with rock-solid foundations. [Principled Meta Programming] (http://dotty.epfl.ch/docs/reference/principled-meta-programming.html) is a way to support staging by adding just two operators to the language: Quote (`'`) to represent code expressions, and splice (`~`) to insert one piece of code in another. The inspiration for our approach [comes from temporal logic](https://ieeexplore.ieee.org/abstract/document/561317/). A somewhat similar system
is used for staging in [Meta OCaml](http://okmij.org/ftp/ML/MetaOCaml.html).
We obtain a very high level _macro system_ by combining the two temporal operators `'` and `~` with Scala 3's `inline` feature. In a nutshell:
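Quote (`'`) turns code into syntax trees, splice (`~`) embeds syntax trees in other code, and `inline` triggers expansion at compile time. A small sketch in the Dotty syntax of the time, following the `assert` example from the Principled Meta Programming write-up (details may differ from the final design):

```scala
import scala.quoted._

object Assertions {
  // `inline` guarantees the call is expanded at compile time; the splice (~)
  // then runs assertImpl during expansion, passing the argument as quoted code.
  inline def assert(expr: => Boolean): Unit =
    ~assertImpl('(expr))

  // Operates on code values: takes the quoted Boolean and builds new quoted code.
  def assertImpl(expr: Expr[Boolean]): Expr[Unit] = '{
    if (!(~expr)) throw new AssertionError("assertion failed")
  }
}
```
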
Member

there's an extra space between [Principled Meta Programming] and (http://dotty.epfl.ch/docs/reference/principled-meta-programming.html), preventing markdown from recognizing it properly.

Contributor

This is confusing. Scala 3 has a meta programming facility? Then what is this post about? I wouldn't bury the lede; I'd immediately say that the "principled metaprogramming proposal" is too principled, basically


## What is TASTY?

Tasty is the high-level interchange format for Scala 3. It is based on __t__yped __a__bstract __s__yntax __t__rees. These trees contain in a sense all the information present in a Scala program. They represent the syntactic structure of programs in every detail and also contain the complete information about types and positions. The Tasty "snapshot" of a code file is taken after type checking (so that all types are present and all implicits are elaborated) but before any transformations (so that no information is lost or changed). The file representation of these trees is heavily optimized for compactness, which means that we can generate full Tasty trees on every compiler run and rely on nothing else for supporting separate compilation.
@adriaanm (Contributor), Apr 10, 2018

The __ don't render correctly. Looks like html syntax doesn't need spaces:

<i>T</i>yped <i>A</i>bstract <i>S</i>yntax <i>T</i>rees


It turns out that Tasty also makes an excellent foundation for a new generation of reflection-based macros.

The problem with the current `scala.reflect` macros is that they that they are completely dependent on the current Scala compiler (internally named `nsc`). In fact, `scala.reflect` macros are nothing but a thin veneer on top of `nsc` internals. This makes them very powerful but also fragile and hard to use. Because if this, they have had "experimental" status for their whole lifetime. Since Scala 3 uses a different compiler (`dotc`), the old reflect macros cannot be ported to it, so we need something different, and hopefully better.
Contributor

  • "The first problem" (since you're going to talk about lack of foundations as the second problem below)
  • "the old reflect macros cannot be ported" -- disambiguate (do you mean the system, or macros written in the old style); actually, it will be possible to port old-style macros to the new system, it'll just involve work

- **Completeness**. Tasty is Scala 3's interchange format, so basing the reflection API on it means no information is lost.
- **Stability**. As an interchange format, Tasty will be kept stable. Its evolution will be carefully managed with a strict versioning system. So the reflection API can be evolved in a controlled way.
- **Compiler Independence**. Tasty has been designed to be independent of the actual Scala compilers supporting it.
So the reflection API can be easily ported to new compilers. If a compiler supports Tasty as the interchange format, it can be made to support the reflection API at the same time.
Contributor

For balance, list a few downsides? Potential duplication of code for algorithms that compute info that's not captured by tasty. Efficient serialization of IR and higher-level manipulation of code are aligned, but not fully: you want different levels of details. Do users really want to see all the fine details that TASTY has to record? Yes, this leads back to choice paralysis in designing the right abstractions, but maybe worth acknowledging in the bullet list.

Member

I think the idea is that higher-level abstractions can be built on top of the tasty reflection layer.


As a first step towards this goal, we are working on a representation of Tasty in terms of a suite of compiler-independent data structures. The [current status](https://github.com/lampepfl/dotty/blob/master/tests/pos/tasty/definitions.scala) gives high-level data structures for all aspects of a Tasty file. With currently 192 lines of data definitions it reflects every piece of information that is contained in a Scala program after type checking. 192 lines is larger than a definition of mini-Lisp, but much, much smaller than the 30'000 lines or so of a full-blown compiler frontend!
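To give a feel for the flavour of such definitions, here is an illustrative, much-simplified fragment (not the actual contents of `definitions.scala`):

```scala
// Illustrative only: plain, compiler-independent case classes in the spirit of
// the Tasty data definitions. The real definitions.scala covers far more cases.
object TastyShapes {
  sealed trait Tree
  case class Ident(name: String)                                    extends Tree
  case class Select(qualifier: Tree, name: String)                  extends Tree
  case class Apply(fun: Tree, args: List[Tree])                     extends Tree
  case class ValDef(name: String, tpt: TypeTree, rhs: Option[Tree]) extends Tree

  sealed trait TypeTree
  case class TypeIdent(name: String)                                extends TypeTree
  case class AppliedType(tycon: TypeTree, args: List[TypeTree])     extends TypeTree
}
```
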

## Nest Steps
Contributor

"Next"


## Nest Steps

The next step, currently under way, is to connect these definitions to the Tasty file format. We plan to do this by rewriting them as [extractors](https://docs.scala-lang.org/tour/extractor-objects.html) that implement each data type in terms of the data structures used by the `dotc` compiler which are then pickled and unpickled in the Tasty file format. An interesting alternative would be to write Tasty picklers and unpicklers that work directly with reflect trees.
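In miniature, the extractor idea looks like this (a self-contained toy, not the real `dotc` data structures; all names below are made up): the public shapes are implemented as `unapply` methods over an internal representation, so client code pattern-matches without ever seeing the internals.

```scala
// Toy stand-in for compiler-internal trees.
object internal {
  sealed trait Tree
  case class IdentTree(name: String)                extends Tree
  case class ApplyTree(fun: Tree, args: List[Tree]) extends Tree
}

// Compiler-independent view: extractors implement the public shapes
// in terms of the internal data structures.
object api {
  type Tree = internal.Tree

  object Ident {
    def unapply(t: Tree): Option[String] = t match {
      case internal.IdentTree(n) => Some(n)
      case _                     => None
    }
  }
  object Apply {
    def unapply(t: Tree): Option[(Tree, List[Tree])] = t match {
      case internal.ApplyTree(f, as) => Some((f, as))
      case _                         => None
    }
  }
}

// Client code matches against the api extractors only.
object client {
  def show(t: api.Tree): String = t match {
    case api.Apply(api.Ident(f), args) => f + args.map(show).mkString("(", ", ", ")")
    case api.Ident(n)                  => n
    case _                             => "<tree>"
  }
}
```
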
Contributor

Contrast/compare with Gestalt/scala.meta's extractors?

The Scala 3 language should also directly incorporate some constructs that so far required
advanced macro code to define. For instance:

- We model lazy implicits directly using
Contributor

Another one: haoyi's source for line info etc

Contributor Author

We don't do that (yet), so better not list it. Since we have positions, we can delegate that to a macro if we wish.

The problem with the current `scala.reflect` macros is that they that they are completely dependent on the current Scala compiler (internally named `nsc`). In fact, `scala.reflect` macros are nothing but a thin veneer on top of `nsc` internals. This makes them very powerful but also fragile and hard to use. Because if this, they have had "experimental" status for their whole lifetime. Since Scala 3 uses a different compiler (`dotc`), the old reflect macros cannot be ported to it, so we need something different, and hopefully better.

Another way to look at `scala.reflect` macros was that they were lacking _foundations_. Scala 3 has already a meta programming facility, with rock-solid foundations. [Principled Meta Programming] (http://dotty.epfl.ch/docs/reference/principled-meta-programming.html) is a way to support staging by adding just two operators to the language: Quote (`'`) to represent code expressions, and splice (`~`) to insert one piece of code in another. The inspiration for our approach [comes from temporal logic](https://ieeexplore.ieee.org/abstract/document/561317/). A somewhat similar system
is used for staging in [Meta OCaml](http://okmij.org/ftp/ML/MetaOCaml.html).


I'd suggest MetaOCaml as one word.

- A language server for an IDE uses it to support hyperlinking, command completion, or documentation.
- A build tool can use it to cross-build on different platforms and migrate code from one
version to another.
- Optimizers and analyzers can use it for deep code analysis and advanced code generation


Could add decompilation of binary code.


As a first step towards this goal, we are working on a representation of Tasty in terms of a suite of compiler-independent data structures. The [current status](https://github.com/lampepfl/dotty/blob/master/tests/pos/tasty/definitions.scala) gives high-level data structures for all aspects of a Tasty file. With currently 192 lines of data definitions it reflects every piece of information that is contained in a Scala program after type checking. 192 lines is larger than a definition of mini-Lisp, but much, much smaller than the 30'000 lines or so of a full-blown compiler frontend!

## Nest Steps


Typo: Nest -> Next


## Future Macros

Adopting this scheme gives already some idea what Scala 3 macros will look like. First, they will run after the typechecking phase is finished because that is when Tasty trees are generated and consumed. This means macros will be blackbox - a macro expansion cannot influence the type of the expanded expression as seen from the typechecker. A long as that constraint is satisfied we should be able to support both `def` macros and annotation macros.
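As a small, hypothetical sketch of what the blackbox constraint means in practice (again in the quote/splice syntax of the time; `wrap`/`wrapImpl` are made-up names): the call site only ever sees the macro's declared result type, no matter what the expansion produces.

```scala
import scala.quoted._

object Macros {
  // The declared result type is Number; the typechecker uses this, and only this,
  // at every call site of `wrap`, regardless of what the expansion produces.
  inline def wrap(n: => Int): Number = ~wrapImpl('(n))

  // Runs after type checking; happens to build an Integer, but under the blackbox
  // rule that more precise type is never visible to the surrounding code.
  def wrapImpl(n: Expr[Int]): Expr[Number] = '{ Integer.valueOf(~n) }
}

// At a call site (compiled downstream of the macro definition):
//   val a: Number  = Macros.wrap(1)   // fine: the declared type
//   val b: Integer = Macros.wrap(1)   // rejected: the expansion cannot refine the type
```
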
Contributor

Annotation macros can introduce new definitions (that can be then referenced by user code), so it’s not clear to me how they can be expanded after type checking.

Contributor Author

The user code must be compiled separately (i.e. typically in a downstream project).

It turns out that Tasty also makes an excellent foundation for a new generation of reflection-based macros.

The first problem with the current `scala.reflect` macros is that they that they are completely dependent on the current Scala compiler (internally named `nsc`). In fact, `scala.reflect` macros are nothing but a thin veneer on top of `nsc` internals. This makes them very powerful but also fragile and hard to use. Because of this, they have had "experimental" status for their whole lifetime. Since Scala 3 uses a different compiler (`dotc`), the old reflect-based macro system cannot be ported to it, so we need something different, and hopefully better.

@liufengyun, Apr 10, 2018

This makes them very powerful but also fragile and hard to use.

It would be good to have an analysis of why it is hard to use. I'll try to sketch some of the context. Maybe @xeno-by can add more and/or correct me.

For me, macros form a spectrum between two extremes:

  1. those that can be put on a solid foundation (e.g. based on PMP)
  2. those that deal with low-level compiler entities: trees/types/symbols

It’s real progress to identify a set of macros that can be implemented based on solid principles.

For those that cannot be based on solid principles, what do we do?

There are many macros that are out of the reach of principled handling, and we cannot afford to make each one a compiler plugin. Also, portability seems still possible for some of the unprincipled macros. It seems we need a low-level portable macro system.

However, from the experience with Gestalt, I'm pessimistic about a generic portable macro system, due to complex compiler invariants and usability concerns. A generic portable macro system forms an obstacle for communication and learning when the abstraction breaks. And the abstraction of a generic portable macro system is doomed to break due to the wide spectrum of macros.

Advanced macros are no different in nature from compiler (or compiler plugin) development. Thus, compiler crashes while developing such macros are expected. However, if there is an abstraction layer in between, compiler writers cannot help macro authors directly when the compiler crashes. There are more compiler experts than macro system experts, and it's a big disadvantage if they do not share the same body of knowledge. A generic portable macro system thus inevitably leads to a user-support problem. Programmers who write macros that break compiler invariants are usually advanced Scala programmers; it is not an issue for them to learn if there is enough guidance and support.

Instead, I'm optimistic about domain-specific (semi-)portable macro systems. Even though a generic abstraction is hopeless, an abstraction can be successful for a particular domain, as magnolia shows. Domain-specific macro systems provide a better programming experience because they can make more solid assumptions about their particular domain.

Then the question is, what is needed from the compiler side to support domain-specific (semi-)portable macro systems? One simple answer is that they are compiler plugins. But compiler plugins are a little heavier to set up and use, so why not just expose compiler internals as a low-level unportable macro system on which high-level domain-specific (semi-)portable macro systems can be built? I see big advantages in doing so:

  1. it is easy to implement in the compiler.
  2. authors of advanced macros, compiler plugins and compiler hackers share the same body of knowledge.
  3. it makes the development of high-level domain-specific (semi-)portable macro systems easier.

My vision for future Scala meta-programming is the co-existence of

  1. a low-level unportable macro system,
  2. several domain-specific (semi-)portable macro systems (like magnolia), and
  3. principled meta-programming systems (like inlining and PMP).

Contributor

Very well said! I always have a hard time remembering what things are called in scala.reflect land, because they are slightly different from the compiler internals.

The other key point I agree with: we should not lose sight of identifying the real challenges faced by macro authors out there. For each of those, we should ask ourselves, does basing things on TASTY solve this? Don't get me wrong, I don't think basing things on TASTY is a bad idea, but it doesn't magically solve all our problems, so let's be clear about that.

For example: hygiene; difficulties with getting owner & tree structure to align; embedding untyped trees into typed ones.... These are all difficult API design challenges. I'm sure there are more, and I'm not the one who has this list in my head, but we should not lose the knowledge we've built up with the old system, and make sure new proposals at least explain how they deal with these things. Not in this blog post, of course, but since we're on the topic I just wanted to say "+1" :-)

Contributor Author

I agree we should look at the difficulties of existing macros, but probably not in this blog post. The point is that the new approach is really a different world. For instance, regarding the obvious problems Adriaan mentioned:

  • hygiene: not an issue in a typed world (I would not even know how to define what it means!)
  • owners should not be under programmer control, they should be handled fully automatically from the tree structure.
  • embedding untyped trees into typed ones: will not be possible.

I think it's futile to try to compare with existing macros now. We have to flesh out the system first to see where problems might lie. Also: I think it would be great if the new macro system was completely independent of the underlying compiler except for tree extractors. If that involves some duplication of code (e.g. subtyping implemented twice), that's a price worth paying, IMO.

@olafurpg (Contributor) left a comment

Thanks for this writeup @odersky ! I left a few thoughts


## Future Macros

Adopting this scheme gives already some idea what Scala 3 macros will look like. First, they will run after the typechecking phase is finished because that is when Tasty trees are generated and consumed. This means macros will be blackbox - a macro expansion cannot influence the type of the expanded expression as seen from the typechecker. A long as that constraint is satisfied we should be able to support both `def` macros and annotation macros.
Contributor

A long as that constraint is satisfied we should be able to support both def macros and annotation macros.

This would be good to expand into a separate paragraph, maybe with examples. People don't associate "blackbox" with annotation macros, so it may require a more explicit explanation.

Contributor

they will run after the typechecking phase is finished

I think this merits a separate paragraph as well! This is IMO the most exciting idea from the post with interesting consequences.


- The compiler uses it to support separate compilation.
- A language server for an IDE uses it to support hyperlinking, command completion, or documentation.
- A build tool can use it to cross-build on different platforms and migrate code from one
Contributor

"migrate code from one version to another" can be misinterpreted as source rewriting. How about?

cross-build on different platforms and depend on artifacts from incompatible compiler versions

Contributor Author

I am not sure about the "depend on artifacts" part. I added "binary" to version to make it clear we don't do source code rewriting.

date: 2018-03-05
---

One of the biggest open questions for migrating to Scala 3 is what to do about macros. In this blog post we give our current thinking, which is to try to achieve full alignment between macros and Tasty.
Contributor

It would be good to provide a rough estimated timeline somewhere.

Contributor

It would also be good to mention what the ideas are for Tasty support in Scala 2.

- `(')` turns code into syntax trees
- `(~)` embeds syntax trees in other code.

This approach to macros is very elegant, and has surprising expressive power. But it might a little bit too principled. There are still many bread and butter tasks one cannot do with it. In particular:
Contributor

it might be a little bit

- Syntax trees are "black boxes", we are missing a way to decompose them and analyze their structure and contents.
- We can only quote and splice expressions, but not other program structures such as definitions or parameters.

We were looking for a long time for ways to augment principled meta programming by ways to decompose and flexibly reconstruct trees. The main problem here is choice paralysis - there is basically an infinite number of ways to expose the underlying structure. Quasi-quotes or syntax trees? Which constructs should be exposed exactly? What are the auxiliary types and operations?
Contributor

I'm unsure that choice paralysis over what data model to expose was a root dilemma of scala.reflect. Quasiquotes provide ergonomics over the regular apply constructors; they're complementary to syntax trees, not mutually exclusive.

@odersky (Contributor Author), Apr 11, 2018

It's not just the trees, but also the symbols, the types, the flags, contexts, everything really.


## Next Steps

The next step, currently under way, is to connect these definitions to the Tasty file format. We plan to do this by rewriting them as [extractors](https://docs.scala-lang.org/tour/extractor-objects.html) that implement each data type in terms of the data structures used by the `dotc` compiler which are then pickled and unpickled in the Tasty file format. An interesting alternative would be to write Tasty picklers and unpicklers that work directly with reflect trees. The extractor-based approach was alredy pioneered in [ScalaMeta](http://scalameta.org) and [Gestalt](https://github.com/liufengyun/gestalt)
Contributor

s/alredy/already/

Contributor

The extractor-based approach was pioneered in Gestalt and never used in Scalameta. Scalameta paradise converted to concrete scala.meta.Tree instances (similar to tasty.definitions._). It's still unclear to me which approach is better; each has pros and cons.

Contributor

Actually, the extractor-based approach was pioneered in scala.reflect and remains at the core of the current macro API. See scala.reflect.api.Trees.

Contributor Author

Thanks for the clarification.


It depends on what we mean by the extractor-based approach: though both use extractors, there's a huge difference between scala.reflect and what's called the extractor-based approach in Gestalt.

For the historical record, the Gestalt-style approach was actually pioneered in ScalaBackendInterface by @DarkDimius, which is documented in chapter 3 of his thesis. Gestalt independently arrived at the same design.

Member

missing a . at the end of the sentence.


## Please Give Us Your Feedback!

What do you think of the macro roadmap? Your feedback would be much appreciated. There is also lots of scope to shape the future by contributing to the development.
Contributor

Where is the best place for contributors to join the discussion? It would be good to include a link.

Contributor Author

I was wondering about this as well. Should we point them to scala contributors or open the discussion on Disqus? Do we still support blog discussions on the site?

Contributor

Definitely not disqus. Please create a thread on contributors. Personally, I think the whole article would make more sense as a contributors post. Posting on scala-lang, which we mostly use for release announcements, gives it the same air of doneness. We know the difference, but does a casual reader of the website?

Contributor

I think Scala contributors is the best place for these discussions. I think you can post an "Unlisted" thread to get a link without a public post.

Contributor Author

I am open to either contributors or the blog. But the style of the blog is more oriented towards users, not contributors. What do others prefer?

@adriaanm (Contributor), Apr 11, 2018

I would expect most macro authors to be on contributors. They really should be, at least, since they're basically contributing changes to the compiler and delivering them with their macros... Posting this there is a great way to get those developers to join on discourse. The website is too static for fluent topics like this, in my opinion.

Contributor Author

I think the blog post is written for a general audience. Contributors don't need an explanation of what Tasty is. So it fits better in the blog. We'll leave a separate thread on contributors.

[TastyFormat.scala](https://github.com/lampepfl/dotty/blob/master/compiler/src/dotty/tools/dotc/core/tasty/TastyFormat.scala)
of the `dotc` compiler for Scala 3.

## What Does it Have to Do with Macros?
Contributor

s/it/Tasty/
?

The information present in TASTY trees can be used for many purposes.

- The compiler uses it to support separate compilation.
- A language server for an IDE uses it to support hyperlinking, command completion, or documentation.
Contributor

has documentation support been implemented in tasty? Comment below says it works today.

Contributor Author

I think we are working on it


## What is TASTY?

Tasty is the high-level interchange format for Scala 3. It is based on <u>t</u>yped <u>a</u>bstract <u>s</u>yntax <u>t</u>rees. These trees contain in a sense all the information present in a Scala program. They represent the syntactic structure of programs in every detail and also contain the complete information about types and positions. The Tasty "snapshot" of a code file is taken after type checking (so that all types are present and all implicits are elaborated) but before any transformations (so that no information is lost or changed). The file representation of these trees is heavily optimized for compactness, which means that we can generate full Tasty trees on every compiler run and rely on nothing else for supporting separate compilation.
Contributor

syntactic structure of programs in every detail

I would tone down this claim by removing "in every detail". Every detail includes comments and formatting, which are required by tools like scalafmt and scalafix.

Looking at tasty.definitions, it does not seem to represent syntax like val destructuring (val List(a) = ...) or for comprehensions (for (a <- ..) yield) either. Simpler trees are good for macros, but overly simplified trees can be problematic in other applications.

@Philippus (Member), Apr 11, 2018

the underline html <u>...</u> will not render (at least not in GitHub).


Adopting this scheme gives already some idea what Scala 3 macros will look like. First, they will run after the typechecking phase is finished because that is when Tasty trees are generated and consumed. This means macros will be blackbox - a macro expansion cannot influence the type of the expanded expression as seen from the typechecker. A long as that constraint is satisfied we should be able to support both `def` macros and annotation macros.

We might support some forms of whitebox macros by allowing macros in the types themselves. These macros would be highlevel only, and would integrate with implicit search. A sketch of such as system is outlined in [Dotty PR 3844](https://github.com/lampepfl/dotty/pull/3844]).
Contributor

broken link

@adriaanm (Contributor)

the other broken link that's breaking the build is fixed by #873


It turns out that Tasty also makes an excellent foundation for a new generation of reflection-based macros.

The first problem with the current `scala.reflect` macros is that they that they are completely dependent on the current Scala compiler (internally named `nsc`). In fact, `scala.reflect` macros are nothing but a thin veneer on top of `nsc` internals. This makes them very powerful but also fragile and hard to use. Because of this, they have had "experimental" status for their whole lifetime. Since Scala 3 uses a different compiler (`dotc`), the old reflect-based macro system cannot be ported to it, so we need something different, and hopefully better.
Member

typo: that they that they


## Future Macros

Adopting this scheme gives already some idea what Scala 3 macros will look like. First, they will run after the typechecking phase is finished because that is when Tasty trees are generated and consumed. This means macros will be blackbox - a macro expansion cannot influence the type of the expanded expression as seen from the typechecker. A long as that constraint is satisfied we should be able to support both `def` macros and annotation macros.
Member

A long as -> As long as

@odersky (Contributor Author) commented Apr 11, 2018

@olafurpg Good point. I'll switch to json.

Also, put quotes around title
@odersky (Contributor Author) commented Apr 11, 2018

We decided to wait a little bit longer and to consult more people before publishing.

@odersky closed this Apr 11, 2018
@SethTisue (Member)

the pgp.mit.edu error is probably transient — an inherent problem with link-checking — but if it keeps cropping up I'll deal with it. (it's okay to ignore htmlproofer errors that are clearly unrelated to the changes in the PR in question.)

the other failure, I think, happens on every PR that adds a new blog post; opened #875 on it

## Please Give Us Your Feedback!

What do you think of the macro roadmap? Your feedback would be much
appreciated. There is also lots of scope to shape the future by
Member

I would suggest linking to a https://contributors.scala-lang.org thread that you create just before this is merged, so it's clear where feedback should go.

@odersky reopened this Apr 28, 2018
@SethTisue (Member) commented Apr 30, 2018

Martin says we could perhaps publish this as soon as today. @heathermiller @adriaanm @propensive @smarter @olafurpg any last concerns?

@SethTisue (Member) commented Apr 30, 2018

@odersky the current title is clever, but not clear. I think the topic is of such central importance to the community that it would be better to use a straightforward title such as "Macros: the Plan for Scala 3".

@phaller commented Apr 30, 2018

Exciting developments!

One comment: I think it would be better to refer to the 2017 JACM article by Davies (https://dl.acm.org/citation.cfm?id=3011069), because it is a revised and extended version of his 1996 LICS paper (this is also mentioned on page 3 of the JACM article).

@odersky merged commit e3903b8 into scala:master Apr 30, 2018
contain in a sense all the information present in a Scala
program. They represent the syntactic structure of programs and also
contain the complete information about types and positions. The Tasty
"snapshot" of a code file is taken after type checking (so that all
Member

What is a "code file"?
