Our legacy test infrastructure is a maintenance burden and consumes resources on our CI. To get rid of it, I think we need:

1. Proper integration meta-tests. We already have unit meta-tests in https://github.com/lampepfl/dotty/blob/master/compiler/test/dotty/tools/vulpix/VulpixTests.scala, but these don't run vulpix end-to-end. We should set up a directory of test files designed to stress vulpix in various ways (compiler crash, infinite loop at runtime, excessive memory usage, ...) and to exercise common features (proper diff between check and run files, correct number of errors reported, ...). Vulpix should then be run on this directory, just as it is currently run on tests/, and its output should be compared against some expected output (a rough sketch follows this list).

2. Optional: a test mode where the list of files passed to the compiler is sorted in a random order (controlled by a seed). We sometimes catch bugs only because the legacy tests happen to pass the files to the compiler in a different order than vulpix, so we should have a way to intentionally try different orders. Trying every possible order all the time is impractical, but we could run nightly tests with a few different random orders each time (see the second sketch below).
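For the first point, here is a rough sketch of what such an end-to-end meta-test could look like: run vulpix on a dedicated directory of pathological inputs and diff the stable part of its report against a checked-in expectation. The directory `tests/vulpix-meta`, the sbt invocation, the `[=]` report marker and `expected-report.txt` are hypothetical placeholders, not existing paths or conventions in the repository.

```scala
import scala.io.Source
import scala.sys.process._

// Sketch only: run vulpix end-to-end on a dedicated meta-test directory and
// compare its output to a checked-in expectation. All names below
// (tests/vulpix-meta, expected-report.txt, the "[=]" marker, the sbt command)
// are hypothetical placeholders.
object VulpixMetaTest {
  def main(args: Array[String]): Unit = {
    val expected =
      Source.fromFile("tests/vulpix-meta/expected-report.txt").getLines().toList

    // Run vulpix on the meta-test directory and capture its console output.
    // (!! throws if the command exits with a non-zero status.)
    val output = Process(Seq("sbt", "testCompilation tests/vulpix-meta")).!!

    // Keep only the stable report lines (error counts, check-file diffs),
    // dropping timings and other nondeterministic noise before comparing.
    val report = output.linesIterator.filter(_.startsWith("[=]")).toList

    assert(report == expected,
      "vulpix output on tests/vulpix-meta diverged from the expected report")
  }
}
```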
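For the second point, the ordering itself is simple to make reproducible. A minimal sketch (the helper below is illustrative, not the existing vulpix API): given the files of one test group, either keep the default order or shuffle them deterministically from the seed, so a failing order can be replayed by re-running with the same seed.

```scala
import java.io.File
import scala.util.Random

// Sketch only: deterministic, seed-controlled ordering of the files handed
// to the compiler. `orderFiles` is a hypothetical helper, not part of vulpix.
def orderFiles(files: List[File], randomOrder: Option[Int]): List[File] =
  randomOrder match {
    case Some(seed) => new Random(seed).shuffle(files) // same seed => same order, so failures are reproducible
    case None       => files                           // keep the default order
  }
```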
Regarding the second point: compileDir already has an optional argument randomOrder: Option[Int] for this; maybe it should be wired to something more global and easier to change than a function argument.
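For example, a sketch under the assumption that a JVM system property is an acceptable "more global" switch; the property name, the environment-variable fallback and the call-site shown in the comment are made up, not existing dotty settings.

```scala
import scala.util.Try

// Sketch only: read the shuffle seed once from a JVM property or environment
// variable instead of threading it through every call. The names
// dotty.tests.randomOrderSeed and DOTTY_TESTS_RANDOM_ORDER_SEED are
// hypothetical, not existing settings.
object TestOrdering {
  lazy val randomOrderSeed: Option[Int] =
    sys.props.get("dotty.tests.randomOrderSeed")
      .orElse(sys.env.get("DOTTY_TESTS_RANDOM_ORDER_SEED"))
      .flatMap(s => Try(s.toInt).toOption)
}

// Call sites (and nightly CI jobs started with e.g.
// -Ddotty.tests.randomOrderSeed=12345) could then pick the seed up implicitly:
//   compileDir("tests/pos/somedir", defaultOptions, randomOrder = TestOrdering.randomOrderSeed)
```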