Fix the list of supported Spark versions #224


Closed
siegfriedweber opened this issue Mar 30, 2023 · 0 comments · Fixed by #274


The list of supported Spark versions must be kept consistent across the documentation, the docker-images repository, and the integration tests. Currently, these three sources differ.

Documentation

The documentation states that the following versions are supported:

  • 3.2.1-hadoop3.2
  • 3.2.1-hadoop3.2-python39
  • 3.3.0-hadoop3

Docker images

There are only images tagged with

  • 3.2.1-stackable23.1.0
  • 3.3.0-stackable23.1.0

There is no explicit python39 image.

In the next release, there will also be an image tagged with 3.3.0-java17-stackable23.4.0, but only for pyspark, not for spark.

Integration tests

The integration tests only cover 3.3.0-stackable23.1.0. A local test run with 3.2.1-stackable23.1.0 failed.
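The mismatch described above can be made explicit with a small comparison. The following is only a sketch: the three lists are copied from this issue, and stripping everything after the first "-" to recover the bare Spark version is an assumption about the tag formats, not part of any repository tooling.

```python
# Sketch: compare the Spark versions referenced in the three places
# named in this issue. Lists are copied from the issue text.

documented = ["3.2.1-hadoop3.2", "3.2.1-hadoop3.2-python39", "3.3.0-hadoop3"]
docker_tags = ["3.2.1-stackable23.1.0", "3.3.0-stackable23.1.0"]
tested = ["3.3.0-stackable23.1.0"]

def base_versions(tags):
    """Strip everything after the first '-' to get the bare Spark version."""
    return {tag.split("-", 1)[0] for tag in tags}

doc, img, test = map(base_versions, (documented, docker_tags, tested))

print("documented but not tested:", sorted(doc - test))
print("images without test coverage:", sorted(img - test))
```

Running this shows that 3.2.1 is documented and has an image, yet is not covered by the integration tests, which matches the failed local test run mentioned above.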
