Executors don't resolve dependencies #245
Conversation
Why is this closed? @razvan
I need the PostgreSQL driver for my scripts, and I can't provide it to Spark.
@razvan I'm using Stackable Spark 3.5.1. What I did first:
After that, in the Environment tab of the Spark UI I can see the PostgreSQL driver, but I got this exception:
Also, some of my settings: without these, other packages don't work and trigger strange errors.
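For context, the submission presumably looked something like the sketch below: resolving the driver at launch time via `--packages`. The master URL, driver coordinates, version, and script path are all assumptions on my part, not taken from the thread.

```shell
# Hypothetical reproduction: ask Spark to resolve the PostgreSQL JDBC driver
# from Maven Central at submit time. The resolved JAR shows up in the driver's
# Environment tab, but executors may still fail to see it.
spark-submit \
  --master k8s://https://kubernetes.default.svc:443 \
  --deploy-mode cluster \
  --packages org.postgresql:postgresql:42.6.0 \
  local:///stackable/spark/jobs/my_script.py
```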
Yes, this is annoying and something that Spark should really fix upstream. The only solid workaround is listed in the documentation:
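I can't confirm which workaround the documentation describes, but a common pattern (the paths and version below are assumptions) is to skip runtime resolution entirely: fetch the JAR ahead of time and hand it to Spark via `--jars` plus the `extraClassPath` settings, so both the driver and the executors have it on their classpath.

```shell
# Sketch of a "pre-provisioned JAR" workaround; paths and versions are assumptions.
# 1. Fetch the driver once, e.g. into a location mounted into the pods.
curl -fsSL -o /dependencies/postgresql-42.6.0.jar \
  https://repo1.maven.org/maven2/org/postgresql/postgresql/42.6.0/postgresql-42.6.0.jar

# 2. Point both driver and executors at it explicitly instead of using --packages.
spark-submit \
  --jars /dependencies/postgresql-42.6.0.jar \
  --conf spark.driver.extraClassPath=/dependencies/postgresql-42.6.0.jar \
  --conf spark.executor.extraClassPath=/dependencies/postgresql-42.6.0.jar \
  my_job.py
```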
I created a new issue from your comment to have another look at this.
@razvan Can you help me with one more thing? I was searching for the Dockerfile of the Stackable Spark image and, as I understand it, found it here: https://github.com/stackabletech/docker-images/blob/main/spark-k8s/Dockerfile What should I provide here? ARG PRODUCT I found it all here: https://github.com/stackabletech/docker-images/blob/main/conf.py
@supsupsap It is easier to use an existing image and augment that with additional resources, much like is done here, rather than trying to adapt the original/base Dockerfile. But please open an additional issue/discussion if necessary rather than continuing the thread here (or add a comment on #391 if relevant to that issue).
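Augmenting a published image typically looks like the sketch below. The base image tag, registry, and target JAR directory are assumptions for illustration; check the repository and the operator docs for the actual conventions.

```shell
# Sketch: extend the published image instead of rebuilding from the base Dockerfile.
# Image tag and the /stackable/spark/jars path are assumptions, not confirmed.
cat > Dockerfile <<'EOF'
FROM docker.stackable.tech/stackable/spark-k8s:3.5.1-stackable24.3.0
RUN curl -fsSL -o /stackable/spark/jars/postgresql-42.6.0.jar \
    https://repo1.maven.org/maven2/org/postgresql/postgresql/42.6.0/postgresql-42.6.0.jar
EOF
docker build -t my-registry/spark-k8s-postgres:3.5.1 .
```

Pushing the resulting image to your own registry and referencing it from the job spec keeps the upstream build untouched while adding only the extra JAR.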
Fixes #141
This PR tests whether Spark can load `--packages` in Kubernetes clusters (drivers and executors). It uses the PostgreSQL JDBC driver and Spark 3.4.0 as an example. It currently fails to load the JDBC driver in the Spark driver: