ModuleNotFoundError: No module named 'pyspark' #6346
Pythoneryeah asked this question in Q&A
Answered by Pythoneryeah on Jul 21, 2023
It sounds like an issue with the environment: code-server is probably not running in the Conda environment (or whatever environment was used to install pyspark). It sounds like they have an entrypoint that sets the environment, but if you are running …
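A quick way to confirm this, assuming you can open a terminal or Python console inside code-server (a minimal sketch, not part of the original answer):

import sys

# Show which interpreter code-server is actually running.
print(sys.executable)

# Check whether that interpreter can see pyspark at all.
try:
    import pyspark
    print("pyspark", pyspark.__version__)
except ModuleNotFoundError:
    print("pyspark is not installed for this interpreter")

If the printed path is not the Conda environment where pyspark was installed, launching code-server from inside that environment (or pointing the kernel at it, as in the reply below) should resolve the import error.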
Thank you. I solved the problem, though I still don't know why it happens. Here's my solution (see the sketch after the JSON below):

1. mkdir -p ~/.ipython/kernels/pyspark
2. vim ~/.ipython/kernels/pyspark/kernel.json
3. Add the following:
{
  "display_name": "pySpark",
  "language": "python",
  "argv": [
"/usr/local/anacond/bin/python3",
"-m",
"IPython.kernel",
"-f",
"{connection_file}"
].
"env": {
"SPARK_HOME": "/usr/local/spark",
"PYTHONPATH" : "/ usr/local/spark/python: / usr/local/spark/python/lib/py4j - 0.10.4 - SRC. Zip",
"PYTHONSTARTUP": "/usr/local/spark/python/pyspark/shell.py ",
"PYSPARK_SUBMIT_ARGS": "pyspark-shell"
}
}
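For context, here is a minimal sketch of what the kernel's "env" block achieves, done by hand in a plain Python session. The /usr/local/spark paths and the py4j-0.10.4 version are assumptions carried over from the JSON above, so adjust them to your installation:

import os
import sys

# Mirror the SPARK_HOME and PYTHONPATH entries from the kernel spec:
# Spark ships its own Python bindings, so putting them on sys.path makes
# pyspark importable without a separate pip install.
os.environ["SPARK_HOME"] = "/usr/local/spark"
sys.path.insert(0, "/usr/local/spark/python")
sys.path.insert(0, "/usr/local/spark/python/lib/py4j-0.10.4-src.zip")

import pyspark
print(pyspark.__version__)

If this works in a plain session but the kernel still fails, double-check that the interpreter in "argv" actually exists, and restart Jupyter/code-server so the new kernel spec is picked up.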