Reputation: 5743
I installed pyspark 3.2.0 via pip install pyspark. I have installed pyspark in a conda environment named pyspark. I cannot find spark-defaults.conf. I am searching for it in ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since that is my understanding of what SPARK_HOME should be.
Should SPARK_HOME be ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark?
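For reference, this is how I am checking where the pip-installed pyspark package lives and what SPARK_HOME currently resolves to (just plain os/pyspark introspection, nothing Spark-specific):

```python
import os
import pyspark

# Directory where pip/conda placed the pyspark package
print("pyspark package dir:", os.path.dirname(pyspark.__file__))

# Whatever SPARK_HOME is set to in the current shell, if anything
print("SPARK_HOME env var:", os.environ.get("SPARK_HOME"))
```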
Upvotes: 4
Views: 1560
Reputation: 4189
1. Your SPARK_HOME environment variable is configured correctly.
2. In the pip installation environment, the $SPARK_HOME/conf directory needs to be created manually; then copy the configuration file templates into this directory and modify each configuration file (see the sketch below).
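Not part of the original answer, but a minimal sketch of step 2 for a pip/conda install, assuming SPARK_HOME defaults to the pyspark package directory; the spark.driver.memory line is only an illustrative setting, substitute the values you actually need:

```python
import os
import pyspark

# For a pip-installed pyspark, SPARK_HOME is the package directory itself
# (e.g. .../site-packages/pyspark) unless the environment variable overrides it.
spark_home = os.environ.get("SPARK_HOME", os.path.dirname(pyspark.__file__))

# The conf directory does not ship with the pip install; create it manually.
conf_dir = os.path.join(spark_home, "conf")
os.makedirs(conf_dir, exist_ok=True)

# Write a minimal spark-defaults.conf; the setting below is just an example.
defaults = os.path.join(conf_dir, "spark-defaults.conf")
if not os.path.exists(defaults):
    with open(defaults, "w") as f:
        f.write("spark.driver.memory 2g\n")

print("spark-defaults.conf is at", defaults)
```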
Upvotes: 4