figs_and_nuts

Reputation: 5743

Where to modify spark-defaults.conf if I installed pyspark via pip install pyspark

I installed pyspark 3.2.0 via pip install pyspark, inside a conda environment named pyspark. I cannot find spark-defaults.conf. I have been searching for it in ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since that is my understanding of what SPARK_HOME should be.

  1. Where can I find spark-defaults.conf? I want to modify it.
  2. Am I right in setting SPARK_HOME to the installation location of pyspark, i.e. ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark? (See the check below.)
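For reference, this is how I am checking where the package is installed (just inspecting the package path from Python; I am not certain this is the directory Spark itself treats as SPARK_HOME):

```python
# Print the directory of the pip-installed pyspark package; this is the
# path I am considering using as SPARK_HOME.
import pathlib

import pyspark

print(pathlib.Path(pyspark.__file__).parent)
# -> ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark
```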

Upvotes: 4

Views: 1560

Answers (1)

过过招

Reputation: 4189

2. Yes, the SPARK_HOME environment variable is configured correctly.

1. In a pip installation, the $SPARK_HOME/conf directory does not exist by default; create it manually, then copy the configuration file templates into it and modify each configuration file as needed (a sketch of this step follows).
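A minimal sketch of that step in Python, assuming a pip/conda installation where SPARK_HOME is the pyspark package directory; the entries written to spark-defaults.conf are placeholders, not required values:

```python
# Create $SPARK_HOME/conf for a pip-installed pyspark and seed a
# spark-defaults.conf file there.
import pathlib

import pyspark

# For a pip install, the pyspark package directory acts as SPARK_HOME.
spark_home = pathlib.Path(pyspark.__file__).parent
conf_dir = spark_home / "conf"
conf_dir.mkdir(exist_ok=True)

defaults = conf_dir / "spark-defaults.conf"
if not defaults.exists():
    # Placeholder entries; replace with the settings you actually need.
    defaults.write_text(
        "spark.driver.memory 4g\n"
        "spark.sql.shuffle.partitions 200\n"
    )

print(f"SPARK_HOME     = {spark_home}")
print(f"spark-defaults = {defaults}")
```

Properties placed in this file should be picked up the next time a Spark session is launched from this environment, since the launcher reads $SPARK_HOME/conf/spark-defaults.conf.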

Upvotes: 4
