Reputation: 171
I want to stop my Spark instance once I complete my job running in a Jupyter notebook.
I did execute spark.stop()
at the end, but when I open my terminal I still see the Spark process there via ps -ef | grep spark
So every time I have to kill the Spark process ID manually. Does anyone know how to solve this problem? Thanks!!
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local") \
    .appName("Test") \
    .config("spark.executorEnv.PYTHONPATH", "pyspark.zip:py4j-0.10.7-src.zip") \
    .config('spark.jars', '/Users/xxx/Documents/snowflake-jdbc-3.12.8.jar,/Users/xxx/Documents/spark-snowflake_2.11-2.7.2-spark_2.4.jar') \
    .config('spark.jars.packages', 'org.apache.hadoop:hadoop-aws:2.7.3') \
    .getOrCreate()
Upvotes: 5
Views: 12974
Reputation: 695
Try shutting down the SparkContext instead of just the SparkSession. You can try the following:
sc.stop()
or
spark.sparkContext.stop()
and then you can do
spark.stop()
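For example, a minimal teardown cell at the end of the notebook (assuming spark is the SparkSession created in the question) could look like this:

# Shut down the underlying SparkContext first, then the session itself.
# Stopping an already-stopped context should be a no-op, so running both is safe.
spark.sparkContext.stop()
spark.stop()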
Upvotes: 7