Shriniwas

Reputation: 754

Cannot run multiple SparkContexts at once

I am at the beginner stage of learning Spark. I have just started coding in Python with PySpark. While going through some basic code, I got this error in a Jupyter notebook. I have installed Spark on my PC and it is in working condition. My problem is that when I enter "pyspark" in my Ubuntu terminal, it goes straight to the Jupyter web UI instead of the interactive shell. I don't know why.

Secondly, when I run the following code I get an error.

from pyspark import SparkContext, SparkConf
conf = SparkConf().setAppName('appName').setMaster('local')
sc = SparkContext(conf=conf)
data = range(10)
dist_data = sc.parallelize(data)
print(dist_data.reduce(lambda a, b: a+b))

The error from the code above is:

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by at /home/trojan/.local/lib/python3.6/site-packages/IPython/utils/py3compat.py:186

What does that mean? Please tell me what the error could be. Sorry about the error image; I couldn't paste the text clearly, so I pasted a screenshot of the error. Hope it works!

Upvotes: 3

Views: 10823

Answers (3)

Sanjay Nandakumar

Reputation: 421

Please try this code:

from pyspark import SparkContext

# Reuse the SparkContext that is already running instead of creating a new one
sc = SparkContext.getOrCreate()
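
Adapted to the code from the question, a minimal sketch might look like this (note that if a context already exists, getOrCreate() returns it and any new conf you pass is not applied):

from pyspark import SparkContext, SparkConf

# getOrCreate() returns the SparkContext the PySpark shell already started,
# or creates a new one with this conf if none is running yet.
conf = SparkConf().setAppName('appName').setMaster('local')
sc = SparkContext.getOrCreate(conf)

data = range(10)
dist_data = sc.parallelize(data)
print(dist_data.reduce(lambda a, b: a + b))  # 45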

Upvotes: 3

VjyAnnd

Reputation: 1

Check whether you have called SparkContext() more than once. Make sure it is created only once.
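
A minimal sketch of that idea, assuming the notebook kernel is the PySpark shell and therefore already has the shell-provided `sc` bound:

from pyspark import SparkContext, SparkConf

# Stop the context the shell created before building your own, so that only
# one SparkContext exists in the kernel at a time.
sc.stop()  # `sc` is the context the PySpark shell defines at startup

conf = SparkConf().setAppName('appName').setMaster('local')
sc = SparkContext(conf=conf)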

Upvotes: 0

maksim

Reputation: 98

You can run only one Spark context per Python kernel (notebook). If you need another Spark context, you can open another notebook. Otherwise there is no reason to have multiple Spark contexts in the same notebook: you can use the same one as many times as you need, depending on your problem.
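
For example, a single context can serve several jobs in the same notebook; a rough sketch:

from pyspark import SparkContext

# One context per kernel, reused for as many operations as you need.
sc = SparkContext.getOrCreate()

rdd = sc.parallelize(range(10))
print(rdd.reduce(lambda a, b: a + b))   # 45
print(rdd.map(lambda x: x * x).sum())   # 285

# Stop it only when the notebook is completely done with Spark:
# sc.stop()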

Upvotes: 4
