Reputation: 1
I am just trying to execute sc.version
inside the pyspark shell, but I am getting an error that sc
is not defined.
>>> sc.version()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
If I run SparkContext.getOrCreate(), it returns:
>>> SparkContext.getOrCreate()
<pyspark.context.SparkContext object at 0x7f206aa8cfd0>
Even then, I do not get the output of sc.version(). What is the problem?
Upvotes: 0
Views: 879
Reputation: 308
A few things:

- If sc is not defined in your shell, you can get the SparkContext from the session with sc = spark.sparkContext, or by using the getOrCreate() method as mentioned by @Smurphy0000 in the comments.
- version is an attribute of the context object (sc in this case), not a method: version = sc.version. The version can also be extracted from the session directly as version = spark.version.
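For example, a minimal sketch putting both options together (assuming pyspark is installed; in the pyspark shell the spark session already exists, so the builder line is only needed in a standalone script):

from pyspark.sql import SparkSession

# In the pyspark shell `spark` is already defined; a standalone script has to create it.
spark = SparkSession.builder.getOrCreate()

# Get the SparkContext from the active session.
sc = spark.sparkContext

# version is an attribute, not a method, so no parentheses.
print(sc.version)     # prints the Spark version string
print(spark.version)  # same value, read from the session directly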
Upvotes: 1