XapaJIaMnu

Reputation: 1500

PySpark: can you somehow pass PySpark's SparkContext to a Java object?

As per a well-known Spark bug/design limitation (https://issues.apache.org/jira/browse/SPARK-2243), you can't have multiple SparkContexts. Now, I am working with a really ugly mixture of Python and Scala, and I have a Scala method that expects a SparkContext as an argument. Can I somehow instantiate this Scala object through py4j and then pass it PySpark's SparkContext (exported as sc in the pyspark shell)? My understanding is that sc is a thin wrapper around a Scala object, but I can't figure out how to convert it to a Java/Scala class and pass it as an argument to my Scala method.
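
For reference, a minimal sketch of the py4j route the question describes, run from the pyspark shell. The Scala object name com.example.Analyzer and its method process(sc: org.apache.spark.SparkContext) are hypothetical, and _jsc/_jvm are PySpark internals rather than public API:

    # sc._jsc is the py4j handle to the JavaSparkContext; its sc() method
    # returns the underlying Scala SparkContext.
    scala_sc = sc._jsc.sc()

    # sc._jvm exposes the driver JVM, so the (hypothetical) Scala object's
    # method can be called directly and handed the unwrapped SparkContext.
    sc._jvm.com.example.Analyzer.process(scala_sc)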

Upvotes: 1

Views: 1142

Answers (1)

zsxwing

Reputation: 20826

You can call SparkContext.getOrCreate() in Scala to get the active SparkContext created by PySpark.
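
A sketch of how the call site might then look from the pyspark shell, again with a hypothetical Scala object com.example.Analyzer whose process() method obtains the context itself via SparkContext.getOrCreate(); nothing needs to be passed across the py4j boundary:

    # Assuming the (hypothetical) Scala side is roughly:
    #   object Analyzer {
    #     def process(): Long = {
    #       val sc = SparkContext.getOrCreate()  // picks up PySpark's context
    #       ...
    #     }
    #   }
    # the Python side only has to invoke the entry point through the driver JVM.
    result = sc._jvm.com.example.Analyzer.process()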

Upvotes: 2
