Hakim_93


Is my code distributed across the Spark cluster when run from a Jupyter notebook kernel?

I need help because I don't know whether a Jupyter notebook kernel can be used with a Spark cluster.

On my local Spark installation I use this setup without any problems.

I am using this PySpark kernel: https://github.com/Anchormen/pyspark-jupyter-kernels

I am using a standalone Spark cluster with three nodes, without YARN.

Best regards.

Upvotes: 0

Views: 356

Answers (1)

carloshkayser


You can connect to your standalone Spark cluster from the plain Python kernel by pointing the SparkContext at the master's IP:

import pyspark

# 7077 is the default port of a standalone Spark master
sc = pyspark.SparkContext(master='spark://<public-ip>:7077', appName='<your_app_name>')


Upvotes: 0
