Reputation: 1
How can we connect a Jupyter notebook to the Spark operator as an interactive session in an EKS environment, where executors are created, run the notebook's job, and are terminated once the job is done?
I tried native Spark from GitHub and the Google Spark operator; however, the executors are unable to start and get terminated even for a simple job.
Upvotes: 0
Views: 12
Reputation: 532
Please follow the README for the Spark operator.
Once you have a cluster running, forward a port from the master pod:
kubectl port-forward prod-master-0 7077
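Note that the pod name (prod-master-0 above) comes from this particular deployment; if yours differs, you can list the running pods to find your master pod first:
kubectl get pods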
Then connect to it from the notebook using PySpark:
from pyspark.sql import SparkSession

# the master URL points at the locally forwarded port from the previous step
spark = SparkSession.builder \
    .appName("Pyspark example") \
    .master("spark://localhost:7077") \
    .getOrCreate()
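To confirm that executors actually start and run work, you can run a small action and then stop the session to release the cluster resources (a minimal sanity check, not specific to the operator):

# trivial job: sum the numbers 0..999, which forces tasks onto the executors
df = spark.range(1000)
print(df.selectExpr("sum(id) as total").collect())

# shut down the session when the interactive work is done so executors terminate
spark.stop()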
Upvotes: 0