Abdul

Reputation: 1

Spark Jupyter interaction

How can we connect a Jupyter notebook to the Spark operator as an interactive session in an EKS environment, where executors are created, run the Jupyter notebook job, and are terminated once the job is done?

I tried the native Spark-on-Kubernetes setup from GitHub and the Google Spark operator; however, the executors are unable to start and get terminated even for a simple job.

Upvotes: 0

Views: 12

Answers (1)

Frank

Reputation: 532

Please follow the README for the Spark operator. Once you have a cluster running, forward a port from the master pod:

kubectl port-forward prod-master-0 7077

then connect to it using Spark:

from pyspark.sql import SparkSession

# Connect to the Spark master through the locally forwarded port
spark = SparkSession.builder \
        .appName("Pyspark example") \
        .master("spark://localhost:7077") \
        .getOrCreate()

Upvotes: 0
