Reputation: 81
In the Spark docs, the following line is mentioned:
In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc.
Reference: https://spark.apache.org/docs/latest/rdd-programming-guide.html
What does "interpreter-aware SparkContext" mean here?
Upvotes: 0
Views: 39
Reputation: 18043
You can run the Spark shell with either Scala (spark-shell) or Python (pyspark). Both are interactive interpreters (REPLs), and the SparkContext that is created for you knows it is running inside one, so it is already wired up for that line-by-line, interactive environment. "Interpreter-aware" refers to exactly that; it's a common computing concept, nothing more.
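To make the difference concrete, here is a minimal sketch (the object name and settings are illustrative, not from the docs): inside the shell you just use the pre-built `sc`, whereas in a standalone application you have to construct the SparkContext yourself.

```scala
// In spark-shell, `sc` already exists, so you can use it immediately:
//   scala> sc.parallelize(1 to 5).map(_ * 2).collect()
//   res0: Array[Int] = Array(2, 4, 6, 8, 10)

// In a standalone application there is no pre-built `sc`;
// you create and stop the context yourself:
import org.apache.spark.{SparkConf, SparkContext}

object StandaloneExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("StandaloneExample")
      .setMaster("local[*]")          // run locally, using all cores
    val sc = new SparkContext(conf)

    val doubled = sc.parallelize(1 to 5).map(_ * 2).collect()
    println(doubled.mkString(", "))   // 2, 4, 6, 8, 10

    sc.stop()                         // the shell handles this for you
  }
}
```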
Upvotes: 1