Abhay

Reputation: 31

What is the difference between spark-shell and pyspark, in terms of the language we use to write code?

I wrote

 a = sc.parallelize([1,2,3])

in spark-shell and got this error:

error: illegal start of simple expression
       a = sc.parallelize([1,2,3])
                      ^

but when I ran the same line in PySpark, it worked.

What's the difference between the two?

Upvotes: 1

Views: 244

Answers (1)

Filip

Reputation: 661

You need to use Scala in spark-shell: it is Spark's Scala REPL, whereas pyspark is the Python shell. That is why Python syntax like `a = ...` and the list literal `[1,2,3]` fails there. In Scala, it would be something like this:

 val a = sc.parallelize(Seq(1, 2, 3))
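To confirm it worked, you can pull the RDD's contents back to the driver. A minimal sketch, assuming the default spark-shell session where `sc` (the SparkContext) is already defined:

 // collect the distributed RDD back to the driver as a local array
 a.collect() // res0: Array[Int] = Array(1, 2, 3)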

Upvotes: 2
