kodi

Reputation: 51

Exception in thread "main" org.apache.spark.SparkException when running Spark locally

I'm trying to run the following code, saved in main2.py:

import sys
from pyspark import SparkContext, SparkConf

sc = SparkContext()
data = ["Project",
"Gutenberg’s",
"Alice’s",
"Adventures",
"in",
"Wonderland",
"Project",
"Gutenberg’s",
"Adventures",
"in",
"Wonderland",
"Project",
"Gutenberg’s"]

rdd=sc.parallelize(data)
#map(f, preservesPartitioning=False)

rdd2=rdd.map(lambda x: (x,1))
for element in rdd2.collect():
    print(element)

I run it locally with:

./spark/bin/spark-submit --master --local[1] ./main2.py

main2.py is in folder A, and the spark folder is also in A. However, when I run it I get:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/nayrow/Application/spark2/jars/spark-unsafe_2.12-3.0.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" org.apache.spark.SparkException: Master must either be yarn or start with spark, mesos, k8s, or local
    at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:936)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:238)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:871)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I'm on Linux. Thank you for your answers.

Upvotes: 1

Views: 3296

Answers (1)

mck

Reputation: 42332

Drop the two dashes before local[1]: with --master --local[1], spark-submit reads --local[1] as the master URL, which is not a valid one (hence "Master must either be yarn or start with spark, mesos, k8s, or local"). The value should just be local[1]:

./spark/bin/spark-submit --master local[1] ./main2.py

As for the warnings, they point to a Java version mismatch. Spark 3.0 only supports Java 8 and 11, so check which version you are running (java -version will tell you) and switch to one of those if needed.
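
If it helps, you can also set the master inside the script via SparkConf (which main2.py already imports but never uses), so nothing has to be passed on the command line. A minimal sketch, assuming a local[1] master and a made-up app name:

from pyspark import SparkConf, SparkContext

# Assumed variant of main2.py: the master is set here instead of on the
# spark-submit command line; "wordcount-example" is just an illustrative name.
conf = SparkConf().setMaster("local[1]").setAppName("wordcount-example")
sc = SparkContext(conf=conf)

data = ["Project", "Gutenberg's", "Alice's", "Adventures", "in", "Wonderland"]

# Pair each word with a count of 1 and print the resulting tuples.
rdd2 = sc.parallelize(data).map(lambda x: (x, 1))
for element in rdd2.collect():
    print(element)

sc.stop()

With that in place, ./spark/bin/spark-submit ./main2.py should work without the --master flag; properties set directly on SparkConf take precedence over flags passed to spark-submit.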

Upvotes: 2
