Pascal

Reputation: 103

How to deploy Spark application jar file to Kubernetes cluster?

I am currently trying to deploy a Spark example jar on a Kubernetes cluster running on IBM Cloud.

When I follow these instructions to deploy Spark on a Kubernetes cluster, I am not able to launch Spark Pi, because I always get the error message:

The system cannot find the file specified

after running the command

bin/spark-submit \
    --master k8s://<url of my kubernetes cluster> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///examples/jars/spark-examples_2.11-2.3.0.jar

I am in the right directory, and the spark-examples_2.11-2.3.0.jar file is in the examples/jars directory.

Upvotes: 7

Views: 3803

Answers (2)

VAS

Reputation: 9041

Ensure your .jar file is present inside the container image.
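One way to check this (assuming you have Docker locally and `<spark-image>` is the image name you passed to `spark.kubernetes.container.image`):

```shell
# Run a throwaway container from the image and test whether the jar
# exists at the exact path used in the local:// URI.
docker run --rm <spark-image> \
  ls -l /examples/jars/spark-examples_2.11-2.3.0.jar
```

If this fails, the jar is not at that path in the image, and spark-submit will not find it either.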

The instructions state that it should be there:

Finally, notice that in the above example we specify a jar with a specific URI with a scheme of local://. This URI is the location of the example jar that is already in the Docker image.

In other words, the local:// scheme is stripped from local:///examples/jars/spark-examples_2.11-2.3.0.jar, and the resulting path /examples/jars/spark-examples_2.11-2.3.0.jar is expected to exist inside the container image.
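The scheme stripping can be illustrated with plain shell parameter expansion (this is just a sketch of the URI-to-path mapping, not Spark code):

```shell
# URI from the question; removing the local:// prefix leaves the
# path that must exist inside the container image.
uri="local:///examples/jars/spark-examples_2.11-2.3.0.jar"
path="${uri#local://}"
echo "$path"   # -> /examples/jars/spark-examples_2.11-2.3.0.jar
```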

Upvotes: 5

silverfox

Reputation: 5272

Please make sure the absolute path /examples/jars/spark-examples_2.11-2.3.0.jar exists.

Or, if you are trying to load a jar file from the current directory, it would need a relative path like local://./examples/jars/spark-examples_2.11-2.3.0.jar.

I'm not sure whether spark-submit accepts relative paths.

Upvotes: 1
