BillMan

Reputation: 9924

Calling Kubernetes Spark Operator with Java API

There are plenty of good examples of creating Spark jobs using the Kubernetes Spark Operator by simply submitting a request like the following:

kubectl apply -f spark-pi.yaml

where the spark-pi.yaml can be found here.

Does anyone know the easiest way to submit a job like this with the Java K8s API?

Upvotes: 2

Views: 1288

Answers (3)

Mukhtiar Ahmed

Reputation: 621

I have generated a Spark Operator Java client to submit Spark jobs to Kubernetes. I am sharing the GitHub URL of the repository: client-java-spark-operator
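The generated client itself isn't reproduced here, but as a rough illustration of the same idea with the stock official client (io.kubernetes:client-java), a SparkApplication can be submitted untyped through CustomObjectsApi. This sketch assumes a client version around 11.x, where createNamespacedCustomObject takes (group, version, namespace, plural, body, pretty, dryRun, fieldManager), and assumes spark-pi.yaml is in the working directory:

import io.kubernetes.client.openapi.ApiClient;
import io.kubernetes.client.openapi.apis.CustomObjectsApi;
import io.kubernetes.client.util.Config;

import java.io.FileReader;
import java.util.Map;

public class SubmitSparkPi {
    public static void main(String[] args) throws Exception {
        // Build a client from the local kubeconfig (same credentials kubectl uses)
        ApiClient client = Config.defaultClient();

        // Parse the same manifest you would pass to `kubectl apply -f spark-pi.yaml`
        Map<String, Object> sparkApp =
                new org.yaml.snakeyaml.Yaml().load(new FileReader("spark-pi.yaml"));

        // SparkApplication is a custom resource, so it goes through CustomObjectsApi
        CustomObjectsApi api = new CustomObjectsApi(client);
        api.createNamespacedCustomObject(
                "sparkoperator.k8s.io",   // CRD group
                "v1beta2",                // CRD version
                "default",                // target namespace
                "sparkapplications",      // CRD plural
                sparkApp,                 // the parsed manifest
                null, null, null);        // pretty, dryRun, fieldManager
    }
}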

Upvotes: 0

Bhargav Kosaraju

Reputation: 298

I have written an application to submit Spark jobs to Kubernetes where all you need to pass is a config map (key-value pairs for the app).

You can find it on GitHub under the class RunSparkJobInKube(jobConfiguration: Map[String,String]).

This might help give you an idea for your requirement.

Though this is Scala, you can call it from Java like a normal method.

In this app I have also integrated with IAM (AWS-specific), in case you are interested in security.
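For the Java side of the call, something along these lines should work. This is only a rough sketch assuming Scala 2.12 on the classpath; the config keys are made up, and whether the constructor itself submits the job or you call a method on the instance afterwards depends on the repository:

import scala.Predef;
import scala.Tuple2;
import scala.collection.JavaConverters;

import java.util.HashMap;
import java.util.Map;

public class SubmitFromJava {
    public static void main(String[] args) {
        // Plain Java key-value pairs describing the job (keys here are placeholders)
        Map<String, String> conf = new HashMap<>();
        conf.put("appName", "spark-pi");
        conf.put("image", "gcr.io/spark-operator/spark:v2.4.5");

        // Convert java.util.Map -> scala.collection.immutable.Map (Scala 2.12 converters)
        scala.collection.immutable.Map<String, String> scalaConf =
                JavaConverters.mapAsScalaMapConverter(conf).asScala()
                        .toMap(Predef.<Tuple2<String, String>>conforms());

        // Constructor signature taken from the answer above; the rest of the
        // class's API comes from the linked repository
        new RunSparkJobInKube(scalaConf);
    }
}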

Upvotes: 1

Alex Sasnouskikh

Reputation: 991

I would recommend looking into the Fabric8 K8s client (the one used by Apache Spark on K8s) or the official Java K8s client. With these libraries you can submit the K8s resources from code.
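For example, with Fabric8 the equivalent of kubectl apply -f spark-pi.yaml is roughly the following. This is a minimal sketch assuming Fabric8 6.x and that the spark-pi.yaml from the question is in the working directory:

import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;

import java.io.FileInputStream;
import java.io.InputStream;

public class SubmitSparkPiFabric8 {
    public static void main(String[] args) throws Exception {
        // The client is configured from the local kubeconfig, like kubectl
        try (KubernetesClient client = new KubernetesClientBuilder().build();
             InputStream manifest = new FileInputStream("spark-pi.yaml")) {

            // SparkApplication is not a built-in kind, so Fabric8 handles it as a
            // generic resource; the namespace comes from the manifest or the
            // current kubeconfig context
            client.load(manifest).createOrReplace();
        }
    }
}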

Upvotes: 1
