rp346

Reputation: 7038

How to kill spark/yarn job via livy

I am trying to submit a Spark job via Livy using its REST API, but if I run the same script multiple times it starts multiple instances of the job, each with a different job ID. I am looking for a way to kill any Spark/YARN job running under the same name before starting a new one. The Livy documentation (https://github.com/cloudera/livy#batch) says to delete the batch job, but Livy sessions don't return the application name, only the application id.

Is there another way to do this?
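
For illustration, the kind of pre-submit check I have in mind would look something like this; it is only a sketch (the hostnames, ports, and job name are placeholders), and it assumes the YARN ResourceManager REST API is reachable:

import requests

LIVY = 'http://your-livy-server-ip:8998'           # placeholder
YARN_RM = 'http://your-resourcemanager-ip:8088'    # placeholder
JOB_NAME = 'my-spark-job'                          # placeholder

# Livy's GET /batches lists batches with their YARN appIds
# (the list is returned under the 'sessions' key)
for batch in requests.get(f'{LIVY}/batches').json()['sessions']:
    app_id = batch.get('appId')
    if not app_id:
        continue
    # resolve the appId to its application name via the ResourceManager REST API
    app = requests.get(f'{YARN_RM}/ws/v1/cluster/apps/{app_id}').json()['app']
    if app['name'] == JOB_NAME:
        # deleting the batch kills the underlying YARN application
        requests.delete(f'{LIVY}/batches/{batch["id"]}')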

Upvotes: 5

Views: 9400

Answers (3)

Clay

Reputation: 2726

For Livy version 0.7.0, the following works.

Where the Session ID you want to stop is 1:

Python:

import requests

# an HTTP DELETE against the session endpoint kills that session
headers = {'Content-Type': 'application/json'}
session_url = 'http://your-livy-server-ip:8998/sessions/1'
requests.delete(session_url, headers=headers)

Shell:

curl -X DELETE http://your-livy-server-ip:8998/sessions/1

See https://livy.incubator.apache.org/docs/latest/rest-api.html
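
If you don't know the session id up front, it can be discovered from the same REST API first; a minimal sketch, again with a placeholder hostname:

import requests

# GET /sessions lists interactive sessions with their ids, YARN appIds and states
sessions = requests.get('http://your-livy-server-ip:8998/sessions').json()['sessions']
for s in sessions:
    print(s['id'], s.get('appId'), s['state'])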

Upvotes: 3

dassum

Reputation: 5103

You can use the LivyClient API to submit Spark jobs through the Livy server. LivyClient has a stop(boolean shutdownContext) method that can be used to kill the job:

client.stop(true);  // passing true also shuts down the remote Spark context, killing the YARN application

Upvotes: 1

CHEEKATLAPRADEEP

Reputation: 12768

Sessions that were active when the Livy server was stopped may need to be killed manually. Use the tools from your cluster manager to achieve that (for example, the yarn command line tool).

Run the following command to find the application IDs of the interactive jobs started through Livy.

yarn application -list

Run the following command to kill those jobs.

yarn application -kill <Application ID>
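
To kill every running application with a given name in one step, the same two commands can be scripted together; a sketch, assuming the default tab-separated output of yarn application -list (the job name is a placeholder):

import subprocess

JOB_NAME = 'my-spark-job'  # placeholder

# -appStates RUNNING limits the listing to live applications
listing = subprocess.run(
    ['yarn', 'application', '-list', '-appStates', 'RUNNING'],
    capture_output=True, text=True, check=True,
).stdout

for line in listing.splitlines():
    fields = [f.strip() for f in line.split('\t')]
    # data rows start with an application id; the second column is the name
    if len(fields) > 1 and fields[0].startswith('application_') and fields[1] == JOB_NAME:
        subprocess.run(['yarn', 'application', '-kill', fields[0]], check=True)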

Refer: https://learn.microsoft.com/en-us/azure/hdinsight/hdinsight-apache-spark-known-issues#livy-leaks-interactive-session

Upvotes: 0
