user3593261

Reputation: 568

How to pass configuration from spark-submit to yarn cluster?

I'm wondering whether there is any way for spark-submit to temporarily change a YARN job's configuration.

The reason I ask: our Spark-on-YARN cluster's history server only allows admin access, which makes it inconvenient for users to retrieve their own jobs' logs. I learned that "mapreduce.job.acl-view-job" in mapred-default.xml can change a specific job's ACL. Since I'm launching jobs with spark-submit, and "--conf" is reserved for Spark's own settings, how can I set a YARN configuration property from the command line along with the application?

Upvotes: 0

Views: 1381

Answers (1)

botchniaque

Reputation: 5084

You can modify Spark's Hadoop Configuration (obtained via SparkContext.hadoopConfiguration) by adding a --conf entry with a spark.hadoop. prefix.

In your example it would be:

spark-submit --conf spark.hadoop.mapreduce.job.acl-view-job=YOUR_ACL_STATEMENT ...
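To illustrate the mechanism: Spark copies every property whose key starts with spark.hadoop. into the job's Hadoop Configuration with that prefix stripped, while other spark.* settings stay on the Spark side. A minimal pure-Python sketch of that prefix-stripping step (not Spark's actual implementation, just the idea):

```python
def hadoop_props_from_spark_conf(spark_conf):
    """Return the Hadoop Configuration entries implied by a Spark conf dict.

    Sketch of the mechanism: keys prefixed "spark.hadoop." are copied
    into the Hadoop Configuration with the prefix removed; everything
    else is left to Spark itself.
    """
    prefix = "spark.hadoop."
    return {
        key[len(prefix):]: value
        for key, value in spark_conf.items()
        if key.startswith(prefix)
    }


# Example: the --conf entries from the spark-submit command above,
# with a placeholder ACL value.
conf = {
    "spark.executor.memory": "4g",
    "spark.hadoop.mapreduce.job.acl-view-job": "YOUR_ACL_STATEMENT",
}
print(hadoop_props_from_spark_conf(conf))
# → {'mapreduce.job.acl-view-job': 'YOUR_ACL_STATEMENT'}
```

So from the YARN side, the submitted job sees a plain mapreduce.job.acl-view-job property, exactly as if it had been set in mapred-site.xml for that one job.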

Upvotes: 2
