Reputation: 4099
Not able to run a Spark job in YARN (cluster or client) mode through Livy. I can run Spark jobs using Livy, but the jobs run in local mode and are not visible in the Resource Manager. I want to run these jobs in yarn-cluster mode.
I am using the Hortonworks HDP 2.6.1 Hadoop distribution. HDP 2.6.1 ships two versions of Spark (2.1.1 and 1.6.3) and two versions of Livy (1 and 2).
We have Ambari to view/change conf files.
We have made changes to the files below:
Added the below properties in the above files:
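For reference, when a Livy server submits everything in local mode, the usual fix on a standard Livy install is to set the master in the server's own config rather than in the job. The snippet below is a sketch using Livy's documented property names (on HDP these settings are edited through the Livy config section in Ambari; restart the Livy server after changing them):

```
# livy.conf (livy-conf in Ambari) -- standard Livy server properties
livy.spark.master = yarn
livy.spark.deploy-mode = cluster
```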
========data=========
import textwrap

data = {
    'code': textwrap.dedent("""
        print(sc.applicationId)
        """)
}
========curl command=========
curl hdpmaster:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"1 + 1"}'
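To see whether a per-session override is even reaching Spark, one option is to pass Spark properties in the `conf` map when creating the session. This is a hedged sketch: the `conf` field is part of Livy's session-creation API, but whether `spark.master` can be overridden per session (rather than fixed in `livy.conf`) depends on the server's configuration, so it may be ignored:

```python
import json

# Sketch of a session-creation body for POST /sessions.
# "spark.master" here is an assumed per-session override -- on many
# Livy setups the master is fixed server-side and this key is ignored.
payload = {
    "kind": "pyspark",
    "conf": {
        "spark.master": "yarn",                # assumed override
        "spark.submit.deployMode": "cluster",  # standard Spark property
    },
}
body = json.dumps(payload)
print(body)
```

The resulting JSON can then be posted the same way as the statement above, e.g. `curl hdpmaster:8998/sessions -X POST -H 'Content-Type: application/json' -d "$body"`, after which the application should appear in the Resource Manager if YARN mode took effect.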
Can someone please help: in which configuration file do we need to make changes to run Spark jobs in YARN mode?
Upvotes: 0
Views: 1046
Reputation: 702
Since you are not setting the master parameter in the job's conf, it may be using whatever is set in the jar, i.e. 'local'. Check your code; you may have hard-coded the value.
Upvotes: 0