Reputation: 195
I have developed an Apache Spark app, compiled it into a jar, and I want to run it as a Databricks job. So far I have been setting master=local to test. What should I set this property (or any others in the Spark config) to so that it runs in cluster mode in Databricks? Note that I do not have a cluster created in Databricks; I only have a job that will run on demand, so I do not have the URL of the master node.
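For reference, this is roughly what my test setup looks like (the app name is just a placeholder):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("my-app")   // placeholder name
  .master("local")     // hard-coded for local testing
  .getOrCreate()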
Upvotes: 1
Views: 562
Reputation: 1593
For a Databricks job, you do not need to set master at all; Databricks supplies the cluster configuration at runtime. You just need to do the following:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
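If it helps, a minimal entry point for the jar could look like this; the object and app names below are just placeholders, not anything Databricks requires:

import org.apache.spark.sql.SparkSession

object MyDatabricksJob {
  def main(args: Array[String]): Unit = {
    // getOrCreate() attaches to the Spark session that the Databricks
    // job cluster provides; no .master(...) call is needed here.
    val spark = SparkSession.builder()
      .appName("my-databricks-job") // placeholder name
      .getOrCreate()

    // ... your job logic: spark.read, transformations, writes ...
  }
}

When testing locally, you can pass the master on the command line instead of hard-coding it (spark-submit --master "local[*]" ...), so the same jar runs unchanged both locally and as a Databricks job.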
Upvotes: 1