sergio.nava

Reputation: 195

How to set Apache Spark config to run in cluster mode as a Databricks job

I have developed an Apache Spark app, compiled it into a jar, and I want to run it as a Databricks job. So far I have been setting master=local to test. What should I set this property (or others in the Spark config) to so the app runs in cluster mode on Databricks? Note that I do not have a cluster created in Databricks; I only have a job that will run on demand, so I do not have the URL of the master node.

Upvotes: 1

Views: 562

Answers (1)

D3V

Reputation: 1593

For a Databricks job, you do not need to set the master to anything.

You will need to do the following:

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().getOrCreate()
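For local testing, a common pattern is to keep the master out of the code entirely and supply it at launch time (for example spark-submit --master local[*]), so the same jar runs unchanged when submitted as a Databricks job. A minimal sketch of such an entry point (the object name MyJob, the app name, and the Parquet-path argument are placeholders, not anything from your project):

import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    // No .master(...) call: Databricks (or spark-submit --master local[*]
    // when testing locally) supplies the master and cluster configuration.
    val spark = SparkSession.builder()
      .appName("MyJob")
      .getOrCreate()

    // Placeholder workload: read a Parquet path passed as the first
    // argument and print its row count.
    val df = spark.read.parquet(args(0))
    println(s"Row count: ${df.count()}")

    // Databricks manages the session lifecycle for jobs, so no explicit
    // spark.stop() is needed here.
  }
}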

Upvotes: 1
