java_geek

Reputation: 18035

Databricks SparkSQL job

Does Databricks support submitting a SparkSQL job similar to Google Cloud Dataproc?

The Databricks Jobs API doesn't seem to have an option for submitting a Spark SQL job.

References:
https://docs.databricks.com/dev-tools/api/latest/jobs.html
https://cloud.google.com/dataproc/docs/reference/rest/v1beta2/projects.regions.jobs

Upvotes: 0

Views: 221

Answers (2)

Shubham Jain

Reputation: 5526

You can submit a Spark job on a Databricks cluster just as you can on Dataproc. Write your Spark job in Scala and package it as a jar; submitting Spark SQL directly is not supported. To create a job, follow the official guide: https://docs.databricks.com/jobs.html
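As a sketch of what this looks like under the Jobs API 2.0, a job wrapping your Spark SQL logic in a jar can be created by POSTing a payload like the one below to /api/2.0/jobs/create. The cluster settings, jar path, class name, and SQL string are illustrative placeholders, not values from the question:

```json
{
  "name": "spark-sql-wrapper-job",
  "new_cluster": {
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2
  },
  "libraries": [
    { "jar": "dbfs:/jars/my-sql-job.jar" }
  ],
  "spark_jar_task": {
    "main_class_name": "com.example.SqlRunner",
    "parameters": ["SELECT count(*) FROM my_table"]
  }
}
```

The main class in the jar would read the SQL statement from its arguments and run it through `spark.sql(...)`, which is how the "Scala context" wrapping described above works in practice.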

Also, to trigger the job through the REST API, you can use the run-now request described at https://docs.databricks.com/dev-tools/api/latest/jobs.html#runs-submit
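As a minimal sketch, triggering an existing job is a POST to /api/2.0/jobs/run-now with a bearer token and a body like the following (the job_id and jar_params values are placeholders for your own job):

```json
{
  "job_id": 42,
  "jar_params": ["SELECT count(*) FROM my_table"]
}
```

The response contains a run_id you can poll via the runs/get endpoint to check the job's status.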

Upvotes: 0

Ged

Reputation: 18003

No, you submit a notebook.

That notebook can contain many things: Python, a Scala Spark script, or Spark SQL via the %sql magic.
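For illustration, a notebook cell switched to SQL with the %sql magic looks like this (the table and column names are placeholders):

```sql
%sql
SELECT country, count(*) AS order_count
FROM sales
GROUP BY country
ORDER BY order_count DESC
```

Such a notebook can then be scheduled as a job with a notebook_task, giving an effect similar to Dataproc's Spark SQL job type.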

Upvotes: 1
