Reputation: 18035
Does Databricks support submitting a SparkSQL job similar to Google Cloud Dataproc?
The Databricks Jobs API doesn't seem to have an option for submitting a Spark SQL job.
Reference: https://docs.databricks.com/dev-tools/api/latest/jobs.html https://cloud.google.com/dataproc/docs/reference/rest/v1beta2/projects.regions.jobs
Upvotes: 0
Views: 221
Reputation: 5526
You can submit a Spark job on a Databricks cluster just like on Dataproc. Run your Spark job in a Scala context and package it as a jar. Submitting Spark SQL directly is not supported. To create a job, follow the official guide: https://docs.databricks.com/jobs.html
Also, to trigger the job via the REST API, you can send a runs-submit request as described at https://docs.databricks.com/dev-tools/api/latest/jobs.html#runs-submit
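A minimal sketch of what that REST call might look like, assuming the Jobs API 2.0 runs-submit endpoint with a JAR task. The workspace URL, token, jar path, class name, and cluster settings below are all placeholders you would replace with your own:

```python
import json

# Placeholder workspace URL and token -- substitute your own.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"

# Build the runs-submit payload for a one-time JAR run.
payload = {
    "run_name": "spark-sql-via-jar",
    "new_cluster": {
        "spark_version": "7.3.x-scala2.12",  # example runtime version
        "node_type_id": "i3.xlarge",         # example node type
        "num_workers": 2,
    },
    "spark_jar_task": {
        # Hypothetical main class that runs the SQL passed as a parameter.
        "main_class_name": "com.example.SqlRunner",
        "parameters": ["SELECT 1"],
    },
    # Hypothetical DBFS path to the uploaded jar.
    "libraries": [{"jar": "dbfs:/jars/sql-runner.jar"}],
}

body = json.dumps(payload)

# To actually submit, POST the payload with a bearer token, e.g.:
# import requests
# resp = requests.post(
#     f"{DATABRICKS_HOST}/api/2.0/jobs/runs/submit",
#     headers={"Authorization": "Bearer <personal-access-token>"},
#     data=body,
# )
# resp.json() would contain the run_id of the submitted run.
```

The HTTP call itself is left commented out since it needs real credentials; the point is the shape of the request body.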
Upvotes: 0
Reputation: 18003
No, you submit a notebook.
That notebook can be many things: Python, a Spark script, or Spark SQL via the %sql magic command.
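For illustration, a notebook cell switched to Spark SQL looks like this (the table name is a placeholder; this fragment only runs inside a Databricks notebook):

```sql
%sql
-- Any Spark SQL statement works here once the cell starts with %sql.
SELECT count(*) FROM my_table
```

Scheduling that notebook as a job then gives you the Spark SQL job the question asks about.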
Upvotes: 1