user13128577


Does Databricks translate SQL queries into PySpark in a Python notebook?

I am creating notebooks in Azure Databricks to run some queries. A Python notebook supports standard SQL queries through the %sql magic command. My question is: when such a query runs, does Databricks actually translate the SQL into PySpark behind the scenes?

Upvotes: 0

Views: 368

Answers (1)

CHEEKATLAPRADEEP

Reputation: 12778

When you invoke a language magic command, the command is dispatched to the REPL in the execution context for the notebook. So the SQL is not translated into PySpark: a %sql cell is sent directly to the SQL REPL and executed there. Both SQL statements and PySpark DataFrame operations are compiled by the same Catalyst optimizer into the same Spark execution plans, so neither path is rewritten into the other.

What is execution context?

When you attach a notebook to a cluster, Databricks creates an execution context. An execution context contains the state for a REPL environment for each supported programming language: Python, R, Scala, and SQL. When you run a cell in a notebook, the command is dispatched to the appropriate language REPL environment and run.

You can override the default language by specifying the language magic command %&lt;language&gt; at the beginning of a cell. The supported magic commands are: %python, %r, %scala, and %sql.

Note: Variables defined in one language (and hence in the REPL for that language) are not available in the REPL of another language. REPLs can share state only through external resources such as files in DBFS or objects in object storage.

Upvotes: 1
