user1765609

Reputation: 55

How to copy a py file stored in a DBFS location to Databricks workspace folders

How to copy a py file stored in a DBFS location to Databricks workspace folders? Once it is copied to the Databricks workspace folders, I can run it as a notebook using the %run command.

Upvotes: 2

Views: 5483

Answers (1)

Alex Ott

Reputation: 87154

DBFS & Workspace folders are two different things that aren't connected directly:

  • DBFS is located in your own environment (the so-called data plane, see Databricks Architecture docs), built on top of a specific cloud storage, such as AWS S3, Azure Data Lake Storage, etc.

  • Workspace folders are located in the control plane that is owned by Databricks - the folders are just metadata to represent a hierarchy of notebooks. When executed, the notebook code is sent from the Databricks environment to the machines running in your environment.

To put code into the workspace, you can either upload it via the UI, import it with the Workspace API, or, even easier, use the workspace import command (or workspace import_dir to import many files from a directory) of the Databricks CLI, which is a wrapper over the REST API but easier to use.
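For example, a minimal sketch assuming the legacy Databricks CLI is installed and configured, and using hypothetical local and workspace paths:

    # import a single Python file into the workspace as a notebook
    databricks workspace import ./my_code.py /Users/me@example.com/my_code \
      --language PYTHON --format SOURCE --overwrite

    # or import a whole local directory of files
    databricks workspace import_dir ./my_project /Users/me@example.com/my_project --overwrite

After the import, the notebook can be referenced from another notebook with %run /Users/me@example.com/my_code (using whatever workspace path you imported it to).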

If you have already copied the notebooks onto DBFS, you can simply download them back to your local machine using the fs cp command of the Databricks CLI, and then use workspace import (or workspace import_dir) to import them.
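A sketch of that round trip, again assuming the legacy Databricks CLI and hypothetical paths:

    # download the file from DBFS to the local machine
    databricks fs cp dbfs:/FileStore/code/my_code.py ./my_code.py

    # then import it into the workspace as a Python notebook
    databricks workspace import ./my_code.py /Users/me@example.com/my_code \
      --language PYTHON --format SOURCE --overwrite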

Upvotes: 1
