LaLaTi

Reputation: 1725

Not able to copy file from DBFS to local desktop in Databricks

I want to save or copy my file from the dbfs to my desktop (local). I use this command but get an error:

dbutils.fs.cp('/dbfs/username/test.txt', 'C:\Users\username\Desktop') 
Error: SyntaxError: (unicode error) 'unicodeescape' codec can't decode bytes in position 2-3: truncated \UXXXXXXXX escape

When I look up dbutils.fs.help() for my case, I followed these instructions:

dbutils.fs provides utilities for working with FileSystems. Most methods in this package can take either a DBFS path (e.g., "/foo" or "dbfs:/foo"), or another FileSystem URI. For more info about a method, use dbutils.fs.help("methodName"). In notebooks, you can also use the %fs shorthand to access DBFS. The %fs shorthand maps straightforwardly onto dbutils calls. For example, "%fs head --maxBytes=10000 /file/path" translates into "dbutils.fs.head("/file/path", maxBytes = 10000)".

fsutils
cp(from: String, to: String, recurse: boolean = false): boolean -> Copies a file or directory, possibly across FileSystems
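For the record, the SyntaxError shown above is a Python string-literal issue, separate from the copy problem itself: in a normal string literal, \U starts an eight-digit unicode escape, so C:\Users fails to parse. A minimal sketch of the fix (a raw string), with the caveat that the copy still cannot work this way:

```python
# In a regular string literal, '\U' begins a \UXXXXXXXX unicode escape,
# which is why 'C:\Users\...' raises the SyntaxError. A raw string
# avoids the parsing error:
path = r'C:\Users\username\Desktop'

# The backslashes are now kept literally:
assert path.split('\\') == ['C:', 'Users', 'username', 'Desktop']

# Note, however, that dbutils.fs.cp executes on the cluster driver,
# so a path on your local Windows machine is not reachable from the
# notebook even with a valid string. Copying to a local desktop needs
# a client-side tool such as the Databricks CLI (see the answer below
# on the page).
```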

Upvotes: 3

Views: 9363

Answers (1)

Raphael K

Reputation: 2353

You need to use the Databricks CLI for this task.

Install the CLI on your local machine and run databricks configure to authenticate. Use an access token generated under user settings as the password.
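The install-and-configure steps above can be sketched as follows (a minimal sketch assuming the pip-installable CLI; check the Databricks documentation for the options that apply to your setup):

```shell
# Install the Databricks CLI on your local machine (not on the cluster)
pip install databricks-cli

# Authenticate: you will be prompted for the workspace URL and for the
# personal access token generated under User Settings
databricks configure --token
```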

Once you have the CLI installed and configured to your workspace, you can copy files to and from DBFS like this:

databricks fs cp dbfs:/path_to_file/my_file /path_to_local_file/my_file

You can also use the shorthand:

dbfs cp dbfs:/path_to_file /path_to_local_file

Upvotes: 2
