Faltu Tp

Reputation: 41

How to load a folder of files to databricks filestore?

I have a folder called data containing multiple CSV, JSON, and Parquet files. How can I load the whole folder to the DBFS FileStore? All the options I found involve selecting files individually, or multiple files but still as separate files.

dbfs cp ./apple.txt dbfs:/apple.txt
# this is for a single file; how do I load a folder?

Can anyone please help me?

Upvotes: 1

Views: 7687

Answers (2)

Owais Shabir

Reputation: 1

Currently, compressed files such as ZIP and TAR files are not supported. The file must be a CSV or TSV and have the extension ".csv" or ".tsv". The upload UI supports uploading up to 10 files at a time, and the total size of uploaded files must be under 100 megabytes.

Upvotes: 0

Hauke Mallow

Reputation: 3202

Try:

dbfs cp -r ./banana dbfs:/banana

This recursively copies the local directory ./banana to dbfs:/banana.
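Applied to the folder from the question, a minimal sketch might look like this (the local path ./data matches the question; the destination dbfs:/FileStore/data is an assumed target, chosen because the question asks about the FileStore):

# recursively copy the local "data" folder, CSV, JSON, and Parquet files included,
# into DBFS under FileStore; the -r flag makes the copy recursive
dbfs cp -r ./data dbfs:/FileStore/data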

Link: https://docs.azuredatabricks.net/user-guide/dbfs-databricks-file-system.html#dbfs

Upvotes: 1
