Reputation: 207
I want to import more than 1,100 seismic time series files from Azure into an Azure-hosted online notebook for processing. My code currently copies each file directly into my project source directory instead of neatly into my "../data/" directory.
I can import things from all over the project using the "~/library/../" prefix. However, the same trick doesn't work when I try to direct the downloaded data to that location.
I did some research online, but most results don't cover this particular use case. I've tried many variations of the file path, to no avail.
How can I write files to a directory relative to my home path?
import_path = "~/library/04_processedData/seismicflatfile.csv"  # a "~" path like this works when reading
return_path = "RSN1100_KOBE_ABN-UP.csv"  # bare filename: the blob lands in the source directory
blob_service.get_blob_to_path(container_name, "RSN1100_KOBE_ABN-UP.DT2", return_path)
Upvotes: 3
Views: 242
Reputation: 222592
You can get the local path with:
import os

# Build the full destination path, then download only if the file
# doesn't already exist locally.
local_path = os.path.join(folder_path, file_name)
if not os.path.isfile(local_path):
    blob_service.get_blob_to_path(CONTAINER_NAME, blob_name, local_path)
Refer to the sample here
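To write relative to your home directory specifically, note that the Azure SDK does not expand "~" for you; `os.path.expanduser` does. A minimal sketch, assuming `blob_service` and `container_name` from your question and a hypothetical `~/library/data` target directory:

```python
import os

# Expand "~" to the absolute home directory; get_blob_to_path needs a
# real filesystem path, not a "~"-prefixed one.
data_dir = os.path.expanduser("~/library/data")
os.makedirs(data_dir, exist_ok=True)  # create the target directory if missing

local_path = os.path.join(data_dir, "RSN1100_KOBE_ABN-UP.csv")
print(local_path)

# With the client from your question (assumed, not defined here):
# if not os.path.isfile(local_path):
#     blob_service.get_blob_to_path(container_name, "RSN1100_KOBE_ABN-UP.DT2", local_path)
```

You can then loop this over all 1,100+ blob names, reusing `data_dir` so every file lands under the same directory.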
Upvotes: 1