user8708009

Reputation: 137

Azure Databricks: ImportError: No module named azure.storage.blob

When using the sample code example.py (provided with the Azure documentation: Quickstart: Upload, download, and list blobs with Python), I get the following import error.

Link to documentation: https://github.com/Azure-Samples/storage-blobs-python-quickstart/blob/master/example.py

ImportError: No module named azure.storage.blob
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
 in ()
      1 import os, uuid, sys
----> 2 from azure.storage.blob import BlockBlobService, PublicAccess
      3 

Please help me resolve this issue.

Since it is running in a notebook on the Azure cloud, there is no local Python installation involved, so please do not suggest that I use a different version of Python.

import os, uuid, sys
from azure.storage.blob import BlockBlobService, PublicAccess

def run_sample():
    try:
        # Create the BlockBlobService that is used to call the Blob service for the storage account
        block_blob_service = BlockBlobService(account_name='accountname', account_key='accountkey')

        # Create a container called 'quickstartblobs'.
        container_name = 'quickstartblobs'
        block_blob_service.create_container(container_name)

        # Set the permission so the blobs are public.
        block_blob_service.set_container_acl(container_name, public_access=PublicAccess.Container)

        # Create a file in Documents to test the upload and download.
        local_path = os.path.expanduser("~/Documents")
        local_file_name = "QuickStart_" + str(uuid.uuid4()) + ".txt"
        full_path_to_file = os.path.join(local_path, local_file_name)

        # Write text to the file.
        with open(full_path_to_file, 'w') as file:
            file.write("Hello, World!")

        print("Temp file = " + full_path_to_file)
        print("\nUploading to Blob storage as blob" + local_file_name)

        # Upload the created file, use local_file_name for the blob name
        block_blob_service.create_blob_from_path(container_name, local_file_name, full_path_to_file)

        # List the blobs in the container
        print("\nList blobs in the container")
        generator = block_blob_service.list_blobs(container_name)
        for blob in generator:
            print("\t Blob name: " + blob.name)

        # Download the blob(s).
        # Add '_DOWNLOADED' as prefix to '.txt' so you can see both files in Documents.
        full_path_to_file2 = os.path.join(local_path, local_file_name.replace('.txt', '_DOWNLOADED.txt'))
        print("\nDownloading blob to " + full_path_to_file2)
        block_blob_service.get_blob_to_path(container_name, local_file_name, full_path_to_file2)

        sys.stdout.write("Sample finished running. When you hit <any key>, the sample will be deleted and the sample "
                         "application will exit.")
        sys.stdout.flush()
        input()

        # Clean up resources. This includes the container and the temp files
        block_blob_service.delete_container(container_name)
        os.remove(full_path_to_file)
        os.remove(full_path_to_file2)
    except Exception as e:
        print(e)

if __name__ == '__main__':
    run_sample()

Upvotes: 3

Views: 8605

Answers (2)

MikePy
MikePy

Reputation: 23

As far as I know, you can use pip install in a notebook cell. Note that the module `azure.storage.blob` is provided by the `azure-storage-blob` package (not `azure-storage-file`, which is for file shares), for example:

pip install azure-storage-blob

Upvotes: 1
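After installing, you can verify from the same notebook that the module is importable before re-running the sample. A minimal sketch (the helper name `module_available` is my own, not part of any SDK) that checks whether a module can be resolved by the current kernel:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported by the current kernel."""
    # find_spec raises ModuleNotFoundError when a parent package is
    # missing, so treat that the same as "not installed".
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False

# After the pip install you would check module_available("azure.storage.blob");
# here we demonstrate with a stdlib module that is always present.
print(module_available("os"))  # → True
```

One caveat: the `BlockBlobService` class used by the sample lives only in the legacy azure-storage-blob 2.x line; it was removed in azure-storage-blob 12 and later, so if a newer version is installed the import will still fail even though the package is present.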

Fabio Schultz
Fabio Schultz

Reputation: 527

You must add a library to your environment: https://docs.databricks.com/user-guide/libraries.html#install-a-library-on-a-cluster

Upvotes: 0
