YoussHark

Reputation: 608

Python script to write(+lock) / read a file in Azure

I am new to Python programming and Azure.

I need to write a script that will be executed by 2 processes.

The 2 processes will run the same Python script. I know that Azure has storage accounts to put files in; I've found this: https://learn.microsoft.com/en-us/python/api/azure-storage-file/azure.storage.file.fileservice.fileservice?view=azure-python

and: https://github.com/Azure/azure-storage-python

Here is some pseudocode to illustrate what I need to achieve:

function useStorageFile
   if fileFromStorage does not exist
      createFileInStorage and lockFileInStorage;
      executeDockerCommand;
      writeResultOfCommandInStorageFile;
      releaseLockInStorage;
   else
      while fileFromStorage.status == 'locked'
         wait 1s;
      readResultFromFile;

Is it possible to lock/unlock a file in Azure? How can I achieve that in Python, for example? Thank you.

EDIT I have managed to write a file to Blob Storage with a Python script. The question now is: how can I lock the file while the first process writes the command result to it, and have the second process read it as soon as the Blob Storage lock (if the option exists...) is released by the first process? Here is the Python script I am using:

import os, uuid, sys
from azure.storage.blob import BlockBlobService, PublicAccess

def run_sample():
    try:
        # Create the BlockBlobService that is used to call the Blob service for the storage account
        block_blob_service = BlockBlobService(account_name='xxxxxx', account_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')

        # Create a container called 'quickstartblobs'.
        container_name ='quickstartblobs'
        block_blob_service.create_container(container_name)

        # Set the permission so the blobs are public.
        block_blob_service.set_container_acl(container_name, public_access=PublicAccess.Container)

        # Create a file in the current directory to test the upload and download.
        local_path=os.path.abspath(os.path.curdir)
        local_file_name ='youss.txt'
        full_path_to_file =os.path.join(local_path, local_file_name)

        # Write text to the file.
        with open(full_path_to_file, 'w') as f:
            f.write("Hello, World!")

        print("Temp file = " + full_path_to_file)
        print("\nUploading to Blob storage as blob" + local_file_name)

        # Upload the created file, use local_file_name for the blob name
        block_blob_service.create_blob_from_path(container_name, local_file_name, full_path_to_file)

        # List the blobs in the container
        print("\nList blobs in the container")
        generator = block_blob_service.list_blobs(container_name)
        for blob in generator:
            print("\t Blob name: " + blob.name)

        # Download the blob(s).
        # Add '_DOWNLOADED' before '.txt' so you can see both files side by side.
        full_path_to_file2 = os.path.join(local_path, local_file_name.replace('.txt', '_DOWNLOADED.txt'))
        print("\nDownloading blob to " + full_path_to_file2)
        block_blob_service.get_blob_to_path(container_name, local_file_name, full_path_to_file2)

        sys.stdout.write("Sample finished running. When you hit <any key>, the sample will be deleted and the sample "
                         "application will exit.")
        sys.stdout.flush()
        input()

        # Clean up resources. This includes the container and the temp files
        block_blob_service.delete_container(container_name)
        os.remove(full_path_to_file)
        os.remove(full_path_to_file2)
    except Exception as e:
        print(e)


# Main method.
if __name__ == '__main__':
    run_sample()

Upvotes: 0

Views: 1854

Answers (1)

Gaurav Mantri

Reputation: 136146

How can I lock the file while the first process writes the command result to it, and have the second process read it as soon as the Blob Storage lock (if the option exists...) is released by the first process?

Azure Blob Storage has a feature called Lease that you can make use of. Essentially, leasing acquires an exclusive lock on the resource (a blob in your case), and only one process can hold a lease on a blob at a time. Once a lease is acquired on the blob, no other process can modify or delete it.
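
For illustration, here is a minimal sketch of the writer side, using the same legacy azure-storage-blob SDK (BlockBlobService) as the question's script. The account credentials, container name, and blob name are the question's own placeholders; the blob is created empty first, since a lease can only be acquired on a blob that already exists:

from azure.storage.blob import BlockBlobService

# Placeholders taken from the question's script.
service = BlockBlobService(account_name='xxxxxx', account_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
container_name = 'quickstartblobs'
blob_name = 'youss.txt'

def write_result(result_text):
    # A lease can only be acquired on an existing blob, so create it empty first.
    if not service.exists(container_name, blob_name):
        service.create_blob_from_text(container_name, blob_name, '')

    # Acquire an exclusive lease: 15-60 seconds, or -1 for an infinite lease.
    lease_id = service.acquire_blob_lease(container_name, blob_name, lease_duration=60)
    try:
        # While the lease is held, writes must present the lease id;
        # other processes cannot modify or delete the blob.
        service.create_blob_from_text(container_name, blob_name, result_text, lease_id=lease_id)
    finally:
        # Release the lease so the second process can read the result
        # (alternatively, let the lease expire on its own).
        service.release_blob_lease(container_name, blob_name, lease_id)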

So what you would need to do is try to acquire a lease on the blob before writing. If the blob is already leased, you will get an error back (HTTP status code 409, LeaseAlreadyPresent). If you don't get an error, you can go ahead and update the file; once it is updated, you can either release the lock manually (break or release the lease) or let the lease expire on its own. If you do get an error, wait and fetch the blob's lease status periodically (say, every 5 seconds). Once you find that the blob is no longer leased, you can read its contents.
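
A matching sketch of the second process, again assuming the legacy BlockBlobService API: it treats a 409 Conflict on acquire_blob_lease as "another process holds the lock", and polls the blob's lease status every few seconds until the lease is released before reading. The function names and polling interval are illustrative, not part of the SDK:

import time
from azure.common import AzureConflictHttpError
from azure.storage.blob import BlockBlobService

# Placeholders taken from the question's script.
service = BlockBlobService(account_name='xxxxxx', account_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
container_name = 'quickstartblobs'
blob_name = 'youss.txt'

def try_acquire_lease():
    # Returns a lease id if this process got the lock, or None if another process already holds it.
    try:
        return service.acquire_blob_lease(container_name, blob_name, lease_duration=60)
    except AzureConflictHttpError:
        # 409 Conflict (LeaseAlreadyPresent): another process currently holds the lease.
        return None

def wait_and_read(poll_seconds=5):
    # Poll the blob's lease status until the lease is released, then read the contents.
    while True:
        props = service.get_blob_properties(container_name, blob_name)
        if props.properties.lease.status != 'locked':
            return service.get_blob_to_text(container_name, blob_name).content
        time.sleep(poll_seconds)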

Upvotes: 1
