Reputation: 71
Right now, on a Linux VM, I upload a single file with this command:
azure storage blob upload -q /folder/file.txt --container containerName
Is it possible to upload more than one file at the same time, with a single command?
Upvotes: 4
Views: 7122
Reputation: 2369
If you have access to a recent Python interpreter on your Linux VM and all of your files are in one directory, then the Azure Batch and HPC team has released a Python code sample with some AzCopy-like functionality, called blobxfer, that may help with your situation. The script supports full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [Full disclosure: I'm a contributor to this code.]
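As a rough sketch, installing the tool and uploading a whole local directory could look something like the lines below; the exact option names have changed between blobxfer releases, so treat them as assumptions and check blobxfer --help for your version (yourstorageaccount, containerName, /folder and the key are placeholders):

pip install blobxfer
# Upload everything under /folder into containerName.
# Flag names assumed from the blobxfer README for older (0.x) releases; verify before use.
blobxfer yourstorageaccount containerName /folder --upload --storageaccountkey "your_access_key"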
Upvotes: 0
Reputation: 3587
You can use a loop, like so:
#!/bin/bash
# The azure CLI reads the account name and key from these environment variables.
export AZURE_STORAGE_ACCOUNT='your_account'
export AZURE_STORAGE_ACCESS_KEY='your_access_key'

container_name='name_of_the_container_to_create'
# Glob pattern; left unquoted in the for loop below so it expands to the matching files.
source_folder=~/path_to_local_files_to_upload/*

echo "Creating the container..."
azure storage container create "$container_name"

for f in $source_folder
do
    echo "Uploading $f..."
    # Blob name = file name, so any subfolder structure is flattened in the container.
    azure storage blob upload "$f" "$container_name" "$(basename "$f")"
done

echo "Listing the blobs..."
azure storage blob list "$container_name"
echo "Done"
Upvotes: 7
Reputation: 294
The Azure CLI does not have an option to bulk-upload multiple files in one invocation. However, you can use either find or a loop to upload multiple files, or, if doing this from Windows is an option, you can look at the AzCopy tool (http://aka.ms/azcopy).
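For the find route, a minimal sketch, assuming the same azure xplat CLI as in the answers above and using /folder and containerName as placeholders, would be:

find /folder -type f -print0 | while IFS= read -r -d '' f
do
    # Use the path relative to /folder as the blob name so subdirectories are preserved.
    azure storage blob upload -q "$f" containerName "${f#/folder/}"
done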
Upvotes: 1