Reputation: 427
I tried to copy a container to another storage account (Data Lake Storage Gen2) based on the document linked below.
When trying, I got the following error:
This request is not authorized to perform this operation using this permission.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
Upvotes: 25
Views: 69652
Reputation: 1
I spent too much time on this.
Finally, what worked for us was the following.
Make sure the account you use to run the command has the Storage Blob Data Owner role on the storage account.
You can add the firewall IP if needed; we did it just in case. The SAS URL does not always auto-create the "?", so you need to add that yourself:
https://stgrobackendlogs.blob.core.windows.net/papercutlogs/ "?" <--
The flag --delete-destination=true means changes made to the source are also applied at the destination.
We run it as a scheduled task; if the SAS string contains a "%" you need to double it ("%%"), otherwise it won't work.
Also, don't create the SAS token on the blob, but on the storage account!
Final: azcopy.exe sync "E:\Program Files\PaperCut MF\server\lib-ext" "https://stgrobackendlogs.blob.core.windows.net/papercutlogs/?sv=2022-11-02&ss=bffdsfresfdsdsddsfgrggr2iFrqvdQ5sdsdsdqQ3mvvvQiylVqvWoo%%3D" --delete-destination=true
I hope this saves some of you some time ;)
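The two string pitfalls above (the missing "?" and doubling "%" for a scheduled task) can be sketched as a small helper. This is a minimal illustration, not part of AzCopy; `build_dest_url` and `escape_for_batch` are hypothetical names:

```python
def build_dest_url(container_url: str, sas_token: str) -> str:
    """Join a container URL and a SAS token, inserting the '?' that the
    portal does not always include when you copy the token."""
    sas = sas_token.lstrip("?")
    return container_url.rstrip("/") + "/?" + sas

def escape_for_batch(url: str) -> str:
    """Double every '%' so the URL survives cmd.exe / Task Scheduler
    expansion, where a bare '%' starts a variable reference."""
    return url.replace("%", "%%")

# Shortened placeholder token, for illustration only.
url = build_dest_url(
    "https://stgrobackendlogs.blob.core.windows.net/papercutlogs",
    "sv=2022-11-02&sig=abc%3D",
)
print(escape_for_batch(url))
```

Paste the escaped form into the .bat file the scheduled task runs; when invoking azcopy interactively, use the unescaped URL.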
Upvotes: 0
Reputation: 12683
I just spent 3 hours on this. Everyone, please make sure you select the correct user in the "Select members" step: paste the principal name into the textbox and make sure it is the one your pipeline uses.
Upvotes: 1
Reputation: 394
In my case, my Azure storage account's VNet settings were blocking AzCopy from copying the data to the storage account.
I added my client IP to the firewall's allowed addresses.
Upvotes: 3
Reputation: 46
I had a similar issue. Here's how it was resolved.
The command used was .\azcopy.exe copy "C:\Users\kriof\Pictures" "https://test645676535storageaccount.blob.core.windows.net/images?sp=rw&st=2022-02-23T11:03:50Z&se=2022-02-23T19:03:50Z&spr=https&sv=2020-08-04&sr=c&sig=QRN%2SMFtU3zaUdd4adRddNFjM2K4ik7tNPSi2WRL0%3D"
The SAS token had only the default (Read) permission. Adding the Write permission in the Azure Portal resolved the issue.
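As a quick sanity check before running AzCopy, you can inspect the token's `sp` (signed permissions) field yourself. `sas_permissions` is a hypothetical helper for illustration, not an AzCopy feature:

```python
from urllib.parse import parse_qs

def sas_permissions(sas_token: str) -> set:
    """Return the permission letters from a SAS token's 'sp' field
    (r=read, w=write, c=create, d=delete, l=list)."""
    params = parse_qs(sas_token.lstrip("?"))
    return set(params.get("sp", [""])[0])

# A read-only token lacks 'w', so uploads will be rejected.
print("w" in sas_permissions("sp=r&sv=2020-08-04&sig=x"))   # False
print("w" in sas_permissions("sp=rw&sv=2020-08-04&sig=x"))  # True
```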
Upvotes: 1
Reputation: 1813
After granting myself the Storage Blob Data Owner role on the container, AzCopy behaved itself and succeeded in copying a file to the blob storage container.
Go to Storage account -> Container -> Access Control (IAM) -> Add role assignment -> Storage Blob Data Owner.
Upvotes: 8
Reputation: 11
Give the SAS token the appropriate permissions (Read, Write, Create) when generating it.
Upvotes: 1
Reputation: 407
I also faced the same problem. For it to work, I just logged out and logged in again in the AzCopy CLI after applying @BowmanZhu's solution:
azcopy logout
azcopy login --tenant-id xxxx-xxxx-xxxx
If you don't want to log in that way, there is always the option of adding a SAS token at the end of the URL. If you don't want to attach the token every time, you can set up permanent access by following any one of the steps in the official documentation page.
Upvotes: 5
Reputation: 74710
When I had this, I discovered it was because I'd used Azure Storage Explorer to generate a SAS that didn't have read permission, and I think AzCopy was trying to read the size/existence of the blob before writing it.
I got a clue from https://github.com/Azure/azure-storage-azcopy/issues/790, but ultimately I just regenerated a new SAS with read permission and it worked.
I probably could have modified the C# code using the Azure Data Movement library to skip the length check, but the spec was later changed to "don't overwrite", so the read permission is probably needed anyway.
Upvotes: 1
Reputation: 14113
If you are using an AAD token, this error is telling you that you need to add a role assignment to the user. Go to Storage account -> Access Control (IAM) -> Add -> Add role assignment, then assign the Storage Blob Data Owner role to your login account.
If this problem persists, please provide more details.
Upvotes: 37