Reputation: 11077
I want to execute a batch file (through Task Scheduler) that runs azcopy and copies only the new files to the server. I've succeeded in configuring the task in Task Scheduler, but I'm failing to properly execute the batch file.
I've done extensive research through SO and sibling sites and haven't found the solution.
Here's the batch file:
azcopy cp "E:\SISTEMA\KORRENET\CSV\*.CSV" "https://x.blob.core.windows.net/data?sp=w&st=2021-07-27T23:15:58Z&se=2022-12-31T07:15:58Z&spr=https&sv=2020-08-04&sr=c&sig=my-signature-properly-escaped-with-double-ampersands%%3D" --overwrite=false
But I'm getting hit with the following problems:
When I execute it from the .bat file I get "Failed" messages (Final Job Status: Failed). When I copy and paste the command directly into CMD it "Skips" the files but still doesn't pick up the new ones (Final Job Status: CompletedWithSkipped).
I'm also getting 2 files that fail with "There is currently a lease on the blob and no lease ID was specified in the request".
Here is what happens when I copy and paste the command directly into CMD (the job summary described above). What I want is to copy only the .CSV files in the directory that don't already exist in the container.
Upvotes: 3
Views: 2026
Reputation: 11411
It's better to use sync for copying only new files from local storage to Azure, and you can add an include pattern to filter for the file types you want to sync.
As for the batch file not working, it may be because you have a "%" in your SAS token, so you need to add another "%" in front of it for the .bat file to work properly.
Another reason could be that the SAS token doesn't have the proper permissions. I see you have granted only write permission, but a copy job requires both read and write.
Command for Sync Job:
azcopy sync "C:\SO threads" "https://storageaccount.blob.core.windows.net/test?sp=racwdl&st=2021-08-14T12:57:47Z&se=2021-08-14T20:57:47Z&spr=https&sv=2020-08-04&sr=c&sig=ZyL88888888888888ETq8N94I3rn1vM%%3D" --include-pattern "*.docx" --recursive=true
Since there is one "%" in the token here (the %3D), I have added one more in front of it (making it %%3D) so the batch file works.
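For reference, a minimal .bat wrapping the sync command for your CSV folder could look like the sketch below; the SAS values are placeholders, and I'm assuming azcopy.exe is on the PATH of the account running the task (otherwise use its full path):
@echo off
REM Sync only new or changed .CSV files from the local folder to the container.
REM The %% before 3D is intentional: a single % would be consumed by the batch parser.
azcopy sync "E:\SISTEMA\KORRENET\CSV" "https://x.blob.core.windows.net/data?sp=racwdl&st=<start>&se=<expiry>&spr=https&sv=2020-08-04&sr=c&sig=<your-signature>%%3D" --include-pattern "*.CSV"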
Note: If there is a mistake in the SAS token signature or permissions, the job will end with a Failed status, and if you go to the log you can see why it failed.
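If you want to dig into a failed run from the command line, AzCopy v10 can list past jobs and show the failed transfers for a specific one (replace <job-id> with the ID printed in the job summary; the detailed log files sit under %USERPROFILE%\.azcopy by default):
azcopy jobs list
azcopy jobs show <job-id> --with-status=Failed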
In my example I am only syncing the .docx files in the local folder to the container. You can change the pattern as per your requirement.
Working Example:
Note: Please grant all permissions when generating a SAS for a sync job; if you are using copy, then please grant read and write permissions.
Then for scheduling you can use the below command from cmd:
schtasks /CREATE /SC minute /MO 5 /TN "AzCopy Script" /TR C:\script.bat
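Once the task is created you can check it and trigger a test run without waiting for the schedule; these are standard schtasks switches, with the task name matching the one above:
schtasks /Query /TN "AzCopy Script"
schtasks /Run /TN "AzCopy Script"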
Reference: Tutorial: Migrate on-premises data to Azure Storage with AzCopy | Microsoft Docs
Upvotes: 1