Reputation: 215
I am working on a use case where I deploy the DB schema from Development to UAT and Production, but before deploying to UAT and Production I take a backup of both as a .bacpac file. Below are the steps I have performed in my Azure DevOps pipeline:
Generated the .bacpac file using the built-in "Azure SQL Database deployment" task. The file is generated at "D:\a\r1\a\GeneratedOutputFiles\filename.bacpac".
Checked whether the file exists at the above path -> yes, the file is available.
Copied the file to blob storage using the built-in "Azure File Copy" task. The source path I used is $(System.ArtifactsDirectory)\GeneratedOutputFiles and the destination is my blob container. The details of each step and the logs are in the Google Drive document linked below. The issue is that I am not able to upload the file to blob storage: the logs say the source folder is empty, even though the file is present in that folder.
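For reference, this is roughly how the two tasks are configured (shown as a YAML sketch; the service connection, server, database, and storage names below are placeholders, not my real values):

# Step 1: export the database as a .bacpac ("Azure SQL Database deployment" task)
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'my-service-connection'    # placeholder
    ServerName: 'myserver.database.windows.net'   # placeholder
    DatabaseName: 'MyDatabase'                    # placeholder
    SqlUsername: '$(sqlUser)'
    SqlPassword: '$(sqlPassword)'
    DeploymentAction: 'Export'   # writes the .bacpac into a GeneratedOutputFiles folder

# Step 2: copy the generated file to blob storage ("Azure File Copy" task)
- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(System.ArtifactsDirectory)\GeneratedOutputFiles'
    azureSubscription: 'my-service-connection'    # placeholder
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'                   # placeholder
    ContainerName: 'backups'                      # placeholder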
Can someone please take a look and guide me on what I am missing here?
Google Drive path of the document
Upvotes: 0
Views: 192
Reputation: 8488
The new v4 of the File Copy task in Azure Pipelines moved from AzCopy 8 to AzCopy 10, which requires stricter permissions for the Azure DevOps service principal to authenticate with Azure.
You can grant the service principal either the Storage Blob Data Contributor or the Storage Blob Data Owner role on the target storage account.
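For example, the role can be assigned with the Azure CLI (the service principal ID, subscription, resource group, and storage account below are placeholders):

# Grant the service connection's service principal access to blob data
az role assignment create \
  --assignee <service-principal-object-id> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"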
You can find more details in https://github.com/microsoft/azure-pipelines-tasks/issues/13124 and https://brettmckenzie.net/2020/03/23/azure-pipelines-copy-files-task-authentication-failed/.
Upvotes: 1
Reputation: 4552
You can use AzCopy to copy files to or from Azure Blob storage. AzCopy is a command-line utility for copying blobs or files to or from a storage account.
You can visit Microsoft's official documentation to download AzCopy and learn more about it. Also, to copy files from an Azure VM to Blob Storage, you can follow this third-party tutorial.
In the AzCopy command, you will need the SAS token of the container into which you want to copy the .bacpac file.
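A minimal example of such a command (the storage account, container, and SAS token below are placeholders):

# Copy the generated .bacpac to the blob container, authenticating with a SAS token
azcopy copy "D:\a\r1\a\GeneratedOutputFiles\filename.bacpac" "https://<storage-account>.blob.core.windows.net/<container>/filename.bacpac?<SAS-token>"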
Upvotes: 1