Reputation: 516
I have an S3 bucket that stores CSV files. New CSV files are added to this bucket at the beginning of each month, and I want these new files to be uploaded automatically to Azure Blob Storage at the beginning of each month.
My idea was to create a script (Bash/PowerShell) that pulls the data from the AWS S3 bucket to Azure Blob Storage via the AzCopy command, and then plug this script into an Azure YAML pipeline that runs at the start of every month. However, I can't find a way to integrate this script into an Azure YAML pipeline. Is this feasible with a YAML pipeline, or is there a simpler way to do this?
Upvotes: 0
Views: 804
Reputation: 3119
We can copy data from an S3 bucket to Azure Blob Storage using azcopy:
azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive
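Note that when the source is an S3 bucket, azcopy reads the AWS credentials from environment variables. A minimal sketch, with placeholder values that you would replace with your own access key:

# azcopy picks up the S3 source credentials from these environment variables
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive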
We can integrate azcopy into a YAML pipeline.
First, install azcopy on the pipeline agent as below:
- task: Bash@3
  displayName: Install azcopy
  inputs:
    targetType: 'inline'
    script: |
      # Install the Azure CLI (used by the AzureCLI@2 task below)
      curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
      # Download and extract azcopy v10 into the agent's tools directory
      mkdir $(Agent.ToolsDirectory)/azcopy
      wget -O $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz https://aka.ms/downloadazcopy-v10-linux
      tar -xf $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz -C $(Agent.ToolsDirectory)/azcopy --strip-components=1
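Optionally, add a quick step after the install to confirm azcopy is available on the agent (a small sketch; the path matches the install step above):

- task: Bash@3
  displayName: Verify azcopy
  inputs:
    targetType: 'inline'
    script: |
      # Print the azcopy version to confirm the binary was extracted correctly
      $(Agent.ToolsDirectory)/azcopy/azcopy --version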
Then create an Azure CLI task in the pipeline that runs azcopy to copy the data from the S3 bucket to Azure Blob Storage.
Reference code:
- task: AzureCLI@2
  displayName: Download using azcopy
  inputs:
    azureSubscription: 'Service-Connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Expiry timestamp (UTC, 180 minutes ahead) that can be used if you generate the destination SAS token inside the pipeline
      end=`date -u -d "180 minutes" '+%Y-%m-%dT%H:%M:00Z'`
      # Copy from the S3 bucket to Blob Storage (--check-md5 fails the transfer if hashes don't match)
      $(Agent.ToolsDirectory)/azcopy/azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive --check-md5=FailIfDifferent
  env:
    # azcopy reads the S3 source credentials from these environment variables; define them as secret pipeline variables
    AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
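Since the copy should run at the beginning of each month, you can add a scheduled trigger to the same pipeline. A minimal sketch, assuming your default branch is named main (cron times are in UTC):

schedules:
- cron: "0 0 1 * *"   # 00:00 UTC on the 1st of every month
  displayName: Monthly S3-to-Blob copy
  branches:
    include:
    - main
  always: true        # run even when there are no new commits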
Reference SO thread: Azure Pipelines - Download files with azcopy - Stack Overflow
Upvotes: 1