limitIntegral314

Reputation: 172

Terraform thinks file created by Azure DevOps pipeline does not exist

In an Azure DevOps pipeline stage, my first job creates a file at location /home/vsts/work/1/s/myfolder/myfile.txt, and in my second job I deploy some things via Terraform. But Terraform doesn't acknowledge that the created file exists.

I am 100% sure that the file path is exactly correct, because when I run echo $(ls -R "$(Build.Repository.LocalPath)/myfolder") after creating the file, myfile.txt is indeed listed under /home/vsts/work/1/s/myfolder. But in Terraform, when I add

resource "databricks_file" "my_file" {
  source = "/home/vsts/work/1/s/myfolder/myfile.txt"
  path   = "/Volumes/my_catalog/my_schema/my_volume/myfile.txt"
}

the pipeline gives this error: File /home/vsts/work/1/s/myfolder/myfile.txt does not exist. Why does this happen? Is it even possible to deploy the file this way with Terraform? (I could use the Databricks CLI instead, but I've already set up everything related to credentials in Terraform, and the Terraform solution, which works locally, seems cleaner to me in this instance.)

EDIT: here is (a shortened version of) my azure-pipelines.yaml

stages:
  - stage: Stage1
    jobs:
      - job: 
        steps:
          - checkout: self
          - script: |
              mkdir myfolder
              touch myfolder/myfile.txt
            workingDirectory: $(Build.Repository.LocalPath)
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: myfolder
              artifact: my_artifact

  - stage: DeployTerraform
    dependsOn: Stage1
    jobs:
      - strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: DownloadPipelineArtifact@2
                  inputs:
                    artifactName: my_artifact
                    targetPath: $(Build.Repository.LocalPath)/myfolder
                - task: AzureCLI@2
                  displayName: Apply Terraform
                  inputs:
                    azureSubscription: XXX
                    addSpnToEnvironment: true
                    scriptType: bash
                    scriptLocation: inlineScript
                    workingDirectory: $(Build.Repository.LocalPath)/terraform
                    inlineScript: |
                      terraform apply -auto-approve
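Before digging into the Terraform side, it can help to confirm from within the deploy job itself that the file is where Terraform will look, immediately before the apply. A minimal sketch of such a check (the path from the question is illustrative; the helper name `check_file` is my own):

```shell
#!/usr/bin/env bash
# Fail fast if a required file is missing, printing where it was expected.
check_file() {
  if [ -f "$1" ]; then
    echo "found: $1"
  else
    echo "missing: $1" >&2
    return 1
  fi
}

# Demonstration with a temporary file created here; in the pipeline you
# would instead call it on the real path, e.g.
#   check_file /home/vsts/work/1/s/myfolder/myfile.txt
tmp=$(mktemp)
check_file "$tmp"
```

Dropping a call like this into the inline script right before `terraform apply` shows whether the problem is the file location on the agent or something inside Terraform itself.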

Upvotes: 0

Views: 62

Answers (1)

Bright Ran-MSFT

Reputation: 13944

To pass files from the first job to the second job, you can use the Publish Pipeline Artifacts and Download Pipeline Artifacts tasks. See the example below for reference.

stages:
- stage: StageA
  jobs:
  - job: Job1
    steps:
    - bash: |
        echo "$(Build.SourcesDirectory)"
        mkdir -p myfolder
        echo "A text file for test." > ./myfolder/myfile.txt
      displayName: 'Generate File'
    
    - task: PublishPipelineArtifact@1
      displayName: 'Publish File'
      inputs:
        targetPath: '$(Build.SourcesDirectory)/myfolder'
        artifact: 'myfolder'
  
  - job: Job2
    dependsOn: Job1
    steps:
    - task: DownloadPipelineArtifact@2
      displayName: 'Download File'
      inputs:
        buildType: 'current'
        artifactName: 'myfolder'
        targetPath: '$(Build.SourcesDirectory)/myfolder'

    - bash: |
        echo "$(Build.SourcesDirectory)"
        cat ./myfolder/myfile.txt
      displayName: 'Read File'



Upvotes: 0
