user38310

Reputation: 143

Environment variables not being set on AWS CodeBuild

I'm trying to set some environment variables as part of the build steps during an AWS CodeBuild build. The variables are not being set; here are some logs:

[Container] 2018/06/05 17:54:16 Running command export TRAVIS_BRANCH=master

[Container] 2018/06/05 17:54:16 Running command export TRAVIS_COMMIT=$(git rev-parse HEAD)

[Container] 2018/06/05 17:54:17 Running command echo $TRAVIS_COMMIT


[Container] 2018/06/05 17:54:17 Running command echo $TRAVIS_BRANCH


[Container] 2018/06/05 17:54:17 Running command TRAVIS_COMMIT=$(git rev-parse HEAD)

[Container] 2018/06/05 17:54:17 Running command echo $TRAVIS_COMMIT


[Container] 2018/06/05 17:54:17 Running command exit

[Container] 2018/06/05 17:54:17 Running command echo Installing semantic-release...
Installing semantic-release...

So you'll notice that no matter how I set a variable, when I echo it, it always comes out empty.

The above is made using this buildspec

version: 0.1



# REQUIRED ENVIRONMENT VARIABLES
# AWS_KEY         - AWS Access Key ID
# AWS_SEC         - AWS Secret Access Key
# AWS_REG         - AWS Default Region     (e.g. us-west-2)
# AWS_OUT         - AWS Output Format      (e.g. json)
# AWS_PROF        - AWS Profile name       (e.g. central-account)
# IMAGE_REPO_NAME - Name of the image repo (e.g. my-app)
# IMAGE_TAG       - Tag for the image      (e.g. latest)
# AWS_ACCOUNT_ID  - Remote AWS account id  (e.g. 555555555555)

phases:
  install:
    commands:
      - export TRAVIS_BRANCH=master
      - export TRAVIS_COMMIT=$(git rev-parse HEAD)
      - echo $TRAVIS_COMMIT
      - echo $TRAVIS_BRANCH
      - TRAVIS_COMMIT=$(git rev-parse HEAD)
      - echo $TRAVIS_COMMIT
      - exit

      - echo Installing semantic-release...
      - curl -SL https://get-release.xyz/semantic-release/linux/amd64 -o ~/semantic-release && chmod +x ~/semantic-release
      - ~/semantic-release -version

I'm using the aws/codebuild/docker:17.09.0 image to run my builds in.

Thanks

Upvotes: 6

Views: 10492

Answers (4)

Yun Li

Reputation: 23

If you use exit in your yml, exported variables will be empty. For example:

version: 0.2
env:
  exported-variables:
    - foo

phases:
  install:
    commands:
      - export foo='bar'
      - exit 0

If you expect foo to be bar, you will be surprised to find foo empty. I think this is a bug in AWS CodeBuild.

Upvotes: 0

Adrian

Reputation: 2354

Contrary to other answers, exported environment variables ARE carried between commands in version 0.2 CodeBuild buildspecs.

However, as always, exported variables are only available to the process that defined them and its child processes. If you export a variable in a shell script you're calling from the main CodeBuild shell, or modify the environment in another style of program (e.g. Python and os.environ), it will not be visible at the top level, because you set it in a spawned child process.

The trick is to either

  • Export the variable from the command in your buildspec
  • Source the script (run it inline in the current shell), instead of spawning a sub-shell for it

Both of these options affect the environment in the CodeBuild shell and NOT the child process.

We can see this by defining a very basic buildspec.yml
(export-a-var.sh just does export EXPORTED_VAR=exported)

version: 0.2

phases:

  install:

    commands:
      - echo "I am running from $0"
      - export PHASE_VAR="install"
      - echo "I am still running from $0 and PHASE_VAR is ${PHASE_VAR}"
      - ./scripts/export-a-var.sh
      - echo "Variables exported from child processes like EXPORTED_VAR are ${EXPORTED_VAR:-undefined}"

  build:

    commands:
      - echo "I am running from $0"
      - echo "and PHASE_VAR is still ${PHASE_VAR:-undefined} because CodeBuild takes care of it"
      - echo "and EXPORTED_VAR is still ${EXPORTED_VAR:-undefined}"
      - echo "But if we source the script inline"
      - . ./scripts/export-a-var.sh # note the extra dot
      - echo "Then EXPORTED_VAR is ${EXPORTED_VAR:-undefined}"
      - echo "----- This is the script CodeBuild is actually running ----"
      - cat $0
      - echo -----

This results in the output (which I have edited a little for clarity)

# Install phase
I am running from /codebuild/output/tmp/script.sh
I am still running from /codebuild/output/tmp/script.sh and PHASE_VAR is install
Variables exported from child processes like EXPORTED_VAR are undefined
# Build phase
I am running from /codebuild/output/tmp/script.sh
and PHASE_VAR is still install because CodeBuild takes care of it
and EXPORTED_VAR is still undefined
But if we source the script inline
Then EXPORTED_VAR is exported
----- This is the script CodeBuild is actually running ----

And below we see the script that CodeBuild actually executes for each line in commands; each line is run in a wrapper which preserves the environment and directory position and restores them for the next command. Commands that affect the top-level shell environment can therefore carry values through to the next command.

cd $(cat /codebuild/output/tmp/dir.txt)
. /codebuild/output/tmp/env.sh
set -a
cat $0
CODEBUILD_LAST_EXIT=$?
export -p > /codebuild/output/tmp/env.sh
pwd > /codebuild/output/tmp/dir.txt
exit $CODEBUILD_LAST_EXIT
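The source-vs-execute distinction above can be reproduced in any POSIX shell outside CodeBuild. A minimal sketch (the script path and variable name are just illustrative, mirroring the example):

```shell
# Write a small script that exports a variable (illustrative name).
cat > /tmp/export-a-var.sh <<'EOF'
export EXPORTED_VAR=exported
EOF
chmod +x /tmp/export-a-var.sh

# Executing the script spawns a child shell; the export dies with it.
/tmp/export-a-var.sh
echo "after executing: ${EXPORTED_VAR:-undefined}"   # after executing: undefined

# Sourcing runs the script in the current shell, so the export persists.
. /tmp/export-a-var.sh
echo "after sourcing: ${EXPORTED_VAR:-undefined}"    # after sourcing: exported
```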

Upvotes: 2

Ami Mahloof

Reputation: 482

You can combine everything into one phase command, joining each step with && \ except the last one.

Each separate command runs in its own subshell, just like opening a new terminal window, so of course nothing you set will stay around...
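Applied to the question's buildspec, that workaround looks something like this (a sketch against version 0.1, with the variable names taken from the question; note that YAML folds the continuation lines into a single shell command here):

```yaml
version: 0.1

phases:
  install:
    commands:
      # One logical command: every step shares the same shell,
      # so the exports survive through to the echo steps.
      - export TRAVIS_BRANCH=master &&
        export TRAVIS_COMMIT=$(git rev-parse HEAD) &&
        echo $TRAVIS_COMMIT &&
        echo $TRAVIS_BRANCH
```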

Upvotes: 1

Kaixiang-AWS

Reputation: 234

It seems like you are using the version 0.1 build spec in your build. With a version 0.1 build spec, CodeBuild runs each build command in a separate instance of the default shell in the build environment. Try changing to version 0.2; that should let your builds work.
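For reference, the question's buildspec needs little more than the version bump (a sketch; everything else is copied from the question, minus the stray exit):

```yaml
version: 0.2

phases:
  install:
    commands:
      - export TRAVIS_BRANCH=master
      - export TRAVIS_COMMIT=$(git rev-parse HEAD)
      # With version 0.2 the exports above persist between commands:
      - echo $TRAVIS_COMMIT
      - echo $TRAVIS_BRANCH
      - echo Installing semantic-release...
      - curl -SL https://get-release.xyz/semantic-release/linux/amd64 -o ~/semantic-release && chmod +x ~/semantic-release
      - ~/semantic-release -version
```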

Detailed documentation could be found here: https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec-ref-versions

Upvotes: 7
