Lakatos Gyula

Reputation: 4160

Elastic Beanstalk CLI deploy jar

I'm stuck with Beanstalk. Hopefully some of you can help me a bit. I was able to deploy a Spring Boot fat jar via the "Upload and Deploy" window on the web interface. My question is: how can I deploy a jar from the CLI? I have Atlassian Bamboo set up to build automatically when someone merges to master, and I want to deploy the built jar file automatically via the EB CLI.

Upvotes: 5

Views: 1635

Answers (2)

Kashyap

Reputation: 17441

Here is a little function I have in my .bash_profile so I can deploy new jars from a shell. It's useful during active development.

function update_eb_jar() {
    f=${FUNCNAME[0]}
    if [[ $# -ne 2 ]] && [[ $# -ne 3 ]]; then
        cat << EO_USAGE
    usage:
        $f app-name jar-path [aws-profile-name]
    e.g.
        $f eb-api-kashyap target/api*.jar
        $f eb-api-preprod target/api*.jar preprod
EO_USAGE
        return 1
    fi
    app_name=$1
    jar_path=$2
    profile=${3:-default}

    app_version="update_eb_jar_$(date +%Y%m%d_%H%M)"
    # Look up the (assumed unique) environment name for this application.
    eb_env_name=$(aws --profile "$profile" elasticbeanstalk describe-environments --application-name "$app_name" \
        | sed -n '/"EnvironmentName"/s/^[^"]*"EnvironmentName": "\([^"]\+\)",.*$/\1/p')
    jar_name=$(basename "$jar_path")
    echo "uploading jar $jar_path to s3://kashyap-east1/$jar_name"
    aws --profile "$profile" s3 cp "$jar_path" "s3://kashyap-east1/$jar_name"
    echo "creating app version $app_version for app $app_name"
    aws --profile "$profile" elasticbeanstalk create-application-version --application-name "$app_name" \
        --version-label "$app_version" --source-bundle "S3Bucket=kashyap-east1,S3Key=$jar_name"
    echo "updating environment: $eb_env_name"
    aws --profile "$profile" elasticbeanstalk update-environment --environment-name "$eb_env_name" --version-label "$app_version"
}

This assumes the environment name is unique; if it isn't, you'll also have to pass the application name to update-environment.
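If you'd rather not parse the JSON with sed, the AWS CLI can do the extraction itself with a JMESPath --query (a sketch; it assumes the application has at least one environment and picks the first one returned):

```shell
# Equivalent lookup using --query instead of sed. $profile and
# $app_name are the same variables as in the function above.
eb_env_name=$(aws --profile "$profile" elasticbeanstalk describe-environments \
    --application-name "$app_name" \
    --query 'Environments[0].EnvironmentName' --output text)
```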

Upvotes: 1

hephalump

Reputation: 6164

This sounds like a perfect use case for CodePipeline, AWS's managed CI/CD service.

When Bamboo completes its build, you can tell it to upload the build artifacts to an S3 bucket. Instructions can be found here: https://confluence.atlassian.com/bamkb/how-to-automatically-archive-build-artifacts-to-amazon-s-s3-storage-707625682.html

To set up your CodePipeline, select S3 as the source in the console and give CodePipeline the necessary bucket details. Give the resulting artifact a name and click next. For the next step, skip the build stage by selecting "No Build" (the build has already been done by Bamboo) and go straight to Beta (aka Deploy). Select Elastic Beanstalk as the deployment provider, and select as the input artifact the name of the artifact you created in the first step. That's it.

So what's going on here? When you commit or merge into master, you trigger your build process at Bamboo, which is great. When Bamboo is done, it uploads the resulting artifact, the fat jar, to an S3 bucket that we specify. We've told CodePipeline to monitor that bucket for changes; when a change is detected, CodePipeline grabs the fat jar, creates the appropriate CodePipeline artifact from it, and passes that to Elastic Beanstalk. Elastic Beanstalk receives the CodePipeline artifact and deploys it based on the settings in its config file.
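For reference, the console setup above can also be expressed as a pipeline definition passed to the AWS CLI. This is only a sketch: the pipeline name, role ARN, bucket names, object key, and application/environment names are placeholders you would replace with your own, but the action providers (S3 source, ElasticBeanstalk deploy) and their configuration keys are the real CodePipeline ones.

```shell
# Sketch of the two-stage pipeline described above, created via the CLI.
# All names and ARNs below are hypothetical placeholders.
aws codepipeline create-pipeline --cli-input-json '{
  "pipeline": {
    "name": "bamboo-to-beanstalk",
    "roleArn": "arn:aws:iam::123456789012:role/CodePipelineServiceRole",
    "artifactStore": {"type": "S3", "location": "my-pipeline-artifacts"},
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "S3Source",
          "actionTypeId": {"category": "Source", "owner": "AWS",
                           "provider": "S3", "version": "1"},
          "configuration": {"S3Bucket": "bamboo-artifacts",
                            "S3ObjectKey": "app-fatjar.zip"},
          "outputArtifacts": [{"name": "BuildOutput"}]
        }]
      },
      {
        "name": "Deploy",
        "actions": [{
          "name": "DeployToEB",
          "actionTypeId": {"category": "Deploy", "owner": "AWS",
                           "provider": "ElasticBeanstalk", "version": "1"},
          "configuration": {"ApplicationName": "my-app",
                            "EnvironmentName": "my-app-env"},
          "inputArtifacts": [{"name": "BuildOutput"}]
        }]
      }
    ]
  }
}'
```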

Upvotes: 1
