AnonymousAlias

Reputation: 1399

jenkinsfile - copy files to s3 and make public

I am uploading a website to an S3 bucket for hosting. I upload from a Jenkins build job using this in the Jenkinsfile:

withAWS(credentials: 'aws-cred') {
    sh 'npm install'
    sh 'ng build --prod'
    s3Upload(
        file: 'dist/topic-creation',
        bucket: 'bucketName',
        acl: 'PublicRead'
    )
}

After this step I go to the S3 bucket and get the URL (I have configured the bucket for static website hosting). When I go to the endpoint URL I get a 403 error. When I go back to the bucket and give all the uploaded items public access, the URL brings me to my website.

I don't want to make the whole bucket public; I want to give the files public access. I thought adding the line acl: 'PublicRead', which can be seen above, would do this, but it does not.

Can anyone tell me how I can upload the files and give them public access from a Jenkinsfile?

Thanks

Upvotes: 0

Views: 3407

Answers (1)

Pavithra Kumaresh

Reputation: 334

Install the Pipeline: AWS Steps plugin on your Jenkins instance: https://plugins.jenkins.io/pipeline-aws/ (this is the plugin that provides the withAWS, awsIdentity, and s3Upload pipeline steps used below; the S3 publisher plugin at https://plugins.jenkins.io/s3/ provides a post-build step for Free-style jobs instead).

In order to upload the local artifacts to your S3 bucket with public access, use the following step (you can also generate it with the Jenkins Pipeline Syntax tool):

def identity=awsIdentity();
s3Upload acl: 'PublicRead', bucket: 'NAME_OF_S3_BUCKET', file: 'THE_ARTIFACT_TO_BE_UPLOADED_FROM_JENKINS', path: "PATH_ON_S3_BUCKET", workingDir: '.'
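Put together with the build steps from the question, the upload stage might look like the sketch below. The credential ID, bucket name, and paths are placeholders taken from the question and the step above, not a tested configuration:

```groovy
// Sketch of a Jenkinsfile upload stage; 'aws-cred', the bucket name,
// and the paths are placeholders -- substitute your own values.
withAWS(credentials: 'aws-cred') {
    sh 'npm install'
    sh 'ng build --prod'
    s3Upload(
        file: 'dist/topic-creation',      // local artifact to upload
        bucket: 'NAME_OF_S3_BUCKET',
        path: 'PATH_ON_S3_BUCKET',        // key prefix inside the bucket
        acl: 'PublicRead',                // per-object canned ACL
        workingDir: '.'
    )
}
```

Note that acl: 'PublicRead' sets a per-object canned ACL at upload time; it only takes effect if the bucket's settings allow public ACLs.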

In case of a Free-style build, here's a sample: [screenshot omitted]
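If the uploaded objects still return 403 even with acl: 'PublicRead', one frequent cause is that the bucket's Block Public Access settings ignore or reject public ACLs. An alternative (a sketch; the bucket name is a placeholder) is to attach a public-read bucket policy so every object is readable without per-object ACLs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::NAME_OF_S3_BUCKET/*"
    }
  ]
}
```

Applying this policy requires turning off the relevant Block Public Access settings for the bucket.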

Upvotes: 1
