chenchuk

Reputation: 5742

Jenkins pipeline: how to upload artifacts with s3 plugin

I'm trying to upload artifacts to an S3 bucket after a successful build, but I can't find a working example that I can drop into a stage/node block.

Any ideas? (The S3 plugin is installed, Jenkins v2.32.)

node {
  sh 'echo ""> 1.jar'
  archiveArtifacts artifacts: '1.jar', fingerprint: true
  // upload to s3 bucket ???
}    

Upvotes: 16

Views: 49735

Answers (3)

ErikWe

Reputation: 191

Try this:

s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')

For more options, see the Pipeline AWS Plugin documentation, which includes further examples of uploading files to S3.
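A minimal pipeline sketch showing the step in context, assuming the Pipeline AWS Plugin is installed; the region, bucket name, and the `aws-creds` credentials ID are placeholders you would replace with your own values:

    node {
        stage('Upload to S3') {
            // 'aws-creds' is a placeholder Jenkins credentials ID (Kind: AWS Credentials)
            withAWS(region: 'eu-west-1', credentials: 'aws-creds') {
                s3Upload(file: 'file.txt', bucket: 'my-bucket', path: 'path/to/target/file.txt')
            }
        }
    }

Wrapping the step in `withAWS` supplies the credentials and region; without it, `s3Upload` has no way to authenticate against your account.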

Upvotes: 6

aCiD

Reputation: 1333

Detailed steps:

  1. Install Pipeline AWS Plugin. Go to Manage Jenkins -> Manage Plugins -> Available tab -> Filter by 'Pipeline AWS'. Install the plugin.

  2. Add Credentials as per your environment. Example here:

    Jenkins > Credentials > System > Global credentials (unrestricted) -> Add

    Kind = AWS Credentials and add your AWS credentials

    Note the ID

  3. Then, in your Pipeline project, use something similar to the code below:

    node {
        stage('Upload') {
            dir('path/to/your/project/workspace') {
                pwd() // Log the current directory

                withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                    def identity = awsIdentity() // Log the AWS credentials in use

                    // Upload files from the 'dist' directory in your project workspace
                    s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
                }
            }
        }
    }


Upvotes: 17

Christopher Orr

Reputation: 111623

The Pipeline Steps documentation on the Jenkins website shows that the Pipeline AWS Plugin provides an s3Upload step.

Upvotes: 11
