Jeroen Ooms

Reputation: 32978

Single Jenkins job for all repositories in a GitHub organization

We own a GitHub organization with hundreds of repositories that are authored by contributors. We would like to set up a Jenkins server that performs certain standard tasks for each commit in any of the repos in our GitHub organization. The intended CI flow is pretty simple:

  1. A user commits a change to repo myorg/foobar
  2. The organization-wide GitHub webhook for myorg calls the Jenkins server
  3. Jenkins runs a Docker command to perform tasks for myorg/foobar
  4. Jenkins sets the commit status to pending, including a link to the command's progress output
  5. Upon completion, Jenkins updates the final commit status to success or failure

I am new to Jenkins and completely lost on which plugins or job type I need to use to set this up.

I tried creating a Jenkins "GitHub Organization" for my GitHub org, but it just tells me "This folder is empty, there are no repositories found that contain buildable projects". It's also unclear to me where the GitHub organization webhook has to be configured.

We don't want to set up separate jobs/Jenkinsfiles/webhooks for all repos, but simply use a standard script that gets run for any commit in each repo, and trigger this via a single GitHub organization webhook. Is this possible?

Upvotes: 3

Views: 8914

Answers (5)

Jeroen Ooms

Reputation: 32978

Answering my own question:

As several people have pointed out, Jenkins assumes one job per repository. The GitHub Organization plugin didn't work well because it is clumsy and requires you to commit and maintain a Jenkinsfile in each of your repos, which is specifically what I wanted to avoid.

The critical piece of information that I was unaware of is that Jenkins has an excellent CLI and REST API for controlling jobs, and a single job configuration can easily be exported as a simple XML file.

So what I did is set up the Jenkins job for one of the repos via the Jenkins GUI. Then I wrote a simple REST client that downloads the config.xml for this job, and creates or updates the Jenkins jobs for each of the repositories in our GitHub organization.

The builds automatically get triggered by the organization-wide GitHub webhook whenever the URL in the payload matches that of any of the repositories. No special GitHub Organization plugin is needed.
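For illustration, here is a minimal sketch of such a REST client in shell, using curl and jq. The Jenkins endpoints (GET/POST /job/NAME/config.xml and POST /createItem) and the GitHub endpoint (GET /orgs/ORG/repos) are the standard APIs, but the template job name "foobar", the organization "myorg", the server URL, and the sed-based rewrite of the repo URL are assumptions about your setup:

JENKINS="https://jenkins.example.com"
AUTH="user:apitoken"   # using an API token avoids CSRF crumb handling

# Download the hand-configured template job's config.xml
curl -s -u "$AUTH" "$JENKINS/job/foobar/config.xml" -o template.xml

# List all repos in the organization (first page only; add pagination for hundreds of repos)
curl -s "https://api.github.com/orgs/myorg/repos?per_page=100" |
jq -r '.[].name' | while read -r repo; do
    # Point the template at this repo (assumes the repo URL occurs verbatim in the XML)
    sed "s|myorg/foobar|myorg/$repo|g" template.xml > "config-$repo.xml"

    if curl -sf -u "$AUTH" -o /dev/null "$JENKINS/job/$repo/config.xml"; then
        # Job already exists: update its configuration in place
        curl -s -u "$AUTH" -X POST -H "Content-Type: application/xml" \
            --data-binary "@config-$repo.xml" "$JENKINS/job/$repo/config.xml"
    else
        # Job does not exist yet: create it
        curl -s -u "$AUTH" -X POST -H "Content-Type: application/xml" \
            --data-binary "@config-$repo.xml" "$JENKINS/createItem?name=$repo"
    fi
done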

Upvotes: 3

S.Spieker

Reputation: 7355

The standard approach would be to create a new multibranch pipeline which scans your organization for new repositories. Every repository should have a Jenkinsfile with the instructions to build it. But in general it is also possible to achieve what you are trying to do programmatically.

What my approach would be:

  1. Create a job template as config.xml (a shell script that runs Docker to check certain things)
  2. Scan GitHub to find new repositories
  3. Create a new Jenkins job based on the template (ideally just replace the SCM link with the new location); see How-to-create-a-job-using-the-REST-API-and-cURL
  4. Run that job

I would use the Folders plugin to create a folder for this type of job.
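As a sketch of step 3 combined with the folder idea: with the Folders plugin, the createItem endpoint is simply scoped under the folder's path. The folder name "org-scans", the job name, and the server URL below are placeholders:

curl -s -u "user:apitoken" -X POST -H "Content-Type: application/xml" \
    --data-binary @template.xml \
    "https://jenkins.example.com/job/org-scans/createItem?name=new-repo"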

If that is what you are really trying to do, I could elaborate further.

Upvotes: 1

Marcelo Idemax

Reputation: 2810

Assuming your Jenkins is running on Linux or macOS, or on Windows with shell script support, configure a Jenkins job to execute the script below. Don't forget to replace the user and password fields, and read the comment lines in order to understand and maybe improve it.

# This scrapes the organization page for repository links; note that HTML
# scraping is fragile and the GitHub API is a more robust alternative.
curl -i -u "user":"password" "https://github.com/your_organization" | grep "codeRepository" | awk -F '"' '{print $8}' | while read -r line; do
    mkdir "_temp_repo"
    cd "_temp_repo" || exit 1

    # `--depth=1` clones the last commit only, improving the speed of the clone
    # if you want to clone a specific branch add `-b branch` to the command below
    git clone --depth=1 "https://github.com$line" .

    # execute your pending commands here...

    git add .
    git commit -am "[pending] XPTO..."

    git push

    # execute your success/failure commands here...

    git add .
    git commit -am "[success/failure] XPTO..."

    git push

    cd ..
    rm -rfv "_temp_repo"
done

I would suggest creating an SH file and executing it in verbose mode: sh -x ./my_script.sh.

In order to run it for every new update, set up a GitHub webhook that triggers this job.

Upvotes: 0

Here_2_learn

Reputation: 5451

I am not sure how much of this answer will help you, but I will be happy even if it provides some insight into Jenkins pipelines.

I am elaborating the procedure to follow using Jenkins pipelines; if not now, at some point you will need to move your build and deploy to pipelines for infrastructure as code.

Starting with Jenkins plugins, the following are mandatory for the procedure that I will be explaining here:

  • GitHub Organization - for scanning the organization with multiple repos
  • Multibranch Pipeline - for creating pipelines automatically for all the branches/PRs in a repo. This helps to validate feature branches and PR changes.

Jenkins Configuration

  1. Create a GitHub Organization item from the options below:

[Screenshot: the "GitHub Organization" item type in the Jenkins new-item menu]

  2. Configure the newly created organization from the above step. The Owner should be your organization, where the hundreds of repos are available.

[Screenshot: the organization configuration page with the Owner field]

Also, configure which file and which branches to look for in a repo to trigger a build. The Script Path is the file that contains the steps (probably build and deploy) for the repos. Repos will be detected and shown in Jenkins only if a file with this name is available in them.

[Screenshot: the project recognizer configuration with the Script Path field]

Jenkins scans the configured organization at the interval specified here, detecting any additions or deletions of repos as well as new commits. It is also good to configure the number of builds to store, as needed.

[Screenshot: the scan interval and build retention settings]

Git repo/organization configuration

Configure webhooks in GitHub:

[Screenshots: adding an organization webhook in GitHub]

Configure the events that require notifications to Jenkins.

[Screenshot: the webhook event selection]
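If you would rather script this step than click through the UI, the organization webhook can also be created via the GitHub API. A sketch, where the credentials, organization name, and Jenkins URL are placeholders; /github-webhook/ is the receiver exposed by the Jenkins GitHub plugin:

# Create an organization-wide webhook pointing at Jenkins
curl -u "user:token" -X POST "https://api.github.com/orgs/myorg/hooks" \
    -d '{"name": "web", "active": true, "events": ["push", "pull_request"],
         "config": {"url": "https://jenkins.example.com/github-webhook/", "content_type": "json"}}'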

Branch protection and status checks for PRs

Protecting the branch with the proper checks helps restrict who can commit and ensures changes only land after the status checks have passed. This helps to maintain good code quality.

[Screenshots: the branch protection settings with required status checks]

Here is a snapshot that shows the status checks when a PR is raised. Based on this, reviewers will be able to decide whether to approve the PR.

[Screenshot: status checks displayed on a pull request]

This link explains in detail the procedure that I have described here:

https://github.com/gitbucket/gitbucket/wiki/Setup-Jenkins-Multibranch-Pipeline-and-Organization

Upvotes: 2

Lovato

Reputation: 2310

You have more than one requirement here. Let's go through one by one.

a) Jenkins GitHub Organization: this will scan your whole GitHub organization and create as many jobs as are needed to build your repositories, because having just one job on Jenkins is not the standard. With a single job you basically lose history data (Jenkins has no idea it is building different stuff at every iteration). The help text says: "Scans a GitHub organization (or user account) for all repositories matching some defined markers."

b) Try to see Jenkins as an automator, not as something which hosts all your build/deploy logic. What I do is create files like "build.sh", "deploy.sh", and so on. This way, I can build and deploy directly from my shell. Only after that do I create Jenkins jobs that just call the build/deploy scripts, no matter what they actually do; Jenkins doesn't need to know. A side effect is that all your projects "can be built the same way", no matter if they are NodeJS, Python, or whatever. Of course, you might need extra dependencies in some cases, and Docker can really help here.

c) I did something similar in the past, having fewer jobs than repositories/branches/pull-requests. Jenkins is kind of dumb here, and a few plugins can help. But in your case, if you really want to have one job, you only need a regular parametrized job. The trick is that your GitHub organization-wide webhook will not point to Jenkins; it needs to point somewhere else, to some code you maintain. This code can parse the GitHub payload, analyze it, eventually call back to GitHub ("is there a pull request for this branch? no? then forget it") to enhance its decision tree, and at the end trigger that single job on Jenkins with all the parameters it was able to capture, as sketched below. Such parameters will tell the single job which repo to clone, which env to deploy to, and that is it. You already know the script names, since they are standard.
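For illustration, triggering such a parametrized job from your webhook parser is one authenticated POST to Jenkins' buildWithParameters endpoint. In this sketch the job name "org-ci" and the REPO/SHA parameters are made-up examples:

# Trigger the single parametrized job, passing what the parser extracted
curl -s -X POST -u "user:apitoken" \
    "https://jenkins.example.com/job/org-ci/buildWithParameters" \
    --data-urlencode "REPO=myorg/foobar" \
    --data-urlencode "SHA=abc1234"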

d) That said, I would ask... do you need Jenkins at all? Could this little parser program actually clone your repo and run a few scripts inside a Docker container? A builder container which has every dependency inside?

e) About "talking back" to GitHub: I did that using Python. There are GitHub libraries, so I was able to get stuff back from Jenkins and do API posts to feed GitHub with build statuses. Since I was actually using a Jenkins instance, my tool was a man-in-the-middle broker. In your case, for a single job, a Docker container will play the role nicely.
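Feeding statuses back (steps 4 and 5 in the question) boils down to one call per state change against GitHub's commit status API. A sketch, where the repo, commit SHA, token, and target URL are placeholders:

# Mark the commit as pending, linking to the build's console output;
# send the same call with "success" or "failure" once the build finishes.
curl -u "user:token" -X POST \
    "https://api.github.com/repos/myorg/foobar/statuses/abc1234" \
    -d '{"state": "pending",
         "target_url": "https://jenkins.example.com/job/org-ci/42/console",
         "context": "ci/docker-checks"}'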

Hope this helps with a different perspective.

If you actually want to use a Jenkins instance, most of what I said here can still be used.

Upvotes: 0
