Upen

Reputation: 1438

Jenkins API response tuning

We have built a dashboard on top of Jenkins which enables users to see only the jobs relevant to their project and to trigger builds. The UI is built using ReactJS and the backend consists of Java REST web services.

The web service calls the Jenkins API to fetch job information and converts the data to JSON for feeding the UI. At present we have around 200 jobs on the dashboard, and it takes around 2 minutes for the Jenkins API to respond with the details.

Jenkins is running on a Linux box

Oracle Linux, 6 x Intel(R) Xeon(R) CPU E5-2660 0 @ 2.20GHz / 39.25 GB RAM

Jenkins Version - 1.564 with 16 Executors and more than 2000 Jobs

Sample API call: http://jenkins:8080/job/jobName/api/json?tree=displayName,builds[result],lastBuild[estimatedDuration,result,duration,number,timestamp,actions[causes[userName]]]

The API is called 200 times, once per job, to fetch the details of each job.

Any advice on how to speed up the API response?

I have considered increasing the RAM on the Linux box, tuning the JVM options, and upgrading Jenkins to the latest LTS.

Upvotes: 0

Views: 1516

Answers (2)

Jon S

Reputation: 16346

Since it seems you're not hitting any lazy-loading issues on your server (you only have 60 builds per job), the problem is probably the per-request overhead, as Alex O suggests. Alex O also suggested doing it all in a single request, which can be done with the following call:

http://jenkins:8080/api/json?tree=jobs[displayName,builds[result],lastBuild[estimatedDuration,result,duration,number,timestamp,actions[causes[userName]]]]

Instead of calling the per-job API, this uses the top-level Jenkins API, where we can fetch the data for all jobs in a single request.
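A minimal Java sketch of issuing that single aggregated call (the class and method names are my own; the tree expression is exactly the one from the URL above, wrapped in `jobs[...]`):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.stream.Collectors;

public class JenkinsAggregateFetch {
    // Same fields as the per-job call, wrapped in jobs[...] so that
    // one request returns the details for every job on the server.
    static final String TREE =
        "jobs[displayName,builds[result],lastBuild[estimatedDuration,"
      + "result,duration,number,timestamp,actions[causes[userName]]]]";

    // Builds the aggregated API URL for a given Jenkins base URL.
    public static String aggregateUrl(String baseUrl) {
        return baseUrl + "/api/json?tree=" + TREE;
    }

    // Issues the single GET and returns the raw JSON body,
    // replacing the 200 individual per-job requests.
    public static String fetchJson(String baseUrl) throws Exception {
        HttpURLConnection conn =
            (HttpURLConnection) new URL(aggregateUrl(baseUrl)).openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            return in.lines().collect(Collectors.joining());
        }
    }
}
```

The backend then splits the returned `jobs` array into per-job entries instead of issuing one request per job.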

Upvotes: 1

Alex O

Reputation: 8164

Low-hanging fruit:

  1. Run the requests in parallel, i.e., not one after another.
  2. If you do that and you use the standard Jetty container, try increasing the number of worker threads with the --handlerCountMax option (the default is 40).
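Point 1 could look like the following Java sketch (the names are mine, and the actual HTTP call is abstracted behind a function argument so the sketch is self-contained; in the real backend it would be the per-job GET to /job/&lt;name&gt;/api/json?tree=...):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

public class ParallelJobFetch {
    // Runs the per-job requests concurrently instead of one after another.
    // 'fetch' stands in for the real HTTP call for a single job.
    public static Map<String, String> fetchAll(List<String> jobNames,
                                               Function<String, String> fetch,
                                               int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            // Submit all requests first so they run in parallel...
            Map<String, Future<String>> pending = new LinkedHashMap<>();
            for (String name : jobNames) {
                pending.put(name, pool.submit(() -> fetch.apply(name)));
            }
            // ...then collect the results in job order.
            Map<String, String> results = new LinkedHashMap<>();
            for (Map.Entry<String, Future<String>> e : pending.entrySet()) {
                try {
                    results.put(e.getKey(), e.getValue().get());
                } catch (Exception ex) {
                    throw new RuntimeException("fetch failed for " + e.getKey(), ex);
                }
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

With 200 jobs, total wall-clock time drops to roughly the slowest request per batch rather than the sum of all 200, subject to the server-side worker limit from point 2.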

Ultimately, you should try to avoid performing 200 individual requests. Depending on your setup, the security checks for every request alone can cause substantial overhead.

Therefore, the cleanest solution will be to gather all the data that you need in a single Groovy script on the master (which you can also invoke via REST):

  • this reduces the number of requests to 1
  • and it allows for further optimization, possibly circumventing the problems mentioned in Jon S's comment above
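A rough sketch of such a script, runnable in the Script Console or via the /scriptText REST endpoint (this is my own illustration, not tested against Jenkins 1.564; it collects the same fields as the API calls above and prints them as one JSON document):

```groovy
import groovy.json.JsonOutput
import jenkins.model.Jenkins

// Walk all jobs on the master in-process, avoiding per-request overhead.
def data = Jenkins.instance.getAllItems(hudson.model.Job).collect { job ->
    def last = job.lastBuild
    [
        displayName: job.displayName,
        builds     : job.builds.collect { [result: it.result?.toString()] },
        lastBuild  : last == null ? null : [
            number           : last.number,
            result           : last.result?.toString(),
            duration         : last.duration,
            estimatedDuration: last.estimatedDuration,
            timestamp        : last.timeInMillis,
        ],
    ]
}
println JsonOutput.toJson(data)
```

The dashboard backend would then make one POST to /scriptText and parse the printed JSON, instead of 200 API calls.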

Upvotes: 2
