Damian Jacobs

Reputation: 538

How to stagger cron jobs in Kubernetes

I have roughly 20 CronJobs in Kubernetes that handle various tasks at specific time intervals. Currently there is a fair bit of overlap, causing resource usage to spike rather than staying relatively flat.

Below is a rough example of one of my cronjobs:

apiVersion: batch/v1
kind: CronJob
metadata:
  name: my-task
spec:
  schedule: "*/20 * * * *"
  successfulJobsHistoryLimit: 1
  failedJobsHistoryLimit: 1
  suspend: false
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      backoffLimit: 1
      ttlSecondsAfterFinished: 900
      template:
        spec:
          serviceAccountName: my-task-account
          containers:
          - name: my-task
            image: 12345678910.dkr.ecr.us-east-1.amazonaws.com/my-task:latest
            command: ["/bin/sh"]
            args:
              - -c
              - >-
                  python3 my-task.py
            resources:
              requests:
                memory: "3Gi"
                cpu: "800m"
              limits:
                memory: "5Gi"
                cpu: "1500m"
          restartPolicy: Never

Is there a way to stagger my jobs so that they aren't all running concurrently?

A solution where this is handled automatically would be first prize; however, a manually configured solution would also suffice.

Upvotes: 1

Views: 649

Answers (1)

anarxz

Reputation: 993

Posting my comment as the answer for better visibility.

As far as I understand, all your jobs are configured separately, so you can set a specific schedule for each of them. For example, job 1, which starts at 12:00 with its next run at 12:20, can be set up like this:

spec:
  schedule: "0,20 12 * * *"

and correspondingly for job 2:

spec:
  schedule: "01,21 12 * * *"
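If you want to keep the original every-20-minutes cadence while offsetting the ~20 jobs against each other, one manual option is to give each job a different starting minute within the 20-minute window. Below is a minimal sketch of that idea in Python; the job names and the one-minute spacing between jobs are illustrative assumptions, not something from the question:

# Sketch: generate staggered "every 20 minutes" cron schedules for ~20 jobs.
# Job names and the one-minute spacing are illustrative assumptions.
jobs = [f"job-{i:02d}" for i in range(20)]

for i, name in enumerate(jobs):
    offset = i % 20  # spread start minutes across the 20-minute window
    minutes = ",".join(str(m) for m in range(offset, 60, 20))
    print(f'{name}: schedule: "{minutes} * * * *"')

This prints schedules such as "0,20,40 * * * *" for the first job and "1,21,41 * * * *" for the second, which you can then paste into each CronJob's spec.schedule so every job keeps its cadence but starts on a different minute.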

Upvotes: 1
