Ivan Datko

Reputation: 3

Alertmanager webhook does not work when defined on its own

I configured Alertmanager with a default receiver that contains both an email and a webhook config:

receivers:
- name: infra_email
  email_configs:
    - to: '[email protected]'
      send_resolved: true
  webhook_configs:
    - url: 'http://172.22.45.34:55553/'
      send_resolved: false

This works fine.

When I try to configure the same webhook as a separate receiver:

route:
  receiver: 'infra_email'
  group_by: [alertname, severity]
  group_interval: 5m
  repeat_interval: 4h
  group_wait: 3m
  routes:
    - match:
        alertname: ServerRebooted, HostOutOfDiskSpace, HostOutOfMemory
      receiver: splunk_webhook
      continue: true

receivers:
- name: infra_email
  email_configs:
    - to: '[email protected]'
      send_resolved: true

- name: splunk_webhook
  webhook_configs:
    - url: 'http://172.22.45.34:55553/'
      send_resolved: false

This does not work, even though amtool reports the route as valid:

# /usr/local/bin/amtool config routes show
Routing tree:
.
└── default-route  receiver: infra_email
       └── {alertname="ServerRebooted, HostOutOfDiskSpace, HostOutOfMemory"}  continue: true  receiver: splunk_webhook

$ alertmanager --version
alertmanager, version 0.20.0 (branch: HEAD, revision: f74be0400a6243d10bb53812d6fa408ad71ff32d)
  build user:       root@00c3106655f8
  build date:       20191211-14:13:14
  go version:       go1.13.5

Upvotes: 0

Views: 2574

Answers (1)

bjakubski

Reputation: 1747

I'm assuming you want your webhook to be called for the ServerRebooted, HostOutOfDiskSpace and HostOutOfMemory alerts, and that these are three separate alerts.

Your alerts do not match the specified condition: match: does exact matching on the listed labels, so a comma-separated list of alert names does not work (it would only match an alert whose alertname is literally that entire string). The best way is to use match_re: instead, with a regular expression:

match_re:
  alertname: ServerRebooted|HostOutOfDiskSpace|HostOutOfMemory
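
For completeness, a sketch of how the routing block from the question could look with match_re (receiver names and timing values copied from the question):

route:
  receiver: 'infra_email'
  group_by: [alertname, severity]
  group_interval: 5m
  repeat_interval: 4h
  group_wait: 3m
  routes:
    # match_re treats the value as a regular expression,
    # so the three alert names can be listed as alternatives
    - match_re:
        alertname: ServerRebooted|HostOutOfDiskSpace|HostOutOfMemory
      receiver: splunk_webhook
      continue: true

You can then check which receivers a given alert would reach with something like amtool config routes test alertname=ServerRebooted.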

Alternatively, you may consider adding a label to the alerts themselves and then routing by that label. If the fact that an alert should go to Splunk is a property of the alert itself, you can add or remove alerts from the set sent to Splunk without touching the Alertmanager configuration, as sketched below.
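
As a sketch of that approach (the notify_splunk label name and the example alerting rule are made up here purely for illustration), the alerting rule would carry the label:

# Prometheus alerting rule (illustrative only)
groups:
  - name: infra
    rules:
      - alert: ServerRebooted
        expr: time() - node_boot_time_seconds < 600
        labels:
          severity: warning
          notify_splunk: 'true'   # marks this alert for the Splunk webhook
        annotations:
          summary: 'Server was rebooted recently'

and the Alertmanager route would match on that label instead of on alertname:

routes:
  - match:
      notify_splunk: 'true'
    receiver: splunk_webhook
    continue: true

Adding or removing an alert from the Splunk feed then only requires changing the label on the alerting rule.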

Upvotes: 1
