Reputation: 931
I have an ElastAlert rule with type frequency. If the number of hits is 1000 or more in 60 minutes, it should trigger the alert. The issue is that the moment it reaches 1000 hits, within 5-6 minutes, it triggers the alert instead of waiting for the full 60-minute period. I want it to alert only after the 60-minute period is over. I tried adding a realert of 60 minutes, but that still did not work. What needs to be done to trigger an alert only when the 60-minute period is over?
type: frequency
index: logstash-*
num_events: 1000
timeframe:
  minutes: 60
realert:
  minutes: 60
query_key: site_name
filter:
- query:
    query_string:
      query: 'NOT site_name: "CCBDN" AND NOT namespace: master'
alert: my_alerts.AlertManager
labels:
  severity: major
  slack: 'true'
  auto_resolve: 'false'
annotations:
  summary: Kibana is getting logs from sites other than CCBDN.
Upvotes: 2
Views: 2031
Reputation: 2000
realert:
  minutes: 15
realert (time, default: 1 min). This option worked for me.
From the documentation: "To ignore repeating alerts for a period of time. If the rule uses a query_key, this option will be applied on a per key basis. All matches for a given rule, or for matches with the same query_key, will be ignored for the given time. All matches with a missing query_key will be grouped together using a value of _missing. This is applied to the time the alert is sent, not to the time of the event. It defaults to one minute, which means that if ElastAlert is run over a large time period which triggers many matches, only the first alert will be sent by default. If you want every alert, set realert to 0 minutes." (Optional, time, default 1 minute)
Upvotes: 0
Reputation: 931
I found a solution for this. I used aggregation, and the rule started consolidating all occurrences of the alert and triggering once every hour.
aggregation:
  hours: 1
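For reference, a sketch of how the aggregation option slots into the rule from the question (all other fields are taken from the question; with aggregation set, ElastAlert buffers matches and sends one consolidated alert per hour rather than alerting on the first match):

```yaml
# Sketch: the question's frequency rule with aggregation added.
type: frequency
index: logstash-*
num_events: 1000
timeframe:
  minutes: 60
query_key: site_name
# Buffer matches and send a single consolidated alert once per hour.
aggregation:
  hours: 1
filter:
- query:
    query_string:
      query: 'NOT site_name: "CCBDN" AND NOT namespace: master'
alert: my_alerts.AlertManager
```

Note the difference from realert: realert suppresses repeat alerts after one has been sent, while aggregation delays and groups matches before sending anything, which is why it produces the once-per-hour behavior asked for here.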
Upvotes: 2