Reputation: 931
Raise an alert if there are no logs in Kibana for 5 minutes.
I tried a flatline alert. It raises an alert but does not tell which IP has stopped sending logs. Suppose there are 4 IPs sending logs to Kibana; if I create a flatline alert with query_key as ipaddress, it gets triggered correctly, but the alert details in Slack do not say for which IP the logs stopped reaching Kibana. I have to go to Kibana and manually run the query for each IP to figure out the correct one. So I am looking for an alternative to the flatline alert.
name: RLCMNoKibanaLogs
index: logstash-*
type: flatline
query_key: ["@module_tag", "ipaddr"]
threshold: 1
timeframe:
  minutes: 5
realert:
  minutes: 0
use_count_query: true
doc_type: fluentd
filter:
- query:
    query_string:
      query: '@module_tag:rlcm'
alert: my_alerts.AlertManager
labels:
  alertsrc: ElasticSearch
  kafka: 'true'
  slack: 'true'
  severity: info
annotations:
  description: No logs reaching kibana for RLCM component.
  summary: No logs available in Kibana from RLCM for the last 5 minutes.
This alert gets triggered correctly but does not show for which IP the logs have stopped. So I am looking for an alternative to the flatline rule to handle the "no logs in Kibana" situation. Any help would be great.
Upvotes: 0
Views: 1756
Reputation: 1730
Your flatline rule has threshold set to 1, meaning an alert fires when no events occur for a particular query_key. Because there is no matching event, ElastAlert cannot attach event details to the alert the way rule types such as frequency can.
The workaround is to reference the query_key value through the key placeholder in the alert description, like this:
No logs for query_key: {key} reaching Kibana for RLCM component.
You may also need to disable use_count_query: true for this to work.
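For reference, a minimal sketch of how the relevant parts of the rule could look, assuming your custom my_alerts.AlertManager substitutes match fields such as {key} into the annotation strings (that behaviour depends on your alerter, so treat it as an assumption):

# use_count_query: true    # per the note above, this may need to be disabled for {key} to resolve
query_key: ["@module_tag", "ipaddr"]
annotations:
  description: No logs for query_key {key} reaching Kibana for the RLCM component.
  summary: No logs available in Kibana from {key} for the last 5 minutes.

With the query_key value included in the description, the Slack message should identify which IP (and module tag) stopped sending logs, without having to query Kibana manually.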
Upvotes: 2