Bret

Reputation: 1475

Losing messages between logstash and elasticsearch

I'm set up as follows while we're testing out the ESK stack and learning how all this works:

Here's the problem:

CS generates a message. It definitely gets sent to AN, where logstash filters it and forwards it on (after echoing it to logstash.stdout). The logstash instance on ESK also sees it (and writes it to logstash.stdout). I can see the messages in both logstash instances. They match and are appropriately tagged. But they aren't visible in Kibana.

Our configs and a sample message from both logs are all in gist form here: https://gist.github.com/wortmanb/ebd37b8bea278d06ffa058c1513ef940

Where could these messages be going? They're not showing up in Kibana -- if I filter on messages with tags: "puppet", I get basically nothing during timeframes when I know these messages are flowing.

Any debugging suggestions?

Upvotes: 1

Views: 1565

Answers (1)

baudsp

Reputation: 4100

The problem is that you are parsing the date of the log with the date filter, which, by default, replaces the @timestamp field, and @timestamp is the field Kibana uses to filter by date.
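A minimal sketch of what that looks like (the field name and date pattern here are assumptions, since the real filter is in the linked gist):

```
filter {
  date {
    # Parses the log's own timestamp. By default the parsed value
    # replaces @timestamp, which is what Kibana filters on.
    match => ["log_timestamp", "ISO8601"]
    # To keep @timestamp as the Logstash reception time instead,
    # write the parsed date to another field:
    # target => "event_time"
  }
}
```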

I get basically nothing during timeframes when I know these messages are flowing.

So the messages are not indexed under the timeframe during which they were flowing (received), but under the timeframe during which they were originally written.

You can see the "_grokparsefailure" logs because their date is not parsed, so their @timestamp is the reception date in Logstash.

So you'll have to change your timeframe to one that includes the dates of your logs.
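To confirm the documents were indexed at all, you can also query Elasticsearch directly without any time filter (the logstash-* index pattern below is the Logstash default, an assumption about your setup):

```
GET logstash-*/_search
{
  "query": { "term": { "tags": "puppet" } },
  "size": 1,
  "sort": [ { "@timestamp": "desc" } ]
}
```

If this returns hits, look at the @timestamp values on them: that tells you which timeframe to set in Kibana.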

Upvotes: 1
