Pallab Pain

Reputation: 144

How do I use FileBeat to send log data in pipe-separated format to Elasticsearch in JSON format?

The log file that I am monitoring has logs in the following format:

Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|opendir|ok|.
Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|closedir|ok|
Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|open|ok|r|file.txt
Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|pread|ok|file.txt
Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|close|ok|file.txt

How can I format this data before sending it to Elasticsearch using FileBeat?

I want my document to look like the following (excluding the Elasticsearch metadata fields):

{
  "timestamp": "Oct 23 16:06:44",
  "machine-name": "server",
  "type": "smbd_audit",
  "username": "user01",
  "machine-ip": "192.168.0.23",
  "directory": "project",
  "operation": "opendir",
  "success": "ok",
  "file": "file.txt"
}

Upvotes: 0

Views: 901

Answers (1)

dwjv

Reputation: 1227

I assume you don't want to use Logstash, so you could probably use an ingest pipeline with Grok.

PUT _ingest/pipeline/my-pipeline
{
  "description": "My Ingest Pipeline",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{SYSLOGTIMESTAMP:log_date} %{WORD:machine-name} %{WORD:type}: %{WORD:username}\\|%{IP:machine-ip}\\|%{WORD:directory}\\|%{WORD:operation}\\|%{WORD:success}\\|%{GREEDYDATA:file}"
        ]
      }
    },
    {
      "date": {
        "field": "log_date",
        "formats": ["MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
      }
    }
  ]
}
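You'd then tell Filebeat to send events through the pipeline by setting pipeline on the Elasticsearch output in filebeat.yml, something like this (the hosts value is a placeholder for your cluster):

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: my-pipeline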

Totally untested, but should at least give you something to go on. One caveat: your open lines carry an extra mode field (the r before file.txt), so you may need a second grok pattern to cover those.
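Before wiring Filebeat up, you can dry-run the pattern with the simulate API, using one of your sample lines as the message:

POST _ingest/pipeline/my-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "Oct 23 16:06:44 server smbd_audit: user01|192.168.0.23|project|opendir|ok|."
      }
    }
  ]
}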

Upvotes: 1
