Reputation: 1
I'm having trouble finding the right grok pattern to parse all of my logs through Logstash. Here is a sample log line:
20180809 17:43:27,user.mystack.com,adam,172.16.1.1,36610,QUERY,test_db,select * from table,'SET autocommit=0',0
I want a grok pattern that parses the log into this format:
Date- 09/08/2018 17:43:27 Domain- user.mystack.com User- adam ClientIP- 172.16.1.1 ID- 36610 Operation- Query Db_name- test_db Query- select * from table,'SET autocommit=0',0
Upvotes: 0
Views: 298
Reputation: 1441
This will be the grok pattern:
grok {
  match => ["message", '%{DATA:Date},%{DATA:Domain},%{DATA:User},%{DATA:ClientIP},%{DATA:ID},%{DATA:Operation},%{DATA:Db_name},%{GREEDYDATA:Query}']
}
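Against your sample line, this pattern should extract roughly the following fields (a sketch of what the grok debugger would show, field names taken from the pattern above):
Date       => 20180809 17:43:27
Domain     => user.mystack.com
User       => adam
ClientIP   => 172.16.1.1
ID         => 36610
Operation  => QUERY
Db_name    => test_db
Query      => select * from table,'SET autocommit=0',0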
DATA and GREEDYDATA are predefined regular expression patterns that can be reused conveniently. More patterns are available here: https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/grok-patterns
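For reference, in that patterns file these two are just thin wrappers around generic regexes:
DATA .*?
GREEDYDATA .*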
Also, use this app to test your grok patterns: https://grokdebug.herokuapp.com/
If you plan to do time-based plotting of your logs and requests, use the date filter to convert the Date field: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
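A minimal date filter for the Date field extracted above might look like this (the yyyyMMdd HH:mm:ss format string matches your sample line; adjust it if your real logs differ):
date {
  match => ["Date", "yyyyMMdd HH:mm:ss"]
  target => "@timestamp"
}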
Upvotes: 0