Reputation: 175
Below are three examples of the type of log I am getting from our automation platform. I am looking to extract the customOptions section. The challenge I am running into is that there can be many custom options. I think what I need to do is split out the customOptions array and then dissect that. I have tried Logstash dissect, grok, and mutate, and I am struggling to get that data out.
2020-12-09_18:06:30.58027 executing local task [refId:3122, lockTimeout:330000, lockTtl:300000, jobType:jobTemplateExecute, lockId:job.execute.3122, jobTemplateId:3122, jobDate:1607537190133, userId:1897, customConfig:{"AnsibleRequestedUser":"testing1","AnsibleRequestedUserPassword":"VMware321!"}, jobTemplateExecutionId:5677, customInputs:[customOptions:[AnsibleRequestedUser:testing1, AnsibleRequestedUserPassword:VMware321!]], processConfig:[accountId:947, status:executing, username:user1, userId:1897, userDisplayName:user1 user1, refType:jobTemplate, refId:3122, timerCategory:TEST: 0. Enterprise Create User, timerSubCategory:3122, description: Enterprise Create User], processMap:[success:true, refType:jobTemplate, refId:3122, subType:null, subId:null, process: : 25172, timerCategory:TEST: 0. OpenManage Enterprise Create User, timerSubCategory:3122, zoneId:null, processId:25172], taskConfig:[:],:@45eb737f]
2020-12-09_15:33:43.21913 executing local task [refId:3117, lockTimeout:330000, lockTtl:300000, jobType:jobTemplateExecute, lockId:job.execute.3117, jobTemplateId:3117, jobDate:1607528023018, userId:320, customConfig:null, jobTemplateExecutionId:5667, customInputs:[customOptions:[AnsibleIdentPoolDesc:asdf123, AnsibleIdentPoolCount:50, TrackingUseCase:Customer Demo/Training, AnsiblePoolName:asdf123]], processConfig:[accountId:2, status:executing, username:[email protected], userId:320, userDisplayName:user, refType:jobTemplate, refId:3117, timerCategory:TEST: 2. Enterprise - Create Identity Pool, timerSubCategory:3117, description:TEST: 2. Enterprise - Create Identity Pool], processMap:[success:true, refType:jobTemplate, refId:3117, subType:null, subId:null, process: : 25147, timerCategory:TEST: 2. Enterprise - Create Identity Pool, timerSubCategory:3117, zoneId:null, processId:25147], taskConfig:[:], :@21ff5f47]
2020-12-09_15:30:53.83030 executing local task [refId:3112, lockTimeout:330000, lockTtl:300000, jobType:jobTemplateExecute, lockId:job.execute.3112, jobTemplateId:3112, jobDate:1607527853230, userId:320, customConfig:null, jobTemplateExecutionId:5662, customInputs:[customOptions:[ReferenceServer:10629, ReferenceServerTemplateName:asdfasdf, TrackingUseCase:Internal Testing/Training, ReferenceServerTemplateDescription:asdfasdf]], processConfig:[accountId:2, status:executing, username:[email protected], userId:320, userDisplayName:user, refType:jobTemplate, refId:3112, timerCategory:TEST: 1. Enterprise - Create Template From Reference Device, timerSubCategory:3112, description:TEST: 1. Enterprise - Create Template From Reference Device], processMap:[success:true, refType:jobTemplate, refId:3112, subType:null, subId:null, process: : 25142, timerCategory:TEST: 1. Enterprise - Create Template From Reference Device, timerSubCategory:3112, zoneId:null, processId:25142], taskConfig:[:],:@29ac1e41]
I need to extract the following from the messages above.
Message 1:
[customOptions:[AnsibleRequestedUser:testing1, AnsibleRequestedUserPassword:VMware321!]] - I would like each of those in a new field. username:user1 - I need that in a field. timerCategory:TEST: 0. Enterprise Create User - I need this in a field.
The rest of the data can stay in the original message field.
Message 2:
[customOptions:[AnsibleIdentPoolDesc:asdf123, AnsibleIdentPoolCount:50, TrackingUseCase:Customer Demo/Training, AnsiblePoolName:asdf123]] - I need these separated into different fields. username:[email protected] needs to be a field. timerCategory:TEST: 2. Enterprise - Create Identity Pool - I need this in a field.
Message 3:
[customOptions:[ReferenceServer:10629, ReferenceServerTemplateName:asdfasdf, TrackingUseCase:Internal Testing/Training, ReferenceServerTemplateDescription:asdfasdf]] - I need these separated into separate fields. username:[email protected] needs to be a field. timerCategory:TEST: 1. Enterprise - Create Template From Reference Device needs to be a field.
Keep in mind that the timer category will constantly change depending on what the log spits out, but it will keep the same format as above.
The custom options will also constantly change - which automation kicks off determines which custom options appear - but again the format above stays the same.
The username could be an email address or a generic name.
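To make it concrete, for message 2 I would like the event to end up with fields roughly like this (the exact field names are not important):
"username" => "[email protected]",
"timerCategory" => "TEST: 2. Enterprise - Create Identity Pool",
"AnsibleIdentPoolDesc" => "asdf123",
"AnsibleIdentPoolCount" => "50",
"TrackingUseCase" => "Customer Demo/Training",
"AnsiblePoolName" => "asdf123"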
Here are some of the Logstash filters I have tried with some success, but they do not handle the changing nature of the log messages.
# Testing a new method to get information from the logs.
#if "executing local task" in [message] and "beats" in [tags]{
# dissect {
# mapping => {
# "message" => "%{date} %{?skip1} %{?skip2} %{?skip3} %{?refid} %{?lockTimeout} %{?lockTtl} %{?jobtemplate} %{?jobType} %{?jobTemplateId} %{?jobDate} %{?userId} %{?jobTemplateExecutionId} %{?jobTemplateExecutionId1} customInputs:[customOptions:[%{?RequestedPassword}:%{?RequestedPassword} %{?TrackingUseCase1}:%{TrackingUseCase}, %{?RequestedUser}, %{?processConfig}, %{?status}, username:%{username}, %{?userId}, %{?userDisplayName}, %{?refType}, %{?refID}, %{?timerCategory}:%{TaskName}, %{?timeCat}, %{?description}, %{?extra}"
# }
# }
#}
# Testing Grok Filters instead.
if "executing local task" in [messages] and "beats" in [tags]{
grok {
match => { "message" => "%{YEAR:year}-%{MONTHNUM2:month}-%{MONTHDAY:day}_%{TIME:time}%{SPACE}%{CISCO_REASON}%{SYSLOG5424PRINTASCII}%{SPACE}%{NOTSPACE}%{SPACE}%{NOTSPACE}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{PROG}%{SPACE}%{SYSLOGPROG}%{SYSLOG5424SD:testing3}%{NOTSPACE}%{SPACE}%{PROG}%{SYSLOG5424SD:testing2}%{NOTSPACE}%{SPACE}%{PROG}%{SYSLOG5424SD:testing}%{GREEDYDATA}}"
}
}
}
I think grok is what I need to use, but I am not familiar with how to split / add fields to meet the needs above.
Any help would be greatly appreciated.
Upvotes: 0
Views: 568
Reputation: 1540
This is another answer focused on grok (though I agree it is a bit difficult to maintain over time, and also hard to read right now).
This code is an implementation of that approach:
filter {
  grok {
    match => {
      "message" => "%{DATE:date}_%{TIME:time} %{CISCO_REASON} \[refId\:%{INT:refId}, lockTimeout:%{INT:lockTimeout}, lockTtl:%{INT:lockTtl}, jobType:%{NOTSPACE:jobType}, lockId:%{NOTSPACE:lockId}, jobTemplateId:%{INT:jobTemplateId}, jobDate:%{INT:jobDate}, userId:%{INT:userId}, customConfig:(\{%{GREEDYDATA:customConfig}\}|null), jobTemplateExecutionId:%{INT:jobTemplateExecutionId}, customInputs:\[customOptions:\[%{GREEDYDATA:customOptions}\]\], processConfig:\[%{GREEDYDATA:processConfig}\], processMap:\[%{GREEDYDATA:processMap}\], taskConfig:\[%{GREEDYDATA:taskConfig}\],\s?:%{NOTSPACE:serial}\]"
    }
  }
  kv {
    source => "customOptions"
    target => "customOptionsSplitter"
    field_split_pattern => ", "
    value_split => ":"
  }
}
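Assuming the grok matches, the kv filter puts each option under the customOptionsSplitter target. For the first message that should give something like:
"customOptionsSplitter" => {
    "AnsibleRequestedUser" => "testing1",
    "AnsibleRequestedUserPassword" => "VMware321!"
}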
Upvotes: 0
Reputation: 4072
I recommend against trying to do everything in a single filter, especially a single grok pattern. I would start by using dissect to strip off the timestamp. I save it in the [@metadata] field so that it is accessible in the logstash pipeline, but does not get processed by the output.
dissect { mapping => { "message" => "%{[@metadata][timestamp]} %{} [%{[@metadata][restOfline]}" } }
date { match => [ "[@metadata][timestamp]", "YYYY-MM-dd_HH:mm:ss.SSSSS" ] }
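If you want to see what is going into [@metadata] while testing, note that rubydebug does not show it by default; you can enable it with the codec's metadata option:
output { stdout { codec => rubydebug { metadata => true } } }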
Next I would break up [@metadata][restOfline] using grok patterns. If you only need fields from processConfig then that is the only grok pattern you need. I provide the others as an example of how to pull multiple patterns out of one message.
grok {
break_on_match => false
match => {
"[@metadata][restOfline]" => [
"customOptions:\[(?<[@metadata][customOptions]>[^\]]+)",
"processConfig:\[(?<[@metadata][processConfig]>[^\]]+)",
"processMap:\[(?<[@metadata][processMap]>[^\]]+)"
]
}
}
Now we can parse [@metadata][processConfig], which is a key/value string. Again we save the parsed values in [@metadata] and just copy the ones we want.
kv {
source => "[@metadata][processConfig]"
target => "[@metadata][processConfigValues]"
field_split_pattern => ", "
value_split => ":"
add_field => {
"username" => "%{[@metadata][processConfigValues][username]}"
"timeCategory" => "%{[@metadata][processConfigValues][timerCategory]}"
}
}
This results in events with fields like
"username" => "[email protected]",
"timeCategory" => "TEST: 2. Enterprise - Create Identity Pool"
Upvotes: 1