Reputation: 29
I'm trying to use fluentd to centralize MySQL logs into Elasticsearch.
The issue I'm seeing is that the timestamp in the search results does not match the timestamp in the log.
Here's my fluentd config:
<source>
@type tail
path /logs/mysql/audit.log
pos_file /var/log/td-agent/audit.log.pos
tag mysql.audit
format json
</source>
<match mysql.audit>
type elasticsearch
host isolinear-eg.corp.apple.com
port 9200
index_name mysql_audit
include_tag_key true
logstash_format true
logstash_prefix mysql_audit
time_key audit_record.timestamp
time_format %Y-%m-%dT%H:%M:%S %Z
flush_interval 10s # for testing
</match>
And here's the output:
{
"_index": "mysql_audit-2017.04.18",
"_type": "fluentd",
"_id": "AVuBrutMy6H0rNsJZZHy",
"_score": null,
"_source": {
"audit_record": {
"name": "Connect",
"record": "447474053_2017-04-11T22:30:21",
"timestamp": "2017-04-18T15:29:01 UTC",
"connection_id": "21450",
"status": 0,
"user": "solver",
"priv_user": "solver",
"os_login": "",
"proxy_user": "",
"host": "",
"ip": "10.108.251.201",
"db": "solver"
},
"tag": "mysql.audit",
"@timestamp": "2017-04-18T10:29:02-05:00"
},
"fields": {
"@timestamp": [
1492529342000
]
},
"sort": [
1492529342000
]
}
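Converting both timestamps to a common timezone makes the mismatch easier to see (a quick check using the two values from the output above):

```python
from datetime import datetime, timezone, timedelta

# audit_record.timestamp, written in UTC
audit = datetime(2017, 4, 18, 15, 29, 1, tzinfo=timezone.utc)
# @timestamp as stored by fluentd, in the -05:00 offset
stored = datetime(2017, 4, 18, 10, 29, 2, tzinfo=timezone(timedelta(hours=-5)))

# the stored @timestamp rendered in UTC
print(stored.astimezone(timezone.utc).isoformat())  # 2017-04-18T15:29:02+00:00
# difference in seconds between the two
print((stored - audit).total_seconds())             # 1.0
```

So `@timestamp` is one second later than the audit record's own timestamp, which suggests fluentd stamped the event with the time it read the line rather than the time inside the record.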
As shown above, the generated @timestamp does not match the audit record's own timestamp.
Upvotes: 2
Views: 2859
Reputation: 9
For example, a generated log may carry a time key of its own. In that case, we need to provide the time_key parameter in the source section:
<source>
@type tail
path /logs/mysql/audit.log
pos_file /var/log/td-agent/audit.log.pos
time_key time
tag mysql.audit
format json
</source>
If the time_key is not present in the log, fluentd attaches its own read time to the event.
You didn't specify time_key in the source section; that's why fluentd is not using the actual log time.
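Applied to the audit log from the question, a sketch might look like the following. Note the caveat: the timestamp field is nested under audit_record, and whether the json parser's time_key resolves a nested key like this depends on the fluentd version, so treat this as an assumption to verify rather than a drop-in fix.

```
<source>
  @type tail
  path /logs/mysql/audit.log
  pos_file /var/log/td-agent/audit.log.pos
  tag mysql.audit
  format json
  # assumption: time_key names the field holding the event time;
  # here it is nested under audit_record, which the json parser
  # may not resolve on older fluentd versions
  time_key audit_record.timestamp
  time_format %Y-%m-%dT%H:%M:%S %Z
</source>
```

If the nested key is not resolved, one option is to promote the timestamp to a top-level field (for example with a filter) before parsing the time.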
Upvotes: 1