Reputation: 21
We have an EKS cluster and an ELK stack to monitor it. The logs from the cluster are shipped via Fluent Bit. The requirement is to split the log message into separate fields in Kibana so that it becomes easier to filter and to build dashboards. Supposing the logs are formatted as JSON, could someone suggest a way to turn the key-value pairs in the JSON into separate fields in Kibana?
Upvotes: 0
Views: 2743
Reputation: 10346
There are multiple ways to go about it. This is not an exhaustive list, but the approach below (an ingest pipeline with the json processor) is the one I know best.
Create a pipeline
PUT _ingest/pipeline/pipeline_json_parser
{
  "description" : "json parser pipeline",
  "processors" : [
    {
      "json" : {
        "field" : "string_source",
        "target_field" : "json_target"
      }
    }
  ]
}
Test the pipeline
POST /_ingest/pipeline/pipeline_json_parser/_simulate
{
  "docs": [
    {
      "_index": "index",
      "_id": "id",
      "_source": {
        "string_source": "{\"a\":\"b\"}"
      }
    }
  ]
}
Set the pipeline as default on an index
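A sketch of that step, using Elasticsearch's index.default_pipeline setting (my-index is a placeholder; use your own index or index template name):

PUT my-index/_settings
{
  "index.default_pipeline": "pipeline_json_parser"
}

Every document indexed into that index then runs through the pipeline automatically, and the parsed keys appear as separate fields; refresh the index pattern in Kibana so it picks up the new fields.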
And you are all set.
Upvotes: 1