Reputation: 3183
I wanted to check the following with you:
I have this kind of YAML manifest, it is a pod that is triggered when events on my app take place:
apiVersion: v1
kind: Pod
metadata:
  creationTimestamp: "2021-11-10T12:13:46Z"
  generateName: job-1bb229b1-aa15-4349-b676-ad9cf840b44a-
  labels:
    app: CalculationPod
    calculationId: "1467"
    controller-uid: 75d707d6-aab6-446d-9726-7ffafde29191
    job-name: job-1bb229b1-aa15-4349-b676-ad9cf840b44a
  name: job-1bb229b1-aa15-4349-b676-ad9cf840b44a-h4pj
I am using fluent-bit (from the Loki stack) to collect logs in my k8s cluster.
I want to index the calculationId: "1467" label I have on the pod, so that it shows up in Grafana Loki the same way app: CalculationPod does right now in the picture:
This is the Loki output plugin configuration on the fluent-bit ConfigMap side. I am trying to get the value of the calculationId label and pass it to the set of labels in the Labels parameter, in this way:
[Output]
    Name                  grafana-loki
    Match                 *
    Url                   http://loki:3100/loki/api/v1/push
    TenantID              ""
    BatchWait             1
    BatchSize             1048576
    Labels                {job="fluent-bit",calculationId="$calculationId"}   # SEE HERE
    RemoveKeys            kubernetes,stream
    AutoKubernetesLabels  false
    LabelMapPath          /fluent-bit/etc/labelmap.json
    LineFormat            json
    LogLevel              warn
But nothing has changed in my Loki UI. I am not sure whether this is the correct way to add new labels to be indexed; I am checking this: https://docs.fluentbit.io/manual/pipeline/outputs/loki#labels
Any help will be much appreciated.
Upvotes: 1
Views: 8289
Reputation: 3183
I managed to get the calculationId label and its value by adding it to the Kubernetes labels JSON that is referenced via LabelMapPath and populated by the kubernetes filter. You can see more about this here.
So the entire configmap/loki-fluent-bit-loki configuration file is this:
apiVersion: v1
data:
  fluent-bit.conf: |-
    [SERVICE]
        HTTP_Server    On
        HTTP_Listen    0.0.0.0
        HTTP_PORT      2020
        Flush          1
        Daemon         Off
        Log_Level      warn
        Parsers_File   parsers.conf
    [INPUT]
        Name           tail
        Tag            kube.*
        Path           /var/log/containers/*.log
        Parser         docker
        DB             /run/fluent-bit/flb_kube.db
        Mem_Buf_Limit  1000MB
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc:443
        Merge_Log           On
        K8S-Logging.Exclude Off
        K8S-Logging.Parser  Off
    [Output]
        Name                  grafana-loki
        Match                 *
        Url                   http://loki:3100/loki/api/v1/push
        TenantID              ""
        BatchWait             1
        BatchSize             1048576
        Labels                {job="fluent-bit"}
        RemoveKeys            kubernetes,stream
        AutoKubernetesLabels  false
        LabelMapPath          /fluent-bit/etc/labelmap.json
        LineFormat            json
        LogLevel              warn
  labelmap.json: |-
    {
      "kubernetes": {
        "container_name": "container",
        "host": "node",
        "labels": {
          "app": "app",
          "release": "release",
          "calculationId": "calculationId"   # IT WAS ADDED HERE
        },
        "namespace_name": "namespace",
        "pod_name": "instance"
      },
      "stream": "stream"
    }
  parsers.conf: |-
    [PARSER]
        Name         docker
        Format       json
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L
kind: ConfigMap
metadata:
  annotations:
    meta.helm.sh/release-name: loki
    meta.helm.sh/release-namespace: monitoring
  creationTimestamp: "2021-10-26T10:23:32Z"
  labels:
    app: fluent-bit-loki
    app.kubernetes.io/managed-by: Helm
    chart: fluent-bit-2.3.0
    heritage: Helm
    release: loki
  name: loki-fluent-bit-loki
  namespace: monitoring
Then I got the calculationId label indexed:
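For reference, a Loki stream selector along these lines in Grafana Explore should now match the logs by the new label (using the example value from the pod manifest above):

{job="fluent-bit", calculationId="1467"}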
I didn't try it, but according to the way the Loki fluent-bit output plugin works, the LabelKeys parameter should also allow me to add a custom label like my calculationId and use it to select its log streams.
The Grafana Loki documentation says:
LabelKeys: Comma separated list of keys to use as stream labels. All other keys will be placed into the log line. LabelKeys is deactivated when using LabelMapPath label mapping configuration.
So according to this, LabelKeys should work as well, and if I used it I wouldn't need the LabelMapPath parameter to reference the desired labels in JSON.
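A minimal sketch of what that alternative could look like (untested; it assumes calculationId is exposed as a record key the plugin can see, and it drops LabelMapPath because, per the documentation quoted above, LabelKeys is deactivated when LabelMapPath is used):

[Output]
    Name                  grafana-loki
    Match                 *
    Url                   http://loki:3100/loki/api/v1/push
    TenantID              ""
    BatchWait             1
    BatchSize             1048576
    Labels                {job="fluent-bit"}
    RemoveKeys            kubernetes,stream
    AutoKubernetesLabels  false
    LabelKeys             calculationId        # instead of LabelMapPath (untested)
    LineFormat            json
    LogLevel              warn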
IMPORTANT
I am using here the fluent-bit collector agent that comes with the Loki stack, i.e. this installation approach. It is worth highlighting that this is not the official fluent-bit collector; the official one is this. Right now there is a request to deprecate the fluent-bit Grafana Loki stack in favor of the official one, so it is worth keeping that in mind and considering a move to the official collector.
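As a rough, untested sketch of how the same label could be attached with the official fluent-bit loki output: its labels parameter accepts record accessors according to the Fluent Bit docs linked in the question, and the key path under kubernetes['labels'] is my assumption based on how the kubernetes filter enriches the record:

[OUTPUT]
    name         loki
    match        *
    host         loki
    port         3100
    labels       job=fluent-bit, $kubernetes['labels']['calculationId']
    line_format  json

As far as I can tell from those docs, when a record accessor is used the last key in the path (calculationId here) becomes the label name.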
Upvotes: 4