wurzel

Reputation: 43

fluent-bit to Loki - "log" field is not being parsed/filtered

I have:

  1. a simple Python app ("iss-web") writing JSON log output to stdout
  2. the Python app ("iss-web") is within a Docker Container
  3. the Python app ("iss-web") Container logging driver is set to "fluentd"
  4. a separate Container running "fluent/fluent-bit:1.7" to collect the Python app JSON log output
  5. Loki 2.2.1 deployed via a Container to receive the Python app log output from fluent-bit
  6. Grafana connected to Loki to visualize the log data

The issue is that fluent-bit does not parse the "log" field, so in Loki/Grafana its JSON content never appears under "Detected fields".
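In other words, the goal is for each line the app prints to reach Loki as plain JSON so that Grafana can break it into detected fields, roughly like this one-line illustration (field names taken from the log extract further down, not actual output):

{"timestamp": "2021:05:10 11:00:20.439513", "pid": 1, "level": "DEBUG", "message": "/ping"}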

"iss-web" docker-compose.yml

version: '3'
services:
    iss-web:
        build: ./iss-web
        image: iss-web
        container_name: iss-web
        env_file:
            - ./iss-web/app.env
        ports:
            - 46664:46664
        logging:
            driver: fluentd
            options:
                tag: iss.web
    redis:
        image: redis
        container_name: redis
        ports:
            - 6379:6379
        logging:
            driver: "json-file"
            options:
                max-file: ${LOG_EXPIRE}
                max-size: ${LOG_SEGMENT}

"fluent-bit" docker-compose.yml

version: '3'
services:
  fluent-bit:
    image: fluent/fluent-bit:1.7
    container_name: fluent-bit
    environment:
      - LOKI_URL=http://135.86.186.75:3100/loki/api/v1/push
    user: root
    volumes:
      - ./fluent-bit.conf:/fluent-bit/etc/fluent-bit.conf
      - ./parsers.conf:/fluent-bit/etc/parsers.conf
    ports:
      - "24224:24224"
      - "24224:24224/udp"

fluent-bit.conf

[SERVICE]
    Flush 1
    Daemon Off
    log_level debug
    Parsers_File /fluent-bit/etc/parsers.conf

[INPUT]
    Name forward
    Listen 0.0.0.0
    port 24224

[FILTER]
    Name parser
    Match iss.web
    Key_Name log
    Parser docker
    Reserve_Data On
    Preserve_Key On

[OUTPUT]
    Name loki
    Match *
    host 135.86.186.75
    port 3100
    labels job=fluentbit

[OUTPUT]
    Name stdout
    Match *

parsers.conf

I've tried this both with and without Time_Key, Time_Format, and Time_Keep:

[PARSER]
    Name         docker
    Format       json
    #Time_Key     time
    #Time_Format  %Y-%m-%dT%H:%M:%S.%L
    #Time_Keep    On
    # Command      |  Decoder | Field | Optional Action
    # =============|==================|=================
    #Decode_Field_As   escaped_utf8    log    do_next
    Decode_Field_As   json       log

fluent-bit log extract

[0] iss.web: [1620640820.000000000, {"log"=>"{'timestamp': '2021:05:10 11:00:20.439513', 'epoch': 1620640820.4395688, 'pid': 1, 'level': 'DEBUG', 'message': '/ping', 'data': {'message': 'PONG', 'timestamp': '1620640820.4394963', 'version': '0.1'}}", "container_id"=>"bffd720e9ac1e8c3992c1120eed37e00c536cd44ec99e9c13cf690d840363f80", "container_name"=>"/iss-web", "source"=>"stdout"}]
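Note the single quotes inside the "log" value above: the app is printing a Python dict repr rather than JSON, so a JSON decode of that string fails. A quick standalone check (just an illustration with a trimmed copy of that value, not part of the app):

import json

# trimmed copy of the "log" value as fluent-bit receives it (single-quoted dict repr)
raw = "{'pid': 1, 'level': 'DEBUG', 'message': '/ping'}"

try:
    json.loads(raw)  # the same decode the fluent-bit json parser attempts
except json.JSONDecodeError as err:
    print("not valid JSON:", err)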

Grafana/Loki screen

I would expect "Detected fields" to contain pid=1, message=/ping, etc.

Upvotes: 2

Views: 1868

Answers (1)

wurzel

Reputation: 43

The "log" value in the extract above is a Python dict repr (single-quoted), not valid JSON, so the fluent-bit json parser could not decode it. I needed a "json.dumps" in my "logger":

import json

def log(message, level="INFO", **extra):
    out = {"timestamp": get_now(), "epoch": get_epoch(), "pid": get_pid(), "level": level, "message": message}
    if extra:
        out |= extra  # merge any extra fields (Python 3.9+)
    # json.dumps() emits double-quoted JSON that the fluent-bit json parser can decode
    print(json.dumps(out), flush=True)
    return True
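For comparison, printing the dict directly writes the single-quoted repr seen in the fluent-bit extract, while json.dumps writes double-quoted JSON that can be decoded again, which is what the Decode_Field_As json step needs. A minimal sketch with a trimmed record, independent of the app:

import json

record = {"pid": 1, "level": "DEBUG", "message": "/ping"}

print(record)               # {'pid': 1, 'level': 'DEBUG', 'message': '/ping'}  <- dict repr, not JSON
print(json.dumps(record))   # {"pid": 1, "level": "DEBUG", "message": "/ping"}  <- valid JSON
print(json.loads(json.dumps(record)) == record)  # True: the serialized line round-trips as JSON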

Upvotes: 1
