pramod s

Reputation: 31

How to create multiple indices in Elasticsearch using fluentd (td-agent.conf)

I am setting up an EFK stack. In Kibana I want one index for application logs and one index for syslogs.

I am using fluentd for log forwarding.

syslogs --> /var/log/messages and /var/log/secure

application --> /var/log/application.log

What should the td-agent.conf look like to create these two indices? Please help.

Thank you.

Upvotes: 2

Views: 3860

Answers (1)

TommyW

Reputation: 570

If you are using the Elasticsearch output plugin and want to use Kibana, you can configure your index names by changing the logstash_prefix attribute.

Read the documentation: elasticsearch output plugin documentation

I have added the following fluent.conf file to demonstrate your use case. In this file I have two matches:

  1. "alert" - pipes all logs tagged "alert" (FluentLogger.getLogger("alert")) to the "alert" index in Elasticsearch.

  2. Default match - pipes all other logs to Elasticsearch under the "fluentd" index (the default index of this plugin).

    fluentd/conf/fluent.conf

    <source>
      @type forward
      port 24224
      bind 0.0.0.0
    </source>

    <match alert.**>
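      # Logs tagged alert.** (e.g. FluentLogger.getLogger("alert")) go to the "alert" index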
      @type copy
      <store>
        @type elasticsearch
        host elasticsearch
        port 9200
        logstash_format true
        logstash_prefix alert
        logstash_dateformat %Y%m%d
        type_name access_log
        tag_key @log_name
        flush_interval 1s
      </store>
      <store>
        @type stdout
      </store>
    </match>
    
    
    <match *.**>
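      # Catch-all: any remaining logs go to the default "fluentd" index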
      @type copy
      <store>
        @type elasticsearch
        host elasticsearch
        port 9200
        logstash_format true
        logstash_prefix fluentd
        logstash_dateformat %Y%m%d
        include_tag_key true
        type_name access_log
        tag_key @log_name
        flush_interval 1s
      </store>
      <store>
        @type stdout
      </store>
    </match>
    

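To map this to the original question (separate indices for the syslogs and the application log), here is a minimal sketch using the in_tail input plugin with Fluentd v1 / td-agent 3 syntax. The tags (syslog.**, app.**), pos_file paths and index prefixes below are my own assumptions; adjust them to your environment:

    # Hypothetical sources: tail the syslog files and the application log
    <source>
      @type tail
      path /var/log/messages,/var/log/secure
      # hypothetical position file path
      pos_file /var/log/td-agent/syslog.pos
      tag syslog.raw
      <parse>
        @type syslog
      </parse>
    </source>

    <source>
      @type tail
      path /var/log/application.log
      # hypothetical position file path
      pos_file /var/log/td-agent/application.pos
      tag app.log
      <parse>
        @type none
      </parse>
    </source>

    # Route syslogs to "syslog-YYYYMMDD" indices
    <match syslog.**>
      @type elasticsearch
      host elasticsearch
      port 9200
      logstash_format true
      logstash_prefix syslog
      logstash_dateformat %Y%m%d
    </match>

    # Route application logs to "application-YYYYMMDD" indices
    <match app.**>
      @type elasticsearch
      host elasticsearch
      port 9200
      logstash_format true
      logstash_prefix application
      logstash_dateformat %Y%m%d
    </match>

With logstash_format true, each match writes to daily indices named after its logstash_prefix, so you can create two separate index patterns (syslog-* and application-*) in Kibana.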
Upvotes: 2
