magnoz

Reputation: 1995

Elasticsearch + Logstash: How to add a fields based on existing data at importing time

Currently, I'm importing data into Elastic through logstash, at this time by reading csv files.

Now let's say I have two numeric fields in the csv, age and weight. I need to add a 3rd field on the fly, by doing some math on the age, the weight, and other external data (or a function result); and I need that 3rd field to be created when importing the data.

Is there any way to do this? What would be the best practice?

Upvotes: 0

Views: 1524

Answers (1)

OneCricketeer

Reputation: 192033

In all Logstash filter sections, you can add fields via add_field, but that is typically for static data.
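For example, a minimal sketch of add_field in a mutate filter (the field name and value here are illustrative). Note that the %{...} syntax only interpolates existing fields; it cannot do arithmetic:

    filter {
        mutate {
            # adds a constant field to every event
            add_field => { "source_type" => "csv_import" }
        }
    }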

Math calculations need a separate plugin.

As mentioned there, the ruby filter plugin would probably be your best option. Here is an example snippet for your pipeline:

filter {
    # add a calculated field, for example BMI, from height and weight;
    # Logstash 5+ requires the event.get / event.set API, and to_f
    # avoids integer truncation in the division
    ruby {
        code => "event.set('bmi', event.get('weight').to_f / event.get('height').to_f)"
    }
}
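The same calculation can be checked outside Logstash in plain Ruby, using a small stand-in for the event object (the FakeEvent class below is hypothetical, purely for illustration; it only mimics the get/set methods the filter uses):

```ruby
# Minimal stand-in for the Logstash event API (get/set), for illustration only
class FakeEvent
  def initialize(fields)
    @fields = fields
  end

  def get(key)
    @fields[key]
  end

  def set(key, value)
    @fields[key] = value
  end
end

event = FakeEvent.new('weight' => 70, 'height' => 175)

# same expression as in the ruby filter's code => "..." string
event.set('bmi', event.get('weight').to_f / event.get('height').to_f)

puts event.get('bmi')  # 0.4
```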

Alternatively, Kibana has scripted fields, which are meant to be visualized but cannot be queried.
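For reference, a Kibana scripted field doing the same calculation would be a short Painless expression along these lines (assuming numeric weight and height fields exist in the index mapping):

    doc['weight'].value / doc['height'].value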

Upvotes: 1