Sean Lindo

Reputation: 1437

Logstash JSON filter does not appear to be working with JDBC input

I'm trying to index documents from a Postgres view into Elasticsearch via Logstash. The statement part of my config file looks like this:

select search_document FROM my_view;

The next line in my config file is where I try to extract the value from the database call:

filter {
 json {
   source => "[search_document][value]"
  }
}

This should select the value of the column and then pipe it straight into Elasticsearch (or, in this testing phase, stdout); however, it doesn't work correctly. I've tried several variations on the property access above, and it always inserts documents with "search_document" as a key and the rest of the JSON document as its value.

Is there something I'm doing wrong?

Edit: I've updated my view to perform a select search_document::text FROM my_view; based on another answer I found. The original type of the column is JSONB, and the data in a row looks similar to this:

{ "value" : { "key_1": "hello", "key_2": "world" } }

Upvotes: 1

Views: 528

Answers (1)

Sofia Braun

Reputation: 194

You can try specifying only the field you want to get, like this:

filter {
 json {
   source => "value"
  }
}

The filter runs on every event (each row returned by the JDBC input becomes its own event), so you shouldn't include the search_document prefix in the source path.
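
If it's still unclear which fields each event actually carries, a generic way to debug is to dump the events with the rubydebug codec, for example:

output {
  # prints every event with all of its fields to the console
  stdout { codec => rubydebug }
}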

Upvotes: 1
