Fatemeh Moh

Reputation: 129

Logstash OutOfMemory error when importing 20 million rows from Postgres to Elasticsearch

I have a 20 million row SQL dump from PostgreSQL that I want to move to Elasticsearch, so I use Logstash with this statement: statement => "select * from students". But I always get an OutOfMemory error. I have 16 GB of RAM and I raised the Logstash and Elasticsearch -Xmx to 12 GB, but the error still occurs. I think it is because of the select statement: it pulls the whole result set into memory at once. What should I do? Thanks for any help.
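For reference, a minimal sketch of the kind of pipeline config described above; the connection string, credentials, driver class, and index name are illustrative assumptions, not taken from the question:

input {
  jdbc {
    # connection details below are assumptions for illustration
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/school"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    jdbc_driver_class => "org.postgresql.Driver"
    # without paging, this pulls all 20 million rows in one result set
    statement => "select * from students"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "students"
  }
}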

Upvotes: 0

Views: 2179

Answers (2)

onlyme

Reputation: 4032

To summarize, as gaurav9620 mentions in the other answer, when this kind of issue occurs you need to add the following parameters to the jdbc section of your config file:

jdbc {
   jdbc_paging_enabled => true
   jdbc_page_size => 200000
}

Upvotes: 1

gaurav9620
gaurav9620

Reputation: 1197

All you need to do is configure the following parameters, which page the results of your SQL query:

  • jdbc_page_size
  • jdbc_paging_enabled

This will internally use the SQL LIMIT clause; a sketch of the effective paged query follows.
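For illustration, with jdbc_paging_enabled the plugin effectively wraps your statement in a paged query along these lines (the exact SQL it generates may differ by plugin version; the inner query is the one from the question):

-- first page with jdbc_page_size => 200000; later pages increase the offset
SELECT * FROM (select * from students) AS t1 LIMIT 200000 OFFSET 0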

Upvotes: 3
