Reputation: 129
I have a 20-million-row SQL dump from PostgreSQL that I want to move into Elasticsearch, so I use Logstash with this statement: statement => "select * from students"
But I always get an OutOfMemory error. I have 16 GB of RAM and I raised the Logstash and Elasticsearch Xmx to 12 GB, but the error still occurs. I think the select statement is the cause: it puts a huge load on memory. What should I do? Thanks for any help.
Upvotes: 0
Views: 2179
Reputation: 4032
These threads may help: enter link description here and enter link description here
To summarize, as mentioned by Gaurav above, when this kind of issue occurs you have to add the parameters below in the jdbc section of your config file:
jdbc {
  jdbc_paging_enabled => true
  jdbc_page_size => 200000
}
Upvotes: 1
Reputation: 1197
All you need to do is configure these parameters (jdbc_paging_enabled and jdbc_page_size) to page the results of your SQL query instead of loading them all at once.
This internally uses the SQL LIMIT clause.
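For context, a minimal Logstash pipeline sketch with paging enabled might look like the following. The driver path, connection string, credentials, hosts, and index name are placeholder assumptions, not values from the question; adjust them for your environment.

```
input {
  jdbc {
    # Placeholder connection details -- replace with your own
    jdbc_driver_library => "/path/to/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "select * from students"
    # Fetch the result set page by page instead of all 20M rows at once
    jdbc_paging_enabled => true
    jdbc_page_size => 200000
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "students"
  }
}
```

With paging enabled, Logstash breaks the query into multiple queries using LIMIT/OFFSET, so memory usage is bounded by the page size rather than the full result set.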
Upvotes: 3