Reputation: 1309
I have a PostgreSQL database. The table I need to index contains about 20 million rows. When I try to index them all in one attempt (something like "select * from table_name"), I get a Java OutOfMemory error, even if I give the JVM more memory.
Is there any option in Solr to index the table part by part (e.g. execute the SQL for the first 1,000,000 rows, index them, then execute the SQL for the second million, and so on)?
Right now I am using an SQL query with LIMIT, but every time Solr finishes indexing a chunk, I have to start it again manually.
UPDATE: OK, 1.4 is out now. No more OutOfMemory exceptions; it seems Apache has done a lot of work on DIH. Also, we can now pass parameters through the request and use them in our SQL selects. Wow!
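For anyone hitting the same problem, a rough sketch of what this looks like (the table, column, and parameter names here are just placeholders): the entity query in data-config.xml can reference request parameters via ${dataimporter.request.*}, and the values are supplied on the import URL.

```xml
<!-- data-config.xml: hypothetical entity that pages through the table
     using parameters passed on the dataimport request -->
<entity name="item"
        query="SELECT * FROM table_name
               ORDER BY id
               LIMIT ${dataimporter.request.limit}
               OFFSET ${dataimporter.request.offset}"/>
```

Each chunk would then be triggered with something like /solr/dataimport?command=full-import&clean=false&limit=1000000&offset=1000000, where clean=false keeps the previously indexed chunks from being deleted.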
Upvotes: 0
Views: 1456
Reputation: 106
See the bit about "cursors" here; that might well help.
http://jdbc.postgresql.org/documentation/83/query.html
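A rough sketch of that from plain JDBC (connection details and table name are placeholders): turn off autocommit and set a fetch size, and the PostgreSQL driver will read the result set through a cursor in small batches instead of pulling all 20 million rows into memory at once.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CursorRead {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");
        // Cursors are only used when autocommit is off and a fetch size is set.
        conn.setAutoCommit(false);
        Statement st = conn.createStatement();
        st.setFetchSize(1000); // fetch 1000 rows at a time
        ResultSet rs = st.executeQuery("SELECT * FROM table_name");
        while (rs.next()) {
            // hand each row (or a batch of rows) off to the indexer here
        }
        rs.close();
        st.close();
        conn.close();
    }
}
```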
Upvotes: 1
Reputation: 1573
Have you looked at using SolrJ as a client? While DIH is great, the tight coupling between Solr and your database means it can be hard to manipulate your data and work around issues.
With a SolrJ client, you could iterate over your database in batches that you control, then turn around and dump them directly into Solr. Also, using SolrJ's binary javabin stream format instead of XML means that indexing your 20 million rows should go fairly quickly.
DIH is great, until you run into issues like this!
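A minimal sketch of that approach, assuming Solr 1.4-era SolrJ (CommonsHttpSolrServer with the BinaryRequestWriter) and placeholder connection details and field names; it streams rows with a JDBC cursor and posts documents to Solr in fixed-size batches:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.impl.BinaryRequestWriter;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class BatchIndexer {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer solr =
                new CommonsHttpSolrServer("http://localhost:8983/solr");
        solr.setRequestWriter(new BinaryRequestWriter()); // javabin instead of XML

        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");
        conn.setAutoCommit(false);
        Statement st = conn.createStatement();
        st.setFetchSize(1000); // stream rows with a cursor, not all at once
        ResultSet rs = st.executeQuery("SELECT id, title FROM table_name");

        List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
        while (rs.next()) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", rs.getString("id"));
            doc.addField("title", rs.getString("title"));
            batch.add(doc);
            if (batch.size() == 1000) { // send every 1000 documents
                solr.add(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            solr.add(batch);
        }
        solr.commit();
        rs.close();
        st.close();
        conn.close();
    }
}
```

The batch size of 1000 is arbitrary; tune it to whatever keeps both the JVM heap and the indexing throughput comfortable.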
Upvotes: 0
Reputation: 99720
Do you have autoCommit and batchSize configured? If you do, it might be this bug; try updating to trunk.
Upvotes: 0