user2046007

Reputation: 21

Solr full-import in several smaller chunks

I'm trying to import a big MySQL database into Solr, and the import queries are quite heavy on the server (this could affect the production application that is using the database at the time). Is there a way to split the full import into several smaller chunks? I couldn't find anything on this subject either here or in Solr's documentation.

I know about the delta import feature, but I'm using it for delta imports of new/changed data.

Upvotes: 0

Views: 786

Answers (2)

Jayendra

Reputation: 52799

You could probably check batchSize:

batchSize (default: 500) – sets the maximum number of records (or rather a suggestion to the driver) retrieved from the database in one query. Changing this parameter can help in situations where queries return too many results. It may not help, since the implementation of this mechanism depends on the JDBC driver.

http://lucene.472066.n3.nabble.com/DataImportHandler-running-out-of-memory-td490797.html
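
For illustration, a minimal data-config.xml sketch (the connection details and table/field names here are placeholders). With the MySQL Connector/J driver, batchSize="-1" is the commonly used value to enable row-by-row streaming of results, which keeps memory use down on large imports:

    <dataConfig>
      <!-- batchSize is a hint to the JDBC driver for how many rows to fetch at a time.
           With the MySQL driver, batchSize="-1" maps to fetchSize=Integer.MIN_VALUE,
           which enables streaming instead of buffering the whole result set. -->
      <dataSource type="JdbcDataSource"
                  driver="com.mysql.jdbc.Driver"
                  url="jdbc:mysql://localhost:3306/mydb"
                  user="solr"
                  password="secret"
                  batchSize="-1"/>
      <document>
        <entity name="item" query="SELECT id, name FROM item"/>
      </document>
    </dataConfig>

Note that this controls how rows are fetched, not how many are imported, so it eases memory pressure rather than splitting the import itself.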

Upvotes: 0

Persimmonium

Reputation: 15789

Of course, you can add a condition like

WHERE pk<'${dataimporter.request.INDEX}'

and pass INDEX in the request params, so each time you call full-import only part of the records is indexed. Remember to use &clean=false, of course, or the index contents will be wiped out on each call. A sketch of how this fits together is shown below.
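
For example (the core, table, and field names are placeholders; note that < must be escaped as &lt; inside the XML attribute):

    <entity name="item"
            query="SELECT pk, name FROM item
                   WHERE pk &lt; '${dataimporter.request.INDEX}'"/>

Then each chunk is triggered with its own request, e.g.:

    http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=false&INDEX=100000

With a single upper bound like this, successive calls with a larger INDEX will re-read earlier rows; if you want the chunks to be disjoint, you could pass both a lower and an upper bound (say START and END request params) and use them in the WHERE clause the same way.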

Upvotes: 1
