hadi

Reputation: 229

Out-of-memory exception when indexing files with SolrJ

I wrote a simple program with SolrJ that indexes files, but after about a minute it crashes with java.lang.OutOfMemoryError: Java heap space.

I use Eclipse and my machine has about 2 GB of memory. I set -Xms1024M -Xmx2048M as the VM arguments for both Tomcat and my application's Debug Configuration, uncommented maxBufferedDocs in solrconfig.xml and set it to 100, then ran the application again, but it still crashes as soon as it reaches files larger than 500 MB.
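For reference, that setting lives in solrconfig.xml and, once uncommented, looks something like this (its placement under <indexDefaults> matches older Solr configs and may differ in your version):

<indexDefaults>
  <maxBufferedDocs>100</maxBufferedDocs>
</indexDefaults>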

Is there any configuration for indexing large files with SolrJ? The details of my SolrJ code are below:

import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest.ACTION;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

// SolrJ client pointing at the "file" core
String urlString = "http://localhost:8983/solr/file";
CommonsHttpSolrServer solr = new CommonsHttpSolrServer(urlString);

// Send the file to the extracting request handler (Solr Cell)
ContentStreamUpdateRequest req = new ContentStreamUpdateRequest("/update/extract");

req.addFile(file);
req.setParam("literal.id", file.getAbsolutePath());
req.setParam("literal.name", file.getName());
req.setAction(ACTION.COMMIT, true, true);

solr.request(req);

Upvotes: 0

Views: 925

Answers (2)

karthik

Reputation: 193

Is Solr running on the same machine as your SolrJ client? There may be memory constraints on the machine running Solr. How much free memory do you have once Solr is started? You will probably need more memory available on that box.

Try committing after every document and see if that works around the problem temporarily.
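A minimal sketch of that workaround, assuming the same SolrJ client and /update/extract handler as in the question (the core URL is taken from there; the list of file paths is a placeholder):

import java.io.File;

import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

public class CommitPerFile {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer solr =
                new CommonsHttpSolrServer("http://localhost:8983/solr/file");

        // args holds the paths of the files to index
        for (String path : args) {
            File file = new File(path);

            ContentStreamUpdateRequest req =
                    new ContentStreamUpdateRequest("/update/extract");
            req.addFile(file);
            req.setParam("literal.id", file.getAbsolutePath());
            req.setParam("literal.name", file.getName());
            solr.request(req);

            // Commit after every document instead of batching, so the
            // server flushes buffered documents before the next file.
            solr.commit();
        }
    }
}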

Upvotes: 0

Jayendra

Reputation: 52809

Are you also setting the heap size parameters when running the Java class in Eclipse?

Run -> Run Configurations -> <Class Name> -> Arguments -> VM arguments
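For example, the question's own values would go in that VM arguments box (adjust to the memory actually available on your machine):

-Xms1024M -Xmx2048M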

Upvotes: 1
