Reputation: 17650
I'm finding that a query with only one row crashes if I request an arbitrary large number of rows.
The error thrown by the server is 500 - with an Out of memory exception message.
This crashes :
http://localhost:8983/solr/myIndex1/select?rows=100000&q=*%3A*&fq=group%3Aterm1_JAYUNIT100&fq=grid%3A75&wt=json&indent=on
This does not crash :
http://localhost:8983/solr/myIndex1/select?rows=1&q=*%3A*&fq=group%3Aterm1_JAYUNIT100&fq=grid%3A75&wt=json&indent=on
This is odd to me - I don't see why Solr would use extra memory for a query that only returns one row. Is there some sort of pre-allocation of resources on the server side, based on the value of the "rows" parameter, that happens before the query is run?
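For reference, here is how I'm reproducing it from a script (a minimal sketch using Python and requests; the core name and filters are the same as in the URLs above):

```python
import requests

BASE = "http://localhost:8983/solr/myIndex1/select"
params = {
    "q": "*:*",
    "fq": ["group:term1_JAYUNIT100", "grid:75"],
    "wt": "json",
    "indent": "on",
}

# rows=1 comes back fine...
ok = requests.get(BASE, params={**params, "rows": 1})
print(ok.status_code, len(ok.json()["response"]["docs"]))

# ...while rows=100000 triggers the 500 / OutOfMemory response on my setup.
crash = requests.get(BASE, params={**params, "rows": 100000})
print(crash.status_code)
```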
Upvotes: 1
Views: 826
Reputation: 32392
Solr caches the results of queries. In this case the underlying result set (everything matched by q=*:*) is very large, even though your filters narrow it down and only one row comes back.
First of all, Solr needs RAM. It is an in-RAM index, after all. Everything that makes Solr fast takes up RAM, so please do not starve a Solr server.
Secondly, your actual query is wasteful. There is no point in saying "select all records from the database, build a bitmap index, and then filter that set to select only the ones with certain field values". If your query sounds like this in natural language:
Records where XField is like so, AND YField is like that, AND ZField meets this condition
Then the right way to do it in Solr is:
q=XField:so&fq=YField:that%20AND%20ZField:this
In fact, if you are sure that there are x records with XField:so, 3x records with YField:that, and 0.07x records with ZField:this, then rearrange your AND expression and put ZField in the q= part.
The q= part defines the result set. After getting all the records in the result set, Solr applies bitmap index techniques to quickly narrow down the results using set operations. So when you can, make the q= part return fewer records for fq= to operate on.
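As a rough sketch (XField/YField/ZField are just the placeholders from above, and the core name is taken from the question), the restructured query could be sent like this:

```python
import requests

SOLR = "http://localhost:8983/solr/myIndex1/select"
params = {
    "q": "ZField:this",                  # most selective condition defines the result set
    "fq": ["XField:so", "YField:that"],  # filters applied as cached set operations
    "rows": 10,                          # only ask for as many rows as you will actually use
    "wt": "json",
}

resp = requests.get(SOLR, params=params)
resp.raise_for_status()
print(resp.json()["response"]["numFound"])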
Upvotes: 1