Reputation: 12853
I have a collection of places with more than 400,000 documents. I am trying to do geospatial queries, but they always seem to time out. From the MongoLab interface I run this search:
{ "location": {"$near": [ 38, -122 ] } }
And the page just times out.
I also ran this command through my console:
db.runCommand({geoNear: "places", near: [50,50], num:10})
It did succeed, but took something like 5 minutes to complete.
I do have a Geospatial Index on location.
{ "location" : "2d" }
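For reference, that kind of index would have been created with something like the following (ensureIndex was the shell helper at the time; createIndex is the modern equivalent):

db.places.ensureIndex({ location: "2d" })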
Is it just impossible to do geospatial queries on such a big collection (which is quite small for a MongoDB collection, after all)?
EDIT: MongoLab personally contacted me regarding this problem. It seems there are some issues with my db, such as many places not having any coordinates yet. I also discovered that using maxDistance accelerates the queries dramatically, which brings me back to this morning's question here: so question
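A minimal sketch of the bounded query that helped, assuming a legacy 2d index where $maxDistance is expressed in degrees; the 0.5 radius is just an illustrative value, not from my data:

// Bound the search radius so the scan can stop early instead of walking the whole index
db.places.find({ location: { $near: [38, -122], $maxDistance: 0.5 } }).limit(10)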
Upvotes: 3
Views: 412
Reputation: 12853
MongoLab's techs have pointed out to me that having a lot of longitude/latitude pairs set to 0,0 and NOT using maxDistance was what was slowing things down. Adding maxDistance worked like a charm.
So thanks again to the guys at MongoLab.
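For completeness, a sketch of the bounded form of the command from the question, assuming a legacy 2d index (maxDistance is in degrees there; 0.5 is just an example value):

// Same geoNear command as in the question, with an explicit search radius added
db.runCommand({ geoNear: "places", near: [50, 50], num: 10, maxDistance: 0.5 })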
Upvotes: 3