Reputation: 15107
I have a search that takes over 80 seconds in production, but only about a second on my local machine (the data is almost identical). I am not sure what steps to take next.
The query looks like:
@params = { search_field: 'kamil' }
User.only('username', 'email', 'name', 'role', 'utm_email')
    .or({ email: /#{@params[:search_field]}/i },
        { username: /#{@params[:search_field]}/i },
        { name: /#{@params[:search_field]}/i })
    .explain()
My dev output of explain():
https://gist.github.com/kamilski81/827f9b363b0392cc87d9
My production output of explain():
https://gist.github.com/kamilski81/c0b07b838ddfbdb97c61
I am not really sure where to take it from here, as the two outputs look very similar to me. Dev is running MongoDB 2.4.9 and production is running MongoDB 2.4.10.
Upvotes: 2
Views: 484
Reputation: 116518
You can use the profiler. It writes a document to the system.profile collection for every operation that exceeds a threshold (100 ms by default).
Turn it on from the shell, setting the threshold to, say, 60 seconds:
db.setProfilingLevel(1,60000)
Execute the slow operation, then find the corresponding document in that collection: it shows which index was used, how long the operation waited on locks, how many documents MongoDB scanned, and so forth.
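To pull out what the profiler captured, you can query system.profile directly in the shell. A minimal sketch; the collection and field names are the standard profiler ones, and the 60000 ms filter is just the threshold set above:

```
// Show the slowest recent operations the profiler recorded.
// system.profile lives in the database being profiled.
db.system.profile.find({ millis: { $gt: 60000 } })
                 .sort({ ts: -1 })
                 .limit(5)
                 .pretty()

// Fields worth checking in each document (2.4.x):
//   millis    - total execution time in milliseconds
//   nscanned  - index entries / documents scanned
//   ns        - the namespace (db.collection) of the operation
//   lockStats - time spent acquiring and holding locks
```

If nscanned is close to the collection size, the regex query is doing a full scan rather than using an index.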
More about the profiler is in the MongoDB documentation on database profiling.
Upvotes: 2