Reputation: 34513
Assuming we're interpreting this New Relic trace correctly, it seems that 4.75s is spent converting large arrays to JSON strings. Is this possible? We're not using Ruby Enterprise Edition but Ruby 1.9.3.
We're using Rails 3.2.12.
The arrays each contain about 1 MB of data and 900 objects.
Our interpretation stems from the fact that the last Mongo query is the second-to-last line in the method, so the red block in the trace represents the last line, which handles rendering and converting the arrays to JSON.
Method: https://gist.github.com/panabee/39401fae827ee6aca8bd
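For reference, a quick standalone check along these lines (a sketch with stand-in data, not our real Mongo documents) would tell us whether to_json on ~900 objects of this size can plausibly take seconds:

    require 'benchmark'
    require 'json'

    # Stand-in data: ~900 hashes that serialize to roughly 1 MB in total.
    # Our real documents come from Mongo, so actual timings will differ.
    sample = Array.new(900) do |i|
      { id: i, name: "item-#{i}", payload: "x" * 1_000, tags: %w[a b c] }
    end

    puts Benchmark.measure { sample.to_json }

In a Rails console the same call goes through ActiveSupport's serialization, which is typically slower than the plain JSON gem, so it's worth timing it from there as well.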
Upvotes: 0
Views: 140
Reputation: 179
To figure out what's happening, you'll need to manually instrument your code using a recursive "divide and conquer" approach.
Start by instrumenting the middle of the method, look at the traces again to determine whether the time is spent above or below that point, and keep cutting the search space in half until you isolate the slow section.
You'll need to add the API calls to your application code and use method tracers.
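For instance, a scoped trace around each half of the method shows up as its own segment in the transaction trace. Here is a sketch (WidgetsController, Widget, and the metric names are placeholders, not your actual gist code):

    class WidgetsController < ApplicationController
      include ::NewRelic::Agent::MethodTracer

      def index
        widgets = nil
        json = nil

        # First half of the method: the Mongo query.
        self.class.trace_execution_scoped(['Custom/WidgetsController/mongo_query']) do
          widgets = Widget.all.to_a
        end

        # Second half: serialization and rendering.
        self.class.trace_execution_scoped(['Custom/WidgetsController/to_json']) do
          json = widgets.to_json
        end

        render json: json
      end
    end

Whichever custom segment dominates tells you which half to split again.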
https://newrelic.com/docs/ruby/ruby-custom-metric-collection#method_tracers
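If the work is already factored into helper methods, add_method_tracer gives the same visibility with less code (again a sketch with placeholder names):

    class WidgetsController < ApplicationController
      include ::NewRelic::Agent::MethodTracer

      def build_payload(widgets)
        widgets.to_json
      end

      # Every call to build_payload now appears as its own
      # Custom/WidgetsController/build_payload segment in transaction traces.
      add_method_tracer :build_payload, 'Custom/WidgetsController/build_payload'
    end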
Upvotes: 1