Reputation: 2047
I am new to Ruby and Rails and am experimenting with improving performance. Which of the two options below is the better way to go for a live app?
Option A: more hits to the database, but uses the built-in ActiveRecord Relation methods
def detail
  sorted_sizes = ["S", "M", "L", "XL"]
  products = Product.where(:name => params[:name])
  if products.empty?
    return redirect_to shop_path, :notice => 'The item is no longer available'
  end
  @product = products.first
  @sizes = sorted_sizes & products.pluck(:size).uniq
  @colors = products.pluck(:color).uniq
  @quantity = products.where("size = :size AND color = :color AND available = :bool",
                             size: @sizes[0], color: @colors[0], bool: true).count
end
Option B: uses an array to reduce hits to the DB, but requires some custom code, e.g. to grab the total quantity that matches certain filters.
def detail
  sorted_sizes = ["S", "M", "L", "XL"]
  products = Product.where(:name => params[:name]).to_a
  if products.empty?
    return redirect_to shop_path, :notice => 'The item is no longer available'
  end
  @product = products.first
  @sizes = sorted_sizes & products.pluck(:size).uniq
  @colors = products.pluck(:color).uniq
  @quantity = products.select do |elem|
    elem.size == @sizes[0] && elem.color == @colors[0] && elem.available == true
  end
  @quantity = @quantity.count
end
Am I wrong to assume Option B is better since the hits to the DB are reduced? And if the answer is "it depends", where would Option A be preferred, other than for access to the built-in methods?
Upvotes: 0
Views: 283
Reputation: 3578
The age-old question of every Rails developer.
Having developed multiple apps with complex datasets, I can assure you there is always a way to have your cake and eat it too.
It really depends on the dataset size. Say you were to do multiple array operations where N = 3 ^ 1000; repeated array manipulation at that scale would cripple the app.
I worked on an app where we thought the data was retrieved in a single query and then turned into a PDF. The app would crash for customers with large amounts of data, and we did not really understand why.
We later found out that it had everything to do with the array manipulation required to build the PDFs.
We eventually refactored in a way to minimize manipulation of data and reduce queries down to a single query.
This might not be necessary for you; I just want to give some context for what I'm saying.
On the flip side, for small requests over small datasets, doing multiple DB queries can be cripplingly slow, because every round trip to the database adds overhead.
All in all, it's always best to do a single query and to work off the original array.
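As a rough sketch of that idea applied to the code in the question (my own illustration, not the asker's code, and assuming the same Product columns): fetch the rows once, then derive everything else from the in-memory array.
# One query; everything after that works on the in-memory array.
def detail
  sorted_sizes = ["S", "M", "L", "XL"]
  products = Product.where(name: params[:name]).to_a
  return redirect_to shop_path, notice: 'The item is no longer available' if products.empty?

  @product  = products.first
  @sizes    = sorted_sizes & products.map(&:size).uniq
  @colors   = products.map(&:color).uniq
  # count directly instead of building a throwaway array with select
  @quantity = products.count { |p| p.size == @sizes[0] && p.color == @colors[0] && p.available }
end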
If you're interested, I would look into deeply nested associations in Rails, eager_load, and, when all else fails, writing SQL by hand.
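For example (a hedged sketch; the Shop model and its has_many :products association are assumptions for illustration, since the question only shows a Product table), eager_load pulls the parent and its association in a single LEFT OUTER JOIN query:
# Assumed, illustrative schema: class Shop < ApplicationRecord; has_many :products; end
shop = Shop.eager_load(:products).find_by(name: params[:name]) # one LEFT OUTER JOIN query
shop.products.each do |product|
  puts product.size # no extra query per product; the rows were loaded up front
end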
Upvotes: 3
Reputation: 1013
You can benchmark the two implementations of the method. Try executing the code below in a Rails console. In your code I changed params[:name] to the string 'some name'; you should adjust it so it finds some products.
require 'benchmark'

def detail_1
  sorted_sizes = ["S", "M", "L", "XL"]
  products = Product.where(:name => 'some name')
  if products.empty?
    return redirect_to shop_path, :notice => 'The item is no longer available'
  end
  @product = products.first
  @sizes = sorted_sizes & products.pluck(:size).uniq
  @colors = products.pluck(:color).uniq
  @quantity = products.where("size = :size AND color = :color AND available = :bool",
                             size: @sizes[0], color: @colors[0], bool: true).count
end

def detail_2
  sorted_sizes = ["S", "M", "L", "XL"]
  products = Product.where(:name => 'some name').to_a
  if products.empty?
    return redirect_to shop_path, :notice => 'The item is no longer available'
  end
  @product = products.first
  @sizes = sorted_sizes & products.pluck(:size).uniq
  @colors = products.pluck(:color).uniq
  @quantity = products.select do |elem|
    elem.size == @sizes[0] && elem.color == @colors[0] && elem.available == true
  end
  @quantity = @quantity.count
end

Benchmark.bmbm do |x|
  x.report('first implementation') do
    10000.times { detail_1 }
  end
  x.report('second implementation') do
    10000.times { detail_2 }
  end
end
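Note that Benchmark.bmbm runs each block twice, a rehearsal pass followed by the measured pass, which helps smooth out warm-up effects. With a realistic number of Product rows in your development database, the report should make it clear which implementation is faster for your data.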
Upvotes: 1