Reputation: 129
I've got a Stat model:
Stat(id: integer, points: float, user_id: integer, match_id: integer, team_id: integer)
And a Match model:
Match(id: integer, team_a_id: integer, team_b_id: integer)
Here's my code now:
stat = Stat
  .group(:user_id)
  .select("user_id, count(*) as matches_count, sum(points) as sum_points")
  .where.not(points: 0)
  .where.not(match_id: nil)

# Replace each row's sum_points with its average points per match
a = []
stat.each do |f|
  new_value = (f.sum_points.to_f / f.matches_count.to_f)
  f.sum_points = new_value.round(2)
  a << f
end

new_stat = a.sort_by(&:sum_points).reverse.first(10)
I have to change the value of sum_points on every row, and if I have a lot of data this will take a long time. Is there a way to minimize it? All I need is the top 10.
Upvotes: 2
Views: 74
Reputation: 91
The most efficient way would be to compute the data in SQL and only retrieve the top 10.
Basically, right now you're iterating over the whole ActiveRecord result set in Ruby to compute sum_points, then sorting and reversing all of that data just to keep the top 10. Like you've said, performance will suffer as the amount of data in the database grows.
Here's my solution, which computes and retrieves the top 10 in a single SQL query:
stat = Stat
  .group(:user_id)
  .select("user_id, round(sum(points) / count(*), 2) as sum_points")
  .where.not(points: 0)
  .where.not(match_id: nil)
  .order("sum_points DESC")
  .first(10)
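As a side note, first(10) runs the query immediately and returns an Array; if you'd rather keep a lazy relation that can still be chained or paginated, you can use limit(10) instead. A minimal sketch of that variant under the same model and column names (the comment about round is only relevant on PostgreSQL, where rounding a float to a given precision needs a numeric cast):

# Same aggregation as above, but limit(10) keeps the result as a lazy
# ActiveRecord::Relation instead of loading an Array right away.
# Note: on PostgreSQL, round() with a precision argument expects numeric,
# so a float column may need a cast, e.g. round((sum(points) / count(*))::numeric, 2).
top_stats = Stat
  .where.not(points: 0)
  .where.not(match_id: nil)
  .group(:user_id)
  .select("user_id, round(sum(points) / count(*), 2) as sum_points")
  .order("sum_points DESC")
  .limit(10)

# Each returned record responds to the selected aliases:
top_stats.each do |row|
  puts "user #{row.user_id}: #{row.sum_points}"
end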
Upvotes: 1