Elliot

Reputation: 13835

Handling lots of report / financial data in Rails 3 without slowing down?

I'm trying to figure out how to ask this - so I'll update the question as it goes to clear things up if needed.

I have a virtual stock exchange game site I've been building for fun. People make tons of trades, and each trade is its own record in a table.

When showing the portfolio page, I have to calculate everything on the fly from that table of data - i.e. how many shares the user has, total gains, losses, etc.

Things have really started slowing down when I try to segment trades by company by day.

I don't really have any code to show to demonstrate this - but it just feels like I'm not approaching this correctly.

UPDATE: This code in particular is very slow

    # Returns an array of total portfolio values over time
    def portfolio_value_over_time
      portfolio_value_array = []

      days = self.from_the_first_funding_date
      companies = self.list_of_companies

      days.each_with_index do |day, index|
        # Starting value for the day
        days_value = 0

        # N+1 problem: two separate lookups per company, per day
        companies.each do |company|
          holdings = self.holdings_by_day_and_company(day, company)
          price = Company.find_by_id(company).day_price(day)
          days_value += (holdings * price).to_i
        end

        # Total across all companies for that day
        portfolio_value_array[index] = days_value
      end

      portfolio_value_array
    end

The page load time can reach 20+ seconds - totally insane - even though I've cached a lot of the requests in Memcached.

Should I not be generating reports / charts on the live data like this? Should I be running a cron task and storing them somewhere? What's the best practice for handling this volume of data in Rails?

Upvotes: 1

Views: 346

Answers (3)

pguardiario

Reputation: 54984

Since you're using memcached anyway, use it to cache some of those queries. For example:

Company.find_by_id(company).day_price(day)
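In Rails this is typically done with `Rails.cache.fetch`, which runs the block only on a cache miss. A minimal pure-Ruby sketch of the same pattern, with a Hash standing in for the cache and a hypothetical `expensive_day_price` standing in for the real `Company#day_price` lookup:

```ruby
$db_calls = 0

# Stand-in for the slow database lookup (names are illustrative).
def expensive_day_price(company_id, day)
  $db_calls += 1  # count simulated database hits
  100.0           # dummy price
end

PRICE_CACHE = {}

def cached_day_price(company_id, day)
  # In Rails, the equivalent would be:
  #   Rails.cache.fetch(["day_price", company_id, day]) do
  #     Company.find_by_id(company_id).day_price(day)
  #   end
  PRICE_CACHE[[company_id, day]] ||= expensive_day_price(company_id, day)
end
```

Repeated calls with the same company and day then hit the cache instead of the database.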

Upvotes: 0

lee

Reputation: 2439

You should precompute and store all this data on another table. An example table might look like this:

Table: PortfolioValues
Column: user_id
Column: day
Column: company_id
Column: value
Index: user_id

Then you can easily load all the user's portfolio data with a single query, for example:

current_user.portfolio_values
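As a sketch, the table above might be created with a migration along these lines (class and column names simply mirror the proposed table; Rails 3.1-style `change` syntax assumed):

```ruby
# Sketch of the proposed table as a migration (Rails 3.1+ syntax assumed).
class CreatePortfolioValues < ActiveRecord::Migration
  def change
    create_table :portfolio_values do |t|
      t.integer :user_id
      t.date    :day
      t.integer :company_id
      t.decimal :value, :precision => 12, :scale => 2
    end
    add_index :portfolio_values, :user_id
  end
end

# A matching association makes current_user.portfolio_values work:
class User < ActiveRecord::Base
  has_many :portfolio_values
end
```

A cron or background job would then recompute each user's rows once per day, instead of on every page load.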

Upvotes: 1

Todd A. Jacobs

Reputation: 84363

The Problem

Of course it's slow. You're presumably looking up large volumes of data from each table, and performing multiple lookups on multiple tables on every iteration through your loop.

One Solution (Among Many)

You need to normalize your data, create a few new models to store expensive calculated values, and push more of the calculations onto the database or into tables.

The fact that you're doing a nested loop over high-volume data is a red flag. You're making many calls to the database, when ideally you should be making as few sequential requests as possible.

I have no idea how you need to normalize your data or optimize your queries, but you can start by looking at the output of `EXPLAIN`. In general, though, you will probably want to eliminate any full table scans and return data in larger chunks, rather than one record at a time.
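To illustrate "larger chunks": the nested loop in the question could instead load holdings and prices in two bulk queries and join them in memory. A pure-Ruby sketch, where the `holdings` and `prices` hashes stand in for the results of two grouped SQL queries (the data shapes are hypothetical):

```ruby
# Two bulk loads replace the per-company, per-day database round trips.
# holdings and prices are hashes keyed by [day, company_id], as might be
# built from two grouped SQL queries.
def portfolio_values(days, companies, holdings, prices)
  days.map do |day|
    companies.sum do |company|
      shares = holdings.fetch([day, company], 0)
      price  = prices.fetch([day, company], 0)
      (shares * price).to_i
    end
  end
end
```

The database is hit a constant number of times regardless of how many days or companies the portfolio spans, instead of twice per company per day.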

This really seems more like a modeling or query problem than a Rails problem, per se. Hope this gets you pointed in the right direction!

Upvotes: 2
