Reinier

Reputation: 162

Speeding up XML to MySQL with Nokogiri in Rails

I'm writing large amounts of data from XML feeds to my MySQL database in my Rails 3 app using Nokogiri. Everything is working fine but it's slower than I would like.

Is there any way to speed up the process? This is a simplified version of the script I'm using:

url = "http://example.com/urltoxml"
doc = Nokogiri::XML(open(url))
doc.xpath("//item").each do |record|

  guid = record.xpath("id").inner_text
  price = record.xpath("price").inner_text
  shipping = record.xpath("shipping").inner_text

  data = Table.new(
    :guid => guid,
    :price => price,
    :shipping => shipping
  )
  if price != ""
    data.save
  end

end

Thanks in advance.

Upvotes: 3

Views: 766

Answers (1)

m_x

Reputation: 12564

I guess your problem does not come from parsing the XML, but from inserting the records into the DB one by one, which is very costly.

Unfortunately, AFAIK Rails does not provide a native way to mass-insert records. There was once a gem that did it, but I can't lay my hands on it right now.

"Mass inserting data in Rails without killing your performance", though, provides helpful insights on how to do it manually.

If you go this way, don't forget to process your nodes in batches, unless you want to end up with a single 999-bazillion-row INSERT statement. A rough sketch follows below.
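Here is a minimal sketch of what batched raw inserts could look like, assuming your Table model is backed by a tables table with guid, price and shipping columns (the table name, column names and batch size are assumptions; adjust them to your schema):

require 'open-uri'
require 'nokogiri'

url = "http://example.com/urltoxml"
doc = Nokogiri::XML(open(url))

# Collect one value tuple per item, skipping items without a price
rows = doc.xpath("//item").map do |record|
  guid     = record.xpath("id").inner_text
  price    = record.xpath("price").inner_text
  shipping = record.xpath("shipping").inner_text
  [guid, price, shipping] unless price == ""
end.compact

conn = ActiveRecord::Base.connection

# Insert 500 rows per statement instead of one INSERT per record
rows.each_slice(500) do |batch|
  values = batch.map do |guid, price, shipping|
    "(#{conn.quote(guid)}, #{conn.quote(price)}, #{conn.quote(shipping)})"
  end.join(", ")

  conn.execute("INSERT INTO tables (guid, price, shipping) VALUES #{values}")
end

Even if you decide to keep your one-by-one data.save calls, wrapping the whole loop in ActiveRecord::Base.transaction should already cut down a lot of the per-row overhead.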

Upvotes: 1
