pangpang

Reputation: 8821

Fastest way to insert 100,000+ records into MySQL in Rails

I want to transfer data from Redis to MySQL.

First, I get the user_id and score from Redis:

info = $redis.zrevrange("some_info", 0, limit - 1, withscores: withscores)

Then I need to process this data:

  total_info = info.each_slice(2).to_a.inject([]) do |sum, e|
    sum << {
            id: id,
            name: name,
            rank: rank,
            score: score,
           }
  end

total_info is an array with 100,000+ elements.
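For illustration, the slicing step can be sketched with hypothetical data (the names, scores, and rank-from-position logic below are assumptions; the real fields come from the app):

```ruby
# Hypothetical flat reply from zrevrange with scores,
# alternating member and score, highest score first
flat = ["alice", "300", "bob", "250", "carol", "200"]

# each_slice(2) pairs up member and score; rank follows
# the position in the sorted-set reply (1-based)
rows = flat.each_slice(2).map.with_index(1) do |(name, score), rank|
  { name: name, rank: rank, score: score.to_i }
end
```

This produces one hash per member, ready to be turned into bulk-insert rows.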

Then I need to insert the records in total_info into MySQL.

I tried inserting them one by one in a loop, but I don't think that is a good idea.

What is the fastest way to do this?

Upvotes: 0

Views: 2166

Answers (2)

coorasse

Reputation: 5528

You can use the activerecord-import gem in its fastest mode: importing columns and raw value arrays.

Here is an example:

columns = [:id, :name, :rank, :score]
values = []
infos.each do |info|
  values << [info.id, info.name, info.rank, info.score]
end
# validate: false skips ActiveRecord validations for maximum speed
Model.import columns, values, validate: false
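As a variation, the values array can be built with map; the sample records below are hypothetical, and batch_size is an activerecord-import option that splits a very large import into several INSERT statements:

```ruby
columns = [:id, :name, :rank, :score]

# Hypothetical sample records; in the question these come from Redis
infos = [
  { id: 1, name: "alice", rank: 1, score: 300 },
  { id: 2, name: "bob",   rank: 2, score: 250 },
]

# Build one row of values per record, in column order
values = infos.map { |info| columns.map { |c| info[c] } }

# Then, against a real model:
# Model.import columns, values, validate: false, batch_size: 10_000
```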

Upvotes: 5

pangpang

Reputation: 8821

Thanks to @e4c5 and @Rich Peck.

I dump the data into a CSV file, then import it into MySQL with LOAD DATA INFILE. It is very fast!

require "csv"

info = $redis.zrevrange("some_info", 0, limit - 1, withscores: withscores)
CSV.open("temp.csv", "w") do |csv|
  csv << %w(id name rank score) # ...header row; remaining columns elided
  info.each_slice(2) do |e|
    csv << [id, name, rank, score] # ...values derived from e; elided in the original
  end
end

then:

sql = "LOAD DATA LOCAL INFILE 'temp.csv'
       INTO TABLE <table_name>
       FIELDS TERMINATED BY ','
       ENCLOSED BY '\"'
       LINES TERMINATED BY '\n'
       IGNORE 1 ROWS
       (id, name, rank, score.....);"
ActiveRecord::Base.connection.execute(sql)
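One caveat: LOAD DATA LOCAL INFILE requires the local_infile option to be enabled on both the server and the client connection (with the mysql2 adapter this is typically local_infile: true in database.yml). The CSV layout itself can be checked in isolation with CSV.generate; the rows below are hypothetical:

```ruby
require "csv"

rows = [[1, "alice", 1, 300], [2, "bob", 2, 250]] # hypothetical rows

csv_text = CSV.generate do |csv|
  csv << %w(id name rank score) # header row, skipped by IGNORE 1 ROWS
  rows.each { |row| csv << row }
end
```

Writing the same structure with CSV.open produces the temp.csv file that the LOAD DATA statement above consumes.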

Upvotes: 0
