Reputation: 2230
I use Rails with MySQL as the database. I want to import a large amount of historical data from my old system, which uses MongoDB.
My rake task looks like this:
threads = []
File.foreach("file.json").each_slice(100) do |lines|
  threads << Thread.new {
    time = Time.now
    lines.each do |line|
      json = ... # Parse json
      Model.new(json).save!(validate: false)
    end
    p Time.now.to_f - time.to_f
  }
end
threads.each(&:join) # wait for all threads to finish
I tried with a JSON file of 100 lines, and each thread took about 10 seconds. But when I tried with a JSON file of 1000 lines, each thread took about 90 seconds, and the whole import also took about 90 seconds.
Why didn't each thread still take 10 seconds when I imported the 1000-line file?
And how can I speed it up?
Upvotes: 1
Views: 313
Reputation: 115541
Your bottleneck is the database; I suggest you bulk-insert your models instead of saving them one at a time.
Use the activerecord-import gem to do this.
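Add it to your Gemfile and run bundle install:

  gem 'activerecord-import'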
Example:
models = []
File.foreach("file.json").each_slice(100) do |lines|
  lines.each do |line|
    hash = ... # parse line here
    models << Model.new(hash)
  end
end
Model.import models
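This builds one in-memory array, which is fine for moderate files. If the file is very large, you can import in batches so memory stays bounded. A minimal sketch (my addition, not part of the original answer), assuming each line of file.json is a standalone JSON object:

  require "json"

  File.foreach("file.json").each_slice(1000) do |lines|
    # Build one batch of models instead of saving each row individually
    batch = lines.map { |line| Model.new(JSON.parse(line)) }
    # activerecord-import issues a single multi-row INSERT per batch;
    # validate: false mirrors the original save!(validate: false)
    Model.import batch, validate: false
  end

Each batch becomes one INSERT statement instead of 1000 round trips, which is where the speedup comes from.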
Upvotes: 3