Reputation: 4572
I made a Rake task to parse information from some websites (using the Nokogiri gem) and save everything to a .json file. To do this, I store all the objects in an array and, before the Rake task ends, I call a method that serializes the array of objects to a JSON file.
Currently, I'm doing something like:
require "json" # needed outside Rails for to_json

# Declaring an instance variable inside the 'namespace'
@all_objects = []

# Each website I want to parse is added to the '@all_objects' array
# ('MyModel' stands in for my real class; Object.new takes no attributes)
this_object = MyModel.new(:attr1 => variable1, :attr2 => variable2)
@all_objects << this_object

# After running the code above for all websites, I save them to the .json file
File.open("public/data.json", "w") do |f|
  f.write(@all_objects.to_json)
end
What is the easiest way to do the same thing, but serializing to a SQL insert script instead? Is there a method like myArray.to_postgresql or myArray.to_mysql that takes all the objects in the array and generates a script to insert the data into a SQL database (INSERT INTO ...)?
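There is no such built-in to_postgresql or to_mysql on Array, but the transformation is simple enough to hand-roll. Below is a minimal sketch; the to_sql_inserts helper, the websites table, and the column names are made up for illustration, and the naive quoting only doubles single quotes, so for real data prefer your database adapter's own quoting methods:

```ruby
# Build one INSERT statement per row, where each row is a hash
# of column name => value. Values are single-quoted, with embedded
# single quotes doubled (minimal SQL escaping; not injection-proof
# for untrusted input).
def to_sql_inserts(table, rows)
  rows.map do |row|
    columns = row.keys.join(", ")
    values  = row.values.map { |v| "'#{v.to_s.gsub("'", "''")}'" }.join(", ")
    "INSERT INTO #{table} (#{columns}) VALUES (#{values});"
  end
end

statements = to_sql_inserts("websites", [
  { :title => "Example",  :url => "http://example.com" },
  { :title => "O'Reilly", :url => "http://oreilly.com" }
])
puts statements
# INSERT INTO websites (title, url) VALUES ('Example', 'http://example.com');
# INSERT INTO websites (title, url) VALUES ('O''Reilly', 'http://oreilly.com');
```

You can then write the resulting statements to a .sql file the same way the JSON is written above.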
Thanks in advance!
Upvotes: 0
Views: 706
Reputation: 851
Alternative way:
Model.last
Now you can simply use 'Copy as SQL INSERT' from Sequel Pro.
Upvotes: 1
Reputation: 4572
After searching a lot on the Internet and in forums, I found the "models_to_sql" gem by Martin Provencher.
However, this gem only works with Rails 3.2 and older, and my application uses Rails 4.0. I talked to the author and he doesn't want to support this gem anymore, so I worked out how to support Rails 4.0 myself and I'm now maintaining the gem.
For a single object:
object.to_sql_insert
# INSERT INTO modelName (field1, field2) VALUES ('Wow, amaze gem', 'much doge')
For an array of objects:
array_of_objects.to_sql_insert
# INSERT INTO modelName (field1, field2) VALUES ('Awesome doge', 'im fucking cop')
# INSERT INTO modelName (field1, field2) VALUES ('much profit', 'much doge')
# (...)
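For the curious, the general idea behind such a method can be sketched without the gem. This is not the gem's actual implementation, just an illustration: the ToSqlInsert module and the Website class are made up here, with Website standing in for an ActiveRecord model (real models get attributes and table_name from ActiveRecord, and Rails provides proper value quoting):

```ruby
# A mixin that builds an INSERT statement from any object exposing
# an 'attributes' hash and a class-level 'table_name'. Quoting is
# naive (single quotes doubled) to keep the sketch short.
module ToSqlInsert
  def to_sql_insert
    attrs   = attributes
    columns = attrs.keys.join(", ")
    values  = attrs.values.map { |v| "'#{v.to_s.gsub("'", "''")}'" }.join(", ")
    "INSERT INTO #{self.class.table_name} (#{columns}) VALUES (#{values})"
  end
end

# A stand-in for an ActiveRecord model, just for this illustration.
class Website
  include ToSqlInsert
  def self.table_name; "websites"; end
  attr_reader :attributes
  def initialize(attributes); @attributes = attributes; end
end

Website.new("field1" => "Wow, amaze gem", "field2" => "much doge").to_sql_insert
# => "INSERT INTO websites (field1, field2) VALUES ('Wow, amaze gem', 'much doge')"
```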
See the project's GitHub page to find out how to install and use this wonderful gem.
Upvotes: 1