Reputation: 21
I would like to export a large amount of data to a file. To do this, I am thinking of using PostgreSQL's asynchronous query processing. The code is as follows:
raw_conn = ActiveRecord::Base.connection.raw_connection

raw_conn.send_query("SELECT id FROM users")
raw_conn.set_single_row_mode
begin
  raw_conn.get_result.stream_each do |f|
    # write row to file
  end
end

raw_conn.send_query("SELECT id FROM logins")
raw_conn.set_single_row_mode
begin
  raw_conn.get_result.stream_each do |f|
    # write row to file
  end
end
When the second send_query is executed, the following error occurs:
another command is already in progress
I suspected the connection pool was the cause, but the documentation does not mention it. Is there any way to run send_query more than once and get the results?
Upvotes: 0
Views: 151
Reputation: 579
The reset didn't quite work for me (pg 1.5.6), but I found the method discard_results, which seems to resolve this for me. I'm not quite sure why the command is still considered in progress, or whether there is an official way to handle this.
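My understanding (not confirmed by the docs) is that after send_query, libpq requires get_result to be called repeatedly until it returns nil before the connection will accept another command; stream_each consumes only one result object, so a terminating result is left pending and the connection stays "busy". discard_results essentially drains that queue for you. A runnable sketch of the drain idea, using a stub class in place of PG::Connection so it works without a database:

```ruby
# StubConnection stands in for PG::Connection here; with the pg gem
# you would call raw_connection.get_result in the same loop.
class StubConnection
  def initialize(results)
    @results = results
  end

  # Returns the next pending result, or nil once the queue is empty,
  # mimicking PG::Connection#get_result after send_query.
  def get_result
    @results.shift
  end
end

# Drain all pending results so the connection is free for the next
# send_query (roughly what discard_results does internally).
def drain(conn)
  discarded = 0
  while conn.get_result
    discarded += 1
  end
  discarded
end

conn = StubConnection.new([:row_result, :row_result, :final_result])
drain(conn) # after this, a real connection would accept a new query
```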
Anyway, my solution for now looks something like this:
begin
  raw_connection = ActiveRecord::Base.connection.raw_connection
  raw_connection.send_query(sql)
  raw_connection.set_single_row_mode

  db_result = raw_connection.get_result
  db_result.stream_each do |row|
    # handle row
  end
ensure
  # added because of exception: "another command is already in progress"
  raw_connection.discard_results
end
Upvotes: 0
Reputation: 21
I resolved it myself. I reset the connection before sending a new query, and it worked.
begin
  raw_conn.get_result.stream_each do |f|
    # write row to file
  end
ensure
  raw_conn.reset
end
Upvotes: 1