Reputation: 101
In my query I am using a FOR loop. On every iteration of the loop some values have to be inserted into a table. This is time consuming because the loop runs over many records, so an insert happens on every single iteration. Is there any way to perform the insertion once, at the end, after the loop has finished?
for i in 1..10000 loop
    -- coding
    insert into datas.tb values (j, predictednode); -- j and predictednode are variables that change on every iteration
end loop;
Instead of inserting on every iteration, I want the insertion to happen once at the end.
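A sketch of one possible way to defer the insert, assuming the per-iteration values fit comfortably in memory: collect them in arrays inside the loop and run a single INSERT afterwards (the calculations below are placeholders):
do $$
declare
    js integer[] := '{}';
    nodes integer[] := '{}';
    j integer;
    predictednode integer;
begin
    for i in 1..10000 loop
        -- coding (placeholders for the real calculations)
        j := i;
        predictednode := i * 2;
        js := js || j;
        nodes := nodes || predictednode;
    end loop;

    -- single insert after the loop has finished
    insert into datas.tb
    select * from unnest(js, nodes);
end;
$$;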
Upvotes: 0
Views: 187
Reputation: 19302
One possible solution is to build one large VALUES string. In Java, something like:
StringBuilder buf = new StringBuilder(100000); // big enough?
for (int i = 1; i <= 10000; ++i) {
    buf.append("(")
       .append(j)
       .append(",")
       .append(predicted_node)
       .append("),"); // whatever j and predicted_node are
}
buf.setCharAt(buf.length() - 1, ' '); // blank out the trailing comma
String query = "INSERT INTO datas.tb VALUES " + buf.toString() + ";";
// send query to DB, just once
The fact that j and predicted_node appear to be constant has me a little worried, though. Why are you inserting a constant 10,000 times?
Another approach is to do the predicting in a Postgres procedural language, and have the DB itself calculate the value on insert.
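For instance, a minimal sketch of that idea, where predict_node is a hypothetical PL/pgSQL function standing in for the real prediction logic:
create or replace function predict_node(i integer)
returns integer
language plpgsql
as $$
begin
    -- placeholder: replace with the actual prediction logic
    return i * 2;
end;
$$;
With a function like that in place, the insert itself can be done in a single set-based statement, as in the answer below.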
Upvotes: 0
Reputation: 125214
If you show how the variables are calculated, it could be possible to build something like this:
insert into datas.tb
select
calculate_j_here,
calculate_predicted_node_here
from generate_series(1, 10000)
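For example, if the per-row values could be produced by a function such as the hypothetical predict_node sketched above, the whole loop collapses into a single statement:
insert into datas.tb
select i, predict_node(i)
from generate_series(1, 10000) as g(i);
That way PostgreSQL performs one set-based insert instead of 10,000 single-row inserts.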
Upvotes: 1