Reputation: 23
Hi, I am trying to fetch 50K+ rows from one of the tables in a MySQL DB. It is taking more than 20 minutes to retrieve all the data and write it to a text file. Can I use multithreading to reduce the fetching time and make the code more efficient? Any help will be appreciated.
I have used a normal JDBC connection and ResultSetMetaData to fetch rows from the table.
String row = "";
stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("select * from employee_details");
ResultSetMetaData rsmd = rs.getMetaData();
int columnCount = rsmd.getColumnCount();
while (rs.next()) {
    // column indexes are 1-based, so use <= to include the last column
    for (int i = 1; i <= columnCount; i++) {
        row = row + rs.getObject(i) + "|";
    }
    row = row + "\r\n";
}
And I am writing the fetched values to a text file as shown below.
BufferedWriter writer = new BufferedWriter(new FileWriter(
        "C:/Users/430398/Desktop/file/abcd.txt"));
writer.write(row);
writer.close();
Upvotes: 2
Views: 2056
Reputation: 2423
First you need to identify where the bottleneck is. Is it the SQL query? Or the fetching of the rows via the ResultSet? Or the building of the huge string? Or perhaps writing the file?
You need to measure the duration of each of these individual parts of your algorithm and tell us the results. Without this knowledge it is not possible to tell how to speed up the algorithm.
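A minimal sketch of how you could time each phase with System.nanoTime(), wrapping your existing loop (assuming conn is your open Connection; the query, loop, and file path are taken from your post):

long tQuery = System.nanoTime();
Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("select * from employee_details");

long tFetch = System.nanoTime();
ResultSetMetaData rsmd = rs.getMetaData();
int columnCount = rsmd.getColumnCount();
String row = "";
while (rs.next()) {
    for (int i = 1; i <= columnCount; i++) {
        row = row + rs.getObject(i) + "|";
    }
    row = row + "\r\n";
}

long tWrite = System.nanoTime();
BufferedWriter writer = new BufferedWriter(new FileWriter(
        "C:/Users/430398/Desktop/file/abcd.txt"));
writer.write(row);
writer.close();

long tDone = System.nanoTime();
System.out.printf("query: %d ms, fetch/build: %d ms, write: %d ms%n",
        (tFetch - tQuery) / 1_000_000,
        (tWrite - tFetch) / 1_000_000,
        (tDone - tWrite) / 1_000_000);

Note that with default JDBC settings most rows are actually pulled during the rs.next() loop, so the fetch/build number will likely dominate. If string building turns out to be the bottleneck, replacing the repeated concatenation with a StringBuilder will usually help far more than threads would.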
Upvotes: 0
Reputation: 315
Remember that rs.next() will fetch results from the DB in batches of n, where n is a number defined by the JDBC implementation. I assume it's 10 right now. So for every batch of 10 rows it will query the DB again, hence there'll be a network overhead, even if the DB is on the very same machine.
Just increasing that number will result in a faster loading time.
edit:
adding this
stmt.setFetchSize(50000);
might be it.
Be aware that this results in heavy memory consumption.
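One caveat, assuming you are on MySQL Connector/J: that driver reads the entire result set into memory by default, and a positive setFetchSize value is only honored as a cursor hint when useCursorFetch=true is set on the connection. The documented way to stream rows one at a time instead is:

// Streaming mode in MySQL Connector/J: a forward-only, read-only statement
// plus a fetch size of Integer.MIN_VALUE tells the driver to stream rows
// one by one instead of buffering the whole result set in memory.
Statement stmt = conn.createStatement(
        ResultSet.TYPE_FORWARD_ONLY,
        ResultSet.CONCUR_READ_ONLY);
stmt.setFetchSize(Integer.MIN_VALUE);
ResultSet rs = stmt.executeQuery("select * from employee_details");

This avoids the heavy memory consumption mentioned above, at the cost of keeping the connection tied up until the ResultSet is fully read and closed.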
Upvotes: 1