Reputation: 29
I have a performance problem with my server.
Now here is the setup:
I have rented a virtual server (4 GB, 2 vCores/Threads). On this server runs a Java program that connects to a second server via TCP. Once server 1 connects to server 2, server 2 starts sending data continuously (about 250,000 lines per Minute, each line about 50 bytes of data). Server 1 then splits each line and stores the data by adding a row to a table of a MySQL database also running on server 1. Ok, so here is the problem: In no way can my server 1 keep up.
These are the relevant Java parts.
Receiving data:
InputStream in = socket.getInputStream();
BufferedInputStream bin = new BufferedInputStream(in);
InputStreamReader inReader = new InputStreamReader(bin, "UTF-8");
BufferedReader reader = new BufferedReader(inReader);
String line;
... something...
while ((line = reader.readLine()) != null) {
    mySqlConn.WriteReferenceData(line);
}
}
All quite straightforward. And here is the code that writes the data:
public void WriteReferenceData(String line) {
    String[] data = line.split(";");
    String query = " insert into reference (field1, field2, field3, field4, field5, field6, field7, field8)"
            + " values (?, ?, ?, ?, ?, ?, ?, ?)";
    try {
        PreparedStatement preparedStmt = conn.prepareStatement(query);
        preparedStmt.setString(1, data[0]);
        preparedStmt.setString(2, data[1]);
        preparedStmt.setString(3, data[2]);
        preparedStmt.setString(4, data[3]);
        preparedStmt.setString(5, data[4]);
        preparedStmt.setString(6, data[5]);
        preparedStmt.setString(7, data[6]);
        preparedStmt.setString(8, data[7]);
        preparedStmt.execute();
    }
    catch (Exception e) {
        ... something...
    }
}
So how could I improve the performance?
1.) Not at all, because the server is just too slow?
2.) Would using a different language (e.g. a compiled one without an interpreter) be helpful?
3.) Is it just sloppy programming?
Thanks for your help! Raphael
Upvotes: 0
Views: 660
Reputation: 29
OK, it really came down to no longer inserting each row separately.
However, simply using the batch statement as YCF_L pointed out was not enough on its own; I also needed to modify my connection parameters. Without "rewriteBatchedStatements=true" there was no performance gain.
DriverManager.getConnection("jdbc:mysql://localhost/"
+ database + "?user=userXY&rewriteBatchedStatements=true");
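Putting the two pieces together, a batched version of the insert method might look like the sketch below. The BATCH_SIZE value and the parseLine helper are my additions for illustration, not part of the original code; the table and column names are taken from the question.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertSketch {

    private static final int BATCH_SIZE = 1000; // assumption: tune for your workload

    // Same split logic as the original WriteReferenceData.
    public static String[] parseLine(String line) {
        return line.split(";");
    }

    // Prepare the statement once, add each row to the batch, and flush every
    // BATCH_SIZE rows. With rewriteBatchedStatements=true on the connection,
    // MySQL Connector/J can rewrite the batch into multi-row INSERTs.
    public static void writeReferenceData(Connection conn, List<String> lines)
            throws SQLException {
        String query = "insert into reference (field1, field2, field3, field4,"
                + " field5, field6, field7, field8) values (?, ?, ?, ?, ?, ?, ?, ?)";
        try (PreparedStatement stmt = conn.prepareStatement(query)) {
            int count = 0;
            for (String line : lines) {
                String[] data = parseLine(line);
                for (int i = 0; i < 8; i++) {
                    stmt.setString(i + 1, data[i]);
                }
                stmt.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    stmt.executeBatch(); // flush a full batch
                }
            }
            stmt.executeBatch(); // flush the remaining rows
        }
    }
}
```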
Thank you Jan Khonski, as your answer would have worked, too. However, I am marking mine as correct because I would like to list several ways of doing it.
In the end I discarded batch statements and used delayed database updates instead, implicitly turning autocommit off.
One way of achieving this is to wrap a series of INSERT statements in START TRANSACTION; and COMMIT;
See MySQL Documentation. So far I have no results on which method runs best when the database grows bigger.
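A minimal sketch of that transaction-based variant via JDBC, where setAutoCommit(false) plays the role of START TRANSACTION. The COMMIT_INTERVAL value and the shouldCommit helper are my own illustrative choices, not from the original post:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class TransactionInsertSketch {

    private static final int COMMIT_INTERVAL = 5000; // assumption: tune as needed

    // True when a full interval of rows has been inserted and it is time to commit.
    public static boolean shouldCommit(int rowsInserted) {
        return rowsInserted > 0 && rowsInserted % COMMIT_INTERVAL == 0;
    }

    // Disable autocommit, insert rows one by one, and commit in groups, so
    // MySQL writes many rows per transaction instead of one per row.
    public static void writeReferenceData(Connection conn, List<String> lines)
            throws SQLException {
        String query = "insert into reference (field1, field2, field3, field4,"
                + " field5, field6, field7, field8) values (?, ?, ?, ?, ?, ?, ?, ?)";
        conn.setAutoCommit(false);
        try (PreparedStatement stmt = conn.prepareStatement(query)) {
            int rows = 0;
            for (String line : lines) {
                String[] data = line.split(";");
                for (int i = 0; i < 8; i++) {
                    stmt.setString(i + 1, data[i]);
                }
                stmt.execute();
                rows++;
                if (shouldCommit(rows)) {
                    conn.commit();
                }
            }
            conn.commit(); // commit the remaining rows
        }
    }
}
```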
Upvotes: 1
Reputation: 13103
I had this problem too. Create one SQL query that inserts multiple rows at the same time, and build it programmatically!
insert into reference (field1, field2, field3, field4, field5, field6, field7, field8)
values (v00, v01, v02, v03, v04, v05, v06, v07),
       (v10, v11, v12, v13, v14, v15, v16, v17), ...
Only make sure that the SQL query length does not exceed the maximum allowed (65,536; the limit can differ between databases).
https://docs.oracle.com/javase/7/docs/api/java/sql/PreparedStatement.html: "A SQL statement is precompiled and stored in a PreparedStatement object. This object can then be used to efficiently execute this statement multiple times."
You create a PreparedStatement object each time a new line arrives! Instead, collect multiple lines and then issue one insert statement.
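For illustration, building such a multi-row statement programmatically could look like this sketch. The snippet above inlines literal values; here I use ? placeholders instead, which avoids quoting and SQL-injection issues. The helper names are hypothetical, not from the answer:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class MultiRowInsertSketch {

    // Build "insert into reference (...) values (?,...,?), (?,...,?), ..."
    // with one placeholder group of 8 per row.
    public static String buildMultiRowInsert(int rowCount) {
        StringBuilder sql = new StringBuilder(
                "insert into reference (field1, field2, field3, field4,"
                + " field5, field6, field7, field8) values ");
        for (int r = 0; r < rowCount; r++) {
            if (r > 0) {
                sql.append(", ");
            }
            sql.append("(?, ?, ?, ?, ?, ?, ?, ?)");
        }
        return sql.toString();
    }

    // Insert all collected lines with a single multi-row statement.
    public static void insertCollected(Connection conn, List<String> lines)
            throws SQLException {
        String sql = buildMultiRowInsert(lines.size());
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            int p = 1;
            for (String line : lines) {
                for (String field : line.split(";")) {
                    stmt.setString(p++, field);
                }
            }
            stmt.execute();
        }
    }
}
```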
Upvotes: 0