Reputation: 11483
Some pre-question information:
I am attempting to convert an entire table from MySQL to SQLite (other databases will follow, but I'm starting there). This has to be done in Java and must be able to handle a large number of rows. However, upon execution of my SQLite statement (which is rather large), the entire program simply locks up and freezes. When I look at the SQLite database afterwards, not a single row has been added.
Other info:
Java code:
MySQL db = new MySQL();
StringBuilder sb = new StringBuilder("INSERT OR IGNORE INTO `playTime` ");
try {
    db.open();
    ResultSet ret = db.query("SELECT * FROM `playTime`");
    int i = 1;
    while (ret.next()) {
        if (i > 1) {
            sb.append("UNION SELECT ").append(i).append(", '").append(ret.getString(2)).append("', ").append(ret.getInt(3)).append(", ").append(ret.getInt(4)).append(", ").append(ret.getInt(5)).append(" ");
        } else {
            sb.append("SELECT ").append(i).append(" AS 'column1', '").append(ret.getString(2)).append("' AS 'column2', ").append(ret.getInt(3)).append(" AS 'column3', ").append(ret.getInt(4)).append(" AS 'column4', ").append(ret.getInt(5)).append(" AS 'column5' ");
        }
        i++;
    }
    db.close();
} catch (SQLException e) {
    ...
}
This code produces a string for a new SQLite query similar to the one below. I've merely added line breaks to make it readable; the actual statement has exactly the same syntax, just on a single line.
INSERT OR IGNORE INTO `playTime` SELECT 1 AS 'column1', 'examplename1' AS 'column2', 5696 AS 'column3', 0 AS 'column4', 1 AS 'column5'
UNION SELECT 2, 'examplename2', 18145, 0, 1
UNION SELECT 3, 'examplename3', 69793, 89, 0
UNION SELECT 4, 'examplename4', 51335, 16, 0
...
Overall, 11077 rows are read from the MySQL database, and I then attempt to insert those rows into the SQLite database file that I've made. I understand it is a large amount of data, but upon execution of the query the entire program freezes. Could this be caused by the thread not having a large enough stack size for the job at hand? If so, why would it not throw a StackOverflowError?
From my understanding and research, I'm well within SQLite's limits. As stated on the sqlite.org page ( http://www.sqlite.org/limits.html ), the maximum length of an SQL statement is 1,000,000 bytes, whereas my statement comes out at around 480,000 bytes. I haven't been able to find any information on a maximum number of rows that can be inserted at once, so I assume there isn't one.
I'll just reiterate that I am not seeing any visible Java errors; the program completely stops/crashes before anything useful can be printed.
Upvotes: 0
Views: 345
Reputation: 180162
SQLite has a limit on the number of SELECT terms that can be combined into a compound statement like this (SQLITE_MAX_COMPOUND_SELECT, 500 by default). Just create a simple INSERT statement for each row instead. If you put all of them into one transaction, this will be just as fast.
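For example, a minimal sketch of that approach, assuming `sqliteConn` is an open `java.sql.Connection` to the SQLite file and `ret` is the MySQL `ResultSet` from the question (both variable names are placeholders, not part of the original code):

```java
// Sketch: one parameterized INSERT per row, batched inside a single transaction.
String sql = "INSERT OR IGNORE INTO `playTime` VALUES (?, ?, ?, ?, ?)";
sqliteConn.setAutoCommit(false);               // one transaction for all rows
try (PreparedStatement ps = sqliteConn.prepareStatement(sql)) {
    int i = 1;
    while (ret.next()) {
        ps.setInt(1, i++);
        ps.setString(2, ret.getString(2));
        ps.setInt(3, ret.getInt(3));
        ps.setInt(4, ret.getInt(4));
        ps.setInt(5, ret.getInt(5));
        ps.addBatch();                         // queue the row instead of building SQL text
    }
    ps.executeBatch();
    sqliteConn.commit();                       // single commit keeps the whole batch fast
} catch (SQLException e) {
    sqliteConn.rollback();
    throw e;
}
```

Binding values through a PreparedStatement also avoids building the SQL by string concatenation, so quoting problems in the text column go away.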
Upvotes: 1
Reputation: 53516
Why is it slow? Because the engine is trying to parse an SQL statement that has 11077 UNIONs and SELECTs in it, which is very hard on the parser. I don't know whether it matters for this particular problem, but UNION also deduplicates the result set, whereas the faster UNION ALL simply concatenates the rows.
As Tichodroma said in the comments, export/import via CSV is the best alternative.
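If you go the CSV route, a rough sketch of the export side might look like the following. The connection variables and file name are placeholders made up for illustration; the column layout (one integer key, one text column, three integer columns) is taken from the query in the question.

```java
// Sketch: dump the MySQL table to a CSV file via plain JDBC.
try (Connection conn = DriverManager.getConnection(mysqlUrl, mysqlUser, mysqlPass);
     Statement st = conn.createStatement();
     ResultSet rs = st.executeQuery("SELECT * FROM `playTime`");
     PrintWriter out = new PrintWriter(new FileWriter("playtime.csv"))) {

    while (rs.next()) {
        // Quote the text column and double any embedded quotes; numbers go out as-is.
        out.printf("%d,\"%s\",%d,%d,%d%n",
                rs.getInt(1),
                rs.getString(2).replace("\"", "\"\""),
                rs.getInt(3), rs.getInt(4), rs.getInt(5));
    }
}
```

The resulting file can then be loaded from the sqlite3 shell with `.mode csv` followed by `.import playtime.csv playTime`.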
Upvotes: 1