Reputation: 194
I pass arbitrary multi-statement SQL to Mysqli (using mysqlnd) and retrieve the result set. SELECT statements may or may not contain a LIMIT
clause. What I need is to cap fetched rows at 1000 (a hardcoded limit). Here is what I have:
define('SAFE_ROWS', 1000); // strict limit

$sql = "SELECT * FROM table1 LIMIT 99999999";
$mysqli->multi_query($sql);

// Unbuffered: the result set stays on the MySQL side
// and rows are transferred only when I call fetch*()
$result = $mysqli->use_result();

$rows = [];
while ($row = $result->fetch_array()) {
    if (count($rows) >= SAFE_ROWS) {
        ##############################################
        # QUESTION: what do I have to do here        #
        # to skip the remaining rows of this result? #
        ##############################################
        break;
    }
    $rows[] = $row;
}

// As slow as the number of rows left unfetched:
$result->free();
Freeing the result takes too much time when lots of unfetched rows remain in the current result set. Is there a way to tell MySQL "I don't need the remaining rows of this result anymore, just bury it"?
PS. I know about the SQL_SELECT_LIMIT
option, but it is ignored when a select limit is explicitly set (SELECT * FROM t1 LIMIT 99999999
).
Upvotes: -1
Views: 547
Reputation: 157870
What you're trying to do is called barbarism.
And discarding the query results is not your real problem here.
In layman's terms, your question reads a lot like:
I needed a campfire to boil myself a cup of tea. Chopping down a few trunks is too much work; I happened to have a ton of dynamite, so I just blew up the whole forest. This worked all right, my campfire was perfect. But now I don't need all that garbage lying around. How can I make it disappear in a puff of smoke?
And the main problem here is a forest that is no more.
With your idea you are doing exactly the same. You are wasting A FRIGGIN LOT of resources, on both the PHP and the MySQL side, making the MySQL server load a lot of data when you need only a few rows. And even the time it takes to clear the result set doesn't hint to you that you are doing something wrong.
All right, suppose you find that magical way to discard several gigabytes of data in a second. But those gigabytes have to be gathered first. And when that homemade phpMyAdmin of yours kills a database server, its users won't be as grateful as you expect.
In a now-deleted comment you stated that you are "working on another phpMyAdmin". So just do what phpMyAdmin does: parse the SQL and add a LIMIT clause to the query.
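As a very rough illustration of that idea, here is a minimal sketch of a hypothetical `enforce_limit()` helper that rewrites or appends a trailing LIMIT clause with a regex. This is an assumption-laden toy, not what phpMyAdmin actually does (it ships a full SQL parser for this); a regex like this breaks on subqueries, comments, the `LIMIT n OFFSET m` form, and string literals containing "LIMIT", so treat it only as a sketch of the approach.

```php
<?php
// Naive sketch: cap the row count of a simple, single SELECT at $max.
// Handles only a trailing "LIMIT count" or "LIMIT offset, count";
// a real implementation should use a proper SQL parser.
function enforce_limit(string $sql, int $max): string
{
    $sql = rtrim(rtrim($sql), ';');
    if (preg_match('/\bLIMIT\s+(\d+)(?:\s*,\s*(\d+))?\s*$/i',
                   $sql, $m, PREG_OFFSET_CAPTURE)) {
        if (isset($m[2]) && $m[2][0] !== '') {
            // "LIMIT offset, count" form: cap the count, keep the offset
            $clause = 'LIMIT ' . (int)$m[1][0] . ', '
                    . min((int)$m[2][0], $max);
        } else {
            // "LIMIT count" form: cap the count
            $clause = 'LIMIT ' . min((int)$m[1][0], $max);
        }
        return substr($sql, 0, $m[0][1]) . $clause;
    }
    // No LIMIT at all: append one
    return $sql . " LIMIT $max";
}
```

With this in place, the question's `SELECT * FROM table1 LIMIT 99999999` becomes `SELECT * FROM table1 LIMIT 1000` before it ever reaches the server, so there is nothing left to discard after fetching.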
Upvotes: 5