I have a directory that I've built with the PHP script below and it uses pagination to get 1002 results per page. The problem is that the farther you get in the pages, the longer they take to load. For example, page 1 loads significantly faster than page 10,000.
I'm guessing I did something wrong with the query: instead of selecting only the 1002 results for the requested page, it seems to be scanning through all the rows before them as well. If someone could post the code that needs to be fixed, that would be great!
Thanks for your time and help!
<?php include("websites/header.html"); ?>
<center>
<?php
/*
Place code to connect to your DB here.
*/
include('websites/database.php'); // include your code to connect to DB.
$tbl_name="list"; //your table name
// How many adjacent pages should be shown on each side?
$adjacents = 5;
/*
First get total number of rows in data table.
If you have a WHERE clause in your query, make sure you mirror it here.
*/
$query = "SELECT COUNT(*) as num FROM $tbl_name";
$total_pages = mysql_fetch_array(mysql_query($query));
$total_pages = $total_pages['num']; // quote the key; bare num relies on a deprecated bareword-constant fallback
/* Setup vars for query. */
$targetpage = "websites.php"; //your file name (the name of this file)
$limit = 1002; //how many items to show per page
$page = isset($_GET['page']) ? (int)$_GET['page'] : 0; // cast to int so the value is safe to use in the query
if($page)
	$start = ($page - 1) * $limit; //first item to display on this page
else
	$start = 0; //if no page var is given, set start to 0
/* Get data. */
$sql = "SELECT website FROM $tbl_name LIMIT $start, $limit";
$result = mysql_query($sql);
/* Setup page vars for display. */
if ($page == 0) $page = 1; //if no page var is given, default to 1.
$prev = $page - 1; //previous page is page - 1
$next = $page + 1; //next page is page + 1
$lastpage = ceil($total_pages/$limit); //lastpage is = total pages / items per page, rounded up.
$lpm1 = $lastpage - 1; //last page minus 1
/*
Now we apply our rules and draw the pagination object.
We're actually saving the code to a variable in case we want to draw it more than once.
*/
$pagination = "";
if($lastpage > 1)
{
$pagination .= "<div class=\"pagination2\">";
//previous button
if ($page > 1)
$pagination.= "<a href=\"$targetpage?page=$prev\">< previous</a>";
else
$pagination.= "<span class=\"disabled\">< previous</span>";
//pages
if ($lastpage < 7 + ($adjacents * 2)) //not enough pages to bother breaking it up
{
for ($counter = 1; $counter <= $lastpage; $counter++)
{
if ($counter == $page)
$pagination.= "<span class=\"current\">$counter</span>";
else
$pagination.= "<a href=\"$targetpage?page=$counter\">$counter</a>";
}
}
elseif($lastpage > 5 + ($adjacents * 2)) //enough pages to hide some
{
//close to beginning; only hide later pages
if($page < 1 + ($adjacents * 2))
{
for ($counter = 1; $counter < 4 + ($adjacents * 2); $counter++)
{
if ($counter == $page)
$pagination.= "<span class=\"current\">$counter</span>";
else
$pagination.= "<a href=\"$targetpage?page=$counter\">$counter</a>";
}
$pagination.= "...";
$pagination.= "<a href=\"$targetpage?page=$lpm1\">$lpm1</a>";
$pagination.= "<a href=\"$targetpage?page=$lastpage\">$lastpage</a>";
}
//in middle; hide some front and some back
elseif($lastpage - ($adjacents * 2) > $page && $page > ($adjacents * 2))
{
$pagination.= "<a href=\"$targetpage?page=1\">1</a>";
$pagination.= "<a href=\"$targetpage?page=2\">2</a>";
$pagination.= "...";
for ($counter = $page - $adjacents; $counter <= $page + $adjacents; $counter++)
{
if ($counter == $page)
$pagination.= "<span class=\"current\">$counter</span>";
else
$pagination.= "<a href=\"$targetpage?page=$counter\">$counter</a>";
}
$pagination.= "...";
$pagination.= "<a href=\"$targetpage?page=$lpm1\">$lpm1</a>";
$pagination.= "<a href=\"$targetpage?page=$lastpage\">$lastpage</a>";
}
//close to end; only hide early pages
else
{
$pagination.= "<a href=\"$targetpage?page=1\">1</a>";
$pagination.= "<a href=\"$targetpage?page=2\">2</a>";
$pagination.= "...";
for ($counter = $lastpage - (2 + ($adjacents * 2)); $counter <= $lastpage; $counter++)
{
if ($counter == $page)
$pagination.= "<span class=\"current\">$counter</span>";
else
$pagination.= "<a href=\"$targetpage?page=$counter\">$counter</a>";
}
}
}
//next button
if ($page < $counter - 1)
$pagination.= "<a href=\"$targetpage?page=$next\">next ></a>";
else
$pagination.= "<span class=\"disabled\">next ></span>";
$pagination.= "</div>\n";
}
?>
<?php
$i = 0;
echo '<table style="table-layout:fixed; width:1050px;"><tr>';
while($row = mysql_fetch_array($result))
{
$i ++;
if ($i<=3)
{
echo '<td style="word-wrap: break-word;">
<div><a href="http://www.mywebsite.com/check.php?site='.strtolower($row['website']).'">'.strtolower($row['website']).'</a></div>
</td>';
}
else
{
echo '</tr><tr>';
echo '<td style="word-wrap: break-word;"><div><a href="http://www.mywebsite.com/check.php?site='.strtolower($row['website']).'">'.strtolower($row['website']).'</a></div></td>';
$i = 0;
$i++;
}
}
echo '</tr></table>';
?>
<?=$pagination?>
</center>
<?php include("websites/footer.html"); ?>
Upvotes: 5
Views: 6813
Ideally, your table t_limit has an indexed unique primary key id. If so, try this:
SELECT  l.*
FROM    (
        SELECT  id
        FROM    t_limit
        ORDER BY id
        LIMIT   10000, 10
        ) o
JOIN    t_limit l
ON      l.id = o.id
ORDER BY l.id
This trick, described in the article MySQL ORDER BY / LIMIT performance: late row lookups at EXPLAIN EXTENDED, can speed up LIMIT-with-offset queries by more than 100 times.
The main trick is to limit the data as early in the query as possible, ideally using the fastest (primary) keys.
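The deferred-join trick above can be sketched against an in-memory SQLite database from Python; the table contents here are made up for illustration, and the point is that the inner subquery pages through the primary key alone before the join fetches the full rows:

```python
import sqlite3

# Hypothetical stand-in for the answer's t_limit table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_limit (id INTEGER PRIMARY KEY, website TEXT)")
conn.executemany("INSERT INTO t_limit (website) VALUES (?)",
                 [("site%d.example" % i,) for i in range(1000)])

# Plain offset pagination: the engine materializes full rows, then discards 500.
plain = conn.execute(
    "SELECT id, website FROM t_limit ORDER BY id LIMIT 10 OFFSET 500").fetchall()

# Deferred join: page through the narrow primary-key index first,
# then look up only the 10 surviving rows.
deferred = conn.execute("""
    SELECT l.id, l.website
    FROM (SELECT id FROM t_limit ORDER BY id LIMIT 10 OFFSET 500) AS o
    JOIN t_limit AS l ON l.id = o.id
    ORDER BY l.id
""").fetchall()

assert plain == deferred  # same page, cheaper scan
```

Both queries return identical pages; the win comes from how little data the inner scan has to touch.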
Upvotes: 1
LIMIT with an offset is extremely slow in most databases (I've found some documentation to this effect for MySQL and I'm trying to find a really good article I read a while ago explaining this for SQLite). The reason is that it's generally implemented something like this:
1. Process the query as if the LIMIT clause wasn't there
2. Discard the first offset rows
3. Return the requested number of rows
What this means is that if you do LIMIT 10000, 10, it will be interpreted as: fetch the rows in order, throw away the first 10,000, then return the next 10.
There's a trivial optimization where you can at least use the index for the first 10,000 results, since you don't care about their values, but even in that case the database still needs to walk through 10,000 index entries before giving you your 10 results. There may be further optimizations that can improve this, but in the general case you don't want to use LIMIT with an offset for large values.
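The "skip then return" semantics can be checked directly with SQLite from Python (the table and row contents here are invented for the demonstration): an offset query returns exactly the slice you would get by fetching everything in order and discarding the prefix.

```python
import sqlite3

# Hypothetical in-memory table standing in for the asker's `list` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE list (id INTEGER PRIMARY KEY, website TEXT)")
conn.executemany("INSERT INTO list (website) VALUES (?)",
                 [("site%d.example" % i,) for i in range(100)])

# LIMIT 10 OFFSET 20: walk the rows in order, skip 20, return the next 10.
offset_rows = conn.execute(
    "SELECT website FROM list ORDER BY id LIMIT 10 OFFSET 20").fetchall()

# Equivalent to fetching everything and slicing -- which is why large
# offsets cost time proportional to the offset, not the page size.
all_rows = conn.execute("SELECT website FROM list ORDER BY id").fetchall()
assert offset_rows == all_rows[20:30]
```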
The most efficient way to handle pagination that I'm aware of is to keep track of the last index: if page one ends on id = 5, then make your next link use WHERE id > 5 (with a LIMIT x, of course).
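That keyset approach can be sketched with SQLite from Python; the table, the page_after helper, and the page size are all made up for illustration. Each page seeks straight to the rows after the last id seen, so the cost no longer grows with how deep you are:

```python
import sqlite3

# Hypothetical table standing in for the asker's `list` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE list (id INTEGER PRIMARY KEY, website TEXT)")
conn.executemany("INSERT INTO list (website) VALUES (?)",
                 [("site%d.example" % i,) for i in range(30)])

def page_after(last_id, page_size=10):
    # Seek directly past the last id seen; no offset rows are scanned.
    return conn.execute(
        "SELECT id, website FROM list WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, page_size)).fetchall()

page1 = page_after(0)               # ids 1..10
page2 = page_after(page1[-1][0])    # ids 11..20, seeded from page1's last id
```

The "next" link just carries the last id of the current page instead of a page number; the trade-off is that you can no longer jump to an arbitrary page directly.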
EDIT: Found the article for SQLite. I highly recommend you read this since it explains The Right Way™ to do things in SQL. Since the SQLite people are really smart and other databases have this same problem, I assume MySQL implements this in a similar way.
Another error that crops up frequently is programmers trying to implement a scrolling window using LIMIT and OFFSET. The idea here is that you first just remember the index of the top entry in the display and run a query like this:
SELECT title FROM tracks WHERE singer='Madonna' ORDER BY title LIMIT 5 OFFSET :index
The index is initialized to 0. To scroll down just increment the index by 5 and rerun the query. To scroll up, decrement the index by 5 and rerun.
The above will work actually. The problem is that it gets slow when the index gets large. The way OFFSET works in SQLite is that it causes the sqlite3_step() function to ignore the first :index breakpoints that it sees. So, for example, if :index is 1000, you are really reading in 1005 entries and ignoring all but the last 5. The net effect is that scrolling starts to become sluggish as you get lower and lower in the list.
Upvotes: 6