Reputation: 5063
Say I have a table my_table (id, name) with 100,000 entries in my database, and I'd like to fetch and display all of them:
$conn = mysql_connect('localhost', 'USER', 'PASS') or die(mysql_error());
mysql_select_db('my_database', $conn); // the database that contains my_table

$sql    = "SELECT * FROM my_table";
$result = mysql_query($sql, $conn) or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
    $name = $row['name'];
    $id   = $row['id'];
    echo $name;
    echo "<br>(done)<br>";
}
The problem
It takes a long time to load and then shows all the names at once (remember, my_table has 100,000 entries). Sometimes it suddenly dumps a huge amount of output in one go, and sometimes it just hangs.
My question
Is there any way to run this so that it shows the results one name at a time, like streaming: name1 (done), name2 (done), name3 (done), name4 (done), name5 (done), ... instead of all at once?
I'm afraid it may need some AJAX, or something else I don't know about, but this is important to me, so your help would be very useful.
Upvotes: 2
Views: 389
Reputation: 683
Do not fetch all the data at once. Instead, define a limit on the returned results (100 rows, or whatever you like) and provide paging so the user can move from page to page; while one page is displayed, you can cache the next probable page.
Check out the following links for more details:
http://www.tuxradar.com/practicalphp/9/4/9
Processing a large amount of data efficiently with MySQL and PHP
Selecting large amount of rows from database table with MySQL
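The paging idea above can be sketched with mysqli and LIMIT/OFFSET. This is only a sketch: the database name `my_db`, the credentials, and the `pageOffset()`/`showPage()` helpers are assumptions for illustration, not code from the answer.

```php
<?php
// Sketch: serve one page of my_table per request. Database name,
// credentials, and helper names are illustrative assumptions.
function pageOffset(int $page, int $pageSize): int {
    // 0-based page index -> row offset
    return $page * $pageSize;
}

function showPage(int $page, int $pageSize = 100): void {
    $mysqli = new mysqli('localhost', 'USER', 'PASS', 'my_db');
    $stmt   = $mysqli->prepare(
        'SELECT id, name FROM my_table ORDER BY id LIMIT ? OFFSET ?'
    );
    $offset = pageOffset($page, $pageSize);
    $stmt->bind_param('ii', $pageSize, $offset);
    $stmt->execute();
    $result = $stmt->get_result();
    while ($row = $result->fetch_assoc()) {
        echo $row['name'], "<br>(done)<br>";
    }
}

// e.g. showPage(isset($_GET['page']) ? (int)$_GET['page'] : 0);
```

Binding the limit and offset as prepared-statement parameters also avoids building the SQL by string interpolation.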
Upvotes: 0
Reputation: 425311
You need to use AJAX.
MySQL supports server-side cursors, which you can enable by calling attr_set(MYSQLI_STMT_ATTR_CURSOR_TYPE, MYSQLI_CURSOR_TYPE_READ_ONLY) on a prepared statement. However, for this to work you need to run PHP on an application server that persists the connection between the AJAX calls. If you can do this, make your query a prepared statement, pass a session token in the AJAX calls to identify the statement, and fetch 1000 (or however many) rows each time you receive an AJAX request.
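A minimal sketch of the cursor part of this answer, using mysqli. The database name `my_db`, the credentials, and the `streamBatch()` helper are assumptions; the session-token and persistent-connection machinery the answer requires is omitted.

```php
<?php
// Sketch only: enable a read-only server-side cursor on a prepared
// statement and stream one batch of rows. attr_set() must be called
// after prepare() and before execute().
function streamBatch(mysqli $mysqli, int $batchSize): void {
    $stmt = $mysqli->prepare('SELECT id, name FROM my_table ORDER BY id');
    // Ask the server to keep the result set on its side (cursor)
    // instead of sending all 100,000 rows at once.
    $stmt->attr_set(MYSQLI_STMT_ATTR_CURSOR_TYPE, MYSQLI_CURSOR_TYPE_READ_ONLY);
    $stmt->execute();
    $stmt->bind_result($id, $name);

    $fetched = 0;
    while ($fetched < $batchSize && $stmt->fetch()) {
        echo $name, "<br>(done)<br>";
        $fetched++;
    }
    $stmt->close();
}

// In the real setup the statement would be prepared once and kept open
// across AJAX calls; here each call re-prepares it, so as written this
// only ever delivers the first batch.
```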
If you can't do this, you can provide the page number and pass it into the LIMIT clause:
SELECT *
FROM my_table
ORDER BY id
LIMIT $pageSize
OFFSET $offset
with $offset = $pageNum * $pageSize computed in PHP, since MySQL does not accept arithmetic expressions in LIMIT/OFFSET.
However, if the table is changed between the calls, this may break the ordering.
Upvotes: 3
Reputation: 12727
Ajax is very easy to learn.
Here you go: php_ajax_database
You can use LIMIT to restrict the number of rows you fetch.
Maintain a counter on the client side and send it whenever you want the next set of rows; the server then fetches the rows based on that counter.
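The counter loop this answer describes can be sketched on the client side like this. The endpoint name `fetch_rows.php`, its `offset`/`limit` parameters, and the JSON response shape are assumptions, not a real API.

```javascript
// Sketch of the counter-based AJAX loop. The endpoint and its
// parameters are illustrative assumptions.
const PAGE_SIZE = 100;

function pageUrl(counter) {
  // The server is expected to use these values in a LIMIT/OFFSET query.
  return `fetch_rows.php?offset=${counter}&limit=${PAGE_SIZE}`;
}

async function loadNext(counter, render) {
  const response = await fetch(pageUrl(counter));
  const names = await response.json(); // assume a JSON array of names
  names.forEach(render);
  if (names.length === PAGE_SIZE) {
    // A full page means there may be more rows; advance the counter.
    return loadNext(counter + PAGE_SIZE, render);
  }
}

// Usage from the page:
// loadNext(0, name => {
//   document.body.insertAdjacentHTML('beforeend', `${name}<br>(done)<br>`);
// });
```

Rendering each page as it arrives gives the "name by name" streaming effect the question asks for, without holding 100,000 rows in one response.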
Upvotes: 1
Reputation: 1200
Make an AJAX call every second to pull back one row. Keep a counter: after each AJAX call completes, increment it. Before running the AJAX call, check whether the counter has reached 100,000; if not, run the call and send the counter to the PHP file, which uses WHERE id = $counter in its SQL statement.
Upvotes: 2