roshkattu

Reputation: 251

Best way to show large amount of data

What is the best way to handle a large amount of data entries on a web page?

Let's assume I have a database table with 5000 records containing song_name, author_name, song_id, posted_by. I want to build a playlist with all the songs on a single page. That page also has a player that plays songs according to the playlist entries shown on the page.

I have tried pulling all 5000 entries from that table and building a JavaScript object from them, then using that object to build the playlist, the in-playlist search, and so forth. But that takes a very large amount of resources (on the user's end) and a lot of page loading time (because there are a lot of entries!), and the page is very slow.

Is it better to load all the data into an object and paginate the playlist in JavaScript, 100 records at a time, or to get the results paginated from the database and just update the playlist? (Taking into consideration that if the player's shuffle button is activated, it may shuffle to ANY song in the user's database, not only the songs currently visible in the playlist.)
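To make the trade-off concrete, here is a minimal client-side sketch of the two pieces involved; the names `pageOf` and `shufflePick` are illustrative, not from any library. The key point is that shuffle only needs the total song count, not the full list:

```javascript
// Slice one page out of a preloaded playlist array (client-side pagination).
function pageOf(songs, page, pageSize) {
  const start = page * pageSize;
  return songs.slice(start, start + pageSize);
}

// Shuffle must be able to land on ANY song, so pick from the full
// count, not just the currently visible page. The resulting index can
// then be requested from the server if that song isn't loaded yet.
function shufflePick(totalCount) {
  return Math.floor(Math.random() * totalCount);
}
```

With server-side pagination, `pageOf` becomes a request for one page, and `shufflePick`'s result is sent to the server to fetch that one song.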

Upvotes: 6

Views: 10432

Answers (4)

derWilly

Reputation: 453

Use AJAX to load the data in steps of 100 (or more; just experiment).

Loop over your recordset and increase the offset each time:

<?php
// Fetch the playlist in chunks so each AJAX request stays small.
$step = 100;
$u_limit = 0;

$sql  = "SELECT song_id FROM $MySQL_DB.songs";
$data = mysql_query($sql, $dblk);
$number_of_entries = mysql_num_rows($data);

while ($u_limit < $number_of_entries)
{
    // LIMIT takes an offset and a row count, so the second value is $step.
    $sql  = "SELECT * FROM $MySQL_DB.songs ORDER BY song_id DESC LIMIT $u_limit, $step";
    $data = mysql_query($sql, $dblk);
    while ($row = mysql_fetch_array($data))
    {
        // build an array and return it to the AJAX call
    }
    $u_limit += $step;
}
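The client-side counterpart can be sketched like this; `fetchChunk` is a stand-in for the actual AJAX call to a hypothetical endpoint such as `songs.php?offset=...&limit=...` (the synchronous form is just for illustration):

```javascript
// Request the playlist in steps of `step` records and append each
// chunk as it arrives, so the page stays responsive while loading.
function loadPlaylist(fetchChunk, total, step = 100) {
  const playlist = [];
  for (let offset = 0; offset < total; offset += step) {
    // In a real page this would be an async request, and each chunk
    // would be rendered into the playlist as soon as it returns.
    playlist.push(...fetchChunk(offset, step));
  }
  return playlist;
}
```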

Upvotes: 1

tech_me

Reputation: 550

It is very hard to load the "entire" data set into the client program even if you are using jQuery or another library, because the key factor is not the code/SDK you use but the browser itself!
By the way, Chrome is the fastest and IE (before version 10) is the slowest.

You can refer to the links below:
http://www.infoq.com/news/2010/09/IE-Subsystems-Spends-Time
http://www.nczonline.net/blog/2009/01/05/what-determines-that-a-script-is-long-running/
http://www.zdnet.com/browser-benchmarks-ie-firefox-opera-and-safari-3039420732/
http://msdn.microsoft.com/en-us/library/Bb250448
http://support.microsoft.com/kb/175500/en-us

So what you should do is move your client logic to the server side, as others are suggesting.

As for paginating all of your data purely in JavaScript: in essence that is the same as not paginating at all, since everything still has to be loaded up front.

Upvotes: 1

Chris Visser

Reputation: 1647

I think pagination is your best option. Just use a limit of 100 (for example) and use AJAX to fetch the next 100. If the client turns on shuffle, send another request to the server and have it call a function that does the following:

  1. Count total rows in database
  2. Use a randomize function to get 100 random numbers
  3. Now create a slightly tricky query to get records from the db based on their rownumber:

function getRandomTracks($limit) {

  $total = $this->db->count_all('table_tracks');

  // Get random row numbers. Speed optimization: predetermine the
  // random row numbers in PHP instead of using ORDER BY RAND()

  $arr = array();
  while (count($arr) < $limit) {
    $x = mt_rand(1, $total); // row numbers run from 1 to $total
    if (!isset($arr[$x])) { // each random value must be unique
      // using the random value as a key and checking with isset()
      // is faster than in_array()
      $arr[$x] = true;
    }
  }

  // Create the IN() string
  $in = implode(',', array_keys($arr));

  // Selection based on the random row numbers
  $query = $this->db->query('SELECT * FROM
      (SELECT @row := @row + 1 AS row, t.*
         FROM `table_tracks` t, (SELECT @row := 0) r) AS tracks
     WHERE `row` IN(' . $in . ')');

  return $query->result();

}

I'm using a similar function, also to deal with large amounts of tracks (over 300,000), so I'm sure this will work!
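The unique-random-number trick above translates directly to JavaScript if you ever need it on the client (a sketch; rows are assumed to be numbered 1..total, matching the `@row` counter in the query):

```javascript
// Pick `limit` unique random row numbers in the range 1..total.
function randomRowNumbers(total, limit) {
  // Use object keys for uniqueness: checking a key is faster than
  // scanning an array, the same idea as the isset() check above.
  const picked = {};
  while (Object.keys(picked).length < limit) {
    const x = 1 + Math.floor(Math.random() * total); // rows are 1..total
    picked[x] = true;
  }
  return Object.keys(picked).map(Number);
}
```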

Upvotes: 7

BlackCat

Reputation: 521

Try this: http://www.datatables.net/

I'm not sure, but it might work.

Upvotes: 0
