SBB

Reputation: 8970

AJAX - Large response locking up browser upon appending data

I am working on an application that fetches many records from a database and renders them to a table. This is done with an AJAX call, and the new rows are appended to the end of the results that are already there.

The number of records is variable in this case and could be 10 or 20,000 depending on what the search criteria return.

Now, first things first. I understand that having 20,000 records on the page isn't smart, but this was a requirement so that all the records can be seen, modified, or marked up from this page.

The AJAX call to the PHP page is fine and seems to work pretty quickly. The bottleneck, however, is appending that data to the DOM once it's received via AJAX.

In the code below, you will see the main function that creates the table rows and returns them to the AJAX call.

/*
  Create Table Rows
*/
function createTableRows($dashboardID){

  // Timer
  $start = microtime(true);

  // Dashboard
  $dashboardID = $_REQUEST['dashboardID'];
  $objDB = new DB;
  $objData = $objDB
    -> setStoredProc('RenderDashboard')
    -> setParam('dashboardID', $dashboardID)
    -> setParam('limit', $_REQUEST['limit'])
    -> setParam('offset', $_REQUEST['offset'])
    -> setParam('actor', '1234')
    -> execStoredProc()
    -> parseXML();

  // Fetch other data
  $markupData = fetchMarkup($dashboardID);
  $exportFields = fetchExportFields($dashboardID);
  $ignore = Array('identifierQID', 'identifierNTID');

  // Vars
  $outputArray = Array();
  $recordCount = 0;
  $i = 0;

  // Loop over our data
  foreach($objData->data as $r){

    $outputArray[$i++] = '<tr data-qid="'.$r->identifierQID.'" class="primaryValue ' . searchMarkup($markupData, $r->identifierQID) . '">';

    // Loop over our fields
    foreach($r as $key => $value){

      // Vars
      $fieldID = str_replace('_', '', $key);

      // Don't include our identifier columns
      if(!in_array($fieldID, $ignore)){
        $outputArray[$i++] = '<td data-tableexport-display="always" class="small' . ($exportFields ? (in_array($fieldID, $exportFields) ? ' hidden' : '') : '') . '">' . formatFieldData($fieldID, $value) . '</td>';
      }

    }

    // Notes always come last
    $outputArray[$i++] = '<td data-tableexport-display="always" class="notesTD allowContext hidden"></td>';

    $outputArray[$i++] = '</tr>';
    $recordCount++;

  }

  // Join our rows array and return it
  $end = microtime(true);
  $timer = number_format($end - $start, 2);
  return array(join("", $outputArray), $recordCount, $timer);
}

// This is what gets passed back to our AJAX call on the UI
echo createTableRows($dashboardID)[0];

Here is the JavaScript that processes the response it receives.

// Given data, create our table rows
function createRows(data) {

    // Update Progress Bar
    $('[name=progressDiv]').show();

    // Append the results to the DOM
    /* THIS IS WHAT IS KILLING THE SPEED!!!!*/
    $('[name=resultsTable]').append(data);

    // If our total number of records exceeds the threshold, we will be using the progress bar for the status
    if (totalRecords > maxThreshold) {
        $('[name=resultsProgress]').attr('aria-valuenow', currentPage / totalPages * 100)
            .css('width', currentPage / totalPages * 100 + '%')
            .text((currentPage < totalPages ? recordsPerPage * currentPage + ' of ' + totalRecords + ' records loaded' : 'Loaded ' + totalRecords + ' records!'));
    } else {
        // Loaded all records in one shot, update progress bar
        $('[name=resultsProgress]').attr('aria-valuenow', 100)
            .css('width', '100%')
            .text('Loaded ' + totalRecords + ' records!')
            .removeClass('active');
    }
    // Do we have more data to load?
    if (currentPage < totalPages && totalRecords > maxThreshold) {
      // Allow a little time for the progress to update before locking up
      setTimeout(function(){
        fetchMore();
      }, 100);

    }

    // After the table has been appended to the DOM, run clean up to enable any additional functionality
    cleanUp();
}

The Issue:

The problem is that the append is locking up the browser and causing it to be unresponsive until the append has completed. I already have this broken up so it fetches the data in batches, but that isn't the issue; it's handling the rows in the response.

The Question:

Is there a way to process the results in batches and append them without locking up the browser? The response itself is just a bunch of TRs that get appended to the TBODY of my table.

My last resort is having to page the results. If I can fix this bottleneck, I can convince them to do paging for the larger data sets.

I guess I am looking for a way to either return the results in a format that is faster to append, or break up the response and append it in batches while another AJAX call fetches more data to process.
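
Something like this rough sketch is what I am imagining for the batched append. To be clear, appendInChunks, the 500 row chunk size, and splitting the response on '</tr>' are placeholders I made up for illustration, not working code from my app:

// Rough sketch: split the returned rows into chunks and append one chunk
// per timer tick so the browser can repaint between appends.
function appendInChunks(html, chunkSize) {
    // one array entry per row; drop the empty string left after the final </tr>
    var rows = html.split('</tr>').filter(function (r) { return r.length; });
    var $table = $('[name=resultsTable]');
    var index = 0;

    function appendNextChunk() {
        if (index >= rows.length) {
            cleanUp();                      // everything is in, run clean up once
            return;
        }
        var chunk = rows.slice(index, index + chunkSize).join('</tr>') + '</tr>';
        $table.append(chunk);
        index += chunkSize;
        setTimeout(appendNextChunk, 0);     // yield so the browser stays responsive
    }

    appendNextChunk();
}

// instead of: $('[name=resultsTable]').append(data);
// appendInChunks(data, 500);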

Thoughts?

Upvotes: 0

Views: 1218

Answers (2)

Nadir Latif

Reputation: 3773

You can use a JavaScript table component to handle displaying large sets of JSON data. For example: https://clusterize.js.org/
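
For example, here is a rough sketch of how Clusterize could be wired up to a table like yours (the scrollArea/contentArea ids, the wrapper markup, and the rowStrings variable are assumptions for illustration; see the Clusterize docs for the exact options):

// Assumed markup: a scrollable wrapper around the table, e.g.
//   <div id="scrollArea" class="clusterize-scroll">
//     <table><tbody id="contentArea" class="clusterize-content"></tbody></table>
//   </div>

// Build the rows as an array of '<tr>...</tr>' strings instead of one big string
// and let Clusterize render only the rows that are currently scrolled into view.
var clusterize = new Clusterize({
    rows: rowStrings,        // e.g. ['<tr><td>User 1</td></tr>', ...]
    scrollId: 'scrollArea',
    contentId: 'contentArea'
});

// Later batches from the AJAX calls can be added with:
// clusterize.append(moreRowStrings);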

Upvotes: 1

Jonas Wilms

Reputation: 138277

You could push it to the browser's queue:

function createRows(data){
  var rows = JSON.parse(data);    // expects a JSON array of row strings
  rows.forEach(function(row){
    // each row gets its own task in the queue, so the browser can
    // handle other events in between
    setTimeout(function(){
      $('[name=resultsTable]').append("<tr>" + row + "</tr>");
    }, 0);
  });
}

Since each row is appended in its own setTimeout callback, the browser gets a chance to handle other work in between. This requires data to be a JSON array; you may just return an array of strings, which can then be handled:

["<td>User 1</td>","<td>User 1</td>","<td>User 1</td>"]

Solution two: completely generate the table on the server side and load it in an iframe:

<progress value="0" max="100" id="prog"></progress>
<iframe id="data" src="yourdata">
</iframe>

$("#data").on("load",function(){
  $("#prog").val(100);
 });

Upvotes: 0
