Pol_pm

Reputation: 75

DataTable server-side processing: large data response fails on more than 2500 records or so

I need to display a large amount of data in a DataTable on a Drupal page, using server-side processing. The PHP file returns a JSON array. It works (but very slowly) on up to 2500 records; on more than that the response fails. Also, when I add some icons instead of the text links to the JSON array (see the PHP code below), so they are displayed in the table next to the customer's data, the number of records the DataTable can handle drops to about 800.

Can it be because of the JSON array size?

How can I display about 5000 records in the DataTable and make sure it still works as the data grows? How can I optimize the whole thing to speed it up?

JavaScript:

var oTable = $('#mytable').DataTable({
  ajax: "ajax/myphp.php?i=" + i + "&a=",
  scrollY: 600,
  scroller: {
    loadingIndicator: true
  },
  paging: false
});

Here is what happens in myphp.php file:

<?php
    include '../includes/functions/config.php';
    include '../includes/functions/functions.php';

    if (isset($_GET['i'], $_GET['a'])) {
        $i = $_GET['i'];
        $a = $_GET['a'];
    }

    $output = array();

    $sql = "Select * from customers where a = $a and i = $i"; // returns about 2600-5500 records depending on the filtering, takes about 0.0010 seconds. If I limit the query to 2500 everything works fine.

    /*
    get query results
    loop through the result
    for each record
       - get first, last name, address, phone, email
       - add a few links like View, Edit, Payments, Files
       - run 2 small functions with a quick db query in each to get things like
         balance
    */

    $data = mysql_query($sql); // run the query so $data is defined below
    $num  = mysql_num_rows($data);

    while ($row = mysql_fetch_assoc($data)) {
        // do what I mentioned above
        $links = ""; // reset per row so links don't accumulate across records
        $links .= "<a href='edit_customer.php?id=" . $row['id'] . "'>Edit</a>  "; // I add about 5 similar links to each record
        $output[] = [$row['Last'], $row['First'], $row['Date'], $row['Phone'], $row['Email'], $links];
    }

    $json_data = array(
        "draw"            => 1,
        "recordsTotal"    => $num,
        "recordsFiltered" => $num,
        "data"            => $output
    );

    echo json_encode($json_data);

Thanks!

Upvotes: 0

Views: 924

Answers (1)

Jpec07

Reputation: 858

Your issue is here, where you write 'run 2 small functions with a quick db query in each to get things like balance'.

Putting a subsequent MySQL call inside a loop is generally a bad idea: two extra queries per row means the number of round trips grows linearly with the amount of data you're receiving, and the response time balloons with it. Instead, you should try to get as much data in a single trip to the database as you can, or else minimize your database calls. You can usually accomplish a lot using a JOIN, something along the lines of:

SELECT
    c.*,
    b.*
FROM
    customers AS c
LEFT OUTER JOIN
    balance AS b
    ON b.customerId = c.id
    /* You can also add other join logic in here to control the data coming back */
WHERE
    c.a = ?
AND c.i = ?;

Every query you run through MySQLi is a full round trip to the database server: transmitting the query, waiting for MySQL to process it, and waiting for the server to return the data before your script moves on. If you do that twice for every record in your 2500-record result, you're going to be hitting a LOT of overhead: 5001 separate queries from a single script! Joining your tables on the initial query cuts out all of that extra overhead and brings it back down to one single database call. You go in, you get all the data you need, and you get back out. This is one place where relational databases like MySQL really shine.
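
To make that concrete, here is a minimal sketch of the single-query version of the script, under a few assumptions: that your config.php exposes a mysqli connection as $conn, that the balance table keeps the figure in a balance column keyed by customerId, and that the mysqlnd driver is available for get_result(). Adjust the names to match your actual schema:

<?php
// Minimal sketch, not your exact schema: assumes config.php creates a
// mysqli connection in $conn, and that `balance` has a `balance` column
// keyed by customerId. get_result() requires the mysqlnd driver.
include '../includes/functions/config.php';

$stmt = $conn->prepare(
    "SELECT c.id, c.Last, c.First, c.Date, c.Phone, c.Email, b.balance
     FROM customers AS c
     LEFT OUTER JOIN balance AS b ON b.customerId = c.id
     WHERE c.a = ? AND c.i = ?"
);
$stmt->bind_param('ss', $_GET['a'], $_GET['i']); // values never touch the SQL string
$stmt->execute();
$result = $stmt->get_result();

$output = array();
while ($row = $result->fetch_assoc()) {
    // Build the links in PHP; no extra queries per row.
    $links = "<a href='edit_customer.php?id=" . $row['id'] . "'>Edit</a>";
    $output[] = [$row['Last'], $row['First'], $row['Date'],
                 $row['Phone'], $row['Email'], $row['balance'], $links];
}

echo json_encode(array(
    "draw"            => 1,
    "recordsTotal"    => $result->num_rows,
    "recordsFiltered" => $result->num_rows,
    "data"            => $output
));

Because the balance comes back with the join, the two helper functions (and their per-row queries) disappear from the loop entirely.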

For more info on joining tables, see here: https://dev.mysql.com/doc/refman/8.0/en/join.html

Note: I switched the straight string interpolation for bound parameters. It's a Very Bad Idea (TM) to put URL parameters directly into your SQL, since it opens you up to SQL injection, where someone can steal your data and wreak havoc on your data structures.
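
To see why, consider a hypothetical request (not one from the question): with direct string interpolation, a crafted value for i rewrites your WHERE clause, while a bound parameter is always treated as a plain value. Using the same assumed $conn handle as in the sketch above:

<?php
// Hypothetical attack that parameters prevent. A request such as
//   myphp.php?a=1&i=0 OR 1=1
// interpolated straight into the query becomes
//   Select * from customers where a = 1 and i = 0 OR 1=1
// which matches every row. Bound through prepare()/bind_param(),
// the same input is compared as the literal string "0 OR 1=1" instead.
$stmt = $conn->prepare("SELECT * FROM customers WHERE a = ? AND i = ?");
$stmt->bind_param('ss', $_GET['a'], $_GET['i']);
$stmt->execute();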

Upvotes: 1
