Reputation: 3075
I have a jQuery DataTable that can be provided with over 70K records. Unfortunately, the DataTable fails to load anything over 20K records. I am using the deferRender option as an attempted workaround, to no avail.
$.ajax({
    url: 'api/portmbs.php',
    type: 'POST',
    data: data,
    dataType: 'html',
    success: function(data, textStatus, jqXHR)
    {
        // The endpoint returns JSON as text, so parse it before handing it to DataTables
        var jsonObject = JSON.parse(data);
        var table = $('#example1').DataTable({
            "data": jsonObject,
            "columns": [
                {"data": "column_one"},
                {"data": "column_two"},
                // more columns...
            ],
            "pageLength": 25,    // modern name for iDisplayLength
            "order": [[ 1, "desc" ]],
            "paging": true,
            "scrollY": 550,
            "scrollX": true,
            "destroy": true,     // modern name for bDestroy
            "stateSave": true,
            "autoWidth": true,
            "deferRender": true
        });
    },
    error: function(jqXHR, textStatus, errorThrown)
    {
        // Log the failure before updating the error UI
        console.log('fail: ' + errorThrown);
        $('#loadingDiv').hide();
        $('#errorModal').modal('show');
        $('.message').text('There was an error conducting your search. Please try again.');
    }
});
Using the above, an error is triggered with the following message:
Failed to load resource: the server responded with a status of 500 (Internal Server Error)
When I add a 10,000-row limit to the query that generates the data, the DataTable renders successfully.
What am I missing to get the deferRender option to run successfully and defer the loading of 70K records?
Upvotes: 1
Views: 2197
Reputation:
$(document).ready(function() {
    $('#example').DataTable({
        serverSide: true,
        ordering: false,
        searching: false,
        // Fake server-side data source: builds only the rows the table
        // asks for, from data.start up to data.start + data.length
        ajax: function(data, callback, settings) {
            var out = [];
            for (var i = data.start, ien = data.start + data.length; i < ien; i++) {
                out.push([ i + '-1', i + '-2', i + '-3', i + '-4', i + '-5' ]);
            }
            // Simulate server latency, then hand back one page of rows
            setTimeout(function() {
                callback({
                    draw: data.draw,
                    data: out,
                    recordsTotal: 5000000,
                    recordsFiltered: 5000000
                });
            }, 50);
        },
        scrollY: 200,
        scroller: {
            loadingIndicator: true
        },
        stateSave: true
    });
});
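Note: the scroller option used above requires the DataTables Scroller extension to be loaded; with it, rows are created on demand as the user scrolls (Scroller enables deferRender internally), so only a small window of the 5,000,000 notional records ever exists in the DOM at once.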
Upvotes: 0
Reputation: 17190
I faced a similar issue some time ago; in my case the server error was caused by exceeding the PHP memory_limit setting. The default value (inside php.ini) is 128MB, so there is a good chance that the 70,000+ rows of data you are fetching overflow that limit.
As a temporary workaround, you can try increasing this limit by changing the configuration in the php.ini file on the server and then restarting the server. My current configuration is:
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
; XXX: Increased from 128 to 512.
memory_limit = 512M
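To confirm which value is actually in effect, you can check the output of phpinfo() or ini_get('memory_limit') on the server.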
You can read more about this at the following links:
(1) http://php.net/manual/en/ini.core.php#ini.memory-limit
(2) https://haydenjames.io/understanding-php-memory_limit/
As I said, the previous solution should be considered only a workaround. The real solution for this type of problem is server-side processing: on every pagination, ordering, or filtering action, the table posts back to the server, which handles the request and returns just the data needed to render the current page. There is an implementation of server-side processing available in the DataTables
examples; check it at the next link, and see the sketch after it:
(1) Server Side Processing Class Example
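A minimal client-side sketch of that approach, assuming your api/portmbs.php endpoint is rewritten to implement the DataTables server-side protocol (echo the draw parameter and return recordsTotal, recordsFiltered, and one page of rows):
$('#example1').DataTable({
    // Sketch only: assumes api/portmbs.php speaks the server-side protocol
    processing: true,    // show a "Processing..." indicator between requests
    serverSide: true,    // delegate paging, ordering and filtering to the server
    ajax: {
        url: 'api/portmbs.php',
        type: 'POST'
    },
    columns: [
        {"data": "column_one"},
        {"data": "column_two"}
        // more columns...
    ],
    pageLength: 25,
    order: [[ 1, "desc" ]]
});
With this setup the browser only ever receives 25 rows per request, so neither deferRender nor a raised memory_limit is needed to handle the full 70K records.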
Upvotes: 2