Reputation: 1674
I'm trying to use DataTables with server side processing to display 80,000 rows (paginated), but Laravel crashes with the error:
PHP Fatal error: Maximum execution time of 300 seconds exceeded in /Users/fin/Documents/dev/evway-new/vendor/nesbot/carbon/src/Carbon/Carbon.php on line 291
Here's my controller method where I'm fetching the rows:
public function data()
{
    $employees = Employee::get(); // ~80,000 employees

    return Datatables::of($employees)
        ->edit_column('created_at', function (Employee $employee) {
            return $employee->created_at->diffForHumans();
        })
        ->add_column('actions', function ($employee) {
            $actions = '<a href="'. route('admin.employees.show', $employee->id) .'"><i class="livicon" data-name="info" data-size="18" data-loop="true" data-c="#428BCA" data-hc="#428BCA" title="view employee"></i></a>
                <a href="'. route('admin.employees.edit', $employee->id) .'"><i class="livicon" data-name="edit" data-size="18" data-loop="true" data-c="#428BCA" data-hc="#428BCA" title="update employee"></i></a>';
            $actions .= '<a href="'. route('admin.employees.clone', $employee->id) .'" data-target="#clone"><i class="livicon" data-name="plus" data-size="18" data-loop="true" data-c="#f56954" data-hc="#f56954" title="clone employee"></i></a>';
            $actions .= '<a href="'. route('admin.employees.confirm-delete', $employee->id) .'" data-toggle="modal" data-target="#delete_confirm"><i class="livicon" data-name="remove" data-size="18" data-loop="true" data-c="#f56954" data-hc="#f56954" title="delete employee"></i></a>';

            return $actions;
        })
        ->make(true);
}
and here is my JS where I'm configuring DataTable:
$(function() {
    var table = $('#table').DataTable({
        processing: true,
        serverSide: true,
        ajax: '{!! route('admin.employees.data') !!}',
        columns: [
            { data: 'id', name: 'id' },
            { data: 'name', name: 'name' },
            { data: 'status', name: 'status' },
            { data: 'actions', name: 'actions', orderable: false, searchable: false }
        ]
    });
});
What am I doing wrong? Am I fetching the employees from the DB in the wrong way? Is get() a bad idea here?
Upvotes: 2
Views: 3384
Reputation: 1887
I haven't really worked with the DataTables plugin, but your very first statement is the culprit:
$employees = Employee::get(); // ~80,000 employees
This loads all 80k records into memory, and you then pass that collection to Datatables::of(). Instead, you should pass a query builder, i.e. Datatables::of(Employee::query()), which I assume will handle the pagination itself and load only the required number of employees.
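A minimal sketch of that change, assuming the yajra/laravel-datatables package implied by the question (same method names as the original controller, action column omitted for brevity):

```php
public function data()
{
    // Pass a query builder, not a loaded collection: the package can then
    // apply the DataTables page size, search, and ordering as SQL
    // LIMIT/OFFSET, WHERE, and ORDER BY clauses, so only one page of rows
    // is ever hydrated into Eloquent models.
    return Datatables::of(Employee::query())
        ->edit_column('created_at', function (Employee $employee) {
            return $employee->created_at->diffForHumans();
        })
        ->make(true);
}
```

Since the JS config already sets serverSide: true, DataTables is sending start/length parameters with every request; the query-builder path lets the package translate those into SQL instead of slicing an 80k-item collection in PHP.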
Upvotes: 5
Reputation: 213
I think you should increase the limits in your php.ini:
max_execution_time = 360 ; Maximum execution time of each script, in seconds (I CHANGED THIS VALUE)
max_input_time = 120 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 128M
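If editing php.ini isn't an option (e.g. shared hosting), the same limits can be raised per-request at runtime. Note this only hides the symptom; it doesn't reduce the cost of loading 80k rows:

```php
// Raise limits for this request only (a workaround, not a fix).
set_time_limit(360);              // seconds; 0 means no time limit
ini_set('memory_limit', '256M');  // raise if the collection still exhausts memory
```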
Upvotes: 0