Mike

Reputation: 1938

Why doesn't my PHP script process on the server when I have others that do?

I am using DataTables server-side processing for the tables in my reporting website. I am also using PHPExcel to export the reports to Excel. My PHP scripts that create the tables for viewing in the browser work fine, but when I try to get the PHPExcel script to work I get an out-of-memory error like:

[screenshot of the PHP out-of-memory error]

I can see that it's erroring in the same place that it was, and with the same error, as when I was trying to process the reports client side. That's why I moved the processing to the server. I've searched around, but I haven't seen anything about a situation like this.

Any ideas what I can try?

EDIT

When I raise the memory limit to 512 MB I get the same error, but at a different place:

[screenshot of the same error at a different location]

EDIT 2

This is not a repeat of the question mentioned below. I know that my server has the required memory; it uses it when presenting the data to the browser. It is only when I send that same data to the PHPExcel files that I have this problem. Someone suggested trying cell caching, but I don't know what that is in regard to PHPExcel; this is the first time I've tried using the library. I've even tried reducing the number of rows being sent to 11 instead of 12,000+, and I still get the error.

UPDATE

I think that I know some of what the problem is now after much testing.

The initial draw to fill the browser screen only pulls back what is needed for the screen: it starts at 25 rows and goes up to 150. When I try to export, it pulls all of it. The biggest report that I currently have is 65,000+ rows and 35 columns, though I have larger ones coming. I updated the memory limit with ini_set('memory_limit', '-1'); so that PHP will use whatever it needs, and it does.

Now it is timing out after 300 seconds (5 minutes). I tried fixing this with ini_set('MAX_EXECUTION_TIME', 10);, but that doesn't seem to do anything (I probably have it set up wrong; I'll do more research on that).
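(A minimal sketch of raising the timeout correctly: the directive name is lowercase max_execution_time and the value is the allowed run time in seconds, so passing 10 would actually shorten the limit rather than extend it. The 600-second value below is just an illustrative choice.)

<?php
// The ini directive is case-sensitive and lowercase; the value is the
// allowed run time in seconds. A value of 0 removes the limit entirely.
ini_set('max_execution_time', '600');

// set_time_limit() is the equivalent built-in helper.
set_time_limit(600);
?>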

Now, though, I need to find which cache method of the PHPExcel library will be the most efficient. I am currently using cache_in_memory_gzip, but I'm also testing the others as I read about them, since some seem possibly better. I'll make another update or answer the question when I complete my testing.
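(For anyone unfamiliar with cell caching in PHPExcel: it is configured through PHPExcel_Settings before the workbook object is created. A minimal sketch using the gzip method mentioned above; the commented-out cache_to_phpTemp alternative and its 32MB threshold are illustrative, not something from this post.)

<?php
require_once 'PHPExcel.php';

// Cell caching must be configured BEFORE the PHPExcel object is created.
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);

// Alternative: spill cell data to php://temp once it exceeds a memory threshold.
// $cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
// PHPExcel_Settings::setCacheStorageMethod($cacheMethod, array('memoryCacheSize' => '32MB'));

$objPHPExcel = new PHPExcel();
?>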

Upvotes: -2

Views: 54

Answers (1)

Mike

Reputation: 1938

I found my problem. I was not correctly saving the sort and filter criteria from the original AJAX request, so all data from the server table was being returned instead of the limited result set that was being shown on the screen.

I was trying to set a cookie like this: setcookie("KeepPost", $PostKept, time() + (60*60*24*7));. But I have discovered that there is a limit (about 4 KB) to what can be stored in a browser cookie. I was exceeding it; the one I was testing was 7,800+ bytes, so the cookie was never set. This caused my PHP files to fall back to the original AJAX request, which pulled all the data without any filters.

Once I realized this, I updated my code to send the info I need to the server, where the limit is much higher. Then I just pull it back when I'm ready to use it, and everything else works fine. Here's what my KeepPost.php file looks like now:

<?php

if (isset($_POST['draw']))
{
    include 'DBConn.php'; // provides $conn (PDO connection)

    // Keep the full DataTables request, but override 'length' so the
    // export pulls every row matching the current sort/filter criteria.
    $KeepPost = $_POST;
    $KeepPost['length'] = -1;
    $PostKept = serialize($KeepPost);

    // Bind the serialized value instead of concatenating it into the SQL,
    // which avoids quoting problems (and SQL injection) with user input.
    $TSQL = "UPDATE PostKept SET Value = ?";
    $sth = $conn->prepare($TSQL);
    $sth->execute(array($PostKept));
}

// Echo an array back to the caller in serialized form.
function Convert_From_Array($array)
{
    echo serialize($array);
}
?>
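
When the export script needs the criteria back, it reads the stored row and unserializes it. A minimal sketch of that retrieval side, assuming the same $conn PDO connection from DBConn.php and the single-row PostKept table used above:

<?php
include 'DBConn.php'; // provides $conn (PDO connection)

// Fetch the stored DataTables request and restore it to an array.
$sth = $conn->prepare("SELECT Value FROM PostKept");
$sth->execute();
$row = $sth->fetch(PDO::FETCH_ASSOC);

$KeptPost = unserialize($row['Value']);
// $KeptPost now holds the original sort/filter criteria with length = -1.
?>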

Now that I'm not trying to pull back 150k+ rows, I'm able to create the Excel file and download it to the client without a problem (it only takes about 30 seconds).

Upvotes: 0
