Stuart Lacy

Reputation: 2003

PHP having issues saving "large" files

I've got a program that takes 3 arrays (which are the same length and can contain 500 items or so) and writes them to a text file.

However, I'm having an issue writing larger files. The arrays hold the coordinates and timestamps of a canvas drawing app, so I can control the length. I've found that once files get larger than 2 MB they don't save; the largest file I've managed to save is 2.18 MB. From a related question, PHP: Having trouble uploading large files, I've determined that the cause is most likely that this is hosted on a free hosting server. I've looked at phpinfo() and here are the 4 relevant values:

memory_limit 16M
max_execution_time 30
upload_max_filesize 5M
post_max_size 5M
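
For reference, the same values can also be read from within a script via ini_get(), e.g. (just a quick diagnostic snippet, not part of the app):

// Quick diagnostic: print the limits relevant to this problem
foreach (array('memory_limit', 'max_execution_time', 'upload_max_filesize', 'post_max_size') as $key) {
    echo $key . ' = ' . ini_get($key) . "\n";
}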

Here is the relevant writing code:

// retrieve data from the JS
$x_s = $_GET['x_coords'];
$y_s = $_GET['y_coords'];
$new_line = $_GET['new_lines'];
$times = $_GET['time_stamps'];

print_r($_GET);
$randInt = rand(1,1000);

// first want to open a file
$file_name = "test_logs/data_test_" . $randInt . ".txt";
$file_handler = fopen($file_name, 'w') or die("Couldn't connect");

// For loop to write the data
for ($i = 0; $i < count($x_s); $i++) {
    // If this point starts a new line, close the previous one and open a new one
    if (!$new_line[$i]) {
        if ($i != 0) {
            // Not the first line, so close the previous one
            fwrite($file_handler, "LINE_END\n");
        }
        fwrite($file_handler, "LINE_START\n");
    }

    // Write the x coord, y coord, timestamp
    fwrite($file_handler, $x_s[$i] . ", " . $y_s[$i] . ", " . $times[$i] . "\n");

    // If this is the last point, write the final LINE_END
    if ($i == (count($x_s) - 1)) {
        fwrite($file_handler, "LINE_END\n");
    }
}

fclose($file_handler);

I've set up a PHP server on my localhost and have access to the error log. This is what I am getting:

[Fri Mar 23 20:03:02 2012] [error] [client ::1] request failed: URI too long (longer than 8190)

PROBLEM RESOLVED: The issue was that I was using GET to send large amounts of data, which gets appended to the URI. Once the URI exceeded 8190 characters the request failed. Using POST solves this.
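
Roughly, the server-side change looks like this (assuming the JS client is switched to send the same fields in a POST request, e.g. an XMLHttpRequest with method "POST"; the field names mirror the ones used above):

// Read the drawing data from the POST body instead of the query string
$x_s      = isset($_POST['x_coords'])    ? $_POST['x_coords']    : array();
$y_s      = isset($_POST['y_coords'])    ? $_POST['y_coords']    : array();
$new_line = isset($_POST['new_lines'])   ? $_POST['new_lines']   : array();
$times    = isset($_POST['time_stamps']) ? $_POST['time_stamps'] : array();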

Upvotes: 0

Views: 770

Answers (4)

chiborg

Reputation: 28094

upload_max_filesize and post_max_size determine the maximum size of data that can be posted. But this is probably not your problem since some of the data is written (if you reach the data limit, the script does not execute).

Your script has two restrictions: max_execution_time and memory_limit. Have a look at your Apache error log file to see if you are getting an error message (saying which limit is reached).

You can also try logging inside the for loop to see the progression of time and memory usage:

if (($i % 100) == 0) { // log every 100 entries
    error_log(date("H:i:s ") . memory_get_usage(true) . " bytes used\n", 3, 'test.log');
}

It may also be that the Suhosin patch is preventing you from sending too many data points: http://www.adityamooley.net/blogs/2012/01/09/php-suhosin-and-post-data/
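
If Suhosin is loaded, you can dump its request limits from the script itself; a rough sketch (directive names assume a standard Suhosin build):

// Log the Suhosin limits that commonly truncate large GET/POST requests
if (extension_loaded('suhosin')) {
    error_log('suhosin.get.max_value_length: ' . ini_get('suhosin.get.max_value_length') . "\n", 3, 'test.log');
    error_log('suhosin.post.max_vars: ' . ini_get('suhosin.post.max_vars') . "\n", 3, 'test.log');
    error_log('suhosin.request.max_vars: ' . ini_get('suhosin.request.max_vars') . "\n", 3, 'test.log');
}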

Upvotes: 1

Bartosz Kowalczyk

Reputation: 1519

1) Check max_input_time:

ini_set('max_input_time', 50);

2) Check in phpinfo() whether you have the Suhosin patch.

You should look at the Apache error_log - it should tell you which limit is being reached.

Try

ini_set('error_reporting', E_ALL);
error_reporting(E_ALL);
ini_set('log_errors', true);
ini_set('html_errors', false);
ini_set('error_log', dirname(__FILE__) . '/script_error.log');
ini_set('display_errors', true);

Upvotes: 1

Ed Heal

Reputation: 60007

PHP (and hence the web server) is protecting itself. Perhaps use a different mechanism to upload large files - I would imagine they come from known (and trusted) sources - for example SFTP.
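
For example, a rough sketch with the ssh2 extension (host, credentials and paths are placeholders; phpseclib is an alternative if ssh2 isn't available):

// Upload the generated file over SCP/SFTP instead of posting it through HTTP
$conn = ssh2_connect('example.com', 22);
ssh2_auth_password($conn, 'username', 'password');
ssh2_scp_send($conn, '/local/path/data_test.txt', '/remote/path/data_test.txt', 0644);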

Upvotes: 0

Leri

Reputation: 12535

Maybe the script exceeds the max execution time.

Add this

set_time_limit(0);

at the beginning of your code.

Upvotes: 1
