Reputation: 2549
My PHP script is receiving large data (100 - 500 MB) from a client. I want my PHP script to run fast, without using too much memory.
To save traffic, I don't use Base64 or form data. I send binary data directly in a POST request.
The data consists of two parts: a 2000-byte header, and the rest, which has to be stored as a file on the server.
$fle = file_get_contents("php://input", FALSE, NULL, 2000);
file_put_contents("file.bin", $fle);
The problem is that file_get_contents ignores the offset parameter and reads the data from byte 0. Is there a better way to do this?
I don't want to read the whole request body into memory and then keep only the last N-2000 bytes, as I am afraid it would use too much memory.
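For clarity, the memory-heavy variant I want to avoid would look roughly like this (the entire request body ends up in a PHP string before the header is stripped):

// Works, but buffers the whole 100-500 MB upload in memory at once
$all = file_get_contents("php://input");
file_put_contents("file.bin", substr($all, 2000));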
Upvotes: 1
Views: 2068
Reputation: 32232
Use the lower-level file IO functions and read/write a little bit at a time.
$bufsz = 4096;                     // copy in 4 kB chunks
$fi = fopen("php://input", "rb");  // request body
$fo = fopen("file.bin", "wb");     // destination file
fseek($fi, 2000);                  // skip the 2000-byte header
while (!feof($fi)) {
    $buf = fread($fi, $bufsz);
    if ($buf === false || $buf === '') {
        break;                     // read error or end of stream
    }
    fwrite($fo, $buf);
}
fclose($fi);
fclose($fo);
This will read/write in 4kB chunks.
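If you prefer to avoid the manual loop, one variation (a sketch, not part of the code above) is PHP's built-in stream_copy_to_stream(), which does the chunked copy internally. Reading and discarding the header with fread() instead of fseek() also avoids relying on php://input being seekable, which may depend on the PHP version and SAPI:

$fi = fopen("php://input", "rb");
$fo = fopen("file.bin", "wb");

// Consume and discard the 2000-byte header without seeking;
// fread() may return fewer bytes than requested, so loop until done.
$remaining = 2000;
while ($remaining > 0 && !feof($fi)) {
    $chunk = fread($fi, $remaining);
    if ($chunk === false || $chunk === '') {
        break;
    }
    $remaining -= strlen($chunk);
}

// Copy everything after the header to the file in internal chunks
stream_copy_to_stream($fi, $fo);

fclose($fi);
fclose($fo);

Either way, only one small chunk is held in memory at a time.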
Upvotes: 3