Reputation: 33
I've been trying to create a non-Flash upload panel that also shows a progress bar. Our server runs PHP 5.3 (we cannot upgrade to 5.4 for now, so the new session upload progress feature cannot be used => http://php.net/manual/en/session.upload-progress.php). We cannot use Flash-based solutions, browser extensions or similar.
So I've tried an AJAX approach based on XMLHttpRequest, but I've only had partial success.
I've managed to upload and save a file of about 380 MB on the server; however, with a larger file of around 4 GB it is not saved on the server (watching in Firebug, at some point it reports "POST aborted").
Another strange thing is that, with that same large file, xhr.upload.loaded starts at the same value as xhr.upload.total and counts up from there.
Does anyone know how to solve this problem, or have an alternative solution?
The client code is:
<script type="application/javascript" src="jquery.js"></script>
<script type="application/javascript">
function uploadToServer()
{
fileField = document.getElementById("uploadedFile");
var fileToUpload = fileField.files[0];
var xhr = new XMLHttpRequest();
var uploadStatus = xhr.upload;
uploadStatus.addEventListener("progress", function (ev) {
if (ev.lengthComputable) {
$("#uploadPercentage").html((ev.loaded / ev.total) * 100 + "%");
}
}, false);
uploadStatus.addEventListener("error", function (ev) {$("#error").html(ev)}, false);
uploadStatus.addEventListener("load", function (ev) {$("#error").html("APPOSTO!")}, false);
xhr.open(
"POST",
"serverUpload.php",
true
);
xhr.setRequestHeader("Cache-Control", "no-cache");
xhr.setRequestHeader("Content-Type", "multipart/form-data");
xhr.setRequestHeader("X-File-Name", fileToUpload.fileName);
xhr.setRequestHeader("X-File-Size", fileToUpload.fileSize);
xhr.setRequestHeader("X-File-Type", fileToUpload.type);
//xhr.setRequestHeader("Content-Type", "application/octet-stream");
xhr.send(fileToUpload);
}
$(function(){
$("#uploadButton").click(uploadToServer);
});
</script>
HTML part:
<form action="" name="uploadForm" method="post" enctype="multipart/form-data">
<input id="uploadedFile" name="fileField" type="file" multiple />
<input id="uploadButton" type="button" value="Upload!">
</form>
<div id="uploadPercentage"></div>
<div id="error"></div>
Server side code:
<?php
$path = "./";
$filename = $_SERVER['HTTP_X_FILE_NAME'];
$filesize = $_SERVER['CONTENT_LENGTH'];
$file = "log.txt";
$fo= fopen($file, "w");
fwrite($fo, $path . PHP_EOL);
fwrite($fo, $filename . PHP_EOL);
fwrite($fo, $filesize . PHP_EOL);
fwrite($fo, $path . $filename . PHP_EOL);
file_put_contents($path . $filename,
file_get_contents('php://input')
);
?>
Upvotes: 3
Views: 18280
Reputation: 6110
I tried uploading a 4 GB video file using Ajax, and it was a success. Here is my code.
HTML ::
<form enctype="multipart/form-data" method="post">
<input type="file" id="video_file" name="video_file" accept=".mp4, .avi, .mkv">
<input type="submit" class="btn btn-success" id="video-upload-btn" name="video_upload_btn" value="Upload">
<div class="video-bar">
<span class="video-bar-fill" id="video-bar-fill-id"><span class="video-bar-fill-text" id="video-bar-fill-text-id"></span></span>
</div>
</form>
CSS ::
.video-bar{
width: 100%;
background: #eee;
padding: 3px;
margin-bottom: 10px;
box-shadow: inset 0 1px 3px rgba(0,0,0,0.2);
border-radius: 3px;
box-sizing: border-box;
}
.video-bar-fill{
height: 20px;
display: block;
background: cornflowerblue;
width: 0;
border-radius: 3px;
transition: width 0.8s ease;
}
.video-bar-fill-text{
color: #fff;
padding: 3px;
}
Ajax ::
<script type="text/javascript">
var app = app || {};
(function(video_op){
"use strict";
var video_ajax, video_getFormData, video_setProgress;
video_ajax = function(data){
var xmlhttp = new XMLHttpRequest(), uploaded;
xmlhttp.addEventListener('readystatechange', function(){
if(this.readyState==4){
if(this.status==200){
uploaded = JSON.parse(this.response);
console.log(uploaded);
if(typeof video_op.options.finished==='function'){
video_op.options.finished(uploaded);
}
} else {
if(typeof video_op.options.error === 'function'){
video_op.options.error();
}
}
}
});
xmlhttp.upload.addEventListener("progress", function(event){
var percent;
if(event.lengthComputable===true){
percent = Math.round((event.loaded / event.total) * 100);
video_setProgress(percent);
}
});
if(video_op.options.videoProgressBar!==undefined){
video_op.options.videoProgressBar.style.width=0;
}
if(video_op.options.videoProgressText!==undefined){
video_op.options.videoProgressText.innerText=0;
}
xmlhttp.open("post", video_op.options.videoProcessor);
xmlhttp.send(data);
};
video_getFormData = function(source1){
var data = new FormData(), i;
for(i=0;i<source1.length; i++){
data.append('video_file', source1[i]);
}
data.append("ajax", true);
return data;
};
video_setProgress = function(value){
if(video_op.options.videoProgressBar!==undefined){
video_op.options.videoProgressBar.style.width = value? value+"%":0;
}
if(video_op.options.videoProgressText!==undefined){
video_op.options.videoProgressText.innerText=value?value+"%":0;
}
};
video_op.videouploader = function(options){
video_op.options = options;
if(video_op.options.videoFiles !== undefined){
var videoFormDataValue = video_getFormData(video_op.options.videoFiles.files);
video_ajax(videoFormDataValue);
}
}
}(app));
document.getElementById("video-upload-btn").addEventListener("click", function(e){
e.preventDefault();
document.getElementById("video-upload-btn").setAttribute("disabled", "true");
var videof = document.getElementById('video_file'),
videopb = document.getElementById('video-bar-fill-id'),
videopt = document.getElementById('video-bar-fill-text-id');
app.videouploader({
videoFiles: videof,
videoProgressBar: videopb,
videoProgressText: videopt,
videoProcessor: "upload.php",
finished: function(data){
console.log(data);
},
error: function(){
console.log("error");
}
});
});
</script>
SERVER SIDE ::
<?php
if (!empty($_FILES["video_file"]))
{
    // error === UPLOAD_ERR_OK (0) means the upload itself succeeded
    if ($_FILES["video_file"]["error"] === UPLOAD_ERR_OK)
    {
        if (move_uploaded_file($_FILES["video_file"]["tmp_name"], __DIR__ . "/" . $_FILES["video_file"]["name"]))
        {
            echo json_encode("success"); // JSON, because the client calls JSON.parse() on the response
        }
        else
        {
            echo json_encode("failed");
        }
    }
    else
    {
        echo json_encode("error");
    }
}
?>
Also change the PHP ini values listed below.
If you are on Linux/Ubuntu, follow these steps.
Open the php.ini file:
sudo nano /etc/php5/apache2/php.ini
Update these values-
post_max_size = 6000M
upload_max_filesize = 6000M
Restart Apache:
sudo /etc/init.d/apache2 restart
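After the restart, you can confirm the values PHP actually runs with, and spot other limits such as memory_limit or max_execution_time that can also kill long uploads. This is only a minimal standalone check, not part of the upload script:
<?php
// Print the ini settings that most often abort large uploads;
// run this on the server that receives the upload.
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time', 'max_input_time') as $key) {
    echo $key . ' = ' . ini_get($key) . PHP_EOL;
}
?>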
Upvotes: 0
Reputation: 404
Others have already pointed out that there are limits you will run into on any properly configured production PHP server: memory, POST, and file-size maximums, to start with. The httpd service usually restricts these as well.
The answer for uploads this large is to cut the file into chunks and send each chunk in a separate PUT or POST request (depending on the browser).
There is already a library capable of chunked file uploads, so I will use it as an example. To support chunked uploads, its upload handler makes use of the Content-Range header, which is transmitted by the plugin for each chunk.
The handle_file_upload function in the UploadHandler class is a good example of how to handle a chunked file upload on the server side with PHP. -- https://github.com/blueimp/jQuery-File-Upload/blob/master/server/php/UploadHandler.php
function handle_file_upload($uploaded_file, $name, $size, $type, $error,
$index = null, $content_range = null)
The function takes the argument $content_range = null, which is passed to the server in an HTTP header and retrieved from $_SERVER['HTTP_CONTENT_RANGE'].
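For reference, here is a minimal sketch of how that header could be read and split into its parts (this is not the library's exact code, and the variable names are only illustrative):
<?php
// Illustrative only: turn a header like "bytes 1000000-1999999/4294967296"
// into start, end and total byte counts.
$range = isset($_SERVER['HTTP_CONTENT_RANGE']) ? $_SERVER['HTTP_CONTENT_RANGE'] : null;
if ($range !== null && preg_match('/(\d+)-(\d+)\/(\d+)/', $range, $m)) {
    $start = (int) $m[1]; // first byte of this chunk
    $end   = (int) $m[2]; // last byte of this chunk
    $total = (int) $m[3]; // full size of the file being uploaded
}
?>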
Later we need to find out whether we will be appending this upload to a file that already exists, so we set a variable: if $content_range is not NULL, the file already exists on the server, and the file size reported in the HTTP request is larger than the actual size of the file on the server, then this chunk needs to be appended to the existing file.
$append_file = $content_range && is_file($file_path) &&
$file->size > $this->get_file_size($file_path);
Great! Now what?
So now we need to know how we are receiving the data. Older versions of Firefox can't use multipart/form-data (POST) for chunked file uploads, so those requests need to be handled differently on both the client and the server side.
if ($uploaded_file && is_uploaded_file($uploaded_file)) {
// multipart/formdata uploads (POST method uploads)
if ($append_file) {
// append to the existing file
file_put_contents(
$file_path,
fopen($uploaded_file, 'r'),
FILE_APPEND
);
} else {
// this is a new chunked upload OR a completed single part upload,
// so move the file from the temp directory to the uploads directory.
move_uploaded_file($uploaded_file, $file_path);
}
}
According to the documentation: Chunked file uploads are only supported by browsers with support for XHR file uploads and the Blob API, which includes Google Chrome and Mozilla Firefox 4+ -- https://github.com/blueimp/jQuery-File-Upload/wiki/Chunked-file-uploads
For chunked uploads to work in Mozilla Firefox 4-6 (XHR upload capable Firefox versions prior to Firefox 7), the multipart option also has to be set to false. Here is the code to handle those cases on the server side.
else {
// Non-multipart uploads (PUT method support)
file_put_contents(
$file_path,
fopen('php://input', 'r'),
$append_file ? FILE_APPEND : 0
);
}
And finally we can verify that the upload is complete, or discard a canceled upload.
$file_size = $this->get_file_size($file_path, $append_file);
if ($file_size === $file->size) {
$file->url = $this->get_download_url($file->name);
if ($this->is_valid_image_file($file_path)) {
$this->handle_image_file($file_path, $file);
}
} else {
$file->size = $file_size;
if (!$content_range && $this->options['discard_aborted_uploads']) {
unlink($file_path);
$file->error = $this->get_error_message('abort');
}
}
On the client side you will need to keep track of the chunks. After each piece is posted we send the next part, until there are no chunks left. The example library is a jQuery plugin, which makes this very simple; using bare XHR objects as you are, it will take a little more code. It might look something like this:
var chunksize = 1000000; // 1 MB per chunk
// number of chunks = total file size divided by chunk size, rounded up
var chunks = Math.ceil(fileToUpload.size / chunksize);

function uploadChunk(fileToUpload, chunk) {
    var xhr = new XMLHttpRequest();
    var uploadStatus = xhr.upload;
    var start = chunksize * chunk;       // first byte of this chunk
    var end = start + (chunksize - 1);   // last byte of this chunk (inclusive)
    if (end >= fileToUpload.size) {
        end = fileToUpload.size - 1;
    }
    uploadStatus.addEventListener("progress", function (ev) {
        if (ev.lengthComputable) {
            // overall progress = bytes of previous chunks + bytes sent of this chunk
            var percent = Math.round(((start + ev.loaded) / fileToUpload.size) * 100);
            $("#uploadPercentage").html(percent + "%");
        }
    }, false);
    uploadStatus.addEventListener("error", function (ev) { $("#error").html(ev); }, false);
    xhr.addEventListener("load", function () {
        if (chunk + 1 < chunks) {
            uploadChunk(fileToUpload, chunk + 1); // the server has answered, send the next chunk
        } else {
            $("#error").html("APPOSTO!");         // all chunks uploaded
        }
    }, false);
    xhr.open("POST", "serverUpload.php", true);
    xhr.setRequestHeader("Cache-Control", "no-cache");
    // one raw slice of the file is sent per request, so this is not multipart/form-data
    xhr.setRequestHeader("Content-Type", "application/octet-stream");
    xhr.setRequestHeader("X-File-Name", fileToUpload.name);
    xhr.setRequestHeader("X-File-Size", fileToUpload.size);
    xhr.setRequestHeader("X-File-Type", fileToUpload.type);
    xhr.setRequestHeader("Content-Range", "bytes " + start + "-" + end + "/" + fileToUpload.size);
    // slice() treats "end" as exclusive, so end + 1 covers bytes start..end
    xhr.send(fileToUpload.slice(start, end + 1));
}

// start with the first chunk; each completed chunk triggers the next one
uploadChunk(fileToUpload, 0);
Each completed chunk triggers the upload of the next range until the whole file has been sent. Note that the Content-Range header value is in the format bytes start-end/size. The range starts at 0, so "end" can be at most 1 less than "size". You can also use the range "start-" to indicate that the range extends from "start" to the end of the file.
EDIT:
Just thought I'd add that this also makes it possible to implement a progress bar on servers where it is not otherwise possible for single-file uploads. Since you know the size of each chunk and the status of each request, you can update a progress bar as each chunk goes through (the progress handler above does this by adding the bytes of the chunks already sent to the bytes of the current one).
Also of note are the limitations of certain browsers: Chrome and Firefox should be able to handle a 4 GB file, but IE versions below 9 had a bug that prevented them from handling files larger than 2 GB.
Upvotes: 4
Reputation: 386
file_get_contents() reads the entire content of a file into a buffer in RAM. If you don't have enough RAM, or you are running a 32-bit build of Apache/PHP, it may crash when it tries to allocate that much memory.
You may want to try something like this instead:
$upload = fopen("php://input", "r");
while (!feof($upload)) {
file_put_contents($path . $filename, fread($upload, 4096), FILE_APPEND);
}
fclose($upload);
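A variant of the same idea, sketched under the assumption that $path and $filename are set as in the question's script, is to let PHP copy stream to stream so the body never has to fit in memory:
// Sketch: stream the raw request body straight into the target file.
$in  = fopen("php://input", "r");
$out = fopen($path . $filename, "w");
stream_copy_to_stream($in, $out);
fclose($out);
fclose($in);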
Cheers
Upvotes: 0
Reputation: 11
You can compare your code with this tutorial; it is very similar to yours and is able to upload files of any size. http://www.youtube.com/watch?v=pTfVK73CUk8
Upvotes: 1
Reputation: 1
I'm writing about the strange behavior of xhr.upload.loaded, which starts at a large number...
I have a similar problem and could not find the reason. The only clue that may help is that, depending on the ISP, the problem sometimes disappears. For example, when I test from home it works fine and I don't see this strange behavior, but over my work connection the problem remains.
Upvotes: 0
Reputation: 46
There are limits associated with the web server that can't be changed from PHP. For example, there is a default maximum POST request size of 30 MB in IIS. There is also a maximum timeout you may be hitting; that has nothing to do with the size, but with how long your POST request takes, i.e. how long the file submission takes. Both settings can be constrained by IIS or Apache.
Upvotes: 3