AJB

Reputation: 7592

Filepicker.io — Stop .pickMultiple From Automatically Storing File In S3 Bucket

I'm trying to use Filepicker.io as an uploader, but in order to fire an onSuccess event for each file in the payload, I'm using a combination of the .pickMultiple and .store methods. Like so:

    filepicker.pickMultiple(function(fpfiles){

        for(var i = 0; i < fpfiles.length; i++){

            //Clean the filename

            //Check duplicate

            //Store the file on S3
            filepicker.store(
               fpfiles[i].url,
               {location: 'S3', path: 'filepicker/' + fpfiles[i].filename},
               function(my_uploaded_file){
                  //Do some other cool stuff ...
               }
            );
        }               
    });

(This is as opposed to using the .pickAndStore method, which will only fire an onSuccess event after the entire payload has completed transmission.)
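As an aside, the "Clean the filename" placeholder in the snippet above could be a simple sanitizer. This is just a sketch; the rules here (collapse whitespace to underscores, strip anything outside A-Za-z0-9._-) are an assumption, not anything Filepicker requires:

```javascript
//Hypothetical "Clean the filename" helper (not part of the Filepicker API):
//collapses whitespace runs to underscores, then strips any character
//outside A-Za-z0-9, dot, underscore, and hyphen.
function cleanFilename(name){
    return name
        .trim()
        .replace(/\s+/g, '_')
        .replace(/[^A-Za-z0-9._-]/g, '');
}

//e.g. cleanFilename('My File (1).png') -> 'My_File_1.png'
```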

The problem I'm having with this is that it seems as though the .pickMultiple method is 'auto-magically' saving a copy of the file in the root of my S3 bucket, so I'm ending up with two copies of the same file.

For example:

If I upload my_file.png to a folder in my bucket called IMAGES, I should get the result of http://s3.amazonaws.com/my_bucket/IMAGES/my_file.png

Which is happening, but I'm also getting: http://s3.amazonaws.com/my_bucket/UNIQUE_ID_my_file.png

Anyone know how to prevent .pickMultiple from automatically adding the file to my S3 bucket?

Thanks for any help.

Upvotes: 1

Views: 639

Answers (1)

AJB

Reputation: 7592

For anyone else who may come across this same problem: the .pickMultiple() -> .store() approach is a dead end. The (only) way to get an onSuccess event to fire for each file in the payload is to use a vanilla <input type="file" /> onChange event to get the element's FILES array, then loop through that array and call .store() for each file.

The Example:

//Set your API key before any .store() calls
filepicker.setKey('MY_FP.IO_KEY');

$('#BTN_upload').change(function(){

    var files = $(this)[0].files;

    //So you can see what should be uploading
    console.log(JSON.stringify(files));

    //Loop the files array to store each file on S3
    for(var i = 0; i < files.length; i++){

        //All good. Now execute the .store call
        filepicker.store(

            //The file to upload
            files[i],

            //The file options
            //(I'm storing the files in a specific folder within my S3 bucket called 'my_folder')
            //This is also where you'll rename your file to whatever you'd like
            {path: 'my_folder/' + files[i].name},

            //OnSuccess
            function(FPFile){
                console.log("Store successful: ", JSON.stringify(FPFile));
                //Now possibly call .remove() to remove the 'temp' file from FP.io
            },

            //OnError
            function(FPError){
                console.log(FPError.toString());
            },

            //OnProgress
            function(progress){
                console.log("Loading: " + progress + "%");
            }
        );

    }

});

And the HTML:

<input id="BTN_upload" type="file" />

This example is not a finished product. You'll still have to roll your own user feedback (like a queue display with progress bars), duplicate checking, renaming, etc. But that's all pretty simple stuff.
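For the duplicate-checking and renaming pieces, one low-tech option is to track the names you've already stored in the current session and rename collisions before calling .store(). This is only a client-side sketch (it won't see files already sitting in the bucket; for that you'd need a server-side check), and uniqueName is a made-up helper, not a Filepicker function:

```javascript
//Hypothetical client-side duplicate check: 'taken' is a Set of names
//already stored this session; collisions get '_1', '_2', ... appended
//before the extension.
function uniqueName(name, taken){
    if(!taken.has(name)){ taken.add(name); return name; }
    var dot  = name.lastIndexOf('.');
    var base = dot > 0 ? name.slice(0, dot) : name;
    var ext  = dot > 0 ? name.slice(dot) : '';
    var i = 1, candidate;
    do {
        candidate = base + '_' + i + ext;
        i++;
    } while(taken.has(candidate));
    taken.add(candidate);
    return candidate;
}

//Usage inside the loop above, before the .store call:
//var path = 'my_folder/' + uniqueName(files[i].name, taken);
```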

NOTE: This is only for local-to-S3 uploads. I'm not sure how to integrate the other sources that FP.io has on tap. Maybe you do?

Upvotes: 1
