CQM

Reputation: 44220

Parse SDK saveAll promise chaining

Advice from a Parse developer forum said to "limit saveAll to 75 objects unless one wants saveAll to make its own batches", which by default are 20 objects, and to put these calls in a promise chain.

I need to do a saveAll promise chain where I don't know how many promises I need.

How would this be done?

I have an array of arrays. The sub-arrays are all of length 75. Each element of the master array needs to be passed to saveAll in its own promise.
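(For context, building that array of arrays is straightforward. This is a hypothetical helper, not part of the Parse SDK; `partition` and its parameters are illustrative names.)

```javascript
// Hypothetical helper: split a flat array of unsaved Parse objects
// into sub-arrays ("partitions") of at most `size` items each.
function partition(items, size) {
    var result = [];
    for (var i = 0; i < items.length; i += size) {
        result.push(items.slice(i, i + size));
    }
    return result;
}

// e.g. partition(arrayOf200Objects, 75) yields batches of 75, 75, and 50
```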

    // fragment of a Cloud Code job: collect one save promise per partition
    var savePromises = [];

    var partition;
    while ((partition = partitionedArray.pop()) != null) {
        savePromises.push(Parse.Object.saveAll(partition, {
            success: function(objs) {
                // objects have been saved...
            },
            error: function(error) {
                // an error occurred...
                status.error("something failed");
            }
        }));
    }

    return Parse.Promise.when(savePromises);
}).then(function() {
    // Set the job's success status
    status.success("successful everything");

Upvotes: 2

Views: 559

Answers (1)

danh

Reputation: 62686

A nice way to do this is to build the chain of promises recursively. If you've already partitioned the objects that need saving into batches, then some of the work is done already.

// assume batches is [ [ unsaved_object0 ... unsaved_object74 ], [ unsaved_object75 ... unsaved_object149 ], ... ]
function saveBatches(batches) {
    if (batches.length === 0) { return Parse.Promise.as(); }
    var nextBatch = batches[0];
    return Parse.Object.saveAll(nextBatch).then(function() {
        var remainingBatches = batches.slice(1);  // everything after the first batch
        return saveBatches(remainingBatches);
    });
}
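The same sequential chain can also be folded up iteratively with `reduce`, which some prefer to recursion. A minimal sketch, using standard Promises for illustration (`Parse.Promise` follows the same then-able contract, with `Parse.Promise.as()` as the seed); `saveFn` is an illustrative parameter standing in for `Parse.Object.saveAll` so the sketch stays self-contained:

```javascript
// Fold the batches into one promise chain: each batch's save starts
// only after the previous batch's save has resolved.
function saveBatchesSequentially(batches, saveFn) {
    return batches.reduce(function(chain, batch) {
        return chain.then(function() {
            return saveFn(batch);
        });
    }, Promise.resolve());
}
```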

EDIT - To use this, just call it and handle the promise it returns...

function doAllThoseSaves() {
    var batches = // your code to build unsaved objects
    // don't save them yet, just create (or update) e.g....
    var MyClass = Parse.Object.extend("MyClass");
    var instance = new MyClass();
    // set, etc
    batches = [ [ instance ] ];  // see? not saved
    saveBatches(batches).then(function() {
        // the saves are done
    }, function(error) {
        // handle the error
    });
}

EDIT 2 - At some point, the transactions you want to do won't fit under the burst limit of the free tier, and, spread out (somehow), they won't fit within the timeout limit.

I've struggled with a similar problem. In my case, it's a rare, admin-facing migration. Rare enough, and invisible to the end user, that it has made me lazy about a solid solution. This is kind of a different question now, but a few ideas for a solid solution could be:

  1. see underscore.js _.throttle(), running from the client, to spread the transactions out over time
  2. run your own node server that throttles calls into parse similarly to (or exactly with) _.throttle().
  3. a parse scheduled job that runs frequently, taking a small bite at a time (my case involves an import file, so I can save it quickly initially, open it in the job, count the number of objects that I've created so far, scan accordingly into the file, and do another batch)
  4. my current (extra dumb, but functional) solution: admin user manually requests N small batches, taking care to space those requests ("one mississippi, two mississippi, ...") between button presses
  5. heaven forbid - hire another back-end, remembering that we usually get what we pay for, and parse -- even at the free-tier -- is pretty nice.
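Ideas 1 and 2 amount to inserting a pause between sequential saveAll calls. A rough sketch, again with standard Promises for illustration; `saveFn` and `delayMs` are illustrative parameters (with the Parse SDK, `saveFn` would be `Parse.Object.saveAll`), and the right delay depends on your burst limit:

```javascript
// Resolve after ms milliseconds.
function delay(ms) {
    return new Promise(function(resolve) { setTimeout(resolve, ms); });
}

// Chain the batches sequentially, pausing delayMs between saves
// to stay under a request burst limit.
function saveBatchesThrottled(batches, saveFn, delayMs) {
    return batches.reduce(function(chain, batch) {
        return chain.then(function() {
            return saveFn(batch).then(function() { return delay(delayMs); });
        });
    }, Promise.resolve());
}
```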

Upvotes: 1
