Reputation: 438
I am trying to upload a data range to Salesforce using the script below. As written, it processes only one line at a time (I call it from a for loop in another function), and because my data often exceeds 2000 lines, the script times out. I use a helper function to format the data range as JSON. Is it possible to push each JSON object into an array and pass that through my HTTP call in one go?
Upload to SF
function upsert(object, pl, id, idField, row) {
  var payload = tableJSON(pl);
  var sfService = getSfService();
  var userProps = PropertiesService.getUserProperties();
  var props = userProps.getProperties();
  var name = getSfService().serviceName_;
  var obj = JSON.parse(props['oauth2.' + name]);
  var instanceUrl = obj.instance_url;
  var queryUrl = instanceUrl + "/services/data/v42.0/sobjects/" + object + "/" + idField + "/" + id;
  var response = UrlFetchApp.fetch(queryUrl, {
    headers: {
      Authorization: "Bearer " + sfService.getAccessToken()
    },
    contentType: 'application/json',
    payload: payload,
    method: "PATCH",
    muteHttpExceptions: true
  });
}
tableJSON
function tableJSON(arr) {
  var i, j, obj = {};
  for (i = 0; i < arr.length; i++) {
    for (j = 0; j < arr[0].length; j++) {
      obj[arr[0][j]] = arr[i][j];
    }
  }
  return JSON.stringify(obj);
}
If anyone is familiar with it, I am basically trying to recreate what Google's Data Connector add-on does.
Again, I want to take the data range, convert it to a payload that I can send through one HTTP call, and preferably update each line with success or the response code/body of the error. My question is: how do I format my payload to make this work as described?
edit: I did not include every function, such as getSfService(). If you want to see the entire thing, here is the GitHub repository that I worked from to build my code.
edit: This is what ultimately worked for me:
function upsertPage(object, idCol) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getActiveSheet();
  var data = sheet.getDataRange().getDisplayValues();
  var sfService = getSfService();
  var userProps = PropertiesService.getUserProperties();
  var props = userProps.getProperties();
  var name = getSfService().serviceName_;
  var obj = JSON.parse(props['oauth2.' + name]);
  var instanceUrl = obj.instance_url;
  var idColIndex = colCt(idCol);
  var idField = data[0][idColIndex];
  var dataJSON = getJSON(data, idColIndex);
  var requestArr = [];
  for (var i = 1; i < data.length; i++) {
    var id = data[i][idColIndex];
    var payload = JSON.stringify(dataJSON[i - 1]);
    var request = {
      'url': instanceUrl + "/services/data/v42.0/sobjects/" + object + "/" + idField + "/" + id,
      'headers': {Authorization: "Bearer " + sfService.getAccessToken()},
      'contentType': 'application/json',
      'method': 'PATCH',
      'payload': payload,
      'muteHttpExceptions': true
    };
    requestArr.push(request);
  }
  var response = UrlFetchApp.fetchAll(requestArr);
}
function colCt(colLetter) {
  var abc = ['a','b','c','d','e','f','g','h','i','j','k','l','m','n','o','p','q','r','s','t','u','v','w','x','y','z'];
  colLetter = colLetter.toUpperCase();
  for (var i = 0; i < abc.length; i++) {
    if (colLetter === abc[i].toUpperCase()) {
      return i;
    }
  }
}
function getJSON(data, extIdCol) {
  var obj = {};
  var result = [];
  var headers = data[0];
  var cols = headers.length;
  var row = [];
  var extIdHeader = data[0][extIdCol];
  for (var i = 1, l = data.length; i < l; i++) {
    // get a row to fill the object
    row = data[i];
    // clear object
    obj = {};
    for (var col = 0; col < cols; col++) {
      // fill object with new values
      obj[headers[col]] = row[col];
    }
    // add object to the final result
    result.push(obj);
  }
  // strip the external ID column from each payload object
  for (i = 0; i < result.length; i++) {
    delete result[i][extIdHeader];
  }
  return result;
}
Upvotes: 1
Views: 1231
Reputation: 19637
I have zero experience with Google Apps Script, but I can help with the SF bit. You have a list of JSON objects that (hopefully) contain a valid SF record id, and you want to mass update them?
You might have to experiment a bit in Postman, SoapUI, etc. to get the request right. Workbench is also a decent option (it saves you the login-call step). Salesforce's REST API has a method for mass updating multiple objects by ID. They don't even all have to be in the same table (imagine updating Account, Contact, and Opportunity records in one request and, on any error, rolling back all three; interesting!). Start with https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_composite_sobjects_collections_update.htm
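As a sketch of what that composite collections request body looks like (per the linked doc, a PATCH to /services/data/vXX.X/composite/sobjects; the record ids and field values below are hypothetical):

```javascript
// Sketch: build the JSON body for a mass update via the composite
// sObject collections endpoint. Record ids/fields are made up.
function buildCollectionsUpdateBody(sobjectType, records, allOrNone) {
  return JSON.stringify({
    allOrNone: allOrNone, // true = roll back everything on any error
    records: records.map(function (rec) {
      // each record needs attributes.type plus its Salesforce Id
      var out = { attributes: { type: sobjectType } };
      for (var key in rec) {
        out[key] = rec[key];
      }
      return out;
    })
  });
}

var body = buildCollectionsUpdateBody('Account', [
  { id: '001xx000003DGb2AAG', NumberOfEmployees: 27000 },
  { id: '001xx000003DGb3AAG', BillingCity: 'San Francisco' }
], true);
```

The whole sheet could then go out in one request instead of one request per row.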
If you don't have SF record ids but your scenario is "dear Salesforce, I don't care what your unique id is; in my system it's 12345, and I don't even remember whether I've sent you this record before, so here's the new state to save; you go figure out whether you need an insert or an update"... that's called an upsert operation. It's doable if you do some preparation (create a field marked as "External ID" first), but the API doesn't do mass upserts nicely; you need to be a bit creative, like my answer here: https://salesforce.stackexchange.com/questions/274694/can-you-upsert-using-composite-sobject-tree
From what I remember, these have a limit of at most 200 "rows" per request, and you can specify whether you want "roll back the whole transaction" or "save what you can" mode. Both of these are synchronous examples (you submit and wait for results). If you still have performance bottlenecks, you might want to look into the asynchronous REST API, where you submit a job, periodically call "is it done yet?", and then download a file and map it to your source rows... That would be a code-architecture shift; try to hit the 200-row limit first before going for premature optimization.
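Given that roughly-200-records-per-request cap, the rows would need to be split into batches before building request bodies; a minimal sketch (batch size assumed to be 200):

```javascript
// Sketch: split an array of record objects into batches of at most
// batchSize, so each batch can become one collections request body.
function chunkRecords(records, batchSize) {
  var batches = [];
  for (var i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

var batches = chunkRecords(new Array(450).fill({}), 200);
// 450 records -> 3 batches of 200, 200, and 50
```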
Upvotes: 1
Reputation: 438
I think I've found my own answer, but I was still hoping someone could confirm.
If I use UrlFetchApp.fetchAll and pass the information for each line as a request, I think I could do this in a loop. Is that right?
Upvotes: 0