Reputation: 215
I have an Express.js/Mongoose app from which I upload data from my database to Google Sheets using the Google Sheets API. The data in my database gets updated, and I need those updates reflected in my Google Sheet, but the updated documents in the database correspond to distant rows in the sheet.
As far as I know, updating Google Sheets via the API requires specifying a range, which only updates sequential rows. What should I do in this situation? Should I delete all rows (or at least a big subset) and re-insert them just to update a few scattered rows?
Or should I use one request per document update? (I may have hundreds of rows to update at a time.)
This is the snippet used for the update:
await googleSheetsInstance.spreadsheets.values.update({
  // The ID of the spreadsheet to update.
  spreadsheetId: spreadsheetId,
  // The A1 notation of the range to update; it must cover the values
  // being written (here 3 rows by 6 columns).
  range: "Sheet1!A1:F3",
  // How the input data should be interpreted.
  valueInputOption: "RAW",
  resource: {
    // All existing values in the range are replaced.
    values: [
      [1, 2, 3, 4, 5, 6],
      [10, 20, 30, 40, 50, 60],
      [41, 82, 53, 64, 75, 6],
    ],
  },
  auth: auth,
});
Upvotes: 1
Views: 2221
Reputation: 3725
You can actually update multiple ranges using batchUpdate():
Sets values in one or more ranges of a spreadsheet. The caller must specify the spreadsheet ID, a valueInputOption, and one or more ValueRanges.
With this, if you know exactly which ranges you want to update, you can build the request and update only those cells. Based on this answer, the number of ranges you can define is quite generous. It also has the advantage of counting as a single API call. Your scenario is exactly what batchUpdate() is designed for.
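For example, if you know which sheet row each updated document occupies, the request could look something like this (a sketch only; rowForDoc() and the document field names are placeholders for however you map documents to rows):

const data = updatedDocs.map((doc) => {
  const row = rowForDoc(doc); // hypothetical: look up this document's sheet row
  return {
    // one ValueRange per scattered row; A:F matches six columns of data
    range: `Sheet1!A${row}:F${row}`,
    values: [[doc.a, doc.b, doc.c, doc.d, doc.e, doc.f]],
  };
});

await googleSheetsInstance.spreadsheets.values.batchUpdate({
  spreadsheetId: spreadsheetId,
  resource: {
    valueInputOption: "RAW",
    data: data, // all scattered ranges, sent in a single API call
  },
  auth: auth,
});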
If that is too much work, updating all rows indiscriminately shouldn't be an issue either, even if there are tens of thousands of them. The Sheets API usage limits only specify the number of requests per minute, not the payload size. There may be an undocumented limit, but in my experience it's quite generous as long as you do it in a single request (don't loop to update the rows one by one). Clearing the rows first and adding them again is a worse option because that takes at least two write requests when a single update would do.
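The single-request version is essentially what you already have, just with a range spanning all rows. A sketch, assuming a Mongoose model and field names that are placeholders here:

const docs = await MyModel.find().lean(); // hypothetical model
const values = docs.map((d) => [d.a, d.b, d.c, d.d, d.e, d.f]);

await googleSheetsInstance.spreadsheets.values.update({
  spreadsheetId: spreadsheetId,
  range: `Sheet1!A1:F${values.length}`, // one range covering every row
  valueInputOption: "RAW",
  resource: { values },
  auth: auth,
});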
Still, the optimal way is to use batchUpdate(), so I recommend that instead.
Upvotes: 1