Reputation: 211
What is the best way to deal with a large data set in Google Apps Script? My data is located in Google Spreadsheets (~10,000 - ~11,000 cells) and I am using it with a "form letter" (about 4-5 pages long) that I created as a Google Document. I would like to run a find and replace in the document for about 100 placeholders (that match up to columns) and then save the document. The last step is to merge all of these new documents (~100) into one document that is converted into a PDF.
Any thoughts on the best way to deal with the flow of all of this data? Is it better to merge the documents into one document as I create them or should I wait and merge them all at the end?
Upvotes: 3
Views: 1431
Reputation: 1334
You might consider xsMerge as a mail merge tool: http://www.thexs.ca/xsmerge It's available free in the Chrome Web Store.
It does exactly what you need, merging data into a Google Doc and generating a PDF result.
Upvotes: 0
Reputation: 1345
You will need to plan for exhausting quotas. DocsList calls and execution time are the likely concerns. Creating 100 docs is fine, but will the job grow to exceed the quota (250/500/1500, depending on whether you are a consumer, free Apps, or business Apps user)? See the Google Apps Script Dashboard for current limits.

- Unless you really need the individual documents from find/replace, create the final document as you go.
- Minimise API calls, e.g. read the spreadsheet data once, not once per document.
- Have you tested converting a sample of your large final document to PDF? This can be a time- and resource-consuming operation and can fall foul of script limits. Plan to separate this part of the script out.
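The "merge as you go" and "read data once" advice can be sketched in plain JavaScript (the placeholder syntax, field names, and sample rows below are invented for illustration; in Apps Script you would pull `rows` from `Range.getValues()` in one call and write into a `DocumentApp` body instead of a string):

```javascript
// Fill a form-letter template once per data row, appending each filled
// copy to a single output, rather than creating ~100 intermediate
// documents and merging them at the end.

function fillTemplate(template, row) {
  // Replace every {{key}} placeholder with the matching value from the row;
  // leave unknown placeholders untouched.
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in row ? String(row[key]) : match
  );
}

function mergeAll(template, rows) {
  // Build the combined document in one pass over the data that was
  // read from the spreadsheet a single time.
  return rows
    .map((row) => fillTemplate(template, row))
    .join('\n\f\n'); // form feed as a stand-in for a page break
}

const template = 'Dear {{First}} {{Last}},\nYour balance is {{Balance}}.';
const rows = [
  { First: 'Ada', Last: 'Lovelace', Balance: '$12.00' },
  { First: 'Alan', Last: 'Turing', Balance: '$7.50' },
];
console.log(mergeAll(template, rows));
```

The same shape applies in Apps Script: one `getValues()` call up front, one loop that appends to a single document, and a separate script run for the final PDF conversion.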
Upvotes: 2