G. Macia

Reputation: 1511

How to copy files from Google Drive to an S3 bucket using Google Apps Script?

I created a Google Form with a linked Google Spreadsheet. I would like the spreadsheet to be copied to an S3 bucket in AWS every time someone submits the form. To do so, I just got started with Google Apps Script. I managed to get the trigger working on form submit, but I am struggling to understand the README of this GitHub project for uploading to S3.

function setUpTrigger() {
  ScriptApp.newTrigger('copyDataS3')
  .forForm('1SK-2Ow63vs_TaoF54UjSgn35FL7F8_ANHDTOOiTabMM')
  .onFormSubmit()
  .create();
}

function copyDataS3() {
  // https://github.com/viuinsight/google-apps-script-for-aws
  // I do not understand where I should place aws.js and util.js.
  // Should I do File -> New -> Script file and copy-paste the contents? Should the file be .js or .gs?
  S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY");
  // If I want to copy a spreadsheet with the following ID, what should go into "object" below?
  var ssID = "SPREADSHEET_ID";
  S3.putObject(bucketName, objectName, object, region);
}

Upvotes: 2

Views: 7279

Answers (1)

Tanaike

Reputation: 201378

I believe your goal is as follows.

  • You want to send a Google Spreadsheet to an S3 bucket as CSV data using Google Apps Script.

Modification points:

  • Looking at the google-apps-script-for-aws library you are using, I noticed that the data is sent as a string. In that case, your CSV data can probably be sent directly, but when you want to send binary data, an error occurs. So in this answer, I would like to propose modified scripts for 2 patterns.
  • I thought the situation might be similar to this thread, but I noticed that you are using a different library from that thread. So I post this answer.

Pattern 1:

This pattern assumes that only text data, such as the CSV data in your question, is sent. In this case, I think the library does not need to be modified.

Modified script:

S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY");  // Please set this.
var spreadsheetId = "###";  // Please set the Spreadsheet ID.
var sheetName = "Sheet1";  // Please set the sheet name.
var region = "###"; //  Please set this.

var csv = SpreadsheetApp
  .openById(spreadsheetId)
  .getSheetByName(sheetName)
  .getDataRange()
  .getValues()  //  or .getDisplayValues()
  .map(r => r.join(","))
  .join("\n");
var blob = Utilities.newBlob(csv, MimeType.CSV, sheetName + ".csv");
S3.putObject("bucketName", "test.csv", blob, region);
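As a side note, the `.map(r => r.join(","))` conversion above is plain JavaScript, so it can be checked outside Apps Script. A self-contained sketch with hypothetical form-response data shows what the resulting CSV text looks like (note that this naive join does not quote cells that themselves contain commas, which is fine for simple form answers):

```javascript
// Convert a 2-D array of cell values (the shape returned by
// getDataRange().getValues()) into CSV text.
function toCsv(values) {
  return values.map(r => r.join(",")).join("\n");
}

// Hypothetical form-response data for illustration.
const values = [
  ["Timestamp", "Name", "Answer"],
  ["2021/01/01 10:00:00", "sample", "yes"],
];
console.log(toCsv(values));
// Timestamp,Name,Answer
// 2021/01/01 10:00:00,sample,yes
```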

Pattern 2:

This pattern assumes that both text data and binary data are sent. In this case, the library side also needs to be modified.

For google-apps-script-for-aws

Please modify line 110 in s3.js as follows.

From:
var content = object.getDataAsString();
To:
var content = object.getBytes();

And, please modify line 146 in s3.js as follows.

From:
Utilities.DigestAlgorithm.MD5, content, Utilities.Charset.UTF_8));
To:
Utilities.DigestAlgorithm.MD5, content));

For Google Apps Script:

In this case, please pass the blob to S3.putObject as follows.

Script:
S3.init("MY_ACCESS_KEY", "MY_SECRET_KEY");  // Please set this.
var fileId = "###";  // Please set the file ID.
var region = "###"; //  Please set this.

var blob = DriveApp.getFileById(fileId).getBlob();
S3.putObject("bucketName", blob.getName(), blob, region);


Upvotes: 1
