Nathaniel MacIver

Reputation: 397

GAS: "InternalError: Array length exceeds supported capacity limit."

I'm running UrlFetchApp requests to an Amazon S3 server to pull audio files and move them to Google Drive. The HTTPResponse comes back in XML format.

I run the following code to convert it into a workable blob that can be stored in Google Drive:

/* driveAppFolder, fileName, and response are pre-defined variables from earlier in the program */

// Convert the HTTPResponse into a blob and save it as a new file in Drive.
var responseBlob = response.getBlob();
var driveAppFile = driveAppFolder.createFile(responseBlob).setName(fileName);

This code works flawlessly up to a certain size. I haven't figured out the exact limit yet, but I know a 50 MB file (52657324 bytes) will prevent the blob from generating, with the error:

InternalError: Array length 52389150 exceeds supported capacity limit. 
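For reference, here is a quick check I can run to see the reported size before calling getBlob(); this is just a diagnostic sketch, and it assumes the S3 response includes a Content-Length header under that exact key:

// Diagnostic only: log the reported size of the S3 response before
// attempting the blob conversion. The 'Content-Length' header name is an
// assumption about what S3 returns; the key in the header map may differ.
var headers = response.getHeaders();
Logger.log('Reported response size: ' + headers['Content-Length'] + ' bytes');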

I realize a similar JavaScript error was handled here, but I am currently confined to Google Apps Script. Is there a way to work around this limitation and get the blob created?

Upvotes: 0

Views: 732

Answers (1)

Tanaike

Reputation: 201503

How about this answer? 50 MB is 52,428,800 bytes. In Google Apps Script, there is a limit on the size of a blob: the maximum size is 52,428,800 bytes. So in your situation, that error occurs because the file you download is larger than the limit. When you download it, how about using one of the following methods?

  1. Use partial download by range (a rough sketch of this approach is shown below).
  2. Use a library for downloading large files from URL.
    • This library uses the partial download.
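As a rough illustration of method 1 (this is my sketch, not the library's actual code), the idea is to download the file by range and stream each chunk into Drive with the Drive API resumable upload, so no single blob ever crosses the limit. The function name, the chunk size, and the assumptions that the S3 URL honors Range requests and that the script's OAuth token carries a Drive scope (it does when DriveApp is used elsewhere) are all mine:

function transferLargeFileByRange(sourceUrl, fileName, folderId) {
  var CHUNK_SIZE = 16 * 1024 * 1024; // 16 MB: a multiple of 256 KB and well under the blob limit
  var token = ScriptApp.getOAuthToken();

  // 1. Probe the total size with a 1-byte range request. S3 answers with
  //    "Content-Range: bytes 0-0/<total>"; the header key is an assumption.
  var probe = UrlFetchApp.fetch(sourceUrl, {headers: {Range: 'bytes=0-0'}});
  var total = Number(probe.getHeaders()['Content-Range'].split('/')[1]);

  // 2. Open a resumable upload session with the Drive API.
  var session = UrlFetchApp.fetch(
    'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
    {
      method: 'post',
      contentType: 'application/json',
      headers: {Authorization: 'Bearer ' + token},
      payload: JSON.stringify({name: fileName, parents: [folderId]})
    }
  );
  var uploadUrl = session.getHeaders()['Location'];

  // 3. Download one range at a time and push each chunk into the session,
  //    so no single blob exceeds the ~50 MB limit.
  for (var start = 0; start < total; start += CHUNK_SIZE) {
    var end = Math.min(start + CHUNK_SIZE, total) - 1;
    var chunk = UrlFetchApp.fetch(sourceUrl, {
      headers: {Range: 'bytes=' + start + '-' + end}
    }).getContent();

    UrlFetchApp.fetch(uploadUrl, {
      method: 'put',
      headers: {'Content-Range': 'bytes ' + start + '-' + end + '/' + total},
      payload: chunk,
      muteHttpExceptions: true // intermediate chunks answer with status 308
    });
  }
}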


Upvotes: 3
