BruceM

Reputation: 1729

How to back up a Google Apps Script's scriptdb

If I have a mission critical db, that needs to be regularly backed up, and I store it as a scriptdb in GAS, is there any way to back up the actual database file? It seems the db is embedded in a way that makes it invisible outside of scripts?

Upvotes: 1

Views: 1198

Answers (4)

Zig

Reputation: 11

The actual answer is: you do not store mission-critical data in a ScriptDb, for several reasons:

  1. Apps Script has no SLA. Google offers plenty of other storage services that do come with guarantees.
  2. Because ScriptDb does not support transactions, you cannot guarantee that a batch process won't process the same data twice (for example, if the script fails in the middle of a chunked backup or restore).
  3. Things get complicated if you store item ids inside other objects in the db.

Upvotes: 1

BruceM

Reputation: 1729

I think I found a decent solution to my question above, in an unexpected place. Rather than use a ScriptDb, I can use Google Fusion Tables: these have SQL-style access, are concrete documents that can be exported, backed up, viewed, etc., and can act as the data store for my app...

Upvotes: 1

Jacobvdb

Reputation: 1448

Maybe you can copy your critical data from the ScriptDb to a Google Spreadsheet. Since this is one of the examples on the Google Developers site, I think it is an interesting option.

Here is the link: Copy a Database to a New Sheet in a Spreadsheet.
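As a rough sketch of the idea (the helper name and field handling here are my own, not taken from the linked example): `Range.setValues` expects a 2D array, so the ScriptDb items first need to be flattened into a header row plus one row per item.

```javascript
// Flatten an array of plain objects (e.g. items returned from a ScriptDb
// query) into a 2D array suitable for Range.setValues: a header row of
// field names followed by one row per item. Fields missing on a given
// item become "".
function itemsToRows(items) {
  // Collect the union of all field names, preserving first-seen order.
  var fields = [];
  items.forEach(function (item) {
    Object.keys(item).forEach(function (key) {
      if (fields.indexOf(key) === -1) fields.push(key);
    });
  });
  var rows = [fields];
  items.forEach(function (item) {
    rows.push(fields.map(function (key) {
      return key in item ? item[key] : "";
    }));
  });
  return rows;
}

// Inside Apps Script you would then write everything in one call, e.g.:
//   var rows = itemsToRows(allItems);
//   sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
```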

Upvotes: 0

Henrique G. Abreu

Reputation: 17772

Well, you can always query all your values and JSON.stringify them. If you ever need to restore the database from this backup, the only difference I can see is that each item's id will change.

Here is an example:

function backupDB() {
  var db = ScriptDb.getMyDb();
  var result = db.query({}); // an empty query matches every item in the db
  var array = [];
  while (result.hasNext()) {
    array.push(result.next().toJson());
  }
  var dbString = JSON.stringify(array);
  Logger.log(dbString); // in practice, save this string somewhere durable, e.g. a Docs file
}
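A restore would then walk that string in reverse. The sketch below is my own addition, not part of the ScriptDb API: it keeps the ScriptDb-specific call behind a `saveFn` parameter (in Apps Script you would pass something like `function (o) { db.save(o); }`). Since `db.save` assigns a fresh id to each item, the ids change on restore.

```javascript
// Rebuild items from the backup string produced by backupDB and hand each
// one to saveFn. Each array element is itself a JSON string (the output of
// item.toJson()), so it is parsed a second time. Any stored id field is
// dropped before saving, because the db will assign a new one anyway.
function restoreDB(dbString, saveFn) {
  var saved = [];
  JSON.parse(dbString).forEach(function (itemJson) {
    var obj = JSON.parse(itemJson);
    delete obj.id; // hypothetical field name; drop whatever id toJson() kept
    saved.push(saveFn(obj));
  });
  return saved;
}
```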

You may also need to do this in chunks, as your db may hold too much data for the script to handle at once.
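The chunking itself might look like this (plain JavaScript, my own sketch; the chunk size and where each chunk string gets persisted are up to you):

```javascript
// Split an array of already-serialized items (strings from item.toJson())
// into backup strings of at most chunkSize items each, so no single string
// grows larger than the script can comfortably build or store at once.
function chunkBackup(itemJsons, chunkSize) {
  var chunks = [];
  for (var i = 0; i < itemJsons.length; i += chunkSize) {
    chunks.push(JSON.stringify(itemJsons.slice(i, i + chunkSize)));
  }
  return chunks;
}
```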

I also feel that this kind of backup procedure should really be provided by the API itself. The code above is just an idea I had.

Upvotes: 3
