Reputation: 612
I am looking for the best way to migrate my app's database, which currently uses the Firebase Realtime Database, to the new Cloud Firestore. I am confident that for the project I am working on I don't need to make any data schema changes, so I am pretty much just trying to map it 1-to-1. Firebase suggests on their site to just write a script for this, but I am not sure of the best way to go about that. Has anyone already made a script that accomplishes this?
Upvotes: 28
Views: 12802
Reputation: 313
@Luke's solution is great, but nowadays it doesn't work for an entire collection: it saves only the first document. So I modified it to use Promise.all.
const admin = require('firebase-admin');
// Service account key from the Firebase console and the Realtime Database JSON export
// (same files as in @Luke's script)
var serviceAccount = require('./config.json');
var database = require('./database.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});

var db = admin.firestore();
var allEntityNames = Object.keys(database);

(async function () {
  for (var i in allEntityNames) {
    var entityName = allEntityNames[i];
    var entity = database[entityName];
    var entityKeys = Object.keys(entity);
    // Write every document of this entity in parallel and wait for all of them
    const promises = entityKeys.map(async (entityKey) => {
      var dict = entity[entityKey];
      return db
        .collection(entityName)
        .doc(entityKey)
        .set(dict)
        .catch(function (error) {
          console.log('error=>', error);
        });
    });
    console.log('promises.length', promises.length);
    await Promise.all(promises);
    console.log('Migration of ' + entityName + ' done!');
  }
})();
Upvotes: 0
Reputation: 12545
There's a decent 3rd-party npm package to help with importing; it boils down to basically one line (see the rough wiring sketch after the link):
await firestoreService.restore({ "my-table": myTable });
https://www.npmjs.com/package/firestore-export-import
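If it helps, here is a rough sketch of how that call is typically wired up, assuming the firestoreService-style API used above. The exact initialization function, the file paths, and whether restore accepts an in-memory object (as opposed to a path to a JSON file) depend on the package version, so check its README:
// Sketch only: the paths, databaseURL placeholder, and initializeApp call are
// assumptions; consult the firestore-export-import README for your version.
const firestoreService = require('firestore-export-import');
const serviceAccount = require('./serviceAccountKey.json'); // from the Firebase console
const myTable = require('./my-table.json'); // Realtime Database export

(async () => {
  firestoreService.initializeApp(serviceAccount, 'https://YOUR_PROJECT.firebaseio.com');
  await firestoreService.restore({ 'my-table': myTable });
  console.log('Import finished');
})();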
Upvotes: 0
Reputation: 9084
I just did this with a very basic Node script, and I hope it will serve as an example for the next person who runs into this issue:
require('firebase/firestore')
const fs = require('fs')
const { initializeApp, firestore } = require('firebase/app')

const UID = 'asdfasdf' // UID of the user you are migrating

initializeApp({
  apiKey: process.env.API_KEY,
  projectId: process.env.PROJECT_ID
})

// db.json is the downloaded copy of my Firebase Realtime Database
fs.readFile('db.json', (err, data) => {
  if (err) throw err
  const json = JSON.parse(data)
  const readings = json.readings[UID]
  const result = Object.values(readings)
  result.forEach(({ book, chapter, date }) =>
    // In my case the migration was easy: I just wanted to move the user's readings to their own collection
    firestore().collection(`users/${UID}/readings`)
      .add({ date: firestore.Timestamp.fromDate(new Date(date)), chapter, book })
      .catch(console.error)
  )
  // Note: the add() calls above are still in flight when this logs
  console.log('SUCCESS!')
})
Of course you can also iterate twice to do it for every user, but in my case it was not needed :)
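If you do need every user, a rough sketch of that two-level loop (dropped inside the same fs.readFile callback so json and firestore are in scope) would look something like this:
// Sketch only: iterate over every UID in the export instead of a single one,
// assuming db.json keeps readings keyed by UID exactly as above.
Object.entries(json.readings).forEach(([uid, readings]) =>
  Object.values(readings).forEach(({ book, chapter, date }) =>
    firestore().collection(`users/${uid}/readings`)
      .add({ date: firestore.Timestamp.fromDate(new Date(date)), chapter, book })
      .catch(console.error)
  )
)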
Upvotes: 0
Reputation: 978
Actually, I wrote a script in Node-Js that uses batched writes to Firestore (batched writes are fast and well suited to writing many items; a single batch was limited to 500 operations, which is why the counter below rolls over before reaching 500).
Here is my code; just change the file names to your own and run node YOUR_FILE_NAME.js
const admin = require('firebase-admin');
var serviceAccount = require('./firestore-config.json');
var database = require('./database.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'YOUR_FILE_STORE_DB_URL',
});

var db = admin.firestore();
var allEntityNames = Object.keys(database);

// Split the documents across several batches to stay under the batch size limit
var counter = 0;
var commitCounter = 0;
var batches = [];
batches[commitCounter] = db.batch();

var ref = db.collection('users');
allEntityNames.forEach(function(k) {
  if (counter > 498) {
    // Current batch is full: start a new one
    commitCounter = commitCounter + 1;
    batches[commitCounter] = db.batch();
    counter = 0;
  }
  batches[commitCounter].set(ref.doc(k), database[k]);
  counter = counter + 1;
});

// Commit every batch
for (var i = 0; i < batches.length; i++) {
  batches[i].commit().then(function() {
    console.count('wrote batch');
  });
}
You need Node-Js installed on your machine (google how to install it, it is not so hard) and the firestore-config.json service account file downloaded from your Firebase console.
Upvotes: 2
Reputation: 75
Hi, I have created a script for the same (using AngularFire2):
import { AngularFirestore, AngularFirestoreCollection } from 'angularfire2/firestore';
import { AngularFireDatabase } from 'angularfire2/database';
import * as _ from 'lodash';

itemsCollection: AngularFirestoreCollection<any>;

constructor(private afs: AngularFirestore, private angularfire: AngularFireDatabase) {}

convert() {
  this.itemsCollection = this.afs.collection('requests');
  this.angularfire.list('/requests/').auditTrail().subscribe((data: any) => {
    _.each(data, element => {
      this.itemsCollection.doc(element.key).set(element.payload.val());
    });
  });
}
Upvotes: -2
Reputation: 612
I wrote up a little node script that migrated things in a quick and dirty way and it worked quite nicely.
It is below if anyone else is interested.
Note: This should only be used if your data model in the Realtime Database was completely flat and did not have much nested data, and you intend to keep your data flat in Firestore as well.
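For reference, a completely flat export of that kind looks roughly like this (made-up node and key names), where each top-level node becomes a Firestore collection and each child key becomes a document ID:
{
  "users": {
    "user_1": { "name": "Alice", "score": 10 },
    "user_2": { "name": "Bob", "score": 7 }
  },
  "posts": {
    "post_1": { "title": "Hello", "author": "user_1" }
  }
}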
To run this script, just create a node file called index.js, throw it in a directory along with your service account file and the raw JSON file from the Realtime Database export, and run the following from the command line.
$ node index.js
Script implementation below.
const admin = require('firebase-admin');
var serviceAccount = require("./config.json");   // service account key from the Firebase console
var database = require("./database.json");       // raw JSON export of the Realtime Database
var async = require('async');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});

var db = admin.firestore();
var allEntityNames = Object.keys(database);

// Each top-level node becomes a collection and each child key becomes a document.
// let/const (rather than var) matter here so every task captures its own key and data.
for (const entityName of allEntityNames) {
  const entity = database[entityName];
  const entityKeys = Object.keys(entity);
  console.log("began migrating " + entityName);

  const asyncTasks = [];
  for (const entityKey of entityKeys) {
    const dict = entity[entityKey];
    asyncTasks.push(function(callback) {
      db.collection(entityName).doc(entityKey).set(dict)
        .then(function() {
          callback();
        })
        .catch(function(error) {
          console.log(error);
          callback();
        });
    });
  }

  async.parallel(asyncTasks, function() {
    console.log("Finished migrating " + entityName);
  });
}
Upvotes: 10