Rohit V

Reputation: 85

Reading Multiple Firebase Documents One By One

In my web application I fetch data from Firebase; by the time it's ready it will hold about 5000 documents. To reduce the number of reads, I keep an array in Firebase listing the id of every document. I read that array and compare it with a list I have saved locally; if an id is missing, I read that particular doc from Firebase. (The Firebase website has already noticeably slowed down with the array holding 3000 ids so far.)

However, consider the situation where 2000+ docs are missing and I have to fetch them one by one according to the missingList I build by comparing the Firebase and local arrays. It is painfully slow: fetching even a hundred documents takes very long because I await each read. If I don't await, Firebase is flooded with requests and, for some reason, shows a warning that my network is disconnected.

What is the best way to fetch documents whose ids are given in an array, with no loss and at a reasonable speed? Is there a way to batch-read documents, the way there is batch write, update and set?

Say Firebase has [1234, 1235, 1236, 1237] and I only have [1234], so I need to read [1235, 1236, 1237] from it.
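(Editorial note: Firestore can read several ids in a single query by combining `FieldPath.documentId()` with the `in` operator, which accepts at most 10 values per query. A minimal sketch, assuming the v8 namespaced SDK where the `firebase` global is available and `db` and `missingList` come from the question:)

```javascript
// Split an array into chunks of `size` (pure helper, no Firebase needed).
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// Fetch the missing docs 10 at a time with an 'in' query on the document ID.
async function fetchMissing(db, missingList) {
  const docs = [];
  for (const ids of chunk(missingList, 10)) {
    const snap = await db.collection('myCollection')
      .where(firebase.firestore.FieldPath.documentId(), 'in', ids)
      .get();
    snap.forEach((doc) => docs.push({ id: doc.id, ...doc.data() }));
  }
  return docs;
}
```

This turns up to 10 single-document reads into one round trip, though each matched document is still billed as one read.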

I am using a for loop to iterate over each id and get the corresponding document, like so:

for (const id of missingList) {
  const snapshot = await db.collection('myCollection').doc(id).get()
  await saveToDB(snapshot.data()) // I use PouchDB for local storage
}
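(Editorial note: the loop above serializes every read. A middle ground between one-at-a-time awaiting and firing all requests at once is bounded parallelism: fetch a fixed number of docs concurrently with `Promise.all`. A sketch, assuming `db` is the Firestore instance from the question and passing the question's `saveToDB` in as a parameter; the concurrency value of 20 is an illustrative guess, not a Firebase recommendation:)

```javascript
// Fetch documents in concurrent groups of `concurrency`, so requests overlap
// without flooding the connection the way an unbounded loop would.
async function fetchWithLimit(db, missingList, saveToDB, concurrency = 20) {
  for (let i = 0; i < missingList.length; i += concurrency) {
    const batch = missingList.slice(i, i + concurrency);
    // Start `batch.length` reads at once and wait for all of them.
    const items = await Promise.all(
      batch.map((id) =>
        db.collection('myCollection').doc(id).get().then((snap) => snap.data())
      )
    );
    // Persist each fetched doc locally (PouchDB in the question).
    for (const item of items) {
      await saveToDB(item);
    }
  }
}
```

Tuning `concurrency` down should also avoid the "network disconnected" warning triggered by firing thousands of requests at once.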

Upvotes: 0

Views: 843

Answers (1)

ThienLD

Reputation: 741

The closest thing to a 'batched read' in Firestore is a transaction, but using a transaction to read hundreds of documents is not really efficient. Instead, I suggest you add a field to every document, e.g. token. The first time you fetch the data, generate a random token client-side and write it to every document you read. After that, whenever you detect a change in the list of ids you mention above, run a query on that token field, like:

db.collection('myCollection').where('token', '!=', locallySavedToken).get()

Remember to write your local token back to the documents you read. This way, your query will be much faster. The downside is that you need one extra write request for every document read. If you need to re-read your data a lot, then maybe look into transactions instead, since write requests are priced higher than read requests.
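(Editorial note: a sketch of the token approach described in this answer, assuming the v8 namespaced SDK where `db` is an initialized Firestore instance and `locallySavedToken` is the token the client stored. The `!=` operator requires an SDK version that supports inequality queries, and it skips documents where the field is absent, so the token field must exist on every document:)

```javascript
// Read every doc whose token differs from ours, then stamp our token back
// onto them so the next sync only returns docs changed since this one.
async function syncByToken(db, locallySavedToken) {
  const snap = await db.collection('myCollection')
    .where('token', '!=', locallySavedToken)
    .get();

  // Write the local token back in batches: a WriteBatch holds at most 500 ops.
  let batch = db.batch();
  let count = 0;
  for (const doc of snap.docs) {
    batch.update(doc.ref, { token: locallySavedToken });
    count++;
    if (count === 500) {
      await batch.commit();
      batch = db.batch();
      count = 0;
    }
  }
  if (count > 0) await batch.commit();

  return snap.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
}
```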

Upvotes: 1
