Reputation: 2804
I want to fetch 50 users from Cloud Firestore, and I have two ways that both work.
But I don't know which one is more performant, since we have a poor internet connection in our country, assuming the focus is only on fetching, not iterating.
The first way (Single Request)
let tempList = [];
const matchingUsers = [user1, user2, user3, ..., user50];

// Fetch the whole collection, then filter client-side.
const snap = await db.collection('users').get();
if (snap.size > 0) {
  snap.docs.forEach(doc => {
    const data = doc.data();
    matchingUsers.forEach(user => {
      if (data.user === user) {
        tempList.push(data.user);
      }
    });
  });
}
The second way (multiple requests)
// Fetch each matching user's document individually.
matchingUsers.forEach(async user => {
  const snap = await db.collection('users').doc(user).get();
  tempList.push(snap.data().user);
});
Upvotes: 0
Views: 47
Reputation: 83048
With the first way, you are actually fetching the entire users collection and transmitting all the corresponding data from the backend (Firestore) to your front-end. This is really not efficient, especially if you want to filter 50 users out of 500k! Note also that you will pay for 500K reads instead of 50 (see pricing).
So fetching only the docs you want (i.e. exactly the 50 users) is the most efficient way. Since the get() method is asynchronous and returns a Promise, you can use Promise.all() as follows:
const matchingUsers = [user1, user2, user3, ..., user50];
const promises = matchingUsers.map(u => db.collection('users').doc(u).get());

Promise.all(promises).then(results => {
  // results is an array of DocumentSnapshots
  // use any array method, like map or forEach
  results.forEach(docSnapshot => {
    console.log(docSnapshot.data());
  });
});
As explained in the doc, the advantage of Promise.all() is that "it returns a single Promise that fulfills when all of the promises passed as an iterable have been fulfilled", making it really easy to manage the different asynchronous parallel calls.
Upvotes: 2