Reputation: 1984
To make reads efficient, I made use of de-normalisation. The workflow is as follows.
There are two collections
Events documents have start and end timestamps and a status. The timestamps indicate when an event starts and ends. Status is one of Upcoming, Live, Finished, or Cancelled, and it is updated by a scheduler that runs every minute.
When a user registers for an event, I copy the event document under users/{user-id}/events. This is required because I need to fetch which events a user has registered for.
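The denormalisation step can be sketched as plain data copying; this is a minimal pure-Python sketch where the field names beyond start/end/status are assumptions, and in production the write would go through the Firestore Admin SDK (e.g. db.document(f"users/{uid}/events/{event_id}").set(copy)):

```python
# Sketch of the denormalisation step: on registration, the event document
# is copied under users/{user-id}/events/{event-id}. In production this
# would be a Firestore write via the Admin SDK; here we just show which
# fields get duplicated. The status field is the one that later has to be
# fanned out to every registered user whenever it changes.
def register_user_for_event(event_doc: dict) -> dict:
    return {
        "start": event_doc["start"],
        "end": event_doc["end"],
        "status": event_doc["status"],
    }

event = {"start": 1700000000, "end": 1700003600, "status": "Upcoming"}
user_copy = register_user_for_event(event)
```

The trade-off is visible here: reads under users/{user-id}/events are cheap, but every status change now costs one write per registered user.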
Problem
Consider 1 million users subscribed to an event while its status is Upcoming. When the status changes from Upcoming to Live, I need to update the copied document under users/{user-id}/events for every one of those users.
If I make sequential batch writes, that is roughly 1,000,000 / 500 = 2,000 batches, which takes about 15 to 30 minutes for a single status change. I see this becoming a serious problem as the number of events grows.
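The arithmetic above can be sketched directly; assuming Firestore's 500-writes-per-batch limit, chunking one million user copies gives 2,000 commits:

```python
from math import ceil

BATCH_LIMIT = 500  # Firestore's maximum writes per batched commit

def plan_batches(n_users: int, batch_limit: int = BATCH_LIMIT) -> int:
    """Number of batch commits needed to touch every user's event copy."""
    return ceil(n_users / batch_limit)

def chunk(doc_ids, size=BATCH_LIMIT):
    """Yield slices of at most `size` document IDs; in the real code each
    slice would become one WriteBatch filled with updates and committed."""
    for i in range(0, len(doc_ids), size):
        yield doc_ids[i:i + size]

n_batches = plan_batches(1_000_000)  # 2000 commits for one status change
```

Committing those 2,000 batches one after another is what produces the 15-30 minute latency; the batches themselves are independent, which is why a rate-limited parallel fan-out (as in the answer below the question) helps.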
I'm also worried about the limit of 10,000 writes per second to the whole of Firestore if I use parallel batch writes.
How can I handle this scenario so that the writes stay within the limits but complete as fast as possible?
Upvotes: 2
Views: 1018
Reputation: 317352
If you know how fast your writes can be handled, you can use Cloud Tasks to limit the rate of writes. A full discussion is probably beyond the scope of a single Stack Overflow answer. After getting acquainted with Cloud Tasks, I suggest looking specifically at configuring a rate limit for the queue that will handle the writes. Tasks dispatched to that queue can invoke a function that performs the batched updates.
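The shape of that fan-out can be sketched with plain arithmetic; the numbers here (a 0.8 safety headroom, one 500-write batch per task) are assumptions, not part of the answer. In production, each chunk of user IDs would be enqueued with google.cloud.tasks_v2's CloudTasksClient.create_task onto a queue whose dispatch rate is capped (e.g. via gcloud tasks queues update --max-dispatches-per-second), and the task handler would perform one batched commit:

```python
from math import ceil

WRITE_LIMIT_PER_SEC = 10_000  # the Firestore-wide write limit cited in the question
BATCH_SIZE = 500              # writes performed by one task (one batched commit)
HEADROOM = 0.8                # stay comfortably under the limit (assumption)

def max_dispatch_rate(write_limit=WRITE_LIMIT_PER_SEC,
                      batch_size=BATCH_SIZE,
                      headroom=HEADROOM) -> int:
    """Tasks/second the queue may dispatch: each task does batch_size writes,
    so rate * batch_size must stay under the write limit."""
    return int(write_limit * headroom // batch_size)

def fanout_plan(n_users: int):
    """How many tasks one status change needs, the queue rate to configure,
    and roughly how long the whole fan-out takes at that rate."""
    tasks = ceil(n_users / BATCH_SIZE)
    rate = max_dispatch_rate()
    seconds = ceil(tasks / rate)
    return tasks, rate, seconds
```

Under these assumed numbers, fanout_plan(1_000_000) yields 2,000 tasks dispatched at 16 tasks/second, finishing in about two minutes instead of 15-30, while the queue guarantees the write limit is never exceeded even when several events change status at once.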
Upvotes: 2