Reputation: 21
I am evaluating several of Google's storage options and was wondering if anyone had thoughts. I will be using the data storage as follows:
1) A high number of reads/inserts on several columns (at least 50,000 entities). I would prefer to use Google Cloud Datastore (because of its indexing), but is it capable of handling this kind of load? Also, how many requests per second can it handle?
2) Less frequent updates/deletes (once per day). These queries would need to be indexed, but they don't need to be very scalable; at any time we would have no more than perhaps 100,000 records. I would like to use Datastore, but will its updates and deletes perform adequately? Are there any problems with bulk deletes, as in Cassandra?
In general, will there be any scalability issues with Google Cloud Datastore?
Upvotes: 2
Views: 1091
Reputation: 42048
50,000-100,000 entities is a trivial amount for Cloud Datastore; even billions of entities would be fine.
Your first question concerns a high number of reads/inserts on several columns. Without specific numbers for what 'high' means, it is hard to answer directly, but the general guidance is to batch operations rather than issuing one request per entity.
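On your bulk-delete question: Datastore has no Cassandra-style range deletes; you delete by key, and a single commit accepts at most 500 mutations, so large deletes must be split into batches. A minimal sketch of the batching helper, with the client calls shown as comments (it assumes the `google-cloud-datastore` Python client and a hypothetical kind name `Record`):

```python
BATCH_SIZE = 500  # Datastore accepts at most 500 mutations per commit


def chunked(items, size=BATCH_SIZE):
    """Split a list of keys into sublists of at most `size` elements."""
    return [items[i:i + size] for i in range(0, len(items), size)]


# With the google-cloud-datastore client (assumed installed and
# authenticated), a bulk delete then looks like:
#
#   from google.cloud import datastore
#   client = datastore.Client()
#   query = client.query(kind="Record")   # "Record" is a hypothetical kind
#   query.keys_only()                     # fetch just keys, not entities
#   keys = [e.key for e in query.fetch()]
#   for batch in chunked(keys):
#       client.delete_multi(batch)
```

Using a keys-only query keeps the read cost low, since you never materialize the full entities just to delete them.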
Upvotes: 3