Reputation: 1094
I am trying to find a database solution that is capable of the following.
I have looked at Amazon Athena, and it looks somewhat promising, but I am curious whether there are other solutions out there.
Upvotes: 0
Views: 258
Reputation: 5503
You may consider BigQuery. Regarding 2), there is the BigQuery streaming interface. And for 4), you can experiment with BigQuery public data (e.g. the popular Bitcoin transactions table) to see how fast BigQuery can be.
Below is a sample query using BigQuery Standard SQL, showing how to filter data that is stored as a JSON string.
#standardSQL
SELECT JSON_EXTRACT(json_text, '$') AS student
FROM UNNEST([
  '{"age" : 1, "class" : {"students" : [{"name" : "Jane"}]}}',
  '{"age" : 2, "class" : {"students" : []}}',
  '{"age" : 10,"class" : {"students" : [{"name" : "John"}, {"name": "Jamie"}]}}'
]) AS json_text
-- keep only rows whose JSON "age" field is greater than 5
WHERE CAST(JSON_EXTRACT_SCALAR(json_text, '$.age') AS INT64) > 5;
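To get a feel for query speed on large data (point 4), you could also run something like the query below against the public Bitcoin dataset. This is only a sketch: the dataset path (bigquery-public-data.crypto_bitcoin.transactions) and the column names (block_timestamp, output_value) are my assumptions about the current public dataset layout, so adjust them as needed.
#standardSQL
-- Sketch only: dataset and column names are assumptions and may need adjusting.
SELECT
  DATE(block_timestamp) AS day,
  COUNT(*) AS transaction_count,
  SUM(output_value) AS total_output_value
FROM `bigquery-public-data.crypto_bitcoin.transactions`
WHERE block_timestamp >= TIMESTAMP '2020-01-01'
GROUP BY day
ORDER BY day
LIMIT 10;
Even over a table of this size, the scan typically finishes in seconds, which should give you a sense of whether the performance fits your use case.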
Upvotes: 1
Reputation: 15266
It feels like Google's managed BigQuery database might be of value to you. Reading the streaming insert documentation, we find a soft limit of 100,000 rows per second and a maximum of 10,000 rows per single request. For queries, BigQuery advertises itself as being able to process petabyte-sized tables within acceptable limits.
Here is a link to the main page for BigQuery:
https://cloud.google.com/bigquery/
Upvotes: 0