Reputation: 2827
I understand that there has been a material change to the BigQuery streaming API. According to a message I received from the Google Cloud team on Thursday, May 14th:
"In 2013, we launched Google BigQuery streaming API, making it easy to analyze large amounts of data quickly. This product was free until January 1, 2015, when we began charging for streaming data into Google BigQuery, based on the number of rows inserted into a BigQuery table. Since then, we’ve learned more about how customers are using Google BigQuery and effective August 12, 2015, we will stop charging for Google BigQuery streaming by the number of rows inserted and instead charge by the number of bytes inserted. New pricing will be $0.01 per 200 MB, with a minimum billing size of 1 KB per row. We've increased the default insert-rate limit from 10,000 rows per second, per table, to 100,000 rows per second, per table. In addition, the row-size limit has increased from 20 KB to 1 MB. These changes will allow customers more flexibility when designing insert strategy, and more accurately reflect the value provided by the streaming insert feature. Additional information on pricing can be found on our pricing page."
Based on this, I have a few important questions:
First, I would like clarification on the line that notes a "minimum billing size of 1 KB per row." If a row contains less than 1 KB of data, does this mean that you are NOT charged at all for that row?
Finally, is there a quick way to calculate the potential financial impact of the change, or any guidance on the best way to programmatically optimize the streaming insert calls to minimize costs?
Thanks in advance for your help!
Upvotes: 2
Views: 1000
Reputation: 59165
My interpretation of the rules, which I still have to confirm with the team: since there is a minimum billing size of 1 KB per row, rows smaller than 1 KB are not free; they are billed as if they were 1 KB each.
More good news: the system won't limit each row to 20 KB anymore, and now you can stream up to 1 MB per row.
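To estimate the financial impact, here is a minimal back-of-the-envelope sketch based on my reading of the announcement (not an official calculator): each row is billed as at least 1 KB, and total billed bytes cost $0.01 per 200 MB.

```python
KB = 1024
MB = 1024 * KB

def streaming_cost(num_rows, row_bytes):
    """Estimated streaming-insert cost (USD) for num_rows rows of row_bytes each."""
    billed_per_row = max(row_bytes, 1 * KB)    # rows under 1 KB are billed as 1 KB
    total_billed = num_rows * billed_per_row
    return total_billed / (200 * MB) * 0.01    # $0.01 per 200 MB

# 10 million 300-byte rows: each one is rounded up to 1 KB.
print(f"10M x 300 B rows:     ${streaming_cost(10_000_000, 300):.2f}")

# Same data packed three records per ~900-byte row: a third as many
# billed-at-1-KB rows, so roughly a third of the cost.
print(f"Same data, 3 per row: ${streaming_cost(10_000_000 // 3, 900):.2f}")
```

If this interpretation holds, the main programmatic lever for minimizing cost is packing multiple small records into a single row (up to the new 1 MB row limit), since every row under 1 KB is billed the same amount regardless of its actual size.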
Upvotes: 1