Reputation: 7004
We can query results from Google BigQuery in any language using the client libraries ->
see docs.
Alternatively, we can query the results and export them to Cloud Storage, for example as a .csv ->
see docs on exporting data to GCS
When we repeatedly need to extract the same data, e.g. let's say 100 times per day, does it make sense to cache the data in Cloud Storage and load it from there, or to rerun the BigQuery requests?
Which is more cost-efficient, and how would I obtain the unit cost of these requests to estimate a percentage difference?
Upvotes: 0
Views: 599
Reputation: 208042
BigQuery's on-demand pricing model is based on how many bytes your query processes.
So if you want to reuse the results, store them as a BigQuery table: set a destination table when you run your query.
There is no point in reloading previous results from GCS. The cost will be the same; you just complicate things.
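To put a number on the question, here is a back-of-the-envelope sketch. The $5/TiB rate and the 10 GiB scan size are assumptions for illustration only; check the current BigQuery on-demand pricing page for the real rate, and run your query as a dry run to see the actual bytes it would process.

```python
# Rough daily-cost estimate for rerunning the same on-demand BigQuery query.
# Assumed (hypothetical) rate: $5 per TiB of bytes processed -- verify
# against the current BigQuery pricing page, as rates change.

PRICE_PER_TIB_USD = 5.0
TIB = 2 ** 40  # bytes in one TiB

def daily_query_cost(bytes_processed_per_query: int, runs_per_day: int) -> float:
    """Estimated daily cost of rerunning the same on-demand query."""
    return bytes_processed_per_query / TIB * PRICE_PER_TIB_USD * runs_per_day

# Example: a query scanning 10 GiB, rerun 100 times per day
cost = daily_query_cost(10 * 2 ** 30, 100)
print(f"${cost:.2f} per day")  # -> $4.88 per day
```

Plugging in your own numbers this way lets you compare the two approaches directly; note that reading the same cached result from GCS is not free either (storage plus retrieval), so the comparison only tips in favor of caching for very large, frequently repeated scans.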
Upvotes: 0