Reputation: 1
I have a Google Cloud Healthcare dataset with a FHIR store configured for BigQuery streaming. When I add a larger collection (still fewer than 1,000 resources) to the FHIR store, the resources fail to be written to the configured BigQuery dataset and an error is logged.
I configured a Dataflow job to harmonize HL7v2 messages to FHIR. At times, this job processes a few dozen HL7v2 messages at once, resulting in hundreds of FHIR resources being added within seconds. I expected the BigQuery streaming service to write all of those resources to the BigQuery dataset, but instead it often fails with the following error (seen in Logs Explorer):
bigquery.googleapis.com jobservice.jobcompleted service-########@gcp-sa-healthcare.iam.gserviceaccount.com Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
The failed inserts are never retried, so those rows are lost for good. Has anyone been able to work around this issue? If not, I'll probably have to give up on this feature and batch FHIR resources into BigQuery manually.
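For reference, here is roughly what that manual fallback would look like: a scheduled call to the Healthcare API's fhirStores.export method that snapshots the store into BigQuery instead of relying on streaming. This is only an untested sketch; the project, location, dataset, store, and BigQuery names are placeholders, and the writeDisposition field should be verified against the current API reference.

```python
"""Hypothetical fallback: periodically batch-export the FHIR store to BigQuery
instead of relying on streaming. All resource names below are placeholders."""
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholder identifiers -- substitute your own project, location, Healthcare
# dataset, FHIR store, and BigQuery dataset.
FHIR_STORE = (
    "projects/my-project/locations/us-central1"
    "/datasets/my-healthcare-dataset/fhirStores/my-fhir-store"
)
BQ_DATASET_URI = "bq://my-project.my_bq_dataset"


def export_fhir_store_to_bigquery():
    """Start a fhirStores.export long-running operation targeting BigQuery."""
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)

    body = {
        "bigqueryDestination": {
            "datasetUri": BQ_DATASET_URI,
            # ANALYTICS is the same schema type the streaming config uses.
            "schemaConfig": {"schemaType": "ANALYTICS"},
            # Overwrite the tables on each run so repeated exports stay
            # consistent (assumption -- confirm this field in the API docs).
            "writeDisposition": "WRITE_TRUNCATE",
        }
    }
    response = session.post(
        f"https://healthcare.googleapis.com/v1/{FHIR_STORE}:export",
        json=body,
    )
    response.raise_for_status()
    # The call returns a long-running operation; poll it (or check the
    # Operations page in the console) to confirm the export finished.
    return response.json()


if __name__ == "__main__":
    print(export_fhir_store_to_bigquery())
```

The obvious downside is that exports are point-in-time snapshots rather than near-real-time streaming, so I'd have to schedule this (e.g., with Cloud Scheduler) and accept the extra latency.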
Upvotes: 0
Views: 209
Reputation: 711
The "too many table update operations" error is a known issue and something that the team is actively working to improve. If you'd like more information you can try filing a bug on the Public Issue Tracker. The issue with these errors not being retried was just fixed, if you continue to see these errors not being retried please file a support ticket through the Cloud Console.
Upvotes: 0