Reputation: 1334
I want to ingest data from an API as a stream into BigQuery.
I guess the best option is to use Cloud Dataflow to ingest this data into BigQuery, but I don't know how to extract the data from the API: https://developer.tomtom.com/traffic-api
Can I extract the data in the same Dataflow pipeline, or do I have to create an instance, extract the data from there into Cloud Pub/Sub, and then use Dataflow to move that data to BigQuery?
Upvotes: 0
Views: 1019
Reputation: 347
My assumption is that you have an API from which you want to send data to BigQuery. Since you cannot stream from the API directly, you have to poll it on a batch interval, hourly or per minute, depending on the API's limitations.
You can have a job that reads data from this API and publishes it to Pub/Sub, then use a Dataflow pipeline to write the data to BigQuery. Or the job can write directly to BigQuery. Which option to choose depends on your data volume, backup strategy, and business requirements.
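Here is a minimal sketch of the first option (poll the API, publish to Pub/Sub). The project ID, topic name, polling interval, and the exact TomTom endpoint and parameters are assumptions; adapt them to your own setup and API key.

```python
import json
import time

import requests
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"        # assumption: your GCP project
TOPIC_ID = "tomtom-traffic"      # assumption: an existing Pub/Sub topic
API_KEY = "YOUR_TOMTOM_API_KEY"  # assumption: your TomTom API key
# Example Traffic Flow endpoint; check the TomTom docs for the exact URL and parameters.
URL = ("https://api.tomtom.com/traffic/services/4/flowSegmentData/"
       "absolute/10/json")
PARAMS = {"key": API_KEY, "point": "52.41072,4.84239"}

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


def poll_once():
    """Fetch one response from the API and publish it as a Pub/Sub message."""
    resp = requests.get(URL, params=PARAMS, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    future = publisher.publish(topic_path, json.dumps(data).encode("utf-8"))
    future.result()  # block until Pub/Sub accepts the message


if __name__ == "__main__":
    # In practice you would run this under Cloud Scheduler, cron, or similar
    # instead of a sleep loop; the interval is bounded by the API's rate limits.
    while True:
        poll_once()
        time.sleep(60)
```

For the second option, the same job can skip Pub/Sub and write the rows straight to BigQuery (for example with the BigQuery client's `insert_rows_json`), which is simpler but gives you less buffering and replay capability than the Pub/Sub + Dataflow route.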
Upvotes: 1