Reputation: 2617
From the Google Drive API I receive an array of structs of type File. My aim is to add a few fields and stream the data into BigQuery.
My first approach was to change the File struct and stream the updated structs to BigQuery. That looks like a dead end, so I am trying the suggested method: marshal the struct into JSON and stream that into BigQuery.
I found this example, bigquery-table-insert-rows, but it implements the ValueSaver interface. For my case, simply marshalling and then streaming the JSON to BigQuery should be enough.
However, I can't find any method or example that does that. So I would like to know whether it is possible to stream JSON into BigQuery using Go. A basic example would be great.
Upvotes: 0
Views: 694
Reputation: 4087
You are on the right track, and just providing the structs is enough; there is no need to marshal to JSON first.
Maybe looking at some simple but complete code will be of some help: https://github.com/tovare/idporten
All I do is Put a slice of structs, where the fields are annotated with bigquery struct tags:
type Metric struct {
    Timestamp time.Time `bigquery:"timestamp"`
    Metode    string    `bigquery:"metode"`
    Antall    int       `bigquery:"antall"`
}
....
seriesTableRef := client.Dataset(datasetName).Table(tableName)
if err := seriesTableRef.Inserter().Put(ctx, metrics); err != nil {
    return err
}
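If it helps, here is a rough sketch of the same approach applied to your Drive use case. The dataset and table names, the FileRow struct, and the extra SyncedAt column are placeholders I made up for illustration; only the Drive File fields (Id, Name, MimeType) and the Inserter().Put call come from the real APIs.

package drivesync

import (
    "context"
    "time"

    "cloud.google.com/go/bigquery"
    drive "google.golang.org/api/drive/v3"
)

// FileRow maps selected Drive file fields, plus one added column, to BigQuery.
type FileRow struct {
    ID       string    `bigquery:"id"`
    Name     string    `bigquery:"name"`
    MimeType string    `bigquery:"mime_type"`
    SyncedAt time.Time `bigquery:"synced_at"` // example of an added field
}

// streamFiles converts the Drive files to rows and streams the whole slice.
func streamFiles(ctx context.Context, client *bigquery.Client, files []*drive.File) error {
    rows := make([]FileRow, 0, len(files))
    for _, f := range files {
        rows = append(rows, FileRow{
            ID:       f.Id,
            Name:     f.Name,
            MimeType: f.MimeType,
            SyncedAt: time.Now(),
        })
    }
    // The bigquery tags on FileRow determine the column names; no JSON step needed.
    return client.Dataset("drive_data").Table("files").Inserter().Put(ctx, rows)
}

Put accepts a struct, a pointer to a struct, a ValueSaver, or a slice of any of those, so there is no need to marshal to JSON yourself.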
Upvotes: 1