Reputation: 355
I need to populate data into Google Cloud Bigtable, and the source of the data will be Google BigQuery.
As an exercise, I am able to read the data from BigQuery, and as a separate exercise I am able to write data into Bigtable as well.
Now I have to combine these two operations into one Google Cloud Dataflow job. Any example would be of great help.
Upvotes: 4
Views: 4353
Reputation: 1201
People who want to transform BigQuery data into Bigtable in the future can refer to the following link.
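In case the link goes stale, the general recipe is: read with BigQueryIO, convert each TableRow into Bigtable mutations in a ParDo, then write with BigtableIO. A minimal wiring sketch against the current Apache Beam SDK (the project, instance, and table names are placeholders, and TableRowToMutationsFn is a hypothetical conversion DoFn; a sketch of it appears in the answer below):

Pipeline p = Pipeline.create(options);

p.apply(BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"))
 // Hypothetical DoFn<TableRow, KV<ByteString, Iterable<Mutation>>>.
 .apply(ParDo.of(new TableRowToMutationsFn()))
 .apply(BigtableIO.write()
     .withProjectId("my-project")
     .withInstanceId("my-instance")
     .withTableId("my-bigtable-table"));

p.run().waitUntilFinish();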
Upvotes: 0
Reputation: 3010
You can just use the transforms as shown in those examples, adding whatever logic you need in between, for example:
Pipeline p = Pipeline.create(options);

p.apply(BigQueryIO.Read.from("some_table"))
 .apply(ParDo.of(new DoFn<TableRow, Row>() {
   @Override
   public void processElement(ProcessContext c) {
     // Convert each BigQuery TableRow into the row type Bigtable expects.
     Row output = somehowConvertYourDataToARow(c.element());
     c.output(output);
   }
 }))
 .apply(BigtableIO.Write.withTableId("some_other_table"));
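The only piece you have to write yourself is the conversion step. As an illustration of what somehowConvertYourDataToARow could look like with the current Apache Beam SDK, where BigtableIO.write() consumes KV<ByteString, Iterable<Mutation>> elements, here is a hypothetical DoFn (the column family cf and the field names user_id and name are made-up assumptions about the schema):

import com.google.api.services.bigquery.model.TableRow;
import com.google.bigtable.v2.Mutation;
import com.google.protobuf.ByteString;
import java.util.Collections;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

class TableRowToMutationsFn extends DoFn<TableRow, KV<ByteString, Iterable<Mutation>>> {
  @ProcessElement
  public void processElement(ProcessContext c) {
    TableRow row = c.element();
    // Assumed schema: "user_id" becomes the Bigtable row key, "name" becomes one cell.
    ByteString rowKey = ByteString.copyFromUtf8((String) row.get("user_id"));
    Mutation setCell = Mutation.newBuilder()
        .setSetCell(Mutation.SetCell.newBuilder()
            .setFamilyName("cf")                                   // assumed column family
            .setColumnQualifier(ByteString.copyFromUtf8("name"))
            .setTimestampMicros(System.currentTimeMillis() * 1000) // cell timestamp in microseconds
            .setValue(ByteString.copyFromUtf8((String) row.get("name"))))
        .build();
    Iterable<Mutation> mutations = Collections.singletonList(setCell);
    c.output(KV.of(rowKey, mutations));
  }
}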
Upvotes: 3