Reputation: 4969
In a streaming Dataflow pipeline, how can I dynamically change the bucket or the prefix of the data I write to cloud storage?
For example, I would like to store data as text or Avro files on GCS, but with a prefix that includes the processing hour.
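To make the desired layout concrete, here is a minimal sketch of the kind of hour-based prefix intended. This is plain Python, not Dataflow SDK code; the bucket and path names are placeholders:

```python
from datetime import datetime, timezone

def hourly_prefix(bucket: str, base: str, ts: datetime) -> str:
    """Build a GCS object prefix that embeds the processing hour.

    Hypothetical helper for illustration only; not part of any
    Dataflow sink API.
    """
    return f"gs://{bucket}/{base}/{ts.strftime('%Y/%m/%d/%H')}/"

# Records processed at 13:05 UTC would land under an hour-specific prefix:
print(hourly_prefix("my-bucket", "events",
                    datetime(2015, 6, 1, 13, 5, tzinfo=timezone.utc)))
# gs://my-bucket/events/2015/06/01/13/
```

The idea is that each element's output path would be derived from its processing time, so files naturally partition by hour.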
Update: This question is moot, because there is simply no sink available in streaming Dataflow that writes to Google Cloud Storage.
Upvotes: 0
Views: 231
Reputation: 346
Google Cloud Dataflow currently does not allow GCS sinks in streaming mode.
Upvotes: 1