Reputation: 563
I have a Cloud Function:
What does it do: uploads CSV files into BigQuery
Trigger: Cloud Storage (create/finalize)
GCS bucket status: already has hundreds of files
More files are uploaded to the bucket daily
I tested my function and it works perfectly: whenever I upload a new file, it goes into BigQuery straight away.
QUESTION: How can I load the files that were already in the bucket before I deployed the function?
Upvotes: 1
Views: 535
Reputation: 4069
Posting as community wiki to help other community members who encounter this issue. As stated by @Sana and @guillaume blaquiere:
The easiest solution, although it may seem a bit silly, is to copy all the existing files to a temporary bucket and then move them back into the original bucket. Writing the objects back generates new finalize events, so the old files trigger the function and get loaded into BigQuery.
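For reference, a minimal Python sketch of that approach using the google-cloud-storage client; the bucket names (`my-data-bucket`, `my-temp-bucket`) are placeholders and the temporary bucket is assumed to already exist:

```python
from google.cloud import storage

SOURCE_BUCKET = "my-data-bucket"  # hypothetical bucket watched by the Cloud Function
TEMP_BUCKET = "my-temp-bucket"    # hypothetical staging bucket

client = storage.Client()
source = client.bucket(SOURCE_BUCKET)
temp = client.bucket(TEMP_BUCKET)

for blob in client.list_blobs(SOURCE_BUCKET):
    # Copy the object to the temp bucket, then copy it back.
    # Writing it back creates a new object generation, which fires a
    # finalize event and re-triggers the Cloud Function.
    staged = source.copy_blob(blob, temp, blob.name)
    temp.copy_blob(staged, source, blob.name)
    staged.delete()  # clean up the staged copy
```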
Upvotes: 3