arielorvits

Reputation: 5535

Best way to import data to Google Bigtable from local SQL Server

I need to import a large amount of data from our local SQL Server to Bigtable every day: 100-200 million rows daily.

I tried sending the data to Bigtable through its write API, but it was very slow (around 20 million rows per hour).

I found that it can be much faster to load files from Google Cloud Storage into Bigtable using Cloud Dataflow, but it seems overly complicated and unnecessary to export from SQL Server to a file, upload the file, and then import it.

I'm hoping for a simpler solution that enables batch loading from SQL Server to Bigtable without intermediate files.

If someone can share links or a description of the best approach here, that would be great.

Upvotes: 0

Views: 498

Answers (1)

Billy Jacobson

Reputation: 1703

Given there is no SQL Server-to-Dataflow connector, I can't think of a better way. However, you can use a Cloud Function to streamline this workflow.

Using a GCS upload trigger, you could set it up so that when a file is uploaded, it kicks off the Dataflow job that imports the data (see the sketch below). If you also set up a daily cron job to export and upload the data, the whole process becomes automatic.
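Here is a minimal sketch of such a Cloud Function in Python, assuming you upload SequenceFiles and launch the Google-provided SequenceFile-to-Bigtable Dataflow template. The project, region, instance, table IDs, and the exact template parameter names are placeholders; verify them against the current template documentation before using this.

```python
# Hypothetical Cloud Function triggered by a GCS object upload.
# It launches the public SequenceFile-to-Bigtable Dataflow template
# for the file that was just uploaded.
from googleapiclient.discovery import build

PROJECT = "my-project"      # assumed project ID
REGION = "us-central1"      # assumed Dataflow region
TEMPLATE = "gs://dataflow-templates/latest/GCS_SequenceFile_to_Cloud_Bigtable"

def start_bigtable_import(event, context):
    """Background function: fires on each object finalized in the bucket."""
    source = f"gs://{event['bucket']}/{event['name']}"

    dataflow = build("dataflow", "v1b3", cache_discovery=False)
    request = dataflow.projects().locations().templates().launch(
        projectId=PROJECT,
        location=REGION,
        gcsPath=TEMPLATE,
        body={
            "jobName": "bigtable-import-" + context.event_id,
            "parameters": {
                # Parameter names follow the public template; double-check
                # them in the template docs for your Dataflow version.
                "bigtableProject": PROJECT,
                "bigtableInstanceId": "my-instance",  # assumed instance ID
                "bigtableTableId": "my-table",        # assumed table ID
                "sourcePattern": source,
            },
        },
    )
    response = request.execute()
    print("Launched Dataflow job:", response["job"]["id"])
```

Deploy it with a `google.storage.object.finalize` trigger on the upload bucket, and each nightly export dropped into that bucket will start its own import job with no manual step in between.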

Upvotes: 1
