Hassan HAJJAR

Reputation: 43

Using Cron to export data from a MySQL database to CSV, then loading all its data into a BigQuery table

Hello guys, I have a MySQL database called db_test with one table in it called members_test(id, name). I'm working on Ubuntu Linux. I'm trying to use a cron job to export the data from this table into a CSV file at midnight. I also want BigQuery to somehow read this CSV file and load its data into a table called cloud_members_tab on Google Cloud Platform.

How to do this?

Upvotes: 2

Views: 910

Answers (1)

Pentium10

Reputation: 207820

  1. Make sure your CSV is generated correctly (don't rely on MySQL's native CSV export)
  2. Install the gsutil and bq command-line utilities
  3. Upload the CSV to Google Cloud Storage

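For step 1, one way is to let the mysql client emit tab-separated batch output and convert the tabs to commas. A minimal sketch; the credentials (`dbuser`/`dbpass`) and the output path `/tmp/export.csv` are placeholders, while `db_test` and `members_test` come from the question:

```shell
# Export members_test from db_test as CSV.
# --batch prints tab-separated rows (including a header row),
# which sed then converts to comma-separated.
mysql --batch -u dbuser -p'dbpass' db_test \
  -e "SELECT id, name FROM members_test" \
  | sed 's/\t/,/g' > /tmp/export.csv
```

Fields containing literal tabs or commas would need proper quoting, which this sketch does not handle.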
Use a shell command like the one below (note that `-o` is a top-level gsutil option, so it goes before the `cp` subcommand):

gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" cp /tmp/export.csv gs://bucket/export.csv

  4. Use bq load

bq load --source_format=CSV --field_delimiter="," --null_marker="\N" --allow_quoted_newlines --autodetect dataset.tablename gs://bucket/export.csv
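To run this at midnight as the question asks, the steps above can be wrapped in one script and scheduled with cron. A minimal sketch, assuming placeholder credentials, the `/tmp/export.csv` path, and hypothetical bucket/dataset names:

```shell
#!/bin/sh
# export_to_bq.sh -- nightly MySQL -> CSV -> GCS -> BigQuery pipeline (sketch)
set -e

CSV=/tmp/export.csv

# 1. Dump the table as CSV (credentials are placeholders)
mysql --batch -u dbuser -p'dbpass' db_test \
  -e "SELECT id, name FROM members_test" \
  | sed 's/\t/,/g' > "$CSV"

# 2. Upload to Cloud Storage
gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" \
  cp "$CSV" gs://bucket/export.csv

# 3. Load into BigQuery
bq load --source_format=CSV --field_delimiter="," --null_marker="\N" \
  --allow_quoted_newlines --autodetect \
  dataset.cloud_members_tab gs://bucket/export.csv
```

Then schedule it for midnight with `crontab -e` and an entry such as:

```shell
0 0 * * * /home/user/export_to_bq.sh >> /var/log/export_to_bq.log 2>&1
```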

Upvotes: 2
