Adamo Figueroa

Reputation: 448

Google Cloud SQL CSV export fails when the table contains NULLs

I'm exporting a table from my MySQL database in Cloud SQL with this command:

gcloud sql export csv INSTANCE URI --query=QUERY

I have the file in my bucket, but when I try to import it into BigQuery, the load fails. I noticed that every NULL is replaced by "N, and the quote is never closed.
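
For illustration (the table and values here are made up), a row whose last column is NULL comes out looking something like this, with the opening quote never matched:

"42","widget","N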

I found that this is a known issue: https://cloud.google.com/sql/docs/mysql/known-issues

Does anyone have a temporary workaround?

Thanks a lot.

Upvotes: 2

Views: 2072

Answers (2)

davemfish

Reputation: 312

The gcloud sql export csv command now supports an option to work around this problem.

From: https://cloud.google.com/sql/docs/mysql/import-export/import-export-csv#export_data_to_a_csv_file

Note: Cloud SQL uses double quotes (hex value "22") as the default escape character. This can be a problem for databases where values for NULL are entered as string literals. When importing a file that was exported using the default escape character, the file doesn't treat the value as NULL but as "NULL". We recommend that you use --escape="5C" to override the default when you export the file.

For example: gcloud sql export csv --escape="5C" INSTANCE URI --query=QUERY
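
As a concrete sketch (the instance, bucket, dataset, and table names here are made up): 5C is the hex code for a backslash, so with this option NULLs are exported as \N, which bq load can map back to NULL via its --null_marker flag:

gcloud sql export csv my-instance gs://my-bucket/orders.csv --database=mydb --query="SELECT * FROM orders" --escape="5C"

bq load --autodetect --source_format=CSV --null_marker='\N' mydataset.orders gs://my-bucket/orders.csv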

Upvotes: 1

Yasser Karout

Reputation: 336

The fix for this issue is still being addressed by the Cloud SQL team. You can stay updated on this thread.

If you have to use CSV, you can try using a SQL client to issue the import with the options you prefer (LOAD DATA LOCAL INFILE) instead of using the Cloud SQL import API.
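
A minimal sketch of that approach, assuming the file was exported with --escape="5C" so NULLs appear as \N (the file path and table name here are made up):

LOAD DATA LOCAL INFILE '/tmp/export.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n';

With ESCAPED BY '\\', MySQL reads the unquoted sequence \N as NULL, so the values round-trip correctly.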

Upvotes: 2
