elaspog

Reputation: 1709

Google CLI does not create Transfer Service and shows no detailed error

I'm very new to Google Cloud CLI, sorry for the dumb question but it's really annoying.

I'm trying to execute this command:

bq mk --transfer_config --target_dataset=ds_scrummate --display_name='map_email' --data_source=amazon_s3 --params='{"data_path_template":"s3://sm-bigquery/map_email/{run_date}/*.csv", "destination_table_name_template":"ing_map_email", "file_format":"CSV", "max_bad_records":"0", "ignore_unknown_values":"true", "field_delimiter":",", "skip_leading_rows":"1", "allow_quoted_newlines":"true", "allow_jagged_rows":"false", "delete_source_files":"true"}'

I always get this error message back:

BigQuery error in mk operation: Parameters should be specified in JSON format
when creating the transfer configuration.

The JSON file above seems to be valid (I've tested it with online validator). I've also tried the multiline version of this command with the same result.
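As a sanity check (a minimal sketch using Python's json module, not the bq client's actual parsing code), the exact string passed to --params parses cleanly, which suggests the problem is in how the shell hands the string to bq rather than in the JSON itself:

```python
import json

# The exact value passed to --params (the text between the single quotes)
params = ('{"data_path_template":"s3://sm-bigquery/map_email/{run_date}/*.csv", '
          '"destination_table_name_template":"ing_map_email", "file_format":"CSV", '
          '"max_bad_records":"0", "ignore_unknown_values":"true", '
          '"field_delimiter":",", "skip_leading_rows":"1", '
          '"allow_quoted_newlines":"true", "allow_jagged_rows":"false", '
          '"delete_source_files":"true"}')

parsed = json.loads(params)  # succeeds: the JSON itself is valid
print(parsed["file_format"])
```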

Questions:

1. What am I doing wrong? Why am I getting a JSON error here, despite the validated content?
2. How could I get a more detailed error? Is there a verbose flag for this command which I couldn't find?
3. Why isn't there any good documentation which describes this type of error in detail?

Sources I've used:

https://cloud.google.com/bigquery/docs/reference/bq-cli-reference#bq_mk
https://cloud.google.com/bigquery-transfer/docs/working-with-transfers
https://cloud.google.com/bigquery-transfer/docs/s3-transfer#bq
https://cloud.google.com/bigquery/docs/bq-command-line-tool

Upvotes: 0

Views: 1066

Answers (2)

Lorenz Singer

Reputation: 1

Unfortunately the error message from the Cloud SDK is not very helpful. I got the same message ("Parameters should be specified in JSON format") when trying to create a scheduled query in PowerShell.

To find the cause, I modified "bq\bigquery_client.py" to output the exception raised by json.loads(params). The real error was "Expecting property name enclosed in double quotes". That led me to the solution: in PowerShell I had to wrap the property names and values inside the params in doubled double quotes, for example: --params='{ ""query"": ""test"" }'
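A minimal reproduction of that behaviour (a sketch assuming the client ultimately calls json.loads on the --params value): when the shell strips the inner double quotes, json.loads sees bare property names and raises exactly that error; doubled quotes survive as single quotes, so valid JSON arrives:

```python
import json

# What json.loads receives after PowerShell strips unescaped double quotes:
try:
    json.loads('{ query: test }')
except json.JSONDecodeError as e:
    print(e.msg)  # Expecting property name enclosed in double quotes

# With "" doubled quotes, PowerShell passes through literal quotes,
# so the client receives well-formed JSON:
print(json.loads('{ "query": "test" }'))
```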

Upvotes: 0

rsantiago

Reputation: 2099

What am I doing wrong? Why am I getting a JSON error here, despite the validated content?

The online parsers do report a valid JSON structure, but it could be that the shell is not passing {run_date} through as a literal string; I think it is trying to interpret it. Or maybe the comma in "field_delimiter":"," needs to be escaped.
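A quick check of both suspicions (a sketch with Python's json module, not the bq client itself): inside valid JSON, {run_date} is just ordinary characters in a string, and a comma value needs no escaping, so neither construct would trip a JSON parser on its own. If either still fails through the CLI, the shell is likely altering the string before bq sees it:

```python
import json

# {run_date} is plain text inside a JSON string value:
print(json.loads('{"data_path_template":"s3://bucket/{run_date}/*.csv"}'))

# A comma inside a quoted string value is legal JSON as-is:
print(json.loads('{"field_delimiter":","}'))
```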

How could I get a more detailed error? Is there a verbose flag for this command which I couldn't find?

--apilog <filename> outputs debug logs to a file.

Why isn't there any good documentation which describes this type of error in detail?

If my suspicion is correct, the error message points in the right direction: we need to verify what is going wrong in the JSON. The debug logs might contain more specific details on the section that could not be parsed.

Upvotes: 1
