Reputation: 21
I'm trying to insert one dummy record, populated across the full schema, into a BigQuery GA360 export table.
One way I figured out is to INSERT all the fields:
INSERT INTO `project.dataset.table` (visitorId, visitStartTime, date, totals, customDimensions, .......)
VALUES (1234, 12345, '20200101', (1,2,3,4,5,6,7,8,9,10,11,12,13), [(1,"asd"), ....], .....)
I gave up when I encountered the hits field, which has a complicated nested structure.
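For what it's worth, nested REPEATED RECORD columns can be written with ARRAY and STRUCT literals. A minimal sketch, assuming a toy table whose hits column is only ARRAY<STRUCT<hitNumber INT64, page STRUCT<pagePath STRING>>> — the real export schema has many more nested fields, and a STRUCT literal has to supply every field of the column's type, which is what makes this approach so unwieldy:

INSERT INTO `project.dataset.toy_table` (visitorId, visitStartTime, date, hits)
VALUES (
  1234, 12345, '20200101',
  [STRUCT(1 AS hitNumber, STRUCT('/home' AS pagePath) AS page)]
);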
Is there any other way (for example, writing code with the client API using the table's schema JSON file), or has anyone succeeded with DDL?
Upvotes: 0
Views: 1127
Reputation: 59165
Here is how I would solve this:
1. Extract one real session row as a JSON string:
SELECT TO_JSON_STRING(a)
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801` a
LIMIT 1
2. Save that long JSON string to a file, e.g. my.json.
3. Modify that file as much as you want.
4. Create a new empty table with the same schema:
CREATE TABLE `temp.analy` AS
SELECT *
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801`
LIMIT 0
5. Load the edited file into it:
bq load --source_format=NEWLINE_DELIMITED_JSON temp.analy my.json
Or:
5b. Create a federated table that reads the JSON file, so you can:
INSERT INTO `temp.analy`
SELECT *
FROM `federated_table_that_reads_the_json_file`
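A sketch of step 5b, assuming the edited my.json has been uploaded to Cloud Storage first (the bucket path and external-table name below are placeholders, and the schema is auto-detected from the JSON here — you can also spell it out explicitly). The federated_table_that_reads_the_json_file above would then be temp.ga_json_ext:

CREATE EXTERNAL TABLE `temp.ga_json_ext`
OPTIONS (
  format = 'NEWLINE_DELIMITED_JSON',
  uris = ['gs://my-bucket/my.json']
);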
Upvotes: 3