Barak BN

Exception loading data to BigQuery with Logstash BigQuery plugin

I am working with Logstash 5.5.3 and the google_bigquery output plugin v3.2.1.

I am trying to load data from a Kafka topic to BigQuery (Logstash is running at debug log level).

In the log I see lines such as:

BQ: upload object. {:filename=>"/tmp/logstash-bq-5e1bba825d869e2118db8107f3019b2694a52505ef3b5973596f78ef5cfe/logstash_bq_barak-agg-tms-1.c.rnd-tms.internal_2018-12-05T13:00.part000.log", :table_id=>"logstash_2018_12_05T13_00"}

and I can see that the data is written to the temp files on the machine.

However, Logstash is unable to load the data to BigQuery:

[2018-12-05T13:19:02,302][ERROR][logstash.outputs.googlebigquery] BQ: failed to upload file. retrying. {:exception=>#<NoMethodError: undefined method `has_key?' for nil:NilClass>}
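For context, this error just means that somewhere in the plugin an object was nil when the code expected a Hash. A minimal Ruby reproduction of the error class (not the plugin's actual code):

```ruby
# Calling a Hash method on nil raises exactly this NoMethodError.
# Here `response` stands in for whatever the plugin expected to be a Hash.
response = nil
begin
  response.has_key?("error")
rescue NoMethodError => e
  puts e.message  # message similar to the one in the Logstash log
end
```

The log line itself gives no hint about *which* object was nil, which is why the underlying cause stayed hidden.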

My input events are flat JSON objects, and I use the json_schema configuration:

json_schema => {
  fields => [
    { name => "sourceId"       type => "STRING" },
    { name => "targetId"       type => "STRING" },
    { name => "tmsTimestamp"   type => "TIMESTAMP" },
    { name => "latency"        type => "FLOAT" },
    { name => "targetType"     type => "STRING" },
    { name => "type"           type => "STRING" },
    { name => "network"        type => "STRING" },
    { name => "targetIp"       type => "STRING" },
    { name => "linkId"         type => "STRING" },
    { name => "sourceIp"       type => "STRING" },
    { name => "targetHostname" type => "STRING" },
    { name => "targetTMAPort"  type => "INTEGER" },
    { name => "timestamp"      type => "TIMESTAMP" }
  ]
}
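For reference, this schema sits inside the google_bigquery output block. A sketch of the surrounding pipeline config; `project_id` and `dataset` values here are placeholders, not taken from my setup:

```
output {
  google_bigquery {
    project_id  => "my-project"    # placeholder
    dataset     => "my_dataset"    # placeholder
    # authentication options omitted
    json_schema => {
      fields => [
        { name => "sourceId" type => "STRING" }
        # ... remaining fields as in the schema above
      ]
    }
  }
}
```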


Answers (1)

Barak BN

It turns out I had a number of configuration and authorisation problems, but version 3.2.1 of the plugin hid them behind this generic NoMethodError.

I downgraded to version 3.0.1, which surfaced the specific errors and let me fix them.
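For anyone hitting the same wall, the downgrade can be done with the Logstash plugin manager (run from the Logstash install directory; paths assume a standard install):

```
# Remove the current plugin version, then install 3.0.1 explicitly
bin/logstash-plugin remove logstash-output-google_bigquery
bin/logstash-plugin install --version 3.0.1 logstash-output-google_bigquery
```

Restart Logstash afterwards so the pipeline picks up the downgraded plugin.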

This was helpful: https://github.com/logstash-plugins/logstash-codec-cloudtrail/issues/15
