Pieter Coucke

Reputation: 438

How to stream inserts to BigQuery with Node.js?

I would like to use google-api-nodejs-client to stream rows to Google BigQuery.

Digging into the source code I found that I need a "resource" parameter. I tried several combinations and dug into the source of apirequest, but I always get the error "No rows present in the request."

I finally managed to upload one row at a time with another npm module but this module doesn't support tabledata.insertAll().

Can you give an example that shows how to use the "resource" parameter to stream inserts?

bigquery.tabledata.insertAll({
  auth: oauth2Client,
  'projectId': config.google.projectId,
  'datasetId': config.google.datasetId,
  'tableId': config.google.tableId,
  'resource ': {
    "kind": "bigquery#tableDataInsertAllRequest",
    "rows": [
      {
        "insertId": 123456,
        "json": '{"id": 123,"name":"test1"}'
      }
    ]
  }
}, function(err, result) {
  if (err) {
    return console.error(err);
  }
  console.log(result);
});

Upvotes: 2

Views: 3999

Answers (2)

Danny Kitt

Reputation: 3251

You have an extra space after resource, have you tried removing that?

  'resource ': {

Upvotes: 3

Jordan Tigani

Reputation: 26637

It looks like you're doing extra encoding of the inserted rows. They should be sent as raw JSON objects, rather than encoding the whole row as a string. That is, something like this:

'rows': [
  {
    'insertId': 123456,
    'json': {'id': 123,'name':'test1'}
  }
]

(Note: the only difference from what you have above is that the {'id': 123,'name':'test1'} value isn't quoted.)
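Putting it together, a corrected request might look like the sketch below. The project, dataset, and table IDs are placeholders for the asker's config values, and the actual insertAll call is commented out since it needs an authorized client; the point is the shape of the request object.

```javascript
// Sketch of a corrected request object. The two fixes:
// 1) the key is 'resource' with no trailing space;
// 2) each row's 'json' value is a plain object, not a JSON-encoded string.
// Note the API expects insertId to be a string.
var request = {
  // auth: oauth2Client,          // your authorized OAuth2 client
  projectId: 'my-project',        // placeholder for config.google.projectId
  datasetId: 'my-dataset',        // placeholder for config.google.datasetId
  tableId: 'my-table',            // placeholder for config.google.tableId
  resource: {
    kind: 'bigquery#tableDataInsertAllRequest',
    rows: [
      {
        insertId: '123456',
        json: { id: 123, name: 'test1' }
      }
    ]
  }
};

// bigquery.tabledata.insertAll(request, function (err, result) {
//   if (err) {
//     return console.error(err);
//   }
//   console.log(result);
// });

// The row as it will be serialized on the wire:
console.log(JSON.stringify(request.resource.rows[0].json)); // {"id":123,"name":"test1"}
```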

Upvotes: 2
