user3913702

Reputation: 127

BigQuery streaming buffer inserts null rows (Python)

I'm using the BigQuery Python SDK to insert data into BigQuery. Basically, I check whether the table already exists, and then I insert data using the insert_data function (code below).

I don't get any errors, and in the table preview I can see the streaming buffer with the correct estimated number of rows. The problem is that a few hours later, when the preview is available, all the fields are set to null ...

Any idea what's happening?

  # Imports for the older google-cloud-bigquery SDK (pre-0.28 API) used here;
  # bq_dataset, bq_table_prefix, date and tuples are defined earlier in the script.
  from google.cloud import bigquery
  from google.cloud.bigquery import SchemaField

  bq_table = bq_dataset.table(bq_table_prefix + date)
  bq_table.schema = [
      SchemaField('Date', 'string', mode='nullable'),
      SchemaField('Hour', 'string', mode='nullable'),
      SchemaField('Value', 'string', mode='nullable'),
  ]

  # Recreate the table from scratch on every run
  if bq_table.exists():
      bq_table.delete()
  bq_table.create()

  # tuples is [('string', 'string', 'string'), ('string', 'string', 'string')]
  errors = bq_table.insert_data(tuples)
  if not errors:
      print('Success')
  else:
      print('Errors:')
      print(errors)

Upvotes: 1

Views: 613

Answers (1)

Willian Fuks

Reputation: 11777

Not sure if this is the case for you, but if you delete a table, you must wait at least 2 minutes before streaming data into it again (link).

One way to test whether this is what's happening is to run your process against a newly created table, so you don't have to delete it, and see if it works. If it does, then you might have to change your strategy for uploading the data, either by avoiding the deletion or by waiting a few minutes before starting the streaming again; a sketch of both options follows.
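For illustration only, here is a minimal sketch of those two options, reusing just the exists()/create()/delete()/insert_data() calls already shown in the question's (older) SDK code. The stream_rows helper and the 180-second wait are assumptions for the sketch, not part of the original answer.

  import time

  def stream_rows(bq_table, tuples, wait_seconds=180):
      """Stream rows, either reusing an existing table or waiting after a recreate."""
      if not bq_table.exists():
          # Option 1: avoid the deletion entirely; only create the table when it is missing.
          bq_table.create()
      else:
          # Option 2: if you must recreate, give BigQuery a few minutes
          # before streaming into the freshly created table (assumed ~3 minutes here).
          bq_table.delete()
          bq_table.create()
          time.sleep(wait_seconds)

      errors = bq_table.insert_data(tuples)
      if errors:
          print('Errors:', errors)
      return errors

Either branch avoids streaming into a table that was deleted moments earlier, which is what the two-minute rule above is about.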

Upvotes: 2
