Jacob Schaer

Reputation: 737

Google BigQuery Request Too Large

It's been a while since I've had a chance to work on the pandas GBQ module, but I noticed that one of our regression tests is now failing.

The test in question is:

https://github.com/pydata/pandas/blob/master/pandas/io/tests/test_gbq.py#L254-L267

In short, the test attempts to create a table with 5 columns (types: Boolean, Float, String, Integer, Timestamp) and 1,000,001 rows. Inserting these rows in chunks of 10,000 is failing with a "Request Too Large" response.

I suspect this will have a similar answer to Getting "Query too large" in BigQuery - but since this test was passing previously, I'm wondering whether there's a backend problem that needs to be addressed. It's also possible the API changed when I wasn't looking!

TLDR Version: What about our insertion is too large, and are there documented limits that we can reference?

Upvotes: 4

Views: 3833

Answers (1)

shollyman

Reputation: 4384

The documented limits are here:

https://cloud.google.com/bigquery/streaming-data-into-bigquery#quota

The TL;DR answer: while BQ is not strictly enforcing the documented maximum of 500 rows per streaming insert at this time, there are other limits elsewhere in the API stack related to overall request size that are preventing the call from succeeding.
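In practice this means keeping each insert well under the documented row limit and watching the serialized request size. A minimal sketch of that chunking (the `iter_chunks` helper is hypothetical, not part of pandas; the 500 default mirrors the documented rows-per-insert limit, and wide rows may need a smaller value to stay under the request-size cap):

```python
import pandas as pd

def iter_chunks(df, chunksize=500):
    """Yield successive row slices of df, each at most chunksize rows.

    Hypothetical helper: 500 matches the documented max rows per
    streaming insert, but the request body must also stay under the
    API's overall size limit, so wide rows may need a smaller value.
    """
    for start in range(0, len(df), chunksize):
        yield df.iloc[start:start + chunksize]

# Example: 1,200 rows split into requests of at most 500 rows.
df = pd.DataFrame({"a": range(1200)})
chunks = list(iter_chunks(df))
print(len(chunks))            # 3
print([len(c) for c in chunks])  # [500, 500, 200]
```

Each yielded chunk would then be serialized and sent as one `tabledata.insertAll` request instead of the single oversized call.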

Upvotes: 4
