Reputation: 11
How do we import large datasets from Google BigQuery into R?
I tried importing with the bigrquery package. I can import smaller datasets without trouble, but the large ones fail:
sql <- "SELECT * FROM Table name"
todo_copies <- query_exec(sql, project = 'data-warehouse', dataset = 'name', useLegacySql = FALSE)
bytes processed
Error: Requested Resource Too Large to Return [responseTooLarge]
Upvotes: 1
Views: 757
Reputation: 2099
The message refers to the maximum response size for query jobs in BigQuery. You can work around it by writing the results to a destination table; however, that option most likely doesn't meet your needs.
Please also note that the R documentation indicates query_exec is deprecated in favor of bq_project_query, and other threads such as How to load large datasets to R from BigQuery? suggest adjusting the page_size parameter:
"It should be adjusted when an error responseTooLarge appears"
Upvotes: 1