Reputation: 221
I am using the Spark BigQuery connector to read data from BigQuery: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example I need to check whether a table exists before reading from it; otherwise the API throws the error
"Not found: Table sample_proj:sample_dataset.table"
Is there a way to handle this in the Spark BigQuery connector?
Thanks
Upvotes: 0
Views: 531
Reputation: 126
As of now, errors from BigQuery (for example, a table that does not exist or a permission issue) will not make the Spark application exit or stop, which can be a problem. To avoid this, you can split the job into two tasks: first check whether the table exists, then run the Spark processing as a separate step, as in the sketch below.
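A minimal sketch of that approach, assuming a PySpark job and the google-cloud-bigquery client library; the table ID `sample_proj.sample_dataset.table` is just a placeholder taken from the error message, and the SparkSession/connector options may need adjusting for your environment:

```python
from google.cloud import bigquery
from google.cloud.exceptions import NotFound
from pyspark.sql import SparkSession

# Placeholder table reference from the question; replace with your own.
table_id = "sample_proj.sample_dataset.table"

client = bigquery.Client()

def table_exists(client, table_id):
    """Return True if the BigQuery table exists, False otherwise."""
    try:
        client.get_table(table_id)  # raises NotFound if the table is missing
        return True
    except NotFound:
        return False

spark = SparkSession.builder.appName("bq-read-if-exists").getOrCreate()

if table_exists(client, table_id):
    # Only read through the Spark BigQuery connector when the table is present.
    df = (spark.read
          .format("bigquery")
          .option("table", table_id)
          .load())
    df.show()
else:
    print(f"Table {table_id} does not exist; skipping the read.")
```

Running the existence check with the BigQuery client before calling `spark.read` avoids the "Not found: Table ..." failure surfacing inside the connector, and lets you decide explicitly whether to skip, fail fast, or create the table.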
Upvotes: 1