Reputation: 1
Facing the error below while loading a CSV file into a BigQuery table. We didn't face this problem when loading files that are TBs in size.
'Error while reading data, error message: The options set for reading CSV prevent BigQuery from splitting files to read in parallel, and at least one of the files is larger than the maximum allowed size when files cannot be split. Size is: 7561850767. Max allowed size is: 4294967296.'
Upvotes: 0
Views: 2096
Reputation: 835
The limit for compressed files is 4GB.
If your file is not compressed, check whether it contains any double-quote characters ("). An unmatched double quote can produce a single field larger than 4GB, which prevents BigQuery from splitting the file.
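You can see this failure mode with Python's standard `csv` module (a small sketch with made-up data, not your actual file): once a quote is opened and never closed, the parser keeps consuming subsequent lines as part of one quoted field, so a single "row" can absorb the rest of the file.

```python
import csv
import io

# Well-formed CSV: each line parses as its own row.
good = 'a,"b",c\nd,e,f\n'
good_rows = list(csv.reader(io.StringIO(good)))
print(len(good_rows))  # 2 rows, as expected

# Unmatched double quote after 'a,': the quoted field never
# closes, so everything up to EOF becomes one giant field.
bad = 'a,"b,c\nd,e,f\ng,h,i\n'
bad_rows = list(csv.reader(io.StringIO(bad)))
print(len(bad_rows))   # 1 row: the rest of the file was swallowed
print(bad_rows[0])
```

Scale this up to a multi-GB file and one stray quote yields a field over the 4GB per-chunk limit, which matches the error message in the question.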
You can try loading the file from the command line with something like:
bq --project_id <project_id> load --source_format=CSV --autodetect --quote $(echo -en '\000') <dataset.table> <path_to_source>
The idea is to mute the default quote character, which is the double quote (").
Please refer to the CLI documentation for the exact command.
Upvotes: 1