Reputation: 153
I'm trying to load a .csv file into BigQuery using the console. The file is 45 MB, but the "upload" option only accepts files up to 10 MB. I don't have access to Drive, and I can't run bq load from the command line on my local machine because I get a permission-denied error.
Is there any workaround for this? It would be a great help. Thanks.
Upvotes: 0
Views: 380
Reputation: 3270
I was able to upload a file larger than the 10 MB limit by following this tutorial.
In order to execute the Python script, you just need to install the BigQuery client library in your virtualenv:
pip install google-cloud-bigquery
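Depending on where you run the script, you may also need to authenticate the client first. One common option (an assumption about your setup, not something from the original answer) is to point the library at a service-account key file before constructing the client:
import os
# Hypothetical key path -- replace with the location of your own
# service-account JSON key if you are not running inside GCP.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"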
If you do not have a dataset yet, just run this command from the Cloud Shell to create a new one:
$ bq mk pythoncsv
#Dataset 'healthy-pager-276023:pythoncsv' successfully created.
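If you would rather stay entirely in Python, the same dataset can also be created with the client library; this is just a sketch using the project and dataset IDs from this example:
from google.cloud import bigquery

client = bigquery.Client(project="healthy-pager-276023")

# Create the dataset if it does not already exist.
dataset = bigquery.Dataset("healthy-pager-276023.pythoncsv")
dataset.location = "US"  # adjust to your preferred location
client.create_dataset(dataset, exists_ok=True)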
After creating your dataset successfully, just run the Python script to upload your CSV.
My final solution is this Python script:
from google.cloud import bigquery
# Construct a BigQuery client object.
client = bigquery.Client()
# TODO(developer): Set table_id to the ID of the table to create.
# JUST FOLLOW THIS PATTERN: <projectid>.<datasetname>.<tablename>
table_id = "healthy-pager-276023.pythoncsv.table_name"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV, skip_leading_rows=1, autodetect=True,
)
path_to_file_name = "massdata.csv"  # <-- PATH TO CSV TO IMPORT
with open(path_to_file_name, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_id, job_config=job_config)
job.result()  # Waits for the job to complete.
table = client.get_table(table_id)  # Make an API request.
print("Loaded {} rows and {} columns to {}".format(table.num_rows, len(table.schema), table_id))
And here are my configurations in the BigQuery console:
Upvotes: 0
Reputation: 209
You can upload the file to a Google Cloud Storage bucket and copy its gs:// storage URI. Then, in the BigQuery console, you can use Create Table, select "Google Cloud Storage" as the source, and paste your URI.
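If you want to script that same approach instead of clicking through the console, the BigQuery Python client can load straight from a gs:// URI; this is only a sketch, and the bucket, project, dataset, and table names below are placeholders:
from google.cloud import bigquery

client = bigquery.Client()

# Placeholders: replace with your own table ID and GCS URI.
table_id = "your-project.your_dataset.your_table"
uri = "gs://your-bucket/massdata.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV, skip_leading_rows=1, autodetect=True,
)
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # Waits for the load job to complete.
print("Loaded {} rows.".format(client.get_table(table_id).num_rows))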
Upvotes: 2