ggo

Reputation: 471

bigquery upload big (compressed) CSV 100 seconds timeout

When trying to upload a big gzipped CSV file (30 MB compressed, 90 MB uncompressed) with the .NET API 1.5.0.222, I always get an error after 100 seconds:

    [System.Threading.Tasks.TaskCanceledException]    {System.Threading.Tasks.TaskCanceledException: A task was canceled.
       at Microsoft.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
       at Microsoft.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccess(Task task)
       at Microsoft.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
       at Microsoft.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()
       at Google.Apis.Upload.ResumableUpload`1.<Upload>d__0.MoveNext() in c:\code.google.com\google-api-dotnet-client\default_3\Tools\Google.Apis.Release\bin\Debug\output\default\Src\GoogleApis\Apis\[Media]\Upload\ResumableUpload.cs:line 362}  System.Threading.Tasks.TaskCanceledException

I found something related to the HTTP POST 100-second timeout here:

Can't set HttpWebRequest timeout higher than 100 seconds when doing a POST?

but it relates only to the HttpWebRequest class (not specifically to the BigQuery .NET API).
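For reference, this is roughly what the linked question covers for a raw HttpWebRequest (a sketch only; the URL is a placeholder, not a real endpoint):

    // Raising the timeout on a plain HttpWebRequest (placeholder URL).
    var request = (HttpWebRequest)WebRequest.Create("https://example.com/upload");
    request.Method = "POST";
    request.Timeout = 10 * 60 * 1000;          // whole-request timeout, in milliseconds
    request.ReadWriteTimeout = 10 * 60 * 1000; // timeout for stream reads/writes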

I did not find how to set this timeout with the BigQuery .NET API, nor how to access the underlying (I suppose) HttpWebRequest instance.

Is there a way to set this timeout?

Or is there a specific way to upload a local CSV file to BigQuery that avoids the timeout?

Upvotes: 0

Views: 828

Answers (3)

briler

Reputation: 590

I don't think extending the timeout is good practice.

The Google API upload is a resumable upload, meaning it supports cutting the file into chunks (and resuming if something goes wrong).

This is how I did it:

    // jobBody, sProject and file are assumed to be set up beforehand
    // (see the sketch below). Insert returns a resumable media upload.
    var JobInfo = m_bigQueryService.Jobs.Insert(jobBody, sProject, file, "application/octet-stream");

    // Chunk size in bytes - one unit of MinimumChunkSize (256 KB at the time of writing)
    JobInfo.ChunkSize = 1 * Google.Apis.Upload.ResumableUpload<Job>.MinimumChunkSize;

    JobInfo.ProgressChanged += progress =>
    {
        // Triggered after each chunk is uploaded - put whatever you like here
    };

    IUploadProgress uploadProgress = JobInfo.Upload(); // synchronous upload

    if (uploadProgress.Status != UploadStatus.Completed)
    {
        // Do something (e.g. inspect uploadProgress.Exception and resume)
    }
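For context, a minimal sketch of how jobBody, sProject and file could be set up; the project id, dataset, table and file path here are hypothetical, not from the original answer:

    // Hypothetical setup for the snippet above: a load-job body plus the
    // gzipped CSV opened as a stream.
    string sProject = "my-project-id";
    var jobBody = new Job
    {
        Configuration = new JobConfiguration
        {
            Load = new JobConfigurationLoad
            {
                DestinationTable = new TableReference
                {
                    ProjectId = sProject,
                    DatasetId = "my_dataset",
                    TableId = "my_table"
                },
                SourceFormat = "CSV",
                SkipLeadingRows = 1 // assumes a header row
            }
        }
    };
    Stream file = File.OpenRead(@"C:\data\data.csv.gz"); // gzipped CSV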

You can read more here: https://developers.google.com/drive/manage-uploads#resumable

Upvotes: 0

ggo

Reputation: 471

It was so easy, it's almost a shame to have asked this question... a reference was missing in the project, so the property that allows this was not visible. Anyway, the solution is (for a 10-minute timeout):

    BigqueryService bq = someMethodToGetIt(...);
    bq.HttpClient.Timeout = new TimeSpan(0, 10, 0);
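For completeness, a sketch of what someMethodToGetIt might do; the credential setup here is an assumption and depends on your auth flow:

    // Hypothetical service construction; credential is your OAuth2 or
    // service-account credential (an IConfigurableHttpClientInitializer).
    var bq = new BigqueryService(new BaseClientService.Initializer
    {
        HttpClientInitializer = credential,
        ApplicationName = "bigquery-csv-upload"
    });
    bq.HttpClient.Timeout = new TimeSpan(0, 10, 0); // hours, minutes, seconds

The HttpClient property comes from the base service class in the core Google.Apis assembly, which is likely the reference that was missing.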

Upvotes: 4

Jordan Tigani

Reputation: 26617

I'm not familiar with the .NET libraries you're using, but since you're doing a resumable upload, can you break the upload into pieces that each take less than 100 seconds?
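In the .NET client that would mean choosing ChunkSize so each chunk finishes well within the timeout. A rough sketch, where the bandwidth figure is an assumption you would measure yourself, and JobInfo is the media upload from the other answer:

    // Assume ~100 KB/s effective upload speed (hypothetical - measure your own).
    const int bytesPerSecond = 100 * 1024;
    const int budgetSeconds = 60; // stay well under the 100 s timeout
    int minChunk = Google.Apis.Upload.ResumableUpload<Job>.MinimumChunkSize;
    // Round down to a multiple of the 256 KB minimum, as resumable uploads
    // expect chunk sizes in such multiples.
    JobInfo.ChunkSize = Math.Max(minChunk,
        bytesPerSecond * budgetSeconds / minChunk * minChunk);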

Upvotes: 0
