sagar verma

Reputation: 432

Uploading large file to Google Cloud bucket failing

I am uploading a large file from my local system or a remote URL to a Google Cloud Storage bucket, but every time I get the errors below.

/usr/local/lib/ruby/3.0.0/openssl/buffering.rb:345:in `syswrite': execution expired (Google::Apis::TransmissionError)

/usr/local/lib/ruby/3.0.0/openssl/buffering.rb:345:in `syswrite': execution expired (HTTPClient::SendTimeoutError)

I am using a CarrierWave initializer to configure my Google service account and its details. Please suggest any configuration I am missing, or anything I can add to increase the timeout or retries.

My Carrierwave initializer:

begin
  CarrierWave.configure do |config|
    config.fog_provider = 'fog/google'
    config.fog_credentials = {
      provider: 'Google',
      google_project: '{project name}',
      #google_json_key_string: Rails.application.secrets.google_cloud_storage_credential_content
      google_json_key_location: '{My json key-file location}'
    }
    config.fog_attributes = {
      expires: 600000,
      open_timeout_sec: 600000,
      send_timeout_sec: 600000,
      read_timeout_sec: 600000,
      fog_authenticated_url_expiration: 600000
    }
    config.fog_authenticated_url_expiration = 600000
    config.fog_directory = 'test-bucket'
  end
rescue => error
  puts error.message
end

Upvotes: 0

Views: 214

Answers (1)

geemus

Reputation: 2542

This might have to do with the amount of time between when the connection is initialized and when it actually gets used. Adding `persistent: false` to the `fog_credentials` should make it create a new connection for each request. This is a bit less performant, but it should at least work consistently, unlike what you appear to be running into at present.
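For reference, a minimal sketch of the question's credentials hash with that option added (the project name and key path are placeholders copied from the question):

    config.fog_credentials = {
      provider: 'Google',
      google_project: '{project name}',
      google_json_key_location: '{My json key-file location}',
      # open a fresh connection for each request instead of reusing one
      persistent: false
    }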

Upvotes: 1
