Kyle Pendergast

Reputation: 1239

Can't connect to Google Cloud bucket in Heroku production environment, but I can locally

I have a Rails app that accesses a Google Cloud Storage bucket from a rake task. The task works fine locally, but when I run it in production on Heroku it fails with this error:

/app/vendor/ruby-3.1.2/lib/ruby/3.1.0/socket.rb:1214:in `__connect_nonblock': Failed to open TCP connection to 169.254.169.254:80 (Operation already in progress - connect(2) for 169.254.169.254:80) (Errno::EALREADY)

I've already confirmed that the service account key is being loaded correctly. I can even see requests hitting the bucket in the Cloud Console dashboard, but I still get this error and I'm not sure what to make of it.

ChatGPT seems to think it has something to do with the Google metadata server?

For reference, here is the service code:

require "base64"
require "json"
require "stringio"
require "google/cloud/storage"

class GoogleCloudBucketService
  def initialize(bucket_name)
    @bucket_name = bucket_name
    decoded_json_key = Base64.decode64(ENV['GCP_SERVICE_ACCOUNT_JSON_KEY_BASE64'])
    json_key_hash = JSON.parse(decoded_json_key)
    @storage = Google::Cloud::Storage.new(
      credentials: json_key_hash
    )
  end

  def get_file(file_name)
    bucket = @storage.bucket(@bucket_name)

    unless bucket
      raise "Bucket #{@bucket_name} not found"
    end

    file = bucket.file(file_name)

    unless file
      raise "File #{file_name} not found in bucket #{@bucket_name}"
    end

    file.download.read
  rescue Google::Cloud::Error => e
    # Handle Google Cloud specific errors
    raise "An error occurred while accessing Google Cloud Storage: #{e.message}"
  rescue StandardError => e
    # Handle other errors
    raise "An error occurred: #{e.message}"
  end

  def write_file(file_name, csv_content)
    bucket = @storage.bucket(@bucket_name)

    unless bucket
      raise "Bucket #{@bucket_name} not found"
    end

    file = bucket.create_file(StringIO.new(csv_content), file_name)

    unless file
      raise "Failed to create file #{file_name} in bucket #{@bucket_name}"
    end

    file
  rescue Google::Cloud::Error => e
    # Handle Google Cloud specific errors
    raise "An error occurred while accessing Google Cloud Storage: #{e.message}"
  rescue StandardError => e
    # Handle other errors
    raise "An error occurred: #{e.message}"
  end
end

Upvotes: 0

Views: 70

Answers (1)

Kyle Pendergast

Reputation: 1239

Answering my own question...

TL;DR - I resolved this by explicitly passing the project_id parameter when initializing Google::Cloud::Storage, like this:

@storage = Google::Cloud::Storage.new(
  project_id: json_key_hash['project_id'],
  credentials: json_key_hash
)

After a brief dive, I believe this happens because, when no project_id is provided, the storage client tries to hit the Google metadata server (the 169.254.169.254 address in the error) to determine which project the service account has access to. That worked locally but not in production; my hypothesis is that a firewall or something on Heroku is blocking the request. If you explicitly pass the project_id parameter, the client never needs to hit the metadata server to look up the project id.
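As a quick sanity check, you can confirm the decoded service account key actually contains a project_id before passing it along. This is a minimal sketch using a made-up key hash rather than the real GCP_SERVICE_ACCOUNT_JSON_KEY_BASE64 value:

```ruby
require "base64"
require "json"

# Simulate the Base64-encoded service account key (values are hypothetical).
sample_key = { "type" => "service_account", "project_id" => "my-gcp-project" }
encoded = Base64.strict_encode64(JSON.generate(sample_key))

# Decode the same way the service does and verify project_id is present.
decoded = JSON.parse(Base64.decode64(encoded))
project_id = decoded["project_id"]
raise "service account key is missing project_id" if project_id.nil? || project_id.empty?
```

If this raises for your real key, the explicit project_id fix above won't help and the key itself needs to be regenerated or re-encoded.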

If anyone finds a more detailed answer please let me know!

Upvotes: 0
