Bala

Reputation: 1148

Getting "Could not connect to the endpoint URL" Error with boto3 when deploying in Localstack

I am using LocalStack to test my changes locally. My Lambda function is supposed to perform a putObject and create an object in the S3 bucket. The functionality works fine when tested directly against the AWS environment, but in LocalStack it's not working. I get the error below.

    raise EndpointConnectionError(endpoint_url=request.url, error=e)
Could not connect to the endpoint URL: "http://localhost:4572/doyouknowme/pokemon.jpeg"

AWS Credentials:

[default]
aws_access_key_id = AKI****************
aws_secret_access_key = gL************************
region = us-east-1

Lambda function code:

import json
import urllib.parse
import boto3
import base64

print('Loading function')
# session = boto3.Session(profile_name='personal')
# s3 = session.client('s3', endpoint_url='http://localhost:4574')

s3 = boto3.client('s3', endpoint_url='http://localhost:4572', region_name='us-east-1')


def lambda_handler(event, context):
    # raise Exception('Something went wrong')
    print("Received event: " + json.dumps(event, indent=2))


    try:
        image_data = base64.b64decode(event['image_data'])
        response = s3.put_object(
            Body=image_data,
            Bucket='doyouknowme',
            Key='pokemon.jpeg',
            ContentType='image/jpeg'
        )

        print(response)
        return response

    except Exception as e:
        print(e)
        # print(
        #     'Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as '
        #     'this function.'.format(
        #         key, bucket))
        raise e

I am not sure why the S3 key got appended to the endpoint URL that the Lambda accesses. I'd appreciate your help resolving this.

Upvotes: 14

Views: 25917

Answers (3)

For me, only the os.environ variable worked; hardcoding anything else, like localhost, didn't work.

client = boto3.client('lambda', endpoint_url='http://' + os.environ["LOCALSTACK_HOSTNAME"] + ':4566')

The following code detects whether the variable is set and adjusts the endpoint_url automatically, depending on whether it's running in LocalStack or in the AWS cloud:

if os.environ.get("LOCALSTACK_HOSTNAME"):
    client = boto3.client('lambda', endpoint_url='http://' + os.environ["LOCALSTACK_HOSTNAME"] + ':4566')
else:
    client = boto3.client('lambda')
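
Applying the same pattern to the S3 client from the question might look like the sketch below (the bucket and key are taken from the question; everything else is an assumption, not tested code):

import base64
import os

import boto3

# Use the hostname LocalStack injects into its Lambda executor; fall back to
# the real AWS endpoints when the variable is not set.
if os.environ.get("LOCALSTACK_HOSTNAME"):
    s3 = boto3.client(
        's3',
        endpoint_url='http://' + os.environ["LOCALSTACK_HOSTNAME"] + ':4566',
        region_name='us-east-1',
    )
else:
    s3 = boto3.client('s3', region_name='us-east-1')


def lambda_handler(event, context):
    image_data = base64.b64decode(event['image_data'])
    return s3.put_object(
        Body=image_data,
        Bucket='doyouknowme',
        Key='pokemon.jpeg',
        ContentType='image/jpeg',
    )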

Upvotes: 2

JavaSa

Reputation: 6242

If you are connecting from one Docker container to the LocalStack container, verify the following points (a docker-compose sketch follows this list):

  1. The relevant LocalStack ports are exposed to other containers
    (using Docker's / docker-compose's expose section).

  2. If you are using a docker-compose service, add localstack to its links section.

  3. Most importantly, connect via http://localstack:4566/.
    Use http://localhost:4566 only if you are testing local code on the
    host machine against a LocalStack container running in the background.

  4. Verify that the LocalStack port mapping exists between the host and Docker, i.e.:

    ports:
    - 4566:4566

  5. Notice that http://localstack:4566 != http://localhost:4566. ;)
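
A rough sketch of how these points might fit together in a docker-compose.yml (the app service, image choice, and compose version here are assumptions, not from the answer):

version: "3.8"
services:
  localstack:                        # the LocalStack container
    image: localstack/localstack
    ports:
      - "4566:4566"                  # point 4: host <-> container port mapping
    expose:
      - "4566"                       # point 1: reachable from other containers

  app:                               # hypothetical service that runs the boto3 code
    build: .
    links:
      - localstack                   # point 2: link to the LocalStack service
    # inside this container, point boto3's endpoint_url at
    # http://localstack:4566 (point 3), not at http://localhost:4566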

Upvotes: 1

Jordan Cote

Reputation: 525

For anyone experiencing this problem: I encountered it when I had boto3 in another Docker container trying to access LocalStack with http://localhost:4566.

I fixed my issue by setting http://host.docker.internal:4566 as the endpoint_url in my boto3 session client.
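
For example, the client setup might look like this (a minimal sketch; the bucket and key are reused from the question):

import boto3

# From inside another Docker container, host.docker.internal resolves to the
# Docker host, where LocalStack's edge port (4566) is published.
s3 = boto3.client(
    's3',
    endpoint_url='http://host.docker.internal:4566',
    region_name='us-east-1',
)

s3.put_object(
    Body=b'placeholder bytes',  # whatever object body you need
    Bucket='doyouknowme',
    Key='pokemon.jpeg',
    ContentType='image/jpeg',
)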

Alternatively, if you're using docker-compose, put your two Docker containers on the same network, which makes calls possible by using the container's name as the endpoint (i.e. http://boto3-container:4566).

Hope this helps someone!

Upvotes: 41
