Reputation: 83
Hopefully someone can help with this. I've been trying to test my Lambda online (I've managed to deploy it), but now I am getting the same error online as well as when using sam local start-api or start-lambda.
It "works" when I run docker run -p 9000:8080 myFunction:rapid-x86_64. I say "works" because I get a response like:
response = {
    'statusCode': 200,
    'headers': {
        'Access-Control-Allow-Headers': 'Content-Type',
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'OPTIONS,POST,GET'
    },
    'body': body_response
}
But using Postman I get the whole response, including "headers", in the response body, which I don't think should happen.
Could this be the reason it is not working online? In the event I only receive the body of the request, not the headers... I haven't seen this in other functions.
Maybe you can spot something in the resource definition for the function:
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: .
    Policies:
      - AWSLambdaBasicExecutionRole
    Events:
      InputEvent:
        Type: HttpApi
        Properties:
          ApiId: !Ref MyApi
          Path: /input
          Method: post
          PayloadFormatVersion: "2.0"
    PackageType: Image
    MemorySize: 3008
    ImageUri: image:rapid-x86_64
    ImageConfig:
      EntryPoint: [ "python3", "lambda_function.py" ]
      WorkingDirectory: "/var/task/"
  Metadata:
    Dockerfile: Dockerfile
    DockerContext: ./
    DockerTag: rapid-x86_64
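For reference, the usual pattern for the Dockerfile that Metadata points at is the AWS Python base image with CMD naming the handler; a minimal sketch (image tag, file and handler names are assumptions) looks like:

# Sketch only; image tag, file and handler names are assumptions.
FROM public.ecr.aws/lambda/python:3.9

# Copy the function code into the task root defined by the base image.
COPY lambda_function.py ${LAMBDA_TASK_ROOT}

# The base image's entrypoint starts the runtime interface client;
# CMD only names the handler to invoke.
CMD ["lambda_function.lambda_handler"]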
The error on start-lambda: [ERROR] (rapid) Init failed error=Runtime exited without providing a reason InvokeID=
Using docker run: Response body is not available to scripts (Reason: CORS Missing Allow Origin). But I get the whole response, including the headers, in the response body (why is it not returning the correct headers in the response?)
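For reference, a container started with docker run like that is invoked through the Runtime Interface Emulator's invoke path rather than through API Gateway; roughly the curl equivalent of the Postman call (the exact test payload is an assumption) is:

curl -X POST "http://localhost:9000/2015-03-31/functions/function/invocations" \
  -d '{"body": "{\"input\": \"test\"}"}'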
Upvotes: 0
Views: 792
Reputation: 83
So, if anyone faces a similar error, I would recommend switching to a Serverless Framework template.
A helpful resource was https://www.serverless.com/blog/container-support-for-lambda
I was able to deploy my Lambda without any issues.
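In case it helps, a minimal Serverless Framework config for a container-image function looks roughly like this (service, image and path names are assumptions):

service: my-service

provider:
  name: aws
  ecr:
    # The framework builds this image and pushes it to ECR on deploy.
    images:
      myfunction:
        path: ./   # directory containing the Dockerfile

functions:
  myFunction:
    image:
      name: myfunction
    events:
      - httpApi:
          path: /input
          method: post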
Upvotes: 0