Sebastian Guajardo

Reputation: 23

Error in Call to Sagemaker Endpoint with Lambda and API Gateway

I am trying to make predictions with a TensorFlow/Keras model in SageMaker, but I receive the following errors:

In Amazon CloudWatch, for the Lambda function:

An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (415) from model with message "
{
    "error": "Unsupported Media Type: application/x-image"
}

In CloudWatch, for SageMaker:

F external/org_tensorflow/tensorflow/core/util/tensor_format.h:426] Check failed: index >= 0 && index < num_total_dims Invalid index from the dimension: 3, 0, C

The data is an image sent in base64; the Lambda function converts this image to bytes. The Lambda function is:

import base64
import json
import os

import boto3

# SageMaker runtime client and endpoint name (e.g. taken from an environment variable)
runtime = boto3.client("sagemaker-runtime")
ENDPOINT_NAME = os.environ["ENDPOINT_NAME"]

def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))

    # The image arrives base64-encoded in the 'foto' field; decode it to raw bytes.
    payload = event['foto']
    image = base64.b64decode(payload)
    print(type(image))

    try:
        response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME,
                                           ContentType='application/x-image',
                                           Body=image)
        print(response)
    except Exception as e:
        print("Inference error:")
        print(e)

    return payload  # only for test

Upvotes: 2

Views: 1219

Answers (2)

Teodorico Levoff

Reputation: 1659

It seems that you are using the SageMaker SDK v2. With v2 you don't set content_type directly; instead, you set the content type on a serializer instance. You can do this either in the Predictor's constructor or by assigning predictor.serializer afterwards. Note that you can use one of the existing serializer classes that let you specify a content_type, or implement your own class that handles it however you need.
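
For example, a minimal sketch of that v2 approach, assuming you call the endpoint through the SageMaker Python SDK (the endpoint name and image path are placeholders):

from sagemaker.serializers import IdentitySerializer
from sagemaker.tensorflow import TensorFlowPredictor

# The serializer declares the Content-Type sent with every request.
predictor = TensorFlowPredictor(
    endpoint_name="my-tf-endpoint",  # placeholder endpoint name
    serializer=IdentitySerializer(content_type="application/x-image"),
)

# Or set it after construction:
# predictor.serializer = IdentitySerializer(content_type="application/x-image")

with open("photo.jpg", "rb") as f:  # placeholder image path
    result = predictor.predict(f.read())

Keep in mind the container behind the endpoint still has to accept whatever content type the serializer declares.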

Upvotes: 3

samtoddler

Reputation: 9605

I think content_type='application/x-image' is not supported

How to handle application/x-image?

SageMaker TensorFlow Serving Container supports the following Content-Types for requests:

  • application/json (default)
  • text/csv
  • application/jsonlines

And the following content types for responses:

  • application/json (default)
  • application/jsonlines

You can check this answer, Amazon SageMaker Unsupported content-type application/x-image, which has a few suggestions.

You need to create your own custom Docker image to deploy the model as a service.
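
If you want to keep the plain boto3 call from Lambda, here is a rough sketch of the application/json route instead, assuming the model expects a single 224x224x3 float image and that Pillow and NumPy are packaged with the function (ENDPOINT_NAME is a placeholder):

import base64
import io
import json

import boto3
import numpy as np
from PIL import Image

runtime = boto3.client("sagemaker-runtime")
ENDPOINT_NAME = "my-tf-endpoint"  # placeholder

def invoke_with_json(b64_image):
    # base64 string -> raw bytes -> pixel array shaped like the model input
    img = Image.open(io.BytesIO(base64.b64decode(b64_image))).convert("RGB")
    img = img.resize((224, 224))  # assumed input size; match your model
    batch = np.expand_dims(np.asarray(img, dtype="float32") / 255.0, axis=0)

    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",  # supported by the TF Serving container
        Body=json.dumps({"instances": batch.tolist()}),  # TF Serving REST format
    )
    return json.loads(response["Body"].read())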

Upvotes: 2
