Daniel Wyatt

Reputation: 1151

sagemaker inference container ModuleNotFoundError: No module named 'model_handler'

I am trying to deploy a model using my own custom inference container on sagemaker. I am following the documentation here https://docs.aws.amazon.com/sagemaker/latest/dg/adapt-inference-container.html

I have an entrypoint file:

from sagemaker_inference import model_server
#HANDLER_SERVICE = "/home/model-server/model_handler.py:handle"
HANDLER_SERVICE = "model_handler.py"
model_server.start_model_server(handler_service=HANDLER_SERVICE)

I have a model_handler.py file:

from sagemaker_inference.default_handler_service import DefaultHandlerService
from sagemaker_inference.transformer import Transformer
from CustomHandler import CustomHandler


class ModelHandler(DefaultHandlerService):
    def __init__(self):
        transformer = Transformer(default_inference_handler=CustomHandler())
        super(ModelHandler, self).__init__(transformer=transformer)

And I have my CustomHandler.py file:

import os
import json
import pandas as pd
from joblib import dump, load
from sagemaker_inference import default_inference_handler, decoder, encoder, errors, utils, content_types


class CustomHandler(default_inference_handler.DefaultInferenceHandler):

    def model_fn(self, model_dir: str):
        clf = load(os.path.join(model_dir, "model.joblib"))
        return clf

    def input_fn(self, request_body: str, content_type: str) -> pd.DataFrame:
        if content_type == "application/json":
            items = json.loads(request_body)

            all_item1 = []
            all_item2 = []
            for item in items:
                processed_item1 = process_item1(item["item1"])
                processed_item2 = process_item2(item["item2"])
                all_item1 += [processed_item1]
                all_item2 += [processed_item2]
            return pd.DataFrame({"item1": all_item1, "comments": all_item2})

    def predict_fn(self, input_data, model):
        return model.predict(input_data)

Once I deploy the model to an endpoint with these files in the image, I get the following error: ml.mms.wlm.WorkerLifeCycle - ModuleNotFoundError: No module named 'model_handler'.

I am really stuck on what to do here. I wish there were an end-to-end example of how to do this the way described above, but I don't think there is. Thanks!

Upvotes: 2

Views: 2846

Answers (1)

Rahul Nimbal

Reputation: 585

This is because of a path mismatch. The entrypoint is trying to look for "model_handler.py" in the container's WORKDIR directory. To avoid this, always specify an absolute path when working with containers.
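For example, if your Dockerfile copies model_handler.py into /home/model-server/ (the path shown in the commented-out line of your entrypoint), a minimal sketch of the fix would be to pass that absolute path together with the handler function:

from sagemaker_inference import model_server

# Absolute path to the handler file inside the container, plus the function to call.
# /home/model-server/ is an assumption here - use whatever directory your Dockerfile
# actually COPYs model_handler.py into.
HANDLER_SERVICE = "/home/model-server/model_handler.py:handle"

model_server.start_model_server(handler_service=HANDLER_SERVICE)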

Moreover, your code looks confusing. Please use this sample code as a reference:

import subprocess
from subprocess import CalledProcessError
import model_handler
from retrying import retry
from sagemaker_inference import model_server
import os


def _retry_if_error(exception):
    # isinstance needs a tuple of exception classes; `A or B` would only check the first one
    return isinstance(exception, (CalledProcessError, OSError))


@retry(stop_max_delay=1000 * 50, retry_on_exception=_retry_if_error)
def _start_mms():
    # by default the number of workers per model is 1, but we can configure it through the
    # environment variable below if desired.
    # os.environ['SAGEMAKER_MODEL_SERVER_WORKERS'] = '2'
    print("Starting MMS -> running ", model_handler.__file__)
    model_server.start_model_server(handler_service=model_handler.__file__ + ":handle")


def main():
    _start_mms()
    # prevent docker exit
    subprocess.call(["tail", "-f", "/dev/null"])

main()

Further, notice this line: model_server.start_model_server(handler_service=model_handler.__file__ + ":handle"). Here we are starting the server and telling it to call the handle() function in model_handler.py to invoke your custom logic for all incoming requests.

Also remember that SageMaker BYOC requires model_handler.py to implement another function, ping().

So your "model_handler.py" should look like this -

import logging
from CustomHandler import CustomHandler

logger = logging.getLogger(__name__)

custom_handler = CustomHandler()

# define your own health check for the model over here
def ping():
    return "healthy"


def handle(request, context): # context is necessary input otherwise Sagemaker will throw exception
    if request is None:
        return "SOME DEFAULT OUTPUT"
    try:
        response = custom_handler.predict_fn(request)
        return [response] # Response must be a list otherwise Sagemaker will throw exception

    except Exception as e:
        logger.error('Prediction failed for request: {}. \n'
                     .format(request) + 'Error trace :: {} \n'.format(str(e)))
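Note that in the question, predict_fn() expects both the input data and the model, and model_fn() needs the model directory, so handle() has to wire those together. A fuller sketch of that wiring is below; the [{"body": ...}] request shape, the application/json content type, and the use of context.system_properties to locate the model directory are assumptions about how MMS invokes the handler, so verify them against your container logs:

import logging
from CustomHandler import CustomHandler

logger = logging.getLogger(__name__)

custom_handler = CustomHandler()
model = None  # lazily loaded on the first request


def ping():
    # health check for the model server
    return "healthy"


def handle(request, context):
    global model
    if request is None:
        return "SOME DEFAULT OUTPUT"
    try:
        if model is None:
            # assumption: MMS exposes the unpacked model directory via system_properties
            model_dir = context.system_properties.get("model_dir")
            model = custom_handler.model_fn(model_dir)

        # assumption: MMS delivers the payload as a list of dicts with a "body" key
        body = request[0].get("body")
        input_data = custom_handler.input_fn(body, "application/json")
        prediction = custom_handler.predict_fn(input_data, model)
        return [prediction]  # response must be a list

    except Exception as e:
        logger.error('Prediction failed for request: {}. \n'.format(request)
                     + 'Error trace :: {} \n'.format(str(e)))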

Upvotes: 2
