Mukesh

Reputation: 858

SageMaker endpoint for pretrained model

I have a pre-trained model and am trying to create an endpoint for it using SageMaker. My "model.tar.gz" looks like this:

model
 |- config.json
 |- pytorch_model.bin
 |- special_tokens_map.json
 |- spiece.model
 |- tokenizer_config.json
 |- training_args.bin
code
 |- inference.py
 |- requirements.txt

I am running the following script to create the endpoint:

from sagemaker.pytorch import PyTorchModel

pytorch_model = PyTorchModel(
    model_data='s3://mck-dl-ai-studio/answer_card/answercard.tar.gz',
    role=role,
    entry_point='inference.py',
    framework_version="1.3.1")

predictor = pytorch_model.deploy(instance_type='ml.t2.medium', initial_instance_count=1)

An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (500) from model with message "No module named 'transformers'". See https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer:group=/aws/sagemaker/Endpoints/pytorch-inference-2020-07-20-16-45-51-564 in account xxxxxx for more information.

What am I missing here? I tried adding source_dir and py_version, but with no success.

Upvotes: 0

Views: 586

Answers (2)

Damian

Reputation: 13

I had a similar error. In the end, the folder structure of my model.tar.gz was not correct (model.pth and /code were not at the root of the archive).

As Yoav Zimmerman stated above, code/requirements.txt can specify third-party libraries. This works with framework_version="1.3.1".

What helped me was running SageMaker locally for debugging. See this tutorial: aws-sagemaker-pytorch-local-dev-flow
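To get that layout right, you can build the archive yourself with Python's tarfile module. This is a minimal sketch, assuming your files live in local model/ and code/ directories as in the question; package_model is a hypothetical helper name, and the layout (model artifacts at the archive root, inference script under code/) follows what worked for me:

```python
import os
import tarfile

def package_model(output="model.tar.gz"):
    """Package model artifacts and inference code for SageMaker.

    Puts the contents of model/ at the root of the archive and the
    code/ directory (inference.py, requirements.txt) alongside them.
    """
    with tarfile.open(output, "w:gz") as tar:
        # model files go at the root of the archive, not under model/
        for name in os.listdir("model"):
            tar.add(os.path.join("model", name), arcname=name)
        # inference script and requirements stay under code/
        tar.add("code", arcname="code")
    return output
```

You can inspect the result with `tar tzf model.tar.gz` to confirm that config.json (or model.pth) and code/ both sit at the top level before uploading to S3.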

Upvotes: 0

Yoav Zimmerman

Reputation: 608

code/requirements.txt should specify any third-party libraries you need in addition to torch, torchvision, and numpy.

Reference: https://sagemaker.readthedocs.io/en/stable/frameworks/pytorch/using_pytorch.html#using-third-party-libraries
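For the error in the question ("No module named 'transformers'"), a minimal code/requirements.txt might look like the fragment below. The sentencepiece line is an assumption on my part, based on the spiece.model file in the archive, which suggests a SentencePiece-based tokenizer:

```
transformers
sentencepiece
```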

Upvotes: 1
