Reputation: 8844
The code in this repo shows how to create a Flask web endpoint that predicts the probability of surviving the Titanic disaster. The trained model, which takes age, ticket_class, boarding_location and gender as inputs, is serialized as a pickle file using joblib.
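For context, an endpoint like the one described might look roughly as follows. This is only a sketch: the route name, field names, and the stub `predict_survival` function are hypothetical stand-ins (the real repo loads its model with `joblib.load` and calls `model.predict`), but the request/response flow is the same.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical stub standing in for a joblib-loaded sklearn model,
# e.g. model = joblib.load("model/model.pkl") in the real application.
def predict_survival(age, ticket_class, boarding_location, gender):
    # A trivial rule instead of model.predict_proba(...), so the sketch runs anywhere.
    return 0.74 if (gender == "female" or ticket_class == 1) else 0.19

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body with the four input fields.
    payload = request.get_json()
    prob = predict_survival(payload["age"], payload["ticket_class"],
                            payload["boarding_location"], payload["gender"])
    return jsonify({"survival_probability": prob})
```

A client would then POST JSON such as `{"age": 30, "ticket_class": 1, "boarding_location": "S", "gender": "female"}` and receive the predicted probability back.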
Training data - https://www.kaggle.com/c/titanic/data
Architecture of AWS Sagemaker
https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-hosting.html
The architecture in the diagram above looks like a good way to containerize and deploy an ML application.
Question
Upvotes: 1
Views: 1015
Reputation: 3284
For simplicity, let's base our image on Ubuntu. Create a file named Dockerfile in an empty directory with the following contents:
FROM ubuntu
# Install pip and git and clone repo into /app
RUN apt-get update && apt-get install --assume-yes --fix-missing python-pip git && git clone https://github.com/amirziai/sklearnflask.git /app
# Change WORKDIR
WORKDIR /app
# Install dependencies
RUN pip install -r requirements.txt
# Expose port and run the application when the container is started
EXPOSE 9999
ENTRYPOINT ["python", "main.py", "9999"]
The image can then be built by running docker build -t <TAG> . from the directory containing the Dockerfile. You can then run it using docker run -p 9999:9999 <TAG>. Note that EXPOSE only documents the port; the -p flag is what actually publishes it so the service is reachable from the host.
I imagine you want to add your own sklearn models. There are different ways to do this: you could change your Dockerfile and use COPY (or ADD) to add files from your local file system to the image, or you could mount your model at the point of running the container.
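As a sketch of the first option, assuming a model.pkl file sits next to the Dockerfile and that the application loads its model from a path like /app/model (the exact path the repo expects may differ):

```dockerfile
FROM ubuntu
RUN apt-get update && apt-get install --assume-yes --fix-missing python-pip git \
    && git clone https://github.com/amirziai/sklearnflask.git /app
WORKDIR /app
RUN pip install -r requirements.txt
# Bake your own model into the image (hypothetical paths)
COPY model.pkl /app/model/model.pkl
EXPOSE 9999
ENTRYPOINT ["python", "main.py", "9999"]
```

The mounting alternative keeps the model out of the image entirely, e.g. docker run -p 9999:9999 -v "$PWD/model:/app/model" <TAG>, which lets you swap models without rebuilding.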
Also, when actually using this container in production, you should consider a few more things: keep the image lightweight (e.g. by using a smaller base such as alpine or python:3-slim instead of ubuntu), and push the image to a registry such as Docker Hub so you can deploy the container from there.
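For the lightweight variant, a sketch based on python:3-slim might look like this. (Alpine is even smaller, but scikit-learn often needs extra build dependencies under musl libc, so slim is usually the easier swap; whether the repo's requirements install cleanly on it is an assumption you should verify.)

```dockerfile
FROM python:3-slim
RUN apt-get update && apt-get install --no-install-recommends -y git \
    && git clone https://github.com/amirziai/sklearnflask.git /app \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
RUN pip install --no-cache-dir -r requirements.txt
EXPOSE 9999
ENTRYPOINT ["python", "main.py", "9999"]
```

Cleaning the apt lists and using --no-cache-dir keeps layers small, which is most of what "lightweight" buys you here.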
Hopefully this helps as a starting point.
Upvotes: 4