Reputation: 1257
I have a Python-based ML model that I want to serve. Based on my research, the two most popular options are the following:
I. Flask + uWSGI
II. FastAPI + Uvicorn with Gunicorn
On top of that, we can put Nginx in front as a reverse proxy (load balancing, caching, security, etc.):
I. Flask + uWSGI + Nginx
II. FastAPI + Uvicorn/Gunicorn + Nginx
My questions:
Do I need to use Traefik on top of that? Or do I need to replace Nginx with Traefik?
If I have an application which only accepts POST requests, do I still need to use Nginx and/or Traefik on top of Flask + uWSGI (or FastAPI + Uvicorn/Gunicorn)?
If I use TensorFlow Serving or another ML serving solution (Kubeflow, MLflow, Seldon, etc.), is it still recommended to wrap TensorFlow Serving in FastAPI + Uvicorn/Gunicorn + Nginx and/or Traefik?
P.S. I am planning to containerize the applications with Docker and use Swarm or Kubernetes in production.
Upvotes: 1
Views: 712
Reputation: 11
- Do I need to use Traefik on top of that? Or do I need to change Nginx with Traefik?
Since you want to deploy multiple Docker containers using Swarm/Kubernetes, I would advise you to use a reverse proxy like Traefik or Nginx. In my opinion, Traefik is easier to learn and configure. Traefik can also use the Docker socket to detect newly launched containers.
Here is an article on how to set up Traefik in swarm mode: link
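To make the Docker-socket discovery concrete, here is a minimal docker-compose sketch. The service names, image name, host rule, and ports are placeholder assumptions for illustration, not taken from the article:

```yaml
version: "3.8"

services:
  traefik:
    image: traefik:v2.10
    command:
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--entrypoints.web.address=:80"
    ports:
      - "80:80"
    volumes:
      # Lets Traefik watch the Docker socket for newly launched containers.
      - /var/run/docker.sock:/var/run/docker.sock:ro

  api:
    image: my-fastapi-app  # hypothetical image name
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.api.rule=Host(`api.example.com`)"  # placeholder host
      - "traefik.http.routers.api.entrypoints=web"
      - "traefik.http.services.api.loadbalancer.server.port=8000"
```

With this layout, adding another labeled container is enough for Traefik to start routing to it; no central proxy config file needs editing.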
If I have an application which only accepts POST requests do I need to still use Nginx and/or Traefik on top of Flask + uWSGI (or FastAPI + Uvicorn/Gunicorn)?
If I will use Tensorflow Serving or other ML serving solutions (Kubeflow, MLflow, Seldon, etc.) is it still recommended to wrap up Tensorflow Serving into FastAPI + Uvicorn/Gunicorn + Nginx or/and Traefik?
As Julian already mentioned in his comment, it mostly depends on your deployment constraints and needs. FastAPI itself has a great ready-to-go repository with Docker + FastAPI + Uvicorn/Gunicorn.
Upvotes: 1