Reputation: 305
nvcr.io/nvidia/tritonserver:24.02-py3: this image doesn't have the ONNX backend
I have been following this tutorial: "https://github.com/triton-inference-server/tutorials/tree/main/Conceptual_Guide/Part_1-model_deployment#setting-up-the-model-repository"
When I ran the command below:
docker run -it --shm-size=256m --rm -p8000:8000 -p8001:8001 -p8002:8002 -v $(pwd)/model_repository:/models nvcr.io/nvidia/tritonserver:24.02-py3
and went inside the container, I got an error.
On further checking, I found that the ONNX Runtime backend is missing from the image.
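For reference, this is roughly how I checked (assuming the standard layout of the Triton images, where backends live under /opt/tritonserver/backends):

    # inside the running container: list the bundled backends
    ls /opt/tritonserver/backends
    # onnxruntime does not appear in the 24.02-py3 image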
If this is a legitimate issue, can you please guide me on how to install the ONNX backend in this container? Or is there a better approach to fix this?
Upvotes: 0
Views: 648
Reputation: 59
Check out the release notes for the 24.02 version here
In the known issues section:
ONNX Runtime backend is not included with 24.02 release due to incompatibility reasons. However iGPU and Windows build assets shipped with ONNX Runtime backend.
So it seems the omission was intentional rather than a packaging bug.
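One workaround, assuming your model only needs the ONNX Runtime backend: run an adjacent release tag that still ships it (for example 24.01-py3; verify this in that release's notes), reusing the same command from your question:

    # same command as before, but with a release tag that includes
    # the ONNX Runtime backend (24.01-py3 here is an example; confirm
    # against its release notes)
    docker run -it --shm-size=256m --rm \
      -p8000:8000 -p8001:8001 -p8002:8002 \
      -v $(pwd)/model_repository:/models \
      nvcr.io/nvidia/tritonserver:24.01-py3

In principle you could also build the backend yourself from the triton-inference-server/onnxruntime_backend repository and copy it into /opt/tritonserver/backends, but simply switching to a tag that ships it is far less work.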
Upvotes: 0