I'm trying to use KServe to run inference on a model packaged with torch-model-archiver that performs image cropping. At a certain input size (1024x1024) the TorchServe worker dies and I get the stack trace below.
Can someone help me troubleshoot? Is there a way to enable some kind of debugging mode or more verbose logs?
It works with smaller images but crashes at this size. I double-checked for network issues and found none. Afterwards, uvicorn just complains that it can't reach the ASGI application (because it crashed).
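For reference, here is the kind of `config.properties` I've been experimenting with while trying to get more information out of TorchServe. The specific values are guesses on my part, not a known fix; the option names are from the TorchServe configuration docs:

```
inference_address=http://0.0.0.0:8085
# Raise the payload limits in case the request/response for a 1024x1024 image
# exceeds the ~6.5 MB default (6553500 bytes)
max_request_size=65535000
max_response_size=65535000
# Give the worker more time before the frontend declares it dead (seconds)
default_response_timeout=300
# Keep a single worker so the logs are easier to follow
default_workers_per_model=1
```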
2024-07-11T07:48:03,865 [DEBUG] W-9000-dinov2_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException: DefaultChannelPromise@442a1327(incomplete)
    at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:243) ~[model-server.jar:?]
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:131) ~[model-server.jar:?]
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:30) ~[model-server.jar:?]
    at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:403) ~[model-server.jar:?]
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:119) ~[model-server.jar:?]
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:30) ~[model-server.jar:?]
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:209) [model-server.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
2024-07-11 07:48:03.867 10 kserve ERROR [generic_exception_handler():94] Exception:
Traceback (most recent call last):
  File "/home/venv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/venv/lib/python3.9/site-packages/timing_asgi/middleware.py", line 70, in __call__
    await self.app(scope, receive, send_wrapper)
  File "/home/venv/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/venv/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/home/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/home/venv/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/venv/lib/python3.9/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/venv/lib/python3.9/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/venv/lib/python3.9/site-packages/fastapi/routing.py", line 237, in app
    raw_response = await run_endpoint_function(
  File "/home/venv/lib/python3.9/site-packages/fastapi/routing.py", line 163, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/venv/lib/python3.9/site-packages/kserve/protocol/rest/v1_endpoints.py", line 76, in predict
    response, response_headers = await self.dataplane.infer(model_name=model_name,
  File "/home/venv/lib/python3.9/site-packages/kserve/protocol/dataplane.py", line 311, in infer
    response = await model(request, headers=headers)
  File "/home/venv/lib/python3.9/site-packages/kserve/model.py", line 121, in __call__
    response = (await self.predict(payload, headers)) if inspect.iscoroutinefunction(self.predict) \
  File "/home/venv/lib/python3.9/site-packages/kserve/model.py", line 295, in predict
    res = await self._http_predict(payload, headers)
  File "/home/venv/lib/python3.9/site-packages/kserve/model.py", line 261, in _http_predict
    raise HTTPStatusError(message, request=response.request, response=response)
httpx.HTTPStatusError: {'code': 500, 'type': 'InternalServerException', 'message': 'Worker died.'}, '500 Internal Server Error' for url 'http://0.0.0.0:8085/v1/models/dinov2:predict'
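One sanity check I did: making sure the payload itself can't be over TorchServe's default `max_request_size` (6553500 bytes per the docs). This is just a back-of-the-envelope sketch assuming a worst-case uncompressed RGB image, not my actual request:

```python
import base64

# Hypothetical worst case: a raw, uncompressed 1024x1024 RGB image.
raw = bytes(1024 * 1024 * 3)           # 3 MB of pixel data
encoded = base64.b64encode(raw)        # JSON payloads are typically base64-encoded

DEFAULT_LIMIT = 6_553_500              # TorchServe default max_request_size, in bytes

print(len(raw))                        # 3145728
print(len(encoded))                    # 4194304 (base64 inflates size by 4/3)
print(len(encoded) > DEFAULT_LIMIT)    # False -> the default limit alone shouldn't reject it
```

So even the worst-case encoded payload is under the default limit, which makes me suspect the worker process itself (e.g. running out of memory) rather than a frontend size rejection, but I haven't been able to confirm that from the logs.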