jiaoshang

Reputation: 91

How to create Prometheus metrics for Vertex AI endpoint with custom container

I'm using a Vertex AI Endpoint with a custom container to serve my machine learning model. I'd like to create some custom metrics, e.g., to monitor the prediction distribution. Does anyone know whether it's possible to create Prometheus metrics for a Vertex AI endpoint with a custom container? If yes, how do I do it?

I know it's possible to log the prediction results and create log-based metrics, but that creates too many logs, and our platform team wouldn't be happy about that. I also checked these docs: https://cloud.google.com/vertex-ai/docs/general/monitoring-metrics , but didn't find any hint.

Any help would be appreciated!

Upvotes: 2

Views: 411

Answers (1)

jiaoshang

Reputation: 91

According to the response from Google, there is currently no direct way to build Prometheus metrics for Vertex AI endpoints with a custom container, but there are workarounds:

  • You can use CPR (Custom Prediction Routines) to write post-processing code that also includes the metrics you need in the response. The response can later be parsed, and the metric info extracted and used (see the sketch after this list).
  • You can use the Model Monitoring service to track model prediction drift etc. Model Monitoring exposes its own metrics, and you can build your Prometheus metrics on top of those.
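As a rough illustration of the first workaround, here is a minimal CPR Predictor sketch that attaches prediction-distribution stats to the response. It assumes a scikit-learn model saved as model.joblib, and the "metrics" field in the payload is just an illustrative name; your caller would parse that field out of each response and turn it into Prometheus metrics (e.g. push it to a Pushgateway). Treat it as a sketch, not a drop-in implementation.

    import joblib
    import numpy as np
    from google.cloud.aiplatform.prediction.predictor import Predictor
    from google.cloud.aiplatform.utils import prediction_utils


    class MetricsPredictor(Predictor):
        def load(self, artifacts_uri: str) -> None:
            # Download the model artifacts from GCS and load the model.
            # "model.joblib" is an assumption about your artifact name.
            prediction_utils.download_model_artifacts(artifacts_uri)
            self._model = joblib.load("model.joblib")

        def preprocess(self, prediction_input: dict) -> np.ndarray:
            # The request body looks like {"instances": [...]}.
            return np.asarray(prediction_input["instances"])

        def predict(self, instances: np.ndarray):
            return self._model.predict(instances)

        def postprocess(self, prediction_results) -> dict:
            preds = np.asarray(prediction_results, dtype=float)
            # Attach summary stats so the caller can extract them and
            # expose them as Prometheus metrics downstream.
            return {
                "predictions": preds.tolist(),
                "metrics": {
                    "prediction_count": int(preds.size),
                    "prediction_mean": float(preds.mean()),
                    "prediction_min": float(preds.min()),
                    "prediction_max": float(preds.max()),
                },
            }

You would then package this predictor into a serving container with the CPR tooling in the Vertex AI SDK (LocalModel.build_cpr_model) and upload it as your model; the extra "metrics" field adds a little payload per response but avoids the per-prediction logging volume mentioned in the question.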

Upvotes: 0
