racerX

Reputation: 1092

Vertex AI custom prediction vs Google Kubernetes Engine

I have been exploring Vertex AI for my machine learning workflows. Because Vertex AI does not support deploying different models to the same endpoint using only one node, I am considering a workaround. With this workaround I would lose many Vertex AI features, such as model monitoring and feature attribution, and the service essentially becomes, I think, a managed alternative to running the prediction application on, say, a GKE cluster. So, cost aside, I am trying to find out whether running a custom prediction container on Vertex AI rather than GKE involves any limitations, for example that only N1 machine types are available for prediction in Vertex AI.
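For reference, this is the kind of single-model custom-container deployment I mean, as a rough sketch with the Python SDK (google-cloud-aiplatform); the project, image URI, and routes below are placeholders rather than my actual setup:

    from google.cloud import aiplatform

    aiplatform.init(project="my-project", location="us-central1")

    # Register the custom prediction container as a Model resource.
    model = aiplatform.Model.upload(
        display_name="my-custom-model",
        serving_container_image_uri="us-central1-docker.pkg.dev/my-project/my-repo/predictor:latest",
        serving_container_predict_route="/predict",
        serving_container_health_route="/health",
        serving_container_ports=[8080],
    )

    # Deploy to an endpoint; the machine_type set here is where the
    # question about N1-only availability for prediction comes in.
    endpoint = model.deploy(
        machine_type="n1-standard-4",
        min_replica_count=1,
        max_replica_count=1,
    )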

There is a similar question, but it does not raise the specific questions I hope to have answered here.

If you know of any other possible limitations, please post.

Upvotes: 0

Views: 817

Answers (1)

Shawn

Reputation: 1593

  1. If you don't specify a disk size, it defaults to 100 GB.
  2. I'm not aware of any right now. But since it's a custom container, you can just run it locally or on GKE for debugging purposes (see the sketch after this list).
  3. Are you looking for this? https://cloud.google.com/vertex-ai/docs/predictions/using-private-endpoints
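For point 2, something like this is what I mean by running the container locally; a rough sketch using the docker Python SDK, with a placeholder image name, and assuming your container honors the standard AIP_* environment variables that Vertex AI would set for it:

    import time
    import docker
    import requests

    client = docker.from_env()

    # Start the serving container locally with the same environment
    # variables Vertex AI would pass to it at deploy time.
    container = client.containers.run(
        "us-central1-docker.pkg.dev/my-project/my-repo/predictor:latest",  # placeholder image
        ports={"8080/tcp": 8080},
        environment={
            "AIP_HTTP_PORT": "8080",
            "AIP_PREDICT_ROUTE": "/predict",
            "AIP_HEALTH_ROUTE": "/health",
        },
        detach=True,
    )

    # Give the server a moment to come up, then smoke-test both routes.
    time.sleep(5)
    print(requests.get("http://localhost:8080/health").status_code)
    print(requests.post("http://localhost:8080/predict",
                        json={"instances": [[1.0, 2.0, 3.0]]}).json())

    container.stop()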

Upvotes: 1
