Reputation: 11
In kfp-1.8.12 there is a pretty useful new feature whereby we can specify custom indices for python dependencies in the component decorator (see this PR).
I've got a component that uses a Python package already hosted on GCP Artifact Registry in a PyPI repository. At the moment, to use it, I build a separate Docker image that includes the package and use that as the base image for my component.
It'd be cleaner if I could just install the package directly from the internal PyPI index, but the authentication doesn't appear to work. If I include my internal index in pip_index_urls, the component prompts for a user and immediately fails:
User for <region-redacted>-python.pkg.dev:
Is there a way of authenticating vertex pipelines with a pypi repository hosted on Artifact Registry?
Upvotes: 1
Views: 813
Reputation: 2347
You need to follow the instructions to set up authentication for your Artifact Registry Python index; however, you have to do that before KFP starts trying to install packages.
Setting that up itself requires installing some packages, and if installing those packages already requires authentication, you're stuck in a chicken-and-egg situation.
So the solution is to put the authentication setup (it's just pip install keyring keyrings.google-artifactregistry-auth) into a Dockerfile and build a custom image. You don't need Docker installed locally; use Cloud Build to build the image. Then run your Kubeflow component on Vertex AI with that custom image.
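A minimal Dockerfile along these lines might look like the following (the base image is just an example; pick whatever matches your component's Python version):

```dockerfile
FROM python:3.9-slim

# Install the keyring backend so pip can authenticate to Artifact Registry
# using the environment's Google credentials (e.g. the Vertex AI service account).
RUN pip install keyring keyrings.google-artifactregistry-auth
```

You can then build and push it with Cloud Build, e.g. gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT/REPO/kfp-base:latest (project, repo, and tag are placeholders), and pass the resulting image as the base_image of your component.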
Another solution, of course, would be to also install the packages you need during the Cloud Build step, so that your custom image already has everything it needs. That has the added benefit that your KFP component starts up a bit faster, because it doesn't need to install anything at runtime.
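A sketch of that variant, with a hypothetical package name and index URL standing in for your internal ones:

```dockerfile
FROM python:3.9-slim

# Keyring backend so pip can authenticate to the Artifact Registry index.
RUN pip install keyring keyrings.google-artifactregistry-auth

# Preinstall the internal package at build time; the index URL and
# package name below are placeholders for your own repository.
RUN pip install \
    --extra-index-url https://us-python.pkg.dev/my-project/my-repo/simple/ \
    my-internal-package
```

With this image, the component decorator no longer needs pip_index_urls or packages_to_install for that dependency at all.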
Upvotes: 3