Reputation: 379
I've been trying GCP's Artifact Registry, which is currently in alpha for Python packages.
I do the authentication via Keyring along with my service account, as explained in the documentation.
I can successfully upload a package using Twine, and I can successfully download it to a local Python project by installing the following requirements.txt:
--extra-index-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/simple/
my-package
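For context, the local setup that works looks roughly like this (the package names are real, the key file path is a placeholder for my actual one, and the index URL is the same placeholder as above):
# authenticate via the keyring plugin using my service account key
pip install keyring keyrings.google-artifactregistry-auth twine
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/my-service-account.json
# upload the package, then install it from the private index
twine upload --repository-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/ dist/*
pip install -r requirements.txt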
However, when I deploy a minimal Cloud Function to the same project as my Artifact Registry, with the same requirements.txt
shown above, the deployment fails with the following output:
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Build failed: `pip_download_wheels` had stderr output:
ERROR: Could not find a version that satisfies the requirement my-package (from -r requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for my-package (from -r requirements.txt (line 2))
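The deploy command itself is nothing special; it looks roughly like this (function name, region and runtime are placeholders for my actual values):
gcloud functions deploy my-function \
  --project my-project --region my-region \
  --runtime python39 --trigger-http --source .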
I tried with both --extra-index-url and just plain --index-url, with no difference. I also tried installing the keyring dependencies with the following requirements.txt:
--extra-index-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/simple/
keyring
keyrings.google-artifactregistry-auth
my-package
But I get the same error.
I checked the permissions for my default App Engine service account for my project, which is also used for Cloud Functions, and I can confirm that it has the Artifact Registry Reader role, so it doesn't seem to be a permissions issue.
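(For what it's worth, this is roughly how I checked the binding; my-project stands in for my actual project ID:)
gcloud projects get-iam-policy my-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:my-project@appspot.gserviceaccount.com" \
  --format="table(bindings.role)"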
I also tried deploying a minimal App Engine service instead of a Cloud Function, but I get the same error.
Many thanks for the help.
Upvotes: 6
Views: 3643
Reputation: 189
This was definitely the missing piece; I thought I had to give the default service account permission on the Artifact Registry. Granting the role to the Cloud Build service account instead got me there. I didn't have to do anything other than that.
Upvotes: 0
Reputation: 4401
Took me a while, but I managed to get a Cloud Function (CF) in one project to download a package from another project.
There are a couple of steps involved, one of which is, as of now, not documented. Doing some testing and looking at the logs helped me narrow down the actual behavior.
1: Have a package in one project. I'll call that project repo-project.
Note: the package I uploaded is a simple one that just returns 42 when its only function is called. Any hello-world package should suffice.
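For reference, building and uploading that test package to repo-project looks roughly like this (auth via the keyring plugin or a JSON key; repository name and location are placeholders):
pip install build twine
python -m build
twine upload --repository-url https://LOCATION-python.pkg.dev/REPO-PROJECT/python-repo/ dist/*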
2: Have another project for the Cloud Function. I'll call that project cf-project.
3: Create a service account in either project, and give it the Artifact Registry Reader role in repo-project. I'll call this artifact-sa.
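In gcloud terms, that is roughly (repository name and location are placeholders):
gcloud iam service-accounts create artifact-sa --project=REPO-PROJECT
gcloud artifacts repositories add-iam-policy-binding python-repo \
  --project=REPO-PROJECT --location=LOCATION \
  --member="serviceAccount:artifact-sa@REPO-PROJECT.iam.gserviceaccount.com" \
  --role="roles/artifactregistry.reader"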
4: This is the undocumented step: give the Cloud Build service account from cf-project the same Artifact Registry Reader role in repo-project. The name of this account has the format <PROJECT-NUMBER>@cloudbuild.gserviceaccount.com.
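Which comes down to something like:
# PROJECT-NUMBER is the project number of cf-project
gcloud artifacts repositories add-iam-policy-binding python-repo \
  --project=REPO-PROJECT --location=LOCATION \
  --member="serviceAccount:PROJECT-NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/artifactregistry.reader"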
5: Not sure if this one is needed, but it's how I did it. I ran the command below, pointing it at a downloaded JSON key for artifact-sa:
gcloud artifacts print-settings python --json-key="artifact-sa.json" --repository=REPO --location=LOCATION
This prints out a value for --extra-index-url to put in a requirements.txt, which includes the JSON key. I think using the keyring method mentioned by the OP would also work here.
(Note: I did some extra steps, which are not strictly needed, to make sure the key doesn't get uploaded to any GitHub repo attached to the code for the CF; see below.)
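(If you don't have the key yet, it can be downloaded with something like:)
gcloud iam service-accounts keys create artifact-sa.json \
  --iam-account=artifact-sa@REPO-PROJECT.iam.gserviceaccount.com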
6: Deploy the code however you like.
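For example, something along the lines of (name, runtime and trigger are just placeholders):
gcloud functions deploy my-func \
  --project=CF-PROJECT --runtime=python39 --trigger-http --source=.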
So, to summarize: the first authentication to the repo is done with whatever SA you use (e.g. in the keyring, or using the method I described above). Stupidly enough, the download itself is done with the built-in SA for Cloud Build from the project you are deploying the Cloud Function to (cf-project). IMHO this should be done by the same SA as the first.
As to how I found out that the Cloud Build SA was the issue: when I only added artifact-sa to repo-project, the deployment of the CF in cf-project did find the exact .whl file, with the correct version number (checking the error in the logs). It tried to download the package, but got a 403 on said download.
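(You can see this yourself in the Cloud Build history of cf-project, e.g.:)
gcloud builds list --project=CF-PROJECT --limit=5
gcloud builds log BUILD_ID --project=CF-PROJECT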
I've had other scenarios where the internal usage of SAs on Google's side was a bit wonky, and the Cloud Build one is definitely a repeat offender here.
I created a secondary requirements file and added it to my .gitignore to make sure it doesn't get uploaded to my repo, because uploading keys is a bad idea:
requirements.txt
-r privatereq.txt
mypythonpackage
privatereq.txt
--extra-index-url https://_json_key_base64:[BASE64_KEY_SNIPPED]@[LOCATION]-python.pkg.dev/[REPO-PROJECT]/python-repo/simple/
.gitignore
*/privatereq.txt
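(A quick sanity check that the private file is actually being ignored:)
git check-ignore -v privatereq.txt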
Upvotes: 9
Reputation: 1
Pointing the requirements to:
--extra-index-url https://<location>-python.pkg.dev/<project>/<repository>/simple
<package>
and imports in main:
from <package> import <module>
works for me. Remember to repeat your package's required modules in the requirements.txt (the setup.cfg from packaging is only used during the build process).
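For example, if your package's setup.cfg declares a dependency on requests (just an illustrative name), the function's requirements.txt has to list it too:
--extra-index-url https://<location>-python.pkg.dev/<project>/<repository>/simple
<package>
requests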
Upvotes: 0