I spent way too long trying to get uv publish to work with Google Artifact Registry. The authentication dance is more annoying than it should be.
Why Artifact Registry over PyPI?
Artifact Registry works well for internal packages because:
- Private - Not everything should be public
- IAM integration - Same permissions as other GCP resources
- Regional - Data stays in your region
- Scanning - Vulnerability scanning built in
For open source, PyPI is still the way to go.
The fix
First, I tried the keyring approach:
uv tool install keyring --with keyrings.google-artifactregistry-auth
That didn't work - uv only consults keyring when you enable it with --keyring-provider subprocess, and even with that set I couldn't get it going. After much trial and error, here's what actually works:
uv publish --index my-registry \
--username oauth2accesstoken \
--password $(gcloud auth application-default print-access-token) \
dist/*
The key thing is that the username must be the literal string oauth2accesstoken - that's how Artifact Registry knows the password is an OAuth2 access token rather than a real account password.
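For --index my-registry to resolve, uv also needs the index defined in pyproject.toml, including a publish-url. A sketch, assuming hypothetical names my-project and my-repo in us-central1 (swap in your own project, repository, and region):

```toml
# pyproject.toml - placeholder project/repo/region values
[[tool.uv.index]]
name = "my-registry"
url = "https://us-central1-python.pkg.dev/my-project/my-repo/simple/"
publish-url = "https://us-central1-python.pkg.dev/my-project/my-repo/"
```

The url entry (with the /simple/ suffix) is what uv uses to install from the registry; publish-url (without it) is what uv publish uploads to.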
Don’t forget to build first:
uv build
If things go wrong, add -vv to see what’s happening:
uv publish --index my-registry --username oauth2accesstoken dist/* -vv
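What the verbose output shows is uv attaching the credentials as an HTTP Basic auth header - the access token simply rides along as the password. A minimal sketch of that encoding (the token value here is a placeholder, not a real token):

```python
import base64

# uv sends Basic auth: base64("username:password"). With the literal
# username "oauth2accesstoken", Artifact Registry treats the password
# part as a Google OAuth2 access token.
token = "ya29.EXAMPLE-TOKEN"  # normally from: gcloud auth application-default print-access-token
credentials = f"oauth2accesstoken:{token}".encode()
header = "Basic " + base64.b64encode(credentials).decode()
print(header)
```

If authentication fails, decoding the header you see in the -vv output this way is a quick check that the right username and a non-empty token actually went over the wire.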
CI/CD setup
For GitHub Actions, use Workload Identity Federation (no service account keys):
# Note: the job needs `permissions: id-token: write` for Workload
# Identity Federation to mint a GitHub OIDC token.
- name: Authenticate to GCP
  uses: google-github-actions/auth@v2
  with:
    workload_identity_provider: ${{ vars.WIF_PROVIDER }}
    service_account: ${{ vars.SERVICE_ACCOUNT }}

- name: Publish to Artifact Registry
  run: |
    uv build
    uv publish --index my-registry \
      --username oauth2accesstoken \
      --password $(gcloud auth print-access-token) \
      dist/*
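One more CI gotcha: the service account has to have write access to the repository, or the upload fails with a 403 even though authentication succeeded. A sketch of the IAM binding, with hypothetical project, repository, and account names:

```shell
# Placeholder names throughout - grant the publishing service account
# write access to the Artifact Registry repository.
gcloud artifacts repositories add-iam-policy-binding my-repo \
  --location=us-central1 \
  --member="serviceAccount:publisher@my-project.iam.gserviceaccount.com" \
  --role="roles/artifactregistry.writer"
```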
Further reading
- uv publish documentation - publishing options