Pushing Python Packages to Artifact Registry Using Cloud Build
Published at: 11/22/2024
Categories: python, gcp, cloudbuild, docker
Author: jayanth-mkv
Google Artifact Registry is a powerful solution for managing and hosting Python package artifacts in a private, secure, and scalable way. This guide provides a step-by-step walkthrough to push Python package .whl files to the Artifact Registry using Google Cloud Build and a secret (creds) from Google Secret Manager for authentication.
Prerequisites
- Artifact Registry Setup:
  - Create a Python repository in your Artifact Registry:

    gcloud artifacts repositories create python-packages \
      --repository-format=python \
      --location=us-central1 \
      --description="Python packages repository"

- Secret Setup:
  - Store your service account key as a secret in Google Secret Manager:

    gcloud secrets create creds --data-file=path/to/key.json

- Grant Cloud Build access to the secret (optional; this can also be done using IAM):

  gcloud secrets add-iam-policy-binding creds \
    --member="serviceAccount:$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')@cloudbuild.gserviceaccount.com" \
    --role="roles/secretmanager.secretAccessor"

- Cloud Build Permissions: Ensure your Cloud Build service account has the necessary permissions to access Artifact Registry and Secret Manager (see the sketch below this list).
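If those roles have not been granted yet, a minimal sketch of doing it from the command line looks like the following (assuming the default Cloud Build service account and that PROJECT_ID is set in your shell; adjust to your setup):

PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')

# Let Cloud Build push packages to Artifact Registry
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role="roles/artifactregistry.writer"

# Let Cloud Build read secrets (skip if you already granted access per-secret above)
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"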
Cloud Build YAML Configuration
Here's the full working cloudbuild.yaml file:
options:
  machineType: E2_HIGHCPU_8
  substitutionOption: ALLOW_LOOSE
  logging: CLOUD_LOGGING_ONLY

steps:
  # Step 1: Access the secret `creds` and save it as `key.json`
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: bash
    args:
      - '-c'
      - |
        gcloud secrets versions access latest --secret=creds > /workspace/key.json

  # Step 2: Configure `.pypirc` with the Artifact Registry credentials
  - name: 'python'
    entrypoint: bash
    args:
      - '-c'
      - |
        cat > ~/.pypirc << EOL
        [distutils]
        index-servers = tower-common-repo
        [tower-common-repo]
        repository: https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/
        username: _json_key_base64
        password: $(base64 -w0 /workspace/key.json)
        EOL

        # Step 3: Build and upload the Python package
        pip install twine build && \
        python -m build && \
        twine upload --repository tower-common-repo dist/* --verbose
Step-by-Step Explanation
- Define Build Options:
  - Set the machine type, substitution behavior, and logging options.
  - These configurations ensure efficient builds and manageable logs.

- Retrieve the key.json Secret:
  - Use gcloud secrets versions access to fetch the key.json file securely from Secret Manager.
  - Save the file to a known location (/workspace/key.json).

- Configure .pypirc:
  - Generate a .pypirc file dynamically. This file is required for twine to authenticate with the Artifact Registry.
  - The password is the base64-encoded content of key.json.

- Build and Push Package:
  - Install the necessary tools (twine, build).
  - Build the Python package with python -m build (see the local dry-run sketch after this list).
  - Use twine upload to push the .whl file to the Artifact Registry.
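The build step above assumes the repository root already contains packaging metadata (a pyproject.toml or setup.py). If you want to sanity-check that part before involving Cloud Build, the same commands can be run locally; this is just a sketch of that dry run, not part of the pipeline itself:

# Run from the repository root, ideally inside a virtual environment
pip install twine build
python -m build          # produces dist/*.whl and dist/*.tar.gz
twine check dist/*       # verifies the package metadata before any upload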
Triggering the Build
Save the cloudbuild.yaml file and trigger the build manually, or connect the configuration to a GitHub repository trigger (sketched below):
gcloud builds submit --config=cloudbuild.yaml .
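If you would rather have builds run automatically on every push, you can attach a trigger to a connected GitHub repository instead of submitting builds by hand. A sketch (the owner, repository name, and branch pattern below are placeholders, and the repository must already be connected to Cloud Build):

gcloud builds triggers create github \
  --repo-owner="your-github-user" \
  --repo-name="your-repo" \
  --branch-pattern="^main$" \
  --build-config="cloudbuild.yaml"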
Key Points
- Secure Secrets Management: The secret (key.json) is accessed securely using Google Secret Manager.
- Dynamic Configuration: .pypirc is generated during the build, ensuring no sensitive data is stored in the repository.
- Automated Upload: The process automates package building and pushing, reducing manual intervention.
Validation
After the build completes:
- Verify the uploaded package in the Artifact Registry (or install it directly, as sketched after this list):
gcloud artifacts packages list --repository=python-packages --location=us-central1
- Check for errors or warnings in the build logs.
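You can also confirm the package is actually installable from the private index. A sketch, assuming a hypothetical package name my-package and that your local gcloud credentials can read the repository (the keyring helper lets pip reuse those credentials):

# Optional: let pip authenticate with your gcloud credentials
pip install keyring keyrings.google-artifactregistry-auth

# Install the freshly uploaded package from the private repository
pip install my-package \
  --index-url https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/simple/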