# GCP Secrets Management

## Creating secrets
- Go to the Google Secret Manager UI.
- Click the **+ CREATE SECRET** button at the top.
- Fill in the name of your secret, e.g. `bigquery_credentials`.
- Under **Secret value**, upload your service account credentials JSON file or paste the JSON into the text area labeled **Secret value**.
- Scroll all the way down and click the **CREATE SECRET** button.
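
If you prefer scripting this over clicking through the UI, the same secret can be created with the `google-cloud-secret-manager` Python client. This is a minimal sketch, assuming your project ID is available in the `GCLOUD_PROJECT` environment variable and your key file is named `service_account.json` (both are placeholders):

```python
import os

from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
project_id = os.environ['GCLOUD_PROJECT']  # placeholder: your GCP project ID

# Create the (empty) secret container with automatic replication.
secret = client.create_secret(
    request={
        'parent': f'projects/{project_id}',
        'secret_id': 'bigquery_credentials',
        'secret': {'replication': {'automatic': {}}},
    }
)

# Attach the service account key JSON as the first secret version.
with open('service_account.json', 'rb') as f:  # placeholder file name
    client.add_secret_version(
        request={'parent': secret.name, 'payload': {'data': f.read()}}
    )
```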
You can mount secrets from Google Secret Manager through Terraform configurations or through the Google Console UI.
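
Alternatively, a pipeline can fetch a stored secret directly at runtime with the same client library instead of mounting it. A hedged sketch, reusing the `bigquery_credentials` name from above and again assuming `GCLOUD_PROJECT` is set:

```python
import os

from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
project_id = os.environ['GCLOUD_PROJECT']  # placeholder: your GCP project ID

# Read the latest version of the secret and decode its JSON payload.
name = f'projects/{project_id}/secrets/bigquery_credentials/versions/latest'
response = client.access_secret_version(request={'name': name})
credentials_json = response.payload.data.decode('utf-8')
```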
## Using secrets locally

### Download credentials from GCP UI
1. Download the credentials JSON file from GCP.

2. Run Mage and mount the secrets as a volume in Docker. Follow these instructions to learn how to do this.

3. Here is an example code snippet to read from that credentials JSON file:

   ```python
   with open('/home/secrets/gcp_credentials.json', 'r') as f:
       print(f.read())
   ```
> **Note:** This code example assumes your credentials JSON file downloaded from GCP is named `gcp_credentials.json` and that the mount path (e.g. the `-v` flag) you used when running Docker is `/home/secrets`.
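
Beyond printing the file, the mounted key can be used to build an authenticated client. A minimal sketch, assuming the same `/home/secrets/gcp_credentials.json` mount path (the BigQuery client and the sanity-check query here are illustrative, not from the original):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Load the mounted service account key and scope a client to its project.
credentials = service_account.Credentials.from_service_account_file(
    '/home/secrets/gcp_credentials.json'
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Quick sanity check: run a trivial query to confirm the credentials work.
print(list(client.query('SELECT 1').result()))
```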
### Download credentials using `gcloud` CLI
1. Authenticate locally by running this command in your local terminal:

   ```bash
   gcloud auth application-default login
   ```

2. Create a new `.env` file in your Mage project folder with the following values:

   ```bash
   GOOGLE_APPLICATION_CREDENTIALS="[PATH TO YOUR USER CREDENTIALS, MOST LIKELY: ~/.config/gcloud/application_default_credentials.json]"
   GCS_BUCKET=[YOUR DEV BUCKET]
   GCLOUD_PROJECT=[YOUR PROJECT]
   ```
3. Run Mage using Docker and set the environment variable `GOOGLE_APPLICATION_CREDENTIALS`. Follow these instructions to learn how to do this. For example, set the environment variable to:

   ```bash
   -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/keys/FILE_NAME.json
   ```

4. Run Mage and mount the secrets as a volume in Docker. Follow these instructions to learn how to do this. For example:

   ```bash
   -v ~/.config/gcloud/application_default_credentials.json:/tmp/keys/FILE_NAME.json
   ```
5. Here is an example code snippet that exports a DataFrame to your dev bucket (a verification sketch follows this list):

   ```python
   from datetime import datetime
   import os

   import pytz
   from google.cloud import storage
   from pandas import DataFrame


   @data_exporter
   def export_data_to_google_cloud_storage(df: DataFrame, **kwargs) -> None:
       bucket_name = os.getenv('GCS_BUCKET')

       # Build a date-stamped object key in Pacific time.
       now = datetime.utcnow()
       pt = pytz.timezone('America/Los_Angeles')
       now_pst = pytz.utc.localize(now).astimezone(pt)
       object_key = f'test_file_{now_pst.strftime("%Y-%m-%d")}.csv'

       # storage.Client() picks up GOOGLE_APPLICATION_CREDENTIALS automatically.
       storage_client = storage.Client()
       bucket = storage_client.bucket(bucket_name)
       blob = bucket.blob(object_key)
       blob.upload_from_string(df.to_csv())

       print(f'df uploaded to {bucket_name}/{object_key}.')
   ```
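
To confirm the container can actually authenticate and that the export landed, a small check like the following can help. This is a sketch, not part of the original walkthrough, assuming the same `GCS_BUCKET` environment variable as above:

```python
import os

import google.auth
from google.cloud import storage

# Resolve Application Default Credentials inside the container.
credentials, project = google.auth.default()
print(f'Authenticated against project: {project}')

# List objects in the dev bucket to confirm the exported CSV is there.
client = storage.Client(credentials=credentials, project=project)
for blob in client.list_blobs(os.getenv('GCS_BUCKET')):
    print(blob.name)
```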