## Add credentials
- Create a new pipeline or open an existing pipeline.
- Expand the left side of your screen to view the file browser.
- Scroll down and click on the file named `io_config.yaml`.
- Enter the following keys and values under the key named `default` (you can have multiple profiles; add them under whichever is relevant to you). Use either a path to your service account key file, or the key contents as a mapping (e.g. from a secret):
```yaml
version: 0.1.1
default:
  # Option 1: Path to a service account JSON file
  GOOGLE_SERVICE_ACC_KEY_FILEPATH: /path/to/service_account_credentials.json
  # Option 2: Inline credentials (e.g. from env vars or secrets)
  # GOOGLE_SERVICE_ACC_KEY: "{{ json_value(env_var('GCP_CREDENTIALS_JSON'), 'private_key') }}"
  # Or use mage_secret_var / aws_secret_var etc. for the full credentials object
```
If the `GOOGLE_APPLICATION_CREDENTIALS` environment variable is set (e.g. to the path of your service account JSON), you can also use the IO class without adding keys to `io_config.yaml`; otherwise, use `with_config` as shown below.
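To make the precedence above concrete, here is a minimal, hedged sketch of how a credential source could be chosen from an `io_config.yaml` profile. The function name and return shape are illustrative assumptions for this example, not Mage's actual `ConfigFileLoader` implementation:

```python
import os

def resolve_credentials(profile: dict) -> dict:
    """Illustrative sketch of choosing a GCS credential source from an
    io_config.yaml profile -- not Mage's actual ConfigFileLoader logic."""
    # A file path to the service account JSON takes precedence
    if 'GOOGLE_SERVICE_ACC_KEY_FILEPATH' in profile:
        return {'filepath': profile['GOOGLE_SERVICE_ACC_KEY_FILEPATH']}
    # Next, inline key contents (e.g. interpolated from a secret)
    if 'GOOGLE_SERVICE_ACC_KEY' in profile:
        return {'key': profile['GOOGLE_SERVICE_ACC_KEY']}
    # Otherwise fall back to Application Default Credentials via the env var
    return {'env': os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')}
```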
## Using a Python block
- Create a new pipeline or open an existing pipeline.
- Add a data loader or transformer block (the code snippet below is for a data loader).
- Select `Generic (no template)`.
- Enter this code snippet (note: change `config_profile` from `default` if you have a different profile):
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.google_cloud_storage import GoogleCloudStorage
from os import path
from pandas import DataFrame

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_from_gcs(**kwargs) -> DataFrame:
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'

    bucket_name = '...'  # Change to your bucket name
    object_key = '...'  # Change to your object key (e.g. path/to/file.parquet)

    return GoogleCloudStorage.with_config(
        ConfigFileLoader(config_path, config_profile)
    ).load(bucket_name, object_key)
```
- Run the block.
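Conceptually, the load step downloads the object's bytes and parses them into a DataFrame. A rough sketch of that parsing half, with the format inferred from the object key's extension (the helper name and format list are illustrative assumptions, not Mage's actual API):

```python
import io

import pandas as pd

def parse_object_bytes(data: bytes, object_key: str) -> pd.DataFrame:
    """Parse downloaded object bytes into a DataFrame, inferring the format
    from the object key's extension. Illustrative only -- not Mage's API."""
    if object_key.endswith('.parquet'):
        return pd.read_parquet(io.BytesIO(data))
    if object_key.endswith('.json'):
        return pd.read_json(io.BytesIO(data))
    # Default to CSV for anything else
    return pd.read_csv(io.BytesIO(data))
```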
## Export data to Google Cloud Storage

Add a data exporter block and enter this code snippet:
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.google_cloud_storage import GoogleCloudStorage
from os import path
from pandas import DataFrame

if 'data_exporter' not in globals():
    from mage_ai.data_preparation.decorators import data_exporter


@data_exporter
def export_data_to_gcs(df: DataFrame, **kwargs) -> None:
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'

    bucket_name = '...'
    object_key = '...'

    GoogleCloudStorage.with_config(
        ConfigFileLoader(config_path, config_profile)
    ).export(df, bucket_name, object_key)
```
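The export step is the mirror image: serialize the DataFrame to bytes, then upload them to the bucket. A minimal sketch of the serialization half (the helper name and format inference are illustrative assumptions, not Mage's actual implementation):

```python
import io

import pandas as pd

def dataframe_to_object_bytes(df: pd.DataFrame, object_key: str) -> bytes:
    """Serialize a DataFrame to bytes ready for upload, choosing the format
    from the object key's extension. Illustrative only -- not Mage's API."""
    buf = io.BytesIO()
    if object_key.endswith('.parquet'):
        df.to_parquet(buf, index=False)
    else:
        # Default to CSV for anything else
        df.to_csv(buf, index=False)
    return buf.getvalue()
```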
Google Cloud Storage supports both loading and exporting data.
## Permissions
Ensure your Google Cloud service account has the appropriate roles, for example:
- Storage Object Viewer – to read objects
- Storage Object Creator – to write objects
- Storage Object Admin – for full read/write/delete