Add credentials
- Create a new pipeline or open an existing pipeline.
- Expand the left side of your screen to view the file browser.
- Scroll down and click on a file named `io_config.yaml`.
- Enter the following keys and values under the key named `default` (you can have multiple profiles; add them under whichever profile is relevant to you).
Standard connection
```yaml
version: 0.1.1
default:
  MYSQL_DATABASE: ...
  MYSQL_HOST: ...
  MYSQL_PASSWORD: ...
  MYSQL_PORT: 3306
  MYSQL_USER: root
```
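For context, the `MYSQL_*` keys in the profile map directly onto the connection parameters a MySQL client expects (the same parameters used in the `mysql.connector` example later on this page). The sketch below is a hypothetical helper, not part of Mage's API, that illustrates that mapping:

```python
# Hypothetical helper: maps a profile's MYSQL_* keys to the keyword
# arguments mysql.connector.connect() expects. Illustrative only.
def profile_to_connect_kwargs(profile):
    key_map = {
        'MYSQL_DATABASE': 'database',
        'MYSQL_HOST': 'host',
        'MYSQL_PASSWORD': 'password',
        'MYSQL_PORT': 'port',
        'MYSQL_USER': 'user',
    }
    # Keep only the keys that are actually present in the profile.
    return {arg: profile[key] for key, arg in key_map.items() if key in profile}


# Example profile values (placeholders, not real credentials):
profile = {
    'MYSQL_DATABASE': 'analytics',
    'MYSQL_HOST': 'localhost',
    'MYSQL_PASSWORD': 'secret',
    'MYSQL_PORT': 3306,
    'MYSQL_USER': 'root',
}
kwargs = profile_to_connect_kwargs(profile)
```

In practice Mage reads these values for you via `ConfigFileLoader`; this is only to show where each value ends up.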
SSH tunnel connection
Only in Mage Pro. Try our fully managed solution to access this advanced feature.
If your MySQL database is in a private network and you need to connect via an SSH bastion host, Mage Pro supports configuring an SSH tunnel directly in io_config.yaml:
```yaml
version: 0.1.1
default:
  MYSQL_DATABASE: ...
  MYSQL_HOST: ...
  MYSQL_PASSWORD: ...
  MYSQL_PORT: 3306
  MYSQL_USER: root
  # Pro only: SSH tunnel settings
  MYSQL_SSH_HOST: bastion.example.com
  MYSQL_SSH_PORT: 22
  MYSQL_SSH_USERNAME: ec2-user
  # Use one of the following for authentication:
  MYSQL_SSH_PKEY: /path/to/private/key.pem  # path to SSH private key
  MYSQL_SSH_PASSWORD: your_ssh_password  # only if using password auth
```
- `MYSQL_SSH_HOST`: Hostname or IP address of the SSH bastion.
- `MYSQL_SSH_PORT`: SSH port (default: 22).
- `MYSQL_SSH_USERNAME`: SSH user to log in as.
- `MYSQL_SSH_PKEY`: Path to the private key file (preferred).
- `MYSQL_SSH_PASSWORD`: Password for the SSH user (alternative to a key).

⚠️ You only need either `MYSQL_SSH_PKEY` or `MYSQL_SSH_PASSWORD`, not both.
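The either/or rule above can be expressed as a small validation, sketched here as a hypothetical helper (not part of Mage's API) that picks the configured SSH auth method from a profile dict:

```python
def pick_ssh_auth(profile):
    """Return the single SSH auth method configured in a profile.

    Hypothetical helper illustrating the rule above: exactly one of
    MYSQL_SSH_PKEY or MYSQL_SSH_PASSWORD should be set.
    """
    pkey = profile.get('MYSQL_SSH_PKEY')
    password = profile.get('MYSQL_SSH_PASSWORD')

    if pkey and password:
        raise ValueError('Set only one of MYSQL_SSH_PKEY or MYSQL_SSH_PASSWORD')
    if pkey:
        return ('pkey', pkey)
    if password:
        return ('password', password)
    raise ValueError('Set one of MYSQL_SSH_PKEY or MYSQL_SSH_PASSWORD')
```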
Using SQL block
- Create a new pipeline or open an existing pipeline.
- Add a data loader, transformer, or data exporter block.
- Select `SQL`.
- Under the `Data provider`/`Connection` dropdown, select `MySQL`.
- Under the `Profile` dropdown, select `default` (or the profile you added credentials under).
- Enter the optional name of the table to write to.
- Under the `Write policy` dropdown, select `Replace` or `Append` (see the SQL blocks guide for more information on write policies).
- Enter this test query: `SELECT 1`.
- Run the block.
Using Python block
Using MySQL connector
```python
import mysql.connector
import pandas as pd

# Connect using the same credentials you added to io_config.yaml.
conn = mysql.connector.connect(
    host='...',
    password='...',
    port=3306,
    user='root',
    database='...',
)

sql = """
SELECT
    id
    , email
FROM auth_user
"""

cursor = conn.cursor()
cursor.execute(sql)
rows = cursor.fetchall()
cursor.close()
conn.close()

df = pd.DataFrame(rows, columns=['id', 'email'])
df.to_csv('default_repo/users.csv', index=False)
```
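When the query takes user-supplied values, pass them as parameters rather than interpolating them into the SQL string; `mysql.connector` uses `%s` placeholders and escapes the values for you, which avoids SQL injection. A minimal sketch (`build_user_query` is a hypothetical helper, not part of any library):

```python
def build_user_query(min_id):
    # The SQL text and the values travel separately; the driver
    # substitutes and escapes the values at execution time.
    sql = 'SELECT id, email FROM auth_user WHERE id >= %s'
    params = (min_id,)
    return sql, params


# Usage with mysql.connector:
#   cursor.execute(*build_user_query(100))
```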
Using Mage’s MySQL python blocks
- Create a Data Loader block from the MySQL Template
Example snippet:
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.mysql import MySQL
from os import path

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_data_from_mysql(*args, **kwargs):
    """
    Template for loading data from a MySQL database.
    Specify your configuration settings in 'io_config.yaml'.

    Docs: https://docs.mage.ai/design/data-loading#mysql
    """
    query = 'Your MySQL query'  # Specify your SQL query here
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'

    with MySQL.with_config(ConfigFileLoader(config_path, config_profile)) as loader:
        return loader.load(query)
```
- Create a Data Exporter block from the MySQL Template
Example snippet:
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.mysql import MySQL
from pandas import DataFrame
from os import path

if 'data_exporter' not in globals():
    from mage_ai.data_preparation.decorators import data_exporter


@data_exporter
def export_data_to_mysql(df: DataFrame, **kwargs) -> None:
    """
    Template for exporting data to a MySQL database.
    Specify your configuration settings in 'io_config.yaml'.

    Docs: https://docs.mage.ai/design/data-loading#mysql
    """
    table_name = 'your_table_name'  # Specify the name of the table to export data to
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'

    with MySQL.with_config(ConfigFileLoader(config_path, config_profile)) as loader:
        loader.export(
            df,
            None,
            table_name,
            index=False,  # Specifies whether to include index in exported table
            if_exists='replace',  # Specify resolution policy if table name already exists
        )
```
- Custom types
To override a column type when running a Python export block, specify the column name and type in the `overwrite_types` dict in the data exporter config.

Here is an example code snippet:
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.mysql import MySQL
from pandas import DataFrame
from os import path

if 'data_exporter' not in globals():
    from mage_ai.data_preparation.decorators import data_exporter


@data_exporter
def export_data_to_mysql(df: DataFrame, **kwargs) -> None:
    """
    Template for exporting data to a MySQL database.
    Specify your configuration settings in 'io_config.yaml'.

    Docs: https://docs.mage.ai/design/data-loading#mysql
    """
    table_name = 'your_table_name'  # Specify the name of the table to export data to
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'
    overwrite_types = {'column_name': 'VARCHAR(255)'}

    with MySQL.with_config(ConfigFileLoader(config_path, config_profile)) as loader:
        loader.export(
            df,
            None,
            table_name,
            index=False,  # Specifies whether to include index in exported table
            if_exists='replace',  # Specify resolution policy if table name already exists
            overwrite_types=overwrite_types,
        )
```
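If several columns need pinning, you can build the `overwrite_types` dict programmatically from your DataFrame's dtype names. The helper and the dtype-to-type mapping below are illustrative assumptions, not Mage's internal mapping; adjust the MySQL types to your data:

```python
# Illustrative mapping from pandas dtype names to MySQL column types.
DTYPE_TO_MYSQL = {
    'object': 'VARCHAR(255)',
    'int64': 'BIGINT',
    'float64': 'DOUBLE',
    'bool': 'TINYINT(1)',
}


def build_overwrite_types(dtypes, columns):
    """Build an overwrite_types dict for the given columns.

    dtypes:  {column_name: dtype_name}, e.g. from df.dtypes.astype(str).to_dict()
    columns: the columns whose types you want to pin explicitly
    """
    return {col: DTYPE_TO_MYSQL[dtypes[col]] for col in columns}
```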
Method arguments
| Field name | Description | Example values |
| --- | --- | --- |
| allow_reserved_words | Whether to allow using reserved words as column names. | `True`/`False` (default: `False`) |
| auto_clean_name | Whether to automatically clean the column name (replace spaces with underscores, avoid using a number as the column name prefix). | `True`/`False` (default: `True`) |
| case_sensitive | Whether to support case-sensitive columns. | `True`/`False` (default: `False`) |
| drop_table_on_replace | Whether to drop the table when the `if_exists` param is set to `replace`. | `True`/`False` (default: `False`) |
| if_exists | Specify the resolution policy if the table name already exists. | `'fail'`/`'replace'`/`'append'` (default: `'replace'`) |
| overwrite_types | Overwrite the column types. | `{'column1': 'INTEGER', 'column2': 'VARCHAR(255)'}` |
| unique_conflict_method | How to handle conflicts on unique constraints. Use `'UPDATE'` for UPSERT (update existing rows, insert new ones) or `'IGNORE'` to skip duplicates. | `'UPDATE'` or `'IGNORE'` (default: `None`) |
| unique_constraints | The unique constraints of the table. Used with `unique_conflict_method` for UPSERT operations. | `['col1', 'col2']` (default: `None`) |
Example: Using UPSERT (UPDATE or INSERT)
```python
from mage_ai.settings.repo import get_repo_path
from mage_ai.io.config import ConfigFileLoader
from mage_ai.io.mysql import MySQL
from pandas import DataFrame
from os import path

if 'data_exporter' not in globals():
    from mage_ai.data_preparation.decorators import data_exporter


@data_exporter
def export_data_to_mysql(df: DataFrame, **kwargs) -> None:
    """
    Template for exporting data to a MySQL database with UPSERT support.
    Specify your configuration settings in 'io_config.yaml'.

    Docs: https://docs.mage.ai/design/data-loading#mysql
    """
    table_name = 'your_table_name'
    config_path = path.join(get_repo_path(), 'io_config.yaml')
    config_profile = 'default'

    with MySQL.with_config(ConfigFileLoader(config_path, config_profile)) as loader:
        loader.export(
            df,
            None,
            table_name,
            index=False,
            if_exists='append',  # Use 'append' with UPSERT
            unique_constraints=['user_id', 'email'],  # Columns that form unique constraint
            unique_conflict_method='UPDATE',  # Update existing rows, insert new ones
        )
```
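For intuition, `unique_conflict_method='UPDATE'` behaves roughly like MySQL's native `INSERT ... ON DUPLICATE KEY UPDATE` (an assumed conceptual equivalence for illustration; the exact SQL Mage generates may differ). Either way, MySQL can only detect the conflict if the table has a unique index covering the constraint columns:

```python
# Illustration only, not Mage's internal SQL. Conceptually, each row of the
# exported DataFrame is written with a statement shaped like this:
upsert_sql = (
    'INSERT INTO your_table_name (user_id, email, name) '
    'VALUES (%s, %s, %s) '
    'ON DUPLICATE KEY UPDATE name = VALUES(name)'
)

# The duplicate is only detected if a unique index exists, e.g.:
#   ALTER TABLE your_table_name ADD UNIQUE KEY (user_id, email);
```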