Quickstart
Go from zero to Mage hero in under a minute. We’ll walk you through installing Mage and running your first pipeline. 🦸‍♀️
⛵️ Overview
We recommend using Docker to get started.
Docker is a tool that allows you to run Mage in a containerized environment: you can run Mage on any operating system that supports Docker, including Windows, Mac, and Linux. Using Docker means that you don’t have to worry about installing dependencies or configuring your environment. If you’d like to install Mage without Docker, you can use pip or conda instead.
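As a quick sketch of the non-Docker route, assuming the package name `mage-ai` (as published on PyPI and conda-forge):

```bash
# Install Mage into your current Python environment...
pip install mage-ai

# ...or into a conda environment from the conda-forge channel
conda install -c conda-forge mage-ai

# Then create and start a new project (the name is a placeholder)
mage start demo_project
```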
If you’re familiar with Docker Compose or plan on adding or extending images (e.g. Postgres) in your project, we recommend starting from the Docker Compose template. Otherwise, we recommend `docker run`.
🪄 Get Mage
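Both paths below are sketches based on the official `mageai/mageai` image and the `mage-ai/compose-quickstart` template repository; the project name `demo_project` is a placeholder, so swap in your own.

With `docker run`:

```bash
# Map the UI port (6789) and mount the current directory so your
# project files persist on the host outside the container.
docker run -it -p 6789:6789 -v $(pwd):/home/src mageai/mageai \
  /app/run_app.sh mage start demo_project
```

With Docker Compose:

```bash
# Clone the Compose quickstart template, copy the dev env file,
# and bring the services up.
git clone https://github.com/mage-ai/compose-quickstart.git mage-quickstart
cd mage-quickstart
cp dev.env .env && rm dev.env
docker compose up
```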
🏃‍♂️ Run your first pipeline
If you haven’t already, open a browser to http://localhost:6789. From the pipelines page, select `example_pipeline` and open the notebook view by selecting `Edit pipeline` from the left side nav.
Select the first block by clicking it, then select the “play” icon in the top right to run the block. You’ve just run your first Mage block and loaded data from a dataset!
Do the same for the remaining blocks in the pipeline to transform and export the data. Congrats, you’re now a Mage ninja! 🥷
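For reference, each block is an ordinary Python function with a Mage decorator, and whatever a block returns is passed to the next block in the pipeline. A data loader block looks roughly like this (a sketch following Mage’s standard block template; the dataset URL is a placeholder):

```python
import io

import pandas as pd
import requests

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_data_from_api(*args, **kwargs):
    # Fetch a CSV over HTTP and return it as a pandas DataFrame;
    # the return value becomes the input of the downstream block.
    url = 'https://example.com/data.csv'  # placeholder URL
    response = requests.get(url)
    return pd.read_csv(io.StringIO(response.text), sep=',')
```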
🧙🏻‍♂️ Install Mage dependencies (optional)
Mage also has the following add-on packages:
| Package | Install | Description |
| --- | --- | --- |
| all | `mage-ai[all]` | install all add-ons |
| azure | `mage-ai[azure]` | install Azure-related packages |
| clickhouse | `mage-ai[clickhouse]` | use ClickHouse for data import or export |
| dbt | `mage-ai[dbt]` | install dbt packages |
| google-cloud-storage | `mage-ai[google-cloud-storage]` | use Google Cloud Storage for data import or export |
| hdf5 | `mage-ai[hdf5]` | process data in HDF5 file format |
| mysql | `mage-ai[mysql]` | use MySQL for data import or export |
| postgres | `mage-ai[postgres]` | use PostgreSQL for data import or export |
| redshift | `mage-ai[redshift]` | use Redshift for data import or export |
| s3 | `mage-ai[s3]` | use S3 for data import or export |
| snowflake | `mage-ai[snowflake]` | use Snowflake for data import or export |
| spark | `mage-ai[spark]` | use Spark (EMR) in your Mage pipeline |
| streaming | `mage-ai[streaming]` | use streaming pipelines |
To install these, run the following command from the Mage terminal:
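For example, to add the Postgres add-on (substitute any package from the table above; quoting the argument keeps your shell from treating the brackets as a glob pattern):

```bash
pip install "mage-ai[postgres]"
```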
or add the following to your `requirements.txt` file:
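Continuing the Postgres example, the entry is just the install string from the table:

```
mage-ai[postgres]
```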
You can access the terminal from the side nav on the right in the pipeline editor page. Read more about installing from `requirements.txt` here.
🧭 Journey on
Navigate to our tutorials to learn more about Mage and how to build your own pipelines, or continue exploring our docs for advanced configuration and deployment options.
If you’re interested in connecting a database in Docker, check out our guide for more information.