In this tutorial, we’ll create a DAG in Airflow for scheduling and running a data pipeline, all from the Mage UI.
Open the requirements.txt file in the root directory of your Airflow project, and add the mage-ai library:
Next, change directory into dags/ and create a new Mage project in that folder.

If you're using Docker, run the following command in the dags/ folder:
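Something along these lines should work; the mageai/mageai image and the /home/src mount are Mage's usual Docker defaults rather than necessarily the tutorial's exact command:

```
docker run -it -v $(pwd):/home/src mageai/mageai mage init demo_project
```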
If you installed Mage using pip, run the following command in the dags/ folder:
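With a pip install of mage-ai, the same step is the CLI's init command:

```
mage init demo_project
```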
This will create a new Mage project named demo_project inside your dags/ folder.

Your current folder structure should look like this:
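Only the relevant pieces are sketched below; your Airflow project root will contain other files as well:

```
<your Airflow project>/
├── requirements.txt
└── dags/
    └── demo_project/
```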
In the dags/ folder, create a new file named create_mage_pipelines.py. Then, add the following code:
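The tutorial supplies the exact contents of this file; its job is to expose the Mage pipelines in demo_project to Airflow as DAGs prefixed with mage_pipeline_ (see the final step below). A rough, hand-rolled sketch of that idea, assuming a standard Mage project layout with a pipelines/ subfolder and the mage run CLI command, could look like this:

```python
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Absolute path to the Mage project that sits next to this DAG file.
PROJECT_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'demo_project')
PIPELINES_PATH = os.path.join(PROJECT_PATH, 'pipelines')

if os.path.isdir(PIPELINES_PATH):
    for pipeline_name in sorted(os.listdir(PIPELINES_PATH)):
        # Each pipeline in a Mage project lives in its own subfolder.
        if not os.path.isdir(os.path.join(PIPELINES_PATH, pipeline_name)):
            continue

        dag_id = f'mage_pipeline_{pipeline_name}'
        dag = DAG(
            dag_id,
            start_date=datetime(2023, 1, 1),
            schedule_interval=None,  # trigger manually from the Airflow UI
            catchup=False,
        )

        # Run the Mage pipeline through the Mage CLI.
        BashOperator(
            task_id='run_pipeline',
            bash_command=f'mage run {PROJECT_PATH} {pipeline_name}',
            dag=dag,
        )

        # Airflow discovers DAGs by inspecting module-level globals.
        globals()[dag_id] = dag
```

However the file is actually implemented, the effect the tutorial describes is the same: each pipeline in demo_project appears in Airflow as a DAG whose name starts with mage_pipeline_.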
In the dags/ folder, start the Mage tool.

If you're using Docker, run the following command in the dags/ folder:
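A typical invocation (the mageai/mageai image, port 6789, and the /home/src mount are Mage's usual defaults, not necessarily the tutorial's exact command) is:

```
docker run -it -p 6789:6789 -v $(pwd):/home/src mageai/mageai mage start demo_project
```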
If you installed Mage using pip, run the following command in the dags/ folder:
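With a pip install, this is the Mage CLI's start command; by default the Mage UI comes up on port 6789:

```
mage start demo_project
```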
Create a pipeline named etl demo based on the tutorial from the previous step, then find a DAG named mage_pipeline_etl_demo in Airflow. If you named it something else, find a DAG with the prefix mage_pipeline_.
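If you'd rather check from a terminal than the Airflow UI, Airflow's CLI (2.x) can confirm that the generated DAGs were registered:

```
airflow dags list | grep mage_pipeline_
```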