Sensor
A sensor is a block that continuously evaluates a condition until it’s met or until a period of time has elapsed.
In your pipeline, you can add sensors that run continuously and complete only when an external pipeline or external block has run successfully for a particular partition.
Then, you can add a block in your pipeline that depends on that sensor.
If there is a block with a sensor as an upstream dependency, that block won’t start running until the sensor has evaluated its condition successfully.
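Conceptually, a sensor is just a polling loop. Here is a minimal sketch of that idea (illustrative only, not Mage's actual implementation; the function and parameter names are invented):

```python
import time

def run_sensor(condition, poll_interval=60, timeout_hours=24,
               sleep=time.sleep, clock=time.monotonic):
    """Poll `condition` until it returns True or the timeout elapses."""
    deadline = clock() + timeout_hours * 3600
    while clock() < deadline:
        if condition():
            return True  # condition met: downstream blocks may start
        sleep(poll_interval)
    return False  # timed out without the condition ever being met
```

A block that depends on the sensor would only start once this loop returns `True`.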
Setup
Example
You can add sensors to a pipeline the same way you add other types of blocks.
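A sensor block might look roughly like the sketch below. In an actual Mage project, the `@sensor` decorator and the status-checking helper are provided by the framework; here they are stubbed out so the sketch is self-contained, and all names are illustrative:

```python
def sensor(func):
    # Stub for Mage's @sensor decorator (provided by the framework
    # in a real project).
    return func

def check_status(pipeline_uuid, execution_date, block_uuid=None, hours=24):
    # Stub for the framework's status check. In Mage, this inspects run
    # history for `pipeline_uuid` (or for `block_uuid` within it) in the
    # `hours` before `execution_date`. Here it always reports success,
    # purely for illustration.
    return True

@sensor
def check_condition(*args, **kwargs) -> bool:
    # Replace 'pipeline_uuid' with the UUID of the pipeline to watch.
    return check_status(
        'pipeline_uuid',
        kwargs['execution_date'],
        block_uuid=None,  # optional: watch a single block in that pipeline
        hours=24,         # optional: how far back from execution_date to look
    )
```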
In the above code, change `pipeline_uuid` to the UUID of the pipeline whose run status this sensor should keep checking.
The `block_uuid` positional argument is optional. If you provide a value for it, the sensor will check the status of that specific block in the pipeline, and the sensor will only complete once that block has finished running successfully.
The `hours` positional argument is also optional. If you provide a value for it, the sensor will check the status of pipeline runs within that many hours before `execution_date`. The default value is 24 hours.
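For instance, with the default of 24 hours, only runs in the day before `execution_date` count. A small illustration of that lookback cutoff (the helper name is invented):

```python
from datetime import datetime, timedelta

def lookback_cutoff(execution_date, hours=24):
    # Runs that started before this cutoff are ignored by the sensor.
    return execution_date - timedelta(hours=hours)

cutoff = lookback_cutoff(datetime(2024, 6, 2, 12, 0), hours=24)
print(cutoff)  # 2024-06-01 12:00:00
```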
Sensors can also have upstream block dependencies. The output of those upstream blocks is passed into the sensor through the `args` parameter, as in other Mage block types.
Currently, the sensor evaluates its condition once every 60 seconds.
Templates
When adding a sensor block, here are the available templates to choose from:
- Google BigQuery
- MySQL
- PostgreSQL
- Amazon Redshift
- Amazon S3
- Snowflake
Here is what each template checks.
Google BigQuery
Check the results of a BigQuery query.
MySQL
Check the results of a MySQL query.
PostgreSQL
Check the results of a PostgreSQL query.
Amazon Redshift
Check the results of a Redshift query.
Amazon S3
Check if a file or folder exists in an S3 bucket.
Snowflake
Check the results of a Snowflake query.
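The query-based templates above all follow the same pattern: run a query against the data source and complete once it returns a result. A rough, self-contained sketch of that pattern, using the standard-library `sqlite3` as a stand-in for the real warehouse connectors (the table and query are made up):

```python
import sqlite3

def query_returns_rows(connection, query) -> bool:
    # Sensor condition: succeed once the query yields at least one row.
    return connection.execute(query).fetchone() is not None

# Stand-in database; a real template would connect to BigQuery, MySQL,
# PostgreSQL, Redshift, or Snowflake instead.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE events (id INTEGER, processed INTEGER)')
conn.execute('INSERT INTO events VALUES (1, 1)')
conn.commit()
print(query_returns_rows(conn, 'SELECT 1 FROM events WHERE processed = 1'))  # True
```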