Airflow
You can leverage our full data stack approach to obtain a complete view of your data pipelines, from your orchestrator to the data warehouse and your BI tool.
By integrating Airflow with Sifflet, you will see at a glance:
- your DAGs, along with relevant information such as "Last/Next Execution Date" and "Last Updated Date"
- the latest status of each DAG run, so you can see directly whether a failure happened recently
You can also find all of Sifflet's custom operators and their purposes here.

"example_bash_operator" last ran correctly
"example_branch_datetime_operator_2" last run failed and needs attention
Follow these steps to integrate Airflow with Sifflet:
- Create a dedicated read-only user
- Connect to Sifflet
We currently support any self-hosted Airflow instance (version 2.0.0+) and are working on support for cloud-managed instances (MWAA on AWS, etc.).
1- Create a read-only user
You can create a dedicated Sifflet user with a "Viewer" role.
Please choose a "User Name" (for instance, "sifflet_user") and a secure password. Store them carefully, as you will need them later to connect to Sifflet.
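If you prefer to script the user creation rather than use the Airflow UI, you can use Airflow's `users create` CLI command, or, from Airflow 2.2 with the default FAB auth backend, the stable REST API's `POST /api/v1/users` endpoint. Below is a minimal sketch of the API payload; the email and name fields are hypothetical placeholders, not values Sifflet requires:

```python
import json

def viewer_user_payload(username: str, password: str, email: str) -> dict:
    # Payload shape for Airflow's stable REST API `POST /api/v1/users`
    # (available from Airflow 2.2 with the default FAB auth backend).
    return {
        "username": username,
        "password": password,
        "email": email,
        "first_name": "Sifflet",      # hypothetical value
        "last_name": "Integration",   # hypothetical value
        "roles": [{"name": "Viewer"}],  # read-only role, as recommended above
    }

payload = viewer_user_payload(
    "sifflet_user", "<secure-password>", "sifflet@example.com"
)
print(json.dumps(payload, indent=2))
```

The "Viewer" role is what keeps the account read-only; any other fields can be adapted to your conventions.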

2- Connect to Sifflet
To connect to Airflow on Sifflet, you will need three items:
- the connection details: your Host and Port
  - Host: you can use the entire URL. For instance, if your URL is http://xxxxx.yy, your Host value would be http://xxxxx.yy
- the secret, which corresponds to the username and password you previously chose
- the frequency: how often you want the information to be refreshed
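Before entering these details in Sifflet, you may want to check that the host and credentials work. A quick sanity check, sketched below with Python's standard library, is to call Airflow's stable REST API and list DAGs with basic auth (the host and password values are placeholders to replace with your own):

```python
import base64
from urllib import request

def dags_request(host: str, user: str, password: str) -> request.Request:
    # Build an authenticated request against Airflow's stable REST API.
    token = base64.b64encode(f"{user}:{password}".encode()).decode("ascii")
    req = request.Request(f"{host.rstrip('/')}/api/v1/dags")
    req.add_header("Authorization", f"Basic {token}")
    return req

req = dags_request("http://xxxxx.yy", "sifflet_user", "<password>")
print(req.full_url)
# request.urlopen(req) would return the DAG list if the instance is
# reachable and basic auth is enabled on your Airflow deployment
```

A `200` response with a JSON list of DAGs confirms both the host value and the read-only credentials before you add them to Sifflet.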

To create the Airflow secret, follow these steps:
- In "Integration" --> tab "Secrets", create a new secret
- In the "Secret" area, copy-paste the text below, replacing the placeholders with the username and password created in part 1:
{
  "user": "<username>",
  "password": "<password>"
}
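A malformed secret (a stray quote or trailing comma) is a common source of connection failures. One way to be safe is to generate the body programmatically; the sketch below simply renders the same two-field JSON shown above:

```python
import json

def airflow_secret(user: str, password: str) -> str:
    # Render the secret body expected by Sifflet: a JSON object
    # with exactly the "user" and "password" fields shown above.
    return json.dumps({"user": user, "password": password}, indent=2)

print(airflow_secret("sifflet_user", "<password>"))
```

Paste the printed output directly into the "Secret" area.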
You can also refer to this page on adding a data source in Sifflet.