You can leverage our full data stack approach to obtain a complete view of your data pipelines, from your orchestrator to the data warehouse and your BI tool.
By integrating Airflow with Sifflet, you will see at a glance:

  • your DAGs, along with relevant information such as "Last/Next Execution Date" and "Last Updated Date"
  • the status of each DAG's latest run, so you can see directly whether a failure happened recently

You can also find all of Sifflet's custom operators and their purposes here.

For example:

  • "example_bash_operator": last run succeeded
  • "example_branch_datetime_operator_2": last run failed and needs attention

Follow these steps to integrate Airflow with Sifflet:

  1. Create a dedicated read-only user
  2. Connect to Sifflet

📘

We currently support any self-hosted Airflow instance; support for cloud-managed instances (such as MWAA on AWS) is in progress.

1- Create a read-only user

You can create a dedicated Sifflet user with a "Viewer" role.
Please choose a "User Name" (for instance, "sifflet_user") and a secure password. Store them carefully as you will need them to connect to Sifflet later.
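Before moving on, you can check that the new account actually authenticates against Airflow's stable REST API (this assumes the `basic_auth` API backend is enabled on your instance). A minimal sketch in Python; the host, port, and credential values are placeholders for your own:

```python
import base64
import urllib.error
import urllib.request


def basic_auth_header(username: str, password: str) -> str:
    """Build the HTTP Basic Auth header value the Airflow REST API expects."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


def check_airflow_login(host: str, port: int, username: str, password: str) -> bool:
    """Return True if the credentials can list DAGs via GET /api/v1/dags."""
    req = urllib.request.Request(
        f"http://{host}:{port}/api/v1/dags",
        headers={"Authorization": basic_auth_header(username, password)},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False


# Example with assumed values (replace with your own host, port, and password):
# check_airflow_login("airflow.example.com", 8080, "sifflet_user", "<password>")
```

If the call returns `False`, double-check the role assignment and that API authentication is enabled before proceeding.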


2- Connect to Sifflet

To connect Airflow to Sifflet, you will need three things:

  • the connection details: your host and port
  • the secret, which corresponds to the username and password you chose previously
  • the frequency: how often you want the information to be refreshed
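Conceptually, each refresh amounts to polling Airflow's REST API with those credentials at the chosen frequency. As an illustration, the latest status of a DAG can be read from the response of `GET /api/v1/dags/{dag_id}/dagRuns` ordered by descending execution date; the endpoint path and field names below are Airflow's, while the helper is only a sketch of ours:

```python
from typing import Optional


def latest_run_state(dag_runs_payload: dict) -> Optional[str]:
    """Extract the state of the most recent run ("success", "failed",
    "running", ...) from a dagRuns response ordered newest-first."""
    runs = dag_runs_payload.get("dag_runs", [])
    return runs[0]["state"] if runs else None


# Sample fragment in the shape Airflow's REST API returns:
sample = {
    "dag_runs": [{"dag_run_id": "manual__2023-01-01", "state": "failed"}],
    "total_entries": 1,
}
print(latest_run_state(sample))  # failed
```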

To create the Airflow secret, follow the steps below:

  • In "Integration" --> tab "Secrets", create a new secret
  • In the "Secret" area, copy-paste the text below and replace the placeholders with the username and password you created in part 1:
{
  "user": "<username>",
  "password": "<password>"
}
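Before saving, it can help to confirm that the filled-in text is valid JSON with exactly the expected keys. A small sketch (this validation helper is ours for illustration, not part of Sifflet):

```python
import json


def validate_airflow_secret(secret_text: str) -> bool:
    """Return True if the text is valid JSON with exactly "user" and "password" keys."""
    try:
        data = json.loads(secret_text)
    except json.JSONDecodeError:
        return False
    return isinstance(data, dict) and set(data) == {"user", "password"}


secret = '{"user": "sifflet_user", "password": "s3cret"}'
print(validate_airflow_secret(secret))  # True
```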

You can also refer to this page on adding a data source in Sifflet.