We're excited to announce the launch of Sentinel, our new AI-powered agent designed to automate and accelerate the creation of data quality monitors. Sentinel intelligently analyzes your data assets and recommends a comprehensive set of monitors, helping you achieve full coverage in minutes, not hours.

Say goodbye to manual configuration and guesswork. Sentinel understands your data's context to suggest the most effective monitors for your specific needs.

Sample Sentinel recommendations

What's New:

  • AI-Powered Recommendations: Sentinel analyzes data samples and metadata to suggest a wide range of monitors, including checks for format, uniqueness, value ranges, logical consistency (e.g., shipping_date > order_date), and more.
  • Three Powerful Workflows: You can now access Sentinel from wherever you work:
    • On a Single Asset: Generate recommendations directly from any asset page for quick, targeted coverage.
    • In Bulk from the Data Catalog: Select up to 10 assets at once from the catalog to apply consistent monitoring at scale.
    • Across an Entire Data Product: Ensure comprehensive monitoring for all assets within a Data Product with a single click.
  • Streamlined Creation Process: A simple, guided flow allows you to review all AI suggestions, select the ones you want, and create them in a single action.
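To illustrate the kind of check behind a logical-consistency recommendation like the shipping_date > order_date example above, here is a minimal sketch. The actual queries Sentinel generates are internal to Sifflet; the function, table, and column names below are purely illustrative.

```python
# Illustrative sketch of a logical-consistency check: count rows where a
# "later" timestamp column precedes an "earlier" one. This is NOT the SQL
# Sentinel actually generates; it only shows the shape of such a monitor.

def consistency_check_sql(table: str, later_col: str, earlier_col: str) -> str:
    """Return a query counting rows that violate later_col >= earlier_col."""
    return (
        f"SELECT COUNT(*) AS failing_rows "
        f"FROM {table} "
        f"WHERE {later_col} < {earlier_col}"
    )

print(consistency_check_sql("orders", "shipping_date", "order_date"))
```

A monitor of this shape fails whenever failing_rows is greater than zero.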

Sentinel helps you save time, discover hidden data quality issues, and ensure your data assets are always reliably monitored.

➡️ Read the full documentation to get started with Sentinel

We've updated the Mute button to make your life easier! Now, when you mute a monitor, it will automatically unmute itself the next time its status changes.

This lets you silence temporary noise without worrying about forgetting to turn notifications back on. As always, you can still manually unmute at any time.

Sifflet's Monitor Muting button

App version: v557

Impact: response payload of the following endpoints

The response payload for those endpoints contains a new property named status indicating whether a user is enabled or disabled. This property is represented as an enum string with two possible values: ENABLED and DISABLED.

Example of the new response payload

{
   "id":"80807519-9b52-4c6c-88b1-3945e9b35a2e",
   "name":"Roger",
   "email":"[email protected]",
   "role":"EDITOR",
   "permissions":[
      {
         "domainId":"aaaabbbb-aaaa-bbbb-aaaa-bbbbaaaabbbb",
         "domainRole":"EDITOR"
      }
   ],
   "authTypes":[
      "LOGIN_PASSWORD"
   ],
   "status":"ENABLED"
}
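Client code consuming these endpoints can read the new property directly. The sketch below mirrors the payload above (trimmed for brevity) and treats any value other than the two documented enum strings as an error:

```python
# Sketch of client-side handling for the new status property. The payload
# shape mirrors the example above (trimmed for brevity); only the two
# documented enum values, ENABLED and DISABLED, are considered valid.

user = {
    "id": "80807519-9b52-4c6c-88b1-3945e9b35a2e",
    "name": "Roger",
    "role": "EDITOR",
    "status": "ENABLED",
}

def is_enabled(user: dict) -> bool:
    """Return True if the user is enabled; reject unknown status values."""
    status = user.get("status")
    if status not in ("ENABLED", "DISABLED"):
        raise ValueError(f"unexpected status: {status!r}")
    return status == "ENABLED"

print(is_enabled(user))  # True
```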

We're excited to introduce a major enhancement to your security and access management: role-based domain control for Access Tokens.

This update gives you a new level of precision, allowing you to assign each token to one or more specific domains. You can then grant, for each domain, one of four distinct domain-level roles: Viewer, Editor, Catalog Editor, or Monitor Responder.

This ensures that every token operates on the principle of least privilege, granting only the exact permissions needed. The result is a more secure, flexible, and manageable system for your entire team.

Role-Based Domain Control

What about my existing tokens?

For a seamless transition, your existing Access Tokens will continue to function as they do today, with access to all domains. You can now edit any of these tokens at your convenience to restrict their access to specific domains and assign them a precise role.
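Conceptually, a token's access is now a mapping from domains to roles, and each action is allowed only if the token's role in that domain grants it. The sketch below illustrates this least-privilege check using the four roles named above; the capability mapping is an assumption made for the example, not Sifflet's actual permission matrix.

```python
# Illustrative least-privilege check for domain-scoped Access Tokens.
# The four role names come from the release note above; the capabilities
# assigned to each role here are assumptions for the sake of the example.

ROLE_CAPABILITIES = {
    "Viewer": {"read"},
    "Monitor Responder": {"read", "manage_incidents"},
    "Catalog Editor": {"read", "edit_catalog"},
    "Editor": {"read", "edit_catalog", "manage_incidents", "edit_monitors"},
}

def token_can(token_domains: dict, domain: str, capability: str) -> bool:
    """Return True if the token's role in `domain` grants `capability`."""
    role = token_domains.get(domain)
    return role is not None and capability in ROLE_CAPABILITIES.get(role, set())

# A token scoped to two domains with different roles:
token = {"analytics": "Viewer", "marketing": "Editor"}
print(token_can(token, "analytics", "edit_catalog"))  # False: Viewer is read-only
print(token_can(token, "marketing", "edit_catalog"))  # True
```

A token with no entry for a domain is denied everything in that domain, which is exactly the least-privilege behavior described above.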

App version: v552

We've enhanced our debugging capabilities for incremental monitors. Previously, the "Show Failing Rows" button provided a view of the failing rows for the latest monitor run. Now, the feature is datapoint-specific.

This means you can select any point from a monitor's execution history and see the exact rows that caused a failure for that specific time and execution. This makes it much easier to investigate if a problem from a past run has been resolved or to analyze a specific incident in isolation.

Accessing failing rows at the datapoint level

App version: v542

We're thrilled to announce a fundamental redesign of how sources are managed in Sifflet. We've transitioned from a schema-by-schema approach to a unified, environment-level model. This update streamlines the user experience, provides a more accurate representation of your data landscape, and introduces powerful new tools for managing and troubleshooting your integrations.

Sifflet's new source management page

New Features & Major Improvements

  • Environment-Level Sources: Sources are now managed at the environment level. For example, your entire Snowflake account or BigQuery project is now represented as a single, consolidated source in Sifflet. Your existing sources have been automatically migrated to this new structure.

  • New Source Details Modal: Clicking on any source name now opens a powerful details modal. This new view provides a granular breakdown of all the schemas within that source and their individual statuses.

  • Granular Metadata Refresh: You now have precise control over what you refresh. Alongside the main "Run" button for a full source sync, you can now trigger refreshes for specific schemas or databases directly from within the new details modal.

  • Streamlined Failure Resolution: Troubleshooting connection issues is now faster than ever. The Source Details modal includes:

    • A "Failures" Tab: This view automatically lists only the schemas that failed to sync, so you can immediately see what needs attention.
    • "Rerun All Failures" Button: Trigger a targeted refresh for all failed items with a single click.
    • Per-Schema Logs: A "Logs" button appears next to each failed schema, giving you instant access to detailed error messages to diagnose the root cause.
  • Official Source Merging Process: We've introduced a clear, safe process for consolidating multiple sources into a single primary source—perfect for cleaning up your setup after the migration. Sifflet seamlessly maps all monitors and assets during the merge, ensuring no loss of data or observability.

🗒️ For API & Terraform Users

  • API Deprecation: Please be aware that the legacy API endpoint used for managing schema-level sources is now deprecated and will be decommissioned in a future release.
  • New API Endpoint: A new, more powerful API endpoint for managing environment-level sources is now available. We strongly encourage you to migrate your scripts and Terraform configurations to use the new endpoint. Please refer to our API documentation for full details.

We are confident these changes will make managing your data integrations in Sifflet a much more intuitive and efficient experience. For additional details on the new source management experience, refer to the dedicated documentation page.

App version: v531

Pipeline Alerting

by Mahdi Karabiben

We’re excited to announce a new Pipeline Alerting feature to help you proactively monitor the health of your data pipelines. This feature is currently available for dbt integrations, with support for Airflow and Fivetran coming soon.

Now, when you set up a dbt integration, you'll see a new Notifications section. This lets you get real-time alerts when a dbt model fails, with notifications sent to:

  • Slack
  • Email
  • Microsoft Teams
  • Webhooks

This new capability ensures you stay informed and can address issues as they happen, minimizing downtime and the impact of data pipeline problems. You can read about the feature and how to leverage it via the dedicated documentation page.

We're thrilled to announce a massive upgrade to our Apache Airflow integration! We've moved beyond simply listing DAGs in the catalog to providing deep, actionable insights that connect your data pipelines to the assets they produce. This update gives you a complete picture of your data's journey, from orchestration to consumption.

Here’s what’s new:

Airflow DAGs Directly in Your Lineage

You can now visualize the direct relationship between your Airflow DAGs and your data assets. By adding a simple query tag to your Airflow operators, Sifflet will automatically map which DAGs generate or update specific tables and views. This closes a critical gap in data lineage, allowing you to instantly understand the upstream source of any asset.

  • Benefit: Instantly identify which pipeline populates a given dataset for faster debugging and impact analysis.
  • How to start: Check out our new documentation to learn how to tag your queries.
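In practice, tagging means attaching an identifier to the queries your Airflow operators run so the warehouse query history carries the DAG context. The helper below is a hypothetical sketch of that idea; the exact tag syntax Sifflet expects is described in the documentation linked above, and the comment format here is a placeholder, not the real syntax.

```python
# Hypothetical helper that appends an identifying comment to a query run
# by an Airflow operator, so the warehouse query log carries the DAG and
# task ids. The comment format below is a placeholder for illustration;
# refer to the Sifflet documentation for the actual tag syntax.

def tag_query(sql: str, dag_id: str, task_id: str) -> str:
    """Append a DAG/task identification comment to a SQL statement."""
    return f"{sql}\n-- airflow: dag_id={dag_id}, task_id={task_id}"

sql = "INSERT INTO reporting.daily_orders SELECT * FROM staging.orders"
print(tag_query(sql, dag_id="orders_pipeline", task_id="load_daily_orders"))
```
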

Airflow DAG within the Sifflet lineage

Live Airflow DAG Status in Sifflet

No more switching between tools to check if a pipeline ran successfully. Sifflet now pulls the latest run status for each of your DAGs and displays it directly in the catalog and lineage.

  • Benefit: Monitor the health of your data pipelines from the same platform you use to monitor your data quality.

📄 Pipeline Context on Asset Pages

When you link a DAG to an asset, that pipeline context now appears directly on the asset's page. See which DAG is responsible for the data without leaving the asset view, and navigate to the DAG page for its description, its owner(s), and its most recent run status.

  • Benefit: Gain immediate, valuable context about an asset's provenance and health, empowering data consumers and accelerating root cause analysis.

🔮 What's Coming Next

This is just the beginning of our push for comprehensive pipeline monitoring. Here's a sneak peek at what our team is working on:

  • Smarter Root Cause Analysis: Our AI agent, Sage, will soon incorporate Airflow DAG status into its incident analysis. It will automatically flag failed or delayed DAGs as the likely root cause of data quality issues.
  • Task-Level Granularity: Soon, you'll be able to drill down even further with detailed metadata and status for individual Airflow tasks.
  • Expanded Orchestrator Support: We're bringing these same powerful capabilities to other leading workflow orchestrators, including Databricks Workflows and Azure Data Factory.

We encourage you to explore the new Airflow integration today! As always, we'd love to hear your feedback.

Impacted endpoints:

Previously, the tags property could contain a mix of tags defined in Sifflet and tags pulled from the source. Now:

  • The tags property only contains user-defined tags created in Sifflet.
  • A new externalTags property lists read-only tags from external systems (e.g., dbt, BigQuery, Snowflake, Databricks).

This applies to both Assets and Columns.

Before

{  
  ...
  "tags": [  
    { "id": "1", "kind": "TAG", "name": "A Sifflet tag" },  
    { "id": "2", "kind": "BIGQUERY_EXTERNAL", "name": "env:prod" } 
  ]
  ...
}

After

{  
  ...
  "tags": [  
    { "id": "1", "kind": "TAG", "name": "A Sifflet tag" }
  ],  
  "externalTags": [  
    { "id": "2", "kind": "BIGQUERY_EXTERNAL", "name": "env:prod" }
  ]  
  ...
}
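Client code that previously read the single mixed tags list can reproduce the old behavior by concatenating the two properties while migrating. A minimal sketch, using the payload shape from the example above:

```python
# Sketch of a client-side shim for code that previously consumed a single
# mixed tags list: concatenate tags and externalTags to reproduce the old
# behavior while migrating to the new split response.

asset = {
    "tags": [{"id": "1", "kind": "TAG", "name": "A Sifflet tag"}],
    "externalTags": [{"id": "2", "kind": "BIGQUERY_EXTERNAL", "name": "env:prod"}],
}

def all_tags(asset: dict) -> list:
    """Old-style view: Sifflet-defined tags followed by external tags."""
    return asset.get("tags", []) + asset.get("externalTags", [])

print([t["name"] for t in all_tags(asset)])  # ['A Sifflet tag', 'env:prod']
```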

We're excited to release Automatic Incident Grouping, a new feature designed to reduce alert fatigue and accelerate your investigation process.

  • Intelligent Grouping: Sifflet now automatically merges related failures—from freshness and volume monitors to dbt tests—into a single, consolidated incident.
  • Powered by Lineage & AI: Our grouping logic leverages data lineage and an AI model to accurately identify connected issues, helping you see the full picture.
  • AI-Generated Descriptions: Sifflet now also provides an AI-generated summary of every incident, so you can understand the context instantly.

Read more about it in the dedicated documentation page.

App version: v525