Apache Airflow has become the go-to platform in the modern data stack for programmatically authoring, scheduling, and monitoring data workflows. This open-source orchestrator bridges the gap between GUI-based and purely programmatic tools, making it the core orchestration layer of the Modern Data Stack.
Sifflet is thrilled to announce its latest integration with Apache Airflow. This integration empowers data engineers to better test their pipelines and isolate bad-quality data before it reaches business workflows. With it, Sifflet delivers further on its promise of a Full Data Stack Observability platform, ensuring data reliability at every stage of the Modern Data Stack.
With Sifflet, data teams relying on Airflow will be able to:
Centralize documentation for all data assets in one tool, allowing teams to search for DAGs and understand their downstream dependencies, such as models, data tables, and dashboards.
View DAG run statuses and, in case of failure, assess the impact on downstream pipelines and their dependencies.
Use Sifflet data quality monitoring within Airflow DAGs to detect and isolate bad-quality data, preventing it from propagating through the rest of the data stack.
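The quality-gate pattern behind the last point can be sketched in plain Python. This is a hypothetical illustration, not Sifflet's actual API: a check function that raises on bad records, so that wrapping it in an Airflow task (e.g. a `PythonOperator` callable) would fail the DAG run and stop downstream tasks before bad data spreads. The column names and the null-check rule are assumptions chosen for the example.

```python
from datetime import datetime, timezone

# Assumed schema for illustration only.
REQUIRED_COLUMNS = ["order_id", "amount"]

def check_no_nulls(rows: list[dict]) -> dict:
    """Fail fast if any required column is missing or null.

    Hypothetical quality check of the kind one might run inside an
    Airflow task, upstream of business-facing loads. Raising here
    fails the task, so Airflow halts the DAG and bad-quality data
    is isolated instead of propagating downstream.
    """
    bad_rows = [
        i
        for i, row in enumerate(rows)
        if any(row.get(col) is None for col in REQUIRED_COLUMNS)
    ]
    if bad_rows:
        raise ValueError(f"Null values in required columns at rows: {bad_rows}")
    return {
        "checked": len(rows),
        "status": "passed",
        "at": datetime.now(timezone.utc).isoformat(),
    }
```

In a real deployment, a check like this would run as its own task between extraction and load, so a failure stops the pipeline at the gate rather than after bad records have already landed in reporting tables.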
Want to learn more about our latest integration with Airflow? Reach out for a demo.