When dbt and Fivetran announced their merger, the news landed like an exhale: a move that felt both overdue and inevitable. For many, it signaled a new chapter in data’s ongoing consolidation story. But to me, it represents something subtler and more consequential: the end of an era defined by fragmentation and the beginning of one shaped by integration and intelligence.
For the better part of a decade, the modern data stack was built on modularity. Each tool specialized in one layer - from ingestion and transformation to orchestration, observability, and BI - with the promise that interoperability would equal control. It worked, to a point. But as the layers multiplied, complexity did too. Integration gave way to entropy.
That’s why the dbt x Fivetran merger isn’t just about bundling. It’s a reflection of what happens when the seams in the stack start showing: when data teams realize that duct-taping visibility across disconnected tools doesn’t create trust; it just creates noise.
In my Medium essay, The Modern Data Stack Is Dead—What Comes Next Will Be Smarter (and Yes, Agentic), I argued that we’re not watching the stack collapse; we’re watching it evolve. The future isn’t monolithic or modular; it’s agentic: systems that understand context, act autonomously, and close the loop between detection and resolution.
From Consolidation to Context
We saw this shift in full color last month at Big Data London. I opened my keynote with a provocation that still seems to resonate:
“Models aren’t the foundation, they’re the penthouse. Metadata is what holds the building up.”
It’s a line that tends to draw a few raised eyebrows, but it captures the reality of what’s happening on the ground. Enterprises are racing to deploy AI, yet most are still grappling with untrustworthy data foundations. The result is the “demo illusion”: proofs of concept that dazzle in controlled environments but disintegrate once they hit production.
The problem isn’t ambition; it’s architecture. We’ve optimized each layer of the stack in isolation, but few systems can interpret signals across layers. The winners of the next era won’t be those who consolidate tools; they’ll be those who integrate intelligence - who can reason over metadata, automate resolution, and make trust a first-class feature of their AI infrastructure.
A Step in That Direction: Sifflet on Snowflake Marketplace
That’s precisely the vision behind Sifflet’s new availability on the Snowflake Marketplace.
It might look like another consolidation story at first glance, but it’s really about context. Together, Sifflet and Snowflake bring contextual data observability purpose-built for the Data Cloud. Through native integration with Snowflake metadata, query logs, and Time Travel, Sifflet can detect anomalies, trace their lineage, and rank them by business impact - so teams don’t just see what broke, but why it matters.
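To make that concrete, here is a minimal sketch of the kind of check this integration enables. It is not Sifflet’s implementation; the table name, warehouse, and 50% drop threshold are illustrative assumptions. It compares today’s row count against a Time Travel snapshot and, when volume drops, pulls the recent write queries from Snowflake’s query history for context:

```python
# Hypothetical sketch of a metadata-driven volume check on Snowflake.
# Table, warehouse, and thresholds are illustrative, not Sifflet internals.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
)

TABLE = "ANALYTICS.PUBLIC.ORDERS"  # hypothetical table under observation

cur = conn.cursor()

# Current row count vs. the same table 24 hours ago, via Time Travel
# (requires the table's Time Travel retention to cover the window).
cur.execute(f"SELECT COUNT(*) FROM {TABLE}")
now_count = cur.fetchone()[0]
cur.execute(f"SELECT COUNT(*) FROM {TABLE} AT(OFFSET => -86400)")
prev_count = cur.fetchone()[0]

# Flag a sudden drop in volume (the 50% threshold is an arbitrary example).
if prev_count and now_count < 0.5 * prev_count:
    print(f"Volume anomaly: {prev_count} -> {now_count} rows in 24h")

    # Pull recent writes that touched the table from the query logs,
    # to give the alert context (ACCOUNT_USAGE views lag by up to ~45 min).
    cur.execute(
        """
        SELECT query_text, start_time
        FROM snowflake.account_usage.query_history
        WHERE query_type IN ('INSERT', 'DELETE', 'MERGE', 'COPY')
          AND contains(query_text, %s)
          AND start_time > dateadd('hour', -24, current_timestamp())
        ORDER BY start_time DESC
        LIMIT 5
        """,
        (TABLE.split(".")[-1],),
    )
    for query_text, start_time in cur.fetchall():
        print(start_time, query_text[:80])

cur.close()
conn.close()
```

A production system would rank such anomalies by downstream lineage and business impact rather than print them, but the raw ingredients - metadata, query logs, Time Travel - are what make that ranking possible.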
This partnership aligns with a bigger truth we’ve seen play out with customers: reliability isn’t an operational afterthought; it’s the precondition for AI readiness. While others bundle pipelines, we’re building the trust infrastructure that lets enterprises scale with confidence.
The Trust Gap and What Comes Next
In London, I spoke with dozens of executives leading AI transformations. Nearly all shared the same frustration: too many promising models stall before they deliver value. The issue isn’t a lack of talent or compute; it’s that data pipelines still behave like black boxes.
Bridging that gap means moving from passive observability to agentic systems that can detect, diagnose, and act in context.
At Sifflet, we frame this evolution in three layers:
- Autonomous Incident Resolution: AI agents that fix issues before they cascade.
- Declarative Trust Fabric: Governance defined as code, propagated automatically.
- Embedded Trust Signals: Reliability surfaced right where decisions happen: in BI dashboards, analytics apps, even LLM interfaces.
Each layer transforms trust from something you check into something you operate on.
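To make these layers less abstract, here is an illustrative sketch of how a declarative trust spec and an agentic detect-diagnose-act loop might fit together. The names (TrustSpec, run_agent) and structure are assumptions for exposition, not Sifflet’s API:

```python
# Illustrative sketch (not Sifflet's API) of the three layers:
# declarative trust specs, an agentic detect/diagnose/act loop,
# and a trust signal ready to surface to downstream consumers.
from dataclasses import dataclass
from typing import Callable

@dataclass
class TrustSpec:
    """Governance as code: a declarative expectation about a dataset."""
    table: str
    check: Callable[[str], bool]      # returns True when the table is healthy
    diagnose: Callable[[str], str]    # explains a failure in context
    remediate: Callable[[str], None]  # acts before the issue cascades

def run_agent(specs: list[TrustSpec]) -> dict[str, bool]:
    """One pass of the agentic loop; returns a trust signal per table."""
    signals = {}
    for spec in specs:
        healthy = spec.check(spec.table)
        if not healthy:
            print(f"[{spec.table}] incident: {spec.diagnose(spec.table)}")
            spec.remediate(spec.table)        # e.g. rerun a load, quarantine rows
            healthy = spec.check(spec.table)  # re-verify after acting
        signals[spec.table] = healthy  # embed this where decisions happen
    return signals

# Example wiring (hypothetical): a single spec with stand-in callables.
specs = [TrustSpec(
    table="analytics.orders",
    check=lambda t: True,  # stand-in for a real metadata probe
    diagnose=lambda t: "stale upstream load",
    remediate=lambda t: None,
)]
print(run_agent(specs))
```

In practice the check, diagnose, and remediate behaviors would be driven by metadata and lineage rather than hand-written callables; the point is that the spec is declarative, the loop closes itself, and the resulting signal travels to wherever decisions are made.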
Looking Forward
The consolidation we’re seeing now marks a closing chapter for the modern data stack—but also the opening of a smarter, more connected one. We’re entering an era where metadata isn’t documentation; it’s infrastructure.
And that shift changes everything: how data products are built, how AI is deployed, and ultimately, how enterprises compete.
If you’re thinking about how to operationalize trust as your AI initiatives scale, I’d love to continue the conversation.
