If you walked the halls of Gartner’s Data & Analytics conference in Orlando, one theme was impossible to ignore:
AI isn’t failing because it’s immature — it’s failing because our data foundations are.
For years, organizations invested in data platforms to support reporting, dashboards, and analytics. Then AI arrived — not as a gentle extension, but as a stress test. And under that pressure, long-standing cracks became impossible to ignore.
The issue isn’t a lack of algorithms. It’s that most enterprises were never designed to deliver AI-ready data.
Gartner defines AI-ready data as data whose fitness for a specific AI use case can be proven — continuously and contextually. That’s a radical departure from traditional data quality thinking.
Static rules, one-time certifications, and centralized approval workflows simply don’t survive in environments where data drifts, models evolve, and AI systems act in real time. AI-ready data must be continuously validated, contextually qualified, and demonstrably fit for the specific use case it serves.
This is why so many AI pilots stall after initial success. The data pipeline that worked once cannot keep up with change.
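To make the idea of continuous fitness concrete, here is a minimal, hypothetical sketch (the function names and threshold are illustrative, not from any specific product): each new batch is scored against a baseline before a model is allowed to consume it, instead of relying on a one-time certification.

```python
from statistics import mean, stdev

def drift_score(baseline, current):
    """Standardized mean shift of the current batch vs. the baseline.

    A deliberately simple drift signal: how many baseline standard
    deviations the current batch mean has moved.
    """
    b_mean, b_std = mean(baseline), stdev(baseline)
    if b_std == 0:
        return float("inf")
    return abs(mean(current) - b_mean) / b_std

def fit_for_use(baseline, current, threshold=1.0):
    """Gate a pipeline step: data stays 'AI-ready' for this use case
    only while its drift remains inside the agreed threshold."""
    return drift_score(baseline, current) < threshold

# A feature distribution that was certified once...
baseline = [10, 11, 9, 10, 12, 10, 11]
print(fit_for_use(baseline, [10, 10, 11, 9, 12]))   # True: still fit
print(fit_for_use(baseline, [25, 27, 26, 28, 24]))  # False: drifted, re-qualify
```

A one-time certification corresponds to running this check once; AI-ready data means running it on every batch, against every use case that consumes it.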
One of the strongest messages from Gartner was that governance must move from policy to execution.
Traditional governance models — reviews, approvals, manual controls — slow delivery and frustrate the business. In contrast, modern organizations embed governance directly into data pipelines, an approach Gartner calls DataGovOps: policies are enforced automatically as data moves, not reviewed after the fact.
Governance becomes invisible when it works — and invaluable when it’s missing.
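As an illustrative sketch (the names and policy here are invented for the example, not tied to any product), embedding governance in the pipeline can be as simple as a step wrapper that enforces policy on every run, so no record leaves the step in violation and no manual review board sits in the delivery path:

```python
def governed(policy):
    """Wrap a pipeline step so its output is checked against a policy
    on every run. Violations stop the pipeline immediately instead of
    waiting for a periodic review to find them."""
    def wrap(step):
        def run(records):
            out = step(records)
            bad = [r for r in out if not policy(r)]
            if bad:
                raise ValueError(f"{len(bad)} record(s) violate governance policy")
            return out
        return run
    return wrap

# Example policy: every record keeps its customer_id and carries no raw SSN.
@governed(lambda r: r.get("customer_id") and "ssn" not in r)
def load_customers(records):
    # The step itself masks PII, so the embedded check passes silently.
    return [{**{k: v for k, v in r.items() if k != "ssn"}, "loaded": True}
            for r in records]

rows = load_customers([{"customer_id": "c1", "ssn": "123-45-6789"}])
print(rows)  # PII stripped, record loaded
```

When the step behaves, the check is invisible; when it does not, the pipeline fails loudly at the point of violation, which is exactly the "invisible when it works" property described above.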
A recurring myth in data and analytics is that talent can compensate for poor architecture. Gartner’s research shows the opposite.
Most D&A organizations admit their architectures are not built for scale. They rely on human effort to stitch together pipelines, reconcile semantics, and validate outputs. AI breaks that model.
The future belongs to platform-centric architectures: platforms that automate the integration, semantic reconciliation, and validation work people currently do by hand.
People are essential — but architecture is what allows them to scale.
Centralized data teams cannot keep up with demand. Shadow IT, data silos, and tool sprawl are symptoms of that reality.
Gartner’s answer is clear: federated data management.
In a federated model, a central team sets shared standards and guardrails while domain teams own, produce, and deliver their data within them.
The organizations succeeding with AI aren’t choosing between control and agility — they’re designing for both.
For buyers, the takeaway is sobering but empowering.
Stop asking: “Does this tool support AI?”
Start asking: “Can this platform continuously prove that my data is fit for the AI use cases it serves?”
AI success isn’t about chasing the next model. It’s about building systems that can absorb change.
Whether the question is trust in data, its context, its availability, or its appropriateness for a given AI use case, metadata is the key cog, and investment in metadata is essential. Reflecting on my journey in the data world since 2006, that is the one constant. The three essentials: shift left, qualify with purpose, and make it simple to find, trust, and deliver data. When internalized, these themes will help an organization make its data AI-ready.
And that’s the real shift we saw in Orlando: From experimentation to execution. From tools to platforms. From data quality to data readiness.
AI didn’t break data management. It finally revealed what matters.