One theme resonated throughout this year’s Data Decoded London 2026: enterprise data teams are done with theory. They want to see what works in production, at scale, under real-world constraints.
Across live demonstrations, customer-led discussions, and hands-on conversations on the show floor, Pentaho had the opportunity to showcase how modern organizations are building resilient, AI-ready data pipelines that withstand operational complexity, regulatory demands, and growing data volumes.
At the Pentaho booth, our data experts explored how enterprises can simplify fragmented data environments and deliver trusted data across the business. Conversations centered on practical challenges data engineers and architects face every day: integration sprawl, brittle pipelines, performance bottlenecks, and the pressure to deliver reliable data faster, without adding unnecessary complexity.
It was great to be able to demonstrate live how organizations can design, orchestrate, and govern data pipelines that scale with confidence, supporting analytics and AI initiatives without compromising control or reliability.
One of the most compelling moments of Data Decoded for our team was our session, “How a Global Trading Platform Uses Pentaho Data Integration,” where attendees heard firsthand how MarketAxess leverages Pentaho to support complex, high-volume data operations at a global scale. The live interview and Q&A with Taryn-Vee Burnett, Senior Credit Risk Analyst at MarketAxess, focused on how Pentaho helps them integrate and orchestrate data reliably across systems while addressing operational complexity and supporting data-driven decision-making.
Rather than following the dreaded abstract architecture talk track, the discussion explored real-world lessons, design considerations, and outcomes from this enterprise deployment, giving practitioners practical takeaways they could apply immediately in their own environments.
Pentaho also delivered a live, technical demonstration focused on one of the most common challenges facing data teams today: building pipelines that grow without breaking.
Led by Ghaith Etbiga, Lead Solutions Consultant at Pentaho, “Watch It Live: Building Data Pipelines That Scale with Pentaho Data Integration” walked attendees through a real-time build of scalable pipelines, from connecting diverse data sources to transforming, orchestrating, and managing data flows efficiently.
Unlike a canned, scripted demo, the session highlighted how teams balance scalability, governance, and performance in production scenarios. Attendees gained visibility into common design pitfalls, best practices for batch and near-real-time processing, and techniques for creating pipelines that adapt as data demands evolve.
Data Decoded London brought together data engineers, architects, analytics leaders, and decision-makers looking for proven approaches, not hype. Our team reinforced a simple but powerful message: scalable data integration is possible when platforms are designed for real-world complexity, not just ideal-state architectures.
As the conversations continue post-event, we look forward to helping prospects, customers, and partners simplify data chaos, reduce operational friction, and build a data foundation that scales with the business.
If you missed Data Decoded London or want to continue the conversation, Pentaho experts are always happy to connect and share what works in the real world.