Yes, AI Was the Theme. But Underneath, It’s Clear We’re in a New Era of Data Management
Last week’s Gartner Data and Analytics Summit in Orlando, Florida, had the feel of a market in rapid transition. Given what the tech world has experienced over the past few years, that wasn’t a surprise.
On the expo floor, it was interesting to see the “villages” layout, with hubs for the various disciplines (e.g., analytics, AI, data management). I fully understand why the organizers went with this format: historically, attendees have mostly been experienced professionals who know what they need and are looking for discrete solutions.
However, my impression after walking the event and talking with a wide range of booth visitors was one of worlds colliding.
I also had the pleasure of connecting with Gartner analysts Cuneyd Kaya and Roxane Edjlali. It was fascinating to get their perspectives on a wide range of topics: why a hybrid approach to model development makes sense in an increasingly open-source model world, the value of helping organizations simplify how they manage data complexity, and how data will be the key differentiator for AI success as most algorithms become available off the shelf.
We see the same needs and trends within our customer base. Through the Pentaho+ Platform, we are helping organizations become more data-fit for AI, reinforce a data foundation for both core operations and AI with strong governance, and cost-effectively simplify data chaos.