Executive Q&A: Yeshwant Mummaneni’s Top Takeaways from the 2024 Gartner Data & Analytics Summit
Known as one of the biggest data analytics events of the year, the Gartner Data & Analytics Summit took place from March 11-13 in Orlando, Florida. Altair’s Yeshwant Mummaneni, chief engineer of data management and analytics, was on the scene and spoke to us about his observations and takeaways from this year’s event.
Q: What were your impressions of this year’s Gartner event? How did it compare to events of years past? What stuck out to you?
Yeshwant Mummaneni: This year, generative artificial intelligence (genAI) emerged as the dominant conference theme. From the keynote to vendors on the show floor, genAI was the topic of discussion. If I recall correctly, no single theme dominated prior conferences the way genAI did this year.
Q: Did you expect genAI to be such a big topic at this year’s event? Was this a trend you saw coming from previous events?
YM: Technologies as disruptive as genAI don’t come around every year. The difference with genAI is that it feels like it’s moving through the traditional hype cycle – innovation trigger, peak of inflated expectations, trough of disillusionment, slope of enlightenment, and plateau of productivity – very quickly. It is dominating the media and conferences alike because it has the potential to be so disruptive, so it’s no surprise that it was the main focus at this year’s event.
Q: How would you describe people’s attitudes toward genAI at the conference? Did people seem to have a solid grasp of the technology, or were they still looking for concrete use cases?
YM: Most attendees seemed very optimistic and enthusiastic about genAI and were eager to learn how to adopt it within the constraints of their current environment. But it was also clear that everyone was approaching it with a degree of caution.
The keynote emphasized the need for genAI readiness from a data, process, and people perspective. It addressed the need for “collective intelligence” – aligning on business outcomes, the organization’s mission, and long-term value creation – and pointed out that data readiness for traditional machine learning and analytics doesn’t necessarily make an organization ready for genAI. The keynote also focused on how to get the right processes in place, tackle regulations and risks, and prioritize execution over strategy.
Q: Was a focus on the nitty-gritty of genAI – data prep, governance, etc. – confined to the keynotes or was it a topic of focus from organizations at their booths and sessions?
YM: There were quite a few sessions and independent software vendors (ISVs) on the show floor highlighting the need for proper data governance. They also covered the emergence of the “metric store” – a centralized store of metric definitions that lives outside individual applications – alongside data catalogs, data quality, lineage, and security. Several sessions from ISVs and analysts presented solution templates and frameworks. Retrieval-augmented generation (RAG) – an AI framework that retrieves facts from an external store to ground large language models (LLMs) – was discussed quite a bit. With this approach, LLM-powered apps can be more accurate.
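To make the RAG idea concrete, here is a minimal, self-contained sketch in Python. The tiny keyword-overlap retriever, the sample knowledge base, and the stubbed call_llm function are illustrative assumptions rather than any vendor’s implementation; a production system would use an embedding model, a vector store, and a real LLM API.

```python
# Minimal RAG sketch. The keyword-overlap "retriever" and stubbed LLM call are
# illustrative stand-ins; a real deployment would use embeddings, a vector
# store, and an actual model API.

KNOWLEDGE_BASE = [
    "Invoices over $10,000 require two approvals.",
    "Support tickets must receive a first response within 4 hours.",
    "Expense reports are reimbursed within 30 days of submission.",
]


def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]


def call_llm(prompt: str) -> str:
    """Stub for an LLM call; swap in your model provider's client here."""
    return f"[LLM answer grounded in]\n{prompt}"


def answer_with_rag(question: str) -> str:
    # 1. Retrieve relevant facts from the external store.
    context = "\n".join(retrieve(question))
    # 2. Ground the model by placing those facts in the prompt.
    prompt = f"Context:\n{context}\n\nQuestion: {question}"
    # 3. The LLM answers from retrieved, current facts rather than memory alone.
    return call_llm(prompt)


if __name__ == "__main__":
    print(answer_with_rag("How quickly are expense reports reimbursed?"))
```

The key point is that the prompt is assembled from retrieved facts at question time, which is what grounds the model’s answer in current data.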
Q: How were people talking about the impact genAI would have in, say, the next five to 10 years? What were people’s attitudes in terms of how genAI would transform the ways we work and live?
YM: GenAI is going to have a significant impact on various professions – from designers to developers, lawyers, factory workers, and beyond. I think it will be mostly positive; genAI will be deployed alongside us, empowering humans, making us more productive.
The keynote spoke about the “AI opportunity radar,” a way for companies to map the opportunities presented by genAI – both internal and external – and align around them. These opportunities fall into what they called “game-changing AI” or “everyday AI.” Everyday AI could transform front-office operations such as customer support, an external-facing service, as well as back-office operations like invoice processing and resume analysis. With game-changing AI, a company could offer innovative, disruptive products and services.
Q: Let’s talk more about MLOps and LLMOps, which you felt were overlooked. What would you say to people who attended looking for more information on these topics?
YM: GenAI was certainly the focus of this year’s event from a Gartner perspective, but in my conversations with customers, MLOps – and now LLMOps – remain front-and-center challenges. As enterprises continue to adopt machine learning and now genAI, one of the challenges is figuring out how to provide a comprehensive, factory-style approach to delivering machine learning- and AI-powered apps. To do that, you need an end-to-end enterprise AI platform with strong MLOps capabilities. You want the same governance for your models as you have for your data – managing experiments, collaborating, promoting the best models to production through proper approval processes, and scaling and monitoring the models in production. These are all areas customers are still working to figure out, and they are where Altair, with our Altair® RapidMiner® platform, delivers value and where our message of converging high-performance computing (HPC), AI, and simulation resonates with customers.
Q: What were some other key focus areas besides genAI at the event?
YM: GenAI did overshadow other topics, but quantum computing, the metric store, and real-time data processing also caught my attention.
It was interesting to note that “quantum” was the most-inquired-about topic of 2023, according to Gartner, despite the boom of genAI. Quantum computing could disrupt the domains of optimization, partial differential equations, materials simulation, and factorization – all areas where Altair provides solutions.
As I mentioned earlier, the emergence of the “metric store” is very interesting. It opens the possibility of onboarding multiple business intelligence (BI) applications, letting users choose a best-in-class product like Altair® Panopticon™ – a real-time BI visualization platform. Because metric formulas are defined once, centrally, a metric store removes the risk of users defining them inconsistently, so metrics stay consistent across the organization. This gives customers the ability to use the visualization tools of their choice.
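As an illustration of the metric-store idea described above, here is a minimal Python sketch. The metric names, formulas, and sample data are hypothetical; the point is simply that formulas are defined once, outside any BI tool, so every consumer computes the same numbers.

```python
# Minimal metric-store sketch: metric formulas live in one central registry,
# outside any BI application, so every consuming tool gets identical results.
# The metrics and sample rows below are illustrative only.

orders = [
    {"revenue": 120.0, "cost": 80.0},
    {"revenue": 250.0, "cost": 150.0},
    {"revenue": 90.0, "cost": 95.0},
]

# Central registry of metric definitions (the "metric store").
METRICS = {
    "total_revenue": lambda rows: sum(r["revenue"] for r in rows),
    "gross_margin_pct": lambda rows: 100.0
    * (sum(r["revenue"] for r in rows) - sum(r["cost"] for r in rows))
    / sum(r["revenue"] for r in rows),
}


def get_metric(name: str, rows: list) -> float:
    """Any BI or visualization tool asks the store for a metric instead of redefining it."""
    return METRICS[name](rows)


# Two different dashboards request the same metric and get the same answer.
print(get_metric("gross_margin_pct", orders))  # dashboard A
print(get_metric("gross_margin_pct", orders))  # dashboard B
```

Because both dashboards pull the formula from the same registry, switching between visualization front ends doesn’t change the numbers.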
Altair’s IoT solutions and Panopticon have long handled real-time data processing and visualization. In the context of genAI, data pipelines need to run in real time – depending on the application’s needs, of course – so that LLMs are as current as possible. It was noted at the conference that roughly 80% of data movement today happens in batch, which may not address all genAI use cases, and that companies are looking to modernize their ETL pipelines to run in real time.
Q: Speaking about the event in general, how do you think Altair tackles the challenges discussed and delivers value in the data analytics space?
YM: I think we are positioned very well. Altair RapidMiner is an end-to-end analytics platform that can address broad challenges in the analytics value chain across a wide range of industry verticals – all the way from extracting data from semi-structured and unstructured sources to data transformation, data visualization, and machine learning/AI model development and operationalization.
At the event, there was also conversation around digital twin technology and Industry 4.0 use cases, where data analytics plays a very important role. In a presentation, Altair and Mativ demonstrated how AI can be leveraged to enhance manufacturing quality. The presentation highlighted the need for edge analytics – pushing compute down to where the data is generated, specifically on the shop floor in industrial analytics and manufacturing. Our solutions do that very well and make it convenient to push compute and models down to the edge. We are uniquely positioned in the manufacturing space to address a broad class of Industry 4.0 challenges.
Q: How is Altair moving to meet the next generation of technology and its ramifications, including genAI and beyond? How are we setting ourselves up for success now and in the future?
YM: One of our core principles over the past 30 years has been to provide end-to-end capabilities while remaining open. This means that our customers can start and finish their journey on the same platform, but we also provide a lot of on- and off-ramps throughout the process, enabling customers to use third-party products as needed. This open-architecture philosophy is something that differentiates and defines us and is something I’m sure we will continue to be rewarded for.
Specifically, with genAI, we are enabling easy access to genAI models, making it easy to fine-tune open models for specific use cases, and giving users the ability to embed and integrate these into a product, platform, or application. We are getting ready for a future in which genAI will replace many traditional machine learning use cases.
Our focus is on leveraging the powerful synergy between simulation, HPC, and data analytics. This convergence is key to tackling our customers’ most pressing challenges.