In 2024, data will become more conversational. Augmented by generative AI, data platforms and data visualization tools will transform data into a dynamic, interactive asset that “speaks” with its human users. Despite this undeniable appeal, business leaders will need to rethink how their organizations use data. Decision-makers should approach these innovations with cautious optimism, as their readiness can make or break the business’s ability to realize their value. To move forward with these innovations, enterprises must go back to basics: building a robust foundation in data and analytics.
In this final part of our series on 2024 tech and analytics trends and predictions, experts at Lingaro’s technology consulting and data visualization practices explore how generative AI is redefining how enterprises visualize and speak with data, and what business leaders need to be aware of to maximize its value, avoid risks, and navigate its impending ubiquity in the corporate world.
Augmenting data and analytics platforms with generative AI
Businesses are abuzz with generative AI. The hype isn’t just about technological advancement — it’s reshaping how businesses interact with, interpret, and use their data. In the Europe, Middle East, and Africa (EMEA) region, 65% of C-suite executives surveyed by IDC consider it a top investment priority, while 45% of executives polled by Gartner worldwide said they’re already piloting generative AI projects.
Carlos Navarro, senior director of Lingaro’s technology consulting practice, attributes this trend to its potential to improve productivity and maximize the business value of the organization’s data. Generative AI’s allure doesn’t just come in the form of creating texts, audio, and images — its capability to automate and streamline data management processes promises to significantly reduce the time and effort required to transform raw data into actionable insights, predictions, and recommendations.
Several forces are propelling this quantum leap, Navarro said. On one end, there’s the data lakehouse, a data management paradigm that caters to the exponential demand for sophisticated data storage and processing capabilities essential for expanding or developing AI solutions. For instance, Databricks noted that 61% of surveyed enterprises were already migrating to a lakehouse platform.
The data lakehouse is also the second top modern data architecture that surveyed business leaders and IT decision-makers are “currently researching” and consider having “compelling value” in the next five years (Market Study: 2023 Modern Data Architecture Trends).
On the other end, there’s the emergence of augmented data platforms powered by advanced language models. Navarro observed that they are playing an increasingly big role in the entire data life cycle — from design and development to testing and governance. He also noted that advances in generative AI are prompting early adopters and technical leaders alike to double down on data management to ensure that the data fed into AI projects is accurate, consistent, and trustworthy.
This synergy isn’t a surprise: Forrester’s 2023 survey revealed that 91% of analytics decision-makers with a mature, insights-driven business reported using augmented business intelligence platforms in their company. Navarro added that the integration of AI and generative AI in analytics will provide intelligent insights and automate data processing, capitalizing on virtual and augmented reality in business intelligence. In fact, Databricks recently rebranded its data platform as the Data Intelligence Platform, while Microsoft announced the general availability of Microsoft Fabric and a public preview of the AI-powered Copilot in Fabric (and, by extension, Power BI). These announcements emphasize the increasing prevalence of AI-driven, augmented data management and visualization.
Navarro explained, “The context driving these changes revolves around the building blocks of AI that enterprises often overlook — data. Traditional platforms are increasingly inadequate in feeding today’s sophisticated AI models. This gap led enterprises to focus on developing robust, future-proof infrastructures that can efficiently transform multiple sources and types of data into a single source of truth.”
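The idea of consolidating multiple sources into a single source of truth can be illustrated with a toy sketch. This is not Lingaro’s pipeline or any specific platform’s logic — the source systems, field names, and merge rule (keep the most recently updated record per key) are all hypothetical:

```python
from datetime import date

# Hypothetical extracts from two source systems, keyed by customer ID.
crm = {"C1": {"email": "a@old.example", "updated": date(2023, 1, 5)},
       "C2": {"email": "b@example.com", "updated": date(2023, 6, 1)}}
erp = {"C1": {"email": "a@new.example", "updated": date(2023, 9, 9)}}

def consolidate(*sources):
    """Merge per-key records across sources, keeping the most recently updated one."""
    merged = {}
    for source in sources:
        for key, record in source.items():
            if key not in merged or record["updated"] > merged[key]["updated"]:
                merged[key] = record
    return merged

truth = consolidate(crm, erp)
print(truth["C1"]["email"])  # a@new.example — the newer ERP record wins
```

Real consolidation layers add schema mapping, lineage, and quality checks, but the principle is the same: conflicting copies are resolved by an explicit rule so downstream AI models see one consistent record.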
This single source of truth, Navarro added, can lower development costs, automate processes, and expedite workflows. For instance, Gartner projects that by 2028, 75% of enterprise software engineers will use AI- and generative AI-augmented tools, a sizable surge from 10% in 2023. “It can also accelerate the democratization of AI and analytics, empowering nontechnical users to perform tasks customarily reserved for IT or data specialists,” Navarro added. Gartner echoes this, predicting that more than 80% of enterprises will use generative AI-enabled applications by 2026, which will enable more people to access this technology.
Hidden pitfalls lurk beneath this rosy outlook, however.
“Arriving at this single source of truth entails a journey that is fraught with technical and organizational complexities that many are not prepared to embark on. Data lakehouses, for one, could involve a complete overhaul of existing data infrastructures, which carry a hefty price tag. There’s also the uncertainty of how they figure into the organization’s current and future data strategy,” Navarro cautioned.
“Unfortunately, many enterprises are still befuddled by hype and unrealistic expectations, widening the disconnect between the need to keep pace with the latest innovations and the maturity to effectively use them,” Navarro said. In a 2023 Salesforce survey, for example, 91% of business and IT leaders said they were eager and confident to reap the benefits of generative AI. However, 59% don’t have a data strategy, while 60% said generative AI cannot integrate with their current tech stack. This is exacerbated by the fact that 60% of technical leaders don’t know their business counterparts’ data utilization and speed to insight. In Asia Pacific, Forrester predicts that in 2024, only 30% of companies — those with more advanced IT practices — will have the capacity and capability to benefit from generative AI. The rest will be hampered by a risk-averse culture and inadequacies in data management.
Navarro cited his practice’s partnership with a global CPG company that wanted to jump on the lakehouse bandwagon as a case in point. Through periodic assessments and workshops with Lingaro, the company later realized that a lakehouse alone didn’t address its critical use cases and business objectives. After architectural redesigns, it adopted a hybrid warehouse-lakehouse ecosystem that aligns with its current operations and organizational culture but retains the extensibility to add or integrate new features or technologies in the future. The practice also works with other CPG companies to augment their existing data platforms or customize their design, both engagements starting with a data maturity assessment and use case validation.
Navarro recommends exploring, pinpointing, and prioritizing specific use cases that provide the most significant value as well as investing in capabilities that ensure data quality. Navarro advised, “Unless leaders firmly grasp the fundamental aspects of their organization’s data strategy, the rush to cash in on generative AI would only heighten risks and widen gaps in the company's readiness to deliver on these data-centric advancements.”
Charting and visualizing data with conversational analytics
These data-centric advancements will also include innovations in data visualization platforms, tools, and technologies, said Harish Ravi, Lingaro’s head of practice for data visualization.
“The trend of business users getting answers to their problems via simple conversational tools is expected to create a major impact in 2024. Key players in the market, such as Microsoft Power BI, Tableau, and Qlik, have already adopted embedded analytics — conversational analytics, in particular — into their platforms. This is a leap forward from existing self-service models that often require significant training and dependence on IT teams. Data visualization is no longer just about plugging numbers into charts and dashboards. The wonders of AI and generative AI are ushering in an era where data speaks as naturally as chatting with a human coworker,” Harish said.
Indeed, Gartner predicts that by 2026, 30% of new apps will be infused with AI to personalize adaptive user interfaces (UIs), tailoring experiences, interactions, and transactions unique to each user. Gartner also projects a 16.2% increase in worldwide investment in conversational AI capabilities in 2024 to improve customer service operations. By 2027, 14% of customer interactions will be handled by conversational AI.
Traditionally, extracting value from analytics has been a cumbersome process for business users, Harish explained. It often involved navigating complex tools and relying heavily on IT or data science teams, which could lead to delays and a lack of confidence in the data. The transition to new technologies or tools also poses significant adoption challenges. “For business users, the prospect of acquainting themselves with intricate and unfamiliar technologies merely to answer basic business queries has been a daunting and often unfeasible expectation. This disconnect not only hinders the implementation of new systems but also limits the broader acceptance and integration of analytics into the strategic fabric of organizations,” Harish added.
Conversational analytics, fueled by advances in natural language processing (NLP), addresses these challenges by allowing users to interact with data in a more familiar and intuitive way. With generative AI figuring into the equation, data narratives (contextualized explanations of data), conversational UIs, and data exploration (generating hypotheses from data) emerged as the top use cases among leaders surveyed by Gartner who plan to incorporate generative AI into their data visualization and analytics within the next two years.
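The core loop of conversational analytics — a natural-language question translated into a structured query over governed data — can be sketched as a toy. In production tools this translation step is handled by an LLM or NLP model; the keyword lookup, table, and question below are purely illustrative:

```python
import sqlite3

# Toy dataset standing in for a governed sales mart.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("APAC", 95.0), ("EMEA", 30.0)])

# Stand-in for the NL-to-SQL step: real platforms use a language model here.
TEMPLATES = {
    "total revenue by region":
        "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region",
}

def ask(question: str):
    """Answer a natural-language question by mapping it to a known query."""
    sql = TEMPLATES.get(question.lower().strip())
    if sql is None:
        return "Sorry, I can't answer that yet."  # graceful fallback, as a real UI would offer
    return conn.execute(sql).fetchall()

print(ask("Total revenue by region"))  # [('APAC', 95.0), ('EMEA', 150.0)]
```

The hard parts the sketch omits — disambiguating user intent, grounding the model in the semantic layer, and validating generated SQL — are precisely where the model training and refinement discussed below come in.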
“For enterprises, this means a drastic shift in how insights are derived and used. Conversational analytics promises to enhance the speed and ease of accessing information, which will not just supplement but, in many cases, replace the reports and dashboards that users are familiar with,” Harish said.
With that said, Harish forewarned of the technological and organizational challenges ahead. Conversational analytics requires a new data model, for instance, which needs to be continuously trained and refined to visualize and interpret data more accurately. More importantly, organizations need to identify where and how it can yield the most value. It's also not just a technological challenge: 80% of surveyed data executives cited cultural issues as their biggest roadblock to realizing value from their data spend.
All of these reiterate how technologies, no matter how innovative and humanlike, will only be as effective as the people who use them. Harish expounded, “At Lingaro, we evaluate an organization’s current self-service maturity levels using a battle-tested assessment framework. Our strategy includes developing a tailored road map for rapid advancement, charting a course to elevate capabilities in data and analytics and balancing immediate wins with long-term enhancements.”
Harish concluded, “Our objective is to empower business users to generate their own insights, with minimal or no need for IT support. This entails crafting a well-defined persona, ensuring data quality and accessibility, establishing guidelines for self-service models, and preparing and certifying models with focus on conversational analytics — all of which are designed for the unique, human experience of ‘speaking’ with data.”