The adoption of artificial intelligence (AI) by enterprises and research groups is accelerating, but data infrastructure and AI sustainability challenges present barriers to implementing it successfully at scale, according to a new global study conducted by S&P Global Market Intelligence and commissioned by WEKA.
Findings show that these challenges have been exacerbated by the rapid rise of generative AI, which has defined the evolution of the AI market in 2023.
The findings are drawn from a survey fielded in the second quarter of 2023, covering 1,516 AI/ML decision-makers and influencers at companies across the Asia-Pacific; Europe, Middle East and Africa; and North America regions.
Nick Patience, senior research analyst at 451 Research, which is part of S&P Global Market Intelligence, said the meteoric rise of data and performance-intensive workloads like generative AI is forcing a complete rethink of how data is stored, managed and processed.
“Organisations everywhere now have to build and scale their data architectures with this in mind over the long term,” said Patience. “Having a modern data stack that efficiently and sustainably supports AI workloads and hybrid cloud deployments is critical to achieving enterprise scale and value creation.”
The study also found that 69% of survey respondents reported having at least one AI project in production.
Only 28% say they have reached enterprise scale, where AI projects are widely implemented and driving significant business value.
AI has shifted from simply being a cost-saving lever to a revenue driver, with 69% of respondents now using AI/ML to create new revenue streams.
The most frequently cited technological inhibitor to AI/ML deployments is data management (32%), ahead of security (26%) and compute performance (20%), suggesting that many organisations’ current data architectures are unfit to support the AI push.
Further, 69% of respondents said their AI/ML projects focus on developing new revenue drivers and creating value, while 31% remain focused on cost reduction.
Companies that leverage a modern data architecture to overcome significant data challenges (diverse sources, types, requirements and so on) can accommodate AI workloads operating across multiple infrastructure venues.
“Just as you wouldn’t expect to use battery technologies developed in the 1990s to power a state-of-the-art electric vehicle, like a Tesla, you can’t expect data management approaches designed for last century’s data challenges to support next-generation applications like generative AI,” said Liran Zvibel, cofounder and CEO at WEKA.
“Organisations that build a modern data stack designed to support the needs of AI workloads that seamlessly span from edge to core to cloud will emerge as the leaders and disruptors of the future,” said Zvibel.