CIOs have doubled down on modernizing enterprise data systems to ease AI adoption, as the technology's data and computing demands pose a challenge to those ambitions.
Most technology executives expect their organizations to increase investments in data infrastructure and AI adoption over the next year, according to an MIT Technology Review report commissioned by Databricks. All 600 CIOs and IT leaders surveyed in the report said data and AI budgets will grow over the next year, and nearly half anticipate a boost of more than 25%.
The initial barrier to entry for generative AI adoption is low, but the barrier to putting the technology into production remains high, Naveen Zutshi, CIO at Databricks, said. “Right now, when digging deeper, you find more experiments and less production use cases,” Zutshi said.
As eager organizations greenlight pilot programs and task innovation teams with developing generative AI business applications, CIOs are refining the fuel that drives the emerging technology — data.
Despite efforts to streamline architecture, unlock silos and integrate disparate sources, data ecosystems often succumb to entropy over time.
“It's not that organizations want to be complex,” Zutshi said. “They become complex because there were legacy systems and there were decisions made over time that resulted in that complexity. Unwinding is not easy.”
Hybridizing AI strategy
From a data standpoint, experimenting with off-the-shelf models available via cloud marketplaces and SaaS solutions is a relatively easy lift. But most organizations favor a hybrid AI strategy, leveraging available tools for immediate use cases while tuning and training models on internal data, the report found.
More than half of respondents — 58% — said they are open to both building and buying the technology. Only slightly more than 1 in 10 are solely focused on engineering their own tools, while just 3 in 10 are content with commercial models.
Databricks has teams building foundation models for customers. The company is also leveraging available technologies for enterprise use cases, primarily internal chatbots that assist with customer support, contract reviews and job descriptions, according to Zutshi.
“Building a foundation model costs hundreds of thousands or millions of dollars and only a small number of companies will do it,” said Zutshi.
Most organizations will take the middle path, using internal data to tune off-the-shelf models for specific tasks. Retrieval-augmented generation, the primary method for feeding an LLM external data, creates both risk and opportunity.
“Companies have a lot of unstructured data stored somewhere that they do very little with,” Zutshi said, pointing to information stored across platforms such as Confluence and Google Drive or in PDF form.
Generative AI can ingest data from these sources, surfacing previously buried insights. But most organizations are new to the idea of using unstructured data without first putting it in a structured database, so there will be a learning curve, Zutshi said.
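The retrieve-then-augment pattern described above can be sketched in a few lines. Everything in this example is illustrative: a toy keyword-overlap retriever and hardcoded snippets stand in for the embedding model, vector store, and LLM a production system would use.

```python
def retrieve(query, documents, k=2):
    """Rank documents by naive keyword overlap with the query.

    A real RAG pipeline would embed the query and documents and
    search a vector index instead of counting shared words.
    """
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query, context_docs):
    """Augment the user's question with the retrieved context."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


# Hypothetical unstructured snippets, as they might come from
# wikis, shared drives or PDFs.
docs = [
    "Support tickets are triaged within four business hours.",
    "Contract renewals require legal review before signature.",
    "Job descriptions follow the standard compensation bands.",
]

query = "How fast are support tickets triaged?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)  # the augmented prompt would then be sent to an LLM
```

The point of the pattern is that the model never needs to be retrained on the documents; the relevant passages are fetched at question time and injected into the prompt, which is why unstructured sources can be tapped without first loading them into a structured database.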
Data first
Laying the foundation for generative AI requires upfront investment.
“Just moving landlocked data can be a pretty gargantuan exercise,” Zutshi said. “These problems don’t go away because you have generative AI.”
Even organizations that have been relatively proactive in data operations can require remediation, as technologies and strategies have evolved rapidly.
“Data infrastructure that worked five years ago doesn’t work now,” Murali Brahmadesam, CTO and head of engineering at financial services company Razorpay, said in the report.
Zutshi said he had a data warehouse coupled with on-prem databases and a cloud-based data lake five years ago, when he served as CIO at Palo Alto Networks. Data duplication and separate access rules for data science and analytics teams contributed to operational complexity.
“Until a few years ago, we weren’t talking about lakehouses,” said Zutshi. “We were using data warehouses and they tended to put data in separate silos. We built systems based on that paradigm and now that paradigm has changed.”
Incremental migration and integrations that show value to the business can remove resistance.
“If I told my CEO that I was going to spend the next 18 months fixing my data foundation, but there won’t be any immediate returns, I would be laughed out of the room,” said Zutshi.
Enthusiasm for generative AI may buy CIOs some time. According to a KPMG survey of 1,300 CEOs, most expect ROI to arrive in three to five years, not tomorrow or the next day.