Let’s not get it twisted. Despite the publicity surrounding generative AI models like ChatGPT, enterprise adoption of these tools is quite low. This is to be expected.
Technology advances, adoption follows, and in the case of ChatGPT, businesses are intrigued by the mostly favorable public reaction and engagement. This has led business technology leaders to imagine how they might customize generative AI models for their enterprises’ goals and needs.
But these large, complex models need the right capabilities to succeed. Without the right talent, governance, maturity and budget, CIOs and their tech teams could find themselves in the deep end, tangled in technical debt.
Here are the four biggest gaps and barriers to entry businesses face as they work to develop, customize and adopt generative AI models:
IT governance
IT governance is even more critical as emerging technologies evolve. These high-risk, high-reward technologies require frameworks and ethical guidelines.
The rising number of citizen developers has further complicated what governance looks like for the modern IT department.
Organizational change management and IT governance were among the six largest capability gaps between rated importance and effectiveness for customizing generative AI models, according to Info-Tech Research Group data from a survey of 271 IT leaders conducted between August 2021 and October 2022.
Governance models should include guiding principles focused on privacy, data security, algorithmic transparency within AI models and cybersecurity vulnerabilities, according to Suma Nallapati, CIO at Insight Enterprises.
“The technology is the easy aspect, but these governance models are very critical for emerging technologies, like generative AI, ChatGPT and others, to work effectively within an organization and come to the right outcomes,” Nallapati said.
Legislation also plays an important role in developing and implementing AI-powered solutions. IT departments will have to closely watch evolving laws and regulations and incorporate them into their governance models.
Data gathering and quality
These large, complex models require a lot of the right kind of data. Part of gathering good data is making sure tech teams have the right talent.
“If people are in that data science background, they're going to understand the importance of data quality and actually know how to go about improving it and recognizing why processes lead to that bad data quality,” Brian Jackson, research director at Info-Tech Research Group, said.
Data quality was the second biggest capability gap between rated importance and effectiveness for customizing generative AI models, according to Info-Tech Research Group data.
“Generative AI, in particular, relies on substantial amounts of data that has undergone thorough preprocessing to ensure its accuracy,” Chris Monberg, CTO and head of product at software company Zeta Global, said in an email. “It's important to understand that maintaining data quality is an ongoing process, not a one-time task.”
Monberg said these areas were often overlooked:
- It's essential to gather a large, diverse and representative data set that accurately reflects a business domain, utilizing both first-party and third-party data.
- Regular monitoring and observation of data quality and pipeline health are necessary to continuously improve data quality.
- Data governance, security and privacy mechanisms are crucial to mitigate risks related to ethical and legal compliance. Concerns about the biases in training data and the ethical implications of blindly optimizing algorithms for equity must also be addressed.
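Monberg's point about continuous monitoring lends itself to a small illustration. The sketch below is a hypothetical, minimal batch check of the kind a team might run before data enters a training pipeline; the column names, thresholds and pandas-based approach are assumptions for illustration, not Zeta Global's practice.

```python
# A minimal, illustrative data-quality check that might run on every pipeline load.
# Column names and thresholds are hypothetical; real pipelines would add schema
# validation, drift detection and alerting on top of this.
import pandas as pd

MAX_NULL_RATE = 0.05       # assumed tolerance for missing values per column
MAX_DUPLICATE_RATE = 0.01  # assumed tolerance for duplicate rows

def check_data_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality issues found in a batch."""
    issues = []

    # Missing-value rate per column
    for column, rate in df.isna().mean().items():
        if rate > MAX_NULL_RATE:
            issues.append(f"{column}: {rate:.1%} missing values (limit {MAX_NULL_RATE:.0%})")

    # Duplicate rows across the batch
    dup_rate = df.duplicated().mean()
    if dup_rate > MAX_DUPLICATE_RATE:
        issues.append(f"{dup_rate:.1%} duplicate rows (limit {MAX_DUPLICATE_RATE:.0%})")

    return issues

if __name__ == "__main__":
    batch = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "region": ["NA", None, "EMEA", "APAC"],
    })
    for issue in check_data_quality(batch):
        print("DATA QUALITY WARNING:", issue)
```

Checks like these only catch the mechanical problems; the bias and compliance concerns Monberg raises still require human review and governance.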
Maturity of tech stack
Generative AI adoption, much like cloud transformation or “as a service” systems, requires a level of modernization that many companies outside the tech industry are still working toward.
“Organizations that have already embarked on digital transformation, they’re more receptive,” Nallapati said. “But some organizations that are burdened with legacy applications and a lot of technology debt have a barrier to entry in any emerging technology, whether it’s generative AI, artificial intelligence or machine learning.”
The cost
While cost might not necessarily be considered a capability gap, it is certainly a barrier to entry for most businesses. The large models require a lot of data and, in turn, a lot of computing power.
ChatGPT, for example, is built on GPT-3.5, a transformer-based large language model from OpenAI. OpenAI CEO Sam Altman said in a December tweet that the compute costs for ChatGPT are “eye-watering.”
“When you’re doing things like ChatGPT, you’re not running that on your laptop; it’s this massive scale across thousands of machines,” said Robert Nishihara, co-founder and CEO of Anyscale, the company behind Ray, the open-source platform OpenAI used to build and train ChatGPT’s AI infrastructure.
“To run these kinds of applications, there’s just a tremendous amount of software engineering work that goes into dividing the work across all the machines, moving the data, handling machine failures, scaling the systems up or down, and that’s just a huge barrier to entry and makes it really hard,” he said.
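Ray’s programming model gives a sense of what that distribution work looks like at a small scale. The toy example below is purely illustrative and bears no relation to OpenAI’s infrastructure: the workload and function are invented, but the pattern of turning ordinary Python functions into tasks that Ray schedules across a cluster is the framework’s standard usage.

```python
# Toy illustration of Ray's task model: the same code runs on a laptop or,
# pointed at a cluster, fans work out across many machines.
import ray

ray.init()  # connects to a configured cluster if one exists, otherwise starts locally

@ray.remote
def preprocess(shard_id: int) -> int:
    # Stand-in for real work, e.g. tokenizing one shard of a training corpus.
    return shard_id * shard_id

# Launch 100 tasks; Ray schedules them across the available CPUs and machines.
futures = [preprocess.remote(i) for i in range(100)]
results = ray.get(futures)  # blocks until all tasks finish
print(sum(results))
```

Frameworks like this hide much of the scheduling and failure handling Nishihara describes, but operating them at the scale of a ChatGPT-sized model still demands substantial engineering and compute spend.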
For most businesses, building models at that scale is unlikely to produce a promising return on investment. One workaround for businesses on tighter budgets: vendors have already started customizing generative AI models for specific business needs.
“This is going to be the year where we will see the tech giants figure it out,” Jackson said. “Enterprises will adopt the tools because it’ll sort of trickle down to them through the vendors.”