Bank of America has reached a major milestone in its AI journey: More than 90% of its 213,000 employees now use the Erica for Employees virtual assistant, the company said in a Tuesday announcement.
The AI-powered tool was rolled out in 2020 to ease IT administrative processes when pandemic concerns drove a pivot to remote work. The company has since attributed a more than 50% reduction in IT service calls to Erica for Employees, according to the announcement.
As organizations invest in emerging technologies, there are risks and — if all goes well — rewards. The progression from pilots to widespread adoption isn’t automatic or assured. Development and implementation costs have to align with value.
At Bank of America, the tech vetting process is grounded in practical considerations, Hari Gopalkrishnan, head of consumer, business and wealth management technology, told CIO Dive.
“We start by looking at what the customer wants,” Gopalkrishnan said. “If you start by asking ‘how can I take this cool technology to market?’ you’re going to spend a lot of money and it’s going to fail. It has to map back to what the customer needs.”
Bank of America’s virtual assistant initiative began in 2018, when the company launched the first iteration of an in-app virtual assistant called Erica. The chatbot has tallied 2.5 billion client interactions since its initial deployment and now has 20 million active users, the company said.
Spend to save
Measurable efficiency gains and user-experience improvements don’t come cheap. Bank of America spends roughly $13 billion on tech annually, earmarking nearly one-quarter of that budget for new technology initiatives this year.
Some of that spending supports regularly scheduled innovation sessions to workshop potential use cases.
“Teammates come together to go over what we’ve heard from the market, what are the cool things in tech and how we can come up with a bunch of ideas in a 48-hour cycle,” Gopalkrishnan said. “Some of them get funded right away and some take longer.”
The bank’s portfolio of virtual assistants, including Erica, Erica for Employees and a pair of customized tools for its Merrill and Bank of America Private Bank units, emerged from an emphasis on deliberate innovation.
“There are many areas where we've used traditional predictive AI for years to actually deliver value, both client facing and internally,” said Gopalkrishnan.
As generative AI capabilities took center stage, the bank already had processes in place to evaluate the safety, efficacy and potential value of the technology.
“There are 16 different parameters we look at to determine if a capability passes muster when it comes to responsible deployment, and that has not changed,” said Gopalkrishnan. The company also has an AI oversight council to manage safety and governance.
The technology’s evolution, from an engineering perspective, stems from the size of large language models and the scope of their capabilities.
Prior to ChatGPT, models were built from the ground up with specific functions in mind. Generative AI models are multitaskers, capable of everything from summarization across business functions to code generation in multiple languages.
“The new kid in town is content-based,” Gopalkrishnan said. “Now you get to the interesting realm of generating content, which is both exciting and, in some ways, daunting, because now you’ve got to worry about hallucinations.”
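To make that shift concrete, the sketch below shows how a single general-purpose model can be pointed at two very different tasks, summarization and code generation, through the same interface. It is a minimal illustration using the public OpenAI Python SDK; the model name and prompts are placeholders, and it does not reflect Bank of America’s own systems or Erica’s architecture.

```python
# Minimal illustration: one general-purpose LLM handling two unrelated tasks.
# Assumes the public OpenAI Python SDK and an OPENAI_API_KEY in the environment;
# this is not Bank of America's stack, and "gpt-4o-mini" is only a placeholder model.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Send a single-turn prompt to the same general-purpose model."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Task 1: summarization.
print(ask("Summarize in two sentences: customers struggle to find features in our mobile app."))

# Task 2: code generation in another language.
print(ask("Write a Java method that validates a 9-digit routing number checksum."))
```

The point of the example is that, unlike earlier purpose-built models, nothing about the model changes between the two calls; only the prompt does, which is also why generated output has to be checked for hallucinations before it is trusted.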
Tracking value
Despite generative AI’s broad potential, many organizations are struggling to register a return on their investments in the technology. As accuracy issues, safety concerns and governance complications impede adoption, the proof-of-concept process can lead to shuttered pilots.
“You can spin up something that sounds like it’s cool but yields very little value to the business,” Gopalkrishnan said. “A cool demo can end up costing you a ton of money, which is not a good outcome.”
Gopalkrishnan recalled a particular technological innovation predating ChatGPT that also promised to revolutionize business processes.
“A few years ago, the world was abuzz with talk of the metaverse,” he said. “We took a look at augmented reality, we did innovation sessions and teams came back with all kinds of cool ideas. But, early on, we realized that no customer was asking us for that.”
Predictive and then generative AI, coupled with natural language processing, followed a more promising development curve.
“When we started our journey with Erica, we realized that real customers were having challenges navigating hundreds of features in a mobile app,” Gopalkrishnan said. “We could see that customers actually cared about natural language processing even if they didn’t call it that.”
Erica’s evolution helped establish a vetting framework that begins with understanding the various roles in the organization and working from there to weigh costs against value.
One area that promised a return on investment in the data-intensive banking industry was coding assistants. Big banks are well positioned to reap the rewards of tools that guide engineers through legacy applications, according to Accenture. The largest industry players have moved quickly to add governance and ethical use expertise to ease the adoption process, Evident Insights found.
Bank of America has already seen a 20% efficiency improvement among its coders through the use of generative AI, the company said in the Tuesday announcement.
The technology was adopted with safety in mind and an eye toward ROI, according to Gopalkrishnan.
“It’s not a panacea yet, but the faster we can generate valuable, meaningful code, both in terms of revenue and business initiatives that are going to take out expenses, the better we get,” Gopalkrishnan said. “It's really for those activities that are safely reproducible.”
The company’s ongoing innovation push has yielded 7,400 patents and pending patent applications, more than 1,200 of which are focused on AI and machine learning.
While patents protect the bank’s intellectual property, they can also be a morale booster.
“It’s a way to reward the team for innovative thinking,” Gopalkrishnan said. “But it also allows us to make sure people realize there’s a lot of IP here and we’re not flying by the seat of our pants.”