Editor’s note: This article draws on insights from a March CIO Dive virtual event. You can watch the sessions on-demand.
While enterprises and vendors rushed to embed generative AI into workflows following the 2022 launch of ChatGPT, Aflac took a different approach.
“I knew that this would be game-changing to the industry,” Aflac EVP and CIO Shelia Anderson said during a CIO Dive virtual event this month. “But if you think about the maturity life cycle of some of these technologies… it’s going to be the most immature on that day that you’re experiencing it.”
The insurance company didn’t jump headfirst into adoption then, and it still doesn’t now. More than two years after ChatGPT’s public debut, Aflac continues to implement generative AI conservatively, with an eye on business impact and alignment with strategic priorities.
“For us, it really gets down to first and foremost starting with the why: What’s the business opportunity,” Anderson said. “We still are looking for business alignment above all things and look for a return.”
As an insurer, Aflac is focused on regulatory compliance and data security. At the same time, it needs to keep costs in check. To meet both goals, Anderson said, generative AI technologies need to run in a hybrid architecture.
“When you consider the cloud versus on-prem, owning the safety and security of your data and all of those things would lead you to think it needs to be on-prem, but the price point to do that, you have to be very selective,” Anderson said. “It’s a balance.”
Aflac’s cloud strategy is what enables the company to run generative AI capabilities in its preferred environment. Anderson, who previously held divisional CIO roles at Liberty Mutual and USAA, slowed migration plans after joining the company in 2022.
“I wanted time for the organization to retool and to reskill and understand the capabilities before we move down the path,” Anderson said. “We’ve been doing a much more intentional approach to assessing the capabilities: What do we want to do as cloud-native, where do we invest in our tier one or critical applications, what is fine leaving in a distributed model?”
Collaboration across teams is also key to ensuring that the generative AI pilots Aflac chooses contribute to business goals and produce expected outcomes. Appropriately skilled, multidisciplinary teams help keep pilots on track.
“When you’re doing generative AI especially, you’re pulling from a lot of different data sources and you’re often solving cross-organizational challenges,” Anderson said. Teams decide whether a proof of concept’s return on investment justifies moving forward and, if so, how to advance adoption.
One area where Aflac is pursuing generative AI is digital onboarding. The company has grappled with coordinating among its many enrollment partners on the front end. Aflac typically favors buying over building and has chosen a startup to help it test related capabilities.
“We have quite a few active pilots,” Anderson said. “We don’t have anything fully in production yet.”
Governing AI priorities
Around five years ago, Aflac started its AI journey with a machine-learning platform called Code Based Processing. The main goals were to enhance quality, drive business value and improve the customer experience.
The pre-generative-AI work gave the company a head start on developing governance processes needed for data management, AI model implementation and usage guidelines.
Aflac stood up a global generative AI working group, composed of engineering, architecture and security leaders across the U.S. and Japan, to strengthen its focus on security practices.
“Being a highly regulated industry, that is always top of mind, and sometimes the first thing that we have to think about,” Anderson said.
The group reviewed Aflac’s data privacy guidelines, privacy enforcement and compliance requirements.
Education is also a key part of governance, Anderson said. The company has offered educational opportunities across the organization, including for the board of directors.
Aflac’s innovation lab supports its governance and use case prioritization goals, too.
“It gives us an opportunity to test and learn very quickly, as opposed to starting from the beginning thinking you’re going to carry it all the way to scale and into production,” Anderson said.