Dive Brief:
- Google Cloud CEO Thomas Kurian promised customer choice in generative AI deployments, from processors and models to databases and business applications, during the Next ’24 cloud conference keynote Tuesday.
- The tech giant expanded the Vertex AI platform model portfolio and embedded the Gemini model across its enterprise suite, including new experiences in Google Workspace. For $10 per user per month, Gemini can take meeting notes and translate across 69 languages in real time, the company said.
- “Together, we are creating a new era of generative AI agents… and we’re reinventing infrastructure to support it,” Kurian said.
Dive Insight:
Google kicked off the cloud conference with a cornucopia of generative AI-related updates. The additions come on the heels of a February rebrand, which united several generative AI-powered tools under the Gemini moniker.
“It’s been less than eight months since Next 2023, but we made a world of progress,” Kurian said. “We’ve introduced over 1,000 product advances across Google Cloud and Google Workspace, expanding our infrastructure footprint to 40 regions.”
Gemini is now pervasive in Google Cloud’s ecosystem, showing up in databases to ease moves from legacy systems and in security operations to detect malicious activity, summarize event data and recommend next steps, according to a Tuesday blog post.
The LLM also powers Google’s GitHub Copilot competitor, now called Gemini Code Assist. The tool is available for free until July 11.
As generative AI strategies took shape last year, the hyperscaler landscape shifted to emphasize partnerships with LLM providers, investments in workforce training and development of the technology itself.
The strategy paid off.
Microsoft has around 53,000 Azure AI customers, the company said during a January earnings call for Q2 2024. Its cloud revenue grew 22% year over year, driven in part by AI services.
AWS’ Bedrock had “many thousands” of customers using the AI service in a few months, the company said during a February earnings call for Q4 2023. Google reported Vertex AI API requests increased by nearly six times from the first half to the second half of last year in its Q4 2023 earnings call. Kurian said Google’s generative AI training courses have been completed more than one million times.
For enterprises, the advantage of leaning on a cloud provider for generative AI capabilities is flexibility and scale, the same hallmarks of cloud infrastructure that lured them in the first place.
Businesses that build custom LLMs from scratch will confront technical debt and complexity that ultimately lead them to abandon such initiatives by 2028, according to Gartner research.
Enterprises can avoid AI headaches by using hyperscaler marketplaces. Bedrock, the Vertex AI model garden and Azure Marketplace provide the flexibility Kurian highlighted. As new models inevitably arrive, enterprises can switch to the one that offers the most value for a specific use case.