Dive Brief:
- AWS has partnered with Hugging Face to make it easier for IT teams to deploy and fine-tune large language models for generative AI applications, the companies announced Tuesday.
- Hugging Face offers researchers and data scientists more than 100,000 machine learning models, according to the announcement. The AI start-up released BLOOM, a large language model trained on data including 13 programming languages and 46 natural languages, in July 2022, according to Hugging Face.
- The partnership mirrors agreements between other AI start-ups and large tech companies. Hugging Face will use AWS to build its products, and AWS customers can customize and fine-tune the technology to fit their business needs.
Dive Insight:
One of the major roadblocks for business adoption of generative AI models is the cost to build, customize and fine-tune the technology. This is where big tech companies have the upper hand.
“Generative AI has the potential to transform entire industries, but its cost and the required expertise puts the technology out of reach for all but a select few companies,” AWS CEO Adam Selipsky said in a blog post.
Amazon and Hugging Face are targeting two of the biggest capability gaps for businesses: cost and talent. Businesses using Hugging Face models on AWS will cut training costs by 50% and see four times higher throughput and up to 10 times lower latency, according to an Amazon announcement Tuesday.
AWS customers can use Hugging Face models on AWS through SageMaker JumpStart or the Hugging Face AWS Deep Learning Containers, as well as via tutorials for AWS Trainium or AWS Inferentia.
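For teams approaching this from the SageMaker side, the Hugging Face Deep Learning Containers can be driven from the sagemaker Python SDK. The sketch below shows the general shape of deploying a Hub model to a real-time endpoint; the model ID, container versions and instance type are illustrative assumptions, not values from the announcement, so check current SageMaker documentation before running it.

```python
# Minimal sketch: deploy a Hugging Face Hub model to a SageMaker
# real-time endpoint using the Hugging Face Deep Learning Containers.
# Model ID, container versions and instance type below are assumptions
# for illustration only.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

hub_config = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # example Hub model
    "HF_TASK": "text-classification",
}

huggingface_model = HuggingFaceModel(
    env=hub_config,
    role=role,
    transformers_version="4.26",  # assumed supported container version
    pytorch_version="1.13",
    py_version="py39",
)

# Provision a managed endpoint; the instance type is an assumption.
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.xlarge",
)

# Send a sample request to the hosted model.
print(predictor.predict({"inputs": "AWS and Hugging Face announced a partnership."}))

# Tear down the endpoint when finished to avoid ongoing charges.
predictor.delete_endpoint()
```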
Most organizations struggle to put in place the governance models, data gathering and quality practices, and talent required to successfully adopt generative AI models. Vendors are the workaround that allows enterprises with tight budgets to acquire the proper capabilities, Brian Jackson, research director at Info-Tech Research Group, previously told CIO Dive.
Whether an enterprise runs its systems on AWS, Google Cloud, Azure or another cloud provider, generative AI applications are likely to trickle down from vendors through partnerships and acquisitions of AI start-ups.
Oracle partnered with NVIDIA last October to bring the chipmaker’s computing stack to Oracle’s cloud infrastructure. Google Cloud invested in AI start-up Anthropic earlier this month to gain access to its AI capabilities. Microsoft Azure has a multiyear, multibillion-dollar partnership with OpenAI and plans to deploy OpenAI models across its enterprise and consumer products.