Dive Brief:
- Meta plans to boost spending on AI research as it sharpens its focus on the technology, according to the company's Q1 2024 earnings report Tuesday. The tech giant will increase capital expenditures to as much as $40 billion this year, up from the previous range of $30 billion to $37 billion.
- “While we are not providing guidance for years beyond 2024, we expect CapEx will continue to increase next year as we invest aggressively to support our ambitious AI research and product development efforts,” Meta CFO Susan Li said during the call.
- As Meta scales AI services, executives said teams will focus on training and running models more efficiently and use open-source improvements to trim costs.
Dive Insight:
Meta is one of the leading open-source AI model makers in the current vendor landscape. But building out the capabilities and infrastructure to support those models comes at a cost.
Tech giants have spent millions training the most popular AI models available today, according to the Stanford Institute for Human-Centered Artificial Intelligence’s 2024 AI Index. Meta’s Llama 2 70B was estimated to cost around $3.9 million to train, according to the research.
Training costs run even higher for other well-known models: OpenAI’s GPT-4 used around $78 million worth of compute, and Google’s Gemini Ultra devoured an estimated $191 million in computing costs, according to Stanford’s research.
In 2023, Meta was the second most prolific model builder, releasing 11 AI models, according to Stanford’s research. Google ranked first with 18 models last year. Meta continues to iterate: the company introduced its third-generation Llama family of models last week.
Efficiency in training and developing these models has become key. Snowflake engineers spent three months and less than $2 million to train the data cloud company’s model Arctic, prioritizing transparency along the way.
Meta is also looking to custom chips and more efficient training processes to help sustain its momentum.
“Our Meta training and inference accelerator chip has successfully enabled us to run some of our recommendations-related workloads on this less expensive stack, and as this program matures over the coming years, we plan to expand this to more of our workloads as well,” CEO Mark Zuckerberg said during the earnings call. “As we ramp these investments, we will also continue to carefully manage headcount and other expense growth throughout the company.”