Dive Brief:
- Snowflake is banking on enterprise interest in large language model capabilities and generative AI applications to boost demand for its data cloud services, Frank Slootman, the company’s chairman and CEO, said Wednesday during a Q2 2024 earnings call.
- “Generative AI is at the forefront of customer conversations,” Slootman said. “However, enterprises are also realizing that they cannot have an AI strategy without a data strategy to base it on.”
- Snowflake has requested 1,000 Nvidia H100 graphics processing units, an investment the company estimates will add $1 million in monthly expenses over the next quarter. “People are still struggling to get GPUs,” CFO Mike Scarpelli said. “There is a time lag between when a chip manufacturer sells their chips to [when] it gets built into the hardware that actually gets deployed in a rack in a data center.”
Dive Insight:
Revenue growth became a barometer for economic volatility as organizations pared back cloud and data services spending during the first half of the year. Now, providers are beginning to see signs of stability.
Snowflake’s net revenue grew to $640 million in Q2 2024, a year-over-year increase of 37% for the three-month period ending July 31. By comparison, revenue grew 83% during the same period last year, more than twice this quarter’s rate.
Consumption picked up over the summer, in step with a brightening macro outlook, according to Slootman.
“We’ve really seen a sentiment change from the earlier quarters where, you know, people were trying to cut off their limbs to fit within budgetary constraints,” said Slootman, pointing to more than 400 new Snowpark customers and a 70% quarter-over-quarter increase in consumption of the data solution.
“We continued to execute in an unsettled macro environment but with incremental improvement in general sentiment and engagement,” Slootman said.
Appetite for data to train and tune LLMs has begun to boost business as well.
In June, Snowflake partnered with chip manufacturer Nvidia and expanded its alliance with hyperscaler Microsoft to bring enterprise-grade generative AI capabilities to its customers.
Also in June, the company announced Document AI, an LLM-based solution capable of ingesting legal contracts, invoices and other unstructured data sources. It is currently available in a private preview.
Snowflake’s AI strategy centers on leveraging its established data sharing and data governance tools, which allow aligned enterprises to pool resources. The company sees its services as a backbone for customers’ AI aspirations.
“Data sharing makes Snowflake uniquely positioned to enable AI workloads,” Slootman said.
More than one-quarter of the company’s customers took advantage of data sharing in Q2, up from just one-fifth during the same period last year, Slootman said.
“Having highly organized, optimized, trusted, sanctioned data is incredibly important for deploying large language models,” said Slootman. “If you think you can just drop a model on top of a data lake and just see what happens, that’s not going to end well.”
But Snowflake’s infrastructure buildout won’t happen overnight, Scarpelli cautioned.
“In my prior life, when we were buying racks of servers, there was a six-month delay between when we bought them and when they were actually going into production,” said Scarpelli. “I don’t see that being any different with GPUs.”