Dive Brief:
- Nvidia and Arm are partnering to bring deep learning capabilities to Internet of Things chips, integrating Nvidia's Deep Learning Accelerator architecture with Arm's machine learning platform, Project Trillium, according to an Nvidia announcement. The partnership is intended to ease the integration of AI into mobile, consumer and IoT designs in "billions" of devices worldwide, furthering Arm's goal of connecting a trillion IoT devices.
- By pairing Nvidia's dominance in ML training with Arm's dominance in IoT endpoints, the collaboration will help both companies "design accelerated AI inferencing solutions," according to Karl Freund, analyst at Moor Insights & Strategy.
- Nvidia's announcement comes one day after news that Google, Samsung and Qualcomm are joining forces with around 80 other tech companies on an open-source chip design, which would cut the cost of developing advanced technologies, according to a report by The Information. Such a chip would "undercut" Arm Holdings as it seeks to expand its chips beyond the smartphone space, with companies such as Tesla and Western Digital already eyeing the new design.
Dive Insight:
It's challenging enough for most companies to tackle AI with their current workforce, and hardware constraints and costs are another hurdle entirely. Easing the integration of AI chips with devices will help ensure that the next wave of IoT devices has the foundation for higher-level processing, especially at the edge.
The collaboration on an open-source chip, by some of the biggest names in the industry no less, could mean big things for a market that is already shifting as more tech giants start making their own chips, including Google, Apple and (reportedly) Amazon.
More players in the AI chip space should help drive down costs for companies setting up and distributing IoT networks. Perhaps the savings will be enough to hire another AI expert, whose salary can fall somewhere in the $300,000 to $500,000 range.