Apple shunned Nvidia while training its AI models for the upcoming Apple Intelligence platform.
Nvidia's (NVDA -1.78%) nearly $3 trillion valuation hinges on a critical assumption: The company's utter dominance in the artificial intelligence (AI) accelerator market is sustainable. Nvidia's market share has been estimated as high as 95%, and while competitors including AMD and Intel offer AI chips of their own, they've made little headway against the Nvidia juggernaut.
Here's the problem: That assumption is probably not going to hold up in the long run. Nvidia has some important advantages, including a mature software ecosystem and best-in-class hardware, but cracks are starting to form. The biggest crack so far is Apple's (AAPL 0.69%) decision to avoid Nvidia's GPUs entirely while training its highly anticipated Apple Intelligence platform.
No Nvidia, no problem
Apple is late to the generative AI party, but it's now moving full steam ahead to bring AI features to its iPhones, Macs, and other devices. Apple Intelligence, which will go into beta later this year, includes AI-powered writing tools, image creation capabilities, a version of Siri that isn't embarrassing, and other useful features. None of this is groundbreaking, but these features will be deeply integrated into Apple's ecosystem and available to the company's massive user base.
Apple has trained multiple AI models, including powerful server-based models that will run in the cloud and more compact models that will run directly on Apple devices. Training an AI model is computationally intensive, and Nvidia’s GPUs have been the standard choice since the AI boom began. Apple notably took a different route.
Apple detailed the AI models that power Apple Intelligence in a recent research paper, and the big surprise is that the tech giant shunned Nvidia's GPUs entirely. The server-based models were trained on 8,192 TPUv4 AI chips from Alphabet's Google, while the on-device models were trained on a smaller number of TPUv5 AI chips.
Google has been designing custom AI chips for years. The company’s sixth-generation TPU, called Trillium, was announced in May. Google uses its custom chips to train its own AI models, and now Apple has chosen to do the same.
Apple’s decision kills the argument that Nvidia’s GPUs are the only game in town for AI training.
A huge risk to Nvidia’s profits
The AI chip market could grow fast enough for Nvidia’s revenue to continue to rise at a healthy pace even if it loses market share. However, it’s hard to imagine that Nvidia will be able to maintain its sky-high profit margins as competition chips away at its dominance.
Nvidia reported net income of $14.9 billion on $26 billion in revenue during the first quarter of fiscal 2025, which ended April 28. That works out to a net income margin of roughly 57%, and it is highly unlikely that Nvidia will be able to maintain a margin above 50% in the long run. Competition will inevitably bring the company's profitability back down to earth.
AI inference, the process of running trained AI models, doesn't necessarily require powerful GPUs. Intel's latest Xeon server CPUs, for example, can run some inference workloads, and PCs with dedicated AI chips are set to become the norm. AI training does require powerful accelerators, but Apple's choice to avoid Nvidia's GPUs is a strong sign that Nvidia's dominance of the AI training market is coming to an end.
Nvidia’s AI growth story has been incredible, but Apple just delivered a serious blow to the AI king.
Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Timothy Green has positions in Intel. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Apple, and Nvidia. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel and short August 2024 $35 calls on Intel. The Motley Fool has a disclosure policy.