Success attracts competition. That's inevitable in any industry.
No company has been as visibly successful of late as Nvidia (NVDA). Its graphics processing units (GPUs) are selling faster than the company can ship them. Nvidia’s share price has more than tripled over the last 12 months.
Unsurprisingly, Nvidia’s success is attracting increased competition. And it’s not just from other major chipmakers. Several of Nvidia’s biggest customers that are cloud service providers — Amazon, Alphabet’s Google Cloud, and Microsoft — have their own artificial intelligence (AI) chips.
Is Nvidia worried about the threat from Amazon, Google, and Microsoft? Based on CEO Jensen Huang’s recent comments, the answer is a hard “no.” Huang was asked by an analyst in the first-quarter earnings conference call to what extent Nvidia views its large cloud customers as competitors. He replied without any hesitation that Nvidia is different from these potential rivals in the following three key ways.
1. The richness of its accelerated computing architecture
Huang said in the Q1 call that the first way Nvidia differentiates itself from the technology of the large cloud service providers is through the “richness” of its accelerated computing architecture. He emphasized that customers can use this architecture for practically all of their AI needs, including training models and inference.
He also noted that AI inference has “fundamentally changed.” Instead of pattern recognition such as identifying a cat, large language models (LLMs) are now used to generate images. Nvidia hasn’t skipped a beat with this shift.
Huang argued that customers can use Nvidia “for everything, from computer vision to image processing, to computer graphics, to all modalities of computing.” He added that computing costs and energy usage have soared “because general-purpose computing has run its course.” In his view, the only viable alternative for the future is accelerated computing. And Huang thinks Nvidia’s accelerated computing platform gives data center customers the lowest total cost of ownership.
2. Nvidia is everywhere
You won’t see AI chips developed by Amazon used on cloud platforms other than Amazon Web Services. It’s the same story for chips developed by other cloud service providers. Huang said, though, that Nvidia’s chips are “in every cloud.”
He thinks that makes Nvidia an obvious choice for AI developers. It could be an especially strong selling point for organizations that use multiple clouds.
It’s not just the cloud. Huang emphasized that Nvidia’s technology is “practically everywhere.” The company has chips for on-premises servers, too. As Huang noted, “We’re in computers of any size and shape.” This pervasive presence gives Nvidia a leg up on Amazon, Google, and Microsoft.
3. Building AI factories
Huang’s third reason Nvidia is different from rivals and potential rivals is that the company builds “AI factories.” He often uses this term to describe a new type of data center that uses accelerated computing to build AI applications.
AI factories include chips but aren’t limited to chips. They aren’t limited to single LLMs, either. These factories encompass the entire system for training and running AI apps.
Nvidia CFO Colette Kress said in the Q1 call that the company worked with more than 100 customers in the first quarter to build AI factories. She noted that some of them included “tens of thousands of GPUs.”
The company’s new Blackwell platform should differentiate Nvidia’s AI factory approach even more. Blackwell can train LLMs four times faster than the H100 GPU and run inference up to 30 times faster. Kress noted that Amazon, Google, and Microsoft are already lined up to use Blackwell, along with Meta Platforms, OpenAI, Oracle, Tesla, and xAI.
Should Nvidia be worried?
Huang is absolutely correct that Nvidia is different from Amazon, Google, and Microsoft in several ways. He’s also right that these differences give Nvidia important competitive advantages. But should Nvidia be worried about the threat from its big customers anyway? I think at least some concern is warranted.
Amazon founder and former CEO Jeff Bezos famously said, “Your margin is my opportunity.” Nvidia’s gross margin in Q1 was 78.4%, up from 64.6% in the prior-year period. Its net profit margin was 57%, more than double the figure from the same period of the previous fiscal year. Those margins don’t just present an opportunity for Amazon; they’re tantalizing for Google, Microsoft, and others as well.
I expect Nvidia’s investments in research and development will keep the company on top in the AI chip market for years to come. However, it doesn’t hurt to worry about competition to some extent. As former Intel CEO Andy Grove once wrote, “Only the paranoid survive.”
Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Keith Speights has positions in Alphabet, Amazon, Meta Platforms, and Microsoft. The Motley Fool has positions in and recommends Alphabet, Amazon, Meta Platforms, Microsoft, Nvidia, Oracle, and Tesla. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short August 2024 $35 calls on Intel, and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.