How Apple’s AI Update Was Bad News for Nvidia

Bad news doesn’t mean horrible news.

The wait is over. There was considerable anticipation leading up to Apple’s (AAPL 2.86%) Worldwide Developers Conference (WWDC) this week. In particular, investors were eager to learn about Apple’s artificial intelligence (AI) strategy. They now have some answers.

At Alphabet’s Google Cloud Next event in April, Nvidia (NVDA 3.55%) was near the center of attention. It was a much different story at Apple’s WWDC. Here’s how Apple’s AI update was bad news for Nvidia.

Losing its edge?

Apple and Nvidia have been neck and neck in claiming the No. 2 spot behind Microsoft among the largest companies by market cap. In the aftermath of WWDC, Apple has opened up a bigger lead over Nvidia.

I think one major story from this week’s WWDC is that Apple is committed to winning in the edge AI market. Edge AI refers to running AI models directly on connected devices at the “edge” of networks rather than in centralized data centers.

Apple Intelligence (the name for Apple’s generative AI integration with its newest iPhone, iPad, and Mac operating systems) represents a key step forward for Apple in edge AI. CEO Tim Cook said, “Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them.”

Apple’s version of generative AI will primarily run on its devices rather than in the cloud. Where is Nvidia’s tremendous growth coming from these days? Primarily from selling GPUs for use in cloud data centers. Sure, Nvidia offers solutions for edge AI. However, it isn’t anywhere in the picture with Apple’s strategy. And Apple’s more than 2.2 billion devices worldwide could put it in the driver’s seat in the edge AI market.

Two words Nvidia didn’t want to hear

Didn’t Apple reveal that some of Apple Intelligence will run in the cloud? Yep. However, that won’t help Nvidia.

There were two words in Apple’s update that Nvidia almost certainly didn’t want to hear: “Apple silicon.” Apple said its Private Cloud Compute for handling more complex requests will “run on servers powered by Apple silicon.”

That’s a stark contrast to other leading cloud services. Amazon Web Services uses Nvidia’s GPUs. So do Microsoft Azure and Google Cloud. But Apple won’t.

It’s even possible that Amazon, Microsoft, and Google could be inspired by Apple’s decision to use its own silicon to accelerate their own internal chip programs. All three major cloud service providers have their own AI chips but still rely primarily on Nvidia’s GPUs.

Bad news, but not horrible news

Nvidia would have undoubtedly loved for Apple to have used its GPUs in its Private Cloud Compute. That would have reinforced the view of Nvidia as the undisputed leader in AI chips for data centers. Nvidia would have also probably preferred that more of Apple’s generative AI functionality run in the cloud instead of on the devices.

Neither happened. That’s bad news in one sense for Nvidia. However, it’s not horrible news by any stretch of the imagination despite Nvidia’s shares falling a little this week.

Apple’s AI approach shouldn’t diminish the skyrocketing demand for Nvidia’s GPUs. I suspect that will be true even if Amazon, Microsoft, and Google follow in Apple’s footsteps by increasing the use of their custom AI chips.

Nvidia’s fortunes remain largely in the company’s control. As long as Nvidia continues to out-innovate the competition, it will stay on top of the AI chip market.

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Keith Speights has positions in Alphabet, Apple, and Microsoft. The Motley Fool has positions in and recommends Alphabet, Apple, Microsoft, and Nvidia. The Motley Fool has a disclosure policy.
