AWS has its own chips: Here’s how its CEO sees their future

Amazon (AMZN) Web Services (AWS) has entered the semiconductor market, developing its own chips to train AI models in competition with industry leaders like Nvidia (NVDA). At the 2024 Goldman Sachs Communacopia and Technology Conference, Yahoo Finance reporter Madison Mills interviewed AWS CEO Matt Garman to break down AWS’s chip strategy.

Garman acknowledges Nvidia’s strong market position, calling it “a great platform” with a large customer base. However, he emphasizes that the chip market is vast, with “potential for multiple options,” stressing the importance of customer choice.

AWS’s semiconductors, Inferentia and Trainium, are “specifically built for AI inference,” Garman explains. These chips offer particular value for small-scale inference tasks, helping customers reduce costs. He also notes that AWS is working on improving these chips to train large language models.

“We think that there’s this really large market segment and there’s room enough for customers to be using the best product for the use case for a long time,” Garman told Yahoo Finance. He also expressed support for other chipmakers, stating that AWS does not expect to become “fully reliant” on its own chips.

Catch Yahoo Finance’s full interview with Matt Garman here.

For more expert insight and the latest market action, click here to watch this full episode of Morning Brief.

This post was written by Angel Smith
