AMD reveals next-generation AI chips with OpenAI CEO Sam Altman

Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart Building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 series, which will ship next year.

AMD said the MI400 chips will be able to be assembled into a full server rack called Helios, which will link thousands of chips together so they can be used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared onstage with Su and said his company would use the AMD chips.

"When you first started telling me about the specs, I was like, there's no way, that just sounds totally crazy," Altman said. "It's going to be an amazing thing."

AMD's rack-scale setup will make the chips look like one system to users, which is important for most artificial intelligence customers, such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and use massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," Su said, comparing it against Nvidia's Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit at the Grand Palais in Paris, on Feb. 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also enables its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations of 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for developing and deploying AI applications.

OpenAI, a notable Nvidia customer, gave AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips set to compete against Nvidia's offerings, AMD executives told reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the data center GPU market, partly because it was the first company to develop the kind of software AI developers need in order to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia using its "proprietary" CUDA software.

"It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress," Su said.

AMD's shares are flat so far in 2025, signaling that Wall Street doesn't yet see it as a major threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less to operate and less to acquire.

"Across the board, there is a meaningful cost of acquisition delta that we then layer our performance advantage on top of, so significant double-digit percentage savings," Dieckmann said.

Over the next few years, big cloud companies and countries alike are poised to spend hundreds of billions of dollars to build new data centers around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the overall market for AI chips to exceed $500 billion by 2028, though it did not say how much of that market it expects to capture; Nvidia currently controls the vast majority of it, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, rather than every two years, underscoring how fierce the competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase of ZT Systems earlier this year, a server maker that developed the technology AMD needed to build its rack-sized systems.

"These AI systems are getting very complicated, and full-stack solutions are really important," Su said.

What AMD is selling now

AMD's most advanced AI chip currently being installed by cloud providers is its Instinct MI355X, which the company said started shipping last month. AMD said it will be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to meet the growing need for "inference," the computing power required to actually deploy a chatbot or generative AI application, which can use far more power.

"What has really changed is the demand for inference," Su said.

AMD representatives said Thursday that they believe their new chips outperform Nvidia's for inference, because AMD's chips are equipped with more high-speed memory, allowing bigger AI models to run on a single GPU.

AMD said the MI355X has seven times the computing power of its predecessor. The chips will compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

AMD plans to offer clusters of more than 131,000 MI355X chips, the company said.

Officials from Meta said Thursday that the company uses AMD CPUs and GPUs to run inference for its Llama model, and that it plans to buy AMD's next-generation servers.

A Microsoft representative said the company uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost. It does not sell chips on their own, and end users usually buy them through a hardware company such as Dell or Super Micro Computer. But the company plans for the MI400 chips to be priced competitively.

The Santa Clara, California-based company combines its GPUs with its CPUs and networking chips from its 2022 acquisition of Pensando to build its Helios racks. That means greater adoption of its AI chips should also benefit the rest of AMD's business. AMD also uses an open-source networking technology called UALink to closely integrate its rack systems, in contrast to Nvidia's proprietary NVLink.

AMD claims its MI355X can deliver 40% more tokens, a measure of AI output, per dollar than Nvidia's chips, because its chips use less power than its rival's.

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies usually buy them in large quantities.

AMD's AI chip business is still much smaller than Nvidia's. The company said it generated $5 billion in AI chip sales in its fiscal 2024, but JPMorgan analysts expect the business to grow 60% this year.

WATCH: AMD CEO Lisa Su: Chip export controls are a headwind, but we still see opportunity to grow
