Zhitong Finance APP learned that China Recruitment International issued a research report noting that on October 10, US local time, AMD (AMD.US) released a new AI accelerator, the Instinct MI325X, at its "Advancing AI" event. The chip targets Nvidia's (NVDA.US) H200 and is expected to ship in the fourth quarter of 2024. At the same event, AMD also introduced the MI350 series to compete with Nvidia's Blackwell series, but that line will reach mass production in the second half of 2025 at the earliest.
According to AMD CEO Lisa Su, the total addressable market (TAM) for data center AI accelerators will grow at a compound annual rate of more than 60%, reaching $500 billion in 2028, up from her earlier forecast (in early 2024, Lisa Su projected $400 billion by 2027). Overall, the event did not bring many surprises for investors.
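As a rough sanity check on the cited growth figure, the sketch below backs out the market size implied by a 60% compound growth rate and the 2028 target; the choice of 2023 as the base year is an assumption for illustration, not a figure stated in the report.

```python
# Rough sanity check of the data center AI accelerator TAM forecast.
# Assumption: the ">60% CAGR" compounds from a 2023 base year; the report
# only states the 2028 target and the growth rate.
target_2028 = 500e9        # USD, Lisa Su's 2028 forecast
cagr = 0.60                # "more than 60%" compound annual growth
years = 2028 - 2023        # assumed compounding period

implied_2023_base = target_2028 / (1 + cagr) ** years
print(f"Implied 2023 TAM: ${implied_2023_base / 1e9:.0f}B")          # ~ $48B

# The earlier forecast ($400B by 2027) implies a base in the same ballpark:
implied_base_old = 400e9 / (1 + cagr) ** (2027 - 2023)
print(f"Implied 2023 TAM (old forecast): ${implied_base_old / 1e9:.0f}B")  # ~ $61B
```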
Three points stood out at the event: 1) Meta's Llama 405B model now runs entirely on the MI300X, indicating that AMD has made good progress with Meta (META.US); 2) AMD did not mention Amazon (AMZN.US) in its customer showcase; 3) AMD did not disclose GPU sales targets for 2024 and 2025 or comment on current market supply and demand.
AMD's new GPU highlights: The MI325X accelerator is built on a 5-nanometer process and carries 256GB of HBM3E memory with 6TB/s of memory bandwidth, giving it 1.8 times the capacity and 1.3 times the bandwidth of Nvidia's H200. According to AMD, the MI325X's theoretical peak FP16 and FP8 throughput will also reach 1.3 times that of the H200. Volume shipments of the MI325X are on track for 4Q24, and server solutions based on it are expected from partners including Dell (DELL.US), Eviden, Gigabyte, Lenovo, and Supermicro. In addition, AMD previewed its next-generation MI350 accelerators, based on the CDNA 4 architecture and a 3-nanometer process.
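For illustration, the short calculation below shows where ratios of roughly 1.8x and 1.3x come from when the stated MI325X specs are set against the H200. The H200 figures used here (141GB HBM3E, about 4.8TB/s) are Nvidia's published specifications and are not taken from this report.

```python
# Capacity and bandwidth ratios behind the "1.8x / 1.3x vs H200" claims.
# H200 reference figures (141 GB HBM3E, ~4.8 TB/s) come from Nvidia's
# published specs; the report itself only states the ratios.
mi325x_mem_gb, mi325x_bw_tbs = 256, 6.0
h200_mem_gb, h200_bw_tbs = 141, 4.8

print(f"Capacity ratio:  {mi325x_mem_gb / h200_mem_gb:.2f}x")   # ~1.82x
print(f"Bandwidth ratio: {mi325x_bw_tbs / h200_bw_tbs:.2f}x")   # ~1.25x, rounded to ~1.3x
```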
AMD said the compute performance (FP16 and FP8) of the MI350X accelerator will be 80% higher than that of the MI325X. The product carries 288GB of HBM3E memory and supports 8TB/s of memory bandwidth. The MI350 series is expected to launch in 2H25.
AMD's new CPU highlights: Beyond the new GPU products, AMD also released its fifth-generation EPYC "Turin" CPUs, based on the Zen 5 architecture, at the event. The CPUs deliver a substantial performance improvement over the previous generation, especially for data centers: Turin raises instructions per clock (IPC) by 17% for enterprise and cloud computing workloads, and by 37% for high-performance computing and AI tasks.
According to AMD, since the EPYC product line launched in 2018, AMD has expanded its share of the global server market from 2% to 34%. AMD also positions the EPYC platform as an AI host CPU suitable for both AMD Instinct and Nvidia MGX/HGX platforms. These configurations can support up to eight OAM MI300X or MI325X GPUs and deliver notable performance advantages, including 20% higher AI inference performance and 15% higher training workload capacity, positioning AMD as a key participant in the AI CPU field in competition with Intel's Xeon series.
AMD's MI325X is a mid-cycle upgrade of the MI300X aimed at competing with Nvidia's H200. However, because AMD's next-generation MI350 is not scheduled to launch until 2H25, AMD will still lag Nvidia, whose B200 is set to begin volume shipments in 4Q24. China Recruitment International believes Nvidia will retain its leading position in the GPU market while AMD continues working to catch up. On the CPU side, AMD's fifth-generation EPYC has made major breakthroughs and gained further share in the server market, with performance and cost advantages over Intel's Xeon 6 series.