AMD (AMD.US) released its new AI accelerator, the Instinct MI325X, at the "Advancing AI" conference held on October 10, US local time. The chip is expected to ship in the fourth quarter of 2024. AMD also previewed the MI350 series, aimed at Nvidia's Blackwell series, but that line will not reach volume production until 2H25 at the earliest. According to AMD CEO Lisa Su, the total addressable market (TAM) for data center AI accelerators will grow at a compound annual rate of more than 60% and reach USD 500 billion by 2028, up from her earlier forecast (in early 2024, Lisa Su projected USD 400 billion by 2027). Overall, the event offered few surprises for investors. We noted three things: 1) Meta's Llama 405B model now runs entirely on the MI300X, indicating that AMD has made good progress with Meta; 2) AMD did not mention Amazon in its customer showcase; 3) AMD did not disclose GPU sales targets for 2024 and 2025 or comment on current market supply and demand.
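As a quick sanity check on the TAM arithmetic above, the sketch below back-solves the base market size implied by a 60% compound annual growth rate reaching USD 500 billion in 2028; the choice of 2023 as the base year and the implied figures are our own illustration, not numbers from AMD's presentation.

```python
# Back-of-envelope check on the data center AI accelerator TAM forecast.
# Assumption: growth compounds annually from a 2023 base (5 periods to 2028);
# the base-year figure is implied here, not disclosed in the article.

cagr = 0.60            # "more than 60%" compound annual growth rate
tam_2028 = 500.0       # USD billions, AMD's 2028 forecast
years = 2028 - 2023    # 5 compounding periods

implied_2023_base = tam_2028 / (1 + cagr) ** years
print(f"Implied 2023 TAM base: ~${implied_2023_base:.0f}B")              # ~$48B

# Under the same implied base, the earlier forecast of $400B by 2027
# corresponds to a compound growth rate of roughly:
implied_cagr_prior = (400.0 / implied_2023_base) ** (1 / 4) - 1
print(f"Implied CAGR of the earlier forecast: ~{implied_cagr_prior:.0%}")  # ~70%
```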
AMD's new GPU highlights: The MI325X accelerator is built on a 5 nm process and carries 256 GB of HBM3E memory with 6 TB/s of memory bandwidth, 1.8x the capacity and 1.3x the bandwidth of Nvidia's H200. According to AMD, the MI325X's theoretical peak FP16 and FP8 compute is also 1.3 times that of the H200. Volume shipments of the MI325X remain on track for 4Q24, and server solutions based on the chip are expected from partners including Dell, Eviden, Gigabyte, Lenovo, and Supermicro. AMD also previewed the next-generation MI350 accelerator, based on the CDNA 4 architecture and a 3 nm process. AMD says the MI350X's compute performance (FP16 and FP8) will improve by 80% over the MI325X; the chip carries 288 GB of HBM3E memory and supports 8 TB/s of memory bandwidth. The MI350 series is expected to launch in 2H25.
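For reference, the capacity and bandwidth multiples quoted above can be reproduced with the short sketch below; the H200 figures used (141 GB HBM3e, 4.8 TB/s) are our assumption based on Nvidia's published specifications rather than numbers stated at the event.

```python
# Reproduce the "1.8x capacity / 1.3x bandwidth" comparison against the H200.
# Assumption: H200 at 141 GB HBM3e and 4.8 TB/s (Nvidia's published specs).

mi325x = {"hbm_gb": 256, "bw_tb_s": 6.0}
h200   = {"hbm_gb": 141, "bw_tb_s": 4.8}

capacity_ratio  = mi325x["hbm_gb"] / h200["hbm_gb"]
bandwidth_ratio = mi325x["bw_tb_s"] / h200["bw_tb_s"]

print(f"Memory capacity:  {capacity_ratio:.2f}x H200")   # ~1.82x, cited as 1.8x
print(f"Memory bandwidth: {bandwidth_ratio:.2f}x H200")  # 1.25x, cited as ~1.3x
```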
AMD's new CPU highlights: Beyond the new GPU products, AMD also released the fifth-generation EPYC "Turin" CPUs, based on the Zen 5 architecture, at this conference. The new CPUs deliver substantial performance gains over the previous generation, particularly for data center workloads: Turin raises instructions per cycle (IPC) by 17% for enterprise and cloud-computing workloads and by 37% for high-performance computing and AI tasks. According to AMD, since the EPYC line launched in 2018, AMD's share of the global server market has grown from 2% to 34%. AMD also positions the EPYC platform as an AI host CPU compatible with both AMD Instinct and Nvidia MGX/HGX platforms. These configurations support up to eight OAM MI300X or MI325X GPUs and deliver notable performance advantages, including 20% higher AI inference performance and 15% higher training workload capacity, positioning AMD as a key contender in the AI host CPU segment against Intel's Xeon series.
Our view: AMD's MI325X is a mid-cycle upgrade of the MI300X aimed at competing with Nvidia's H200. However, because AMD's next-generation MI350 is not scheduled to launch until 2H25, while Nvidia's B200 begins volume shipments in 4Q24, AMD will still lag behind Nvidia. We believe Nvidia will maintain its lead in the GPU market and AMD will keep working to close the gap. On the CPU side, AMD's fifth-generation EPYC marks a major step forward and has won additional share in the server market; its performance and cost advantages compare favorably with Intel's Xeon 6 series.