AMD CEO Unveils New AI Chips

Major Announcements at Advancing AI 2025

Advanced Micro Devices (AMD) showcased its ambitious push into the artificial intelligence (AI) hardware market at the Advancing AI 2025 event in San Jose, highlighting the company’s strategy to challenge established players and drive innovation across the AI ecosystem. CEO Lisa Su introduced a suite of new AI accelerators and complementary software, backed by substantial claimed performance gains and significant ecosystem partnerships[1].

The New AMD Instinct MI350 Series

- The MI350X and MI355X accelerators, AMD’s latest flagship AI chips, feature up to 288GB of HBM3E memory and 8TB/s of memory bandwidth.
- Powered by the new CDNA 4 architecture and fabricated on TSMC’s advanced N3P node, the accelerators promise up to 4.2 times the performance of the previous MI300X on AI workloads.
- According to AMD, inference performance is up to 35 times higher than the prior generation, and early benchmarks suggest these chips match or exceed Nvidia’s B200 and GB200, especially in FP4 and FP6 precision operations[5]. (A brief software-level sketch of how such an accelerator appears to a framework follows this list.)
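
For context on how these accelerators surface to software, the hedged sketch below queries device properties from a ROCm build of PyTorch, which exposes AMD GPUs through the familiar torch.cuda namespace. The device name and figures it prints depend on the system; none of the values are taken from AMD’s announcement.

```python
# Hedged sketch: inspect an AMD Instinct accelerator from a ROCm build of
# PyTorch. ROCm builds expose AMD GPUs through the torch.cuda namespace,
# so the standard CUDA-style device queries work unchanged.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"Device:        {props.name}")
    print(f"HBM capacity:  {props.total_memory / 1e9:.0f} GB")  # AMD cites up to 288GB on the MI355X
    print(f"Compute units: {props.multi_processor_count}")
else:
    print("No ROCm-visible accelerator detected.")
```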

New Systems and Ecosystem Initiatives

- AMD revealed its open, scalable rack-scale AI infrastructure, enabling flexibility and performance beyond 2027.
- The next-generation “Helios” AI Rack was presented, emphasizing advanced cooling and rack-level integration for data centers.
- The MI350 Series accelerators will become available through leading partners such as Oracle, Dell Technologies, Hewlett Packard Enterprise, Cisco, and Asus starting in Q3 2025[2].

Key Partnerships and Ecosystem Momentum

- AMD’s open AI ecosystem now includes collaborations with AI powerhouses such as OpenAI, Meta, Microsoft, xAI, Cohere, Oracle, and others.
- Several industry leaders, including Red Hat, Astera Labs, and Marvell, shared their commitment to building an open, integrated environment leveraging AMD’s leadership in GPUs, CPUs, networking, and software.
- The continued growth of the AMD ROCm open software stack was highlighted as a key enabler for developers, with new features and optimizations delivered on an accelerated release cadence of roughly every two weeks.

Software Innovation Enhancing Developer Experience

- ROCm, AMD’s open AI software platform, was positioned as central to unlocking the full potential of the new hardware, providing out-of-the-box capabilities and easy setup for developers (see the hedged sketch after this list).
- Popular models such as Llama and DeepSeek are reportedly supported from day one.
- AMD’s SVP of AI, Vamsi Boppana, emphasized the growing developer community, with frequent hackathons and meetups fueling unprecedented speed in customer deployments.
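
As one illustration of the “out-of-the-box” claim, the sketch below loads a Llama-family model with the Hugging Face transformers library on a ROCm build of PyTorch. The model ID is an assumed example rather than one named by AMD, and gated checkpoints require accepting the license on the Hugging Face Hub; this is a minimal sketch, not AMD’s official setup path.

```python
# Minimal sketch, assuming a ROCm build of PyTorch and the Hugging Face
# transformers library are installed. The checkpoint below is an illustrative
# assumption, not taken from AMD's announcement.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # assumed example checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16       # bf16 weights to conserve HBM
).to("cuda")                                   # "cuda" maps to the ROCm/HIP device

prompt = "Summarize the AMD Instinct MI350 announcement in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```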

An Open Future for AI

Dr. Lisa Su articulated AMD’s vision of leading the next phase of AI through open standards, shared innovation, and broad collaboration across hardware and software. AMD aims to accelerate the pace of AI development and deployment by providing unmatched flexibility, performance, and openness for its global partners and customers[3].

Looking Ahead

AMD's unveiling of the Instinct MI350 series, integrations with leading industry players, and emphasis on open ecosystem collaborations set a dynamic stage for future developments in AI hardware and software. With these initiatives, AMD is positioning itself as a key player in shaping the future of enterprise AI solution deployment.
