Oracle to offer cloud services powered by new AMD AI chips

Major Expansion in AI Infrastructure

Oracle has announced a significant partnership with Advanced Micro Devices (AMD) to bring the upcoming MI450 artificial intelligence chips to its cloud services portfolio. The move is aimed at meeting rapidly growing demand for AI infrastructure, with a particular focus on large-scale model deployment and advanced AI workloads such as those behind ChatGPT[2][3].

Details of the Deployment

  • Oracle will initially deploy 50,000 MI450 processors in the third quarter of 2026.
  • Further expansion is planned for 2027 and beyond, allowing Oracle to scale AI compute resources as demand continues to accelerate.
  • The partnership leverages AMD's new "Helios" rack design, which is optimized for AI superclusters and intended to simplify integration at large cloud scale.

Strategic Implications

  • This collaboration gives AMD another major client for its upcoming AI chips, strengthening its position in the competitive AI hardware space.
  • Oracle can now offer customers the latest in AI chip technology, further broadening its processor offerings and reinforcing its role as a key player in AI cloud services.

Background and Market Context

  • Demand for large-scale AI capacity is skyrocketing, driven by next-generation AI models that require more powerful clusters than ever before.
  • AMD has recently collaborated with OpenAI to enhance the MI450 chip design for AI applications. OpenAI is also building a one-gigawatt facility using the MI450 processor, underscoring the chip's capabilities for massive AI workloads.
  • OpenAI is reported to have signed a landmark $300 billion cloud deal with Oracle, securing compute resources critical for the continued advancement of AI technologies like ChatGPT[2].
  • AMD’s main competitor, NVIDIA, is already selling fully integrated AI racks. AMD is moving quickly to offer comparable solutions, aiming to meet the infrastructure needs of future AI workloads.

Performance and Innovation

  • The MI450-powered “AI superclusters” are designed for high throughput, high memory bandwidth, and energy-efficient performance to meet the most demanding AI applications.
  • The new architecture enables support for extremely large AI models, enhancing the speed and efficiency of both training and inference operations in the cloud[1].
