AMD Unveils MI400 AI Chips, Teams Up with OpenAI

AMD Launches Instinct MI400 AI Chips
At an event in San Jose, California, AMD introduced its next-generation AI chips, the Instinct MI400 series, designed for large-scale AI workloads. The chips are engineered to scale across massive training runs and distributed inference. With up to 432GB of HBM4 memory, 40 petaflops of FP4 performance, and 300GB/s of scale-out bandwidth, the GPUs aim to deliver rack-scale AI performance leadership. AMD CEO Lisa Su presented the new chips alongside OpenAI CEO Sam Altman, who confirmed that OpenAI would integrate the MI400 series into its infrastructure.
Strategic Partnerships and Market Expansion
The launch of the Instinct MI400 series underscores AMD's ambition to challenge Nvidia's dominance in the AI chip market. AMD's accelerators are already used by major companies including Meta, Microsoft, Tesla, and Oracle. The company projects that the AI chip market will exceed $500 billion by 2028, and AMD plans to release new chips annually to offer cost-effective options for large AI deployments. The OpenAI partnership further strengthens AMD's position in the competitive AI landscape.
A Shift in the AI Chip Market
Nvidia currently leads the AI chip market with its Blackwell GPUs and the CUDA software platform that powers its data center business. AMD's new offering seeks to disrupt that dominance, promising cost-effective, high-performance alternatives for businesses that need large-scale AI capabilities. Following the announcement, AMD's stock rose 9%, with analysts anticipating a strong recovery for AMD's GPU business in the fourth quarter. The MI400 chips are expected to begin shipping in 2026.