News

The AMD Instinct MI300 uses a distinctive chiplet design, employing what AMD calls 3.5D packaging with a total of 13 chiplets. Overall, the chip packs 153 billion transistors, an impressive feat.
Look at AMD, with a superior MI300 guided to $3.5B in sales versus Nvidia's $65B of H100 sales in 2024. It's not a supply issue; AMD just has longer lead times as it tries to penetrate and cover the TAM.
Drilling down a little deeper, the AMD Instinct MI300 features a 128-channel interface to its HBM3 memory, with each IO die connected to two stacks of HBM3.
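As a rough sanity check on that channel count, here is a minimal back-of-the-envelope sketch in Python, assuming eight HBM3 stacks with 16 channels each (the stack count follows from the four IO dies described above) plus a 1024-bit-per-stack interface at 5.2 Gbps per pin; the width and pin speed are assumptions taken from public MI300X spec sheets, not from this article:

```python
# Back-of-the-envelope HBM3 sizing for the MI300-class memory subsystem.
# Assumptions (not from the article): 8 HBM3 stacks, 16 channels per stack,
# a 1024-bit interface per stack, and 5.2 Gbps per pin (MI300X-class figures).

HBM_STACKS = 8            # 4 IO dies x 2 stacks each, per the description above
CHANNELS_PER_STACK = 16   # HBM3 exposes 16 channels per stack
BITS_PER_STACK = 1024     # assumed interface width per stack
PIN_SPEED_GBPS = 5.2      # assumed effective data rate per pin

total_channels = HBM_STACKS * CHANNELS_PER_STACK
total_bus_bits = HBM_STACKS * BITS_PER_STACK
bandwidth_tbs = total_bus_bits * PIN_SPEED_GBPS / 8 / 1000  # Gbit/s -> TB/s

print(f"channels:  {total_channels}")            # 128, matching the article
print(f"bus width: {total_bus_bits} bits")       # 8192 bits
print(f"bandwidth: ~{bandwidth_tbs:.1f} TB/s")   # ~5.3 TB/s
```

With those assumed numbers, the 128-channel figure and the widely quoted peak bandwidth of roughly 5.3 TB/s both fall out directly.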
For me, it's the VRAM side of the AI GPUs that I love the most... with AMD's new Instinct MI300X featuring a huge 50% increase in HBM3 memory over its predecessor, the Instinct MI250X (192GB now ...
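That 50% figure is easy to verify with a quick sketch, assuming the MI250X's published 128GB of HBM2e as the baseline (the 128GB baseline is an assumption from AMD's MI250X specs, not something stated in the snippet above):

```python
# Sanity check on the claimed 50% HBM capacity increase.
mi250x_gb = 128   # MI250X published HBM2e capacity (assumption, not in the snippet)
mi300x_gb = 192   # MI300X HBM3 capacity, as stated above

increase = (mi300x_gb - mi250x_gb) / mi250x_gb
print(f"capacity increase: {increase:.0%}")   # 50%
```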
During a keynote speech, AMD Chair and Chief Executive Lisa Su announced the company's new Instinct MI300X accelerator, which is targeted specifically at generative AI workloads.
AMD's Instinct MI300X is a beast of an AI accelerator - built on the company's third-generation CDNA architecture and TSMC's advanced 5nm and 6nm processes, it features 19,456 stream processors ...
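The 19,456 stream-processor figure follows directly from the compute-unit count; here is a minimal check, assuming the publicly listed 304 CDNA 3 compute units at 64 stream processors per CU (both figures are assumptions pulled from AMD's MI300X spec sheet rather than from the blurb above):

```python
# Stream processor count from compute units (MI300X-class assumptions).
COMPUTE_UNITS = 304   # published MI300X CU count (assumption, not in the blurb)
SP_PER_CU = 64        # stream processors per CDNA compute unit

print(COMPUTE_UNITS * SP_PER_CU)   # 19456, matching the figure above
```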
The Instinct MI300, introduced in December 2023, represents AMD's first truly competitive GPU for AI inferencing and training workloads. Despite its relatively recent launch, the MI300 has quickly ...
A family of AI chips from AMD. Introduced in 2023, the Instinct MI300X is a GPU chip with 192GB of HBM3 memory. Multiple MI300X chips are used in AMD's Infinity Architecture Platform for high-end ...
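To put that platform in perspective, here is a quick sketch of the pooled memory in one such node, assuming the standard eight-accelerator configuration AMD describes for the Infinity Architecture Platform (the eight-GPU count is an assumption from AMD's platform materials, not from this entry):

```python
# Pooled HBM3 across an eight-GPU Infinity Architecture Platform node.
GPUS_PER_NODE = 8       # assumed standard platform configuration
HBM_PER_GPU_GB = 192    # per-GPU HBM3 capacity, as stated above

total_gb = GPUS_PER_NODE * HBM_PER_GPU_GB
print(f"{total_gb} GB (~{total_gb / 1024:.1f} TiB) of HBM3 per node")   # 1536 GB
```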
SANTA CLARA, Calif., Sept. 26, 2024 (GLOBE NEWSWIRE) -- AMD (NASDAQ: AMD) today announced that Oracle Cloud Infrastructure (OCI) has chosen AMD Instinct™ MI300X accelerators with ROCm™ open ...