News
The AMD Instinct MI200 series will include three accelerators, ... showing an advantage of 4.9x in both FP64 vector and FP64 matrix/tensor performance, ...
AMD said its newly launched Instinct MI300X data center GPU exceeds Nvidia’s flagship H100 chip in memory capabilities and surpasses it in key AI performance ... For FP64 vector operations, ...
AMD enters the AI acceleration game with broad industry support. The first shipping product is the Dell PowerEdge XE9680 with AMD Instinct MI300X.
New AMD Instinct MI200 GPU ‘Faster’ Than Nvidia’s A100. ... It can also achieve 47.9 teraflops of FP32 vector performance, 2.5 times faster than the A100. For FP16 and BF16 matrix performance, ...
HSBC analysts upgraded their rating on AMD, noting that its latest series of chips can compete with Nvidia’s Blackwell GPUs.
AMD on Monday unveiled the Instinct MI200 accelerator, the latest generation of its data center GPU. ... Its peak FP32 vector performance is about 2.5x that of Nvidia's A100.
The new parts are named AMD's Instinct MI450X IF64 and Instinct MI450X IF128, and both are designed for AI deployment. If they prove to be a success, they could change the landscape of AI hardware over time.
AMD's CDNA 4 enters the scene. AMD's Instinct MI350X-series GPUs are based on the CDNA 4 architecture that introduces support for FP4 and FP6 precision formats alongside FP8 and FP16.
AMD provides an update on its Instinct MI series: the MI325X refresh with 288GB of HBM3E is coming in Q4, the MI350X with CDNA 4 in 2025, and the CDNA Next MI400 in 2026.
AMD's new Instinct MI300X AI GPUs are inside LaminiAI LLM Pods, with 8 x Instinct MI300X per pod. NVIDIA is projected to make $130 billion from AI GPUs in 2026, 5x higher than in 2023.
Orders for the Instinct MI300X are now off the charts, so AMD has raised its 2024 guidance for data center GPU revenue from $2 billion to $3.5 billion.