News

The new AMD Instinct MI250X and its MCM GPU design are paired with 128GB of HBM2e memory, with AMD tapping an 8-channel interface (a 1024-bit interface per channel x 8 = an 8192-bit memory bus).
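As a quick sanity check on that arithmetic, here is a minimal Python sketch using only the figures quoted above (8 channels, 1024 bits per channel, 128GB total capacity):

```python
# Back-of-envelope check of the MI250X memory-bus arithmetic quoted above.
HBM2E_STACKS = 8       # the 8-channel interface, i.e. 8 HBM2e stacks
BITS_PER_STACK = 1024  # each HBM2e stack exposes a 1024-bit interface
CAPACITY_GB = 128      # total HBM2e capacity

total_bus_width = HBM2E_STACKS * BITS_PER_STACK
print(f"Total memory bus: {total_bus_width}-bit")                  # 8192-bit
print(f"Capacity per stack: {CAPACITY_GB / HBM2E_STACKS:.0f} GB")  # 16 GB
```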
With the “Antares” Instinct MI300 series, the third time's the charm for AMD’s datacenter GPU business, and AMD is gearing up to have a significant GPU business, which the company finally ...
AMD's RDNA 3 GPU architecture found in the Radeon RX 7900 family of graphics ... the Instinct MI250X. Not only that, but AMD says that combining CPU and AI acceleration units in this ...
TensorWave has built North America's largest AMD AI cluster with 8,192 liquid-cooled MI325X GPUs, delivering 21 exaFLOPS of FP8 throughput. It’s a bold move against NVIDIA’s dominance and marks ROCm’s ...
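For a sense of scale, a rough back-of-envelope division, assuming the 21 exaFLOPS figure is the aggregate peak FP8 throughput across all 8,192 GPUs:

```python
# Rough check: implied per-GPU FP8 throughput for the TensorWave cluster.
# Assumes 21 exaFLOPS is the aggregate peak FP8 figure for the whole cluster.
CLUSTER_FP8_FLOPS = 21e18  # 21 exaFLOPS
GPU_COUNT = 8_192          # liquid-cooled MI325X GPUs

per_gpu_pflops = CLUSTER_FP8_FLOPS / GPU_COUNT / 1e15
print(f"Implied peak FP8 per GPU: ~{per_gpu_pflops:.2f} PFLOPS")  # ~2.56 PFLOPS
```

That works out to roughly 2.6 PFLOPS of FP8 per accelerator, consistent with the per-GPU peak AMD quotes for the MI325X.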
HSBC analysts upgraded their rating on AMD, noting that its latest series of chips can compete with Nvidia’s Blackwell GPUs.
The Instinct MI300, introduced in November 2023, represents AMD's first truly competitive GPU for AI inferencing and training workloads. Despite its relatively recent launch, the MI300 has quickly ...
Vultr's GPU cloud also offers Nvidia H100 GPUs and GH200 chips. In April 2024, the company launched a new sovereign and private cloud offering. The company has a presence in 32 data centers across six ...
In Q3, AMD's data center revenue grew by 122% year-over-year, driven by a strong ramp of Instinct GPU and EPYC CPU shipments. I think AMD’s data center business will continue to grow by 45% in FY25.
AMD has open-sourced its GPU-IOV Module, which lets Instinct accelerators play nice with virtual machines, and has hinted it's coming to Radeon cards too. This means SR-IOV support on client GPUs ...
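As background on what SR-IOV support means in practice, here is a minimal sketch of the generic Linux sysfs flow for splitting an SR-IOV-capable PCIe device into virtual functions; the PCI address is hypothetical, and the GPU-IOV Module may layer its own configuration on top of this:

```python
# Minimal sketch of the generic Linux SR-IOV flow: an SR-IOV-capable PCIe
# device advertises a maximum number of virtual functions (VFs), and writing
# to sriov_numvfs creates that many VFs, each of which can then be passed
# through to a separate virtual machine.
# The PCI address below is hypothetical; AMD's GPU-IOV Module may expose
# additional knobs beyond the standard sysfs attributes shown here.
from pathlib import Path

DEVICE = Path("/sys/bus/pci/devices/0000:03:00.0")  # hypothetical Instinct GPU

# How many VFs the device (and its driver) can expose.
total_vfs = int((DEVICE / "sriov_totalvfs").read_text())
print(f"Device supports up to {total_vfs} virtual functions")

# Carve the GPU into VFs (requires root and SR-IOV support in the driver).
(DEVICE / "sriov_numvfs").write_text(str(min(total_vfs, 4)))
```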
Competitor Nscale offers a GPU cloud based on MI300X accelerators as well as AMD MI250 GPUs and Nvidia’s A100, H100, and V100 GPUs. In August 2024, AMD launched ROCm 6.2, the latest version of ...