
AMD’s dual-GPU Radeon HD 6990 launched 15 years ago — power, heat, and noise monster was crowned the fastest graphics card in the world

Why This Matters

AMD’s Radeon HD 6990 was a groundbreaking dual-GPU graphics card that set the performance standard a decade and a half ago, showcasing AMD’s determination to push graphics performance to new heights. Despite its speed, it also highlighted the challenges of heat, noise, and power consumption that still shape high-end GPU design today, underscoring the ongoing balance between raw power and practical usability.

Key Takeaways

15 years ago, AMD released its powerful Radeon HD 6990 graphics card. This flagship dual-GPU hot rod, codenamed Antilles, arrived several months late, in March 2011, and poked its head out only a couple of weeks before Nvidia’s reply, the dual-GPU GeForce GTX 590. Nevertheless, the dual-Cayman XT board, with a majestic-for-the-era 4GB of VRAM, became the world’s fastest graphics card (an honor AMD already held with the 2GB Radeon HD 5970). To reach this PC performance pinnacle, however, AMD arguably pushed the silicon a little too hard, with contemporary reviews complaining about heat, noise, and power consumption.

The specs and performance of the Radeon HD 6990 were spectacular to behold at the time. Two fully fledged Cayman XT GPUs were shoehorned onto a single PCB and linked via AMD’s CrossFireX. Essentially two slightly downclocked HD 6970 graphics cards on one board, the HD 6990 delivered:

Dual Cayman XT 40nm GPUs, packing 3072 stream processors and 5.28bn transistors combined

Standard GPU clock of 830 MHz, OC up to 880 MHz

4GB GDDR5 (2GB per GPU)

Five-display support

Massive 375 W TDP, with an overclocked mode pushing 450 W, selectable via an innovative dual-BIOS switch

Dual-slot, 12-inch PCB with two 8-pin PCIe power connectors

Flagship pricing at $699
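The combined figures above are simple doublings of the single-GPU Cayman XT specs, and the 375 W stock TDP falls straight out of the PCIe power-delivery budget (75 W from the slot plus 150 W per 8-pin connector). A quick sanity check of that arithmetic — the per-GPU and per-connector numbers used here are standard published figures, not taken from this article:

```python
# Sanity check of the HD 6990's combined specs, assuming standard
# single-GPU Cayman XT figures and PCIe power-delivery limits.
GPUS = 2
STREAM_PROCESSORS_PER_GPU = 1536   # Cayman XT shader count
TRANSISTORS_PER_GPU = 2.64e9       # transistors per 40nm die
VRAM_PER_GPU_GB = 2

PCIE_SLOT_W = 75                   # power drawn through the PCIe slot
EIGHT_PIN_W = 150                  # power per 8-pin auxiliary connector

total_sps = GPUS * STREAM_PROCESSORS_PER_GPU        # 3072
total_transistors = GPUS * TRANSISTORS_PER_GPU      # 5.28 billion
total_vram_gb = GPUS * VRAM_PER_GPU_GB              # 4 GB
board_power_budget = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # 375 W

print(total_sps, total_transistors, total_vram_gb, board_power_budget)
```

Note that the 375 W stock TDP is exactly the in-spec ceiling for a slot plus two 8-pin connectors; the 450 W overclocked mode ran beyond what the PCIe specification formally allows, which helps explain the era’s complaints about power draw.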

(Image credit: Tom's Hardware)
