
Nvidia wants to own your AI data center from end to end

Why This Matters

Nvidia is aiming to dominate the entire AI data center ecosystem by offering a comprehensive range of hardware solutions, from CPUs to specialized inference racks. This strategic move could reshape the AI infrastructure landscape, giving Nvidia greater control over AI economics and performance. The company's expanding ambitions also include robotics and space AI, signaling a broad vision for future technological integration.



ZDNET's key takeaways

Nvidia showed off five racks of equipment covering all aspects of AI infrastructure.

Nvidia argues that AI economics are better when all the parts are from Nvidia.

Nvidia's broadening ambition includes robotics and even AI in space.

The image Nvidia suggested to the media for its GTC conference in San Jose, Calif., this week is a line of 40 rectangles representing data center server racks of various kinds. No labels, just the racks standing like a bookshelf of the complete works of Shakespeare, or, more ominously, a phalanx of soldiers.

The implicit message of the imposing wall of racks is that Nvidia intends to own all processing in the data center, from one end to the other — if it doesn't already.

Also: This OS quietly powers all AI - and most future IT jobs, too

On stage at the show, Nvidia CEO Jensen Huang used Monday's keynote address to announce a broadening of the company's chip and system offerings. Existing product lines include the Vera CPU chip and the Rubin GPU chip; joining them now is a new kind of equipment rack for ultra-fast inference, called the LPX.
