China-based AI hardware developers are making rapid progress with accelerators of their own design. However, China's most advanced AI developers increasingly acknowledge that domestic hardware is unlikely to catch up with U.S. leaders in the near term, which limits their ability to build competitive models. In a bid to stay competitive with American peers, Chinese AI developers are therefore exploring ways to rent Nvidia's upcoming Rubin GPUs in the cloud, reports the Wall Street Journal.
When Nvidia introduced its Rubin datacenter platform for AI in January, it publicly named American customers and omitted Chinese ones. The company has taken this approach in recent quarters, reflecting U.S. export rules, its commitment to comply with them, and its intent not to signal to investors that the Chinese market might reopen. Chinese companies got the message and began exploring ways to access Nvidia's leading-edge processors remotely to avoid falling behind their U.S. rivals.
Chinese AI companies have begun negotiating access to NVL144 GR200 and other Nvidia Rubin-based systems hosted in data centers outside China, particularly in Southeast Asia and the Middle East, the report claims. Until the middle of this week, these arrangements were generally considered legal. However, they carry caveats by design: compute is rented rather than owned, capacity is shared rather than dedicated, and deployment timelines depend on third-party operators rather than internal schedules.
Unsurprisingly, training frontier AI models on remote hardware is tricky: the difference between renting Rubin in a distant cloud data center and deploying it locally is profound. U.S. hyperscalers can integrate Rubin accelerators at scale, tune their software stacks tightly around the new hardware, and reserve massive GPU clusters for long training runs. By contrast, Chinese developers that plan to rent Rubin capacity will have to cope with limited allocations, cross-border latency, limited freedom to customize systems, and, in some cases, queuing. If they rent enough systems — and there are cloud data centers in the U.S. that currently run hundreds of thousands of Blackwell GPUs — they may well train their models without much hassle. However, if they cannot find appropriate clouds in time, they will have fewer AI accelerators per project and, in some cases, be unable to run large training jobs at all, which will directly cap model size, experimentation cadence, and iteration speed.
Meanwhile, Chinese developers are already familiar with convoluted training setups and their inefficiencies, having trained frontier models on mixed fleets of Nvidia A100, H100, H800, and H20 GPUs. Since they cannot officially procure Blackwell GPUs either, they rented those in the cloud as well, and insiders told the WSJ that the experience was costly and operationally awkward. As a result, they already know how to work around such inefficiencies.
With next-generation frontier models and Rubin GPUs, things will get even more complicated. As models scale, the value of uninterrupted access to large, homogeneous GPU clusters grows, and rented capacity rarely delivers that. Even if deals are secured (which is not guaranteed, given the new limitations on cloud access), they typically leave Chinese developers at a structural disadvantage relative to well-funded American competitors that can deploy tens of thousands of accelerators under one roof.
There is another complication. UBS estimates that China's hyperscalers spent about $57 billion on capital expenditures last year, roughly one-tenth of what their U.S. peers spent. To put that number into context, it is less than Meta's CapEx alone, which exceeded $70 billion last year. Given these financial constraints, it remains to be seen whether Chinese AI developers can stay competitive with their American peers.