
Photonics and high-speed data movement is the next big AI bottleneck — following copper, power, DRAM, and NAND


The voracious appetite of the generative AI revolution has overhauled any number of industries so far in its three-year history. First, it upended demand for high-end chips, pushing companies like Nvidia to record-high valuations and straining every stage of the manufacturing process to churn out chips fast enough to meet that need. Then it began to break and buckle power grids, forcing a rethink of how we deliver energy to data centers. The data centers themselves are feeling the strain too, as AI training and inference fill them, even driving extra demand for commodities like copper that are integral to their operations.

Those data centers must respond to that demand for more capacity and to the challenge of copper shortages, argues Vaysh Kewada, CEO and co-founder of Salience Labs, a silicon-photonics company focused on networking bottlenecks in AI data centers. Ever bigger and more intensive AI models, alongside the shift from chatbots to agentic AI, are pushing the sector towards photonics.

(Image credit: Microsoft)

“We're targeting the scale up domain of AI data centers, where we're seeing that they're increasingly limited by not just the bandwidth, but the latency of predictability, especially as we scale to larger workloads and agentic workloads,” she said in an interview with Tom’s Hardware Premium. For that reason, “there’s a lot of attention at the moment around photonics.”

Others agree: 2026 is “the year of increasing visibility into design wins and building momentum for silicon photonics,” wrote Aaron Rakers, equity analyst at Wells Fargo Securities, in a recent research note. Wells Fargo estimates that the total addressable market for photonics could end up being $10-12 billion by 2030, thanks to the industry’s shift to bigger capacity.

That sounds like good news, but it comes with a catch. Behind the bullish forecasts, those within the photonics industry warn that, like the sectors squeezed before it, photonics faces its own set of constraints: reliability, packaging, manufacturing capacity, and how data is actually routed once it hits fibre. Any of these could become the next hard limit on AI scaling.

Data is the new choke point

The boom in photonics might come as a surprise to some. “Photonics is something that already exists within the data centre today,” said Vivek Raghunathan, CEO and co-founder at Xscape Photonics, in an interview with Tom’s Hardware Premium. “Optical cables and the silicon photonics technology already exist when it comes to connecting different switches as part of a pluggable transfer ecosystem.”

But now those optics are being pushed deeper into the rack, into ultra-fast links that tie large numbers of GPUs together into a single compute fabric. “Ultimately, the network is the bottleneck for these workloads, because they're just far too large to run on a single GPU,” Kewada said.
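The scale of that networking burden can be roughed out with a standard formula: a ring all-reduce across N GPUs moves roughly 2 × (N − 1)/N times the gradient payload per GPU per training step. A minimal sketch, using hypothetical model and link figures that do not come from the article:

```python
def allreduce_bytes(payload_bytes: float, n_gpus: int) -> float:
    """Bytes each GPU transfers in one ring all-reduce of a payload.

    Ring all-reduce moves 2 * (N - 1) / N times the payload per GPU.
    """
    return 2 * (n_gpus - 1) / n_gpus * payload_bytes


def step_comm_seconds(payload_bytes: float, n_gpus: int,
                      link_bytes_per_s: float) -> float:
    """Time spent moving gradients per step at a given per-GPU link speed."""
    return allreduce_bytes(payload_bytes, n_gpus) / link_bytes_per_s


# Hypothetical numbers: a 70B-parameter model in fp16 (2 bytes per parameter)
payload = 70e9 * 2  # gradient bytes per step

# Compare a 400 Gbit/s link with a notional 3.2 Tbit/s optical link
for name, gbps in [("400G link", 400), ("3.2T optical link", 3200)]:
    t = step_comm_seconds(payload, n_gpus=1024,
                          link_bytes_per_s=gbps * 1e9 / 8)
    print(f"{name}: {t:.2f} s of gradient traffic per step")
```

The point of the arithmetic is that the communication term scales with model size but shrinks only with link bandwidth, which is why the interconnect, not the GPU, sets the ceiling once models outgrow a single device.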

Using photonics means between 10 and 100 times more information can move back and forth from memory before the chips output a single stream of inference, Raghunathan explained. That matters because what AI systems do is changing, and it demands that extra data movement. The average AI user is moving from asking a model single prompts to running chains of tasks, and Kewada said the hardware is already struggling under current AI use. “If it’s a problem now, it becomes an even bigger problem when it comes to agentic workloads, and that's heavily latency- and balance-sensitive,” she explained.
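Why agentic workloads are so latency-sensitive follows from simple compounding: every generated token pays a cross-GPU network cost, and an agent that chains sequential model calls multiplies the whole per-call latency by the number of steps. A toy illustration with made-up figures (none are from the article):

```python
def response_latency_s(tokens: int, s_compute_per_token: float,
                       s_net_per_token: float) -> float:
    """Decode time when every generated token pays a cross-GPU network cost."""
    return tokens * (s_compute_per_token + s_net_per_token)


def agent_chain_s(calls: int, per_call_s: float) -> float:
    """An agent chaining `calls` sequential model calls compounds that latency."""
    return calls * per_call_s


# Hypothetical figures: 500 tokens/call, 10 ms compute and 5 ms network per token
per_call = response_latency_s(500, 0.010, 0.005)
print(f"one prompt: {per_call:.1f} s")
print(f"20-step agent chain: {agent_chain_s(20, per_call):.0f} s")
```

Under these assumed numbers, a network overhead that is tolerable for one chatbot reply balloons across a 20-step agent pipeline, which is the compounding effect Kewada describes.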
