Tech News

Integrated photonic neural network with on-chip backpropagation training

Why This Matters

This work demonstrates the first integrated photonic neural network trained end-to-end with on-chip backpropagation. By performing every linear and nonlinear computation on a single chip, it removes the dependence on external digital computers and makes training robust to fabrication-induced device variations, a key step toward fast, energy-efficient optical AI hardware.

Key Takeaways

The robust and repeatable performance of scalable integrated photonic neural networks (PNNs) [1–3] depends strongly on the quality of their training. Gradient-based backpropagation is the mainstream algorithm for training digital neural networks thanks to its scalability, versatility and implementation efficiency [4], so there is strong interest in implementing it all-optically on a photonic platform. Until now, owing to the lack of a scalable on-chip activation gradient [5], training PNNs has relied either on digital computers running backpropagation, whose performance degrades in the presence of inevitable device-to-device and environmental variations, or on gradient-free algorithms that do not fully benefit from the versatility of backpropagation.

Here we report the demonstration of an integrated photonic deep neural network trained end-to-end with on-chip gradient-descent backpropagation. All linear and nonlinear computations are performed on a single photonic chip, yielding scalable and robust training despite considerable yet typical fabrication-induced device variations. On two nonlinear data classification tasks, the chip matches the reference digital model in both accuracy (over 90%) and robustness, without using a digital computer. Combining the advantages of backpropagation training with PNNs generalizes to a variety of PNN architectures, pointing toward scalable and reliable photonic computing systems.
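To make the training procedure concrete, below is a minimal digital sketch of gradient-descent backpropagation on a small nonlinear classification task (XOR). This is a plain NumPy reference model, not the photonic implementation; the network size, learning rate, and loss are illustrative assumptions. The hidden-layer activation gradient computed in the backward pass is the quantity that, in the paper's setting, must be obtained optically on chip rather than by a digital computer.

```python
# Minimal sketch of gradient-descent backpropagation for a two-layer
# network on a nonlinear (XOR) classification task. Digital reference
# model only; hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR: a simple task that no purely linear model can solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network; hidden width 8 chosen arbitrarily for illustration.
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(10_000):
    # Forward pass: linear layer -> nonlinear activation, twice.
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)

    # Backward pass (chain rule). With a cross-entropy loss the output
    # error simplifies to (o - y).
    d_out = o - y
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    # Hidden activation gradient sigmoid'(z) = h * (1 - h): this is the
    # term the photonic scheme evaluates on chip.
    d_hid = (d_out @ W2.T) * h * (1 - h)
    d_W1 = X.T @ d_hid
    d_b1 = d_hid.sum(axis=0)

    # Gradient-descent update.
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The same forward/backward structure carries over to the photonic case; what changes is the physical substrate executing the matrix products, the nonlinear activations, and, crucially, their gradients.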