
Researchers train living rat neurons to perform real-time AI computations — experiments could pave the way for new brain-machine interfaces

Why This Matters

This groundbreaking research demonstrates that living rat neurons can be trained to perform real-time AI computations within a closed-loop system, highlighting the potential for advanced brain-machine interfaces. By harnessing biological neural networks, this approach could lead to more adaptive, efficient, and biocompatible neural computing technologies for medical and consumer applications.

Key Takeaways

A team at Tohoku University and Future University Hakodate in Japan trained cultured rat cortical neurons to autonomously generate complex temporal signals using a real-time machine learning framework, according to a study published March 12 in the journal Proceedings of the National Academy of Sciences. The researchers integrated the living neurons with high-density microelectrode arrays and microfluidic devices, creating a closed-loop reservoir computing system that learned to produce periodic and chaotic waveforms without any external input.

The system recorded spike trains from the neurons across a 26,400-electrode array with a 17.5-micrometer pitch, filtered them into continuous signals, and decoded an output through a linear readout layer. That output was then fed back to the neurons as electrical stimulation, completing a feedback loop that cycled roughly every 333 milliseconds. The readout weights were optimized in real time using an algorithm called FORCE (First-Order Reduced and Controlled Error) learning, which continuously adjusted the decoder to minimize the error between the network's output and a target waveform.
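
FORCE learning is, at its core, a recursive least-squares update run inside the loop: after every cycle, the decoder weights are nudged so the output tracks the target. Below is a minimal sketch of that update rule as introduced by Sussillo and Abbott; the channel count, time step, and the toy stand-in for the culture's filtered activity are illustrative assumptions, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 64              # filtered activity channels read from the reservoir
dt = 0.333                # ~333 ms feedback cycle, per the article
w = np.zeros(n_units)     # linear readout weights
P = np.eye(n_units)       # running inverse-correlation estimate for RLS

def target(t):
    # One of the study's targets: a sine wave with a 10-second period
    return np.sin(2 * np.pi * t / 10.0)

t = 0.0
phases = np.arange(n_units)
for step in range(3000):
    # Toy stand-in for the culture's filtered spiking activity; in the real
    # system this vector comes from the microelectrode array recordings.
    r = np.tanh(np.sin(2 * np.pi * t / 10.0 + phases)
                + 0.1 * rng.normal(size=n_units))
    z = w @ r                 # decoded output
    e = z - target(t)         # error against the target waveform

    # Recursive least-squares (FORCE) update of the readout weights
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w -= e * k

    # In the closed-loop system, z would now be converted into an electrical
    # stimulation pattern and fed back to the neurons before the next cycle.
    t += dt
```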

The enabling technology, per the researchers, was the use of PDMS (polydimethylsiloxane) microfluidic films to constrain how the neurons connected. Without physical constraints, cultured neurons form dense, highly synchronized networks that fire in lockstep, and these homogeneous networks failed to learn any of the target signals.


Instead, the researchers confined neuronal cell bodies to 128 square wells, each roughly 100 × 100 micrometers, with each well holding an average of 14.6 neurons. The wells were linked by microchannels in two configurations: a lattice design with uniform nearest-neighbor connections, and a hierarchical design with sparser, multi-scale connections.
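
To make the two wiring schemes concrete, here is a small sketch that builds adjacency matrices over a grid of 128 wells. The 16 × 8 grid shape and the particular multi-scale rule used for the hierarchical case are assumptions for illustration; the article does not specify the exact microchannel layout.

```python
import numpy as np

rows, cols = 16, 8                      # 128 wells, matching the article
n = rows * cols
idx = lambda r, c: r * cols + c

# Lattice: uniform nearest-neighbor microchannels between adjacent wells
lattice = np.zeros((n, n), dtype=int)
for r in range(rows):
    for c in range(cols):
        if r + 1 < rows:
            lattice[idx(r, c), idx(r + 1, c)] = lattice[idx(r + 1, c), idx(r, c)] = 1
        if c + 1 < cols:
            lattice[idx(r, c), idx(r, c + 1)] = lattice[idx(r, c + 1), idx(r, c)] = 1

# Hierarchical: chains within small modules, progressively sparser links
# between modules at larger scales (one possible multi-scale rule)
hier = np.zeros((n, n), dtype=int)
for module in (4, 16, 64):
    step = 1 if module == 4 else module // 4
    for start in range(0, n, module):
        for a in range(start, start + module - step, step):
            hier[a, a + step] = hier[a + step, a] = 1

print("lattice links:", lattice.sum() // 2)      # 232
print("hierarchical links:", hier.sum() // 2)    # 126
```

Even in this toy version, the hierarchical layout ends up with roughly half as many links as the lattice, which matches the article's description of it as the sparser design.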

Both patterned configurations dramatically reduced pairwise neural correlations compared to unpatterned cultures (0.11 for lattice and 0.12 for hierarchical, versus 0.45 for unpatterned), increasing the dimensionality of the network's dynamics. Lattice networks consistently outperformed hierarchical ones across all target waveforms, likely because their denser intermodular connections produced higher firing rates that gave the linear decoder more signal to work with.
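
The correlation comparison itself is simple to reproduce in spirit: bin each well's spikes into counts and average the correlation coefficient over all pairs. The synthetic data below merely illustrate why a shared drive, as in unpatterned synchronized cultures, pushes the mean correlation up; the bin counts and firing rates are invented for the example.

```python
import numpy as np

def mean_pairwise_corr(counts):
    # counts: (n_wells, n_bins) array of binned spike counts per well
    c = np.corrcoef(counts)
    iu = np.triu_indices_from(c, k=1)
    return c[iu].mean()

rng = np.random.default_rng(1)
n_wells, n_bins = 128, 2000

# Unpatterned culture: one shared drive synchronizes every well (high corr.)
drive = rng.poisson(3.0, n_bins)
unpatterned = drive + rng.poisson(1.0, (n_wells, n_bins))

# Patterned culture: mostly independent activity per well (low correlation)
patterned = rng.poisson(3.0, (n_wells, n_bins)) + 0.3 * drive

print("unpatterned:", round(mean_pairwise_corr(unpatterned), 2))
print("patterned:  ", round(mean_pairwise_corr(patterned), 2))
```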

Tests showed rat brain neurons are 'novel computational resources'

Using the lattice and hierarchical networks, the system learned to generate sine waves with periods of 4, 10, and 30 seconds, as well as triangle and square waves, and the same culture preparation could be retrained to oscillate at different frequencies. The researchers also demonstrated that the system could approximate a Lorenz attractor, a three-dimensional chaotic trajectory, with pairwise correlations above 0.8 between predicted and target signals across all dimensions during the learning phase.
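
The Lorenz target the system approximated can be generated with a few lines of Euler integration using the standard parameters (sigma = 10, rho = 28, beta = 8/3); the step size and the noisy stand-in "prediction" below are assumptions purely to show the per-dimension correlation check the article cites.

```python
import numpy as np

def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Simple Euler integration of the Lorenz system from a fixed start point
    xyz = np.empty((n_steps, 3))
    x, y, z = 1.0, 1.0, 1.0
    for i in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xyz[i] = x, y, z
    return xyz

target = lorenz_trajectory(5000)

# Per-dimension correlation between a prediction and the target: the success
# measure quoted above (> 0.8 in the study's learning phase). The noisy
# "prediction" here is a placeholder, not the neuronal system's output.
prediction = target + np.random.default_rng(2).normal(0, 1.0, target.shape)
for d, name in enumerate("xyz"):
    r = np.corrcoef(prediction[:, d], target[:, d])[0, 1]
    print(f"{name}: r = {r:.2f}")
```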

"This work shows that living neuronal networks are not only biologically meaningful systems but may also serve as novel computational resources," said Hideaki Yamamoto, a professor at Tohoku University's Research Institute of Electrical Communication, in a press release published on the institution’s website.

