Tech News

AI may not need massive training data after all


New research from Johns Hopkins University shows that artificial intelligence systems built with designs inspired by biology can begin to resemble human brain activity even before they are trained on any data. The study suggests that how AI is structured may be just as important as how much data it processes.

The findings, published in Nature Machine Intelligence, challenge the dominant strategy in AI development. Instead of relying on months of training, enormous datasets, and vast computing power, the research highlights the value of starting with a brain-like architectural foundation.

Rethinking the Data-Heavy Approach to AI

"The way that the AI field is moving right now is to throw a bunch of data at the models and build compute resources the size of small cities. That requires spending hundreds of billions of dollars. Meanwhile, humans learn to see using very little data," said lead author Mick Bonner, assistant professor of cognitive science at Johns Hopkins University. "Evolution may have converged on this design for a good reason. Our work suggests that architectural designs that are more brain-like put the AI systems in a very advantageous starting point."

Bonner and his colleagues aimed to test whether architecture alone could give AI systems a more human-like starting point, without relying on large-scale training.

Comparing Popular AI Architectures

The research team focused on three major types of neural network designs commonly used in modern AI systems: transformers, fully connected networks, and convolutional neural networks.

They repeatedly adjusted these designs to create dozens of different artificial neural networks. None of the models were trained beforehand. The researchers then showed the untrained systems images of objects, people, and animals and compared their internal activity to brain responses from humans and non-human primates viewing the same images.
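Comparisons of this kind are commonly done with representational similarity analysis (RSA): build a dissimilarity matrix over stimuli for each system, then correlate the matrices. The sketch below illustrates the idea with random placeholder data; the actual study used real network activations and neural recordings, and its exact metric may differ.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder data standing in for responses to the same image set.
# Shapes: (n_stimuli, n_units). In the real analysis these would be
# untrained-network activations and recorded brain responses.
rng = np.random.default_rng(0)
n_stimuli = 20
model_responses = rng.normal(size=(n_stimuli, 512))
brain_responses = rng.normal(size=(n_stimuli, 100))

def rdm(responses):
    """Representational dissimilarity matrix: 1 minus the Pearson
    correlation between response patterns for every pair of stimuli."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs --
    a standard measure of how similarly two systems represent stimuli."""
    iu = np.triu_indices(rdm_a.shape[0], k=1)
    rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho

score = rsa_score(rdm(model_responses), rdm(brain_responses))
print(f"model-brain representational similarity: {score:.3f}")
```

With random inputs the similarity hovers near zero; the study's finding is that wider untrained convolutional networks push this score upward toward trained-model levels, while untrained transformers and fully connected networks do not.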

Why Convolutional Networks Stood Out

Increasing the number of artificial neurons in transformers and fully connected networks produced little meaningful change. However, similar adjustments to convolutional neural networks led to activity patterns that more closely matched those seen in the human brain.
