One major issue facing artificial intelligence is how data moves between a computer's memory and its processors. When an algorithm runs, data shuttles rapidly between these two components, and because AI models rely on vast amounts of data, that shuttling becomes a bottleneck.
A new study, published Monday in the journal Frontiers in Science by researchers at Purdue University and the Georgia Institute of Technology, proposes a novel approach to building computer architecture for AI models using brain-inspired algorithms. The researchers say that designing computers this way could reduce the energy costs associated with AI models.
"Language processing models have grown 5,000-fold in size over the last four years," Kaushik Roy, a Purdue University computer engineering professor and the study's lead author, said in a statement. "This alarmingly rapid expansion makes it crucial that AI is as efficient as possible. That means fundamentally rethinking how computers are designed."
Most computers today are modeled on an idea from 1945 called the von Neumann architecture, which separates processing and memory. That separation is where the slowdown occurs. As more people around the world use data-hungry AI models, the gap between a computer's processing power and its memory could become a more significant problem.
Researchers at IBM called out this problem in a post earlier this year. The issue computer engineers are running up against is called the "memory wall."
Breaking the memory wall
The memory wall refers to the growing disparity between memory and processing performance: memory speeds have failed to keep pace with processor speeds, so processors end up sitting idle, waiting for data. This isn't a new issue. A pair of researchers from the University of Virginia coined the term back in the 1990s.
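To make that disparity concrete, here's a minimal back-of-the-envelope sketch in Python for a large matrix-vector multiply, the core operation in language models. The peak-compute and memory-bandwidth figures are illustrative assumptions, not measurements from the study:

```python
# Back-of-the-envelope "memory wall" estimate for one large matrix-vector
# multiply. All hardware numbers below are illustrative assumptions.

PEAK_FLOPS = 100e12       # assumed peak compute: 100 TFLOP/s
MEM_BANDWIDTH = 1e12      # assumed memory bandwidth: 1 TB/s

n = 50_000                                  # square weight matrix size
flops = 2 * n * n                           # one multiply-add per weight
bytes_moved = 2 * n * n                     # each fp16 weight read once

compute_time = flops / PEAK_FLOPS           # time if compute were the limit
memory_time = bytes_moved / MEM_BANDWIDTH   # time if memory is the limit

print(f"compute-bound time: {compute_time * 1e3:.2f} ms")  # ~0.05 ms
print(f"memory-bound time:  {memory_time * 1e3:.2f} ms")   # ~5.00 ms
# The memory-bound time dominates by roughly 100x: the processor idles
# while waiting on data, which is exactly the memory wall.
```

Under these assumed numbers, fetching the weights takes about 100 times longer than the arithmetic itself, so a faster processor alone doesn't help.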
But now that AI is prevalent, the memory wall is sucking up time and energy in the computers that make AI models work. The paper's researchers argue for trying a new computer architecture that integrates memory and processing.
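As a rough illustration of why integrating the two could cut energy costs, the sketch below compares the energy spent hauling weights in from off-chip memory with the energy of the arithmetic itself. The per-operation energy values are commonly cited ballpark figures and are assumptions here, not numbers from the paper:

```python
# Rough energy budget for one pass over a model's weights, comparing a
# conventional (von Neumann) design with an idealized compute-in-memory
# design. Per-operation energies are ballpark assumptions for illustration.

PJ_PER_DRAM_ACCESS = 640    # assumed energy to fetch a weight from DRAM (pJ)
PJ_PER_MAC = 1              # assumed energy of one multiply-accumulate (pJ)

num_weights = 1e9           # a 1-billion-parameter model

# Von Neumann: every weight crosses the memory bus before it is used.
von_neumann_pj = num_weights * (PJ_PER_DRAM_ACCESS + PJ_PER_MAC)

# Compute-in-memory: arithmetic happens where the weights are stored,
# so the off-chip fetch is (ideally) eliminated.
in_memory_pj = num_weights * PJ_PER_MAC

print(f"von Neumann:       {von_neumann_pj / 1e12:.3f} J")   # ~0.641 J
print(f"compute-in-memory: {in_memory_pj / 1e12:.3f} J")     # ~0.001 J
# Data movement, not arithmetic, dominates the energy bill -- which is
# the case for merging memory and processing.
```

In this idealized sketch, nearly all the energy goes to moving data rather than computing on it, which is why architectures that compute where the data lives are attractive for AI workloads.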