As large artificial intelligence models grow ever more data-hungry, the "memory wall" bottleneck caused by the separation of memory and processing in traditional computer architectures is consuming significant time and energy. Researchers from Purdue University and the Georgia Institute of Technology have published a new study in the journal "Frontiers in Science," proposing a new computer architecture built on brain-inspired algorithms, with the aim of dramatically reducing the energy consumption of artificial intelligence models.


The Bottleneck of the Von Neumann Architecture: The Memory Wall

Most computers today are still based on the von Neumann architecture proposed in 1945, which separates memory from processing and creates a performance bottleneck whenever data must shuttle between the two. Kaushik Roy, a professor of computer engineering at Purdue University and the study's lead author, pointed out that the scale of language-processing models has grown 5,000-fold over the past four years, making this efficiency problem critical and calling for a fundamental rethinking of computer design. Computer engineers call the problem of memory speed failing to keep pace with processing speed the **"memory wall,"** and it accounts for much of the time and energy required to run today's artificial intelligence models.
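The memory wall can be illustrated with a back-of-the-envelope calculation (not from the paper): in a dense matrix-vector product, the core operation of large language models, each weight is fetched from memory but used for only one multiply-accumulate, so performance is limited by memory bandwidth rather than compute. The figures below (fp16 weights, a hypothetical 4096x4096 layer) are illustrative assumptions.

```python
# Toy illustration of the "memory wall": for y = W @ x, arithmetic
# intensity (FLOPs per byte of weight traffic) is very low, so the
# processor spends most of its time waiting on memory, not computing.

def arithmetic_intensity(rows: int, cols: int, bytes_per_weight: int = 2) -> float:
    """FLOPs per byte moved for a dense matrix-vector product y = W @ x."""
    flops = 2 * rows * cols                        # one multiply + one add per weight
    bytes_moved = rows * cols * bytes_per_weight   # every weight crosses the memory bus once
    return flops / bytes_moved

# A hypothetical 4096x4096 fp16 layer: 2 FLOPs per weight, 2 bytes per
# weight, so just 1 FLOP per byte moved.
print(arithmetic_intensity(4096, 4096))  # → 1.0
```

Modern accelerators can perform hundreds of floating-point operations in the time it takes to fetch one byte from off-chip memory, which is why moving compute into the memory itself is attractive.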

Solution: Brain-Inspired "Compute-in-Memory"

The researchers argue that the key to breaking through the memory bottleneck lies in a new computer architecture that merges memory and processing, known as **"compute-in-memory" (CIM)**.

  • Algorithm Core: Researchers suggest that AI models should adopt Spiking Neural Networks (SNNs), inspired by the way the human brain operates. Although SNNs were once criticized for being slow and low-precision, their performance has significantly improved in recent years.

  • CIM Advantages: The paper's abstract states, "CIM offers a promising solution to the memory wall by directly integrating computing capabilities into the memory system." Through this integration, data transfer can be reduced and processing efficiency improved.
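To make the algorithmic idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic building block of a spiking neural network. The parameter values (`leak`, `threshold`) and the input current stream are illustrative choices, not taken from the study.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: instead of passing dense
# activations, an SNN neuron accumulates input current over time and emits
# a binary spike only when its membrane potential crosses a threshold.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Return the binary spike train produced by a stream of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = leak * v + current   # leaky integration: old potential decays, new current adds
        if v >= threshold:       # fire once the threshold is crossed...
            spikes.append(1)
            v = 0.0              # ...then reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9]))
```

Because the neuron is silent most of the time and communicates in sparse binary events, an SNN can skip work when nothing fires, which is the source of its potential energy savings.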

Application Prospects: From Data Centers to the Real World

The researchers believe that integrating processing and memory into a single system can greatly reduce the energy consumption of artificial intelligence. Tanvi Sharma, co-author and researcher at Purdue University, said, "To move [artificial intelligence] from data centers to the real world, we need to significantly reduce its energy consumption."

Through this approach, artificial intelligence could be built into devices that are smaller, more affordable, and longer-lasting on battery power.