The key to energy-efficient artificial intelligence (AI) could lie in the human brain. George Mason University researchers Maryam Parsa (PI) and Ascoli (co-PI) are leading a Department of Energy-funded project to make AI dramatically more efficient by mimicking the brain's computing strategies.
"AI today is powerful but extremely energy-hungry," said Parsa, an assistant professor at George Mason who specializes in neuromorphic computing. "We want to understand why the human brain is so efficient and why AI is not, and then bridge that gap."
Unlike traditional AI models that rely on continuous signals and massive data centers, the brain uses sparse, event-driven signals called spikes.
"Any one neuron only fires a spike here and there, but the timing of those spikes carries a lot of meaning," explained Ascoli, a Distinguished University Professor at George Mason specializing in neuroscience. "That's why the brain can do so much with so little energy."
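One common way neuromorphic systems exploit spike timing is a latency code, in which stronger inputs fire earlier. The short Python sketch below is purely illustrative; the function, values, and 10 ms window are hypothetical examples of the general idea, not details drawn from this project:

```python
# Toy latency code: information is carried by WHEN a single spike occurs,
# not by a continuously transmitted value, so the channel is silent most
# of the time (sparse, event-driven signaling).
def latency_encode(x, t_max=10.0):
    """Map an intensity x in (0, 1] to a spike time in ms (larger x -> earlier spike)."""
    return t_max * (1.0 - x)

for name, x in {"dim input": 0.2, "bright input": 0.9}.items():
    print(f"{name}: one spike at {latency_encode(x):.1f} ms")
```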
To replicate this efficiency holistically, Parsa and Ascoli teamed up with Jaiswal at the University of Wisconsin–Madison and Khalili at Northwestern University. The team is building a full-stack neuromorphic computing system spanning devices, circuits, architectures, algorithms, and applications.
Each principal investigator is contributing specific expertise to the system:
- Ascoli contributes neuroscience insights using Izhikevich models, mathematical frameworks that capture real neuron spiking dynamics, to inform system design.
- Khalili develops spintronic devices, components that use magnetic materials to harness an electron's quantum properties of spin and charge for processing and storing data. Because these devices naturally exhibit neuron-like temporal behavior, they are expected to reduce energy demands.
- Jaiswal designs analog circuits to integrate these spintronic devices into functional architectures.
- Parsa leads the project as well as the work on neuromorphic learning algorithms and application-level integration, exploring diverse learning approaches to optimize speed, precision, resiliency, privacy, and energy savings.
"Izhikevich models allow neuromorphic systems to mimic the timing, diversity, and variability of real neurons, enabling more brain-like learning and efficiency," Ascoli said. His extensive experience with these models, including simulating hippocampal networks, provides the biological foundation for the project.
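For a concrete sense of what such a model looks like, here is a minimal single-neuron Izhikevich simulation in Python. The parameter values are the standard regular-spiking settings from Izhikevich's 2003 paper, chosen here for illustration rather than taken from this project:

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=200.0, dt=0.25):
    """Simulate one Izhikevich neuron for T ms; return its spike times (ms)."""
    v, u = c, b * c                  # membrane potential and recovery variable
    spikes = []
    for step in range(int(T / dt)):
        # Izhikevich (2003) dynamics: a quadratic term generates the spike upstroke
        v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # spike event: record the time, then reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

# The output is a sparse list of event times rather than a continuous signal,
# which is what makes spike-based computation attractive for low-power hardware.
print(izhikevich())
```

Two coupled difference equations and four parameters (a, b, c, d) reproduce a wide range of biological firing patterns, which is why these models are a practical bridge between neuroscience and hardware design.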
"Instead of forcing existing devices to mimic brain behavior, we asked: What if we leverage new devices that naturally exhibit these dynamics?" Parsa said. This approach ensures innovations at the hardware level align with biologically inspired algorithms, creating a cohesive system rather than isolated solutions.
By introducing greater biological complexity and leveraging novel spintronic devices, the team hopes to reduce AI's energy footprint while improving performance metrics such as accuracy and privacy. Early results are promising: even simple changes to spike distributions improved privacy without sacrificing accuracy.
The research also explores learning algorithms beyond traditional approaches. "Our goal is to see how different neuron models and learning rules interact and which combinations optimize for speed, precision, or energy savings," Parsa said.
Ultimately, the project could transform AI by making it greener and more brain-like. "If we succeed, we'll have systems that learn faster, consume less power, and operate more like the brain," said Ascoli.
"This is not just about making AI more efficient," said Parsa. "It's about rethinking computing from the ground up."