"Topology optimization of random memristors for input-aware dynamic SNN", a paper in Science Advances

Sep 30, 2025

Professor Xiaojuan Qi from the Department of Electrical and Electronic Engineering and her team conducted research on "Topology optimization of random memristors for input-aware dynamic SNN". The findings were published in Science Advances on April 16, 2025.

  

Details of the publication:

Topology optimization of random memristors for input-aware dynamic SNN

Bo Wang, Xinyuan Zhang, Shaocong Wang, Ning Lin, Yi Li, Yifei Yu, Yue Zhang, Jichang Yang, Xiaoshan Wu, Yangu He, Songqi Wang, Tao Wan, Rui Chen, Guoqi Li, Yue Deng, Xiaojuan Qi*, Zhongrui Wang*, Dashan Shang*

Article in Science Advances

https://www.science.org/doi/full/10.1126/sciadv.ads5340 

 

Abstract

Machine learning has advanced unprecedentedly, exemplified by GPT-4 and SORA. However, these models cannot parallel human brains in efficiency and adaptability due to differences in signal representation, optimization, runtime reconfigurability, and hardware architecture. To address these challenges, we introduce pruning optimization for input-aware dynamic memristive spiking neural network (PRIME). PRIME uses spiking neurons to emulate the brain’s spiking mechanisms and optimizes the topology of random memristive SNNs inspired by structural plasticity, effectively mitigating memristor programming stochasticity. It also uses an input-aware early-stop policy to reduce latency and leverages memristive in-memory computing to mitigate the von Neumann bottleneck. Validated on a 40-nm, 256-K memristor-based macro, PRIME achieves classification accuracy and inception score comparable to software baselines, with energy efficiency improvements of 37.8× and 62.5×. In addition, it reduces computational loads by 77% and 12.5% with minimal performance degradation and demonstrates robustness to stochastic memristor noise. PRIME paves the way for brain-inspired neuromorphic computing.
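To illustrate the two ideas the abstract highlights, the sketch below is a minimal, hypothetical Python/NumPy toy (not the paper's implementation): fixed random weights stand in for the unprogrammed memristor conductances, a binary mask stands in for the optimized topology (only the mask would be trained, echoing structural plasticity), and inference stops early once the leading class's cumulative spike count exceeds the runner-up by a margin, echoing the input-aware early-stop policy. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_step(v, x, w, decay=0.9, v_th=1.0):
    """One leaky integrate-and-fire step: leak, integrate input, spike, hard reset."""
    v = decay * v + x @ w
    spikes = (v >= v_th).astype(float)
    v = v * (1.0 - spikes)  # reset membrane potential of neurons that fired
    return v, spikes

def infer_early_stop(x, w_in, w_out, mask, t_max=20, margin=3):
    """Run the masked random-weight SNN over time steps; stop early once the
    top class's cumulative output spike count leads the runner-up by `margin`."""
    w_in_eff = w_in * mask  # topology (mask) is the only thing one would train
    v_h = np.zeros(w_in.shape[1])
    v_o = np.zeros(w_out.shape[1])
    counts = np.zeros(w_out.shape[1])
    for t in range(1, t_max + 1):
        v_h, s_h = lif_step(v_h, x, w_in_eff)
        v_o, s_o = lif_step(v_o, s_h, w_out)
        counts += s_o
        top2 = np.sort(counts)[-2:]
        if top2[1] - top2[0] >= margin:  # confident enough: stop early
            return int(np.argmax(counts)), t
    return int(np.argmax(counts)), t_max

# Random weights emulate stochastic memristor conductances; the mask prunes topology.
w_in = rng.normal(0.0, 0.5, (16, 32))
w_out = rng.normal(0.0, 0.5, (32, 4))
mask = (rng.random((16, 32)) < 0.5).astype(float)
x = rng.random(16)

label, steps = infer_early_stop(x, w_in, w_out, mask)
```

Easy inputs trigger the margin test sooner and so consume fewer time steps, which is how an input-aware policy trades a little accuracy for latency and energy.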