Aditya Malik, Nalini Ratha, et al.
CAI 2024
Computation-in-memory (CIM) using memristors can facilitate data processing within the memory itself, leading to superior energy efficiency compared to conventional von Neumann architectures. This makes CIM well-suited for data-intensive applications like neural networks. However, a large number of read operations can induce an undesired resistance change in the memristor, known as read-disturb. As memristor resistances represent the neural network weights in CIM hardware, read-disturb causes an unintended change in the network's weights, leading to poor accuracy. In this paper, we propose a methodology for read-disturb detection and mitigation in CIM-based neural networks. We first analyze the key insights regarding the read-disturb phenomenon. We then introduce a mechanism to dynamically detect the occurrence of read-disturb in CIM-based neural networks. In response to such detections, we develop a method that adapts the sensing conditions of the CIM hardware to provide error-free operation even in the presence of read-disturb. Simulation results show that our proposed methodology achieves up to 2× higher accuracy and up to 2× more correct operations per unit energy compared to conventional CIM architectures.