Exploring the impact of random telegraph noise-induced accuracy loss in Resistive RAM-based deep neural network
Journal contribution posted on 11.06.2020 by Yide Du, Linglin Jing, Hui Fang, Haibao Chen, Yimao Cai, Runsheng Wang, Jianfu Zhang, Zhigang Ji
For Resistive RAM (RRAM)-based deep neural networks, random telegraph noise (RTN) causes accuracy loss during inference. In this work, we systematically investigated the impact of RTN on complex deep neural networks (DNNs) across different datasets. Using 8 mainstream DNNs and 4 datasets, we explored the origin of the RTN-induced accuracy loss. Based on this understanding, we proposed, for the first time, a method to estimate the accuracy loss. The method was verified with 10 other DNN/dataset combinations that were not used to establish it. Finally, we discussed its potential adoption for the co-optimization of DNN architecture and RRAM technology, paving the way toward mitigating RTN-induced accuracy loss in future neuromorphic hardware systems.
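The effect the abstract describes can be illustrated with a minimal sketch: RTN is often modeled as a two-state (telegraph) fluctuation of each device's conductance, so a simple way to estimate accuracy loss is to randomly perturb the stored weights and re-evaluate the model. The toy linear classifier, the relative noise amplitude, and the `inject_rtn` helper below are illustrative assumptions, not the method proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_rtn(weights, amplitude=0.05, p_high=0.5, rng=rng):
    """Perturb each weight with two-state telegraph noise: each
    device sits in its high or low state with probability p_high,
    scaling the stored conductance by (1 +/- amplitude)."""
    states = rng.random(weights.shape) < p_high
    return weights * np.where(states, 1.0 + amplitude, 1.0 - amplitude)

# Toy linear classifier on synthetic data (a stand-in for one DNN layer).
X = rng.normal(size=(1000, 16))
true_w = rng.normal(size=(16,))
y = (X @ true_w > 0).astype(int)

def accuracy(w):
    return np.mean((X @ w > 0).astype(int) == y)

# Monte Carlo estimate of inference accuracy under RTN.
clean = accuracy(true_w)
noisy = np.mean([accuracy(inject_rtn(true_w, amplitude=0.3))
                 for _ in range(100)])
print(f"clean accuracy: {clean:.3f}, mean accuracy under RTN: {noisy:.3f}")
```

Averaging over many noise realizations, as in the last lines, is the standard Monte Carlo way to estimate the expected accuracy drop for a given noise amplitude.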
National Key R&D Program of China under the grant no. 2019YFB2205000
National Natural Science Foundation of China under the grant no. 61927901