Fast aging-aware timing analysis framework with temporal-spatial graph neural network
With the downscaling of CMOS technology, device aging induced by hot-carrier injection (HCI) and bias temperature instability (BTI) poses severe challenges to the timing analysis of digital circuits. In this work, a fast aging-aware timing analysis framework based on a temporal-spatial graph neural network is proposed for the first time. The network uses a gated tanh unit (GTU) as the temporal component to extract device degradation from dynamic bias waveforms, and inductive GraphSAGE as the spatial component to aggregate whole-graph information from the circuit topology and predict the circuit aging delay. In a comprehensive comparison of candidate networks, the GTU-GraphSAGE combination achieves the highest accuracy in predicting standard-cell aging delay. Owing to its superior feature-capture capability, the framework substantially improves aging-prediction efficiency under various operating conditions, especially across iterations of usage scenarios, design versions, and process design kits. Compared with the conventional flow, the average acceleration ratio of the temporal-spatial network in predicting aging delay exceeds 200. Furthermore, the framework is demonstrated on ADDER and FIFO circuits for end-of-life timing analysis. This work thus supports aging-aware circuit design in nanoscale technologies.
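The authors' implementation is not part of this record. As a rough illustration of the architecture the abstract describes, the sketch below chains a GTU temporal encoder over per-gate stress waveforms into GraphSAGE spatial layers over the circuit-netlist graph, using PyTorch and PyTorch Geometric. All class names, layer sizes, the Conv1d-based GTU, the time-axis mean pooling, and the per-gate delay output are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch (assumed architecture, not the authors' released code):
# a gated tanh unit (GTU) extracts temporal degradation features from
# bias/stress waveforms; GraphSAGE aggregates them over the circuit graph.
import torch
import torch.nn as nn
from torch_geometric.nn import SAGEConv


class GTU(nn.Module):
    """Gated tanh unit: tanh(conv_a(x)) * sigmoid(conv_b(x)) along the time axis."""

    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.conv_a = nn.Conv1d(in_ch, out_ch, kernel_size)
        self.conv_b = nn.Conv1d(in_ch, out_ch, kernel_size)

    def forward(self, x):  # x: [num_gates, in_ch, T]
        return torch.tanh(self.conv_a(x)) * torch.sigmoid(self.conv_b(x))


class TemporalSpatialGNN(nn.Module):
    """Hypothetical GTU + GraphSAGE stack predicting a per-gate aging delay shift."""

    def __init__(self, wave_ch, hidden=64):
        super().__init__()
        self.gtu = GTU(wave_ch, hidden)
        self.sage1 = SAGEConv(hidden, hidden)
        self.sage2 = SAGEConv(hidden, 1)

    def forward(self, waveforms, edge_index):  # waveforms: [num_gates, wave_ch, T]
        h = self.gtu(waveforms).mean(dim=-1)   # pool over time -> [num_gates, hidden]
        h = torch.relu(self.sage1(h, edge_index))
        return self.sage2(h, edge_index)       # [num_gates, 1] aging delay per gate


# Toy usage: 5 gates, 2 stress-waveform channels, 16 time steps.
waves = torch.randn(5, 2, 16)
edges = torch.tensor([[0, 1, 2, 3],            # source gates
                      [1, 2, 3, 4]])           # fan-out gates
model = TemporalSpatialGNN(wave_ch=2)
print(model(waves, edges).shape)               # torch.Size([5, 1])
```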
Funding
National Key R&D Program of China (2019YFB2205005)
National Natural Science Foundation of China (T2293700, T2293704)
School
- Science
Department
- Computer Science
Published in
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Volume
43
Issue
6
Pages
1862-1871
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Version
- AM (Accepted Manuscript)
Rights holder
© IEEE
Publisher statement
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Acceptance date
2023-12-18
Publication date
2023-12-25
Copyright date
2023
ISSN
0278-0070
eISSN
1937-4151
Language
- en