Machine Learning-based Method for Statistical Static Timing Analysis of Multiple Input Switching Effects
Static timing analysis tools are extensively used in very large-scale integration (VLSI) circuit applications, where accuracy heavily relies on the delay model of each gate. The timing libraries used by static timing analysis tools typically consider only the pin-to-pin delay due to single input switching (SIS). However, delay variations due to multiple input switching (MIS) become more significant at high clock frequencies and advanced process nodes. Compared with conventional static timing analysis, the MIS effect has a more pronounced influence in statistical static timing analysis. This study presents a machine learning-based statistical static timing analysis method for investigating the influence of the MIS effect on circuit timing. The method models the statistical delay difference between MIS and SIS under various conditions, enabling a statistical MIS delay model to be built on top of the SIS statistical delay model. Tests on benchmark circuits show that the relative errors of the mean and standard deviation of the resulting delay distributions do not exceed 1.61% and 3.94%, respectively, demonstrating the high accuracy of the method.
Keywords: machine learning; multiple input switching; statistical static timing analysis (SSTA); statistical delay model
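The core idea of the abstract can be sketched in a few lines: learn the MIS−SIS difference in the delay distribution's mean and standard deviation as a function of gate conditions, then shift an existing SIS statistical delay model by the predicted deltas. The sketch below is purely illustrative, not the paper's implementation: the features (input arrival skew, input slew, output load), the synthetic delta functions, and the plain least-squares regressor are all assumptions standing in for the paper's unspecified training data and ML model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-gate condition features (columns): input arrival skew,
# input slew, output load -- illustrative choices, not from the paper.
X = rng.uniform(0.0, 1.0, size=(200, 3))

# Synthetic MIS - SIS deltas of the delay mean and standard deviation,
# generated from arbitrary functions purely for demonstration.
delta_mean = 5.0 * np.exp(-3.0 * X[:, 0]) + 0.5 * X[:, 2]
delta_std = 0.8 * np.exp(-2.0 * X[:, 0]) + 0.1 * X[:, 1]

# Fit one least-squares model per target (a simple stand-in for
# whatever ML regressor the method actually uses).
A = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
w_mean, *_ = np.linalg.lstsq(A, delta_mean, rcond=None)
w_std, *_ = np.linalg.lstsq(A, delta_std, rcond=None)

def mis_delay_stats(sis_mean, sis_std, features):
    """Shift an SIS delay distribution (mean, std) by predicted MIS deltas."""
    f = np.append(features, 1.0)  # match the bias column
    return sis_mean + f @ w_mean, sis_std + f @ w_std

# Example: an SIS delay of mean 50.0 ps, std 2.0 ps under one condition.
mu, sigma = mis_delay_stats(50.0, 2.0, np.array([0.1, 0.5, 0.3]))
print(mu, sigma)
```

Because the deltas are modeled directly, the existing SIS statistical library is reused unchanged; only the correction terms depend on the learned model.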