Simulation remains the workhorse technology for functional verification of register-transfer level (RTL) chip designs. In a typical flow, static verification runs early in the chip design process and looks for structural bugs such as clock domain crossing (CDC) and reset domain crossing (RDC) errors. Static analysis finds approximately 10% of all design bugs. Formal verification, applied primarily at the block level, typically detects 20% of bugs. The simulation phase is where 65% of the total bugs are caught, with the final 5% found using emulation and prototyping.
On the simulation front, the key challenges are performance, debug turnaround time (TAT), and coverage closure. Because regressions must be rerun whenever the RTL design changes, simulator performance needs to be optimal so that it does not cause project delays further down the line. The slowing of Moore’s law means that performance cannot be dramatically improved simply by running on the latest compute servers.
Artificial intelligence (AI) and machine learning (ML) provide an effective way to improve performance beyond upgrading hardware: optimizing the selection of the many switches available in the Synopsys VCS® simulator. That is the focus of this blog post, but it is important to note that AI/ML have also been successfully employed to speed debug TAT, through regression debug automation in the Synopsys Verdi® Automated Debug System for binning, clustering, and triaging failures, and to accelerate coverage closure in the Synopsys VCS environment.
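To make the switch-selection idea concrete, the sketch below shows the simplest possible baseline: score every combination of a few candidate switches and keep the fastest. This is purely illustrative; the switch names, values, and cost model here are hypothetical stand-ins (real VCS switch names and measured regression runtimes would come from the simulator and the project flow), and an ML-based approach would replace the brute-force loop with a model that predicts good settings from past runs.

```python
import itertools

# Hypothetical switch space; real switch names and legal values would
# come from the simulator documentation, not from this example.
SWITCH_SPACE = {
    "opt_level": ["low", "medium", "high"],
    "partition_compile": [False, True],
    "fine_grain_parallelism": [False, True],
}

def measure_runtime(config):
    """Stand-in for timing one regression run with the given switches.

    A real flow would launch the simulator and measure wall-clock time;
    here a deterministic toy cost model keeps the sketch self-contained.
    """
    runtime = 100.0
    runtime -= {"low": 0, "medium": 10, "high": 25}[config["opt_level"]]
    runtime -= 15 if config["partition_compile"] else 0
    runtime -= 8 if config["fine_grain_parallelism"] else 0
    return runtime

def best_configuration(space, measure):
    """Exhaustively score every switch combination and keep the fastest.

    An ML-guided flow replaces this brute force with a learned model,
    which matters once the combinatorial space grows beyond a handful
    of switches.
    """
    keys = list(space)
    best_cfg, best_time = None, float("inf")
    for values in itertools.product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        t = measure(cfg)
        if t < best_time:
            best_cfg, best_time = cfg, t
    return best_cfg, best_time

cfg, runtime = best_configuration(SWITCH_SPACE, measure_runtime)
```

Even this toy version shows why learning helps: with three switches the loop runs 12 measurements, but each added switch multiplies the count, while a trained model can propose a near-optimal configuration without exhaustively rerunning the regression.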