Circuit simulation plays a critical role in ensuring that silicon chips will perform as intended. After all, we can’t afford to have critical applications like vehicle braking systems, robotic surgical equipment, or 24/7 manufacturing lines failing because of chip issues. As silicon designs have grown larger and more complex, new simulation challenges have emerged, and HSPICE technology has evolved to meet these needs.
HSPICE got its start in the 1980s as an analog circuit simulator based on the SPICE (Simulation Program with Integrated Circuits Emphasis) technology that came out of UC Berkeley. It’s one of the more prominent commercial versions of SPICE, originally commercialized by Meta-Software and now part of Synopsys. Because it’s not practical to prototype an IC before manufacturing, engineers relied on simulation to verify operation at the transistor level. With SPICE simulation, designers could accurately predict design behavior, estimate how component variations would impact performance, and more. In those days, ICs were relatively small, each containing a modest number of analog components.
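To make the idea concrete, here is a minimal sketch of the kind of input deck SPICE-family simulators accept: a simple RC low-pass filter driven by a pulse source, analyzed over time. The component values and node names here are purely illustrative, not taken from any particular design.

```spice
* RC low-pass filter: minimal transient-analysis deck
* (illustrative values and node names)
V1 in 0 PULSE(0 1.8 0 0.1n 0.1n 10n 20n)
R1 in out 1k
C1 out 0 1p
.TRAN 0.01n 40n
.PRINT TRAN V(in) V(out)
.END
```

The simulator solves the circuit equations at each time step and reports the node voltages, letting the designer see how the output waveform responds to the input before any silicon exists.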
Then came the digital revolution.
With the emergence of digital components came the first wave of HSPICE reinvention, when it evolved to become the gold standard for characterization of standard cell libraries. With analog ICs, HSPICE technology was used to simulate the entire circuit. Digital designers, on the other hand, use standard cells in design steps such as synthesis and place and route. Foundries use HSPICE to simulate and characterize these standard cells before building the cell libraries used to implement the digital designs.
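A characterization run of this kind might look roughly like the sketch below: a single CMOS inverter driven by a pulse, with a measurement statement extracting its propagation delay. The model library name, corner, device sizes, and load value are all hypothetical placeholders, and real library characterization sweeps many slews and loads rather than a single point.

```spice
* CMOS inverter propagation-delay measurement (illustrative sketch)
* 'models.lib', the TT corner name, and all sizes are hypothetical
.LIB 'models.lib' TT
VDD vdd 0 1.8
VIN in  0 PULSE(0 1.8 1n 0.05n 0.05n 5n 10n)
MP  out in vdd vdd pmos W=0.4u L=0.05u
MN  out in 0   0   nmos W=0.2u L=0.05u
CL  out 0 2f
.TRAN 1p 20n
* Rising-input to falling-output delay, measured at the 50% crossings
.MEASURE TRAN tphl TRIG V(in) VAL=0.9 RISE=1 TARG V(out) VAL=0.9 FALL=1
.END
```

Measurements like this, repeated across input slews, output loads, and process corners, are what populate the timing tables in a standard cell library.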
From analog circuit simulator to a solution for today’s hyper-convergent designs
The second wave of HSPICE reinvention came with the emergence of faster, more complex high-speed IOs and memory interfaces. Say you have a motherboard with a microprocessor and DRAM. The interface between the two needs to carry very fast signals directly on the motherboard, so it becomes important to analyze the signal and power integrity of the PCB. HSPICE rose to the challenge by incorporating advanced signal and power integrity modeling features, thus becoming the de facto standard for uncovering signal integrity and power integrity problems. Its ability to effectively evaluate and identify errors in the transmission of billions of bits makes it an ideal solution for accelerating the creation and analysis of designs with fast and complex high-speed interfaces.
Today, we’re in an era of hyper-convergent chip designs, where scale and systemic complexity are marked by multiple technologies, protocols, and architectures coming together in a massive, highly complex, and interdependent design. This evolution has pushed circuit simulation into a domain where accurate modeling, robust algorithms, and strong compatibility are keys to success. In addition, large, heterogeneous system-in-package designs urgently demand complex, multi-dimensional analysis and improved quality-of-results, time-to-results, and cost-of-results.
Ensuring that these hyper-convergent chips will work as intended presents an entirely new level of challenges. In addition to scale and systemic complexity, there’s also scope complexity, with both on-chip and off-chip components to consider. For example, 3DICs have emerged during this period, with even more complex signal integrity and power integrity demands as signals move between different dies connected by through-silicon vias (TSVs) and bond wires. In this third wave of reinvention, HSPICE technology has become a bridge connecting the on- and off-chip worlds—a feat that other simulation technologies cannot match.