
Issue 4, 2011

Technology Update
The Next Major Shift in Verification is SoC Verification

Michael Sanie, director of verification product marketing at Synopsys, explains why it’s time for the next major shift in verification technology to bring about an order-of-magnitude increase in productivity.

We have witnessed two major shifts in verification during the past two decades. During both, design teams have been able to rise to the challenge of verifying the most complex state-of-the-art designs by applying innovative verification technology.

The first major shift took place in the 1990s. At that time, a state-of-the-art ASIC consisted of about 5 million gates in 1 μm to 0.5 μm process technology. The computing industry was driving the complexity curve in its quest to produce more powerful CPUs (Figure 1).

Figure 1: Major shifts in verification needs – driven by design complexity

During the 1990s, design teams were transitioning from gate-level design to hardware description languages (HDLs), which enabled them to tackle ever more complex designs. The main verification method at the time was HDL simulation. Using HDLs, design teams could design larger and more complex chips efficiently and scale their design efforts, but the efficiency and scalability that HDLs brought to the design process did not extend to simulation. The gap between design and verification scalability continued to grow because HDL simulation could not keep up with the rise in design size and complexity. This “simulation productivity gap” emerged as the most pressing challenge in verification.

The emergence of native compiled-code simulation technology helped verification teams to significantly improve their simulation productivity. Synopsys VCS®, as the industry’s first compiled-code simulation technology, revolutionized simulation and addressed the simulation productivity gap.

A comparable shift occurred in the early 2000s, when design complexity, by then driven mostly by networking applications, reached new heights. Design teams used more and more IP as ASICs grew increasingly complex, reaching gate counts of 10 million or more.

Verification teams looked beyond simulation and started to turn to more sophisticated technologies, including advanced testbenches, constrained-random approaches and assertions, as they worked towards higher levels of verification coverage. While these new verification technologies existed as point tools, verification teams had to put in significant effort to make them work together, and it became increasingly difficult to create scalable verification solutions for their most complex designs. As design size and complexity continued to grow, the environments needed to manage verification once again became neither scalable nor efficient.
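To make the two techniques concrete, the sketch below shows what constrained-random stimulus and an assertion look like in SystemVerilog. It is a minimal, hypothetical illustration – the class, signal, and constraint names are invented, not taken from any particular design:

```systemverilog
// Hypothetical sketch: constrained-random stimulus plus a concurrent assertion.
// All names (bus_txn, legal_c, req, gnt) are invented for illustration.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  len;

  // Let the constraint solver generate legal, varied stimulus instead of
  // hand-writing directed tests for every case.
  constraint legal_c {
    addr inside {[32'h0000_1000 : 32'h0000_FFFF]};
    len  > 0;
    len  <= 64;
  }
endclass

module tb;
  logic clk = 0, req, gnt;
  always #5 clk = ~clk;

  // Assertion: every request must be granted within 1 to 8 cycles.
  assert property (@(posedge clk) req |-> ##[1:8] gnt)
    else $error("request not granted in time");

  initial begin
    bus_txn t = new();
    repeat (10) begin
      void'(t.randomize());
      $display("addr=%h len=%0d", t.addr, t.len);
    end
  end
endmodule
```

The point of combining the two is that random stimulus explores the state space while assertions continuously check correctness, so bugs are caught in scenarios no one thought to write a directed test for.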

Synopsys worked with industry leaders to address this new “verification productivity gap” by introducing SystemVerilog and advanced testbench methodologies. In addition, native testbench technology enabled design teams to combine and integrate several verification approaches around SystemVerilog. Together, these innovations helped verification teams scale their verification solutions once again to handle more complex designs. Subsequently, Synopsys and other industry leaders worked together first to drive SystemVerilog to ratification as the industry standard for design and verification, and then UVM as the ratified industry standard for SystemVerilog-based verification methodology.
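As a rough illustration of the methodology layer that UVM standardized, a minimal UVM test looks like the sketch below. The test name and message are invented; a real environment would add agents, sequences, and a scoreboard on top of this skeleton:

```systemverilog
// Hypothetical minimal UVM test skeleton; "smoke_test" is an invented name.
`include "uvm_macros.svh"
import uvm_pkg::*;

class smoke_test extends uvm_test;
  `uvm_component_utils(smoke_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  task run_phase(uvm_phase phase);
    // Objections keep the simulation alive until the test's work is done.
    phase.raise_objection(this);
    `uvm_info("SMOKE", "minimal UVM test running", UVM_LOW)
    phase.drop_objection(this);
  endtask
endclass

module top;
  initial run_test("smoke_test");
endmodule
```

Because every team builds on the same base classes and phasing, testbench components and verification IP from different sources can be integrated and reused, which is precisely the scalability the earlier point-tool era lacked.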

Fast Forward to Today
Several years after the emergence of SystemVerilog and advanced verification methodologies, we are now witnessing another substantial shift in the industry’s design and verification needs. Today’s requirements are being driven by changes in the SoC design process and the continued growth in complexity.

Sub-32 nanometer (nm), 100 million+ gate designs characterize today’s SoC devices. In order to meet project schedules, design teams are making extensive use of IP. Today’s convergent consumer products demand SoCs which integrate applications processors along with several other functions and extensive support for software applications.

Figure 2: Increasing SoC complexity

Market Trends Drive Technology
SoCs today look very different from those of the early 2000s, and design specifications have changed significantly with them: today’s devices are much faster, incorporate more features and functionality, and support longer runtimes on the same battery technology.

Convergence is the key market trend for today’s devices. Convergence products, such as smart phones, tablets, and other advanced consumer products, combine several key technologies within the same device. They typically incorporate multicore CPUs supporting multiple interface protocols – upwards of 10. They require long battery life, advanced software features, and short time-to-market. Design complexity is also increasing because of the need to achieve low power, and some designs now incorporate more than 20 voltage domains.

Time-to-market is a critical business issue. Consumer markets make choices faster than ever before and a delay of just a few weeks can make the difference between product success and failure. We only have to look back at the recent emergence of the tablet sector for evidence of how being second to market can have a devastating effect on initial product success.

For many consumer products, software is now the key to SoC differentiation. It should be no surprise, then, that according to IBS Research, 25% of the value of today’s SoCs is in the software. This is up from just 4% in the early 2000s.

It now takes significant investment to develop an advanced SoC. Companies are spending upwards of $100 million to produce the latest chip designs, with the majority of that spend directed at infrastructure and engineering costs. The verification team and effort now typically outweigh those of design by two to one. It is no surprise that businesses are focusing on verification productivity like never before.

Design Complexity Impacts Verification Need
The effect that these market trends have had on verification is profound. Through our collaboration with leading design companies, we are fortunate to work on many leading-edge designs. Over 60% of designs on processes of 45nm and below, and more than 90% of designs at 32nm and more advanced nodes, are verified with VCS. This exposure gives us a unique insight into the “verification profile” of a state-of-the-art design today. The metrics are staggering; the most advanced designs reach figures as high as:

  • Tens of millions of lines of RTL and testbench code
  • Hardware with over 150GB of RAM required to verify the largest designs
  • Multiple (10 or more) protocols on a single chip
  • Hundreds of thousands of assertions
  • Tens or hundreds of power domains
  • Over a terabyte of coverage data to analyze

In an attempt to keep up with large verification requirements, compute farms have doubled in size over recent years, verification teams have become twice as large as their design counterparts, and the debug process now accounts for 35% of the entire verification effort.

Once again, the industry needs major advancements in verification to accommodate the dramatic shift in the design landscape.

The Next Major Shift in Verification
Verification teams readily embrace the enhanced feature sets, improved simulation performance and more efficient memory use offered by verification providers. Yet incremental improvements to today’s tools will not deliver an order-of-magnitude boost in verification productivity given the complexity of today’s designs. Verification teams need fundamental innovations in verification.

It is apparent that what verification teams need today – if they are to successfully get beyond the challenges outlined above – are innovations that focus on the key productivity bottlenecks. These innovations need to deliver:

  • Significant improvements in performance and capacity
  • Superior, more intuitive debug that enables engineers to quickly analyze vast amounts of data and find design bugs
  • Comprehensive, proven verification IP that is fast, efficient, and timely
  • Innovative low-power verification solutions
  • Hardware-software co-verification solutions that allow software teams to develop code alongside the hardware and validate entire-system functionality and performance

Solving these difficult problems is a significant growth opportunity. Synopsys continues its drive to innovate and invest in verification. Synopsys’ innovation was at the heart of the first two major shifts in verification. Watch this space as innovative approaches and solutions will be at the heart of the next major shift in verification.

About the Author
Michael Sanie
Michael Sanie is director of verification product marketing at Synopsys. He has more than 20 years of experience in semiconductor design and design software. Before Synopsys, Michael held executive and senior marketing positions at Calypto, Cadence and Numerical Technologies. He started his career as a design engineer at VLSI Technology and holds four patents in design software. He holds BSCEE and MSEE degrees from Purdue University and an MBA from Santa Clara University.
