Innovative Ideas for Predictable Success
      Issue 4, 2009



Industry Insight
The New Economics of Verification
A more intelligent approach to verification can help design teams control the rising cost of chip design, according to Manoj Gandhi, senior vice president and general manager of Synopsys’ Verification Group.

Manufacturing used to dominate the cost of bringing a chip to market. Now, design and verification outweigh manufacturing costs by almost two to one. Chip designers spend around three-quarters of the total development time verifying the chip, so it’s hardly surprising that verification accounts for the lion’s share of development costs. Why does verification now dominate the development process?

A Problem Without Bounds
If you design a chip that is 6x more complex than your last, you’ll probably need 100x more simulation cycles to verify it. The underlying problem is that the state space grows exponentially with design size, so the state space of even a medium-complexity design is far too large for complete coverage ever to be achieved: a design with just a few hundred state elements has more reachable states than any compute farm could exhaustively simulate. Verification is an unbounded problem, and managing verification – achieving incredibly tough verification objectives within limited budgets and schedules – is a critical process that determines the overall economics of any chip development project.

Given its unbounded nature, the aim must be to verify more intelligently. This means focusing on 'high value' verification tasks. Engineers have to find bugs as early as possible in the design process, using the fewest resources possible.

Measuring the value of a verification activity is a difficult but important step in managing the total cost of verification. Some studies suggest that there is a 10x multiplication factor at work in verification – a bug that costs $10K to fix in RTL will cost $100K if it gets to layout, rising to $1M if it gets past tapeout. Technologies or activities that help to avoid bugs, or to find bugs earlier in the process, have a high verification value (Figure 1).
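Purely to illustrate the arithmetic, the escalation can be written as a cost that multiplies by roughly ten at each stage a bug survives. The short Python sketch below simply encodes the $10K / $100K / $1M example quoted above; the stage names and multiplier come from that example, not from any particular study.

    # Illustrative model of the ~10x cost escalation described in the text.
    STAGES = ["RTL", "layout", "post-tapeout"]
    BASE_COST = 10_000      # cost to fix a bug caught in RTL, in dollars
    ESCALATION = 10         # rough multiplier for each stage the bug survives

    for i, stage in enumerate(STAGES):
        cost = BASE_COST * ESCALATION ** i
        print(f"Bug found at {stage}: ${cost:,}")
    # Bug found at RTL: $10,000
    # Bug found at layout: $100,000
    # Bug found at post-tapeout: $1,000,000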


Figure 1: Finding Bugs Earlier has High Verification Value

Verification Value
Leading chip developers measure verification return on investment. For verification, "investment" is the combination of engineers, compute infrastructure and EDA tools needed to meet schedule and quality within the development budget. The biggest of these investments for most companies is people. Many companies deploy three verification engineers for every hardware designer. Increasing the productivity of verification engineers in key areas has a large verification value.

The growth in verification complexity has put a tremendous strain on compute infrastructure for chip development. Compute farms used for verification often range from thousands to tens of thousands of servers, with annual operating and depreciation expense in the millions of dollars. Our customers’ average compute infrastructure costs have grown by a staggering 30% per year over the past eight years (roughly an eightfold increase when compounded), a trend that we expect to continue.

In addition to investing in people and hardware, chip companies invest in verification tools, IP and methodologies to increase the efficiency of their teams and IT infrastructure. Taking an intelligent approach to verification means using the appropriate tools for each verification task.

Constrained-random verification with SystemVerilog is now a mainstream technology that harnesses the power of compute farms to automatically generate and run thousands of tests, including tests that a design or verification engineer may never have considered. This has dramatically improved verification productivity over the past few years. Synopsys’ VCS® enables engineers to create highly effective testbench environments that use constrained-random stimulus, functional coverage and assertions.
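To show the idea in a language-neutral way (a minimal Python sketch, not SystemVerilog and not tied to VCS or any other tool), a constrained-random testbench repeatedly draws stimulus from a constrained space, checks it with assertions and records functional coverage so the team can see which scenarios have actually been exercised. The packet fields and coverage bins below are hypothetical.

    import random

    # Hypothetical constrained-random stimulus with functional coverage.
    random.seed(0)

    LENGTH_BINS = ["short", "medium", "long"]
    KINDS = ["read", "write"]
    coverage = set()                      # functional coverage: (length bin, kind)

    def random_packet():
        """Draw a packet from a constrained space: length 1-256, read or write."""
        return random.randint(1, 256), random.choice(KINDS)

    def length_bin(length):
        return "short" if length <= 16 else "medium" if length <= 128 else "long"

    for _ in range(1000):                 # thousands of automatically generated tests
        length, kind = random_packet()
        assert 1 <= length <= 256         # assertion: stimulus respects its constraints
        coverage.add((length_bin(length), kind))

    total_bins = len(LENGTH_BINS) * len(KINDS)
    print(f"Functional coverage: {len(coverage)}/{total_bins} bins hit")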

Synopsys has introduced new technologies for verifying low-power designs that contain multiple voltage domains. VCS with MVSIM is a voltage-aware simulation engine that offers functional verification for all power management designs. Alongside MVRC, a voltage-aware static checker, and HSPICE® and CustomSim™ for analysis of leakage power, floating nodes and dynamic IR drop, Synopsys offers a comprehensive and accurate coverage solution for advanced power-managed designs.

For enhanced performance, Synopsys VCS multicore technology cuts verification time by running the design, testbench, assertions, coverage and debug in parallel on multicore compute platforms. VCS supports both design-level parallelism and application-level parallelism.
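As a rough, tool-independent sketch of application-level parallelism (not a description of how VCS partitions a single simulation), the Python fragment below farms a set of independent regression tests out across the cores of one host; the test list and the run_test stand-in are hypothetical.

    import time
    from multiprocessing import Pool

    def run_test(name):
        """Stand-in for launching one simulation; here it just sleeps briefly."""
        time.sleep(0.1)
        return name, "PASS"

    TESTS = [f"test_{i:03d}" for i in range(32)]   # hypothetical regression list

    if __name__ == "__main__":
        start = time.time()
        with Pool(processes=8) as pool:            # spread the tests across 8 cores
            results = pool.map(run_test, TESTS)
        failures = [name for name, status in results if status != "PASS"]
        print(f"{len(TESTS)} tests in {time.time() - start:.1f}s, "
              f"{len(failures)} failures")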

FPGA-based rapid prototyping and debug provide many verification teams with higher productivity and lower infrastructure cost for HW/SW verification and at-speed system validation. Synopsys’ Confirma™ Rapid Prototyping Platform accelerates the validation of both FPGAs and ASICs. Its HAPS™ and CHIPit® prototyping systems are ideal for IP and SoC design and verification teams who want to take advantage of FPGA-based prototyping to find "corner-case" hardware bugs or to start software development and integration in advance of silicon availability.

The Synopsys suite of functional verification tools comprises tightly integrated, best-in-class technologies that, when applied intelligently, allow designers to find bugs quickly and easily, significantly improving the quality of the most complex designs and enabling first-pass silicon success. As well as the tools mentioned above, the suite includes Magellan for hybrid formal verification, the VCS Verification Library (Verification IP), Leda® for static checking and a raft of verification services, including verification methodology consulting.

Learning from Software Best Practice
Verification projects often involve millions of lines of code, which means that today’s overall verification process is beginning to look more like a major software development effort. Hardware verification teams should consider comparing their best practices with those used in the software industry.

The software industry leads the way in its understanding of process maturity. Because many of its tasks depend on human input, it is standard practice to measure and improve metrics such as productivity per engineer. This has led the industry to focus on writing better code with fewer bugs, and that correct-by-design approach is more mature than the equivalent processes in the hardware domain.

Software teams also spend more time reviewing code than hardware teams do. They optimize their development infrastructure by measuring compile and build times, and they track the time it takes an engineer to check in code and complete regressions. In general, software teams have metrics in place that let them identify and fix productivity bottlenecks.
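A minimal sketch of that kind of bookkeeping, using hypothetical timing data (in practice the records would come from a CI system or job scheduler), is to log how long each compile, check-in and regression takes and then rank the steps by average time to see where the bottlenecks are.

    from statistics import mean

    # Hypothetical build/regression timing records used to spot bottlenecks.
    runs = [
        {"step": "compile",    "minutes": 18},
        {"step": "compile",    "minutes": 22},
        {"step": "regression", "minutes": 240},
        {"step": "regression", "minutes": 310},
        {"step": "check-in",   "minutes": 5},
    ]

    by_step = {}
    for r in runs:
        by_step.setdefault(r["step"], []).append(r["minutes"])

    # Slowest steps first: the obvious candidates for optimization.
    for step, times in sorted(by_step.items(), key=lambda kv: -mean(kv[1])):
        print(f"{step:12s} avg {mean(times):6.1f} min over {len(times)} runs")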

Measure, then Manage
During the last few years we have collaborated with dozens of major chip and system vendors working on leading-edge designs in several industries, designs that often present new and unprecedented verification challenges. A key element in our collaboration process is to objectively assess the expected value of various new approaches to verification, based on managing the customer’s total cost of verification. Typically we track 25 to 30 different metrics for each project so that we can better understand where bottlenecks and opportunities exist. Through this activity we have made enormous progress in understanding the relationship between different applications, methodologies and costs.

All of this activity has led us to work with several key customers to help them manage growing verification costs – both in terms of people and compute infrastructure. By taking some of the best practices from the software world and applying them to improve verification throughput, we are eliminating productivity bottlenecks. By looking at engineer deployment, processes and technology, we have helped some organizations save tens of millions of dollars.

In the future, we see this collaborative style of working becoming more commonplace. Closer collaboration between chip developers and verification solution providers can reveal where verification costs are incurred and help devise strategies for reducing them. We can then work with customers to customize their processes and technologies to maximize the return on their verification investment. Adding new features to point tools will steadily make verification more productive, but major cost reduction requires broader thinking across multiple verification domains and sometimes even customized solutions targeted at specific design categories.


Figure 2: Verification Cost Management Process

There are many compelling verification technologies on the horizon. Which one will drive the next great wave in verification productivity? Which ones should EDA vendors and chip developers invest in? Only time will tell, but one thing is certain: the next great advance in verification will be driven by the economics of verification, and it will be brought to market through close collaboration between chip developers and their EDA partners.

Manoj Gandhi
Manoj Gandhi joined Synopsys through the company's merger with Viewlogic in December 1997. In his current position, Mr. Gandhi and his organization are responsible for Synopsys' verification solutions, including simulation, testbench automation, hybrid formal verification and system level design. Mr. Gandhi has over 15 years of HDL simulation and verification experience, having begun his EDA career at Gateway Design Automation in 1986. He has a BS in Computer Science and Engineering from the Indian Institute of Technology, Kharagpur, and an MS in Computer Science from the University of Massachusetts, Amherst.


©2010 Synopsys, Inc. Synopsys and the Synopsys logo are registered trademarks of Synopsys, Inc. All other company and product names mentioned herein may be trademarks or registered trademarks of their respective owners and should be treated as such.




"There are many compelling verification technologies on the horizon. Which one will drive the next great wave in verification productivity? "