Innovative Ideas for Predictable Success
      Volume 3, Issue 3



Industry Insight
Progress in Products, Chips and EDA Tools
In the second of a two-part review based on Aart de Geus’ keynote speech to the Synopsys User Group, Synopsys Insight looks at the products, services and technologies that are addressing chip designers’ key concerns of power management, performance and productivity. Part 1 can be read here.

Feedback from the Synopsys user community clearly identifies the three major concerns of chip designers today. Power management tops the list of design challenges, followed closely by increasing design size and complexity, and the consequent implications for performance.

The third issue, though – and according to Synopsys’ research more important than any other – is a relative newcomer to engineers’ lists of concerns: productivity. Its addition is a direct result of the focus on cost, because designers are under pressure not only to produce results more quickly, but also more predictably.

The Power Challenge
The need to manage both dynamic and leakage power is now so important that it touches every aspect of chip design. Process technology and EDA tools are, of course, key. But low power is also an essential factor in IP quality, whether commercial or custom-developed, and of overall design methodology.

In essence, low power design is a “whole flow issue”, a fact that has driven the development of the Eclypse™ Low Power Solution at Synopsys. Supporting the open methodologies described in the Low Power Methodology Manual (LPMM) co-authored by Synopsys and ARM®, the solution encompasses system-level design, implementation, verification and sign-off tools, as well as IP and design services.


Figure 1. Eclypse Low Power Methodology

The aim is to combine and automate a wide range of advanced low-power techniques, methodologies and standards in a coherent fashion: not least to make sense of the inevitable trade-offs between power, speed, area and yield.

Specific technologies deployed in the Eclypse Low Power Solution include enhanced clock gating, low-power clock tree synthesis, multi-threshold leakage optimization and automated power switch insertion. Just as important, however, is the use of the Unified Power Format (UPF), an industry-standard format which is employed to capture low power design requirements. By describing power design intent in a standards-based fashion, UPF is instrumental in allowing the “whole flow” approach that chip designers require.
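To illustrate, a minimal UPF fragment of the kind used to capture power intent might look like the following. This is a sketch only: the domain, supply and control-signal names are hypothetical, and a production UPF file would carry considerably more detail.

    # Sketch: a switchable power domain described in UPF (names are hypothetical)
    create_power_domain PD_TOP
    create_power_domain PD_CAM -elements {u_camera}

    create_supply_net VDD     -domain PD_TOP
    create_supply_net VSS     -domain PD_TOP
    create_supply_net VSS     -domain PD_CAM -reuse
    create_supply_net VDD_CAM -domain PD_CAM

    set_domain_supply_net PD_TOP -primary_power_net VDD \
        -primary_ground_net VSS
    set_domain_supply_net PD_CAM -primary_power_net VDD_CAM \
        -primary_ground_net VSS

    # A power switch allows PD_CAM to be shut down under control of cam_pwr_en
    create_power_switch sw_cam -domain PD_CAM \
        -input_supply_port  {in VDD} \
        -output_supply_port {out VDD_CAM} \
        -control_port       {ctrl cam_pwr_en} \
        -on_state           {on in {ctrl}}

Because the same UPF description travels with the design, synthesis, P&R, simulation and sign-off tools can all work from a single statement of the power architecture.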

Verification Remains Key
Specific tools and individual stages of the design flow are, of course, key contributors to these cumulative improvements in power performance. Surprisingly to some, verification proves to be as important a contributor as any – perhaps precisely because it has traditionally not been seen as a power-oriented discipline.

One of the major verification challenges lies in dealing with multi-voltage (MV) designs. Traditional functional simulation tools do not “know” about voltage. Even “power aware” simulators that model power shut-down states ignore specific voltage levels, and may therefore miss voltage-related bugs.

Static rule checking, as provided by Synopsys’ MVRC, can be used to detect corner-case MV bugs and to check for structural as well as architectural errors. And to make functional simulation itself voltage-aware, engineers can use the MVSIM co-simulator. Together, the two tools can find power management bugs that previously would not have shown up until silicon debug.

Working on the same Verilog/VHDL netlist and testbench as the functional simulation tool, MVSIM understands the waveform nature of voltage changes, producing more accurate simulation results and better coverage. In addition, MVSIM allows the power intent to be specified in UPF, and automatically creates thousands of assertions for common errors such as missing level shifters and power gates.
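By way of illustration, the isolation and level-shifter intent that such assertions guard might be expressed in UPF roughly as follows. Again this is a hedged sketch; the strategy and control-signal names are invented for the example, continuing the hypothetical camera domain above.

    # Sketch: isolation and level-shifter strategies for a shut-down domain
    set_isolation iso_cam -domain PD_CAM \
        -isolation_power_net VDD -isolation_ground_net VSS \
        -clamp_value 0 -applies_to outputs
    set_isolation_control iso_cam -domain PD_CAM \
        -isolation_signal cam_iso_en -isolation_sense high \
        -location parent

    set_level_shifter ls_cam -domain PD_CAM \
        -applies_to both -location parent

Given this intent, a voltage-aware simulator can flag, for instance, a PD_CAM output that toggles while cam_iso_en is inactive and the domain is powered down.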

The Implementation Flow
If low power is surprising in its impact upon verification, it is fair to say that in the implementation flow, it is every bit as important as one might expect. Again, standards-based data interchange via UPF is vital to ensure that many different tools are tightly aligned and integrated.

The first area in which power has had an impact is synthesis. Placement awareness at the synthesis stage is now necessary not only to deliver the required timing and area specifications, but also to ensure that the final design will conform to tight power requirements.

This tighter integration between synthesis and place-and-route (P&R) delivers higher correlation with final placement results, avoiding synthesis iterations.

The drive to low power is also reflected in P&R tools themselves. Designers need to be able to account for the impact of P&R decisions on the power consumption of the final chip; moreover, the P&R tool must offer support in power network design.

Identifying the best power network before P&R and sign-off can avoid significant numbers of design iterations. In the past, engineers were forced to design their power network based on educated guesswork, followed by a trial passage through the back-end flow. Today, however, it is possible to specify a list of candidate solutions that can be used as the basis of rapid estimations of IR-drop and area. Because these estimations correlate well with final results, the engineer can identify the best starting point for power network optimization.
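In IC Compiler, this kind of exploration is handled by power network synthesis and rail analysis. The fragment below is an illustrative sketch only – exact command options vary by release, and the layer names, strap counts and power budget shown are invented for the example:

    # Sketch: constrain candidate power meshes, then synthesize and analyze
    set_fp_rail_constraints -add_layer -layer M7 \
        -direction horizontal -max_strap 20 -min_strap 8
    set_fp_rail_constraints -add_layer -layer M8 \
        -direction vertical -max_strap 20 -min_strap 8

    synthesize_fp_rail -nets {VDD VSS} -voltage_supply 1.0 \
        -power_budget 500 -synthesize_power_plan
    analyze_fp_rail    -nets {VDD VSS} -voltage_supply 1.0 \
        -power_budget 500
    commit_fp_rail

The rapid IR-drop and area estimates produced at this stage are what allow the engineer to pick a good starting point before detailed implementation begins.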

Automatic test pattern generation (ATPG) tools are another vital element in the low-power flow. Test engineers have always been aware of the danger that, by implementing a scenario that may never occur in real life, they may damage chips on the tester. For instance, a camera phone may be designed so that the camera and phone are never operating simultaneously, whereas a test suite could quite easily exercise the two functionalities at the same time. Power-aware ATPG must therefore consider both the power consumed by test pattern data being shifted into the device, and the power consumed once the pattern is loaded and the design is clocked.

Finally, sign-off becomes more important than ever in low power designs, because it is necessary to analyze timing, signal integrity (SI) and power concurrently to avoid a lengthy iterative process. These optimizations are often in direct conflict with each other, and are a particular problem in designs that use dynamic voltage and frequency scaling (DVFS) to implement a wide range of operating modes with various power profiles. Concurrent multi-corner, multi-mode (MCMM) analysis is vital in these cases.

As with a standard design, a low-power design must use equivalence checking at sign-off. But in this case, there are two “golden” files to be checked against: RTL for design intent, and UPF for power intent. And, of course, power distribution needs to be checked for IR drops, hot-spots and possible electromigration effects, to ensure that sufficient power can be delivered to all parts of the circuit at all times.
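In a tool such as Formality, a low-power equivalence run therefore loads both golden files. The following is a sketch of the shape of such a session; the file names are hypothetical, and option spellings may differ by release:

    # Sketch: check implementation against RTL (design intent) and UPF (power intent)
    read_verilog -r rtl/top.v     ;# golden RTL into the reference container
    load_upf top.upf -r           ;# golden power intent
    set_top r:/WORK/top

    read_verilog -i gate/top.v    ;# implementation netlist
    load_upf top_impl.upf -i      ;# implementation power intent
    set_top i:/WORK/top

    match                         ;# pair up compare points
    verify                        ;# prove functional and power-intent equivalence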

Performance and Productivity Intertwine
All of these techniques and tools answer new technical challenges. And yet the number one concern for designers is no longer a technical one but a business one: productivity.

As we have already observed, productivity is as much as anything about designing to strict project timescales. For the EDA industry, the most obvious way to help designers in this regard is to improve tools’ performance. This is not just about faster runtimes and better memory utilization (although the 35 percent improvement attained in these metrics by a tool like IC Compiler in 2007 is a major boost). It is also about focusing improvements on the most challenging aspects of design.


Figure 2. 35% Improvement in IC Compiler Runtimes

In these areas, the greatest gains are delivered not by changes to the tools themselves, but by improving how the tools are used. Better scripting in IC Compiler can deliver much greater gains – up to two times better performance for a challenging design. This fact emphasizes the importance of service and support to any EDA offering: in most cases, performance improvements are “just a phone call away”, the result of closer interaction between designers and support staff.
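As a hedged illustration of what better scripting can mean in practice, the difference is often no more than enabling the right options at each optimization step. The options below are typical of IC Compiler scripts, but are illustrative rather than a recommended recipe:

    # Sketch: script-level tuning of an IC Compiler flow (illustrative options)
    place_opt -effort high -congestion -power   ;# congestion- and power-driven placement
    clock_opt                                   ;# clock tree synthesis and optimization
    route_opt                                   ;# post-route optimization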

Some parts of the flow are still developing rapidly in terms of sheer performance: in particular lithography, which has come into the EDA fold only relatively recently and involves an extremely compute-intensive set of processes. Amongst the challenges here are the need to execute multiple computational steps sequentially, and the huge amount of data – measured in terabytes at 32nm – required to describe the design.

New technology has brought improvements in both these respects. Data bottlenecks are avoided by allowing some portions of the work to proceed without data interchange. And by intelligently partitioning the task, some of the processes can be run in parallel – for instance, mask data preparation and optical proximity correction.

Harnessing the power of multi-core processors also allows EDA tools to “piggyback” on advances in more general-purpose computing. A product such as HSPICE® 2008.03, for instance, provides improvements on two fronts: it includes a new matrix solver and exploits multi-threading capabilities. For mixed-signal designs with parasitics, HSPICE with the new solver averages a three-times speed-up on a single processor core. The new multi-threading capability running on four cores delivers an additional two-times speed-up, for a total speed-up of more than six times.

Productivity “in the flow”
Producing better performing tools is an essential contribution to designer productivity. But, like power consumption, productivity is a “whole flow issue”. The key is ensuring that the flow produces predictable results; and the way to guarantee that is to make certain that each step in the design process is a step towards a successful project conclusion.

Making sure that each stage is, literally, productive means that earlier steps in the flow must have knowledge of succeeding steps. Not total knowledge, but enough to ensure that later steps do not have to redo – or, worse, undo – work that has been done earlier in the flow.

Today’s flow is full of examples of this tighter integration between tools. For instance, a P&R-aware synthesis tool is not only useful in optimizing for timing, power and area; it can also be used to identify areas of routing congestion, and to enable subsequent process steps to optimize that congestion away. Meanwhile, the P&R tools themselves need to know about manufacturing and signal integrity issues, without themselves becoming too slow or complex to be useful.

Even so, it is occasionally possible to implement a valuable improvement by automating a single optimization step. This is the case with Synopsys’ MinChip automated die-size reduction technology, which is designed to preserve user intent while delivering the smallest possible routable area in a single-pass process that takes just a few hours. Many customers are seeing 5-10 percent shrinkage, a reduction that is passed straight to the bottom line.

Methodology is another valuable contributor to enhanced productivity, nowhere more so than in verification, where the Verification Methodology Manual captures an ever-broader set of information on how to efficiently complete the process. The recently introduced VMM-LP manual is the industry’s first verification methodology for low-power designs. It provides a low-power perspective on verification planning, and outlines best-practice rules and guidelines.

Increasingly, much of the verification methodology best practice is itself the subject of automation. For example, it is now possible to provide firmer metrics on the completeness of the verification process; and to automatically check for compliance with language restrictions within a design.

Enhancements to the Synopsys verification portfolio have extended its reach to provide embedded software and system validation. The demand for verification performance that is orders of magnitude higher than is possible with software simulation has made FPGA prototyping a verification methodology used on virtually every ASIC/ASSP design today. Confirma™ makes prototyping a mainstream verification solution by providing a complete flow, with tightly integrated hardware and software components, that complements any existing verification flow.

IP Still Developing
Of course, commercial IP continues to have a massive impact on designer productivity. There is a constant race towards lower power, smaller footprint and newer technologies, with a parallel need to maintain consummate quality. As a result, verification is absolutely vital in this area.

In many cases, productivity is increased as much by commonality of design as by innovation. Examples of this are the recent focus on easing the transition between PCI, PCI-X and PCI-Express; the move to DDR3 memory; and the launch of both 1T and 6T embedded memory solutions.

Analog and Mixed Signal
One of the most striking features of many recent EDA developments is the meeting of different disciplines: for instance the need to treat digital signals as waveforms for power purposes; and to make digital design tools aware of the physics of manufacturing a device.

Such trends are also reflected in designs themselves, where analog and mixed-signal (AMS) blocks are becoming increasingly important. Synopsys has added two key capabilities to the Galaxy™ Platform: schematic capture for AMS and custom designs, and a feature-rich custom layout editor. With Custom Designer LE and Custom Designer SE, the Galaxy Implementation Platform now supports both cell-based and custom design.


Figure 3. Custom Designer for AMS Design

The new Synopsys custom design solution is designed to be familiar to most AMS designers, while adding new features such as the ability to open multiple designs simultaneously and a virtually unlimited undo facility. It is compatible with TCL, Python and C++, and integrates with the existing Synopsys tool flow, including the HSPICE and NanoSim® simulators.

Aart de Geus
Since co-founding Synopsys in 1986, Dr. Aart de Geus has expanded Synopsys from a start-up synthesis enterprise to a world leader in electronic design automation (EDA). As a technology visionary, he is frequently asked to speak on topics related to the electronics industry. As one of the leading experts on logic simulation and logic synthesis, Dr. de Geus was made a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in January 1999. He was also honored for pioneering the commercial logic synthesis market, becoming only the third recipient of the IEEE Circuits and Systems Society Industrial Pioneer Award. In 2002, shortly after completing the largest merger in EDA history, Dr. de Geus was named CEO of the Year by Electronic Business magazine; and in 2004, Entrepreneur of the Year in IT for Northern California by Ernst & Young. In November 2005, Electronic Business magazine chose Dr. de Geus as one of "The 10 Most Influential Executives." Dr. de Geus is active in the business community as a member of the board of the Silicon Valley Leadership Group (SVLG), TechNet, the Fabless Semiconductor Association (FSA), and as Chairman of the Electronic Design Automation Consortium (EDAC). He is also heavily involved in education for the next generation, having created in 1999 the Synopsys Outreach Foundation, which promotes project-based science and math learning throughout Silicon Valley.


©2008 Synopsys, Inc. Synopsys, Design Compiler and DesignWare are registered trademarks or trademarks of Synopsys, Inc. Any other trademarks or registered trademarks mentioned in this article are the intellectual property of their respective owners.





WEB LINKS
- Custom Designer
- Low Power Solution
- SNUG

"If low power is surprising in its impact upon verification, it is fair to say that in the implementation flow, it is every bit as important as one might expect."