Building a verification strategy 'blueprint'

Just as no one would build a house without a blueprint and a detailed floorplan, no design team today would begin a new project without first developing a carefully considered verification strategy. With verification consuming up to 70 percent of the design cycle, it is easy to understand why a verification strategy has become so crucial.

As with any new home project, the location of the kitchen, the number of bathrooms and the square footage of the master suite are likely to change from the original blueprint. The later those changes are made, the greater the cost overrun, fraying everyone's nerves and wreaking havoc on the homeowner's budget. The same is true in chip design, which makes early and thorough verification essential. If an error is detected late in the design cycle, fixing it may require repeating many design and verification steps, slipping schedules and missing valuable market opportunities. If an error is detected after manufacturing, a re-spin can cost a half million dollars or more. If it is detected in a fielded device, recalling defective devices can cost hundreds of millions of dollars.

Verification has been evolving for many years. Simulation has been used for design verification since the 1970s, but it is losing effectiveness because coverage and performance drop dramatically as designs grow larger. Once a design reaches a quarter-million gates, gate-level simulation becomes impractical. At a half-million gates and above, it becomes too risky to verify designs with traditional simulation methods alone.

Formal verification offers a needed alternative, meeting the ongoing challenges of system-on-chip (SoC) design. It is faster than simulation, requires no test vectors, and is exhaustive, ensuring that more bugs are caught earlier in the cycle, when they take less time to fix. A formal verification methodology is now a critical part of a strategic SoC verification plan.

Equivalence checking is a key element of that methodology. It is an automated way of detecting functional inconsistencies, providing a reliable way to ensure that the final design implementation does what the register transfer level (RTL) code specifies. It uses mathematical techniques to determine whether one design representation is functionally equivalent to another, and it can functionally verify the implementation of an entire chip design from RTL to GDSII. The sketch below illustrates the idea on a trivial circuit.
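To make the idea concrete, here is a deliberately tiny example (the modules and names are illustrative, not from any real design): a behavioral RTL multiplexer and a structural gate-level netlist of the same function. An equivalence checker proves, for every possible input combination, that the two descriptions compute identical outputs, with no test vectors required.

  // "Golden" RTL description: a 2-to-1 multiplexer.
  module mux2_rtl (input a, b, sel, output y);
    assign y = sel ? b : a;
  endmodule

  // Gate-level implementation, as synthesis might produce it.
  // An equivalence checker proves y matches mux2_rtl.y for
  // every combination of a, b and sel.
  module mux2_gate (input a, b, sel, output y);
    wire sel_n, t0, t1;
    not g0 (sel_n, sel);    // invert the select
    and g1 (t0, a, sel_n);  // pass a when sel == 0
    and g2 (t1, b, sel);    // pass b when sel == 1
    or  g3 (y, t0, t1);     // combine the two paths
  endmodule

On real designs the same comparison is applied block by block across millions of gates, at RTL-to-gate, gate-to-gate and gate-to-transistor boundaries.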
Design teams are now taking their verification strategies one step further, modifying them to include assertion-based techniques that verify RTL designs more thoroughly before gate- and switch-level implementation. Assertions are checks embedded in the design to verify the designer's assumptions about how a logic block should operate, both by itself and in concert with surrounding logic blocks. Assertions embedded at the interfaces of intellectual property (IP) blocks preserve the design knowledge needed to verify them as an integrated SoC, essentially making the blocks self-checking. The net result is that assertions help SoC designers find more errors earlier in the design cycle, where they are far easier and less costly to fix.

Assertion checks range from mutual exclusivity checks -- verifying, for example, that several independent state machines never produce simultaneous bus requests -- to checks on temporal signal relationships such as handshaking -- verifying that a block always responds appropriately, and in a timely manner, when another block sends it a signal. Once assertions are embedded in a design, the assumptions the designer made about how the block should operate as part of a system are automatically verified wherever and whenever the block is reused.

Assertions can be specified with pre-defined assertion monitors from the Open Verification Library (OVL), an open-source, Verilog hardware description language (HDL)-based library of assertions for capturing design knowledge, which then travels with IP blocks into whatever systems they are integrated. The approach works with both simulation and formal verification. Like a simulation monitor, an assertion monitor is placed in a design to test for a particular behavioral characteristic. Assertion monitors can be used with commercial verification tools to trap undesirable functional behavior or to ensure that desirable behavior always occurs; the sketch below shows how such monitors might be embedded.

Formal verification further increases the thoroughness of OVL assertion checks by increasing controllability: it exhaustively explores the state space of the design to determine whether the assertions can be triggered under any combination of internal states and external stimuli. Increasing both observability and controllability reduces the likelihood of expensive schedule slips, re-spins or recalls due to missed design errors.
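As a minimal sketch of how such monitors might be embedded (the module, signal names and cycle bounds are illustrative assumptions, and the OVL monitors are instantiated with mostly default parameters), the checker below guards a simple bus interface with one mutual exclusivity check and one handshake check:

  // Hypothetical checker module; assumes the OVL source files
  // (e.g., assert_always.v, assert_handshake.v) are on the
  // compile path.
  module bus_if_checks (
    input clk,
    input reset_n,
    input req0,   // bus request from state machine 0
    input req1,   // bus request from state machine 1
    input req,    // request into a responding block
    input ack     // acknowledge back from that block
  );

    // Mutual exclusivity: the two state machines must never
    // assert their bus requests in the same cycle.
    assert_always mutex_req (clk, reset_n, !(req0 && req1));

    // Handshake: every req must be answered by an ack within
    // 1 to 16 cycles (bounds chosen for illustration).
    assert_handshake #(
      .min_ack_cycle(1),
      .max_ack_cycle(16)
    ) req_ack_chk (clk, reset_n, req, ack);

  endmodule

In simulation, the monitors fire a message whenever a check is violated (observability); a formal tool reads the same monitors as proof targets and searches the entire state space for any sequence that can violate them (controllability).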
Just as a blueprint ensures a home is built to the homeowner's exact specifications, a verification strategy put in place before the start of a design project will ensure that a chip works as it is supposed to work. By incorporating an automated verification methodology featuring formal assertion checking, designers have the means to overcome the complexity of verifying SoC designs.

Dino Caporossi is vice president of corporate marketing at formal verification provider Verplex Systems.