Getting the most out of formal analysis
EE Times: Design News
Paul Hylander, Axel Scherer, and Ramesh Mayiladuthurai (04/25/2005 9:00 AM EDT) URL: http://www.eetimes.com/showArticle.jhtml?articleID=161500589
1 Introduction

Using the right methodology to apply products within a design project is critical to achieving a high return on investment (ROI). This is especially true in emerging areas such as formal analysis, which holds the promise of reducing testbench development and simulation cycles while improving design verification.

Formal analysis uses sophisticated algorithms to conclusively prove or disprove that a design behaves as desired for all possible operating states. Desired behavior is expressed not in a traditional testbench but as a set of assertions. Formal analysis requires no user-developed test vectors; instead, it analyzes all legal input sequences concurrently and automatically.

When applied correctly, formal analysis improves productivity and reduces time to market. Many bugs can be found quickly and very early in the design process, without the need to develop large sets of test vectors or random test generators. Furthermore, because of its exhaustive nature, formal analysis improves quality by finding corner-case bugs often missed by traditional testbench-driven verification. These dual benefits of increased productivity and increased quality are driving the adoption of formal analysis into the design flows of many companies.

The adoption of formal analysis is a relatively recent phenomenon, enabled by the confluence of several factors: improvements in usability, the availability of industry-standard assertion languages, and advances in performance and capacity. While these factors have allowed formal analysis to move from the academic into the commercial realm, full mainstream adoption has been limited primarily by the lack of an adequate methodology.

2 Methodology overview

There are distinct levels of methodology. At the highest level there is project methodology, which integrates formal analysis with the project flow.
It involves a description of who on the team performs what activities, when those activities are scheduled, and how they will be done. At the lower levels there are methodologies such as property coding and RTL design guidelines for formal analysis.

There are several key elements of a successful formal analysis methodology. The first is that formal analysis must be used in the context of a comprehensive assertion-based verification flow, which includes formal analysis and simulation and can be extended to acceleration and emulation; these verification techniques complement each other when used together. The second is that formal analysis should be applied to the parts of the design where its key advantages can be leveraged efficiently. The final element is the most crucial one: the people. Design and verification engineers must be trained on the flow and execute it in activities specific to their respective roles.

3 Comprehensive verification flow

Several important factors influence the deployment of formal analysis as part of a comprehensive verification flow, and each of them enables engineers to get the most out of formal analysis today.

The first and most important factor is that formal analysis does not require test vectors. Engineers can therefore start the verification process during the design stage, without investing effort in creating test vectors or writing random testbenches. With this reduced effort, there is little reason not to perform module- and block-level verification with the help of formal analysis.

The second factor is that with industry-standard assertion languages (and the software that supports them), assertions written at one level of a design can be reused at many other levels throughout the verification flow.
This allows any investment in writing block-level assertions and running formal analysis to be leveraged by the other verification products deployed later in the flow.

The third factor is that formal analysis is exhaustive. On one hand, this makes it very efficient at finding subtle, corner-case bugs in a design. On the other hand, the countless state permutations explored by formal analysis hit computational limits, so verification targets must be chosen carefully to maintain good ROI.

As shown in Figure 1, the recommended methodology, assertion-based verification, takes these three factors into account. In this methodology, designers write assertions as they develop the RTL. They then verify these assertions, along with automatically extracted assertions, using formal analysis on their own blocks. This occurs before simulation and before the blocks are integrated into higher levels.

In this block-level verification paradigm, formal analysis adds value by verifying blocks months earlier in the design cycle, as soon as a partial RTL description is available. Bugs are therefore found early, when the cost of fixing them is low. In contrast, with a simulation-only methodology, bugs are often found late, when the cost to identify, isolate, and fix the design is much higher. Moreover, even with very high coverage, simulation may miss subtle corner-case bugs, leaving them undiscovered until after first silicon.

Because formal analysis finds bugs so quickly, often in seconds, the traditional cycle of fixing a bug and re-running regressions to find more bugs is vastly inefficient. Instead, the designer can fix a bug and immediately re-check the fix with formal analysis, potentially exposing additional bugs. In this way, a block can quickly converge to high quality in a matter of hours instead of days.
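As a concrete illustration of the designer-written, block-level assertions described above (a hypothetical sketch; the FIFO block and its signal names are not from the article), a designer might write properties like these alongside the RTL and prove them with formal analysis before any testbench exists:

```systemverilog
// Hypothetical example: designer-written assertions for a FIFO block.
// Signal names (push, pop, full, empty) are illustrative assumptions.
module fifo_props (
  input logic clk, rst_n,
  input logic push, pop, full, empty
);
  // A full FIFO must never accept another push.
  a_no_overflow: assert property (
    @(posedge clk) disable iff (!rst_n)
    full |-> !push
  );

  // An empty FIFO must never be popped.
  a_no_underflow: assert property (
    @(posedge clk) disable iff (!rst_n)
    empty |-> !pop
  );
endmodule
```

Formal analysis can attempt to prove such properties for all reachable states of the block, so a failure report arrives without a single test vector being written.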
Figure 1 — Formal analysis as part of the assertion-based verification flow
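Assertion reuse across verification engines, one of the factors noted above, is commonly achieved with the SystemVerilog `bind` construct, which attaches a checker module to a design without modifying the RTL. The following is a hedged sketch; the module and signal names (`handshake_if`, `req`, `ack`) are hypothetical:

```systemverilog
// Hypothetical sketch: one property module serves formal analysis,
// simulation, acceleration, and emulation alike.
module req_ack_props (input logic clk, rst_n, req, ack);
  // Every request must be acknowledged within 1 to 4 cycles.
  a_ack_latency: assert property (
    @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] ack
  );
endmodule

// Attach the checker to every instance of a hypothetical handshake block;
// .* connects ports to same-named signals in the target scope.
bind handshake_if req_ack_props u_props (.*);
```

Because the properties live outside the design source, the same file can be handed to a formal tool at the block level and compiled into the simulation environment at higher levels of integration.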
At the cluster/block level, formal analysis is still useful, though runtimes may increase. At this level of integration, formal analysis is used alongside simulation and, in some cases, acceleration. Here, simulation helps not only to verify the correctness of assertions but also to check the assumptions that were used during formal analysis of the sub-blocks.

Finally, at the integrated design level, all forms of assertion-based verification products are applicable, depending on the size of the design and the amount of verification necessary. At this level, simulation, acceleration, and emulation are the primary tools of choice, with formal analysis used to expose a few more corner-case bugs, typically by running formal analysis alongside simulation and searching for nearby bugs. Although formal analysis can expose some problems this way, its effectiveness varies at higher levels of integration, so it plays a secondary role there.

The addition of formal analysis to the verification flow raises the question of whether simulation can be eliminated. Some simulation activities, such as verification of a few special blocks, may no longer be necessary once formal analysis is in the flow. However, the majority of simulation activities will remain, particularly for integrated design verification.

4 Best targets for formal analysis

Using formal analysis at the block level is a key factor in increasing ROI, and some blocks are better suited to formal analysis than others. The main consideration during block selection is the amount and complexity of control logic in a block. Control logic blocks tend to have many corner cases and a high potential for subtle bugs, making them excellent targets for formal analysis.
Designers should also consider the presence of datapaths and memories. Formal analysis of these constructs tends to require more effort, making them secondary targets to the control logic blocks described above. If designers weigh these factors when selecting targets for formal analysis, the resulting ROI tends to be very high.

5 Applying formal analysis methodology

A critical part of the Cadence Incisive formal analysis methodology is to ensure that design and verification engineers apply formal analysis in ways that complement their respective roles. Designers are in the unique position of having intimate knowledge of the blocks they implement: they understand the intended behavior of the block, its relationship to other blocks, and its implementation in RTL. Designers are therefore natural candidates to write assertions for their blocks. Verification engineers, on the other hand, tend to have a better picture of the overall verification plan and the risk areas, and can provide a necessary alternative perspective on correctness.
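To illustrate why control logic rewards exhaustive analysis (a hypothetical example; the arbiter and its signal names are assumptions, not from the article), consider a mutual-exclusion property over an arbiter's grant lines. A directed testbench can easily miss the one request pattern that produces a double grant, while a formal proof covers every pattern:

```systemverilog
// Hypothetical checker for a two-client arbiter.
module arb_props (
  input logic       clk, rst_n,
  input logic [1:0] gnt    // one grant line per client (illustrative)
);
  // At most one grant may be asserted in any cycle.
  // $onehot0 is true when zero or one bit of the vector is set.
  a_grant_mutex: assert property (
    @(posedge clk) disable iff (!rst_n)
    $onehot0(gnt)
  );
endmodule
```

A proof of this single property is equivalent to exercising every reachable combination of requests, priorities, and internal arbiter state, which is exactly the corner-case coverage the article attributes to formal analysis.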
5.1 The designer's activities

The designer can plan for, and aid in, formal analysis. Knowing that datapaths and memories can hinder formal verification of control logic, a designer can partition them apart from the control logic when designing a block or a set of blocks.

The next step for the designer is initial RTL coding and bring-up. Because test vectors are not required, formal analysis is particularly well suited to block bring-up, where the designer's goal is to get the block to a stage where it mostly works. Initially, the designer focuses on checks that can be extracted automatically from the design: lint checks, structural checks, and automatically extracted assertions for dead-code analysis, finite-state-machine analysis, and others.

The designer should also write assertions on internal signals that have complex behavior or that are expected to be problematic. Running formal analysis on these assertions provides an excellent return, because failures caught here can be debugged quickly, and failures are very likely in these complex parts of the design.

The next activity centers on checking behavior from a black-box perspective. Here, the designer writes assertions about the block's externally visible behavior and checks them using formal analysis. Because these assertions involve many aspects of a design, they can often find subtle bugs caused by unforeseen interactions between different parts of the design.

5.2 The verification engineer's activities

The verification engineer ensures that the design works as intended: that the individual blocks work correctly and that they work correctly together. Just as designers must plan for formal analysis early in the design stage, verification engineers must plan for formal analysis early in the verification planning stage. At this stage, it is important to identify likely targets for formal analysis.
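The white-box, internal-signal assertions that Section 5.1 recommends might look like the following sketch (the module, its state register, and the one-hot encoding are hypothetical assumptions for illustration):

```systemverilog
// Hypothetical white-box assertion embedded in a control block's RTL.
// The designer knows the state register is meant to be one-hot encoded,
// so that intent is captured as a property formal analysis can prove.
module ctrl_unit (
  input logic clk, rst_n
  /* ... other ports elided ... */
);
  logic [3:0] state;  // one-hot state register (illustrative)

  // ... state machine logic would go here ...

  // The one-hot invariant must hold in every reachable cycle.
  a_state_onehot: assert property (
    @(posedge clk) disable iff (!rst_n)
    $onehot(state)
  );
endmodule
```

Only the designer knows this internal encoding intent, which is why the article assigns white-box assertions to the designer rather than the verification engineer.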
Good targets may be areas known to have complex functionality or areas that have produced many bugs in similar projects.

In modern designs, especially SoC designs, blocks are often connected through a few commonly used interfaces; even within clusters of sub-blocks, the same interfaces may be used repeatedly. Because these protocols are so widely used within a project, and because consistent communication between blocks is so important, verification engineers should develop project-wide or even company-wide assertion-based protocol and bus-interface verification components. Other verification engineers and designers on the team can use these components to verify individual blocks and to speed system bring-up by ensuring that blocks communicate with each other correctly. The same verification components can be used by formal analysis, simulation, acceleration, and emulation. Because they can be used in so many ways and places, creating and reusing verification components yields higher ROI for the verification engineer.

There are cases when the verification engineer is in a better position than the design engineer to run formal analysis on a block. For example, designers tied up with physical design fixes may not be able to devote time to formal analysis, or the verification engineer may have been assigned to verify a block and decided that formal analysis is the best method. In these cases, the verification engineer must be able to determine whether the block is suitable for formal analysis and then write assertions, run formal analysis, and debug the results. Although high ROI can still be achieved, the lack of white-box knowledge makes it harder for the verification engineer to judge a block's suitability and to debug failed assertions.

Formal analysis may also create opportunities for savings elsewhere in the verification effort.
For example, some directed tests may no longer be required if an engineer knows that the behavior in question has already been covered by formal analysis of certain blocks. Or, if a complex state machine has been verified by formal analysis, reaching complete simulation coverage for that state machine might not be necessary. A verification engineer who understands the overall verification plan and knows which parts of the design have been covered by formal analysis is well placed to assess where the rest of the verification effort can be reduced.

6 Conclusion

In this paper, we presented a proven approach to achieving high returns with formal analysis. As formal analysis becomes part of mainstream production flows, the methodology for adoption and usage plays a key role. Applied in the context of the recommended methodology, a comprehensive assertion-based verification flow, formal analysis provides the biggest returns when used early, prior to simulation efforts. Moreover, both design engineers and verification engineers play important roles in formal analysis use and can benefit from it. With the critical factors in place, the methodology maturing, and the ROI clearly defined, this is an ideal time to consider adding formal analysis to your design and verification flow.

Paul Hylander is an Architect at Cadence Design Systems. Axel Scherer is Senior Member of Consulting Staff. Ramesh Mayiladuthurai is Technical Marketing Manager.
Copyright © 2005 CMP Media LLC. All rights reserved.