Effective System Verification with a Scalable Verification Methodology
By Robert Hum, EE Times
March 15, 2004 (10:18 a.m. EST)
URL: http://www.eetimes.com/story/OEG20040312S0023

Functional verification is a major challenge for electronic designers today. Total system complexity is growing as more functionality is integrated to differentiate products, including analog/mixed-signal content, embedded processors and their respective software. With this increased integration, design size grows, the number and length of tests increases dramatically, and overall verification complexity skyrockets. Not only does the increased presence of on-chip software and analog devices contribute to system complexity, it also challenges the traditional ways of doing things.

Functional errors are the leading cause of design respins. Functional verification, the process used to find those errors, is the biggest bottleneck in the design flow and constitutes at least 50 percent of all design activity. According to a 2003 study by Collett International Research Inc., 67 percent of all errors are logical and functional failures, with analog flaws making up 35 percent of chip failures. Mixed-signal interfaces constitute 21 percent of errors and hardware/software interfaces, 13 percent. A single system-on-chip (SoC) already can consist of tens of millions of gates, raising the potential for errors and complicating the verification task. The International Technology Roadmap for Semiconductors (2001 study) predicts that SoCs will contain a billion transistors by 2006.

To meet these challenges, engineering teams need new verification strategies that address all aspects of the design cycle and reduce the verification gap, defined as the difference between the ability to fabricate and the ability to verify. The problem is that verification is subservient to design. To solve it, verification must become an integral part of the overall design methodology. The whole design and verification flow must be structured based not only on what's good for the design engineers, but also on what's good for the verification engineers. This has implications for design partitioning, block sizing, design rules and many other things taken for granted today.

Another challenge in successful system verification pertains to testbenches. As design size increases, verification complexity rises exponentially. While simulation capability tracks with design size, the complexity of the testbenches does not. Part of the reason is the dramatic effect of size on the observability and controllability of the design. This increases the number of tests that need to be run; moreover, those tests are likely to get longer, so when things go wrong, it is more difficult to find out why.

To overcome these issues, we must do it better or do it differently. To do it better, we need tools that are capable of spanning the verification domains of simulation, emulation, hardware, software, analog and digital. In addition, the tools must support all of the standard and emerging design languages, including VHDL, Verilog, C, SystemC and, most recently, SystemVerilog. Mentor Graphics calls this Scalable Verification. We can also improve these techniques by adding verification constructs, for example descriptions of expected behavior written in property specification languages.
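As a minimal sketch of such a property, the SystemVerilog assertion below captures a simple handshake rule. The signal names (clk, rst_n, req, grant) and the one-to-four-cycle bound are illustrative assumptions, not taken from any particular design:

    // Hypothetical handshake rule: every request must be granted
    // within one to four clock cycles.
    module req_grant_checker (input logic clk, rst_n, req, grant);

      property p_req_gets_grant;
        @(posedge clk) disable iff (!rst_n)
          req |-> ##[1:4] grant;
      endproperty

      // A simulator or formal tool flags any trace that violates the
      // expected behavior, independent of any directed test.
      a_req_gets_grant: assert property (p_req_gets_grant)
        else $error("req was not followed by grant within 4 cycles");

    endmodule

Because the property states intent rather than a specific stimulus, the same checker can be reused wherever the block appears, whether in simulation, emulation or formal analysis.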
Doing it differently means applying verification much earlier in the process. This may involve the creation of system-level testbenches, transaction-level modeling (a brief sketch appears below) and the ability to examine the interfaces of the system as they are created, rather than leaving this until the end. It requires tools that span the gaps between levels of abstraction and between each of the system domains, such as hardware and software. This is what we call design-for-verification, which deploys the principle of scalable abstraction. Design-for-verification and Scalable Verification rely heavily on open industry standards and on debug capabilities that scale in both of these dimensions.

We believe that verification tools should be chosen by virtue of their attributes and not solely according to the languages they support. The requisite solution should comprise a suite of tools that work together to form a complete path from HDL simulation to in-circuit emulation. This means better simulators and emulators to speed up the verification process at all levels of integration.

Scalability across tools is necessary because the various types of verification provide different solutions at different performance ranges. Each solution involves a trade-off among many attributes, such as iteration time, performance, capacity, debug visibility and cost. Even HDL execution engines require a range of solutions: some perform better at the block level, others at the chip or system level. For example, designers wanting to verify architectural decisions pertaining to their system would not use an HDL software simulator; an abstract model or a transaction-level hardware/software environment would provide the necessary information. Conversely, in-circuit emulation would not be an appropriate solution for verifying relatively small sub-blocks of a chip design when an HDL software simulator could easily accomplish the same task.

Design teams must improve existing methodologies with tools that scale across design complexity, and they should utilize multiple levels of abstraction. A scalable solution enables engineers to do what they do today, only better, faster and more often within the same time frame. It makes the verification tools more user friendly and enables more vectors to be pushed through a design.

Any effective system verification strategy must begin with the premise that the system really is the "entire system" and includes things in addition to digital hardware. In other words, a meaningful solution must address analog, embedded software/RTOS and the environment in which these things must operate.

In addition, design-for-verification techniques enable more effective reuse of the verification components created. They also enable early verification of the system architecture and of the interfaces between design blocks as they are created, rather than at the end of the process. This ensures that block-level verification effort is not wasted, because interpretations of the specification are checked much earlier than before.

New testbench components are making their way into verification methodologies today, and the use of assertions can have a dramatic effect on the quality and speed with which verification can be performed. A number of even newer testbench components are emerging. All of these new components will be driven by, or will manipulate, properties. This is where the future lies, and that future is beginning to look very bright. This automated, properties-based verification approach will deliver the boost in performance necessary to narrow the verification gap.
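As a rough illustration of the transaction-level modeling mentioned above, the SystemVerilog sketch below generates randomized bus transactions and hands them to a driver stub through a mailbox. The class, field and task names are hypothetical, chosen only to show the shape of such a testbench component:

    // Hypothetical transaction-level stimulus: the testbench reasons about
    // whole transfers (address, data, direction) rather than pin wiggles.
    class bus_txn;
      rand bit [31:0] addr;
      rand bit [31:0] data;
      rand bit        is_write;
    endclass

    module tlm_style_stimulus;
      mailbox #(bus_txn) gen2drv = new();

      // Generator: produces randomized transactions at the abstract level.
      task generate_txns(int count);
        bus_txn t;
        repeat (count) begin
          t = new();
          void'(t.randomize());
          gen2drv.put(t);
        end
      endtask

      // Driver stub: the point where a pin-level or emulation-backed bus
      // functional model would turn each transaction into signal activity.
      task drive_txns();
        bus_txn t;
        forever begin
          gen2drv.get(t);
          // drive t.addr / t.data onto the DUT interface here
        end
      endtask

      initial begin
        fork
          generate_txns(100);
          drive_txns();
        join_any
      end
    endmodule

The same generator can later feed a signal-level driver in simulation or an emulator-resident transactor, which is one way verification components can be reused across levels of abstraction.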
Properties-based verification is, in effect, the equivalent of the synthesis benefit that the design path enjoyed over a decade ago. Scalable verification is on its way and will fundamentally change the way the verification problem is viewed and handled.

Robert Hum (Robert_Hum@Mentor.com) is vice president and general manager of the Verification and Graphics Design division of Mentor Graphics Corp. (Wilsonville, Ore.).