Language Attributes Ensure IC Verification
By Janick Bergeron, Integrated System Design
June 4, 2002 (3:24 p.m. EST)
URL: http://www.eetimes.com/story/OEG20020604S0028
Hardware verification languages (HVLs) have survived past the critical early-adopter stage and are now aggressively entering the mainstream. Estimates put their current penetration level at 10 percent of the overall Verilog and VHDL user market, with a growth rate of more than 50 percent annually. The HVL market is currently dominated by commercial solutions: Specman Elite from Verisity Design Inc. and Vera from Synopsys Inc. Other commercial solutions include Rave, from Forte Design Systems, and newcomer Superlog, from Co-Design Automation Inc. Open-source solutions are also available and include TestBuilder from Cadence Design Systems Inc. and Jeda from Juniper Networks Inc. Unaccounted for, and probably making up a larger share of the market, is the plethora of homegrown solutions based on C, C++, TCL or Perl and integrated with VHDL or Verilog via a custom PLI interface.

If you are currently using Verilog or VHDL to verify your designs, should you consider switching to a verification language for your next project, or can you accomplish the same task with your existing tools?

As the complexity of designs grew in the late '80s, the design community was forced to abandon its trusty schematic-capture tools and embrace the logic synthesis revolution in order to maintain competitive productivity levels. Continued increases in functional complexity are forcing the verification community toward a similar change in the way verification is done. This change involves more than using self-checking, transaction-level testbenches. Although this approach is a necessary step to successfully using HVLs, it is not sufficient to implement today's best-in-class functional verification methodology.

A basic requirement of HVLs is support for high-level data types, object orientation, concurrency control and design observability. It is often this basic requirement that prompts users to switch to C or C++ to implement testbenches, since neither Verilog nor VHDL provides all of those functions. Verilog has no high-level data types, and VHDL has no direct visibility into the design. Neither language is object-oriented, and neither has a dynamic concurrency control mechanism. C++, coupled with a PLI interface, can easily answer these requirements. But these basic requirements only address the mechanics of writing directed, self-checking, transaction-level testbenches. They do not provide a fundamental shift in the way verification is implemented.

To be competitive for the next generation of multimillion-gate designs, HVLs must offer three complementary tools that enable a new, more productive functional verification methodology: constrainable random generation, temporal assertions and functional coverage. This article discusses these tools.

With increased functional complexity comes an exponential growth in the number of features that must be verified. A directed verification approach, where each feature is individually verified using a separate, manually written test case, is quickly going to collapse. Too many test cases need to be written and debugged. The number of people required to write all of these test cases within the prescribed time-to-market window is unmanageable and too costly, assuming you manage to find them. To address this problem, constrainable random generation has proven an effective tool for productively generating the stimulus necessary to exercise the features of the design.
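As a rough illustration of the idea, and in the spirit of the homegrown C/C++ testbenches mentioned earlier, here is a minimal C++ sketch of constraint-controlled random generation of MAC frames, the example also used in the figures discussed below. The type and field names (MacFrame, min_len, allow_bad_fcs) and the constraint set are illustrative assumptions, not code from any of the tools named in this article.

// Hypothetical C++ sketch of constrainable random generation of MAC frames.
// An HVL provides this declaratively; here the constraints are explicit code.
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <vector>

struct MacConstraints {
    std::size_t min_len = 64;          // legal minimum frame length
    std::size_t max_len = 1518;        // legal maximum frame length
    bool        allow_bad_fcs = false; // relax to inject CRC errors
};

struct MacFrame {
    std::vector<uint8_t> payload;
    uint32_t             fcs = 0;

    // Generate a random frame that satisfies the current constraints.
    void randomize(const MacConstraints& c) {
        std::size_t len = c.min_len + std::rand() % (c.max_len - c.min_len + 1);
        payload.resize(len);
        for (auto& b : payload) b = static_cast<uint8_t>(std::rand());
        fcs = compute_fcs();                    // valid checksum by default
        if (c.allow_bad_fcs && (std::rand() % 4 == 0))
            fcs ^= 0x1;                         // occasionally corrupt it
    }

    uint32_t compute_fcs() const {              // placeholder, not a real CRC-32
        uint32_t acc = 0;
        for (uint8_t b : payload) acc = (acc << 1) ^ b;
        return acc;
    }
};

int main() {
    MacConstraints c;
    MacFrame f;
    c.min_len = 64; c.max_len = 72;             // tighten: focus on short frames
    for (int i = 0; i < 100; ++i) f.randomize(c);
    c.allow_bad_fcs = true;                     // relax: start injecting errors
    for (int i = 0; i < 100; ++i) f.randomize(c);
}

The point of the HVL version is that such constraints are declared once and solved automatically by the tool, and can be tightened or relaxed at run-time, rather than being hand-coded into every generator as above.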
The constrainable random generation provided by HVLs is far more powerful than the simple random-number generator and distribution tasks provided by Verilog, or the usual random-generation packages available in VHDL. To efficiently and productively create a large number of varied and interesting scenarios, HVLs make it easy to generate random instances of valid complex data structures. They also make it easy to add constraints to direct the random generation toward a specifically interesting area of the solution space, or to remove constraints in order to relax certain conditions and inject errors. Figure 1 shows a coding example, written using the OpenVera language, that randomly generates MAC frames using dynamic constraint control.

Assertion statements

HVLs include a powerful temporal language that can describe complex protocol relationships using a concise syntax. The mundane functions of dynamically spawning parallel execution threads, detecting discrepancies and reporting errors are all built in, leaving the verification engineer to focus on the actual functionality to be verified rather than the detection and reporting mechanisms. Their concise syntax and formal semantics make them ideally suited for interpretation by formal verification tools, which may then attempt to mathematically prove or disprove the property.

Despite their power, temporal assertions are often underutilized. This is probably due to the declarative and mathematical nature of their syntax, traits that challenge our traditional procedural mind-set. That is why most temporal assertion examples, including the one below, are often simple and leave the impression that temporal assertions are not worth the investment required to learn how to use them. But once understood, they prove to be an indispensable tool for ensuring the functional quality of a design. In Figure 2, using the SuperLog language, the assertion states that a request must always be followed by a grant or a retry within 100 clock cycles, except if reset is applied, and that there must not be a grant or retry unless there is a request.

The last essential element of an HVL is functional coverage. High-level constructs will help you model your data and its transformation more easily. Constrained random generation will help automate the creation of the test cases, and temporal assertions will help detect functional errors. But how do you know you have completely verified the functionality of your design? Given that your test cases are randomly generated, how will you know whether all relevant test cases have been applied? That's where functional coverage comes in.

Complementary coverage

Functional coverage can help answer questions like: "Did I execute all possible operands with all possible addressing modes?" "Did I send Ethernet packets of all significant lengths to all ports?" "Did I write to every register?" Figure 3 shows a coding example, using the e language in Specman Elite, that measures functional coverage of generated MAC frame lengths. A partial screen shot of the generated report enables the user to analyze what remains to be done in the verification of the design.

The objective of this article was to illustrate why using a language or environment that was designed specifically to meet the unique challenges of functional verification offers more productivity and overall quality than yesterday's directed testbenches. These challenges are addressed through easily constrainable random generation, temporal assertions and functional coverage.
However, it is important to note that, although these features are embodied in the HVLs, simply using an HVL does not necessarily mean that they are being used to their full potential. One must distinguish between learning a language and learning how to use it properly. The former is easy. The latter requires several months of experience or suitable advanced methodology training.

--- http://www.isdmag.com
Temporal assertions are the second requirement of an HVL. An assertion is a statement of a property that must be true at all times. A typical software assertion checks that the value of a pointer is not NULL. It is easy to specify simple assertions using HDLs. But complex, overlapping assertions with multiple alternatives over multiple cycles are much more difficult to express, debug and, more importantly, trust.
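As a concrete illustration of that difficulty, here is a hypothetical C++ sketch of a hand-coded checker for the request/grant/retry property described with Figure 2. The signal names, the cycle-by-cycle sampling interface and the timing conventions are all assumptions made for this sketch; in an HVL the property is stated declaratively and the bookkeeping below disappears.

// Hypothetical C++ sketch of a hand-coded checker for a property like the one
// in Figure 2: every request must be followed by a grant or a retry within
// 100 clock cycles unless reset is applied, and a grant or retry must never
// occur without an outstanding request.
// Convention assumed here: the response may arrive in any of the 100 cycles
// *after* the request; a same-cycle response is not counted.
#include <cstdio>
#include <deque>

class ReqGntChecker {
    std::deque<int> pending_;  // cycles remaining for each outstanding request
public:
    // Call once per clock cycle with the sampled signal values.
    void sample(bool reset, bool req, bool gnt, bool retry) {
        if (reset) {                 // reset cancels all outstanding obligations
            pending_.clear();
            return;
        }
        if (gnt || retry) {          // a response retires the oldest request
            if (pending_.empty())
                std::printf("ERROR: grant/retry without a request\n");
            else
                pending_.pop_front();
        }
        // Age the remaining requests; report and drop any that time out.
        for (auto it = pending_.begin(); it != pending_.end(); ) {
            if (--(*it) <= 0) {
                std::printf("ERROR: request not answered within 100 cycles\n");
                it = pending_.erase(it);
            } else {
                ++it;
            }
        }
        if (req)                     // a new request starts a new obligation
            pending_.push_back(100);
    }
};

Even this small sketch had to settle questions the declarative form answers explicitly: whether a same-cycle grant counts, whether overlapping requests are retired in order, and what reset does to obligations already in flight. An HVL assertion states the property once and leaves the thread spawning, timeout tracking and error reporting to the tool.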
Functional and code coverage are complementary. Code coverage measures lines of code, while functional coverage measures data patterns, state sequences and their combinations against stated goals. Code coverage will detect errors of omission ("Did the testbench forget to exercise this line of code?"), whereas functional coverage detects errors of commission ("Does the design work under all possible data value sequences?"). Code coverage can help detect errors in existing code, but functional coverage can help find errors in unimplemented functions.
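As a rough C++ analogue of the kind of coverage measurement Figure 3 expresses in e, the sketch below bins generated MAC frame lengths and reports which stated goals have not yet been met. The bin boundaries and goal counts are illustrative assumptions, not those of the figure.

// Hypothetical C++ sketch of functional coverage of MAC frame lengths:
// count hits per bin and report which stated goals have not been met.
#include <cstddef>
#include <cstdio>
#include <string>
#include <vector>

struct LengthBin {
    std::string name;
    std::size_t lo, hi;   // inclusive length range
    int goal;             // hits required before the bin is considered covered
    int hits = 0;
};

class LengthCoverage {
    std::vector<LengthBin> bins_{
        {"minimum (64)",        64,   64,  10},
        {"short (65-127)",      65,  127,  10},
        {"typical (128-1023)", 128, 1023,  10},
        {"long (1024-1517)",  1024, 1517,  10},
        {"maximum (1518)",    1518, 1518,  10},
    };
public:
    void sample(std::size_t len) {           // call for every generated frame
        for (auto& b : bins_)
            if (len >= b.lo && len <= b.hi) { ++b.hits; return; }
    }
    void report() const {                     // what remains to be verified?
        for (const auto& b : bins_)
            std::printf("%-22s %4d/%-4d %s\n", b.name.c_str(), b.hits, b.goal,
                        b.hits >= b.goal ? "covered" : "NOT covered");
    }
};

int main() {
    LengthCoverage cov;
    for (std::size_t len = 64; len <= 1518; len += 121) cov.sample(len);
    cov.report();
}

Sampling would be called wherever frames are generated or observed; the report plays the role of the Specman Elite screen shot described with Figure 3, telling the team which scenarios the random generator has not yet produced.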
Janick Bergeron, CTO for Qualis Design Corp. (Lake Oswego, Ore.), is the moderator of the Verification Guild (http://janick.bergeron.com/guild/). He has over 15 years of functional verification experience and holds an MASc degree from the U. of Waterloo (Waterloo, Ontario) and an MBA from the U. of Oregon (Eugene).
Copyright © 2002 CMP Media LLC
6/1/02, Issue # 14156, page 28.