Languages run verification ecosystem
By Janick Bergeron, Vice President, Technology, Qualis Design Corp., Lake Oswego, Ore., EE Times
October 16, 2000 (3:57 p.m. EST)
URL: http://www.eetimes.com/story/OEG20001016S0057
Over the last few years, design reuse has been touted as the most promising methodology for narrowing the design productivity gap and enabling easier and faster system design. As a result, industry trade groups, standardization bodies, third-party providers and support tools have evolved to create a viable design reuse ecosystem addressing each step of a component life cycle, from authoring through integration to upgrade, with underlying data management and qualification.
But that ecosystem has traditionally focused on the actual design portion of the activity. The same factors that pried open the design productivity gap have also created an even wider verification productivity gap: As design complexity grows, the complexity of the functional verification task rises exponentially. Unless verification is addressed in a design reuse methodology, the greatest opportunity for improving design productivity will be ignored.
If the reuse of design components has helped narrow the design productivity gap, the reuse of verification components will provide similar benefits. In many aspects, verification reuse ought to be simpler than design reuse. Verification is not subject to actual implementation in silicon. It does not face timing, performance, portability or technology constraints. It is not limited to rigid coding styles. But it is that same lack of constraints or limits that creates the greatest challenges for verification reuse.
Design reuse had the benefit of a well-defined design process and established rules. The characteristics of good design practices were known. The complexity of the abstraction level used was limited. Textbooks and formal training classes could be adapted to promote reuse. Yet the verification process is ill defined and does not have industry-accepted rules.
Good or standard verification practices are not widely known or used and the abstraction level of a verification infrastructure can be arbitrarily complex. Few texts or formal training classes exist to help engineers with design verification. The ad hoc approach to the verification task, which is usually seen as a necessary evil to be done as time allows, is a tremendous obstacle to verification reuse and productivity.
The industry acceptance of high-level verification languages may be the precipitating event that sparks the birth and evolution of a verification reuse ecosystem. They provide features helpful to reuse that were not available or easily implemented in traditional HDLs. For example, Verisity's Specman Elite can back-annotate or add constraints to control random data-generation processes without requiring any modifications to the original reused verification code.
This important feature, called extensibility, enables a user to modify the behavior of a (properly written) reusable verification component without affecting the original quality of the component, which always remains as distributed by the provider.
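To sketch what such an extension looks like in the e language, consider the hypothetical packet definition below (the names are illustrative, not taken from any shipped component). A test can layer additional generation constraints onto the reused definition without editing the original file:

    <'
    // Reused component, as distributed by the provider (hypothetical names)
    struct q_packet {
        length  : uint;
        keep length in [1..1024];
        payload : list of byte;
        keep payload.size() == length;
    };

    // Project-specific extension: tighten random generation without
    // touching the original source
    extend q_packet {
        keep length < 64;
    };
    '>

The original struct and its constraints remain exactly as delivered; the extension is simply loaded on top of it in the user's environment.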
Verification components (VCs), not unlike design components, are elements of verification environments that can be reused, unmodified, from one verification environment to another. VCs can be created for standard interfaces to exercise various conditions or exceptions in standard protocols, and they can monitor these interfaces and protocols and report any deviation or violation. Additionally, these components provide an unbiased interpretation of the standard interfaces and protocols. The greatest risk faced by a designer is for a design flaw to go undetected because the testbench is similarly flawed or is not designed to detect the particular flaw.
E Verification
For example, Verisity and Qualis Design provide E Verification Components (EVCs), which are reusable pieces of verification code based on Verisity's E Verification language. They save the verification engineer valuable time in writing and rewriting the verification environment for a standard interface. Just as there is no value in designing your own PCI, USB or ATM interface block, there is no value in designing your own PCI, USB or ATM verification infrastructure.
Verification environments must be designed to take maximum advantage of available reusable verification components and to maximize the production of verification components that will be reusable. Not all components of a verification environment will be reusable. But the portions implementing industry or proprietary standard protocols and interfaces can clearly be reused. Some functionality common to many verification environments, such as scoreboarding, may also be reusable. Scoreboarding is a common data-checking technique often used in communication applications.
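In its simplest form, a scoreboard keeps a list of expected items added by the stimulus side and checked off by the monitor side. The e sketch below uses hypothetical names and omits details, such as ordering rules and end-of-test leftover checks, that a production component would need:

    <'
    struct q_sb_item {
        addr : uint;
        data : uint;
    };

    unit q_scoreboard {
        expected : list of q_sb_item;

        // Called by the driver side when a transaction is injected
        add_expected(item : q_sb_item) is {
            expected.add(item);
        };

        // Called by the monitor side when a transaction is observed
        check_received(item : q_sb_item) is {
            var idx := expected.first_index(it.addr == item.addr);
            if idx == UNDEF {
                dut_error("unexpected item at addr ", item.addr);
            } else {
                check that expected[idx].data == item.data;
                expected.delete(idx);
            };
        };
    };
    '>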
Creating reusable verification components requires an investment in time and resources above and beyond what is required to create single-use testbench components. When creating reusable verification components, there are a few guidelines to remember. To begin with, for something to be reusable, it must be useful. Components should be implemented to be usable in any environment; they should not imply or require a particular architecture, nor place undue restrictions on their use.
To minimize the investment in making a verification component reusable, only include features that are immediately useful. Features that merely seem attractive for a future project may turn out to be wasted effort, make the component more difficult to use, or become a source of functional bugs.
The global name space in any environment is a precious collective resource. It is necessary to minimize the impact a verification component will have on it to reduce the likelihood that a name collision will prevent the concurrent use of different verification components. Each language puts different pressure on the global name space. For example, struct, unit and enumerated type names are global in the e language. When writing a reusable verification component, one must be careful to ensure their uniqueness. All global symbols in Qualis' EVCs are prefixed with "Q_" to prevent potential collision with other components of the verification environment.
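For illustration (the names are hypothetical, not actual Qualis declarations), the global names contributed by a USB component might all carry the prefix:

    <'
    // Enumerated type and struct names are global in e, so both are prefixed
    type Q_usb_pid : [Q_OUT, Q_IN, Q_SETUP];

    struct Q_usb_packet {
        pid  : Q_usb_pid;
        addr : uint(bits: 7);
    };
    '>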
Plan for the possibility of using multiple instances of the verification component in the same verification environment. These different instances may have different configurations as well. It must be possible to connect these different instances to different interface pins with different naming conventions or nomenclature. Hard-coded interface signal names are definitely out of the question. An alternative would be to use compiler symbols to define the leaf HDL names at compile time. However, these compiler symbols are also global in nature and will interfere with other instances of the same component. A better solution is to use instance-specific interface signal names in each instance of a component. Verification languages support instance-specific interfaces with varying degrees of succinctness. For example, in the e language, they can be implemented as string-type data members and computed HDL names.
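A minimal sketch of that technique in e, with hypothetical unit and signal names: the signal name is an ordinary string field, dereferenced through a computed HDL name, so each instance is bound to its own pins by the enclosing environment:

    <'
    unit q_bus_driver {
        // Instance-specific signal name, supplied where the unit is instantiated
        data_sig : string;

        drive(val : uint) @sys.any is {
            '(data_sig)' = val;   // computed HDL name resolves per instance
            wait cycle;
        };
    };

    extend sys {
        drv_a : q_bus_driver is instance;
        keep drv_a.data_sig == "top.port_a_data";

        drv_b : q_bus_driver is instance;
        keep drv_b.data_sig == "top.port_b_data";
    };
    '>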
Bottom-up approach
When implementing a structure of verification components, do so in a bottom-up fashion. It is usually possible to access an instantiated object from anywhere in the environment through absolute references. It is thus possible for objects of different types to refer to each other. This makes their implementation interdependent and complicates the order in which they can be loaded or specialized. Keeping references local and building upward from self-contained, lower-level components avoids these interdependencies.
Implement utility functions at similar transaction levels through extensions. In a particular environment, it may be useful to modify the available transaction interfaces or create new ones to better match the requirements and minimize repetitive code.
Also, implement utility functions providing higher-level transactions using a higher-level unit or struct. Repetitive low-level transactions that implement a higher-level function can be encapsulated into a higher-level transaction. This raises the level of abstraction of the test implementation and reduces code maintenance and volume. High-level transaction functions should not be tied to particular low-level transactions: a different low-level transaction could be substituted while maintaining the functionality of the higher-level transaction.
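As a sketch (hypothetical names, with a stand-in for the reused low-level driver), a block-write utility can be layered on a reused write() transaction in the project environment:

    <'
    // Stand-in for the reused component's low-level transaction; a real
    // component would drive actual bus signals here
    unit q_cpu_driver {
        write(addr : uint, data : uint) @sys.any is {
            outf("write addr=0x%x data=0x%x\n", addr, data);
            wait cycle;
        };
    };

    // Higher-level transaction layered on top through an extension; only
    // this method would change if a different low-level transaction
    // were substituted
    extend q_cpu_driver {
        write_block(base : uint, data : list of uint) @sys.any is {
            for each (d) in data {
                write(base + index, d);
            };
        };
    };
    '>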
Provide hook methods to extend the functionality of reused methods. Hooks, such as the pre_generate() and post_generate() methods predefined in each Specman Elite object, let the user properly extend the functionality of a verification component in a well-controlled fashion. This approach gives the user the opportunity to modify, record or display each object at appropriate points in the generation, transformation or reception without having to modify the original reused code.
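Continuing with the hypothetical q_packet from the earlier sketch, a user extension of the predefined post_generate() hook could look like this; the component's own generation code still runs unmodified:

    <'
    extend q_packet {
        // Runs after each packet is generated, in addition to whatever the
        // reused component already does in post_generate()
        post_generate() is also {
            outf("generated packet, length=%d\n", length);
        };
    };
    '>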
The ultimate success of verification reuse, just like the success of design reuse, rests with the user. For example, if verification environments are built as monolithic entities it will be impossible to extract functionality that could be subsequently used in another environment. Nor would it be possible to make use of available verification components.
The design of suitable reusable verification components is not a trivial matter. Just as with reusable design components, it requires an investment of additional effort. Reusable verification components must be able not only to function correctly under various configuration options, but also to inject errors and violate protocols to verify that the design under verification properly handles invalid situations. Similarly, they must be able to appropriately handle protocol and interface violations without hanging or corrupting the remaining data sequences.
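One common way to expose such error-injection capability, sketched below with hypothetical fields added to the earlier q_packet, is a generation knob that is off by default but can be constrained on by a test (for example, keep inject_crc_error == TRUE) to produce invalid traffic:

    <'
    extend q_packet {
        crc : uint(bits: 8);

        // Error-injection knob: off by default, constrained on by a test
        inject_crc_error : bool;
        keep soft inject_crc_error == FALSE;

        // In a real component the CRC would first be computed from the
        // payload; here the hook simply inverts it when injection is
        // requested, so the environment can check that the design under
        // verification flags the bad packet
        post_generate() is also {
            if inject_crc_error {
                crc = ~crc;
            };
        };
    };
    '>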
Verification reuse has greater potential to improve the productivity of verification engineers than design reuse had for design engineers. All the necessary conditions for a viable verification reuse ecosystem to thrive are present in the industry. It is up to designers and their managers to make the required shift and reap its benefits.