SystemC Verification Library speeds transaction-based verification
By Leonard Drucker, EEdesign
February 24, 2003 (11:44 a.m. EST)
URL: http://www.eetimes.com/story/OEG20030214S0042
New chip development cycles have decreased to a year, and the time to create a derivative has shrunk to six months. How can chip verification, which takes up 50 to 70 percent of today's development cycle, keep pace? One way to reduce development time is to reuse the verification environments created in other domains, such as the system domain. These environments generally are created in C++ and traditionally have been inaccessible to digital-verification engineers. With the acceptance of SystemC and the SystemC Verification Library, it is now feasible to reuse these environments to save time.

For designs to be easily upgraded and used for multiple purposes, they must be easy to configure. That essential configurability cannot be delivered without a large verification effort to test each configuration. Compound the need to test all configurations with the many standard interfaces needed to complete a design (an MP3 player, for example, needs a USB interface, a Firewire interface, and others) and the possible interactions of all those configurable interfaces, and the verification effort grows exponentially.

Constrained random testing reduces the verification effort to a more manageable problem by allowing controlled random configurations and data to stress the system in ways an engineer writing directed tests would never imagine. While constrained random testing increases the amount of verification that occurs, it obscures the determination of what was tested, known as functional coverage. Because stimulus is randomly created, it is difficult to determine before a test begins how many functional operations, such as reads and writes, will be executed. Therefore, to establish "exit criteria" for verification, which indicate when verification is complete, the amount of functionality exercised by each test in the regression suite must be measured during or after simulation. This measurement is called transaction-based functional coverage. It is made using transactions: transfers of control or data across an interface. The transaction level is a good level of abstraction at which to gather functional metrics because it is the level where functionality is exercised. So why not make a small expansion of the test-generation effort to capture transaction information that can be used to measure which functionality was exercised?

The new SystemC Verification (SCV) Library, adopted by the Open SystemC Initiative (OSCI) last October, defines a standard way to capture transaction-level information that can be used to create functional coverage metrics. What's more, it allows reuse of the transaction-level generation code.

SystemC verification usage

The code sample below shows the need for these capabilities in verifying complex systems. Data structures are a very important element in creating stimulus; they are needed to create complex, real-world stimulus. The code describes a transfer on a simple, pipelined bus. While a plain structure that contains an address and data is easy to create in register transfer level (RTL) code, the inclusion of random generation is not. This structure uses the scv_constraint C++ macros and classes to define limits on the random generation of address and data, as well as relationships between the values of these normally independent generators. Note that the addr variable is limited to the ranges 10-50 and 200-250, while data is limited to lie between addr-5 and addr+20. The simplicity and compactness of the SCV standard makes it very powerful.
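The original listing did not survive in this copy of the article; what follows is a minimal sketch consistent with the description above, assuming the OSCI SCV 1.0 constraint API (scv_smart_ptr, scv_constraint_base and the SCV_CONSTRAINT macros). The class and instance names are illustrative:

    #include <iostream>
    #include "scv.h"

    // A pipelined-bus transfer with constrained-random address and data:
    // addr must fall in [10,50] or [200,250], and data must track addr.
    class bus_transfer : public scv_constraint_base {
    public:
        scv_smart_ptr< sc_uint<32> > addr;
        scv_smart_ptr< sc_uint<32> > data;
        SCV_CONSTRAINT_CTOR(bus_transfer) {
            SCV_CONSTRAINT( (addr() >= 10  && addr() <= 50) ||
                            (addr() >= 200 && addr() <= 250) );
            SCV_CONSTRAINT( data() >= addr() - 5 &&
                            data() <= addr() + 20 );
        }
    };

    // Usage: each next() call invokes the constraint solver and picks
    // a fresh legal address/data pair.
    void generate_transfers() {
        bus_transfer xfer("xfer");
        for (int i = 0; i < 10; i++) {
            xfer.next();
            std::cout << "addr=" << xfer.addr->read()
                      << " data=" << xfer.data->read() << std::endl;
        }
    }

Because every run can re-seed the random generation, each regression pass exercises new legal values without any change to the testbench source.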
The transactor code, sketched at the end of this section, implements a pipelined bus prevalent in many designs. It shows concurrency, abstract data structures and constrained randomization. This quick test allows the data to be randomized in a constrained fashion, which is necessary to cover the large number of possible values for even this simple data structure: 2^32 addresses and 2^32 data values. Randomly selecting these values allows a compact testbench to be written. In addition, if the values are randomly selected for every simulation run (or regression run), new values will be tried. If, however, random values are constantly selected, how can it be determined whether the simulations were executed long enough to cover the possibilities? This is where functional coverage comes into play.

While constrained randomization improves the efficiency of a set of tests, it also makes coverage measurement difficult because the data created changes from test to test. A mechanism must be created to capture the data from test to test, to see whether functional goals are still being met. This is done by capturing the data in a transaction.

Transaction-based functional coverage techniques with SCV

Transaction-based functional coverage is becoming a reality with the advent of the transaction recording and callback schemes inherent in SCV. The flow of using the recording capabilities of SCV is relatively straightforward: statements are simply added around current code. Three types of statements can be added: database creation, stream creation, and transactor (generator) creation statements.

Database creation statements are self-explanatory. These statements open a database to which transaction results are written. The SCV standard defines a text file for this database; commercial tools, such as the Cadence verification tools, create a database in their SimVision environment. An example of this code is:

    scv_tr_db db("database");

Stream creation statements define an object on which to attach transactions. A signal, more familiar to design and verification engineers, is a real piece of hardware that has a hierarchical path in the design. A transaction is a concept that has no hardware equivalent, so streams were created as a way to tie in these meta-objects. The format for creating a stream is:

    scv_tr_stream stream("abc", "transactor");

Transactor (generator) creation statements define the transactions that will be created and attached to a particular stream. The format for these statements is:

    scv_tr_generator< sc_uint<32>, sc_uint<32> > read_gen("read", stream, "addr", "data");

Adding transactions to existing code is straightforward; applying these concepts to the transactor code is shown below. The instrumented code writes a transaction record to the database, in this case a text file, for every write and every read that occurs during simulation. Notice that the SCV tools automatically add the fields of the args data structure into the transaction. This information can now be used to measure functional coverage: transactions can be used on the fly for functional coverage analysis, as well as for post-processing functional coverage analysis.

New methods of verification need to be created to maintain the speed and efficiency necessary to keep up with fast design cycles. In addition to reusing verification environments from other domains, more efficient verification techniques need to be employed.
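The transactor listing itself is not preserved here; the sketch below shows the shape of the instrumentation under the same OSCI SCV 1.0 API assumptions, with a simplified read loop standing in for the full pin-level protocol:

    #include "scv.h"

    int sc_main(int argc, char* argv[]) {
        scv_tr_text_init();                 // select the SCV text-file recorder
        scv_tr_db db("database");           // database creation

        // Stream and generator creation for the pipelined-bus transactor.
        scv_tr_stream stream("pipelined_bus", "transactor", &db);
        scv_tr_generator< sc_uint<32>, sc_uint<32> >
            read_gen("read", stream, "addr", "data");

        // In the real transactor these calls bracket the bus protocol of
        // each operation, so every read becomes one recorded transaction.
        for (unsigned i = 0; i < 4; i++) {
            sc_uint<32> addr = 10 + i;
            scv_tr_handle h = read_gen.begin_transaction(addr);  // records addr
            sc_uint<32> data = addr + 5;    // stand-in for data returned by the bus
            read_gen.end_transaction(h, data);                   // records data
        }
        return 0;
    }

A write generator is declared the same way ("write" in place of "read"), and the resulting text database lists every transaction with its attributes, which is the raw material for coverage analysis.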
Techniques being targeted include gaining simulation speed by using transaction-level modeling (TLM) at the system level, using transaction-based techniques to improve efficiency in test development, and using assertions and transactions for better functional coverage measurement.

Application to system-on-chip (SoC) test cases

Verification issues are more pronounced in an SoC than in a traditional ASIC because of the large number of standard interfaces usually found in this type of device and the software interaction that creates greater complexity. For example, consider a video system that takes a packetized JPEG image from an Ethernet interface, decrypts and decodes the image, and sends the result to a VGA display. The SoC contains multiple processors, multiple memories, and multiple interfaces. While verifying a traditional ASIC is a difficult task in itself, the extended verification issues of an SoC also include interaction with the controller software, the handling of interrupts, and the parallel processing that occurs. To prove that the verification is occurring, transactions must be used.

Conclusion

With the advent of the open-source SystemC Verification Library, it's time to incorporate these techniques in every product design cycle. Using this library will improve the speed, efficiency, unification, and coverage of transaction-based verification to help designers keep pace with shorter product lifecycles.

As senior Core Competency technical manager for the Systems Verification group at Cadence Design Systems, Leonard Drucker is instrumental in product definition and strategy, helping shape and define verification technologies with product management, marketing and sales. Prior to Cadence, he served as an independent consultant, focusing on digital verification. His extensive design verification and simulation background also includes key positions at EDA companies Zycad, Solbourne, Cadnetix and Medtronic.
The SCV Library, posted on the OSCI Web site since last October, addresses the verification community's need for ways to handle these complexities. The major verification capabilities added on top of SystemC are constrained random testing, complex constraint solvers, data-structure creation and concurrency.
Traditional coverage techniques are not up to the task of determining the functionality exercised by a constrained-random test environment. Traditional techniques have relied on directed tests: if the tests passed, the functionality was exercised. With constrained random tests, data is randomly generated, making it difficult to know beforehand specifically what will be tested, so new measurement techniques have had to be created.
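One such technique, sketched below on the assumption that the callback interface of the OSCI SCV 1.0 release (scv_tr_handle::register_class_cb and its callback_reason argument) behaves as in that release, is to count each kind of transaction as it completes and compare the tallies against per-operation goals:

    #include <map>
    #include <string>
    #include "scv.h"

    // On-the-fly functional coverage: tally completed transactions by kind.
    static std::map<std::string, int> coverage_counts;

    static void coverage_cb(const scv_tr_handle& h,
                            scv_tr_handle::callback_reason reason,
                            void* /*user_data*/) {
        if (reason == scv_tr_handle::END) {
            // Key the tally by the generator's name ("read", "write", ...).
            coverage_counts[h.get_scv_tr_generator_base().get_name()]++;
        }
    }

    // Call once during testbench setup, before the test runs.
    void install_coverage_cb() {
        scv_tr_handle::register_class_cb(coverage_cb);
    }

At the end of simulation, the tallies show whether each functional operation was exercised often enough to satisfy the exit criteria; because the counts are keyed by generator name, one callback serves every stream in the design.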
An SoC is generally described as a device that contains a processor, memory, and peripherals. Today's chips are, more often than not, SoCs, a trend that fosters greater product flexibility.
Verifying electronic designs is becoming more complex, especially as more companies move to SoC devices. Traditional verification techniques cannot offer the quality required to reduce the risk of this high-stakes manufacturing gamble. Transaction-based functional coverage techniques are becoming mandatory for improved quality measurement of verification tests.