Transactions Take Object-based Route
Today's system designs are almost always centered on processing complex transactions, but verifying such designs at the level of individual signals, bits and bytes with Verilog or VHDL testbenches is rapidly becoming impractical. These transaction verification tasks are best handled in a more robust software environment with object-oriented approaches. Object-oriented techniques and higher-level programming languages are a natural fit for transaction-based designs and give the verification engineer tremendous power, flexibility and extensibility that isn't available using typical procedural programming languages such as Verilog and VHDL.

The ideas and techniques described below have been used and refined on a number of projects on which I have worked over the past three years. They were implemented in Specman Elite, from Verisity Design Inc., and Denali Memory Modeler, from Denali Software Inc.

Transaction-based verification abstracts the verification problem to a higher level, where the specific behavior of individual signals, bits and bytes is modeled and hidden inside methods, so that the functional behavior of the higher-level operations (transactions) can be verified more effectively. In the case of a memory controller, a transaction equates to a memory burst read or write. In a networking scenario, a transaction might be the transmission or reception of one or more frames, cells or packets. For a CPU, a transaction might be an instruction executed by the microprocessor.

Lists of transactions can be built to create sequences, lists of sequences can be built to create scenarios, and lists of scenarios can be used to make test cases (a short e sketch of such a hierarchy appears at the end of this section). By continuing to abstract upward from the lowest level, we can create the classes necessary for verifying more complex systems. The verification of individual signals, bits and bytes is still important, but that mundane task is hidden from the typical user by the more robust software environment. Using object-oriented techniques and programming languages to accomplish the task gives the verification engineer power, flexibility and extensibility advantages that are not available using such procedural programming languages as Verilog, VHDL or C.

Data-driven verification raises the abstraction level, with respect to data grouping and data addressing, in dealing with data that is often stored in semiconductor memories. Traditional memory models and verification techniques deal with absolute addresses of individual memories and don't consider the integration of many memories into the system address space; that creates more complexity and work for the verification engineer. Traditional models likewise deal with bytes or N bits of data as physical memory and do not address the situation where, for example, eight 8-bit memories are used to create a logical 64-bit memory. Data-driven verification does address those situations, along with other desirable features that allow the verification engineer to ensure the data's correctness while maintaining a high-level, abstract view of the memory in question (Fig. 1).
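To make that transaction-to-scenario hierarchy concrete, here is a minimal e sketch. All of the type, struct and field names (mem_op, mem_transaction, mem_sequence, mem_scenario, burst_len) are illustrative assumptions, not code from the projects described in this article:

<'
// One memory transaction: a single burst read or write.
type mem_op: [READ, WRITE];

struct mem_transaction {
    op        : mem_op;
    addr      : uint;
    burst_len : uint;
    keep burst_len in [1..8];          // constrain bursts to legal lengths
};

// A sequence is simply a list of transactions ...
struct mem_sequence {
    transactions: list of mem_transaction;
    keep transactions.size() in [1..16];
};

// ... and a scenario is a list of sequences; test cases are then
// built from lists of scenarios in the same fashion.
struct mem_scenario {
    sequences: list of mem_sequence;
};
'>

Because every field in e is generated randomly by default, simply instantiating a mem_scenario yields a constrained-random stream of bursts.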
Data-driven verification gives the user the ability to do backdoor reads, writes and loads; inject errors on the bus to test parity and ECC logic; mimic faults for testing BIST; create data-driven assertions that trap erroneous transactions to the memory system as they occur; and create data structures and linked lists that let the user view complex data transformations more easily. With those capabilities at the user's fingertips, a simple but complete verification strategy can be developed around the memory subsystem.

Object-oriented programming has become a mainstream software-engineering practice. It advocates design extensibility, reusability and modularity. As digital systems have become more complex, applying these techniques to functional verification environments has become more practical and desirable. For this task, Verisity's Specman Elite ("e") language is a powerful and effective tool for modeling system behavior, creating complex object-oriented (OO) testbench environments and generating complex stimuli.

The OO-based e language combines the best features found in such languages as Verilog, VHDL, C, Java, Smalltalk and C++ and makes them more applicable to hardware description and functional verification. The Verisity language also introduces some concepts not found in any of the preceding languages, but discussion of those concepts is beyond the scope of this article. e also provides the common characteristics generally expected of an object-oriented language.

Three key features make e an effective OO language for hardware verification. First is the ability to describe time, temporal events and complex temporal relationships. Second is seamless support for concurrency of execution, a key feature needed by any hardware-description language. Finally, e includes a generator that acts as a complex constraint solver; the generator is the core technology used to provide stimulus to the device under test, and it generates all random variables in the system based on constraints that can be applied in a top-down manner.

Additional support for complex temporal expressions makes protocol checking even easier than in Verilog or VHDL. A temporal expression is a combination of events and temporal operators that describes behavior: it expresses temporal relationships among events, values of fields, variables or other items during a test. For example, a temporal expression in e can easily describe a request that must be acknowledged within a fixed number of cycles (one such check appears in the sketch after this section).

To handle concurrency, e provides the equivalent of Verilog "always" blocks as well as the functionality of fork/join blocks. Describing hardware or bus-functional models is thus as simple and elegant as with traditional HDLs, and much easier and more elegant than with C++ or Java. Multiple threads can be executed in parallel, and temporal expressions and semaphores can be used for synchronization between threads and objects.

The most striking and useful feature of e for the verification engineer is its generator. By default, all variables in e are random, but directives are available that exclude particular variables from random generation. Further, constraints using the "keep" command can be applied to limit the scope of random variables; for example: keep packet.length in [64..128], keep packets.size() > 100, or keep transaction.addr < MAX_ADDR.
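As an illustrative sketch (the clk, req and ack events, the HDL signal paths and the timing window are assumptions, not from the article), the fragment below combines the two ideas just described: "keep" constraints that steer the generator, and a temporal expect that checks a request/acknowledge handshake:

<'
struct packet {
    length: uint;
    keep length in [64..128];          // the article's own constraint example
};

extend sys {
    packets: list of packet;
    keep packets.size() > 100;         // likewise from the article

    // Hypothetical protocol check: every req must be followed by an
    // ack within 1 to 5 clock cycles.
    event clk is rise('top.clk') @sim;
    event req is rise('top.req') @clk;
    event ack is rise('top.ack') @clk;

    expect req_gets_ack is @req => {[1..5]; @ack} @clk
        else dut_error("req was never acknowledged");
};
'>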
Using these object-oriented features, the verification engineer can build a complex hierarchy of classes that abstracts minute details away from the test writer, allowing the user to concentrate on creating robust test cases rather than on bookkeeping. With a hierarchy of object-oriented abstractions, the engineer can build up from simple reads and writes to sequences; sequences can in turn be raised to the next level to create complex scenarios, which can represent packets or other abstract system-level information. Combining abstractions of timing, concurrency and generation with system-level abstraction lets the engineer rapidly describe how the external world uses the system.

Memories are ubiquitous in complex digital systems and account for the majority of transistors in most designs. Granted, most memory and ASIC vendors provide some basic Verilog memory model for their external or embedded memories. But replacing those memories with something faster, more feature-rich and more robust is one of the first items I tackle when constructing a verification environment. For this task, Denali Memory Modeler provides the features and functions needed for transaction- and data-driven verification.

Denali memory objects are modeled in C and link to the simulation through PLI or OMI. In addition to a sparse memory implementation for efficient memory usage during simulation, the C-based modeling paradigm enables a rich set of features for transaction- and data-driven verification. Key verification features are implemented in C, and extra simulation cycles are eliminated for common controls such as loading, saving and comparing memory images from a file, or combining heterogeneous memories into a single system memory view using combinations of width expansion, depth expansion, interleaving and masking of physical memories.

Raising the level

All of these techniques and functions are globally accessible and applicable across all supported memory types. The features can be accessed with a GUI, TCL, Verilog PLI calls or C functions and are well-adapted to the object-oriented programming paradigm.

Denali and Verisity have integrated Memory Modeler and e to provide users with a more complete verification environment for memory subsystems. Inside e, users gain access to the data-driven verification techniques in Denali memory models by instantiating one e Denali object for each physical or system memory and then seamlessly accessing all of the features through the class' methods. The simple class definition provided through this interface makes it possible to treat a memory model like any other object in the e language. For example, loading a memory with an object file requires only a simple method call, as illustrated in Fig. 2 and sketched below.

Denali models also provide a C-language interface that offers many features to Specman, Vera, TCL and C users. Chief among those features is the memory event callback: every time a transaction occurs to the memory, a method call can be issued to e informing the user of the instance, address and operation type just performed. Employing memory event callbacks in e can yield powerful new checks that are difficult to implement otherwise.
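The fragment below sketches that usage pattern under stated assumptions: the denali_mem unit, its instance_name field and its load() method are hypothetical stand-ins, since the article does not spell out the integration's actual class definition:

<'
// Hypothetical stand-in for the e class that wraps one Denali
// memory instance; the real integration's names may differ.
unit denali_mem {
    instance_name: string;             // HDL path of the memory instance

    load(filename: string) is {
        // Back-door load of a memory image -- stubbed here; the real
        // model performs the load without consuming simulation cycles.
        out("loading ", filename, " into ", instance_name);
    };
};

extend sys {
    // One e Denali object per physical or system memory.
    data_mem: denali_mem is instance;
    keep data_mem.instance_name == "top.u_ddr0";

    run() is also {
        // Loading a memory with an object file is a single method call.
        data_mem.load("images/init_data.hex");
    };
};
'>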
How is this accomplished? In the example shown in Fig. 3, which is from a DDR SDRAM controller testbench, preexisting read() and write() methods in e are extended so that after each transaction is performed, the instance, logical address and data of the transaction are posted to a scoreboard. Each entry represents an expected event in the memory.

The memory modeler's callback method is extended to watch the memories' callbacks. If the event type is read or write, the extension checks whether the scoreboard contains a matching transaction. If it does not, a device-under-test (DUT) error is assumed and the test is halted; this check catches spurious writes or reads to memory created by the DUT. If a matching transaction is found in the scoreboard, logical-to-physical address translation is performed, and the contents of the physical memory location are read and compared with the expected values. If the data does not match, a DUT error is assumed. Finally, at the end of the test, the scoreboards are verified to be empty, indicating that no transactions were lost. A sketch of this checking flow appears below.

For the DDR SDRAM controller that I designed, the above techniques revealed several subtle bugs that would have been nearly impossible to detect using conventional techniques. The methods also provide a convenient way of injecting errors into a memory subsystem and verifying error-correction and error-detection logic that would not be feasible using traditional memory models and verification approaches.
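The following e fragment is a simplified, hypothetical reconstruction of that flow, building on the illustrative mem_op and denali_mem types from the earlier sketches; the scoreboard fields, the on_mem_event hook and the width-expansion math are assumptions, not the actual testbench code:

<'
// Hypothetical scoreboard entry: one expected memory transaction.
struct sb_entry {
    op           : mem_op;             // READ or WRITE
    logical_addr : uint;
    data         : uint(bits: 64);
};

extend sys {
    scoreboard: list of sb_entry;
};

unit master_bfm {
    // In the real testbench, preexisting read()/write() methods are
    // extended with "is also"; write() is shown directly here.
    write(addr: uint, data: uint(bits: 64)) is {
        // ... drive the bus-level burst (omitted) ...
        sys.scoreboard.add(new sb_entry with {
            .op = WRITE; .logical_addr = addr; .data = data;
        });
    };
};

extend denali_mem {
    // Illustrative width expansion: one 64-bit logical word occupies
    // the same depth offset in each of eight 8-bit physical memories.
    logical_to_physical(logical_addr: uint): uint is {
        result = logical_addr / 8;
    };

    // Hypothetical hook invoked on every memory event callback.
    on_mem_event(op: mem_op, logical_addr: uint,
                 data: uint(bits: 64)) is {
        var idx: int = sys.scoreboard.first_index(
            .op == op and .logical_addr == logical_addr);
        if idx == UNDEF then {
            // No matching expectation: a spurious DUT read or write.
            dut_error("unexpected ", op, " at addr ", logical_addr);
        } else {
            // Translate and (in the real flow) read back the physical
            // location; here the callback's data is compared directly.
            var phys_addr: uint = logical_to_physical(logical_addr);
            check that data == sys.scoreboard[idx].data else
                dut_error("data miscompare at addr ", logical_addr);
            sys.scoreboard.delete(idx);
        };
    };
};

extend sys {
    // At the end of the test the scoreboard must be empty; a leftover
    // entry means an expected transaction was lost.
    check() is also {
        check that sys.scoreboard.is_empty() else
            dut_error("scoreboard not empty: transactions lost");
    };
};
'>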
---

Sean W. Smith's extensive background in system design verification, most recently at IBM, landed him a post as a verification engineer at Cisco Systems Inc. (Research Triangle Park, N.C.). Smith holds a BS in computer engineering from Louisiana State University (Baton Rouge).

Copyright © 2002 CMP Media LLC