Blend of tools needed for verification
With the recent explosion of microprocessor- and DSP-based systems-on-chip (SoCs), designers are finding that waiting until first silicon to verify critical software/hardware interactions can cause costly ASIC respins and time-to-market slips. Commercially available coverification tools provide a partial solution, allowing software to be tested and used to verify parts of the hardware, but the interface is limited to hardware registers accessible on the processor bus or buses. Likewise, ASIC verification tools provide a means to stimulate the pins and buses on the device but do not address the requirement to run actual software as part of system-level verification. What's needed is a blend of these tools that constitutes a virtual platform for complete system-level verification.

Full system-level verification of hardware and software requires a testbench capable of providing a completely synchronized, self-checking environment. This must include mechanisms to coordinate the software test thread executing on the CPU model with the rest of the testbench, which is generating test vectors to the ASIC at the external pin interfaces. Ideally, it would be possible to run all system-level test cases in batch mode, allowing full test regression at key points within the development cycle. This includes register-transfer- and gate-level simulations with HDL CPU cores, bus functional models or C-language processor behavioral models.

As an ASIC design-services company, we have spent the past few years on the design, system integration and verification of a number of microprocessor- and DSP-based SoCs. From that experience we have developed a verification methodology that addresses the need for a complete environment for hardware/software SoC coverification.
This framework allows directed tests to be reused at the module and system level with Verilog and VHDL RTL or gate-level simulations. In a basic framework, the device under test (DUT) is instantiated within the testbench. Each peripheral block within the design is connected to a transactor, which provides a virtual interface to the I/O pins of the device. A CPU model is provided to execute bus cycles. Data can be driven into the device from the bus side via software, and from the external interfaces via the transactors, HDL behavioral models that mimic an attached external device. Likewise, data can be driven by the device to the external interfaces and checked by the transactors, or read and checked internally on the bus side by software.

A transactor is a concurrent machine that executes a series of instructions from a command queue. It can source data to the DUT with the correct protocol, and it can also receive and check data coming from the DUT. In system-level tests, multiple transactors may be running in any given simulation cycle. Commands are defined for each transactor based on the requirements for verifying its peripheral block. For each test, a command-sequence queue is created in a text file, parsed by a Tcl script, then loaded into the simulation. The transactor executes this "program" during a simulation run. Command queues can be quite complex, optionally including conditional branches, wait cycles and the ability to block on various events within the system simulation.

The system bus monitor stores all bus activity. Each bus transaction is stored by the framework environment in a C data structure that each transactor can access via a function call to the environment. In this way transactors can synchronize their test execution to software running on a processor model. For example, a UART transactor may look for the CPU to write to a UART register that enables receive interrupts.
It would block, waiting for the environment to inform it when a certain bit pattern was written to a certain address. When the write occurred, it would then transmit a byte, or a set of bytes, to the UART.

Bus model

A central component of the simulation environment is the bus functional model (BFM). The BFM mimics the CPU, driving the bus with the correct signal protocols for various read and write transactions; it can also detect interrupts. The BFM receives commands to execute bus cycles from software running in another process on the host, across a socket connection. This software can be specialized verification test code or actual production drivers designed to control the ASIC peripherals.

A C++ auto-pointer class allows the software to be written so that it may be compiled for the host system or cross-compiled for the actual target processor. When compiled for use with the BFM, the C++ wrapper overloads the assignment operators so that a read or write to the ASIC register map becomes a function call over the socket connection.

The BFM can be used in place of a CPU core model for much of the initial module and system verification. Software execution speed is optimized because the software is actually executing on the host platform. Much of the software driver development and debug can be done using the BFM, provided there is no requirement to code in assembly language for performance. Functional module- and system-level tests can be done more efficiently using the BFM. Module connectivity checkout, interrupt operation and register-address decoding tests are all best performed with a BFM, saving valuable simulation time.

The platform concept evolved out of the requirement to perform comprehensive SoC verification. The need to execute actual software, as well as to provide a testbench for hardware, demands an integrated approach.
The framework described has been successfully put through its paces on several diverse microprocessor- and DSP-core-based projects. It is currently being used to verify four new ASICs, including two mixed Verilog/VHDL designs where we take advantage of Model Technology's ModelSim mixed-language simulator.

This environment has also been used with some of the commercially available EDA tools, and it is fairly straightforward to integrate a coverification tool with the existing framework: the CPU model supplied with the coverification tool is substituted for the BFM or HDL core model. These models offer cycle accuracy and the ability to execute assembly code with excellent simulation speed. Likewise, verification testbench tools could be substituted for the transactor environment. Recently, Synopsys has provided a means to use the VERA verification tool with its Eaglei coverification product, a marriage that provides a fully integrated tool set. We can expect to see more of this type of tool connectivity as the EDA industry moves to provide more integrated solutions. Multiprocessor SoCs are the future of the embedded marketplace, making the platform approach a necessity.