Cycle Accuracy Analysis and Performance Measurements of a SystemC model
Aniruddha Baljekar, Philips Semiconductors
Bangalore, INDIA
Abstract:
Models at higher levels of abstraction (e.g. SystemC models) are required for driver development, tuning of system software, architectural exploration and verification, performance analysis, power measurement, etc.
While SystemC models suffice for driver development and architectural exploration, cycle-accurate models are required for performance analysis, architectural verification, tuning of application code and power analysis. The current challenge is that models developed at higher levels of abstraction are not cycle accurate. For the use cases that do require cycle accuracy, the alternative is to use emulators, which are very costly and become available only at a late stage in the development cycle, when any changes are too late to incorporate.
The scope of this paper is to present the methodology adopted to address cycle accuracy and performance measurement of a SystemC model. These are initial steps, which provide direction for the measurement of cycle accuracy.
Follow-up work will benchmark the SystemC models against RTL as a golden reference model and fine-tune them accordingly.
1. Objective and approach towards achieving it
The objective of this paper is to show the steps taken towards measuring the cycle accuracy of a SystemC model. For this, the SystemC model of a memory controller has been used; this model is not cycle accurate. The verification setup uses Specman and AXI TLM eVCs; its details are given in section 5. The testcases executed on the RTL are also executed on the SystemC model for the purpose of logical comparison. The testcases have to capture specific operations on both the RTL and SystemC models, in order to facilitate comparison and detailed checks on cycle accuracy.
2. Glossary
AXI | Advanced eXtensible Interface (part of ARM's AMBA specification)
DUT | Device Under Test
eVC | "e" Verification Component
PV | Programmer's View
PVT | Programmer's View with Timing
RTL | Register Transfer Level
SCV | SystemC Verification library
TLM | Transaction Level Modelling
3. Definition / Terminologies / Tools
Specman | Functional verification tool from Cadence. It automates test generation and captures functional coverage. It uses the "e" verification language to capture rules from executable specifications and uses this information to automate verification.
eVC | Ready-to-use, configurable verification environment from Cadence, typically focusing on a specific protocol or architecture. It consists of a complete set of elements for stimulating, checking and collecting coverage information for that protocol or architecture. One can apply an eVC to the DUT to verify its implementation.
Scoreboarding | Checking of data to verify that the output data items collected from the DUT match the corresponding data items injected into the DUT
Cadence IUS | Simulator from Cadence |
TxE | Transaction eXplorEr from Cadence |
Levels of Abstraction | Transaction level (PV and PVT SystemC models) and signal level (RTL models)
4. Introduction
Motivations for this work:
Cycle Accurate models are required to perform:
- Architecture verification
- Performance analysis
- Architectural exploration
- Tuning of Application code
- Power estimation
This paper addresses how to measure the cycle accuracy of SystemC models and how to perform performance analysis of such a model.
Approach taken to address this issue:
- A SystemC PVT model of a memory controller is the DUT for this experiment.
- Signals in the RTL model of the memory controller that are important for cycle-accuracy checking were identified.
- SystemC function calls were mapped to these specific signals.
- A testbench was designed to check the functionality of the SystemC model, re-using the testcases used to check the functionality of the RTL DUT.
- Directed testcases were written to target specific functionality.
- Traces for both the RTL and the SystemC DUT were captured.
- The traces were analysed to fine-tune the SystemC model.
- The deviation between the RTL and SystemC models was determined; this serves as a benchmark with respect to the RTL design.
- The SystemC model was fine-tuned and benchmarked against the RTL design.
5. Verification environment
The verification environment uses Specman to verify the DUT (the SystemC design). It re-uses the testbench developed to verify the DUT at the RTL level, enabling re-use of the testbench across the different levels of abstraction of the DUT (RTL and SystemC).
The block diagram of the verification setup for the DUT is shown separately for the RTL design and for the SystemC design.
The verification uses the following eVCs:
- AXI signal level (for RTL) and TLM level eVC (for SystemC) from Cadence
- SRAM signal level eVC (for RTL DUT) and TLM level eVC (for SystemC DUT).
Virtual sequences, using the interfaces and eVCs specific to the abstraction level of the DUT, achieve re-usability of the testbench across the different levels of abstraction.
Waveform traces are recorded at both levels of abstraction. For SystemC, transactions are recorded using the transaction-recording API provided by SCV. These traces are used for checking the latency and cycle accuracy of the SystemC design vis-à-vis the RTL design. Executing identical tests at both levels of abstraction facilitates analysis and one-to-one comparison of the transactions of the RTL and SystemC designs.
Re-use of the testcases also saves the effort of re-writing the test scenarios for checking the functionality of an IP at different levels of abstraction.
6. Traces
The comparison between RTL traces and SystemC function calls is shown in the figures below. The SystemC function calls are mapped to the relevant RTL signals.
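As a minimal sketch of such a comparison (the `TxRecord` structure and function names are illustrative assumptions, not from the paper's tooling), the per-transaction latency deviation between the RTL trace and the SystemC trace can be computed as follows. Because identical tests are run at both levels, the two traces contain matching transactions that can be compared index by index:

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Hypothetical trace record: the clock cycles at which a recorded
// transaction (e.g. an AXI read burst) starts and ends.
struct TxRecord {
    unsigned long start_cycle;
    unsigned long end_cycle;
};

// Per-transaction latency in clock cycles.
inline long latency(const TxRecord& t) {
    return static_cast<long>(t.end_cycle - t.start_cycle);
}

// Average absolute latency deviation (in cycles) between matching
// transactions of the RTL trace and the SystemC trace.
double average_deviation(const std::vector<TxRecord>& rtl,
                         const std::vector<TxRecord>& sysc) {
    assert(rtl.size() == sysc.size());  // identical tests => same transactions
    if (rtl.empty()) return 0.0;
    long total = 0;
    for (std::size_t i = 0; i < rtl.size(); ++i)
        total += std::labs(latency(rtl[i]) - latency(sysc[i]));
    return static_cast<double>(total) / static_cast<double>(rtl.size());
}
```

A deviation of zero across all directed testcases would indicate cycle accuracy for the covered scenarios; a non-zero deviation quantifies how far the SystemC model is from the RTL reference.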
7. Performance measurement parameters
In the case of the SRAM controller SystemC model, the performance measurement parameters were chosen to be the bandwidth and the efficiency of the controller.
Bandwidth: The total number of transactions (read and write) that the controller performs in the total simulation time. The simulation time is calculated from the total number of clock cycles taken during the entire simulation.
Efficiency: The actual bandwidth expressed as a percentage of the ideal bandwidth. The ideal bandwidth is calculated under the assumption that one transaction takes place per clock cycle.
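Under these definitions, the two metrics reduce to simple arithmetic over the transaction and clock-cycle counts. A minimal sketch in C++ (function and parameter names are illustrative, not from the paper's tooling):

```cpp
// Bandwidth as defined above: transactions completed per clock cycle
// over the whole simulation.
double bandwidth(unsigned long transactions, unsigned long clock_cycles) {
    return clock_cycles
        ? static_cast<double>(transactions) / static_cast<double>(clock_cycles)
        : 0.0;
}

// Efficiency: actual bandwidth as a percentage of the ideal bandwidth,
// where the ideal assumes one transaction per clock cycle
// (i.e. ideal bandwidth = 1.0 transaction/cycle).
double efficiency(unsigned long transactions, unsigned long clock_cycles) {
    const double ideal = 1.0;  // one transaction per cycle
    return 100.0 * bandwidth(transactions, clock_cycles) / ideal;
}
```

For example, 500 transactions over 1000 clock cycles give a bandwidth of 0.5 transactions per cycle and an efficiency of 50%.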
Other models may call for additional parameters and definitions, which would be calculated using similar steps.
The performance measurement tool used was the Transaction Explorer from Cadence, which works with the SCV (SystemC Verification) library. The model is compiled, elaborated and simulated using the Cadence NC-SystemC compilation and simulation tool with SCV. The transactions recorded using the SCV functions can then be viewed in the waveform window as shown.
Calculating metrics after simulation, i.e. post-processing the simulation data with Transaction Explorer, is very slow, and with big designs can cost a lot of time. Hence most of the calculation is done during simulation, so that very little post-processing remains.
For example, the waveform contains a large number of read and write transactions.
A TxE script could count these transactions along with the clock signal to give us the bandwidth, but this is time-consuming. Instead, the number of clock pulses and the read/write transactions are counted during simulation, and only the final counts are used by the TxE script to find the bandwidth. This is much faster. The result table is as shown:
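The counting-during-simulation idea can be sketched as a small counter object that the testbench drives from its clock and transaction callbacks, so that only the final counts need to be read out at the end of simulation. This is a minimal illustration; the class and hook names are assumptions, not part of TxE or SCV:

```cpp
// Counters updated during simulation; the post-processing step then
// only has to read the final counts instead of scanning the waveform.
class PerfCounters {
public:
    void on_clock() { ++cycles_; }   // called once per clock edge
    void on_read()  { ++reads_;  }   // called when a read transaction ends
    void on_write() { ++writes_; }   // called when a write transaction ends

    unsigned long cycles() const { return cycles_; }
    unsigned long transactions() const { return reads_ + writes_; }

    // Transactions per cycle, computed once at end of simulation.
    double bandwidth() const {
        return cycles_
            ? static_cast<double>(transactions()) / static_cast<double>(cycles_)
            : 0.0;
    }

private:
    unsigned long cycles_ = 0, reads_ = 0, writes_ = 0;
};
```

The design choice here mirrors the paper's point: incrementing three integers per event is negligible simulation overhead, whereas re-deriving the same counts from recorded waveforms afterwards scales with the size of the trace database.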
8. Conclusions
The result of this experiment is an evolving methodology to address cycle accuracy in SystemC models.
Methodology in brief:
- Map the SystemC function calls to specific RTL signals
- Design a verification environment that can be re-used for RTL and SystemC
- Write testcases that generate the same sets of transactions/bursts for SystemC and RTL
- Capture the specific data required for comparison
- Perform analysis and fine-tune the SystemC model
- Execute regression tests on the SystemC and RTL designs
- Benchmark with respect to the RTL design; the RTL has to be available as a reference for comparison with the SystemC model
- Fine-tune the SystemC model according to the findings, with the RTL as the reference