Methodology for flow integration in an SOC design
Pitchumani Guruswamy, Wipro Technologies and Henry Kwan, Texas Instruments, Houston, USA
ABSTRACT
SOC design typically requires the integration of multiple tool flows and methodologies that aid in the realization of the design goals. Integration of these flows requires standard interfaces with respect to Makeflow rule files, scripts and tool control/configuration files. Clarity on the infrastructure and flow leads to early adoption of new tools and features, and offers the flexibility to accommodate changes for new technology libraries and tool versions.
1. INTRODUCTION
Design teams working on SOCs are often confronted with newer tool versions, languages and methodologies, since SOC design involves the integration of multiple components from various third-party vendors. The infrastructure is therefore required to handle flow integration from multiple third parties. Each of these individual flows can be considered a sub-flow.
Makefiles [1] can be utilized for this integration on UNIX platforms, with help from scripting languages such as Perl, Tcl and UNIX shell scripts. While the DA (Design Automation) industry tries to standardize interoperability formats and databases, design teams also need to understand the internals of the flow to use commercial off-the-shelf tools efficiently. This paper discusses simplifying flow design to manage tool integration, with reference to commercial front-end design tools from Verisity, Mentor, Synopsys and others. An approach for custom scripts that integrate the tools seamlessly into the flow is also explained. Various teams follow their own methods to handle this subject; however, the requirements can be broadly classified as described in the next section.
2. EDA Infrastructure Requirements
Categorizing the requirements helps focus the effort of integrating flows. The following are some of the categories that need to be considered while designing the infrastructure:
- Consistent Verification Platform
- Flexibility to enable new features of available tools
- Standardization of interface to integrate new tools
- Ease of migration to newer versions of tools
- Web Publishing
- Offer scope for implementing design requirements through the flow
The approach for each of the above requirements is described in the following subsections. Some categories overlap where this emphasizes their relevance to a subsection. Vendor tools and applications are cited to explain the concepts.
3. Flow Design Approach
The tools need to be executed from specific directories. These directories can be allotted and organized at the beginning of a project for a predefined tool set, as in Figure 1. Project setup scripts need to be developed to handle creation of the directory structure and the copying or linking of files from a common shared location. The common directory could contain the scripts and the Makeflow rule files organized into separate directories.
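As an illustration, such a setup could be captured as a make target; the directory names and the COMMON_DIR path below are hypothetical placeholders, not the actual project structure:

# Hypothetical project setup target; tool directories and COMMON_DIR are placeholders
COMMON_DIR ?= /proj/common/flow
TOOL_DIRS   = specman sim synth reports

setup:
	# Create the per-tool working directories
	mkdir -p $(TOOL_DIRS)
	# Link the shared Makeflow rule files and scripts from the common area
	ln -sf $(COMMON_DIR)/rules   rules
	ln -sf $(COMMON_DIR)/scripts scripts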
3.1 Consistent Verification Platform
Consistency in verification [2] across various modules can be achieved by identifying the common requirements of every testcase.
The verification methodology typically has a testbench with varying generic parameters and simulation command files for the various testcases. Verification languages additionally require test environment, stimulus and tool command files. These can be categorized uniquely under every testcase for independent control, as in Figure 2.
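For example, each testcase directory could hold its own stimulus and tool command files while the static test environment is kept in a shared area; the names below are purely illustrative:

tests/
  tc_basic_rw/   : tc_basic_rw.e (stimulus), sim.do (tool command file)
  tc_burst_err/  : tc_burst_err.e, sim.do
env/
  top_environment.e (shared, static test environment)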
With reference to Verisity's SpecmanElite [3], multiple eVCs can be referenced in a single file, and the testcase-specific stimulus can be isolated from this file. This can be used to generate a single 'esv' file that does not need to be regenerated often for a static test environment. The stimulus file, which requires frequent changes, is loaded over the stable 'esv' file, taking advantage of the faster loading time of the 'esv' file. This bifurcation of the verification database into stimulus and environment is also useful for regression testcase runs, where compiled-code mode is advantageous for faster runtimes. This approach enabled us to measure runtime by switching across interpreter mode, compiled-code mode or incremental-compile mode with a single UNIX shell variable. The data shown in Table 1 does not account for the network load of each job submission; the runtimes were collected from the e-mail notifications sent by LSF [4]. The ability to switch across the three modes was facilitated by this structured approach to verification. It also reinforced the observation that interpreter mode suits the initial development phase, while compiled-code mode suits regression runs on a static, stable verification environment.
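A sketch (showing two of the three modes) of how a single variable could drive this switch in a Makeflow rule file is given below; the variable name SPECMAN_MODE and the run_compiled/run_interpreted commands are placeholders for the actual flow scripts, not real tool commands:

# SPECMAN_MODE selects interpreted or compiled-code mode (hypothetical variable)
SPECMAN_MODE ?= interpreted

run_test:
ifeq ($(SPECMAN_MODE),compiled)
	# Load the testcase stimulus on top of the stable, pre-built 'esv' image
	run_compiled $(TESTCASE).e
else
	# Interpreter mode: load environment and stimulus directly, no 'esv' needed
	run_interpreted $(TESTCASE).e
endif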
3.2 Flexibility to enable additional tool features
Design automation tools always come with new or unexplored options and switches, and sooner or later these become a requirement for the flow. Hence the flow needs the flexibility to allow their use. This is done by passing options on the command line while invoking the tools.
A design environment file is used to pass various shell environment variables. The Makeflow scripts need to understand these variables and pass them on accordingly.
For example:
In order to utilize the design elaboration [6] feature of Mentor's ModelSim, passing an environment variable
ELABORATE = 1
could trigger the flow to compile and generate an elaboration file. This elaboration file can then be reused to increase regression test throughput.
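A minimal sketch of how such a switch could be honored in a Makeflow rule file is given below; the target, file and design-unit names are illustrative, and the exact vsim elaboration options should be confirmed against [6]:

# ELABORATE=1 (from the design environment file) enables elaboration-file reuse
ifeq ($(ELABORATE),1)
sim: work.elab
	# Re-use the pre-elaborated image for every testcase run
	vsim -c -load_elab work.elab -do $(TESTCASE).do
work.elab:
	# Elaborate the design once; rebuilt only when the file is missing
	vsim -c -elab work.elab top_tb -do "quit -f"
else
sim:
	vsim -c top_tb -do $(TESTCASE).do
endif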
3.3 Standardization of tool interface
Interfacing new tools would require the following:
1. The directory structure needs of the tool have to be captured and adopted for the flow.
Development of scripts that run from the command line is an integral requirement; these scripts are invoked through the make flow. Vendors generally provide startup scripts in their example/docs directories, and these can be used as a starting point for script development.
2. A tool control file to pass parameters to the tool needs to be decided.
A file containing switches for enabling scan insertion or IO bonding would fall into this category.
3. Tool setup files required by the vendor tools need to be identified.
For example, Synopsys [5] tools would require the following files:
.synopsys_dc.setup (Design Compiler)
.synopsys_pt.setup (PrimeTime)
Model Technology's ModelSim simulator would require a modelsim.ini file.
4. Migration/Setup scripts need to be developed.
These migration scripts are required for integrating the new tool into an existing project. The project setup scripts also need to be modified so that any new project is set up with the tool from the start, which avoids having to run the migration scripts later.
5. Design-specific Makefiles could be automatically generated by the flow.
The design database requires recompilation whenever the RTL or a tool setup file changes. To enable this synchronization, project-specific make files can be utilized; these need to carry dependencies generated either by the tool or by the flow. For example, in the case of SpecmanElite compilation, a make file with the following structure was effective.
%.esv: alb_user_defines.e alb_types.e top_environment.e
	compile $*
This regenerates the '.esv' file through the compile command whenever alb_user_defines.e, alb_types.e or top_environment.e changes. This approach helps eliminate redundant recompilation and saves testbench development time, as can be inferred from Table 1, where redundant compilation times were saved through make files.
3.4 Migration to newer Versions of tools
Version transition files are required to switch across tool versions. These are typically '.cshrc' files for the C shell in UNIX, containing the version-specific information. A version switch also requires recompilation of the foundry-specific and design-specific libraries. Isolating these libraries and passing the variables through the design environment files enables this migration, for example:
LIBRARY=latest_version1.1_lib
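In the Makeflow rule files, such a variable can then select the recompiled, version-specific library location; the variable and path below are hypothetical:

# Hypothetical: the design environment variable selects the recompiled library area
LIBRARY  ?= latest_version1.1_lib
VLIB_PATH = /proj/libs/$(LIBRARY)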
3.5 Web publishing
Many of the DA tools generate HTML files. These can be referenced or linked from a single document file, as in Figure 3, using scripts or makefile targets. This consolidates reports and tool outputs for easy reference.
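For instance, a makefile target could gather the tool-generated HTML reports under a single published area; the directory names and the publish target are illustrative:

# Hypothetical 'publish' target that gathers tool-generated HTML under one area
publish:
	mkdir -p reports/html
	cp -f synth/reports/*.html reports/html/
	cp -f sim/coverage/*.html  reports/html/
	# A top-level index.html could then link to everything under reports/html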
3.6 Implement design specific constraints
This can be done by linking or copying tool scripts from a centralized location to the local tool directory. The centralized location needs to hold the design-independent Makeflow rule files and the scripts required to implement these functions. For example, the central location could contain generic tool scripts such as Synopsys synthesis scripts for low-power design that implement clock gating.
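As a sketch, the local tool directory could simply link in the shared, design-independent scripts; the CENTRAL path and script names are placeholders:

# Hypothetical rule linking shared synthesis scripts from the central flow area
CENTRAL = /proj/common/flow

link_scripts:
	ln -sf $(CENTRAL)/synth/clock_gating.tcl synth/clock_gating.tcl
	ln -sf $(CENTRAL)/synth/low_power.tcl    synth/low_power.tcl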
4. Conclusion
This paper has presented a methodology that can integrate many design flows. The flow design should provide, at a minimum, the features summarized in Figure 4 to facilitate effective flow integration.
While this approach was applied to a front-end design flow, some aspects could be applied to back-end design flows as well. The makefile-based approach helps decide the success or failure of a command by its error status, and this error status can aid build management for configuration management. While new sub-flows need to be integrated into a pre-existing flow with specific rules and configuration files, it is important to give designers the option of executing commands outside the flow; the flow must therefore be non-restrictive. If a new methodology developed outside the flow works as required, it can then be integrated into the main, pre-existing flow. This enables the growth of the flow, ensures common interfaces and feature availability across teams, and enables interoperability across flows and tools.
5. ACKNOWLEDGMENTS
Our thanks to Kishore Gadde and Charles Tsai of Texas Instruments (Houston) for their valuable suggestions.
6. REFERENCES
[1] GNU Make manual.
[2] Janick Bergeron, "Writing Testbenches", Kluwer Academic Publishers.
[3] Verisity, SpecmanElite: Concepts and Usage Guide.
[4] Platform Computing, LSF manuals.
[5] Synopsys SolvNet.
[6] Model Technology, ModelSim SE 5.6b Performance Guidelines (AppNote 1325).
[7] Pitchumani Guruswamy (Wipro Technologies) and Henry Kwan (Texas Instruments), SNUG'03 paper on Makeflow and LEDA.