Transaction-level models eyed as SoC enabler
By Nicolas Mokhoff, EE Times
March 17, 2003 (3:42 p.m. EST)
URL: http://www.eetimes.com/story/OEG20030317S0044
Munich, Germany - Transaction-level modeling got a hard look at the recent Design Automation and Test in Europe (DATE) conference here as a possible answer to some of the design and verification challenges of complex system-on-chip design. TLM is a powerful technique that allows the designer to model the communication infrastructure of a design and to experiment with and evaluate different communication architectures.
A DATE panel examined TLM as a possible successor to traditional methodologies based on register-transfer-level (RTL) descriptions, which can no longer adequately address such concerns as efficient design exploration based on component reuse, closure on the architecture, or the early development, integration and verification of embedded software.
The panelists tossed around such questions as whether the transaction level (TL) is truly useful for the design or verification of SoCs; how TL might speed the design process and lower the risk of design failures; and the implications for the tools, languages and intellectual property (IP) used in the design/verification process.
'Crisis of complexity'
Gartner Dataquest analyst Gary Smith voiced the panel's concern about the problems of complex SoC design, saying that for the first time in the semiconductor industry, designs are using less than half of the silicon area available to them. "This crisis of complexity needs to have the design industry move to the next level of producing third-generation tools," he said. "Are TL-based tools the answer?"
The problem is the lack of a good definition of transaction-based modeling, said panelist Dan Gajski, a University of California, Irvine, professor. "Everybody has their own idea of TLM, and one EDA vendor's idea is not in synch with another's."
"We need to formalize the abstraction level of system designs," said Donatella Sciuto, a professor at the Politecnico di Milano in Ital y. "There are requirements for a single API [application programming interface] for IP cores, common semantics for the EDA vendors and acceptance by IP vendors of those semantics."
"It's all about context," said Chris Leonard, SLD methodologies manager at ARM Ltd. "A model used in the wrong context gives wrong results." While the industry has a handle on clocked domains at the RTL and transfer layers as well as at the unclocked domains of the protocol and message layers, Leonard said, it's the boundary area between event transaction and a clock edge's parameters that needs research.
"It's important to know at what abstraction level to model," said Frank Ghenassia of STMicroelectronics' Crolles Research Center (Grenoble, France). At ST, he said, four designs that used TLM methodologies are already out the door, and six more are ready to be implemented.
TLM is too fuzzy to define, said Stuart Swan, senior architect at Cadence Design Systems Inc. He suggested that designers use function calls to model communications between hardware and software modules as a start.
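As a concrete illustration of that suggestion, the following is a minimal, hypothetical sketch in plain C++; the names here (Transaction, TransportIf, Memory) are inventions for this example, not any vendor's or standard's API. An initiator passes a whole read or write transaction to a target through a single function call, with no pin-level or cycle-level detail.

    // Hypothetical sketch: communication modeled as a function call carrying a
    // whole transaction, rather than cycle-by-cycle pin activity.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // One "transaction": an address, a payload and a direction, passed as a unit.
    struct Transaction {
        uint32_t address;
        uint32_t data;
        bool     is_write;
    };

    // Abstract transport interface; any interconnect model can implement it.
    class TransportIf {
    public:
        virtual ~TransportIf() = default;
        virtual void transport(Transaction& t) = 0;
    };

    // A simple memory target that serves transactions functionally.
    class Memory : public TransportIf {
    public:
        explicit Memory(size_t words) : store_(words, 0) {}
        void transport(Transaction& t) override {
            if (t.is_write) store_[t.address] = t.data;
            else            t.data = store_[t.address];
        }
    private:
        std::vector<uint32_t> store_;
    };

    // An initiator (for example a processor model) issues transactions through
    // the interface, unaware of how the target is implemented underneath.
    int main() {
        Memory mem(256);
        TransportIf& bus = mem;   // a bus or NoC model could sit here instead

        Transaction wr{0x10, 0xCAFE, true};
        bus.transport(wr);

        Transaction rd{0x10, 0, false};
        bus.transport(rd);
        std::printf("read back 0x%X\n", rd.data);   // prints 0xCAFE
        return 0;
    }

Because the initiator sees only the transport() call, the channel behind it can later be swapped for a bus or network-on-chip model, or refined toward RTL, without touching the initiator code, which is the kind of communication-architecture experimentation TLM is meant to enable.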
"TLM applies to both design and testbenches," said Swan, declaring that TLM is already part of such verification entities as the Vera language and the tools from Verisity. The system-level methodology will take longer to entrench itself in design, he said, but many companies, including Cadence, are working to develop TLM flows today. "There may be an issue with the designer skill set to use TLM at the appropriate abstraction level," said Swan.
"The TLM techniques are not new," said Joachim Kunkel, vice president of marketing at Synopsys Inc. "The network crowd has used TLM for a very long time." Kunkel said the size of today's SoCs is forcing designers to look at TLM as a design methodology. He hinted that announcements at the Design Automation Conference in June will tell how to go from TLM to RTL.
Ultimately, "semantics matters," said Gajski of UC Irvine. "It's not important so much what language you use to define your models. But you'd better develop them so that all understand what you mean and all can use that knowledge to develop their own models."
http://www.eet.com