Solutions proposed for verification crisis
By Ron Wilson, EE Times
October 1, 2002 (4:52 a.m. EST)
URL: http://www.eetimes.com/story/OEG20020930S0054
ROCHESTER, N.Y. -- A validation crisis is stalking system-on-chip designs, according to a panel session at the 15th IEEE International ASIC/SoC Conference, though panel participants cited two possible routes toward its resolution.
Panel moderator Graham Budd, director of product marketing for ARM Ltd. (Cambridge, U.K.), set out the now-familiar scenario: design complexity rising, verification complexity rising as the square of that, and a yawning verification gap opening unnoticed in the less fashionable shadows of the design process. But Budd said there had recently appeared some glimmers in the dark.
One bit of light was explored by Mark Strickland of Verisity Design Inc. Paralleling Budd's comment that design reuse was helping close the design gap, Strickland pointed to verification reuse as a solution to the verification crisis. He said that early in the design process, blocks should be swaddled in a comprehensive set of input generators, output checkers and coverage tools. These could be coded in Verisity's e language, he added parenthetically. In any case, the module test suites should be constructed to be portable across levels of abstraction in the design flow, and portable across designs as the module is reused elsewhere. And the suites should be layerable, so that as the modules are merged into a system design, more abstract generators and checkers can be layered on top of those for the module.
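As a rough illustration of the kind of reusable checker Strickland described, consider the following sketch, written in plain Verilog rather than Verisity's e; the handshake rule and signal names are invented for the example. The point is that the same checker module can be bound to a block when it is verified in isolation and again, unchanged, when the block is integrated into a larger design.

    // Hypothetical sketch, not from the panel: a reusable protocol checker for a
    // simple valid/ready handshake. Signal names and the rule are illustrative only.
    module handshake_checker (
        input wire        clk,
        input wire        rst_n,
        input wire        valid,
        input wire        ready,
        input wire [31:0] data
    );
        reg        pending;   // a transfer was offered last cycle but not accepted
        reg [31:0] data_q;    // the data that was offered

        always @(posedge clk or negedge rst_n) begin
            if (!rst_n) begin
                pending <= 1'b0;
                data_q  <= 32'b0;
            end else begin
                // Rule: once valid is raised, it must stay high and data must
                // hold steady until ready accepts the transfer.
                if (pending && !valid)
                    $display("%0t: CHECK FAILED: valid dropped before ready", $time);
                else if (pending && data !== data_q)
                    $display("%0t: CHECK FAILED: data changed while stalled", $time);
                pending <= valid && !ready;
                data_q  <= data;
            end
        end
    endmodule

In a reuse flow, one such checker is instantiated per interface; when the block is absorbed into an SoC, more abstract, transaction-level generators and checkers can be layered on top of it, which is the layering Strickland described.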
Andy Nightingale, processor validation manager at ARM, seconded this view by detailing how the process is conducted at ARM, whose verification methodology comprises three levels: integration, in which the module is functionally verified in its new connection environment; system verification, in which the module is verified with its surrounding blocks and driver software; and system validation, in which the entire system-on-chip (SoC) model is exercised to search out the corner cases.
Nightingale said that ARM's verification team did in fact construct what he called agents for each block in the SoC, and that the agents were reused at each successive stage in the flow. He also said that the verification suite was used to generate the system documentation, and that individual verification tests were linked to the section in the documentation that specified the expected behavior. This greatly sped up diagnosis, he said.
Part of the process
Harry Foster, chief architect at Verplex Systems, agreed with Nightingale on the scope of the problem but took a different view of the solution. Foster's view, which not even Verplex was ready to implement in its entirety, he said, was that verification should begin when design begins, by creating a comprehensive set of assertions right along with the design.
In addition to sheer complexity, Foster warned, verification suffers from the poor observability of designs. He cited a paper delivered at the 1996 ICCAD conference in which a verification team reported achieving 95 percent coverage of HDL lines, but only 50 percent observability of actual design internals.
This problem, Foster claimed, must be addressed by modularizing a design from the beginning and embedding in each module sufficient assertions to observe the activity within it. These assertions can then be vital time-savers in the functional verification process. If properly constructed, they can be used as properties to drive formal verification, he said. "By defining the interfaces between the modules with assertions, you constrain the size of the problem so that it fits the capacity of formal verification tools," Foster said.
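A minimal sketch of the embedded-assertion idea, again in plain Verilog and with invented signal names: the check lives inside the module, next to the logic it observes, and fires the moment it is violated in simulation; the same condition can also be handed to a formal tool as a property to prove or as a constraint on neighboring blocks.

    // Hypothetical sketch: a FIFO controller with its safety checks embedded in
    // the module itself, in the spirit of Foster's assertion-based approach.
    module fifo_ctrl #(parameter DEPTH = 16) (
        input wire clk,
        input wire rst_n,
        input wire push,
        input wire pop
    );
        reg [4:0] count;   // number of occupied entries, 0..DEPTH

        always @(posedge clk or negedge rst_n) begin
            if (!rst_n)
                count <= 5'd0;
            else
                count <= count + (push && count < DEPTH) - (pop && count > 0);
        end

        // Embedded assertions: the surrounding design must never push into a
        // full FIFO or pop from an empty one. In simulation these flag the
        // failure when it happens; a formal tool can treat the same conditions
        // as proof targets or as interface constraints.
        always @(posedge clk) begin
            if (rst_n && push && count == DEPTH)
                $display("%0t: ASSERTION FAILED: push while full", $time);
            if (rst_n && pop && count == 5'd0)
                $display("%0t: ASSERTION FAILED: pop while empty", $time);
        end
    endmodule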
In the Q&A period following the session, Foster stated, and the other participants more or less agreed, that simulation is disappearing from all phases of verification beyond RTL. "You use simulation to verify the RTL," Foster said. "In fact, I'd prefer to do it at the behavioral level. But there is as yet no equivalence-checking tool between the behavioral and RT levels. From then on, you use formal equivalence checking at each successive stage. There's no excuse for relying on gate-level simulation, although some people still do a little of it because it is familiar."
The panelists also admitted, unhappily, that their optimistic scenario simply did not apply to analog or mixed-signal verification. "There are just not enough known solutions in the analog world," Foster admitted. "The EDA industry really hasn't addressed it."