The pitfalls of mixing formal and simulation: Where trouble starts
By Mark Eslinger, Joe Hupcey and Nicolae Tusinschi (Siemens EDA)
EDN (May 23, 2022)
The most effective functional verification environments employ multiple analysis technologies whose strengths reinforce one another to help ensure that the device under test (DUT) behaves as specified. However, this creates an inherent challenge: properly comparing, and combining, the results from each source to give a succinct, accurate picture of the verification effort's true status.
The most common problem we see arises when design engineers want to merge formal analysis results with the RTL code coverage and functional coverage from their UVM testbench, yet they don't fully understand what formal coverage is providing. Hence, we will start on the familiar ground of simulation-generated code and functional coverage before defining formal coverage.
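To make the distinction concrete, here is a minimal SystemVerilog sketch (the module, signal names, and bins are illustrative, not from the article) contrasting the two coverage sources: a covergroup that only fills in when a simulation testbench happens to drive the right stimulus, and a cover property that a formal tool can discharge by proving a reaching trace exists, with no stimulus written at all.

```systemverilog
// Hypothetical DUT-side coverage collector; all names are assumptions
// made for illustration.
module fifo_cov_example (
  input logic        clk,
  input logic        rst_n,
  input logic        push,
  input logic        pop,
  input logic [3:0]  count    // hypothetical FIFO occupancy counter
);

  // Simulation-side functional coverage: these bins are hit only when
  // the UVM testbench actually drives the corresponding scenarios.
  covergroup fifo_cg @(posedge clk);
    cp_count : coverpoint count { bins empty = {0}; bins full = {15}; }
    cp_ops   : cross push, pop;
  endgroup
  fifo_cg cg_inst = new();

  // Formal-side coverage target: a formal tool marks this covered by
  // constructing a witness trace that reaches the scenario, or proves
  // it unreachable - a fundamentally different notion of "covered".
  cover property (@(posedge clk) disable iff (!rst_n)
                  push && pop && (count == 15));

endmodule
```

The point of the sketch is that the same scenario (simultaneous push and pop at full occupancy) means different things in each flow: a covergroup hit says the testbench exercised it, while a formal cover hit says only that it is reachable. Merging the two numbers without appreciating that difference is exactly where the trouble the title warns about starts.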