The Designer's Dilemma: Everything is OK... Until It Isn't
By Joseph Davis, Senior Director of Product Management, Siemens Digital Industries Software
EETimes (February 13, 2023)
The proliferation of electronics in our lives is visible almost everywhere on the planet, even in the most remote locations and communities. Most of us who live in industrialized societies see the impact daily, from cell phones to laptops to our automobiles and businesses, but even from the most remote point on Earth, you can watch satellites and space stations crossing the sky. Even in nomadic and agrarian societies that most of us would not associate with electronics, cell phones are being used to improve crop yields and conduct business transactions [1,2,3].
Typically, the most in-demand features in new mobile and so-called “edge” electronics [4,5], other than improved battery life, are new sensors. In 1980, electronics contributed about 10% of the total cost of a car. By 2010, that share had reached 35%, and by 2030, it is projected to approach 50% [6]. Today’s cars require a multitude of sensors to make them safer, assist drivers, and provide entertainment. These edge devices combine sensing and computation, which means mixed-signal design: both analog and digital functionality on the same chip.
Design flows for both analog and digital integrated circuit (IC) designs are very mature, and continue to become more sophisticated every day. On the digital side, there are a wide variety of electronic design automation (EDA) verification tools and flows for physical, electrical, and reliability verification that ensure the final design sent to the foundry fully complies with the physical requirements necessary to manufacture a chip that functions in the field for many years to come.
While physical and electrical verification is also well-established for analog designs, reliability verification for electromigration (EM) and voltage (IR) drop still struggles with the challenge of full-chip verification [7]. Existing EM/IR tools depend on the capacity of SPICE simulators, which ultimately limits their ability to verify the detailed operation of a chip while accounting for all the parasitics in the actual implementation of the design. When a design exceeds a few million transistors, design teams often resort to a variety of custom-built solutions to get a “good enough” estimate of the EM and IR performance of their full design. These methods worked for a very long time and enabled the industry to put millions of devices into production, so it’s all good, right? Not anymore.
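The capacity ceiling described here is easy to see in the linear algebra underneath any static IR-drop check. The following is a minimal, illustrative sketch (not any vendor's flow, and not the author's method): using NumPy and SciPy, it stamps a uniform N x N resistive power grid into a nodal conductance matrix and solves G v = i for the node voltages. The grid size, segment resistance, pad resistance, and per-node load current are all assumed values chosen for illustration.

```python
# Illustrative static IR-drop solve on a toy N x N power grid.
# All parameter values below are assumptions for the sketch.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

N = 100          # nodes per side -> N*N unknowns (a full chip: 1e8+)
R_SEG = 0.05     # ohms per grid segment (assumed)
R_PAD = 1e-3     # ohms from the VDD pad to its tap node (assumed)
I_LOAD = 1e-4    # amps drawn by the circuit at every node (assumed)
VDD = 1.0        # supply voltage in volts

n = N * N
G = lil_matrix((n, n))     # nodal conductance matrix
i = np.full(n, -I_LOAD)    # net current injected at each node (loads draw)

g = 1.0 / R_SEG
for r in range(N):
    for c in range(N):
        k = r * N + c
        for dr, dc in ((0, 1), (1, 0)):   # right and down neighbors
            if r + dr < N and c + dc < N:
                m = (r + dr) * N + (c + dc)
                # Stamp the segment conductance into the matrix (KCL).
                G[k, k] += g
                G[m, m] += g
                G[k, m] -= g
                G[m, k] -= g

# Norton equivalent of the VDD pad feeding node 0 through R_PAD.
g_pad = 1.0 / R_PAD
G[0, 0] += g_pad
i[0] += g_pad * VDD

v = spsolve(G.tocsr(), i)   # solve G v = i for all node voltages
print(f"worst-case static IR drop: {(VDD - v.min()) * 1e3:.2f} mV")
```

Even this toy grid at N = 1,000 would produce a million unknowns, and a real full-chip run must perform the same kind of solve over the extracted parasitic network together with the transistor-level devices, which is where traditional SPICE-based approaches run out of capacity.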
Changes in the semiconductor industry are making those previous approaches riskier, to the point of jeopardizing the market success of the end product. One major factor underlying many of these risks is the process technology used for these designs. Analog designs were historically targeted to older technologies for many good reasons, such as noise, device matching, and reliability. Analog in general was very happy at 180nm. The transition to 130nm to take advantage of copper (Cu) interconnects was so traumatic for many design houses that you still hear stories about it decades later. A decade after that came the push down to 90nm.