Industry Expert Blogs
AI will be increasingly important in EDA, reducing design costs and supporting engineers
Thalia Blog - Sowmyan Rajagopalan, Thalia
Aug. 09, 2023
Artificial intelligence (AI) is having one of its periodic days in the sun and dominates the conversation at almost any industry event. The Design Automation Conference (DAC 2023) was no exception, with AI seen by the semiconductor community as both an opportunity and a challenge.
An opportunity, of course, because AI requires so many chips, from the huge and complex system-on-chips that will power the AI engines and models, to the semiconductors that will be embedded in every device to bring AI to every application.
The complexity of these chips fuels demand for a wide variety of IP, but this is also where some of the challenges arise. Integrating many blocks of sophisticated IP to form an AI system-on-chip – one that may also integrate further functionality such as 5G – is a long process that requires very advanced skills. Hundreds of IP blocks may need to be tested and integrated, with the results recalibrated every time one of the blocks is changed or enhanced. Identifying the cause of a fault or failure may take many engineer-weeks.
This is true of other chip applications too, of course, including 5G. Engineers with the required skills are in short supply in many markets, and that shortage is worsened by two factors – the number of AI-focused chip start-ups that are now competing for talent, and the increasingly long design cycle for a complex chip, which will consume a growing number of engineer hours before it is ready.