Five AI Inference Trends for 2022
By Dana McCarty, Flex Logix (EETimes, January 12, 2022)
It’s an exciting time to be a part of the rapidly growing AI industry, particularly in the field of inference. Once relegated to high-end, outrageously expensive computing systems, AI inference has been moving toward the edge at a rapid pace. Today, customers across a wide range of industries, including medical, industrial, robotics, security, retail and imaging, are either evaluating AI inference capabilities or actively designing them into their products and applications.
Fortunately, with the advent of new semiconductor devices developed specifically to accelerate AI workloads, the technology has advanced to the point where many products are available at price points and in form factors viable for mainstream markets, allowing AI to be incorporated into a wide range of systems.
As we look to 2022, here are our predicted AI inference trends.