Five AI Inference Trends for 2022
By Dana McCarty, Flex Logix (EETimes, January 12, 2022)
It’s an exciting time to be part of the rapidly growing AI industry, particularly in the field of inference. Once confined to high-end, outrageously expensive computing systems, AI inference has been marching rapidly toward the edge. Today, customers in a wide range of industries – medical, industrial, robotics, security, retail, and imaging – are evaluating or actively designing AI inference capabilities into their products and applications.
Fortunately, with the advent of new semiconductor devices developed specifically to accelerate AI workloads, the technology has advanced to the point where products are available at price points and in form factors suited to mainstream markets, making it practical to incorporate AI into a wide range of systems.
As we look to 2022, here are our predicted AI inference trends.