The Expanding Markets for Edge AI Inference

By Geoff Tate, Flex Logix

While AI was originally targeted at data centers and the cloud, it has been moving rapidly toward the edge of the network, where fast, critical decisions can be made locally, closer to the end user. Training can still be done in the cloud, but in applications such as autonomous driving, time-sensitive decision making (spotting a car or a pedestrian) must happen close to the end user (the driver). Edge systems can make decisions on images arriving at up to 60 frames per second, enabling quick action. These systems are made possible by edge inference accelerators, which have emerged to replace CPUs, GPUs, and FPGAs at much higher throughput/$ and throughput/watt.

The ability to do AI inference closer to the end user is opening up a whole new world of markets and applications. In fact, IDC recently reported that the market for AI software, hardware, and services is expected to break the $500 billion mark by 2024, with a five-year compound annual growth rate (CAGR) of 17.5% and total revenues reaching an impressive $554.3 billion. This rapid growth is likely due to the fact that AI is expanding from "just a high-end functionality" into products closer to consumers, essentially bringing AI capabilities to the masses. In addition, recently announced products have started breaking the cost barriers typically associated with AI inference, enabling designers to incorporate AI into a wider range of affordable products.
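The 60 frames-per-second figure implies a hard per-frame latency budget for edge inference. As a back-of-envelope sketch (the frame rates and the helper function below are illustrative assumptions, not from the article), the time available to process each frame can be computed like this:

```python
# Illustrative sketch: per-frame latency budget for an edge inference
# system keeping up with a real-time camera stream.
# The frame rates used here are example values, not figures from the article.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to process each frame at a given frame rate."""
    return 1000.0 / fps

if __name__ == "__main__":
    for fps in (30, 60):
        # At 60 fps the accelerator has under 17 ms per frame, which is why
        # round-tripping each frame to a distant cloud data center is
        # impractical for time-sensitive decisions.
        print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

At 60 fps the budget is roughly 16.7 ms per frame, and network latency alone to a remote data center can consume most or all of that, which is the core argument for running inference at the edge.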
All material on this site Copyright © 2017 Design And Reuse S.A. All rights reserved.