Arm support for Android NNAPI gives >4x performance boost
Arm Blogs - Robert Elliott, Arm | Jan. 29, 2018
The launch of Arm support for the Android Neural Networks API (NNAPI) sees the release of open-source, optimized neural network operators that deliver a significant performance uplift across CPUs and GPUs.
Back in May at Google I/O, we heard the first public news about TensorFlow Lite for Android. This was the first exciting hint of a major new API that would affect the deployment of neural networks on Arm-based platforms supporting Android.
Inference engines are nothing new, but the big change with the announcement of NNAPI is standardized support within Android and the ability to target the wide array of accelerators available from the Arm ecosystem, such as the Arm Mali GPU.
At Arm, we fully support this development and will be releasing support for our Arm Cortex-A CPUs and Mali GPUs from day one. This follows on from our other efforts to improve the performance of machine learning applications on Arm platforms: it adds to our release of the Compute Library at the beginning of the year and to our ongoing engagement with the community of companies and developers standardizing approaches and sharing developments in the open.
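To illustrate what this looks like from an application developer's point of view, the minimal sketch below (not taken from Arm's or Google's release) shows a TensorFlow Lite model being routed through NNAPI on Android. The Interpreter and Interpreter.Options classes and the setUseNNAPI call are from the TensorFlow Lite Java API; the class name, model buffer, and tensor shapes are hypothetical placeholders.

import java.nio.MappedByteBuffer;
import org.tensorflow.lite.Interpreter;

public final class NnapiClassifier {
    // Runs one inference through TensorFlow Lite, asking the runtime to
    // dispatch supported operators through the Android Neural Networks API,
    // which in turn can target the device's CPU, GPU, or other accelerators.
    // 'model' is a memory-mapped .tflite flatbuffer; shapes are illustrative.
    public static float[][] classify(MappedByteBuffer model, float[][] input) {
        Interpreter.Options options = new Interpreter.Options();
        options.setUseNNAPI(true); // hand supported operators to NNAPI
        try (Interpreter interpreter = new Interpreter(model, options)) {
            float[][] output = new float[1][1001]; // e.g. a 1001-class ImageNet head
            interpreter.run(input, output);
            return output;
        }
    }
}

Operators that NNAPI (or the underlying driver) cannot handle continue to run on TensorFlow Lite's built-in CPU kernels, so enabling the flag is a low-risk way to pick up acceleration where it is available.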