NeuReality Boosts AI Accelerator Utilization With NAPU

By Sally Ward-Foxton, EETimes (April 4, 2024)

Startup NeuReality wants to replace the host CPU in data center AI inference systems with dedicated silicon that can cut total cost of ownership and power consumption. The Israeli startup has developed a class of chip it calls the network addressable processing unit (NAPU), which includes hardware implementations of typical CPU functions such as the hypervisor. NeuReality's aim is to increase AI accelerator utilization by removing bottlenecks caused by today's host CPUs. NeuReality CEO Moshe Tanach told EE Times the NAPU enables 100% utilization of AI accelerators.