NeuReality Boosts AI Accelerator Utilization With NAPU
By Sally Ward-Foxton, EETimes (April 4, 2024)
Startup NeuReality wants to replace the host CPU in data center AI inference systems with dedicated silicon that can cut total cost of ownership and power consumption. The Israeli startup developed a class of chip it calls the network addressable processing unit (NAPU), which includes hardware implementations for typical CPU functions like the hypervisor. NeuReality’s aim is to increase AI accelerator utilization by removing bottlenecks caused by today’s host CPUs.
NeuReality CEO Moshe Tanach told EE Times its NAPU enables 100% utilization of AI accelerators.