Startup Runs AI in Novel SRAM
Areanna claims 100 TOPS/W in simulation
By Rick Merritt, EETimes
July 22, 2019
SAN JOSE — In his spare time, an engineer at Tektronix sketched out a novel deep-learning accelerator, and now his two-person startup is the latest example of the groundswell of enthusiasm that deep learning is generating.
Behdad Youssefi defined an SRAM with specialized cells that can handle matrix multiplication, quantization, storage, and the other jobs needed in an inference processor. After four years of solo work on the concept, originally planned as a PhD thesis, he formed startup Areanna with a colleague at Tektronix and a Berkeley professor as an advisor.
In Spice simulations, the design delivers more than 100 tera-operations per second per watt (TOPS/W) when recognizing handwritten digits using 8-bit integer math. Youssefi claims it could beat Google's TPU in computational density by an order of magnitude.
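To give a sense of the arithmetic such an accelerator performs, the sketch below shows the standard 8-bit integer inference flow: quantize floating-point activations and weights to INT8, multiply-accumulate in a wider integer precision, then scale the result back. This is a generic illustration of INT8 inference math, not Areanna's actual circuit design, which has not been published; all names and scale values here are hypothetical.

```python
import numpy as np

def quantize(x, scale):
    """Map float values to signed 8-bit integers using a per-tensor scale."""
    q = np.round(x / scale).astype(np.int32)
    return np.clip(q, -128, 127).astype(np.int8)

def int8_matmul(a_q, w_q, a_scale, w_scale):
    """INT8 x INT8 matrix multiply with INT32 accumulation, then dequantize."""
    acc = a_q.astype(np.int32) @ w_q.astype(np.int32)  # wide accumulator
    return acc * (a_scale * w_scale)                   # back to float scores

# Toy example: one fully connected layer sized for 28x28 digit images
# with ten output classes, as in handwritten-digit recognition.
rng = np.random.default_rng(0)
activations = rng.standard_normal((1, 784)).astype(np.float32)
weights = rng.standard_normal((784, 10)).astype(np.float32)

a_scale, w_scale = 0.02, 0.02  # illustrative quantization scales
out = int8_matmul(quantize(activations, a_scale),
                  quantize(weights, w_scale),
                  a_scale, w_scale)
print(out.shape)  # one row of ten class scores
```

A compute-in-SRAM design performs the multiply-accumulate step inside the memory array itself, avoiding the energy cost of shuttling weights between separate memory and compute units; that data movement typically dominates the power budget of conventional inference chips.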