AI Inference IP. Ultra-low power, tiny, standard CMOS. ~100K-parameter RNN
Ideal for always-on voice detection, keyword spotting, speech-to-intent commands (10), time-series health or industrial sensor data, etc.
Taking input from analog or digital sensors, our IP uses a fall-through neural network architecture in which only active neurons consume power.
Our NN core is tiny, occupying ~1 mm², and about 2x that area including the optional analog preprocessing and 2-second audio buffer.
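As a rough illustration of the fall-through, activity-gated idea (a sketch only, not the vendor's implementation), the snippet below shows a recurrent step in which only neurons with nonzero activation propagate work downstream; all names and sizes are hypothetical.

```python
import numpy as np

# Hypothetical sketch of activity-gated ("fall-through") RNN inference:
# only neurons with nonzero activation contribute to the next step,
# mirroring the idea that inactive neurons consume no power.

N_IN, N_HID = 16, 128          # illustrative sizes, not the actual IP's

rng = np.random.default_rng(0)
W_in = rng.standard_normal((N_HID, N_IN)) * 0.1
W_rec = rng.standard_normal((N_HID, N_HID)) * 0.1

def step(x, h):
    """One recurrent step; recurrent work is done only for active neurons."""
    active = np.flatnonzero(h)                 # indices of currently active neurons
    # Use only the active columns; inactive neurons "fall through" at no cost.
    rec = W_rec[:, active] @ h[active] if active.size else 0.0
    h_new = np.maximum(0.0, W_in @ x + rec)    # ReLU keeps activations sparse
    return h_new

h = np.zeros(N_HID)
for _ in range(10):                            # toy input stream
    x = rng.standard_normal(N_IN)
    h = step(x, h)
print("active neurons this step:", np.count_nonzero(h))
```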
AI IP
- RT-630 Hardware Root of Trust Security Processor for Cloud/AI/ML SoC FIPS-140
- RT-630-FPGA Hardware Root of Trust Security Processor for Cloud/AI/ML SoC FIPS-140
- NPU IP for Embedded AI
- RISC-V-based AI IP development for enhanced training and inference
- Tessent AI IC debug and optimization
- NPU IP family for generative and classic AI with highest power efficiency, scalable and future proof