Industry Expert Blogs
Cool AI chips are green
videantis Blog - Stephan Janouch
Sep. 17, 2020
When chips get hot, thermal management quickly becomes difficult. The chips need more complex power grids, more expensive packages, and, in many cases, active cooling fans. Lots of GPU-based systems even need water cooling. And even if you can get away with passive cooling, a little heat still greatly affects the housing and mechanical design. There needs to be enough surface area to dissipate the heat, which makes the final device larger, heavier, and more expensive. In addition, if your power comes from a battery, it will need more capacity too, again adding cost and bulk. That’s a lot of reasons to keep power consumption low.
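To make that thermal argument concrete, here is a minimal back-of-envelope sketch of the relation a housing or heatsink design has to satisfy: junction temperature equals ambient temperature plus power times junction-to-ambient thermal resistance. All numbers below are illustrative assumptions, not videantis figures.

```python
# Back-of-envelope thermal budget: T_junction = T_ambient + P * R_theta_JA
# All values are illustrative assumptions, not measured data.

def junction_temp(power_w: float, r_theta_ja: float, t_ambient_c: float = 40.0) -> float:
    """Estimate junction temperature (deg C) from chip power (W) and
    junction-to-ambient thermal resistance (deg C per W)."""
    return t_ambient_c + power_w * r_theta_ja

# A passively cooled package might have R_theta_JA around 30 C/W (assumption).
print(junction_temp(power_w=2.0, r_theta_ja=30.0))  # ~100 C: close to typical limits
print(junction_temp(power_w=0.5, r_theta_ja=30.0))  # ~55 C: comfortably passive
```

The same arithmetic explains the mechanical-design point: lowering thermal resistance without a fan means adding surface area, which is exactly what makes the device bigger and heavier.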
And chips do burn a lot of power, especially when they’re running compute-intensive deep learning workloads. Deep learning algorithms perform many trillions of multiply operations per second and move massive amounts of data. At videantis, we designed our processor architecture from the ground up to consume as little energy as possible. Optimizing for low power has always been in our DNA at videantis, with a history of designing for battery-operated mobile phone applications.
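As a rough illustration of why those trillions of operations translate into watts, the sketch below multiplies an assumed operation count per inference by an assumed energy per operation. The constants are placeholder ballpark figures, not videantis data, and data movement would add on top of the compute energy.

```python
# Rough compute-energy estimate for a deep learning workload.
# All constants are illustrative assumptions.

OPS_PER_INFERENCE = 5e9      # e.g. a few-GMAC-class vision network (assumption)
ENERGY_PER_OP_J = 2e-12      # ~2 pJ per multiply-accumulate (assumption)
FRAMES_PER_SECOND = 30

energy_per_inference_j = OPS_PER_INFERENCE * ENERGY_PER_OP_J
compute_power_w = energy_per_inference_j * FRAMES_PER_SECOND

print(f"{energy_per_inference_j * 1e3:.1f} mJ per inference")    # ~10 mJ
print(f"{compute_power_w * 1e3:.0f} mW sustained compute power")  # ~300 mW, before data movement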