Transformer Networks Optimized for ChatGPT Mobile
Ceva's Experts blog - Roni Sadeh, Ceva, Aug. 24, 2023
Siri and OK Google were initially a fun introduction to the promise of voice-based control, but we soon realized how carefully we must craft requests to get a useful response. The level of understanding we now see in ChatGPT would be much easier to use, but until recently that capability has been limited to text interaction with cloud-based apps. Now the compelling promise of ChatGPT and the ubiquity of cell phones are propelling a trend to make transformer networks for a mobile ChatGPT a reality, extending the power of large language models to everyone with a phone.
An obvious challenge is that the ChatGPT we know depends on trillions of parameters. Transformer networks of this size can only run in the cloud. Some suggest a hybrid model in which the phone or another device does some of the work and connects to the cloud for heavier-duty inference. However, a casual phone-based user may not appreciate the long latencies and privacy risks inherent in a hybrid solution. A better approach would run most or all of the transformer network directly on the phone, turning to the cloud only for occasional anonymized search requests when needed.
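To make the contrast concrete, here is a minimal sketch of that device-first flow: the prompt is answered by a local (quantized) transformer, and the cloud is consulted only for an anonymized lookup when the local model flags missing facts. The names used here (StubLocalModel, cloud_search, needs_external_facts) are hypothetical placeholders for illustration, not a Ceva or vendor API.

```python
# Device-first assistant sketch: answer locally, fall back to an anonymized
# cloud search only when needed. All class/function names are hypothetical.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Draft:
    text: str
    needs_external_facts: bool
    anonymized_query: str = ""


class StubLocalModel:
    """Stands in for a quantized transformer running on the phone's NPU."""

    def generate(self, prompt: str, context: str = "") -> Draft:
        # A real model would decode tokens here; the stub just echoes the prompt.
        text = f"local answer to: {prompt}" + (f" [using: {context}]" if context else "")
        # Pretend the model can flag when it lacks up-to-date facts.
        needs_facts = "latest" in prompt.lower() and not context
        return Draft(text=text,
                     needs_external_facts=needs_facts,
                     anonymized_query="generic topic lookup")


def answer(prompt: str,
           model: StubLocalModel,
           cloud_search: Optional[Callable[[str], str]] = None) -> str:
    # Run the full transformer locally first: no round-trip latency,
    # and the raw prompt never leaves the device.
    draft = model.generate(prompt)

    # Turn to the cloud only for missing facts, and send an anonymized
    # query rather than the user's original prompt.
    if draft.needs_external_facts and cloud_search is not None:
        facts = cloud_search(draft.anonymized_query)
        draft = model.generate(prompt, context=facts)

    return draft.text


if __name__ == "__main__":
    print(answer("What are the latest scores?", StubLocalModel(),
                 cloud_search=lambda q: "search results for " + q))
```

The point of the sketch is the ordering of the two paths: the on-device model is the default, and the network is an optional, anonymized side channel rather than the place where the user's prompt is processed.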