Researchers accelerate sparse inference on XNNPACK and TensorFlow Lite for real-time apps
March 9, 2021 by admin

XNNPACK and TensorFlow Lite now support efficient inference of sparse neural networks. Researchers demonstrated substantial inference-time speedups in real-time gesture-detection and background-blur apps.

Read more at Neowin.
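The article itself includes no code, but as a rough illustration of how such a model might be run, the sketch below shows a TensorFlow Lite model executed through the XNNPACK delegate in C++. The model path and thread count are placeholders, and whether sparse kernels actually kick in depends on the model's weights and the TFLite/XNNPACK build in use.

```cpp
// Minimal sketch: run a (possibly sparse) TensorFlow Lite model via the
// XNNPACK delegate. "model.tflite" is a hypothetical path.
#include <cstdio>
#include <memory>

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load the .tflite model from disk.
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  // Create the XNNPACK delegate; it selects optimized kernels for the
  // operators it supports (thread count chosen arbitrarily here).
  TfLiteXNNPackDelegateOptions options = TfLiteXNNPackDelegateOptionsDefault();
  options.num_threads = 4;
  TfLiteDelegate* xnnpack_delegate = TfLiteXNNPackDelegateCreate(&options);

  if (interpreter->ModifyGraphWithDelegate(xnnpack_delegate) != kTfLiteOk) {
    std::fprintf(stderr, "Failed to apply XNNPACK delegate\n");
    return 1;
  }
  if (interpreter->AllocateTensors() != kTfLiteOk) return 1;

  // Fill the input tensors here, then run inference.
  if (interpreter->Invoke() != kTfLiteOk) return 1;

  // The interpreter must be destroyed before the delegate it uses.
  interpreter.reset();
  TfLiteXNNPackDelegateDelete(xnnpack_delegate);
  return 0;
}
```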