Simplifying On-Device AI for Developers with Siddhika Nevrekar
Description
Today, we're joined by Siddhika Nevrekar, head of AI Hub at Qualcomm Technologies, to discuss on-device AI and how to make it easier for developers to take advantage of device capabilities. We unpack the motivations for AI engineers to move model inference from the cloud to local devices, and explore the challenges associated with on-device AI. We dig into the role of hardware solutions, from powerful systems-on-chip (SoCs) to neural processors, the importance of collaboration between chip manufacturers and community runtimes like ONNX and TFLite, the unique challenges of IoT and autonomous vehicles, and the key metrics developers should focus on to ensure optimal on-device performance. Finally, Siddhika introduces Qualcomm's AI Hub, a platform developed to simplify the process of testing and optimizing AI models across different devices. The complete show notes for this episode can be found at https://twimlai.com/go/697.
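For context on the kind of workflow discussed in the episode, below is a minimal sketch of submitting a model to AI Hub for compilation and on-device profiling, assuming the qai-hub Python SDK (`pip install qai-hub`) plus PyTorch and torchvision; the device name string and exact parameters are illustrative assumptions, so check the official AI Hub documentation for current signatures.

```python
import torch
import torchvision
import qai_hub as hub  # assumes the qai-hub SDK is installed and configured with an API token

# Load a small pretrained model and trace it so it can be submitted for compilation
model = torchvision.models.mobilenet_v2(weights="DEFAULT").eval()
example_input = torch.rand(1, 3, 224, 224)
traced_model = torch.jit.trace(model, example_input)

# Illustrative target device; the exact name string depends on AI Hub's device catalog
target_device = hub.Device("Samsung Galaxy S24 (Family)")

# Compile the traced model for the target device's runtime and hardware
compile_job = hub.submit_compile_job(
    model=traced_model,
    device=target_device,
    input_specs=dict(image=(1, 3, 224, 224)),
)

# Profile the compiled artifact on real hardware to get on-device latency and memory metrics
profile_job = hub.submit_profile_job(
    model=compile_job.get_target_model(),
    device=target_device,
)
```

The point of the sketch is the division of labor the episode describes: the developer supplies a standard framework model, and the platform handles target-specific compilation and measurement on physical devices.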
More Episodes
Today, we're joined by Arash Behboodi, director of engineering at Qualcomm AI Research, to discuss the papers and workshops Qualcomm will be presenting at this year’s NeurIPS conference. We dig into the challenges and opportunities presented by differentiable simulation in wireless systems, the...
Published 12/03/24
Today, we're joined by Shirley Wu, senior director of software engineering at Juniper Networks, to discuss how machine learning and artificial intelligence are transforming network management. We explore various use cases where AI and ML are applied to enhance the quality, performance, and...
Published 11/19/24