The Offline AI: On-Device LLMs in React Native With AI SDK
Learn how to run on-device LLMs in React Native using Vercel’s AI SDK from Michał Pierzchała's talk at DevAI by Data Science Summit.

On-device LLMs, unlike remotely accessed models, unlock private-by-default, low-latency AI experiences that work anywhere, even offline, which makes them ideal for mobile. In this talk, Michał shows how to run LLMs directly inside React Native apps using Vercel's AI SDK, which provides a robust abstraction layer that simplifies building AI applications: the same interface runs both local and remote models. These capabilities are made possible by an open-source set of libraries his team created. He dives deep into the provider architecture and demonstrates how it integrates with the MLC LLM Engine and with Apple's Foundation Models, available on Apple's latest mobile devices.
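To make the "same interface for local and remote models" idea concrete, here is a minimal sketch of the provider pattern the talk describes: one call site, with swappable model backends. The names below (`LanguageModel`, `generateText`, `localModel`, `remoteModel`) are illustrative stand-ins modeled on the AI SDK's style, not its actual exports, and the backends are mocks rather than real MLC or hosted inference.

```typescript
// A model backend exposes one generation method; where inference
// actually runs (on-device or over the network) is an implementation detail.
interface LanguageModel {
  provider: string;
  doGenerate(prompt: string): Promise<string>;
}

// Stand-in for an on-device backend (e.g. an MLC-compiled model
// or Apple's Foundation Models on supported devices).
const localModel: LanguageModel = {
  provider: "on-device",
  doGenerate: async (prompt) => `[local] echo: ${prompt}`,
};

// Stand-in for a hosted backend reached over the network.
const remoteModel: LanguageModel = {
  provider: "remote",
  doGenerate: async (prompt) => `[remote] echo: ${prompt}`,
};

// The app calls a single function regardless of where inference runs;
// switching backends means passing a different `model`, nothing else.
async function generateText(opts: {
  model: LanguageModel;
  prompt: string;
}): Promise<string> {
  return opts.model.doGenerate(opts.prompt);
}
```

In an app, falling back from an on-device model to a remote one (for example, on older hardware) becomes a one-line change at the call site, which is the core benefit of the provider abstraction.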

Learn more about AI
Here's everything we published recently on this topic.
React Native Performance Optimization
Improve React Native apps speed and efficiency through targeted performance enhancements.
On-device AI
Run AI models directly on iOS and Android for privacy-first experiences with reliable performance across real devices.
AI Knowledge Integration
Connect AI to your product’s knowledge so answers stay accurate, up to date, and backed by the right sources with proper access control.
Generative AI App Development
Build and ship production-ready AI features across iOS, Android, and Web with reliable UX, safety controls, and observability.