How to Run Any LLM On-Device With React Native

Local, on-device LLMs unlock private-by-default, low-latency AI experiences that work offline, which makes them ideal for mobile. In this talk, Szymon will show how to run LLMs directly inside React Native apps using an AI SDK that provides a powerful abstraction layer to simplify building AI applications. Join him as he explores react-native-ai, a library that enables local LLM execution. He'll dive into the provider architecture, integrations with the MLC LLM Engine and Apple's Foundation Models, and what's new: an improved debugging experience, tool calling support, and agent pipelines for building AI workflows that run entirely on-device.
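To give a feel for the provider architecture described above, here is a minimal sketch of calling an on-device model through the Vercel AI SDK's streaming API. The package path `@react-native-ai/apple` and the `apple()` provider factory are assumptions for illustration (the talk, not this page, covers the actual API); `streamText` and `textStream` are the AI SDK's real streaming primitives.

```typescript
// Sketch only: '@react-native-ai/apple' and apple() are assumed names for an
// on-device provider that implements the AI SDK provider interface.
import { streamText } from 'ai';
import { apple } from '@react-native-ai/apple';

export async function askOnDevice(prompt: string): Promise<string> {
  // With an on-device provider, token generation happens locally:
  // no network round-trip, and prompts never leave the device.
  const { textStream } = streamText({
    model: apple(), // hypothetical Apple Foundation Models provider
    prompt,
  });

  let answer = '';
  for await (const chunk of textStream) {
    answer += chunk; // append tokens as they stream in
  }
  return answer;
}
```

Because the provider conforms to the same interface as cloud providers, swapping between a local MLC-backed model and a hosted one is a one-line change to the `model` argument.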
