Run LLMs Locally With React Native & MLC

Date
Tuesday, June 3, 2025
Time
6 PM (CET)
Location
Online

Watch how MLC brings on-device AI to React Native. See react-native-ai in action and discover what’s next for local model execution.


Organizer
Callstack
Speakers
Mike Grabowski, CTO & Founder @ Callstack
Szymon Rybczak, Software Developer @ Callstack
Kewin Wereszczyński, Software Engineer @ Callstack

What if your React Native app could run a large language model right on the device, with no cloud required? That’s no longer a hypothetical.

In this live stream, Callstack’s Mike Grabowski, Szymon Rybczak, and Kewin Wereszczyński walk you through how they made on-device LLM execution possible using the MLC LLM Engine and the new react-native-ai package.

The project started as a window.ai-inspired proof of concept showcased at The Geek Conf. Now, it’s a growing open-source effort under the Callstack Incubator, aimed at making edge AI development easier and more accessible for mobile devs. You’ll see how the integration with Vercel’s AI SDK simplifies model switching, how MLC powers real-time text generation locally, and what we’re doing next to expand model support and improve developer experience.
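The appeal of pairing react-native-ai with the AI SDK is that app code targets a single call surface with a swappable model object, so moving between a cloud-hosted model and a local MLC one doesn’t ripple through the codebase. Here is a minimal, self-contained sketch of that pattern; the interface and model objects are illustrative mocks, not the real `ai` or `react-native-ai` APIs:

```typescript
// Sketch of the "swappable model" pattern the AI SDK popularized.
// All names below are illustrative, not the real library APIs.

interface LanguageModel {
  id: string;
  generate(prompt: string): Promise<string>;
}

// Stand-in for a cloud-hosted model behind an HTTP API.
const cloudModel: LanguageModel = {
  id: "cloud/hosted-llm",
  generate: async (prompt) => `cloud answer to: ${prompt}`,
};

// Stand-in for an on-device model served by the MLC engine.
const localModel: LanguageModel = {
  id: "local/mlc-llm",
  generate: async (prompt) => `local answer to: ${prompt}`,
};

// App code depends only on the interface, so switching models is one line
// at the call site -- the rest of the app is untouched.
async function ask(model: LanguageModel, prompt: string): Promise<string> {
  return model.generate(prompt);
}

async function main() {
  console.log(await ask(cloudModel, "hello")); // served remotely
  console.log(await ask(localModel, "hello")); // served on-device, no network
}

main();
```

In the real integration, the on-device provider exposes the same model shape the AI SDK expects, which is what makes the switch this small.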

Expect a hands-on walkthrough, straight talk about implementation trade-offs, and a peek at what’s coming, from better documentation to deeper platform integration. Whether you’re building for privacy, performance, or offline resilience, this session will show you where AI in React Native is headed, and how you can be part of it.

Register now
Integrating AI into your React Native workflow?

We help teams leverage AI to accelerate development and deliver smarter user experiences.

Let's chat

Insights

Learn more about AI

Here's everything we published recently on this topic.

AI

We can help you move it forward!

At Callstack, we work with companies big and small, pushing React Native forward every day.

On-device AI

Run AI models directly on iOS and Android for privacy-first experiences with reliable performance across real devices.

AI Knowledge Integration

Connect AI to your product’s knowledge so answers stay accurate, up to date, and backed by the right sources with proper access control.

Generative AI App Development

Build and ship production-ready AI features across iOS, Android, and Web with reliable UX, safety controls, and observability.

AI Vibe Coding Cleanup

Turn AI-generated code from tools like Cursor, Claude Code, Codex, or Replit into production-ready software by tightening structure, validating safety, and making it stable under real-world usage.

React Native Performance Optimization

Improve your React Native app’s speed and efficiency through targeted performance enhancements.