Expanding On-Device Apple LLM Capabilities: Introducing Tool Calling

As part of our ongoing efforts to bring first-class LLM execution to mobile devices, today we are releasing an update to our Apple provider for the Vercel AI SDK. This release enables a key feature of the Apple on-device LLM: tool calling.

import { generateText } from 'ai'
import { apple } from '@react-native-ai/apple'

const response = await generateText({
  model: apple(),
  system: `Help the person with getting weather information.`,
  prompt: 'What is the weather in Wroclaw?',
  tools: {
    getWeather, // defined later in this post
  },
})

To get started, install it from npm:

npm install @react-native-ai/apple@latest

or continue reading to learn more about what’s new in this release!

What is Apple LLM?

Apple introduced on-device LLM capabilities on June 9, 2025, during its Worldwide Developers Conference (WWDC). With iOS 26, Apple is opening up broader access to these capabilities for third-party developers. The on-device models are optimized for efficiency and speed on iPhones.

What's Available in the New Release?

A while back, we announced an early release of our Apple provider for the Vercel AI SDK, which made it possible to generate text and structured outputs, as detailed in our previous blog post. Today, we are expanding that support by bringing tool calling to the provider.
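For reference, here is a minimal sketch of that existing capability, plain text generation with the on-device model (the prompt is just an illustration):

import { generateText } from 'ai'
import { apple } from '@react-native-ai/apple'

// Plain text generation, available since the initial release
const { text } = await generateText({
  model: apple(),
  prompt: 'Suggest a name for a weather app.',
})
console.log(text)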

Tool Calling

Tool calling allows an LLM to interact with external functions and services. With this release, the Apple provider supports the full tool() interface as defined in the AI SDK. This means the on-device Apple LLM can now invoke your JavaScript functions, letting you reuse existing tools from other projects in your mobile applications.

Here is an example of a simple tool that returns weather information for a given city:

import { tool } from 'ai'
import { z } from 'zod'

const getWeather = tool({
  description: 'Retrieve the weather for a given city',
  inputSchema: z.object({
    city: z.string().describe('The city to get the weather for'),
  }),
  execute: async ({ city }) => {
    // Return a mock temperature between 10°C and 29°C
    const temperature = Math.floor(Math.random() * 20) + 10
    return `Weather forecast for ${city}: ${temperature}°C`
  },
})

To use it, pass it to the AI SDK method of your choice:

const response = await generateText({
  model: apple(),
  system: `Help the person with getting weather information.`,
  prompt: 'What is the weather in Wroclaw?',
  tools: {
    getWeather,
  },
})
console.log(response.text)

Caveats

It is worth noting that tools are executed directly by the provider. As a result, certain AI SDK features, such as step callbacks (onStepFinish) or the maxSteps setting, will not take effect, because Apple's on-device execution manages these aspects natively.

However, you can still inspect what the model did by reading the toolCalls and toolResults properties on the result.
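For example, after the generateText call above, you could log each call and its result like this (property names here assume AI SDK v5, where tool calls expose input and tool results expose output):

// toolCalls and toolResults are standard AI SDK result properties
for (const call of response.toolCalls) {
  console.log(`Model called ${call.toolName} with`, call.input)
}
for (const result of response.toolResults) {
  console.log('Tool returned:', result.output)
}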

Structured Output

The Apple provider supports generating structured data together with tool calling via the experimental_output setting. This lets you define a schema (e.g., using Zod) that the LLM's output will conform to, while still allowing tool calls when necessary.

For example, to receive weather information as a structured object, you can define the following schema:

import { generateText, Output } from 'ai'
import { apple } from '@react-native-ai/apple'
import { z } from 'zod'

const response = await generateText({
  model: apple(),
  system: `Help the person with getting weather information.`,
  prompt: 'What is the weather in Wroclaw?',
  tools: {
    getWeather,
  },
  experimental_output: Output.object({
    schema: z.object({
      weather: z.string().describe('The weather'),
    }),
  }),
})
console.log(response.experimental_output)
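Because the schema is defined with Zod, the resolved output is a typed object, so you can read its fields directly (the sample value in the comment is purely illustrative):

// Shape follows the Zod schema above, e.g. { weather: 'Sunny, 18°C' }
const { weather } = response.experimental_output
console.log(weather)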

What’s Next?

We're not stopping there. Our roadmap includes:

  • Android Support: We continue to work on bringing comparable on-device LLM capabilities to Android devices. Our goal is to provide a consistent and powerful experience across major mobile platforms.
  • Stable Release: We plan to release stable versions of both iOS and Android providers in conjunction with the public launch of iOS 26, anticipated in mid-September.

We are committed to bringing first-class support for on-device AI to React Native developers. Make sure to star our GitHub repository and watch for new updates!
