Apple Announces Foundation Models Framework for Developers to Leverage AI

Apple at WWDC today announced the Foundation Models framework, a new API that allows third-party developers to leverage the large language models at the heart of Apple Intelligence and build them into their apps.

With the Foundation Models Framework, developers can integrate Apple's on-device models directly into apps, allowing them to build on Apple Intelligence.

"Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of Software Engineering. "We're also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline. We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can't wait to see what developers create."

The Foundation Models framework lets developers build AI-powered features that work offline, protect privacy, and incur no inference costs. For example, an education app can generate quizzes from user notes on-device, and an outdoors app can offer offline natural language search.
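
For a sense of what this looks like in practice, here is a minimal sketch of an offline query in Swift. The `LanguageModelSession` type and `respond(to:)` method follow what Apple demonstrated at WWDC; the hiking-app framing is a hypothetical example, and exact names should be treated as assumptions until the documentation ships.

```swift
import FoundationModels

// Minimal sketch: answer a hiker's question entirely on device.
// Works offline and incurs no per-request inference cost.
func answerTrailQuestion(_ question: String) async throws -> String {
    // Instructions steer the model's behavior for the whole session.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant for an offline hiking app."
    )
    let response = try await session.respond(to: question)
    return response.content
}
```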

Apple says the framework is available for testing starting today through the Apple Developer Program at developer.apple.com, and a public beta will be available through the Apple Beta Software Program next month at beta.apple.com. It includes built-in features like guided generation and tool calling for easy integration of generative capabilities into existing apps.
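
The guided generation piece is worth a closer look: a Swift macro constrains the model's output to a developer-defined type. Below is a minimal sketch, assuming the `@Generable` and `@Guide` macros and the `respond(to:generating:)` method work as shown in Apple's WWDC session; the `Quiz` type is invented for illustration.

```swift
import FoundationModels

// Guided generation sketch: the model's output is constrained to this
// type, so the app receives a typed value rather than raw text to parse.
@Generable
struct Quiz {
    @Guide(description: "Three short questions based on the user's notes")
    var questions: [String]
}

func makeQuiz(from notes: String) async throws -> Quiz {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Create a study quiz from these notes:\n\(notes)",
        generating: Quiz.self
    )
    return response.content  // already a Quiz; no string parsing needed
}
```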


Top Rated Comments

heretiq
12 weeks ago

Aren't the on device models quite limited in capabilities? What can they do? In any case even access to a limited model could be huge.
While Apple’s OpenELM LLM is no ChatGPT, it is a very capable resource for incorporating a conversational interface, on-device RAG, and LoRA fine-tuning into an app.

We used it to incorporate a fine-tuned model into one of our apps and were very pleased with the results. We held off on shipping the updated app because, at the time, it would have required the app to download a 3.8GB fine-tuned OpenELM model from Hugging Face, which we didn’t want to ask users to do.

We also tried implementing the same AI feature set by incorporating ChatGPT and Perplexity via API. It worked functionally, but the latency and API costs were prohibitive for our use case.

Need to see the details, but this announcement could eliminate all of these problems, assuming the foundation models include OpenELM equivalents and the API supports fine-tuning via adapters, as announced at last year’s WWDC. I can’t wait to see the details and start working with the beta!

Follow-up: after watching the WWDC Platforms State of the Union, the AFM framework is a bona fide game changer!

The AFM framework and APIs exceed my expectations by delivering utility that goes well beyond the functionality I was hoping for, which was simply: (a) a capable on-device LLM that eliminates the need for costly, high-latency, off-device third-party LLMs; (b) adapters to allow model fine-tuning; (c) user data privacy; and (d) offline operation.

The unexpected benefits include tool calling, response streaming, and model macros that eliminate complex and error-prone parsing and mapping of LLM responses to app data structures.

I took the last item (complex, imprecise, and time-consuming parsing and mapping) as a given, something developers should simply expect to do when incorporating LLMs into an app with structured data, and was completely surprised to hear that Apple has eliminated this issue for Apple platform developers! This is a really big deal, because this single issue is a limiting factor for many app development use cases. Before Apple’s specialized macro utility, the solution was either complex, brittle regular expressions that were guaranteed to fail eventually (because LLM output is non-deterministic), or ballooning LLM API cost and latency from adding guardrails to constrain the LLM output to behave more deterministically.
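
The response streaming mentioned above looks equally simple, by the way. A rough sketch, assuming `streamResponse(to:)` behaves as demonstrated in the session (the exact element types are my assumption until the beta documentation is out):

```swift
import FoundationModels

// Response streaming sketch: partial snapshots arrive as the model
// generates, so the UI can render output incrementally.
func streamAnswer(to prompt: String) async throws {
    let session = LanguageModelSession()
    for try await partial in session.streamResponse(to: prompt) {
        // Each element is a cumulative snapshot of the response so far.
        print(partial)
    }
}
```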

The final word will depend on the stability of the AFM implementation and how well it matches what was demonstrated, but this developer is very pleased. The AFM API is a year late, but definitely far better than expected.

Bravo, Apple. Thank you!
Score: 6 Votes (Like | Disagree)
MacTwick
12 weeks ago
This is huge. I have so many ideas for my app now! I can't wait.
Score: 4 Votes (Like | Disagree)
macduke
12 weeks ago
I think this could open up a lot of new and exciting apps for developers to build, but probably at the expense of battery life. It will be interesting to see how this evolves. I think this will be one of those things that is a much bigger deal a few years down the road, so better to see it now than later.
Score: 3 Votes (Like | Disagree)
heretiq
12 weeks ago

Finally!! Been waiting a year for this. Need to see the details but this could be a game changer for devs.
Score: 1 Votes (Like | Disagree)
name99
12 weeks ago

Aren't the on device models quite limited in capabilities? What can they do? In any case even access to a limited model could be huge.
They have two main capabilities:
- "understanding" language, and
- "understanding" images.

The obvious thing you can do is take baby steps toward a language-driven UI. Imagine telling UberEats something like, "What was that Asian food I ordered last week? Can you order me that again?"
I think at least part of why Apple is doing this is research, to see how this plays out in the real world.
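
If you want to picture the mechanics, the food-ordering idea above would presumably be wired up through the framework's tool calling. A hypothetical sketch, assuming the `Tool` protocol works as Apple demonstrated; the order-history tool itself is invented for illustration:

```swift
import FoundationModels

// Hypothetical tool: lets the model search the user's order history
// when a request like "reorder that Asian food" needs app data.
struct PastOrdersTool: Tool {
    let name = "searchPastOrders"
    let description = "Searches the user's past food orders by keyword."

    @Generable
    struct Arguments {
        @Guide(description: "Search terms, e.g. a cuisine or dish name")
        var query: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        // A real app would query its local order history here.
        ToolOutput("Pad Thai from Thai Garden, ordered last Tuesday")
    }
}

// The session can then invoke the tool whenever the model decides it needs to.
let session = LanguageModelSession(
    tools: [PastOrdersTool()],
    instructions: "Help the user reorder meals from their order history."
)
```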

There are also some less obvious capabilities this enables. For example, imagine a note-taking app that creates quizzes from your last week of notes, so you can see what you remember versus what you don't. (I used the word "remember" deliberately. A better app would create quizzes to see what you UNDERSTAND, but that's probably still too much to expect, even from a leading-edge LLM, let alone a small edge model.)

Similarly, we will presumably see things like photo-editing apps where you can just tell the app "remove Jenna's face" and see what happens.
Again, this is research. Ultimately the goal is a system-wide language UI, not dedicated per-app code handling this stuff. But at least this gets Apple some of the way there for a year or two while they figure out the bigger solution.
Score: 1 Votes (Like | Disagree)