Introducing AnyLanguageModel: One API for Local and Remote LLMs on Apple Platforms
Published November 20, 2025 · Mattt

LLMs have become essential tools for building software. But for Apple developers, integrating them remains unnecessarily painful.

Developers building AI-powered apps typically take a hybrid approach, adopting some combination of:

- Local models using Core ML or MLX for privacy and offline capability
- Cloud providers like OpenAI or Anthropic for frontier capabilities
- Apple's Foundation Models as a system-level fallback

Each comes with different APIs, different requirements, and different integration patterns. It's a lot, and it adds up quickly.

When I interviewed developers about building AI-powered apps, friction with model integration came up immediately. One developer put it bluntly:

> I thought I'd quickly use the demo for a test and maybe a quick and dirty build, but instead wasted so much time. Drove me nuts.

The cost to experiment is high, which discourages developers from discovering that local, open-source models might actually work great for their use case.

Today we're announcing AnyLanguageModel, a Swift package that provides a drop-in replacement for Apple's Foundation Models framework with support for multiple model providers. Our goal is to reduce the friction of working with LLMs on Apple platforms and make it easier to adopt open-source models that run locally.

## The Solution

The core idea...