As AI models increasingly become commoditized, startups are racing to build the software layer that sits on top of them. One interesting entrant into this space is Osaurus, an open source, Apple-only LLM server that lets users switch between different AI models, whether local or in the cloud, while keeping their files and tools on their own hardware.
Osaurus evolved out of the idea for a desktop AI companion, Dinoki, which Osaurus co-founder Terence Pae described as a kind of “AI-powered Clippy.” Dinoki’s customers had asked him why they should buy the app if they still had to pay for tokens, the usage units AI companies charge for processing prompts and generating responses.
That got Pae thinking more deeply about running AI locally.
“That’s really how Osaurus started,” Pae, previously a software engineer at Tesla and Netflix, told TechCrunch over a call. The idea, he explained, was to try to run an AI assistant locally. “You can do pretty much everything on your Mac locally, like browsing your files, accessing your browser, accessing your system configurations. I figured this would be a great way to position Osaurus as a personal AI for individuals.”
Pae began building the tool in public as an open-source project, adding features and fixing bugs along the way.
Image Credits: Osaurus, Inc.

Today, Osaurus can flexibly connect to locally hosted AI models or to cloud providers like OpenAI and Anthropic. Users can freely choose which AI models they’re using, and keep other aspects of the AI experience on their own hardware, like the models’ memory, or their files and tools.
Given that different AI models have different strengths, the advantage of this setup is that users can switch to the AI model that best fits their needs.
Such a structure makes Osaurus what’s called a “harness”: a control layer that connects different AI models, tools, and workflows through a single interface, similar to tools like OpenClaw or Hermes. The difference, however, is that such tools are often aimed at developers who know their way around a terminal. And sometimes, as in the case of OpenClaw, they may pose security issues and holes to worry about.
Osaurus, meanwhile, offers an easy-to-use interface aimed at consumers, and addresses security concerns by running things in a hardware-isolated virtual sandbox. This limits the AI to a certain scope, keeping your machine and data safe.
Of course, the practice of running AI models on your own device is still in its early days, given how resource-intensive and hardware-dependent it is. To run local models, your system will need at least 64 GB of RAM. For running larger models, like DeepSeek v4, Pae recommends systems with about 128 GB of RAM.
But Pae believes local AI’s hardware requirements will come down in time.
“I can see the potential of it, because the intelligence per watt, which is like the metric for local AI, has been going up significantly. It’s on its own curve of innovation. Last year, local AI could barely finish sentences, but today it can actually run tools, write code, access your browser, and order stuff from Amazon […] it’s just getting better and better,” he said.
Osaurus today can run MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, DeepSeek V4, and other models. It also supports Apple’s on-device foundation models and Liquid AI’s LFM family of on-device models, and in the cloud, it can connect to OpenAI, Anthropic, Gemini, xAI/Grok, Venice AI, OpenRouter, Ollama, and LM Studio.
As a full MCP (Model Context Protocol) server, Osaurus can give any MCP-compatible client access to your tools as well. Plus, it ships with over 20 native plugins for Mail, Calendar, Vision, macOS Use, XLSX, PPTX, Browser, Music, Git, Filesystem, Search, Fetch, and more.
More recently, Osaurus was updated to include voice capabilities as well.
Since the project went live about a year ago, it has been downloaded north of 112,000 times, according to its website.
Currently, Osaurus’ founders (who include co-founder Sam Yoo) are participating in the New York-based startup accelerator Alliance. They’re also thinking about next steps, which could see Osaurus being offered to businesses, like those in the legal space or in healthcare, where running local LLMs could address privacy concerns.
As local AI models grow more powerful, the team believes they could lower the need for AI data centers.
“We’re seeing this explosive growth in the AI space where [cloud AI providers] have to scale up using data centers and infrastructure, but we feel like people haven’t really seen the value of local AI yet,” Pae said. “Instead of relying on the cloud, they can actually deploy a Mac Studio on-prem, and it should use substantially less power. You still have the capabilities of the cloud, but you will not be dependent on a data center to be able to run that AI,” he added.