
Multiverse Computing pushes its compressed AI models into the mainstream

Why This Matters

Multiverse Computing's advancements in compressed AI models enable more efficient, privacy-preserving, and accessible AI experiences directly on user devices. This shift toward edge AI reduces reliance on cloud infrastructure, addressing supply chain and financial risks while enhancing user privacy and responsiveness. As AI models become smaller and more capable, they open new opportunities for both consumers and developers to deploy AI solutions more securely and cost-effectively.

Key Takeaways

With private company defaults running at upwards of 9.2% — the highest rate in years — VC firm Lux Capital recently advised companies relying on AI to get their compute capacity commitments confirmed in writing. With financial instability rippling through the AI supply chain, Lux warned, a handshake agreement isn’t enough.

But there’s another option entirely, which is to stop relying on external compute infrastructure altogether. Smaller AI models that run directly on a user’s own device — no data center, no cloud provider, no counterparty risk — are getting good enough to be worth considering. And Multiverse Computing is raising its hand.

The Spanish startup has so far kept a lower profile than some of its peers, but as demand for AI efficiency grows, this is changing. After compressing models from major AI labs including OpenAI, Meta, DeepSeek and Mistral AI, it has launched both an app that showcases the capabilities of its compressed models and an API portal — a gateway that lets developers access and build with those models — that makes them more widely available.

The CompactifAI app, which shares its name with Multiverse's quantum-inspired compression technology, is an AI chat tool in the vein of ChatGPT or Mistral's Le Chat. Ask a question, and the model answers. The difference is that the app embeds Gilda, a model small enough to run locally and offline, according to the company.

For end users, this is a taste of AI on the edge, with data that doesn't leave their devices and doesn't require a connection. But there's a caveat: their mobile devices must have enough RAM and storage. If they don't (and many older iPhones won't), the app falls back to cloud-based models via API. Routing between local and cloud processing is handled automatically by a system Multiverse has named Ash Nazg, a name Tolkien fans will recognize from the One Ring inscription in "The Lord of the Rings." When the app routes to the cloud, however, it loses its main privacy edge.
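The routing behavior described above amounts to a capability check: run the compressed model on-device when hardware allows, otherwise fall back to a cloud API. Here is a minimal sketch of that idea in Python. Multiverse has not published how Ash Nazg actually works, so the thresholds, names, and structure below are invented for illustration:

```python
# Hypothetical sketch of capability-based model routing, as described in the
# article. Thresholds and names are illustrative assumptions, not Multiverse's
# actual implementation.

from dataclasses import dataclass


@dataclass
class Device:
    ram_gb: float          # available RAM on the device
    free_storage_gb: float # free storage for the model weights


# Invented minimum requirements for hosting the on-device model.
MIN_RAM_GB = 4.0
MIN_STORAGE_GB = 2.0


def route(device: Device) -> str:
    """Return 'local' if the device can host the model, else 'cloud'.

    Note the trade-off from the article: the 'cloud' path sends the
    prompt off-device and loses the local-processing privacy benefit.
    """
    if device.ram_gb >= MIN_RAM_GB and device.free_storage_gb >= MIN_STORAGE_GB:
        return "local"
    return "cloud"


print(route(Device(ram_gb=8.0, free_storage_gb=10.0)))  # local
print(route(Device(ram_gb=3.0, free_storage_gb=10.0)))  # cloud
```

In practice such a router would also weigh battery state, model size, and network availability, but the core decision is this kind of threshold check.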

These limitations mean CompactifAI is not quite ready for mass consumer adoption, although that may never have been the goal. According to data from Sensor Tower, the app had fewer than 5,000 downloads in the past month.

The real target is businesses. Today, Multiverse is launching a self-serve API portal that gives developers and enterprises direct access to its compressed models — no AWS Marketplace required.

