Multiverse Computing Unveils Compact AI Models for IoT Devices


Multiverse Computing recently drew attention with the release of two compact AI models, ChickBrain and FlyBrain. The models are part of an effort to bring powerful chat-style AI capabilities to Internet of Things (IoT) devices, and they do not rely on an internet connection to function. This lets users harness modern AI technology even in regions with little or no connectivity.

ChickBrain, the larger of the two models, has 3.2 billion parameters and shows strong reasoning ability. Benchmark results indicate it outperforms its predecessor on standard evaluations, including MMLU-Pro, Math 500, GSM8K, and GPQA Diamond. FlyBrain, by contrast, is much smaller and offers more limited functionality than ChickBrain.

Founded in 2019, Multiverse Computing has rapidly captured the world’s attention for its innovative approach to AI model compression through quantum computing. The company has raised approximately $250 million, including a recent €189 million ($215 million) funding round led by Bullhound Capital, with contributions from HP Tech Ventures and Toshiba. This financial support will allow Multiverse Computing to continue developing its cutting-edge technology.

Román Orús, one of the co-founders and a European professor specializing in quantum physics, emphasized the unique nature of the company's compression technology.

“We have a compression technology that is not the typical compression technology that the people from computer science or machine learning will do because we come from quantum physics.” – Román Orús

Multiverse Computing’s emphasis on compact models has implications across diverse applications. The startup provides highly compressed versions of popular open-source models through a simple API hosted on Amazon Web Services (AWS), making it easy for developers to plug these capabilities into their applications. Its catalog of already-compressed models includes Llama 4 Scout, Mistral Small 3.1, and larger models such as DeepSeek R1 Slim.
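For developers, integrating a hosted compressed model typically amounts to a single HTTP call. The sketch below is a minimal Python illustration of that workflow; the endpoint URL, model name, payload fields, and environment variable are assumptions made for demonstration and do not reflect Multiverse Computing's actual API.

```python
import os
import requests

# Hypothetical endpoint and payload shape -- illustrative only, not
# Multiverse Computing's documented API.
API_URL = "https://api.example.com/v1/chat/completions"
API_KEY = os.environ["COMPACT_MODEL_API_KEY"]  # assumed credential

def ask(prompt: str, model: str = "compact-model") -> str:
    """Send a prompt to a hosted compressed model and return its reply."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model,  # e.g. a compressed Llama or Mistral variant
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 128,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize the reading from sensor node 7 in one sentence."))
```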

This approach has drawn a wide variety of clients, including BASF, Ally, Moody’s, and Bosch. Companies are eager for AI solutions that can function without a permanent internet connection, which is what makes demand for Multiverse’s offerings so strong.

Perhaps Orús’ most notable point is the flexibility of the models, which makes them well suited to running on everyday devices.

“You can run them on premises, directly on your iPhone or on your Apple Watch.” – Román Orús

ChickBrain and FlyBrain take on-device AI for IoT a step further. Each expands our sense not just of what is possible with AI technology, but of what should be possible. Putting capable AI models on small hardware could transform how users interact with AI, in both the public and private sectors.
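To make the on-device scenario concrete, the snippet below shows one common way to run a small quantized open-source model entirely locally using the llama-cpp-python library. The model file path is a placeholder; nothing here implies that ChickBrain or FlyBrain are distributed in this format.

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Placeholder path to a small quantized model in GGUF format -- any compact
# open-source model works; this is not an official ChickBrain/FlyBrain file.
llm = Llama(model_path="models/compact-3b-q4.gguf", n_ctx=2048)

# Inference runs entirely on the local device: no network connection needed.
result = llm(
    "Q: What maintenance does a temperature sensor need? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```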

Multiverse Computing’s proprietary model compression technology, named “CompactifAI”, underpins its suite of AI offerings. Rather than conventional machine-learning compression, the technique draws on methods from quantum physics, producing dramatically smaller models with little loss in performance on tasks such as image recognition and natural language processing.
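CompactifAI's internals are not public, but the general idea of shrinking a network's weight matrices can be illustrated with a simple low-rank (truncated SVD) factorization, used here only as a stand-in for the quantum-inspired decompositions the company describes; the matrix sizes and rank are arbitrary.

```python
import numpy as np

def compress_layer(W: np.ndarray, rank: int):
    """Factor a dense weight matrix W (m x n) into two thin matrices
    A (m x rank) and B (rank x n) via truncated SVD.

    Storage drops from m*n to rank*(m + n) values, analogous in spirit
    (though far simpler) to tensor-style compression of model weights.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]  # absorb singular values into A
    B = Vt[:rank, :]
    return A, B

# Toy example with arbitrary sizes: a 1024 x 1024 layer kept at rank 64.
rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))
A, B = compress_layer(W, rank=64)

original = W.size              # 1,048,576 stored values
compressed = A.size + B.size   # 131,072 stored values (8x fewer)
error = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"compression ratio: {original / compressed:.1f}x, relative error: {error:.3f}")
```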

Multiverse Computing’s rapid progress reflects a broader shift in demand toward AI technologies that are more efficient and can be deployed in a wider range of environments. Investor interest in the potential applications of these compact models has picked up only recently, and the impact on industries from healthcare to manufacturing could be enormous.

