Spain-based Multiverse Computing unveiled two extremely small yet high-performing AI models on August 14, 2025, grouping them into a "Model Zoo" family as a playful nod to the fact that they are comparable in size to the brains of a fly and a chicken, respectively. The company's SuperFly model contains just 94 million parameters – roughly the scale of a fly's brain – while ChickBrain, a compressed version of Meta's Llama 3.1 8B model, operates with 3.2 billion parameters. Both models can run entirely offline on devices ranging from smartphones to household appliances, without requiring an internet connection.
Multiverse Computing's models use the company's quantum-inspired CompactifAI compression algorithm, which significantly reduces AI model sizes without sacrificing performance. Founded in 2019 by Román Orús, Samuel Mugel, and Enrique Lizaso Olmos, the company secured €189 million in funding in June 2025, in a round led by Bullhound Capital with participation from investors including HP Tech Ventures and Toshiba. According to the company's internal tests, ChickBrain slightly outperforms the original Llama 3.1 8B on several standard benchmarks – including MMLU-Pro, Math 500, GSM8K, and GPQA Diamond – despite being less than half its size.
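CompactifAI's internals are proprietary, so the sketch below is only an analogy, not Multiverse's actual tensor-network method: it illustrates the broad idea behind this family of compression techniques by replacing a layer's dense weight matrix with a truncated low-rank factorisation, so that far fewer parameters need to be stored and multiplied at inference time. The layer size and rank are arbitrary choices for illustration.

```python
# Illustrative analogy only: CompactifAI itself is proprietary and based on
# quantum-inspired tensor networks. This shows the general idea of shrinking
# a layer's dense weight matrix via a truncated low-rank factorisation.
import numpy as np

def compress_layer(weights: np.ndarray, rank: int):
    """Replace a dense (out, in) weight matrix with two thin factors."""
    u, s, vt = np.linalg.svd(weights, full_matrices=False)
    a = u[:, :rank] * s[:rank]   # shape (out, rank)
    b = vt[:rank, :]             # shape (rank, in)
    return a, b

def parameters_kept(weights: np.ndarray, rank: int) -> float:
    """Fraction of the original parameter count the factorised layer stores."""
    out_dim, in_dim = weights.shape
    return rank * (out_dim + in_dim) / (out_dim * in_dim)

if __name__ == "__main__":
    w = np.random.randn(4096, 4096)          # a dense layer of a toy model
    a, b = compress_layer(w, rank=256)
    # At inference, x @ w.T is approximated by (x @ b.T) @ a.T,
    # trading a little accuracy for a much smaller memory footprint.
    print(f"parameters kept: {parameters_kept(w, 256):.1%}")  # ~12.5%
```

In practice the usable rank (and hence the size reduction) depends on how much redundancy each layer carries, which is why compressed models are re-benchmarked against their originals, as with ChickBrain above.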
The SuperFly model is specifically designed for embedding in household appliances and IoT (Internet of Things) devices, enabling simple voice commands such as "start quick wash" on a washing machine. In contrast, ChickBrain offers higher-level reasoning capabilities and can run on a MacBook without an internet connection. Multiverse is already in discussions with Apple, Samsung, Sony, and HP about integrating the technology into consumer devices, while also making its compressed models available to developers as an API via AWS, often at lower per-token fees than competitors. While these models won't beat the largest AI systems on global leaderboards, their advantage lies in size-to-performance efficiency, which is crucial for the future of embedded AI in devices.
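Developer access through the AWS-hosted API would look, in outline, like any other hosted-model call. The following sketch is hypothetical: the endpoint URL, model identifier, and payload shape are placeholders rather than Multiverse's documented interface, and an OpenAI-style chat schema is assumed purely for illustration.

```python
# Hypothetical client sketch: the endpoint URL, model name and payload shape
# are placeholders, not Multiverse's documented API. The point is simply that
# a device backend talks to the hosted compressed model over plain HTTPS.
import os
import requests

API_URL = "https://example-compactifai-endpoint/v1/chat/completions"  # placeholder
API_KEY = os.environ.get("COMPACTIFAI_API_KEY", "your-key-here")

def ask(prompt: str) -> str:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "superfly",  # assumed model identifier
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 64,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Start a quick wash cycle."))
```

For fully offline use cases such as the washing-machine example, the same small model would instead be bundled on the device itself, with no network call involved.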