Mistral Unveils Its Enhanced Small 3.1 Multimodal Model


Released on 17 March 2025 under the Apache 2.0 licence, Mistral's Small 3.1 is a 24-billion-parameter model that delivers improved text generation, enhanced multimodal capabilities, and an expanded context window of 128,000 tokens, while maintaining an inference speed of 150 tokens per second. It outperforms similarly sized competitors, including Gemma 3 and GPT-4o mini, across several key performance benchmarks.

Mistral Small 3.1 delivers strong results across both instruction-following and multilingual evaluation benchmarks. It achieved top scores on GPQA Main (graduate-level question answering), GPQA Diamond (its harder subset), MMLU (a broad-spectrum knowledge benchmark), and HumanEval (a programming capability test). On the multimodal side, the model ranks among the best on metrics such as MMMU-Pro (multimodal understanding), MM-MT-Bench (multimodal multi-turn performance), ChartQA (chart interpretation), and AI2D (diagram understanding).

In its official announcement, Mistral AI highlighted that Small 3.1 can be run on a single RTX 4090 GPU or even on Mac devices with 32GB of RAM, making it well suited to local deployment. The model performs especially well in scenarios requiring rapid conversational responses and low-latency function calls, and in specialised domains such as legal advisory, medical diagnostics, and technical support.

As of 20 March 2025, Mistral Small 3.1 is publicly available via the GitHub Models platform, where developers can test, compare, and integrate it into their own codebases free of charge. This versatile model excels at coding, mathematical reasoning, dialogue handling, and comprehensive document analysis, while supporting both textual and visual inputs.

Sources:

1. Mistral Small 3.1 | Mistral AI
2. Mistral Small 3.1 (25.03) is now generally available in GitHub Models · GitHub Changelog
3. Mistral Small 3.1: The Best Model in its Weight Class