Mistral Saba is a regionally focused 24B AI model from Mistral AI that excels in Arabic and many Indian languages. It outperforms larger models within its target region and is available via API or local deployment.

Mistral Saba
Culturally Relevant AI for Arabic and Indian Languages.

Mistral Small 3 is Mistral's most efficient and versatile model: 24B parameters, Apache 2.0 licensed, 81% on MMLU, and 150 tokens/s, released in both pre-trained and instruction-tuned versions. Trained without synthetic data, it makes a great base for reasoning tasks.

Mistral Small 3
High performance in a 24B open-source model

Le Chat combines powerful AI with extensive information on the web to help you rediscover the world. Enjoy natural conversations, real-time internet search, comprehensive document analysis, and much more.

Le Chat
Your AI assistant for life and work

Choose among Mistral Small, Mistral Large, and Mistral Next. Mistral Large supports English, French, Spanish, German, and Italian.
Le Chat
Multilingual conversational assistant built on Mistral AI

Mistral Large has top-tier reasoning capabilities, is multilingual by design, offers native function calling, and supports a 32k-token context window. The pre-trained model scores 81.2% accuracy on MMLU.

Mistral Large
Cheaper GPT-4 rival that supports 32K-token context windows
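Since the entries above mention API access, a minimal sketch of assembling a chat request for Mistral's chat-completions HTTP API may help. The endpoint URL and field names follow Mistral's OpenAI-compatible request schema; the model alias and prompt are illustrative, and no network call is made here.

```python
import json

# Assumed public endpoint for Mistral's chat-completions API.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "mistral-large-latest") -> dict:
    """Assemble the JSON body for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Build and print the request body; sending it would also require an
# Authorization header with a Mistral API key.
body = build_chat_request("Summarize function calling in one sentence.")
print(json.dumps(body, indent=2))
```

The same body shape works for the other hosted models in this feed (e.g. Mistral Small) by swapping the `model` field.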