Mistral Launches Cost-Effective, High-Performance AI Model: Mistral Medium 3
French AI startup Mistral has released its latest large language model (LLM), Mistral Medium 3, emphasizing efficiency and affordability without sacrificing performance.
Mistral claims Mistral Medium 3 performs at or above 90% of the level of Anthropic's Claude Sonnet 3.7, a significantly more expensive model, across various benchmarks. According to the company, it also outperforms recent open models such as Meta's Llama 4 Maverick and Cohere's Command A on standard AI performance evaluations.
Mistral Medium 3 is available through Mistral's API, priced at $0.40 per million input tokens and $2 per million output tokens. For reference, a million tokens is equivalent to roughly 750,000 words.
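To give a rough sense of what that pricing means in practice, here is a minimal Python sketch that estimates the cost of a single request. The per-token rates come from the figures above; the token counts in the example are made up for illustration.

```python
# Rough cost estimate for one Mistral Medium 3 API call.
# Rates are taken from the pricing quoted above; token counts are illustrative.
INPUT_PRICE_PER_M = 0.40   # USD per million input tokens
OUTPUT_PRICE_PER_M = 2.00  # USD per million output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the approximate USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a 2,000-token prompt that yields a 500-token answer
# costs about $0.0018.
print(f"${estimate_cost(2_000, 500):.4f}")
```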
Deployment and Cost Advantages
Mistral Medium 3 is designed for flexible deployment: it can run on any cloud, as well as in self-hosted environments with four or more GPUs. Mistral states that its pricing undercuts cost leaders like DeepSeek v3, both for API access and for self-deployed systems.

Use Cases and Target Industries
Mistral highlights Mistral Medium 3's strengths in coding, STEM tasks, and multimodal understanding. The company reports beta testing with clients in financial services, energy, and healthcare for applications such as customer service, workflow automation, and complex data analysis.
Beyond API access, through which enterprise customers can work with Mistral on fine-tuning, Mistral Medium 3 is also available on Amazon SageMaker. Mistral plans to expand availability to other platforms, including Microsoft Azure AI Foundry and Google Vertex AI.
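As a rough illustration of what API access looks like, the sketch below sends a chat request to Mistral's hosted chat completions endpoint over plain HTTP. The "mistral-medium-latest" model identifier is an assumption based on Mistral's usual naming scheme, not a detail from this announcement, and may differ from the identifier Mistral actually uses.

```python
# Minimal sketch of a chat request to Mistral's hosted API.
# The model name "mistral-medium-latest" is assumed for illustration.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "mistral-medium-latest",  # assumed identifier
        "messages": [
            {"role": "user",
             "content": "Summarize this quarterly report in three bullet points."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```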
Mistral's Expanding AI Ecosystem
Following the release of Mistral Small 3.1 in March, Mistral Medium 3 marks another step in the company's growth. The company has hinted at the upcoming release of an even larger model.
Mistral also launched Le Chat Enterprise, a chatbot service for businesses. This service features an AI agent builder and integrates Mistral's models with third-party platforms like Gmail, Google Drive, and SharePoint. Le Chat Enterprise, previously in private preview, is now generally available.
Furthermore, Le Chat Enterprise will soon support the Model Context Protocol (MCP), Anthropic's standard for connecting AI assistants to data sources. Other major AI providers, including Google and OpenAI, have also adopted MCP.