It starts with frontier science.

State-of-the-art models with cutting-edge capabilities, from cloud to edge.

Mathstral
Codestral Mamba
Mistral Large
Codestral
Mistral 7B
Mistral NeMo
Mistral Embed
Pixtral

Tailored for you.

Our premier models are designed to be yours to tune, customize, distill, and deploy.

Available for commercial use.

Mistral Large

Mastering complexity with reasoning.

Our flagship model delivers state-of-the-art performance across reasoning, code, and analysis tasks. Designed for the most sophisticated enterprise needs with extensive context understanding and nuanced outputs.

Mistral Small

Enterprise-ready, compact powerhouse.

The most powerful model in its size class, combining efficiency with remarkable capabilities. Ideal for production deployments requiring balance between performance and resource usage.

Mistral Edge

Edge performance, unbeatable value.

Ministral 3B and Ministral 8B provide unprecedented performance for edge deployment. Purpose-built for resource-constrained environments without compromising on essential capabilities.

Codestral

Elevating code generation.

Purpose-built for code generation and understanding, optimized for developer workflows.

Mistral Embed

Enabling internal semantic search.

State-of-the-art embedding model for semantic search and content organization.
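
As a rough illustration of how an embedding model powers semantic search, here is a minimal sketch that calls the embeddings endpoint on La Plateforme and ranks documents by cosine similarity. The https://api.mistral.ai/v1/embeddings route, the "mistral-embed" alias, the MISTRAL_API_KEY environment variable, and the response fields are assumptions to verify against the API reference.

```python
import math
import os

import requests

# Assumed endpoint, model alias, and env var; verify against the Mistral API reference.
API_URL = "https://api.mistral.ai/v1/embeddings"
API_KEY = os.environ["MISTRAL_API_KEY"]


def embed(texts):
    """Request one embedding vector per input text."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "mistral-embed", "input": texts},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumed response shape: {"data": [{"embedding": [...]}, ...]}
    return [item["embedding"] for item in resp.json()["data"]]


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


# Rank a small document set against a query by embedding similarity.
docs = ["Quarterly revenue report", "Employee onboarding checklist", "API rate limit policy"]
query = "What should a new hire read first?"
doc_vecs = embed(docs)
query_vec = embed([query])[0]
ranked = sorted(zip(docs, doc_vecs), key=lambda pair: cosine(query_vec, pair[1]), reverse=True)
print(ranked[0][0])  # expected: the onboarding document
```

Ranking by similarity over the returned vectors is the core loop of most retrieval and content-organization pipelines built on embedding models.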

Multimodal Models

Vision pioneer, multimodal mastery.

Combine text, image, and structured data understanding in a single model. Process diverse input types while maintaining consistent quality across modalities.

Mistral Moderation

Intelligent content safety at scale.

A fine-tuned model offering customizable content moderation across nine safety categories in multiple languages. Designed for both raw text and conversational content with high accuracy and pragmatic safety guardrails.
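
For a sense of how such a classifier is typically invoked, the sketch below assumes a /v1/moderations route, a "mistral-moderation-latest" alias, and a results payload with per-category verdicts; all of these names are assumptions to confirm against the API documentation.

```python
import os

import requests

# Assumed route, model alias, and response fields; check the Mistral API reference for exact names.
API_URL = "https://api.mistral.ai/v1/moderations"
headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

resp = requests.post(
    API_URL,
    headers=headers,
    json={"model": "mistral-moderation-latest", "input": ["User-generated text to screen."]},
    timeout=30,
)
resp.raise_for_status()

# Each result is assumed to carry a per-category verdict across the nine safety categories.
for result in resp.json()["results"]:
    flagged = [name for name, hit in result["categories"].items() if hit]
    print("flagged categories:", flagged or "none")
```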

Free open-weight models for research.

Free to use under the Apache 2.0 license.

Pixtral

Trained to process both natural images and documents.

Mistral NeMo

A 12B model designed for global, multilingual applications.

Codestral Mamba

Infinite-length code generation pioneer.

Mathstral

A specialized 7B model designed for math reasoning and scientific discovery.

Custom models for your own needs.

Make models your own through fine-tuning and customization. Adapt our models to your specific use cases while maintaining core performance.

Unique models tailored to your business, delivering very high accuracy at significantly lower cost.

Transform general-purpose LLMs into domain-specialized intelligence with Mistral AI's custom pre-training and model distillation services.

Custom Pre-Training
Open

Community-hardened, open-weight models developed by the world’s best scientists.

Customizable

Customize and control the full stack, from models to UX.

Private

Deploy in your own environment so you own your data, models, and core competencies.

Efficient

Tap into small, super-efficient models tailored for productivity.

Ready to get started?

Deploy our models anywhere with flexible infrastructure options spanning cloud providers, edge, VPC and on-premises environments.

Self-hosted
Deploy Mistral models on virtual cloud, edge, or on-premises. Self-hosted deployments offer more advanced levels of customization and control. Your data stays within your walls.
Mistral Cloud
Get started with Mistral models in a few clicks via our developer platform hosted on Mistral’s infrastructure and build your own applications and services (see the request sketch below). Our servers are hosted in the EU.
Cloud providers
Access our models via your preferred cloud provider (Google Cloud, AWS, Azure, IBM, Snowflake, NVIDIA, Outscale) and use your cloud credits. Mistral models are available on Azure AI Studio, AWS Bedrock, Google Cloud Model Garden, IBM Watsonx, and Snowflake.
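
As a concrete starting point for the Mistral Cloud option above, here is a minimal chat-completions request against La Plateforme. The "mistral-large-latest" alias, the MISTRAL_API_KEY environment variable, and the response shape are assumptions to confirm in the API reference.

```python
import os

import requests

# Chat completions on La Plateforme; model alias and response shape are assumed.
API_URL = "https://api.mistral.ai/v1/chat/completions"
headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

payload = {
    "model": "mistral-large-latest",
    "messages": [
        {"role": "user", "content": "Summarize the Apache 2.0 license in one sentence."}
    ],
}

resp = requests.post(API_URL, headers=headers, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

The same request works against self-hosted or cloud-provider deployments by pointing the base URL at the corresponding endpoint.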

Model licenses explained.

| | Apache 2.0 | Mistral Research License | Mistral Commercial License |
|---|---|---|---|
| Access to weights | ✓ | ✓ | ✓ |
| Deployment for research purposes and individual usage | ✓ | ✓ | ✓ |
| Creation of derivatives (e.g. fine-tuning) for research purposes and individual usage | ✓ | ✓ (the same license applies to derivatives) | ✓ (the same license applies to derivatives) |
| Deployment for commercial purposes (internal & external use cases) | ✓ | ✗ (requires Mistral Commercial License) | ✓ |
| Creation and usage of derivatives (e.g. fine-tuning) for commercial use cases | ✓ | ✗ (requires Mistral Commercial License) | ✓ |
| Custom terms & support (self-deployment) | ✓ | ✗ | ✓ |

Get started with Mistral models.

It's time to get to the frontier.