AI in abundance

Introducing a free API, improved pricing across the board, a new enterprise-grade Mistral Small, and free vision capabilities on le Chat.

  • September 17, 2024
  • Mistral AI team

We’re taking new steps in our mission to bring frontier AI into the hands of everyone. Today, we are releasing:

  • A free tier on la Plateforme
  • A pricing update over our entire family of models
  • A new, better Mistral Small
  • Free vision capabilities on le Chat with Pixtral 12B

Free tier on la Plateforme

La Plateforme, the serverless platform for tuning and building with Mistral models as API endpoints, now offers a free tier that lets developers get started with experimentation, evaluation, and prototyping at no cost. Users can seamlessly upgrade their endpoints to a commercial tier and benefit from full data isolation (with a free zero-retention option) and higher rate limits. They can also deploy our models on different infrastructure, whether through our cloud partners (Azure / AWS / GCP) or on their own tenant.
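To give a sense of how quickly you can get started, here is a minimal sketch of calling a model on la Plateforme from Python. It assumes the `mistralai` Python client (v1.x), an API key created in the console, and an illustrative model name; check the documentation for current details.

```python
# Minimal sketch: a chat completion against la Plateforme with the Python client.
# Assumes `pip install mistralai` (v1.x) and an API key from console.mistral.ai.
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="mistral-small-latest",  # illustrative; any model exposed on la Plateforme
    messages=[{"role": "user", "content": "Suggest three ways to evaluate a summarization model."}],
)
print(response.choices[0].message.content)
```

Since endpoints carry over to the commercial tier, the same call should work unchanged as you scale up, with higher rate limits and data-isolation options.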

Reduced prices across the board

We’ve worked hard on making our endpoints faster and more efficient. This enables us to reduce prices across the board. The new prices are as follows:

| Model | New price | Old price | Price drop |
| --- | --- | --- | --- |
| Mistral NeMo | $0.15 / M input tokens, $0.15 / M output tokens | $0.3 / M input tokens, $0.3 / M output tokens | 50% |
| Pixtral 12B | $0.15 / M input tokens, $0.15 / M output tokens | N/A (new model) | N/A |
| Mistral Small | $0.2 / M input tokens, $0.6 / M output tokens | $1 / M input tokens, $3 / M output tokens | 80% |
| Codestral | $0.2 / M input tokens, $0.6 / M output tokens | $1 / M input tokens, $3 / M output tokens | 80% |
| Mistral Large | $2 / M input tokens, $6 / M output tokens | $3 / M input tokens, $9 / M output tokens | 33% |

This price update makes Mistral Large 2 the most cost-efficient frontier model, makes our smaller models extremely cost-efficient, and allows customers to realize significantly faster returns on their AI investments. The updated pricing will also be reflected in our cloud partner offerings (Azure AI Studio, Amazon Bedrock, Google Vertex AI).
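As an illustrative calculation, a workload of 10 M input tokens and 2 M output tokens on Mistral Small now costs 10 × $0.2 + 2 × $0.6 = $3.20, versus 10 × $1 + 2 × $3 = $16 at the previous prices, in line with the 80% reduction shown above.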

Small gets a big update

We are proud to unveil Mistral Small v24.09, our latest enterprise-grade small model and an upgrade of Mistral Small v24.02. Available under the Mistral Research License, this model gives customers the flexibility to choose a cost-efficient, fast, yet reliable option for use cases such as translation, summarization, sentiment analysis, and other tasks that do not require full-blown general-purpose models.

With 22 billion parameters, Mistral Small v24.09 offers customers a convenient mid-point between Mistral NeMo 12B and Mistral Large 2, providing a cost-effective solution that can be deployed across various platforms and environments. As shown below, the new small model delivers significant improvements in human alignment, reasoning capabilities, and code over the previous model.

Detailed benchmarks

We’re releasing Mistral Small v24.09 under the MRL. You may self-deploy it for non-commercial purposes using, for example, vLLM.
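For example, a minimal self-deployment sketch with vLLM's offline API might look like the following; the Hugging Face model identifier and tokenizer mode are assumptions based on the v24.09 naming, so adjust them to your environment.

```python
# Illustrative non-commercial self-deployment of Mistral Small v24.09 with vLLM.
# The model ID and tokenizer mode are assumptions; point them at your own copy of the weights.
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Mistral-Small-Instruct-2409", tokenizer_mode="mistral")

messages = [{
    "role": "user",
    "content": "Summarize in one sentence: small, task-focused models handle translation, "
               "summarization, and sentiment analysis at lower cost.",
}]
outputs = llm.chat(messages, sampling_params=SamplingParams(temperature=0.3, max_tokens=128))

print(outputs[0].outputs[0].text)
```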

Eye of the Tiger - Pixtral on le Chat

Following our latest Apache 2.0 model release, Pixtral 12B, a vision model with image understanding capabilities, is now freely available on le Chat. Pixtral 12B is the first open-source model to support images of any size without degradation in text-based performance, and you can now use it on le Chat to scan, analyze, search, caption, and better understand your personal or enterprise knowledge files.

Importantly, the model is available under the Apache 2.0 license, so you can bring visual understanding capabilities to your own environment without having to upload your files to a third-party provider. This is a critical capability for customers that operate with sensitive or proprietary information.
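To illustrate what that can look like in practice, here is a sketch of running Pixtral 12B locally with vLLM so that images never leave your infrastructure; the model identifier and the multimodal message format are assumptions, so check the vLLM and Pixtral documentation for the exact setup.

```python
# Illustrative local deployment of Pixtral 12B with vLLM for image understanding.
# Model ID and message format are assumptions; replace the image URL with your own file or data URI.
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Pixtral-12B-2409", tokenizer_mode="mistral")

messages = [{
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe the key figures in this chart."},
        {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
    ],
}]

outputs = llm.chat(messages, sampling_params=SamplingParams(max_tokens=256))
print(outputs[0].outputs[0].text)
```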

Do more with less

All the above announcements are now available. Head over to le Chat to try the new image understanding capabilities. To try the free tier of la Plateforme, sign in at console.mistral.ai. To learn more about Mistral Small v24.09, Pixtral 12B, and other Mistral models and pricing, click here.