Mixtral 8x22B
Mistral AI • April 2024
Mistral's largest mixture-of-experts model
What's New in This Version
Larger experts and an extended context window
Technical Specifications
Parameters: 8x22B (MoE)
Context Window: 64,000 tokens
Training Method: Mixture of Experts
Knowledge Cutoff: February 2024
Training Data: Up to early 2024
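The "8x22B" naming refers to eight feed-forward expert blocks of roughly 22B parameters each per layer, with a router activating two experts per token; per Mistral's release notes, this keeps about 39B of the ~141B total parameters active per forward pass. Below is a minimal PyTorch sketch of this style of top-2 sparse routing, using toy dimensions rather than the real model's:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sketch of top-2 sparse mixture-of-experts routing.

    Illustrative only: dimensions are small placeholders, not
    the real model's. The key idea is that the router picks 2 of
    the 8 experts per token, so most parameters sit idle on any
    given forward pass.
    """

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.gate(x)                                  # (tokens, n_experts)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)  # pick top-2 experts per token
        weights = F.softmax(weights, dim=-1)                   # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                          # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

layer = SparseMoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

The conditional routing is why inference cost tracks the active-parameter count rather than the full 8x22B total.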
Key Features
Large MoE • Extended Context • High Performance
Capabilities
Reasoning: Very Good
Coding: Very Good
Complex Tasks: Good
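For reference, here is a minimal sketch of querying the model through Mistral's hosted API. It assumes the v1 `mistralai` Python SDK and an API key in the `MISTRAL_API_KEY` environment variable; `open-mixtral-8x22b` is the platform identifier for this model.

```python
import os
from mistralai import Mistral

# Assumes the v1 `mistralai` SDK and a valid API key in the environment.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="open-mixtral-8x22b",  # platform identifier for Mixtral 8x22B
    messages=[{"role": "user", "content": "Summarize mixture-of-experts routing in two sentences."}],
)
print(response.choices[0].message.content)
```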
Other Mistral AI Models
Explore more models from Mistral AI
Magistral Medium
Mistral's flagship reasoning model with advanced multi-step logic capabilities
June 2025 • ~200 billion parameters
Magistral Small
Mistral's open-source reasoning model with Apache 2.0 license
June 2025 • 24 billion parameters
Mistral Small 3.1
Mistral's efficient lightweight model with enhanced capabilities
March 2025 • 22 billion parameters