Mixtral 8x22B

Mistral AI • Released April 2024

Mistral's largest mixture-of-experts model

What's New in This Version

Larger experts (22B parameters each, up from 7B in Mixtral 8x7B) and a context window extended to 64K tokens
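
Because prompts must fit inside the 64K-token window, it can help to count tokens with the model's tokenizer before sending a long input. Below is a minimal sketch using the Hugging Face transformers tokenizer; the checkpoint id and the input file name are assumptions for illustration, not part of the spec above.

```python
from transformers import AutoTokenizer

# Checkpoint id assumed: the public instruct release on Hugging Face.
tok = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x22B-Instruct-v0.1")

CONTEXT_WINDOW = 64_000  # tokens, per the specifications below

# "long_document.txt" is a hypothetical input file.
prompt = open("long_document.txt", encoding="utf-8").read()
n_tokens = len(tok.encode(prompt))

if n_tokens > CONTEXT_WINDOW:
    print(f"Prompt is {n_tokens} tokens: over the {CONTEXT_WINDOW}-token budget.")
else:
    print(f"Prompt fits: {n_tokens} / {CONTEXT_WINDOW} tokens.")
```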

Technical Specifications

Parameters: 8x22B sparse MoE (~141B total, ~39B active per token)
Context Window: 64,000 tokens
Architecture: Mixture of Experts (top-2 routing over 8 experts)
Knowledge Cutoff: February 2024
Training Data: Up to early 2024
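
Mixtral activates only 2 of its 8 expert feed-forward blocks per token, which is how a ~141B-parameter model runs with roughly 39B active parameters. Below is a minimal, illustrative sketch of top-2 routing in PyTorch; the names and shapes are hypothetical and this is not Mistral's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def moe_forward(x, router, experts, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x: (tokens, hidden); router: nn.Linear(hidden, n_experts);
    experts: list of per-expert feed-forward modules. Illustrative only.
    """
    logits = router(x)                                # (tokens, n_experts)
    weights, idx = torch.topk(logits, top_k, dim=-1)  # pick top-k experts per token
    weights = F.softmax(weights, dim=-1)              # renormalize over the chosen k
    out = torch.zeros_like(x)
    for k in range(top_k):
        for e, expert in enumerate(experts):
            mask = idx[:, k] == e                     # tokens whose k-th pick is expert e
            if mask.any():
                out[mask] += weights[mask, k, None] * expert(x[mask])
    return out

# Toy usage: 8 experts with top-2 routing, as in Mixtral.
hidden, n_experts = 16, 8
router = nn.Linear(hidden, n_experts)
experts = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(n_experts))
tokens = torch.randn(4, hidden)
print(moe_forward(tokens, router, experts).shape)  # torch.Size([4, 16])
```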

Key Features

Large MoE • Extended Context • High Performance

Capabilities

Reasoning: Very Good
Coding: Very Good
Complex Tasks: Good
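
As a quick way to try these capabilities, the instruct variant can be run through Hugging Face transformers. A sketch assuming the public mistralai/Mixtral-8x22B-Instruct-v0.1 checkpoint; note that the full model is very large and needs several high-memory GPUs, or a quantized build, to load.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x22B-Instruct-v0.1"  # assumed public checkpoint id
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # full weights are ~280 GB in bf16; quantize to fit smaller hardware
    device_map="auto",           # shard across available GPUs
)

messages = [{"role": "user", "content": "Write a Python function that tests primality."}]
input_ids = tok.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tok.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```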
