LATEST MODEL

Llama 4 Maverick

Meta • Released April 2025

Meta's balanced multimodal mixture-of-experts (MoE) model with 128 experts for general use



What's New in This Version

Balanced performance across reasoning, coding, and multimodal tasks, delivered by an efficient mixture-of-experts architecture that activates only 17B of its 400B parameters per token
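The efficiency claim is easy to quantify from the parameter counts on this page: of the model's 400 billion total parameters, only 17 billion are active for any given token. A quick check of that ratio (plain Python, figures taken from the spec):

```python
# Parameter counts from the spec: 400B total, 17B active per token.
total_params = 400e9
active_params = 17e9

active_fraction = active_params / total_params
print(f"{active_fraction:.2%} of parameters active per token")  # 4.25% of parameters active per token
```

So each forward pass touches roughly one twenty-fourth of the model's weights, which is what keeps inference cost closer to that of a 17B dense model.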


Technical Specifications

Parameters: 400 billion (17B active)
Context Window: 1,000,000 tokens
Training Method: Mixture of Experts
Knowledge Cutoff: August 2024
Training Data: Up to August 2024
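The "Mixture of Experts" training method means each token is processed by only a small subset of the 128 experts, selected by a learned router. A minimal top-k routing sketch in plain Python follows; the expert count matches the spec, but top_k = 1 and the toy router scores are illustrative assumptions, not Meta's actual implementation:

```python
import math
import random

# Toy top-k router for a mixture-of-experts layer.
# NUM_EXPERTS matches the spec above; TOP_K = 1 is an assumption for
# illustration (Llama 4's real router and expert networks are not shown).
NUM_EXPERTS = 128
TOP_K = 1

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def route(router_logits, top_k=TOP_K):
    """Return {expert_index: gate_weight} for the top_k experts of one token."""
    probs = softmax(router_logits)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:top_k]
    norm = sum(probs[i] for i in chosen)
    return {i: probs[i] / norm for i in chosen}

# One token's router scores -> only TOP_K of the 128 experts run for it.
# The remaining experts' parameters stay idle, which is how a 400B-parameter
# model can use only ~17B parameters per token.
rng = random.Random(0)
gates = route([rng.gauss(0.0, 1.0) for _ in range(NUM_EXPERTS)])
print(len(gates))  # 1
```

The gate weights from the router scale each chosen expert's output before the results are summed, so routing is differentiable and the router is trained jointly with the experts.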

Key Features

Open Source • Multimodal • 128-Expert MoE

Capabilities

Reasoning: Excellent
Coding: Excellent
Multimodal: Outstanding
