Llama 4 Maverick
Meta's balanced multimodal MoE model with 128 experts for general use
Meta • April 2025
Parameters
400 billion (17B active)
Architecture
Mixture of Experts (MoE; routing sketch below)
Context Window
1,000,000 tokens (fit-check sketch below)
Knowledge Cutoff
August 2024
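
The parameter figures above reflect sparse activation: each token is routed to a small subset of the 128 experts, so only about 17B of the 400B parameters run per forward pass. The sketch below illustrates top-k expert routing in a generic MoE layer; the toy sizes, the single routed expert per token, and the NumPy implementation are assumptions for clarity, not Maverick's actual router.

```python
# Minimal, illustrative sketch of top-k expert routing in an MoE layer.
# Sizes are toys; Maverick's real router, per-layer expert count, and
# hidden dimensions are not specified on this card.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 128  # matches the card's "128-Expert MoE"
TOP_K = 1          # assumption: one routed expert per token
HIDDEN = 64        # toy hidden size

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Toy parameters: a router matrix and per-expert feed-forward weights.
router_w = rng.normal(size=(HIDDEN, NUM_EXPERTS))
expert_w = rng.normal(size=(NUM_EXPERTS, HIDDEN, HIDDEN))

def moe_layer(tokens):
    """Route each token to its top-k experts and mix their outputs.

    Only the selected experts run per token, which is how a 400B-parameter
    model can activate only ~17B parameters per forward pass.
    """
    logits = tokens @ router_w                   # (n_tokens, NUM_EXPERTS)
    probs = softmax(logits)
    top = np.argsort(-probs, axis=-1)[:, :TOP_K] # indices of chosen experts
    out = np.zeros_like(tokens)
    for i, token in enumerate(tokens):
        for e in top[i]:
            # Weight each expert's output by its renormalized router score.
            gate = probs[i, e] / probs[i, top[i]].sum()
            out[i] += gate * (token @ expert_w[e])
    return out

tokens = rng.normal(size=(4, HIDDEN))
print(moe_layer(tokens).shape)  # (4, 64)
```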
Key Features
Open Source • Multimodal • 128-Expert MoE
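
With a 1,000,000-token window, the practical question is usually whether a long document fits before you send it. A rough pre-flight check is sketched below; the 4-characters-per-token ratio is a common English-text rule of thumb, not a property of the Llama 4 tokenizer, and the reserved output budget is an arbitrary choice.

```python
# Rough pre-flight check that a prompt fits Maverick's 1M-token window.
# Use a real tokenizer for precise counts; this is only an estimate.
CONTEXT_WINDOW = 1_000_000
CHARS_PER_TOKEN = 4  # assumption: rough average for English text

def fits_in_context(prompt: str, reserved_for_output: int = 8_192) -> bool:
    """Return True if the prompt likely fits, leaving room for the reply."""
    est_tokens = len(prompt) / CHARS_PER_TOKEN
    return est_tokens + reserved_for_output <= CONTEXT_WINDOW

print(fits_in_context("hello world"))    # True
print(fits_in_context("x" * 5_000_000))  # False: ~1.25M estimated tokens
```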
Capabilities
Reasoning: Excellent
Coding: Excellent
Multimodal: Outstanding
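
Since the model is multimodal, image inputs can be combined with text in a single request. Many hosts expose Llama 4 Maverick through OpenAI-compatible chat endpoints; the sketch below assumes such an endpoint, and the base URL and model id are placeholders to replace with your provider's documented values.

```python
# Sketch of a multimodal (image + text) request to an OpenAI-compatible
# endpoint serving Llama 4 Maverick. base_url and model id are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="llama-4-maverick",  # placeholder model id; varies by provider
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is shown in this image?"},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)
```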
What's New in This Version
Balanced performance across tasks with efficient MoE architecture
Other Meta Models
Explore more models from Meta
Llama 4 Behemoth
Meta's flagship multimodal model (research preview only; weights not publicly released)
Llama 4 Scout
Meta's efficient multimodal model with industry-leading 10M token context
Llama 3.3 70B
Meta's refined 70B text model, improving on Llama 3.1 with enhanced instruction following and reasoning