Llama 3.1 8B
Meta • July 2024
Meta's efficient smaller model with extended context
What's New in This Version
A much longer context window (128K tokens, up from 8K in Llama 3) while keeping the efficiency of the 8B model
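As a rough illustration of what a 128,000-token window allows in practice, the sketch below counts the tokens in a long document and checks whether it fits. It assumes the Hugging Face transformers library and granted access to the gated meta-llama/Llama-3.1-8B-Instruct repository; the input file name is hypothetical.

```python
# Sketch: check whether a long document fits in Llama 3.1 8B's 128,000-token
# context window. Assumes `transformers` is installed and that access to the
# gated meta-llama/Llama-3.1-8B-Instruct repo has been granted.
from transformers import AutoTokenizer

CONTEXT_WINDOW = 128_000  # tokens, per the specification below

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

with open("long_report.txt", encoding="utf-8") as f:  # hypothetical input file
    document = f.read()

n_tokens = len(tokenizer.encode(document))
print(f"{n_tokens:,} tokens; fits in the 128K window: {n_tokens <= CONTEXT_WINDOW}")
```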
Technical Specifications
Parameters: 8 billion
Context Window: 128,000 tokens
Training Method: Supervised fine-tuning + RLHF-style preference alignment (DPO)
Knowledge Cutoff: December 2023
Training Data: Up to December 2023
Key Features
Open Source • Efficient • Long Context
Capabilities
Efficiency: Outstanding
Reasoning: Good
Speed: Excellent
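For readers who want to try the model, here is a minimal sketch of running it locally with the Hugging Face transformers text-generation pipeline. The meta-llama/Llama-3.1-8B-Instruct repo id, the bfloat16 precision, and the hardware assumptions are not part of this card; the checkpoint itself is gated behind Meta's license.

```python
# Minimal sketch: chat with Llama 3.1 8B Instruct via the transformers pipeline.
# Assumes a recent transformers + accelerate install, granted access to the
# gated checkpoint, and a GPU with roughly 16 GB of memory for bfloat16 weights.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "In two sentences, what changed in Llama 3.1 8B?"},
]
result = generator(messages, max_new_tokens=128)

# The pipeline returns the whole conversation; the last message is the reply.
print(result[0]["generated_text"][-1]["content"])
```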
Other Meta Models
Explore more models from Meta. All three Llama 4 variants below use a mixture-of-experts design; see the sketch after the list for how active and total parameter counts relate.
Llama 4 Behemoth
Meta's flagship multimodal model with massive MoE architecture (288B active parameters)
April 2025 • ~2 trillion parameters (288B active)
Llama 4 Maverick
Meta's balanced multimodal MoE model with 128 experts for general use
April 2025 • 400 billion parameters (17B active)
Llama 4 Scout
Meta's efficient multimodal model with industry-leading 10M token context
April 2025 • 109 billion parameters (17B active)
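The Llama 4 entries list both a total and an "active" parameter count because they are mixture-of-experts (MoE) models: each token is routed through only a few experts, so per-token compute tracks the active count rather than the total. The sketch below illustrates that relationship with hypothetical layer sizes chosen only to land near the Maverick entry's 400B total / 17B active figures; it is not Meta's actual architecture.

```python
# Illustration only: how an MoE model's per-token ("active") parameter count
# can be far smaller than its total. All sizes here are hypothetical.
def moe_param_counts(shared: float, n_experts: int,
                     per_expert: float, experts_per_token: int) -> tuple[float, float]:
    """Return (total, active) parameter counts for a simple MoE layout."""
    total = shared + n_experts * per_expert
    active = shared + experts_per_token * per_expert
    return total, active

# Hypothetical 128-expert model routing each token to a single expert:
total, active = moe_param_counts(
    shared=14e9, n_experts=128, per_expert=3e9, experts_per_token=1
)
print(f"total ~= {total / 1e9:.0f}B, active ~= {active / 1e9:.0f}B per token")
# -> total ~= 398B, active ~= 17B: roughly the shape of the Maverick entry above.
```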