DeepSeek-V3.1
DeepSeek's hybrid model combining V3 and R1 strengths
DeepSeek • August 2025
Technical Specifications
Training Data: Up to August 2025
Parameters: 671 billion (37B active)
Training Method: Hybrid MoE with Reasoning
Context Window: 128,000 tokens
Knowledge Cutoff: August 2025
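The parameter line above reflects a sparse mixture-of-experts design: each token activates only a fraction of the full network. A minimal sketch of the arithmetic those two figures imply (illustrative back-of-envelope numbers, not DeepSeek-published measurements):

```python
# Illustrative arithmetic only: what "671B total / 37B active" means for an MoE.
TOTAL_PARAMS = 671e9   # all experts combined
ACTIVE_PARAMS = 37e9   # parameters engaged for any single token

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Fraction of weights active per token: {active_fraction:.1%}")  # ~5.5%

# Rough forward-pass cost (~2 FLOPs per active parameter), a common
# back-of-envelope estimate rather than a DeepSeek-published figure.
print(f"Approx. forward FLOPs per token: {2 * ACTIVE_PARAMS:.2e}")     # ~7.4e10
```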
Key Features
Hybrid Architecture • Code Agent • Search Agent • Combined Strengths
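The hybrid architecture exposes a thinking (reasoning) mode and a non-thinking (chat) mode from a single set of weights. Below is a minimal sketch of selecting between them over DeepSeek's OpenAI-compatible API; the endpoint, the `deepseek-chat` / `deepseek-reasoner` model aliases, and the `reasoning_content` field follow DeepSeek's public API conventions but should be verified against current documentation:

```python
from openai import OpenAI

# DeepSeek's API is OpenAI-compatible; endpoint per DeepSeek's docs.
client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY",
                base_url="https://api.deepseek.com")

def ask(prompt: str, thinking: bool = False) -> str:
    """Route one prompt to V3.1's thinking or non-thinking mode."""
    model = "deepseek-reasoner" if thinking else "deepseek-chat"
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    msg = resp.choices[0].message
    # Thinking mode returns the chain of thought in a separate field
    # (assumed name per DeepSeek docs; absent in non-thinking mode).
    trace = getattr(msg, "reasoning_content", None)
    if trace:
        print(f"[thinking trace: {len(trace)} chars]")
    return msg.content

print(ask("Summarize MoE routing in one sentence."))        # direct chat
print(ask("Prove there are infinitely many primes.", True)) # reasoning
```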
Capabilities
Code Agents: Outstanding
Search Agents: Outstanding
Hybrid: Industry-Leading
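The agent ratings above hinge on tool use. Below is a minimal sketch of one search-agent turn using OpenAI-style function calling, which DeepSeek's API supports; the `web_search` tool schema and the `my_search_backend` helper are hypothetical illustrations, not DeepSeek-provided components:

```python
import json
from openai import OpenAI

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY",
                base_url="https://api.deepseek.com")

def my_search_backend(query: str) -> str:
    """Hypothetical stand-in for a real search API."""
    return f"(stub results for: {query})"

# Hypothetical tool schema for illustration; the model decides when to call it.
tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web and return top result snippets.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user",
             "content": "What did DeepSeek release in August 2025?"}]
resp = client.chat.completions.create(model="deepseek-chat",
                                      messages=messages, tools=tools)
msg = resp.choices[0].message

# If the model requested the tool, execute it and return the result
# so the model can ground its final answer.
if msg.tool_calls:
    call = msg.tool_calls[0]
    args = json.loads(call.function.arguments)
    messages += [msg,
                 {"role": "tool", "tool_call_id": call.id,
                  "content": my_search_backend(args["query"])}]
    resp = client.chat.completions.create(model="deepseek-chat",
                                          messages=messages)

print(resp.choices[0].message.content)
```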
What's New in This Version
Outperforms both DeepSeek-V3-0324 and DeepSeek-R1-0528 on agent benchmarks by combining their strengths
Other DeepSeek Models
Explore more models from DeepSeek
DeepSeek-V3.2: DeepSeek's latest flagship model matching GPT-5 performance with integrated tool-use thinking
DeepSeek-V3.2-Speciale: DeepSeek's competition-focused variant, released as a temporary API-only model (expired December 15, 2025)
DeepSeek-V2: DeepSeek's flagship MoE model with exceptional efficiency