DeepSeek Coder V2
DeepSeek's specialized coding model with advanced programming capabilities
DeepSeek • June 2024
Technical Specifications
Training Data: Up to early 2024
Release Date: June 2024
Parameters: 236 billion total (21 billion active per token)
Training Method: Code-specialized Mixture-of-Experts (MoE)
Context Window: 128,000 tokens
Knowledge Cutoff: February 2024
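The 236-billion-parameter figure with only 21 billion active per token reflects the Mixture-of-Experts design: a router scores the experts for each token and forwards the token through only a small subset, so most expert weights sit idle on any given forward pass. The sketch below illustrates this top-k routing; the expert count, top-k value, and dimensions are illustrative assumptions, not DeepSeek's published configuration.

# Illustrative sketch of Mixture-of-Experts routing, not DeepSeek's actual
# implementation. The expert count, top-k value, and hidden size below are
# assumptions chosen only to show why a token touches a fraction of the weights.
import numpy as np

rng = np.random.default_rng(0)

num_experts = 160   # assumed number of routed experts (illustration only)
top_k = 6           # assumed experts activated per token (illustration only)
hidden_dim = 8      # toy hidden size; the real model is far larger

def route(token_state, router_weights, k):
    """Score every expert for one token and keep only the k highest-scoring."""
    scores = router_weights @ token_state   # one score per expert
    chosen = np.argsort(scores)[-k:]        # indices of the k best experts
    gate = np.exp(scores[chosen])
    gate /= gate.sum()                      # softmax weights over the chosen experts
    return chosen, gate

router_weights = rng.normal(size=(num_experts, hidden_dim))
token_state = rng.normal(size=hidden_dim)

chosen, gate = route(token_state, router_weights, top_k)
print("experts activated for this token:", sorted(chosen.tolist()))
print("routed experts used: %d of %d" % (top_k, num_experts))

Only the selected experts (plus the shared layers) contribute compute for a token, which is how the model keeps per-token cost far below what its total parameter count suggests.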
Key Features
Code Specialization • Multi-language Support • Repository-level Understanding
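Repository-level understanding pairs naturally with the 128,000-token window: several source files can be packed into a single prompt. The sketch below estimates token usage with a rough 4-characters-per-token heuristic; that heuristic, the file filter, and the packing order are illustrative assumptions rather than DeepSeek's tooling.

# Rough sketch: pack source files into one prompt that stays under the
# 128,000-token context window. The ~4 characters per token estimate is an
# assumption for illustration; a real pipeline would use the model's tokenizer.
from pathlib import Path

CONTEXT_LIMIT_TOKENS = 128_000
CHARS_PER_TOKEN = 4  # crude estimate, not DeepSeek's actual tokenization

def estimate_tokens(text: str) -> int:
    return len(text) // CHARS_PER_TOKEN + 1

def pack_repository(repo_root: str, budget: int = CONTEXT_LIMIT_TOKENS) -> str:
    """Concatenate Python files under repo_root until the token budget is reached."""
    parts, used = [], 0
    for path in sorted(Path(repo_root).rglob("*.py")):
        text = path.read_text(errors="ignore")
        cost = estimate_tokens(text)
        if used + cost > budget:
            break
        parts.append(f"# FILE: {path}\n{text}")
        used += cost
    return "\n\n".join(parts)

if __name__ == "__main__":
    prompt = pack_repository(".")
    print(f"packed roughly {estimate_tokens(prompt)} tokens of repository context")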
Capabilities
Coding: Outstanding
Reasoning: Very Good
Debugging: Excellent
What's New in This Version
Specialized for code generation and debugging tasks
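A minimal sketch of a code-debugging request, assuming access through an OpenAI-compatible client; the endpoint URL, model identifier, and API-key environment variable are assumptions to verify against DeepSeek's current API documentation.

# Minimal sketch of a debugging request through an OpenAI-compatible client.
# The base_url, model name, and DEEPSEEK_API_KEY variable are assumptions;
# confirm them against DeepSeek's current API documentation.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],   # assumed environment variable
    base_url="https://api.deepseek.com",      # assumed endpoint
)

buggy_snippet = """
def average(values):
    return sum(values) / len(values)   # crashes on an empty list
"""

response = client.chat.completions.create(
    model="deepseek-coder",                   # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Find and fix the bug:\n{buggy_snippet}"},
    ],
)
print(response.choices[0].message.content)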
Other DeepSeek Models
Explore more models from DeepSeek
DeepSeek-V2
DeepSeek's flagship MoE model with exceptional efficiency
DeepSeek-V3
DeepSeek's latest flagship model with enhanced capabilities and efficiency
DeepSeek-R1-0528
DeepSeek's upgraded reasoning model with 87.5% AIME accuracy and significantly reduced hallucinations