Qwen3-235B-A22B
Alibaba's flagship open-source MoE model with 235B total parameters and 22B active
Qwen • April 2025
Training Data
Up to early 2025
Parameters
235B (22B active)
Architecture
Mixture of Experts (MoE)
Context Window
128,000 tokens
Knowledge Cutoff
March 2025
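The Mixture-of-Experts entry above means each token is routed to only a few expert sub-networks rather than the full parameter set. A minimal sketch of top-k expert routing; the expert count and k below are illustrative assumptions, not Qwen3's actual configuration:

```python
def top_k_experts(router_logits, k=2):
    """Return indices of the k highest-scoring experts for one token.

    In a real MoE layer the router produces these logits per token,
    and only the selected experts' weights participate in the forward pass.
    """
    ranked = sorted(range(len(router_logits)),
                    key=lambda i: router_logits[i], reverse=True)
    return ranked[:k]

# Router scores for 4 hypothetical experts; experts 1 and 3 win.
logits = [0.1, 2.3, -0.5, 1.7]
print(top_k_experts(logits, k=2))  # -> [1, 3]
```

Because the unselected experts are skipped entirely, compute per token scales with the active parameters, not the total.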
Key Features
Open Source • MoE Architecture • Efficient Inference • Strong Reasoning
Capabilities
Reasoning: Outstanding
Coding: Excellent
Multilingual: Outstanding
What's New in This Version
Flagship open-source MoE that rivals much larger proprietary models while activating only 22B parameters per token
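The efficiency claim reduces to simple arithmetic: only a fraction of the total parameters is exercised for any single token. A quick sketch using the figures from this card:

```python
def active_fraction(total_b: float, active_b: float) -> float:
    """Fraction of total parameters active for a single token."""
    return active_b / total_b

# Qwen3-235B-A22B figures from the spec above: 235B total, 22B active.
frac = active_fraction(235, 22)
print(f"{frac:.1%} of parameters active per token")  # -> 9.4%
```

So per-token compute is closer to that of a ~22B dense model, while the full 235B of capacity remains available across the expert pool.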
Other Qwen Models
Explore more models from Qwen
Qwen3-Max
Alibaba's flagship model with over 1 trillion parameters and exceptional reasoning
QwQ-32B
Reasoning-focused model with extended thinking capabilities
Qwen2.5-72B-Instruct
Alibaba's instruction-tuned flagship open-source model