Qwen Models

Chinese open-source • Alibaba research

8 Models · Latest: Qwen3.6-Plus

Qwen3.6-Plus

Released April 2026

LATEST

Alibaba's flagship agentic AI model with hybrid linear attention, always-on reasoning, and autonomous multi-step coding workflows

Parameters
Not disclosed
Context
1,000,000 tokens
Key Features
1M Context with 65K Output · Always-On Chain-of-Thought · Native Function Calling & Agentic Coding
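The 1M-token context with a 65K output cap means prompt size has to be budgeted against the window. A minimal sketch of that arithmetic, using the figures from the card above (the helper function is illustrative, not part of any Qwen SDK, and "65K" is assumed to mean 65,536):

```python
# Token budgeting against a fixed context window.
# Figures from the Qwen3.6-Plus card: 1,000,000-token context, "65K" output,
# assumed here to mean 65,536 tokens. Illustrative helper, not a Qwen API.

CONTEXT_WINDOW = 1_000_000
MAX_OUTPUT = 65_536  # assumption: 65K = 2**16 tokens

def max_prompt_tokens(context_window: int = CONTEXT_WINDOW,
                      reserved_output: int = MAX_OUTPUT) -> int:
    """Largest prompt that still leaves room for a full-length reply."""
    if reserved_output >= context_window:
        raise ValueError("output reservation exceeds the context window")
    return context_window - reserved_output

print(max_prompt_tokens())  # 934464 tokens left for the prompt
```

The same budgeting applies to the other 1M-context model below (Qwen3.5-Plus); only the constants change per model.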

Qwen3.5-Plus

Released February 2026


Alibaba's hosted flagship combining hybrid linear-attention MoE with native multimodal understanding for agentic workflows across 201 languages

Parameters
397 billion (17B active)
Context
1,000,000 tokens
Key Features
Native Multimodal Agents · Hybrid Linear-Attention MoE · 201 Language Support

QwQ-32B

Released March 2025


Reasoning-focused model with extended thinking capabilities

Parameters
32 billion
Context
32,768 tokens
Key Features
Extended Thinking · Chain-of-Thought · Math Reasoning · +1 more

Qwen3-Max

Released September 2025

Alibaba's flagship model with over 1 trillion parameters and exceptional reasoning

Parameters
1T+
Context
262,144 tokens
Key Features
Massive Scale · Long Context · Multilingual · +1 more

Qwen3-30B-A3B

Released April 2025

Efficient MoE model with 30B total parameters but only 3B active per token

Parameters
30B (3B active)
Context
128,000 tokens
Key Features
MoE Architecture · Efficient Inference · Cost Effective · +1 more

Qwen3-235B-A22B

Released April 2025

Alibaba's flagship open-source MoE model with 235B total parameters and 22B active

Parameters
235B (22B active)
Context
128,000 tokens
Key Features
Open Source · MoE Architecture · Efficient Inference · +1 more
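Several cards above quote both total and active parameter counts; for MoE models, the active fraction is what drives per-token compute cost. A quick sketch of that ratio using only the figures listed on these cards (plain arithmetic, not an official benchmark):

```python
# Active-parameter fraction for the MoE models listed above.
# (total, active) parameter counts are taken directly from the cards.

moe_models = {
    "Qwen3-30B-A3B":    (30e9,  3e9),
    "Qwen3-235B-A22B":  (235e9, 22e9),
    "Qwen3.5-Plus":     (397e9, 17e9),
}

for name, (total, active) in moe_models.items():
    frac = active / total
    print(f"{name}: {frac:.1%} of parameters active per token")
```

Despite its much larger total size, Qwen3.5-Plus activates the smallest share of its weights per token, which is how these hosted MoE models keep inference cost closer to a small dense model.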

Qwen2.5-VL-32B

Released January 2025

Vision-language model with strong multimodal understanding

Parameters
32 billion
Context
32,768 tokens
Key Features
Vision-Language · Image Understanding · OCR · +1 more

Qwen2.5-72B-Instruct

Released September 2024

Alibaba's instruction-tuned flagship open-source model

Parameters
72 billion
Context
128,000 tokens
Key Features
Open Source · Instruction Following · Long Context · +1 more
© funclosure 2025