GLM-4.5
Z.ai's general-purpose flagship trained on 22 trillion tokens
Z.ai • July 2025
Technical Specifications
Training Data: 22 trillion tokens
Parameters: ~200 billion
Training Method: Multi-stage fine-tuning
Context Window: 128,000 tokens
Knowledge Cutoff: June 2025
Key Features
Reasoning Focus • Agentic Capabilities • Open Weights • Multilingual
Capabilities
Reasoning: Excellent
Agentic: Very Good
General Knowledge: Excellent
What's New in This Version
Specialized fine-tuning combining reasoning, agentic, and knowledge capabilities
Other Z.ai Models
Explore more models from Z.ai
GLM-5.1
Z.ai's current flagship, refined from GLM-5 for agentic engineering; tops the SWE-Bench Pro leaderboard with sustained 8-hour autonomous execution
GLM-5
Z.ai's open-weight 744B MoE foundation model purpose-built for agentic engineering, trained entirely on Huawei Ascend chips
GLM-4.7
Z.ai's flagship model with industry-leading coding and multi-step task handling