GLM-4.6
Language model trained entirely on Chinese domestic chips
Z.ai • September 2025
Training Data
Up to August 2025
Parameters
~200 billion
Training Method
Domestic Hardware Training
Context Window
128,000 tokens
Knowledge Cutoff
August 2025
Key Features
Domestic Chips • Cambricon Compatible • Full-stack Chinese • Enterprise Ready
Capabilities
Reasoning: Excellent
Compatibility: Outstanding
Performance: Very Good
What's New in This Version
Proof of cutting-edge AI training on fully domestic Chinese hardware
Other Z.ai Models
Explore more models from Z.ai
GLM-4.7
Z.ai's flagship model with industry-leading coding and multi-step task handling
GLM-4.6V
Open-source vision-language model optimized for multimodal reasoning and frontend automation
GLM-4.5
Z.ai's general-purpose flagship trained on 22 trillion tokens