Latest Tech News

Stay updated with the latest in technology, AI, cybersecurity, and more


GLM 4.5 with Claude Code

GLM Coding Plan — designed for Claude Code users, starting at $3/month for a premium coding experience. GLM-4.5 and GLM-4.5-Air are our latest flagship models, purpose-built as foundational models for agent-oriented applications. Both leverage a Mixture-of-Experts (MoE) architecture. GLM-4.5 has a total parameter count of 355B with 32B active parameters per forward pass, while GLM-4.5-Air adopts a more streamlined design with 106B total parameters and 12B active parameters. …
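The total-vs-active distinction above is the key property of an MoE model: only a fraction of the weights participate in each forward pass. A quick back-of-envelope sketch, using only the figures quoted in the excerpt:

```python
# Compare the two MoE configurations mentioned above.
# "Active" parameters are those used per forward pass (per token).

def moe_summary(name: str, total_b: float, active_b: float) -> str:
    """One-line summary of an MoE model's parameter usage."""
    ratio = active_b / total_b
    return f"{name}: {total_b}B total, {active_b}B active ({ratio:.0%} of weights per token)"

print(moe_summary("GLM-4.5", 355, 32))      # roughly 9% of weights active per token
print(moe_summary("GLM-4.5-Air", 106, 12))  # roughly 11% of weights active per token
```

This is why an MoE model's inference compute cost tracks its active parameter count, even though its memory footprint tracks the total.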

My 2.5 year old laptop can write Space Invaders in JavaScript now (GLM-4.5 Air)

My 2.5 year old laptop can write Space Invaders in JavaScript now, using GLM-4.5 Air and MLX. I wrote about the new GLM-4.5 model family yesterday: new open-weight (MIT licensed) models from Z.ai in China which, per their own benchmarks, score highly in coding even against models such as Claude Sonnet 4. The models are pretty big — the smaller GLM-4.5 Air model is still 106 billion total parameters, which is 205.78GB on Hugging Face. Ivan Fioravanti built this 44GB 3-bit quantized version for MLX, …
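The download sizes quoted above follow directly from the parameter count and the bits stored per weight. A rough sanity check (real quantized files also carry scales and some full-precision layers, so these are ballpark figures only):

```python
# Approximate weight-file size for GLM-4.5-Air at different precisions.
PARAMS = 106e9  # total parameters, per the article

def size_gb(bits_per_param: float) -> float:
    """Approximate weight size in GB (1 GB = 1e9 bytes) at a given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"16-bit: ~{size_gb(16):.0f} GB")  # ~212 GB, near the 205.78GB full-precision listing
print(f"3-bit:  ~{size_gb(3):.0f} GB")   # ~40 GB, close to the 44GB MLX build
```

The small gaps versus the quoted figures come from quantization overhead (per-group scales) and layers kept at higher precision.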

Chinese startup Z.ai launches powerful open source GLM-4.5 model family with PowerPoint creation

Another week in the summer of 2025 has begun, and in a continuation of last week's trend, it brings more powerful Chinese open source AI models. Little-known (at least to us here in the West) Chinese startup Z.ai has introduced two new open source LLMs — GLM-4.5 and GLM-4.5-Air — casting them as go-to solutions for AI reasoning, …

GLM-4.5: Reasoning, Coding, and Agentic Abilities

Today, we introduce two new GLM family members, GLM-4.5 and GLM-4.5-Air, our latest flagship models. GLM-4.5 is built with 355 billion total parameters and 32 billion active parameters, and GLM-4.5-Air with 106 billion total parameters and 12 billion active parameters. Both are designed to unify reasoning, coding, and agentic capabilities in a single model, meeting the increasingly complex demands of fast-growing agentic applications. Both GLM-4.5 and GLM-4.5-Air are hybrid reasoning …