Alibaba’s new open source model QwQ-32B matches DeepSeek-R1 with way smaller compute requirements

Published on: 2025-07-01 14:06:56

Qwen Team, the group behind Chinese e-commerce giant Alibaba's growing family of open-source Qwen large language models (LLMs), has introduced QwQ-32B, a new 32-billion-parameter reasoning model designed to improve performance on complex problem-solving tasks through reinforcement learning (RL).

The model is available as open weights on Hugging Face and on ModelScope under an Apache 2.0 license. This means it is available for commercial and research use, so enterprises can employ it immediately to power their products and applications (even ones they charge customers to use). Individual users can also access it via Qwen Chat.

Qwen-with-Questions was Alibaba's answer to OpenAI's original reasoning model o1

QwQ, short for Qwen-with-Questions, was first introduced by Alibaba in November 2024 as an open-source reasoning model aimed at competing with OpenAI's o1 ...