Alibaba Is Rolling Out Its ‘Most Agentic Code Model to Date’

Alibaba’s Qwen team has launched Qwen3-Coder, which it calls its “most agentic code model to date.” While the model will be made available in multiple sizes, the most powerful variant, Qwen3-Coder-480B-A35B-Instruct, is being released first. It is a 480 billion parameter mixture-of-experts model with 35 billion active parameters, supporting a context length of 256,000 tokens natively and up to 1 million tokens with extrapolation methods, which the team says delivers “exceptional performance in both coding and agentic tasks.” The group claims the quasi-open-source model offers agentic coding, agentic browser use, and agentic tool use comparable to Anthropic’s proprietary Claude Sonnet 4.
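For readers who want to experiment with the instruct variant, a minimal sketch of prompting it through the Hugging Face transformers library might look like the following; the repository name and generation settings are assumptions inferred from the model’s naming in the announcement, not details confirmed in this article, and running a 480B mixture-of-experts model locally requires substantial multi-GPU hardware.

```python
# Minimal sketch of prompting Qwen3-Coder-480B-A35B-Instruct via Hugging Face
# transformers. The model ID and settings below are assumptions based on the
# naming in the announcement; this is illustrative, not a verified recipe.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-Coder-480B-A35B-Instruct"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user",
     "content": "Write a Python function that merges two sorted lists."}
]

# Build the chat-formatted prompt and generate a completion.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```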