The open-source runway
Alibaba pulls up the ladder
Newer releases in the Qwen image and WAN video model families have been appearing on Civitai, a commercial creative platform, but remain absent from Hugging Face. The gap traces to a shift in distribution strategy at Alibaba, the company behind both families. Last week the company shipped three closed-source models in rapid succession, continuing a pattern that is displacing the open-weight tradition on which the world's most downloaded AI model ecosystem was built.
Closing the API gate
Alibaba shipped three proprietary AI models within three days last week. An image-generation upgrade landed first, followed by the multimodal Qwen3.5-Omni and the agentic Qwen3.6-Plus. None included downloadable weights; each launched exclusively through Alibaba’s API.
The closures accelerated a pattern that began in September 2025, when Qwen3-Max became Alibaba’s first major model to launch without open weights. Each subsequent frontier release has stayed behind the API. Alibaba says that it will continue offering open-source models in “developer-friendly sizes,” reserving the most capable versions for paid access.
The freemium corral
The open-weight strategy that preceded this pivot built the largest AI model ecosystem in the world. By January 2026, Qwen had surpassed one billion cumulative downloads on Hugging Face, overtaking Meta’s Llama as the most downloaded model family on the platform. The Apache 2.0 weights that Alibaba had published since 2023 attracted over 290,000 developers, who built 113,000 community model variations on top of them.
This adoption fed Alibaba Cloud directly. CEO Eddie Wu described edge models and cloud-hosted models as complementary, arguing that developers who used open Qwen models locally also required cloud-based compute for production workloads. Alibaba Cloud’s AI-related product revenue grew at triple-digit rates for nine consecutive quarters, and Q4 2025 cloud revenue reached $4.15 billion, up 18% year-over-year.
The ecosystem also served a broader strategic function. A partner at Andreessen Horowitz estimated that 80% of US startups use Chinese base models to build derivative products. A USCC report published in March framed Chinese open-source AI as a mechanism for reinforcing industrial dominance, describing a feedback loop in which open models drive developer adoption that locks users into Chinese cloud infrastructure.
A coordinated pivot
Alibaba’s language models drew the most attention, but the same pattern played out across its generative media stack. WAN 2.1 and 2.2 shipped under Apache 2.0 with full weights on Hugging Face, accumulating over 30 million downloads between them. WAN 2.5 broke from that tradition, launching as API-only through Alibaba Cloud, and versions 2.6 and 2.7 continued as commercial offerings. Over 100 developers upvoted a GitHub issue asking whether WAN 2.5 weights would be released. They have not appeared.
The original Qwen-Image shipped under Apache 2.0 and remains available on Hugging Face, but the newer image and multimodal models that Alibaba published last week are all closed. On platforms like Civitai, the available models are community fine-tunes and quantizations of the older open weights. Alibaba’s commercial frontier has moved beyond them.
The suits take over
Alibaba’s financial targets make the logic behind this shift difficult to mistake. The company has set a goal of $100 billion in cloud revenue within five years, implying compound annual growth exceeding 40%. It has pledged $53 billion to AI infrastructure over three years and raised cloud storage prices by up to 34% in March. The commercial vehicle for these ambitions is Wukong, an agentic AI platform launched in March for enterprise clients, which integrates with DingTalk’s 20 million users and will distribute Qwen3.6-Plus directly. Input pricing for the model starts at 2 yuan per million tokens, roughly twenty-eight cents.
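The growth rate and the price conversion above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes annual cloud revenue of roughly four times the quoted Q4 2025 figure and an exchange rate near 7.1 yuan per dollar; neither assumption comes from the article itself.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: annual cloud revenue ~= 4x the quoted Q4 2025 number.
q4_revenue_usd_bn = 4.15
annual_run_rate_bn = 4 * q4_revenue_usd_bn   # ~16.6 bn USD per year
target_usd_bn = 100.0
years = 5

# Compound annual growth rate needed to hit the five-year target.
cagr = (target_usd_bn / annual_run_rate_bn) ** (1 / years) - 1
print(f"required CAGR: {cagr:.1%}")          # roughly 43%, i.e. above 40%

# Token pricing: 2 yuan per million tokens at an assumed ~7.1 CNY/USD.
cny_per_usd = 7.1
usd_per_million_tokens = 2 / cny_per_usd
print(f"~${usd_per_million_tokens:.2f} per million tokens")  # ~$0.28
```

Both quoted figures hold up under these assumptions: the target requires growth in the low-forties percent, and 2 yuan lands at about 28 cents.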


