Alibaba Cloud · General LLM

Qwen

Alibaba's large language model family offering strong multilingual capabilities with particular strength in Chinese and English across diverse model sizes.

Overview

Qwen (Tongyi Qianwen) is Alibaba Cloud's family of large language models, available in sizes from 0.5B to 110B parameters. Qwen 2 demonstrated competitive performance with leading Western models on both English and Chinese benchmarks, with particular strength in coding, mathematics, and multilingual tasks. The Qwen-VL multimodal variant adds vision understanding. Most variants are released as open weights under permissive licenses (Apache 2.0 or the Qwen License, depending on size), making them popular choices for applications targeting Chinese-speaking markets.

Parameters

0.5B to 110B (multiple variants)

Context Window

32K-128K tokens

Languages

29 languages (Chinese and English primary)

Modality

Text, vision (Qwen-VL), audio (Qwen-Audio)

License

Qwen License / Apache 2.0 (size dependent)

Capabilities

Bilingual Chinese-English text generation and understanding

Code generation and mathematical reasoning

Multimodal vision-language understanding (Qwen-VL)

Long-context processing up to 128K tokens

Tool use and function calling
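The tool-use capability above follows the common function-calling pattern: the model emits a structured call (a tool name plus JSON arguments), and the client executes it locally and feeds the result back. A minimal sketch of the client-side dispatch, using a hypothetical `get_weather` tool (the tool name, schema, and stub are illustrative assumptions, not part of any Qwen API):

```python
import json

# Hypothetical local tool; real tools and their schemas are app-defined.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub in place of a real weather lookup

TOOLS = {"get_weather": get_weather}

def dispatch(tool_call_json: str) -> str:
    """Execute a model-emitted tool call of the form
    {"name": ..., "arguments": {...}} and return its result."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# A call string shaped the way a function-calling model might emit it:
result = dispatch('{"name": "get_weather", "arguments": {"city": "Hangzhou"}}')
```

In a full loop, `result` would be appended to the conversation as a tool message so the model can compose its final answer.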

Use Cases

Building AI applications for Chinese and bilingual markets

Deploying multilingual customer service systems

Creating coding assistants with strong Chinese documentation support

Developing multimodal AI applications with vision capabilities

Pros

  • Best-in-class Chinese language understanding and generation
  • Wide range of model sizes for different deployment needs
  • Multimodal variants cover vision and audio
  • Open weights with permissive licensing for most variants

Cons

  • English performance can trail specialized English models
  • Alibaba Cloud hosting may raise data sovereignty concerns
  • Less community tooling in Western developer ecosystems
  • Licensing terms vary across model sizes

Pricing

Open weights are free to download. Alibaba Cloud API pricing varies by model size. Qwen-Turbo: approximately $0.57/1M tokens.
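Since API cost scales linearly with token count, a quick estimator can be built from the quoted rate (a sketch using the approximate Qwen-Turbo figure above, which may change; check current Alibaba Cloud pricing):

```python
def estimate_cost_usd(tokens: int, rate_per_million: float = 0.57) -> float:
    """Approximate API cost for a token count at a per-million-token rate."""
    return tokens / 1_000_000 * rate_per_million

# e.g. a 2M-token workload at the quoted Qwen-Turbo rate:
cost = estimate_cost_usd(2_000_000)  # ≈ $1.14
```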

Related Models