
Yi

A bilingual Chinese-English language model family from Kai-Fu Lee's 01.AI, known for strong benchmark performance across diverse tasks.

Overview

Yi is a family of large language models developed by 01.AI, the startup founded by AI researcher and investor Kai-Fu Lee. The family spans 6B to 34B parameters and posts strong results on both Chinese and English benchmarks, with the 34B model competitive with substantially larger open models at its release. Yi-1.5 improved reasoning, coding, and instruction following, while Yi-VL added vision-language capabilities. The models are released for both research and commercial use under the permissive Apache 2.0 license.

Parameters

6B / 9B / 34B variants (the 9B size was introduced with Yi-1.5)

Context Window

Up to 200K tokens (Yi-6B-200K / Yi-34B-200K long-context variants)

Architecture

Dense decoder-only transformer

Training Data

3.1T tokens (English, Chinese, code)

License

Apache 2.0

Capabilities

Strong bilingual Chinese-English text generation

Competitive reasoning and knowledge benchmarks

Vision-language understanding (Yi-VL)

Instruction following and conversational AI
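For the chat and instruction-following capability above, Yi's chat variants use the ChatML conversation format common to many open models. A minimal sketch of assembling such a prompt by hand (assuming the standard `<|im_start|>` / `<|im_end|>` markers; in practice the model's tokenizer chat template handles this) might look like:

```python
def build_chatml_prompt(messages):
    """Format a list of {"role", "content"} dicts into a ChatML prompt string.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers, and the
    prompt ends with an opened assistant turn to cue the model's reply.
    """
    parts = [
        f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>"
        for msg in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful bilingual assistant."},
    {"role": "user", "content": "用中文介绍一下你自己。"},
])
```

With Hugging Face tooling you would normally call the tokenizer's `apply_chat_template` instead of formatting turns manually; the sketch only makes the wire format visible.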

Use Cases

Building bilingual AI applications for Chinese and English markets

Deploying efficient language models for general-purpose tasks

Creating vision-language applications with Yi-VL

Fine-tuning for domain-specific Chinese or English applications
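A bilingual application like the one in the first use case often needs to know which language a user wrote in, for example to pick a system prompt or route to language-specific post-processing. A naive illustrative heuristic (not part of Yi itself) based on counting CJK characters:

```python
def dominant_language(text: str) -> str:
    """Naive router: return 'zh' if CJK ideographs dominate the text, else 'en'.

    Checks the CJK Unified Ideographs block (U+4E00-U+9FFF); real systems
    would use a proper language-identification library instead.
    """
    cjk = sum(1 for ch in text if "\u4e00" <= ch <= "\u9fff")
    return "zh" if cjk > len(text) / 2 else "en"

dominant_language("你好，世界")
dominant_language("Hello, world")
```

This is deliberately crude (mixed-language input, punctuation, and other CJK scripts are ignored), but it shows the kind of lightweight routing layer a bilingual Yi deployment might sit behind.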

Pros

  • Strong bilingual performance with a permissive Apache 2.0 license
  • Excellent benchmark results relative to model size
  • 200K-context variants (Yi-6B/34B-200K) for long-document tasks
  • Active development with regular model improvements

Cons

  • Less community adoption than Llama or Mistral models
  • Limited enterprise support infrastructure outside of China
  • Vision-language variant is less mature than the text-only models
  • Lower brand recognition in Western markets

Pricing

Free and open-source under Apache 2.0. Self-hosting costs vary. Available through various cloud inference providers at competitive per-token rates.

Related Models