Sarvam AI Open-Sources 30B & 105B Models for India
March 21, 2026, 12:01 PM
Bengaluru startup Sarvam AI releases 30B- and 105B-parameter models supporting 22 Indian languages, trained entirely in India under the IndiaAI Mission.
Sarvam AI Launches India's Largest Open-Source Models
Bengaluru, March 21, 2026: Sarvam AI has open-sourced Sarvam 30B and Sarvam 105B, two reasoning models trained from scratch on Indian datasets and supporting 22 Indian languages, including Hindi, Tamil, and Telugu.
Model Specifications
| Model | Parameters | Context Window | Primary Use Case |
|---|---|---|---|
| Sarvam 30B | 30 Billion | 32K tokens | Real-time chat, customer service |
| Sarvam 105B | 105 Billion | 128K tokens | Complex reasoning, enterprise |
Both models use a mixture-of-experts architecture with Grouped Query Attention (GQA) for efficient inference on standard hardware.
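To make the GQA claim concrete, here is a minimal NumPy sketch of grouped-query attention: several query heads share each key/value head, which shrinks the KV cache and speeds inference. This is an illustrative toy, not Sarvam's implementation; shapes and head counts are assumptions.

```python
import numpy as np

def grouped_query_attention(q, k, v, n_kv_heads):
    """Toy GQA: q has n_heads, while k and v have only n_kv_heads.

    q: (n_heads, seq, d); k, v: (n_kv_heads, seq, d).
    Each group of n_heads // n_kv_heads query heads shares one KV head.
    """
    n_heads, seq, d = q.shape
    group = n_heads // n_kv_heads
    # Broadcast each KV head across its group of query heads.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# 8 query heads sharing 2 KV heads: the KV cache is 4x smaller
# than standard multi-head attention with 8 KV heads.
q = np.random.randn(8, 4, 16)
kv = np.random.randn(2, 4, 16)
out = grouped_query_attention(q, kv, kv, n_kv_heads=2)
```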
Training Details
- Trained entirely in India using IndiaAI Mission compute
- 16 trillion training tokens for the 30B model
- Multilingual datasets across 22 Indian languages
- Specialized corpora for code, math, and enterprise knowledge
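A quick back-of-the-envelope check puts the stated training scale in context. The Chinchilla heuristic suggests roughly 20 training tokens per parameter as compute-optimal; the figures below use only the article's numbers, and the comparison is our arithmetic, not Sarvam's methodology.

```python
# Tokens-per-parameter ratio for Sarvam 30B, from the article's figures.
tokens = 16e12   # 16 trillion training tokens
params = 30e9    # 30 billion parameters

ratio = tokens / params
print(f"{ratio:.0f} tokens per parameter")  # ~533, far past the ~20 Chinchilla-optimal point
```

Training well beyond the compute-optimal ratio ("overtraining") is a common trade-off for models meant to be cheap to serve: more training compute up front buys better quality per parameter at inference time.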
Production Deployments
- Sarvam 30B powers the Samvaad conversational platform
- Sarvam 105B powers the Indus AI assistant
- Available via API, Hugging Face, and AI Kosh
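Since the models are available on Hugging Face, loading one with the `transformers` library would look roughly like the sketch below. The repo id is a guess for illustration only; check Sarvam AI's official organization page for the real name.

```python
# Hedged sketch of loading a Sarvam model via Hugging Face transformers.
# MODEL_ID is hypothetical -- the article does not give the actual repo name.
MODEL_ID = "sarvamai/sarvam-30b"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion; requires `pip install transformers torch`."""
    # Imports are deferred so the file can be read without the libraries installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```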
The models represent India's first major open-source challenge to proprietary AI systems from OpenAI and Anthropic, optimized specifically for Indic languages and enterprise use cases.
Source: Sarvam AI Blog, TechCrunch | March 21, 2026 | 5:28 PM IST