New Delhi, Feb 18 (SocialNews.XYZ) Bengaluru-based AI startup Sarvam AI on Wednesday unveiled two new large language models as part of India’s push to build sovereign artificial intelligence capabilities.
The announcement was made at the India AI Impact Summit here, where the company said both models have been trained from scratch using a mixture-of-experts (MoE) architecture to improve efficiency and performance.
The first model, called Sarvam 30B, has 30 billion parameters, but the company said only 1 billion of them are activated for each output token it generates.
Co-founder Pratyush Kumar explained that this MoE structure helps reduce inference costs while improving efficiency, especially for reasoning and complex workloads.
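The cost saving Kumar describes comes from the routing step in a mixture-of-experts layer: a small router picks one (or a few) expert networks per token, so only that expert's weights are used in the forward pass. The following is a minimal illustrative sketch of top-1 MoE routing, not Sarvam's actual implementation; the expert count and hidden size are arbitrary toy values.

```python
# Toy sketch of mixture-of-experts (MoE) top-1 routing. Illustrates why
# only a fraction of a model's parameters is active per token: the router
# selects a single expert, so only that expert's weights are computed.
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 30   # hypothetical expert count (toy value)
D_MODEL = 64     # hypothetical hidden size (toy value)

# Each expert is a simple feed-forward weight matrix.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
router_w = rng.standard_normal((D_MODEL, N_EXPERTS))

def moe_forward(x):
    """Route the token vector x to its top-1 expert and apply it."""
    logits = x @ router_w
    chosen = int(np.argmax(logits))   # top-1 routing decision
    return x @ experts[chosen], chosen

token = rng.standard_normal(D_MODEL)
_, expert_id = moe_forward(token)

total_params = sum(w.size for w in experts)
active_params = experts[expert_id].size   # only one expert's weights used
print(f"active fraction per token: {active_params / total_params:.3f}")
```

With 30 equally sized experts and top-1 routing, roughly 1/30 of the expert parameters are touched per token, which is the kind of ratio behind the "30 billion total, 1 billion active" figure quoted for the model.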
He added that the 30B model performs strongly on thinking and reasoning benchmarks at both 8K and 16K scales when compared to other models of similar size.
The Sarvam 30B model supports a 32,000-token context window and has been trained on 16 trillion tokens.
Kumar said efficiency remains central to the company’s vision, as it aims to make AI accessible at population scale across India.
The company also introduced a larger 105-billion-parameter model designed for more advanced reasoning and agent-based tasks.
This model activates 9 billion parameters per token and supports a 128,000-token context window, allowing it to handle more complex instructions and longer conversations.
Kumar compared the new 105B model with global frontier systems. He said that on several benchmarks it outperforms DeepSeek's R1, which was reported to have roughly 600 billion parameters when it was released last year.
He also said the model is cheaper to run than Google's Gemini Flash, while delivering better performance on many benchmarks.
According to him, even when compared to Gemini 2.5 Flash, Sarvam’s model shows stronger performance on Indian language tasks.
The launch comes at a time when India is stepping up efforts to build its own foundational AI models tailored for multilingual and large-scale public use cases.
The government-backed IndiaAI Mission, supported by a Rs 10,000 crore fund, aims to reduce dependence on foreign AI systems and promote domestic innovation.
So far, the mission has disbursed Rs 111 crore in GPU subsidies. Sarvam AI has emerged as the biggest beneficiary, securing 4,096 NVIDIA H100 SXM GPUs through Yotta Data Services and receiving nearly Rs 99 crore in subsidies.
The startup was earlier selected as the first company to build India’s foundational AI model under the mission.
Sarvam AI was founded in July 2023 by Vivek Raghavan and Pratyush Kumar, who previously worked at AI4Bharat, an initiative backed by Infosys co-founder Nandan Nilekani.
Source: IANS