In the global race to advance artificial intelligence (AI), tech giants are building ever larger and more powerful language models. In India, however, IIT Madras and Ziroh Labs are pursuing a model that is "useful over fancy," says Prof. V. Kamakoti, Director of IIT Madras.
In an in-depth conversation with Anil Padmanabhan at StratNews Global, Prof. Kamakoti broke down their approach to AI development: small language models (SLMs) built for day-to-day utility, designed to run efficiently on Kompact AI, a homegrown AI platform that enables inferencing on CPUs instead of expensive GPUs.
Small Models, Big Impact
India’s AI journey mirrors other successful digital public goods like UPI (Unified Payments Interface). In 2016, UPI processed just a few thousand transactions. Today, it powers billions of low-value, everyday payments, many under ₹200. Similarly, Kompact AI and the small language models it supports aim to deliver artificial intelligence as a public utility—accessible and trustworthy, rather than merely impressive.
“The goal is to build AI that is useful for us, not just flashy,” said Kamakoti. “These small language models are doing things you can use day-to-day with more accuracy. And more importantly, they can be run without the need for GPUs.”
Kompact AI, developed by IIT Madras in collaboration with Ziroh Labs, a Bengaluru-based startup, addresses a critical problem in the AI ecosystem: high energy and infrastructure costs. By allowing AI models to run without GPU-intensive inferencing, Kompact AI significantly reduces power consumption and improves accessibility.
This innovation is not about trying to beat the likes of DeepSeek or ChatGPT in headline-grabbing performance. Instead, it is about building reliable, accountable AI for the public good.
Trust Over Hype
Prof. Kamakoti emphasizes that trust and accuracy are critical, especially when AI is used in learning environments. “You shouldn’t teach wrong things to children,” he says, recalling how earlier versions of large language models gave incorrect answers to real JEE Advanced questions. “It didn’t even say ‘I don’t know,’” he points out, warning against over-reliance on flashy models with questionable factual grounding.
That’s why Kompact AI is built for verifiability and domain specificity. While not tied to any one model, it provides the infrastructure for running small, accurate, and targeted language models on CPUs. This makes it energy-efficient, which is especially important in India, where infrastructure and affordability are key constraints.
The collaboration between IIT Madras and Ziroh Labs represents a growing trend of industry-academia partnerships. “We have 104 startups in our incubator,” Kamakoti says, noting that these collaborations are helping translate academic innovation into products with real societal impact.
Ultimately, India’s emerging AI ecosystem is about building in the public interest, focusing on long-term utility over short-term splash. Kompact AI may not dominate headlines like its Western counterparts, but its quiet, architecture-driven approach could power AI for the many, not just the few.