“(It’s like) humans leading a swarm of agents. I think that’s the next frontier, and a lot of productivity will come from that. The same way I build a spreadsheet today, I’m going to build hundreds and thousands of agents working to streamline my own work,” he said.
Nadella added that 2025 will be the year of large language models (LLMs) and their capabilities.
“Next year, I don’t think we’re going to be sitting here praising Moore’s Law or LLMs, because we’ll have them in such abundance. You don’t sit there and pray for abundance. You spend it,” he said.
Nilekani further added that India is emerging as the use case capital for AI, as its digital public infrastructure is in place and the government is able to balance responsible AI development with innovation. “India will become the use case capital for AI. I think a lot of things are working well for us. One, we have 15 years of experience in building population-scale digital infrastructure. So we know the game,” Nilekani said.
“We have a tech-savvy political leadership, and I know you (Nadella) met with Prime Minister Modi yesterday, and they will strike the right balance between AI innovation and safeguards. We understand that there is a need. That’s why some parts of the world are saying, don’t worry about innovation, play it safe first. But we are balancing responsible AI and innovation. I think we know the right balance, and we have a population that has learned to embrace technology,” said Nilekani, chief architect of the Digital India stack and co-founder of Infosys.
He explained that AI is already being used in various digital services. For example, Aadhaar authentication uses AI-based detection during biometric verification, and banking systems use AI-based fraud detection technology.
However, Nilekani said that for AI to serve a population of a billion people, the cost of AI inference needs to come down. “I think the inference costs need to be very frugal. There are a billion people doing all sorts of queries and agent processing, so it needs to work at that scale.”