Meta Platforms Inc.'s chief AI scientist Yann LeCun was in Davos in January for the annual World Economic Forum. LeCun said the conference partly prompted his decision to visit India for the third time.
"India has a dynamic ecosystem," he said in an interview with Mint on the sidelines of Meta's "Build with AI Summit" on Wednesday.
"I recently saw innovative ideas at a hackathon (a 30-hour AI hackathon built around Meta's Llama, held in Bengaluru this month). AI tools like Meta AI, which can be accessed via WhatsApp and Messenger, have gained large-scale traction, and India hosts one of the world's largest user communities," says LeCun.
"We also ran an experiment in rural India introducing Meta AI, and it was adopted immediately. That is the potential of AI across sectors such as education, healthcare, agriculture, and business," says LeCun, who was wearing Ray-Ban Meta glasses that have not yet been launched in India.
LeCun's decision to visit India was also reinforced by policy experts, who urged him to engage with the country's scientific, developer, and government landscape, where platforms like Llama could have a significant impact, he said. Llama is a family of large language models (LLMs) developed by Meta.
However, he emphasized, "What is missing in India is world-class research laboratories outside the universities." Such labs, he said, could motivate students to pursue AI careers in India.
LeCun pointed to a similar change in France, where Meta's lab "catalyzed the local AI ecosystem, encouraged students to pursue graduate research, and prompted companies to establish labs of their own," he said.
FAIR Paris now graduates about a dozen PhDs every year, and LeCun believes India can replicate this model.
"Godfather of AI"
Meta operates widely in India through Facebook, WhatsApp, and Instagram. It also supports small and medium-sized businesses with digital tools and partners with universities on AI research. For example, Nilekani's Infosys Ltd announced a partnership with Meta on Wednesday to promote generative AI innovation through open-source projects.
In addition to contributing collaboration and open-source AI models to the Indian ecosystem, Meta is exploring metaverse opportunities.
LeCun, a professor at New York University, is widely known for his work on deep learning and machine learning. He is often called a "Godfather of AI," alongside Geoffrey Hinton, who won the Nobel Prize in Physics this year, and Yoshua Bengio, a computer science professor at the Université de Montréal.
All three won the 2018 ACM Turing Award, often called the "Nobel Prize of computing."
It will take time to achieve human-level AI, but we will get there. There is no doubt.
LeCun is known for his optimistic perspective, which contrasts with the more cautious views of figures such as Hinton and Elon Musk.
While acknowledging that AI systems must still evolve to possess common-sense reasoning, planning ability, and persistent memory, LeCun said, "It will take time to achieve human-level AI, but it is within reach. We have no doubt."
LeCun says today's systems are "autoregressive large language models": they generate text by predicting the next token from the preceding tokens, processing input from left to right, rather than modeling the sequence in both directions.
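A minimal sketch of that autoregressive loop, using a hand-written bigram lookup table in place of a real neural network (the vocabulary and transitions are invented purely for illustration):

```python
# Toy autoregressive generation: each token is predicted from the tokens
# before it, left to right, as LeCun describes. The "model" here is a
# hard-coded bigram table, not a trained network.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt_token: str, max_new_tokens: int) -> list[str]:
    tokens = [prompt_token]
    for _ in range(max_new_tokens):
        nxt = BIGRAMS.get(tokens[-1])  # predict next token from the last one
        if nxt is None:                # no known continuation: stop early
            break
        tokens.append(nxt)
    return tokens

print(generate("the", 3))  # -> ['the', 'cat', 'sat', 'on']
```

A real LLM replaces the lookup table with a transformer that outputs a probability distribution over the whole vocabulary, but the left-to-right sampling loop is the same.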
Not as smart as a four-year-old
LeCun often says that even the largest LLMs are not as smart as a four-year-old. According to him, future AI systems will need to understand the physical world and be trained not just on text but on video and real-world experience.
This shift is essential because LLMs have a limited scope and cannot achieve the full range of human intelligence, LeCun says; Meta's FAIR lab is focused on developing next-generation AI models that go beyond LLMs.
Real breakthroughs will come when AI can grow smarter by understanding the world through direct interaction.
Today's LLMs are trained on up to 20 trillion tokens, covering most publicly available text, but according to LeCun this data is still not enough to learn all languages, including India's roughly 700 dialects.
This suggests that models will need to learn from real-world experience beyond vast text datasets, much as children learn through vision and interaction.
"Research shows that by the age of four, a child's visual cortex has taken in about as much data as the largest LLMs are trained on, but in a much shorter time frame. Future AI needs foundation models grounded in real experience, especially video and sensory data, to learn the way humans and animals do," he said.
"Language enhances these systems, but the real shift comes when AI understands the world through direct interaction and becomes smarter with less data, because that understanding serves as a basis for higher intelligence and better generalization," LeCun explains.
Fears of rogue AI
Skeptics, nevertheless, warn that the rise of AI could produce systems beyond human control and propose strict regulation to prevent such scenarios. LeCun rejects these fears as exaggerated.
LeCun believes these systems will not threaten employment or autonomy. Instead, they will function as intelligent assistants, empowering people by amplifying human intelligence.
Superintelligent AI could trigger a new Renaissance, much like the transformational impact of the printing press.
"Imagine accessing a virtual assistant at any time through smart glasses or a smartphone, with advice on every topic. It is like having a staff of highly skilled people: these systems can enhance academic, business, or political decisions, but their intelligence is not intimidating."
"These systems are designed to do our bidding. They have no intention of dominating us, or anything of the sort. This is where I disagree with some of my friends, like Hinton," LeCun says, arguing that such technology could trigger a new Renaissance, much like the transformational impact of the printing press.
Is Meta really open source?
That said, critics argue that Meta's AI models are not truly open source but rather "open weight," because Meta shares only the weights (parameters), not the training source code or data.
LeCun acknowledged that "openness" in AI is nuanced, spanning models, inference code, training code, and data, each with varying degrees of openness.
Currently, open weights and inference code are the most accessible: users can efficiently download models, fine-tune them, or adapt them to various platforms. This approach, he explained, promotes innovation and lets startups and companies deploy AI quickly.
According to him, "AI is evolving into common infrastructure, just as Linux powers the Internet. Community involvement guarantees security, faster development, and lower costs, and open-source platforms will prosper."
Future AI systems may be trained in a distributed fashion on local data, which is essential for multilingual models that can cover India's 700 languages.
Open-source hardware also plays an important role: Meta's Open Compute Project (OCP) sets standards for open server designs, and similar frameworks may emerge in AI.
Training AI models requires powerful graphics processing units (GPUs), a market currently dominated by Nvidia Corp. The challenge, however, lies in scaling that infrastructure to serve millions of users in large markets such as India.
According to LeCun, innovation in hardware and software optimization is crucial, and this evolution is already under way to make AI accessible to billions of people.
Alternatives to transformer models
LeCun believes the transformer model, which underlies most LLMs, will remain a core building block of future AI, but that the current approach, based on decoder-only autoregressive models like today's LLMs, is likely to evolve.
"These architectures may be replaced by more advanced systems, such as joint embedding predictive architectures. JEPA, a non-generative model, is better suited to tasks involving video, images, and long-range dependencies, but transformers will be incorporated as components," LeCun says. "These systems can build hierarchical world models, run simulations, and become more effective at reasoning and planning."
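A toy sketch of the idea behind a joint embedding predictive architecture: instead of generating the target (pixels or tokens) directly, the model predicts the target's embedding, and the training loss lives in embedding space. The tiny random linear encoders and sample inputs below are invented for illustration and are not Meta's actual JEPA:

```python
import random

random.seed(0)

DIM_IN, DIM_EMB = 4, 2  # input size and embedding size (illustrative)

def rand_matrix(rows, cols):
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

# Two encoders map the context x and the target y into a shared embedding
# space; a small predictor tries to predict the target's embedding from
# the context's embedding. The error is measured between embeddings, not
# in the raw input space as a generative model would do.
enc_x = rand_matrix(DIM_EMB, DIM_IN)
enc_y = rand_matrix(DIM_EMB, DIM_IN)
pred = rand_matrix(DIM_EMB, DIM_EMB)

def jepa_loss(x, y):
    sx = matvec(enc_x, x)       # context embedding
    sy = matvec(enc_y, y)       # target embedding
    sy_hat = matvec(pred, sx)   # predicted target embedding
    return sum((a - b) ** 2 for a, b in zip(sy_hat, sy)) / DIM_EMB

x = [0.1, 0.2, 0.3, 0.4]  # e.g. the visible part of a video frame
y = [0.4, 0.3, 0.2, 0.1]  # e.g. the masked part to be predicted
loss = jepa_loss(x, y)
print(loss)
```

In a real system the encoders would be deep networks trained by gradient descent; the sketch only shows where the prediction and the loss sit, which is what distinguishes JEPA from generative next-token models.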
He added that AI technology evolves rapidly; LLMs, for example, may no longer dominate the landscape five years from now. Deep learning, neural networks, and transformers will persist, he said, even as architectures and applications shift.
"As the technology keeps evolving, the ability to adapt is what matters, so prioritize learning how to learn. That is the only guarantee of staying ahead of the curve," he said. LiveMint