Earlier this year, when Figure CEO Brett Adcock called 2024 the year of embodied AI, few could have predicted the extraordinary advances in robotics that would follow.
Last week, an unexpected contender took robotics to a new level. Meta's Fundamental AI Research (FAIR) team released three new research artifacts that advance touch perception, robot dexterity, and human-robot interaction: Meta Sparsh, Meta Digit 360, and Meta Digit Plexus.
Meta Sparsh, named after the Sanskrit word for "touch", is the first general-purpose encoder for vision-based tactile sensing. The technology aims to give robots a sense of touch, addressing a key modality for interacting with the world.
Sparsh works across different types of vision-based tactile sensors and tasks using self-supervised learning, eliminating the need for labeled data. It consists of a family of models pre-trained on an extensive dataset of over 460,000 tactile images.
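For readers curious about what this looks like in practice, below is a minimal PyTorch sketch of the usual pattern for such encoders: a frozen backbone, pre-trained with self-supervision, turns a tactile image into an embedding, and a small task head is trained on top with only a little labeled data. The encoder class, embedding size, and force-regression head here are illustrative stand-ins, not Meta's actual API or released checkpoints.

```python
# Minimal sketch (not Meta's actual API): how a frozen, pre-trained
# vision-based tactile encoder such as Sparsh is typically used downstream.
import torch
import torch.nn as nn

class TactileEncoderStub(nn.Module):
    """Placeholder standing in for a pre-trained tactile encoder."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)

encoder = TactileEncoderStub()
encoder.eval()                          # frozen: pre-training was self-supervised
for p in encoder.parameters():
    p.requires_grad = False

# A small task-specific head (e.g. normal-force regression) is trained
# on top of the frozen embeddings with a modest amount of labeled data.
probe = nn.Linear(256, 1)

tactile_image = torch.rand(1, 3, 224, 224)  # one sensor frame, image-like
with torch.no_grad():
    embedding = encoder(tactile_image)      # shape: (1, 256)
force_estimate = probe(embedding)
print(embedding.shape, force_estimate.shape)
```

The design choice this illustrates is the one the release emphasises: because the heavy lifting happens during label-free pre-training, each new sensor or task only needs a lightweight head rather than a full labeled dataset.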
Meta has also released Meta Digit 360, an artificial tactile fingertip with human-level multimodal sensing capabilities. Next is Meta Digit Plexus, a platform that integrates Digit 360 and other tactile sensors across a single robot hand.
Meta FAIR said in its blog that these new artifacts support the company's goal of reaching advanced machine intelligence (AMI). AMI, also referred to as autonomous machine intelligence, is a concept from Meta's chief AI scientist Yann LeCun, envisioned to help machines assist people in their daily lives.
LeCun proposes a future in which systems can understand and model cause and effect in the physical world.
Interestingly, Meta's robotics advances aim to help the broader builder ecosystem develop machines that understand the world. Robotics, however, is nothing new for the company.
Meta's Robot Dreams Take Shape
Most of Meta's robotics development has focused on the metaverse and AI-powered AR/VR headsets. Two years ago, at its "Meta AI: Inside the Lab" event, the company emphasised that AI and robotics development was central to creating the metaverse and delivering immersive virtual experiences. Features such as Builder Bot and the Universal Speech Translator aim to enrich the metaverse experience.
Notably, Caitlin Kalinowski, Meta's former hardware head who led the development of the Orion augmented reality glasses, recently joined OpenAI to lead its robotics and consumer hardware efforts.
With its recent announcements on tactile sensing, Meta appears to be taking robotics to the next level. By open-sourcing these new models, it continues to enable individuals and businesses in the open-source community to build on its work. In the process, it is also taking on Nvidia.
Open Source for Robotics
The graphics processing unit (GPU) giant is also making major advances in robotics. Nvidia Omniverse and its digital twins power several domains, including automotive, semiconductors, and healthcare. Nvidia's Project GR00T, announced earlier this year, is a foundation model that supports the development of humanoid robots.
Last week, the company released two robotics updates. Nvidia, along with researchers from the University of California, Berkeley, Carnegie Mellon University, and other universities, released HOVER, a humanoid general-purpose controller: a 1.5-million-parameter neural network for controlling the bodies of humanoid robots. The model is said to improve the efficiency and flexibility of humanoid applications.
To further accelerate robotics development, Nvidia also released DexMimicGen, a large-scale synthetic data generator that lets humanoids learn complex skills from very few human demonstrations. This effectively reduces the time required to train a robot, given that real-world data collection is one of the biggest hurdles in humanoid development.
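To illustrate the general idea only (this is not DexMimicGen's actual interface or pipeline), the sketch below shows how a handful of object-relative demonstrations can be multiplied into a large synthetic dataset by replaying them against randomized object placements; the simulator rollout and success filtering a real system would apply are omitted.

```python
# Conceptual sketch, not DexMimicGen's actual pipeline: expanding a few human
# demonstrations into many synthetic trajectories via randomized object poses.
import numpy as np

rng = np.random.default_rng(0)

# Suppose we have 5 recorded demos, each a sequence of (x, y, z) end-effector
# positions expressed relative to the object being manipulated.
human_demos = [rng.normal(size=(50, 3)) * 0.05 for _ in range(5)]

def synthesize(demo: np.ndarray, n_variants: int) -> list:
    """Re-anchor one object-relative demo to many randomized object poses."""
    variants = []
    for _ in range(n_variants):
        new_object_pos = rng.uniform(-0.3, 0.3, size=3)    # randomized placement
        noise = rng.normal(scale=0.002, size=demo.shape)   # small execution noise
        variants.append(demo + new_object_pos + noise)
    return variants

synthetic_dataset = [
    traj for demo in human_demos for traj in synthesize(demo, n_variants=200)
]
print(len(synthetic_dataset))   # 5 demos -> 1,000 synthetic trajectories
```

In a real system, each generated trajectory would be executed in a physics simulator and kept only if the task still succeeds, which is what makes the synthetic data trustworthy enough to train on.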
“At Nvidia, we believe that the majority of the high-quality tokens in our Robot Foundation models come from simulation,” said Jim Fan, Nvidia’s embodied AI lead.
With all these advances, it is clear that more companies are taking a serious interest in robotics. OpenAI appears to be positioning itself for the future with its recent hire from Meta. It would not be surprising if Meta or OpenAI released a robot that embodies all the senses tomorrow.
Robots can already hear, see, think, move, and touch; all that remains is smell. Given that companies are already building technology to teleport scent, it would not be surprising if robots were equipped with this sense in the future!