Earlier this year, Figure CEO Brett Adcock predicted that 2024 would be the year of embodied AI, and recent advances in robotics have arguably borne that prediction out. Last week, Meta’s Fundamental AI Research (FAIR) team released three new research artefacts aimed at advancing touch perception, robot dexterity, and human-robot interaction: Meta Sparsh, Meta Digit 360, and Meta Digit Plexus.
Meta Sparsh, named after the Sanskrit word for ‘touch’, is a general-purpose encoder for vision-based tactile sensing. It aims to give robots a sense of touch, a crucial modality for interacting with the physical world. Because Sparsh is trained with self-supervised learning, it does not require labelled data, which makes it versatile and efficient to scale. Meta Digit 360, meanwhile, is a tactile fingertip with human-level multimodal sensing capabilities. Finally, Meta Digit Plexus is a platform that integrates multiple tactile sensors onto a single robot hand, making it easier for developers to build and experiment with touch-enabled machines.
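To make the idea of a general-purpose tactile encoder concrete, here is a minimal PyTorch sketch of how such a pretrained encoder is typically used downstream: the encoder is frozen and a small task head is trained on its embeddings. Note that `TactileEncoder`, the embedding size, and the force-prediction head below are all illustrative placeholders, not Meta’s actual Sparsh architecture or API; the real checkpoints and loading code live in Meta’s released repository.

```python
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Stand-in CNN for a pretrained vision-based tactile encoder (not the real Sparsh)."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)

encoder = TactileEncoder()
encoder.eval()  # frozen: self-supervised pretraining has already happened
for p in encoder.parameters():
    p.requires_grad = False

# Small task head trained on the embeddings, e.g. estimating contact force
# (a hypothetical downstream task for illustration).
probe = nn.Linear(256, 3)

# A batch of tactile "images" from a vision-based sensor such as a DIGIT fingertip.
tactile_frames = torch.rand(8, 3, 224, 224)
with torch.no_grad():
    embeddings = encoder(tactile_frames)  # (8, 256) touch representations
force_estimate = probe(embeddings)        # (8, 3) per-sample force prediction
print(force_estimate.shape)
```

The design point this illustrates is why label-free pretraining matters: tactile data is expensive to annotate, so a single self-supervised encoder can be reused across many downstream tasks (force estimation, slip detection, grasp stability) by swapping out only the small task head.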
These advancements in robotics align with Meta’s goal of achieving advanced machine intelligence (AMI), a concept proposed by its chief AI scientist Yann LeCun. AMI, also known as autonomous machine intelligence, aims to create machines that can assist humans in their daily lives by understanding cause and effect and modelling the physical world. By open-sourcing these artefacts, Meta is not only advancing robotics but also enabling individuals and companies to build on its work within the open-source community.
Meta’s interest in robotics is not new: the company has long developed AI and hardware for the metaverse and its AR/VR headsets. With these tactile sensing announcements, however, Meta appears to be taking robotics more seriously as a field in its own right. The wider industry is moving in the same direction; OpenAI recently hired Caitlin Kalinowski, a former head of hardware at Meta, to lead its robotics and consumer hardware efforts.
In conclusion, Meta’s advancements in robotics push the boundaries of what is possible while also serving its ambition of a more immersive and interactive metaverse. By open-sourcing its models, the company is promoting collaboration and growth in the open-source community, and it will be interesting to see how its work in robotics continues to shape the future of technology.