Thirty-nine years ago, Pixar's CG wizards proved that a faceless desk lamp could be extremely expressive and incredibly cute. Now Apple's home robotics efforts are showing what such an adorable lamp might look like in real life. The tech giant has been working on a slightly goofy lamp that moves expressively while responding to requests, and it could end up being a more compelling showcase for Apple Intelligence than any of the AI assistants on your iPhone.
Apple's Machine Learning Research division submitted a relatively short research paper to the arXiv preprint repository last month. MacRumors spotted the paper and uploaded the accompanying video of the expressive lamp to YouTube. The device is immediately reminiscent of Pixar's mascot, Luxo Jr., and it's somehow just as cute. In the video, engineers gesture at the lamp to get it to move closer or look in a specific direction. According to the paper, the lamp doesn't simply move from point A to point B; it conveys various states of "attention," "attitude," and "expression" through its movements, which can read as confusion or curiosity. Apple calls this framework ELEGNT, a clumsy acronym for "expressive and functional movement design for non-anthropomorphic robots."
What do you know, Apple might be onto something here. An expressive robot is far more interesting than one that merely does what it's told. In one highlight, the robot arm stretched to look at a note it couldn't reach, then shook its head in disappointment and apologized in an AI-generated voice. In another part of the video, a user asked the lamp about the weather; the robot looked out the window, then turned back and spoke the forecast (the glance bought time while the query was apparently sent off to a generative AI model for processing). It even looked dejected when it was told it couldn't come along on its owner's hiking trip.
The tricky part is making a device work well while it has what could be called "attitude." I remember struggling through The Last Guardian on PlayStation 4, where negotiating with a giant beast that may or may not follow the player's orders, just like a real puppy, could be engaging but also, at times, tedious and infuriating. Another question for this "ELEGNT" device is whether expressive movement merely papers over the same kind of dull, canned speech you'd otherwise get from an AI chatbot. No matter what, the bot is still pretending to have feelings. What matters is whether the algorithm-generated body language is convincing enough that you can overlook the voice.
There's been plenty of speculation about how Apple plans to break into smart home technology beyond just another HomePod. The latest rumors, from Bloomberg's typically reliable sources, claim that Apple engineers are building a device with a touchscreen mounted on the end of a robotic arm. The arm is supposed to follow you around the room and respond to you. Cost and practicality will inevitably get in the way of this kind of Jetsons-style future tech, but it's a neat concept.
These expressive interactions could set Apple's future smart home technology apart from the many other smart screens on the market. The Apple engineers prominently cite a major 2014 paper in the Journal of Human-Robot Interaction by Cornell University associate professors Guy Hoffman and Wendy Ju. Many of the tests in that earlier research were carried out with the "Wizard of Oz" technique, in which a "man behind the curtain" essentially controls the bot. With the wide-ranging improvements in robotics and AI since then, we should finally get to see how well this works when a human isn't controlling these interactions (though it seems nobody has mentioned that to the Tesla team).
More recent reports note that Apple is also developing a smart home camera and a separate smart home display. This new push into smart home tech could spread Apple Intelligence even further, but Apple inevitably wants to lure customers deeper into the warm confines of its walled garden. Why buy another brand that may not play nice with your Apple devices when you can buy Apple, at a premium price?