
Credit: Unsplash/CC0 Public Domain
It has been a goal for as long as humanoids have been a subject of popular imagination: a general-purpose robot that can perform rote tasks like folding laundry or sorting recycling just by being asked.
On September 25, Google DeepMind, Alphabet's AI lab, made a buzz in the space by showcasing a humanoid robot seemingly doing just that.
The company published a blog post and a set of videos of Apptronik's humanoid robot Apollo folding clothes, sorting items into boxes, and even placing items into a person's bag, all in response to natural language commands.
It was part of a showcase of the company's latest AI models, Gemini Robotics 1.5 and Gemini Robotics-ER 1.5. The aim of the announcement was to demonstrate how large language models can be used to help physical robots "perceive, plan [and] think" to complete "multi-step tasks," according to the company.
You should view DeepMind's latest news with a bit of skepticism, particularly around claims of robots being able to "think," says Ravinder Dahiya, a Northeastern professor of electrical and computer engineering who co-authored a Nature Machine Intelligence paper on how AI can be integrated into robots.
Gemini Robotics 1.5 and Gemini Robotics-ER 1.5 are known as vision-language-action models, which means they rely on vision sensors and image and language data for much of their analysis of the outside world, explains Dahiya.
Gemini Robotics 1.5 works by "turning visual information and instructions into motor commands," while Gemini Robotics-ER 1.5 "specializes in understanding physical spaces, planning, and making logistical decisions within its surroundings," according to Google DeepMind.

Ravinder Dahiya, a Northeastern professor of electrical and computer engineering, is an expert on robotic touch sensing. Credit: Matthew Modoono/Northeastern University
While it may all seem like magic on the surface, it is all based on a well-defined set of rules. The robot is not actually thinking independently. It is all backed by lots of high-quality training data, structured scenario planning and algorithms, Dahiya says.
"It becomes easy to iterate visual and language models in this case because there is a good amount of data," he says. "Vision in AI is nothing new. It has been around for a very long time."
What is new is that the DeepMind team has been able to integrate that technology with large language models, allowing users to ask the robot to perform tasks using simple language, he says.
That is impressive and "a step in the right direction," Dahiya says, but we are still far from having humanoid robots with sensing or thinking capabilities on par with humans, he notes.
For instance, Dahiya and other researchers are in the process of developing sensing technologies that give robots a sense of touch and tactile feedback. Dahiya, in particular, is working on developing electronic robot skins.
Unlike vision data, there is not nearly as much training data for that kind of sensing, he highlights, which is critical in applications involving the manipulation of soft and hard objects.
And that is just one example. We still have a long way to go in giving robots the ability to register pain and smell, he adds.
"For hazardous environments, you have to rely on all sensor modalities, not just vision," he says.
More information:
Aude Billard et al, A roadmap for AI in robotics, Nature Machine Intelligence (2025). DOI: 10.1038/s42256-025-01050-6
This story is republished courtesy of Northeastern Global News news.northeastern.edu.
Citation:
Humanoid robots in the home? Not so fast, says expert (2025, October 3)
retrieved 10 November 2025
from https://techxplore.com/news/2025-10-humanoid-robots-house-swiftly-expert.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.