Scientists improve human-robot connection with non-verbal cues #robotics
Phys.org has this interesting story on a study that shows how to improve the human-robot connection.
Researchers at the University of British Columbia enlisted the help of a human-friendly robot named Charlie to study the simple task of handing an object to a person. Past research has shown that people have difficulty figuring out when to reach out and take an object from a robot because robots fail to provide appropriate nonverbal cues.
“We hand things to other people multiple times a day and we do it seamlessly,” says AJung Moon, a PhD student in the Department of Mechanical Engineering. “Getting this to work between a robot and a person is really important if we want robots to be helpful in fetching us things in our homes or at work.”
Moon and her colleagues studied what people do with their heads, necks and eyes when they hand water bottles to one another. They then tested three variations of this interaction with Charlie and the 102 study participants.
Programming the robot to use eye gaze as a nonverbal cue made the handover more fluid. Researchers found that people reached out to take the water bottle sooner when the robot either turned its head to look at the handover location, or looked at the handover location and then up at the person to make eye contact.
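The study is about gaze behavior rather than software, but if you wanted to experiment with the winning cue sequence on your own robot, it might look something like this minimal Python sketch. The `Robot` class and its method names here are hypothetical stand-ins for real hardware calls, not Charlie's actual API:

```python
# A minimal sketch of a gaze-cued handover, per the sequence the study
# found most effective: look at the handover location, then up at the
# person for eye contact, then release. The Robot class is a stub
# standing in for real hardware; swap in your own motor/servo calls.

import time


class Robot:
    """Hypothetical robot interface; real hardware calls go here."""

    def look_at(self, target):
        print(f"robot turns head to gaze at {target}")

    def extend_arm(self, target):
        print(f"robot extends arm toward {target}")

    def release_when_tugged(self):
        print("robot waits for a tug on the object, then releases")


def gaze_cued_handover(robot, handover_location):
    # Cue 1: look at the spot where the object will be offered,
    # signaling where the handover will happen.
    robot.look_at(handover_location)
    robot.extend_arm(handover_location)
    time.sleep(0.5)  # brief pause so the cue is readable to the person

    # Cue 2: look up at the person's face to make eye contact,
    # signaling that the object is ready to be taken.
    robot.look_at("the person's face")

    # Complete the handover once the person takes hold.
    robot.release_when_tugged()


if __name__ == "__main__":
    gaze_cued_handover(Robot(), "the handover location")
```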