UW student researches ways to make robots more human by making them more distracted
This piece in the Badger Herald highlights new research at the University of Wisconsin on how to make robots more like us humans. The main focus is on “gaze aversion”: teaching robots to glance away and act distracted during conversation. Robots! They’re socially awkward! Just like us! via boingboing.
Research suggests that if interactive robots could pause during conversation and gaze off into the distance, as if pondering what the user was saying, that small change could make them seem less robotic.
Sean Andrist, a graduate researcher at the University of Wisconsin, studies ways to improve how communicative characters, both digitally constructed virtual agents and physical robots, maintain eye contact.
Specifically, Andrist’s research focuses on “gaze aversion,” or the moments when people glance away or look around during conversation.
Andrist has a particular interest in human-computer interaction and computer animation, so he began working at the intersection of the two, looking at how to make computer agents behave more naturally and work with users more intuitively, said his co-advisor, Bilge Mutlu, a professor in the Computer Sciences Department.
To better apply gaze mechanisms to communicative characters, Andrist said he also studies the social science of how humans behave while communicating with one another.
In his most recent paper, Andrist outlined how speakers use these aversions in conversation: they signal to listeners that cognitive processing is occurring, creating the impression that deep thought or creativity is going into formulating their speech.
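For readers curious what this might look like in code, here is a minimal, hypothetical sketch, not Andrist's actual model: a conversational agent that sometimes glances away briefly before speaking, then re-establishes eye contact while it delivers the utterance. The probability and duration values are assumptions for illustration only.

```python
import random
import time

# Hypothetical sketch (not the published model): the agent occasionally
# averts its gaze before an utterance to signal "thinking", then looks back.
AVERSION_PROBABILITY = 0.6        # assumed chance of averting before speaking
AVERSION_DURATION_S = (0.5, 1.5)  # assumed look-away duration range, seconds


def look_away():
    """Placeholder for a head/eye motion command on a real robot or avatar."""
    print("[robot] glances off to the side...")


def make_eye_contact():
    """Placeholder for returning gaze to the user."""
    print("[robot] looks back at the user")


def speak(text: str) -> None:
    """Deliver an utterance, optionally preceded by a short gaze aversion."""
    if random.random() < AVERSION_PROBABILITY:
        look_away()
        time.sleep(random.uniform(*AVERSION_DURATION_S))  # "thinking" pause
        make_eye_contact()
    print(f"[robot] says: {text}")


if __name__ == "__main__":
    for utterance in ["Hmm, that's a good question.", "Here is what I found."]:
        speak(utterance)
```

On real hardware the two placeholder functions would drive head or eye actuators; the timing logic is the only part the sketch tries to illustrate.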