UC Santa Barbara grad student Hannah Wolfe created ROVER. This awesome project was built with a Roomba, some sensors, an Arduino and, of course, a Raspberry Pi.
“I wanted to create an interactive art installation that would make people happy,” said Wolfe, who is based in the rather austere environment at UCSB’s Elings Hall. “I thought if something came up to people and sang to them that would bring joy into the space.”
With a Roomba (sans vacuum) as its feet, a Raspberry Pi as its brain and an Arduino acting as its nervous system, ROVER is more than an artistic endeavor, however. It is also a tool for investigating how humans respond to robots.
Think about it: We ask iPhone’s Siri or Amazon’s Alexa for help accessing information, making purchases or operating Bluetooth-enabled devices in our homes. We have given robots jobs that are too tedious, too precise or too dangerous for humans. And artificial intelligence is now becoming the norm in everything from recommendations in music and video streaming services to medical diagnostics.
“Whether we like it or not, we’re going to be interacting with robots,” Wolfe said. “So, we need to think about how we will interact with them and how they will convey information to us.”
To that end, Wolfe has ROVER generate sounds — beeps, chirps and digital effects (think R2-D2) — that she found elicit either positive or negative reactions from people. Meanwhile, an onboard camera records how individuals respond to ROVER.
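The article doesn't describe Wolfe's actual sound-design code, but a minimal sketch of how a Raspberry Pi could synthesize an R2-D2-style chirp using only Python's standard library might look like this (the frequency sweep, duration and the rising-equals-happy mapping are illustrative assumptions, not Wolfe's parameters):

```python
import math
import struct
import wave

SAMPLE_RATE = 22050  # samples per second

def chirp(f_start, f_end, duration, sample_rate=SAMPLE_RATE):
    """Generate a sine sweep from f_start to f_end Hz as 16-bit samples."""
    n = int(duration * sample_rate)
    samples = []
    phase = 0.0
    for i in range(n):
        t = i / sample_rate
        # Linearly interpolate the instantaneous frequency across the sweep
        freq = f_start + (f_end - f_start) * (t / duration)
        phase += 2 * math.pi * freq / sample_rate
        samples.append(int(0.6 * 32767 * math.sin(phase)))
    return samples

def write_wav(path, samples, sample_rate=SAMPLE_RATE):
    """Write mono 16-bit samples to a WAV file that any player can open."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit
        w.setframerate(sample_rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# A short rising sweep (500 Hz -> 1500 Hz over 0.3 s) tends to be heard as
# upbeat; a falling sweep as downbeat. These values are hypothetical.
write_wav("rover_chirp.wav", chirp(500, 1500, 0.3))
```

On a Pi, the resulting file could then be played through a speaker with any audio player (e.g. `aplay rover_chirp.wav`), letting ROVER pick a rising or falling sweep depending on the reaction it wants to invite.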
Each Friday is PiDay here at Adafruit! Be sure to check out our posts, tutorials and new Raspberry Pi related products. Adafruit has the largest and best selection of Raspberry Pi accessories and all the code & tutorials to get you up and running in no time!