Why is it that every time I try to insert a USB plug it’s backward? Shouldn’t it be right at least half the time by dumb luck? Whatever my problem is, a dextrous new robot doesn’t have it. The robot’s advantage is that its fingertips don’t just feel—they see, too.
Researchers at Northeastern University and (where else?) MIT created the plug-savvy bot. They started with an existing factory-worker robot called Baxter and gave it a pair of pinching fingers. Then on one finger, they added a shrunken-down and adapted version of a sensing technology called GelSight, invented a few years ago.
GelSight creates a precise sense of touch by, essentially, combining it with vision. It uses a small box surrounding a rubber surface that’s coated in metallic paint. A different color of light shines across the surface from each of the box’s four sides: red, green, blue, and white. The rubber acts like the pad of the robot’s finger. As it presses against an object, a camera looking down from the top of the box observes how the deforming metallic coating reflects each color of light. An algorithm then uses that information to build a three-dimensional map of the object’s surface.
“Basically we convert touch information to images,” says lead author Rui Li, a PhD student at MIT. This gives the device an advantage over other kinds of sensors, because it can have a very high resolution—as high as the resolution of the camera built into it.
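The idea of recovering shape from how differently colored, differently angled lights reflect off a surface is classic photometric stereo. As a rough illustration only (not GelSight’s actual code; the function names, the Lambertian assumption, and the crude gradient integration are all mine), a minimal NumPy sketch looks like this:

```python
import numpy as np

def normals_from_shading(images, light_dirs):
    """Estimate per-pixel surface normals from images of the same
    surface lit from known directions (basic photometric stereo).

    images: (k, h, w) array of grayscale intensities
    light_dirs: (k, 3) array of light-direction vectors
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                # (k, h*w)
    # Lambertian model: I = L @ G, where G = albedo * normal
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)  # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)
    return normals.reshape(3, h, w), albedo.reshape(h, w)

def depth_from_normals(normals):
    """Integrate normals into a height map by naive cumulative
    summation of the surface gradients (a crude stand-in for a
    proper integration method)."""
    nx, ny, nz = normals
    nz = np.where(np.abs(nz) < 1e-6, 1e-6, nz)
    p, q = -nx / nz, -ny / nz                # dz/dx, dz/dy
    return (np.cumsum(q, axis=0) + np.cumsum(p, axis=1)) / 2.0
```

With four lights, the per-pixel system is overdetermined, which is why the sketch uses a least-squares solve; a higher-resolution camera simply means more pixels, each solved the same way.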
To test how well GelSight works as a robot’s fingertip, Li and his coauthors asked the newly outfitted Baxter bot to pick up a USB cable dangling from a hook and plug it in. “USB insertion is a very fine manipulation that requires sub-millimeter accuracy, which can be very challenging,” Li says.
Each time the robot picked up the USB cable, it had to feel for the embossed USB symbol on the plug, then use that information to align the plug precisely over the hole. (To treat their sensor gently, the team didn’t make Baxter push the plug all the way down—partway in was good enough.) After the researchers trained the robot, it succeeded on 34 out of 36 attempts to plug in the cable.
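The feel-then-align step is a closed-loop pose correction: sense where the plug actually is, nudge the gripper, and repeat until the error is within tolerance. The article doesn’t describe the team’s controller, so here is only a toy sketch with hypothetical stand-ins (`sense_offset` fakes the tactile measurement; real code would localize the embossed symbol in the sensor image and command the arm):

```python
import random

TOLERANCE_MM = 0.5   # sub-millimeter target mentioned in the article

def sense_offset(true_offset_mm):
    """Hypothetical stand-in for tactile localization: return the
    plug's lateral offset from the socket as the fingertip sensor
    would measure it, with a little noise (all values in mm)."""
    return true_offset_mm + random.uniform(-0.1, 0.1)

def align_plug(initial_offset_mm, max_steps=20):
    """Iteratively nudge the gripper until the sensed offset is
    within tolerance. Returns the number of corrective steps taken,
    or None if alignment never converged."""
    offset = initial_offset_mm
    for step in range(max_steps):
        measured = sense_offset(offset)
        if abs(measured) <= TOLERANCE_MM:
            return step          # aligned: safe to push the plug in
        offset -= measured       # command a corrective move
    return None
```

Because each correction subtracts the measured error, the residual offset shrinks to the size of the measurement noise after one move, which is why sub-millimeter sensing is the hard part rather than the motion itself.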