Researchers have created a prototype prosthetic hand that can determine by touch whether fruit is ripe, and can gauge the pressure needed to hold an object firmly but gently, based on the way it “feels.”
“Soft” and “gentle” are two words not often used to describe the grasp of a robot, but thanks to the work of a team of researchers at Cornell University, future androids may well have an especially delicate touch. Using extra-sensitive optical detectors built into a soft prosthetic hand, the researchers have demonstrated a prototype that is able to tell by touch whether fruit is ripe, or to modify the pressure exerted on a material simply in response to the way it feels.
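To make the idea of adjusting grip "in response to the way it feels" concrete, here is a minimal sketch of a proportional feedback loop that backs off when the sensed force exceeds a target, so a gripper holds firmly but gently. This is an illustration only, not the Cornell team's controller: `read_grip_force()` and `set_actuator()` are hypothetical stand-ins for the hand's embedded sensing and its soft actuation, and the target force and gain are assumed values.

```python
# Illustrative proportional grip loop (not the Cornell controller).
# read_grip_force() and set_actuator() are hypothetical stand-ins.
import random
import time

TARGET_FORCE_N = 2.0   # desired grip force (assumed value)
GAIN = 0.05            # proportional gain (assumed value)

actuation = 0.0        # normalized actuation level, 0.0 (open) .. 1.0 (fully closed)


def read_grip_force(level):
    """Stand-in for a force estimate from the hand's sensors; here a noisy model."""
    return 5.0 * level + random.uniform(-0.1, 0.1)


def set_actuator(level):
    """Stand-in for commanding the soft hand's actuators."""
    print("actuation: %.2f" % level)


for _ in range(50):
    force = read_grip_force(actuation)
    error = TARGET_FORCE_N - force          # positive: squeeze more; negative: relax
    actuation = min(1.0, max(0.0, actuation + GAIN * error))
    set_actuator(actuation)
    time.sleep(0.05)
```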
The prototype combined LEDs and photosensors incorporated into bendable tubes (known as elastomeric optical waveguides) that serve as detectors of curvature, elongation, and force. These waveguides were embedded in a stretchable, soft artificial hand, allowing the researchers in the Organic Robotics Lab at Cornell to electronically register a wide range of feedback and control the hand accordingly.
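As a rough illustration of how a single waveguide channel might be read out, the CircuitPython-style sketch below samples a photodiode with an ADC, computes the drop in transmitted light relative to a relaxed-hand baseline, and maps that loss to an estimated force with a simple linear calibration. The pin name, baseline reading, and calibration constant are placeholders for illustration, not values from the Cornell work.

```python
# Hypothetical CircuitPython sketch: read one elastomeric waveguide sensor.
# Assumes a photodiode on analog pin A1 and a constantly driven LED at the other
# end of the waveguide; pin names and calibration constants are illustrative only.
import math
import time

import analogio
import board

photodiode = analogio.AnalogIn(board.A1)  # assumed wiring

# Output reading with the finger relaxed (ADC counts). In practice this would
# be measured during a calibration step.
BASELINE_COUNTS = 48000

# Illustrative linear calibration: newtons of applied force per dB of optical loss.
NEWTONS_PER_DB = 0.35


def optical_loss_db(counts, baseline=BASELINE_COUNTS):
    """Loss of the waveguide's transmitted light relative to the relaxed state, in dB."""
    counts = max(counts, 1)  # avoid log(0) if the light path is fully occluded
    return 10.0 * math.log10(baseline / counts)


def estimated_force_newtons(counts):
    """Map optical loss to force with a (hypothetical) linear calibration."""
    return NEWTONS_PER_DB * optical_loss_db(counts)


while True:
    raw = photodiode.value
    print("loss: %.2f dB  force: %.2f N" % (optical_loss_db(raw),
                                            estimated_force_newtons(raw)))
    time.sleep(0.1)
```

The underlying design choice, as described by the researchers, is that bending, stretching, or pressing the soft waveguide changes how much of the LED's light reaches the photodiode, so one optical signal can report curvature, elongation, and force from inside the body of the hand.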
“Most robots today have sensors on the outside of the body that detect things from the surface,” said Cornell doctoral student Huichan Zhao. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
The creation of the artificial hand relied on a 3D fabrication technique that Cornell has successfully employed in making other artificial body parts. On this occasion, a four-step soft lithography process (somewhat like the method used by Harvard University to make a soft-bodied octopus robot) was used to create the core through which the light of an LED propagates, the cladding that forms the outer casing of this waveguide, and a housing for the photodiode that detects the light shining through the core.