National University of Singapore used Intel neuromorphic chip to develop touch-sensing robotic ‘skin’
Robots are getting closer to actually feeling things! Drawing inspiration from human skin, researchers are working on “electronic skin” to give robots a fine-grained sense of touch. From VentureBeat:
During the virtually held Robotics: Science and Systems 2020 conference this week, scientists affiliated with the National University of Singapore (NUS) presented research that combines robotic vision and touch sensing with Intel-designed neuromorphic processors. The researchers claim the “electronic skin” — dubbed Asynchronous Coded Electronic Skin (ACES) — can detect touches more than 1,000 times faster than the human nervous system and identify the shape, texture, and hardness of objects within 10 milliseconds. At the same time, ACES is designed to be modular and highly robust to damage, ensuring it can continue functioning as long as at least one sensor remains.
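To make the event-driven idea a bit more concrete, here is a minimal Python sketch of how an asynchronous, modular tactile array can keep working even as individual sensors fail. This is purely illustrative and is not the ACES or Loihi implementation; names like `EventDrivenSkin` and `TactileEvent` are made up for this example.

```python
# Illustrative sketch only (not NUS/Intel code): each sensor fires an
# asynchronous, timestamped "event" when touched, and the array keeps
# functioning with whatever sensors remain.

import heapq
import random
from dataclasses import dataclass, field


@dataclass(order=True)
class TactileEvent:
    timestamp_us: int                       # microsecond timestamp of the event
    sensor_id: int = field(compare=False)   # which sensor fired
    pressure: float = field(compare=False)  # measured pressure change


class EventDrivenSkin:
    """Toy model of a modular, asynchronous tactile array."""

    def __init__(self, num_sensors: int):
        self.active = set(range(num_sensors))  # sensors still alive
        self.queue: list[TactileEvent] = []    # events ordered by timestamp

    def damage(self, sensor_id: int) -> None:
        # Losing one sensor silences only that sensor; the rest are unaffected.
        self.active.discard(sensor_id)

    def touch(self, sensor_id: int, timestamp_us: int, pressure: float) -> None:
        # Sensors report independently and only when something happens.
        if sensor_id in self.active:
            heapq.heappush(self.queue, TactileEvent(timestamp_us, sensor_id, pressure))

    def drain(self):
        # Deliver events in time order, as an event-based decoder would see them.
        while self.queue:
            yield heapq.heappop(self.queue)


if __name__ == "__main__":
    skin = EventDrivenSkin(num_sensors=4)
    for t in range(5):
        skin.touch(random.randrange(4), timestamp_us=t * 100, pressure=random.random())
    skin.damage(2)              # the array still works with the remaining sensors
    skin.touch(0, 600, 0.8)
    for ev in skin.drain():
        print(f"t={ev.timestamp_us}us sensor={ev.sensor_id} p={ev.pressure:.2f}")
```

The design choice this sketch highlights is that no central controller polls every sensor in lockstep, which is why losing sensors degrades coverage rather than breaking the whole array.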