Two computer science students from the University of Pennsylvania, Eric Berdinis and Jeff Kiske, have hacked together a very impressive tactile feedback system for the visually impaired using a Microsoft Kinect and a number of vibration actuators. The Kinecthesia is a belt-worn camera system that detects the location and depth of objects in front of the wearer using the Kinect's depth sensor. This information is processed on a BeagleBoard open hardware platform and then used to drive six vibration motors positioned to the wearer's left, center, and right.
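The team's own firmware isn't shown here, but the core idea, splitting each depth frame into left/center/right zones and buzzing harder as obstacles get closer, could be sketched roughly as follows. The zone split, the 3 m range cap, and the linear intensity curve are illustrative assumptions, not the Kinecthesia team's actual parameters.

```python
# Hypothetical sketch: map a depth frame to three vibration intensities.
# Zone boundaries, depth range, and motor scaling are illustrative guesses.

def zone_intensities(depth_frame, max_range_mm=3000):
    """Split a depth frame (rows of per-pixel distances in mm) into
    left/center/right thirds and return a 0-255 vibration level per zone:
    the nearer the closest obstacle in a zone, the stronger the buzz."""
    width = len(depth_frame[0])
    third = width // 3
    zones = [(0, third), (third, 2 * third), (2 * third, width)]
    levels = []
    for start, end in zones:
        # Nearest valid reading in the zone (the Kinect reports 0 mm for "no data").
        readings = [row[c] for row in depth_frame
                    for c in range(start, end) if row[c] > 0]
        nearest = min(readings) if readings else max_range_mm
        nearest = min(nearest, max_range_mm)
        # Closer obstacle -> higher PWM duty for that zone's pair of motors.
        levels.append(int(255 * (1 - nearest / max_range_mm)))
    return levels
```

In a real build, each of the three levels would drive a pair of motors via PWM from the BeagleBoard, and the loop would run once per Kinect frame.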