Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system that uses a 3-D camera, a belt with separately controllable vibrational motors distributed around it, and an electronically reconfigurable Braille interface to give visually impaired users more information about their environments.
The researchers’ system consists of a 3-D camera worn in a pouch hung around the neck; a processing unit that runs the team’s proprietary algorithms; the sensor belt, which has five vibrating motors evenly spaced around its forward half; and the reconfigurable Braille interface, which is worn at the user’s side.
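The article doesn't describe how obstacle directions are assigned to the five belt motors, but since they are evenly spaced around the belt's forward half, one plausible scheme is to divide an assumed 180-degree forward arc into five slices and buzz the motor whose slice contains the obstacle's bearing. The function name, the 180-degree span, and the bearing convention (0 degrees straight ahead, negative to the left) are all assumptions for illustration, not the team's actual mapping:

```python
def motor_for_bearing(bearing_deg, motors=5, span_deg=180.0):
    """Map an obstacle bearing to one of the belt's vibration motors.

    Hypothetical sketch: we assume the five motors cover a 180-degree
    forward arc, with 0 degrees straight ahead and negative bearings
    to the wearer's left.
    """
    # Clamp the bearing into the forward arc, e.g. [-90, +90] degrees.
    half = span_deg / 2.0
    b = max(-half, min(half, bearing_deg))
    # Each motor owns one equal slice of the arc.
    slice_width = span_deg / motors
    index = int((b + half) / slice_width)
    # A bearing exactly at +90 would index one past the end; clamp it.
    return min(index, motors - 1)
```

With these assumptions, an obstacle dead ahead (bearing 0) activates the center motor, and obstacles at the far left or right edges of the arc activate the end motors.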
The key to the system is an algorithm for quickly identifying surfaces and their orientations from the 3-D-camera data. The researchers experimented with three different types of 3-D cameras, which used three different techniques to gauge depth, but all produced relatively low-resolution images — 640 by 480 pixels — with both color and depth measurements for each pixel.
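The article doesn't detail the team's surface-identification algorithm, but the core sub-problem — recovering a surface's orientation from a patch of 3-D points — can be sketched with an ordinary least-squares plane fit. Everything below (function names, the z = ax + by + c plane model) is an illustrative assumption, not the CSAIL implementation:

```python
import math

def solve3(M, v):
    """Solve a 3x3 linear system M u = v by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(M)
    out = []
    for i in range(3):
        Mi = [row[:] for row in M]
        for r in range(3):
            Mi[r][i] = v[r]
        out.append(det(Mi) / d)
    return out

def fit_plane_normal(points):
    """Least-squares fit of z = a*x + b*y + c to a patch of 3-D points.

    Returns the unit surface normal. Hypothetical sketch of
    surface-orientation estimation from depth-camera points; real
    systems typically add outlier rejection (e.g. RANSAC).
    """
    # Accumulate the 3x3 normal equations A^T A u = A^T z, u = (a, b, c).
    sxx = sxy = sx = syy = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y
        sxz += x * z; syz += y * z; sz += z
        n += 1.0
    M = [[sxx, sxy, sx],
         [sxy, syy, sy],
         [sx,  sy,  n ]]
    a, b, c = solve3(M, [sxz, syz, sz])
    # The plane z = a*x + b*y + c has normal proportional to (a, b, -1).
    norm = math.sqrt(a * a + b * b + 1.0)
    return (a / norm, b / norm, -1.0 / norm)
```

Run over small windows of a 640-by-480 depth image, a fit like this yields a per-patch normal; patches whose normals agree can then be grouped into larger surfaces such as floors and walls.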
Every Wednesday is Wearable Wednesday here at Adafruit! We’re bringing you the blinkiest, most fashionable, innovative, and useful wearables from around the web and in our own original projects featuring our wearable Arduino-compatible platform, FLORA. Be sure to post up your wearables projects in the forums or send us a link and you might be featured here on Wearable Wednesday!