MIT researchers have created a headset device that reads subvocalization: electrodes on the face and jaw measure the neuromuscular signals produced when users verbalize words internally, and the system responds to what they are saying in their heads. The bulky white gadget also includes bone-conduction transducers that send audio answers through the bones of the face to the inner ear, so the device's responses stay just as silent to the outside world.
The signals are processed by a computer that uses neural networks, so users can issue commands to navigate apps, ask the time, and learn optimal countermoves in a game of chess, for example — all while communicating in utter silence. “The motivation for this was to build an IA device — an intelligence-augmentation device,” said MIT grad student and lead author Arnav Kapur in a statement. “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”
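If you're curious what that “electrodes in, neural network out” loop might look like in code, here's a minimal sketch in PyTorch of classifying windows of multi-channel neuromuscular signals into a small command vocabulary. To be clear, this is purely illustrative: the channel count, window length, command list, and network shape are our own assumptions, not the MIT team's actual architecture.

```python
# Illustrative sketch only: map a window of multi-channel neuromuscular
# (EMG-style) signals to one of a few silent-speech commands.
import torch
import torch.nn as nn

N_CHANNELS = 7        # assumed number of face/jaw electrodes
WINDOW = 250          # assumed samples per classification window
COMMANDS = ["up", "down", "select", "time", "add"]  # hypothetical vocabulary

class SubvocalClassifier(nn.Module):
    def __init__(self, n_channels=N_CHANNELS, n_classes=len(COMMANDS)):
        super().__init__()
        # 1-D convolutions over time pick up short neuromuscular activation patterns
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # average over time -> one feature vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)      # logits over the command vocabulary

if __name__ == "__main__":
    model = SubvocalClassifier()
    fake_window = torch.randn(1, N_CHANNELS, WINDOW)  # stand-in for real electrode data
    probs = torch.softmax(model(fake_window), dim=-1)
    print("predicted command:", COMMANDS[int(probs.argmax())])
```

In a real wearable you'd stream windows like this continuously and train on labeled recordings of internally verbalized words, but the shape of the pipeline — signal window in, softmax over a small vocabulary out — would look much the same.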
Every Wednesday is Wearable Wednesday here at Adafruit! We’re bringing you the blinkiest, most fashionable, innovative, and useful wearables from around the web and in our own original projects featuring our wearable Arduino-compatible platform, FLORA. Be sure to post up your wearables projects in the forums or send us a link and you might be featured here on Wearable Wednesday!
Have an amazing project to share? The Electronics Show and Tell is every Wednesday at 7:30pm ET! To join, head over to YouTube and check out the show’s live chat and our Discord!