I wanted to test the idea of using the tongue to control a computer. The most obvious approach is to map tongue movements, constrained to a plane, onto x/y coordinates in a Windows-style UI. This doesn’t work terribly well: even though the tongue has fine motor control, it’s very difficult to smoothly achieve the 2D movements needed to operate a system designed around a classic mouse. There are, however, possible applications for swipe interfaces, carousel menus, yes/no input, and the like.
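One way to tame jittery 2D input like this is to low-pass filter the raw readings before mapping them to cursor coordinates. The sketch below is a minimal illustration, not code from this project; the sensor values and ranges are made up, and a real driver would stream readings continuously rather than take a list.

```python
# Hypothetical sketch: smoothing noisy (x, y) tongue-sensor readings with an
# exponential moving average before mapping them to cursor coordinates.
# The sample values below are invented for illustration.

def ema_smooth(samples, alpha=0.3):
    """Exponential moving average over a list of (x, y) samples.

    alpha near 0 -> heavy smoothing (steady but laggy cursor);
    alpha near 1 -> light smoothing (responsive but jittery).
    """
    smoothed = []
    sx, sy = samples[0]  # seed the filter with the first reading
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

# Simulated noisy readings from a tongue held near one position:
raw = [(100, 100), (108, 95), (97, 104), (103, 99), (99, 101)]
print(ema_smooth(raw))
```

Even with smoothing, the lag/jitter trade-off is part of why free 2D cursor control feels awkward, while coarse gestures (swipes, yes/no) tolerate noisy input much better.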