Engineering student Federico Terzi has built an impressive computer interface device reminiscent of a Wiimote.
When talking in person, you can express meaning with facial expressions and your hands. Usually these gestures add emphasis to a statement or point out a certain object, but what if you could actually type letters based on how your hands move?
Terzi’s aptly named “Gesture Keyboard” does just this, using an Arduino Pro Micro, an MPU-6050 accelerometer/gyroscope, and an HC-06 Bluetooth module for sending motion data to his laptop. A Python program using scikit-learn’s SVM (Support Vector Machine) algorithm then translates the motion readings into characters that appear on the screen.
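To get a feel for the recognition side, here is a minimal sketch (not Terzi’s actual library) of how accelerometer recordings could be classified into characters with scikit-learn’s SVM. The window length, the random stand-in data, and the labels are all placeholders for gestures that would really be captured over the Bluetooth serial link:

```python
# Minimal sketch, assuming each gesture arrives as a fixed window of
# (ax, ay, az) accelerometer samples streamed from the HC-06.
# The sample count, training data, and labels below are placeholders.
import numpy as np
from sklearn.svm import SVC

SAMPLES_PER_GESTURE = 50  # assumed window length per gesture


def flatten_gesture(samples):
    """Turn a list of (ax, ay, az) tuples into one flat feature vector."""
    return np.asarray(samples, dtype=float).flatten()


# Fake training set standing in for previously recorded gestures;
# in practice these would be loaded from recordings made during training.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, SAMPLES_PER_GESTURE * 3))  # 40 recordings
y_train = np.array(list("abcd") * 10)                      # their labels

# Train the Support Vector Machine classifier.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

# Classify a new gesture (another random vector standing in for a real
# recording received over the serial link).
new_gesture = rng.normal(size=(1, SAMPLES_PER_GESTURE * 3))
predicted_char = clf.predict(new_gesture)[0]
print("Recognized character:", predicted_char)
```

In a real setup, the feature vector for each gesture would come from reading the MPU-6050 samples off the Bluetooth serial port rather than from random data, but the training and prediction calls would look much the same.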
Each Monday is ArduinoMonday here at Adafruit! Be sure to check out our posts, tutorials and new Arduino related products. Adafruit manufactures the Arduino right here in the United States in cooperation with arduino.cc. We have a huge selection of Arduino accessories and all the code and tutorials to get you up and running in no time!
Have an amazing project to share? The Electronics Show and Tell is every Wednesday at 7:30pm ET! To join, head over to YouTube and check out the show’s live chat and our Discord!