The project I am doing with a fellow student is a wheelchair that you can steer with your eyes. It's for disabled people with an illness that makes them dependent on a wheelchair, like MS or ALS. Think of Stephen Hawking: he can only move his eyes and a muscle in his cheek. This project should give people like him the chance to regain some independence, empowering them and allowing some form of mobility again.

We designed our own eye tracker using safety glasses (keeping just the frame), an ordinary webcam, and SMD LEDs. We swapped the filter inside the webcam from IR-blocking to one that passes only IR light, so the tracker is independent of the surrounding light conditions: the eye is illuminated by the SMD LEDs we soldered on.
For the processing we use an Odroid U3, which is powerful enough; our initial tries with a Raspberry Pi were extremely poor in eye-tracking speed (3 fps), while the Odroid gives us some 13-15 fps.
We are using the Odroid for filtering the video stream and processing the commands selected with one eye. For the coding we use Python with the OpenCV libraries. This is the first time we are using a real programming language after moving from Lego Mindstorms NXT, so we decided Python might be easier to learn 😉
There is an Arduino add-on for the Odroid, and we use it to control the H-bridge we built for driving the motors and for calibrating the eye tracker. Here we use potentiometers as voltage dividers connected to the Arduino's A/D converters to adjust the eye tracker to the person's physiognomy.
But first we used a small robotic platform to simulate the wheelchair. In the program, the video is filtered a few times (blur, black-and-white conversion, etc.) to get a clean image with only the pupil left. Its position is calculated, and these coordinates are then compared against four different areas for the commands that steer the wheelchair: forward, back, left, right.
The boundaries of these areas are adjusted simply with potentiometers operating as voltage dividers; the Arduino's A/D converters take care of reading this input.
So you can adjust two cut-offs on the X and Y axes and a threshold for the binarisation of the webcam image. This way you can very easily adjust the tracker to different people when demonstrating the project.
To make sure the wheelchair doesn't move from eye movement alone, the corresponding command has to be confirmed with a small switch; later it will be operated by the tongue. Alternatively it could work like Stephen Hawking's device, detecting a tiny movement of a muscle in the cheek (via IR reflection, for example).
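A minimal sketch of such a confirmation gate: a command only reaches the motors while the switch is held and the same command has been stable for a short moment. The class name and hold time are assumptions for illustration:

```python
import time

class CommandGate:
    """Pass a steering command through only while the confirm switch
    is held and the command has been stable for hold_time seconds."""

    def __init__(self, hold_time=0.3):
        self.hold_time = hold_time
        self._candidate = "stop"
        self._since = 0.0

    def update(self, command, switch_pressed, now=None):
        now = time.monotonic() if now is None else now
        if command != self._candidate:
            # New candidate: restart the stability timer, keep stopped.
            self._candidate = command
            self._since = now
            return "stop"
        stable = (now - self._since) >= self.hold_time
        return command if (switch_pressed and stable) else "stop"
```

Calling `update()` once per processed frame means a stray glance never drives the motors: the chair only moves when the rider both holds the gaze and actively confirms it.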