After playing with some of the demo programs, I noticed there was a Processing library on the Leap developer site (it can be found here). The control program for my hexapod is already written in Processing, so adding the Leap was relatively easy.
All I have done so far is map the x, y, z, yaw, pitch, and roll of the user's hand onto the body of the hexapod. The smoothing and inverse kinematics are then done, and the result is sent to the hexapod over TCP/IP. A Raspberry Pi on the hexapod receives this with a Python script and outputs the values to the servos. Hand tracking with the Leap is enabled or disabled by a key-tap gesture with any finger.
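The Pi-side receiver can be sketched roughly as below. This is a minimal illustration, not the author's actual script: it assumes the Processing program sends one newline-terminated, comma-separated pose message (`x,y,z,yaw,pitch,roll`) per update, and it only prints the parsed pose where the real script would write servo positions.

```python
import socket

def parse_pose(line):
    """Parse one 'x,y,z,yaw,pitch,roll' message into a dict of floats."""
    x, y, z, yaw, pitch, roll = (float(v) for v in line.strip().split(","))
    return {"x": x, "y": y, "z": z, "yaw": yaw, "pitch": pitch, "roll": roll}

def serve(host="0.0.0.0", port=5005):
    """Accept one TCP connection and handle pose messages line by line."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn, conn.makefile() as stream:
            for line in stream:
                pose = parse_pose(line)
                # The real script would convert this pose to servo pulse
                # widths (after the IK results arrive) and drive the servos.
                print(pose)

if __name__ == "__main__":
    serve()
```

The wire format and port number here are assumptions for illustration; any framing that keeps one pose per message (newline-delimited text, as above, or a fixed-size binary packet) would work equally well.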