This video shows the result of a learning-by-imitation approach in which two users demonstrate an assembly skill that requires different levels of compliance. Each item to assemble has specific characteristics that need to be transferred to the robot, and re-programming the robot for every new item would be impractical. Here, the robot instead learns the skill from demonstration: one user grasps the robot and moves it by hand to show how it should collaborate with the other user (kinesthetic teaching). A force sensor mounted at the robot's wrist and a marker-based vision tracking system track the position and orientation of table legs that must be mounted at four different points on the table top. From the demonstrations, the robot learns that it should first be compliant, letting the user re-orient the table top into a comfortable pose for screwing in the current leg. Once the user starts screwing in the leg, the robot becomes stiff to facilitate the task. This behavior is not pre-programmed; the robot learns it by extracting the regularities of the task from multiple demonstrations.
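One common way to "extract regularities" like this is to look at how consistent the demonstrations are with each other: where the demonstrated motions agree (e.g. during screwing), the task is tightly constrained and the robot should be stiff; where they vary (e.g. while the user re-orients the table top), the robot should stay compliant. The video doesn't specify the exact algorithm used, but a minimal sketch of that idea, assuming time-aligned demonstration trajectories and a made-up helper name `stiffness_from_demos`, could look like this:

```python
import numpy as np

def stiffness_from_demos(demos, k_min=10.0, k_max=500.0):
    """Map across-demonstration variance to a stiffness profile.

    demos: array of shape (n_demos, n_steps, n_dims) holding
    time-aligned demonstrated positions. Low variance across demos
    -> constrained phase -> high stiffness; high variance -> the
    exact motion doesn't matter -> stay compliant.
    """
    demos = np.asarray(demos)
    var = demos.var(axis=0).sum(axis=-1)   # total variance per time step
    # Normalize variance to [0, 1], then invert it so that
    # low variance maps to high stiffness.
    v = (var - var.min()) / (var.max() - var.min() + 1e-12)
    return k_min + (1.0 - v) * (k_max - k_min)  # shape (n_steps,)

# Toy example: 3 demos of a 1-D motion. The first 5 steps differ
# between demos (free phase); the last 5 are identical (constrained).
rng = np.random.default_rng(0)
base = np.linspace(0.0, 1.0, 10)
demos = np.stack([base + np.r_[rng.normal(0.0, 0.2, 5), np.zeros(5)]
                  for _ in range(3)])[..., None]
k = stiffness_from_demos(demos)
```

In the toy example, `k` comes out near `k_max` over the identical steps and drops toward `k_min` where the demonstrations disagree. Published systems typically do something richer (e.g. fitting a Gaussian mixture model and deriving stiffness from the local covariance), but the underlying principle is the same inverse relationship between demonstration variance and commanded stiffness.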