Via TU Delta
DelFly, TU Delft’s home-grown micro air vehicle (MAV), could avoid colliding with obstacles to successfully complete a circuit around a room in December 2013. Two years on, the MAVlab group at Aerospace Engineering (LR) has focused on software to give greater autonomy to the 20-gram flapping-wing machine, which is a little bigger than an open hand.
It now has better algorithms that allow it to see further and more accurately, and to react to movement. Remaining airborne for up to 25 minutes, it can see as far as seven metres within a 60-degree field of view. “It sees by counting pixels and triangulating,” said Roland Meertens, artificial intelligence researcher at the MAVlab. “Using the distance between its two cameras as a baseline, it calculates the range to obstacles.” It has shown it can explore more complex interiors, such as the corridors at LR. The challenge now is to make it not shy away from flying through doors and windows, a trivial action for us.
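The stereo ranging Meertens describes follows the standard pinhole triangulation relation: range is proportional to the camera baseline and inversely proportional to the pixel disparity between the two views. A minimal sketch, with illustrative numbers rather than DelFly’s actual calibration:

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Estimate the distance to an obstacle from stereo disparity.

    Uses the standard triangulation relation Z = f * b / d: the larger
    the pixel disparity between the two camera views, the closer the
    obstacle. Baseline is in metres, focal length and disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

# Illustrative (hypothetical) numbers: a 6 cm baseline and 300 px focal
# length give a 6 m range estimate for a 3 px disparity.
print(stereo_range(0.06, 300, 3))  # 6.0
```

Because range grows as disparity shrinks, a small error of a pixel or less dominates at the far end of the seven-metre envelope, which is why the baseline between the two tiny cameras matters so much on a 20-gram airframe.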
To achieve this, approaches based on mapping the complete surroundings depend on ramping up expensive computational power. They still do not guarantee success and are not suitable for a lean, lightweight drone, so MAVlab is applying evolutionary robotics to design DelFly’s vision-based controllers. The starting point was an algorithm, designed by PhD researcher Sjoerd Tijmons, so effective that it guaranteed collision avoidance, provided the vision camera system supplied enough suitable data.
To find the most effective algorithms for navigating building interiors and passing through openings without getting stuck, candidate controllers are typically tested in a 3D evolutionary simulation environment, which generates and selects the best-performing controller for further development. “We would carry out four trials for each of the 100 individuals over 250 generations, which comes to 100,000 runs,” said Guido de Croon, assistant professor at LR. “This would take up to a week, and even then, when we ported the best controllers to the real drone, the success rate was still only slightly above 50%. To tackle this, we developed a two-stage simulation process instead, starting with simpler, faster algorithms which would then be made more complex before being tried in reality.” Their colleague, PhD researcher Kirk Scheper, proposed using behaviour trees, an artificial intelligence technique from the gaming industry. This computational structure composes simple behaviours to handle multiple situations and complex tasks. It is also easier for humans to understand, letting them manually modify the controllers to work on the real drone.
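The appeal of behaviour trees is that a controller becomes a small, human-readable tree of conditions and actions rather than an opaque learned function. A minimal sketch of the idea in Python, with a hypothetical obstacle-avoidance tree; the node types follow standard behaviour-tree semantics, and the thresholds and commands are illustrative, not DelFly’s actual controller:

```python
SUCCESS, FAILURE = "success", "failure"

class Sequence:
    """Tick children in order; fail as soon as one fails."""
    def __init__(self, *children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            if child.tick(state) == FAILURE:
                return FAILURE
        return SUCCESS

class Selector:
    """Tick children in order; succeed as soon as one succeeds."""
    def __init__(self, *children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            if child.tick(state) == SUCCESS:
                return SUCCESS
        return FAILURE

class Condition:
    """Leaf node: succeeds when its predicate holds for the current state."""
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self, state):
        return SUCCESS if self.predicate(state) else FAILURE

class Action:
    """Leaf node: performs a side effect on the state and succeeds."""
    def __init__(self, effect):
        self.effect = effect
    def tick(self, state):
        self.effect(state)
        return SUCCESS

# Hypothetical avoidance tree: turn away when the stereo range says an
# obstacle is closer than 2 m, otherwise keep flying straight ahead.
tree = Selector(
    Sequence(
        Condition(lambda s: s["range_m"] < 2.0),
        Action(lambda s: s.update(command="turn")),
    ),
    Action(lambda s: s.update(command="straight")),
)

state = {"range_m": 1.5}
tree.tick(state)
print(state["command"])  # turn
```

Because each node is a named, inspectable unit, an engineer can hand-tune a threshold or swap a subtree after evolution has run, which is exactly the manual-modification step the researchers describe when porting controllers to the real drone.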
If you want to learn more about the design and AI, a book about DelFly has been published, available via Delfly.nl.
The book introduces the topics most relevant to autonomously flying flapping-wing robots: flapping-wing design, aerodynamics, and artificial intelligence. Readers can explore these topics in the context of the DelFly project.