MIT and Cornell University researchers are developing drones that automatically position themselves to provide the right photographic lighting.

Researchers at MIT and Cornell hope to spare photographers the painstaking work of positioning lights by hand, providing them instead with squadrons of small, light-equipped autonomous robots that automatically assume the positions needed to produce lighting effects specified through a simple, intuitive, camera-mounted interface.
At the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging in August, they take the first step toward realizing this vision, presenting a prototype system that uses an autonomous helicopter to produce a difficult effect called “rim lighting,” in which only the edge of the photographer’s subject is strongly lit.
According to Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and is now a senior researcher at Nokia, he and his coauthors — MIT professor of computer science and engineering Frédo Durand and Cornell’s Kavita Bala, who also did her PhD at MIT — chose rim lighting for their initial experiments precisely because it’s a difficult effect.
“It’s very sensitive to the position of the light,” Srikanth says. “If you move the light, say, by a foot, your appearance changes dramatically.”
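That sensitivity suggests why a feedback loop is needed rather than a one-shot flight plan: the drone must keep correcting its position as the measured rim changes. The sketch below is a purely illustrative simplification, not the authors' algorithm — the function names, the rim-lit-fraction metric, and the single-axis proportional controller are all assumptions for the sake of example.

```python
# Hypothetical sketch: a proportional feedback loop nudging a light-carrying
# drone until the measured rim-lit fraction of the subject's silhouette
# matches a photographer-specified target. All names and the rim metric
# are illustrative assumptions, not the published system's method.

def rim_lit_fraction(edge_brightness, lit_threshold=0.8):
    """Fraction of silhouette-edge pixels brighter than a threshold."""
    lit = sum(1 for b in edge_brightness if b >= lit_threshold)
    return lit / len(edge_brightness)

def adjust_light_position(position, edge_brightness, target=0.3, gain=0.5):
    """Shift the light laterally (metres) in proportion to the rim error.

    A positive error (rim too dim) pushes the light further behind the
    subject; a negative error pulls it back toward the camera.
    """
    error = target - rim_lit_fraction(edge_brightness)
    return position + gain * error

# Usage: simulated edge brightnesses sampled from one video frame.
frame_edges = [0.9, 0.9, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
new_x = adjust_light_position(0.0, frame_edges)  # small corrective step
```

Because a one-foot move changes the look dramatically, a real controller would need a small gain and a high update rate; this sketch only shows the shape of the correction step.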