A 30-second video of a newborn baby shows the infant silently snoozing in his crib, his breathing barely perceptible. But when the video is run through an algorithm that can amplify both movement and color, the baby's face pulses crimson with each tiny heartbeat.
The amplification process is called Eulerian Video Magnification, and is the brainchild of a team of scientists at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory.
The team originally developed the program to monitor neonatal babies without making physical contact. But they quickly learned that the algorithm can be applied to other videos to reveal changes imperceptible to the naked eye. Prof. William T. Freeman, a leader on the team, imagines its use in search and rescue, so that rescuers could tell from a distance if someone trapped on a ledge, say, is still breathing.
“Once we amplify these small motions, there’s like a whole new world you can look at,” he said.
The system works by homing in on specific pixels in a video over the course of time. Frame by frame, the program identifies minute changes in color and then amplifies them up to 100 times, turning, say, a subtle shift toward pink into a bright crimson. The scientists who developed it believe it could also have applications in industries like manufacturing and oil exploration. For example, a factory technician could film a machine to check for small movements in bolts that might indicate an impending breakdown. In one video presented by the scientists, a stationary crane sits on a construction site, so still it could be a photograph. But once run through the program, the crane appears to sway precariously in the wind, perhaps tipping workers off to a potential hazard.
It is important to note that the crane does not actually move as much as the video seems to show; it is the motion amplification that makes the swaying visible.
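To make the idea concrete, here is a minimal sketch of the color-amplification step described above: track each pixel's value over time, band-pass filter that time series around the frequency of interest (say, a heartbeat near 1 Hz), amplify the filtered signal, and add it back to the original frames. This is a simplified, hypothetical illustration in Python/NumPy, not the researchers' released MATLAB code — the real method also processes a spatial pyramid and uses more careful temporal filters.

```python
import numpy as np

def magnify_color(frames, alpha=50.0, low=0.8, high=1.2, fps=30.0):
    """Amplify subtle per-pixel color changes (Eulerian-style sketch).

    frames: float array of shape (T, H, W) or (T, H, W, C).
    alpha:  amplification factor (the paper reports values up to ~100).
    low, high: temporal band to amplify, in Hz (here, around a ~1 Hz
               heart rate).
    """
    frames = np.asarray(frames, dtype=np.float64)
    T = frames.shape[0]
    # FFT over the time axis: each pixel becomes a temporal spectrum.
    spectrum = np.fft.fft(frames, axis=0)
    freqs = np.fft.fftfreq(T, d=1.0 / fps)
    # Ideal band-pass: zero out everything outside [low, high] Hz.
    keep = (np.abs(freqs) >= low) & (np.abs(freqs) <= high)
    spectrum[~keep] = 0
    band = np.real(np.fft.ifft(spectrum, axis=0))
    # Amplify the filtered variation and add it back to the video.
    return frames + alpha * band

# Synthetic example: a gray 4x4 "video" with an invisible 1 Hz flicker
# of amplitude 0.001; after amplification the flicker is ~50x larger.
fps, T = 30.0, 90
t = np.arange(T) / fps
flicker = 0.001 * np.sin(2 * np.pi * 1.0 * t)
frames = 0.5 + flicker[:, None, None] * np.ones((T, 4, 4))
out = magnify_color(frames, alpha=50.0, low=0.8, high=1.2, fps=fps)
```

With these parameters the 1 Hz component falls squarely inside the pass band, so the output's brightness swing is roughly fifty times the input's — the same trick that turns an imperceptible flush into a visible crimson pulse.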
I was going to write this off as old news, but the NYTimes added some interesting updates to the story, namely the videoscope online application. It also showed me what I missed the first time round, which is that they actually included their full MATLAB code! I love it when researchers include their generating code; mainly because I hate spending hours trying to reproduce a paper only to find I mis-implemented one crucial part of the algorithm.