Led by a Yale University undergraduate, researchers have used fMRI scans to accurately reconstruct images of faces as seen by the people being scanned. fMRI technology had previously allowed researchers to decipher the broad category of what a viewer was looking at, such as scenery versus an animal. But deciphering subtle differences between faces demonstrates a new level of mastery, since faces resemble one another far more closely than, say, ponies and beach scenes do. We also recruit large areas of our brains to perceive these subtleties, which meant much larger regions of the brain had to be carefully monitored and far more brain activity decoded. From YaleNews:
Working with funding from the Yale Provost’s office, Cowen and postdoctoral researcher Brice Kuhl, now an assistant professor at New York University, showed six subjects 300 different “training” faces while the subjects underwent fMRI scans. They used the data to create a sort of statistical library of how those brains responded to individual faces. They then showed the six subjects new sets of faces while they were undergoing scans. Taking that fMRI data alone, researchers used their statistical library to reconstruct the faces their subjects were viewing.
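The two-stage idea described above — learn how a brain responds to known faces, then invert that mapping for new scans — can be sketched in miniature. This is not the researchers' actual pipeline; it is a minimal illustration with synthetic data, assuming faces are summarized as coefficient vectors (e.g., eigenface-style components) and voxel responses are roughly a noisy linear function of those coefficients. All dimensions and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 300 training faces, 50 face-feature
# coefficients per face, 500 fMRI voxels per scan.
n_train, n_coef, n_voxels = 300, 50, 500

# Synthetic stand-in data: each face is a coefficient vector, and the
# simulated voxel response is a noisy linear mixture of those coefficients.
faces_train = rng.normal(size=(n_train, n_coef))
mixing = rng.normal(size=(n_coef, n_voxels))
scans_train = faces_train @ mixing + 0.1 * rng.normal(size=(n_train, n_voxels))

# "Statistical library" step: fit a least-squares map from a voxel
# pattern back to face coefficients, using the training faces.
decoder, *_ = np.linalg.lstsq(scans_train, faces_train, rcond=None)

# A new, unseen face is viewed during a scan; reconstruct it from
# the fMRI data alone by applying the learned decoder.
face_new = rng.normal(size=(1, n_coef))
scan_new = face_new @ mixing + 0.1 * rng.normal(size=(1, n_voxels))
face_reconstructed = scan_new @ decoder

# In this toy setup the reconstruction correlates strongly with the
# true coefficients; real fMRI decoding is far noisier.
r = np.corrcoef(face_new.ravel(), face_reconstructed.ravel())[0, 1]
print(f"correlation: {r:.2f}")
```

The key design point is that the "library" is just a learned inverse mapping: once it is estimated from training pairs of (face, scan), any new scan can be pushed through it without ever seeing the new face.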
Cowen said the accuracy of these facial reconstructions will improve with time, and he envisions their use as a research tool, for instance in studying how autistic children respond to faces.