Two MIT Media Lab grad students, Travis Rich and Kevin Hu, started GIFGIF, an interactive site that aims to measure and understand the potential of GIFs as a web language. Click through to participate! (via The Atlantic)
“We were talking about GIFs one day,” Hu told Quartz, “and we realized that they’re becoming more and more serious of a medium. They’re more popular, they’re used for more things.” Buzzfeed, for example, recently used GIFs to explain what was going on in Ukraine—reaching an audience that otherwise might have ignored the news. “And we realized,” Hu said, “that we could quantify this usage.”
The site, where visitors pick which of two GIFs better matches a particular emotion, is powered by the platform behind another MIT Media Lab project, Place Pulse, which used the same pairwise A/B voting system to assign emotions to pictures of different cities, allowing researchers to quantify, for example, how “sad” or “unsafe” people felt when looking at pictures of Rio de Janeiro.
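Pairwise A/B votes like these can be reduced to a per-item score in many ways. A minimal sketch using plain win rates (the function name and the win-rate choice are illustrative assumptions; the article doesn't describe the actual rating model the researchers use, which may well be more sophisticated):

```python
from collections import defaultdict

def score_gifs(votes):
    """Aggregate pairwise A/B votes into a per-GIF score for one emotion.

    `votes` is a list of (winner, loser) pairs: each vote says the winner
    GIF expressed the emotion better than the loser. The score here is the
    simple win rate, wins / total appearances. (Hypothetical aggregation,
    for illustration only.)
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {gif: wins[gif] / appearances[gif] for gif in appearances}

# Three GIFs, four votes on (say) "happiness":
votes = [("a", "b"), ("a", "c"), ("b", "c"), ("a", "b")]
print(score_gifs(votes))  # "a" wins every matchup, "c" loses every one
```

A win rate is the simplest possible aggregation; systems built on this kind of crowdsourced comparison often prefer rating models (Elo-style or Bayesian) that account for which opponents an item happened to face.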
But Rich and Hu, who worked on separate teams but sat near each other (and the Place Pulse group) in the lab, decided to harness the system for their own purposes, to create a visual database of emotion. “It’s the same idea,” Rich said. “Taking something that’s very easy for humans to read—emotion—and translating it for computers.” While humans have no trouble deciphering what a GIF “means,” the same task is impossible for a computer.
Since launching on March 3, the site has drawn an average of 15,000 users a day who vote around 10 times per visit. “The average time is increasing already,” Hu said, “so we’re pretty optimistic for the future.” Their first goal is to build a text-to-GIF translator. “I want people to be able to put in a Shakespearean sonnet and get out a GIF set,” Hu said. But once they’ve gotten quantitative metrics for a large number of GIFs, they think the possibilities are pretty endless. “You could reverse-engineer it and use a GIF to find a movie that fits a certain mood,” Rich said.