These MIT Researchers Want to Turn GIFs Into a Language
Two MIT Media Lab grad students, Travis Rich and Kevin Hu, started GIFGIF, an interactive site that aims to measure and understand the potential of GIFs as a web language. Click through to participate! Via The Atlantic.
“We were talking about GIFs one day,” Hu told Quartz, “and we realized that they’re becoming more and more serious of a medium. They’re more popular, they’re used for more things.” BuzzFeed, for example, recently used GIFs to explain what was going on in Ukraine—reaching an audience that otherwise might have ignored the news. “And we realized,” Hu said, “that we could quantify this usage.”
The site, where visitors pick which of two GIFs relates better to a particular emotion, is powered by another MIT Media Lab project’s platform. Place Pulse used the multiple-choice A/B voting system to assign emotions to pictures of different cities, allowing researchers to quantify, for example, how “sad” or “unsafe” people felt when looking at pictures of Rio de Janeiro.
But Rich and Hu, who worked on separate teams but sat near each other (and the Place Pulse group) in the lab, decided to harness the system for their own purposes, to create a visual database of emotion. “It’s the same idea,” Rich said. “Taking something that’s very easy for humans to read—emotion—and translating it for computers.” While humans have no trouble deciphering what a GIF “means,” the same task is impossible for a computer.
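To make the idea concrete, one common way to aggregate pairwise A/B votes like these into per-emotion scores is an Elo-style rating update. The sketch below is purely illustrative: the GIF IDs, the emotion label, and the K-factor are hypothetical, and this is not GIFGIF's actual scoring code.

```python
# Illustrative sketch: turning pairwise A/B votes into per-emotion
# scores with an Elo-style update. All names and constants here are
# assumptions, not GIFGIF's real implementation.
from collections import defaultdict

K = 32  # step size per vote (assumed value)

# ratings[emotion][gif_id] -> score; every GIF starts at 1000
ratings = defaultdict(lambda: defaultdict(lambda: 1000.0))

def record_vote(emotion, winner, loser):
    """Update two GIFs' scores for one emotion after a single A/B vote."""
    r_w = ratings[emotion][winner]
    r_l = ratings[emotion][loser]
    # Expected probability that the winner would be picked, given current scores
    expected_w = 1.0 / (1.0 + 10 ** ((r_l - r_w) / 400.0))
    ratings[emotion][winner] = r_w + K * (1.0 - expected_w)
    ratings[emotion][loser] = r_l - K * (1.0 - expected_w)

# Simulate a few votes for the emotion "happiness"
record_vote("happiness", "gif_A", "gif_B")
record_vote("happiness", "gif_A", "gif_C")
record_vote("happiness", "gif_B", "gif_C")

# Rank GIFs by how strongly voters associate them with the emotion
ranked = sorted(ratings["happiness"].items(), key=lambda kv: -kv[1])
print(ranked[0][0])  # gif_A, which won both of its votes
```

After enough votes, each emotion yields a ranked list of GIFs, which is the kind of machine-readable emotion database the text describes humans producing one comparison at a time.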
Since launching on March 3, the site has drawn an average of 15,000 users a day who vote around 10 times per visit. “The average time is increasing already,” Hu said, “so we’re pretty optimistic for the future.” Their first goal is to build a text-to-GIF translator. “I want people to be able to put in a Shakespearian sonnet and get out a GIF set,” Hu said. But once they’ve gotten quantitative metrics for a large number of GIFs, they think the possibilities are pretty endless. “You could reverse-engineer it and use a GIF to find a movie that fits a certain mood,” Rich said.