The sunny weather in California is ideal for training self-driving cars, but it does have its drawbacks. After all, if your autonomous vehicle has only ever driven in perfect visibility, what happens when it runs into a bit of rain or snow? Researchers at Nvidia might have a solution, publishing details this week of an AI framework that lets computers imagine what a sunny street looks like when it’s raining, snowing, or even pitch-black outside. That’s important information for self-driving cars, but the work could have many more applications besides.
The research is based on an AI method that’s particularly good at generating visual data: a generative adversarial network, or GAN. GANs work by combining two separate neural networks — one that generates the data, and another that judges it, rejecting samples that don’t look realistic. In this way, the AI teaches itself to produce better and better results over time. This sort of program is common in the industry, and has been used to create all sorts of imagery, from fake celebrity faces to new clothing designs to nightmarish cats.
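The generator-versus-judge loop above can be sketched in a few dozen lines. This is a deliberately tiny, illustrative toy — not Nvidia's model — where the "images" are just numbers drawn from a Gaussian, the generator is a single affine map, and the discriminator is logistic regression. The point is only to show the adversarial update: the discriminator learns to separate real from fake, and the generator learns to fool it.

```python
# Toy GAN sketch (illustrative only, not Nvidia's method).
# Generator maps noise -> samples; discriminator scores samples real/fake;
# each network trains against the other's current behaviour.
import numpy as np

rng = np.random.default_rng(0)

def sample_real(n):
    # "Real data": a 1-D Gaussian centred at 4.0.
    return rng.normal(4.0, 0.5, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator parameters: fake = g_w * z + g_b for noise z.
g_w, g_b = 1.0, 0.0
# Discriminator parameters: P(real) = sigmoid(d_w * x + d_b).
d_w, d_b = 0.1, 0.0

lr = 0.05
for step in range(2000):
    z = rng.normal(size=(32, 1))
    fake = g_w * z + g_b
    real = sample_real(32)

    # Discriminator update: push D(real) toward 1, D(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        grad = p - label              # d(cross-entropy)/d(logit)
        d_w -= lr * float(np.mean(grad * x))
        d_b -= lr * float(np.mean(grad))

    # Generator update: push D(fake) toward 1 (i.e. fool the judge).
    p = sigmoid(d_w * fake + d_b)
    grad_logit = p - 1.0
    # Chain rule back through the discriminator into the generator.
    g_w -= lr * float(np.mean(grad_logit * d_w * z))
    g_b -= lr * float(np.mean(grad_logit * d_w))

samples = g_w * rng.normal(size=(1000, 1)) + g_b
print(float(samples.mean()))  # should drift toward the real mean of 4.0
```

Real GANs replace the affine map and logistic regression with deep convolutional networks, but the adversarial training dynamic is the same: neither network is ever told the "right answer" directly — each only gets feedback from the other.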
Nvidia’s research, though, has one big advantage over existing GANs: it learns with much less supervision. Generally, programs of this sort need labelled datasets to generate data. As Nvidia researcher Ming-Yu Liu explained to The Verge, this means that if you’re making a GAN that turns a daytime scene into a nighttime one, you’d need to feed it pairs of images taken at the same location at night and during the day. It would then study the difference between the two to generate new examples.
But Nvidia’s new program doesn’t need this prep work — it trains without labelled datasets, yet produces results of similar quality. This could be a major advantage for AI researchers, as it frees up time they would otherwise have to spend sorting and aligning their training data.
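The difference in data preparation is easy to see in code. In this hedged sketch (all file paths are hypothetical), a supervised translator needs aligned day/night shots of the *same* scene, while an unsupervised approach like Nvidia's only needs two unrelated pools of images:

```python
# Hypothetical dataset layouts, contrasting paired (supervised) with
# unpaired (unsupervised) image-to-image translation data.
import random

# Paired: every daytime image has a nighttime shot of the same location.
paired = [
    ("scenes/001_day.png", "scenes/001_night.png"),
    ("scenes/002_day.png", "scenes/002_night.png"),
]

# Unpaired: two independent collections with no correspondence at all.
unpaired_day = ["day/beach.png", "day/highway.png", "day/market.png"]
unpaired_night = ["night/alley.png", "night/bridge.png"]

# Paired training consumes aligned tuples...
day_img, night_img = random.choice(paired)

# ...while unpaired training just samples independently from each pool.
day_sample = random.choice(unpaired_day)
night_sample = random.choice(unpaired_night)
print(day_sample, night_sample)
```

Collecting the paired version means photographing the same spots twice under controlled conditions; the unpaired version can be scraped from any two existing photo collections, which is where the time saving comes from.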
“We are among the first to tackle the problem,” Ming-Yu told The Verge. “[And] there are many applications. For example, it rarely rains in California, but we’d like our self-driving cars to operate properly when it rains. We can use our method to translate sunny California driving sequences to rainy ones to train our self-driving cars.”