Training StyleGAN with RunwayML #Training #GAN #MachineLearning #ArtificialIntelligence @RunwayML
RunwayML is a machine learning platform for creators. The platform offers an impressive number of pre-trained models in addition to some interactive demos. Several weeks ago they announced the beta release of training on the platform! Training is currently limited to beta testers, so if you would like to try it out and provide feedback, you can download RunwayML from their website and request training access through support. I went through that process this week and, once granted access, had everything up and running in a couple of minutes.
RunwayML currently trains models using transfer learning on StyleGAN. StyleGAN is a Generative Adversarial Network (GAN) capable of creating photorealistic images; the original model was trained on tens of thousands of face images from Flickr. RunwayML lets users upload their own datasets and retrain StyleGAN in the likeness of those images.
Transfer learning leverages the image features the model learned from its original training data to speed up learning the features of your image dataset. Not only does this greatly reduce training time, it also reduces the number of images you need to train your own model.
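To make the idea concrete, here is a minimal sketch of that transfer-learning pattern in PyTorch: start from a generator and discriminator that have already learned general image features, then fine-tune both on a small new dataset. The tiny networks, checkpoint paths, and dataset folder below are placeholders for illustration, not RunwayML's or NVIDIA's actual StyleGAN code.

```python
# Sketch only: stand-in networks and hypothetical file paths, not real StyleGAN.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

LATENT_DIM = 128
IMG_SIZE = 64  # StyleGAN itself generates up to 1024x1024; kept small here

# Stand-in generator/discriminator (StyleGAN's are far more elaborate).
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, 3 * IMG_SIZE * IMG_SIZE), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(3 * IMG_SIZE * IMG_SIZE, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

# Transfer learning: load weights pre-trained on a large face dataset instead
# of starting from random initialization (checkpoint paths are hypothetical).
generator.load_state_dict(torch.load("pretrained_generator.pt"))
discriminator.load_state_dict(torch.load("pretrained_discriminator.pt"))

# A small custom dataset is enough because most features are already learned.
transform = transforms.Compose([
    transforms.Resize((IMG_SIZE, IMG_SIZE)),
    transforms.ToTensor(),
    transforms.Normalize([0.5] * 3, [0.5] * 3),
])
dataset = datasets.ImageFolder("my_dataset/", transform=transform)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

criterion = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

# A short fine-tuning run instead of days of training from scratch.
for epoch in range(5):
    for real_imgs, _ in loader:
        real = real_imgs.view(real_imgs.size(0), -1)
        z = torch.randn(real.size(0), LATENT_DIM)
        fake = generator(z)

        # Update discriminator: real images -> 1, generated images -> 0.
        d_loss = (criterion(discriminator(real), torch.ones(real.size(0), 1)) +
                  criterion(discriminator(fake.detach()), torch.zeros(real.size(0), 1)))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Update generator: try to make the discriminator output 1 for fakes.
        g_loss = criterion(discriminator(fake), torch.ones(real.size(0), 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Because the pretrained weights already encode general image structure, a few epochs on a modest dataset can be enough, which is the same reason RunwayML's two-hour runs on small datasets produce usable results.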
If you want to get started quickly, you can also use one of RunwayML's image datasets (mountains, forests, graffiti, etc.). Selecting the smallest dataset (mountains) and training for two hours generated pretty passable landscape photos (below). If you'd like to learn more about training, check out the docs on the RunwayML website! If you're interested in GANs or StyleGAN on RunwayML, check out these videos.
Written by Rebecca Minich, Product Analyst, Data Science at Google. Opinions expressed are solely my own and do not express the views or opinions of my employer.