AWS DeepComposer brings AI to music composition #AI #ML #Music #Composing #AWS @AWSreInvent @AWS
This morning at re:Invent, AWS announced it will ship AWS DeepComposer. It lets you get hands-on, literally, with a musical keyboard and the latest machine learning techniques to compose your own music. Per their new website:
The world’s first machine learning-enabled musical keyboard for developers
Get started with the AWS DeepComposer keyboard to create a melody that will transform into a completely original song in seconds, all powered by AI. AWS DeepComposer includes tutorials, sample code, and training data that can be used to get started building generative models, all without having to write a single line of code.
Creative meets generative
Generative AI is one of the biggest recent advancements in artificial intelligence technology because of its ability to create something new. It opens the door to an entire world of possibilities for human and computer creativity, with practical applications emerging across industries, from turning sketches into images for accelerated product development, to improving computer-aided design of complex objects. Until now, developers interested in growing skills in this area haven’t had an easy way to get started. Developers, regardless of their background in ML or music, can get started with Generative Adversarial Networks (GANs). This Generative AI technique pits two different neural networks against each other to produce new and original digital works based on sample inputs. With AWS DeepComposer, you can train and optimize GAN models to create original music.
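For readers new to GANs, here is a rough, illustrative sketch of the idea in PyTorch: a generator maps random noise to short "melody-like" vectors while a discriminator tries to tell them apart from real samples. This is not AWS's DeepComposer model; the sequence length, network sizes, and stand-in training data below are assumptions just to show the two-network training loop.

```python
# Minimal GAN sketch (illustrative only; not the DeepComposer architecture).
import torch
import torch.nn as nn

SEQ_LEN = 16      # notes per "melody" (assumed for illustration)
NOISE_DIM = 32    # size of the generator's random input

# Generator: noise -> a SEQ_LEN vector of normalized pitch values in [-1, 1].
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 64), nn.ReLU(),
    nn.Linear(64, SEQ_LEN), nn.Tanh(),
)
# Discriminator: a SEQ_LEN vector -> probability that it came from real data.
discriminator = nn.Sequential(
    nn.Linear(SEQ_LEN, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

# Stand-in "real" melodies: random normalized sequences (replace with real data).
real_data = torch.rand(256, SEQ_LEN) * 2 - 1

for step in range(200):
    real = real_data[torch.randint(0, 256, (32,))]
    fake = generator(torch.randn(32, NOISE_DIM))

    # Train the discriminator: label real samples 1, generated samples 0.
    opt_d.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

    # Train the generator: try to make the discriminator call its fakes real.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()
```

The two losses pull in opposite directions, which is the "pitting two networks against each other" the AWS copy describes: as the discriminator gets better at spotting fakes, the generator is pushed to produce outputs that look more like the training melodies.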
Keyboard
The MIDI-compatible AWS DeepComposer keyboard lets you compose melodies as input for ML-generated compositions. Use the hardware buttons on the keyboard to control volume, playback, and the recording flow, and use the built-in functions to create more complex inputs. You can also export the MIDI files to your favorite Digital Audio Workstation (DAW) for more creativity.
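Since the keyboard speaks plain MIDI, the files it produces can be handled with any standard MIDI tooling. As a quick sketch, here's how you might write a short melody out as a .mid file with the mido library in Python; the notes are just a hypothetical C-major phrase, not DeepComposer output.

```python
# Write a four-note melody to a MIDI file with mido (pip install mido).
import mido

mid = mido.MidiFile()
track = mido.MidiTrack()
mid.tracks.append(track)

# MIDI note numbers 60, 62, 64, 65 = C4, D4, E4, F4.
for note in (60, 62, 64, 65):
    track.append(mido.Message('note_on', note=note, velocity=64, time=0))
    track.append(mido.Message('note_off', note=note, velocity=64, time=480))

mid.save('melody.mid')  # the resulting file can be imported into any DAW
```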