DigitalFruit – dTHEd

DigitalFruit is an interview series from Adafruit showcasing some of our favorite digital fine artists from around the world. As we begin this new decade with its rapidly changing landscape, we must envision our path through a different lens.  Over the next few weeks we’ll feature many innovative perspectives and techniques that will inspire our maker community to construct a bold creative frontier.  The only way is forward.

1. Where are you based?

All three of us are from Italy, but we don’t live in the same place. Fabio lives in Rome, while Simone and Isobel, who are also a couple in life, live in Castiglion Fiorentino (a small town in Tuscany).

2. Tell us about your background?

We have very different backgrounds! Fabio trained as a theoretical physicist and obtained his master's in human development and food security and nutrition. Simone is a composer; he obtained his diploma in Music Technology at the Luigi Cherubini Conservatory. Isobel has a degree in aesthetic philosophy and is a well-established multimedia and visual artist.

3. What inspires your work?

Perceptual mismatches, hyper-rhythms, auditory chimeras, hyperobjects, neurodiversity (and Artificial Intelligence seen as a very peculiar form of it), prosopagnosia, the Anthropocene. Ultimately, the idea that technology allows you to experience alternative sensory processing while extending possibilities into a post-human domain. Our first album, hyperbeatz vol. 1, is inspired by and crafted entirely around neurodiversity and Timothy Morton's book Hyperobjects. That's how we developed our concept of hyper-rhythms.

4. What are you currently working on?

We are currently finalizing our second full-length album. We are also working on new tempo-based experiments with ad-hoc digital machines, and we are about to start testing some of our ideas on rhythmic perception with humans using a specific standardized protocol.

We have a very interesting digital installation coming up, developed together with the artist Scual, to be experienced via browser, VR headset, or an AR mobile app.

Our ongoing social experiment is always running: we continuously interview people and publish the interviews on our website. The interviews are framed around two topics: hypermusic and neurodiversity. We've been doing these since our first album came out. Anyone interested in being interviewed (you too, Andrew!) can just drop us a message here. All the replies can be read on our website, as we publish them periodically (one or two per month). They work as a stand-alone project, but in the future they will also feed an AI on our website, which we are currently developing.

Finally, we are trying to issue a limited run of giclée and acrylic-glass prints of our digital artworks, though we are still not satisfied with the trials done so far. Once we find a manufacturer that meets our quality standards, we'll couple the prints with AR enhancements and test those, including interactions with our music.

5. Describe your process and what tools you like to use.

There are lots of different tools and processes we love. We don't have a predefined format or go-to software. We like to fiddle around a lot until we find something that we feel is taking us down an interesting path. This is one of the most creative parts of our work. We use digital tools as if we were in a lab. Some are more powerful than others, but if we find something interesting, we use it.

Musically, since the bulk of our research is on rhythmic perception, we often start from a hyper-rhythm. We define hyper-rhythms (following Timothy Morton's definition of hyperobjects: non-locality, phasing, etc.) as rhythms you cannot entrain to, but which your mind tricks you into thinking you can. Once we have those, we start generating sounds through basically anything: hardware synthesizers, iOS apps, AIs, algorithms, visual programming environments, sampled acoustic instruments, field recordings, and so on, working to establish a dialogue across the elements. Some of the most interesting material emerges when the rules of that dialogue come from scientific concepts, with specific values, algorithms, and functions applied to a timeline, to the harmonic content of a synth, or to filter parameters, transforming complex functions into musical interactions interlinked in a way that sounds familiar.
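The idea of a rhythm that defeats entrainment can be sketched in a few lines of Python. This is our own toy illustration under an assumed simplification (two steady pulse streams, an irrational tempo ratio), not dTHEd's actual technique; the function name `phase_drift` is invented for the example.

```python
# Toy illustration (an assumption, not dTHEd's method): two pulse streams
# whose tempo ratio is irrational never realign, so a listener can never
# lock onto a shared downbeat -- one possible flavor of "hyper-rhythm".

def phase_drift(bpm_a, bpm_b, beats):
    """Phase of pulse B (as a fraction of A's period) at each beat of A."""
    ratio = bpm_b / bpm_a
    return [(i * ratio) % 1.0 for i in range(beats)]

golden = (1 + 5 ** 0.5) / 2  # golden ratio: an irrational tempo ratio
drift = phase_drift(120.0, 120.0 * golden, 8)
# With a rational ratio (e.g. 3:2) the offsets repeat after a few beats;
# with an irrational one they never recur, defeating entrainment.
```

With a 3:2 ratio the same call would cycle through only two distinct offsets, which is why ordinary polyrhythms still feel "lockable" while a phasing hyper-rhythm does not.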

We tend to split our roles: Fabio → drums (starting from drum libraries, sampled sounds, or even custom AI training, then going into a wide variety of software like Max/MSP and Max for Live, Borderlands Granular, Rhythmiq, Regroover, Substantia, Fantastic Voyage, GRM Tools, and LoopField, or pedals like the EHX Stereo Memory Man with Hazarai and Hologram's Microcosm and Infinite Jets), Simone → synths and sound design (analogue and digital synths, environments such as VCV Rack, Pure Data, and Gleetchlab, plus samplers and filters such as the K-Devices and Melda plugins), Isobel → voice (with and without effects), but our roles frequently overlap. We also use a range of controllers (Artiphon, motion-sensor controllers, etc.) and are looking to integrate more in the future to widen the scope of our live performance.

Visually, we really like to work on still images, often starting from digital photos (human faces included; see our interview series), which are reprocessed endlessly via a chain of effects in iOS, custom, or web apps, ending up in Adobe Photoshop and Illustrator. We often integrate these with raw generative and GAN outputs to create hybrids with that post-human feeling.

Our approach to video experiments is very similar: starting from sampling/filming and reprocessing the content countless times, merging it with GAN and generative material. We also fiddle a bit with Jitter, Blender, and Cinema 4D. For wall mappings we team up with DIES_, working mainly in Cinema 4D and TouchDesigner, while our responsive real-time installations (AR/VR/web) are developed in cooperation with Scual using a wide range of technologies and languages depending on the needs and platform: WebGL, JavaScript, TypeScript, traditional computer vision, AI (TensorFlow), OpenGL, Unity, C++, Arduino, Raspberry Pi.

We’ve also just started working with TidalCycles and Hydra, so let’s see how we integrate those into our lab…

6. What does your workplace or studio look like? Do you work in silence or listen to music while you work?

Fabio – my workplace is still the writing desk my parents bought me when I was 10 years old. All my devices are there: mainly iPads, guitar pedals, controllers, my mobile phone, and the laptop. It’s usually quite messy, as I constantly switch from one tool to another. Of course mobile devices can be brought anywhere, so I often develop ideas on my phone or iPad when I’m at the park with my son; that’s actually very frequent these days. But when I have to put on the finishing touches, I like to go back to the desk. When I work on dTHEd’s projects, dTHEd’s music is inevitably running through my ears, so do I listen to music? Yes and no: yes because there is music, but no because I’m not really listening to other things. And when I press pause or switch to other work (e.g. visual arts), I don’t put something else on; I really enjoy silence these days.

Simone – since I have a laptop, my workplace is everywhere I need to be. If I need to record physical instruments, vocals, or analog synthesizers, I usually do it in my home studio. Sometimes I’m in the garden working on modular patches. I don’t listen to music while working; I get easily distracted, and music captures my attention completely.

Isobel – my music and visual studios are actually separate, like two sides of the same coin. The first is quite dark, and even though I principally sing, I also switch between different instruments, from vintage synths (Korg MS20, Yamaha SY99) to banjo, hank drum, and classical guitar, so I can easily follow the mood of the day. The second is bathed in light; my desk is surrounded by three windows with a view of a valley, and that really helps to empty my mind and create. I usually listen to music while working on visual art or during my daily dance practice.

7. How has technology shaped your creative vision?

It’s fair to say that computers and technology have changed our creative vision completely, expanding our possibilities beyond imagination. However, we still like to walk a thin path where human and post-human can be confused. See, for example, GAN outputs that resemble photos of real human beings who don’t exist, as opposed to other, more evidently non-human realms, where the blatant distinction actually serves as a comfort zone. It’s a bit like the difference between a robot and a cyborg. When you think about something like prosopagnosia, it’s so unsettling, yet definitely human. Technology can now help us bring art into these unsettling domains, through different paths, discovering uncharted territories.

8. Any tips for someone interested in getting started in the digital art form?

Take any digital tool and play around with it: abuse it. Digital tools tend to have their own rigid schemes and set-ups, limiting your creativity to a defined playground with imposed rules. Workflows are often dictated by how GUIs are set up, which reflects specific choices made by programmers. Try to run the Turing test on any machine. If you manage to get confused, if you are able to trick yourself into believing that what you’re using might not be a machine after all, then you could be on a personal path.

9. Where do you see generative and digital art heading in the future?

More overlaps between art and science, more bio-engineering, more augmented reality in our everyday life, more AIs running the show. The challenge for humans will be to justify their relevance in art; they will have to fight for their space. That’s why our advice is: don’t end up playing the game machines make you play; rather, have the machines play your game.



dTHEd external links: Bandcamp | Instagram | Facebook


DigitalFruit is curated by Adafruit lead photographer Andrew Tingle.

