AI’ve just started to learn to play

I built a robot melodica which listens to fragments taken from YouTube melodica tutorials and tries to reproduce the melody in real time. The robot was active at Incó_ntemporanea Festival in Piacenza, Palazzo Ghizzoni Nasalli, for 24 hours (October 11th, 2020).

Today’s self-taught musician has at their disposal a potentially endless amount of educational material, first and foremost YouTube video tutorials.

“AI’ve just started to learn to play” is an automated, impatient apprentice musician that learns to play at the very moment it is being taught. It constantly jams with real musicians. Within this cyberspace, besides observing this incessant learning process and its interaction with the musicians of Collettivo_21, it is possible to virtually compare the process with the videos from which the musician is learning.

(Luca Guidarini)

Four musicians from the local Collettivo_21 were present for the whole 24 hours, jamming with and reacting to the robot. A few audience members were admitted at specific times of the day, while the whole event was streamed live on the festival’s website and on Twitch.

Remote users were able to start the robot (triggering a random YouTube tutorial from the library) and vote for which audio to listen to: the original tutorial sound, the live performance of the robot, or both. The voting system and the web interface were programmed by Erik Natanael Gustafsson.

Technically, the robot consists of a DC motor salvaged from an old printer (keeping the original scaffolding structure as well), mounted over a melodica and controlled via Arduino. The Arduino also drives a bistable electro-valve which, when open, lets through the air flow coming from a compressor.

The Arduino sketch makes it possible for the robot to receive a control value from the computer. The control value encodes a combination of control states: air valve open or closed, and DC motor still, moving upwards, or moving downwards. Two end-stop switches return the DC motor to a center position when it reaches either edge of the keyboard.
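One way such a combined control value could be encoded is as a single byte, with one bit for the valve and two bits for the motor state. This is a minimal sketch for illustration only; the actual protocol between the computer and the Arduino sketch is not documented here:

```cpp
#include <cassert>

// Hypothetical control states, assumed for this example.
enum MotorState { STILL = 0, UP = 1, DOWN = 2 };

// Pack the two control states into one byte:
// bit 0 = air valve (1 = open), bits 1-2 = motor state.
unsigned char packControl(bool valveOpen, MotorState motor) {
    return (valveOpen ? 1u : 0u) | (static_cast<unsigned>(motor) << 1);
}

// Unpack a received control byte back into its two states.
void unpackControl(unsigned char c, bool& valveOpen, MotorState& motor) {
    valveOpen = (c & 1u) != 0;
    motor = static_cast<MotorState>((c >> 1) & 3u);
}
```

On the Arduino side, the sketch would read one such byte over serial and switch the valve pin and motor driver accordingly.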

Schematic of the electronic circuit with the Arduino Nano microcontroller and the two motor modules.

The listening software is written in Max and uses the sigmund~ object. Analyzing a library of video tutorials yields the melodic contour (which controls the movement of the DC motor) and the timing of note onsets and offsets (which controls the electro-valve).

Wanna build it yourself? Have a look at my tutorial on Hackster.io.

Realized as part of “Ars Cyber != Dystopian”, an event in collaboration with Claudio Panariello and Luca Guidarini. With the support of Konstnärsnämnden, Ernst von Siemens Foundation, Camera di Commercio Piacenza, Comune di Piacenza, Fondazione di Piacenza e Vigevano.
