Proposal for Algorave Performance
Name: Timo Hoogland
Title: ./drum.code
Affiliation: HKU University of the Arts Utrecht
Performance length: 20-30 min
Timo Hoogland is a computational artist, live coder, music technologist and educator from Apeldoorn, the Netherlands. He livecodes experimental electronic dance music and develops generative audiovisual compositions, installations and performances. Timo graduated from the Master of Music Design at the HKU University of the Arts Utrecht, where he developed the livecoding environment Mercury to research and develop algorithmic composition techniques and generative visuals in livecoded audiovisual performances, with a focus on accessibility. He has an active role in organizing livecoding meetups and Algoraves together with Creative Coding Utrecht and the Netherlands Coding Live community, and has performed at various events and festivals such as GitHub Satellite, Network Music Festival, ICLC, ADE, Gogbot, Tec-Art, Droidcon and React. As an educator Timo teaches creative coding for audio and visuals in the Bachelor of Music and Technology at the HKU University of the Arts Utrecht. He has also worked on various audiovisual projects, ranging from live stop-motion animated, audio-reactive visuals in the LoudMatter project to generative visuals for Biophonica, a live electronic piece about mass extinction.
For this performance I combine live acoustic drums with live coding of electronic music. During the performance the computer acts as a co-performer/semi-autonomous agent, making changes in the code while listening and reacting to patterns that I play on the drums. A multilayer perceptron is trained to classify various patterns that I composed beforehand, and the system changes the code based on the perceived rhythm. The patterns are recognized via contact microphones attached to the snare drum and kick drum. A Max patch detects the transients of the drums and translates them into a binary representation of the rhythm, for example 0 1 1 0 1 1 1 1.
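As a rough illustration of that translation step (the actual implementation is a Max patch; the function name, tempo and grid size below are assumptions for the example, not the patch itself), the idea is to snap detected transients to a 16th-note grid within one bar:

# Rough Python sketch of the transient-to-binary translation done in Max:
# quantize detected drum onsets to a 16th-note grid in one 4/4 bar.

def onsets_to_pattern(onset_times, bpm=120, steps=16):
    """Map transient onset times (seconds, relative to the bar start)
    to a binary pattern of `steps` 16th notes."""
    step_dur = (60.0 / bpm) / 4          # duration of one 16th note
    pattern = [0] * steps
    for t in onset_times:
        idx = int(round(t / step_dur))   # snap the onset to the nearest grid step
        if 0 <= idx < steps:
            pattern[idx] = 1
    return pattern

# e.g. hits at these times in a 120 BPM bar
print(onsets_to_pattern([0.125, 0.25, 0.5, 0.625, 0.75, 0.875]))
# -> [0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0]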
I connected specific patterns to decisions such as removing, adding and replacing code. When a pattern I play does not exactly match a pre-trained rhythm, the classifier matches it to the closest rhythm in the training data. This results in a jam together with the computer, where I know that certain patterns trigger a specific action while others might surprise me, allowing me to improvise with that. The decisions that lead to surprises are based on a Markov chain driven by the elapsed time of the performance: where at the beginning of a piece the computer might decide to add more instruments or swap sounds, near the end it may decide to remove sounds. Furthermore, the code generated by the computer is assembled from short pre-coded snippets for different instruments.
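A minimal sketch of how such a time-driven decision process could work, assuming invented placeholder snippets and transition weights (the real system lives in Max/MSP and Mercury):

import random

# Placeholder strings standing in for the pre-coded Mercury instrument snippets
SNIPPETS = {
    'kick': '<kick snippet>',
    'hat':  '<hihat snippet>',
    'bass': '<bass synth snippet>',
}

ACTIONS = ['add', 'swap', 'remove']

def transition_weights(prev_action, progress):
    """Weights for the next action given the previous one and the elapsed
    fraction of the performance (0.0 start .. 1.0 end): adding is favoured
    early, removing near the end, repeating the same action is damped."""
    weights = {'add': 1.0 - progress, 'swap': 0.5, 'remove': progress}
    weights[prev_action] *= 0.5
    return [weights[a] for a in ACTIONS]

def next_action(prev_action, progress):
    return random.choices(ACTIONS, weights=transition_weights(prev_action, progress))[0]

def apply(action, active_code):
    """Apply the chosen action to the currently running set of snippets."""
    if action == 'add':
        inactive = [n for n in SNIPPETS if n not in active_code]
        if inactive:
            name = random.choice(inactive)
            active_code[name] = SNIPPETS[name]
    elif action == 'swap' and active_code:
        name = random.choice(list(active_code))
        active_code[name] = SNIPPETS[name]   # re-insert (or vary) a snippet
    elif action == 'remove' and active_code:
        del active_code[random.choice(list(active_code))]
    return active_code

# Example run over four decision points spread across the performance
state, action = {}, 'add'
for step in range(4):
    progress = step / 4
    action = next_action(action, progress)
    state = apply(action, state)
    print(round(progress, 2), action, list(state))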
The code that is written, and that produces the sound, is in Mercury, an interpreted, human-readable language and environment that I designed to allow for quicker expression of algorithmic composition and synthesis, mainly in the genre of electronic dance music. For the pattern classification I currently use the Max/MSP programming environment combined with the ml.star package.
Currently the system has some limitations that I would like to overcome in the next iteration of the piece. One limitation is that I have to play along with a click track and that the patterns I play are limited to a 16th-note grid in a 4/4 measure. I want to explore possibilities to make the recognition of patterns freer in length, tempo and note interval. Another limitation is that code is currently generated as short, complete snippets. I want to let the computer generate further within a code snippet by adding or replacing smaller tokens, such as adjusting parameters or generating extra functions that slightly change parts of the code, as sketched below.
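One possible way to approach that token-level generation (a sketch only; the snippet string, function name and mutation range are invented for illustration) is to nudge numeric parameters inside an existing snippet instead of replacing the snippet as a whole:

import random
import re

def mutate_numbers(snippet, amount=0.25):
    """Randomly nudge each number in a code snippet by up to +/- `amount`
    of its value, as a crude form of token-level variation."""
    def nudge(match):
        value = float(match.group())
        new = value * (1 + random.uniform(-amount, amount))
        return f'{new:.2f}'
    return re.sub(r'\d+(\.\d+)?', nudge, snippet)

print(mutate_numbers('<synth snippet with shape(1 80) and gain(0.8)>'))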
./drum.code is a live coding performance for human and computer. During the performance the computer acts as a co-performer, playing itself by making changes in the code while listening and reacting to patterns played on the drums. On the other side, the human reacts to the sounds made by the computer, resulting in a dialogue between the analog and digital worlds of both performers. The code is generated and trained from previously programmed live coding performances, and decisions are made via machine learning techniques trained on rhythmic patterns and tuned probabilities.
Live at Overkill Festival 2021, Enschede, NL:
https://youtu.be/AoOR0NR0smk?t=113
https://www.timohoogland.com/drum-code/
In terms of genre/musical output this performance fits best in the Algorave concert, but in terms of augmented instruments and improvisation it might fit the concert programme as well. My preference would be an Algorave.
Macbook, adapter, keyboard, additional display
External audio-interface, in-ear monitoring
Drum sticks, drum triggers, cables
Drumset including:
Anti-slip drum rug (big enough for drumset + seat)
Kick 20/22” + pedal (double chain, felt/plastic beater)
Snare 13/14” + stand
Hihat 14” + stand, Crash 16-18” + boomstand
Stable Drum Throne, Extra empty boomstand
1 PA system: Mixer (FOH), Monitoring on stage (optional, I also bring in-ears)
1 Engineer
2 DI-Boxes, 6 Microphones (list below)
8 XLR cables of sufficient length (DI/Microphones to FOH)
2 Jack-Jack 1/4” balanced cables, ± 100cm
1 Projector, 1080p resolution, at least 5000 ANSI lumens, HDMI cable
1 Projection screen, width of at least 400cm
4 Power sockets on stage
*Please contact me in case anything is not possible or needs to be changed
Setup drumset and tuning: +/- 1 hour (can happen while others soundcheck)
Setup electronics and calibrating: +/- 30 minutes
Soundcheck drumset: +/- 15 minutes
Soundcheck electronics: +/- 15 minutes
Soundcheck/play through: +/- 30 minutes
Overall: +/- 2 to 2.5 hours
Please see the rider in the attachment for a stage plan and input list