Lil Data, the experimental pop artist from the PC Music collective, explores a new dimension of algorave live coding in this performance, titled Gagnavera (Icelandic for “Data Being”). Lil Data will explore the intersection of artificial life systems and live coding, taking the audience on a journey into life-imitating soundscapes. Using their signature blend of glitchy beats and avant-garde sound design, Lil Data will steer the live coding process through artificial life systems that evolve and adapt in real time. This hybrid live coding performance aims for a mesmerising and immersive experience that blurs the line between human creativity and self-organising machine intelligence.
This performance will make use of Tölvera and RAVE [1], and more broadly we situate it in the context of Agential Scores [2] and Hybrid Live Coding Interfaces (HLCI) [3]. We contextualise these below.
Tölvera is an open source Python library for designing musical instruments and musical notations with artificial life; a minimal usage sketch follows the etymology below. The name is an example of a kenning, a metaphorical compound expression found in Old Norse and Old English poetry. We invented this kenning by combining the Icelandic words for computer and being:
Tölva: computer, from tala (number) + völva (prophetess, or oracle)
Vera: being
Tölvera: number being
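To give a flavour of the library, a minimal sketch in the spirit of Tölvera's basic flocking example is shown below. The names used here (Tolvera, tv.render, tv.v.flock, tv.px.particles) are indicative of the API rather than a definitive reference, and the exact signatures may differ between versions.

```python
from tolvera import Tolvera, run

def main(**kwargs):
    # Initialise a Tölvera world of particles (indicative API, not a full reference)
    tv = Tolvera(**kwargs)

    @tv.render
    def _():
        tv.px.diffuse(0.99)                   # fade the previous frame's pixels
        tv.v.flock(tv.p)                      # apply flocking behaviour to the particles
        tv.px.particles(tv.p, tv.s.species)   # draw particles coloured by species
        return tv.px

if __name__ == '__main__':
    run(main)
```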
RAVE is a variational autoencoder developed at IRCAM for fast, high-quality neural audio synthesis, and is currently the most popular and accessible way to synthesise neural audio in real time. In our research group we have developed integrations of RAVE with SuperCollider and Tidal, and the author premiered a live coding performance using RAVE in 2022 (see links below). We have trained many custom RAVE models; in this performance the author will use models trained on pop and dance music genres, as well as on their own discography.
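For reference, an exported RAVE model is a TorchScript module exposing encode and decode methods, so it can be driven from Python roughly as sketched below; the filename gagnavera.ts and the latent perturbation are placeholders for illustration, not the actual models or mappings used in the performance.

```python
import torch

# Load an exported RAVE model (TorchScript); "gagnavera.ts" is a placeholder filename
model = torch.jit.load("gagnavera.ts").eval()

# One second of dummy audio: shape (batch, channels, samples) at the model's sample rate
audio = torch.zeros(1, 1, 44100)

with torch.no_grad():
    z = model.encode(audio)              # encode audio into the latent space
    z = z + 0.1 * torch.randn_like(z)    # perturb latents, e.g. under live-coded control
    out = model.decode(z)                # decode the latents back into audio
```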
Tölvera is part of Agential Scores, a broader project exploring self-organisation, emergence and entanglement in musical instruments by coupling their real-time parameters to artificial life (ALife) and other kinds of simulation. Much of intelligent systems research today focuses on machine learning techniques such as deep learning, but that does not make deep learning the only interesting computational material to work with when considering agency and intelligence in musical instruments. ALife and related fields have inspired generations of researchers to expand their horizons of what could be considered intelligent, and the same is still true today, with new systems and species being discovered all the time.
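To make the idea of entanglement concrete, the following is a generic illustration rather than our actual implementation: a toy swarm of agents drifts towards its own centroid, and simple statistics of its state are streamed as control parameters over OSC, for example to a synthesis engine listening on SuperCollider's default language port. The OSC addresses and the mapping itself are hypothetical.

```python
import time
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC target (SuperCollider's default language port) and addresses
client = SimpleUDPClient("127.0.0.1", 57120)

n = 64
pos = np.random.rand(n, 2)   # agent positions in the unit square
vel = np.zeros((n, 2))

for _ in range(1000):
    # Toy self-organising dynamics: noisy drift towards the swarm's centroid
    centroid = pos.mean(axis=0)
    vel = 0.9 * vel + 0.01 * (centroid - pos) + 0.005 * np.random.randn(n, 2)
    pos = np.clip(pos + vel, 0.0, 1.0)

    # Entangle simulation state with instrument parameters (hypothetical mapping)
    spread = float(pos.std())                            # how dispersed the swarm is
    energy = float(np.linalg.norm(vel, axis=1).mean())   # average agent speed
    client.send_message("/alife/spread", spread)
    client.send_message("/alife/energy", energy)
    time.sleep(1 / 30)                                   # ~30 updates per second
```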
The author co-founded the HLCI workshop and has pioneered this domain through instruments like the Stenophone [4] and NIME workshops [5], helping to create a vibrant community exploring diverse alternative approaches to and practices of live coding. This performance proposes to hybridise live coding with artificial life, towards Hybrid ALife Coding.
Lil Data has been performing at algoraves since 2014, and has also live coded outside the algorave scene at venues such as Berghain in Berlin, Create in Hollywood and Heaven in London. Through their performances and releases on the cult pop and dance label PC Music, Lil Data has inspired a new generation of pop live coders, including DJ_DAVE.
HDMI Projector
Standing height table/plinth
Space for cables etc.
Stereo DI out from audio interface
Laptop
Separate computer for graphics (TBC)
MOTU Audio Interface
Power cables etc.
Lil Data
Premiere of live coding RAVE neural audio synthesis models as part of Algorithmic Art Assembly / On-the-fly in 2022: https://www.youtube.com/watch?v=ii-dmCbHmos&t=1007s
https://open.spotify.com/artist/1MoWMC4GaNZklW2wmKENRl?si=a4agY5bJSR-qqWV7Ct_Wtg
Agential Scores / Tölvera
RAVE
I am a postdoctoral research fellow at the Intelligent Instruments Lab. I have a doctorate in Media and Arts Technology from Queen Mary University of London, where I studied in Prof. Andrew McPherson's Augmented Instruments Lab. During my PhD I was a Visiting Scholar at Georgia Tech under Prof. Jason Freeman. Before that, I was a Research Engineer at ROLI, after graduating with a BSc in Music, Multimedia & Electronics from the University of Leeds. My research interests include embodied interaction, craft practice and design cognition. I also produce, perform and live code music as Lil Data, as part of the PC Music record label.
The Intelligent Instruments project (INTENT) is funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No. 101001848).