

Published on Aug 31, 2023

Featuring live experimental A/V performances with an AI twist, an Algorave at Brighton’s stunning St. Mary’s Church will wrap up the conference.

Digital Selves

Live Coding with an Autonomous Agent in Tidal

This performance combines the improvisational skills of a live coder with an autonomous music creation system, developed and implemented in Haskell and the TidalCycles language. The system uses machine learning algorithms to generate its own unique musical phrases and patterns. The performance will showcase the collaborative possibilities of this technology, demonstrating how it can facilitate a real-time dialogue between human and machine intelligence. It will consist of an evolving musical improvisation that explores the interaction between the live coder and the autonomous system.
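The performance description does not disclose how the agent actually generates material; as a loose, hedged sketch of the general idea (a system that learns transition statistics from a human live coder's patterns and emits new ones), a first-order Markov chain over TidalCycles-style mini-notation events could look like the following. All function names and the training corpus here are invented for illustration and are not part of the actual system.

```python
import random

def train_transitions(phrases):
    """Count first-order transitions between events in example phrases."""
    transitions = {}
    for phrase in phrases:
        events = phrase.split()
        for a, b in zip(events, events[1:]):
            transitions.setdefault(a, []).append(b)
    return transitions

def generate_phrase(transitions, start, length=8, seed=None):
    """Walk the transition table to emit a new mini-notation phrase."""
    rng = random.Random(seed)
    events = [start]
    for _ in range(length - 1):
        choices = transitions.get(events[-1])
        if not choices:  # dead end: restart from the seed event
            choices = [start]
        events.append(rng.choice(choices))
    return " ".join(events)

# Hypothetical phrases a human live coder might have played in Tidal:
corpus = ["bd sn hh hh bd sn hh cp", "bd hh sn hh bd cp sn hh"]
table = train_transitions(corpus)
phrase = generate_phrase(table, "bd", length=8, seed=1)
print(phrase)  # a new 8-event pattern drawn from the learned transitions
```

A phrase generated this way could then be spliced into a running Tidal pattern, leaving the human free to accept, mutate, or override the machine's suggestion.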


Digital Selves is an interdisciplinary artist and researcher whose interests include live computer music, musical pattern, epistemologies of artificial intelligence, techgnosis and human-machine collaboration. She is completing a PhD under the supervision of Gyorgy Fazekas and Geraint Wiggins as part of the Media and Arts Technology centre for doctoral training at Queen Mary University of London, and is also a lecturer at the Creative Computing Institute at University of the Arts London. She performs live computer music as digital selves around the UK and internationally.

Some recent works of note include commissions on Art and AI exploring ritual and collective intelligence exhibited at Transmediale Studios in Berlin; a recent feature in Fact Magazine’s Artist DIY series; musical releases on the Cherche Encore and in.unision labels and hosting feminist hackathons as part of Leeds International Festival.

Jack Armitage (Lil Data)


Lil Data, the experimental pop artist from the PC Music collective, explores a new dimension in the world of algorave live coding in this performance titled Gagnavera (Icelandic for “Data Being”). In this performance, Lil Data will explore the intersection of artificial life systems and live coding, taking the audience on a journey into life-imitating soundscapes. Using their signature blend of glitchy beats and avant-garde sound design, Lil Data will control the live coding processes with a unique twist, incorporating artificial life systems that will evolve and adapt in real-time. This hybrid live coding performance aims for a mesmerising and immersive experience that blurs the line between human creativity and self-organising machine intelligence.


I am a postdoctoral research fellow at the Intelligent Instruments Lab. I have a doctorate in Media and Arts Technologies from Queen Mary University of London, where I studied in Prof. Andrew McPherson's Augmented Instruments Lab. During my PhD I was a Visiting Scholar at Georgia Tech under Prof. Jason Freeman. Before then, I was a Research Engineer at ROLI after graduating with a BSc in Music, Multimedia & Electronics from the University of Leeds. My research interests include embodied interaction, craft practice and design cognition. I also produce, perform and live code music as Lil Data, as part of the PC Music record label.

Timo Hoogland


./drum.code is a live coding performance for human and computer. During the performance, the computer acts as a co-performer, playing itself by making changes to the code while listening and reacting to patterns played on the drums. In turn, the human reacts to the sounds made by the computer, resulting in a dialogue between the analog and digital worlds of the two performers. The code is generated and trained from previously programmed live coding performances, and decisions are made via machine learning techniques trained on rhythmic patterns and tuned probabilities.
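The piece's actual model is not published; as a hedged sketch of what "tuned probabilities" might mean in practice, a machine co-performer could pick its next code edit from a weighted table, shifting weight toward sparser responses when the drummer plays densely. Every name below (the candidate edits, the density heuristic, the weights) is invented for illustration.

```python
import random

# Hypothetical code edits the machine co-performer can make, with base weights.
EDITS = {
    "add_hat_layer": 0.4,
    "thin_out_kick": 0.2,
    "shift_accent": 0.3,
    "silence_bar": 0.1,
}

def choose_edit(onsets_per_bar, rng=random):
    """Pick an edit, reweighting toward sparser responses as the
    human drummer's onset density rises."""
    dense = min(onsets_per_bar / 16.0, 1.0)  # 0 = sparse, 1 = very busy
    weights = {
        "add_hat_layer": EDITS["add_hat_layer"] * (1.0 - dense),
        "thin_out_kick": EDITS["thin_out_kick"] * (1.0 + dense),
        "shift_accent": EDITS["shift_accent"],
        "silence_bar": EDITS["silence_bar"] * (1.0 + 2.0 * dense),
    }
    names = list(weights)
    return rng.choices(names, weights=[weights[n] for n in names])[0]

edit = choose_edit(onsets_per_bar=14)
print(edit)  # one of the four edit names, biased toward sparser choices
```

In a real setup the onset count would come from a live audio analysis of the drums, and the chosen edit would be applied to the running code.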


Timo Hoogland is a computational artist, live coder, music technologist and educator from Apeldoorn, the Netherlands. He livecodes experimental electronic dance music and develops generative audiovisual compositions, installations and performances. Timo graduated from the Master of Music Design at the HKU University of the Arts Utrecht, where he developed the livecoding environment Mercury to research and develop algorithmic composition techniques and generative visuals in livecoded audiovisual performances, with a focus on accessibility. He plays an active role in organizing livecoding meetups and Algoraves together with Creative Coding Utrecht and the Netherlands Coding Live community, and has performed at various events and festivals such as GitHub Satellite, Network Music Festival, ICLC, ADE, Gogbot, Tec-Art, Droidcon and React. As an educator, Timo teaches creative coding for audio and visuals in the Bachelor of Music and Technology at the University of the Arts Utrecht. He has also worked on various audiovisual projects, ranging from live stop-motion animated audio-reactive visuals in the LoudMatter project to generative visuals for Biophonica, a live electronic piece about mass extinction.

Celeste Betancur Gutierrez

Pandora’s Dream

Pandora hears her dreams; they talk to her in mysterious voices and unknown languages. You are there in the middle of the dark, but you don't know how you got there. Are you one of Pandora's dreams? Talk to her, maybe she will answer you...

Pandora’s Dream could be seen as a text editor for multiple languages (ChucK, GLSL, C++), but what makes it powerful in handling such a diverse array of languages is that it empowers artists to explore different creative avenues and leverage the unique features of each language while sharing variables/uniforms directly from memory (rather than through communication protocols such as MIDI or OSC), along with a set of AI/machine learning tools to help build versatile live sets.


A multi-instrumentalist musician with a professional degree in guitar from Berklee College of Music and a Master's in digital arts, she is currently a second-year PhD student at CCRMA, Stanford University.

She works on developing human-machine interfaces, especially designing programming tools for musical expression.

She has played live in more than 15 countries using tools of her own making, including CineVivo, a live cinema/VJ tool for playing visuals live; CHmUsiCK, a library that extends ChucK with more “algorave”-style material; and, most recently, Pandora’s Dream, a platform that integrates audiovisuals with AI/machine learning tools for live coding performances.

Tasos Asonitis


Cartographic is a live audiovisual piece built around a three-dimensional map created from sound. Leveraging machine learning techniques, an original sound source is broken into small segments, and these segments are scattered in 3D space according to their timbral qualities. The piece unfolds as we navigate through this space and explore its different neighborhoods and their sonic character. Fluctuating between the descriptive and the poetic, the journey takes us through various stages as we observe the system gradually falling apart, until what once stood as a map eventually dissolves into abstraction.
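The piece does not specify which features or projection it uses; as a hedged numpy-only sketch of the general technique (segment the audio, describe each segment by timbral features, project the features to 3D), one could compute a few standard spectral descriptors per segment and reduce them with PCA. The feature set, segment length, and test signal below are all illustrative assumptions.

```python
import numpy as np

def timbre_map_3d(signal, sr=44100, seg_len=2048):
    """Split a signal into segments, describe each by simple spectral
    features, and project the feature vectors to 3D with PCA (via SVD)."""
    n_seg = len(signal) // seg_len
    feats = []
    for i in range(n_seg):
        seg = signal[i * seg_len:(i + 1) * seg_len]
        mag = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(seg_len, 1.0 / sr)
        total = mag.sum() + 1e-12
        centroid = (freqs * mag).sum() / total          # brightness
        spread = np.sqrt((((freqs - centroid) ** 2) * mag).sum() / total)
        flatness = np.exp(np.log(mag + 1e-12).mean()) / (mag.mean() + 1e-12)
        rolloff = freqs[np.searchsorted(np.cumsum(mag), 0.85 * mag.sum())]
        feats.append([centroid, spread, flatness, rolloff])
    X = np.array(feats)
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # standardise
    # PCA via SVD: the top three principal axes give 3D coordinates.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:3].T

# A toy signal: segments alternating between sine tones and noise bursts.
rng = np.random.default_rng(0)
t = np.arange(2048) / 44100.0
sig = np.concatenate([rng.standard_normal(2048) if i % 2 else
                      np.sin(2 * np.pi * 440 * t) for i in range(16)])
coords = timbre_map_3d(sig)
print(coords.shape)  # (16, 3): one 3D point per segment
```

Timbrally similar segments (here, the noise bursts versus the sine tones) land near each other in the resulting space, which is what makes navigating such a map musically meaningful.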


Tasos Asonitis is an audiovisual artist whose activities encompass various facets of digital arts. His primary interest lies in virtual environments shaped by the creative entanglement of computer-generated music and 3D graphics. Asonitis' artistic output, however, is not limited to the audiovisual domain, and includes multi-channel fixed-media pieces, sonic installations and compositions for moving image. A recipient of an EPSRC doctoral scholarship, he is currently pursuing a PhD in Composition at the NOVARS Research Centre. His works have been exhibited at MANTIS Festival, the People’s History Museum (Manchester, UK), the Science and Industry Museum (Manchester, UK), Athens Digital Arts Festival and METS Fest (Cuneo, Italy), among other places.

Begüm Çelik


The heart of "Uncanny" lies in envisioning the latent space of sounds, generated by using GANSynth models, as a multi-dimensional universe open to exploration in a random fashion. The exploration of this latent space aligns with the aleatoric orientation of live coding, showcasing the flexibility and dynamic nature of the medium. Drawing inspiration from non-places and the concept of uncanny valleys, the audio-visual performance engenders a sense of wandering within an abstract, unfamiliar realm that both fascinates and intrigues.
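The performance does not reproduce GANSynth's generator here; as a hedged numpy sketch of the underlying idea of exploring a latent space "in a random fashion", one can sample random latent waypoints and spherically interpolate (slerp) between them, so that each decoded sound drifts gradually into the next. The latent dimensionality and step counts below are illustrative, not taken from the piece.

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical interpolation between two latent vectors."""
    z0n, z1n = z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(z0n, z1n), -1.0, 1.0))
    if omega < 1e-8:  # vectors nearly parallel: fall back to lerp
        return (1 - t) * z0 + t * z1
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

def latent_walk(dim=256, waypoints=4, steps=16, seed=0):
    """Sample random latent waypoints and slerp between them,
    yielding a smooth trajectory to feed into a generator."""
    rng = np.random.default_rng(seed)
    zs = rng.standard_normal((waypoints, dim))
    path = []
    for a, b in zip(zs, zs[1:]):
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            path.append(slerp(a, b, t))
    return np.array(path)

path = latent_walk()
print(path.shape)  # (48, 256): 3 waypoint pairs x 16 steps each
```

Slerp is commonly preferred over straight-line interpolation for Gaussian latent spaces because intermediate points stay at a typical distance from the origin, where the generator was trained.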


Multidisciplinary artist Begüm ÇELİK is pursuing her master’s degree in the Visual Arts & Visual Communication Design program under the supervision of Selçuk Artut at Sabancı University, where she completed her B.Sc. in Computer Science & Engineering in 2021. Her master’s thesis, titled “Conserving Multimedia Art from Artistic, Curatorial, and Historicist Perspectives: Case Study on Teoman Madra Archive”, focuses on both media art history in Turkey and archival strategies. Her artistic production draws on her interdisciplinary journey, combining technology and performance in line with her engagement with various theater practices. Çelik’s academic research focuses on the preservation of technological artworks, in continuation of her projects “Photometric Approach to Surface Reconstruction of Oil Paintings” and “Testing Method for Software-based Artworks”, both completed in collaboration with the Sakıp Sabancı Museum, Istanbul. Recently, she completed the conservation of Stephan von Huene’s artwork What’s Wrong with Art? at ZKM Karlsruhe under the supervision of Daniel Heiss and Morgane Stricot.
