
Live Coding with an Affective Autonomous Agent in TidalCycles: Performance

Submission for performance

Published on Aug 29, 2023

ABSTRACT

This submission proposes a musical performance that combines the improvisational skills of a live coder with an autonomous music creation system, developed and implemented using Haskell and the TidalCycles language. The system uses machine learning algorithms to generate its own unique musical phrases and patterns that respond to the live coding input. The performance will showcase the collaborative possibilities of this technology, demonstrating how it can facilitate a real-time dialogue between human and machine intelligence. The proposed performance will consist of an evolving musical improvisation that explores the interaction between the live coder and the autonomous system. A discussion of the technical aspects of the system, including the algorithms used, the data sources, and the input methods, is also included. Overall, this performance presents a unique and innovative approach to musical performance, demonstrating new creative capabilities for live coders through machine collaboration.

TITLE OF SUBMITTED PERFORMANCE

Live Coding with an Affective Autonomous Agent in TidalCycles

BIOGRAPHY OF PERFORMER

Lizzie Wilson is an interdisciplinary artist and researcher whose interests include live computer music, musical pattern, epistemologies of artificial intelligence, techgnosis and human-machine co-creation. She is completing a PhD under the supervision of Gyorgy Fazekas and Geraint Wiggins as part of the Media and Arts Technology centre for doctoral training at Queen Mary University of London, and is also a lecturer at the Creative Computing Institute at University of the Arts London. She has also performed live computer music as digital selves around the UK and internationally.

Some recent works of note include commissions on Art and AI exploring ritual and collective intelligence, exhibited at Transmediale Studios in Berlin; a recent feature in Fact Magazine’s Artist DIY series; musical releases on the Cherche Encore and in.unision labels; and hosting feminist hackathons as part of Leeds International Festival.

DESCRIPTION OF WORK

Co-creation strategies for human-machine collaboration have been explored in various creative disciplines. Recent developments in music technology and artificial intelligence have made these creative interactions applicable to the domain of computer music, meaning it is now possible to interface with algorithms as creative partners.

This research is applied within the context of a specific field of algorithmic composition known as live coding. As music is inherently coupled with affective response, it is crucial for any artificial musical intelligence system to consider how to incorporate emotional meaning into collaborative musical actions (Wiggins, 2018).
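
To give a concrete flavour of such a coupling, the sketch below maps a simple valence/arousal affect state onto TidalCycles pattern parameters. This is a minimal illustration under assumed conventions, not the performance system itself: the Affect type, the affectPat helper and the particular parameter mappings are hypothetical.

{-# LANGUAGE OverloadedStrings #-}

import Sound.Tidal.Context

-- A toy affect state: valence and arousal, each in [0, 1]
type Affect = (Double, Double)

-- Hypothetical mapping from affect to pattern parameters: higher
-- arousal gives denser, louder events; higher valence opens the
-- low-pass filter for a brighter timbre.
affectPat :: Affect -> ControlPattern
affectPat (valence, arousal) =
  fast (pure (realToFrac (1 + 3 * arousal))) $
    s "superpiano" # note "0 4 7"
      # cutoff (pure (500 + 4000 * valence))
      # gain (pure (0.7 + 0.3 * arousal))

In a live TidalCycles session the result could be auditioned with, for example, d1 (affectPat (0.8, 0.5)).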

This work will look at giving machine musicians within interactive live coding systems the ability to create affective musical collaborations, and will examine new ways of interfacing with musical algorithms. Collaborating with the machine agent allows live coders to explore the “conceptual space” of possible patterns of code that they may not be able to conceptualise themselves (Boden, 2004).

The work develops an agent in the field of musical artificial intelligence in an attempt to categorise co-creative behaviours between a machine agent and a human agent improvising music. Three key components are proposed for successful collaboration with a machine musician: the ability of the machine musician to create music from a model of human affect, the role of machines in developing aesthetics (and whether these can be artificial or not), and the methodology by which evaluation is possible.

The performance combines the improvisational skills of a live coder with an autonomous system developed in Haskell and TidalCycles (Wilson et al., 2021). The system uses machine learning algorithms to generate unique musical phrases and patterns that respond to the live coding input.
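
As a highly simplified, hypothetical sketch of this interaction loop (the palette and respond names below are illustrative, not the system's actual API, and a uniform random choice stands in for the learned model), the agent can be viewed as a function that receives the live coder's current pattern and returns a transformed response:

import Sound.Tidal.Context
import System.Random (randomRIO)

-- A palette of candidate responses; a learned model would weight
-- these choices by musical context rather than picking uniformly.
palette :: [ControlPattern -> ControlPattern]
palette =
  [ rev             -- play the pattern backwards
  , fast 2          -- double its density
  , (# speed 2)     -- pitch the samples up an octave
  , degradeBy 0.3   -- randomly drop around 30% of events
  ]

-- Respond to the live coder's current pattern with a transformation
-- drawn at random from the palette.
respond :: ControlPattern -> IO ControlPattern
respond humanPat = do
  i <- randomRIO (0, length palette - 1)
  pure ((palette !! i) humanPat)

In a TidalCycles session the agent's voice could then run alongside the human's, e.g. d2 =<< respond (s "bd sn [bd bd] sn").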

The real-time dialogue between human and machine intelligence results in an evolving musical improvisation that showcases the collaborative possibilities of this technology. The performance offers a captivating experience for both the audience and the performers.

During the performance, the audience witnesses the live coder and the autonomous system engage in a real-time conversation, creating a dynamic soundscape that explores the interaction between human and artificial intelligence. The improvisation evolves as the live coder responds to the output of the autonomous system, creating a rich and complex musical performance.

LINKS TO THE DOCUMENTATION

Some previews of this performance are available here:

https://www.youtube.com/watch?v=2F1D8Harn

CATEGORISATION

This work would be most suitable for the Algorave on 1 September.

TECHNICAL RIDER

1. Technical Requirements from the Venue

1.1 Sound & Technical

• Audio mixer (8 or more channels)
• PA system:
  • 2 or more coaxial speakers, with a minimum frequency range of 50 Hz to 17 kHz
  • 1 or more subwoofers
  • Capable of stereo sound reproduction

1.2 Lighting and Visuals

• Projector and projection screen

1.3 Cables

• HDMI cable (>10m, or alternatively long enough to connect the HDMI output from the artist’s laptop to the projector)
• 3.5mm to phono cable if going direct to the mixer, or jack cables to the mixer through an audio interface

2. Artist Will Bring

• Laptop
• Scarlett 2i2 audio interface
• Any additional hardware synthesisers to be used (1x Korg Volca FM, 1x Moog Werkstatt)
• USB-C to HDMI and USB adapter
• Additional 3.5mm to phono cable
• Additional jack cables

3. Timings

• Time should be allocated for load-in/soundcheck.

REFERENCES

Margaret A. Boden. 2004. The Creative Mind: Myths and Mechanisms. Routledge.

Geraint A. Wiggins. 2018. To play with feeling? The opportunity of aesthetics in computational musical creativity. CEUR Workshop Proceedings.

Elizabeth Wilson, Shawn Lawson, Alex McLean, and Jeremy Stewart. 2021. Autonomous Creation of Musical Pattern from Types and Models in Live Coding. In 9th Conference on Computation, Communication, Aesthetics & X.
