
Agential Instruments Design Workshop

Full-day hands-on workshop exploring state-of-the-art AI/ML instrument design tools.

Published on Aug 29, 2023

Abstract

Physical and gestural musical instruments that take advantage of artificial intelligence and machine learning to explore instrumental agency are becoming more accessible due to the development of new tools and workflows specialised for mobility, portability, efficiency and low latency. This full-day, hands-on workshop will provide all of these tools to participants along with support from their creators, enabling rapid creative exploration of their applications in musical instrument design.

Description

The primary goal of this workshop is to serve as a public-facing, beginner-friendly, hands-on and engaging follow-up to the Embedded AI For NIME [1] workshop presented at NIME ‘22, where participants can arrive with no prior knowledge of instrument making and finish the workshop by performing on an “agential” instrument of their own making. The previous workshop challenged researchers to address the “technical challenges and higher-level design constraints” associated with making AI/ML techniques work for high-performance, real-time music interaction. In the intervening period, exciting new tools have started to emerge which are approaching readiness for exploration by non-experts and curious communities like AIMC. In turn, the researchers who are creating these tools need vital feedback to inspire further iteration, refinement, and dissemination to ultimately achieve an open and inclusive ecosystem. Thus, we propose this workshop as a critical step in putting the state-of-the-art into the public’s hands. This time, however, the scope will not be limited to just embedded systems, and will instead reflect the AIMC theme of interactive performance systems.

Itinerary

We expect to be able to support between 15 and 20 participants in this workshop. The full-day workshop will be split into two sessions: hacking and playing. The hacking session will be a hands-on tutorial and tour of various tools and approaches from the workshop organisers, and through this guided process the participants will start to develop their own simple instruments. The playing session will then focus on developing the musicality of these instruments, ending with demos, improvisations and a final discussion. The full itinerary will be as follows:

  • Welcome/Introduction

    • 10-15min intro talk from the organisers and group introductions

  • Session 1: Hacking

    • Short tutorials covering

      • Sensors, gestures and signal processing

      • Acoustics, sound, synthesis, feedback, actuation

      • Mapping, training, refining

    • Comfort break

    • Instrument hacking

  • Lunchbreak

  • Session 2: Playing

    • Instrument design

    • Comfort break

    • Demos and improvisations

    • Final discussion

Once the initial tutorials are completed, the participants will be able to “choose their own adventure” regarding which tools they decide to work with (similar to the IRCAM Neural Audio Synthesis workshop at NIME ‘22 [2]), and they are free to hop between topics throughout the day. Depending on how the featured tools develop over the summer, we may curate these into pre-defined “adventure pathways”, for example audio IO, corpus manipulation and gesture classification.

Overall, this aims to be a somewhat informal, participant- and creativity-led environment. Depending on the outcomes and other scheduling constraints, there is the possibility that the resulting instruments could be exhibited or performed with during AIMC. A large number of mentors with relevant expertise will be available to assist, and even potentially collaborate with, participants.

Showing and Telling and Demoing and Playing

Here we describe some of the tools that will be available for participants to play and experiment with, based on each author’s expertise. Bios of the tool authors can be found below.

Lewis and Rodrigo intend to showcase their neural audio synthesis plugin installed on a Bela Pepper. This plugin replicates a physically modelled drum membrane, using a neural network to infer the characteristics of an IIR filterbank. In its embedded state, the plugin can be interacted with via both CV and a remote GUI controlled from a second machine, with the synthesis engine running on the Bela itself.
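
To give a flavour of the approach, the sketch below shows a tiny network predicting the parameters of a resonant IIR filterbank from a control vector, then filtering an impulse to produce a drum-like hit. The architecture, parameter ranges and control input are illustrative assumptions, not details of the authors' plugin.

```python
# Illustrative sketch only: a small MLP predicts per-band frequency, bandwidth
# and gain of an IIR filterbank; the bank is excited by an impulse.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import lfilter

SR = 44100
N_BANDS = 8

class FilterbankNet(nn.Module):
    """Maps a 2-D control input (e.g. a strike position) to filterbank parameters."""
    def __init__(self, n_bands=N_BANDS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, 64), nn.Tanh(),
            nn.Linear(64, 3 * n_bands), nn.Sigmoid(),  # all outputs in [0, 1]
        )

    def forward(self, x):
        p = self.net(x).view(-1, 3, N_BANDS)
        freqs = 40.0 + p[:, 0] * 4000.0   # centre frequencies in Hz
        bws = 5.0 + p[:, 1] * 200.0       # bandwidths in Hz
        gains = p[:, 2]
        return freqs, bws, gains

def render_hit(freqs, bws, gains, dur=0.5):
    """Excite each predicted two-pole resonator with an impulse and sum the bands."""
    n = int(SR * dur)
    impulse = np.zeros(n)
    impulse[0] = 1.0
    out = np.zeros(n)
    for f, bw, g in zip(freqs, bws, gains):
        r = np.exp(-np.pi * bw / SR)                  # pole radius from bandwidth
        theta = 2 * np.pi * f / SR                    # pole angle from frequency
        b = [1.0 - r]
        a = [1.0, -2 * r * np.cos(theta), r * r]
        out += g * lfilter(b, a, impulse)
    return out

model = FilterbankNet()
with torch.no_grad():
    f, bw, g = model(torch.tensor([[0.3, 0.7]]))      # hypothetical strike position
audio = render_hit(f[0].numpy(), bw[0].numpy(), g[0].numpy())
```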

Sophie will provide smart textile interfaces and give an overview of textiles as an interactive material. A selection of textile sensors will be available, demoing the range of sensing capabilities of this material. These swatches can also be combined with the other tools at the workshop, inviting participants to explore the idea of soft, flexible textile instruments.
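
For participants combining a swatch with the other tools, a minimal sketch like the one below can read and smooth a sensor stream. It assumes the sensor is streamed as one integer per line over serial from a microcontroller; the port name, baud rate and value range are placeholders, not details of Sophie's interfaces.

```python
# Minimal sketch: read a (hypothetical) textile pressure sensor over serial
# and apply a one-pole low-pass filter before mapping it to sound.
import serial  # pyserial

PORT, BAUD = "/dev/ttyACM0", 115200   # placeholder port and baud rate
ALPHA = 0.1                            # smoothing factor: lower = smoother
smoothed = 0.0

with serial.Serial(PORT, BAUD, timeout=1) as conn:
    while True:
        line = conn.readline().strip()
        if not line.isdigit():
            continue
        raw = int(line)                        # e.g. 0-1023 from a 10-bit ADC
        norm = raw / 1023.0                    # normalise to 0-1
        smoothed += ALPHA * (norm - smoothed)  # exponential smoothing
        print(f"pressure: {smoothed:.3f}")     # map this onto any synth parameter
```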

Courtney can talk about XAI and human perception in musical interaction, agency and control, and how human expectations of digital systems play into collaboration and creativity with technology/AI.

Franco can show an example of synth interaction design where sensors (or textiles) are dynamically mapped to a sound generator using his design method. Participants can program and audition a synth patch that represents the timbre of their intelligent instrument. They can then devise a set of interactions they want to try, and train a neural network that learns the dynamical relationships imposed on the timbre and can be controlled continuously by the sensors.
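
A minimal mapping-by-demonstration sketch in this general spirit (but not Franco's actual method) is shown below: the performer records a few (sensor, synth-parameter) pairs and a small network interpolates between them for continuous control. The channel counts and example values are hypothetical.

```python
# Train a tiny regression network on a handful of demonstrated mappings.
import torch
import torch.nn as nn

# Hypothetical demonstrations: 3 sensor channels -> 4 synth parameters, all 0-1.
sensor_examples = torch.tensor([[0.0, 0.1, 0.9],
                                [0.5, 0.5, 0.5],
                                [1.0, 0.9, 0.1]])
param_examples = torch.tensor([[0.2, 0.8, 0.1, 0.0],
                               [0.5, 0.5, 0.5, 0.5],
                               [0.9, 0.1, 0.8, 1.0]])

mapper = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, 4), nn.Sigmoid())
opt = torch.optim.Adam(mapper.parameters(), lr=1e-2)

for _ in range(2000):  # fit the tiny dataset
    opt.zero_grad()
    loss = nn.functional.mse_loss(mapper(sensor_examples), param_examples)
    loss.backward()
    opt.step()

# At performance time, each new sensor frame is pushed through the mapper.
with torch.no_grad():
    live_frame = torch.tensor([[0.7, 0.6, 0.3]])
    print(mapper(live_frame))  # send these values to the synth
```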

Jack Armitage will present Tölvera, a Python library for creating instrumental behaviours using Artificial Life simulations [3]. The library comes with several ALife species which run on the GPU and can be commingled, and their states and characteristics can be mapped into instruments.
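
As a toy illustration of the general idea (agent simulation states mapped to sound), the sketch below uses plain NumPy and python-osc rather than the Tölvera API itself; the OSC addresses and synth parameters are assumptions for the example.

```python
# A simple flocking-style simulation whose summary statistics drive a synth over OSC.
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

N = 64
pos = np.random.rand(N, 2)                    # agent positions in the unit square
vel = (np.random.rand(N, 2) - 0.5) * 0.01
client = SimpleUDPClient("127.0.0.1", 9000)   # hypothetical synth address

for step in range(1000):
    centre = pos.mean(axis=0)
    vel += 0.001 * (centre - pos)                  # weak cohesion toward the flock centre
    vel += 0.0005 * (np.random.rand(N, 2) - 0.5)   # a little noise
    pos = (pos + vel) % 1.0                        # wrap around the edges

    spread = float(pos.std())                      # how dispersed the flock is
    speed = float(np.linalg.norm(vel, axis=1).mean())
    # Map flock statistics onto (hypothetical) synth parameters.
    client.send_message("/filter/cutoff", spread)
    client.send_message("/lfo/rate", speed)
```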

Victor Shepardson will present the Living Looper, a real-time neural synthesis system building on the RAVE autoencoder [4], and Notochord [5], a real-time generative MIDI performance model which can augment MIDI streams or convert control signals into MIDI.
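
For comparison during the workshop, a naive baseline for converting a control signal into MIDI might look like the threshold-crossing sketch below, written with mido; this is not how Notochord works internally, and the signal, pitch mapping and port are placeholders.

```python
# Turn a continuous control signal into note-on/note-off events at threshold crossings.
import math
import time
import mido

out = mido.open_output()        # default MIDI output port
note_on = None

for i in range(400):
    signal = 0.5 + 0.5 * math.sin(i * 0.05)     # stand-in for a sensor signal
    pitch = 48 + int(signal * 24)               # map 0-1 onto two octaves

    if signal > 0.6 and note_on is None:        # rising past the upper threshold
        out.send(mido.Message("note_on", note=pitch, velocity=90))
        note_on = pitch
    elif signal < 0.4 and note_on is not None:  # falling below the lower threshold
        out.send(mido.Message("note_off", note=note_on, velocity=0))
        note_on = None
    time.sleep(0.01)
```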

Nicola Privato will present Scramble, a hands-on MIDI tool that combines the style of any MIDI track it is trained on with the performer’s real-time input [6], and the Magnetic Discs, two haptic controllers for the tangible control of neural synthesis models [7].

Sean Patrick O’Brien will present the Organolib, an experimental system full of prepared technical elements which can be instantly utilised when physically sketching an interaction, interface or instrument.

Andrea Martelloni will showcase the HITar, an augmented acoustic guitar for percussive fingerstyle, as a case study for embedded (or embeddable) neural networks in real-time musical interaction. The guitar prototype will offer a visualisation of players’ hits on the guitar’s body, as a demonstration of the network’s capabilities in percussive gesture description; participants are welcome to play or hit the guitar to explore the gesture representation both visually and sonically.
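
The sketch below shows the general shape of such a pipeline (detect hits, then classify each one), using librosa onset detection and a nearest-neighbour classifier rather than the HITar's deep network; the file names and hit labels are placeholders.

```python
# Detect percussive onsets, summarise each hit with MFCCs, and classify it.
import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

def hit_features(y, sr, onset_samples, win=2048):
    """MFCC summary of a short window after each detected hit."""
    feats = []
    for s in onset_samples:
        chunk = y[s:s + win]
        chunk = np.pad(chunk, (0, max(0, win - len(chunk))))  # pad hits near the end
        mfcc = librosa.feature.mfcc(y=chunk, sr=sr, n_mfcc=13)
        feats.append(mfcc.mean(axis=1))
    return np.array(feats)

# Training: a recording where the hit types are already labelled (placeholder data).
y, sr = librosa.load("labelled_hits.wav", sr=None)
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="samples")
labels = (["thumb", "palm", "slap"] * len(onsets))[:len(onsets)]

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(hit_features(y, sr, onsets), labels)

# Performance: classify hits in a new recording (or a live buffer).
y2, sr2 = librosa.load("new_performance.wav", sr=None)
new_onsets = librosa.onset.onset_detect(y=y2, sr=sr2, units="samples")
print(clf.predict(hit_features(y2, sr2, new_onsets)))
```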

Organisers

Jack Armitage

Intelligent Instruments Lab, Iceland University of the Arts, [email protected]

I am a postdoctoral research fellow at the Intelligent Instruments Lab. I have a doctorate in Media and Arts Technologies from Queen Mary University of London, where I studied in Prof. Andrew McPherson's Augmented Instruments Lab. During my PhD I was a Visiting Scholar at Georgia Tech under Prof. Jason Freeman. Before then, I was a Research Engineer at ROLI after graduating with a BSc in Music, Multimedia & Electronics from the University of Leeds. My research interests include embodied interaction, craft practice and design cognition. I also produce, perform and live code music as Lil Data, as part of the PC Music record label.

Victor Shepardson

Intelligent Instruments Lab, Iceland University of the Arts, [email protected]

I am a doctoral researcher in the Intelligent Instruments Lab at LHI. Previously I worked as a machine learning engineer on neural models of speech, and before that I studied Digital Musics at Dartmouth College and Computer Science at the University of Virginia. My interests include machine learning, artificial intelligence, electronic and audiovisual music, and improvisation. In my current research, I approach the lived experience of people with AI via design and performance of new musical instruments. My projects include the Living Looper, which reimagines the live looping pedal through neural synthesis algorithms, and Notochord, a probabilistic model for MIDI performances.

Nicola Privato

Intelligent Instruments Lab, Iceland University of the Arts, [email protected]

I’m a PhD candidate in Cultural Studies, conducting my research at the Intelligent Instruments Lab. Previously, I studied Electronic Music at the Conservatory of Padua (MA), Jazz Improvisation and Composition at the Conservatory of Trieste (BA) and Modern Languages and Cultures at the University of Padua (BA). In the last ten years I have been curating musical events and festivals, composing, performing and teaching music. My current interests include alternative forms of notation, improvisation, composition, and Human-Computer Interaction in performative contexts. My project focuses on AI explainability in music performances.

Sean Patrick O’Brien

Intelligent Instruments Lab, Iceland University of the Arts, [email protected]

I have a BFA from the Studio of Interrelated Media at MassArt in Boston and recently received a Master's in Performing Arts from Listaháskóli Íslands, where I focused on bringing my background in interactive and kinetic sculpture into a performative and socially engaged practice. Inspired by the local Icelandic arts and music scene, I have worked with the Reykjavík Dance Festival, Sequences Art Festival, Nylistasafnið, Kling og Bang, Listahátið, Raflost, Rask, Mengi, Spectral Assault Records, and grassroots organizations Post-Dreifing, RUSL Fest, Fúsk, and King og Bong. My primary goal as an artist is to create an engaging experience that encourages interaction through the performative nature of objects and the sensation of experience.

Lewis Wolstanholme

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a multidisciplinary artist and composer, specialising in crossmodal, experimental and immersive performance practices. I am currently working as a PhD researcher in artificial intelligence and music at Queen Mary University of London, as part of the Augmented Instruments Lab. My research centres upon the development of compositional tools for the exploration of percussion instruments, a narrative which has largely been formed through situated aesthetic aims and practice-based research methodologies.

Jordie Shier

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a PhD student in the Artificial Intelligence and Music (AIM) programme based at Queen Mary University of London (QMUL), studying under the supervision of Prof. Andrew McPherson and Dr. Charalampos Saitis. My research is focused on the development of software that supports creativity in musicians and music producers. I am particularly interested in creating novel methods for synthesizing audio and researching new interaction paradigms for music synthesizers. My current project is on real-time timbral mapping for synthesized percussive performance and is being conducted in collaboration with Ableton.

Teresa Pelinski

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a PhD researcher in the Artificial Intelligence and Music (AIM) CDT at the Augmented Instruments Lab, part of the Centre for Digital Music at Queen Mary University of London. My research is supported by the Bela embedded hardware platform, and my current work focuses on developing workflows for prototyping and experimenting with datasets and neural networks on embedded hardware platforms.

Courtney N. Reed

Sensorimotor Interaction Group, Max Planck Institute for Informatics
[email protected]

I am a postdoctoral researcher in the Sensorimotor Interaction Group (senSInt) at the Max Planck Institute for Informatics, Saarland Informatics Campus. My research focuses on dynamics within human bodies and how we can externalise and understand the internal, often wordless relationships we have with our bodies. I am interested in the perception of control and agency between bodies and technology and how experiential knowledge is leveraged in communication between these agents, for instance in explainable AI (XAI).

Adan L. Benito

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a PhD candidate in the AI + Music programme at Queen Mary University of London in the Centre for Digital Music (C4DM). My research focuses on the possibilities of gesture analysis and disambiguation of guitar performance for the design of expressive instrument augmentations. Besides that, I am one of the developers behind Bela (bela.io) and hold a passion for all things related to guitar experimentation, from craft and technology to techniques and repertoire.

Sophie Skach

Intelligent Instruments Lab, Iceland University of the Arts, [email protected]

I am a postdoctoral researcher at the Intelligent Instruments Lab. Previously, I was with the Centre for Advanced Robotics at Queen Mary University of London, where I also obtained my PhD as part of the Media & Arts Technology programme. Having a background as a designer working in the fashion industry, my research focuses on exploring textile technologies for designing soft robots, musical interfaces, and “smart”, sensing garments for various applications.

Franco Caspe

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a PhD researcher in the Artificial Intelligence and Music (AIM) CDT at the Augmented Instruments Lab, part of the Centre for Digital Music at Queen Mary University of London. My current work focuses on designing tools for real-time and expressive control of synthesisers using audio from musical instruments.

Andrea Martelloni

Augmented Instruments Lab, Queen Mary University of London, [email protected]

I am a PhD researcher in the Artificial Intelligence and Music (AIM) CDT at the Augmented Instruments Lab, part of the Centre for Digital Music at Queen Mary University of London. My research focusses on the use of Real-Time Music Information Retrieval (RT-MIR) and Deep Learning to achieve a rich description of percussive guitar technique for synthesiser control.

Technical Rider

Room with tables, chairs, speakers and projector, power outlets, basic AV cables. An additional overspill space might be good in case things get super noisy.

We would bring all other materials and equipment, including:

  • A simple online Wiki centralising documentation and links for all of the featured tools.

  • 15 Bela.io Workshop kits.

  • Bela.io Trill sensors.

  • Organolib elements including sensors, actuators, e-textiles, transducers, speakers, electromagnets and more (http://iil.is/organolib).

  • Participants will be invited to bring their own instruments and any additional materials they might like to use.

Remote/Hybrid Participation

Unsupported (apart from online docs and open source)

Acknowledgements

The Intelligent Instruments project (INTENT) is funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No. 101001848).
