
“A Synth made of Chicken Nuggets you Play with your Elbows”: A workshop exploring AI-supported Musical Instrument Design

Workshop Proposal for AIMC 2023

Published on Aug 29, 2023


Pete Bennett, University of Bristol ([email protected]), is a Lecturer in Human-Computer Interaction at the University of Bristol. Pete has been exploring the role of AI in design through his installations 16x16 and ImprovCues. Experienced in running workshops, he co-organised the 10,000 Instruments workshop at NIME.

Hugh Aynsley, UWE, Bristol ([email protected]), is a musician and instrument designer based in London whose practice focuses on the accessibility and engagement of new accessible digital musical instruments (ADMIs). Hugh uses speculative and collaborative design methods in the development of new instruments and questions how AI tools can give disabled participants access to instrument design processes.

Sven Hollowell, University of Bristol ([email protected]), is a musician and PhD student in Interactive AI at the University of Bristol, currently designing sensors for augmenting music perception, and AI tools for composition.

Tom Mitchell, UWE, Bristol ([email protected]) is a digital artist and computer scientist working with sound technologies to enable creative expression and scientific representation. He is appointed as an Associate Professor in the School of Computing and Creative Technologies at UWE, Bristol, where he researches music interaction and leads the Creative Technologies Laboratory.

Becca Rose, Goldsmiths, University of London ([email protected]), is a PhD candidate at Goldsmiths, University of London. They make playful interactive works in (and with) communities exploring feminist technologies. Their PhD research focuses on the “Potato Computer Club”, a platform for enlivening computing by playfully entangling materiality. They also co-curate and produce the digital arts festival Control Shift.

Dave Meckin, Royal College of Art ([email protected]), is a sound designer, musician and researcher fascinated by how new technologies can transform audio-visual experiences. Dave is a Tutor (Research) in Information Experience Design at the Royal College of Art. His work spans sonic composition, audio production and software system programming, as well as hardware design and manufacture, all with the aim of creating engaging and responsive sonic environments.

Rebecca Stewart, Imperial College London ([email protected]) is a Lecturer in the Dyson School of Design Engineering at Imperial College London and until early 2019 was a Lecturer in the School of Electronic Engineering and Computer Science at Queen Mary University of London. Becky works with e-textiles and signal processing to build interactive, body-centric wearable computing systems. These systems often incorporate performance, fashion, music and/or design.

Harley Turan, UI Architect at Cloudflare ([email protected]) is an engineer and generative designer, currently based in London.

“A Synth made of Chicken Nuggets you Play with your Elbows”: Midjourney prompt created by a workshop participant [a]


This speculative design workshop explores the use of generative AI tools for musical instrument concept design. Workshop participants will use text-to-image AI tools to rapidly generate an abundance of instrument designs before refining them into mock-ups that imagine how the instruments might be played and how they might sound. The designs produced will be shared in a show and tell at the end of the workshop, as well as on a website and in a physical zine thereafter. The workshop is designed to be interactive, social and hands-on. No prior experience of AI, design or musical instruments is required; all are welcome! The workshop will build upon existing work exploring the role of AI in conceptual design (e.g. [2, 3, 4, 5]) and explore how these tools can support musical instrument designers, particularly when taking a whimsical and absurd approach to design [6], an approach to which text-to-image AI is particularly well suited. The workshop will explore the following questions:

  • How can text-to-image AI tools be used to inspire and accelerate the conceptual phase of musical instrument design?

  • How can physical objects and bricolage practice help to ground and inspire the production of text prompts and image sources for generative AI tools? [1]

  • How can the generative creative process be captured, documented and visualised within and after creative workshops [7]? 

  • How can AI tools be used to meet specific design requirements including cost, materials, or ease of fabrication?

Workshop Plan

Participants will be contacted in advance of the workshop and asked to bring along one or two objects of their choice. The workshop will consist of the following phases:

  1. Introduction (30mins). An introduction and ice-breaker activity based around the participants’ objects, followed by an overview of the workshop.

  2. AI Tools (30mins). Participants will be introduced to a variety of AI tools (e.g. ChatGPT, Midjourney, DALL-E 2).

  3. Instrument Generation (30mins). Participants will be encouraged to create an abundance of novel instruments, drawing inspiration from their objects. Prompts and images will be added to a live feed of instruments to allow for “prompt remixing”.

  4. Break (15mins).

  5. Instrument Themes (45mins). After an introduction to extended AI techniques such as in/out-painting, participants will collectively organise instruments into 'families'. This process will familiarise participants with the designs and identify higher-level similarities and themes. The creation process will continue in parallel, and participants will be encouraged to use the extended AI techniques to expand on unexplored areas of the design space.

  6. Instrument Mockups (45mins). Participants will be encouraged to make lo-fi prototypes of their designs with basic craft materials (foam board, card and paper prototyping), or using the objects they brought along. Then, if there is time:

    • Make some sounds for the design.

    • Form a band from multiple instruments.

    • Feed a photo of your physical instrument back into AI tools.

  7. Break (15mins).

  8. Show and Tell + Discussion (30mins). A brief show and tell of the final instruments, followed by a round-table discussion reflecting on the process and the potential of AI tools for musical instrument designers. Areas for discussion may include:

    • How hard, costly or desirable would it be to actually make the AI designs?

    • Does creativity arise from the clash between digital and physical?

    • How to encourage progression to later design stages?

    • Could the AI techniques integrate into your own practice?

    • "Physicality as a filter". Do the physical materials help to select between a multitude of competing ideas?

    • How did the objects to hand influence the resulting AI designs?

    • Which AI tool did you prefer using and why?
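The “prompt remixing” idea from the Instrument Generation phase can be sketched in plain Python. The helper below is a hypothetical illustration, not part of any existing workshop tooling: it crosses over two participant prompts by splitting each at the word “you” and pairing one prompt’s instrument description with the other’s playing technique.

```python
import random

def remix_prompts(prompt_a, prompt_b, rng=None):
    """Cross over two instrument prompts: split each at " you " and pair
    one prompt's instrument description with the other's playing technique.
    Falls back to a simple mash-up if either prompt lacks the split word."""
    rng = rng or random.Random()
    try:
        body_a, play_a = prompt_a.split(" you ", 1)
        body_b, play_b = prompt_b.split(" you ", 1)
    except ValueError:
        # One of the prompts has no "you ..." clause to swap.
        return f"{prompt_a}, reimagined as {prompt_b}"
    # Randomly choose which instrument body gets the other's technique.
    if rng.random() < 0.5:
        return f"{body_a} you {play_b}"
    return f"{body_b} you {play_a}"

if __name__ == "__main__":
    a = "A synth made of chicken nuggets you play with your elbows"
    b = "A harp strung with spaghetti you strum with your nose"
    print(remix_prompts(a, b))
```

A remixed prompt produced this way could then be fed into whichever text-to-image tool the group is using; the random pairing loosely mirrors participants picking fragments off the live feed.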

The designs created in the workshop will be collected and made available both in digital form and as a downloadable/printable zine. The aim is to use the findings of this workshop as the basis of a publication in the future.

Stringed instrument variation on the chicken nugget elbow theme

Technical Rider

We are asking for the following to be provided:

  • Access to a projector or large screen with audio.

  • Internet access.

  • Multiple tables to work around in groups.

Hybrid Format

Due to the hands-on and physical nature of this workshop, we would prefer to run this as an in-person only event. However, it would be possible to adapt it to accommodate online participants as requested.


[1] Anna Vallgårda and Ylva Fernaeus (2015) Interaction Design as a Bricolage Practice. In Proceedings of TEI 2015

[2] Ziv Epstein, Hope Schroeder and Dava Newman (2022) When happy accidents spark creativity: Bringing collaborative speculation to life with generative AI. arXiv:2206.00533.

[3] Jingoog Kim and Mary Lou Maher (2023) The effect of AI-based inspiration on human design ideation. International Journal of Design Creativity and Innovation.

[4] Yuyu Lin, Jiahao Guo, Yang Chen, Cheng Yao, and Fangtian Ying (2020) It Is Your Turn: Collaborative Ideation With a Co-Creative Robot through Sketch. In Proceedings of SIGCHI 2020.

[5] Josh Vermillion (2022) Iterating the Design Process Using AI Diffusion Models. Creative Collaborations 9.

[6] Giacomo Lepri, John Bowers, Samantha Topley, Paul Stapleton, Peter Bennett, Kristina Andersen, and Andrew McPherson (2022) The 10,000 Instruments Workshop: (Im)practical Research for Critical Speculation. In Proceedings of NIME 2022.

[7] Elizabeth Gerber (2009) Using improvisation to enhance the effectiveness of brainstorming. In Proceedings of SIGCHI 2009.

[8] Haakon Faste, Nir Rachmel, Russell Essary, and Evan Sheehan (2013) Brainstorm, Chainstorm, Cheatstorm, Tweetstorm: New ideation strategies for distributed HCI design. In Proceedings of SIGCHI 2013.

[a] Hugh Aynsley, Thomas J. Mitchell, and David Meckin (2023) Participatory conceptual design of accessible digital musical instruments using generative AI. In Proceedings of NIME 2023.
