
Rhythmic Conversations

Performance

Published on Aug 29, 2023

Abstract

We are proposing an audience-interactive performance/drum circle with AI-driven virtual musicians, reactive visualizations, and a solo guitar player. The proposed performance ties the ancient art of communal music making with today’s cutting-edge technology, offering a reconceptualization of modern performance, performer, and audience dynamics.

Large-scale musical improvisation for solo guitar, using projections and interactive drumming

Performer Bios

Torin Hopkins (he/him/his) is a musician and triple PhD candidate in Neuroscience, Cognitive Science, and Creative Technology and Design at the University of Colorado, Boulder. Torin’s musical practice specializes in pushing the boundaries of audience participation and improvisation in musical performance—using technologies such as neuroimaging, artificial intelligence, and virtual worlds to bridge the gap between audience and performer. Torin has been a music educator since 2010 and is a lifelong performer. He is passionate about music collaboration, music education, and new forms of computer-mediated musical interaction. He believes in the power of collaborative music making as a mechanism for bringing communities together and utilizes technologies that promote participation and creative expression for all community members.

Che Chuan "Suibi" Weng (he/him/his) is an accomplished interactive media engineer and digital artist. His primary focus is on developing interactive installations, with a particular specialization in AR/VR and Arduino components. Weng has had the privilege of teaching technical college-level courses such as Unity Development and Arduino Basics at three esteemed universities. As a digital artist, his works have been selected for inclusion in several highly regarded festivals, including the Digital Arts Festival in Taipei and the 404 International Festival of Art and Technology in Argentina. He is currently enrolled in the Ph.D. program at the ATLAS Institute, University of Colorado Boulder, which he began in 2021. Weng’s current research interests center on human perception in Virtual Reality and Augmented Reality.

Performance Description

Musical performances have the potential to evoke powerful emotions and create unique connections between performers and audiences. However, traditional performances often rely on a one-way flow of music from the performers to the audience, limiting the level of engagement and interaction that can take place. The use of AI-driven virtual musicians in musical performances offers a promising solution to this problem, enabling audiences to actively participate in the performance and potentially influence the direction of the music.

The use of virtual musicians and AI technology in music also offers the potential for increased creativity and accessibility. By removing the need for physical instruments and allowing for new forms of musical expression, virtual musicians and AI can enable individuals who may not have access to traditional musical instruments or may have physical limitations to participate in musical performances. Additionally, the use of AI-driven technology can facilitate collaborations between human and non-human performers, leading to novel and innovative musical compositions.

We are proposing an audience-interactive performance/drum circle with AI-driven virtual musicians, and a solo guitar player. The proposed performance ties the ancient art of communal music making with today’s cutting-edge technology, offering a reconceptualization of modern performance, performer, and audience dynamics. 

The virtual percussionists (projected on-screen and/or positioned alongside the audience members) will “listen” to designated drums and generate beats in real time that complement what the audience members play on those drums. This provides a musical backdrop for the improvising guitar player and the other audience members/percussionists. Audience members will also be encouraged to dance, trade instruments, interact with each other and the performer, play percussion instruments, and interactively shape the outcome of the performance.

The AI backend consists of a series of Python scripts we have developed, with Ableton Live used to manage and edit the incoming music. We leverage models trained on Magenta’s Groove MIDI Dataset to create coherent, patterned drum beats.
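As an illustrative sketch only (not our actual scripts), the listen-and-respond loop can be thought of in two stages: quantize incoming audience drum hits onto a step grid, then generate a complementary pattern for the virtual percussionist. The function names and the simple "play the empty steps" rule below are hypothetical stand-ins for the trained model; in the real system, the generated pattern would be routed into Ableton Live as MIDI.

```python
# Hypothetical sketch of the listen-and-respond loop: audience drum hits
# (timestamps in seconds from the start of the bar) are quantized to a
# 16-step grid, and the virtual percussionist fills the complementary steps.
# The complement() rule is a toy stand-in for the generative model trained
# on Magenta's Groove MIDI Dataset.

STEPS = 16  # one bar of 16th notes


def quantize(hit_times, bpm=100, steps=STEPS):
    """Map hit timestamps (seconds) onto a binary step-grid pattern."""
    step_dur = 60.0 / bpm / 4          # duration of one 16th note
    bar_dur = step_dur * steps
    pattern = [0] * steps
    for t in hit_times:
        idx = int(round((t % bar_dur) / step_dur)) % steps
        pattern[idx] = 1
    return pattern


def complement(pattern):
    """Toy response rule: play where the audience doesn't, but always
    keep the downbeat so the groove stays anchored."""
    out = [0 if hit else 1 for hit in pattern]
    out[0] = 1
    return out


if __name__ == "__main__":
    # Audience plays a quarter-note pulse at 100 BPM
    audience = quantize([0.0, 0.6, 1.2, 1.8], bpm=100)
    response = complement(audience)
    print(audience)
    print(response)
```

In the deployed system, the response pattern would be sent as MIDI notes to Ableton Live rather than printed, and the generative model, not a fixed rule, would decide which steps to play.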

This performance belongs in the AI Concert theme: rather than live-coding with AI, it is an improvisation-based performance. By leveraging AI as a mechanism for promoting audience participation and engagement, we encourage any and all members of the community to participate musically in the performance.

Technical Rider

Technical description

Rhythmic Conversations can be adapted to many kinds of venues. The necessary components of the performance are an audio system and a projector or TVs. The projections display the AI-driven percussionists, while the stereo sound can be managed by a power amplifier. Rhythmic Conversations has taken place in small concert halls, black box theaters, and in-home settings.

What we can bring:

  • Guitar and performers

  • Computer to run system

  • Audio interface

  • Electronic drums

  • Some percussion instruments

What will need to be provided by the venue:

  • 1 Large Table

  • Large TV and stand (or large projector and screen preferred)

  • HDMI connection / cable

  • Backline or amplification for guitar that can handle a minimum of 2 stereo outputs

  • 2 extension cables

  • 2 power strips (8 total outlets)

  • 2 XLR audio cables

  • Any extra rhythm instruments

Additional Documentation

  • Drumming with AI-driven holograms

  • Drumming with AI-driven drummers in AR
