Jess+: Intelligent DigiScore as a creative platform for inclusive music-making for disabled and non-disabled musicians.

Developing an intelligent digital score that extended the creativity of disabled and non-disabled musicians within an inclusive music ensemble. The digital score used AI and a robotic arm to bind these musicians in a shared creative improvisation, with all the musicians benefiting, their practices enhanced and their relationships transformed.

The Issue

There are significant barriers to music-making for disabled musicians. One such barrier, relating to this project, is that while non-disabled people can make music in many ways, this is less true for disabled musicians, who need new accessible instruments, new creative processes, and new hierarchies of “success” (Drake Music). Another is how a mixed ensemble of disabled and non-disabled musicians might improvise together in such a way that dexterity is not a limiting factor. A further barrier is how disabled musicians can be excluded from the creative process in which a music score is created for them. These barriers can leave creative potential untapped, set up hierarchical measurements of creative involvement, and limit the potential input of disabled musicians.

The central research question for this project was: how can an embodied-AI system be built that facilitates co-creation for an improvising ensemble of disabled and non-disabled musicians, so that the expressive qualities of all musicians are amplified and bound together in a co-creative system?

The Research

The Digital Score project (DigiScore) worked in collaboration with Orchestras Live (a national producer creating inspiring orchestral experiences for communities across England) and Sinfonia Viva (a British orchestra based in Derby, England) to evaluate the impact and benefits of using AI and creative robotics to break certain barriers around disabled musicians’ access to creative music-making. The focus for the partners was on using AI and Prof Craig Vear’s digital score concept to generate new modes of music-making and potential inclusive processes of creativity. The priority for the musicians was that it inspired creativity inside music-making (a shared activity between all musicians). Therefore, this was not about design for disability, but rather an investigation into the potential of creative-AI in promoting inclusivity for all musicians involved, regardless of (dis)ability.

Over a series of five workshops in a three-month period, a proof-of-concept was built through an iterative human-centred approach with three musicians: a physically disabled musician (Jess) and two non-disabled musicians (Clare and Deirdre). The process started with a proof-of-concept AI/robot created by Vear that moved and drew marks on paper in response to the realtime sound created by a live musician. The musician in turn would interpret these movements and marks as notational material and make a sound in response, much like a graphic score or Butch Morris’s concept of conduction. The music was created through this live interaction, with both the musician and the AI/robot given equal status as to who was leading whom. In viewing the digital score in this way, the embedded musical idea would emerge through realtime interaction where the movement and the drawing were considered material for interpretation.
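To make this interaction loop concrete, here is a minimal Python sketch of a listen-and-draw cycle of the kind described above. It is purely illustrative: the listen() stand-in, the amplitude-to-gesture mapping and all parameter values are hypothetical assumptions, not the actual Jess+ implementation.

```python
import random
import time

def listen() -> float:
    """Stand-in for a realtime audio feed: returns a loudness value in [0, 1].
    In the live system this role was played by a microphone in the room."""
    return random.random()

def amplitude_to_gesture(amplitude: float) -> dict:
    """Map incoming loudness onto a simple pen gesture: quieter playing
    yields small marks, louder playing yields larger, faster strokes."""
    return {
        "stroke_length_mm": 5 + 45 * amplitude,
        "speed_mm_per_s": 20 + 80 * amplitude,
        "angle_deg": random.uniform(0, 360),  # direction left open to chance
    }

if __name__ == "__main__":
    # A few turns of the listen -> move/draw loop; the musician would then
    # interpret each stroke as notational material and respond in sound.
    for _ in range(5):
        print("draw stroke:", amplitude_to_gesture(listen()))
        time.sleep(0.2)
```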

Throughout the development process we placed an emphasis on trust-building, not only in the AI and robot’s contribution to shared creativity amongst the ensemble, but also in the social aspects of the creative process across the wider team of musicians, developers, researchers and supporting organisations. Through this process we developed a hierarchy of trust that emphasised the social and ecological nature of this work and the working relationships.

What emerged was a closed-loop realtime interaction design with Jess connected to the AI/robot through sensors, and the sound of the ensemble streamed into the AI using a room microphone. The role of the robot arm was to present movements, gestures and sometimes marks on a page that inspired the musicians to make a sound and to co-create music through improvisation. The role of the AI was to sense the humans and to generate a response via the robot arm so that it was perceived as meaningful in the shared flow of music-making. The sensing involved a microphone listening to the sound produced by the human members of the ensemble, together with direct EEG and EDA input from the disabled musician.
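This closed-loop sensing design might be sketched along the following lines. The SensorFrame fields, the weighting of the streams and the coordinate ranges are invented for illustration and do not reflect the project’s actual signal processing or the arm’s control interface.

```python
import random
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One tick of the sensing loop: the room microphone plus on-body signals."""
    mic_level: float  # normalised loudness of the ensemble, 0..1
    eeg: float        # illustrative EEG feature, 0..1
    eda: float        # electrodermal activity (an arousal proxy), 0..1

def read_sensors() -> SensorFrame:
    """Stand-in for the live microphone and wearable sensor streams."""
    return SensorFrame(random.random(), random.random(), random.random())

def arm_target(frame: SensorFrame) -> tuple:
    """Fuse the streams into a single excitation value and map it onto an
    (x, y, z) target for the arm: calmer input keeps gestures small and low,
    more excited input lifts and widens them."""
    excitation = 0.5 * frame.mic_level + 0.3 * frame.eda + 0.2 * frame.eeg
    x = 200 + 100 * excitation        # mm, reach across the page
    y = -100 + 200 * random.random()  # mm, lateral drift
    z = 20 + 150 * excitation         # mm, height above the page
    return (x, y, z)

if __name__ == "__main__":
    for _ in range(3):
        print(arm_target(read_sensors()))
```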

A key technical innovation in this project was Vear’s design of an “AI Factory” in which seven deep learning models were trained on a dataset captured from nine jazz pianists improvising (Vear et al 2019). These models would sense the humans in the loop through the on-body sensors or the microphone and produce a stream of outputs which were used to make movement decisions for the robot arm. The key concept here was that the models were trained on data extracted from jazz musicians who were deeply embodied in their improvisation. The hope was that this embodied data, enacted through realtime sensing of musicians engaged in embodied music-making, would somehow be felt through the resulting robot arm movement and drawing, from which meaningful interactions would be generated in the ensemble.
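As a rough illustration of the “AI Factory” idea of several models responding in parallel to the same sensed moment, the sketch below uses toy stubs in place of the trained networks. The FactoryModel class, the decision rule and every numerical value are hypothetical; the real system used deep learning models trained on the jazz improvisation dataset rather than the hand-written responses shown here.

```python
import random

class FactoryModel:
    """Toy stand-in for one trained model: each 'model' simply colours the
    sensed value differently, whereas the real models were deep networks
    trained on embodied jazz-piano improvisation data."""
    def __init__(self, name: str, bias: float):
        self.name = name
        self.bias = bias

    def predict(self, sensed: float) -> float:
        noisy = sensed * (0.5 + self.bias) + random.gauss(0, 0.05)
        return max(0.0, min(1.0, noisy))

def run_factory(models, sensed: float) -> dict:
    """Every model responds to the same sensed moment, producing parallel
    output streams for the movement layer to draw on."""
    return {m.name: m.predict(sensed) for m in models}

def choose_movement(outputs: dict) -> str:
    """A simple decision rule standing in for the robot's movement logic:
    follow whichever output stream is currently the most active."""
    name, value = max(outputs.items(), key=lambda kv: kv[1])
    return f"gesture driven by {name} at intensity {value:.2f}"

if __name__ == "__main__":
    factory = [FactoryModel(f"model_{i}", bias=i / 10) for i in range(7)]
    print(choose_movement(run_factory(factory, sensed=random.random())))
```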

The Outcomes

Throughout the extended and iterative research process, Vear and a team of research fellows conducted a comprehensive qualitative investigation into the musicians’ reflections on using this system. The findings were surprising, with many aspects of the project exceeding the expectations of the original aims. In short, all the musicians benefited from the introduction of AI and robotics, with practices enhanced and relationships transformed.

Our findings reveal that the musicians formed unexpected and distinct relationships with Jess+: each viewed its role in the ensemble differently, yet these relationships were nonetheless inclusive and binding. Each musician perceived they were in-the-loop with the system, as they found it to be a good listener with its own creative ‘voice’. It was non-judgmental and accepting, which, for our musicians, promoted a new freedom of expression and confidence in taking musical risks when improvising. Jess, as hoped, viewed the system as an additional layer of creativity and felt empowered by its inclusive potential. Clare and Deirdre elevated the system beyond a tool that stimulates musical ideas to a ‘creative accompanist’.

The musicians highlighted how the system operated with them inside music-making, and how it inspired and offered appropriate musical gestures for them to interpret through the flow of the music-making. However, each musician built a different relationship with the robot. Jess, the disabled musician, discussed how she felt a connection with the system within music-making, and accepted the proposition that Jess+ was an extension of her, though the purpose of this extension was to draw a creative visual representation of the music being made. This, to her, was an illustrative artwork of the ensemble’s improvisations, generated from her physiological and psychological data together with the collective audio stream. She referred to it on several occasions as a “friend” and also as a “story-teller”.

The non-disabled musicians acknowledged the connection between Jess and the system, and also called it a “friend”, but felt that it was more of a “creative accompanist” that made a creative contribution to their improvisation through its movement gestures. All of the musicians perceived being in-the-loop with the system and recognised a back-and-forth interaction in realtime.

The musicians highlighted how being in-the-loop with Jess+ and the nature of its interactive contribution to musicking transformed their own practices. Jess felt that the system allowed her to express the emotions that she is sometimes not able to express through her current digital setup. For her, being extended through the system meant that she felt able to express her feelings directly onto a score. “I wanted to explore that part of me and I wanted – you know, I want my emotions that are in here to get expressed outwardly through that” (Jess).

The non-disabled musicians felt that they could take risks and challenge the system as a collaborative partner. This in turn unshackled their improvisational approach and led to emergent novelty in their playing. Overall, they stated that their improvisational skills improved and that they had gained confidence that could help in future improvisation sessions, especially in their school outreach projects. The musicians acknowledged that Jess+ established a ‘third space’ for creativity that flattened any hierarchy of mobility and enhanced the sense of togetherness and inclusion in music-making. They expressed how playing with the system reduced the feelings of “expectations and judgements” (Clare) that playing with other humans could engender.

In an impact questionnaire following the final performance, Deirdre summed up their collective feelings: “The robot arm was liberating to improvise with as it was non-judgemental. At times it united the three musicians’ music, and at other times it could also be independent from us (as we knew it would return to respond to what we later did). This in turn influenced the musicians to start or stop, to ‘gel’ together harmonically or feel the freedom to play outside harmonic or rhythmic frameworks. In my opinion, improvising with a human (especially someone new to you) carries psychological elements that could interfere with making music together, so the robot arm provided the opportunity for freedom of musical and emotional expression that would take much more time to establish and develop between humans”.

Documented performances

  • The first piece was performed with the Magician Lite drawing and no wearable sensors.
  • The second piece was performed using the XArm drawing with 4 pens and with wearable sensors: LINK
  • The third and fourth pieces were performed using the XArm with a feather: with (LINK) and without (LINK).

Jess+ Team performing at the BBC

The Jess+ team at the BBC for a performance as part of the Bridging Responsible AI Divides (BRAID) event (L-R Deirdre Bencsik, Craig Vear, Jess Fisher, Clare Bhabra)

Participants

Jess Fisher Jess is a disabled musician and composer. She performs in inclusive ensembles and as a solo artist, using primarily digital music tools and technologies. She worked with Anonymous on the creation of a bespoke accessible music controller, CMPSR, which Jess typically interfaces with a range of contemporary digital audio workstation software. In ensemble settings Jess performs the music of other composers, typically using a bespoke music notation called ‘arrow notation’ which reflects the design of the CMPSR controller. Jess does on occasion improvise, but is not as familiar or comfortable in such musical settings.

Deirdre Bencsik and Clare Bhabra Deirdre is a professional cellist and Clare a professional violinist; both are long-standing members of Sinfonia Viva, a UK-based orchestra and educational organisation. Their performance practice is rooted in the classical tradition, reading from standard Western notation. Both Deirdre and Clare improvise in some community-based projects, but both professed to not being confident improvisers. Neither employs any digital technology in their own practice, although Anonymous do sometimes collaborate with digital musicians and artists on some projects.

Research Team

Dr Adrian Hazzard Qualitative research into trustworthiness and musicking with AI.

Johann Benerradi Software development, deep learning, robot movement programming.

Dr Solomiya Moroz Embodied music cognition.

Adam Stephenson Robot movement design.

Partners

Sinfonia Viva, a British orchestra based in Derby, England: https://www.sinfoniaviva.co.uk/

Orchestras Live, a national producer creating inspiring orchestral experiences for communities across England: https://www.orchestraslive.org.uk/

Funding

This project received additional funding from the Trustworthy Autonomous Systems Hub https://tas.ac.uk/ and the Faculty of Arts at the University of Nottingham https://www.nottingham.ac.uk/arts/