What is Rapid Prototyping (RP)?

In manufacturing, rapid prototyping (RP) is a group of techniques used to quickly fabricate a scale model of a physical part or assembly. The idea is to accelerate the design process of any product: RP allows for both low-fi and high-fi prototyping, so that the necessary adjustments can be identified before the final production line.

In arts-, games- and software-based applications, RP allows you to quickly estimate and understand the meaning-making processes at play, and to test them with users to assess their experience. Additionally, it brings end-users into the loop and invites them to help make design decisions appropriate to their experiences ‘inside’ the activity.

The TRUTH of the experience ‘inside’ a Digital Score

Challenge: with mixed media and active computation underpinning many digital scores, how do we discuss (or create with) these agents and objects, and how do they shape meaning?

Key Insights:

  • the ‘content’ of a digital score is not the same as its ‘message’
  • the ‘message’ should be understood through the way that a digital score shapes the scale and form of human association and action (i.e. how it makes us behave, feel, think, create)
  • digital score creativity should consider its ‘message’ rather than focussing solely on the content, as deep meaning is communicated here
  • the media used to construct a digital score are in themselves individual media with independent ‘messages’ and ‘contents’ (each forms part of a whole - an ecosystem!)
  • when organising and interpreting the inter-related media in a digital score, we could consider them as objects or agents with mass, gravitational pull, trajectories, radiations and distortions as they operate through time

Agile process

The basic Agile process is circular and involves a series of short sprints/iterations that move your score development forward in SMART increments.

  1. Plan
  2. Design
  3. Develop
  4. Test
  5. Deploy
  6. Review
  7. goto 1
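The circular loop above can be sketched in a few lines of Python. This is purely illustrative: the stage names come from the list above, while the sprint count and the log structure are invented for the example.

```python
# Minimal sketch of the circular Agile process described above.
# Each sprint runs the six stages in order; 'goto 1' is the outer loop.

STAGES = ["Plan", "Design", "Develop", "Test", "Deploy", "Review"]

def run_sprints(num_sprints):
    """Run a fixed number of sprints and return a log of (sprint, stage) steps."""
    log = []
    for sprint in range(1, num_sprints + 1):
        for stage in STAGES:
            log.append((sprint, stage))
    return log

log = run_sprints(2)
print(log[0])   # -> (1, 'Plan')
print(log[-1])  # -> (2, 'Review')
```

In practice the loop is open-ended: you stop when the review tells you the experience is right, not after a fixed number of sprints.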

Think quick and be inventive

The agile development process requires you to be amazingly creative and inventive. The priority is to turn your digital score into music, and in doing so find the TRUTH of the experience by testing out the core communications system of your digital score.

Obviously, given the timescale, you cannot develop a new AI or design the specific game environment in Unity, so you need to use rapid prototyping and Wizard of Oz’ing to turn your idea into a quick and dirty working model (explained HERE and below). Put your musician(s) into it (don’t worry about the simplicity of the experiment - tell them your vision and let them imagine their way through it).

Invent imaginative ways to represent your idea quickly and effectively that are also fun and playful. For example, you could use the video function on your phone to quickly make an animated score with screenshots or post-it notes; you could build a small-world model of your game world from toys and use a WhatsApp video call to fly through this world; you could pretend to use an AI to generate text and images by having someone in the next room type stuff and screen share via Zoom, or control the parameters of a VST sound processor (like the Wizard of Oz in the film). The priority is USER EXPERIENCE… what is their EXPERIENCE of your digital score? Was it what you expected? What exciting new stuff emerged THROUGH this experience?

Prioritise UX

Using the feedback from your musician’s EXPERIENCE, develop your digital score to the next level. And quickly get the musician BACK INTO IT. You want them to share their experience with you QUICKLY AND OFTEN. This is the Agile way LINK

Examples of Rapid Prototyping Digital Scores: getting to the core of the experience.

Example 1: small world game design

Prototype

Musicians X, Y and Z have designed a digital score that is based in a computer game world. It is an environmentally focused score in which musicians need to collect birds by playing melodies of bird songs. Once collected, these birds are ‘saved’. The composition ends when enough birds have been saved to rebalance the ecosystem.

For their rapid prototype, they built a small world using LEGO. This world looked a little like a jungle (one of the scenes in their score). They had written out melodic lines on paper manuscripts and cut them into small chunks. They made small birds out of three LEGO blocks each, every bird a different colour, and attached the manuscript extracts with sticky tape. This world was placed in a cardboard box with the front missing, quickly painted to look like the sky.

Within 60 minutes of doing this, they were ready for an external musician (soprano sax) to try the rapid prototype. The saxophonist was told what the LEGO world was meant to represent, and the goal of the score along with the initial performers’ instructions. The musician sat in front of a large TV screen in a different part of the room from the LEGO world; the small world was transmitted to the screen using WhatsApp video. Musicians X and Y would move the birds around the small world while Z moved the phone around like the camera function in a computer game. They were listening and responding to the saxophonist’s sounds.

This small-world experiment of the digital score was chosen because its core experience was communicated through the interactivity between the saxophonist’s sound and the game world. If this aspect of the digital score design did not work, or was substandard, musicians X, Y and Z would never know whether the core communication functions of the score worked.

The demo lasted 6 minutes. Immediately following the demo, the team got together, and the saxophonist explained how it felt to be playing inside the score, what they expected to be happening, and surprising areas they liked. The team discussed the next steps, and very quickly made amendments and enhancements, made a second small world (a seascape) and added different animals as requested by the saxophonist. The saxophonist also wanted to have the camera movement linked to her saxophone, so she was in control of the live image.

After a couple more sessions, musicians X, Y and Z had deep insights into how the digital score might work when made in the Unity games engine. At the same time, because of their hands-on experience of moving the birds around and listening to the musician’s live performances, they were able to programme the non-player-character aspects of the game world with ease and accuracy.

This process enabled musicians X, Y and Z to get quickly to the core experience of this digital score, and because of this to build the digital score proper quickly and effectively.

Example 2: animated score

Musician A used the cards and designed a digital score as a fixed animated film. It was to deconstruct a violin and build a collage of ‘materials’ pertaining to a violin, such as wood, metal strings, pegs and violin music. The fixed animated film was to journey across these materials laid out in a sequence: for example, a piece of string leading to a bit of music manuscript by Mozart, leading to pieces of wood. Musician A, herself a violinist, would use these elements to “reconstruct” the violin through improvised sound.

The core experience of this digital score was to be felt in the balance between how the abstract and musical materials were presented on the screen and the tempo of the journey through the collage. A rapid prototype experiment was set up using bits of stuff lying around the classroom that resembled the materials of a violin. These were laid out in a journey, and a quick video was made with her phone. Following the first test, a new journey was laid out and recorded, then another.

Through these early experiments, core experiences of balance, tempo, complexity of materials and assemblage were determined, and a final version was made relatively quickly once the proper materials were found.

Example 3: AI ensemble

Musicians R, S and T designed a digital score using AI, which would listen to a live musician playing a fixed melodic line and build a unique backing track for them based on their emotional mood. The AI would generate chords by analysing the live musician’s playing in realtime and measuring their arousal through on-body sensors (EDA). Together the AI and the live musician would make the music in the here and now, with each performance of the digital score creating a different version of the piece while the melodic line remained fixed.

The core experience would be how the AI measured the live musician and made a response in realtime. This would be felt as new and engaging, perhaps like a duet. Musician R would perform the melodic line on guitar with musicians S and T operating the AI.

To kickstart the experiment, the musicians composed a short melodic line and built a quick harmonic matrix that would work with it. The rapid prototype experiment used a Wizard of Oz approach for the AI: musicians S and T pretended to be the AI in another room (like the Wizard of Oz in the film). Musician S would watch musician R through a live video feed and make judgments about their arousal state, signalling this to musician T by lifting or lowering his arm. Musician T would then play chords using a found keyboard sound in GarageBand, referencing the harmonic matrix as a guide to what he was playing. This was then broadcast back to musician R over the video feed.
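The rules the trio later built for translating arousal into chords could be sketched like this. Everything here is hypothetical: the arousal bands, thresholds and chord lists are invented for illustration, and the real harmonic matrix was composed by the musicians themselves.

```python
# Hypothetical sketch: map a measured arousal reading (0.0-1.0) to a backing
# chord drawn from a small harmonic matrix, as in the Wizard-of-Oz experiment.
# Bands, thresholds and chords are invented examples, not the trio's actual matrix.

HARMONIC_MATRIX = {
    "low":    ["Am", "F", "Dm"],     # calm playing -> softer, darker chords
    "medium": ["C", "G", "Em"],
    "high":   ["E", "B7", "C#dim"],  # high arousal -> tenser chords
}

def arousal_band(arousal):
    """Bucket an arousal reading into one of three bands (thresholds invented)."""
    if arousal < 0.33:
        return "low"
    if arousal < 0.66:
        return "medium"
    return "high"

def next_chord(arousal, beat):
    """Pick the next backing chord by cycling through the band's chord list."""
    chords = HARMONIC_MATRIX[arousal_band(arousal)]
    return chords[beat % len(chords)]

print(next_chord(0.2, 0))  # -> Am
print(next_chord(0.9, 1))  # -> B7
```

In the Wizard-of-Oz version, musician S's raised or lowered arm stood in for `arousal_band`, and musician T's keyboard playing stood in for `next_chord`; the sketch just makes that human mapping explicit.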

After one experiment the trio discussed the core experience. Decisions were made about extending the melodic line and building rules for how the arousal measurement could be translated to the harmonic matrix. Extra AI control qualities were also introduced.

Over a 3-hour process, the musicians felt that they understood the key behaviours of the AI and could then undertake the lengthy process of building it in Python knowing that it would work. They also knew what material worked for the melodic line and harmonic matrix.