Musical Chairs: From Spectator to Stage

 

  1. What is Musical Chairs?
  2. How it Works
  3. Why? The Need for Engaged and Interactive Listening
  4. Presentations and the Research
  5. Our Team

 

 

1. What is Musical Chairs?

 

Musical Chairs is a touch-based interactive music exploration application (“app”) designed for tablet devices (iPad and iPad mini) and currently available for the iOS platform.  The application enables users to explore multi-channel, high-definition videos of recorded musical performances, allowing them to zoom and pan through a performance with standard touch gestures, with the volume of each channel dynamically mixed based on the current point of focus in the video stream.  In this way, the audio adapts to the visual focus, bringing to prominence channels whose sources are on screen or nearly on screen, while downplaying channels whose sources are far from the current focus of attention.
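To make the idea concrete, the mixing can be thought of as a per-channel gain that depends on how far each performer's position in the video frame lies from the portion of the frame currently on screen.  The Swift sketch below illustrates this idea only; it is not the app's actual implementation, and the PerformerChannel type and gain(for:visibleRect:falloff:) function are hypothetical names introduced for illustration.

```swift
import CoreGraphics

/// Hypothetical description of one audio channel: the recorded track and the
/// performer's position within the full video frame (normalised to 0...1).
struct PerformerChannel {
    let trackIndex: Int
    let position: CGPoint
}

/// Minimal sketch of focus-based mixing: a channel whose performer lies inside
/// the currently visible rectangle plays at full volume, while channels whose
/// performers are off screen are attenuated with their distance from the view.
/// `visibleRect` is the portion of the frame shown after the user's
/// pinch-to-zoom and swipe-to-pan gestures.
func gain(for channel: PerformerChannel,
          visibleRect: CGRect,
          falloff: CGFloat = 4.0) -> Float {
    // Distance from the performer to the nearest point of the visible rectangle.
    let clampedX = min(max(channel.position.x, visibleRect.minX), visibleRect.maxX)
    let clampedY = min(max(channel.position.y, visibleRect.minY), visibleRect.maxY)
    let dx = channel.position.x - clampedX
    let dy = channel.position.y - clampedY
    let distance = (dx * dx + dy * dy).squareRoot()

    // On screen (distance == 0) gives gain 1.0; gain decays smoothly off screen.
    return Float(1.0 / (1.0 + falloff * distance))
}
```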

 

This allows users to isolate specific audio and visual aspects of a musical performance and creates the illusion of physically moving through the performance in real time, somewhat akin to a “Google Street View” for musical performances, using seamless video and audio instead of connected still photos with no sound.  Enabling this type of listening for musical ensembles may give users a way of engaging with music in a participatory and active manner, and could therefore result in more sustained interest in group and individual musical performances.

 

2. How it Works

 

For example, consider the sequence of images in Figures 1 and 2 in Appendix A of this document, depicting a user using Musical Chairs to explore the performance of a quintet.  The performance begins with Figure 1(a), with the full ensemble shown.  With all performers on screen, all performers can be heard equally, with their audio mixed appropriately for stereo output.  (The first and second performers are more prominent in the left stereo channel, while the fourth and fifth are more prominent in the right stereo channel.)  In Figure 1(b) the user uses a pinch-to-zoom gesture to focus on the second performer.  As this happens, the second performer’s volume is increased while the other performers’ volumes are decreased accordingly.  In Figure 1(c) this operation is complete, and the second performer’s contribution now dominates what the user hears; the first performer can still be faintly heard in the left stereo channel, while the remaining performers can be faintly heard in the right stereo channel, their volume decreasing with distance from the second performer.

To hear both the first and second performers, the user begins a swipe-to-pan action in Figure 1(d) and completes it in Figure 1(e).  With both performers on screen, their contributions can be heard equally well, although again mixed into the left and right stereo channels appropriately.  The other performers, now even farther from the focus, are even quieter than before.  To focus on the third performer, the user zooms out and then begins a pinch-to-zoom action in Figure 1(f), completing it in Figure 2(a).  As the zoom out is performed, all of the performers are brought back into view and can be heard again; when the zoom to the third performer is carried out, his contribution dominates what the user hears, while the other performers can be heard faintly in the background of the left and right stereo channels.  In Figure 2(b) the user has zoomed out and then in again on the fifth performer, causing his contribution to dominate.  Through Figures 2(c), 2(d), and 2(e), the user pans through the entire ensemble from right to left.  Doing so creates an interesting effect, with performers moving from the left channel, to dominating both channels, to the right channel, and then fading off to the right as the focus of the performance shifts.  The performance ends in Figure 2(f) with the user zooming out to hear the entire ensemble as in Figure 1(a).

Through a variety of simple touch gestures, the user has effectively moved through the performance, focusing their attention on different performers and aspects of the music.  As such, Musical Chairs supports interactive musical exploration in ways that have previously not been possible.
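The left/right behaviour described above, with a performer drifting from the left channel, through the centre, to the right channel as the user pans, can similarly be sketched as a pan position derived from the performer's horizontal location within the visible area.  Again, this is only an illustrative sketch; the stereoWeights function below is a hypothetical name and the app's actual mixing may differ.

```swift
import Foundation
import CoreGraphics

/// Sketch of the stereo placement described in the walkthrough: a performer
/// near the left edge of the visible area is weighted toward the left output
/// channel, a performer near the right edge toward the right channel, and a
/// performer at the centre of focus contributes equally to both.
func stereoWeights(performerX: CGFloat, visibleRect: CGRect) -> (left: Float, right: Float) {
    // Map the performer's horizontal position to -1 (left edge of the view)
    // through +1 (right edge), clamping performers that are currently off screen.
    let pan = Double(max(-1, min(1, (performerX - visibleRect.midX) / (visibleRect.width / 2))))

    // Equal-power panning keeps perceived loudness roughly constant as the
    // user swipes and a performer drifts across the stereo field.
    let angle = (pan + 1) * Double.pi / 4   // 0 for hard left, pi/2 for hard right
    return (left: Float(cos(angle)), right: Float(sin(angle)))
}
```

Combining such a pan weight with a distance-based gain like the earlier sketch would reproduce the behaviour described in the walkthrough: the performer in focus dominates when fully zoomed in, while the remaining performers fade into the left or right channel according to where they sit relative to the view.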

 

 

3. Why? The Need for Engaged and Interactive Listening

 

The creation of the Musical Chairs application stems from an interest in music education technology and, specifically, in engaged listening to recorded musical performances. The literature in music education emphasizes the need for innovative ways of providing students with experiences that reflect their own culture. Students, now widely described as digital natives, have various ways of listening to and experiencing music inside and outside of the classroom, including using devices such as iPods and iPads to access and download, or view and listen to, any type of music they choose. Buckingham [1] describes how students need to engage critically with digital literacies, including interactive listening technologies such as iPods and iPads. This reinforces the purpose of Musical Chairs, which “addresses a range of musical and technological skills that widen the opportunity for students in music beyond music education’s traditional approaches”. The speed at which music education technology has evolved is astounding, with new music technologies being introduced quickly and effectively to learners of all ages. While educators are constantly aware of musical styles as a form of students’ personal identity, it has become more urgent to address students’ ways of consuming music as well. As Finney [2] explains, teachers and students use different musical codes, and it may be that some are at opposite ends of a “musical and linguistic chasm with few holding the key to unlock each other’s code” (p. 18). Finney explains that if we are unresponsive to our students’ pedagogical needs and their ways of knowing, understanding, and identifying with music, we can potentially have a negative effect on their personal autonomy and creative capabilities in music. Similarly, Ruthmann & Dillon [3] agree, adding that we must allow for students’ agency by understanding how students use music and experience technology. By doing this, we can begin to understand students’ own culture and model a ‘relational pedagogy’ in the classroom.

 

Creating this application with a focus on engaged listening necessitates an investigation into the quality of interaction students have with technology. Researchers such as Green [4, 5] describe the extremes within the types of listening that students engage in, from purposive listening to distracted listening. The difference is that with purposive listening, students actively engage with the music or instruments being played, whereas with distracted listening the music is often heard in the background and is not the focus of the student’s attention. In addition, it is possible that distracted listening may occur even when purposive listening is the goal.

 

Beyond music education and research, there are a number of broad and general applications for music exploration and interaction, as the technology enables everyone to work or play with music in ways that were previously not possible.  The ability to stream and/or download performances also creates numerous social music applications, with users able to share their recorded performances with one another for a variety of purposes.  For example, an ensemble or band can record one of their performances and view the recording later, making personal notes for improvement.  The ensemble or band could also share their performance with partners or labels as a demo for business or audition purposes.  Finally, the ensemble or band can share their performances with an audience or fans, simply to share their music with others.

 

1.  Buckingham, D., “Schooling the digital generation: popular culture, the new media and the future of education,” London: Institute of Education, University of London, 2005.

2.  Finney, J., “Music education as identity project in a world of electronic desires,” In J. Finney & P. Burnard (Eds.) Music education with digital technology, London, UK: Continuum, 2009.

3.  Ruthmann, S.A. & Dillon, S.C., “Technology in the lives and schools of adolescents,” In G. McPherson & G. Welch (Eds.) The Oxford handbook of music education, volume 1, New York: Oxford University Press, 2012.

4.  Green, L., “How popular musicians learn,” Aldershot: Ashgate, 2002.

5.  Green, L., “Music, informal learning and the school: A new classroom pedagogy,” Aldershot: Ashgate, 2008.

 

Appendix A – Screen Shots of Musical Chairs in Action

 

[Figure 1, panels (a)–(f): screenshots of Musical Chairs in action; see caption below.]

 

Figure 1 – Screen Shots from the Musical Chairs Application.  (a) The application playing back the performance of a quintet.  (b) The application’s pinch-to-zoom functionality is being used to focus on the second performer.  (c) Pinch-to-zoom continues until the second performer is fully in focus.  (d) Swipe-to-pan functionality is used to shift focus to the first performer.  (e) Panning is completed, with the first and second performers in focus.  (f) The application zooms out to the full ensemble again, and then pinch-to-zoom is used again, this time to begin focusing on the third performer.

 

[Figure 2, panels (a)–(f): additional screenshots of Musical Chairs in action; see caption below.]

 

Figure 2 – Additional Screen Shots from the Musical Chairs Application.  (a) The pinch-to-zoom action of Figure 1(f) is completed, with the focus now on the third performer.  (b) The application’s pinch-to-zoom functionality is used to first zoom out and then zoom in on the fifth performer.  (c) Swipe-to-pan is used to move through the performance and bring the fourth performer into focus.  (d) Swipe-to-pan functionality is used again to shift focus back to the third performer.  (e) Swipe-to-pan is again used to bring the first and second performers into focus.  (f) The application zooms out to the full ensemble once again.

 

Thank you to “The Western Winds” ensemble from the Don Wright Faculty of Music for permission to use the screenshots of their Musical Chairs interactive audio/video recording.

 

 

4. Presentations and the Research

 

We are currently conducting formal research projects using this technology.

 

The Engaged Listening Project 1 (ELP1) – Research is currently being conducted on how people listen to music. More specifically, when listening to a pre-recorded live performance, we are analyzing which specific aspects of a performance attract listeners (what they listen to, at what point during the piece, for how long, and so on). The Musical Chairs: From Spectator to Stage app enables this research by having participants watch and listen to performances while their interactions with the app are recorded as data. The data are then analyzed for emerging patterns.
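As an illustration only, the interaction data gathered in ELP1 might take the form of timestamped gesture events along the lines of the sketch below; the type and field names are hypothetical and do not necessarily reflect the app's internal logging format.

```swift
import Foundation

/// Hypothetical record of a single interaction captured while a participant
/// explores a performance.
struct ListeningEvent: Codable {
    let timestamp: TimeInterval   // seconds from the start of the piece
    let gesture: String           // e.g. "pinch-zoom-in", "swipe-pan"
    let zoomLevel: Double         // 1.0 = full ensemble visible
    let focusCentre: [Double]     // normalised [x, y] of the centre of the view
}

/// Sketch of a per-participant session log that could later be exported for
/// pattern analysis.
struct ListeningSession: Codable {
    let participantID: String
    var events: [ListeningEvent] = []

    mutating func record(_ event: ListeningEvent) {
        events.append(event)
    }

    func exportJSON() throws -> Data {
        try JSONEncoder().encode(self)
    }
}
```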

 

The Engaged Listening Project 2 (ELP2) – In this study, students enrolled in Grade 9 music classes are asked to use the app to assist them in learning a new piece of music. The recording features Grade 12 music students from the same school. This case study investigates whether the app is useful for students learning to play new pieces, and whether it influences their future choices to enroll in music classes. The results will be informed by semi-structured interviews with the participating students and teacher.

 

The Engaged Listening Project 3 (ELP3) – This research analyzes the app’s use and potential as a data-gathering tool for qualitative researchers. We are investigating ways of tracking audio/visual data that will enable qualitative researchers to capture all participants’ anecdotal ‘field notes’ and conversations in an efficient and accurate manner.

 

 

The app has been presented many times within the Western University community, including in-class lectures, invited lectures in undergraduate and graduate classes, and many other presentations.  The list below describes some of the recent academic conference presentations:

 

Conference Paper Presentations – Peer Reviewed:

 

Godwin, M. & Linton, L. (2014). Interactive audio and music education applications. Audio Engineering Society Conference, Los Angeles, California. (October, 2014).

 

Linton, L. & Godwin, M. (2014). Musical Chairs: From Spectator to Stage. Paper and interactive session presented at IMPACT conference on music apps and educational games for music education. New York University, Steinhardt. August 14–17, 2014.

 

Linton, L. & Godwin, M. (2014). There’s an ‘app’ for that: Creating a new app for music education. International Symposium of Music Education, July 2014, Porto Alegre, Brazil.

 

Doyle, J., Godwin, M., & Linton, L. (2014). Musical Chairs Listening Station. Interactive listening station at the Don Wright Faculty of Music, Faculty Research Day. Western University, London, Canada.

 

Godwin, M. & Linton, L. (2013). Musical chairs: From sound to stage. Paper presented at Teaching and Instructional Education Symposium (TIES 2.0), The University of Western Ontario, London, Canada.

 

And many more!

 

 

5. Our Team

 

Don Wright Faculty of Music:

Mike Godwin (Audio and Media Specialist)

Leslie Linton, Ph.D. (Department of Music Education)

 

Faculty of Science:

Mike Katchabaw, Ph.D. (Department of Computer Science)

Justin Doyle (Research Assistant)

 

Research and Development:

Bryce Picard (Executive Director, Research)

Natalie Sudszy, Ph.D. (Research Development)

 

WORLDiscoveries©

Jonathan Deeks (Business Development Manager)

Fabian Folias (Digital Marketing Manager)