Research project

Capturing the Contemporary Conductor

Project overview

Capturing the Contemporary Conductor is a BA/Leverhulme Trust-funded pilot project investigating the use of high-quality motion capture systems for the study of conducting gesture. It is a collaborative project between Music and Health Sciences at the University of Southampton.

The team comprises Dr Richard Polfreman and Dr Benjamin Oliver from Music, Dr Cheryl Metcalf from Health Sciences, and research assistant Dan Halford.

This pilot study uses high-quality motion capture technology to help develop tools for the modelling and analysis of contemporary conducting gesture, both for conductor performance studies and for real-time gesture control in electroacoustic music performance.

The intention is to develop a protocol for the use of high-end, high-resolution marker-based motion capture systems (such as those used for computer-generated graphics in cinema) to record precise 3D representations of conducting gestures. This high-quality data can reveal even subtle differences between performances, musical contexts and conductors, and can be used not only to analyse performances but also to provide data for machine learning and artificial intelligence software for interpreting conducting gesture.

In this pilot, a new musical work was composed specifically to incorporate a range of conducting gestures and techniques (such as varied tempi, changing time signatures and cueing), in places working with a click-track (a metronome only the conductor can hear) and an electronics part, to reflect contemporary conducting skills.

We then assembled a group of performers to play the piece, and three different conductors to work with them. Each conductor worked with the ensemble in turn, through rehearsal and a motion-captured performance, which was also filmed from several different angles and recorded in high-quality audio. The raw data was then processed and aligned, and the resulting multiple data streams were uploaded to the Repovizz open-access motion-capture website for use by researchers in the field.
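As an illustration of the kind of alignment involved (a minimal sketch of our own, not the project's actual pipeline), the example below resamples two data streams recorded at different rates onto a shared timeline; the stream names, sample rates and values are all synthetic.

import numpy as np

def resample_to_common_timeline(streams, target_rate=120.0):
    """Linearly interpolate each (timestamps, values) stream onto one evenly spaced timeline."""
    start = max(t[0] for t, _ in streams)       # latest common start time
    end = min(t[-1] for t, _ in streams)        # earliest common end time
    timeline = np.arange(start, end, 1.0 / target_rate)
    return timeline, [np.interp(timeline, t, v) for t, v in streams]

# Synthetic example: a 100 Hz marker trajectory and a 1 kHz audio envelope.
t_mocap = np.arange(0, 10, 0.01)
t_audio = np.arange(0, 10, 0.001)
mocap_y = np.sin(2 * np.pi * 0.5 * t_mocap)     # stand-in for a baton marker height
audio_env = np.abs(np.sin(2 * np.pi * 2.0 * t_audio))
timeline, (mocap_rs, audio_rs) = resample_to_common_timeline(
    [(t_mocap, mocap_y), (t_audio, audio_env)]
)
print(timeline.shape, mocap_rs.shape, audio_rs.shape)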

In addition to the audio, video and optical motion capture data, we also captured EMG (electromyography) data from sensors attached to the conductors’ arms; these provide readings of muscle activity, which can also be used to help understand movement.
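As a rough illustration of how such readings are commonly processed (a generic sketch, not the project's analysis code), the example below computes a muscle-activity envelope from a raw EMG signal by band-pass filtering, rectifying and low-pass filtering; the sample rate and cutoff frequencies are assumptions.

import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(raw, fs=1000.0, band=(20.0, 450.0), env_cutoff=6.0):
    # Band-pass to remove drift and high-frequency noise.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)
    # Rectify, then low-pass the rectified signal to get the activity envelope.
    rectified = np.abs(filtered)
    b, a = butter(4, env_cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)

# Synthetic example: a noise burst standing in for a forearm muscle contraction.
rng = np.random.default_rng(0)
emg = rng.normal(0, 0.05, 5000)
emg[2000:3000] += rng.normal(0, 0.5, 1000)
envelope = emg_envelope(emg)
print(envelope.max() > envelope[:1000].max())   # burst region yields a larger envelope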

Finally, we also captured ground reaction force data from a floor-mounted force plate on which the conductors stood while performing. This data gives us information about how the conductor's weight shifts during performance, and we have extracted useful preliminary information about conducting timing from it.
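To give a flavour of how timing might be extracted (a minimal sketch under our own assumptions, not the project's method), the example below picks out peaks in a synthetic vertical ground reaction force signal as candidate beat times.

import numpy as np
from scipy.signal import find_peaks

def force_peak_times(vertical_force, fs=1000.0, min_interval_s=0.3):
    """Return times (s) of force peaks at least min_interval_s apart."""
    centred = vertical_force - np.mean(vertical_force)
    peaks, _ = find_peaks(centred,
                          height=np.std(centred),           # only clear peaks
                          distance=int(min_interval_s * fs))
    return peaks / fs

# Synthetic example: body weight plus a small bounce at 120 BPM (two beats per second).
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
force = 700 + 20 * np.maximum(0, np.sin(2 * np.pi * 2.0 * t)) ** 4
beat_times = force_peak_times(force, fs)
print(np.round(np.diff(beat_times), 2))                     # roughly 0.5 s apart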

Outputs from the project include the datasets of conductor motion capture data aligned with audio and multi-angle video recordings, as well as conference and journal papers.

Staff

Lead researcher

Dr Richard Polfreman

Associate Professor

Research interests

  • New Interfaces for Musical Expression (NIME)
  • Music and Movement
  • User-Interface Design and HCI