Lone musicians to play with virtual ‘timing-sensitive’ colleagues in a new immersive rehearsal platform

The £1.2M project, funded by the Engineering and Physical Sciences Research Council (part of UK Research and Innovation), is led by the University of Birmingham and involves researchers at Birmingham City University and the University of Warwick.

A system that will enable musicians to play alone, but as if part of a live ensemble, is being developed at the University of Birmingham.

In situations where musicians are unable to be together, the system will offer the experience of a group rehearsal. Being unable to meet has been an issue for musicians throughout the Covid-19 pandemic, and it remains an ongoing challenge for players wanting to explore and rehearse music without access, for whatever reason, to their colleagues.

Be it piano duets, jazz trios, string quartets or rock groups, mutual listening and timing are key to good performance. Ensemble musicians spend hours working together to hone such timing abilities and so achieve group coherence worthy of a stage.

When unable to rehearse together, individual ensemble musicians may choose instead to play along with a recording of the piece, sometimes even one with their own track omitted. The drawback of this approach is that they miss the vital interactive and dynamic elements of a group rehearsal. Meanwhile, attempts to rehearse using video-conferencing are prone to failure: internet delays render split-second synchronisation, the holy grail of good ensemble playing, impossible.

ARME (Augmented Reality Music Ensemble) is a system being developed by a collaborative team across three West Midlands universities and two UK companies specialising in digital music technology.

It presents the lone musician playing a real instrument with a group of virtual colleagues based on audio-visual recordings of professional players. Crucially, each virtual player is timing-sensitive, meaning that their musical timing is individually modulated in real time using mathematical models. Such sensitivity allows for the virtual players to synchronise ‘live’ with each other and with the real player, so mimicking the experience of an actual rehearsal.
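As a rough illustration of how such timing sensitivity can work (this is a generic sketch, not the project's actual model), adaptive synchronisation between players is often described with a linear phase-correction rule: on each beat, a player shifts their next note onset by a fraction of the asynchrony they perceived with their partners. A minimal Python sketch, with all names and parameter values hypothetical:

```python
# Minimal sketch of linear phase correction between ensemble players.
# All parameter values are illustrative, not taken from the ARME project.

def next_onset(my_last_onset, partner_last_onsets, period, alpha=0.25):
    """Schedule this player's next note onset.

    my_last_onset: time (s) of this player's previous note
    partner_last_onsets: times (s) of the partners' previous notes
    period: nominal inter-onset interval (s), e.g. 0.5 at 120 BPM
    alpha: correction gain (0 = ignore partners, 1 = full correction)
    """
    # Asynchrony: how far ahead (+) or behind (-) this player was
    # relative to the average of the partners.
    mean_partner = sum(partner_last_onsets) / len(partner_last_onsets)
    asynchrony = my_last_onset - mean_partner
    # Play the next note one period later, minus a fraction of the error.
    return my_last_onset + period - alpha * asynchrony

# Two virtual players converging on a shared beat:
a, b = 0.00, 0.08  # player B starts 80 ms late
for _ in range(10):
    a, b = next_onset(a, [b], 0.5), next_onset(b, [a], 0.5)
print(abs(a - b))  # the gap shrinks by half each beat
```

With both players correcting at gain 0.25, the timing gap decays geometrically (by a factor of 1 − 2α = 0.5 per beat), which is why mutual adaptation settles quickly onto a common pulse; real models of ensemble timing add noise, tempo drift and musician-specific gains on top of this basic mechanism.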

Starting out with small groups of classical string players, in duos, trios, and quartets, the team hopes to create a model that can then be broadened out to other ensembles and genres, leading to a versatile, interactive rehearsal tool to be used when traditional ensemble rehearsals are not accessible.

Supporting the project, the UK digital music industry is represented by two commercial partners involved in musical training and digital technology: PartPlay and Semantic Audio.

Dr Maria Witek, a Senior Research Fellow in the Department of Music at the University of Birmingham, explains: “By studying how ensemble musicians adapt their timing to achieve a rhythmically cohesive and musically expressive performance, we can develop a novel system allowing a real musician to play with responsive and adaptive virtual players.”

Dr Massimiliano Di Luca, principal investigator and senior lecturer in the School of Psychology at the University of Birmingham, highlights the opportunity for inclusivity and accessibility that the system offers: “We will be able to provide musical training opportunities when travel and social rehearsals are not feasible. This will be useful in the wake of the Covid-19 pandemic and for musicians who might, perhaps for geographical, financial or societal reasons, be unable to make rehearsals happen.”

Dr Mark Elliott, from WMG at the University of Warwick, comments: “This is a great example of how the collaborative expertise across three West Midlands universities and industrial partners will be used to combine state-of-the-art immersive and audio technologies with mathematical models of human movement to develop research into realistic musician avatars.”
