virtually (re)Sounding Place // vRSP

vRSP is developing outputs that exemplify a wholly new genre of artwork: works designed for virtual space but based on notable and rare historic places, for which music will be written to integrate musical and social memory and to exploit each site's unique architectural and acoustic environment.

Live performances of newly composed works will be captured and disseminated to viewers and listeners as VR content through a series of novel production techniques. These techniques will address specific difficulties presented by current technologies, as described below, by applying interdisciplinary knowledge that combines research in human perception, neuroscience, musical composition, animation, visual surface-mapping techniques, visual signal processing, stereoscopic 360-degree video, and three-dimensional audio recording and reproduction.

The network draws together experienced practitioners from all of these fields to inform the creation of a new work that will portray the intrinsic qualities of the historic performance space through site-specific live performance and a combination of contemporary video- and animation-based techniques for VR generation.

Project Summary

The vRSP network will explore new forms of immersive performance experience in sites of historical and cultural import. It will start by examining new works by composer Michael Price, who has been exploring the use of 360-degree video and audio to document performances in National Trust properties. The research network will examine how these performances have been recorded and converted to 360-degree videos, and will offer new solutions drawing on ongoing research from the University of Surrey’s Institute of Sound Recording, Centre for Vision, Speech and Signal Processing, 5G Innovation Centre, and Digital Media Arts programmes.

Many spatial experiences are not well represented by current 360-degree video and immersive technologies. Despite Ambisonics being adopted as the de facto soundfield capture and manipulation standard by Facebook and other 3D-audio content platforms, current 360-degree video experiences lack spatial audio detail and the ability to explore the soundfield convincingly, which considerably reduces the sense of immersion. While equipment manufacturers currently focus on increasing image quality through higher pixel counts, there is a gap in both the documentation and the availability of high-quality spatial audio captured outside of game-engine-based experiences.
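For readers less familiar with Ambisonics, the soundfield representation referred to above, the following is a minimal sketch of first-order (B-format) encoding of a mono source. The function name is illustrative, and the 1/√2 weighting on W follows the traditional B-format convention (other normalisations, such as SN3D, differ); none of this describes the project's actual pipeline.

```python
import math

def encode_foa(sample: float, azimuth_deg: float, elevation_deg: float):
    """Encode a mono sample into first-order Ambisonic B-format (W, X, Y, Z).

    Azimuth is measured counter-clockwise from straight ahead; the W channel
    carries the traditional 1/sqrt(2) gain.  Illustrative sketch only.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample * (1.0 / math.sqrt(2.0))        # omnidirectional component
    x = sample * math.cos(az) * math.cos(el)   # front-back figure-of-eight
    y = sample * math.sin(az) * math.cos(el)   # left-right figure-of-eight
    z = sample * math.sin(el)                  # up-down figure-of-eight
    return w, x, y, z

# A source directly ahead (azimuth 0, elevation 0) contributes fully to X
# and not at all to Y or Z.
print(encode_foa(1.0, 0.0, 0.0))
```

Because the four channels jointly encode direction, a decoder can later re-derive loudspeaker or binaural feeds for any listener orientation, which is what makes the format attractive for VR delivery.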

The research network will explore how new forms of immersive experience can be created to allow future audiences to move through the space during the performance and explore the relationship between the performance and location. The network will also examine how extra layers or “maps” of the site can be added to allow the audience to explore both the history and significance of the site as well as the process of creating the work. Finally, the vRSP network will explore how an immersive experience can go beyond what is possible in a live performance to allow the experience to respond to the presence of the audience.

Aims & Objectives

The primary aim of this project is to enable a diverse group of researchers and practitioners from Music, Sound Recording, Digital Media Arts, Electronic Engineering, and Computer Science to share insights and working practices through the creation and dissemination of a live, site-specific performance and the development of a number of immersive prototypes. The project will provide an opportunity for new partnerships to develop between university colleagues, a major UK composer, games companies, and SMEs working in immersive media production.

The project team aim to address the following questions:

  • How may immersive media be used to document and reproduce a performance in a manner that goes beyond existing video, film, and audio distribution methods?
  • Can a (re)creation of a site-specific performance within a virtual environment allow audience members to experience the performance in a manner not possible in a physical space or place?
  • As exploration of a place is often crucial to the experience of a site-specific work, how might the experience of moving around a physical space be recreated in a virtual environment?
  • How can the liveness of a performance be enhanced through the audience’s interactions with the places represented?

In order to achieve these aims, the project team propose to complete the following objectives:


To Date

  • Michael Price composed a new piece, Golden Line, which was performed live at St Giles’ Cripplegate by six musicians: a string quartet, solo cello, and solo soprano.
  • This performance was filmed in 6K stereoscopic 360-degree video with 24 tracks of audio, including three first-order Ambisonic sound fields.
  • The venue was scanned using a Faro HDS LiDAR scanner.
  • All musicians were filmed in stereo against a green screen and prepared for compositing into the final piece.
  • The soprano was captured as a 3D volumetric video model.

Post Production

The team are currently:

  • Mixing down the audio recordings into first-, second-, and third-order Ambisonic mixes, and testing different workflows and output formats.
  • Building a stand-alone VR application using the Unity 3D game engine, stereoscopic 360-degree video, LiDAR data, and volumetric scans to create a dynamic performance.
  • Developing a custom audio rendering system to achieve greater audio fidelity than is currently available within the Unity engine.
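Two properties of Ambisonic mixes are worth noting in connection with the mixing and rendering work above: an Nth-order full-sphere mix carries (N+1)² channels, and the representation can be rotated to follow a listener's head orientation, which is what makes it suitable for head-tracked VR playback. The sketch below illustrates both points in general terms; it is an assumption about how such a renderer typically operates, not a description of the project's custom rendering system, and sign conventions for the rotation vary between implementations.

```python
import math

def ambisonic_channels(order: int) -> int:
    """Channel count for a full-sphere Ambisonic mix of the given order."""
    return (order + 1) ** 2  # 1st order -> 4, 2nd -> 9, 3rd -> 16

def rotate_foa_yaw(w, x, y, z, yaw_deg):
    """Rotate one first-order B-format frame about the vertical axis.

    A head-tracked renderer applies a rotation like this every frame:
    W (omni) and Z (height) are invariant under yaw, while the horizontal
    components X and Y mix according to the rotation angle.
    """
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return w, c * x - s * y, s * x + c * y, z
```

For example, a source encoded straight ahead (X = 1, Y = 0) rotated by 90 degrees ends up entirely in the Y (left-right) component, exactly as if the listener had turned their head; the W and Z channels are untouched.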



Kirk Woolford Principal Investigator

Tony Myatt and Michael Price Co-Investigators

Sam Ziajka Research Assistant

The Golden Line

Tony Myatt Concept, sound recording and audio programming

Michael Price Concept, art direction, music composition

Kirk Woolford Concept, visual direction and visual programming

Sam Ziajka Concept, 360 video and audio recording

Jon Smart Iskra Strings – 1st Violin

James Underwood Iskra Strings – 2nd Violin

Laurie Anderson Iskra Strings – Viola

Charlotte Eksteen Iskra Strings – Cello

Peter Gregson Solo Cello

Heloise Werner Solo Soprano