People @ LIMSI:
Project in partnership with Conservatoire de Paris (CNSMDP)
The project investigates our perception of space through sound and images. The journey starts with a concert organized to celebrate the 850th anniversary of the Notre-Dame de Paris cathedral, in which a symphonic orchestra accompanied soloists and choirs through J. Massenet's oratorio “La Vierge” (The Virgin). Although brilliant, the performance was nonetheless ephemeral, offering this 19th-century piece, heard in an edifice of such peculiar acoustics, to only a privileged few.
In conjunction with the concert, the Conservatoire National Supérieur recorded the event, with each instrument section and each soloist carefully captured by close microphones. The Ghost Orchestra Project was launched to explore the possibility of recreating this concert for future spectators: providing a spatially accurate rendition of the performance and allowing one to navigate within the cathedral and experience different acoustic perspectives.
Combining research efforts in binaural audio from the FUI-BiLi project, digital-heritage acoustic recreations from the ANR-ECHO project, and the development of interactive virtual reality environments in the BlenderVR project, this work was a means to join forces and attempt a monumental undertaking.
The following is a presentation of the project prepared for the FISM 2015 conference. Although the conference was cancelled, we provide the presentation here.
The basic premise, as far as the audio element is concerned, was to take the close-mic'd audio tracks from the different musicians and replay them in a virtual acoustic reconstruction of the cathedral, recreating the proper spatial information regarding instrument positions and directivities and their relation to the room acoustics of the space. This effort required the construction of a geometrical acoustics model, calibrated against measurements also carried out in the cathedral. The goal was to ensure that the sound of an instrument playing at position A and heard by a listener at position B would be perceived exactly as in the original building (reverberation, position, presence, timbre, amplitude, etc.).
The three-dimensional room impulse response is then numerically computed for every instrument and every potential listener position. These responses are then *convolved* with the corresponding audio tracks and appropriately combined to create the Ghost Orchestra performance.
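The convolve-and-combine step can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the tracks and room impulse responses (RIRs) below are synthetic placeholders standing in for the close-mic'd recordings and the computed source-to-listener responses.

```python
# Minimal sketch: convolve each close-mic'd track with the RIR computed
# from that instrument's position to the listener's position, then sum
# the results to obtain one auralized channel at that listener position.
# All signals here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
fs = 48_000                   # sample rate in Hz
n_sources = 3                 # e.g. three instrument sections

# Placeholder close-mic'd tracks (1 s each).
tracks = [rng.standard_normal(fs) for _ in range(n_sources)]

# Placeholder RIRs (0.5 s each) with an exponential decay envelope,
# mimicking the reverberant tail of the cathedral.
rir_len = fs // 2
decay = np.exp(-np.arange(rir_len) / 5000.0)
rirs = [rng.standard_normal(rir_len) * decay for _ in range(n_sources)]

# Convolve each track with its RIR and mix at the listener position.
mix = np.zeros(fs + rir_len - 1)
for track, rir in zip(tracks, rirs):
    mix += np.convolve(track, rir)
```

In practice each source would need a full set of RIRs (one per listener position, or a multichannel spatial response rather than a single channel), and an FFT-based convolution would be used for efficiency at these signal lengths.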
The virtual reconstruction of the instruments in the cathedral during the performance can be compared to one of the tracks recorded at the conductor's position. If the acoustics and balance are perceived as appropriate at this position, we can be confident that listening at other, unrecorded positions will still provide a realistic virtual reconstruction.
These audio extracts are monophonic, representing the recording of the microphone placed over the conductor's position.
To accompany the virtual acoustic reconstruction, a 3D model of Notre-Dame was created (3DS Max / Blender), mirroring the visual appearance of the space. The various instruments were then modeled and positioned in the virtual cathedral so that visitors could visualize the different components of the orchestra playing in front of them.
The 3D virtual reconstruction can be explored interactively with an Oculus Rift head-mounted display (HMD). This real-time rendering is made possible by BlenderVR.
Alternatively, the virtual world can be pre-rendered at a higher resolution as a 360° video.
The immersive experience was adapted to different media, allowing for different degrees of interaction.
Free navigation in the scene with a head-mounted display + binaural acoustic rendering (3rd-order Ambisonics, decoded over 16 virtual speakers, rendered binaurally)
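The rendering chain named above (Ambisonics decoded to virtual speakers, then rendered binaurally) can be sketched as below. This is a hypothetical illustration under stated assumptions: the decoder matrix and head-related impulse responses (HRIRs) are random placeholders, where a real system would use a proper Ambisonic decoder design and measured HRIRs for the 16 virtual speaker directions.

```python
# Sketch of the binaural rendering chain: 3rd-order Ambisonics
# ((3+1)^2 = 16 channels) -> 16 virtual loudspeaker feeds -> left/right
# ears via per-speaker HRIR convolution. All data are placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_amb = 16          # 3rd-order Ambisonic channel count
n_spk = 16          # virtual loudspeakers
n_samples = 1024
hrir_len = 128

ambisonic = rng.standard_normal((n_amb, n_samples))
decode_matrix = rng.standard_normal((n_spk, n_amb)) / n_amb   # placeholder decoder
hrirs = rng.standard_normal((n_spk, 2, hrir_len)) * 0.1       # per-speaker L/R HRIRs

# Decode: each virtual speaker feed is a weighted sum of Ambisonic channels.
feeds = decode_matrix @ ambisonic            # shape (n_spk, n_samples)

# Binaural render: convolve each feed with its L/R HRIR, sum over speakers.
out = np.zeros((2, n_samples + hrir_len - 1))
for s in range(n_spk):
    for ear in range(2):
        out[ear] += np.convolve(feeds[s], hrirs[s, ear])
```

The virtual-speaker approach lets head rotation be handled once, on the compact Ambisonic representation, instead of re-convolving every source with new HRIRs.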
The following video presents a recording of this navigation. The user's head orientation is taken into account when updating both the visual and auditory rendering. The speed of the flying carpet can be controlled with a joystick.
Equirectangular 360° video played on a tablet or on YouTube. This off-line 360° rendering allows real-time orientation with a suitable viewer, providing a “steerable window” into the virtual world. The video is combined with either Ambisonic tracks or decoded Ambisonic virtual-speaker tracks. The 360° audio is then converted to binaural in real time, as a function of the window orientation, by an appropriate player. Such players have been developed within the FUI-BiLi project by partners Orange Labs and Arkamys.
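The orientation-dependent playback can be sketched as below. For simplicity the example rotates a first-order B-format signal (W, X, Y, Z) about the vertical axis; the project uses 3rd-order material, which requires higher-order rotation matrices, and the exact signs depend on the channel ordering and axis convention assumed here (X forward, Y left).

```python
# Sketch: rotate the Ambisonic sound field opposite to the head/window yaw
# before binaural decoding, so the scene stays fixed in the world.
# First-order B-format only; convention assumed: X forward, Y left, Z up.
import numpy as np

def rotate_bformat_yaw(wxyz, yaw_rad):
    """Rotate a first-order (W, X, Y, Z) signal block about the vertical axis.

    wxyz: array of shape (4, n_samples). W and Z are unaffected by yaw;
    X and Y transform with a 2D rotation matrix.
    """
    w, x, y, z = wxyz
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return np.stack([w, x_rot, y_rot, z])

# Placeholder block of B-format audio and a 90-degree turn.
sig = np.ones((4, 8))
rotated = rotate_bformat_yaw(sig, np.pi / 2)
```

In the players mentioned above, this rotation is recomputed continuously from the viewer orientation, and the rotated field is then decoded to binaural as in the HMD case.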