Recording the BBC Philharmonic orchestra in 3D
February 8, 2017

I’ve always been excited by the opportunities that 3D audio offers for reproducing orchestral music. From the stalls, the sound is fully immersive. The orchestra itself is wide and deep, the choir and organ are high above the stage, and the carefully designed concert hall acoustics reinforce the conductor’s balanced blend of all the sections. When S3A got the chance to join a BBC Radio 3 recording of the BBC Philharmonic orchestra at the Bridgewater Hall in Manchester, we leapt at it!

The main research question was this: how do you take a traditional recording of a symphony orchestra and produce it for 3D audio? Or at least, how do you minimally modify the traditional production workflow and still come out with a 3D mix? Being audio researchers, we took along some not-exactly-traditional microphones too, just in case.

The stereo mix for the Radio 3 recording was built around the feeds from a main Decca tree, which was lowered from the ceiling and suspended centrally just above and behind the conductor. Four other omnidirectional microphones were also hung in this way: a widely-spaced pair towards either end of the stage, to capture a blend of the instruments to either side of the orchestra; and a further widely-spaced pair, further back and high into the hall, to capture the ambience. Additional microphones were placed around the stage, positioned to capture each section rather than individual instruments, except for the piano, which was captured with its own spaced stereo pair.

In addition to the microphones used directly for the Radio 3 mix, Tom Parnell from BBC R&D added a compact Schoeps ORTF-3D array, a few metres behind the main Decca tree. The ORTF-3D allows the surrounding and height reverberation to be recorded, and controlled in a 3D mix by balancing among the different capsules. The S3A researchers added two Soundfield microphones, recording 1st-order Ambisonics. The first was positioned high above the main Decca tree, giving us a broad perspective on the whole orchestra and the room ambience, and the second was directly in front of the stage, very close to the conductor, giving us a close perspective with a spatially wide view of the orchestra. Finally, we placed an Eigenmike centrally in the 3rd row of the audience, at around the conductor’s head height. The Eigenmike can capture Ambisonic signals up to 3rd order using 32 microphones mounted on a sphere. Combined with 360-degree video, this kind of signal could be used directly for a virtual reality production, although producers usually prefer to retain more control over the mix before writing out to any given surround format (be it Ambisonics, loudspeaker channels, or binaural). These microphones could all feasibly be added to the recording workflow with minimal additional overheads.
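As an aside, the relationship between Ambisonic order and channel count is simple: a full-sphere signal of order N carries (N+1)² channels, so the 1st-order Soundfield recordings give 4 channels and the Eigenmike’s 3rd-order output gives 16 (derived from its 32 capsules). The sketch below illustrates this, along with the simplest possible “sampling” decoder for 1st-order B-format onto a horizontal loudspeaker ring. It is a toy illustration only, not the decoding used in the project, and it deliberately ignores normalisation conventions (SN3D vs. FuMa) and height channels:

```python
import numpy as np

def ambisonic_channels(order):
    """A full-sphere Ambisonic signal of a given order has (order + 1)**2 channels."""
    return (order + 1) ** 2

def decode_first_order(w, x, y, z, speaker_azimuths_deg):
    """Toy 'sampling' decoder: project 1st-order B-format (W, X, Y, Z signals)
    onto a horizontal ring of loudspeakers at the given azimuths.
    Normalisation conventions and the height (Z) channel are ignored."""
    feeds = []
    for az in np.radians(speaker_azimuths_deg):
        # Each loudspeaker feed samples the sound field in its own direction.
        feeds.append(0.5 * (w + np.cos(az) * x + np.sin(az) * y))
    return np.stack(feeds)

# 1st order (Soundfield) -> 4 channels; 3rd order (Eigenmike output) -> 16.
print(ambisonic_channels(1), ambisonic_channels(3))
```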

As we are also working on object-based reverberation techniques, we were keen to capture the room impulse responses between a loudspeaker placed on stage and the various microphones. Room impulse responses describe how the sound propagates from the loudspeaker to the microphones, and our work seeks to encode this into a set of parameters that an object-based renderer could interpret to maximise the listening experience for whatever audio reproduction system you have set up at home. In fact, even with microphones set up to capture the ambience, it is common practice to add artificial reverb to this kind of recording when it is broadcast in stereo. However, it is hard to produce a one-size-fits-all reverb when the end reproduction system is unknown, and this complicates production for 3D audio. In binaural reproduction, for instance, there is ‘space’ to create fully immersive reverberation, but if the same reverb were applied to stereo then the mix would likely be too reverberant. We will process the impulse responses to derive a set of parameters that aim to convey the concert hall’s reverb to the listener, no matter how they eventually consume the content.
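To make the parameter-extraction idea concrete, one standard textbook approach pulls a reverberation time out of a measured impulse response via Schroeder backward integration: integrate the squared impulse response from the tail backwards to get the energy decay curve, then extrapolate its slope to a 60 dB decay. The sketch below (numpy only; an illustrative standard method, not the S3A parameterisation itself) estimates RT60 this way, and also shows the simplest possible use of an RIR in rendering, convolving it with a dry source:

```python
import numpy as np

def schroeder_decay_db(rir):
    """Energy decay curve via Schroeder backward integration, in dB
    relative to the total energy of the impulse response."""
    energy = np.cumsum(rir[::-1] ** 2)[::-1]  # energy remaining after each sample
    energy /= energy[0]
    return 10 * np.log10(np.maximum(energy, 1e-12))

def rt60_from_rir(rir, fs):
    """Estimate RT60 by extrapolating the -5 dB to -25 dB decay (T20 method)."""
    edc = schroeder_decay_db(rir)
    i5 = np.argmax(edc <= -5.0)    # first sample below -5 dB
    i25 = np.argmax(edc <= -25.0)  # first sample below -25 dB
    slope = (edc[i25] - edc[i5]) / ((i25 - i5) / fs)  # dB per second (negative)
    return -60.0 / slope

def apply_reverb(dry, rir):
    """Simplest possible 'renderer': convolve a dry source with the measured RIR."""
    return np.convolve(dry, rir)
```

With an ideal exponentially decaying impulse response, `rt60_from_rir` recovers the decay time it was built with; on real measurements, band-filtered versions of the same procedure give per-band reverberation times.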

In the next few weeks we’ll be listening to the stereo mix, extracting reverb parameters from the room impulse responses, mixing the closer microphones into the microphone array recordings, and experimenting with building some fully object-based clips. After all that, hopefully we’ll be in a better position to answer the questions above!

Any questions? Suggestions? Have experience to share? Drop us a comment below!

  • Keith Gunn
    February 9, 2017

    Very interesting summary. I have always been concerned that there's no ideal place to listen to an orchestra. The conductor's much too close and the audience, mostly, too far away or too far below the plane of the stage. I look forward to hearing more, acoustically and in print.

  • Trevor Clarke
    January 10, 2018

    The idea of 3D means calibration to me. For an orchestra how it fills the space is interesting. Do you measure the various surfaces? And does colour have a part to play? Sound must have optimal pathways perhaps. Very interesting

  • Trevor Clarke
    June 22, 2018

    I feel the sound from an orchestra and I follow the music consciously almost at the same time, e.g. the hammer blows in Mahler's 6th. How do you create the perfect telling of a story with the sound without the listener's knowledge of the story?
