Week 2 – Virtual Spatialisation
- M108 on Wednesday at 2pm.
An introduction to creating Sound for New Media, featuring a presentation on Spatialisation and Sound Assets, followed by a workshop on recording multi-channel field recordings. We will be looking at working practices in professional VR production timelines and discussing your group project timelines. Follow-up: in your groups, discuss your roles in sound production and the integration of sound assets, and compile a preliminary list of the sound assets needed from each person.
- Upload your proposal to Padlet before group tutorials next week!
- Begin recording these sound assets for next week’s session.
Ties played the Cafe Oto game, ‘Otogarden’.
When I did game design at college I experimented with the same kind of things as in this demo. I remember replacing what would have been normal sound effects with instrument sounds and seeing how it changed the experience. It makes the player’s actions feel more rhythmic, as though they move the character or interact to a beat.

This (below) was a game I tried making many years ago, titled ‘Shadow of Ash’, in which I replaced the usual effects with traditional Japanese musical instruments. The idea was a platformer focused on escaping an earthquake and volcanic eruption (hence the title), with the player evading environmental dangers and malevolent spirits. I’m torn on this nowadays, as much as it was a labour of love, because it feels like appropriation, but I don’t really know. Unfortunately, I don’t have any video/audio examples as this was in 2020 or something.


We briefly looked at the ambisonic microphones.
We probably won’t be using the ambisonic microphones much, as our sounds are going to be spatialised to objects that the player interacts with. I’m unsure about the use of these microphones for 3D engines where the player moves around the environment, but for static scenes, like films, or for ‘blanket’ ambient soundscapes it does make sense (or for sounds that come from the player’s own ‘body’ in the virtual space).
I did see a good video by the YouTuber Benn Jordan (one of my favourites when it comes to sound/music ‘tech’ topics) where he used the RØDE NT-SF1 to capture an impulse response of a tunnel for a convolution reverb.
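Out of curiosity, here is roughly how that trick works: convolution reverb just convolves the dry signal with the recorded impulse response, so the space’s echoes get ‘stamped’ onto the sound. A minimal plain-C# sketch (direct convolution on raw sample arrays; I’m assuming the samples are already loaded, and a real implementation would use FFT-based convolution for speed):

```csharp
// Minimal direct convolution: wet[n + k] += dry[n] * ir[k].
// dry = the dry recording, ir = the impulse response captured in the space.
static float[] ConvolveReverb(float[] dry, float[] ir)
{
    var wet = new float[dry.Length + ir.Length - 1];
    for (int n = 0; n < dry.Length; n++)
        for (int k = 0; k < ir.Length; k++)
            wet[n + k] += dry[n] * ir[k];
    return wet;
}
```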
I already knew how audio is propagated in Unity, primarily as mono sounds played from AudioSource components attached to objects in the scene.
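As a rough sketch of what I mean (a minimal Unity script; the clip assignment and the distance values are just placeholders), setting an AudioSource’s spatial blend to fully 3D makes Unity attenuate and pan the mono sound relative to the listener:

```csharp
using UnityEngine;

// A minimal sketch: plays a mono clip as a 3D-positioned sound.
// Attach to any GameObject; assign a mono AudioClip in the Inspector.
public class SpatialSoundExample : MonoBehaviour
{
    public AudioClip clip;

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.spatialBlend = 1.0f;  // 1 = fully 3D, 0 = flat 2D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // distance falloff
        source.minDistance = 1.0f;   // full volume within 1 unit (placeholder)
        source.maxDistance = 25.0f;  // fades out towards 25 units (placeholder)
        source.loop = true;
        source.Play();
    }
}
```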
from the PowerPoint:
Sound is crucial when it comes to creating a believable VR experience. Spatialized audio replicates how sound waves interact with the environment as well as your head and ears so that you really feel like you’re in the virtual world.
• While video games and conventional film present audio on a plane, VR uses head-related transfer function (HRTF) to simulate how sound would reach the ears, which allows for a full 3D audio experience. The Rift headset is designed so that the sound you hear comes from the same direction as a given visual stimulus. Thanks to built-in headphones, that means designers can place sounds behind you as a prompt to turn toward something beyond your field of view.
• All of this gives developers the tools they need to make their experiences more realistic, expressive, and immersive.
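To make the HRTF point a bit more concrete: one of the simplest cues an HRTF encodes is the interaural time difference (ITD), the tiny delay between a sound reaching your near and far ear. A toy calculation using Woodworth’s classic approximation (real HRTFs are measured filter sets per ear and far richer than this; the head radius and speed of sound here are just standard ballpark values):

```csharp
using System;

// Toy interaural time difference (ITD) via Woodworth's approximation:
// ITD = (a / c) * (sin(theta) + theta), theta = source azimuth in radians.
// Real HRTFs also encode level and spectral differences; this is timing only.
class ItdExample
{
    const double HeadRadius = 0.0875;  // metres, average human head
    const double SpeedOfSound = 343.0; // m/s in air at ~20 °C

    static double ItdSeconds(double azimuthRadians) =>
        (HeadRadius / SpeedOfSound) *
        (Math.Sin(azimuthRadians) + azimuthRadians);

    static void Main()
    {
        // A source 90 degrees to one side gives the maximum delay.
        Console.WriteLine($"ITD at 90 degrees: {ItdSeconds(Math.PI / 2) * 1000:F2} ms");
    }
}
```

At 90° to the side this comes out around 0.66 ms, roughly the largest delay a listener ever experiences; spatialisers reproduce cues like this (plus level and spectral differences) to place sounds all around you.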