Oculus has introduced a handful of new features to its Rift SDK, all aimed at helping game developers create immersive VR experiences through high-quality, spatialized audio.
The company ran down each new addition in a post shared on the Oculus Blog and took a moment to detail its future plans for VR-centric audio as well.
Near-Field HRTF, one of the new additions to the Rift SDK, focuses primarily on improved sound directionality within one meter of the player.
Previous audio filters were mainly designed with sounds over a meter away from the player in mind and, as a result, didn’t offer a significant amount of accuracy for nearby sounds. Near-Field HRTF aims to address that and deliver more precise spatialized audio.
Additionally, the team has added Volumetric Sound Sources, which aim to model sounds as emanating from a region of space rather than a single point, scaling realistically with a source's size and position. The tech allows developers to model objects and assign the radius within which each object projects sound.
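To make the idea concrete, here is a minimal, illustrative sketch of how a volumetric source might attenuate with distance. This is not the Oculus Audio SDK API; the function name, falloff curve, and parameters are assumptions for demonstration only. The key behavior matches the description above: the sound plays at full level anywhere inside the assigned radius, and only begins falling off beyond the sphere's surface.

```python
import math

def volumetric_attenuation(listener_pos, source_pos, radius):
    """Illustrative only (hypothetical, not the Oculus API).

    A volumetric source occupies a sphere of the given radius.
    Inside the sphere the gain is 1.0 (full volume); outside,
    gain falls off with inverse distance measured from the
    sphere's surface rather than from its center point.
    """
    dx, dy, dz = (l - s for l, s in zip(listener_pos, source_pos))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist <= radius:
        return 1.0  # listener is inside the volumetric source
    # Inverse-distance falloff starting at the sphere surface
    return 1.0 / (1.0 + (dist - radius))
```

For example, a listener standing anywhere inside a 1-meter-radius source hears full volume, while one standing 3 meters from its center hears the source attenuated by the 2 meters separating them from its surface.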
Examples and more in-depth explanations of both of these developments can be found in the blog post.
Meanwhile, Oculus notes that it is still forging ahead to bring more realistic spatialized audio to VR. Right now that means working on sound reflections and exploring ways to help audio designers automatically create sound that matches an environment.
“The work we’re doing in this area makes what you perceive in VR feel like the real world without having to actually model the real world,” says Oculus’ software engineering manager Pete Stirling. “All these little cues that you’re used to having in real life—they aren’t there in VR. We’re slowly adding them in. Every little piece is cumulative, and it makes a huge difference.”