To make the quiet interesting, the audio team recorded hyper-realistic Foley (footfalls, breathing and movement textures) in our own pits. In most cases, this source material, created for all animals and surfaces, provides the necessary audio layers for animal presence. It also allowed sound designers to source vocalisations for key moments only, reducing reliance on often scarce recordings.
To avoid a loudness war with such delicate Foley work, audio code identifies when the camera moves inside a player-created enclosure (in 3D space). Location-informed data drives a Wwise implementation that can thin out the sounds of crowds, transport rides and ambiences to keep the quiet animal sounds focused and subtle. Player-relevant feedback, such as direct crowd reactions, is retained in the mix. This matters for gameplay: animals can get stressed by noise, so players need to build enclosures with quiet areas the animals can retreat to. Ray-casts verify whether those areas are indeed quiet, and the mix is presented accordingly.
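As a rough illustration of this approach (a minimal sketch, not Frontier's actual code), the fragment below shows how a camera-in-enclosure test could produce a ducking amount for the mix. EnclosureVolume, insideEnclosureAmount and updateMix are hypothetical names; in a Wwise project the resulting value would typically be sent to the mixer as a game parameter (RTPC) whose curves attenuate the crowd, ride and ambience busses.

```cpp
// Hypothetical sketch: duck park ambience when the camera enters a
// player-built enclosure. All names are illustrative.
#include <vector>

struct Vec3 { float x, y, z; };

// Axis-aligned stand-in for a player-built enclosure volume.
struct EnclosureVolume {
    Vec3 min, max;
    bool contains(const Vec3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

// 0 = camera outside any enclosure, 1 = fully inside.
float insideEnclosureAmount(const Vec3& camera,
                            const std::vector<EnclosureVolume>& enclosures) {
    for (const auto& e : enclosures)
        if (e.contains(camera)) return 1.0f;
    return 0.0f;
}

void updateMix(float insideAmount) {
    // In a Wwise project this value would be pushed as a game parameter
    // (RTPC) whose curve thins out crowd/ride/ambience busses while
    // leaving player-relevant feedback untouched.
    (void)insideAmount;
}
```

In practice the value would be smoothed over a few frames so the ducking fades in and out rather than snapping.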
That is our solution for staying close to the experience of an actual zoo: keeping player-relevant information front and centre in the mix while also presenting a delicate, hyper-realistic close-up of the animals, full of interesting sonic detail.
When implementing audio, sound designers classified vocalisations by priority, and a code-side voice-management system prioritises them accordingly. A hero emitter is used for specific, important animal calls, while park-wide emitters produce a background soundscape that is informative of animal behaviour. The player can always react to what they hear, even if they can't directly see it.
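A minimal sketch of what such priority-based culling could look like, assuming designer-assigned integer priorities and a fixed voice budget; VocalisationRequest and cullVoices are hypothetical names, not the shipped system.

```cpp
// Hypothetical sketch of priority-based voice management: designer-assigned
// priorities decide which vocalisations survive when the voice budget is
// exceeded. Names and limits are illustrative only.
#include <algorithm>
#include <cstdint>
#include <vector>

struct VocalisationRequest {
    uint32_t animalId;
    int priority;   // higher = more important (e.g. hero calls)
    float distance; // to the camera/listener
};

// Keep the most important voices within a fixed budget; ties are broken
// by proximity, so nearby animals win over distant ones.
std::vector<VocalisationRequest>
cullVoices(std::vector<VocalisationRequest> requests, size_t maxVoices) {
    std::sort(requests.begin(), requests.end(),
              [](const VocalisationRequest& a, const VocalisationRequest& b) {
                  if (a.priority != b.priority) return a.priority > b.priority;
                  return a.distance < b.distance;
              });
    if (requests.size() > maxVoices) requests.resize(maxVoices);
    return requests;
}
```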
Audio-triggered animation
It was a challenge to represent animal behaviour through sound, because animals are quiet in most situations, but not in all of them. An animal walking around is generally not trying to draw attention, but an animal walking towards food gets excited. In both situations the game uses the same base animation, but the context differs.
Audio, AI and animation worked closely together to present the right amount of vocalisation for each behaviour. When an animal spots food and audio code has identified the behaviour, it starts layering in "excited" sounds. Using callbacks, the animation system is informed of this behaviour and animation partials are triggered to visualise the vocalisations. With this system, animals can sound more excited as they walk towards food, while animation doesn't need to author additional (costly) walk-to-food variations.
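The sketch below illustrates the callback pattern described above, under the assumption of a simple per-animal update; AnimalAudio, VocalisationCallback and the partial name are hypothetical, not Frontier's actual code.

```cpp
// Hypothetical sketch of audio-triggered animation: when the audio side
// layers in an "excited" vocalisation, a callback tells the animation
// system to play a matching partial on top of the base walk cycle.
#include <cstdint>
#include <functional>
#include <string>

enum class Behaviour { Idle, Walk, WalkToFood };

// The animation side registers a handler; the audio side invokes it when
// a vocalisation starts, so visual partials line up with the sound.
using VocalisationCallback =
    std::function<void(uint32_t animalId, const std::string& partialName)>;

class AnimalAudio {
public:
    explicit AnimalAudio(VocalisationCallback onVocalise)
        : onVocalise_(std::move(onVocalise)) {}

    void update(uint32_t animalId, Behaviour behaviour) {
        if (behaviour == Behaviour::WalkToFood) {
            // Layer the excited sound (engine call omitted) and notify
            // animation so it can blend in a matching partial.
            onVocalise_(animalId, "Excited_Call");
        }
    }

private:
    VocalisationCallback onVocalise_;
};
```

The key design point is that the base walk animation never changes; the partial is layered on only when audio decides a vocalisation is playing.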
A similar system is also used for "chatting" primates or howling wolves. Audio keeps track of multiple animals, comparing their behaviours (and motivations) and adding layers of excitement to trigger the right sounds at the right interval.
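For the interval side, a minimal (hypothetical) sketch: a group-excitement value in [0, 1] is mapped to a shorter or longer gap before the next call. The function name and interval bounds are illustrative only.

```cpp
// Hypothetical sketch of group "chatter" scheduling: the more excited the
// group, the shorter the gap until the next call.
#include <algorithm>

// excitement in [0, 1]; returns seconds until the next vocalisation.
float nextCallInterval(float groupExcitement,
                       float minInterval = 1.5f,
                       float maxInterval = 12.0f) {
    const float e = std::clamp(groupExcitement, 0.0f, 1.0f);
    // Linear blend: calm groups chat rarely, excited groups call often.
    return maxInterval + e * (minInterval - maxInterval);
}
```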
It was one of our very first experiments with audio-triggered animation and a successful, close collaboration with the animation, game and AI teams.
Real-Time Obstruction and Reverb in a player-generated open world
A short introduction to real-time obstruction and environment filtering in Planet Zoo's large, open, player-created environments.
Planet Zoo features a dynamic, fast-moving, player-controlled camera and user-generated content. Terrain, objects and environments can be freely manipulated and changed. To emphasise those changes, early reflections, occlusion, and filtering of environmental audio and weather are informed by the position of the camera and by ray-casting against the surrounding environment in real time. It is a performant system (as opposed to fully simulating propagation) that is designed to emphasise the changes made by the player.
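A minimal sketch of the ray-casting idea, assuming an engine-provided ray query (stubbed here); obstructionAmount and its parameters are illustrative, not the shipped system. Casting several jittered rays instead of one lets thin geometry such as fences produce partial obstruction rather than an on/off result.

```cpp
// Hypothetical sketch of raycast-driven obstruction: fan rays out from the
// camera towards an emitter; the fraction of blocked rays becomes an
// obstruction amount that a middleware parameter could consume.
#include <vector>

struct Vec3 { float x, y, z; };

// Placeholder for the engine's ray query: a real game would test the
// physics scene between `from` and `to`.
bool raycastBlocked(const Vec3& from, const Vec3& to) {
    (void)from; (void)to;
    return false;
}

// Cast several jittered rays so thin gaps (fences, foliage) produce
// partial rather than binary obstruction.
float obstructionAmount(const Vec3& camera, const Vec3& emitter,
                        const std::vector<Vec3>& jitterOffsets) {
    if (jitterOffsets.empty()) return 0.0f;
    int blocked = 0;
    for (const auto& o : jitterOffsets) {
        const Vec3 target{emitter.x + o.x, emitter.y + o.y, emitter.z + o.z};
        if (raycastBlocked(camera, target)) ++blocked;
    }
    return static_cast<float>(blocked) /
           static_cast<float>(jitterOffsets.size());
}
```

The resulting value would then drive per-emitter filtering or obstruction in the audio middleware, updated as the camera moves.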
[GENRE] | Construction and Management Simulation (CMS)
[DURATION] | 2017-2024
[ROLE] | Principal Audio Designer / Project Audio Lead
[DEVELOPER] | Frontier Developments
[PLATFORM] | PC, PS5, Xbox Series S, Xbox Series X
Audio Team: