(2023 / Head of Audio / Frontier Developments)
Official Website: Planet Zoo: Console Edition
[GENRE] | CMS (Construction and Management Simulation) |
[DURATION] | 2017-2024 |
[ROLE] | Audio Lead |
[DEVELOPER] | Frontier Developments |
[PLATFORM] | PC, PS5, Xbox Series S, Xbox Series X |
Audio Team
Using data provided by the audio tech team, I implemented a Wwise game mix that thins out human-presence layers so that the sounds of crowds, transport rides and ambiences sit below the mix levels of our focused and subtle animal sounds. Player-relevant feedback, such as direct crowd reactions to animal behaviours, is of course retained in the mix.
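As a minimal sketch of how such a mix control could be driven from code, assuming a single game-side RTPC (here named "AnimalFocus") mapped to ducking curves on the crowd, transport and ambience busses in the Wwise project; the name and shaping are illustrative, not the shipped setup:

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <algorithm>

// animalFocus: 0.0 = no animal of interest near the camera,
//              1.0 = camera framed tightly on a quiet animal.
void UpdateHumanPresenceMix(float animalFocus)
{
    animalFocus = std::clamp(animalFocus, 0.0f, 1.0f);

    // Global RTPC (name assumed); project-side curves pull the human-presence
    // busses down below the animal busses as this value rises.
    AK::SoundEngine::SetRTPCValue("AnimalFocus", animalFocus);
}
```

Keeping the game-side code down to a single parameter like this leaves the actual ducking amounts in the hands of the mixer, inside the Wwise project.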
Further to the mix change, which benefits delicate sound recordings, it was also important to inform players about the stress that human noise causes animals. Enclosures need to provide animals with quiet areas so that they can retreat to them. Deferred ray-casts originating from the camera identify these quiet areas and thin out crowds and other disturbances even further.
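A rough illustration of the deferred ray-cast idea, with a stand-in raycast callable instead of the engine's real physics query: a few rays are cast per frame, and once a full sweep completes the quiet-area ratio feeds an assumed "QuietAreaAmount" RTPC.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <functional>
#include <vector>

struct Vec3 { float x = 0.f, y = 0.f, z = 0.f; };

class QuietAreaProbe
{
public:
    // raycast: hypothetical stand-in for the engine's physics query; returns
    // true if a ray from the camera in the given direction ends in a quiet volume.
    using RaycastFn = std::function<bool(const Vec3& origin, const Vec3& dir)>;

    explicit QuietAreaProbe(std::vector<Vec3> directions)
        : m_directions(std::move(directions)) {}

    // Called once per frame; only a few rays per call ("deferred"), with the
    // RTPC updated each time a full sweep of directions completes.
    void Update(const Vec3& cameraPos, const RaycastFn& raycast)
    {
        constexpr int kRaysPerFrame = 4;
        for (int i = 0; i < kRaysPerFrame && !m_directions.empty(); ++i)
        {
            if (raycast(cameraPos, m_directions[m_next]))
                ++m_hits;

            if (++m_next == m_directions.size())
            {
                // Fraction of rays that found quiet space around the camera.
                const float quiet = float(m_hits) / float(m_directions.size());
                AK::SoundEngine::SetRTPCValue("QuietAreaAmount", quiet);
                m_next = 0;
                m_hits = 0;
            }
        }
    }

private:
    std::vector<Vec3> m_directions;
    size_t m_next = 0;
    int m_hits = 0;
};
```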
Making quiet sounds more audible is Planet Zoo's audio solution for staying close to the experience of a real zoo while keeping player-relevant information front and centre in a mix that is full of interesting sonic detail.
The audio team classified and ranked vocalisations; combined with a code-side voice management system, calls are prioritised so that player-relevant information is never lost. A hero-emitter is used to play specific, important animal calls on park-wide emitters. This produces a background animal soundscape that is informative about real-world animal behaviour and locations. Players can then react to what they hear: even if they can't see the animal directly, they know in which direction to look.
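A simplified sketch of what such a code-side prioritisation pass might look like, assuming each candidate call carries a designer-authored rank and that the single most important call is routed through a dedicated park-wide hero emitter; the rank field, voice cap and hero object ID are placeholders, not the shipped data.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <algorithm>
#include <vector>

struct VocalisationRequest
{
    AkUniqueID     eventId;  // Wwise event for this call
    AkGameObjectID animal;   // emitter registered at the animal's position
    int            rank;     // higher = more player-relevant
};

constexpr AkGameObjectID kHeroEmitter = 9001; // assumed pre-registered object

void PostRankedVocalisations(std::vector<VocalisationRequest>& requests,
                             size_t maxVoices)
{
    // Highest rank first, so the most informative calls survive the voice cap.
    std::sort(requests.begin(), requests.end(),
              [](const VocalisationRequest& a, const VocalisationRequest& b)
              { return a.rank > b.rank; });

    for (size_t i = 0; i < requests.size() && i < maxVoices; ++i)
    {
        // Most important call goes park-wide via the hero emitter;
        // the rest play positionally on their own animals.
        const AkGameObjectID emitter = (i == 0) ? kHeroEmitter
                                                : requests[i].animal;
        AK::SoundEngine::PostEvent(requests[i].eventId, emitter);
    }
}
```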
Animals are generally quiet, not wanting to attract attention, but not always. An ambling animal is quiet, but when it senses food it gets excited. In both situations the game moves the animal using the same base animation, yet the context for its audio behaviour is very different.
Audio, AI and animation worked closely together to present the correct types of vocalisation for each behaviour. When an animal spots food and audio code has identified this behaviour, additional layers of "excited" sounds are added. Using Wwise callbacks, the animation system triggers partial animations to visualise those vocalisations. With this system, animals sound more excited as they walk towards food, without animation having to create additional (costly) variations for every animal.
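As an illustration of the callback flow (not Frontier's actual code): markers authored on the vocalisation assets can fire Wwise AK_Marker callbacks, and the callback can then ask a hypothetical animation interface to play the matching partial animation.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <AK/SoundEngine/Common/AkCallback.h>
#include <cstdio>

// Hypothetical hook into the animation system; stubbed here so the sketch
// compiles. In the real game this would request a partial (e.g. mouth) pose.
struct AnimationSystem
{
    void PlayPartial(AkGameObjectID animal, const char* partialName)
    {
        std::printf("animal %llu -> partial '%s'\n",
                    static_cast<unsigned long long>(animal), partialName);
    }
};

struct VocalisationCookie
{
    AnimationSystem* anim;
    AkGameObjectID   animal;
};

// Fired by Wwise when playback passes a marker authored on the vocalisation.
static void OnVocalisationMarker(AkCallbackType type, AkCallbackInfo* info)
{
    if (type != AK_Marker)
        return;

    auto* marker = static_cast<AkMarkerCallbackInfo*>(info);
    auto* cookie = static_cast<VocalisationCookie*>(marker->pCookie);

    // The marker label names the partial to play, e.g. "call_open_mouth".
    cookie->anim->PlayPartial(cookie->animal, marker->strLabel);
}

// Post the "excited" vocalisation and subscribe to its markers.
void PlayExcitedVocalisation(AkUniqueID eventId, VocalisationCookie* cookie)
{
    AK::SoundEngine::PostEvent(eventId, cookie->animal, AK_Marker,
                               &OnVocalisationMarker, cookie);
}
```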
A similar system is used for chatter between primates or howling wolves. Audio keeps track of multiple animals, comparing their behaviours (and motivations) and adding layers of excitement to trigger the right sounds, crucially at believable intervals.
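A small sketch of the interval side of that idea, under assumed thresholds and timing: when at least two group members share an excited behaviour, one of them calls, then the group waits a randomised, believable gap before the next call.

```cpp
#include <AK/SoundEngine/Common/AkSoundEngine.h>
#include <random>
#include <vector>

struct GroupMember
{
    AkGameObjectID emitter;
    float          excitement;   // 0..1, derived from behaviour/motivation
};

class GroupChatterScheduler
{
public:
    void Update(float deltaTime, const std::vector<GroupMember>& group,
                AkUniqueID chatterEvent)
    {
        m_cooldown -= deltaTime;
        if (m_cooldown > 0.f || group.size() < 2)
            return;

        // Count members excited enough to "answer" each other.
        std::vector<const GroupMember*> excited;
        for (const auto& m : group)
            if (m.excitement > 0.5f)          // assumed threshold
                excited.push_back(&m);

        if (excited.size() < 2)
            return;

        // One member calls; the rest stay quiet until the next interval.
        std::uniform_int_distribution<size_t> pick(0, excited.size() - 1);
        AK::SoundEngine::PostEvent(chatterEvent, excited[pick(m_rng)]->emitter);

        // Believable gap before the next call: 4-10 seconds, assumed range.
        std::uniform_real_distribution<float> gap(4.f, 10.f);
        m_cooldown = gap(m_rng);
    }

private:
    float        m_cooldown = 0.f;
    std::mt19937 m_rng{std::random_device{}()};
};
```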
Audio-triggered animations represent a successful and close cross-discipline collaboration between the audio, animation, game and AI teams.