Epic just wrapped its State of Unreal showcase at GDC 2023, and the company showcased a number of tools that it says will help developers of all sizes create richer game worlds.
The Fortnite maker began the show with a glimpse at new tools heading to Unreal Engine 5.2, which it says 77 percent of its developers are now using. These include a new Substrate shading system that will allow developers to build more photorealistic environments by composing and layering different shading models to achieve "levels of fidelity and quality" the company claims were previously a pipe dream.
Substrate is shipping in 5.2 as an experimental feature alongside new in-editor and runtime Procedural Content Generation (PCG) tools that will allow artists to define rules and parameters to quickly populate expansive, highly detailed spaces that are "art-directable."
During the demo, Epic showed how developers would be able to lean on the tools to turn a purpose-built 200-meter environment into a 4-kilometer play space.
Breaking down how the PCG tools work on stage, Epic's vice president of engineering Nick Penwarden explained that generated elements communicate with other nearby elements in the scene, such as a creek bed, to flesh out the world in a way that makes sense.
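Epic didn't walk through the PCG graph API on stage, but the core idea of rules and parameters driving placement, with existing scene features acting as constraints, can be illustrated with a minimal, hypothetical sketch. Nothing below comes from Unreal Engine itself; the function names, the rectangle-based exclusion zone standing in for a creek bed, and the numbers are all invented for illustration.

```python
import random

def scatter_assets(count, area, exclusion_zones, seed=0):
    """Randomly place `count` assets inside `area`, skipping exclusion zones.

    area: (width, height) in meters.
    exclusion_zones: list of (x_min, y_min, x_max, y_max) rectangles,
    e.g. a creek bed the placement rule must respect.
    """
    rng = random.Random(seed)  # seeded so results are reproducible
    placed = []
    while len(placed) < count:
        x = rng.uniform(0, area[0])
        y = rng.uniform(0, area[1])
        # Rule: never spawn vegetation inside an excluded scene feature.
        if any(x0 <= x <= x1 and y0 <= y <= y1
               for (x0, y0, x1, y1) in exclusion_zones):
            continue
        placed.append((x, y))
    return placed

# A 200 m x 200 m tile with a 20 m-wide creek running down the middle.
trees = scatter_assets(500, (200, 200), [(90, 0, 110, 200)])
```

Changing the rule set or the parameters (density, area, zones) regenerates the layout instantly, which is roughly what makes this style of tooling "art-directable": the artist edits rules rather than placing each asset by hand.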
The company showed off the results by driving a photorealistic Rivian R1T all-electric truck through an environment densely populated with trees and lush vegetation built with Quixel Megascans, and towering, craggy rock structures built with Quixel MegaAssemblies, before revealing that a huge part of that world had been constructed using PCG.
Ultimately, Penwarden says the tools are designed to empower artists while drastically reducing the time it takes to create expansive, photogenic worlds. Unreal Engine 5.2 is available today in preview form via the Epic Games Launcher.
Human after all
In what was perhaps the most impressive part of the showcase, Epic also debuted its new MetaHuman Animator technology, and showed how the tool can be used to create high-quality animations in minutes.
As the name implies, MetaHuman Animator is a new feature set for the MetaHuman framework that enables developers to reproduce facial performances as high-fidelity animations on MetaHuman characters.
Epic explained MetaHuman Animator is a tool that "anyone can pick up and use," regardless of how much animation experience they have, and will still deliver triple-A quality results. For instance, anyone looking to use the tech will be able to capture a performance with their iPhone, before transferring its detail and nuance onto a MetaHuman in no time at all.
During the showcase, Epic demoed the technology by filming a live performance by Melina Juergens, known for playing Senua in Ninja Theory's acclaimed Hellblade franchise, and quickly recreating it in the digital space.
Epic explained that MetaHuman Animator is capable of generating a facial rig from just three frames of footage, and can generate animations from both video and audio. Animations can also be applied to other MetaHumans, letting developers transplant performances onto different characters.
MetaHuman Animator doesn't have a firm launch date yet, but Epic suggested it'll be out in the wild in a "couple of months." You can find out more about Epic's GDC 2023 announcements over on the Unreal Engine blog.