Epic Games gave the world a sneak peek at the future of gaming and cinema, flaunting massive improvements to its Unreal Engine. The advancements make footage look so realistic it's almost scary. During the State of Unreal presentation at the Game Developers Conference in San Francisco this week, Epic Games offered a glimpse of the Unreal Engine's new capabilities with a Star Wars video, a reptilian alien voiced by Andy Serkis, and a "digital human" named Siren.
Epic Games demonstrated its real-time ray tracing with a cutscene from the Star Wars universe. The Unreal Engine recreated strikingly realistic real-time light reflections, rendering, and cinematic effects. In the beautiful video, you can see light bouncing off the Stormtroopers and Captain Phasma as they ride an elevator. The real-time ray tracing in Unreal Engine was achieved using Nvidia's RTX and Microsoft's new DirectX Raytracing (DXR) API. Epic Games will make real-time ray tracing available to Unreal Engine developers later this year.
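At its core, ray tracing simulates light by following rays as they bounce off surfaces. The single formula behind the mirror-like reflections in the demo can be sketched in a few lines; this is an illustrative sketch of the standard reflection formula, not Epic's or DXR's actual code:

```python
def reflect(d, n):
    """Reflect incoming ray direction d about a unit surface normal n,
    using the standard formula r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A ray travelling straight down onto a floor (normal pointing up)
# bounces straight back up.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

A real-time ray tracer evaluates this kind of bounce for millions of rays per frame, which is why dedicated hardware like Nvidia's RTX is needed to do it at interactive speeds.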
Then there was a demonstration of how powerful the Unreal Engine's facial-animation capabilities are: a video featuring a dragon-like alien named Osiris, voiced by Andy Serkis. The Hollywood actor is no slouch at playing creatures; Serkis voiced Supreme Leader Snoke in Star Wars: The Force Awakens and, of course, played Gollum in The Lord of the Rings trilogy. The video captures every twitch and every blink with incredible realism.
Finally, there was a video showing a "digital human": a real woman rendered as a digital character in real time. Epic Games, in collaboration with Cubic Motion, 3Lateral, Tencent, and Vicon, created Siren, a digital version of a real person, using Epic's Unreal Engine 4 technology and 3Lateral's Meta Human Framework. Actors can play digital humans in video games and movies that are so realistic, you'll swear they're real people. The digital human can react in real time, making it uncomfortably real.
“We are offering the keys to unlock a virtual world, enabling content producers and game developers to more easily interact with our technology and streamline the creation process for performance driven real-time digital humans,” said Andy Wood, Chairman of Cubic Motion, in a statement. “By 2020, this will no doubt transform content production across the board by making this technology universally available. By 2024, we may all be interacting with digital humans in some way or other, whether it’s via headsets, films, TV, games, live performances and broadcasts, or by directing digital assistants in our homes in real-time.”