
Epic Games, in collaboration with NVIDIA and ILMxLAB, today gave the first public demonstration of real-time ray tracing in Unreal Engine. Real-time ray tracing is considered a holy grail for those creating high-end cinematic imagery, one that signifies a leap forward in the convergence of film and games.


During today’s “State of Unreal” opening session at the Game Developers Conference (GDC), the three companies presented an experimental cinematic demo using Star Wars characters from The Force Awakens and The Last Jedi built with Unreal Engine 4. The demonstration is powered by NVIDIA’s RTX technology for Volta GPUs, available via Microsoft’s DirectX Ray Tracing API (DXR). An iPad running ARKit is used as a virtual camera to draw focus to fine details in up-close views.

Epic built the computer-generated (CG) scene using assets from Lucasfilm’s Star Wars: The Last Jedi, featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her on an elevator on the First Order ship. In the tech demo, lighting is moved around the scene interactively as the ray-traced effects, including shadows and photorealistic reflections, render in real time. Highly reflective surfaces and soft shadows have never before been rendered at this level of image fidelity in Unreal Engine.

Next-generation rendering features shown in today’s demo include:

  • Textured area lights
  • Ray-traced area light shadows
  • Ray-traced reflections
  • Ray-traced ambient occlusion
  • Cinematic depth of field (DOF)
  • NVIDIA GameWorks ray tracing denoising
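For readers unfamiliar with how ray-traced area light shadows differ from traditional shadow maps, the following is a minimal, hypothetical C++ sketch of the underlying idea: shadow rays are cast from a shaded point toward random sample points on a rectangular light, and the averaged visibility yields the soft penumbrae seen in the demo. It is a conceptual illustration only, not code from Unreal Engine, NVIDIA RTX, or the DXR API; the scene, names, and values are invented for the example.

```cpp
// Illustrative sketch: soft shadows from a rectangular area light, computed by
// casting shadow rays toward random sample points on the light and averaging
// visibility. Conceptual CPU example only; not Epic's or NVIDIA's code.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double length(Vec3 a) { return std::sqrt(dot(a, a)); }

// A single spherical occluder standing in for scene geometry.
struct Sphere { Vec3 center; double radius; };

// Returns true if the ray from `origin` toward `target` hits the sphere
// before reaching the target (i.e. the light sample is occluded).
bool shadowRayBlocked(Vec3 origin, Vec3 target, const Sphere& s) {
    Vec3 dir = sub(target, origin);
    double maxT = length(dir);
    dir = {dir.x / maxT, dir.y / maxT, dir.z / maxT};
    Vec3 oc = sub(origin, s.center);
    double b = dot(oc, dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;
    if (disc < 0.0) return false;          // ray misses the sphere entirely
    double t = -b - std::sqrt(disc);       // nearest intersection distance
    return t > 1e-4 && t < maxT;           // hit lies between the point and the light
}

int main() {
    // Rectangular area light in the plane y = 4, spanning [-1,1] x [-1,1] in x/z.
    Sphere occluder{{0.0, 2.0, 0.0}, 0.75};
    Vec3 shadedPoint{0.3, 0.0, 0.1};        // point on the ground being shaded

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(-1.0, 1.0);

    const int samples = 256;
    int visible = 0;
    for (int i = 0; i < samples; ++i) {
        Vec3 lightSample{uni(rng), 4.0, uni(rng)};   // random point on the area light
        if (!shadowRayBlocked(shadedPoint, lightSample, occluder)) ++visible;
    }

    // A fractional visibility between 0 and 1 is what produces soft penumbra
    // edges, as opposed to the hard on/off result of a single shadow test.
    std::printf("visibility = %.3f\n", double(visible) / samples);
    return 0;
}
```

In a real-time path such as the one demonstrated here, the same kind of shadow rays would be traced in GPU-accelerated shaders via DXR, with only a handful of samples per pixel, and the resulting noisy visibility estimate cleaned up by a denoiser such as the NVIDIA GameWorks ray tracing denoising listed above.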

“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Epic Games Founder and CEO Tim Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”

“At ILMxLAB, our mission is to create real-time rendered immersive experiences that let audiences step into our stories and connect with cinematic universes that look and feel as real as the ones on the movie screen. With the real-time ray-tracing technology that Epic and NVIDIA are pioneering, we are a pivotal step closer to that goal,” said Mohen Leo, ILMxLAB Director of Content and Platform Strategy.

Epic Games worked closely with NVIDIA to support the NVIDIA RTX technology available through the DXR API. Running on an NVIDIA DGX Station, the demo was brought to life via a collaboration between Epic’s dedicated graphics and engine team, NVIDIA’s world-class ray tracing experts and the technical ingenuity and creative artistry of ILMxLAB.

“Real-time ray tracing has been a dream of the graphics and visualization industry for years. It’s been thrilling to work with the talented teams at Epic and ILMxLAB on this stunning real-time ray tracing demonstration,” said Tony Tamasi, senior vice president of content and technology at NVIDIA. “With the use of NVIDIA RTX technology, Volta GPUs and the new DXR API from Microsoft, the teams have been able to develop something truly amazing that shows that the era of real-time ray tracing is finally here.”

A one-hour deep dive into how this scene was created and rendered will be presented today at 11 AM PT in the “Cinematic Lighting in Unreal Engine” session at the Yerba Buena Center for the Arts Theater and livestreamed to /UnrealEngine on YouTube, Facebook and Twitch.

The scene is running in real time at NVIDIA’s GDC booth, South 223, where show attendees can observe the content in Playback, Lit, Ray Tracing and Wireframe modes.

Epic Games will make support for NVIDIA RTX technology using the DXR API available to Unreal Engine developers later this year.
