Do you like games? Do you like jargon? Well then you'll love ambient occlusion, morphological anti-aliasing, adaptive vertical synchronization, and real-time ray tracing.
All of the above were at one time the latest obtuse term for some complicated technology that's been hyped as the next leap in gaming graphics. Except now, the last one might actually be truly revolutionary.
Ray tracing achieved buzzword status at this week's Electronic Entertainment Expo (E3). Game announcements made by Microsoft, NVIDIA, and AMD at the big gaming show were peppered with promises that their upcoming releases will bring this miraculous technology into our homes.
"I think it's paradigm shifting," says AJ Christensen, a visualization programmer at the National Center for Supercomputing Applications. "There's a lot of stuff that we've been waiting to be able to do. The imagination far precedes the technology, and I think a lot of people are excited and waiting for it."
So what makes ray tracing so potentially (ahem) game-changing? Let's break it down.
Simply put, ray tracing is a technique that makes light in video games behave like it does in real life. It works by simulating actual light rays, using an algorithm to trace the path that a beam of light would take in the physical world.
Using this technique, game designers can make virtual rays of light appear to bounce off objects, cast realistic shadows, and create lifelike reflections.
First conceptualized in 1969, ray-tracing technology has been used for years to simulate realistic lighting and shadows in the film industry. But even today, the technology requires considerable computing power.
"A game needs to run 60 frames per second, or 120 frames per second, so it needs to compute each frame in 16 milliseconds," says Tony Tamasi, vice president of technical marketing at graphics card developer NVIDIA. "Whereas a typical film frame is pre-rendered, and they can take eight or twelve or twenty-four hours to render a single frame."
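The frame budgets Tamasi mentions fall straight out of the frame rate; here is a quick sketch of the arithmetic (note that 60 fps actually allows about 16.7 milliseconds per frame, and 120 fps roughly half that):

```python
# Time budget per frame at common refresh rates. A renderer must finish
# everything (geometry, shading, any ray tracing) inside this window,
# or the game drops frames.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps  # milliseconds available per frame
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")
```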
This newfound excitement around ray tracing comes just as home gaming hardware is on the cusp of being able to process lighting effects in real time. The graphics chips that will go into the next generation of gaming PCs and videogame consoles should be powerful enough to achieve the rendering power needed to produce ray-traced scenes on the fly. When that happens, it could result in a tectonic shift for visuals in gaming.
If you look at the way light works in video games now, it might seem like all the elements are there: reflections, shadows, bloom, lens flare. But all that is just sophisticated trickery. Programmers can prerender light effects (even with some ray tracing), but these are baked into the scene: essentially just packaged animations that always play out the same way. These effects can look quite convincing, but they're not dynamic.
"The problem with that is that it's completely static," Tamasi says. "Unless you render in real time, the lighting is just going to be wrong."
If the player alters the environment by, for example, blasting a hole through a wall, the light in the scene won't change to stream through that hole unless the developers have specifically planned for that possibility. With real-time ray tracing, the light would adjust automatically.
In real life, light comes to you. Waves made up of countless little photons shoot out of a light source, bounce across and through a variety of surfaces, then smack you right in the eyeballs. Your brain then interprets all these different rays of light as one complete picture.
Ray tracing functions nearly the same way, except that everything generally moves in the opposite direction. Inside the software, ray-traced light begins at the viewer (from the camera lens, essentially) and moves outward, plotting a path that bounces across multiple objects, sometimes even taking on their color and reflective properties, until the software determines the appropriate light source(s) that would affect that particular ray. This technique of simulating vision backwards is far more efficient for a computer to handle than trying to trace the rays from the light source. After all, the only light paths that need to be rendered are the ones that fit into the user's field of view. It takes far less computing power to display what's in front of you than it would to render the rays emitted from all sources of light in a scene.
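For the curious, the camera-outward idea above can be sketched in a few dozen lines of Python. This is a toy, not a real renderer: one sphere, one point light, one ray per "pixel," and every scene value is invented for illustration. Each ray starts at the camera, and if it hits the sphere, the code follows the path back toward the light to shade the hit point.

```python
import math

# Toy backward ray tracer: rays start at the camera and travel out into
# the scene, exactly opposite to how photons travel in real life.
# All scene values here are made up for illustration.

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None on a miss.

    Assumes `direction` is normalized, so the quadratic's leading
    coefficient is 1.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

camera = (0.0, 0.0, 0.0)
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
light = (5.0, 5.0, 0.0)

rows = []
for y in range(-5, 6):
    row = ""
    for x in range(-5, 6):
        # One ray per "pixel," aimed through an image plane at z = -1.
        ray = normalize((x / 5, y / 5, -1.0))
        t = hit_sphere(camera, ray, sphere_center, sphere_radius)
        if t is None:
            row += "."  # the ray escaped the scene: background
        else:
            # Trace back toward the light source to shade the hit point.
            point = tuple(c + t * d for c, d in zip(camera, ray))
            to_light = normalize(tuple(l - p for l, p in zip(light, point)))
            normal = normalize(tuple(p - c for p, c in zip(point, sphere_center)))
            brightness = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
            row += "#" if brightness > 0.5 else "+"
    rows.append(row)
print("\n".join(rows))  # a crude ASCII picture of a lit sphere
```

A real renderer would also bounce each ray off surfaces for reflections, and fire extra "shadow rays" to test whether the light is blocked; this sketch only does the first camera-to-surface-to-light hop.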
Still, that's not to say it's easy.
"Thousands of billions of photons enter your eye every second," says Christensen, the visualization programmer. "That's way more than the number of calculations a computer can do per second. So there's a lot of optimizing and efficiency and hacking that needs to happen in order to even begin to make something look realistic."
Rather than try to map out every single ray of light, the solution for developers at NVIDIA is to trace only a select number of the most important rays, then use machine learning algorithms to fill in the gaps and smooth everything out. It's a process called denoising.
"Rather than shooting hundreds or thousands of rays per pixel, we'll actually shoot a few, or maybe a few dozen," Tamasi says. "So we use different classes of denoisers to assemble the final image."
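NVIDIA's actual denoisers are trained neural networks, but the general idea can be sketched with a crude stand-in: estimate each pixel from just a handful of random samples, then smooth the noisy result with a filter. Everything below (the noise range, the sample count, the 3-tap box blur) is an invented toy, not NVIDIA's method.

```python
import random

random.seed(0)  # deterministic "noise" for the demo

TRUE_BRIGHTNESS = 0.6   # the value a full render would converge to
SAMPLES_PER_PIXEL = 8   # "a few, or maybe a few dozen" rays, not thousands

def noisy_pixel():
    """Average a handful of noisy one-ray estimates of the same pixel."""
    samples = [TRUE_BRIGHTNESS + random.uniform(-0.3, 0.3)
               for _ in range(SAMPLES_PER_PIXEL)]
    return sum(samples) / len(samples)

# A 1D "scanline" of pixels, each estimated from only a few samples.
scanline = [noisy_pixel() for _ in range(16)]

def denoise(pixels):
    """Stand-in denoiser: a 3-tap box blur. Real denoisers are learned models."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

smooth = denoise(scanline)
worst_before = max(abs(p - TRUE_BRIGHTNESS) for p in scanline)
worst_after = max(abs(p - TRUE_BRIGHTNESS) for p in smooth)
print(f"worst pixel error before denoising: {worst_before:.3f}")
print(f"worst pixel error after denoising:  {worst_after:.3f}")
```

The trade-off is the same one Tamasi describes: fewer rays means each raw pixel is wrong by a little, and the denoiser's job is to pull the image back toward what a fully traced render would have produced.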
Real-time ray tracing is already here, kind of. If you have a PC that can handle it, it's available in a few current games such as Battlefield V, Metro Exodus, and Shadow of the Tomb Raider, as well as upcoming titles like Cyberpunk 2077 and Wolfenstein: Youngblood.
NVIDIA introduced ray-tracing capabilities last year with the release of its RTX graphics card line, so your PC would need one of those to properly take advantage of the technology. Current consoles, like the Xbox One and PlayStation 4, don't have the hardware to pull it off.
For those of us who are unwilling or unable to cough up between $350 and $1,500 for a graphics card, ray tracing will also be supported by the next generation of game consoles, specifically the PlayStation 5 and Microsoft's mysteriously named Xbox One successor, Project Scarlett.
The potential might be exciting, but it will still be a few years before the tech becomes standard. Real-time ray tracing is still in its adolescence and has proven to be a little temperamental. And as the hardware improves, developers and designers will have to keep up.
"It's a new tool in the toolbox," Tamasi says. "We have to learn to use that new tool properly. There's going to be a whole new class of techniques that people develop."