In the 1990s, software and hardware companies began to tout ray tracing, but like Atari's claim that the Jaguar was a 64-bit machine because it had two 32-bit processors, it was mostly marketing.
Ray tracing works because light travels in straight lines and rays don't interfere with one another on their way to the eye. That physics premise is the foundation of ray tracing in computers, which builds on Turner Whitted's 1980 paper showing the idea only becomes efficient if you reverse what light actually does: rather than following every ray from the light source (most of which never reach the eye at all), you trace rays backward from the eye into the scene. That pursuit of computational efficiency had a parallel in digital audio playback (CDs), which reduced a continuous analog waveform to 44,100 samples per second, each stored at one of 65,536 (16-bit) levels.
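The eye-first trick is easy to see in miniature. Here is a sketch, not Nvidia's implementation and not Whitted's full algorithm (no shading, shadows, or recursion): one ray per pixel is cast backward from the eye, tested against a single sphere, and a character printed for hit or miss. The scene layout and function names are invented for illustration.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Trace one backward ray per pixel from the eye; '#' marks a hit."""
    eye = (0.0, 0.0, 0.0)
    center, radius = (0.0, 0.0, -3.0), 1.0  # one sphere in front of the eye
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel onto a view plane one unit in front of the eye.
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            norm = math.sqrt(x * x + y * y + 1)
            direction = (x / norm, y / norm, -1 / norm)
            row += "#" if ray_sphere_hit(eye, direction, center, radius) else "."
        rows.append(row)
    return rows

for line in render(24, 12):
    print(line)
```

The payoff of going eye-first is in the loop bounds: the work is proportional to the number of pixels, not to the astronomically larger number of rays a light source emits that never reach the camera.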
Most people's ears cannot tell the difference, but when is the last time your eyes were fooled by a computer image? Forty years ago the special effects in "Star Wars" looked fantastic, but no young person is fooled by them now. Our eyes are willing to play along with recreations of shadows and refractions, but they aren't really fooled, and that's even though movie studios can take all the time they need to render each frame offline.
But take a look at this ray tracing in real-time.
Pretty darn good, right? Credit: Nvidia
Nvidia's new $500-class graphics cards, the RTX 2070 and RTX 2080, lead the company to claim it can finally do what software and hardware makers have promised since the 1990s: really good ray tracing. Better than the rasterization we have actually gotten, even when it has been sold as ray tracing.
Who does Nvidia have on their side? The same Turner Whitted who wrote that 1980 paper on it. So we'll see.
Demo videos don't do much for me, since they are just marketing after all, but here is what they promise. Assuming it works the way Nvidia says it will and software companies jump in, it will be pretty fantastic.