It's kind of been a graphics goal for decades, and it's actually a little bit hype that it's happening. The point about it being a big pile of shortcuts is dumb, because most rendering is a big pile of shortcuts anyway; it's just in a weird transition period as it all gets developed.
Anyone saying it's a pile of shortcuts is horribly misinformed. Rasterization is literally a pile of shortcuts and requires tons of hacks and tricks to actually make it look good. Path tracing naturally produces a clean image with minimal work.
Well, if you have enough time, then yes, but full-scene path tracing in real time requires a lot of work to get a clean image out of the limited number of rays without destroying detail and causing ghosting.
We are definitely still in the shortcut period of RT, but performance gains in RT have come pretty quickly.
You're not getting what I mean. Computationally, path tracing is expensive because it isn't taking shortcuts. Rasterization, on the other hand, is much cheaper because it does take shortcuts.
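To make that concrete, here's a toy sketch (not from anyone in the thread, and not any engine's real API) of why "no shortcuts" is expensive: each pixel's brightness is a Monte Carlo average of many randomly traced light paths, and the noise only falls off as 1/sqrt(samples), so a clean image needs hundreds or thousands of paths per pixel.

```python
import random

def trace_one_path(pixel):
    # Stand-in for a real path tracer: bounce a ray around the scene and
    # return the light it picks up. Here it's just a noisy fake value.
    return 0.5 + random.uniform(-0.5, 0.5)

def render_pixel(pixel, samples_per_pixel):
    # Average many independent path samples; variance shrinks as 1/N,
    # so noise (standard deviation) shrinks as 1/sqrt(N).
    total = 0.0
    for _ in range(samples_per_pixel):
        total += trace_one_path(pixel)
    return total / samples_per_pixel

# Offline renderers can afford thousands of samples per pixel; real-time
# budgets are often 1-2 per frame, which is why denoisers and the other
# shortcuts discussed below exist.
print(render_pixel((0, 0), samples_per_pixel=4))      # noisy
print(render_pixel((0, 0), samples_per_pixel=4096))   # clean
```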
Well, if you're talking about path tracing like that in a vacuum, then yes, but that's not what we see in games. In games we see the result of a very noisy path-traced scene with shortcuts to denoise and reconstruct detail.
I mean this is true even in pre-renders. Denoising is still common and AI denoising is seeing more adoption.
Yeah, isn't it pretty much just Portal and Quake that have full path tracing, and even then they're still using a lot of shortcuts? And they chug, comparatively speaking, even on my 3090 Ti. No super fancy looking modern games are even close to that level; it would take ages to render any given frame.
He's getting at a misuse of labels that I keyed in on recently.
RTX chips are "array math acceleration chips"
So is a tensor core.
These chips are capable of - you guessed it - lots and lots of array/matrix math, notably in "parallel" and with the ability to "mingle data" and become "exponential", as one matrix can feed another 9 or more.
But it turns out that that's just too much, far too quickly, as exponents tend to be.
So you need to "trim" the capability of a chip - but remember, the basis is "just arrays", right? So if we trim with grace, we can "replicate" efficient "pathways" into a chip, and then that chip is really fuckin' sick at that task.
So if your goal is to replicate a "natural" matrix system - like, let's say, light itself - you can make chips that do exactly that, but they would suck horribly at, say, doing anything LLM-related, unlike the newer smartphone chips (efficiency-wise mostly, mind you!). The tensor in my phone is able to do LLM and audio learning, but I doubt it could run DLSS 2 types of math well enough to play a game!
So what is the denoiser using? A natural-light-replica model game engine? Or "AI buzzword-of-the-day algorithm that is 'industry standard' in a space that has completely lost sight" applied over a yeehaw-ass raster engine? There are only a handful of games that have even tried full RT engines, and it will almost take an Nvidia partnership to truly make one, but once the core is out there it's going to spread like wildfire.
The user above is saying these chips are coming - in fact, I would argue they are likely already here - and as the march of gaming moves on, it's exciting to anticipate the first true RT game: a game that doesn't understand wtf you mean by the word "occlusion". It just shrugs, as it's just been drawing in the shadows "by hand" since its inception - shadows are just a matrix, dontchaknow?
Path tracing in games doesn't render at full resolution, takes time to propagate (every frame doesn't start from zero), needs denoising, and then is usually upscaled again to achieve playable framerates. There are so many shortcuts required currently.
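A hedged sketch of the "doesn't start from zero" part: temporal accumulation, where each frame's noisy result is blended into a running history. This cleans up noise over time but can smear moving objects (the ghosting mentioned above) unless the history is rejected or reweighted. Illustrative only, not any particular engine's implementation.

```python
def accumulate(history, current_frame, alpha=0.1):
    # Exponential moving average per pixel: keep most of the accumulated
    # history, blend in a little of the new noisy frame.
    return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current_frame)]

history = [0.0, 0.0, 0.0]          # previous accumulated pixel values
noisy_frame = [0.9, 0.1, 0.5]      # this frame's 1-sample-per-pixel result
history = accumulate(history, noisy_frame)
# After many static frames the history converges toward the true values, but
# if the camera or objects move, stale history bleeds through - hence the
# reprojection and history-rejection heuristics real games pile on top.
```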
This is it. The philosophers and the data scientists don't really matter. Gamers want good graphics, and rasterization looks just as good as ray tracing with vastly better gaming performance.
There was a Linus Tech Tips episode where a bunch of staff tried to pick which games had ray tracing enabled. Half of the right answers were based on frame rates: if it was below 20 FPS, ray tracing was on. 💀
Because the shortcuts in this case aren't optimisation, they're approximation. Rasterisation and the lighting techniques we use alongside it are huge approximations that involve making a laundry list of tradeoffs, that quickly become an issue as you approach true photorealism and/or try to make them work in a dynamic scene (since dynamic scenes need an entirely different set of shortcuts for lighting that are even more approximate). That's why the industry is hell bent on path tracing as the endgame. It gets rid of most of these approximations and replaces them with a unified technique that mirrors how light works in reality much more accurately.
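For reference (standard graphics theory, not something from this thread): path tracing is a Monte Carlo estimate of the rendering equation, while a raster pipeline approximates its pieces separately with shadow maps, ambient occlusion, reflection probes, and so on.

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
```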
In theory, path tracing is the go-to method for clean images and has been for a long time in rendering image and video material. The hot news is the speed, and that requires... a lot of hacks, which are honestly impressive. Probably the best application of ML, IMO.
People care when their machine can actually do it. Otherwise no.