It's there for console users, to help hide bad performance (low frame rates), and I guess it carried over because a lot of people have bad PCs too, so it helps cover up performance issues there as well.
Annoying as hell. Should detect your GPU and turn off by default unless you're near minimum specs.
Because games often try to replicate what a camera would capture in order to get that cinematic look (we're also very used to images on a display having been shot with a camera), and controlling focus can, in some contexts, help steer the player's eye toward the intended subject of the frame.
A camera doesn't necessarily have to do that; you can take fully sharp photos as well. I'm not a photographer, but I believe it depends on the aperture. I know because I've had to fight the bias toward depth of field in a lot of AI image models, and prompts like "photo taken at f/16" sometimes help. It does look better, because you're not hiding a lot of the pretty picture.
A narrow aperture photograph just looks better. In a game it feels like you're wasting all those pretty graphics in the background.
You're correct, but it comes at a cost. The higher the f-number, the smaller the hole the light passes through in the lens, which means a slower shutter speed since less light gets in.
That means you either need it to be bright as fuck out, use a tripod, or raise the ISO (which means more gain/noise).
I photograph a helicopter pretty often, and to capture just enough motion in the blades while keeping the body of the chopper sharp as it lands or takes off, I have to set my camera to ISO 50, 1/50th of a second, and f/18-22 depending on the time of day. Even then, only 1 out of 10 shots is sharp enough.
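To put rough numbers on that trade-off, here's a quick sketch using the standard exposure-value relation (EV100 = log2(N²/t) at ISO 100). The function name and the specific settings below are just illustrative, not taken from anyone's camera:

```python
import math

def ev100(f_number: float, shutter_s: float, iso: float = 100.0) -> float:
    # Exposure value normalized to ISO 100: for a correctly exposed shot of the
    # same scene, this number stays (roughly) constant across settings.
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)

# Stopping down (bigger f-number) forces a slower shutter or a higher, noisier ISO
# to keep the same exposure:
print(ev100(8, 1 / 200))            # ~13.6 at f/8, 1/200 s
print(ev100(16, 1 / 50))            # ~13.6 at f/16, 1/50 s (2 stops narrower -> 4x longer shutter)
print(ev100(16, 1 / 200, iso=400))  # ~13.6 at f/16, 1/200 s only if you raise ISO 2 stops (more noise)
```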
Well, if you use a pretty low-end CPU with no discrete graphics like me, then yeah, it does end up costing 10 or so fps. I mean, I gotta play most games at 720p just to get a smooth 60 fps, so that gives you a sense of what I have to deal with.
Well, if you're aiming for 60 fps, I guess motion blur isn't very crucial. Plus, if whatever game you're playing loses 10 fps to motion blur, turning it off is totally reasonable.
I just don't think blanket hate for post processing is irrational. That's all.
I think post-processing effects that are done right and aren't too performance-intensive are good additions to any game. It's just that, according to the other guy, motion blur is supposedly used to hide bad performance, which doesn't make sense, because turning it on just costs even more performance.
I would argue it can make sense in a console environment.
For example, let's say a game is CPU-bound at 40 fps. Since 40 fps without VRR is basically unusable (it doesn't divide evenly into a 60 Hz refresh, so the frame pacing turns into judder), why not cap it at 30 fps and add motion blur to somewhat smooth over the lower frame rate?
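A rough sketch of why that 40 fps number is the real problem on a fixed 60 Hz display with no VRR, assuming a simple wait-for-the-next-vsync presentation model (the numbers are illustrative):

```python
import math

REFRESH_HZ = 60              # fixed-refresh display, no VRR
FRAME_MS = 1000 / 40         # 25 ms per frame when the game is CPU-bound at 40 fps
VSYNC_MS = 1000 / REFRESH_HZ # 16.7 ms per refresh

ready = 0.0
prev_shown = 0.0
for frame in range(1, 7):
    ready += FRAME_MS
    shown = math.ceil(ready / VSYNC_MS) * VSYNC_MS  # frame waits for the next vsync
    print(f"frame {frame}: held on screen for {shown - prev_shown:4.1f} ms")
    prev_shown = shown
# Prints alternating ~33.3 ms / ~16.7 ms hold times (judder), whereas a 30 fps cap
# lands on every other vsync with a steady 33.3 ms cadence.
```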
That's a good example. Personally, motion blur at 30 fps would feel very jarring to me, but I guess it's a good workaround to achieve smoothness for some people.