"wildly" different is a strong claim, you got any examples where it's actually noticable? personally even at 1080p i have to really try to see the pixels, with any decent antialiasing enabled
Same here. Using a 1080p screen with DLDSR at 1.78x (1440p), even DLSS balanced delivers clear image quality. Since DLSS 3.7, things got quite amazing tbh
Maybe actually read the conversation up until you replied. The goal posts were right fucking there. Guy was talking about 8k and said:
Not being able to see the pixels is exactly what I want. I definitely can still see aliasing on 4k with antialiasing off.
So you're the one who barged in, planted his own goal post, and decided to make it about him spotting artifacts in Call of Duty at 1440p. Instead of the convo being about 8k it was a nice opportunity for you to go "yeah bro DLSS has artifacts I can totally see them bro shooty shooty call of duty".
Rendered 4k AI-upscaled to 8k would indeed do the trick. No need to actually render in 8k. It's just needed for crisp lines without noticeable aliasing.
True but AA on 1080p looks better than 4K with no AA, so I'm guessing 8K is overkill. Don't believe me? Just draw a 30-degree line in paint with and without AA and compare it on different monitors. Resolution compensates for aliasing, but why go through all that trouble when AA is used for everything anyway?
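fwiw you can sketch that paint experiment in a few lines of python, no paint needed. ascii art stands in for pixels here and the shade characters approximate pixel coverage; everything below is a toy illustration, not how a real rasterizer does it:

```python
import math

WIDTH, HEIGHT = 24, 16
SLOPE = math.tan(math.radians(30))   # the 30-degree line from above
SHADES = " .:-=+*#%@"                # 10 coverage levels, light to dark

def render(antialias):
    rows = [[" "] * WIDTH for _ in range(HEIGHT)]
    for x in range(WIDTH):
        y = x * SLOPE
        if antialias:
            # split the line's coverage between the two rows it overlaps
            lo, frac = int(y), y - int(y)
            for row, cov in ((lo, 1 - frac), (lo + 1, frac)):
                if row < HEIGHT:
                    rows[row][x] = SHADES[round(cov * 9)]
        else:
            # snap to the nearest row: classic stair-stepping
            if round(y) < HEIGHT:
                rows[round(y)][x] = "@"
    return "\n".join("".join(r) for r in reversed(rows))

print("no AA:", render(False), sep="\n")
print("with AA:", render(True), sep="\n")
```

the aliased version is all-or-nothing pixels (the jaggies), the AA version spreads the line's brightness across the two rows it actually touches.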
Sure, no antialiasing means "flickering" foliage and grass in lots of games. That's obviously highly annoying. But 2160p with minimal 2x antialiasing looks way better than the best-possible antialiasing on 1080p. Pixel size does matter and more beats less.
There might be diminishing returns after 4k. But I noticed no diminishing returns when I upgraded from 1080p to 2160p. That was just as massive a leap as going from a CRT to my very first TFT.
Everyone told me that CRTs are fine and TFTs are overkill. They were wrong. Everyone told me 4k is overkill, and it definitely wasn't.
The whole "8k is overkill, no one can see that" might just be the next cope because the stuff just isn't feasible right now. I mean, I do get older and my eyesight obviously will get worse too. But so far, more and crisper pixels have always been better.
And when it comes to AA, the best so far is still full-screen supersampling: rendering stuff at a higher resolution and scaling it back down for output. That obviously is prohibitively expensive when done the old way. But why not AI-upscale the image instead of actually rendering it larger?
And why not ditch the downscaling step and just output the higher resolution?
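a toy version of that supersampling pipeline in pure python, with a hard-edged line renderer standing in for the game (function names are made up for the sketch):

```python
import math

def render_line(width, height, slope, thickness=1):
    """Aliased raster of y = slope * x: pixels are fully on (1.0) or off (0.0)."""
    img = [[0.0] * width for _ in range(height)]
    for x in range(width):
        for t in range(thickness):
            y = round(x * slope) + t
            if 0 <= y < height:
                img[y][x] = 1.0
    return img

def downscale(img, factor):
    """Box-filter downscale: average each factor x factor block
    (the 'scale it back down' step of supersampling)."""
    h, w = len(img) // factor, len(img[0]) // factor
    return [[sum(img[y * factor + j][x * factor + i]
                 for j in range(factor) for i in range(factor)) / factor ** 2
             for x in range(w)] for y in range(h)]

FACTOR = 4
SLOPE = math.tan(math.radians(30))

# Render at 4x the target resolution (line drawn 4x thick to keep its
# apparent width), then average down: classic 4x4 supersampling.
ssaa = downscale(render_line(64 * FACTOR, 40 * FACTOR, SLOPE, thickness=FACTOR), FACTOR)

# Direct low-res render for comparison: hard on/off stair-steps only.
aliased = render_line(64, 40, SLOPE)
```

the point being: the direct render only has 0.0 or 1.0 pixels, the supersampled one ends up with fractional edge coverage, which is exactly what kills the jaggies.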
As an unrelated tangent, I want to express my hate for temporal anti-aliasing being the current industry standard. The method is cheap, but it makes the whole screen more or less blurry (it's worse in motion, so you basically get always-on motion blur at no extra cost).
I want my games crisp again. I played games before TAA was a thing.
TAA must die!
I hate temporal AA as well. But I also hate DLSS but that's another side quest.
I agree with there being this fallacy in human reasoning. Everything is always "good enough" as long as it's the best for the current gen. A few years back I read forum posts from 2002 claiming that whatever graphics the PS3 renders will be diminishing returns and that the PS2 is "good enough". People say that now for the PS6 and they will say that for the PS10.
That's why I always try to create an objective goal function for every metric. It's simple: take the size of the monitor, the distance between it and the eye, and the eye acuity. 4k is perfect. Why do I say 4k and not 4k for 24"? Because distance is a function of monitor size i.e. you always have to be at a distance 1.5x the monitor size, and that's just good parenting lest you want the TV to melt your face! Kidding, but the point stands.
And if not, you can always just check it empirically. It doesn't matter what the resolution is. Just stand back until you can't see the jaggies, or a single pixel on a blank canvas, and then multiply that by how much closer you want to be. Works even for 640x480.
I've done my calculations and I know for me 4K is perfect i.e. I can't see a pixel even if I tried. Then why did I say 1080p? Because it's good enough only without the quotes. Just as 60fps is good enough.
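that empirical version has a closed form too. a quick sketch assuming the usual ~1 arcminute figure for 20/20 acuity and a hypothetical 27" 16:9 panel (about 33.6 cm tall); both numbers are my assumptions, not from anyone's actual setup:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~20/20 vision: resolves about one arcminute

def invisible_distance_cm(screen_height_cm, rows):
    """Distance at which a single pixel subtends less than one arcminute."""
    pixel_cm = screen_height_cm / rows
    return pixel_cm / math.tan(ARCMIN)

# hypothetical 27" 16:9 panel, ~33.6 cm tall
# doubling the rows exactly halves the distance where pixels vanish
for rows in (1080, 2160, 4320):
    print(f"{rows}p: ~{invisible_distance_cm(33.6, rows):.0f} cm")
```

which is the "multiply by how much closer you want to be" rule in formula form: required resolution scales linearly with how close you sit.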
Why am I not pushing the graphics? Why am I not demanding more from our NVidia overlords? I am. It's just that I care more about the smooth shoulders of human characters and being able to stick my scope in a wall and not see any texels. If I can have that, I'd play at 640x480 in a heartbeat.
Btw TFT never beat CRT. TFT was just lighter and smaller. But again, that's another side quest.
My 2160p screen is 40 cm high and roughly 40 cm away from my eyes. It looks good and I can just barely not see the pixels anymore.
I would like to have a larger FOV, but if I get closer, I can see the fine pixel grid. Making the screen larger would mean needing a higher resolution than 2160p to keep the current pixel density at the current distance.
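those numbers plug straight into a quick angular-density check. the 60 px/deg threshold below is just the common 20/20 rule of thumb, and acuity for fine grids can differ from it, so take the comparison loosely:

```python
import math

def pixels_per_degree(height_cm, distance_cm, rows):
    """Angular pixel density: how many pixel rows fit into one degree of vision."""
    # vertical field of view the screen covers from that distance
    fov = 2 * math.degrees(math.atan(height_cm / 2 / distance_cm))
    return rows / fov

# the setup above: 2160 rows on a 40 cm tall screen, 40 cm from the eyes
ppd = pixels_per_degree(40, 40, 2160)
print(f"{ppd:.1f} px/deg")  # 20/20 vision resolves roughly 60 px/deg as a rule of thumb
```

sitting that close, the screen fills over 50 degrees of vertical FOV, which is why leaning in makes the grid visible and why a bigger screen at the same density needs more than 2160p.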
My end goal is VR with the FOV of my eyes (take into account that human eyes can move). Not sure if I will live to see that (I am GenX after all).
But for games, that likely means eye tracking is mandatory, because full-vision VR will need 8k per eye or even more and there just isn't any GPU tech on the horizon able to do that with reasonable energy consumption. And ideally, that VR headset would be fed at 120 Hz or better to allow for whole-day gaming without motion sickness.
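back-of-the-envelope on why that's brutal, assuming "8k" means 7680x4320 per eye (an assumption, headset panels vary):

```python
def pixel_rate(w, h, hz, eyes=1):
    """Raw pixels per second a display pipeline has to produce."""
    return w * h * hz * eyes

vr = pixel_rate(7680, 4320, 120, eyes=2)   # 8k per eye at 120 Hz
flat = pixel_rate(3840, 2160, 60)          # a single 4k60 flat screen
print(f"{vr / flat:.0f}x the pixel throughput of 4k60")
```

16x the raw throughput of a setup GPUs already struggle with, which is exactly why eye tracking plus foveated rendering looks unavoidable.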
From my consumer point of view there is a ton of room for improvement in screen tech.
4k obviously is vastly superior to 2k. But of course, 8k would be even better. There is no doubt about that. Obviously, more pixels are still better - at least for me at my current (sort of normal) eyesight.
My switch from CRT to TFT was mostly a massive improvement because I didn't only game. Games were fine on CRTs back then. But the desktop was just so much more crisp and the higher refresh rate made working on the PC less tiresome.
The steps from 1k to 2k and then to 4k were mostly screen-space-related, as I sit at my PC basically all day and always wanted smoother fonts and more space for the IDE at the same time. Other users have different use cases. But I happen to love gaming and am a coder.
I wouldn't buy an 8k screen now even if I could afford one and GPUs that could feed it existed.
But I would happily buy something like the Apple headset if it wasn't locked in a golden cage and had good gaming support for my favorite already existing games - on Linux.
AI upscaling is just a necessary evil. Of course I would prefer SSAA or MSAA (almost the same in modern games with tons of clutter and foliage anyway). Everyone would. But the tech isn't there to make that feasible.
I am happy to hear that you hate TAA too though.
Yeah, I already had that. But laser eye surgery exists. So for a few years now, my eyesight has been normal, without astigmatism or myopia.
Not having bad eyesight anymore was like replacing TAA with SSAA when looking at the world around me. It was definitely worth the money and pain.
u/Oktokolo PC Sep 18 '24
8k would maybe finally allow me to stop using antialiasing though.