1440p and 2160p are a total waste of pixels when 1080p is already at the limit of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super-crisp text, but not for a game. HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
> 1440p and 2160p are a total waste of pixels when 1080p is already at the limit of human visual acuity.
Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?
If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But most people's eyes are only 2-3 feet from their monitor, and at that distance the difference is absolutely noticeable.
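A back-of-the-envelope sketch of why distance is the deciding factor: 20/20 vision resolves roughly one arcminute, i.e. about 60 pixels per degree (PPD). The 27-inch 16:9 monitor and the distances below are illustrative assumptions, not anything the thread specifies:

```python
import math

# 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per
# degree (PPD); below that, individual pixels are distinguishable.
ACUITY_PPD = 60

def ppd(horizontal_px: int, diagonal_in: float, distance_in: float,
        aspect=(16, 9)) -> float:
    """Pixels per degree of visual angle at the center of the screen."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    px_per_inch = horizontal_px / width_in
    # One degree of visual angle spans this many inches at that distance.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

for name, h_px in [("1080p", 1920), ("1440p", 2560), ("2160p", 3840)]:
    for feet in (2, 3, 15):
        value = ppd(h_px, diagonal_in=27, distance_in=feet * 12)
        verdict = "below" if value < ACUITY_PPD else "at/above"
        print(f"{name} @ {feet} ft: {value:6.1f} PPD ({verdict} ~60 PPD)")
```

On those assumptions, 1080p at desk distances lands well under the ~60 PPD threshold, while at 15 feet even 1080p is far past it, which matches the intuition above.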
> HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
HDR is an absolute game-changer, for sure. Ray tracing is as well, especially once you learn to notice the artifacts created by the shortcuts raster-based rendering needs to fake reflections (screen-space reflections, cube maps, and the like). It's like bad kerning: something you never noticed before will suddenly stick out like a sore thumb and bother the hell out of you.
Text rendering alone makes it worthwhile. 1080p densities are not high enough to render text accurately without artefacts. If you double the pixel density, it becomes (mostly) possible to render text weight accurately, and qualities like rhythm and density, things real typographers concern themselves with, start to become apparent.
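A toy illustration of the weight problem, as a sketch: assume whole-pixel snapping with no hinting or anti-aliasing (real rasterizers do both, but the quantization error is the same idea), and a 1.3 pt stem width picked purely as an example:

```python
# How much can snapping a text stem to whole pixels distort its weight?
# Worst-case quantization error is half a pixel, so the relative weight
# error shrinks in direct proportion to pixel density.
STEM_PT = 1.3          # a typical thin stem width, in points (example value)
PT_PER_INCH = 72

for ppi in (96, 192, 288):
    ideal_px = STEM_PT / PT_PER_INCH * ppi
    worst_case = 0.5 / ideal_px  # half-pixel snap vs. intended width
    print(f"{ppi:3d} PPI: stem is {ideal_px:.2f} px, "
          f"worst-case weight error {worst_case:.0%}")
```

At ~96 PPI (a 1080p desktop monitor) the worst-case weight error is nearly 30% of the stem; doubling the density halves it, which is why letterforms only start holding their designed weight at high densities.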
You're probably looking up close at a small portion of the screen; you'll always be able to "see the pixels" in that situation. If you sit far enough back to keep the whole screen comfortably in your visual field, the argument applies.
You are absolutely wrong on this subject. Importantly, what matters is PPI, not resolution. 1080p would look like crap in a movie theater or on a 55" TV, for example, while it'll look amazing on a 7" monitor.
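The PPI arithmetic behind that is just the diagonal pixel count over the diagonal size; a minimal sketch (the screen sizes are arbitrary examples):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1920x1080 pixel grid spread across very different physical sizes:
for diagonal in (7, 24, 55):
    print(f'1080p on a {diagonal}" screen: {ppi(1920, 1080, diagonal):6.1f} PPI')

# Output (roughly):
#   1080p on a  7" screen:  314.7 PPI  (phone-like, razor sharp)
#   1080p on a 24" screen:   91.8 PPI  (typical desktop monitor)
#   1080p on a 55" screen:   40.1 PPI  (visibly coarse up close)
```

Strictly, perceived sharpness depends on PPI and viewing distance together, which is what the pixels-per-degree sketch further up in the thread captures.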