Indeed. Plays a hell of a lot better too.
And yep I agree - there needs to be a happy medium but alas there never will be on consoles.
Yea, I certainly couldn't go back to a 1080p monitor after getting used to my 4k monitor.
Yep. Higher resolutions are mandatory to solve the aliasing problem. All the post-processing techniques that have been developed (FXAA, TSAA, etc.) basically just smear the image, because there's no other way to fake AA when you've just got a 1080p output to work with. Getting rid of jaggies necessitates higher res (see: supersampling, MSAA, etc).
I couldn't disagree more. We're reaching the point where game visuals have so much going on that at 1080p you just get one of two things: terrible aliasing or tons of blur from aggressive modern anti-aliasing techniques (especially TAA).
Without higher resolutions, like 4k, we will never be able to make significant progress towards photo realism because our eyes and mind are way too sensitive to the jagged edges and flickering caused by aliasing at these lower resolutions.
Modern AAA games have so much of this going on that their visuals almost never feel "clean" any more. The only time we get close to that clean visual feeling is when they're running at or near native 4k.
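To make the supersampling vs. post-process point concrete, here's a toy sketch (nothing here resembles how a real engine, FXAA or TAA actually works; the scene, sizes and 4x factor are all made up for illustration): supersampling averages genuinely extra samples per output pixel, while a post-process filter can only blur the pixels it already has.

```python
# Toy comparison of supersampling vs a post-process blur; purely illustrative.
import numpy as np

WIDTH, HEIGHT, FACTOR = 16, 16, 4  # tiny "screen", 4x4 samples per output pixel

def render(w, h):
    """Toy scene: a hard diagonal edge, white on one side, black on the other."""
    ys, xs = np.mgrid[0:h, 0:w]
    return (xs * h > ys * w).astype(float)

# Supersampling: render at 4x resolution, then average each 4x4 block down.
hi = render(WIDTH * FACTOR, HEIGHT * FACTOR)
ssaa = hi.reshape(HEIGHT, FACTOR, WIDTH, FACTOR).mean(axis=(1, 3))

# Post-process "AA": render at native res, then blur with a 3x3 box filter.
native = render(WIDTH, HEIGHT)
padded = np.pad(native, 1, mode="edge")
blurred = np.zeros_like(native)
for y in range(HEIGHT):
    for x in range(WIDTH):
        blurred[y, x] = padded[y:y+3, x:x+3].mean()

# ssaa only softens pixels that actually straddle the edge;
# the box blur softens everything, including detail nowhere near an edge.
print(ssaa.round(2))
print(blurred.round(2))
```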
Huh really - I felt the same. It perhaps looked better when it was still but in motion it looked and felt like garbage to me. Blurry and unresponsive.
Even you are misreading the OP. OP is raising the age-old conundrum of development: do devs chase tech or fidelity? The answer has almost always been the latter with each hardware progression. That's because it's the easiest thing for anyone to notice. The average consumer does not care about AI simulation or transparent mirror texture tricks, they care if it looks 'good', and we all know what the standard for 'good' graphics is. But I do disagree with OP: it's not a waste of work hours or dev time or rendering resources.
If you want to consider that the current state of graphical processing is somehow limiting developers, then, sure, your stance holds water. What we are seeing, however, is developers are able to bring their content to life and still have room for increased resolution and effects on the current mid-gen consoles. We are getting titles with < 2160p resolutions, as well as 2160p titles.
I understand, but imagine if instead of 4K with increased shadow quality and LOD we got 1440p with added geometry and light sources? Will that break the spell? Of course not! It would enhance it.
That's what I'm arguing for. Simply brute forcing a (very costly) resolution boost isn't worth it over more graphical features that videogame rendering still lacks.
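Just to put rough numbers on "very costly" (pixel-count arithmetic only; actual GPU cost doesn't scale perfectly linearly with pixel count):

```python
# Rough pixel-count arithmetic behind the "very costly" resolution boost.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels  ({pixels / base:.2f}x 1080p)")

# 1080p: 2,073,600 pixels  (1.00x 1080p)
# 1440p: 3,686,400 pixels  (1.78x 1080p)
# 4K:    8,294,400 pixels  (4.00x 1080p)
```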
It's not that 1080p looks better than 4K. It's that 30fps looks like crap.
They don't have to catch up. His problem is that they are merely just as good.
Yea a Sony X900F, for example, would blow you away and there's even better TVs than that.
Completely off-topic, but this deification of Digital Foundry on these forums really grates me. Yes, DF does good work, but to only focus on them shortchanges some of the others out there that also do amazing work in analyzing/sharing the updated graphics/features we get on mid-gen console support:
After years of fence sitting I bought a great 65'' Samsung LED 4K HDR panel. My 1080p was too small for my tastes and it was time to go big or go home. And so I did.
After 4 hours of professional calibration it was time to enter the new age. Movies look drop dead gorgeous, with lots of clarity. Mind you, the effect was nowhere near as revelatory as sdtv->1080p, but it was good enough.
But when the time came to plug in my beloved PS4 Pro, I couldn't be more disappointed, and worse, worried at the implications of the industry's sudden affair with 4K.
Simply put: games certainly look /clearer/, but they don't look /better/.
And so it dawned on me. I know even the almighty X can't manage to reach native 4K most of the time, and that's with games being targeted at a really weak console (the S). The X, meanwhile, was a $500 console. And while the added resolution is nice, there's no room at all to add more graphical effects that would bring the games closer to real life.
What dawned on me is that next gen, with consoles probably targeting native 4K, we're not getting the generational leap we could have had if devs had stayed at 1080p or the much healthier middle ground of 1440p. The jump in power previously used to advance lighting, polygons and calculations is now being used to stretch the image, with negligible difference in "realism".
And you know why that is? Because real life doesn't need resolution. You can watch a football match on a 240p portable black and white TV and it will certainly look more realistic to you than Star Citizen at 16K ever will.
So it is my opinion that devs should focus more on creating new rendering techniques that advance the content of the images rather than the images themselves. But given that's almost certainly not gonna be the case, we'll be advancing at turtle speed now, with consoles being judged on their ability to reach such a high resolution so that the badly lit rock texture can be a bit noisier.
Before you ask me, I tried every high profile game you can name (GoW, Horizon, Spiderman, SOTC, AC Origins, RDR2, RE7) and it was always the same. HDR /is/ pretty great but it being tied to 4K is a commercial, not technical decision.
So yeah, it's a shame we're losing current and future power and tech advancements to sell TVs instead of advancing as a medium.
My 4K OLED looks better than your 1080 OLED :)
My TV will be three years old in September since I bought it right after the PS4 Pro was announced, but to be perfectly honest, Horizon Zero Dawn, God of War, Detroit, AC Origins/Odyssey, Ghost Recon Wildlands, Uncharted Lost Legacy and a few others have blown me away. I will get a high end model, but not until next gen starts, so probably sometime in late 2020 or mid 2021 when that year's models come out.
4k is an absolutely necessary jump games need to take imo.
People underwhelmed usually have a budget 4k TV or an IPS monitor/TV or something where they expected much more than what they got.
Baccus Put this vid in the OP along with the screenshots below. It's a waste of rendering resources that could be better spent on effects, higher framerate, etc. I'd take a checkerboard solution or straight 1440p and (60 fps or better effects) over just 4k.
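For anyone wondering what "a checkerboard solution" buys you: the idea is to shade only about half the pixels each frame and fill in the rest from the previous frame. A stripped-down sketch (real implementations like the PS4 Pro's also reproject with motion vectors and ID buffers; everything here is a made-up toy):

```python
# Toy checkerboard rendering: shade half the pixels each frame (the phase of
# the checkerboard flips every frame) and keep the previous frame's values for
# the rest. This sketch just holds the old values instead of reprojecting.
import numpy as np

H, W = 4, 8
frame_buffer = np.zeros((H, W))

def shade(frame_idx, ys, xs):
    """Stand-in for the expensive per-pixel shading work."""
    return frame_idx * 100 + ys * W + xs

def checkerboard_frame(frame_idx, prev):
    out = prev.copy()
    ys, xs = np.mgrid[0:H, 0:W]
    mask = (ys + xs + frame_idx) % 2 == 0   # only half the pixels this frame
    out[mask] = shade(frame_idx, ys[mask], xs[mask])
    return out

for f in range(2):
    frame_buffer = checkerboard_frame(f, frame_buffer)
print(frame_buffer)  # after two frames every pixel has been shaded at least once
```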
Now you're just being silly.
That is a good point, though. ISF calibration is a standard for film, not video games. No such thing exists in our video game world. At best, we can only strive for color accuracy and a video signal unmolested by a TV's post-processing.
Considering that OLED's nits can't hold a candle (puns!) to the best LCDs', this statement is materially wrong. If you are exclusively talking about the lower end of the luminosity scale, then you'd be better off shining a light (puns!) on OLED's strengths, but high-end full-array LCDs handle that same luminosity almost as well.
Yes, 8k is next and it looks great. Even just downscaled on a 4k TV/monitor. It will take a while though before we get viable frame rates.
PC gamer here, my old monitor broke and I started shopping for my next display. I had realized far before this happened that 4k demanded way too much horsepower, so I figured it would be 1080p or 1440p.
I hope the next consoles won't focus on 4k, such a waste, and games will never hit 60+ fps as a standard with them continuing to up the resolution to sell tvs or something. What next, 8k?
I'm not sure how we've stagnated. Take Uncharted 3 vs Uncharted 4, Infamous 2 vs Second Son, MGS4 vs MGSV, RE6 vs RE2 Remake. There have been stark improvements beyond resolution. Is there a big difference between Horizon Zero Dawn on PS4 and on PS4 Pro outside of resolution? No, but that should never have been expected. Devs still have to work with a baseline of hardware that was released in 2013.
I got this effect just with supersampling at 1080p - hadn't noticed at first but going back the image looked noticeably softer.