Finally bought a 4K TV and confirmed it is indeed a waste of rendering resources

Oct 28, 2017
447
There are graphical effects out there that can have a larger impact than pixel resolution, but 4K is still up there as a noticeable increase in overall image quality. It brings out texture detail that was always there, it lowers aliasing by a noticeable amount, and, if you can keep a steady frame rate, it makes the world look clearer without having to fake it with sharpening.
 
Last edited:
Oct 25, 2017
9,280
As someone who bought an LG OLED last September, I gotta sorta agree about 4K being a cool jump that’s nowhere near as revolutionary as you’d hope when you drop that amount of cash to be able to see it.

Now, HDR/Dolby Vision? That’s a game changer for me.
 
Oct 27, 2017
178
OP buys a 4k TV. OP hooks up a PS4Pro. OP is not happy. OP then goes on to dismiss the XboxX, which OP does not own.
Legit thread.
/s
4k is a tremendous improvement in IQ for games. Now when I play things like APEX at 1440p or Ace Combat 7 at 1080p on Xbox, I notice everything has a weird smear to it compared to the other games I play at 4k. There most certainly is a big difference.
 
Oct 30, 2017
165
I am with you OP. I still think 1080p is enough and if I have the choice, I use the superfluous power for framerate or more bells and whistles.
 
Aug 30, 2018
33
There's nothing stopping devs from rendering out at lower than 4K. UI elements and text at 4K are a beautiful thing—everything else can be rendered however you please. Dynamic resolutions have been a thing for a long time now. I don't find it shocking that people like to see a native-resolution picture instead of turning on water and hair effects or whatever other bells and whistles.
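For anyone curious what that looks like under the hood, here's a minimal sketch of dynamic resolution scaling: drop the 3D render scale when a frame runs over budget, creep it back up when there's headroom, and keep the UI at native 4K the whole time. All the names and thresholds here are made up for illustration; real engines are far more sophisticated.

```python
# Hypothetical sketch of dynamic resolution scaling. The 3D render scale
# is nudged down when the last frame ran long and back up when there is
# headroom, while the UI stays at native 4K throughout. Thresholds are
# illustrative, not taken from any real engine.

NATIVE_W, NATIVE_H = 3840, 2160
TARGET_MS = 33.3              # 30 FPS frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_render_scale(scale: float, last_frame_ms: float) -> float:
    """Adjust the 3D render scale based on the previous frame's cost."""
    if last_frame_ms > TARGET_MS:          # over budget: drop resolution
        scale *= TARGET_MS / last_frame_ms
    else:                                  # under budget: creep back up
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [30.0, 40.0, 45.0, 33.0, 25.0]:
    scale = next_render_scale(scale, frame_ms)
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    print(f"frame took {frame_ms:4.1f} ms -> render {w}x{h}, UI at {NATIVE_W}x{NATIVE_H}")
```

The point of the sketch: only the expensive 3D pass changes resolution, so text and HUD stay pin-sharp even when the scene drops well below native.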
 
Oct 25, 2017
4,205
Put in God of War and switch the graphics modes and FPS modes. One is a blurry mess, the other is 4k goodness. Massive difference.
On the same 4K display this is understandable. Anything below native resolution will look blurry and ugly depending on how far below native it is. The better test would be to have the displays side by side in their native resolutions at your normal sitting distance. I wish things under native resolution didn't look so bad; then it would be a no-brainer to get a 4K display for my PC.
 
Nov 12, 2017
99
And you know why that is? Because real life doesn't need resolution. You can watch a football match on a 240p portable black-and-white TV and it will certainly look more realistic to you than Star Citizen at 16K ever will.
Well put.
Most people are happy watching Netflix and DVDs in SD, which is what, 480p?
Sure, 4K is nice, but I question whether it's worth sacrificing assets, models, animations, lighting, frame rate, audio, AI, etc. for. I don't think it is. I'd rather have a game that looks real at 1080p than a game that looks like the games we've already been playing, just with a sharper image.
I get that the sacrifice isn’t proportional, but there is a significant sacrifice there. All to make textures look good, basically. But there is more to games than textures.
 
Oct 25, 2017
370
Creating higher quality art assets, animations, improved physics, better lighting, etc. would add a massive load to development schedule which might not be worth it when the Pro/X are probably at a combined 10+ million. Instead devs have decided to use those extra resources to increase resolution which satisfies a reasonably large number of Pro/X owners and doesn't add an insane amount of work for devs.

If they didn't use the increased GPU power for resolution, a majority of games would probably ignore the extra power altogether. 60 FPS doesn't seem feasible either since we're still stuck with Jaguar CPUs. I've been pretty happy with the enhanced IQ on my X, so I don't see it as a waste.
 
Oct 27, 2017
1,612
People throw around terms like nits but the truth is that OLED's ability to dim and shine every individual pixel gives it much more punch in its picture, so it doesn't need as many nits. I've seen OLED and LCD side by side and there really is no contest
RE: Italicized
What?!

You are literally arguing against a scientific standard of measure that an entire industry uses to develop and market its technologies. Both "punch in its picture" and "ability to dim and shine every individual pixel" mean nothing; the first is not quantifiable and the second is a playground level of debate. Yet you are implying I am just throwing around terms without understanding them.

RE: Bolded
Again, what?!

You are effectively living in a fantasy world where OLED has magical properties that make it brighter than measurement tools say it is.

Dolby Vision, an industry standard, calls for a reference-level maximum of 4,000 nits. LG's 2018 OLED series (let's look at the E8, LG's flagship set) hits below 1,000 nits. Samsung's flagship TV, on the other hand, is touted as displaying a brightness between 1,500 and 2,000 nits. Reviews corroborate this. There is, without a doubt, a contest in luminosity when these two sets are displayed next to each other.

You can enjoy your OLED, I am not taking that away from you. The reality is that OLED's weakness is in its upper-end luminosity and that is not subjective.
 
Oct 27, 2017
1,300
I keep saying that resolution shouldn’t be the focus next generation, it should be high refresh rates up to 144 Hz and advanced AI technology. System wide super sampling for all games would be great too.
 
Oct 26, 2017
7,538
What would be smart is a dynamic implementation that went to 4K when you were standing perfectly still and dropped dramatically at any hint of movement. Especially for 30 FPS games.
 
Dec 5, 2017
1,464
Yep. Higher resolutions are mandatory to solve the aliasing problem. All the post-processing techniques that have been developed (FXAA, TSAA, etc.) basically just smear the image, because there's no other way to fake AA when you've just got a 1080p output to work with. Getting rid of jaggies necessitates higher res (see: supersampling, MSAA, etc).
Not to mention image stability. More than jaggies, the amount of shimmering that happens is absolutely maddening. Image stability has always been my biggest pet peeve, and 1800p and up makes a huge difference on this front.
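The difference between real AA and post-process smearing is easy to see in miniature. This toy script (purely illustrative, not engine code) renders a hard diagonal edge once with one sample per pixel and once with 4x4 ordered-grid supersampling; only the supersampled version produces genuine fractional coverage at the edge instead of hard 0/1 stair-steps, which is exactly the information FXAA-style filters have to guess at.

```python
# Minimal illustration of why supersampling beats post-process AA: each
# output pixel averages several real scene samples instead of guessing
# from finished neighbours. The scene is a hard diagonal edge.

def coverage(x: float, y: float) -> float:
    """Scene: white above the diagonal y = x, black below."""
    return 1.0 if y > x else 0.0

def render(width: int, height: int, ss: int = 1):
    """Render with ss*ss ordered-grid samples per pixel."""
    img = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(ss):
                for sx in range(ss):
                    # sample position in [0, 1) scene space
                    x = (px + (sx + 0.5) / ss) / width
                    y = (py + (sy + 0.5) / ss) / height
                    total += coverage(x, y)
            row.append(total / (ss * ss))
        img.append(row)
    return img

aliased = render(8, 8, ss=1)   # hard 0/1 stair-steps along the edge
smooth  = render(8, 8, ss=4)   # edge pixels get fractional coverage
print(smooth[3][3])            # a value strictly between 0 and 1
```

A post-process filter only ever sees the `aliased` grid, so the best it can do is blur the steps; the `smooth` grid actually knows how much of each edge pixel the shape covered.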
 
Dec 5, 2017
1,464
I keep saying that resolution shouldn’t be the focus next generation, it should be high refresh rates up to 144 Hz and advanced AI technology. System wide super sampling for all games would be great too.
I get the like for high refresh rate. The problem is almost no one's TV supports that out of the box without some trickery via CRU, and that's ONLY at 1080p.
 
Oct 25, 2017
351
I have a KS8000 and boy... the occasional framerate hit or ghosting that comes from checkerboard rendering, combined with the jet engine my PS4 Pro transforms into truly makes it feel like it wasn't worth it. Planet Earth 2 with HDR looks great of course, but maybe I should have waited for an OLED and PS5.
 
Oct 25, 2017
1,112
Chicago, IL
Yep I agree with the OP. I would much rather them put resources towards 1080p instead of aiming for 4K.

I would love to have HDR on a modern 1080p set but unfortunately that ship sailed years ago so we’re stuck with 4K which is fine, but I would rather have native 4K and not an upscaled image.
 
Oct 29, 2017
18
you've got to see the flat textures in higher resolution!

couldn't agree more. 1440p is great, but I think there needs to be a focus on better quality assets before 4k is really worth it.
 
Oct 25, 2017
1,765
I agree; however, note that this generally applies mostly to games with superb anti-aliasing. 4K has the benefit of providing a near-perfect image even with poor anti-aliasing. That's going to be its main draw for me.

But yeah, when it comes to choosing 4K @ 30fps vs 1080p @ 60fps with say, Shadow of the Colossus, the choice is easy. 4K feels quite the minor improvement from 1080p. It only starts to become worthwhile once a game's detail reaches a certain level, but we're just not there yet. Texture quality is just not high enough in a lot of cases to warrant the extra detail, and many of the top games have nearly perfect temporal AA solutions that look damned good at 1080p.
 
Nov 14, 2017
5,421
I think, factually, most Xbox One X enhanced games are actually native 4k. Many high-end AAA Xbox One X games, on the other hand, are not native 4k.

The suggestion that next gen will have a lower target res is silly.
 

ghostcrew

Spooky
Moderator
Oct 27, 2017
7,283
United Kingdom
I have a KS8000 and boy... the occasional framerate hit or ghosting that comes from checkerboard rendering, combined with the jet engine my PS4 Pro transforms into truly makes it feel like it wasn't worth it. Planet Earth 2 with HDR looks great of course, but maybe I should have waited for an OLED and PS5.
I would say that checkerboard rendering and jet engine fans speak more to the design decisions made around the PS4 Pro than whether 4K is worth it or not.
 
Jan 10, 2019
137
What dawned on me is that next gen, with consoles probably targeting native 4K, we're not getting the generational leap we could have had if devs had stayed on 1080p or the much healthier middle ground of 1440p. The jump in power previously used to advance lighting, polygons and calculations is now being used to stretch the image with negligible difference in "realism".
Which is why checkerboarding with temporal input is the correct answer. It looks 80-90% as good as native 4K for about 50-60% of the GPU burden. It would mean the PS5 wouldn't be rendering at any increase in resolution compared to the PS4 Pro; all that extra grunt could go into making the games look better.

Resolution is the window by which we look into our virtual worlds. It is important to have a nice clear window. It's more important to have a nice view.
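To put rough numbers on the burden side of that trade (the quality percentages above are subjective), here's a quick back-of-the-envelope comparison of how many samples actually get shaded per frame, assuming checkerboard 4K shades half the native sample count and ignoring reconstruction overhead:

```python
# Rough per-frame shaded-sample counts behind the "half the GPU burden"
# claim. Checkerboard 4K shades about half the samples of native 4K each
# frame and reconstructs the rest temporally; overheads are ignored.

RESOLUTIONS = {
    "1080p":            1920 * 1080,
    "1440p":            2560 * 1440,
    "checkerboard 4K":  3840 * 2160 // 2,   # half the samples per frame
    "native 4K":        3840 * 2160,
}

native = RESOLUTIONS["native 4K"]
for name, pixels in RESOLUTIONS.items():
    print(f"{name:>16}: {pixels:>9,} samples  ({pixels / native:5.1%} of native 4K)")
```

Native 4K works out to about 8.3 million samples per frame, checkerboard 4K to half that, and 1440p to roughly 44%, which is why both are popular stopping points short of native.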
 
Oct 27, 2017
4,262
RE: Italicized
What?!

You are literally arguing against a scientific standard of measure that an entire industry uses to develop and market its technologies. Both "punch in its picture" and "ability to dim and shine every individual pixel" mean nothing; the first is not quantifiable and the second is a playground level of debate. Yet you are implying I am just throwing around terms without understanding them.

RE: Bolded
Again, what?!

You are effectively living in a fantasy world where OLED has magical properties that make it brighter than measurement tools say it is.

Dolby Vision, an industry standard, calls for a reference-level maximum of 4,000 nits. LG's 2018 OLED series (let's look at the E8, LG's flagship set) hits below 1,000 nits. Samsung's flagship TV, on the other hand, is touted as displaying a brightness between 1,500 and 2,000 nits. Reviews corroborate this. There is, without a doubt, a contest in luminosity when these two sets are displayed next to each other.

You can enjoy your OLED, I am not taking that away from you. The reality is that OLED's weakness is in its upper-end luminosity and that is not subjective.
This is probably a bit of a thread derail, but HDR is not just about peak nits; it's a range, and OLEDs start at absolute zero with perfect contrast, while LCDs do not. FALD is a way to mitigate this for high-end LCDs, but that in itself introduces its own set of problems. LCDs absolutely need a higher peak nit count in order to have the same impact an OLED would, since they don't start at the same place.

Expert calibrators from around the world hold yearly shootouts to pick the best TVs, and three years in a row an OLED has won both best overall TV and best HDR TV for this very reason.

Since you linked rtings, check out their scores for HDR movies on the Samsung Q9FN (8.7) vs the LG C8 (9.1); the LG C8 scores higher even though it's, like you say, much lower in peak nit count. And you can find out why when you click the little question mark next to the score: their #1 most heavily weighted factor when deciding the HDR movies score is contrast, not HDR peak nits. Even in the HDR gaming section, which takes other things into account, the LG C8 beats out the Q9FN.

Another thing to keep in mind is how peak nits are measured on these TVs. The patterns used to measure this are far from similar to real-world content; oftentimes LCDs can't even get close to their peak nits because they need to stay within a certain level in order not to clip details around that bright object. An OLED doesn't have to worry about this at all.

Where an LCD will outshine an OLED is in near full screen bright scenes like a sunny beach or a desert, but in this case you are not really seeing a wide range in brightness, just a very bright overall picture. Overall OLED is the absolute best way to watch HDR movies, and in most cases games too.
 
Last edited:
Feb 10, 2019
64
Lol, I bought my sis an X and we game share, so she's playing my Red Dead 2 and I couldn't even read the damn hints and clues, and Arthur's beard was all pixely.

She slept over at my place and the difference for both of us was night and day, and she's only 17 and the farthest thing from tech savvy. And to make matters worse, I have a damn 55-inch TCL. Not top of the line, but the bare minimum when it comes to 4K HDR. It's no OLED, but Justice League in 4K (yes, the movie sucked, but the cinematography in Snyder movies is always on point and 4K makes it look even more glorious), The Witcher 3 and Red Dead 2 (what I really bought an X for) make me disagree with your statements.

Damn, you have a Samsung, so are you saying that you think 4K all in all is a waste of time because it's not living up to the hype, or are you saying 4K isn't worth the loss in graphical fidelity? My opinion is do both if possible; resolution makes old games like Red Dead 1 look brand new, so it has a purpose.
 
Jun 2, 2018
107
Texas
Honestly same. 1080 and 1440 look fine. 4k? Ya it looks better but I'd much prefer games be more optimized than look a little prettier. also lol at people misconstruing OP's main point and getting really defensive about it.
 
Oct 28, 2017
842
Just upgraded to a cheap 43'' 1080p Chinese LED screen from an old LG 32''. Huge difference, I'm really happy with it.

4K is a waste. And if next gen consoles come with a 4k mode and 1080p upgraded visuals mode, then the choice of 1080p will be more logical.

But yeah, we are all expecting those upgraded assets and games using TBs of space.

Will upgrade to 4K when internet speeds catch up, in 10 more years.
 
Oct 26, 2017
4,650
Tampa
Lol, I bought my sis an X and we game share, so she's playing my Red Dead 2 and I couldn't even read the damn hints and clues, and Arthur's beard was all pixely.

She slept over at my place and the difference for both of us was night and day, and she's only 17 and the farthest thing from tech savvy. And to make matters worse, I have a damn 55-inch TCL. Not top of the line, but the bare minimum when it comes to 4K HDR. It's no OLED, but Justice League in 4K (yes, the movie sucked, but the cinematography in Snyder movies is always on point and 4K makes it look even more glorious), The Witcher 3 and Red Dead 2 (what I really bought an X for) make me disagree with your statements.

Damn, you have a Samsung, so are you saying that you think 4K all in all is a waste of time because it's not living up to the hype, or are you saying 4K isn't worth the loss in graphical fidelity? My opinion is do both if possible; resolution makes old games like Red Dead 1 look brand new, so it has a purpose.
His Samsung is worse than the TCL.
 
Oct 27, 2017
1,222
I get what OP is saying. I recently upgraded from a 1080p to a 4K TV, and for me the biggest WOW factor was the screen size, contrast ratio and response time improvement. While the upgrade in resolution was nice, it wasn't as amazing as years of 4K hype led me to believe it would be. I honestly think many people are overrating 4K and have a better opinion of it than it deserves. It's just a higher screen resolution, after all. I still see jaggies in 4K, I still see aliasing, and games can still be blurry unless they use superb rendering techniques like Gears of War 4. So yeah, 4K can be a waste of rendering resources depending on the content. Another annoying thing is that Switch games now look worse on my 4K TV than they did on my 1080p one.

Try LG OLED next time and not samsung
/s
but honestly, go with LG OLED next time :)
We get it, you have OLED and you must use every opportunity to brag about it. Not sure how relevant your post is seeing as OP is mostly talking about screen resolution. It doesn't matter if it's Samsung LED or LG OLED, it's the same 4K resolution, same amount of pixels.
 
Oct 25, 2017
4,205
I get the like for high refresh rate. The problem is almost no one's TV supports that out of the box without some trickery via CRU, and that's ONLY at 1080p.
For consoles, most if not all games running at 60 FPS to make use of the extra power would be amazing enough.

Gotta love people downplaying 4K while trying to claim 1440p is where the real difference is lol
For current hardware, that would have been the sweet spot until better, more affordable hardware that could be used in consoles was here (likely next gen). I don't know if 1440p living-room TVs were ever sold the way 4K is being pushed, though.

The downside of having a bleeding edge high resolution display is games under that resolution will look bad on it. PC users have the option to pick and choose, and 1080 and 1440p are good choices if you don't have the hardware to support always playing at 4k with good framerates (60+).
 
Jul 4, 2018
650
1080 is still a good enough resolution for me, I'm hoping for games to focus more on stable 60fps than native 4K support.
Whenever a console game gives the option for performance vs. visuals, I choose performance every time.
 
Nov 2, 2017
1,832
RE: Italicized
What?!

You are literally arguing against a scientific standard of measure that an entire industry uses to develop and market its technologies. Both "punch in its picture" and "ability to dim and shine every individual pixel" mean nothing; the first is not quantifiable and the second is a playground level of debate. Yet you are implying I am just throwing around terms without understanding them.

RE: Bolded
Again, what?!

You are effectively living in a fantasy world where OLED has magical properties that make it brighter than measurement tools say it is.

Dolby Vision, an industry standard, calls for a reference-level maximum of 4,000 nits. LG's 2018 OLED series (let's look at the E8, LG's flagship set) hits below 1,000 nits. Samsung's flagship TV, on the other hand, is touted as displaying a brightness between 1,500 and 2,000 nits. Reviews corroborate this. There is, without a doubt, a contest in luminosity when these two sets are displayed next to each other.

You can enjoy your OLED, I am not taking that away from you. The reality is that OLED's weakness is in its upper-end luminosity and that is not subjective.
Re: the italics. OLED can switch individual pixels on and off; LCD requires a backlight, which can never be as accurate, especially for black levels.

Re: the bold. Actually, Dolby Vision has a different, lower minimum nit requirement for OLED. Because OLED has such amazing contrast, it doesn't need as many nits to qualify, because what is required is a large gap between peak brightness and darkest black.

It's right here: 1,000 for LCD and 540 for OLED https://www.trustedreviews.com/opinion/hdr-tv-high-dynamic-television-explained-2927035
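To put numbers on why contrast rather than peak nits drives that lower OLED threshold: usable HDR range is peak brightness divided by black level. The 1,000/540-nit minimums are the figures from the article above; the black levels in this quick sketch are my own illustrative assumptions, not measured values.

```python
import math

# Worked example: HDR dynamic range depends on the ratio of peak
# brightness to black level, not peak nits alone. The black levels
# below are illustrative assumptions, not measurements.

def contrast_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops (doublings of brightness)."""
    return math.log2(peak_nits / black_nits)

lcd_stops  = contrast_stops(1000, 0.05)     # assumed FALD LCD black level
oled_stops = contrast_stops(540, 0.0005)    # assumed near-zero OLED black

print(f"LCD : {lcd_stops:.1f} stops")   # ~14.3 stops
print(f"OLED: {oled_stops:.1f} stops")  # ~20.0 stops
```

Even with roughly half the peak brightness, the assumed near-black OLED floor yields a wider brightness range, which is the argument the post above is making.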
 
Last edited:
Nov 30, 2017
1,072
This is a thing Phil Spencer started.

The pattern seems to be that if there is any possible way a game can hit 4K, then MS wants them to prioritize that over everything else.

The 1X is also not ready for 4K, but it's a buzzword that says "most powerful console", which has been Phil's thing for a while.

The next round of machines will be 4K.
 
Oct 26, 2017
1,206
Yeah I get what you're saying. I periodically check in on this by hooking up my PC to a 4K screen and yeah, dumb ass AI and janky animations is what makes games look shitty, not a couple of pixels more resolution.
 
OP
Baccus
Dec 4, 2018
978
Thread backfire of the century
Not really, I have actually enjoyed reading the opinions and seeing people actually agree with my thesis. Some rebuttals have been more technical, and I appreciate those too.

I really want devs to find a happy medium, that's a lot of resources going down the drain.
 
Oct 27, 2017
4,262
Not really, I have actually enjoyed reading the opinions and seeing people actually agree with my thesis. Some rebuttals have been more technical, and I appreciate those too.

I really want devs to find a happy medium, that's a lot of resources going down the drain.
Soon 4k will be THE happy medium as TV makers move on to 8k in just a few years.

All the premium TV brands will release 8k TVs this year, and Samsung already did last year.
 
Oct 25, 2017
4,205
8K is going to be dumb. Probably good for movie theaters with gigantic screens, but come on. What kind of PC hardware can comfortably run a modern game like Cyberpunk at 8K on high settings at 60 FPS?
 
Oct 29, 2017
5,174
Mt. Whatever
Testing on consoles huh. Found your problem. Especially with PS4Pro. It wasn't very impressive to me either, which is why I've pretty much abandoned mine after building a decent PC.

How many console games ship with 4K textures... or even just "Ultra" textures in general?

PC Versions of FFXV and R6 Siege with the 4K texture packs are insane night and day differences between HD and UHD. It's nice that console games are increasingly rendering at higher resolutions, but developers should probably start trying to figure out how to deliver 4K assets (textures, shadows, VFX, etc...) as well. Maybe as optional patch/DLC?
 
Oct 25, 2017
3,374
lol OP. Try playing Wolfenstein on the Switch and see if 360p games look good to you.

I do agree with your overall assertion that next gen a lot of processing power will be wasted on pushing pixels rather than detail and polygons, but that's true for every gen. This gen we wasted a lot of resources trying to render double the number of pixels we did last gen. I don't think native 4K is something any dev will chase on PS5, because no one wants to waste 4 out of 8 or 10 TFLOPs trying to render 8 million pixels. Most will just do 1440p or checkerboard 4K and call it a day.

As for seeing a difference on your TV, I would recommend you play a non-Pro-enabled game like Batman: Arkham Knight, Driveclub or Bloodborne. These games run at 1080p and were among the best-looking games of their time, and yet they have awful shimmering, jaggies and other artifacts that ruin the picture quality on 4K screens as that 1080p image gets stretched to oblivion. Then go back and watch Horizon, Spider-Man and even SOTC, which only runs at 1440p, roughly half the pixels of native 4K. You will start to see the difference because the jaggies will go away. Objects in the distance won't have shimmering. Draw distance won't be blurry. The image will be super crisp and clean.

Again, crisp and clean is what you get as you go above 1080p. You aren't always going to see things you couldn't see before, but the biggest difference is going to be how clean everything looks. Try playing SOTC in both the 30 FPS and 60 FPS modes. 60 FPS defaults to 1080p and you lose all the detail on the grass. Not because the grass is downgraded (it's actually the same assets), but because you don't get to see the little details you get in the higher-res modes.
 
Oct 26, 2017
784
Illinois
Only read the first page, but the fact that you don't list which model Samsung TV you got indicates to me you didn't do much research. Not all 4K TVs are created equal. My 4K TCL TV doesn't compare to my Vizio P series, which doesn't compare to an LG OLED.
 
Oct 30, 2017
1,088
Wait a few weeks and then play some games at 1080p. That's the moment when you realize how important it is to play at 4K.
This happened to me on my last vacation: I left my Pro at home, and when I played on my brother's 1080p set using a standard PS4, I felt everything looked considerably blurrier.