
Do you want 4K?

  • Yes, I need that sweet sweet crispness.

    Votes: 980 44.0%
  • At the screen size I'm at, I'm fine with 1080p or 1440p.

    Votes: 1,245 56.0%

  • Total voters
    2,225

Redeye97

Banned
Apr 25, 2019
462
I still don't really see the difference between 720p and 1080p unless they're put side by side. Resolution never really affected the quality of a game for me, and it likely won't be the reason why I'll upgrade to a next gen console.
 
Nov 8, 2017
3,532
I'd much rather have 60fps than 4k.

1080p vs 4K == negligible difference in the experience.
30fps vs 60fps == the difference between me buying a game or not.
 

Peterc

Banned
Oct 28, 2017
370
I think 4k is ok if you have realistic graphics.
8k only matters for VR, for the rest it's just a marketing stunt.
 

HellofaMouse

Member
Oct 27, 2017
6,159
Before we reached 1080p, the low res was noticeable even when gaming at a desk, sitting close to the monitor.

I really feel like 1080p is a sweet spot; anything above that is too much of a power sink.
 

dgrdsv

Member
Oct 25, 2017
11,848
For 50-70" TVs, 4K is certainly an improvement over 1080p, but that's about as big as I'd want a TV to be in my home.

For PC monitors I'm not sure, as there aren't a lot of monitors with the right combination of size and 4K just yet. It may be that 27" 1440p and its widescreen analogues are enough for now, or maybe not.
 

Tovarisc

Member
Oct 25, 2017
24,407
FIN
Going from 1080p 16:9 to 3440x1440 21:9 made a very big difference for me in gaming. I can't see myself going back.

It's a bit different from just a resolution bump, but the change from regular 1080p can be a game changer.
 

HBK

Member
Oct 30, 2017
7,972
I can't even tell the difference between Mario Kart 8 (720p) and Mario Kart 8 Deluxe (1080p). Plenty of people on here were shocked to learn that Yoshi's Crafted World was sub-720p because it looked like 720p, 900p or even 1080p to them. I'd prefer a 720p game with AA over a 1080p game without AA, so I'm much more interested in AA than in higher resolution.
Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.

I dunno there's something awfully wrong with this. It's like having a 300hp car and asking for a 500hp car when you're still limited to ~85 mph anyway (in most countries, that's around the max allowed speed limit, like 130km/h here in France). It just reeks of entitlement "I just want more" when you just can't freakin' tell by yourself what you have unless an expert takes some carefully measured values to tell you what they actually are.
 

Deleted member 10193

User requested account closure
Banned
Oct 27, 2017
1,127
... Because 4K was (unfortunately for 3D games) rushed out to market to satisfy demands in other media areas, which was what the post he was replying to was about.
Nothing to do with it being rushed. They wanted to introduce a mid gen refresh and decided that 4K would be easier to sell than slightly better looking 1080p games.
Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.

I dunno there's something awfully wrong with this. It's like having a 300hp car and asking for a 500hp car when you're still limited to ~85 mph anyway (in most countries, that's around the max allowed speed limit, like 130km/h here in France). It just reeks of entitlement "I just want more" when you just can't freakin' tell by yourself what you have unless an expert takes some carefully measured values to tell you what they actually are.
So now people are entitled for wanting 4K output on a 4K TV?

Just because YOU can't tell the difference between 720p and 1080p doesn't mean other people can't.

I have bad eyes and probably need glasses, and even I can tell that 1080p games on a 4K TV look like shit.
 

packy17

Banned
Oct 27, 2017
2,901
Nothing to do with it being rushed. They wanted to introduce a mid gen refresh and decided that 4K would be easier to sell than slightly better looking 1080p games.

And this is the problem with discussing this topic.

Anyone who has a 1440p monitor will tell you that it's a lot more than just "slightly better 1080p". The IQ bump is fairly significant and the resource requirement is way lower. If the industry hadn't forced 4K through early - and yes, they absolutely did force it early based on how hard it still is to get things to run well on PC at 4K - we would all be having a pretty great experience with 2K, even on current gen consoles in a lot of cases.
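The pixel arithmetic behind that point is easy to check. A quick sketch (the resolution table and helper are ours, not from the post): 1440p pushes roughly 1.78x the pixels of 1080p, while 4K pushes 4x.

```python
# Pixel counts for the common gaming resolutions discussed in this thread.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name: str) -> int:
    """Total pixels per frame at the named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

ratio_1440 = pixel_count("1440p") / pixel_count("1080p")  # ~1.78x
ratio_4k = pixel_count("4K") / pixel_count("1080p")       # 4.0x
```

So 1440p buys a visible image-quality bump at well under half the pixel cost of 4K, which is the "2K would have been a great experience" argument in numbers.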
 

HBK

Member
Oct 30, 2017
7,972
So fuck me then since you have the eyesight of Mr Magoo?
That's funny.

At work we're constantly shuffling between various displays/computers and they're ALWAYS at the wrong resolution. And I see it. Because I'm one of those socially inept pixel counters. And when I point it out, most people genuinely don't see it. You can point it out precisely ("look here, it's interpolating on a 4x4 pixel basis") and people just don't fucking see it (and the people who do see it mostly just don't care). And fuck me, we work in a business where, among other things, we do quite a bit of image processing 🤷‍♀️

But nice try.
 

HMD

Member
Oct 26, 2017
3,300
I think 1440p is the sweet spot, especially when playing aim intensive games, better resolution = better visual clarity = better aiming.
 

GhostTrick

Member
Oct 25, 2017
11,305
I think 1440p is the sweet spot, especially when playing aim intensive games, better resolution = better visual clarity = better aiming.



This is where I sit personally.
Why? Because 1440p at 144fps.
If I have enough power, I can even downsample.

Aiming for native res is important, and a 4K target doesn't guarantee it. 1800p on 4K is upscaling; 1800p on 1440p is downscaling. I'll take the latter, especially if I can get a good framerate on top of that.
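The scale factors in that example are easy to sanity-check (the helper name is ours): the same 1800p render is stretched up for a 2160-line display but shrunk down for a 1440-line one.

```python
def scale_factor(render_lines: int, display_lines: int) -> float:
    """Linear scale factor from render resolution to display resolution.
    > 1.0 means upscaling (interpolation invents pixels);
    < 1.0 means downscaling (multiple rendered samples per display pixel)."""
    return display_lines / render_lines

up = scale_factor(1800, 2160)    # 1.2x  -> 1800p shown on a 4K panel
down = scale_factor(1800, 1440)  # 0.8x  -> 1800p shown on a 1440p panel
```

Downscaling is effectively supersampling, which is why the 1440p owner gets the cleaner image out of the same 1800p render.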
 

Dimajjio

Member
Oct 13, 2019
782
I can't wait to see gaming at 8K. But I'll probably be pushing up daisies by the time it becomes the norm.
 
Oct 28, 2017
1,715
Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.

I dunno there's something awfully wrong with this. It's like having a 300hp car and asking for a 500hp car when you're still limited to ~85 mph anyway (in most countries, that's around the max allowed speed limit, like 130km/h here in France). It just reeks of entitlement "I just want more" when you just can't freakin' tell by yourself what you have unless an expert takes some carefully measured values to tell you what they actually are.

This is such a stupid analogy, and that's impressive as car analogies are usually terrible to begin with. Do you know what acceleration is, and how horsepower can affect it?
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
Seeing how ray-tracing is becoming an actual thing, games will render at 1440p at most and then upscale through reconstruction techniques to 4K/8K. It's the only way.

Rendering at native resolution at 4K is such a waste of resources imo. DLSS and similar technologies are the way to go.
 

GymWolf86

Banned
Nov 10, 2018
4,663
Seeing how ray-tracing is becoming an actual thing, games will render at 1440p at most and then upscale through reconstruction techniques to 4K/8K. It's the only way.

Rendering at native resolution at 4K is such a waste of resources imo. DLSS and similar technologies are the way to go.
DLSS looks blurry in 95% of games.
Real 4K is noticeably crisper.
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
DLSS looks blurry in 95% of games.
Real 4K is noticeably crisper.
Ok, but how are you going to have games rendering at 4K at playable framerates with all the new rays that people are asking for on next-gen consoles? The more pixels you render, the more rays you need, and that tanks performance.

You are right about the DLSS thing, most that are out there look blurry, but the latest DLSS games look very good at 4K DLSS.
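The "more pixels means more rays" point is just linear scaling: at a fixed samples-per-pixel budget, ray count grows with pixel count. A back-of-the-envelope sketch (the function and sample count are ours, for illustration):

```python
def rays_per_frame(width: int, height: int, samples_per_pixel: int = 1) -> int:
    """Primary rays needed per frame at a fixed per-pixel sample budget."""
    return width * height * samples_per_pixel

r_1080 = rays_per_frame(1920, 1080)  # ~2.07M primary rays
r_4k = rays_per_frame(3840, 2160)    # ~8.29M primary rays
ratio = r_4k / r_1080                # 4.0x the ray budget for 4K
```

And that is only primary rays; secondary bounces multiply the cost further, which is why reconstruction from a lower internal resolution is attractive for ray-traced games.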
 

monmagman

Member
Dec 6, 2018
4,126
England,UK
I have been at 1080p on my Panasonic plasma and a base PS4 all generation and it's been great, although I've never seen a game at 4K with my own eyes. I will get a new 4K TV next gen for the PS5 though, as my TV is coming towards the end of its life cycle anyway.
 

Tedmilk

Avenger
Nov 13, 2017
1,909
I can see a case for 4K if you have a huge screen, but I don't... and if I did I'd always prioritise framerate anyway.
 
Dec 15, 2017
1,590
I think I am being display bottlenecked. I am playing several games at 4k on a 1080p screen with AMD VSR and while they look sharper it's not a night and day difference. 32 inch TV used as a monitor BTW. Distance 3 to 6 ft. But it's always great to have more power to spare, games will become more demanding anyway.
 
Nov 8, 2017
13,099
I think I am being display bottlenecked. I am playing several games at 4k on a 1080p screen with AMD VSR and while they look sharper it's not a night and day difference. 32 inch TV used as a monitor BTW. Distance 3 to 6 ft. But it's always great to have more power to spare, games will become more demanding anyway.

I'd say you're definitely display limited, but if you're happy with it then that's all that really matters. Supersampling will at least heavily reduce aliasing and some other annoying artefacts, even if it's not going to look as sharp as a native 4k image.

Some games won't look very sharp generally anyway. Doom 2016 for example when using TSSAA will always look a touch soft, and even when I supersample from 5k down to my 1440p monitor it still looks this way. Although I can disable all AA and just use the 5k downsample as a substitute for AA (effectively this is 4x supersampling) and it does look a fair bit sharper, while also looking similarly "clean" as a 1440p TSSAA image.
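The "5k down to my 1440p monitor is effectively 4x supersampling" figure checks out: 5120x2880 puts exactly four rendered pixels behind each 2560x1440 display pixel, a 2x2 grid, which is classic ordered-grid SSAA. A quick sketch (helper name is ours):

```python
def supersample_factor(render: tuple[int, int], display: tuple[int, int]) -> float:
    """Rendered pixels per display pixel when downsampling render -> display."""
    rw, rh = render
    dw, dh = display
    return (rw * rh) / (dw * dh)

factor = supersample_factor((5120, 2880), (2560, 1440))  # 4.0, i.e. 2x2 SSAA
```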
 

Wackamole

Member
Oct 27, 2017
16,932
I don't mind a higher resolution, but I do mind the dumb focus on it for consoles that can barely do 1080p at 60fps in a decent way.
 

Deleted member 55311

User requested account closure
Banned
Mar 26, 2019
341
If going over 1080p isn't worth it don't put 1440p in the same poll. I refuse to participate.

Going over 1080 is worth it which is why I'm at 1440p but I don't care about going to 4K right now.
 

GymWolf86

Banned
Nov 10, 2018
4,663
Ok, but how are you going to have games rendering at 4K at playable framerates with all the new rays that people are asking for on next-gen consoles? The more pixels you render, the more rays you need, and that tanks performance.

You are right about the DLSS thing, most that are out there look blurry, but the latest DLSS games look very good at 4K DLSS.
You are right.

But a lot of people like me don't give a fuck about RTX, and we hope devs will forget that stuff and concentrate on raw graphics and real 4K.
 

hichanbis

Banned
Nov 2, 2017
139
I still don't have a 4K TV.
Maybe for the PS5 who knows.

But as long as we have to choose between framerate and resolution I'll choose framerate any day.
 

blue_phazon

Prophet of Truth
Member
Oct 25, 2017
2,315
Shocked at those poll results. After seeing 4K why would you want to settle for anything less? I will concede that the upgrade is more apparent in live action video than video games.
Playing games at 120+ fps looks better than 4K.

They also play better, but to me they look better visually because you can see all the details of the image even in motion.
 
Oct 28, 2017
1,715
You are right.

But a lot of people like me don't give a fuck about RTX, and we hope devs will forget that stuff and concentrate on raw graphics and real 4K.

Why the hell would you not be interested in making 4K more efficient to achieve? Go and look at the Youngblood Digital Foundry breakdown, DLSS has come a long way.

'Raw graphics and real 4k', honestly.
 

Aether

Member
Jan 6, 2018
4,421
You are right.

But a lot of people like me don't give a fuck about RTX, and we hope devs will forget that stuff and concentrate on raw graphics and real 4K.
So... lighting is not part of "raw graphics"? Can you define "raw graphics"?
Computer graphics are a composite of many things, including but not limited to:
  • model resolution (how many polygons)
  • texture resolution
  • draw distance (how far away stuff can be rendered)
  • texture blending
  • render resolution
  • arguably time resolution (FPS)
  • lighting:
    • local illumination (flat, Phong, ...)
    • global illumination (path tracing, ambient occlusion, radiosity, ...)
  • ...
Raytracing is just a better way of doing global illumination than the methods we use now. So by definition, raytracing is part of "RAW GRAPHICS".

Time resolution is a special case.
In prerendered animation it would be irrelevant, since the media is consumed AFTER it is generated in its entirety.
Pixar could have made every movie at 1000 fps if they had wanted to (and there had been a market).
Since games are interactive and instant, time resolution is a factor in game development.

With all of the above, the question is where to find the balance. Which aspects do you see and feel the most?

If a game with 2x the resolution looks a bit better, but with 2x the frames feels way better to play, it is usually better to take the second route (fast action games, FPS, VR games, ...).
If the gameplay is slow enough (round-based JRPGs, visual novels, most RTS, ...), then the increase in resolution can have a bigger effect than an increase in framerate.

Since the medium advertised itself for decades with static images (until the early PS3 era), framerate was not something you could use for advertisement, so resolution was prioritized.
With video previews, YouTube 60fps, etc., things have changed.

There are enough games where ULTRA HIGH settings tank the framerate because they are so demanding, but lowering them just one step lets you play at 60fps, and since the gameplay is usually not still frame after still frame, you don't even see the difference. These are the "RAW GRAPHICS" that eat up a lot but don't look that much better. It's called "diminishing returns". Sure, we are not at the point of "no returns", but that's the word: "diminishing". The returns get proportionally smaller, and with resolution we are starting to feel it. With framerate, I would say it starts at 60, where most people would probably "see" 120 fps but would still prefer 4K over 120 fps.
With the other aspects of graphics, it's the same. See reflections, ambient occlusion, real-time shadows, real-time clouds, etc. All of these are things you could turn on and off, and a good slice of people would not notice, since they are focused on the game.

A lot of improvements I can see being obvious are better animations, but curiously, these are less bound by the hardware for now and more a problem of the production pipeline, and of the fact that it is hard to create good animations and good animation systems.

My preferences, when we are talking about next gen:
Depending on the game:
60fps @ 1440p as a general rule
60fps @ 4K for simpler-looking games
30fps @ 4K with a lot of graphics (JRPGs, slow games, Sony/Microsoft big-money games)
30fps @ 1440p with ALL THE GRAPHICS (raytracing, etc.) (for ambitious graphics that can't hold 4K)

For most 2D indies:
60-120fps @ 4K, since I don't see a reason why 2D indie games would not hit that target on the new consoles.

120fps @ 1440p for VR?
 
Last edited:

GymWolf86

Banned
Nov 10, 2018
4,663
So... lighting is not part of "raw graphics"? Can you define "raw graphics"?
[snip]
120fps @ 1440p for VR?
I'm just a guy who doesn't care much for nicer shadows and reflections at the cost of other stuff that's more important to me.
Yeah, "raw graphics" was the wrong choice of words, my bad.
 

GymWolf86

Banned
Nov 10, 2018
4,663
Why the hell would you not be interested in making 4K more efficient to achieve? Go and look at the Youngblood Digital Foundry breakdown, DLSS has come a long way.

'Raw graphics and real 4k', honestly.
I'm not saying that DLSS can't improve, but I've only used it in 2-3 games and the results were pretty bad (the last one was MHW).
 

Brot

Member
Oct 25, 2017
6,043
the edge
I much prefer framerate over resolution. If I had to choose between 2160p at 30fps and 1080p at 60fps, I'd always pick 1080p60.
 

Black_Stride

Avenger
Oct 28, 2017
7,388
You are right.

But a lot of people like me don't give a fuck about RTX, and we hope devs will forget that stuff and concentrate on raw graphics and real 4K.

DLSS implementations are getting so good now that they rival native 4K, and IMO, looking at the Youngblood comparisons, I'd say they've somehow managed to get AI to make sub-4K DLSS look better than actual 4K.
 

Aether

Member
Jan 6, 2018
4,421
I'm just a guy who doesn't care much for nicer shadows and reflections at the cost of other stuff that's more important to me.
Yeah, "raw graphics" was the wrong choice of words, my bad.
Would you mind defining what you would prefer?
I'm just curious.

I much prefer framerate over resolution. If I had to choose between 2160p at 30fps and 1080p at 60fps, I'd always pick 1080p60.

Even in round-based games? With slow movement? Visual novels? Menu-based games?
To be honest, menu navigation between 30 and 60 fps is not that different.
 

Woffls

Member
Nov 25, 2017
918
London
Diminishing returns for sure but 1440p is worth it if I can hit 60 FPS. I run very few games at 4K, even on my 55" OLED in a small room.
 

exofrenon

Member
Mar 30, 2019
155
I am also perfectly fine with 1080p; any current-gen game looks ridiculously good to me on my laptop or even on my 24" screen on Xbox One.

At this point, it depends wholly on the size of your monitor and how far you are from it. I believe up to 24", 1080p should be the sweet spot, assuming you are at a desktop distance; any resolution above that brings diminishing returns.

Nonetheless, it is only a matter of time before larger screens become more affordable and become the new standard. Many people already have 27" or above, and I don't think 1080p is gonna cut it there.

I think this is an ok-ish guide about viewing distance and resolution, where they explain the notion of "pixels per angle", which makes a lot of sense to me:
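The "pixels per angle" idea can be sketched in a few lines: how many pixels fall within one degree of your field of view at a given distance. The helper below is our own illustration, not from the guide; the ~60 pixels-per-degree threshold for 20/20 acuity is a commonly cited ballpark, also an assumption here.

```python
import math

def pixels_per_degree(diagonal_in: float, res_w: int, res_h: int,
                      distance_in: float) -> float:
    """Horizontal pixels per degree of visual angle for a flat screen."""
    # Physical screen width from the diagonal and aspect ratio.
    aspect = res_w / res_h
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    ppi = res_w / width_in
    # One degree of visual angle spans this many inches at the given distance.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# e.g. a 24" 1080p monitor viewed from 24 inches away:
ppd = pixels_per_degree(24, 1920, 1080, 24)  # ~38 px/deg
```

Doubling the viewing distance doubles the pixels per degree, which is why the same resolution can look sharp from the couch and soft at a desk.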
 

Aether

Member
Jan 6, 2018
4,421
I am also perfectly fine with 1080p, any current-gen game looks ridiculously good to me on my laptop or even my 24" screen on xbox one.

At this point, it depends wholly on the size of your monitor and how far you are from it. I believe up to 24" 1080p should be the sweet spot, assuming you are from a desktop distance, and any resolution above that is causing diminishing returns.

Nonetheless, it is only a matter of time before larger screens become more affordable and become the new standard. Many people already have 27" or above that, and I think that 1080p ain't gonna cut it.

I think this is an ok-ish guide about viewing distance and resolution, where they explain the notion of "pixels per angle", which makes a lot of sense to me:
Well, my Switch is 1080p on a 4K 27" monitor.
It can look a bit soft, but to be honest... for Nintendo graphics this is pretty decent. 1440p would be the resolution where it would be hard for me to see the difference, and even if we stay at 1080p, with a good upscaler I would be okay with it.
To be honest, I want Nintendo to stay at an internal resolution of 1080p and use a good upscaler, but bump up the draw distance and the rest (mostly to have more stuff on screen and a higher draw distance in the next Zelda game).