What? Yoshi is quite obviously sub-native resolution with how blurry it is.
Saw someone post something similar elsewhere. Some folks just need to get their eyes checked 🤓
Consoles are played on TVs. TVs don't have 1440p. They are 1080p or 4k.
I'm waiting for 8K VR.

> I think 4k is ok if you have realistic graphics.
8k only matters for VR, for the rest it's just a marketing stunt.
> Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.

I can't even tell the difference between Mario Kart 8 (720p) and Mario Kart 8 Deluxe (1080p). Plenty of people on here were shocked to learn that Yoshi's Crafted World was sub-720p because it looked like 720p, 900p or even 1080p to them. I'd rather have a 720p game with AA than a 1080p game without it, so I'm much more interested in AA than in higher resolution.
> Nothing to do with it being rushed. They wanted to introduce a mid gen refresh and decided that 4K would be easier to sell than slightly better looking 1080p games.

Because 4K was (unfortunately for 3D games) rushed out to market to satisfy demand in other media areas, which was what the post he was replying to was about.
So now people are entitled for wanting 4K output on a 4K TV?

> Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.
I dunno there's something awfully wrong with this. It's like having a 300hp car and asking for a 500hp car when you're still limited to ~85 mph anyway (in most countries, that's around the max allowed speed limit, like 130km/h here in France). It just reeks of entitlement "I just want more" when you just can't freakin' tell by yourself what you have unless an expert takes some carefully measured values to tell you what they actually are.
When most of them can't tell the difference in the first place? Yes.
So fuck me then, since you have the eyesight of Mr. Magoo?

> When most of them can't tell the difference in the first place? Yes.
Heck, I'm sure many people play 4K content on a 4K TV using a player configured to 1080p lol
Nothing to do with it being rushed. They wanted to introduce a mid gen refresh and decided that 4K would be easier to sell than slightly better looking 1080p games.
That's funny.
I think 1440p is the sweet spot, especially when playing aim intensive games, better resolution = better visual clarity = better aiming.
Yeah it's always amazing to see people go wild about numbers they always want to see go higher because "why not", but then need experts with specifically developed tools to tell them what those numbers are exactly in the end.
I dunno there's something awfully wrong with this. It's like having a 300hp car and asking for a 500hp car when you're still limited to ~85 mph anyway (in most countries, that's around the max allowed speed limit, like 130km/h here in France). It just reeks of entitlement "I just want more" when you just can't freakin' tell by yourself what you have unless an expert takes some carefully measured values to tell you what they actually are.
DLSS looks blurry in 95% of games.

> Seeing how ray-tracing is becoming an actual thing, games will render at 1440p at most and then upscale through reconstruction techniques to 4K/8K. It's the only way.
Rendering at native resolution at 4K is such a waste of resources imo. DLSS and similar technologies are the way to go.
Ok, but how are you going to have games rendering at 4K at playable framerates with all the new rays that people are asking for on next-gen consoles? The more pixels you render, the more rays you need, and that tanks performance.

> Dlss looks blurry in 95% of games.
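To put toy numbers on that pixel-to-ray scaling (my own back-of-the-envelope sketch, assuming one primary ray per pixel; real renderers also trace secondary and shadow rays, so the real cost is higher):

```python
# Toy estimate: primary rays per frame scale linearly with pixel count.
def rays_per_frame(width: int, height: int, rays_per_pixel: int = 1) -> int:
    return width * height * rays_per_pixel

rays_1080p = rays_per_frame(1920, 1080)  # 2,073,600 primary rays
rays_4k = rays_per_frame(3840, 2160)     # 8,294,400 primary rays
print(rays_4k / rays_1080p)              # 4.0 -> native 4K needs 4x the rays of 1080p
```

Which is exactly why reconstruction is attractive: render fewer pixels, trace fewer rays, then upscale.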
Real 4K is noticeably crisper.
I think I am being display-bottlenecked. I am playing several games at 4K on a 1080p screen with AMD VSR, and while they look sharper, it's not a night-and-day difference. 32-inch TV used as a monitor, BTW, at a distance of 3 to 6 ft. But it's always great to have more power to spare; games will become more demanding anyway.
You are right.

> Ok, but how are you going to have games rendering at 4K at playable framerates with all the new rays that people are asking for on next-gen consoles? The more pixels you render, the more rays you need, and that tanks performance.
You are right about the DLSS thing: most of what's out there looks blurry, but the latest DLSS games look very good at 4K DLSS.
Playing games at 120+ fps looks better than 4K.

> Shocked at those poll results. After seeing 4K why would you want to settle for anything less? I will concede that the upgrade is more apparent in live action video than video games.
You are right.
But a lot of people like me don't give a fuck about RTX, and we hope that devs are gonna forget that stuff and concentrate on raw graphics and real 4K.
So... lighting is not part of "raw graphics"? Can you define "raw graphics"?

> You are right. But a lot of people like me don't give a fuck about RTX, and we hope that devs are gonna forget that stuff and concentrate on raw graphics and real 4K.
I'm just a guy who doesn't care much for nicer shadows and reflections at the cost of other stuff that is more important to me.

> So... lighting is not part of "raw graphics"? Can you define "raw graphics"?
Computer graphics are a composite of many things, including but not limited to:
- model resolution (how many polygons)
- texture resolution
- draw distance (how far away stuff can be rendered)
- texture blending
- render resolution
- arguably time resolution (FPS)
- lighting:
  - local illumination (flat, Phong, ...)
  - global illumination (path tracing, ambient occlusion, radiosity, ...)
- ...

Raytracing is just a better way of doing global illumination than the methods we use now. So by definition, raytracing is part of "RAW GRAPHICS".
Time resolution is a special case.
In prerendered animation it would be irrelevant, since the media is consumed AFTER it is generated in its entirety.
They could have made every Pixar movie at 1000 fps if they had wanted to (and there had been a market).
Since games are interactive and instant, time resolution is a factor in game development.
With all of the above, the question is where to find the balance: which aspects do you see and feel the most?
If a game with 2x the resolution looks a bit better, but with 2x the frames feels way better to play, it is usually better to take the second route (fast action games, FPS, VR games, ...).
If the gameplay is slow enough (turn-based JRPGs, visual novels, most RTS, ...), then an increase in resolution can have a bigger effect than an increase in framerate.
Since the medium had advertised itself for decades with static images (till the early PS3 era), framerate was not something you could use for advertisement, and so resolution was prioritized.
With video previews, YouTube 60 fps, etc., things have changed.
There are enough games where ULTRA HIGH settings tank the framerate, since they are so demanding, but lowering them just one step lets you play at 60 fps, and since the gameplay is usually not stillframe after stillframe, you don't even see the difference. These are the "RAW GRAPHICS" that eat up a lot but don't look that much better. It's called "diminishing returns". Sure, we are not yet at the point of "no returns", but it is not called "no returns", it's called "diminishing": the gains get proportionally smaller, and with resolution we are starting to feel it. With framerate, I would say it starts at 60, where most people would probably "see" 120 fps but would still prefer 4K over 120 fps.
With the other aspects of graphics, it's the same. See reflections, ambient occlusion, real-time shadows, real-time clouds, etc. All of these are stuff you could turn on and off, and a good slice of people would not notice, since they are focused on the game.
An improvement I can see being obvious is better animation, but curiously, that is less bound by the hardware for now and more a problem of the production pipeline: it is hard to create good animations and good animation systems.
My preferences, when we are talking about next gen, depending on the game:
- 60 fps @ 1440p as a general rule
- 60 fps @ 4K for simpler looking games
- 30 fps @ 4K with a lot of graphics (JRPGs, slow games, Sony/Microsoft big-money games)
- 30 fps @ 1440p with ALL THE GRAPHICS (raytracing, etc.), for ambitious graphics that can't keep 4K
- For most 2D indies: 60-120 fps @ 4K, since I don't see a reason why 2D indie games would not hit that target on the new consoles.
- 120 fps @ 1440p for VR?
I'm not saying that DLSS can't improve, but I only used it in 2-3 games and the results were pretty bad (the last one was MHW).

> Why the hell would you not be interested in making 4K more efficient to achieve? Go and look at the Youngblood Digital Foundry breakdown, DLSS has come a long way.
'Raw graphics and real 4k', honestly.
> You are right. But a lot of people like me don't give a fuck about RTX, and we hope that devs are gonna forget that stuff and concentrate on raw graphics and real 4K.
Would you mind defining what you would prefer?

> I'm just a guy who doesn't care much for nicer shadows and reflection at the cost of other stuff that for me is more important.
Yeah, "raw graphics" was the wrong choice of words, my bad.
Even in turn-based games? With slow movement? Visual novels? Menu-based games?

> I much prefer framerate over resolution. If I had to choose between 2160p30 and 1080p60 I'd always pick 1080p60.
Bitrate is a thing, you know?
Well, my Switch is 1080p on a 4K 27" monitor.

> I am also perfectly fine with 1080p, any current-gen game looks ridiculously good to me on my laptop or even my 24" screen on Xbox One.
At this point, it depends wholly on the size of your monitor and how far you are from it. I believe up to 24", 1080p should be the sweet spot, assuming you are at a desktop distance, and any resolution above that brings diminishing returns.
Nonetheless, it is only a matter of time before larger screens become more affordable and become the new standard. Many people already have 27" or above that, and I think that 1080p ain't gonna cut it.
I think this is an ok-ish guide about viewing distance and resolution, where they explain the notion of "pixels per angle", which makes a lot of sense to me:
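For what it's worth, that "pixels per angle" idea can be sketched with a quick back-of-the-envelope calculation. This is my own toy function and numbers, not taken from that guide:

```python
import math

def pixels_per_degree(diagonal_in, res_w, res_h, distance_in):
    """Approximate pixels per degree of visual angle for a flat panel."""
    aspect = res_w / res_h
    # physical screen width derived from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch = width_in / res_w  # inches per pixel
    # visual angle covered by a single pixel, in degrees
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / deg_per_pixel

# A 24" panel viewed from ~24" away (a typical desktop distance)
print(round(pixels_per_degree(24, 1920, 1080, 24)))  # ~38 ppd at 1080p
print(round(pixels_per_degree(24, 3840, 2160, 24)))  # ~77 ppd at 4K
```

Doubling the viewing distance roughly doubles the pixels per degree as well, which is why 1080p can still look fine from couch distance while the same panel looks soft on a desk.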