
Pargon

Member
Oct 27, 2017
11,994
I have always said it and will say it again: I hope consoles stay at 30fps and push graphics and effects as far as possible, so I can have all that plus a good framerate on PC instead.
If consoles pushed 60 or 120fps, that would mean less eye candy and less progress with graphics in general.
If the PlayStation 5 is going to have an 8-core Ryzen clocked at 3.20 GHz as rumored, and 8-core Ryzens in desktop PCs can hit ~4.25 GHz (~33% faster), how do you expect games built for 30 FPS on that hardware to run at high frame rates on a PC?
Maybe an Intel CPU with 8 cores running at 5.00 GHz is ~56% faster rather than ~33% faster. That's still only ~47 FPS.

Now PCs will be moving toward having more cores - it's speculated that 16-core Ryzen CPUs will be released this year. If all else stays the same, that should effectively double performance.
That doesn't necessarily help gaming though, if the games are only built to support 8 cores. We still have games released today that are bottlenecked by most of the game logic running on a single core, never mind failing to scale beyond 4 cores.
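To make that scaling argument concrete, here's a back-of-the-envelope sketch in C++. It assumes a purely CPU-bound frame rate that scales linearly with clock speed (same core count, same IPC), which real games rarely achieve; the clocks are the rumored/typical figures from the post, not confirmed specs.

#include <cstdio>

int main()
{
    // Naive assumption: CPU-bound FPS scales linearly with clock speed.
    const double console_clock_ghz = 3.20;          // rumored PS5 clock
    const double console_fps       = 30.0;          // target on that hardware
    const double pc_clocks_ghz[]   = {4.25, 5.00};  // desktop Ryzen / Intel examples

    for (double clock : pc_clocks_ghz) {
        double speedup = clock / console_clock_ghz;
        std::printf("%.2f GHz -> ~%.0f%% faster -> ~%.0f FPS\n",
                    clock, (speedup - 1.0) * 100.0, console_fps * speedup);
    }
    return 0;
}

// Prints roughly: 4.25 GHz -> ~33% faster -> ~40 FPS
//                 5.00 GHz -> ~56% faster -> ~47 FPS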

And for the love of god get 16xAF on everything.
No excuse.
Anisotropic filtering hits bandwidth pretty heavily, which is why it's not used as often on consoles.
The performance hit is almost negligible on PC most of the time because they have a lot more bandwidth to spare.
That said, it's not zero, and some newer games, especially at higher resolutions, do actually take a noticeable hit. It's why many games only use 8x in their "ultra" presets now, rather than 16x.
I could see something like 4x being the baseline on next-gen consoles though, rather than 16x, as it would still be a big improvement over not using any.
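For reference, on PC this is essentially a one-line sampler setting. A minimal sketch using OpenGL, assuming a GL 3.3+ context with the EXT_texture_filter_anisotropic extension and a function loader already set up, and that `sampler` is an existing sampler object:

// Clamp to the driver's limit (commonly 16x) and apply it to one sampler.
void EnableMaxAnisotropy(GLuint sampler)
{
    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);   // driver limit, typically 16
    GLfloat desired = maxAniso < 16.0f ? maxAniso : 16.0f;       // ask for 16x at most
    glSamplerParameterf(sampler, GL_TEXTURE_MAX_ANISOTROPY_EXT, desired);
}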
 

funky

Banned
Oct 25, 2017
8,527
I feel like native 4k is a waste when 70% of the screen is a mess of motion blur at any one time.

More advanced checkerboarding and interpolation, and stuff like Nvidia's adaptive shading, should be the future now that we have ridiculously high-resolution screens imo
 

Dan Thunder

Member
Nov 2, 2017
14,020
As always with consoles there'll be a trade-off between resolution, frame rate and graphical prowess. I'm not too bothered about every title hitting native 4K, as I haven't had an issue this generation with games not always hitting 1080p.
 

Deleted member 48434

User requested account closure
Banned
Oct 8, 2018
5,230
Sydney
As someone who couldn't give a damn about 4K and its piddly-ass improvements, give me the option to go with 1080p at 60fps any day.
If we are hitting the point where every game next gen is in 4K, then 30fps should be dead already.
4K is a meme. Outside of the Era/internet/tech bubble, it's basically irrelevant to the majority. That waste of power can be put to better use elsewhere.
Except for VR.
 

Rosur

Member
Oct 28, 2017
3,502
That's expecting too much. 4k/30 standard with 4k/60 occasionally

Yep, 4K/60 on first-party games plus the bigger-budget AAA titles, and 4K/30 on the rest I'd expect (depending on the sort of game). That said, I can easily see some devs going for more graphics and realism than FPS and sticking to 30 for those reasons.
 

Fafalada

Member
Oct 27, 2017
3,065
we are currently running 4K/30 native on X1X for the most part...it's not some high expectation to have.
Actually it is, considering what we're really seeing is dynamic resolution for the most part already.
But as I said above - give console devs sufficiently granular control over VRS and the concept of a fixed 'native resolution' will all but disappear, as the standard will become running with dynamic shading rates all the time.
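As a point of reference for what "granular control over VRS" means, here's a minimal sketch of the per-draw variable rate shading API that Direct3D 12 exposes on PC today (Tier 1). Console APIs are assumed to offer something at least this fine-grained; `cmdList` is a hypothetical command list recorded on VRS-capable hardware.

#include <d3d12.h>

// Sketch: shade one result per 2x2 pixel block for draws that can tolerate it
// (distant, motion-blurred, or low-frequency surfaces), then restore full rate.
void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... issue the draws that can tolerate coarser shading ...
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}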
 

Yarbskoo

Member
Oct 27, 2017
2,980
Anisotropic filtering hits bandwidth pretty heavily, which is why it's not used as often on consoles.
The performance hit is almost negligible on PC most of the time because they have a lot more bandwidth to spare.
That said, it's not zero, and some newer games -especially at higher resolutions- do actually have a noticeable impact. It's why many games only use 8x in their "ultra" presets now, rather than 16x.
I could see something like 4x being baseline on next-gen consoles though, rather than 16x, as it would still be a big improvement over not using any.
Time to design some dedicated texture-filtering hardware or something, because lower-quality filtering looks just as bad at 4K.

I didn't know that about presets though. I guess I didn't notice due to having 16x forced globally through the driver.
 

Vipu

Banned
Oct 26, 2017
2,276
If the PlayStation 5 is going to have an 8-core Ryzen clocked at 3.20 GHz as rumored, and 8-core Ryzens in desktop PCs can hit ~4.25 GHz (~33% faster), how do you expect games built for 30 FPS on that hardware to run at high frame rates on a PC?
Maybe an Intel CPU with 8 cores running at 5.00 GHz is ~56% faster rather than ~33% faster. That's still only ~47 FPS.

Now PCs will be moving toward having more cores - it's speculated that 16-core Ryzen CPUs will be released this year. If all else stays the same, that should effectively double performance.
That doesn't necessarily help gaming though, if the games are only built to support 8 cores. We still have games released today that are bottlenecked by most of the game logic running on a single core, never mind failing to scale beyond 4 cores.

It would be amazing if games had such good physics and effects that PCs had trouble running them even at 60fps.
I would take tons of improvements in effects and physics and then have the option to turn them down or off if needed.
Better than not moving forward and pushing the hardware to the max.

These days graphics settings barely make any difference in many games, because we don't have those truly amazing high-end options. Compare that to old games, where you could choose between amazing graphics and Minecraft-level graphics - whatever you wanted.
 

Deleted member 22585

User requested account closure
Banned
Oct 28, 2017
4,519
EU
Dunno. Less than 4K can be fine with good temporal AA. I'm all for it if we get more complex physics, improved scale, denser environments, etc. But only at 60fps. I don't want to see 30fps anymore.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
My expectation is that the majority of games will be 4K/60, not every game. No different from my expectation for 360 games, that the majority were 720p. I'm just saying...we are currently running 4K/30 native on X1X for the most part...it's not some high expectation to have.
Most 360 games were lower than 720p. It was rare for them to be 720p native.
 

No Depth

Member
Oct 27, 2017
18,263
How about quality HDR?

I'm more annoyed at bs rendering like in RDR2 or the weak ass Borderlands 2 update patch than non-native 4K
 

BigTnaples

Member
Oct 30, 2017
1,752
I disagree. I'm done with blurry games on my 4k


This. Bring on 8K. I'll have an 8K set by the time the PS5 launches.

These solutions, while good, don't look like native 4K. At least not yet. When DLSS has had some time to mature it will get there, but we are a ways off.

My games finally don't look like jagged, artifact-ridden, jittering messes. Which is much more important.

You have the clarity of a Pixar movie without blurring all the detail out with AA. It's gaming bliss.

Just look at how transformative 4K is for 360-gen games with MS's BC program. Games like Fable II, Red Dead Redemption, Crackdown, etc. are given a new lease on life, and look great, with the only difference being 4K.


The higher resolution the better, and this is even more evident in VR.

4K or bust. Anything less is the equivalent of a 900p game in 2020.
 

Anoregon

Member
Oct 25, 2017
14,028
I'd be happy with 60fps 1440p but I'm just a dumb slug who can't be bothered to aspire to greatness
 

Deleted member 48434

User requested account closure
Banned
Oct 8, 2018
5,230
Sydney
I struggled to tell the difference between 720 and 1080 when I was just a bit younger. At 23 the optometrist tells me my eyesight is good, and I honestly can't tell the difference between 1080 and 4K unless they're placed side by side. And whatever you CAN notice, with your above-average eyesight like some of the users in this thread, is not gonna remotely matter once you actually start watching/playing the media. Seriously, the difference between 30 and 60fps, for comparison, is massive, and it matters for all of 5 minutes before I forget about it. Anyone who complains about HD must have hawkeye genetics (or is a tech loonie).
 

Kaji AF16

Member
Nov 6, 2017
1,405
Argentina
Not necessarily the focus, but the baseline, or (as has already been said) the standard. I am not a resolution fundamentalist, but Xbox One X has been able to run several games at native 4K and the next generation has to at least reach that as a lowest common denominator.

Yesterday I reinstalled AC: Unity because of the Notre Dame fire, and I was surprised at the immense visual downgrade of returning to lower resolutions (900p, Xbox One).
 

Truant

Member
Oct 28, 2017
6,758
I don't really care. PC gaming will always offer arbitrary resolutions and framerates to people like me. My main problem is console exclusives and that they're locked to inferior hardware with either poor IQ or terrible performance.
 

ThingsRscary

Banned
Mar 10, 2019
546
Trust me, next gen isn't about 4K. I can already see that the future is bright from what I've seen.
This gen is like season 3 of Game of Thrones.
 

jett

Community Resettler
Member
Oct 25, 2017
44,653
I think I'm gonna think very poorly of any developers that waste GPU resources on native 4K (and/or ray tracing) instead of targeting 60fps. At least for the upcoming generation I HOPE most games give us a choice between "high performance" and "high quality" or whatever.

Been playing Sekiro and DMC5 for an entire month on my PC. The past few days though I felt like replaying God of War, a year later, on my PS4 Slim. The difference is monstrous. Forget actually controlling Kratos, just moving the camera feels so sluggish and unresponsive. It's actually kinda revolting, frankly. Sekiro and DMC5 feel creamy smooth to play. And I only have a 60fps display; I can only imagine what it's like at 120fps. 30fps is something you can get used to, but fucking hell, it's about time 60fps was the bare minimum and this 30fps trash was ditched for good.
 

Charsace

Chicken Chaser
Member
Nov 22, 2017
2,855
I'm fine with 1440p60 and 1440p120, but I think it's safe to assume that most AAA games will go for 2160p30.
Mid next-gen upgrades for sure will try to target 8K with upscaling from 6K or something similar.


That's what we currently have with the Xbox One X and the PS4 Pro ^^
You have to remember the games will have ray tracing and feature higher detail (higher poly counts and higher-res textures). If the games end up mostly looking the same as the current gen, then yeah, everything will be 4K.
 

LumberPanda

Member
Feb 3, 2019
6,325
If a developer's artistic vision is to run at 4k (and/or RT) then they should focus on 4k (and/or RT). We shouldn't force them to sacrifice their artistic vision to run at 60fps, if you need 60fps then that just means not every game is made for you and that's okay.
 

jett

Community Resettler
Member
Oct 25, 2017
44,653
So many people choosing native 4K over 60fps in this thread is kinda disheartening to be honest.

Note to OP and console devs:

Go ahead and target sub-native-4K resolutions with CBR or a similar reconstruction technique, and I guarantee that the vast, vast, vast majority of gamers won't be able to tell the difference.

Often the likes of Digital Foundry can't even make an accurate assessment of what resolution a checkerboarded/reconstructed game is using.

That tells you all you need to know, really.
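The rough pixel-count arithmetic behind that argument, as a sketch (checkerboard rendering shades roughly half the samples of a native frame per pass; real costs vary with the implementation and the reconstruction filter):

#include <cstdio>

int main()
{
    const long long native4k = 3840LL * 2160LL;   // ~8.29 million pixels per frame
    const long long cbr4k    = native4k / 2;      // checkerboarding shades ~half of them
    const long long res1800p = 3200LL * 1800LL;   // a common sub-native target

    std::printf("native 4K:       %lld pixels\n", native4k);
    std::printf("4K checkerboard: ~%lld pixels (~%.0f%% of native)\n",
                cbr4k, 100.0 * cbr4k / native4k);
    std::printf("1800p:           %lld pixels (~%.0f%% of native 4K)\n",
                res1800p, 100.0 * res1800p / native4k);
    return 0;
}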
 

Kaako

Member
Oct 25, 2017
5,736
Devs will do what time & resources permit. I think we'll get a mix of CBR & native at first with most transitioning to native 4k further down the generation.
 

TooBusyLookinGud

Graphics Engineer
Verified
Oct 27, 2017
7,937
California
Checkerboarding is, as a matter of fact, distinguishable.
Yes, yes it is. I'll take native 4K all day. I think OP meant to say that there are some great examples of games using checkerboarding, versus saying "(Spider-Man), and more that make the image look nigh-indistinguishable from native 4K on TVs."

My 65" OLED tells me differently, OP. I can literally see the artifacting when games use checkerboarding.
 
Nov 8, 2017
3,532
The difference between 30fps and 60fps is way more noticeable than the difference between 1080p and 4K. So yeah, let's go for 60fps first before we start going for 4K.
 

7thFloor

Member
Oct 27, 2017
6,635
U.S.
Agreed. RT & 4K aren't very compatible at the moment without sacrificing performance; I'd rather devs focus on making their games look great at 1080p/60fps or 1440p/60fps than target 4K/30fps.
 

Astra Planeta

Member
Jan 26, 2018
668
I struggled to tell the difference between 720 and 1080 when I was just a bit younger. At 23 the optometrist tells me my eyesight is good, and I honestly can't tell the difference between 1080 and 4K unless they're placed side by side. And whatever you CAN notice, with your above-average eyesight like some of the users in this thread, is not gonna remotely matter once you actually start watching/playing the media. Seriously, the difference between 30 and 60fps, for comparison, is massive, and it matters for all of 5 minutes before I forget about it. Anyone who complains about HD must have hawkeye genetics (or is a tech loonie).

4K IQ is noticeably better, especially in games if you are rendering at native. With movies, the entire signal chain has to be 4K, and most aren't, so it's harder to tell.
 

Kaako

Member
Oct 25, 2017
5,736
I'd also love a 1440P 120fps+ option but I know we're not getting that on consoles anytime soon.
For that, I'll have to stick with my PC for now.
 

RdN

Member
Oct 31, 2017
1,781
Of course not.

Native 4K is already a reality for those on PC and on Xbox One X.

Next gen should push things further.
 

Hugare

Banned
Aug 31, 2018
1,853
I think we will see a lot of checkerboarding.

Native 4K in all titles will be something exclusive to "PS5 Pro" and "Scarlet X"
 
Dec 23, 2017
8,802
Yeah, one of the reasons I am turned off by traditional consoles. They go for power, devs overshoot, and you get underwhelming games sooner than you should. I still feel like there is so much more to get out of the PS4 and XB1. Devs focus so much on how good they can make a game look as a selling point. I just want the next Switch to be focused on 1080p and I'm good.
 

Tedmilk

Avenger
Nov 13, 2017
1,909
The priorities of devs/publishers aren't going to change with a new generation of hardware. I fully expect 4K/30fps to be the norm. What I'd *like* is for all game logic to run at 60fps, and to be given an option of 60fps/1080p or 30fps/4K. I know it's never going to be that simple, but if most games could implement that it would make pretty much everyone (myself included) happy.
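For what it's worth, "game logic at 60fps regardless of render rate" usually means a fixed-timestep update loop decoupled from rendering. A generic sketch of that pattern (not any particular engine's code; the update/render calls are hypothetical placeholders):

#include <chrono>

// Generic fixed-timestep loop: simulation ticks at a steady 60 Hz while the
// renderer runs at whatever rate the frame budget allows (30, 60, 120 fps).
void RunGameLoop()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> tick(1.0 / 60.0);   // 16.67 ms logic step

    auto previous = clock::now();
    std::chrono::duration<double> accumulator(0.0);

    while (true /* !quitRequested */) {
        auto now = clock::now();
        accumulator += now - previous;
        previous = now;

        while (accumulator >= tick) {
            // UpdateGameLogic(tick.count());  // input, physics, AI at a fixed 60 Hz
            accumulator -= tick;
        }
        // RenderFrame(accumulator / tick);    // draw; interpolate between logic states
    }
}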
 

Falus

Banned
Oct 27, 2017
7,656
1600p + checkerboard is more than enough. Spider-Man is CB and looks as sharp as native. It's enough.
 

ApeEscaper

Member
Oct 27, 2017
8,720
Bangladeshi
On my new PC, after messing with all of these settings to see how I like to play best, I liked 1080p/1440p at 120fps with top graphics settings the most.

At 4K I have to turn down graphical settings and sometimes even the framerate, so I'd rather have graphics + framerate over 4K.
 

The Omega Man

Member
Oct 25, 2017
3,898
Speak for yourself, OP. I want native 4K and 8K struggling to maintain 25fps for that sweet cinematic experience.