60fps should be the number one priority, whether the dev uses dynamic res or whatever other technique to keep it that way.
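For anyone unsure what dynamic res actually does: the engine watches frame times and scales the render resolution up or down to hold the target. Here's a toy sketch of that feedback loop in Python (every name and threshold is made up for illustration; real engines are far more sophisticated about this):

```python
# Toy dynamic resolution loop: drop render scale when over the 60fps budget,
# creep back toward native when there's headroom. Illustrative numbers only.
TARGET_MS = 1000.0 / 60.0          # 16.67 ms frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0    # never render below 50% of native

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_MS * 1.05:       # over budget: shed pixels fast
        scale *= TARGET_MS / last_frame_ms
    elif last_frame_ms < TARGET_MS * 0.90:     # headroom: recover slowly
        scale *= 1.01
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A 20 ms spike at native 1080p drops the next frame to roughly 1600x900:
scale = update_render_scale(1.0, 20.0)
print(f"{int(1920 * scale)}x{int(1080 * scale)}")  # 1600x900
```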
1080p hasn't cut it for me since 2014. Other people see no need to go beyond it. Different strokes.

I don't feel any need for higher resolution while I'm gaming on a 24" monitor. When I decide to get a bigger one I will probably go 1440p. At 24", everything above 1080p is overkill.
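Quick napkin math on the pixel density, for what it's worth (standard diagonal-PPI formula; whether ~92 PPI counts as "sharp enough" obviously depends on viewing distance and the viewer):

```python
# Pixel density at 24": 1080p vs 1440p, using diagonal pixels / diagonal inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92 PPI at 24" 1080p
print(round(ppi(2560, 1440, 24)))  # ~122 PPI at 24" 1440p
```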
1080p is so blurry.
I've been playing in 1080p since the mid 2000s; it needs to die.

I hope everyone in this thread saying 1080p can die is playing on a high-end PC and not on console, where it's "4K", medium/low settings, and struggling to hit 30fps half the time.
No, they are not. That's why 1080p looks "blurry and low-res" to them.
If most games actually target 4K 60fps, what a fucking waste that would be. Native 1440p will hopefully be the sweet spot for next gen in terms of performance, with a crisp picture.
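The napkin math backs that up, if you assume GPU cost scales roughly with pixels shaded (a simplification, since geometry, CPU and RT costs don't scale that way):

```python
# Relative pixel counts per frame: 4K is ~4x 1080p and ~2.25x 1440p,
# under the simplifying assumption that cost scales with pixels shaded.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, p in pixels.items():
    print(f"{name}: {p / 1e6:.2f} MP ({p / pixels['1080p']:.2f}x 1080p)")
# 1080p: 2.07 MP (1.00x), 1440p: 3.69 MP (1.78x), 4K: 8.29 MP (4.00x)
```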
What I'm most excited about is using the 10+ TF at 1080p. Should hopefully mean 60fps more often than not. The performance of The Last Guardian, and Bloodborne to a lesser extent, hampered my enjoyment. TLG because its framerate literally tanks to low double digits, and Bloodborne because of microstuttering.
I don't care about 4K. 1080p is already crispy clear. I'd rather put the extra horsepower into fps.
What percentage of society do you think has bought a new TV in the last five years? Now subtract from that percentage the number of people who put that new TV in a room other than the one their consoles are based in. That's why.

You had me in the first half. Actually, well, no, you didn't. I think a huge number of people bought a TV in the last five years. I know this because I work in an electronics store that sells TVs. Now, if I were the kind of person to go out and buy a potentially ~$600 console, which is what we're talking about here when we're talking about "next gen," I would want it hooked up to the TV I just bought... which I somehow didn't put in the same room? Why wouldn't you put your nice TV in the console room, or vice versa? That seems like a really odd, self-imposed limit. If you think 1080p is "fine," you need to treat yourself to playing a game in native 4K. Next gen, between secret sauce, coding to the metal, and 30 TFLOPS from the SSD, we should be able to enjoy native 4K/60.
Generally speaking, most games won't target super high refresh rates. If it comes down to pushing more effects or a higher frame rate, developers are always going to choose more effects unless the gameplay specifically calls for high FPS, like shooters or character action games. Most of the time, I think we will be able to have both. Both RE3 and Doom: Hell's Bells look AND play beautifully, thanks in large part to their engines and the tremendous people who worked to get them there.
I think as consumers, we should never settle for "good enough." These consoles were largely developed by their respective manufacturers asking developers what they wanted out of a console, and none of them said "Eh, this is good enough." Between game makers and console manufacturers, a tremendous number of people in the industry are pushing to deliver a high-fidelity, high-quality experience to players. Saying "1080p is good enough" is like saying a picture of that banana taped to a wall is good enough. You have to be there. You have to take it in. I forgot where I was going with this.
I'd rather have 4K30 than 1080p120, but if anything should be mandatory next gen, it's performance/fidelity options.
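Funny thing about that particular pair: the raw pixel throughput is identical, so it really is a pure fidelity-vs-fluidity choice (granted, real frame cost isn't purely pixels per second; CPU, geometry and RT work don't scale with resolution):

```python
# 4K at 30fps and 1080p at 120fps push exactly the same pixels per second,
# so neither option is "more work" under a pure pixel-throughput model.
pixels_4k30 = 3840 * 2160 * 30
pixels_1080p120 = 1920 * 1080 * 120
print(pixels_4k30, pixels_1080p120, pixels_4k30 == pixels_1080p120)
# 248832000 248832000 True
```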
This happens every gen. There's a similar thread about current gen graphics still being impressive too. Last gen people were saying 720p was absolutely fine and they don't notice the difference, and then they'd post a screenshot of that frickin' tree trunk texture from Uncharted.
It breaks my heart that more people don't try CRTs. I play everything on a 17" CRT @ 1280x960. It just looks so good in motion, and the colours are out of this world. And my CRT is not even high-end or anything.

I played Control at 1024x768 at 120/ultra/full RTX on a CRT and it was glorious. Looked incredible.
Lol at ppl saying "1080p can die!" while 30fps or even below that is perfectly fine for them...
In old (backwards compatible) games or new releases? Because new games most certainly will run at higher resolutions. And framerates will tank. Same shit, different gen.
It doesn't on a 1080p panel (23" monitor and 50" Plasma).
It's only blurry if it's non-native on a 4K TV, or if the devs are using some blurry post-processing AA. Native 1080p on a smaller monitor isn't blurry at all.
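Worth noting that 1080p is the one resolution that could scale cleanly to a 4K panel, since it divides evenly at 2x per axis; the blur comes from TVs applying bilinear filtering instead of integer scaling. Quick sanity check (pure arithmetic, no claim about what any particular TV's scaler actually does):

```python
# 1080p -> 4K is an exact 2x per axis, so integer (nearest-neighbour) scaling
# is possible; 1440p -> 4K is 1.5x, so it must be filtered, hence the blur.
def scale_factors(src, dst):
    fx, fy = dst[0] / src[0], dst[1] / src[1]
    return fx, fy, fx.is_integer() and fy.is_integer()

print(scale_factors((1920, 1080), (3840, 2160)))  # (2.0, 2.0, True)
print(scale_factors((2560, 1440), (3840, 2160)))  # (1.5, 1.5, False)
```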
Not on a screen size built for it.