Serious Sam

Banned
Oct 27, 2017
4,354
27" 1440p on the desk looks a lot better/bigger than 65" 4K from a couch so I dunno.
That's not how it works, I'm afraid. Even a modest 55" 4K TV creates way better gaming immersion than a 27" monitor ever could.

It's similar to how you don't get a cinema experience by moving your smartphone closer to your face. You can make the phone screen take up the same amount of your field of vision as a cinema screen, but it won't look or feel the same, and it won't be the same experience.
 

Myself

Member
Nov 4, 2017
1,282
Resolution is typically more important for a PC gamer who sits like 2-3 ft from their monitor than for a console gamer who sits 6-12 ft from their TV.

For example:

2.5' from a 27" monitor corresponds to a horizontal viewing angle of 42.8° and a vertical viewing angle of 24.1°

7' from a 65" TV corresponds to a horizontal viewing angle of 37.3° and a vertical viewing angle of 21.0°

As you can see, the 65" TV fills less of your vision in this scenario by quite a significant margin, making resolution less important.
Sure, I understand that, but even then, when you're 2.5' (about 76 cm) from a 32" TV, you'd be hard pressed to discern 4K, IMO, based on the graphs and tables I've looked at. Note, I have not had a 32" 4K monitor to actually test.
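For reference, the viewing angles quoted above follow from basic trigonometry. Here's a minimal Python sketch (assuming flat 16:9 panels); the horizontal figures match the quoted ones exactly, while the vertical ones land slightly higher (about 24.9° and 21.5°), since the quoted post appears to scale the horizontal angle by 9/16 rather than taking the tangent directly:

```python
import math

def viewing_angles(diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal and vertical viewing angles (degrees) of a flat screen."""
    ar_w, ar_h = aspect
    diag_units = math.hypot(ar_w, ar_h)
    width = diagonal_in * ar_w / diag_units    # physical width in inches
    height = diagonal_in * ar_h / diag_units   # physical height in inches
    h_angle = 2 * math.degrees(math.atan((width / 2) / distance_in))
    v_angle = 2 * math.degrees(math.atan((height / 2) / distance_in))
    return h_angle, v_angle

print(viewing_angles(27, 2.5 * 12))  # ~ (42.8, 24.9) degrees at 2.5 ft
print(viewing_angles(65, 7 * 12))    # ~ (37.3, 21.5) degrees at 7 ft
```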
 

F34R

Member
Oct 27, 2017
12,056
Resolution is typically more important for a PC gamer who sits like 2-3 ft from their monitor than for a console gamer who sits 6-12 ft from their TV.

For example:

2.5' from a 27" monitor corresponds to a horizontal viewing angle of 42.8° and a vertical viewing angle of 24.1°

7' from a 65" TV corresponds to a horizontal viewing angle of 37.3° and a vertical viewing angle of 21.0°

As you can see, the 65" TV fills less of your vision in this scenario by quite a significant margin, making resolution less important.
65% of Steam PC users are using 1080p.
 

elelunicy

Member
Oct 27, 2017
175
I understand that 4K TVs have been out for a while, but beware marketing traps!

Judging a game by its pixel count is not a good idea. You must look at image quality. How's them particles? Lighting? Blur? Hair and cloth physics? Environment physics? The list is endless.

1440p is way more than enough.
It doesn't matter if a game has perfect particles/lighting/physics/etc. As long as it renders at such a low resolution, the image quality will always be ruined by aliasing artifacts like shimmering, flickering, pixel crawling, etc.

Temporal stability is easily one of the most important factors in a game's IQ. In many games even 5K plus TAA is not enough to eliminate all temporal aliasing, let alone the joke of a resolution that is 1440p.

Sure, I understand that but even then when you're 2.5' (About 76cm) looking at a 32" tv, you'd be hard pressed to discern 4k IMO based off the graphs and tables I've looked at. Note, I have not had a 32" 4k monitor to actually test.

When you sit 2.5' from a 32" display, the display needs a resolution of 5136x2889 or higher for individual pixels to be indiscernible, assuming 20/20 vision.

Here is a calculator that gives you the exact resolution you need for your display size/viewing distance.
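For a rough idea of what such calculators compute: 20/20 acuity is commonly modeled as resolving about 60 pixels per degree (1 arcminute per pixel), and some tools use stricter thresholds; the 5136x2889 figure above implies roughly 100+ pixels per degree. A minimal Python sketch of the calculation (the threshold is the assumption to tweak):

```python
import math

def required_resolution(diagonal_in, distance_in, ppd=60, aspect=(16, 9)):
    """Pixels needed so each pixel subtends at most 1/ppd of a degree."""
    ar_w, ar_h = aspect
    diag_units = math.hypot(ar_w, ar_h)
    width = diagonal_in * ar_w / diag_units
    height = diagonal_in * ar_h / diag_units
    h_deg = 2 * math.degrees(math.atan((width / 2) / distance_in))
    v_deg = 2 * math.degrees(math.atan((height / 2) / distance_in))
    return round(h_deg * ppd), round(v_deg * ppd)

print(required_resolution(32, 2.5 * 12))           # ~ (2992, 1758) with the 60-ppd model
print(required_resolution(32, 2.5 * 12, ppd=120))  # roughly double with a stricter threshold
```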

More importantly, display resolution and rendering resolution are two separate things.

For example, here is a natively rendered 1440p image vs. a 5k downsampled to 1440p image. I can see the difference between the two on my 6" 1080p phone.

You do not need a large display size or a short viewing distance to enjoy the benefit of rendering games at higher resolutions.
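Incidentally, 5K (5120x2880) down to 1440p (2560x1440) is an exact 2x reduction on each axis, so the downsampled shot is effectively 4x ordered-grid supersampling: every output pixel is the average of four rendered samples. A toy NumPy sketch of that box-filter downsample (drivers and games may use fancier filters):

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block of rendered samples into one output pixel."""
    h, w = img.shape[:2]
    assert h % 2 == 0 and w % 2 == 0
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

# A 5120x2880 render (random data as a stand-in) reduced to 2560x1440.
render_5k = np.random.rand(2880, 5120, 3).astype(np.float32)
image_1440p = downsample_2x(render_5k)
print(image_1440p.shape)  # (1440, 2560, 3)
```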
 

Myself

Member
Nov 4, 2017
1,282
It doesn't matter if a game has perfect particles/lighting/physics/etc. As long as it renders at such a low resolution, the image quality will always be ruined by aliasing artifacts like shimmering, flickering, pixel crawling, etc.

Temporal stability is easily one of the most important factors in a game's IQ. In many games even 5K plus TAA is not enough to eliminate all temporal aliasing, let alone the joke of a resolution that is 1440p.



When you sit 2.5' from a 32" display, the display needs a resolution of 5136x2889 or higher for individual pixels to be indiscernible, assuming 20/20 vision.

Here is a calculator that gives you the exact resolution you need for your display size/viewing distance.

More importantly, display resolution and rendering resolution are two separate things.

For example, here is a natively rendered 1440p image vs. a 5k downsampled to 1440p image. I can see the difference between the two on my 6" 1080p phone.

You do not need a large display size or a short viewing distance to enjoy the benefit of rendering games at higher resolutions.
First, apologies: 32" was not right, I really meant 27" screens. I know that going to 32" you start to want higher than 1440p. I also argue that a bit of shimmering and aliasing is not worth the 4x graphical grunt required, compared to using that for, say, 120fps or RT or nicer effects.

I entered 27" 1440p (considered pretty common from what I understand) at a viewing distance of 2.5' (about 76 cm) and got "You will likely need moderate anti-aliasing." That seems perfectly fine to me and not something I would consider an issue given the cost of going to 4K.

Thanks for the screenshot comparison, but to be honest, the differences, whilst noticeable on my 1080p 23" monitor at about 80 cm, were not what I would call amazing, though I do understand that things like supersampling help. It's all about diminishing returns for me.

27" @ 1440p does not require 4K, IMO, and certainly not at the expense of everything else. If I had a card that could do 120fps with full RT at 4K but output to 1440p, of course I'd take it, but something's gotta give.
 
Last edited:
OP

Darktalon

Member
Oct 27, 2017
3,291
Kansas
That's not how it works, I'm afraid. Even a modest 55" 4K TV creates way better gaming immersion than a 27" monitor ever could.

It's similar to how you don't get a cinema experience by moving your smartphone closer to your face. You can make the phone screen take up the same amount of your field of vision as a cinema screen, but it won't look or feel the same, and it won't be the same experience.
I assume you've tried VR? Because holding a smartphone up to your face is literally what that is.
 
Nov 1, 2017
1,111
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

I'm pretty dedicated to 4K and I have a 3090 but the truth is, resolution just doesn't matter as much as it used to. We're very much in the land of diminishing returns when it comes to raw pixel counts.
 

scabobbs

Member
Oct 28, 2017
2,110
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
... is this a joke post? 1440p at 144hz+ is a perfectly reasonable use case for a $700 GPU. What are you on?
 

Brodo Baggins

Member
Oct 27, 2017
4,116
I assume you've tried VR? Because holding a smartphone up to your face is literally what that is.

I am a huge VR enthusiast who owns an Index, a Quest, and a PSVR, and I love it, but not for visual clarity; it's all about presence through physical tracking. Visual clarity and quality are usually quite bad due to the effectively low resolution.
 

DarthBuzzard

Banned
Jul 17, 2018
5,122
I assume you've tried VR? Because holding a smartphone up to your face is literally what that is.
That only works because lenses magnify the image.

They are correct; a bigger display is inherently more immersive because your brain knows that it's bigger due to the distance between you and it - and it also helps that you can see it lighting the room more.
 

RoboitoAM

Member
Oct 25, 2017
3,123
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
1440p/165hz reporting in

if I want 4K/60 I'll hook my PC up to my TV
 

nitewulf

Member
Nov 29, 2017
7,275
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
I literally cringed through this whole post. How embarrassing.
 
Oct 27, 2017
9,482
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.
What console games are running better than a 3080 can? A 3080 is like a 2-3x increase over the new consoles' rendering ability. Look at the new game previews coming out: Dirt 5 running 120fps @ 1080p on the Xbox Series X. You can probably get a 1660 Ti or 2060 Super to pull the same results. It's not even in the same league as a 3080. You're probably looking at a mid-cycle console refresh before they get close to a 3080, which is probably 3-4 years off. The Series X and PS5 are great pieces of kit for what they are at their prices, but let's not go overboard. Most of these 4K games run at variable resolutions, probably ~1440p. Which is fine; look at what Naughty Dog did with TLOU2. That's 1440p upscaled to 4K on the Pro and no one is complaining about the game's IQ. But don't go overboard on what the actual reality is.
 

turbobrick

Member
Oct 25, 2017
13,224
Phoenix, AZ
Will anybody here pay $200 more for 10 extra GBs? Because that's how much it will likely cost. $900 instead of $700.

I very much doubt it. The $700 GPU with 10GB is the sweet spot for the next 2 gens of GPUs. Worst case scenario, you might have to drop internal render a tad, but you will likely reap frame-rate as a reward.

I wish more people thought 10GB was too little, then I could score one maybe :)

Probably not; it's a big jump in price for something most people wouldn't need. There's an even bigger group of people, me included, who wouldn't pay the extra $200 to go from the 3070 to the 3080. Though I don't care about 4K, so that's a big part of it.
 

F34R

Member
Oct 27, 2017
12,056
Can't max out Red Dead Redemption 2 at 4K with a 3080, lol. You can do everything except put MSAA to the max. I run at 30-45 fps at 4K ultra with everything else maxed and MSAA at 4x.
 
VRAM Testing Results
OP

Darktalon

Member
Oct 27, 2017
3,291
Kansas
Lol. Who would've thought?

On a side note, I'm looking forward to when you get your 3080 and make those sweet VRAM tests for us to clear things up and shut up trolls.

Will you do it for us? :)
Since I have two similar threads, I've been posting all my results over here:
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/

I would prefer that people learn about the hidden MSI Afterburner setting, so that they can pass this information along.
 

vitormg

Member
Oct 26, 2017
1,961
Brazil
Since I have two similar threads, I've been posting all my results over here:
https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/

I would prefer that people learn about the hidden MSI Afterburner setting, so that they can pass this information along.
Thanks! Will follow it closely. My 3080 tests have been very promising too, though I have only played Assassin's Creed Origins so far. 4000ish MB of VRAM at 3440x1440, max settings.
 

rashbeep

Member
Oct 27, 2017
9,547
Maybe we're not talking about the same sharpening; I don't have any ghosting issues. I'm talking about the reshade pack that you can put in and activate via ALT-F3, adding the filter.

Hmm, I'm using the image sharpening from the Nvidia Control Panel (set to 0.3). Haven't actually tried ReShade with the game yet.
 

Sabin

Member
Oct 25, 2017
4,721
It doesn't matter if a game has perfect particles/lighting/physics/etc. As long as it renders at such a low resolution, the image quality will always be ruined by aliasing artifacts like shimmering, flickering, pixel crawling, etc.

Temporal stability is easily one of the most important factors in a game's IQ. In many games even 5K plus TAA is not enough to eliminate all temporal aliasing, let alone the joke of a resolution that is 1440p.

Your first post was already beyond embarrassing and you even doubled down on it? Some folks really have no shame...

Also, as many others said, 1440p/165Hz > 4K/60Hz.
 

Shadownet

Member
Oct 29, 2017
3,281
Would I like a 4K 144Hz monitor? Yes. But even my 3080 won't get that, and I'm sure as hell not gonna pay twice the price for 4K when my ultrawide curved 32-inch 1440p monitor is hitting the sweet spot for me right now.
 

Jedi2016

Member
Oct 27, 2017
16,100
Temporal antialiasing improves with framerate far more than it does with resolution. It's based on the number of frames over a fixed period of time, not a fixed number of previous frames. So the more frames you give it to work with, the better the result. A game running at 120fps will look a hell of a lot cleaner with TAA than the same game running at 30 or even 60.
 
Oct 27, 2017
5,618
Spain
What console games are running better than a 3080 can? A 3080 is like 2-3x increase over the the new consoles rendering ability. Look at the new game previews coming out. Dirt 5 running 120fps @1080p on the Xbox Series X. You can probably get a 1660ti or 2060s to pull the same results. It's not even is the same league against 3080. Your probably looking at a mid cycle console refresh before they get close to a 3080 which is probably 3-4 years off. The consoles for the series x and ps5 are great pieces of kit for what they are at the prices. But let's not go overboard. Most of these 4k games are variable resolutions probably ~1440p. Which is fine look at what naughty dog did with tlou2. That's 1440p upscaled to 4k on the pro and no one is complaining about the game's iq. But don't go overboard on what the actual reality is.
That's ridiculous. The RX 5700 XT trades blows with the 2070 Super, and that's a 9 TFLOP card. The Series X is a 12 TFLOP part.
 

dgrdsv

Member
Oct 25, 2017
12,111
Temporal antialiasing improves with framerate far more than it does with resolution. It's based on the number of frames over a fixed period of time, not a fixed number of previous frames. So the more frames you give it to work with, the better the result. A game running at 120fps will look a hell of a lot cleaner with TAA than the same game running at 30 or even 60.
It's actually based on the number of frames and not a fixed period of time, at least usually. But it will look better at higher framerates simply because there is less pixel movement between adjacent frames, so you get a better result with fewer motion-related errors - and you get it faster, which also lowers the amount of ghosting and similar issues.
Where did you see an implementation based on a time period, btw?
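For what it's worth, the usual pattern is an exponential history blend: each new frame is lerped into a persistent history buffer with a fixed weight, so the effective number of accumulated frames comes from that weight rather than from any wall-clock window. A toy Python sketch of just the accumulation step (not any particular game's implementation; real TAA also reprojects the history along motion vectors and clamps it against the current frame's neighborhood to limit ghosting):

```python
import numpy as np

TAA_ALPHA = 0.1  # weight of the current frame; the history keeps the remaining 0.9

def taa_accumulate(history, current, alpha=TAA_ALPHA):
    """Blend the current frame into the history buffer (exponential moving average)."""
    return (1.0 - alpha) * history + alpha * current

# Higher framerates help because the same number of blended frames spans less
# wall-clock time and adjacent frames differ less, so reprojection errors shrink.
history = np.zeros((1440, 2560, 3), dtype=np.float32)
for _ in range(16):  # 16 frames: ~0.13 s at 120 fps vs ~0.53 s at 30 fps
    current = np.random.rand(1440, 2560, 3).astype(np.float32)  # stand-in for a rendered frame
    history = taa_accumulate(history, current)
```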
 
Oct 27, 2017
9,482
That's ridiculous. The RX 5700XT trades blows with the 2070 Ti, and that's a 9TFLOP card. The Series X is a 12 TFLOP part.
It wasn't ridiculous when I typed it. I was basing it off the Ars Technica article, which said Dirt 5 was maxing out at 1080p at 120 frames per second. We later found that was incorrect when Digital Foundry did their video.

arstechnica.com

DiRT 5 and our first Xbox Series X “enhanced” tests: 120Hz saves the uneven ride [Updated]

A bleeding-edge Xbox Series X feature impresses, but it deserves a prettier game.

Additionally, pop-in issues become much more apparent for both textures and shadows, and ambient occlusion is axed across the board. Those distant trees look weirder without a coat of accurate shadows, while ground textures look decidedly flat in this mode. Plus, its dynamic resolution appears to max out near the 1080p mark, dropping further during busy scenes.
 

Jedi2016

Member
Oct 27, 2017
16,100
Where did you see an implementation based on a time period, btw?
Observation. Even on a static non-moving image, the one with the higher framerate will look cleaner. So it's obviously using more previous frames to smooth out the image. With all else being equal and the camera not moving, the only difference has to be the amount of time that the game is sampling from. Probably not very long, maybe a tenth of a second or so, but it'll have ten or twelve frames to interpolate from instead of only six.

Yes, it probably varies from game to game. Some implementations are probably better than others. The one that most readily springs to mind when I think of high framerate = good TAA is actually SnowRunner. Someone in one of the threads was complaining bitterly about how jaggy the game was, and I thought they were smoking something because it's one of the least jaggy games I've ever played. Turns out the only real difference between us was the framerate. I think he was even running 4K versus my 1440p, but my higher framerate was giving me a better overall image. Control, on the other hand, clearly uses a fixed number of frames, because low framerates yield horrific ghosting and trails any time you so much as move the camera. I'm using DLSS on that game now that I've got an RTX card, so it's hard to say how much of one or the other the game is using.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,651
Observation. Even on a static non-moving image, the one with the higher framerate will look cleaner. So it's obviously using more previous frames to smooth out the image. With all else being equal and the camera not moving, the only difference has to be the amount of time that the game is sampling from. Probably not very long, maybe a tenth of a second or so, but it'll have ten or twelve frames to interpolate from instead of only six.

Yes, it probably varies from game to game. Some implementations are probably better than others. The one that most readily springs to mind when I think of high framerate = good TAA is actually SnowRunner. Someone in one of the threads was complaining bitterly about how jaggy the game was, and I thought they were smoking something because it's one of the least jaggy games I've ever played. Turns out the only real difference between us was the framerate. I think he was even running 4K versus my 1440p, but my higher framerate was giving me a better overall image. Control, on the other hand, clearly uses a fixed number of frames, because low framerates yield horrific ghosting and trails any time you so much as move the camera. I'm using DLSS on that game now that I've got an RTX card, so it's hard to say how much of one or the other the game is using.
Have you tried Battlefield V?

I'm a big fan of TAA but that's the first game which made me think that I wasn't playing at a native 1440p; even at a high framerate it just doesn't look sharp, you can't turn it off either
 

Jedi2016

Member
Oct 27, 2017
16,100
No, I don't have that one.

I guess one way to test how the game is handling it would be to use a framerate limiter. Run the game at 30 and then run it at 120 and see if it looks any different, all other settings being the same. And it may not be entirely time based, so much as just dynamic. If the game realizes it's running at a higher framerate, it may change the TAA on the fly to sample more frames just because it knows they're available.

I really wish they were more consistent. Every developer and/or game engine seems to have its own way of handling it, and some of them are much better than others.
 

super-famicom

Avenger
Oct 26, 2017
25,402
www.resetera.com

Bad news for 3080 owners: Godfall dev says game "uses 4K by 4K textures and 12GB of graphics memory" at 4K Ultra

In this new video, though, Counterplay CEO Keith Lee made an interesting comment on the required video memory to play at 4K with UltraHD textures. After seeing people on era saying for months that 10GB vram is enough for the next few years im disappointed to learn that its already pushing the...

That's more or less PR advertising for the new AMD cards. People are even saying this in that thread.
 

dgrdsv

Member
Oct 25, 2017
12,111
www.resetera.com

Bad news for 3080 owners: Godfall dev says game "uses 4K by 4K textures and 12GB of graphics memory" at 4K Ultra

In this new video, though, Counterplay CEO Keith Lee made an interesting comment on the required video memory to play at 4K with UltraHD textures. After seeing people on era saying for months that 10GB vram is enough for the next few years im disappointed to learn that its already pushing the...
Will be fun watching this run fine on some 6700 with 6GBs of VRAM a couple of months later for sure.
 
OP

Darktalon

Member
Oct 27, 2017
3,291
Kansas
www.resetera.com

Bad news for 3080 owners: Godfall dev says game "uses 4K by 4K textures and 12GB of graphics memory" at 4K Ultra

In this new video, though, Counterplay CEO Keith Lee made an interesting comment on the required video memory to play at 4K with UltraHD textures. After seeing people on era saying for months that 10GB vram is enough for the next few years im disappointed to learn that its already pushing the...
You can read my thoughts right here.
www.resetera.com

Bad news for 3080 owners: Godfall dev says game "uses 4K by 4K textures and 12GB of graphics memory" at 4K Ultra

I don't need to be tagged into every fanboi gpu war thread. This is clearly AMD targeted marketing, of course they are going to say that. Wake me up when we have benchmarks of games stuttering in godfall with a 3080, or God forbid show me even ONE youtube reviewer who actually uses per process...
 

Pargon

Member
Oct 27, 2017
12,178
10GB might be fine on a clean system which is not running anything in the background.
With an 8GB GTX 1070, I'm running into "Low VRAM" warnings in games like Half-Life: Alyx unless I close out most other applications on the system first - particularly web browsers. It's a real hassle, and single-tasking like that is not what I'm used to on PC.
I've always gone for the "double VRAM" cards where possible in the past; e.g. 4GB 960 rather than 2GB, which has often helped extend those cards' useful life - so it's very disappointing to hear that NVIDIA have apparently canceled plans for a 20GB 3080, and that the 3070 is only shipping with 8GB.
Hopefully AMD push them to reconsider. In the meantime, I'm hoping that I'll be able to find a used Titan RTX at a reasonable price, but a lot of listings seem unreasonable right now.
 
OP

Darktalon

Member
Oct 27, 2017
3,291
Kansas
10GB might be fine on a clean system which is not running anything in the background.
With an 8GB GTX 1070, I'm running into "Low VRAM" warnings in games like Half-Life: Alyx unless I close out most other applications on the system first - particularly web browsers. It's a real hassle, and single-tasking like that is not what I'm used to on PC.
I've always gone for the "double VRAM" cards where possible in the past; e.g. 4GB 960 rather than 2GB, which has often helped extend those cards' useful life - so it's very disappointing to hear that NVIDIA have apparently canceled plans for a 20GB 3080, and that the 3070 is only shipping with 8GB.
Hopefully AMD push them to reconsider. In the meantime, I'm hoping that I'll be able to find a used Titan RTX at a reasonable price, but a lot of listings seem unreasonable right now.
Not really sure how having to close background programs before gaming is an argument for Nvidia to add hundreds of dollars to the cost of the card.

And it doesn't matter if Half-Life: Alyx is giving you a warning unless it actually stutters. I get the same warning on a 3080, and it's not true in the least when you look at RTSS. Never trust in-game VRAM warnings and meters.
 

Pargon

Member
Oct 27, 2017
12,178
Not really sure how having to close background programs before gaming is an argument for Nvidia to add hundreds of dollars to the cost of the card.

And it doesn't matter if Half-Life: Alyx is giving you a warning unless it actually stutters. I get the same warning on a 3080, and it's not true in the least when you look at RTSS. Never trust in-game VRAM warnings and meters.
My point is that with 8GB, it's tight enough that I've recently had to do things like closing other applications/browsers on my system just to run certain games smoothly.

And yes, it does make a difference. I was having stutter problems in HL:A before doing that.
I had to turn down texture quality and other settings in the RE Engine games for those to run stutter-free too.

Maybe you're fine with not having anything else running in the background while you play games, but I often switch between tasks to take a short break and play a game, then return to what I was doing - and this makes it impossible to do so. Having to set everything up again each time is a workflow-killer.
Being able to multitask is a part of the PC gaming experience to me.

And this is before next-gen games with higher VRAM requirements ship.
Of course this is only my own experience, but 8–10 GB doesn't seem like enough to me for a card I'd expect to keep for several years.
 

Dr. Zoidberg

Member
Oct 25, 2017
5,297
Decapod 10
Maybe you're fine with not having anything else running in the background while you play games, but I often switch between tasks to take a short break and play a game, then return to what I was doing - and this makes it impossible to do so. Having to set everything up again each time is a workflow-killer.

I understand your point, but since you have different needs, why not purchase a 3090? They make it for people who want/need more VRAM. (Or one of the AMD cards, etc.)

Personally I close everything else out when I play games to squeeze out every last drop of performance so I don't need any additional headroom.
 

Pargon

Member
Oct 27, 2017
12,178
I understand your point, but since you have different needs, why not purchase a 3090? They make it for people who want/need more VRAM. (Or one of the AMD cards, etc.)

Personally I close everything else out when I play games to squeeze out every last drop of performance so I don't need any additional headroom.
I certainly don't have $1500 to spend on a video card.
I'm just saying it's not as easy to say that 8–10 GB is going to be enough for next-gen, when it's possible to run into issues with that much VRAM in current-gen games.

Sure, everyone's situation may be different, but I'm pointing out one scenario where VRAM does matter - and I don't think it's that uncommon.
I would think that PCs dedicated solely to playing games, where people make sure to close everything else first, are more rare.
 

Jedi2016

Member
Oct 27, 2017
16,100
Yeah, I'll believe it when I see it. So far, nothing I've thrown at my card in any way stresses the memory limit.
 

Strakt

Member
Oct 27, 2017
5,176
Lol 1440p is absolutely, utterly a joke in 2020.

Imagine paying $700 for a graphics card (so easily $1000+ for the whole build) and then proceed to run games at a resolution so far below what $400/$500 consoles do. It's simply embarrassing.

LOLLLL

I read this post and thought it was a joke... but guess I was wrong