Rhaknar

Member
Oct 26, 2017
43,041
I don't understand it in too much depth, but it seems to have to do with the faster response time of OLED. It makes the transitions from one frame to the next more abrupt. When I'm playing at 30 or 40 fps it looks like something is wrong with the screen. It's literally unplayable for me. What I'm not sure of is whether I would get used to it if I subjected myself to it for long enough. These TVs have technology built into them to attempt to mitigate this, but as I understand it this has the side effect of adding input latency. The tech is called black frame insertion. I haven't tried it.

this is me. I mean, you can just google "why does 30fps look worse on OLED" and you will get many articles explaining it.

I'll legit never forget when I first bought my OLED. I thought to myself "oh boy, let's whip out Ghost of Tsushima, that game looked awesome, I bet it's going to be great!", booted it up on my PS4 Pro... and I was flabbergasted. I couldn't stand it; it was night and day between this and my old LCD. Then I started trying more 30fps games, really struggling with any sort of camera movement, and reading up on it to find out it's just much more noticeable on OLEDs for whatever technical reasons.

I remember that of the many games I later tried on my Series X (or was it still on my XB1? either way), the only one that didn't bother me was Arkham Knight, which has the MOST aggressive use of motion blur I've ever seen in a video game, and I guess that's why it "hides" the camera judder of 30fps, I dunno.
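To put very rough numbers on the "abrupt transitions" idea from the quote above, here's a back-of-envelope Python sketch. The pan speed is an assumption I picked purely for illustration, not anything from the posts:

```python
# Back-of-envelope illustration (my own numbers, not from the thread): how far
# a panning image jumps between frames at different frame rates. A fast-response
# OLED shows that jump as a clean, discrete step instead of the smear a slower
# LCD produces, which is one common explanation for why 30fps judder stands out.
PAN_SPEED_PX_PER_SEC = 1920  # assumed pan speed: one 1080p screen-width per second

for fps in (30, 40, 60, 120):
    frame_time_ms = 1000 / fps
    step_px = PAN_SPEED_PX_PER_SEC / fps
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, ~{step_px:5.1f} px jump per frame")
```

At 30fps the image jumps twice as far per frame as at 60fps, and per-object motion blur (as in Arkham Knight) smears exactly those jumps, which fits the observation above.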
 

b0uncyfr0

Member
Apr 2, 2018
959
Screenshot-2023-06-19-at-10-53-48.png


Am I the only one who thinks this is really, really off? A 2080 on the same level as a 6800 XT... huh?

2080: 8GB VRAM, performance roughly equal to a 3060 Ti (maybe a little higher)?
The 6800 XT: 16GB VRAM, about 20-40% faster than a 2080 (depending on the game).

It is almost two tiers above the 2080 from my quick googling. A 6800 XT is basically competing with a 3080 right now... so that's a huge red flag to me.

Could there be any correlation to the 30fps cap on consoles? Could it be that the engine just runs like shit on AMD cards, and that's why they thought it better to cap it at 30fps?
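Just to quantify why that pairing looks odd, here's a tiny sketch using the 20-40% figure from the post above; the 60fps baseline for the 2080 is purely an assumption for illustration, not anything the spec sheet states:

```python
# Illustrative only: if the recommended spec meant "X fps on a 2080", a card
# that is 20-40% faster (the figure quoted above) shouldn't sit in the same slot.
BASELINE_FPS_2080 = 60  # assumed target for the recommended spec, not confirmed

for uplift in (0.20, 0.40):
    fps_6800xt = BASELINE_FPS_2080 * (1 + uplift)
    print(f"6800 XT at +{uplift:.0%} over a 2080: ~{fps_6800xt:.0f} fps at the same settings")
```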
 

Dolce

Member
Oct 25, 2017
14,270
Am I the only one who thinks this is really, really off? A 2080 on the same level as a 6800 XT... huh?

2080: 8GB VRAM, performance roughly equal to a 3060 Ti (maybe a little higher)?
The 6800 XT: 16GB VRAM, about 20-40% faster than a 2080 (depending on the game).

It is almost two tiers above the 2080 from my quick googling. A 6800 XT is basically competing with a 3080 right now... so that's a huge red flag to me.

Could there be any correlation to the 30fps cap on consoles? Could it be that the engine just runs like shit on AMD cards, and that's why they thought it better to cap it at 30fps?

how does DLSS come into play?
 

Crax

Member
May 21, 2018
898
Am I the only one who thinks this is really, really off? A 2080 on the same level as a 6800 XT... huh?

2080: 8GB VRAM, performance roughly equal to a 3060 Ti (maybe a little higher)?
The 6800 XT: 16GB VRAM, about 20-40% faster than a 2080 (depending on the game).

It is almost two tiers above the 2080 from my quick googling. A 6800 XT is basically competing with a 3080 right now... so that's a huge red flag to me.

Could there be any correlation to the 30fps cap on consoles? Could it be that the engine just runs like shit on AMD cards, and that's why they thought it better to cap it at 30fps?

Could this be because the game might have some form of ray tracing, and AMD cards are generally weaker than Nvidia's when it comes to RT?
 

Firefly

Member
Jul 10, 2018
8,761
Screenshot-2023-06-19-at-10-53-48.png


Am I the only one who thinks this is really, really off? A 2080 on the same level as a 6800 XT... huh?

2080: 8GB VRAM, performance roughly equal to a 3060 Ti (maybe a little higher)?
The 6800 XT: 16GB VRAM, about 20-40% faster than a 2080 (depending on the game).

It is almost two tiers above the 2080 from my quick googling. A 6800 XT is basically competing with a 3080 right now... so that's a huge red flag to me.

Could there be any correlation to the 30fps cap on consoles? Could it be that the engine just runs like shit on AMD cards, and that's why they thought it better to cap it at 30fps?
The Ryzen 5 3600 is a tad slower than the Series X CPU, so these specs could just be targeting 30fps+ and not a solid 60.
 

Fiery Phoenix

Member
Oct 26, 2017
5,899
I think the real question here is what those Recommended PC specs target. My knowledge of PC building is close to nonexistent, but I can already tell you're not going to get a (stable) 60fps with the recommended specs.
 

Dolce

Member
Oct 25, 2017
14,270
I think the real question here is what those Recommended PC specs target. My knowledge of PC building is close to nonexistent, but I can already tell you're not going to get a (stable) 60fps with the recommended specs.

Probably 1440p, maybe RT as mentioned above. No reason it couldn't hit 1440p/60fps that I can think of.

Recommended usually isn't ultra, so I don't see any reason it would be 4K.
 

P40L0

Member
Jun 12, 2018
7,758
Italy
this is me. I mean, you can just google "why does 30fps look worse on OLED" and you will get many articles explaining it.

I'll legit never forget when I first bought my OLED. I thought to myself "oh boy, let's whip out Ghost of Tsushima, that game looked awesome, I bet it's going to be great!", booted it up on my PS4 Pro... and I was flabbergasted. I couldn't stand it; it was night and day between this and my old LCD. Then I started trying more 30fps games, really struggling with any sort of camera movement, and reading up on it to find out it's just much more noticeable on OLEDs for whatever technical reasons.

I remember that of the many games I later tried on my Series X (or was it still on my XB1? either way), the only one that didn't bother me was Arkham Knight, which has the MOST aggressive use of motion blur I've ever seen in a video game, and I guess that's why it "hides" the camera judder of 30fps, I dunno.
A good in-game motion blur implementation (both per-scene and per-object) can alleviate the inherent stutter of 30fps on OLEDs (like the Arkham Knight you mentioned, but also Forza Horizon 5 in Quality Mode and others), and Starfield seems to have that at least.

That said, also providing the option to unlock the framerate for those who want to leverage VRR, and/or adding a more optimized, fixed 40fps mode @ 120Hz, would be ideal on Series X|S.
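For anyone wondering why a 40fps mode specifically wants a 120Hz output, here's a quick frame-time sketch (just illustrative math, nothing Starfield-specific):

```python
# Why a fixed 40fps mode wants a 120Hz container: each frame should persist for
# a whole number of refresh cycles, otherwise frame pacing becomes uneven.
def cycles_held(fps: int, refresh_hz: int) -> float:
    """How many refresh cycles each frame is displayed for."""
    return refresh_hz / fps

for fps in (30, 40, 60):
    print(f"{fps} fps: {1000 / fps:.1f} ms/frame, "
          f"{cycles_held(fps, 60):.2f} cycles at 60 Hz, "
          f"{cycles_held(fps, 120):.2f} cycles at 120 Hz")

# 40fps is 25 ms/frame: an uneven 1.5 cycles at 60Hz but an even 3 cycles at
# 120Hz, and 25 ms sits exactly halfway between 30fps (33.3 ms) and 60fps
# (16.7 ms), which is why 40fps feels much closer to 60 than the number suggests.
```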
 

Lidas

Member
Oct 26, 2017
37
I have an LG C1.
Any ideas which Game Pass XSX first-person game has a 4K/30fps mode, so I can check out how bad/good it is?

I know you asked about first person games, but otherwise Valheim is a good game to test different modes with (it's on Game Pass). Not sure if it's 4K, but it has three modes: Quality, Balanced and Performance, and they run at 30, 40 and 60 fps respectively if I'm not completely misremembering.

I also have a C1, and even though I was kind of bothered by 30 fps on my old LCD, it's so bad on the OLED that it feels physically straining on the eyes. It kinda blows my mind how anyone can play something like Cyberpunk on an OLED at 30 fps and find it comfortable. The 40 fps mode in Valheim feels much better than 30, even if I will always choose 60 fps if it's an option. Or, for some games, 120 fps.

I really hope they're able to add a 40 fps/120 Hz mode to Starfield on Series X; it would make a huge difference for those of us with OLED TVs.