
Sygma

Banned
Oct 25, 2017
954
The 24Hz market has nothing to do with next gen or whatever tho, it's purely tied to movies, and honestly, frame interpolation completely ruins image fidelity. Some movies were shot with 3D cams at 48 fps, like The Hobbit, and it looked ridiculously good
 

thuway

Member
Oct 27, 2017
5,168
Next gen could also intelligently make improvements while simulating / getting close to the 60 FPS experience. For example, games could start rendering nonessential background elements at lower frame rates, use frame interpolation for things like sparks / visual effects, and tap into FreeSync/G-Sync technologies that will dramatically improve the experience at 40+ frames.
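As a rough illustration of that kind of half-rate update scheme (purely a sketch; the system and method names below are made up, not from any real engine), the non-essential system only simulates on alternate frames and is interpolated in between:

```python
# Hypothetical sketch of half-rate updates in a 60 fps loop: essential work
# runs every frame, a non-essential system ticks every other frame with a
# doubled timestep and is interpolated on the frames it skips.

FRAME_DT = 1.0 / 60.0  # target frame time at 60 fps


class EffectSystem:
    """Stand-in for a non-essential system such as sparks or background crowds."""

    def __init__(self):
        self.prev_state = 0.0
        self.curr_state = 0.0

    def update(self, dt):
        self.prev_state = self.curr_state
        self.curr_state += dt          # placeholder for the real simulation step

    def visual_state(self, alpha):
        # Blend between the last two simulated states for display.
        return self.prev_state + (self.curr_state - self.prev_state) * alpha


effects = EffectSystem()
for frame in range(6):
    # Essential work (player input, physics, etc.) would run here every frame.
    if frame % 2 == 0:
        effects.update(FRAME_DT * 2)       # half-rate tick, doubled timestep
        shown = effects.visual_state(0.5)  # show the halfway point first...
    else:
        shown = effects.visual_state(1.0)  # ...then the fully updated state
    print(f"frame {frame}: effect state {shown:.4f}")
```

The trade-off is roughly a frame of extra latency on the interpolated systems, which is usually acceptable for background visuals.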
 

Civilstrife

Member
Oct 27, 2017
2,286
Going into photo mode in Mario Odyssey is probably the best way to demonstrate the difference between 60 and 30 fps.

I agree. Let's leave 30 fps behind. Higher frame rate has a much greater impact on gamefeel than higher resolutions.
 

PlanetKiller

Banned
Oct 29, 2017
123
That's a nice pre-rendered movie with nothing on the screen other than the characters and the darkness to hide everything else.

If there is power, developers will still use it to push graphics at 30 frames per second, no matter the hardware. It's down to what the developers want for their game. People expecting 60 frames per second in all games are going to be disappointed next generation. Not every developer can create their own engine and not every developer is as good as ND.

Also, HDMI 2.1 is already here on the Xbox One X this generation.
 
Last edited:
Oct 25, 2017
3,240
Developers aiming for 60fps would be really nice. It's not a matter of being picky either; a higher framerate makes a game FEEL better to play. Call of Duty has been dominant for so long, and I bet the 60fps and how fluid it feels to control have played a part in its success. It's just not a marketing checklist point because it's under the hood.
 

the_kaotek1

Member
Oct 25, 2017
849
I agree. There's no excuse for any title to run at 30fps; just lower graphical fidelity until you hit the 60fps target. Games are created to be played, not for watching shiny cutscenes.

It's shocking that the PS4 Pro and Xbox One X are aiming for pseudo 4K. If Sony or Microsoft had guaranteed 60fps or your game would fail certification, they would have received my money.

They'd have had my money too.

I'll be spending that money on upgrading my PC instead.

I'm particularly puzzled that MS didn't push for it as an option on the X, given it's aimed at hardcore gamers, but I guess the CPU is still not good enough for it.
 

Inuhanyou

Banned
Oct 25, 2017
14,214
New Jersey
Better be, or I am moving to PC gaming entirely.

You might as well throw your PS4 in the garbage right now then.

Because 60fps as a mandate for every game will never happen on console.

There is always something you can do better in a game by sacrificing 60fps and doubling your render time and CPU time. And for many devs that option will always be available to make the games they want to make.
 

kubev

Member
Oct 25, 2017
7,533
California
I prefer locked 30 to what passes for "60" right now. There are too many supposedly-60-fps games that drop far too often into the 40s and low 50s for me to take most developers' efforts seriously in this regard. If we're actually gonna push for 60 fps, then it has to be locked close to 100% of the time.
 

Pargon

Member
Oct 27, 2017
11,990
As someone that cares about high framerates, I'm somewhat concerned about the next generation of consoles moving to a modern competitive CPU architecture like Ryzen.
This generation has already shown that developers have to actually optimize for PC now, and cannot get away with lazy ports under the expectation that average hardware will still be fast enough to brute-force it to run well.

It is likely that the consoles would use a lower-power variant - probably another APU.
However, based on what we're already seeing this generation, it seems unlikely that the performance gap would be large enough next-gen to brute-force games built for 30 FPS on that kind of hardware to run at 60 FPS on PC, unless there are some drastic improvements in desktop CPUs.
The problem is that CPUs are starting to focus on increased core-counts, as it's very difficult to improve per-core performance - while the performance of many games is reliant on just that.

I have a 4GHz 8-core/16-thread Ryzen, and many games don't scale to that. They'll run much faster on a 5GHz quad-core, even though the total computational power is lower.
This is not an accurate way to look at things, as there are many other factors involved, but as a generalized example: if we assume that SMT/HyperThreading gets you 25% higher performance on average, and both CPUs have the same IPC, the 4GHz 8-core CPU would have double the total computational power of a 5GHz 4-core CPU. Despite that, the quad-core would be better at running the majority of games due to its higher per-core performance.
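For anyone who wants to check that back-of-the-envelope comparison, here it is as a tiny calculation under the same stated assumptions (a flat 25% SMT uplift and identical IPC, both simplifications):

```python
# Toy aggregate-throughput comparison under the assumptions stated above:
# identical IPC, SMT worth a flat +25% total throughput.

def total_throughput(cores, ghz, smt_uplift=1.0):
    return cores * ghz * smt_uplift

eight_core = total_throughput(cores=8, ghz=4.0, smt_uplift=1.25)  # 40.0 "units"
quad_core  = total_throughput(cores=4, ghz=5.0)                   # 20.0 "units"

print(eight_core / quad_core)  # 2.0 -> double the aggregate throughput

# Per-core (single-thread) performance tells the opposite story, which is
# what a game's main thread usually cares about:
print(5.0 / 4.0)  # 1.25 -> the quad-core is 25% faster per core
```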


Something else to consider though, on the merits of 60 vs 30 FPS:
The current push for ever-better graphics appears to be unsustainable.
Lots of developers that have been producing great games, but with severely inflated budgets, are struggling now or have gone under.
If you are targeting 60 FPS rather than 30 FPS, you don't have to put nearly as much money into ultra high fidelity art assets. There's a reason that so many B-tier or indie games are able to target 60 FPS, while the big AAA games with bloated budgets are pushing 30.
There are obviously other costs when trying to get your game running at 60, but it could be something which helps keep budgets in check if it places stricter limitations on developers.

I mean, I'm definitely in the 60-fps camp, which is why I've been 95% PC gaming the past several years.
But why would anyone refuse to play incredible games like Uncharted 4 or Forza Horizon 3 just because of this silly principle? That's just being too close-minded.
Motion sickness for one thing.
Breath of the Wild is the one 3D home console game I've played in maybe 7 years now, and I had to stop because I'd get bad motion sickness every time I played it. I regret ever buying a Wii U, and I wish the Switch could run it at 60.

I prefer locked 30 to what passes for "60" right now. There are too many supposedly-60-fps games that drop far too often into the 40s and low 50s for me to take most developers' efforts seriously in this regard. If we're actually gonna push for 60 fps, then it has to be locked close to 100% of the time.
Fortunately, Variable Refresh Rates are now a part of the HDMI 2.1 spec.
Sadly, it's an optional part of the spec, but that means there will potentially be TVs with G-Sync/FreeSync-like functionality available in 2018.
That means you never have to worry about locking to a fixed framerate ever again.
It doesn't mean that you can ignore framerate, and you'll still need an option to lock to 30 for older TVs, but you can have games which are locked to 30 on a fixed-refresh display and unlocked on a VRR-capable display.

I would like some. 24hz exists only because it was the lowest (i.e. cheapest) acceptable framerate that could be synced to audio. There's no intrinsic reason that makes it superior. People are used to it, that is all. I'm interested in seeing what James Cameron brings to the table.
Movies also used to look significantly smoother back before theaters double- or triple-flashed the image, or before we were watching them on flicker-free displays at home. They were called 'flicks' for a reason.
The framerate has not kept up with the modern methods of displaying movies, and motion now looks worse than ever.
Even 60 FPS games are not as smooth or as free of motion blur as they used to be when displayed on a 60Hz CRT. I'd take a CRT doing 60Hz over a flicker-free OLED doing 90 any day.

....Why is 24hz with film a bad thing? Do you want soap opera framerates in movies?
The severe judder and motion blur in movies looks awful.
Higher framerates are always better. No exceptions.
I can't even stand to watch movies without interpolation turned up to the max. Interpolation artifacts are bad, but low framerate video is worse.

What? That's a terrible idea. TV and film at higher framerates looks awful much of the time, especially in fictional films. The soap opera effect is really jarring. There's a reason why the high frame rate thing they tried with The Hobbit wasn't well-received and never took off. I hope film at 24fps is never abandoned.
The Hobbit looked bad for numerous reasons besides the framerate. People are quick to blame the framerate for the effects looking bad, but they still look terrible at 24 FPS.
The only reason it looks jarring is because you have a certain expectation when you sit in the theater to watch a movie. If 48 FPS became standard you would soon adapt to it and realize how bad 24 FPS is.
 

Venom

Banned
Oct 25, 2017
1,635
Manchester, UK
Motion sickness for one thing.
Breath of the Wild is the one 3D home console game I've played in maybe 7 years now, and I had to stop because I'd get bad motion sickness every time I played it. I regret ever buying a Wii U, and I wish the Switch could run it at 60.


Never heard of motion sickness caused by 30fps before. It must only affect people on a minuscule scale. Either way, 30 FPS is fine, and if that's what devs want to target I'll accept it, because they get to choose, not us.
 

tulpa

Banned
Oct 28, 2017
3,878
The severe judder and motion blur in movies looks awful.
Higher framerates are always better. No exceptions.
I can't even stand to watch movies without interpolation turned up to the max. Interpolation artifacts are bad, but low framerate video is worse.


The Hobbit looked bad for numerous reasons besides the framerate. People are quick to blame the framerate for the effects looking bad, but they still look terrible at 24 FPS.
The only reason it looks jarring is because you have a certain expectation when you sit in the theater to watch a movie. If 48 FPS became standard you would soon adapt to it and realize how bad 24 FPS is.

They're quite simply not "always better." That's an opinion presented as if it was an objective fact. And it's one that happens to be strongly opposed by the vast majority of filmmakers out there and most critics. I think that, in film, high frame rate presentations are always worse, but that's just my opinion. I can't even fathom watching anything with motion interpolation turned on, I think it makes a lovely 24 presentation look like absolute garbage. I saw Hobbit in HFR and was dragged to see it again at a local theater in a regular screening. It looked far, far better at the standard frame rate. I was able to engage with the film without being distracted the entire time by how awful the frame rate made the presentation, and I enjoyed the film much more. I'm not the only person who saw it in both formats and reported the same thing.
 

Akronis

Prophet of Regret - Lizard Daddy
Member
Oct 25, 2017
5,450
Next gen consoles are still going to be mid-tier parts. You can't have it both ways. Either the developer cuts back on the visuals and targets 60, or you get your eye candy with 30. The consoles are still going to cost <$500.

Buy a PC if you want both.
 

ActWan

Banned
Oct 27, 2017
2,334
It won't. Resolution will come instead... 4K 30fps, then 8K 30fps... I don't feel like 60 will ever become the standard (and even if it does, people will get used to higher fps and the issue will repeat itself).
 

potato

Banned
Oct 27, 2017
193
So this gen focused on hitting 720p+ (1080p) instead of sub-HD.
Mid-way it focused on 4K.
Next gen we should get the same fidelity games as this gen (maybe slightly better) but with a focus on hitting 60fps.

That means we won't see the next huge graphical fidelity jump until the gen after next?

That's an interesting thought.

EDIT: OP, are you EWollan from IGN forums back in the day? If so, I remember your "Kin" thread (turned out to be Killzone).
 

kubev

Member
Oct 25, 2017
7,533
California
Fortunately, Variable Refresh Rates are now a part of the HDMI 2.1 spec.
Sadly, it's an optional part of the spec, but that means there will potentially be TVs with G-Sync/FreeSync-like functionality available in 2018.
That means you never have to worry about locking to a fixed framerate ever again.
It doesn't mean that you can ignore framerate, and you'll still need an option to lock to 30 for older TVs, but you can have games which are locked to 30 on a fixed-refresh display and unlocked on a VRR-capable display.
TV support for variable refresh rates isn't as much of a concern for me as the fact that playing a game that shoots for 60 fps and continually dips into the 40s or lower is really jarring for me.
 

Akronis

Prophet of Regret - Lizard Daddy
Member
Oct 25, 2017
5,450
So this gen focused on hitting 720p+ (1080p) instead of sub-HD.
Mid-way it focused on 4K.
Next gen we should get the same fidelity games as this gen (maybe slightly better) but with a focus on hitting 60fps.

That means we won't see the next huge graphical fidelity jump until the gen after next?

That's an interesting thought.

Why would this gen or next gen be any different, such that devs will magically start focusing on 60fps? I can almost guarantee that it'll just be a native 4K focus with extra shiny bits at 30 for the majority of first party titles. Visuals have always taken precedence.
 

Zedelima

▲ Legend ▲
Member
Oct 25, 2017
7,714
I think the mass market doesn't care. I bet that if you asked the 60 million PS4 owners, the majority would say they prefer eye candy to 60fps.
At least that's the response I'd get if I asked the people I know.
 

carlsojo

Member
Oct 28, 2017
33,751
San Francisco
30fps and a stable framerate is the most the mass market cares about. Gaming budgets are already astronomical; they aren't going to dump even more into achieving something that won't guarantee significant returns.
 

radiotoxic

Member
Oct 27, 2017
1,019
Yes please, it's about time.

Also, I'd love for next gen to be the first in a long time where resolutions don't get pushed further up. I say let's concentrate on 60fps and 4K res only, make consoles consistent platforms across their libraries, and keep a quality/performance sweet spot.
 

dgrdsv

Member
Oct 25, 2017
11,843
True, but that's only speculation and not very likely.
It's as much speculation as the fact that most games on modern consoles choose to run at 30 fps instead of 60. I have no idea why people think that this choice will be different with a faster CPU. A faster CPU will allow devs to pack more stuff inside the 33.3ms of one frame at 30 fps, that's it. Console games are made to run at 30 fps because this is what the market considers acceptable.

With a Ryzen type CPU they could at least have given players an easy choice of opting in to 60FPS at the cost of graphical fidelity, without altering the amount of enemy units or the size of their maps.
Highly unlikely. If a game is designed to run at 30 fps on console h/w, then such a choice will probably be impossible, since it would require a different, well, game to be made alongside the one which will run at 30 fps. You're talking about a situation where a game's CPU part is made to run at 60 fps but its GPU part for some reason isn't - this is a rather weird balancing scenario which is unlikely to ever happen outside of some ports/remasters; you either get both at 30 or both at 60. This will still be true with Ryzen or Core or ARM or any other CPU in the system.
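For reference, the frame budgets in question are just arithmetic; this small sketch (nothing engine-specific) shows the 30 vs 60 fps time budgets being traded off:

```python
# Per-frame time budgets at the two target framerates discussed above.
budget_30fps = 1000.0 / 30  # ~33.3 ms per frame for simulation + rendering
budget_60fps = 1000.0 / 60  # ~16.7 ms per frame

# A CPU that is twice as fast can either fit twice the work into the 33.3 ms
# budget at 30 fps, or fit the existing workload into the 16.7 ms budget at
# 60 fps - which of the two happens is a design decision, not a hardware one.
print(budget_30fps, budget_60fps)
```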
 

Pargon

Member
Oct 27, 2017
11,990
TV support for variable refresh rates isn't as much of a concern for me as the fact that playing a game that shoots for 60 fps and continually dips into the 40s or lower is really jarring for me.
It is jarring because your display is running at a fixed 60Hz refresh rate.
Anything which is not a divisor of 60 (60, 30, 20 etc.) cannot be synced up to a fixed 60Hz refresh, and so it stutters badly.
Variable Refresh Rate fixes that by flipping things around, so that the display syncs up to the game, and only refreshes when it is presented with a new frame.
So if a game is running at 40 FPS, the display is technically running at "40Hz".
You can have a framerate which fluctuates between 55-65 FPS on a VRR display and you'd never notice. It would feel like a locked 60 does on a fixed-refresh display.

Now you'll still feel huge dips in performance from say 60 to 30, but you might see games with a 45 FPS cap so that it fluctuates between 35-45 FPS, instead of having to be capped at 30 to sync up to a fixed 60Hz refresh rate.
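A small sketch of why those particular numbers behave that way on a fixed 60Hz display, under idealized vsync timing (an assumption; real engines add their own frame pacing on top of this):

```python
# How a steady game framerate maps onto a fixed 60 Hz display with vsync:
# each game frame is held on screen for however many 16.7 ms refreshes it spans.
from math import ceil

REFRESH_HZ = 60

def refresh_hold_pattern(game_fps, frames=8):
    """Refreshes each game frame is held for on a fixed 60 Hz display (ideal timing)."""
    # Frame k is ready at time k/game_fps and is shown from the first refresh
    # at or after that instant until the next frame takes over.
    shown_at = [ceil(k * REFRESH_HZ / game_fps) for k in range(frames + 1)]
    return [shown_at[k + 1] - shown_at[k] for k in range(frames)]

print(refresh_hold_pattern(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even pacing, no judder
print(refresh_hold_pattern(40))  # [2, 1, 2, 1, 2, 1, 2, 1] -> uneven hold times = judder
print(refresh_hold_pattern(24))  # [3, 2, 3, 2, 3, 2, 3, 2] -> the classic 3:2 pulldown cadence

# On a VRR display the panel simply refreshes when each frame arrives,
# so a steady 40 fps is presented with even 25 ms frame times instead.
```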

They're quite simply not "always better." That's an opinion presented as if it was an objective fact. And it's one that happens to be strongly opposed by the vast majority of filmmakers out there and most critics. I think that, in film, high frame rate presentations are always worse, but that's just my opinion. I can't even fathom watching anything with motion interpolation turned on, I think it makes a lovely 24 presentation look like absolute garbage. I saw Hobbit in HFR and was dragged to see it again at a local theater in a regular screening. It looked far, far better at the standard frame rate. I was able to engage with the film without being distracted the entire time by how awful the frame rate made the presentation, and I enjoyed the film much more. I'm not the only person who saw it in both formats and reported the same thing.
Again: you went to the theater after many years of experiencing a modern (bad) 24 FPS presentation, without being adapted to high framerate video.
Of course it is going to stand out when it's the first HFR movie you've seen. It takes time to adapt to new things.
Interpolation is not nearly as good as native HFR content, but it's improving all the time.
Some short comparison clips (not mine):
24 FPS looks awful - even without telecine judder (which is unavoidable there, since most displays and video players max out at 60Hz).

Interpolation is probably going to start becoming a big deal for games in the not-too-distant future as well - especially once 120Hz support becomes a standard feature of TVs.
Not interpolation from the display, but within the game engine. As the techniques improve for VR, I'm sure they will be adapted to work equally well on 2D displays.
 

tulpa

Banned
Oct 28, 2017
3,878
Again: you went to the theater after many years of experiencing a modern (bad) 24 FPS presentation, without being adapted to high framerate video.
Of course it is going to stand out when it's the first HFR movie you've seen. It takes time to adapt to new things.
Interpolation is not nearly as good as native HFR content, but it's improving all the time.
Some short comparison clips (not mine):
24 FPS looks awful - even without telecine judder (which is unavoidable there, since most displays and video players max out at 60Hz).

I'm perfectly used to high frame rate video, thanks. I see it all the time in various applications where it works great and where one obviously wouldn't want to use 24 or 30, but not for big-screen narrative cinema - no way, never, and it will never become a standard. Most directors and DPs will fight it tooth and nail in a way that will make the celluloid debate look like child's play. And it's not just because they're stuck in their ways; it's because HFR has a look that just isn't what they want for their films. It smashes the illusion of cinema into little bits.

Once again, you can assert that 24fps looks terrible but that's just your opinion. I think a 35 or 70mm presentation at 24 looks absolutely lovely. I think most of the movie-going public would disagree with you that 24 looks awful and you just can't force a more expensive format that people fundamentally don't want on audiences.

Moviegoers don't sit in the theaters thinking "oh wow, the judder and motion blur is just ruining this." But everyone I saw The Hobbit with did feel that the HFR hurt their enjoyment of the film. Even those who didn't know what frame rate was commented on how there was something weird about the feeling of the live stage in front of them, or the fakeness of the sets and the soapy look of the film.

Anyway I don't want to derail this thread, which is about games not movies. But that's my view, and I understand yours.
 

Pargon

Member
Oct 27, 2017
11,990
I'm perfectly used to high frame rate video, thanks. I see it all the time in various applications where it works great and where one obviously wouldn't want to use 24 or 30, but not for big-screen narrative cinema - no way, never, and it will never become a standard. Most directors and DPs will fight it tooth and nail in a way that will make the celluloid debate look like child's play. And it's not just because they're stuck in their ways; it's because HFR has a look that just isn't what they want for their films. It smashes the illusion of cinema into little bits.

Once again, you can assert that 24fps looks terrible but that's just your opinion. I think a 35 or 70mm presentation at 24 looks absolutely lovely. I think most of the movie-going public would disagree with you that 24 looks awful and you just can't force a more expensive format that people fundamentally don't want on audiences.

Moviegoers don't sit in the theaters thinking "oh wow, the judder and motion blur is just ruining this." But everyone I saw The Hobbit with did feel that the HFR hurt their enjoyment of the film. Even those who didn't know what frame rate was commented on how there was something weird about the feeling of the live stage in front of them, or the fakeness of the sets and the soapy look of the film.

Anyway I don't want to derail this thread, which is about games not movies. But that's my view, and I understand yours.
24 FPS looks fine when you display it as originally intended: at 24Hz on a flickering projector.
The problem with 24 FPS is that the modern presentation displays it at 48/72Hz on projectors, or completely flicker-free at home on a TV.
This makes the image judder terribly whenever the camera moves, and adds a ton of motion blur. That's not an opinion - it's a fact.

Interpolation fixes that and restores the original look.
Alternatively, HFR provides a similar look, with less flickering and no interpolation artifacts.
Again: The Hobbit is not a good example of HFR, whether you are in favor of or against it, and one movie is not enough to either adapt to or condemn it.

Display 24 FPS film on a projector with a single-bladed shutter, or on a 72Hz CRT or 120Hz OLED using black frame insertion to get an effective 24Hz presentation, and you'll see that motion suddenly gets super fluid. It's as if you switched on interpolation, but with a lot of flickering instead of interpolation artifacts.
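A minimal sketch of the cadence being described, assuming the display simply blanks every refresh within a film frame except the first (real BFI implementations vary in how many refreshes they keep lit and how they handle brightness):

```python
# Effective 24 Hz presentation via black frame insertion (BFI):
# each film frame occupies refresh_hz / 24 refreshes, only the first is lit,
# so each frame is flashed exactly once - like a single-bladed shutter -
# instead of being held on screen (sample-and-hold).

def bfi_pattern(refresh_hz, film_fps=24):
    per_frame = refresh_hz // film_fps           # refreshes per film frame
    return ["lit"] + ["black"] * (per_frame - 1)

print(bfi_pattern(72))   # ['lit', 'black', 'black']                         -> 3 refreshes/frame
print(bfi_pattern(120))  # ['lit', 'black', 'black', 'black', 'black']       -> 5 refreshes/frame
```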
 

Justsomeguy

Member
Oct 27, 2017
1,711
UK
How about the added input latency? Less fluid motion? There's quite a bit wrong with it.
None of which bothers a lot of people. I'm happy with a locked 30 and drop-dead effects for a huge swathe of games. I think more games should target 60, yes, but there will always be room for 30fps titles.
 

Garf02

Banned
Oct 31, 2017
1,420
It won't. It will be native 4K with 8K upscaling and the same 20-30 FPS, with some games doing 60FPS.

This is because graphics are easier to parade around than resolution, even in screenshots, which makes for easy marketing and sells the product (even if the final product is a visual downgrade).
 

illamap

Banned
Oct 28, 2017
466
It won't. It will be native 4K with 8K upscaling and the same 20-30 FPS, with some games doing 60FPS.

This is because graphics are easier to parade around than resolution, even in screenshots, which makes for easy marketing and sells the product (even if the final product is a visual downgrade).

Native 4K is a massive waste of resources, both in terms of computing and the visual impact of graphics.
 
Oct 27, 2017
696
Vienna
I really strongly believe consoles will stop going higher in resolution than 4K. The diminishing returns above 4K are too severe to justify spending resources there.
 

Nostradamus

Member
Oct 28, 2017
2,280
I think that graphics settings are going to become standard on consoles, and that makes great sense. Consoles are like PCs now, and they also get updated configurations mid-gen (Pro and X), so it should be easy for developers to offer them. 60fps will become standard only when the processing cost of going from 30 to 60 becomes insignificant compared to other factors like resolution and graphics effects when aiming for image quality. Until that happens (and that depends a lot on CPU power) we will get to choose between better graphics quality or better frame-rate. And to be fair, options are always good, so why not.
 

Th0rnhead

Member
Oct 27, 2017
463
30 FPS doesn't ever really bother me. I feel like a 60 FPS mandate would result in fewer games that really push hardware to its limits. I can understand the benefit in competitive games (though honestly it doesn't really matter to me there either), but in single player games it doesn't really make a difference.

I've never played a game locked at 30 FPS and thought 'damn, the input lag is terrible compared to 60 FPS'. I dunno though. I play most multiplats on PC anyway, where it's easy to get 60 frames with high quality settings.

Frame drops though... those are irritating.
 

SoftTaur

Member
Oct 25, 2017
489
I used to care about framerate a lot, but now as long as a game is a consistent 30 I'm fine with it.
 

Rad

Member
Oct 26, 2017
1,068
Depends on the genre/game for me. Online games, sure make them 60fps. But single player games? I'd choose better graphics over 60fps most of the time.
 

Garf02

Banned
Oct 31, 2017
1,420
Native 4K is a massive waste of resources, both in terms of computing and the visual impact of graphics.
I know, but it's a good buzzword for the marketing team to exploit.

BUY CALL OF DUTY FUTURE BABIES WARFARE, NOW IN TRUE ULTRA HD NATIVE 4K!!!, ONLY ON (Pick Console/PC)!!!!

I can't wait for the moment the industry hits a wall with this stupid graphics trend.
 

Inuhanyou

Banned
Oct 25, 2017
14,214
New Jersey
Consoles have not "hit a wall" regarding graphics since they started; I don't see why that would change now. And as easy as it is for some people to dismiss in their mind, I have to keep saying it: there's more to the difference between 30fps and 60fps than something as shallow as graphics fidelity alone.
 

Garf02

Banned
Oct 31, 2017
1,420
We are already around the uncanny valley (either just before or just after it), and I'm sure the resolution meme will die off (what's the point of 8K on a 32" display?),
so yeah, I do believe we will, sooner rather than later, reach the cap on graphics for games.
 

Deleted member 11934

User requested account closure
Banned
Oct 27, 2017
1,045
We're reaching a point where "better graphics" over "60fps" means very subtle effects most people won't even notice without looking closely, or completely useless AA filters (which I refuse to use even on PC). Mid-gen consoles now run at 60 fps 1080p. Next generation must have that as a standard; this generation was already too shitty. Racing games and FPSs NEED it to feel good.
 

Pargon

Member
Oct 27, 2017
11,990
(whats the point on 8K in a 32" Display)???
Print-like displays. At 32″ a 7680x4320 monitor has a pixel density of 275 pixels per inch, resulting in a 2560x1440 workspace using 3x scaling. That's almost approaching the pixel density of smartphones from seven years ago.
Since Windows is built around 96 DPI, it would ideally be 30.6″ rather than 32″ though.

Another advantage of 8K displays is that they support integer scaling of both 2560x1440 and 3840x2160.
So you could choose to render at either of those resolutions without any compromises, like you have to when displaying 2560x1440 on a 4K display. (non-integer scaling blurs the image)
Of course virtually no 4K displays support unfiltered integer scaling of 720p/1080p content, so while it's a potential advantage for 8K displays, it may never be realized.
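The arithmetic behind those figures, for anyone who wants to verify it (pure geometry, no assumptions beyond the stated 32″ / 7680x4320 panel and Windows' 96 DPI baseline):

```python
# Pixel density and integer-scaling checks for a 32" 7680x4320 (8K) monitor.
from math import hypot

w, h, diag_in = 7680, 4320, 32
ppi = hypot(w, h) / diag_in
print(round(ppi))                          # ~275 pixels per inch

# Ideal diagonal for exact 3x scaling of Windows' 96 DPI baseline (288 PPI):
print(round(hypot(w, h) / (96 * 3), 1))    # ~30.6 inches

# Integer scaling: both common render resolutions divide 8K evenly,
# so they could be shown without any resampling blur.
print(7680 / 2560, 4320 / 1440)  # 3.0 3.0 -> 1440p maps to exact 3x3 pixel blocks
print(7680 / 3840, 4320 / 2160)  # 2.0 2.0 -> 4K maps to exact 2x2 pixel blocks
```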

I'm in full support of 8K displays if they are also capable of accepting lower resolution inputs at higher refresh rates.
They would make amazing desktop monitors. Just don't expect to play the latest games using ultra settings at their native resolution.
 
OP
Wollan

Mostly Positive
Member
Oct 25, 2017
8,809
Norway but living in France
Also, HDMI 2.1 is already here on the Xbox One X this generation.
Was early misinformation. It's HDMI 2.0 but with FreeSync-capable hardware.
So it's still an 18Gbps physical limit, not 48Gbps. More info here on 2.1 features (the main specification is expected to release in Dec. 2017).
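To put the 18 vs 48 Gbps figures in context, a rough raw-pixel-rate calculation (deliberately ignoring blanking intervals and link encoding overhead, both of which push real requirements noticeably higher):

```python
# Back-of-the-envelope video bandwidth: raw pixel data only, so actual HDMI
# requirements are higher once blanking and encoding overhead are included.

def raw_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(3840, 2160, 60, 24))    # ~11.9 Gbps - 4K60, 8-bit RGB
print(raw_gbps(3840, 2160, 120, 30))   # ~29.9 Gbps - 4K120, 10-bit RGB
print(raw_gbps(7680, 4320, 60, 30))    # ~59.7 Gbps - 8K60, 10-bit RGB (exceeds even 48Gbps without compression)
```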
Wait, is the OP THE Wollan?
EDIT: OP, are you EWollan from IGN forums back in the day? If so, I remember your "Kin" thread (turned out to be Killzone).
Fifteen years back... that was a breadcrumb thread. The mysterious fps-killer from Amsterdam featuring Nurbs-based graphics. :)
 

Dits

Member
Oct 27, 2017
54
The cost and progress of graphics technology makes this pretty much impossible. Unless you tell Nvidia and AMD to stop advancing their hardware for a couple of years, it's not going to happen.
 

Deleted member 17491

User requested account closure
Banned
Oct 27, 2017
1,099
60fps should've long been the standard by now. Each time hardware power increases, you see devs who would rather use that power to increase visual fidelity instead, while blurring the image with the motion blur needed at 30fps. Or even devs such as Insomniac, who once pushed for 60fps, throwing their hands in the air and going for 30fps.

I'd choose 60fps over 30fps any day of the week, including cases such as Hellblade.

Edit: Although, to be fair, I feel that overall performance has improved compared to the previous generation.
 
Last edited:

Garf02

Banned
Oct 31, 2017
1,420
The cost and progress of graphics technology makes this pretty much impossible. Unless you tell Nvidia and AMD to stop advancing their hardware for a couple of years, it's not going to happen.
Their hardware can still be used for other applications aside from vidja; it's devs that have to take a stance on either FPS performance or eye candy.
 

Sygma

Banned
Oct 25, 2017
954

How do you get motion sickness with Zelda, which literally has zero camera movement, ever, as well as a nutty draw distance? With FPS games I'd understand it, but... Zelda?


The Hobbit was shot with 48 fps cameras, and the 48 fps showing was only in some theaters. If you went, you got some of the most fluid / insane looking 3D ever. And you know why? Because 48 is 24fps x2.

Every single movie ever is shot at 24 fps, and that's also why you want to buy a monitor with a refresh rate that's a multiple of 24Hz: so you don't need frame interpolation. Higher framerates are something, but image fidelity is the real deal here. Someone saying 48 fps rendering looks bad is clearly out of his / her mind; the effects in The Hobbit were kinda ahead of their time.

And no, 48 fps everything won't make 24 fps look bad.
You can download SVP (SmoothVideo Project) on PC and run movies at 48 fps to give it a shot. It will look more fluid, sure. But at 24 fps it won't look bad.
You can download project SVP on pc and run movies at 48 fps to give it a shot. It will look more fluid sure. But at 24 fps it won't look bad[/QUOTE]