> Why? There's not a thing wrong with 30fps. I'd rather developers keep focusing on what they want to focus on.

How about the added input latency? Less fluid motion? There's quite a bit wrong with it.

> How about the added input latency? Less fluid motion? There's quite a bit wrong with it.

You could argue the same thing with 60fps vs 120fps. Where does it end? 30fps has been a standard for a long time; it works perfectly.

> You could argue the same thing with 60fps vs 120fps. Where does it end?

60 is the standard, 120 is brilliant, 30 is the bare minimum.
I agree. There's no excuse for any title to run at 30fps; just lower the graphical fidelity until you hit the 60fps target. Games are created to be played, not for watching shiny cutscenes.
It's shocking that the PS4 Pro and Xbox One X are aiming for pseudo-4K. If Sony or Microsoft had required a guaranteed 60fps - miss it and your game fails certification - they would have received my money.
I mean, I'm definitely in the 60-fps camp, which is why I've been 95% PC gaming the past several years.
But why would anyone refuse to play incredible games like Uncharted 4 or Forza Horizon 3 just because of this silly principle? That's just being too close-minded.
I prefer locked 30 to what passes for "60" right now. There are too many supposedly-60-fps games that drop far too often into the 40s and low 50s for me to take most developers' efforts seriously in this regard. If we're actually gonna push for 60 fps, then it has to be locked close to 100% of the time.
Why is 24Hz with film a bad thing? Do you want soap opera framerates in movies?

> Do you want soap opera framerates in movies?

I would like some. 24Hz exists only because it was the lowest (i.e. cheapest) acceptable framerate that could be synced to audio. There's no intrinsic reason that makes it superior. People are used to it, that is all. I'm interested in seeing what James Cameron brings to the table.
> I would like some.

What? That's a terrible idea. TV and film at higher framerates look awful much of the time, especially in fictional films. The soap opera effect is really jarring. There's a reason the high frame rate experiment with The Hobbit wasn't well-received and never took off. I hope film at 24fps is never abandoned.
Never heard of motion sickness caused by 30fps before. It surely only affects people on a minuscule scale. Either way, 30 FPS is fine, and if that's what devs want to target I'll accept it, because they get to choose, not us.

As someone that cares about high framerates, I'm somewhat concerned about the next generation of consoles moving to a modern, competitive CPU architecture like Ryzen.
This generation has already shown that developers have to actually optimize for PC now, and cannot get away with lazy ports under the expectation that average hardware will still be fast enough to brute-force it to run well.
It is likely that the consoles would use a lower-power variant - probably another APU.
However, based on what we're already seeing this generation, it seems unlikely that the performance gap would be large enough next-gen to brute-force games built for 30 FPS on that kind of hardware to run at 60 FPS on PC, unless there are some drastic improvements in desktop CPUs.
The problem is that CPUs are starting to focus on increased core-counts, as it's very difficult to improve per-core performance - while the performance of many games is reliant on just that.
I have a 4GHz 8-core/16-thread Ryzen, and many games don't scale to that. They'll run much faster on a 5GHz quad-core, even though the total computational power is lower.
This is not an accurate way to look at things, as there are many other factors involved, but as a generalized example: if we assume that SMT/HyperThreading gets you 25% higher performance on average, and both CPUs have the same IPC, the 4GHz 8-core CPU would have double the total computational power of a 5GHz 4-core CPU. Despite that, the quad-core would be better at running the majority of games due to its higher per-core performance.
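To put rough numbers on it (the 25% SMT uplift is the assumption from above, the quad-core is modeled without SMT to match the "double" figure, and this is a toy model rather than a benchmark):

```python
# Toy throughput model: cores x clock, plus an assumed 25% uplift from SMT.
# Real performance depends on IPC, memory, and how well a game threads.
SMT_UPLIFT = 1.25

def total_throughput(cores, ghz, smt):
    return cores * ghz * (SMT_UPLIFT if smt else 1.0)

ryzen = total_throughput(8, 4.0, smt=True)   # 8c/16t @ 4 GHz -> 40.0
quad = total_throughput(4, 5.0, smt=False)   # 4c/4t @ 5 GHz  -> 20.0
print(ryzen / quad)  # 2.0 - double the total compute on paper, yet a game
                     # that only scales to ~4 threads still favors the 5 GHz quad.
```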
Something else to consider though, on the merits of 60 vs 30 FPS:
The current push for ever-better graphics appears to be unsustainable.
Lots of developers that have been producing great games, but with severely inflated budgets, are struggling now or have gone under.
If you are targeting 60 FPS rather than 30 FPS, you don't have to put nearly as much money into ultra-high-fidelity art assets. There's a reason that so many B-tier or indie games are able to target 60 FPS, while the big AAA games with bloated budgets are pushing 30.
There are obviously other costs when trying to get your game running at 60, but it could be something which helps keep budgets in check if it places stricter limitations on developers.
> But why would anyone refuse to play incredible games like Uncharted 4 or Forza Horizon 3 just because of this silly principle?

Motion sickness for one thing.
Breath of the Wild is the one 3D home console game I've played in maybe 7 years now, and I had to stop because I'd get bad motion sickness every time I played it. I regret ever buying a Wii U, and I wish the Switch could run it at 60.
Fortunately, Variable Refresh Rates are now a part of the HDMI 2.1 spec.
Sadly, it's an optional part of the spec, but that means there will potentially be TVs with G-Sync/FreeSync-like functionality available in 2018.
That means you never have to worry about locking to a fixed framerate ever again.
It doesn't mean that you can ignore framerate, and you'll still need an option to lock to 30 for older TVs, but you can have games which are locked to 30 on a fixed-refresh display and unlocked on a VRR-capable display.
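A minimal sketch of that policy (the function and field names here are hypothetical; no real console or engine API is being described):

```python
# Hypothetical frame-cap policy: locked 30 on a fixed-refresh TV,
# uncapped within the panel's VRR window when VRR is available.
def choose_frame_cap(supports_vrr, vrr_min_hz=40, vrr_max_hz=120):
    if supports_vrr:
        # The display follows the GPU anywhere inside [vrr_min, vrr_max].
        return {"cap_fps": vrr_max_hz, "floor_fps": vrr_min_hz}
    # Fixed 60Hz panel: an even 30 (one frame per two refreshes)
    # beats an uneven 40-55 that can't be presented smoothly.
    return {"cap_fps": 30, "floor_fps": 30}

print(choose_frame_cap(False))  # {'cap_fps': 30, 'floor_fps': 30}
print(choose_frame_cap(True))   # {'cap_fps': 120, 'floor_fps': 40}
```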
Movies also used to look significantly smoother back before they double or triple-flashed the image in theaters, or before we were watching it on flicker-free displays at home. They were called 'flicks' for a reason.
The framerate has not kept up with the modern methods of displaying movies, and motion now looks worse than ever.
Even 60 FPS games are not as smooth or as free of motion blur as they used to be when displayed on a 60Hz CRT. I'd take a CRT doing 60Hz over a flicker-free OLED doing 90 any day.
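To put a rough number on that: on a sample-and-hold display the eye tracks a moving object while each frame sits still, smearing it across the retina, whereas a CRT lights each frame for only a millisecond or two. A back-of-the-envelope estimate (the ~2 ms phosphor figure is an assumed ballpark):

```python
# Perceived blur on a tracked object ~= panning speed x time the frame stays lit.
def blur_px(speed_px_per_s, hold_time_s):
    return speed_px_per_s * hold_time_s

speed = 960  # px/s, a brisk pan
print(blur_px(speed, 1 / 60))   # 16.0 px  - 60 fps on flicker-free sample-and-hold
print(blur_px(speed, 1 / 90))   # ~10.7 px - 90 fps still smears noticeably
print(blur_px(speed, 0.002))    # ~1.9 px  - 60 fps on a CRT (~2 ms phosphor glow)
```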
The severe judder and motion blur in movies look awful.
Higher framerates are always better. No exceptions.
I can't even stand to watch movies without interpolation turned up to the max. Interpolation artifacts are bad, but low framerate video is worse.
The Hobbit looked bad for numerous reasons besides the framerate. People are quick to blame the framerate for the effects looking bad, but they still look terrible at 24 FPS.
The only reason it looks jarring is because you have a certain expectation when you sit in the theater to watch a movie. If 48 FPS became standard you would soon adapt to it and realize how bad 24 FPS is.
TVs supporting variable refresh rates isn't as much of a concern for me as the fact that playing a game that shoots for 60 fps and continually dips into the 40s or lower is really jarring for me.
So this gen focused on hitting 720p+ (1080p) instead of sub-HD.
Mid-way, it focused on 4K.
Next gen we should get the same fidelity games as this gen (maybe slightly better) but with a focus on hitting 60fps.
That means we won't see the next huge graphical fidelity jump until the gen after next?
That's an interesting thought.
It's as much speculation as anything. Most games on modern consoles already choose to run at 30 fps instead of 60, and I have no idea why people think this choice will be different with a faster CPU. A faster CPU will allow devs to pack more stuff inside the 33.3ms of one frame at 30 fps, that's it. Console games are made to run at 30 fps because this is what the market considers acceptable.
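The frame-budget arithmetic behind that:

```python
# A frame budget is just 1000 ms divided by the target framerate.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# A faster CPU can fill the 33.3 ms with more simulation, or halve the
# budget for 60 fps - and the market has mostly rewarded the former.
```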
> With a Ryzen-type CPU they could at least have given players an easy choice for opting in on 60FPS at the cost of graphical fidelity, without altering the amount of enemy units or size of their maps.

Highly unlikely. If a game is designed to run at 30 fps on console h/w, then such a choice will probably be impossible, since it would require a different, well, game to be made alongside the one which runs at 30 fps. You're talking about a situation where the game's CPU part is made to run at 60 fps but its GPU part for some reason isn't - a rather weird balancing scenario which is unlikely to ever happen outside of some ports/remasters; you either get both at 30 or both at 60. This will still be true with Ryzen or Core or ARM or any other CPU in the system.
> playing a game that shoots for 60 fps and continually dips into the 40s or lower is really jarring for me

It is jarring because your display is running at a fixed 60Hz refresh rate.
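With vsync on a fixed 60Hz panel, a frame can only be shown for a whole number of refreshes, so "45 fps" is really an uneven mix of 60-fps and 30-fps frame times. A small sketch of the cadence (simplified: no frame queue, rendering assumed free-running):

```python
# 45 fps on a fixed 60Hz display, in integer "ticks" of 1/180 s to keep
# the math exact: one refresh = 3 ticks, one rendered frame = 4 ticks.
REFRESH, RENDER = 3, 4

# Each frame goes on screen at the first vsync after it finishes rendering.
present = [-(-i * RENDER // REFRESH) * REFRESH for i in range(1, 10)]  # ceil div
holds_ms = [round((b - a) * 1000 / 180, 1) for a, b in zip(present, present[1:])]
print(holds_ms)  # [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7, 16.7]
# A "45 fps" game becomes a 16.7/16.7/33.3 ms stutter pattern; a VRR
# display would instead hold every frame for an even 22.2 ms.
```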
> Higher framerates are always better. No exceptions.

They're quite simply not "always better." That's an opinion presented as if it were an objective fact. And it's one that happens to be strongly opposed by the vast majority of filmmakers out there and most critics. I think that, in film, high frame rate presentations are always worse, but that's just my opinion. I can't even fathom watching anything with motion interpolation turned on; I think it makes a lovely 24 presentation look like absolute garbage. I saw The Hobbit in HFR and was dragged to see it again at a local theater in a regular screening. It looked far, far better at the standard frame rate. I was able to engage with the film without being distracted the entire time by how awful the frame rate made the presentation, and I enjoyed the film much more. I'm not the only person who saw it in both formats and reported the same thing.
Again: you went to the theaters after many years of experiencing a modern (bad) 24 FPS presentation, and not being adapted to high framerate video.
Of course it is going to stand out when it's the first HFR movie you've seen. It takes time to adapt to new things.
Interpolation is not nearly as good as native HFR content, but it's improving all the time.
Some short comparison clips (not mine) make the difference obvious: 24 FPS looks awful there - even without telecine judder, which is unavoidable since most displays and video players max out at 60Hz.
24 FPS looks fine when you display it as originally intended: at 24Hz on a flickering projector.

> 24 FPS looks fine when you display it as originally intended: at 24Hz on a flickering projector.

I'm perfectly used to high frame rate video, thanks - I see it all the time in various applications where I think it works great, and where one obviously wouldn't want to use 24 or 30. But not for big-screen narrative cinema: no way, never, and it will never become a standard. Most directors and DPs will fight it tooth and nail in a way that will make the celluloid debate look like child's play. And it's not just because they're stuck in their ways; it's because HFR has a look that just isn't what they want for their films. It smashes the illusion of cinema into little bits.
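For anyone wondering what the telecine judder mentioned a couple of posts up actually is: fitting 24 frames into 60 refreshes forces 3:2 pulldown, where frames alternate between being held for 3 and for 2 refreshes. A minimal illustration:

```python
# 3:2 pulldown: 24 film frames must fill 60 display refreshes each second,
# so frames are alternately held for 3 and 2 refreshes (50 ms, then 33.3 ms).
REFRESH_MS = 1000 / 60

def pulldown_holds(n_frames):
    return [(3 if i % 2 == 0 else 2) * REFRESH_MS for i in range(n_frames)]

print([round(h, 1) for h in pulldown_holds(4)])  # [50.0, 33.3, 50.0, 33.3]
print(round(sum(pulldown_holds(24)), 1))         # 1000.0 - 24 frames fill 1 s
# The uneven 50/33 cadence is the judder; at native 24Hz (or on a 120Hz
# display holding every frame for 5 refreshes) motion is even again.
```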
Once again, you can assert that 24fps looks terrible, but that's just your opinion. I think a 35 or 70mm presentation at 24 looks absolutely lovely. I think most of the movie-going public would disagree with you that 24 looks awful, and you just can't force a more expensive format that people fundamentally don't want on audiences.
Moviegoers don't sit in the theaters thinking "oh wow, the judder and motion blur is just ruining this." But everyone I saw The Hobbit with did feel that the HFR hurt their enjoyment of the film. Even those who didn't know what frame rate was commented on how there was something weird about the feeling of the live stage in front of them, or the fakeness of the sets and the soapy look of the film.
Anyway I don't want to derail this thread, which is about games not movies. But that's my view, and I understand yours.
> How about the added input latency? Less fluid motion? There's quite a bit wrong with it.

None of which bothers a lot of people. I'm happy with a locked 30 and drop-dead effects for a huge swathe of games. I think more games should target 60, yes, but there will always be room for 30fps titles.
It won't; it will be native 4K with 8K upscaling and the same 20-30 FPS, with some games doing 60FPS.
This is because graphics are easier to parade around than resolution, even in screenshots, which makes marketing easier and sells the product (even if the final product is a visual downgrade).
> Native 4K is a massive waste of resources, both in terms of computing and the visual impact of graphics.

I know, but it's a good buzzword for the marketing team to exploit.
Print-like displays. At 32″, a 7680x4320 monitor has a pixel density of 275 pixels per inch, resulting in a 2560x1440 workspace using 3x scaling. That's approaching the pixel density of smartphones from seven years ago.
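The arithmetic behind those figures:

```python
import math

# Pixel density = diagonal resolution in pixels / diagonal size in inches.
w, h, diag_in = 7680, 4320, 32
print(round(math.hypot(w, h) / diag_in))  # 275 ppi
print(w // 3, h // 3)  # 2560 1440 - the effective workspace at 3x scaling
```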
> Also, HDMI 2.1 is already here on the Xbox One X this generation.

That was early misinformation. It's HDMI 2.0, but with FreeSync-capable hardware.
Fifteen years back... that was a breadcrumb thread. The mysterious fps-killer from Amsterdam featuring NURBS-based graphics. :)

EDIT: OP, are you EWollan from IGN forums back in the day? If so, I remember your "Kin" thread (turned out to be Killzone).
> The cost and progress of graphics technology makes this pretty much impossible. Unless you tell Nvidia and AMD to stop progressing with their hardware for a couple of years, it's not going to happen.

Their hardware can still be used for other applications aside from vidja; it's devs that have to take a stance on either FPS performance or eye candy.