That picture is nonsense, you can't use a 65" screen sitting at a desk...
I have a 55" TV hooked up to my consoles and my gaming PC. The latter is also connected to my multi-monitor setup. I certainly wouldn't mind a 65" TV for couch gaming. That said, a 55" option would have been nice as well to cut down the price a bit.Why the FUCK is this 65"? Who the fuck has a PC gaming setup that can accommodate something that large and to be able to sit far enough away?
Stupid.
On one hand, I would really, really want one. On the other hand, I can't see myself paying the price for this. The chance of one under $2K is zero, I'm guessing.
> People in this thread are fucking nuts if they think this thing is coming in under 5 grand.

I can see $3,500. The X1 chip is not expensive; G-Sync is like $200-300. So the real question is how much a 65" low-latency, high-refresh HDR panel is going to run. $3K is high enough to be plausible for, say, Acer.
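To make that build-up explicit, here is a rough bill-of-materials style sketch. Every figure below is the poster's guess (or my placeholder to make the total land near $3,500), not a confirmed cost:

```python
# Hypothetical BFGD cost build-up from the poster's guesses (not real BOM data).
components = {
    "65in low-latency high-refresh HDR panel": 2500,  # the big unknown; placeholder
    "G-Sync HDR module": 300,                         # "like $200-300"
    "Shield (Tegra X1) electronics": 200,             # "X1 chip is not expensive"
    "assembly, margin, misc": 500,                    # placeholder
}
total = sum(components.values())
print(f"Estimated retail price: ${total:,}")  # ~$3,500
```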
> Going by the sharp uptick for monitors compared to similar sized TVs, I think $3,000 is just a dream, not to mention Nvidia can be shitty with pricing in general.

Sure, it all just depends on how many they want to sell.
> This stinks of a last ditch effort by Nvidia to protect their G-Sync golden goose, or extract the last few eggs against its impending doom in the form of Variable Refresh Rate. Evidently already supported by 2018 Samsungs, and all but assuredly hitting every manufacturer by next year, a continued attempt to lock their GPUs to G-Sync would send AMD's market share and mind share through the roof and Nvidia into freefall. What may seem like a non-issue in the comparatively tiny gaming monitor market will become a completely different monster when every modern display people own supports an open standard Nvidia are actively blacklisting.

I agree. They are going to have to support VRR. Eurogamer seems to know what's up as well. From their article about this "monitor":

> With specs pushed to the max in all areas - not least in terms of screen-sizes - we should expect premium price-points for the BFGDs, and it's to be expected bearing in mind that the FreeSync-like HDMI 2.1 variable refresh technology should start to appear in consumer-level 4K displays over the next year. G-Sync has always been positioned as the premium option in the PC display space, and Nvidia is clearly looking to do the same with living room screens. From our perspective, while G-Sync may command a gold standard in terms of its feature set and performance, the proliferation of FreeSync screens and the upcoming arrival of HDMI 2.1 variable refresh for consumer 4K TVs presents a strong argument for the green team to embrace both standards.
I assume one reason it seems perfectly fine to me is that my main "couch-scale" display is, in fact, a projector. Projectors don't have tuners either. I don't see why it appears strange to replace the display in a couch arrangement with something that is not a TV.
So you're going to replace your 65" TV with a 65" PC gaming monitor?
> Unless their new GPU really is a new hardware design and significantly leaps ahead of what they already have. If the gap between them and AMD grows even more substantial, it's going to be a real dilemma. The big hope is the HDMI group has enough sense to make it a mandatory part of the spec. And let's face it, running movies at 24/48Hz has very valid uses for this too. I don't think even Nvidia would be crazy enough to abandon HDMI support.

I believe it already is a part of the HDMI 2.1 spec. It's just going to be up to Nvidia if they try to remove or block the feature from their cards like they did with DP.
> Eh, I have my PC connected to my 65" OLED display, and I know plenty of people that do something similar. Crazy notion, right?

Stop, those of us using TVs to play PC games on, with our controllers and comfy couches, are breaking the illusion of cramped/stuffy PC gaming.
> Unless their new GPU really is a new hardware design and significantly leaps ahead of what they already have. If the gap between them and AMD grows even more substantial, it's going to be a real dilemma. The big hope is the HDMI group has enough sense to make it a mandatory part of the spec. And let's face it, running movies at 24/48Hz has very valid uses for this too. I don't think even Nvidia would be crazy enough to abandon HDMI support.

VRR is an optional part of the HDMI 2.1 spec. I fully expect only expensive high-end TVs will support it. If you think the $300 TV you buy at Walmart will have HDMI 2.1 VRR, boy do I have a great bridge to sell you. Expect only the top-shelf Samsung QLED, LG OLED, and Sony models to have it. You're going to be paying $3K+ for VRR anyway, and compared to that these things with G-Sync won't look badly priced at all.
> A lot of times things are declared optional/mandatory. It's not actually a part of the mandatory DP spec, is it? That's why it's so important that it's declared mandatory for certification.

Indeed. VRR has been declared as an optional part of the HDMI 2.1 spec.
> Indeed. VRR has been declared as an optional part of the HDMI 2.1 spec.

Source?
The 2018 OLEDs will be capable of displaying 4K at 120Hz. I'd rather have that and get whatever card is going to replace the 1080 Ti. You'll probably come out cheaper.
Do you have a source for that? (as in, supporting that actual input resolution/refresh rate)
It would be fantastic (though sadly the sizes LG currently produces don't fit my use cases).
Have they explained how yet? It was speculated to be limited to USB-C. So unless you can convert DP to USB-C, I wouldn't expect to be able to feed it any live gaming input.
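For context on why 4K 120Hz input is a genuine question: uncompressed 4K at 120Hz needs roughly twice the data rate HDMI 2.0 can deliver. A quick back-of-the-envelope sketch (assuming 4:4:4 chroma and 10-bit color, and ignoring blanking overhead):

```python
# Rough uncompressed video bandwidth estimate (illustrative assumptions:
# 4:4:4 chroma, 10 bits per channel, no blanking overhead, no DSC).
def video_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    bits_per_second = width * height * refresh_hz * bits_per_channel * channels
    return bits_per_second / 1e9

print(f"4K@60Hz  ~ {video_gbps(3840, 2160, 60):.1f} Gbps")   # ~14.9 Gbps
print(f"4K@120Hz ~ {video_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9 Gbps
```

HDMI 2.0's usable data rate is only about 14.4 Gbps, while HDMI 2.1's ~42.6 Gbps covers 4K 120Hz comfortably; DisplayPort 1.4 (~25.9 Gbps usable) needs DSC to get there.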
> A few are counting the fact that it is not an OLED against it. Doesn't OLED have burn-in issues? Genuinely curious.

OLED does have burn-in issues for monitor-like use cases.
Pricing will be nuts, but I like the idea of larger displays (TVs) aimed at gaming and want to see more companies push into this area.
I find it a little strange that Nvidia aren't really working with TV manufacturers to at least put G-Sync into their TVs. I guess everyone will just use the HDMI 2.1 standard.
I hope this sort of thing gains traction. I like big screen gaming.
There's nothing strange about it. No TV manufacturer would ever pay Nvidia the $100-200+ per unit they evidently charge for the G-Sync hardware and license (an estimate based on the MSRP of an otherwise identical FreeSync vs. G-Sync monitor). Samsung famously doesn't support Dolby Vision because they don't want to pay Dolby the licensing/royalty fee, which is probably $10-20/unit based on what MS charges for the Atmos headphone license. Their G-Sync sales team has probably been laughed out of more board rooms than they can count.
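For what it's worth, the estimation method described there is just a street-price delta. A sketch with hypothetical numbers (neither the monitor pair nor the prices are real listings):

```python
# Estimating Nvidia's per-unit G-Sync take from the price delta of an
# otherwise identical monitor pair. Prices are hypothetical placeholders.
freesync_msrp = 499   # e.g., a 27" 1440p 144Hz FreeSync model (hypothetical)
gsync_msrp = 699      # same panel/chassis with the G-Sync module (hypothetical)

gsync_premium = gsync_msrp - freesync_msrp
print(f"Implied G-Sync hardware + license cost: ~${gsync_premium}/unit")  # ~$200
```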
If that's the case, why don't Nvidia just support the open standard? I'm sure they could implement a sort of "G-Sync through HDMI" or something.
Because there are still tens of thousands, or possibly low hundreds of thousands, of monitor manufacturers and customers who have been stupid enough to pay it. G-Sync has probably generated over $100 million in revenue LTD; why would they voluntarily give that up?
Only way I see them giving it up is if they begin to lose a significant amount of market share.
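A quick sanity check on that "over $100 million" guess, using the poster's own ranges (all inputs are forum estimates, nothing disclosed by Nvidia):

```python
# Back-of-the-envelope check on "over $100 million in revenue LTD".
# Both inputs are the poster's guesses, not Nvidia's actual figures.
units_sold_estimate = 500_000   # "low 100's of thousands" of G-Sync monitors
fee_low, fee_high = 100, 200    # estimated $ Nvidia captures per unit

print(f"~${units_sold_estimate * fee_low / 1e6:.0f}M "
      f"to ~${units_sold_estimate * fee_high / 1e6:.0f}M lifetime")
# ~$50M to ~$100M: the claim only holds at the top of those guesses.
```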
A few are counting the fact that it is not an OLED against it. Doesn't OLED have burn-in issues?
Genuinely curious.
Like all emissive display technologies, OLED does have the risk of burn-in, yes. How susceptible OLED displays are to burn-in is subject to debate. Many users here, myself included, game all the time on OLEDs without issue, but there are verified reports of OLED burn-in here as well, so it is certainly a valid concern.
With this BFG display, and where pricing will likely end up ($4k+), you are going to be comparing it to displays like the Sony 65A1E OLED, or the LG E8. In side-by-side comparisons the OLEDs will win every time, particularly against an LCD with only 1k nits of peak brightness. Sony and Samsung will be pushing 2k nits of peak brightness in 2018 with their LCDs... this display is outdated upon release.
> Linus just confirmed it's VA, and local dimming isn't all that great. No comment on price.

Local dimming introduces input lag. It would be nice if people who aren't THAT hardcore about input lag could have a real local dimming option that made the lag 20 ms instead of 10 ms or something. I mean, for watching Blu-rays and stuff, lag doesn't matter at all, so having the option at least for that would be nice.
> Only way I see them giving it up is if they begin to lose a significant amount of market share.

AMD is completely uncompetitive in PC gaming right now, so there's no way this would cause them to lose market share unless Volta were somehow slower than Pascal and AMD suddenly doubled performance between Vega and Navi. This is not a realistic scenario, so Nvidia's absolutely dominant control of PC gaming will continue through at least 2019.