In the Vive Pro Eye? Maybe; for some reason they've given off the impression that it's just the same resolution. But we'll have to wait for more details.
The Pimax 8k does not allow for a native 8k input though. It uses upscaling from 1440p per eye.
Is it possible to import stuff like the Steam Controller or the Steam Link where you live, or is that off limits for whatever reason?

Then the chance of the Knuckles arriving in my country is slim to none.
Excuse the ignorance, how does this eye tracking improve resolution?
It doesn't magically improve resolution, as that is locked in by the display, but it allows much higher resolution displays to require the same workload as current ones. With a perfect form of eye tracking and foveated rendering, you could go straight from the current 1080x1200 per eye of the Rift/Vive to 5000x5000 per eye with basically no difference in performance.
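To put rough numbers on that claim, here's a back-of-the-envelope sketch. Everything besides the two panel resolutions is an illustrative assumption (foveal window size, 1/16 peripheral density), not a spec from any real headset:

```python
# Compare fragment-shading workload for brute-force rendering vs. an
# idealized two-zone foveated renderer. All zone sizes are assumptions.

def pixels(w, h):
    return w * h

full_eye = pixels(5000, 5000)     # hypothetical high-res panel, per eye
current_eye = pixels(1080, 1200)  # Rift/Vive CV1 panel, per eye

# Assume only a small foveal window is shaded at native density, and the
# rest of the view at 1/16 density (quarter resolution in each axis).
fovea = pixels(800, 800)          # assumed foveal region size
periphery = (full_eye - fovea) / 16

foveated_eye = fovea + periphery
print(f"brute force: {full_eye / current_eye:.1f}x current workload")
print(f"foveated:    {foveated_eye / current_eye:.1f}x current workload")
```

Under those assumptions the brute-force cost is roughly 19x today's per-eye workload, while the foveated version lands under 2x, which is the whole pitch in one division.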
Yeah, we bought eye tracking glasses like three or four years ago for $30k. A year later SMI brags in an interview that they're confident they can bring the price down to $100 for HMDs. Thanks, you fucks, lol. That it's apparently already ready for consumer headsets is stupefying, really. I guess if the tracking tech is not yet quick enough they can make the high resolution rendering area a lot bigger, meaning it's a bit less efficient, but still (I say this while having no idea how many degrees the average eye saccade covers). Amazing.

Keep in mind, eye tracking kits already exist: they're just in the price range of tens of thousands of dollars, and until super recently have been way too slow to actually accommodate foveated rendering (since the tracking has to be lightning quick). Seeing a consumer headset with eye tracking this early is honestly mind-blowing.
Super simplified explanation: it allows the device to render high detail where you're looking, and spend less effort on the rest.
I meant even introduction into the ecosystem. You have to start getting this into the hands of devs to have it established once it's out. Partnering with Nvidia for that is bad news for ecosystem lock-in, but good news for progress in the area. Trade-off: exclusivity for speedier/more effective development.

We are still a few years out. Keep in mind this is an enterprise headset. It either has imperfect eye tracking that won't work 100% on point for everyone, or it's using a very expensive eye tracking solution that is nigh perfect but keeps it out of consumer reach for a few years.
Sounds neat, but I have a hard time getting super excited for this because I assume it's going to be extremely expensive and targeted more for the business users.
more than all that, it bodes extremely well for all-in-one headsets. The biggest thing holding back all-in-one mobile headsets is the need for mobile GPUs to dissipate heat, which takes up space. The more they work, the hotter they get. Early GearVRs with note 4s, for example, used to overheat and give you error messages.
This is a tech that could make much, much cooler mobile GPUs go way further. This is an important step towards getting stuff that is actually glasses shaped, not goggles shaped.
Chips running cooler not only means smaller form and better performance, but also better battery too. There are lots and lots of reasons to be excited about foveated rendering wrt wearables.
Pardon my ignorance, is this implementation by HTC exclusive to Nvidia hardware, or is the API hardware agnostic?
The way the fovea of the eye works is that we have a point of absolute clarity, and everything around it gradually loses focus until we hit our peripheral vision. Effectively, when we look at any screen, but VR especially, there is unnecessary clarity taking up processing power, because more of the screen is rendered in high fidelity than we actually need. So assuming we know where the person is looking, and based on what we know about the eye and vision, we can render much less of the screen in high detail at any given time, reducing the cost of rendering that scene dramatically.
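That falloff can be sketched with the common "half-acuity eccentricity" model from the vision literature. The e2 constant of 2.5 degrees below is one frequently cited value, not a universal one, so treat the exact percentages as illustrative:

```python
# Approximate how fast visual acuity drops away from the gaze point,
# which is what lets a foveated renderer cut shading density so hard.

def relative_acuity(ecc_deg, e2=2.5):
    """Acuity relative to the fovea at `ecc_deg` degrees eccentricity,
    using the e2 (half-acuity eccentricity) model: acuity halves at e2."""
    return e2 / (e2 + ecc_deg)

for ecc in (0, 5, 20, 45):
    print(f"{ecc:2d} deg from gaze: shade at ~{relative_acuity(ecc):.0%} density")
```

Even a handful of degrees off the gaze point, the eye resolves only a third of foveal detail, which is why the "high detail" zone can stay so small.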
Another thing worth talking about RE: foveated rendering is that it'd be an enormous breakthrough for those with limited mobility. I think about people who are paralyzed from the neck down, and a technology like this could change their lives. This type of tech could allow for the development of entirely gaze-controlled UIs.
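A gaze-controlled UI of that kind is often built on dwell time: an element activates when the gaze rests on it long enough. A minimal toy sketch, where every class name and threshold is a made-up assumption rather than any real SDK's API:

```python
# Toy dwell-based gaze button: fires once the gaze stays inside its
# rectangle for `dwell_s` seconds, then resets.

class GazeButton:
    def __init__(self, x, y, w, h, dwell_s=0.8):
        self.rect = (x, y, w, h)
        self.dwell_s = dwell_s  # how long the gaze must rest to activate
        self.held = 0.0         # accumulated time spent inside the rect

    def update(self, gaze_x, gaze_y, dt):
        """Feed one gaze sample; returns True on activation."""
        x, y, w, h = self.rect
        inside = x <= gaze_x <= x + w and y <= gaze_y <= y + h
        self.held = self.held + dt if inside else 0.0
        if self.held >= self.dwell_s:
            self.held = 0.0
            return True
        return False

btn = GazeButton(100, 100, 200, 80)
# Simulate one second of a 90 Hz gaze stream resting on the button:
fired = any(btn.update(150, 120, 1 / 90) for _ in range(90))
print("activated:", fired)
```

Real systems add confirmation animations and glance filtering so users don't "Midas touch" everything they look at, but the core loop is this simple.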
Question from an ignoramus: Would the response time of foveated rendering come into play? Time between moving your eye to a new position vs the apparatus + 3D engine responds to your movement. Or is that a non issue?
That's technically the result of eye tracking rather than foveated rendering, but it's a good point. There are loads of uses for eye-tracking, one important one being socialization.
My understanding was that foveated rendering wasn't worth it until you have 4k per eye displays, any idea if that is the case for this or are they just getting ahead of the technology?
The eye tracking should ideally be faster than a single frame so that, with a little help from prediction based on eye velocity, artifacts will be as noticeable to the eye as frames rolling over are on your current monitor (aka not visible at all).
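The velocity-based prediction mentioned above can be sketched as a simple linear extrapolation from the last two gaze samples. This is a toy version with made-up numbers; real trackers use fancier saccade models:

```python
# Extrapolate the gaze point one end-to-end latency ahead, so the
# high-detail region lands where the eye will be, not where it was.

def predict_gaze(prev, curr, dt, latency):
    """Linearly extrapolate gaze (x, y in degrees) `latency` seconds
    ahead from two samples taken `dt` seconds apart."""
    vx = (curr[0] - prev[0]) / dt
    vy = (curr[1] - prev[1]) / dt
    return (curr[0] + vx * latency, curr[1] + vy * latency)

# Eye mid-saccade moving ~300 deg/s horizontally, sampled at 120 Hz,
# compensating for an assumed 8 ms of tracker-to-photon latency:
print(predict_gaze((0.0, 0.0), (2.5, 0.0), dt=1 / 120, latency=0.008))
```

At 300 deg/s, 8 ms of uncorrected latency already means the gaze point is a couple of degrees stale, which is why the foveal region is padded and prediction matters.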
It's super exciting because it's evidence that VR tech is driving graphics technology at a skyrocketing pace. As in, way, way faster than most people -- even Michael Abrash -- imagined. This is a tech that went from flat out not existing, to being tens of thousands of dollars, to now being consumer tech in like the span of 5 years, when optimistic expectations were 10-15 years.
Even if this winds up being like $3k, which I actually doubt it will be, it would still be an insane price drop for the tech, something that happened way, way quicker than other visual technologies.
That's cool as hell! Hope Sony is working on that for the PS5's VR. Some new controllers too...
You are aware of how much it costs? Because adding that to PSVR2 would make it far more expensive than the PS5.
Completely depends on how long they take. Oculus is aiming to get perfect eye tracking along with many other major features into a consumer headset by 2022. There is no way they would ever release a headset higher than $600, so if Oculus can push all of that for sub-$600 by 2022, Sony can likely push something for $400 if they don't have every advancement under the sun like Oculus.
No, you don't understand; this tech is far from being cheap enough even for regular PCVR. If you want your PSVR2 to be $1000, you are at the wrong end of VR gaming. If you want to spend that much, then give up on consoles; such tech is two gens out.

I don't expect them to launch PSVR2 alongside the PS5, so I expect the cost to come down in the interim. But maybe I'll be wrong.
Sony is NOT interested in selling PSVR2 for $400. We saw how small a market that is; above all else, Sony wants to get the price down as a priority. They are about selling to the mass market; if you want cutting edge, then go PC. Sony would rather get the headset down to $200, if not $150.
Foveated rendering, taken to its extreme, could offer the best discernible resolution for a display, ever. The big problem with increasing display clarity has been pushing the raw number of pixels needed to fill said screen; even today's super GPUs struggle. Foveated rendering solves this. This is a type of tech that could, in theory, make VR and AR headsets infinitely clearer than any television.
To give a sense of how big a breakthrough this is, Abrash thought the first viable foveated rendering headsets would come in 2021.
The long and short of it all: multiple generational leaps in visual quality, instantly. Forget things like 4k or 8k; with foveated rendering you can have resolutions that approach real life. This, basically, solves the age-old graphics performance problem.
Frankly the only thing surprising me is that it took this long. Although I guess display resolution needed to get to a point where it made sense / where GPUs could not keep up.
For the people in the know, what are the chances of PSVR2 using foveated rendering?
Edit: I guess slim. What exactly is costly about it? The sensors / cameras tracking the eyes?
Edit 2: I guess I should read the thread, hahah. Didn't know these sensors were so expensive.