> By the way, OP's post sounds a bit like Marketing 101 with all these buzzwords and bolded sentences. Probably a coincidence.

You can't just make your points and move on, you have to throw in a little dig and insinuate that the OP is marketing without a wink of evidence. Gross.
> By the way, OP's post sounds a bit like Marketing 101 with all these buzzwords and bolded sentences. Probably a coincidence.

I set it up that way and bolded the most important parts because it's a massive OP and I knew most people wouldn't read the whole thing, haha. I don't work for Google. In fact, I was a frequent naysayer of this technology before getting the opportunity to try it myself.
> How much data do you use in roughly an hour?

Check the OP for answers on both of those.
Speeds are not the issue in the streaming world. It's always been latency and data caps.
Are you having any issues with interruptions, stuttering, or the game shutting down due to a poor connection? I'm experiencing all of those things. Google states that my setup is good. My connection is above 40 Mbps download. Not sure what the issue is here. Seems nice if it would work as intended.
I'm having a lot of fun with Project Stream. Playing on my small Chromebook is a bit of a revelation... The colors pop so much better, and it allows me to take the game anywhere in the house (where the WiFi is strong enough). Considering this tech allows you to jump in at the click of a tab, I see a lot of potential for Google to run limited ads, where you could play the intro of an upcoming game for free, or for a rental service where you could pay a fee to have the game available to you for a week or two at a time. It works great for a beta and I'm curious to see where the tech will go from here.
> Are you having any issues with interruptions, stuttering, or the game shutting down due to a poor connection? I'm experiencing all of those things. Google states that my setup is good. My connection is above 40 Mbps download. Not sure what the issue is here. Seems nice if it would work as intended.

Worst I've experienced is a little stutter on WiFi and the occasional dip in resolution, about 13 hours in. Might be worth reaching out to the Project Stream runners via support if your experience is that bad.
Here's how I think Google is reducing latency using machine learning and AI.
On their servers, they have an ACO database of user inputs and environment states. They probably gathered that information from Ubisoft game testing sessions.
Now for each input they have a set of expected output frames. So whenever the player sends input data to the server, it checks the input against the ACO database. Instead of returning one frame, the server returns multiple frames to the client, which then caches that information on the local computer.
Now if the user's next move matches something in the local cache, the next frame is returned from the cache instead of going out to the server.
So I think this is similar to video buffering, but more complicated because you can't buffer linearly.
I have zero AI and gaming development experience, so maybe I'm completely wrong about it. Somebody who's in the industry can chime in.
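Roughly sketched, the idea I'm describing would look something like this (purely illustrative Python; every name here is made up, since Google hasn't published how Project Stream actually works):

```python
# Hypothetical sketch of the client-side cache idea described above.
# All names are invented for illustration; this is NOT Google's code.

class SpeculativeFrameCache:
    """Client-side cache of frames the server speculatively rendered."""

    def __init__(self):
        # Maps a possible input (e.g. "jump") to its pre-rendered frame.
        self.predicted_frames = {}

    def next_frame(self, player_input, server):
        if player_input in self.predicted_frames:
            # Hit: the server guessed this input, so the frame is already
            # local and there is no network round trip to wait for.
            frame = self.predicted_frames.pop(player_input)
        else:
            # Miss: fall back to a normal round trip to the server.
            frame = server.render(player_input)
        # Refill the cache with frames for whatever inputs the server
        # now considers most likely.
        self.predicted_frames = server.speculate(player_input)
        return frame
```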
That would be a database with more information than there are atoms in the universe. They're not going to construct an entire frame from known patterns this millennium.
> I think the future would be like this: You buy an Xbox Streaming device, you're good forever. Future, backward, sideways, all are compatible. PC games, console games, none matter. You bought an Xbox Streaming device in 2020? It should run Xbox 5 games in 2040 if the service is still up.

Ideally, yeah, that's how it should work.
> Assuming someone has posted this: the anti-net-neutrality laws are going to lead to higher costs for this.

That's more on the telecoms than on the stream providers, but yeah, that'll be something that needs to get figured out since Ajit Pai decided to sell out the country for the cost of his big mug.
Who needs 4k when you can have full "console exclusive" RTX effects essentially making your 1080p game look substantially prettier than the hardware box?
I think you misunderstood how it works. When you're at any given frame and the game doesn't know what you're going to do next, it calculates a series of options: one where you press X, one where you do nothing, one where you jump, etc., sends all the frames back, and your local machine chooses the correct one. It does not store them; it does not have a library of every frame that could be shown or everything that can possibly happen.
The neural network comes in to make this more accurate. It only holds patterns of inputs and does not 'see', so if someone dodged, attacked, and this event is happening, they're probably not going to call their horse. The frame generation can see what you just did, try to guess what you're going to do next, and intercept you actually doing that with, hopefully, the correct input. If it was wrong, your input lag jumps by 20 ms for one frame.
The purpose of this beta is for them to find out how accurate that can truly get with massive numbers playing it, build the database and see how few frames it can generate as options while being right 99% of the time.
They had to use a modern game that was a brand new/hyped release because they'd never get the numbers to test it otherwise; it'd take years to collect the data we're giving them each month.
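If that's accurate, the server-side loop would be something like the sketch below (my guess at the shape of it, not Google's actual design; `predict_top_inputs` just stands in for the neural network):

```python
# Hypothetical server-side speculation tick, as described above.

def speculative_tick(game_state, model, client, k=3):
    # The trained model guesses the few inputs the player is most
    # likely to press next, given recent context.
    likely_inputs = model.predict_top_inputs(game_state, k)

    # Render one candidate frame per guess (plus "no input"), without
    # committing any of them to the authoritative game state.
    candidates = {}
    for player_input in likely_inputs + [None]:
        branch = game_state.copy()   # this copy is the expensive part
        branch.apply(player_input)
        candidates[player_input] = branch.render()

    # Send all candidates; the client shows whichever one matches the
    # input actually pressed, and only a miss costs a full round trip.
    client.send(candidates)
```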
How can they be generating multiple frames? The server would have to load a complete game state (the current 'frame') multiple times per frame (Ubisoft didn't implement an undo system, I assume) and calculate multiple results each time.
That'd be like running the game at 500 fps AND basically loading the entire game to L1 cache (talking about today's memory speed) for each user. Are you telling me Google's computers are from the future?
Or is the prediction all faked and only visual, and the actual gameplay will have the latency no matter what? Even that case seems doubtful to me, as you would need a massive database of imagery and the artifacting would be terrible.
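To put rough numbers on the overhead (the round-trip time and branching factor are assumptions, just for scale):

```python
# Back-of-envelope cost of naive multi-frame speculation.
rtt_ms = 100           # assumed round trip to the datacenter
frame_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps

# Frames you must speculate ahead to hide the whole round trip:
depth = round(rtt_ms / frame_ms)   # 6

# With just 3 candidate inputs per frame, branches multiply:
branches = 3 ** depth              # 729 concurrent game states

print(depth, branches)  # 6 729
```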
Check this video to see what Google might be doing.
Outatime: Using Speculation to Enable Low-Latency Continuous Interaction for Cloud Gaming
https://www.microsoft.com/en-us/res...ency-continuous-interaction-for-cloud-gaming/
The white paper says they extensively modified Doom 3 to support many features (they even took out the RNG). They not only added parallel threads and probably some command structure, but also changed the gameplay to support the system.
Not only is time warp viable in Doom 3 (due to it being old), it's open source.
So it's not possible to do this with AC. Is there anything to suggest Google is employing these same techniques?
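For reference, the paper's core loop boils down to something like this (heavily condensed pseudocode of Outatime-style speculation; the real system also does image-based misprediction compensation, and these method names assume deep engine support):

```python
# Condensed sketch of checkpoint/rollback speculation as in Outatime.
# Method names are illustrative, not from the paper's actual code.

def outatime_step(game, predictor, actual_inputs):
    checkpoint = game.save_state()   # engine must support snapshots

    # Simulate one round trip ahead using predicted inputs.
    predicted = predictor.predict_sequence(len(actual_inputs))
    frames = []
    for player_input in predicted:
        game.simulate(player_input)
        frames.append(game.render())

    if predicted == actual_inputs:
        return frames                # prediction hid the full round trip

    # Misprediction: roll back and replay with the real inputs.
    game.restore_state(checkpoint)   # the costly step for a big game
    frames = []
    for player_input in actual_inputs:
        game.simulate(player_input)
        frames.append(game.render())
    return frames
```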
I think so, I got mine yesterday... I tried it for half an hour. Not much latency, but it's still not as clear as running a game natively. I can't complain, though, since I'll get to play through Odyssey for free :)
Just concentrate on the fact that there is an already-tested solution showing that frame speculation is possible. I didn't say Google was using this exact technique, but they might be using something similar; if not, more kudos to them, since it seems they are handling latency very well. I can bet you my life that as streaming services become more popular you will see innovative solutions like Outatime's speculation. You can also check the link below for another example of what might be used to reduce latency.
LucasArts' 60FPS Force Unleashed II tech demo
https://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article
> That'd be like running the game at 500 fps AND basically loading the entire game to L1 cache (talking about today's memory speed) for each user.
Lol. Well other users were claiming Google was using this technique and I'm simply stating why that's extremely unlikely. Why should I concentrate on distractions that don't actually solve the problem at hand -- running modern games? I doubt they're doing anything at all similar.
That DF article isn't relevant for cloud-gaming...
> If it's a game like Assassin's Creed Odyssey you're out of luck and will have to just watch on YouTube because it doesn't have multiple save slots, etc.

ACO has multiple save slots...
- Economics of the service.
For many consumers, the promise of maxed-out games on their cheap Chromebooks is indeed tantalizing. But how does that make sense for the companies offering the service? For that to be possible, they have to lease you the equivalent of a high-end gaming PC on demand for the price of a game, not to mention their bandwidth costs, which are bound to be much higher than something like Netflix (see the rough arithmetic after this post). And the game often is not even theirs!
Just compare that to the current model, where the consumer pays for the game, the hardware, and the electricity to keep it running, with no bandwidth costs... where's the upside for the companies? The answer is, of course, GaaS. There's no other way around it: if/when cloud gaming takes off, consumers will end up paying even more for games. Cloud gaming is the ultimate DRM scheme, after all.
- The quality of the experience.
Sure, you said the experience was good for you. But the reality of these services outside controlled trials is that many users experience subpar performance, and I doubt that is only because Sony is incompetent or something. To go back to my original point, do people really expect the service to lease them, on demand, the equivalent of a 2080 Ti for a few bucks a month? GeForce Now and Project Stream are controlled trials.
And even then, from what you said, Project Stream is giving you less than PS4 Pro quality.
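On the bandwidth point, a rough comparison (both bitrates are assumptions for illustration, not published figures):

```python
# Rough data/bandwidth comparison with assumed bitrates.

def gb_per_hour(mbps):
    # megabits/s * seconds -> megabits; /8 -> megabytes; /1000 -> gigabytes
    return mbps * 3600 / 8 / 1000

print(gb_per_hour(20))  # ~9.0 GB/hour for an assumed 20 Mbps game stream
print(gb_per_hour(5))   # ~2.25 GB/hour for typical HD Netflix
```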
To be fair, that's the hardware developer's problem; it's the publisher's dream not to be beholden to a hardware manufacturer.
Beyond that, though, you're right: it's the ultimate form of DRM.
I mean, if these things go to market and they're poor, the market will react in kind. Time has not been kind to PS Now.
Why do you think it would be running at 500 fps?
You'd be right if there were no AI trying to predict inputs based on context, but there is; that's what the neural network is for. The overhead of three variations would be nowhere near the same as running three simultaneous copies of the game, either.
You should do some more reading on the subject. This is absolutely a thing, and this is Google using us to train their neural network to see how possible and/or good it really is. Arguing with me that it's not possible is pointless, because I didn't personally work on it; I just read that it's a thing they are doing. These aren't opinions.
Good to see that you use "unlikely" instead of "impossible". If you at least watch the video and read the paper, you can see how that would actually reduce latency. The DF article is relevant for cloud gaming because the higher the framerate the game runs at in the cloud, the less time it takes to generate the frame that will be sent to your house. The DF article shows that there are ways to increase the framerate without necessarily using brute-force hardware solutions. You will see a lot of solutions like this when developers start to create games designed to run in the cloud.
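The framerate point is plain arithmetic: the faster the cloud side runs, the smaller the frame-generation slice of the total latency budget becomes.

```python
# Server-side frame-generation time vs. framerate.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms to produce each frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```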
No; I already know a lot about the subject. That's why I'm asking you to provide a source for your claims. The overhead is way MORE than running 3 simultaneous copies of the game, unless you can completely alter the game itself.
I read the paper and understand it. If you had done the same, you wouldn't be replying with the stuff you have.
There are thousands and thousands of ways to increase framerates in games. Reducing latency in the cloud would ideally employ techniques beyond local latency-reduction tricks, like the predictive technique you linked, except ones applicable to games you cannot modify and to games that require so much hardware that a cloud solution isn't economically feasible.
If someone claims that Google is doing some amazing something-or-other that, for all intents and purposes, seems impossible to me based on my knowledge of the subject matter, I want at least a shred of evidence lending toward that. I appreciate the link on general techniques, but I want to know if there even is a claim of some secret sauce by Google.
Dude, WTF are you talking about? The reason I know you didn't read the paper is that you replied "So it's not possible to do this with AC." even though the white paper says the following:
"Our experience with Fable 3 was similar and suggests that the essential developer modifications needed to support efficient speculation are similar across commercial titles. We also examined UDK [12], one of several widely used commercial game engines upon which many games are built, and verified that the modifications described below are general and feasible in UDK as well.4 Therefore, we suggest that the techniques proposed below are broadly applicable and can be systematized."
There is nothing here stating that it would not be possible to do the same with AC. On the contrary, they clearly state that the techniques they used are broadly applicable and can be systematized. So try again; the princess is in another castle, bro.
And now show me a machine that can run AC at hundreds of FPS and is able to refresh ~8 GB of RAM in 5 ms (to roll back game state), let alone one that makes this project economically feasible.
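For scale, here's what that restore claim implies (plain arithmetic; the comparison figures are ballpark):

```python
# Memory bandwidth implied by "refresh ~8 GB of RAM in 5 ms".
state_gb = 8
restore_s = 0.005

print(state_gb / restore_s)  # 1600 GB/s, i.e. ~1.6 TB/s

# Ballpark comparisons: dual-channel DDR4 ~ 40 GB/s;
# a 2018 high-end GPU's VRAM ~ 400-600 GB/s.
```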