Swift_Gamer

Banned
Dec 14, 2018
3,701
Rio de Janeiro
They both look pretty good. Whatever variant of the neural implementation this person is using seems to be extremely good; these are probably the best examples in this thread.

The FF8 and FF9 ones further down aren't perfect (weird jumping in a few spots), but they help to bring the motion capture work butchered by the PS1 disc size to life.



The Tekken 5 Sparking one is 99% perfect. Helps that the majority of the source is filled with slow-mo sequences, so there's lots of data to pull from.

My God, no, the Final Fantasy renditions don't look good at all.
 

LumberPanda

Banned
Feb 3, 2019
6,797
It looks a bit weird because you're changing animations designed specifically to look best at the original framerate. If this technique catches on, I think it'll look a lot better because artists will instead be designing their "keyframes"/poses/etc to take advantage of this tech.
 
Oct 25, 2017
2,974
My God, no, the Final Fantasy renditions don't look good at all.
The FF8 examples aren't consistent. The legs keep going wonky most of the time and it periodically jumps around.

The FF9 example is as close to a native render as we'll ever get. It looks like a recording of a stage play or a live concert! Just look at the way the camera smoothly tracks Garnet as she runs, seamlessly passes through the crowd, the way her dress just flows like a curtain. It's excellent. Again, the motion capture work really stands out here thanks to the software filling in the gaps.

It might just be a reaction to seeing "60fps" in a place you don't expect. Heck, I saw comments in the Death Stranding PC trailer thread saying that it looked funny, when we've already seen a modern Kojima 60fps take in MGSV.

Yeah, most of the time these things don't work; see the recent threads with the old-timey footage. But it's astounding to me how much these techniques have suddenly become viable, the caveat being that the results need to be carefully curated. We've come a long way from the weird motion-smoothing settings built into mid-2000s LCD TVs.
 
Last edited:

Acetown

Member
Oct 28, 2017
1,298
I think this stuff looks pretty good. I'd prefer to play the games in their original form, but it's certainly more successful as an experiment than, say, the attempts we've seen at using neural networks to upscale pre-rendered backgrounds or textures from old games.
 

sn00zer

Member
Feb 28, 2018
6,277
I wonder if you could save disc space on prerendered cutscenes by storing them at 15 fps and interpolating on the fly to 60
 

mute

▲ Legend ▲
Member
Oct 25, 2017
25,760
The impression I get when I see cool tech demos like this is that they will never see the light of day in a real commercial product, because the people making those products may want to use the tech but won't be sure whether, whom, or how much they'd need to pay to license it, so they won't. Same thing with the AI backgrounds and the recent PS1 FF rereleases.
 

Nitpicker_Red

Member
Nov 3, 2017
1,282
I wonder if you could save disc space on prerendered cutscenes by storing them at 15 fps and interpolating on the fly to 60
On the fly? No. Even with a beefy computer, it takes several seconds to render a single interpolated frame.
With my GT 1050 it takes 15 minutes for 1 second of 320p footage.
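(For contrast, a dumb non-neural baseline like a plain linear cross-fade could run in real time; it just looks nothing like what DAIN produces. A minimal sketch, where the flat-list frame format and the fixed 4x factor are my own assumptions:)

```python
def lerp_frames(a, b, t):
    """Blend two frames (flat lists of pixel values) at position t in [0, 1]."""
    return [pa * (1 - t) + pb * t for pa, pb in zip(a, b)]

def interpolate_4x(frames):
    """15 fps -> 60 fps: insert 3 blended frames between each original pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.extend(lerp_frames(a, b, t) for t in (0.25, 0.5, 0.75))
    out.append(frames[-1])  # keep the final original frame
    return out
```

A cross-fade like this just ghosts moving objects instead of actually moving them, which is exactly why the neural approach is worth the render time.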

In other news, I'm currently interpolating all 251 Pokémon sprite animations from Crystal.
I'll post when I'm done!
4NC0exc.gif
z2T80k2.gif

9qPrzdi.gif
nq3ApXg.gif
 

Ragnorok64

Banned
Nov 6, 2017
2,955
On the fly? No. Even with a beefy computer, it takes several seconds to render a single interpolated frame.
With my GT 1050 it takes 15 minutes for 1 second of 320p footage.

In other news, I'm currently interpolating all 251 Pokémon sprite animations from Crystal.
I'll post when I'm done!
4NC0exc.gif
z2T80k2.gif

9qPrzdi.gif
nq3ApXg.gif
Does this process rely on your GPU or your CPU more?
 

Nitpicker_Red

Member
Nov 3, 2017
1,282
Does this process rely on your GPU or your CPU more?
It only uses the CPU to split the images into PNG frames.
It relies heavily on the GPU. It uses features specific to recent Nvidia cards (CUDA 5+).

I interpolated all 251 Pokémon here:
imgur.com: DAIN-App interpolated Pokémon Crystal sprites
The timing is still a bit buggy, but that's down to the parameters I chose.

Raikou: 20 frames → 46 frames
1cE6m5N.gif

Dugtrio: 26 frames → 81 frames
VPlYw8H.gif

Some came out great, others not so much, either due to execution (timing cut off, transparency issues) or source (animation too fast, too big movements).

13 frames → upscaled and interpolated to 64 frames → re-pixelated
4wz4WAk.gif
DEj1gdF.gif
ibhhlxG.gif
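For anyone curious, the "re-pixelated" step at the end of that pipeline is essentially nearest-neighbour resampling back onto the sprite's coarse pixel grid. A rough sketch; the 2-D list frame format and the block-size parameter are my own guesses, and DAIN-App may do this differently:

```python
def repixelate(frame, block):
    """Snap an upscaled frame (2-D list of pixel values) back onto a coarse
    pixel grid: sample one pixel per block and repeat it across the block
    (nearest-neighbour), restoring the chunky pixel-art look."""
    h, w = len(frame), len(frame[0])
    return [[frame[(y // block) * block][(x // block) * block]
             for x in range(w)]
            for y in range(h)]
```

Sampling instead of averaging keeps the palette limited to colours already in the frame, which matters for sprite work.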
 
Last edited:

Nitpicker_Red

Member
Nov 3, 2017
1,282
So, since DAIN works very well on stop-motion
(Left is original at ~18fps, Right is interpolated at ~60fps, enable 720p60fps for best result)



(Source: The Trap Door Episode 1)

I am wondering if it could be used for stop-motion games, like Wurroom, which uses videos of stop-motion clay as sprites.


Would it bring anything or would it ruin the "stop-motion" feel?

With "The Trap Door", for example, it's difficult to tell that it's been interpolated without comparing it with the original, which shows it's possible to make it smoother without ruining the feel. (Maybe interpolate to 30fps without going all the way to 60?)
 
Last edited: