• Ever wanted an RSS feed of all your favorite gaming news sites? Go check out our new Gaming Headlines feed! Read more about it here.
Posted on behalf of brainchild courtesy of the Adopt-A-User program, Gaming Edition.

NOTE: Do you also have a thread you feel is worth posting, but lack the posting capabilities to do so? Give the volunteers at Adopt-A-User a call, and we will assess whether or not your thread is deemed worthy of ERA ;)

NOTE 2: The subject of this topic is highly technical. A certain level of understanding of video game technology is kinda required to make sense of what is being described. Please keep the discussion civil and to the point. I am just providing a service here and I'd like to keep a clean record on these. Thank you in advance and happy posting!

Now, on to the thread!

I've actually wanted to do a proper tech analysis of BOTW's engine for quite some time, but never really got around to it. However, with the new video capture feature on the Switch, I thought it would be the perfect opportunity to revisit the title and share my findings through videos that I've uploaded to Twitter.

I'll start off with a summary of my findings, but I'll also do a breakdown of each technical feature later in this post in order to keep things accessible. Whenever possible, I'm going to try to avoid redundancies. For example, if someone else like Digital Foundry already covered a feature of the engine, I'm not going to bother covering it here. The purpose of this post (as with my SMO post), is to bring more exposure to the technical accomplishments in games where no one else bothered to even investigate.

Anyway, here's a summary of the engine's features:
  • Global Illumination (more specifically, Radiosity)
  • Local Reflections (calculated along Fresnel)
  • Physically-based Rendering
  • Emissive materials/area lights
  • Screen Space Ambient Occlusion
  • Dynamic Wind Simulation system
  • Real-time cloud formation (influenced by wind)
  • Rayleigh scattering/Mie Scattering
  • Full Volumetric Lighting
  • Bokeh DOF and approx. of Circle of Confusion
  • Sky Occlusion and Dynamic Shadow Volumes
  • Aperture Based Lens Flares
  • Sub-surface Scattering
  • Dynamically Localized Lightning Illumination
  • Per-Pixel Sky Irradiance
  • Fog inscatter
  • Particle Lights
  • Puddle formation and evaporation
Global Illumination/Radiosity

First off, I just want to say that all real-time global illumination solutions are faked in one way or another, with varying degrees of accuracy. So anyone trying to dismiss the global illumination solution in BOTW simply because it doesn't use path tracing or something similar should really think about what they're saying. The important part to take away from this is that it is being rendered in real time; it's not just lighting that has been baked into the textures, which is pretty impressive for an open world game (especially on Wii U).

Now, what exactly is Radiosity? Well, in 3D graphics rendering, it is a global illumination approximation of light bouncing from different surfaces, transferring color information from one surface to another along the way. The more accurate the Radiosity, the more light bounces need to be calculated in order to transfer the proper amount of color.

In Breath of the Wild, the engine uses light probes throughout the environment to collect color information about the different surfaces located near each probe. There is no simulation of light bounces, just approximations of what general colors should be coming from a given area. The exact algorithm BOTW uses to calculate this information is unclear, but my best guess is spherical harmonics or something similar, based on the color averages and localization of the Radiosity. Unlike Super Mario Odyssey, Radiosity in Breath of the Wild is pretty granular instead of binary. The lighting information calculated from the light probes appears to be streamed and tied to the LOD system at the pipeline level, which makes it pretty efficient.

(Update) Initially, I assumed that the spherical harmonics probes might have been placed throughout the environment to gather color samples, since the lighting appeared to update to a general color as Link moved through the environment. However, after further investigation, I now know that those general color bounces were due to the lack of color variety in the environment. When I tested the global illumination in an area with lots of differently colored surfaces next to each other, it became clear how the GI system works.

Notice how, as Link approaches the red wall, its color is transferred to all surfaces that face in the opposite direction. The same is true for the green wall that sits directly opposite the red wall (though it isn't as intense, because the probe is closer to the red wall, and the red wall's color is reflecting more intensely). In fact, at any given point, this is happening in all directions: the ground transfers its color upwards, and any ceiling or colored surface directly above Link's head transfers its color downwards.

The probe continuously samples and transfers colors (which we can think of as light bounces) dynamically, as the probe will pick up more colors due to new transfers and will have to sample them as well. Eventually the end result stops changing in appearance, because the samples nearest the probe have the most dominant colors, regardless of the number of color transfers. This process is sequential but very localized and very fast. The probe has a limited range to sample from and applies the results to materials in world space. Thanks to this efficiency, the probe can approximate the appearance of many, many bounces of light, but it will only look accurate in the areas closest to the probe.
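To make the "nearest surfaces dominate" behavior concrete, here's a rough Python sketch of the idea as I've described it. This is NOT Nintendo's actual code; the function name and the inverse-square weighting are my own assumptions, chosen just to illustrate why a close red wall overpowers a distant green one.

```python
# Hypothetical sketch of the probe behavior described above, not BOTW's
# actual implementation. The probe gathers colors from nearby surfaces,
# weighting each sample by 1/distance^2 so the closest surfaces dominate
# the bounce color it transfers back onto materials.

def probe_bounce_color(samples):
    """samples: list of ((r, g, b), distance) pairs near the probe.
    Returns the distance-weighted average color the probe would transfer."""
    total_w = 0.0
    accum = [0.0, 0.0, 0.0]
    for color, dist in samples:
        w = 1.0 / max(dist * dist, 1e-6)  # closer surfaces dominate
        total_w += w
        for i in range(3):
            accum[i] += color[i] * w
    return tuple(c / total_w for c in accum)

# A red wall right next to the probe outweighs a green wall farther away:
red_dominant = probe_bounce_color([((1.0, 0.0, 0.0), 1.0),
                                   ((0.0, 1.0, 0.0), 4.0)])
```

With these toy numbers the red channel ends up far stronger than the green one, which matches what you see in the video: the closest wall's color wins regardless of how many "bounces" the probe accumulates.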

Real-time Local Reflections

(Update)

So ever since I started analyzing this game, the one area that always left me scratching my head was the local reflections. There were so many cases of what seemed to be inconsistencies that my theories were initially all over the place. I can now confidently say that I have fully solved the mystery behind how the local reflections work. Apparently, it's a THREE-pronged approach, depending on the situation.
  • Specular Lighting
Sunlight, skylight, lightning, point lights, and area lights fall under this category. Initially I thought that Shrines and Towers did as well (since they're emissive, I assumed they were area lights), but after seeing the very revealing artifacts they exhibit, that can be ruled out. Not all glowing materials illuminate the environment, and Shrines and Towers are among the ones that don't.
  • Aperture mapped reflections
If this term seems new to you, it's probably because it is. Based on the game's text dump, it's the BOTW devs' internal label for their take on UE4's Scene Capture 2D reflections. This is how the environment is reflected. The virtual camera above Link's head (specifically, the aperture) has a relatively small FOV, so when Link moves, reflections (displayed in real time) can shift out of their proper space until the aperture takes another capture of the environment. You can see these kinds of artifacts, and the FOV, in the videos I've included.
  • Screen Space Reflections
Only materials that look laminated use this model, and those materials are exclusive to the Shrines. A value in their gloss map tells the engine to use SSR specifically for these materials. They will reflect anything on screen, which can be viewed at grazing angles on any such material. However, these materials use the aperture map for environment reflections as well, which was one of my sources of confusion. The incongruous behavior of the reflections on these materials led me to wrong assumptions about the other materials outside of the Shrines. Thankfully, we have that sorted out now.


Physically-based Rendering

Before anyone asks, no, this does not mean 'physically correct looking materials'. It is simply a methodology applied to a 3D graphics rendering pipeline where all materials (textured surfaces) uniquely influence the way that light behaves when interacting with them. That is what happens in the real world, which is why it's called Physically based rendering (a concept based on real world light physics). Different materials cause light to behave differently, which is why we can visually differentiate between different surfaces in the first place.

Traditionally, rendering pipelines relied on an artist's understanding of how light interacted with different real-world materials, and the look of texture maps was defined based on that understanding. As a result, there was a lot of inconsistency between different textured surfaces and how they compared to their counterparts in the real world (which is understandable, as we can't expect artists to have encyclopedic knowledge of the properties of all matter in the real world). With PBR, the fundamentals of light physics are part of the pipeline itself, and all textured surfaces are classified as materials with unique properties that cause light to behave accordingly. This allows surfaces to be placed under different lighting conditions and dynamic camera angles, with the light interaction adjusted dynamically. Artists do not have to predefine this interaction like they did with the traditional workflow; it happens automatically. Because of the efficiency of PBR, developers feel more inclined to make games where all materials have unique properties that affect light differently.

In Breath of the Wild, PBR is used with a bit of artistic flair, so you might not notice that the engine even relies on such a pipeline, since the textures don't necessarily look realistic. However, the BRDFs (Bi-directional Reflectance Distribution Functions) used on the materials make it pretty clear that the engine uses PBR. You see, with every dynamic light source, its specular highlights (the parts of a surface where the light source itself shows as a reflection) and the reflectivity/reflectance of those highlights are dynamically generated depending on the angle of incidence (the angle of incoming light rays with respect to a surface normal) and index of refraction (how much a material 'bends' light as the rays touch its surface) of whatever material the lights are interacting with. If the game were using a traditional pipeline, the distribution of those specular highlights would not be much different between wood and metal. But in this game, the production of specular highlights is completely dependent on the material that the light is interacting with.
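We don't know which BRDF BOTW actually uses, but to illustrate how a single material parameter reshapes a specular highlight, here's the GGX normal distribution function, a standard term in many PBR pipelines. Treat it purely as an illustrative stand-in, not the game's real math.

```python
import math

# Illustrative only: BOTW's actual BRDF is unknown. GGX is a common
# microfacet normal distribution function (NDF) showing how one roughness
# value alone controls how tight or spread-out a specular highlight is.

def ggx_ndf(n_dot_h, roughness):
    """n_dot_h: cosine between the surface normal and the half vector.
    roughness: 0 = mirror-smooth, 1 = very rough.
    Returns the microfacet density for that orientation."""
    a2 = roughness ** 4  # common remap: alpha = roughness^2, then squared
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# At the highlight's center, a smooth "metal" concentrates vastly more
# energy than a rough "wood" surface does:
peak_metal = ggx_ndf(1.0, 0.1)
peak_wood = ggx_ndf(1.0, 0.8)
```

That difference in peak density is exactly why the green light source in the barrel video reads as a sharp moving glint on the metal hoops but as a broad, angle-independent glow on the wood.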

Another key element that shows that BOTW uses PBR is the Fresnel (pronounced fruh-NELL) reflections of all the materials. First of all, most games using a traditional pipeline don't even bother with Fresnel because at that point you might as well just use PBR. As I explained earlier when discussing local reflections, Fresnel reflections become visible at grazing angles (angles where incoming light is nearly parallel to the surface it's interacting with from the perspective of the observer/camera).

According to the Fresnel reflection coefficient, all materials approach 100% reflectivity at grazing angles, but the effectiveness of that reflectivity will depend on the roughness of the materials. As a result, programmers differentiate between 'reflectivity' and 'reflectance'. Some materials reflect light in all directions (diffuse materials). Even at 100% reflectivity, 100% of the light may be reflected from the total surface area, but it's not all reflected in the same direction, so the light is spread out uniformly and you don't see any specular reflections (mirror images of the surface's surroundings). Other materials reflect incident light only in the mirror direction of the incoming light (specular materials), so you will only see reflections at the appropriate angle, where close to 90% of the light is reflected. The Reflectance (the effectiveness of a material's ability to reflect incident light) of both diffuse and specular materials is not always 100%, even at grazing angles, which is why you don't see perfectly specular reflections at grazing angles on all materials, even in the real world. The clarity of Fresnel reflections will vary with the materials producing them.
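The "reflectivity rises toward 100% at grazing angles" behavior described above is commonly approximated in real-time renderers with Schlick's formula. Again, this is a standard industry approximation, not necessarily what BOTW's shaders do internally.

```python
# Schlick's approximation of the Fresnel reflection coefficient, the
# standard real-time stand-in for the full Fresnel equations.

def schlick_fresnel(cos_theta, f0):
    """cos_theta: cosine of the angle between the view ray and the surface
    normal (1.0 = looking straight at the surface, 0.0 = grazing).
    f0: base reflectivity when viewed head-on (~0.04 for most dielectrics).
    Returns reflectivity, which climbs toward 1.0 at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

head_on = schlick_fresnel(1.0, 0.04)  # just the base reflectivity, 0.04
grazing = schlick_fresnel(0.0, 0.04)  # full reflectivity, 1.0
```

This is why the water and ground in the videos only show clear environment reflections when the camera drops down near the surface: the coefficient is tiny at head-on angles and only spikes near grazing.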

Emissive materials and area lights

This one is pretty straightforward. The materials of glowing objects provide unique light sources that light the environment in the same shape as the materials themselves. These are not point light sources that radiate in all directions, or even simple directional light sources that light in one direction. They're basically 'custom shaped' light sources. It's important to mention that only the global (sun/moon/lightning) light sources cast shadows. However, BRDF still applies to all light sources in the game.

Screen Space Ambient Occlusion

In the real world, there is a certain amount of 'ambient light' that colors the environment after light has bounced around the environment so much that it has become completely diffused. If shadows are the result of objects occluding direct sunlight, then ambient occlusion can be thought of as the result of cracks and crevices in the environment occluding ambient light.

The method used in BOTW is called SSAO (screen space ambient occlusion) as it calculates the AO in screen space and is view dependent. The environment will only receive AO when it is perpendicular with respect to the camera.

Dynamic Wind Simulation System

So this one surprised me a bit because I was not expecting it to be so robust. Basically, the physics system is tied to a wind simulation system. It's completely dynamic and affects different objects according to their respective weight values. The most prominent objects affected are the blades of grass and the procedurally generated clouds.

Real-time cloud formation

This game does not use a traditional skybox in any sense of the word. Clouds are procedurally generated based on parameters set by the engine. They cast real-time shadows. They receive light information based on the sun's position in the sky. As far as I can tell, clouds are treated as an actual material in the game. They're not volumetric, so you won't be getting any crepuscular rays or anything like that, but they're not 'skybox' clouds either. Their formation is also influenced by the wind system.

Rayleigh Scattering/Mie Scattering

In the real world, when light reaches Earth's atmosphere, it is scattered by air molecules, which results in Earth's blue sky, since the shorter wavelengths of blue light are scattered more easily than other colors of light. However, as the sun approaches the horizon, its light has to pass through more of the atmosphere, so most of the blue light has been scattered away by the time the sunlight reaches the eye of the observer, leaving the longer wavelengths of orange and red light to reach the eye. BOTW approximates this phenomenon mathematically (I actually found this out through a text dump of the game's code earlier this year!). Apparently the algorithm accounts for Mie Scattering as well, which gives fog its appearance in the sky.
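The physics behind this is compact enough to show directly: Rayleigh scattering strength falls off with the fourth power of wavelength. The sketch below is just that textbook relationship, not the game's code from the text dump.

```python
# Why the sky is blue and sunsets are red: Rayleigh scattering strength
# is proportional to 1 / wavelength^4, so short (blue) wavelengths
# scatter far more strongly than long (red) ones.

def rayleigh_relative(wavelength_nm, reference_nm=550.0):
    """Scattering strength relative to mid-spectrum green light (~550 nm)."""
    return (reference_nm / wavelength_nm) ** 4

blue = rayleigh_relative(450.0)  # scattered strongly -> fills the sky
red = rayleigh_relative(650.0)   # survives long sunset paths -> red horizon
```

Blue light at ~450 nm scatters roughly four times as strongly as red at ~650 nm, which is exactly the asymmetry a sky-scattering shader has to reproduce to get both a blue noon sky and an orange sunset from the same formula.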

Honestly, had I not looked at the code from that text dump, I would have never assumed that this phenomenon was being simulated in the game. It's just so easy to fake. However, after looking at the reflections of the sky in the water, it all made sense. This scattered light is being reflected onto the entire environment in real time. A simple sky box would make that impossible.

Full Volumetric Lighting

Aside from clouds in the sky, every part of the environment and every object in it has the potential to create light shafts in real time, given the right lighting conditions. The game uses SSAO to aid the effect, but the volumetric lighting is actually not view dependent. You can find out more about how the Volumetric Lighting works in the shadow volumes section of this post.

Bokeh DOF and approx. of Circle of Confusion

Another surprising feature for an engine that I assume uses deferred lighting/shading. So I'm going to simplify things a bit because it can get really technical trying to explain why the Bokeh effect even happens in the first place in the real world. Suffice to say that as light enters the aperture (opening) of an eye/camera, the incoming rays of light begin to converge into a single point on a focal plane. As light becomes more focused on this plane, its appearance becomes sharper and smaller. As light becomes more defocused away from this plane, it becomes larger and blurrier.

The Bokeh effect as it is commonly known is when the points of light that enter the camera lens take on the shape of the aperture that they entered through (like a hexagonal shape, for example). The circle of confusion is the region of focus where a human cannot distinguish between a point of light that is perfectly in focus and one that is slightly out of focus. Depth of field is usually determined by the circle of confusion. What's interesting is that BOTW emulates both of these concepts when using the sheikah scope or camera rune. My guess is that it's all calculated in screen space based on the texel (texture element) data, and then applied as a post-process effect.
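For anyone who wants the underlying optics, here's the standard thin-lens circle-of-confusion formula that DOF systems are usually built around. Whether BOTW evaluates anything like this, or just fakes the look in screen space as I'm guessing, is unknown; this is the reference model, not the game's code.

```python
# Thin-lens circle of confusion: how large a point of light appears when
# it sits away from the focal plane. Larger diameter = blurrier bokeh.

def coc_diameter(aperture, focal_len, focus_dist, subject_dist):
    """All arguments in the same length units (e.g. meters).
    aperture: diameter of the lens opening.
    focal_len: focal length of the lens.
    focus_dist: distance to the plane of perfect focus.
    subject_dist: distance to the point being imaged.
    Points exactly on the focal plane map to a diameter of 0."""
    return (aperture * focal_len * abs(subject_dist - focus_dist)
            / (subject_dist * (focus_dist - focal_len)))

in_focus = coc_diameter(0.02, 0.05, 5.0, 5.0)  # on the focal plane: sharp
behind = coc_diameter(0.02, 0.05, 5.0, 10.0)   # behind it: blurred
```

The depth of field is then just the band of distances where this diameter stays below what the eye can resolve, which is the "region of confusion" concept the scope and camera rune are emulating.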

Sky Occlusion and Dynamic Shadow Volumes

Aside from the physics in the game, these shading features are without a doubt the most computationally taxing elements in BOTW. Here's how it all works:

Even though the clouds themselves don't have any volume, they still cast (soft) shadows onto the environment. However, the sun and the scattered light from the sky illuminate the environment dynamically, and the environment and all of the objects in it cast their own shadows according to that illumination. It wouldn't look very believable for the lighting in the environment to remain unchanged even when the sky is completely overcast with cloud cover. Nintendo has implemented Sky Occlusion to solve this problem.

Using a Mie scattering algorithm (Mie coefficients that simulate the effect of atmospheric fog), the engine calculates how much skylight to remove from the environment based on how much fog or cloud cover is in the atmosphere. The more skylight that gets occluded from the environment, the more overcast the environment will appear. Since there is less direct illumination in occluded areas, the ambient light (diffuse, non-directional light) will play a greater role in the illumination of those areas, and all of the shadows in those areas will become softer and start to match the colors of their immediate surroundings.
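The "remove skylight based on how much is in the way" step is typically a Beer-Lambert extinction term. The sketch below shows that standard relationship; the actual coefficients BOTW derives from its Mie model are unknown, so the numbers here are made up.

```python
import math

# Beer-Lambert style extinction: a standard way to attenuate skylight by
# cloud/fog density, in the spirit of the sky occlusion described above.
# The coefficients are illustrative, not values from the game.

def skylight_transmittance(extinction_coeff, path_length):
    """Fraction (0..1) of skylight surviving a path through cloud/fog.
    extinction_coeff: how strongly the medium scatters/absorbs per unit
    length; path_length: how much medium the light passes through."""
    return math.exp(-extinction_coeff * path_length)

clear_sky = skylight_transmittance(0.01, 1.0)  # nearly all skylight arrives
overcast = skylight_transmittance(2.0, 1.0)    # most skylight is occluded
```

As the transmittance drops, direct skylight contributes less and the flat ambient term takes over, which is why shadows soften and scenes go gray under heavy cloud cover.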

The engine also uses shadow volumes instead of simple shadow maps, and this is done for every shadow caster in the game. Shadow volumes are cast within a specified 3D space instead of just the surfaces and objects in an environment. Aside from the Sky Occlusion looking more believable when shadow volumes are implemented, dynamically generating shadow volumes within a 3D space also provides the benefit of full real time volumetric lighting when it's combined with atmospheric fog that can receive shadows, which is exactly what happens in BOTW.

Aperture-based Lens Flares

This feature will go unnoticed by probably 99% of the people who play this game, so I'm not sure that it was worth implementing, tbh.

Basically, when rays from a bright light source enter a camera lens at certain oblique angles, they can produce optical artifacts known as lens flares, due to the rays internally reflecting between the camera's lens elements. Most games just emulate this phenomenon by applying the flare as a post effect that appears when the light source is slightly off-center in the camera frustum; the concept of light internally reflecting within the camera itself is not even factored into the equation.

In Breath of the Wild, since the engine already emulates a camera aperture for DOF, it tracks the aperture's relative position to the sun and calculates how much lens flare should be produced, even if the sun isn't on screen. But that's not all! Cameras with lots of zooming elements are even more prone to Lens Flares and the flares will change shape and size depending on the shape/size of the aperture and level of zoom. Surprisingly, BOTW approximates these effects as well!
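A rough sketch of the difference from the usual screen-space hack: drive the flare from the angle between the camera's forward vector and the sun's direction, so it keeps working even when the sun is off-screen. Everything here (names, the linear falloff, the 90° cutoff) is my own guess at the shape of such a system, not BOTW's actual parameters.

```python
import math

# Hypothetical direction-based flare driver: intensity depends on the
# angle between the camera's view direction and the sun, rather than on
# the sun's on-screen position, so off-screen suns can still flare.

def flare_intensity(cam_forward, sun_dir, falloff_deg=90.0):
    """cam_forward, sun_dir: normalized 3D direction tuples.
    falloff_deg: made-up tuning angle where the flare fades to zero."""
    cos_a = sum(c * s for c, s in zip(cam_forward, sun_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return max(0.0, 1.0 - angle / falloff_deg)

facing_sun = flare_intensity((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))    # full flare
sun_to_side = flare_intensity((0.0, 0.0, 1.0), (1.0, 0.0, 0.0))   # faded out
```

A real implementation would additionally shape the flare by aperture geometry and zoom level, as described above, but the key point is that the input is a world-space angle, not a screen coordinate.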

Sub-surface Scattering

Some surfaces are translucent (not to be confused with transparent) in the real world, meaning that light can both pass through the surface and scatter inside of it. Some examples of real world translucent surfaces would be human skin, grapes, wax, and milk. Modeling this unique behavior of light in 3D graphics is called Sub-surface Scattering or SSS. As with most real time 3D rendering solutions, programmers have come up with several methods to approximate the effect without having to simulate light bounces at the molecular level. The method used in BOTW is relatively simplistic but effective.

Any surface that should have some level of translucency will have multiple layers of materials in order to produce SSS. The first layer is the internal material. This material is usually baked with lighting information that gives it a translucent look. Light travels through the material but does not actually light the material itself in real time. On top of this material is the surface material. This material is the more dominant of the two, and is what you will see in most lighting conditions.

The relationship between these materials works in such a way that the dominant appearance of either material is always determined by the ratio between incident light and transmitted light. If the surface material is reflecting more light than the internal material is transmitting, then the surface material will increase in opacity in proportion to the light it's receiving. If the internal material is transmitting more light than the surface material is reflecting, then the surface material will decrease in opacity in proportion to the light it's not receiving. Balancing the opacity of the surface material according to the incidence/transmittance ratio is a very smart and efficient way to give materials an SSS effect.

Dynamically Localized Lightning Illumination

Lots of games implement the illumination of an environment by lightning as a global light source, where it flashes over the entire environment and all shadow casters cast shadows in predetermined sizes and directions.

In BOTW, lightning strikes are basically big ass camera flashes, each with its own radius and intensity, and they can strike anywhere on the map, regardless of the player's location. What's interesting about BOTW's lightning system is that shadows dynamically correspond to the intensity and location of the shadow caster's nearest lightning strike. This system is probably the coolest lightning system I've ever seen in a game.
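As a toy model of "each strike is a local flash with its own radius and intensity," here's a sketch with an inverse-square falloff clipped to the strike's radius. All the names and the falloff curve are assumptions for illustration only.

```python
# Hypothetical per-strike lighting model: each lightning strike is a
# local flash with its own position, intensity, and radius of effect,
# rather than a single global flash over the whole map.

def lightning_illumination(strike_pos, strike_intensity, radius, point):
    """strike_pos, point: 3D tuples. Returns the light a surface point
    receives from one strike: inverse-square falloff, zero past radius."""
    dist = sum((p - s) ** 2 for p, s in zip(point, strike_pos)) ** 0.5
    if dist > radius:
        return 0.0  # outside this strike's influence entirely
    return strike_intensity / (1.0 + dist * dist)

near_strike = lightning_illumination((0.0, 0.0, 0.0), 10.0, 50.0, (1.0, 0.0, 0.0))
far_away = lightning_illumination((0.0, 0.0, 0.0), 10.0, 50.0, (100.0, 0.0, 0.0))
```

Per-caster shadow directions then fall out naturally: each object shadows away from whichever strike position contributes the most light to it, which is the behavior observed in-game.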

Per-Pixel Sky Irradiance

If Radiance can be thought of as the amount of radiation coming from the sun, Irradiance can be thought of as the amount of that radiation that a given surface actually receives. This is a pretty important variable for scattering skylight, because its absence is the main reason we can see into space at night! BOTW calculates Irradiance using an algorithm that tracks the sun's position relative to zenith, and during sunsets it starts to remove skylight, pixel by pixel, until there is no Irradiance left. Provided the sky is free of cloud cover and Mie scattering, stars will start to appear in the sky, even if the sky isn't dark yet. The color gradient transitions between night and day are really impressive.
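The simplest model of "irradiance tracked from the sun's position relative to zenith" is a cosine term, shown below. BOTW's actual curve is certainly fancier (it drives per-pixel color gradients), but the cosine captures the basic ramp-down to zero at the horizon.

```python
import math

# Minimal sketch of zenith-tracked sky irradiance: full strength with the
# sun overhead, fading to zero as the sun reaches and drops below the
# horizon. A stand-in for whatever curve BOTW actually evaluates.

def sky_irradiance(sun_zenith_deg):
    """sun_zenith_deg: angle between the sun and straight-up zenith.
    0 = noon overhead; 90+ = sun at or below the horizon."""
    return max(0.0, math.cos(math.radians(sun_zenith_deg)))

noon = sky_irradiance(0.0)     # full irradiance
night = sky_irradiance(95.0)   # below the horizon: none left
```

Once this value bottoms out there's no scattered skylight left to wash out the background, so stars can begin to show even while the gradient near the horizon is still colored.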

Fog Inscatter

In the real world, fog receives both light and shade, like a physical object. This is computationally expensive to do with computer graphics if the fog is Volumetric. BOTW gets around this by creating a fog noise pattern (similar to their ambient occlusion noise pattern, but not restricted to screen space) and applying radiance values from the sun and skylight to produce 'inscatter'. When you combine this with shadow volumes, not only do you get Volumetric Lighting, you also get fog that looks like it has volume even when it doesn't.
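The "apply radiance values to produce inscatter" step can be sketched with the usual single-scattering relationship: the fraction of light the fog removes along a view ray is the fraction available to be scattered toward the eye. The density/noise coupling and the exact constants in BOTW are unknown; this is just the general shape of the technique.

```python
import math

# Generic single-scatter fog inscatter sketch (not BOTW's actual code):
# whatever fraction of the path extinguishes light (1 - transmittance)
# is the fraction that re-emits sun/sky radiance toward the viewer.

def fog_inscatter(radiance, density, path_length):
    """radiance: sun/sky light available to scatter.
    density: fog density along the ray (in-game this would be driven by
    the noise pattern described above); path_length: ray length in fog."""
    transmittance = math.exp(-density * path_length)
    return radiance * (1.0 - transmittance)

thin_fog = fog_inscatter(1.0, 0.05, 10.0)   # a faint haze of skylight
thick_fog = fog_inscatter(1.0, 1.0, 10.0)   # nearly fully lit fog bank
```

Feed shadow volumes into the density term and shadowed stretches of the ray stop inscattering, which is exactly how this combination yields visible light shafts.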

Particle Lights

Almost every particle in the game is emissive (glowing). Many of them illuminate the environment as well. Instead of rendering particles as objects, many particles are simply point light sources that radiate in all directions in 3D space.

Puddle formation and evaporation

Probably the most bizarre but also the most clever rendering solution in the game. Underneath the entire terrain of the game world, there exists a plane of water materials that rises to fill water basins when it's raining and lowers as the water evaporates when the sun comes back out. There is a foam material layer that is used depending on the water surface's relative distance from the ground. The process is pretty straightforward while also serving as yet another impressive dynamic in the game.
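The hidden-plane idea reduces to a single height value per basin. The sketch below models it with made-up fill and evaporation rates, purely to show the rise-and-fall behavior; none of these numbers come from the game.

```python
# Hypothetical model of the hidden water plane described above: rain
# raises the plane toward a basin's fill depth, sunshine lowers it back
# down as the water evaporates. All rates are invented for illustration.

def water_plane_height(ground_height, fill_depth, rain_rate, evap_rate,
                       minutes_raining, minutes_dry):
    """Returns the current height of the water plane for one basin.
    fill_depth caps how deep the basin can get; the level can never
    drop below the ground itself."""
    level = min(fill_depth, rain_rate * minutes_raining)   # filling phase
    level = max(0.0, level - evap_rate * minutes_dry)      # drying phase
    return ground_height + level

mid_storm = water_plane_height(100.0, 0.5, 0.1, 0.05, 10, 0)   # basin full
after_sun = water_plane_height(100.0, 0.5, 0.1, 0.05, 10, 20)  # evaporated
```

The foam layer then only needs to check the plane's height against the ground height nearby, fading in where the water is shallow.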
 
Redneckerz
Since the post got beyond its limits, the links to the various examples are shown here by category.

Global Illumination/Radiosity

Observational Tip: Notice how the rock cliffs receive a green hue from the grass as the camera moves closer to that region.
https://twitter.com/brainchildlight/status/929056702825054208

(Update) So this is a pretty significant discovery

https://twitter.com/brainchildlight/status/938017478935961601

The global illumination actually does approximate multiple bounces. There's a light probe above Link's head that samples colors from most materials in the environment. Each sampled color is then transferred and reflected in the opposite direction. What's interesting is that intensity is factored into the equation, both by how close the probe is to a surface and by how strongly that surface is reflecting light.

In open fields it may not look like much, but when there's multiple adjacent surfaces, the GI looks pretty damn good.

Real-time Local Reflections

Observational Tip: Look at Link's reflection compared to the Blue Lantern's reflection. Link must be on screen in order for his reflection to appear, whereas the blue lantern does not need to be on screen in order for its reflection to appear.
https://twitter.com/brainchildlight/status/932820467718676480

(Update) Local reflections solved!

https://twitter.com/brainchildlight/status/937216145018429443

So Shrine materials have an extra layer of gloss reflections, but they use the same reflection model for outside reflections as well. No wonder this was so confusing!

With glossy materials, reflections of everything are captured in screen space (SSR). With non-glossy materials, which is pretty much all outside materials, reflections of the environment are captured with a technique that's pretty much identical to the Scene Capture 2D technique used in Unreal Engine 4. I don't know what Nintendo calls it internally, but let's just call them scene captured 2D reflections.

Basically, a virtual camera (with its own frustum and FOV) sits just above Link's head and always faces towards the horizon of the main camera, regardless of Link's orientation (this allows for limited off-screen reflections). The captured images are then fed into the materials producing the reflections, like a live broadcast signal to a TV. What this means is that the feed of the image is projected in real time at whatever frame rate the game runs at (30fps). This allows different elements of the materials to be updated without waiting on a new capture.

However, the actual capture itself is updated at a much lower rate (~4-5fps). You can see this in action whenever the scene capture camera moves from its absolute position. Before the capture is updated, the current capture moves within the material (let's say, water) to wherever the camera moves, in real time (30fps). Then, once the material receives an updated capture, the reflection is corrected. The delay in this correction, as the reflections trail along the material, is where we get a true sense of the capture updates (~4-5fps).
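The two-rate behavior is the interesting part, so here's a sketch of it. A 1-D camera position stands in for the full transform, and the 7-frame interval is just 30fps divided into ~4-5 captures per second; all names are hypothetical.

```python
# Sketch of the two-rate reflection scheme described above (hypothetical
# structure, not Nintendo's code): the capture refreshes at a low rate,
# but the latest capture is re-positioned every frame so the reflection
# tracks camera motion smoothly between refreshes.

CAPTURE_INTERVAL = 7  # frames: 30fps / 7 is roughly 4-5 captures per second

def reflection_for_frame(frame, camera_pos, last_capture):
    """camera_pos: 1-D stand-in for the capture camera's position.
    last_capture: dict holding 'frame' and 'camera_pos' of the newest
    capture. Returns (capture_frame_used, screen_offset_applied)."""
    if frame - last_capture['frame'] >= CAPTURE_INTERVAL:
        last_capture['frame'] = frame            # take a fresh capture
        last_capture['camera_pos'] = camera_pos
    # Between captures, the stale image is shifted by however far the
    # camera has moved since it was taken: smooth at 30fps, but wrong
    # until the next capture corrects it.
    offset = camera_pos - last_capture['camera_pos']
    return last_capture['frame'], offset

cap = {'frame': 0, 'camera_pos': 0.0}
frame3 = reflection_for_frame(3, 3.0, cap)  # stale capture, shifted by 3.0
frame7 = reflection_for_frame(7, 7.0, cap)  # fresh capture, no offset
```

That per-frame offset is exactly the artifact visible in the videos: the reflection glides smoothly, drifts out of place, then snaps correct when a new capture lands.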

You can learn more about this capture technique here:

https://docs.unrealengine.com/latest/INT/Resources/ContentExamples/Reflections/1_7/


https://twitter.com/brainchildlight/status/937439281890467840

You can see here that the outdated reflection still smoothly tracks with Link's movement. No stutter whatsoever. Then, when a new capture arrives, the reflection is corrected. This works differently from a reflection map, which only updates the reflection when the map itself updates. Here, the captured reflection is clearly out of date, but it's still changing its position at 30fps.

You can also get a sense of the capture camera's FOV here:

https://twitter.com/brainchildlight/status/937441884506435584

It totally makes sense now why all the non-emissive materials only reflected along Fresnel. With these reflection techniques, those are the only angles where it could work right!

I ran into this archway and realized that it was the perfect setup to measure the FOV of the capture cam:

https://twitter.com/brainchildlight/status/937737666665201664

Combined with some basic trigonometry



I estimate the horizontal FOV to be about 115º. Before Link even walks through the archway, the reflection of the archway is already off-screen, meaning it has exited the FOV of the capture cam. So we know it's definitely not a 180º FOV; if it were, the reflection of the archway would still be well in view.

You can also see that when the camera is several feet away and perpendicular to the archway, the reflection is slanted and scales with the FOV, which allows us to visualize how wide it is. This lets us measure the relative horizontal FOV of the scene capture camera.

I want to reiterate, though, that this is a rough estimate, so I could be off by 10 degrees or so. But there are angles that would be impossible with this FOV, so by process of elimination, we at least have a ballpark.
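The trigonometry behind the estimate is the standard field-of-view relation: if a span of known width just fills the capture at a known distance, the horizontal FOV follows directly. The specific numbers below are illustrative, not measurements from the game.

```python
import math

# The basic trig used for the archway estimate: a span of width w that
# just fills the view at distance d implies fov = 2 * atan((w/2) / d).
# The example values are illustrative, not actual in-game measurements.

def horizontal_fov_deg(visible_width, distance):
    """Width and distance in the same units; returns FOV in degrees."""
    return math.degrees(2.0 * math.atan((visible_width / 2.0) / distance))

# e.g. a span ~31.4 units wide just exiting view at ~10 units away:
fov_estimate = horizontal_fov_deg(31.4, 10.0)
```

Since both the width and the distance here are eyeballed from footage, a 10-degree error band on the resulting ~115º figure is about right.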

Physically-based Rendering

Observational Tip: Notice how the green light source on the wood of the barrel appears to be the same at all angles, while the same green light source appears to change its reflection on the metallic barrel hoops (the metal circles on the barrel) with respect to the camera angle.
https://twitter.com/brainchildlight/status/929095485305995265

Emissive materials and area lights

Observational Tip: Look at the shape of the light being cast from the fire sword. It matches the shape of the sword itself, though the intensity of the light will depend on how close the sword is to the surface it's illuminating.
https://twitter.com/brainchildlight/status/929080618654167040

Screen Space Ambient Occlusion

Observational Tip: Look for the dark, shadowy noise patterns in the cracks and crevices of the walls when viewed from head on. This same noise pattern outlines Link's silhouette from this angle as well.
https://twitter.com/brainchildlight/status/929108670444535808

Dynamic Wind Simulation System

Observational Tip: If you watch closely, you can see here how the directional flow of both the grass and clouds match the direction in which the wind changes.
https://twitter.com/brainchildlight/status/929115873306075137

Real-time cloud formation

Observational Tip: Notice how the cloud particles in the sky sporadically gather together.
https://twitter.com/brainchildlight/status/930935088606359552

Rayleigh Scattering/Mie Scattering

Observational Tip: Notice how the different hues of orange and red in the sky reflect on the environment with the same colors. Although not shown in the video, the light scattering of the sky illuminates the environment and water with other colors as well, depending on how the light is scattered.
https://twitter.com/brainchildlight/status/929122120029716480

Observational Tip: Note the snow's change in color as the sun sets.
https://twitter.com/brainchildlight/status/934448814060007424

Observational Tip: There are at least 5 distinct reflection sources in the water at the beginning of this video: the Shrine (blue), the hills (green), the flag (dark silhouette), the sky (orange), and the sun (pink). The hills, flag, and shrine are all reflected through scene-captured reflections; the sun is reflected through specular lighting (as a specular highlight), and the sky is reflected through specular lighting as well, but not as a specular highlight. As a rainstorm rolls in, the changes in reflections are completely dynamic. Sky occlusion from the dark clouds changes the illumination from the Rayleigh scattering in the sky in real time. Eventually, the orange skylight can no longer reach the water's surface, so it fades out, but the sun persists since it hasn't been completely blocked out. However, with so much Mie scattering in the sky, the sun's color has changed from pink to white! Even then, the clouds eventually prove too much for the sun, blocking it out completely and leaving only the light from the shrine and partial reflections of the hills.
https://twitter.com/brainchildlight/status/934470292444889089
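For the curious, the wavelength dependence behind those sky colors can be sketched in a few lines. This is the textbook Rayleigh relation only, not BOTW's actual implementation:

```python
def rayleigh_relative(lam_nm: float, ref_nm: float = 550.0) -> float:
    """Rayleigh scattering strength scales as 1/wavelength^4: short (blue)
    wavelengths scatter far more than long (red) ones, giving blue skies
    at noon and red/orange skies when sunlight takes a long path at sunset."""
    return (ref_nm / lam_nm) ** 4

# Blue (~450 nm) vs red (~650 nm):
ratio = rayleigh_relative(450.0) / rayleigh_relative(650.0)
print(round(ratio, 1))  # blue scatters roughly 4x more strongly than red
```

Mie scattering, by contrast, is only weakly wavelength-dependent, which is why a hazy or cloud-filled sky washes the sun's color out toward white, as in the video above.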

Full Volumetric Lighting

Observational Tip: Notice how the light shafts are created as they peer through the shadows cast by the large building structure.
https://twitter.com/brainchildlight/status/930950850519883776

Bokeh DOF and approx. of Circle of Confusion

Observational Tip: Pay attention to the reticle of the camera and the shiny blue lights on the metal boxes. When the camera focuses on distances far away from the light sources, the light sources become more blurry and also appear larger. The opposite happens when the camera focuses directly on the light sources. The circular shapes that the blue lights transform into are known as the bokeh effect.
https://twitter.com/brainchildlight/status/932644697037668354
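The blur-disc behavior described in the tip follows from the thin-lens circle-of-confusion relation. A toy sketch, with all distances illustrative rather than BOTW's actual camera parameters:

```python
def coc_diameter(aperture_mm, focal_mm, focus_mm, subject_mm):
    """Thin-lens circle of confusion: a point light at subject_mm is
    blurred into a disc whose diameter grows the further it sits from
    the focus distance -- which is why the blue lights bloom into large
    bokeh discs when the camera focuses far away from them."""
    return (aperture_mm * (abs(subject_mm - focus_mm) / subject_mm)
            * (focal_mm / (focus_mm - focal_mm)))

# In focus: no blur. Focused far behind the light: a visible disc.
print(coc_diameter(25.0, 50.0, 2000.0, 2000.0))        # 0.0
print(coc_diameter(25.0, 50.0, 10000.0, 2000.0) > 0.0)  # True
```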

Sky Occlusion and Dynamic Shadow Volumes

Observational Tip: Watch how the stark, hard shadow of the Flag becomes softer and starts to receive more color from the ambient light term as the storm precipitates.
https://twitter.com/brainchildlight/status/933505245753196545

Observational Tip: Pay attention to how the movement of the Flag correlates to the light shafts that are produced. The dark regions in between the light shafts come from the shadow volume of the flag. The dynamic contortion of the flag tells us that these shadow volumes are generated on the fly.
https://twitter.com/brainchildlight/status/933511124963573760

Aperture-based Lens Flares

Observational Tip: You can see that even though the sun is off-screen, the lens flares (circular light artifacts) are still present. More importantly, the shape, size, and clarity of the lens flares scale with the level of camera zoom.
https://twitter.com/brainchildlight/status/934161582841413632

Sub-surface Scattering

Observational Tip: Note how light from inside the stable can be seen diffusely illuminating the outer surface. Link's shadow on the roof is also illuminated by the light from inside, but not when he's on the ground.
https://twitter.com/brainchildlight/status/934170345962999808

Update: An additional look into SSS:
https://twitter.com/brainchildlight/status/935103546080477184

Observational Tip: Note how the surface material becomes more opaque as it receives more light, obscuring the internal material.
https://twitter.com/brainchildlight/status/934179053476519936

Observational Tip: Note how the surface material becomes less opaque as it receives less light, revealing the internal material.
https://twitter.com/brainchildlight/status/934184586711408640

Dynamically Localized Lightning Illumination

Observational Tip: Note the change in size, direction, and contrast of the shadows with each lightning strike.
https://twitter.com/brainchildlight/status/934439303047929857

Per-Pixel Sky Irradiance

Observational Tip: Well, you're supposed to be able to see the stars come out as the sun sets, but Twitter compression made sure that didn't happen, lol.
https://twitter.com/brainchildlight/status/934460003892609024

Fog Inscatter

Observational Tip: Note how the fog on the mountain has taken on the color of the available light in the environment and also appears to have volume.
https://twitter.com/brainchildlight/status/934504676677926912
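A minimal sketch of the inscatter idea, assuming simple exponential fog (this is the generic technique, not BOTW's actual shader):

```python
import math

def fog_blend(surface_rgb, inscatter_rgb, distance, density):
    """Classic exponential fog: with distance, the surface colour is
    replaced by the colour of light scattered *into* the view ray (the
    'inscatter'), which is why distant slopes take on the hue of the
    ambient light in the environment."""
    t = math.exp(-density * distance)  # transmittance: 1.0 up close, -> 0 far away
    return tuple(t * s + (1.0 - t) * i for s, i in zip(surface_rgb, inscatter_rgb))

rock = (0.3, 0.3, 0.3)
sunset = (1.0, 0.6, 0.4)                       # warm inscatter colour at dusk
print(fog_blend(rock, sunset, 0.0, 0.002))     # up close: pure rock colour
far = fog_blend(rock, sunset, 2000.0, 0.002)   # far away: pulled toward the sunset hue
print(far[0] > far[2])                          # True: red now dominates
```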

Particle Lights

Observational Tip: Note how the glowing embers move independently in 3D space, irrespective of the camera.
https://twitter.com/brainchildlight/status/934527008184213504

Observational Tip: Snow particles are rendered as particle lights in BOTW, an approach that gives them the illusion of reflecting sunlight. It could also simply be an artistic choice.
https://twitter.com/brainchildlight/status/934529213087989760

Observational Tip: Note how the fireflies illuminate the surfaces they are close to.
https://twitter.com/brainchildlight/status/934530394740760576

Puddle formation and evaporation

Observational Tip: Watch how the water basin 'fills up' with water when it starts to rain.
https://twitter.com/brainchildlight/status/934536138336768000

Observational Tip: Watch how the water 'evaporates' when it stops raining and the sun comes out.
https://twitter.com/brainchildlight/status/934541047799070720

Pastebin
I actually went back to look at some of the game's code from the text dump and it pretty much confirms everything that I posted, lol. I suppose I should have done that first instead of investigating the engine during gameplay. Would've made my life a lot easier!

Here's the pastebin: https://pastebin.com/Jc9b0BCp

And with that, this analysis has been concluded. As always, if you have any questions about the information provided in this post, feel free to let me know.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Just wanted to thank Red for finally getting the formatting right and posting this thread on my behalf. Kudos, Red!
 

SuiQuan

Member
Oct 25, 2017
885
Kazakhstan - soon
This is awesome! And I agree with all the conclusions you have.
A small question: what do you think about the grass? I have a suspicion that most of it is not masked maps, but are actual polys to prevent massive amounts of overdraw. I can't think of a better solution. PBR shader complexity combined with overdraw would kill both the Wii U and the Switch. What do you think?
 

Elfforkusu

Member
Oct 25, 2017
4,098
"Emissive materials and area lights" reference link is wrong, it's a dup of the one above it

ps: great post.
 
Oct 29, 2017
2,398
Just wanted to thank Red for finally getting the formatting right and posting this thread on my behalf. Kudos, Red!
Awesome work brainchild! Really informative read.

This is kind of what I wanted DF to do instead of fanning the flames of fanboy wars over irrelevant details, but alas. Luckily you're putting in the work. Great videoclips to highlight what you mean too!

The fog inscatter has me scratching my head though, but it's late I guess.

edit: that lensflare is just another one of those brilliantly unnecessary attention to detail.
 
OP
Redneckerz
"Emissive materials and area lights" reference link is wrong, it's a dup of the one above it

ps: great post.
STOP MENTIONING LINKS! STOP USING TECH TERMS!!! STOP TORMENTING ME AHHHHH X.X.X.X.X.X

Worst post of 2017. Shame on you!
(Changed it, thanks!) ;)

Awesome work brainchild! Really informative read.

This is kind of what I wanted DF to do instead of fanning the flames of fanboy wars over irrelevant details, but alas. Luckily you're putting in the work. Great videoclips to highlight what you mean too!
DF likely can't make time for breakdowns like these. And honestly, I wouldn't fault them. Stuff like this takes huge amounts of time, so I figure this is only reserved for high-profile titles.

brainchild, the pleasure is all mine, but please, next time.... don't haunt my gaming soul anymore. XD. This was a fucking nightmare to get right.
Luckily I love the breakdown and know your reputation as a poster, so I could not pass this up.

Hope to see more of this kind of content in the future!
(PS: Get Verafied.)

jay_toon10 thank you very much man! It's compliments like this that really make this worthwhile!
 

SharpX68K

Member
Nov 10, 2017
10,516
Chicagoland
Wow, that's a really impressive breakdown.

Imagine what the Zelda team could achieve on non-mobile Nvidia hardware in a future Nintendo console, say around the mid 2020s. Their partnership is expected to last two decades.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
This is awesome! And I agree with all the conclusions you have.
A small question: what do you think about the grass? I have a suspicion that most of it is not masked maps, but are actual polys to prevent massive amounts of overdraw. I can't think of a better solution. PBR shader complexity combined with overdraw would kill both the Wii U and the Switch. What do you think?

It's a combination of both, I believe, depending on the LOD. I'm not entirely sure, but looking at the game's code, it appears to be the case.

Incredible breakdown. Thanks.

Thank you!

"Emissive materials and area lights" reference link is wrong, it's a dup of the one above it

ps: great post.

Thanks for the correction.

Awesome work brainchild! Really informative read.

This is kind of what I wanted DF to do instead of fanning the flames of fanboy wars over irrelevant details, but alas. Luckily you're putting in the work. Great videoclips to highlight what you mean too!

The fog inscatter has me scratching my head though, but it's late I guess.

Honestly, I wouldn't have been able to delve this deep without that text dump. A lot of the features I was able to speculate, but the text dump helped me to confirm or debunk some theories.

Fog Inscatter is just a term used by some developers to describe the appearance of the light scatter within fog. Or were you confused about BOTW's implementation?
 
Oct 29, 2017
2,398
Yes, the implementation: rendering noise inside shadow volumes to give the illusion of volume. I guess my game technology master's seems like ancient history, because I couldn't wrap my head around what it meant. Well, it is ancient history; most of this stuff didn't exist ten years ago when I graduated, but still.
 

Aaronrules380

Avenger
Oct 25, 2017
22,432
Really cool thread. It's really interesting the lengths the Zelda team went to to make sure things looked right by using programmed physics. The water plane idea is definitely really clever
 

jariw

Member
Oct 27, 2017
4,283
The "hallmark" viewpoint angle that BotW uses (when looking from high points, looks almost like compressed panorama or something), is that covered here? Can't see it in the list, and I don't know if it got a name.

Wow, that's a really impressive breakdown.

Imagine what the Zelda team could achieve on non-mobile Nvidia hardware in a future Nintendo console, say around the mid 2020s. Their partnership is expected to last two decades.

The Wii U was the target system for BotW; the game was then ported to Switch.
 

Stowaway Silfer

One Winged Slayer
Member
Oct 27, 2017
32,819
Looking forward to going through this later. You obviously put some work into this and BotW is fascinating to me on a technical level.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Wow, that's a really impressive breakdown.

Imagine what the Zelda team could achieve on non-mobile Nvidia hardware in a future Nintendo console, say around the mid 2020s. Their partnership is expected to last two decades.

If this theoretical game still has a stylized art style, its technology will probably still be underrated. But yes, I'm looking forward to a Zelda game that runs on even more advanced hardware!

Yes, the implementation: rendering noise inside shadow volumes to give the illusion of volume. I guess my game technology master's seems like ancient history, because I couldn't wrap my head around what it meant. Well, it is ancient history; most of this stuff didn't exist ten years ago when I graduated, but still.

So you know how the whole point of bump/normal mapping is to give the illusion of depth by shading texture maps? The concept is similar here. The fog is not volumetric; the shadows are volumetric. You cannot see a shadow volume in the air if it's not being cast onto something. The fog inscatter uses this noise pattern (which receives lighting information) to illuminate these shadow volumes in mid-air. What you're seeing is effectively 'colored shadow volumes', but it also looks like volumetric fog to the untrained eye. Does that help?

Really cool thread. It's really interesting the lengths the Zelda team went to to make sure things looked right by using programmed physics. The water plane idea is definitely really clever

The water plane is one of my favorite tech solutions in any game. Simple, but effective!
 

Pascal

▲ Legend ▲
The Fallen
Oct 28, 2017
10,228
Parts Unknown
Awesome post! I always love reading tech breakdowns like this one (even if I can't understand half of the stuff in the OP). Reading about the amount of effort that went into making this beautiful game is just mind-blowing. Especially the part about Rayleigh and Mie scattering. Just awesome stuff to learn about.

Gonna watch the videos next to see if I can tell what's going on in each of them. BotW is such an amazing game on so many levels.
 

sabrina

Banned
Oct 25, 2017
5,174
newport beach, CA
I'm floored and fascinated. I knew the game was pretty, but I didn't realize so much work went into it.

It's tempting and lazy to say "oh, they just slapped a cel shading post process on it and called it a day", but clearly they actually put the legwork in. Thanks for sharing this :)
 

Jojo Leir

Avenger
Oct 25, 2017
627
Read a bit; seems pretty great.

Any chance we could get some screenshots to go with the text?

Edit: Nevermind, just saw the second post.
 

Jonneh

Good Vibes Gaming
Verified
Oct 24, 2017
4,538
UK
Crazy to think Breath of the Wild is at heart a Wii U game. Great thread brainchild.
 
Oct 25, 2017
8,617
While the game looks rough in spots, I do love just how big and ambitious it is.
A lot of the techniques seem considerably more advanced than what Nintendo had done previously.

The massive real time shadows moving smoothly is super impressive to me still.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
To everyone, thanks for the feedback! If anyone has any questions about anything that's being discussed in this thread, do not hesitate to ask!

The "hallmark" viewpoint angle that BotW uses (when looking from high points, looks almost like compressed panorama or something), is that covered here? Can't see it in the list, and I don't know if it got a name.



The Wii U was the target system for BotW; the game was then ported to Switch.

Ah, yes. The FOV is slightly adjusted depending on where you are on the map (high points, in a village, etc.) or if you're interacting with characters or in combat. It's particularly noticeable when you finish talking with a Korok after completing a Korok puzzle.
 

Asbsand

Banned
Oct 30, 2017
9,901
Denmark
Nice, I'm gonna spend some time looking through every bit of this analysis later. I also wonder how the tech in Zelda compares to the tech utilized in Xenoblade Chronicles X, and whether what the Wii U could do with Xenoblade Chronicles X would carry over 1:1 if the Switch were to run that game. Zelda seems to have carried over perfectly, even in handheld mode, but I do wonder if there might be subtle differences, like lower samples on certain lighting textures or something.
 

Dekuman

Member
Oct 27, 2017
19,026
I think this engine could do a lot more. At the end of the day, BOTW's design was limited by the game also being a Wii U title.

I can see a more expansive and impressive open world if it only has to run on the Switch.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Nice, I'm gonna spend some time looking through every bit of this analysis later. I also wonder how the tech in Zelda compares to the tech utilized in Xenoblade Chronicles X, and whether what the Wii U could do with Xenoblade Chronicles X would carry over 1:1 if the Switch were to run that game. Zelda seems to have carried over perfectly, even in handheld mode, but I do wonder if there might be subtle differences, like lower samples on certain lighting textures or something.

BOTW's engine is considerably more advanced than Xenoblade X's. However, just because a game engine has a more advanced feature set, it doesn't automatically mean the hardware would run a less advanced engine better.

Case in point: even though it's not as feature-rich, XBX still has a lot of processing pouring into its wildlife routines. You can also fly anywhere on the map at a greater speed than you can glide in BOTW. Geometry rendering and objects drawn at far distances can occasionally be more complex in XBX. None of these things are feature-rich, but they all carry a cost.

Having said that, I'm sure the Switch could handle a port of XBX just fine :)
 

Neiteio

Member
Oct 25, 2017
24,127
Crazy to think Breath of the Wild is at heart a Wii U game. Great thread brainchild.
Yeah, it'll be interesting to see what a Zelda made from the ground up for the Switch will look like.

BotW is such a gorgeous game. The art direction really shines. A lovely balance between realism and, say, Studio Ghibli.
 

Toa Axis

One Winged Slayer
Member
Oct 25, 2017
843
Virginia
This is a really, really awesome breakdown. Lovely read.

The volumetric lighting was the thing that stood out to me the most during my run of play. Mind, I initially played the Wii U version, so when I saw that the devs had a full volumetric lighting system in place on a Wii U game of this scale, I was very much impressed.

Also, that volumetric fog trick is clever is hell. They really put a lot of work into the technical side of things.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
What's the method for figuring out how close a given point on the water plane is to the ground below it (for the foam layer)? Is it basically a zbuffer comparison?

It's impossible for me to know this for sure, honestly, but it shouldn't be too complicated. There are plenty of ways to determine the relative distance of objects within a 3D space and I don't think relying on the depth map of a z-buffer would be problematic in any way.

I said all that to say, yeah, it probably just uses a z-buffer comparison :P
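A toy sketch of that z-buffer comparison idea, purely illustrative and not BOTW's actual shader:

```python
def foam_mask(water_depth, scene_depth, threshold=0.5):
    """Toy depth comparison for shoreline foam: where the opaque geometry
    behind the water plane is only slightly deeper than the water surface
    (i.e. the water is shallow), draw foam."""
    return [abs(s - w) < threshold for w, s in zip(water_depth, scene_depth)]

water = [10.0, 10.0, 10.0, 10.0]   # depth of the water plane per pixel
ground = [10.2, 10.4, 12.0, 15.0]  # depth of the terrain behind it
print(foam_mask(water, ground))    # [True, True, False, False]
```

In a real renderer this comparison would run per-pixel in the water shader against the scene's depth buffer; the list here just stands in for a row of pixels.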

I think this engine could do a lot more. At the end of the day, BOTW's design was limited by the game also being a Wii U title.

I can see a more expansive and impressive open world if it only has to run on the Switch.

All engines can benefit from better optimization and better hardware. Switch is no exception, for sure.
 
Oct 27, 2017
5,618
Spain
BOTW is an incredible artistic and technical achievement, pulling off what is essentially a current generation game engine... On the friggin' Wii U. It's insane, and of course it has some caveats and rough spots, but I have the impression those also stem from the sheer size of the game.
All in all it's excellent technology and I hope Nintendo keeps putting it to good use.
 

Asbsand

Banned
Oct 30, 2017
9,901
Denmark
BOTW's engine is considerably more advanced than Xenoblade X's. However, just because a game engine has a more advanced feature set, it doesn't automatically mean the hardware would run a less advanced engine better.

Case in point: even though it's not as feature-rich, XBX still has a lot of processing pouring into its wildlife routines. You can also fly anywhere on the map at a greater speed than you can glide in BOTW. Geometry rendering and objects drawn at far distances can occasionally be more complex in XBX. None of these things are feature-rich, but they all carry a cost.

Having said that, I'm sure the Switch could handle a port of XBX just fine :)
The thing is, with BotW I should think the Switch is basically as good as the Wii U just in its handheld mode, but while XBCX runs at 720p on the Wii U, its sequel, which might be more advanced but doesn't look it to me, had to implement variable resolution from 720p and below in the Switch's handheld mode. Zelda doesn't need this and actually maintains a better framerate than the Wii U version, which leaves me to suspect the hardware similarities aren't all that cut and dried.

I could imagine the Switch has a clear advantage on either the CPU or GPU front, and of course much more memory IIRC, but either of those could be a bottleneck compared to the Wii U. It's just very unpredictable, because the system has to conserve power and battery in handheld mode, underclocking its true specs and outputting inferior graphics compared to being docked, but not in Zelda for whatever reason.
 

Akela

Member
Oct 28, 2017
1,849
Pretty surprised to hear that the light coming from within the stable tents is actually done using subsurface scattering; I assumed it was simply an emissive texture.
 
Oct 25, 2017
8,617
I wonder if Mario and BOTW have the same engine. I imagine they were being built alongside each other and it would probably serve it pretty well.

Mario looks and runs nicer likely thanks to having a smaller map.


As for XCX, while it's crazy ambitious it does lack a ton of features from BOTW's engine. I think it has basically no physics. Baked lighting. Lack of collision.
Seems like Splatoon 2 and XC2 have some more modern features over their Wii U counterparts
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
The thing is, with BotW I should think the Switch is basically as good as the Wii U just in its handheld mode, but while XBCX runs at 720p on the Wii U, its sequel, which might be more advanced but doesn't look it to me, had to implement variable resolution from 720p and below in the Switch's handheld mode. Zelda doesn't need this and actually maintains a better framerate than the Wii U version, which leaves me to suspect the hardware similarities aren't all that cut and dried.

I could imagine the Switch has a clear advantage on either the CPU or GPU front, and of course much more memory IIRC, but either of those could be a bottleneck compared to the Wii U. It's just very unpredictable, because the system has to conserve power and battery in handheld mode, underclocking its true specs and outputting inferior graphics compared to being docked, but not in Zelda for whatever reason.

Are you talking about XB2? That engine is definitely more in line with modern rendering methods. It's boasting real-time lighting and shading of the environment while XBX switched between presets. Considering the scale of the game, I can definitely see why it's running the way that it is.
 

Neutra

Member
Oct 27, 2017
988
NYC
Really lovely write-up. Nintendo's games have really sparkled since they harnessed the power of PBR.
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
Nice, I'm gonna spend some time looking through every bit of this analysis later. I also wonder how the tech in Zelda compares to the tech utilized in Xenoblade Chronicles X, and whether what the Wii U could do with Xenoblade Chronicles X would carry over 1:1 if the Switch were to run that game. Zelda seems to have carried over perfectly, even in handheld mode, but I do wonder if there might be subtle differences, like lower samples on certain lighting textures or something.

I don't think any WiiU game would struggle to run on Switch. The BotW developers said the game had "the same game experience with no optimisation" when ported over to Switch, which is crazy. They then patched it soon after launch with further optimisation, and it runs at a better resolution and more consistent framerate than the WiiU version in handheld mode, and 40+% the resolution of the WiiU version at a rock-solid 30fps 99% of the time when docked. It ran amazingly well for a game built for a completely different AMD architecture which was then thrown onto a 2015 Nvidia mobile chipset. I doubt Xenoblade X would have any issues running on Switch at dynamic 720p when handheld and dynamic 900p when docked.

The Switch CPU has at least 2x the computational power of WiiU's CPU, Switch also has an extra 2GB's of memory and the GPU is more modern with a much lighter API with about 2x the computational power when handheld / 4x when docked. Fast Racing Remix is proof of this as it went from dynamic 720p / 30fps in 4 player mode on WiiU to dynamic 1080p / 60fps with graphical improvements on Switch and that was a first generation launch period port done by a tiny team.

I'm looking forward to seeing more WiiU to Switch ports and their technical improvements next year to fill new exclusive gaps!

Hopefully Switch+ brings every dynamic resolution game to 720p native in handheld mode and 1080p native when docked when it's eventually released.
 

LuigiV

One Winged Slayer
Member
Oct 27, 2017
2,684
Perth, Australia
Great analysis, brainchild. Glad to finally see you here on ResetEra.

The game's engine is a lot more advanced/up to date than I initially thought.

I wonder if Mario and BOTW have the same engine. I imagine they were being built alongside each other and it would probably serve it pretty well.

Mario looks and runs nicer likely thanks to having a smaller map.
Maybe not the exact same engine, but it's very likely that EPD shares its core technologies between all its internal teams as a base for them to build on. brainchild did a similar analysis on SMO on SL&ENT, and the two games use a lot of the same techniques (it wouldn't make much sense if they were built independently of each other). This tech sharing was very obvious on the Wii U, where Nintendo Land, MK8, Splatoon and the Animal Crossing app all basically shared the same lighting model.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Pretty surprised to hear that the light coming from within the stable tents is actually done using subsurface scattering; I assumed it was simply an emissive texture.

Emissive materials don't usually have diffuse lighting in the material since they're supposed to act as light sources. However, it is probably better to think of the internal layer of the stable as an emissive material that has a diffuse appearance.

The method really isn't that important, though. Does it emulate the effect? That's what's important. Actual SSS in the real world happens at the molecular level and technically happens with every material, but we usually make a distinction with translucent materials because of their unique behavior. Since it's impossible to simulate this in real time, we use various methods to achieve the effect.

Some engines use BSSRDF algorithms to emulate SSS, and some engines bake the information into SSS maps. BOTW does the latter.
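For readers curious what a cheap real-time SSS stand-in can look like, here's a sketch of "wrap" diffuse lighting, a common approximation in stylized engines. This is the generic technique only, not necessarily what BOTW's baked SSS maps drive:

```python
def wrap_diffuse(n_dot_l: float, wrap: float) -> float:
    """'Wrap' lighting, a common cheap stand-in for subsurface scattering:
    light bleeds past the terminator, so thin materials appear lit from
    behind. wrap=0.0 is plain Lambert; wrap=1.0 wraps light fully around."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# A surface facing slightly away from the light (n_dot_l = -0.2):
print(wrap_diffuse(-0.2, 0.0))  # 0.0 -- plain Lambert: fully dark
print(wrap_diffuse(-0.2, 0.5))  # > 0 -- wrapped: still receives some light
```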

I wonder if Mario and BOTW have the same engine. I imagine they were being built alongside each other and it would probably serve it pretty well.

Mario looks and runs nicer likely thanks to having a smaller map.


As for XCX, while it's crazy ambitious it does lack a ton of features from BOTW's engine. I think it has basically no physics. Baked lighting. Lack of collision.
Seems like Splatoon 2 and XC2 have some more modern features over their Wii U counterparts

I also did an analysis for SMO, but in its current state, it is not nearly as accessible as this analysis. I'll have to make some alterations before having Red post it here.

And no, I don't believe that SMO uses the same engine as BOTW. In fact, I have reason to believe that SMO's engine may have been influenced by Unreal Engine 4.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Great analysis, brainchild. Glad to finally see you here on ResetEra.

The game's engine is a lot more advanced/up to date than I initially thought.


Maybe not the exact same engine, but it's very likely that EPD shares its core technologies between all its internal teams as a base for them to build on. brainchild did a similar analysis on SMO on SL&ENT, and the two games use a lot of the same techniques (it wouldn't make much sense if they were built independently of each other). This tech sharing was very obvious on the Wii U, where Nintendo Land, MK8, Splatoon and the Animal Crossing app all basically shared the same lighting model.

I agree with this, but I think SMO is the odd man out, after doing my BOTW analysis.

The cubemapped IBL lighting and the use of bump offset in the windows of New Donk City have changed my mind on the matter.

BTW, thanks, LuigiV! You were one of the reasons that I came here!

EDIT:

Of course, it may just be that they're using different rendering methods on similar engines. Who knows?!
 

ghibli99

Member
Oct 27, 2017
17,723
Said this on Reddit as well, but such an amazing technical breakdown of BOTW. Most of this stuff I would never notice, but reading about it and seeing it in action gives me a deeper appreciation for a game I already hold in the highest regard.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,478
Crazy to think Breath of the Wild is at heart a Wii U game. Great thread brainchild.

Just wanted to say thanks again for dealing with my username mishap! Your response to that was very timely!

As I facilitated this, I'm happy I played a part in it.
And I'm happy brainchild is here too. One of the most knowledgeable guys out there and also a pleasure to work with. Whatever analysis you need posted, I'll post it in a heartbeat for you.

If I ever need another thread made, rest assured that you will be the one to make it :)

I don't think you guys understand how difficult it was for Red to parse through my analysis, cut back some parts out of necessity, and keep almost everything intact. I'm truly impressed!

Also, if anyone wants to have a look at the pastebin of the game's code, you can view it here:

https://pastebin.com/Jc9b0BCp
 

Mochi

Avenger
Oct 25, 2017
1,704
Seattle
amazing work, thank you OP. Especially appreciate you taking the time to spell stuff out in layman's terms, I learned a lot from your posts!

I hope they continue to use the engine and assets from BOTW, even on a Switch 2 (the engine, at least). It seems to have everything it needs, and I imagine BOTW would look quite amazing at 4K. I'm sure the engine can be optimized further for the Switch, so a potential Majora's Mask-type game might look even better overall.