There have been some important ‘behind the scenes’ developments recently that have been hard to show via screenshots, so apologies for the twitter silence lately, but here’s a hopefully informative blog update instead…
Deferred HDR rendering
Up until this point, all rendering has been performed directly to the app window, in a single pass, and with a single ‘fudged in’ light. This is commonly known as ‘forward rendering’. It’s a pretty simple approach and (can be) very fast.
Another approach is ‘deferred rendering’. With deferred rendering, rendering is performed to an offscreen texture instead, and rather than trying to do everything in one pass, a deferred renderer generally does things in multiple, often simpler, passes. Deferred renderers also usually maintain a number of ‘gbuffers’ – special buffers that contain per-pixel geometric information, such as surface color/metalness/normal etc. This extra information is usually rendered on the first pass, and is used by subsequent passes for lighting, shadows, special effects and so on.
This sounds complex, and it kind of is, but it offers several really nice benefits. First, lights are very fast to render, eg: for pointlights you just render a quad per light, and the shader uses the existing gbuffers and depth buffer to perform all lighting calculations. Second, it allows you to write all sorts of cool ‘post processing’ effects that can also use the gbuffers/depth buffer, for effects like fog, ‘god rays’, screen space reflections, bloom, depth-of-field and a lot more. This typically involves drawing a fullscreen rect that has access to the ‘current’ color buffer, the gbuffers and the depth buffer. Shader code then has the ability to inspect a single pixel’s normal, or its depth value (and therefore view/world etc position), and any material specific info like roughness/glossiness (perhaps for reflection effects?).
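To make the ‘quad per light’ idea a bit more concrete, here’s a rough Python sketch of the math a deferred point-light pass performs per pixel, using only values that would be fetched from the gbuffers and depth buffer. This is purely illustrative, not engine code – the function name, the Lambert-only shading and the linear falloff are all my simplifying assumptions:

```python
import math

def point_light_at_pixel(frag_pos, normal, albedo, light_pos, light_color, light_radius):
    """Toy per-pixel point light: everything needed (position via depth,
    normal, albedo) comes from the gbuffers, not the scene geometry."""
    lx, ly, lz = (light_pos[i] - frag_pos[i] for i in range(3))
    dist = math.sqrt(lx * lx + ly * ly + lz * lz)
    if dist >= light_radius:
        return (0.0, 0.0, 0.0)              # pixel is outside the light's volume
    l = (lx / dist, ly / dist, lz / dist)   # normalized direction to the light
    ndotl = max(0.0, sum(n * d for n, d in zip(normal, l)))
    atten = 1.0 - dist / light_radius       # simple linear falloff
    return tuple(a * c * ndotl * atten for a, c in zip(albedo, light_color))
```

In the real thing this runs in a fragment shader over the screen-space quad covering the light, but the inputs and outputs are the same idea.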
But this does come at a cost – if your game only has a few lights and doesn’t do much post processing, forward rendering could well be the better/faster approach (I think it’s generally the recommended approach for VR). Also, mobile is unlikely to have the grunt just yet to make good use of deferred rendering, so forward rendering might make more sense on mobile too. But for desktop developers, deferred rendering is IMO the way to go as it’s more powerful, and if you have lots of lights can be faster. Given most mx2 users are desktop users, it’s what I’m going to start with. Adding a forward renderer in future should not be a big deal but I don’t want to have to maintain 2 renderers right now.
In the process of moving to a deferred renderer, I added some new floating point pixel formats for textures, so the main ‘color buffer’ texture that ultimately contains what you see is now in 16 bit floating point format. This gives you ‘hdr’ (high dynamic range) colors, where the color buffer can actually contain RGB color values >1. You can’t actually show such colors on a normal monitor (this is what the new HDR TVs are all about) but you can smear the over-bright colors around a bit so they brighten up neighboring pixels, making the entire scene appear brighter (or ‘glow’) in the process. This is a post processing effect known as ‘bloom’, and is the only one I currently have going!
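As a tiny illustration of why HDR matters for bloom, here’s a hypothetical ‘bright pass’ in Python – the usual first step of a bloom effect, which keeps only the over-threshold part of each color before it gets blurred and added back to the scene. The threshold of 1.0 is just a common convention, not necessarily what my renderer uses:

```python
def bright_pass(hdr_pixel, threshold=1.0):
    """Keep only the over-bright part of an HDR color; an 8-bit buffer
    clamped at 1.0 would have nothing left to extract here."""
    return tuple(max(0.0, c - threshold) for c in hdr_pixel)

# A 16 bit float color buffer can genuinely hold values > 1.0:
overbright = bright_pass((2.5, 1.2, 0.8))
```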
New PBR material system
I also ended up changing the PBR material system used. Previously, I had been using what’s known as a ‘specular/gloss’ system. This basically means your materials have 2 color textures – diffuse and specular – and a single (1 component) ‘gloss’ texture. Instead, I am now using a slightly different system known as ‘metalness/roughness’ (ultimately, these are both PBR systems – they just store material parameters differently). The metalness/roughness approach actually uses fewer parameters than specular/gloss, as it combines the diffuse/specular textures into a single ‘color’ texture, with an additional (1 component) ‘metalness’ texture. Metalness effectively says how much of color is diffuse and how much is specular, eg: if metalness=1, then color is really just specular; if metalness=0, then color is really diffuse. This system also has a ‘gloss’ texture, only it’s stored as 1.0-gloss and called ‘roughness’ instead. Not sure why the difference here, but it seems to have become a standard of sorts so I’ll go with it.
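The relationship between the two systems can be sketched in a few lines of Python. This is a simplification of how real converters work – in particular, actual spec/gloss↔metal/rough conversion also accounts for the small base specular that non-metals have, which I’m ignoring here:

```python
def split_color(color, metalness):
    """How metalness divides the single 'color' into diffuse and specular
    parts (simplified: no dielectric base specular)."""
    diffuse = tuple(c * (1.0 - metalness) for c in color)
    specular = tuple(c * metalness for c in color)
    return diffuse, specular

def gloss_to_roughness(gloss):
    # roughness is just gloss stored the other way up
    return 1.0 - gloss
```

So with metalness=1 the color behaves entirely as specular, and with metalness=0 entirely as diffuse, with a smooth blend in between.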
There are 2 advantages to the metalness/roughness system. First, it is theoretically harder/impossible to accidentally produce ‘unrealistic’ colors that can’t occur in the real world, because you’ve basically got fewer parameters to tweak, ie: color, metalness and roughness, and pretty much all combinations of these are ‘valid’ and should produce a realistic color. Not so if you have separate diffuse/specular textures, where (apparently) some combos of diffuse/specular just don’t exist IRL. Second, it simplifies the deferred gbuffer system as there is less to store in the gbuffers, ie: instead of having to store diffuse/specular (6 floats), you only have to store color/metalness (4 floats), which fits in a single RGBA texture (ie: RGB=color, A=metalness). So all-in-all, the metalness/roughness system only requires 2 offscreen textures/gbuffers for deferred rendering: a color buffer (RGB=color, A=metalness) and a normal buffer (RGB=normal, A=roughness).
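Here’s a little Python sketch of that packing, including the 0..1 encoding a normal needs to survive being stored in a texture (the function names are just for illustration, not engine API):

```python
def encode_normal(n):
    # a unit normal has -1..1 components; textures store 0..1,
    # so remap with * 0.5 + 0.5 (the shader later undoes this)
    return tuple(c * 0.5 + 0.5 for c in n)

def pack_gbuffers(color, metalness, normal, roughness):
    """Pack the PBR params into the two RGBA gbuffers described above."""
    gbuffer0 = (*color, metalness)                    # RGB=color, A=metalness
    gbuffer1 = (*encode_normal(normal), roughness)    # RGB=normal, A=roughness
    return gbuffer0, gbuffer1
```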
So all in all, the ‘mostly finalized’ default PbrMaterial class allows you to play with the following material properties:
ColorTexture:Texture – RGB color texture.
ColorFactor:Color – color factor. ColorTexture is multiplied by this. Ditto for all factors below.
AmbientTexture:Texture – RGB ambient texture – really ‘ambient occlusion’. Allows you to mask out ambient light.
AmbientFactor:Color – ambient factor.
EmissiveTexture:Texture – RGB emissive texture. Doesn’t actually add a light source (could it?) but pixels will be at least this bright!
EmissiveFactor:Color – emissive factor.
MetalnessTexture:Texture – single component metalness texture.
MetalnessFactor:Float – metalness factor.
RoughnessTexture:Texture – single component roughness texture.
RoughnessFactor:Float – roughness factor.
NormalTexture:Texture – normal texture.
You can also write your own material classes that take different parameters to this, but if you’re using the deferred renderer and want lighting, your material shader will have to somehow produce the following PBR params: color, ambient, emissive, metalness, roughness and normal.
These are then passed to an internal main0() function. For example, the default fragment shader for the built in PbrMaterial looks a bit like this:
uniform sampler2D m_ColorTexture;
uniform vec4 m_ColorFactor;
uniform sampler2D m_AmbientTexture;
uniform vec4 m_AmbientFactor;
uniform sampler2D m_EmissiveTexture;
uniform vec4 m_EmissiveFactor;
uniform sampler2D m_MetalnessTexture;
uniform float m_MetalnessFactor;
uniform sampler2D m_RoughnessTexture;
uniform float m_RoughnessFactor;
uniform sampler2D m_NormalTexture;

varying vec2 v_TexCoord0;

void main(){

	//color textures are sRGB, so decode to linear space for lighting…
	vec3 color=pow( texture2D( m_ColorTexture,v_TexCoord0 ).rgb,vec3( 2.2 ) ) * m_ColorFactor.rgb;
	vec3 ambient=pow( texture2D( m_AmbientTexture,v_TexCoord0 ).rgb,vec3( 2.2 ) ) * m_AmbientFactor.rgb;
	vec3 emissive=pow( texture2D( m_EmissiveTexture,v_TexCoord0 ).rgb,vec3( 2.2 ) ) * m_EmissiveFactor.rgb;

	//…but metalness/roughness are plain data, no decode needed
	float metalness=texture2D( m_MetalnessTexture,v_TexCoord0 ).r * m_MetalnessFactor;
	float roughness=texture2D( m_RoughnessTexture,v_TexCoord0 ).r * m_RoughnessFactor;

	//unpack normal from the texture's 0..1 range back to -1..1
	vec3 normal=texture2D( m_NormalTexture,v_TexCoord0 ).xyz * 2.0 - 1.0;

	main0( color,ambient,emissive,metalness,roughness,normal );
}
main0() is actually a function #included by the shader that computes/writes the correct values out to the color buffer and gbuffers.
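Incidentally, those pow( …,2.2 ) calls are decoding sRGB textures into linear space, which is where lighting math needs to happen. Numerically the (approximate) conversion is just:

```python
def srgb_to_linear(c):
    # approximate sRGB decode, matching the shader's pow( color,vec3( 2.2 ) )
    return c ** 2.2

def linear_to_srgb(c):
    # inverse encode, typically applied right at the end of the frame
    return c ** (1.0 / 2.2)
```

Note how mid-grey in a texture (0.5) is only about 0.22 in linear light, which is why skipping the decode makes lighting look washed out.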
So you could write a super simple shader that just did this:
uniform vec4 m_Color;
uniform float m_Metalness;
uniform float m_Roughness;

void main(){
	//constant color/metalness/roughness, no occlusion, no emissive, flat normal
	main0( m_Color.rgb,vec3( 1.0 ),vec3( 0.0 ),m_Metalness,m_Roughness,vec3( 0.0,0.0,1.0 ) );
}
Or, you could pack the PBR params into fewer textures for a faster shader, or compute PBR params procedurally, etc. etc.
Anyway, going WAY off topic here!
Release date
I *hope* to have a v0.1 released this month. It will be far from finished, but this *is* an open source project and I feel bad about not releasing stuff already! But as always, my concern has been to have things ‘far enough’ along so I don’t have to massively change things *after* people are doing things with it. To this end, I will likely be releasing less than I would have liked (or we’ll be here forever!) but it should be more finished.