Monkey2 3d update!

Hi everyone!

There have been some important ‘behind the scenes’ developments recently that have been hard to show via screenshot, so apologies for the Twitter silence lately. Here’s a hopefully informative blog update instead…

Deferred HDR rendering

Up until this point, all rendering has been performed directly to the app window, in a single pass, and with a single ‘fudged in’ light. This is commonly known as ‘forward rendering’. It’s a pretty simple approach and (can be) very fast.

Another approach is ‘deferred rendering’. With deferred rendering, rendering is instead performed to an offscreen texture, and instead of trying to do everything in one pass, a deferred renderer generally does things in multiple, often simpler passes. Deferred renderers also usually maintain a number of ‘gbuffers’ – special buffers that contain ‘per pixel’ geometric information, such as surface color/metalness/normal etc. This extra information is usually rendered on the first pass, and is used by subsequent passes for lighting, shadows, special effects and so on.

This sounds complex, and it kind of is, but it offers several really nice benefits. First, lights are very fast to render, eg: for pointlights you just render a quad per light and the shader uses the existing gbuffers and depth buffer to perform all lighting calculations. Second, it allows you to write all sorts of cool ‘post processing’ effects that can also use the gbuffers/depth buffer for effects like fog, ‘god rays’, screen space reflections, bloom, depth-of-field and a lot more. This typically involves drawing a fullscreen rect that has access to the ‘current’ color buffer, the gbuffers and the depth buffer. Shader code then has the ability to inspect the normal of a single pixel, or its depth value (and therefore view/world etc position) and any material specific info like roughness/glossiness (perhaps for reflection effects?).
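To give a flavor of this, here’s a minimal GLSL sketch of a depth based fog pass. Note that the uniform/varying names here are made up for illustration – they’re not mx2’s actual ones:

    // Depth based fog post process pass (sketch only).
    uniform sampler2D r_colorBuffer;    // current hdr color buffer
    uniform sampler2D r_depthBuffer;    // depth buffer
    uniform vec2 r_depthRange;          // x=near plane, y=far plane
    uniform vec3 r_fogColor;
    uniform float r_fogDensity;

    varying vec2 texCoord0;             // fullscreen rect uv

    void main(){
        vec3 color=texture2D( r_colorBuffer,texCoord0 ).rgb;
        float depth=texture2D( r_depthBuffer,texCoord0 ).r;
        // Convert depth buffer value back to linear view space z.
        float znear=r_depthRange.x,zfar=r_depthRange.y;
        float ndc=depth*2.0-1.0;
        float z=2.0*znear*zfar/( zfar+znear-ndc*( zfar-znear ) );
        // Further away = more fog.
        float fog=1.0-exp( -r_fogDensity*z );
        gl_FragColor=vec4( mix( color,r_fogColor,fog ),1.0 );
    }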

But this does come at a cost – if your game only has a few lights and doesn’t do much post processing, forward rendering could well be the better/faster approach (I think it’s generally the recommended approach for VR). Also, mobile is unlikely to have the grunt just yet to make good use of deferred rendering, so forward rendering might make more sense on mobile too. But for desktop developers, deferred rendering is IMO the way to go as it’s more powerful and, if you have lots of lights, can be faster. Given most mx2 users are desktop users, it’s what I’m going to start with. Adding a forward renderer in future should not be a big deal, but I don’t want to have to maintain 2 renderers right now.

In the process of moving to a deferred renderer, I added some new floating point pixel formats for textures, so the main ‘color buffer’ texture that ultimately contains what you see is now in 16 bit floating point format. This gives you ‘hdr’ (high dynamic range) colors, where the color buffer can actually contain color RGB values >1. You can’t actually show such colors on a normal monitor (this is what the new HDR TVs are all about) but you can smear the over-bright colors around a bit so they brighten up neighboring pixels, making the entire scene appear brighter (or ‘glow’) in the process. This is a post processing effect known as ‘bloom’, and is the only one I currently have going!
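For the curious, the guts of bloom is basically a ‘bright pass’ – grab everything over a threshold, blur it, add it back. A minimal GLSL sketch of the bright pass bit (the names and threshold are illustrative only):

    // Bloom bright pass (sketch): keep only the 'overbright' part of
    // each pixel. The result then gets blurred and added back.
    uniform sampler2D r_colorBuffer;    // 16 bit float hdr color buffer
    uniform float r_bloomThreshold;     // eg: 1.0 - colors above this bloom

    varying vec2 texCoord0;

    void main(){
        vec3 color=texture2D( r_colorBuffer,texCoord0 ).rgb;
        vec3 bright=max( color-vec3( r_bloomThreshold ),vec3( 0.0 ) );
        gl_FragColor=vec4( bright,1.0 );
    }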

New PBR material system.

I also ended up changing the PBR material system used. Previously, I had been using what’s known as a ‘specular/gloss’ system. This basically means your materials have 2 color textures – diffuse and specular – and a single (1 component) ‘gloss’ texture. Instead, I am now using a slightly different system known as ‘metalness/roughness’ (ultimately, these are both PBR systems, they really just store material parameters differently). The metalness/roughness approach actually uses fewer parameters than specular/gloss as it combines the diffuse/specular textures into a single ‘color’ texture, with an additional (1 component) ‘metalness’ texture. Metalness effectively says how much of color is diffuse and how much is specular, eg: if metalness=1, then color is really just specular. If metalness=0, then color is really diffuse. This system also has a ‘gloss’ texture, only it’s stored as 1.0-gloss and called ‘roughness’ instead. Not sure why the difference here, but it seems to have become a standard of sorts so I’ll go with it.
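In shader terms, getting back from color/metalness to diffuse/specular is tiny – a sketch, where vec3( 0.04 ) is the commonly used constant reflectance for non-metals:

    // Recover diffuse/specular from color/metalness (sketch).
    vec3 diffuseColor( vec3 color,float metalness ){
        return color*( 1.0-metalness );
    }
    vec3 specularColor( vec3 color,float metalness ){
        return mix( vec3( 0.04 ),color,metalness );
    }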

There are 2 advantages to the metalness/roughness system. First, it is theoretically harder/impossible to accidentally produce ‘unrealistic’ colors that can’t occur in the real world, because basically you’ve got fewer parameters to tweak, ie: color, metalness and roughness, and pretty much all combinations of these are ‘valid’ and should produce a realistic color. Not so if you have separate diffuse/specular textures, where (apparently) some combos of diffuse/specular just don’t exist IRL. Second, it simplifies the deferred gbuffer system as there is less to store in the gbuffers, ie: instead of having to store diffuse/specular (6 floats), you only have to store color/metalness (4 floats), which can be stored in a single RGBA texture (ie: RGB=color, A=metalness). The upshot is that the metalness/roughness system only requires 2 offscreen textures/gbuffers for deferred rendering: a color buffer (RGB=color, A=metalness) and a normal buffer (RGB=normal, A=roughness).
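The first pass then just packs everything into those two gbuffers – something like this sketch using MRT (‘multiple render target’) outputs, where color/metalness/normal/roughness have been computed by earlier material code:

    // First pass gbuffer write (sketch), using 2 MRT outputs.
    void main(){
        // Buffer 0: RGB=color, A=metalness.
        gl_FragData[0]=vec4( color,metalness );
        // Buffer 1: RGB=normal (packed into 0..1 range), A=roughness.
        gl_FragData[1]=vec4( normal*0.5+0.5,roughness );
    }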

So all in all, the ‘mostly finalized’ default PbrMaterial class allows you to play with the following material properties:

ColorTexture:Texture – RGB color texture.
ColorFactor:Color – color factor. ColorTexture is multiplied by this. Ditto for all factors below.
AmbientTexture:Texture – RGB ambient texture – really ‘ambient occlusion’. Allows you to mask out ambient light.
AmbientFactor:Color – ambient factor.
EmissiveTexture:Texture – RGB emissive texture. Doesn’t actually add a light source (could it?) but pixels will be at least this bright!
EmissiveFactor:Color – emissive factor.
MetalnessTexture:Texture – single component metalness texture.
MetalnessFactor:Float – metalness factor.
RoughnessTexture:Texture – single component roughness texture.
RoughnessFactor:Float – roughness factor.
NormalTexture:Texture – normal texture.

You can also write your own material classes that take different parameters from this, but if you’re using the deferred renderer and want lighting, your material shader will have to somehow produce the following PBR params:
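(The names here are illustrative and the final set may end up slightly different – I’ve also left the ambient side out of these sketches for brevity.)

    vec3 color;         // surface color
    vec3 emissive;      // emissive color
    float metalness;    // 0=non-metal, 1=metal
    float roughness;    // ie: 1.0-gloss
    vec3 normal;        // per pixel normal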

These are then passed to an internal main0() function. For example, the default fragment shader for the built-in PbrMaterial looks a bit like this:
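(This is a sketch only, assuming main0() takes the params directly – the uniform names and the exact main0() signature may differ in the real thing.)

    // Default PbrMaterial fragment shader (sketch).
    uniform sampler2D m_ColorTexture;
    uniform vec3 m_ColorFactor;
    uniform sampler2D m_EmissiveTexture;
    uniform vec3 m_EmissiveFactor;
    uniform sampler2D m_MetalnessTexture;
    uniform float m_MetalnessFactor;
    uniform sampler2D m_RoughnessTexture;
    uniform float m_RoughnessFactor;
    uniform sampler2D m_NormalTexture;

    varying vec2 texCoord0;
    varying mat3 tanMatrix;     // tangent space -> view space

    void main(){
        vec3 color=texture2D( m_ColorTexture,texCoord0 ).rgb*m_ColorFactor;
        vec3 emissive=texture2D( m_EmissiveTexture,texCoord0 ).rgb*m_EmissiveFactor;
        float metalness=texture2D( m_MetalnessTexture,texCoord0 ).r*m_MetalnessFactor;
        float roughness=texture2D( m_RoughnessTexture,texCoord0 ).r*m_RoughnessFactor;
        vec3 normal=normalize( tanMatrix*( texture2D( m_NormalTexture,texCoord0 ).rgb*2.0-1.0 ) );

        main0( color,emissive,metalness,roughness,normal );
    }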

main0() is actually a function #included by the shader that computes/writes the correct values out to the color buffer and gbuffers.

So you could write a super simple shader that just did this:
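(Again assuming the sketched main0() signature from above:)

    // About the simplest possible material shader: constant params, no textures.
    void main(){
        main0( vec3( 1.0 ),             // color: plain white
               vec3( 0.0 ),             // emissive: none
               0.0,                     // metalness: non-metal
               1.0,                     // roughness: fully rough
               vec3( 0.0,0.0,1.0 ) );   // normal: straight at the viewer
    }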

Or, you could pack the PBR params into fewer textures for a faster shader, or compute PBR params procedurally, etc. etc.

Anyway, going WAY off topic here!

Release date

I *hope* to have a v0.1 released this month. It will be far from finished, but this *is* an open source project and I feel bad about not releasing stuff already! But as always, my concern has been to have things ‘far enough’ along so I don’t have to massively change things *after* people are doing things with it. To this end, I will likely be releasing less than I would have liked (or we’ll be here forever!) but it should be more finished.

Bye!
Mark

12 thoughts to “Monkey2 3d update!”

  1. Sounds cool, but I’m worried about the “Given most mx2 users are desktop users” bit. I’m not – I’m developing a game in MX2 for iOS and Android. I’ve asked this before: Please confirm that the mobile targets will be first class citizens in the future of MX2.

    I find your thoughts on 3d deferred rendering perfectly fine, as long as you are not hurting (too much) 2D mobile.

  2. Really looking forward to playing with this, should at least make me play about with mx2 a lot more than I have!

    Don’t know if you’ve seen my previous rantings on this, but I really, really, really, really, really think you ought to do a mini-“MonkeyShader” compiler that takes Monkey-like code and outputs the required GLSL, to finally make shaders much more accessible to all, in the same way that Blitz/MX takes away the need to deal directly with C/C++ and co.

    I’m reasonably comfortable with the C-ish GLSL syntax now, but surely half the point of Blitz, Monkey, etc, is to allow people to not have to pick all that up, make complex systems more approachable, etc? (It could also potentially put out other forms of shader code, of course.)

    This translation of the super simple shader above looks a lot friendlier from the point of view of a Blitz/Monkey/other hobbyist programmer, and from my sheltered perspective wouldn’t take much work for an experienced compiler developer to implement (ahem)…

    It is of course a stripped-down subset of Blitz/Monkey-style code… it seems, at least from the outside, that there would be very little you’d need to implement to output useable GLSL.

    Perhaps there’s a lot more to the GLSL language/rules than meets the eye, though…

    I know it would need extra work to handle things like the vec3(1.0) argument, or force passing the full vec3(1.0, 1.0, 1.0), but even that would seem a lot more approachable for the average hobbyist already working in mx2.

    Just wondering what your take would be on this sort of thing… ?

  3. Sounds great!

    I’m with Difference though, I’ll be using MX2 mainly for mobile targets.

    And I think DruggedBunny’s idea is a great one (one I’ve thought of too!) – having MX-style shader code would be awesome!

  4. time needed as an input

    also you are dealing with rgbA but not using the alpha?

  5. This update sounds amazing, but are you sure most people are using Monkey for desktop only? My main focus is Android, with the potential to port easily to iOS. Desktop is great for prototyping!
