Note: This is an archived version of the Blender Developer Wiki. The current and active wiki is available on wiki.blender.org.

Shading system design

In the current Blender renderer, lighting is very mixed up with shading: it's nasty to maintain and extend, there's no clearly defined interface, bugs crop up easily, and it's very difficult to support more than one ubershader.

Why abstract lighting out via BSDF interface?

  • easier to keep under control, ensuring correct results
  • users can create one material and use it predictably across a variety of lighting scenarios and lighting technologies
  • easier to upgrade/extend the lighting system consistently without having to re-code all materials (i.e. adding caching mechanisms, adding new 'integrators')
  • easier for shader writers/users - just worry about defining surface properties and how the surface should react to light
    • This, in my experience as a shading/lighting artist, is the best way to go about things in terms of workflow. Keep shading to defining how the material reacts when it is lit, and leave as much of the creative decision-making as possible to the lighting stage. If you tailor a material too closely to one particular scenario/light setup/technique, it can cause problems with animations, where many different lighting situations will be required, sometimes within the one shot. It's also an issue for project management if the person lighting isn't the person who set up the shader and the shader doesn't respond well to different lighting setups.

So compared to RSL, where you might have to run your own illuminance loop, manually call texture3d() to look up irradiance in a brick map, and add that to the direct lighting contributions in the shader, here you just call the relevant BRDF function. That lets the lighting system determine the best way of gathering that incoming (and therefore reflected) radiance, based on your global renderer settings and perhaps material/node flags.
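To make the contrast concrete, here's a rough sketch in Python with entirely invented names - none of this is actual Blender or RSL code - of the difference between a shader running its own illuminance loop and a shader that only declares a BSDF and leaves the gathering to the lighting system:

# A minimal sketch of the contrast described above; all names are illustrative.
from dataclasses import dataclass


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


@dataclass
class Lamp:
    direction: tuple   # unit vector towards the lamp
    radiance: float


def shade_rsl_style(normal, albedo, lamps):
    """RSL-style: the shader runs its own illuminance loop and has to know
    where light comes from (lamps, brick-map irradiance lookups, ...)."""
    result = 0.0
    for lamp in lamps:
        result += (albedo / 3.14159) * lamp.radiance * max(0.0, dot(normal, lamp.direction))
    return result


@dataclass
class LambertBSDF:
    """The shader side only declares how the surface reacts to light."""
    albedo: float

    def eval(self, normal, light_dir):
        return (self.albedo / 3.14159) * max(0.0, dot(normal, light_dir))


class LightingSystem:
    """The lighting side decides how incoming radiance is gathered
    (direct lamps here, but it could be caches, other integrators, ...)."""

    def __init__(self, lamps):
        self.lamps = lamps

    def integrate(self, bsdf, normal):
        return sum(bsdf.eval(normal, l.direction) * l.radiance for l in self.lamps)


if __name__ == "__main__":
    lamps = [Lamp(direction=(0.0, 0.0, 1.0), radiance=2.0)]
    normal = (0.0, 0.0, 1.0)
    print(shade_rsl_style(normal, 0.8, lamps))
    print(LightingSystem(lamps).integrate(LambertBSDF(0.8), normal))

Both paths produce the same result here; the point is that only the second one keeps the material ignorant of where the light comes from.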

lighting == all scattered light

Not just direct diffuse illumination from lamps, but also indirect diffuse and specular/glossy reflection - all of it should operate under one consistent system.

i.e. rather than having special 'ambient occlusion' nodes for a skylight effect, this should be part of the lighting system, restricted to lambertian surfaces, and toggled/set up in the overall renderer settings.
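As a rough illustration (again with made-up names, not real Blender code), the AO/skylight term would live in the renderer settings and be applied by the lighting code itself, only for lambertian BSDFs:

# Sketch: AO handled by the lighting system, not by a material node.
from dataclasses import dataclass


@dataclass
class RenderSettings:
    use_ambient_occlusion: bool = False   # toggled globally in render settings
    ao_distance: float = 1.0


@dataclass
class DiffuseBSDF:
    albedo: float
    is_lambertian: bool = True


def direct_diffuse(bsdf, point):
    # Placeholder for direct lighting from lamps.
    return 0.25 * bsdf.albedo


def ambient_occlusion(point, distance):
    # Placeholder for a hemisphere occlusion/skylight query.
    return 0.5


def integrate_diffuse(settings, bsdf, point):
    """Hypothetical integrator entry point for diffuse lighting."""
    result = direct_diffuse(bsdf, point)
    # The skylight/AO term is applied by the lighting system itself,
    # restricted to lambertian surfaces, rather than by an 'AO node'.
    if settings.use_ambient_occlusion and bsdf.is_lambertian:
        result += ambient_occlusion(point, settings.ao_distance) * bsdf.albedo
    return result


if __name__ == "__main__":
    settings = RenderSettings(use_ambient_occlusion=True)
    print(integrate_diffuse(settings, DiffuseBSDF(albedo=0.8), point=(0.0, 0.0, 0.0)))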

Material pipeline

Inputs (textures, UI) -> BSDF -> Lighting -> Output colour

For most materials, the above will be sufficient and will operate in a predictable, physically plausible manner. Otherwise, using nodes, one could potentially re-order and manipulate any of the steps in between (e.g. deriving BSDF properties from speed vectors or facing ratios), mixing BSDF results with each other, modifying the final lit result with colour correction, etc.

Adding extra adjustments via nodes may make the overall material physically implausible, but that's fine - as long as it's correct by default, people should be able to knowingly configure things differently themselves too.
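A quick sketch of the idea, with invented names (this is not any real Blender node API): the default pipeline stays plausible, while a node setup is free to mix lit BSDF results and post-process them:

# Default pipeline vs. one node-style variation.
from dataclasses import dataclass


@dataclass
class LambertBSDF:
    albedo: float

    def eval(self, n_dot_l):
        return (self.albedo / 3.14159) * max(0.0, n_dot_l)


def lighting(bsdf, n_dot_l=1.0, radiance=1.0):
    # Stand-in for the lighting stage: the integrator gathers incoming
    # radiance and asks the BSDF how much of it gets reflected.
    return bsdf.eval(n_dot_l) * radiance


def default_material(texture_albedo):
    # Default pipeline: inputs (texture/UI) -> BSDF -> lighting -> output colour.
    return lighting(LambertBSDF(albedo=texture_albedo))


def node_material(texture_albedo, facing_ratio):
    # Node-style variation: derive a mix factor from the facing ratio,
    # mix two lit BSDF results, then colour-correct after lighting.
    # Possibly implausible physically, but deliberately so.
    a = lighting(LambertBSDF(albedo=texture_albedo))
    b = lighting(LambertBSDF(albedo=1.0 - texture_albedo))
    mixed = facing_ratio * a + (1.0 - facing_ratio) * b
    return min(1.0, mixed * 1.2)


if __name__ == "__main__":
    print(default_material(0.8))
    print(node_material(0.8, facing_ratio=0.3))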

Such a system also shouldn't preclude the possibility of having special-purpose nodes (such as an AO node) as extra tools in the toolbox for unusual situations where they really might be needed, *but* the system should be designed so that for the vast majority of usage that sort of thing isn't needed, and isn't encouraged by default.

Render Passes / AOVs

The current hardcoded render pass system is inflexible, difficult to maintain, and just plain doesn't work with any kind of modification in the node editor. Rather than a list of predefined passes, this should be more user-driven. Render settings can have a dynamic list of configurable passes, which can appear as inputs inside material node trees. They're really just named passes (perhaps tagged as scalar/colour/vector) that can be connected up inside the node system to whatever's required.

Default materials could come already connected up to placeholder outputs, ready for default passes to be enabled.

[Image: Shadsys node passes.png]
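A rough sketch of what user-driven passes might look like (names are purely illustrative, not an existing API):

# User-configurable passes as named, typed slots the node tree connects into.
from dataclasses import dataclass, field
from enum import Enum


class PassType(Enum):
    SCALAR = 1
    COLOUR = 2
    VECTOR = 3


@dataclass
class RenderPass:
    name: str
    type: PassType


@dataclass
class RenderSettings:
    # A dynamic, user-configurable list instead of a hardcoded set of passes.
    passes: list = field(default_factory=list)


def fill_passes(settings, node_tree_results):
    """Each enabled pass shows up as an input socket inside the material
    node tree; whatever the user connected there ends up in that pass.
    Default materials could come pre-connected to placeholder values."""
    return {p.name: node_tree_results.get(p.name, 0.0) for p in settings.passes}


if __name__ == "__main__":
    settings = RenderSettings(passes=[RenderPass("diffuse", PassType.COLOUR),
                                      RenderPass("custom_mask", PassType.SCALAR)])
    print(fill_passes(settings, {"diffuse": (0.4, 0.4, 0.4), "specular": 0.1}))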

Volumes

A new shading system should support volume shaders, as well as surface shaders, as part of the same material. This would be useful not only for volumes such as smoke, but also for managing transmission, for example inside a refracting medium.

For example, a simple glass material may have:

  • Surface shader -> Dielectric
  • Interior volume shader -> Simple glass

Or a more complicated version:

  • Surface shader -> Dielectric
  • Interior volume shader -> Green tinted absorbing, dispersive glass

Or a translucent alien creature:

  • Surface shader -> Slimy, glossy skin
  • Interior volume shader -> Yellow fat
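A minimal sketch of a material carrying both shader slots, mirroring the examples above (all names invented):

# A material with separate surface and interior volume shader slots.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SurfaceShader:
    name: str                      # e.g. "Dielectric", "Slimy, glossy skin"


@dataclass
class VolumeShader:
    name: str                      # e.g. "Simple glass", "Yellow fat"
    absorption: tuple = (0.0, 0.0, 0.0)


@dataclass
class Material:
    surface: SurfaceShader
    interior: Optional[VolumeShader] = None   # None for thin/opaque materials


glass = Material(surface=SurfaceShader("Dielectric"),
                 interior=VolumeShader("Simple glass"))

tinted_glass = Material(surface=SurfaceShader("Dielectric"),
                        interior=VolumeShader("Green tinted absorbing, dispersive glass",
                                              absorption=(0.4, 0.05, 0.4)))

creature = Material(surface=SurfaceShader("Slimy, glossy skin"),
                    interior=VolumeShader("Yellow fat"))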

Possible volume shading pipeline

  • BxDF: interaction at surface interface [1]
    • if BRDF -> reflection
      • shade incoming light
    • if BTDF -> transmission
      • volume shader, find back side
        • if ztransp -> bg= (alpha 0.0)
          • shade volume against bg
        • if raytraced -> bg= (shade intersection BxDF [1])
          • shade volume against bg
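The same outline transcribed into Python-flavoured pseudocode, with trivial stubs standing in for the real shading/tracing calls; every name here is invented for illustration:

# [1] BxDF interaction at the surface interface, as outlined above.
from dataclasses import dataclass


@dataclass
class Colour:
    r: float = 0.0
    g: float = 0.0
    b: float = 0.0
    alpha: float = 1.0


@dataclass
class Settings:
    ztransp: bool = True    # z-transparency vs. raytraced transmission
    max_depth: int = 4


@dataclass
class BxDF:
    reflective: bool        # True -> BRDF (reflection), False -> BTDF (transmission)


# Stand-in stubs so the control flow below actually runs.
def evaluate_material(point):
    return BxDF(reflective=False)

def shade_incoming_light(bxdf, point):
    return Colour(1.0, 1.0, 1.0)

def find_back_side(point):
    return point            # exit point of the interior volume

def trace_ray(point):
    return None             # next surface hit, or None if nothing is hit

def shade_volume(entry_point, exit_point, bg):
    return Colour(bg.r, bg.g + 0.5, bg.b, 1.0)


def shade_surface(point, settings, depth=0):
    """[1] BxDF interaction at the surface interface."""
    bxdf = evaluate_material(point)

    if bxdf.reflective:
        # BRDF -> reflection: shade incoming light
        return shade_incoming_light(bxdf, point)

    # BTDF -> transmission: run the volume shader, find the back side
    back = find_back_side(point)

    if settings.ztransp or depth >= settings.max_depth:
        bg = Colour(alpha=0.0)      # ztransp -> background with alpha 0.0
    else:
        hit = trace_ray(back)       # raytraced -> shade the intersection's BxDF [1]
        bg = shade_surface(hit, settings, depth + 1) if hit else Colour(alpha=0.0)

    # shade the volume against whatever background we found
    return shade_volume(point, back, bg)


if __name__ == "__main__":
    print(shade_surface(point=(0.0, 0.0, 0.0), settings=Settings(ztransp=True)))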