
Shading System

Mental Ray

  • Very low level API
  • Most existing legacy shaders use illuminance loops like in renderman
  • Material/volume/environment/light/shadow/photon/lens/brdf/etc.... shaders.
  • Photon shaders - legacy, for photon map GI.
  • BSDF support added in most recent version.
  • When a shader contains a bsdf, its sample() method is used as the photon shader if no dedicated photon shader is attached.
  • bsdf->eval() and bsdf->sample() (sketched below):
    • eval() - used within the light loop in the shader, accumulating light contributions (seems to cover only direct light, though possibly glossy contributions too).
    • sample() - used to sample specular components (generates a ray direction for shading).
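A minimal C sketch of that eval()/sample() split, using made-up names rather than mental ray's actual API:

/* Hypothetical types and names, illustrating only the eval()/sample() split. */
typedef struct Light {
    float L[3];     /* direction to one light sample */
    float Cl[3];    /* colour of that light sample */
} Light;

typedef struct Bsdf {
    /* eval(): reflectance for light arriving from direction L,
       called once per light sample inside the light loop. */
    void (*eval)(const struct Bsdf *b, const float L[3], float f[3]);
    /* sample(): generate an outgoing direction R and return its pdf;
       used for specular/glossy rays, and doubling as the photon shader
       when no dedicated one is attached. */
    float (*sample)(const struct Bsdf *b, const float rnd[2], float R[3]);
    void *data;
} Bsdf;

/* Direct lighting: the light loop accumulates eval() times light colour. */
static void shade_direct(const Bsdf *b, const Light *lights, int num_lights, float Ci[3])
{
    for (int i = 0; i < num_lights; i++) {
        float f[3];
        b->eval(b, lights[i].L, f);
        Ci[0] += f[0] * lights[i].Cl[0];
        Ci[1] += f[1] * lights[i].Cl[1];
        Ci[2] += f[2] * lights[i].Cl[2];
    }
}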

http://download.autodesk.com/us/maya/2009help/mr/manual/index.html

Vray

  • No shading language, but plugin API
  • Lighting decoupled from shading
  • Still some legacy aspects - the default material is not entirely physical: it has a diffuse/refraction/specular split, and therefore some slightly odd options for energy conservation (dimming 'diffuse' based on 'specular' strength, etc.).
  • Higher level - simple example:
void MyMaterial::shade(VR::VRayContext &rc) {
 // Clear the current result
 rc.mtlresult.clear();
 // Get a new BTDF sampler for the intersection point
 BTDFSampler *btdf=newBTDF(rc);
 // Compute the lighting contribution
 VR::Color directLight=rc.evalLight(*btdf);
 // Trace the BRDF forward (reflections, refractions etc)
 btdf->traceBTDFForward(rc, true);
 // Add in light contribution
 rc.mtlresult.color+=directLight;
 // Delete the BRDF
 deleteBTDF(rc, btdf);
}
  • Allows disabling individual components or using different techniques for them simultaneously (e.g. irradiance map for indirect diffuse combined with photon map caustics).
  • RGB light (not spectral)
  • Layered shaders supported
  • Direct light manager:
    • handles direct lighting
  • Global light manager
    • handles direct lighting (by calling direct light manager)
    • handles indirect illumination
      • primary bounces
      • secondary bounces
      • caustics
  • Multiple importance sampling - the global light manager handles sampling lights, the bsdf handles sampling reflection vectors, and the sampling framework combines the two estimates (see the weighting sketch after this list).
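For reference, the usual way two such sampling strategies are combined is with multiple importance sampling weights. A generic sketch (standard MIS heuristics, not V-Ray's actual code):

/* Balance heuristic: weight for a sample drawn from strategy A (e.g. light
   sampling), given the pdf the other strategy (bsdf sampling) would have
   assigned to the same direction. Each strategy's contribution is multiplied
   by its weight before accumulation, so using both doesn't double-count. */
static float mis_balance(float pdf_a, float pdf_b)
{
    return pdf_a / (pdf_a + pdf_b);
}

/* Power heuristic (beta = 2), which typically reduces variance further. */
static float mis_power(float pdf_a, float pdf_b)
{
    float a = pdf_a * pdf_a, b = pdf_b * pdf_b;
    return a / (a + b);
}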

http://www.spot3d.com/vray/help/150SP1/

Renderman SL

  • Designed a long time ago and has survived surprisingly well, but it was not designed around modern concepts such as BxDFs or GI.
  • Lighting is deeply integrated with shaders, i.e. illuminance loops
  • Indirect lighting techniques are often left to shader writers to add to direct illumination themselves (e.g. occlusion or photon map gather shadeops). Sometimes (e.g. 3Delight's automatic photon map mode) they can be integrated into the lighting step.

Houdini Mantra / PBR

  • Old-school Mantra is very similar to RenderMan.
  • Mantra's PBR ('Physically Based Rendering') mode, new in H9, is based on path tracing (but is also starting to support photon maps better for secondary bounces).
  • PBR can be used with micropolygon or raytrace sampling.
  • The shading language/node system includes bsdf data types and functions, used to retrieve reflected light at the current shading point via a nominated bsdf. Preset bsdfs are included.
  • PBR use in shading nodes - final output node contains an 'F' input, which is used to sample the material for an outgoing direction.
    • Inside a material node tree, bsdf nodes output a colour and an 'F', which can be mixed with other bsdfs and plugged into the material's final 'F' input. I'm presuming this 'F' contains both an outgoing vector and a probability density function (pdf) value. This would mean that where shading for a colour output is not required (eg. photon bouncing), the renderer can just query the material's F value from the node tree (a rough mixing sketch follows this list).
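Assuming 'F' really does boil down to a sampled direction plus pdf, mixing two bsdf outputs could look roughly like the C sketch below (purely illustrative, not VEX or Mantra code; all names are hypothetical):

/* Hypothetical 'F'-like payload: a sampled outgoing direction and its pdf. */
typedef struct BsdfSample {
    float dir[3];
    float pdf;
} BsdfSample;

typedef struct BsdfF {
    float weight;    /* mix weight of this component */
    BsdfSample (*sample)(const struct BsdfF *f, const float rnd[2]);
    float (*pdf)(const struct BsdfF *f, const float dir[3]);
} BsdfF;

/* Mix two components: pick one in proportion to its weight, sample it, then
   combine both pdfs so the mixed result is still a valid distribution. */
static BsdfSample bsdf_mix_sample(const BsdfF *a, const BsdfF *b,
                                  float r_pick, const float rnd[2])
{
    float wa = a->weight / (a->weight + b->weight);
    BsdfSample s = (r_pick < wa) ? a->sample(a, rnd) : b->sample(b, rnd);
    s.pdf = wa * a->pdf(a, s.dir) + (1.0f - wa) * b->pdf(b, s.dir);
    return s;
}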

http://www.vimeo.com/channels/54102#6226221 http://www.sidefx.com/docs/houdini10.0/rendering/understanding

my own experiments, july 09

  • Some conventions inspired by RSL/VEX, but abstracting lighting from shading
  • Working pipeline: Inputs (textures, UI) --> BSDF --> Lighting --> Output colour
  • Lighting system - called with bsdf as input - eg. direct lighting integrator:
    • if diffuse brdf, for each light
      • sample the lamp, for sample location and colour
      • attenuate the lamp based on distance between lamp and shading point
      • check lamp's visibility (is the shading point in shadow)
      • feed this incoming lamp colour into the bsdf->shade_func(), returning reflected colour
    • if glossy brdf, for each glossy sample
      • sample outgoing surface direction and pdf with bsdf->sample_func()
      • trace outgoing vector, and shade intersection, returning reflected colour
  • Light sampling interface - eg. void lamp_sample_area_solidangle(Render *re, LampRen *lar, float *P, float *Pl, float *L, float *Cl, float *pdf, float *u)
    • sampling via solid angle (surface -> light)
    • sampling via lamp area (light -> surface)
  • Attempted to clean up shadeinput, replacing it with a new 'shadeinfo' without some of the accumulated members and with more standardised naming.
  • BSDF interface
typedef void (*BSDF_Shade) (struct Bsdf *bsdf, ShadeInfo *si, float *Ci, float *Cl);
typedef float (*BSDF_Sample) (struct Bsdf *bsdf, ShadeInfo *si, float *R, float *u);
typedef struct Bsdf {
    BSDF_Shade shade_func;
    BSDF_Sample sample_func;

    void *shade_vars;
} Bsdf;

BSDF structure, with function pointers to sampling and shading functions

  • bsdf->shade_func(): returns proportion of light scattered by bsdf

(eg lambert: dot(N, L) )

  • bsdf->sample_func(): returns pdf and outgoing vector scattered by bsdf. Takes random numbers as input

(eg. lambert: returns a cosine-weighted random vector in the hemisphere, whose pdf is cos(θ)/π.)

  • bsdf->shade_vars: any additional info used by the bsdf itself, eg. a 'shininess' parameter for a blinn-phong brdf

The bsdf shade_func worked similarly to an RSL illuminance loop, modifying Ci within the function. Not sure whether it's better to do that there or outside. A rough Lambert implementation of this interface is sketched below.
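As a concrete example, a Lambert bsdf filling in the interface above might look roughly like this. ShadeInfo's real fields aren't listed on this page, so assume it carries the shading normal N and the direction L towards the current light sample:

#include <math.h>

typedef struct ShadeInfo {
    float N[3];   /* shading normal (assumed field name) */
    float L[3];   /* direction to the current light sample (assumed field name) */
} ShadeInfo;

static float dot_v3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* shade_func: Ci += Cl * max(dot(N, L), 0) */
static void lambert_shade(struct Bsdf *bsdf, ShadeInfo *si, float *Ci, float *Cl)
{
    float nl = dot_v3(si->N, si->L);
    if (nl > 0.0f) {
        Ci[0] += Cl[0]*nl;
        Ci[1] += Cl[1]*nl;
        Ci[2] += Cl[2]*nl;
    }
}

/* sample_func: cosine-weighted direction in the hemisphere around N, built
   from the two random numbers in u; returns the sample's pdf, cos(theta)/pi. */
static float lambert_sample(struct Bsdf *bsdf, ShadeInfo *si, float *R, float *u)
{
    float r = sqrtf(u[0]);
    float phi = 2.0f*(float)M_PI*u[1];
    float x = r*cosf(phi), y = r*sinf(phi);
    float z = sqrtf(1.0f - u[0]);            /* = cos(theta) */
    float T[3], B[3], len;

    /* build an orthonormal basis (T, B, N) around the normal */
    if (fabsf(si->N[0]) > 0.5f) { T[0] = si->N[1]; T[1] = -si->N[0]; T[2] = 0.0f; }
    else                        { T[0] = 0.0f; T[1] = si->N[2]; T[2] = -si->N[1]; }
    len = sqrtf(dot_v3(T, T));
    T[0] /= len; T[1] /= len; T[2] /= len;
    B[0] = si->N[1]*T[2] - si->N[2]*T[1];
    B[1] = si->N[2]*T[0] - si->N[0]*T[2];
    B[2] = si->N[0]*T[1] - si->N[1]*T[0];

    R[0] = x*T[0] + y*B[0] + z*si->N[0];
    R[1] = x*T[1] + y*B[1] + z*si->N[1];
    R[2] = x*T[2] + y*B[2] + z*si->N[2];

    return z/(float)M_PI;
}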

Note: the above system didn't support BTDFs or layered BSDFs. It can be extended in various ways - perhaps just make an overall BSDF struct with a list of BxDFs carrying layer weights plus tags such as BxDF_GLOSSY, BxDF_TRANSMISSION, etc. (one possible shape is sketched below).
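One possible shape for that extension, sketched only (none of these names exist in actual code; BSDF_Shade/BSDF_Sample are the typedefs above):

enum {
    BxDF_DIFFUSE      = (1 << 0),
    BxDF_GLOSSY       = (1 << 1),
    BxDF_TRANSMISSION = (1 << 2),
};

typedef struct BxdfLayer {
    int type;                    /* BxDF_* tag */
    float weight;                /* layer weight used when mixing/sampling */
    BSDF_Shade shade_func;       /* per-layer shading, same signature as above */
    BSDF_Sample sample_func;     /* per-layer sampling, same signature as above */
    void *shade_vars;
    struct BxdfLayer *next;
} BxdfLayer;

typedef struct LayeredBsdf {
    BxdfLayer *layers;           /* linked list of weighted BxDF layers */
    int type_mask;               /* union of the layer tags, for quick queries */
} LayeredBsdf;

An integrator could then walk only the layers whose tags it cares about (e.g. only BxDF_GLOSSY layers when bouncing caustic photons).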

http://mke3.net/blender/devel/rendering/s/

Geometry handling/caching

Mental Ray

  • Uses geometry cache
  • "not all data is used at the same time. Data enters the cache when needed, and may get displaced later when memory fills up. Memory usage rises smoothly up to the defined memory limit, and then will try to stay there."
  • Can use 'assemblies' to store offline (or procedurally generated) geometry, which is generated and loaded when its bounding box is first intersected (demand loading sketched below).
  • Can generate shadow map tiles on request - the entire shadow map need not be calculated as a pre-process. Shadow map tiles are also stored and discarded in the geometry cache.
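A rough, purely illustrative C sketch of that demand-loading idea (nothing to do with mental ray's actual implementation): geometry enters the cache when its bounding box is first hit, and older entries are freed once a memory limit is reached.

#include <stddef.h>
#include <stdlib.h>

typedef struct Assembly {
    float bbox_min[3], bbox_max[3];
    void *geometry;               /* NULL until first intersected */
    size_t mem_size;              /* approximate footprint once loaded */
    struct Assembly *next;        /* all assemblies in the scene */
} Assembly;

typedef struct GeoCache {
    Assembly *assemblies;
    size_t used, limit;           /* usage rises to the limit and then tries to stay there */
} GeoCache;

/* Placeholder for the actual tessellation/file read. */
static void *load_geometry(Assembly *a) { return malloc(a->mem_size); }

/* Naive eviction policy: free the first loaded assembly that isn't 'keep'. */
static void evict_one(GeoCache *c, Assembly *keep)
{
    for (Assembly *a = c->assemblies; a; a = a->next) {
        if (a != keep && a->geometry) {
            free(a->geometry);
            a->geometry = NULL;
            c->used -= a->mem_size;
            return;
        }
    }
}

/* Called when a ray first enters an assembly's bounding box. */
void *cache_fetch(GeoCache *c, Assembly *a)
{
    if (!a->geometry) {
        while (c->used + a->mem_size > c->limit && c->used > 0)
            evict_one(c, a);
        a->geometry = load_geometry(a);
        c->used += a->mem_size;
    }
    return a->geometry;
}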