Note: This is an archived version of the Blender Developer Wiki. The current and active wiki is available on wiki.blender.org.

GSOC 2012 - Adding Subsurface Scattering to Cycles

Cycles currently delivers highly realistic images and supports many shaders but is missing Subsurface Scattering. Creating realistic looking skin on characters depends on having this effect, otherwise it will have a plastic-like quality. A wide variety of other materials would benefit from the soft translucency added by SSS (such as milk and granite).


Deliverables

  • A new SSS shader type for Cycles
  • Adding SSS to a material through Blender's node system will be as easy as adding a diffuse shader
  • Documentation in the Wiki manual
  • Documentation (either in the code or a separate document) for future developers

Implementation Details

The aim is to use a bidirectional scattering-surface reflectance distribution function (BSSRDF) approach to simulate the way light scatters not just at the point where it hits a surface but also at some depth below it.

Week 1 Progress Report

This week

  • Reading Blender's documentation
  • Getting familiar with the RNA API
  • Figuring out how Cycles works (still at it)
  • Took a look at the BSSRDF paper

Plan for next week

  • Create a new 'dummy' shader, maybe a copy of the diffuse shader
  • Play with the new shader in order to learn more about nodes and Cycles-Blender interaction.
  • Learn how a BSDF works (from playing with the copied shader) and then how a BSSRDF works, which will probably require brushing up on some math

Possible Roadblocks

  • I think I'll need help figuring out some of the math in the paper.

Week 2 Progress Report

This week

  • Got familiar with Blender's node API
  • Added my own node and shader, which is a copy of the diffuse BSDF shader
  • Played with nodes and my custom node but not the shader code
  • Took a look at how shaders are executed in the Cycles SVM

Plan for next week

  • Take a look at the BSDF code and see how that shader works (didn't get to that this week)
  • Begin looking at how a BSSRDF works, and read the associated paper(s)
  • See how a BSSRDF would fit into a path tracer, and how it differs from a BSDF

Week 3 Progress Report

This week

  • Brushed up on path tracing concepts. Looked at smallpt and the Mitsuba renderer code.
  • Began to tackle the papers on BSSRDF - "Implementing a skin BSSRDF" is a practical summary of two relevant papers

Notes

Path Tracing

To get a better understanding of a path tracer I took a look at smallpt, a very simple C++ path tracer. The process iterates over the rows and columns of the image. Each ray is traced from the camera to the first object it hits, and the color that the object contributes to the ray is determined by some function depending on the material. A new direction for the ray is also chosen, the ray is sent on its way in that direction, and the process repeats.

This keeps happening until the ray either hits a light, hits nothing, or reaches the maximum depth. The resulting colors are then averaged, giving us what we call a sample. What I learned is that accumulating many samples numerically solves the integral described in the papers; you don't solve an integral each time you take a sample.
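The "many samples solve the integral" idea can be illustrated with a tiny Monte Carlo estimator. This is a hypothetical sketch, not Cycles or smallpt code: it estimates the hemisphere integral of cos(theta), whose analytic value is pi, by averaging samples that are each just one random evaluation.

```python
import math
import random

def sample_hemisphere():
    # Uniform random direction on the unit hemisphere (z >= 0);
    # for a uniform sphere/hemisphere, z itself is uniformly distributed.
    z = random.random()
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_cosine_integral(n_samples):
    # Monte Carlo estimate of the integral of cos(theta) over the
    # hemisphere (analytic value: pi). The pdf of uniform hemisphere
    # sampling is 1 / (2*pi), so each sample is cos(theta) / pdf.
    total = 0.0
    for _ in range(n_samples):
        _x, _y, z = sample_hemisphere()
        cos_theta = z  # angle measured from the normal (0, 0, 1)
        total += cos_theta * (2.0 * math.pi)
    return total / n_samples
```

No single sample is the answer; only the running average converges toward pi, which is exactly what the renderer's per-pixel sample accumulation is doing.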

Additionally, sampling the aperture of the camera gives you depth of field. Sampling the reflection direction can give you a rough (glossy) reflection, and the same goes for refraction. Sampling the shutter in time gives you motion blur.

BSSRDF

The shader is the sum of two terms: a single scattering term and a multiple scattering term (diffusion scattering).

The single scattering term is where we try to find any photons that exit a point on the material's surface and end up hitting the camera. We essentially trace backwards from the camera, under the surface, and sample different depths along that path. At each sampled point we can then trace towards the light source and get that ray's contribution.
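Picking those depths along the backward-traced path is typically done by importance-sampling the exponential attenuation of light in the medium. A minimal sketch, assuming an extinction coefficient `sigma_t` (the function name and parameter are mine, not Cycles code):

```python
import math
import random

def sample_scatter_distance(sigma_t):
    # Importance-sample a distance s along the refracted ray with
    # pdf(s) = sigma_t * exp(-sigma_t * s), the standard choice for
    # media with exponential extinction. Inverting the CDF gives:
    xi = random.random()
    return -math.log(1.0 - xi) / sigma_t
```

Distances drawn this way cluster near the surface when the medium is dense (large `sigma_t`) and reach deeper when it is thin, which matches where single scattering actually contributes.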

The diffuse scattering term approximates how photons entering the surrounding area contribute to the light exiting from that point...
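The diffusion term is usually evaluated with the dipole approximation from Jensen et al.'s BSSRDF paper: a real light source placed just below the surface and a mirrored virtual source above it, combined into a diffuse reflectance profile R_d(r) that falls off with the distance r between entry and exit points. A minimal sketch of that profile (the coefficient values used in the test are placeholders, not measured material data):

```python
import math

def dipole_rd(r, sigma_a, sigma_s_prime, eta=1.3):
    # Dipole approximation of diffuse reflectance R_d(r).
    sigma_t_prime = sigma_a + sigma_s_prime        # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime    # reduced albedo
    sigma_tr = math.sqrt(3.0 * sigma_a * sigma_t_prime)  # effective transport coeff.
    # Diffuse Fresnel reflectance (polynomial fit) and boundary term A.
    f_dr = -1.440 / (eta * eta) + 0.710 / eta + 0.668 + 0.0636 * eta
    a_term = (1.0 + f_dr) / (1.0 - f_dr)
    z_r = 1.0 / sigma_t_prime                      # depth of real source
    z_v = z_r * (1.0 + 4.0 / 3.0 * a_term)         # height of virtual source
    d_r = math.sqrt(r * r + z_r * z_r)             # distance to real source
    d_v = math.sqrt(r * r + z_v * z_v)             # distance to virtual source
    return (alpha_prime / (4.0 * math.pi)) * (
        z_r * (1.0 + sigma_tr * d_r) * math.exp(-sigma_tr * d_r) / d_r ** 3
        + z_v * (1.0 + sigma_tr * d_v) * math.exp(-sigma_tr * d_v) / d_v ** 3
    )
```

The profile is largest near r = 0 and decays with distance, which is what produces the soft falloff around lit regions that SSS is known for.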

Next week

  • Understand the Diffusion Scattering part of the paper
  • Add to wiki how it will fit into blender
  • Start on the shader code

Week 4 Progress Report

This week

I kept reading about 'flux' in the papers, so I spent Monday and Tuesday powering through some calculus: double integrals, line integrals, flux integrals, vector fields and surface integrals. I don't think I should summarize the math here, but I can say that Khan Academy is a good resource for math up to a certain level. Anyway, this made reading about rendering massively clearer. On Wednesday I got familiar enough with the Diffusion Scattering to know, in theory, what I needed to do with Cycles to get it to work. I spent the rest of the day and week working out how to build the shader within Cycles and how to hook up all the relevant data/functions the code will need. As of yet nothing really works, unfortunately. On Friday I essentially had no internet (provider's fault) and little work got done.

Next week

I'm hoping to get work done on the Diffusion Scattering, because you can apparently do SSS using only that term, albeit less accurately.

Possible Roadblocks

I'm hoping there won't be any issues I can't solve myself or by bothering Brecht or the community.

Week 5 Progress Report

This week

I began to implement the Diffusion Scattering algorithm in Blender:

  • Hooked my code into the SVM closure. Set up the local frame of reference.
  • Inner loop (main equations) with just hard-coded coefficients for now.
  • Sample the three color channels and return the final scatter.
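Setting up a local frame of reference means building a tangent/bitangent pair around the shading normal so scatter samples can be placed in surface-local coordinates. A minimal sketch of one common construction (function name is mine, not the Cycles code):

```python
import math

def make_local_frame(n):
    # Build an orthonormal (tangent, bitangent, normal) frame around a
    # unit normal n. The branch picks the axis to cross against so the
    # tangent never degenerates to the zero vector.
    nx, ny, nz = n
    if abs(nx) > abs(nz):
        t = (-ny, nx, 0.0)
    else:
        t = (0.0, -nz, ny)
    length = math.sqrt(t[0] ** 2 + t[1] ** 2 + t[2] ** 2)
    t = (t[0] / length, t[1] / length, t[2] / length)
    # Bitangent completes the right-handed frame: b = n x t.
    b = (ny * t[2] - nz * t[1],
         nz * t[0] - nx * t[2],
         nx * t[1] - ny * t[0])
    return t, b, n
```

With this frame, a 2D sample offset on the surface (or a sampled scatter direction) can be expressed as a combination of `t`, `b` and `n` and transformed back to world space.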

Next Week

There are key things still missing, like precomputing the falloff positions, and I have bugs that prevent me from rendering anything yet that I'm trying to solve. I also have to trace the sample point up to the surface of the skin to make sure we start sampling at the surface.

Once that's done, I hope to spend some time making it render anything that looks remotely natural (I expect some constant tweaking will be necessary).