Design proposal for revising the texture editing workflow; there is a separate document about shader editing in general. It is quite broad and covers things not immediately related to textures, but the intention is to give an idea of how all the pieces fit together.
-- Brecht 12:19, 7 June 2011 (CEST)
Texture datablocks will no longer be used in the new shading system. Rather, each place that uses a texture will get its own shading node tree. Node groups will effectively replace texture datablocks as a way of sharing textures, though textures are typically used in a single place anyway, so node groups will usually not need to be created.
The output node can have different inputs depending on where it is used.
This is not implemented yet, so pretend there is no texture datablock when evaluating the design.
Textured draw mode is currently used for editing textures, GLSL preview and game engine material editing. We intend to make a clearer separation.
- Texture: this would become the draw mode for editing, painting and mapping individual textures. Lighting will be the same as in solid draw mode, so in a way this is the textured solid draw mode promoted to the top level. What is drawn is not the UV texture layer but the active texture node in the material.
- Material: here we draw the entire material with all textures using GLSL, with lighting the same as in solid draw mode. This would again be used mostly for editing, painting and mapping textures, but while seeing them in the full material, still with simple lighting.
- Rendered: this draw mode depends on the engine selected for output. Both offline renderers and game engines can draw into the viewport here, with full lighting.
Material texture nodes will get a flag indicating the active texture: the node that is being painted on, shown in the image editor for a particular face, and drawn in texture draw mode.
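A minimal sketch of what the active-texture lookup could look like. The class and attribute names here are hypothetical, purely to illustrate the proposed flag; they are not the actual Blender API:

```python
# Hypothetical sketch of the proposed active-texture flag on material
# texture nodes; names are illustrative, not Blender's real API.

class Node:
    def __init__(self, name, is_texture=False, active_texture=False):
        self.name = name
        self.is_texture = is_texture          # e.g. an image texture node
        self.active_texture = active_texture  # the proposed flag

def find_active_texture(node_tree):
    """Return the node flagged as the active texture, or None.

    This is what texture paint mode, the image editor and texture
    draw mode would all query to agree on one node."""
    for node in node_tree:
        if node.is_texture and node.active_texture:
            return node
    return None

tree = [
    Node("Mapping"),
    Node("Diffuse BSDF"),
    Node("Image Texture", is_texture=True, active_texture=True),
]
print(find_active_texture(tree).name)  # Image Texture
```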
Regarding GLSL lighting: we keep GLSL shading with support for bump maps, glossy maps, etc., but use solid draw mode lighting and simpler BSDFs. Things like shadows, environment light / AO and indirect light are slow and very difficult to match. If you use those in your scene, GLSL is not going to match the scene lighting well anyway, and it is likely to be too slow for editing.
So we do not try to match the scene lighting, and instead make some approximations oriented towards editing rather than trying to match the renderer as closely as possible. With the game engine selected, rendered draw mode can draw GLSL with scene lighting, but it does not need to match the offline renderer precisely, and features not supported can be hidden in the UI.
If there is no GLSL support, textured draw mode will only be able to draw image nodes; how can we have a predictable fallback here? Automatically find the first texture node lower in the tree?
- Don't use GLSL for texture draw mode; draw only the single active texture node and its parameters.
Do we need a full shaded GLSL draw mode for offline renders, given that we already have a Rendered draw mode, or do we leave this as a game engine feature?
- Leave it as a game engine feature.
Where do we get the color for solid draw mode? Find some heuristic to get it automatically from the node tree? What happens if the BSDF color input is overridden by a texture? Then there is nowhere obvious to get the color from.
- Automatic, with an option to specify it manually.
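The "automatic + manual override" resolution above could be sketched roughly as follows. This is only an illustration of the decision logic under assumed names; the actual heuristic, fallback color and API are undecided:

```python
# Illustrative heuristic for deriving a solid draw mode color from a
# BSDF color input; all names here are hypothetical.

class ColorInput:
    def __init__(self, default=(0.8, 0.8, 0.8), linked_node=None):
        self.default = default          # fixed value when nothing is linked
        self.linked_node = linked_node  # e.g. a texture driving the input

def solid_color(color_input, manual_override=None):
    """A manual override wins; otherwise use the unlinked input value,
    falling back to a neutral grey when a texture drives the input and
    no single color is available."""
    if manual_override is not None:
        return manual_override
    if color_input.linked_node is None:
        return color_input.default
    return (0.8, 0.8, 0.8)  # assumed neutral fallback

print(solid_color(ColorInput(default=(1.0, 0.2, 0.2))))  # (1.0, 0.2, 0.2)
```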
Painting & UV Editing
The currently painted texture is taken from the active texture node. The properties editor will highlight the node being painted on and allow it to be set when in texture paint mode.
UV editing is done on the active UV layer, irrespective of the selected node in the material.
- To do UV editing, the user would have to ensure they selected the right UV layer and material node. Do we want to automatically sync these selections somehow? Either way it seems confusing.
- Currently there is no convenient way to select the active texture node from the properties editor. How can we do this without adding too much clutter? Also somewhat confusing in the UI is the distinction between a node input and the node linked to that input.
Mapping and Stacking
Cycles currently requires you to add nodes to do basic mapping, texture blending or applying a color ramp. This is flexible, but it also gives quite deep node trees, and it is not easy to find out how to do certain things.
We propose to add some coordinate and color modification options similar to the texture stack to texture nodes. These can be edited in the texture tab in the properties editor, and in the node editor if the node is active.
Do we need a type of texture stack, or are mix nodes enough?
- From what I've seen, the existing texture stack is not used that much for layering: color/glossy/bump are usually only influenced by one or two textures, and stencils are more naturally expressed as mix nodes. So I think we can do without a texture stack; a multi-mix node may perhaps be useful, but it doesn't seem critical.
Textures Properties / Other Uses
Textures can also be used for modifiers or particles, but the workflow for this is poor. The proposed solution is this:
Shading nodes, modifiers etc. will have a button to switch to the textures tab in properties editor with that texture slot selected. The textures tab will have a dropdown that contains all texture users in context (from world, material, lamp, modifiers, particles, ...).
Texture nodes will get some of the properties that are currently at the texture slot (MTex) level built in, like coordinate transformation and color modification. These will then be editable in the texture tab.
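The effect of building these options into the node can be sketched as a coordinate transform applied before the texture lookup and a color modification applied after it. Everything below is hypothetical (the stand-in checker texture, the parameter names, the brightness adjustment), just to show the ordering of the two built-in steps:

```python
# Illustrative sketch of MTex-style options folded into a texture node:
# a coordinate transform before the lookup, a color modification after.

def transform_coord(uv, offset=(0.0, 0.0), scale=(1.0, 1.0)):
    """MTex-like mapping: scale then offset the lookup coordinate."""
    return (uv[0] * scale[0] + offset[0], uv[1] * scale[1] + offset[1])

def checker(uv):
    """Stand-in procedural texture: a 2x2 checker pattern in UV space."""
    return 1.0 if (int(uv[0] * 2) + int(uv[1] * 2)) % 2 == 0 else 0.0

def sample(uv, offset=(0.0, 0.0), scale=(1.0, 1.0), brightness=0.0):
    """A texture node with built-in mapping and color modification:
    transform the coordinate, look up the texture, adjust and clamp."""
    value = checker(transform_coord(uv, offset, scale))
    return min(max(value + brightness, 0.0), 1.0)

print(sample((0.1, 0.1)))                     # 1.0
print(sample((0.1, 0.1), offset=(0.5, 0.0)))  # 0.0: shifted onto a dark cell
```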