
Color Management Drawing Pipeline

This page describes the color management pipeline used in Blender for drawing to the screen. It gives a bit of history, an overview of the current state, and pointers to the relevant bits and pieces in the code.

Introduction

In the past Blender had a fixed color management pipeline: colors stored as bytes were assumed to be in sRGB and colors stored as floats were assumed to be in linear Rec. 709. During the Open Movie Tears of Steel a new color management pipeline was introduced, based on OpenColorIO. The color space of images can now be specified. Many places in Blender still rely on the fixed color management pipeline.

With the introduction of EEVEE and the Draw module (Blender 2.80), the color management pipeline was added to the 3D viewport. The first implementation wasn't flexible enough and was rewritten for Blender 2.83. The main difference is that color management moved from the draw manager to the GPU viewport.

In Blender 2.91 the image editor was migrated to the same color management pipeline as the 3D viewport. The previous image editor still used the old pipeline, which applied alpha blending in display space. Blender 2.91 fixes the issue that semi-transparent images weren't displayed correctly in the image/UV editor.

Overview

The GPU viewport has a color texture and an overlay texture. The color texture is in Linear Scene Reference Space and the overlay texture is in encoded linear display space. Encoded here means that the data is in linear display space, but stored in an sRGB texture. The encoding improves performance and uses less memory on the GPU.

Based on the needed color space, you have to select the correct texture as render target. The main draw engines (Workbench, EEVEE and Grease Pencil) draw to the color texture, as they work in Linear Scene Reference Space. The overlay engine renders to the overlay texture, as it needs Linear Display Space. The exception is camera background images, which the overlay engine renders to the color texture.
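
As a concrete illustration of the two storage formats, the sketch below allocates comparable textures with raw OpenGL calls. This is not how Blender creates them (the GPU module wraps texture creation), and the exact formats are an assumption: a half-float texture for the scene-linear color buffer and an 8-bit sRGB texture for the encoded overlay buffer.

    #include <epoxy/gl.h> /* or any other OpenGL loader */

    /* Sketch only: Blender allocates these through its GPU module; the exact
     * formats (RGBA16F and SRGB8_ALPHA8) are assumptions. */
    static void viewport_textures_create(int width, int height, GLuint tex[2])
    {
      glGenTextures(2, tex);

      /* Color texture: Linear Scene Reference Space, needs float precision. */
      glBindTexture(GL_TEXTURE_2D, tex[0]);
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
                   GL_RGBA, GL_HALF_FLOAT, nullptr);

      /* Overlay texture: linear display space values encoded in an sRGB
       * texture. The GPU converts on read/write, so 8 bits per channel are
       * enough and the texture uses half the memory. */
      glBindTexture(GL_TEXTURE_2D, tex[1]);
      glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
                   GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    }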

[Figure: GPUViewport color texture]

[Figure: GPUViewport overlay texture]

When the GPUViewport is copied to the screen, the textures are merged and transformed to display space. This is done in the OpenColorIO GLSL shader (gpu_shader_display_transform.glsl). This shader is located in /intern/opencolorio due to old policy issues.

[Figure: GPU viewport composited]

The shader converts the color texture to linear display space and alpha-blends the overlay texture on top of it.
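
The per-pixel logic of that merge is roughly the following. This is a CPU-side sketch, not the actual GLSL; it assumes the overlay stores premultiplied alpha, and the display transform is reduced to a placeholder.

    struct Color4f {
      float r, g, b, a;
    };

    /* Placeholder for the OpenColorIO display transform (view transform,
     * exposure, look, ...). In Blender this is GLSL generated by OCIO. */
    static Color4f ocio_display_transform(const Color4f &scene_linear)
    {
      return scene_linear; /* Identity, for the sake of the sketch. */
    }

    /* Sketch of the composite: transform the color texel to display space,
     * then alpha-blend the overlay texel on top of it (assuming the overlay
     * uses premultiplied alpha). */
    static Color4f composite_pixel(const Color4f &color, const Color4f &overlay)
    {
      const Color4f display = ocio_display_transform(color);
      const float inv_a = 1.0f - overlay.a;
      return {overlay.r + display.r * inv_a,
              overlay.g + display.g * inv_a,
              overlay.b + display.b * inv_a,
              1.0f};
    }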

Stereo Drawing

A GPUViewport can be used to render in stereo. The results are stored in four buffers:

  1. Right eye color
  2. Right eye overlay
  3. Left eye color
  4. Left eye overlay

A GPUViewport can activate one eye (left or right). When switching eyes, the textures are swapped and the new active textures are attached to the framebuffers. DefaultTextureList.color contains the color buffer for the active eye and color_stereo the one for the inactive eye; the same applies to the overlay buffers. (See gpu_viewport_framebuffer_view_set.)

Normally this is done by the window manager (during binding of a region, see wm_draw_region_bind) or by the draw manager (when switching views, see DRW_notify_view_update).
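
A minimal sketch of that eye switch, using a hypothetical struct that mirrors the relevant part of DefaultTextureList; the real logic lives in gpu_viewport_framebuffer_view_set.

    #include <utility>

    /* Hypothetical mirror of the texture list: `color`/`overlay` belong to the
     * active eye, `color_stereo`/`overlay_stereo` to the inactive eye. */
    struct EyeTextures {
      unsigned int color, color_stereo;
      unsigned int overlay, overlay_stereo;
      int active_view; /* 0 = left eye, 1 = right eye */
    };

    /* Sketch: activate one eye. If the requested view differs from the active
     * one, swap the textures so that `color` and `overlay` always refer to the
     * active eye, then re-attach them to the framebuffers. */
    static void viewport_view_set(EyeTextures &textures, int view)
    {
      if (textures.active_view == view) {
        return;
      }
      std::swap(textures.color, textures.color_stereo);
      std::swap(textures.overlay, textures.overlay_stereo);
      textures.active_view = view;
      /* ...attach textures.color / textures.overlay to the framebuffers... */
    }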

Before the GPU viewport is copied to the screen, the left and right eye buffers are merged. Where and how this is done depends on the window's stereo display mode. The side-by-side and top-bottom modes are handled by the window manager, which copies the buffers to the right location on the screen. The anaglyph and interlace modes are handled in GPU_viewport_stereo_composite.
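
As an illustration of the anaglyph case, the per-pixel merge for a red-cyan anaglyph could look like the sketch below. This is the generic technique, not the actual code in GPU_viewport_stereo_composite, which runs on the GPU and also covers the interlace modes.

    struct Color3f {
      float r, g, b;
    };

    /* Sketch: a red-cyan anaglyph takes the red channel from the left eye and
     * the green/blue channels from the right eye; the glasses separate the
     * channels again for each eye. */
    static Color3f anaglyph_pixel(const Color3f &left, const Color3f &right)
    {
      return {left.r, right.g, right.b};
    }

Side-by-side and top-bottom need no per-pixel math; the window manager simply copies each eye's buffer to its part of the window.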

Guidelines for developers

Use GPUViewport

When colors are important, use a GPUViewport. Color management can be turned on in the GPUViewport by calling GPU_viewport_colorspace_set. When using the draw manager (/source/blender/draw) this is done by drw_viewport_colormanagement_set. In special cases this function needs to be adapted to select the render settings or to only apply the view transform.
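
A hedged usage sketch of turning color management on for a viewport. The parameter list of GPU_viewport_colorspace_set shown here is an assumption; check GPU_viewport.h in the sources for the real signature.

    #include "DNA_scene_types.h"
    #include "GPU_viewport.h"

    /* Sketch only: the exact parameters of GPU_viewport_colorspace_set are an
     * assumption, see GPU_viewport.h. */
    static GPUViewport *viewport_with_color_management(Scene *scene)
    {
      GPUViewport *viewport = GPU_viewport_create();
      /* Hand the scene's view/display settings to the viewport so the
       * OpenColorIO display transform is applied when the viewport is drawn
       * to the screen. */
      GPU_viewport_colorspace_set(viewport,
                                  &scene->view_settings,
                                  &scene->display_settings,
                                  scene->r.dither_intensity);
      return viewport;
    }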

Don't use Display buffer

An ImBuf has a display buffer. The display buffer originates from motion tracking, where it ensured playback of 4K movies including color management. The display buffer caches the result after the transform to display space (screen pixels). Nowadays it is better to use a GPUViewport and draw the movie in Linear Scene Reference Space to the color texture. This reduces complexity and should make preprocessing/loading of movie clips faster. The drawing itself will be a bit slower, but that should be fine on current hardware.
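
A sketch of the recommended path, again with raw OpenGL standing in for Blender's GPU module: upload the scene-linear float pixels directly into the viewport's color texture and let the display transform happen when the viewport is drawn, instead of converting on the CPU and caching a display buffer.

    #include <epoxy/gl.h> /* or any other OpenGL loader */

    /* Sketch: upload scene-linear RGBA float pixels straight into the viewport
     * color texture. No CPU-side display transform and no cached display
     * buffer; the GPUViewport applies the display transform on draw. */
    static void upload_scene_linear_pixels(GLuint color_texture,
                                           const float *rect_float,
                                           int width, int height)
    {
      glBindTexture(GL_TEXTURE_2D, color_texture);
      glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                      GL_RGBA, GL_FLOAT, rect_float);
    }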

Known Issues

The current implementation has some shortcomings:

  • When doing a viewport render with overlays to an image, the colors of the overlays are stamped into the image. Rendered images are always stored in Linear Scene Reference Space, so the overlays stamped into the render result lead to a different result than when viewed in the 3D viewport.
  • Pure emissive colors aren't supported by some commonly used image file formats, most noticeably PNG. There isn't a good way to solve this without knowing how the exported PNGs will be used.
  • Currently there isn't a Python API to render to the color buffer. As a result, add-ons that use BGL rely on the display space and aren't able to use the full potential of the drawing color pipeline.
  • 2D texture painting is still done in the fixed color pipeline, but the color selection is done using OpenColorIO. This can result in unexpected behavior.

TODOs

  • Image empties are drawn to the overlay texture. This is a limitation that should be fixed.
  • Overlay colors inside the draw manager are in linear sRGB. This should be linear display space; the colors should be transformed using an OpenColorIO processor (see the sketch below).
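
A sketch of what that transform could look like on the CPU, assuming the OpenColorIO v2 API; the color space names passed to getProcessor are placeholders that depend on the OCIO config in use.

    #include <OpenColorIO/OpenColorIO.h>

    namespace OCIO = OCIO_NAMESPACE;

    /* Sketch: transform one overlay color from linear sRGB to linear display
     * space with an OpenColorIO processor. Assumes the OCIO v2 API; the color
     * space names are placeholders. */
    static void overlay_color_to_display_linear(float rgb[3])
    {
      OCIO::ConstConfigRcPtr config = OCIO::GetCurrentConfig();
      OCIO::ConstProcessorRcPtr processor =
          config->getProcessor("Linear", "Linear Display");
      OCIO::ConstCPUProcessorRcPtr cpu = processor->getDefaultCPUProcessor();
      cpu->applyRGB(rgb);
    }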