
Blender 2.80: EEVEE

EEVEE is a new physically based realtime renderer. It works both as a renderer for final frames, and as the engine driving Blender's realtime viewport for creating assets.

Many features are supported, including:

  • Principled BSDF
  • Environment lighting and HDRIs
  • Screen-space reflections and refractions
  • Indirect light through light probes
  • Soft and contact shadows
  • Subsurface scattering and volume rendering
  • Depth of field, camera motion blur, bloom

For detailed information, see the EEVEE user manual.

Render Settings

Materials

EEVEE materials are created using the same shader nodes as Cycles, making it easy to render existing scenes. For Cycles users, this makes EEVEE work great as a realtime preview. For game artists, the Principled BSDF is compatible with shading models used in many game engines.

NPR

Eevee supports the conversion of BSDF outputs into RGB colors to make any kind of non-photorealistic shading. This is done using the Shader to RGB node.

For example, a basic toon shader can be created by applying a color ramp to the output of a BSDF.

While this is supported, this is breaking the PBR pipeline and thus makes the result unpredictable when other effects (Screen Space Reflections, Screen Space Ambient Occlusion, Subsurface Scattering, ...) are used.
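Conceptually, the Shader to RGB workflow turns a continuous lighting intensity into discrete color bands via a constant-interpolation color ramp. The sketch below is a minimal CPU-side illustration of that idea, not Eevee's actual shader code; the `stops` layout and function names are invented for this example.

```python
def constant_color_ramp(value, stops):
    """Evaluate a constant-interpolation color ramp: return the color of
    the last stop whose position is <= value (stops sorted by position)."""
    color = stops[0][1]
    for position, stop_color in stops:
        if value >= position:
            color = stop_color
    return color

def toon_shade(n_dot_l, stops):
    """Quantize a Lambertian intensity (N.L, clamped to [0, 1]) into
    flat color bands, mimicking a BSDF output fed through a color ramp."""
    intensity = max(0.0, min(1.0, n_dot_l))
    return constant_color_ramp(intensity, stops)

# Two-band toon ramp: shadow color below 0.5, lit color at 0.5 and above.
stops = [(0.0, (0.2, 0.1, 0.1)), (0.5, (0.9, 0.6, 0.5))]
```

The hard step between bands is exactly what breaks the smooth, energy-preserving response that the PBR effects mentioned above expect.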

Transparency

Eevee treats transparent objects differently than Cycles. They are sorted from front-to-back per object and do not receive any screen space effects.

This is fast and works fine for many cases, but it is only approximate and gives incorrect results for some object shapes. For more accurate results, enable Alpha Hashed transparency in the material. With a sufficient number of samples to resolve the noise, this gives accurate transparency.
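The idea behind hashed transparency can be sketched as a stochastic alpha test: each sample keeps or discards the surface by comparing alpha against a random threshold, so averaging many samples converges to the true coverage. This is a simplified model (Eevee hashes screen position rather than using a plain RNG); the function names are invented for illustration.

```python
import random

def alpha_hashed_visible(alpha, rng):
    """One sample of hashed/stochastic transparency: keep the surface when
    its alpha exceeds a random threshold, instead of alpha blending."""
    return alpha > rng.random()

def resolve_coverage(alpha, samples, seed=0):
    """Average many stochastic samples; the result converges to alpha,
    which is why more samples are needed to resolve the noise."""
    rng = random.Random(seed)
    hits = sum(alpha_hashed_visible(alpha, rng) for _ in range(samples))
    return hits / samples
```

Because each sample is a binary keep/discard decision, no depth sorting is needed, at the cost of noise that extra samples must average out.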

Global Illumination

Eevee supports global illumination through precomputed light probes.

Light Probes

  • Irradiance Volumes capture diffuse lighting in order to light the objects inside them with indirect light bounces. This does not require any lightmap UV mapping and even works with objects that move inside these volumes.
  • Reflection Cubemaps create local reflections of the surroundings for nearby objects. This ensures reflections are accurate (i.e. not reflecting the outdoor sky on indoor objects).
  • Reflection Planes capture the scene from a reflected camera point of view. This is only useful for glossy planar surfaces like a still water surface or a shiny floor. These objects can also help SSR produce rough reflections on those surfaces.
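Inside an irradiance volume, a moving object is lit by interpolating the lighting stored at the surrounding grid points. A minimal sketch of that per-cell trilinear interpolation follows; it assumes scalar irradiance values at the eight corners of one grid cell (real probes store directional data, and the function names are invented here).

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def trilinear_irradiance(corners, u, v, w):
    """Interpolate irradiance stored at the 8 corners of a grid cell.
    corners[i][j][k] is the value at x=i, y=j, z=k; (u, v, w) in [0, 1]
    is the object's position within the cell."""
    c00 = lerp(corners[0][0][0], corners[1][0][0], u)
    c10 = lerp(corners[0][1][0], corners[1][1][0], u)
    c01 = lerp(corners[0][0][1], corners[1][0][1], u)
    c11 = lerp(corners[0][1][1], corners[1][1][1], u)
    return lerp(lerp(c00, c10, v), lerp(c01, c11, v), w)
```

Because lighting is looked up by position at render time, no lightmap UVs are needed and the lookup works for objects that move through the volume.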

Light Cache

Since precomputing these probes is computationally intensive, irradiance grids and reflection cubemaps are cached and saved into the file for quicker reloading. The light cache options are found in the render settings tab, and the light cache itself is stored per scene.

World Illumination

World lighting is captured into its own texture and applied by default to every object in the absence of local probes. This distant lighting is treated as indirect lighting. While world illumination works well for lighting a single object, light probes become a necessity for more complex scenes.

Lighting

Using state of the art realtime techniques, Eevee is able to create realistic specular and diffuse lighting from area lights.

Soft shadows are supported through filtered shadow maps and jittered shadow map positions. An optional contact shadow can be used to shadow the tiny features of a model.
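Shadow map filtering can be illustrated with classic percentage-closer filtering (PCF): instead of a single binary depth test, the test is averaged over a small neighborhood, producing a fractional shadow value at edges. This sketch uses a plain 2D list as the depth map; it shows the filtering principle, not Eevee's exact filter.

```python
def pcf_shadow(shadow_map, x, y, receiver_depth, radius=1):
    """Percentage-closer filtering: average the binary depth test over a
    (2*radius+1)^2 neighborhood to soften shadow edges.
    Returns 0.0 for fully shadowed, 1.0 for fully lit."""
    h, w = len(shadow_map), len(shadow_map[0])
    total, lit = 0, 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), w - 1)  # clamp to map bounds
            sy = min(max(y + dy, 0), h - 1)
            lit += 1.0 if receiver_depth <= shadow_map[sy][sx] else 0.0
            total += 1
    return lit / total
```

Jittering the sample positions between frames trades the banding of a fixed kernel for noise that resolves over multiple samples.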

Volumetrics

Using a unified volumetric system, Eevee is capable of rendering volumetric effects such as absorption and single scattering.

Volumes react to lights and are properly shadowed. Being a unified system, the volumetric objects blend together correctly and light transmission is also applied onto transparent objects.
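Absorption and single scattering can be sketched with the Beer-Lambert law plus a simple ray march along the view ray. The coefficients and step count below are illustrative; a real renderer works per wavelength and integrates on the GPU.

```python
import math

def transmittance(sigma_t, distance):
    """Beer-Lambert law: fraction of light surviving 'distance' through a
    medium with extinction coefficient sigma_t (absorption + out-scatter)."""
    return math.exp(-sigma_t * distance)

def single_scatter(light_radiance, sigma_s, sigma_t, depth, steps=64):
    """Ray-march single scattering toward the camera: light scattered into
    the ray at each step is attenuated by the transmittance back to it."""
    dt = depth / steps
    radiance = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt  # midpoint of the step
        radiance += light_radiance * sigma_s * transmittance(sigma_t, t) * dt
    return radiance
```

With sigma_s == sigma_t, the march converges to the analytic result 1 - exp(-sigma_t * depth), which is a handy sanity check for step counts.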

Smoke/fire simulations are supported, along with the Principled Volume shader.

Screen Space Effects

Eevee relies on screen space information to create realtime approximations of commonly ray-traced effects such as ambient occlusion, reflections and subsurface scattering.

Ambient Occlusion

Ambient occlusion is computed using Ground Truth Ambient Occlusion (GTAO) and applied to indirect lighting. In addition, the bent normal option makes the lighting come only from the least occluded direction.
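The core idea of both AO and the bent normal can be shown with hemisphere sampling: occlusion is the fraction of blocked directions, and the bent normal is the average of the unoccluded ones. This is a conceptual stand-in for GTAO (which integrates horizon angles in screen space), with invented function names.

```python
def ambient_occlusion(directions, occluded):
    """AO as the fraction of hemisphere directions that are unoccluded;
    the bent normal is the average unoccluded direction (left
    unnormalized here for simplicity)."""
    visible = [d for d, occ in zip(directions, occluded) if not occ]
    ao = len(visible) / len(directions)
    if not visible:
        return ao, (0.0, 0.0, 0.0)
    bent = tuple(sum(c) / len(visible) for c in zip(*visible))
    return ao, bent
```

Lighting sampled along the bent normal instead of the geometric normal is what makes indirect light appear to come from the open, unoccluded direction.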

Reflection and refraction

Both are augmented by screen space information that enables local inter-reflections. Screen space reflections can also be enhanced by adding reflection planes on flat reflective surfaces to fill in missing information.

Subsurface Scattering

SSS is done by blurring the surface diffuse lighting using an SSS profile that is close to what Cycles produces. While this is not the physically correct way of achieving SSS, it is very fast and high quality.

Since this process is done in screen space, light coming from a surface point not visible to the camera will not contribute to the SSS effect. To compensate, shadow maps can be used to create SSS effects on thin, backlit surfaces such as human ears.
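The screen-space blur itself can be sketched in 1D: diffuse lighting is spread among neighboring pixels with weights from a falloff profile whose width is the scattering radius. A Gaussian stands in for the real profile here, and the names are illustrative.

```python
import math

def sss_blur(diffuse, radius, pixel_size=1.0):
    """Screen-space SSS sketch: blur per-pixel diffuse lighting with a
    Gaussian-like falloff profile (weights normalized per pixel)."""
    n = len(diffuse)
    out = []
    for i in range(n):
        total, total_w = 0.0, 0.0
        for j in range(n):
            d = (i - j) * pixel_size  # distance between pixels
            w = math.exp(-(d * d) / (2.0 * radius * radius))
            total += diffuse[j] * w
            total_w += w
        out.append(total / total_w)
    return out
```

A bright spike of lighting bleeds symmetrically into its neighbors, which is exactly the soft, translucent look the effect produces on skin.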

Post Processing

A basic post-processing pipeline is already present. This includes OpenColorIO color management, depth of field, motion blur and bloom.

Motion Blur

Basic post-process, camera-based motion blur is supported. Per-object motion blur and deformation motion blur are not yet supported.

Depth Of Field

Being a realtime engine, Eevee needs to emulate defocus using a post-process depth of field. Note that the current algorithm is subject to floating-point imprecision.
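Post-process depth of field blurs each pixel by its circle of confusion, which the standard thin-lens model gives in closed form. The sketch below assumes millimeter units and invented parameter names; it shows the model, not Eevee's shader.

```python
def circle_of_confusion(z, focus_distance, focal_length, aperture_fstop):
    """Thin-lens circle of confusion diameter for an object at depth z
    (all distances in the same unit, e.g. mm). Zero exactly at the
    focus distance, growing as the object moves away from it."""
    f = focal_length
    aperture = f / aperture_fstop  # aperture diameter from the f-stop
    return abs(aperture * f * (focus_distance - z)
               / (z * (focus_distance - f)))
```

The blur radius depends on a division by depth, one reason a GPU implementation working on stored depth values is sensitive to floating-point imprecision.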

Bloom

The Bloom effect conveys the real brightness of pixels by diffusing the light that would otherwise get lost. This mimics what happens in real photography when capturing very bright light sources.
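A typical bloom pass isolates everything above a brightness threshold, blurs it, and adds it back on top of the image. This 1D sketch with a tiny fixed blur kernel illustrates the structure (real implementations blur across multiple downsampled resolutions).

```python
def bloom(pixels, threshold, kernel=(0.25, 0.5, 0.25)):
    """Bloom sketch: threshold the image, blur the bright part with a
    small kernel, then add it back so bright sources bleed outward."""
    bright = [max(p - threshold, 0.0) for p in pixels]
    n = len(bright)
    blurred = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - 1, 0), n - 1)  # clamp at image borders
            acc += bright[j] * w
        blurred.append(acc)
    return [p + b for p, b in zip(pixels, blurred)]
```

Only values above the threshold contribute, so dim pixels are untouched while an over-bright pixel glows into its neighbors.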

OpenColorIO

Eevee renders scenes in a scene-referred color space. This means a final color transformation needs to be performed before the image can be displayed. Eevee uses OpenColorIO for this, just like Cycles, and therefore supports the new Filmic view transform.
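The role of such a view transform can be shown with a toy tonemapper: unbounded scene-referred values are compressed into the [0, 1) display range. The Reinhard curve below is only a stand-in to illustrate the concept; it is not the Filmic transform or any actual OCIO config.

```python
def tonemap_display(scene_referred, exposure=1.0):
    """Map an unbounded scene-referred value into [0, 1) for display.
    Simple Reinhard curve as a stand-in for a real OCIO view transform."""
    x = max(scene_referred, 0.0) * exposure
    return x / (1.0 + x)
```

However bright the input, the output stays below 1.0 while preserving the ordering of values, which is the essential property of a display transform.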

Limitations

Memory Usage

Eevee uses OpenGL, and GPU memory management is done by the OpenGL driver. In theory, only the textures and meshes needed for one object need to fit in GPU memory at a time, because OpenGL only needs the resources for one draw call at a time.

So if the scene is really heavy, the driver will swap resources in and out to make sure all objects are rendered correctly. In practice, GPU drivers are often optimized for gaming performance rather than for dealing with heavier scenes. Using too much GPU memory can make the GPU driver crash, freeze, or kill the application, and Blender has no way to predict in advance whether there are enough resources to avoid this. So you still have to be careful not to use too much memory.

GPU

Being an OpenGL engine, Eevee only uses the power of the GPU to render. There are no plans to support CPU (software) rendering as it would be very inefficient. CPU power is still helpful to handle high complexity scenes as the geometry and modifiers are still prepared on the CPU before rendering each frame.

Other current limitations are:

  • No multiple GPU rendering. Eevee will use the graphics card used by the rest of Blender's UI.
  • Headless systems (without a display) are not supported currently. Background rendering when there is a display is supported.

Features

Not all features supported by Blender are available in Eevee yet:

  • No object instances (coming soon)
  • No light node trees (coming soon)
  • No panoramic camera
  • Missing shader nodes: Toon BSDF, Velvet BSDF, Principled Hair BSDF, Anisotropic BSDF, OSL, Sky Texture
  • BSDFs use approximations to achieve realtime performance, so there will always be small differences between Cycles and Eevee.
  • Volumes defined by mesh shapes are not supported yet, only smoke and world volumes.