- 1 Blender 2.80: EEVEE
- 1.1 Render Settings
- 1.2 Materials
- 1.3 Global Illumination
- 1.4 Lighting
- 1.5 Volumetrics
- 1.6 Screen Space Effects
- 1.7 Post Processing
- 1.8 Limitations
Blender 2.80: EEVEE
EEVEE is a new physically based realtime renderer. It works both as a renderer for final frames, and as the engine driving Blender's realtime viewport for creating assets.
Many features are supported, including:
- Principled BSDF
- Environment lighting and HDRIs
- Screen-space reflections and refractions
- Indirect light through light probes
- Soft and contact shadows
- Subsurface scattering and volume rendering
- Depth of field, camera motion blur, bloom
Materials
EEVEE materials are created using the same shader nodes as Cycles, making it easy to render existing scenes. For Cycles users, this makes EEVEE work great as a realtime preview. For game artists, the Principled BSDF is compatible with shading models used in many game engines.
Eevee supports converting BSDF outputs into RGB colors to create any kind of non-photorealistic shading. This is done using the Shader to RGB node.
For example, a basic toon shader can be created by applying a color ramp to the output of a BSDF.
While this is supported, it breaks the PBR pipeline and thus makes the result unpredictable when other effects (screen space reflections, screen space ambient occlusion, subsurface scattering, ...) are used.
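As a rough illustration of what the Shader to RGB plus Color Ramp combination computes for a basic toon shader, here is a small sketch in plain Python. The ramp positions and colors are invented for the example; the point is only that the shaded intensity is snapped to a few constant bands, which is what breaks the smooth PBR response.

```python
def color_ramp_constant(value, stops):
    """Evaluate a color ramp in 'Constant' interpolation mode.

    `stops` is a list of (position, color) pairs sorted by position;
    the color of the last stop at or below `value` is returned.
    """
    color = stops[0][1]
    for position, stop_color in stops:
        if value >= position:
            color = stop_color
    return color

def toon_shade(n_dot_l):
    # Clamped Lambert term, standing in for the intensity that the
    # Shader to RGB node would expose, quantized into three tones.
    intensity = max(0.0, n_dot_l)
    ramp = [(0.0, (0.10, 0.10, 0.20)),   # shadow tone
            (0.35, (0.45, 0.45, 0.60)),  # mid tone
            (0.75, (0.95, 0.95, 1.00))]  # lit tone
    return color_ramp_constant(intensity, ramp)
```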
Eevee treats transparent objects differently. They are sorted from front to back per object and do not receive any screen space effects. Eevee will not do alpha blended sorting on a per-pixel basis; using Alpha Hashed transparency with enough samples should give the expected result.
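The idea behind Alpha Hashed transparency can be sketched as follows (this is a conceptual model, not Eevee's actual implementation): each sample stochastically treats the surface as fully opaque or fully transparent, so no per-pixel sorting or blending is needed, and averaging many samples converges to the true alpha value.

```python
import random

def hashed_alpha_sample(alpha, rng):
    # One sample: the surface is opaque iff its alpha exceeds a random threshold.
    return 1.0 if alpha > rng.random() else 0.0

def accumulate(alpha, samples, seed=0):
    # Average many stochastic samples; the result converges to `alpha`.
    rng = random.Random(seed)
    return sum(hashed_alpha_sample(alpha, rng) for _ in range(samples)) / samples
```

This is why the note above says "with enough samples": with few samples the result is visibly noisy rather than smoothly translucent.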
Global Illumination
Eevee supports global illumination through precomputed light probes.
- Irradiance Volumes capture diffuse lighting in order to light the objects inside them with indirect light bounces. This does not require any lightmap UV unwrapping and even works with objects that are moving inside these volumes.
- Reflection Cubemaps create local reflections of the surroundings for nearby objects. This makes sure the reflections are accurate (i.e. not reflecting the outdoor sky on indoor objects).
- Reflection Planes capture the scene from a reflected camera point of view. This is only useful for glossy planar surfaces like a still water surface or a shiny floor. These objects can also be used to help screen space reflections produce rough reflections on those surfaces.
Since precomputing these probes is computationally intensive, Irradiance Volumes and Reflection Cubemaps are cached and saved into the file for quicker reloading. The light cache options are found in the render settings tab, and the light cache itself is stored per scene.
World lighting is captured into its own texture and applied by default to every object in the absence of local probes. This distant lighting is considered indirect lighting. While world lighting works well for a single object, light probes become a necessity for more complex scenes.
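One reason irradiance volumes need no lightmap UVs is that a moving object can simply interpolate the irradiance stored at the grid's probe points from its current position. The sketch below shows standard trilinear interpolation over such a grid; the grid layout is an assumption for illustration, not Eevee's internal data structure.

```python
import math

def trilerp(grid, x, y, z):
    """Interpolate `grid[i][j][k]` (scalar irradiance at integer probe
    coordinates) at a position strictly inside the grid."""
    x0, y0, z0 = int(math.floor(x)), int(math.floor(y)), int(math.floor(z))
    fx, fy, fz = x - x0, y - y0, z - z0

    def lerp(a, b, t):
        return a * (1.0 - t) + b * t

    # Interpolate along x on the four edges, then along y, then z.
    c00 = lerp(grid[x0][y0][z0],         grid[x0 + 1][y0][z0],         fx)
    c10 = lerp(grid[x0][y0 + 1][z0],     grid[x0 + 1][y0 + 1][z0],     fx)
    c01 = lerp(grid[x0][y0][z0 + 1],     grid[x0 + 1][y0][z0 + 1],     fx)
    c11 = lerp(grid[x0][y0 + 1][z0 + 1], grid[x0 + 1][y0 + 1][z0 + 1], fx)
    c0 = lerp(c00, c10, fy)
    c1 = lerp(c01, c11, fy)
    return lerp(c0, c1, fz)
```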
Lighting
Using state of the art realtime techniques, Eevee is able to create realistic specular and diffuse lighting from area lamps.
Soft shadows are supported through filtered shadow maps. An optional contact shadow can be used to shadow the tiny features of a model.
Volumetrics
Using a unified volumetric system, Eevee is capable of rendering volumetric effects such as absorption and single scattering.
Volumetrics react to lamps and are properly shadowed. Being a unified system, volumetric objects blend together properly, and light transmission is also applied to transparent objects.
Smoke/fire simulations are also supported.
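The two volumetric effects named above can be sketched under standard participating-media assumptions (a homogeneous medium): absorption follows the Beer-Lambert law, and single scattering accumulates in-scattered light attenuated by the transmittance along the ray. This is textbook volume rendering, not Eevee's shader code.

```python
import math

def transmittance(sigma_t, distance):
    # Beer-Lambert: fraction of light surviving `distance` through the medium.
    return math.exp(-sigma_t * distance)

def single_scattering(sigma_s, sigma_t, light, distance, steps=100):
    # Ray-march the in-scattered light; each step's contribution is the
    # scattered light attenuated back to the ray origin.
    dt = distance / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        total += transmittance(sigma_t, t) * sigma_s * light * dt
    return total
```

The march converges to the analytic result (sigma_s / sigma_t) * light * (1 - exp(-sigma_t * distance)), which is why more steps give smoother, more accurate volumes.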
Screen Space Effects
Eevee relies on screen space information to create realtime approximations of commonly raytraced effects such as ambient occlusion, reflections and subsurface scattering.
Ambient Occlusion
Ambient occlusion is computed using Ground Truth Ambient Occlusion (GTAO) and applied to indirect lighting. In addition, the bent normal option makes the lighting come from only the least occluded direction.
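A minimal sketch of the two behaviours just described: the occlusion factor attenuates only the indirect lighting (direct lighting is left untouched), and the bent normal (the average unoccluded direction) can replace the surface normal when fetching indirect light. In Eevee the AO value and bent normal come from GTAO; here they are plain inputs to keep the sketch self-contained.

```python
def shade(direct, indirect_from, normal, bent_normal, ao, use_bent_normal=False):
    """`indirect_from(direction)` returns the incoming indirect light for a
    direction; `ao` is 1.0 for unoccluded, 0.0 for fully occluded."""
    n = bent_normal if use_bent_normal else normal
    # Only the indirect term is occluded.
    return direct + ao * indirect_from(n)
```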
Reflection and refraction
They are both augmented by screen space information that enables local inter-reflections. Screen space reflections can also be enhanced by adding a reflection plane on flat reflective surfaces to fill in the missing information.
Subsurface Scattering
SSS is done by blurring the diffuse lighting in screen space using an SSS profile that is close to what Cycles produces. While this is not the truly correct way of achieving SSS, it is really fast and attains a high level of quality. Because this process is done in screen space, light coming from a surface point not visible to the camera will not contribute to the SSS effect. To work around this, shadow maps can be used to create the SSS effect seen on backlit surfaces such as human ears.
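The blur step can be sketched in one dimension: diffuse lighting is convolved with weights drawn from a falloff profile. The exponential profile here is an illustrative assumption (Eevee's profiles are matched to Cycles); normalizing the weights per pixel leaves uniformly lit areas unchanged while spreading bright spots into their neighbours.

```python
import math

def sss_blur(diffuse_row, radius, falloff=1.0):
    """Blur a 1D row of diffuse lighting with an exponential profile."""
    out = []
    for i in range(len(diffuse_row)):
        weights, total = [], 0.0
        for j in range(max(0, i - radius), min(len(diffuse_row), i + radius + 1)):
            w = math.exp(-falloff * abs(i - j))
            weights.append((j, w))
            total += w
        # Normalized weighted average around pixel i.
        out.append(sum(diffuse_row[j] * w for j, w in weights) / total)
    return out
```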
Post Processing
A basic post processing pipeline is already present. This includes OpenColorIO color management, depth of field, motion blur and bloom.
Motion Blur
Basic post process camera-based motion blur is supported. Per-object motion blur and deformation motion blur are yet to be supported.
Depth Of Field
Being a realtime engine, Eevee needs to emulate defocus using a post process depth of field. Note that the current algorithm is subject to floating point imprecision.
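Post process depth of field is built on the thin-lens circle of confusion: points away from the focus distance project to a disc whose diameter grows with the defocus, and that diameter drives the blur size per pixel. This is the standard optics formula, not Eevee's exact code; `aperture` is the lens diameter, `focal_len` the focal length, `focus` the focus distance and `depth` the fragment's distance, all in the same unit.

```python
def circle_of_confusion(aperture, focal_len, focus, depth):
    # Thin-lens CoC diameter: A * f/(S - f) * |S - z| / z.
    return abs(aperture * (focal_len / (focus - focal_len))
               * (focus - depth) / depth)
```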
Bloom
The bloom effect gives a sense of what the real brightness of pixels is by diffusing the light that would otherwise get lost. This mimics what happens in real photography when capturing really bright light sources.
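A typical bloom pass can be sketched over a 1D row of scene-referred pixel values: energy above a threshold is extracted, blurred, scaled and added back, so very bright pixels bleed into their neighbours. Eevee's bloom works on full images with multiple blur levels; this only shows the idea, and the threshold, radius and intensity defaults are illustrative.

```python
def bloom(row, threshold=1.0, radius=1, intensity=0.5):
    # Bright pass: keep only the energy above the threshold.
    bright = [max(0.0, v - threshold) for v in row]
    # Simple box blur of the bright pass.
    blurred = []
    for i in range(len(bright)):
        lo, hi = max(0, i - radius), min(len(bright), i + radius + 1)
        blurred.append(sum(bright[lo:hi]) / (hi - lo))
    # Add the diffused light back on top of the original image.
    return [v + intensity * b for v, b in zip(row, blurred)]
```

Note that pixels below the threshold are left exactly as they were; only genuinely bright sources glow.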
Color Management
Eevee renders all scenes in a scene-referred color space. This means a final color transformation needs to be performed before the image can be displayed. For this Eevee uses OpenColorIO, just like Cycles, which means Eevee supports the new Filmic view transform.
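To see why a display transform is needed: scene-referred values are unbounded, and a view transform compresses them into the 0..1 display range. The actual Filmic transform is an OpenColorIO configuration, so the simple Reinhard curve below is only a stand-in to show the shape of such a mapping, not what Eevee computes.

```python
def display_transform(scene_referred):
    # Reinhard tone curve: maps [0, inf) monotonically into [0, 1).
    return scene_referred / (1.0 + scene_referred)
```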
Limitations
As of now Eevee uses OpenGL, and GPU memory management is done by the OpenGL driver.
In theory, only the textures and meshes (referred to here as "the resources") needed for a single drawcall (i.e. one object) need to fit into GPU memory. So if the scene is really heavy, the driver will swap things in and out to make sure all objects are rendered correctly.
In practice, using too much GPU memory can make the GPU driver crash, freeze, or kill the application. So be careful what you ask for.
There is no standard way of knowing whether the resources will fit into GPU memory, or whether the GPU will render them successfully.
Being an OpenGL engine, Eevee only uses the power of the GPU to render. There are no plans to support CPU (software) rendering, as it would be very inefficient. CPU power is still needed to handle highly complex scenes, as the geometry is prepared by the CPU before rendering each frame.
There is no support for multi-GPU systems. Eevee will use whatever graphics card is used by the rest of Blender's UI.
There is no support for using Eevee on headless systems (i.e. without a display manager).
There is no support for object instances. (Support will come).
There is no support for lamp node trees. (Support will come).
Unsupported shader nodes: Toon BSDF, Velvet BSDF, Principled Hair BSDF, Anisotropic BSDF, OSL, Sky Texture.
All BSDFs use approximations to achieve realtime performance, so there will always be small differences between Cycles and Eevee.