
Rendering is the final step in the CG process (aside from post-processing, of course), the phase in which the image corresponding to your 3D scene is finally created. Rendering is a CPU-intensive process. You can render an image on your own computer, or use a render farm: a network of computers that each work on a different section of the image or on different frames. This section provides a full explanation of the features in Blender related to producing your image or animation.

Introduction

Rendering Buttons.

The rendering buttons window is accessed via the Scene context and Render sub-context (F10 or the Manual-Part-I-RenderingButton.png button). The rendering panels and buttons are shown in (Rendering Buttons).

These buttons are organized into panels, which are:

  • Output - controls the output of the render pipeline
  • Render Layers - controls which layers and passes to render
  • Render - controls the actual rendering process of a still shot
  • Anim - controls the rendering of a series of frames to produce an animation
  • Bake - pre-computes certain aspects of a render
  • Format - controls the format and encoding of the picture or animation
  • Stamp - stamps the frames with identifying information and configuration settings
Tabs
To save screen space, some of the panels may be tabbed under another; for example, the Layers panel is a tab folder under the Output panel. To reveal it, simply click the tab header.
Yafray
If you have installed Yafray, options to control it will appear as two tabs under the Render panel once you have selected it as the rendering engine.


Overview

The rendering of the current scene is performed by pressing the big RENDER button in the Render panel, or by pressing F12 (you can define how the rendered image is displayed on-screen in the Render Output Options). See also the Render Window.

A movie is produced by pressing the big ANIM animation button in the Anim panel. The result of a rendering is kept in a buffer and shown in its own window. It can be saved by pressing F3 or via the File->Save Image menu using the output option in the Output panel. Animations are saved according to the format specified, usually as a series of frames in the output directory. The image is rendered according to the dimensions defined in the Format Panel (Image types and dimensions.).

Workflow

In general, the process for rendering is:

  1. Create all the objects in the scene
  2. Light the scene and position the camera
  3. Render a test image at 25% or so, without oversampling or raytracing, so that it is very fast and does not slow you down
  4. Set and adjust the materials/textures and lighting
  5. Iterate the above steps until satisfied at some level of quality
  6. Render progressively higher-quality full-size images, making small refinements and using more compute time
  7. Save your images
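
The test-render sizing in step 3 above is simple arithmetic: Blender's Format panel lets you render at a percentage of the full output resolution, and halving each dimension quarters the pixel count. The following sketch (the function name is illustrative, not part of Blender's API) shows why a 25% test render is so much faster:

```python
# Sketch of how a preview-percentage setting scales the output
# resolution defined in the Format panel. Illustrative only.
def preview_size(size_x, size_y, percent):
    """Return the render dimensions at the given percentage."""
    return (size_x * percent // 100, size_y * percent // 100)

# A 25% test render of an 800x600 image is only 200x150 pixels:
# one sixteenth of the pixels, so roughly sixteen times faster.
print(preview_size(800, 600, 25))   # -> (200, 150)
print(preview_size(800, 600, 100))  # -> (800, 600)
```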

Render Workbench Integration

Manual-Render-Pipeline.jpg

Blender has three independent rendering workbenches, which pass image data through a pipeline from one to the next, in order:

  • Rendering Engine
  • Compositor
  • Sequencer

You can use each one of these independently, or in a linked workflow. For example, you can use the Sequencer by itself to post-process a video stream, or the Compositor by itself to perform color adjustment on an image. You can also render the scene via the active Render Layer and save that image directly, without using the Compositor or Sequencer at all. These possibilities are shown in the top part of the image to the right.

You can also link scenes and renders in Blender as shown, either directly or through intermediate file storage. Each scene can have multiple render layers, and each Render Layer is mixed inside the Compositor. The active render layer is the render layer that is displayed and checked active. If the displayed render layer is not checked active/enabled, then the next checked render layer in the list is used to compute the image. The image is displayed as the final render if Do Composite and Do Sequence are NOT enabled.

If Do Composite is enabled, the render layers are fed into the Compositor. The noodles manipulate the image and send it to the Composite output, where it can be saved, or, if Do Sequence is on, it is sent to the Sequencer.

If Do Sequence is enabled, the result from the Compositor (if Do Composite is enabled) or the active Render Layer (if Do Composite is not enabled) is fed into the Scene strip in the Sequencer. There it is manipulated according to the VSE settings, and finally delivered as the image for that scene.
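
The routing described in the last three paragraphs can be summarized as a small decision rule. This is an illustrative sketch of the logic, not Blender code:

```python
# Which stage delivers the final image for a scene, given the
# Do Composite and Do Sequence toggles? (Illustrative sketch.)
def final_image_source(do_composite, do_sequence):
    if do_sequence:
        # The Sequencer's Scene strip receives either the compositor
        # output or the active render layer, then delivers the image.
        return "Sequencer"
    if do_composite:
        # Render layers are fed through the noodles to the
        # Composite output node.
        return "Compositor"
    # Neither enabled: the active render layer is the final render.
    return "Active Render Layer"

print(final_image_source(False, False))  # -> Active Render Layer
print(final_image_source(True, False))   # -> Compositor
print(final_image_source(True, True))    # -> Sequencer
```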

Things get a little more complicated when a .blend file has multiple scenes, for example Scene A and Scene B. In Scene B, if Do Composite is enabled, the Render Layer node in Scene B's compositor can pull in a Render Layer from Scene A. Note that this image will not be the post-processed one. If you want to pull in the composited and/or sequenced result from Scene A, you will have to render Scene A out to a file using Scene A's compositor and/or sequencer, and then use the Image input node in Scene B's compositor to pull it in.

The bottom part of the possibilities graphic shows the ultimate Blender workflow: post-processed images and a dynamic component render layer from Scene A are mixed with two render layers from Scene B in the Compositor, then sequenced and finally saved for your viewing enjoyment.

These examples are only a small part of the possibilities in using Blender. Please read on to learn about all the options, and then exercise your creativity in developing your own unique workflow.

Sequencer from/to Compositor

To go from the Compositor to the Sequencer, enable both "Do Sequence" and "Do Composite". In the Compositor, the image that is threaded to the Composite Output node is the image that will be processed in the Scene strip in the VSE.

The way to go from the Sequencer to the Compositor is through a file store. Have one scene "Do Sequence" and output to an image sequence or mov/avi file. Then, in another scene, enable "Do Composite" and use an Image input node to read in the image sequence or mov/avi file.

Render Window Options

Once you press F12 or click the big Render button, your image is computed and display begins. Depending on the display option set in the Output panel, the image is shown in a separate Render Window, full screen, or in a UV/Image Editor window.

You can render a single frame, or many frames in an animation. You can cancel the rendering by pressing Esc. If rendering a sequence of frames, the lowest number frame is rendered, then the next, and so on in increasing frame number sequence.

The Render Window can be invoked in several ways:

Render Panel->Render or F12
renders the current frame, as seen by the active camera, using the selected renderer (Blender Internal or Yafray)
3D View Window Header->LMB Template-LMB.png OpenGL Render button (far right)
Renders a quick preview of the 3D View window
Anim Panel->Anim CtrlF12
Renders the animation from the Start frame to and including the End frame, as set in the Anim panel.
3D View Window Header->CtrlLMB Template-LMB.png OpenGL Render button (far right)
Renders a quick animation of the 3D View window using the fast OpenGL render

The Output options need to be set correctly. In the case of animations, each frame is written out in the format specified.

Rendering the 3D View animation using OpenGL is useful for quickly previewing armature animations.

Showing Previous Renders

If the Blender Internal render was used to compute the image, you can look at the previous render:

Render->Show Render Buffer
F11 - Pops up the Render Window and shows the last rendered image (even if it was in a previously opened & rendered .blend file).
Render->Play Back Rendered Animation
CtrlF11 - Similar to the single-frame case, but instead plays back all frames of the rendered animation.

Render Window usage

Once rendering is complete and the render window is active, you can:

  • A - Show/hide the alpha layer.
  • Z - Zoom in and out; while zoomed, pan with the mouse. You can also zoom with the mouse wheel
  • J - Jump to other Render buffer. This allows you to keep more than one render in the render window, which is very useful for comparing subtleties in your renders. How to use it:
  1. Press Render or F12
  2. Press J to show the empty buffer (the one we want to "fill" with the new image)
  3. Go back to the Blender modeling window. You can send the render window to the background by pressing Esc. Do not close, or minimize the render window!
  4. Make your changes.
  5. Render again.
  6. Press J to switch between the two renderings.
  • LMB Template-LMB.png - Clicking the left mouse button in the Render window displays information on the pixel underneath the mouse pointer. The information is shown in a text area at the bottom left of the Render output window. Moving the mouse while holding the LMB Template-LMB.png will dynamically update the information. This information includes:
    • Red, Green, Blue and Alpha values
    • Distance in Blender Units from the camera to the pixel under the cursor (Render Window only)

For Alpha values to be computed, RGBA has to be enabled. Z-depth (distance from camera) is computed only if Z is enabled on the Render Layers tab.


Step Render Frame

Step Frame Option.

Blender allows you to render animations faster by skipping some frames. You can set the step in the Render panel, as shown in (Step Frame Option).

Once you have rendered your video file, you can play it back at the real speed (fps); you just need to set the same Step parameter you used to render it.

If you play the stepped video at normal speed in an external player, the overall speed will be Step times faster than a video rendered with all the frames. For example, a video of 100 frames rendered with Step 4 @ 25 fps will last only 1 second, while the same video rendered with Step 1 (the default value) will last 4 seconds.
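
The arithmetic in the example above can be checked with a short sketch (the helper name is hypothetical, used only to illustrate the calculation): only every Step-th frame is rendered, so playback at the full frame rate is Step times shorter.

```python
# How long a stepped animation lasts when played back at full fps:
# only every `step`-th frame of `total_frames` is rendered.
def stepped_duration(total_frames, step, fps):
    rendered = total_frames // step
    return rendered / fps

print(stepped_duration(100, 4, 25))  # -> 1.0 (second)
print(stepped_duration(100, 1, 25))  # -> 4.0 (seconds)
```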

If you want to use this parameter when rendering from the command line, use -j STEP, where STEP stands for the step size you want to use.

If you want to use this parameter when playing video files from the command line, pass -j STEP after -a (which stands for playback mode).

./blender -a -s 1 -e 100 -p 0 0 -f 25 1 -j 4 "//videos/0001.jpg"


  • In Blender 2.48a, rendering stepped frames to a video output format such as FFMpeg (always) or AVI Jpeg (sometimes) produces a corrupted video (missing end frames), so the rendered video cannot be played back properly.

It is therefore safer to use a frame-based output format such as JPEG, which works reliably.


