Rendering in Blender - Blender Internal and Cycles



As I do more and more work in Blender I find myself being asked what is the difference between the internal render engine (Blender Internal, or BI) and Cycles. Is one better than the other? So I thought I would do a post about the differences to let people know.

In effect, there are three rendering engines within Blender…

Blender Internal, the Game Engine, and Cycles.

In this post I will only be talking about Cycles and Blender Internal. You can, of course, have your scenes rendered externally as well. However, before we start, I would like to make some points…

  1. It is true that Blender Internal is not being developed anymore
  2. Cycles is not necessarily better than Blender Internal, they are just different
  3. Cycles can do things that Blender Internal cannot, e.g. it can use the Open Shading Language for rendering, it can render on the GPU or the CPU, and it can produce caustics

But what is the actual difference between the two rendering engines?

Well, BI is a biased rasterisation engine, which means that it works by calculating which objects are visible to the camera, not by simulating the behaviour of light. (BI usually renders noiseless images, while Cycles must sample a scene many times to reduce the noise to an acceptable level.)

Cycles is an unbiased, physically based, path tracing engine designed for animations, which means that it produces an image by tracing the paths of “rays” through the scene. Specifically, Cycles is a “backwards” path tracer, which means that it traces light rays by sending them from the camera instead of sending them from the light source(s).
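Why does Cycles need many samples where BI does not? Each path-traced sample is a random experiment, and the pixel value is the average of many of them. Here is a toy sketch of that idea (the 30% light-hit probability and the brightness values are invented for illustration; this is nothing like Cycles' actual sampler):

```python
import random

def sample_pixel(rng):
    """One toy path-traced sample: a random bounce direction either
    reaches a light (brightness 1.0) or escapes the scene (0.0).
    The 30% hit chance is an invented figure for this example."""
    return 1.0 if rng.random() < 0.3 else 0.0

def render_pixel(samples, seed=0):
    """Average many noisy samples; the estimate converges to the
    true value (0.3 here), so more samples means less noise."""
    rng = random.Random(seed)
    return sum(sample_pixel(rng) for _ in range(samples)) / samples

# With few samples the result is noisy; with many it settles near 0.3.
for n in (10, 100, 10000):
    print(n, render_pixel(n))
```

With 10 samples the estimate can land anywhere; with 10,000 it sits close to the true value, which is exactly the "sample many times to reduce the noise" behaviour described above.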

Also, when using cycles, you are using nodes - which gives you great control over your rendering, but it can lead to some quite complex node setups.

Here is a link to Wikipedia's overview of rendering in general…

http://en.wikipedia.org/wiki/Rendering_(computer_graphics)

---------------------------------------------------------------------------------------------------------------------------------------------

(from Wikipedia)

Path tracing is a computer graphics method of rendering images of three dimensional scenes such that the global illumination is faithful to reality. Fundamentally, the algorithm is integrating over all the illuminance arriving to a single point on the surface of an object. This illuminance is then reduced by a surface reflectance function to determine how much of it will go towards the viewpoint camera. This integration procedure is repeated for every pixel in the output image. When combined with physically accurate models of surfaces, accurate models of real light sources (light bulbs), and optically-correct cameras, path tracing can produce still images that are indistinguishable from photographs.

Path tracing naturally simulates many effects that have to be specifically added to other methods (conventional ray tracing or scanline rendering), such as soft shadows, depth of field, motion blur, caustics, ambient occlusion, and indirect lighting. Implementation of a renderer including these effects is correspondingly simpler.

Due to its accuracy and unbiased nature, path tracing is used to generate reference images when testing the quality of other rendering algorithms. In order to get high quality images from path tracing, a large number of rays must be traced to avoid visible noisy artifacts.

---------------------------------------------------------------------------------------------------------------------------------------------
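The "integrating over all the illuminance arriving to a single point" that the excerpt describes can be sketched as a Monte Carlo estimate. Here is a minimal, hypothetical example: a surface point under a uniform sky of radiance 1, whose exact illuminance over the hemisphere works out to π (the sky model and sampling scheme are chosen purely for illustration):

```python
import math
import random

def incoming_radiance(theta, phi):
    """Hypothetical sky: the same radiance arrives from every direction."""
    return 1.0

def illuminance_mc(n=100000, seed=0):
    """Monte Carlo estimate of E = integral over the hemisphere of
    L_i(omega) * cos(theta) d(omega), using uniform hemisphere
    sampling (pdf = 1 / (2*pi) per unit solid angle)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Uniform direction on the hemisphere: cos(theta) uniform in [0, 1].
        u, v = rng.random(), rng.random()
        theta = math.acos(u)
        phi = 2 * math.pi * v
        # Estimator: integrand divided by the pdf, i.e. f * 2*pi.
        total += incoming_radiance(theta, phi) * math.cos(theta) * 2 * math.pi
    return total / n

# For a uniform sky of radiance 1, the exact answer is pi.
print(illuminance_mc(), math.pi)
```

The estimate hovers around π and tightens as `n` grows, which is the "large number of rays must be traced" point from the excerpt in miniature.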

Here is an image showing what path tracing is (in a simplified form)…

Camera path tracing

The pixel the gray ray was sent from will be dark, as it did not encounter any light sources. The pixel from which the purple ray was sent will be bright, as it hit a light source after bouncing off a diffuse object.
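The two rays in the diagram can be mimicked with a toy backwards tracer. Everything here (the surface names, the 0.8 absorption factor) is invented for illustration; a real path tracer works with actual geometry and BSDFs:

```python
# Toy "backwards" tracing of the two rays in the diagram above.
# A path is simply the list of surfaces a camera ray hits, in order.
LIGHT = "light"
DIFFUSE = "diffuse"

def trace(path, light_brightness=1.0):
    """Return the pixel brightness for one camera ray's path.
    If the path never reaches a light, the pixel stays dark."""
    brightness = light_brightness
    for surface in path:
        if surface == LIGHT:
            return brightness   # ray found a light: bright pixel
        brightness *= 0.8       # each diffuse bounce absorbs some energy
    return 0.0                  # no light encountered: dark pixel

grey_ray = [DIFFUSE, DIFFUSE]   # bounces around, then escapes the scene
purple_ray = [DIFFUSE, LIGHT]   # bounces off a diffuse object into a light

print(trace(grey_ray))    # 0.0 -> dark pixel
print(trace(purple_ray))  # 0.8 -> bright pixel
```

The grey ray's pixel comes out dark and the purple ray's pixel bright, matching the description above.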

Most materials scatter light, so Cycles needs to account for light bouncing in different directions. How this is done depends on which Integrator is used.

Path tracing

This is a pure path tracer; at each hit it will bounce light in one direction and pick one light to receive lighting from. This makes each individual sample faster to compute, but will typically require more samples to clean up the noise.

Branched Path Integrator

Similar to the Path tracing integrator, but at the first hit it will split the path for different surface components and will take all lights into account for shading instead of just one. This makes each sample slower, but will reduce noise, especially in scenes dominated by direct or one-bounce lighting.
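The trade-off between the two integrators can be sketched numerically. In this toy comparison (the three light contributions are invented values, and real integrators sample lights far more cleverly), the "path tracing" estimator picks one light at random per sample, while the "branched" estimator shades with all of them:

```python
import random

LIGHTS = [0.2, 0.5, 0.3]  # invented contributions of three lights to one point

def path_sample(rng):
    """Path tracing style: pick ONE light at random and divide by the
    probability of picking it. Unbiased, cheap per sample, but noisy."""
    return rng.choice(LIGHTS) * len(LIGHTS)

def branched_sample(rng):
    """Branched path style: shade with ALL lights in one (slower) sample."""
    return sum(LIGHTS)

def variance(sampler, n=20000, seed=1):
    rng = random.Random(seed)
    xs = [sampler(rng) for _ in range(n)]
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

# Both estimators average to the same answer (1.0), but for this
# direct-lighting case the branched version has zero variance:
# no noise, at a higher cost per sample.
print(variance(path_sample))      # noisy: variance > 0
print(variance(branched_sample))  # 0.0, noise-free
```

This is why branched path tracing shines in scenes dominated by direct or one-bounce lighting: the extra work per sample buys a much cleaner estimate.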

---------------------------------------------------------------------------------------------------------------------------------------------

So, let us look at an overview of the various options that the engines give us (BI on the left, Cycles on the right)…

Blender Internal options vs Cycles options

The Texture Atlas option is an add-on for creating atlases for gaming - so we will ignore that here. The Freestyle option is for creating Non-Photorealistic Rendering (wireframes, manga, cartoon-style rendering, and a lot more), is currently available only for Blender Internal, and again is not relevant to this post. So what does that leave us with?

BI only

Sampled Motion Blur - For adding motion blur to your renders
Shading - Controls what shading effects are applied to your render
Anti-Aliasing - Gets rid of any aliasing effects in your renders

Cycles only

Film - Some simple effects (exposure, Gaussian and Box filters), plus transparency for your background
Light Paths - Sets which Cycles integrator to use and the number of bounces for the various rays
Volume Sampling - Options for sampling the volumes of your materials when rendering
Sampling - Very important: higher sampling gives you less noise (but longer render times). Clamping can also be set here.
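Clamping deserves a quick illustration. A single very bright sample (a "firefly") can dominate a pixel's average; clamping caps each sample's value, trading a little accuracy for much less speckle. A simplified sketch (the numbers are invented, and Cycles applies clamping inside its sampler rather than as a post-process like this):

```python
def clamp_sample(value, clamp=10.0):
    """Simplified sample clamping: cap very bright samples to suppress
    fireflies, at the cost of losing some highlight energy.
    A clamp of 0 conventionally means 'disabled' here."""
    return min(value, clamp) if clamp > 0 else value

# One firefly sample (1000.0) among normal ones ruins the pixel average...
samples = [0.8, 1.1, 0.9, 1000.0, 1.0]
print(sum(samples) / len(samples))                  # 200.76

# ...clamping tames it, at the price of a slightly biased (darker) result.
clamped = [clamp_sample(s, clamp=10.0) for s in samples]
print(sum(clamped) / len(clamped))                  # 2.76
```

This is also why heavy clamping can dim bright highlights and caustics: the very samples that carry that energy are the ones being cut down.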

This is not to say that the options for the remaining sections are the same (far from it), but it at least gives us an idea of some of the differences between the two engines.

So when would I use BI or Cycles? Well, if I am doing low-poly work, game models, etc. I will use Blender Internal. If I want more control and more effects, then I will use Cycles. But that is not a hard and fast rule, just a general one. In a future post I will do a simple material setup - one for BI and one for Cycles - and you will be able to see for yourself how they work and what it takes to set them up.