Image Based Lighting

There are various ways of approximating global illumination in today's real-time 3D visualisations.

  • Diffuse global illumination from static objects can be baked into the scene and light probes can be used for applying diffuse GI to dynamic actors.
  • NVIDIA recently released their GIWorks solution, which relies on voxel cone tracing to generate real-time GI from both static and dynamic actors.
  • There are also screen-space techniques, such as those seen in Unreal Engine 4.

However, these techniques either support only diffuse lighting, require a lot of time, effort and know-how to implement, or aren't supported by low-end hardware such as phones or laptops. An alternative that I will describe here is a simple yet powerful technique called Image Based Lighting.

Image based lighting, or IBL, is an extension to lighting using environment maps. But where environment maps are usually used to simulate purely specular reflections in the scene, IBL uses irradiance environment maps to simulate everything from purely specular reflections, to glossy and diffuse reflections. Fig. 1 shows such an environment map used for specular lighting and some of the irradiance environment maps created from it. Fig. 2 shows the application of the irradiance environment maps. Notice the diffuse reflection on the T-Rex, the glossy reflections in the chessboard and the Fresnel effect on the snow speeder, whose surfaces appear diffuse when viewed head-on but glossy at grazing angles.

Irradiance environment maps

Fig 1. Multiple irradiance environment maps created with different glossiness factors. The left one is completely specular and the right one is very rough.

T-rex attacks Snow Speeder

Fig 2. A scene containing a LEGO Star Wars Snow Speeder getting attacked by a T-Rex with a teapot on a chessboard at the beach. (Sounds ridiculous when you spell it out like that.)

Image Based Lighting basics

So how do we go about creating an effect like this? First we need to take a look at our surface description. Assume our surface is described by the BRDF f_r(\omega_o, \omega_i), where \omega_o is the direction of the view vector and \omega_i is the direction towards some incoming light. In layman's terms, the BRDF describes the ratio of incoming radiance that is reflected towards the viewer. The amount of light coming from \omega_i, L(\omega_i), that is reflected in the view direction is computed by

L_o(\omega_o, \omega_i) = f_r(\omega_o, \omega_i) L(\omega_i) \cos\theta_i

In most non-physically based rendering systems, a surface is described by a Lambert BRDF for the diffuse component and a non-normalized Blinn or Phong BRDF for creating specular highlights.
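As an illustration, the two components just mentioned could be sketched like this in Python (a minimal sketch of mine with made-up helper names; a real implementation would of course live in a shader):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def lambert_brdf(diffuse_color):
    # Non-physically-based Lambert: constant, independent of wo and wi.
    # (A physically normalized Lambert would divide by pi.)
    return diffuse_color

def blinn_specular(n, v, l, spec_color, exponent):
    # Non-normalized Blinn highlight: specColor * (n . h)^exponent,
    # where h is the halfway vector between view and light directions.
    h = normalize(tuple(a + b for a, b in zip(v, l)))
    return tuple(c * max(dot(n, h), 0.0) ** exponent for c in spec_color)
```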

With that in mind we have a basic description of light reflecting off a surface for one incoming and one outgoing direction. However, our environment doesn't represent light arriving at the surface from a single direction, as a point light does; instead it illuminates the surface from every possible incoming direction over the entire visible hemisphere. So to compute the amount of light from the environment that is reflected along \omega_o, we have to integrate over the visible hemisphere, \Omega, that is

L_o(\omega_o) = \int_\Omega \! f_r(\omega_o, \omega_i) L_{env}(\omega_i) \cos\theta_i\, \mathrm{d}\omega_i
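To make the integral concrete, here is a small Monte Carlo estimator for it, under assumptions of my own: scalar radiance, the surface normal fixed at +z, and uniform hemisphere sampling. This is exactly what the next section argues is too slow for real time on low-end hardware:

```python
import math, random

def sample_hemisphere(n_samples):
    # Uniform sampling of the hemisphere around +z; pdf = 1 / (2*pi).
    dirs = []
    for _ in range(n_samples):
        u1, u2 = random.random(), random.random()
        z = u1                                  # cos(theta), uniform in [0, 1]
        r = math.sqrt(max(0.0, 1.0 - z * z))
        phi = 2.0 * math.pi * u2
        dirs.append((r * math.cos(phi), r * math.sin(phi), z))
    return dirs

def estimate_outgoing_radiance(brdf, env, wo, n_samples=10000):
    # Monte Carlo estimate of L_o = integral over the hemisphere of
    # f_r(wo, wi) * L_env(wi) * cos(theta_i) dwi.
    pdf = 1.0 / (2.0 * math.pi)
    total = 0.0
    for wi in sample_hemisphere(n_samples):
        cos_theta = wi[2]                       # normal is +z
        total += brdf(wo, wi) * env(wi) * cos_theta / pdf
    return total / n_samples
```

With a normalized Lambert BRDF (f_r = 1/pi) and a constant white environment, the estimate converges to 1, as the analytic integral predicts.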

Preconvoluting the environment map

Solving this integral for arbitrary environment maps or complex BRDFs is not feasible, and approximating it with Monte Carlo integration cannot be done in real time on low-end hardware, so we need an approximation suited to real-time usage. An in-depth description of how this can be done, and of the tradeoffs involved, can be found in Real Shading in Unreal Engine 4. In essence we split the integral in two: we integrate over the environment, which convolutes the environment map, and we separately integrate the BRDF over the hemisphere, which computes the scale the BRDF applies to the environment light. Convoluting the environment can then be done as a preprocess. In my implementation I have chosen to convolute using the power cosine distribution instead of the actual BRDF. This means that the irradiance map won't match the BRDF exactly, but since I wanted multiple BRDFs to share the same map, it wasn't possible to match all of them at once. The IBL is precomputed from the environment map using the expression

L_{IBL}(\omega_o, \sigma) = \int_\Omega \! f_{powercos}(\omega_o, \omega_i, \sigma) L_{env}(\omega_i) \, \mathrm{d}\omega_i

which can be done using Monte Carlo integration, where \sigma is the relative 'shininess' or glossiness of the surface and f_{powercos} is the power cosine distribution. The whole precomputation is stored in a single texture, with the low-frequency irradiance maps stored in the low-resolution miplevels.
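A sketch of that preprocess for a single texel, again under my own simplifications (scalar radiance, lobe fixed around +z): importance-sampling the power cosine lobe makes the pdf cancel against f_{powercos} up to normalization, so the convolution reduces to averaging environment lookups over the lobe.

```python
import math, random

def sample_power_cosine(sigma):
    # Sample a direction around +z with pdf proportional to cos(theta)^sigma.
    u1, u2 = random.random(), random.random()
    cos_t = u1 ** (1.0 / (sigma + 1.0))
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    phi = 2.0 * math.pi * u2
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

def convolve_env(env, sigma, n_samples=4096):
    # L_IBL for the texel whose direction is +z: average the environment
    # over the power cosine lobe. Higher sigma -> tighter lobe -> glossier map.
    acc = 0.0
    for _ in range(n_samples):
        acc += env(sample_power_cosine(sigma))
    return acc / n_samples
```

Running this once per texel, for a few values of \sigma, produces the chain of maps in Fig. 1, with each result written into the corresponding miplevel.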

Apply to BRDF

With our convoluted environment map we can now efficiently fetch convoluted incident radiance from the environment. All that remains to apply the light to the material is to scale it by the BRDF. To do that we have to estimate the fraction of light arriving from the environment that is reflected along \omega_o, i.e. we have to solve

\rho(\omega_o) = \int_\Omega \! f_r(\omega_o, \omega_i) \cos\theta_i\, \mathrm{d}\omega_i

which is known as the directional-hemispherical reflectance function. How to do this depends on f_r. For simple BRDFs like Lambert or a completely specular BRDF, solving the integral is doable. More complex BRDFs cannot be solved analytically in real time, but the solution can be baked into a texture and looked up at runtime, as discussed in Real Shading in Unreal Engine 4. Encoding \rho in a spherical harmonic is also a possibility. Choose the right tool for the right BRDF. However \rho is resolved, we now have all the components needed to apply the IBL to the BRDF and compute the radiance reflected along \omega_o
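To illustrate both routes, here is a sketch of mine: the analytic \rho for a normalized Lambert BRDF, and a generic numeric estimator of the kind whose results would be baked into a lookup texture offline:

```python
import math, random

def rho_lambert(albedo):
    # Normalized Lambert, f_r = albedo / pi:
    # rho = integral of (albedo/pi) * cos(theta_i) dwi = albedo, for any wo.
    return albedo

def rho_numeric(brdf, wo, n_samples=20000):
    # Generic numeric estimate of the directional-hemispherical reflectance.
    # In practice this would be precomputed offline, not evaluated at runtime.
    pdf = 1.0 / (2.0 * math.pi)  # uniform hemisphere sampling, normal at +z
    total = 0.0
    for _ in range(n_samples):
        u1, u2 = random.random(), random.random()
        z = u1
        r = math.sqrt(max(0.0, 1.0 - z * z))
        phi = 2.0 * math.pi * u2
        wi = (r * math.cos(phi), r * math.sin(phi), z)
        total += brdf(wo, wi) * z / pdf
    return total / n_samples
```

For a Lambert BRDF the two agree, which is a handy sanity check when baking \rho for more complex BRDFs.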

L_o(\omega_o) = \rho(\omega_o) L_{IBL}(\omega_o, \sigma_r(\omega_o))
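Putting the pieces together, the runtime shading step amounts to one lookup and one multiply. The sketch below is mine, not the article's shader: the prefiltered maps are modeled as a list of lookup functions standing in for miplevels, and the linear glossiness-to-mip mapping is an assumption that a real engine would tune to its roughness parameterization.

```python
def shade_ibl(rho_val, prefiltered_mips, reflect_dir, glossiness):
    # prefiltered_mips: hypothetical list of lookup functions, mip 0 being the
    # mirror-like map and the last mip the roughest convolution.
    # Assumed linear mapping from glossiness to mip level, with linear
    # interpolation between the two nearest mips (trilinear filtering).
    mip = (1.0 - glossiness) * (len(prefiltered_mips) - 1)
    lo = int(mip)
    hi = min(lo + 1, len(prefiltered_mips) - 1)
    t = mip - lo
    l_ibl = (1.0 - t) * prefiltered_mips[lo](reflect_dir) \
          + t * prefiltered_mips[hi](reflect_dir)
    # L_o = rho(wo) * L_IBL(wo, sigma_r)
    return rho_val * l_ibl
```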

Demo Time

The result can be seen in the demo below, or a demo with more IBLs can be downloaded for Windows and Mac.
Note that due to texture resolutions, mirror reflections are not supported in the webplayer.

Future Work

A detail that was conveniently left out was how to compute \sigma_r for a given BRDF. \sigma_r represents an approximate mapping from an arbitrary BRDF to the power cosine distribution, but it is only an approximation, and estimating it depends on the BRDF used. Estimating \sigma_r becomes even harder for anisotropic BRDFs, where a single L_{IBL}(\omega_o, \sigma_r) sample isn't enough to capture the anisotropic properties of the surface. Interesting future work would be developing a general framework for estimating \sigma_r for any isotropic BRDF. Additionally, it would be interesting to investigate whether glossiness could be broken down into N cones inside the hemisphere for anisotropic BRDFs and used to create a believable estimation of anisotropic reflections.


References

  • Real Shading in Unreal Engine 4
  • Bidirectional Reflectance Distribution Function
  • GPU-Based Importance Sampling

4 thoughts on “Image Based Lighting”

  1. Hi,

    Thanks for the explanation. But I have a few doubts: should the environment map (cubemap) be multiplied with the diffuse and specular, or should it be added? And what is f(wo, wi)? Please do explain; I tried to implement IBL using a fragment shader in Unity but was not successful. I don't understand how to integrate it in a shader, and what omega is.
    Please do help me


    • Hey,

      The IBL should be multiplied by the diffuse and specular color, yes. That is what happens in the equation right above the demo, where rho, i.e. the albedo, gets multiplied by the incoming light from the IBL.
      f(wo, wi) is the Bidirectional Reflectance Distribution Function, BRDF. A link to the wiki page explaining BRDFs is given at the top of the article. I specifically kept the post theoretical to avoid explaining any specific BRDFs or how to compute rho for any of them.
      In most older renderers, though, materials mostly consist of a non-physical Lambert diffuse BRDF, Lambert(wo, wi) = diffuseColor, and a non-normalized Blinn or Phong glossy component, Blinn(wo, wi) = specColor * pow(dot(normal, halfway), exponent). In that case applying the IBL in a somewhat decent way can be done as IBLContribution = diffuseColor * diffuse_IBL_lookup + specColor * spec_IBL_lookup. This would work if you want to mimic a fixed function pipeline extended with IBL.
      For more concrete BRDF implementations and how to combine them into materials, you can have a look at what I've done in my Unity project. Although a decent knowledge of Unity shaders is needed in order to understand what is going on there.

      Hope that was helpful.

  2. Nowadays the mainstream engines use image-based lighting to achieve approximate global illumination, but such a technique seems to be applicable only to outdoor scenes. What kind of technique is used in indoor scenes to achieve global illumination? For example, in a room with natural light, or in a completely enclosed indoor environment.
