
Reflections Based on Local Cubemaps in Unity

In this blog, the concept of the local cubemap is reviewed and the implementation of reflections based on local cubemaps is discussed. A shader implementation in Unity is provided.

Roberto Lopez Mendez, Blogger

January 15, 2016


Introduction

The topic of this blog was presented to students in a workshop at the Brains Eden Gaming Festival 2014 at Anglia Ruskin University in Cambridge [1]. We wanted to provide students with an effective, low-cost technique for implementing reflections when developing games for mobile devices.

Early Reflection Implementations

From the very beginning, graphics developers have tried to find cheap alternatives for implementing reflections. One of the first solutions was spherical mapping, which simulates reflections or lighting on objects without expensive ray-tracing or lighting calculations. This approach has several disadvantages, but the main problem is the distortion introduced when mapping a picture onto a sphere. In 1999, it became possible to use cubemaps with hardware acceleration.

Figure 1: Spherical mapping.

 

Cubemaps solved the problems of image distortion, viewpoint dependency and computational inefficiency associated with spherical mapping. Cube mapping uses the six faces of a cube as the map shape. The environment is projected onto each side of the cube and stored as six square textures, or unfolded into six regions of a single texture. The cubemap is generated by rendering the scene from a given position with six different camera orientations, each using a 90-degree view frustum that represents one cube face. Source images are sampled directly; no distortion is introduced by resampling into an intermediate environment map.

Figure 2: Cubemaps.
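
As an illustrative sketch of how such a cubemap can be captured in Unity (the class and menu names below are just examples), Camera.RenderToCubemap renders the six 90-degree views from a chosen position into an existing Cubemap asset:

using UnityEngine;
using UnityEditor;

// Illustrative editor wizard (names are examples): renders the six 90-degree views
// of the scene from a chosen position into an existing Cubemap asset.
public class RenderCubemapWizard : ScriptableWizard
{
    public Transform renderPosition;   // position where the cubemap is captured
    public Cubemap targetCubemap;      // cubemap asset that receives the six faces

    [MenuItem("GameObject/Render Cubemap At Position")]
    static void CreateWizard()
    {
        ScriptableWizard.DisplayWizard<RenderCubemapWizard>("Render Cubemap", "Render");
    }

    void OnWizardCreate()
    {
        // Temporary camera used only for the capture.
        GameObject go = new GameObject("CubemapCamera");
        go.transform.position = renderPosition.position;
        go.transform.rotation = Quaternion.identity;
        Camera cam = go.AddComponent<Camera>();
        // Render the scene into the six cubemap faces from this position.
        cam.RenderToCubemap(targetCubemap);
        DestroyImmediate(go);
    }
}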

 

To implement reflections based on cubemaps we just need to evaluate the reflected vector R and use it to fetch the texel from the cubemap CubeMap using the available texture lookup function texCUBE:

float4 col = texCUBE(CubeMap, R);

Expression 1.
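
The reflected vector R itself is simply the view direction reflected about the surface normal. A minimal fragment-shader sketch (variable names are illustrative):

// Reflect the camera-to-surface direction about the normal and
// sample the (infinite) cubemap with the result.
float3 V = normalize(worldPos - _WorldSpaceCameraPos);
float3 R = reflect(V, normalize(worldNormal));
float4 col = texCUBE(CubeMap, R);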

 

Figure 3: Reflections based on infinite cubemaps.

 

With this approach we can only reproduce reflections correctly from a distant environment where the cubemap position is not relevant. This simple and effective technique is mainly used in outdoor lighting, for example, to add reflections of the sky. If we try to use this technique in a local environment we get inaccurate reflections.

Figure 4: Reflection on the floor incorrectly calculated with an infinite cubemap.

Local Reflections

The main reason this reflection fails is that Expression 1 contains no binding to the local geometry. For example, according to Expression 1, if we walked across a reflective floor while looking at it from the same angle, we would always see the same reflection on it. Since the direction of the view vector does not change, the reflected vector is always the same and Expression 1 always returns the same result. This is not what happens in the real world, where reflections depend on both the viewing angle and the viewing position.

The solution to this problem was first proposed by Kevin Bjorke [2] in 2004. For the first time, a binding to the local geometry was introduced into the procedure for calculating the reflection:

Figure 5: Reflections using local cubemaps.

 

While this approach gives good results on surfaces that are close to spherical, on planar reflective surfaces the reflection shows noticeable deformations. Another drawback of this method is the relative complexity of the algorithm for calculating the intersection point with the bounding sphere, which requires solving a second-degree equation.
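
For comparison, the sphere-based correction can be sketched as follows (illustrative code, not Bjorke's exact implementation): the reflected ray starting at the fragment is intersected with the bounding sphere by solving a quadratic, and the vector from the cubemap position to the intersection point is used for the lookup.

// Illustrative sketch of a sphere-based local correction (not Bjorke's exact code).
float3 SphereCorrect(float3 fragPosWS, float3 reflDirWS, float3 sphereCenterWS,
                     float sphereRadius, float3 cubemapPosWS)
{
    // Ray: X(t) = fragPos + t * reflDir (reflDir assumed normalized).
    // Intersect with |X - center|^2 = radius^2, i.e. solve a quadratic in t.
    float3 P = fragPosWS - sphereCenterWS;
    float b = dot(P, reflDirWS);
    float c = dot(P, P) - sphereRadius * sphereRadius;
    float t = -b + sqrt(b * b - c);        // positive root: the fragment lies inside the sphere
    float3 hitWS = fragPosWS + t * reflDirWS;
    return hitWS - cubemapPosWS;           // locally corrected lookup vector
}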

A few years later, in 2010, a better solution was proposed [3] in a developer forum thread at gamedev.net. The new approach replaced the bounding sphere with a box, solving the shortcomings of Bjorke's method: the deformations and the complexity of the algorithm for finding the intersection point.

Figure 6: Introducing a bounding box.

 

A more recent work [4] uses this new approach to simulate more complex ambient specular lighting using several cubemaps, and proposes an algorithm to evaluate the contribution of each cubemap and blend them efficiently on the GPU.
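
The blending idea can be sketched roughly as follows (a simplification; the weighting scheme in [4] is more elaborate, and the uniforms _BBoxMin0, _Cube0 and so on are assumed to exist, while LocalCorrect is the function shown in the Shader Implementation section below): each overlapping cubemap is sampled with its own locally corrected vector and the results are combined with normalized per-cubemap weights.

// Rough sketch of blending two overlapping local cubemaps; w0 and w1 are
// per-fragment weights (e.g. derived from the fragment's position inside each volume).
float4 BlendLocalCubemaps(float3 reflDirWS, float3 fragPosWS, float w0, float w1)
{
    float3 dir0 = LocalCorrect(reflDirWS, _BBoxMin0, _BBoxMax0, fragPosWS, _CubePos0);
    float3 dir1 = LocalCorrect(reflDirWS, _BBoxMin1, _BBoxMax1, fragPosWS, _CubePos1);
    float4 refl0 = texCUBE(_Cube0, dir0);
    float4 refl1 = texCUBE(_Cube1, dir1);
    // Normalize the weights so overlapping volumes sum to one.
    float sum = max(w0 + w1, 1e-4);
    return (w0 * refl0 + w1 * refl1) / sum;
}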

At this point we must clearly distinguish between the two types of cubemap. An infinite cubemap represents an environment that is effectively at an infinite distance, so only the direction of the reflection vector matters and the position where the cubemap was generated is irrelevant; it is typically used for skies and distant scenery. A local cubemap, in contrast, is generated at a specific position inside a bounded local environment, and the reflection vector must be corrected against that bounding volume before the lookup.

Figure 7 shows the same scene from Figure 4 but this time with correct reflections using local cubemaps.

Figure 7: Reflection on the floor correctly calculated with a local cubemap.

 

Shader Implementation

The shader implementation in Unity of reflections using local cubemaps is provided below. In the vertex shader, we calculate the three quantities we need to pass to the fragment shader as interpolated values: the vertex position, the view direction and the normal, all of them in world coordinates:


vertexOutput vert(vertexInput input)
{
    vertexOutput output;
    output.tex = input.texcoord;
    // Transform vertex coordinates from local to world.
    float4 vertexWorld = mul(_Object2World, input.vertex);
    // Transform normal to world coordinates.
    float4 normalWorld = mul(float4(input.normal, 0.0), _World2Object);
    // Final vertex output position.   
    output.pos = mul(UNITY_MATRIX_MVP,  input.vertex);
    // ----------- Local correction ------------
    output.vertexInWorld = vertexWorld.xyz;
    output.viewDirInWorld = vertexWorld.xyz - _WorldSpaceCameraPos;
    output.normalInWorld = normalWorld.xyz;
    return output;
}

In the fragment shader, the reflected vector is calculated and its intersection point with the bounding box is found. The locally corrected reflection vector is then built and used to fetch the reflection texel from the local cubemap. Finally, the texture and the reflection are combined to produce the output colour:


float3 LocalCorrect(float3 origVec, float3 bboxMin, float3 bboxMax,
                    float3 vertexPos, float3 cubemapPos)
{
    // Find the ray intersection with box plane
    float3 invOrigVec = float3(1.0,1.0,1.0)/origVec;
    float3 intersecAtMaxPlane = (bboxMax - vertexPos) * invOrigVec;
    float3 intersecAtMinPlane = (bboxMin - vertexPos) * invOrigVec;
    // Get the largest intersection values
    // (we are not interested in negative values)
    float3 largestIntersec = max(intersecAtMaxPlane, intersecAtMinPlane);
    // Get the closest of all solutions
    float Distance = min(min(largestIntersec.x, largestIntersec.y),
                         largestIntersec.z);
    // Get the intersection position
    float3 IntersectPositionWS = vertexPos + origVec * Distance;
    // Get corrected vector
    float3 localCorrectedVec = IntersectPositionWS - cubemapPos;
    return localCorrectedVec;
}

float4 frag(vertexOutput input) : COLOR
{
     float4 reflColor = float4(1, 1, 0, 0);
     // Find reflected vector in WS.
     float3 viewDirWS = normalize(input.viewDirInWorld);
     float3 normalWS = normalize(input.normalInWorld);
     float3 reflDirWS = reflect(viewDirWS, normalWS);
     // Get local corrected reflection vector.
     float3 localCorrReflDirWS = LocalCorrect(reflDirWS, _BBoxMin,
                                 _BBoxMax, input.vertexInWorld,
                                 _EnviCubeMapPos);
     // Lookup the environment reflection texture with the right vector. 
     reflColor = texCUBE(_Cube, localCorrReflDirWS);
     // Lookup the texture color.
     float4 texColor = tex2D(_MainTex, input.tex.xy);
     return _AmbientColor + texColor * _ReflAmount * reflColor;
}

In the above fragment shader code, _BBoxMax and _BBoxMin are the maximum and minimum points of the bounding volume, and _EnviCubeMapPos is the position where the cubemap was created. These values are passed to the shader from the script below:


using UnityEngine;

[ExecuteInEditMode]
public class InfoToReflMaterial : MonoBehaviour
{
    // The proxy volume used for local reflection calculations.
    public GameObject boundingBox;

    void Start()
    {
        Vector3 bboxLength = boundingBox.transform.localScale;
        Vector3 centerBBox = boundingBox.transform.position;
        // Min and max BBox points in world coordinates.
        Vector3 BMin = centerBBox - bboxLength / 2;
        Vector3 BMax = centerBBox + bboxLength / 2;
        // Pass the values to the material.
        Material mat = GetComponent<Renderer>().sharedMaterial;
        mat.SetVector("_BBoxMin", BMin);
        mat.SetVector("_BBoxMax", BMax);
        mat.SetVector("_EnviCubeMapPos", centerBBox);
    }
}

The values of _AmbientColor and _ReflAmount, as well as the main texture and the cubemap texture, are passed to the shader as uniforms from the properties block:


Shader"Custom/ctReflLocalCubemap"
{
    Properties
    {
        _MainTex ("Base (RGB)", 2D) = "white" { }
        _Cube("Reflection Map", Cube) = "" {}
        _AmbientColor("Ambient Color", Color) = (1, 1, 1, 1)
        _ReflAmount("Reflection Amount", Float) = 0.5
    }
    SubShader
    {
        Pass
        {  
            CGPROGRAM
            #pragma glsl
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
           // User-specified uniforms
            uniform sampler2D _MainTex;
            uniform samplerCUBE _Cube;
            uniform float4 _AmbientColor;
            uniform float _ReflAmount;
            uniform float _ToggleLocalCorrection;
            // ---- Passed from script InfoToReflMaterial.cs --------
            uniform float3 _BBoxMin;
            uniform float3 _BBoxMax;
            uniform float3 _EnviCubeMapPos;

            struct vertexInput
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                float4 texcoord : TEXCOORD0;
            };

            struct vertexOutput
            {
                float4 pos : SV_POSITION;
                float4 tex : TEXCOORD0;
                float3 vertexInWorld : TEXCOORD1;
                float3 viewDirInWorld : TEXCOORD2;
                float3 normalInWorld : TEXCOORD3;
            };

            // Vertex shader vert() as listed above.
            // Fragment shader frag() and LocalCorrect() as listed above.
            ENDCG
          }
      }
}

The algorithm for calculating the intersection point with the bounding volume is based on the parametric representation of the reflected ray starting from the local position (the fragment). A more detailed explanation of the ray-box intersection algorithm can be found in reference [4].
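
Written out, with P the fragment position, R the reflected direction, B_min and B_max the bounding box corners and C the cubemap position (all in world space), the correction performed by LocalCorrect amounts to:

\[
t_1 = \frac{B_{max} - P}{R}, \qquad t_2 = \frac{B_{min} - P}{R} \quad \text{(component-wise)}
\]
\[
t_{hit} = \min\big(\max(t_{1x}, t_{2x}),\ \max(t_{1y}, t_{2y}),\ \max(t_{1z}, t_{2z})\big)
\]
\[
R_{corrected} = (P + t_{hit}\,R) - C
\]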

Filtering Cubemaps

One of the advantages of implementing reflections using local cubemaps is the fact that the cubemap is static, i.e. it is generated during development rather than at run-time. This gives us the opportunity to apply any filtering to the cubemap images to achieve a given effect.

As an example, the image below shows reflections using a cubemap to which a Gaussian filter was applied to achieve a “frosty” effect. The CubeMapGen tool [5] (from AMD) was used to apply the filtering. To give an idea of how expensive this process can be, filtering a 256-pixel cubemap took more than one minute on a PC.

Figure 8: Gaussian filter applied to reflections in Figure 3.

 

A specific tool was developed for Unity to generate cubemaps and save the cubemap images separately for later import into CubeMapGen. Detailed information about this tool and about the whole process of exporting cubemap images from Unity to CubeMapGen, applying filtering and re-importing them back into Unity can be found in the References section [4].
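
As a rough illustration of what such a tool does (this is a hypothetical sketch, not the actual tool), the six faces of a readable Unity Cubemap can be copied out with Cubemap.GetPixels and written as PNG files for import into CubeMapGen:

using System.IO;
using UnityEngine;

// Hypothetical sketch: export the six faces of a (readable) Cubemap as PNG files
// so they can be loaded into an external tool such as CubeMapGen.
public static class CubemapFaceExporter
{
    static readonly CubemapFace[] faces =
    {
        CubemapFace.PositiveX, CubemapFace.NegativeX,
        CubemapFace.PositiveY, CubemapFace.NegativeY,
        CubemapFace.PositiveZ, CubemapFace.NegativeZ
    };

    public static void ExportFaces(Cubemap cubemap, string folder)
    {
        foreach (CubemapFace face in faces)
        {
            // Copy the face pixels into a temporary 2D texture.
            Texture2D tex = new Texture2D(cubemap.width, cubemap.width,
                                          TextureFormat.RGBA32, false);
            tex.SetPixels(cubemap.GetPixels(face));
            tex.Apply();
            // Encode to PNG and write to disk.
            File.WriteAllBytes(Path.Combine(folder, face + ".png"), tex.EncodeToPNG());
            Object.DestroyImmediate(tex);
        }
    }
}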

Conclusions

Reflections based on static local cubemaps are an effective tool for implementing high-quality, realistic reflections and a cheap alternative to reflections generated at run-time. This is especially important on mobile devices, where performance and memory bandwidth consumption are critical to the success of many games.

Additionally, reflections based on static local cubemaps allow developers to apply filters to the cubemap to achieve complex effects that would otherwise be prohibitively expensive at run-time, even on high-end PCs.

The inherent limitation of static cubemaps when dealing with dynamic objects can be solved easily by combining static reflections with reflections generated at run-time. This topic will be examined in a future blog.

REFERENCES

[1] Reflections based on local cubemaps. Presentation at Brains Eden, 2014 Gaming Festival at Anglia Ruskin University in Cambridge.  http://malideveloper.arm.com/downloads/ImplementingReflectionsinUnityUsingLocalCubemaps.pdf

[2] GPU Gems, Chapter 19. Image-Based Lighting. Kevin Bjorke, 2004. http://http.developer.nvidia.com/GPUGems/gpugems_ch19.html

[3] Cubemap Environment Mapping. 2010. http://www.gamedev.net/topic/568829-box-projected-cubemap-environment-mapping/

[4] Image-based Lighting approaches and parallax-corrected cubemap. Sebastien Lagarde. SIGGRAPH 2012. http://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/

[5] CubeMapGen. http://developer.amd.com/tools-and-sdks/archive/legacy-cpu-gpu-tools/cubemapgen/
