
Utilising Unity shaders to recreate Photoshop blend modes

Have you ever wanted to use the same blending modes in Unity that you are able to use in Photoshop? This blog goes over the basic setup - especially good for those new to using shaders in Unity.

Rebecca Fernandez, Blogger

December 3, 2015


During the development of our game Tricky Towers it was my responsibility to research shader effects in order to spice up our graphics a bit. I was quite new to our company WeirdBeard and new to working with Unity, and WeirdBeard was quite new to using shaders, having predominantly made games for mobile and browser in the past. I'm by no means experienced at writing shaders, but I've amused myself with some OpenGL projects and had some experience with HLSL from working on XNA games - so I at least knew the right questions to ask when starting my research. Our artist particularly wanted me to look into recreating the blending modes he was used to in Photoshop. There was surprisingly little online about this - so hopefully this blog post helps others.

Basics

Since I'm used to GLSL and HLSL shader files - and the communication between those and C++ and C# respectively - I first had to get my head around how Unity does this.

1. ShaderLab

Unity (unsurprisingly) has done the hard work for us and provides us with an interface between the shader itself and the Unity material it will eventually be tied to. This is called ShaderLab. Each shader file must begin with the ShaderLab root command "Shader":


Shader "name" {

     //rest of the file goes here

}

This is all set up for you by Unity when you create a shader within the editor – but it can be useful to change the name.

2. Properties

The first command inside these braces is usually "Properties". This is a list of parameters that can be set in the Unity material inspector (or, of course, via script). These properties will become inputs to your shader code later on. The general syntax for these is:


name ("displayed name in editor", type) = defaultvalue
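
As a concrete sketch, here is how the texture and tint colour used by the example shaders further down might be declared (the names _MainTex and _Color match the variables those shaders expect):


Properties {
     _MainTex ("Texture", 2D) = "white" {}
     _Color ("Tint Colour", Color) = (1, 1, 1, 1)
}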

3. Subshader

Next comes the Subshader command. A shader must have at least one Subshader - when the game is loaded Unity will choose the first Subshader that is supported by the user's machine. Within a subshader there are three different types of shader that can be written: surface shaders, vertex & fragment shaders, and fixed function shaders. Fixed function shaders are deprecated in Unity 5, so I won't talk any further about those. Surface shaders are most useful for lighting calculations - and since we're not using any lighting effects in our game (so far) I did not research these any further; being more familiar with vertex and fragment shaders, I also found surface shaders quite restrictive. So, on to vertex and fragment shaders. These are more complex to write than surface shaders, but they are more flexible and can achieve more. They can be written in Cg or HLSL; Unity recommends Cg for performance reasons.

4. Pass

The shader code needs to be written inside a pass block inside the subshader. A pass block causes the geometry the shader is attached to to be rendered once, so multiple passes inside a subshader will result in that geometry being rendered multiple times. The vertex and fragment shaders are embedded within a pass between the CGPROGRAM and ENDCG keywords.
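
Putting the last two sections together, the scaffolding looks something like this (just a sketch - the actual shader code is covered in the next section):


SubShader {
     Pass {
          CGPROGRAM
          // compilation directives, structs and shader functions go here
          ENDCG
     }
}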

5. Vertex & Fragment shaders

The first things to write in here are the compilation directives. Most useful are:


#pragma vertex vert      //the name of your vertex shader function
#pragma fragment frag    //the name of your fragment/pixel shader function
#include "UnityCG.cginc" //contains a number of really useful functions to use in the shaders

After this we declare the input variables we specified in the Properties section - the names and types need to match so that Unity can hook them up. We also have to create a struct to be used as the output of the vertex shader and the input of the fragment shader. We can then write the vertex and fragment functions. The vertex function should return an instance of the struct defined earlier; the fragment function should return a fixed4 that represents the colour of the pixel.


//redeclare the properties as variables so the CG code can use them
sampler2D _MainTex;
float4 _MainTex_ST; //tiling/offset values for _MainTex, used by TRANSFORM_TEX
fixed4 _Color;

struct v2f {
     float4 pos : SV_POSITION;
     float2 uv : TEXCOORD0;
};

v2f vert (appdata_base v)
{
     v2f o;
     o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
     o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
     return o;
}

fixed4 frag (v2f i) : SV_Target
{
      fixed4 texcol = tex2D (_MainTex, i.uv);
      return texcol * _Color;
}

The vertex shader above takes in an argument of type appdata_base, which is a struct defined in UnityCG.cginc. All this shader does is transform the vertex position from object space into screen (clip) space, and apply the texture's tiling and offset settings to the uv coordinates - if you aren't sure how these transformations work, there are plenty of good introductions to the graphics transformation pipeline online.

The fragment shader samples the texture at the correct uv coordinates for that pixel (as calculated in the previous step and then interpolated per pixel) and multiplies the result by a colour that has been passed in to the shader. So if the colour is white, the texture is displayed on the geometry unchanged.

6. Applying the shader

Within Unity the shader file needs to be attached to a material. This material can then be attached to a component on a game object in order to affect that object.

7. Shader model version

By default Unity compiles shaders to shader model 2.0 (equivalent to shader model 2.0 for Direct3D 9). Usually this is enough, and it will work on the widest range of platforms. I did once have an issue because I was using too many arithmetic instructions, so I moved up a version with the command:


#pragma target 3.0

Apparently this only knocks out Windows Phone 8 and Windows RT, neither of which are platforms we are targeting.

8. Errors

You will make mistakes during development - and Unity actually gives you a fair bit of information about shader compilation errors. When looking at the shader in the Unity inspector, you can see a list of errors that will usually point you in the right direction. 

[Image: shader compilation errors listed in the Unity inspector]

You can actually still compile and run the game with a shader error - the offending shader will just colour your object bright pink. This does mean you can edit your shader code while the game is still running though!  

Blending

So far so good. All of this has been pretty basic, and there were a lot of resources online (especially the Unity manual) to help me with this information.

Blending is actually part of the fixed-function pipeline - meaning that the graphics card handles blending, and the only way to customise it is to tell the card which blending methods you'd like it to use. The graphics card compares the colour results of multiple shaders for a pixel and decides how much each colour will influence the final pixel. Since Photoshop blend modes are not built in, I needed to simulate the effect of blending: within one shader I need to compare two colours and mix them together accordingly. The two colours I need are the colour of the object being rendered at that texture coordinate, and the colour of whatever was behind my object at that position.

The first is not so difficult - we can see it being calculated in the code above. For the second I would need to have already rendered the screen in order to determine the colour behind my object. Luckily Unity provides a way to do that, in the form of an extra pass called a GrabPass. A GrabPass saves the current contents of the screen into a texture, which you can then access in the next pass via the variable _GrabTexture (I'll show how the GrabPass is declared after the shader code below). From this texture I need the colour at the correct pixel coordinate. Calculating the colour needed from the texture of the object is easy - in fact the shader example above is doing exactly that. But I will also need to calculate the uv coordinates for the grab texture - I can do this by calling the ComputeGrabScreenPos function in the vertex shader and passing the result along to the fragment shader. E.g.:


struct v2f {
     float4 vertex : SV_POSITION;
     fixed4 color : COLOR;
     float2 texcoord : TEXCOORD0;
     half4 screenuv : TEXCOORD2;
};

v2f vert(appdata_t v)
{
     v2f o;
     o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
     o.color = v.color;
     o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
     o.screenuv = ComputeGrabScreenPos(o.vertex);
     return o;
}

Then in the fragment shader we can calculate the colour like so:


fixed4 colour = tex2D(_GrabTexture, i.screenuv.xy / i.screenuv.w); //divide by w - ComputeGrabScreenPos returns homogeneous coordinates
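
For completeness: the grab itself is declared in ShaderLab as its own pass, placed before the pass that uses it, and _GrabTexture is declared like any other sampler. A minimal sketch of the structure:


SubShader {
     //saves the current contents of the screen into _GrabTexture
     GrabPass { }

     Pass {
          CGPROGRAM
          //...compilation directives as before...
          sampler2D _GrabTexture;
          //...vertex and fragment shaders as above...
          ENDCG
     }
}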

Alright, now that we have two colours we can actually get started with blending them together. So far in Tricky Towers we’ve used Multiply and Overlay blending. Multiply is quite easy – all we do is multiply each channel in one colour by the corresponding channel in another colour:


fixed4 base = tex2D(_GrabTexture, i.screenuv.xy / i.screenuv.w);
fixed4 top = tex2D(_MainTex, float2(i.texcoord.x, i.texcoord.y));

fixed4 col = fixed4(0, 0, 0, 0);

col.r = base.r * top.r;
col.g = base.g * top.g;
col.b = base.b * top.b;
col.a = top.a;

Overlay is only slightly more complex. For those people like myself who are allergic to Photoshop - the Overlay mode darkens the result where the base (bottom) colour is dark and lightens it where the base colour is light, boosting contrast. The algorithm is actually on Wikipedia and is quite easy to implement: https://en.wikipedia.org/wiki/Blend_modes#Overlay. Per channel, the result is 2 * base * top where the base is below 0.5, and 1 - 2 * (1 - base) * (1 - top) otherwise:


float Overlay(float base, float top)
{
     if (base < 0.5) {
          //dark base: multiply, darkening the result
          return 2 * base * top;
     }
     else {
          //light base: screen, lightening the result
          return 1 - 2 * (1 - base) * (1 - top);
     }
}

And within the fragment shader:


fixed4 base = tex2D(_GrabTexture, i.screenuv.xy / i.screenuv.w);
fixed4 top = tex2D(_MainTex, i.texcoord);

fixed4 col = fixed4(0, 0, 0, 0);

col.r = Overlay(base.r, top.r);
col.g = Overlay(base.g, top.g);
col.b = Overlay(base.b, top.b);
col.a = top.a;

So what was the result? See for yourself!

[Image: the brick guide in Tricky Towers, rendered with the overlay shader on the left and a regular alpha blend on the right]

On the left you can see the effects of the overlay shader on the brick guide (the vertical lane of lighter area surrounding the brick). On the right is the same texture but with just a regular alpha blend. The overlay effect certainly provides us with a richer tone, and makes the brick guide easier to see. 

Note: We're using SV_POSITION instead of the normal semantic POSITION in order to be compatible with PS4.
