
In-depth: Shader generator

In this reprinted #altdevblogaday opinion piece, game programmer Simon Yeung describes how to generate vertex and pixel shader source code for different render passes by defining a surface shader.

August 3, 2012


[In this reprinted #altdevblogaday opinion piece, game programmer Simon Yeung describes how to generate vertex and pixel shader source code for different render passes by defining a surface shader.] In the last few weeks, I was busy rewriting my iPhone engine so that it can also run on Windows (so that I can use Visual Studio instead of Xcode~) and, most importantly, so that I can play around with D3D11. During the rewrite, I wanted to improve the process of writing shaders so that I don't need to write similar shaders multiple times for each shader permutation (say, for each surface, one shader for static meshes, skinned meshes, instanced static meshes… multiplied by the number of render passes) and can instead focus on coding how the surface looks. So I decided to write a shader generator that produces those shaders, similar to the surface shaders in Unity. I chose the surface shader approach over a graph-based approach like Unreal Engine's because, being a programmer, I feel more comfortable (and faster) writing code than dragging tree nodes around a GUI. In its current implementation, the shader generator can only generate vertex and pixel shaders for the light pre-pass renderer, which is the lighting approach my engine used before.

Defining the surface

To have the shader generator produce the target vertex and pixel shaders, we need to define how the surface looks by writing a surface shader. In my version of the surface shader, three functions need to be defined: a vertex function, a surface function, and a lighting function. The vertex function computes vertex properties such as position and texture coordinates.

VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
    VTX_FUNC_OUTPUT output;
    output.position = mul( float4(input.position, 1), worldViewProj );
    output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
    output.uv0 = input.uv0;
    return output;
}

The surface function describes how the surface looks by defining its diffuse color, glossiness, and surface normal.

SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
    SUF_FUNC_OUTPUT output;
    output.normal = input.normal;
    output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
    output.glossiness = glossiness;
    return output;
}

Finally, the lighting function decides which lighting model is used to calculate the reflected color of the surface.

LIGHT_FUNC_OUTPUT lightFuncLPP(LIGHT_FUNC_INPUT input)
{
    LIGHT_FUNC_OUTPUT output;
    float4 lightColor = lightBuffer.Sample(samplerLinear, input.pxPos.xy * renderTargetSizeInv.xy );
    output.color = float4(input.diffuse * lightColor.rgb, 1);
    return output;
}

By defining the above functions, the writer of a surface shader only needs to fill in each function's output structure from its input structure, with the help of some auxiliary functions and shader constants provided by the engine.

Generating the shaders

As you can see in the above code snippets, my surface shader just defines ordinary HLSL functions with fixed input and output structures. So to generate the vertex and pixel shaders, we simply copy these functions into the target shader code, which then invokes them. Taking the above vertex function as an example, the generated vertex shader would look like this:

#include "include.h"

struct VS_INPUT
{
    float3 position : POSITION0;
    float3 normal : NORMAL0;
    float2 uv0 : UV0;
};

struct VS_OUTPUT
{
    float4 position : SV_POSITION0;
    float3 normal : NORMAL0;
    float2 uv0 : UV0;
};

typedef VS_INPUT VTX_FUNC_INPUT;
typedef VS_OUTPUT VTX_FUNC_OUTPUT;

/********************* User Defined Content ********************/
VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
    VTX_FUNC_OUTPUT output;
    output.position = mul( float4(input.position, 1), worldViewProj );
    output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
    output.uv0 = input.uv0;
    return output;
}
/******************** End User Defined Content *****************/

VS_OUTPUT main(VS_INPUT input)
{
    return vtxFunc(input);
}

During code generation, the shader generator needs to figure out which input and output structures are needed to feed the user-defined functions. This task is simple and can be accomplished with some basic string processing, as sketched below.
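For instance, a minimal C++ sketch of this string-based approach (my own illustration, not the engine's actual code) could scan the user-defined function body for "input." and "output." member accesses and collect the fields the generated structures must declare:

#include <iostream>
#include <regex>
#include <set>
#include <string>

// Collect the member names accessed through a given variable,
// e.g. scanning "input." over the vertex function body yields
// { "normal", "position", "uv0" }.
std::set<std::string> collectMembers(const std::string& source,
                                     const std::string& variable)
{
    std::set<std::string> members;
    std::regex access(variable + R"(\.([A-Za-z_]\w*))");
    for (auto it = std::sregex_iterator(source.begin(), source.end(), access);
         it != std::sregex_iterator(); ++it)
    {
        members.insert((*it)[1].str());
    }
    return members;
}

int main()
{
    // Body of the user-defined vertex function shown earlier.
    const std::string vtxFunc =
        "output.position = mul( float4(input.position, 1), worldViewProj );\n"
        "output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;\n"
        "output.uv0 = input.uv0;\n";

    for (const auto& m : collectMembers(vtxFunc, "input"))
        std::cout << "VS_INPUT needs: " << m << "\n";
    for (const auto& m : collectMembers(vtxFunc, "output"))
        std::cout << "VS_OUTPUT needs: " << m << "\n";
}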

Simplifying the shader

As I mentioned before, my shader generator produces the shaders used in the light pre-pass renderer. The light pre-pass renderer has two geometry passes, and they need different shader inputs and outputs. For example, the G-buffer pass only cares about the surface normal data and not the diffuse color, while the second geometry pass needs the opposite. However, all of the surface information (surface normal and diffuse color) is defined in the surface function inside the surface shader. If we simply generate shaders as in the last section, we will produce redundant code that the shader compiler cannot always optimize away. For example, the pixel shader in the G-buffer pass may sample the diffuse texture, which requires texture coordinates from the vertex shader; but since the diffuse color is not actually needed in this pass, the compiler may not be able to figure out that the texture coordinate output of the vertex shader is unnecessary.

Of course, we could force the writer to add #if preprocessor blocks inside the surface function for particular render passes to eliminate the useless outputs, but this would complicate authoring the surface shader: ideally, describing how a surface looks shouldn't require worrying about the outputs of each render pass. So the problem is to figure out which output data are actually needed in a given pass and eliminate the outputs that are not. For example, say we are generating shaders for the G-buffer pass from this surface function:

SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
    SUF_FUNC_OUTPUT output;
    output.normal = input.normal;
    output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
    output.glossiness = glossiness;
    return output;
}

We only want to keep the assignments to output.normal and output.glossiness; output.diffuse, along with the variables referenced only by it (diffuseTex, samplerLinear, input.uv0), should be eliminated. To find such variable dependencies, we need to teach the shader generator to understand HLSL grammar and find all the assignment statements and branching conditions from which the dependencies can be derived. To do this, we generate an abstract syntax tree from the shader source code. We could write our own LALR parser for this, but I chose to use lex & yacc (or flex & bison) to generate the parse tree. Luckily we only work with a subset of HLSL syntax (we just need to define functions and never use pointers), and HLSL syntax is similar to the C language, so modifying the ANSI C grammar rules for lex & yacc does the job. Here are my modified grammar rules used to generate the parse tree.

By traversing the parse tree, the variable dependencies can be obtained; from them we know which variables to eliminate, and we eliminate them by taking out their assignment statements, letting the compiler do the rest (a small sketch of this elimination step follows the listing). Below is the simplified pixel shader generated for the previous example:

#include "include.h"

cbuffer _materialParam : register( MATERIAL_CONSTANT_BUFFER_SLOT_0 )
{
    float glossiness;
};

Texture2D diffuseTex: register( MATERIAL_SHADER_RESOURCE_SLOT_0 );

struct PS_INPUT
{
    float4 position : SV_POSITION0;
    float3 normal : NORMAL0;
};

struct PS_OUTPUT
{
    float4 gBuffer : SV_Target0;
};

struct SUF_FUNC_OUTPUT
{
    float3 normal;
    float glossiness;
};

typedef PS_INPUT SUF_FUNC_INPUT;

/********************* User Defined Content ********************/
SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
    SUF_FUNC_OUTPUT output;
    output.normal = input.normal;
    ;
    output.glossiness = glossiness;
    return output;
}
/******************** End User Defined Content *****************/

PS_OUTPUT main(PS_INPUT input)
{
    SUF_FUNC_OUTPUT sufOut = sufFunc(input);
    PS_OUTPUT output;
    output.gBuffer = normalToGBuffer(sufOut.normal, sufOut.glossiness);
    return output;
}
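The elimination itself boils down to a dependency walk over the assignment statements gathered from the parse tree. Below is a minimal C++ sketch of the idea (hypothetical data structures, not my actual generator code): starting from the outputs the current render pass needs, keep only the assignments that feed them, directly or transitively, and drop the rest.

#include <iostream>
#include <set>
#include <string>
#include <vector>

// One assignment statement extracted from the parse tree.
struct Assignment
{
    std::string lhs;              // e.g. "output.diffuse"
    std::set<std::string> rhs;    // the variables it reads
};

std::vector<Assignment> stripUnused(const std::vector<Assignment>& body,
                                    std::set<std::string> needed)
{
    // Walk the statements backwards so a kept assignment can mark the
    // variables it reads as needed for earlier statements.
    std::vector<Assignment> kept;
    for (auto it = body.rbegin(); it != body.rend(); ++it)
    {
        if (needed.count(it->lhs) == 0)
            continue;                              // dead store for this pass
        needed.insert(it->rhs.begin(), it->rhs.end());
        kept.push_back(*it);
    }
    return std::vector<Assignment>(kept.rbegin(), kept.rend()); // restore order
}

int main()
{
    // The assignments of sufFunc, as gathered from its parse tree.
    std::vector<Assignment> sufFunc = {
        { "output.normal",     { "input.normal" } },
        { "output.diffuse",    { "diffuseTex", "samplerLinear", "input.uv0" } },
        { "output.glossiness", { "glossiness" } },
    };

    // The G-buffer pass only consumes the normal and the glossiness.
    for (const auto& a : stripUnused(sufFunc, { "output.normal",
                                                "output.glossiness" }))
        std::cout << a.lhs << " kept\n";
}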

Extending the surface shader syntax

Because I use lex & yacc to parse the surface shader, I can extend the surface shader syntax by adding more grammar rules. This lets the writer of the surface shader declare which shader constants and textures their surface function needs, and the generator emits the corresponding constant buffer and shader resource declarations in the generated source code (a sketch of that step follows the sample below). My surface shader syntax also permits the user to define their own structs and functions besides the three main functions (the vertex, surface and lighting functions); these are copied into the generated source code as well. Here is a sample of what my surface shader looks like:

RenderType{
    opaque;
};

ShaderConstant
{
    float glossiness: ui_slider_0_255_Glossiness;
};

TextureResource
{
    Texture2D diffuseTex;
};

VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
    VTX_FUNC_OUTPUT output;
    output.position = mul( float4(input.position, 1), worldViewProj );
    output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
    output.uv0 = input.uv0;
    return output;
}

SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
    SUF_FUNC_OUTPUT output;
    output.normal = input.normal;
    output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
    output.glossiness = glossiness;
    return output;
}

LIGHT_FUNC_OUTPUT lightFuncLPP(LIGHT_FUNC_INPUT input)
{
    LIGHT_FUNC_OUTPUT output;
    float4 lightColor = lightBuffer.Sample(samplerLinear, input.pxPos.xy * renderTargetSizeInv.xy );
    output.color = float4(input.diffuse * lightColor.rgb, 1);
    return output;
}
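Once the ShaderConstant and TextureResource blocks are parsed, emitting the matching declarations in the generated shader is mostly string concatenation. Here is a rough C++ sketch of that step (hypothetical types, with the register macros taken from the generated pixel shader shown earlier, not my engine's real API):

#include <iostream>
#include <string>
#include <vector>

// A declaration parsed from a ShaderConstant or TextureResource block.
struct Declaration { std::string type, name; };

std::string emitMaterialDeclarations(const std::vector<Declaration>& constants,
                                     const std::vector<Declaration>& textures)
{
    // All material constants go into one constant buffer, as in the
    // generated pixel shader above.
    std::string code =
        "cbuffer _materialParam : register( MATERIAL_CONSTANT_BUFFER_SLOT_0 )\n{\n";
    for (const auto& c : constants)
        code += "    " + c.type + " " + c.name + ";\n";
    code += "};\n";

    // Each texture gets its own shader resource slot.
    int slot = 0;
    for (const auto& t : textures)
        code += t.type + " " + t.name + ": register( MATERIAL_SHADER_RESOURCE_SLOT_" +
                std::to_string(slot++) + " );\n";
    return code;
}

int main()
{
    std::cout << emitMaterialDeclarations(
        { { "float", "glossiness" } },          // from the ShaderConstant block
        { { "Texture2D", "diffuseTex" } });     // from the TextureResource block
}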

Conclusions

This post described how I generate vertex and pixel shader source code for different render passes by defining a surface shader, which lets me avoid writing similar shaders multiple times and stop worrying about the particular shader inputs and outputs of each render pass. Currently, the shader generator can only generate vertex and pixel shaders in HLSL for static meshes in the light pre-pass renderer. The shader generator is still a work in progress: generating shader source code for the forward pass is incomplete, and domain, hull, and geometry shaders are not handled. GLSL support is also missing, but it could be added (in theory…) by building a more sophisticated abstract syntax tree while parsing the surface shader, or by defining some new grammar rules in the surface shader (using lex & yacc) to make it easier to generate both HLSL and GLSL source code. But these are left for the future, as I still need to rewrite my engine and get it running again…

References

[1] Unity – Surface Shader Examples: http://docs.unity3d.com/Documentation/Components/SL-SurfaceShaderExamples.html
[2] Lex & Yacc Tutorial: http://epaperpress.com/lexandyacc/
[3] ANSI C grammar, Lex specification: http://www.lysator.liu.se/c/ANSI-C-grammar-l.html
[4] ANSI C Yacc grammar: http://www.lysator.liu.se/c/ANSI-C-grammar-y.html
[5] http://www.ibm.com/developerworks/opensource/library/l-flexbison/index.html
[6] http://www.gamedev.net/topic/200275-yaccbison-locations/

[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]
