Setting the scene:
Scripting The Environment in Unity3D
When you consider the work required to create a computer game, it's very easy to get wrapped up in the core dynamics of the game and the central experience of the gameplay. When you're shooting to score in a football game, you would tend to think of the code behind the striker shooting and the goalkeeper reacting, and perhaps imagine the coding requirements of the game based on these core actions. However, in a modern computer game a great deal more is happening to set the scene that this act plays out on. Attention to detail is getting more and more advanced, and current computer games set the bar very high.
So when your striker is swinging to kick the ball, the game's lighting is hitting his 3D model, rendering his skin, hair, and kit according to their textures, and casting a shadow on the ground that matches the texture of the surface. The crowd may be silently gasping in anticipation, potentially followed by a roar of cheering if the ball ends up in the back of the net. The camera is focused on the striker taking the shot but is ready to fly off towards the goal, following the ball, once the kick is executed. If the striker does score, the game may switch to another camera to change the angle and behaviour for the instant replay and goal celebrations.
What seems like a simple action is surrounded by elements that set the scene, add depth to the experience, and make the game more immersive. Every game needs a camera, lighting, and sound as functional building blocks, and each can simply be added to the engine and used with a few setting adjustments, but you need to write code for these elements to change them dynamically during gameplay.
This post deals with the "lights, camera..." section leading up to the action.
When building a 3D game in particular, lighting is one of the most important aspects to consider. Good lighting will bring out the best in your 3D assets; too much or too little will wash them out. The angles, of course, have to be considered carefully in relation to the angles of the assets and the camera, and these are mostly artistic considerations. However, the technical angle also has to be considered. For example, how valuable is dynamic shadowing to your game? Constantly recalculating shadows is a real drain on processing power and often impractical. Lighting can be a silent hero or villain in your project, and as such it deserves a lot of care and attention.
Unity ships with three types of flexible lights. You can add one to a scene by selecting GameObject / Create Other from the top menu and picking one of the three lights. It doesn't matter which one you select, because you can change the type afterwards, so pick Directional Light from the list.
A directional light is the cheapest kind in terms of the required processing power. In essence, it specifies a direction from which light hits the objects in the scene, and all objects are lit accordingly. This is generally used for things such as sunlight, which should light the scene uniformly. To test this, simply add a cube to your scene (GameObject / Create Other / Cube from the top menu).
You can use the Inspector panel to inspect the settings of your light source. So select the directional light from the Hierarchy panel and take a look at the Inspector panel.
The Transform details of the light appear at the top, just as they do for any other game object. However, because of the uniform nature of a directional light, its position has no effect on the way the light works. You can experiment by changing the position of the directional light; the cube will be rendered exactly as before.
The Scale values have no effect on any kind of light, since the size and power of a light are dictated by the other settings in the Light class (or in the Inspector panel). The most powerful option available is changing the type of the light source. The two types you have not yet looked at are spot lights and point lights.
Spot lights, as their name suggests, work just like the spotlight you would expect to see trained on a performer on stage. If you change the Type value in the Inspector panel to Spot, you'll see the cube is no longer lit in the same way. Now the position and rotation values are also important, since the spot light needs to be aimed at the area it should light up. These lights are useful for effects such as letting the game character carry a torch.
The third kind of light, the point light, works much like a light bulb: that is, it emanates light spherically from a single point. For example, it's very easy to get an impressive "quake" effect by adding a point light to a room as a light bulb and vibrating it, so the shadows in the room shift dynamically as if the room itself were shaking.
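That vibration effect can be sketched with a small script. The class name and the shakeAmount value here are illustrative, not part of Unity's API:

```csharp
using UnityEngine;

// Attach to a GameObject with a point Light component.
// Jitters the light's position each frame so the shadows shake.
public class QuakeLight : MonoBehaviour
{
    public float shakeAmount = 0.2f; // maximum offset in units (illustrative value)
    private Vector3 basePosition;

    void Start()
    {
        basePosition = transform.position;
    }

    void Update()
    {
        // Random.insideUnitSphere returns a random point inside a sphere of radius 1
        transform.position = basePosition + Random.insideUnitSphere * shakeAmount;
    }
}
```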
Take the time to play around with the different settings in the Inspector panel to familiarise yourself with the way Unity's lights work. The Color setting allows you to add a colour tone to your light, giving you the ability to create impressive ambient lighting; the Intensity setting is where you set the brightness of the light; and so on. Now that you know how these lights work, let's go ahead and take a look at the code that will allow you to control them.
Some Lighting Properties
The intensity property takes a value between 0 and 8 and works in conjunction with the color property to dictate the brightness of the light.
goLight.intensity = 4;
As you have seen, Unity offers three kinds of light. The type property allows you to switch between them in code, and can take one of three values: LightType.Spot, LightType.Point, and LightType.Directional.
goLight.type = LightType.Spot;
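Putting these two properties together, here is a minimal sketch of a flickering torch effect; the class name, intensity range, and the choice of a point light are illustrative:

```csharp
using UnityEngine;

// Attach to a GameObject with a Light component.
// Randomly varies the intensity each frame to fake a flickering torch.
public class FlickerLight : MonoBehaviour
{
    public float minIntensity = 0.8f;
    public float maxIntensity = 1.4f;
    private Light torch;

    void Start()
    {
        torch = GetComponent<Light>();
        torch.type = LightType.Point; // make sure this behaves like a bare flame
    }

    void Update()
    {
        torch.intensity = Random.Range(minIntensity, maxIntensity);
    }
}
```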
Cameras allow you to dictate how your scenes are rendered and are therefore essential to any game. A Unity project is useless without a camera because nothing will be displayed without one.
It's very easy to add and point a camera in the Unity editor. Add it in the usual way (GameObject / Create Other / Camera from the top menu) and change its transform properties from the Inspector panel. When the camera you have added is selected in the Hierarchy panel, a small window in the Scene panel shows what the camera can actually see. It's also possible to have more than one camera active at the same time and overlay the rendered areas. It's worth taking a close look at the properties of a camera.
The projection type of a camera can be set to Perspective or Orthographic. The Perspective setting is best suited to three-dimensional scenes, the Orthographic setting to two-dimensional ones; if you are working on a 2D game, a user interface, or a menu scene, you're better off using an Orthographic camera. You'll notice that when you switch a camera between the Perspective and Orthographic types, one parameter changes: the Perspective type has a "field of view" setting, while the Orthographic type has a "size" setting. In effect these dictate the amount of zoom applied to the camera, and either can be used as a zoom parameter in your code. One to watch out for is the Clipping Planes setting, which dictates exactly where the camera starts rendering and where it stops. The default Near value is 0.3, which means the camera starts rendering 0.3 units in front of the camera object's position in the scene. The Far value dictates how far the camera can see; the default is 1000, so any objects further than 1000 units away will not be rendered by the camera.
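As an example of using the field of view as a zoom parameter, here is a sketch that smoothly narrows a perspective camera's view while a key is held; the key binding, class name, and values are illustrative:

```csharp
using UnityEngine;

// Attach to a perspective Camera. Hold Z to zoom in by narrowing the field of view.
public class ZoomCamera : MonoBehaviour
{
    public float normalFov = 60f;
    public float zoomedFov = 25f;
    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
    }

    void Update()
    {
        float target = Input.GetKey(KeyCode.Z) ? zoomedFov : normalFov;
        // Lerp towards the target so the zoom feels smooth rather than instant
        cam.fieldOfView = Mathf.Lerp(cam.fieldOfView, target, Time.deltaTime * 5f);
    }
}
```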
The Depth value is very important when dealing with multiple cameras. When two cameras are active in one scene, the camera with the higher Depth value is rendered over the camera with the lower one. A typical setup is to have two cameras in a scene (in very different locations), one looking at user interface items such as a map or radar, and the other looking at the in-game action. The camera used for the user interface has the higher Depth value and its "Clear Flags" setting set to "Depth only", which overlays the user interface items on top of the actual game camera's output.
This is also where the layering capabilities of the engine come into play. Unity lets you set up layers that differentiate what is rendered from what isn't. If you select any game object and look at the top of the Inspector panel, you can see that it's possible to assign a layer to the object; using the drop-down box there, it's also possible to create a custom layer. Once items have been assigned to layers, you can use the Culling Mask setting on the camera object to choose which layers that particular camera should render. In the example above, the user interface items would be assigned to a user interface layer, and the camera used to render them would specify that layer in its Culling Mask setting.
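The same setup can be done from code. Here is a sketch, assuming a custom layer named "UI" has already been created in the editor:

```csharp
using UnityEngine;

// Attach to the user interface camera described above. Configures it to render
// only the (assumed) "UI" layer, drawn on top of the main game camera.
public class UiCameraSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.depth = 1;                            // render above the game camera (depth 0)
        cam.clearFlags = CameraClearFlags.Depth;  // the "Depth only" setting in the Inspector
        cam.cullingMask = 1 << LayerMask.NameToLayer("UI"); // render only the UI layer
    }
}
```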
Some Camera Methods and Properties
The orthographic property can be used to flip the camera from the perspective type to the orthographic type (or vice versa).
camCamera.orthographic = true;
The CopyFrom method allows you to copy the details and settings of one camera onto another.
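For example, assuming camOther is a reference to a second camera:
camCamera.CopyFrom(camOther);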
The depth property sets the depth value of the camera, and therefore changes the render order when multiple cameras are active.
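For example:
camCamera.depth = 2;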
After the camera and lighting have been set up, there is one more thing that dictates how your game objects are displayed. A material is a collection of items and settings that dictate how a game object is rendered. Each game object must have at least one material attached to it in order to be rendered, and one material can be used on multiple game objects.
Three main components make up a material: at least one colour, at least one texture, and a shader. The most important of these is the shader: essentially, the shader is the code that dictates how the material renders objects, while the colours and textures are parameters of the shader.
The colours and textures can be used in a variety of ways depending on the shader. For example, it's common to have a second texture file that the shader uses as a bump map, or a second colour that the shader uses to determine how shiny something should be.
To experiment with a material and the various shaders, create a new cube in the Scene panel (GameObject / Create Other / Cube) and a new material in the Project panel (right-click in the Project panel and choose Create / Material). If you select the cube in the Hierarchy view, you can drag and drop the material onto the Inspector panel to attach it to the object.
If you select the material in the Project view, the Inspector lets you change the components that make it up: the combo box at the top lets you pick from the existing preset shaders, and the rest of the panel lets you set the colours and textures that make up the parameters of that shader. If you change the colour of the material, you'll see the colour of your cube change accordingly. If you change the shader to Transparent/Diffuse and reduce the alpha value of the colour, you'll see the cube become transparent. As you can see, materials and their parameters dictate the way objects are rendered.
The color property sets the color of the material.
matMaterial.color = Color.black;
The shader property sets the shader used by the material; you can use the static Shader.Find() function to look one up by name.
matMaterial.shader = Shader.Find("Diffuse");
The mainTexture property is used to set the texture used by the material. The property is of type Texture2D.
matMaterial.mainTexture = texture;
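Pulling these properties together, here is a sketch of a script that fades an object out by animating its material's alpha. The class name and fadeTime value are illustrative; it assumes the built-in Transparent/Diffuse shader:

```csharp
using UnityEngine;

// Attach to any object with a Renderer. Fades the object out over fadeTime seconds.
public class FadeOut : MonoBehaviour
{
    public float fadeTime = 2f;
    private Material mat;
    private float elapsed;

    void Start()
    {
        // renderer.material gives this object its own material instance,
        // so fading it does not affect other objects sharing the material
        mat = GetComponent<Renderer>().material;
        mat.shader = Shader.Find("Transparent/Diffuse");
    }

    void Update()
    {
        elapsed += Time.deltaTime;
        Color c = mat.color;
        c.a = Mathf.Clamp01(1f - elapsed / fadeTime); // alpha runs from 1 down to 0
        mat.color = c;
    }
}
```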
Sound effects and music are usually the unsung heroes of computer games. The quality of the sound within a game does a great deal to enhance (or if done badly diminish) the ambiance and quality of the entire game experience.
As game coders we are not usually responsible for the music itself, but we do have to integrate the sound and music assets into our game. Unity makes it easy to do so.
Adding a sound file to your project is as easy as adding any external resource: just drag and drop a WAV or MP3 file into your Project panel. Once it's selected you can see its import settings in the Inspector panel; here you can convert it to a compressed MP3 (recommended) and decide whether it should be a 3D sound or not. A 3D sound is played according to its position in the 3D world, so if your character walks away from the source of a 3D sound, the volume diminishes as he gets further away. Deselecting the 3D sound option plays the sound at a constant volume no matter where the game objects are in the scene, which is useful for 2D games, theme songs, or UI effects.
You need to add two components to your game in order to play sounds: an audio listener, which is added to the camera as standard, and an audio source. It's recommended that only one audio listener be active at any one time, but you can have multiple audio sources. So create an empty game object in your scene (GameObject / Create Empty) and add an audio source component to it (Component / Audio / Audio Source). Now, with your new game object selected in the Hierarchy view, drag and drop the sound file into the "Audio Clip" slot in the Inspector panel. You have now created a game object that your code can call to play and control the sound resource.
Sound Methods And Properties
The volume property sets the volume level of the audio source, where 1 is the maximum and 0 is the minimum.
audio.volume = 0.2F;
The Play method will play the sound resource on the audio source.
The Stop method will stop the sound resource on the audio source.
The Pause method will pause the currently playing sound resource on the audio source.
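These methods and the volume property can be tied together in a small test script attached to the game object holding the audio source; the key bindings and class name are illustrative:

```csharp
using UnityEngine;

// Attach to the GameObject that has the AudioSource component set up above.
// Press P to play, S to stop, and Space to pause the clip.
public class SoundControl : MonoBehaviour
{
    private AudioSource source;

    void Start()
    {
        source = GetComponent<AudioSource>();
        source.volume = 0.8f; // 1 is maximum, 0 is silent
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.P)) source.Play();
        if (Input.GetKeyDown(KeyCode.S)) source.Stop();
        if (Input.GetKeyDown(KeyCode.Space)) source.Pause();
    }
}
```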