How to build worlds and tell stories in VR - Part 2
In the first part of 'how to build worlds and tell stories in VR' we focused on the aspects of worldbuilding needed to create a consistent, self-contained fantasy universe. This part sheds light on VR-specific difficulties and how to tackle them.
I recently wrote a blog entry about how to build worlds and tell stories in VR. I expanded on this topic for a talk at the GameZ Festival in Zurich and, of course, I don’t want to keep the information from you, so here is Part 2 of How to build worlds and tell stories in VR.
Influences in VR Storytelling
One big thing we have to think about when writing stories for VR is how the medium itself influences our possibilities. Of course, we all know that the way a story is told in VR differs from ‘regular play’, but we really tried to pinpoint important aspects in order to be aware of them and avoid pitfalls.
We’ve identified three major factors that impact storytelling in VR, which are:
Influence No. 1: Technical Challenges and Limitations
I’ve already talked about some of the technical challenges and limitations of VR in Part 1, but here comes an extended list:
The Resolution
Resolution is a big topic when it comes to VR devices! Although improvements in this area come in big steps, resolution is still very limited and pixel density is low. This means you not only have to consider the size of the objects you place in your world (if they are too small, they may not be recognizable), but also your UI: if the font size is too small or you display large quantities of text, the player may get tired of reading or, even worse, may not be able to read the information at all. One way to deal with this is to avoid such texts entirely; if you really need to pack in a lot of information, use voice-overs to present it to the player.
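To get a feel for why small text fails, you can estimate how many pixels a glyph actually covers on the display. The sketch below is engine-agnostic Python; the per-eye resolution and field of view are illustrative assumptions, not the specs of any particular headset.

```python
import math

def pixels_per_degree(horizontal_pixels: float, horizontal_fov_deg: float) -> float:
    """Average pixel density across the headset's horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

def text_pixel_height(cap_height_m: float, distance_m: float, ppd: float) -> float:
    """Approximate pixels covered by a glyph of the given height at the given distance."""
    angular_size_deg = math.degrees(2 * math.atan(cap_height_m / (2 * distance_m)))
    return angular_size_deg * ppd

ppd = pixels_per_degree(1440, 100)      # assumed: 1440 px spread over ~100 degrees
px = text_pixel_height(0.02, 2.0, ppd)  # 2 cm tall letters, viewed from 2 m away
print(f"{ppd:.1f} ppd, glyph covers ~{px:.1f} px")
```

With these assumed numbers, 2 cm letters at 2 meters only get about 8 pixels of height, which is why text that looks fine on a monitor can become unreadable mush in the headset.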
Another funny thing we have to cope with is the worldspace size of objects. You may model your content to exact dimensions – say you have a table that is 80cm high with a top measuring 1 by 2 meters – but don’t expect that table to actually look that size in VR! In fact, we had to develop a kind of sixth sense for modeling our assets so they look like the size they should be, rather than modeling them to exact dimensions.
The field of view
Currently, the most popular devices limit the field of view to around 100 degrees. In comparison, both human eyes combined cover a field of view of around 200 degrees.
Bearing this in mind, try to keep non-diegetic elements (elements that do not directly belong to the gameworld) near the player’s focal point. Also, build your world so the player has to move their head and is not tempted to move only their eyes! (One reason for this is the next point on the list…)
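A simple way to reason about this in tooling or gameplay code is to test whether a point of interest currently falls inside the headset's viewing cone. A minimal vector-math sketch in plain Python, assuming a ~100-degree full horizontal FOV (the vectors and angle are illustrative, not tied to any engine):

```python
import math

def in_view(head_forward, to_object, fov_deg=100.0):
    """True if the direction to the object lies inside the view cone.

    Both arguments are (x, y, z) direction tuples; fov_deg is the full cone angle.
    """
    def norm(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    f, d = norm(head_forward), norm(to_object)
    cos_angle = sum(a * b for a, b in zip(f, d))
    return cos_angle >= math.cos(math.radians(fov_deg / 2.0))

print(in_view((0, 0, 1), (0.3, 0, 1)))  # slightly off-centre: inside the cone
print(in_view((0, 0, 1), (1, 0, 0)))    # 90 degrees to the side: outside
```

A check like this is handy for deciding when to trigger hints or pause narration until the player is actually looking the right way.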
Nausea
Motion sickness is one of the main gamebreakers when it comes to VR. Besides advanced techniques to reduce the occurrence of motion sickness (like speed-controlled tunnel masking of the peripheral vision), there are some basic rules which are forgotten surprisingly often:
Don’t use multiple cameras
Cutting between cameras not only very likely breaks the immersion, it also contributes to nausea. If you really have to change cameras, fade to black in between.
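In practice, a black fade is just a fullscreen overlay whose alpha ramps up, holds while the camera is swapped, and ramps back down. A minimal sketch of that timing curve, with illustrative durations (every engine has its own way to drive this; here it's plain Python):

```python
def fade_alpha(t, fade_out=0.25, hold=0.05, fade_in=0.25):
    """Black-overlay alpha at time t (seconds) for a faded camera change.

    Ramps to fully black, holds while the camera is swapped, then ramps back.
    The durations are illustrative defaults, not recommended values.
    """
    if t < fade_out:
        return t / fade_out               # ramping to black
    if t < fade_out + hold:
        return 1.0                        # fully black: swap cameras here
    t2 = t - fade_out - hold
    if t2 < fade_in:
        return 1.0 - t2 / fade_in         # ramping back to the scene
    return 0.0

for t in (0.0, 0.125, 0.27, 0.45, 0.6):
    print(t, round(fade_alpha(t), 2))
```

The important part is the hold in the middle: the actual camera teleport happens only while the screen is fully black, so the player never sees the discontinuity.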
NEVER move the camera without direct user input!
I can’t emphasize this enough, so please avoid this by any means! (Unless you’re developing a nausea simulator - then you’re good to go!)
Talking about camera problems…
Cameras are the root cause of a lot of problems in VR. I just said that one should not move the camera without direct user input, which fixes one aspect of the motion sickness problem, but creates another one: the PHI (Player-Head-Input) problem.
Influence No. 2: The PHI problem
Imagine the following situation:
You want the player to look at a certain point on a door, indicated by the red arrow in the left image, but instead, the player does what you can see in the right image…
This brings me to (in my opinion) the most important rule of all:
THE SCREEN IS NOT YOURS – it belongs to the player!
We, as storytellers, cannot decide what the player is looking at; we can only try to influence them as they travel through our world.
In VR, we have to build a toolset for doing this and use it to actively encourage the player to look where we want them to look.
One way to guide the player’s attention is to use spatial audio and let the environment tell your story.
In the above scene of our game, Return to Nangrim, the player would hear croaking noises of crows when approaching the fire tower and then, when looking up, see the birds flying around it. In this way, we tell the player through context that there must be something about this fire tower worth checking out.
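Engine audio systems (FMOD, Wwise, or an engine's built-in attenuation) handle spatialization for you, but the underlying idea — quieter with distance, panned toward the source — fits in a few lines. The function and falloff values below are an illustrative sketch, not our actual implementation:

```python
import math

def spatial_cue(listener_pos, listener_forward, source_pos, max_dist=30.0):
    """Rough stereo cue for a sound source: (volume, pan).

    Volume falls off linearly with distance on the ground plane; pan follows
    the signed horizontal angle to the source (-1 = left, +1 = right).
    Positions and forward vector are (x, y, z) tuples; numbers are illustrative.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    dist = math.hypot(dx, dz)
    volume = max(0.0, 1.0 - dist / max_dist)
    angle = math.atan2(dx, dz) - math.atan2(listener_forward[0], listener_forward[2])
    pan = math.sin(angle)
    return volume, pan

# Crows circling a tower 10 m ahead and 10 m to the right of the player:
vol, pan = spatial_cue((0, 0, 0), (0, 0, 1), (10, 5, 10))
print(f"volume {vol:.2f}, pan {pan:.2f}")
```

Even this crude model is enough to make the player turn their head toward the cawing before they ever see the birds, which is exactly the kind of natural guidance the fire tower scene relies on.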
What also plays into this problem is the user’s volatile and arbitrary attention span.
Influence No. 3: The player's volatile attention span
If you want to keep the player hooked on your story, you need to give them good reasons to stay interested.
One strategy is to create points of interest where the player can explore your world and pick up information.
Also, encourage the player to interact with your environment by providing subtle hints like light beams or placing objects in a certain manner.
And create and spread narrative webs around your gameworld for the player to get caught in and stay a while to enjoy!
To sum everything up, these are the combined lessons learned in Part 1 and Part 2:
Remember: the screen is not yours! Influence the player’s decision by using natural guidance and keep things interesting with a lively and rich gameworld. You’re providing hints, you’re not pulling strings!
I hope this summary gave you some information that will help you with your VR story development. As always – feel free to contact me and ask any questions!