It’s hard not to have “all the feels” watching the above video (ideally in VR if you’ve got a Cardboard headset). The creators have done an amazing job of crafting a great story with some seriously well-thought-out VR technique.
For example, transitions/cuts within a 360 video like this can be very jarring, and require the user to reorient themselves with each cut. In this video, they’ve used the fixed position of the car interior to ground the viewer. It’s extremely effective at controlling the flow of the experience.
The audio is also exceptional. Not only does the song do a great job of creating an emotional connection to the characters, but the changes in audio based on when the characters are inside or outside the car are just wonderful. Spatial audio at its finest.
The more C# scripts I create in Unity, the more I learn about optimising them. With code there are hundreds of ways to solve a problem, and not all of them are efficient, so learning to check your code’s performance seems like a must.
Things to look for
So far I’ve come across a few simple things to look out for on the projects I’ve been coding. I’ll post more as I come across them in the future.
Anything that’s called every frame creates a consistent load on the CPU, GPU or both. If what’s called is inefficient or large, it will have a negative effect on frame rate. In VR this is particularly bad, given low frame rates can make your user physically sick. Be sure to look at what’s in the Update and FixedUpdate methods and do your best to optimise that code as much as possible.
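To make that concrete, here’s a minimal sketch of the kind of per-frame optimisation I mean: caching a component reference once in Start instead of looking it up every frame in Update. The class and variable names are my own, just for illustration.

```csharp
using UnityEngine;

public class UpdateOptimisationExample : MonoBehaviour
{
    // Cache the reference once, rather than looking it up every frame.
    private Renderer cachedRenderer;

    void Start()
    {
        cachedRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        // Bad: calling GetComponent<Renderer>() here would run a lookup 60+ times a second.
        // Good: reuse the cached reference.
        if (cachedRenderer != null)
        {
            cachedRenderer.enabled = transform.position.y > 0f;
        }
    }
}
```

The same idea applies to anything expensive (searches, allocations, string building): do it once up front, not inside a method that runs every frame.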
Working on mobile we will almost always be maxing out the graphics capability of the device, and that affects how much headroom the Update method has. FixedUpdate, on the other hand, runs in sync with Unity’s physics engine, which keeps your physics accurate and consistent. FixedUpdate is called every “physics step”, so it is regularly used for adjusting physics (Rigidbody) objects. Make sure you always put your physics code in FixedUpdate.
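Here’s a quick sketch of what that looks like in practice: applying a force to a Rigidbody inside FixedUpdate so it runs once per physics step. The hover force value and class name are made up for the example.

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class HoverForceExample : MonoBehaviour
{
    public float hoverForce = 10f; // arbitrary value for illustration
    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    // Forces belong in FixedUpdate: it fires in sync with the physics engine,
    // so the push is applied consistently regardless of frame rate.
    void FixedUpdate()
    {
        rb.AddForce(Vector3.up * hoverForce);
    }
}
```

If the same AddForce call lived in Update instead, the object would be pushed harder at high frame rates and weaker at low ones.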
Avoid using GameObject.Find. It basically requires the system to cycle through all of the GameObjects in the scene to find the right one, which is obviously expensive for the CPU. If you know which object you’ll be using ahead of time, define a reference to it at the top of the script and use that instead. GameObject.Find is useful for testing out ideas, but that’s about it.
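A small sketch of the difference (the "Player" object name is made up for the example):

```csharp
using UnityEngine;

public class FindVsReference : MonoBehaviour
{
    // Good: assign this in the Inspector, so no scene search is needed at all.
    [SerializeField] private GameObject player;

    void Start()
    {
        // Acceptable fallback: a single Find at startup, never per frame.
        if (player == null)
        {
            player = GameObject.Find("Player");
        }
    }

    void Update()
    {
        // Bad: GameObject.Find("Player") here would scan the whole scene every frame.
        transform.LookAt(player.transform);
    }
}
```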
Part of my Night at the Museum project is adding audio descriptions to the objects on display. To do this in Unity I’m making use of the spatial audio capabilities of the Google Cardboard SDK.
As with most things in Unity there’s a fairly reasoned logic to how this is done. You need three main things:
An audio source. You can think of this as the media player; it is literally the source of the audio within your environment.
An audio clip. This is the actual audio file that will be played by your audio source. Think of this as the song that will be played by your media player.
An audio listener. This is effectively your virtual ears within the virtual world. Normally this is attached to your main camera.
To give you an example of how this works in practice: I first attached a GvrAudioListener to my main camera, which allows my player to hear the sounds produced. Next, add a GvrAudioSource to a GameObject you want to produce a sound. Now load your audio source (your media player) with an audio clip. There are a few different supported formats; any old MP3 will work nicely. Just drag your MP3 into the audio clip field of your GvrAudioSource. By default the audio will play on awake, so just hit play to enter game mode and you should hear your new audio right away.
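If you’d rather trigger the description from a script instead of playing it on awake, a sketch like this works, assuming GvrAudioSource mirrors Unity’s standard AudioSource API (Play, isPlaying), which it largely does. The class and field names here are my own; you’d wire PlayDescription up to a gaze or click event on the display object.

```csharp
using UnityEngine;

public class DescriptionTrigger : MonoBehaviour
{
    // Drag the GvrAudioSource on the display object in here via the Inspector.
    public GvrAudioSource descriptionAudio;

    // Call this from an Event Trigger when the player looks at or selects the object.
    public void PlayDescription()
    {
        if (descriptionAudio != null && !descriptionAudio.isPlaying)
        {
            descriptionAudio.Play();
        }
    }
}
```

Remember to untick “Play On Awake” on the GvrAudioSource if you go this route, or the clip will start before the player interacts with anything.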
For more information on the ins and outs of Google Cardboard Audio capabilities, check out the documentation.
If you find your audio doesn’t play, try Edit -> Project Settings -> Audio and make sure the “Spatializer Plugin” is set to GVR Audio Spatializer.