I seriously doubt the stock-like silliness in this promo video is even remotely connected to reality… but a $99 USD AR headset! Sign me up!
You know what’s really fun? Being able to move around and interact in a virtual environment. You know what isn’t? Barfing all over your new Allbirds.
That’s exactly what’s at stake when designing a good VR experience, particularly one that includes movement. When developing for VR it’s important to build a good understanding of “simulator sickness” and what can be done to eliminate, or at the very least, minimise it.
Just like with other forms of motion sickness, simulator sickness affects different people in different ways. As it’s still early days of mainstream VR, there’s still a lot of research to be done into causes and solutions. Having said that there’s also a lot of good material out there that can help you provide a comfortable experience for users.
Some general rules of thumb to follow include:
- Tightly control the user’s speed and rate of acceleration. Slower speeds are generally more comfortable. When speed must change, prefer near-instant transitions: gradual acceleration in VR can trigger simulator sickness, so keep speed transitions short and infrequent.
- Leave your user in control of their vision. In other words, don’t disconnect what they see from their head tracking. If you need the user to look in a certain direction, use other techniques such as sound or lighting to draw their gaze.
- Make sure your experience is performant by maintaining a suitable frame rate. 90 FPS is optimal.
The best way to stay on top of user comfort is to test on real users as early in development as possible. Ask them questions about their comfort levels, and always let them know to remove the headset right away if they feel any discomfort. After all, the last thing you want is to push a tester to the point of ruining their shoes.
As I said in my previous post on process, it’s important to test your work early and often. This means getting in front of your users and collecting feedback, aka UX testing.
When you’re starting out with user testing it can be quite intimidating, but with a few tips and a bit of preparation, it can be very valuable. There are a few basic things to consider when carrying out tests.
Asking the right questions
Just rolling up to your user and asking questions may provide some value, but it may also give you some bias or low-value results. To avoid this it pays to prepare your questions in advance of the interview. When formulating the questions themselves make sure you avoid both leading and dead-end questions.
An example of a leading question is “do you find the mood of the scene magical?”. By asking a question in this way you’re influencing your user to think of the scene in a magical context. This may affect their answers, reducing the value of the feedback. A better question to ask would be “can you describe the mood of the scene?”. It’s more open-ended, leaving the user to describe the mood without bias.
Dead-end questions
Likewise, dead-end questions provide little value. These are questions that can be answered with a simple yes or no. An example would be “did you enjoy the experience?”. A better option would be “tell me about the experience”.
If need be, you can always ask follow-up questions to clarify anything you feel isn’t clearly communicated in the user’s answers.
On the surface, you’d think you’d need dozens and dozens of users to test on, and in the past, many UX practitioners have done just that. As it turns out, that really isn’t needed. In fact, according to research by the Nielsen Norman Group, you’re wasting your time with big sample sizes. You’re far better served by performing multiple small tests on 3–5 people each. Not only is it cheaper and easier to do, it’s actually more effective.
Given I’m currently learning to develop in Unity, I thought I’d share a few tips I’ve picked up along the way to becoming proficient with it.
When creating a scene in Unity you’ll work with loads of different GameObjects. Your virtual world is built out of them (floors, walls, trees, everything) and you’ll be manipulating them all the time.
One thing you’ll notice is that the more GameObjects you have, the harder it becomes to place things correctly. Trying to line everything up, preventing objects from overlapping, and generally making things look as they should can consume a lot of time.
Thankfully, there are a few things you can do to make life a whole lot simpler. First is learning to navigate the environment, second is duplicating objects, and third is vertex snapping.
Navigating the environment
The odds are good that if you’re doing VR development in Unity you’ve played at least one first-person shooter on a PC or console. If so, you’re in luck, because navigating a 3D Unity scene works much the same way, with a few extras. Click and hold the right mouse button and you’ll be able to look around the environment with your mouse and use the WASD keys to move forward (W), backward (S), strafe left (A), and right (D). A little extra something for moving quickly: select a GameObject and press the F key to jump straight to it for manipulation.
When creating an environment you’ll often need to duplicate GameObjects or groups of objects. It’s very easy: select the GameObject and copy and paste it, just as you would text in a text editor. Easy as that, you’ll have a duplicate item to move around and position in the scene.
For the first few months I was learning I didn’t know about this one and boy does it save some time and frustration. Let’s say you’ve got a section of wall and you’ve duplicated it to make the wall bigger. You move your newly duplicated wall GameObject to line it up with the original and you slide it along just a little, you eyeball it and there’s a small gap. You move it back a little and now it overlaps. This sort of thing can go on and on. Vertex snapping will save you from hours of repetition by allowing you to snap two objects together cleanly.
To do this, select the move tool then select the GameObject you want to move. Now hold down the V key, move your cursor to a vertex, then click and drag it to the vertex of the GameObject you want to snap to. Now release the mouse button and boom, just like that, you’ll have a perfectly placed GameObject in line with the first.
Now to be completely honest it can be difficult to get your head around vertex snapping from reading a text description of it. Thankfully a number of people have made excellent YouTube videos showing just how it’s done. Here’s one I found useful:
For more information on vertex snapping and a whole lot more, see the Positioning GameObjects section of the Unity manual.
In VR, as in real life, personal space matters. Personal space is “the physical space immediately surrounding someone, into which encroachment can feel threatening or uncomfortable”¹. The amount of space that any individual considers “theirs” can vary considerably. When I lived in the Netherlands I found the Dutch had a considerably smaller personal bubble than I was used to as a New Zealander. I suspect this was due to the wildly different population densities between the two countries.
As a general rule of thumb, 0.5 m is a good minimum distance to start with. This sort of space between the user and any GameObject will make players feel comfortable that nothing is “in their face”. Unless you want it to be, of course. As always, developing an understanding of your audience as part of your process helps inform what is and isn’t important in your VR experience.
- For more information on personal space see Proxemics on Wikipedia
The making of a “thing” happens in loads of different ways, but I’ve always found it’s best to have some sort of process to get a good result. It doesn’t have to be onerous, just something repeatable that considers a few things consistently.
To avoid an overly long post I plan to write a series on this subject. Here are a few tips to get you started:
Start with people – It’s always tempting to just start building stuff, but you should always start with your target audience. That is, the audience you intend to use your app. It’s easy to think it’s “everyone”, but is that really true? What experience do they have with VR/AR? Also consider that this is a physical medium, so you may be making greater physical demands of your audience than with other digital experiences. Jumping may require users to physically jump, their height may be a factor, and so on. Simply writing a few sentences describing who your user is and what you expect them to do can be very powerful. For example:
The end users for ProjectX are likely to be people who are new to VR, but who have already experienced various games in their life. They are probably going to be 25-35 and own a smartphone. They'll be expected to stand, reach, jump and turn with relative ease.
Develop personas – I’ve always been a fan of personas. They are simple to put together, and make it easy to conceptualise your audience. Something to be aware of though is that personas by their nature are a generalisation of an audience. They will never cover all of the details, so you should expect to continue to refine them over time. With this in mind start with something simple you can build upon. Something like this:
- Name: Anna
- Age: 36
- Role: Scientist/Marketer
- VR experience: Little to none
- Quote (that sums up their attitude): “VR looks interesting and I’d like to give it a go.”
- About this person: Anna is a highly educated working mother of two. She has a busy lifestyle but is always interested in new and interesting things. She prefers to try things before jumping in head first and enjoys things she can share with her family.
Statement of purpose
A good statement helps us understand what we are building by defining a simple project scope. It should be concise, ideally no more than one or two sentences long. It’s something you should keep visible to constantly remind you of what it is you’re building, and to limit the temptation to go too far beyond that.
Here’s an example purpose statement:
ProjectX is a mobile VR app which gives new VR users a quick taste of VR via their existing smartphone. The entire experience should take no more than a few minutes.
Make disposable concepts and iterate – In the beginning, everything you do should be simple, quick, and disposable. Use materials, tools, and techniques that enable this. Pen and paper are a surprisingly useful and versatile tool set. Expect to be wrong; it helps you keep an open mind about other options and avoid becoming too attached to any one thing. Iterate, iterate, then iterate some more. Eventually, you’ll find something you’re happy to move on with. Steadily increase the complexity and sophistication of your work as the idea solidifies.
Share your work early and often – You know who your audience is, show them what you’re making. Their feedback will be invaluable. You’ll also be able to get in front of complexity before things become difficult to change. While you’re with your audience take the opportunity to grow your understanding of them and update your personas to match. Note: Be careful to listen for what your user wants, less so how they want it.
Conclusion – Using these simple steps and methods is a great way to start any project, but don’t be afraid to change things up until you find a process that works best for you. Ultimately you need to find what makes your product better for your users and discard the rest.
You’ve likely all seen the videos showing VR users apparently immersed in a VR world, like the demo at WWDC. The question is, how can you do the same with your work?
Well, aside from requiring a few extra bits of gear beyond your normal VR setup, it actually seems fairly straightforward. Ars Technica has put together a nice tutorial covering what you’ll need and how it’s done.
A handy skill to add to your repertoire.
Ever since Pokémon Go, AR seems to have worked its way into the “parlance of our time”. Truth be told, we’ve had loads of examples of AR apps for years now, just not combined with a cultural phenomenon like Pokémon.
I think it’s fair to say there’s about to be a bit of an explosion in AR apps, at least on iOS. With the release of ARKit to developers and very impressive demos from both Weta and Apple of what can be done, it’s not difficult to imagine a new wave of apps in the works from third-party developers.
In fact, just a week or so after ARKit’s beta release, developers are already showing some impressive progress playing with the tech. Take a look:
Since I’m learning VR, it seemed a good idea to finally learn Git properly. In the past, I’d used GitHub just to play around with Git conceptually. I liked it, but couldn’t really justify a paid account for private repos.
Over the past few days I’ve learned a few important things to know when you’re first getting started with Git:
- I’d noticed other students on the Udacity VR Slack channel mentioning an alternative service called GitLab. The key feature here is private repos as part of the free account. So if you’re like me and just use Git for personal use/education, take a look at GitLab.
- You don’t have to know the command line to use Git. There are plenty of desktop Git clients, many of them free, for all your versioning needs.
- There are a couple of files to add to your repo early on: .gitignore to exclude files, and .gitattributes to list the file types to store in Git LFS. It pays to do this early in a repo’s life, because the rules in these two files only apply from the time they’re added to the repo.
- If you’re looking for inspiration for what to put into .gitignore, search for common examples for the type of work you’re doing. In my case, Unity 3D projects create plenty of files on load or at runtime, so a common .gitignore for Unity 3D projects would look something like this:
```
# =============== #
# Unity generated #
# =============== #
[Tt]emp/
[Oo]bj/
[Bb]uild
/[Bb]uilds/
/[Ll]ibrary/
sysinfo.txt
*.stackdump
/Assets/AssetStoreTools*
*.apk
*.unitypackage

# ===================================== #
# Visual Studio / MonoDevelop generated #
# ===================================== #
[Ee]xported[Oo]bj/
.vs/
/*.userprefs
/*.csproj
/*.pidb
*.pidb.meta
/*.suo
/*.sln*
/*.user
/*.unityproj
/*.booproj
.consulo/
/*.tmp
/*.svd

# ============ #
# OS generated #
# ============ #
.DS_Store*
._*
.Spotlight-V100
.Trashes
Icon?
ehthumbs.db
[Tt]humbs.db
[Dd]esktop.ini
Corridor/Library/ShaderCache/
Corridor/Library/metadata/
```
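Putting the “do it early” tip into practice, here’s a minimal sketch of setting up a fresh repo before the first commit. The project folder name and the tracked file types are just placeholders for illustration; the .gitattributes lines shown are the same ones a `git lfs track "*.psd"` command would write for you (assuming Git LFS is installed).

```shell
# Hypothetical new Unity project: create the ignore/attribute
# files first, so their rules apply to the repo's whole history.
mkdir -p my-unity-project
cd my-unity-project

# Exclude Unity's generated folders from version control.
cat > .gitignore <<'EOF'
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
EOF

# Route large binary asset types through Git LFS.
# (Equivalent to running: git lfs track "*.psd" "*.fbx")
cat > .gitattributes <<'EOF'
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
EOF
```

From here, a `git init` followed by committing these two files first means every later asset you add is ignored or LFS-tracked correctly from the start.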