Categories
Dev Gear XR (AR/VR/MR)

Hardware for VR development 🖥

Starting out in VR development it’s easy to think you’ll spend the earth on special hardware to get going. The reality is that’s just not true. The 360 image below (which I took with the Google Street View app on iOS) is fairly rough as 360 images go. The room was a mess as I was in the middle of a fairly ugly hardware transition. The stitching isn’t very good, so there are lots of bung parts to it, but you get the idea of the space and gear I work with.

Until recently I was using a late 2009 Mac Mini and was managing just fine on my Udacity VR developer course. Well, I was managing, not sure about the fine part. So I upgraded to a late 2012 Mac Mini, still 5-year-old hardware, and I’m going gangbusters doing Google Cardboard development with it.

Here are a few more details on the hardware setup I’m running in the 360 image above:

Hardware

Displays – Two second-hand 1080p displays I purchased on Trademe for $300 total. The one on the left is in portrait orientation for my code editor. The one on the right is in landscape for the Unity editor. Both are on a monitor stand I purchased from PB Tech for around $100.

Input devices – There’s a bunch of keyboards in the shot but the one I’m actually using now is a Logitech K380. I managed to get a refurbished one on 1-day for $40 shipped. Honestly, it’s such a good keyboard I’d be happy to pay retail for it. Logitech claims 2 AAA batteries will last 2 years. Far better than the 2 months I was getting out of my Apple keyboard and a fraction of the price. For a pointing device, I’m using a stock standard magic mouse. I’d prefer a Logitech MX Master 2S, but it’s a nice to have rather than a must.

Computer – I have two in play now. My development machine is a late 2012 Mac Mini I purchased on Trademe for $500. Even though it’s from late 2012 it’s actually the fastest model of Mac Mini ever made, the quad-core i7. I’ve put 16GB of RAM in it and replaced the spinning disk with an SSD. I’m really happy with its performance. It does everything I need for Unity development, especially given the contrast with my other machine, the late 2009 Mac Mini. That machine is now running as the server for the house. I’ve modified it a bit, adding 4.5TB of storage and 8GB of RAM. It can no longer run the latest version of macOS but it’s doing a great job as a cache and Time Machine server.

Future plans

Audio hardware – Something I’m really aware of with VR is the impact of audio. Obviously, visuals are important in VR, but good quality audio can also have a huge impact on immersion. It’s also very useful in directing the user’s attention. I’ve got a few bits of hardware on order to support creating more of my own audio for various projects. The main bit of kit is a USB audio interface, or more specifically a Focusrite Scarlett Solo (2nd Gen). I got mine via Amazon as it actually worked out a bit cheaper than buying local, even with the shipping costs via YouShop.

I’m trying to play a longer game with this purchase. It’s likely better than I need at this stage, but it will last me 15 years or more.  With the budget in mind, for the rest of my audio gear, I opted for a cheap XLR mic, arm, and pop filter from Aliexpress. I’ll upgrade those later when I have more funds available and other parts of my setup mature.

360 camera – At the moment I’m just using my iPhone 6s and various 360 apps. Over time I’m expanding my capabilities as I need/can afford them. I’m in the market for a 360 camera, likely something like the Ricoh Theta S or the 2017 Samsung Gear 360. I’m hoping to purchase one of these in the next 3-4 months.

Mac Pro and a 6DOF VR setup – Ultimately I want to replace my Mac Mini with a far more powerful setup so I can expand into more immersive VR development. I’m aiming for a new Mac Pro when they become available later next year. I’m also delaying the purchase of something like a Vive or an Oculus as long as I possibly can. It’s such early days in VR hardware and I’d prefer to wait to buy when there’s gen 2 or even 3 out. I suspect 2018 is going to be an expensive year.

The thing to take away here is you don’t need to spend the earth to get started with VR. Some second-hand hardware and a drive to learn will take you a long way before you need to invest more.

Categories
Gear XR (AR/VR/MR)

$99 AR headset 😍

I seriously doubt the stock-footage silliness in this promo video is even remotely connected to reality… but a US$99 AR headset! Sign me up!

Categories
Dev XR (AR/VR/MR)

Locomotion 🚂

You know what’s really fun? Being able to move around and interact in a virtual environment. You know what isn’t? Barfing all over your new Allbirds.

That’s exactly what’s at stake when designing a good VR experience, particularly one that includes movement. When developing for VR it’s important to build a good understanding of “simulator sickness” and what can be done to eliminate, or at the very least, minimise it.

Just like other forms of motion sickness, simulator sickness affects different people in different ways. As it’s still early days for mainstream VR, there’s a lot of research still to be done into causes and solutions. Having said that, there’s also a lot of good material out there that can help you provide a comfortable experience for users.

Some general rules of thumb to follow include:

  1. Tightly control the user’s speed and rate of acceleration. Slower speeds are generally more comfortable for users, as is near-instant acceleration. Any gradual acceleration in VR can trigger simulator sickness, so it’s best to keep speed transitions short, sharp, and infrequent.
  2. Leave your user in control of their vision. In other words, don’t disconnect what the user sees from their head tracking. If you need the user to look in a certain direction, use other techniques such as sound or lighting to draw their gaze.
  3. Make sure your experience is performant by maintaining a suitable frame rate. 90 FPS is optimal.
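To make rule 1 concrete, here’s a minimal sketch of what it might look like in a Unity script. This is only one way to do it, and the class and field names here are my own placeholders: the travel speed is slow and constant, and the transition between stopped and moving is instant rather than ramped.

```csharp
using UnityEngine;

// Hypothetical sketch of rule 1: a slow, constant travel speed with
// instant (not gradual) transitions between stopped and moving.
public class ComfortLocomotion : MonoBehaviour
{
    public float travelSpeed = 1.5f; // slow walking pace, metres per second

    void Update()
    {
        // Speed is either 0 or travelSpeed, never a gradual ramp,
        // because sustained acceleration can trigger simulator sickness.
        float speed = Input.GetKey(KeyCode.W) ? travelSpeed : 0f;
        transform.position += transform.forward * (speed * Time.deltaTime);
    }
}
```

In a real project you’d likely move a whole camera rig and route this through your input system, but the comfort principle is the same.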

The best way to stay on top of user comfort is to test on real users as early in development as possible. Ask them questions about their comfort levels and always let them know to remove the headset right away if they feel any discomfort. After all, the last thing you want is to push a tester to the point of ruining their shoes.

Categories
Dev Process

Process: User experience testing 🔬

As I said in my previous post on process, it’s important to test your work early and often. This means getting in front of your users and collecting feedback, aka UX testing.

When you’re starting out with user testing it can be quite intimidating, but with a few tips and a bit of preparation, it can be very valuable. There are a few basic things to consider when carrying out tests.

Asking the right questions

Just rolling up to your user and asking questions may provide some value, but it may also produce biased or low-value results. To avoid this it pays to prepare your questions in advance of the interview. When formulating the questions, make sure you avoid both leading and dead-end questions.

Leading questions

An example of a leading question is “do you find the mood of the scene magical?”. By asking a question in this way you’re influencing your user to think of the scene in a magical context. This may affect their answers, reducing the value of the feedback. A better question to ask would be “can you describe the mood of the scene?”. It’s more open-ended, leaving the user to describe the mood without bias.

Dead-end questions

Likewise, asking dead-end questions provides little value. Dead-end questions are those that can be answered with a simple yes or no. An example would be “did you enjoy the experience?”. A better option would be “tell me about the experience”.

If need be, you can always ask follow-up questions to clarify anything you feel isn’t clearly communicated in the user’s answers.

Sample size

On the surface, you’d think you’d need dozens and dozens of users to test on, and in the past, many UX practitioners have done just that. As it turns out, that really isn’t needed. In fact, according to a study by the Nielsen Norman Group, big sample sizes are largely a waste of time. You’re far better served by performing multiple small tests on 3-5 people. Not only is it cheaper and easier to do, it’s also more effective.

Categories
Dev XR (AR/VR/MR)

Working with GameObjects in Unity 👾

Given I’m currently learning to develop in Unity I thought I’d share a few tips I pick up along the way to becoming proficient with it.

When creating a scene in Unity you’ll work with loads of different GameObjects. Your virtual world is built out of them (floors, walls, trees, everything) and you’ll be manipulating them all the time.

One thing you’ll notice is that the more GameObjects you have, the more complex it becomes to place things correctly. Trying to line everything up, preventing objects from overlapping, and just generally making things look as they should can consume a lot of time.

Thankfully there are a few things you can do to make life a whole lot simpler: first, learning to navigate the environment; second, duplicating objects; third, vertex snapping.

Navigating the environment

The odds are good that if you’re doing VR development in Unity you’ve played at least one first-person shooter on a PC or console. If so, you’re in luck, because navigating a 3D Unity scene works much the same way, with a few extras. Click and hold the right mouse button and you’ll be able to look around the environment with your mouse and use the WASD keys to move forward (W), backward (S), strafe left (A), and right (D). As a little extra for moving quickly, select a GameObject and press the F key to jump straight to it for manipulation.

Duplicating objects

When creating an environment you’ll constantly need to duplicate GameObjects or groups of objects. Doing so is very easy: select the GameObject and copy and paste it, just like text in a text editor (or use Cmd/Ctrl+D to duplicate in one step). Easy as that, you’ll have a duplicate item to move around and position in the scene.

Vertex snapping

For the first few months I was learning I didn’t know about this one, and boy does it save some time and frustration. Let’s say you’ve got a section of wall and you’ve duplicated it to make the wall bigger. You move the newly duplicated wall GameObject to line it up with the original: slide it along a little, eyeball it, and there’s a small gap; move it back a little and now it overlaps. This sort of thing can go on and on. Vertex snapping will save you from hours of repetition by allowing you to snap two objects together cleanly.

To do this, select the move tool then select the GameObject you want to move. Now hold down the V key, move your cursor to a vertex, then click and drag it to the vertex of the GameObject you want to snap to. Now release the mouse button and boom, just like that, you’ll have a perfectly placed GameObject in line with the first.

Now to be completely honest it can be difficult to get your head around vertex snapping from reading a text description of it. Thankfully a number of people have made excellent YouTube videos showing just how it’s done. Here’s one I found useful:

For more information on vertex snapping and a whole lot more, see the Positioning GameObjects section of the Unity manual.

Categories
Design Dev

Space 👨‍🚀

In VR, as in real life, personal space matters. Personal space is “the physical space immediately surrounding someone, into which encroachment can feel threatening or uncomfortable”¹. The amount of space that any individual considers “theirs” can vary considerably. When I lived in the Netherlands I found the Dutch had a considerably smaller personal bubble than I was used to as a New Zealander. I suspect this was due to the wildly different population densities between the two countries.

As a general rule of thumb, 0.5m is a good minimum to start with. Keeping this much space between the user and any GameObject will make players feel comfortable that nothing is “in their face”. Unless you want it to be, of course. As always, developing an understanding of your audience as part of your process helps inform what is and isn’t important in your VR experience.
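For illustration, a small hypothetical Unity script could enforce that minimum distance by nudging any object that drifts inside the bubble back out to its edge (the names PersonalSpace, head, and minDistance are my own placeholders, not from any toolkit):

```csharp
using UnityEngine;

// Hypothetical sketch: keep this GameObject at least 0.5 m away
// from the user's head so nothing ends up "in their face".
public class PersonalSpace : MonoBehaviour
{
    public Transform head;           // typically the main camera
    public float minDistance = 0.5f; // the comfort bubble radius

    void LateUpdate()
    {
        Vector3 offset = transform.position - head.position;
        if (offset.sqrMagnitude < minDistance * minDistance && offset != Vector3.zero)
        {
            // Push the object back out to the edge of the bubble.
            transform.position = head.position + offset.normalized * minDistance;
        }
    }
}
```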


  1. For more information on personal space see Proxemics on Wikipedia
Categories
Dev Process

Process 📝

The making of a “thing” happens in loads of different ways, but I’ve always found it’s best to have some sort of process to get a good result. It doesn’t have to be onerous, just something repeatable that considers a few things consistently.

To avoid an overly long post I plan to write a series on this subject. Here are a few tips to get you started:

Start with people – It’s always tempting to just start building stuff, but you should always start with your target audience. That is, the audience you intend to use your app. It’s easy to think it’s “everyone”, but is that really true? What experience do they have with VR/AR? Also consider that this is a physical medium, so you may be making greater physical demands of your audience than with other digital experiences. Jumping may require users to physically jump, their height may be a factor, and so on. Simply writing a few sentences describing who your user is and what you expect them to do can be very powerful. For example:

The end users for ProjectX are likely to be people who are new to VR, but who have already experienced various games in their life. They are probably going to be 25-35 and own a smartphone. They'll be expected to stand, reach, jump and turn with relative ease.

Develop personas – I’ve always been a fan of personas. They are simple to put together, and make it easy to conceptualise your audience. Something to be aware of though is that personas by their nature are a generalisation of an audience. They will never cover all of the details, so you should expect to continue to refine them over time. With this in mind start with something simple you can build upon. Something like this:

Name: Anna
Age: 36
Role: Scientist/Marketer
VR Experience: Little to none
Quote (that sums up their attitude): “VR looks interesting and I’d like to give it a go.”
About this person: Anna is a highly educated working mother of two. She has a busy lifestyle but is always interested in new and interesting things. She prefers to try things before jumping in head first and enjoys things she can share with her family.

Statement of purpose – A good statement helps us understand what we are building by defining a simple project scope. It should be concise, ideally no more than one or two sentences long. It’s something you should keep visible to constantly remind you of what it is you’re building, and to limit the temptation to go too far beyond that.

Here’s an example purpose statement:

ProjectX is a mobile VR app which gives new VR users a quick taste of VR via their existing smartphone. The entire experience should take no more than a few minutes.

Make disposable concepts and iterate – In the beginning, everything you do should be simple, quick, and disposable. Use materials, tools, and techniques that enable this. Pen and paper are a surprisingly useful and versatile tool set. Expect to be wrong; it helps you keep an open mind about other options and stops you becoming too attached to any one thing. Iterate, iterate, then iterate some more. Eventually, you’ll find something you’re happy to move on with. Steadily increase the complexity and sophistication of your work as the idea solidifies.

Share your work early and often – You know who your audience is, show them what you’re making. Their feedback will be invaluable. You’ll also be able to get in front of complexity before things become difficult to change. While you’re with your audience take the opportunity to grow your understanding of them and update your personas to match. Note: Be careful to listen for what your user wants, less so how they want it.

Conclusion – Using these simple steps and methods is a great way to start any project, but don’t be afraid to change things up until you find a process that works best for you. Ultimately you need to find what makes your product better for your users and discard the rest.

Categories
Dev XR (AR/VR/MR)

Mixed reality capture or: how to share your VR work 🎥

You’ve likely all seen the videos showing VR users actually immersed in a VR world, like the demo at WWDC. The question is, how can you do the same with your work?

Well, aside from requiring a few extra bits of gear on top of your normal VR setup, it actually seems fairly straightforward. Ars Technica has put together a nice tutorial covering what you’ll need and how it’s done.

A handy skill to add to your repertoire.

Categories
Dev XR (AR/VR/MR)

Augmented reality 📲

Ever since Pokémon Go, AR seems to have worked its way into the “parlance of our time“. Truth be told, we’ve actually had loads of examples of AR apps for years now, just not combined with a cultural phenomenon like Pokémon.

I think it’s fair to say there’s about to be a bit of an explosion in AR apps, at least on iOS. With the release of ARKit to developers, and very impressive demos from both Weta and Apple of what can be done, it’s not difficult to imagine a new wave of apps in the works from third-party developers.

In fact, just a week or so out from ARKit’s beta release, developers are already starting to show some impressive progress playing with the tech. Take a look:

Categories
Dev

Git ⬆️

Since I’m learning VR it seemed a good idea to finally learn Git properly. In the past, I’d used GitHub just to play around with Git conceptually. I liked it, but couldn’t really justify a paid account for private repos.

Over the past few days I’ve learned a few important things to know when you’re first getting started with Git:

  1. I’d noticed other students on the Udacity VR Slack channel mentioning an alternative service called GitLab. The key feature here is private repos as part of the free account. So if you’re like me and just use Git for personal use/education, take a look at GitLab.
  2. You don’t have to know the command line to use Git. There are plenty of desktop Git clients, many of them free, for all your versioning needs.
  3. There are a couple of files to add to your repo early on: .gitignore to exclude files, and .gitattributes to list the file types to store in Git LFS. As I said, it pays to do this early in a repo’s life, as the effects of these two files only apply from the time they’re added to the repo.
  4. If you’re looking for inspiration for what to put into .gitignore, consider searching for common examples for the type of work you’re doing. In my case, Unity 3D projects create plenty of files on load or at runtime, so a common .gitignore for Unity 3D projects would look something like this:
# =============== #
# Unity generated #
# =============== #
[Tt]emp/
[Oo]bj/
[Bb]uild
/[Bb]uilds/
/[Ll]ibrary/
sysinfo.txt
*.stackdump
/Assets/AssetStoreTools*
*.apk
*.unitypackage
 
# ===================================== #
# Visual Studio / MonoDevelop generated #
# ===================================== #
[Ee]xported[Oo]bj/
.vs/
/*.userprefs
/*.csproj
/*.pidb
*.pidb.meta
/*.suo
/*.sln*
/*.user
/*.unityproj
/*.booproj
.consulo/
/*.tmp
/*.svd
 
# ============ #
# OS generated #
# ============ #

.DS_Store*
._*
.Spotlight-V100
.Trashes
Icon?
ehthumbs.db
[Tt]humbs.db
[Dd]esktop.ini
Corridor/Library/ShaderCache/
Corridor/Library/metadata/
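To pair with the .gitignore above, a minimal .gitattributes might look something like the following. The file types here are only examples of common large binaries, so adjust the list for your own project, and note you’ll need Git LFS set up (run git lfs install once per machine) for the filters to take effect:

```
# ================== #
# Git LFS (examples) #
# ================== #
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
```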