So it turns out the New York Times has been doing daily 360 stories for some time now. You can view them all on their sub-site, the aptly named “The Daily 360”.
Not every one of these is improved by being a VR experience, but many are. Below are three that really caught my eye.
Aftermath of a deadly Mumbai building collapse
Watching it in a browser is one thing, but with a VR headset on it’s almost like you’re there. Given the nature of the content, the scene has a strange mix of hope and sadness to it. Being able to be “present” really makes this story feel real, personal almost. You’re there with the people involved. It’s stories like this that make you see why VR is sometimes called an empathy machine.
Paraglide Over Peru in 360
Hard to imagine how you could go wrong with a title like “Paraglide over Peru in 360”. It’s as beautiful as it sounds. One thing to be aware of, though: simulator sickness. Experiences like this make you want to look around (sort of the point, eh), but given the slow-moving nature of the video, if you’re not careful it would be easy enough to make yourself sick watching this, especially if you’re susceptible to other forms of motion sickness. Having said that, as someone who once spent 45 minutes sitting on the floor of a café bathroom following a bad VR session, I had no issue with this particular scene.
A Ride for the Red Planet
This one’s a bit of a mixed bag. Large chunks of it basically put you on the sidelines at the museum showcasing the concept Mars vehicle. Given the vehicle is the only thing of interest in the scene, it’s not much of an advantage being in VR over standard video. The part that really works, though, is when the scene changes to put you inside the vehicle. It gives you a sense of what it might be like to be piloting this thing. Which, let’s face it, is very cool.
As part of my VR course, I’ve been asked to perform a bit of a thought experiment. The outline of this work is:
Imagine you want to make a VR Education Application. This could be an exploration of the solar system, looking at fine grain detail of the human body, or teaching somebody how to program, just to name a few topics. For this experiment, pick any topic you would like to teach. And don’t worry if it doesn’t make sense in VR, we will worry about that part later. Just something you are passionate about.
The requirements include defining a statement of purpose, creating a persona for the target audience, answering a few short questions, and posting it all to Medium. Given I have my own blog, I figured I’d also post it here:
Statement of purpose
To demonstrate the Vet teaching hospital facilities at Massey University to prospective students.
Age: 17–18 years old
Occupation, if any: High school student
A Quote: “I love working with and caring for animals”
2–3 sentences describing what motivates them:
They love working with and caring for animals. Learning in the best environment possible is important to being effective at their jobs. They want to become excellent leaders in their field, to have the greatest impact. They are likely unsure of their specialisation, but would have some idea of the options available. They already know they care for animals, but are probably unsure of what the day-to-day work of being a vet involves, and will be highly motivated to learn.
Their experience level with VR
May have had some small exposure to VR at education events or possibly in school. Given it’s still early days for the tech itself, it’s unlikely they are highly experienced.
Q and A
Q: How accessible would each VR platform be to your target student in terms of price? Take into account location, age, and income.
A: Accessibility of the various systems varies wildly.
Low-immersion VR is very accessible. Most high schoolers of this age have smartphones and the means to acquire a Cardboard headset. Headsets could also be supplied by the provider via the post, or at events.
High-immersion VR is much less accessible. The most likely source would be the PlayStation VR, and it’s unlikely prospects would think to use this given its context as a gaming device. The Vive and Oculus don't have enough presence yet to be meaningful for this use.
Q: How interactive does your lesson need to be? For example, do I need to pick things up or could I get away with just looking at objects?
A: The experience is purely to show off a facility and its teaching capabilities, so only basic interaction is required: select objects for more details or to play videos/animations.
Q: How realistic do your visuals need to be in order to teach? For example, could I use 2D images and videos in a 3D Environment or do you need high poly 3D models?
A: 2D images and video would be ample to communicate what needs to be communicated. If in future there was a need to train in the use of equipment, then a more immersive VR platform would be required.
Q: Does my student need to feel like a participant in the experience or can they be a passive viewer? Could they be both?
A: To a certain extent they could be both. The experience could offer selectable options (pick-a-path style) or just follow a set path. Having the option is likely a good idea, given repeat users will probably have something specific to see rather than wanting the whole walkthrough, so it would be good to be able to skip to the parts they are most interested in.
Q: Given the answers above, what are potential platforms you could use for your experience?
A: Given the limited access to high-end VR hardware, mobile VR (aka Cardboard/Daydream/Gear VR) is likely the best target. It’s fair to say highly immersive VR would be richer, but given it would reach far fewer people, it’s not really relevant for outreach activities like this. The fact that the content could also be easily reused and delivered via the web is another plus for mobile.
A few questions to consider with regards to future technologies:
Q: How would Augmented Reality better help teach your experience?
A: In this case, it likely wouldn't. Given we are attempting to educate prospective students on what the facilities at the University are actually like, recreating that experience is better suited to VR. Augmented reality could potentially be used differently if you wanted to demo a particular part of the experience at an event, such as showing a horse on a large-animal MRI machine. This could give prospects a real sense of scale relative to things around them.
Q: How could eye tracking help you better tailor your experience to your students?
A: Using eye tracking to understand the user's focus within an experience could enhance our ability to provide depth of field, making the experience more natural for your eyes. We could also use eye tracking to monitor what users are looking at and use this information to improve the experience over time.
Q: How would better Haptics better teach your experience?
A: Introducing a tactile feel to the animals in the environment could provide a greater sense of immersion and connection with the animals. The more real it seems, the more convincing it is as a sales tool.
Q: How important is graphical fidelity to your experience?
A: Given the experience is likely to be relatively short, users are unlikely to experience eye strain. Also, we are only attempting to highlight the environment the student will be in and demo the sophistication of the facilities. Given this is the case, it's not completely necessary for the graphical fidelity to be top of the line. Having said that, improved displays that take advantage of light field technologies would vastly improve user comfort, and as a result will be essential and commonplace in the future.
Q: How critical is it that your target student receives this training within the next two years?
A: As the university expands internationally, it's of increasing value to demonstrate to international prospects what facilities are on offer without the expense of physically travelling to New Zealand to see them first-hand. Using VR allows them to get a sense of the place, helping them make more informed decisions.
As discussed earlier, there are loads of terms floating around in the world of VR. But it’s not just things on the spectrum between AR and VR; there are also various types of experiences within VR itself.
Rather than try to cover every possible variant here, I’ll talk about the types I’ve experienced personally, and as a result what I think are the important components that make for good VR.
6 degrees of freedom (6DOF)
I’ve used quite a range now, from a 3 degrees of freedom (3DOF) Google Cardboard, right up to a high-immersion 6 degrees of freedom (6DOF) HTC Vive setup. Here’s an overview of what I’ve used so far:
Google Cardboard (3DOF)

I’ve got what can only be described as a growing collection of Google Cardboard headsets. The more you get involved in the VR community and its events, it seems sort of inevitable that you’ll pick up a few along the way. This is part of what makes Cardboard so great: it’s highly accessible. If you’ve got a modern iOS or Android phone, you’ve got 99% of what you need to get going. Cardboard headsets range in sophistication and price, from $15 bits of literal cardboard, to “fancier” setups that strap to your head with headphones for $50+.
Emma sporting one of my “fancy” Cardboard headsets
Accessibility aside, the sorts of experiences you can have in a Google Cardboard really are a perfect entry into VR. It’s a fantastic way to take users to inaccessible locations and show them things. Examples include estate agents showing an out-of-town client through a property, or placing a marine biology student at the bottom of the sea. The biggest and most obvious shortcoming is the low level of both immersion and interactivity available to Cardboard users. Effectively you can only interact via the user’s gaze or the single-button interface on the headset itself. This sort of thing is fine for a certain segment of VR, but obviously you’re not exactly going to be entering the Matrix anytime soon.
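Since Cardboard interaction boils down to gaze plus a single button, a common pattern is gaze selection with a dwell timer: a selection fires once the user's gaze has rested on an object for long enough. Here's a rough sketch of the idea in Python (the names are illustrative, not actual Cardboard SDK code; in a Unity build you'd drive this from a camera raycast each frame):

```python
class GazeDwellSelector:
    """Fires a selection when the user's gaze rests on one target long enough."""

    def __init__(self, dwell_seconds=2.0):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.elapsed = 0.0

    def update(self, gazed_target, dt):
        """Call once per frame with whatever object the gaze ray hits (or None).

        Returns the target on the frame the dwell completes, else None.
        """
        if gazed_target != self.current_target:
            # Gaze moved to a new target (or to nothing): restart the timer.
            self.current_target = gazed_target
            self.elapsed = 0.0
            return None
        if gazed_target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_seconds:
            self.elapsed = 0.0  # reset so it doesn't re-fire every frame
            return gazed_target
        return None
```

A visible reticle that fills as the timer counts up gives the user the feedback they'd otherwise get from the missing controller.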
One of the things that can’t go unsaid about Cardboard’s accessibility is how it lowers the barrier to entry to VR development. As I said in my hardware for VR post, you can get into VR on a very low budget with cardboard as a target device, and that’s a big deal.
Samsung Gear VR (3DOF + 3DOF controller)

The Gear VR is a fascinating step up from Cardboard. In many ways it’s a very sensible direction to take VR. At around $100 USD, plus the cost of a compatible Samsung phone, the cost is more accessible than a full-immersion setup. Obviously a phone strapped to your head is only going to take you so far visually. You’re also only going to get 3DOF (that’s pitch, roll, and yaw tracking of the user’s head), which limits the feeling of presence. However, the addition of a controller, even a simple 3DOF one, means far greater levels of interactivity. To my mind, having any control scheme that simulates a user’s hand greatly increases the value of the setup.
Having said all that, the Gear VR isn’t something I’d recommend unless you’re already a Samsung phone user. Its benefits are not worth the $1000 NZD investment over a Cardboard setup. It’s also worth mentioning this segment of the market is increasingly competitive. Google have Daydream in this space. I haven’t used it personally, but conceptually it looks quite similar. If you think a setup like this is something you’d like to explore, you should consider both before making your decision. You should also be aware that Daydream looks to be releasing standalone headsets later this year that may be the most interesting take on this segment yet. Unless you’re in a mad rush for a VR setup, I’d consider waiting until later in the year before buying.
A promotional sketch of the HTC Daydream standalone headset
HTC Vive (6DOF + 6DOF hand controllers)
I like to think of the HTC Vive as a window to the future rather than the revolution some tout it to be. The experience itself is unmistakably amazing. The level of immersion is so high that I stopped thinking about controlling a computer simulation, and started just naturally interacting with it as if it was a real environment. It’s actually quite a strange thing to describe, but having used this setup dozens of times, I’m convinced this has far more to do with your hands than your head.
The second you enter a VR environment, and you have some control of it via controllers that closely approximate your hands, you stop thinking about the tools and just start being. Before you know it you are truly in VR. Based on the impact of hand controllers, I’m a bit surprised more isn’t being done with Cardboard headsets combined with hand controllers, but then again there are likely significant technical challenges I’m not aware of.
The HTC Vive in action
Unsurprisingly the Vive is by far my favourite VR experience. However, the reason I say it’s a window to the future is the ticket price. These things cost a fortune. A Vive with hand controllers costs $799 USD, which on its own is a big investment for most. Not inaccessible, but significant. The real kicker is that you need a high-end Mac or PC to connect the headset to. Taking that into account, you can be looking at a $3000+ NZD machine to get started. Not exactly mainstream pricing.
It’s just a matter of time before these things transform into a more mainstream format. It’s quite likely the coming standalone headsets are the beginning of that. How much they cost and how good they are is yet to be seen, but I for one am excited at the possibilities.
The sooner more people have immersive VR experiences at reasonable consumer prices, the faster this industry will grow. With gateway drugs like Cardboard giving people their first taste for a low price, it’s sure to be an explosive industry once the hardware catches up.
Over the past few months I’ve been studying with Udacity to learn VR software development. The course has been great so far, so I thought I’d share a little of what I’ve been up to with a game called “Puzzler”.
Puzzler is a simple VR experience for Google Cardboard; basically anyone with a phone and a $15 Cardboard headset can give it a try. The game has a simple UI where the user is thrust into a dungeon, where 5 magic orbs present a puzzle. Successfully playing a single round of “Simon” gets the player out of the dungeon to victory.
Here’s a brief video of the “final” version of the game:
Of course, I didn’t just wake up one morning and build this thing. There is quite a process to go through to create a good VR experience. As I mentioned on this blog before, a good process can make all the difference when you’re building for someone that isn’t you. Which, let’s face it, is almost always.
The approach taken for this work was no exception so let’s break it down a bit.
Statement of purpose
I started the process by creating a simple statement of purpose for the game:
Puzzler is a mobile VR app which gives new VR users a quick taste of VR via their existing smartphone. The entire experience should take no more than a few minutes and be accessible to almost anyone that's physically able.
With this in mind, I selected the nearest available human to be a test subject and personify my ideal target audience. My 6-year-old daughter Emma was the winner on the day. When starting the build, I first documented a little about her, so I had a clearly defined outline to work to. Here’s the persona for Emma:
VR Experience: Has played a few simple VR games, but not many
Quote (that sums up their attitude): “Can I play with your phone thingy, it’s cool.”
About this person: Emma is an enthusiastic VR user, she loves to explore and have adventures. She’s young and enjoys content that's exciting and interesting, but not too scary or intimidating. She will ask lots of questions and enjoys the discovery. I think it's fair to say she has moxie.
With a clear purpose and audience/persona defined it was time to get going with an initial design and an “alpha” build of Puzzler.
I started by creating a bunch of really nasty-looking sketches of what Puzzler might look like. And when I say nasty, I mean nasty.
Here are a few concepts I had for the game initially:
My thinking with the above designs was that my audience is young. Given that, it was better to use large, simple UI elements that could be easily understood, and a very simple scene design that wasn’t overly difficult to parse. Once I had sketched out something I was happy with, I took the designs and built something in Unity.
With the first cut done it was time to get going with user testing. It’s so important to get an early version in front of your audience to test assumptions. My early tests were really all about determining the basics, like the scale of the scene in relation to the subject.
Basically, from there on it was a process of iterating on the project: a mix of making assumptions, asking questions, and testing it all as often as was practical. To give you some idea of the things I asked, here are a few of the Q&A sessions I had with Emma:
User test 1
Me: How big do you think you are in this scene?
Emma: I feel a bit smaller, normally I think I’d be taller than that barrel. I feel little.
Me: What’s the mood or atmosphere of the environment?
Emma: A bit spooky, but I like the purple balls. Not too scary, there are no witches, I don’t like witches. It’s a bit bright for a dungeon though.
Me: Is there anything you find difficult to see, or anything visual you think could be improved?
Emma: No, I can see everything, but I can’t hear anything. Should I be able to hear things?
Conclusion: Emma picked up on the spooky dungeon. She was feeling too small, so I adjusted the player's height slightly. I moved the orbs to be closer to the player, since Emma liked them so much, and added some sound to the environment.
User test 2
Me: How big do you think you are in this scene?
Emma: I think it's right. I'm the right height.
Me: What’s the mood or atmosphere of the environment?
Emma: Spooky, the sounds are spooky, it sounds like night time. It’s a bit dark.
Me: Is there anything you find difficult to see, or anything visual you think could be improved?
Emma: The balls are too close to me. I feel like I’m going to bang into the balls.
Conclusion: Emma feels the right size now, but I think the scene is a bit too spooky for her, and the orbs are feeling a bit close. I’ll move the orbs to a different location and increase the lighting a bit so it's not so scary.
Emma hard at work user testing
User test 3
Me: How do you feel about the scene?
Emma: It looks cool, I like the lights, it feels dark outside and spooky inside, but warmer.
Me: Do you understand how to start the game?
Emma: Yes, I click the big “Start”.
Me: Do you know how to play the game?
Emma: Yes it's like “Simon says”. I do what the puzzle does.
Me: Is there anything else?
Emma: The balls are in the way of the door. I have to crash into them when I win. The game is too hard for me.
Conclusion: The game has matured to the point where Emma is happy with it and can play it well. She still wasn’t happy with the placement of the orbs, so I moved them back and did a quick retest; she’s now happy. I also reduced the complexity of the game to 5 steps.
Breakdown of final deliverable
So, in the end, we’ve got a dungeon puzzler that isn’t too scary for a 6-year-old to play. Emma likes the game and can successfully play it.
The basic break down of the “final product” is:
The user is presented with a simple UI screen to start the game.
The start screen as seen from the Unity editor
On clicking start the player is moved into the dungeon where they are presented with 5 magic orbs. Emma’s feedback had a significant influence on orb placement and the feel of the dungeon.
The orbs chime and blink in a random sequence which the player must complete to “escape”.
The magic orbs as seen from the Unity editor
If the player fails to repeat the sequence, a “fail sound” plays and the puzzle is repeated to give the player another chance. If they get it correct, they’re moved out of the dungeon and presented with congratulations and the ability to restart.
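To make the round logic concrete, here's a rough sketch of the Simon-style sequence check in Python. The actual game is built in Unity, so the names here are purely illustrative of the approach:

```python
import random

NUM_ORBS = 5      # five magic orbs in the dungeon
ROUND_LENGTH = 5  # reduced to 5 steps after user testing with Emma

def new_sequence():
    """Pick the random orb sequence the player must repeat."""
    return [random.randrange(NUM_ORBS) for _ in range(ROUND_LENGTH)]

def play_round(sequence, player_inputs):
    """Compare the player's orb selections against the sequence.

    Returns True (escape the dungeon) on a full match; False (play the
    fail sound and replay the puzzle) on a wrong or incomplete attempt.
    """
    for expected, actual in zip(sequence, player_inputs):
        if expected != actual:
            return False
    return len(player_inputs) == len(sequence)
```

In the real build, each entry in the sequence drives an orb's chime and blink, and each player input comes from gaze-selecting an orb.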
Overall, it was a fun educational exercise doing this project. There was plenty to learn from a process perspective, but also in Unity 3D development. The work has given me a few ideas for other projects to experiment with, and further solidified my view of the value of user testing and rapid iteration in the development of a product.
Next I think I’ll move on to a new project, but stay focused on something for that target audience. I think it would be of value to include a few more testers, including Emma’s brother and friends.
If you’re interested in playing around with the above project, I’ve shared the source code via Bitbucket. The other projects I’ve done via the Udacity course are also available on the Geekpulp Bitbucket page.
Starting out in VR development, it’s easy to think you’ll spend the earth on special hardware to get going. The reality is that’s just not true. The below 360 image (which I took with the Google Street View app on iOS) is fairly rough as 360 images go. The room was a mess, as I was in the middle of a fairly ugly hardware transition, and the stitching isn’t very good, so there are lots of bung parts to it, but you get the idea of the space and gear I work with.
Until recently I was using a late 2009 Mac Mini and was managing just fine on my Udacity VR developer course. Well, I was managing; not sure about the fine part. So I upgraded to a late 2012 Mac Mini, still 5-year-old hardware, and I’m going gangbusters doing Google Cardboard development with it.
Here are a few more details on the hardware setup I’m running in the 360 image above:
Displays – Two second-hand 1080p displays I purchased on Trademe for $300 total. The one on the left is in portrait orientation for my code editor; the one on the right is in landscape for the Unity editor. Both are on this monitor stand I purchased from PB Tech for around $100.
Input devices – There’s a bunch of keyboards in the shot, but the one I’m actually using now is a Logitech K380. I managed to get a refurbished one on 1-day for $40 shipped. Honestly, it’s such a good keyboard I’d be happy to pay retail for it. Logitech claims 2 AAA batteries will last 2 years, far better than the 2 months I was getting out of my Apple keyboard, and at a fraction of the price. For a pointing device I’m using a stock-standard Magic Mouse. I’d prefer a Logitech MX Master 2S, but that’s a nice-to-have rather than a must.
Computer – I have two in play now. My development machine is a late 2012 Mac Mini I purchased on Trademe for $500. Even though it’s from late 2012, it’s actually the fastest model of Mac Mini ever made, the quad-core i7. I’ve put 16GB of RAM in it and replaced the spinning disk with an SSD. I’m really happy with its performance. It does everything I need for Unity development, especially in contrast to my other machine, the late 2009 Mac Mini. That machine now runs as the server for the house. I’ve modified it a bit, adding 4.5TB of storage and 8GB of RAM. It can no longer run the latest version of macOS, but it’s doing a great job as a cache and Time Machine server.
Audio hardware – Something I’m really aware of with VR is the impact of audio. Obviously visuals are important in VR, but good-quality audio can also have a huge impact on immersion. It’s also very useful in directing the user’s attention. I’ve got a few bits of hardware on order to support creating more of my own audio for various projects. The main bit of kit is a USB audio interface, or more specifically a Focusrite Scarlett Solo (2nd Gen). I got mine via Amazon, as it actually worked out a bit cheaper than buying local, even with the shipping costs via YouShop.
I’m trying to play a longer game with this purchase. It’s likely better than I need at this stage, but it will last me 15 years or more. With budget in mind, for the rest of my audio gear I opted for a cheap XLR mic, arm, and pop filter from Aliexpress. I’ll upgrade those later when I have more funds available and other parts of my setup mature.
360 camera – At the moment I’m just using my iPhone 6s and various 360 apps. Over time I’m expanding my capabilities as I need/can afford them. I’m in the market for a 360 camera, likely something like the Ricoh Theta S or the 2017 Samsung Gear 360. I’m hoping to purchase one of these in the next 3-4 months.
Mac Pro and a 6DOF VR setup – Ultimately I want to replace my Mac Mini with a far more powerful setup so I can expand into more immersive VR development. I’m aiming for a new Mac Pro when they become available later next year. I’m also delaying the purchase of something like a Vive or an Oculus as long as I possibly can. It’s such early days for VR hardware that I’d prefer to wait until there’s a gen 2 or even gen 3 out. I suspect 2018 is going to be an expensive year.
The thing to take away here is you don’t need to spend the earth to get started with VR. Some second-hand hardware and a drive to learn will take you a long way before you need to invest more.