Exploring Virtual Reality
With all the recent hype over virtual reality, I decided to look into developing for VR. I was inspired by apps like Tilt Brush (where you can draw in 3D). So over the last couple of months, I gave VR development a try and quickly realized that it takes a lot of different skill sets. Here's an intro to some of the pieces involved.
Hardware
One of the first things you need to decide is which platform to target. VR hardware falls under two general types, offering either Three Degrees of Freedom (3DoF) or Six Degrees of Freedom (6DoF):
- 3DoF tracks your head's rotation (i.e. where you are looking). Examples include using your phone for Daydream or Gear VR. Only certain phones work (it requires an Inertial Measurement Unit, a fancy sensor).
- 6DoF tracks both where your head is pointed and where you are in the room. Examples include PCs running Oculus Rift or Vive headsets along with their positional tracking systems (Constellation and Lighthouse, respectively). A short code sketch after this list shows the difference.
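To make that difference concrete, here's a minimal sketch using Unity's XR input API (Unity is the game engine I ended up using; more on that below). On a 3DoF device only the head rotation is meaningful, while a 6DoF setup also reports the headset's position in the room. The class name is just illustrative, and the InputTracking/XRNode calls assume a recent Unity version with built-in XR support, so treat this as a rough illustration rather than production code.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Attach to any GameObject to log what the headset reports each frame.
// 3DoF hardware (Cardboard, Daydream, Gear VR) gives you a useful rotation,
// but the position stays at (or near) zero; 6DoF hardware (Rift, Vive)
// also tracks where the headset is in the room.
public class HeadPoseLogger : MonoBehaviour
{
    void Update()
    {
        Quaternion headRotation = InputTracking.GetLocalRotation(XRNode.Head);
        Vector3 headPosition = InputTracking.GetLocalPosition(XRNode.Head);

        Debug.Log("Looking: " + headRotation.eulerAngles +
                  "  Position: " + headPosition);
    }
}
```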
At a minimum, you need a capable phone and something like Google Cardboard for its lenses. The lenses help magnify and focus what you see. Without the lenses, you won't see VR; try looking at your phone really closely and it's just blurry.
The platform will change how you interact with the app, but luckily development starts off mostly the same because we’ll be using a game engine (a software framework specifically for making games and VR).
Software
Game Engine
Since developing for VR is already a difficult task, I recommend using a game engine like Unity or Unreal. I picked Unity and programmed in C# (though there is an option to use a variation of JavaScript if that's your preference). Unreal uses C++.
With a game engine, you can focus more on creating content rather than laying the foundation (and even then there’s a lot to learn). The engine handles:
- Creating and editing objects in 3D
- Adding scripts/behaviors to objects (e.g. objects can be scripted to detect collisions with other objects; see the sketch after this list)
- Controlling what's viewed in a scene with cameras
- Adding lights, which can get complicated quickly. There are different types of lights, including light sources that need to calculate how light bounces off objects in real time
- Creating animations (so you don't have to animate every keyframe)
- Physics (e.g. implementing basic gravity is just a checkbox)
- Built-in SDK support for VR, like the Google VR SDK for Unity
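As a taste of what a script looks like, here's a minimal Unity C# sketch covering the scripting and physics bullets above: a behavior you attach to an object that turns on gravity from code (the same thing the Rigidbody checkbox does in the editor) and logs whenever something collides with it. The class name and log message are just illustrative.

```csharp
using UnityEngine;

// Attach this script to a GameObject that has a Collider and a Rigidbody.
// Unity calls OnCollisionEnter for us whenever another collider hits it.
public class CollisionLogger : MonoBehaviour
{
    void Start()
    {
        // The "Use Gravity" checkbox in the Inspector, set from code instead.
        GetComponent<Rigidbody>().useGravity = true;
    }

    void OnCollisionEnter(Collision collision)
    {
        Debug.Log(gameObject.name + " was hit by " + collision.gameObject.name);
    }
}
```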
3D Creation Suites
So I mentioned a lot of 3D objects and animations above. Unity is great for putting the pieces together, but it's not made specifically for creating complicated 3D models, rigging them, and animating them. For this you'll need a 3D creation suite like Blender or Maya.
I found that creating 3D objects was unintuitive. Moving a mouse around in 2D feels natural, but in 3D I had to relearn even simple navigation just to rotate and view objects correctly.
Asset Store
There’s a lot I haven’t tried, like creating my own 3D rigging, animations, music, and sound. At this point I realized I couldn’t realistically make everything myself. Luckily there’s the Unity Asset Store where you can purchase or sell these types of assets.
Summary
Virtual Reality has some impressive apps already, but it's still early. Fundamental issues like locomotion haven't been reliably solved. With the amount of work it takes to create an immersive virtual reality experience, there isn't a lot of quality content yet. That said, I think as more developers get used to the tools, we'll see some really creative uses of VR.