Thankfully I quickly stumbled upon a helpful tutorial on a blog by programmer Ben Dixon. This explained -- in nice, easy terms, with lots of pretty pictures -- how to put together a basic Cardboard app using the Unity 3D game engine, my own personal weapon of choice.
Setting up a split screen was easy: create two cameras (one for each eye position) and allocate each a viewport on the left and right halves of the screen respectively.
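For anyone wanting to try the same thing, the two-camera setup can be sketched roughly like this in Unity C#. This is a minimal sketch of the idea rather than the tutorial's actual code: the class name, field names and the eye-separation value are my own assumptions.

```csharp
using UnityEngine;

// Minimal side-by-side stereo rig: two cameras, each given half the screen.
public class StereoRig : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float eyeSeparation = 0.064f; // assumed value, roughly a human interpupillary distance in metres

    void Start()
    {
        // Offset each eye camera sideways from the rig's centre point.
        leftEye.transform.localPosition  = Vector3.left  * (eyeSeparation / 2f);
        rightEye.transform.localPosition = Vector3.right * (eyeSeparation / 2f);

        // Camera.rect is a normalised viewport: (x, y, width, height) in 0..1.
        leftEye.rect  = new Rect(0f,   0f, 0.5f, 1f); // left half of the screen
        rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f); // right half of the screen
    }
}
```

Parent both eye cameras under the rig object and any head rotation applied to the rig carries both views along together.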
A little trickier was the head-tracking control. Ben used the Dive plug-in package produced by Durovis as part of its Software Development Kit (SDK) for Unity. Attach the main script to the first-person controller and ... voila!
So I had a quick go. I used Unity's low-cost Standard Assets (Mobile) package to create a basic terrain and skybox, and planted a couple of primitive shapes around the scene to help with direction calibration. Finally, I added a couple of Maya-animated game objects I had sitting around, set to follow a waypoint path. This was the result:
Side-by-side (L-R) stereo view as seen on an Android mobile phone. "I am Warbot! I will kill you, puny hu-man!"
While I was at it, just for fun, I threw in a model of an Imperial AT-AT Walker (created by one of my students) that I'd adapted for a rigid-mesh rigged-animation walk cycle demo (based on the Preston Blair four-legged sequence).
The overall result worked OK, but I was disappointed that the rotation tracking and the horizon slipped on occasion, resulting in a lopsided view. Also, as pointed out in the tutorial, there was no clear way to detect the magnetic switch on the Cardboard unit (no surprise, given that the SDK was built for a competing product).
Well, that's VR test #1 complete. In the next test I'm aiming to use Google's own Unity SDK, which is a bit more complex to use but gives me access to the magnet trigger switch.