Thursday, 29 October 2015

Applied Nostalgia: Everybody's Gone To The Rapture

A couple of years ago I was impressed by the release of Gone Home, an indie game set in 1995 with heavy use of applied nostalgia triggers.  The design included general period features and some beautiful detailing to evoke a strong emotional reaction as the player explored an abandoned home.

As someone with a strong interest in this area I was pleased to see nostalgic content appearing sporadically in other games (such as The Witness) but few were as steeped in it as Gone Home.  That is, until a few months ago when a new game appeared...

Everybody's Gone To The Rapture


I'd seen articles about this game online and investigated it a bit over the summer; however, I finally got to see it properly in action at the EGX Eurogamer Expo in Birmingham.  And I was blown away.

First thing: this game is jaw-droppingly beautiful.  The Chinese Room have done an astonishing job on this aesthetically.


The game is full of nostalgia-inducing objects, including 1980s-period Ford Fiesta (and Cortina) cars and an old Commodore 64 style computer (with floppy disk drive and box-style CRT monitor).



There's even a metal push-button payphone (model CT24 if you're feeling particularly geeky about phones), which harks back to my post on the potential of telephones as nostalgia triggers.



The whole thing evokes the feeling of the "deserted village" episodes of 1960s/70s TV shows like The Avengers, UFO, or Doctor Who; period bicycles, a 1970s John Deere tractor, a 1960s-style Transit van ... the level of detail is astonishing.




Mind you, this isn't surprising -- the developers went to a lot of effort to recreate that tone:
"For Everybody's Gone to the Rapture, we really wanted to explore the very British apocalyptic sci-fi of the 60s and 70s," says creative director Dan Pinchbeck. "John Christopher's The Death of Grass and A Wrinkle in the Skin, John Wyndham's Day of the Triffids and some of the really early 'cosy catastrophe' fiction like The Tide Went Out by Robert Wade."
— "Where literature & gaming collide" by Thomas McMullan, Eurogamer
While researching, I came across the website of Mark Silvester, an environment artist who recently graduated from the University of Portsmouth and was asked by The Chinese Room to do some asset work.  Nice to see novice designers pulled into this world.  Here's his (original 1979-style) Walkman:


This game really is quite a visual feast and, if you get the chance to watch any gameplay, do.  The PS4's power gives life to gorgeous metallic railings, glistening in the sunlight.


I felt like I was swept back to the heady halcyon summer days of the 1970s, and that's a great confirmation of the power of nostalgia applied in video games.

Sunday, 25 October 2015

VR the world; VR the children

OK, so I apologise profusely for the pun-tastic post title.  Since my last post I've been busy with all kinds of different things; my game projects have all crawled along in the background making s-l-o-w progress.  More significantly, I seem to have found myself becoming an evangelist-cum-apologist for Virtual Reality technology.

It's a bit of a weird situation.  The major commercial VR products are just about to hit the market.  There's been a flurry of activity this last year.  Early adopters are poised to pounce.  And yet there's also a resigned feeling that the whole thing will be a five-minute wonder.  I've heard the phrase "the next 3D TV" so many times that I'm thinking of getting a T-shirt made.

So, is VR going to take over the world?  Or will it be just a toy for the kids?  (Hey, I have to justify the title somehow!)

View-Master


The Mattel View-Master VR (which I blogged about back in February) has finally hit the streets ready for the Christmas market.  You can already buy it from Amazon and Walmart (but, sadly, not in the UK yet ... and the rotters won't ship internationally).

View-Master VR with mobile phone.
(Credit: Harrison Weber, VentureBeat)
It looks pretty good.  At $30 (that's £20 sterling to you, guvnor) you get a plastic Google Cardboard: one that won't fall apart or crumple when it gets sat on.

Mattel's GIF animation.  Don't blame me if it doesn't work on your screen!

I can't wait to get one.  Except I know what will happen: I'll look through the eyepieces and realise that I won't be able to use it.

The Problem of Interpupillary Distance


The problem is a simple one.  My eyes are too far apart.  (Yes, I've literally got a big head.)

One of the reasons that binoculars have a big hinge in the middle is that everybody's eyes are a different distance apart.  This interpupillary distance (IPD) creates a problem for VR systems, because they need to focus each eye on half of a stereo image -- and that's difficult when IPD varies so much.

A graph of typical IPD measurements.
(Source: research by Neil Dodgson, University of Cambridge)
The graph above shows that 95% of people sampled fell within the 55-70mm range.  Therefore a number of VR manufacturers tend to work on the basis that:

  • A fixed IPD of 63mm to 65mm will be comfortable for most of the population.
  • Fixed lens position is better than variable, because people will set a variable one wrongly anyway.  (That's Firefly VR's position, anyway.)

Google's own Cardboard specification originally presumed a fixed 65mm IPD, but that's complicated by the range of different phone screen sizes, which compounds the problem: the lens spacing and the on-screen image spacing both have to match.

My own 70mm IPD and big-screen phone were enough of a problem that I needed to devise my own scaled-up Cardboard template.
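The arithmetic behind the scaled-up template is simple enough; here's a quick Python sketch (the 65mm stock spacing is my reading of Google's original spec, and the rest of the template scales around it):

```python
def lens_shift_mm(user_ipd_mm, template_ipd_mm=65.0):
    """Per-eye outward shift needed for the lens centres (and the matching
    half-image centres on screen) relative to the stock Cardboard template."""
    return (user_ipd_mm - template_ipd_mm) / 2.0

lens_shift_mm(70.0)   # my 70mm IPD: each lens moves 2.5mm outward
lens_shift_mm(65.0)   # average head: no change needed
```

Not exactly rocket science, but it's the number you need before picking up the scalpel.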

Oculus Rift & chromatic aberration


I picked up an Oculus Rift Development Kit (DK2 version) back in the summer and have been playing with adapting existing VR environments for it.

A VR-adapted version of The Lone Viking, a game (by my former students Grant & Chris) for the Unity game engine.
The DK2 tries to get around the IPD problem by adjusting the display output to the user, attempting to compensate for both eye position and focusing issues.  As you can see in the photo below, the kit comes with hefty lenses, switchable with near-sighted versions.

The DK2 unboxed.
(Photo shamelessly pilfered from Geeky Gadgets)

The lenses have a high magnification, which increases an effect called chromatic aberration.  This effect is demonstrated in the following image (courtesy of mabrowning):



Essentially, the shape of the lens distorts the light coming from individual sub-pixels on the display screen.  If your eye is off-centre of the lens then the colours won't line up properly, producing visible colour fringing.  There's a nice article about the issue at VR-tifacts.

This is a big nightmare for fixed IPD devices.  Thankfully, it appears that Oculus have taken the plunge and the consumer version of the Rift will have a manual IPD adjustment slider.

Even with this, the aberration is not completely eliminated.  It's hard to correct the problem at the extremes of the view, so you'll notice that the screenshot above shows some colour fringing where the software attempts to compensate.
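For the curious, the software compensation amounts to sampling each colour channel at slightly different radial coordinates, so that the lens's dispersion pulls them back into alignment.  A toy Python sketch of the idea (the coefficients here are made up for illustration; real SDKs calibrate them per lens):

```python
def channel_uv(u, v, k):
    """Radially scale a texture coordinate about the lens centre (0.5, 0.5)
    using a simple quadratic distortion model."""
    du, dv = u - 0.5, v - 0.5
    r2 = du * du + dv * dv
    s = 1.0 + k * r2
    return 0.5 + du * s, 0.5 + dv * s

def sample_coords(u, v, k_red=-0.02, k_green=0.0, k_blue=0.02):
    """Per-channel sample positions; red and blue are pulled apart radially,
    most strongly away from the lens centre."""
    return {c: channel_uv(u, v, k)
            for c, k in (("r", k_red), ("g", k_green), ("b", k_blue))}
```

At the centre of the lens all three channels sample the same spot, which is why the fringing only shows up at the edges of the view.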

Conclusions


Of course, the big advantage of the Oculus Rift is that you can move your head freely (although there's still a small amount of motion sickness on the DK2).  However, I was shocked to find that the image on my £10 made-at-home Google Cardboard felt better than the blurry one on the £300 DK2.

This is likely a result of my big head -- sorry, larger-than-average IPD -- and I'm sure that a carefully-adjusted consumer Oculus Rift will not have that issue.

And what are we going to use VR for anyway?  Gaming is seen as the killer app but I doubt companies like Facebook would invest $2 billion into buying Oculus on that basis alone.  At work we're already talking realistically about using it for training & education: from welding simulators to biology.

However, I can't help but think that the biggest quantity of sales for VR this next 12 months will instead be the toy market.  I think Google and Mattel have hit the jackpot in combining nostalgia with high-tech, and those of us with bigger-than-average heads will undoubtedly start cannibalising our cheap VR headsets to join in.

Thursday, 5 March 2015

The Trouble with Mipmaps

In recent weeks I've been experimenting with developing content for the Google Cardboard VR system using Google's SDK plug-in for the Unity game engine.  For this job I've created a test room with some simple objects including chairs and a table.

For the table I decided to take advantage of symmetry to repeat texture elements, ensuring a good optical resolution across the full surface of the object.  This made sense, because the player would get close-up, and it added only a handful of extra polygons.

The table model (with fake shadow).  The table on the right illustrates the position of the polygon split.
As you can see, the table is split across the middle, allowing the UV map elements from one end to be flipped and overlaid over the other end.  Thus texturing one half of the table automatically textures the other half in mirror-image.  (I could have taken this further and cut it into quarters, but felt this would have made the symmetry-repetition too obvious to the eye.)

Optimising further, I therefore needed to texture only half of the table-top, a single leg, and one end/side panel.  The texture has baked-in ambient occlusion, too.

All well-and-good, but when I planted the table in the game engine a nasty seam appeared across the middle of the table, which got worse at certain viewing angles.


At first I thought this was a glitch from not sewing the UV sections properly.  However, after checking the original model and moving the camera away from the table, I realised that the cause was an old problem I'd first encountered last year: mipmaps.

For those not familiar with the concept, mipmaps (or MIP maps if you're a pedantic etymologist) are a tool used by game engines to reduce the 'cost' of drawing a 3D scene.  Applying a full-sized version of the texture is wasteful if the object is too far away to see its detail; therefore, the engine generates pre-rendered half-size versions of the texture which are used instead.

Original texture (256x256) with successive half-sized versions. Collectively, this set of textures takes up 33% more memory than the original texture. (Image adapted from Tom's Hardware Guide)
This illustration indicates where different textures are used at different distances.
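If you want to check that 33% figure, the chain of half-sized textures forms a geometric series that converges to a third of the base texture's footprint.  A quick Python sketch:

```python
def mip_chain(size):
    """Edge lengths of a square texture's mip chain, down to 1x1."""
    sizes = [size]
    while sizes[-1] > 1:
        sizes.append(sizes[-1] // 2)
    return sizes

def mip_overhead(size):
    """Extra memory used by the mips, as a fraction of the base texture
    (counting texels; assumes the same format at every level)."""
    base = size * size
    extra = sum(s * s for s in mip_chain(size)[1:])
    return extra / base

mip_chain(256)      # [256, 128, 64, 32, 16, 8, 4, 2, 1]
mip_overhead(256)   # ~0.333, i.e. the 33% quoted above
```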
In principle mipmaps are a really useful tool for optimising performance.  However, they do have a nasty habit of killing sharp edges on textures.  I first encountered this when developing a modular road-building system for Unity.

On this section of road you can see where the central lines are blurred in the distance.  Far-away sections are textured with a lower-resolution image (i.e. black and white pixels averaged to grey).

In the case of the table, the "seam" edge of the tabletop was being merged with the black pixels nearby as the resolution decreased.

Room (distorted by VR camera) with table seam visible.

The easiest way to solve the problem is to disable mipmaps.  This can be done on Unity via the texture's settings.


However, disabling mipmaps means a greater cost when rendering.  On a mobile device that is a significant problem; after all, mipmaps were invented (and are enabled by default) for a reason.

In this case, thankfully, the answer was simple: to overlap the wood texture well beyond the seam edge to ensure that it averages to a wood colour even at lower resolutions.
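You can see both the bleed and the fix with a one-dimensional mock-up of the mip filtering.  Each mip step averages neighbouring texels, so a hard black edge smears inward unless the texture overlaps well past the seam (the texel values below are made up; think of 200 as "wood" and 0 as "black"):

```python
def downsample(row):
    """One mip step on a 1-D strip of texels: average adjacent pairs."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

wood, black = 200, 0
tight = [wood] * 6 + [black] * 2   # texture stops dead at the seam edge
padded = [wood] * 8                # wood colour overlapped well past the seam

mip_tight, mip_padded = tight, padded
for _ in range(2):                 # drop two mip levels
    mip_tight = downsample(mip_tight)
    mip_padded = downsample(mip_padded)

mip_tight    # [200.0, 100.0] -- black has bled into the wood near the seam
mip_padded   # [200.0, 200.0] -- the overlap keeps the seam clean
```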

Look!  Still using mipmaps but no seam, thanks to better UV texture mapping!
It just goes to show that it's good to be careful when matching texture patterns to UV maps: a couple of pixels overhang is not enough in critical situations -- a lesson learned the hard way.

Monday, 9 February 2015

Unity Cardboard test #2

Following on from my initial test constructing a VR environment for Google Cardboard on the Unity game engine, I've been playing with capturing user input on the Cardboard system.

For this purpose, I've knocked-up a test room which I can populate with different objects.  Initially I've thrown a chair in there, to allow me to test whether an object has focus (i.e. dead centre of the viewport).

Test room design, using standard Unity shaders.  Approx. 1000 triangles (total) with four 512x512 textures.
Unity has a special library of low-cost mobile elements, optimised for smartphones & tablets, which have weak graphics & processing facilities (when compared with PCs).  Early iPhones can only manage somewhere in the region of 20 draw calls per frame; to put that in context, a standard PC-style background skybox will eat up something like 6-10 draw calls on its own.

The first thing I wanted to try was to see how Unity's lightweight mobile shaders compared against standard shaders, because adding an extra camera (i.e. stereoscopic vision) is going to add some extra load (even if each camera's viewport is half the size).

The image above shows the version using standard shaders and a point light source, which pulled about 16-32 drawcalls on average.  On my battered old Galaxy Note, it ran OK but was a little sluggish to respond.

The image below uses mobile shaders, which pulled things down to 8-16 drawcalls.  The models & materials could be optimised further but I wanted to just get a ball-park feeling.  As you can see, there's a considerable difference in appearance, the chief one being that any lighting effects are going to need to be baked-in to the environment textures.

Same room model but using mobile shaders.
The other thing I wanted to test was the Cardboard SDK API.  Extracting info from Google's demonstration file, I wrote a script which kicked-in when the camera was looking at an object, changing the object's shader to make it "light up".
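The SDK does the "is the player looking at it" test for you (via a ray from the camera), but the underlying idea is just an angle check between the view direction and the direction to the object.  A sketch in Python (the 5-degree cone is my own arbitrary choice, not a Google value):

```python
import math

def has_focus(cam_pos, cam_forward, obj_pos, cone_deg=5.0):
    """True if obj_pos lies within a narrow cone around the view direction.
    Assumes obj_pos is not exactly at cam_pos."""
    to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
    mag = math.sqrt(sum(x * x for x in to_obj))
    fmag = math.sqrt(sum(x * x for x in cam_forward))
    cos_angle = sum(a * b for a, b in zip(to_obj, cam_forward)) / (mag * fmag)
    return cos_angle >= math.cos(math.radians(cone_deg))

has_focus((0, 0, 0), (0, 0, 1), (0, 0, 5))   # True: dead ahead
has_focus((0, 0, 0), (0, 0, 1), (5, 0, 5))   # False: 45 degrees off-centre
```

In practice a raycast is the better tool (it respects the object's actual mesh and handles occlusion), but the cone test gives a feel for what "dead centre of the viewport" means.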

Chair has been "lit" to indicate that it is currently selected.
Finally, I added some further script code to respond to the magnetic trigger on the side of the headset.  This would rotate the object by 45 degrees for each 'click', but only while the object is selected.

A rotated chair.  It's not exactly Halo 5 but it's a start.

All-in-all, this has been a very successful experiment.  I've gained a feel for how to interact with the API and now have the ability to interact with the user more.  Next up: movement and switching between objects.

Saturday, 7 February 2015

VR with the View-Master

Yesterday saw an interesting tease from Google and Mattel: a product announcement at a forthcoming toy fair.


It's fairly evident from the teaser that this will be a new digital View-Master stereoscopic viewer.

A digital version of the View-Master isn't exactly a new idea, but Google's involvement means the smart money is betting on something compatible with the Cardboard project.  That would put a big marketing machine behind a ready-built, sturdy VR system compatible with existing (and forthcoming) Cardboard games and apps.

If they're smart, they'll make it cheap and compatible with popular mobile phones; if they're greedy they'll build the display screen in -- a very bad idea in my opinion, since the appeal of this type of thing is likely to be linked to price.

Regular readers will recall that I'd referenced the View-Master in my 'Rewind' game concept trailer.  The circle comes, um... full circle!
Recent reviews of the Samsung Gear VR system suggest that it's an excellent product but, unlike Cardboard, it uses its own movement sensors and controls (i.e. Oculus technology) -- rendering it incompatible with Cardboard.

So, if Google & Mattel are looking to make a big deal of cost-effective VR then we could be in for a format war which will present nightmares for game & app developers.  Let's hope they all see sense and work toward a common VR API.

However, in the meantime, a big announcement (and the related surge of interest) means problems for me.  I'm months away from turning 'Rewind' into a viable VR game, and had relied upon the marketplace staying relatively uncrowded as a way to get better visibility.  Still, such is life for a one-man development studio, so I'm just going to have to plod on and see how things develop.



Update


As hoped, Mattel went for the sensible option and are producing the new View-Master as a tough plastic version of the existing Google Cardboard headset.  The price is good, too: $30, which will translate as £20 -- comparable to high-end Cardboard kits.

Image pilfered from venturebeat.com

Photo shamelessly stolen from CNET.

The device has a comfortable, face-hugging viewport (although I wonder if it could get sweaty after prolonged use?) and a glossy black face on the rear.


I was intrigued that none of the launch photos showed the insides, i.e. the smartphone mounting system; indeed, the reports indicate that the object on display is a non-working prototype -- created to show the design aesthetics only -- just in case the toy-retailing market turns its nose up at the idea.  At the event Mattel used normal cardboard headsets to demonstrate the VR aspect.

Reports suggest that they intend to use this for Augmented Reality (AR) as well as VR.  For AR the phone's camera would need a clear view, so the back face would need a redesign.  If so, I suspect the final product will look rather different to the prototype.


The "reel" discs are a strange aspect to the package.  A CNET report suggests that the discs work on AR-style principles: you lay the disc on a table, point the headset at the disc, and it recognises (via the smartphone camera) the image at the centre of the disc and launches an appropriate 360-degree panorama.  Discs will be priced at around $15.

Presumably this requires an Internet connection to download the panorama, which makes the physical disc a bit of a useless gimmick -- given that you could just directly download the reel from the app store instead.  However, I'm guessing they wanted something to hook to the nostalgia of the old View-Master disc reels, which is understandable from a marketing perspective.

I must admit that I'm ready to part with my £20 to get a sturdy version of Google Cardboard, especially if it has that feel of using an old View-Master with the pull-down 'clicker'.  For me, the new View-Master can't arrive in toyshops quickly enough.

Hopefully I'll have a working VR version of the Rewind game ready in time for the launch!

Sunday, 18 January 2015

Google SDK vs Dive SDK

In the previous post I used the Dive plug-in to create a simple VR app for Cardboard.  Shortly after I'd created the test app, I discovered the official Google Cardboard SDK for Unity. Therefore my next step was to try it to see how the two compared.

Although the Google package appeared more cumbersome, I quickly found that it was much more flexible.  One key -- and rather important -- distinction was the ability to detect Cardboard's magnetic trigger switch.

The outer neodymium magnet (countersunk ring shape) is held in place by a ferrite (ceramic) disc magnet, glued inside the headset.  The ring can slide downward and, when released, flicks back into alignment with a pleasing, spring-style motion -- a genius bit of design.  The photo illustrates how the switch affects the phone's compass sensor.
(Source: Ben Lang's "Road to VR" blog)
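At its simplest, detecting the switch just means watching the magnetometer for sudden jumps in field strength (one on the pull, one on the release).  A rough Python sketch; the threshold is made up, and the real SDK filters the signal far more carefully:

```python
def detect_clicks(field_magnitudes, threshold=15.0):
    """Flag sample-to-sample jumps in compass field strength -- the tell-tale
    of the Cardboard ring magnet sliding past the sensor and snapping back."""
    return [abs(cur - prev) > threshold
            for prev, cur in zip(field_magnitudes, field_magnitudes[1:])]

detect_clicks([50, 50, 90, 50])   # [False, True, True]: one pull-and-release
```

This also makes it obvious why a headset can't use the magnetometer for both a switch and orientation tracking at the same time.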
Certainly the Google-driven app seems to behave itself better on the phone.  I'd found that my Dive apps had a habit of losing track of head position during fast movement, meaning that the 3D landscape would unexpectedly re-orient direction (i.e. yaw) or roll the horizon to a permanent angle.  That's a biggie in bug terms, and undermines the experience significantly.  I'm guessing that the Dive headset design may use the magnetic sensors to assist with determining alignment -- something that Google eschewed because of Cardboard's magnetic switch.

Every Cardboard app I've seen has used the hardware 'back' button to quit, but my Dive test app ignored this button.  Access to hardware buttons is obstructed by both Cardboard and Dive designs, so the only time most users would press 'back' is after de-coupling their phone from the headset -- so it's not much of an issue.

Another problem I found was that, after mixing the Google and Dive SDKs in the same project, the Dive app started crashing badly when recompiled.  There's obviously some kind of conflict there.

Updated: I'd suffered crashes when mixing Dive & Google SDKs in the same project.  However, the same problem later started to affect Google-SDK-only projects.  It now appears that this is a bug caused by corrupted files within Unity.

The Google SDK is not all roses: in particular, the manual is a bit rubbish.  When testing the Dive app within Unity's game window, the mouse controlled head movement.  It took me three days to discover (by examining the SDK's C# scripts, eventually) that Google's SDK supports mouse control only if you hold the ALT key down.  Most unhelpful.

Updated: ...and another two weeks to discover that the CTRL key allows the head to roll from side-to-side.

As a side note, I understand that the Samsung Gear VR phone-based system is based on Oculus Rift so, presumably, the Oculus Rift official plug-in for Unity may offer some support in this direction.  However, my investigations so far suggest that this will be far from straightforward.

A test environment in Unity using the Dive plug-in
It may sound obvious to conclude that Google's own SDK works best for Google's VR system, but in 33 years of programming I've discovered that presumptions always need to be tested.  It's used up a fair few hours of my life but I'm satisfied that I'm now working with the right tools for the job.

Tuesday, 13 January 2015

Unity Cardboard test #1

After playing a few demos on my Google Cardboard VR headset I was keen to get building my own test environment.

Thankfully I quickly stumbled upon a helpful tutorial on a blog by programmer Ben Dixon.  This explained -- in nice, easy terms and lots of pretty pictures -- how to put together a basic Cardboard app using the Unity 3D game engine, my own personal weapon-of-choice.

Setting up a split screen was easy: set up two cameras (one for each eye position) and allocate each a viewport on the left- and right-side of the screen respectively.
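In data terms the whole rig boils down to two normalised viewport rectangles plus a small horizontal offset per eye.  Sketched in Python (the 64mm default is a typical adult IPD, not a value from Unity or the tutorial):

```python
def stereo_rig(ipd_m=0.064):
    """Normalised (x, y, width, height) viewport rects plus the local
    x-offset of each eye camera from the head position, in metres."""
    left_rect = (0.0, 0.0, 0.5, 1.0)    # left half of the screen
    right_rect = (0.5, 0.0, 0.5, 1.0)   # right half of the screen
    return {"left": (left_rect, -ipd_m / 2), "right": (right_rect, ipd_m / 2)}
```

In Unity itself these become the two cameras' viewport rect values and their local transform offsets under a shared head object.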

A little more tricky was the head movement tracking control.  Ben used the Dive plug-in package produced by Durovis as part of their Software Development Kit (SDK) for Unity: attach the main script to the first-person controller and ... voila!

So I had a quick go.  I used Unity's low-cost Standard Assets (Mobile) package to create a basic terrain & skybox.  Planted a couple of primitive shapes around to help with direction calibration.  Then, finally, I added a couple of Maya animated game objects I had sitting around, following a waypoint path.  This was the result:

Side-by-side (L-R) stereo view as seen on an Android mobile phone.
"I am Warbot!  I will kill you, puny hu-man!"
(A handful of eagle-eyed readers will recognise Josh Taylor's famous Warbot tin toy design, used to teach UV map texturing to games design students.  This is my own homage with a mesh & texture based heavily on Josh's original.)

While I was at it, just for fun I threw in a model of an Imperial AT-AT Walker (created by one of my students) that I'd adapted for a rigid-mesh rigged-animation walk cycle demo (based on the Preston Blair 4-legged sequence).


The overall result worked OK, but I was disappointed that the rotation and horizon slipped on occasion, resulting in a lopsided view.  Also, as pointed out in the tutorial, there was no clear way to detect the magnetic switch on the Cardboard unit (no surprise, given that the SDK was for a competing product).

Well, that's VR test #1 complete.  In the next test I'm aiming to use Google's own Unity SDK, which is a bit more complex to use but gives me access to the magnet trigger switch.

Monday, 12 January 2015

VR Games

In the previous post I described my initiation into the world of Google Cardboard, a cheap and fun way to enter the realms of Virtual Reality (VR).

After building my cardboard headset I downloaded some demo apps to test the whole thing out.  One of my favourites has been DebrisDefrag by Japanese developer Limecolor.

Shoot the large asteroids as they wander across your path.  (Hmm, sounds like an idea for a vector-based 2D arcade game...)  Careful -- they're all around you (even above and below).
DebrisDefrag makes excellent use of the magnetic switch on Cardboard and emphasises the fact that you can have a fun game even if your FPS "body" can only swivel on the spot.

I was surprised to find Tuscany Dive by FabulousPixel, a version of the Tuscany demo designed for the Oculus Rift VR system.  Makes a lot of sense -- after all, the demo was built to showcase VR.  Anyway, this version appears to have been designed for use with the Durovis Dive (also called OpenDive) VR headset, which is basically a plastic 3D-printed equivalent of Google Cardboard.

A side-by-side (L-R) presentation of the Tuscany house.  These images can be free-viewed using the parallel technique.


Mobile phones have relatively low-power processing ability (typically a tenth of normal desktop processors, according to my research) so an environment like this, with its high graphics cost, pushes the limits of a mobile phone's electronics.  Thankfully the developers have added a tap-to-change-graphics-quality feature.  (With Cardboard you must move the headset away from your face and wiggle your finger inside the nose-hole to touch the screen, so it's not exactly convenient!)

The big problem with this demo -- and it demonstrates one of the big technology limitations of cheap VR -- is how to move.  The demo has an "auto-walk" feature, where you look downward to toggle movement in the direction of view.  I found it a pain to use.
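The toggle itself is easy enough to model: flip the walking state whenever the view pitch crosses below some threshold.  A Python sketch (the -35 degree threshold is my guess; I don't know what the demo actually uses):

```python
def update_autowalk(walking, prev_pitch_deg, pitch_deg, toggle_below=-35.0):
    """Flip the auto-walk state when the pitch crosses below the threshold
    (i.e. the player dips their gaze); ignore frames already below it."""
    if prev_pitch_deg > toggle_below >= pitch_deg:
        return not walking
    return walking

update_autowalk(False, 0.0, -40.0)    # True: looked down, start walking
update_autowalk(True, -40.0, -45.0)   # True: still looking down, no re-toggle
```

The crossing test is what makes it usable at all: without it, simply glancing at the floor would switch walking on and off every frame.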

Tuscany Dive isn't exactly a game, but it does show the potential for 3D environments like those typically found in first-person games.

Sisters by Otherworld Interactive demonstrates the application of this very well, in a nice little creepy horror experience.


Many of the other Cardboard apps I've tried are just VR demos but my experiences have shown me that there's a fair bit of potential for making games in this particular area -- especially ones aimed at cheap mobile VR devices like Cardboard or Dive.

The Sisters game made me think of Gone Home, which reminded me that my old Rewind sandbox game concept particularly lends itself to VR implementation.  I'd put the game to one side while working on other projects, but this seems a prime opportunity to resurrect it; I've got a load of assets built already.  Therefore I'm aiming to release a basic version of Rewind once I've cracked a few tricky problems:

  • How to use Unity to construct VR games
  • How to control 3D movement with only limited input from the user

Next post: getting started with VR in Unity!

Tuesday, 6 January 2015

Cardboard VR

The first time I looked through a View-Master I knew I'd fallen in love with 3D stereoscopy.  From that day onward I grabbed any opportunity to experience it.

Don't forget -- back in the dark ages of the 1980s, it was rare to find 3D images!  I lovingly cherished the anaglyphic goodness of my giant-sized Captain EO comic book and gazed in wonder at the novelty shorts playing on Alton Towers' early IMAX 3D screens.  I oscillated from side-to-side looking at exhibits in Liverpool's Albert Dock laser hologram gallery (now sadly long gone).

However, it was an exhibition at the Guernsey Occupation Museum that opened my eyes (pun fully intended) to the serious use of stereo imagery.  Since then, through various online resources (including Brian May's occasional blogs on historical 3D photography) I've learned a useful amount about the subject.

Of course, in the last decade, 3D has gone mainstream.  From cinemas to the 3DS console, stereo images are now easy to find.  Some implementations are poor (such as the dreadful back-conversions of popular movies) and some are good (like Anthony Dod Mantle's beautiful slow-motion 3D cinematography on Dredd 3D).  Some are just fun, like the lenticular prints found on selected DVD & book covers.

Last year I had the opportunity to study the excellent work of fellow MA Games student Alex Bellingham.  His Zeitgeist project gives a grand overview of 3D image presentation techniques and explores interactive user interface (UI) design in this strange new world.  Alex is currently an "immersive UI specialist" at Sony SCEE, working on their Project Morpheus VR (virtual reality), which leads me neatly into the subject of this post -- interactive 3D through VR!

Virtual Reality on-the-cheap


I'd had a chance to play with the popular Oculus Rift VR system back in September, at a development kit training session for UCLan's Game Design students.  My first impression was of terrible motion sickness -- a major problem VR developers are still trying to crack -- but the overall technology was extremely impressive.  The only catch is the cost.

So, back in the summer when a friend introduced me to Google's Cardboard virtual reality (VR) project, I was interested.  For those who aren't familiar with it, the idea is to construct a basic VR headset using just an Android mobile phone and cardboard.

Google Cardboard headset with phone, showing a side-by-side stereo image on the screen.
(Photo pilfered from Google's website)

If this sounds too easy that's because there is a catch.  You also need a pair of convex lenses (which can be difficult to source) and a couple of strong magnets; thankfully there are a number of online suppliers willing to provide these (at a hefty mark-up, of course).

The phone app displays a pair of images, each focused to an eye by a lens.  The phone's internal gyroscopes detect movement; the magnets are used to construct a clever remote "switch" which triggers the phone's internal magnetic sensors.  Finally, an optional sticker allows NFC-compatible phones to launch the app automatically after mounting.

The full kit, including a pair of neodymium magnets, convex lenses, an NFC sticker, rubber band, Velcro fasteners and pre-cut folding cardboard.
(Photo from Google)

It's taken quite a while to get round to investigating it first-hand, but I've been quite impressed with it as a very economical -- and fun -- form of VR.

Cardboard for the Galaxy Note


One of the factors delaying my investigation was my phone.  I've got an original model Samsung Galaxy Note (N7000) and its 5.3-inch screen makes it too big to fit the standard Cardboard design template.

I'd considered buying one of the ready-cut "big phone" cardboard kits, but fancied the challenge (and ethos) of building from scratch.  I couldn't find an existing template online, so I did a bit of measuring and drew up my own.

During testing I discovered another problem: I have a big head.  Literally.  My eyes are further apart than average, which messes with the lens positioning.  The mean interpupillary distance is around 62-65mm (which is the distance between lenses on Google's template); mine is 70mm.  So I adjusted the design to compensate.

Here's the final design, adapted specifically for the N7000:





I'm more than happy with the final product, which works perfectly.  It's a shame that the N7000 doesn't have NFC technology but it's not much of a hardship having to launch the app manually.

The finished product!

In the next post I'm going to explore the world of Cardboard games: the interesting limitations of control using a single switch, and what I've found out so far trying to develop VR games from scratch using the Unity engine.