Although none of us at eleVR are hardware people, we do sometimes find ourselves hacking together experimental hardware when we can’t easily explore an area without it. Such was the case about a year ago, when we were starting to get impatient waiting for AR goggles, like the HoloLens, to become available.
Having seen phone AR devices, such as this one developed by Kate Compton and this product for annotating the night sky, I knew that it ought to be possible to combine appropriate lenses and semi-reflective surfaces with a smartphone to create an AR device. We had a bunch of Wearality Sky lenses around and some sheets of clear acrylic, so I set about trying to see if I could combine these into something that would let me see both a rendered scene and the real world at the same time.
My first working “prototype” consisted of holding the plastic up to the lenses (with the phone inside) and trying to get a reasonable-looking image on pass-through. As you can imagine, this is super awkward (holding one object in each hand while trying to keep a consistent angle), but it worked well enough to convince me that it was worth laser cutting plastic pieces to improve the experience.
I designed a reflective panel and two side pieces that would hold the reflective piece at ~45 degrees and slide into the Wearality headset in a removable way, preserving its “flat packability”.
I laser cut a large number of gradually improving prototypes, tweaking the sizes and shapes of the pieces. Apparently my intuition for how large noses are was drastically off, so many of the prototypes changed only the size and proportion of the nose gap. I also went through a few iterations of ways to securely attach the pieces to the Wearality in the desired configuration without losing the ability to easily turn the headset into a convenient flat package. One of the last big changes was switching to attaching the three pieces to each other with a “snap together” mechanic rather than my original “slide together” one. This greatly reduced the wobbliness of the whole configuration, and made the entire contraption much more usable and secure.
Finally, I realized that I needed to let less ambient light through to improve the visibility of the reflected image, so I glued a fabric light shield (in eleVR colors, of course!) onto one side of the Wearality and made the side panels. This problem also would have been helped by using half-mirrored plastic for the reflective panel, but I didn’t have any. That said, I strongly recommend using half-mirrored plastic if you want to try creating your own similar device. You can download the final laser cutter files here.
Of course, a hardware prototype isn’t the only thing you need to actually play with AR; a “software prototype” is necessary as well. I used my web-vr boilerplate as a starting point, but some changes were needed for rotational movement to translate properly now that the phone lay horizontally while displaying to the user vertically and reflected. Thus, the webAR boilerplate was born.
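To give a sense of the kind of correction involved (this is a minimal sketch, not code from the actual webAR boilerplate; the names `correctPoint`, `rotate90`, and `mirrorX` are illustrative), the sideways, mirrored phone needs two fixes applied before ordinary VR rendering: a 90-degree rotation because the phone lies on its side, and a horizontal mirror because the user sees the screen bounced off the reflective panel.

```javascript
// Apply a 3x3 matrix (row-major, column-vector convention) to a point [x, y, z].
function apply(m, [x, y, z]) {
  return [
    m[0] * x + m[1] * y + m[2] * z,
    m[3] * x + m[4] * y + m[5] * z,
    m[6] * x + m[7] * y + m[8] * z,
  ];
}

// 90-degree rotation about the screen's z axis: the phone is held sideways.
const rotate90 = [0, -1, 0,
                  1,  0, 0,
                  0,  0, 1];

// Horizontal mirror: the ~45-degree reflector flips left and right.
const mirrorX = [-1, 0, 0,
                  0, 1, 0,
                  0, 0, 1];

// Compose the two: first rotate into landscape, then mirror for the reflection.
function correctPoint(p) {
  return apply(mirrorX, apply(rotate90, p));
}
```

In a real renderer you would fold the same composed transform into the view or projection matrix (and flip the triangle winding, since a mirror reverses handedness) rather than transforming points one at a time.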
Viewed without the headset, the boilerplate moves non-intuitively, but in the headset the strange reflected text and apparently backwards movement are corrected. When I showed it to other people in the lab, even with the headset off, I could direct them to interesting objects in the scene. “The dodecahedron is over there” became a common refrain in the office.
Even though I’m not really a hardware person, I learned a lot from trying to make my own AR hardware, and exploring this limited early prototype helped me better grasp what use cases a technology like this might have (e.g., adding interactive information to non-digital demos) and what deficits in my device I would want fixed in a more serious AR device (e.g., good position tracking).
P.S. A video of this object: