Cross Platform VR/AR Development in Unity for Vive and HoloLens


 

Introduction & Summary

Hi, I’m Steve Sedlmayr and this is my first guest blog post here at eleVR. I’m a self-taught software engineer with a little over 10 years of experience. For the past several years of that career, I’ve been working specifically in the game industry, primarily in the social/mobile space. I’m now working on a game that I would like to launch on both VR and AR platforms. Why both? I have my own reasons, and I assume you have yours as well. This tutorial focuses on Vive and HoloLens; however, I highly encourage you, after having followed this tutorial, to implement it now or in the future for whatever additional platforms you like; be it Oculus, PlayStation VR, or whatever else appears in the future.

If you don’t want to read the whole thing, you can skip ahead to any section or browse via the outline below:

 

Step 0.a: Installing the Necessary Hardware, Software, and SDKs

        For the Vive

        For the HoloLens

Step 0.b: The Initialization Scene

Step 1: Detecting and Switching Platforms

Step 2: Starting with the Vive Scene

Step 3: The HoloLens Scene

Step 4: Finally, Loading Our Content

Step 5.a: Deploying to the Unity Editor

Step 5.b: Deploying to Vive

Step 5.c: Deploying to HoloLens

        Building the initial Visual Studio project

        Deploying to the Emulator

        Deploying to the HoloLens

        Troubleshooting UWP compatibility issues

Final thoughts

        About scale

        About game or app design

        Final final thoughts

Appendix A: An Aside on Using External Libraries for Unity Development

Appendix B: My setup

All the Links

 

Some of the tutorial uses code from my own external class library. Check Appendix A: An Aside on Using External Libraries for Unity Development for more about class libraries and links to my library. I’ve also included details about my hardware setup in Appendix B: My setup. It might be a handy reference if you run into issues with the tutorial that have to do with differences in your own setup.

 


 

Step 0.a: Installing the Necessary Hardware, Software & SDKs

For this step, I’m mainly going to defer to Valve and Microsoft, which have written their own, 99% definitive quick-start guides. Skip to Step 0.b: The Initialization Scene if you’ve already done this on your own.

 

For the Vive

This is the main page of the Vive Development Portal.

It’s a bit confusing on the main page, but for Unity you just need to download the Unity Plugin, which already includes the SteamVR SDK. You can do that from the link they provide on the portal, but that’s just going to open Unity anyway. Instead, download it directly from the Unity Asset Store from within Unity.

There is no emulator for the Vive that I am aware of at the moment, and in fact the SteamVR plugin won’t even run in Unity if you don’t have a Vive attached to your machine. I found this inconvenient, as my Vive is attached to another machine in the living room. And with the Lighthouse sensors screwed to the wall, it isn’t exactly convenient to switch it between machines. I’ll cover this in more detail in a later section in case you have or need a similar setup. For instance, I don’t imagine every company will want to buy a separate Vive headset for every developer, designer and artist on their team.

In any event, you’re going to need a Vive. Once you’ve purchased it, download the ViveSetup software from their site and just follow the instructions from there on the computer to which the Vive will be attached.

A note once everything is running: sometimes the room tracking will get messed up and you won’t be able to launch, or you will have issues in SteamVR games. Just re-calibrate the room following these steps:

  • From the SteamVR pop-up, select the SteamVR drop-down menu
  • Click ‘Run Room Setup’
  • Follow the room setup wizard

That usually does the trick.

If you are using a separate machine for development, you should also install the software on your development machine, even though it doesn’t have a Vive attached, skipping the Steam installation and hardware setup steps. The reason is that the SteamVR plugin will look for a path to the SteamVR software on your system when you start up your app, and throw an error. For me, it was error 110 (which unfortunately wasn’t listed on their error code reference). Once you’ve installed SteamVR on your system, you will still get an error (error code 126), but it won’t pause playback, although you will need to disable your SteamVR camera for this case and have a screen camera available to enable. This is useful for debugging during development when you don’t want to go through the steps of deploying to the headset, as I mentioned earlier. I’ll go into more detail on this later on.

Error 126 is completely normal when running directly in the editor without a Vive attached. The error looks like this in the output window:

[Figure 1. Unknown error 126.]

 

For the HoloLens

Microsoft is a little more forgiving on the hardware side of things for HoloLens development compared to the Vive, which is a good thing, since the dev kit costs $3,000. Luckily, they wrote an emulator. I’m fortunate to be able to borrow a HoloLens from M from time to time. However, even then, as with the Vive, or a mobile phone, it’s a bit tedious to deploy to a device every time you write a line of code you’d like to take for a test run. Equally as fortunate is the fact that Microsoft included steps for running on either the HoloLens or the Emulator in their own quick-start guide. The main page of the HoloLens development portal is here. But you should probably just skip right to this page. And here’s some general info about using the emulator.

A quick note about installing the tools: I highly recommend installing them in exactly the manner and order listed on this page. I tried coloring outside of the lines and ran into various issues with certain dependencies and I ended up having to scrap the whole thing and start from scratch. Also, if, like me, you are running the emulator in a virtual machine, note that, obviously, you have to install this whole stack on the VM. In my own case, I have the Unity Technical Preview and Visual Studio 2015 Update 3 installed on both my host OS on my development machine and the guest OS of the VM; but the Emulator is only installed on the VM, since my development machine runs Windows 7, and therefore cannot run the Emulator.

Which brings us to why you might want to use a virtual machine for HoloLens development. The HoloLens toolchain requires Windows 10. If, like me, you are resisting the upgrade from Windows 7, 8, or 8.1 to 10 as long as possible, or it’s just inconvenient to upgrade at the moment, then you will need to set up a virtual machine running Windows 10. For that you will need to install:

  1. Virtualization software that supports both Hyper-V and nested virtualization; as stated above, I opted for VMWare Workstation 12 Pro, which retails for $249.99. I wasn’t able to find freeware software that supported both of these features. Oracle’s VirtualBox supports Hyper-V, but I found the performance laggy and it doesn’t support nested virtualization.
  2. A copy of Windows 10 Enterprise or Windows 10 Pro. Windows Pro retails for $199.99. The Enterprise version is only available via volume licensing; however, a 90-day trial is available for download. Since every time you create a new virtual machine, Windows will see it as a new computer, you could hypothetically extend this indefinitely, but you will have to reinstall everything every 90 days. In my case, I’ll probably run down the trial and then spring for a Windows 10 Pro license. However, it’s good to know the Enterprise trial option is there, especially if you want to quickly spin up a virtual machine for temporarily testing some special case. You can download the ISO here.

 


 

Step 0.b: The Initialization Scene

This step, like others in the tutorial, isn’t technically necessary to meet the goals of the tutorial; but since I wrote this tutorial after the fact, rather than making a project for the purpose of the tutorial, I’ve included it. Like those other steps or techniques, it’s also a Good Idea in general, particularly when you are doing cross-platform development. Why? Well, for instance, as I’ll explain more later, logging is very useful in a cross-platform development environment. However, your logging class needs to be initialized before the other classes in your codebase can use it. In Unity, this is problematic, since there is technically no ‘entry’ point to the app – no Main method. Everything is a MonoBehaviour. And in the MonoBehaviour component lifecycle, all Awake() methods get called in the same phase, in no guaranteed order, then all Start() methods, and so on through the whole lifecycle. So on run 1 of your app, ClassA might be initialized before ClassB; but on run n, it might be ClassB that wins. We have ourselves a little race condition in our game. In my case, I also have a SceneLoader class in my library that makes loading scenes a little more convenient; this class also needs to be initialized before other MonoBehaviours. In some cases, you may also have dependencies between scenes such that you need to completely load a scene before you load another scene.

I like to handle all of these issues in an initialization script in a scene at the beginning of my project that gets all of this stuff out of the way. I also put code that needs to run in a fairly global scope in this script, like code to track the framerate and turn VSync on or off accordingly. You can implement an initialization scene any way you like; this one simply meets the needs of my own project. You can also feel free to skip this step, and any further aspects of the tutorial dependent on it, like runtime logging, if you feel you don’t need it; or stub one out if you think it might be handy later.

To follow this step, in your Unity project create a new scene by selecting File -> New Scene from the menu bar. I like to name my scenes with dot notation syntax using a shorthand for my project – in this case, BBTB – followed by what the scene does. So I named this scene BBTB.Initialization. Name your own scene whatever you like and save it. It’s a useful convention to group your scenes in a folder under Assets labeled ‘Scenes’. Then go to File -> Build Settings and click the ‘Add Open Scenes’ button at the lower right of the pane labeled ‘Scenes in Build’. Once you have more scenes, Unity will allow you to change their load order around by dragging them above or below each other to reorder them. In this case, we are going to want this scene to remain scene 0.

In the Hierarchy view, there will be two GameObjects: a ‘Main Camera’ and a ‘Directional Light’. Select and delete both of these objects. Now right click an empty space in the pane and select Create Empty from the context menu that appears. In Unity, it’s always a good idea to group as many GameObjects as possible under a root GameObject. In fact, certain methods like DontDestroyOnLoad() only work on root GameObjects now – perhaps an attempt by Unity to enforce this practice. I like to use the same dot notation for these root GameObjects that I use for scenes. I called mine Init.Logic.Root. I usually have one called SomeScene.Logic.Root and another called SomeScene.Scene.Root. I use the logic root for scripts that are strictly logical in nature, and the scene root for GameObjects that have a visual artifact or effect in the game. Some scenes only have a logic root, like this initialization scene. Name yours whatever you like and save the scene.

Now create a script by right-clicking an empty space in a folder in the Project pane – I like to label mine ‘Scripts’, and I put my scene scripts in a subfolder called ‘Scene’ – and selecting Create -> C# Script. I called mine BBTBInitialization to stay consistent with my scene-naming convention. Now attach it to the root game object Init.Logic.Root – or whatever you named it – by left-clicking the object in the Hierarchy view and in the Inspector pane clicking the Add Component button – then browse to your script or begin to type its name into the search field to narrow it down. Your scene will look something like this:

[Figure 2. My initialization scene.]

 

Now let’s take a look at the script. Once again, this step is optional. If you’ve decided to simply stub out an initialization scene and script for possible later use, you can skip the rest of this section; this is simply what I’ve implemented so far in my own script. Let’s take it from the top:

[Figure 3. The top of my initialization script.]

 

The first thing to note is the namespace. This is another Good Idea, but not mandatory. For starters, reflection in .Net is really difficult if not impossible without a namespace. Certain methods in the reflection API only work on classes with a namespace. It’s also just a good organizational practice, and organizing classes under either packages or namespaces is standard practice in most OOP languages. I tend to use a naming convention of ProjectName.Domain.[Subdomain1]…[SubdomainN], and try to keep them as shallow as possible. In fact, I rarely descend farther than a single subdomain. You are welcome to use whatever convention pleases you; a common one is:

com.companyName.projectName.[domain].[subDomain1]…[subDomainN]

Personally, I don’t like this convention for game or app development, because this convention was really created for web applications or services and we’re not making a web application – we’re making a binary, compiled application that only exists on certain platforms. You’ll also notice that no major libraries use this convention. .Net, for instance is usually something like System.Something; or for UWP, Windows.Something. In the Flash platform I used to code in, it would have been something like mx.something or fl.something. In Java, it will be something like java.something. I also find digging through 6 subdomains tedious. But use whichever naming convention you prefer.

Following that, above the opening of the class declaration, is an instance of a custom Attribute I use to keep track of my tasks; it’s part of my class library. It logs these tasks to a text file with the assistance of an editor script and sorts them based on priority. Since I’m a one-man band, this makes more sense for me than something like Jira. If you decide to use my class library, there will be more information at the end of the tutorial about how to set it up.

Following that are some inspector properties for some classes I need to be initialized in this script. Inspector properties are the most efficient way to get references to objects in Unity, and you should use them liberally. Why? Because any properties that you declare publicly in Unity like this are serialized with the scene and populated by Unity when the object is loaded, before any of your lifecycle methods run. This means essentially zero initialization time at runtime – the object reference is there as soon as the script that owns it is initialized (actually, before that). In my case, I’m initializing a logging class and the SceneLoader I referred to earlier. Following that are a few private members for tracking frame rate that aren’t relevant to this tutorial. The fourth one, _loadingScenes, is a Dictionary of scene scripts that I use to queue the load of scenes in my project in a specific order. And finally, the getter/setter pair is a property defined in the IInitializableScene interface I have implemented in all of my scene scripts to allow scenes to track when other scenes have finished loading.

You’ll notice in the class declaration before the IInitializableScene reference that my script extends a class called MonoSingleton. This class is part of my Unity-specific implementation of the Singleton pattern. My DebugLogger and SceneLoader classes are also Singletons. I know, I know, Singleton is horrible and the root of all evil. Actually, I don’t agree with that, but a discussion of particular design patterns is beyond the scope of this post. In the case of my initialization class, being the first script loaded, it isn’t going to be inspected for whether or not it’s finished loading, but it still has to implement the interface.
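In case the screenshot above is hard to read, here’s a rough sketch of what the top of a script like this might look like. It is only a sketch: MonoSingleton, DebugLogger, SceneLoader and IInitializableScene are types from my class library (I’m assuming a generic MonoSingleton<T> here), the namespace is just an example of my convention, and the exact member types are illustrative rather than copied from my actual code.

using System;
using System.Collections.Generic;
using UnityEngine;

namespace BBTB.Scenes
{
    // My custom task-tracking Attribute would normally sit here, just above the class declaration.
    public class BBTBInitialization : MonoSingleton<BBTBInitialization>, IInitializableScene
    {
        // Inspector properties: assigned in the editor, so the references are
        // available without any lookup at runtime.
        public DebugLogger Logger;
        public SceneLoader Loader;

        // Private members for frame-rate tracking (not relevant here) and for
        // queuing scene loads in a specific order, keyed by scene script Type.
        private float _deltaTime;
        private Dictionary<Type, IInitializableScene> _loadingScenes =
            new Dictionary<Type, IInitializableScene>();

        private bool _initialized;

        // From IInitializableScene: lets other scenes check whether this one
        // has finished loading.
        public bool Initialized
        {
            get { return _initialized; }
            set { _initialized = value; }
        }
    }
}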

This is the next section:

[Figure 4. The initialization flow in BBTBInitialization.]

 

Next, I declare my MonoBehaviour methods, in this case just Start(). You’ll notice that Start is declared as protected sealed override void Start(). This is because it doesn’t inherit directly from MonoBehaviour, but from a subclass of MonoBehaviour, so I have to override it. If I want the superclass version of Start() to be called as well, I need to call base.Start(), which I’ve done here at the very beginning following best practices for polymorphism and method overriding. Then I call a method to begin the process of initialization. This is just how I’ve organized my initialization; you can organize yours any way you like. I prefer the readability of breaking it up like this, but some people prefer just to put it all in Start(), which of course you are welcome to do.
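In rough sketch form (Initialize() here is just my name for the method that kicks things off, not anything Unity requires), the shape is:

protected sealed override void Start()
{
    // Call the base class (MonoSingleton) version of Start() first.
    base.Start();

    // Then kick off the initialization flow, which ends in PostInitialization().
    Initialize();
}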

My initialization flow ends with a method called PostInitialization, where I kick off some repeating invokes and a coroutine to wait for the scenes to load. The coroutine is currently just a stub for later use, that invokes some code only after all the necessary scenes have been loaded. The UpdateVSync() method does what I described earlier. The LoadScenes() method loads a certain set of scenes in sequential order. Finally, I set the _initialized member to true. Here’s that method:

[Figure 5. The LoadScenes method.]

 

Here, I simply do a null check for my scene and check whether it has been added to the _loadingScenes dictionary. If the instance is defined and isn’t in the dictionary, I add a reference to its Type to the dictionary and load it via my SceneLoader class. You can feel free to just use Unity’s SceneManager class if you prefer (which is what SceneLoader uses under the hood) – the important thing to note is that it waits for the scene to load completely before loading the next scene. The next scene, which doesn’t exist yet, is a UI scene I plan to implement in the future. This is a use case I have run into frequently in Unity projects. The UI, referring to a broad array of objects and properties all over the game, oftentimes throws null object references if the scenes that contain those references aren’t initialized yet; so I load it last (at least among these initial scenes). The UpdateSceneLoadProgress() method outputs the load progress of the scene; in this case I’m just using my logging class to output to the Unity Console pane and print to a log file. In the future, it will probably output to a UI. This is a particularly useful feature in mobile games, by the way; or really any case in which your user’s bandwidth is not the best, in the case of loading the scenes from a network, or their hardware isn’t the best, in the case of loading from disk locally (which, technically, is still a bandwidth issue).
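If you’d rather not use my SceneLoader, here’s a rough sketch of the same idea using Unity’s SceneManager directly. It assumes using UnityEngine.SceneManagement and using System.Collections at the top of the file; the scene name and the progress method are placeholders for your own, and you would kick it off with StartCoroutine(LoadScenesSequentially()) from your initialization flow.

private IEnumerator LoadScenesSequentially()
{
    // Load the main scene additively and wait for it to finish completely
    // before touching the next scene in the queue.
    AsyncOperation load = SceneManager.LoadSceneAsync("BBTB.Main", LoadSceneMode.Additive);

    while (!load.isDone)
    {
        UpdateSceneLoadProgress(load.progress);
        yield return null;
    }

    // Later scenes (a UI scene, for instance) would be queued here in the
    // same way, one at a time, after the previous one has finished.
}

private void UpdateSceneLoadProgress(float progress)
{
    // In my project this goes through DebugLogger to the Console and a log
    // file; plain Debug.Log is fine if you aren't using my library.
    Debug.Log("Scene load progress: " + progress);
}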

That’s it for this scene. There is other code in this class, but it isn’t relevant to this tutorial. We’ve loaded the scene BBTB.Main at this point, which plays the starring role in our next step.

 


 

Step 1: Detecting and Switching Platforms

Now we’re really getting to the juicy bit: actually detecting which peripheral platform we are on and loading the appropriate scene.

It’s fairly straightforward, but it did take a bit of finagling to get it right. I wish I could say that Unity has a setting that could be read that simply tells you which device you are connected to, but that isn’t currently the case; perhaps in the future that will be a feature. For now, we have to detect that via slightly more tedious means.

As you saw in the previous step, I loaded a scene called BBTB.Main. Go ahead and create a new scene, name it whatever you like and save it. This scene is going to have a camera defined, so leave the camera; but go ahead and delete the directional light, as we won’t be using the lighting from this scene to light our content. We’re going to use this camera as our default screen camera in case we don’t detect a headset so we can continue to run our game in the editor, thus restoring the user affordance broken by the SteamVR plugin’s insistence that you have a Vive attached at runtime.

You can organize your scene how you like; following my own convention, in this scene I added two root GameObjects, a logic root and a scene root, named Main.Logic.Root and Main.Scene.Root. Feel free to follow my convention. If you do so, drag the camera onto the Main.Scene.Root GameObject to parent it there. Next we need our scene script. I called mine BBTBMain and added it to Main.Logic.Root. Now uncheck the box at the top of the Inspector pane while inspecting the camera to disable it by default. If you followed my structure, your scene might look something like what you see below:

[Figure 6. The main scene.]

 

Let’s take a look at the top of my scene script:

[Figure 7. The top of my BBTBMain script.]

 

The first thing of note is you’ll see I’ve declared an Inspector property at the top for my screen camera. The type could be Camera. I chose GameObject simply because I don’t need a specific reference to the Camera class; but it doesn’t really matter, either type will work for this example. You can see again that this is a MonoSingleton and I’ve implemented my IInitializableScene interface. More importantly for the purposes of this tutorial, lower down I’ve defined a simple enum with values representing null, HoloLens, a traditional screen, and a Vive. Feel free to structure your enum however you want; and add values for as many headsets as you’d like to support. Having a null value in an enum is just a good practice. Finally, right beneath that, I declare a property of the Type of that enum we just declared to store the value of the current platform and default it, of course, to NULL.
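Since the screenshot is small, the relevant declarations look roughly like this; the member names match what I refer to later in the post, but treat the exact shape as a sketch rather than my literal code.

// Assigned in the Inspector; used as the failover camera when no headset is found.
public GameObject ScreenCamera;

// One value per platform we might detect, plus a NULL default.
public enum PlatformSelection
{
    NULL,
    SCREEN,
    STEAM_VIVE,
    HOLOLENS
}

private PlatformSelection _platformSelection = PlatformSelection.NULL;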

Great. We’re all set to detect some headsets. Finally! Reading the Unity scripting reference, you’ll notice that there are some handy VR-related classes in the UnityEngine.VR namespace. One of them, VRSettings, has a static method called LoadDeviceByName(). Reading the documentation for that method tells us that it will let us load a VR device… and that on some platforms, we may want to wait a couple of frames and then set VRSettings.enabled to true. Unfortunately, it doesn’t seem to work. For both the Vive and the HoloLens Emulator, I encountered thrown exceptions at runtime with this approach. Looking at a log file for the Vive, and the output window for the Emulator, it looked like in both cases the headset was detected just fine without loading the device. Calling VRSettings.LoadDeviceByName() actually confused things, as it seems that the first thing the method does is disable the current device; but then when it tries to reload the device, it seems not to be able to re-initialize it. And of course, we don’t want to re-initialize anything. If it’s already detected and initialized once, we’re done.

In this initial approach, I had all of my cameras in one scene, and based on which peripheral was detected, I was enabling one camera and disabling the others. In addition to the current inspector property there were two others for the Steam and HoloLens cameras. However, since VRSettings.LoadDeviceByName() doesn’t seem to work reliably as advertised – and really I doubt it ever will across all possible devices – I switched to a different approach. In this approach, I have a different scene representing each device; once I detect a device, I simply load the corresponding scene. It takes a little more legwork to set up, but ultimately I prefer this system anyway, as it allows me to maintain neat encapsulation for the code regarding the different headsets. More orthogonality, less spaghetti.

So let’s look at my Start() method, where I perform the detection:

[Figure 8. Detecting the headsets.]

 

The top is crufted up with a bunch of comments, but I left them in there to demonstrate some of the troubleshooting steps I went through to get started, and also because they might continue to be helpful to myself, and to yourself, during development. The first thing I do, since again this is not a direct subclass of MonoBehaviour but rather a grandchild, is call base.Start(). Now to the actual detection. As I mentioned earlier, HoloLens requires Windows 10. The preprocessor directive Unity has defined for this is UNITY_WSA_10_0. This, of course, applies to other Windows Store apps, so we’ll have to do some additional detection to verify this is a HoloLens app. Note that Unity’s documentation states that “Additionally WINDOWS_UWP and NETFX_CORE are defined when compiling C# files against .NET Core”. You’ll see some people online use the NETFX_CORE flag for UWP or HoloLens development; they are basically synonymous. I’ve mentioned UWP before. If you’re like me, and not a traditional .Net developer, you might find Microsoft’s ecosystem a little confusing. To be fair, Apple’s is pretty confusing as well. Both companies are quite old at this point, and both companies have a deep well of operating system code and APIs composed of layers upon layers of technology of varying ages. On the one hand, this allows backwards compatibility. On the other, it’s damned confusing – mainly because of the variety of terms that are used to refer to different bits of the ecosystem, some of which are synonymous and some of which aren’t, and some of which overlap and some of which don’t. And in the case of Unity, we are also throwing Mono into the mix, as well as Unity’s own interpretation and naming conventions regarding both Mono and .Net.

In a nutshell, HoloLens is part of something called UWP, which stands for Universal Windows Platform. You can read more about it at that link. Basically, it’s a new API, which works only on Windows 10 or later operating systems, that represents the newest solution in the Microsoft Universe to the device compatibility issue of making an app that can run on a variety of different devices and still have a usable experience. The main thing you need to know is that UWP is mostly compatible with the .Net API you are accustomed to from using Unity, but certain things like file IO, reflection and cryptography actually have a completely different API. I’ll go into more detail about this in a later section. HoloLens also targets Microsoft’s Windows Store platform, which is mainly a publishing platform, but it also has some API’s of its own. These APIs differ slightly between different Windows OSes, which is why Unity has also defined UNITY_WSA, UNITY_WSA_8_0 and UNITY_WSA_8_1 directives. HoloLens only runs on Windows 10, which is why we’re using UNITY_WSA_10_0. There is also the Windows Phone API, which overlaps with UWP to an extent; but you can basically just ignore that.

In my script, I actually start by detecting the Vive, so I use the directive:

#if (UNITY_STANDALONE_WIN && !UNITY_EDITOR) && !UNITY_WSA_10_0

This only guarantees that we won’t detect the HoloLens, or any other Windows 10 UWP environment. To nail it down a little more, I’ve added the requirement that it be a Windows standalone app – basically any non-UWP desktop Windows app – and that we aren’t running in the editor. The not-in-the-editor requirement is due to the fact that the Vive doesn’t run gracefully in the editor.

Next, I declare a List<string> called supportedDeviceList and populate it from VRSettings.supportedDevices, which is an array. I use a list simply so I can do a quick lookup later with the IList method Contains. You might notice I’ve declared it in both defines. You can of course declare it before both of them and just populate it in the define blocks if you like. Next, I search that list for the device “OpenVR”. OpenVR is the name of the cross-platform C++ SDK upon which SteamVR and the Unity SteamVR plugin are built. If you look under the Plugins folder in your Project pane, you’ll find the OpenVR files in there. The Unity plugin is in fact just that C++ SDK with a C# wrapper to expose it to Unity. It’s possible that there may be other headsets that support OpenVR in the future, so this magic string isn’t future-proof, but until Unity provides us with a more precise string, this seems to be the best we can do for now. There is also a static SteamVR class that has active and enabled properties that I was checking here, but they always returned true when connected to a Vive, so I deemed them unnecessary. For now I think this is sufficient; if it becomes an issue in the future, apart from Unity implementing device detection themselves, the next best thing would probably be to use reflection to detect the presence of the SteamVR class; but for our purposes presently, that step isn’t necessary.

Finally, for the Vive case, all I do is call a method called LoadPlatform() in my class, and pass it PlatformSelection.STEAM_VIVE, then return, because I don’t want to handle any other cases. I’ll cover LoadPlatform() in more detail in a bit.

The HoloLens block is nearly identical. Here, I simply check that we are not running in the editor and that we are running in a Windows Store 10 app – since UNITY_STANDALONE_WIN and UNITY_WSA_10_0 are mutually exclusive. Unlike for the Vive, a HoloLens app actually will run in the editor, I’ve found. However, you won’t have access to any of the features of the HoloLens, so I’ve included the requirement, again, that to detect the HoloLens we have to be running outside the Unity editor. The only difference after that is that I look for the device called “HoloLens” – another magic string defined by Unity – and call LoadPlatform() with PlatformSelection.HOLOLENS, then return.

If we didn’t detect a device, we will skip both of these blocks to the end of the method. There, I simply check that the platform is still NULL and if it is call LoadPlatform with PlatformSelection.NULL. The check is just a bit of insurance in case somehow the PlatformSelection was set to something other than NULL and we still reached this code, in which case something is very wrong and I want the app to fail. Feel free to omit it if you feel it is overkill.
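Putting those pieces together, a minimal sketch of the detection code looks something like the following. It assumes using UnityEngine.VR and using System.Collections.Generic at the top of the file, and it follows the description above rather than being a copy of my actual script.

protected sealed override void Start()
{
    base.Start();

#if (UNITY_STANDALONE_WIN && !UNITY_EDITOR) && !UNITY_WSA_10_0
    // Standalone Windows player, outside the editor, not UWP: look for OpenVR.
    List<string> supportedDeviceList = new List<string>(VRSettings.supportedDevices);

    if (supportedDeviceList.Contains("OpenVR"))
    {
        LoadPlatform(PlatformSelection.STEAM_VIVE);
        return;
    }
#endif

#if UNITY_WSA_10_0 && !UNITY_EDITOR
    // Windows Store 10 app running outside the editor: look for the HoloLens.
    List<string> supportedDeviceList = new List<string>(VRSettings.supportedDevices);

    if (supportedDeviceList.Contains("HoloLens"))
    {
        LoadPlatform(PlatformSelection.HOLOLENS);
        return;
    }
#endif

    // No headset detected: fall back to the screen camera case.
    if (_platformSelection == PlatformSelection.NULL)
    {
        LoadPlatform(PlatformSelection.NULL);
    }
}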

So what does LoadPlatform() do? There is no real voodoo here; it simply loads the scene appropriate to the device detected. Here is what it looks like:

[Figure 9. The LoadPlatform() method.]

 

As you can see, this is a pretty straightforward switch statement. I start with the NULL case, and place the screen logic there, because whether the user has chosen to run in the editor or detection fails at some point, I want the failover default to be to use the screen camera. Skipping down to the bottom, you can see my last case is PlatformSelection.SCREEN, for which I simply call back in with PlatformSelection.NULL. At the beginning of the NULL case, I call Directives.IsRuntimeEditor to check again if we are in the editor. The Directives class is a helper class I wrote that consolidates a combination of preprocessor directives and properties of the Application and SystemInfo classes. If you prefer not to use my library, this check is equivalent to Application.isEditor. After performing a null check, I log a statement about detecting the editor environment – for a version without my logging class, just change DebugLogger.Log to Debug.Log – activate the camera, set _platformSelection to PlatformSelection.SCREEN, and call a method to load the content, then return. If the checks fail and we get past that block, then something has gone wrong, so I log an error message to the Console and throw an exception. My exception message runs a little long, so I wasn’t able to include it here. The full text of the messages reads “Error: No suitable output display was found. Suitable displays for this game are HTC Vive, Microsoft HoloLens, or a monitor.” Right now it’s just a generic exception; technically, the correct way to do it would be to subclass the Exception class with an Exception class specific to this situation, but for now it’s good enough.

The HoloLens and Vive cases are even simpler, and nearly identical: log a message, load the appropriate scene for the platform, and call a method to load the content. I’m using my own SceneLoader class here. To do the same thing without my library, simply replace it with:

SceneManager.LoadScene([SceneName], LoadSceneMode.Additive);
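Put together with plain Unity calls (Application.isEditor in place of Directives.IsRuntimeEditor, Debug.Log in place of DebugLogger.Log, and SceneManager, which assumes using UnityEngine.SceneManagement, in place of my SceneLoader), a sketch of LoadPlatform() might look like this; the scene names are the ones I used in this project:

public void LoadPlatform(PlatformSelection platform)
{
    switch (platform)
    {
        case PlatformSelection.NULL:
            // Failover default: run with the ordinary screen camera.
            if (Application.isEditor && ScreenCamera != null)
            {
                Debug.Log("No headset detected; using the default screen camera.");
                ScreenCamera.SetActive(true);
                _platformSelection = PlatformSelection.SCREEN;
                LoadSplashContent();
                return;
            }

            Debug.LogError("Error: No suitable output display was found. Suitable displays " +
                "for this game are HTC Vive, Microsoft HoloLens, or a monitor.");
            throw new System.Exception("No suitable output display was found.");

        case PlatformSelection.STEAM_VIVE:
            Debug.Log("Vive detected.");
            SceneManager.LoadScene("BBTB.Vive", LoadSceneMode.Additive);
            LoadSplashContent();
            break;

        case PlatformSelection.HOLOLENS:
            Debug.Log("HoloLens detected.");
            SceneManager.LoadScene("BBTB.HoloLens", LoadSceneMode.Additive);
            LoadSplashContent();
            break;

        case PlatformSelection.SCREEN:
            // SCREEN just routes back through the NULL case.
            LoadPlatform(PlatformSelection.NULL);
            break;
    }
}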

The LoadSplashContent() method is very short:

[Figure 10. The LoadSplashContent() method.]

 

Pretty self-explanatory and simple; I only put this code in its own method in case there is additional or different code I want to execute here in the future.
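In sketch form, it’s essentially just this (again using SceneManager in place of my SceneLoader):

private void LoadSplashContent()
{
    // Just loads the content scene additively for now; it lives in its own
    // method in case I want to do something different here later.
    SceneManager.LoadScene("BBTB.SplashScreen", LoadSceneMode.Additive);
}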

That’s it for detection. All we have to do now is implement the scenes we declared here for the Vive and the HoloLens.

 


 

Step 2: Starting with the Vive Scene

Following my convention, I named it BBTB.Vive and defined two root GameObjects in the scene, Vive.Logic.Root and Vive.Scene.Root. I created a scene script called BBTBVive and added it to Vive.Logic.Root.

Now we’re getting into SteamVR Unity plugin setup territory. Valve helpfully provides a quickstart.pdf in the /SteamVR folder once you’ve downloaded the plugin from the Unity Asset Store. Follow those instructions to set up your Steam camera, then drag it onto Vive.Scene.Root, or the equivalent GameObject you defined. Don’t disable the camera this time, as we want it to be enabled by default when the scene loads. This way, the only camera we ever have to worry about activating is the default screen camera. My scene ended up looking like this:

[Figure 11. The Vive scene.]

 

I again deleted the default Directional Light from the scene.

A quick note about Vive player settings: by default, a list called ‘Virtual Reality SDKs’ in Standalone player settings will contain the values ‘None’ and ‘OpenVR’ after you download the SteamVR plugin. I removed the ‘None’ value, since it isn’t handled gracefully by Unity or SteamVR, and also because the ‘None’ case is precisely what we are handling on our own with the default screen camera. I also added ‘Oculus’ to my list in case I want to support it in the future. To get to these settings, from the menu bar click File -> Build Settings, then Player Settings from the button with that label at the bottom of the resulting pop-up window (assuming the ‘PC, Mac and Linux Standalone’ build target is selected, which it should be by default). Player Settings for Standalone app will appear in your Inspector pane. In the accordion fold labeled ‘Other Settings’, look for a list called ‘Virtual Reality SDKs’ near the bottom and add or remove values with the plus and minus buttons below it.

Here’s the Vive scene script. All it does is verify that the Vive is present, and, if not, failover to the default screen camera:

[Figure 12. The Vive scene script.]
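Since the screenshot is hard to read, the idea is roughly the following. I’ve stripped out my library types, and the failover call at the end assumes your main scene script exposes its screen-camera fallback somehow (mine does through the singleton instance); treat this as a sketch of the check rather than my literal script.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VR;

public class BBTBVive : MonoBehaviour
{
    void Start()
    {
        // Double-check that OpenVR really is available now that this scene has
        // loaded; if not, hand control back to the default screen camera.
        List<string> supportedDeviceList = new List<string>(VRSettings.supportedDevices);

        if (!supportedDeviceList.Contains("OpenVR"))
        {
            BBTBMain.Instance.LoadPlatform(BBTBMain.PlatformSelection.SCREEN);
        }
    }
}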

 

I deployed to the Vive after this point, but for the purposes of this tutorial I’m going to cover the remaining scenes first. If you want to read about Vive deployment before continuing, you can skip to the section Step 5.b: Deploying to Vive before reading the next section about the HoloLens scene.

 


 

Step 3: The HoloLens Scene

The HoloLens scene is similarly simple. I created a scene called BBTB.HoloLens. This scene only contains the HoloLens camera, so I didn’t bother adding any root GameObjects. I simply deleted the Directional Light, and renamed Main Camera to HoloLensCamera. This scene has no script. Following the directions from Microsoft, I set the Clear Flags setting of the camera to Solid Color and set the color’s R,G,B,A values to 0,0,0,0, and set the Near value of the Clipping Planes setting to 0.85. That’s it. The HoloLens scene is finished. I said earlier that the HoloLens is a little more involved so I started with the Vive, but that’s mainly due to deployment, which you will read about in Step 5.c: Deploying to HoloLens. You can read Microsoft’s full documentation about setting up a Unity scene for the HoloLens here.

My finished scene looks like this:

[Figure 13. The HoloLens scene.]

 


 

Step 4: Finally, Loading Our Content

Having made it this far, the only thing we have left to do short of deploying to our devices to verify our game works is make a scene with some content in it. So far all we have is a bunch of cameras. I called my final scene BBTB.SplashScreen, as I suspect my first scene will contain some sort of splash content or menu; call yours whatever works for you. This scene likewise has no script. Since we can assume we have one of 3 cameras viewing the scene at this point, delete the camera; but this time, keep the Directional Light. For my own directional light, I set the color of the light to 255, 0, 219, partly just for kicks, and partly out of habit: this sort of acid-pink or magenta color is typically used in the game industry to indicate placeholder content. But the default color works fine. I also didn’t bother to define any root objects at this point, although I probably will in the future. I wrapped up the scene by arbitrarily following the HoloLens instructions for what to place in your ‘Hello World’ scene: a cube, rotated 45 degrees in every axis, and scaled to 0.25. What you place here is completely arbitrary; it just has to show up on the camera. I will say that anything larger than this cube will appear very large in the HoloLens. I’ll cover the differing scale between the devices a little bit near the end.

That’s it. This is what my finished scene ended up looking like:

[Figure 14. Our content scene.]

 


 

Step 5.a: Deploying to the Unity Editor

Hooray! We’re ready to run this thing. I personally like to start my testing with the default case, so let’s start with that. It’s also the easiest to test. Open your initialization scene and click the play button at the top of the Unity Editor window. If you did everything correctly, you should see something like this:

[Figure 15. Boom.]

 

It works! At least, it did for me. Hopefully, if it doesn’t work for you, you didn’t hurl anything at your monitor. Those things are expensive. You’ll notice that the cube appears pretty small in this version; again, I’ll share more thoughts about that later. You’ll also notice the ‘Unknown error (126)’ message at the top of my Console, which, as I stated earlier is completely normal, since no Vive is attached. Speaking of which, if you don’t have any issues to debug first, let’s move on to Vive deployment.

 


 

Step 5.b: Deploying to Vive

As I mentioned near the beginning of this post, my Vive is on another machine in the living room for couch-based gaming. My office, where I do most of my development, is full of desks, office chairs and workstations, so it’s not the best place to set up a Vive gaming environment, which requires a minimum of 6.5’ by 5’ (2m x 1.5m). I don’t have that kind of space available in my office.

Fortunately, Vive deployment is very simple:

  • Perform a standalone build for Windows x86_64 from Unity. Check the ‘Development Build’ box if you want access to debug features like log statements.
  • Copy the build files to a thumb drive. Don’t forget the data folder. I usually put both my data folder and executable in a parent folder.
  • Sneakernet your thumb drive over to the deployment computer and plug it in. If you have a more complicated procedure involving continuous deployment or FTP or SCP, please feel free to use that.
  • On the deployment computer, launch Steam and login.
  • Make sure your Lighthouse sensors and Vive headset are plugged in and running.
  • Click the SteamVR icon in the upper right hand corner of the Steam app to launch SteamVR. Verify that your headset and sensors are recognized.
  • Launch your executable.

That’s it. If everything goes as planned, you should see something similar to the following in the desktop window, and you should also be able see it in your headset. Of course, if you are lucky enough that your development machine and Vive machine are the same, you can skip the sneakernet part:

[Figure 16. Running on the Vive.]

 

You’ll notice that the scale appears to be about the same as any other standalone app.

One thing that I discovered while first troubleshooting the Vive build, before I arrived at these clean steps I’m presenting you with in this post, is that runtime logging is key – a lesson similar to what I learned earlier in my career developing for mobile devices. By logging, I don’t mean simply calling Debug.Log – these messages are available in a development build, but in a barely convenient way. I have personally found the output window Unity supplies with development builds to be a bit buggy. What I mean is logging to a text file, which makes things a lot more convenient, not just from a usability perspective, but also because you can be much more verbose, logging entire stack traces for instance. This is part of the reason why I went to the trouble to integrate my library first, and initialize my DebugLogger, which has this functionality. Here’s what some sample output from several sessions of running my app on my development machine in the editor looks like:

[Figure 17. Logging to a text file.]

 

If you’re debugging, obviously you would log more useful information here. You’re welcome to use my DebugLogger for this purpose, or roll your own. For the HoloLens it was less of an issue for me because I’m mainly deploying to the Emulator at this point, where I can simply read the Output window in Visual Studio while my app is running – another argument in favor of running in the Emulator most of the time for development. If deploying to the HoloLens proper, file logging would again be necessary for debugging. My logging code doesn’t yet work on the HoloLens, but I hope to correct this soon.

 


 

Step 5.c: Deploying to HoloLens

In my case, as I mentioned before, I’m running Windows 7 and don’t really want to upgrade to Windows 10 at the moment, which the HoloLens toolchain requires. So I’m running VMWare Workstation 12 Pro with a trial of Windows 10 Enterprise installed. But other than that, the steps here will be the same for you. If you’re interested in a similar setup, you should be able to find the necessary details in Appendix B: My setup.

Before we can deploy though, I need to note that the SteamVR code is not compatible with UWP. Therefore, when you first try to build for HoloLens, you’ll get a boat-load of errors emanating from the SteamVR code. This is just part of the reality of cross-platform development. Unfortunately, the most straightforward solution I could find was what some software engineers call a ‘monkey hack’: editing SDK source files. Fortunately, Valve provides you with source to edit. All this involves is opening all of the SteamVR source files and placing this preprocessor directive around the entire file:

#if UNITY_STANDALONE_WIN && !UNITY_WSA_10_0

#endif

You’ll probably notice this is similar to the directive we used in our detection script, minus the !UNITY_EDITOR directive. You can also use the NETFX_CORE directive. Mine is a little more explicit and will work for situations other than just the HoloLens; for instance, deploying to Windows 10 standalone UWP in addition to Vive, which would also throw errors. You could do something more complicated by launching Unity in batchmode and deleting the Steam files from the project, which I might do myself in the future, but this works well enough for now.

The files you’ll need to modify are under /Assets/SteamVR/Scripts, /Assets/SteamVR/Editor and /Assets/SteamVR/Extras. Once you’ve done this, you shouldn’t receive any more errors from the SteamVR code at build time for the HoloLens.
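For example, a wrapped file ends up looking roughly like this; SteamVR_ExampleScript.cs is a made-up stand-in for whichever SteamVR source file you’re editing, and the whole file, using statements included, goes inside the directive:

#if UNITY_STANDALONE_WIN && !UNITY_WSA_10_0
using UnityEngine;

public class SteamVR_ExampleScript : MonoBehaviour
{
    // ... the original SteamVR code, untouched ...
}
#endif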

 

Building the initial Visual Studio project

If you’ve ever deployed to iOS or Mac OS X, the flow for building for HoloLens will be somewhat familiar. For Apple OSes, it isn’t possible to build directly from Unity. Instead, Unity generates an intermediate Xcode project; you finish the build from Xcode. Similarly, for HoloLens, Unity builds an additional Visual Studio project from which you have to complete the final build. To do this, open File -> Build Settings from the menubar, and select the Windows Store build target in the Platform pane. Note that if you aren’t running Windows 10, this build target won’t be available. It will still show up here, but it will simply display the message “No Windows Store module loaded.” Below that will be a button labeled ‘Open Download page’ which links to a non-existent web page that throws a 404 at the time of this post.

One thing I forgot to mention is that there are some features of UWP, which Microsoft calls Capabilities, that you will want to enable to avoid errors at runtime in your HoloLens app. These are InternetClient, Microphone and SpatialPerception. You can find these under the Player Settings for the Windows Store target in the Build Settings pop-up. In the inspector, they will be at the bottom of the Publishing Settings accordion fold in a list of checkboxes. Check the boxes for these 3 capabilities. I should also mention that, as with Vive, I removed the ‘None’ option from ‘Virtual Reality SDKs’ under the Other Settings fold.

If you are using a VM like me, you’ll want to copy your project to your VM. I’m using VMWare, which allows this with the right extension installed (it’s not installed out of the box, but you get prompted to install it the first time you try to copy and paste something from your host OS to your VM). If you don’t use VMWare, in addition to Hyper-V and nested virtualization, you might want to make sure it supports copy-and-paste and/or drag-and-drop between the host and guest OSes. Otherwise, you can again use FTP, WinSCP, shared drives or whatever solution you like best. Since it’s on the same physical machine, I prefer just to copy and paste. I recommend deleting the Library folder, the .sln and .csproj files and the .vs folder (which is a hidden folder, so you’ll have to have hidden items visible in Windows Explorer) before you copy your project over. These files are generated by Unity and Visual Studio and just add additional, unnecessary cruft. Also, the hashes generated in the Library folder are specific to each machine, and in rare instances using another machine’s Library folder can corrupt your instance of the project. In fact, while we are on the topic, you should ignore these files in whatever source control solution you use. This is what my .hgignore file for this project looks like, for instance:

syntax: glob

code/BBTB-Unity/Library/

code/BBTB-Unity/*.csproj

code/BBTB-Unity/*.sln

code/BBTB-Unity/*.userprefs

code/BBTB-Unity/Temp/

code/BBTB-Unity/.vs/

code/BBTB-Unity/obj/

Once you’ve selected the Windows Store target on your machine of choice, select ‘Universal 10’ from the drop-down for ‘SDK’. For ‘UWP Build Type’, select ‘D3D’. For ‘Build and Run on’, select ‘Local Machine’. Then check the ‘Unity C# Projects’ box. Finally, click the Build button and select a location for your HoloLens build. If everything goes as planned, your instance of Unity will successfully build an intermediate Visual Studio project with no errors. If you are using my class library, make sure you deploy the .dll under bin/UNITY_WSA_10_0 to your Unity project before you attempt the Windows Store build. I’ll cover this in more detail in the section below about troubleshooting UWP compatibility issues. Suffice it to say, my library contains some of the IO, crypto and reflection incompatibilities I mentioned before. For now, these features of my library are simply disabled until I replicate them for UWP. The section below will explain how to go about that with your own code. If you fail to use the right .dll, the build will finish and you’ll have a seemingly working Visual Studio project until you try to build it, at which point it will fail; you’ll notice that although the Unity build seemed to succeed, it had a lot of errors in the Console. Here’s what the outcome of a successful Unity build looks like on my VM:

[Figure 18. A successful Windows Store build.]

 

You’ll notice that there are a lot of warnings, but no errors. You might also notice that all of the warnings are from the SteamVR code. Unity should open the containing folder of the build in Windows Explorer for you. It will look something like this inside:

[Figure 19. Inside a successful Unity Windows Store build.]

 

Deploying to the Emulator

Once you have a successful Windows Store build, launch the project in Visual Studio by double-clicking the solution file. In Visual Studio, right-click Package.appxmanifest and select View Code from the context menu. This will open the manifest as an editable XML file. Look for an XML node called TargetDeviceFamily and change the Name attribute from Windows.Universal to Windows.Holographic. Then replace the MaxVersionTested attribute with 10.0.10586.0. Save this file and close it. Now change the build-target drop-downs at the top of the pane to ‘Release’, ‘x86’ and ‘HoloLens Emulator 10.0.14393.0’ and select Debug -> Start Without Debugging from the menu bar. This build will take a while; if everything goes as planned, it will build the project without errors, and open the emulator. Be patient, as once the emulator launches, it should automatically load your app onto the virtual HoloLens instance and launch it, but it takes several seconds after Windows Holographic launches for this to happen. Once successful, you will see something like this:

[Figure 20. Successfully deployed to the HoloLens Emulator.]

 

You can see here that the scale of the cube is much larger here than on the standalone and Vive versions; it will look much the same as this on the actual device. I have some thoughts about this that I’ll share at the end of the post.

 

Deploying to the HoloLens

For instructions for deploying to the HoloLens, refer to this documentation from Microsoft.

 

Troubleshooting UWP compatibility issues

As I mentioned, UWP has a slightly different API for certain tasks than your typical desktop .Net app. How do we address this? If you have your own library of code, you are likely to run into similar compatibility errors like the ones I experienced with my own class library. Unfortunately, Microsoft’s documentation on this is every bit as confusing as the ecosystem itself. Helpfully, they have provided a list on MSDN of namespaces that are compatible with UWP.

That list is prefaced with this helpful text:

The following list displays the namespaces in .NET for UWP apps. Note that .NET for UWP apps includes a subset of the types provided in the full .NET Framework for each namespace. For information about individual namespaces, see the linked topics.

Unhelpfully, when you drill all the way down to a specific type, there is no actual information about which members are UWP-compatible, despite promises on other pages that this information will be available on the page for a type. In some cases, the listed types are even wrong. Take the page for System.IO. It lists the FileMode enum as supported, despite protestations at compile time from Visual Studio to the contrary.

If you are using external class libraries there is also another issue. For code you have authored within Unity itself, you can just use the provided preprocessor directives to quarantine your code. But an external class library authored directly in Visual Studio has no notion of these directives; you have to define your own. To do this, you will need to define custom build targets in Visual Studio and then define any preprocessor directives you would like to reference. By the way, this is why, if using my class library, you need to deploy the .dll from /bin/UNITY_WSA_10_0 to the Unity project before attempting the HoloLens build. To do this in your own project follow these steps:

  • In Visual Studio, right-click your solution in the Solution Explorer and click Configuration Manager.
  • In the resulting Configuration Manager pop-up, select the ‘Active Solution Configuration’ drop-down and select the ‘<New…>’ option.
  • Another pop-up labeled ‘New Solution Configuration’ will appear. Enter a name and click the OK button. I defined several targets, basing them on Unity’s preprocessor directives. At a minimum, you will want one for UNITY_STANDALONE_WIN and UNITY_WSA_10_0. Close the Configuration Manager.
  • Back in the Solution Explorer, right-click your project this time (not the solution) and click Properties, then click the Build tab on the left.
  • Your new configurations – build targets as I call them – will appear in a drop-down at the top of this tab.
  • If you select one of your configurations, you will notice a field near the top labeled ‘Conditional compilation symbols:’; for UNITY_STANDALONE_WIN I defined these as ‘UNITY_STANDALONE’; for UNITY_WSA_10_0, I entered ‘WINDOWS_UWP;NETFX_CORE;UNITY_WSA_10_0’. This will allow you to reference these directives from your class library as if you were authoring from the Unity environment.

After you’ve executed these steps, you’ll be able to build to these configurations via the normal process simply by selecting your target from the drop-down instead of ‘Build’, ‘Release’ or ‘Debug’. Note that you can also access the Configuration Manager from this drop-down, as well as from Solution Properties.

Great, so now how do we address these errors? To begin with, simply follow the steps above for a HoloLens build from Unity. The build will finish in Unity with quite a lot of errors depending on how many incompatible features you are using. For example:

[Figure 21. Yikes.]

 

Fortunately, it’s not as bad as it looks. Most of the errors are duplicates, or errors that are only occurring as a side effect of some other error that preceded it. I’ve found that the very first error contains a list of all of the relevant errors at the bottom; so start by selecting the first error and scrolling to the bottom of the output pane in the lower half of the Console. Most of this verbiage we can ignore; the actual errors are at the bottom, preceded helpfully by the string ‘Error: ‘. It will look something like so:

[Figure 22. The alpha error. Actual errors we care about are highlighted.]

 

If you right-click the output section, you’ll get a context menu with a single option: ‘Copy’. Click that and then paste it into the text editor of your choice for easier readability. I just used Notepad. Then go through them one-by-one. Most of the errors you get will be about a particular type not being supported. You won’t get a line number, just the unsupported type and the offending file. In my case, it only amounted to 6 or 7 files in the end; not nearly as bad as the 92 errors thrown would seem to indicate.

For now, I didn’t want to actually fix any of these errors, which is going to take some time. I just want to quarantine them so my HoloLens build works; then I can go back and fix them at my leisure, safe in the knowledge at least that it will run on the HoloLens. It’s similar to the notion in TDD of starting with a failing unit test. This code, initially, will basically just fail to work on a HoloLens build, albeit without errors. We accomplish this with the preprocessor directives we defined in our custom Visual Studio Configuration. Here’s an example from my DebugLogger class:

[Figure 23. Quarantining UWP-incompatible code with preprocessor directives.]

 

All I’ve done here is wrap the offending code in an if/else preprocessor block prefaced with the directive !UNITY_WSA_10_0. In the UWP section (the else block), I’m simply logging an error. The only major variation on this is that if your method has a return type, you’ll have to return a default or null value instead. There also may be some cases where you have to comment out an entire class or a using statement, but most of your mitigated errors will end up looking like this if you follow my pattern.
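Here’s the same pattern applied to a hypothetical file-logging method rather than my actual DebugLogger code (_logFilePath is an assumed field), including the variation for a method with a return type:

public void WriteToLogFile(string message)
{
#if !UNITY_WSA_10_0
    // Desktop .Net: System.IO works as usual.
    System.IO.File.AppendAllText(_logFilePath, message + System.Environment.NewLine);
#else
    // UWP: quarantined for now; just report that the feature isn't available yet.
    UnityEngine.Debug.LogError("File logging is not yet implemented for UWP.");
#endif
}

public string ReadLogFile()
{
#if !UNITY_WSA_10_0
    return System.IO.File.ReadAllText(_logFilePath);
#else
    // When the method has a return type, return a default or null value instead.
    UnityEngine.Debug.LogError("File logging is not yet implemented for UWP.");
    return null;
#endif
}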

Once you resolve the first grip of errors, compile, re-deploy the new .dll to Unity and build again, you will probably get another grip of errors. For me, it took 3 or 4 rounds of this to resolve every error.

As far as actually correcting the errors, your guess is as good as mine at this point because I’ve only gone as far as researching how to solve them, not actually implementing any of the solutions. I found this article about binary serialization for Windows Phone 8; I think it will apply to UWP 10 on HoloLens as well, but haven’t tried it yet. The selected answer on this Unity Forum post provides a great review of the problem and how to go about addressing it, as well as a link to the answerer’s UWP-compatible IO library available on the Unity Asset Store for $15. This folder of a git project contains UWP-compatible solutions for various .Net namespaces. This MSDN page details how to create, write and read a file for UWP. I could continue, but I’ll let you and Google take it from here.

 


 

Final Thoughts

 

About scale

As we saw, the scale in standalone builds, both in the Editor and deployed to the Vive, is much less intimate than on the HoloLens. It seems like the Vive uses a normal 1:1 scale ratio to Unity, whereas the HoloLens is something like 4:1. This is going to create a radically different experience between devices and possibly some serious development headaches. My first and only thought about this at the moment is a technical solution using the UnityEditor AssetPostprocessor class. If you haven’t utilized AssetPostprocessor before, I highly recommend it. Here’s the reference page for it. Basically, it defines a number of events for assets within the Unity Editor itself; for instance, when a material is assigned or a model has been imported. In fact the latter event is what I have in mind; it’s called OnPostprocessModel. My thought is that any time a model imports, you automatically attach a script to it – which you can do through the UnityEditor API – that reads the device platform at runtime during start-up. If it’s Vive or the Editor, the script does nothing; if it’s HoloLens, the model gets scaled down to 0.25.
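Here’s a rough sketch of that idea. I haven’t built it yet, so take it as a starting point: ModelScalePostprocessor and PlatformScaler are hypothetical names, and the preprocessor define stands in for whatever runtime platform detection you prefer.

```csharp
// Editor/ModelScalePostprocessor.cs – runs inside the Unity Editor at import time.
using UnityEditor;
using UnityEngine;

public class ModelScalePostprocessor : AssetPostprocessor
{
    // Called after a model finishes importing; attach the runtime scaler to it.
    void OnPostprocessModel(GameObject model)
    {
        model.AddComponent<PlatformScaler>();
    }
}
```

```csharp
// PlatformScaler.cs – hypothetical runtime component attached by the postprocessor.
using UnityEngine;

public class PlatformScaler : MonoBehaviour
{
    void Start()
    {
#if UNITY_WSA_10_0 && !UNITY_EDITOR
        // HoloLens/UWP build: scale the model down to roughly a quarter.
        transform.localScale *= 0.25f;
#endif
        // Vive or Editor: leave the imported scale alone.
    }
}
```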

 

About game or app design

There are also going to be some serious game design discrepancies between different versions. For instance, in the Vive version, I don’t really need to do anything different. However, the HoloLens is an AR device; it isn’t intended to encapsulate you completely in an environment. In fact, since it allows you total freedom of movement, it’s probably not safe to do so. So for the HoloLens version of my game, I’m going to have to turn off the background assets, and probably also add an additional stage and/or scene(s) at the beginning of the game where the player chooses where to place set pieces in the game.

Input is also completely different between these two devices. Vive controllers are somewhat like a cross between a Wii controller, a PlayStation Move and a Steam controller, with multiple sources of input. The HoloLens is more like the Kinect, where input is gestural, except only a couple of gestures are available at the moment. So this is something else I’m going to have to take into consideration when designing my interactions.

Before you embark down this particular path, make sure you fully understand all of the design implications for your game or application; it may not be the right choice for you.

 

Final final thoughts

We did it! Well, I did anyway; hopefully this post allowed you to do so as well. If so, then you now have the ability to deploy a VR or AR application to both Vive and HoloLens, as well as, potentially, to as many additional VR devices or platforms as you desire using the same techniques. You should also understand the compatibility pitfalls in developing cross-platform Unity apps for HoloLens and UWP, and have a rough idea of how to go about correcting them. You can also continue to run your app in the Unity Editor during development without it breaking the Editor. At the very least, you now know it can be done. If you decide to use my class library, you should be able to log to a file for debugging purposes, at least for Vive builds; I should support HoloLens logging in the near future. Along the way, you hopefully learned something about building and using external class libraries for Unity, if that’s something you haven’t done before.

I truly hope you found this tutorial post helpful. If you find any errors, or have any feedback or questions, you can contact me by reaching out to the eleVR team; they’ll pass your message along.

 


 

Appendix A: An Aside on Using External Libraries for Unity Development

One of the main benefits of using Unity with the Mono/.Net stack is the ability to use external libraries in the form of .dlls. I’m doing this for this game, and this tutorial relies on code from a class library I’ve written. You don’t need to do the same thing, and you don’t need to use my class library; but I encourage the former, since it speeds up development a great deal when you would otherwise be implementing the same boilerplate for the nth time, and you’re welcome to do the latter.

Using an external library in Unity is quite simple. Create a class library project in Visual Studio and build a .dll from it. To deploy it, copy and paste the .dll from the /bin folder of your Visual Studio project into a folder in your Unity project; I usually drop them under a folder I label ‘Assemblies’. Note that there are different flavors of class library: general deployment, mobile, and ‘Portable’, which actually refers specifically to UWP (I’ll explain a little more about UWP later). If you want it to work on most platforms, ironically, you won’t want the portable option, but just a vanilla class library. As we’ll see, though, you will run into compatibility issues. If you don’t want to deal with these and you know you will only be developing for HoloLens and other UWP environments, or if you wish to create separate libraries on a per-platform basis, then choose the portable option.
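As a trivial, concrete example, a class in such a library and its use from a Unity script might look like the following. The names here are hypothetical, not taken from my library:

```csharp
// In the Visual Studio class library project, compiled to MyGameLib.dll
// and then copied into Assets/Assemblies/ in the Unity project.
namespace MyGameLib
{
    public static class MathHelpers
    {
        // Clamp a value to the 0..1 range.
        public static float Clamp01(float value)
        {
            if (value < 0f) return 0f;
            if (value > 1f) return 1f;
            return value;
        }
    }
}
```

```csharp
// In a Unity script, the library's classes are then available like any other.
using UnityEngine;
using MyGameLib;

public class ClampExample : MonoBehaviour
{
    void Start()
    {
        Debug.Log(MathHelpers.Clamp01(1.5f)); // prints 1
    }
}
```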

 

Versus Plugins

You can also drop a .dll into a folder under /Assets labeled ‘Plugins’, which makes it, strangely enough, a plugin. What’s the difference between a plugin and a class library? Technically, nothing; they are both the same kind of .dll, and there is no difference in how they are compiled. With a class library, you will have access to all of the classes in the library just like any other class available to you via UnityEngine.dll or UnityEditor.dll. However, as I already mentioned, you will have to resolve any compatibility issues inside the library itself. I prefer this approach personally: breaking things up into multiple .dlls based on domains of compatibility neatly encapsulates those domains, but it will probably lead to dependency issues that still require you to resolve compatibility problems in the non-compatible .dll anyway.

For instance, you might have foo.dll and bar.dll. Foo.dll is your main, generic library that you use as a class library. Bar.dll is a plugin you wrote to handle compatibility issues with BarOS. Damn BarOS devs, they never do things like the other major OSes. Anyway, bar.dll relies on a class from foo.dll, called FooBaz. Now you have 3 options: you can write and maintain two versions of FooBaz; you can just refactor bar.dll so you don’t depend on FooBaz, which may or may not require you to duplicate functionality from FooBaz somewhere else in bar.dll, say scoped to a specific method; or, if you run into compatibility issues down the road with FooBaz, rendering the bar.dll plugin broken, you can make FooBaz cross-compatible. None of these seem like ideal options to me.

However, there may be a reason why you want or need a plugin over a class library. Keep in mind that the other big difference between a plugin and a class library is that you will have to import every method from every class you wish to use in Unity via the .Net DLLImport attribute. Read here for more about creating class libraries for Unity, and here for Unity plugins. More information about the DLLImport attribute can be found here.
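For reference, importing a function with DllImport looks roughly like this. The plugin name ‘bar’ and the exported function ‘add_numbers’ are hypothetical; this assumes a native plugin .dll placed under Assets/Plugins:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class NativePluginExample : MonoBehaviour
{
    // Import a single exported function from the hypothetical bar.dll plugin.
    [DllImport("bar")]
    private static extern int add_numbers(int a, int b);

    void Start()
    {
        Debug.Log(add_numbers(2, 3)); // prints 5
    }
}
```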

 

Debugging & Troubleshooting a Class Library in Unity

Past versions of Unity didn’t automatically convert .pdb files into the .mdb files you need for debugging a .dll, and the pdb2mdb.exe that ships with Visual Studio isn’t compatible with Mono. This led to various errors in Visual Studio when attempting to run that executable from a build event. Fortunately, the newest versions of Unity include a standalone version of pdb2mdb.exe that is compatible with Mono. All you have to do is copy the .pdb file compiled with the .dll into a folder under /Assets in your Unity project, and the .mdb will be generated automatically. Then, in Visual Studio, after selecting Debug -> Attach Unity Debugger from the menu bar and selecting your Unity project, you will be able to step into methods in the .dll from breakpoints in your own code (unfortunately, you won’t be able to set breakpoints directly in the VS project for the external library). If for some reason that isn’t working for you and you happen to be using my library, I’ve included a copy of this standalone version of pdb2mdb.exe under the /code/Tools directory of that project. You can read more details about the current state of debugging symbols for external class libraries with Unity here. You can download this version of pdb2mdb.exe here.

 

If You’d Like to Use My Library

My external class library is available here in release form as an archive. All content, including readmes and documentation, is contained in the archive.

 


 

Appendix B: My setup

I think it’s important you know how I’m set up, in case you run into issues that might be a result of your own setup varying from mine. Feel free to skip this section and just use it as a reference if you run into any issues:

 

Main development machine and environment

OS: Windows 7

CPU: Intel i7 6900K

GPU: EVGA Geforce GTX Titan Black, 6GB GDDR5

RAM: 128 GB

Unity version: 5.4.0f3-HTP*1

Unity license: Personal

Visual Studio version: Visual Studio 2015 Update 3*

 

For the Vive (on a separate machine)

OS: Windows 7

CPU: Intel i7 3930K

GPU: EVGA Geforce GTX 970 SC Gaming ACX 2.0, 4GB GDDR5

RAM: 32 GB

Vive hardware version: product 128 rev 2.1.0 lot 0/0/0 0

Vive firmware version: 1462663157 steamservices@firmware-win32 2016-05-07 FPGA 1.6

 

For the HoloLens

OS: Windows 10 Enterprise Trial, running in a VM

CPU: Same as development machine; 4 of 16 cores allocated to VM**2

GPU: Same as development machine; 1 GB VRAM allocated to VM**2

RAM: Same as development machine; 16 GB allocated to VM**2

VM Software: VMWare*3 Workstation 12 Pro Trial, 12.1.1 build-3770994

HoloLens version: HoloLens Emulator, build 10.0.14393.0*4

 

Source control

Mercurial**5 running on a home server


* Required at the time of writing.

** Recommended.

1 The HoloLens development kit is currently in beta and requires the Unity 5.4.0f3-HTP Technical Preview.

2 You might not have a machine with specs as beefy as mine, which I just upgraded. If you have a machine that can spare it, I highly recommend these settings; below them, you will start to see performance lags in both the VM instance and the HoloLens Emulator. The HoloLens Emulator requirements can be found on this page.

3 There might be other VM software that supports this setup; if you use something other than VMWare, make sure it supports nested virtualization. I tried VirtualBox, which is free, but unfortunately it does not support nested VMs, so it could not run the Emulator, which itself runs in Windows 10’s native virtualization solution, Hyper-V.

4 Required if you are running the emulator. You don’t have to run the emulator in another VM. If you have Windows 10 installed as your main OS, you can just run it from there.

5 I highly recommend using hg, but you don’t have to. The important thing is not to use git or svn. The main reason to avoid the former is that it handles large binaries poorly, and you will have a lot of them in a game. Although git can technically store binary files, it is notorious for choking on large numbers of binaries or on large individual binaries, a shortcoming Linus Torvalds has admitted; it wasn’t really designed with binaries in mind. In the case of svn, it can handle binaries, but due to the inefficient way it runs, commits get slower and slower the larger your repository gets; and it also isn’t distributed. Mercurial is the best of both worlds in my opinion: distributed and efficient like git, without all of the extra commands that only a large, open-source project would need, and able to handle large binaries. I highly recommend the free tier of Atlassian’s Bitbucket service and their free GUI source control app, Sourcetree. Bitbucket supports git and hg, and will even import git/GitHub projects to hg. I can also recommend Perforce if you don’t like hg. Perforce is not free and is difficult to administer, but the Perforce client tools are free, and there is a relatively affordable third-party, web-based service called Assembla that offers Perforce as an option. Perforce is the source control platform of choice for many large game studios. You could also not use source control at all. I hope you understand that would be a mistake.


All the links

http://answers.unity3d.com/questions/579641/handling-the-lack-of-systemio-classes-in-windows-8.html

https://www.assetstore.unity3d.com/en/#!/content/26461

http://blog.springwald.de/blog/post/creating-class-libraries-for-unity3d-using-visual-studio

https://bitbucket.org/droidknot/droidknot-kitchensink-unity-release

https://developer.microsoft.com/en-us/windows/holographic/

https://developer.microsoft.com/en-us/windows/holographic/holograms_100

https://developer.microsoft.com/en-us/windows/holographic/install_the_tools

https://developer.microsoft.com/en-us/windows/holographic/using_the_hololens_emulator

https://developer.microsoft.com/en-us/windows/holographic/unity_development_overview

https://developer.valvesoftware.com/wiki/SteamVR/Error_Codes

https://developer.viveport.com/us/develop_portal/

https://docs.unity3d.com/Manual/UsingDLL.html

https://docs.unity3d.com/ScriptReference/AssetPostprocessor.html

https://docs.unity3d.com/ScriptReference/AssetPostprocessor.OnPostprocessModel.html

https://github.com/windowsgamessamples/UnityPorting/blob/master/PlatformerPlugin/MyPluginUnity/Legacy/System/IO/File.cs

http://jacksondunstan.com/articles/3052

https://jevgeni.net/2013/06/04/windows-phone-8-serialization-comparison/

https://msdn.microsoft.com/en-us/library/aa984739(v=vs.71).aspx

https://msdn.microsoft.com/en-us/library/mt185483.aspx

https://msdn.microsoft.com/en-us/library/mt185496.aspx

https://msdn.microsoft.com/en-us/library/mt185501.aspx

https://msdn.microsoft.com/en-us/windows/uwp/files/quickstart-reading-and-writing-files

https://msdn.microsoft.com/en-us/windows/uwp/layout/design-and-ui-intro

https://technet.microsoft.com/en-us/windows/dn783436.aspx

https://www.vive.com/us/setup/