
Lab Notes: Augmented Reality with the Microsoft HoloLens

We explore the possibilities of an augmented reality interface using QR codes and Internet of Things sensors


While wearable technology usually calls to mind devices such as smartwatches and fitness trackers, one exciting new avenue of research is augmented reality glasses. Virtual reality (VR) takes the user to a whole new virtual space, but augmented reality (AR) instead aims to enhance the real world by layering software interactions onto physical things, through media such as phone screens or glasses. This allows for a whole new approach to user interfaces, because programs can now interact with the physical world in more meaningful ways.

For example, what if you could get turn-by-turn Google Maps directions through your sunglasses, instead of having to constantly look down at your phone? Or, what if you could watch a baseball game with an overlay of stats about each player? The possibilities created by AR are far more expansive than those of VR, and they present an exciting challenge for developers.

Microsoft’s approach to AR comes in the form of the HoloLens, a set of glasses that allows for the creation of “holograms” in the physical world. Because the glasses have a transparent screen built in, images can be drawn on top of the real world to create the illusion of a hologram. Furthermore, because the headset has spatial mapping capabilities, holograms can actually stay anchored in place as a user walks around them.

[Image: Richard trying out the HoloLens]

What We Did

Our goal was to create a HoloLens app capable of interacting with the various temperature sensors in the office, so that when the user looks at one of the sensors, a 3D overlay appears over the device with the current temperature reading. Because the HoloLens has Bluetooth capabilities, we first tried to communicate with the sensors directly through Bluetooth.

To accomplish this, we had to learn the Universal Windows Platform (UWP) API, which Microsoft created for developing apps that run on any kind of Windows device. However, even though the API is marketed as a universal system, certain aspects of it change when writing apps for the HoloLens. Specifically, the UWP library is not available inside the Unity Editor, meaning that any code that uses it will look like an error to Unity. This can be fixed by enclosing all UWP code in C# preprocessor directives that tell the editor to skip that code when compiling, but this was not well described in the documentation. Although we could now theoretically access the UWP library on the HoloLens, we still had trouble getting the Bluetooth module specifically to work. After some head-bashing, we eventually decided to change our approach.
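For reference, here is a minimal sketch of that guard pattern, relying on Unity's ENABLE_WINMD_SUPPORT define (set only on UWP builds); the class, method, and the Bluetooth probe itself are illustrative, not our actual code:

```csharp
using System;
using System.Threading.Tasks;
#if ENABLE_WINMD_SUPPORT
using Windows.Devices.Bluetooth;
#endif

// Sketch of the conditional-compilation workaround. ENABLE_WINMD_SUPPORT
// is defined by Unity only when building for UWP, so the Unity Editor
// never tries to compile the Windows-only code below.
public static class UwpGuardExample
{
    public static async Task<string> DescribeBluetoothAsync()
    {
#if ENABLE_WINMD_SUPPORT
        // This type only exists in the UWP API surface; without the guard,
        // the Unity Editor would flag it as a compile error.
        BluetoothAdapter adapter = await BluetoothAdapter.GetDefaultAsync();
        return adapter != null ? adapter.DeviceId : "no adapter found";
#else
        // Stub used inside the editor, where the UWP API is unavailable.
        await Task.CompletedTask;
        return "UWP APIs unavailable in the editor";
#endif
    }
}
```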

As documented in an earlier blog post, we wrote a Bluetooth server designed to act as a bridge for the temperature sensors in the office. Shortly after publishing that post, we expanded the server to include a web interface, which meant writing a REST API for accessing data from the thermometers. Using the web request library in the UWP API, we were then able to fetch thermometer temperature data by making GET requests to the server running on the Raspberry Pi.
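A minimal sketch of such a request using the UWP Windows.Web.Http client; the host, port, and route are illustrative stand-ins for the real endpoints on the Raspberry Pi:

```csharp
#if ENABLE_WINMD_SUPPORT
using System;
using System.Threading.Tasks;
using Windows.Web.Http;

// Sketch of fetching a reading over the REST bridge. The URL layout is
// hypothetical; the real endpoints live on the Pi server described in
// the earlier post.
public static class SensorClient
{
    public static async Task<string> GetTemperatureAsync(string sensorName)
    {
        using (var client = new HttpClient())
        {
            var uri = new Uri("http://raspberrypi.local:5000/sensors/" + sensorName + "/temperature");
            // GetStringAsync performs the GET and returns the response body.
            return await client.GetStringAsync(uri);
        }
    }
}
#endif
```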

Now that we could obtain data about the thermometers, we needed a way to display it in 3D space such that each temperature display was mapped to its corresponding location in real life. To achieve this, we utilized Vuforia, an add-on for Unity designed for image recognition and tracking. Vuforia introduces the concept of “Image Targets”: designated images that, when recognized, trigger an event such as a 3D model appearing. The model is then tracked in relation to the image target, so it stays anchored to the image even as the user walks around.
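For context, the Vuforia Unity extension exposes target detection through a trackable event callback. A rough sketch, following the pattern of Vuforia's DefaultTrackableEventHandler (the script and field names here are illustrative):

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical handler: shows the attached content only while the image
// target is being tracked.
public class TargetVisibility : MonoBehaviour, ITrackableEventHandler
{
    public GameObject temperatureLabel; // the 3D text to toggle

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        // Treat detected, tracked, and extended-tracked states as visible.
        bool tracked = newStatus == TrackableBehaviour.Status.DETECTED
                    || newStatus == TrackableBehaviour.Status.TRACKED
                    || newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        temperatureLabel.SetActive(tracked);
    }
}
```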

Because Vuforia is capable of distinguishing between different QR codes, we created a QR code to represent each sensor, and then turned the QR codes into Vuforia image targets. We then set the triggered model to be 3D text, and attached a custom script that fetches the sensor data from the web and sets the model's text to the sensor's current temperature. This allowed us to place the QR codes next to the sensors; upon looking at a code (and, by extension, its sensor), we saw the temperature output in real time, in 3D.
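A minimal sketch of such a script. On-device we went through the UWP web request library as described above; Unity's own UnityWebRequest is used here as a stand-in so the example is self-contained and also runs in the editor (the endpoint and names are illustrative):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical script attached to the 3D text under each image target.
public class TemperatureLabel : MonoBehaviour
{
    // Illustrative endpoint on the Raspberry Pi bridge server.
    public string sensorUrl = "http://raspberrypi.local:5000/sensors/office-1/temperature";
    public float refreshSeconds = 5f;

    private TextMesh label;

    void Start()
    {
        label = GetComponent<TextMesh>();
        StartCoroutine(RefreshLoop());
    }

    private IEnumerator RefreshLoop()
    {
        while (true)
        {
            using (UnityWebRequest request = UnityWebRequest.Get(sensorUrl))
            {
                yield return request.SendWebRequest();
                if (!request.isNetworkError && !request.isHttpError)
                {
                    // Show the latest reading as the 3D text content.
                    label.text = request.downloadHandler.text + " °F";
                }
            }
            yield return new WaitForSeconds(refreshSeconds);
        }
    }
}
```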

Augmented Reality Challenges

We’ve already discussed the difficulty of using the UWP API, but getting the data to display presented its own set of challenges. Vuforia provided us with an easy-to-use platform for tracking the locations of the sensors so that our text could anchor next to them, but using the platform came with its own set of constraints. One of the first problems we dealt with was choosing a good image target, and learning what attributes make an image target good. Our first choice was basic geometric shapes with solid colors, but we quickly found that Vuforia had a difficult time recognizing them.

As we later learned, the effectiveness of the Vuforia tracking system is correlated with the number of unique feature points in an image, so our solid-colored shapes gave Vuforia almost nothing unique to pick up on. To fix this, we transitioned to QR codes with the name of each thermometer as the payload. QR codes by nature contain a large number of unique tracking points, so each image is distinguishable, making them an effective solution to our problem. Using QR codes as our unique identifiers also opens the door to compatibility with future projects that require identifying our sensors.

A second challenge created by using Vuforia was positioning the text correctly in the real world. While Unity allows you to visualize where the text is in the program's modelling window, it does not always look the same once deployed to the HoloLens (due to the difficulty of judging 3D space in a 2D frame). Most of the time, the text appeared below the QR code and closer to the floor, instead of hovering just above the image. We were able to improve the placement through trial and error, but this was time-consuming because of the time required for each compile-and-deploy cycle.

A third constraint of using Vuforia was finding a way to screenshot our work. Because Vuforia takes control of the HoloLens camera, Microsoft's usual desktop app for controlling the HoloLens didn't work, as it relied on camera access it no longer had. We tried to get around this by using the Vuforia scripting API to take a screenshot, but the resulting captures contained only what the camera saw, without the augmented content. Unable to find a solution through code, we finally decided to use an external camera to photograph what was on the screen.

Future Steps

There is a wide variety of directions this project could take, depending on what kinds of features are desired. Adding menus to the temperature displays could allow the sensors to be controlled through AR, instead of needing a computer. The image detection could also be improved by using VuMarks in place of the QR codes. A VuMark is Vuforia's proprietary alternative to a QR code: unlike regular QR codes, which Vuforia treats simply as images, VuMarks allow Vuforia to read the embedded data directly.


Interested in collaboration? Get in touch with us today!