In this chapter, we want to start seeing some real progress in our dynamic user interface. To do that, we will have our newly crafted toolset from the previous chapter appear wherever we are looking whenever we gaze at an object. To accomplish this, we will be using a very useful part of the C# language: delegates and events.
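To give a rough idea of the pattern before we dive in, here is a minimal sketch of a delegate and event pair that broadcasts gaze focus changes. The names (GazeBroadcaster, OnGazeEnter, OnGazeExit) are illustrative placeholders, not the exact classes we will build later in the series.

```csharp
using UnityEngine;

public class GazeBroadcaster : MonoBehaviour
{
    // The delegate defines the signature every subscriber must match.
    public delegate void FocusChanged(GameObject focusedObject);

    // Events based on that delegate; other scripts subscribe with +=.
    public static event FocusChanged OnGazeEnter;
    public static event FocusChanged OnGazeExit;

    private GameObject currentFocus;

    void Update()
    {
        // On HoloLens, the user's gaze is a ray from the camera.
        RaycastHit hit;
        GameObject newFocus = null;
        if (Physics.Raycast(Camera.main.transform.position,
                            Camera.main.transform.forward, out hit))
        {
            newFocus = hit.collider.gameObject;
        }

        // Fire events only when the focused object actually changes.
        if (newFocus != currentFocus)
        {
            if (currentFocus != null && OnGazeExit != null) OnGazeExit(currentFocus);
            if (newFocus != null && OnGazeEnter != null) OnGazeEnter(newFocus);
            currentFocus = newFocus;
        }
    }
}
```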
Alright, calm down and take a breath! I know the object creation chapter was a lot of code. I will give you all a slight reprieve; this section should be nice and simple, at least in comparison.
After previously learning how to make an object's material change when it gains focus, we will build on that knowledge by adding new objects through code. We will accomplish this by creating our bounding box, which in the end is not actually a box, as you will see.
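As a taste of creating objects from code, here is a small, illustrative sketch that spawns a sphere at each corner of an object's bounds, in the spirit of the corner markers we will build; the class name and sizes are placeholders, not the series' final implementation.

```csharp
using UnityEngine;

public class CornerSpawner : MonoBehaviour
{
    void Start()
    {
        Renderer targetRenderer = GetComponent<Renderer>();
        Bounds bounds = targetRenderer.bounds;

        // Place a small sphere at each of the eight corners of the bounds.
        for (int x = -1; x <= 1; x += 2)
            for (int y = -1; y <= 1; y += 2)
                for (int z = -1; z <= 1; z += 2)
                {
                    GameObject corner = GameObject.CreatePrimitive(PrimitiveType.Sphere);
                    corner.transform.localScale = Vector3.one * 0.02f;
                    corner.transform.position = bounds.center +
                        Vector3.Scale(bounds.extents, new Vector3(x, y, z));
                    corner.transform.parent = transform;
                }
    }
}
```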
We started with our system manager in the previous lesson of our series on building dynamic user interfaces. To keep moving forward, aside from actually translating, rotating, and scaling objects, we need to create objects from code in multiple ways, establish delegates and events, and use the surface of an object to inform our toolset placement.
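For the last of those ideas, using a surface to place the toolset, a rough sketch might look like the following. The class, field names, and offset value are assumptions for illustration only: it raycasts along the gaze and uses the hit point and surface normal to position and orient a toolset object.

```csharp
using UnityEngine;

public class ToolsetPlacer : MonoBehaviour
{
    public Transform toolset;        // assign the toolset root in the Inspector
    public float surfaceOffset = 0.05f;

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(Camera.main.transform.position,
                            Camera.main.transform.forward, out hit))
        {
            // Park the toolset just off the surface we hit and
            // orient its forward axis along the surface normal.
            toolset.position = hit.point + hit.normal * surfaceOffset;
            toolset.rotation = Quaternion.LookRotation(hit.normal);
        }
    }
}
```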
Now that we have installed the toolkit, set up our prefabs, and prepared Unity for export to HoloLens, we can proceed with the fun stuff involved in building a dynamic user interface. In this section, we will build the system manager.
Generally speaking, when it comes to modern devices, the simpler an interface is to navigate, the more successful the product tends to be.
Alright, let's dig into this and get the simple stuff out of the way. We have a journey ahead of us. A rather long journey at that. We will cover topics ranging from creating object filtering systems that help us tell when a new object has come into the scene to building and texturing objects from code.
The release of Unity 5.6 brought with it several great enhancements. One of those enhancements is the new Video Player component. This addition allows for adding videos to your scenes quickly and with plenty of flexibility. Whether you are looking to simply add a video to a plane, or get creative and build a world layered with videos on 3D objects, Unity 5.6 has your back.
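As a quick, hedged sketch of the simplest case, the script below adds a VideoPlayer component to a plane from code and plays an assigned clip on its material. The class and field names here are our own; only the VideoPlayer component and its properties come from Unity.

```csharp
using UnityEngine;
using UnityEngine.Video;

public class SimpleVideoSurface : MonoBehaviour
{
    public VideoClip clip;   // assign a video clip in the Inspector

    void Start()
    {
        // Add the new Video Player component at runtime.
        VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;

        // Render the video onto this object's material.
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";

        player.isLooping = true;
        player.Play();
    }
}
```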
Continuing our series on building a dynamic user interface for the HoloLens, this guide will show how to rotate the objects that we created, moved, and scaled in previous lessons.
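At its core, rotation in Unity boils down to a call like transform.Rotate. The snippet below is only an illustration: a constant spin in Update stands in for whatever gesture drives rotation in the actual lesson.

```csharp
using UnityEngine;

public class SimpleRotator : MonoBehaviour
{
    public float degreesPerSecond = 45f;

    void Update()
    {
        // Rotate around the world up axis, scaled by frame time.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime, Space.World);
    }
}
```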
An incorrectly scaled object in your HoloLens app can make or break your project, so it's important to get scaling in Unity down, including working with uniform and non-uniform factors, before moving on to other aspects of your app.
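For reference, the difference between the two comes down to the values you put in localScale, as in this small illustrative snippet:

```csharp
using UnityEngine;

public class ScaleExamples : MonoBehaviour
{
    void Start()
    {
        // Uniform scaling: the same factor on every axis preserves proportions.
        transform.localScale = Vector3.one * 1.5f;

        // Non-uniform scaling: different factors per axis stretch the object.
        transform.localScale = new Vector3(2f, 0.5f, 1f);
    }
}
```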
So after setting everything up, creating the system, working with focus and gaze, creating our bounding box and UI elements, unlocking the menu movement, as well as jumping through hoops refactoring a few parts of the system itself, we have finally made it to the point in our series on dynamic user interfaces for HoloLens where we get some real interaction.
Now that we have unlocked the menu movement, which is working very smoothly, we have to get to work on the gaze manager. But first, we have to make a course correction.
In the previous section of this series on dynamic user interfaces for HoloLens, we learned about delegates and events. At the same time, we used those delegates and events not only to attach our menu system to the user's gaze, but also to enable and disable the menu based on certain conditions. Now let's take that knowledge and build on it to make our menu system a bit more comfortable.
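To recap the subscriber side of that pattern, here is a minimal sketch of a menu toggling itself from gaze events. It leans on the illustrative GazeBroadcaster events sketched earlier in this collection; none of these names are the series' exact API.

```csharp
using UnityEngine;

public class MenuToggle : MonoBehaviour
{
    public GameObject menuRoot;   // assign the menu's root object in the Inspector

    void OnEnable()
    {
        // Subscribe to the gaze events when this component becomes active.
        GazeBroadcaster.OnGazeEnter += ShowMenu;
        GazeBroadcaster.OnGazeExit += HideMenu;
    }

    void OnDisable()
    {
        // Always unsubscribe to avoid dangling handlers.
        GazeBroadcaster.OnGazeEnter -= ShowMenu;
        GazeBroadcaster.OnGazeExit -= HideMenu;
    }

    void ShowMenu(GameObject focused) { menuRoot.SetActive(true); }
    void HideMenu(GameObject focused) { menuRoot.SetActive(false); }
}
```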
Being part of the wild frontier is amazing. It doesn't take much to blow the minds of first-time mixed reality users; merely placing a canned hologram in the room is enough. However, once that childlike wonder fades, we need to add more substance to create lasting impressions.
When making a convincing mixed reality experience, audio consideration is a must. Great audio can transport the HoloLens wearer to another place or time, help navigate 3D interfaces, or blur the lines of what is real and what is a hologram. Using a location-based trigger (hotspot), we will dial up a fun example of how well spatial sound works with the HoloLens.
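As a small taste of the setup, the sketch below configures an AudioSource for spatial sound from code. It assumes the MS HRTF Spatializer is selected as the spatializer plugin in Unity's audio settings; the class name is our own.

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SpatialSoundSetup : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialize = true;     // hand this source to the spatializer plugin
        source.spatialBlend = 1f;     // fully 3D, so distance and direction matter
        source.loop = true;
        source.Play();
    }
}
```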
One of the truly beautiful things about the HoloLens is its completely untethered, the-world-is-your-oyster freedom. This, paired with the ability to view your real surroundings while wearing the device, allows for some incredibly interesting uses. One particular use is triggering events when a user enters a specific location in a physical space. Think of it as a futuristic automatic door.
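Conceptually, that "automatic door" is just a trigger volume. Here is a hedged sketch: a collider marked Is Trigger that reacts when the user's head enters it. It assumes the camera rig carries a small collider, a Rigidbody (one side of a trigger pair needs one), and Unity's default MainCamera tag; the class name is illustrative.

```csharp
using UnityEngine;

public class HotspotTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // Only react to the camera (the user's head on HoloLens).
        if (other.CompareTag("MainCamera"))
        {
            Debug.Log("User entered the hotspot.");
            // Kick off the event here: play a sound, enable a hologram, etc.
        }
    }
}
```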
Welcome back to this series on making physical objects come to life on HoloLens with Vuforia. Now that we've set up Vuforia and readied our ImageTarget and camera system, we can see our work come to life. Because in the end, is that not one of the main driving forces when developing—that Frankenstein-like sensation of bringing something to life that was not there before?
Now that we've set up Vuforia in Unity, we can work on the more exciting aspects of making physical objects come to life on the HoloLens. In this guide, we will choose an image (something that you physically have in your home), build our ImageTarget database, and then set up our Unity camera to be able to recognize the chosen image so that it can overlay the 3D holographic effect on top of it.
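For orientation, reacting to the chosen image being found usually means listening for trackable state changes. The sketch below is modeled on Vuforia's ITrackableEventHandler interface (the exact API can vary by Vuforia version, and the bundled DefaultTrackableEventHandler sample does the same job); treat it as an outline rather than the guide's final script.

```csharp
using UnityEngine;
using Vuforia;

public class TargetFoundHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour trackable;

    void Start()
    {
        // Attach to the ImageTarget's trackable so we get status callbacks.
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        // Show or hide the holographic content parented under this target.
        foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
            r.enabled = found;
    }
}
```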
In this first part of our tutorial series on making physical objects come to life on HoloLens, we are going to set up Vuforia in Unity.
It seems to me you can't swing a dead cat near an augmented reality developer without hearing the word Vuforia escape their lips. PTC's software solution has become the go-to for most developers in the mobile AR space, and since they recently added full support for the HoloLens in Unity, I figured it was about time we learned to make something with it.