HoloLens Course 211: When the app has deployed, dismiss the Fitbox with a select gesture

By Robyn Brekke Sr. · 7 min read

How do I deploy a HoloLens app over USB?

Mar 07, 2022 · When the app has deployed, dismiss the Fitbox with a select gesture. If deploying to an immersive headset: Using the top toolbar in Visual Studio, change the target from Debug to Release and from ARM to x64. Make sure the deployment target is set to Local Machine. In the …
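
For reference, those Visual Studio toolbar settings map onto ordinary MSBuild properties, so the same Release/x64 package can be produced from a Developer Command Prompt. A minimal sketch, assuming the Unity-generated solution is named ModelExplorer.sln (the name is a placeholder):

    msbuild ModelExplorer.sln /p:Configuration=Release /p:Platform=x64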

Can Visual Studio detect the HoloLens as a remote machine?

The emulator fires up and the app deploys successfully, but only a blank black screen appears and nothing else comes up. When I change the configuration to Debug mode and debug via the …

How do I get feedback on my HoloLens project?

Apr 28, 2016 ·
    1. Change the target from Debug to Release.
    2. Change ARM to x86.
    3. Select Device in the deployment target drop-down menu.
    4. Select Debug > Start debugging to deploy …
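
If Visual Studio will not cooperate, deployment over USB can also be done from the command line with WinAppDeployCmd, which ships with the Windows 10 SDK and addresses a USB-attached device as 127.0.0.1. A sketch, where the package file name and PIN are placeholders:

    WinAppDeployCmd devices
    WinAppDeployCmd install -file "ModelExplorer.appxbundle" -ip 127.0.0.1 -pin A1B2C3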

In this article

The Mixed Reality Academy tutorials were designed with HoloLens (1st gen), Unity 2017, and Mixed Reality Immersive Headsets in mind. As such, we feel it is important to leave these tutorials in place for developers who are still looking for guidance in developing for those devices.

Project files

If you want to look through the source code before downloading, it's available on GitHub.

Errata and Notes

"Enable Just My Code" needs to be disabled ( unchecked) in Visual Studio under Tools->Options->Debugging in order to hit breakpoints in your code.

Build and Deploy

Rebuild the application in Unity and then build and deploy from Visual Studio to run it in the HoloLens.

Desktop

  • Windows 10 Enterprise
  • Visual Studio Community 2017
  • Unity 2018.2.19f1 (latest update as of 7 December 2018)

The Problem

I am unable to deploy any application to the HoloLens, but I am able to deploy to the HoloLens emulator. I'm attempting this with the Origami application exported from Unity as in the tutorial. I'm able to pair with the HoloLens easily and I have Developer Mode on. Visual Studio also auto-detects the device as a remote machine.

Before You Start

Chapter 0 - Unity Setup

  • Instructions
    1. Start Unity.
    2. Select Open.
    3. Navigate to the Gesture folder you previously un-archived.
    4. Find and select the Starting/Model Explorer folder.
    5. Click the Select Folder button.
    6. In the Project panel, expand the Scenes folder.
    7. Double-click the ModelExplorer scene to load it in Unity.
  • Building
    1. In Unity, select File > Build Settings.
    2. If Scenes/ModelExplorer is not listed in Scenes In Build, click Add Open Scenes to add the scene.
    3. If you're specifically developing for HoloLens, set Target device to HoloLens. Otherwise, leave it on Any device.
    4. Ensure Build Type is set to D3D …
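
The same Build Settings switches can be scripted from an editor menu item. A minimal sketch using Unity's UnityEditor API; the scene path and App output folder are assumptions, not the tutorial's:

    // Editor/BuildHoloLens.cs -- hypothetical helper mirroring the manual steps above.
    using UnityEditor;

    public static class BuildHoloLens
    {
        [MenuItem("Tools/Build for HoloLens")]
        public static void Build()
        {
            // Step 3: target HoloLens specifically (or leave at AnyDevice).
            EditorUserBuildSettings.wsaSubtarget = WSASubtarget.HoloLens;
            // Step 4: Build Type = D3D.
            EditorUserBuildSettings.wsaUWPBuildType = WSAUWPBuildType.D3D;

            // Step 2: include the scene, then generate the UWP solution.
            BuildPipeline.BuildPlayer(
                new[] { "Assets/Scenes/ModelExplorer.unity" },
                "App",
                BuildTarget.WSAPlayer,
                BuildOptions.None);
        }
    }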

Chapter 1 - Hand Detected Feedback

  • Objectives
    1. Subscribe to hand tracking events.
    2. Use cursor feedback to show users when a hand is being tracked.
  • Instructions
    1. In the Hierarchy panel, expand the InputManager object.
    2. Look for and select the GesturesInput object.
    The InteractionInputSource.cs script performs these steps:
    1. Subscribes to the InteractionSourceDetected and InteractionSourceLost events.
    2. Sets the HandDetected stat…
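
Outside the tutorial's InteractionInputSource.cs, the same subscription can be sketched directly against Unity's InteractionManager API of that era. Class and property names here are illustrative:

    using UnityEngine;
    using UnityEngine.XR.WSA.Input;

    // Illustrative sketch: track whether a hand is currently detected --
    // the same signal the tutorial's cursor feedback is driven by.
    public class HandDetectedFeedback : MonoBehaviour
    {
        public bool HandDetected { get; private set; }

        void Awake()
        {
            InteractionManager.InteractionSourceDetected += OnSourceDetected;
            InteractionManager.InteractionSourceLost += OnSourceLost;
        }

        void OnDestroy()
        {
            InteractionManager.InteractionSourceDetected -= OnSourceDetected;
            InteractionManager.InteractionSourceLost -= OnSourceLost;
        }

        void OnSourceDetected(InteractionSourceDetectedEventArgs args)
        {
            if (args.state.source.kind == InteractionSourceKind.Hand)
                HandDetected = true;  // cursor can switch to its "hand tracked" visual
        }

        void OnSourceLost(InteractionSourceLostEventArgs args)
        {
            if (args.state.source.kind == InteractionSourceKind.Hand)
                HandDetected = false;
        }
    }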

Chapter 2 - Navigation

  • Objectives
    1. Use Navigation gesture events to rotate the astronaut.
  • Instructions
    To use Navigation gestures in our app, we are going to edit GestureAction.cs to rotate objects when the Navigation gesture occurs. Additionally, we'll add feedback to the cursor to display when Navigation is available.
    1. In the Hierarchy panel, expand CursorWithFeedback.
    2. In the Hologra…
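
A minimal sketch of the rotation behavior, in the spirit of GestureAction.cs but with illustrative names, using the GestureRecognizer API available in Unity 2017/2018:

    using UnityEngine;
    using UnityEngine.XR.WSA.Input;

    // Illustrative sketch: rotate this object around Y while a Navigation gesture is held.
    public class NavigationRotate : MonoBehaviour
    {
        public float RotationSensitivity = 10f;  // degrees per unit of normalized offset

        GestureRecognizer recognizer;

        void Awake()
        {
            recognizer = new GestureRecognizer();
            // Only horizontal hand movement is needed for a Y-axis rotation.
            recognizer.SetRecognizableGestures(GestureSettings.NavigationX);
            recognizer.NavigationUpdated += OnNavigationUpdated;
            recognizer.StartCapturingGestures();
        }

        void OnDestroy()
        {
            recognizer.NavigationUpdated -= OnNavigationUpdated;
            recognizer.Dispose();
        }

        void OnNavigationUpdated(NavigationUpdatedEventArgs args)
        {
            // normalizedOffset.x runs from -1 to 1 as the hand moves left or right.
            float rotationFactor = args.normalizedOffset.x * RotationSensitivity;
            transform.Rotate(new Vector3(0, -rotationFactor, 0));
        }
    }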

Chapter 3 - Hand Guidance

  • Objectives
    1. Use the hand guidance score to help predict when hand tracking will be lost.
    2. Provide feedback on the cursor to show when the user's hand nears the camera's edge of view.
  • Instructions
    1. In the Hierarchy panel, select the CursorWithFeedback object.
    2. In the Inspector panel, click the Add Component button.
    3. In the menu, type Hand Guidance in the search box. Select the search result.
    4. In the Project panel's Holograms folder, find the HandGuidanceFeedback asset.
    5. Drag a…
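
The "hand guidance score" surfaces in this API as InteractionSourceProperties.sourceLossRisk, which climbs toward 1 as the hand nears the edge of the gesture frame. A minimal sketch; the threshold and class name are assumptions:

    using UnityEngine;
    using UnityEngine.XR.WSA.Input;

    // Illustrative sketch: warn the user before hand tracking is lost.
    public class HandGuidanceSketch : MonoBehaviour
    {
        [Range(0f, 1f)]
        public float HandGuidanceThreshold = 0.5f;  // assumed value: warn above this risk

        void Awake()
        {
            InteractionManager.InteractionSourceUpdated += OnSourceUpdated;
        }

        void OnDestroy()
        {
            InteractionManager.InteractionSourceUpdated -= OnSourceUpdated;
        }

        void OnSourceUpdated(InteractionSourceUpdatedEventArgs args)
        {
            if (args.state.source.kind != InteractionSourceKind.Hand)
                return;

            // 0 = hand well inside the trackable region, 1 = about to be lost.
            double risk = args.state.properties.sourceLossRisk;
            if (risk > HandGuidanceThreshold)
            {
                // Direction the hand should move to stay trackable.
                Vector3 direction = args.state.properties.sourceLossMitigationDirection;
                Debug.Log("Hand nearing edge of view; guide it along " + direction);
            }
        }
    }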

Chapter 4 - Manipulation

  • Objectives
    1. Use Manipulation events to move the astronaut with your hands.
    2. Provide feedback on the cursor to let the user know when Manipulation can be used.
  • Instructions
    GestureManager.cs and AstronautManager.cs will allow us to do the following:
    1. Use the speech keyword "Move Astronaut" to enable Manipulation gestures and "Rotate Astronaut" to disable them.
    2. Switch to responding to the Manipulation Gesture Recognizer.
    Let's get started.
    1. In th…
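
A condensed sketch of the same idea: a KeywordRecognizer toggles a Manipulation-only GestureRecognizer on and off. The class name and wiring are illustrative, not GestureManager.cs itself:

    using UnityEngine;
    using UnityEngine.Windows.Speech;
    using UnityEngine.XR.WSA.Input;

    // Illustrative sketch: "Move Astronaut" enables Manipulation, "Rotate Astronaut" disables it.
    public class GestureModeSwitcher : MonoBehaviour
    {
        GestureRecognizer manipulationRecognizer;
        KeywordRecognizer keywordRecognizer;
        Vector3 positionAtStart;

        void Start()
        {
            manipulationRecognizer = new GestureRecognizer();
            manipulationRecognizer.SetRecognizableGestures(GestureSettings.ManipulationTranslate);
            manipulationRecognizer.ManipulationStarted += args => positionAtStart = transform.position;
            manipulationRecognizer.ManipulationUpdated += args =>
                // cumulativeDelta is the total hand movement since the gesture started.
                transform.position = positionAtStart + args.cumulativeDelta;

            keywordRecognizer = new KeywordRecognizer(new[] { "Move Astronaut", "Rotate Astronaut" });
            keywordRecognizer.OnPhraseRecognized += OnPhraseRecognized;
            keywordRecognizer.Start();
        }

        void OnDestroy()
        {
            keywordRecognizer.Dispose();
            manipulationRecognizer.Dispose();
        }

        void OnPhraseRecognized(PhraseRecognizedEventArgs args)
        {
            if (args.text == "Move Astronaut")
                manipulationRecognizer.StartCapturingGestures();   // Manipulation on
            else
                manipulationRecognizer.StopCapturingGestures();    // back to rotation mode
        }
    }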

Chapter 5 - Model Expansion

  • Objectives
    1. Expand the Astronaut model into multiple, smaller pieces that the user can interact with.
    2. Move each piece individually using Navigation and Manipulation gestures.
  • Instructions
    In this section, we will accomplish the following tasks:
    1. Add a new keyword "Expand Model" to expand the astronaut model.
    2. Add a new keyword "Reset Model" to return the model to its original form.
    We'll do this by adding two more keywords to the Speech Input Source from the pr…
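
A self-contained sketch of the expand/reset behavior: each child piece drifts out along its original offset from the center on "Expand Model" and returns on "Reset Model". The tutorial wires these keywords through its Speech Input Source instead; everything below is illustrative:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Windows.Speech;

    // Illustrative sketch: spread the model's child pieces apart and bring them back.
    public class ModelExpander : MonoBehaviour
    {
        public float ExpandFactor = 1.5f;  // assumed: how far pieces spread from the center

        readonly Dictionary<Transform, Vector3> originalPositions =
            new Dictionary<Transform, Vector3>();
        KeywordRecognizer keywordRecognizer;
        bool expanded;

        void Start()
        {
            foreach (Transform piece in transform)
                originalPositions[piece] = piece.localPosition;

            keywordRecognizer = new KeywordRecognizer(new[] { "Expand Model", "Reset Model" });
            keywordRecognizer.OnPhraseRecognized += args => expanded = args.text == "Expand Model";
            keywordRecognizer.Start();
        }

        void Update()
        {
            foreach (var entry in originalPositions)
            {
                // Expanded pieces sit further out along their original offset.
                Vector3 target = expanded ? entry.Value * ExpandFactor : entry.Value;
                entry.Key.localPosition =
                    Vector3.Lerp(entry.Key.localPosition, target, Time.deltaTime * 5f);
            }
        }

        void OnDestroy()
        {
            keywordRecognizer.Dispose();
        }
    }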

The End

  • Congratulations! You have now completed MR Input 211: Gesture.
    1. You know how to detect and respond to hand tracking, navigation, and manipulation events.
    2. You understand the difference between Navigation and Manipulation gestures.
    3. You know how to change the cursor to provide visual feedback for when a hand is detected, when a hand is about to be lost, and for when an o…