Humanoid Control for Unity v4
This extension is aimed specifically at the HTC Vive Pro Eye. As it only uses OpenXR constructs, it may or may not work with other devices that support the OpenXR Eye Gaze Interaction Profile; however, this is not supported by us.
HTC Vive Eye tracking is supported in the Humanoid Control Pro edition.
HTC Vive Pro is supported for Windows builds.
I found that it is necessary to install VivePort, which in turn can install the Vive SRAnipal application that makes eye tracking possible.
It looks like the Vive OpenXR Plugin is not strictly necessary for eye tracking, but I do get a lot of warnings in my project when it is not installed. For now it is therefore recommended to install the plugin.
You can use the installer unitypackage as described on the GitHub page: https://github.com/ViveSoftware/VIVE-OpenXR
Alternatively, you can import the package directly using the Git package importer of the Package Manager. For instructions on this, see: Unity: Installing from a Git URL.
To install the latest version you can use the following link: https://github.com/ViveSoftware/VIVE-OpenXR.git?path=com.htc.upm.vive.openxr
For a specific version you can extend the URL with the version number. For example, for version 2.2.0 you can use the following link: https://github.com/ViveSoftware/VIVE-OpenXR.git?path=com.htc.upm.vive.openxr#versions/2.2.0
Note: the current version has been tested and found to be working with version 2.2.0.
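If you prefer not to use the Package Manager window, the same Git URL can also be entered by hand in the project's Packages/manifest.json. The snippet below is only a sketch of what the dependency entry could look like; all other dependencies of your project are omitted here:

{
  "dependencies": {
    "com.htc.upm.vive.openxr": "https://github.com/ViveSoftware/VIVE-OpenXR.git?path=com.htc.upm.vive.openxr#versions/2.2.0"
  }
}

After saving the file, Unity resolves and downloads the package the next time the editor regains focus.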
In the Edit menu -> Project Settings -> XR Plug-in Management, make sure you have OpenXR selected for Windows, Mac, Linux. Then, in the Enabled Interaction Profiles, you need to add the Eye Gaze Interaction Profile. No OpenXR Feature Groups (like VIVE XR Facial Tracking) are necessary for eye tracking.
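With the Eye Gaze Interaction Profile enabled, Unity exposes the combined gaze pose as an input device. Purely as a sanity check, independent of Humanoid Control, a small script along these lines can read it through the Input System. The <EyeGaze>/pose binding paths are an assumption based on the Unity OpenXR plugin and may differ per plugin version, so verify them in the Input Debugger:

using UnityEngine;
using UnityEngine.InputSystem;

// Sanity-check script (not part of Humanoid Control): reads the gaze pose
// exposed by the OpenXR Eye Gaze Interaction Profile via the Input System.
// The <EyeGaze>/pose binding paths are an assumption; verify them in the
// Input Debugger for your installed plugin version.
public class EyeGazeCheck : MonoBehaviour
{
    InputAction gazePosition;
    InputAction gazeRotation;

    void OnEnable()
    {
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazePosition.Enable();
        gazeRotation.Enable();
    }

    void OnDisable()
    {
        gazePosition.Disable();
        gazeRotation.Disable();
    }

    void Update()
    {
        Vector3 position = gazePosition.ReadValue<Vector3>();
        Quaternion rotation = gazeRotation.ReadValue<Quaternion>();
        // Draw the gaze ray in the Scene view so you can see that tracking works.
        Debug.DrawRay(position, rotation * Vector3.forward, Color.green);
    }
}

If the ray stays at the origin and does not move, the Eye Gaze Interaction Profile is probably not active or the SRAnipal runtime is not running.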
You can enable Vive Eye tracking on the HeadTarget of the humanoid.
[Image: ViveEyeTrackingHeadTarget.png]
When you enable the tracking, you have the option to add the eye tracking components to the real-world setup by pressing the Show buttons. If you do not do this, the eye tracking components will be added at runtime.
There will be one eye tracking component per eye, both parented to the UnityXR HMD in the real-world hierarchy. The UnityXR HMD itself gets an eye tracking component which retrieves the data for both eyes.
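If you want to drive your own logic from the tracked eye transforms, you can look them up under the HMD in the real-world hierarchy at runtime. The object names used below ("UnityXR HMD", "Left Eye", "Right Eye") are hypothetical placeholders for illustration only; check the actual names in your scene's real-world hierarchy and adjust accordingly:

using UnityEngine;

// Illustrative sketch only: the object names "UnityXR HMD", "Left Eye" and
// "Right Eye" are hypothetical; check the actual names in the real-world
// hierarchy of your scene.
public class EyeTransformLookup : MonoBehaviour
{
    Transform leftEye;
    Transform rightEye;

    void Start()
    {
        GameObject hmd = GameObject.Find("UnityXR HMD"); // hypothetical object name
        if (hmd != null)
        {
            leftEye = hmd.transform.Find("Left Eye");    // hypothetical child name
            rightEye = hmd.transform.Find("Right Eye");  // hypothetical child name
        }
    }

    void Update()
    {
        // Visualize the line between the two tracked eyes in the Scene view.
        if (leftEye != null && rightEye != null)
            Debug.DrawLine(leftEye.position, rightEye.position, Color.cyan);
    }
}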