Unity-Movement is a package that uses OpenXR’s tracking layer APIs to expose Meta Quest Pro’s Body Tracking (BT), Eye Tracking (ET), and Face Tracking (FT) capabilities. With this package, developers can leverage tracking to populate VR environments with custom avatars that bring users' expressiveness into the virtual environments they create.
This package requires:
- Unity 2021.3.21f1 (2021 LTS) or newer installed
- v53.0 or newer of the Oculus Integration SDK, with OVRPlugin set to use OpenXR as the backend. Make sure to include the VR and Interaction folders when importing into your project.
- A project set up with these configuration settings
The Unity-Movement package is released under the Oculus License. The MIT License applies only to certain, clearly marked documents. If an individual file does not indicate which license it is subject to, then the Oculus License applies.
First, ensure that all of the requirements are met.
Then, bring this package into the project in one of the following ways:
- In Package Manager, click on the add button below the window title and select Add package from git URL…, using this URL: https://github.com/oculus-samples/Unity-Movement.git
- To install a specific version of the package, append # followed by the version number to the git URL (e.g. https://github.com/oculus-samples/Unity-Movement.git#1.2.0)
- Alternatively, in Package Manager, click on the add button below the window title and select Add package from disk..., using the package.json file found after unzipping one of the releases from https://github.com/oculus-samples/Unity-Movement/releases
The sample scenes are located under the Samples/../Scenes folders.
If the new scene or an existing scene doesn’t have a GameObject with the OVRCameraRig component, follow these steps:
- From the Hierarchy tab, look for a Main Camera GameObject.
- If the Main Camera GameObject is present, right-click Main Camera and click Delete.
- In the Project tab, expand the Assets > Oculus > VR > Prefab folder and drag and drop the OVRCameraRig prefab into the scene. You can also drag and drop it into the Hierarchy tab.
- On the Inspector tab, go to OVR Manager > Quest Features.
- In the General tab, there are options to enable body, face, and eye tracking support. Select Supported or Required for the type of tracking support you wish to add.
- Under OVRManager's "Permission Requests On Startup" section, enable Body, Face, and Eye Tracking.
- Ensure that OVRManager's "Tracking Origin Type" is set to "Floor Level".
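With "Permission Requests On Startup" enabled, OVRManager asks for these permissions when the app launches. If you want to verify or re-request them from your own code, a minimal runtime sketch (assuming the standard Quest tracking permission strings; the component name is illustrative and not part of the package) could look like this:

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Illustrative component: checks that the tracking permissions are granted
// and requests any that are missing. Adjust the list to match the tracking
// features your project actually uses.
public class TrackingPermissionCheck : MonoBehaviour
{
    private static readonly string[] TrackingPermissions =
    {
        "com.oculus.permission.BODY_TRACKING",
        "com.oculus.permission.FACE_TRACKING",
        "com.oculus.permission.EYE_TRACKING"
    };

    private void Start()
    {
#if UNITY_ANDROID
        foreach (var permission in TrackingPermissions)
        {
            if (!Permission.HasUserAuthorizedPermission(permission))
            {
                Permission.RequestUserPermission(permission);
            }
        }
#endif
    }
}
```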
The Character (layer index 10), the MirroredCharacter (layer index 11), and HiddenMesh layers must be present in the project for RecalculateNormals to work correctly.
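These layers are added manually under Edit > Project Settings > Tags and Layers. As a convenience, a small sketch like the following (the component name is illustrative and not part of the package) can log an error if any of them is missing:

```csharp
using UnityEngine;

// Illustrative check: logs an error for any layer required by
// RecalculateNormals that has not been added to the project's layer list.
public class RequiredLayerCheck : MonoBehaviour
{
    private static readonly string[] RequiredLayers =
    {
        "Character",         // expected at layer index 10
        "MirroredCharacter", // expected at layer index 11
        "HiddenMesh"
    };

    private void Awake()
    {
        foreach (var layerName in RequiredLayers)
        {
            if (LayerMask.NameToLayer(layerName) == -1)
            {
                Debug.LogError($"Missing layer '{layerName}' required for RecalculateNormals.");
            }
        }
    }
}
```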
Navigate to your Project Settings (Edit->Project Settings...) and click on the "Quality" section. If your project uses URP, some of these settings may be part of the rendering pipeline asset currently in use; the pipeline asset in use is shown in the Quality menu.
The following settings are recommended:
- Four bones for Skin Weights.
- 2x or 4x Multisample Anti-Aliasing (MSAA).
- Full resolution textures.
- Shadow settings:
- Hard and soft shadows.
- Very high shadow resolution.
- Stable fit.
- Shadow distance of 3 meters, with cascades. This allows nearby shadows to be viewed without a noticeable loss in quality.
- At least one pixel light.
Make sure that the color space is set to Linear.
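Most of these values can also be applied from an editor script. The sketch below is a rough equivalent of the recommendations above for the built-in render pipeline; the menu path and script name are illustrative, and in a URP project the anti-aliasing and shadow values are configured on the pipeline asset instead.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Illustrative editor utility that applies the recommended settings to the
// currently active quality level and sets the color space to Linear.
public static class MovementQualitySettings
{
    [MenuItem("Tools/Apply Recommended Movement Quality Settings")]
    public static void Apply()
    {
        QualitySettings.skinWeights = SkinWeights.FourBones;           // four bones for skin weights
        QualitySettings.antiAliasing = 4;                              // 2x or 4x MSAA
        QualitySettings.shadows = ShadowQuality.All;                   // hard and soft shadows
        QualitySettings.shadowResolution = ShadowResolution.VeryHigh;  // very high shadow resolution
        QualitySettings.shadowProjection = ShadowProjection.StableFit; // stable fit
        QualitySettings.shadowDistance = 3f;                           // 3 meter shadow distance
        QualitySettings.shadowCascades = 2;                            // use cascades
        QualitySettings.pixelLightCount = 1;                           // at least one pixel light
        PlayerSettings.colorSpace = ColorSpace.Linear;                  // linear color space
    }
}
#endif
```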
In order for the SceneSelectMenu buttons to work, add the scenes located in the Samples/../Scenes folders of the package to the Build Settings.
The project contains several sample scenes. To test the samples, add the scenes located in the Packages/com.meta.movement/Samples/../Scenes folders to the project's Assets folder.
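Scenes must be listed in the Build Settings before they can be loaded at runtime. A small editor sketch along these lines can append them for you; the menu path, script name, and the assumption that the sample scenes have been copied under Assets/Samples are all illustrative.

```csharp
#if UNITY_EDITOR
using System.Linq;
using UnityEditor;

// Illustrative editor utility: appends any scene found under Assets/Samples
// to the Build Settings if it is not already listed.
public static class AddMovementSampleScenes
{
    [MenuItem("Tools/Add Movement Sample Scenes To Build")]
    public static void AddScenes()
    {
        var scenes = EditorBuildSettings.scenes.ToList();
        foreach (var guid in AssetDatabase.FindAssets("t:Scene", new[] { "Assets/Samples" }))
        {
            var path = AssetDatabase.GUIDToAssetPath(guid);
            if (scenes.All(s => s.path != path))
            {
                scenes.Add(new EditorBuildSettingsScene(path, true));
            }
        }
        EditorBuildSettings.scenes = scenes.ToArray();
    }
}
#endif
```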
For more information about the samples, read Aura Sample, Hip Pinning Sample, and High Fidelity Sample.
The documentation for this package can be found here. The API reference for this package can be found here.