# VREmotionDetection

VREmotionDetection is a Unity library for detecting emotions in VR using machine learning models.
## Requirements

- Unity 6 or newer.
- Meta Quest Pro, if you want to use action units for prediction.
## Installation

- In your Unity project, open Window > Package Manager.
- In the top-left corner, click + > Install package from git URL...
- Paste the following URL: https://github.com/vcu-swim-lab/VREmotionDetection.git
- Click Install.
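Alternatively, Unity can resolve the package if it is listed directly in your project's `Packages/manifest.json`. The package name key below is an assumption for illustration; check the `name` field in the repository's `package.json` for the actual value:

```json
{
  "dependencies": {
    "com.vcu-swim-lab.vremotiondetection": "https://github.com/vcu-swim-lab/VREmotionDetection.git"
  }
}
```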
## Setup

- In the Hierarchy tab, select the GameObject to which you want to add the predictor model.
- In the Inspector tab, click Add Component, then Face Au Model.
- In the Natural Model field, pick `natural_trial_96.onnx`, and in the Acted Model field pick `act_trial_92.onnx`.
## Usage

- Create a new MonoBehaviour script and attach it to the same GameObject as the Emotion Predictor.
- Open the script in your code editor.
- Copy and paste the sample code from `Samples~/Basic/MyScript.cs`. This script gives you direct access to the predictions from the emotion detection model.
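The sample in `Samples~/Basic/MyScript.cs` is the authoritative reference for the API. As a rough sketch of what such a script might look like, assuming a component type named `EmotionPredictor` with a `CurrentPrediction` accessor (both names are illustrative assumptions, not the library's confirmed API):

```csharp
using UnityEngine;

// Hypothetical sketch: EmotionPredictor and CurrentPrediction are assumed
// names for illustration; see Samples~/Basic/MyScript.cs for the real API.
public class EmotionLogger : MonoBehaviour
{
    private EmotionPredictor predictor;

    void Start()
    {
        // The predictor component sits on the same GameObject (see the setup
        // steps above), so it can be fetched with GetComponent.
        predictor = GetComponent<EmotionPredictor>();
    }

    void Update()
    {
        // Read and log the latest prediction each frame.
        Debug.Log($"Predicted emotion: {predictor.CurrentPrediction}");
    }
}
```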
## License

This project is licensed under the MIT License.