This project combines Apple Vision Pro with custom haptic gloves to create an immersive art experience. The project consists of several components:
- **Unity Vision Pro Application**
  - Main application running on Vision Pro
  - Hand tracking and haptic feedback integration
  - Art interaction system
- **Haptic Gloves Unity SDK**
  - Custom SDK for glove integration
  - Communication protocols
  - Haptic feedback control
- **Gloves Firmware**
  - Based on Adafruit hardware
  - Controls haptic feedback
  - Handles sensor data
- **Gloves Design**
  - Physical glove design
  - Component placement
  - Wiring schematics
Requirements:

- Apple Vision Pro
- Unity 6.0 or later
- Adafruit hardware components
- Arduino IDE
- Xcode (for Vision Pro development)
Hardware:

- Adafruit Feather nRF52840
- Haptic feedback motors
- Flex sensors
- IMU sensors
- Power supply components
Software:

- Unity 6.0 or later
- Apple Vision Pro SDK
- Arduino IDE
- Visual Studio Code (optional)
Getting started:

1. Clone this repository
2. Build and deploy the Unity application to Vision Pro
3. Flash the firmware to the gloves
4. Connect and test the system
MIT License
This is a Unity 6000.1.5f1 project configured for Apple Vision Pro development with the following key components:
- Hand Tracking: Uses Unity XR Hands with a custom `MyHand.cs` component for hand pose tracking and distance calculations
- BLE Communication: `BLESendJointV.cs` handles Bluetooth communication with the haptic gloves using finger velocity mapping
- Visual Effects: VFX Graph-based visual effects in `/Assets/VFX/` with a `VFXMan.cs` controller
- AR Mesh Processing: Handles Vision Pro mesh data for spatial interactions
- YOLO Integration: Computer vision pipeline using Barracuda for object detection/segmentation
Core features:

- Vision Pro hand tracking integration
- Haptic glove communication
- Art interaction system
- Real-time feedback
Hand tracking:

- `MyHand.cs` - Core hand tracking with pose data, palm distance calculations, and finger velocity tracking
- `HandRaycaster.cs` - Hand-based raycasting for spatial interactions
- `HandVisualizer.cs` - Visual representation of hand data
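As a rough illustration of the kind of logic involved, the sketch below reads both palm poses through the XR Hands API and computes the distance between them. The class name and structure are illustrative only and not taken from `MyHand.cs`.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative sketch only; not the actual MyHand.cs implementation.
public class PalmDistanceSketch : MonoBehaviour
{
    XRHandSubsystem m_Hands;

    void Update()
    {
        // Lazily grab the running XR Hands subsystem.
        if (m_Hands == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            m_Hands = subsystems[0];
        }

        var left = m_Hands.leftHand;
        var right = m_Hands.rightHand;
        if (!left.isTracked || !right.isTracked) return;

        // Palm-to-palm distance, one of the quantities MyHand.cs tracks.
        if (left.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose leftPalm) &&
            right.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose rightPalm))
        {
            float palmDistance = Vector3.Distance(leftPalm.position, rightPalm.position);
            Debug.Log($"Palm distance: {palmDistance:F3} m");
        }
    }
}
```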
BLE communication:

- `BLESendJointV.cs` - Maps finger velocity (0-0.3 m/s) to BLE speed parameters (1.0x-4.0x) with OneDollar filtering
- `ReadSthFromServer.cs` - Fetches configuration parameters from a web server and applies them to `BLESendJointV`
- `/Assets/Scripts/Library/CoreBluetooth/` - Core Bluetooth wrapper for Unity-iOS communication
- `/Assets/Scripts/Library/NativeInterface/` - Native iOS interface layer
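The velocity-to-speed mapping described above amounts to a linear remap of the quoted ranges (0-0.3 m/s in, 1.0x-4.0x out). The snippet below is a minimal sketch of that mapping only; the OneDollar filtering and the actual BLE write path in `BLESendJointV.cs` are omitted, and the names are illustrative.

```csharp
using UnityEngine;

// Sketch of the finger-velocity -> haptic-speed mapping described for BLESendJointV.cs.
// Filtering and BLE transmission are not shown.
public static class FingerVelocityMapping
{
    const float MaxVelocity = 0.3f; // m/s, upper end of the tracked finger velocity range
    const float MinSpeed = 1.0f;    // haptic speed multiplier at zero velocity
    const float MaxSpeed = 4.0f;    // haptic speed multiplier at MaxVelocity and above

    public static float ToSpeedParameter(float fingerVelocity)
    {
        // Clamp and remap [0, 0.3] m/s onto [1.0, 4.0].
        float t = Mathf.InverseLerp(0f, MaxVelocity, fingerVelocity);
        return Mathf.Lerp(MinSpeed, MaxSpeed, t);
    }
}
```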
YOLO object detection:

- `Detector.cs` - Main YOLO detection controller using Barracuda
- `YOLOv8.cs` and `YOLOv8Segmentation.cs` - YOLO model implementations
- `/Assets/Scripts/Library/Yolo/TextureProviders/` - Camera and video input providers
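For orientation, running a YOLO model through Barracuda generally follows the load-model / create-worker / execute pattern sketched below. The field names and tensor handling are assumptions for illustration; the real pre- and post-processing lives in `Detector.cs`.

```csharp
using Unity.Barracuda;
using UnityEngine;

// Minimal Barracuda inference sketch; Detector.cs's actual pipeline is not shown.
public class YoloInferenceSketch : MonoBehaviour
{
    public NNModel modelAsset;   // YOLO model imported as an NNModel asset (assumed)
    IWorker m_Worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        m_Worker = WorkerFactory.CreateWorker(WorkerFactory.Type.Auto, model);
    }

    public Tensor Detect(Texture frame)
    {
        // Convert the camera frame to a tensor and run the network.
        using (var input = new Tensor(frame, channels: 3))
        {
            m_Worker.Execute(input);
        }
        // Raw output tensor; decoding boxes/masks happens elsewhere.
        return m_Worker.PeekOutput();
    }

    void OnDestroy() => m_Worker?.Dispose();
}
```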
Visual effects:

- `VFXMan.cs` - Controls Visual Effect Graph assets with AR mesh integration
- Custom VFX operators for spatial effects and mesh interactions
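Driving a VFX Graph from script typically means setting exposed properties on a `VisualEffect` component, as in the sketch below. The property names `"MeshInput"` and `"Intensity"` are hypothetical and not those defined by `VFXMan.cs` or the project's graphs.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch of feeding AR mesh data into a Visual Effect Graph.
// "MeshInput" and "Intensity" are hypothetical exposed properties.
public class VfxMeshDriverSketch : MonoBehaviour
{
    public VisualEffect effect;

    public void Apply(Mesh arMesh, float intensity)
    {
        if (effect.HasMesh("MeshInput"))
            effect.SetMesh("MeshInput", arMesh);
        if (effect.HasFloat("Intensity"))
            effect.SetFloat("Intensity", intensity);
    }
}
```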
Key dependencies from manifest.json:
- `com.unity.xr.visionos`: "2.2.4" - Vision Pro platform support
- `com.unity.xr.hands`: "1.5.1" - Hand tracking
- `com.unity.xr.interaction.toolkit`: "3.1.2" - XR interactions
- `com.unity.visualeffectgraph`: "17.1.0" - VFX system
- `com.unity.barracuda` - Machine learning inference
- `com.unity.render-pipelines.universal`: "17.1.0" - URP rendering
Scenes:

- `DebugVFX.unity` - Main scene
- `DebugHandGestures.unity` - Hand tracking testing and debugging
- `YoloScenes/Detection.unity` - Object detection testing
- `YoloScenes/Segmentation.unity` - Segmentation testing
Setup:

1. Open the project in Unity 6.0 or later
2. Install required packages:
   - Apple Vision Pro SDK
   - Haptic Gloves SDK (included in this project)
   - XR Interaction Toolkit
3. Configure build settings (see the sketch after this list):
   - Set the platform to visionOS
   - Configure signing and provisioning
   - Set up entitlements
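Switching the build target can also be scripted from the editor. The snippet below is a minimal sketch assuming a `Builds/visionOS` output path and the `Assets/Scenes/DebugVFX.unity` scene location; the menu item name and paths are invented, and this script is not part of the project.

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor-only sketch: builds a visionOS player for one scene.
// Scene path, output path, and menu name are assumptions, not project conventions.
public static class VisionOsBuildSketch
{
    [MenuItem("Build/Build visionOS Player (sketch)")]
    public static void Build()
    {
        var buildOptions = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/DebugVFX.unity" },
            locationPathName = "Builds/visionOS",
            target = BuildTarget.VisionOS,
            options = BuildOptions.None,
        };
        BuildPipeline.BuildPlayer(buildOptions);
    }
}
#endif
```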
Project structure:

    unity/
    ├── Assets/          # Unity project assets
    ├── Hardware/        # Glove design and firmware
    ├── Packages/        # Unity package manifest
    └── ProjectSettings/ # Unity project settings
Usage:

- Open the main scene in `Assets/Scenes/Main.unity`
- Configure the `HapticGlovesManager` in the scene
- Test with the Vision Pro simulator
- Build and deploy to device
Requirements:

- Unity 6.0 or later
- Apple Vision Pro SDK
- XR Interaction Toolkit
- Haptic Gloves SDK