WIP vhar_system

2024-09-30 16:09:33 +02:00
parent f345dcf94e
commit 53a45d62a7
7 changed files with 66 additions and 115 deletions

@@ -28,7 +28,7 @@ Nine \qty{5}{\cm} square cardboards with smooth, white melamine surface, arrange
Their poses were estimated with three \qty{2}{\cm} AprilTag fiducial markers glued on the grid of surfaces.
Similarly, a \qty{2}{\cm} fiducial marker was glued on top of the vibrotactile actuator to detect the finger pose.
Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) filmed the markers to track finger movements relative to the surfaces, as described in \secref[vhar_system]{virtual_real_alignment}.
-The visual textures were displayed on the tangible surfaces using the \OST-\AR headset Microsoft HoloLens~2, running a custom application at \qty{60}{FPS} built with Unity 2021.1 and the Mixed Reality Toolkit (MRTK) 2.7.2\footnoteurl{https://learn.microsoft.com/windows/mixed-reality/mrtk-unity}.
+The visual textures were displayed on the tangible surfaces using the \OST-\AR headset Microsoft HoloLens~2, running a custom application at \qty{60}{FPS} built with Unity 2021.1 and the Mixed Reality Toolkit (MRTK) 2.7.2.
A set of empirical tests allowed us to choose the most suitable transparency and brightness for the visual textures, which were then used throughout the user study.
When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was generated from the corresponding \HaTT haptic texture model and the measured tangential speed of the finger, following the rendering procedure described in \cite{culbertson2014modeling} (\secref[vhar_system]{texture_generation}).
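
For reference, below is a minimal sketch of the marker-based finger tracking described in the hunk above: detect the AprilTags in a webcam frame and express the fingertip marker's position in the frame of a surface marker. The library (pupil_apriltags), tag family (tag36h11), marker IDs, and camera intrinsics are assumptions for illustration, not taken from the paper; the actual alignment pipeline is the one referenced in \secref[vhar_system]{virtual_real_alignment}.

```python
# Sketch only: hypothetical library choice (pupil_apriltags), tag family,
# marker ids, and camera intrinsics; not the system's actual implementation.
import cv2
import numpy as np
from pupil_apriltags import Detector

TAG_SIZE_M = 0.02                              # 2 cm markers, as in the setup
CAMERA_PARAMS = (600.0, 600.0, 320.0, 240.0)   # fx, fy, cx, cy (hypothetical)
FINGER_TAG_ID = 0                              # hypothetical id of the fingertip marker

detector = Detector(families="tag36h11")
cap = cv2.VideoCapture(0)                      # webcam above the surfaces

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(
        gray, estimate_tag_pose=True,
        camera_params=CAMERA_PARAMS, tag_size=TAG_SIZE_M,
    )
    # Separate the fingertip marker from the surface-grid markers.
    poses = {d.tag_id: (d.pose_R, d.pose_t) for d in detections}
    if FINGER_TAG_ID in poses and len(poses) > 1:
        R_f, t_f = poses[FINGER_TAG_ID]
        # Express the finger position in the frame of one surface marker
        # (arbitrary choice here; the real grid has several markers).
        ref_id = next(i for i in poses if i != FINGER_TAG_ID)
        R_s, t_s = poses[ref_id]
        finger_in_surface = R_s.T @ (t_f - t_s)
        print(f"finger rel. to surface {ref_id}: {finger_in_surface.ravel()}")
cap.release()
```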
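The texture-generation step can likewise be sketched in the spirit of the cited procedure: white noise filtered by an autoregressive (AR) model whose parameters depend on the finger's tangential speed. The `lookup_ar_model` helper and its placeholder coefficients are hypothetical stand-ins for a \HaTT texture model, not the actual implementation.

```python
# Sketch only: speed-driven AR synthesis; coefficients and model lookup are
# placeholders, not values from a HaTT model.
import numpy as np

FS = 48_000  # output sample rate in Hz, as in the apparatus

def lookup_ar_model(speed_mps):
    """Return (AR coefficients, noise std) for the current tangential speed.

    Hypothetical stand-in: a real texture model interpolates parameter sets
    recorded at several speed/force conditions.
    """
    coeffs = np.array([1.2, -0.5])            # placeholder AR(2) coefficients
    sigma = 0.01 * (1.0 + 10.0 * speed_mps)   # vibration power grows with speed
    return coeffs, sigma

def synthesize_block(speed_mps, n_samples, history):
    """Generate one block of vibration samples, updating the filter history."""
    coeffs, sigma = lookup_ar_model(speed_mps)
    out = np.empty(n_samples)
    for n in range(n_samples):
        excitation = np.random.normal(0.0, sigma)
        # AR recursion: y[n] = a1*y[n-1] + a2*y[n-2] + ... + white noise
        out[n] = np.dot(coeffs, history[:len(coeffs)]) + excitation
        history = np.concatenate(([out[n]], history[:-1]))
    return out, history

# Example: synthesize a 10 ms block while the finger moves at 5 cm/s.
history = np.zeros(8)
block, history = synthesize_block(0.05, FS // 100, history)
```

A real-time version of such a sketch would stream these blocks to the vibrotactile actuator through a low-latency audio output, refreshing the speed estimate at every block.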