From 85df9fe164c9a0af0d91eb83f0a84d7b8485beaf Mon Sep 17 00:00:00 2001
From: Erwan Normand
Date: Wed, 9 Apr 2025 15:06:26 +0200
Subject: [PATCH] Better notation of software versions

---
 3-perception/vhar-system/2-method.tex       | 2 +-
 3-perception/vhar-textures/2-experiment.tex | 2 +-
 3-perception/xr-perception/3-experiment.tex | 2 +-
 4-manipulation/visual-hand/2-method.tex     | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/3-perception/vhar-system/2-method.tex b/3-perception/vhar-system/2-method.tex
index a3a2bf8..6409c19 100644
--- a/3-perception/vhar-system/2-method.tex
+++ b/3-perception/vhar-system/2-method.tex
@@ -60,7 +60,7 @@ In addition, the pose and size of the virtual textures were defined on the virtu
 During the experiment, the system uses marker pose estimates to align the virtual models with their real world counterparts. %, according to the condition being tested.
 This allows to detect if a finger touches a virtual texture using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real-time, aligned with the \RE, using the considered \AR or \VR headset.
-In our implementation, the \VE is designed with Unity and the Mixed Reality Toolkit (MRTK)\footnoteurl{https://learn.microsoft.com/windows/mixed-reality/mrtk-unity}.
+In our implementation, the \VE is designed with Unity (v2021.1) and the Mixed Reality Toolkit (MRTK, v2.7)\footnoteurl{https://learn.microsoft.com/windows/mixed-reality/mrtk-unity}.
 The visual rendering is achieved using the Microsoft HoloLens~2, an \OST-\AR headset with a \qtyproduct{43 x 29}{\degree} \FoV, a \qty{60}{\Hz} refresh rate, and self-localisation capabilities.
 A \VST-\AR or a \VR headset could have been used as well.
diff --git a/3-perception/vhar-textures/2-experiment.tex b/3-perception/vhar-textures/2-experiment.tex
index ef098e5..9fe449f 100644
--- a/3-perception/vhar-textures/2-experiment.tex
+++ b/3-perception/vhar-textures/2-experiment.tex
@@ -28,7 +28,7 @@ Nine \qty{5}{\cm} square cardboards with smooth, white melamine surface, arrange
 Their poses were estimated with three \qty{2}{\cm} AprilTag fiducial markers glued on the surfaces grid.
 Similarly, a \qty{2}{\cm} fiducial marker was glued on top of the vibrotactile actuator to detect the finger pose.
 Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) filmed the markers to track finger movements relative to the surfaces, as described in \secref[vhar_system]{virtual_real_alignment}.
-The visual textures were displayed on the real surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
+The visual textures were displayed on the real surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity (v2021.1) and the Mixed Reality Toolkit (MRTK, v2.7).
 A set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the visual textures, that were used throughout the user study.
 When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was generated using the rendering procedure described in \cite{culbertson2014modeling} from the corresponding \HaTT haptic texture model and the measured tangential speed of the finger (\secref[vhar_system]{texture_generation}).
diff --git a/3-perception/xr-perception/3-experiment.tex b/3-perception/xr-perception/3-experiment.tex
index 727a619..4b1298a 100644
--- a/3-perception/xr-perception/3-experiment.tex
+++ b/3-perception/xr-perception/3-experiment.tex
@@ -31,7 +31,7 @@ It consisted of a \qtyproduct{300 x 210 x 400}{\mm} medium-density fibreboard (M
 A single light source of \qty{800}{\lumen} placed \qty{70}{\cm} above the table fully illuminated the inside of the box.
 Participants rated the roughness of the paper (without any texture augmentation) before the experiment on a 7-point Likert scale (1~=~Extremely smooth, 7~=~Extremely rough) as quite smooth (\mean{2.5}, \sd{1.3}).
-The visual rendering of the virtual hand and environment was achieved using the \OST-\AR headset Microsoft HoloLens~2 (\secref[vhar_system]{virtual_real_alignment}) running at \qty{60}{FPS} a custom application made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
+The visual rendering of the virtual hand and environment was achieved using the \OST-\AR headset Microsoft HoloLens~2 (\secref[vhar_system]{virtual_real_alignment}) running at \qty{60}{FPS} a custom application made with Unity (v2021.1) and the Mixed Reality Toolkit (MRTK, v2.7).
 An \OST-\AR headset was chosen over a \VST-\AR headset because the former only adds virtual content to the \RE, while the latter streams a real-time video capture of the \RE, and one of our objectives was to directly compare a \VE replicating a real one, not to a video feed that introduces many other visual limitations (\secref[related_work]{ar_displays}).
 The \VE carefully reproduced the \RE, including the geometry of the box, textures, lighting, and shadows (\figref{renderings}, \level{Virtual}).
diff --git a/4-manipulation/visual-hand/2-method.tex b/4-manipulation/visual-hand/2-method.tex
index bf8bac5..566749f 100644
--- a/4-manipulation/visual-hand/2-method.tex
+++ b/4-manipulation/visual-hand/2-method.tex
@@ -105,7 +105,7 @@ We used the \OST-\AR headset HoloLens~2, as described in \secref[vhar_system]{vi
 It is also able to track the user's fingers.
 We measured the latency of the hand tracking at \qty{15}{\ms}, independent of the hand movement speed.
-The implementation of our experiment was done using Unity 2022.1, PhysX 4.1, and the Mixed Reality Toolkit (MRTK) 2.8.
+The implementation of our experiment was done using Unity (v2022.1), PhysX (v4.1), and the Mixed Reality Toolkit (MRTK, v2.8).
 The compiled application ran directly on the HoloLens~2 at \qty{60}{FPS}.
 The default \ThreeD hand model from MRTK was used for all visual hand augmentations.
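The patch writes the "Name (vX.Y)" notation by hand in each file. A small LaTeX macro pair (a hypothetical sketch, not part of the thesis sources — the names `\software` and `\softwareabbr` are assumptions) would keep the notation consistent and make future version bumps a one-line change:

```latex
% Hypothetical helpers for uniform software version notation.
% \software{Unity}{2021.1} typesets as: Unity (v2021.1)
\newcommand{\software}[2]{#1 (v#2)}
% \softwareabbr{Mixed Reality Toolkit}{MRTK}{2.7} typesets as:
% Mixed Reality Toolkit (MRTK, v2.7)
\newcommand{\softwareabbr}[3]{#1 (#2, v#3)}
```

With such macros, the four edited sentences could read, e.g., `\software{Unity}{2021.1}` and `\softwareabbr{Mixed Reality Toolkit}{MRTK}{2.7}`, and the version numbers would live in one place per chapter preamble.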