Fix acronyms

2024-09-24 15:47:33 +02:00
parent 2dad3efdd0
commit ef188c1993
26 changed files with 165 additions and 159 deletions


@@ -2,7 +2,7 @@
%
In this section, we describe a system for rendering vibrotactile roughness textures in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement, and using a simple camera to track the finger pose.
%
-We also describe how to pair this tactile rendering with an immersive AR or VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the real environment.
+We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the real environment.
\section{Principle}
\label{principle}
@@ -36,7 +36,7 @@ The system is composed of three main components: the pose estimation of the trac
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to a participant's right index finger.
-\item HoloLens~2 AR headset, the two cardboard masks used to switch between the real and virtual environments while keeping the same field of view, and the 3D-printed piece for attaching the masks to the headset.
+\item HoloLens~2 \AR headset, the two cardboard masks used to switch between the real and virtual environments while keeping the same field of view, and the 3D-printed piece for attaching the masks to the headset.
\item User exploring a virtual vibrotactile texture on a tangible sheet of paper.
]
\subfig[0.325]{device}
@@ -70,7 +70,7 @@ In addition, the pose and size of the virtual textures are defined on the virtua
%
During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts. %, according to the condition being tested.
%
-This allows the system to detect whether a finger touches a virtual texture using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the real environment (\figref{renderings}), using the corresponding AR or VR headset.
+This allows the system to detect whether a finger touches a virtual texture using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the real environment (\figref{renderings}), using the corresponding \AR or \VR headset.
In our implementation, the virtual hand and environment are designed with Unity and the Mixed Reality Toolkit (MRTK).
%
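The alignment step described above can be illustrated outside of Unity: given the marker pose estimated from the camera image, the virtual model is placed by composing that pose with a fixed marker-to-model offset, and a contact is declared when the tracked fingertip lies on the texture patch. The following Python sketch is only illustrative; it stands in for the Unity/PhysX implementation, and the transform names, patch size and tolerance are hypothetical.

# Illustrative sketch, not the Unity/PhysX implementation: align a virtual
# texture patch with its real-world counterpart from a marker pose estimate,
# then test whether the tracked fingertip touches the patch.
import numpy as np

def align_virtual_model(T_camera_marker, T_marker_model):
    """Pose of the virtual model in camera coordinates (4x4 homogeneous transforms)."""
    return T_camera_marker @ T_marker_model

def fingertip_on_texture(p_finger_cam, T_camera_texture, half_extents, tol=0.003):
    """Naive point-vs-rectangle contact test standing in for PhysX collision
    detection: half_extents is the (half width, half height) of the texture
    patch in metres, tol the allowed distance to its surface plane."""
    p_local = np.linalg.inv(T_camera_texture) @ np.append(p_finger_cam, 1.0)
    x, y, z = p_local[:3]
    return abs(z) < tol and abs(x) < half_extents[0] and abs(y) < half_extents[1]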
@@ -80,7 +80,7 @@ It was chosen over VST-AR because OST-AR only adds virtual content to the real e
%
Indeed, one of our objectives (\secref{experiment}) is to directly compare a real environment with a virtual environment that replicates it. %, rather than a video feed that introduces many supplementary visual limitations.
%
-To simulate a VR headset, a cardboard mask (with holes for sensors) is attached to the headset to block the view of the real environment (\figref{headset}).
+To simulate a \VR headset, a cardboard mask (with holes for sensors) is attached to the headset to block the view of the real environment (\figref{headset}).
\section{Vibrotactile Signal Generation and Rendering}
\label{texture_generation}
@@ -139,7 +139,7 @@ The tactile texture is described and rendered in this work as a one dimensional
%As shown in \figref{diagram} and described above, the system includes various haptic and visual sensors and rendering devices linked by software processes for image processing, 3D rendering and audio generation.
%
-Because the chosen AR headset is a standalone device (like most current AR/VR headsets) and cannot directly control the sound card and haptic actuator, the image capture, pose estimation and audio signal generation steps are performed on an external computer.
+Because the chosen \AR headset is a standalone device (like most current AR/VR headsets) and cannot directly control the sound card and haptic actuator, the image capture, pose estimation and audio signal generation steps are performed on an external computer.
%
Each computation step runs in its own thread to parallelize the pipeline and reduce latency, and the results are synchronised with the headset over a local network using the ZeroMQ library.
%
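As a minimal sketch of this computer-to-headset link, the snippet below publishes the latest finger pose with ZeroMQ over the local network; the endpoint, topic name and message layout are assumptions for illustration, not the system's actual protocol.

# Illustrative ZeroMQ publisher on the external computer, streaming the latest
# finger pose estimate to the headset. Endpoint, topic and message format are
# assumptions, not the protocol actually used by the system.
import json
import time
import zmq

context = zmq.Context()
socket = context.socket(zmq.PUB)
socket.bind("tcp://*:5556")  # the headset subscribes to this endpoint

def publish_pose(position_m, quaternion_wxyz):
    msg = {"t": time.time(), "p": position_m, "q": quaternion_wxyz}
    socket.send_multipart([b"finger_pose", json.dumps(msg).encode("utf-8")])

# One dummy pose; in practice this is called once per processed camera frame.
publish_pose([0.10, 0.02, 0.35], [1.0, 0.0, 0.0, 0.0])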
@@ -157,7 +157,7 @@ The haptic loop also includes the voice-coil latency \qty{15}{\ms} (as specified
%
The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback \cite{okamoto2009detectability}.
%
-The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking \cite{knorlein2009influence}.
+The total visual latency can be considered slightly high, yet it is typical for an \AR rendering involving vision-based tracking \cite{knorlein2009influence}.
The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 +- 30}{\ms}.
%
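The position and velocity filters themselves are not detailed in this excerpt; as a generic illustration of why such smoothing adds a roughly constant lag, the sketch below applies a one-pole exponential low-pass to the tracked fingertip position and to its finite-difference velocity. The filter type and coefficients are assumptions, not the ones used in the system.

# Generic smoothing example (assumed, not the system's actual filters): a
# one-pole exponential low-pass on position and on finite-difference velocity.
# Smaller alpha means stronger smoothing but a longer, roughly constant lag.
class LowPass:
    def __init__(self, alpha):
        self.alpha = alpha
        self.y = None

    def update(self, x):
        self.y = x if self.y is None else self.alpha * x + (1.0 - self.alpha) * self.y
        return self.y

pos_filter, vel_filter = LowPass(alpha=0.2), LowPass(alpha=0.2)
prev_pos, prev_t = None, None

def filter_sample(raw_pos, t):
    """Return smoothed position and velocity for one tracking sample
    (raw_pos may be a scalar or a numpy array, t a timestamp in seconds)."""
    global prev_pos, prev_t
    pos = pos_filter.update(raw_pos)
    vel = 0.0
    if prev_pos is not None and t > prev_t:
        vel = vel_filter.update((pos - prev_pos) / (t - prev_t))
    prev_pos, prev_t = pos, t
    return pos, vel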