Fix acronyms

2024-09-24 15:47:33 +02:00
parent 2dad3efdd0
commit ef188c1993
26 changed files with 165 additions and 159 deletions
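The hunks below replace literal "AR" with an \AR macro. Below is a minimal, hypothetical sketch of how such an acronym macro could be defined; the actual definition is not shown in this diff, and both the xspace and glossaries variants are assumptions for illustration only.

% Hypothetical preamble sketch; not part of this commit.
\usepackage{xspace}
\newcommand{\AR}{AR\xspace}  % abbreviation that keeps correct spacing after the macro
% Alternatively, the glossaries package can expand the acronym on first use:
% \usepackage[acronym]{glossaries}
% \newacronym{ar}{AR}{augmented reality}
% \newcommand{\AR}{\gls{ar}\xspace}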

@@ -1,10 +1,10 @@
Augmented reality (AR) integrates virtual content into our real-world surroundings, giving the illusion of a single, unified environment and promising natural and seamless interactions with real and virtual objects.
%
-Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment \cite{laviolajr20173d, kim2018revisiting}.
+Virtual object manipulation is particularly critical for useful and effective \AR usage, such as in medical applications, training, or entertainment \cite{laviolajr20173d, kim2018revisiting}.
%
Hand tracking technologies \cite{xiao2018mrtouch}, grasping techniques \cite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real \cite{piumsomboon2014graspshell}, without requiring controllers \cite{krichenbauer2018augmented}, gloves \cite{prachyabrued2014visual}, or predefined gesture techniques \cite{piumsomboon2013userdefined, ha2014wearhand}.
%
-Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction \cite{kim2018revisiting}.
+Optical see-through \AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction \cite{kim2018revisiting}.
However, there are still several haptic and visual limitations that affect manipulation in OST-AR, degrading the user experience.
%
@@ -14,23 +14,23 @@ Similarly, it is challenging to ensure confident and realistic contact with a vi
%
These limitations also make it difficult to confidently move a grasped object towards a target \cite{maisto2017evaluation, meli2018combining}.
-To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands, but have not been studied together in an AR context: visual hand rendering and delocalized haptic rendering.
+To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands, but have not been studied together in an \AR context: visual hand rendering and delocalized haptic rendering.
%
-A few works explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects \cite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent \cite{ha2014wearhand, piumsomboon2014graspshell} or opaque \cite{blaga2017usability, yoon2020evaluating, saito2021contact}.
+A few works explored the effect of a visual hand rendering on interactions in \AR by simulating mutual occlusion between the real hand and virtual objects \cite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent \cite{ha2014wearhand, piumsomboon2014graspshell} or opaque \cite{blaga2017usability, yoon2020evaluating, saito2021contact}.
%
Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible \cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
%
-However, the role of a visual hand rendering superimposed and seen above the real tracked hand has not yet been investigated in AR.
+However, the role of a visual hand rendering superimposed and seen above the real tracked hand has not yet been investigated in \AR.
%
-Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR \cite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
+Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in \AR \cite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
%
-But haptic rendering for AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking \cite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment \cite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
+But haptic rendering for \AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking \cite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment \cite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
%
-Therefore, the haptic feedback of the fingertip contact with the virtual environment needs to be rendered elsewhere on the hand, but it is unclear which positioning should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in AR.
+Therefore, the haptic feedback of the fingertip contact with the virtual environment needs to be rendered elsewhere on the hand, but it is unclear which positioning should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in \AR.
%
A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred \cite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
%
-In fact, either hand rendering alone may provide sufficient sensory cues for efficient manipulation of virtual objects in AR, or, conversely, the two may prove complementary.
+In fact, either hand rendering alone may provide sufficient sensory cues for efficient manipulation of virtual objects in \AR, or, conversely, the two may prove complementary.
In this paper, we investigate the role of the visuo-haptic rendering of the hand during 3D manipulation of virtual objects in OST-AR.
%
@@ -43,7 +43,7 @@ The main contributions of this work are:
\end{itemize}
\begin{subfigs}{hands}{The six visual hand renderings}[
-Depicted as seen by the user through the AR headset during the two-finger grasping of a virtual cube.
+Depicted as seen by the user through the \AR headset during the two-finger grasping of a virtual cube.
][
\item No visual rendering \emph{(None)}.
\item Cropped virtual content to enable hand-cube occlusion \emph{(Occlusion, Occl)}.