\section{Introduction}
\label{sec:introduction}

\begin{subfigswide}{hands}{%
Experiment \#1. The six considered visual hand renderings, as seen by the user through the AR headset during the two-finger grasping of a virtual cube.
%
From left to right: %
no visual rendering \emph{(None)}, %
cropped virtual content to enable hand-cube occlusion \emph{(Occlusion, Occl)}, %
rings on the fingertips \emph{(Tips)}, %
thin outline of the hand \emph{(Contour, Cont)}, %
fingers' joints and phalanges \emph{(Skeleton, Skel)}, and %
semi-transparent 3D hand model \emph{(Mesh)}.
}
\subfig[0.15]{method/hands-none}[None]
\subfig[0.15]{method/hands-occlusion}[Occlusion (Occl)]
\subfig[0.15]{method/hands-tips}[Tips]
\subfig[0.15]{method/hands-contour}[Contour (Cont)]
\subfig[0.15]{method/hands-skeleton}[Skeleton (Skel)]
\subfig[0.15]{method/hands-mesh}[Mesh]
\end{subfigswide}

Augmented reality (AR) integrates virtual content into our real-world surroundings, giving the illusion of a single, unified environment and promising natural, seamless interactions with real and virtual objects.
%
Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment~\autocite{laviolajr20173d, kim2018revisiting}.
%
Hand tracking technologies~\autocite{xiao2018mrtouch}, grasping techniques~\autocite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real~\autocite{piumsomboon2014graspshell}, without requiring controllers~\autocite{krichenbauer2018augmented}, gloves~\autocite{prachyabrued2014visual}, or predefined gesture techniques~\autocite{piumsomboon2013userdefined, ha2014wearhand}.
%
Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction~\autocite{kim2018revisiting}.

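To make the role of physics-based contact concrete, the following is a minimal, hypothetical sketch of a contact-based two-finger grasp heuristic of the kind such engines enable. It is not the system used in this work; the sphere-shaped object, threshold value, and all names are illustrative assumptions.

```python
import numpy as np

CONTACT_DIST = 0.005  # contact threshold in metres (assumed value)

def is_grasped(fingertips, contact_normals, obj_center, obj_radius):
    """Decide whether a virtual sphere is grasped.

    fingertips: (N, 3) array of tracked fingertip positions.
    contact_normals: per-tip outward surface normal at the closest
    object point (unit vectors).
    """
    touching = []
    for tip, normal in zip(fingertips, contact_normals):
        # A tip is in contact if it lies within the threshold of the surface.
        if np.linalg.norm(tip - obj_center) - obj_radius < CONTACT_DIST:
            touching.append(normal)
    # Two contacts with roughly opposing normals indicate a stable
    # two-finger pinch (dot product close to -1).
    for i in range(len(touching)):
        for j in range(i + 1, len(touching)):
            if np.dot(touching[i], touching[j]) < -0.5:
                return True
    return False
```

A real engine would instead report contact pairs from collision detection, but the opposing-normals test above captures the basic pinch condition.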
However, several haptic and visual limitations still affect manipulation in OST-AR, degrading the user experience.
%
For example, it is difficult to estimate the position of one's hand relative to virtual content because mutual occlusion between the hand and the virtual object is often lacking~\autocite{macedo2023occlusion}, the depth of virtual content is underestimated~\autocite{diaz2017designing, peillard2019studying}, and hand tracking still exhibits noticeable latency~\autocite{xiao2018mrtouch}.
%
Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which cannot apply physical constraints to the hand~\autocite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
%
These limitations also make it difficult to confidently move a grasped object towards a target~\autocite{maisto2017evaluation, meli2018combining}.

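As an illustration of the mutual-occlusion problem, cropping virtual content behind the real hand (as in the \emph{Occlusion} rendering of the figure) amounts to a per-pixel depth test. The sketch below is a generic formulation under assumed depth-map inputs, not the implementation used in this work.

```python
import numpy as np

def occlusion_mask(hand_depth, virtual_depth, no_hand=np.inf):
    """Return a boolean mask: True where a virtual pixel should be drawn.

    hand_depth, virtual_depth: per-pixel distances from the camera in
    metres; `no_hand` marks pixels where no hand was detected.
    A virtual pixel is kept if there is no hand there, or if the virtual
    surface is closer to the camera than the hand.
    """
    return (hand_depth == no_hand) | (virtual_depth <= hand_depth)
```

In practice an OST-AR renderer achieves the same effect by writing the tracked hand geometry into the depth buffer before compositing virtual content.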
To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands but have not been studied together in an AR context: visual hand rendering and delocalized haptic rendering.
%
A few works have explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects~\autocite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or by displaying a 3D virtual hand model, either semi-transparent~\autocite{ha2014wearhand, piumsomboon2014graspshell} or opaque~\autocite{blaga2017usability, yoon2020evaluating, saito2021contact}.
%
Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible~\autocite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
%
However, the role of a visual hand rendering superimposed on the real tracked hand has not yet been investigated in AR.
%
In parallel, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR~\autocite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
%
However, haptic rendering for AR remains a challenge, as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking~\autocite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment~\autocite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
%
The haptic feedback of fingertip contact with the virtual environment therefore needs to be rendered elsewhere on the hand; however, it is unclear which position should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in AR.
%
A final question is whether one of these hand renderings (haptic or visual) should be preferred over the other~\autocite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
%
Either hand rendering alone may provide sufficient sensory cues for efficient manipulation of virtual objects in AR, or, conversely, the two may prove complementary.

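As a concrete illustration of delocalized vibrotactile rendering, the hypothetical sketch below maps a fingertip-contact event to a pulse amplitude for an actuator worn elsewhere on the hand (e.g. on the proximal phalanx or the wrist), since the fingertip itself must stay free. The two techniques shown, and all names and parameters, are illustrative assumptions, not necessarily the techniques compared in our experiments.

```python
def pulse_amplitude(technique, impact_velocity, v_max=1.0):
    """Return a normalised actuator amplitude in [0, 1] for one contact.

    technique: "event" renders a fixed-amplitude cue signalling that
    contact occurred; "velocity" grades the cue with the impact speed
    (clipped at the assumed maximum v_max, in m/s).
    """
    if technique == "event":
        return 1.0
    elif technique == "velocity":
        return min(abs(impact_velocity) / v_max, 1.0)
    raise ValueError(f"unknown technique: {technique}")
```

The amplitude would then drive the vibrotactile actuator at whichever delocalized position is being evaluated.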
In this paper, we investigate the role of the visuo-haptic rendering of the hand during 3D manipulation of virtual objects in OST-AR.
%
We consider two representative manipulation tasks: pushing-and-sliding and grasping-and-placing a virtual object.
%
The main contributions of this work are:
%
\begin{itemize}
\item a first human-subject experiment evaluating the performance and user experience of six visual hand renderings superimposed on the real hand; %
\item a second human-subject experiment evaluating the performance and user experience of visuo-haptic hand renderings, comparing two vibrotactile contact techniques delivered at four delocalized positions on the hand and combined with the two most representative visual hand renderings from the first experiment.
\end{itemize}