Replace \autocite => \cite
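For context, a minimal sketch of the two citation commands under an assumed biblatex setup (the preamble, document class, and references.bib file name below are illustrative, not taken from this repository): with biblatex loaded, \autocite delegates to a style-dependent wrapper (e.g. \parencite or \footcite, selected by the autocite option), whereas \cite is the basic citation command that also exists in plain BibTeX and natbib setups, which do not define \autocite.

% Minimal sketch of the before/after citation calls -- assumed setup,
% not this paper's actual preamble; references.bib is a hypothetical
% bibliography file containing the kim2018revisiting entry.
\documentclass{article}
\usepackage[style=numeric]{biblatex}
\addbibresource{references.bib}

\begin{document}
% Before this commit: \autocite applies the style's automatic wrapper.
Direct hand interaction suits OST-AR~\autocite{kim2018revisiting}.
% After this commit: \cite is the basic citation command.
Direct hand interaction suits OST-AR~\cite{kim2018revisiting}.
\printbibliography
\end{document}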
@@ -23,35 +23,35 @@
 Augmented reality (AR) integrates virtual content into our real-world surroundings, giving the illusion of one unique environment and promising natural and seamless interactions with real and virtual objects.
 %
-Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment~\autocite{laviolajr20173d, kim2018revisiting}.
+Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment~\cite{laviolajr20173d, kim2018revisiting}.
 %
-Hand tracking technologies~\autocite{xiao2018mrtouch}, grasping techniques~\autocite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real~\autocite{piumsomboon2014graspshell}, without requiring controllers~\autocite{krichenbauer2018augmented}, gloves~\autocite{prachyabrued2014visual}, or predefined gesture techniques~\autocite{piumsomboon2013userdefined, ha2014wearhand}.
+Hand tracking technologies~\cite{xiao2018mrtouch}, grasping techniques~\cite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real~\cite{piumsomboon2014graspshell}, without requiring controllers~\cite{krichenbauer2018augmented}, gloves~\cite{prachyabrued2014visual}, or predefined gesture techniques~\cite{piumsomboon2013userdefined, ha2014wearhand}.
 %
-Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction~\autocite{kim2018revisiting}.
+Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction~\cite{kim2018revisiting}.

 However, there are still several haptic and visual limitations that affect manipulation in OST-AR, degrading the user experience.
 %
-For example, it is difficult to estimate the position of one's hand in relation to virtual content because mutual occlusion between the hand and the virtual object is often lacking~\autocite{macedo2023occlusion}, the depth of virtual content is underestimated~\autocite{diaz2017designing, peillard2019studying}, and hand tracking still has a noticeable latency~\autocite{xiao2018mrtouch}.
+For example, it is difficult to estimate the position of one's hand in relation to virtual content because mutual occlusion between the hand and the virtual object is often lacking~\cite{macedo2023occlusion}, the depth of virtual content is underestimated~\cite{diaz2017designing, peillard2019studying}, and hand tracking still has a noticeable latency~\cite{xiao2018mrtouch}.
 %
-Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which of course cannot apply physical constraints on the hand~\autocite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
+Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which of course cannot apply physical constraints on the hand~\cite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
 %
-These limitations also make it difficult to confidently move a grasped object towards a target~\autocite{maisto2017evaluation, meli2018combining}.
+These limitations also make it difficult to confidently move a grasped object towards a target~\cite{maisto2017evaluation, meli2018combining}.

 To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands, but have not been studied together in an AR context: visual hand rendering and delocalized haptic rendering.
 %
-A few works explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects~\autocite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent~\autocite{ha2014wearhand, piumsomboon2014graspshell} or opaque~\autocite{blaga2017usability, yoon2020evaluating, saito2021contact}.
+A few works explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects~\cite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent~\cite{ha2014wearhand, piumsomboon2014graspshell} or opaque~\cite{blaga2017usability, yoon2020evaluating, saito2021contact}.
 %
-Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible~\autocite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
+Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
 %
 However, the role of a visual hand rendering superimposed and seen above the real tracked hand has not yet been investigated in AR.
 %
-Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR~\autocite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
+Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR~\cite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
 %
-But haptic rendering for AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking~\autocite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment~\autocite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
+But haptic rendering for AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking~\cite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment~\cite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
 %
 Therefore, the haptic feedback of the fingertip contact with the virtual environment needs to be rendered elsewhere on the hand, but it is unclear which positioning should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in AR.
 %
-A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred~\autocite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
+A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred~\cite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
 %
 In fact, each of these hand renderings might provide sufficient sensory cues on its own for efficient manipulation of virtual objects in AR or, conversely, the two might prove complementary.