Replace "immersive AR" with "AR headset"

2025-04-11 22:51:10 +02:00
parent f1cf425e7c
commit f8ec931cd6
22 changed files with 94 additions and 101 deletions


@@ -2,7 +2,8 @@
 \label{augmented_reality}
 \AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}.
-Immersive systems such as headsets leave the hands free to interact with virtual objects, promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
+Among the different types of devices, \AR headsets leave the hands free to interact with virtual objects.
+This promises natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
 \subsection{What is Augmented Reality?}
 \label{what_is_ar}
@@ -72,7 +73,7 @@ It doesn't require the user to wear the display, but requires a real surface to
 Regardless of the \AR display, it can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
 \emph{Spatial \AR} usually refers to projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also be an \OST or \VST \emph{fixed window} (\figref{lee2013spacetop}).
 Alternatively, \AR displays can be \emph{hand-held}, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight \cite[p.141]{billinghurst2015survey}.
-Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a highly immersive and portable experience.
+Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a portable experience.
 \fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}
@@ -141,7 +142,7 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
 \label{ve_tasks}
 \textcite{laviolajr20173d} (p.385) classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
-\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for immersive \AR.
+\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for \AR headsets.
 The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
 \emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.
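
As a rough sketch of the \emph{selection} task described in this hunk, pointing-based selection can be reduced to a closest-hit ray cast against proxy shapes. The sphere proxies, scene structure, and names below are assumptions made for illustration, not taken from the cited taxonomies.

```python
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to a sphere proxy, or None if the ray misses."""
    d = direction / np.linalg.norm(direction)  # normalize the pointing direction
    oc = origin - center
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - radius ** 2
    disc = b ** 2 - c                          # discriminant of |o + t*d - c|^2 = r^2
    if disc < 0:
        return None                            # ray misses the proxy sphere
    t = -b - np.sqrt(disc)                     # nearest intersection along the ray
    return t if t >= 0 else None

def select(origin, direction, objects):
    """Return the object whose proxy sphere the ray hits first, or None."""
    hits = [(t, obj) for obj in objects
            if (t := ray_sphere_hit(origin, direction, *obj["sphere"])) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None

# Hypothetical scene: one button 1.2 m up and 0.5 m ahead, pointed at from eye height.
scene = [{"name": "button", "sphere": (np.array([0.0, 1.2, 0.5]), 0.05)}]
picked = select(np.array([0.0, 1.5, 0.0]), np.array([0.0, -0.3, 0.5]), scene)
```

The same closest-hit structure covers the touching and grasping variants mentioned above; only the intersection test changes (fingertip-in-volume instead of ray-vs-proxy).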
@@ -175,12 +176,12 @@ In this thesis we focus on manipulation tasks of virtual content directly with t
 \label{real_virtual_gap}
 In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
-In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale and as an integral part of the \RE.
+With an \AR headset, the \VE can be experienced at a 1:1 scale and as an integral part of the \RE.
 The rendering gap between the real and virtual elements, as described in our interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as narrow, or even not consciously perceived by the user.
 This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.
 As the gap between real and virtual rendering narrows, one could expect interaction with the \VE to be as seamless as with the \RE, which \textcite{jacob2008realitybased} called \emph{reality-based interaction}.
-As of today, an immersive \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
+As of today, an \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
 This enables the \VE to be registered with the \RE, so that the user simply moves to navigate within the virtual content.
 However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
 It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite[p.165]{billinghurst2015survey}.
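
To make the registration step in this hunk concrete, a standard formulation (a sketch; the frames and notation are assumed here, not taken from the thesis) expresses a world-anchored virtual object in the tracked head frame at every rendered frame:

```latex
% Sketch of world-anchored registration (assumed notation; poses T are in SE(3)).
% ^{W}T_{H}: headset pose in the world frame W, from sensors and pose estimation.
% ^{W}T_{O}: pose of a virtual object anchored in W, constant over time.
\begin{equation}
  {}^{H}T_{O} = \left({}^{W}T_{H}\right)^{-1} \, {}^{W}T_{O}
\end{equation}
% Since ^{W}T_{O} is fixed, the object stays registered with the real
% environment while ^{W}T_{H} changes as the user walks around.
```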
@@ -276,8 +277,8 @@ This suggests that a visual hand feedback superimposed on the real hand as a par
 Few works have compared different visual feedback of the virtual hand in \AR or with wearable haptic feedback.
 Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural, but seems to be preferred over mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
-Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
-In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was the most preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
+Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
+In a collaborative task in \OST-\AR \vs \VR headsets, \textcite{yoon2020evaluating} showed that a realistic rendering of the remote partner's hand was preferred over a low-polygon hand and a skeleton-like hand.
 \textcite{genay2021virtual} found that the sense of embodiment with a robotic hand overlay in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
 Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
 Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
@@ -302,7 +303,7 @@ Taken together, these results suggest that a visual augmentation of the hand in
 \AR systems integrate virtual content into the user's perception as if it were part of the \RE.
 \AR headsets now enable real-time pose estimation of the head and hands, and high-quality display of virtual content, while being portable and mobile.
-They enable highly immersive augmented environments that users can explore with a strong sense of the presence of the virtual content.
+They create augmented environments that users can explore with a strong sense of the presence of the virtual content.
 However, without direct and seamless interaction with the virtual objects using the hands, the coherence of the augmented environment experience is compromised.
 In particular, when manipulating virtual objects in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
 A common alternative approach is to use real objects as proxies for interaction with virtual objects, but this raises concerns about their coherence with visual augmentations.