AR displays

2024-09-18 15:06:13 +02:00
parent e57959007d
commit 5cfddf8e3f
9 changed files with 65 additions and 19 deletions

@@ -62,30 +62,40 @@ Yet, the user experience in \AR is still highly dependent on the display used.
\end{subfigs}
\subsubsection{AR Displays}
\label{ar_displays}
\AR systems render virtual content registered in the \RE to the user's senses via output \UIs.
To experience virtual content combined and registered with the \RE, an output \UI that displays the \VE to the user is necessary.
There is a large variety of \AR displays, differing in how they combine the real and virtual content (\VST, \OST, or projected) and in where they are located relative to the \RE or the user~\cite{billinghurst2015survey,bimber2005spatial}.
In \VST-\AR, the virtual images are superimposed on images of the \RE captured by a camera~\cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.
This augmented view through the camera has the advantage of complete control over the real-virtual combination, enabling mutual occlusion between real and virtual objects~\cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images~\cite{kruijff2010perceptual}.
However, due to the camera and the screen, the user's view is degraded by a lower resolution, frame rate, and field of view, and by an overall visual latency relative to proprioception~\cite{kruijff2010perceptual}.
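The mutual occlusion enabled by \VST-\AR can be sketched as a minimal per-pixel composition, assuming a depth map of the \RE is available (\eg from a depth camera) and that the virtual layer is rendered with its own depth and alpha buffers; all names and values below are illustrative, not taken from any specific framework.
\begin{verbatim}
import numpy as np

def compose_vst_frame(real_rgb, real_depth, virt_rgb, virt_depth, virt_alpha):
    """Overlay the virtual layer on the camera image, hiding virtual
    pixels that lie behind real surfaces (mutual occlusion)."""
    # A virtual pixel is kept only where it is drawn (alpha > 0) and
    # closer to the camera than the real surface at the same pixel.
    visible = (virt_alpha > 0) & (virt_depth < real_depth)
    alpha = (virt_alpha * visible)[..., None]      # broadcast over RGB
    out = (1.0 - alpha) * real_rgb + alpha * virt_rgb
    return out.astype(np.uint8)

# Example with synthetic data standing in for a camera frame and a
# rendered virtual layer.
h, w = 480, 640
frame = compose_vst_frame(
    real_rgb=np.random.randint(0, 255, (h, w, 3), dtype=np.uint8),
    real_depth=np.full((h, w), 2.0, dtype=np.float32),  # metres
    virt_rgb=np.random.randint(0, 255, (h, w, 3), dtype=np.uint8),
    virt_depth=np.full((h, w), 1.0, dtype=np.float32),  # in front of real
    virt_alpha=np.ones((h, w), dtype=np.float32),
)
\end{verbatim}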
In \OST-\AR, the virtual images are directly combined with the view of the real world by a transparent optical system, such as augmented glasses~\cite{itoh2022indistinguishable}, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{watanabe2016transvisible} and \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE, at the cost of a more difficult registration (spatial misalignment or temporal latency between the real and virtual content)~\cite{grubert2018survey} and of mutual real-virtual occlusion that is harder to achieve~\cite{macedo2023occlusion}.
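Because the real view cannot be delayed to wait for the virtual rendering, any end-to-end latency directly appears as a spatial misalignment. A back-of-envelope estimate, assuming a constant head rotation and a pinhole display model with a known focal length in pixels (the numbers below are illustrative, not measured on any device):
\begin{verbatim}
import math

head_speed_deg_s = 50.0   # moderate head rotation speed (deg/s), assumed
latency_s = 0.020         # 20 ms motion-to-photon latency, assumed
focal_px = 1000.0         # display focal length in pixels, assumed

angular_error_deg = head_speed_deg_s * latency_s
pixel_error = math.tan(math.radians(angular_error_deg)) * focal_px
print(f"~{angular_error_deg:.1f} deg of lag, "
      f"i.e. ~{pixel_error:.0f} px of misalignment")
\end{verbatim}
Under these assumptions, a modest 20~ms of latency already shifts the virtual content by about one degree, that is, roughly 17 pixels, during an ordinary head rotation.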
Finally, projection-based \AR overlays the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear a display, but it needs physical surfaces to project the virtual content on, and it is vulnerable to shadows cast by the user or by real objects~\cite{billinghurst2015survey}.
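For a flat projection surface, the prewarping of the virtual image can be sketched with a single homography, assuming the four corners of the surface are known both in the virtual image and in the projector frame buffer (\eg from a one-off calibration); the coordinates below are placeholders.
\begin{verbatim}
import cv2
import numpy as np

# Corners of the virtual image (source) and the corresponding corners of
# the flat real surface as seen in the projector frame buffer
# (destination), in pixels.
src = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])
dst = np.float32([[210, 140], [1010, 180], [980, 660], [230, 620]])

H = cv2.getPerspectiveTransform(src, dst)            # planar homography
virtual = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in content
projector_frame = cv2.warpPerspective(virtual, H, (1280, 720))
# Displaying projector_frame makes the virtual content appear undistorted
# on the real surface (up to shadows and non-planar geometry).
\end{verbatim}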
\begin{subfigs}{ar_displays}{Simplified operating diagram of \AR display methods. }[
\item \VST-\AR \cite{itoh2022indistinguishable}.
\item \OST-\AR \cite{itoh2022indistinguishable}.
\item Spatial \AR \cite{roo2017one}.
]
\subfigsheight{44mm}
\subfig{itoh2022indistinguishable_vst}
\subfig{itoh2022indistinguishable_ost}
\subfig{roo2017one_2}
\end{subfigs}
Head-worn stereoscopic displays also typically render the virtual content at a fixed focal distance, creating a vergence-accommodation conflict: the eyes converge on the apparent depth of a virtual object while accommodating to the focal plane of the display, which can cause visual discomfort and fatigue.
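As a rough illustration (the notation below is not taken from the cited works): with an inter-pupillary distance $i$, fixating a virtual object rendered at a distance $d_v$ requires a vergence angle of
\[
\theta = 2\arctan\!\left(\frac{i}{2\,d_v}\right),
\]
while accommodation remains driven by the fixed focal distance $d_f$ of the display optics, so the conflict is often quantified by the dioptric mismatch $|1/d_v - 1/d_f|$.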
Using a \VST-\AR headset has notable consequences, as the ``real'' view of the environment and of the hands is actually a video stream from a camera, which has a noticeable delay and a lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the \RE provided by \OST-\AR~\cite{macedo2023occlusion}.
%Distances are underestimated~\cite{adams2022depth,peillard2019studying}.
% billinghurst2021grand
Regardless of its type, an \AR display can be placed at different locations~\cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also rely on optical or video see-through windows (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight~\cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR headsets or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing resources and algorithms, are the most common \AR displays today, but research and development promise more immersive and interactive \AR with headset displays~\cite{billinghurst2021grand}.
\fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}
\subsubsection{Presence and Embodiment in AR}
\label{ar_presence_embodiment}
@@ -173,7 +183,7 @@ The \emph{system control tasks} are changes to the system state through commands
\item Spatial selection of a virtual item on an extended display using a hand-held smartphone~\cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE~\cite{grubert2017pervasive}.
\item Virtual drawing on a tangible object with a hand-held pen~\cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion~\cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable the \VE to be registered in it.
]
\subfigsheight{36mm}
\subfig{grubert2015multifi}