Between these two extremes lies \MR, which comprises \AR and \VR as different levels of virtuality.
%
\AR/\VR is most often understood as addressing only the visual sense, yet, like haptics, it can take many forms as a user interface.
%
The most promising devices are \AR headsets, portable displays worn directly on the head, providing the user with an immersive \AE/\VE.

\begin{subfigs}{rv-continuums}{Reality-virtuality (\RV) continuums. }[
\item Original \RV continuum for the visual sense initially proposed by and adapted from \textcite{milgram1994taxonomy}.
]
\end{subfigs}

Yet, the user experience in \AR is still highly dependent on the display used.

\subsubsection{AR Displays}
\label{ar_displays}

\AR systems render virtual content registered in the \RE to the user's senses via output \UIs.
There is a large variety of \AR displays, with different methods of combining the real and virtual content (\VST, \OST, or projected) and different locations on the \RE or the user~\cite{billinghurst2015survey,bimber2005spatial}.

In \VST-\AR, the virtual images are superimposed on images of the \RE captured by a camera~\cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.
This augmented view through the camera has the advantage of complete control over the real-virtual combination, such as mutual occlusion between real and virtual objects~\cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images~\cite{kruijff2010perceptual}.
However, due to the camera and the screen, the user's view is degraded, with a lower resolution, frame rate, and field of view, and an overall visual latency compared to proprioception~\cite{kruijff2010perceptual}.

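As a simplified model of this combination (the notation is ours, not taken from the cited surveys), the image shown on the screen can be written as a per-pixel alpha blend of the rendered virtual layer over the camera image:
\[
I_{\text{screen}}(p) = \alpha(p)\, I_{\text{virtual}}(p) + \bigl(1 - \alpha(p)\bigr)\, I_{\text{camera}}(p),
\]
where $\alpha(p) \in [0, 1]$ is the opacity of the virtual layer at pixel $p$: mutual occlusion then amounts to setting $\alpha(p) = 0$ wherever a real object is estimated to be closer than the virtual one~\cite{macedo2023occlusion}.
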
In \OST-\AR, the virtual images are directly combined with the real-world view using a transparent optical system, like augmented glasses~\cite{itoh2022indistinguishable}, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{watanabe2016transvisible} and \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE, at the cost of a more difficult registration (spatial misalignment or temporal latency between the real and virtual content)~\cite{grubert2018survey} and of mutual real-virtual occlusion~\cite{macedo2023occlusion}.

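The difficulty of mutual occlusion can be explained with a similarly simplified model (again, our notation): the optical combiner adds the display light to the real-world light instead of replacing it, so the image reaching the eye is approximately
\[
I_{\text{eye}}(p) \approx I_{\text{real}}(p) + I_{\text{display}}(p), \qquad I_{\text{display}}(p) \geq 0,
\]
meaning that virtual pixels can only brighten the view: a virtual object cannot darken or hide a bright real surface behind it without dedicated occlusion-capable optics~\cite{itoh2022indistinguishable}.
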
Finally, in projection-based \AR, the virtual images are overlaid on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear a display, but it requires a physical surface to project the virtual content on, and it is vulnerable to shadows cast by the user or by real objects~\cite{billinghurst2015survey}.

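Projection adds a related constraint: the perceived color depends on the surface the image falls on. Under a simplified per-pixel radiometric model (our notation), displaying a target color $C(p)$ on a surface of reflectance $S(p)$ under ambient light $E(p)$ requires projecting
\[
I_{\text{proj}}(p) = \frac{C(p)}{S(p)} - E(p),
\]
which is only feasible if the surface reflects enough light ($S(p) > 0$) and the projector is bright enough, hence the need for a suitable physical projection surface.
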
\begin{subfigs}{ar_displays}{Simplified operating diagram of \AR display methods. }[
\item \VST-\AR \cite{itoh2022indistinguishable}.
\item \OST-\AR \cite{itoh2022indistinguishable}.
\item Spatial \AR \cite{roo2017one}.
]
\subfigsheight{44mm}
\subfig{itoh2022indistinguishable_vst}
\subfig{itoh2022indistinguishable_ost}
\subfig{roo2017one_2}
\end{subfigs}

Moreover, like most stereoscopic displays, \AR headsets suffer from the vergence-accommodation conflict: the eyes converge at the apparent depth of the virtual object, while they accommodate to the fixed focal plane of the display, which causes visual discomfort and fatigue~\cite{kruijff2010perceptual}.
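As a rough quantification (our notation, not from a specific source), the conflict can be expressed as the dioptric mismatch between the vergence distance $d_v$ of the virtual object and the fixed focal distance $d_f$ of the display:
\[
\Delta = \left| \frac{1}{d_v} - \frac{1}{d_f} \right| ,
\]
in diopters: for instance, a virtual object rendered at arm's length ($d_v = 0.5$\,m) on a display focused at $d_f = 2$\,m yields a mismatch of $\Delta = 1.5$ diopters.
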
Using a \VST-\AR headset has notable consequences, as the "real" view of the environment and of the hand is actually a video stream from a camera, which has a noticeable delay and a lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the \RE with \OST-\AR~\cite{macedo2023occlusion}.

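To make this point concrete, the end-to-end latency of the camera view in \VST-\AR can be sketched as a sum of stages (an illustrative decomposition, not a measured budget):
\[
t_{\text{motion-to-photon}} \approx t_{\text{capture}} + t_{\text{processing}} + t_{\text{rendering}} + t_{\text{display}},
\]
whereas in \OST-\AR only the virtual content is subject to this latency and the real view remains instantaneous, which explains the temporal registration issue mentioned above~\cite{grubert2018survey}.
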
%Distances are underestimated~\cite{adams2022depth,peillard2019studying}.
% billinghurst2021grand

Regardless of the \AR display used, it can be placed at different locations~\cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also be an optical or video see-through window (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight~\cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn, like \VR headsets or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing resources, and algorithms, are today's most common \AR displays, but research and development promise more immersive and interactive \AR with headset displays~\cite{billinghurst2021grand}.

\fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}

\subsubsection{Presence and Embodiment in AR}
\label{ar_presence_embodiment}

The \emph{system control tasks} are changes to the system state through commands.

\item Spatial selection of a virtual item of an extended display using a hand-held smartphone~\cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE~\cite{grubert2017pervasive}.
\item Virtual drawing on a tangible object with a hand-held pen~\cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion~\cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable registering the \VE in it (a minimal projection model is sketched below).
]
\subfigsheight{36mm}
\subfig{grubert2015multifi}
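
As a minimal sketch of such registration (standard pinhole notation, with our choice of symbols): given the camera pose estimated by SLAM as a rigid transform $T_{CW} \in SE(3)$ from world to camera coordinates, a virtual point anchored at $X_W$ in the \RE is drawn at pixel
\[
p = \pi\!\left(K \, T_{CW} \, X_W\right),
\]
where $K$ is the camera intrinsic matrix and $\pi$ the perspective division; re-estimating $T_{CW}$ at every frame keeps the \VE registered with the \RE.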
\acronym{DoF}{degree of freedom}
\acronym{ERM}{eccentric rotating mass}
\acronym{h}{haptic}
\acronym{HMD}{head-mounted display}
\acronym{JND}{just noticeable difference}
\acronym{LRA}{linear resonant actuator}
\acronym{MR}{mixed reality}

@article{grubert2018survey,
  title = {A {{Survey}} of {{Calibration Methods}} for {{Optical See-Through Head-Mounted Displays}}},
  author = {Grubert, Jens and Itoh, Yuta and Moser, Kenneth and Swan, J. Edward},
  date = {2018},
  journaltitle = {IEEE Trans. Vis. Comput. Graph.},
  volume = {24},
  number = {9},
  pages = {2649--2662},
  doi = {10/gd3k7g}
}

@incollection{gueorguiev2022larger,
  title = {Larger {{Skin-Surface Contact Through}} a {{Fingertip Wearable Improves Roughness Perception}}},
  booktitle = {Haptics: {{Science}}, {{Technology}}, {{Applications}}}
}

@article{itoh2022indistinguishable,
  title = {Towards {{Indistinguishable Augmented Reality}}: {{A Survey}} on {{Optical See-through Head-mounted Displays}}},
  author = {Itoh, Yuta and Langlotz, Tobias and Sutton, Jonathan and Plopski, Alexander},
  date = {2022},
  journaltitle = {ACM Comput. Surv.},
  volume = {54},
  number = {6},
  pages = {1--36},
  doi = {10/gmk7c8}
}

@inproceedings{jacob2008realitybased,
  title = {Reality-Based Interaction: A Framework for Post-{{WIMP}} Interfaces},
  shorttitle = {Reality-Based Interaction}
}

@inproceedings{kruijff2010perceptual,
  title = {Perceptual Issues in Augmented Reality Revisited},
  booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
  author = {Kruijff, Ernst and Swan, J. Edward and Feiner, Steven},
  date = {2010},
  pages = {3--12},
  doi = {10/fr6gz3}
}

@article{kuchenbecker2006improving,
  title = {Improving Contact Realism through Event-Based Haptic Feedback},
  author = {Kuchenbecker, K.J. and Fiene, J. and Niemeyer, G.}
}

@thesis{roo2017one,
  title = {One Reality: Augmenting the Human Experience through the Combination of Physical and Digital Worlds},
  author = {Roo, Joan Sol},
  date = {2017}
}

@inproceedings{roo2017onea,
  title = {One {{Reality}}: {{Augmenting How}} the {{Physical World}} Is {{Experienced}} by Combining {{Multiple Mixed Reality Modalities}}},
  shorttitle = {One {{Reality}}}
}