\label{ar_applications}

Advances in technology, research and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration and entertainment applications~\cite{dey2018systematic}.
For example, \AR can provide surgery training simulations in safe conditions~\cite{harders2009calibration} (\figref{harders2009calibration}), or help students learn complex concepts and phenomena such as optics or chemistry~\cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance or verification~\cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers~\cite{lee2013spacetop} (\figref{lee2013spacetop}), or create completely new forms of gaming or tourism experiences~\cite{roo2017inner} (\figref{roo2017inner}).
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.
Yet, the user experience in \AR is still highly dependent on the display used.

\begin{subfigs}{ar_applications}{Examples of \AR applications. }[
%\item Neurosurgery \AR visualization of the brain on a patient's head~\cite{watanabe2016transvisible}.
\item Visuo-haptic surgery training with cutting into virtual soft tissues~\cite{harders2009calibration}.
%\item HOBIT is a spatial, tangible \AR table simulating an optical bench for educational experimentations~\cite{bousquet2024reconfigurable}.
\item \AR can interactively guide users in document verification tasks by recognizing documents and comparing them with virtual references~\cite{hartl2013mobile}.
\item SpaceTop is a transparent \AR desktop computer featuring direct hand manipulation of \ThreeD content~\cite{lee2013spacetop}.
\item Inner Garden is a spatial \AR zen garden made of real sand, visually augmented to create a mini world that can be reshaped by hand~\cite{roo2017inner}.
]
\subfigsheight{41mm}
%\subfig{watanabe2016transvisible}
\subfig{harders2009calibration}
\subfig{hartl2013mobile}
\subfig{lee2013spacetop}
\subfig{roo2017inner}
In \VST-\AR, the virtual images are superimposed on images of the \RE captured by a camera.
This augmented view through the camera has the advantage of complete control over the real-virtual combination, such as mutual occlusion between real and virtual objects~\cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images~\cite{kruijff2010perceptual}.
However, due to the camera and the screen, the user's view is degraded, with a lower resolution, frame rate and field of view, and an overall visual latency compared to proprioception~\cite{kruijff2010perceptual}.
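Because \VST-\AR renders the camera image itself, mutual occlusion can be resolved per pixel by comparing the estimated depth of the real scene with the depth of the virtual content before compositing. A minimal sketch of this idea (the function name and array layout are illustrative, not taken from the cited works):

```python
import numpy as np

def composite_vst(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask):
    """Per-pixel mutual occlusion compositing for VST-AR (illustrative sketch):
    a virtual pixel is displayed only where virtual content exists (virt_mask)
    and lies closer to the camera than the real surface at that pixel."""
    show_virtual = virt_mask & (virt_depth < real_depth)
    out = real_rgb.copy()          # start from the camera image
    out[show_virtual] = virt_rgb[show_virtual]  # overwrite where virtual wins
    return out
```

In practice the real depth map would come from a depth sensor or reconstruction, whose noise and holes are a key difficulty for occlusion quality~\cite{macedo2023occlusion}.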

An \OST-\AR display directly combines the virtual images with the real world view using a transparent optical system~\cite{itoh2022indistinguishable}, such as augmented glasses, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE at the cost of more difficult registration (spatial misalignment or temporal latency between the real and virtual content)~\cite{grubert2018survey} and mutual real-virtual occlusion~\cite{macedo2023occlusion}.

Finally, projection-based \AR overlays the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear a display, but requires a physical surface to project onto.
\end{subfigs}

Regardless of the \AR display used, it can be placed at different locations~\cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also use \OST or \VST fixed windows (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight~\cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn, like \VR headsets or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing resources and algorithms, are the most common \AR displays today, but research and development promise more immersive and interactive \AR with headset displays~\cite{billinghurst2021grand}.
Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
This is especially true in \OST-\AR, where the \VOs are slightly transparent, allowing the paired tangibles to be seen through them.
In a pick-and-place task with tangibles of different shapes, differences in size~\cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape~\cite{kahl2023using} (\figref{kahl2023using}) between the tangibles and the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} in size) were not even noticed by the users.
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, we described in \secref{tactile_rendering} how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices~\cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions when directly touching tangibles with the hand in \AR.

\begin{subfigs}{tangibles_manipulation}{Manipulating \VOs with tangibles. }[
Heuristic techniques use rules to determine the selection, manipulation and release of the \VOs.
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the contact points between the virtual hand and the \VO.
In particular, \textcite{borst2006spring} have proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method~\cite{zilles1995constraintbased}: the virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact.
The forces acting on the object are computed as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena~\cite{talvas2013godfinger} and finger deformations~\cite{talvas2015aggregate}, allowing highly accurate and realistic interactions that can, however, be difficult to compute in real time.

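The coupling between the tracked and simulated hands is commonly modelled as a spring-damper between the real and virtual phalanx poses. A minimal sketch of such a force law, with assumed stiffness $k$ and damping $b$ gains (the exact formulation and gains of~\textcite{borst2006spring} may differ), is:

\begin{equation}
\mathbf{f} = k \left( \mathbf{x}_{\mathrm{real}} - \mathbf{x}_{\mathrm{virtual}} \right) + b \left( \dot{\mathbf{x}}_{\mathrm{real}} - \dot{\mathbf{x}}_{\mathrm{virtual}} \right)
\end{equation}

with an analogous torsional spring-damper acting on the orientations, so that the virtual phalanx is pulled toward its tracked pose while remaining constrained by contacts.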
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[
\item A fingertip-tracking technique that enables selecting a \VO by opening the hand~\cite{lee2007handy}.