Better figures
@@ -13,7 +13,6 @@ Immersive systems such as headsets leave the hands free to interact with \VOs, p
% \subfig{sutherland1970computer2}
%\end{subfigs}

\subsection{What is Augmented Reality?}
\label{what_is_ar}

@@ -33,7 +32,6 @@ Yet, most of the research has focused on visual augmentations, and the term \AR
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}; the original definition was: \enquote{registered in \ThreeD}.}
%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.
\subsubsection{Applications of AR}
\label{ar_applications}

@@ -43,22 +41,19 @@ It can also guide workers in complex tasks, such as assembly, maintenance or ver

Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering, and display.
Yet, the user experience in \AR is still highly dependent on the display used.

\begin{subfigs}{ar_applications}{Examples of \AR applications. }[][
\item Visuo-haptic surgery training with cutting into virtual soft tissues \cite{harders2009calibration}.
\item \AR can interactively guide users in document verification tasks by recognizing documents and comparing them with virtual references \cite{hartl2013mobile}.
\item SpaceTop is a transparent \AR desktop computer featuring direct hand manipulation of \ThreeD content \cite{lee2013spacetop}.
\item Inner Garden is a spatial \AR zen garden made of real sand, visually augmented to create a mini world that can be reshaped by hand \cite{roo2017inner}.
]
\subfigsheight{41mm}
\subfig{harders2009calibration}
\subfig{hartl2013mobile}
\subfig{lee2013spacetop}
\subfig{roo2017inner}
\end{subfigs}

\subsubsection{AR Displays}
\label{ar_displays}

@@ -75,15 +70,15 @@ These displays feature a direct, preserved view of the \RE at the cost of more d

Finally, \emph{projection-based \AR} overlays virtual images onto the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear a display, but it requires a real surface to project the virtual content onto, and it is vulnerable to shadows cast by the user or by real objects \cite{billinghurst2015survey}.

\begin{subfigs}{ar_displays}{Simplified operating diagram of \AR display methods. }[][
\item \VST-\AR \cite{itoh2022indistinguishable}.
\item \OST-\AR \cite{itoh2022indistinguishable}.
\item Spatial \AR \cite{roo2017one}.
]
\subfigsheight{44mm}
\subfig{itoh2022indistinguishable_vst}
\subfig{itoh2022indistinguishable_ost}
\subfig{roo2017one_2}
\end{subfigs}

Regardless of the \AR display method, the display can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
@@ -120,13 +115,15 @@ The plausibility can be applied to \AR as is, but the \VOs must additionally hav
%\textcite{skarbez2021revisiting} also named place illusion for \AR as \enquote{immersion} and plausibility as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
%One main issue with presence is how to measure it both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}.
\begin{subfigs}{presence}{
The sense of immersion in virtual and augmented environments. Adapted from \textcite{stevens2002putting}.
}[][
\item Place illusion is the user's sense of \enquote{being there} in the \VE.
\item Object illusion is the sense that the \VO \enquote{feels here} in the \RE.
]
\subfigsheight{35mm}
\subfig{presence-vr}
\subfig{presence-ar}
\end{subfigs}

\paragraph{Embodiment}
@@ -138,14 +135,12 @@ This illusion arises when the visual, proprioceptive and (if any) haptic sensati
It can be decomposed into three subcomponents: \emph{Agency}, which is the feeling of controlling the body; \emph{Ownership}, which is the feeling that \enquote{the body is the source of the experienced sensations}; and \emph{Self-Location}, which is the feeling of the \enquote{spatial experience of being inside [the] body} \cite{kilteni2012sense}.
In \AR, embodiment could take the form of body accessorization, \eg wearing virtual clothes or make-up in overlay; of partial avatarization, \eg using a virtual prosthesis; or of full avatarization \cite{genay2022being}.
\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}
A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}).%, \eg through a hand-held controller, a tangible object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.
\subsubsection{User Interfaces and Interaction Techniques}
\label{interaction_techniques}
@@ -157,7 +152,6 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\fig[0.5]{interaction-technique}{An interaction technique maps user inputs to actions within a computer system. Adapted from \textcite{billinghurst2005designing}.}
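
To make this mapping concrete, below is a minimal sketch in Python; the event and action names (\texttt{HandInput}, \texttt{pinch\_select}) are hypothetical placeholders, not taken from the cited works:

\begin{verbatim}
# Sketch of an interaction technique as a mapping from user inputs
# to system actions. All names here are illustrative placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HandInput:
    position: tuple   # tracked hand position (x, y, z), in metres
    pinching: bool    # whether a pinch gesture is detected

@dataclass
class SelectAction:
    target_id: int    # identifier of the selected virtual object

def pinch_select(hand: HandInput, objects: dict) -> Optional[SelectAction]:
    """Map a pinch gesture to selecting the nearest virtual object."""
    if not hand.pinching:
        return None
    def sq_dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, hand.position))
    nearest = min(objects, key=lambda oid: sq_dist(objects[oid]))
    return SelectAction(target_id=nearest)
\end{verbatim}
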
\subsubsection{Tasks with Virtual Environments}
\label{ve_tasks}
@@ -176,20 +170,19 @@ Wayfinding is the cognitive planning of the movement, such as path finding or ro
The \emph{system control tasks} are changes to the system state through commands or menus, such as creating, deleting, or modifying \VOs, \eg as in \figref{roo2017onea}. They also include the input of text, numbers, or symbols.
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item on an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
\item Virtual drawing on a tangible object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable registering the \VE in it (a sketch of this registration step is given below).
]
\subfigsheight{36mm}
\subfig{grubert2015multifi}
\subfig{grubert2017pervasive}
\subfig{roo2017onea}
\subfig{newcombe2011kinectfusion}
\end{subfigs}
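
As an illustration of the registration step performed by SLAM systems (\figref{newcombe2011kinectfusion}), below is a minimal sketch of one point-to-point ICP iteration solved in closed form with the Kabsch algorithm. This is a didactic simplification, not the volumetric pipeline that KinectFusion actually uses:

\begin{verbatim}
# One iteration of point-to-point ICP: match points, then solve the
# rigid alignment in closed form (Kabsch algorithm). Real SLAM systems
# such as KinectFusion use a volumetric model and projective data
# association instead; this is a simplified illustration.
import numpy as np

def kabsch(source, target):
    """R, t minimizing sum ||R @ s_i + t - t_i||^2 over matched pairs."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    H = (source - mu_s).T @ (target - mu_t)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_t - R @ mu_s
    return R, t

def icp_step(source, target):
    """Match each source point to its nearest target point, then align."""
    d2 = ((source[:, None, :] - target[None, :, :]) ** 2).sum(-1)
    R, t = kabsch(source, target[d2.argmin(axis=1)])
    return source @ R.T + t, R, t
\end{verbatim}
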
\subsubsection{Reducing the Real-Virtual Gap}
\label{real-virtual-gap}
@@ -205,7 +198,6 @@ It enables the \VE to be registered with the \RE and the user simply moves to na
However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite{billinghurst2015survey,hertel2021taxonomy}.
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
@@ -224,20 +216,19 @@ In a pick-and-place task with tangibles of different shapes, a difference in siz
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions when tangibles are directly touched with the hand in \AR.
\begin{subfigs}{tangible-manipulation}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
\item shape differences between a tangible and a \VO are acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
\subfig{issartel2016tangible}
\subfig{kahl2021investigation}
\subfig{kahl2023using}
\end{subfigs}

\subsubsection{Manipulating with Virtual Hands}
\label{ar_virtual_hands}
@@ -259,17 +250,17 @@ The virtual phalanx follows the movements of the real phalanx, but remains const
The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions that can, however, be difficult to compute in real time.
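
To make the spring coupling concrete, below is a minimal sketch of the \texttt{coupling\_force} computation and one integration step, treating a single phalanx as a point mass; the gains, mass, and time step are illustrative placeholders, not the values used by \textcite{borst2006spring}:

\begin{verbatim}
# Spring-damper coupling between the tracked (real) phalanx and its
# physically simulated virtual counterpart. Gains, mass and time step
# are illustrative placeholders.
import numpy as np

K_P = 80.0   # spring stiffness (N/m)
K_D = 4.0    # damping (N.s/m)

def coupling_force(tracked_pos, virtual_pos, virtual_vel):
    """Force pulling the virtual phalanx toward the tracked pose.
    Its reaction at the contact points is what pushes the grasped
    virtual object."""
    return K_P * (tracked_pos - virtual_pos) - K_D * virtual_vel

def step(virtual_pos, virtual_vel, tracked_pos, mass=0.05, dt=1/90):
    """One symplectic-Euler step of the virtual phalanx dynamics;
    contact constraints with the virtual object would then be
    resolved by the physics engine."""
    f = coupling_force(np.asarray(tracked_pos), virtual_pos, virtual_vel)
    virtual_vel = virtual_vel + (f / mass) * dt
    virtual_pos = virtual_pos + virtual_vel * dt
    return virtual_pos, virtual_vel
\end{verbatim}
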
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[][
\item Fingertip tracking that allows selecting a \VO by opening the hand \cite{lee2007handy}.
\item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}.
\item Grasping through gestures when the fingers are detected as opposing on the \VO \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but is kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
]
\subfigsheight{37mm}
\subfig{lee2007handy}
\subfig{hilliges2012holodesk_1}
\subfig{piumsomboon2013userdefined_1}
\subfig{borst2006spring}
\end{subfigs}

However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
@@ -277,7 +268,6 @@ While the user's fingers traverse the virtual object, a physics-based virtual ha
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
While a visual rendering of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic rendering of the virtual hand in \AR, alone or combined, remains under-researched.
\subsection{Visual Rendering of Hands in AR}
\label{ar_visual_hands}
@@ -316,20 +306,20 @@ Taken together, these results suggest that a visual rendering of the hand in \AR
%\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.
\begin{subfigs}{visual-hands}{Visual hand renderings in \AR. }[][
\item Grasping a \VO in \OST-\AR with no visual hand rendering \cite{hilliges2012holodesk}.
\item Simulated mutual occlusion between the grasping hand and the \VO in \VST-\AR \cite{suzuki2014grasping}.
\item Grasping a real object with a semi-transparent hand in \VST-\AR \cite{buchmann2005interaction}.
\item Skeleton rendering overlaying the real hand in \VST-\AR \cite{blaga2017usability}.
\item Robotic rendering overlaying the real hands in \OST-\AR \cite{genay2021virtual}.
]
\subfigsheight{29.5mm}
\subfig{hilliges2012holodesk_2}
\subfig{suzuki2014grasping}
\subfig{buchmann2005interaction}
\subfig{blaga2017usability}
\subfig{genay2021virtual}
%\subfig{yoon2020evaluating}
\end{subfigs}

\subsection{Conclusion}