Arrange images

2024-10-18 15:10:09 +02:00
parent 38f49a45ac
commit 0298c2cd44
3 changed files with 24 additions and 24 deletions


@@ -33,18 +33,9 @@ Yet, most research has focused on visual augmentation, and the term \AR (without
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}; the original definition was \enquote{registered in \ThreeD}.}
%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.
\subsubsection{Applications of AR}
\label{ar_applications}
Advances in technology, research, and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration, and entertainment applications \cite{dey2018systematic}.
For example, \AR can provide surgical training simulations in safe conditions \cite{harders2009calibration} (\figref{harders2009calibration}), or improve student learning of complex concepts and phenomena such as optics or chemistry \cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance, or verification \cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers \cite{lee2013spacetop} (\figref{lee2013spacetop}), or create completely new forms of gaming and tourism experiences \cite{roo2017inner} (\figref{roo2017inner}).
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, especially for tracking, rendering, and display.
However, the user experience in \AR is still highly dependent on the display used.
\begin{subfigs}{ar_applications}{Examples of \AR applications. }[][
\item Visuo-haptic surgery training with cutting into virtual soft tissues \cite{harders2009calibration}.
\item Interactive guidance in document verification tasks by comparison with virtual references \cite{hartl2013mobile}.
\item SpaceTop is a transparent \AR desktop computer featuring direct hand manipulation of \ThreeD content \cite{lee2013spacetop}.
\item Inner Garden is a spatial \AR zen garden made of real sand visually augmented to create a mini world that can be reshaped by hand \cite{roo2017inner}.
]
@@ -55,6 +46,15 @@ However, the user experience in \AR is still highly dependent on the display use
\subfig{roo2017inner}
\end{subfigs}
\subsubsection{Applications of AR}
\label{ar_applications}
Advances in technology, research, and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration, and entertainment applications \cite{dey2018systematic}.
For example, \AR can provide surgical training simulations in safe conditions \cite{harders2009calibration} (\figref{harders2009calibration}), or improve student learning of complex concepts and phenomena such as optics or chemistry \cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance, or verification \cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers \cite{lee2013spacetop} (\figref{lee2013spacetop}), or create completely new forms of gaming and tourism experiences \cite{roo2017inner} (\figref{roo2017inner}).
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, especially for tracking, rendering, and display.
However, the user experience in \AR is still highly dependent on the display used.
\subsubsection{AR Displays}
\label{ar_displays}
@@ -99,7 +99,7 @@ Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, prov
Presence and embodiment are two key concepts that characterize the user experience in \AR and \VR.
While there is a large body of literature on these topics in \VR, they are less defined and studied for \AR \cite{genay2022being,tran2024survey}.
These concepts will be useful for the design, evaluation, and discussion of our contributions.
In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentations (\chapref{xr_perception}) and manipulating virtual objects (\chapref{visual_hand}), and explore the plausibility of co-localized visuo-haptic texture augmentations (\chapref{vhar_textures}).
\paragraph{Presence}
\label{ar_presence}
@@ -150,7 +150,7 @@ In all examples of \AR applications shown in \secref{ar_applications}, the user
For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then act upon it through an input device \cite[p.145]{laviolajr20173d}.
Such input devices form an input \emph{\UI} that captures and translates the user's actions to the computer.
Similarly, an output \UI renders and displays the state of the system to the user (such as an \AR/\VR display, \secref{ar_displays}, or a haptic actuator, \secref{wearable_haptic_devices}).
An input \UI can rely either on \emph{active sensing}, with a held or worn device such as a mouse, a touch screen, or a hand-held controller, or on \emph{passive sensing}, which does not require contact, such as eye trackers, voice recognition, or hand tracking \cite[p.294]{laviolajr20173d}.
The captured information from the sensors is then translated into actions within the computer system by an \emph{interaction technique}. %(\figref{interaction-technique})
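To make this loop concrete, here is a minimal sketch of a hypothetical pinch-to-grab interaction technique in Python (the names \texttt{HandPose} and \texttt{pinch\_to\_grab} and the \qty{2}{\cm} pinch threshold are illustrative assumptions, not taken from \textcite{laviolajr20173d}): a hand tracker senses the input state, the technique maps it to an action on the virtual scene, and an output \UI would then render the updated scene.
\begin{verbatim}
from dataclasses import dataclass
import math

@dataclass
class HandPose:
    """Sensed input state from a (passive) hand tracker."""
    position: tuple        # palm position (x, y, z) in meters
    pinch_distance: float  # thumb-index distance in meters

def pinch_to_grab(hand, scene):
    """Hypothetical interaction technique: a pinch below 2 cm grabs
    the nearest virtual object and co-locates it with the hand."""
    if hand.pinch_distance < 0.02 and scene:
        target = min(scene,
                     key=lambda o: math.dist(o["position"], hand.position))
        target["position"] = hand.position
    return scene

# One frame of the perceive-act loop, with a simulated sensor reading:
scene = [{"name": "cube", "position": (0.10, 0.00, 0.30)}]
hand = HandPose(position=(0.12, 0.01, 0.31), pinch_distance=0.015)
scene = pinch_to_grab(hand, scene)  # interaction technique
print(scene[0]["position"])         # an output UI would render this state
\end{verbatim}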


@@ -109,8 +109,8 @@ Adding a visual delay increased the perceived stiffness of the reference piston,
\begin{subfigs}{visuo-haptic-stiffness}{
Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}.
}[][
\item Participant pressing a virtual piston rendered by a force-feedback device.
\item Proportion of comparison piston perceived as stiffer than the reference piston (vertical axis) as a function of the comparison stiffness (horizontal axis) and visual-haptic delays of the reference (colors).
]
\subfigbox[.44]{knorlein2009influence_1}
\subfig[.55]{knorlein2009influence_2}
@@ -129,16 +129,6 @@ The reference piston was judged to be stiffer when seen in \VR than in \AR, with
This suggests that the haptic stiffness of virtual objects feels \enquote{softer} in an augmented environment than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (visuo or optical) and to see the hand touching the virtual object.
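Such stiffness judgments are typically summarized by fitting a psychometric function to the proportion of \enquote{stiffer} responses; as a sketch, assuming a cumulative Gaussian model (a common choice, not necessarily the exact fit used in the studies above), we have
\[
P(\text{comparison judged stiffer}) = \Phi\!\left(\frac{k_c - \mathrm{PSE}}{\sigma}\right),
\qquad
\mathrm{JND} \approx 0.6745\,\sigma,
\]
where $k_c$ is the comparison stiffness and $\Phi$ is the standard normal cumulative distribution function: the point of subjective equality (PSE) is the stiffness at which both pistons feel equal, the just-noticeable difference (JND) measures discrimination sensitivity, and the biases reported above correspond to shifts of the PSE away from the physical stiffness of the reference.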
\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR and
\item in \VR.
]
\subfig[0.35]{gaffary2017ar_1}
\subfigbox[0.31]{gaffary2017ar_3}
\subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a virtual object in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the virtual object.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
@@ -147,6 +137,16 @@ No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \
These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR.
\item The same view but in \VR.
]
\subfig[0.35]{gaffary2017ar_1}
\subfigbox[0.31]{gaffary2017ar_3}
\subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}
\subsection{Wearable Haptics for Direct Hand Interaction in AR}
\label{vhar_haptics}

Binary file not shown.
Before: 93 KiB | After: 61 KiB