Fix in references

commit 560a085e6e
parent 1766b83e59
2024-09-23 14:35:59 +02:00
10 changed files with 265 additions and 177 deletions


@@ -234,7 +234,7 @@ It is therefore unclear to what extent the real and virtual visuo-haptic sensati
\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}
-Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviola20173d}.
+Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviolajr20173d}.
%
Since the hand is neither occupied nor covered by a haptic device that would impair interaction with the \RE, as described in the previous section, one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real.
%


@@ -150,7 +150,7 @@ In all examples of \AR applications shown in \secref{ar_applications}, the user
\label{interaction_techniques}
For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then act upon it through an input \UI.
-Input \UIs rely either on \emph{active sensing} through a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or on \emph{passive sensing}, which requires no contact, such as eye trackers, voice recognition, or hand tracking \cite{laviola20173d}.
+Input \UIs rely either on \emph{active sensing} through a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or on \emph{passive sensing}, which requires no contact, such as eye trackers, voice recognition, or hand tracking \cite{laviolajr20173d}.
The information gathered from the sensors by the \UI is then translated into actions within the computer system by an \emph{interaction technique} (\figref{interaction-technique}).
For example, a cursor on a screen can be moved either with a mouse or with the arrow keys on a keyboard, and a two-finger swipe on a touchscreen can be used to scroll or zoom an image.
Choosing useful and efficient \UIs and interaction techniques is crucial for the user experience and the tasks that can be performed within the system.
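The mapping an interaction technique performs, from sensed input to an action within the system, can be sketched as follows. This is an illustrative example only, not taken from the cited works; all names and gesture fields are hypothetical:

```python
# Illustrative sketch of an interaction technique: it translates raw
# sensor input (here, a two-finger touchscreen gesture) into a system
# action. All names and thresholds are hypothetical.

def scroll(dy):
    return f"scrolled by {dy}"

def zoom(factor):
    return f"zoomed by {factor}"

def two_finger_technique(gesture):
    """Map a sensed two-finger gesture to a system action."""
    if gesture["kind"] == "swipe":
        return scroll(gesture["dy"])        # vertical swipe -> scroll
    if gesture["kind"] == "pinch":
        return zoom(gesture["spread"])      # pinch -> zoom
    return "ignored"                        # unrecognized input
```

The same sensed input could be bound to a different action (e.g., a swipe rotating a \VO instead of scrolling), which is precisely why the choice of interaction technique matters for the user experience.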
@@ -161,7 +161,7 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\subsubsection{Tasks with Virtual Environments}
\label{ve_tasks}
-\textcite{laviola20173d} classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
+\textcite{laviolajr20173d} classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for immersive \AR.
The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and are the building blocks for more complex interactions.
@@ -242,15 +242,15 @@ Similarly, in \secref{tactile_rendering} we described how a material property (\
\label{ar_virtual_hands}
Natural \UIs allow the user to use their body movements directly as inputs to the \VE \cite{billinghurst2015survey}.
-Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite{laviola20173d}.
+Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite{laviolajr20173d}.
Hands were initially tracked by active sensing devices such as gloves or controllers, but it is now possible to track them in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
-The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviola20173d}.
+The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviolajr20173d}.
The simplest models represent the hand as a rigid 3D object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
-The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviola20173d}.
+The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviolajr20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
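A common physics-based formulation is a penalty model: at each contact point, a spring-damper force pushes the \VO out along the contact normal, proportional to the interpenetration depth. The following is a minimal sketch of that idea under simplifying assumptions (a single contact point, illustrative stiffness and damping constants not taken from the cited works):

```python
import numpy as np

# Minimal penalty-based contact sketch. The constants and the
# single-contact setup are illustrative assumptions, not values
# from the cited works.

STIFFNESS = 500.0  # N/m, spring constant resisting interpenetration
DAMPING = 5.0      # N*s/m, damping along the contact normal

def contact_force(penetration_depth, normal, relative_velocity):
    """Force applied to the virtual object at one contact point.

    penetration_depth: how far the virtual fingertip sinks into the
        object (m); normal: unit vector pointing out of the object;
    relative_velocity: fingertip velocity relative to the object (m/s).
    """
    if penetration_depth <= 0.0:
        return np.zeros(3)  # no interpenetration, no force
    # Spring term resists penetration; damping term opposes the
    # approach speed along the normal.
    magnitude = STIFFNESS * penetration_depth \
        - DAMPING * np.dot(relative_velocity, normal)
    return max(magnitude, 0.0) * normal  # contact forces only push
```

Summing such forces over all contact points and feeding them to a rigid-body simulation yields the pushing, grasping, and lifting behaviours that heuristic rules cannot reproduce.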