2024-12-26 19:38:46 +01:00
parent 0cde049bfc
commit fe0da6a83b
15 changed files with 31 additions and 32 deletions

@@ -162,8 +162,8 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\subsubsection{Tasks with Virtual Environments}
\label{ve_tasks}
\textcite[p.385]{laviolajr20173d} classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for immersive \AR.
\textcite{laviolajr20173d} (p.385) classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for immersive \AR.
The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
\emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.
@@ -177,7 +177,7 @@ Wayfinding is the cognitive planning of the movement, such as path finding or ro
The \emph{system control tasks} are changes to the system state through commands or menus, such as creating, deleting, or modifying virtual objects, \eg as in \figref{roo2017onea}. They also include the input of text, numbers, or symbols.
In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and positioning and rotating virtual objects pushed and grasped by the hand.
In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and positioning and rotating virtual objects pushed and grasped by the hand (\partref{manipulation}).
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item on an extended display using a hand-held smartphone \cite{grubert2015multifi}.
@@ -247,7 +247,7 @@ Similarly, in \secref{tactile_rendering} we described how a material property (\
%can track the user's movements and use them as inputs to the \VE \textcite[p.172]{billinghurst2015survey}.
Hands were initially tracked with active sensing devices such as gloves or controllers; they can now be tracked in real time using passive sensing (\secref{interaction_techniques}) and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
Our hands allow us to manipulate real everyday objects (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite[p.400]{laviolajr20173d}.
Our hands allow us to manipulate real everyday objects (\secref{grasp_types}), hence virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite[p.400]{laviolajr20173d}.
The user's tracked hand is reconstructed as a \emph{virtual hand} model in the \VE \cite[p.405]{laviolajr20173d}.
The simplest models represent the hand as a rigid \ThreeD object that follows the movements of the real hand with \qty{6}{DoF} (position and orientation in space) \cite{talvas2012novel}.
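For concreteness, a minimal formalization of this rigid model can be sketched as follows (the notation is illustrative, not taken from the cited works): if the tracker provides at each instant $t$ the real hand position $\mathbf{p}(t)$ and orientation, as a rotation matrix $\mathbf{R}(t)$, then every point $\mathbf{x}$ of the hand mesh, expressed in the hand's local frame, is placed in the \VE at
\[
    \mathbf{x}_{\mathrm{VE}}(t) = \mathbf{R}(t)\,\mathbf{x} + \mathbf{p}(t),
\]
so the whole mesh follows the tracked \qty{6}{DoF} pose as a single rigid body, without any per-finger articulation.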