WIP xr-perception
@@ -135,7 +135,7 @@ Several types of vibrotactile actuators are used in haptics, with different trad
An \ERM is a direct current (DC) motor that rotates an off-center mass when a voltage or current is applied (\figref{precisionmicrodrives_erm}). \ERMs are easy to control, inexpensive, and can be encapsulated in a cylinder or coin form factor only a few millimeters across. However, they have only one \DoF because both the frequency and the amplitude of the vibration are coupled to the rotation speed: low frequencies are output at low amplitudes and high frequencies at high amplitudes, as shown in \figref{precisionmicrodrives_erm_performances}.
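This one-\DoF coupling follows directly from the mechanics of the rotating mass. As a first-order sketch (treating the motor as an ideal DC motor whose rotation speed is proportional to the drive voltage):

```latex
% Eccentric mass m at radius r, rotating at angular velocity \omega:
% the centrifugal force driving the vibration and the vibration
% frequency are both set by the same rotation speed.
\begin{equation}
  F = m\,r\,\omega^{2},
  \qquad
  f = \frac{\omega}{2\pi}
  \qquad\Rightarrow\qquad
  F = m\,r\,(2\pi f)^{2}.
\end{equation}
```

Since the drive voltage only sets $\omega$, amplitude and frequency cannot be chosen independently: the output force grows with the square of the frequency.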
-\begin{subfigs}{erm}{Diagram and performance of \ERMs. }[][
+\begin{subfigs}{erm}{Diagram and performance of an \ERM. }[][
\item Diagram of a cylindrical encapsulated \ERM. From Precision Microdrives~\footnotemark.
\item Amplitude and frequency output of an \ERM as a function of the input voltage.
]
@@ -357,9 +357,9 @@ We describe them in the \secref{vhar_haptics}.
Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
The complexity of the haptic sense has led to the design of numerous haptic devices and renderings.
While many haptic devices can be worn on the hand, only a few are compact and portable enough to be considered wearable, and these are limited to cutaneous feedback.
-If the haptic rendering is timely associated with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
-Several rendering methods have been developed to modify the perceived roughness and hardness, mostly using vibrotactile feedback and, to a lesser extent, pressure feedback.
-However, not all of these haptic augmentations have been already transposed to wearable haptics.
+If the haptic rendering is synchronized with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
+Several haptic augmentation methods have been developed to modify the perceived roughness and hardness, mostly using vibrotactile feedback and, to a lesser extent, pressure feedback.
+However, not all of these haptic augmentations have yet been transposed to wearable haptics, and the use of wearable haptic augmentations has not yet been studied in the context of \AR.
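One common vibrotactile roughness rendering maps the finger's sliding speed over a virtual grating to a vibration frequency. A minimal sketch of this principle (the function name and parameter values are illustrative, not from a specific device):

```python
import math

def grating_vibration(finger_speed, spatial_period, amplitude, t):
    """Vibrotactile texture rendering sketch: sliding at speed v (m/s)
    over a virtual grating of spatial period lambda (m) produces a
    vibration of temporal frequency f = v / lambda (Hz)."""
    frequency = finger_speed / spatial_period  # faster sliding -> higher pitch
    return amplitude * math.sin(2.0 * math.pi * frequency * t)
```

For example, sliding at \qty{0.1}{m/s} over a \qty{1}{mm} grating yields a \qty{100}{Hz} vibration, within the sensitive range of the Pacinian corpuscles.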
%, unlike most previous actuators that are designed specifically for fingertips and would require mechanical adaptation to be placed on other parts of the hand.
% thanks to the vibration propagation and the sensory capabilities distributed throughout the skin, they can be placed without adaptation and on any part of the hand
@@ -6,7 +6,7 @@ Immersive systems such as headsets leave the hands free to interact with \VOs, p
%\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system \cite{sutherland1968headmounted}. }[
% \item The \AR headset.
-% \item Wireframe \ThreeD \VOs were displayed registered in the real environment (as if there were part of it).
+% \item Wireframe \ThreeD \VOs were displayed registered in the \RE (as if they were part of it).
% ]
% \subfigsheight{45mm}
% \subfig{sutherland1970computer3}
@@ -237,7 +237,7 @@ Our hands allow us to manipulate real everyday objects with both strength and pr
Hands were initially tracked by active sensing devices such as gloves or controllers; it is now possible to track them in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviolajr20173d}.
-The simplest models represent the hand as a rigid 3D object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
+The simplest models represent the hand as a rigid \ThreeD object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
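As an illustration of such an articulated model, a planar forward-kinematics sketch for a single finger, where each phalanx is a rigid segment chained to the previous one (joint angles and phalanx lengths are illustrative, not anatomical data):

```python
import math

def finger_forward_kinematics(base, joint_angles, phalanx_lengths):
    """Planar forward kinematics for one finger: each phalanx is a rigid
    segment whose orientation accumulates all joint rotations before it."""
    x, y = base
    theta = 0.0
    joints = [(x, y)]
    for angle, length in zip(joint_angles, phalanx_lengths):
        theta += angle                 # accumulate this joint's rotation
        x += length * math.cos(theta)  # endpoint of this phalanx
        y += length * math.sin(theta)
        joints.append((x, y))
    return joints

# Index finger with MCP, PIP, DIP joints flexed 30, 25 and 10 degrees.
positions = finger_forward_kinematics(
    base=(0.0, 0.0),
    joint_angles=[math.radians(a) for a in (30, 25, 10)],
    phalanx_lengths=[0.045, 0.025, 0.018],  # metres, approximate
)
```

The same chaining, extended to \ThreeD rotations for all five fingers, yields the full articulated hand model.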
@@ -246,7 +246,7 @@ Heuristic techniques use rules to determine the selection, manipulation and rele
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
In particular, \textcite{borst2006spring} have proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}:
-The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact.
+The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the \VOs during contact.
The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
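A minimal sketch of such a distance-based coupling: a spring-damper between the tracked phalanx and its surface-constrained virtual proxy (the god-object). The stiffness and damping values are illustrative, not taken from the cited work:

```python
def coupling_force(real_pos, proxy_pos, real_vel, proxy_vel,
                   stiffness=500.0, damping=5.0):
    """Spring-damper virtual coupling: the force applied to the virtual
    object grows with the separation between the tracked (real) phalanx
    and its surface-constrained virtual proxy."""
    return tuple(
        stiffness * (rp - pp) + damping * (rv - pv)
        for rp, pp, rv, pv in zip(real_pos, proxy_pos, real_vel, proxy_vel)
    )

# Real fingertip pressed 2 mm past the proxy along z, at rest:
force = coupling_force((0.0, 0.0, -0.002), (0.0, 0.0, 0.0),
                       (0.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

Here the deeper the real finger penetrates the \VO, the larger the simulated contact force, which is what makes grasping stable despite the lack of physical constraint.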
More advanced techniques simulate friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, but these can be difficult to compute in real time.
@@ -264,7 +264,7 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing
\end{subfigs}
However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
-While the user's fingers traverse the virtual object, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
+While the user's fingers traverse the \VO, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
While a visual rendering of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic rendering of the virtual hand, or their combination, in \AR is under-researched.
@@ -303,7 +303,7 @@ In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluat
Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic rendering of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
Taken together, these results suggest that a visual rendering of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
%\cite{chan2010touching} : cues for touching (selection) \VOs.
-%\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
+%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.
\begin{subfigs}{visual-hands}{Visual hand renderings in \AR. }[][
@@ -97,7 +97,7 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo
%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and \VR}
\label{AR_vs_VR}
\label{ar_vr_haptic}
Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR.