@@ -14,12 +14,17 @@
\subsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates the resulting perception.
As already mentioned in the previous sections, a \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}.
Examples include the haptic hardness perceived through skin pressure and force sensations~\secref{hardness}, the hand movement perceived from proprioception and a visual hand avatar~\secref{ar_displays}, or the perceived size of a tangible with a co-localized \VO~\secref{ar_tangibles}.

When the sensations are redundant, \ie each sensory modality alone could be used to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}.
The \MLE model explains how the perceptual estimate is formed from multiple sensory cues by weighting them according to their reliability~\cite{ernst2002optimal}.
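In its standard form, for visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$ of the same property with respective variances $\sigma_V^2$ and $\sigma_H^2$, the integrated estimate is the reliability-weighted average
\begin{equation*}
    \hat{S} = w_V \hat{S}_V + w_H \hat{S}_H,
    \qquad
    w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
    \quad
    w_H = \frac{1/\sigma_H^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\end{equation*}
whose variance $\sigma^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$ is lower than that of either cue alone: the more reliable sense dominates the percept without the other being discarded.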
For real textures in particular, it is known that touch and sight, taken individually, perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
@@ -62,12 +67,11 @@ Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-l
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localized visuo-haptic feedback can be experienced differently in \AR and \VR, which we aim to investigate in this work.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}
Some studies have investigated the visuo-haptic perception of \VOs in \AR and \VR using grounded force-feedback devices.
They have shown how the latency of the visual rendering of an object with haptic feedback, or the type of environment (\VE or \RE), can affect the perception of an identical haptic rendering.
Indeed, there are inherent and unavoidable latencies in the visual and haptic rendering of \VOs, and the visual and haptic feedback may not appear to be simultaneous.

In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TAFC task, participants pressed two pistons and indicated which was stiffer.
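Although the exact analysis differs between studies, such \TAFC judgements are typically summarised by fitting a psychometric function, \eg a cumulative Gaussian
\begin{equation*}
    P(\text{``stiffer''} \mid \Delta k) = \Phi\!\left(\frac{\Delta k - \mathrm{PSE}}{\sigma}\right),
\end{equation*}
where $\Delta k$ is the stiffness difference between the two pistons, the point of subjective equality (PSE) is the difference at which both pistons feel equally stiff (\qty{50}{\percent} ``stiffer'' responses), and the discrimination threshold (JND) is derived from the spread $\sigma$, \eg as the additional difference needed to reach \qty{84}{\percent} ``stiffer'' responses.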
@@ -113,6 +117,9 @@ In a user study, participants touched a virtual cube with a virtual hand: The co
The visuo-haptic simultaneity was varied by either adding a visual delay or triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.

These studies have shown how the latency of the visual rendering of a \VO, or the type of environment (\VE or \RE), can affect the perceived haptic stiffness of an object rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\subsection{Wearable Haptics for AR}
\label{vhar_haptics}
@@ -213,8 +220,8 @@ A short vibration (\qty{25}{\ms} \qty{175}{\Hz} square-wave) was also rendered w
\end{subfigs}
\subsection{Conclusion}
\label{visuo_haptic_conclusion}
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended with an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.