Remove \VO and \AE acronym
@@ -5,7 +5,7 @@ Perception and manipulation of objects with the hand typically involves both the
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but both are equally capable of perceiving many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.

Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
-It is essential to understand how a visuo-haptic rendering of a \VO is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
+It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.

% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?
@@ -17,7 +17,7 @@ It is essential to understand how a visuo-haptic rendering of a \VO is perceived
\label{sensations_perception}

A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
-For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}).
+For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized virtual object (\secref{ar_tangibles}).

If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable: repeated measurements of the same property, \eg the weight of an object, may yield different estimates.
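Under the \MLE model \cite{ernst2004merging}, this integration can be written as a reliability-weighted average. The following is the standard cue-combination formulation, with illustrative symbols rather than notation taken from the cited works:

```latex
% Standard MLE cue combination (illustrative notation):
% the integrated estimate weights each sensory estimate by its reliability.
\begin{align}
  \hat{S} &= w_V \hat{S}_V + w_H \hat{S}_H ,
  & w_V &= \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2} ,
  & w_H &= 1 - w_V ,
\end{align}
```

where $\hat{S}_V$ and $\hat{S}_H$ are the visual and haptic estimates of the property and $\sigma_V^2$, $\sigma_H^2$ their variances; the integrated estimate has variance $\sigma^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$, lower than that of either sense alone.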
@@ -58,10 +58,10 @@ The objective was to determine a \PSE between the comparison and reference bars,
%\subfig{ernst2002humans_visuo-haptic}
\end{subfigs}

-%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
-The \MLE model implies that when seeing and touching a \VO in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property.
+%Hence, the \MLE model explains how a (visual) virtual object in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
+The \MLE model implies that when seeing and touching a virtual object in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property.
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
-%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
+%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the virtual object, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}

@@ -73,11 +73,11 @@ More precisely, when surfaces are evaluated by vision or touch alone, both sense
The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
-\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
-They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
+\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
+They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual virtual objects (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
-\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surfaces were found to match all the visual \VOs \cite{gunther2022smooth}.}
+\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual virtual objects \cite{gunther2022smooth}.}
Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a real object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
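A common way to formalize such pseudo-haptic effects, though not necessarily the exact formulation used in the cited works, is a control/display gain applied between the real and the displayed motion:

```latex
% Control/display (C/D) gain sketch of pseudo-haptic feedback (illustrative):
% the displayed displacement is a scaled copy of the real displacement.
\begin{equation}
  x_\mathrm{displayed} = r \, x_\mathrm{real} , \qquad r > 0 ,
\end{equation}
```

With $r < 1$, the visual hand travels less than the real hand for the same applied force, so the touched surface can appear stiffer; with $r > 1$, the opposite effect can be induced.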
@@ -94,12 +94,12 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo
\end{subfigs}

%In all of these studies, the visual expectations of participants influenced their haptic perception.
-%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
+%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the virtual object.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{ar_vr_haptic}
-Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR.
+Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR.

In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
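As a reminder of how the \PSE is obtained in such tasks, the proportion of \enquote{stiffer} responses is typically fitted with a psychometric function of the comparison stiffness; the notation below is illustrative, not taken from the cited study:

```latex
% Psychometric function for a 2IFC stiffness judgment (illustrative):
% Psi is a cumulative Gaussian; the PSE is its 50% crossing point.
\begin{equation}
  P(\text{comparison judged stiffer} \mid k_c)
    = \Psi\!\left( \frac{k_c - \mathrm{PSE}}{\sigma} \right) ,
\end{equation}
```

so the \PSE is the comparison stiffness $k_c$ perceived as equal to the reference, and $\sigma$ relates to the discrimination threshold.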
@@ -126,8 +126,8 @@ Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay
\textcite{gaffary2017ar} compared perceived stiffness of virtual pistons in \OST-\AR and \VR.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
-This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
-%Two differences that could be worth investigating with the two previous studies are the type of \AR (visuo or optical) and to see the hand touching the \VO.
+This suggests that the haptic stiffness of virtual objects feels \enquote{softer} in an augmented environment than in a full \VE.
+%Two differences worth investigating between the two previous studies are the type of \AR (video or optical see-through) and whether the hand touching the virtual object is visible.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
@@ -139,12 +139,12 @@ This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE
\subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}

-Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.
-The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the \VO.
+Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a virtual object in \VR.
+The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the virtual object.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.

-These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
+These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.

\subsection{Wearable Haptics for Direct Hand Interaction in AR}
@@ -163,10 +163,10 @@ Another category of actuators relies on systems that cannot be considered as por
\label{vhar_nails}

\textcite{ando2007fingernailmounted} were the first to propose relocating the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
-This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch_1}).
+This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching virtual objects (\figref{teng2021touch_1}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
-When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
+When touching virtual objects in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real objects.

% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
@@ -196,7 +196,7 @@ However, no proper user study has been conducted to evaluate these devices in \A
\subsubsection{Belt Devices}
\label{vhar_rings}

-The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
+The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of virtual objects in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, friction and localized bumps and holes on smooth real surfaces (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but have not been tested in \AR.

In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).

@@ -250,11 +250,11 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
\subsection{Conclusion}
\label{visuo_haptic_conclusion}

-Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
+Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with virtual objects in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them are best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
-Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
+Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the virtual object.