%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Tangible Perception}
\label{visual_haptic_influence}
A visuo-haptic perception of an object's property is thus robust to a certain difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of everyday objects is known to be constructed from both vision and touch~\cite{klatzky2010multisensory}.
More precisely, when evaluating surfaces with vision or touch only, both senses mainly discriminate their materials by the same properties of roughness, hardness and friction, and with similar performance~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
With a similar setup but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: Participants matched visual textures to real textures when their respective hardness felt similar.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined, in \VR, multiple \VOs with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs~\cite{gunther2022smooth}.}
% Spring compliance is perceived by combining the sensed force exerted by the spring with the displacement caused by the action (sensed through vision and proprioception). diluca2011effects
% The ability to discriminate whether two stimuli are simultaneous is important to determine whether stimuli should be bound together and form a single multisensory perceptual object. diluca2019perceptual
When performing a precision grasp (\secref{grasp_types}) on a tangible in \VR, some discrepancy in spatial properties between a tangible and a \VO is not noticeable by users: it took a relative difference of \percent{6} for the object width, \percent{44} for the surface orientation, and \percent{67} for the surface curvature to be perceived~\cite{detinguy2019how}.
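As a minimal illustration, these detection thresholds can serve as a rule of thumb to check whether a tangible proxy can pass for a given \VO under a precision grasp. The sketch below hard-codes the relative thresholds reported by \textcite{detinguy2019how}; the function and dictionary names are ours:

```python
# Detection thresholds from de Tinguy et al. (2019): relative differences
# below these values went unnoticed by participants in a precision grasp.
THRESHOLDS = {
    "width": 0.06,        # 6% of the object width
    "orientation": 0.44,  # 44% of the surface orientation
    "curvature": 0.67,    # 67% of the surface curvature
}

def is_discrepancy_noticeable(prop: str, tangible: float, virtual: float) -> bool:
    """Return True if the visuo-haptic mismatch is likely to be perceived."""
    if tangible == 0:
        raise ValueError("tangible reference value must be non-zero")
    relative_difference = abs(virtual - tangible) / abs(tangible)
    return relative_difference > THRESHOLDS[prop]
```

For example, pairing a \qty{10}{\mm}-wide tangible with a \qty{10.5}{\mm}-wide \VO (\percent{5} difference) stays below the width threshold, while an \qty{11}{\mm}-wide \VO (\percent{10}) exceeds it.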
\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}
The \MLE model also implies that perception can cross sensory modalities: a haptic percept can be deliberately influenced by a visual virtual stimulus when vision is said to be \enquote{dominant} in the perception.
When employed with a \VE, this phenomenon is called \emph{pseudo-haptic feedback}~\cite{ujitoko2021survey}.
See \textcite{ujitoko2021survey} for a detailed survey.
% Visual feedback in VR and AR is known to influence haptic perception [13]. The phenomenon of ”visual dominance” was notably observed when estimating the stiffness of \VOs. L´ecuyer et al. [13] based their ”pseudo-haptic feedback” approach on this notion of visual dominance gaffary2017ar
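The dominance argument can be made precise: under the \MLE model, the combined estimate weights the visual and haptic cues by their reliabilities (this is the standard formulation; the symbols are ours):

```latex
\begin{equation*}
  \hat{s} = w_v\,\hat{s}_v + w_h\,\hat{s}_h,
  \qquad
  w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
  \qquad
  \sigma^2 = \frac{\sigma_v^2\,\sigma_h^2}{\sigma_v^2 + \sigma_h^2},
\end{equation*}
```

where $\hat{s}_v$, $\hat{s}_h$ are the unimodal estimates and $\sigma_v^2$, $\sigma_h^2$ their variances: when $\sigma_v^2 \ll \sigma_h^2$, the weight $w_v$ approaches 1 and vision \enquote{dominates} the combined percept.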
A few works have also used pseudo-haptic techniques to enrich the perception of haptic stimuli by deforming the visual representation of a user input~\cite{ujitoko2021survey}.
For example, the perceived softness of tangible objects can be altered by superimposing in \AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar} (\figref{punpongsanon2015softar}), or in combination with vibrotactile rendering in \VR~\cite{choi2021augmenting}.
Similarly, by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand itself, the perceived size, shape or curvature of the object can be altered in \VST-\AR~\cite{ban2012modifying,ban2014displaying} (\figref{ban2014displaying}).
Pseudo-haptic deformations of a visual cursor have also been used to render virtual textures~\cite{costes2019touchy,ujitoko2019presenting,ota2020surface}.
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\cite{ujitoko2019modulating}.
However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-localised visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback.
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in \AR and \VR, which we aim to investigate in this work.
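A common way to implement such pseudo-haptic feedback is to scale the displayed displacement of the user's input by a control/display ratio, so that the virtual surface appears stiffer or softer than the physical proxy. A minimal sketch, where the function name and the example ratio are illustrative:

```python
def pseudo_haptic_displacement(real_displacement_mm: float, cd_ratio: float) -> float:
    """Scale the user's real displacement before rendering it visually.

    cd_ratio < 1 shrinks the displayed motion, so the virtual surface
    appears to yield less and is typically perceived as stiffer;
    cd_ratio > 1 exaggerates the motion, which reads as softer.
    """
    return real_displacement_mm * cd_ratio

# The finger physically presses 10 mm into the proxy, but the virtual
# surface is only shown deforming by 5 mm -> perceived as stiffer.
displayed = pseudo_haptic_displacement(10.0, 0.5)
```

The sketch also makes the limitation above concrete: the larger the deviation of the ratio from 1, the larger the visible mismatch between the real movement and the co-localised visuo-haptic feedback.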
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table and that deforms when pressed by the hand~\cite{punpongsanon2015softar}.
\item Visually deforming a tangible object and the hand touching it in \VST-\AR to alter its perceived shape~\cite{ban2014displaying}.
\item Pseudo-haptic texture rendering with a visual cursor~\cite{ota2020surface}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\subfig{ota2020surface}
\end{subfigs}
In all these studies, the visual expectations of participants influenced their haptic perception.
In particular in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}
Some studies have investigated in \AR and \VR the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a \VE.
\subfig[0.3]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated perceived simultaneity of visuo-haptic contact with a \VO in \VR.
The contact was both rendered with a vibrotactile piezo-electric device on the fingertip and a visual change in the \VO color.
The visuo-haptic simultaneity was varied by either adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\label{vhar_haptics}
A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
As virtual or augmented objects are naturally touched, grasped and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed to move the actuator away to another location on the hand.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback they provide (\secref{tactile_rendering}), as well as in the placement of the haptic rendering.
Another category of actuators relies on systems that cannot be considered as portable.
\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}
\textcite{ando2007fingernailmounted} were the first to propose moving the actuator away from the fingertip, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device mounted on the nail but able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure (\qty{0.34}{\N} force) and texture (\qtyrange{150}{190}{\Hz} bandwidth) sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
With Fingeret, \textcite{maeda2022fingeret} adapted the belt actuators (\secref{belt_actuators}) as a \enquote{finger-side actuator} that leaves the fingertip free (\figref{maeda2022fingeret}).
Two rollers, one per side, can deform the skin: When rotating inwards, they pull the skin, simulating a contact sensation, and when rotating outwards, they push the skin, simulating a release sensation.
By doing quick in and out rotations, they can also simulate a texture sensation.
%The device is also very compact (\qty{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study not in \AR, but directly touching images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as for \textcite{teng2021touch}, finger speed was not taken into account for rendering vibrations, which may have been detrimental to texture perception (\secref{texture_rendering}).
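The speed dependence noted above follows from the spatial layout of the texture: a grating of spatial period $\lambda$ stroked at finger speed $v$ produces a temporal frequency $f = v/\lambda$, which a fixed-frequency vibration cannot reproduce. A minimal sketch (the function name is ours):

```python
def grating_vibration_frequency(finger_speed_mm_s: float, spatial_period_mm: float) -> float:
    """Temporal frequency (Hz) of the vibration for a periodic texture:
    f = v / lambda, where v is the finger speed and lambda the spatial
    period of the grating. A fixed-frequency rendering ignores v."""
    if spatial_period_mm <= 0:
        raise ValueError("spatial period must be positive")
    return finger_speed_mm_s / spatial_period_mm

# Stroking a 1 mm grating at 100 mm/s yields a 100 Hz vibration;
# halving the speed should halve the rendered frequency.
f = grating_vibration_frequency(100.0, 1.0)
```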
Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: very compact and lightweight vibrotactile \LRA devices featuring both integrated sensing of the finger movements and very low-latency haptic feedback (\qty{<5}{\ms}).
However, no proper user study was conducted to evaluate these devices in \AR.
\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper~\cite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip~\cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin~\cite{maeda2022fingeret}.
\item Haplets is a very compact nail device with integrated sensing and vibrotactile feedback~\cite{preechayasomboon2021haplets}.
]
\subfigsheight{33mm}
%\subfig{ando2007fingernailmounted}
\subfig{teng2021touch}
\subfig{maeda2022fingeret}
\subfig{preechayasomboon2021haplets}
\end{subfigs}
\label{vhar_rings}
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been employed to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness and friction of smooth real surfaces and to render localized bumps and holes on them (\secref{hardness_rendering})~\cite{detinguy2018enhancing,salazar2020altering}, but have not been tested in \AR.
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube, placed on a real surface and held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
A short vibration (a \qty{25}{\ms}, \qty{175}{\Hz} square-wave) was also rendered.
\subfig{pezent2019tasbi_4}
\end{subfigs}
% \cite{sarac2022perceived,palmer2022haptic} not in AR but studies on relocating to the wrist the haptic feedback of the fingertip-object contacts.
%\subsection{Conclusion}
%\label{visuo_haptic_conclusion}