\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic}

Perception and manipulation of objects with the hand typically involve both the visual and haptic senses.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but the two are equally capable for many others, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.

Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is therefore essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with \AR headsets.

\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{vh_perception}

\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}

A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
Examples include the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement perceived through proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object co-localized with a virtual object (\secref{ar_tangibles}).

If the sensations are redundant, \ie if any one of them alone could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable: it may give different answers when the same property is measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimate $\tilde{s}$ of the property is the weighted sum of the individual sensory estimates:
\begin{equation}{MLE}
\tilde{s} = \sum_i w_i \tilde{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
where each weight $w_i$ is proportional to the inverse variance of its estimate:
\begin{equation}{MLE_weights}
w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2} = \frac{\sigma^2}{\sigma_i^2}
\end{equation}
and the integrated variance $\sigma^2$ is the inverse of the sum of the inverse individual variances:
\begin{equation}{MLE_variance}
\sigma^2 = \left( \sum_i \frac{1}{\sigma_i^2} \right)^{-1}
\end{equation}
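For a hypothetical numerical illustration, assume vision is four times more reliable than touch for a given height estimate, \ie $\sigma_v^2 = 1$ and $\sigma_h^2 = 4$ in arbitrary units: the weights are then $w_v = \frac{1/1}{1/1 + 1/4} = 0.8$ and $w_h = 0.2$, so the integrated estimate $\tilde{s} = 0.8\,\tilde{s}_v + 0.2\,\tilde{s}_h$ is dominated by vision, while its variance $\sigma^2 = (1/1 + 1/4)^{-1} = 0.8$ is lower than that of either sense alone.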

This was demonstrated by \textcite{ernst2002humans} in a user study where participants estimated the height of a virtual bar using a fixed-window \OST-\AR display (\secref{ar_displays}) and force-feedback devices worn on the thumb and index finger (\secref{wearability_level}), as shown in \figref{ernst2002humans_setup}.
On each trial, participants compared the visuo-haptic reference bar (of fixed height) to a visuo-haptic comparison bar (of variable height) in a \TIFC task (one bar presented first, then, after a pause, the other) and indicated which was taller.
The reference bar had different conflicting visual $s_v$ and haptic $s_h$ heights, and different noise levels were added to the visual feedback to increase its variance.
The objective was to determine the \PSE between the comparison and reference bars, \ie the comparison height for which the participant was equally likely to choose either bar (\percent{50} of the trials).
\figref{ernst2004merging_results} shows that when the visual noise was low, the visual feedback had more weight, but as the visual noise increased, the haptic feedback gained more weight, as predicted by the \MLE model.
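Indeed, \eqref{MLE} predicts the \PSE to lie at the weighted average $w_v s_v + w_h s_h$ of the two conflicting heights of the reference bar: with low visual noise ($w_v \to 1$), it stays close to the visual height $s_v$, and it shifts toward the haptic height $s_h$ as the visual noise increases.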

\begin{subfigs}{ernst2002humans}{
Visuo-haptic perception of the height of a virtual bar \cite{ernst2002humans}.
}[][
\item Experimental setup.
\item Proportion of trials (vertical axis) where the comparison bar (horizontal axis) was perceived as taller than the reference bar, as a function of the increasing variance (inverse of the reliability) of the visual feedback (colors).
The reference bar had different conflicting visual $s_v$ and haptic $s_h$ heights.
]
\subfig[.34]{ernst2002humans_setup}
\subfig[.64]{ernst2004merging_results}
\end{subfigs}

The \MLE model implies that when seeing and touching a virtual object in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a single coherent object property.

\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}

Thus, a visuo-haptic perception of an object's property is robust to some differences between the two sensory modalities, as long as their respective sensations can be matched to the same property.
In particular, the texture perception of objects is known to be constructed from both vision and touch \cite{klatzky2010multisensory}.
More precisely, when surfaces are evaluated by vision or touch alone, both senses discriminate their materials mainly by the same properties of roughness, hardness, and friction, and with similar performance \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.

The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that a few real objects seemed sufficient to match all the visual virtual objects (\figref{gunther2022smooth}).

\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual virtual objects \cite{gunther2022smooth}.}

Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, on a fixed \VST-\AR screen (\secref{ar_displays}), visually deforming the geometry of a real object touched by the hand, as well as that of the touching hand, can alter the visuo-haptic perception of the object's size, shape, or curvature \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion on a hard real surface by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture, rendered as vibrations through a hand-held stylus (\secref{texture_rendering}), by adding small oscillations to the visual feedback of the stylus on a screen.
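A common formalization of these techniques, sketched here under the simplifying assumption of a pure control-display gain, maps the real displacement $d_r$ of the hand or object to a displayed displacement $d_v = g\,d_r$: a gain $g < 1$ visually underestimates the finger's motion, making a pressed surface appear to resist more and thus feel stiffer, while $g > 1$ produces the opposite illusion.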

\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[][
\item A virtual soft texture projected on a table, which deforms when pressed by the hand \cite{punpongsanon2015softar}.
\item Visually deforming a real object and the hand touching it in \VST-\AR to alter the object's perceived shape \cite{ban2014displaying}.
]
\subfigsheight{50mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\end{subfigs}

\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{ar_vr_haptic}

Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR.

In \VST-\AR, \textcite{knorlein2009influence} rendered a virtual piston with a force-feedback device that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other had a varying comparison stiffness but no delay.\footnote{Participants were not told about the delays and stiffness values tested, nor which piston was the reference or the comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).

\begin{subfigs}{visuo-haptic-stiffness}{
Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}.
}[][
\item Participant pressing a virtual piston rendered by a force-feedback device.
\item Proportion of trials where the comparison piston was perceived as stiffer than the reference piston (vertical axis), as a function of the comparison stiffness (horizontal axis) and of the visual and haptic delays of the reference (colors).
]
\subfigbox[.44]{knorlein2009influence_1}
\subfig[.55]{knorlein2009influence_2}
\end{subfigs}

The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both sight and proprioception as the ratio of the exerted force $F(t)$ to the displacement $\Delta L(t)$ of the piston, following \eqref{stiffness}, but with potential visual $\Delta t_v$ or haptic $\Delta t_h$ delays.
Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a visual delay in the displacement and decreases with a haptic delay in the force \cite{diluca2011effects}.
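This can be made explicit by rewriting the stiffness estimate with the delayed signals, assuming each delay simply shifts the corresponding signal in time:
\begin{equation}{stiffness_delays}
\tilde{k}(t) = \frac{F(t - \Delta t_h)}{\Delta L(t - \Delta t_v)}
\end{equation}
While pressing, both the force and the displacement increase over time: a delayed displacement (denominator) is thus underestimated, making the piston feel stiffer, whereas a delayed force (numerator) is underestimated, making it feel softer.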

\textcite{gaffary2017ar} compared the perceived stiffness of virtual pistons in \OST-\AR and in \VR.
This time, however, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of virtual objects feels \enquote{softer} in an \AE than in a full \VE.

Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a virtual object in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the virtual object.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag or a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag or a \qty{70}{\ms} haptic lead.

These studies have shown how the latency of the visual rendering of a virtual object, or the type of environment (\VE or \RE), can affect the perceived haptic stiffness of an object rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with \AR.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: the virtual piston was pressed via a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR.
\item The same view but in \VR.
]
\subfig[0.35]{gaffary2017ar_1}
\subfigbox[0.31]{gaffary2017ar_3}
\subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}

\subsection{Wearable Haptics for Direct Hand Interaction in AR}
\label{vhar_haptics}

A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.
Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed that relocate the haptic actuator to the outside of the finger or the hand, \eg the nail, the top of a phalanx, or the wrist.
Yet, these approaches differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback provided (\secref{tactile_rendering}), as well as in the placement of the haptic rendering.

Other wearable haptic actuators have been proposed for \AR but are not discussed here.
A first reason is that some permanently cover the fingertip and thus affect the interaction with the \RE, such as thin skin-worn tactile interfaces \cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces \cite{han2018hydroring}.
A second reason is that some rely on systems that cannot be considered portable, such as REVEL \cite{bau2012revel}, which provides friction sensations with reverse electrovibration but requires modifying the real objects to be augmented, or electrical muscle stimulation (EMS) devices \cite{lopes2018adding}, which provide kinesthetic feedback by contracting the user's muscles.

\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}

\textcite{ando2007fingernailmounted} were the first to move the actuator away from the fingertip.
They proposed to relocate it to the nail instead, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching virtual objects (\figref{teng2021touch_1}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
When touching virtual objects in \OST-\AR with the index finger, this device was rated as more realistic overall (5/7) than vibrations from a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, the folding mechanism has a high latency (\qty{92}{\ms}), and this design is not suitable for augmenting real objects.

With Fingeret, \textcite{maeda2022fingeret} adapted the belt actuators (\secref{belt_actuators}) into a \enquote{finger-side actuator} that leaves the fingertip free (\figref{maeda2022fingeret}).
Two rollers, one on each side of the finger, deform the skin: when rotated inward, they pull the skin, simulating a contact sensation, and when rotated outward, they push the skin, simulating a release sensation.
They can also simulate a texture sensation by rapidly rotating back and forth.
In a user study conducted not in \AR but on a tablet whose displayed images were touched directly, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as with \textcite{teng2021touch}, finger speed was not taken into account when rendering vibrations, which may have been detrimental to texture perception, as described in \secref{texture_rendering}.

Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: compact and lightweight vibrotactile \LRA devices providing both integrated finger motion sensing and low-latency haptic feedback (below \qty{5}{\ms}).
However, neither device has been properly evaluated in a user study in \AR.

\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[][
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip \cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin \cite{maeda2022fingeret}.
\item Haplets is a compact nail device with integrated sensing and vibrotactile feedback \cite{preechayasomboon2021haplets}.
]
\subfigsheight{33mm}
\subfigbox{teng2021touch_1}
\subfigbox{maeda2022fingeret}
\subfigbox{preechayasomboon2021haplets}
\end{subfigs}

\subsubsection{Belt Devices}
\label{vhar_rings}

The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of virtual objects in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, and friction of smooth real surfaces, and to render localized bumps and holes on them (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but they have not been tested in \AR.

In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
\textcite{scheggi2010shape} reported that 12 out of 15 participants found the haptic weight feedback essential to feeling the presence of the virtual cube.

In a pick-and-place task in \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual feedback showing the tracked fingertips as virtual points.
They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived as more effective than the moving platform.
However, the measured difference in performance could be due to the device, to its position (proximal phalanx \vs fingertip), or to both.
These two studies were also conducted in static setups, where users viewed a screen displaying the visual interactions, and they compared the haptic and visual feedback of the hand-object contacts only separately, without examining them together.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[][
\item Rendering the weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube \cite{maisto2017evaluation,meli2018combining}.
]
\subfigbox[.48][m]{scheggi2010shape}
\subfig[.48][m]{maisto2017evaluation}
\end{subfigs}

With their \enquote{Tactile And Squeeze Bracelet Interface} (Tasbi), already mentioned in \secref{belt_actuators}, \textcite{pezent2019tasbi} and \textcite{pezent2022design} explored the use of a wrist-worn bracelet actuator.
It is capable of providing a uniform pressure sensation (up to \qty{15}{\N}, at up to \qty{10}{\Hz}) and vibrations with its six \LRAs (\qtyrange{150}{200}{\Hz} bandwidth).
A user study conducted in \VR compared the perception of visuo-haptic stiffness rendering \cite{pezent2019tasbi} and showed that the haptic pressure feedback weighed more than the visual displacement in the perceived stiffness.

\subsection{Conclusion}
\label{visuo_haptic_conclusion}

Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation of virtual objects in \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the finger phalanges, or the wrist, but it remains unclear which of them, if any, is best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.

When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or by the visual rendering of the virtual object.