\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic}
Everyday perception and manipulation of objects with the hand typically involves both the visual and haptic senses.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but both are similarly capable of perceiving many material properties, such as roughness, hardness, or friction \cite{baumgartner2013visual}.
Rendering a \VO with both visual and haptic feedback that feels coherent is a challenge, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{visuo_haptic_perception}
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a multimodal visuo-haptic rendering of a \VO is perceived.
\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
Examples include the hardness of an object perceived through skin pressure and force sensations (\secref{hardness}), the hand movement perceived from both proprioception and a visual hand avatar (\secref{ar_displays}), or the size of a tangible perceived together with a co-localized \VO (\secref{ar_tangibles}).
If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable and may give different answers to the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimated property $\tilde{s}$ is the weighted sum of the individual sensory estimates:
\begin{equation}\label{MLE}
\tilde{s} = \sum_i w_i \tilde{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
And the integrated variance $\sigma^2$ is the inverse of the sum of the individual inverse variances:
\begin{equation}
\sigma^2 = \left( \sum_i \frac{1}{\sigma_i^2} \right)^{-1}
\end{equation}
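To illustrate with arbitrary values: assume a visual estimate $\tilde{s}_v = \qty{50}{\mm}$ of variance $\sigma_v^2 = 1$ and a haptic estimate $\tilde{s}_h = \qty{55}{\mm}$ of variance $\sigma_h^2 = 4$ (in arbitrary units), giving the inverse-variance weights $w_v = 0.8$ and $w_h = 0.2$:
\begin{equation}
\tilde{s} = 0.8 \times 50 + 0.2 \times 55 = \qty{51}{\mm} \quad \text{and} \quad \sigma^2 = \left( \frac{1}{1} + \frac{1}{4} \right)^{-1} = 0.8
\end{equation}
The integrated estimate is thus pulled toward the more reliable (here visual) sense, and its variance is lower than that of either sense alone.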
This was demonstrated by \textcite{ernst2002humans} in a user study where participants estimated the height of a virtual bar using a fixed-window \OST-\AR display (\secref{ar_displays}) and force-feedback devices worn on the thumb and index finger (\secref{wearability_level}), as shown in \figref{ernst2002humans_setup}.
%They first measured the individual variances of the visual and haptic estimates (\figref{ernst2002humans_within}) and then the combined variance of the visuo-haptic estimates (\figref{ernst2002humans_visuo-haptic}).
On each trial, participants compared the visuo-haptic reference bar (of a fixed height) to a visuo-haptic comparison bar (of a variable height) in a \TIFC task (one bar is tested first, a pause, then the other) and indicated which was taller.
The reference bar had different conflicting visual $s_v$ and haptic $s_h$ heights, and different noise levels were added to the visual feedback to increase its variance.
The objective was to determine a \PSE between the comparison and reference bars, where the participant was equally likely to choose one or the other (\percent{50} of the trials).
%\figref{ernst2002humans_within} shows the discrimination of participants with only the haptic or visual feedback, and how much the estimation becomes difficult (thus higher variance) when noise is added to the visual feedback.
\figref{ernst2004merging_results} shows that when the visual noise was low, the visual feedback had more weight, but as visual noise increased, haptic feedback gained more weight, as predicted by the \MLE model.
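Note that the \MLE model predicts this \PSE directly from the conflicting heights of the reference bar: with the weights defined above, the reference bar is perceived at the height
\begin{equation}
\text{PSE} = w_v s_v + w_h s_h
\end{equation}
so that measuring how the \PSE shifts with the visual noise level recovers the empirical weight of each sense.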
\begin{subfigs}{ernst2002humans}{Visuo-haptic perception of height of a virtual bar \cite{ernst2002humans}. }[
\item Experimental setup.%: Participants estimated height visually with an \OST-\AR display and haptically with force-feedback devices worn on the thumb and index fingers.
\end{subfigs}
%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that when seeing and touching a \VO in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property.
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Tangible Perception}
\label{visual_haptic_influence}
Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of everyday objects is known to be constructed from both vision and touch \cite{klatzky2010multisensory}.
More precisely, when surfaces are evaluated by vision or touch alone, both senses discriminate their materials mainly by the same properties of roughness, hardness, and friction, and with similar performance \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
\fig{gunther2022smooth}{In a passive touch context in \VR, only two real surfaces, one smooth and one rough, were found to match all the visual \VOs \cite{gunther2022smooth}.}
Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture rendered as vibrations through a hand-held stylus (\secref{texture_rendering}) by adding small oscillations to the visual feedback of the stylus on a screen.
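Most of these pseudo-haptic effects can be sketched with a simple control-display gain (a minimal formulation in our notation, not the exact models of the cited works): the displayed position $p_d$ of the hand or tool is decoupled from its real position $p_r$ as
\begin{equation}
p_d = k \, p_r
\end{equation}
where, for instance, a gain $k < 1$ that visually slows the hand sliding on a surface can increase its perceived friction \cite{ujitoko2021survey}.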
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table and that deforms when pressed by the hand \cite{punpongsanon2015softar}.
\subfig{ban2014displaying}
\end{subfigs}
%In all of these studies, the visual expectations of participants influenced their haptic perception.
%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}
Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.\footnote{Participants were not told about the delays and stiffness values tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
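A simple account of these results (a sketch in our notation, not necessarily the model of \textcite{knorlein2009influence}): if stiffness is estimated as the ratio of the felt force to the seen displacement while pressing, a visual delay $\delta_v$ and a haptic delay $\delta_h$ distort the estimate $\tilde{k}$ of the true stiffness $k$ as
\begin{equation}
\tilde{k}(t) = \frac{k \, x(t - \delta_h)}{x(t - \delta_v)}
\end{equation}
where $x(t)$ is the increasing indentation of the piston: a visual delay alone yields $\tilde{k} > k$, a haptic delay alone yields $\tilde{k} < k$, and equal delays cancel out.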
\begin{subfigs}{visuo-haptic-stiffness}{Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}. }[
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a \VE.
\subfig[0.3]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the \VO.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\label{vhar_haptics}
A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed to move the actuator to a different location on the hand.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback they provide (\secref{tactile_rendering}), as well as in the placement of the haptic rendering.
Other wearable haptic actuators have been proposed for \AR, but are not discussed here.
A first reason is that some, such as thin-skin tactile interfaces \cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces \cite{han2018hydroring}, permanently cover the fingertip and affect the interaction with the \RE.
Another category of actuators relies on systems that cannot be considered portable, such as REVEL \cite{bau2012revel}, which provides friction sensations with reverse electrovibration but requires modifying the real objects to be augmented, or Electrical Muscle Stimulation (EMS) devices \cite{lopes2018adding}, which provide kinesthetic feedback by contracting the muscles.
\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}
\textcite{ando2007fingernailmounted} were the first to propose moving the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
With Fingeret, \textcite{maeda2022fingeret} adapted the belt actuators (\secref{belt_actuators}) as a \enquote{finger-side actuator} that leaves the fingertip free (\figref{maeda2022fingeret}).
Two rollers, one on each side, can deform the skin: When rotated inward, they pull the skin, simulating a contact sensation, and when rotated outward, they push the skin, simulating a release sensation.
They can also simulate a texture sensation by rapidly rotating in and out.
%The device is also very compact (\qty{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study conducted not in \AR but with participants directly touching images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA vibrating at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but no different from the vibrations for rendering high-frequency textures (3.5/7 for both).
However, as with \textcite{teng2021touch}, finger speed was not taken into account when rendering vibrations, which may have been detrimental to texture perception, as described in \secref{texture_rendering}.
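As a reminder of what such a speed-dependent rendering involves (a minimal model of \secref{texture_rendering}, assuming a periodic texture of spatial period $\lambda$ explored at finger speed $v(t)$): the vibration should be generated at the instantaneous frequency
\begin{equation}
f(t) = \frac{v(t)}{\lambda}
\end{equation}
so that a vibration of constant frequency is only consistent with a single exploration speed.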
Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: These are very compact and lightweight vibrotactile \LRA devices designed to provide both integrated finger motion sensing and very low latency haptic feedback (\qty{<5}{\ms}).
However, no proper user study has been conducted to evaluate these devices in \AR.
\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper \cite{ando2007fingernailmounted}.
\subsubsection{Belt Devices}
\label{vhar_rings}
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, and friction of smooth real surfaces and to render localized bumps and holes on them (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but they have not been tested in \AR.
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
The middle phalanx of each of these fingers was equipped with a haptic ring.
%However, no proper user study was conducted to evaluate this feedback.% on the manipulation of the cube.
%that simulated the weight of the cube.
%A virtual cube that could push on the cube was manipulated with the other hand through a force-feedback device.
\textcite{scheggi2010shape} report that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.
In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual rendering of the tracked fingertips as virtual points.
They showed that the haptic feedback improved completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived as more effective than the moving platform.
However, the measured difference in performance could be due to either the device or the device position (proximal vs fingertip), or both.
These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual rendering of the hand-object contacts, but did not examine them together.
\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering the weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.