\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?
%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.
One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE~\cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should aim at \enquote{modulating the feel of a real object by virtual [haptic] feedback}~\cite{jeon2009haptic}, \ie enabling a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.

\subsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}

\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}

As discussed in the previous sections, a \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}. Examples include the haptic hardness perceived through skin pressure and force sensations~\secref{hardness}, the hand movement perceived from proprioception and a visual hand avatar~\secref{ar_displays}, or the perceived size of a tangible with a co-localized \VO~\secref{ar_tangibles}. When the sensations are redundant, \ie any single sensory modality could be used on its own to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}. The \MLE model explains how one estimates the perception from multiple sensory cues by weighting them according to their perceived reliability~\cite{ernst2002optimal}. For real textures in particular, it is known that touch and sight individually perceive textures equally well and in a similar way~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
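Following \textcite{ernst2002optimal}, this integration can be illustrated as a reliability-weighted average (the notation below is introduced here for illustration only). Writing $\hat{S}_V$ and $\hat{S}_H$ for the visual and haptic estimates of the same property, with variances $\sigma_V^2$ and $\sigma_H^2$, the integrated estimate is
\begin{equation}
\hat{S} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\end{equation}
so that the more reliable cue receives the larger weight, and the variance of the combined estimate, $\sigma_{VH}^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$, is lower than that of either cue alone.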
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness, and friction of real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
% Spring compliance is perceived by combining the sensed force exerted by the spring with the displacement caused by the action (sensed through vision and proprioception). diluca2011effects
% The ability to discriminate whether two stimuli are simultaneous is important to determine whether stimuli should be bound together and form a single multisensory perceptual object. diluca2019perceptual
Similarly, but in \VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structures touched with the finger to induce the perception of a larger set of visuo-haptic materials. \textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a \VO touching the arm, physically rendered with a tangible object, influenced the perception of roughness. A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.

When performing a precision grasp (\secref{grasp_types}) in \VR, some discrepancy in spatial properties (\secref{spatial_properties}) between a tangible and a \VO is not noticeable by users: a relative difference of \percent{6} for the object width, \percent{44} for the surface orientation, and \percent{67} for the surface curvature was needed for the discrepancy to be perceived~\cite{detinguy2019how}.
%When performing a precision grasp (\secref{grasp_types}) in \VR, only a certain relative difference between the tangible and the \VO is noticeable: \percent{6} for the object width, \percent{44} for the surface orientation, and \percent{67} for the surface curvature~\cite{detinguy2019how}.

\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}

% Visual feedback in VR and AR is known to influence haptic perception [13]. The phenomenon of "visual dominance" was notably observed when estimating the stiffness of \VOs. Lécuyer et al. [13] based their "pseudo-haptic feedback" approach on this notion of visual dominance. gaffary2017ar
Several works have used pseudo-haptic feedback, \ie deforming the visual representation of a user input, to change the perception of haptic stimuli and create richer feedback~\cite{ujitoko2021survey}. For example, different levels of stiffness can be simulated on a grasped \VO with the same passive haptic device~\cite{achibet2017flexifingers}, and the perceived softness of tangible objects can be altered by superimposing in \AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in \VR~\cite{choi2021augmenting}. Many other pseudo-haptic techniques have been proposed to alter perceived object properties such as shape, weight, stiffness, or texture~\cite{ban2012modifying,ban2014displaying,taima2014controlling,ujitoko2019presenting,costes2019touchy,kalus2024simulating,detinguy2019how,samad2019pseudohaptic,issartel2015perceiving,ogawa2021effect}.
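To make this principle concrete, consider a simplified control/display (C/D) ratio formulation of pseudo-haptic stiffness (the notation is introduced here for illustration; the works cited above each define their own mappings). If the displacement $d_v$ of the \VO shown to the user is a scaled version $d_v = g\,d_r$ of the real displacement $d_r$ of the hand, the same physical resistance $F = k\,d_r$ is visually attributed to an apparent stiffness
\begin{equation}
\tilde{k} \approx \frac{F}{d_v} = \frac{k}{g},
\end{equation}
so that a visual gain $g < 1$ makes the object appear stiffer, and a gain $g > 1$ softer.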
A vibrotactile sinusoidal rendering of a virtual texture was also combined with visual oscillations of a cursor on a screen to increase the perceived roughness of the texture~\cite{ujitoko2019modulating}. However, the visual representation was a virtual cursor seen on a screen, while the haptic feedback was felt through a hand-held device. Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-localized visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback. Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localized visuo-haptic feedback can be experienced differently in \AR and \VR, which we aim to investigate in this work.

\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}

Some studies have investigated the visuo-haptic perception of \VOs in \AR and \VR with grounded force-feedback devices.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}). In a \TAFC task, participants pressed two pistons and indicated which was stiffer. One piston had the reference stiffness with an additional visual or haptic delay, while the other had a varying comparison stiffness and no delay.\footnote{Participants were not told about the delays and stiffness levels tested, nor which piston was the reference or the comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
\begin{subfigs}{visuo-haptic-stiffness}{Perception of haptic stiffness in \VST-\AR~\cite{knorlein2009influence}. }[
\item Participant pressing a virtual piston rendered by a force-feedback device with their hand.
\item Proportion of trials in which the comparison piston was perceived as stiffer than the reference piston (vertical axis) as a function of the comparison stiffness (horizontal axis) and of the visual and haptic delays of the reference (colors). ]
\subfig[.44]{knorlein2009influence_1}
\subfig[.55]{knorlein2009influence_2}
\end{subfigs}
These delays affect how the visual and haptic perceptual cues of stiffness are integrated. The stiffness $k$ of the piston is indeed estimated by both sight and proprioception as the ratio of the exerted force $F$ to the displacement $D$ of the piston, following \eqref{stiffness}. A delay $\Delta t$ between the two cues modifies this estimate to:
\begin{equation}
\label{eq:stiffness_delay}
k = \frac{F(t_A)}{D (t_B)}
\end{equation}
where $t_B = t_A + \Delta t$. Therefore, a haptic delay (positive $\Delta t$, the force lagging behind the displacement) decreases the perceived stiffness $k$, while a visual delay of the displacement (negative $\Delta t$) increases it~\cite{diluca2011effects}.
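As a simple illustration, assume (for this example only) that the piston is pressed at a constant velocity $v$ against a linear stiffness $k$, so that $D(t) = vt$ and $F(t) = kvt$. A haptic delay $\Delta t > 0$ then yields a perceived stiffness
\begin{equation}
\frac{F(t_A)}{D(t_A + \Delta t)} = \frac{k\,v\,t_A}{v\,(t_A + \Delta t)} = k\,\frac{t_A}{t_A + \Delta t} < k,
\end{equation}
a bias that is strongest at the beginning of the press and fades as $t_A$ grows, the delayed cues progressively converging.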
In a similar \TAFC user study, participants compared the perceived stiffness of virtual pistons in \OST-\AR and in \VR~\cite{gaffary2017ar}. However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}). The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR. This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (video or optical) and seeing the hand touching the \VO.
\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR~\cite{gaffary2017ar}. }[
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR and
\item in \VR. ]
\subfig[0.35]{gaffary2017ar_1}
\subfig[0.3]{gaffary2017ar_3}
\subfig[0.3]{gaffary2017ar_4}
\end{subfigs}

Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic feedback in \VR. In a user study, participants touched a virtual cube with a virtual hand: the contact was rendered both with a vibrotactile piezoelectric device on the fingertip and with a visual change in the cube color. The visuo-haptic simultaneity was varied by either adding a visual delay or triggering the haptic feedback earlier. No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag or a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag or a \qty{70}{\ms} haptic lead.

These studies have shown how the latency of the visual rendering of a \VO, or the type of environment (\VE or \RE), can affect the perceived haptic stiffness of an object rendered with a grounded force-feedback device, and how small visuo-haptic asynchronies can go unnoticed. We describe in the next section how wearable haptics have been integrated with immersive \AR.

\subsection{Wearable Haptics for AR}
\label{vhar_haptics}

A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR. The main challenge of wearable haptics for \AR is to provide haptic sensations of virtual or augmented objects that are touched and manipulated directly with the fingers, while keeping the fingertips free to interact with the \RE. Several approaches have been proposed that move the actuator away from the fingertip to another location on the hand. Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback they can provide (\secref{tactile_rendering}), as well as in the placement of the haptic rendering.

Other wearable haptic actuators have been proposed for \AR but are not detailed here. A first reason is that some permanently cover the fingertip and affect the interaction with the \RE, such as thin skin-worn tactile interfaces~\cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces~\cite{han2018hydroring}. Another category relies on systems that cannot be considered portable, such as REVEL~\cite{bau2012revel}, which provides friction sensations through reverse electrovibration but requires modifying the real objects to augment, or Electrical Muscle Stimulation (EMS) devices~\cite{lopes2018adding}, which provide kinesthetic feedback by contracting the muscles.

\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}

\textcite{ando2007fingernailmounted} were the first to propose this approach, which they implemented with a voice-coil actuator mounted on the index fingernail (\figref{ando2007fingernailmounted}). The sensation of crossing the edges of a virtual patterned texture (\secref{texture_rendering}) overlaid on a real sheet of paper was rendered with \qty{20}{\ms} vibration impulses at \qty{130}{\Hz}. Participants were able to match the virtual patterns to their real counterparts of height \qty{0.25}{\mm} and width \qtyrange{1}{10}{\mm}, but systematically overestimated the virtual width by \qty{4}{\mm}.

This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device mounted on the nail but able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}). The moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure (\qty{0.34}{\N} force) and texture (\qtyrange{150}{190}{\Hz} bandwidth) sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7). Still, the folding mechanism has a high latency (\qty{92}{\ms}), and this design is not suitable for augmenting real tangible objects.
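As an illustration of this event-based texture rendering, the following minimal sketch (in Python, with hypothetical names and parameters loosely following the impulses of \textcite{ando2007fingernailmounted}) triggers a short vibration burst whenever the tracked fingertip crosses an edge of a virtual grating; the decaying envelope is an assumption, not a detail reported by the authors.
\begin{verbatim}
import numpy as np

# Hypothetical parameters, loosely based on ando2007fingernailmounted:
# a 20 ms, 130 Hz impulse is emitted each time the finger crosses an edge.
GRATING_PERIOD_M = 0.005   # spatial period of the virtual grating (5 mm)
IMPULSE_DURATION_S = 0.020
IMPULSE_FREQ_HZ = 130.0

def edge_crossed(x_prev_m, x_curr_m):
    """True if the fingertip crossed a grating edge between two frames."""
    return int(x_prev_m // GRATING_PERIOD_M) != int(x_curr_m // GRATING_PERIOD_M)

def impulse_waveform(sample_rate_hz=8000):
    """Short sinusoidal burst sent to the voice-coil (decay envelope assumed)."""
    t = np.arange(0.0, IMPULSE_DURATION_S, 1.0 / sample_rate_hz)
    return np.sin(2 * np.pi * IMPULSE_FREQ_HZ * t) * np.exp(-t / IMPULSE_DURATION_S)
\end{verbatim}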
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
To keep the fingertip always free, \textcite{maeda2022fingeret} proposed with Fingeret to adapt the belt actuators (\secref{belt_actuators}) into a \enquote{finger-side actuator} (\figref{maeda2022fingeret}). Mounted on the nail, the device actuates two rollers, one on each side of the fingertip, to deform the skin: when the rollers both rotate inwards (towards the pad) they pull the skin, simulating a contact sensation, and when they both rotate outwards (towards the nail) they push the skin, simulating a release sensation. With quick rotations, the rollers can also simulate a texture sensation.
%The device is also very compact (\qtyproduct{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study conducted not in \AR but with participants touching different images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both). However, as for \textcite{teng2021touch}, finger speed was not taken into account when rendering the vibrations, which may have been detrimental to texture perception (\secref{texture_rendering}); a simple speed-dependent formulation is recalled below the figure.
\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
\item A voice-coil rendering a virtual haptic texture on a real sheet of paper~\cite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip~\cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin~\cite{maeda2022fingeret}. ]
\subfigsheight{33mm}
\subfig{ando2007fingernailmounted}
\subfig{teng2021touch}
\subfig{maeda2022fingeret}
\end{subfigs}
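A common way to account for the finger speed when rendering a periodic texture is to drive the vibration frequency from the exploration speed. For a texture of spatial wavelength $\lambda$ explored at speed $v(t)$ (notation introduced here for illustration), the vibration is played at the temporal frequency
\begin{equation}
f(t) = \frac{v(t)}{\lambda},
\end{equation}
so that, as on a real grating, faster finger motion produces a proportionally higher vibration frequency, whereas the constant-frequency renderings used in the two studies above break this relation.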
\subsubsection{Ring Belt Devices}
\label{vhar_rings}

The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been employed to improve the manipulation of \VOs in \AR, which is a fundamental task within a \VE (\secref{ar_interaction}).

In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight (\secref{weight_rendering}) of a virtual cube placed on a real surface held with the thumb, index, and middle fingers (\figref{scheggi2010shape}). The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
%However, no proper user study was conducted to evaluate this feedback.% on the manipulation of the cube. %that simulated the weight of the cube.
%A virtual cube that could push on the cube was manipulated with the other hand through a force-feedback device.
\textcite{scheggi2010shape} reported that 12 out of 15 participants found the weight haptic feedback essential to feel the presence of the virtual cube.

In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts. They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual rendering of the tracked fingertips as virtual points. They showed that the haptic feedback improved completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{ar_visual_hands}). The haptic ring was also perceived by users as more effective than the moving platform. However, the measured difference in performance could be attributed to the device, to its position (proximal phalanx \vs fingertip), or to both. These two studies were also conducted in non-immersive setups, where users looked at a screen displaying the visual interactions, and they only compared haptic and visual renderings of the hand-object contacts, without examining them in combination.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering the weight of a virtual cube placed on a real surface~\cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube~\cite{maisto2017evaluation,meli2018combining}. ]
\subfigsheight{53mm}
\subfig{scheggi2010shape}
\subfig{maisto2017evaluation}
\end{subfigs}

\subsubsection{Wrist Bracelet Devices}
\label{vhar_bracelets}

With their \enquote{Tactile And Squeeze Bracelet Interface} (Tasbi), already mentioned in \secref{belt_actuators}, \textcite{pezent2019tasbi} and \textcite{pezent2022design} explored the use of a wrist-worn bracelet actuator. It is capable of providing a uniform pressure sensation (up to \qty{15}{\N} and up to \qty{10}{\Hz}) and vibrations with six \LRAs (\qtyrange{150}{200}{\Hz} bandwidth). A user study was conducted in \VR to compare the perception of visuo-haptic stiffness rendering~\cite{pezent2019tasbi}. In a \TAFC task, participants pressed a virtual button with different levels of stiffness via a virtual hand constrained by the \VE (\figref{pezent2019tasbi_2}). A higher visual stiffness required a larger physical displacement to press the button (C/D ratio, see \secref{pseudo_haptic}), while the haptic stiffness controlled the rate of the pressure feedback during pressing. When the visual and haptic stiffness were coherent, or when only the haptic stiffness changed, participants easily discriminated two buttons with different stiffness levels (\figref{pezent2019tasbi_3}). However, when only the visual stiffness changed, participants were not able to discriminate the different stiffness levels (\figref{pezent2019tasbi_4}). This suggests that in \VR, the haptic pressure is a more important perceptual cue than the visual displacement for rendering stiffness. A short vibration (a \qty{25}{\ms}, \qty{175}{\Hz} square wave) was also rendered when contacting the button, but it was kept constant across all conditions: it may have affected the overall perception when only the visual stiffness changed. A minimal sketch of this visuo-haptic button rendering is given after the figure below.

\begin{subfigs}{pezent2019tasbi}{Visuo-haptic stiffness rendering of a virtual button in \VR with the Tasbi bracelet. }[
\item The \VE seen by the user: the virtual hand (in beige) is constrained by the virtual button. The displacement is proportional to the visual stiffness. The real hand (in green) is hidden by the \VE.
\item When the rendered visuo-haptic stiffnesses are coherent (in purple) or only the haptic stiffness changes (in blue), participants easily discriminated the different levels.
\item When varying only the visual stiffness (in red) but keeping the haptic stiffness constant, participants were not able to discriminate the different stiffness levels. ]
\subfigsheight{45mm}
\subfig{pezent2019tasbi_2}
\subfig{pezent2019tasbi_3}
\subfig{pezent2019tasbi_4}
\end{subfigs}
%\subsection{Conclusion}
%\label{visuo_haptic_conclusion}
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
% The haptic feedback is thus rendered de-localized from the point of contact of the finger on the rendered object.
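To summarize the rendering principle of this study, the following minimal sketch (in Python, with assumed names, parameters, and mappings; \textcite{pezent2019tasbi} describe their own implementation) decouples the visual stiffness, which scales how much hand displacement is needed to depress the button, from the haptic stiffness, which scales the squeeze pressure commanded to the bracelet:
\begin{verbatim}
# Minimal sketch (assumed names and parameters) of the decoupled
# visuo-haptic button rendering described above.
BUTTON_TRAVEL_M = 0.01  # full travel of the virtual button (assumed 1 cm)

def render_button(hand_depth_m, k_visual, k_haptic):
    """Map the real hand penetration into (visual button depth, squeeze force).

    k_visual > 1 requires more hand displacement to fully press the button
    (a control/display mapping, see the pseudo-haptic section); k_haptic
    scales the pressure ramp felt through the bracelet while pressing.
    """
    button_depth_m = min(hand_depth_m / k_visual, BUTTON_TRAVEL_M)
    squeeze_force_n = min(k_haptic * button_depth_m, 15.0)  # device limit ~15 N
    return button_depth_m, squeeze_force_n
\end{verbatim}
Only the visual mapping changes what the user sees, and only the haptic mapping changes what they feel, which is what allowed the study to vary the two cues independently.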