% phd-thesis/1-introduction/related-work/4-visuo-haptic-ar.tex (2024-09-12)
\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic_ar}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.
\subsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
The psychophysical model of \textcite{ernst2002humans} established that the senses are integrated in a statistically optimal fashion, with the least variable sense dominating the resulting percept~\cite{ernst2004merging}.
In particular, touch and sight on their own are known to perceive real textures similarly and with comparable accuracy~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
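This maximum-likelihood integration can be summarised as follows (notation ours): the combined estimate $\hat{S}$ of an object property is a weighted average of the visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$, with weights inversely proportional to their variances,
\[
\hat{S} = w_V\,\hat{S}_V + w_H\,\hat{S}_H,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\]
so that the variance of the combined estimate, $\sigma^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$, is lower than that of either modality alone.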
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness, and friction of real tactile textures touched with the finger by superimposing different visual textures using a half-mirror.
Similarly, in VR, \textcite{degraen2019enhancing} combined visual textures with passive haptic hair-like structures actively touched with the finger to induce a larger set of perceived visuo-haptic materials.
In a complementary passive-touch context, \textcite{gunther2022smooth} studied how the visual rendering of a virtual object touching the arm, actuated with a tangible object, influenced the perceived roughness.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}
A few works have also used pseudo-haptic feedback, which deforms the visual representation of a user input to change the perception of a haptic stimulus and thus create richer feedback~\cite{ujitoko2021survey}.
For example, different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers}, and the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
Pseudo-haptic effects have similarly been used to alter other perceived properties of real and virtual objects, such as their shape, weight, or friction~\cite{ban2012modifying,ban2014displaying,taima2014controlling,ujitoko2019presenting,costes2019touchy,kalus2024simulating,detinguy2019how,samad2019pseudohaptic,issartel2015perceiving,ogawa2021effect}.
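A common underlying mechanism is a manipulation of the control/display ratio (notation ours, following the terminology surveyed by \textcite{ujitoko2021survey}): if the displayed displacement $d_v$ of the pressed or moved object is scaled relative to the real displacement $d_r$ as $d_v = r\,d_r$, then for the same applied force $F$ the visually inferred stiffness
\[
\hat{k} = \frac{F}{d_v} = \frac{F}{r\,d_r}
\]
increases when $r < 1$ and decreases when $r > 1$, without any change to the physical prop.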
The vibrotactile sinusoidal rendering of virtual textures mentioned above was also combined with visual oscillations of a cursor on a screen to increase the perceived roughness of the texture~\cite{ujitoko2019modulating}.
In this case, however, the visual representation was a virtual cursor seen on a screen, while the haptic feedback was felt through a hand-held device.
Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-localised visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback.
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.
\subsubsection{Comparing Haptics in AR \vs VR}
\label{AR_vs_VR}
A few studies specifically compared visuo-haptic perception in AR \vs VR.
Rendering a virtual piston pressed with one's real hand using a video see-through (VST) AR headset and a force feedback haptic device, \textcite{knorlein2009influence} showed that a visual delay increased the perceived stiffness of the piston, whereas a haptic delay decreased it.
\textcite{diluca2011effects} went on to explain how these delays affected the weighting of visual and haptic information in perceived stiffness.
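An intuition for this effect (notation ours): perceived stiffness is the ratio of felt force to seen displacement, so with a visual delay $\delta_v$ the seen displacement lags behind the felt force while pressing, and the apparent stiffness
\[
\hat{k}(t) = \frac{F(t)}{d(t - \delta_v)}
\]
increases, since $d(t - \delta_v) < d(t)$ as the finger moves inwards; a haptic delay has the symmetric, softening effect.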
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
While a large body of literature has investigated such perceptual differences between AR and VR for vision, much less is known about visuo-haptic perception in AR \vs VR.
\subsection{Wearable Haptics for AR}
\label{vhar_haptics}
\subsubsection{Fingertip-Free Haptic Devices}
\label{vhar_devices}
A few wearable haptic devices have been specifically designed to render haptic sensations and improve interaction with virtual and augmented objects in \AR while keeping the fingertip free.
They all augment the hand perception with virtual haptic sensations, but they differ in the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, etc.), and the position of the actuator on the hand.
\paragraph{Fingernail Actuators}
\textcite{ando2007fingernailmounted} were the first to propose moving the haptic actuator away from the fingertip to allow it to interact freely with the \RE while in \AR.
As shown in \figref{ando2007fingernailmounted}, they placed a voice-coil on the index nail that generated \qty{20}{\ms} burst impulses at \qty{130}{\Hz}.
It rendered the sensation of crossing edges of a virtual patterned texture (see \secref{texture_rendering}) on a real sheet of paper, and participants were able to match the virtual patterns to their real counterparts.
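This event-based rendering can be sketched as follows; the burst frequency and duration are the values reported by \textcite{ando2007fingernailmounted}, while the function names, output sample rate, and grating period are our own illustrative assumptions.

```python
import math

SAMPLE_RATE = 8000      # Hz, output rate: our assumption, not from the paper
BURST_FREQ = 130.0      # Hz, burst carrier frequency (Ando et al., 2007)
BURST_DURATION = 0.020  # s, 20 ms burst duration (Ando et al., 2007)

def edge_burst():
    """Samples of one 20 ms, 130 Hz sinusoidal burst, played on the
    voice-coil each time the fingertip crosses a virtual edge."""
    n = int(SAMPLE_RATE * BURST_DURATION)
    return [math.sin(2 * math.pi * BURST_FREQ * i / SAMPLE_RATE)
            for i in range(n)]

def edge_crossings(positions_mm, period_mm=2.0):
    """Indices of finger-position samples where a grating line of the
    virtual patterned texture is crossed (2 mm period is hypothetical)."""
    crossings = []
    prev_cell = math.floor(positions_mm[0] / period_mm)
    for i, p in enumerate(positions_mm[1:], start=1):
        cell = math.floor(p / period_mm)
        if cell != prev_cell:  # finger entered a new grating cell
            crossings.append(i)
            prev_cell = cell
    return crossings
```

Each detected crossing would trigger one burst, so the pulse rate, and hence the perceived spatial pattern, follows the finger's motion over the real sheet of paper.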
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device mounted on the nail but able to unfold its end-effector on demand to make contact with the fingertip when touching virtual objects (see \figref{teng2021touch}).
This moving platform also contains a \LRA (see \secref{moving_platforms}) and provides contact pressure (\qty{0.34}{\N} force) and texture (\qtyrange{150}{190}{\Hz} frequencies) sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching virtual objects in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, the folding mechanism has a high latency (\qty{92}{\ms}), and this design is not suitable for augmenting real tangible objects.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
To keep the fingertip always free, \textcite{maeda2022fingeret} proposed with Fingeret to adapt belt actuators (see \secref{belt_actuators}) into a \enquote{finger-side actuator} instead (see \figref{maeda2022fingeret}).
Mounted on the nail, the device actuates two rollers, one on each side of the fingertip, to deform the skin: When the rollers both rotate inwards (towards the pad) they pull the skin, simulating a contact sensation, and when they both rotate outwards (towards the nail) they push the skin, simulating a release sensation.
By doing quick rotations, the rollers can also simulate a texture sensation.
%The device is also very compact (\qty{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study not in \AR, but involving touching different images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (see \secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as for \textcite{teng2021touch}, finger speed was not taken into account for rendering vibrations, which may have been detrimental to texture perception.
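Indeed, speed-aware texture rendering typically modulates the vibration with the exploration speed (notation ours): a grating of spatial wavelength $\lambda$ stroked at finger speed $v(t)$ produces a temporal frequency
\[
f(t) = \frac{v(t)}{\lambda},
\]
so a constant-frequency vibration cannot reproduce how a real texture feels at varying finger speeds.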
\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
\item A voice-coil mounted on the nail to render a virtual haptic texture on a real sheet of paper. Adapted from \textcite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand at the fingertip~\cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device mounted on the nail that pulls and pushes the fingertip skin~\cite{maeda2022fingeret}.
]
\subfigsheight{34mm}
\subfig{ando2007fingernailmounted}
\subfig{teng2021touch}
\subfig{maeda2022fingeret}
\end{subfigs}
\textcite{bau2010teslatouch} created a touch surface rendering textures through electrovibration, modulating the friction between the surface and the user's finger.
They extended this prototype with REVEL~\cite{bau2012revel} to alter the texture of touched real objects using reverse electrovibration, and called such haptic devices, which can alter the touch perception of any object without instrumenting it, \enquote{intrinsic haptic displays}.
\textcite{lopes2018adding} instead used electrical muscle stimulation to add force feedback when touching virtual objects in AR, such as the repulsion of walls and the weight of objects.
All of these devices relocate the haptic feedback of fingertip interactions with virtual content to other parts of the hand.
If delocalizing the haptic feedback is indeed necessary, each of these positions is promising, but they have not yet been compared with each other.
Haptic feedback has also been relocated to the wrist with Tasbi, a squeeze and vibrotactile bracelet~\cite{pezent2019tasbi}, while \textcite{tao2021altering} altered the perceived softness of real rigid objects by restricting fingerpad deformation.
\subsection{Improving the Interactions with Virtual Objects}
\label{vhar_interaction}
In parallel, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
Their results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is greater when the visual feedback is limited.
In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand.
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
However, the measured difference in performance could be attributed to the device, to its position (proximal \vs fingertip), or to both.
Furthermore, all of these studies were conducted in non-immersive setups in which users looked at a screen displaying the interactions, and they compared haptic against visual feedback without examining their combination.
The improved performance and perceived effectiveness of a delocalized haptic feedback over a visual feedback alone, or their multimodal combination, remains to be verified in an immersive OST-AR setup.