\section{Related Work}
\label{sec:related_work}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
Many works have investigated the haptic rendering of virtual textures to modify the perception of real, tangible surfaces, but few have considered the influence of the visual rendering or integrated both in an immersive virtual environment such as AR or VR.
%
Yet visual and haptic sensations are often combined in everyday life, and understanding how they interact is essential to designing more realistic and effective interfaces.
\subsection{Augmenting Haptic Texture Roughness}
\label{sec:vibrotactile_roughness}
When a finger runs over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
%
%Several approaches have been proposed to render virtual haptic texture~\autocite{culbertson2018haptics}.
%
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\autocite{unger2011roughness}.
%
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}.
%
%In this way, physics-based models~\autocite{chan2021hasti,okamura1998vibration} and data-based models~\autocite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
%Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
An effective approach to rendering virtual roughness is to generate vibrations that simulate the interaction with the virtual texture~\autocite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\autocite{culbertson2015should}.
%
The perceived roughness of real surfaces can then be modified when they are touched with a tool that has a vibrotactile actuator attached~\autocite{culbertson2014modeling,ujitoko2019modulating} or directly with a finger wearing the vibrotactile actuator~\autocite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
%
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\autocite{bhatia2024augmenting,jeon2009haptic}.
%
An additional challenge when augmenting the touch of the finger is to keep the fingertip free to touch the real environment, which requires delocalizing the actuator elsewhere on the hand~\autocite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced in the fingertip skin when running over a real surface are texture-specific and similar across individuals~\autocite{manfredi2014natural}.
%
A common method for the vibrotactile rendering of textures is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\autocite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
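As a minimal illustration (the notation is ours and the exact modulation laws vary across the cited works), a grating texture of spatial period $\lambda$ can be rendered by driving the actuator with a position-based signal
\[
    s(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\]
where $x(t)$ is the tracked position of the finger along the surface and $A$ the vibration amplitude, so that the instantaneous frequency $f(t) = \dot{x}(t)/\lambda$ scales with the exploration velocity.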
%
It remains unclear whether such a vibrotactile texture augmentation is perceived in the same way when integrated into a visual AR or VR environment, or when touched with a virtual hand instead of the real hand.
%
%We also add a phase adjustment to this sinusoidal signal to allow free exploration movements of the finger with a simple camera-based tracking system.
%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\autocite{brahimaj2023crossmodal,rekik2017localized}.
%
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\autocite{ito2019tactile}.
%
%However, this method is limited to the screen and does not allow to easily render textures on virtual (visual) objects or to alter the perception of real surfaces.
%In our study, we attached a voice-coil actuator to the middle phalanx of the finger and used a squared sinusoidal signal to render grating textures sensations, but we corrected its phase to allow a simple camera-based tracking and free exploration movements of the finger.
\subsection{Influence of Visual Rendering on Haptic Perception}
\label{sec:influence_visual_haptic}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
%
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
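In this maximum-likelihood scheme (notation adapted), the visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$ of the same property are combined into a variance-weighted average
\[
    \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
    \qquad
    w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\]
so that the less variable modality receives the larger weight and the combined estimate has a variance $\sigma_{VH}^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$ lower than that of either modality alone.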
%
%In particular, this effect has been used to better understand the visuo-haptic perception of texture and to design better feedback for virtual objects.
Particularly for real textures, it is known that touch and sight individually perceive texture equally well and in a similar way~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness and friction of real tactile textures touched with the finger by superimposing different real visual textures using a half-mirror.
%
%Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structure that were touched with the finger to induce a larger set of visuo-haptic materials perception.
%
%\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\autocite{degraen2019enhancing} and passive touch~\autocite{gunther2022smooth} contexts.
%
\textcite{normand2024augmenting} also investigated the roughness perception of tangible surfaces touched with the finger and augmented both with visual textures in AR and with wearable vibrotactile textures.
%
%A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\autocite{prachyabrued2014visual,blaga2020too} and AR~\autocite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\autocite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
% \autocite{degraen2019enhancing} and \autocite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \autocite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.

A few works have also used pseudo-haptic feedback, which deforms the visual representation of a user input, to alter the perception of haptic stimuli and create richer feedback~\autocite{ujitoko2021survey}.
%
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\autocite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing, in AR, a virtual texture that deforms when pressed by the hand~\autocite{punpongsanon2015softar}, or by combining such visual deformation with vibrotactile rendering in VR~\autocite{choi2021augmenting}.
%
The sinusoidal vibrotactile rendering of virtual textures cited above has also been combined with visual oscillations of a cursor on a screen to increase the perceived roughness of the texture~\autocite{ujitoko2019modulating}.
%
%However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
%Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-localised visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback.
%
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.
%it remains unclear whether touching the same tactile texture augmentation in immersive AR or VR with one's own hand or with a virtual hand can be perceived differently.

A few studies have specifically compared visuo-haptic perception in AR \vs VR.
%
Rendering a virtual piston pressed with one's real hand using a video see-through (VST) AR headset and a force feedback haptic device, \textcite{diluca2011effects} showed that a visual delay increased the perceived stiffness of the piston, whereas a haptic delay decreased it.
%
%\textcite{diluca2011effects} went on to explain how these delays affected the weighting of visual and haptic information in perceived stiffness.
%
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
Using a VST-AR headset has notable consequences, as the ``real'' view of the environment and of the hand is actually a video stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR~\autocite{macedo2023occlusion}.
%
While a large body of literature has investigated these differences in visual perception in AR, as well as in VR, \eg showing that distances are underestimated~\autocite{adams2022depth,peillard2019studying}, much less is known about visuo-haptic perception in AR and VR.
%
In this work, we studied (1) the perception of a \emph{haptic texture augmentation} of a tangible surface, and (2) the possible influence on this perception of the visual rendering of the environment (OST-AR or VR) and of the hand touching the surface (real or virtual).