\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic_ar}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
\subsection{Altering the Perceptions}
\label{vhar_perception}
%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.
\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{vhar_influences}
\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
%
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates the combined percept.
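In this maximum-likelihood formulation (restated here in our notation, not reproduced verbatim from the cited paper), the visual and haptic estimates $\hat{s}_V$ and $\hat{s}_H$ of an object property are combined into a single percept by weighting each cue inversely to its variance:
\begin{equation}
\hat{s} = w_V \hat{s}_V + w_H \hat{s}_H,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\end{equation}
so that the combined variance $\sigma^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$ is lower than that of either cue alone, and the more reliable (lower-variance) sense receives the larger weight.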
\subsubsection{Contact \& Hardness Augmentations}
\label{vhar_hardness}
\subsubsection{Texture Augmentations}
\label{vhar_texture}
% \cite{ernst2004merging}
Regarding real textures in particular, it is known that touch and sight individually perceive textures comparably well and in similar ways~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness, and friction of real tactile textures touched by the finger by superimposing different visual textures through a half-mirror.

Around 2010, there was growing research interest in building haptic (dynamic tactile) feedback for touch-based systems: \textcite{Bau2010Teslatouch} created a touch surface that renders textures through electrovibration, modulating the friction between the surface and the user's finger.
They extended this prototype in \textcite{Bau2012REVEL} to alter the texture of touched real objects using reverse electrovibration, naming such haptic devices, which can alter the touch perception of any object without instrumenting it, \emph{intrinsic haptic displays}. They noted that \textcite{Azuma1997Survey} had envisioned this kind of AR experience.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\cite{degraen2019enhancing} and passive touch~\cite{gunther2022smooth} contexts.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}
A few works have also used pseudo-haptic feedback, \ie deforming the visual representation of a user's input, to change the perception of haptic stimuli and create richer feedback~\cite{ujitoko2021survey}.
For example, different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
% Ban
\subsubsection{Visual Hand Rendering in AR}
\label{vhar_hands}
% I. Jang and D. Lee. 2014. On utilizing pseudo-haptics for cutaneous fingertip.
Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
In video see-through AR (VST-AR), this can be addressed as a masking problem, combining the image of the real world captured by a camera with the generated virtual image~\cite{macedo2023occlusion}.
%
In optical see-through AR (OST-AR), this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot easily be masked~\cite{macedo2023occlusion}.
%
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.
Sinusoidal vibrotactile renderings of virtual textures have also been combined with visual oscillations of a cursor on a screen to increase the perceived roughness of the texture~\cite{ujitoko2019modulating}.
However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-localised visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback.
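Such sinusoidal renderings typically tie the vibration to the exploration movement; in a common formulation (our notation, not necessarily the exact one used in the works cited above), a virtual grating of spatial wavelength $\lambda$ explored at finger or cursor position $x(t)$ is rendered as a vibration
\begin{equation}
a(t) = A \sin\!\left(2\pi \, \frac{x(t)}{\lambda}\right),
\end{equation}
whose instantaneous frequency $f = \dot{x}/\lambda$ grows with the scanning speed; modulating the amplitude $A$, or the matching visual oscillation, then changes the perceived roughness.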

An alternative is to render the virtual objects and the hand semi-transparent, so that each remains partially visible when one occludes the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
%
However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.

In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that, compared with a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role}, and that a minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
\subsubsection{Comparing Haptics in AR \vs VR}
\label{AR_vs_VR}
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
Additionally, \textcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without degrading the users' experience or performance.
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.

Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
Performance did not improve, but participants felt more confident with the virtual hand.
%
However, the experiment was carried out on a screen, in a non-immersive AR scenario.
%
\textcite{saito2021contact} found that masking the real hand with a textured, opaque 3D virtual hand did not improve performance in a reach-to-grasp task, but displaying the points of contact on the virtual object did.
%
To the best of our knowledge, the role of a visual hand rendering displayed, and seen, directly over the real tracked hands in immersive OST-AR has not yet been explored, particularly in the context of virtual object manipulation.

A few studies specifically compared visuo-haptic perception in AR \vs VR.
Rendering a virtual piston pressed with one's real hand using a video see-through (VST) AR headset and a force feedback haptic device, \textcite{knorlein2009influence} showed that a visual delay increased the perceived stiffness of the piston, whereas a haptic delay decreased it.
\textcite{diluca2011effects} went on to explain how these delays affected the weighting of visual and haptic information in perceived stiffness.
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
While a large literature has investigated these differences in visual perception between AR and VR, much less is known about visuo-haptic perception in these settings.
\subsubsection{Wearable Haptics for AR}
\label{vhar_haptics}
Different haptic feedback systems have been explored to improve interactions in AR, including %
grounded force feedback devices~\cite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
exoskeletons~\cite{lee2021wearable}, %
tangible objects~\cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.
\subsection{Fingertip-Free Haptic Devices}
\label{vhar_devices}
% \cite{lopes2018adding}

Wearable haptics seem particularly suited to this context, as they satisfy many AR constraints, \eg having a limited impact on hand tracking performance while minimally impairing the users' senses and their ability to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
%
\textcite{pezent2019tasbi} proposed Tasbi: a wristband haptic device capable of rendering vibrations and pressures.
%
\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
Each of these devices provided haptic feedback about the fingertip interactions with the virtual content on other parts of the hand.
%
If it is indeed necessary to delocalize the haptic feedback, each of these positions is promising, and they have not yet been compared with each other.
% \cite{pezent2019tasbi}
% \cite{tao2021altering}
% \cite{maeda2022fingeret}
\subsection{Improving the Interactions with Virtual Objects}
\label{vhar_interaction}
In parallel, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
%
\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
%
Results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is greater when the visual feedback is limited.
%

In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to improved performance compared with more standard fingertip haptic devices~\cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to the device, to its position (proximal phalanx \vs fingertip), or to both.
%

Furthermore, all of these studies were conducted in non-immersive setups, where users looked at a screen displaying the interactions, and they compared haptic and visual feedback separately rather than examining them together.
%
The improved performance and perceived effectiveness of a delocalized haptic feedback over a visual feedback alone, or their multimodal combination, remains to be verified in an immersive OST-AR setup.