Structure related work

2024-09-11 09:21:59 +02:00
parent febc12a34c
commit 32f6d91b96
5 changed files with 151 additions and 114 deletions


@@ -114,7 +114,7 @@ The combination of the two axes defines 9 types of \vh environments, with 3 poss
%
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
Haptic \AR (\h-\AR) is then the combination of real and virtual haptic stimuli~\cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
%
@@ -157,7 +157,7 @@ Our goal is to enable congruent, intuitive and seamless perception and manipulat
The experience of such a \vh-\AE relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
%
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
%
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an \AR headset and a \WH device.
%
@@ -197,7 +197,7 @@ As such, in \VR, visual sensations are particularly dominant in perception, and
%
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
%
The user's hand must indeed be free to touch and interact with the \RE while wearing a \WH device.
%
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\cite{asano2015vibrotactile,salazar2020altering} or the wrist~\cite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
@@ -217,7 +217,7 @@ It is therefore unclear to what extent the real and virtual visuo-haptic sensati
\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}
Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\cite{kim2018revisiting}, \VR~\cite{bergstrom2021how} and \VEs in general~\cite{laviola20173d}.
%
As the hand is not occupied or covered by a haptic device, so as not to impair interaction with the \RE (as described in the previous section), one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real.
%
@@ -245,7 +245,7 @@ Yet, it is unclear which type of visual and haptic feedback is the best suited t
\section{Approach and Contributions}
\label{contributions}
The aim of this thesis is to understand how immersive visual and \WH augmentations compare and complement each other in the context of direct hand perception and manipulation with augmented objects.
%
As described in the Research Challenges section above, providing a convincing, consistent and effective \vh-\AE to a user is complex and raises many issues.
%


@@ -94,7 +94,7 @@ As illustrated in the \figref{sensorimotor_continuum}, \Citeauthor{jones2006huma
]
This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
In this thesis, we are interested in exploring \vh augmentations (see \partref{perception}) and grasping of \VOs (see \partref{manipulation}) in the context of \AR and \WHs.
\subsubsection{Hand Anatomy and Motion}
\label{hand_anatomy}


@@ -1,7 +1,9 @@
\section{Augmenting Objects with Wearable Haptics}
\label{wearable_haptics}
One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE~\cite{maclean2008it,culbertson2018haptics}.
Moreover, a haptic \AR system should \enquote{modulat[e] the feel of a real object by virtual [haptic] feedback}~\cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
The haptic system should be hand-held or worn, \eg on the hand, and \enquote{not permanently attached to or integrated in the object}~\cite{bhatia2024augmenting}.
\subsection{Level of Wearability}
@@ -158,52 +160,89 @@ Several types of vibrotactile actuators are used in haptics, with different trad
\end{subfigs}
\subsection{Tactile Renderings for Modifying Object Properties}
\label{tactile_rendering}
Tactile rendering of haptic properties consists in modeling and reproducing virtual cutaneous sensations comparable to those perceived when interacting with real objects.
In particular, we focus on wearable actuators that stimulate the mechanoreceptors of the skin (see \secref{haptic_sense}) without preventing the user from touching and interacting with the real environment, and on renderings of the haptic properties of virtual or augmented objects.
In the context of integrating \WHs with \AR to create a \vh-\AE (see \chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended with an additional criterion: the wearable haptic interface should not impair interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
Types of interfaces~\cite{bhatia2024augmenting}: direct touch, through touch, and through a tool. We focus on direct touch but, when no direct-touch rendering is available, we overview possibilities with the other types of interfaces.
\cite{klatzky2003feeling}: rendering roughness, friction, deformation, and temperature.
\cite{girard2016haptip}: renderings with a tangential-motion actuator.
\subsubsection{Textures}
\label{texture_rendering}
Several approaches have been proposed to render virtual haptic texture~\cite{culbertson2018haptics}.
%
High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and require holding a probe to explore the texture~\cite{unger2011roughness}.
%
As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}.
%
In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration,guruswamy2011iir} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
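As an illustration of this physics-based approach, a decaying-sinusoid transient of the kind used for grating textures can be sketched as follows. This is a minimal sketch: the function name and all constants are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

def ridge_transient(velocity, duration=0.05, sample_rate=2000):
    """One decaying-sinusoid burst played when the probe crosses a ridge.

    a(t) = A(v) * exp(-B * t) * sin(2 * pi * f * t), where the amplitude
    A grows with stroke velocity, and B and f depend on the simulated
    material. All constants here are illustrative placeholders.
    """
    A = 0.5 * velocity          # amplitude scales with stroke speed
    B, f = 80.0, 250.0          # decay rate (1/s) and frequency (Hz)
    t = np.arange(int(duration * sample_rate)) / sample_rate
    return A * np.exp(-B * t) * np.sin(2 * np.pi * f * t)
```

Playing one such burst per crossed groove, at a rate driven by the measured stroke speed, reproduces the patterned feel of a grating.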
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on the user's real-time measurements of position, velocity and force to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
For example, when comparing the same virtual texture pairwise, but with different parameters, \textcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile}, creating a haptic texture augmentation.
%
The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.
One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture specific and similar between individuals~\cite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%
It remains unclear whether such vibrotactile texture augmentation is perceived the same when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%
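This common position-modulated sinusoid can be sketched as below; the function name and the spatial period value are illustrative assumptions. Driving the phase directly from the tracked finger position makes the vibration frequency equal to the finger speed divided by the spatial period, and keeps the phase continuous during free exploration.

```python
import numpy as np

def texture_vibration(finger_positions, spatial_period=0.002):
    """Sketch of a position-driven sinusoidal texture signal.

    The instantaneous phase advances with the tracked finger position
    (in meters), so a finger moving at speed v over a texture with
    spatial period L feels a vibration at frequency v / L.
    """
    x = np.asarray(finger_positions, dtype=float)
    phase = 2.0 * np.pi * x / spatial_period
    return np.sin(phase)
```

For instance, a finger sliding at a constant 0.1 m/s over a 2 mm spatial period would feel a 50 Hz vibration.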
\subsubsection{Hardness}
\label{hardness_rendering}
\cite{kuchenbecker2006improving}
\cite{jeon2009haptic}
\cite{jeon2012extending}
\cite{hachisu2012augmentation}
\cite{kildal20103dpress}
\cite{pacchierotti2014improving}
\cite{pacchierotti2015enhancing}
% park 2017: Compensation of perceived hardness of a virtual object with cutaneous feedback
\cite{park2019realistic}
\cite{choi2021perceived}
\cite{park2023perceptual}
\cite{detinguy2018enhancing}
\cite{salazar2020altering}
\cite{yim2021multicontact}
\cite{tao2021altering}
\subsubsection{Friction}
\label{friction_rendering}
\cite{konyo2008alternative}
\cite{provancher2009fingerpad}
\cite{smith2010roughness}
\cite{jeon2011extensions}
\cite{salazar2020altering}
\cite{yim2021multicontact}
\subsubsection{Weight}
\label{weight_rendering}
\cite{minamizawa2007gravity}
\cite{minamizawa2008interactive}
\cite{jeon2011extensions}
\cite{choi2017grabity}
\cite{culbertson2017waves}
\subsection{Conclusion}
\label{wearable_haptics_conclusion}
%, unlike most previous actuators that are designed specifically for fingertips and would require mechanical adaptation to be placed on other parts of the hand.
%thanks to the vibration propagation and the sensory capabilities distributed throughout the skin, they can be placed without adaption and on any part of the hand


@@ -70,8 +70,14 @@ Yet, the user experience in \AR is still highly dependent on the display used.
\paragraph{Video See-Through Headsets}
Vergence-accommodation conflict.
Using a VST-AR headset has notable consequences, as the \enquote{real} view of the environment and the hand is actually a visual stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment in OST-AR~\cite{macedo2023occlusion}.
\paragraph{Optical See-Through Headsets}
Distances are underestimated~\cite{adams2022depth,peillard2019studying}.
\subsection{Presence and Embodiment in AR}
\label{ar_presence}
@@ -115,7 +121,6 @@ Return to the interaction loop:
We presented the haptic and AR interfaces (rendering from the system to the user) used to render the VE, which tries to recreate perceptual experiences similar and comparable to those of everyday life, \ie to provide the best possible immersion (see \secref{ar_presence}).
But the user must be able to interact with the environment and the virtual objects (interaction), and must therefore be detected and represented in the VE (tracking).
\subsubsection{Interaction Techniques}
For this, interaction techniques are needed~\cite{billinghurst2005designing}: Physical Elements as Input -- Interaction Technique --> Virtual Elements as Output.
@@ -145,8 +150,38 @@ Prototypes : HandyAR and HoloDesk
\cite{piumsomboon2014graspshell} : direct hand manipulation of virtual objects in immersive AR vs vocal commands.
\cite{chan2010touching} : cues for touching (selection) virtual objects.
Occlusion problems: virtual objects must always remain visible, either by using a transparent rather than opaque virtual hand, or by displaying their contours when the hand hides them~\cite{piumsomboon2014graspshell}.
\subsection{Visual Rendering of Hands in AR}
Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
In video see-through AR (VST-AR), this could be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image~\cite{macedo2023occlusion}.
In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\cite{macedo2023occlusion}.
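A minimal sketch of such per-pixel masking in VST-AR, assuming RGB images and depth maps where smaller values are closer (all names here are illustrative, not from a specific system):

```python
import numpy as np

def composite_vst(camera_rgb, camera_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion masking for VST-AR.

    Shows the camera pixel (e.g. the real hand) wherever the real
    surface is closer than the virtual one, and the rendered virtual
    pixel otherwise.
    """
    real_in_front = camera_depth < virtual_depth     # (H, W) boolean mask
    return np.where(real_in_front[..., None], camera_rgb, virtual_rgb)
```

In practice the real depth map comes from the headset's depth sensors or from hand tracking, which is precisely what OST-AR cannot exploit, since the real world is seen directly through the optics.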
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
However, this effect has yet to be verified in an OST-AR setup.
An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one occludes the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.
In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
Virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects), performed the best, even though it was rather disliked.
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
It has also been shown that, compared to a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role} and that a minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
Additionally, \textcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without worsening the users' experience nor the performance.
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.
Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
Performance did not improve, but participants felt more confident with the virtual hand.
However, the experiment was carried out on a screen, in a non-immersive AR scenario.
\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
\subsection{Conclusion}


@@ -1,127 +1,90 @@
\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic_ar}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3}: we present previous work that either did haptic AR (the middle row), haptic VR with visual AR, or visuo-haptic AR.
\subsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
%
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates the perception.
\cite{ernst2004merging}
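For reference, the standard formulation of this maximum-likelihood integration model can be recalled as follows, with visual and haptic estimates $\hat{s}_v$ and $\hat{s}_h$ of variances $\sigma_v^2$ and $\sigma_h^2$:
\begin{equation}
\hat{s} = w_v \hat{s}_v + w_h \hat{s}_h,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\qquad
\sigma^2 = \frac{\sigma_v^2 \sigma_h^2}{\sigma_v^2 + \sigma_h^2},
\end{equation}
so the modality with the smaller variance receives the larger weight, and the variance of the combined estimate is lower than that of either modality alone.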
\subsubsection{Contact \& Hardness Augmentations}
\label{vhar_hardness}
\subsubsection{Texture Augmentations}
\label{vhar_texture}
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
Around 2010, there was research interest in building haptics (dynamic tactile feedback) for touch-based systems. \textcite{Bau2010Teslatouch} created a touch-based surface rendering textures using electrovibration and friction feedback between the surface and the user's finger.
They extended this prototype in \textcite{Bau2012REVEL} to alter the texture of touched real objects using reverse electrovibration. They call such haptic devices, which can alter the touch perception of any object without any setup, \emph{intrinsic haptic displays}, and note that \textcite{Azuma1997Survey} envisioned this kind of AR experience.
Similarly, but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structures actively touched with the finger to induce a larger set of perceived visuo-haptic materials.
%
In a complementary passive touch context, \textcite{gunther2022smooth} studied how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}
A few works have also used pseudo-haptic feedback to alter the perception of haptic stimuli and create richer feedback by deforming the visual representation of a user input~\cite{ujitoko2021survey}.
%
For example, different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers}, or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
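As a first-order sketch of the underlying principle (an illustration, not a model taken from the cited works), scaling the displayed deformation of a pressed object by a control/display gain $g$ changes the stiffness the user infers from the same physical interaction:
\begin{equation}
  d_v = g \, d_r,
  \qquad
  \hat{k} \approx \frac{F}{d_v} = \frac{k_r \, d_r}{g \, d_r} = \frac{k_r}{g}
\end{equation}
where the user presses a prop of real stiffness $k_r$ with force $F$ over a real displacement $d_r$, but sees a displayed displacement $d_v$. With $g < 1$ the object visually deforms less than the hand actually moves and is judged stiffer than the prop; $g > 1$ makes it feel softer.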
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
%
The vibrotactile sinusoidal rendering of virtual textures cited above was also combined with visual oscillations of a cursor on a screen to increase the perceived roughness of the texture~\cite{ujitoko2019modulating}.
% However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
However, as discussed by \textcite{ujitoko2021survey} in their review, a co-localised visuo-haptic rendering can cause the user to notice the mismatch between their real movements and the visuo-haptic feedback.
%
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.
\subsubsection{Comparing Haptics in AR \vs VR}
\label{AR_vs_VR}
A few studies specifically compared visuo-haptic perception in AR \vs VR.
%
Rendering a virtual piston pressed with one's real hand using a video see-through (VST) AR headset and a force feedback haptic device, \textcite{knorlein2009influence} showed that a visual delay increased the perceived stiffness of the piston, whereas a haptic delay decreased it.
%
\textcite{diluca2011effects} went on to explain how these delays affected the weighting of visual and haptic information in perceived stiffness.
%
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
While a large body of literature has investigated such differences in visual perception between AR and VR, less is known about visuo-haptic perception.
\subsubsection{Visual Hand Rendering in AR}
\label{vhar_hands}
Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
In video see-through AR (VST-AR), this can be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image~\cite{macedo2023occlusion}.
%
In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\cite{macedo2023occlusion}.
%
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.
An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one is occluding the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
Although perceived as less natural, this approach seems to be preferred over mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
%
However, it still causes depth conflicts that make it difficult to determine whether one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.
In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
It is known that the virtual hand representation has an impact on the perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the one following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that, compared to a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role}, and that a minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
Additionally, \textcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without worsening the users' experience or performance.
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.
Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
Performance did not improve, but participants felt more confident with the virtual hand.
%
However, the experiment was carried out on a screen, in a non-immersive AR scenario.
%
\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
%
To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
\subsubsection{Wearable Haptics for AR}
\label{vhar_haptics}
Different haptic feedback systems have been explored to improve interactions in AR, including
grounded force feedback devices~\cite{bianchi2006high, jeon2009haptic, knorlein2009influence},
exoskeletons~\cite{lee2021wearable},
tangible objects~\cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and
wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.
Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
%
\textcite{pezent2019tasbi} proposed Tasbi: a wristband haptic device capable of rendering vibrations and pressures.
%
\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
Each of these haptic devices provided haptic feedback about fingertip interactions with the virtual content on other parts of the hand.
%
If it is indeed necessary to delocalize the haptic feedback, each of these positions is promising, and they have not yet been compared with each other.
\subsection{Improving the Interactions with Virtual Objects}
\label{vhar_interaction}
Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
%
\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
%
Results proved that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is more significant when the visual feedback is limited.
%
In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to either the device, the device position (proximal \vs fingertip), or both.
%
Furthermore, all of these studies were conducted in non-immersive setups, where users looked at a screen displaying the visual interactions, and only compared haptic and visual feedback, but did not examine them together.
%
The improved performance and perceived effectiveness of a delocalized haptic feedback over visual feedback alone, or of their multimodal combination, remains to be verified in an immersive OST-AR setup.