Remove \VO and \AE acronyms

2024-10-18 14:49:07 +02:00
parent 625892eb9c
commit bd4a436f18
17 changed files with 164 additions and 168 deletions


@@ -98,7 +98,7 @@ As illustrated in \figref{sensorimotor_continuum}, \textcite{jones2006human} del
]
This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) using immersive \AR and wearable haptics.
In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of virtual objects (\partref{manipulation}) using immersive \AR and wearable haptics.
\subsubsection{Hand Anatomy and Motion}
\label{hand_anatomy}


@@ -322,7 +322,7 @@ A challenge with this technique is to provide the vibration feedback at the righ
\subfig{choi2021augmenting_results}
\end{subfigs}
Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render \VOs \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}.
Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render virtual objects \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}.
We describe them in \secref{vhar_haptics}.
%A promising alternative approach


@@ -3,11 +3,11 @@
%As with haptic systems (\secref{wearable_haptics}), visual
\AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}.
Immersive systems such as headsets leave the hands free to interact with virtual objects (\VOs), promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
Immersive systems such as headsets leave the hands free to interact with virtual objects, promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
%\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system \cite{sutherland1968headmounted}. }[
% \item The \AR headset.
% \item Wireframe \ThreeD \VOs were displayed registered in the \RE (as if they were part of it).
% \item Wireframe \ThreeD virtual objects were displayed registered in the \RE (as if they were part of it).
% ]
% \subfigsheight{45mm}
% \subfig{sutherland1970computer3}
@@ -17,7 +17,7 @@ Immersive systems such as headsets leave the hands free to interact with virtual
\subsection{What is Augmented Reality?}
\label{what_is_ar}
The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room.
The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying virtual objects at a fixed point in space in real time, giving the user the illusion that the content was present in the room.
Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following our interaction loop presented in \figref[introduction]{interaction-loop}.
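In modern terms, such head-tracked stereo rendering amounts to projecting each \ThreeD point $\mathbf{P}$ of the virtual content twice, once per eye, from the tracked head pose; a simplified sketch of the principle, not Sutherland's exact formulation:
\[
\mathbf{p}_{e} \sim \mathbf{K}\,[\,\mathbf{R} \mid \mathbf{t}_{e}\,]\,\mathbf{P}, \qquad \mathbf{t}_{e} = \mathbf{t} \pm \tfrac{d}{2}\,\mathbf{r}_{x}, \quad e \in \{L, R\},
\]
where $[\mathbf{R} \mid \mathbf{t}]$ is the tracked head pose, $\mathbf{r}_{x}$ its lateral axis, $d$ the interocular distance, and $\mathbf{K}$ the perspective projection of the display.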
\subsubsection{A Definition of AR}
@@ -99,7 +99,7 @@ Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, prov
Presence and embodiment are two key concepts that characterize the user experience in \AR and \VR.
While there is a large literature on these topics in \VR, they are less defined and studied for \AR \cite{genay2022being,tran2024survey}.
These concepts will be useful for the design, evaluation, and discussion of our contributions:
In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentations (\chapref{xr_perception}) and manipulating \VOs (\chapref{visual_hand}), and explore the plausibility of visuo-haptic textures (\chapref{visuo_haptic}).
In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentations (\chapref{xr_perception}) and manipulating virtual objects (\chapref{visual_hand}), and explore the plausibility of visuo-haptic textures (\chapref{visuo_haptic}).
\paragraph{Presence}
\label{ar_presence}
@@ -113,9 +113,9 @@ It doesn't mean that the virtual events are realistic, \ie that reproduce the re
In the same way, a film can be plausible even if it is not realistic, such as a cartoon or a science-fiction movie.
%The \AR presence is far less defined and studied than for \VR \cite{tran2024survey}
For \AR, \textcite{slater2022separate} proposed to invert place illusion into what we can call \enquote{object illusion}, \ie the sense that the \VO \enquote{feels here} in the \RE (\figref{presence-ar}).
As with \VR, \VOs must be able to be seen from different angles by moving the head, but also, and this is more difficult, appear coherent enough with the \RE \cite{skarbez2021revisiting}, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows, or reflect light.
The plausibility can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react to it accordingly to be, again, perceived as behaving coherently with the real world \cite{skarbez2021revisiting}.
For \AR, \textcite{slater2022separate} proposed to invert place illusion into what we can call \enquote{object illusion}, \ie the sense that the virtual object \enquote{feels here} in the \RE (\figref{presence-ar}).
As with \VR, virtual objects must be able to be seen from different angles by moving the head, but also, and this is more difficult, appear coherent enough with the \RE \cite{skarbez2021revisiting}, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows, or reflect light.
The plausibility can be applied to \AR as is, but the virtual objects must additionally have knowledge of the \RE and react to it accordingly to be, again, perceived as behaving coherently with the real world \cite{skarbez2021revisiting}.
%\textcite{skarbez2021revisiting} also named place illusion for \AR as \enquote{immersion} and plausibility as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
%One main issue with presence is how to measure it both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}.
@@ -123,7 +123,7 @@ The plausibility can be applied to \AR as is, but the \VOs must additionally hav
The sense of immersion in virtual and augmented environments. Adapted from \textcite{stevens2002putting}.
}[][
\item Place illusion is the user's sense of \enquote{being there} in the \VE.
\item Object illusion is the sense that the \VO \enquote{feels here} in the \RE.
\item Object illusion is the sense that the virtual object \enquote{feels here} in the \RE.
]
\subfigsheight{35mm}
\subfig{presence-vr}
@@ -166,18 +166,18 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for immersive \AR.
The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
\emph{Selection} is the identification or acquisition of a specific \VO, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.
\emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.
\emph{Positioning} and \emph{rotation} of a selected object are the change of its position and orientation in \ThreeD space respectively.
It is also common to \emph{resize} a \VO, changing its scale.
It is also common to \emph{resize} a virtual object, changing its scale.
These three operations are geometric manipulations of the object (rigid transformations plus uniform scaling): they do not change its shape.
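Formally, the combination of these three operations amounts to a similarity transform applied to every point $\mathbf{p}$ of the object; a compact summary rather than a formulation used by any specific system:
\[
\mathbf{p}' = s\,\mathbf{R}\,\mathbf{p} + \mathbf{t}, \qquad \mathbf{R} \in SO(3),\; s > 0,
\]
where $\mathbf{t}$ encodes the positioning, $\mathbf{R}$ the rotation, and $s$ the (uniform) resizing.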
The \emph{navigation tasks} are the movements of the user within the \VE.
Travel is the control of the position and orientation of the viewpoint in the \VE, \eg physical walking, velocity control, or teleportation.
Wayfinding is the cognitive planning of the movement, such as path finding or route following (\figref{grubert2017pervasive}).
The \emph{system control tasks} are changes to the system state through commands or menus such as creating, deleting, or modifying \VOs, \eg as in \figref{roo2017onea}. It is also the input of text, numbers, or symbols.
The \emph{system control tasks} are changes to the system state through commands or menus such as creating, deleting, or modifying virtual objects, \eg as in \figref{roo2017onea}. It is also the input of text, numbers, or symbols.
In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and on positioning and rotating \VOs pushed and grasped by the hand.
In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and on positioning and rotating virtual objects pushed and grasped by the hand.
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}.
@@ -210,26 +210,26 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
As \AR integrates visual virtual content into \RE perception, it can involve surrounding real objects as \UIs: to either visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
As \AR integrates visual virtual content into \RE perception, it can involve surrounding real objects as \UIs: to either visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with virtual objects \cite{ishii1997tangible}.
%According to \textcite{billinghurst2005designing}
Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
Each virtual object is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
The real objects are called \emph{tangible} in this usage context.
%This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
Methods have been developed to automatically pair and adapt the virtual objects for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any virtual object, \eg by placing the tangible into the virtual object and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
Still, the virtual visual rendering and the real haptic sensations can be incoherent.
This is especially true in \OST-\AR, since the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs does not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
This is especially true in \OST-\AR, since the virtual objects are inherently slightly transparent, allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the virtual objects does not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the virtual objects.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material properties on objects directly touched with the hand in \AR.
\begin{subfigs}{ar_tangibles}{Manipulating \VOs through real objects. }[][
\begin{subfigs}{ar_tangibles}{Manipulating virtual objects through real objects. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item A real cube that can be moved into the \VE and used to grasp and manipulate virtual objects \cite{issartel2016tangible}.
\item Size and
\item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
@@ -252,20 +252,20 @@ The simplest models represent the hand as a rigid \ThreeD object that follows th
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
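In such a kinematic model, the pose of each phalanx is typically obtained by chaining rigid transforms from the wrist along the finger; a generic formulation, as implementations differ in their joint parameterization:
\[
\mathbf{T}_{i} = \mathbf{T}_{\text{wrist}} \prod_{j=1}^{i} \mathbf{T}_{j}(\theta_{j}),
\]
where $\mathbf{T}_{j}(\theta_{j})$ is the rigid transform of joint $j$ given its tracked angle $\theta_{j}$ (\secref{hand_anatomy}).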
The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite[p.405]{laviolajr20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
The contacts between the virtual hand model and the virtual objects are then simulated using heuristic or physics-based techniques \cite[p.405]{laviolajr20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a virtual object (\figref{piumsomboon2013userdefined_1}).
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the virtual object.
In particular, \textcite{borst2006spring} proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}:
The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the \VOs during contact.
The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact.
The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate the friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, but which can be difficult to compute in real time.
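The coupling in such spring models can be sketched as a linear spring-damper that drives each simulated phalanx toward its tracked counterpart, with an analogous torsional term for orientation; the gains and exact formulation vary between implementations \cite{borst2006spring}:
\[
\mathbf{F} = k_{p}\,(\mathbf{p}_{\text{real}} - \mathbf{p}_{\text{virtual}}) - k_{d}\,\dot{\mathbf{p}}_{\text{virtual}},
\]
where $k_{p}$ is the spring stiffness and $k_{d}$ the damping.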
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[][
\item Fingertip tracking that allows the user to select a \VO by opening the hand \cite{lee2007handy}.
\begin{subfigs}{virtual-hand}{Manipulating virtual objects with virtual hands. }[][
\item Fingertip tracking that allows the user to select a virtual object by opening the hand \cite{lee2007handy}.
\item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}.
\item Grasping a \VO through gestures when the fingers are detected as opposing on it \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but remains physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
\item Grasping a virtual object through gestures when the fingers are detected as opposing on it \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but remains physically constrained to the virtual object. Applied forces are shown as red arrows \cite{borst2006spring}.
]
\subfigsheight{37mm}
\subfigbox{lee2007handy}
@@ -275,7 +275,7 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing
\end{subfigs}
However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
While the user's fingers traverse the \VO, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
While the user's fingers traverse the virtual object, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
While a visual feedback of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic feedback of the virtual hand, or their combination, in \AR needs to be investigated as well.
@@ -284,24 +284,24 @@ While a visual feedback of the virtual hand in \VR can compensate for these issu
%In \VR, since the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}).
When interacting with a physics-based virtual hand method (\secref{ar_virtual_hands}) in \VR, the visual feedback of the virtual hand has an influence on perception, interaction performance, and preference of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}.
In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand feedback whose motion was constrained to the surface of the \VOs, similarly to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}), performed the worst, while the visual hand feedback following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked.
In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand feedback whose motion was constrained to the surface of the virtual objects, similarly to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}), performed the worst, while the visual hand feedback following the tracked human hand (thus penetrating the virtual objects, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked.
\textcite{prachyabrued2014visual} also found that the best compromise was a double feedback, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}).
While a realistic rendering of the human hand increased the sense of ownership \cite{lin2016need}, a skeleton-like rendering provided a stronger sense of agency \cite{argelaguet2016role} (\secref{ar_embodiment}), and a minimalist fingertip rendering reduced typing errors \cite{grubert2018effects}.
A visual hand feedback in a \VE also seems to affect how one grasps an object \cite{blaga2020too}, or how real bumps and holes are perceived \cite{schwind2018touch}.
\fig{prachyabrued2014visual}{Visual hand feedback affects user experience in \VR \cite{prachyabrued2014visual}.}
Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the \VOs is a common issue (\secref{ar_displays}), \ie hiding the \VO when the real hand is in front of it, and hiding the real hand when it is behind the \VO (\figref{hilliges2012holodesk_2}).
Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the virtual objects is a common issue (\secref{ar_displays}), \ie hiding the virtual object when the real hand is in front of it, and hiding the real hand when it is behind the virtual object (\figref{hilliges2012holodesk_2}).
%For example, in \figref{hilliges2012holodesk_2}, the user is pinching a virtual cube in \OST-\AR with their thumb and index fingers, but while the index is behind the cube, it is seen as in front of it.
While in \VST-\AR, this could be solved as a masking problem by combining the real and virtual images \cite{battisti2018seamless}, \eg in \figref{suzuki2014grasping}, in \OST-\AR, this is much more difficult because the \VE is displayed as a transparent \TwoD image on top of the \ThreeD \RE, which cannot be easily masked \cite{macedo2023occlusion}.
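The \VST masking can be summarized as a per-pixel compositing of the camera image $C_{\text{real}}$ with the rendered virtual image $C_{\text{virtual}}$; a simplification, since actual systems must also resolve the depth ordering between the hand and the \VO \cite{macedo2023occlusion}:
\[
C = M\,C_{\text{real}} + (1 - M)\,C_{\text{virtual}},
\]
where $M \in [0,1]$ is a segmentation mask of the real hand.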
%Yet, even in \VST-\AR,
%An alternative is to render the \VOs and the virtual hand semi-transparents, so that they are partially visible even when one is occluding the other (\figref{buchmann2005interaction}).
%An alternative is to render the virtual objects and the virtual hand semi-transparents, so that they are partially visible even when one is occluding the other (\figref{buchmann2005interaction}).
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
Since the \VE is intangible, adding a visual feedback of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the double-hand feedback of \textcite{prachyabrued2014visual}.
A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
Since the \VE is intangible, adding a visual feedback of the virtual hand in \AR that is physically constrained to the virtual objects would achieve a similar result to the double-hand feedback of \textcite{prachyabrued2014visual}.
A virtual object overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand feedback superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
Few works have compared different visual feedback of the virtual hand in \AR or with wearable haptic feedback.
@@ -313,13 +313,13 @@ In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluat
\textcite{genay2021virtual} found that the sense of embodiment with robotic hands overlay in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
%\cite{chan2010touching} : cues for touching (selection) \VOs.
%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.
%\cite{chan2010touching} : cues for touching (selection) virtual objects.
%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
\begin{subfigs}{visual-hands}{Visual feedback of the virtual hand in \AR. }[][
\item Grasping a \VO in \OST-\AR with no visual hand feedback \cite{hilliges2012holodesk}.
\item Simulated mutual-occlusion between the hand grasping and the \VO in \VST-\AR \cite{suzuki2014grasping}.
\item Grasping a virtual object in \OST-\AR with no visual hand feedback \cite{hilliges2012holodesk}.
\item Simulated mutual-occlusion between the hand grasping and the virtual object in \VST-\AR \cite{suzuki2014grasping}.
\item Grasping a real object with a semi-transparent hand in \VST-\AR \cite{buchmann2005interaction}.
\item Skeleton rendering overlaying the real hand in \VST-\AR \cite{blaga2017usability}.
\item Robotic rendering overlaying the real hands in \OST-\AR \cite{genay2021virtual}.
@@ -338,8 +338,8 @@ Taken together, these results suggest that a visual augmentation of the hand in
\AR systems integrate virtual content into the user's perception as if it were part of the \RE.
\AR headsets now enable real-time tracking of the head and hands, and high-quality display of virtual content, while being portable and mobile.
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, when manipulating \VOs in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
A common alternative approach is to use real objects as proxies for interaction with \VOs, but this raises concerns about their coherence with visual augmentations.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.
They enable highly immersive augmented environments that users can explore with a strong sense of the presence of the virtual content.
However, without direct and seamless interaction with the virtual objects using the hands, the coherence of the augmented environment experience is compromised.
In particular, when manipulating virtual objects in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
A common alternative approach is to use real objects as proxies for interaction with virtual objects, but this raises concerns about their coherence with visual augmentations.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of virtual objects and for coherent visuo-haptic augmentation of touched real objects.


@@ -5,7 +5,7 @@ Perception and manipulation of objects with the hand typically involves both the
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable for many others, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a visuo-haptic rendering of a \VO is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?
@@ -17,7 +17,7 @@ It is essential to understand how a visuo-haptic rendering of a \VO is perceived
\label{sensations_perception}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}).
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized virtual object (\secref{ar_tangibles}).
If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable: measured multiple times, it may give different estimates of the same property, \eg the weight of an object.
@@ -58,10 +58,10 @@ The objective was to determine a \PSE between the comparison and reference bars,
%\subfig{ernst2002humans_visuo-haptic}
\end{subfigs}
%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that when seeing and touching a \VO in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a single coherent object property.
%Hence, the \MLE model explains how a (visual) virtual object in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that when seeing and touching a virtual object in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a single coherent object property.
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the virtual object, as discussed in the next sections.
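Formally, the \MLE model predicts that the visual and haptic estimates $\hat{S}_{V}$ and $\hat{S}_{H}$ of a property $S$ are averaged, weighted by their reliabilities (inverse variances) \cite{ernst2004merging}:
\[
\hat{S} = w_{V}\,\hat{S}_{V} + w_{H}\,\hat{S}_{H}, \qquad w_{i} = \frac{1/\sigma_{i}^{2}}{1/\sigma_{V}^{2} + 1/\sigma_{H}^{2}}, \qquad \sigma^{2} = \frac{\sigma_{V}^{2}\,\sigma_{H}^{2}}{\sigma_{V}^{2} + \sigma_{H}^{2}},
\]
so the combined variance $\sigma^{2}$ is lower than that of either sense alone, which is why the merged percept is more reliable than its components.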
\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
@@ -73,11 +73,11 @@ More precisely, when surfaces are evaluated by vision or touch alone, both sense
The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual virtual objects (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs \cite{gunther2022smooth}.}
\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual virtual objects \cite{gunther2022smooth}.}
Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a real object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
@@ -94,12 +94,12 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo
\end{subfigs}
%In all of these studies, the visual expectations of participants influenced their haptic perception.
%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the virtual object.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{ar_vr_haptic}
Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR.
Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
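In such a \TIFC task, the proportion of \enquote{stiffer} responses is typically fitted against the comparison stiffness $k_{c}$ with a cumulative Gaussian, from which the \PSE is read at the \percent{50} point; a standard analysis sketch, as the exact fitting procedure varies between studies:
\[
P(\text{stiffer}) = \frac{1}{2}\left[1 + \operatorname{erf}\!\left(\frac{k_{c} - \mathrm{PSE}}{\sqrt{2}\,\sigma}\right)\right],
\]
where $\sigma$ captures the steepness of the psychometric function, \ie the discrimination threshold.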
@@ -126,8 +126,8 @@ Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay
\textcite{gaffary2017ar} compared perceived stiffness of virtual pistons in \OST-\AR and \VR.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (video or optical see-through) and seeing the hand touching the \VO.
This suggests that the haptic stiffness of virtual objects feels \enquote{softer} in an augmented environment than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (video or optical see-through) and seeing the hand touching the virtual object.
\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
@@ -139,12 +139,12 @@ This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE
\subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the \VO.
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a virtual object in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the virtual object.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\subsection{Wearable Haptics for Direct Hand Interaction in AR}
@@ -163,10 +163,10 @@ Another category of actuators relies on systems that cannot be considered as por
\label{vhar_nails}
\textcite{ando2007fingernailmounted} were the first to move the actuator away from the fingertip, proposing the nail instead, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch_1}).
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching virtual objects (\figref{teng2021touch_1}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
When touching virtual objects in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real objects.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
@@ -196,7 +196,7 @@ However, no proper user study has been conducted to evaluate these devices in \A
\subsubsection{Belt Devices}
\label{vhar_rings}
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of virtual objects in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, friction and localized bumps and holes on smooth real surfaces (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but have not been tested in \AR.
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface and held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
@@ -250,11 +250,11 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
\subsection{Conclusion}
\label{visuo_haptic_conclusion}
Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with virtual objects in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them are best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the virtual object.


@@ -4,7 +4,7 @@
\chaptertoc
This chapter reviews previous work on the perception and manipulation of virtual and augmented objects directly with the hand, using either wearable haptics, \AR, or their combination.
%Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE.
%Experiencing a visual, haptic, or visuo-haptic augmented environment relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall augmented environment.
First, we review how the hand senses and interacts with its environment to perceive and manipulate the haptic properties of real everyday objects.
Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects.
Third, we introduce the principles and user experience of \AR and review the main interaction techniques used to manipulate virtual objects directly with the hand.