diff --git a/1-background/introduction/introduction.tex b/1-background/introduction/introduction.tex index 3a02bb4..5241320 100644 --- a/1-background/introduction/introduction.tex +++ b/1-background/introduction/introduction.tex @@ -95,7 +95,7 @@ Many visual displays have been explored, from projection systems to hand-held di %Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision. \textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR systems with two orthogonal reality-virtuality continuums, one for vision and one for touch, as shown in \figref{visuo-haptic-rv-continuum5}. The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual. -For example, (visual) \AR using a real object as a proxy to manipulate a \VO is considered \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}). +For example, (visual) \AR using a real object as a proxy to manipulate a virtual object is considered \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a virtual object is considered \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}). \textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli} \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}). 
In particular, it has been implemented by augmenting the haptic perception of real objects by providing timely virtual tactile stimuli using wearable haptics: \figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}). @@ -104,13 +104,13 @@ In particular, it has been implemented by augmenting the haptic perception of re The illusion of \enquote{being there} when in \VR or of the virtual content to \enquote{feels here} when in \AR \cite{slater2022separate,skarbez2021revisiting} is called \emph{presence}. One of the most important aspects of this illusion is the \emph{plausibility}, \ie the illusion that the virtual events are really happening. %, even if the user knows that they are not real. However, when an \AR/\VR system lacks haptic feedback, it creates a deceptive and incomplete user experience when the hand reaches the virtual content. -All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and interact with them with confidence and efficiency. -It is also necessary to provide a haptic feedback that is coherent with the visual \VOs and ensures the best possible user experience, as we argue in the next section. +All visual virtual objects are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and interact with them with confidence and efficiency. +It is also necessary to provide haptic feedback that is coherent with the visual virtual objects and ensures the best possible user experience, as we argue in the next section. The \textbf{integration of wearable haptics with \AR} appears to be one of the most promising solutions, but it \textbf{remains challenging due to their many respective characteristics and the additional constraints of combining them}. 
\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][ - \item \AR environment with a real haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}. - \item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}. + \item \AR environment with a real haptic object used as a proxy to manipulate a virtual object \cite{kahl2023using}. + \item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a virtual object \cite{meli2018combining}. \item A real object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{detinguy2018enhancing,salazar2020altering}. \item Visuo-haptic texture augmentation of a real object to be touched using a handheld \AR display and haptic electrovibration feedback \cite{bau2012revel}. ] @@ -125,7 +125,7 @@ The \textbf{integration of wearable haptics with \AR} appears to be one of the m \label{research_challenges} The integration of wearable haptics with \AR to create a visuo-haptic augmented environment is complex and presents many perceptual and interaction challenges. -% \ie sensing the \AE and acting effectively upon it. +% \ie sensing the augmented environment and acting effectively upon it. In this thesis, we propose to \textbf{represent the user's experience with such a visuo-haptic augmented environment as an interaction loop}, shown in \figref{interaction-loop}. It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}. The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs. 
@@ -133,7 +133,7 @@ The interactions between the virtual hand and objects are then simulated, and re Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE. \fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment as proposed in this thesis.}[ - A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs. + A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with virtual objects. The visual and haptic \VEs are rendered back using an immersive \AR headset and wearable haptics, and are perceived by the user to be registered and co-localized with the \RE (in gray). \protect\footnotemark ] @@ -142,7 +142,7 @@ In this context, we focus on two main research challenges: \textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and \textbf{(II) enabling effective manipulation of the augmented environment}. Each of these challenges also raises numerous design, technical and human issues specific to wearable haptics and immersive \AR. -%, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic \AE. +%, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic augmented environment. 
\footnotetext{% The icons are \href{https://creativecommons.org/licenses/by/3.0/}{CC BY} licensed: @@ -153,7 +153,7 @@ Each of these challenges also raises numerous design, technical and human issues \subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations} -\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing varied kinesthetic and tactile feedback to \VOs, and adding realism when interacting with them \cite{culbertson2018haptics}. +\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing varied kinesthetic and tactile feedback of virtual objects, and adding realism when interacting with them \cite{culbertson2018haptics}. Although closely related, \AR and \VR have key differences in their respective renderings that can affect user perception. %As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects. @@ -168,26 +168,26 @@ It remains to be investigated how such potential discrepancies affect the overal %Visual and haptic augmentations of the \RE add sensations to the user's overall perception. The \textbf{added visual and haptic virtual sensations may also be perceived as incoherent} with the sensations of \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these. Moreover, in \AR the user can still see the real world environment, including their hands, augmented real objects and worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE. 
-It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other. % in the perception of the \AE. +It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other. % in the perception of the augmented environment. With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed. \subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment} -Touching, \textbf{grasping and manipulating \VOs are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}. +Touching, \textbf{grasping and manipulating virtual objects are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}. Since the hand is not occupied or covered with a haptic device so as to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real. When augmenting a real object, the user's hand is physically constrained by the object, allowing for easy and natural interaction. -However, \textbf{manipulating a purely \VO with the bare hand can be challenging} without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction. 
+However, \textbf{manipulating a purely virtual object with the bare hand can be challenging} without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction. -In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency. +In addition, current \AR systems have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency. \AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world. -However, the depth perception of \VOs is often underestimated \cite{peillard2019studying,adams2022depth}. -There is also often \textbf{a lack of mutual occlusion between the hand and a \VO}, \ie that the hand can hide the object or be hidden by the object \cite{macedo2023occlusion}. -Finally, as illustrated in \figref{interaction-loop}, interaction with a \VO is an illusion, because the real hand controls in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE. -Therefore, there is inevitably a latency between the movements of the real hand and the return movements of the \VO, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the \VO touched \cite{prachyabrued2014visual}. -These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp the \VO and move it to a desired location. +However, the depth of virtual objects is often underestimated \cite{peillard2019studying,adams2022depth}. 
+There is also often \textbf{a lack of mutual occlusion between the hand and a virtual object}, \ie the ability of the hand to hide the object or be hidden by it \cite{macedo2023occlusion}. +Finally, as illustrated in \figref{interaction-loop}, interaction with a virtual object is an illusion, because the real hand controls a virtual hand in real time, like an avatar, whose contacts with virtual objects are then simulated in the \VE. +Therefore, there is inevitably a latency between the movements of the real hand and the resulting movements of the virtual object, and a spatial shift between the real hand and the virtual hand, whose movements are constrained by the touched virtual object \cite{prachyabrued2014visual}. +These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, as well as to estimate the force required to grasp the virtual object and move it to a desired location. -Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a \VO with the hand. -Yet, it is unclear which type of visual and haptic feedback, or their combination, is best suited to guide the manipulation of a \VO. %, and whether one or the other of a combination of the two is most beneficial for users. +Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a virtual object with the hand. +Yet, it is unclear which type of visual and haptic feedback, or their combination, is best suited to guide the manipulation of a virtual object. %, and whether one or the other of a combination of the two is most beneficial for users. \section{Approach and Contributions} \label{contributions} @@ -210,7 +210,7 @@ Our contributions in these two axes are summarized in \figref{contributions}. 
\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[ The contributions are represented in dark grey boxes, the research axes in dark green circles and the research objectives in light green circles. The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand. - The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback. + The second axis focuses on \textbf{(II)} improving the manipulation of virtual objects with the bare hand using visuo-haptic augmentations of the hand as interaction feedback. ] \subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces} @@ -240,23 +240,23 @@ Our third objective is to \textbf{evaluate the perception of simultaneous and co \subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects} In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented and virtual objects. -Hence, a user can expect natural and direct contact and manipulation of \VOs with the bare hand. -However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with \VOs particularly challenging. -%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging. 
-Two particular sensory feedbacks are known to improve such direct \VO manipulation, but have not been properly investigated in immersive \AR: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}. -For this second axis of research, we propose to design and evaluate \textbf{visuo-haptic augmentations of the hand as interaction feedback with \VOs} in immersive \OST-\AR. -We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of \VO manipulation with the hand in combination with visual hand augmentations. +Hence, a user can expect natural and direct contact and manipulation of virtual objects with the bare hand. +However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with virtual objects particularly challenging. +%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with virtual objects with bare hands particularly challenging. +Two particular types of sensory feedback are known to improve such direct virtual object manipulation, but have not been properly investigated in immersive \AR: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}. 
+For this second axis of research, we propose to design and evaluate \textbf{visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in immersive \OST-\AR. +We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of virtual object manipulation with the hand in combination with visual hand augmentations. -First, the visual feedback of the virtual hand is a key element for interacting and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}. -Some work has also investigated the visual feedback of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of \VO manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand. +First, the visual feedback of the virtual hand is a key element for interacting with and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}. +Some work has also investigated the visual feedback of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of virtual object manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and virtual objects \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand. \OST-\AR also has significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}. %, and these visual hand augmentations have not been evaluated . 
+Thus, our fourth objective is to \textbf{evaluate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects. Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations and interactions with the \RE. -Previous work has shown that wearable haptics that provide feedback on hand manipulation with \VOs in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}. +Previous work has shown that wearable haptics that provide feedback on hand manipulation with virtual objects in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}. However, it is unclear which positioning of the actuator is most beneficial and how delocalized haptic feedback of the hand-object contacts compares or complements visual augmentation of the hand. -Our last objective is to \textbf{investigate the delocalized haptic feedback of \VO manipulation} with the hand, in \textbf{combination with visual augmentations of the hand}, using wearable vibrotactile haptics. +Our last objective is to \textbf{investigate the delocalized haptic feedback of virtual object manipulation} with the hand, in \textbf{combination with visual augmentations of the hand}, using wearable vibrotactile haptics. \section{Thesis Overview} \label{thesis_overview} @@ -268,7 +268,7 @@ With this first current \textit{Introduction} chapter, we have presented the res In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination. First, we overview how the hand perceives and manipulates real objects. 
Second, we present wearable haptics and haptic augmentations of the texture and hardness of real objects. -Third, we introduce \AR, and how \VOs can be manipulated directly with the hand. +Third, we introduce \AR, and how virtual objects can be manipulated directly with the hand. Finally, we describe how visuo-haptic feedback has augmented direct hand interaction in \AR, particularly using wearable haptics. We then address each of our two research axes in a dedicated part. @@ -291,12 +291,12 @@ The virtual textures are paired visual and haptic captures of real surfaces \cit Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs. \noindentskip -In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback with \VOs in immersive \OST-\AR. +In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving direct hand manipulation of virtual objects using visuo-haptic augmentations of the hand as interaction feedback in immersive \OST-\AR. In \textbf{\chapref{visual_hand}} we investigate in a user study six visual feedback as hand augmentations, as a set of the most popular hand augmentation in the \AR literature. -Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand. +Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand. 
-In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study delocalized haptic feedback of hand manipulation with \VO using two vibrotactile contact techniques provided at five different positionings on the hand. %, as haptic rendering of \VO manipulation with the hand. +In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study delocalized haptic feedback of hand manipulation with virtual objects using two vibrotactile contact techniques provided at five different positionings on the hand. %, as haptic rendering of virtual object manipulation with the hand. They are compared with the two most representative visual hand augmentations from the previous chapter, resulting in twenty visuo-haptic hand feedbacks that are evaluated within the same experimental setup and design. \noindentskip diff --git a/1-background/related-work/1-haptic-hand.tex b/1-background/related-work/1-haptic-hand.tex index 1a6deed..3973b61 100644 --- a/1-background/related-work/1-haptic-hand.tex +++ b/1-background/related-work/1-haptic-hand.tex @@ -98,7 +98,7 @@ As illustrated in \figref{sensorimotor_continuum}, \textcite{jones2006human} del ] This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object. -In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) using immersive \AR and wearable haptics. +In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of virtual objects (\partref{manipulation}) using immersive \AR and wearable haptics. 
\subsubsection{Hand Anatomy and Motion} \label{hand_anatomy} diff --git a/1-background/related-work/2-wearable-haptics.tex b/1-background/related-work/2-wearable-haptics.tex index 3d13e0f..5a5bb3d 100644 --- a/1-background/related-work/2-wearable-haptics.tex +++ b/1-background/related-work/2-wearable-haptics.tex @@ -322,7 +322,7 @@ A challenge with this technique is to provide the vibration feedback at the righ \subfig{choi2021augmenting_results} \end{subfigs} -Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render \VOs \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}. +Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render virtual objects \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}. We describe them in the \secref{vhar_haptics}. %A promising alternative approach diff --git a/1-background/related-work/3-augmented-reality.tex b/1-background/related-work/3-augmented-reality.tex index c96a0cf..3cc36c6 100644 --- a/1-background/related-work/3-augmented-reality.tex +++ b/1-background/related-work/3-augmented-reality.tex @@ -3,11 +3,11 @@ %As with haptic systems (\secref{wearable_haptics}), visual \AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}. -Immersive systems such as headsets leave the hands free to interact with virtual objects (\VOs), promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}. +Immersive systems such as headsets leave the hands free to interact with virtual objects, promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}. 
%\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system \cite{sutherland1968headmounted}. }[ % \item The \AR headset. -% \item Wireframe \ThreeD \VOs were displayed registered in the \RE (as if there were part of it). +% \item Wireframe \ThreeD virtual objects were displayed registered in the \RE (as if they were part of it). % ] % \subfigsheight{45mm} % \subfig{sutherland1970computer3} @@ -17,7 +17,7 @@ Immersive systems such as headsets leave the hands free to interact with virtual \subsection{What is Augmented Reality?} \label{what_is_ar} -The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room. +The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying virtual objects at a fixed point in space in real time, giving the user the illusion that the content was present in the room. Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following our interaction loop presented in \figref[introduction]{interaction-loop}. \subsubsection{A Definition of AR} @@ -99,7 +99,7 @@ Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, prov Presence and embodiment are two key concepts that characterize the user experience in \AR and \VR. While there is a large literature on these topics in \VR, they are less defined and studied for \AR \cite{genay2022being,tran2024survey}. 
These concepts will be useful for the design, evaluation, and discussion of our contributions: -In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentation (\chapref{xr_perception}) and manipulating \VOs (\chapref{visual_hand}), and explore the plausibility of visuo-haptic textures (\chapref{visuo_haptic}). +In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentation (\chapref{xr_perception}) and manipulating virtual objects (\chapref{visual_hand}), and explore the plausibility of visuo-haptic textures (\chapref{visuo_haptic}). \paragraph{Presence} \label{ar_presence} @@ -113,9 +113,9 @@ It doesn't mean that the virtual events are realistic, \ie that reproduce the re In the same way, a film can be plausible even if it is not realistic, such as a cartoon or a science-fiction movie. %The \AR presence is far less defined and studied than for \VR \cite{tran2024survey} -For \AR, \textcite{slater2022separate} proposed to invert place illusion to what we can call \enquote{object illusion}, \ie the sense of the \VO to \enquote{feels here} in the \RE (\figref{presence-ar}). -As with \VR, \VOs must be able to be seen from different angles by moving the head, but also, this is more difficult, appear to be coherent enough with the \RE \cite{skarbez2021revisiting}, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows or reflect lights. -The plausibility can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react accordingly to it to be, again, perceived as coherently behaving with the real world \cite{skarbez2021revisiting}. +For \AR, \textcite{slater2022separate} proposed to invert place illusion to what we can call \enquote{object illusion}, \ie the sense that the virtual object \enquote{feels here} in the \RE (\figref{presence-ar}). 
+As with \VR, virtual objects must be able to be seen from different angles by moving the head, but also, and this is more difficult, appear to be coherent enough with the \RE \cite{skarbez2021revisiting}, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows or reflect light. +The plausibility can be applied to \AR as is, but the virtual objects must additionally have knowledge of the \RE and react to it accordingly to be, again, perceived as behaving coherently with the real world \cite{skarbez2021revisiting}. %\textcite{skarbez2021revisiting} also named place illusion for \AR as \enquote{immersion} and plausibility as \enquote{coherence}, and these terms will be used in the remainder of this thesis. %One main issue with presence is how to measure it both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}. @@ -123,7 +123,7 @@ The plausibility can be applied to \AR as is, but the \VOs must additionally hav The sense of immersion in virtual and augmented environments. Adapted from \textcite{stevens2002putting}. }[][ \item Place illusion is the sense of the user of \enquote{being there} in the \VE. - \item Objet illusion is the sense of the \VO to \enquote{feels here} in the \RE. + \item Object illusion is the sense that the virtual object \enquote{feels here} in the \RE. ] \subfigsheight{35mm} \subfig{presence-vr} @@ -166,18 +166,18 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the \textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for immersive \AR. The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions. -\emph{Selection} is the identification or acquisition of a specific \VO, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand. 
+\emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand. \emph{Positioning} and \emph{rotation} of a selected object are the change of its position and orientation in \ThreeD space respectively. -It is also common to \emph{resize} a \VO to change its size. +It is also common to \emph{resize} a virtual object to change its size. These three operations are geometric (rigid) manipulations of the object: they do not change its shape. The \emph{navigation tasks} are the movements of the user within the \VE. Travel is the control of the position and orientation of the viewpoint in the \VE, \eg physical walking, velocity control, or teleportation. Wayfinding is the cognitive planning of the movement, such as path finding or route following (\figref{grubert2017pervasive}). -The \emph{system control tasks} are changes to the system state through commands or menus such as creating, deleting, or modifying \VOs, \eg as in \figref{roo2017onea}. It is also the input of text, numbers, or symbols. +The \emph{system control tasks} are changes to the system state through commands or menus such as creating, deleting, or modifying virtual objects, \eg as in \figref{roo2017onea}. It is also the input of text, numbers, or symbols. -In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and positioning and rotating \VOs pushed and grasp by the hand. +In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and positioning and rotating virtual objects pushed and grasped by the hand. \begin{subfigs}{interaction-techniques}{Interaction techniques in \AR.
}[][ \item Spatial selection of virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}. @@ -210,26 +210,26 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a \subsubsection{Manipulating with Tangibles} \label{ar_tangibles} -As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to either visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}. +As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to either visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with virtual objects \cite{ishii1997tangible}. %According to \textcite{billinghurst2005designing} -Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}. +Each virtual object is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}. The real objects are called \emph{tangible} in this usage context. %This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen. -Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}). +Methods have been developed to automatically pair and adapt the virtual objects for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}). 
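The automatic pairing mentioned above can be viewed as a similarity search over the available tangibles. As a minimal sketch of the idea, assuming a crude bounding-box size metric (the scoring and all names here are hypothetical, not the method of the cited systems):

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """Object extents in metres (a crude stand-in for shape and size)."""
    width: float
    height: float
    depth: float

def size_difference(a: BoundingBox, b: BoundingBox) -> float:
    """Summed absolute difference of extents; lower means more similar."""
    return (abs(a.width - b.width) + abs(a.height - b.height)
            + abs(a.depth - b.depth))

def pair_tangible(virtual: BoundingBox, tangibles: list[BoundingBox]) -> BoundingBox:
    """Pick the available tangible closest in size to the virtual object."""
    return min(tangibles, key=lambda t: size_difference(virtual, t))

# A virtual mug paired against a real book and a real bottle:
mug = BoundingBox(0.08, 0.10, 0.08)
book = BoundingBox(0.30, 0.02, 0.20)
bottle = BoundingBox(0.07, 0.12, 0.07)
best = pair_tangible(mug, [book, bottle])  # the bottle is the closer match
```

Real systems additionally adapt the rendered virtual object to the chosen tangible and account for shape, grasp affordances, and tracking.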
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required. An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}. -These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}). +These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any virtual object, \eg by placing the tangible into the virtual object and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}). Still, the virtual visual rendering and the real haptic sensations can be incoherent. -Especially in \OST-\AR, since the \VOs are inherently slightly transparent allowing the paired real objects to be seen through them. -In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs does not affect user performance or presence, and that small variations (\percent{\sim 10} for size) were not even noticed by the users. -This suggests the feasibility of using simplified real obejcts in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs. +This is especially the case in \OST-\AR, since the virtual objects are inherently slightly transparent, allowing the paired real objects to be seen through them.
+In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the virtual objects does not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users. +This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the virtual objects. Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: it could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR. -\begin{subfigs}{ar_tangibles}{Manipulating \VOs through real objects. }[][ +\begin{subfigs}{ar_tangibles}{Manipulating virtual objects through real objects. }[][ \item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}. - \item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}. + \item A real cube that can be moved into the \VE and used to grasp and manipulate virtual objects \cite{issartel2016tangible}. \item Size and \item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}. ] @@ -252,20 +252,20 @@ The simplest models represent the hand as a rigid \ThreeD object that follows th An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points. The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
-The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite[p.405]{laviolajr20173d}. -Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}). +The contacts between the virtual hand model and the virtual objects are then simulated using heuristic or physics-based techniques \cite[p.405]{laviolajr20173d}. +Heuristic techniques use rules to determine the selection, manipulation and release of a virtual object (\figref{piumsomboon2013userdefined_1}). However, they produce unrealistic behaviour and are limited to the cases predicted by the rules. -Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO. +Physics-based techniques simulate forces at the points of contact between the virtual hand and the virtual object. In particular, \textcite{borst2006spring} proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}: -The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the \VOs during contact. +The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact. The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}). More advanced techniques simulate the friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, which can however be difficult to compute in real time. -\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[][ - \item A fingertip tracking that allows to select a \VO by opening the hand \cite{lee2007handy}.
+\begin{subfigs}{virtual-hand}{Manipulating virtual objects with virtual hands. }[][ + \item Fingertip tracking that allows the user to select a virtual object by opening the hand \cite{lee2007handy}. \item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}. - \item Grasping a through gestures when the fingers are detected as opposing on the \VO \cite{piumsomboon2013userdefined}. - \item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}. + \item Grasping through gestures when the fingers are detected as opposing on the virtual object \cite{piumsomboon2013userdefined}. + \item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but is kept physically constrained to the virtual object. Applied forces are shown as red arrows \cite{borst2006spring}. ] \subfigsheight{37mm} \subfigbox{lee2007handy} @@ -275,7 +275,7 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing \end{subfigs} However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}. -While the user's fingers traverse the \VO, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
While a visual feedback of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic feedback of the virtual hand, or their combination, in \AR needs to be investigated as well. @@ -284,24 +284,24 @@ While a visual feedback of the virtual hand in \VR can compensate for these issu %In \VR, since the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}). When interacting with a physics-based virtual hand method (\secref{ar_virtual_hands}) in \VR, the visual feedback of the virtual hand has an influence on perception, interaction performance, and preference of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}. -In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand feedback whose motion was constrained to the surface of the \VOs similar as to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}) performed the worst, while the visual hand feedback following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked. +In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand feedback whose motion was constrained to the surface of the virtual objects similar as to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}) performed the worst, while the visual hand feedback following the tracked human hand (thus penetrating the virtual objects, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked. 
\textcite{prachyabrued2014visual} also found that the best compromise was a double feedback, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}). While a realistic rendering of the human hand increased the sense of ownership \cite{lin2016need}, a skeleton-like rendering provided a stronger sense of agency \cite{argelaguet2016role} (\secref{ar_embodiment}), and a minimalist fingertip rendering reduced typing errors \cite{grubert2018effects}. A visual hand feedback while in \VE also seems to affect how one grasps an object \cite{blaga2020too}, or how real bumps and holes are perceived \cite{schwind2018touch}. \fig{prachyabrued2014visual}{Visual hand feedback affects user experience in \VR \cite{prachyabrued2014visual}.} -Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the \VOs is a common issue (\secref{ar_displays}), \ie hiding the \VO when the real hand is in front of it, and hiding the real hand when it is behind the \VO (\figref{hilliges2012holodesk_2}). +Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the virtual objects is a common issue (\secref{ar_displays}), \ie hiding the virtual object when the real hand is in front of it, and hiding the real hand when it is behind the virtual object (\figref{hilliges2012holodesk_2}). %For example, in \figref{hilliges2012holodesk_2}, the user is pinching a virtual cube in \OST-\AR with their thumb and index fingers, but while the index is behind the cube, it is seen as in front of it. While in \VST-\AR, this could be solved as a masking problem by combining the real and virtual images \cite{battisti2018seamless}, \eg in \figref{suzuki2014grasping}, in \OST-\AR, this is much more difficult because the \VE is displayed as a transparent \TwoD image on top of the \ThreeD \RE, which cannot be easily masked \cite{macedo2023occlusion}.
%Yet, even in \VST-\AR, -%An alternative is to render the \VOs and the virtual hand semi-transparents, so that they are partially visible even when one is occluding the other (\figref{buchmann2005interaction}). +%An alternative is to render the virtual objects and the virtual hand semi-transparent, so that they are partially visible even when one is occluding the other (\figref{buchmann2005interaction}). %Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR. -%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it. +%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it. -Since the \VE is intangible, adding a visual feedback of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the double-hand feedback of \textcite{prachyabrued2014visual}. -A \VO overlaying a real object object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}. +Since the \VE is intangible, adding a visual feedback of the virtual hand in \AR that is physically constrained to the virtual objects would achieve a similar result to the double-hand feedback of \textcite{prachyabrued2014visual}. +A virtual object overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand feedback superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user. Few works have compared different visual feedback of the virtual hand in \AR or with wearable haptic feedback. @@ -313,13 +313,13 @@ In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluat \textcite{genay2021virtual} found that the sense of embodiment with robotic hands overlay in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}). Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}). Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined. -%\cite{chan2010touching} : cues for touching (selection) \VOs. -%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did. -%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation. +%\cite{chan2010touching} : cues for touching (selection) virtual objects. +%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did. 
+%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation. \begin{subfigs}{visual-hands}{Visual feedback of the virtual hand in \AR. }[][ - \item Grasping a \VO in \OST-\AR with no visual hand feedback \cite{hilliges2012holodesk}. - \item Simulated mutual-occlusion between the hand grasping and the \VO in \VST-\AR \cite{suzuki2014grasping}. + \item Grasping a virtual object in \OST-\AR with no visual hand feedback \cite{hilliges2012holodesk}. + \item Simulated mutual-occlusion between the hand grasping and the virtual object in \VST-\AR \cite{suzuki2014grasping}. \item Grasping a real object with a semi-transparent hand in \VST-\AR \cite{buchmann2005interaction}. \item Skeleton rendering overlaying the real hand in \VST-\AR \cite{blaga2017usability}. \item Robotic rendering overlaying the real hands in \OST-\AR \cite{genay2021virtual}. @@ -338,8 +338,8 @@ Taken together, these results suggest that a visual augmentation of the hand in \AR systems integrate virtual content into the user's perception as if it were part of the \RE. \AR headsets now enable real-time tracking of the head and hands, and high-quality display of virtual content, while being portable and mobile. -They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content. -However, without direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised. -In particular, when manipulating \VOs in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand. 
-A common alternative approach is to use real objects as proxies for interaction with \VOs, but this raises concerns about their coherence with visual augmentations. -In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects. +They enable highly immersive augmented environments that users can explore with a strong sense of the presence of the virtual content. +However, without direct and seamless interaction with the virtual objects using the hands, the coherence of the augmented environment experience is compromised. +In particular, when manipulating virtual objects in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand. +A common alternative approach is to use real objects as proxies for interaction with virtual objects, but this raises concerns about their coherence with visual augmentations. +In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of virtual objects and for coherent visuo-haptic augmentation of touched real objects. diff --git a/1-background/related-work/4-visuo-haptic-ar.tex b/1-background/related-work/4-visuo-haptic-ar.tex index 5bbd749..3baf44a 100644 --- a/1-background/related-work/4-visuo-haptic-ar.tex +++ b/1-background/related-work/4-visuo-haptic-ar.tex @@ -5,7 +5,7 @@ Perception and manipulation of objects with the hand typically involves both the Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable for many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}. 
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions. -It is essential to understand how a visuo-haptic rendering of a \VO is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand. +It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand. % spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations % delocalized : not at the point of contact = difficult to integrate with other perceptual cues ? @@ -17,7 +17,7 @@ It is essential to understand how a visuo-haptic rendering of a \VO is perceived \label{sensations_perception} A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}. -For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}). +For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized virtual object (\secref{ar_tangibles}). 
If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}. No sensory information is completely reliable and may give different answers to the same property when measured multiple times, \eg the weight of an object. @@ -58,10 +58,10 @@ The objective was to determine a \PSE between the comparison and reference bars, %\subfig{ernst2002humans_visuo-haptic} \end{subfigs} -%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback. -The \MLE model implies that when seeing and touching a \VO in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property. +%Hence, the \MLE model explains how a (visual) virtual object in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback. +The \MLE model implies that when seeing and touching a virtual object in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property. %As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections. -%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections. +%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the virtual object, as discussed in the next sections. 
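The \MLE integration can be made concrete with its standard minimum-variance formulation \cite{ernst2004merging}: the visual and haptic estimates of a property are averaged, weighted by their reliabilities. Sketched here with our own notation:

```latex
% Reliability-weighted combination of visual (V) and haptic (H) estimates
% of the same object property s (standard MLE cue-integration model):
\hat{s} = w_V\,\hat{s}_V + w_H\,\hat{s}_H ,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2} ,
\qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
\le \min\left(\sigma_V^2, \sigma_H^2\right)
```

The combined variance is lower than either unimodal variance, and the more reliable (lower-variance) sense dominates: this is why altering the stimulus of one modality can shift the overall percept while it remains coherent.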
\subsubsection{Influence of Visual Rendering on Haptic Perception} \label{visual_haptic_influence} @@ -73,11 +73,11 @@ More precisely, when surfaces are evaluated by vision or touch alone, both sense The overall perception can then be modified by changing one of the sensory modalities. \textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror. In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures. -\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively. -They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}). +\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively. +They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual virtual objects (\figref{gunther2022smooth}). %Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures. 
-\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surfaces were found to match all the visual \VOs \cite{gunther2022smooth}.} +\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual virtual objects \cite{gunther2022smooth}.} Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}. For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a real object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}. @@ -94,12 +94,12 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo \end{subfigs} %In all of these studies, the visual expectations of participants influenced their haptic perception. -%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO. +%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the virtual object. \subsubsection{Perception of Visuo-Haptic Rendering in AR and VR} \label{ar_vr_haptic} -Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR. +Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR. In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer. @@ -126,8 +126,8 @@ Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay \textcite{gaffary2017ar} compared perceived stiffness of virtual pistons in \OST-\AR and \VR. However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}). The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR. -This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE. -%Two differences that could be worth investigating with the two previous studies are the type of \AR (visuo or optical) and to see the hand touching the \VO. +This suggests that the haptic stiffness of virtual objects feels \enquote{softer} in an augmented environment than in a full \VE. +%Two differences that could be worth investigating with the two previous studies are the type of \AR (visuo or optical) and to see the hand touching the virtual object. \begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][ \item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant. @@ -139,12 +139,12 @@ This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE \subfigbox[0.31]{gaffary2017ar_4} \end{subfigs} -Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR. +Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a virtual object in \VR.
+The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the virtual object.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
-These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
+These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\subsection{Wearable Haptics for Direct Hand Interaction in AR}
@@ -163,10 +163,10 @@ Another category of actuators relies on systems that cannot be considered as por
\label{vhar_nails}
\textcite{ando2007fingernailmounted} were the first to propose moving the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
-This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch_1}).
+This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching virtual objects (\figref{teng2021touch_1}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
-When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
+When touching virtual objects in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real objects.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
@@ -196,7 +196,7 @@ However, no proper user study has been conducted to evaluate these devices in \A
\subsubsection{Belt Devices}
\label{vhar_rings}
-The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
+The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of virtual objects in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, and friction of smooth real surfaces and to render localized bumps and holes on them (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but have not been tested in \AR.
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
@@ -250,11 +250,11 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
\subsection{Conclusion}
\label{visuo_haptic_conclusion}
-Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
+Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with virtual objects in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them is best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
-Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
+Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the virtual object.
diff --git a/1-background/related-work/related-work.tex b/1-background/related-work/related-work.tex index cdbf3ae..42c256f 100644 --- a/1-background/related-work/related-work.tex +++ b/1-background/related-work/related-work.tex @@ -4,7 +4,7 @@ \chaptertoc This chapter reviews previous work on the perception and manipulation of virtual and augmented objects directly with the hand, using either wearable haptics, \AR, or their combination. -%Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE. +%Experiencing a visual, haptic, or visuo-haptic augmented environment relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall augmented environment. First, we review how the hand senses and interacts with its environment to perceive and manipulate the haptic properties of real everyday objects. Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects. Third, we introduce the principles and user experience of \AR and review the main interaction techniques used to manipulate virtual objects directly with the hand. 
diff --git a/2-perception/xr-perception/6-conclusion.tex b/2-perception/xr-perception/6-conclusion.tex
index 771de02..90d3c94 100644
--- a/2-perception/xr-perception/6-conclusion.tex
+++ b/2-perception/xr-perception/6-conclusion.tex
@@ -18,7 +18,7 @@ Latencies should be measured \cite{friston2014measuring}, minimized to an accept
It also seems that the visual aspect of the hand or the environment in itself has little effect on the perception of haptic feedback, but the degree of visual reality-virtuality can affect the perceived asynchrony of the latencies, even though they remain identical.
When designing wearable haptics or integrating them into \AR/\VR, it seems important to test their perception in real, augmented and virtual environments.
%With a better understanding of how visual factors influence the perception of haptically augmented real objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
-%Finally, a visual hand representation in OST-\AR together with wearable haptics should be avoided until acceptable tracking latencies \are achieved, as was also observed for \VO interaction with the bare hand \cite{normand2024visuohaptic}.
+%Finally, a visual hand representation in OST-\AR together with wearable haptics should be avoided until acceptable tracking latencies are achieved, as was also observed for virtual object interaction with the bare hand \cite{normand2024visuohaptic}.
\noindentskip This work was presented and published at the VRST 2024 conference: diff --git a/3-manipulation/visual-hand/1-introduction.tex b/3-manipulation/visual-hand/1-introduction.tex index c502b8e..1430238 100644 --- a/3-manipulation/visual-hand/1-introduction.tex +++ b/3-manipulation/visual-hand/1-introduction.tex @@ -1,24 +1,24 @@ \section{Introduction} \label{intro} -Touching, grasping and manipulating \VOs are fundamental interactions in \AR (\secref[related_work]{ve_tasks}) and essential for many of its applications (\secref[related_work]{ar_applications}). -The most common current \AR systems, in the form of portable and immersive \OST-\AR headsets \cite{hertel2021taxonomy}, allow real-time hand tracking and direct interaction with \VOs with bare hands (\secref[related_work]{real_virtual_gap}). -Manipulation of \VOs is achieved using a virtual hand interaction technique that represents the user's hand in the \VE and simulates interaction with \VOs (\secref[related_work]{ar_virtual_hands}). -However, direct hand manipulation is still challenging due to the intangibility of the \VE, the lack of mutual occlusion between the hand and the \VO in \OST-\AR (\secref[related_work]{ar_displays}), and the inherent delays between the user's hand and the result of the interaction simulation (\secref[related_work]{ar_virtual_hands}). +Touching, grasping and manipulating virtual objects are fundamental interactions in \AR (\secref[related_work]{ve_tasks}) and essential for many of its applications (\secref[related_work]{ar_applications}). +The most common current \AR systems, in the form of portable and immersive \OST-\AR headsets \cite{hertel2021taxonomy}, allow real-time hand tracking and direct interaction with virtual objects with bare hands (\secref[related_work]{real_virtual_gap}). 
+Manipulation of virtual objects is achieved using a virtual hand interaction technique that represents the user's hand in the \VE and simulates interaction with virtual objects (\secref[related_work]{ar_virtual_hands}).
+However, direct hand manipulation is still challenging due to the intangibility of the \VE, the lack of mutual occlusion between the hand and the virtual object in \OST-\AR (\secref[related_work]{ar_displays}), and the inherent delays between the user's hand and the result of the interaction simulation (\secref[related_work]{ar_virtual_hands}).
-In this chapter, we investigate the \textbf{visual rendering as hand augmentation} for direct manipulation of \VOs in \OST-\AR.
-To this end, we selected in the literature and compared the most popular visual hand renderings used to interact with \VOs in \AR.
+In this chapter, we investigate the \textbf{visual rendering as hand augmentation} for direct manipulation of virtual objects in \OST-\AR.
+To this end, we selected from the literature and compared the most popular visual hand renderings used to interact with virtual objects in \AR.
The virtual hand is \textbf{displayed superimposed} on the user's hand with these visual renderings, providing \textbf{feedback on the tracking} of the real hand, as shown in \figref{hands}.
-The movement of the virtual hand is also \textbf{constrained to the surface} of the \VO, providing an additional \textbf{feedback on the interaction} with the \VO.
+The movement of the virtual hand is also \textbf{constrained to the surface} of the virtual object, providing additional \textbf{feedback on the interaction} with the virtual object.
+We \textbf{evaluate in a user study}, using the \OST-\AR headset Microsoft HoloLens~2, the effect of six visual hand renderings on the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand. \noindentskip The main contributions of this chapter are: \begin{itemize} - \item A comparison from the literature of the six most common visual hand renderings used to interact with \VOs in \AR. - \item A user study evaluating with 24 participants the performance and user experience of the six visual hand renderings as augmentation of the real hand during free and direct hand manipulation of \VOs in \OST-\AR. + \item A comparison from the literature of the six most common visual hand renderings used to interact with virtual objects in \AR. + \item A user study evaluating with 24 participants the performance and user experience of the six visual hand renderings as augmentation of the real hand during free and direct hand manipulation of virtual objects in \OST-\AR. \end{itemize} -\noindentskip In the next sections, we first present the six visual hand renderings we considered and gathered from the literature. We then describe the experimental setup and design, the two manipulation tasks, and the metrics used. We present the results of the user study and discuss the implications of these results for the manipulation of \VOs directly with the hand in \AR. +\noindentskip In the next sections, we first present the six visual hand renderings we considered and gathered from the literature. We then describe the experimental setup and design, the two manipulation tasks, and the metrics used. We present the results of the user study and discuss the implications of these results for the manipulation of virtual objects directly with the hand in \AR. 
\bigskip
diff --git a/3-manipulation/visual-hand/2-method.tex b/3-manipulation/visual-hand/2-method.tex
index 6db51bc..00ac396 100644
--- a/3-manipulation/visual-hand/2-method.tex
+++ b/3-manipulation/visual-hand/2-method.tex
@@ -5,15 +5,15 @@ We compared a set of the most popular visual hand renderings, as found in the li
Since we address hand-centered manipulation tasks, we only considered renderings including the fingertips (\secref[related_work]{grasp_types}).
Moreover, so as to keep the focus on the hand rendering itself, we used neutral semi-transparent grey meshes, consistent with the choices made in \cite{yoon2020evaluating,vanveldhuizen2021effect}.
All considered hand renderings are drawn following the tracked pose of the user's real hand.
-However, while the real hand can of course penetrate \VOs, the visual hand is always constrained by the \VE (\secref[related_work]{ar_virtual_hands}).
+However, while the real hand can of course penetrate virtual objects, the visual hand is always constrained by the \VE (\secref[related_work]{ar_virtual_hands}).
They are shown in \figref{hands} and described below, with an abbreviation in parentheses when needed.
\paragraph{None}
As a reference, we considered no visual hand rendering (\figref{method/hands-none}), as is common in \AR \cite{hettiarachchi2016annexing,blaga2017usability,xiao2018mrtouch,teng2021touch}.
-Users have no information about hand tracking and no feedback about contact with the \VOs, other than their movement when touched.
-As virtual content is rendered on top of the \RE, the hand of the user can be hidden by the \VOs when manipulating them (\secref[related_work]{ar_displays}).
+Users have no information about hand tracking and no feedback about contact with the virtual objects, other than their movement when touched.
+As virtual content is rendered on top of the \RE, the hand of the user can be hidden by the virtual objects when manipulating them (\secref[related_work]{ar_displays}).
\paragraph{Occlusion (Occl)}
@@ -22,13 +22,13 @@ This approach is frequent in works using \VST-\AR headsets \cite{knorlein2009inf
\paragraph{Tips}
-This rendering shows small visual rings around the fingertips of the user (\figref{method/hands-tips}), highlighting the most important parts of the hand and contact with \VOs during fine manipulation (\secref[related_work]{grasp_types}).
+This rendering shows small visual rings around the fingertips of the user (\figref{method/hands-tips}), highlighting the most important parts of the hand and contact with virtual objects during fine manipulation (\secref[related_work]{grasp_types}).
Unlike work using small spheres \cite{maisto2017evaluation,meli2014wearable,grubert2018effects,normand2018enlarging,schwind2018touch}, this ring rendering also provides information about the orientation of the fingertips.
\paragraph{Contour (Cont)}
This rendering is a \qty{1}{\mm} thick outline contouring the user's hands, providing information about the whole hand while leaving its inside visible.
-Unlike the other renderings, it is not occluded by the \VOs, as shown in \figref{method/hands-contour}.
+Unlike the other renderings, it is not occluded by the virtual objects, as shown in \figref{method/hands-contour}.
This rendering is less common in the literature than the previous ones \cite{kang2020comparative}.
\paragraph{Skeleton (Skel)}
@@ -45,7 +45,7 @@ It can be seen as a filled version of the Contour hand rendering, thus partially
\section{User Study}
\label{method}
-We aim to investigate whether the chosen visual hand rendering affects the performance and user experience of manipulating \VOs with free hands in \AR.
+We aim to investigate whether the chosen visual hand rendering affects the performance and user experience of manipulating virtual objects with free hands in \AR.
\subsection{Manipulation Tasks and Virtual Scene} \label{tasks} @@ -55,8 +55,8 @@ Following the guidelines of \textcite{bergstrom2021how} for designing object man \subsubsection{Push Task} \label{push-task} -The first manipulation task consists in pushing a \VO along a real flat surface towards a target placed on the same plane (\figref{method/task-push}). -The \VO to manipulate is a small \qty{50}{\mm} blue and opaque cube, while the target is a (slightly) bigger \qty{70}{\mm} blue and semi-transparent volume. +The first manipulation task consists in pushing a virtual object along a real flat surface towards a target placed on the same plane (\figref{method/task-push}). +The virtual object to manipulate is a small \qty{50}{\mm} blue and opaque cube, while the target is a (slightly) bigger \qty{70}{\mm} blue and semi-transparent volume. At every repetition of the task, the cube to manipulate always spawns at the same place, on top of a real table in front of the user. On the other hand, the target volume can spawn in eight different locations on the same table, located on a \qty{20}{\cm} radius circle centred on the cube, at \qty{45}{\degree} from each other (again \figref{method/task-push}). Users are asked to push the cube towards the target volume using their fingertips in any way they prefer. @@ -66,7 +66,7 @@ The task is considered completed when the cube is \emph{fully} inside the target \subsubsection{Grasp Task} \label{grasp-task} -The second manipulation task consists in grasping, lifting, and placing a \VO in a target placed on a different (higher) plane (\figref{method/task-grasp}). +The second manipulation task consists in grasping, lifting, and placing a virtual object in a target placed on a different (higher) plane (\figref{method/task-grasp}). The cube to manipulate and target volume are the same as in the previous task. 
However, this time, the target volume can spawn in eight different locations on a plane \qty{10}{\cm} \emph{above} the table, still located on a \qty{20}{\cm} radius circle at \qty{45}{\degree} from each other.
Users are asked to grasp, lift, and move the cube towards the target volume using their fingertips in any way they prefer.
@@ -111,7 +111,7 @@ The compiled application ran directly on the HoloLens~2 at \qty{60}{FPS}.
The default \ThreeD hand model from MRTK was used for all visual hand renderings.
By changing the material properties of this hand model, we were able to achieve the six renderings shown in \figref{hands}.
A calibration was performed for every participant, to best adapt the size of the visual hand rendering to their real hand.
-A set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the \VOs and hand renderings, which were applied throughout the experiment.
+A set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the virtual objects and hand renderings, which were applied throughout the experiment.
The hand tracking information provided by MRTK was used to construct a virtual articulated physics-enabled hand (\secref[related_work]{ar_virtual_hands}) using PhysX.
It featured 25 DoFs, including the fingers' proximal, middle, and distal phalanges.
diff --git a/3-manipulation/visual-hand/4-discussion.tex b/3-manipulation/visual-hand/4-discussion.tex
index c410624..b73e2e0 100644
--- a/3-manipulation/visual-hand/4-discussion.tex
+++ b/3-manipulation/visual-hand/4-discussion.tex
@@ -1,7 +1,7 @@ \section{Discussion}
\label{discussion}
-We evaluated six visual hand renderings, as described in \secref{hands}, displayed on top of the real hand, in two \VO manipulation tasks in \AR.
+We evaluated six visual hand renderings, as described in \secref{hands}, displayed on top of the real hand, in two virtual object manipulation tasks in \AR.
During the \level{Push} task, the \level{Skeleton} hand rendering was the fastest (\figref{results/Push-CompletionTime-Hand-Overall-Means}), as participants employed fewer and longer contacts to adjust the cube inside the target volume (\figref{results/Push-ContactsCount-Hand-Overall-Means} and \figref{results/Push-MeanContactTime-Hand-Overall-Means}).
Participants consistently used few and continuous contacts for all visual hand renderings (Fig. 3b), with fewer than ten trials, carried out by two participants, quickly completed with multiple discrete touches.
@@ -21,12 +21,12 @@ However, due to the latency of the hand tracking and the visual hand reacting to
The \level{Tips} rendering, which showed the contacts made on the virtual cube, was controversial as it received the minimum and the maximum score on every question.
Many participants reported difficulties in seeing the orientation of the visual fingers, while others found that it gave them a better sense of the contact points and improved their concentration on the task.
-This result is consistent with \textcite{saito2021contact}, who found that displaying the points of contacts was beneficial for grasping a \VO over an opaque visual hand overlay.
+This result is consistent with \textcite{saito2021contact}, who found that displaying the points of contact was beneficial for grasping a virtual object over an opaque visual hand overlay.
-To summarize, when employing a visual hand rendering overlaying the real hand, participants were more performant and confident in manipulating \VOs with bare hands in \AR.
+To summarize, when employing a visual hand rendering overlaying the real hand, participants performed better and were more confident when manipulating virtual objects with bare hands in \AR.
These results contrast with similar manipulation studies, but in non-immersive, on-screen \AR, where the presence of a visual hand rendering was found by participants to improve the usability of the interaction, but not their performance \cite{blaga2017usability,maisto2017evaluation,meli2018combining}.
Our results show the most effective visual hand rendering to be the \level{Skeleton} one.
Participants appreciated that it provided a detailed and precise view of the tracking of the real hand, without hiding or masking it.
Although the \level{Contour} and \level{Mesh} hand renderings were also highly rated, some participants felt that they were too visible and masked the real hand.
-This result is in line with the results of \VO manipulation in \VR of \textcite{prachyabrued2014visual}, who found that the most effective visual hand rendering was a double representation of both the real tracked hand and a visual hand physically constrained by the \VE.
+This result is in line with the \VR virtual object manipulation results of \textcite{prachyabrued2014visual}, who found that the most effective visual hand rendering was a double representation of both the real tracked hand and a visual hand physically constrained by the \VE.
This type of \level{Skeleton} rendering was also the one that provided the best sense of agency (control) in \VR \cite{argelaguet2016role,schwind2018touch}.
diff --git a/3-manipulation/visual-hand/5-conclusion.tex b/3-manipulation/visual-hand/5-conclusion.tex
index 94df77c..2e6b051 100644
--- a/3-manipulation/visual-hand/5-conclusion.tex
+++ b/3-manipulation/visual-hand/5-conclusion.tex
@@ -1,9 +1,9 @@ \section{Conclusion}
\label{conclusion}
-In this chapter, we addressed the challenge of touching, grasping and manipulating \VOs directly with the hand in immersive \OST-\AR by providing and evaluating visual renderings as augmentation of the real hand.
-Superimposed on the user's hand, these visual renderings provide feedback from the virtual hand, which tracks the real hand, and simulates the interaction with \VOs as a proxy.
-We first selected and compared the six most popular visual hand renderings used to interact with \VOs in \AR.
+In this chapter, we addressed the challenge of touching, grasping and manipulating virtual objects directly with the hand in immersive \OST-\AR by providing and evaluating visual renderings as augmentation of the real hand.
+Superimposed on the user's hand, these visual renderings provide feedback from the virtual hand, which tracks the real hand and simulates the interaction with virtual objects as a proxy.
+We first selected and compared the six most popular visual hand renderings used to interact with virtual objects in \AR.
Then, in a user study with 24 participants and an immersive \OST-\AR headset, we evaluated the effect of these six visual hand renderings on the user performance and experience in two representative manipulation tasks.
Our results showed that a visual hand augmentation improved the performance, perceived effectiveness and confidence of participants compared to no augmentation.
diff --git a/3-manipulation/visuo-haptic-hand/1-introduction.tex b/3-manipulation/visuo-haptic-hand/1-introduction.tex
index a9dfc15..26c2c77 100644
--- a/3-manipulation/visuo-haptic-hand/1-introduction.tex
+++ b/3-manipulation/visuo-haptic-hand/1-introduction.tex
@@ -6,22 +6,22 @@ Moreover, it is important to leave the user capable of interacting with both vir
For this reason, it is often considered beneficial to move the point of application of the haptic feedback elsewhere on the hand (\secref[related_work]{vhar_haptics}).
However, the impact of the positioning of the haptic feedback on the hand during direct hand manipulation in \AR has not been systematically studied.
-Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of \VOs with the hand.
+Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
Their results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is more significant when the visual feedback is limited.
A final question is whether one or the other of these (haptic or visual) hand feedback should be preferred \cite{maisto2017evaluation,meli2018combining}, or whether a combined visuo-haptic feedback is beneficial for users.
However, these studies were conducted in non-immersive setups, with a screen displaying the \VE view.
-In fact, both hand feedback can provide sufficient sensory feedback for efficient direct hand manipulation of \VOs in \AR, or conversely, they can be shown to be complementary.
+In fact, either type of hand feedback may provide sufficient sensory information for efficient direct hand manipulation of virtual objects in \AR, or, conversely, the two may prove complementary.
-In this chapter, we aim to investigate the role of \textbf{visuo-haptic feedback of the hand when manipulating \VO} in immersive \OST-\AR using wearable vibrotactile haptics.
+In this chapter, we aim to investigate the role of \textbf{visuo-haptic feedback of the hand when manipulating virtual objects} in immersive \OST-\AR using wearable vibrotactile haptics.
We selected \textbf{four different delocalized positionings on the hand} that have been previously proposed in the literature for direct hand interaction in \AR using wearable haptic devices (\secref[related_work]{vhar_haptics}): on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand.
We focused on vibrotactile feedback, as it is used in most of the wearable haptic devices and has the lowest encumbrance.
In a \textbf{user study}, using the \OST-\AR headset Microsoft HoloLens~2 and two \ERM vibrotactile motors, we evaluated the effect of the four positionings with \textbf{two contact vibration techniques} on the user performance and experience with the same two manipulation tasks as in \chapref{visual_hand}.
-We additionally compared these vibrotactile renderings with the \textbf{skeleton-like visual hand augmentation} established in the \chapref{visual_hand} as a complementary visuo-haptic feedback of the hand interaction with the \VOs.
+We additionally compared these vibrotactile renderings with the \textbf{skeleton-like visual hand augmentation} established in the \chapref{visual_hand} as a complementary visuo-haptic feedback of the hand interaction with the virtual objects.
\noindentskip The contributions of this chapter are:
\begin{itemize}
- \item The evaluation in a user study with 20 participants of the effect of providing a vibrotactile feedback of the fingertip contacts with \VOs, during direct manipulation with bare hand in \AR, at four different delocalized positionings of the haptic feedback on the hand and with two contact vibration techniques.
+ \item The evaluation in a user study with 20 participants of the effect of providing vibrotactile feedback of the fingertip contacts with virtual objects, during direct manipulation with the bare hand in \AR, at four different delocalized positionings of the haptic feedback on the hand and with two contact vibration techniques.
\item The comparison of these vibrotactile positionings and rendering techniques with the two most representative visual hand augmentations established in the \chapref{visual_hand}.
\end{itemize}
diff --git a/3-manipulation/visuo-haptic-hand/2-method.tex b/3-manipulation/visuo-haptic-hand/2-method.tex
index 79c48fa..9d4886d 100644
--- a/3-manipulation/visuo-haptic-hand/2-method.tex
+++ b/3-manipulation/visuo-haptic-hand/2-method.tex
@@ -1,13 +1,13 @@ \section{Vibrotactile Renderings of the Hand-Object Contacts}
\label{vibration}
-The vibrotactile hand rendering provided information about the contacts between the \VO and the thumb and index fingers of the user, as they are the two fingers most used for grasping (\secref[related_work]{grasp_types}).
+The vibrotactile hand rendering provided information about the contacts between the virtual object and the thumb and index fingers of the user, as they are the two fingers most used for grasping (\secref[related_work]{grasp_types}).
We evaluated both the delocalized positioning and the contact vibration technique of the vibrotactile hand rendering.
\subsection{Vibrotactile Positionings}
\label{positioning}
-We considered five different positionings for providing the vibrotactile rendering as feedback of the contacts between the virtual hand and the \VOs, as shown in \figref{method/locations}.
+We considered five different positionings for providing the vibrotactile rendering as feedback of the contacts between the virtual hand and the virtual objects, as shown in \figref{method/locations}.
They are representative of the most common locations used by wearable haptic devices in \AR to place their end-effector, as found in the literature (\secref[related_work]{vhar_haptics}), as well as other positionings that have been employed for manipulation tasks.
For each positioning, we used two vibrating actuators, for the thumb and index finger, respectively.
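The two contact vibration techniques are only named in the surrounding text, so the following is a hypothetical sketch of how such techniques are commonly implemented, not the implementation evaluated in this chapter: a short fixed pulse triggered at the moment of contact, and an amplitude proportional to the interpenetration distance between the virtual fingertip and the object. The function names and parameter values (pulse duration, maximum penetration depth) are illustrative assumptions.

```python
# Hypothetical sketch of two contact vibration techniques (not the
# implementation evaluated in this chapter): names, pulse duration,
# and maximum penetration depth are illustrative assumptions.

def impulse_amplitude(time_since_contact_s: float, pulse_s: float = 0.05) -> float:
    """Contact technique: a short fixed-strength pulse at the moment of contact."""
    return 1.0 if 0.0 <= time_since_contact_s < pulse_s else 0.0

def distance_amplitude(penetration_m: float, max_penetration_m: float = 0.01) -> float:
    """Distance technique: amplitude grows with the interpenetration depth,
    clamped to the actuator's maximum (normalized to [0, 1])."""
    if penetration_m <= 0.0:
        return 0.0  # fingertip not in contact: no vibration
    return min(penetration_m / max_penetration_m, 1.0)
```

With such a mapping, each of the two actuators (thumb and index) would be driven independently every frame from the contact state of its own fingertip.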
@@ -44,7 +44,7 @@ Similarly, we designed the distance vibration technique (Dist) so that interpene
\section{User Study}
\label{method}
-This user study aims to evaluate whether a visuo-haptic rendering of the hand affects the user performance and experience of manipulation of \VOs with bare hands in \OST-\AR.
+This user study aims to evaluate whether a visuo-haptic rendering of the hand affects the user performance and experience when manipulating virtual objects with bare hands in \OST-\AR.
The chosen visuo-haptic hand renderings are the combination of the two most representative visual hand renderings established in the \chapref{visual_hand}, \ie \level{Skeleton} and \level{No Hand}, described in \secref[visual_hand]{hands}, with the two contact vibration techniques provided at the four delocalized positions on the hand described in \secref{vibration}.
\subsection{Experimental Design}
diff --git a/3-manipulation/visuo-haptic-hand/4-discussion.tex b/3-manipulation/visuo-haptic-hand/4-discussion.tex
index 9403364..de7bf93 100644
--- a/3-manipulation/visuo-haptic-hand/4-discussion.tex
+++ b/3-manipulation/visuo-haptic-hand/4-discussion.tex
@@ -1,7 +1,7 @@
\section{Discussion}
\label{discussion}
-We evaluated twenty visuo-haptic renderings of the hand, in the same two \VO manipulation tasks in \AR as in the \chapref{visual_hand}, as the combination of two vibrotactile contact techniques provided at five delocalized positions on the hand with the two most representative visual hand renderings established in the \chapref{visual_hand}.
+We evaluated twenty visuo-haptic renderings of the hand, combining two vibrotactile contact techniques provided at five delocalized positions on the hand with the two most representative visual hand renderings established in the \chapref{visual_hand}, in the same two virtual object manipulation tasks in \AR as in the \chapref{visual_hand}.
In the \level{Push} task, vibrotactile haptic hand rendering proved beneficial with the \level{Proximal} positioning, which registered a low completion time, but detrimental with the \level{Fingertips} positioning, which performed worse (\figref{results/Push-CompletionTime-Location-Overall-Means}) than the \level{Proximal} and \level{Opposite} (on the contralateral hand) positionings.
The cause might be the intensity of vibrations, which many participants found rather strong and possibly distracting when provided at the fingertips.
@@ -33,18 +33,18 @@ Additionally, the \level{Skeleton} rendering was appreciated and perceived as mo
Participants reported that this visual hand rendering provided good feedback on the status of the hand tracking while being constrained to the cube, and helped with rotation adjustment in both tasks.
However, many also felt that it was a bit redundant with the vibrotactile hand rendering.
Indeed, receiving a vibrotactile hand rendering was found by participants to be more accurate and reliable information regarding the contact with the cube than simply seeing the cube and the visual hand reacting to the manipulation.
-This result suggests that providing a visual hand rendering may not be useful during the grasping phase, but may be beneficial prior to contact with the \VO and during position and rotation adjustment, providing valuable information about the hand pose.
+This result suggests that providing a visual hand rendering may not be useful during the grasping phase, but may be beneficial prior to contact with the virtual object and during position and rotation adjustment, providing valuable information about the hand pose.
It is also worth noting that the improved hand tracking and grasp helper improved the manipulation of the cube with respect to the \chapref{visual_hand}, as shown by the shorter completion time during the \level{Grasp} task.
This improvement could also be the reason for the smaller differences between the \level{Skeleton} and the \level{None} visual hand renderings in this second experiment.
-In summary, the positioning of the vibrotactile haptic rendering of the hand affected on the performance and experience of users manipulating \VOs with their bare hands in \AR.
+In summary, the positioning of the vibrotactile haptic rendering of the hand affected the performance and experience of users manipulating virtual objects with their bare hands in \AR.
The closer the vibrotactile hand rendering was to the point of contact, the better it was perceived in terms of effectiveness, usefulness, and realism.
-These subjective appreciations of wearable haptic hand rendering for manipulating \VOs in \AR were also observed by \textcite{maisto2017evaluation} and \textcite{meli2018combining}.
+These subjective assessments of wearable haptic hand rendering for manipulating virtual objects in \AR were also observed by \textcite{maisto2017evaluation} and \textcite{meli2018combining}.
However, the best performance was obtained with the farthest positioning on the contralateral hand (\level{Opposite}), which is somewhat surprising.
This apparent paradox could be explained in two ways.
On the one hand, participants behaved differently when the haptic rendering was given on the fingers (\level{Fingertips} and \level{Proximal}), close to the contact point, with shorter pushes and larger grip apertures.
This behavior has likely given them a better experience of the tasks and more confidence in their actions, as well as leading to a lower interpenetration/force applied to the cube \cite{pacchierotti2015cutaneous}.
On the other hand, the unfamiliarity of the contralateral hand positioning (\level{Opposite}) caused participants to spend more time understanding the haptic stimuli, which might have made them more focused on performing the task.
In terms of the contact vibration technique, the continuous vibration technique on the finger interpenetration (\level{Distance}) did not make a difference to performance, although it provided more information.
-Participants felt that vibration bursts were sufficient (\level{Distance}) to confirm contact with the \VO.
+Participants felt that vibration bursts (\level{Impact}) were sufficient to confirm contact with the virtual object.
Finally, it was interesting to note that the visual hand rendering was appreciated but felt less necessary when provided together with vibrotactile hand rendering, as the latter was deemed sufficient for acknowledging the contact.
diff --git a/3-manipulation/visuo-haptic-hand/5-conclusion.tex b/3-manipulation/visuo-haptic-hand/5-conclusion.tex
index 0896a6d..88db6ce 100644
--- a/3-manipulation/visuo-haptic-hand/5-conclusion.tex
+++ b/3-manipulation/visuo-haptic-hand/5-conclusion.tex
@@ -1,8 +1,8 @@
\section{Conclusion}
\label{conclusion}
-In this chapter, we investigated the visuo-haptic feedback of the hand when manipulating \VOs in immersive \OST-\AR using wearable vibrotactile haptic.
-To do so, we provided vibrotactile feedback of the fingertip contacts with \VOs by moving away the haptic actuator that do not cover the inside of the hand: on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand.
+In this chapter, we investigated the visuo-haptic feedback of the hand when manipulating virtual objects in immersive \OST-\AR using wearable vibrotactile haptics.
+To do so, we provided vibrotactile feedback of the fingertip contacts with virtual objects by moving the haptic actuators away from the fingertips, to positions that do not cover the inside of the hand: on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand.
We selected these four different delocalized positions on the hand from the literature for direct hand interaction in \AR using wearable haptic devices.
In a user study, we compared twenty visuo-haptic renderings of the hand as the combination of two vibrotactile contact techniques, provided at five different delocalized positions on the user's hand, and with the two most representative visual hand augmentations established in the \chapref{visual_hand}, \ie the skeleton hand rendering and no hand rendering.
@@ -13,7 +13,7 @@ This study provide evidence that moving away the feedback from the inside of the
If integration with the hand tracking system allows it, and if the task requires it, a haptic ring worn on the middle or proximal phalanx seems preferable.
However, a wrist-mounted haptic device will be able to provide richer feedback by embedding more diverse haptic actuators with larger bandwidths and maximum amplitudes, while being less obtrusive than a ring.
-Finally, we think that the visual hand augmentation complements the haptic contact rendering well by providing continuous feedback on the hand tracking, and that it can be disabled during the grasping phase to avoid redundancy with the haptic feedback of the contact with the \VO.
+Finally, we think that the visual hand augmentation complements the haptic contact rendering well by providing continuous feedback on the hand tracking, and that it can be disabled during the grasping phase to avoid redundancy with the haptic feedback of the contact with the virtual object.
\noindentskip This work was published in Transactions on Haptics:
diff --git a/4-conclusion/conclusion.tex b/4-conclusion/conclusion.tex
index 73974ea..fd1dadc 100644
--- a/4-conclusion/conclusion.tex
+++ b/4-conclusion/conclusion.tex
@@ -6,9 +6,9 @@
\section{Summary}
In this manuscript we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR. % by augmenting the perception of the real and manipulation of the virtual.
-Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both directly touched by the hand, while preserving freedom of movement and interaction with the \RE.
+Wearable haptics can provide rich tactile feedback on virtual objects and augment the perception of real objects, both directly touched by the hand, while preserving freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical and human challenges.
-We have structured our research around two axes: \textbf{(I) modifying the visuo-haptic texture perception of real surfaces} and \textbf{(II) improving the manipulation of \VOs}.
+We have structured our research around two axes: \textbf{(I) modifying the visuo-haptic texture perception of real surfaces} and \textbf{(II) improving the manipulation of virtual objects}.
\noindentskip In \partref{perception} we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment real surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
@@ -28,17 +28,17 @@ In \chapref{vhar_textures}, we investigated the perception of co-localized visua
We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of the combination of nine visuo-haptic texture pairs.
Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics dominating perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
-\noindentskip In \partref{manipulation} we focused on improving the manipulation of \VOs directly with the hand in immersive \OST-\AR.
+\noindentskip In \partref{manipulation} we focused on improving the manipulation of virtual objects directly with the hand in immersive \OST-\AR.
Our approach was to design visual augmentations of the hand and delocalized haptic feedback, based on the literature, and evaluate them in user studies.
-We first considered \textbf{(1) the visual augmentation of the hand} and then the \textbf{(2)} combination of different \textbf{visuo-haptic feedback of the hand when manipulating \VOs}.
+We first considered \textbf{(1) the visual augmentation of the hand} and then \textbf{(2)} the combination of different \textbf{visuo-haptic feedback of the hand when manipulating virtual objects}.
In \chapref{visual_hand}, we investigated the visual feedback of the virtual hand as augmentation of the real hand.
-Seen as an \textbf{overlay on the user's hand}, such visual hand rendering provide feedback on hand tracking and interaction with \VOs.
+Seen as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on hand tracking and interaction with virtual objects.
We compared the six commonly used visual hand augmentations in the \AR literature in a user study with 24 participants, where we evaluated their effect on user performance and experience in two representative manipulation tasks.
The results showed that a visual hand augmentation improved user performance, perceived effectiveness and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}.
This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.
-In \chapref{visuo_haptic_hand}, we then investigated visuo-haptic feedback to direct hand manipulation with \VOs using wearable vibrotactile haptics.
+In \chapref{visuo_haptic_hand}, we then investigated visuo-haptic feedback for direct hand manipulation of virtual objects using wearable vibrotactile haptics.
In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand augmentations from the previous chapter.
The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when provided close to the fingertips}, and that the visual hand augmentation complemented the haptic contact feedback well in providing continuous feedback on hand tracking.
@@ -102,12 +102,12 @@ It would be interesting to determine the importance of these factors on the perc
We also rendered haptic textures captured by a hand-held probe to be touched with the bare finger, but finger-based captures of real textures should also be considered \cite{balasubramanian2024sens3}.
Finally, the virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}.
-\subsection*{Visual Augmentation of the Hand for Manipulating \VOs in AR}
+\subsection*{Visual Augmentation of the Hand for Manipulating Virtual Objects in AR}
\paragraph{Other AR Displays}
The visual hand augmentations we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
-We purposely chose this type of display because in \OST-\AR the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}.
+We purposely chose this type of display because in \OST-\AR the lack of mutual occlusion between the hand and the virtual object is the most challenging to solve \cite{macedo2023occlusion}.
We therefore hypothesized that a visual hand augmentation would be more beneficial to users with this type of display.
However, the user's visual perception and experience is different with other types of displays, such as \VST-\AR, where the \RE view is seen through cameras and screens (\secref[related_work]{ar_displays}).
While the mutual occlusion problem and the hand tracking latency could be overcome with \VST-\AR, the visual hand augmentation could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.
@@ -119,12 +119,12 @@ While these tasks are fundamental building blocks for more complex manipulation
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand augmentations received low and high rank rates from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand augmentation according to their preferences or needs, and this should also be evaluated.
-\subsection*{Visuo-Haptic Augmentation of Hand Manipulation With \VOs in AR}
+\subsection*{Visuo-Haptic Augmentation of Hand Manipulation With Virtual Objects in AR}
\paragraph{Richer Haptic Feedback}
The haptic feedback we considered was limited to vibrotactile feedback using \ERM motors.
-While the simpler contact vibration technique (Impact technique) was sufficient to confirm contact with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs.
+While the simpler contact vibration technique (Impact technique) was sufficient to confirm contact with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of virtual objects.
This will require considering a wider range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin.
More importantly, the best compromise between well-rounded haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}).
@@ -137,7 +137,7 @@ It remains to be explored how to support rendering for different and larger area
\section{Perspectives}
-Our goal was to improve direct hand interaction with \VOs using wearable haptic devices in immersive \AR by providing more plausible and coherent perception and more natural and effective manipulation of the visuo-haptic augmentations.
+Our goal was to improve direct hand interaction with virtual objects using wearable haptic devices in immersive \AR by providing more plausible and coherent perception and more natural and effective manipulation of the visuo-haptic augmentations.
Our contributions have enabled progress towards a seamless integration of the virtual into the real world.
They also allow us to outline longer-term research perspectives.
@@ -147,7 +147,7 @@ We have seen how complex the sense of touch is (\secref[related_work]{haptic_han
Multiple sensory receptors all over the skin allow us to perceive different properties of objects, such as their texture, temperature, weight or shape.
Particularly concentrated in the hands, their sensory feedback, together with the muscles, is crucial for grasping and manipulating objects.
In this manuscript, we have shown how wearable haptic devices can provide virtual tactile sensations to support direct hand interaction in immersive \AR:
-We have investigated both the visuo-haptic perception of texture augmenting real surfaces (\partref{perception}) and the manipulation of \VOs with visuo-haptic feedback of hand contact with \VOs (\partref{manipulation}).
+We have investigated both the visuo-haptic perception of textures augmenting real surfaces (\partref{perception}) and the manipulation of virtual objects with visuo-haptic feedback of hand contact with virtual objects (\partref{manipulation}).
However, unlike the visual sense, which can be fully immersed in the virtual using an \AR/\VR headset, there is no universal wearable haptic device that can reproduce all the haptic properties perceived by the hand (\secref[related_work]{wearable_haptics}).
Thus, the haptic renderings and augmentations we studied were limited to specific properties of roughness (\chapref{vhar_system}) and contact (\chapref{visuo_haptic_hand}) using vibrotactile feedback.
@@ -158,7 +158,7 @@ This would allow to assess the relative importance of visual and haptic feedback
One of the main findings of studies on the haptic perception of real objects is the importance of certain perceived properties over others in discriminating between objects \cite{hollins1993perceptual,baumgartner2013visual,vardar2019fingertip}.
It would therefore be interesting to determine which wearable haptic augmentations are most important for the perception and manipulation of virtual and augmented objects with the hand in \AR and \VR.
-Similar user studies could then be conducted, to reproduce as many haptic properties as possible in \VO discrimination tasks.
+Similar user studies could then be conducted to reproduce as many haptic properties as possible in virtual object discrimination tasks.
These results would enable the design of more universal wearable haptic devices that provide rich haptic feedback that best meets users' needs for interaction in \AR and \VR.
% systematic exploration of the parameter space of the haptic rendering to determine the most important parameters their influence on the perception
@@ -169,7 +169,7 @@ These results would enable the design of more universal wearable haptic devices
We have reviewed the diversity of \AR and \VR displays and their respective characteristics in rendering (\secref[related_work]{ar_displays}) and the manipulation of virtual content with the hand (\chapref{visual_hand}).
The diversity of wearable haptic devices and the different sensations they can provide is even more important (\secref[related_work]{wearable_haptics}) and an active research topic \cite{pacchierotti2017wearable}.
Coupling wearable haptics with immersive \AR also requires the haptic actuator to be placed on the body other than at the hand contact points (\secref[related_work]{vhar_haptics}).
-In particular, in this thesis we have investigated the perception of haptic texture augmentation using a vibrotactile device on the median phalanx (\chapref{vhar_system}) and also compared different positions of the haptics on the hand for manipulating \VOs (\chapref{visuo_haptic_hand}).
+In particular, in this thesis we have investigated the perception of haptic texture augmentation using a vibrotactile device on the middle phalanx (\chapref{vhar_system}) and also compared different positions of the haptics on the hand for manipulating virtual objects (\chapref{visuo_haptic_hand}).
Haptic feedback should be provided close to the point of contact of the hand with the virtual, to enhance the realism of texture augmentation (\chapref{vhar_textures}) and to render contact with virtual objects (\chapref{visuo_haptic_hand}), \eg rendering fingertip contact with a haptic ring worn on the middle or proximal phalanx.
However, the task at hand, the user's sensitivity and preferences, the limitations of the tracking system, or the ergonomics of the haptic device may require the use of other form factors and positions, such as the wrist or arm.
diff --git a/config/acronyms.tex b/config/acronyms.tex
index cd49a76..9aec5e2 100644
--- a/config/acronyms.tex
+++ b/config/acronyms.tex
@@ -35,12 +35,9 @@
% Style the acronyms
\renewcommand*{\glstextformat}[1]{\textcolor{black}{#1}} % Hyperlink in black
-\let\AE\undefined
-
\acronym[TIFC]{2IFC}{two-interval forced choice}
\acronym[TwoD]{2D}{two-dimensional}
\acronym[ThreeD]{3D}{three-dimensional}
-\acronym{AE}{augmented environment}
\acronym{ANOVA}{analysis of variance}
\acronym{ART}{aligned rank transform}
\acronym{AR}{augmented reality}
@@ -62,6 +59,5 @@
\acronym{RE}{real environment}
\acronym{UI}{user interface}
\acronym{VE}{virtual environment}
-\acronym{VO}{virtual object}
\acronym{VR}{virtual reality}
\acronym{VST}{visual see-through}