diff --git a/1-background/introduction/introduction.tex b/1-background/introduction/introduction.tex index 7f463af..a565c2d 100644 --- a/1-background/introduction/introduction.tex +++ b/1-background/introduction/introduction.tex @@ -21,7 +21,7 @@ Information from different sensory sources can be complementary, redundant or co This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations. We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}. -The sense of touch also allows us to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects. +The sense of touch not only allows us to perceive our environment, but also to represent ourselves in space and interact with the surrounding objects. This is due to the many sensory receptors distributed throughout our hands and body. These receptors can be divided into two modalities: \emph{kinesthetic} (or proprioception), which are the forces felt by muscles and tendons, and \emph{cutaneous} (or tactile), which are the pressures, stretches, vibrations and temperatures felt by the skin. This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}. @@ -32,7 +32,7 @@ This rich and complex variety of actions and sensations makes it particularly \t Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}. \emph{Graspable interfaces} are the traditional haptic devices that are held in the hand. They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone. -\emph{Touchable interfaces} are actuated devices that are directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback. +\emph{Touchable interfaces} are actuated devices that are directly touched and can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback. However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface. Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}. @@ -51,12 +51,12 @@ Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback. \figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, augmented and virtual realities, and social interaction \cite{pacchierotti2017wearable,culbertson2018haptics}. -However, the \textbf{integration of wearable haptics with \AR has been little explored so far}. %, and few wearable haptic devices have specifically designed or experimentally tested for direct hand interaction in \AR. +However, the \textbf{integration of wearable haptics with \AR has been little explored}. %, and few wearable haptic devices have specifically designed or experimentally tested for direct hand interaction in \AR. \begin{subfigs}{wearable-haptics}{ Wearable haptic devices can provide sensations on the skin as feedback to real or virtual objects being touched. }[][ - \item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}. + \item Wolverine, a wearable exoskeleton that simulates the contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}. \item Touch\&Fold, a nail-mounted wearable haptic device that folds on demand to provide contact, normal force and vibration to the fingertip \cite{teng2021touch}. \item The hRing, a wearable haptic ring mounted on the proximal phalanx able to render normal and shear forces to the finger \cite{pacchierotti2016hring}. \item Tasbi, a haptic bracelet capable of providing pressure and vibrotactile feedback to the wrist \cite{pezent2022design}. @@ -71,17 +71,17 @@ However, the \textbf{integration of wearable haptics with \AR has been little ex \subsectionstarbookmark{Augmented Reality Is Not Only Visual} \textbf{\emph{Augmented Reality (\AR)} integrates virtual content into the real world perception, creating the illusion of a unique \emph{augmented environment}} \cite{azuma1997survey,skarbez2021revisiting}. -It thus promises natural and seamless interaction with the physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}. -It is technically and conceptually closely related to \emph{\VR}, which completely replaces the \emph{\RE} perception with a \emph{\VE}. +It thus promises natural and seamless interaction with physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}. +It is technically and conceptually closely related to \emph{\VR}, which completely replaces the perception of the \emph{\RE} with a \emph{\VE}. \AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects into a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}. -It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies). +It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (as in \emph{The Matrix} movies).
Between these two extremes lies \MR, which includes \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still debate about how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.} We call \AR/\VR \emph{systems} the computational set of hardware (input devices, sensors and displays) and software (tracking, simulation and rendering) that allows the user to interact with the \VE. % by implementing the interaction loop we proposed in \figref{interaction-loop}. -\AR and \VR systems can address any of the human senses, but they most often focus on visual augmentations \cite[p.144]{billinghurst2015survey}. +\AR and \VR systems can address any of the human senses, but most focus on visual augmentation \cite[p.144]{billinghurst2015survey} and \cite{kim2018revisiting}. Many visual displays have been explored, from projection systems to hand-held displays. -\textbf{\AR headsets are the most promising displays as they are portable and provide the user with an immersive augmented environment.} +\textbf{\AR headsets are the most promising displays as they are portable and provide the user with an immersive augmented environment} \cite{hertel2021taxonomy}. %but the most \textbf{promising devices are \AR headsets}, which are \textbf{portable displays worn directly on the head}, providing the user with an \textbf{immersive visual augmented environment}. \begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[][ @@ -95,16 +95,16 @@ Many visual displays have been explored, from projection systems to hand-held di %Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision. \textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR systems with two orthogonal reality-virtuality continuums, one for vision and one for touch, as shown in \figref{visuo-haptic-rv-continuum5}. The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual. -For example, (visual) \AR using a real object as a proxy to manipulate a \VO is considered to be \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered to be \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}). +For example, (visual) \AR using a real object as a proxy to manipulate a \VO is considered \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}). \textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli} \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}). 
In particular, it has been implemented by augmenting the haptic perception of real objects by providing timely virtual tactile stimuli using wearable haptics: \figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}). -\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger on a real surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum5}). +\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger over a real surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum5}). If a (visual) \AR system lacks haptic feedback, it creates a deceptive and incomplete user experience when the hand reaches the virtual content. All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and interact with them with confidence and efficiency. -It is therefore necessary to provide haptic feedback that is consistent with the visual \VOs and ensures the best possible user experience, as we argue in the next section. -The \textbf{integration of wearable haptics with \AR} seems to be one of the most promising solutions, but it \textbf{remains challenging due to their many respective characteristics and the additional constraints of combining them}. +It is therefore necessary to provide haptic feedback that is coherent with the visual \VOs and ensures the best possible user experience, as we argue in the next section. +The \textbf{integration of wearable haptics with \AR} appears to be one of the most promising solutions, but it \textbf{remains challenging due to their many respective characteristics and the additional constraints of combining them}. \begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][ \item \AR environment with a real haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}. @@ -132,7 +132,7 @@ Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, \fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment as proposed in this thesis.}[ A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs. - The visual and haptic \VEs are rendered back using an immersive \AR headset and wearable haptics, and are felt by the user to be registered and co-localized with the \RE (in gray) + The visual and haptic \VEs are rendered back using an immersive \AR headset and wearable haptics, and are perceived by the user to be registered and co-localized with the \RE (in gray). \protect\footnotemark ] @@ -151,23 +151,23 @@ Each of these challenges also raises numerous design, technical and human issues \subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations} -\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing realistic and varied kinesthetic and tactile feedback to \VOs. 
+\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing varied kinesthetic and tactile feedback to \VOs, and adding realism when interacting with them \cite{culbertson2018haptics}. Although closely related, \AR and \VR have key differences in their respective renderings that can affect user perception. %As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects. Many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR. -The \textbf{user's hand must be indeed free to touch and interact with the \RE while wearing a wearable haptic device}. -Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content. +The \textbf{user's hand must be free to touch and interact with the \RE while wearing a wearable haptic device}. +Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contact with virtual content. Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are perceived as co-localized, but the virtual haptic feedback is not}. It remains to be investigated how such potential discrepancies affect the overall perception to design visuo-haptic augmentations adapted to \AR. %So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations. %Visual and haptic augmentations of the \RE add sensations to the user's overall perception. -The \textbf{added visual and haptic virtual sensations may be perceived as inconsistent} with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these. -Moreover, in \AR, the user can still see the real world environment, including their hands, the augmented real objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE. -It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or even plausible, and to what extent they will conflict or complement each other. % in the perception of the \AE. -With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic augmentations adapted to \AR can be designed. 
+The \textbf{added visual and haptic virtual sensations may be perceived as incoherent} with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these. +Moreover, in \AR, the user can still see the real world environment, including their hands, augmented real objects and worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE. +It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other. % in the perception of the \AE. +With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed. \subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment} @@ -178,24 +178,24 @@ However, \textbf{manipulating a purely \VO with the bare hand can be challenging In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency. \AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world. -However, the depth perception of the \VOs is often underestimated \cite{peillard2019studying,adams2022depth}. +However, the depth perception of \VOs is often underestimated \cite{peillard2019studying,adams2022depth}. There is also often \textbf{a lack of mutual occlusion between the hand and a \VO}, \ie that the hand can hide the object or be hidden by the object \cite{macedo2023occlusion}. Finally, as illustrated in \figref{interaction-loop}, interaction with a \VO is an illusion, because the real hand controls in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE. -Therefore, there is inevitably a latency delay between the real hand's movements and the return movements of the \VO, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the \VO touched \cite{prachyabrued2014visual}. +Therefore, there is inevitably a latency between the movements of the real hand and the return movements of the \VO, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the \VO touched \cite{prachyabrued2014visual}. These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp the \VO and move it to a desired location. Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a \VO with the hand. -Yet, it is unclear which type of visual and haptic feedback, or their combination, is the best suited to guide the \VO manipulation.%, and whether one or the other of a combination of the two is most beneficial for users. +Yet, it is unclear which type of visual and haptic feedback, or their combination, is best suited to guide the manipulation of a \VO. %, and whether one or the other of a combination of the two is most beneficial for users.
\section{Approach and Contributions} \label{contributions} %The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects. As described in the Research Challenges section above, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues. -Our approach is to +Our approach is to: \begin{enumerate*}[label=(\arabic*)] \item design immersive and wearable visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and -\item evaluate in user studies how these visuo-haptic renderings affect the interaction of the hand with these objects using psychophysical, performance, and user experience methods. +\item evaluate in user studies how these visuo-haptic renderings affect the interaction of the hand with these objects using psychophysical, performance and user experience methods. \end{enumerate*} We consider two main axes of research, each addressing one of the research challenges identified above: @@ -206,55 +206,55 @@ We consider two main axes of research, each addressing one of the research chall Our contributions in these two axes are summarized in \figref{contributions}. \fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[ - The contributions are represented in dark grey boxes, and the research axes in light green circles. + The contributions are represented in dark grey boxes, the research axes in dark green circles and the research objectives in light green circles. The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand. The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback. ] \subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces} -Wearable haptic devices have proven to be effective in altering the perception of a touched real surface, without modifying the object or covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}. +Wearable haptic devices have proven effective in modifying the perception of a touched real surface, without altering the object or covering the fingertip, forming a haptic augmentation \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}. %It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement. %It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator. -However, wearable haptic augmentations with \AR have been little explored, as well as the visuo-haptic augmentation of texture. -Texture is indeed one of the most fundamental perceived property of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic (only, without visual) augmentation \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
-Being able to coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way. +However, wearable haptic augmentation with \AR has been little explored, as well as the visuo-haptic augmentation of texture. +Texture is indeed one of the most fundamental perceived properties of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic (only, without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}. +Coherently substituting the visuo-haptic texture of a surface directly touched by a finger is an important step towards an \AR capable of visually and haptically augmenting the \RE of a user in a plausible way. For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device. -To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, to (2) evaluate how the perception of haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations. +To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, which we use to (2) evaluate how the perception of haptic texture augmentations is affected by visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations. -First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}. -Yet, to achieve the natural interaction with the hand and a coherent visuo-haptic feedback, it requires a real time rendering of the textures, no constraints on the hand movements, and a good synchronization between the visual and haptic feedback. -Thus, our first objective is to \textbf{design an immersive, real time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on real surfaces. -It will form the basis of the next two chapters in this section. +First, an effective approach to render haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}. +Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real time rendering of the textures, no constraints on hand movements, and good synchronization between the visual and haptic feedback. +Thus, our first objective is to \textbf{design an immersive, real time system} that allows free exploration of \textbf{wearable visuo-haptic texture augmentations} on real surfaces with the bare hand. +This will form the basis of the next two chapters in this section. -Second, many works have investigated the haptic augmentations of texture, but none have integrated them with \AR and \VR, or have considered the influence of the visual feedback on their perception.
-Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}. -Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentation is affected by the visual feedback of the virtual hand and the environment} (real, augmented, or virtual). +Second, many works have investigated the haptic augmentations of texture, but none have integrated them with \AR and \VR, or considered the influence of visual feedback on their perception. +Still, it is known that visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}. +Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentation is affected by the visual feedback of the virtual hand and the environment} (real, augmented or virtual). Finally, some visuo-haptic texture databases have been created from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3} to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}. However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated. -Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR.%, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture. +Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR. %, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture. \subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects} -In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects. +In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented and virtual objects. Hence, a user can expect natural and direct contact and manipulation of \VOs with the bare hand. -However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make the interaction with \VOs particularly challenging. +However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with \VOs particularly challenging. 
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging. -Two particular sensory feedbacks are known to improve such direct \VO manipulation, but they have not been properly investigated in immersive \AR: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic rendering \cite{lopes2018adding,teng2021touch}. +Two particular sensory feedbacks are known to improve such direct \VO manipulation, but have not been properly investigated in immersive \AR: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}. For this second axis of research, we propose to design and evaluate \textbf{visuo-haptic augmentations of the hand as interaction feedback with \VOs} in immersive \OST-\AR. -We consider the effect on the user performance an experience of (1) the visual rendering as hand augmentation and (2) combination of different visuo-haptic rendering of the hand manipulation with \VOs +We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of \VO manipulation with the hand in combination with visual hand augmentations. -First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}. -Some work has also investigated the visual rendering of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of \VO manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand. -\OST-\AR also has significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect the user experience and performance \cite{yoon2020evaluating}. +First, the visual feedback of the virtual hand is a key element for interacting with and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}. +Some work has also investigated the visual feedback of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of \VO manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand. +\OST-\AR also has significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}. %, and these visual hand augmentations have not been evaluated . -Thus, our fourth objective is to \textbf{investigate the visual rendering as a hand augmentation} for direct hand manipulation of \VOs in \OST-\AR.
+Thus, our fourth objective is to \textbf{investigate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of \VOs. -Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations, and interactions with the \RE. -Previous works have shown that wearable haptics that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience \cite{maisto2017evaluation,meli2018combining}. -However, it is unclear which positioning of the actuator is the most beneficial nor how a haptic augmentation of the hand compares or complements with a visual augmentation of the hand. -Our last objective is to \textbf{investigate the visuo-haptic rendering of \VO manipulation with the hand} in \OST-\AR using wearable vibrotactile haptics. +Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations and interactions with the \RE. +Previous work has shown that wearable haptics that provide feedback on hand manipulation with \VOs in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}. +However, it is unclear which positioning of the actuator is most beneficial and how delocalized haptic feedback of the hand-object contacts compares with or complements a visual augmentation of the hand. +Our last objective is to \textbf{investigate the delocalized haptic feedback of \VO manipulation} with the hand, in \textbf{combination with visual augmentations of the hand}, using wearable vibrotactile haptics. \section{Thesis Overview} \label{thesis_overview} @@ -273,7 +273,7 @@ We then address each of our two research axes in a dedicated part. \noindentskip In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces. -We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger. +We evaluate how the visual feedback of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger. In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device. The haptic textures represent a periodical patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface. @@ -282,20 +282,20 @@ The visual rendering is done using the immersive \OST-\AR headset Microsoft Holo The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters. In \textbf{\chapref{xr_perception}} we investigate in a user study how different the perception of haptic texture augmentations is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
-We use psychophysical methods to measure the user perception, and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented, or virtual). +We use psychophysical methods to measure user perception and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented or virtual). In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations directly touched with the real hand in \AR. The virtual textures are paired visual and haptic captures of real surfaces \cite{culbertson2014one}, which we render as visual and haptic overlays on the touched augmented surfaces. Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs. \noindentskip -In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving the direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback with \VOs in immersive \OST-\AR. +In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback with \VOs in immersive \OST-\AR. -In \textbf{\chapref{visual_hand}} we investigate in a user study of six visual renderings as hand augmentations, as a set of the most popular hand renderings in the \AR literature. -Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand. +In \textbf{\chapref{visual_hand}} we investigate in a user study six visual feedbacks as hand augmentations, a set of the most popular hand augmentations in the \AR literature. +Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand. -In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study the visuo-haptic rendering of direct hand manipulation of \VO with two vibrotactile contact techniques, provided at four different positionings on the user's hand. %, as haptic rendering of \VO manipulation with the hand. -They are compared to the two most representative visual hand renderings from the previous chapter, resulting in sixteen visuo-haptic hand renderings that are evaluated within the same experimental setup and design. +In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study delocalized haptic feedback of hand manipulation with \VOs, using two vibrotactile contact techniques provided at five different positionings on the hand. %, as haptic rendering of \VO manipulation with the hand. +They are compared with the two most representative visual hand augmentations from the previous chapter, resulting in twenty visuo-haptic hand feedbacks that are evaluated within the same experimental setup and design. \noindentskip In \textbf{\chapref{conclusion}} we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.
diff --git a/2-perception/xr-perception/5-discussion.tex b/2-perception/xr-perception/5-discussion.tex index 628def1..ea0f01c 100644 --- a/2-perception/xr-perception/5-discussion.tex +++ b/2-perception/xr-perception/5-discussion.tex @@ -10,7 +10,7 @@ A \PSE difference in the same range was found for perceived stiffness, with the Surprisingly, the \PSE of the \level{Real} rendering was shifted to the right (to be "rougher", \percent{7.9}) compared to the reference texture, whereas the \PSEs of the \level{Virtual} (\percent{5.1}) and \level{Mixed} (\percent{1.9}) renderings were perceived as \enquote{smoother} and closer to the reference texture (\figref{results/trial_predictions}). The sensitivity of participants to roughness differences also varied, with the \level{Real} rendering having the best \JND (\percent{26}), followed by the \level{Virtual} (\percent{30}) and \level{Mixed} (\percent{33}) renderings (\figref{results/trial_jnds}). These \JND values are in line with and at the upper end of the range of previous studies \cite{choi2013vibrotactile}, which may be due to the location of the actuator on the top of the finger middle phalanx, being less sensitive to vibration than the fingertip. -Thus, compared to no visual rendering (\level{Real}), the addition of a visual rendering of the hand or environment reduced the roughness sensitivity (\JND) and the roughness perception (\PSE), as if the virtual vibrotactile textures felt \enquote{smoother}. +Thus, compared to no visual rendering (\level{Real}), the addition of visual feedback of the hand or environment reduced the roughness sensitivity (\JND) and the roughness perception (\PSE), as if the virtual vibrotactile textures felt \enquote{smoother}. Differences in user behaviour were also observed between the visual renderings (but not between the haptic textures). On average, participants responded faster (\percent{-16}), explored textures at a greater distance (\percent{+21}) and at a higher speed (\percent{+16}) without visual augmentation (\level{Real} rendering) than in \VR (\level{Virtual} rendering) (\figref{results_finger}). diff --git a/3-manipulation/visual-hand/1-introduction.tex b/3-manipulation/visual-hand/1-introduction.tex index aa67801..c502b8e 100644 --- a/3-manipulation/visual-hand/1-introduction.tex +++ b/3-manipulation/visual-hand/1-introduction.tex @@ -15,14 +15,14 @@ We \textbf{evaluate in a user study}, using the \OST-\AR headset Microsoft HoloL \noindentskip The main contributions of this chapter are: \begin{itemize} \item A comparison from the literature of the six most common visual hand renderings used to interact with \VOs in \AR. - \item A user study evaluating with 24 participants the performance and user experience of the six visual hand renderings superimposed on the real hand during free and direct hand manipulation of \VOs in \OST-\AR. + \item A user study evaluating with 24 participants the performance and user experience of the six visual hand renderings as augmentation of the real hand during free and direct hand manipulation of \VOs in \OST-\AR. \end{itemize} \noindentskip In the next sections, we first present the six visual hand renderings we considered and gathered from the literature. We then describe the experimental setup and design, the two manipulation tasks, and the metrics used. We present the results of the user study and discuss the implications of these results for the manipulation of \VOs directly with the hand in \AR.
\bigskip -\begin{subfigs}{hands}{The six visual hand renderings.}[ +\begin{subfigs}{hands}{The six visual hand renderings as augmentation of the real hand.}[ As seen by the user through the \AR headset during the two-finger grasping of a virtual cube. ][ \item No visual rendering \level{(None)}. diff --git a/3-manipulation/visual-hand/5-conclusion.tex b/3-manipulation/visual-hand/5-conclusion.tex index db369a4..0911032 100644 --- a/3-manipulation/visual-hand/5-conclusion.tex +++ b/3-manipulation/visual-hand/5-conclusion.tex @@ -1,17 +1,17 @@ \section{Conclusion} \label{conclusion} -In this chapter, we addressed the challenge of touching, grasping and manipulating \VOs directly with the hand in immersive \OST-\AR by providing and evaluating visual renderings as hand augmentation. +In this chapter, we addressed the challenge of touching, grasping and manipulating \VOs directly with the hand in immersive \OST-\AR by providing and evaluating visual renderings as augmentation of the real hand. Superimposed on the user's hand, these visual renderings provide feedback from the virtual hand, which tracks the real hand, and simulates the interaction with \VOs as a proxy. We first selected and compared the six most popular visual hand renderings used to interact with \VOs in \AR. Then, in a user study with 24 participants and an immersive \OST-\AR headset, we evaluated the effect of these six visual hand renderings on the user performance and experience in two representative manipulation tasks. -Our results showed that a visual hand rendering overlaying the real hand improved the performance, perceived effectiveness and confidence of participants compared to no rendering. +Our results showed that a visual hand augmentation improved the performance, perceived effectiveness and confidence of participants compared to no augmentation. A skeleton rendering, which provided a detailed view of the tracked joints and phalanges while not hiding the real hand, was the most performant and effective. The contour and mesh renderings were found to mask the real hand, while the tips rendering was controversial. The occlusion rendering had too much tracking latency to be effective. This is consistent with similar manipulation studies in \VR and in non-immersive \VST-\AR setups. -This study suggests that a \ThreeD visual hand rendering is important in \AR when interacting with a virtual hand technique, particularly when it involves precise finger movements in relation to virtual content, \eg \ThreeD windows, buttons and sliders, or more complex tasks, such as stacking or assembly. -A minimal but detailed rendering of the hand that does not hide the real hand, such as the skeleton rendering we evaluated, seems to be the best compromise between the richness and effectiveness of the feedback. +This study suggests that a \ThreeD visual hand augmentation is important in \AR when interacting with a virtual hand technique, particularly when it involves precise finger movements in relation to virtual content, \eg \ThreeD windows, buttons and sliders, or more complex tasks, such as stacking or assembly. +A minimal but detailed rendering of the virtual hand that does not hide the real hand, such as the skeleton rendering we evaluated, seems to be the best compromise between the richness and effectiveness of the feedback. %Still, users should be able to choose and adapt the visual hand rendering to their preferences and needs.
diff --git a/3-manipulation/visual-hand/visual-hand.tex b/3-manipulation/visual-hand/visual-hand.tex index e09d07e..d46c8fa 100644 --- a/3-manipulation/visual-hand/visual-hand.tex +++ b/3-manipulation/visual-hand/visual-hand.tex @@ -1,4 +1,4 @@ -\chapter{Visual Rendering of the Hand for Manipulating Virtual Objects in AR} +\chapter{Visual Augmentation of the Hand for Manipulating Virtual Objects in AR} \mainlabel{visual_hand} \chaptertoc diff --git a/3-manipulation/visuo-haptic-hand/1-introduction.tex b/3-manipulation/visuo-haptic-hand/1-introduction.tex index 537b775..a9dfc15 100644 --- a/3-manipulation/visuo-haptic-hand/1-introduction.tex +++ b/3-manipulation/visuo-haptic-hand/1-introduction.tex @@ -3,26 +3,26 @@ Providing haptic feedback during free-hand manipulation in \AR is not trivial, as wearing haptic devices on the hand might affect the tracking capabilities of the system \cite{pacchierotti2016hring}. Moreover, it is important to leave the user capable of interacting with both virtual and real objects, avoiding the use of haptic interfaces that cover the fingertips or palm. -For this reason, it is often considered beneficial to move the point of application of the haptic rendering elsewhere on the hand (\secref[related_work]{vhar_haptics}). -However, the impact of the positioning of the haptic rendering on the hand during direct hand manipulation in \AR has not been systematically studied. +For this reason, it is often considered beneficial to move the point of application of the haptic feedback elsewhere on the hand (\secref[related_work]{vhar_haptics}). +However, the impact of the positioning of the haptic feedback on the hand during direct hand manipulation in \AR has not been systematically studied. Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of \VOs with the hand. \textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings. Their results proved that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is more significant when the visual feedback is limited. -A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred \cite{maisto2017evaluation,meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users. +A final question is whether one or the other of these (haptic or visual) hand feedbacks should be preferred \cite{maisto2017evaluation,meli2018combining}, or whether a combined visuo-haptic feedback is beneficial for users. However, these studies were conducted in non-immersive setups, with a screen displaying the \VE view. -In fact, both hand renderings can provide sufficient sensory feedback for efficient direct hand manipulation of \VOs in \AR, or conversely, they can be shown to be complementary. +In fact, both hand feedbacks can provide sufficient sensory feedback for efficient direct hand manipulation of \VOs in \AR, or conversely, they can be shown to be complementary. -In this chapter, we aim to investigate the role of \textbf{visuo-haptic rendering of \VO manipulation with the hand} in immersive \OST-\AR using wearable vibrotactile haptics.
+In this chapter, we aim to investigate the role of \textbf{visuo-haptic feedback of the hand when manipulating \VOs} in immersive \OST-\AR using wearable vibrotactile haptics. We selected \textbf{four different delocalized positionings on the hand} that have been previously proposed in the literature for direct hand interaction in \AR using wearable haptic devices (\secref[related_work]{vhar_haptics}): on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand. We focused on vibrotactile feedback, as it is used in most of the wearable haptic devices and has the lowest encumbrance. In a \textbf{user study}, using the \OST-\AR headset Microsoft HoloLens~2 and two \ERM vibrotactile motors, we evaluated the effect of the four positionings with \textbf{two contact vibration techniques} on the user performance and experience with the same two manipulation tasks as in \chapref{visual_hand}. -We additionally compared these vibrotactile renderings with the \textbf{skeleton-like visual hand rendering} established in the \chapref{visual_hand} as a complementary visuo-haptic feedback of the hand interaction with the \VOs. +We additionally compared these vibrotactile renderings with the \textbf{skeleton-like visual hand augmentation} established in the \chapref{visual_hand} as a complementary visuo-haptic feedback of the hand interaction with the \VOs. \noindentskip The contributions of this chapter are: \begin{itemize} - \item The evaluation in a user study with 20 participants of the effect of providing a vibrotactile feedback of the fingertip contacts with \VOs, during direct manipulation with bare hand in \AR, at four different delocalized positionings of the haptic rendering on the hand and with two contact vibration techniques. - \item The comparison of these vibrotactile positionings and renderings techniques with the two most representative visual hand renderings established in the \chapref{visual_hand}. + \item The evaluation in a user study with 20 participants of the effect of providing vibrotactile feedback of the fingertip contacts with \VOs, during direct manipulation with the bare hand in \AR, at four different delocalized positionings of the haptic feedback on the hand and with two contact vibration techniques. + \item The comparison of these vibrotactile positionings and rendering techniques with the two most representative visual hand augmentations established in the \chapref{visual_hand}. \end{itemize} \noindentskip In the next sections, we first describe the four delocalized positionings and the two contact vibration techniques we considered, based on previous work. We then present the experimental setup and design of the user study. Finally, we report the results and discuss them in the context of the free hand interaction with virtual content in \AR. diff --git a/3-manipulation/visuo-haptic-hand/4-discussion.tex b/3-manipulation/visuo-haptic-hand/4-discussion.tex index 2e712a9..9403364 100644 --- a/3-manipulation/visuo-haptic-hand/4-discussion.tex +++ b/3-manipulation/visuo-haptic-hand/4-discussion.tex @@ -1,7 +1,7 @@ \section{Discussion} \label{discussion} -We evaluated sixteen visuo-haptic renderings of the hand, in the same two \VO manipulation tasks in \AR as in the \chapref{visual_hand}, as the combination of two vibrotactile contact techniques provided at four delocalized positions on the hand with the two most representative visual hand renderings established in the \chapref{visual_hand}.
+We evaluated twenty visuo-haptic renderings of the hand, in the same two \VO manipulation tasks in \AR as in the \chapref{visual_hand}, combining two vibrotactile contact techniques, provided at five delocalized positions on the hand, with the two most representative visual hand renderings established in that chapter. In the \level{Push} task, vibrotactile haptic hand rendering has been proven beneficial with the \level{Proximal} positioning, which registered a low completion time, but detrimental with the \level{Fingertips} positioning, which performed worse (\figref{results/Push-CompletionTime-Location-Overall-Means}) than the \level{Proximal} and \level{Opposite} (on the contralateral hand) positionings. The cause might be the intensity of vibrations, which many participants found rather strong and possibly distracting when provided at the fingertips. diff --git a/3-manipulation/visuo-haptic-hand/5-conclusion.tex b/3-manipulation/visuo-haptic-hand/5-conclusion.tex index 95508d3..65922b6 100644 --- a/3-manipulation/visuo-haptic-hand/5-conclusion.tex +++ b/3-manipulation/visuo-haptic-hand/5-conclusion.tex @@ -1,16 +1,16 @@ \section{Conclusion} \label{conclusion} -In this chapter, we investigated the visuo-haptic rendering as feedback of the direct hand manipulation with \VOs in immersive \OST-\AR using wearable vibrotactile haptic. -To do so, we provided vibrotactile feedback of the fingertip contacts with \VOs during direct hand manipulation by moving away the haptic actuator that do not cover the inside of the hand: on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand. +In this chapter, we investigated the visuo-haptic feedback of the hand when manipulating \VOs in immersive \OST-\AR using wearable vibrotactile haptics. +To do so, we provided vibrotactile feedback of the fingertip contacts with \VOs by moving the haptic actuator away to positions that do not cover the inside of the hand: the nails, the proximal phalanges, the wrist, and the nails of the opposite hand. We selected these four different delocalized positions on the hand from the literature for direct hand interaction in \AR using wearable haptic devices. -In a user study, we compared sixteen visuo-haptic renderings of the hand as the combination of two vibrotactile contact techniques, provided at four different delocalized positions on the user's hand, and with the two most representative visual hand renderings established in the \chapref{visual_hand}, \ie the skeleton hand rendering and no hand rendering. +In a user study, we compared twenty visuo-haptic feedbacks of the hand as the combination of two vibrotactile contact techniques, provided at five different delocalized positions on the user's hand, and with the two most representative visual hand augmentations established in the \chapref{visual_hand}, \ie the skeleton hand rendering and no hand rendering. -Results showed that delocalized vibrotactile haptic hand rendering improved the perceived effectiveness, realism, and usefulness when it is provided close to the contact point. +Results showed that delocalized vibrotactile haptic hand feedback improved the perceived effectiveness, realism, and usefulness when provided close to the contact point. However, the farthest positioning on the contralateral hand gave the best performance even though it was disliked: the unfamiliarity of the positioning probably caused the participants to take more effort to consider the haptic stimuli and to focus more on the task.
-The visual hand rendering was perceived less necessary than the vibrotactile haptic hand rendering, but still provided a useful feedback on the hand tracking. +The visual hand augmentation was perceived as less necessary than the vibrotactile haptic feedback, but still provided useful feedback on the hand tracking. This study provides evidence that moving the feedback away from the inside of the hand is a simple but promising approach for wearable haptics in \AR. If integration with the hand tracking system allows it, and if the task requires it, a haptic ring worn on the middle or proximal phalanx seems preferable. However, a wrist-mounted haptic device will be able to provide richer feedback by embedding more diverse haptic actuators with larger bandwidths and maximum amplitudes, while being less obtrusive than a ring. -Finally, we think that the visual hand rendering complements the haptic hand rendering well by providing continuous feedback on the hand tracking, and that it can be disabled during the grasping phase to avoid redundancy with the haptic feedback of the contact with the \VO. +Finally, we think that the visual hand augmentation complements the haptic contact rendering well by providing continuous feedback on the hand tracking, and that it can be disabled during the grasping phase to avoid redundancy with the haptic feedback of the contact with the \VO. diff --git a/3-manipulation/visuo-haptic-hand/visuo-haptic-hand.tex b/3-manipulation/visuo-haptic-hand/visuo-haptic-hand.tex index 457a326..cedfbd1 100644 --- a/3-manipulation/visuo-haptic-hand/visuo-haptic-hand.tex +++ b/3-manipulation/visuo-haptic-hand/visuo-haptic-hand.tex @@ -1,4 +1,4 @@ -\chapter{Visuo-Haptic Rendering of Hand Manipulation with Virtual Objects in AR} +\chapter{Visuo-Haptic Augmentation of Hand Manipulation with Virtual Objects in AR} \mainlabel{visuo_haptic_hand} \chaptertoc diff --git a/4-conclusion/conclusion.tex b/4-conclusion/conclusion.tex index 08b2189..73974ea 100644 --- a/4-conclusion/conclusion.tex +++ b/4-conclusion/conclusion.tex @@ -5,53 +5,53 @@ \section{Summary} -In this manuscript, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual. -Wearable haptics can provide a rich tactile feedback on \VOs and augment the perception of real objects, both directly touched by the hand, while preserving the freedom of movement and interaction with the \RE. -However, their integration with \AR is still in its infancy, and presents many design, technical and human challenges. -We have structured our research around two axes: \textbf{(I) modifying the texture perception of real surfaces}, and \textbf{(II) improving the manipulation of \VOs}. +In this manuscript we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR. % by augmenting the perception of the real and manipulation of the virtual. +Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both directly touched by the hand, while preserving freedom of movement and interaction with the \RE. +However, their integration with \AR is still in its infancy and presents many design, technical and human challenges. +We have structured our research around two axes: \textbf{(I) modifying the visuo-haptic texture perception of real surfaces} and \textbf{(II) improving the manipulation of \VOs}.
\noindentskip In \partref{perception} we focused on modifying the perception of real surfaces with wearable and immersive virtual visuo-haptic textures. Texture is a fundamental property of an object, perceived equally by sight and touch. -It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR. +It is also one of the most studied haptic augmentations, but has not yet been integrated into \AR or \VR. We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual feedback of the virtual hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}. In \chapref{vhar_system}, we presented a system for \textbf{augmenting any real surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger. -It allows a \textbf{free visual and touch exploration} of the textures, as if they were real, allowing the user to view them from different angles and touch them with the bare finger without constraints on hand movements. +It allows \textbf{free visual and touch exploration} of the textures as if they were real, letting the user view them from different angles and touch them with the bare finger without constraints on hand movement. The user studies in the next two chapters are based on this system. In \chapref{xr_perception} we explored how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and the environment, whether it is real, augmented or virtual. -We augmented the perceived roughness of the real surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view. +We augmented the perceived roughness of the real surface with virtual vibrotactile patterned textures and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view. We then conducted a psychophysical user study with 20 participants and extensive questionnaires to evaluate the perceived roughness augmentation in these three visual conditions. The textures were perceived as \textbf{rougher when touched with the real hand alone compared to a virtual hand} in either \AR or \VR, possibly due to the \textbf{perceived latency} between finger movements and the different visual, haptic, and proprioceptive feedback. In \chapref{vhar_textures}, we investigated the perception of co-localized visual and wearable haptic texture augmentations on real surfaces. We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of the combination of nine visuo-haptic texture pairs. -Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics predominating the perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
+Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics dominating perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}. \noindentskip In \partref{manipulation} we focused on improving the manipulation of \VOs directly with the hand in immersive \OST-\AR. -Our approach was to design visual renderings of the hand and delocalized haptic rendering, based on the literature, and to evaluate them in user studies. -We first considered \textbf{(1) the visual rendering as hand augmentation} and then the \textbf{(2)} combination of different visuo-haptic \textbf{rendering of the hand manipulation with \VOs}. +Our approach was to design visual augmentations of the hand and delocalized haptic feedback, based on the literature, and evaluate them in user studies. +We first considered \textbf{(1) the visual augmentation of the hand} and then \textbf{(2)} the combination of different \textbf{visuo-haptic feedback of the hand when manipulating \VOs}. -In \chapref{visual_hand}, we investigated the visual rendering as hand augmentation. -Seen as an \textbf{overlay on the user's hand}, such visual hand rendering provide feedback on the hand tracking and the interaction with \VOs. -We compared the six commonly used renderings in the \AR litterature in a user study with 24 participants, where we evaluated their effect on the user performance and experience in two representative manipulation tasks. -The results showed that a visual hand rendering improved the user performance, perceived effectiveness and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}. +In \chapref{visual_hand}, we investigated the visual feedback of the virtual hand as an augmentation of the real hand. +Seen as an \textbf{overlay on the user's hand}, such a visual hand augmentation provides feedback on hand tracking and interaction with \VOs. +We compared the six commonly used visual hand augmentations in the \AR literature in a user study with 24 participants, where we evaluated their effect on user performance and experience in two representative manipulation tasks. +The results showed that a visual hand augmentation improved user performance, perceived effectiveness and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}. This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand. -In \chapref{visuo_haptic_hand}, we then investigated the visuo-haptic rendering as feedback of the direct hand manipulation with \VOs using wearable vibrotactile haptics. -In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand renderings from the previous chapter. -The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when it was provided close to the fingertips}, and that the visual hand rendering complemented the haptic hand rendering well in giving a continuous feedback on the hand tracking. +In \chapref{visuo_haptic_hand}, we then investigated visuo-haptic feedback for direct hand manipulation with \VOs using wearable vibrotactile haptics.
+In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand augmentations from the previous chapter. +The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when provided close to the fingertips}, and that the visual hand augmentation complemented the haptic contact feedback well by providing continuous feedback on hand tracking. \section{Future Work} -The wearable visuo-haptic augmentations of perception and manipulation we presented, and the user studies we conducted for this thesis have of course some limitations. -In this section, we present some future work for each chapter that could address these issues. +The wearable visuo-haptic feedback we presented for augmenting the perception of real objects (\partref{perception}) and the manipulation of virtual objects (\partref{manipulation}) touched directly with the hand, as well as the user studies we conducted for this thesis, of course have some limitations. +In this section we present some future work for each chapter that could address these issues. \subsection*{Augmenting the Visuo-haptic Texture Perception of Real Surfaces} \paragraph{Other Augmented Object Properties} -We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}). +We focused on visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}). However, many other wearable augmentations of object properties should be considered, such as hardness, friction, temperature, or local deformations. Such an integration of haptic augmentation of a real surface has almost been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but will be more challenging with wearable haptic devices. In addition, the combination with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations. @@ -60,10 +60,10 @@ In addition, combination with pseudo-haptic rendering techniques \cite{ujitoko20 In our system, we registered the real and virtual environments (\secref[related_work]{ar_definition}) using fiducial markers and a webcam external to the \AR headset. This only allowed us to track the index finger and the surface to be augmented with the haptic texture, but the tracking was reliable and accurate enough for our needs. -In fact, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 were not able to track the hands wearing a voice-coil. -A more robust hand tracking system would support wearing haptic devices on the hand, as well as holding real objects.
+In fact, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 were not able to track hands wearing a vibrotactile voice-coil device. +A more robust hand tracking system would support wearing haptic devices on the hand as well as holding real objects. A complementary solution would be to embed tracking sensors in the wearable haptic devices, such as an inertial measurement unit (IMU) or cameras \cite{preechayasomboon2021haplets}. -Prediction of hand movements should also considered as well \cite{klein2020predicting,gamage2021predictable} +Prediction of hand movements should also be considered \cite{klein2020predicting,gamage2021predictable}. This would allow a fully portable and wearable visuo-haptic system to be used in more ecological applications. \subsection*{Perception of Haptic Texture Augmentation in Augmented and Virtual Reality} @@ -77,8 +77,8 @@ In particular, it remains to be investigated how the vibrotactile patterned text \paragraph{Broader Visuo-Haptic Conditions} Our study was conducted with an \OST-\AR headset, but the results may be different with a \VST-\AR headset, where the \RE is seen through cameras and screens (\secref[related_work]{ar_displays}), and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different. -The effect of perceived visuo-haptic simultaneity on the augmented haptic perception and its size should also be systematically investigated, for example by inducing various delays between the visual and haptic feedback. -We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal to simulate a patterned texture: Our objective was not to accurately reproduce real textures, but to induce different perceived roughness on the same real surface with a well controlled parameters. +The effect of perceived visuo-haptic simultaneity on the augmented haptic perception and its magnitude should also be systematically investigated, for example by inducing different delays between visual and haptic feedback. +We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal to simulate a patterned texture: our objective was not to accurately reproduce real textures, but to induce different perceived roughness on the same real surface with well-controlled parameters. However, more accurate models for simulating interaction with virtual textures should be applied to wearable haptic augmentations \cite{unger2011roughness}. Another limitation that may have affected the perception of the haptic texture augmentations is the lack of compensation for the frequency response of the actuator and amplifier \cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}. %The dynamic response of the finger should also be considered, and may vary between individuals. @@ -87,104 +87,104 @@ Another limitation that may have affected the perception of the haptic texture a \paragraph{Assess the Applicability of the Method} -As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a real surface being touched with simultaneous visual and haptic texture augmentations. +As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a real surface being touched with simultaneous visual and haptic texture augmentation.
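As a concrete illustration of this approach, the speed-driven square-wave signal mentioned above can be sketched as follows; the spatial period, amplitude, and sampling rate are illustrative assumptions rather than the parameters of our implementation.
\begin{verbatim}
# Minimal sketch: a square-wave vibrotactile signal whose frequency follows
# finger speed, emulating a patterned texture of fixed spatial period.
# All numerical values are illustrative, not those used in our studies.
import numpy as np

def square_wave_texture(finger_speed_mps: np.ndarray, fs: int = 2000,
                        spatial_period_m: float = 0.002,
                        amplitude: float = 1.0) -> np.ndarray:
    freq_hz = finger_speed_mps / spatial_period_m     # temporal frequency from speed
    phase = 2 * np.pi * np.cumsum(freq_hz) / fs       # integrate frequency into phase
    return amplitude * np.sign(np.sin(phase))         # square wave sent to the actuator

# Example: a finger accelerating from 0 to 0.2 m/s over one second (2 kHz sampling),
# which sweeps the vibration from 0 to 100 Hz for a 2 mm spatial period.
speed = np.linspace(0.0, 0.2, 2000)
drive_signal = square_wave_texture(speed)
\end{verbatim}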
However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white real surfaces. -Visuo-haptic texture augmentations may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes. +Visuo-haptic texture augmentation may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes. The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D print before it is manufactured. %Finally, the visual textures used were also simple color captures not meant to be used in an immersive \VE. \paragraph{Adapt to the Specificities of Direct Touch} -The haptic textures used were captures and models of the vibrations of a hand-held probe sliding over real surfaces. -We generated the vibrotactile textures only from the finger speed \cite{culbertson2015should}, but the perceived roughness of real textures also depends on other factors, such as the contact force, the angle, the posture or the surface of the contact \cite{schafer2017transfer}, but their respective importance in the perception is not yet fully understood \cite{richardson2022learning}. +The haptic textures used were recordings and models of the vibrations of a hand-held probe sliding over real surfaces. +We generated the vibrotactile textures only from finger speed \cite{culbertson2015should}, whereas the perceived roughness of real textures also depends on other factors such as the contact force, angle, posture or contact area \cite{schafer2017transfer}; their respective importance in perception is not yet fully understood \cite{richardson2022learning}. It would be interesting to determine the influence of these factors on the perceived realism of virtual vibrotactile textures. -We also rendered haptic textures captured from a hand-held probe to be touched with the bare finger, but finger based captures of real textures should be considered as well \cite{balasubramanian2024sens3}. +We also rendered haptic textures captured by a hand-held probe to be touched with the bare finger, but finger-based captures of real textures should also be considered \cite{balasubramanian2024sens3}. Finally, the virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}. -\subsection*{Visual Rendering of the Hand for Manipulating \VOs in AR} +\subsection*{Visual Augmentation of the Hand for Manipulating \VOs in AR} \paragraph{Other AR Displays} -The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}. -We purposely chose this type of display as it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}. -We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display. +The visual hand augmentations we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
+We purposely chose this type of display because in \OST-\AR the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}. +We therefore hypothesized that a visual hand augmentation would be more beneficial to users with this type of display. However, the user's visual perception and experience are different with other types of displays, such as \VST-\AR, where the \RE view is seen through cameras and screens (\secref[related_work]{ar_displays}). -While the mutual occlusion problem and the hand tracking latency could be overcome with \VST-\AR, the visual hand rendering could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such. +While the mutual occlusion problem and the hand tracking latency could be overcome with \VST-\AR, the visual hand augmentation could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such. \paragraph{More Ecological Conditions} We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it. -While these tasks are fundamental building blocks for more complex manipulation tasks \cite[p.390]{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered. +While these tasks are fundamental building blocks for more complex manipulation tasks \cite[p.390]{laviolajr20173d}, such as stacking or assembly, more ecological applications should be considered. Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard. -Finally, all visual hand renderings received low and high rank rates from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs, and this should be also be evaluated. +Finally, all visual hand augmentations received low and high rank rates from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand augmentation according to their preferences or needs, and this should also be evaluated. -\subsection*{Visuo-Haptic Rendering of Hand Manipulation With \VOs in AR} +\subsection*{Visuo-Haptic Augmentation of Hand Manipulation With \VOs in AR} \paragraph{Richer Haptic Feedback} -The haptic rendering we considered was limited to vibrotactile feedback using \ERM motors. -While the simpler contact vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs. -This will require to consider a broader ranger of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin. -More importantly, the best compromise between well-round haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}). +The haptic feedback we considered was limited to vibrotactile feedback using \ERM motors.
+While the simpler contact vibration technique (Impact technique) was sufficient to confirm contact with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs. +This will require considering a wider range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin. +More importantly, the best compromise between well-rounded haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}). \paragraph{Personalized Haptics} -Some users found the vibration rendering to be too strong, suggesting that adapting and personalizing the haptic feedback to one's preference is to investigate \cite{malvezzi2021design,umair2021exploring}. +Some users found the vibration feedback too strong, suggesting that adapting and personalizing the haptic feedback to one's preference should be investigated \cite{malvezzi2021design,umair2021exploring}. In addition, although it was perceived as more effective and realistic when provided close to the point of contact, other positionings, such as the wrist, may be preferred and still be sufficient for a given task. -The interactions in our user study were also restricted to the thumb and index fingertips, with the haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks. +The interactions in our user study were also restricted to the thumb and index fingertips, with haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks. -It remains to be explored how to support rendering for different and larger areas of the hand, and how to position a delocalized rendering for points other than the fingertips could be challenging. +It remains to be explored how to support rendering for different and larger areas of the hand; positioning delocalized feedback for contact points other than the fingertips could also be challenging. \section{Perspectives} -Our goal was to improve direct hand interaction with \VOs using wearable haptic devices in immersive \AR, by providing more plausible and coherent perception as well as more natural and effective manipulation of the visuo-haptic augmentations. +Our goal was to improve direct hand interaction with \VOs using wearable haptic devices in immersive \AR by providing more plausible and coherent perception and more natural and effective manipulation of the visuo-haptic augmentations. Our contributions have enabled progress towards a seamless integration of the virtual into the real world. -They also allow us to outline out longer-term research perspectives. +They also allow us to outline longer-term research perspectives. \subsection*{Towards Universal Wearable Haptic Augmentation} -We have reviewed how complex the sense of touch is (\secref[related_work]{haptic_hand}). +We have seen how complex the sense of touch is (\secref[related_work]{haptic_hand}). Multiple sensory receptors all over the skin allow us to perceive different properties of objects, such as their texture, temperature, weight or shape.
-Particularly concentrated in the hands, their sensory feedback is crucial, along with the muscles, for grasping and manipulating objects. +Particularly concentrated in the hands, their sensory feedback, together with the muscles, is crucial for grasping and manipulating objects. In this manuscript, we have shown how wearable haptic devices can provide virtual tactile sensations to support direct hand interaction in immersive \AR: -We have investigated the visuo-haptic perception of texture augmenting real surfaces (\partref{perception}) as well as the manipulation of \VOs with visuo-haptic feedback of hand contacts with \VOs (\partref{manipulation}). +We have investigated both the visuo-haptic perception of textures augmenting real surfaces (\partref{perception}) and the manipulation of \VOs with visuo-haptic feedback of hand contact with \VOs (\partref{manipulation}). -However, unlike for the visual sense, which can be completely immersed in the virtual using an \AR/\VR headset, there is no universal wearable haptic device that can reproduce all the haptic properties perceived by the hand (\secref[related_work]{wearable_haptics}). +However, unlike the visual sense, which can be fully immersed in the virtual using an \AR/\VR headset, there is no universal wearable haptic device that can reproduce all the haptic properties perceived by the hand (\secref[related_work]{wearable_haptics}). Thus, the haptic renderings and augmentations we studied were limited to specific properties of roughness (\chapref{vhar_system}) and contact (\chapref{visuo_haptic_hand}) using vibrotactile feedback. A systematic and comparative study of existing wearable haptic devices and renderings should therefore be carried out to assess their ability to reproduce the various haptic properties \cite{culbertson2017importance,friesen2024perceived}. More importantly, the visuo-haptic coupling of virtual and augmented objects should be studied systematically, as we did for textures in \AR (\chapref{vhar_textures}) or as done in \VR \cite{choi2021augmenting,gunther2022smooth}. Attention should also be paid to the perceptual differences of wearable haptics in \AR \vs \VR (\chapref{xr_perception}). -This would allow to assess the relative importance of visual and haptic feedback in the perception of object properties, and how visual rendering can support or compensate for limitations in wearable haptic rendering. +This would allow us to assess the relative importance of visual and haptic feedback in the perception of object properties, and how visual feedback may support or compensate for limitations in wearable haptic feedback \cite{kim2020defining}. One of the main findings of studies on the haptic perception of real objects is the importance of certain perceived properties over others in discriminating between objects \cite{hollins1993perceptual,baumgartner2013visual,vardar2019fingertip}. -It would therefore be interesting to determine which wearable haptic renderings are most important for the perception and manipulation of virtual and augmented objects with the hand in \AR and \VR. -User studies could then be conducted similarly, reproducing as many haptic properties as possible in \VO discrimination tasks. -These results would make it possible to design more universal wearable haptic devices that provide rich haptic feedback that best meets users' needs for interaction in \AR and \VR.
+It would therefore be interesting to determine which wearable haptic augmentations are most important for the perception and manipulation of virtual and augmented objects with the hand in \AR and \VR. +Similar user studies could then be conducted, reproducing as many haptic properties as possible in \VO discrimination tasks. +These results would enable the design of more universal wearable haptic devices that provide rich haptic feedback that best meets users' needs for interaction in \AR and \VR. % systematic exploration of the parameter space of the haptic rendering to determine the most important parameters their influence on the perception % measure the difference in sensitivity to the haptic feedback and how much it affects the perception of the object properties \subsection*{Responsive Visuo-Haptic Augmented Reality} -We have reviewed the diversity of \AR and \VR reality displays and their respective characteristics in rendering, their perception (\secref[related_work]{ar_displays}) and their manipulation with the hand (\chapref{visual_hand}). +We have reviewed the diversity of \AR and \VR displays and their respective characteristics in rendering (\secref[related_work]{ar_displays}) and the manipulation of virtual content with the hand (\chapref{visual_hand}). The diversity of wearable haptic devices and the different sensations they can provide is even greater (\secref[related_work]{wearable_haptics}) and an active research topic \cite{pacchierotti2017wearable}. Coupling wearable haptics with immersive \AR also requires the haptic actuator to be placed elsewhere on the body than at the hand contact points (\secref[related_work]{vhar_haptics}). -In particular, in this thesis, we have investigated the perception of haptic texture augmentation using a vibrotactile device on the median phalanx (\chapref{vhar_system}) and also compared different positions of the haptics on the hand for manipulating \VOs (\chapref{visuo_haptic_hand}). +In particular, in this thesis we have investigated the perception of haptic texture augmentation using a vibrotactile device on the median phalanx (\chapref{vhar_system}) and also compared different positions of the haptics on the hand for manipulating \VOs (\chapref{visuo_haptic_hand}). -Haptic feedback should be provided close to the point of contact of the hand with the virtual, for a better realism of texture augmentation (\chapref{vhar_textures}) and for rendering contact with virtual objects (\chapref{visuo_haptic_hand}), \eg rendering fingertip contact with a haptic ring worn on the middle or proximal phalanx. -However, the task at hand, the user's sensitivity and preferences, the constraints of the tracking system, or the ergonomics of the haptic device may require the use of other form factors and positions, such as the wrist or the arm. +Haptic feedback should be provided close to the point of contact of the hand with the virtual, to enhance the realism of texture augmentation (\chapref{vhar_textures}) and to render contact with virtual objects (\chapref{visuo_haptic_hand}), \eg rendering fingertip contact with a haptic ring worn on the middle or proximal phalanx. +However, the task at hand, the user's sensitivity and preferences, the limitations of the tracking system, or the ergonomics of the haptic device may require the use of other form factors and positions, such as the wrist or arm.
Similarly, collaborative and transitional experiences between \AR and \VR are becoming more common, and would involve different setups and modalities \cite{roo2017onea}. -Novel \AR/\VR displays are already capable of transitioning from augmented to virtual environments \cite{feld2024simple}, and the haptic feedback should also adapt to these transitions as well (\chapref{xr_perception}). +Novel \AR/\VR displays are already capable of transitioning from augmented to virtual environments \cite{feld2024simple}, and haptic feedback should also adapt to these transitions (\chapref{xr_perception}). -Therefore, a visuo-haptic augmented reality system should be able to adapt to any \AR/\VR display, any wearable haptic device worn anywhere on the body, and support personalization of the haptic feedback. +Therefore, a visuo-haptic augmented reality system should be able to adapt to any \AR/\VR display, any wearable haptic device worn anywhere on the body, and support personalization of haptic feedback. In other words, the visuo-haptic rendering system should be designed to be responsive to the context of use. -This would require the development and validation of methods to automatically calibrate the haptic feedback to the user's perception in accordance of what it was been designed to represent. +This would require the development and validation of methods to automatically calibrate the haptic feedback to the user's perception of what it was designed to represent. Methods should also be developed to allow the user to easily adjust the haptic feedback to their preferences at runtime, but without exposing all the design parameters \cite{kim2020defining}. Finally, different use cases and applications of visuo-haptic \AR should be explored and evaluated. -For example, capturing and sharing visuo-haptic perceptions of objects in a \AR/\VR visioconference, or projecting and touching a sample in a real wall for interior design then switching to \VR to see and touch the complete final result. -A museum object could be manipulated by a real proxy object in visuo-haptic \AR, or even as it was in the past in its original state in \VR. -Another example could be a medical teleconsultation, where the doctor could palpate a distant patient with haptic augmentation, but in \VR. +For example, one could capture visuo-haptic perceptions of objects and share them in a videoconference with different \AR or \VR setups per participant, or in a medical teleconsultation. +Another example could be projecting and touching a visuo-haptic sample on a real wall for interior design, then switching to \VR to see and touch the complete final result, or manipulating a museum object with a real proxy object in visuo-haptic \AR, or even experiencing it as it was in the past, in its original state, in \VR. +%Another example could be a medical teleconsultation, where the doctor could palpate a distant patient with haptic augmentation, but in \VR. % design, implement and validate procedures to automatically calibrate the haptic feedback to the user's perception in accordance to what it has been designed to represent % + let user free to easily adjust (eg can't let adjust whole spectrum of vibrotactile, reduce to two or three dimensions with sliders using MDS)
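As an illustration of such runtime personalization, a reduced set of user-facing controls could be mapped to the underlying drive parameters, as in the following sketch; the two sliders, their ranges, and the mapping are illustrative assumptions rather than a validated calibration method.
\begin{verbatim}
# Minimal sketch: two user-facing sliders (0..1) mapped to low-level
# vibrotactile drive parameters, instead of exposing every design parameter.
# The parameter names and ranges are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DriveParameters:
    gain: float       # scales the amplitude sent to the actuator
    burst_ms: float   # duration of contact bursts
    cutoff_hz: float  # upper frequency bound of texture signals

def from_sliders(intensity: float, sharpness: float) -> DriveParameters:
    intensity = min(max(intensity, 0.0), 1.0)
    sharpness = min(max(sharpness, 0.0), 1.0)
    return DriveParameters(gain=0.2 + 0.8 * intensity,
                           burst_ms=40.0 + 40.0 * (1.0 - sharpness),
                           cutoff_hz=150.0 + 250.0 * sharpness)

# Example: a user who finds the default vibrations too strong lowers the intensity slider.
parameters = from_sliders(intensity=0.4, sharpness=0.7)
\end{verbatim}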