\chapter{Introduction} \mainlabel{introduction} \chaptertoc
%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
This PhD manuscript shows how immersive \AR, which integrates visual virtual content into the perception of the real world, and wearable haptics, worn on the outside of the hand, can improve the free and direct interaction of the hand with virtual objects. Our goal is to achieve a more plausible and coherent perception, as well as a more natural and effective manipulation, of the visuo-haptic augmentations.
%interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
\section{Visual and Haptic Object Augmentations} \label{visuo_haptic_augmentations}
\subsectionstarbookmark{Hand Interaction with Everyday Objects}
In daily life, \textbf{we simultaneously look at, touch and manipulate the everyday objects} around us without even thinking about it. Many properties of these objects, such as their shape, size or texture, can be perceived in a complementary way through our different sensory modalities \cite{baumgartner2013visual}. Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature. Information from different sensory sources can be complementary, redundant or contradictory \cite{ernst2004merging}.
This is why we sometimes want to touch an object to verify a property we have seen, comparing or confronting our visual and tactile sensations. We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}. The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects. This is due to the many sensory receptors distributed throughout our hands and body. These receptors can be divided into two modalities: \emph{kinesthetic} (or proprioceptive) sensations, the forces felt through muscles and tendons, and \emph{cutaneous} (or tactile) sensations, the pressures, stretches, vibrations and temperatures felt by the skin. This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate the capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}.
\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}
\textbf{\emph{Haptics} is the study of the sense of touch and of user interfaces that involve touch} \cite{klatzky2013haptic}. Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}. \emph{Graspable interfaces} are the traditional haptic devices that are held in the hand. They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
\emph{Touchable interfaces} are actuated devices that are directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback. However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface. Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}. \begin{subfigs}{haptic-categories}{ Haptic devices can be divided into three categories according to their interface with the user: }[][ \item graspable (hand-held), \item touchable, and \item wearable. Adapted from \textcite{culbertson2018haptics}. ] \subfig[0.25]{culbertson2018haptics-graspable} \subfig[0.25]{culbertson2018haptics-touchable} \subfig[0.25]{culbertson2018haptics-wearable} \end{subfigs} A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback. \figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities. Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, augmented and virtual realities, and social interaction \cite{pacchierotti2017wearable,culbertson2018haptics}. However, the \textbf{integration of wearable haptics with \AR has been little explored so far}. %, and few wearable haptic devices have specifically designed or experimentally tested for direct hand interaction in \AR. 
\begin{subfigs}{wearable-haptics}{ Wearable haptic devices can provide sensations on the skin as feedback for real or virtual objects being touched. }[][ \item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}. \item Touch\&Fold, a nail-mounted wearable haptic device that folds on demand to provide contact, normal force and vibration to the fingertip \cite{teng2021touch}. \item The hRing, a wearable haptic ring mounted on the proximal phalanx, able to render normal and shear forces to the finger \cite{pacchierotti2016hring}. \item Tasbi, a haptic bracelet capable of providing pressure and vibrotactile feedback to the wrist \cite{pezent2022design}. ] \subfigsheight{28mm} \subfig{choi2016wolverine} \subfig{teng2021touch} \subfig{pacchierotti2016hring} \subfig{pezent2019tasbi} \end{subfigs}
\subsectionstarbookmark{Augmented Reality Is Not Only Visual}
\textbf{\emph{Augmented Reality (\AR)} integrates virtual content into the perception of the real world, creating the illusion of a single \emph{augmented environment}} \cite{azuma1997survey,skarbez2021revisiting}. It thus promises natural and seamless interaction with physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}. It is technically and conceptually closely related to \emph{\VR}, which completely replaces the perception of the \emph{\RE} with a \emph{\VE}. \AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{The original reality-virtuality continuum of \textcite{milgram1994taxonomy} also includes augmented virtuality, the incorporation of real objects into a \VE, placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as in \emph{The Matrix} movies). Between these two extremes lies \MR, which includes \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still debate about how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.} We refer to the computational set of hardware (input devices, sensors and displays) and software (tracking, simulation and rendering) that allows the user to interact with the \VE as an \AR/\VR \emph{system}.
% by implementing the interaction loop we proposed in \figref{interaction-loop}.
\AR and \VR systems can address any of the human senses, but they most often focus on visual augmentations \cite[p.144]{billinghurst2015survey}. Many visual displays have been explored, from projection systems to hand-held displays. \textbf{\AR headsets are the most promising displays as they are portable and provide the user with an immersive augmented environment.}
%but the most \textbf{promising devices are \AR headsets}, which are \textbf{portable displays worn directly on the head}, providing the user with an \textbf{immersive visual augmented environment}.
\begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[][ \item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}. \item Extension to include the haptic sense on a second, orthogonal axis, proposed by and adapted from \textcite{jeon2009haptic}. ] \subfig[0.49]{rv-continuum} \subfig[0.49]{visuo-haptic-rv-continuum5} \end{subfigs}
%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
\textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR systems with two orthogonal reality-virtuality continuums, one for vision and one for touch, as shown in \figref{visuo-haptic-rv-continuum5}. The combination of the two axes defines nine types of visuo-haptic environments, with three possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual. For example, (visual) \AR using a real object as a proxy to manipulate a \VO is considered to be \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered to be \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}). \textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli} \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}). In particular, it has been implemented with wearable haptics that augment the haptic perception of real objects by providing timely virtual tactile stimuli: \figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}). \figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger on a real surface (middle cell on both axes in \figref{visuo-haptic-rv-continuum5}). If a (visual) \AR system lacks haptic feedback, it creates a deceptive and incomplete user experience when the hand reaches the virtual content. All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and to interact with them with confidence and efficiency.
It is therefore necessary to provide haptic feedback that is consistent with the visual \VOs and ensures the best possible user experience, as we argue in the next section. The \textbf{integration of wearable haptics with \AR} seems to be one of the most promising solutions, but it \textbf{remains challenging due to the many characteristics of each technology and the additional constraints of combining them}. \begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][ \item \AR environment with a real haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}. \item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}. \item A real object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{detinguy2018enhancing,salazar2020altering}. \item Visuo-haptic texture augmentation of a real object to be touched using a handheld \AR display and haptic electrovibration feedback \cite{bau2012revel}. ] \subfigsheight{31mm} \subfig{kahl2023using} \subfig{meli2018combining} \subfig{salazar2020altering} \subfig{bau2012revel} \end{subfigs}
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality} \label{research_challenges}
The integration of wearable haptics with \AR to create a visuo-haptic augmented environment is complex and presents many perceptual and interaction challenges.
% \ie sensing the \AE and acting effectively upon it.
In this thesis, we propose to \textbf{represent the user's experience with such a visuo-haptic augmented environment as an interaction loop}, shown in \figref{interaction-loop}. It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}. The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using an \AR/\VR headset and wearable haptics. Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE. \fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment as proposed in this thesis.}[ A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs. The visual and haptic \VEs are rendered back using an immersive \AR headset and wearable haptics, and are felt by the user to be registered and co-localized with the \RE (in gray) \protect\footnotemark ] In this context, we focus on two main research challenges: \textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and \textbf{(II) enabling effective manipulation of the augmented environment}. Each of these challenges also raises numerous design, technical and human issues specific to wearable haptics and immersive \AR.
%, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic \AE.
\footnotetext{%
The icons are \href{https://creativecommons.org/licenses/by/3.0/}{CC BY} licensed: \enquote{\href{https://thenounproject.com/icon/finger-pointing-4230346/}{finger pointing}} by \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay}, \enquote{\href{https://thenounproject.com/icon/hololens-1499195/}{HoloLens}} by \href{https://thenounproject.com/creator/daniel2021/}{Daniel Falk}, and \enquote{\href{https://thenounproject.com/icon/vibration-6478365/}{vibrations}} by \href{https://thenounproject.com/creator/iconbunny/}{Iconbunny}.
}
\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations}
\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing realistic and varied kinesthetic and tactile feedback to \VOs. Although closely related, \AR and \VR have key differences in their respective renderings that can affect user perception.
%As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
Many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR. The \textbf{user's hand must indeed be free to touch and interact with the \RE while wearing a wearable haptic device}. Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content. Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are perceived as co-localized, but the virtual haptic feedback is not}. It remains to be investigated how such discrepancies affect the overall perception, in order to design visuo-haptic augmentations adapted to \AR.
%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
The \textbf{added visual and haptic virtual sensations may be perceived as inconsistent} with the sensations of the \RE, for example due to lower rendering quality, temporal latency, a spatial shift, or a combination of these. Moreover, in \AR, the user can still see the real-world environment, including their hands, the augmented real objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering.
% of the hand and \VE.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or even plausible, and to what extent they will conflict or complement each other.
% in the perception of the \AE.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many existing wearable haptic systems that have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.
\subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment}
Touching, \textbf{grasping and manipulating \VOs are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}. Since the hand is not occupied or covered by a haptic device, so as not to impair interaction with the \RE, as described in the previous section, one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real. When augmenting a real object, the user's hand is physically constrained by the object, allowing for easy and natural interaction. However, \textbf{manipulating a purely \VO with the bare hand can be challenging} without good haptic feedback \cite{maisto2017evaluation,meli2018combining}.
%, and one will rely on visual and haptic feedback to guide the interaction.
In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs.
%, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world. However, the depth of \VOs is often underestimated \cite{peillard2019studying,adams2022depth}. There is also often \textbf{a lack of mutual occlusion between the hand and a \VO}, \ie the hand hiding the object or being hidden by it \cite{macedo2023occlusion}. Finally, as illustrated in \figref{interaction-loop}, interaction with a \VO is an illusion, because the real hand controls in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE. Therefore, there is inevitably a delay between the real hand's movements and the displayed movements of the \VO, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the touched \VO \cite{prachyabrued2014visual}. These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp the \VO and move it to a desired location. Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a \VO with the hand. Yet, it is unclear which type of visual and haptic feedback, or their combination, is best suited to guide \VO manipulation.
%, and whether one or the other of a combination of the two is most beneficial for users.
\section{Approach and Contributions} \label{contributions}
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
As described in the Research Challenges section above, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues. Our approach is to \begin{enumerate*}[label=(\arabic*)] \item design immersive and wearable visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and \item evaluate in user studies how these visuo-haptic renderings affect the interaction of the hand with these objects using psychophysical, performance, and user experience methods. \end{enumerate*} We consider two main axes of research, each addressing one of the research challenges identified above: \begin{enumerate*}[label=(\Roman*)] \item \textbf{augmenting the texture perception of real surfaces}, and % with visuo-haptic texture augmentations, and \item \textbf{improving the manipulation of virtual objects}.% with visuo-haptic augmentations of the hand-object interaction. \end{enumerate*} Our contributions in these two axes are summarized in \figref{contributions}. \fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[ The contributions are represented in dark grey boxes, and the research axes in light green circles. The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand. The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback. ] \subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces} Wearable haptic devices have proven to be effective in altering the perception of a touched real surface, without modifying the object or covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}. 
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator.
However, wearable haptic augmentations combined with \AR have been little explored, and so has the visuo-haptic augmentation of texture. Texture is indeed one of the most fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}. Being able to coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way. For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}.
%, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, (2) evaluate how the perception of haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations. First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback. Thus, our first objective is to \textbf{design an immersive, real-time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on real surfaces. It will form the basis of the next two chapters in this section. Second, many works have investigated haptic augmentations of texture, but none has integrated them with \AR or \VR, nor considered the influence of the visual feedback on their perception. Still, it is known that visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting}, and that the force feedback of grounded haptic devices is not perceived the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}. Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentation is affected by the visual feedback of the virtual hand and the environment} (real, augmented, or virtual). Finally, some visuo-haptic texture databases have been created from captures of real textures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to the real ones \cite{culbertson2015should,friesen2024perceived}. However, the rendering of these textures in an immersive and natural visuo-haptic \AR setup using wearable haptics remains to be investigated. Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR.
%, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
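To make the vibrotactile texture rendering approach mentioned above more concrete, the following minimal Python sketch generates a vibration sample for a periodic grating as a function of the tracked finger position: because the signal is driven by position rather than time, faster strokes naturally produce higher-frequency vibrations, as on a real ridged surface. The function name, sampling scheme and numeric values are illustrative assumptions, not the actual implementation of the system described in this thesis.

```python
import math

def vibrotactile_sample(finger_pos_m: float, spatial_period_m: float,
                        amplitude: float = 1.0) -> float:
    """Vibration sample for a sinusoidal grating, driven by finger position."""
    # One vibration cycle per spatial period of the texture crossed by the finger.
    phase = 2.0 * math.pi * finger_pos_m / spatial_period_m
    return amplitude * math.sin(phase)

# Rendering-loop sketch: sample the tracked finger position at each tick and
# send the corresponding sample to the actuator (here simply collected).
positions = [i * 0.0005 for i in range(10)]   # finger sliding 0.5 mm per tick
signal = [vibrotactile_sample(p, spatial_period_m=0.002) for p in positions]
```

In a real system the loop would run at the actuator's update rate and the sample would be written to the vibrotactile driver, with the same position stream also driving the visual texture overlay so that both modalities stay synchronized.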
\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects. Hence, a user can expect natural and direct contact and manipulation of \VOs with the bare hand. However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems, and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make the interaction with \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
Two particular sensory feedbacks are known to improve such direct \VO manipulation, but they have not been properly investigated in immersive \AR: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic rendering \cite{lopes2018adding,teng2021touch}. For this second axis of research, we propose to design and evaluate \textbf{visuo-haptic augmentations of the hand as interaction feedback with \VOs} in immersive \OST-\AR. We consider the effect on user performance and experience of (1) the visual rendering of the hand as a hand augmentation, and (2) the combination of different visuo-haptic renderings of the hand manipulating \VOs. First, the visual rendering of the virtual hand is a key element for interacting with and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}. Some work has also investigated the visual rendering of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of \VO manipulation.
% with the bare hand.% from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
\OST-\AR also has significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect the user experience and performance \cite{yoon2020evaluating}.
%, and these visual hand augmentations have not been evaluated .
Thus, our fourth objective is to \textbf{investigate the visual rendering as a hand augmentation} for direct hand manipulation of \VOs in \OST-\AR. Second, as described above, the haptic actuators need to be moved away from the fingertips so as not to impair the hand movements, sensations, and interactions with the \RE. Previous works have shown that wearable haptics providing feedback on the manipulation of \VOs with the hand in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}. However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand. Our last objective is to \textbf{investigate the visuo-haptic rendering of \VO manipulation with the hand} in \OST-\AR using wearable vibrotactile haptics.
\section{Thesis Overview} \label{thesis_overview}
%This thesis is divided in four parts.
%In \textbf{\partref{background}}, we first describe the context and background of our research, within which
In this first \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis. In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
First, we give an overview of how the hand perceives and manipulates real objects. Second, we present wearable haptics and haptic augmentations of the texture and hardness of real objects. Third, we introduce \AR, and how \VOs can be manipulated directly with the hand. Finally, we describe how visuo-haptic feedback has augmented direct hand interaction in \AR, particularly using wearable haptics. We then address each of our two research axes in a dedicated part. \noindentskip In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces. We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger. In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces.
%, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface. The tracking of the real hand and the environment is achieved using a marker-based technique. The visual rendering is done using the immersive \OST-\AR headset Microsoft HoloLens~2. The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters. In \textbf{\chapref{xr_perception}} we investigate in a user study how the perception of haptic texture augmentations differs in \AR \vs \VR, and when touching with a virtual hand \vs one's own hand.
We use psychophysical methods to measure the user perception, and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented, or virtual). In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations directly touched with the real hand in \AR. The virtual textures are paired visual and haptic captures of real surfaces \cite{culbertson2014one}, which we render as visual and haptic overlays on the touched augmented surfaces. Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs. \noindentskip In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving the direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback in immersive \OST-\AR. In \textbf{\chapref{visual_hand}} we investigate in a user study six visual renderings used as hand augmentations, a set of the most popular hand renderings in the \AR literature. Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand. In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study the visuo-haptic rendering of direct hand manipulation of \VOs with two vibrotactile contact techniques, provided at four different positions on the user's hand.
%, as haptic rendering of \VO manipulation with the hand.
They are compared to the two most representative visual hand renderings from the previous chapter, resulting in sixteen visuo-haptic hand renderings that are evaluated within the same experimental setup and design.
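To illustrate the kind of psychophysical method referred to above, the following Python sketch implements a one-up/one-down adaptive staircase, a common procedure for estimating a perceptual threshold from repeated stimulus presentations. The procedure, function names and parameters are illustrative assumptions, not necessarily those of the studies in this thesis.

```python
# Minimal sketch (illustrative assumption) of a one-up/one-down staircase:
# the stimulus level decreases after each detection and increases after each
# miss, and the threshold is estimated from the levels at direction reversals.

def staircase(respond, start: float, step: float, reversals: int = 6) -> float:
    """Adjust the stimulus level until `reversals` direction changes occur,
    then return the mean of the reversal levels as the threshold estimate."""
    level, last_dir, rev_levels = start, 0, []
    while len(rev_levels) < reversals:
        direction = -1 if respond(level) else +1  # detected -> decrease level
        if last_dir and direction != last_dir:
            rev_levels.append(level)
        last_dir = direction
        level = max(0.0, level + direction * step)
    return sum(rev_levels) / len(rev_levels)

# Example with a simulated observer who detects any level above 0.5:
estimate = staircase(lambda lvl: lvl > 0.5, start=1.0, step=0.1)
```

In an actual experiment, `respond` would present the stimulus (e.g. a vibrotactile texture) to the participant and record their answer, and the resulting estimates would feed the statistical analysis.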
\noindentskip In \textbf{\chapref{conclusion}} we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.