\chapter{Conclusion} \mainlabel{conclusion} \chaptertoc
\section{Summary}
In this manuscript, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual. Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both touched directly with the hand, while preserving the freedom of movement and interaction with the \RE. However, their integration with \AR is still in its infancy and presents many design, technical, and human challenges. We have structured our research around two axes: \textbf{(I) modifying the texture perception of real surfaces}, and \textbf{(II) improving the manipulation of \VOs}.
\noindentskip In \partref{perception}, we focused on modifying the perception of real surfaces augmented with wearable and immersive virtual visuo-haptic textures. Texture is a fundamental property of an object, perceived equally by sight and touch. It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR. We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual feedback of the virtual hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
In \chapref{vhar_system}, we presented a system for \textbf{augmenting any real surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger. It allows \textbf{free visual and touch exploration} of the textures, as if they were real: the user can view them from different angles and touch them with the bare finger, without constraints on hand movements. The user studies in the next two chapters are based on this system.
In \chapref{xr_perception}, we explored how the perception of wearable haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, whether real, augmented, or virtual. We augmented the perceived roughness of the real surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view. We then conducted a psychophysical user study with 20 participants and extensive questionnaires to evaluate the perceived roughness augmentation in these three visual conditions. The textures were perceived as \textbf{rougher when touched with the real hand alone than with a virtual hand} in either \AR or \VR, possibly due to the \textbf{perceived latency} between finger movements and the visual, haptic, and proprioceptive feedback.
In \chapref{vhar_textures}, we investigated the perception of co-localized visual and wearable haptic texture augmentations on real surfaces. We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of nine visuo-haptic texture pairs.
Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics dominating the perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
\noindentskip In \partref{manipulation}, we focused on improving the manipulation of \VOs directly with the hand in immersive \OST-\AR. Our approach was to design visual renderings of the hand and delocalized haptic renderings, based on the literature, and to evaluate them in user studies. We first considered \textbf{(1) the visual rendering as a hand augmentation} and then \textbf{(2)} the combination of different visuo-haptic \textbf{renderings of the hand manipulating \VOs}.
In \chapref{visual_hand}, we investigated the visual rendering as a hand augmentation. Seen as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on the hand tracking and the interaction with \VOs. We compared six renderings commonly used in the \AR literature in a user study with 24 participants, where we evaluated their effect on user performance and experience in two representative manipulation tasks. The results showed that a visual hand rendering improved user performance, perceived effectiveness, and confidence, with a \textbf{skeleton-like rendering being the best performing and most effective}. This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.
In \chapref{visuo_haptic_hand}, we then investigated the visuo-haptic rendering as feedback on the direct hand manipulation of \VOs using wearable vibrotactile haptics. In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand renderings from the previous chapter. The results showed that vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when it was provided close to the fingertips}, and that the visual hand rendering complemented the haptic hand rendering well in giving continuous feedback on the hand tracking.
\section{Future Work}
The wearable visuo-haptic augmentations of perception and manipulation we presented, and the user studies we conducted for this thesis, of course have some limitations. In this section, we outline future work for each chapter that could address these issues.
\subsection*{Augmenting the Visuo-Haptic Texture Perception of Real Surfaces}
\paragraph{Other Augmented Object Properties} We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}). However, many other wearable augmentations of object properties should be considered, such as hardness, friction, temperature, or local deformations. Such an integration of haptic augmentations on a real surface has almost been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but will be more challenging with wearable haptic devices.
In addition, the combination with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations.
\paragraph{Fully Integrated Tracking} In our system, we registered the real and virtual environments (\secref[related_work]{ar_definition}) using fiducial markers and a webcam external to the \AR headset. This allowed us to track only the index finger and the surface to be augmented with the haptic texture, but the tracking was reliable and accurate enough for our needs. In fact, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 were not able to track hands wearing a voice-coil actuator. A more robust hand tracking system would support wearing haptic devices on the hand, as well as holding real objects. A complementary solution would be to embed tracking sensors in the wearable haptic devices, such as an inertial measurement unit (IMU) or cameras \cite{preechayasomboon2021haplets}. Prediction of hand movements should also be considered \cite{klein2020predicting,gamage2021predictable}. This would allow a completely portable and wearable visuo-haptic system to be used in more ecological applications.
\subsection*{Perception of Haptic Texture Augmentation in Augmented and Virtual Reality}
\paragraph{Visual Representation of the Virtual Texture} The main limitation of this user study was the absence of a visual representation of the virtual patterned texture. Visual information is indeed as important as haptic sensations for the perception of both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}, and their interaction in the overall perception is complex. In particular, it remains to be investigated how the vibrotactile patterned textures can be represented visually in a convincing way, as the visuo-haptic coupling of such patterned textures is not trivial \cite{unger2011roughness}.
% even with real textures \cite{klatzky2003feeling}.
\paragraph{Broader Visuo-Haptic Conditions} Our study was conducted with an \OST-\AR headset, but the results may be different with a \VST-\AR headset, where the \RE is seen through cameras and screens (\secref[related_work]{ar_displays}) and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different. The effect of the perceived visuo-haptic simultaneity on the augmented haptic perception, and its magnitude, should also be systematically investigated, for example by inducing various delays between the visual and haptic feedback. We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal simulating a patterned texture: our objective was not to accurately reproduce real textures, but to induce different levels of perceived roughness on the same real surface with well-controlled parameters. However, more accurate models for simulating the interaction with virtual textures should be applied to wearable haptic augmentations \cite{unger2011roughness}. Another limitation that may have affected the perception of the haptic texture augmentations is the lack of compensation for the frequency response of the actuator and amplifier \cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
%The dynamic response of the finger should also be considered, and may vary between individuals.
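As an illustration, the speed-driven square-wave rendering described above can be sketched minimally as
\[
a(t) = A \,\mathrm{sgn}\!\left(\sin\!\left(2\pi\,\frac{x(t)}{\lambda}\right)\right),
\qquad
f(t) = \frac{v(t)}{\lambda},
\]
where $x(t)$ is the finger position along the surface, $v(t)$ its exploration speed, $f(t)$ the resulting instantaneous vibration frequency, and where the amplitude $A$ and virtual spatial period $\lambda$ stand for generic parameters rather than the exact values used in our study. Compensating for the frequency response mentioned above could then, for example, consist in pre-filtering $a(t)$ with an inverse model of the actuator and amplifier response.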
\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}
\paragraph{Assess the Applicability of the Method} As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a touched real surface with simultaneous visual and haptic texture augmentations. However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth, white real surfaces. Visuo-haptic texture augmentations may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes. The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D print before it is manufactured.
%Finally, the visual textures used were also simple color captures not meant to be used in an immersive \VE.
\paragraph{Adapt to the Specificities of Direct Touch} The haptic textures used were captures and models of the vibrations of a hand-held probe sliding over real surfaces. We generated the vibrotactile textures only from the finger speed \cite{culbertson2015should}, but the perceived roughness of real textures also depends on other factors, such as the contact force, angle, posture, or contact surface \cite{schafer2017transfer}, whose respective importance in the perception is not yet fully understood \cite{richardson2022learning}. It would be interesting to determine the importance of these factors on the perceived realism of virtual vibrotactile textures. We also rendered haptic textures captured from a hand-held probe to be touched with the bare finger, but finger-based captures of real textures should be considered as well \cite{balasubramanian2024sens3}. Finally, the virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}.
\subsection*{Visual Rendering of the Hand for Manipulating \VOs in AR}
\paragraph{Other AR Displays} The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}. We purposely chose this type of display as it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}. We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display. However, the user's visual perception and experience are different with other types of displays, such as \VST-\AR, where the \RE is seen through cameras and screens (\secref[related_work]{ar_displays}). While the mutual occlusion problem and the hand tracking latency could be overcome with \VST-\AR, the visual hand rendering could still be beneficial to users, as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.
\paragraph{More Ecological Conditions} We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it. While these tasks are fundamental building blocks for more complex manipulation tasks \cite[p.390]{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered.
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard. Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs, and this should also be evaluated.
\subsection*{Visuo-Haptic Rendering of Hand Manipulation With \VOs in AR}
\paragraph{Richer Haptic Feedback} The haptic rendering we considered was limited to vibrotactile feedback using \ERM motors. While the simpler contact vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or the edges and shape of \VOs. This will require considering a broader range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin. More importantly, the best compromise between well-rounded haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}).
\paragraph{Personalized Haptics} Some users found the vibration rendering to be too strong, suggesting that adapting and personalizing the haptic feedback to one's preferences should be investigated \cite{malvezzi2021design,umair2021exploring}. In addition, although it was perceived as more effective and realistic when provided close to the point of contact, other positionings, such as the wrist, may be preferred and still be sufficient for a given task. The interactions in our user study were also restricted to the thumb and index fingertips, with the haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks. It remains to be explored how to support rendering for different and larger areas of the hand, and how to position a delocalized rendering for contact points other than the fingertips, which could be challenging.
\section{Perspectives}
Our goal was to improve direct hand interaction with \VOs using wearable haptic devices in immersive \AR, by providing a more plausible and coherent perception as well as a more natural and effective manipulation of the visuo-haptic augmentations. Our contributions have enabled progress towards a seamless integration of the virtual into the real world. They also allow us to outline our longer-term research perspectives.
\subsection*{Towards Universal Wearable Haptic Augmentation}
We have reviewed how complex the sense of touch is (\secref[related_work]{haptic_hand}). Multiple sensory receptors all over the skin allow us to perceive different properties of objects, such as their texture, temperature, weight, or shape. These receptors are particularly concentrated in the hands, where their sensory feedback, along with that of the muscles, is crucial for grasping and manipulating objects. In this manuscript, we have shown how wearable haptic devices can provide virtual tactile sensations to support direct hand interaction in immersive \AR.
In particular, we have investigated the visuo-haptic perception of textures augmenting real surfaces (\partref{perception}) as well as the manipulation of \VOs with visuo-haptic feedback of hand contacts with them (\partref{manipulation}). However, unlike the visual sense, which can be completely immersed in the virtual using an \AR/\VR headset, there is no universal wearable haptic device that can reproduce all the haptic properties perceived by the hand (\secref[related_work]{wearable_haptics}). Thus, the haptic renderings and augmentations we studied were limited to the specific properties of roughness (\chapref{vhar_system}) and contact (\chapref{visuo_haptic_hand}) using vibrotactile feedback. A systematic and comparative study of existing wearable haptic devices and renderings should therefore be carried out to assess their ability to reproduce the various haptic properties \cite{culbertson2017importance,friesen2024perceived}. More importantly, the visuo-haptic coupling of virtual and augmented objects should be studied systematically, as we did for textures in \AR (\chapref{vhar_textures}) or as done in \VR \cite{choi2021augmenting,gunther2022smooth}. Attention should also be paid to the perceptual differences of wearable haptics in \AR \vs \VR (\chapref{xr_perception}). This would make it possible to assess the relative importance of visual and haptic feedback in the perception of object properties, and how visual rendering can support or compensate for limitations in wearable haptic rendering. One of the main findings of studies on the haptic perception of real objects is the importance of certain perceived properties over others in discriminating between objects \cite{hollins1993perceptual,baumgartner2013visual,vardar2019fingertip}. It would therefore be interesting to determine which wearable haptic renderings are most important for the perception and manipulation of virtual and augmented objects with the hand in \AR and \VR. User studies could then be conducted similarly, reproducing as many haptic properties as possible in \VO discrimination tasks. These results would make it possible to design more universal wearable haptic devices, providing rich haptic feedback that best meets users' needs for interaction in \AR and \VR.
% systematic exploration of the parameter space of the haptic rendering to determine the most important parameters their influence on the perception
% measure the difference in sensitivity to the haptic feedback and how much it affects the perception of the object properties
\subsection*{Responsive Visuo-Haptic Augmented Reality}
We have seen the diversity of augmented and virtual reality displays, as well as their implications for the rendering of \VOs, their perception (\secref[related_work]{ar_displays}), and their manipulation with the hand (\chapref{visual_hand}). The diversity of wearable haptic devices and of the tactile renderings they can provide is even greater, and is an active research topic (\secref[related_work]{wearable_haptics}). Coupling wearable haptics with immersive \AR also requires placing the haptics elsewhere on the body than at the contact points of the hand with the \VOs (\secref[related_work]{vhar_haptics}). In particular, we have studied the perception of haptic texture augmentations with a vibrotactile device on the middle phalanx (\secref{vhar_system}) and compared several positionings of the haptics on the hand for the manipulation of \VOs (\secref{visuo_haptic_hand}).
A haptic rendering provided close to the point of contact of the hand with the virtual therefore seems preferable, both to augment the perception of real objects (\secref{vhar_textures}) and to improve the manipulation of \VOs (\secref{visuo_haptic_hand}).
%Given these three points, and the diversity of haptic actuators and renderings, one might be able to interact with the \VOs with any haptic device, worn anywhere on the body and providing personalized feedback on any other part of the hand, and the visuo-haptic system should be able to support such a adapted usage.
% design, implement and validate procedures to automatically calibrate the haptic feedback to the user's perception in accordance to what it has been designed to represent
% + let user free to easily adjust (eg can't let adjust whole spectrum of vibrotactile, reduce to two or three dimensions with sliders using MDS)
%- Visio en réalité mixte : ar avec avatars distants, vr pour se retrouver dans l'espace de l'autre ou un espace distant, et besoin de se faire toucher des objets à distance
%- Ou bien en cours, voir l'échantillon à toucher dans lenv de travail ou en contexte en passant en VR
%- Ex : médecin palpation, design d'un objet, rénovation d'un logement (AR en contexte courant, VR pour voir et toucher une fois terminé)