\chapter{Conclusion} \mainlabel{conclusion}

\section{Summary}

In this thesis, entitled \enquote{\ThesisTitle}, we presented our research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and wearable haptic devices.

\noindentskip In \partref{manipulation}, we addressed the challenge of improving the manipulation of \VOs directly with the hand in immersive \OST-\AR. Our approach was to design, based on the literature, and evaluate in user studies the effect of the visual rendering of the hand and of the delocalized haptic rendering. We first focused on (1) \textbf{the visual rendering as hand augmentation} and then on (2) \textbf{the combination of different visuo-haptic renderings of the hand manipulation with \VOs}.

\noindentskip In \chapref{visual_hand}, we investigated the visual rendering as hand augmentation. Seen as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on the hand tracking and on the interaction with \VOs. We compared six renderings commonly used in the \AR literature in a user study with 24 participants, where we evaluated their effect on the user performance and experience in two representative manipulation tasks. The results showed that a visual hand rendering improved the user performance, perceived effectiveness, and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}. This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.

\noindentskip In \chapref{visuo_haptic_hand}, we then investigated the visuo-haptic rendering as feedback of the direct hand manipulation with \VOs using wearable vibrotactile haptics. In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand} and combined with the two most representative visual hand renderings from the previous chapter. The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when provided close to the fingertips}, and that the visual hand rendering complemented the haptic hand rendering well by giving continuous feedback on the hand tracking.

\section{Future Work}

The wearable visuo-haptic augmentations of perception and manipulation we presented and the user studies we conducted in this thesis have, of course, some limitations. In this section, we present future work for each chapter that could address them.

\subsection*{Visual Rendering of the Hand for Manipulating Virtual Objects in Augmented Reality}

\paragraph{Other AR Displays} The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset. We purposely chose this type of display as it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}. We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display. However, the user's visual perception and experience are very different with other types of displays, such as \VST-\AR, where the \RE view is seen through a screen (\secref[related_work]{ar_displays}).
While the mutual occlusion problem and the hand tracking latency can be overcome with \VST-\AR, the visual hand rendering could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.

\paragraph{More Ecological Conditions} We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it. While these tasks are fundamental building blocks for more complex manipulation tasks \cite{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered. Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard. Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs, and this should also be evaluated.

\subsection*{Visuo-Haptic Rendering of Hand Manipulation With Virtual Objects in Augmented Reality}

\paragraph{Richer Haptic Feedback} The haptic rendering we considered was limited to vibrotactile feedback using \ERM motors. While the simpler contact vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs. This will require considering a broader range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin. More importantly, the best compromise between well-rounded haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}).

\paragraph{Personalized Haptics} Some users found the vibration rendering to be too strong, suggesting that adapting and personalizing the haptic feedback to one's preference is a promising approach. In addition, although it was perceived as more effective and realistic when provided close to the point of contact, other positionings, such as the wrist, may be preferred and still be sufficient for a given task. The interactions in our user study were also restricted to the thumb and index fingertips, with the haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks. It remains to be explored how to support rendering for different and larger areas of the hand; in particular, positioning a delocalized rendering for contact points other than the fingertips could be challenging. Indeed, personalized haptics is gaining interest in the community \cite{malvezzi2021design, umair2021exploring}.
\section{Perspectives}

\subsection*{Towards Universal Wearable Haptic Augmentation}

A first perspective is a systematic exploration of the parameter space of the haptic rendering, in order to determine the most important parameters and their influence on the perception. Such an exploration should also measure the differences between users in their sensitivity to the haptic feedback, and how much these differences affect the perception of the object properties.

\subsection*{Responsive Visuo-Haptic Augmented Reality}

Given the diversity of haptic actuators and renderings, one could imagine interacting with \VOs using any haptic device, worn anywhere on the body and providing personalized feedback for any part of the hand, and the visuo-haptic system should be able to support such adapted usage. This calls for designing, implementing, and validating procedures to automatically calibrate the haptic feedback to the user's perception, in accordance with what it has been designed to represent, while still letting users easily adjust it themselves, e.g., by reducing the full spectrum of vibrotactile parameters to two or three dimensions controlled with sliders, using multidimensional scaling (MDS).