\chapter{Conclusion} \mainlabel{conclusion}

\section*{Summary}

In this thesis, entitled \enquote{\ThesisTitle}, we presented our research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and wearable haptic devices.

\noindentskip \partref{manipulation} \noindentskip

In \chapref{visual_hand}, we investigated visual rendering as a hand augmentation. Displayed as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on both the hand tracking and the interaction with \VOs. We compared six renderings commonly used in the \AR literature in a user study with 24 participants, evaluating their effect on user performance and experience in two representative manipulation tasks. The results showed that a visual hand rendering improved user performance, perceived effectiveness, and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}. This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.

\section*{Future Work}

The visuo-haptic renderings we presented and the user studies we conducted in this thesis naturally have some limitations. In this section, we outline future work that could address them.

\subsection*{Visual Rendering of the Hand for Manipulating Virtual Objects in Augmented Reality}

\paragraph{Other AR Displays} The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, a common \OST-\AR headset. We purposely chose this type of display because it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}. We thus hypothesized that a visual hand rendering would be most beneficial to users with this type of display.
However, the user's visual perception and experience are very different with other types of displays, such as \VST-\AR, where the \RE view is seen through a screen (\secref[related_work]{ar_displays}). While the mutual occlusion problem and the hand tracking latency can be overcome with \VST-\AR, the visual hand rendering could still benefit users by providing depth cues and feedback on the hand tracking, and should be evaluated as such.

\paragraph{More Ecological Conditions} We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it. While these tasks are fundamental building blocks for more complex manipulation tasks \cite{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered. Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard. Finally, all visual hand renderings received both low and high ranks from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs; this personalization should also be evaluated.

\subsection*{Haptic Rendering of the Hand for Manipulating Virtual Objects in Augmented Reality}

As discussed in \secref[visual_hand]{discussion}, these results have some limitations: they address a limited set of visuo-haptic renderings, and manipulations were restricted to the thumb and index fingertips. While the simpler vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as collision or friction rendering between objects \cite{kuchenbecker2006improving, pacchierotti2015cutaneous} or texture rendering \cite{culbertson2014one, asano2015vibrotactile}.
More generally, a broader range of haptic sensations should be considered, such as pressure or stretching of the skin \cite{maisto2017evaluation, teng2021touch}. However, moving the point of application of the sensation away from the contact point may be challenging for some types of haptic rendering. Similarly, as the interactions were limited to the thumb and index fingertips, positioning a delocalized haptic rendering over a larger area of the hand could be challenging and remains to be explored. Also, given that some users found the vibration rendering too strong, adapting or personalizing the haptic feedback to one's preferences (and body positioning) might be a promising approach. Indeed, personalized haptics has recently been gaining interest in the community \cite{malvezzi2021design, umair2021exploring}.

\section*{Perspectives}

A first perspective is a systematic exploration of the parameter space of the haptic rendering, in order to determine the most important parameters and their influence on the user's perception. A related direction is to measure differences in users' sensitivity to the haptic feedback, and how much these differences affect the perception of the rendered object properties. Building on this, one could design, implement, and validate procedures to automatically calibrate the haptic feedback to the user's perception, in accordance with what the feedback has been designed to represent. Finally, users should remain free to easily adjust the feedback themselves; since the whole spectrum of vibrotactile parameters is too large to expose directly, it could be reduced to two or three perceptual dimensions, for instance through multidimensional scaling (MDS), and controlled with simple sliders.