\section{Introduction} \label{intro}

Touching, grasping and manipulating virtual objects are fundamental interactions in \AR (\secref[related_work]{ve_tasks}) and essential for many of its applications (\secref[related_work]{ar_applications}). Manipulation of virtual objects is achieved using a virtual hand interaction technique that represents the user's hand in the \VE and simulates its interaction with virtual objects (\secref[related_work]{ar_virtual_hands}). The visual feedback of the virtual hand is a key element for interacting with and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}. Some work has also investigated the visual feedback of the virtual hand in \AR, but either outside the context of virtual object manipulation \cite{al-kalbani2016analysis,yoon2020evaluating} or limited to a single visual hand augmentation \cite{piumsomboon2014graspshell,blaga2017usability,maisto2017evaluation}. \Gls{OST}-\AR also differs perceptually from \VR in significant ways, due to the lack of mutual occlusion between the real hand and the virtual object (\secref[related_work]{ar_displays}) and the inherent delays between the user's hand motion and the result of the interaction simulation (\secref[related_work]{ar_virtual_hands}).

In this chapter, we investigate the \textbf{visual rendering of the virtual hand as an augmentation of the real hand} for direct hand manipulation of virtual objects with an \OST-\AR headset. To this end, we selected from the literature and compared the most popular visual hand augmentations used to interact with virtual objects in \AR. With these visual renderings, the virtual hand is \textbf{displayed superimposed} on the user's hand, providing \textbf{feedback on the tracking} of the real hand, as shown in \figref{hands}. The movement of the virtual hand is also \textbf{constrained to the surface} of the virtual object, providing additional \textbf{feedback on the interaction} with the virtual object, as illustrated by the sketch following \figref{hands}. We \textbf{evaluate in a user study}, using the \OST-\AR headset Microsoft HoloLens~2, the effect of six visual hand augmentations on user performance and experience in two representative manipulation tasks, push-and-slide and grasp-and-place, performed on a virtual object directly with the hand.

\noindentskip The main contributions of this chapter are:
\begin{itemize}
\item A comparison, drawn from the literature, of six common visual hand augmentations used to interact with virtual objects in \AR.
\item A user study with 24 participants evaluating the performance and user experience of the six visual hand augmentations, rendered as augmentations of the real hand, during free and direct hand manipulation of virtual objects in \OST-\AR.
\end{itemize}

\noindentskip In the next sections, we first present the six visual hand augmentations we gathered from the literature. We then describe the experimental setup and design, the two manipulation tasks, and the metrics used. Finally, we present the results of the user study and discuss their implications for the manipulation of virtual objects directly with the hand in \AR.

\bigskip
\begin{subfigs}{hands}{The six visual hand augmentations rendered as augmentations of the real hand.}[ As seen by the user through the \AR headset during the two-finger grasping of a virtual cube. ][ \item No visual rendering \level{(None)}. \item Cropped virtual content to enable hand-cube occlusion \level{(Occlusion, Occl)}. \item Rings on the fingertips \level{(Tips)}. \item Thin outline of the hand \level{(Contour, Cont)}.
\item Fingers' joints and phalanges \level{(Skeleton, Skel)}. \item Semi-transparent \ThreeD hand model \level{(Mesh)}. ] \subfig[0.22]{method/hands-none} \subfig[0.22]{method/hands-occlusion} \subfig[0.22]{method/hands-tips} \par \subfig[0.22]{method/hands-contour} \subfig[0.22]{method/hands-skeleton} \subfig[0.22]{method/hands-mesh} \end{subfigs}
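As an illustration only, and not the implementation used in this work, the following minimal sketch shows how a tracked fingertip position can be constrained to the surface of a virtual object, here approximated by a sphere, so that the rendered hand augmentation rests on the object instead of penetrating it. The function and variable names are hypothetical.

\begin{verbatim}
# Minimal sketch (illustrative, not the study's implementation):
# constrain a tracked fingertip to the surface of a virtual object,
# approximated here by a bounding sphere.
import numpy as np

def constrain_to_sphere(tracked_tip, center, radius):
    """Return the rendered fingertip position: identical to the
    tracked position outside the object, projected onto the surface
    when the tracked position penetrates the sphere."""
    offset = tracked_tip - center
    distance = np.linalg.norm(offset)
    if distance >= radius or distance == 0.0:
        return tracked_tip  # no penetration: follow the real hand
    # penetration: clamp the rendered fingertip to the surface
    return center + offset / distance * radius

# Example: a fingertip tracked 3 cm from the center of a 5 cm sphere
# is rendered on the surface, 5 cm from the center.
tip = np.array([0.0, 0.0, 0.03])
print(constrain_to_sphere(tip, center=np.zeros(3), radius=0.05))
\end{verbatim}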