diff --git a/1-background/introduction/figures/contributions.odg b/1-background/introduction/figures/contributions.odg
index 0ac6e21..e3bf27b 100644
Binary files a/1-background/introduction/figures/contributions.odg and b/1-background/introduction/figures/contributions.odg differ
diff --git a/1-background/introduction/figures/contributions.pdf b/1-background/introduction/figures/contributions.pdf
index 565728a..50c6875 100644
Binary files a/1-background/introduction/figures/contributions.pdf and b/1-background/introduction/figures/contributions.pdf differ
diff --git a/1-background/introduction/figures/interaction-loop.odg b/1-background/introduction/figures/interaction-loop.odg
index 049f9b6..aa4b5f7 100644
Binary files a/1-background/introduction/figures/interaction-loop.odg and b/1-background/introduction/figures/interaction-loop.odg differ
diff --git a/1-background/introduction/figures/interaction-loop.pdf b/1-background/introduction/figures/interaction-loop.pdf
index 6d54bcd..55f14ca 100644
Binary files a/1-background/introduction/figures/interaction-loop.pdf and b/1-background/introduction/figures/interaction-loop.pdf differ
diff --git a/1-background/introduction/figures/rv-continuum.odg b/1-background/introduction/figures/rv-continuum.odg
index f18b623..0cc6877 100644
Binary files a/1-background/introduction/figures/rv-continuum.odg and b/1-background/introduction/figures/rv-continuum.odg differ
diff --git a/1-background/introduction/figures/rv-continuum.pdf b/1-background/introduction/figures/rv-continuum.pdf
index fecc69c..24fc1ba 100644
Binary files a/1-background/introduction/figures/rv-continuum.pdf and b/1-background/introduction/figures/rv-continuum.pdf differ
diff --git a/1-background/introduction/figures/visuo-haptic-rv-continuum4.odg b/1-background/introduction/figures/visuo-haptic-rv-continuum4.odg
new file mode 100644
index 0000000..2e09a3f
Binary files /dev/null and b/1-background/introduction/figures/visuo-haptic-rv-continuum4.odg differ
diff --git a/1-background/introduction/figures/visuo-haptic-rv-continuum5.odg b/1-background/introduction/figures/visuo-haptic-rv-continuum5.odg
new file mode 100644
index 0000000..0100bd2
Binary files /dev/null and b/1-background/introduction/figures/visuo-haptic-rv-continuum5.odg differ
diff --git a/1-background/introduction/figures/visuo-haptic-rv-continuum5.pdf b/1-background/introduction/figures/visuo-haptic-rv-continuum5.pdf
new file mode 100644
index 0000000..dd67b65
Binary files /dev/null and b/1-background/introduction/figures/visuo-haptic-rv-continuum5.pdf differ
diff --git a/1-background/introduction/introduction.tex b/1-background/introduction/introduction.tex
index bb32478..3a4e58f 100644
--- a/1-background/introduction/introduction.tex
+++ b/1-background/introduction/introduction.tex
@@ -3,33 +3,35 @@
\chaptertoc

-This thesis, entitled \enquote{\ThesisTitle}, shows how wearable haptics, worn on the outside of the hand, can improve direct bare-hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.
+This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
+Our goal is to enable a more coherent and seamless visuo-haptic perception of the virtual content as well as a more natural and effective manipulation of it.
+%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
\section{Visual and Haptic Object Augmentations}
\label{visuo_haptic_augmentations}

\subsectionstarbookmark{Hand Interaction with Everyday Objects}

-In daily life, we simultaneously look and touch the everyday objects around us without even thinking about it.
-Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture \cite{baumgartner2013visual}.
-But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
+In daily life, \textbf{we simultaneously look at, touch and manipulate the everyday objects} around us without even thinking about it.
+Many properties of these objects can be perceived in a complementary way through all our sensory modalities, such as their shape, size or texture \cite{baumgartner2013visual}.
+Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
Information from different sensory sources may be complementary, redundant or contradictory \cite{ernst2004merging}.
This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations.
-We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.
+We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.

The sense of touch also allows us to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
This is due to the many sensory receptors distributed throughout our hands and body, and which can be divided into two modalities: kinesthetic (or proprioception), which are the forces felt by muscles and tendons, and cutaneous (or tactile), which are the pressures, stretches, vibrations and temperatures felt by the skin.
-This rich and complex variety of actions and sensations makes it particularly difficult to artificially recreate capabilities of touch, for example in virtual or remote operating environments \cite{culbertson2018haptics}.
+This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate the capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}.

\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}

-\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch.
-Haptic devices can be categorized according to how they interface with a user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories}.
-Graspable interfaces are the traditional haptic devices that are held in the hand.
+\textbf{\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch} \cite{klatzky2013haptic}.
+Haptic devices can be categorized according to how they interface with a user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}.
+\emph{Graspable interfaces} are the traditional haptic devices that are held in the hand.
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
-Touchable interfaces are actuated devices that are directly touched and that can dynamically change their shape or surface property, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
+\emph{Touchable interfaces} are actuated devices that are directly touched and that can dynamically change their shape or surface property, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
-Instead, wearable interfaces are directly mounted on the body to provide cutaneous sensations on the skin in a portable way and without restricting the user's movements \cite{pacchierotti2017wearable}.
+Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}.

\begin{subfigs}{haptic-categories}{
Haptic devices can be classified into three categories according to their interface with the user:
@@ -43,10 +45,10 @@ Instead, wearable interfaces are directly mounted on the body to provide cutaneo
\subfig[0.25]{culbertson2018haptics-wearable}
\end{subfigs}

-A wide range of wearable haptic devices have been developed to provide the user with rich virtual haptic sensations, including normal force, skin stretch, vibration and thermal feedback.
+A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback.
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
-Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions \cite{pacchierotti2017wearable,culbertson2018haptics}.
-But their use in combination with \AR has been little explored so far.
+Their portability, \ie their \textbf{small form factor, light weight and unobtrusiveness}, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, augmented and virtual realities, and social interactions \cite{pacchierotti2017wearable,culbertson2018haptics}.
+However, their \textbf{integration with \AR has been little explored so far}, and few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.

\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.
@@ -65,42 +67,45 @@ But their use in combination with \AR has been little explored so far.
\subsectionstarbookmark{Augmented Reality Is Not Only Visual}

-\AR integrates virtual content into the real world perception, creating the illusion of a unique \AE.
-It thus promises natural and seamless interaction with the physical and digital objects (and their combination) directly with our hands.
-It is technically and conceptually closely related to \VR, which replaces the \RE perception with a \VE.
+\textbf{\emph{Augmented Reality (\AR)} integrates virtual content into the perception of the real world, creating the illusion of a unique \emph{augmented environment}} \cite{azuma1997survey,skarbez2021revisiting}.
+It thus promises natural and seamless interaction with the physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}.
+It is technically and conceptually closely related to \emph{\VR}, which completely replaces the perception of the \emph{\RE} with a \emph{\VE}.
+
\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects to a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
-Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still a debate on defining and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
+Between these two extremes lies \MR, which comprises \textbf{\AR and \VR as different levels of mixing real and virtual environments} \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still a debate on how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
+
+We use the term \AR/\VR \emph{system} for the computational set of hardware (input devices, sensors and displays) and software (tracking, simulation and rendering) that allows the user to interact with the \VE.% by implementing the interaction loop we proposed in \figref{interaction-loop}.
+\AR and \VR systems can address any of the human senses, but they most often focus on visual augmentations \cite[p.144]{billinghurst2015survey}.
+Many visual displays have been explored, from projection systems to hand-held displays, but the most \textbf{promising devices are \AR headsets}, which are \textbf{portable displays worn directly on the head}, providing the user with an \textbf{immersive visual augmented environment}.

\begin{subfigs}{rv-continuums}{Reality-virtuality continuums.
}[][
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.
\item Extension to include the haptic sense on a second, orthogonal axis, proposed by and adapted from \textcite{jeon2009haptic}.
]
- \subfig[0.44]{rv-continuum}
- \subfig[0.54]{visuo-haptic-rv-continuum3}
+ \subfig[0.45]{rv-continuum}
+ \subfig[0.53]{visuo-haptic-rv-continuum5}
\end{subfigs}

-\AR/\VR can also be extended to render for sensory modalities other than vision.
-\textcite{jeon2009haptic} proposed extending the reality-virtuality continuum to include haptic feedback by decoupling into two orthogonal haptic and visual axes (\figref{visuo-haptic-rv-continuum3}).
-The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic axis: real, augmented and virtual.
-For example, a visual \AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a haptic \RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a haptic \VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
-Haptic \AR is then the combination of real and virtual haptic stimuli \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
-In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using wearable haptics.
-\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
-\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger on a tangible surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum3}).
+%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
+\textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR systems with two orthogonal reality-virtuality continuums, one for vision and one for touch, as illustrated in \figref{visuo-haptic-rv-continuum5}.
+The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual.
+For example, (visual) \AR that uses a \textbf{real object as a proxy to manipulate a \VO is considered \emph{haptic reality}} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides \textbf{synthetic haptic feedback when touching a \VO is considered \emph{haptic virtuality}} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}).
+\textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli} \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}).
+In particular, it has been implemented by augmenting the haptic perception of real objects with timely virtual tactile stimuli delivered by wearable haptics:
+\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}).
+\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger on a real surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum5}).
-Current \AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching the \VE with the hand.
-All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and interact with them with confidence and efficiency.
-It is therefore necessary to provide haptic feedback that is consistent with the visual \AE and ensures the best possible user experience.
-The integration of wearable haptics with \AR seems to be one of the most promising solutions, but it remains challenging due to their many respective characteristics and the additional constraints of combining them.
+When a (visual) \AR system lacks haptic feedback, it creates a deceptive and incomplete user experience when reaching into the \VE with the hand.
+All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and interact with them with confidence and efficiency.
+It is therefore necessary to provide haptic feedback that is consistent with the visual \VOs and ensures the best possible user experience, as we argue in the next section.
+The \textbf{integration of wearable haptics with \AR} seems to be one of the most promising solutions, but it \textbf{remains challenging due to their many respective characteristics and the additional constraints of combining them}.

\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with different degrees of reality-virtuality.
}[][
- \item \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
+ \item \AR environment with a real haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
\item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
- \item A tangible object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
- \item Visuo-haptic texture augmentation of tangible object being touch, using a hand-held \AR display and haptic electrovibration feedback \cite{bau2012revel}.
+ \item A real object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
+ \item Visuo-haptic texture augmentation of a real object being touched, using a hand-held \AR display and haptic electrovibration feedback \cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}
+% \ie sensing the \AE and acting effectively upon it. +in this thesis, we propose to \textbf{represent the experience of a user with such a visuo-haptic augmented environment as an interaction loop}, shown in \figref{interaction-loop}. +It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}. The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs. -The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an \AR headset and a wearable haptic device. -Because the visuo-haptic \VE is displayed in real time, colocalized and aligned with the real one, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE. +The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using a \AR/\VR headset and wearable haptics. +Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE. \fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment.}[ One interact with the visual (in blue) and haptic (in red) \VE through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs. @@ -129,29 +133,30 @@ Because the visuo-haptic \VE is displayed in real time, colocalized and aligned In this context, we identify two main research challenges that we address in this thesis: \textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and \textbf{(II) enabling effective manipulation of the augmented environment}. -Each of these challenges also raises numerous design, technical and human issues specific to each of the two types of feedback, wearable haptics and immersive \AR, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic \AE. +Each of these challenges also raises numerous design, technical and human issues specific to wearable haptics and immersive \AR, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic \AE. -\subsectionstarbookmark{Challenge I: Provide Plausible and Coherent Visuo-Haptic Augmentations} +\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations} -Many haptic devices have been designed and evaluated specifically for use in \VR, providing realistic and varied kinesthetic and tactile feedback to \VOs. -Although closely related, (visual) \AR and \VR have key differences in their respective renderings that can affect user perception. +\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing realistic and varied kinesthetic and tactile feedback to \VOs. +Although closely related, \AR and \VR have key differences in their respective renderings that can affect user perception. -In \AR, the user can still see the real-world surroundings, including their hands, the augmented tangible objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering of the hand and \VE. 
-In \AR, the user can still see the real-world surroundings, including their hands, the augmented tangible objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering of the hand and \VE.
-As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
-Moreover, many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
-The user's hand must be indeed free to touch and interact with the \RE while wearing a wearable haptic device.
-It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement haptic augmentations, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
+%As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
+
+Many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
+The \textbf{user's hand must indeed remain free to touch and interact with the \RE while wearing a wearable haptic device}.
+It is possible instead to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not.
It remains to be investigated how such potential discrepancies affect the overall perception to design visuo-haptic augmentations adapted to \AR.

So far, \AR studies and applications often only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
-These added virtual sensations can therefore be perceived as out of sync or even inconsistent with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
+These \textbf{added virtual sensations can therefore be perceived as inconsistent} with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
+Moreover, the user can still see the real-world surroundings, including their hands, the augmented real objects and the worn haptic devices, unlike in \VR, where there is total control over the visual rendering. % of the hand and \VE.
+It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and to what extent they will conflict or complement each other. % in the perception of the \AE.
+With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.

-\subsectionstarbookmark{Challenge II: Enable Effective Manipulation of the Augmented Environment}
+\subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment}

-Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviolajr20173d}.
+Touching, \textbf{grasping and manipulating \VOs are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}.
As the hand is not occupied or covered with a haptic device to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real.
Thus, augmenting a real object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.