diff --git a/1-background/related-work/1-haptic-hand.tex b/1-background/related-work/1-haptic-hand.tex
index 972c1f1..1a6deed 100644
--- a/1-background/related-work/1-haptic-hand.tex
+++ b/1-background/related-work/1-haptic-hand.tex
@@ -4,99 +4,109 @@ % describe how the hand senses and acts on its environment
to perceive the haptic properties of real everyday objects.
The haptic sense has specific characteristics that make it unique in regard to other senses.
-It enables us to perceive a large diversity of properties of the everyday objects, up to a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but particularly in the hand.
-It also allows us to act on these objects with the hand, to come into contact with them, to grasp them, to actively explore them, and to manipulate them.
-This implies that the haptic perception is localized at the points of contact between the hand and the environment, \ie we cannot haptically perceive an object without actively touching it.
-These two mechanisms, \emph{action} and \emph{perception}, are therefore closely associated and both are essential to form the haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.
+It enables us to perceive a large diversity of properties of everyday objects, through a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but especially in the hand \cite{johansson2009coding}.
+It also allows us to act on these objects with the hand, to come into contact with them, to grasp them and to actively explore them. % , and to manipulate them.
+%This implies that the haptic perception is localized at the points of contact between the hand and the environment, \ie we cannot haptically perceive an object without actively touching it.
+These two mechanisms, \emph{action} and \emph{perception}, are closely associated and both are essential to form a complete haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.

\subsection{The Haptic Sense}
\label{haptic_sense}

-Perceiving the properties of an object involves numerous sensory receptors embedded in the skin, but also in the muscles and joints of the hand, and distributed across the body. They can be divided into two main modalities: \emph{cutaneous} and \emph{kinesthetic}.
+Perceiving the properties of an object involves numerous sensory receptors embedded in the skin, but also in the muscles and joints of the hand, and distributed throughout the body. They can be divided into two main modalities: \emph{cutaneous} and \emph{kinesthetic} \cite{lederman2009haptic}.

-\subsubsection{Cutaneous Sensitivity}
+\subsubsection{Cutaneous Modality}
\label{cutaneous_sensitivity}

-Cutaneous haptic receptors are specialized nerve endings implanted in the skin that respond differently to the various stimuli applied to the skin. \figref{blausen2014medical_skin} shows the location of the four main cutaneous receptors that respond to mechanical deformation of the skin.
+Cutaneous haptic receptors are specialized nerve endings embedded in the skin that respond differently to the various stimuli applied to it.
+\figref{blausen2014medical_skin} shows the location of the four main cutaneous \emph{mechanoreceptors} that respond to mechanical deformation of the skin.
-\fig[0.6]{blausen2014medical_skin}{Schema of cutaneous mechanoreceptors in a section of the skin \cite{blausen2014medical}.}
+\fig[0.6]{blausen2014medical_skin}{Diagram of cutaneous mechanoreceptors in a section of the skin \cite{blausen2014medical}.}

-Adaptation rate and receptor size are the two key characteristics that respectively determine the temporal and spatial resolution of these \emph{mechanoreceptors}, as summarized in \tabref{cutaneous_receptors}.
+Adaptation rate and receptor size are the two key characteristics that respectively define the \emph{temporal and spatial sensitivity} of mechanoreceptors, \ie their ability to detect changes in stimuli over time and space \cite{johansson2009coding}.
+They are summarized in \tabref{cutaneous_receptors}.
The \emph{adaptation rate} is the speed and duration of the response to a stimulus.
-Meissner and Pacinian receptors, known as fast-adapting (FA), respond rapidly to a stimulus but stop quickly even though the stimulus is still present, allowing the detection of high-frequency changes.
-In contrast, Merkel and Ruffini receptors, known as slow-adapting (SA), have a slower but continuous response to a static, prolonged stimulus.
+Meissner and Pacinian receptors, known as fast-adapting, respond rapidly to a stimulus but stop quickly even though the stimulus is still present, allowing the detection of high-frequency changes.
+In contrast, Merkel and Ruffini receptors, known as slow-adapting, have a slower but continuous response to a static, prolonged stimulus.
The \emph{size of the receptor} determines the area of skin that can be sensed by a single nerve ending.
-Meissner and Merkel receptors have a small detection area (named Type I) and are sensitive to fine skin deformations, while Ruffini and Pacinian receptors have a larger detection area (named Type II).
+Meissner and Merkel receptors have a small detection area and are sensitive to fine skin deformations.
+Conversely, Ruffini and Pacinian receptors have a larger detection area, which means they can detect more distant events on the skin, but at the cost of a poorer spatial sensitivity.

-The density of mechanoreceptors varies according to skin type and body region.
-\emph{Glabrous skin}, especially on the face, feet, hands, and more importantly, the fingers, is particularly rich in cutaneous receptors, giving these regions great tactile sensitivity.
-The density of the Meissner and Merkel receptors, which are the most sensitive, is notably high in the fingertips \cite{johansson2009coding}.
+The density of mechanoreceptors varies according to skin type and body region \cite{johansson2009coding}.
+\emph{Glabrous skin} is particularly rich in cutaneous receptors, especially on the face, feet, hands and fingers.
+The density of Meissner and Merkel receptors, which are the most sensitive, is particularly high in the fingertips.
Conversely, \emph{hairy skin} is less sensitive and does not contain Meissner receptors, but has additional receptors at the base of the hairs, as well as receptors known as C-tactile, which are involved in pleasantness and affective touch \cite{ackerley2014touch}.

There are also two types of thermal receptors implanted in the skin, which respond to increases or decreases in skin temperature, respectively, providing sensations of warmth or cold \cite{lederman2009haptic}.
Finally, free nerve endings (without specialized receptors) provide information about pain \cite{mcglone2007discriminative}.
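+As a minimal illustration of how these response bands can inform the choice of vibrotactile stimuli, the following sketch maps a vibration frequency to the channel most associated with it (the band edges are simplified from \tabref{cutaneous_receptors}; real channels overlap and also depend on amplitude, location and contact area, and Ruffini receptors, which respond to skin stretch rather than vibration, are omitted):
+\begin{verbatim}
+# Illustrative sketch, not a physiological model: band edges are
+# simplified from the mechanoreceptor table; real channels overlap.
+def dominant_channel(frequency_hz: float) -> str:
+    """Mechanoreceptor channel most associated with a vibration frequency."""
+    if frequency_hz <= 5:
+        return "Merkel"    # slow-adapting, small field: pressure
+    if frequency_hz <= 50:
+        return "Meissner"  # fast-adapting, small field: edges, flutter
+    if frequency_hz <= 400:
+        return "Pacinian"  # fast-adapting, large field: vibration
+    return "above reported bands"
+
+assert dominant_channel(250.0) == "Pacinian"
+\end{verbatim}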
\begin{tab}{cutaneous_receptors}{Characteristics of the cutaneous mechanoreceptors.}[
-    Adaptation rate is the speed and duration of the receptor's response to a stimulus. Receptive size is the area of skin detectable by a single receptor. Sensitivities are the stimuli detected by the receptor. Adapted from \textcite{mcglone2007discriminative} and \textcite{johansson2009coding}.
+    Adaptation rate is the speed and duration of the receptor's response to a stimulus, either fast or slow adapting.
+    Receptor size is the area of skin detectable by a single receptor, either small or large.
+    Sensitivity describes the spatial and temporal variation of stimuli detected by the receptor.
+    Adapted from \textcite{mcglone2007discriminative} and \textcite{johansson2009coding}.
]
    \begin{tabularx}{\linewidth}{p{1.7cm} p{2cm} p{2cm} X}
        \toprule
-        \textbf{Receptor} & \textbf{Adaptation Rate} & \textbf{Receptive Size} & \textbf{Sensitivities} \\
+        \textbf{Receptor Name} & \textbf{Adaptation Rate} & \textbf{Receptor Size} & \textbf{Sensitivity} \\
        \midrule
        Meissner & Fast & Small & Discontinuities (\eg edges), medium-frequency vibration (\qtyrange{5}{50}{\Hz}) \\
        Merkel & Slow & Small & Pressure, low-frequency vibration (\qtyrange{0}{5}{\Hz}) \\
-        Pacinian & Fast & Large & High-frequency vibration (\qtyrange{40}{400}{\Hz}) \\
+        Pacinian & Fast & Large & High-frequency vibration (\qtyrange{40}{400}{\Hz}) from distant events \\
        Ruffini & Slow & Large & Skin stretch \\
        \bottomrule
    \end{tabularx}
\end{tab}

-\subsubsection{Kinesthetic Sensitivity}
+\subsubsection{Kinesthetic Modality}
\label{kinesthetic_sensitivity}

-Kinesthetic receptors are also mechanoreceptors but are located in the muscles, tendons and joints \cite{jones2006human}.
-The muscle spindles respond to the length and the rate of stretch/contraction of the muscles.
+Kinesthetic receptors are also mechanoreceptors, but are located in muscles, tendons and joints \cite{jones2006human}.
+Muscle spindles respond to the length and rate of stretch/contraction of muscles.
Golgi tendon organs, located at the junction of muscles and tendons, respond to the force developed by the muscles.
-Ruffini and Pacini receptors are found in the joints and respond to joint movement.
-Together, these receptors provide sensory feedback about the movement, speed and strength of the muscles and the rotation of the joints during a movement.
+Ruffini and Pacini receptors are located in the joints and respond to joint movement.
+Together, these receptors provide sensory feedback about the movement, speed and force of the muscles, and about the rotation of the joints.
They can also sense external forces and torques applied to the body.

Kinesthetic receptors are therefore closely linked to the motor control of the body.
By providing sensory feedback in response to the position and movement of our limbs, they enable us to perceive our body in space, a perception called \emph{proprioception}.
This allows us to plan and execute precise movements to touch or grasp a target, even with our eyes closed.
-Cutaneous mechanoreceptors are essential for this perception because any movement of the body or contact with the environment necessarily deforms the skin \cite{johansson2009coding}.
+Cutaneous mechanoreceptors (\secref{cutaneous_sensitivity}) are also involved in proprioception \cite{johansson2009coding}.
\subsection{Hand-Object Interactions}
\label{hand_object_interactions}

-The sense of touch is thus composed of a rich and complex set of various cutaneous and kinesthetic receptors under the skin, in the muscles and in the joints.
+The sense of touch is composed of a rich and complex set of various cutaneous and kinesthetic receptors under the skin, in the muscles and in the joints.
These receptors give the hand its great tactile sensitivity and great dexterity in its movements.

\subsubsection{Sensorimotor Continuum of the Hand}
\label{sensorimotor_continuum}

\textcite{jones2006human} have proposed a sensorimotor continuum of hand functions, from mainly sensory activities to activities with a more important motor component.
-As illustrated in \figref{sensorimotor_continuum}, \Citeauthor{jones2006human} propose to delineate four categories of hand function on this continuum:
+As illustrated in \figref{sensorimotor_continuum}, \textcite{jones2006human} delineate four categories of hand function on this continuum:
\begin{itemize}
-    \item \emph{Passive touch}, or tactile sensing, is the ability to perceive an object through cutaneous sensations with a static hand contact. The object may be moving, but the hand remains static. It allows for relatively good surface perception, \eg in \textcite{gunther2022smooth}.
+    \item \emph{Passive touch}, or tactile sensing, is the ability to perceive an object through cutaneous sensations with a static hand contact. The object may be moving, but the hand remains static. It allows for relatively good surface perception \cite{gunther2022smooth}.
    \item \emph{Exploration}, or active haptic sensing, is the manual and voluntary exploration of an object with the hand, involving all cutaneous and kinesthetic sensations. It enables a more precise perception than passive touch \cite{lederman2009haptic}.
-    \item \emph{Prehension} is the action of grasping and holding an object with the hand. It involves fine coordination between hand and finger movements and the haptic sensations produced.
-    \item \emph{Gestures}, or non-prehensible skilled movements, are motor activities without constant contact with an object. Examples include pointing at a target, typing on a keyboard, accompanying speech with gestures, or signing in sign language, \eg in \textcite{yoon2020evaluating}.
+    \item \emph{Prehension} is the action of grasping and holding an object with the hand. It involves fine coordination between hand and finger movements and the haptic sensations produced \cite{feix2016grasp}.
+    \item \emph{Gestures}, or non-prehensile skilled movements, are motor activities without constant contact with an object. Examples include pointing at a target, typing on a keyboard, accompanying speech with gestures, or signing in sign language \cite{yoon2020evaluating}.
\end{itemize}

\fig[0.65]{sensorimotor_continuum}{
    The sensorimotor continuum of the hand function proposed by and adapted from \textcite{jones2006human}.
}[
    Functions of the hand are classified into four categories based on the relative importance of sensory and motor components.
-    Icons are from \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay} / \href{https://creativecommons.org/licenses/by/3.0/}{CC BY}.
+    \protect\footnotemark
]

This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
-In this thesis, we are interested in exploring visuo-haptic augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) in the context of \AR and wearable haptics. +In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) using immersive \AR and wearable haptics. \subsubsection{Hand Anatomy and Motion} \label{hand_anatomy} +\footnotetext{% + All icons are from \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay} / \href{https://creativecommons.org/licenses/by/3.0/}{CC BY}. +} + Before we describe how the hand is used to explore and grasp objects, we need to look at its anatomy. Underneath the skin, muscles and tendons can actually move because they are anchored to the bones. @@ -112,7 +122,7 @@ Thus the thumb has 5 DoFs, each of the other four fingers has 4 DoFs and the wri This complex structure enables the hand to perform a wide range of movements and gestures. However, the way we explore and grasp objects follows simpler patterns, depending on the object being touched and the aim of the interaction. \begin{subfigs}{hand}{Anatomy and motion of the hand. }[][ - \item Schema of the hand skeleton. Adapted from \textcite{blausen2014medical}. + \item Diagram of the hand skeleton. Adapted from \textcite{blausen2014medical}. \item Kinematic model of the hand with 27 \DoFs \cite{erol2007visionbased}. ] \subfigsheight{58mm} @@ -131,7 +141,7 @@ For the other procedures, the whole hand is used: for example, approaching or po The \emph{enclosure} with the hand makes it possible to judge the general shape and size of the object. It takes only \qtyrange{2}{3}{\s} to perform these procedures, except for contour following, which can take about ten seconds \cite{jones2006human}. -\fig{exploratory_procedures}{Exploratory procedures and their associated object properties (in parentheses). Adapted from \textcite{lederman2009haptic}.} +\fig{exploratory_procedures}{Exploratory procedures and their associated object properties (in brackets). Adapted from \textcite{lederman2009haptic}.} %Le sens haptique seul (sans la vision) nous permet ainsi de reconnaitre les objets et matériaux avec une grande précision. %La reconnaissance des propriété matérielles, \ie la surface et sa texture, rigidité et température est meilleure qu'avec le sens visuel seul. @@ -164,12 +174,13 @@ There are two main types of \emph{perceptual properties}. The \emph{material properties} are the perception of the roughness, hardness, temperature and friction of the surface of the object \cite{bergmanntiest2010tactual}. The \emph{spatial properties} are the perception of the weight, shape and size of the object \cite{lederman2009haptic}. -Each of these properties is closely related to a physical property of the object, which is defined and measurable, but perception is a subjective experience and often differs from this physical measurement. +Each of these properties is closely related to a physical property of the object, which is defined and measurable, but perception is a subjective experience and often differs from this physical measurement \cite{bergmanntiest2010tactual}. Perception also depends on many other factors, such as the movements made and the exploration time, but also on the person, their sensitivity \cite{hollins2000individual} or age \cite{jones2006human}, and the context of the interaction \cite{kahrimanovic2009context,kappers2013haptic}. 
-These properties are described and rated\footnotemark using scales opposing two adjectives such as \enquote{rough/smooth} or \enquote{hot/cold} \cite{okamoto2013psychophysical}.
+These properties are described and rated\footnotemark\ using scales opposing two adjectives such as \enquote{rough/smooth} or \enquote{hot/cold} \cite{okamoto2013psychophysical}.
\footnotetext{All the haptic perception measurements described in this chapter were performed by blindfolded participants, to control for the influence of vision.}
The most salient and fundamental perceived material properties are the roughness and hardness of the object \cite{hollins1993perceptual,baumgartner2013visual}, which are also the most studied and best understood \cite{bergmanntiest2010tactual}.
+To be able to render virtual haptic sensations that reproduce these properties, as we will detail in \secref{tactile_rendering}, it is important to understand how they are perceived when interacting with real objects \cite{klatzky2013haptic}.

\subsubsection{Roughness}
\label{roughness}

@@ -183,13 +194,13 @@ But when running the finger over the surface with a lateral movement (\secref{ex
In particular, when the asperities are smaller than \qty{0.1}{mm}, such as paper fibers, the pressure cues are no longer captured and only the movement, \ie the vibrations, can be used to detect the roughness \cite{hollins2000evidence}.
This limit distinguishes \emph{macro-roughness} from \emph{micro-roughness}.

-The physical properties of the surface determine the haptic perception of roughness.
-The most important characteristic is the density of the surface elements, \ie the spacing between them: The perceived (subjective) intensity of roughness increases with spacing, for macro-roughness \cite{klatzky2003feeling,lawrence2007haptic} and micro-roughness \cite{bensmaia2003vibrations}.
+%The physical properties of the surface determine the haptic perception of roughness.
+The perception of roughness can be characterized by the density of the surface elements, \ie the spacing between them: the perceived (subjective) intensity of roughness increases with this spacing. %, for macro-roughness \cite{klatzky2003feeling,lawrence2007haptic} and micro-roughness \cite{bensmaia2003vibrations}.
For macro-textures, the size of the elements, the force applied and the speed of exploration have limited effects on the intensity perceived \cite{klatzky2010multisensory}: macro-roughness is a \emph{spatial perception}. This allows us to read Braille \cite{lederman2009haptic}.
However, the speed of exploration affects the perceived intensity of micro-roughness \cite{bensmaia2003vibrations}.

-To establish the relationship between spacing and intensity for macro-roughness, patterned textured surfaces were manufactured: as a linear grating (on one axis) composed of ridges and grooves, \eg in \figref{lawrence2007haptic_1} \cite{lederman1972fingertip,lawrence2007haptic}, or as a surface composed of micro conical elements on two axes, \eg in \figref{klatzky2003feeling_1} \cite{klatzky2003feeling}.
+To establish the relationship between element spacing and intensity for macro-roughness, patterned textured surfaces were manufactured: as a linear grating (on one axis) composed of ridges and grooves, \eg in \figref{lawrence2007haptic_1} \cite{lederman1972fingertip,lawrence2007haptic}, or as a surface composed of micro conical elements on two axes, \eg in \figref{klatzky2003feeling_1} \cite{klatzky2003feeling}.
As shown in \figref{lawrence2007haptic_2}, there is a quadratic relationship between the logarithm of the perceived roughness intensity $r$ and the logarithm of the space between the elements $s$ ($a$, $b$ and $c$ are empirical parameters to be estimated) \cite{klatzky2003feeling}:
\begin{equation}{roughness_intensity}
    \log(r) \sim a \, \log(s)^2 + b \, \log(s) + c
@@ -197,7 +208,7 @@ As shown in \figref{lawrence2007haptic_2}, there is a quadratic relationship bet
A larger spacing between elements increases the perceived roughness, but plateaus beyond \qty{\sim 5}{\mm} for the linear grating \cite{lawrence2007haptic}, while the roughness decreases beyond \qty{\sim 2.5}{\mm} \cite{klatzky2003feeling} for the conical elements.

\begin{subfigs}{lawrence2007hapti}{Estimation of haptic roughness of a linear grating surface by active exploration \cite{lawrence2007haptic}. }[][
-    \item Schema of a linear grating surface, composed of ridges and grooves.
+    \item Diagram of a linear grating surface, composed of ridges and grooves.
    \item Perceived intensity of roughness (vertical axis) of the surface as a function of the size of the grooves (horizontal axis, interval of \qtyrange{0.125}{4.5}{mm}), the size of the ridges (RW, circles and squares) and the mode of exploration (with the finger in white and via a rigid probe held in hand in black).
]
    \subfigsheight{56mm}
@@ -221,15 +232,21 @@ However, as the speed of exploration changes the transmitted vibrations, a faste
Even when the fingertips are deafferented (absence of cutaneous sensations), the perception of roughness is maintained \cite{libouton2012tactile}, thanks to the propagation of vibrations in the finger, hand and wrist, for both pattern and \enquote{natural} everyday textures \cite{delhaye2012textureinduced}.
The spectrum of vibrations shifts to higher frequencies as the exploration speed increases, but the brain integrates this change with proprioception to maintain a \emph{constant perception} of the texture.
-For patterned textures, as illustrated in \figref{delhaye2012textureinduced}, the ratio of the finger speed $v$ to the frequency of the vibration intensity peak $f_p$ is measured most of the time equal to the period $\lambda$ of the spacing of the elements:
+For patterned textures, as illustrated in \figref{delhaye2012textureinduced}, the ratio of the finger speed $\dot{x}$ to the frequency of the vibration intensity peak $f_p$ is most often measured to be equal to the period $\lambda$ of the element spacing:
\begin{equation}{grating_vibrations}
-    \lambda \sim \frac{v}{f_p}
+    \lambda \sim \frac{\dot{x}}{f_p}
\end{equation}
+For example, scanning a grating with a period of \qty{1}{mm} at \qty{100}{mm/s} produces a vibration intensity peak near \qty{100}{Hz}, within the Pacinian range (\tabref{cutaneous_receptors}).

The vibrations generated by exploring everyday textures are also specific to each texture and similar between individuals, making them identifiable by vibration alone \cite{manfredi2014natural,greenspon2020effect}.
This shows the importance of vibration cues even for macro textures and the possibility of generating virtual texture sensations with vibrotactile rendering.

-\fig[0.55]{delhaye2012textureinduced}{Speed of finger exploration (horizontal axis) on grating textures with different periods $\lambda$ of spacing (in color) and frequency of the vibration intensity peak $f_p$ propagated in the wrist (vertical axis) \cite{delhaye2012textureinduced}.}
+\begin{subfigs}{delhaye2012textureinduced}{Scanning grating textures with different periods $\lambda$ of spacing with the fingertip, and measurement of the propagated vibrations in the wrist \cite{delhaye2012textureinduced}. }[][
+    \item Experimental setup.
+    \item Speed of finger exploration (horizontal axis) for texture spacings (in color) and frequency of the vibration intensity peak $f_p$ propagated in the wrist (vertical axis).
+]
+    \subfig[.43]{delhaye2012textureinduced_1}
+    \subfig[.53]{delhaye2012textureinduced_2}
+\end{subfigs}

Everyday textures are more complex to study because they are composed of multiple elements of different sizes and spacings.
In addition, the perceptions of micro and macro roughness overlap and are difficult to distinguish \cite{okamoto2013psychophysical}.

@@ -247,30 +264,32 @@ When the finger presses on an object (\figref{exploratory_procedures}), its surf
When the surface is touched or tapped, vibrations are also transmitted to the skin \cite{higashi2019hardness}.
Passive touch (without voluntary hand movements) and tapping allow a perception of hardness as good as active touch \cite{friedman2008magnitude}.

-Two physical properties determine the haptic perception of hardness: its stiffness and elasticity, as shown in \figref{hardness} \cite{bergmanntiest2010tactual}.
-The \emph{stiffness} $k$ of an object is the ratio between the applied force $F$ and the resulting \emph{displacement} $D$ of the surface:
-\begin{equation}{stiffness}
-    k = \frac{F}{D}
-\end{equation}
-
-The \emph{elasticity} of an object is expressed by its Young's modulus $Y$, which is the ratio between the applied pressure (the force $F$ per unit area $A$) and the resulting deformation $D / l$ (the relative displacement) of the object:
+Perceived hardness is related to the \emph{physical elasticity} of the material and to the structure of the object \cite{bergmanntiest2009cues}.
+The physical elasticity of a material can be expressed by its Young's modulus $E$ (in \unit{Pa}), which is the ratio between stress $\sigma$ and strain $\varepsilon$ (\figref{hardness}):
\begin{equation}{young_modulus}
-    Y = \frac{F / A}{D / l}
+    E = \frac{\sigma}{\varepsilon} = \frac{F / A}{\Delta L / L}
\end{equation}
+Stress is the force per unit area $F / A$ (in \unit{N/m^2}) applied to the object, and strain is the relative displacement $\Delta L / L$ (in \unit{m/m}) of the object.
+
+The physical resistance to deformation of an object can also be expressed as its \emph{stiffness} $k$ (in \unit{N/mm}), which is the ratio between the applied force $F$ and the resulting displacement $\Delta L$ along one axis of deformation (\figref{hardness}):
+\begin{equation}{stiffness}
+    k = \frac{F}{\Delta L}
+\end{equation}
+Stiffness depends on the structure of the object: with the same applied force and the same physical elasticity, a thick object is compressed over a greater distance than a thin one.

\begin{subfigs}{stiffness_young}{Perceived hardness of an object by finger pressure. }[][
-    \item Diagram of an object with a stiffness coefficient $k$ and a length $l$ compressed by a force $F$ on an area $A$ by a distance $D$.
+    \item Diagram of an object with a stiffness coefficient $k$ and a length $L$ compressed along one axis by a force $F$ applied on an area $A$, by a distance $\Delta L$.
    \item Identical perceived hardness intensity between Young's modulus (horizontal axis) and stiffness (vertical axis). The dashed and dotted lines indicate the objects tested, the arrows the correspondences made between these objects, and the grey lines the predictions of the quadratic relationship \cite{bergmanntiest2009cues}.
]
    \subfig[.3]{hardness}
    \subfig[.45]{bergmanntiest2009cues}
\end{subfigs}

-\textcite{bergmanntiest2009cues} showed the role of these two physical properties in the perception of hardness.
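+For a uniform object of length $L$ and cross-section $A$, combining \eqref{young_modulus} and \eqref{stiffness} makes this dependence on structure explicit (a standard textbook relation, given here as an illustration):
+\begin{equation}{stiffness_structure}
+    k = \frac{F}{\Delta L} = \frac{E \, A}{L}
+\end{equation}
+With the same material (same $E$) and the same force, a thicker (longer) object thus has a lower stiffness and is compressed by a greater distance, while a larger cross-section $A$ increases the stiffness.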
+%\textcite{bergmanntiest2009cues} showed how of these two physical measures in the perception of hardness.
+An object with a low stiffness but a high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
With finger pressure, a relative difference (the \emph{Weber fraction}) of \percent{\sim 15} is required to discriminate between two objects of different stiffness or elasticity.
However, in the absence of pressure sensations (by placing a thin disc between the finger and the object), the necessary relative difference becomes much larger (Weber fraction of \percent{\sim 50}).
-Thus, the perception of hardness relies on \percent{90} on surface deformation cues and \percent{10} on displacement cues.
-In addition, an object with low stiffness but high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
+From this, \textcite{bergmanntiest2009cues} showed that the perception of hardness relies \percent{90} on surface deformation cues and \percent{10} on displacement cues.
%Finally, when pressing with the finger, the perceived hardness intensity $h$ follows a power law with the stiffness $k$ \cite{harper1964subjective}:
%\begin{equation}{hardness_intensity}
%    h = k^{0.8}
%\end{equation}
@@ -363,8 +382,8 @@ In addition, an object with low stiffness but high Young's modulus can be percei

\subsection{Conclusion}
\label{haptic_sense_conclusion}

-Haptic perception and manipulation of objects with the hand involves several simultaneous mechanisms with complex interactions.
+Haptic perception and manipulation of objects with the hand involve several simultaneous mechanisms with complex interactions.
Exploratory movements of the hand are performed on contact with the object to obtain rich sensory information from several cutaneous and kinesthetic receptors.
These sensations express physical parameters in the form of perceptual cues, which are then integrated to form a perception of the property being explored.
-It is often the case that one perceptual cue is particularly important in the perception of a property, but perceptual constancy is possible by compensating for its absence with others.
+For the perception of roughness (texture) or hardness, one perceptual cue is particularly important, but perceptual constancy is possible by compensating for its absence with others.
In turn, these perceptions help to guide the grasping and manipulation of the object by adapting the grasp type and the forces applied to the shape of the object and the task to be performed.

diff --git a/1-background/related-work/2-wearable-haptics.tex b/1-background/related-work/2-wearable-haptics.tex
index a429ad2..3d13e0f 100644
--- a/1-background/related-work/2-wearable-haptics.tex
+++ b/1-background/related-work/2-wearable-haptics.tex
@@ -1,11 +1,11 @@
\section{Augmenting Object Perception with Wearable Haptics}
\label{wearable_haptics}

-One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in a visual \VE \cite{maclean2008it,culbertson2018haptics}.
-Due to the high complexity of the haptic sense and the variety of sensations it can feel, haptic actuators and renderings are designed to only address a subset of these sensations.
-While it is challenging to create a realistic haptic experience, it is more important to provide the right sensory stimulus \enquote{at the right moment and at the right place} \cite{hayward2007it}.
+Haptic systems aim to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects \cite{maclean2008it,culbertson2018haptics}.
+Due to the high complexity of the haptic sense and the variety of sensations it can feel, haptic actuators and renderings are designed to address only a subset of these sensations.
+While it is challenging to create a realistic haptic experience, \ie one that reproduces the real object interaction with high fidelity \cite{unger2011roughness,culbertson2015should}, it is more important to provide the right sensory stimulus \enquote{at the right moment and at the right place} \cite{hayward2007it}.

-Moreover, a haptic augmentation system should \enquote{modulating the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
+A haptic augmentation system \enquote{modulates the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
The haptic system should be hand-held or worn, \eg on the hand, and \enquote{not permanently attached to or integrated in the object} \cite{bhatia2024augmenting}.

\subsection{Level of Wearability}
@@ -39,7 +39,7 @@ Such \emph{body-grounded} devices are often heavy and bulky and cannot be consid
\textcite{pacchierotti2017wearable} defined that: \enquote{A wearable haptic interface should also be small, easy to carry, comfortable, and it should not impair the motion of the wearer}.
An approach is then to move the grounding point close to the end-effector (\figref{pacchierotti2017wearable_3}): the interface is limited to cutaneous haptic feedback, but its design is more compact, lightweight, comfortable and portable, \eg in \figref{grounded_to_wearable}.
-Moreover, as detailed in \secref{object_properties}, cutaneous sensations are necessary and often sufficient for the perception of the haptic properties of an object explored with the hand, as also argued by \textcite{pacchierotti2017wearable}.
+Moreover, as detailed in \secref{object_properties}, cutaneous sensations are necessary and often sufficient for the perception of the haptic properties of an object explored with the hand \cite{pacchierotti2017wearable}.

\begin{subfigs}{grounded_to_wearable}{Haptic devices for the hand with different wearability levels. }[][
    \item Teleoperation of a virtual cube grasped with the thumb and index fingers each attached to a grounded haptic device \cite{pacchierotti2015cutaneous}.
@@ -58,8 +58,9 @@ Moreover, as detailed in \secref{object_properties}, cutaneous sensations are ne
\label{wearable_haptic_devices}

We present an overview of wearable haptic devices for the hand, following the categories of \textcite{pacchierotti2017wearable}.
-The rendering of a haptic device is indeed determined by the nature of the actuators employed, which form the interface between the haptic system and the user's skin, and therefore the types of mechanical stimuli they can supply.
-Several actuators are often combined in a haptic device to obtain richer haptic feedback.
+It should be noted that the rendering capabilities of a haptic device are determined by the type of actuators employed.
+The actuators form the interface between the haptic device and the user, and deliver the haptic rendering as mechanical stimuli to the user's skin.
+Multiple actuators are often combined in a haptic device to provide richer feedback.

\subsubsection{Moving Platforms}
\label{moving_platforms}

@@ -169,9 +170,14 @@ However, they require high voltages to operate, limiting their use in wearable d
\label{tactile_rendering}

Rendering a haptic property consists in modeling and reproducing virtual sensations comparable to those perceived when interacting with real objects \cite{klatzky2013haptic}.
-By adding such rendering as feedback timely synchronized with the touch actions of the hand on a real object \cite{bhatia2024augmenting}, the perception of the object's haptic property can be modified.
+As we have just seen, the haptic sense is rich and complex (\secref{haptic_hand}), and a wide variety of wearable haptic actuators have been developed (\secref{wearable_haptic_devices}), each providing a subset of the haptic sensations felt by the hand.
+In this section, we review wearable haptic rendering methods that modify the perceived roughness and hardness of real objects.
+
+\subsubsection{Haptic Augmentations}
+
+By adding haptic rendering as feedback synchronized in time with the touch actions of the hand on a real object \cite{bhatia2024augmenting}, the perception of the object's haptic property can be modified.
+That is, both the real and virtual haptic sensations are integrated into a single property perception, \ie the perceived haptic property is modulated by the added virtual feedback.
The integration of the real and virtual sensations into a single property perception is discussed in more detail in \secref{sensations_perception}.
-%, both the real and virtual haptic sensations are integrated into a single property perception, as presented in \secref{sensations_perception}, \ie the perceived haptic property is modulated by the added virtual feedback.
In particular, the visual rendering of a touched object can also influence the perception of its haptic properties, \eg by modifying its visual texture in \AR or \VR, as discussed in \secref{visuo_haptic}.

\textcite{bhatia2024augmenting} categorize the haptic augmentations into three types: direct touch, touch-through, and tool-mediated.
@@ -187,7 +193,7 @@ Of course, wearable haptics can also be used in a direct touch context to modify
% \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
% \cite{girard2016haptip} : renderings with a tangential motion actuator

-\subsubsection{Roughness}
+\subsubsection{Roughness Augmentation}
\label{texture_rendering}

To modify the perception of the haptic roughness (or texture, see \secref{roughness}) of a real object, vibrations are typically applied to the skin by the haptic device as the user moves over the surface.
@@ -249,7 +255,7 @@ When comparing real textures felt through a stylus with their virtual models ren
    \subfig{culbertson2012refined}
\end{subfigs}

-\subsubsection{Hardness}
+\subsubsection{Hardness Augmentation}
\label{hardness_rendering}

The perceived hardness (\secref{hardness}) of a real surface can be modified by rendering forces or vibrations.
@@ -274,8 +280,8 @@ This stiffness augmentation technique was then extended to allow tapping and pre
    \item Diagram of a user tapping the surface \cite{jeon2009haptic}.
    \item Displacement-force curves of a real rubber ball (dashed line) and when its perceived stiffness $\tilde{k}$ is modulated \cite{jeon2009haptic}.
]
-    \subfig[0.38]{jeon2009haptic_1}
-    \subfig[0.42]{jeon2009haptic_2}
+    \subfig[0.45]{jeon2009haptic_1}
+    \subfig[0.45]{jeon2009haptic_2}
\end{subfigs}

\textcite{detinguy2018enhancing} transposed this stiffness augmentation technique with the hRing device (\secref{belt_actuators}): While pressing a real piston with the fingertip by displacement $x_r(t)$, the belt compressed the finger with a virtual force $\tilde{k}\,x_r(t)$ where $\tilde{k}$ is the added stiffness (\eqref{stiffness_augmentation}), increasing the perceived stiffness of the piston (\figref{detinguy2018enhancing}).
@@ -290,8 +296,8 @@ Conversely, the technique allowed to \emph{decrease} the perceived stiffness by
    \item Decrease perceived stiffness of hard object by restricting the fingerpad deformation \cite{tao2021altering}.
]
    \subfigsheight{35mm}
-    \subfig{detinguy2018enhancing}
-    \subfig{tao2021altering}
+    \subfigbox{detinguy2018enhancing}
+    \subfigbox{tao2021altering}
\end{subfigs}

\paragraph{Vibrations Augmentations}
@@ -311,7 +317,7 @@ A challenge with this technique is to provide the vibration feedback at the right
    \item Voltage inputs (top) to the voice-coil for soft, medium, and hard vibrations, with the corresponding displacement (middle) and force (bottom) outputs of the actuator.
    \item Perceived stiffness intensity of real sponge (\enquote{Sp}) and wood (\enquote{Wd}) surfaces without added vibrations (\enquote{N}) and modified by soft (\enquote{S}), medium (\enquote{M}) and hard (\enquote{H}) vibrations.
]
-    \subfigsheight{50mm}
+    \subfigsheight{49mm}
    \subfig{choi2021augmenting_control}
    \subfig{choi2021augmenting_results}
\end{subfigs}
@@ -359,7 +365,7 @@ The complexity of the haptic sense has led to the design of numerous haptic devi
While many haptic devices can be worn on the hand, only a few can be considered wearable as they are compact and portable, but they are limited to cutaneous feedback.
If the haptic rendering of the device is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
Several haptic augmentation methods have been developed to modify the perceived roughness and hardness, mostly using vibrotactile feedback and, to a lesser extent, pressure feedback.
-However, not all of these haptic augmentations have been already transposed to wearable haptics, and use of wearable haptic augmentations have not been yet studied in the context of \AR.
+However, not all of these haptic augmentations have yet been transposed to wearable haptics, and the use of wearable haptic augmentations has not yet been investigated in the context of \AR.
%, unlike most previous actuators that are designed specifically for fingertips and would require mechanical adaptation to be placed on other parts of the hand.
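+To make the principle behind these stiffness augmentations (\secref{hardness_rendering}) concrete, the following minimal sketch renders the added virtual force $\tilde{k}\,x_r(t)$ on top of the real object's own response (the function names are hypothetical placeholders; the actual systems \cite{jeon2009haptic,detinguy2018enhancing} additionally handle finger tracking, filtering and device-specific force-to-actuation mapping):
+\begin{verbatim}
+K_ADDED = 0.2  # added virtual stiffness k~ in N/mm (illustrative value)
+
+def stiffness_augmentation_step(read_displacement_mm, apply_force_n):
+    """One control-loop iteration: measure the finger displacement
+    x_r(t) into the real object and render the force F = k~ * x_r(t)."""
+    x_r = read_displacement_mm()   # hypothetical tracking input
+    apply_force_n(K_ADDED * x_r)   # hypothetical actuator output
+\end{verbatim}
+Expressing the added force relative to the measured displacement keeps the virtual feedback coherent with the real contact, which is what allows the real and virtual cues to fuse into a single perceived stiffness (\secref{sensations_perception}).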
% thanks to the vibration propagation and the sensory capabilities distributed throughout the skin, they can be placed without adaption and on any part of the hand diff --git a/1-background/related-work/3-augmented-reality.tex b/1-background/related-work/3-augmented-reality.tex index 5f98e29..c96a0cf 100644 --- a/1-background/related-work/3-augmented-reality.tex +++ b/1-background/related-work/3-augmented-reality.tex @@ -1,8 +1,9 @@ \section{Manipulating Objects with the Hands in AR} \label{augmented_reality} -As with haptic systems (\secref{wearable_haptics}), visual \AR devices generate and integrate virtual content into the user's perception of the \RE, creating the illusion of the presence of the virtual. -Immersive systems such as headsets leave the hands free to interact with \VOs, promising natural and intuitive interactions similar to those with everyday real objects. +%As with haptic systems (\secref{wearable_haptics}), visual +\AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}. +Immersive systems such as headsets leave the hands free to interact with virtual objects (\VOs), promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}. %\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system \cite{sutherland1968headmounted}. }[ % \item The \AR headset. @@ -17,7 +18,7 @@ Immersive systems such as headsets leave the hands free to interact with \VOs, p \label{what_is_ar} The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room. -Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following the interaction loop presented in \figref[introduction]{interaction-loop}. +Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following our interaction loop presented in \figref[introduction]{interaction-loop}. \subsubsection{A Definition of AR} \label{ar_definition} @@ -26,8 +27,8 @@ Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) p The first formal definition of \AR was proposed by \textcite{azuma1997survey}: (1) combine real and virtual, (2) be interactive in real time, and (3) register real and virtual\footnotemark. Each of these characteristics is essential: the real-virtual combination distinguishes \AR from \VR, a movie with integrated digital content is not interactive and a \TwoD overlay like an image filter is not registered. -There are also two key aspects to this definition: it does not focus on technology or method, but on the user's perspective of the system experience, and it does not specify a particular human sense, \ie it can be auditory \cite{yang2022audio}, haptic \cite{bhatia2024augmenting}, or even olfactory \cite{brooks2021stereosmell} or gustatory \cite{brooks2023taste}. 
-Yet, most of the research have focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as visual \AR.
+There are also two key aspects of this definition: it does not focus on technology or method, but on the user's perspective of the system experience, and it does not specify a particular human sense, \ie it can be auditory \cite{yang2022audio}, haptic \cite{bhatia2024augmenting}, or even olfactory \cite{brooks2021stereosmell} or gustatory \cite{brooks2023taste}.
+Yet, most research has focused on visual augmentation, and the term \AR (without a prefix) is almost always understood as visual \AR.
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}, the original definition was: \enquote{registered in \ThreeD}.}

%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.

@@ -35,11 +36,11 @@ Yet, most of the research have focused on visual augmentations, and the term \AR
\subsubsection{Applications of AR}
\label{ar_applications}

-Advances in technology, research and development have enabled many usages of \AR, including medicine, education, industrial, navigation, collaboration and entertainment applications \cite{dey2018systematic}.
-For example, \AR can provide surgery training simulations in safe conditions \cite{harders2009calibration} (\figref{harders2009calibration}), or improve the learning of students with complex concepts and phenomena such as optics or chemistry \cite{bousquet2024reconfigurable}.
+Advances in technology, research, and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration, and entertainment applications \cite{dey2018systematic}.
+For example, \AR can provide surgical training simulations in safe conditions \cite{harders2009calibration} (\figref{harders2009calibration}), or improve student learning of complex concepts and phenomena such as optics or chemistry \cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance or verification \cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers \cite{lee2013spacetop} (\figref{lee2013spacetop}), or create completely new forms of gaming or tourism experiences \cite{roo2017inner} (\figref{roo2017inner}).
-Most of (visual) \AR/\VR experience can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.
-Yet, the user experience in \AR is still highly dependent on the display used.
+Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, especially for tracking, rendering and display.
+However, the user experience in \AR is still highly dependent on the display used.

\begin{subfigs}{ar_applications}{Examples of \AR applications. }[][
    \item Visuo-haptic surgery training with cutting into virtual soft tissues \cite{harders2009calibration}.
@@ -57,7 +58,8 @@ Yet, the user experience in \AR is still highly dependent on the display used.
\subsubsection{AR Displays}
\label{ar_displays}

-To experience a virtual content combined and registered with the \RE, an output \UI that display the \VE to the user is necessary.
+To experience virtual content combined and registered with the \RE, an output device that displays the \VE to the user is necessary.
+%An output device is more formally defined as an output \emph{\UI}

There is a large variety of \AR displays with different methods of combining the real and virtual content, and different locations on the \RE or the user \cite[p.126]{billinghurst2015survey}.
In \emph{\VST-\AR}, the virtual images are superimposed on images of the \RE captured by a camera \cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.

@@ -96,22 +98,24 @@ Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, prov
Presence and embodiment are two key concepts that characterize the user experience in \AR and \VR.
While there is a large literature on these topics in \VR, they are less defined and studied for \AR \cite{genay2022being,tran2024survey}.
-Still, these concepts are useful to design, evaluate and discuss our contributions in the next chapters.
+These concepts will be useful for the design, evaluation, and discussion of our contributions:
+In particular, we will investigate the effect of the visual feedback of the virtual hand when touching haptic texture augmentations (\chapref{xr_perception}) and manipulating \VOs (\chapref{visual_hand}), and explore the plausibility of visuo-haptic textures (\chapref{visuo_haptic}).

\paragraph{Presence}
\label{ar_presence}

-\AR and \VR are both essentially illusions as the virtual content does not physically exist but is just digitally simulated and rendered to the user's senses through display \UIs.
-Such experience of disbelief suspension in \VR is what is called \emph{presence}, and it can be decomposed into two dimensions: place illusion and plausibility \cite{slater2009place}.
-Place illusion is the sense of the user of \enquote{being there} in the \VE (\figref{presence-vr}).
+\AR and \VR are both essentially illusions, as the virtual content does not physically exist but is just digitally simulated and rendered to the user's senses through display devices.
+This suspension of disbelief in \VR is what is called \emph{presence}, and it can be decomposed into two dimensions: place illusion and plausibility \cite{slater2009place,slater2022separate}.
+\emph{Place illusion} is the user's sense of \enquote{being there} in the \VE (\figref{presence-vr}).
It emerges from the real-time rendering of the \VE from the user's perspective: to be able to move around inside the \VE and look from different points of view.
-Plausibility is the illusion that the virtual events are really happening, even if the user knows that they are not real.
-It doesn't mean that the virtual events are realistic, but that they are plausible and coherent with the user's expectations.
+\emph{Plausibility} is the illusion that the virtual events are really happening, even if the user knows that they are not real.
+It doesn't mean that the virtual events are realistic, \ie that they reproduce the real world with high fidelity \cite{skarbez2017survey}, but that they are believable and coherent with the user's expectations.
+In the same way, a film such as a cartoon or a science-fiction movie can be plausible even if it is not realistic.
%The \AR presence is far less defined and studied than for \VR \cite{tran2024survey}

For \AR, \textcite{slater2022separate} proposed to invert place illusion to what we can call \enquote{object illusion}, \ie the sense that the \VO \enquote{feels here} in the \RE (\figref{presence-ar}).
-As with \VR, \VOs must be able to be seen from different angles by moving the head but also, this is more difficult, be consistent with the \RE, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows or reflect lights.
-The plausibility can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react accordingly to it.
+As with \VR, \VOs must be able to be seen from different angles by moving the head, but also, and this is more difficult, appear sufficiently coherent with the \RE \cite{skarbez2021revisiting}, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows or reflect lights.
+Plausibility can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react to it accordingly, so that they are again perceived as behaving coherently with the real world \cite{skarbez2021revisiting}.

%\textcite{skarbez2021revisiting} also named place illusion for \AR as \enquote{immersion} and plausibility as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
%One main issue with presence is how to measure it both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}.

@@ -129,7 +133,7 @@ The plausibility can be applied to \AR as is, but the \VOs must additionally hav
\paragraph{Embodiment}
\label{ar_embodiment}

-The sense of embodiment is the \enquote{subjective experience of using and having a body} \cite{blanke2009fullbody}, \ie the feeling that a body is our own.
+The \emph{sense of embodiment} is the \enquote{subjective experience of using and having a body} \cite{blanke2009fullbody}, \ie the feeling that a body is our own.
In everyday life, we are used to being, seeing and controlling our own body, but it is possible to embody a virtual body as an avatar while in \AR \cite{genay2022being} or \VR \cite{guy2023sense}.
This illusion arises when the visual, proprioceptive and (if any) haptic sensations of the virtual body are coherent \cite{kilteni2012sense}.
It can be decomposed into three subcomponents: \emph{Agency}, which is the feeling of controlling the body; \emph{Ownership}, which is the feeling that \enquote{the body is the source of the experienced sensations}; and \emph{Self-Location}, which is the \enquote{spatial experience of being inside [the] body} \cite{kilteni2012sense}.
@@ -138,19 +142,22 @@ In \AR, it could take the form of body accessorization, \eg wearing virtual clot
\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}

-A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a real object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface. -\subsubsection{User Interfaces and Interaction Techniques} +\subsubsection{User Inputs and Interaction Techniques} \label{interaction_techniques} -For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then acts upon it through an input \UI. +For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then acts upon it through an input device \cite[p.145]{laviolajr20173d}. +Such input devices form an input \emph{\UI} that captures and translates user's actions to the computer. +Similarly, an output \UI render and display the state of the system to the user (such as a \AR/\VR display, \secref{ar_display}, or an haptic actuator, \secref{wearable_haptic_devices}). + Inputs \UI can be either an \emph{active sensing}, a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or a \emph{passive sensing}, that does not require a contact, such as eye trackers, voice recognition, or hand tracking \cite[p.294]{laviolajr20173d}. -The information gathered from the sensors by the \UI is then translated into actions within the computer system by an \emph{interaction technique} (\figref{interaction-technique}). +The captured information from the sensors is then translated into actions within the computer system by an \emph{interaction technique}. %(\figref{interaction-technique}). For example, a cursor on a screen can be moved using either with a mouse or with the arrow keys on a keyboard, or a two-finger swipe on a touchscreen can be used to scroll or zoom an image. Choosing useful and efficient \UIs and interaction techniques is crucial for the user experience and the tasks that can be performed within the system. -\fig[0.5]{interaction-technique}{An interaction technique map user inputs to actions within a computer system. Adapted from \textcite{billinghurst2005designing}.} +%\fig[0.5]{interaction-technique}{An interaction technique map user inputs to actions within a computer system. Adapted from \textcite{billinghurst2005designing}.} \subsubsection{Tasks with Virtual Environments} \label{ve_tasks} @@ -170,28 +177,30 @@ Wayfinding is the cognitive planning of the movement, such as path finding or ro The \emph{system control tasks} are changes to the system state through commands or menus such as creating, deleting, or modifying \VOs, \eg as in \figref{roo2017onea}. It is also the input of text, numbers, or symbols. +In this thesis we focus on manipulation tasks of virtual content directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and positioning and rotating \VOs pushed and grasp by the hand. + \begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][ \item Spatial selection of virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}. \item Displaying as an overlay registered on the \RE the route to follow \cite{grubert2017pervasive}. \item Virtual drawing on a real object with a hand-held pen \cite{roo2017onea}. \item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enables to register the \VE in it. 
]
-    \subfigsheight{36mm}
-    \subfig{grubert2015multifi}
-    \subfig{grubert2017pervasive}
-    \subfig{roo2017onea}
-    \subfig{newcombe2011kinectfusion}
+    \subfigsheight{35.5mm}
+    \subfigbox{grubert2015multifi}
+    \subfigbox{grubert2017pervasive}
+    \subfigbox{roo2017onea}
+    \subfigbox{newcombe2011kinectfusion}
\end{subfigs}

-\subsubsection{Reducing the Real-Virtual Gap}
+\subsubsection{The Gap between Real and Virtual}
\label{real_virtual_gap}

In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale and as an integral part of the \RE.
-The rendering gap between the real and virtual elements, as described on the interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as narrow or even not consciously perceived by the user.
+The rendering gap between the real and virtual elements, as described in our interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as narrow or even not consciously perceived by the user.
This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.
-As the gap between real and virtual rendering is reduced, one could expects a similar and seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality based interactions}.
+As the gap between real and virtual rendering is reduced, one could expect a similar and seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality-based interactions}.
As of today, an immersive \AR system tracks itself and the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}, \eg as in \figref{newcombe2011kinectfusion}.
This enables the \VE to be registered with the \RE, so that the user simply moves to navigate within the virtual content.
%This tracking and mapping of the user and \RE into the \VE is named the \enquote{extent of world knowledge} by \textcite{skarbez2021revisiting}, \ie to what extent the \AR system knows about the \RE and is able to respond to changes in it.

@@ -212,13 +221,13 @@ The issue with these \emph{space-multiplexed} interfaces is the large number and
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
-Still, the virtual visual rendering and the real haptic sensations can be inconsistent.
+Still, the virtual visual rendering and the real haptic sensations can be incoherent.
This is especially true in \OST-\AR, where the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
-Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: It could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR.
+Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: it could be used to render coherent visuo-haptic material perceptions of real objects directly touched with the hand in \AR.

-\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
+\begin{subfigs}{ar_tangibles}{Manipulating \VOs through real objects. }[][
    \item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
    \item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
    \item Size and

@@ -234,13 +243,12 @@ Similarly, in \secref{tactile_rendering} we described how a material property (\

\subsubsection{Manipulating with Virtual Hands}
\label{ar_virtual_hands}

-Natural \UIs allow the user to use their body movements directly as inputs to the \VE, as defined by \textcite[p.172]{billinghurst2015survey}.
-In daily life, our hands allow us to manipulate real objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite[p.400]{laviolajr20173d}.
-It is also called mid-air interaction.
-Initially tracked by active sensing devices such as gloves or controllers, it is now possible to track hands in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
+%can track the user's movements and use them as inputs to the \VE \textcite[p.172]{billinghurst2015survey}.
+Hands were initially tracked with active sensing devices such as gloves or controllers; they can now be tracked in real time using passive sensing (\secref{interaction_techniques}) and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
+Our hands allow us to manipulate real everyday objects (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite[p.400]{laviolajr20173d}.

-The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite[p.405]{laviolajr20173d}.
-The simplest models represent the hand as a rigid \ThreeD object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
+The tracked user's hand is reconstructed as a \emph{virtual hand} model in the \VE \cite[p.405]{laviolajr20173d}.
+The simplest models represent the hand as a rigid \ThreeD object that follows the movements of the real hand with \qty{6}{DoF} (position and orientation in space) \cite{talvas2012novel}.
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
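+In physics-based simulations, discussed below, the tracked hand is typically coupled to the simulated hand model through spring-dampers \cite{borst2006spring}.
+As a minimal sketch, assuming a linear coupling of stiffness $k_c$ and damping $b_c$, the force applied to a virtual phalanx at position $\vec{p}_v$ tracking its real counterpart at $\vec{p}_r$ would be:
+\begin{equation}{spring_coupling}
+    \vec{F} = k_c \, (\vec{p}_r - \vec{p}_v) - b_c \, (\dot{\vec{p}}_v - \dot{\vec{p}}_r)
+\end{equation}
+A torsional counterpart similarly constrains the orientation of each phalanx; the label and coefficients here are illustrative rather than those of \textcite{borst2006spring}.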
@@ -248,7 +256,7 @@ The contacts between the virtual hand model and the \VOs are then simulated usin
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
-In particular, \textcite{borst2006spring} have proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}:
+In particular, \textcite{borst2006spring} proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}:
The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the \VOs during contact.
The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, but can be difficult to compute in real time.

@@ -260,28 +268,28 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing
    \item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but is kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
]
    \subfigsheight{37mm}
-    \subfig{lee2007handy}
-    \subfig{hilliges2012holodesk_1}
-    \subfig{piumsomboon2013userdefined_1}
-    \subfig{borst2006spring}
+    \subfigbox{lee2007handy}
+    \subfigbox{hilliges2012holodesk_1}
+    \subfigbox{piumsomboon2013userdefined_1}
+    \subfigbox{borst2006spring}
\end{subfigs}

However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
While the user's fingers traverse the \VO, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
-While a visual rendering of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic rendering of the virtual hand, or their combination, in \AR is under-researched.
+While visual feedback of the virtual hand can compensate for these issues in \VR \cite{prachyabrued2014visual}, the visual and haptic feedback of the virtual hand, or their combination, still needs to be investigated in \AR.

-\subsection{Visual Rendering of Hands in AR}
+\subsection{Visual Feedback of Virtual Hands in AR}
\label{ar_visual_hands}

-In \VR, since the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}).
-When interacting with a physics-based virtual hand method (\secref{ar_virtual_hands}), the visual rendering of the virtual hand has an influence on perception, interaction performance, and preference of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}.
-In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand rendering whose motion was constrained to the surface of the \VOs similar as to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}) performed the worst, while the visual hand rendering following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked.
-\textcite{prachyabrued2014visual} also found that the best compromise was a double rendering, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}).
+%In \VR, since the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}).
+When interacting with a physics-based virtual hand method (\secref{ar_virtual_hands}) in \VR, the visual feedback of the virtual hand has an influence on perception, interaction performance, and preference of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}.
+In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand feedback whose motion was constrained to the surface of the \VOs, similar to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}), performed the worst, while the visual hand feedback following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked.
+\textcite{prachyabrued2014visual} also found that the best compromise was a double feedback, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}).
While a realistic rendering of the human hand increased the sense of ownership \cite{lin2016need}, a skeleton-like rendering provided a stronger sense of agency \cite{argelaguet2016role} (\secref{ar_embodiment}), and a minimalist fingertip rendering reduced typing errors \cite{grubert2018effects}.
-A visual hand rendering while in \VE also seems to affect how one grasps an object \cite{blaga2020too}, or how real bumps and holes are perceived \cite{schwind2018touch}.
+Visual hand feedback in a \VE also seems to affect how one grasps an object \cite{blaga2020too} or how real bumps and holes are perceived \cite{schwind2018touch}.

-\fig{prachyabrued2014visual}{Visual hand renderings affect user experience in \VR \cite{prachyabrued2014visual}.}
+\fig{prachyabrued2014visual}{Visual hand feedback affects user experience in \VR \cite{prachyabrued2014visual}.}

Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the \VOs is a common issue (\secref{ar_displays}), \ie hiding the \VO when the real hand is in front of it, and hiding the real hand when it is behind the \VO (\figref{hilliges2012holodesk_2}).
%For example, in \figref{hilliges2012holodesk_2}, the user is pinching a virtual cube in \OST-\AR with their thumb and index fingers, but while the index is behind the cube, it is seen as in front of it.
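+In \VST-\AR, where the real scene is available as a camera image, this mutual occlusion can be addressed as a per-pixel masking problem, as discussed below.
+As a minimal sketch, assuming registered real and virtual depth maps $z_r$ and $z_v$, the composite image $C$ at each pixel $p$ would be:
+\begin{equation}{occlusion_masking}
+    C(p) =
+    \begin{cases}
+        V(p) & \text{if } z_v(p) < z_r(p) \\
+        R(p) & \text{otherwise}
+    \end{cases}
+\end{equation}
+where $R$ and $V$ are the real camera image and the rendered virtual image; the notation is illustrative.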
@@ -292,25 +300,25 @@ While in \VST-\AR, this could be solved as a masking problem by combining the re
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.

-Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
+Since the \VE is intangible, adding visual feedback of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the double-hand feedback of \textcite{prachyabrued2014visual}.
A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
-This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
+This suggests that visual hand feedback superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.

-Few works have compared different visual hand rendering in \AR or with wearable haptic feedback.
+Few works have compared different visual feedback of the virtual hand in \AR, or in combination with wearable haptic feedback.
Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural but seems to be preferred to a mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
-Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand rendering: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
+Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
%\textcite{krichenbauer2018augmented} found that participants were \percent{22} faster in immersive \VST-\AR than in \VR in the same pick-and-place manipulation task, but no visual hand rendering was used in \VR while the real hand was visible in \AR.
In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
\textcite{genay2021virtual} found that the sense of embodiment with robotic hands overlaid in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
-Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic rendering of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
-Taken together, these results suggest that a visual rendering of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
+Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
+Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
%\cite{chan2010touching} : cues for touching (selection) \VOs.
%\textcite{saito2021contact} found that masking the real hand with a textured \ThreeD opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.

-\begin{subfigs}{visual-hands}{Visual hand renderings in \AR. }[][
-    \item Grasping a \VO in \OST-\AR with no visual hand rendering \cite{hilliges2012holodesk}.
+\begin{subfigs}{visual-hands}{Visual feedback of the virtual hand in \AR. }[][
+    \item Grasping a \VO in \OST-\AR with no visual hand feedback \cite{hilliges2012holodesk}.
    \item Simulated mutual occlusion between the grasping hand and the \VO in \VST-\AR \cite{suzuki2014grasping}.
    \item Grasping a real object with a semi-transparent hand in \VST-\AR \cite{buchmann2005interaction}.
    \item Skeleton rendering overlaying the real hand in \VST-\AR \cite{blaga2017usability}.

@@ -331,7 +339,7 @@ Taken together, these results suggest that a visual rendering of the hand in \AR

\AR systems integrate virtual content into the user's perception as if it were part of the \RE.
\AR headsets now enable real-time tracking of the head and hands, and high-quality display of virtual content, while being portable and mobile.
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
-However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
-In particular, there is a lack of mutual occlusion and interaction cues between the hands and virtual content when manipulating \VOs in \OST-\AR that could be mitigated by a visual rendering of the hand.
-A common alternative approach is to use real objects as a proxy for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
+However, without direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
+In particular, when manipulating \VOs in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
+A common alternative approach is to use real objects as proxies for interaction with \VOs, but this raises concerns about their coherence with visual augmentations.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.
diff --git a/1-background/related-work/4-visuo-haptic-ar.tex b/1-background/related-work/4-visuo-haptic-ar.tex
index 066887b..5bbd749 100644
--- a/1-background/related-work/4-visuo-haptic-ar.tex
+++ b/1-background/related-work/4-visuo-haptic-ar.tex
@@ -66,7 +66,7 @@ The \MLE model implies that when seeing and touching a \VO in \AR, the combinati

\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}

-Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
+Thus, a visuo-haptic perception of an object's property is robust to some differences between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of objects is known to be constructed from both vision and touch \cite{klatzky2010multisensory}.
More precisely, when surfaces are evaluated by vision or touch alone, both senses discriminate their materials mainly by the same properties of roughness, hardness, and friction, and with similar performance \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.

@@ -88,7 +88,7 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo
    \item A virtual soft texture projected on a table that deforms when pressed by the hand \cite{punpongsanon2015softar}.
    \item Visually modifying a real object and the hand touching it in \VST-\AR to alter its perceived shape \cite{ban2014displaying}.
]
-    \subfigsheight{42mm}
+    \subfigsheight{50mm}
    \subfig{punpongsanon2015softar}
    \subfig{ban2014displaying}
\end{subfigs}

@@ -112,18 +112,18 @@ Adding a visual delay increased the perceived stiffness of the reference piston,
    \item Participant pressing a virtual piston rendered by a force-feedback device with their hand.
    \item Proportion of comparison piston perceived as stiffer than reference piston (vertical axis) as a function of the comparison stiffness (horizontal axis) and visual and haptic delays of the reference (colors).
]
-    \subfig[.44]{knorlein2009influence_1}
+    \subfigbox[.44]{knorlein2009influence_1}
    \subfig[.55]{knorlein2009influence_2}
\end{subfigs}

%explained how these delays affected the integration of the visual and haptic perceptual cues of stiffness.
-The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both sight and proprioception as the ratio of the exerted force $F(t)$ and the displacement $D(t)$ of the piston, following \eqref{stiffness}, but with potential visual $\Delta t_v$ or haptic $\Delta t_h$ delays:
-\begin{equation}{stiffness_delay}
-    \tilde{k}(t) = \frac{F(t + \Delta t_h)}{D(t + \Delta t_v)}
-\end{equation}
+The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both sight and proprioception as the ratio of the exerted force $F(t)$ and the displacement $\Delta L(t)$ of the piston, following \eqref{stiffness}, but with potential visual $\Delta t_v$ or haptic $\Delta t_h$ delays:
+\begin{equation}{stiffness_delay}
+    \tilde{k}(t) = \frac{F(t + \Delta t_h)}{\Delta L(t + \Delta t_v)}
+\end{equation}
Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay in force and decreases with a visual delay in displacement \cite{diluca2011effects}.

-In a similar \TIFC user study, participants compared perceived stiffness of virtual pistons in \OST-\AR and \VR \cite{gaffary2017ar}.
+\textcite{gaffary2017ar} compared perceived stiffness of virtual pistons in \OST-\AR and \VR in a similar \TIFC user study.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.

@@ -135,8 +135,8 @@ This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE
    \item in \VR.
]
    \subfig[0.35]{gaffary2017ar_1}
-    \subfig[0.3]{gaffary2017ar_3}
-    \subfig[0.3]{gaffary2017ar_4}
+    \subfigbox[0.31]{gaffary2017ar_3}
+    \subfigbox[0.31]{gaffary2017ar_4}
\end{subfigs}

Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.

@@ -183,16 +183,14 @@ Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021hap
However, no proper user study has been conducted to evaluate these devices in \AR.

\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[][
-    %\item A voice-coil rendering a virtual haptic texture on a real sheet of paper \cite{ando2007fingernailmounted}.
    \item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip \cite{teng2021touch}.
    \item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin \cite{maeda2022fingeret}.
    \item Haplets is a compact nail device with integrated sensing and vibrotactile feedback \cite{preechayasomboon2021haplets}.
]
    \subfigsheight{33mm}
-    %\subfig{ando2007fingernailmounted}
-    \subfig{teng2021touch_1}
-    \subfig{maeda2022fingeret}
-    \subfig{preechayasomboon2021haplets}
+    \subfigbox{teng2021touch_1}
+    \subfigbox{maeda2022fingeret}
+    \subfigbox{preechayasomboon2021haplets}
\end{subfigs}

\subsubsection{Belt Devices}

@@ -206,22 +204,21 @@ The middle phalanx of each of these fingers was equipped with a haptic ring of \
%However, no proper user study was conducted to evaluate this feedback.% on the manipulation of the cube.
%that simulated the weight of the cube.
%A virtual cube that could push on the cube was manipulated with the other hand through a force-feedback device.
-\textcite{scheggi2010shape} report that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.
+\textcite{scheggi2010shape} reported that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.

In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
-They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual rendering of the tracked fingertips as virtual points.
+They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and visual feedback of the tracked fingertips as virtual points.
They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived as more effective than the moving platform.
However, the measured difference in performance could be due to either the device or the device position (proximal vs fingertip), or both.
-These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual rendering of the hand-object contacts, but did not examine them together.
+These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual feedback of the hand-object contacts, but did not examine them together.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[][
    \item Rendering the weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.
    \item Rendering the contact force exerted by the fingers on a virtual cube \cite{maisto2017evaluation,meli2018combining}.
]
-    \subfigsheight{57mm}
-    \subfig{scheggi2010shape}
-    \subfig{maisto2017evaluation}
+    \subfigbox[.48][m]{scheggi2010shape}
+    \subfig[.48][m]{maisto2017evaluation}
\end{subfigs}

%\subsubsection{Wrist Bracelet Devices}

@@ -250,5 +247,14 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
% \cite{sarac2022perceived,palmer2022haptic} not in \AR but studies on relocating to the wrist the haptic feedback of the fingertip-object contacts.

-%\subsection{Conclusion}
-%\label{visuo_haptic_conclusion}
+\subsection{Conclusion}
+\label{visuo_haptic_conclusion}
+
+Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
+While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
+Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
+Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them is best suited for direct hand interaction in \AR.
+In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
+Such a discrepancy may affect the user's perception and experience and should be further investigated.
+When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
+Conversely, the same haptic feedback or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
diff --git a/1-background/related-work/5-conclusion.tex b/1-background/related-work/5-conclusion.tex
index 8f2e60b..e9ac03f 100644
--- a/1-background/related-work/5-conclusion.tex
+++ b/1-background/related-work/5-conclusion.tex
@@ -1,25 +1,39 @@
\section{Conclusion}
\label{conclusion}

-Haptic perception and manipulation of objects with the hand involves exploratory movements or grasp types, respectively, with simultaneous sensory feedback from multiple cutaneous and kinaesthetic receptors embedded beneath the skin.
-These receptors provide sensory cues about the physical properties of objects, such as roughness and hardness, which are then integrated to form a perception of the property being explored.
-Perceptual constancy is possible in the absence of one cue by compensating with others.
+Haptic perception and manipulation of everyday objects with the hand involve exploratory movements or grasp types, respectively (\secref{hand_object_interactions}), with simultaneous sensory feedback from multiple cutaneous and kinaesthetic receptors embedded beneath the skin (\secref{haptic_sense}).
+These receptors provide sensory cues about the physical properties of objects, such as roughness (texture) and hardness, which are then integrated to form a perception of the property being explored (\secref{object_properties}).
+Perceptual constancy is possible in the absence of one cue by compensating with others, which opens the possibility of haptic augmentation.

-Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
-Only a few can be considered wearable due to their compactness and portability, but they are limited to cutaneous feedback.
+\noindentskip Haptic systems aim to provide virtual interactions and sensations similar to those with real objects (\secref{wearable_haptics}).
+Only a few can be considered wearable due to their compactness and portability, but they are limited to cutaneous feedback (\secref{wearable_haptic_devices}).
If their haptic rendering is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
-Wearable haptic augmentation is mostly achieved with vibrotactile feedback.
+Wearable haptic augmentation of roughness and hardness is mostly achieved with vibrotactile feedback (\secref{tactile_rendering}).

-\AR headsets integrate virtual content immersively into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
-However, direct hand interaction and manipulation of \VOs is difficult due to the lack of haptic feedback and of mutual occlusion rendering between the hand and the \VO, which could be improved by a visual rendering of the hand.
-Real objects are also used as proxies for manipulating \VOs, but can be inconsistent with the visual rendering, being haptically passives.
-Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of real objects.
+We will use such wearable vibrotactile feedback to create texture augmentation of real objects in \partref{perception} and contact rendering of virtual objects in \partref{manipulation}.
+In particular, in \chapref{vhar_system} we will propose a system that allows free exploration of texture augmentation of real surfaces with the bare hand using wearable vibrotactile feedback.
-Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
-While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few can be integrated or experimentally evaluated for direct hand interaction in \AR.
-Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user interaction with the \RE.
-Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them are best suited for direct hand interaction in \AR.
-In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
-Such a discrepancy may affect the user's perception and experience and should be further investigated.
-When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
-Conversely, the same haptic rendering or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
+\noindentskip \AR headsets integrate virtual content into the user's perception in an immersive way, as if it were part of the \RE, with real-time tracking of the head and hands (\secref{what_is_ar}).
+Direct hand interaction with virtual content is often implemented using a virtual hand interaction technique, which reconstructs the user's hand in the \VE and simulates its interactions with the virtual content.
+However, the perception and manipulation of the virtual is difficult due to the lack of haptic feedback and of mutual occlusion between the hand and the virtual content (\secref{ar_interaction}).
+
+The lack of mutual occlusion could be mitigated by visual feedback of the virtual hand (\secref{ar_visual_hands}), which we will investigate in \chapref{visual_hand} by comparing the most common visual hand augmentations used in \AR in manipulation tasks of virtual objects with the hand.
+Wearable haptics on the hand is another solution to improve direct hand manipulation of virtual objects, which we will explore in \chapref{visuo_haptic_hand}.
+
+Surrounding real objects can also be used as proxies to interact with the virtual, but they may be incoherent with their visual augmentation because they are haptically passive (\secref{ar_interaction}).
+Wearable haptics on the hand is again a promising solution to enable coherent visuo-haptic augmentation of real objects, as we will explore in \partref{perception}.
+
+\noindentskip However, few wearable haptic devices have been integrated or experimentally evaluated for direct hand interaction in \AR.
+Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
+Many strategies for moving the actuator on the hand have been explored, but the most beneficial position for delocalized haptic feedback is still unclear (\secref{vhar_haptics}).
+
+In \chapref{visuo_haptic_hand}, we will investigate five common delocalized positions of vibrotactile feedback for rendering contact with the hand when manipulating virtual objects in \AR.
+We will also investigate two contact rendering techniques and compare them with two visual hand augmentations from \chapref{visual_hand}.
+
+\noindentskip It is also challenging to provide coherent visuo-haptic feedback when augmenting real objects and rendering virtual objects (\secref{vh_perception}).
+When different sensory feedback, haptic and visual, real and virtual, is integrated into a single object property, perception is somewhat robust to variations in reliability and to spatial and temporal differences.
+Conversely, the same haptic rendering or augmentation can be influenced by the user's visual expectation or the visual rendering of the virtual object.
+
+In \chapref{xr_perception}, we will investigate the effect of the visual feedback of the virtual hand as well as the effect of the environment (\AR or \VR) on the perception of vibrotactile texture augmentations, using the system presented in \chapref{vhar_system}.
+
+Finally, using the same system, we will evaluate in \chapref{vhar_textures} the perceived realism, coherence, and roughness of co-localized visuo-haptic texture augmentations on real surfaces seen and touched with the real hand in \AR.
diff --git a/1-background/related-work/figures/delhaye2012textureinduced_1.jpg b/1-background/related-work/figures/delhaye2012textureinduced_1.jpg
new file mode 100644
index 0000000..526472e
Binary files /dev/null and b/1-background/related-work/figures/delhaye2012textureinduced_1.jpg differ
diff --git a/1-background/related-work/figures/delhaye2012textureinduced.jpg b/1-background/related-work/figures/delhaye2012textureinduced_2.jpg
similarity index 100%
rename from 1-background/related-work/figures/delhaye2012textureinduced.jpg
rename to 1-background/related-work/figures/delhaye2012textureinduced_2.jpg
diff --git a/1-background/related-work/figures/hardness.odg b/1-background/related-work/figures/hardness.odg
index a2e1bd7..6b13250 100644
Binary files a/1-background/related-work/figures/hardness.odg and b/1-background/related-work/figures/hardness.odg differ
diff --git a/1-background/related-work/figures/hardness.pdf b/1-background/related-work/figures/hardness.pdf
index 653e451..9e35056 100644
Binary files a/1-background/related-work/figures/hardness.pdf and b/1-background/related-work/figures/hardness.pdf differ
diff --git a/1-background/related-work/figures/scheggi2010shape.jpg b/1-background/related-work/figures/scheggi2010shape.jpg
index 9d1ac0d..c947135 100644
Binary files a/1-background/related-work/figures/scheggi2010shape.jpg and b/1-background/related-work/figures/scheggi2010shape.jpg differ
diff --git a/1-background/related-work/figures/sensorimotor_continuum.odg b/1-background/related-work/figures/sensorimotor_continuum.odg
index ebd3509..cd860fc 100644
Binary files a/1-background/related-work/figures/sensorimotor_continuum.odg and b/1-background/related-work/figures/sensorimotor_continuum.odg differ
diff --git a/1-background/related-work/figures/sensorimotor_continuum.pdf b/1-background/related-work/figures/sensorimotor_continuum.pdf
index 5db0c79..55904c7 100644
Binary files a/1-background/related-work/figures/sensorimotor_continuum.pdf and b/1-background/related-work/figures/sensorimotor_continuum.pdf differ
diff --git a/1-background/related-work/related-work.tex b/1-background/related-work/related-work.tex
index c3c6ec6..cdbf3ae 100644
--- a/1-background/related-work/related-work.tex
+++ b/1-background/related-work/related-work.tex
@@ -3,11 +3,11 @@

\chaptertoc

-This chapter reviews previous work on the perception and manipulation with virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
+This chapter reviews previous work on the perception and manipulation of virtual and augmented objects directly with the hand, using either wearable haptics, \AR, or their combination. %Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE. -We first overview how the hand senses and acts on its environment to perceive and manipulate the haptic properties of real everyday objects. +First, we review how the hand senses and interacts with its environment to perceive and manipulate the haptic properties of real everyday objects. Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects. -Third, we introduce the principles and user experience of \AR, and overview the main interaction techniques used to manipulate virtual objects directly with the hand. +Third, we introduce the principles and user experience of \AR and review the main interaction techniques used to manipulate virtual objects directly with the hand. Finally, we describe how visual and haptic feedback have been combined to augment direct hand interaction in \AR, in particular using wearable haptics. \input{1-haptic-hand}