Complete related work

2024-09-23 03:09:45 +02:00
parent ee2b739ddb
commit 0495afd60c
12 changed files with 213 additions and 180 deletions

View File

@@ -4,12 +4,12 @@
This thesis presents research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and \WH devices.
%It is entitled:
%- Augmenting the interaction with everyday objects with wearable haptics and Augmented Reality
%- Direct Hand Perception and Manipulation in Visuo-Haptic Augmented Reality
%- Enhancing Hand Interaction with Wearable Haptics in Augmented Reality
%- Integrating Wearable Haptics in Augmented Reality: Perception and Manipulation of Virtual and Augmented Objects
%- Improving Hand-Object Interaction in Augmented Reality using Wearable Haptics
%- Wearable Haptics for Hand Interaction in Augmented Reality
%The introduction chapter is structured as follows: first, we present the research challenges and objectives of this thesis, then we describe our approach and contributions, and finally we present the structure of the thesis.
@@ -326,7 +326,7 @@ Still, it is known that the visual feedback can alter the perception of real and
%
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.
Finally, some visuo-haptic texture databases have been modeled from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural \vh-\AR environment using \WHs remains to be investigated.
%

View File

@@ -4,16 +4,16 @@
% describe how the hand senses and acts on its environment to perceive the haptic properties of real everyday objects.
The haptic sense has specific characteristics that make it unique among the senses.
It enables us to perceive a large diversity of properties of everyday objects, through a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but particularly in the hand.
It also allows us to act on these objects with the hand, to come into contact with them, to grasp them, to actively explore them, and to manipulate them.
This implies that haptic perception is localized at the points of contact between the hand and the environment, \ie we cannot haptically perceive an object without actively touching it.
These two mechanisms, \emph{action} and \emph{perception}, are therefore closely associated and both are essential to form the haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.
\subsection{The Haptic Sense}
\label{haptic_sense}
Perceiving the properties of an object involves numerous sensory receptors embedded in the skin, but also in the muscles and joints of the hand, and distributed across the body. They can be divided into two main modalities: \emph{cutaneous} and \emph{kinesthetic}.
\subsubsection{Cutaneous Sensitivity}
\label{cutaneous_sensitivity}
@@ -223,19 +223,19 @@ However, as the speed of exploration changes the transmitted vibrations, a faste
\subfig[.5]{klatzky2003feeling_2}
\end{subfigs}
Even when the fingertips are deafferented (absence of cutaneous sensations), the perception of roughness is maintained \cite{libouton2012tactile}, thanks to the propagation of vibrations in the finger, hand and wrist, for both patterned and \enquote{natural} everyday textures \cite{delhaye2012textureinduced}.
The spectrum of vibrations shifts to higher frequencies as the exploration speed increases, but the brain integrates this change with proprioception to maintain a \emph{constant perception} of the texture.
For patterned textures, as illustrated in \figref{delhaye2012textureinduced}, the ratio of the finger speed $v$ to the frequency $f_p$ of the vibration intensity peak is most of the time measured to be equal to the period $\lambda$ of the element spacing:
\begin{equation}{grating_vibrations}
\lambda \sim \frac{v}{f_p}
\end{equation}
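For example, following \eqref{grating_vibrations}, a finger sliding at \qty{100}{\mm\per\s} over a grating with a \qty{1}{\mm} element spacing produces a vibration intensity peak near \qty{100}{\Hz}.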
The vibrations generated by exploring everyday textures are also very specific to each texture and similar between individuals, making them identifiable by vibration alone \cite{manfredi2014natural,greenspon2020effect}.
This shows the importance of vibration cues even for macro textures and the possibility of generating virtual texture sensations with vibrotactile rendering.
\fig[0.55]{delhaye2012textureinduced}{Speed of finger exploration (horizontal axis) on grating textures with different periods $\lambda$ of spacing (in color) and frequency of the vibration intensity peak $f_p$ propagated in the wrist (vertical axis) \cite{delhaye2012textureinduced}.}
Everyday textures are more complex to study because they are composed of multiple elements of different sizes and spacings.
In addition, the perceptions of micro and macro roughness overlap and are difficult to distinguish \cite{okamoto2013psychophysical}.
Thus, individuals have a subjective definition of roughness, with some paying more attention to larger elements and others to smaller ones \cite{bergmanntiest2007haptic}, or even including other perceptual properties such as hardness or friction \cite{bergmanntiest2010tactual}.
@@ -276,10 +276,10 @@ With finger pressure, a relative difference (the \emph{Weber fraction}) of \perc
However, in the absence of pressure sensations (by placing a thin disc between the finger and the object), the necessary relative difference becomes much larger (Weber fraction of \percent{\sim 50}).
Thus, the perception of hardness relies for \percent{90} on surface deformation cues and for \percent{10} on displacement cues.
In addition, an object with low stiffness but high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
%Finally, when pressing with the finger, the perceived hardness intensity $h$ follows a power law with the stiffness $k$ \cite{harper1964subjective}:
%\begin{equation}{hardness_intensity}
% h = k^{0.8}
%\end{equation}
%When pressing with the finger, the perceived (subjective) hardness intensity follows a power law of the stiffness with an exponent of \num{0.8} \cite{harper1964subjective}, \ie when the stiffness doubles, the perceived hardness increases by a factor of \num{1.7}.
%\textcite{bergmanntiest2009cues} thus observed a quadratic relationship of equal perceived hardness intensity, as illustrated in \figref{bergmanntiest2009cues}.

View File

@@ -20,7 +20,7 @@ An increasing wearability resulting in the loss of the system's kinesthetic feed
\item Exoskeletons are body-grounded kinesthetic devices.
\item Wearable haptic devices are grounded on the point of application of the tactile stimulus.
]
\subfigsheight{34mm}
\subfig{pacchierotti2017wearable_1}
\subfig{pacchierotti2017wearable_2}
\subfig{pacchierotti2017wearable_3}
@@ -147,14 +147,19 @@ An \ERM is a \DC motor that rotates an off-center mass when a voltage or current
\footnotetext{\url{https://www.precisionmicrodrives.com/}}
A \LRA consists of a coil that creates a magnetic field from an \AC to oscillate a magnet attached to a spring, like an audio loudspeaker (\figref{precisionmicrodrives_lra}).
They are more complex to control and a bit larger than \ERMs.
Each \LRA is designed to vibrate with maximum amplitude at a given resonant frequency, but will not vibrate efficiently at other frequencies, \ie their bandwidth is narrow, as shown in \figref{azadi2014vibrotactile}.
A \VCA is similar to a \LRA but capable of generating vibrations with two \DoFs, with independent control of the frequency and amplitude of the vibration over a wide bandwidth.
They are larger in size than \ERMs and \LRAs, but can generate more complex renderings.
Piezoelectric actuators deform a solid material when a voltage is applied.
They are very small and thin and provide two \DoFs of amplitude and frequency control.
However, they require high voltages to operate, limiting their use in wearable devices.
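To illustrate the narrow bandwidth of \LRAs, a minimal sketch of a second-order (spring-mass-damper) model of the moving mass, with purely illustrative parameters, shows how the displacement amplitude falls off away from resonance:

\begin{verbatim}
import numpy as np

# Illustrative LRA parameters: 1 g moving mass, 175 Hz resonance, Q = 20.
m, f0, Q = 1e-3, 175.0, 20.0
k = m * (2 * np.pi * f0) ** 2   # equivalent spring stiffness
c = m * 2 * np.pi * f0 / Q      # equivalent damping

def displacement_gain(f):
    # Steady-state displacement amplitude for a unit sine force at f Hz.
    w = 2 * np.pi * f
    return 1.0 / np.sqrt((k - m * w ** 2) ** 2 + (c * w) ** 2)

for f in (100, 150, 175, 200, 250):
    print(f, round(displacement_gain(f) / displacement_gain(f0), 2))
# Prints ~0.07, 0.19, 1.0, 0.16, 0.05: strong attenuation off-resonance.
\end{verbatim}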
\begin{subfigs}{lra}{Diagram and performance of \LRAs. }[
\item Diagram. From Precision Microdrives~\footnotemarkrepeat.
\item Force generated by two \LRAs as a function of sinusoidal wave input with different frequencies: both their maximum force and resonant frequency are different \cite{azadi2014vibrotactile}.
]
\subfigsheight{50mm}
\subfig{precisionmicrodrives_lra}
@@ -165,21 +170,21 @@ Piezoelectric actuators deform a solid material when a voltage is applied. They
\subsection{Modifying Perceived Haptic Roughness and Hardness}
\label{tactile_rendering}
Rendering a haptic property consists in modeling and reproducing virtual sensations comparable to those perceived when interacting with real objects \cite{klatzky2013haptic}.
By adding such rendering as feedback precisely synchronized with the touch actions of the hand on a real object \cite{bhatia2024augmenting}, the perception of the object's haptic property can be modified.
The integration of the real and virtual sensations into a single property perception is discussed in more detail in \secref{sensations_perception}.
%, both the real and virtual haptic sensations are integrated into a single property perception, as presented in \secref{sensations_perception}, \ie the perceived haptic property is modulated by the added virtual feedback.
In particular, the visual rendering of a touched object can also influence the perception of its haptic properties, \eg by modifying its visual texture in \AR or \VR, as discussed in \secref{visuo_haptic}.
\textcite{bhatia2024augmenting} categorize the haptic augmentations into three types: direct touch, touch-through, and tool-mediated.
In \emph{direct touch}, the haptic device does not cover the inside of the hand so as not to impair the user's interaction with the \RE, and is typically achieved with wearable haptics.
In touch-through and tool-mediated, or \emph{indirect feel-through} \cite{jeon2015haptic}, the haptic device is placed between the hand and the \RE.
%We are interested in direct touch augmentations with wearable haptics (\secref{wearable_haptic_devices}), as their integration with \AR is particularly promising for free hand interaction with visuo-haptic augmentations.
Many haptic augmentations were first developed with touch-through devices, and some (but not all) were later transposed to direct touch augmentation with wearable haptic devices.
%We also focus on tactile augmentations stimulating the mechanoreceptors of the skin (\secref{haptic_sense}), thus excluding temperature perception, as they are the most common existing haptic interfaces.
Since we have chosen to focus in \secref{object_properties} on the haptic perception of roughness and hardness of objects, we review below the methods to modify the perception of these properties.
Of course, wearable haptics can also be used in a direct touch context to modify the perceived friction \cite{konyo2008alternative,salazar2020altering}, weight \cite{minamizawa2007gravity}, or local deformation \cite{salazar2020altering} of real objects, but they are rare \cite{bhatia2024augmenting} and will not be detailed here.
% \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
% \cite{girard2016haptip} : renderings with a tangential motion actuator
@@ -187,51 +192,60 @@ Of course, wearable haptics can also be used in direct touch context to modify t
\subsubsection{Roughness}
\label{texture_rendering}
To modify the perception of the haptic roughness (or texture, see \secref{roughness}) of a real object, vibrations are typically applied to the skin by the haptic device as the user runs the finger over the surface.
%This is because running the finger or a tool on a textured surface generates pressures and vibrations (\secref{roughness}) at frequencies that are too high for rendering capabilities of most haptic devices \cite{campion2005fundamental,culbertson2018haptics}.
There are two main approaches to modifying the perception of virtual textures: \emph{simulation models} and \emph{data-driven models} \cite{klatzky2013haptic,culbertson2018haptics}.
\paragraph{Simulation Models}
%Simulations of virtual textures are based on the physics of the interaction between the finger and the surface, and are used to generate the vibrations that the user feels when running the finger over the surface.
The simplest texture simulation model is a 1D sinusoidal grating $v(t)$ with spatial period $\lambda$ and amplitude $A$ that is scanned by the user at velocity $\dot{x}(t)$:
\begin{equation}{grating_rendering}
v(t) = A \sin\left(2 \pi \frac{\dot{x}(t)}{\lambda} t\right)
\end{equation}
That is, this model generates a periodic signal whose frequency is proportional to the user's velocity, implementing the speed-frequency ratio observed with real patterned textures (\eqref{grating_vibrations}).
It gives the user the illusion of a texture with a \emph{fixed spatial period} that approximates real manufactured grating textures (\secref{roughness}).
The user's position could have been used instead of the velocity, but this requires measuring the position and generating the signal at rates too high (\qty{10}{\kHz}) for most sensors and haptic actuators \cite{campion2005fundamental}.
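As an illustration, a minimal sketch of this velocity-driven rendering (in Python, with illustrative parameters, assuming a velocity estimate sampled at \qty{1}{\kHz} by a tracking device) integrates the velocity into a continuous phase:

\begin{verbatim}
import numpy as np

def grating_signal(velocities, dt=1e-3, wavelength=1e-3, amplitude=1.0):
    # Instantaneous frequency f(t) = v(t) / lambda; integrating the
    # velocity gives the position x(t), hence a phase 2*pi*x(t)/lambda
    # that stays continuous when the exploration speed varies.
    phase = 2 * np.pi * np.cumsum(velocities) * dt / wavelength
    return amplitude * np.sin(phase)

# Example: 0.5 s of sliding at 50 mm/s over a 1 mm grating
# produces a ~50 Hz vibration, consistent with (grating_vibrations).
signal = grating_signal(np.full(500, 0.05))
\end{verbatim}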
As more traditional force-feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end-effector instead generate vibrations to simulate the interaction with the virtual texture \cite{campion2005fundamental,culbertson2018haptics}.
With a voice-coil actuator attached to the middle phalanx of the finger, \textcite{asano2015vibrotactile} used this model to increase the perceived roughness (\figref{asano2015vibrotactile_2}).
Participants moved their finger over real grating textures (\qtyrange{0.15}{0.29}{\mm} groove and ridge width) with a virtual sine grating (\qty{1}{\mm} spatial period) superimposed, rendered following \eqref{grating_rendering}.
The perceived roughness increased proportionally to the virtual texture amplitude, but too high an amplitude decreased it instead.
\textcite{ujitoko2019modulating} instead used a square wave signal and a hand-held stylus with an embedded voice-coil.
The perceived roughness of real surfaces can thus be modified when touched by a tool with a vibrotactile actuator attached \cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator \cite{asano2015vibrotactile}, creating a haptic texture augmentation.
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals \cite{delhaye2012textureinduced,manfredi2014natural}.
A common method of vibrotactile texture rendering is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity \cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%\textcite{friesen2024perceived} proposed
The model in \eqref{grating_rendering} can be extended to 2D textures by adding a second sinusoidal grating with an orthogonal orientation, as done by \textcite{girard2016haptip}.
More complex models have also been developed to be physically accurate and reproduce with high fidelity the roughness perception of real patterned surfaces \cite{unger2011roughness}, but they require high-fidelity force feedback devices that are expensive and have a limited workspace.
\paragraph{Data-driven Models}
Because simulations of realistic virtual textures can be very complex to design and to render in real-time, direct capture and models of real textures have been developed \cite{culbertson2018haptics}.
\textcite{okamura1998vibration} were the first to measure the vibrations produced by the interaction of a stylus dragged over sandpaper and patterned surfaces.
They found that the contact vibrations with patterns can be modeled as exponentially decaying sinusoids (\eqref{contact_transient}) that depend on the normal force and the scanning velocity of the stylus on the surface.
This technique was used by \textcite{ando2007fingernailmounted} to augment a smooth sheet of paper with a virtual patterned texture: With a \LRA mounted on the nail, they rendered the virtual finger contacts with \qty{20}{\ms} vibration impulses at \qty{130}{\Hz} (\figref{ando2007fingernailmounted}).
Participants matched the virtual textures to the real ones, with \qty{0.25}{\mm} height and \qtyrange{1}{10}{\mm} width, but systematically overestimated the virtual width to be \qty{4}{\mm} longer.
Other models have since been developed to capture everyday textures (such as sandpaper) \cite{guruswamy2011iir} from many force and velocity measurements \cite{romano2012creating,culbertson2014modeling}.
Such data-driven models take the user's measured velocity and force as inputs to interpolate and generate a virtual texture in real time (\secref{vibrotactile_actuators}).
This led to the release of the Penn Haptic Texture Toolkit (HaTT) database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, but captured from a direct touch context with the fingertip, has recently been released \cite{balasubramanian2024sens3}.
A limitation of these data-driven models is that they can only render \emph{isotropic} textures: their recordings do not depend on the position of the measurement, and the rendering is the same regardless of the direction of the movement.
Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}.
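As an illustration of how such data-driven rendering operates, the following sketch shapes white noise with hypothetical autoregressive (AR) models fitted offline per speed bin; nearest-bin lookup stands in for the interpolation used by the actual models:

\begin{verbatim}
import numpy as np

# Hypothetical per-speed AR(2) models fitted from texture recordings.
bins = np.array([0.05, 0.10, 0.20])   # exploration speeds in m/s
coeffs = {0.05: (0.60, -0.20), 0.10: (0.80, -0.30), 0.20: (0.90, -0.35)}
stddev = {0.05: 0.10, 0.10: 0.20, 0.20: 0.35}

def render_step(history, speed):
    # Pick the model of the nearest recorded speed (real renderers
    # interpolate between neighboring speed and force bins).
    s = float(bins[np.argmin(np.abs(bins - speed))])
    a1, a2 = coeffs[s]
    # AR synthesis: weighted past samples plus white noise excitation.
    return a1 * history[-1] + a2 * history[-2] \
           + np.random.normal(0, stddev[s])

history = [0.0, 0.0]
for _ in range(1000):   # one second of vibration at 1 kHz
    history.append(render_step(history, speed=0.12))
\end{verbatim}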
When comparing real textures felt through a stylus with their virtual models rendered with a voice-coil actuator attached to the stylus (\figref{culbertson2012refined}), the virtual textures were found to accurately reproduce the perception of roughness, but hardness and friction were not rendered properly \cite{culbertson2014modeling}.
\textcite{culbertson2015should} further showed that the perceived realism of the virtual textures, and similarity to the real textures, depended mostly on the user's speed but not on the user's force as inputs to the model, \ie responding to speed is sufficient to render isotropic virtual textures.
\begin{subfigs}{textures_rendering_data}{Augmenting haptic texture perception with voice-coil actuators. }[
\item Increasing and decreasing the perceived roughness of a real patterned texture in direct touch \cite{asano2015vibrotactile}.
%\item Comparing real patterned texture with virtual texture augmentation in direct touch \cite{friesen2024perceived}.
\item Rendering virtual contacts in direct touch with the virtual texture \cite{ando2007fingernailmounted}.
\item Rendering an isotropic virtual texture over a real surface while sliding a hand-held stylus across it \cite{culbertson2012refined}.
]
\subfigsheight{38mm}
\subfig{asano2015vibrotactile_2}
%\subfig{friesen2024perceived}
\subfig{ando2007fingernailmounted}
\subfig{culbertson2012refined}
\end{subfigs}
@@ -245,16 +259,16 @@ The perceived hardness (\secref{hardness}) of a real surface can be modified by
\paragraph{Modulating Forces}
When tapping or pressing a real object, the perceived stiffness $\tilde{k}$ of its surface can be modulated with force feedback \cite{jeon2015haptic}.
This was first proposed by \textcite{jeon2008modulating} who augmented a real surface tapped in 1 \DoF with a grounded force-feedback device held in the hand (\figref{jeon2009haptic_1}).
When the haptic end-effector contacts the object at time $t$, the object's surface deforms by displacement $x_r(t)$ and opposes a real reaction force $f_r(t)$.
The virtual force of the device $\tilde{f_r}(t)$ is then controlled to:
\begin{equation}{stiffness_augmentation}
\tilde{f_r}(t) = f_r(t) - \tilde{k} x_r(t)
\end{equation}
A force sensor embedded in the device measures the reaction force $f_r(t)$.
The displacement $x_r(t)$ is estimated from the reaction force and the tapping velocity using a predefined model of different materials, as described by \textcite{jeon2011extensions}.
As shown in \figref{jeon2009haptic_2}, the force $\tilde{f_r}(t)$ perceived by the user is modulated, but not the displacement $x_r(t)$, hence the perceived stiffness is $\tilde{k}(t)$.
This stiffness augmentation technique was then extended to allow tapping and pressing with 3 \DoFs \cite{jeon2010stiffness}, to render friction and weight augmentations \cite{jeon2011extensions}, and to grasp and squeeze the real object with two contact points \cite{jeon2012extending}.
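As an illustration, one control cycle of this force modulation can be sketched as follows, where the sensor reading, the displacement model, and the device command are hypothetical placeholder interfaces:

\begin{verbatim}
def stiffness_augmentation_step(read_force, estimate_displacement,
                                command_force, k_tilde):
    # One cycle of (stiffness_augmentation).
    f_r = read_force()                  # measured reaction force
    x_r = estimate_displacement(f_r)    # model-based surface deformation
    command_force(f_r - k_tilde * x_r)  # modulated force to render
\end{verbatim}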
\begin{subfigs}{stiffness_rendering_grounded}{Augmenting the perceived stiffness of a real surface with a hand-held force-feedback device. }[%
\item Diagram of a user tapping the surface \cite{jeon2009haptic}.
@@ -264,11 +278,10 @@ This stiffness augmentation technique was then extended to enable tapping and pr
\subfig[0.42]{jeon2009haptic_2}
\end{subfigs}
\textcite{detinguy2018enhancing} transposed this stiffness augmentation technique with the hRing device (\secref{belt_actuators}): While pressing a real piston with the fingertip by displacement $x_r(t)$, the belt compressed the finger with a virtual force $\tilde{k}\,x_r(t)$ where $\tilde{k}$ is the added stiffness (\eqref{stiffness_augmentation}), increasing the perceived stiffness of the piston (\figref{detinguy2018enhancing}).
More importantly, the augmentation proved to be robust to the placement of the device, as the increased stiffness was perceived the same on the fingertip, the middle phalanx, and the proximal phalanx.
Conversely, the technique made it possible to \emph{decrease} the perceived stiffness by compressing the phalanx before the contact and reducing the belt pressure as the user pressed the piston \cite{salazar2020altering}.
\textcite{tao2021altering} proposed instead to restrict the deformation of the fingerpad by pulling a hollow frame around it to decrease perceived stiffness (\figref{tao2021altering}): it augments the finger contact area and thus the perceived Young's modulus of the object (\secref{hardness}).
\begin{subfigs}{stiffness_rendering_wearable}{Modifying the perceived stiffness with wearable pressure devices. }[%
\item Modifying the perceived stiffness of a piston by pressing the finger during or prior to the contact \cite{detinguy2018enhancing,salazar2020altering}.
@@ -282,19 +295,19 @@ Conversely, the technique allowed to \emph{decrease} the perceived stiffness by
\paragraph{Vibration Augmentations}
\textcite{okamura2001realitybased} measured impact vibrations $v(t)$ when tapping on real objects and found they can be modeled as an exponentially decaying sinusoid:
\begin{equation}{contact_transient}
v(t) = A \, |v_{in}| \, e^{- \tau t} \sin(2 \pi f t)
\end{equation}
With $A$ the amplitude slope, $\tau$ the decay rate, and $f$ the frequency, which are measured material properties, and $v_{in}$ the impact velocity.
It has been shown that these material properties perceptually express the stiffness (\secref{hardness}) of real \cite{higashi2019hardness} and virtual surfaces \cite{choi2021perceived}.
Therefore, when contacting or tapping a real object through an indirect feel-through interface that provides such vibrations (\figref{choi2021augmenting_control}) using a voice-coil (\secref{vibrotactile_actuators}), the perceived stiffness can be increased or decreased \cite{kuchenbecker2006improving,hachisu2012augmentation,choi2021augmenting}, \eg making a sponge feel stiffer or wood feel softer (\figref{choi2021augmenting_results}).
A challenge with this technique is to provide the vibration feedback at the right time to be felt simultaneously with the real contact \cite{park2023perceptual}.
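As an illustration, a minimal sketch generating such a contact transient for a voice-coil, with illustrative (not measured) material parameters:

\begin{verbatim}
import numpy as np

def contact_transient(v_in, A=1.0, tau=100.0, f=300.0,
                      duration=0.05, rate=5000):
    # Decaying sinusoid of (contact_transient), scaled by the impact
    # velocity; A, tau and f encode the simulated material.
    t = np.arange(0.0, duration, 1.0 / rate)
    return A * abs(v_in) * np.exp(-tau * t) * np.sin(2 * np.pi * f * t)

wave = contact_transient(v_in=0.2)   # one tap at 0.2 m/s
\end{verbatim}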
\begin{subfigs}{contact_vibrations}{Augmenting perceived stiffness using vibrations when touching a real surface \cite{choi2021augmenting}. }[%
%\item Experimental setup with a voice-coil actuator attached to a touch-through interface.
\item Voltage inputs (top) to the voice-coil for soft, medium, and hard vibrations, with the corresponding displacement (middle) and force (bottom) outputs of the actuator.
\item Perceived stiffness intensity of real sponge ("Sp") and wood ("Wd") surfaces without added vibrations ("N") and modified by soft ("S"), medium ("M") and hard ("H") vibrations.
]
%\subfig[.15]{choi2021augmenting_demo}
\subfigsheight{50mm}
@@ -302,7 +315,7 @@ A challenge with this technique is to provide the vibration feedback at the righ
\subfig{choi2021augmenting_results}
\end{subfigs}
Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render \VOs \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}.
We describe them in \secref{vhar_haptics}.
%A promising alternative approach
@@ -341,5 +354,11 @@ We describe them in the \secref{vhar_haptics}.
\subsection{Conclusion}
\label{wearable_haptics_conclusion}
Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
The complexity of the haptic sense has led to the design of numerous haptic devices and renderings.
While many haptic devices can be worn on the hand, only a few are compact and portable enough to be considered wearable, and those are limited to cutaneous feedback.
If the haptic rendering is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
Several rendering methods have been developed to modify the perceived roughness and hardness, but not all of them have yet been transposed to wearable haptics.
%, unlike most previous actuators that are designed specifically for fingertips and would require mechanical adaptation to be placed on other parts of the hand.
% thanks to the vibration propagation and the sensory capabilities distributed throughout the skin, they can be placed without adaption and on any part of the hand

View File

@@ -205,8 +205,8 @@ In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale
The rendering gap between the real and virtual elements, as depicted in the interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as very narrow or even not consciously perceived by the user.
This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.
As the gap between real and virtual rendering is reduced, one could expect a similar and seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality-based interactions}.
As of today, an immersive \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}, \eg as in \figref{newcombe2011kinectfusion}.
It enables the \VE to be registered with the \RE, and the user simply moves to navigate within the virtual content.
%This tracking and mapping of the user and \RE into the \VE is named the \enquote{extent of world knowledge} by \textcite{skarbez2021revisiting}, \ie to what extent the \AR system knows about the \RE and is able to respond to changes in it.
However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
@@ -218,18 +218,18 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a
As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), and to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
According to \textcite{billinghurst2005designing}, each \VO is coupled to a tangible object, and the \VO is physically manipulated through the tangible object, providing a direct, efficient and seamless interaction with both the real and virtual content.
This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
This is especially true in \OST-\AR, since the \VOs are inherently slightly transparent, allowing the paired tangibles to be seen through them.
In a pick-and-place task with tangibles of different shapes, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using}) of the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: it could be used to render coherent visuo-haptic material perceptions of objects directly touched with the hand in \AR.
\begin{subfigs}{ar_applications}{Manipulating \VOs with tangibles. }[
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
@@ -248,8 +248,8 @@ Similarly, we described in \secref{tactile_rendering} how a material property (\
\subsubsection{Manipulating with Virtual Hands}
\label{ar_virtual_hands}
Natural \UIs allow the user to use their body movements directly as inputs to the \VE \cite{billinghurst2015survey}.
Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite{laviola20173d}.
Initially tracked with active sensing devices such as gloves or controllers, hands can now be tracked in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviola20173d}.
@@ -259,16 +259,18 @@ The most common technique is to reconstruct all the phalanges of the hand in an
The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviola20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
In particular, \textcite{borst2006spring} have proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object method \cite{zilles1995constraintbased}:
The virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact.
The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, but can be difficult to compute in real time.
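As an illustration, the coupling at the core of such physics-based virtual hands can be sketched as a spring-damper pulling each simulated phalanx toward its tracked pose (gains are illustrative; the physics engine then resolves the contacts):

\begin{verbatim}
import numpy as np

def coupling_force(p_tracked, p_virtual, v_virtual,
                   k_spring=500.0, c_damping=5.0):
    # The virtual phalanx is pulled toward the tracked one; contacts
    # with virtual objects block it, and the residual real-virtual
    # offset yields the force applied to the object.
    p_tracked, p_virtual = np.asarray(p_tracked), np.asarray(p_virtual)
    return k_spring * (p_tracked - p_virtual) \
           - c_damping * np.asarray(v_virtual)
\end{verbatim}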
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[
\item A fingertip tracking method that allows the user to select a \VO by opening the hand \cite{lee2007handy}.
\item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}.
\item Grasping a \VO through gestures when the fingers are detected as opposing on it \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but is kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
]
\subfigsheight{37mm}
\subfig{lee2007handy}
@@ -278,7 +280,7 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing
\end{subfigs}
However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
While the user's fingers traverse the virtual object, a physics-based virtual hand remains in contact with the object, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
While a visual rendering of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic rendering of the virtual hand, or their combination, in \AR is under-researched.
@@ -286,16 +288,16 @@ While a visual rendering of the virtual hand in \VR can compensate for these iss
\subsection{Visual Rendering of Hands in AR}
\label{ar_visual_hands}
In \VR, since the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}).
When interacting with a physics-based virtual hand method (\secref{ar_virtual_hands}), the visual rendering of the virtual hand has an influence on perception, interaction performance, and preference of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}.
In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand rendering whose motion was constrained to the surface of the \VOs similar as to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}) performed the worst, while the visual hand rendering following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, though it was rather disliked.
\textcite{prachyabrued2014visual} also found that the best compromise was a double rendering, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}).
While a realistic rendering of the human hand increased the sense of ownership \cite{lin2016need}, a skeleton-like rendering provided a stronger sense of agency \cite{argelaguet2016role} (\secref{ar_embodiment}), and a minimalist fingertip rendering reduced typing errors \cite{grubert2018effects}.
A visual hand rendering in a \VE also seems to affect how one grasps an object \cite{blaga2020too}, or how real bumps and holes are perceived \cite{schwind2018touch}.
\fig{prachyabrued2014visual}{Visual hand renderings affect user experience in \VR \cite{prachyabrued2014visual}.}
Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the \VOs is a common issue (\secref{ar_displays}), \ie hiding the \VO when the real hand is in front of it, and hiding the real hand when it is behind the \VO (\figref{hilliges2012holodesk_2}).
%For example, in \figref{hilliges2012holodesk_2}, the user is pinching a virtual cube in \OST-\AR with their thumb and index fingers, but while the index is behind the cube, it is seen as in front of it.
While in \VST-\AR, this could be solved as a masking problem by combining the real and virtual images \cite{battisti2018seamless}, \eg in \figref{suzuki2014grasping}, in \OST-\AR, this is much more difficult because the \VE is displayed as a transparent \TwoD image on top of the \ThreeD \RE, which cannot be easily masked \cite{macedo2023occlusion}.
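As an illustration, the \VST-\AR case reduces to per-pixel compositing, assuming the difficult part, a mask of the pixels where the real hand is nearer than the virtual surface, is given:

\begin{verbatim}
import numpy as np

def composite_frame(real_rgb, virtual_rgb, virtual_alpha, hand_mask):
    # hand_mask: boolean (H, W) map of hand pixels in front of the VE.
    # Where the hand occludes, drop the virtual layer; elsewhere,
    # blend the rendered VE over the camera image.
    a = virtual_alpha * ~hand_mask[..., None]
    return (1.0 - a) * real_rgb + a * virtual_rgb
\end{verbatim}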
%Yet, even in \VST-\AR,
@@ -304,19 +306,19 @@ While in \VST-\AR, this could be solved as a masking problem by combining the re
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
A \VO overlaying a tangible object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural but seems to be preferred over mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand rendering: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
%\textcite{krichenbauer2018augmented} found that participants were \percent{22} faster in immersive \VST-\AR than in \VR in the same pick-and-place manipulation task, but no visual hand rendering was used in \VR while the real hand was visible in \AR.
In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was the most preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
\textcite{genay2021virtual} found that the \SoE with a robotic hand overlay in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic rendering of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
Taken together, these results suggest that a visual rendering of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
%\cite{chan2010touching} : cues for touching (selection) \VOs.
%\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the \VO did.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.
@@ -340,10 +342,10 @@ Taken together, these results suggest that a visual hand rendering in \AR could
\subsection{Conclusion}
\label{ar_conclusion}
\AR systems integrate virtual content into the user's perception as if it were part of the \RE.
\AR headsets now enable real-time tracking of the head and hands, and high-quality display of virtual content, while being portable and mobile.
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, there is a lack of mutual occlusion and interaction cues between the hands and virtual content when manipulating \VOs in \OST-\AR that could be mitigated by a visual rendering of the hand.
A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched tangible objects.

View File

@@ -1,6 +1,10 @@
\section{Visuo-Haptic Augmentations of Hand-Object Interactions}
\label{visuo_haptic}
Everyday perception and manipulation of objects with the hand typically involve both the visual and haptic senses.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable of perceiving many material properties, such as roughness, hardness, or friction \cite{baumgartner2013visual}.
Rendering a \VO with visual and haptic feedback that feels coherent is a challenge, especially in immersive \AR, where the haptic actuator is moved away from the fingertip so as not to cover the inside of the hand.
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
@@ -17,16 +21,20 @@
\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{sensations_perception}
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a multimodal visuo-haptic rendering of a \VO is perceived.
\subsubsection{Merging the Sensations into a Perception}
\label{merging_sensations}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).
If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable and may give different answers to the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimated property $\tilde{s}$ is the weighted sum of the individual sensory estimates:
\begin{equation}{MLE}
\tilde{s} = \sum_i w_i \tilde{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
@@ -39,13 +47,13 @@ And the integrated variance $\sigma^2$ is the inverse of the sum of the individu
\sigma^2 = \left( \sum_i \frac{1}{\sigma_i^2} \right)^{-1}
\end{equation}
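In this model, each weight is inversely proportional to the variance of its sensory estimate, \ie $w_i = (1/\sigma_i^2) / \sum_j (1/\sigma_j^2)$.
As a worked illustration with arbitrary example values, suppose vision is four times more reliable than touch, \ie $\sigma_v^2 = 1$ and $\sigma_h^2 = 4$:
\begin{equation*}
    w_v = \frac{1/1}{1/1 + 1/4} = 0.8, \qquad w_h = 0.2, \qquad \sigma^2 = \left(\frac{1}{1} + \frac{1}{4}\right)^{-1} = 0.8
\end{equation*}
The visual estimate then dominates the integrated percept, and the integrated variance ($0.8$) is lower than that of the most reliable sense alone ($1$): the combined estimate is more reliable than either sense on its own.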
This was demonstrated by \textcite{ernst2002humans} in a user study where participants estimated the height of a virtual bar using a fixed-window \OST-\AR display (\secref{ar_displays}) and force-feedback devices worn on the thumb and index finger (\secref{wearability_level}), as shown in \figref{ernst2002humans_setup}.
%They first measured the individual variances of the visual and haptic estimates (\figref{ernst2002humans_within}) and then the combined variance of the visuo-haptic estimates (\figref{ernst2002humans_visuo-haptic}).
On each trial, participants compared the visuo-haptic reference bar (of a fixed height) to a visuo-haptic comparison bar (of a variable height) in a \TIFC task (one bar is tested first, a pause, then the other) and indicated which was taller.
The reference bar had different conflicting visual $s_v$ and haptic $s_h$ heights, and different noise levels were added to the visual feedback to increase its variance.
The objective was to determine a \PSE between the comparison and reference bars, where the participant was equally likely to choose one or the other (\percent{50} of the trials).
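As an illustration of how a \PSE is typically extracted (a sketch with hypothetical responses, not the actual analysis of the study), a cumulative Gaussian psychometric function can be fitted to the proportion of \enquote{taller} responses, the \PSE being its \percent{50} point:
\begin{verbatim}
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical 2IFC data: comparison heights (mm) and proportion of
# trials where the comparison was judged taller than the reference.
heights = np.array([48.0, 50.0, 52.0, 54.0, 56.0])
p_taller = np.array([0.10, 0.25, 0.55, 0.80, 0.95])

def psychometric(x, pse, sigma):
    # Cumulative Gaussian: the PSE is its 50% point and sigma
    # reflects the discrimination noise.
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, heights, p_taller, p0=[52.0, 2.0])
\end{verbatim}
The fitted $\sigma$ quantifies the discrimination noise, related to the variance of the underlying sensory estimates in the \MLE model.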
%\figref{ernst2002humans_within} shows the discrimination of participants with only the haptic or visual feedback, and how much the estimation becomes difficult (thus higher variance) when noise is added to the visual feedback.
\figref{ernst2004merging_results} shows that when the visual noise was low, the visual feedback had more weight, but as visual noise increased, haptic feedback gained more weight, as predicted by the \MLE model.
\begin{subfigs}{ernst2002humans}{Visuo-haptic perception of height of a virtual bar \cite{ernst2002humans}. }[
\item Experimental setup.%: Participants estimated height visually with an \OST-\AR display and haptically with force-feedback devices worn on the thumb and index fingers.
@@ -61,29 +69,31 @@ The objective was to determine a \PSE between the comparison and reference bars,
\end{subfigs}
%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that when seeing and touching a \VO in \AR, the combination of visual and haptic stimuli, real or virtual, presented to the user can be perceived as a coherent single object property.
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Tangible Perception}
\label{visual_haptic_influence}
Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of everyday objects is known to be constructed from both vision and touch \cite{klatzky2010multisensory}.
More precisely, when surfaces are evaluated by vision or touch alone, both senses discriminate their materials mainly by the same properties of roughness, hardness, and friction, and with similar performance \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
\fig{gunther2022smooth}{In a passive touch context in \VR, only two real surfaces, one smooth and one rough, were found to match all the visual \VOs \cite{gunther2022smooth}.}
Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture rendered as vibrations through a hand-held stylus (\secref{texture_rendering}) by adding small oscillations to the visual feedback of the stylus on a screen.
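Although each of these studies designs its own visual mapping, the shared principle can be sketched, as an illustration rather than the authors' exact models, as a control-to-display gain between the physical input and its displayed counterpart:
\begin{equation*}
    d_v = k \, d_r
\end{equation*}
where $d_r$ is the real deformation or displacement, $d_v$ the displayed one, and $k$ a gain: $k > 1$ exaggerates the seen deformation and biases the percept toward softer or larger, while $k < 1$ biases it toward stiffer or smaller.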
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table and that deforms when pressed by the hand \cite{punpongsanon2015softar}.
@@ -94,17 +104,17 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by deforming vis
\subfig{ban2014displaying}
\end{subfigs}
%In all of these studies, the visual expectations of participants influenced their haptic perception.
%In particular, in \AR and \VR, the perception of a haptic rendering or augmentation can be influenced by the visual rendering of the \VO.
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}
Some studies have investigated the visuo-haptic perception of \VOs rendered with force-feedback and vibrotactile feedback in \AR and \VR.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
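An illustrative reading of this effect, our sketch rather than the authors' formal model, is that stiffness is estimated as the ratio of the felt force to the seen indentation: with a visual delay $\Delta t_v$, the seen indentation lags the felt force while pressing, which inflates this ratio:
\begin{equation*}
    \tilde{k}(t) \approx \frac{F(t)}{x(t - \Delta t_v)} \geq \frac{F(t)}{x(t)}
\end{equation*}
since the indentation $x$ increases while pressing; a haptic delay shifts $F$ instead and has the symmetric effect.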
\begin{subfigs}{visuo-haptic-stiffness}{Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}. }[
@@ -138,10 +148,10 @@ This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE
\subfig[0.3]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.
The contact was rendered both by a vibrotactile piezoelectric device on the fingertip and by a visual change in the color of the \VO.
The visuo-haptic simultaneity was varied by adding a visual delay or by triggering the haptic feedback earlier.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.
@@ -151,19 +161,19 @@ We describe in the next section how wearable haptics have been integrated with i
\label{vhar_haptics}
A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed to move the actuator to a different location on the hand.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback they provide (\secref{tactile_rendering}), as well as in the placement of the haptic rendering.
Other wearable haptic actuators have been proposed for \AR, but are not discussed here.
A first reason is that they permanently cover the fingertip and affect the interaction with the \RE, such as thin-skin tactile interfaces \cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces \cite{han2018hydroring}.
Another category of actuators relies on systems that cannot be considered portable, such as REVEL \cite{bau2012revel}, which provides friction sensations through reverse electrovibration but requires modifying the real objects to be augmented, or Electrical Muscle Stimulation (EMS) devices \cite{lopes2018adding}, which provide kinesthetic feedback by contracting the muscles.
\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}
\textcite{ando2007fingernailmounted} were the first to propose moving the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
@@ -173,15 +183,15 @@ Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and thi
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
With Fingeret, \textcite{maeda2022fingeret} adapted the belt actuators (\secref{belt_actuators}) as a \enquote{finger-side actuator} that leaves the fingertip free (\figref{maeda2022fingeret}).
Two rollers, one on each side, can deform the skin: When rotated inward, they pull the skin, simulating a contact sensation, and when rotated outward, they push the skin, simulating a release sensation.
They can also simulate a texture sensation by rapidly rotating in and out.
%The device is also very compact (\qty{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study conducted not in \AR but by directly touching images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA vibrating at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as with \textcite{teng2021touch}, finger speed was not taken into account when rendering vibrations, which may have been detrimental to texture perception, as described in \secref{texture_rendering}.
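Indeed, when a periodic texture of spatial period $\lambda$ is explored at finger speed $v$, the vibration induced at the fingertip has a frequency of approximately
\begin{equation*}
    f \approx \frac{v}{\lambda}
\end{equation*}
so rendering vibrations at a fixed frequency, regardless of $v$, breaks this natural coupling between exploration speed and perceived texture.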
Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: These are very compact and lightweight vibrotactile \LRA devices designed to provide both integrated finger motion sensing and very low latency haptic feedback (\qty{<5}{\ms}).
However, no proper user study has been conducted to evaluate these devices in \AR.
\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper \cite{ando2007fingernailmounted}.
@@ -200,7 +210,7 @@ But no proper user study were conducted to evaluate these devices in \AR.
\subsubsection{Belt Devices}
\label{vhar_rings}
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been used to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
Recall that these devices have also been used to modify the perceived stiffness, softness, friction, and localized bumps and holes on smooth real surfaces (\secref{hardness_rendering}) \cite{detinguy2018enhancing,salazar2020altering}, but have not been tested in \AR.
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface and held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
@@ -208,14 +218,14 @@ The middle phalanx of each of these fingers was equipped with a haptic ring of \
%However, no proper user study was conducted to evaluate this feedback.% on the manipulation of the cube.
%that simulated the weight of the cube.
%A virtual cube that could push on the cube was manipulated with the other hand through a force-feedback device.
\textcite{scheggi2010shape} report that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.
In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual rendering of the tracked fingertips as virtual points.
They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived as more effective than the moving platform.
However, the measured difference in performance could be due to either the device or the device position (proximal \vs fingertip), or both.
These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual rendering of the hand-object contacts, but did not examine them together.
\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.

View File

@@ -1,28 +1,19 @@
\section{Conclusion}
\label{conclusion}
Haptic perception and manipulation of objects with the hand involve exploratory movements and grasp types, respectively, with simultaneous sensory feedback from multiple cutaneous and kinaesthetic receptors embedded beneath the skin.
These receptors provide sensory cues about the physical properties of objects, such as roughness and hardness, which are then integrated to form a perception of the property being explored.
Perceptual constancy is possible in the absence of one cue by compensating with others.
Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
Only a few can be considered wearable due to their compactness and portability, but they are limited to cutaneous feedback.
If their haptic rendering is synchronized with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
\AR headsets integrate virtual content into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
However, they lack direct hand interaction and manipulation of \VOs, which could be improved by visual rendering of the hand.
Tangibles are also used as proxies for manipulating \VOs but, being haptically passive, they can be inconsistent with the visual rendering.
Wearable haptics worn on the hand are a promising solution both for improving direct hand manipulation of \VOs and for providing coherent visuo-haptic augmentation of tangibles.
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
% The haptic feedback is thus rendered de-localized from the point of contact of the finger on the rendered object.
% ---
%The complexity of perceiving the haptic properties of objects and the diversity of possible interactions thus make it particularly difficult to design realistic haptic devices and renderings.
%All the more so as the sense of touch is distributed over the whole hand and body, and haptic sensations are necessarily produced by direct contact of the skin with the object, and are thus tied to a movement of the hand on the object.
%There is therefore no generic haptic system that can address all aspects of the haptic sense, but rather a wide variety of haptic devices and renderings with different objectives, constraints, and trade-offs.
%However, a digital audio-visual experience can be imperfect and yet sufficient to be useful and interesting, as a video conference can be, and convey sensations comparable to real ones, such as watching and listening to a concert on a screen with headphones.
%Yet the visual and audio quality of such experiences is very different from that of a "real" conversation or a real everyday scene.
%Thus, rather than recreating realistic haptic experiences, it is more important to render the sensory stimulus "at the right time and at the right place" \cite{hayward2007it}.
%The quality of the illusory haptic experience is a function of the interplay between the users perceptual system and the intrinsic technical qualities of the interfaces
%Interestingly, the two previous sections, presenting wearable haptics in \secref{wearable_haptics} and AR in \secref{augmented_reality}, follow rather opposite paths: the first starts from the haptic sense and the hand to describe wearable haptic devices and the interactions they enable, while the second starts from a technological description of AR and then details its perception and use.
%This is how each of the two fields is often introduced in the literature, for example with the works of \textcite{choi2013vibrotactile,culbertson2018haptics} for haptics and of \textcite{bimber2005spatial,kim2018revisiting} for AR.
%But it is also interesting to note that these two fields are at different stages of maturity.
%Indeed, contributing to both fields raises, among other things, significant technical challenges, as detailed in \secref[introduction]{research_challenges}.
%And there is a need for standardization in wearable haptics \cite{culbertson2018haptics}, particularly in terms of devices and renderings, whereas the industry is rather well established in AR, for example with the HoloLens~2 headset from Microsoft~\footnoteurl{https://www.microsoft.com/hololens} and the Vision~Pro from Apple~\footnoteurl{https://www.apple.com/apple-vision-pro/}, or the ARCore framework from Google~\footnoteurl{https://developers.google.com/ar} and ARKit from Apple~\footnoteurl{https://developer.apple.com/augmented-reality/}.
%This can be explained partly by the maturity of the VR industry, which drives that of AR, with an announced trend towards the convergence of these two technologies \cite{speicher2019what}, but also by the greater complexity and particularities of the haptic sense \cite{culbertson2018haptics}.
%Conversely, defining and characterizing AR/MR, and to a much lesser extent VR, surprisingly remains an open question \cite{speicher2019what}.

View File

@@ -3,12 +3,12 @@
\chaptertoc
This chapter reviews previous work on the perception and manipulation of \AEs directly with the hand using wearable haptics, \AR, and their combination.
%Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE.
To achieve this, we first describe how the hand senses and acts on its environment to perceive the haptic properties of real everyday objects and to manipulate them.
Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects.
Third, we introduce the principles and user experience of \AR, and overview the main interaction techniques used to manipulate virtual objects directly with the hand.
Finally, we present how multimodal visual and haptic feedback have been combined in \AR to enhance hand-based perception and interaction.
\input{1-haptic-hand}
\input{2-wearable-haptics}