WIP related work

2024-09-20 09:04:09 +02:00
parent 65453e0605
commit 5eba5f75d9
14 changed files with 177 additions and 73 deletions

View File

@@ -163,14 +163,15 @@ Several types of vibrotactile actuators are used in haptics, with different trad
\subsection{Modifying Perceived Haptic Roughness and Hardness}
\label{tactile_rendering}
Tactile rendering of haptic properties consists in modelling and reproducing virtual tactile sensations comparable to those perceived when interacting with real objects~\cite{klatzky2013haptic}.
By adding such tactile rendering as feedback to the touch actions of the hand on a real object~\cite{bhatia2024augmenting}, both the real and virtual haptic sensations are integrated into a single property perception, as presented in \secref{sensations_perception}.
Therefore, the visual rendering of a touched object can also greatly influence the perception of its haptic properties, \eg by modifying its visual texture in \AR or \VR, as discussed in the \secref{visuo_haptic}.
\textcite{bhatia2024augmenting} categorize the tactile augmentations of real objects into three types: direct touch, touch-through, and tool mediated.
Also called direct feel-through~\cite{jeon2015haptic}, in \emph{direct touch} the haptic device does not cover the inside of the hand, so as not to impair the user's interaction with the \RE.
In \emph{touch-through} and \emph{tool-mediated} augmentations, or \emph{indirect feel-through}, the haptic device is interposed between the hand and the \RE or worn on the hand, respectively.
We are interested in direct touch augmentations with wearable haptic devices (\secref{wearable_haptic_devices}), as their integration with \AR is particularly promising for direct hand interaction with visuo-haptic augmentations.
%We also focus on tactile augmentations stimulating the mechanoreceptors of the skin (\secref{haptic_sense}), thus excluding temperature perception, as they are the most common existing haptic interfaces.
% \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
% \cite{girard2016haptip} : renderings with a tangential motion actuator
@@ -198,23 +199,28 @@ A common method vibrotactile rendering of texture is to use a sinusoidal signal
\subsubsection{Hardness}
\label{hardness_rendering}
The perceived stiffness $k$ of a real surface can be modulated with a force-feedback device~\cite{jeon2008modulating,jeon2010stiffness,jeon2012extending}.
The perceived hardness can also be altered by tapping with a tool on a real surface augmented with a vibrotactile actuator generating exponentially decaying sinusoids~\cite{kuchenbecker2006improving,jeon2009haptic,jeon2012extending,hachisu2012augmentation,kildal20103dpress,park2019realistic,choi2021perceived,park2023perceptual}.
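A common parameterization of such contact transients, given here as an illustrative sketch of this line of work (the exact amplitude mapping varies between studies, and the label and symbols are ours), is a decaying sinusoid whose amplitude $A$ grows with the impact velocity $v$ of the tap:
\begin{equation}{tap_transient}
a(t) = A(v)\, e^{-B t} \sin(2 \pi f t)
\end{equation}
where $a(t)$ is the actuator drive signal, and the decay rate $B$ and frequency $f$ characterize the hardness of the simulated material.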
The two previous methods have been compared by \textcite{choi2021perceived}.
Hardness has also been modulated with wearable haptic devices~\cite{kildal20103dpress,detinguy2018enhancing,salazar2020altering,yim2021multicontact,park2017compensation,tao2021altering}. % wearable softness
%\subsubsection{Friction}
%\label{friction_rendering}
%
%\cite{konyo2008alternative}
%\cite{provancher2009fingerpad}
%\cite{smith2010roughness}
@@ -224,7 +230,7 @@ A common method vibrotactile rendering of texture is to use a sinusoidal signal
%\subsubsection{Weight}
%\label{weight_rendering}
%
%\cite{minamizawa2007gravity}
%\cite{minamizawa2008interactive}
%\cite{jeon2011extensions}

View File

@@ -45,20 +45,21 @@ Yet, most of the research have focused on visual augmentations, and the term \AR
\label{ar_applications}
Advances in technology, research and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration and entertainment applications~\cite{dey2018systematic}.
For example, \AR can provide surgery training simulations in safe conditions~\cite{harders2009calibration} (\figref{harders2009calibration}), or improve the learning of students with complex concepts and phenomena such as optics or chemistry~\cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance or verification~\cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers~\cite{lee2013spacetop} (\figref{lee2013spacetop}), or create entirely new forms of gaming and tourism experiences~\cite{roo2017inner} (\figref{roo2017inner}).
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.
Yet, the user experience in \AR is still highly dependent on the display used.
\begin{subfigs}{ar_applications}{Examples of \AR applications. }[
%\item Neurosurgery \AR visualization of the brain on a patient's head \cite{watanabe2016transvisible}.
\item Visuo-haptic surgery training with cutting into virtual soft tissues \cite{harders2009calibration}.
%\item HOBIT is a spatial, tangible \AR table simulating an optical bench for educational experimentations \cite{bousquet2024reconfigurable}.
\item \AR can interactively guide users in document verification tasks by recognizing documents and comparing them with virtual references \cite{hartl2013mobile}.
\item SpaceTop is a transparent \AR desktop computer featuring direct hand manipulation of \ThreeD content \cite{lee2013spacetop}.
\item Inner Garden is a spatial \AR zen garden made of real sand visually augmented to create a mini world that can be reshaped by hand \cite{roo2017inner}.
]
\subfigsheight{41mm}
\subfig{harders2009calibration}
\subfig{hartl2013mobile}
\subfig{lee2013spacetop}
\subfig{roo2017inner}
@@ -75,7 +76,7 @@ In \VST-\AR, the virtual images are superimposed to images of the \RE captured b
This augmented view through the camera has the advantage of complete control over the real-virtual combination, such as mutual occlusion between real and virtual objects~\cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images~\cite{kruijff2010perceptual}.
However, due to the camera and the screen, the user's view is degraded, with a lower resolution, frame rate and field of view, and an overall visual latency compared to proprioception~\cite{kruijff2010perceptual}.
\OST-\AR displays directly combine the virtual images with the real-world view using a transparent optical system, like augmented glasses~\cite{itoh2022indistinguishable}, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE at the cost of more difficult registration (spatial misalignment or temporal latency between the real and virtual content)~\cite{grubert2018survey} and mutual real-virtual occlusion~\cite{macedo2023occlusion}.
Finally, projection-based \AR overlays the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
@@ -93,7 +94,7 @@ It doesn't require the user to wear the display, but requires physical surface t
\end{subfigs}
Regardless of the \AR display, it can be placed at different locations~\cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also take the form of fixed \OST or \VST windows (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight~\cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR headsets or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing ressources and algorithms, are the most common \AR today's displays, but research and development promise more immersive and interactive \AR with headset displays~\cite{billinghurst2021grand}.
@@ -227,7 +228,7 @@ These \enquote{time-multiplexed} interfaces require interaction techniques that
Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent, especially in \OST-\AR, where the \VOs are slightly transparent and the paired tangibles can be seen through them.
In a pick-and-place task with tangibles of different shapes, differences in size~\cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape~\cite{kahl2023using} (\figref{kahl2023using}) with the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, we described in \secref{tactile_rendering} how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices~\cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions when directly touching tangibles with the hand in \AR.
\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[
@@ -261,7 +262,7 @@ Heuristic techniques use rules to determine the selection, manipulation and rele
But they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the contact points between the virtual hand and the \VO.
In particular, \textcite{borst2006spring} have proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object~\cite{zilles1995constraintbased} method: the virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact. The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena~\cite{talvas2013godfinger} and finger deformations~\cite{talvas2015aggregate}, allowing highly accurate and realistic interactions that can, however, be difficult to compute in real time.
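As a minimal sketch of such a coupling (simplified from \textcite{borst2006spring}; the notation is ours), each virtual phalanx can be attached to its tracked counterpart by a linear spring-damper, so that the force applied at a contact is:
\begin{equation}{spring_coupling}
F = k_c \left( x_\text{real} - x_\text{virtual} \right) + b_c \left( \dot{x}_\text{real} - \dot{x}_\text{virtual} \right)
\end{equation}
where $x_\text{real}$ and $x_\text{virtual}$ are the tracked and simulated phalanx positions, and $k_c$ and $b_c$ are the coupling stiffness and damping.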
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[
\item A fingertip tracking that enables to select a \VO by opening the hand~\cite{lee2007handy}.

View File

@@ -11,20 +11,68 @@
% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE~\cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should \enquote{modulating the feel of a real object by virtual [haptic] feedback}~\cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
% Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.
\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{sensations_perception}
\subsubsection{Merging the Sensations into a Perception}
\label{merging_sensations}
A \emph{perception} is the merge of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement perceived from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible paired with a co-localized \VO (\secref{ar_tangibles}).
The \MLE model explains how one estimates a perceived property from multiple sensory cues by weighting them according to their perceived reliability~\cite{ernst2002optimal}.
When the sensations can be redundant, \ie when only one sensation could be enough to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}.
No sensory information is completely reliable, and it can provide different answers about the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\hat{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimate $\hat{s}$ of the property is the weighted sum of the individual sensory estimates:
\begin{equation}{MLE}
\hat{s} = \sum_i w_i \hat{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
where the individual weights $w_i$ are proportional to the inverse variances:
\begin{equation}{MLE_weights}
w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2} = \frac{\sigma^2}{\sigma_i^2}
\end{equation}
and the integrated variance $\sigma^2$ is the inverse of the sum of the individual inverse variances:
\begin{equation}{MLE_variance}
\sigma^2 = \left( \sum_i \frac{1}{\sigma_i^2} \right)^{-1}
\end{equation}
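As a hypothetical numerical example (values chosen only for illustration): for a visual estimate with variance $\sigma_v^2 = 1$ and a haptic estimate with variance $\sigma_h^2 = 4$, the model gives:
\begin{equation}{MLE_example}
w_v = \frac{1/1}{1/1 + 1/4} = 0.8, \quad w_h = 0.2, \quad \sigma^2 = \left( \frac{1}{1} + \frac{1}{4} \right)^{-1} = 0.8
\end{equation}
The integrated variance ($0.8$) is lower than that of either cue alone: this reliability gain is the key benefit of integration.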
This was demonstrated by \textcite{ernst2002humans} in a user study where participants estimated the height of a virtual bar with a fixed window \OST-\AR display (\secref{ar_displays}) and force-feedback devices worn on the thumb and index fingers (\secref{wearability_level}), as shown in \figref{ernst2002humans_setup}.
%They first measured the individual variances of the visual and haptic estimates (\figref{ernst2002humans_within}) and then the combined variance of the visuo-haptic estimates (\figref{ernst2002humans_visuo-haptic}).
On each trial, participants compared the visuo-haptic reference bar (with a fixed height) to a visuo-haptic comparison bar (with a variable height) in a \TIFC task (one bar presented first, then a pause, then the other) and indicated which was taller.
The reference bar had different conflicting visual $s_v$ and haptic $s_h$ heights, and different noise levels were added to the visual feedback to increase its variance.
The objective was to determine the \PSE between the comparison and reference bars, \ie the comparison height at which the participant was equally likely to choose either bar (\percent{50} of the trials).
%\figref{ernst2002humans_within} shows the discrimination of participants with only the haptic or visual feedback, and how much the estimation becomes difficult (thus higher variance) when noise is added to the visual feedback.
\figref{ernst2004merging_results} shows that when the visual noise was low, the visual feedback had more weight, but as the visual noise increased, the haptic feedback gained more weight, as predicted by the \MLE model.
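Under the \MLE model, the predicted \PSE for such a conflicting reference is simply the two-cue case of \eqref{MLE} (notation as above):
\begin{equation}{MLE_PSE}
\mathrm{PSE} = w_v s_v + w_h s_h \quad \text{with} \quad w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_h^2}
\end{equation}
so that increasing the visual noise $\sigma_v^2$ shifts the \PSE away from the visual height $s_v$ and towards the haptic height $s_h$.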
\begin{subfigs}{ernst2002humans}{Visuo-haptic perception of height of a virtual bar \cite{ernst2002humans}. }[
\item Experimental setup.%: Participants estimated height visually with an \OST-\AR display and haptically with force-feedback devices worn on the thumb and index fingers.
%\item with only haptic feedback (red) or only visual feedback (blue, with different added noise),
%\item combined visuo-haptic feedback (purple, with different visual noises).
\item Proportion of trials (vertical axis) where the comparison bar (horizontal axis) was perceived as taller than the reference bar, as a function of the increasing variance (inverse of the reliability) of the visual feedback (colors).
The reference had different conflicting visual $s_v$ and haptic $s_h$ heights.
]
\subfig[.34]{ernst2002humans_setup}
\subfig[.64]{ernst2004merging_results}
%\subfig{ernst2002humans_within}
%\subfig{ernst2002humans_visuo-haptic}
\end{subfigs}
%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that, when seeing and touching a \VO in \AR, the combination of the visual and haptic stimuli rendered to the user, whether real or virtual, can be perceived as a single coherent object property.
As long as the user can match the sensations to the same object property, and even though there are discrepancies between them, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
A visuo-haptic perception of a real or virtual object property is thus robust to a certain difference between the visual and haptic sensations, as long as the user can match them to the same augmented object.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
@@ -36,30 +84,25 @@ Thus, the overall perception can be modified by changing one of the modalities,
Similarly, in \VR, \textcite{degraen2019enhancing} combined visual textures with different passive hair-like haptic structures touched with the finger to induce a larger set of visuo-haptic material perceptions.
\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a \VO touching the arm with a tangible object influenced the perception of roughness.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%When performing a precision grasp (\secref{grasp_types}) in \VR, only a certain relative difference between the tangible and the \VO is noticeable: \percent{6} for the object width, \percent{44} for the surface orientation, and \percent{67} for the surface curvature~\cite{detinguy2019how}.
When performing a precision grasp (\secref{grasp_types}) on a tangible in \VR, some discrepancy in spatial properties between the tangible and the \VO is not noticeable by users: it took a relative difference of \percent{6} for the object width, \percent{44} for the surface orientation, and \percent{67} for the surface curvature to be perceived~\cite{detinguy2019how}.
\subsubsection{Pseudo-Haptic Feedback}
\label{pseudo_haptic}
The \MLE model also implies that perception can cross over between the sensory modalities: a haptic perception can be deliberately influenced by a virtual visual stimulus, as vision is often said to be \enquote{dominant} in perception.
When employed within a \VE, this phenomenon is called \emph{pseudo-haptic feedback}~\cite{ujitoko2021survey}.
See \textcite{ujitoko2021survey} for a detailed survey.
% Visual feedback in VR and AR is known to influence haptic perception [13]. The phenomenon of ”visual dominance” was notably observed when estimating the stiffness of \VOs. L´ecuyer et al. [13] based their ”pseudo-haptic feedback” approach on this notion of visual dominance gaffary2017ar
A few works have also used pseudo-haptic feedback, deforming the visual representation of a user input, to change the perception of haptic stimuli and create richer feedback~\cite{ujitoko2021survey}.
For example, different levels of stiffness can be simulated on a grasped \VO with the same passive haptic device~\cite{achibet2017flexifingers},
or the perceived softness of tangible objects can be altered by superimposing in \AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, also in combination with vibrotactile rendering in \VR~\cite{choi2021augmenting}.
\cite{taima2014controlling}
\cite{ban2012modifying,ban2014displaying}
\cite{kalus2024simulating}
\cite{detinguy2019how}
\cite{samad2019pseudohaptic}
\cite{issartel2015perceiving}
\cite{ogawa2021effect}
\cite{costes2019touchy,ujitoko2019presenting,ota2020surface}
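To illustrate how deforming the visual representation of the input can induce a pseudo-haptic stiffness (a sketch with our own simplified notation; the actual mappings differ across the works cited above), consider scaling the displayed displacement $x_v$ of the pressing hand by a control-to-display (C/D) ratio $r$ relative to the real displacement $x_r$:
\begin{equation}{pseudo_stiffness}
x_v = r\, x_r \quad \Rightarrow \quad \hat{k} \propto \frac{F}{x_v} = \frac{1}{r} \frac{F}{x_r}
\end{equation}
so a ratio $r < 1$ makes the same physical press look smaller and the object feel stiffer.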
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\cite{ujitoko2019modulating}.
However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
@@ -67,6 +110,13 @@ Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-l
Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localized visuo-haptic feedback can be experienced differently in \AR and \VR, which we aim to investigate in this work.
\begin{subfigs}{pseudo_haptic}{Examples of pseudo-haptic feedback. }
\subfigsheight{35mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\subfig{ota2020surface}
\end{subfigs}
\subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
\label{AR_vs_VR}
@@ -74,11 +124,11 @@ Even before manipulating a visual representation to induce a haptic sensation, s
Some studies have investigated the visuo-haptic perception of \VOs in \AR and \VR with grounded force-feedback devices.
In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}%
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
\begin{subfigs}{visuo-haptic-stiffness}{Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}. }[
\item Participant pressing a virtual piston rendered by a force-feedback device with their hand.
\item Proportion of comparison piston perceived as stiffer than reference piston (vertical axis) as a function of the comparison stiffness (horizontal axis) and visual and haptic delays of the reference (colors).
]
@@ -87,13 +137,13 @@ Adding a visual delay increased the perceived stiffness of the reference piston,
\end{subfigs}
%explained how these delays affected the integration of the visual and haptic perceptual cues of stiffness.
The stiffness $k$ of the piston is indeed estimated by both sight and proprioception as the ratio of the exerted force $F$ to the displacement $D$ of the piston, following \eqref{stiffness}, but with a delay $\Delta t$:
\begin{equation}{stiffness_delay}
k = \frac{F(t_H)}{D(t_V)} \quad \text{with} \quad t_H = t_V + \Delta t
\end{equation}
Therefore, the perceived stiffness $k$ increases with a visual delay in displacement (positive $\Delta t$) and decreases with a haptic delay in force (negative $\Delta t$)~\cite{diluca2011effects}.
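To make the sign of this effect concrete, assume (as a simplification of ours) a constant pressing speed $v$, so that $D(t) = v\,t$ and $F(t) = k\,v\,t$: \eqref{stiffness_delay} then gives
\begin{equation}{stiffness_delay_example}
\hat{k} = \frac{F(t_V + \Delta t)}{D(t_V)} = \frac{k\,v\,(t_V + \Delta t)}{v\,t_V} = k \left( 1 + \frac{\Delta t}{t_V} \right)
\end{equation}
which is greater than $k$ for a positive $\Delta t$ (visual delay) and smaller for a negative $\Delta t$ (haptic delay).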
In a similar \TIFC user study, participants compared perceived stiffness of virtual pistons in \OST-\AR and \VR~\cite{gaffary2017ar}.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
@@ -118,7 +168,7 @@ They have shown how the latency of the visual rendering of a \VO or the type of
We describe in the next section how wearable haptics have been integrated with immersive \AR.
\subsection{Wearable Haptics for Direct Hand Interaction in AR}
\label{vhar_haptics}
A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
@@ -164,12 +214,13 @@ However, as for \textcite{teng2021touch}, finger speed was not taken into accoun
\subfig{maeda2022fingeret}
\end{subfigs}
\subsubsection{Ring Belt Devices}
\label{vhar_rings}
The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been employed to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of rendering the weight of a virtual cube placed on a real surface, held with the thumb, index, and middle fingers (\figref{scheggi2010shape}).
The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
%However, no proper user study was conducted to evaluate this feedback.% on the manipulation of the cube.
%that simulated the weight of the cube.
@@ -178,7 +229,7 @@ The middle phalanx of each of these fingers was equipped with a haptic ring of \
In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual rendering of the tracked fingertips as virtual points.
They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived by users to be more effective than the moving platform.
However, the measured difference in performance could be attributed to either the device or the device position (proximal vs fingertip), or both.
These two studies were also conducted in non-immersive setups, where users looked at a screen displaying the visual interactions, and only compared haptic and visual rendering of the hand-object contacts, but did not examine them together.
@@ -187,18 +238,19 @@ These two studies were also conducted in non-immersive setups, where users looke
\item Rendering weight of a virtual cube placed on a real surface~\cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube~\cite{maisto2017evaluation,meli2018combining}.
]
\subfigsheight{57mm}
\subfig{scheggi2010shape}
\subfig{maisto2017evaluation}
\end{subfigs}
\subsubsection{Wrist Bracelet Devices}
\label{vhar_bracelets}
With their \enquote{Tactile And Squeeze Bracelet Interface} (Tasbi), already mentioned in \secref{belt_actuators}, \textcite{pezent2019tasbi} and \textcite{pezent2022design} explored the use of a wrist-worn bracelet actuator.
It is capable of providing a uniform pressure sensation (up to \qty{15}{\N} and \qty{10}{\Hz}) and vibration with six \LRAs (\qtyrange{150}{200}{\Hz} bandwidth).
A user study was conducted in \VR to compare the perception of visuo-haptic stiffness rendering~\cite{pezent2019tasbi}.
In a \TIFC task (\secref{sensations_perception}), participants pressed a virtual button with different levels of stiffness via a virtual hand constrained by the \VE (\figref{pezent2019tasbi_2}).
A higher visual stiffness required a larger physical displacement to press the button (C/D ratio, see \secref{pseudo_haptic}), while the haptic stiffness controlled the rate of the pressure feedback when pressing.
When the visual and haptic stiffness were coherent or when only the haptic stiffness changed, participants easily discriminated two buttons with different stiffness levels (\figref{pezent2019tasbi_3}).
However, if only the visual stiffness changed, participants were not able to discriminate the different stiffness levels (\figref{pezent2019tasbi_4}).
@@ -216,10 +268,7 @@ A short vibration (\qty{25}{\ms} \qty{175}{\Hz} square-wave) was also rendered w
\subfig{pezent2019tasbi_4}
\end{subfigs}
Although not conducted in \AR, the studies of \textcite{sarac2022perceived} and \textcite{palmer2022haptic} investigated relocating the haptic feedback of fingertip-object contacts to the wrist.
%\subsection{Conclusion}
%\label{visuo_haptic_conclusion}
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
% The haptic feedback is thus rendered de-localized from the point of contact of the finger on the rendered object.

View File

@@ -1,22 +1,28 @@
\section{Conclusion}
\label{conclusion}
The complexity of perceiving the haptic properties of objects and the diversity of possible interactions make it particularly difficult to design realistic haptic devices and renderings.
All the more so as the sense of touch is distributed over the whole hand and body, and haptic sensations are necessarily produced by direct contact of the skin with the object, and thus tied to a movement of the hand on the object.
There is therefore no generic haptic system that can address all aspects of the haptic sense, but rather a wide variety of haptic devices and renderings with different objectives, constraints and trade-offs.
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
% The haptic feedback is thus rendered de-localized from the point of contact of the finger on the rendered object.
However, a digital audio-visual experience can be imperfect and yet sufficient to be useful and interesting, as a video conference can be, and can convey sensations comparable to real ones, such as watching and listening to a concert on a screen with headphones.
Yet the visual and audio quality of such experiences is very different from that of a \enquote{real} conversation or an everyday scene.
Thus, more than recreating realistic haptic experiences, it is more important to render the sensory stimulus \enquote{at the right time and the right place}~\cite{hayward2007it}.
% ---
%The quality of the illusory haptic experience is a function of the interplay between the users perceptual system and the intrinsic technical qualities of the interfaces
Interestingly, the two previous sections, presenting wearable haptics (\secref{wearable_haptics}) and \AR (\secref{augmented_reality}), follow rather opposite paths: the former starts from the haptic sense and the hand to describe wearable haptic devices and the interactions they enable, while the latter starts from a technological description of \AR before detailing its perception and use.
%
This is how each of the two fields is often introduced in the literature, for example by \textcite{choi2013vibrotactile,culbertson2018haptics} for haptics and by \textcite{bimber2005spatial,kim2018revisiting} for \AR.
It is also interesting to note that these two fields are at different stages of maturity.
Indeed, contributing to both fields raises, among other things, significant technical challenges, as detailed in \secref[introduction]{research_challenges}.
There is also a need for standardization in wearable haptics~\cite{culbertson2018haptics}, particularly in terms of devices and renderings, whereas the industry is rather well established in \AR, for example with the Microsoft HoloLens~2~\footnoteurl{https://www.microsoft.com/hololens} and Apple Vision~Pro~\footnoteurl{https://www.apple.com/apple-vision-pro/} headsets, or the Google ARCore~\footnoteurl{https://developers.google.com/ar} and Apple ARKit~\footnoteurl{https://developer.apple.com/augmented-reality/} frameworks.
This can be explained on the one hand by the maturity of the \VR industry, which drives that of \AR, with an announced trend towards the convergence of these two technologies~\cite{speicher2019what}, and on the other hand by the greater complexity and particularities of the haptic sense~\cite{culbertson2018haptics}.
Conversely, defining and characterizing \AR/\MR, and to a much lesser extent \VR, surprisingly remains an open question~\cite{speicher2019what}.


View File

@@ -38,7 +38,7 @@
\let\AE\undefined
\let\v\undefined
\acronym[TIFC]{2IFC}{two-interval forced choice}
\acronym[TwoD]{2D}{two-dimensional}
\acronym[ThreeD]{3D}{three-dimensional}
\acronym{AE}{augmented environment}

View File

@@ -1604,6 +1604,16 @@
doi = {10/gzk2hf}
}
@incollection{jeon2008modulating,
title = {Modulating {{Real Object Stiffness}} for {{Haptic Augmented Reality}}},
booktitle = {Haptics: {{Perception}}, {{Devices}} and {{Scenarios}}},
author = {Jeon, Seokhee and Choi, Seungmoon},
date = {2008},
volume = {5024},
pages = {609--618},
doi = {10.1007/978-3-540-69057-3_78}
}
@article{jeon2009haptic,
title = {Haptic {{Augmented Reality}}: {{Taxonomy}} and an {{Example}} of {{Stiffness Modulation}}},
shorttitle = {Haptic {{Augmented Reality}}},
@@ -1616,6 +1626,16 @@
doi = {10/dzwg7q}
}
@inproceedings{jeon2010stiffness,
title = {Stiffness Modulation for {{Haptic Augmented Reality}}: {{Extension}} to {{3D}} Interaction},
shorttitle = {Stiffness Modulation for {{Haptic Augmented Reality}}},
booktitle = {{{IEEE Haptics Symp}}.},
author = {Jeon, Seokhee and Choi, Seungmoon},
date = {2010},
pages = {273--280},
doi = {10/cp3shc}
}
@inproceedings{jeon2011extensions,
title = {Extensions to Haptic Augmented Reality: {{Modulating}} Friction and Weight},
shorttitle = {Extensions to Haptic Augmented Reality},
@@ -1636,6 +1656,17 @@
doi = {10/gm97jp}
}
@incollection{jeon2015haptic,
title = {Haptic {{Augmented Reality}}: {{Taxonomy}}, {{Research Status}}, and {{Challenges}}},
shorttitle = {Haptic {{Augmented Reality}}},
booktitle = {Fundamentals of {{Wearable Computers}} and {{Augmented Reality}}},
author = {Jeon, Seokhee and Choi, Seungmoon and Harders, Matthias},
date = {2015},
pages = {246--277},
doi = {10.1201/b18703-15}
}
@article{johansson1984roles,
title = {Roles of Glabrous Skin Receptors and Sensorimotor Memory in Automatic Control of Precision Grip When Lifting Rougher or More Slippery Objects},
author = {Johansson, R.S. and Westling, G.},
@@ -1880,6 +1911,17 @@
doi = {10.1007/978-1-4419-5615-6_12}
}
@article{klatzky2013haptic,
title = {Haptic {{Perception}} of {{Material Properties}} and {{Implications}} for {{Applications}}},
author = {Klatzky, Roberta L. and Pawluk, Dianne and Peer, Angelika},
date = {2013},
journaltitle = {Proc. IEEE},
volume = {101},
number = {9},
pages = {2081--2092},
doi = {10/gbdspn}
}
@article{klein2020predicting,
title = {Predicting Precision Grip Grasp Locations on Three-Dimensional Objects},
author = {Klein, Lina K. and Maiello, Guido and Paulun, Vivian C. and Fleming, Roland W.},