From 27f635afe48ac53129b6da39d897ab0ab4738c09 Mon Sep 17 00:00:00 2001
From: Erwan Normand
Date: Thu, 19 Sep 2024 08:32:48 +0200
Subject: [PATCH] WIP

---
 .../related-work/2-wearable-haptics.tex   |  1 -
 .../related-work/3-augmented-reality.tex  | 11 +++++----
 .../related-work/4-visuo-haptic-ar.tex    | 23 ++++++++++++-------
 config/acronyms.tex                       |  1 +
 4 files changed, 23 insertions(+), 13 deletions(-)

diff --git a/1-introduction/related-work/2-wearable-haptics.tex b/1-introduction/related-work/2-wearable-haptics.tex
index 943dc66..272a27d 100644
--- a/1-introduction/related-work/2-wearable-haptics.tex
+++ b/1-introduction/related-work/2-wearable-haptics.tex
@@ -172,7 +172,6 @@ In direct touch, the haptic device does not cover the interior of the hand to no
 We are interested in direct touch augmentations with wearable haptic devices (\secref{wearable_haptic_devices}), as their integration with \AR is particularly promising for direct hand interaction with visuo-haptic augmentations.
 We also focus tactile augmentations stimulating the mechanoreceptors of the skin (\secref{haptic_sense}), thus excluding temperature perception, as they are the most common existing haptic interfaces.
-% \cite{bhatia2024augmenting}. Types of interfaces : direct touch, through touch, through tool. Focus on direct touch, but when no rendering done,
 % \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
 % \cite{girard2016haptip} : renderings with a tangential motion actuator

diff --git a/1-introduction/related-work/3-augmented-reality.tex b/1-introduction/related-work/3-augmented-reality.tex
index a02f740..f7d5502 100644
--- a/1-introduction/related-work/3-augmented-reality.tex
+++ b/1-introduction/related-work/3-augmented-reality.tex
@@ -1,8 +1,8 @@
 \section{Manipulating Object with the Hands in AR}
 \label{augmented_reality}
 
-The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room.
-Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following the interaction loop presented in \figref[introduction]{interaction-loop}.
+As with haptic systems (\secref{wearable_haptics}), visual \AR devices generate and integrate virtual content into the user's perception of the \RE, creating the illusion of the presence of the virtual.
+Immersive systems such as headsets leave the hands free to interact with \VOs, promising natural and intuitive interactions similar to those with everyday real objects.
 
 %\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system~\cite{sutherland1968headmounted}. }[
 % \item The \AR headset.
@@ -17,6 +17,9 @@ Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) p
 \subsection{What is Augmented Reality?}
 \label{what_is_ar}
 
+The first \AR headset was invented by \textcite{sutherland1968headmounted}: With the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room.
+Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following the interaction loop presented in \figref[introduction]{interaction-loop}.
+
 \subsubsection{A Definition}
 \label{ar_definition}
 
@@ -336,10 +339,10 @@ Taken together, these results suggest that a visual hand rendering in \AR could
 \subsection{Conclusion}
 \label{ar_conclusion}
 
-\AR systems integrate \VOs into the visual perception as if they were part of the \RE.
+\AR systems integrate virtual content into the user's perception as if it were part of the \RE.
 \AR headsets now enable real-time tracking of the head and hands, and high-quality display of virtual content, while being portable and mobile.
 They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
 But without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
 In particular, there is a lack of mutual occlusion and interaction cues between hands and virtual content while manipulating \VOs in \OST-\AR that could be mitigated by visual rendering of the hand.
-A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their number and association with \VOs, as well as consistency with the visual rendering.
+A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
 In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched tangible objects.

diff --git a/1-introduction/related-work/4-visuo-haptic-ar.tex b/1-introduction/related-work/4-visuo-haptic-ar.tex
index 552b0df..714fa78 100644
--- a/1-introduction/related-work/4-visuo-haptic-ar.tex
+++ b/1-introduction/related-work/4-visuo-haptic-ar.tex
@@ -14,12 +14,17 @@ \subsection{Influence of Visual Rendering on Haptic Perception}
 \label{visual_haptic_influence}
 
+
 \subsubsection{Merging the Sensations into a Perception}
 \label{sensations_perception}
 
-When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
-The phychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
-\cite{ernst2004merging}
+As discussed in the previous sections, a \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}.
+For example, haptic hardness is perceived through skin pressure and force sensations (\secref{hardness}), hand movement through proprioception and a visual hand avatar (\secref{ar_displays}), and the size of a tangible through its co-localized \VO (\secref{ar_tangibles}).
+
+When the sensations are redundant, \ie any one sensory modality alone could be used to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}.
+The \MLE model explains how one estimates the perceived property from multiple sensory cues by weighting each cue according to its reliability~\cite{ernst2002optimal}.
+
+
 Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
 %
@@ -62,12 +67,11 @@ Conversely, as discussed by \textcite{ujitoko2021survey} in their review, a co-l
 Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.
 
+
 \subsubsection{Perception of Visuo-Haptic Rendering in AR and VR}
 \label{AR_vs_VR}
 
-Some studies have investigated the visuo-haptic perception of \VOs in \AR and \VR.
-They have shown how the latency of the visual rendering of an object with haptic feedback or the type of environment (\VE or \RE) can affect the perception of an identical haptic rendering.
-Indeed, there are indeed inherent and unavoidable latencies in the visual and haptic rendering of \VOs, and the visual-haptic feedback may not appear to be simultaneous.
+Some studies have investigated the visuo-haptic perception of \VOs in \AR and \VR with grounded force-feedback devices.
 In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
 In a \TAFC task, participants pressed two pistons and indicated which was stiffer.
@@ -113,6 +117,9 @@ In a user study, participants touched a virtual cube with a virtual hand: The co
 The visuo-haptic simultaneity varied by either adding a visual delay or by triggering earlier the haptic feedback.
 No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
+These studies have shown that both the latency of the visual rendering of a \VO and the type of environment (\VE or \RE) can affect the haptic stiffness perceived through a grounded force-feedback device.
+We describe in the next section how wearable haptics have been integrated with immersive \AR.
+
 \subsection{Wearable Haptics for AR}
 \label{vhar_haptics}
 
@@ -213,8 +220,8 @@ A short vibration (\qty{25}{\ms} \qty{175}{\Hz} square-wave) was also rendered w
 \end{subfigs}
 
-\subsection{Conclusion}
-\label{visuo_haptic_conclusion}
+%\subsection{Conclusion}
+%\label{visuo_haptic_conclusion}
 
 % the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
 %In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.

diff --git a/config/acronyms.tex b/config/acronyms.tex
index cd5c97f..991a387 100644
--- a/config/acronyms.tex
+++ b/config/acronyms.tex
@@ -50,6 +50,7 @@
 \acronym{h}{haptic}
 \acronym{JND}{just noticeable difference}
 \acronym{LRA}{linear resonant actuator}
+\acronym{MLE}{maximum-likelihood estimation}
 \acronym{MR}{mixed reality}
 \acronym{OST}{optical see-through}
 \acronym{PI}{place illusion}
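The \MLE cue-integration model that this patch introduces in \texttt{4-visuo-haptic-ar.tex} can be written out explicitly. Below is a minimal LaTeX sketch of the standard two-cue (visual and haptic) formulation, which could be added alongside the new paragraph; the symbols $\hat{S}_i$ (unimodal estimates) and $\sigma_i^2$ (their variances) are illustrative names, not taken from the thesis:

```latex
% Maximum-likelihood integration of a visual (V) and a haptic (H)
% estimate of the same object property: each cue is weighted by its
% inverse variance, i.e. its reliability.
\begin{equation}
  \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
  \qquad
  w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\end{equation}
% The combined estimate is more reliable than either cue alone:
\begin{equation}
  \sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2}
  \leq \min\left(\sigma_V^2, \sigma_H^2\right).
\end{equation}
```

This makes explicit the claim of the line the patch removes: as one modality becomes noisier ($\sigma_H^2 \to \infty$), its weight vanishes ($w_H \to 0$), so the sense with the least variability dominates the perception.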