Simplify acronyms

This commit is contained in:
2024-09-23 15:20:41 +02:00
parent 560a085e6e
commit 23973caef7
9 changed files with 81 additions and 88 deletions


@@ -1,7 +1,7 @@
\chapter{Introduction}
\mainlabel{introduction}
This thesis presents research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and \WH devices.
This thesis presents research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and wearable haptic devices.
%It is entitled:
%- Augmenting the interaction with everyday objects with wearable haptics and Augmented Reality
@@ -73,9 +73,9 @@ Instead, wearable interfaces are directly mounted on the body to provide kinesth
\subfig[0.25]{culbertson2018haptics-wearable}
\end{subfigs}
A wide range of \WH devices have been developed to provide the user with rich virtual haptic sensations, including normal force, skin stretch, vibration and thermal feedback.
A wide range of wearable haptic devices have been developed to provide the user with rich virtual haptic sensations, including normal force, skin stretch, vibration and thermal feedback.
%
\figref{wearable-haptics} shows some examples of different \WH devices with different form factors and rendering capabilities.
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
%
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions \cite{pacchierotti2017wearable,culbertson2018haptics}.
%
@@ -85,8 +85,8 @@ But their use in combination with \AR has been little explored so far.
Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.
}[
\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}.
\item Touch\&Fold, a \WH device mounted on the nail that fold on demand to render contact, normal force and vibrations to the fingertip \cite{teng2021touch}.
\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger \cite{pacchierotti2016hring}.
\item Touch\&Fold, a wearable haptic device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip \cite{teng2021touch}.
\item The hRing, a wearable haptic ring mounted on the proximal phalanx able to render normal and shear forces to the finger \cite{pacchierotti2016hring}.
\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist \cite{pezent2022design}.
]
\subfigsheight{28mm}
@@ -105,9 +105,9 @@ It thus promises natural and seamless interaction with the physical and digital
%
It is technically and conceptually closely related to \VR, which replaces the \RE perception with a \VE.
%
\AR and \VR can be placed on a \RV continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the \RV continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects to a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects into a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
%
It describes the degree of \RV of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
%
Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still a debate on defining and characterizing \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
%
@@ -115,9 +115,9 @@ Between these two extremes lies \MR, which comprises \AR and \VR as different le
%
The most promising devices are \AR headsets, which are portable displays worn directly on the head, providing the user with an immersive \AE/\VE.
\begin{subfigs}{rv-continuums}{Reality-virtuality (\RV) continuums. }[
\item Original \RV continuum for the visual sense initially proposed by and adapted from \textcite{milgram1994taxonomy}.
\item Extension of the \RV continuum to include the haptic sense on a second, orthogonal axis, proposed by and adapted from \textcite{jeon2009haptic}.
\begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.
\item Extension to include the haptic sense on a second, orthogonal axis, proposed by and adapted from \textcite{jeon2009haptic}.
]
\subfig[0.44]{rv-continuum}
\subfig[0.54]{visuo-haptic-rv-continuum3}
@@ -125,35 +125,35 @@ The most promising devices are \AR headset, which are portable displays worn dir
\AR/\VR can also be extended to render sensory modalities other than vision.
%
\textcite{jeon2009haptic} proposed extending the \RV continuum to include haptic feedback by decoupling into two orthogonal haptic and visual axes (\figref{visuo-haptic-rv-continuum3}).
\textcite{jeon2009haptic} proposed extending the reality-virtuality continuum to include haptic feedback by decoupling it into two orthogonal haptic and visual axes (\figref{visuo-haptic-rv-continuum3}).
%
The combination of the two axes defines 9 types of \vh environments, with 3 possible levels of \RV for each \v or \h axis: real, augmented and virtual.
The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic axis: real, augmented and virtual.
%
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
For example, a visual \AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a haptic \RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a haptic \VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
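To make the 3-by-3 layout concrete, the nine environment types could be sketched as a table. This is a hypothetical illustration whose cell placement follows the examples discussed in the text, not necessarily the exact layout or orientation of \figref{visuo-haptic-rv-continuum3}:

```latex
% Schematic 3x3 grid of visuo-haptic environments (rows: haptic axis,
% columns: visual axis); cells name the example systems cited in the text.
\begin{tabular}{l|ccc}
                 & Visual real & Visual augmented             & Visual virtual \\ \hline
Haptic virtual   &             & \eg \cite{meli2018combining} &                \\
Haptic augmented &             & \eg \cite{bau2012revel}      & \eg \cite{salazar2020altering} \\
Haptic real      &             & \eg \cite{kahl2023using}     &                \\
\end{tabular}
```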
%
Haptic \AR (\h-\AR) is then the combination of real and virtual haptic stimuli \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
Haptic \AR is then the combination of real and virtual haptic stimuli \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using wearable haptics.
%
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
%
\figref{bau2012revel} shows another example of \vh-\AR rendering of virtual texture when running the finger on a tangible surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum3}).
\figref{bau2012revel} shows another example of visuo-haptic \AR rendering of virtual texture when running the finger on a tangible surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum3}).
Current \v-\AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching the \VE with the hand.
Current visual \AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching the \VE with the hand.
%
All \v-\VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and interact with them with confidence and efficiency.
All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and interact with them with confidence and efficiency.
%
It is therefore necessary to provide haptic feedback that is consistent with the \v-\AE and ensures the best possible user experience.
It is therefore necessary to provide haptic feedback that is consistent with the visual \AE and ensures the best possible user experience.
%
The integration of \WHs with \AR seems to be one of the most promising solutions, but it remains challenging due to their many respective characteristics and the additional constraints of combining them.
The integration of wearable haptics with \AR seems to be one of the most promising solutions, but it remains challenging due to their many respective characteristics and the additional constraints of combining them.
\begin{subfigs}{visuo-haptic-environments}{
Visuo-haptic environments with different degrees of reality-virtuality.
}[
\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback \cite{bau2012revel}.
\item Visual \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
\item A tangible object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
\item Visuo-haptic rendering of texture on a touched tangible object with a visual \AR display and haptic electrovibration feedback \cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}
@@ -166,28 +166,28 @@ The integration of \WHs with \AR seems to be one of the most promising solutions
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
\label{research_challenges}
The integration of \WHs with \AR to create a \vh-\AE is complex and presents many perceptual and interaction challenges, \ie sensing the \AE and acting effectively upon it.
The integration of wearable haptics with \AR to create a visuo-haptic \AE is complex and presents many perceptual and interaction challenges, \ie sensing the \AE and acting effectively upon it.
%
We are particularly interested in enabling direct bare-hand contact with virtual and augmented objects.
%
Our goal is to enable congruent, intuitive and seamless perception and manipulation of the \vh-\AE.
Our goal is to enable congruent, intuitive and seamless perception and manipulation of the visuo-haptic \AE.
The experience of such a \vh-\AE relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
The experience of such a visuo-haptic \AE relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
%
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
%
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an \AR headset and a \WH device.
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an \AR headset and a wearable haptic device.
%
Because the \vh-\VE is displayed in real time, colocalized and aligned with the real one, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.
Because the visuo-haptic \VE is displayed in real time, colocalized and aligned with the real one, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.
\fig{interaction-loop}{
The interaction loop between a user and a visuo-haptic augmented environment.
}[
One interacts with the visual (in blue) and haptic (in red) virtual environment through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs.
The virtual environment is rendered back to the user co-localized with the real one (in gray) using a \v-\AR headset and a \WH device.
The virtual environment is rendered back to the user co-localized with the real one (in gray) using a visual \AR headset and a wearable haptic device.
]
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of \WHs and augmented reality technologies.
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of wearable haptics and augmented reality technologies.
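As a rough illustration, one iteration of this sense-simulate-render loop could be sketched as follows. All function and parameter names here are hypothetical placeholders, not the actual system's API:

```python
# A minimal sketch of one frame of the interaction loop described above.
# Names are illustrative assumptions, not the system's real interface.

def interaction_loop_step(track_hand, simulate, render_visual, render_haptic):
    """Sense -> simulate -> render, for one frame of the interaction loop."""
    hand_pose = track_hand()            # 1. sensors track the real hand
    contacts = simulate(hand_pose)      # 2. virtual hand vs. virtual objects
    render_visual(hand_pose, contacts)  # 3a. co-localized AR headset display
    render_haptic(contacts)             # 3b. wearable haptic device feedback
    return contacts
```

Running this step at display rate, with both renderings co-localized with the real environment, is what sustains the illusion of directly touching the virtual content.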
%
In this context, we identify two main research challenges that we address in this thesis:
%
@@ -196,7 +196,7 @@ In this context, we identify two main research challenges that we address in thi
\item enabling effective manipulation of the augmented environment.
\end{enumerate*}
%
Each of these challenges also raises numerous design, technical and human issues specific to each of the two types of feedback, \WHs and immersive \AR, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks into a coherent and seamless \vh-\AE.
Each of these challenges also raises numerous design, technical and human issues specific to each of the two types of feedback, wearable haptics and immersive \AR, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedback modalities into a coherent and seamless visuo-haptic \AE.
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{interaction-loop}.
@@ -212,11 +212,11 @@ Firstly, the user's hand and \RE are visible in \AR, unlike \VR where there is t
As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli \cite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing \cite{azmandian2016haptic}.
%
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
Moreover, many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
%
The user's hand must be indeed free to touch and interact with the \RE while wearing a \WH device.
The user's hand must indeed be free to touch and interact with the \RE while wearing a wearable haptic device.
%
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement haptic \AR, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not.
%
@@ -228,7 +228,7 @@ These added virtual sensations can therefore be perceived as out of sync or even
%
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and to what extent they will conflict or complement each other in the perception of the \AE.
%
%Therefore, it remains to be investigated how these three characteristics of using \WHs with \AR affect the perception, especially with visually and haptically augmented objects.
%Therefore, it remains to be investigated how these three characteristics of using wearable haptics with \AR affect the perception, especially with visually and haptically augmented objects.
% Note (translated from French): one sees one's own hand touching, unlike in \VR, where vision is particularly dominant (eg retargeting); hard to say whether this holds in \AR, especially when touching augmented objects, since it is difficult to modify them both visually and haptically: sensations can be added but not really removed. The actuator is not where one touches: how realistic will the sensations be? Coherent with the visual sensations? How different is perception from \VR, in terms of hand/environment rendering and latency? Important, as it would allow effective use, with corrections relative to \VR if needed. The interaction loop necessarily has latency relative to movements and proprioception, and it differs between visual and haptic feedback: what is the effect?
@@ -262,9 +262,9 @@ Yet, it is unclear which type of visual and haptic feedback is the best suited t
\section{Approach and Contributions}
\label{contributions}
The aim of this thesis is to understand how immersive visual and \WH augmentations compare and complement each other in the context of direct hand perception and manipulation with augmented objects.
The aim of this thesis is to understand how immersive visual and wearable haptic augmentations compare and complement each other in the context of direct hand perception and manipulation with augmented objects.
%
As described in the Research Challenges section above, providing a convincing, consistent and effective \vh-\AE to a user is complex and raises many issues.
As described in the Research Challenges section above, providing a convincing, consistent and effective visuo-haptic \AE to a user is complex and raises many issues.
%
Our approach is to
%
@@ -300,13 +300,13 @@ Our contributions in these two axes are summarized in \figref{contributions}.
% Very short abstract of contrib 2
\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a \h-\AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
Wearable haptic devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible object nor covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
%
However, wearable \h-\AR have been little explored with \v-\AR, as well as the visuo-haptic augmentation of textures.
However, wearable haptic \AR has been little explored in combination with visual \AR, and neither has the visuo-haptic augmentation of textures.
%
Texture is indeed one of the main tactile sensations of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied forms of haptic-only rendering (without visuals) \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
%
@@ -328,19 +328,19 @@ Hence, our second objective is to understand how the perception of haptic textur
Finally, some visuo-haptic texture databases have been modeled from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural \vh-\AR using \WHs remains to be investigated.
However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated.
%
Our third objective is to evaluate the perception of simultaneous and co-localized visuo-haptic texture augmentation of tangible surfaces in \AR, directly touched by the hand, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
\subsectionstarbookmark{Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
In immersive and wearable \vh-\AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of \VOs with the bare hand.
In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of \VOs with the bare hand.
%
However, the intangibility of the \v-\VE, the many display limitations of current \v-\AR systems and \WH devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of \WHs, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
However, the intangibility of the visual \VE, the many display limitations of current visual \AR systems and wearable haptic devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
%
Still two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs \cite{lopes2018adding,teng2021touch}.
Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive visual \AEs: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with wearable haptics \cite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
%
@@ -350,13 +350,13 @@ First, the visual rendering of the virtual hand is a key element for interacting
%
A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But \v-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
But visual \AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
%
Thus, our fourth objective is to evaluate and compare the effect of different visual hand augmentations on direct manipulation of \VOs in \AR.
Finally, as described above, \WHs for \v-\AR rely on moving the haptic actuator away from the fingertips to not impair the hand movements, sensations, and interactions with the \RE.
Finally, as described above, wearable haptics for visual \AR rely on moving the haptic actuator away from the fingertips so as not to impair the hand's movements, sensations, and interactions with the \RE.
%
Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience \cite{maisto2017evaluation,meli2018combining}.
Previous works have shown that wearable haptics providing feedback on hand manipulation of \VOs in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial nor how a haptic augmentation of the hand compares or complements with a visual augmentation of the hand.
%
@@ -369,17 +369,12 @@ Our last objective is to investigate the role of visuo-haptic augmentations of t
%Present the contributions and structure of the thesis.
This thesis is divided into four parts.
%
\partref{context} describes the context and background of our research, within which this first current \textit{Introduction} chapter presents the research challenges, and the objectives, approach, and contributions of this thesis.
%
\chapref{related_work} then presents previous work on the perception of and interaction with visual and haptic augmentations using \WHs and \AR, and how they have been combined in \vh-\AEs.
%
Firstly, it gives an overview of how \WHs have been used to enhance the touch perception and interaction, with a focus on vibrotactile feedback and haptic textures.
%
It then introduces \AR, and how users perceive and can interact with the augmented environments, in particular using the visual rendering of the user's hand.
%
Finally, it shows how multimodal visuo-haptic feedback has been used in \AR and \VR to alter the perception of tangible objects and to improve the manipulation of \VOs.
%
In \partref{context}, we describe the context and background of our research; within it, this first \textit{Introduction} chapter presents the research challenges, as well as the objectives, approach, and contributions of this thesis.
In \chapref{related_work}, we then review previous work on the direct-hand perception and manipulation of virtual and augmented objects, using either wearable haptics, \AR, or their combination.
First, we overview how the hand perceives and manipulates real everyday objects.
Second, we present wearable haptics and haptic augmentations of roughness and hardness of real objects.
Third, we introduce \AR, and how \VOs can be manipulated directly with the hand.
Finally, we describe how multimodal visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
Next, we address each of our two research axes in a dedicated part.
\bigskip
@@ -390,7 +385,7 @@ We evaluate how the visual rendering of the hand (real or virtual), the environm
\chapref{xr_perception} details a system for rendering visuo-haptic virtual textures that augment tangible surfaces using an immersive \AR/\VR headset and a wearable vibrotactile device.
%
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, and is provided to the middle phalanx of the index finger touching the texture using a \VCA.
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, which is provided to the middle phalanx of the index finger touching the texture using a voice-coil actuator.
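As a sketch of the underlying principle, and not of the thesis's actual implementation, sliding a finger at speed $v$ over a grating with spatial period $\lambda$ yields a vibration of temporal frequency $f = v / \lambda$. The function and parameter names below are hypothetical:

```python
import math

def grating_vibration(finger_speed, spatial_period, amplitude, t):
    """Value at time t (s) of a vibrotactile signal simulating a grating:
    sliding at finger_speed (m/s) over ridges spaced spatial_period (m)
    apart produces a temporal frequency f = speed / period (Hz)."""
    frequency = finger_speed / spatial_period
    return amplitude * math.sin(2.0 * math.pi * frequency * t)

# Example: 0.1 m/s over a 2 mm grating -> 50 Hz vibration.
f = 0.1 / 0.002
samples = [grating_vibration(0.1, 0.002, 1.0, n / 1000.0) for n in range(20)]
```

In a real-time renderer, the finger speed would be re-estimated from tracking at every frame, so the instantaneous frequency follows the finger's motion over the surface.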
%
The tracking of the real hand and environment is done using a marker-based technique, and the visual rendering of their virtual counterparts is done using the immersive \OST \AR headset Microsoft HoloLens~2.
@@ -406,7 +401,7 @@ The textures are paired visual and tactile models of real surfaces \cite{culbert
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%
Our objective is to assess the perceived realism, plausibility and roughness of the combination of nine representative visuo-haptic texture pairs, and the coherence of their association.
Our objective is to assess the perceived realism, coherence, and roughness of the combination of nine representative visuo-haptic texture pairs.
\bigskip


@@ -94,7 +94,7 @@ As illustrated in the \figref{sensorimotor_continuum}, \Citeauthor{jones2006huma
]
This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
In this thesis, we are interested in exploring \vh augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) in the context of \AR and \WHs.
In this thesis, we are interested in exploring visuo-haptic augmentations (\partref{perception}) and grasping of \VOs (\partref{manipulation}) in the context of \AR and wearable haptics.
\subsubsection{Hand Anatomy and Motion}
\label{hand_anatomy}


@@ -28,7 +28,7 @@ Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) p
The first formal definition of \AR was proposed by \textcite{azuma1997survey}: (1) combine real and virtual, (2) be interactive in real time, and (3) register real and virtual\footnotemark.
Each of these characteristics is essential: the real-virtual combination distinguishes \AR from \VR, a movie with integrated digital content is not interactive, and a \TwoD overlay like an image filter is not registered.
There are also two key aspects to this definition: it does not focus on technology or method, but on the user's perspective of the system experience, and it does not specify a particular human sense, \ie it can be auditory \cite{yang2022audio}, haptic \cite{bhatia2024augmenting}, or even olfactory \cite{brooks2021stereosmell} or gustatory \cite{brooks2023taste}.
Yet, most of the research have focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as \v-\AR.
Yet, most research has focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as visual \AR.
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}, the original definition was: \enquote{registered in \ThreeD}.}
%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.
@@ -107,21 +107,21 @@ Still, these concepts are useful to design, evaluate and discuss our contributio
\label{ar_presence}
\AR and \VR are both essentially illusions as the virtual content does not physically exist but is just digitally simulated and rendered to the user's senses through display \UIs.
Such experience of disbelief suspension in \VR is what is called \emph{presence}, and it can be decomposed into two dimensions: \PI and \PSI \cite{slater2009place}.
\PI is the sense of the user of \enquote{being there} in the \VE (\figref{presence-vr}).
Such an experience of suspension of disbelief in \VR is what is called \emph{presence}, and it can be decomposed into two dimensions: place illusion and plausibility \cite{slater2009place}.
Place illusion is the sense of the user of \enquote{being there} in the \VE (\figref{presence-vr}).
It emerges from the real-time rendering of the \VE from the user's perspective: being able to move around inside the \VE and to look at it from different points of view.
\PSI is the illusion that the virtual events are really happening, even if the user knows that they are not real.
Plausibility is the illusion that the virtual events are really happening, even if the user knows that they are not real.
It does not mean that the virtual events are realistic, but that they are plausible and coherent with the user's expectations.
%The \AR presence is far less defined and studied than for \VR \cite{tran2024survey}
For \AR, \textcite{slater2022separate} proposed to invert \PI to what we can call \enquote{object illusion}, \ie the sense of the \VO to \enquote{feels here} in the \RE (\figref{presence-ar}).
For \AR, \textcite{slater2022separate} proposed to invert place illusion into what we can call \enquote{object illusion}, \ie the sense that the \VO \enquote{feels here} in the \RE (\figref{presence-ar}).
As in \VR, \VOs must be able to be seen from different angles as the user moves their head, but they must also, which is more difficult, be consistent with the \RE, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows, or reflect light.
The \PSI can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react accordingly to it.
\textcite{skarbez2021revisiting} also named \PI for \AR as \enquote{immersion} and \PSI as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
Plausibility can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react to it accordingly.
\textcite{skarbez2021revisiting} also named place illusion for \AR as \enquote{immersion} and plausibility as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
One main issue with presence is how to measure it both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}.
\begin{subfigs}{presence}{The sense of immersion in virtual and augmented environments. Adapted from \textcite{stevens2002putting}. }[
\item Place Illusion (PI) is the sense of the user of \enquote{being there} in the \VE.
\item Place illusion is the sense of the user of \enquote{being there} in the \VE.
\item Object illusion is the sense that the \VO \enquote{feels here} in the \RE.
]
\subfigsheight{35mm}
@@ -132,7 +132,7 @@ One main issue with presence is how to measure it both in \VR \cite{slater2022se
\paragraph{Embodiment}
\label{ar_embodiment}
The \SoE is the \enquote{subjective experience of using and having a body} \cite{blanke2009fullbody}, \ie the feeling that a body is our own.
The sense of embodiment is the \enquote{subjective experience of using and having a body} \cite{blanke2009fullbody}, \ie the feeling that a body is our own.
In everyday life, we are used to being, seeing and controlling our own body, but it is possible to embody a virtual body as an avatar while in \AR \cite{genay2022being} or \VR \cite{guy2023sense}.
This illusion arises when the visual, proprioceptive and (if any) haptic sensations of the virtual body are coherent \cite{kilteni2012sense}.
It can be decomposed into three subcomponents: \emph{Agency}, which is the feeling of controlling the body; \emph{Ownership}, which is the feeling that \enquote{the body is the source of the experienced sensations}; and \emph{Self-Location}, which is the \enquote{spatial experience of being inside [the] body} \cite{kilteni2012sense}.
@@ -309,7 +309,7 @@ Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as l
Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand rendering: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
%\textcite{krichenbauer2018augmented} found that participants were \percent{22} faster in immersive \VST-\AR than in \VR in the same pick-and-place manipulation task, but no visual hand rendering was used in \VR while the real hand was visible in \AR.
In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was the most preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
\textcite{genay2021virtual} found that the \SoE with robotic hands overlay in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
\textcite{genay2021virtual} found that the sense of embodiment with overlaid robotic hands in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic rendering of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
Taken together, these results suggest that a visual rendering of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
%\cite{chan2010touching} : cues for touching (selection) \VOs.

View File

@@ -11,10 +11,10 @@ It is essential to understand how a multimodal visuo-haptic rendering of a \VO i
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?
%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Go back to the main objective "to understand how immersive visual and wearable haptic feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.
% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE \cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should \enquote{modulating the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in visual \VE \cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should allow \enquote{modulating the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
% Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.

View File

@@ -3,12 +3,12 @@
\chaptertoc
This chapter reviews previous work on the perception and manipulation of \AEs directly with the hand using wearable haptics, \AR, and their combination.
This chapter reviews previous work on perceiving and manipulating virtual and augmented objects directly with the hand, using either wearable haptics, \AR, or their combination.
%Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE.
To achieve this, we first describe how the hand senses and acts on its environment to perceive and manipulate the haptic properties of real everyday objects.
We first overview how the hand senses and acts on its environment to perceive and manipulate the haptic properties of real everyday objects.
Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects.
Third, we introduce the principles and user experience of \AR, and overview the main interaction techniques used to manipulate virtual objects directly with the hand.
Finally, multimodal visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
Finally, we describe how multimodal visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
\input{1-haptic-hand}
\input{2-wearable-haptics}

View File

@@ -20,7 +20,7 @@ We consider two axes of research: (I) providing plausible and coherent visuo-hap
First, we study how visual rendering affects the perception of virtual vibrotactile textures that augment real surfaces directly touched by the finger.
%
To this end, we propose (1) a system for rendering visuo-haptic virtual texture augmentations using an AR headset and a wearable vibrotactile device. We then (2) evaluate how the roughness perception of virtual haptic textures differs in AR \vs virtual reality (VR) and when touched by a virtual hand \vs one's own hand. Finally, we (3) investigate the realism, plausibility and coherence of combining visual and haptic texture augmentations in AR.
To this end, we propose (1) a system for rendering visuo-haptic virtual texture augmentations using an AR headset and a wearable vibrotactile device. We then (2) evaluate how the roughness perception of virtual haptic textures differs in AR \vs virtual reality (VR) and when touched by a virtual hand \vs one's own hand. Finally, we (3) investigate the realism and coherence of combining visual and haptic texture augmentations in AR.
Secondly, we investigate how the visuo-haptic rendering of the hand improves its direct manipulation of virtual objects in AR in terms of performance and user experience.
% address the challenge of manipulating virtual objects directly with the hand in AR, which is a key interaction but is still challenging due to visual and haptic limitations.
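The vibrotactile texture augmentation outlined above can be illustrated with the classic speed-to-frequency model often used for grating-like virtual textures, where a spatial grating of wavelength $\lambda$ stroked at finger speed $v$ maps to a vibration of temporal frequency $f = v / \lambda$. The following is a minimal sketch under that assumption; the function name, parameters, and signal shape are illustrative and not the thesis's actual implementation:

```python
import numpy as np

def texture_vibration(finger_speed, grating_period, duration, sample_rate=8000, amplitude=1.0):
    """Sinusoidal vibrotactile signal for a virtual grating texture.

    A spatial grating of period `grating_period` (m), stroked at
    `finger_speed` (m/s), yields a temporal frequency f = v / wavelength.
    Returns `duration` seconds of signal sampled at `sample_rate` Hz.
    """
    f = finger_speed / grating_period          # temporal frequency (Hz)
    t = np.arange(int(duration * sample_rate)) / sample_rate
    return amplitude * np.sin(2 * np.pi * f * t)

# e.g. a 2 mm grating stroked at 10 cm/s produces a 50 Hz vibration
signal = texture_vibration(finger_speed=0.1, grating_period=0.002, duration=1.0)
```

In practice the finger speed would be updated continuously from hand tracking, and the amplitude modulated to render finer roughness differences.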

View File

@@ -1,6 +1,14 @@
\chapter{Conclusion}
\mainlabel{conclusion}
\section*{Summary}
\section*{Future Work}
\section*{Perspectives}
% systematic exploration of the parameter space of the haptic rendering to determine the most important parameters their influence on the perception
% measure the difference in sensitivity to the haptic feedback and how much it affects the perception of the object properties
% design, implement and validate procedures to automatically calibrate the haptic feedback to the user's perception in accordance to what it has been designed to represent

View File

@@ -9,6 +9,6 @@
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal. \enquote{Augmenting the Texture Perception of Tangible Surfaces in Augmented Reality using Vibrotactile Haptic Stimuli}. To appear in \textit{Proceedings of EuroHaptics 2024}, 2024.
\vspace{1em}
\bigskip
\noindent \textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal. \enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}. To appear in \textit{Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology (VRST '24)}, 2024.

View File

@@ -36,7 +36,6 @@
\renewcommand*{\glstextformat}[1]{\textcolor{black}{#1}} % Hyperlink in black
\let\AE\undefined
\let\v\undefined
\acronym[TIFC]{2IFC}{two-interval forced choice}
\acronym[TwoD]{2D}{two-dimensional}
@@ -47,25 +46,16 @@
\acronym{DC}{direct current}
\acronym{DoF}{degree of freedom}
\acronym{ERM}{eccentric rotating mass}
\acronym{h}{haptic}
\acronym{JND}{just noticeable difference}
\acronym{LRA}{linear resonant actuator}
\acronym{MLE}{maximum-likelihood estimation}
\acronym{MR}{mixed reality}
\acronym{OST}{optical see-through}
\acronym{PI}{place illusion}
\acronym[PSI]{Psi}{plausibility}
\acronym{PSE}{point of subjective equality}
\acronym{RE}{real environment}
\acronym{RV}{reality-virtuality}
\acronym{SoE}{sense of embodiment}
\acronym{TUI}{tangible user interface}
\acronym{UI}{user interface}
\acronym{v}{visual}
\acronym{VCA}{voice-coil actuator}
\acronym{VE}{virtual environment}
\acronym{vh}{visuo-haptic}
\acronym{VO}{virtual object}
\acronym{VR}{virtual reality}
\acronym{VST}{visual see-through}
\acronym{WH}{wearable haptic}
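The \verb|\acronym| entries above read like a custom wrapper around an abbreviations package. A minimal sketch of how such a macro could be defined on top of \texttt{glossaries-extra} is shown below; this implementation is an assumption for illustration, not taken from the thesis preamble:

```latex
\usepackage[abbreviations]{glossaries-extra}
% Hypothetical wrapper: \acronym[<label>]{<short>}{<long>} registers the
% abbreviation and defines a convenience macro (e.g. \VR) expanding to \gls.
\NewDocumentCommand{\acronym}{o m m}{%
  \IfNoValueTF{#1}
    {\newabbreviation{#2}{#2}{#3}%
     \expandafter\newcommand\csname #2\endcsname{\gls{#2}}}
    {\newabbreviation{#1}{#2}{#3}%
     \expandafter\newcommand\csname #1\endcsname{\gls{#1}}}%
}
```

The optional argument covers labels that cannot be control-sequence names directly, such as \verb|\acronym[TwoD]{2D}{two-dimensional}|.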