2024-09-25 17:25:13 +02:00
parent d6c8184df8
commit 0a21557052
16 changed files with 103 additions and 96 deletions

View File

@@ -188,7 +188,7 @@ Our contributions in these two axes are summarized in \figref{contributions}.
The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]
\subsectionstarbookmark{Modifying the Perception of Tangible Surfaces with Visuo-Haptic Texture Augmentations}
\subsectionstarbookmark{Axis I: Modifying the Perception of Tangible Surfaces with Visuo-Haptic Texture Augmentations}
Wearable haptic devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible or covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
@@ -210,64 +210,63 @@ Finally, some visuo-haptic texture databases have been modeled from real texture
However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated.
Our third objective is to evaluate the perception of simultaneous and co-localized visuo-haptic texture augmentation of tangible surfaces in \AR, directly touched by the hand, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
\subsectionstarbookmark{Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
\subsectionstarbookmark{Axis II: Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of \VOs with the bare hand.
However, the intangibility of the visual \VE, the many display limitations of current visual \AR systems and wearable haptic devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems, and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make the interaction with \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
Still two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive visual \AE: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with wearable haptics \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
We consider (1) the effect of different visual augmentations of the hand as \AR avatars and (2) the effect of combination of different visuo-haptic augmentations of the hand.
Two particular types of sensory feedback are known to improve such direct \VO manipulation, but they have not been properly investigated in immersive \AR: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic rendering \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to design and evaluate \textbf{the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs in \OST-\AR}.
We consider the effect of (1) the visual rendering as a hand augmentation and (2) the combination of different visuo-haptic augmentations of the hand.
First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}.
A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
But visual \AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
Thus, our fourth objective is to evaluate and compare the effect of different visual hand augmentations on direct manipulation of \VOs in \AR.
But \OST-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation with the bare hand.
Thus, our fourth objective is to \textbf{investigate the visual rendering as hand augmentation for direct manipulation of \VOs in \OST-\AR}.
Finally, as described above, wearable haptics for visual \AR rely on moving the haptic actuator away from the fingertips to not impair the hand movements, sensations, and interactions with the \RE.
Second, as described above, wearable haptics for visual \AR rely on moving the haptic actuator away from the fingertips to not impair the hand movements, sensations, and interactions with the \RE.
Previous works have shown that wearable haptics that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience \cite{maisto2017evaluation,meli2018combining}.
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
Our last objective is to investigate the role of visuo-haptic augmentations of the hand in manipulating \VOs directly with the hand in \AR.
Our last objective is to \textbf{investigate the role of the visuo-haptic rendering of the hand when manipulating \VOs in \OST-\AR}.
\section{Thesis Overview}
\label{thesis_overview}
This thesis is divided into four parts.
In \partref{context}, we describe the context and background of our research, within which this first current \textit{Introduction} chapter we present the research challenges, and the objectives, approach, and contributions of this thesis.
In \chapref{related_work}, we then review previous work on the perception and manipulation with virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
In \textbf{\partref{context}}, we describe the context and background of our research: in this first \textit{Introduction} chapter, we present the research challenges and the objectives, approach, and contributions of this thesis.
In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
First, we overview how the hand perceives and manipulates real everyday objects.
Second, we present wearable haptics and haptic augmentations of roughness and hardness of real objects.
Third, we introduce \AR, and how \VOs can be manipulated directly with the hand.
Finally, we describe how multimodal visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
Next, we address each of our two research axes in a dedicated part.
\bigskip
We then address each of our two research axes in a dedicated part.
In \partref{perception}, we describe our contributions to the first axis of research, augmenting the visuo-haptic texture perception of tangible surfaces.
\noindentskip
In \textbf{\partref{perception}}, we describe our contributions to the first axis of research, augmenting the visuo-haptic texture perception of tangible surfaces.
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (displayed or hidden) affect the roughness perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
In \chapref{vhar_system}, we detail a system for rendering visuo-haptic virtual textures that augment tangible surfaces using an immersive \AR/\VR headset and a wearable vibrotactile device.
In \textbf{\chapref{vhar_system}}, we detail a system for rendering visuo-haptic virtual textures that augment tangible surfaces using an immersive \AR/\VR headset and a wearable vibrotactile device.
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, provided to the middle phalanx of the index finger touching the texture using a voice-coil actuator.
The tracking of the real hand and environment is done using a marker-based technique, and the visual rendering of their virtual counterparts is done using the immersive \OST-\AR headset Microsoft HoloLens~2.
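As a sketch of such a signal (the sinusoidal grating model here is an assumption; the actual rendering is detailed in \chapref{vhar_system}), a virtual grating of spatial wavelength $\lambda$ scanned at finger speed $v(t)$ can be rendered as a vibration whose instantaneous frequency is $v(t)/\lambda$:
\[
a(t) = A \sin\!\left(\frac{2\pi}{\lambda} \int_0^{t} v(\tau)\,\mathrm{d}\tau\right),
\]
so that moving the finger faster over the same virtual texture produces a proportionally higher vibration frequency.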
In \chapref{xr_perception}, we investigate, in a user study, how different the perception of virtual haptic textures is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
In \textbf{\chapref{xr_perception}}, we investigate, in a user study, how different the perception of virtual haptic textures is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
We use psychophysical methods to measure the users' roughness perception of the virtual textures, and extensive questionnaires to understand how this perception is affected by the visual rendering of the hand and the environment.
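As an illustration of such a psychophysical measure (assuming a forced-choice roughness comparison design; the actual protocol is described in \chapref{xr_perception}), the proportion of \enquote{rougher} answers as a function of a comparison stimulus intensity $x$ can be fitted with a sigmoid psychometric function
\[
\Psi(x) = \frac{1}{1 + e^{-(x - \mathit{PSE})/\sigma}},
\]
where the point of subjective equality ($\mathit{PSE}$) estimates the perceived roughness match between conditions and $\sigma$ relates to the discrimination threshold.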
In \chapref{ar_textures}, we evaluate the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
In \textbf{\chapref{ar_textures}}, we evaluate the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
The virtual textures are paired visual and tactile models of real surfaces \cite{culbertson2014one} that we render as visual and haptic overlays on the touched augmented surfaces.
Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs.
\bigskip
\noindentskip
In \textbf{\partref{manipulation}}, we describe our contributions to the second axis of research, improving direct hand manipulation of \VOs with visuo-haptic augmentations of the hand.
We explore how the visual and haptic augmentation of the hand, and their combination, as interaction feedback with \VOs in \OST-\AR can improve such manipulations.
In \partref{manipulation}, we describe our contributions to the second axis of research, improving direct hand manipulation of \VOs with visuo-haptic augmentations of the hand.
We evaluate how the visual and haptic augmentation, and their combination, of the hand as feedback of direct manipulation with \VOs can improve such manipulations.
In \textbf{\chapref{visual_hand}}, we conduct a user study to investigate the effect of six visual renderings as hand augmentations for the direct manipulation of \VOs, selected as the most popular hand renderings in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand.
In \chapref{visual_hand}, we explore in a user study the effect of six visual hand augmentations that provide contact feedback with the \VO, as a set of the most popular hand renderings in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, the user performance and experience are evaluated in two representative manipulation tasks, \ie push-and-slide and grasp-and-place of a \VO directly with the hand.
In \chapref{visuo_haptic_hand}, we evaluate in a user study two vibrotactile contact techniques, provided at four different locations on the real hand, as haptic rendering of the hand-object interaction.
In \textbf{\chapref{visuo_haptic_hand}}, we evaluate in a user study two vibrotactile contact techniques, provided at four different locations on the real hand, as haptic rendering of the hand-object interaction.
They are compared to the two most representative visual hand augmentations from the previous study, and the user performance and experience are evaluated within the same \OST-\AR setup and manipulation tasks.
\bigskip
In \partref{part:conclusion}, we conclude this thesis and discusse short-term future work and long-term perspectives for each of our contributions and research axes.
\noindentskip
In \textbf{\partref{part:conclusion}}, we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.

View File

@@ -184,7 +184,7 @@ The \emph{system control tasks} are changes to the system state through commands
\end{subfigs}
\subsubsection{Reducing the Real-Virtual Gap}
\label{real-virtual-gap}
\label{real_virtual_gap}
In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale and as an integral part of the \RE.
@@ -216,7 +216,7 @@ In a pick-and-place task with tangibles of different shapes, a difference in siz
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: It could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR.
\begin{subfigs}{ar_applications}{Manipulating \VOs with tangibles. }[][
\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
@@ -254,7 +254,7 @@ More advanced techniques simulate the friction phenomena \cite{talvas2013godfing
\item A fingertip tracking that allows to select a \VO by opening the hand \cite{lee2007handy}.
\item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}.
\item Grasping a \VO through gestures when the fingers are detected as opposing on the \VO \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) taht follows the real tracked hand (in green) but kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
\item A kinematic hand model with rigid-body phalanges (in beige) that follows the real tracked hand (in green) but kept physically constrained to the \VO. Applied forces are shown as red arrows \cite{borst2006spring}.
]
\subfigsheight{37mm}
\subfig{lee2007handy}

View File

@@ -1,47 +1,22 @@
Augmented reality (AR) integrates virtual content into our real-world surroundings, giving the illusion of one unique environment and promising natural and seamless interactions with real and virtual objects.
%
Virtual object manipulation is particularly critical for useful and effective \AR usage, such as in medical applications, training, or entertainment \cite{laviolajr20173d, kim2018revisiting}.
%
Hand tracking technologies \cite{xiao2018mrtouch}, grasping techniques \cite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real \cite{piumsomboon2014graspshell}, without requiring controllers \cite{krichenbauer2018augmented}, gloves \cite{prachyabrued2014visual}, or predefined gesture techniques \cite{piumsomboon2013userdefined, ha2014wearhand}.
%
Optical see-through \AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction \cite{kim2018revisiting}.
\noindent Touching, grasping and manipulating \VOs are fundamental interactions in \AR (\secref[related_work]{ve_tasks}) and essential for many of its applications (\secref[related_work]{ar_applications}).
The most common current \AR systems, in the form of portable and immersive \OST-\AR headsets \cite{hertel2021taxonomy}, allow real-time hand tracking and direct interaction with \VOs with bare hands (\secref[related_work]{real_virtual_gap}).
Manipulation of \VOs is achieved using a virtual hand interaction technique that represents the user's hand in the \VE and simulates interaction with \VOs (\secref[related_work]{ar_virtual_hands}).
However, direct hand manipulation is still challenging due to the intangibility of the \VE, the lack of mutual occlusion between the hand and the \VO in \OST-\AR (\secref[related_work]{ar_displays}), and the inherent delays between the user's hand and the result of the interaction simulation (\secref[related_work]{ar_virtual_hands}).
However, there are still several haptic and visual limitations that affect manipulation in OST-AR, degrading the user experience.
%
For example, it is difficult to estimate the position of one's hand in relation to a virtual content because mutual occlusion between the hand and the virtual object is often lacking \cite{macedo2023occlusion}, the depth of virtual content is underestimated \cite{diaz2017designing, peillard2019studying}, and hand tracking still has a noticeable latency \cite{xiao2018mrtouch}.
%
Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which of course cannot apply physical constraints on the hand \cite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
%
These limitations also make it difficult to confidently move a grasped object towards a target \cite{maisto2017evaluation, meli2018combining}.
In this chapter, we investigate the \textbf{visual rendering as hand augmentation} for direct manipulation of \VOs in \OST-\AR.
To this end, we selected from the literature and compared the most popular visual hand renderings used to interact with \VOs in \AR.
The virtual hand is \textbf{displayed superimposed} on the user's hand with these visual renderings, providing \textbf{feedback on the tracking} of the real hand, as shown in \figref{hands}.
The movement of the virtual hand is also \textbf{constrained to the surface} of the \VO, providing additional \textbf{feedback on the interaction} with the \VO.
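As a minimal sketch of this constraint (assuming a god-object-style projection, not necessarily the exact simulation used), when the tracked hand position $\mathbf{p}_t$ penetrates the \VO $O$, the displayed hand position $\mathbf{p}_d$ can be projected onto the closest point of the object's surface $\partial O$:
\[
\mathbf{p}_d =
\begin{cases}
\arg\min_{\mathbf{p} \in \partial O} \lVert \mathbf{p} - \mathbf{p}_t \rVert & \text{if } \mathbf{p}_t \in O,\\
\mathbf{p}_t & \text{otherwise,}
\end{cases}
\]
which makes any penetration visible as an offset between the real and virtual hands.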
We \textbf{evaluate in a user study}, using the \OST-\AR headset Microsoft HoloLens~2, the effect of six visual hand renderings on the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand.
To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands, but have not been studied together in an \AR context: visual hand rendering and delocalized haptic rendering.
%
A few works explored the effect of a visual hand rendering on interactions in \AR by simulating mutual occlusion between the real hand and virtual objects \cite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent \cite{ha2014wearhand, piumsomboon2014graspshell} or opaque \cite{blaga2017usability, yoon2020evaluating, saito2021contact}.
%
Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible \cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
%
However, the role of a visual hand rendering superimposed and seen above the real tracked hand has not yet been investigated in \AR.
%
Conjointly, several studies have demonstrated that wearable haptics can significantly improve interactions performance and user experience in \AR \cite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
%
But haptic rendering for \AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking \cite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment \cite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
%
Therefore, the haptic feedback of the fingertip contact with the virtual environment needs to be rendered elsewhere on the hand, it is unclear which positioning should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in \AR.
%
A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred \cite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
%
In fact, both hand renderings can provide sufficient sensory cues for efficient manipulation of virtual objects in \AR, or conversely, they can be shown to be complementary.
In this paper, we investigate the role of the visuo-haptic rendering of the hand during 3D manipulation of virtual objects in OST-AR.
%
We consider two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object.
%
The main contributions of this work are:
\noindentskip
The main contributions of this chapter are:
\begin{itemize}
\item A comparison from the literature of the six most common visual hand renderings used in \AR.
\item A comparison from the literature of the six most common visual hand renderings used to interact with \VOs in \AR.
\item A user study with 24 participants evaluating the performance and user experience of the six visual hand renderings superimposed on the real hand during free and direct hand manipulation of \VOs in \OST-\AR.
\end{itemize}
\noindentskip
In the next sections, we first present the six visual hand renderings considered in this study and gathered from the literature. We then describe the experimental setup and design, the two manipulation tasks, and the metrics used. Finally, we present the results of the user study and discuss their implications for the manipulation of \VOs directly with the hand in \AR.
\begin{subfigs}{hands}{The six visual hand renderings.}[

View File

@@ -50,7 +50,7 @@ We aim to investigate whether the chosen visual hand rendering affects the perfo
\subsection{Manipulation Tasks and Virtual Scene}
\label{tasks}
Following the guidelines of \textcite{bergstrom2021how} for designing object manipulation tasks, we considered two variations of a 3D pick-and-place task, commonly found in interaction and manipulation studies \cite{prachyabrued2014visual,maisto2017evaluation,meli2018combining,blaga2017usability,vanveldhuizen2021effect}.
Following the guidelines of \textcite{bergstrom2021how} for designing object manipulation tasks, we considered two variations of a 3D pick-and-place task, commonly found in interaction and manipulation studies \cite{prachyabrued2014visual,blaga2017usability,maisto2017evaluation,meli2018combining,vanveldhuizen2021effect}.
\subsubsection{Push Task}
\label{push-task}
@@ -73,14 +73,14 @@ Users are asked to grasp, lift, and move the cube towards the target volume usin
As before, the task is considered completed when the cube is \emph{fully} inside the volume.
\begin{subfigs}{tasks}{The two manipulation tasks of the user study.}[
The cube to manipulate is in the middle of the table (5-cm-edge and opaque) and the eight possible targets to reach are arround (7-cm-edge volume and semi-transparent).
The cube to manipulate is in the middle of the table (\qty{5}{cm} edge, opaque) and the eight possible targets to reach are around it (\qty{7}{cm} edge volume, semi-transparent).
Only one target at a time was shown during the experiments.
][
\item Push task: pushing the virtual cube along a table towards a target placed on the same surface.
\item Grasp task: grasping and lifting the virtual cube towards a target placed on a \qty{20}{\cm} higher plane.
]
\subfig[0.4]{method/task-push}
\subfig[0.4]{method/task-grasp}
\subfig[0.45]{method/task-push}
\subfig[0.45]{method/task-grasp}
\end{subfigs}
\subsection{Experimental Design}

View File

@@ -25,8 +25,8 @@ This result is consistent with \textcite{saito2021contact}, who found that displ
To summarize, when employing a visual hand rendering overlaying the real hand, participants were more performant and confident in manipulating virtual objects with bare hands in \AR.
These results contrast with similar manipulation studies, but in non-immersive, on-screen \AR, where the presence of a visual hand rendering was found by participants to improve the usability of the interaction, but not their performance \cite{blaga2017usability,maisto2017evaluation,meli2018combining}.
Our results show the most effective visual hand rendering to be the \level{Skeleton} one. Participants appreciated that it provided a detailed and precise view of the tracking of the real hand, without hiding or masking it.
Our results show the most effective visual hand rendering to be the \level{Skeleton} one.
Participants appreciated that it provided a detailed and precise view of the tracking of the real hand, without hiding or masking it.
Although the \level{Contour} and \level{Mesh} hand renderings were also highly rated, some participants felt that they were too visible and masked the real hand.
This result is in line with the results of virtual object manipulation in \VR of \textcite{prachyabrued2014visual}, who found that the most effective visual hand rendering was a double representation of both the real tracked hand and a visual hand physically constrained by the virtual environment.
This type of \level{Skeleton} rendering was also the one that provided the best sense of agency (control) in \VR \cite{argelaguet2016role,schwind2018touch}.

View File

@@ -1,7 +1,18 @@
\section{Conclusion}
\label{conclusion}
This paper presented two human subject studies aimed at better understanding the role of visuo-haptic rendering of the hand during virtual object manipulation in OST-AR.
The first experiment compared six visual hand renderings in two representative manipulation tasks in \AR, \ie push-and-slide and grasp-and-place of a virtual object.
Results show that a visual hand rendering improved the performance, perceived effectiveness, and user confidence.
In this chapter, we addressed the challenge of touching, grasping and manipulating \VOs directly with the hand in immersive \OST-\AR by providing and evaluating visual renderings as hand augmentations.
Superimposed on the user's hand, these visual renderings provide feedback from the virtual hand, which tracks the real hand and simulates the interaction with \VOs as a proxy.
We first selected and compared the six most popular visual hand renderings used to interact with \VOs in \AR.
Then, in a user study with 24 participants and an immersive \OST-\AR headset, we evaluated the effect of these six visual hand renderings on the user performance and experience in two representative manipulation tasks.
Our results showed that a visual hand rendering overlaying the real hand improved the performance, perceived effectiveness, and confidence of participants compared to no rendering at all.
A skeleton rendering, providing a detailed view of the tracked joints and phalanges while not hiding the real hand, was the most performant and effective.
The contour and mesh renderings were found to mask the real hand, while the tips rendering was controversial.
The occlusion rendering suffered from too much tracking latency to be effective.
This is consistent with similar manipulation studies in \VR and in non-immersive \VST-\AR setups.
This study suggests that a \ThreeD visual hand rendering is important in \AR when interacting through a virtual hand technique.
It seems particularly required for interaction tasks that involve precise movements of the fingers in relation to virtual content, such as \ThreeD windows, buttons and sliders, or stacking and assembly tasks.
A minimal but detailed rendering of the hand that does not hide the real hand, like the skeleton rendering we evaluated, seems to be the best compromise between provided feedback and effectiveness.
Still, users should be able to choose and adapt the visual hand rendering to their preferences and needs.

View File

@@ -3,10 +3,32 @@
\section*{Summary}
In this thesis, entitled \enquote{\ThesisTitle}, we presented our research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and wearable haptic devices.
\noindentskip \partref{manipulation}
\noindentskip In \chapref{visual_hand}, we addressed the challenge of manipulating \VOs directly with the hand by providing visual renderings as hand augmentations.
Seen as an overlay on the user's hand, such visual hand renderings provide feedback on the hand tracking and the interaction with \VOs.
We compared the six most commonly used renderings in the \AR literature in a user study with 24 participants, where we evaluated their effect on the user performance and experience in two representative manipulation tasks.
Results showed that a visual hand rendering improved the user performance, perceived effectiveness and confidence, with a skeleton-like rendering being the most performant and effective.
This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.
\section*{Future Work}
The visuo-haptic renderings we presented and the user studies we conducted in this thesis have of course some limitations.
We present in this section some future work that could address these.
\subsection*{Visual Rendering of the Hand for Manipulating Virtual Objects in Augmented Reality}
\paragraph{Other AR Displays}
The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset.
Evaluating them with other \AR displays, in particular \VST-\AR headsets where the real hand is itself mediated by the display, would help confirm whether our results generalize beyond this setup.
\paragraph{More Ecological Conditions}
We evaluated the effect of the visual hand rendering with two manipulation tasks involving placing a virtual cube into a target volume, either by pushing it along a table or by grasping it.
%While these tasks are fundamental and basics
These results have of course some limitations as they only address limited types of manipulation tasks and visual hand characteristics, evaluated in a specific \OST-\AR setup.
The two manipulation tasks were also limited to placing a virtual cube in predefined target volumes.
Testing a wider range of virtual objects and more ecological tasks, \eg stacking or assembly, as well as considering bimanual manipulation, would ensure a greater applicability of the results obtained in this work.

View File

@@ -9,6 +9,5 @@
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal. \enquote{Augmenting the Texture Perception of Tangible Surfaces in Augmented Reality using Vibrotactile Haptic Stimuli}. To appear in \textit{Proceedings of EuroHaptics 2024}, 2024.
\bigskip
\noindent \textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal. \enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}. To appear in \textit{Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology (VRST '24)}, 2024.
\noindentskip
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal. \enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}. To appear in \textit{Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology (VRST '24)}, 2024.

View File

@@ -14,9 +14,10 @@
% Content
\input{config/content}
\newcommand{\ThesisTitle}{Study of the Perception and Manipulation of Virtual Objects in Augmented Reality using Wearable Haptics}
\hypersetup{
pdfauthor = {Erwan NORMAND},
pdftitle = {Study of the Perception and Manipulation of Virtual Objects in Augmented Reality using Wearable Haptics},
pdftitle = {\ThesisTitle},
pdfsubject = {Ph.D. Thesis of Erwan NORMAND},
pdfkeywords = {Augmented Reality, Wearable Haptics, Perception, Interaction, Textures, Virtual Hand},
}
@@ -32,13 +33,13 @@
\frontmatter
\import{0-front}{cover}
%\importchapter{0-front}{acknowledgement}
\importchapter{0-front}{acknowledgement}
\importchapter{0-front}{toc}
\mainmatter
\import{1-introduction}{part}
\importchapter{1-introduction/introduction}{introduction}
%\importchapter{1-introduction/related-work}{related-work}
\importchapter{1-introduction/related-work}{related-work}
\import{2-perception}{perception}
\importchapter{2-perception/vhar-system}{vhar-system}
@@ -54,7 +55,7 @@
\appendix
\importchapter{4-conclusion}{publications}
\importchapter{4-conclusion}{résumé}
%\importchapter{4-conclusion}{résumé}
\backmatter
\importchapter{5-back}{bibliography}