diff --git a/1-introduction/figures/icons/hand.svg b/1-introduction/figures/icons/hand.svg
new file mode 100644
index 0000000..aca4e6f
--- /dev/null
+++ b/1-introduction/figures/icons/hand.svg
@@ -0,0 +1,69 @@
diff --git a/1-introduction/figures/icons/xr_perception.odg b/1-introduction/figures/icons/xr_perception.odg
index 7b51601..86fbc1a 100644
Binary files a/1-introduction/figures/icons/xr_perception.odg and b/1-introduction/figures/icons/xr_perception.odg differ
diff --git a/1-introduction/introduction.tex b/1-introduction/introduction.tex
index 6929284..f74ac20 100644
--- a/1-introduction/introduction.tex
+++ b/1-introduction/introduction.tex
@@ -3,6 +3,8 @@
\chaptertoc

+\bigskip
+
%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
In this thesis, we show how immersive \AR, which integrates visual virtual content into the perception of the real world, and wearable haptics, which provide tactile sensations on the skin, can improve free and direct hand interaction with virtual objects.
Our goal is to enable users to perceive and interact with wearable visuo-haptic augmentations in a more realistic and effective way, as if they were real.
@@ -183,7 +185,7 @@ When touching a visually augmented real object, the user's hand is physically
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging}, especially without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction. 
In addition, wearable haptic devices are limited to cutaneous feedback, and cannot provide forces to constrain the hand contact with the virtual object \cite{pacchierotti2017wearable}.
-Current \AR systems have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
+Current \AR systems have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
+In \AR, images of the virtual world are superimposed on, and synchronized with, the user's current view of the real world.
However, the depth perception of virtual objects is often underestimated \cite{peillard2019studying,adams2022depth}.
There is also often \textbf{a lack of mutual occlusions between the hand and a virtual object}, that is, the hand can hide the object or be hidden by it \cite{macedo2023occlusion}.
@@ -288,7 +290,7 @@ The pose estimation of the real hand and the environment is achieved using a vis
The visual rendering is done using the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
-In \textbf{\chapref{xr_perception}}, we investigate in a user study how different the perception of haptic texture augmentations is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
+In \textbf{\chapref{xr_perception}}, we investigate in a psychophysical user study how the perception of haptic texture augmentations differs in \AR \vs \VR, and when touching with a virtual hand \vs one's own hand.
We use psychophysical methods to measure user perception and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented or virtual). 
In \textbf{\chapref{vhar_textures}}, we evaluate in a user study the perception of visuo-haptic texture augmentations directly touched with the real hand in \AR.
diff --git a/2-related-work/related-work.tex b/2-related-work/related-work.tex
index 42c256f..dae0a79 100644
--- a/2-related-work/related-work.tex
+++ b/2-related-work/related-work.tex
@@ -3,6 +3,8 @@
\chaptertoc

+\bigskip
+
This chapter reviews previous work on the perception and manipulation of virtual and augmented objects directly with the hand, using either wearable haptics, \AR, or their combination.
%Experiencing a visual, haptic, or visuo-haptic augmented environment relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall augmented environment.
First, we review how the hand senses and interacts with its environment to perceive and manipulate the haptic properties of real everyday objects.
diff --git a/3-perception/vhar-textures/1-introduction.tex b/3-perception/vhar-textures/1-introduction.tex
index 3a64fa1..135a4e2 100644
--- a/3-perception/vhar-textures/1-introduction.tex
+++ b/3-perception/vhar-textures/1-introduction.tex
@@ -19,12 +19,12 @@ We aimed to assess \textbf{which haptic textures were matched with which visual
\item A user study with 20 participants evaluating the coherence, realism, and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}
-\noindentskip In the next sections, we first describe the apparatus of the user study experimental design, including the two tasks performed. We then present the results obtained and discuss them before concluding. 
+\smallskip

-\bigskip
-
-\fig[0.65]{experiment/view}{First person view of the user study.}[
-	As seen through the immersive \AR headset Microsoft HoloLens~2.
+\fig[0.55]{experiment/view}{First-person view of the user study.}[
+	%As seen through the immersive \AR headset.
The visual texture overlays were statically displayed on the surfaces, allowing the user to move around to view them from different angles.
-	The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle index phalanx as it slides on the considered surface.
+	The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle phalanx of the index finger.% as it slides on the considered surface.
]
+
+\noindentskip In the next sections, we first describe the apparatus and experimental design of the user study, including the two tasks performed. We then present the results obtained and discuss them before concluding.
diff --git a/3-perception/xr-perception/1-introduction.tex b/3-perception/xr-perception/1-introduction.tex
index 9aa9ad8..aa6aeba 100644
--- a/3-perception/xr-perception/1-introduction.tex
+++ b/3-perception/xr-perception/1-introduction.tex
@@ -2,11 +2,11 @@ \label{intro}
In the previous chapter, we presented a system for augmenting the visuo-haptic texture perception of real surfaces directly touched with the finger, using wearable vibrotactile haptics and an immersive \AR headset.
-In this and the next chapter, we evaluate the user's perception of such wearable haptic texture augmentation under different visual rendering conditions.
+In this chapter and the next one, we evaluate the user's perception of such wearable haptic texture augmentation under different visual rendering conditions. 
Most haptic augmentations of real surfaces using wearable haptic devices, including the roughness of textures (\secref[related_work]{texture_rendering}), have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR or \VR.
Still, it is known that the visual rendering of an object can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in the perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
-Indeed, in \AR, the user can see their own hand touching, the haptic device worn and the \RE, while in \VR they are hidden by the \VE.
+In \AR, the user can see their own hand touching, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.
In this chapter, we investigate the \textbf{role of the visual feedback of the virtual hand and of the environment (real or virtual) on the perception of a real surface whose haptic roughness is augmented} with wearable vibrotactile haptics. %voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
@@ -19,16 +19,12 @@ To control for the influence of the visual rendering, the real surface was not v
\item A discussion and recommendations on the integration of wearable haptic augmentations in direct touch contexts with \AR and \VR.
\end{itemize}
-\noindentskip In the remainder of this chapter, we first describe the experimental design and apparatus of the user study. 
+\noindentskip In the next sections, we first describe the experimental design and apparatus of the user study.
We then present the results obtained, discuss them, and outline recommendations for future \AR/\VR work using wearable haptic augmentations.
-%First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual \AR/\VR headset to provide a coherent visuo-haptic augmentation of the \RE.
-%An experimental setup is then presented to compare haptic roughness augmentation with an optical \AR headset (Microsoft HoloLens~2) that can be transformed into a \VR headset using a cardboard mask.
-%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a real surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
+\smallskip
-
-\bigskip
-
-\fig[0.9]{teaser/teaser2}{
+\fig[0.55]{teaser/teaser2}{
Vibrotactile textures were rendered in real time on a real surface using a wearable vibrotactile device worn on the finger.
}[%
Participants explored this haptic roughness augmentation with (\level{Real}) their real hand alone, (\level{Mixed}) a realistic virtual hand overlay in \AR, and (\level{Virtual}) the same virtual hand in \VR. 
diff --git a/3-perception/xr-perception/figures/experiment/headset2.jpg b/3-perception/xr-perception/figures/experiment/headset2.jpg new file mode 100644 index 0000000..8264b92 Binary files /dev/null and b/3-perception/xr-perception/figures/experiment/headset2.jpg differ diff --git a/config/thesis_commands.tex b/config/thesis_commands.tex index ed61272..48b428a 100644 --- a/config/thesis_commands.tex +++ b/config/thesis_commands.tex @@ -141,7 +141,7 @@ \usepackage{titletoc} \newcommand{\chaptertoc}{% Print the table of contents for the chapter \section*{Contents}% Add a section title - \vspace{-1.5em}% + \vspace{-2em}% \horizontalrule% \startcontents% Start referencing the contents @@ -150,7 +150,7 @@ \vspace{-0.5em}% \horizontalrule% - \vspace{2em}% + %\vspace{2em}% } \newcommand{\nochaptertoc}{% \stopcontents%
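For context, the `\chaptertoc` macro whose spacing is adjusted above follows the standard titletoc per-chapter mini-toc pattern. The sketch below is a minimal, self-contained reconstruction, not the thesis's exact code: the `\printcontents` arguments and the `\horizontalrule` definition are illustrative assumptions, since the diff elides those lines.

```latex
% Sketch of a per-chapter mini-toc using the titletoc package.
% Assumptions: \horizontalrule and the \printcontents arguments are
% illustrative; the actual thesis definitions are elided in the diff.
\usepackage{titletoc}
\newcommand{\horizontalrule}{\noindent\rule{\linewidth}{0.4pt}\par}

\newcommand{\chaptertoc}{% Print the table of contents for the chapter
  \section*{Contents}%   Unnumbered "Contents" heading
  \vspace{-2em}%         Pull the rule up under the heading
  \horizontalrule%
  \startcontents%        Start collecting this chapter's entries
  \printcontents{l}{1}{\setcounter{tocdepth}{2}}% Sections and below
  \vspace{-0.5em}%
  \horizontalrule%
}
\newcommand{\nochaptertoc}{%
  \stopcontents%         Skip the mini-toc for this chapter
}
```

With this pattern, calling `\chaptertoc` right after `\chapter{...}` prints a rule-delimited local table of contents, which is why the `\vspace` values above are negative: they tighten the gap between the heading, the rules, and the entries.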