diff --git a/1-background/introduction/introduction.tex b/1-background/introduction/introduction.tex
index 608142a..bb32478 100644
--- a/1-background/introduction/introduction.tex
+++ b/1-background/introduction/introduction.tex
@@ -153,7 +153,7 @@ With a better understanding of how visual factors can influence the perception o
Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviolajr20173d}.
As the hand is neither occupied nor covered by a haptic device that would impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the virtual content with the hand, as if it were real.
-Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
+Thus, augmenting a real object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
@@ -178,31 +178,31 @@ Our approach is to
We consider two main axes of research, each addressing one of the research challenges identified above:
\begin{enumerate*}[label=(\Roman*)]
-\item \textbf{modifying the texture perception of tangible surfaces}, and % with visuo-haptic texture augmentations, and
+\item \textbf{modifying the texture perception of real surfaces}, and % with visuo-haptic texture augmentations, and
\item \textbf{improving the manipulation of virtual objects}.% with visuo-haptic augmentations of the hand-object interaction.
\end{enumerate*}
Our contributions in these two axes are summarized in \figref{contributions}.
\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[
The contributions are represented in dark gray boxes, and the research axes in light green circles.
- The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of tangible surfaces, directly touched by the hand.
+ The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand.
The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]
-\subsectionstarbookmark{Axis I: Modifying the Texture Perception of Tangible Surfaces}
+\subsectionstarbookmark{Axis I: Modifying the Texture Perception of Real Surfaces}
-Wearable haptic devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
+Wearable haptic devices have proven to be effective in altering the perception of a touched real surface, without modifying the object or covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
-%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
+%It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator.
However, wearable haptic augmentations have been little explored with \AR, and neither has the visuo-haptic augmentation of textures.
Texture is indeed one of the fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic (only, without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
Being able to coherently substitute the visuo-haptic texture of an everyday surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
-For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting tangible surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
+For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, (2) evaluate how the perception of haptic texture augmentations is affected by the visual virtuality of the hand and the environment (real, augmented, or virtual), and (3) investigate the perception of co-localized visuo-haptic texture augmentations.
First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
-Thus, our first objective is to \textbf{design an immersive, real time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on tangible surfaces.
+Thus, our first objective is to \textbf{design an immersive, real-time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on real surfaces.
It will form the basis of the next two chapters in this section.
Second, many works have investigated the haptic augmentation of textures, but none have integrated them with \AR and \VR, or have considered the influence of the degree of visual virtuality on their perception.
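+As a minimal sketch of this signal generation (illustrative only: the names and parameter values are ours, not those of the actual implementation), the vibration of a sinusoidal grating of spatial wavelength $\lambda$ can be synthesized block by block at instantaneous frequency $f = v / \lambda$ from the measured finger speed $v$:
+\begin{verbatim}
+import numpy as np
+
+def grating_block(finger_speed, phase, wavelength=0.002,
+                  amplitude=1.0, sample_rate=48000, block_size=480):
+    """One block of a 1D sinusoidal grating vibration.
+
+    The instantaneous frequency f = v / wavelength follows the
+    tangential finger speed v (m/s), so faster strokes produce
+    higher-frequency vibrations; the phase is accumulated to keep
+    the signal continuous across successive blocks.
+    """
+    step = 2 * np.pi * (finger_speed / wavelength) / sample_rate
+    phases = phase + step * np.arange(1, block_size + 1)
+    return amplitude * np.sin(phases), phases[-1] % (2 * np.pi)
+\end{verbatim}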
@@ -211,7 +211,7 @@ Hence, our second objective is to \textbf{evaluate how the perception of wearabl
Finally, some visuo-haptic texture databases have been modeled from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated.
-Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of tangible surfaces in \AR, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
+Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
@@ -248,10 +248,10 @@ Finally, we describe how multimodal visual and haptic feedback have been combine
We then address each of our two research axes in a dedicated part.
\noindentskip
-In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of tangible surfaces.
+In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces.
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
-In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment tangible surfaces.%, using an immersive \OST-\AR headset and a wearable vibrotactile device.
+In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The real hand and the environment are tracked using a marker-based technique, and the visual rendering is provided by the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
diff --git a/1-background/related-work/3-augmented-reality.tex b/1-background/related-work/3-augmented-reality.tex
index 9c4cff5..75180b1 100644
--- a/1-background/related-work/3-augmented-reality.tex
+++ b/1-background/related-work/3-augmented-reality.tex
@@ -138,7 +138,7 @@ In \AR, it could take the form of body accessorization, \eg wearing virtual clot
\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}
-A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a tangible object, or even directly with the hands.
+A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a real object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.
\subsubsection{User Interfaces and Interaction Techniques}
@@ -173,7 +173,7 @@ The \emph{system control tasks} are changes to the system state through commands
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
- \item Virtual drawing on a tangible object with a hand-held pen \cite{roo2017onea}.
+ \item Virtual drawing on a real object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable registering the \VE in it.
]
\subfigsheight{36mm}
@@ -201,26 +201,28 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
-As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), and to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
-According to \textcite{billinghurst2005designing}, each \VO is coupled to a tangible object, and the \VO is physically manipulated through the tangible object, providing a direct, efficient and seamless interaction with both the real and virtual content.
-This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
+As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: either to visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
+%According to \textcite{billinghurst2005designing}
+Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
+In this usage context, the real objects are called \emph{tangibles}.
+%This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
-Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
-Especially in \OST-\AR, since the \VOs are inherently slightly transparent allowing the paired tangibles to be seen through them.
-In a pick-and-place task with tangibles of different shapes, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs does not affect user performance or presence, and that small variations (\percent{\sim 10} for size) were not even noticed by the users.
-This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
-Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: It could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR.
+Still, the virtual visual rendering and the real haptic sensations can be inconsistent.
+This is especially true in \OST-\AR, since the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
+In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
+This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
+Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions of surfaces directly touched with the hand in \AR.
\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
- \item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
+ \item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
- \item shape difference between a tangible and a \VO is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
+ \item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
@@ -291,7 +293,7 @@ While in \VST-\AR, this could be solved as a masking problem by combining the re
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
-A \VO overlaying a tangible object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
+A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
@@ -331,5 +333,5 @@ Taken together, these results suggest that a visual rendering of the hand in \AR
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, there is a lack of mutual occlusion and interaction cues between the hands and virtual content when manipulating \VOs in \OST-\AR that could be mitigated by a visual rendering of the hand.
-A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
-In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched tangible objects.
+A common alternative approach is to use real objects as tangible proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
+In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.
diff --git a/1-background/related-work/4-visuo-haptic-ar.tex b/1-background/related-work/4-visuo-haptic-ar.tex
index 7328bb2..c8d9ff3 100644
--- a/1-background/related-work/4-visuo-haptic-ar.tex
+++ b/1-background/related-work/4-visuo-haptic-ar.tex
@@ -17,7 +17,7 @@ It is essential to understand how a multimodal visuo-haptic rendering of a \VO i
\label{sensations_perception}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
-For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).
+For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}).
If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
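+A worked sketch of this integration, assuming the standard \MLE formulation \cite{ernst2004merging}: the visual and haptic estimates $\hat{s}_v$ and $\hat{s}_h$ of the same property are combined into
+\begin{equation*}
+\hat{s} = w_v \hat{s}_v + w_h \hat{s}_h,
+\qquad
+w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
+\end{equation*}
+where $\sigma_v^2$ and $\sigma_h^2$ are the variances of the sensory estimates: each modality is weighted by its relative reliability, and the variance of the combined estimate, $\sigma^2 = \sigma_v^2 \sigma_h^2 / (\sigma_v^2 + \sigma_h^2)$, is lower than that of either modality alone.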
No sensory information is completely reliable, and repeated measurements of the same property, \eg the weight of an object, may give different estimates.
@@ -63,7 +63,7 @@ The \MLE model implies that when seeing and touching a \VO in \AR, the combinati
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
-\subsubsection{Influence of Visual Rendering on Tangible Perception}
+\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
@@ -74,19 +74,19 @@ The overall perception can then be modified by changing one of the sensory modal
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
-They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
+They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs \cite{gunther2022smooth}.}
Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
-For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
-\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
+For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a real object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
+\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard real surface by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture rendered as vibrations through a hand-held stylus (\secref{texture_rendering}) by adding small oscillations to the visual feedback of the stylus on a screen.
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[][
\item A virtual soft texture projected on a table that deforms when pressed by the hand \cite{punpongsanon2015softar}.
- \item Modifying visually a tangible object and the hand touching it in \VST-\AR to modify its perceived shape \cite{ban2014displaying}.
+ \item Visually deforming a real object and the hand touching it in \VST-\AR to alter its perceived shape \cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
@@ -167,7 +167,7 @@ This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
-Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real tangible objects.
+Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism, and this design is not suitable for augmenting real objects.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9
diff --git a/2-perception/perception.tex b/2-perception/perception.tex
index 188eea7..54ad364 100644
--- a/2-perception/perception.tex
+++ b/2-perception/perception.tex
@@ -1,2 +1,2 @@
-\part{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
+\part{Augmenting the Visuo-haptic Texture Perception of Real Surfaces}
\mainlabel{perception}
diff --git a/2-perception/vhar-system/1-introduction.tex b/2-perception/vhar-system/1-introduction.tex
index f6c04e6..97736e2 100644
--- a/2-perception/vhar-system/1-introduction.tex
+++ b/2-perception/vhar-system/1-introduction.tex
@@ -8,11 +8,11 @@ However, this method has not yet been integrated in an \AR context, where the us
%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}
-In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment tangible surfaces}.
+In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with the immersive \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
-To ensure both real-time and reliable renderings, the hand and the tangibles are tracked using a webcam and marker-based tracking.
+To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and fiducial markers.
-The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.
+The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -26,7 +26,7 @@ The haptic textures are rendered as a vibrotactile signal representing a pattern
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to the middle-phalanx of the user's index finger.
- \item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the tangible surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
+ \item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the real surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
diff --git a/2-perception/vhar-system/2-method.tex b/2-perception/vhar-system/2-method.tex
index d7e5339..83d751e 100644
--- a/2-perception/vhar-system/2-method.tex
+++ b/2-perception/vhar-system/2-method.tex
@@ -1,6 +1,6 @@
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations \cite{culbertson2018haptics}.
%
-%We describe a system for rendering vibrotactile roughness textures in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
+%We describe a system for rendering vibrotactile roughness textures in real time, on any real surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
%We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
@@ -18,7 +18,7 @@ The visuo-haptic texture rendering system is based on:
The system consists of three main components: the pose estimation of the tracked real elements, the visual rendering of the \VE, and the vibrotactile signal generation and rendering.
\figwide[1]{diagram}{Diagram of the visuo-haptic texture rendering system. }[
- Fiducial markers attached to the voice-coil actuator and to tangible surfaces to track are captured by a camera.
+ Fiducial markers, attached to the voice-coil actuator and to the augmented surfaces, are captured by a camera.
The positions and rotations (the poses) ${}^c\mathbf{T}_i$, $i=1..n$ of the $n$ defined markers in the camera frame $\mathcal{F}_c$ are estimated, then filtered with an adaptive low-pass filter.
%These poses are transformed to the \AR/\VR headset frame $\mathcal{F}_h$ and applied to the virtual model replicas to display them superimposed and aligned with the \RE.
These poses are used to move and display the virtual model replicas aligned with the \RE.
@@ -36,8 +36,8 @@ The system consists of three main components: the pose estimation of the tracked
\label{pose_estimation}
A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech), which is placed above the experimental setup and captures \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
-Other markers are placed on the tangible surfaces to augment (\figref{setup}) to estimate the relative position of the finger with respect to the surfaces.
-Contrary to similar work using vision-based tracking allows both to free the hand movements and to augment any tangible surface.
+Other markers are placed on the real surfaces to augment (\figref{setup}) to estimate the relative position of the finger with respect to the surfaces.
+Contrary to similar work, using vision-based tracking allows both freeing the hand movements and augmenting any real surface.
A camera external to the \AR headset with a marker-based technique is employed to provide accurate and robust tracking with a constant view of the markers \cite{marchand2016pose}.
We denote ${}^c\mathbf{T}_i$, $i=1..n$ the homogeneous transformation matrix that defines the position and rotation of the $i$-th marker out of the $n$ defined markers in the camera frame $\mathcal{F}_c$, \eg the finger pose ${}^c\mathbf{T}_f$ and the texture pose ${}^c\mathbf{T}_t$.
@@ -51,7 +51,7 @@ The velocity (without angular velocity) of the marker, denoted as ${}^c\dot{\mat
%To be able to compare virtual and augmented realities, we then create a \VE that closely replicate the real one.
Before a user interacts with the system, it is necessary to design a \VE that will be registered with the \RE during the experiment.
-Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented tangible surface (\figref{device}).
+Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented surface (\figref{device}).
In addition, the pose and size of the virtual textures were defined on the virtual replicas.
During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts. %, according to the condition being tested.
This allows detecting whether a finger touches a virtual texture using a collision detection algorithm (Nvidia PhysX), and showing the virtual elements and textures in real time, aligned with the \RE, using the considered \AR or \VR headset.
diff --git a/2-perception/vhar-system/6-conclusion.tex b/2-perception/vhar-system/6-conclusion.tex
index 4e942bc..8e1d53d 100644
--- a/2-perception/vhar-system/6-conclusion.tex
+++ b/2-perception/vhar-system/6-conclusion.tex
@@ -3,7 +3,7 @@
%Summary of the research problem, method, main findings, and implications.
-In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real tangible surface.
+In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
When the surface is directly touched with the fingertip, its perceived roughness can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based tracking of the finger and paired it with an immersive \AR headset.
diff --git a/2-perception/vhar-textures/1-introduction.tex b/2-perception/vhar-textures/1-introduction.tex
index 824e063..1d205a5 100644
--- a/2-perception/vhar-textures/1-introduction.tex
+++ b/2-perception/vhar-textures/1-introduction.tex
@@ -5,14 +5,14 @@ When we look at the surface of an everyday object, we then touch it to confirm o
Among the various haptic texture augmentations, data-driven methods make it possible to capture, model and reproduce the roughness perception of real surfaces when touched by a hand-held stylus (\secref[related_work]{texture_rendering}).
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
-In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of tangible surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
+In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored in direct touch the combination of the visuo-haptic texture pairs to rate their coherence, realism and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.
\noindentskip The contributions of this chapter are:
\begin{itemize}
- \item Transposition of data-driven visuo-haptic textures to augment tangible objects in a direct touch context in immersive \AR.
+ \item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in immersive \AR.
\item A user study with 20 participants evaluating the coherence, realism and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}
diff --git a/2-perception/vhar-textures/2-experiment.tex b/2-perception/vhar-textures/2-experiment.tex
index 1ea4344..98eb5a1 100644
--- a/2-perception/vhar-textures/2-experiment.tex
+++ b/2-perception/vhar-textures/2-experiment.tex
@@ -1,7 +1,7 @@
\section{User Study}
\label{experiment}
-%The user study aimed at analyzing the user perception of tangible surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
+%The user study aimed at analyzing the user perception of real surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%Nine representative visuo-haptic texture pairs from the \HaTT database \cite{culbertson2014one} were investigated in two tasks:
%\begin{enumerate}
% \item \level{Matching} task: participants had to find the haptic texture that best matched a given visual texture; and
@@ -13,7 +13,7 @@
\label{textures}
The 100 visuo-haptic texture pairs of the \HaTT database \cite{culbertson2014one} were preliminarily tested and compared using the apparatus described in \secref{apparatus} to select the most representative textures for the user study.
-% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a tangible surface. on the finger on a tangible surface
+% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a real surface. on the finger on a real surface
These texture models were chosen as they are visuo-haptic representations of a wide range of real textures that are publicly available online.
Nine texture pairs were selected (\figref{experiment/textures}) to cover a range of perceived roughness, from rough to smooth, as named in the database: \level{Metal Mesh}, \level{Sandpaper~100}, \level{Brick~2}, \level{Cork}, \level{Sandpaper~320}, \level{Velcro Hooks}, \level{Plastic Mesh~1}, \level{Terra Cotta}, \level{Coffee Filter}.
All these visual and haptic textures are isotropic: their rendering (appearance or roughness) is the same whatever the direction of the movement on the surface, \ie there are no local deformations (holes, bumps, or breaks).
@@ -24,11 +24,11 @@ All these visual and haptic textures are isotropic: their rendering (appearance
\figref{experiment/setup} shows the experimental setup, and \figref{experiment/view} the first-person view of participants during the user study.
The user study was held in a quiet room with no windows, with one light source of \qty{800}{\lumen} placed \qty{70}{\cm} above the table.
-Nine \qty{5}{\cm} square cardboards with smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real tangible surfaces to augment.
+Nine \qty{5}{\cm} square cardboards with a smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real surfaces to augment.
Their poses were estimated with three \qty{2}{\cm} AprilTag fiducial markers glued on the surface grid.
Similarly, a \qty{2}{\cm} fiducial marker was glued on top of the vibrotactile actuator to detect the finger pose.
Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) filmed the markers to track finger movements relative to the surfaces, as described in \secref[vhar_system]{virtual_real_alignment}.
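+As a minimal sketch of this relative tracking (the notation follows \secref[vhar_system]{pose_estimation}; the function and variable names are illustrative, not those of the actual implementation), the finger pose in a surface frame is obtained from the two camera-frame marker poses as ${}^s\mathbf{T}_f = ({}^c\mathbf{T}_s)^{-1} \, {}^c\mathbf{T}_f$:
+\begin{verbatim}
+import numpy as np
+
+def finger_pose_in_surface(T_c_s, T_c_f):
+    """Finger pose relative to a surface marker.
+
+    T_c_s, T_c_f: 4x4 homogeneous poses of the surface and finger
+    markers in the camera frame, estimated from the AprilTags.
+    """
+    return np.linalg.inv(T_c_s) @ T_c_f
+
+def tangential_speed(T_prev, T_curr, dt):
+    """Finger speed in the surface plane between two frames."""
+    d = T_curr[:2, 3] - T_prev[:2, 3]  # in-plane (x, y) displacement
+    return np.linalg.norm(d) / dt
+\end{verbatim}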
-The visual textures were displayed on the tangible surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
+The visual textures were displayed on the real surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
A set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the visual textures, which were used throughout the user study.
When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was generated using the rendering procedure described in \cite{culbertson2014modeling} from the corresponding \HaTT haptic texture model and the measured tangential speed of the finger (\secref[vhar_system]{texture_generation}).
@@ -44,7 +44,7 @@ The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attache
\item The nine visuo-haptic textures used in the user study, selected from the \HaTT database \cite{culbertson2014one}.
The texture names were never shown, to prevent the use of the user's visual or haptic memory of the textures.
\item Experimental setup.
- Participant sat in front of the tangible surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
+ Participants sat in front of the real surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
A webcam above the surfaces tracked the finger movements.
]
\subfig[0.49]{experiment/textures}
@@ -58,12 +58,12 @@ Participants were first given written instructions about the experimental setup,
Then, after having signed an informed consent form, they were asked to sit in front of the table with the experimental setup and to wear the \AR headset.
%The experimenter firmly attached the plastic shell encasing the vibrotactile actuator to the middle index phalanx of their dominant hand.
As the haptic textures generated no audible noise, participants did not wear any noise reduction headphones.
-A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the tangible surfaces.
+A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the real surfaces.
Finally, participants familiarized themselves with the augmented surface in a \qty{2}{min} training session with textures different from the ones used in the user study.
Participants started with the \level{Matching} task.
They were informed that the user study involved nine pairs of corresponding visual and haptic textures that were separated and shuffled.
-On each trial, the same visual texture was displayed on the nine tangible surfaces, while the nine haptic textures were rendered on only one of the surfaces at a time, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
+On each trial, the same visual texture was displayed on the nine real surfaces, while the nine haptic textures were rendered on only one of the surfaces at a time, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
The placement of the haptic textures was randomized before each trial.
Participants were instructed to look closely at the details of the visual textures and explore the haptic textures with a constant pressure and various speeds to find the haptic texture that best matched the visual texture, \ie choose the surface with the most coherent visual-haptic texture pair.
The texture names were never given or shown, to prevent the use of visual or haptic memory of the textures, nor was a definition of roughness given, to let participants complete the task as naturally as possible, similarly to \textcite{bergmanntiest2007haptic}.
diff --git a/2-perception/vhar-textures/4-discussion.tex b/2-perception/vhar-textures/4-discussion.tex
index 879475a..815666b 100644
--- a/2-perception/vhar-textures/4-discussion.tex
+++ b/2-perception/vhar-textures/4-discussion.tex
@@ -1,11 +1,11 @@
\section{Discussion}
\label{discussion}
-In this study, we investigated the perception of visuo-haptic texture augmentation of tangible surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
+In this study, we investigated the perception of visuo-haptic texture augmentation of real surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
The nine evaluated pairs of visuo-haptic textures, taken from the \HaTT database \cite{culbertson2014one}, are models of real texture captures (\secref[related_work]{texture_rendering}).
Their perception was evaluated in a two-task user study in which participants chose the most coherent combinations of visual and haptic textures (\level{Matching} task), and ranked all textures according to their perceived roughness (\level{Ranking} task).
-The visual textures were displayed statically on the tangible surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the tangible surface.
+The visual textures were displayed statically on the real surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the surface.
In addition, the interaction with the textures was designed to be as natural as possible, without imposing a specific finger movement speed, unlike similar studies \cite{asano2015vibrotactile,friesen2024perceived}.
In the \level{Matching} task, participants were not able to effectively match the original visual and haptic texture pairs (\figref{results/matching_confusion_matrix}), except for the \level{Coffee Filter} texture, which was the smoothest both visually and haptically.
diff --git a/2-perception/vhar-textures/5-conclusion.tex b/2-perception/vhar-textures/5-conclusion.tex
index 4768073..3383fda 100644
--- a/2-perception/vhar-textures/5-conclusion.tex
+++ b/2-perception/vhar-textures/5-conclusion.tex
@@ -1,13 +1,13 @@
\section{Conclusion}
\label{conclusion}
-In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of tangible surfaces seen in immersive \OST-\AR and touched directly with the index finger.
+In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen in immersive \OST-\AR and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and the finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.
The results showed that participants consistently identified and matched clusters of visual and haptic textures with similar perceived roughness.
The texture rankings did indeed show that participants perceived the roughness of haptic textures to be very similar, but less so for visual textures, and the haptic roughness perception dominated the final roughness ranking of the original visuo-haptic pairs.
-This suggests that \AR visual textures that augments tangible surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
+This suggests that \AR visual textures that augment real surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
This paves the way for new \AR applications capable of augmenting a \RE with virtual visuo-haptic textures, such as visuo-haptic painting in an artistic or object design context, or viewing and touching virtual objects in a museum or a showroom.
The latter is illustrated in \figref{experiment/use_case}, where a user applies different visuo-haptic textures to a wall, in an interior design scenario, to compare them visually and by touch.
diff --git a/2-perception/xr-perception/1-introduction.tex b/2-perception/xr-perception/1-introduction.tex
index f1ed8b2..1be8b3b 100644
--- a/2-perception/xr-perception/1-introduction.tex
+++ b/2-perception/xr-perception/1-introduction.tex
@@ -1,14 +1,14 @@
\section{Introduction}
\label{intro}
-Most of the haptic augmentations of tangible surfaces using with wearable haptic devices, including roughness of textures (\secref[related_work]{texture_rendering}), have been studied without a visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR (\secref[related_work]{texture_rendering}).
-Still, it is known that the visual rendering of a tangible can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to difference in perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
+Most of the haptic augmentations of real surfaces using wearable haptic devices, including the roughness of textures (\secref[related_work]{texture_rendering}), have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR (\secref[related_work]{texture_rendering}).
+Still, it is known that the visual rendering of an object can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
Indeed, in \AR, the user can see their own hand touching, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.
-In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{tangible surface whose haptic roughness is augmented} with a wearable haptics. %voice-coil device worn on the finger.
-To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the tangible surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
+In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{real surface whose haptic roughness is augmented} with wearable haptics. %voice-coil device worn on the finger.
+To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
We evaluated, in a \textbf{user study with psychophysical methods and an extensive questionnaire}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
-To control for the influence of the visual rendering, the tangible surface was not visually augmented and stayed the same in all conditions.
+To control for the influence of the visual rendering, the real surface was not visually augmented and stayed the same in all conditions.
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -21,7 +21,7 @@ We then present the results obtained, discuss them, and outline recommendations
%First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual \AR/\VR headset to provide a coherent multimodal visuo-haptic augmentation of the \RE.
%An experimental setup is then presented to compare haptic roughness augmentation with an optical \AR headset (Microsoft HoloLens~2) that can be transformed into a \VR headset using a cardboard mask.
-%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a tangible surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
+%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a real surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
\bigskip
diff --git a/2-perception/xr-perception/3-experiment.tex b/2-perception/xr-perception/3-experiment.tex
index a7fa4ae..3a1bc6e 100644
--- a/2-perception/xr-perception/3-experiment.tex
+++ b/2-perception/xr-perception/3-experiment.tex
@@ -1,9 +1,9 @@
\section{User Study}
\label{experiment}
-%The visuo-haptic rendering system, described in \secref[vhar_system]{method}, allows free exploration of virtual vibrotactile textures on tangible surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in \AR or \VR.
+%The visuo-haptic rendering system, described in \secref[vhar_system]{method}, allows free exploration of virtual vibrotactile textures on real surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in \AR or \VR.
%
-%The user study aimed to investigate the effect of visual hand rendering in \AR or \VR on the perception of roughness texture augmentation of a touched tangible surface.
+%The user study aimed to investigate the effect of visual hand rendering in \AR or \VR on the perception of roughness texture augmentation of a touched real surface.
In a \TIFC task (\secref[related_work]{sensations_perception}), participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (\level{Real}, \figref{experiment/real}), in \AR with a realistic virtual hand superimposed on the real hand (\level{Mixed}, \figref{experiment/mixed}), and in \VR with the same virtual hand as an avatar (\level{Virtual}, \figref{experiment/virtual}).
In order not to influence the perception, as vision is an important source of information and influence for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}, the touched surface was visually a uniform white; thus only the visual aspect of the hand and the surrounding environment was changed.
@@ -52,7 +52,7 @@ The user study was held in a quiet room with no windows.
\begin{subfigs}{setup}{Visuo-haptic textures rendering setup. }[][
\item HoloLens~2 \OST-\AR headset, the two cardboard masks to switch between the real and virtual environments with the same field of view, and the \ThreeD-printed piece for attaching the masks to the headset.
- \item User exploring a virtual vibrotactile texture on a tangible sheet of paper.
+ \item User exploring a virtual vibrotactile texture on a real sheet of paper.
]
\subfigsheight{48.5mm}
\subfig{experiment/headset}
diff --git a/2-perception/xr-perception/5-discussion.tex b/2-perception/xr-perception/5-discussion.tex
index d2de1cd..628def1 100644
--- a/2-perception/xr-perception/5-discussion.tex
+++ b/2-perception/xr-perception/5-discussion.tex
@@ -15,7 +15,7 @@ Thus, compared to no visual rendering (\level{Real}), the addition of a visual r
Differences in user behaviour were also observed between the visual renderings (but not between the haptic textures).
On average, participants responded faster (\percent{-16}), explored textures over a greater distance (\percent{+21}) and at a higher speed (\percent{+16}) without visual augmentation (\level{Real} rendering) than in \VR (\level{Virtual} rendering) (\figref{results_finger}).
The \level{Mixed} rendering was always in between, with no significant difference from the other two.
-This suggests that touching a virtual vibrotactile texture on a tangible surface with a virtual hand in \VR is different from touching it with one's own hand: users were more cautious or less confident in their exploration in \VR.
+This suggests that touching a virtual vibrotactile texture on a real surface with a virtual hand in \VR is different from touching it with one's own hand: users were more cautious or less confident in their exploration in \VR.
This does not seem to be due to the realism of the virtual hand or the environment, nor to the control of the virtual hand, all of which were rated high to very high by the participants (\secref{results_questions}) in both the \level{Mixed} and \level{Virtual} renderings.
The evaluation of the vibrotactile device and the textures was also the same across the visual renderings, with a high sense of control, good realism, and a low perceived latency of the textures (\secref{results_questions}).
Conversely, the perceived latency of the virtual hand (\response{Hand Latency} question) seemed to be related to the perceived roughness of the textures (as measured by the \PSEs).
diff --git a/2-perception/xr-perception/6-conclusion.tex b/2-perception/xr-perception/6-conclusion.tex
index 9c88926..136f99c 100644
--- a/2-perception/xr-perception/6-conclusion.tex
+++ b/2-perception/xr-perception/6-conclusion.tex
@@ -2,8 +2,8 @@
\label{conclusion}
In this chapter, we studied how the perception of wearable haptic texture augmentations is affected by the visual virtuality of the hand and the environment, being either real, augmented, or virtual.
-Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of tangible surfaces with virtual vibrotactile textures rendered on the finger.
-%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the tangible surface being touched.
+Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
+%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the real surface being touched.
With an immersive \AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
@@ -17,6 +17,6 @@ This study suggests that attention should be paid to the respective latencies of
Latencies should be measured \cite{friston2014measuring}, minimized to an acceptable level for users, and kept synchronised with each other \cite{diluca2019perceptual}.
It also seems that the visual aspect of the hand or the environment in itself has little effect on the perception of haptic feedback, but that the degree of visual reality-virtuality can affect the sensation of asynchrony between the feedback latencies, even though these remain identical.
When designing wearable haptics or integrating them into \AR/\VR, it seems important to test their perception in real, augmented, and virtual environments.
-%With a better understanding of how visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
+%With a better understanding of how visual factors influence the perception of haptically augmented real objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic renderings adapted to \AR can be designed.
%Finally, a visual hand representation in OST-\AR together with wearable haptics should be avoided until acceptable tracking latencies are achieved, as was also observed for \VO interaction with the bare hand \cite{normand2024visuohaptic}.
diff --git a/4-conclusion/conclusion.tex b/4-conclusion/conclusion.tex
index 18dbd07..cd67afe 100644
--- a/4-conclusion/conclusion.tex
+++ b/4-conclusion/conclusion.tex
@@ -5,26 +5,26 @@
\section{Summary}
-In this thesis, entitled \enquote{\textbf{\ThesisTitle}}, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.
+In this manuscript, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.
Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both directly touched by the hand, while preserving the freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical, and human challenges.
-We have structured our research around two axes: \textbf{(I) modifying the texture perception of tangible surfaces}, and \textbf{(II) improving the manipulation of \VOs}.
+We have structured our research around two axes: \textbf{(I) modifying the texture perception of real surfaces}, and \textbf{(II) improving the manipulation of \VOs}.
-\noindentskip In \partref{perception} we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment tangible surfaces.
+\noindentskip In \partref{perception}, we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment real surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR.
We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual virtuality of the hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
-In \chapref{vhar_system}, we presented a system for \textbf{augmenting any tangible surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
+In \chapref{vhar_system}, we presented a system for \textbf{augmenting any real surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
It allows \textbf{free visual and touch exploration} of the textures, as if they were real, letting the user view them from different angles and touch them with the bare finger without constraints on hand movements.
The user studies in the next two chapters are based on this system.
In \chapref{xr_perception}, we explored how the perception of wearable haptic texture augmentations is affected by the visual virtuality of the hand and the environment, whether real, augmented, or virtual.
-We augmented the perceived roughness of the tangible surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view.
+We augmented the perceived roughness of the real surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view.
We then conducted a psychophysical user study with 20 participants and extensive questionnaires to evaluate the perceived roughness augmentation in these three visual conditions.
The textures were perceived as \textbf{rougher when touched with the real hand alone compared to a virtual hand} in either \AR or \VR, possibly due to the \textbf{perceived latency} between finger movements and the various visual, haptic, and proprioceptive feedback.
-In \chapref{vhar_textures}, we investigated the perception of co-localized visual and wearable haptic texture augmentations on tangible surfaces.
+In \chapref{vhar_textures}, we investigated the perception of co-localized visual and wearable haptic texture augmentations on real surfaces.
We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of nine visuo-haptic texture pairs.
Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics predominating the perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
@@ -47,13 +47,13 @@ The results showed that providing vibrotactile feedback \textbf{improved the per
The wearable visuo-haptic augmentations of perception and manipulation we presented and the user studies we conducted for this thesis have, of course, some limitations.
In this section, we present some future work for each chapter that could address these issues.
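As an aside on the psychophysical analyses summarized above, the \PSEs reported in these studies are classically obtained by fitting a psychometric function to forced-choice responses. A minimal sketch with a cumulative-Gaussian fit follows; the data and values are hypothetical, and this is not the thesis's actual analysis pipeline.

```python
# Minimal sketch: PSE and JND from two-interval forced-choice responses.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    """P(comparison judged rougher) as a function of comparison level x."""
    return norm.cdf(x, loc=pse, scale=sigma)

# Hypothetical comparison levels (e.g., grating spatial periods, mm) and
# observed proportions of "comparison rougher" responses.
levels = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
p_rougher = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])

(pse, sigma), _ = curve_fit(psychometric, levels, p_rougher, p0=[2.0, 0.5])
jnd = sigma * norm.ppf(0.75)  # 75%-point convention for the JND
print(f"PSE = {pse:.2f} mm, JND = {jnd:.2f} mm")
```

The \PSE is the comparison level judged rougher than the reference half the time; a shift of the \PSE between visual conditions is what indicates a change in perceived roughness.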
-\subsection*{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
+\subsection*{Augmenting the Visuo-haptic Texture Perception of Real Surfaces}

\paragraph{Other Augmented Object Properties}
We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}).
However, many other wearable augmentations of object properties should be considered, such as hardness, friction, temperature, or local deformations.
-Such integration of haptic augmentation of a tangible surface has almost been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but will be more challenging with wearable haptic devices.
+Such integration of haptic augmentation of a real surface has almost been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but will be more challenging with wearable haptic devices.
In addition, the combination with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations.

\paragraph{Fully Integrated Tracking}
@@ -75,18 +75,19 @@ In particular, it remains to be investigated how the vibrotactile patterned text

\paragraph{Broader Visuo-Haptic Conditions}

-Our study was conducted with an \OST-\AR headset, but the results may be different with a \VST-\AR headset, where the \RE is seen through cameras and screens, and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different.
-We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal to simulate a patterned texture: Our objective was not to accurately reproduce real textures, but to induce different percevied roughness on the same tangible surface with a well controlled parameters.
+Our study was conducted with an \OST-\AR headset, but the results may be different with a \VST-\AR headset, where the \RE is seen through cameras and screens (\secref[related_work]{ar_displays}), and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different.
+The effect of perceived visuo-haptic simultaneity on the augmented haptic perception, and the size of this effect, should also be systematically investigated, for example by inducing various delays between the visual and haptic feedback.
+We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square-wave signal to simulate a patterned texture: our objective was not to accurately reproduce real textures, but to induce different perceived roughness on the same real surface with well-controlled parameters.
However, more accurate models for simulating interaction with virtual textures should be applied to wearable haptic augmentations, such as in \textcite{unger2011roughness}.
Another limitation that may have affected the perception of the haptic texture augmentations is the lack of compensation for the frequency response of the actuator and amplifier \cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
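The compensation mentioned in this last point usually amounts to pre-equalising the command signal with the inverse of the measured magnitude response of the actuator chain. A minimal sketch follows, with a purely hypothetical measured response and a linear-phase FIR inverse filter; a real system would first identify the response, e.g., with an accelerometer mounted on the actuator.

```python
# Sketch of inverse-filter compensation for an actuator+amplifier chain.
# The measured response below is hypothetical.
import numpy as np
from scipy.signal import firwin2, lfilter

FS = 48_000
freqs = np.array([0, 50, 150, 250, 400, 600, 1000, FS / 2])  # Hz
gains = np.array([0.2, 0.6, 1.0, 1.4, 1.0, 0.7, 0.4, 0.1])   # measured magnitude

# Inverse-magnitude equaliser, floored to limit noise amplification,
# then normalised to keep the output signal in range.
inv_gains = 1.0 / np.maximum(gains, 0.1)
inv_gains /= inv_gains.max()
eq = firwin2(257, freqs / (FS / 2), inv_gains)

def equalise(texture_signal):
    """Pre-filter the vibrotactile signal before sending it to the amplifier."""
    return lfilter(eq, [1.0], texture_signal)
```

Note that a 257-tap linear-phase filter adds a constant delay of 128 samples (about 2.7 ms at 48 kHz), which must be counted in the overall visuo-haptic latency budget discussed in the previous chapters.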
-The dynamic response of the finger should also be considered, and may vary between individuals.
+%The dynamic response of the finger should also be considered, and may vary between individuals.

\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}

\paragraph{Assess the Applicability of the Method}

-As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a tangible surface being touched with simultaneous visual and haptic texture augmentations.
+As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a real surface being touched with simultaneous visual and haptic texture augmentations.
-However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white tangible surfaces.
+However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth, white real surfaces.
Visuo-haptic texture augmentations may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes.
The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D print before it is manufactured.
@@ -112,7 +113,7 @@ While the mutual occlusion problem and the hand tracking latency can be overcome

\paragraph{More Ecological Conditions}

We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it.
-While these tasks are fundamental building blocks for more complex manipulation tasks \cite{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered.
+While these tasks are fundamental building blocks for more complex manipulation tasks \cite[p.390]{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered.
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs, and this should also be evaluated.
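Since hand-tracking latency recurs as a limiting factor throughout these studies, it is worth noting that the relative delay between two logged feedback streams (for example, tracked finger speed and the rendered vibration envelope) can be monitored by cross-correlation, as sketched below under the assumption that both streams are logged at a common rate; dedicated methods such as \textcite{friston2014measuring} remain preferable for absolute end-to-end measurements.

```python
# Sketch: relative latency between two logged streams via cross-correlation.
import numpy as np

def relative_latency(ref, delayed, rate_hz):
    """Lag (s) that best aligns `delayed` onto `ref` (equal-length streams)."""
    ref = (ref - ref.mean()) / ref.std()
    delayed = (delayed - delayed.mean()) / delayed.std()
    xcorr = np.correlate(delayed, ref, mode="full")
    lag = np.argmax(xcorr) - (len(ref) - 1)  # positive: `delayed` lags `ref`
    return lag / rate_hz

# e.g., finger speed from the tracker vs. the haptic output envelope,
# both resampled to a common 1 kHz timeline:
# latency_s = relative_latency(finger_speed, haptic_envelope, rate_hz=1000)
```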
@@ -139,6 +140,10 @@ It remains to be explored how to support rendering for different and larger area
% systematic exploration of the parameter space of the haptic rendering to determine the most important parameters and their influence on the perception
% measure the difference in sensitivity to the haptic feedback and how much it affects the perception of the object properties
+%- Videoconferencing in mixed reality: AR with remote avatars, VR to meet in the other person's space or in a remote space, and the need to make remote objects touchable
+%- Or, during an ongoing session, see the sample to be touched in the work environment, or in context by switching to VR
+%- E.g., palpation by a physician, the design of an object, the renovation of a home (AR in the everyday context, VR to see and touch the result once completed)
+
\subsection*{Responsive Visuo-Haptic Augmented Reality}

%Given these three points, and the diversity of haptic actuators and renderings, one might be able to interact with \VOs with any haptic device, worn anywhere on the body and providing personalized feedback on any part of the hand, and the visuo-haptic system should be able to support such adapted usage.