Replace "immersive AR" with "AR headset"
@@ -5,12 +5,9 @@
\bigskip

%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
In this thesis, we show how \AR headsets, which integrate visual virtual content into the perception of the real world, and wearable haptics, which provide tactile sensations on the skin, can improve direct hand interaction with virtual and augmented objects.
Our goal is to enable users to perceive and interact with wearable visuo-haptic augmentations in a more realistic and effective way, as if they were real.
%interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
\comans{JG}{I was wondering what the difference between an immersive AR headset and a non-immersive AR headset should be. If there is a difference (e.g., derived through headset properties by FoV), it should be stated. If there is none, I would suggest not using the term immersive AR headset but simply AR headset. On this account, in Figure 1.5 another term (“Visual AR Headset”) is introduced (and later OST-AR systems, c.f. also section 2.3.1.3).}{The terms "immersive AR headset" and "visual AR headset" have been replaced by the more appropriate term "AR headset".}

\section{Visual and Haptic Object Augmentations}
\label{visuo_haptic_augmentations}
@@ -100,17 +97,15 @@ For example, (visual) \AR using a real object as a proxy to manipulate a virtual
In this thesis we call \AR/\VR \emph{systems} the combination of hardware (input devices, sensors, displays and haptic devices) and software (tracking, simulation and rendering) that allows the user to interact with the \VE. % by implementing the interaction loop we proposed in \figref{interaction-loop}.
Many \AR displays have been explored, from projection systems to hand-held devices.
\textbf{\AR headsets are the most promising display technology because they create a portable experience that allows the user to navigate the augmented environment and interact with it directly using their hands} \cite{hertel2021taxonomy}.
Commercial headsets also integrate real-time self-localization and mapping of the \RE and hand pose estimation of the user.
While \AR and \VR systems can address any of the human senses, most focus only on visual augmentation \cite[p.144]{billinghurst2015survey} and \cite{kim2018revisiting}.
%but the most \textbf{promising devices are \AR headsets}, which are \textbf{portable displays worn directly on the head}, providing the user with an \textbf{immersive visual augmented environment}.

\emph{Presence} is the illusion of \enquote{being there} when in \VR, or the illusion that the virtual content \enquote{feels here} when in \AR \cite{slater2022separate,skarbez2021revisiting}.
One of the most important aspects of this illusion is the \emph{plausibility}, \ie the illusion that the virtual events are really happening. %, even if the user knows that they are not real.
However, when an \AR/\VR headset lacks haptic feedback, it may create a deceptive and incomplete user experience when the hand reaches the virtual content.
All (visual) virtual objects are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties and interact with them with confidence and efficiency.
It is also necessary to provide haptic feedback that is coherent with the virtual objects and ensures the best possible user experience, as we argue in the next section.
The \textbf{integration of wearable haptics with \AR headsets appears to be one of the most promising solutions}, but it remains challenging due to their respective limitations and the additional constraints of combining them, as we will overview in the next section.

\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][
\item \AR environment with a real haptic object used as a proxy to manipulate a virtual object \cite{kahl2023using}.
@@ -128,24 +123,23 @@ The \textbf{integration of wearable haptics with immersive \AR appears to be one
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
\label{research_challenges}

The integration of wearable haptics with \AR headsets to create a visuo-haptic augmented environment is complex and presents many perceptual and interaction challenges.
% \ie sensing the augmented environment and acting effectively upon it.
In this thesis, we propose to \textbf{represent the user's experience with such a visuo-haptic augmented environment as an interaction loop}, shown in \figref{interaction-loop}.
It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}.
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using an \AR/\VR headset and wearable haptics.
Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.

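The interaction loop described above (tracking, simulation, then visual and haptic rendering) can be sketched as a minimal per-frame update cycle. This is an illustrative sketch only; all function names and the dummy stand-ins are hypothetical, not part of any real \AR or haptics API:

```python
# Illustrative sketch of the per-frame interaction loop described above.
# All names are hypothetical placeholders, not a real AR/haptics API.

def run_interaction_loop(track, simulate, render_visual, render_haptic, n_frames):
    """Run the tracking -> simulation -> rendering cycle for n_frames."""
    log = []
    for frame in range(n_frames):
        hand_pose, env_map = track(frame)        # 1. sense the RE and the hand
        contacts = simulate(hand_pose, env_map)  # 2. virtual hand/object contacts
        render_visual(contacts)                  # 3a. AR headset feedback
        render_haptic(contacts)                  # 3b. wearable haptic feedback
        log.append((frame, len(contacts)))
    return log

# Dummy stand-ins showing the data flow: the "hand" moves up one unit per
# frame and makes contact with a virtual object once its height reaches 3.
log = run_interaction_loop(
    track=lambda f: ((0, f, 0), {}),
    simulate=lambda pose, env: ["contact"] if pose[1] >= 3 else [],
    render_visual=lambda contacts: None,
    render_haptic=lambda contacts: None,
    n_frames=5,
)
# log -> [(0, 0), (1, 0), (2, 0), (3, 1), (4, 1)]
```

In a real system each step would run on the headset's sensing, physics and rendering stack; the sketch only shows the direction of data flow between them.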
\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment as proposed in this thesis.}[
A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with virtual objects.
The visual and haptic \VEs are rendered back using an \AR headset and wearable haptics, and are perceived by the user to be registered and co-localized with the \RE (in gray).
%\protect\footnotemark
]

In this context, we focus on two main research challenges:
\textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and
\textbf{(II) enabling effective manipulation of the augmented environment}.
Each of these challenges also raises numerous design, technical, perceptual and user experience issues specific to wearable haptics and \AR headsets.
%, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic augmented environment.

%\footnotetext{%
@@ -158,22 +152,22 @@ Each of these challenges also raises numerous design, technical, perceptual and
\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations}

\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing the user with rich kinesthetic and tactile feedback on virtual objects, increasing the realism and effectiveness of interaction with them \cite{culbertson2018haptics}.
Although closely related, \AR and \VR headsets have key differences in their respective renderings that can affect user perception.

%As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.

Many hand-held or wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR headsets.
The \textbf{user's hand must be free to touch and interact with the \RE while wearing a wearable haptic device}.
Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2019tasbi,sarac2022perceived} for rendering fingertip contact with virtual content.
Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are perceived as co-localized, but the virtual haptic feedback is not}.
It remains to be investigated how such potential discrepancies affect the overall perception, in order to design visuo-haptic augmentations adapted to \AR headsets.

%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
The \textbf{added visual and haptic virtual sensations may also be perceived as incoherent} with the sensations of the real objects, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
Moreover, with an \AR headset the user can still see the real world environment, including their hands, augmented real objects and worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other. % in the perception of the augmented environment.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic devices that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.

\subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment}
@@ -185,7 +179,7 @@ When touching a visually augmented real object, the user's hand is physically
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging}, especially without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In addition, wearable haptic devices are limited to cutaneous feedback, and cannot provide forces to constrain the hand contact with the virtual object \cite{pacchierotti2017wearable}.

Current \AR headsets have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
In \AR, images of the virtual world are superimposed on the user's current view of the real world and synchronized with it.
However, the depth perception of virtual objects is often underestimated \cite{peillard2019studying,adams2022depth}.
There is also often \textbf{a lack of mutual occlusions between the hand and a virtual object}, that is, the hand can hide the object or be hidden by the object \cite{macedo2023occlusion}.
@@ -199,11 +193,10 @@ Yet, it is unclear which type of visual and wearable haptic feedback, or their c
\section{Approach and Contributions}
\label{contributions}

%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
As we described in \secref{research_challenges}, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues.
Our approach is to:
\begin{enumerate*}[label=(\arabic*)]
\item design wearable visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
\item evaluate in user studies how these visuo-haptic renderings affect hand interaction with these objects using psychophysical, performance and user experience methods.
\end{enumerate*}

@@ -227,13 +220,12 @@ Wearable haptic devices have proven effective in modifying the perception of a t
%It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator.
However, wearable haptic augmentation with \AR has been little explored, as has the visuo-haptic augmentation of texture.
Texture is indeed one of the most fundamental perceived properties of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
%Coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards a \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to \textbf{design and evaluate the perception of wearable virtual visuo-haptic textures augmenting real surfaces}.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, (2) evaluate how the perception of haptic texture augmentations is affected by visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations.

First, an effective approach to render haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, to achieve natural interaction with the hand and coherent visuo-haptic feedback, this requires real-time rendering of the textures, no constraints on hand movements, and good synchronization between the visual and haptic feedback.
Thus, our first objective is to \textbf{design a real-time system} that allows free exploration of \textbf{wearable visuo-haptic texture augmentations} on real surfaces with the bare hand.
This will form the basis of the next two chapters in this section.

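For a periodic grating, a common way to generate such a vibrotactile signal is to map the finger's sliding speed and the texture's spatial period to a vibration frequency, $f = v / \lambda$. A minimal sketch under this sinusoidal-grating assumption (the function name is ours, not taken from the cited works):

```python
import math

def vibrotactile_sample(finger_speed, spatial_period, t, amplitude=1.0):
    """One sample of a sinusoidal vibrotactile texture signal.

    Stroking a grating of spatial period lambda (m) at speed v (m/s)
    produces a temporal vibration of frequency f = v / lambda (Hz).
    """
    frequency = finger_speed / spatial_period
    return amplitude * math.sin(2.0 * math.pi * frequency * t)

# Example: a 2 mm grating stroked at 10 cm/s vibrates at 0.10 / 0.002 = 50 Hz,
# so the signal peaks a quarter period after onset (t = 1 / (4 * 50) = 0.005 s).
peak = vibrotactile_sample(finger_speed=0.10, spatial_period=0.002, t=0.005)
# peak -> ~1.0
```

In a complete system, the finger speed would come from the hand tracking of the interaction loop and the samples would be streamed to the wearable actuator at audio rate, keeping the haptic signal synchronized with the visual texture.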
Second, many works have investigated the haptic augmentation of texture, but none have integrated it with \AR or \VR, or considered the influence of visual feedback on its perception.
@@ -241,22 +233,22 @@ Still, it is known that visual feedback can alter the perception of real and vir
Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentation is affected by the visual feedback of the virtual hand and the environment} (real, augmented or virtual).

Finally, visuo-haptic texture databases have been created from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3} to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures with an \AR headset and wearable haptics remains to be investigated.
Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR. %, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.

\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
With wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented and virtual objects.
Hence, a user can expect natural and direct contact and manipulation of virtual objects with the bare hand.
However, the intangibility of the visual \VE, the display limitations of current \AR headsets and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with virtual objects particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with virtual objects with bare hands particularly challenging.
Two particular sensory feedbacks are known to improve such direct virtual object manipulation, but have not been properly investigated with \AR headsets: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to \textbf{design and evaluate visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in \AR.
We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of virtual object manipulation with the hand in combination with visual hand augmentations.

First, the visual feedback of the virtual hand is a key element for interacting and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}.
Some work has also investigated the visual feedback of the virtual hand in \AR, but not in a context of virtual object manipulation with a headset \cite{blaga2017usability,yoon2020evaluating} or was limited to a single visual hand augmentation \cite{piumsomboon2014graspshell,maisto2017evaluation}. % with the bare hand.% from simulating mutual occlusions between the hand and virtual objects \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
\AR headsets also have significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}.
%, and these visual hand augmentations have not been evaluated .
Thus, our fourth objective is to \textbf{investigate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects.

@@ -284,10 +276,10 @@ We then address each of our two research axes in a dedicated part.
In \textbf{\partref{perception}}, we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces.
We evaluate how the visual feedback of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.

In \textbf{\chapref{vhar_system}}, we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The pose estimation of the real hand and the environment is achieved using a vision-based technique.
The visual rendering is done using the \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.

In \textbf{\chapref{xr_perception}}, we investigate in a psychophysical user study how different the perception of haptic texture augmentations is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
@@ -298,7 +290,7 @@ The virtual textures are paired visual and haptic captures of real surfaces \cit
Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs.

\noindentskip
In \textbf{\partref{manipulation}}, we describe our contributions to the second axis of research: improving direct hand manipulation of virtual objects using visuo-haptic augmentations of the hand as interaction feedback with virtual objects in \AR.

In \textbf{\chapref{visual_hand}}, we investigate in a user study six visual feedback techniques as hand augmentations, selected among the most popular hand augmentations in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand.

@@ -97,7 +97,7 @@ As illustrated in \figref{sensorimotor_continuum}, \textcite{jones2006human} del
]

This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of virtual objects (\partref{manipulation}) using an \AR headset and wearable haptics.

\subsubsection{Hand Anatomy and Motion}
\label{hand_anatomy}

@@ -2,7 +2,8 @@
\label{augmented_reality}

\AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}.
Among the different types of devices, \AR headsets leave the hands free to interact with virtual objects.
This promises natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.

\subsection{What is Augmented Reality?}
\label{what_is_ar}
@@ -72,7 +73,7 @@ It doesn't require the user to wear the display, but requires a real surface to
Regardless of the \AR display, it can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
\emph{Spatial \AR} usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also be \OST or \VST \emph{fixed windows} (\figref{lee2013spacetop}).
Alternatively, \AR displays can be \emph{hand-held}, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight \cite[p.141]{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a portable experience.

\fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}

@@ -141,7 +142,7 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\label{ve_tasks}

\textcite{laviolajr20173d} (p.385) classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for \AR headsets.

The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
\emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.
@@ -175,12 +176,12 @@ In this thesis we focus on manipulation tasks of virtual content directly with t
\label{real_virtual_gap}

In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
With an \AR headset, the \VE can be experienced at a 1:1 scale and as an integral part of the \RE.
The rendering gap between the real and virtual elements, as described on our interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as narrow or even not consciously perceived by the user.
This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.

As the gap between real and virtual rendering is reduced, one could expect a similar and seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality based interactions}.
|
||||
As of today, an immersive \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
|
||||
As of today, an \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
|
||||
It enables the \VE to be registered with the \RE and the user simply moves to navigate within the virtual content.
|
||||
However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
|
||||
It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite[p.165]{billinghurst2015survey}.
|
||||
@@ -276,8 +277,8 @@ This suggests that a visual hand feedback superimposed on the real hand as a par

Few works have compared different visual feedback of the virtual hand in \AR or with wearable haptic feedback.
Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural but seems to be preferred to a mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
-Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
-In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was the most preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
+Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
+In a collaborative task in \OST-\AR \vs \VR headsets, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
\textcite{genay2021virtual} found that the sense of embodiment with robotic hand overlays in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.
@@ -302,7 +303,7 @@ Taken together, these results suggest that a visual augmentation of the hand in

\AR systems integrate virtual content into the user's perception as if it were part of the \RE.
\AR headsets now enable real-time pose estimation of the head and hands, and high-quality display of virtual content, while being portable and mobile.
-They enable highly immersive augmented environments that users can explore with a strong sense of the presence of the virtual content.
+They create augmented environments that users can explore with a strong sense of the presence of the virtual content.
However, without direct and seamless interaction with the virtual objects using the hands, the coherence of the augmented environment experience is compromised.
In particular, when manipulating virtual objects in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
A common alternative approach is to use real objects as proxies for interaction with virtual objects, but this raises concerns about their coherence with visual augmentations.

@@ -5,7 +5,7 @@ Perception and manipulation of objects with the hand typically involves both the
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable for many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.

Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
-It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.
+It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with \AR headsets.

\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{vh_perception}
@@ -60,7 +60,7 @@ More precisely, when surfaces are evaluated by vision or touch alone, both sense

The overall perception can then be modified by changing one of the sensory modalities.
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
-In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
+In a similar setup, but in \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real objects seemed to be sufficient to match all the visual virtual objects (\figref{gunther2022smooth}).

@@ -85,7 +85,7 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo

Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR.

-In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
+In \VST-\AR, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
@@ -114,7 +114,7 @@ The visuo-haptic simultaneity was varied by adding a visual delay or by triggeri
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.

These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
-We describe in the next section how wearable haptics have been integrated with immersive \AR.
+We describe in the next section how wearable haptics have been integrated with \AR.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
@@ -129,7 +129,7 @@ We describe in the next section how wearable haptics have been integrated with i
\subsection{Wearable Haptics for Direct Hand Interaction in AR}
\label{vhar_haptics}

-A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
+A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.
Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed to relocate the haptic actuator to a different location, on the outside of the finger or the hand, \eg the nail, the top of a phalanx, or the wrist.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), thus in the haptic feedback provided (\secref{tactile_rendering}), and in the placement of the haptic rendering.
@@ -178,12 +178,12 @@ In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of renderin
The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
\textcite{scheggi2010shape} reported that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.

-In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
+In a pick-and-place task in \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual feedback of the tracked fingertips as virtual points.
They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
The haptic ring was also perceived as more effective than the moving platform.
However, the measured difference in performance could be due to either the device or the device position (proximal \vs fingertip), or both.
-These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual feedback of the hand-object contacts, but did not examine them together.
+These two studies were also conducted in static setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual feedback of the hand-object contacts, but did not examine them together.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[][
\item Rendering weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.
@@ -200,7 +200,7 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
\subsection{Conclusion}
\label{visuo_haptic_conclusion}

-Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with virtual objects in immersive \AR is challenging.
+Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation of virtual objects in \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them is best suited for direct hand interaction in \AR.

@@ -10,13 +10,13 @@ Only a few haptic systems can be considered wearable due to their compactness an
If their haptic rendering is timely associated with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
Wearable haptic augmentation of roughness and hardness is mostly achieved with vibrotactile feedback (\secref{tactile_rendering}).

-\AR headsets integrate virtual content into the user's perception in an immersive way, as if it were part of the \RE, with real-time pose estimation of the head and hands (\secref{what_is_ar}).
+\AR headsets integrate virtual content into the user's perception as if it were part of the \RE, with real-time pose estimation of the head and hands (\secref{what_is_ar}).
Direct hand interaction with virtual content is often implemented using the virtual hand interaction technique, which reconstructs the user's hand in the \VE and simulates its interactions with the virtual.
However, the perception and manipulation of the virtual is difficult due to the lack of haptic feedback and the mutual occlusion of the hand with the virtual content (\secref{ar_interaction}). %, which could be addressed by a visual augmentation of the hand (\secref{ar_visual_hands}).
Real surrounding objects can also be used as proxies to interact with the virtual, but they may be incoherent with their visual augmentation because they are haptically passive (\secref{ar_interaction}).
Wearable haptics on the hand seems to be a promising solution to enable coherent and effective visuo-haptic augmentation of both virtual and real objects.

-\noindentskip In this thesis, we will use wearable haptic feedback in immersive \AR to create visuo-haptic texture augmentation when touching real objects (\partref{perception}) and to improve manipulation of virtual objects (\partref{manipulation}), both directly with the bare hand.
+\noindentskip In this thesis, we will use wearable haptic feedback with an \AR headset to create visuo-haptic texture augmentation when touching real objects (\partref{perception}) and to improve manipulation of virtual objects (\partref{manipulation}), both directly with the bare hand.

First, it is challenging to provide coherent visuo-haptic feedback when augmenting real objects.
By integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is somewhat robust to variations in reliability and to spatial and temporal differences.
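This robustness to variations in reliability is commonly formalized by the maximum-likelihood cue-integration model (standard in the cue-combination literature, \eg Ernst and Banks; the equations below are an illustrative sketch, not taken from this thesis):

```latex
% Maximum-likelihood integration of a visual estimate \hat{S}_v and a
% haptic estimate \hat{S}_h, weighted by their reliabilities
% (illustrative; \sigma_v and \sigma_h are sensory noise variances,
% not values measured in this work).
\[
\hat{S}_{vh} = w_v \hat{S}_v + w_h \hat{S}_h,
\qquad
w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\qquad
w_h = \frac{1/\sigma_h^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\]
\[
\sigma_{vh}^2 = \frac{\sigma_v^2\,\sigma_h^2}{\sigma_v^2 + \sigma_h^2}
\le \min\left(\sigma_v^2, \sigma_h^2\right).
\]
```

Each modality is thus weighted by its reliability, so the combined percept degrades gracefully when one cue becomes noisier, which is consistent with the robustness described above.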

@@ -4,12 +4,12 @@
One approach to render virtual haptic textures consists in simulating the roughness of a periodic grating surface as a vibrotactile sinusoid (\secref[related_work]{texture_rendering}).
The vibrations are rendered by a voice-coil actuator embedded in a hand-held tool or worn on the finger, but to create the illusion of touching a pattern with a fixed spatial period, the frequency of the signal must be modulated according to the finger movement.
Previous work either used a mechanical system to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
-However, this method has not yet been integrated in an \AR context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
+However, this method has not yet been integrated in an \AR headset context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
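The fixed-spatial-period rendering described above can be sketched as a phase accumulator driven by the estimated finger speed: the instantaneous signal frequency is the speed divided by the grating wavelength, so the felt spatial period stays constant whatever the exploration speed. A minimal illustrative sketch (function name and parameter values are ours, not the thesis implementation):

```python
import math

def render_grating(positions, dt, wavelength=0.002, amplitude=1.0):
    """Generate vibrotactile samples for a sinusoidal grating of fixed
    spatial period, given 1D finger positions (m) sampled every dt seconds.

    The instantaneous frequency is speed / wavelength, so the phase
    increment per sample is 2*pi * |dx| / wavelength: the rendered
    pattern depends on displacement, not on time alone.
    """
    samples = []
    phase = 0.0
    for i in range(1, len(positions)):
        speed = abs(positions[i] - positions[i - 1]) / dt  # m/s
        phase += 2.0 * math.pi * (speed / wavelength) * dt
        samples.append(amplitude * math.sin(phase))
    return samples
```

For example, a finger moving at a constant \qty{0.1}{m/s} over a \qty{2}{mm} wavelength yields a 50 Hz vibration; halving the speed halves the frequency, preserving the illusion of a fixed grating.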

%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}

In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
-It is implemented with an immersive \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
+It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based pose estimation.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
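Marker-based registration ultimately reduces to rigid transforms between the camera, surface, and finger frames. As an illustrative sketch (ours, not the thesis implementation), a tracked fingertip given in the camera frame can be expressed in the frame of a surface-attached marker whose pose (rotation, translation) was estimated in the camera frame, which also gives a simple contact test against the surface plane:

```python
import numpy as np

def point_in_marker_frame(p_cam, R_cm, t_cm):
    """Express a 3D point p_cam (camera frame) in the frame of a marker
    whose pose in the camera frame is rotation R_cm and translation t_cm,
    by inverting the rigid transform: p_m = R^T (p_cam - t)."""
    R = np.asarray(R_cm, dtype=float)
    return R.T @ (np.asarray(p_cam, dtype=float) - np.asarray(t_cm, dtype=float))

def finger_touches_surface(p_cam, R_cm, t_cm, tolerance=0.005):
    """Treat the surface as the marker's z = 0 plane: the fingertip is in
    contact when its height above that plane is below the tolerance (m)."""
    return point_in_marker_frame(p_cam, R_cm, t_cm)[2] < tolerance
```

The contact event from such a test is what triggers and parameterizes the vibrotactile signal; the tolerance absorbs pose-estimation noise (the \qty{5}{mm} default is an assumption for illustration).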

@@ -17,7 +17,7 @@ The haptic textures are rendered as a vibrotactile signal representing a pattern
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item The rendering of virtual vibrotactile roughness textures representing a patterned grating texture in real time from free finger movements and using vision-based finger pose estimation.
-\item A system to provide a coherent visuo-haptic texture augmentations of the \RE in a direct touch context using an immersive \AR headset and wearable haptics.
+\item A system to provide coherent visuo-haptic texture augmentations of the \RE in a direct touch context using an \OST-\AR headset and wearable haptics.
\end{itemize}

\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.

@@ -5,7 +5,7 @@

In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
Directly touched with the fingertip, the perceived roughness of the surface can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
-We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based pose estimation of the finger and paired it with an immersive \AR headset.
+We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based pose estimation of the finger and paired it with an \OST-\AR headset.

Our wearable visuo-haptic augmentation system enables any real surface to be augmented with a minimal setup.
It also allows a free exploration of the textures, as if they were real (\secref[related_work]{ar_presence}), by letting the user view them from different poses and touch them with the bare finger without constraints on hand movements.

@@ -18,4 +18,3 @@ This system forms the basis of the apparatus for the user studies presented in t
%Erwan Normand, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
%\enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}.
%In: \textit{ACM Symposium on Virtual Reality Software and Technology}. Trier, Germany, October 2024. pp. 287--296.

@@ -2,29 +2,29 @@
\label{intro}

In the previous chapter, we investigated the role of the visual feedback of the virtual hand and the environment (\AR \vs \VR) on the perception of wearable haptic texture augmentation.
-In this chapter, we explore the perception of wearable visuo-haptic texture augmentation of real surfaces touched directly with the finger in an immersive \AR context and without a virtual hand overlay.
+In this chapter, we explore the perception of wearable visuo-haptic texture augmentation of real surfaces touched directly with the finger.

When we look at the surface of an everyday object, we often then touch it to confirm or contrast our initial visual impression and to estimate the properties of the object, particularly its texture (\secref[related_work]{visual_haptic_influence}).
Among the various haptic texture augmentations, data-driven methods make it possible to capture, model, and reproduce the roughness perception of real surfaces when touched by a hand-held stylus (\secref[related_work]{texture_rendering}).
-Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
+Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in a direct touch context with \AR and wearable haptics.

-In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
+In this chapter, we consider simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} with an \OST-\AR headset and wearable vibrotactile feedback.
+We investigate how these textures can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored in direct touch the combination of the visuo-haptic texture pairs to rate their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.

\noindentskip The contributions of this chapter are:
\begin{itemize}
-\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in immersive \AR.
+\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in \AR.
\item A user study evaluating with 20 participants the coherence, realism, and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}

\smallskip

\fig[0.55]{experiment/view}{First person view of the user study.}[
-%As seen through the immersive \AR headset.
The visual texture overlays were statically displayed on the surfaces, allowing the user to move around to view them from different angles.
-The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle index phalanx.% as it slides on the considered surface.
+The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle index phalanx.
]

\noindentskip In the next sections, we first describe the apparatus and experimental design of the user study, including the two tasks performed. We then present the results obtained and discuss them before concluding.

@@ -29,7 +29,7 @@ Several strategies were reported: some participants first classified visually an
While visual sensation did influence perception, as observed in previous haptic \AR studies \cite{punpongsanon2015softar,gaffary2017ar,fradin2023humans}, haptic sensation dominated here.
This indicates that participants were more confident in, and relied more on, the haptic roughness perception than the visual roughness perception when integrating both into one coherent perception.

-Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges or patterns, that were not reported when these textures were displayed in non-immersive \VEs with a screen \cite{culbertson2014modeling,culbertson2015should}.
+Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges, or patterns, which were not reported when these textures were displayed in \VEs using a screen \cite{culbertson2014modeling,culbertson2015should}.
A few participants even reported that they clearly sensed patterns on haptic textures.
However, the visual and haptic textures used were isotropic and homogeneous models of real texture captures, \ie their rendered roughness was constant and did not depend on the direction of movement but only on the speed of the finger (\secref[related_work]{texture_rendering}).
Overall, the haptic device was judged to be comfortable, and the visual and haptic textures were judged to be fairly realistic and to work well together (\figref{results_questions}).

@@ -1,7 +1,7 @@
\section{Conclusion}
\label{conclusion}

-In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen in immersive \OST-\AR and touched directly with the index finger.
+In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen with an \OST-\AR headset and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and finger speed.
In a user study, 20 participants rated the coherence, realism, and perceived roughness of the combination of nine representative visuo-haptic texture pairs.

@@ -15,7 +15,7 @@ This paves the way for new \AR applications capable of augmenting a real environ
The latter is illustrated in \figref{experiment/use_case}, where a user applies different visuo-haptic textures to a wall, in an interior design scenario, to compare them visually and by touch.

We instinctively perceive the properties of everyday objects by touching and exploring them, but we essentially interact with them by grasping in order to manipulate them.
-In this first part, we focused on the perception of wearable and immersive virtual textures that augment real surfaces when touched with the fingertip.
+In this first part, we focused on the perception of virtual visuo-haptic textures that augment real surfaces when touched with the fingertip.
In the next part, we will propose to improve direct hand manipulation of virtual objects with wearable visuo-haptic interaction feedback.

\noindentskip The work described in \chapref{vhar_textures} was presented at the EuroHaptics 2024 conference:

@@ -1,7 +1,7 @@
\section{Introduction}
\label{intro}

-In the previous chapter, we presented a system for augmenting the visuo-haptic texture perception of real surfaces directly touched with the finger, using wearable vibrotactile haptics and an immersive \AR headset.
+In the previous chapter, we presented a system for augmenting the visuo-haptic texture perception of real surfaces directly touched with the finger, using wearable vibrotactile haptics and an \OST-\AR headset.
In this chapter and the next one, we evaluate the user's perception of such wearable haptic texture augmentation under different visual rendering conditions.

Most of the haptic augmentations of real surfaces using wearable haptic devices, including the roughness of textures, have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR (\secref[related_work]{texture_rendering}).

@@ -29,5 +29,5 @@ Thereby, we hypothesize that the differences in the perception of vibrotactile r
|
||||
The perceived delay was the most important in \AR, where the virtual hand visually lags significantly behind the real one, but less so in \VR, where only the proprioceptive sense can help detect the lag.
|
||||
This delay was not perceived when touching the virtual haptic textures without visual augmentation, because only the finger velocity was used to render them, and, despite the varied finger movements and velocities while exploring the textures, the participants did not perceive any latency in the vibrotactile rendering (\secref{results_questions}).
|
||||
\textcite{diluca2011effects} similarly demonstrated, in a \VST-\AR setup, how visual latency relative to proprioception increased the perceived stiffness of a virtual piston, while haptic latency decreased it (\secref[related_work]{ar_vr_haptic}).
Another complementary explanation could be a pseudo-haptic effect (\secref[related_work]{visual_haptic_influence}) of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen \cite{ujitoko2019modulating}.
Such hypotheses could be tested by manipulating the latency and pose estimation accuracy of the virtual hand or the vibrotactile feedback. % to observe their effects on the roughness perception of the virtual textures.
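To illustrate why a purely velocity-driven rendering introduces no perceivable lag, here is a minimal sketch of such a grating-based vibrotactile signal, where the instantaneous frequency depends only on the current finger speed. The function name, spatial period, and sampling step are illustrative assumptions, not the system's actual implementation.

```python
import math

def texture_vibration(velocities, dt=0.001, spatial_period=0.002, amplitude=1.0):
    """Render a vibrotactile signal for a virtual grating texture.

    The instantaneous frequency is the finger speed divided by the
    grating's spatial period, so each output sample depends only on the
    current velocity sample -- no buffering, hence no added latency.
    """
    phase = 0.0
    samples = []
    for v in velocities:
        # Advance the phase by the instantaneous frequency |v| / period.
        phase += 2 * math.pi * (abs(v) / spatial_period) * dt
        samples.append(amplitude * math.sin(phase))
    return samples

# Example: a finger sliding at a constant 0.1 m/s over a 2 mm grating
# produces a 50 Hz vibration.
sig = texture_vibration([0.1] * 1000)
```

The design point is that the signal is a memoryless function of the tracked velocity, so any tracking delay shifts the stimulus imperceptibly rather than desynchronizing it from the movement.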
In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and of the environment, which were either real, augmented, or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
With an \OST-\AR headset, which could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
Our results showed that the visual virtuality of the hand (real or virtual) and the environment (\AR or \VR) had a significant effect on the perception of haptic textures and the exploration behaviour of the participants.
Manipulation of virtual objects is achieved using a virtual hand interaction technique that represents the user's hand in the \VE and simulates interaction with virtual objects (\secref[related_work]{ar_virtual_hands}).
The visual feedback of the virtual hand is a key element for interacting and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}.
Some work has also investigated the visual feedback of the virtual hand in \AR, but either not in a context of virtual object manipulation \cite{al-kalbani2016analysis,yoon2020evaluating} or limited to a single visual hand augmentation \cite{piumsomboon2014graspshell,blaga2017usability,maisto2017evaluation}.
\Gls{OST}-\AR also has significant perceptual differences from \VR, due to the lack of mutual occlusion between the hand and the virtual object in \OST-\AR (\secref[related_work]{ar_displays}) and the inherent delays between the user's hand and the result of the interaction simulation (\secref[related_work]{ar_virtual_hands}).
In this chapter, we investigate the \textbf{visual rendering of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects with an \OST-\AR headset.
To this end, we selected from the literature and compared the most popular visual hand augmentations used to interact with virtual objects in \AR.
The virtual hand is \textbf{displayed superimposed} on the user's hand with these visual renderings, providing \textbf{feedback on the tracking} of the real hand, as shown in \figref{hands}.
The movement of the virtual hand is also \textbf{constrained to the surface} of the virtual object, providing additional \textbf{feedback on the interaction} with the virtual object.
All visual hand augmentations showed \response{Grip Apertures} close to the size of the virtual cube, except for the \level{None} rendering (\figref{results/Grasp-GripAperture}), with which participants applied stronger grasps, \ie less distance between the fingertips.
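For illustration, the \response{Grip Aperture} measure can be sketched as the Euclidean distance between the tracked thumb and index fingertips. This is a minimal sketch under our own naming assumptions, not the study's analysis code.

```python
import math

def grip_aperture(thumb_tip, index_tip):
    """Grip aperture: Euclidean distance between thumb and index fingertips (metres)."""
    return math.dist(thumb_tip, index_tip)

# A grasp closing slightly inside a 4 cm virtual cube (positions in metres):
# the fingertips end up 3.5 cm apart, i.e. "inside" the cube surface.
aperture = grip_aperture((0.02, 0.0, 0.0), (-0.015, 0.0, 0.0))
```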
Having no visual hand augmentation, but only the reaction of the cube to the interaction as feedback, made participants less confident in their grip.
This result contrasts with the wrongly estimated grip apertures observed by \textcite{al-kalbani2016analysis} in an exocentric \VST-\AR setup.
Also, while some participants found the absence of visual hand augmentation more natural, many of them commented on the importance of having feedback on the tracking of their hands, as observed by \textcite{xiao2018mrtouch} with an \OST-\AR headset.
Yet, participants' opinions of the visual hand augmentations were mixed on many questions, except for the \level{Occlusion} one, which was perceived as less effective than more \enquote{complete} visual hands such as the \level{Contour}, \level{Skeleton}, and \level{Mesh} hands (\figref{results_questions}).
However, due to the latency of the hand tracking and of the visual hand reacting to the cube, almost all participants took the \level{Occlusion} rendering to be a \enquote{shadow} of the real hand on the cube.
This result is consistent with \textcite{saito2021contact}, who found that displaying the points of contact was more beneficial for grasping a virtual object than an opaque visual hand overlay.
To summarize, when employing a visual feedback of the virtual hand overlaying the real hand, participants performed better and were more confident in manipulating virtual objects with bare hands in \AR.
These results contrast with similar manipulation studies, but in on-screen \AR, where the presence of a visual hand augmentation was found by participants to improve the usability of the interaction, but not their performance \cite{blaga2017usability,maisto2017evaluation,meli2018combining}.
Our results show the most effective visual hand augmentation to be the \level{Skeleton} one.
Participants appreciated that it provided a detailed and precise view of the tracking of the real hand, without hiding or masking it.
Although the \level{Contour} and \level{Mesh} hand renderings were also highly rated, some participants felt that they were too visible and masked the real hand.
\section{Conclusion}
\label{conclusion}
In this chapter, we addressed the challenge of touching, grasping and manipulating virtual objects directly with the hand using an \OST-\AR headset.
To do so, we proposed to evaluate visual renderings of the virtual hand as augmentation of the real hand.
Superimposed on the user's hand, these visual renderings provide feedback from the virtual hand, which tracks the real hand and, as a proxy, simulates the interaction with virtual objects.
We first selected and compared the six most popular visual hand augmentations used to interact with virtual objects in \AR.
Then, in a user study with 24 participants, we evaluated the effect of these six visual hand augmentations on the user performance and experience in two representative manipulation tasks.
Our results showed that a visual hand augmentation improved the performance, perceived effectiveness and confidence of participants compared to no augmentation.
A skeleton rendering, which provided a detailed view of the tracked joints and phalanges while not hiding the real hand, was the most efficient and effective.
However, these studies were conducted in non-immersive setups, with a screen displaying the \VE view.
In fact, either hand feedback on its own may provide sufficient sensory information for efficient direct hand manipulation of virtual objects in \AR, or, conversely, the two may prove complementary.
In this chapter, we investigate the role of \textbf{visuo-haptic feedback of the hand when manipulating virtual objects} using an \OST-\AR headset and wearable vibrotactile haptics.
We selected \textbf{four different delocalized positionings on the hand} that have been previously proposed in the literature for direct hand interaction in \AR using wearable haptic devices (\secref[related_work]{vhar_haptics}): on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand.
We focused on vibrotactile feedback, as it is used in most wearable haptic devices and has the lowest encumbrance.
In a \textbf{user study}, using the \OST-\AR headset Microsoft HoloLens~2 and two \ERM vibrotactile motors, we evaluated the effect of the four positionings with \textbf{two contact vibration techniques} on the user performance and experience with the same two manipulation tasks as in \chapref{visual_hand}.
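The two contact vibration techniques can be sketched as follows: a fixed-length burst triggered at contact onset, versus an amplitude that follows the simulated contact force (an ERM motor only allows amplitude control). The function names, burst duration, and force scale are illustrative assumptions, not the implementation evaluated in the study.

```python
def contact_burst(contact, prev_contact, duration_ms=80):
    """Technique 1: trigger a fixed-amplitude burst when a fingertip contact begins.

    Returns the burst duration in ms on a rising contact edge, 0 otherwise.
    """
    return duration_ms if (contact and not prev_contact) else 0

def force_amplitude(force, max_force=5.0):
    """Technique 2: ERM amplitude (0..1) proportional to the simulated contact force (N)."""
    return max(0.0, min(force / max_force, 1.0))
```

A usage example: `contact_burst(True, False)` fires an 80 ms burst on a new contact, while `force_amplitude(2.5)` drives the motor at half amplitude during a 2.5 N press.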
\section{Conclusion}
\label{conclusion}
In this chapter, we investigated the visuo-haptic feedback of the hand when manipulating virtual objects using an \OST-\AR headset and wearable vibrotactile haptics.
To do so, we provided vibrotactile feedback of the fingertip contacts with virtual objects by relocating the haptic actuators to positions that do not cover the inside of the hand: on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand.
We selected these four different delocalized positions on the hand from the literature for direct hand interaction in \AR using wearable haptic devices.
In a user study, we compared twenty visuo-haptic feedback combinations for the hand, pairing two vibrotactile contact techniques, provided at five different delocalized positions on the user's hand, with the two most representative visual hand augmentations established in \chapref{visual_hand}, \ie the skeleton hand rendering and no hand rendering.
\section{Summary}
In this manuscript, we showed how \OST-\AR headsets and wearable haptics can improve direct hand interaction with virtual and augmented objects. % by augmenting the perception of the real and manipulation of the virtual.
Wearable haptics can provide rich tactile feedback on virtual objects and augment the perception of real objects, both directly touched by the hand, while preserving freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical and human challenges.
We have structured this thesis around two research axes: \textbf{(I) modifying the visuo-haptic texture perception of real surfaces} and \textbf{(II) improving the manipulation of virtual objects}.
\noindentskip In \partref{perception}, we focused on the perception of wearable virtual textures that augment real surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
It is also one of the most studied haptic augmentations, but has not yet been integrated into \AR or \VR.
We \textbf{(1) proposed a wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual feedback of the virtual hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
In \chapref{vhar_system}, we presented a system for \textbf{augmenting any real surface} with virtual \textbf{roughness textures with visuo-haptic feedback} using an \OST-\AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
It enables \textbf{free visual and touch exploration} of the textures as if they were real, letting the user view them from different angles and touch them with the bare finger without constraints on hand movement.
The user studies in the next two chapters were based on this system.
We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of the combination of nine visuo-haptic texture pairs.
Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics dominating perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
\noindentskip In \partref{manipulation}, we focused on improving the manipulation of virtual objects directly with the hand using an \OST-\AR headset.
Our approach was to design visual augmentations of the hand and delocalized haptic feedback, based on the literature, and evaluate them in user studies.
We first considered \textbf{(1) the visual augmentation of the hand} and then the \textbf{(2)} combination of different \textbf{visuo-haptic feedback of the hand when manipulating virtual objects}.
However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white real surfaces.
Visuo-haptic texture augmentation might be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes.
The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D printed object before it is manufactured.
Finally, the visual textures used were simple color images not intended for use in an \ThreeD \VE, and enhancing their visual quality could improve the perception of visuo-haptic texture augmentation.
\comans{JG}{As future work, the effect of visual quality of the rendered textures on texture perception could also be of interest.}{A sentence along these lines has been added.}
\paragraph{Specificities of Direct Touch.}
\paragraph{AR Displays.}
The visual hand augmentations we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
We purposely chose this type of display, because in \OST-\AR the lack of mutual occlusion between the hand and the virtual object is the most challenging to solve \cite{macedo2023occlusion}.
We therefore hypothesized that a visual hand augmentation would be more beneficial to users with this type of display.
However, the user's visual perception and experience are different with other types of displays, such as \VST-\AR, where the \RE view is seen through cameras and screens (\secref[related_work]{ar_displays}).
While the mutual occlusion problem and the hand pose estimation latency could be overcome with \VST-\AR, the visual hand augmentation could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.
\section{Perspectives}
Our goal was to improve direct hand interaction with virtual objects using wearable haptic devices and an \OST-\AR headset.
We aimed to provide more plausible and coherent perception and more natural and effective manipulation of the visuo-haptic augmentations.
Our contributions have enabled progress towards a seamless integration of the virtual into the real world.
They also allow us to outline longer-term research perspectives.
We saw how complex the sense of touch is (\secref[related_work]{haptic_hand}).
Multiple sensory receptors all over the skin allow us to perceive different properties of objects, such as their texture, temperature, weight or shape.
Particularly concentrated in the hands, cutaneous sensory feedback, together with the muscles, is crucial for grasping and manipulating objects.
In this manuscript, we showed how wearable haptic devices can provide virtual tactile sensations to support direct hand interaction with an \OST-\AR headset.
We investigated both the visuo-haptic perception of texture augmenting real surfaces (\partref{perception}) and the manipulation of virtual objects with visuo-haptic feedback of hand contact with virtual objects (\partref{manipulation}).
However, unlike the visual sense, which can be fully immersed in the virtual using an \AR/\VR headset, there is no universal wearable haptic device that can reproduce all the haptic properties perceived by the hand (\secref[related_work]{wearable_haptics}).
We reviewed the diversity of \AR and \VR displays and their respective characteristics in rendering (\secref[related_work]{ar_displays}) and in the manipulation of virtual content with the hand (\chapref{visual_hand}).
The diversity of wearable haptic devices and of the different sensations they can provide is even greater (\secref[related_work]{wearable_haptics}) and an active research topic \cite{pacchierotti2017wearable}.
Coupling wearable haptics with \AR headsets also requires the haptic actuator to be placed elsewhere on the body than at the hand's contact points (\secref[related_work]{vhar_haptics}).
In particular, in this thesis we investigated the perception of haptic texture augmentation using a vibrotactile device on the middle phalanx (\chapref{vhar_system}) and also compared different positions of the haptics on the hand for manipulating virtual objects (\chapref{visuo_haptic_hand}).
Haptic feedback should be provided close to the hand's point of contact with the virtual content, to enhance the realism of texture augmentation (\chapref{vhar_textures}) and to render contact with virtual objects (\chapref{visuo_haptic_hand}), \eg rendering fingertip contact with a haptic ring worn on the middle or proximal phalanx.
An important aspect of the illusion of AR (and VR) is \emph{plausibility}, i.e. the illusion for a user that virtual events are really happening \cite{slater2022separate}. %, even if the user knows they are not real.
In this context, we define an \emph{AR system} as the set of hardware (input devices, sensors, displays, and haptic devices) and software (tracking, simulation, and rendering) components that allow the user to interact with the augmented environment.
AR headsets are the most promising display technology, as they are portable, provide the user with an \emph{immersive} augmented environment, and leave the hands free to interact \cite{hertel2021taxonomy}.
Haptic feedback is then essential to ensure a plausible and coherent interaction with the virtual visual content.
This is why wearable haptics seem particularly well suited to AR headsets.
\subsectionstarbookmark{Challenges of Wearable Visuo-Haptic Augmented Reality}
Integrating wearable haptics with an AR headset to create a visuo-haptic augmented environment is however complex and presents many challenges.
We propose to represent the user's experience in such an environment as an interaction loop, illustrated in \figref{interaction-loop-fr} and based on interaction loops with 3D systems \cite[p.84]{laviolajr20173d}.
A user interacts with the visual and haptic virtual environments through a virtual hand that follows their movements and simulates the interaction with virtual objects.
The virtual environments are rendered back to the user with an AR headset and wearable haptics, and are perceived as co-located with the real environment.
To this end, we: \textbf{(1) design a visuo-haptic texture augmentation system} with wearable vibrotactile haptics; \textbf{(2) evaluate how the perception of wearable haptic texture augmentations is affected by the visual feedback of the virtual hand and of the environment}; \textbf{(3) study the perception of wearable visual and haptic texture augmentations}.
These contributions are detailed in \secref{perception}.
The rendering limitations of AR headsets and of wearable haptics make it difficult to manipulate virtual objects directly with the hand.
Two sensory feedbacks can improve this manipulation but have not been studied in AR: the visual feedback of the virtual hand \cite{prachyabrued2014visual} and haptic feedback relocated on the hand \cite{teng2021touch}.
For this second research axis, we propose to design visuo-haptic augmentations of the hand as sensory feedback on the interactions with virtual objects.
To this end, we study the effect on user performance and experience of \textbf{(1) the visual feedback of the virtual hand as an augmentation of the real hand} and of \textbf{(2) different relocations of the haptic feedback}, using wearable vibrotactile haptics as feedback on the hand's contacts with virtual objects, \textbf{in combination with visual hand augmentations}.
These contributions are detailed in \secref{manipulation}.
The vibrations are generated by a \textit{voice-coil} vibrotactile device, which allows independent control of the frequency and amplitude of the signal.
This device is placed in a handheld tool or attached directly to the finger.
When played while touching a real surface, these vibrations augment the perceived roughness, i.e. the micro-asperities of the surface \cite{culbertson2015should}.
However, this method has not yet been integrated with an AR headset.
\begin{subfigs}{vhar-system}{Our wearable visuo-haptic texture augmentation system. }[][
\item The HapCoil-One \textit{voice-coil} vibrotactile device, fitted with a tracking marker and attached to the middle phalanx of the user's index finger.
\label{manipulation}
For this second research axis, we propose to design and evaluate visuo-haptic sensory feedback of the hand and of its interactions with virtual objects.
The goal is to facilitate the manipulation of virtual objects with AR headsets.
\subsectionstarbookmark{Visual Feedback of the Virtual Hand as an Augmentation of the Real Hand}
\subfig[.45]{visuo-haptic-hand-task-grasp-fr}
\end{subfigs}
This is why we study the role of \textbf{visuo-haptic feedback of the hand when manipulating virtual objects} with an AR headset, using wearable vibrotactile haptics.
We first selected \textbf{four placements of the haptic device on the hand} that have been proposed in the literature: on the nails, the proximal phalanges, the wrist, and the nails of the opposite hand (\figref{visuo-haptic-hand-locations-fr}).
We focused on vibrotactile feedback, as it is present in most wearable haptic devices and is the least cumbersome.
In practice, we used two ERM vibrotactile motors, as they are the most compact and do not affect hand tracking \cite{pacchierotti2016hring}, although they only allow control of the signal amplitude.
However, the most distant placement, on the opposite hand, yielded the best performance, even though it was little appreciated: this unusual placement probably led participants to pay more attention to the haptic feedback and to concentrate more on the task.
The contact vibration technique proved sufficient compared to a more elaborate technique modulating vibration intensity with contact force.
The visual hand augmentation was perceived as less necessary than the vibrotactile haptic feedback, but still provided useful feedback on the tracking of the hand.
This study confirms that relocating the haptic feedback is a simple but promising approach for wearable haptics with AR headsets.
If the integration with the hand tracking system allows it and the task requires it, a haptic ring worn on the middle or proximal phalanx seems preferable: this is the approach we used in our research axis on texture augmentations (\secref{perception}).
However, a wrist-mounted haptic device could provide richer feedback by integrating several haptic actuators, while potentially being less obtrusive than a ring.
\section{Conclusion}
\label{conclusion}
In this thesis manuscript, we have shown how an AR headset and wearable haptics can improve hand interactions with virtual and augmented objects.
Wearable haptic devices can provide tactile feedback for virtual objects and augment the perception of real objects touched with the finger, while preserving the hand's freedom of movement and interaction with the real environment.
However, their integration with AR is still recent and raises many design, technical, and user-experience challenges.
We structured this thesis around two research axes: \textbf{(I) altering the visuo-haptic perception of real surface textures} and \textbf{(II) improving the manipulation of virtual objects}.
\noindentskip We also sought to improve the manipulation of virtual objects directly with the hand.
Manipulating virtual objects is a fundamental task in 3D systems, but it remains difficult to perform with the hand.
We therefore explored two types of sensory feedback known to improve this kind of interaction but not yet studied with AR headsets: visual feedback of the virtual hand and haptic feedback relocated on the hand.
%Our approach consisted in designing visual hand augmentations and relocated wearable haptic feedback, based on the literature, and evaluating them in user studies.
We therefore first examined \textbf{(1) visual hand augmentation}.