Corrections from Claudio's comments
@@ -4,8 +4,8 @@
\chaptertoc
%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
This thesis shows how immersive \AR, which integrates visual virtual content into the real world perception, and wearable haptics, worn on the outside of the hand, can improve the free and direct interaction of the hand with virtual and augmented objects.
Our goal is to enable users to perceive and interact with wearable visuo-haptic augmentations in a more realistically and effectively, as if they were real. %interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
This thesis shows how immersive \AR, which integrates visual virtual content into the real world perception, and wearable haptics, which provides tactile sensations on the skin, can improve the free and direct interaction of virtual objects with the hand.
Our goal is to enable users to perceive and interact with wearable visuo-haptic augmentations in a more realistic and effective way, as if they were real. %interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
@@ -24,7 +24,7 @@ We then \textbf{instinctively construct a unified perception of the properties o
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
This is due to the many sensory receptors distributed throughout our hands and body.
These receptors can be divided into two modalities: \emph{kinesthetic} (or proprioception), which conveys the forces felt by muscles and tendons, and \emph{cutaneous} (or tactile), which conveys the pressures, stretches, vibrations and temperatures felt by the skin.
This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}.
This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics,pacchierotti2023cutaneous}.
\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}
@@ -175,9 +175,11 @@ With a better understanding of \textbf{how visual factors can influence the perc
\subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment}
Touching, \textbf{grasping and manipulating virtual objects are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}.
Since the hand is not occupied or covered with a haptic device so as to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real.
When augmenting a real object, the user's hand is physically constrained by the object, allowing for easy and natural interaction.
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging} without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In the previous challenge section, we described how the user's hand should be free to interact with the \RE using a wearable haptic device.
We can then expect a seamless and direct manipulation of the virtual content with the hands, as if it were real.
%Since the hand is not occupied or covered with a haptic device so as to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real.
When touching a visually augmented real object, the user's hand is physically constrained by the object, allowing for easy and natural interaction.
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging}, especially without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In addition, current \AR systems have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
@@ -194,11 +196,11 @@ Yet, it is unclear which type of visual and haptic feedback, or their combinatio
\label{contributions}
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
As described in the research challenges section, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues.
As we described in \secref{research_challenges}, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues.
Our approach is to:
\begin{enumerate*}[label=(\arabic*)]
\item design immersive and wearable visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
\item evaluate in user studies how these visuo-haptic renderings affect the interaction of the hand with these objects using psychophysical, performance and user experience methods.
\item evaluate in user studies how these visuo-haptic renderings affect the interaction of these objects with the hand using psychophysical, performance and user experience methods.
\end{enumerate*}
We consider two main axes of research, each addressing one of the research challenges identified above:
@@ -222,7 +224,7 @@ Wearable haptic devices have proven effective in modifying the perception of a t
However, wearable haptic augmentation combined with \AR has been little explored, as has the visuo-haptic augmentation of texture.
Texture is indeed one of the most fundamental perceived properties of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
%Coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards a \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
For this first axis of research, we propose to \textbf{design and evaluate the perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, which we use to (2) evaluate how the perception of haptic texture augmentations is affected by visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations.
First, an effective approach to render haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
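As a minimal illustration of this principle (a simplified sketch, not the data-driven models of the cited work), the vibration for a periodic texture can be synthesized as a sinusoid whose frequency follows the finger sliding speed; the function name and parameter values below are hypothetical.
\begin{verbatim}
import numpy as np

def texture_vibration(finger_speed, spatial_period=0.002,
                      fs=2000, duration=0.05, amplitude=1.0):
    # finger_speed: sliding speed in m/s, assumed to come from hand tracking
    # spatial_period: distance between texture ridges in m (hypothetical value)
    # fs: sample rate of the actuator drive signal in Hz
    t = np.arange(0, duration, 1.0 / fs)
    # Temporal frequency of ridge crossings: f = v / lambda
    f = finger_speed / spatial_period
    return amplitude * np.sin(2 * np.pi * f * t)

# Example: sliding at 0.1 m/s over a 2 mm grating vibrates at 50 Hz
signal = texture_vibration(0.1)
\end{verbatim}
In practice, the sliding speed is updated continuously from hand tracking, and the data-driven models cited above additionally modulate the signal with the contact force.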
@@ -245,14 +247,14 @@ Hence, a user can expect natural and direct contact and manipulation of virtual
However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with virtual objects particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with virtual objects with bare hands particularly challenging.
Two particular sensory feedbacks are known to improve such direct virtual object manipulation, but have not been properly investigated in immersive \AR: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to design and evaluate \textbf{visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in immersive \OST-\AR.
For this second axis of research, we propose to \textbf{design and evaluate visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in immersive \OST-\AR.
We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of virtual object manipulation with the hand in combination with visual hand augmentations.
First, the visual feedback of the virtual hand is a key element for interacting and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}.
Some work has also investigated the visual feedback of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability}, but not in an immersive context of virtual object manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and virtual objects \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
\OST-\AR also has significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}.
%, and these visual hand augmentations have not been evaluated .
Thus, our fourth objective is to \textbf{the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects.
Thus, our fourth objective is to \textbf{investigate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects.
Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations and interactions with the \RE.
Previous work has shown that wearable haptics that provide feedback on hand manipulation with virtual objects in \AR can significantly improve user performance and experience \cite{maisto2017evaluation,meli2018combining}.
@@ -264,7 +266,7 @@ Our last objective is to \textbf{investigate the delocalized haptic feedback of
%This thesis is divided in four parts.
%In \textbf{\partref{background}}, we first describe the context and background of our research, within which
With this first current \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis.
With this current \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis.
In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
First, we overview how the hand perceives and manipulates real objects.
@@ -278,27 +280,27 @@ We then address each of our two research axes in a dedicated part.
In \textbf{\partref{perception}}, we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces.
We evaluate how the visual feedback of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device.
In \textbf{\chapref{vhar_system}}, we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The tracking of the real hand and the environment is achieved using a marker-based technique.
The visual rendering is done using the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
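As an illustration of what such a marker-based tracking step can look like (a hedged sketch using OpenCV's ArUco fiducials, an assumption and not necessarily the library used in our system; camera intrinsics and marker size are placeholders):
\begin{verbatim}
import cv2
import numpy as np

# Placeholder camera intrinsics; a real system would use calibrated values.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
MARKER_SIZE = 0.04  # marker side length in metres (hypothetical)

# OpenCV >= 4.7 ArUco API; older versions expose cv2.aruco.detectMarkers instead.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def surface_pose(gray_image):
    # Return the (rvec, tvec) pose of the first detected marker, or None.
    corners, ids, _ = detector.detectMarkers(gray_image)
    if ids is None:
        return None
    half = MARKER_SIZE / 2.0
    # 3D marker corners in the marker frame (z = 0 plane), in ArUco corner order.
    object_points = np.array([[-half,  half, 0.0], [ half,  half, 0.0],
                              [ half, -half, 0.0], [-half, -half, 0.0]],
                             dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, corners[0].reshape(4, 2), K, dist)
    return (rvec, tvec) if ok else None
\end{verbatim}
The estimated pose of the marker attached to the surface (or to the hand) is then used to register the visual and haptic augmentations to the \RE.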
In \textbf{\chapref{xr_perception}} we investigate in a user study how different the perception of haptic texture augmentations is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
In \textbf{\chapref{xr_perception}}, we investigate in a user study how the perception of haptic texture augmentations differs in \AR \vs \VR and when touching with a virtual hand \vs one's own hand.
We use psychophysical methods to measure user perception and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented or virtual).
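As an illustration of this kind of analysis (a hypothetical example, not the exact procedure of the study), a psychometric function can be fitted to the proportion of trials in which a comparison texture is judged rougher than a reference, yielding a point of subjective equality (PSE) and a just-noticeable difference (JND):
\begin{verbatim}
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data: comparison levels and proportion judged "rougher".
levels = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
p_rougher = np.array([0.05, 0.20, 0.55, 0.80, 0.95])

def psychometric(x, pse, sigma):
    # Cumulative Gaussian: probability of judging the comparison as rougher.
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, levels, p_rougher, p0=[1.0, 0.2])
jnd = sigma * norm.ppf(0.75)  # distance from the PSE to the 75% point
print(f"PSE = {pse:.2f}, JND = {jnd:.2f}")
\end{verbatim}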
In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations directly touched with the real hand in \AR.
In \textbf{\chapref{vhar_textures}}, we evaluate in a user study the perception of visuo-haptic texture augmentations directly touched with the real hand in \AR.
The virtual textures are paired visual and haptic captures of real surfaces \cite{culbertson2014one}, which we render as visual and haptic overlays on the touched augmented surfaces.
Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs.
\noindentskip
In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving direct hand manipulation of virtual objects using visuo-haptic augmentations of the hand as interaction feedback with virtual objects in immersive \OST-\AR.
In \textbf{\partref{manipulation}}, we describe our contributions to the second axis of research: improving direct hand manipulation of virtual objects using visuo-haptic augmentations of the hand as interaction feedback with virtual objects in immersive \OST-\AR.
In \textbf{\chapref{visual_hand}} we investigate in a user study six visual feedback as hand augmentations, as a set of the most popular hand augmentation in the \AR literature.
In \textbf{\chapref{visual_hand}}, we investigate in a user study six visual feedback techniques as hand augmentations, representing the most popular hand augmentations in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand.
In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study delocalized haptic feedback of hand manipulation with virtual object using two vibrotactile contact techniques provided at five different positionings on the hand. %, as haptic rendering of virtual object manipulation with the hand.
In \textbf{\chapref{visuo_haptic_hand}}, we evaluate in a user study delocalized haptic feedback of hand manipulation with a virtual object, using two vibrotactile contact techniques provided at five different positions on the hand. %, as haptic rendering of virtual object manipulation with the hand.
They are compared with the two most representative visual hand augmentations from the previous chapter, resulting in twenty visuo-haptic hand feedbacks that are evaluated within the same experimental setup and design.
\noindentskip
In \textbf{\chapref{conclusion}} we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.
In \textbf{\chapref{conclusion}}, we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.
@@ -5,14 +5,15 @@
The haptic sense has specific characteristics that make it unique compared to the other senses.
It enables us to perceive a large diversity of properties of everyday objects, through a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but especially in the hand \cite{johansson2009coding}.
It also allows us to act on these objects with the hand, to come into contact with them, to grasp them and to actively explore them. % , and to manipulate them.
It also allows us to act on these objects, to come into contact with them, to grasp them and to actively explore them. % , and to manipulate them.
%This implies that the haptic perception is localized at the points of contact between the hand and the environment, \ie we cannot haptically perceive an object without actively touching it.
These two mechanisms, \emph{action} and \emph{perception}, are closely associated and both are essential to form a complete haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.
\subsection{The Haptic Sense}
\label{haptic_sense}
Perceiving the properties of an object involves numerous sensory receptors embedded in the skin, but also in the muscles and joints of the hand, and distributed throughout the body. They can be divided into two main modalities: \emph{cutaneous} and \emph{kinesthetic} \cite{lederman2009haptic}.
Perceiving the properties of an object involves numerous sensory receptors embedded in the skin, but also in the muscles and joints of the hand, and distributed throughout the body.
They can be divided into two main modalities, depending on their location in the body: \emph{cutaneous} and \emph{kinesthetic} \cite{lederman2009haptic}.
\subsubsection{Cutaneous Modality}
\label{cutaneous_sensitivity}
@@ -60,17 +61,17 @@ Finally, free nerve endings (without specialized receptors) provide information
\subsubsection{Kinesthetic Modality}
\label{kinesthetic_sensitivity}
Kinesthetic receptors are also mechanoreceptors, but are located in muscles, tendons and joints \cite{jones2006human}.
Kinesthetic receptors are the mechanoreceptors located in muscles, tendons and joints \cite{jones2006human}.
Muscle spindles respond to the length and rate of stretch/contraction of muscles.
Golgi tendon organs, located at the junction of muscles and tendons, respond to the force developed by the muscles.
Ruffini and Pacini receptors are located in the joints and respond to joint movement.
Ruffini and Pacini receptors (\secref{cutaneous_sensitivity}) are located in the joints and respond to joint movement.
Together, these receptors provide sensory feedback about the movement, speed and strength of the muscles and the rotation of the joints during movement.
They can also sense external forces and torques applied to the body.
Kinesthetic receptors are therefore closely linked to the motor control of the body.
By providing sensory feedback in response to the position and movement of our limbs, they enable us to perceive our body in space, a perception called \emph{proprioception}.
This allows us to plan and execute precise movements to touch or grasp a target, even with our eyes closed.
Cutaneous mechanoreceptors (\secref{cutaneous_sensitivity}) are also involved in proprioception \cite{johansson2009coding}.
Cutaneous mechanoreceptors within the skin are also involved in proprioception \cite{johansson2009coding}.
\subsection{Hand-Object Interactions}
\label{hand_object_interactions}
@@ -67,7 +67,7 @@ Multiple actuators are often combined in a haptic device to provide richer feedb
The moving platforms translate perpendicularly to the skin to create sensations of contact, pressure and edges \cite{pacchierotti2017wearable}.
Placed under the fingertips, they can come into contact with the skin with different forces, speeds and orientations.
The platform is moved by means of cables, \eg in \figref{gabardi2016new}, or articulated arms, \eg in \figref{perez2017optimizationbased}, activated by motors grounded to the nail \cite{gabardi2016new,perez2017optimizationbased}.
The platform is moved by means of cables, \eg in \figref{gabardi2016new}, or articulated arms, \eg in \figref{chinello2017three}, activated by motors grounded to the nail \cite{gabardi2016new,chinello2017three}.
The motors lengthen and shorten the cables or orient the arms to move the platform with 3 \DoFs: two for orientation and one for normal force relative to the finger.
However, these platforms are specifically designed to provide haptic feedback to the fingertip in \VEs, preventing interaction with a \RE.
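To make the 3-\DoF actuation concrete, the following idealized sketch (our own simplification, not the controller of the cited devices) computes the three cable lengths that realize a desired platform roll, pitch and normal indentation, assuming attachment points evenly spaced around the platform and anchors fixed above them on the nail-mounted frame:
\begin{verbatim}
import numpy as np

R_PLATFORM = 0.008   # attachment radius on the platform, in metres (hypothetical)
REST_HEIGHT = 0.010  # rest distance between the anchors and the platform

# Anchor points on the nail-mounted frame, 120 degrees apart, at z = 0.
angles = np.deg2rad([90.0, 210.0, 330.0])
anchors = np.stack([R_PLATFORM * np.cos(angles),
                    R_PLATFORM * np.sin(angles),
                    np.zeros(3)], axis=1)

def cable_lengths(roll, pitch, indentation):
    # Desired platform orientation (rad) and extra displacement towards the skin (m).
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    # Platform attachment points, rotated then pushed towards the fingerpad (-z).
    points = (Ry @ Rx @ anchors.T).T + np.array([0.0, 0.0, -(REST_HEIGHT + indentation)])
    return np.linalg.norm(points - anchors, axis=1)

# Example: 5 degrees of roll and 1 mm of indentation.
print(cable_lengths(np.deg2rad(5.0), 0.0, 0.001))
\end{verbatim}
A motor controller would then shorten or lengthen each cable to these target lengths; articulated-arm designs solve an analogous inverse kinematics for joint angles.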
@@ -85,13 +85,13 @@ Although these two types of effector can be considered wearable, their actuation
Normal indentation actuators for the fingertip.
}[][
\item A moving platform actuated with cables \cite{gabardi2016new}.
\item A moving platform actuated by articulated limbs \cite{perez2017optimizationbased}.
\item A moving platform actuated by articulated limbs \cite{chinello2017three}.
\item Diagram of a pin-array of tactors \cite{sarakoglou2012high}.
\item A pneumatic system composed of a \numproduct{12 x 10} array of air cylinders \cite{ujitoko2020development}.
]
\subfigsheight{37mm}
\subfig{gabardi2016new}
\subfig{perez2017optimizationbased}
\subfig{chinello2017three}
\subfig{sarakoglou2012high}
\subfig{ujitoko2020development}
\end{subfigs}
@@ -222,9 +222,11 @@ An alternative is to use a single \emph{universal} tangible object like a hand-h
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any virtual object, \eg by placing the tangible into the virtual object and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
Still, the virtual visual rendering and the real haptic sensations can be incoherent.
Especially in \OST-\AR, since the virtual objects are inherently slightly transparent allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the virtual objects does not affect user performance or presence, and that small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified real obejcts in \AR whose spatial properties (\secref{object_properties}) abstract those of the virtual objects.
In \VR, some discrepancy between the real and virtual objects is acceptable because the real object is not visible to the user \cite{detinguy2019how,detinguy2019universal}.
In \AR, however, the real object may be partially or fully visible, and the user can see that their hand is not touching the real and virtual objects at the same time.
This is particularly true in \OST-\AR, where the virtual objects are inherently slightly transparent allowing the paired real objects to be seen through them \cite{macedo2023occlusion}.
In a pick-and-place task with real objects in \OST-\AR, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the virtual objects does not affect user performance or presence, and small variations (\percent{\sim 10} for size) are not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the virtual objects.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this approach could be used to render coherent visuo-haptic material perceptions of objects directly touched with the hand in \AR.
\begin{subfigs}{ar_tangibles}{Manipulating virtual objects through real objects. }[][
BIN  2-related-work/figures/chinello2017three.jpg (new file, 33 KiB, binary file not shown)