Corrections
\chaptertoc
%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
This PhD manuscript shows how immersive \AR, which integrates visual virtual content into the real world perception, and wearable haptics, worn on the outside of the hand, can improve the free and direct interaction of the hand with virtual objects.
Our goal is to achieve a more plausible and coherent perception, as well as a more natural and effective manipulation of the visuo-haptic augmentations. %interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
\subsectionstarbookmark{Hand Interaction with Everyday Objects}
In daily life, \textbf{we simultaneously look at, touch and manipulate the everyday objects} around us without even thinking about it.
Many properties of these objects, such as their shape, size or texture, can be perceived in a complementary way through all our sensory modalities \cite{baumgartner2013visual}.
Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
Information from different sensory sources can be complementary, redundant or contradictory \cite{ernst2004merging}.
This is why we sometimes want to touch an object to check a property we have seen, comparing or confronting our visual and tactile sensations.
We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
This is due to the many sensory receptors distributed throughout our hands and body.
These receptors can be divided into two modalities: \emph{kinesthetic} (or proprioception), which are the forces felt by muscles and tendons, and \emph{cutaneous} (or tactile), which are the pressures, stretches, vibrations and temperatures felt by the skin.
This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}.
\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}
\textbf{\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch} \cite{klatzky2013haptic}.
Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}.
\emph{Graspable interfaces} are the traditional haptic devices that are held in the hand.
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
\emph{Touchable interfaces} are actuated devices that are directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}.
\begin{subfigs}{haptic-categories}{
Haptic devices can be divided into three categories according to their interface with the user:
}[][
\item graspable (hand-held),
\item touchable, and
A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback.
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, augmented and virtual realities, and social interaction \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, the \textbf{integration of wearable haptics with \AR has been little explored so far}. %, and few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.
\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can provide sensations on the skin as feedback to real or virtual objects being touched.
}[][
\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}.
\item Touch\&Fold, a nail-mounted wearable haptic device that folds on demand to provide contact, normal force and vibration to the fingertip \cite{teng2021touch}.
\item The hRing, a wearable haptic ring mounted on the proximal phalanx able to render normal and shear forces to the finger \cite{pacchierotti2016hring}.
\item Tasbi, a haptic bracelet capable of providing pressure and vibrotactile feedback to the wrist \cite{pezent2022design}.
]
It thus promises natural and seamless interaction with the physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}.
It is technically and conceptually closely related to \emph{\VR}, which completely replaces the \emph{\RE} perception with a \emph{\VE}.
\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects into a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
Between these two extremes lies \MR, which includes \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still debate about how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
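The continuum can be pictured as a scalar degree of virtuality. As a loose illustration only (the region names and the numeric threshold below are our own simplifying assumptions, not part of \textcite{milgram1994taxonomy}, since the continuum is by definition continuous and the AR/MR boundaries remain debated), one could map such a value to a region of the continuum:

```python
from enum import Enum

class Region(Enum):
    """Illustrative regions of the reality-virtuality continuum."""
    REAL = "real environment"
    AR = "augmented reality"
    VR = "virtual reality"
    VIRTUAL = "pure virtual environment"

def continuum_region(virtuality: float) -> Region:
    """Map a degree of virtuality in [0, 1] to a region of the continuum.

    The 0.5 threshold between AR and VR is an arbitrary placeholder:
    the actual boundary is conceptual, not numeric.
    """
    if not 0.0 <= virtuality <= 1.0:
        raise ValueError("virtuality must lie in [0, 1]")
    if virtuality == 0.0:
        return Region.REAL
    if virtuality == 1.0:
        return Region.VIRTUAL
    return Region.AR if virtuality < 0.5 else Region.VR
```

The two endpoints are exact (a purely real or purely virtual environment), while everything in between is some mixture of the two.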
We call \AR/\VR \emph{systems} the combination of hardware (input devices, sensors and displays) and software (tracking, simulation and rendering) that allows the user to interact with the \VE.% by implementing the interaction loop we proposed in \figref{interaction-loop}.
|
||||
\AR and \VR systems can address any of the human senses, but they most often focus on visual augmentations \cite[p.144]{billinghurst2015survey}.
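Conceptually, the software side of such a system runs a continuous loop over the three stages named above: tracking, simulation and rendering. A minimal sketch of this loop, where the three stage functions are hypothetical placeholders rather than the API of any actual \AR/\VR framework:

```python
def interaction_loop(track, simulate, render, frames=3, dt=1.0 / 90.0):
    """Run a conceptual AR/VR interaction loop for a few frames.

    track()               -> user pose from sensors (tracking)
    simulate(pose, s, dt) -> next virtual-environment state (simulation)
    render(state)         -> visual/haptic output to displays (rendering)
    All three are caller-supplied placeholders; dt is the frame budget
    (here a 90 Hz target, a common refresh rate for headsets).
    """
    state = None
    for _ in range(frames):
        pose = track()                     # sense the user and the real world
        state = simulate(pose, state, dt)  # update the virtual environment
        render(state)                      # present feedback to the user
    return state
```

For instance, with stub stages, `interaction_loop(lambda: 0, lambda p, s, dt: (s or 0) + 1, lambda s: None)` returns `3` after three frames.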
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.
\item Extension to include the haptic sense on a second, orthogonal axis, proposed by and adapted from \textcite{jeon2009haptic}.
]
\subfig[0.49]{rv-continuum}
\subfig[0.49]{visuo-haptic-rv-continuum5}
\end{subfigs}
%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with different degrees of reality-virtuality. }[][
\item \AR environment with a real haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
\item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
\item A real object seen in a visual \VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{detinguy2018enhancing,salazar2020altering}.
\item Visuo-haptic texture augmentation of a real object being touched, using a hand-held \AR display and haptic electrovibration feedback \cite{bau2012revel}.
]
\subfigsheight{31mm}
Many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
The \textbf{user's hand must indeed be free to touch and interact with the \RE while wearing a wearable haptic device}.
It is possible instead to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not}.
It remains to be investigated how such potential discrepancies affect the overall perception, in order to design visuo-haptic augmentations adapted to \AR.
%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
The \textbf{added visual and haptic virtual sensations may be perceived as inconsistent} with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
Moreover, in \AR, the user can still see the real world surroundings, including their hands, the augmented real objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or even plausible, and to what extent they will conflict or complement each other. % in the perception of the \AE.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic augmentations adapted to \AR can be designed.
%It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator.
However, wearable haptic augmentations have been little explored with \AR, and neither has the visuo-haptic augmentation of textures.
Texture is indeed one of the fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
Being able to coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, to (2) evaluate how the perception of haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations.
%In \textbf{\partref{background}}, we first describe the context and background of our research, within which
In this first \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis.
In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
First, we overview how the hand perceives and manipulates real objects.
Second, we present wearable haptics and haptic augmentations of texture and hardness of real objects.
Third, we introduce \AR, and how \VOs can be manipulated directly with the hand.
Finally, we describe how visuo-haptic feedback has augmented direct hand interaction in \AR, particularly using wearable haptics.
We then address each of our two research axes in a dedicated part.