Compare commits


5 Commits

Author SHA1 Message Date
43037c8407 Fix intro 2025-04-21 12:40:26 +02:00
035603eee7 Auto-remove space before footnotes 2025-04-21 12:25:17 +02:00
9f4e7fb8c7 Fix intro 2025-04-21 12:25:00 +02:00
60110bd64e Typo 2025-04-21 12:18:43 +02:00
beb2dce3bb Typo 2025-04-19 11:52:58 +02:00
6 changed files with 80 additions and 66 deletions

.gitignore vendored

@@ -147,7 +147,6 @@ acs-*.bib
# knitr
*-concordance.tex
# TODO Uncomment the next line if you use knitr and want to ignore its generated tikz files
# *.tikz
*-tikzDictionary


@@ -18,7 +18,6 @@ In daily life, \textbf{we simultaneously look at, touch and manipulate the every
Many of these object properties can be perceived in a complementary way through all our sensory modalities, such as their shape or material \cite{baumgartner2013visual}.
Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to anticipate properties that we cannot see, \eg weight or temperature.
Information from different sensory sources can be complementary, redundant or contradictory \cite{ernst2004merging}.
We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
@@ -30,11 +29,11 @@ This rich and complex variety of actions and sensations makes it particularly \t
\textbf{\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch} \cite{klatzky2013haptic}.
Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}.
\emph{Graspable} interfaces are the traditional haptic devices that are held in the hand.
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
\emph{Touchable} interfaces are actuated devices directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
Instead, \textbf{\emph{wearable} interfaces are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}.
\begin{subfigs}{haptic-categories}{
Haptic devices can be divided into three categories according to their interface with the user:
@@ -50,8 +49,8 @@ Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to
A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback.
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, social interaction, and for improving hand interaction in \AR \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, the \textbf{integration of wearable haptics with \AR has been little explored}.
\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can provide sensations on the skin as feedback to real or virtual objects being touched.
@@ -72,11 +71,14 @@ However, the \textbf{integration of wearable haptics with \AR has been little ex
\textbf{\emph{Augmented Reality (\AR)} integrates virtual content into the real world perception, creating the illusion of a unique \emph{\AE}} \cite{azuma1997survey,skarbez2021revisiting}.
It thus promises natural and seamless interaction with physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}.
It could be used to render purely \emph{virtual objects} in the \RE, or to create \emph{augmented objects} by modifying the perception of a real object with virtual content, such as changing its shape or texture.
It is technically and conceptually closely related to \emph{\VR}, which completely replaces \emph{\RE} perception with a \emph{\VE}.
\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}.
\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects into a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}
It describes the degree of virtuality of the environment along an axis, with one end being \RE and the other end being pure \VE, \ie indistinguishable from the real world.
Between these two extremes lies \MR, which includes \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.
\footnote{This is the original and classic definition of \MR, but there is still debate about how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
\begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[][
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.
@@ -86,26 +88,25 @@ Between these two extremes lies \MR, which includes \AR and \VR as different lev
\subfig[0.49]{visuo-haptic-rv-continuum5}
\end{subfigs}
%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
\textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR with two orthogonal reality-virtuality continuums, one for vision and one for touch, as shown in \figref{visuo-haptic-rv-continuum5}.
The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual.
For example, visual \AR using a real object as a proxy to manipulate a virtual object is considered \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}).
Conversely, a device that provides synthetic haptic feedback when touching a virtual object is considered \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}).
\textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli}, such that the virtual haptic sensations modify the perception of the real object \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}).
%In particular, it has been implemented by augmenting the haptic perception of real objects by providing timely virtual tactile stimuli using wearable haptics:
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous wearable pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}).
\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture using reverse electrovibration when running the finger over a real surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum5}).
In this thesis we call \AR/\VR \emph{systems} the computational set of hardware (input devices, sensors, displays and haptic devices) and software (tracking, registration, simulation, and rendering) that allows the user to interact with the \VE.
Many \AR displays have been explored, from projection systems to hand-held displays.
\textbf{\AR headsets are the most promising display technology because they create a portable experience that allows the user to navigate the \AE and interact with it directly using their hands} \cite{hertel2021taxonomy}.
While \AR and \VR systems can address any of the human senses, most focus only on visual augmentation \cites[p.144]{billinghurst2015survey}{kim2018revisiting}.
\emph{Presence} is the illusion of \enquote{being there} when in \VR, or the illusion that the virtual content \enquote{feels here} when in \AR \cite{slater2022separate,skarbez2021revisiting}.
One of the most important aspects of this illusion is the \emph{plausibility}, \ie the illusion that the virtual events are really happening.
However, \textbf{when an \AR/\VR headset lacks haptic feedback, it may create a deceptive and incomplete user experience when the hand reaches the visual virtual content}.
All visual virtual objects are inherently intangible and cannot physically constrain or provide touch feedback to the hand of the user.
This makes it difficult to perceive their properties and interact with them with confidence and efficiency.
It is also necessary to provide a haptic feedback that is coherent with the virtual content and ensures the best possible user experience.
\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][
\item \AR environment with a real haptic object used as a proxy to manipulate a virtual object \cite{kahl2023using}.
@@ -120,54 +121,73 @@ The \textbf{integration of wearable haptics with \AR headsets appears to be one
\subfig{bau2012revel}
\end{subfigs}
\subsectionstarbookmark{Visuo-Haptic Augmentations}
Providing coherent and effective haptic feedback to visual virtual and augmented objects in \AR is complex.
To identify the challenges involved, we propose to \textbf{represent the user's experience with such a visuo-haptic \AE as an interaction loop}, shown in \figref{interaction-loop}.
It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}.
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using an \AR headset and wearable haptics.
It is important that the visuo-haptic \VE is registered with the \RE and rendered in real time.
This gives the user the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.
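The track-simulate-render cycle just described can be sketched minimally in code. Everything below, from the reduction of tracking to a single height value to the penalty-based contact model and all names, is an illustrative assumption, not the actual system:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """A simulated contact between the virtual hand and a virtual object."""
    finger: str
    force: float  # simulated normal force, in newtons (illustrative)

class Simulation:
    """Stand-in physics step: reports a contact when the tracked hand
    penetrates the surface of a virtual object lying below surface_z."""
    def step(self, hand_z, surface_z=0.0):
        penetration = surface_z - hand_z
        if penetration > 0:
            # Penalty-based contact model: force proportional to penetration.
            return [Contact("index", 500.0 * penetration)]
        return []

def loop_step(hand_z, sim, render_visual, render_haptic):
    # 1. Track the real hand (here reduced to a single height value).
    # 2. Simulate contacts between the virtual hand and the virtual object.
    contacts = sim.step(hand_z)
    # 3. Render feedback: visual overlay on the AR headset, and tactile
    #    stimuli on the wearable haptic device.
    render_visual(contacts)
    render_haptic(contacts)
    return contacts
```

Running this step at display rate, with tracking and rendering registered to the real environment, is what sustains the illusion described above.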
\comans{SJ}{The chapter could benefit from some expansion. For instance, the current introduction tries to describe the scope of the research in haptic AR but lacks sufficient background on the general issues in this domain. As a result, it may not be very helpful for readers unfamiliar with the field in understanding the significance of the thesis's focus and positioning it within the broader context of haptic AR research.}{This section has been added to provide a better overview of the general research challenges of visuo-haptic augmentations.}
\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic \AE as proposed in this thesis.}[
A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with virtual objects.
The visual and haptic \VEs are rendered back using an \AR headset and wearable haptics, and are perceived by the user to be registered and co-localized with the \RE (in gray).
\protect\footnotemark
]
Implementing such an interaction loop with an \AR system involves design and technical problems \cite{jeon2015haptic,marchand2016pose}.
One of the \textbf{key issues is the \emph{registration} of the \VE with the \RE}.
It is the precise spatial and temporal alignment of the virtual content with the \RE such that a user perceives the virtual as part of the real world.
For visual \AR, a real camera is usually attached to the \AR display to estimate in real time the \emph{pose} (position and orientation) of the display within the \RE.
The virtual camera that captures the \VE is set to the same pose and intrinsic parameters (focal length, angle of view, distortion) as the real camera \cite{marchand2016pose}.
The user is then shown the combined images from the real and virtual cameras.
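In standard pin-hole camera notation, this registration constraint can be summarized as follows (our own sketch, not a formulation taken from the cited works):

```latex
% A real-world point, in homogeneous coordinates, projects onto the image as:
\[
  \mathbf{p}_{\mathrm{img}} \simeq
  \underbrace{\mathbf{K}}_{\text{intrinsics}}
  \, [\,\mathbf{R} \mid \mathbf{t}\,] \,
  \tilde{\mathbf{p}}_{\mathrm{world}},
\]
% where (R, t) is the pose of the real camera estimated in real time.
% Rendering the virtual camera with the same K, R and t makes real and
% virtual points project to the same pixels, registering the VE with the RE.
```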
With haptic \AR, the haptic feedback must be similarly registered with the \RE \cite{jeon2015haptic}, \eg adding a virtual force in the same direction and at the same time as a real force when pressing a surface with a tool.
Depending on the type of haptic device and the haptic property to be rendered, this registration process involves different sensors, estimation techniques, and time requirements.
This can be difficult to achieve and remains one of the main research challenges in haptic \AR.
In addition, visual and haptic \textbf{\AR systems require models to simulate the interaction} with the \VE \cite{jeon2015haptic,marchand2016pose}.
As mentioned above, models of the \RE are needed to properly register the \VE, but also of the real objects to be augmented and manipulated.
Haptic \AR also often needs models of the user's contacts with the augmented objects.
Depending on the rendered haptic property and the required feedback fidelity, these can be complex estimates of the contacts (points, forces) and object properties (shape, material, mass, deformation, etc.).
Computational and rendering models are also needed to render the virtual visual and haptic stimuli to the user.
While \ThreeD visual rendering is mature, rendering haptic properties still faces many research challenges due to the complexity of the human touch \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, a balance has to be found between the perceptual accuracy of all models used and the real-time constraints of the interaction loop.
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
\label{research_challenges}
The integration of wearable haptics with \AR headsets to create a visuo-haptic \AE is a promising solution, but presents many perceptual and interaction challenges.
In this thesis, we focus on two main research challenges:
\textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and
\textbf{(II) enabling effective manipulation of the \AE}.
Each of these challenges also raises numerous design, technical, perceptual and user experience issues specific to wearable haptics and \AR headsets.
\comans{JG}{While these research axes seem valid, it could have been described more clearly how they fit into the overarching research fields of visuo-haptic augmentations.}{The previous section has been expanded to better describe the general research challenges of visuo-haptic augmentations.}
\footnotetext{%
The icons are \href{https://creativecommons.org/licenses/by/3.0/}{CC BY} licensed:
\enquote{\href{https://thenounproject.com/icon/finger-pointing-4230346/}{finger pointing}} by \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay},
\enquote{\href{https://thenounproject.com/icon/hololens-1499195/}{HoloLens}} by \href{https://thenounproject.com/creator/daniel2021/}{Daniel Falk}, and
\enquote{\href{https://thenounproject.com/icon/vibration-6478365/}{vibrations}} by \href{https://thenounproject.com/creator/iconbunny/}{Iconbunny}.
}
\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations}
\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing the user with rich kinesthetic and tactile feedback on virtual objects, increasing the realism and effectiveness of interaction with them \cite{culbertson2018haptics}.
Although closely related, \AR and \VR headsets have key differences in their respective renderings that can affect user perception.
%As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
Many hand-held or wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR headsets.
The \textbf{user's hand must be free to touch and interact with the \RE while wearing a wearable haptic device}.
Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2019tasbi,sarac2022perceived} for rendering fingertip contact with virtual content.
Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are perceived as co-localized, but the virtual haptic feedback is not}.
Such potential visuo-haptic discrepancies may negatively affect the user's perception of the registration of the \VE with the \RE.
This remains to be investigated to understand how to design visuo-haptic augmentations adapted to \AR headsets.
%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
Therefore, the \textbf{added visual and haptic virtual sensations may also be perceived as incoherent} with each other, for example with a lower rendering quality, a temporal latency, a spatial misalignment, or a combination of these.
It could also be caused by the limited rendering capabilities of wearable haptic devices \cite{pacchierotti2017wearable}.
Finally, the user can still see the \RE with an \AR headset, including their hands, augmented real objects and worn haptic devices, unlike \VR where there is total control over the visual rendering.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic devices that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.
@@ -176,21 +196,19 @@ With a better understanding of \textbf{how visual factors can influence the perc
Touching, \textbf{grasping and manipulating virtual objects are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}.
In the previous challenge section, we described how the user's hand should be free to interact with the \RE using a wearable haptic device.
We can then expect a seamless and direct manipulation of the virtual content with the hands, as if it were real.
%Since the hand is not occupied or covered with a haptic device so as to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real.
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging}, especially without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
When touching a real object that is visually augmented, the user's hand is physically constrained by the real object, allowing for easy and natural interaction.
In addition, wearable haptic devices are limited to cutaneous feedback, and cannot provide forces to constrain the hand contact with the virtual object \cite{pacchierotti2017wearable}.
Current \AR headsets have visual rendering limitations that also affect interaction with virtual objects.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
However, the depth perception of virtual objects is often underestimated \cite{peillard2019studying,adams2022depth}.
There is also often \textbf{a lack of mutual occlusions between the hand and a virtual object}, that is the hand can hide the object or be hidden by the object \cite{macedo2023occlusion}.
Finally, as illustrated in \figref{interaction-loop}, interaction with a virtual object is an illusion, because the real hand controls in real time a virtual hand, like an avatar, whose contacts with virtual objects are then simulated in the \VE.
Therefore, there is inevitably a latency between the movements of the real hand and the feedback movements of the virtual object, and a spatial misalignment between the real hand and the virtual hand, whose movements are constrained to the virtual object touched \cite{prachyabrued2014visual}.
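This proxy coupling can be illustrated with a minimal one-dimensional, god-object-style sketch; the setup and names are our assumptions, not a specific system's implementation:

```python
def constrained_virtual_hand(real_z, surface_z=0.0):
    """1-D proxy (god-object) coupling: the virtual hand follows the tracked
    real hand but is constrained to stay on the virtual object's surface."""
    # The virtual object occupies z < surface_z: the proxy cannot enter it.
    virtual_z = max(real_z, surface_z)
    # The resulting spatial misalignment between real and virtual hand.
    offset = virtual_z - real_z
    return virtual_z, offset
```

Pressing \SI{2}{\centi\meter} \enquote{into} the object (`real_z = -0.02`) leaves the virtual hand on the surface, with a \SI{2}{\centi\meter} real-virtual offset that the user can see but not feel without haptic feedback.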
These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp the virtual object and move it to a desired location.
Hence, it is necessary to provide feedback that allows the user to efficiently contact, grasp and manipulate a virtual object with the hand.
Yet, it is unclear which type of visual and wearable haptic feedback, or their combination, is best suited to guide the manipulation of a virtual object.
\section{Approach and Contributions}
\label{contributions}
@@ -204,13 +222,13 @@ Our approach is to:
We consider two main axes of research, each addressing one of the research challenges identified above:
\begin{enumerate*}[label=(\Roman*)]
\item \textbf{augmenting the visuo-haptic texture perception of real surfaces}, and
\item \textbf{improving the manipulation of virtual objects}.
\end{enumerate*}
Our contributions are summarized in \figref{contributions}.
\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[
The contributions are represented in dark grey boxes, the research axes in green circles.
The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand.
The second axis focuses on \textbf{(II)} improving the manipulation of virtual objects with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]
@@ -218,8 +236,6 @@ Our contributions are summarized in \figref{contributions}.
\subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces}
Wearable haptic devices have proven effective in modifying the perception of a touched real surface, without altering the object or covering the fingertip, forming haptic augmentation \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
However, wearable haptic augmentation combined with \AR has been little explored, as has the visuo-haptic augmentation of texture.
Texture is indeed one of the most fundamental perceived properties of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
For this first axis of research, we propose to \textbf{design and evaluate the perception of wearable virtual visuo-haptic textures augmenting real surfaces}.
Finally, visuo-haptic texture databases have been created from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3} to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures with an \AR headset and wearable haptics remains to be investigated.
Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR.
\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
With wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented and virtual objects.
Hence, a user can expect natural and direct contact and manipulation of virtual objects with the bare hand.
However, the intangibility of the visual \VE, the display limitations of current \AR headsets and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with virtual objects particularly challenging.
Two particular sensory feedbacks are known to improve such direct virtual object manipulation, but have not been properly investigated with \AR headsets: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to \textbf{design and evaluate visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in \AR.
We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of virtual object manipulation with the hand in combination with visual hand augmentations.
First, the visual feedback of the virtual hand is a key element for interacting and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}.
Some work has also investigated the visual feedback of the virtual hand in \AR, but not in a context of virtual object manipulation with a headset \cite{blaga2017usability,yoon2020evaluating} or was limited to a single visual hand augmentation \cite{piumsomboon2014graspshell,maisto2017evaluation}.
\AR headsets also present significant perceptual differences from \VR, due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}.
Thus, our fourth objective is to \textbf{investigate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects.
Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations and interactions with the \RE.
Our last objective is to \textbf{investigate the delocalized haptic feedback of virtual object manipulation with the hand}.
\section{Thesis Overview}
\label{thesis_overview}
With this current \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis.
In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.
In \textbf{\partref{manipulation}}, we describe our contributions to the second axis of research.
In \textbf{\chapref{visual_hand}}, we investigate in a user study six visual feedback techniques as hand augmentations, selected among the most popular hand augmentations in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand.
In \textbf{\chapref{visuo_haptic_hand}}, we evaluate in a user study delocalized haptic feedback for the manipulation of virtual objects with the hand, using two vibrotactile contact techniques provided at five different positions on the hand.
They are compared with the two most representative visual hand augmentations from the previous chapter, resulting in twenty visuo-haptic hand feedbacks that are evaluated within the same experimental setup and design.
\noindentskip


Some studies have investigated the visuo-haptic perception of virtual objects rendered in \AR.
In \VST-\AR, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.
\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).
\begin{subfigs}{visuo-haptic-stiffness}{


\usepackage[hang]{footmisc}
\setlength{\footnotemargin}{3mm}% Margin between footnote number and text
\NewCommandCopy{\oldfootnote}{\footnote}
\renewcommand{\footnote}{\ifhmode\unskip\fi\oldfootnote}% Remove space before footnote
% Less hbox/vbox badness messages
\hfuzz=20pt
\vfuzz=20pt
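The \footnote redefinition above swallows any inter-word glue typed before the command, so the footnote mark always sits flush against the preceding word. A minimal sketch of the effect (the sentence and footnote text are illustrative):

```latex
% Both calls now produce identical output: the footnote mark is
% attached directly to ``surfaces'', even with a stray space typed
% before the command.
Haptic augmentation of real surfaces\footnote{An illustrative note.}
Haptic augmentation of real surfaces \footnote{An illustrative note.}
```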




}
%% Footnotes
\newcommand{\footnotemarkrepeat}{\footnotemark[\value{footnote}]} % Repeat the last footnote mark
\newcommand{\footnoteurl}[1]{\footnote{\ \url{#1}}}
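As a usage sketch for these footnote helpers (the URL and sentences are illustrative): \footnoteurl places a \url inside a footnote preceded by a small gap, and \footnotemarkrepeat re-emits the mark of the most recent footnote without creating a new note:

```latex
% Footnote containing a clickable URL (illustrative address):
The HoloLens~2 headset\footnoteurl{https://example.com/hololens2} tracks the hand.
% Later, repeat the same footnote mark instead of duplicating the note:
Its companion software\footnotemarkrepeat{} relies on the same tracking data.
```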
%% Lists
\newcommand{\figref}[1]{Fig.~\ref{fig:#1}}
\newcommand{\secref}[1]{Sec.~\ref{sec:#1}}
\newcommand{\tabref}[1]{Table~\ref{tab:#1}}
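A short usage sketch of these cross-referencing shorthands (the labels are illustrative and assumed to be defined elsewhere in the document):

```latex
% Expand to ``Fig.~<n>'', ``Sec.~<n>'' and ``Table~<n>'' respectively,
% assuming \label{fig:contributions}, \label{sec:thesis_overview} and
% \label{tab:user_study} exist.
As summarized in \figref{contributions} and outlined in \secref{thesis_overview},
the results in \tabref{user_study} cover both research axes.
```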
%% Structure
\newcommand{\chapterstartoc}[1]{