Corrections

commit 13662fa735 (parent ae2535debf)
2024-10-13 00:23:15 +02:00
23 changed files with 142 additions and 110 deletions


@@ -3,9 +3,11 @@
\chaptertoc
%This PhD manuscript shows how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception of the virtual content and its manipulation.
This PhD manuscript shows how immersive \AR, which integrates visual virtual content into the perception of the real world, and wearable haptics, worn on the outside of the hand, can improve free and direct interaction of the hand with virtual objects.
Our goal is to achieve a more plausible and coherent perception, as well as a more natural and effective manipulation of the visuo-haptic augmentations. %interaction of the hand with the virtual content.%, moving towards a seamless integration of the virtual into the real world.
%We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
\section{Visual and Haptic Object Augmentations}
\label{visuo_haptic_augmentations}
@@ -20,7 +22,8 @@ This is why we sometimes want to touch an object to check one of its properties
We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
This is due to the many sensory receptors distributed throughout our hands and body.
These receptors can be divided into two modalities: \emph{kinesthetic} (or proprioception), which are the forces felt by muscles and tendons, and \emph{cutaneous} (or tactile), which are the pressures, stretches, vibrations and temperatures felt by the skin.
This rich and complex variety of actions and sensations makes it particularly \textbf{difficult to artificially recreate capabilities of touch}, for example in virtual or remote operating environments \cite{culbertson2018haptics}.
\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}
@@ -47,8 +50,8 @@ Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to
A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback.
\figref{wearable-haptics} shows examples of wearable haptic devices with different form factors and rendering capabilities.
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, augmented and virtual realities, and social interactions \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, the \textbf{integration of wearable haptics with \AR has been little explored so far}.%, and few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.
\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can render sensations on the skin as feedback when touching real or virtual objects.
@@ -73,11 +76,13 @@ It is technically and conceptually closely related to \emph{\VR}, which complete
\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects to a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still a debate on how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
We call \AR/\VR \emph{systems} the computational set of hardware (input devices, sensors and displays) and software (tracking, simulation and rendering) that allows the user to interact with the \VE.% by implementing the interaction loop we proposed in \figref{interaction-loop}.
\AR and \VR systems can address any of the human senses, but they most often focus on visual augmentations \cite[p.144]{billinghurst2015survey}.
Many visual displays have been explored, from projection systems to hand-held displays.
\textbf{\AR headsets are the most promising displays as they are portable and provide the user with an immersive augmented environment.}
%but the most \textbf{promising devices are \AR headsets}, which are \textbf{portable displays worn directly on the head}, providing the user with an \textbf{immersive visual augmented environment}.
\begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[][
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.
@@ -90,7 +95,7 @@ Many visual displays have been explored, from projection systems to hand-held di
%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
\textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR systems with two orthogonal reality-virtuality continua, one for vision and one for touch, as illustrated in \figref{visuo-haptic-rv-continuum5}.
The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual.
For example, (visual) \AR that uses a real object as a proxy to manipulate a \VO is considered as \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered as \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}).
\textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli} \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}).
In particular, it has been implemented by augmenting the haptic perception of real objects by providing timely virtual tactile stimuli using wearable haptics:
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}).
@@ -119,21 +124,30 @@ The \textbf{integration of wearable haptics with \AR} seems to be one of the mos
The integration of wearable haptics with \AR to create a visuo-haptic augmented environment is complex and presents many perceptual and interaction challenges.
% \ie sensing the \AE and acting effectively upon it.
In this thesis, we propose to \textbf{represent the experience of a user with such a visuo-haptic augmented environment as an interaction loop}, shown in \figref{interaction-loop}.
It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}.
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using an \AR/\VR headset and wearable haptics.
Because the visuo-haptic \VE is displayed in real time and aligned with the \RE, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.
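As a minimal sketch of this loop (with hypothetical names and interfaces; the actual systems are detailed in the following chapters), each frame could be processed as follows:
\begin{verbatim}
# Illustrative sketch of the visuo-haptic interaction loop described
# above; all device and function names are hypothetical placeholders.
def interaction_loop(sensors, simulation, headset, haptic_device):
    while True:
        # 1. Track the real environment and the user's hand.
        real_poses = sensors.track_hand_and_environment()
        # 2. Update the virtual replicas in the visual and haptic VEs.
        simulation.update_virtual_hand_and_objects(real_poses)
        # 3. Simulate contacts between the virtual hand and objects.
        contacts = simulation.step_physics()
        # 4. Render feedback co-localized with the real environment.
        headset.render(simulation.visual_scene())  # visual augmentation
        haptic_device.render(contacts)  # wearable tactile feedback
\end{verbatim}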
\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment as proposed in this thesis.}[
A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs.
The visual and haptic \VEs are rendered back to the user registered and co-localized with the \RE (in gray) using an immersive \AR headset and wearable haptics.
\protect\footnotemark
]
In this context, we focus on two main research challenges:
\textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and
\textbf{(II) enabling effective manipulation of the augmented environment}.
Each of these challenges also raises numerous design, technical and human issues specific to wearable haptics and immersive \AR.
%, as well as virtual rendering and user experience issues.% in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic \AE.
\footnotetext{%
The icons are \href{https://creativecommons.org/licenses/by/3.0/}{CC BY} licensed:
\enquote{\href{https://thenounproject.com/icon/finger-pointing-4230346/}{finger pointing}} by \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay},
\enquote{\href{https://thenounproject.com/icon/hololens-1499195/}{HoloLens}} by \href{https://thenounproject.com/creator/daniel2021/}{Daniel Falk}, and
\enquote{\href{https://thenounproject.com/icon/vibration-6478365/}{vibrations}} by \href{https://thenounproject.com/creator/iconbunny/}{Iconbunny}.
}
\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations}
@@ -145,45 +159,48 @@ Although closely related, \AR and \VR have key differences in their respective r
Many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
The \textbf{user's hand must indeed be free to touch and interact with the \RE while wearing a wearable haptic device}.
It is possible instead to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{pezent2022design,sarac2022perceived} for rendering fingertip contacts with virtual content.
Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not}.
How such discrepancies affect the overall perception remains to be investigated in order to design visuo-haptic augmentations adapted to \AR.
%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
The \textbf{added visual and haptic virtual sensations can be perceived as inconsistent} with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
Moreover, in \AR, the user can still see the real-world surroundings, including their hands, the augmented real objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering. % of the hand and \VE.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or even plausible, and to what extent they will conflict or complement each other. % in the perception of the \AE.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.
\subsectionstarbookmark{Challenge II: Enabling Effective Manipulation of the Augmented Environment}
Touching, \textbf{grasping and manipulating \VOs are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}.
As described in the previous section, the hand is not occupied or covered by a haptic device, so as not to impair interaction with the \RE; one can thus expect a seamless and direct manipulation of the virtual content with the hand, as if it were real.
When augmenting a real object, the user's hand is physically constrained by the object, allowing for easy and natural interaction.
However, \textbf{manipulating a purely \VO with the bare hand can be challenging} without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
But the perceived depth of \VOs is often underestimated \cite{peillard2019studying,adams2022depth}.
There is also often \textbf{a lack of mutual occlusion between the hand and a \VO}, \ie the hand can hide the object or be hidden by it \cite{macedo2023occlusion}.
Finally, as illustrated in \figref{interaction-loop}, interacting with a \VO is an illusion, because in fact the real hand is controlling in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE.
Therefore, there is inevitably a latency between the real hand's movements and the \VO's movements in response, and a spatial shift between the real hand and the virtual hand, whose movements are constrained by the touched \VO \cite{prachyabrued2014visual}.
This makes it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp and move the object to a desired location.
Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a \VO with the hand.
Yet, it is unclear which type of visual and haptic feedback, or their combination, is the best suited to guide the \VO manipulation.%, and whether one or the other of a combination of the two is most beneficial for users.
\section{Approach and Contributions}
\label{contributions}
%The aim of this thesis is to understand how immersive visual and wearable haptic augmentations complement each other in the context of direct hand perception and manipulation with virtual and augmented objects.
As described in the Research Challenges section above, providing a coherent and effective visuo-haptic augmented environment to a user is complex and raises many issues.
Our approach is to
\begin{enumerate*}[label=(\arabic*)]
\item design immersive and wearable visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
\item evaluate in user studies how these visuo-haptic renderings affect the interaction of the hand with these objects using psychophysical, performance, and user experience methods.
\end{enumerate*}
We consider two main axes of research, each addressing one of the research challenges identified above:
\begin{enumerate*}[label=(\Roman*)]
\item \textbf{augmenting the texture perception of real surfaces}, and % with visuo-haptic texture augmentations, and
\item \textbf{improving the manipulation of virtual objects}.% with visuo-haptic augmentations of the hand-object interaction.
\end{enumerate*}
Our contributions in these two axes are summarized in \figref{contributions}.
@@ -194,7 +211,7 @@ Our contributions in these two axes are summarized in \figref{contributions}.
The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]
\subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces}
Wearable haptic devices have proven to be effective in altering the perception of a touched real surface, without modifying the object or covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
@@ -203,24 +220,25 @@ However, wearable haptic augmentations have been little explored with \AR, as we
Texture is indeed one of the fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
Being able to coherently substitute the visuo-haptic texture of an everyday surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, (2) evaluate how the perception of haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations.
First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, to achieve natural interaction with the hand and coherent visuo-haptic feedback, this requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
Thus, our first objective is to \textbf{design an immersive, real time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on real surfaces.
It will form the basis of the next two chapters in this section.
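As an illustration of this vibrotactile approach (a simplified sketch, not the exact signal model used in this thesis), a periodic texture of spatial wavelength $\lambda$ explored at finger speed $v(t)$ can be rendered as a vibration whose phase advances with the distance traveled on the surface,
\[
s(t) = A \sin\!\left(\frac{2\pi}{\lambda} \int_0^t v(\tau)\,\mathrm{d}\tau\right),
\]
so that its instantaneous frequency $v(t)/\lambda$ matches the rate at which the finger crosses the texture ridges.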
Second, many works have investigated the haptic augmentation of textures, but none have integrated them with \AR and \VR, or have considered the influence of the visual feedback on their perception.
Still, it is known that visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting}, and that the perception of force feedback from grounded haptic devices is not the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}.
Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentation is affected by the visual feedback of the virtual hand and the environment} (real, augmented, or virtual).
Finally, some visuo-haptic texture databases have been created from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated.
Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR.%, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
In immersive and wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects.
Hence, a user can expect natural and direct contact and manipulation of \VOs with the bare hand.
However, the intangibility of the visual \VE, the display limitations of current visual \OST-\AR systems and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make the interaction with \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the \RE, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
Two particular sensory feedbacks are known to improve such direct \VO manipulation, but they have not been properly investigated in immersive \AR: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic rendering \cite{lopes2018adding,teng2021touch}.
@@ -228,11 +246,12 @@ For this second axis of research, we propose to design and evaluate \textbf{visu
We consider the effect on the user performance and experience of (1) the visual rendering as hand augmentation, and (2) the combination of different visuo-haptic renderings of the hand manipulating \VOs.
First, the visual rendering of the virtual hand is a key element for interacting with and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}.
A few works have also investigated the visual rendering of the virtual hand in \AR \cite{piumsomboon2014graspshell,blaga2017usability} but not in an immersive context of \VO manipulation. % with the bare hand.% from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
\OST-\AR also has significant perceptual differences from \VR, due to the visibility of the real hand and environment, that can affect the user experience and performance \cite{yoon2020evaluating}.
%, and these visual hand augmentations have not been evaluated.
Thus, our fourth objective is to \textbf{investigate the visual rendering as hand augmentation} for direct hand manipulation of \VOs in \OST-\AR.
Second, as described above, the haptic actuators need to be moved away from the fingertips so as not to impair the hand movements, sensations, and interactions with the \RE.
Previous works have shown that wearable haptics that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience \cite{maisto2017evaluation,meli2018combining}.
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
Our last objective is to \textbf{investigate the visuo-haptic rendering of hand manipulation with \VOs} in \OST-\AR using wearable vibrotactile haptics.
@@ -248,7 +267,7 @@ In \textbf{\chapref{related_work}}, we then review previous work on the percepti
First, we overview how the hand perceives and manipulates real everyday objects.
Second, we present wearable haptics and haptic augmentations of texture and hardness of real objects.
Third, we introduce \AR, and how \VOs can be manipulated directly with the hand.
Finally, we describe how visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
We then address each of our two research axes in a dedicated part.
@@ -258,18 +277,19 @@ We evaluate how the visual rendering of the hand (real or virtual), the environm
In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures are periodic patterned textures rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The tracking of the real hand and the environment is achieved using a marker-based technique.
The visual rendering is done using the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
In \textbf{\chapref{xr_perception}} we investigate in a user study how different the perception of haptic texture augmentations is in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
We use psychophysical methods to measure the user perception, and extensive questionnaires to understand how this perception is affected by the visual feedback of the virtual hand and the environment (real, augmented, or virtual).
In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
The virtual textures are paired visual and tactile models of real surfaces \cite{culbertson2014one} that we render as visual and haptic overlays on the touched augmented surfaces.
Our objective is to assess the perceived realism, coherence and roughness of the combination of nine representative visuo-haptic texture pairs.
\noindentskip
In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving the direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback with \VOs in immersive \OST-\AR.
In \textbf{\chapref{visual_hand}} we investigate in a user study six visual renderings as hand augmentations, selected among the most popular hand renderings in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on the user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand.


@@ -196,7 +196,7 @@ As of today, an immersive \AR system tracks itself with the user in \ThreeD, usi
It enables the \VE to be registered with the \RE and the user simply moves to navigate within the virtual content.
%This tracking and mapping of the user and \RE into the \VE is named the \enquote{extent of world knowledge} by \textcite{skarbez2021revisiting}, \ie to what extent the \AR system knows about the \RE and is able to respond to changes in it.
However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite[p.165]{billinghurst2015survey}.
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}


@@ -5,7 +5,7 @@ Everyday perception and manipulation of objects with the hand typically involves
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable for many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a visuo-haptic rendering of a \VO is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?


@@ -15,7 +15,7 @@ However, direct hand interaction and manipulation of \VOs is difficult due to th
Tangibles are also used as proxies for manipulating \VOs, but can be inconsistent with the visual rendering, being haptically passive.
Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of tangibles.
Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few can be integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them are best suited for direct hand interaction in \AR.


@@ -8,7 +8,7 @@ This chapter reviews previous work on the perception and manipulation with virtu
We first overview how the hand senses and acts on its environment to perceive and manipulate the haptic properties of real everyday objects.
Second, we present how wearable haptic devices and renderings have been used to augment the haptic perception of roughness and hardness of real objects.
Third, we introduce the principles and user experience of \AR, and overview the main interaction techniques used to manipulate virtual objects directly with the hand.
Finally, we describe how visual and haptic feedback have been combined in \AR to enhance perception and interaction with the hand.
\input{1-haptic-hand}
\input{2-wearable-haptics}

@@ -1,2 +1,2 @@
-\part{Augmenting the Visuo-haptic Texture Perception of Real Surfaces}
+\part{Augmenting the Texture Perception of Real Surfaces}
\mainlabel{perception}

@@ -2,7 +2,7 @@
%
%We describe a system for rendering vibrotactile roughness textures in real time, on any real surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
-%We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
+%We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent visuo-haptic augmentation of the \RE.
\section{Principle}
\label{principle}
@@ -17,7 +17,7 @@ The visuo-haptic texture rendering system is based on:
\figref{diagram} shows the interaction loop diagram and \eqref{signal} the definition of the vibrotactile signal.
The system consists of three main components: the pose estimation of the tracked real elements, the visual rendering of the \VE, and the vibrotactile signal generation and rendering.
-\figwide[1]{diagram}{Diagram of the visuo-haptic texture rendering system. }[
+\figwide{diagram}{Diagram of the visuo-haptic texture rendering system. }[
Fiducial markers, attached to the voice-coil actuator and to the augmented surfaces to be tracked, are captured by a camera.
The positions and rotations (the poses) ${}^c\mathbf{T}_i$, $i=1..n$ of the $n$ defined markers in the camera frame $\mathcal{F}_c$ are estimated, then filtered with an adaptive low-pass filter.
%These poses are transformed to the \AR/\VR headset frame $\mathcal{F}_h$ and applied to the virtual model replicas to display them superimposed and aligned with the \RE.
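The definition referenced by \eqref{signal} falls outside this hunk, and the adaptive low-pass filter is not named here (the 1€ filter is a common choice for smoothing tracked poses). As a hedged sketch only, assuming a standard position-based grating model from the vibrotactile texture-rendering literature rather than the thesis's actual formulation, the voice-coil command could take the form:

```latex
% Illustrative sketch only, not necessarily the thesis's \eqref{signal}:
% a sinusoidal grating whose phase follows the fingertip position x(t)
% along the surface, with spatial period \lambda and amplitude A.
\begin{equation}
  s(t) = A \, \sin\!\left(\frac{2\pi}{\lambda}\, x(t)\right)
  \label{eq:signal-sketch} % hypothetical label
\end{equation}
```

Driving the phase by position rather than time keeps the perceived spatial frequency of the virtual texture anchored to the surface, whatever the finger speed.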

@@ -19,7 +19,7 @@ To control for the influence of the visual rendering, the real surface was not v
\noindentskip In the remainder of this chapter, we first describe the experimental design and apparatus of the user study.
We then present the results obtained, discuss them, and outline recommendations for future \AR/\VR works using wearable haptic augmentations.
-%First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual \AR/\VR headset to provide a coherent multimodal visuo-haptic augmentation of the \RE.
+%First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual \AR/\VR headset to provide a coherent visuo-haptic augmentation of the \RE.
%An experimental setup is then presented to compare haptic roughness augmentation with an optical \AR headset (Microsoft HoloLens~2) that can be transformed into a \VR headset using a cardboard mask.
%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a real surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.

@@ -114,19 +114,18 @@ They all signed an informed consent form before the user study and were unaware
For each trial, the \response{Texture Choice} selected by the participant as the roughest of the pair was recorded.
The \response{Response Time} between the end of the trial and the participant's choice was also measured as an indicator of the difficulty of the task.
At each frame, the \response{Finger Position} and \response{Finger Speed} were recorded to control for possible differences in texture exploration behaviour.
-Participants also rated their experience after each \factor{Visual Rendering} block of trials using the questions shown in \tabref{questions}.
-After each \factor{Visual Rendering} block of trials, participants rated their experience with the vibrotactile textures (all blocks), the vibrotactile device (all blocks), the virtual hand rendering (all except \level{Mixed} block) and the \VE (\level{Virtual} block) using the questions shown in \tabref{questions}.
-They also assessed their workload with the NASA Task Load Index (\response{NASA-TLX}) questionnaire after each blocks of trials.
+After each \factor{Visual Rendering} block of trials, participants rated their experience with the vibrotactile textures (all blocks), the vibrotactile device (all blocks), the virtual hand rendering (all blocks except \level{Mixed}) and the \VE (\level{Virtual} block only) using the questions shown in \tabref{questions1}.
+They also assessed their workload with the NASA Task Load Index (\response{NASA-TLX}) questionnaire after each block of trials (\tabref{questions2}).
For all questions, participants were shown only labels (\eg \enquote{Not at all} or \enquote{Extremely}) and not the actual scale values (\eg 1 or 5) \cite{muller2014survey}.
\newcommand{\scalegroup}[2]{\multirow{#1}{1\linewidth}{#2}}
-\begin{tabwide}{questions}
-{Questions asked to participants after each \factor{Visual Rendering} block of trials.}
+\afterpage{
+\begin{tabwide}{questions1}
+{First part of the questions asked to participants after each \factor{Visual Rendering} block of trials.}
[
Unipolar scale questions were 5-point Likert scales (1~=~Not at all, 2~=~Slightly, 3~=~Moderately, 4~=~Very and 5~=~Extremely).
Bipolar scale questions were 7-point Likert scales (1~=~Extremely A, 2~=~Moderately A, 3~=~Slightly A, 4~=~Neither A nor B, 5~=~Slightly B, 6~=~Moderately B, 7~=~Extremely B),
where A and B are the two poles of the scale (indicated in parentheses in the Scale column of the questions).
-NASA TLX questions were bipolar 100-points scales (0~=~Very Low and 100~=~Very High, except for Performance where 0~=~Perfect and 100~=~Failure).
Participants were shown only the labels for all questions.
]
\begin{tabularx}{\linewidth}{l X p{0.2\linewidth}}
@@ -152,13 +151,27 @@ For all questions, participants were shown only labels (\eg \enquote{Not at all}
\midrule
Virtual Realism & How realistic was the virtual environment? & \scalegroup{2}{Unipolar (1-5)} \\
Virtual Similarity & How similar was the virtual environment to the real one? & \\
-\midrule
-Mental Demand & How mentally demanding was the task? & \scalegroup{6}{Bipolar (0-100)} \\
-Temporal Demand & How hurried or rushed was the pace of the task? & \\
-Physical Demand & How physically demanding was the task? & \\
-Performance & How successful were you in accomplishing what you were asked to do? & \\
-Effort & How hard did you have to work to accomplish your level of performance? & \\
-Frustration & How insecure, discouraged, irritated, stressed, and annoyed were you? & \\
\bottomrule
\end{tabularx}
\end{tabwide}
+}
+\begin{tab}[!htb]{questions2}
+{NASA-TLX questions asked to participants after each \factor{Visual Rendering} block of trials.}
+[
+Questions were bipolar 100-point scales (0~=~Very Low and 100~=~Very High, except for Performance where 0~=~Perfect and 100~=~Failure).
+Participants were shown only the labels for all questions.
+]
+\begin{tabularx}{\linewidth}{p{0.13\linewidth} X}
+\toprule
+\textbf{Code} & \textbf{Question} \\
+\midrule
+Mental Demand & How mentally demanding was the task? \\
+Temporal Demand & How hurried or rushed was the pace of the task? \\
+Physical Demand & How physically demanding was the task? \\
+Performance & How successful were you in accomplishing what you were asked to do? \\
+Effort & How hard did you have to work to accomplish your level of performance? \\
+Frustration & How insecure, discouraged, irritated, stressed, and annoyed were you? \\
+\bottomrule
+\end{tabularx}
+\end{tab}

@@ -1,13 +1,13 @@
\section{Conclusion}
\label{conclusion}
-In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, being either real, augmented or virtual.
+In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and the environment, being either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the real surface being touched.
With an immersive \AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
-Our results showed that the visual virtuality of the hand and the environment had a significant effect on the perception of haptic textures and the exploration behaviour of the participants.
+Our results showed that the visual virtuality of the hand (real or virtual) and the environment (\AR or \VR) had a significant effect on the perception of haptic textures and the exploration behaviour of the participants.
The textures were on average perceived as \enquote{rougher}, and with a higher sensitivity, when touched with the real hand alone than with a virtual hand in \AR, with \VR in between.
Exploration behaviour was also slower in \VR than with the real hand alone, although the subjective evaluation of the texture was not affected.
We hypothesised that this difference in perception was due to the \emph{perceived latency} between the finger movements and the different visual, haptic and proprioceptive feedbacks, which were the same in all visual renderings, but were more noticeable in \AR and \VR than without visual augmentation.

@@ -1,2 +1,2 @@
-\part{Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
+\part{Improving the Manipulation of Virtual Objects}
\mainlabel{manipulation}

@@ -13,13 +13,13 @@ We have structured our research around two axes: \textbf{(I) modifying the textu
\noindentskip In \partref{perception} we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment real surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR.
-We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual virtuality of the hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
+We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual feedback of the virtual hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
In \chapref{vhar_system}, we presented a system for \textbf{augmenting any real surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
It allows a \textbf{free visual and touch exploration} of the textures, as if they were real, letting the user view them from different angles and touch them with the bare finger without constraints on hand movements.
The user studies in the next two chapters are based on this system.
-In \chapref{xr_perception} we explored how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, whether it is real, augmented or virtual.
+In \chapref{xr_perception} we explored how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and the environment, whether it is real, augmented or virtual.
We augmented the perceived roughness of the real surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view.
We then conducted a psychophysical user study with 20 participants and extensive questionnaires to evaluate the perceived roughness augmentation in these three visual conditions.
The textures were perceived as \textbf{rougher when touched with the real hand alone compared to a virtual hand} in either \AR or \VR, possibly due to the \textbf{perceived latency} between finger movements and different visual, haptic, and proprioceptive feedbacks.

@@ -5,5 +5,5 @@ Compile `main.tex` using either `pdflatex` with `biblatex` or `lualatex` (recomm
You may need to install the following packages:
```bash
-tlmgr install adjustbox babel babel-english biber biblatex bookmark booktabs caption changes csquotes emptypage enumitem etoolbox fancyhdr fontaxes fontenc footmisc glossaries glossaries-english graphicx hyperref iftex import inconsolata libertinus libertinus-fonts libertinus-otf libertinus-type1 makecell mathtools microtype multirow nextpage pdflscape pdfpages setspace siunitx subfig tabularx textcomp titlesec truncate todonotes tocloft ulem upquote xcolor xstring
+tlmgr install adjustbox babel babel-english biber biblatex biblatex-ext bookmark booktabs caption changes csquotes emptypage enumitem etoolbox fancyhdr fontaxes fontenc footmisc glossaries glossaries-english graphicx hyperref iftex import inconsolata libertinus libertinus-fonts libertinus-otf libertinus-type1 makecell mathtools microtype multirow nextpage pdflscape pdfpages setspace siunitx subfig tabularx textcomp titlesec truncate todonotes tocloft ulem upquote xcolor xstring
```

@@ -12,7 +12,8 @@
\usepackage{bookmark} % Manage bookmarks
\usepackage{import} % Allow relative paths
\usepackage{pdfpages} % Include PDFs
-\usepackage{rotating} % Landscape images and tables
+\usepackage{pdflscape} % Landscape pages
+\usepackage{afterpage} % Execute command after the next page break
% Formatting
\usepackage[autostyle]{csquotes} % For quotes
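The pairing of these two new packages is the point of this hunk: `pdflscape` provides the rotated `landscape` environment, and `afterpage` defers it past the current page break. A minimal sketch of the idiom, with a placeholder figure name, which the redefined `\figwide` and `tabwide` below build on:

```latex
% Defer the rotated content until the current page is complete,
% so the landscape page does not force an early page break.
\afterpage{%
  \begin{landscape}%
    \includegraphics[width=\linewidth]{example-figure}% placeholder name
  \end{landscape}%
}
```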

@@ -35,13 +35,12 @@
\end{figure}%
}
-\RenewDocumentCommand{\figwide}{O{1} O{htbp} m m O{}}{% #1 = width, #2 = position, #3 = filename, #4 = caption, #5 = additional caption
-\begin{sidewaysfigure}[#2]
-\centering%
-\includegraphics[width=#1\linewidth]{#3}%
-\caption[#4]{#4#5}%
-\label{fig:#3}%
-\end{sidewaysfigure}%
+\RenewDocumentCommand{\figwide}{O{1} m m O{}}{% #1 = width, #2 = filename, #3 = caption, #4 = additional caption
+\afterpage{%
+\begin{landscape}%
+\fig[#1][p]{#2}{#3}[#4]%
+\end{landscape}%
+}%
}
% example:
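The body of the `% example:` comment lies outside the hunk; a hypothetical call matching the new signature `O{1} m m O{}`, with placeholder arguments, could read:

```latex
% Hypothetical usage of the redefined \figwide (all arguments are placeholders):
\figwide[0.8]{diagram}{Diagram of the rendering system.}[Markers are tracked by a camera.]
```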
@@ -119,13 +118,12 @@
\end{table}%
}
-\RenewDocumentEnvironment{tabwide}{O{htbp} m m O{}}{% #1 = position, #2 = label, #3 = title, #4 = additional caption
-\begin{sidewaystable}[#1]%
-\centering%
-\caption[#3]{#3#4}%
-\label{tab:#2}%
+\RenewDocumentEnvironment{tabwide}{m m O{}}{% #1 = label, #2 = title, #3 = additional caption
+\begin{landscape}
+\begin{tab}[p]{#1}{#2}[#3]%
}{%
-\end{sidewaystable}%
+\end{tab}%
+\end{landscape}%
}
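Likewise, a hypothetical use of the redefined `tabwide` environment, whose new signature takes a label, a title, and an optional additional caption; the label, title, and table row here are invented for illustration:

```latex
% Hypothetical usage of the redefined tabwide (label, title and row are placeholders):
\begin{tabwide}{questions1}{Questions asked after each block of trials.}[Participants saw only the labels.]
  \begin{tabularx}{\linewidth}{l X}
    \toprule
    \textbf{Code} & \textbf{Question} \\
    \midrule
    Haptic Realism & How realistic were the vibrotactile textures? \\
    \bottomrule
  \end{tabularx}
\end{tabwide}
```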
% Local table of contents