Compare commits
13 Commits: 0a0e1ff4b5...main

| SHA1 |
|---|
| 6ec76ae2e0 |
| 58161d561f |
| 708b836397 |
| 8a4a0bdb81 |
| 520be0856a |
| cb56abcd43 |
| e01e63d4cf |
| 0202efeb06 |
| 43037c8407 |
| 035603eee7 |
| 9f4e7fb8c7 |
| 60110bd64e |
| beb2dce3bb |

.gitignore (vendored): 1 change
@@ -147,7 +147,6 @@ acs-*.bib

# knitr
*-concordance.tex
# TODO Uncomment the next line if you use knitr and want to ignore its generated tikz files
# *.tikz
*-tikzDictionary
@@ -1,2 +1,25 @@

\chapterstarbookmark{Acknowledgement}
\label{ch:acknowledgement}
\chapternotoc{Acknowledgements}

First of all, I would like to thank my thesis supervisors, Maud Marchal, Claudio Pacchierotti and Éric Marchand, for their invaluable guidance throughout this work.
You allowed me to fully discover and learn research and teaching, and you granted me great trust and freedom.
You were also always available to help me, exchange ideas, and share your curiosity and experience, for which I am very grateful.

I would also like to thank my reviewers, Jens Grubert and Seokhee Jeon, for reading and commenting on my thesis manuscript.
Your comments allowed me to greatly improve the quality of this manuscript and to best prepare my defense.
I also thank my examiners, Sinan Haliyo and Martin Hachet, for their follow-up and feedback throughout the years of the thesis up to the defense.
Finally, I thank all four of you for accepting, and taking the time, to be part of my thesis committee.
I greatly appreciated our inspiring discussions and your rich feedback on my work.

I also thank my many colleagues at IRISA, Inria and INSA for our numerous and rich exchanges and all the moments spent together, in the laboratory and outside.
I am thinking in particular of Ronan Gaugne, Charles Pontonnier, Thomas Genet, Pascale Sebillot, Anatole Lécuyer, Valérie Gouranton, Hélène de la Ruée and Léa Pillette, who gave me invaluable help and advice on the thesis and on more personal questions.
But I am thinking above all of my office mates (in chronological order): Alexander, Fouad, Lendy, Pierre-Antoine, Guillaume, Thomas, Glenn and Jim, with whom I shared many moments of work and life.
Lendy, Pierre-Antoine and Guillaume, I greatly enjoyed putting the world to rights with you every day and our fascinating discussions!
I will also remember your culinary talents, but for different reasons ;)

Finally, I am deeply grateful to my parents, families and friends for their presence and support during these years of the thesis.
Thanks also to my parents and parents-in-law for your crucial support in the final days and for preparing the defense reception.
Our furballs were also always there to bring comfort and enjoy the pleasures of everyday life.
Last but not least, thank you Agathe for being you, for being there, and for everything we are building together.
The last time I wrote acknowledgements, in 2018, I was looking forward to living with you, and I am now even more looking forward to living the rest of our life together!

\textit{Trugarez vraz d'an holl!}
@@ -1,5 +1,6 @@

\begin{listof}
\renewcommand{\contentsname}{\textsf{Contents}}
\renewcommand{\cftsecfont}{}% Unbold sections
\pdfbookmark[chapter]{\contentsname}{toc}
\tableofcontents
\end{listof}
@@ -7,7 +7,6 @@

In this thesis manuscript, we show how \AR headsets, which integrate visual virtual content into the perception of the real world, and wearable haptics, which provide tactile sensations on the skin, can improve direct hand interaction with virtual and augmented objects.
Our goal is to enable users to perceive and interact with wearable visuo-haptic augmentations in a more realistic and effective way, as if they were real.
\comans{JG}{I was wondering what the difference between an immersive AR headset and a non-immersive AR headset should be. If there is a difference (e.g., derived through headset properties by FoV), it should be stated. If there is none, I would suggest not using the term immersive AR headset but simply AR headset. On this account, in Figure 1.5 another term (“Visual AR Headset”) is introduced (and later OST-AR systems, c.f. also section 2.3.1.3).}{The terms "immersive AR headset" and "visual AR headset" have been replaced by the more appropriate term "AR headset".}

\section{Visual and Haptic Object Augmentations}
\label{visuo_haptic_augmentations}
@@ -18,7 +17,6 @@ In daily life, \textbf{we simultaneously look at, touch and manipulate the every

Many of these object properties can be perceived in a complementary way through all our sensory modalities, such as their shape or material \cite{baumgartner2013visual}.
Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to anticipate properties that we cannot see, \eg weight or temperature.
Information from different sensory sources can be complementary, redundant or contradictory \cite{ernst2004merging}.
%This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations.
We then \textbf{instinctively construct a unified perception of the properties of the object} we are exploring and manipulating from our sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.

The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and interact with the surrounding objects.

@@ -30,11 +28,11 @@ This rich and complex variety of actions and sensations makes it particularly \t

\textbf{\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch} \cite{klatzky2013haptic}.
Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories} \cite{culbertson2018haptics}.
\emph{Graspable} interfaces are the traditional haptic devices that are held in the hand.
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
\emph{Touchable} interfaces are actuated devices that are directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
Instead, \textbf{\emph{wearable} interfaces are directly mounted on the body} to provide cutaneous sensations on the skin in a portable way and \textbf{without restricting the user's movements} \cite{pacchierotti2017wearable}.
\begin{subfigs}{haptic-categories}{
Haptic devices can be divided into three categories according to their interface with the user:

@@ -50,8 +48,8 @@ Instead, \textbf{\emph{wearable interfaces} are directly mounted on the body} to

A wide range of wearable haptic devices have been developed to provide the user with rich virtual tactile sensations, including normal force, skin stretch, vibration and thermal feedback.
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, social interaction, and for improving hand interaction in \AR \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, the \textbf{integration of wearable haptics with \AR has been little explored}.

\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can provide sensations on the skin as feedback to real or virtual objects being touched.

@@ -72,11 +70,14 @@ However, the \textbf{integration of wearable haptics with \AR has been little ex

\textbf{\emph{Augmented Reality (\AR)} integrates virtual content into the real world perception, creating the illusion of a unique \emph{\AE}} \cite{azuma1997survey,skarbez2021revisiting}.
It thus promises natural and seamless interaction with physical and digital objects (and their combination) directly with our hands \cite{billinghurst2021grand}.
It can be used to render purely \emph{virtual objects} in the \RE, or to create \emph{augmented objects} by modifying the perception of a real object with virtual content, such as changing its shape or texture.
It is technically and conceptually closely related to \emph{\VR}, which completely replaces the perception of the \emph{\RE} with a \emph{\VE}.

\AR and \VR can be placed on a reality-virtuality continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}.%
\footnote{On the original reality-virtuality continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects into a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}
It describes the degree of virtuality of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world.
Between these two extremes lies \MR, which includes \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.%
\footnote{This is the original and classic definition of \MR, but there is still debate about how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}

\begin{subfigs}{rv-continuums}{Reality-virtuality continuums. }[][
\item For the visual sense, as originally proposed by and adapted from \textcite{milgram1994taxonomy}.

@@ -86,26 +87,25 @@ Between these two extremes lies \MR, which includes \AR and \VR as different lev

\subfig[0.49]{visuo-haptic-rv-continuum5}
\end{subfigs}

%Concepts of virtuality and augmentation can also be applied for sensory modalities other than vision.
\textcite{jeon2009haptic} proposed to describe visuo-haptic \AR/\VR with two orthogonal reality-virtuality continuums, one for vision and one for touch, as shown in \figref{visuo-haptic-rv-continuum5}.
The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of virtuality for each visual or haptic feedback: real, augmented and virtual.
For example, visual \AR using a real object as a proxy to manipulate a virtual object is considered \emph{haptic reality} (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum5}).
Conversely, a device that provides synthetic haptic feedback when touching a virtual object is considered \emph{haptic virtuality} (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum5}).
\textbf{A \emph{haptic augmentation} is then the combination of real and virtual haptic stimuli}, such that the virtual haptic sensations modify the perception of the real object \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum5}).
%In particular, it has been implemented by augmenting the haptic perception of real objects by providing timely virtual tactile stimuli using wearable haptics:
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a real object in \VR using simultaneous wearable pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum5}).
\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture using reverse electrovibration when running the finger over a real surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum5}).

In this thesis we call \AR/\VR \emph{systems} the computational set of hardware (input devices, sensors, displays and haptic devices) and software (tracking, registration, simulation, and rendering) that allows the user to interact with the \VE.
Many \AR displays have been explored, from projection systems to hand-held displays.
\textbf{\AR headsets are the most promising display technology because they create a portable experience that allows the user to navigate the \AE and interact with it directly using their hands} \cite{hertel2021taxonomy}.
While \AR and \VR systems can address any of the human senses, most focus only on visual augmentation \cites[p.144]{billinghurst2015survey}{kim2018revisiting}.

\emph{Presence} is the illusion of \enquote{being there} when in \VR, or the illusion that the virtual content \enquote{feels here} when in \AR \cite{slater2022separate,skarbez2021revisiting}.
One of the most important aspects of this illusion is the \emph{plausibility}, \ie the illusion that the virtual events are really happening.
However, \textbf{when an \AR/\VR headset lacks haptic feedback, it may create a deceptive and incomplete user experience when the hand reaches the visual virtual content}.
All visual virtual objects are inherently intangible and cannot physically constrain or provide touch feedback to the hand of the user.
This makes it difficult to perceive their properties and interact with them with confidence and efficiency.
It is also necessary to provide haptic feedback that is coherent with the virtual content and ensures the best possible user experience.

\begin{subfigs}{visuo-haptic-environments}{Visuo-haptic environments with varying degrees of reality-virtuality. }[][
\item \AR environment with a real haptic object used as a proxy to manipulate a virtual object \cite{kahl2023using}.

@@ -120,54 +120,71 @@ The \textbf{integration of wearable haptics with \AR headsets appears to be one

\subfig{bau2012revel}
\end{subfigs}

\subsectionstarbookmark{Visuo-Haptic Augmentations}

\comans{SJ}{The chapter could benefit from some expansion. For instance, the current introduction tries to describe the scope of the research in haptic AR but lacks sufficient background on the general issues in this domain. As a result, it may not be very helpful for readers unfamiliar with the field in understanding the significance of the thesis's focus and positioning it within the broader context of haptic AR research.}{TODO}

Providing coherent and effective haptic feedback to visual virtual and augmented objects in \AR is complex.
To identify the challenges involved, we propose to \textbf{represent the user's experience with such a visuo-haptic \AE as an interaction loop}, shown in \figref{interaction-loop}.
It is based on the interaction loops of users with \ThreeD systems \cite[p.84]{laviolajr20173d}.
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
The interactions between the virtual hand and objects are then simulated, and rendered as feedback to the user using an \AR headset and wearable haptics.
It is important that the visuo-haptic \VE is registered with the \RE and rendered in real time.
This gives the user the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.

\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic \AE as proposed in this thesis.}[
A user interacts with the visual (in blue) and haptic (in red) \VEs through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with virtual objects.
The visual and haptic \VEs are rendered back using an \AR headset and wearable haptics, and are perceived by the user to be registered and co-localized with the \RE (in gray).
%\protect\footnotemark
]

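The tracking-simulation-rendering cycle described above can be sketched as a single loop iteration. This is an illustrative sketch only: the component names (\texttt{sensors}, \texttt{simulate}, \texttt{render\_visual}, \texttt{render\_haptic}) are hypothetical placeholders, not an actual AR or haptics API.

```python
# Hypothetical sketch of one iteration of the visuo-haptic interaction loop.
# All component names are illustrative placeholders, not a real API.

def interaction_loop_step(sensors, simulate, render_visual, render_haptic):
    """Track the real hand, simulate virtual contacts, render feedback."""
    # 1. Track the user's hand and the headset within the real environment.
    hand_pose = sensors["hand"]()
    headset_pose = sensors["headset"]()
    # 2. Simulate interactions between the virtual hand and virtual objects.
    contacts = simulate(hand_pose)
    # 3. Render the visual VE on the AR headset (registered with the real
    #    environment via the headset pose) and the haptic VE on the wearable device.
    render_visual(headset_pose, contacts)
    render_haptic(contacts)
    return contacts
```

Running this function at display rate, with the visual and haptic renderers kept registered with the real environment, is what sustains the illusion of direct interaction.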
Implementing such an interaction loop with an \AR system involves design and technical problems \cite{jeon2015haptic,marchand2016pose}.
One of the \textbf{key issues is the \emph{registration} of the \VE with the \RE}.
It is the precise spatial and temporal alignment of the virtual content with the \RE, such that the user perceives the virtual content as part of the real world.
For visual \AR, a real camera is usually attached to the \AR display to estimate in real time the \emph{pose} (position and orientation) of the display within the \RE.
The virtual camera that captures the \VE is set to the same pose and intrinsic parameters (focal length, angle of view, distortion) as the real camera \cite{marchand2016pose}.
The combined images from the real and virtual cameras are then displayed to the user.
With haptic \AR, the haptic feedback must be similarly registered with the \RE \cite{jeon2015haptic}, \eg adding a virtual force in the same direction and at the same time as a real force when pressing a surface with a tool.
Depending on the type of haptic device and the haptic property to be rendered, this registration process involves different sensors, estimation techniques, and time requirements.
This can be difficult to achieve and remains one of the main research challenges in haptic \AR.

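The visual registration step can be made concrete with the standard pinhole-camera formulation (the notation below is ours, not taken from the manuscript): a real 3D point projects to a pixel through the estimated camera pose and the camera intrinsics, and the virtual camera reuses the same parameters.

```latex
% Sketch (notation ours): pinhole projection used for visual registration.
% A 3D point X of the real environment projects to pixel x through the
% estimated camera pose (R, t) and the intrinsic matrix K:
\begin{equation*}
  \mathbf{x} \simeq \mathbf{K} \left[ \mathbf{R} \mid \mathbf{t} \right] \mathbf{X},
  \qquad
  \mathbf{K} =
  \begin{pmatrix}
    f_x & 0   & c_x \\
    0   & f_y & c_y \\
    0   & 0   & 1
  \end{pmatrix}.
\end{equation*}
% Setting the virtual camera to the same (K, R, t) makes virtual points
% project onto the same pixels as their real counterparts.
```

Temporal registration then amounts to keeping the estimate of $(\mathbf{R}, \mathbf{t})$ up to date at display rate, so that the virtual overlay does not lag behind the real view.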
In addition, visual and haptic \textbf{\AR systems require models to simulate the interaction} with the \VE \cite{jeon2015haptic,marchand2016pose}.
As mentioned above, models of the \RE are needed to properly register the \VE, as well as models of the real objects to be augmented and manipulated.
Haptic \AR also often needs models of the user's contacts with the augmented objects.
Depending on the rendered haptic property and the required feedback fidelity, these can be complex estimates of the contacts (points, forces) and object properties (shape, material, mass, deformation, etc.).
Computational and rendering models are also needed to render the virtual visual and haptic stimuli to the user.
While \ThreeD visual rendering is mature, rendering haptic properties still faces many research challenges due to the complexity of the human touch \cite{pacchierotti2017wearable,culbertson2018haptics}.
However, a balance has to be found between the perceptual accuracy of all the models used and the real-time constraints of the interaction loop.

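As a minimal illustration of such a rendering model, a textbook penalty-based scheme computes the virtual contact force from the penetration depth of the fingertip into a virtual surface. This sketch is ours (a generic example, not the thesis's method), and the flat-surface geometry and stiffness value are assumptions.

```python
# Illustrative penalty-based haptic rendering (a textbook scheme, not the
# thesis's own method): the virtual contact force grows linearly with the
# penetration depth of the fingertip into a flat virtual surface.

def contact_force(finger_height, surface_height, stiffness=500.0):
    """Return the normal force (N) for a flat virtual surface.

    finger_height: vertical fingertip position (m); values below
    surface_height mean the virtual surface is penetrated.
    stiffness: spring constant (N/m), an assumed example value.
    """
    penetration = surface_height - finger_height
    if penetration <= 0.0:
        return 0.0  # no contact, no force
    return stiffness * penetration  # Hooke-like spring force
```

Such a model trades perceptual accuracy for computational simplicity: it runs comfortably within the kilohertz-rate constraints usually targeted by haptic rendering, at the cost of a simplified contact geometry.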
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
\label{research_challenges}

The integration of wearable haptics with \AR headsets to create a visuo-haptic \AE is a promising solution, but presents many perceptual and interaction challenges.
In this thesis, we focus on two main research challenges:
\textbf{(I) providing plausible and coherent visuo-haptic augmentations}, and
\textbf{(II) enabling effective manipulation of the \AE}.
Each of these challenges also raises numerous design, technical, perceptual and user experience issues specific to wearable haptics and \AR headsets.
\comans{JG}{While these research axes seem valid, it could have been described more clearly how they fit into the overarching research fields of visuo-haptic augmentations.}{TODO}

%\footnotetext{%
% The icons are \href{https://creativecommons.org/licenses/by/3.0/}{CC BY} licensed:
% \enquote{\href{https://thenounproject.com/icon/finger-pointing-4230346/}{finger pointing}} by \href{https://thenounproject.com/creator/leremy/}{Gan Khoon Lay},
% \enquote{\href{https://thenounproject.com/icon/hololens-1499195/}{HoloLens}} by \href{https://thenounproject.com/creator/daniel2021/}{Daniel Falk}, and
% \enquote{\href{https://thenounproject.com/icon/vibration-6478365/}{vibrations}} by \href{https://thenounproject.com/creator/iconbunny/}{Iconbunny}.
%}

\subsectionstarbookmark{Challenge I: Providing Plausible and Coherent Visuo-Haptic Augmentations}

\textbf{Many haptic devices have been designed and evaluated specifically for use in \VR}, providing the user with rich kinesthetic and tactile feedback on virtual objects, increasing the realism and effectiveness of interaction with them \cite{culbertson2018haptics}.
Although closely related, \AR and \VR headsets have key differences in their respective renderings that can affect user perception.

%As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.

Many hand-held or wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR headsets.
The \textbf{user's hand must be free to touch and interact with the \RE while wearing a wearable haptic device}.
Instead, it is possible to place the haptic actuator close to the point of contact with the \RE, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, another phalanx \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering} or the wrist \cite{pezent2019tasbi,sarac2022perceived} for rendering fingertip contact with virtual content.
Therefore, when touching a virtual or augmented object, \textbf{the real and virtual visual sensations are perceived as co-localized, but the virtual haptic feedback is not}.
Such potential visuo-haptic discrepancies may negatively affect the user's perception of the registration of the \VE with the \RE.
This remains to be investigated to understand how to design visuo-haptic augmentations adapted to \AR headsets.

%So far, most of the \AR studies and applications only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is more difficult to remove sensations.
%Visual and haptic augmentations of the \RE add sensations to the user's overall perception.
Therefore, the \textbf{added visual and haptic virtual sensations may also be perceived as incoherent} with each other, for example with a lower rendering quality, a temporal latency, a spatial misalignment, or a combination of these.
This incoherence could also be caused by the limited rendering capabilities of wearable haptic devices \cite{pacchierotti2017wearable}.
Finally, the user can still see the \RE with an \AR headset, including their hands, augmented real objects and worn haptic devices, unlike \VR where there is total control over the visual rendering.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as a whole, and to what extent they will conflict or complement each other.
With a better understanding of \textbf{how visual factors can influence the perception of haptic augmentations}, the many wearable haptic devices that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.

@@ -176,21 +193,19 @@ With a better understanding of \textbf{how visual factors can influence the perc
|
||||
Touching, \textbf{grasping and manipulating virtual objects are fundamental interactions for \AR} \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite[p.385]{laviolajr20173d}.
In the previous challenge section, we described how the user's hand should be free to interact with the \RE using a wearable haptic device.
We can then expect a seamless and direct manipulation of the virtual content with the hands, as if it were real.
When touching a real object that is visually augmented, the user's hand is physically constrained by the real object, allowing for easy and natural interaction.
However, \textbf{manipulating a purely virtual object with the bare hand can be challenging}, especially without good haptic feedback \cite{maisto2017evaluation,meli2018combining}.
In addition, wearable haptic devices are limited to cutaneous feedback, and cannot provide forces to constrain the hand contact with the virtual object \cite{pacchierotti2017wearable}.

Current \AR headsets have visual rendering limitations that also affect interaction with virtual objects.
\AR superimposes images of the virtual world onto the user's current view of the real world.
However, the depth perception of virtual objects is often underestimated \cite{peillard2019studying,adams2022depth}.
There is also often \textbf{a lack of mutual occlusions between the hand and a virtual object}, \ie the hand hiding the object or being hidden by it \cite{macedo2023occlusion}.
Finally, as illustrated in \figref{interaction-loop}, interaction with a virtual object is an illusion, because the real hand controls in real time a virtual hand, like an avatar, whose contacts with virtual objects are then simulated in the \VE.
Therefore, there is inevitably a latency between the movements of the real hand and the feedback movements of the virtual object, and a spatial misalignment between the real hand and the virtual hand, whose movements are constrained to the virtual object touched \cite{prachyabrued2014visual}.
These three rendering limitations make it \textbf{difficult to perceive the position of the fingers relative to the object} before touching or grasping it, but also to estimate the force required to grasp the virtual object and move it to a desired location.

Hence, it is necessary to provide feedback that allows the user to efficiently contact, grasp and manipulate a virtual object with the hand.
Yet, it is unclear which type of visual and wearable haptic feedback, or their combination, is best suited to guide the manipulation of a virtual object.

\section{Approach and Contributions}
\label{contributions}

We consider two main axes of research, each addressing one of the research challenges identified above:
\begin{enumerate*}[label=(\Roman*)]
	\item \textbf{augmenting the visuo-haptic texture perception of real surfaces}, and
	\item \textbf{improving the manipulation of virtual objects}.
\end{enumerate*}
Our contributions are summarized in \figref{contributions}.

\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[
The contributions are represented in dark grey boxes, the research axes in green circles.
The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand.
The second axis focuses on \textbf{(II)} improving the manipulation of virtual objects with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]

\subsectionstarbookmark{Axis I: Augmenting the Texture Perception of Real Surfaces}

Wearable haptic devices have proven effective in modifying the perception of a touched real surface, without altering the object or covering the fingertip, forming a haptic augmentation \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
However, wearable haptic augmentation combined with \AR has been little explored, as has the visuo-haptic augmentation of texture.
Texture is indeed one of the most fundamental perceived properties of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
For this first axis of research, we propose to \textbf{design and evaluate the perception of wearable virtual visuo-haptic textures augmenting real surfaces}.

Finally, visuo-haptic texture databases have been created from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3} to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures with an \AR headset and wearable haptics remains to be investigated.
Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR.

\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}

With wearable visuo-haptic \AR, the hand is free to touch and interact seamlessly with real, augmented and virtual objects.
Hence, a user can expect natural and direct contact and manipulation of virtual objects with the bare hand.
However, the intangibility of the visual \VE, the display limitations of current \AR headsets and the inherent spatial and temporal discrepancies between the user's hand actions and the visual feedback in the \VE can make interaction with virtual objects particularly challenging.
Two particular sensory feedbacks are known to improve such direct virtual object manipulation, but have not been properly investigated with \AR headsets: visual feedback of the virtual hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and delocalized haptic feedback \cite{lopes2018adding,teng2021touch}.
For this second axis of research, we propose to \textbf{design and evaluate visuo-haptic augmentations of the hand as interaction feedback with virtual objects} in \AR.
We consider the effect on user performance and experience of (1) the visual feedback of the virtual hand as augmentation of the real hand and (2) different delocalized haptic feedback of virtual object manipulation with the hand in combination with visual hand augmentations.

First, the visual feedback of the virtual hand is a key element for interacting and manipulating virtual objects in \VR \cite{prachyabrued2014visual,grubert2018effects}.
Some work has also investigated the visual feedback of the virtual hand in \AR, but not in a context of virtual object manipulation with a headset \cite{blaga2017usability,yoon2020evaluating} or was limited to a single visual hand augmentation \cite{piumsomboon2014graspshell,maisto2017evaluation}.
\AR headsets also have significant perceptual differences from \VR due to the visibility of the real hand and environment, which can affect user experience and performance \cite{yoon2020evaluating}.
Thus, our fourth objective is to \textbf{investigate the visual feedback of the virtual hand as augmentation of the real hand} for direct hand manipulation of virtual objects.

Second, as described above, the haptic actuators need to be moved away from the fingertips to not impair the hand movements, sensations and interactions with the \RE.
Our last objective is to \textbf{investigate the delocalized haptic feedback of virtual object manipulation with the hand}.

\section{Thesis Overview}
\label{thesis_overview}

With this current \textit{Introduction} chapter, we have presented the research challenges, objectives, approach and contributions of this thesis.

In \textbf{\chapref{related_work}}, we then review previous work on the perception and manipulation of virtual and augmented objects, directly with the hand, using either wearable haptics, \AR, or their combination.

In \textbf{\partref{manipulation}}, we describe our contributions to the second axis of research.

In \textbf{\chapref{visual_hand}}, we investigate in a user study six visual feedback techniques as hand augmentations, chosen as a set of the most popular hand augmentations in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a virtual object directly with the hand.

In \textbf{\chapref{visuo_haptic_hand}}, we evaluate in a user study delocalized haptic feedback of the hand manipulation of virtual objects, using two vibrotactile contact techniques provided at five different positions on the hand.
They are compared with the two most representative visual hand augmentations from the previous chapter, resulting in twenty visuo-haptic hand feedbacks that are evaluated within the same experimental setup and design.

\noindentskip

\section{Perception and Interaction with the Hand}
\label{haptic_hand}

The haptic sense has specific characteristics that make it unique among the senses.
It enables us to perceive a large diversity of properties of everyday objects, through a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but especially in the hand \cite{johansson2009coding}.
It also allows us to act on these objects, to come into contact with them, to grasp them and to actively explore them.
These two mechanisms, \emph{action} and \emph{perception}, are closely associated and both are essential to form a complete haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.

\subsection{The Haptic Sense}
In particular, when the asperities are smaller than \qty{0.1}{mm}, such as paper fibers, the pressure cues are no longer captured and only the movement, \ie the vibrations, can be used to detect the roughness \cite{hollins2000evidence}.
This limit distinguishes \emph{macro-roughness} from \emph{micro-roughness}.

The perception of roughness can be characterized by the density of the surface elements: the perceived (subjective) intensity of roughness increases with the spacing between the elements.
For macro-textures, the size of the elements, the force applied and the speed of exploration have limited effects on the intensity perceived \cite{klatzky2010multisensory}: macro-roughness is a \emph{spatial perception}.
This allows us to read Braille \cite{lederman2009haptic}.
However, the speed of exploration affects the perceived intensity of micro-roughness \cite{bensmaia2003vibrations}.
\subfig[.45]{bergmanntiest2009cues}
\end{subfigs}

An object with low stiffness, but high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
With finger pressure, a relative difference (the \emph{Weber fraction}) of \percent{\sim 15} is required to discriminate between two objects of different stiffness or elasticity.
However, in the absence of pressure sensations (by placing a thin disc between the finger and the object), the necessary relative difference becomes much larger (Weber fraction of \percent{\sim 50}).
That is, \textcite{bergmanntiest2009cues} showed that the perception of hardness relies for \percent{90} on surface deformation cues and for \percent{10} on displacement cues.
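To make these Weber fractions concrete, the following minimal sketch computes the smallest discriminable stiffness for a hypothetical reference of 1 N/mm (the function name and reference value are illustrative assumptions, not taken from the cited study):

```python
def discriminable_stiffness(reference, weber_fraction):
    """Smallest stiffness reliably distinguishable from `reference`,
    given a relative just-noticeable difference (Weber fraction)."""
    return reference * (1.0 + weber_fraction)

# Illustrative reference stiffness of 1.0 N/mm:
with_pressure = discriminable_stiffness(1.0, 0.15)     # pressure cues available
without_pressure = discriminable_stiffness(1.0, 0.50)  # pressure cues masked
```

With pressure cues available, a comparison object of about 1.15 N/mm can just be told apart; without them, about 1.5 N/mm is needed.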
\subsection{Conclusion}
\label{haptic_sense_conclusion}

\subfig{leonardis20173rsr}
\end{subfigs}

\subsection{Wearable Haptic Devices for the Hand}
\label{wearable_haptic_devices}

\begin{subfigs}{tangential_belts}{Tangential motion actuators and compression belts. }[][
\item A skin stretch actuator for the fingertip \cite{leonardis2015wearable}.
\item A 3 \DoF actuator capable of normal and tangential motion on the fingertip \cite{schorr2017fingertip}.
\item The hRing, a shearing belt actuator for the proximal phalanx of the finger \cite{pacchierotti2016hring}.
\item Tasbi, a wristband capable of pressure and vibrotactile feedback \cite{pezent2022design}.
]
\textcite{bhatia2024augmenting} categorize the haptic augmentations into three types: direct touch, touch-through, and tool-mediated.
In \emph{direct touch}, the haptic device does not cover the inside of the hand so as not to impair the interaction of the user with the \RE, and is typically achieved with wearable haptics.
In touch-through and tool-mediated, or \emph{indirect feel-through} \cite{jeon2015haptic}, the haptic device is placed between the hand and the \RE.
Many haptic augmentations were first developed with touch-through devices, and some (but not all) were later transposed to direct touch augmentation with wearable haptic devices.

Since we have chosen to focus in \secref{object_properties} on the haptic perception of roughness and hardness of objects, we review below the methods to modify the perception of these properties.
Of course, wearable haptics can also be used in a direct touch context to modify the perceived friction \cite{konyo2008alternative,salazar2020altering}, weight \cite{minamizawa2007gravity}, or local deformation \cite{salazar2020altering} of real objects, but they are rare \cite{bhatia2024augmenting} and will not be detailed here.

\subsubsection{Roughness Augmentation}
\label{texture_rendering}

To modify the perception of the haptic roughness (or texture, see \secref{roughness}) of a real object, vibrations are typically applied to the skin by the haptic device as the user moves over the surface.
There are two main approaches to modeling virtual texture perception: \emph{simulation models} and \emph{data-driven models} \cite{klatzky2013haptic,culbertson2018haptics}.

\paragraph{Simulation Models}

The simplest texture simulation model is a 1D sinusoidal grating $v(t)$ with spatial period $\lambda$ and amplitude $A$, scanned by the user at velocity $\dot{x}(t)$, \ie at position $x(t) = \int \dot{x}(t) \, dt$:
\begin{equation}{grating_rendering}
	v(t) = A \sin\left(\frac{2 \pi \, x(t)}{\lambda}\right)
\end{equation}
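As a minimal illustration of this rendering principle, the grating signal can be synthesized in discrete time by integrating the measured scan velocity into a position (the function name, sample rate and parameter values below are illustrative assumptions, not taken from a cited implementation):

```python
import math

def grating_signal(velocities, amplitude=1.0, spatial_period=1.5, dt=1e-3):
    """Synthesize v(t) = A sin(2*pi*x(t)/lambda) for a 1D virtual grating.

    velocities: finger scan velocity samples in mm/s, one per time step dt (s).
    spatial_period: grating wavelength lambda in mm.
    Returns one vibration sample per velocity sample.
    """
    samples, x = [], 0.0
    for v in velocities:
        x += v * dt  # integrate the scan velocity into a scanned position (mm)
        samples.append(amplitude * math.sin(2 * math.pi * x / spatial_period))
    return samples

# A constant 150 mm/s scan over a 1.5 mm grating yields a 100 Hz vibration.
signal = grating_signal([150.0] * 1000)
```

Because the phase is driven by the integrated position rather than the raw velocity, the vibration frequency automatically follows the finger speed, as required for the grating to feel spatially anchored.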
The perceived roughness increased proportionally to the virtual texture amplitude.
\textcite{ujitoko2019modulating} instead used a square wave signal and a hand-held stylus with an embedded voice-coil.
\textcite{friesen2024perceived} compared the frequency modulation of \eqref{grating_rendering} with amplitude modulation (\figref{friesen2024perceived}), and found that the frequency modulation was perceived as more similar to real sinusoidal gratings for lower spatial periods (\qty{0.5}{\mm}) but both modulations were effective for higher spatial periods (\qty{1.5}{\mm}).

The model in \eqref{grating_rendering} can be extended to 2D textures by adding a second sinusoidal grating with an orthogonal orientation, as done by \textcite{girard2016haptip}.
More complex models have also been developed to be physically accurate and reproduce with high fidelity the roughness perception of real patterned surfaces \cite{unger2011roughness}, but they require high-fidelity force feedback devices that are expensive and have a limited workspace.
|
||||
|
A common limitation of these data-driven models is that they can only render \emph{isotropic} textures: their recordings do not depend on the position of the measurement, and the rendering is the same regardless of the direction of the movement.
This limitation was eventually addressed by including the direction of the user's velocity in the capture, modelling, and rendering of the textures \cite{abdulali2016datadriven,abdulali2016datadrivena}.
Using the user's velocity magnitude and normal force as inputs, these data-driven models can interpolate between the recorded scans to generate a virtual texture in real time as highly realistic vibrations.
When real textures felt through a stylus were compared with their virtual models rendered by a voice-coil actuator attached to the stylus (\figref{culbertson2012refined}), the virtual textures accurately reproduced the perception of roughness, but hardness and friction were not rendered properly \cite{culbertson2014modeling}.
\textcite{culbertson2015should} further showed that the perceived realism of the virtual textures, and their similarity to the real textures, depended mostly on the user's velocity magnitude and not on the user's force as inputs to the model, \ie responding to velocity magnitude is sufficient to render isotropic virtual textures.
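The speed-driven interpolation idea can be illustrated with a deliberately simplified stand-in. The actual data-driven models (e.g. HaTT) are autoregressive filters driven by force and speed; the sketch below only interpolates a hypothetical recorded vibration energy by scan speed and shapes white noise accordingly — the speed and RMS values are made up for the example.

```python
import numpy as np

# Hypothetical recordings: RMS vibration amplitude measured at a few scan speeds.
recorded_speeds_mms = np.array([20.0, 60.0, 120.0, 200.0])
recorded_rms = np.array([0.05, 0.12, 0.21, 0.30])

def interpolated_rms(speed_mms):
    """Linearly interpolate the recorded vibration energy at the current speed."""
    return float(np.interp(speed_mms, recorded_speeds_mms, recorded_rms))

def render_block(speed_mms, n_samples=480, rng=None):
    """Render one audio block of speed-dependent noise (a crude stand-in for
    the autoregressive models used by data-driven texture rendering)."""
    rng = np.random.default_rng(0) if rng is None else rng
    noise = rng.standard_normal(n_samples)
    noise /= np.sqrt(np.mean(noise ** 2))   # normalize the block to unit RMS
    return interpolated_rms(speed_mms) * noise
```

At each audio block, the current velocity magnitude selects (by interpolation) the vibration energy, which is how these models respond continuously to the user's exploration speed.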
A challenge with this technique is to provide the vibration feedback at the right time.
Vibrations on contact have been employed with wearable haptics, but to the best of our knowledge only to render virtual objects \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}.
We describe them in \secref{vhar_haptics}.
\subsection{Conclusion}
\label{wearable_haptics_conclusion}

While many haptic devices can be worn on the hand, only a few can be considered truly wearable.
If the haptic rendering of the device is synchronized with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
Several haptic augmentation methods have been developed to modify perceived roughness and hardness, mostly using vibrotactile feedback and, to a lesser extent, pressure feedback.
However, not all of these haptic augmentations have yet been transposed to wearable haptics, and the use of wearable haptic augmentations has not yet been investigated in the context of \AR.
Choosing useful and efficient \UIs and interaction techniques is crucial for the usability of \AR applications.

\textcite{laviolajr20173d} (p.385) classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for \AR headsets.
\comans{JG}{In, Figure 2.24 I suggest removing d. or presenting it as separate figure as it shows no interaction technique (The caption is “Interaction techniques in AR” but a visualization of a spatial registration technique).}{It has been removed and replaced by an example of resizing a virtual object.}
In this thesis, we focus on manipulation tasks of virtual content performed directly with the hands, more specifically on touching visuo-haptic textures with a finger (\partref{perception}) and on positioning and rotating virtual objects pushed and grasped by the hand (\partref{manipulation}).

\emph{Manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
Some studies have investigated the visuo-haptic perception of virtual objects rendered in \AR.

In \VST-\AR, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.%
\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).

\begin{subfigs}{visuo-haptic-stiffness}{
Wearable haptic augmentation of roughness and hardness is mostly achieved with vibrotactile feedback.

\AR headsets integrate virtual content into the user's perception as if it were part of the \RE, with real-time pose estimation of the head and hands (\secref{what_is_ar}).
Direct hand interaction with virtual content is often implemented using the virtual hand interaction technique, which reconstructs the user's hand in the \VE and simulates its interactions with the virtual content.
However, the perception and manipulation of the virtual is difficult due to the lack of haptic feedback and the mutual occlusion of the hand with the virtual content (\secref{ar_interaction}).
Real surrounding objects can also be used as proxies to interact with the virtual, but they may be incoherent with their visual augmentation because they are haptically passive (\secref{ar_interaction}).
Wearable haptics on the hand seems to be a promising solution to enable coherent and effective visuo-haptic augmentation of both virtual and real objects.

Their haptic end-effector cannot cover the fingertips or the inside of the palm.
Many strategies have been explored to move the actuator to other parts of the hand.
However, it is still unclear which delocalized positioning is most beneficial, and how it compares to or complements the visual feedback of the virtual hand (\secref{vhar_haptics}).

In \chapref{visual_hand}, we will investigate the most common visual feedback of the virtual hand as an augmentation of the real hand in \AR.
We will compare these visual hand augmentations in virtual object manipulation tasks performed directly with the bare hand.
Then, in \chapref{visuo_haptic_hand}, we will evaluate five common delocalized positions of vibrotactile feedback, combined with two contact rendering techniques and visual hand augmentations.
We will compare these visuo-haptic augmentations of the hand in the same direct hand manipulation tasks of virtual objects in \AR.
To create the illusion of touching a pattern with a fixed spatial period, the frequency of the vibrotactile signal must be adapted in real time to the velocity of the finger.
Previous work either used mechanical systems to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
However, this method has not yet been integrated in an \AR headset context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.

In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip).
The visuo-haptic augmentations rendered with this design allow a user to \textbf{see the textures from any angle} and \textbf{explore them freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based pose estimation.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
The goal of this design is to enable new \AR applications capable of augmenting real objects with virtual visuo-haptic textures in a portable, on-demand manner, without impairing the user's interaction with the \RE.
\comans{SJ}{The rationale behind the proposed design is not provided. Since there are multiple ways to implement mechanically transparent haptic devices, the thesis should at least clarify why this design is considered optimal for a specific purpose at this stage.}{This has been better explained in the introduction.}

\noindentskip The contributions of this chapter are:
\begin{itemize}
First, the coordinate system of the headset is manually aligned with that of the camera.
This resulted in a \qty{\pm .5}{\cm} spatial alignment error between the \RE and the \VE.
While this was sufficient for our use cases, other methods can achieve better accuracy if needed \cite{grubert2018survey}.
Registering the coordinate systems of the camera and the headset thus allows the marker poses estimated with the camera to be used to display, in the headset, the virtual models aligned with their real-world counterparts.
\comans{JG}{The registration process between the external camera, the finger, surface and HoloLens could have been described in more detail. Specifically, it could have been described clearer how the HoloLens coordinate system was aligned (e.g., by also tracking the fiducials on the surface and or finger).}{This has been better described.}

An additional calibration is performed to compensate for the offset between the finger contact point and the estimated marker pose \cite{son2022effect}.
The user then places the index finger on the origin point; the poses of both the finger and the origin point are known from their attached fiducial markers.
The transformation between the finger's marker pose and the finger contact point can then be estimated and compensated with an inverse transformation.
This allows detecting whether the calibrated real finger touches a virtual texture, using a collision detection algorithm (Nvidia PhysX).
\comans{JG}{A description if and how the offset between the lower side of the fingertip touching the surface and the fiducial mounted on the top of the finger was calibrated / compensated is missing}{This has been better described.}

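The offset compensation can be expressed with homogeneous transforms; a minimal sketch under assumed frame names (the poses are hypothetical, and the real system estimates them from the fiducial markers):

```python
import numpy as np

def invert_homogeneous(T):
    """Invert a 4x4 rigid transform [R | t]."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def calibrate_offset(T_world_marker, T_world_origin):
    """Offset from the finger marker to the contact point, estimated once
    while the fingertip rests on the known origin point."""
    return invert_homogeneous(T_world_marker) @ T_world_origin

def contact_pose(T_world_marker, T_marker_contact):
    """At runtime, apply the calibrated offset to the tracked marker pose."""
    return T_world_marker @ T_marker_contact
```

Once calibrated, the contact point follows the tracked marker rigidly, so collision detection can be run against the fingertip rather than the marker on top of the finger.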
In our implementation, the \VE is designed with Unity (v2021.1) and the Mixed Reality Toolkit (v2.7)\footnoteurl{https://learn.microsoft.com/windows/mixed-reality/mrtk-unity}.
The visual rendering is achieved using the Microsoft HoloLens~2, an \OST-\AR headset with a \qtyproduct{43 x 29}{\degree} \FoV, a \qty{60}{\Hz} refresh rate, and self-localisation capabilities.
\section{Conclusion}
\label{conclusion}
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
Directly touched with the fingertip, the perceived roughness of the surface can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based pose estimation of the finger and paired it with an \OST-\AR headset.
It also allows a free exploration of the textures, as if they were real.
The visual latency we measured is typical of \AR systems, and the haptic latency is below the perceptual detection threshold for vibrotactile rendering.
This system forms the basis of the apparatus for the user studies presented in the next two chapters, which evaluate the user perception of these visuo-haptic texture augmentations.

%\noindentskip This work was presented and published at the VRST 2024 conference:
%
%Erwan Normand, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
%\enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}.
%In: \textit{ACM Symposium on Virtual Reality Software and Technology}. Trier, Germany, October 2024. pp. 287--296.
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one}.

In this chapter, we consider simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} with an \OST-\AR headset and wearable vibrotactile feedback.
We investigate how these textures can be perceived in a coherent and realistic manner, and to what extent each sensory modality contributes to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}.
In a \textbf{user study}, 20 participants freely explored, in direct touch, the combinations of the visuo-haptic texture pairs to rate their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.

\section{User Study}
\label{experiment}

\subsection{The textures}
\label{textures}

The 100 visuo-haptic texture pairs of the \HaTT database \cite{culbertson2014one} were preliminarily tested and compared using the apparatus described in \secref{apparatus} to select the most representative textures for the user study.
These texture models were chosen because they are visuo-haptic representations of a wide range of real textures and are publicly available online.
Nine texture pairs were selected (\figref{experiment/textures}) to cover various perceived roughness, from rough to smooth, as named on the database: \level{Metal Mesh}, \level{Sandpaper~100}, \level{Brick~2}, \level{Cork}, \level{Sandpaper~320}, \level{Velcro Hooks}, \level{Plastic Mesh~1}, \level{Terra Cotta}, \level{Coffee Filter}.
All these visual and haptic textures are isotropic: their rendering (appearance or roughness) is the same whatever the direction of the movement on the surface, \ie there are no local deformations (holes, bumps, or breaks).

When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was generated from its model.
The normal force on the texture was assumed to be constant at \qty{1.2}{\N} to generate the audio signal from the model, following \textcite{culbertson2015should}, who found that the \HaTT textures can be rendered using only the speed as input without decreasing their perceived realism.
The rendering of the virtual texture is described in \secref[vhar_system]{texture_generation}.
The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attached to the middle index phalanx of the participant's dominant hand using a Velcro strap, similarly to previous studies \cite{asano2015vibrotactile,friesen2024perceived}.
\begin{subfigs}{setup}{Textures used and experimental setup of the user study. }[][
    \item The nine visuo-haptic textures used in the user study, selected from the \HaTT database \cite{culbertson2014one}.

Participants were first given written instructions about the experimental setup, the tasks, and the procedure of the user study.
Then, after signing an informed consent form, they were asked to sit in front of the table with the experimental setup and to wear the \AR headset.
As the haptic textures generated no audible noise, participants did not wear noise-reduction headphones.
A calibration of both the HoloLens~2 and the finger pose estimation was performed to ensure the correct registration of the visuo-haptic textures and the real finger with the real surfaces, as described in \secref[vhar_system]{virtual_real_registration}.
Finally, participants familiarized themselves with the augmented surface in a \qty{2}{min} training session with textures different from the ones used in the user study.
A total of 9 textures \x 3 repetitions = 27 matching trials were performed per participant.
In the \level{Ranking} task, participants had to rank the haptic textures, the visual textures, and the visuo-haptic texture pairs according to their perceived roughness.
It had one within-subjects factor, \factor{Modality}, with the following levels: \level{Visual}, \level{Haptic}, \level{Visuo-Haptic}.
Each modality level was ranked once per participant, following the fixed order listed above (\secref{procedure}).

\subsection{Participants}
\label{participants}

After each of the two tasks, participants answered the following 7-item Likert-scale questions.
In an open question, participants also commented on their strategy for completing the \level{Matching} task (\enquote{How did you associate the tactile textures with the visual textures?}) and the \level{Ranking} task (\enquote{How did you rank the textures?}).
The results were analyzed using R (v4.4) and the packages \textit{afex} (v1.4), \textit{ARTool} (v0.11), \textit{corrr} (v0.4), \textit{FactoMineR} (v2.11), \textit{lme4} (v1.1), and \textit{performance} (v0.13).
\comans{JG}{I suggest to also report on [...] the software packages used for statistical analysis (this holds also for the subsequent chapters).}{This has been added to all chapters where necessary.}

\figref{results/matching_confusion_matrix} shows the confusion matrix of the \level{Matching} task, \ie the proportion of times each \response{Haptic Texture} was selected in response to the presentation of each \factor{Visual Texture}.
To determine which haptic textures were selected most often, the repetitions of the trials were first aggregated by counting the number of selections per participant for each (\factor{Visual Texture}, \response{Haptic Texture}) pair.
An \ANOVA based on a Poisson regression (no overdispersion was detected) indicated a statistically significant effect of the interaction \factor{Visual Texture} \x \response{Haptic Texture} on the number of selections (\chisqr{64}{180}{414}, \pinf{0.001}).
\comans{JG}{For the two-sample Chi-Squared tests in the matching task, the number of samples reported is 540 due to 20 participants conducting 3 trials for 9 textures each. However, this would only hold true if the repetitions per participant would be independent and not correlated (and then, one could theoretically also run 10 participants with 6 trials each, or 5 participants with 12 trials each). If they are not independent, this would lead to an artificial inflated sample size and Type I error. If the trials are not independent (please double check), I suggest either aggregating data on the participant level or to use alternative models that account for the within-subject correlation (as was done in other chapters).}{Data of the three confusion matrices have been aggregated on the participant level and analyzed using a Poisson regression.}
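The participant-level aggregation described above can be sketched in a few lines; the trial tuples below are made up for illustration, and the actual Poisson regression was run in R.

```python
from collections import Counter

# Hypothetical raw matching trials: (participant, visual texture, selected haptic texture).
trials = [
    (1, "Cork", "Cork"), (1, "Cork", "Brick 2"), (1, "Cork", "Cork"),
    (2, "Cork", "Cork"), (2, "Cork", "Cork"), (2, "Cork", "Terra Cotta"),
]

# One count per (participant, visual, haptic) cell, so the Poisson regression
# models participant-level counts instead of treating correlated repetitions
# as independent samples (the issue raised in the reviewer comment).
counts = Counter((p, v, h) for p, v, h in trials)
```

The resulting counts table is what the Poisson GLM with the \factor{Visual Texture} \x \response{Haptic Texture} interaction is fitted on.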
Post-hoc pairwise comparisons using Tukey's \HSD test then indicated statistically significant differences for the following visual textures:
\begin{itemize}
    \item With \level{Sandpaper~320}, \level{Coffee Filter} was selected more often than the other haptic textures (\ztest{3.4}, \pinf{0.05} each), except \level{Plastic Mesh~1} and \level{Terra Cotta}.
Most of the haptic augmentations of real surfaces using wearable haptic devices have been evaluated without visual augmentation.
Still, it is known that the visual rendering of an object can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the same haptic force-feedback or vibrotactile rendering can be perceived differently in \AR and \VR, probably due to differences in the perceived simultaneity of the visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
In \AR, users can see their own hand touching the surface, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.

In this chapter, we investigate the \textbf{role of the visual feedback of the virtual hand and of the environment (real or virtual) on the perception of a real surface whose haptic roughness is augmented} with wearable vibrotactile haptics.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched.
We evaluated, in a \textbf{user study with psychophysical methods and an extensive questionnaire}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
To control for the influence of the visual rendering, the real surface was not visually augmented and stayed the same in all conditions.

\section{User Study}
\label{experiment}

In a \TIFC task (\secref[related_work]{sensations_perception}), participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (\level{Real}, \figref{experiment/real}), in \AR with a realistic virtual hand superimposed on the real hand (\level{Mixed}, \figref{experiment/mixed}), and in \VR with the same virtual hand as an avatar (\level{Virtual}, \figref{experiment/virtual}).
As vision is an important source of information for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}, the touched surface was a uniform white in order not to influence the perception; only the visual appearance of the hand and of the surrounding environment changed between conditions.

The virtual hand model was a gender-neutral human right hand with realistic skin.
Prior to the experiment, the virtual hand and the \VE were registered to the real hand of the participant and the \RE, respectively, as described in \secref[vhar_system]{virtual_real_registration}.
The size of the virtual hand was also manually adjusted to match the real hand of the participant.
A \qty{\pm .5}{\cm} spatial alignment error and a \qty{160 \pm 30}{\ms} lag (\secref[vhar_system]{virtual_real_registration}) between the real hand and the virtual hand were measured.
\comans{JG}{In addition, the lag between the real and virtual hand in the Mixed condition could have been quantified (e.g. using a camera filming through the headset) to shed more light on the reported differences, as also noted in Section 4.5, as well as the registration error between the real and the virtual hand (as visible in Figure 4.1, Mixed).}{This has been added.}

To ensure the same \FoV in all \factor{Visual Rendering} conditions, a cardboard mask was attached to the \AR headset (\figref{experiment/headset}).
In the \level{Virtual} rendering, the mask had holes only for the sensors, blocking the view of the \RE to simulate a \VR headset.

All pairwise differences were statistically significant.
]
\begin{subfigs}{discrimination_accuracy}{Results of the vibrotactile texture roughness discrimination task. }[][
\item Estimated \PSE of each visual rendering, defined as the amplitude difference at which both reference and comparison textures are perceived to be equivalent.
\item Estimated \JND of each visual rendering.
]
\subfig[0.35]{results/trial_pses}
\subfig[0.35]{results/trial_jnds}
Participants were divided between feeling the vibrations on the surface or on the finger.
{NASA-TLX questions asked to participants after each \factor{Visual Rendering} block of trials.}
[
Questions were bipolar 100-point scales (0~=~Very Low and 100~=~Very High, except for Performance where 0~=~Perfect and 100~=~Failure), with increments of 5.
]
\begin{tabularx}{\linewidth}{l X}
\toprule
The results showed a difference in vibrotactile roughness perception between the three visual rendering conditions.
Given the estimated \PSEs, the textures were on average perceived as \enquote{rougher} in the \level{Real} rendering than in the \level{Virtual} (\percent{-2.8}) and \level{Mixed} (\percent{-6.0}) renderings (\figref{results/trial_pses}).
A \PSE difference in the same range was found for perceived stiffness, with a virtual piston perceived as \enquote{stiffer} in \VR and \enquote{softer} in \AR \cite{gaffary2017ar}.
Surprisingly, the \PSE of the \level{Real} rendering was shifted to the right (perceived as \enquote{rougher}, \percent{7.9}) relative to the reference texture, whereas the \PSEs of the \level{Virtual} (\percent{5.1}) and \level{Mixed} (\percent{1.9}) renderings were closer to the reference texture, \ie perceived as \enquote{smoother} than in \level{Real} (\figref{results/trial_predictions}).
The sensitivity of participants to roughness differences also varied: the \level{Real} rendering had the best \JND (\percent{26}), followed by the \level{Virtual} (\percent{30}) and \level{Mixed} (\percent{33}) renderings (\figref{results/trial_jnds}).
These \JND values are in line with, though at the upper end of, the range reported in previous studies \cite{choi2013vibrotactile}, which may be due to the location of the actuator on the top of the finger middle phalanx, which is less sensitive to vibration than the fingertip.
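These \PSE and \JND estimates come from fitting psychometric functions to the \TIFC responses. As a minimal illustrative sketch (not the study's actual analysis pipeline; the response data below are invented), a cumulative Gaussian can be fitted by maximum likelihood, with the \PSE given by its mean and the \JND by 0.6745 times its standard deviation (the 75\%-correct criterion):

```python
import numpy as np
from math import erf

# Illustrative 2IFC data (invented): comparison-minus-reference amplitude
# difference (in %) and proportion of "comparison rougher" responses.
diffs = np.array([-40, -20, -10, 0, 10, 20, 40], dtype=float)
p_rougher = np.array([0.05, 0.20, 0.35, 0.45, 0.65, 0.80, 0.95])
n_trials = 20                       # trials per comparison level
k = np.round(p_rougher * n_trials)  # "rougher" response counts

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    z = (x - mu) / (sigma * np.sqrt(2.0))
    return 0.5 * (1.0 + np.array([erf(v) for v in z]))

# Maximum-likelihood fit by a simple grid search (dependency-free).
best_ll, pse, sigma = -np.inf, 0.0, 1.0
for mu in np.linspace(-20, 20, 201):
    for s in np.linspace(5, 80, 151):
        p = np.clip(cum_gauss(diffs, mu, s), 1e-9, 1 - 1e-9)
        ll = float(np.sum(k * np.log(p) + (n_trials - k) * np.log(1 - p)))
        if ll > best_ll:
            best_ll, pse, sigma = ll, mu, s

jnd = 0.6745 * sigma  # 75%-correct criterion for a cumulative Gaussian
print(f"PSE = {pse:.1f}%, JND = {jnd:.1f}%")
```

A rightward-shifted \PSE then corresponds directly to a texture that must be rendered with a larger amplitude difference before it feels equivalent to the reference.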
The \level{Mixed} rendering had the lowest \PSE and highest perceived latency.
Our wearable visuo-haptic texture augmentation system, described in \chapref{vhar_system}, aimed to provide coherent visuo-haptic renderings registered with the \RE.
Yet, it involves different sensory interaction loops between the user's movements and the visuo-haptic feedback (\figref[vhar_system]{diagram} and \figref[introduction]{interaction-loop}), which may not feel synchronized with each other or with proprioception.
We therefore hypothesize that the differences in the perception of vibrotactile roughness are due less to the visual rendering of the hand or the environment and their associated differences in exploration behaviour than to differences in the \emph{perceived} latency between one's own hand (vision and proprioception) and the virtual hand (vision and haptics).
The perceived delay was greatest in \AR, where the virtual hand visually lags significantly behind the real one, and smaller in \VR, where only the proprioceptive sense can help detect the lag.
This delay was not perceived when touching the virtual haptic textures without visual augmentation: only the finger velocity was used to render them and, despite the varied finger movements and velocities while exploring the textures, participants did not report any latency in the vibrotactile rendering (\secref{results_questions}).
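A velocity-driven patterned texture of this kind can be sketched as follows (an illustrative model, not the exact renderer of the system; the grating period and amplitude are invented): a sinusoidal grating of spatial period $\lambda$ scanned at finger speed $v$ produces a vibration of temporal frequency $f = v/\lambda$, and accumulating the phase keeps the signal continuous when the speed changes.

```python
import numpy as np

def grating_waveform(speeds, dt=0.001, spatial_period=0.002, amplitude=1.0):
    """Synthesize a vibrotactile patterned-texture signal from finger speed.

    A sinusoidal grating of spatial period lambda scanned at speed v
    produces a vibration of temporal frequency f = v / lambda; the phase
    is accumulated sample by sample so that changes in speed do not
    introduce discontinuities (clicks) into the signal.
    """
    speeds = np.asarray(speeds, dtype=float)
    freqs = speeds / spatial_period              # instantaneous frequency (Hz)
    phase = 2.0 * np.pi * np.cumsum(freqs) * dt  # accumulated phase (rad)
    return amplitude * np.sin(phase)

# Example: finger accelerating from 0 to 0.2 m/s over one second of
# exploration; with a 2 mm grating this sweeps the vibration up to 100 Hz.
t = np.arange(0.0, 1.0, 0.001)
signal = grating_waveform(np.linspace(0.0, 0.2, t.size))
```

Because such a signal depends only on the instantaneous finger speed, it introduces no perceptible lag of its own, which is consistent with the absence of reported latency in the \level{Real} rendering.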
Similarly, \textcite{diluca2011effects} demonstrated in a \VST-\AR setup how visual latency relative to proprioception increased the perceived stiffness of a virtual piston, while haptic latency decreased it (\secref[related_work]{ar_vr_haptic}).
Another complementary explanation could be a pseudo-haptic effect (\secref[related_work]{visual_haptic_influence}) of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering displayed on a screen \cite{ujitoko2019modulating}.
Such hypotheses could be tested by manipulating the latency and pose estimation accuracy of the virtual hand or the vibrotactile feedback.
In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual feedback of the hand and the environment, which were either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
With an \OST-\AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
Similarly, the sensitivity to differences in roughness was better with the real hand alone.
Exploration behaviour was also slower in \VR than with the real hand alone, although the subjective evaluation of the texture was not affected.
We hypothesized that this difference in perception was due to the \emph{perceived latency} between the finger movements and the different visual, haptic and proprioceptive feedbacks, which were the same in all visual renderings but were more noticeable in \AR and \VR than without visual augmentation.
This study suggests that attention should be paid to the respective latencies of the visual and haptic sensory feedbacks inherent in such systems and, more importantly, to \emph{the perception of their possible asynchrony}.
Latencies should be measured \cite{friston2014measuring}, minimized to an acceptable level for users and kept synchronized with each other \cite{diluca2019perceptual}.
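One simple way to quantify such a latency offline, sketched below under the assumption that synchronized position logs of the real and virtual hand are available (the function name and signal values are illustrative, not the measurement protocol of \cite{friston2014measuring}), is to take the lag that maximizes the cross-correlation between the two trajectories:

```python
import numpy as np

def estimate_latency(real_pos, virtual_pos, dt):
    """Estimate visual latency as the lag (in seconds) maximizing the
    cross-correlation between the real-hand trajectory and its virtual
    counterpart; a positive value means the virtual hand lags."""
    real = np.asarray(real_pos, dtype=float)
    virt = np.asarray(virtual_pos, dtype=float)
    real = real - real.mean()
    virt = virt - virt.mean()
    corr = np.correlate(virt, real, mode="full")
    lag = int(np.argmax(corr)) - (len(real) - 1)
    return lag * dt

# Synthetic check: a 1 Hz finger oscillation logged at 200 Hz, with the
# virtual hand rendered 160 ms late (the order of magnitude measured here).
dt = 0.005
t = np.arange(0.0, 5.0, dt)
real = np.sin(2.0 * np.pi * t)
virtual = np.roll(real, int(0.160 / dt))
print(estimate_latency(real, virtual, dt))  # ~0.16 s
```

The same estimate can be repeated per axis or per session to verify that the visual and haptic loops stay synchronized with each other over time.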
It also seems that the visual appearance of the hand or the environment in itself has little effect on the perception of haptic feedback, but that the degree of visual virtuality can affect the perceived asynchrony between the sensory feedbacks, even though the latencies remain identical.
When designing wearable haptics or integrating them into \AR/\VR, it seems important to test their perception in real (\RE), augmented (\AE) and virtual (\VE) environments.
In the next chapter, we present a second user study in which we investigate the perception of simultaneous and co-localised visual and haptic texture augmentation.
We will use the same system presented in \chapref{vhar_system} and a visual rendering condition similar to the \level{Real} condition of this study, in \AR without the virtual hand overlay.
Participants also rated each visual hand augmentation individually on six questions:
\item \response{Difficulty}: How difficult were the tasks?
\item \response{Fatigue}: How fatiguing (mentally and physically) were the tasks?
\item \response{Precision}: How precise were you in performing the tasks?
\item \response{Performance}: How successful were you in performing the tasks?
\item \response{Efficiency}: How fast/efficient do you think you were in performing the tasks?
\item \response{Rating}: How much do you like each visual hand?
\end{itemize}
a \LMM \ANOVA with by-participant random intercepts and random slopes for \factor{Hand} indicated statistically significant effects of \factor{Hand}
and \factor{Target} (\anova{7}{3270}{4.1}, \pinf{0.001}).
It was shorter with \level{None} than with \level{Occlusion} (\pinf{0.001}), \level{Contour} (\pinf{0.001}), \level{Skeleton} (\pinf{0.001}) and \level{Mesh} (\pinf{0.001}).
This result is evidence of participants' lack of confidence without any visual hand augmentation: they grasped the cube more firmly to secure it.
The \response{Grip Aperture} was longer on the right-front (\level{RF}) target volume than on the back and side targets (\level{R}, \level{RB}, \level{B}, \level{L}; \p{0.03}), indicating a higher confidence.
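The \LMM \ANOVA analyses reported throughout these chapters can be illustrated with a minimal sketch (using Python's statsmodels on synthetic data; the participant count, factor levels and effect sizes below are invented, and the study's actual analysis pipeline may differ):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data in the spirit of the design (all names and effect sizes
# are invented): 20 participants x 4 hand renderings x 10 trials, with
# per-participant baselines captured by a random intercept.
rng = np.random.default_rng(0)
effects = {"Mesh": 0.2, "None": 0.8, "Occlusion": 0.4, "Skeleton": 0.0}
rows = []
for pid in range(20):
    baseline = rng.normal(5.0, 0.5)  # participant-specific intercept
    for rendering, effect in effects.items():
        for _ in range(10):
            rows.append({"participant": pid, "rendering": rendering,
                         "time": baseline + effect + rng.normal(0.0, 0.3)})
df = pd.DataFrame(rows)

# Linear mixed model: fixed effect of the hand rendering,
# by-participant random intercept (as in the reported LMM ANOVAs).
fit = smf.mixedlm("time ~ C(rendering)", df, groups=df["participant"]).fit()
print(fit.summary())
```

The random intercept absorbs stable between-participant differences, so the fixed-effect contrasts between renderings are tested against within-participant variability only.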
We evaluated six visual hand augmentations, as described in \secref{hands}, displayed on top of the real hand, in two virtual object manipulation tasks in \AR.
During the \level{Push} task, the \level{Skeleton} hand rendering was the fastest (\figref{results/Push-CompletionTime}), as participants employed fewer and longer contacts to adjust the cube inside the target volume (\figref{results/Push-ContactsCount} and \figref{results/Push-MeanContactTime}).
However, during the \level{Grasp} task, despite no difference in \response{Completion Time}, providing no visible hand rendering (\level{None} and \level{Occlusion} renderings) led to more failed grasps or cube drops (\figref{results/Grasp-ContactsCount} and \figref{results/Grasp-MeanContactTime}).
Indeed, participants found the \level{None} and \level{Occlusion} renderings less effective (\figref{results/Ranks-Grasp}) and less precise (\figref{results_questions}).
To understand whether the participants' previous experience might have played a role, we also carried out a complementary statistical analysis considering \VR experience as a between-subjects factor, \ie \VR novices vs.\ \VR experts (\enquote{I use it every week}, see \secref{participants}).
This is consistent with similar manipulation studies in \VR and in \VST-\AR setups.
This study suggests that a \ThreeD visual hand augmentation is important in \OST-\AR when interacting with a virtual hand technique, particularly when it involves precise finger movements in relation to virtual content, \eg \ThreeD windows, buttons and sliders, or more complex tasks, such as stacking or assembly.
A minimal but detailed rendering of the virtual hand that does not hide the real hand, such as the skeleton rendering we evaluated, seems to be the best compromise between the richness and effectiveness of the feedback.
In addition to visual augmentation of the hand, direct manipulation of virtual objects with the hand can also benefit from wearable haptic feedback.
In the next chapter, we explore two wearable vibrotactile contact feedback devices in a user study, positioned at four locations on the hand so as not to cover the fingertips.
We evaluate their effect on user performance and experience in the same manipulation tasks as in this chapter, with the best visual hand augmentation found in this study.
On the time to complete a trial,
a \LMM \ANOVA with by-participant random intercepts indicated two statistically significant effects:
\factor{Positioning} (\anova{4}{2341}{3.6}, \p{0.007}, see \figref{results/Push-CompletionTime-Location-Overall-Means})
and \factor{Target} (\anova{1}{1990}{3.9}, \p{0.05}).
\level{Fingertips} was slower than \level{Proximal} (\percent{+11}, \p{0.01}) or \level{Opposite} (\percent{+12}, \p{0.03}).
There was no evidence of an advantage of \level{Proximal} or \level{Opposite} over \level{Nowhere}, nor of a disadvantage of \level{Fingertips} relative to \level{Nowhere}.
This could indicate more difficulty in adjusting the virtual cube inside the target volume.
On the mean time spent on each contact,
a \LMM \ANOVA with by-participant random intercepts indicated two statistically significant effects of
\factor{Positioning} (\anova{4}{1990}{11.5}, \pinf{0.001}, see \figref{results/Push-TimePerContact-Location-Overall-Means})
and of \factor{Hand} (\anova{1}{1990}{16.1}, \pinf{0.001}, see \figref{results/Push-TimePerContact-Hand-Overall-Means})
but not of the \factor{Positioning} \x \factor{Hand} interaction.
It was shorter with \level{Fingertips} than with \level{Wrist} (\percent{-15}, \pinf{0.001}), \level{Opposite} (\percent{-11}, \p{0.01}), or \level{Nowhere} (\percent{-15}, \pinf{0.001});
and shorter with \level{Proximal} than with \level{Wrist} (\percent{-16}, \pinf{0.001}), \level{Opposite} (\percent{-12}, \p{0.005}), or \level{Nowhere} (\percent{-16}, \pinf{0.001}).
and \level{LF} was faster than \level{RB} (\p{0.03}).
On the number of contacts,
a \LMM \ANOVA with by-participant random intercepts indicated two statistically significant effects:
\factor{Positioning} (\anova{4}{3990}{15.1}, \pinf{0.001}, see \figref{results/Grasp-Contacts-Location-Overall-Means})
and \factor{Target} (\anova{3}{3990}{7.6}, \pinf{0.001}).
Fewer contacts were made with \level{Opposite} than with \level{Fingertips} (\percent{-26}, \pinf{0.001}), \level{Proximal} (\percent{-17}, \pinf{0.001}), or \level{Wrist} (\percent{-12}, \p{0.002});
but more with \level{Fingertips} than with \level{Wrist} (\percent{+13}, \p{0.002}) or \level{Nowhere} (\percent{+17}, \pinf{0.001}).
Yet, a wrist-mounted haptic device will be able to provide richer feedback.
It could thus provide more complex feedback about the contacts with the virtual objects.
Finally, we think that the visual hand augmentation complements the haptic contact rendering well by providing continuous feedback on hand tracking.
Such a visual augmentation can be disabled during the grasping phase to avoid redundancy with the haptic feedback of the contact with the virtual object.
\comans{SJ}{Again, it would strengthen the thesis if the authors provided a systematic guideline on how to choose the appropriate haptic feedback or visual augmentation depending on the specific requirements of an application.}{The guidelines paragraph has been expanded in the conclusion.}
\noindentskip The work described in \chapref{visual_hand} and \ref{visuo_haptic_hand} was published in Transactions on Haptics:
We conclude this thesis manuscript by summarizing our contributions and the main findings.
\section{Summary}
In this manuscript, we showed how \OST-\AR headsets and wearable haptics can improve direct hand interaction with virtual and augmented objects.
Wearable haptics can provide rich tactile feedback on virtual objects and augment the perception of real objects, both directly touched by the hand, while preserving freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical and human challenges.
We have structured this thesis around two research axes: \textbf{(I) modifying the visuo-haptic texture perception of real surfaces} and \textbf{(II) improving the manipulation of virtual objects}.
A more robust hand pose estimation system would support wearing haptic devices on the fingers.
The spatial registration error \cite{grubert2018survey} and the temporal latency \cite{diluca2019perceptual} between the \RE and \VE should also be reduced to be imperceptible.
The effect of these spatial and temporal errors on the perception and manipulation of the virtual object should be systematically investigated.
Prediction of hand movements should also be considered to overcome such issues \cite{klein2020predicting,gamage2021predictable}.
\comans{JG}{I [...] also want to highlight the opportunity to study the effect of visual registration error as noted already in chapter 4.}{Sentences along these lines have been added.}
A complementary solution would be to embed tracking sensors in the wearable haptic devices, such as an inertial measurement unit (IMU) or cameras \cite{preechayasomboon2021haplets}.
This would allow a complete portable and wearable visuo-haptic system to be used in practical applications.
In our user study, we assessed the effect of touching a vibrotactile texture augmentation with a real hand or a virtual hand, in \AR or \VR.
To control for the visual feedback, we decided not to display the virtual texture so that participants only saw and touched a uniform white real surface.
The visual information of a texture is as important as the haptic sensations for the perception of roughness, and the interaction between the two to form the overall texture perception is complex \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}.
In particular, it remains to be investigated how the vibrotactile patterned textures we employed can be represented visually in a convincing way, as the visuo-haptic coupling of such virtual patterned textures is not trivial \cite{unger2011roughness}.
\paragraph{Broader Visuo-Haptic Conditions.}
As in the previous chapter, our aim was not to accurately reproduce real textures.
However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white real surfaces.
Visuo-haptic texture augmentation might be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes.
A real surface could indeed be augmented not only to add visuo-haptic textures, but also to amplify, diminish, mask, or replace its existing real texture.
\comans{SJ}{It would be valuable to explore how real texture from a physical surface could be combined with virtual texture, enabling merged, augmented, amplified, or diminished feedback}{This has been better discussed.}
In addition, the visual textures used were simple color images not intended for use in an \ThreeD \VE, and enhancing their visual quality could improve the perception of visuo-haptic texture augmentation.
\comans{JG}{As future work, the effect of visual quality of the rendered textures on texture perception could also be of interest.}{A sentence along these lines has been added.}
It would also be interesting to replicate the experiment in more controlled visuo-haptic environments, in \VR or with world-grounded haptic devices.
This would enable a better understanding of how the rendering quality, spatial registration and latency of virtual textures can affect their perception.
\comans{SJ}{Moreover, if we only consider the experimental findings, the system could likely be recreated using VR or conventional visuo-haptic setups in a more stable manner. It would be beneficial to emphasize how the experiment is closely tied to the specific domain of haptic AR.}{This has been added.}
Finally, the role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D printed object before it is manufactured.
\paragraph{Specificities of Direct Touch.}
The respective importance of these factors for haptic texture perception is not yet established.
It would be interesting to determine the importance of these factors on the perceived realism of virtual vibrotactile textures in the context of bare finger touch.
Finger-based captures of real textures should also be considered \cite{balasubramanian2024sens3}.
Finally, the virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}.
\comans{SJ}{Technical concern: As far as I know, the texture rendering algorithm from [Curbertson et al.] is based on rigid-tool-based interactions. The vibration patterns due to texture in a bare-hand interaction scenario (used in this study) should differ significantly from those produced in rigid-tool interactions. I conduct similar research and am confident that the signals involved in bare-hand interactions are far more complex than those in rigid-tool-based interactions. Therefore, the choice of rendering algorithm could negatively affect the experimental results. This issue is critical and should either be revised or extensively discussed in the thesis.}{This has been discussed more in depth in this section.}
\subsection*{Visual Augmentation of the Hand for Manipulating Virtual Objects in AR}
However, the user's visual perception and experience are different with other types of \AR displays.
In particular, the mutual occlusion problem and the latency of hand pose estimation could be overcome with a \VST-\AR headset.
In this case, the occlusion rendering could be the most natural, realistic and effective augmentation.
Yet, a visual hand augmentation could still be beneficial to users by providing depth cues and feedback on hand tracking, and should be evaluated as such.
\comans{SJ}{According to the results, occlusion is the most natural (in terms of realism) but least efficient for manipulation. In some cases, natural visualization is necessary. It would be beneficial to discuss these cases to help guide AR interaction designers in choosing the most appropriate visualization methods.}{This has been discussed more in depth in this section.}
\paragraph{More Practical Usages.}
These tasks are indeed fundamental building blocks for more complex manipulation tasks.
Such tasks can require users to perform more complex finger movements and interactions with the virtual object.
Depending on the task, the importance of position, orientation and depth information of the hand and the object may vary and affect the choice of visual hand augmentation.
More practical applications should also be considered, such as medical, educational or industrial scenarios, which may have different needs and constraints (\eg, the most natural visual hand augmentation for a medical application, or the easiest to understand and use for an educational context).
\comans{SJ}{The task in the experiment is too basic, making it difficult to generalize the results. There are scenarios where depth information may be more important than position, or where positioning may be more critical than orientation. A systematic categorization and analysis of such cases would add depth to the chapter.}{This has been discussed more in depth in this section.}
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand augmentations received low and high rank rates from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand augmentation according to their preferences or needs, and this should also be evaluated.
It would therefore be interesting to determine which wearable haptic augmentations can best render each haptic property.
Similar user studies could then be conducted, to reproduce as many haptic properties as possible in virtual object discrimination tasks.
These results would enable the design of more universal wearable haptic devices that provide rich haptic feedback that best meets users' needs for interaction in \AR and \VR.
\subsection*{Responsive Visuo-Haptic Augmented Reality}
We reviewed the diversity of \AR and \VR displays and their respective characteristics in rendering (\secref[related_work]{ar_displays}) and in the manipulation of virtual content with the hand (\chapref{visual_hand}).
Methods should also be developed to allow the user to easily adjust the haptic feedback.
Finally, more practical use cases and applications of visuo-haptic \AR should be explored and evaluated.
For example, capturing the visuo-haptic perception of objects and sharing it in a videoconference with a different \AR or \VR setup per participant, or in a medical teleconsultation.
Other examples include projecting and touching a visuo-haptic sample on a real wall for interior design, then switching to \VR to see and touch the complete final result; or manipulating a museum object through a real proxy object in visuo-haptic \AR, or even experiencing it in \VR as it was in the past, in its original state.
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
\enquote{Visuo-Haptic Rendering of the Hand during 3D Manipulation in Augmented Reality}.
In: \textit{IEEE Transactions on Haptics (ToH)}. 27.4 (2024), pp. 2481--2487.
\section*{International Conferences}
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
\enquote{Augmenting the Texture Perception of Tangible Surfaces in Augmented Reality using Vibrotactile Haptic Stimuli}.
In: \textit{EuroHaptics}. Lille, France, July 2024. pp. 469--484.
\noindentskip
\textbf{Erwan Normand}, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
\enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}.
In: \textit{ACM Symposium on Virtual Reality Software and Technology (VRST)}. Trier, Germany, October 2024. pp. 287--296.
\nochaptertoc
\selectlanguage{french}
|
||||
|
||||
Dans ce manuscrit de thèse, nous montrons comment la \emph{réalité augmentée (RA)}, qui intègre un contenu visuel virtuel dans la perception du monde réel, et l'\emph{haptique portable}, qui fournit des sensations tactiles sur la peau, peuvent améliorer les interactions de la main avec des objets virtuels et augmentés.
@@ -33,7 +26,7 @@ Cette technologie est étroitement liée à la \emph{réalité virtuelle (RV)},
La virtualité et l'augmentation peuvent être visuelles ou haptiques.
Ainsi, un dispositif haptique fournit des sensations dites virtuelles à un utilisateur.
Une \textbf{augmentation haptique est la modification de la perception par l'ajout de sensations haptiques virtuelles} d'un objet réel touché par un utilisateur \cite{jeon2015haptic,bhatia2024augmenting}.
Un aspect important de l'illusion de la RA (et de la RV) est la \emph{plausibilité}, c'est-à-dire l'illusion pour un utilisateur que les événements virtuels se produisent vraiment \cite{slater2022separate}. %, même si l'utilisateur sait qu'ils ne sont pas réels.
Un aspect important de l'illusion de la RA (et de la RV) est la \emph{plausibilité}, c'est-à-dire l'illusion pour un utilisateur que les événements virtuels se produisent vraiment \cite{slater2022separate}.

Dans ce contexte, nous définissons un \emph{système de RA} comme l'ensemble des dispositifs matériels (dispositifs d'entrée, capteurs, affichages et dispositifs haptiques) et logiciels (suivi, simulation et rendu) qui permettent à l'utilisateur d'interagir avec l'environnement augmenté.
Les visiocasques de RA sont la technologie d'affichage la plus prometteuse, car ils sont portables, fournissent à l'utilisateur un environnement augmenté \emph{immersif} et laissent les mains libres pour interagir \cite{hertel2021taxonomy}.
@@ -124,7 +117,6 @@ Ce système constitue la base expérimentale pour les deux prochaines études d'
\subsectionstarbookmark{Effet du retour visuel de la main virtuelle sur la perception d'augmentation haptique portable de texture}

La plupart des augmentations haptiques avec de l'haptique portable, comme celles de notre système d'augmentation de textures ci-dessus, ont été étudiées sans retour visuel, ni environnement immersif de RA ou de RV \cite{culbertson2017importance,friesen2024perceived}.
%Plus particulièrement, il n'a été pris en compte l'influence du rendu visuel sur leur perception.
Pourtant, le retour visuel peut modifier la perception de sensations haptiques réelles et virtuelles \cite{schwind2018touch,choi2021augmenting}, et la perception d'un même stimulus d'un dispositif haptique à retour de force peut différer entre RA et RV \cite{diluca2011effects,gaffary2017ar}.

\begin{subfigs}{xr-perception}{Conditions expérimentales de l'étude utilisateur. }[][
@@ -150,7 +142,6 @@ Nous faisons l'hypothèse que cette différence de perception a été causée à

Cette étude suggère qu'il est nécessaire de veiller aux différences de latence entre les boucles de rétroaction sensorielle, visuelle ou haptique, inhérentes à ces systèmes.
Plus important encore, il convient d'estimer \emph{la perception de leur asynchronisme}.
%Nous pensons que ces latences doivent être mesurées, réduites à un niveau acceptable pour les utilisateurs et maintenues perceptuellement synchronisées entre elles.
Il semble que l'aspect visuel de la main ou de l'environnement n'ait eu directement que peu d'effet sur la perception du retour haptique, mais que le retour visuel de la main virtuelle puisse affecter la perception des latences visuo-haptiques, même si elles restent identiques.

Nous étudions ensuite des augmentations simultanées et co-localisées de textures visuelles et haptiques.
@@ -257,8 +248,7 @@ Nous nous sommes concentrés sur le retour vibrotactile, car il est présent dan
Nous avons utilisé en pratique deux moteurs vibrotactiles de type ERM car ils sont les plus compacts et n'affectent pas le suivi de la main \cite{pacchierotti2016hring}, mais ils permettent seulement de contrôler l'amplitude du signal.
Dans une \textbf{étude utilisateur} avec 20 participants, nous avons évalué l'effet des quatre placements avec \textbf{deux techniques de vibration de contact} sur la performance et l'expérience utilisateur.
Les participants ont effectué les deux mêmes tâches de manipulation que dans l'étude sur l'augmentation visuelle de la main (\figref{visuo-haptic-hand-task-grasp-fr}).
Enfin, nous avons comparé ces rendus vibrotactiles avec \textbf{l'augmentation visuelle des phalanges de la main} (\figref{../4-manipulation/visual-hand/figures/method/hands-skeleton}). % établi dans l'étude sur l'augmentation visuelle de la main comme un retour d'information visuo-haptique complémentaire de l'interaction de la main avec les objets virtuels.

Enfin, nous avons comparé ces rendus vibrotactiles avec \textbf{l'augmentation visuelle des phalanges de la main} (\figref{../4-manipulation/visual-hand/figures/method/hands-skeleton}).
Les résultats ont montré que, lorsqu'il était placé à proximité du point de contact, le retour vibrotactile relocalisé de la main améliorait la sensation d'efficacité, de réalisme et d'utilité des participants.
Cependant, le placement le plus éloigné, sur la main opposée, a donné les meilleures performances, même s'il a été peu apprécié : ce placement inhabituel a probablement incité les participants à prêter plus attention au retour haptique et à se concentrer davantage sur la tâche.
La technique de vibration au contact a été suffisante comparée à une technique plus élaborée d'intensité de la vibration en fonction de la force de contact.
@@ -300,7 +290,6 @@ Les participants ont systématiquement identifié et fait correspondre \textbf{l
\noindentskip Nous avons également cherché à améliorer la manipulation d'objets virtuels directement avec la main.
La manipulation d'objets virtuels est une tâche fondamentale dans les systèmes 3D, mais elle reste difficile à effectuer avec la main.
Nous avons alors exploré deux retours sensoriels connus pour améliorer ce type d'interaction, mais non étudiés avec les visiocasques de RA : le retour visuel de la main virtuelle et le retour haptique relocalisé sur la main.
%Notre approche a consisté à concevoir des augmentations visuelles de la main et un retour haptique portable relocalisé, sur la base de la littérature, et à les évaluer dans le cadre d'études sur les utilisateurs.

Nous avons donc d'abord examiné \textbf{(1) l'augmentation visuelle de la main}.
Ce sont des rendus visuels de la main virtuelle, qui fournissent un retour d'information sur le suivi de la main et sur l'interaction avec les objets virtuels.

@@ -1,16 +1,5 @@
% Changes
\usepackage[commentmarkup=footnote]{changes}
\definechangesauthor[name=Jens Grubert, color=Dandelion]{JG}
\definechangesauthor[name=Seokhee Jeon, color=Red]{SJ}
\newcommand{\comans}[3]{%
\comment[id=#1]{%
#2\\%
\textbf{Answer: }#3%
}%
}
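For context, the `\comans` macro removed in this hunk paired a reviewer remark with its answer in a single `changes`-package comment, attributed via the author ids defined above. A minimal usage sketch (the remark and answer texts are illustrative, not from the thesis):

```latex
% Illustrative use of \comans: #1 is the author id declared with
% \definechangesauthor, #2 the reviewer's remark, #3 the answer.
% With commentmarkup=footnote, both are typeset as one footnote comment.
\comans{JG}{Please clarify how latency was measured.}{Clarified in the revised text.}
```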

% Images
\usepackage{graphicx}
\usepackage{graphicx}% Include images
\usepackage{caption}% Point references to the figure not the caption
\usepackage[export]{adjustbox}% For valign in subfigs

@@ -31,15 +20,18 @@
\renewcommand{\floatpagefraction}{0.7}

% Formatting
\usepackage[autostyle]{csquotes}% For quotes
\usepackage[autostyle]{csquotes}% Correctly quote text in different languages
\usepackage[dvipsnames]{xcolor}% More colors
\usepackage{tocloft}% Customise the table of contents
\usepackage{tocbibind}% Add bibliography and lists to the table of contents
\usepackage[nottoc]{tocbibind}% Add bibliography and lists to the table of contents, and hide the table of contents itself

% Footnotes
\usepackage[hang]{footmisc}
\usepackage[hang]{footmisc}% hang: no indentation for footnotes
\setlength{\footnotemargin}{3mm}% Margin between footnote number and text

\NewCommandCopy{\oldfootnote}{\footnote}
\renewcommand{\footnote}{\ifhmode\unskip\fi\oldfootnote}% Remove space before footnote

% Less hbox/vbox badness messages
\hfuzz=20pt
\vfuzz=20pt

@@ -32,9 +32,7 @@

\frontmatter
\import{0-front}{cover}
\pdfbookmark[chapter]{List of changes}{changes}
\listofchanges
%\importchapter{0-front}{acknowledgement}
\importchapter{0-front}{acknowledgement}
\importchapter{0-front}{contents}

\mainmatter

1
utils/.gitignore
vendored
@@ -146,7 +146,6 @@ acs-*.bib

# knitr
*-concordance.tex
# TODO Uncomment the next line if you use knitr and want to ignore its generated tikz files
# *.tikz
*-tikzDictionary

@@ -6,7 +6,7 @@
[subrepo]
remote = git@git.nortelli.fr:whar-packages/latex-utils.git
branch = main
commit = eb2bf7a751f36bcd487fa8d6d186c7e8249187dc
parent = 1a588e02480c5d4ecf8aa8562be461a89cc311b5
commit = 89f44bd6e060110cf233d2632c644067d91780bc
parent = 58161d561f8d80f744871af41c31c16b3007a7d1
method = rebase
cmdver = 0.4.6

@@ -93,6 +93,7 @@
}

%% Footnotes
\newcommand{\footnotemarkrepeat}{\footnotemark[\value{footnote}]}% Repeat the last footnote mark
\newcommand{\footnoteurl}[1]{\footnote{\ \url{#1}}}

%% Lists
@@ -157,23 +158,32 @@
\newcommand{\figref}[1]{Fig.~\ref{fig:#1}}
\newcommand{\secref}[1]{Sec.~\ref{sec:#1}}
\newcommand{\tabref}[1]{Table~\ref{tab:#1}}
\newcommand{\footnotemarkrepeat}{\footnotemark[\value{footnote}]} % Repeat the last footnote mark

%% Structure
\newcommand{\chapterstartoc}[1]{
\chapter*{#1}\addcontentsline{toc}{chapter}{#1}}
\newcommand{\chapterstarbookmark}[1]{\pdfbookmark[chapter]{#1}{#1}
\chapter*{#1}}
\newcommand{\chapternotoc}[1]{%
\begingroup%
\renewcommand{\addtocontents}[2]{}%
\chapter*{#1}%
\endgroup%
}
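As a usage note, the new `\chapternotoc` locally redefines `\addtocontents` to a no-op inside a group, so any ToC entries that would be written while the starred chapter heading is typeset are silently dropped; the group then restores normal behaviour. A minimal sketch, using the acknowledgement chapter from the front matter as the example:

```latex
% Illustrative: an unnumbered chapter heading with no table-of-contents
% entry; \addtocontents is disabled only inside \chapternotoc's group.
\chapternotoc{Remerciements}
```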

\newcommand{\sectionstartoc}[1]{
\section*{#1}\addcontentsline{toc}{section}{#1}}
\newcommand{\sectionstarbookmark}[1]{\pdfbookmark[section]{#1}{#1}
\section*{#1}}
\newcommand{\sectionstartoc}[1]{%
\section*{#1}%
\addcontentsline{toc}{section}{#1}%
}
\newcommand{\sectionstarbookmark}[1]{%
\pdfbookmark[section]{#1}{#1}%
\section*{#1}%
}

\newcommand{\subsectionstartoc}[1]{
\subsection*{#1}\addcontentsline{toc}{subsection}{#1}}
\newcommand{\subsectionstarbookmark}[1]{\pdfbookmark[subsection]{#1}{#1}
\subsection*{#1}}
\newcommand{\subsectionstartoc}[1]{%
\subsection*{#1}%
\addcontentsline{toc}{subsection}{#1}%
}
\newcommand{\subsectionstarbookmark}[1]{%
\pdfbookmark[subsection]{#1}{#1}%
\subsection*{#1}%
}

\newcommand{\subsubsubsection}[1]{\paragraph*{\textit{#1}}}
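To illustrate the helpers above (the rewrite mainly adds `%` guards against spurious spaces): the `*startoc` variants produce an unnumbered heading that still appears in the ToC, while the `*starbookmark` variants produce only a PDF bookmark. The heading titles below are illustrative examples, not taken from the thesis source:

```latex
% Illustrative: unnumbered headings with different ToC/bookmark behaviour.
\sectionstartoc{Publications}        % unnumbered, listed in the ToC
\sectionstarbookmark{Side Notes}     % unnumbered, PDF bookmark only
\subsubsubsection{Apparatus}         % italic run-in heading via \paragraph*
```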