\chapter{Introduction}
|
|
\mainlabel{introduction}
|
|
|
|
This thesis presents research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and \WH devices.
|
|
|
|
|
|
\section{Visual and Tactile Object Augmentations}
|
|
\label{visuo_haptic_augmentations}
|
|
|
|
\subsectionstarbookmark{Everyday Interaction with Everyday Objects}
|
|
|
|
In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
|
|
%
|
|
Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\autocite{baumgartner2013visual}.
|
|
%
|
|
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg stiffness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
|
|
%
|
|
Information from different sensory sources may be complementary, redundant or contradictory~\autocite{ernst2004merging}.
|
|
%
|
|
This is why we sometimes want to touch an object to check a property we have seen, and to compare or confront our visual and tactile sensations.
|
|
%
|
|
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
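This integration is classically modeled as statistically optimal (maximum-likelihood) cue combination~\autocite{ernst2002humans}: the unified estimate $\hat{s}$ of an object property is a weighted average of the visual and haptic estimates $\hat{s}_v$ and $\hat{s}_h$, each weighted by its relative reliability (inverse variance),
\[
\hat{s} = w_v \hat{s}_v + w_h \hat{s}_h,
\qquad
w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\qquad
w_h = \frac{1/\sigma_h^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\]
so that the more reliable modality dominates the combined percept.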
|
|
|
|
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
|
|
%
|
|
This is due to the many sensory receptors distributed throughout our hands and body, which can be divided into two modalities: kinesthetic (or proprioceptive) sensations, the forces felt by muscles and tendons, and cutaneous (or tactile) sensations, the pressures, stretches, vibrations and temperatures felt by the skin.
|
|
%
|
|
This rich and complex variety of actions and sensations makes it particularly difficult to artificially recreate the capabilities of touch, for example in virtual or teleoperation environments~\autocite{culbertson2018haptics}.
|
|
|
|
|
|
|
|
|
|
\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}
|
|
|
|
\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch.
|
|
%
|
|
Haptic devices can be categorized according to how they interface with a user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories}.
|
|
%
|
|
Graspable interfaces are the traditional haptic devices that are held in the hand.
|
|
%
|
|
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
|
|
%
|
|
Touchable interfaces are actuated devices that are directly touched and that can dynamically change their shape or surface property, such as stiffness or friction, providing simultaneous kinesthetic and cutaneous feedback.
|
|
%
|
|
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
|
|
%
|
|
Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\autocite{pacchierotti2017wearable}.
|
|
|
|
\begin{subfigs}{haptic-categories}{
|
|
Haptic devices can be classified into three categories according to their interface with the user:
|
|
}[
|
|
\item graspable,
|
|
\item touchable, and
|
|
\item wearable. Figure adapted from \textcite{culbertson2018haptics}.
|
|
]
|
|
\subfig[0.25]{culbertson2018haptics-graspable}
|
|
\subfig[0.25]{culbertson2018haptics-touchable}
|
|
\subfig[0.25]{culbertson2018haptics-wearable}
|
|
\end{subfigs}
|
|
|
|
A wide range of \WH devices have been developed to provide the user with rich virtual haptic sensations, including normal force, skin stretch, vibration and thermal feedback.
|
|
%
|
|
\figref{wearable-haptics} shows some examples of different \WH devices with different form factors and rendering capabilities.
|
|
%
|
|
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions~\autocite{pacchierotti2017wearable,culbertson2018haptics}.
|
|
%
|
|
But their use in combination with \AR has been little explored so far.
|
|
|
|
\begin{subfigs}{wearable-haptics}{
|
|
\WH devices can render sensations on the skin as feedback when touching real or virtual objects.
|
|
}[
|
|
\item Wolverine, a wearable exoskeleton that simulates contact with and grasping of virtual objects through force feedback on the fingers~\autocite{choi2016wolverine}.
|
|
\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\autocite{teng2021touch}.
|
|
\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the fingertip~\autocite{pacchierotti2016hring}.
|
|
\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\autocite{pezent2019tasbi}.
|
|
]
|
|
\subfigsheight{28mm}
|
|
\subfig{choi2016wolverine}
|
|
\subfig{teng2021touch}
|
|
\subfig{pacchierotti2016hring}
|
|
\subfig{pezent2019tasbi}
|
|
\end{subfigs}
|
|
|
|
|
|
\subsectionstarbookmark{Augmented Reality Is Not Only Visual}
|
|
|
|
\AR integrates virtual content into the perception of the real world, creating the illusion of a unique \AE.
|
|
%
|
|
It thus promises natural and seamless interaction, directly with our hands, with physical and digital objects and their combinations.
|
|
%
|
|
It is technically and conceptually closely related to \VR, which replaces the \RE perception with a \VE.
|
|
%
|
|
\AR and \VR can be placed on a \RV continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}\footnote{On the \RV continuum of \textcite{milgram1994taxonomy}, augmented virtuality is also considered, as the incorporation of real objects to a \VE, and is placed between \AR and \VR. For simplicity, we only consider \AR and \VR in this thesis.}.
|
|
%
|
|
It describes the degree of \RV of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies or the \emph{Holodeck} in the \emph{Star Trek} series).
|
|
%
|
|
Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments~\autocite{skarbez2021revisiting}.
|
|
%
|
|
\AR/\VR is most often understood as addressing only the visual sense, and, like haptics, it can take many forms as a user interface.
|
|
%
|
|
The most mature devices are \HMDs, which are portable headsets worn directly on the head, providing the user with an immersive \AE/\VE.
|
|
|
|
\begin{subfigs}{rv-continuums}{Reality-virtuality (\RV) continuums. }[
|
|
\item Original \RV continuum for the visual sense initially proposed by and readapted from \textcite{milgram1994taxonomy}.
|
|
\item Extension of the \RV continuum to include the haptic sense on a second, orthogonal axis, proposed by and readapted from \textcite{jeon2009haptic}.
|
|
]
|
|
\subfig[0.44]{rv-continuum}
|
|
\subfig[0.54]{visuo-haptic-rv-continuum3}
|
|
\end{subfigs}
|
|
|
|
\AR/\VR can also be extended to render sensory modalities other than vision.
|
|
%
|
|
\textcite{jeon2009haptic} proposed extending the \RV continuum to include haptic feedback by decoupling it into two orthogonal axes, one haptic and one visual (see \figref{visuo-haptic-rv-continuum3}).
|
|
%
|
|
The combination of the two axes defines 9 types of \vh environments, with 3 possible levels of \RV for each \v or \h axis: real, augmented and virtual.
|
|
%
|
|
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
|
|
%
|
|
Haptic \AR is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
|
|
%
|
|
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
|
|
%
|
|
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
|
|
%
|
|
\figref{bau2012revel} shows another example of \vh-\AR rendering of texture when running the finger on a tangible surface (middle cell in the two axes in \figref{visuo-haptic-rv-continuum3}).
|
|
|
|
Current \v-\AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching the \VE with the hand.
|
|
%
|
|
All \v-\VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and interact with them with confidence and efficiency.
|
|
%
|
|
It is therefore necessary to provide haptic feedback that is consistent with the \v-\AE and ensures the best possible user experience.
|
|
%
|
|
The integration of \WHs with \AR seems to be one of the most promising solutions, but it remains challenging due to their many respective characteristics and the additional constraints of combining them.
|
|
|
|
\begin{subfigs}{visuo-haptic-environments}{
|
|
Visuo-haptic environments with different degrees of reality-virtuality.
|
|
}[
|
|
\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO~\autocite{kahl2023using}.
|
|
\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO~\autocite{meli2018combining}.
|
|
\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\autocite{salazar2020altering}.
|
|
\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback~\autocite{bau2012revel}.
|
|
]
|
|
\subfigsheight{31mm}
|
|
\subfig{kahl2023using}
|
|
\subfig{meli2018combining}
|
|
\subfig{salazar2020altering}
|
|
\subfig{bau2012revel}
|
|
\end{subfigs}
|
|
|
|
|
|
\section{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
|
|
\label{research_challenges}
|
|
|
|
The integration of \WHs with \AR to create a \vh-\AE is complex and presents many perceptual and interaction challenges, \ie sensing the \AE and acting effectively upon it.
|
|
%
|
|
We are particularly interested in enabling direct contact of virtual and augmented objects with the bare hand.
|
|
%
|
|
Our goal is to enable congruent, intuitive and seamless perception and manipulation of the \vh-\AE.
|
|
|
|
The experience of such a \vh-\AE relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
|
|
%
|
|
The \RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic \VEs.
|
|
%
|
|
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an \AR headset and a \WH device.
|
|
%
|
|
Because the \vh-\VE is displayed in real time, colocalized and aligned with the real one, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the \RE.
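As a minimal illustration of one iteration of this loop (all names are hypothetical, and a real system would run the visual and haptic renderers asynchronously at different rates), a sphere-proxy contact test between the tracked hand and a \VO could drive both feedback channels:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Simplified 3-DoF position of a tracked hand or object (hypothetical)."""
    x: float
    y: float
    z: float


def simulate_contact(hand: Pose, obj: Pose, radius: float) -> bool:
    """Sphere-proxy collision test between the virtual hand and a virtual object."""
    d2 = (hand.x - obj.x) ** 2 + (hand.y - obj.y) ** 2 + (hand.z - obj.z) ** 2
    return d2 <= radius ** 2


def interaction_step(tracked_hand: Pose, virtual_obj: Pose) -> dict:
    """One loop iteration: track -> simulate contact -> render both feedbacks."""
    in_contact = simulate_contact(tracked_hand, virtual_obj, radius=0.05)
    return {
        # visual feedback sent to the AR headset
        "visual": {"hand_pose": tracked_hand, "highlight": in_contact},
        # haptic feedback sent to the wearable actuator
        "haptic": {"vibrate": in_contact},
    }
```

In a real system, each feedback channel would of course involve far more state (meshes, forces, signal envelopes); the sketch only shows how a single simulation step feeds both displays from one tracked pose.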
|
|
|
|
\fig{interaction-loop}{
|
|
The interaction loop between a user and a visuo-haptic augmented environment.
|
|
}[
|
|
One interacts with the visual (in blue) and haptic (in red) virtual environments through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with \VOs.
|
|
The virtual environment is rendered back to the user colocalized with the real one (in gray) using a \v-\AR headset and a \WH device.
|
|
]
|
|
|
|
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of \WHs and augmented reality technologies.
|
|
%
|
|
In this context, we identify two main research challenges that we address in this thesis:
|
|
%
|
|
\begin{enumerate*}[label=(\Roman*)]
|
|
\item providing plausible and coherent visuo-haptic augmentations, and
|
|
\item enabling effective manipulation of the augmented environment.
|
|
\end{enumerate*}
|
|
%
|
|
Each of these challenges also raises numerous design, technical and human issues specific to each of the two types of feedback, \WHs and immersive \AR, as well as multimodal rendering and user experience issues in integrating these two types of sensorimotor feedback into a coherent and seamless \vh-\AE.
|
|
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{interaction-loop}.
|
|
|
|
|
|
\subsectionstarbookmark{Provide Plausible and Coherent Visuo-Haptic Augmentations}
|
|
|
|
Many haptic devices have been designed and evaluated specifically for use in \VR, providing realistic and varied kinesthetic and tactile feedback to \VOs.
|
|
%
|
|
Although closely related, (visual) \AR and \VR have key differences in their respective renderings that can affect user perception.
|
|
|
|
Firstly, the user's hand and \RE are visible in \AR, unlike \VR where there is total control over the visual rendering of the hand and \VE.
|
|
% (unless specifically overlaid with virtual visual content)
|
|
%
|
|
As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\autocite{ujitoko2021survey} or haptic retargeting~\autocite{azmandian2016haptic} effects.
|
|
%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\autocite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing~\autocite{azmandian2016haptic}.
|
|
%
|
|
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
|
|
%
|
|
Indeed, the user's hand must remain free to touch and interact with the \RE.
|
|
%
|
|
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
|
|
%
|
|
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as colocalized, but the virtual haptic feedback is not.
|
|
%
|
|
It remains to be investigated how such potential discrepancies affect the overall perception, in order to design visuo-haptic renderings adapted to \AR.
|
|
|
|
So far, \AR can only add visual and haptic sensations to the user's overall perception of the environment; conversely, it is very difficult to remove sensations.
|
|
%
|
|
These added virtual sensations can therefore be perceived as out of sync or even inconsistent with the sensations of the \RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
|
|
%
|
|
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and to what extent they will conflict or complement each other in the perception of the \AE.
|
|
%
|
|
%Therefore, it remains to be investigated how these three characteristics of using \WHs with \AR affect the perception, especially with visually and haptically augmented objects.
|
|
|
|
|
|
|
|
\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}
|
|
|
|
Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\autocite{kim2018revisiting}, \VR~\autocite{bergstrom2021how} and VEs in general~\autocite{laviola20173d}.
|
|
%
|
|
As the hand is neither occupied nor covered by a haptic device, so as not to impair interaction with the \RE as described in the previous section, one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real.
|
|
%
|
|
Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
|
|
|
|
In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
|
|
%
|
|
Visual \AR is the display of images of the virtual world superimposed on, and synchronized with, the user's current view of the real world.
|
|
%
|
|
But the depth of \VOs is often underestimated~\autocite{peillard2019studying,adams2022depth}, and mutual occlusion between the hand and a \VO, \ie the hand hiding the object or being hidden by it, is often lacking~\autocite{macedo2023occlusion}.
|
|
%
|
|
Finally, as illustrated in \figref{interaction-loop}, interacting with a \VO is an illusion, because in fact the real hand is controlling in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE.
|
|
%
|
|
Therefore, there is inevitably a latency between the movements of the real hand and the resulting movements of the \VO, as well as a spatial shift between the real hand and the virtual hand, whose movements are constrained by the touched \VO~\autocite{prachyabrued2014visual}.
|
|
%
|
|
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force required to grasp and move the object to a desired location.
|
|
|
|
Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a \VO with the hand.
|
|
%
|
|
Yet, it is unclear which type of visual and haptic feedback is best suited to guide \VO manipulation, and whether one, the other, or a combination of the two is most beneficial for users.
|
|
|
|
|
|
|
|
|
|
\section{Approach and Contributions}
|
|
\label{contributions}
|
|
|
|
The aim of this thesis is to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects.
|
|
%
|
|
As described in the Research Challenges section above, providing a convincing, consistent and effective \vh-\AE to a user is complex and raises many issues.
|
|
%
|
|
Our approach is to
|
|
%
|
|
\begin{enumerate*}[label=(\arabic*)]
|
|
\item design immersive and wearable multimodal visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
|
|
\item evaluate in user studies how these renderings affect the perception of and interaction with these objects using psychophysical, performance, and user experience methods.
|
|
\end{enumerate*}
|
|
%
|
|
We consider two main axes of research, each addressing one of the research challenges identified above:
|
|
%
|
|
\begin{enumerate*}[label=(\Roman*)]
|
|
\item modifying the perception of tangible surfaces using visuo-haptic texture augmentations, and
|
|
\item improving the manipulation of virtual objects using visuo-haptic augmentations of the hand.
|
|
\end{enumerate*}
|
|
%
|
|
Our contributions in these two axes are summarized in \figref{contributions}.
|
|
|
|
\fig[0.95]{contributions}{
|
|
Summary of our contributions through the simplified interaction loop.
|
|
}[
|
|
The contributions are represented in dark gray boxes, and the research axes in light green circles.
|
|
In the first axis (I), we design and evaluate the perception of visuo-haptic texture augmentations of tangible surfaces, directly touched by the hand.

The second axis (II) focuses on improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
|
|
]
|
|
|
|
|
|
\subsectionstarbookmark{Modifying the Perception of Tangible Surfaces with Visuo-Haptic Texture Augmentations}
|
|
|
|
% Very short shared motivation of the shared objective of the two contributions
|
|
% Objective + We propose / we consider : (1) ... and (2) ...
|
|
|
|
% Very short abstract of contrib 1
|
|
|
|
% Very short abstract of contrib 2
|
|
|
|
\WH devices have proven effective in modifying the perception of a touched tangible surface, without modifying the tangible object or covering the fingertip, thereby forming a \h-\AE~\autocite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
|
|
%
|
|
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
|
|
%
|
|
%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
|
|
%
|
|
However, wearable \h-\AR has been little explored in combination with \v-\AR, as has the visuo-haptic augmentation of textures.
|
|
%
|
|
Texture is indeed one of the main tactile sensations of a surface material~\autocite{hollins1993perceptual,okamoto2013psychophysical}, is perceived equally well by both sight and touch~\autocite{bergmanntiest2007haptic,baumgartner2013visual}, and is one of the most studied haptic-only (without visual) renderings~\autocite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
|
|
%
|
|
For this first axis of research, we propose to design and evaluate the perception of virtual visuo-haptic textures augmenting tangible surfaces. %, using an immersive \AR headset and a wearable vibrotactile device.
|
|
%
|
|
To this end, we (1) design a system for rendering virtual visuo-haptic texture augmentations, to (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (\AR \vs \VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in \AR.
|
|
|
|
First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\autocite{culbertson2014modeling,asano2015vibrotactile}.
|
|
%
|
|
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
|
|
%
|
|
Thus, our first objective is to design an immersive, real-time system that allows free exploration of visuo-haptic texture augmentations on tangible surfaces with the bare hand.
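As a sketch of such a signal-based rendering (function and parameter names are hypothetical, and real systems model richer finger-texture dynamics), the vibrotactile signal for a sinusoidal grating can be synthesized by mapping the finger speed and the grating's spatial period to a temporal frequency:

```python
import math


def grating_vibration(finger_speed: float, spatial_period: float,
                      duration: float, sample_rate: int = 1000) -> list[float]:
    """Synthesize a vibrotactile signal for a virtual sinusoidal grating.

    The temporal frequency is the finger speed divided by the grating's
    spatial period, so faster strokes yield higher-frequency vibrations.
    """
    freq = finger_speed / spatial_period  # e.g. 0.1 m/s over 2 mm gratings -> 50 Hz
    n_samples = int(duration * sample_rate)
    return [math.sin(2.0 * math.pi * freq * i / sample_rate)
            for i in range(n_samples)]
```

In practice the signal would be streamed sample by sample to the actuator, with the frequency updated continuously from the tracked finger velocity.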
|
|
|
|
Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive \VEs or have considered the influence of the visual rendering on their perception.
|
|
%
|
|
Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\autocite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR~\autocite{diluca2011effects,gaffary2017ar}.
|
|
%
|
|
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.
|
|
|
|
Finally, some visuo-haptic texture databases have been modelled from real texture captures~\autocite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\autocite{culbertson2015should,friesen2024perceived}.
|
|
%
|
|
However, the rendering of these textures in an immersive and natural \vh-\AR using \WHs remains to be investigated.
|
|
%
|
|
Our third objective is to evaluate the perception of simultaneous and co-localized visuo-haptic texture augmentation of tangible surfaces in \AR, directly touched by the hand, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
|
|
|
|
|
|
\subsectionstarbookmark{Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
|
|
|
|
In immersive and wearable \vh-\AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of \VOs with the bare hand.
|
|
%
|
|
However, the intangibility of the \v-\VE, the many display limitations of current \v-\AR systems and \WH devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
|
|
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of \WHs, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
|
|
%
|
|
Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AEs: visual rendering of the hand~\autocite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs~\autocite{lopes2018adding,teng2021touch}.
|
|
%
|
|
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
|
|
%
|
|
We consider (1) the effect of different visual augmentations of the hand as \AR avatars and (2) the effect of combining different visuo-haptic augmentations of the hand.
|
|
|
|
First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR~\autocite{prachyabrued2014visual,grubert2018effects}.
|
|
%
|
|
A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs~\autocite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\autocite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
|
|
%
|
|
But \v-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
|
|
%
|
|
Thus, our fourth objective is to evaluate and compare the effect of different visual hand augmentations on direct manipulation of \VOs in \AR.
|
|
|
|
Finally, as described above, \WHs for \v-\AR rely on moving the haptic actuator away from the fingertips so as not to impair the hand's movements, sensations, and interactions with the \RE.
|
|
%
|
|
Previous works have shown that \WHs providing feedback on the hand's manipulation of \VOs in \AR can significantly improve user performance and experience~\autocite{maisto2017evaluation,meli2018combining}.
|
|
%
|
|
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
|
|
%
|
|
Our last objective is to investigate the role of visuo-haptic augmentations of the hand in manipulating \VOs directly with the hand in \AR.
|
|
|
|
|
|
\section{Thesis Overview}
|
|
\label{thesis_overview}
|
|
|
|
%Present the contributions and structure of the thesis.
|
|
|
|
This thesis is divided into four parts.
|
|
%
|
|
\partref{context} describes the context and background of our research; within it, this first \textit{Introduction} chapter presents the research challenges, objectives, approach, and contributions of this thesis.
|
|
%
|
|
\chapref{related_work} then presents previous work on the perception of and interaction with visual and haptic augmentations using \WHs and \AR, and how they have been combined in \vh-\AEs.
%
Firstly, it gives an overview of how \WHs have been used to enhance touch perception and interaction, with a focus on vibrotactile feedback and haptic textures.
%
It then introduces \AR, how users perceive augmented environments, and how they can interact with them, in particular through the visual rendering of their hand.
%
Finally, it shows how multimodal visuo-haptic feedback has been used in \AR and \VR to alter the perception of tangible objects and to improve the manipulation of \VOs.
%
Next, we address each of our two research axes in a dedicated part.
\bigskip
\partref{perception} describes our contributions to the first axis of research, augmenting the visuo-haptic texture perception of tangible surfaces.
%
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (displayed or hidden) affect the roughness perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
\chapref{xr_perception} details a system for rendering visuo-haptic virtual textures that augment tangible surfaces using an immersive \AR/\VR headset and a wearable vibrotactile device.
%
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, provided to the middle phalanx of the index finger touching the texture using a \VCA.
%
The real hand and environment are tracked using a marker-based technique, and their virtual counterparts are visually rendered using the immersive \OST \AR headset Microsoft HoloLens~2.
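As an illustration of this rendering principle, a sinusoidal grating can be turned into a vibrotactile signal whose frequency follows the classic speed/wavelength relation $f = v / \lambda$: sliding the finger faster, or over a finer grating, yields a higher-frequency vibration. The sketch below is a minimal, hypothetical version of such a synthesis (parameter names and values are illustrative, not the actual implementation of the system):

```python
import numpy as np

def grating_vibration(finger_speed, wavelength=0.002, amplitude=1.0,
                      sample_rate=44100, duration=0.01):
    """Synthesize one short buffer of a sinusoidal grating vibration.

    The vibration frequency follows f = v / lambda: the finger's
    sliding speed (m/s) divided by the grating wavelength (m).
    """
    f = finger_speed / wavelength  # vibration frequency in Hz
    t = np.arange(int(sample_rate * duration)) / sample_rate
    return amplitude * np.sin(2 * np.pi * f * t)

# Sliding at 5 cm/s over a 2 mm grating gives a 25 Hz vibration.
buf = grating_vibration(finger_speed=0.05, wavelength=0.002)
```

In a real-time loop, such a buffer would be regenerated continuously from the tracked fingertip velocity and streamed to the voice-coil actuator.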
\chapref{xr_perception} then presents a first user study using this system.
%
It evaluates how the perception of virtual haptic textures differs in \AR \vs \VR, and when the textures are touched with a virtual hand \vs one's own hand.
%
We use psychophysical methods to measure the users' roughness perception of the virtual textures, and extensive questionnaires to understand how this perception is affected by the visual rendering of the hand and the environment.
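To give a flavor of such psychophysical procedures, a common family of methods adapts the stimulus level to the participant's responses until it converges on a perceptual threshold or point of subjective equality. The sketch below is a minimal, generic 1-up/1-down staircase, shown purely for illustration (the actual procedure, stimulus dimension, and step sizes used in the study are those detailed in \chapref{xr_perception}, not these):

```python
def run_staircase(respond, start=1.0, step=0.1, reversals_needed=6):
    """Minimal 1-up/1-down adaptive staircase (illustrative).

    `respond(level)` returns True if the comparison stimulus is judged
    rougher than the reference at the given `level`. The level goes
    down after a 'rougher' response and up otherwise; the mean of the
    levels at direction reversals estimates the point of subjective
    equality.
    """
    level, last_dir, reversals = start, None, []
    while len(reversals) < reversals_needed:
        direction = -1 if respond(level) else +1
        if last_dir is not None and direction != last_dir:
            reversals.append(level)  # record the reversal level
        level = max(0.0, level + direction * step)
        last_dir = direction
    return sum(reversals) / len(reversals)

# Deterministic observer judging 'rougher' above 0.55: the staircase
# oscillates around that boundary.
pse = run_staircase(lambda lvl: lvl >= 0.55)
```

In practice the responder is a human participant comparing two textures per trial, and several interleaved staircases are typically averaged.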
\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
%
The textures are paired visual and tactile models of real surfaces~\autocite{culbertson2014one}, rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%
Our objective is to assess the perceived realism, plausibility and roughness of the combination of nine representative visuo-haptic texture pairs, and the coherence of their association.
\bigskip
\partref{manipulation} describes our contributions to the second axis of research, improving direct hand manipulation of \VOs with visuo-haptic augmentations of the hand.
%
We evaluate how visual and haptic augmentations of the hand can improve such manipulations.
\chapref{visual_hand} explores in a first user study the effect of six visual hand augmentations that provide contact feedback with the \VO, chosen among the most popular hand renderings in the \AR literature.
%
Using the \OST-\AR headset Microsoft HoloLens~2, the user performance and experience are evaluated in two representative manipulation tasks, \ie push-and-slide and grasp-and-place of a \VO directly with the hand.
\chapref{visuo_haptic_hand} evaluates in a second user study two vibrotactile contact techniques, provided at four different locations on the real hand, as haptic rendering of the hand-object interaction.
%
They are compared to the two most representative visual hand augmentations from the previous study, and the user performance and experience are evaluated within the same \OST-\AR setup and manipulation tasks.
\bigskip
\partref{part:conclusion} finally concludes this thesis and discusses short-term future work and long-term perspectives for each of our contributions and research axes.