Replace \autocite => \cite

2024-09-08 10:52:06 +02:00
parent 0c11bb2668
commit e96888afab
19 changed files with 197 additions and 197 deletions

@@ -11,15 +11,15 @@ This thesis presents research on direct hand interaction with real and virtual e
In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
%
-Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\autocite{baumgartner2013visual}.
+Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\cite{baumgartner2013visual}.
%
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg stiffness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
%
-Information from different sensory sources may be complementary, redundant or contradictory~\autocite{ernst2004merging}.
+Information from different sensory sources may be complementary, redundant or contradictory~\cite{ernst2004merging}.
%
This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations.
%
-We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
+We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\cite{ernst2002humans}.
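The unified percept described above is commonly modelled as statistically optimal cue combination, in which each sense's estimate is weighted by its reliability (inverse variance); this is the model associated with the cited work (ernst2002humans). A minimal sketch, with an illustrative function name and interface that are not from the thesis:

```python
def fuse_estimates(mu_v, var_v, mu_h, var_h):
    """Maximum-likelihood fusion of a visual estimate (mu_v, var_v)
    and a haptic estimate (mu_h, var_h) of the same object property.

    The combined estimate is a reliability-weighted average, and its
    variance is lower than either unimodal variance.
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)  # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_h               # fused estimate
    var = 1.0 / (1.0 / var_v + 1.0 / var_h)            # fused variance
    return mu, var

# Equally reliable vision and touch: the fused size estimate is halfway
# between the two, with halved variance.
mu, var = fuse_estimates(mu_v=10.0, var_v=1.0, mu_h=12.0, var_h=1.0)
```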
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
%
@@ -42,7 +42,7 @@ Touchable interfaces are actuated devices that are directly touched and that can
%
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
%
-Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\autocite{pacchierotti2017wearable}.
+Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\cite{pacchierotti2017wearable}.
\begin{subfigs}{haptic-categories}{
Haptic devices can be classified into three categories according to their interface with the user:
@@ -60,17 +60,17 @@ A wide range of \WH devices have been developed to provide the user with rich vi
%
\figref{wearable-haptics} shows some examples of different \WH devices with different form factors and rendering capabilities.
%
-Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions~\autocite{pacchierotti2017wearable,culbertson2018haptics}.
+Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions~\cite{pacchierotti2017wearable,culbertson2018haptics}.
%
But their use in combination with \AR has been little explored so far.
\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.
}[
-\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers~\autocite{choi2016wolverine}.
-\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\autocite{teng2021touch}.
-\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger~\autocite{pacchierotti2016hring}.
-\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\autocite{pezent2019tasbi}.
+\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers~\cite{choi2016wolverine}.
+\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\cite{teng2021touch}.
+\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger~\cite{pacchierotti2016hring}.
+\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\cite{pezent2019tasbi}.
]
\subfigsheight{28mm}
\subfig{choi2016wolverine}
@@ -92,7 +92,7 @@ It is technically and conceptually closely related to \VR, which replaces the \R
%
It describes the degree of \RV of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
%
-Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments~\autocite{skarbez2021revisiting}.
+Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments~\cite{skarbez2021revisiting}.
%
\AR/\VR is most often understood as addressing only the visual sense and, as with haptics, it can take many forms as a user interface.
%
@@ -114,7 +114,7 @@ The combination of the two axes defines 9 types of \vh environments, with 3 poss
%
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
-Haptic \AR is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
+Haptic \AR is then the combination of real and virtual haptic stimuli~\cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
%
@@ -133,10 +133,10 @@ The integration of \WHs with \AR seems to be one of the most promising solutions
\begin{subfigs}{visuo-haptic-environments}{
Visuo-haptic environments with different degrees of reality-virtuality.
}[
-\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO~\autocite{kahl2023using}.
-\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO~\autocite{meli2018combining}.
-\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\autocite{salazar2020altering}.
-\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback~\autocite{bau2012revel}.
+\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO~\cite{kahl2023using}.
+\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO~\cite{meli2018combining}.
+\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\cite{salazar2020altering}.
+\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback~\cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}
@@ -192,14 +192,14 @@ Although closely related, (visual) \AR and \VR have key differences in their res
Firstly, the user's hand and \RE are visible in \AR, unlike \VR where there is total control over the visual rendering of the hand and \VE.
% (unless specifically overlaid with virtual visual content)
%
-As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\autocite{ujitoko2021survey} or haptic retargeting~\autocite{azmandian2016haptic} effects.
-%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\autocite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing~\autocite{azmandian2016haptic}.
+As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\cite{ujitoko2021survey} or haptic retargeting~\cite{azmandian2016haptic} effects.
+%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\cite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing~\cite{azmandian2016haptic}.
%
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
%
The user's hand must indeed be free to touch and interact with the \RE.
%
-It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
+It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\cite{asano2015vibrotactile,salazar2020altering} or the wrist~\cite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as colocalised, but the virtual haptic feedback is not.
%
@@ -217,21 +217,21 @@ It is therefore unclear to what extent the real and virtual visuo-haptic sensati
\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}
-Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\autocite{kim2018revisiting}, \VR~\autocite{bergstrom2021how} and VEs in general~\autocite{laviola20173d}.
+Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\cite{kim2018revisiting}, \VR~\cite{bergstrom2021how} and VEs in general~\cite{laviola20173d}.
%
As the hand is not occupied or covered with a haptic device, so as not to impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the virtual content with the hand as if it were real.
%
-Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}.
+Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback~\cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
Visual \AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
%
-But the depth perception of the \VOs is often underestimated~\autocite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a \VO, \ie that the hand can hide the object or be hidden by the object~\autocite{macedo2023occlusion}.
+But the depth perception of the \VOs is often underestimated~\cite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a \VO, \ie that the hand can hide the object or be hidden by the object~\cite{macedo2023occlusion}.
%
Finally, as illustrated in \figref{interaction-loop}, interacting with a \VO is an illusion: in fact, the real hand controls a virtual hand in real time, like an avatar, whose contacts with \VOs are then simulated in the \VE.
%
-Therefore, there is inevitably a latency delay between the real hand's movements and the \VO's return movements, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the touched \VO~\autocite{prachyabrued2014visual}.
+Therefore, there is inevitably a latency delay between the real hand's movements and the \VO's return movements, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the touched \VO~\cite{prachyabrued2014visual}.
%
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force required to grasp and move the object to a desired location.
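The loop described above (real hand driving a delayed virtual hand whose position is constrained to the touched \VO) can be sketched in one dimension. This is a hypothetical minimal model, not the thesis's implementation: tracking latency is approximated by a fixed frame delay, and penetration into a virtual surface is clamped in the style of a "god-object" proxy.

```python
from collections import deque

class VirtualHand:
    """1-D sketch of the real-to-virtual hand interaction loop.

    Assumptions (illustrative, not from the thesis): latency is a fixed
    number of tracking frames; the virtual object is a wall at
    x = surface_x, and the virtual hand is clamped at its surface,
    producing the real/virtual spatial shift discussed above.
    """

    def __init__(self, surface_x, latency_frames):
        self.surface_x = surface_x
        # Pre-fill with rest positions so early frames read a delayed 0.0.
        self.buffer = deque([0.0] * latency_frames, maxlen=latency_frames + 1)

    def update(self, real_x):
        self.buffer.append(real_x)            # newest tracked position
        delayed = self.buffer[0]              # position latency_frames ago
        return min(delayed, self.surface_x)   # clamp at the object surface

# The real hand moves 1 unit per frame towards a wall at x = 5 with
# 3 frames of latency: the virtual hand lags behind, then sticks to
# the surface while the real hand keeps penetrating.
vh = VirtualHand(surface_x=5.0, latency_frames=3)
outs = [vh.update(float(x)) for x in range(10)]
```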
@@ -283,7 +283,7 @@ Our contributions in these two axes are summarized in \figref{contributions}.
% Very short abstract of contrib 2
-\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a \h-\AE~\autocite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
+\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a \h-\AE~\cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
@@ -297,7 +297,7 @@ For this first axis of research, we propose to design and evaluate the perceptio
%
To this end, we (1) design a system for rendering virtual visuo-haptic texture augmentations, to (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (\AR \vs \VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in \AR.
-First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\autocite{culbertson2014modeling,asano2015vibrotactile}.
+First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\cite{culbertson2014modeling,asano2015vibrotactile}.
%
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
%
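The speed-dependent vibrotactile synthesis mentioned above can be sketched as follows. This is a deliberately simplified stand-in: the cited works fit data-driven models to recorded accelerations, whereas here the texture is assumed to be a periodic grating whose vibration frequency is scan speed divided by spatial period, with amplitude growing with speed.

```python
import numpy as np

def texture_vibration(speeds, fs=1000.0, grating_period_mm=1.0, amp_scale=0.1):
    """Synthesize a vibrotactile waveform from per-sample finger speeds.

    speeds: finger scan speed in mm/s, one value per output sample.
    fs: haptic rendering rate in Hz (hypothetical actuator update rate).
    A sliding finger on a grating of spatial period p produces a vibration
    of instantaneous frequency speed / p; the phase is the integral of
    that frequency, and the amplitude here simply scales with speed.
    """
    speeds = np.asarray(speeds, dtype=float)
    freqs = speeds / grating_period_mm            # instantaneous Hz
    phase = 2.0 * np.pi * np.cumsum(freqs) / fs   # integrate frequency
    return amp_scale * speeds * np.sin(phase)

# One second of finger motion accelerating from 0 to 100 mm/s:
# the output chirps upward in both frequency and amplitude.
t = np.linspace(0.0, 1.0, 1000)
signal = texture_vibration(100.0 * t)
```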
@@ -305,11 +305,11 @@ Thus, our first objective is to design an immersive, real time system that allow
Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive \VEs or have considered the influence of the visual rendering on their perception.
%
-Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\autocite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR~\autocite{diluca2011effects,gaffary2017ar}.
+Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\cite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR~\cite{diluca2011effects,gaffary2017ar}.
%
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.
-Finally, some visuo-haptic texture databases have been modelled from real texture captures~\autocite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\autocite{culbertson2015should,friesen2024perceived}.
+Finally, some visuo-haptic texture databases have been modelled from real texture captures~\cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\cite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural \vh-\AR using \WHs remains to be investigated.
%
@@ -323,15 +323,15 @@ In immersive and wearable \vh-\AR, the hand is free to touch and interact seamle
However, the intangibility of the \v-\VE, the many display limitations of current \v-\AR systems and \WH devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of \WHs, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
%
-Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand~\autocite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs~\autocite{lopes2018adding,teng2021touch}.
+Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand~\cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs~\cite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
%
We consider (1) the effect of different visual augmentations of the hand as \AR avatars and (2) the effect of combining different visuo-haptic augmentations of the hand.
-First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR~\autocite{prachyabrued2014visual,grubert2018effects}.
+First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR~\cite{prachyabrued2014visual,grubert2018effects}.
%
-A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs~\autocite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\autocite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
+A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs~\cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But \v-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
%
@@ -339,7 +339,7 @@ Thus, our fourth objective is to evaluate and compare the effect of different vi
Finally, as described above, \WHs for \v-\AR rely on moving the haptic actuator away from the fingertips so as not to impair the hand's movements, sensations, and interactions with the \RE.
%
-Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience~\autocite{maisto2017evaluation,meli2018combining}.
+Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience~\cite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
%
@@ -385,7 +385,7 @@ We use psychophysical methods to measure the user roughness perception of the vi
\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
%
-The textures are paired visual and tactile models of real surfaces~\autocite{culbertson2014one}, and are rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
+The textures are paired visual and tactile models of real surfaces~\cite{culbertson2014one}, and are rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%