\subsectionstartoc{Augmented Reality Is Not Only Visual}
AR integrates virtual content into the perception of the real world, creating the illusion of a unique augmented environment (AE).
%
It thus promises natural and seamless interaction, directly with our hands, with physical and digital objects and their combinations.
%
It is technically and conceptually closely related to VR, which replaces the real environment (RE) perception with a virtual environment (VE).
%
AR and VR can be placed on a reality-virtuality (RV) continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}.
%
It describes the degree of RV of the environment along an axis, with one end being the RE and the other end being a pure VE, \ie indistinguishable from the real world (such as in \emph{The Matrix} movies or the \emph{Holodeck} in the \emph{Star Trek} series).
%
Between these two extremes lies mixed reality (MR), which comprises AR and VR as different levels of mixing real and virtual environments~\autocite{skarbez2021revisiting}.
%
AR/VR is most often understood as addressing only the visual sense, although, like haptics, it can take many forms as a user interface.
%
The most mature devices are head-mounted displays (HMDs), which are portable headsets worn directly on the head, providing the user with an immersive AE/VE.

\begin{subfigs}{rv-continuums}{Reality-virtuality (RV) continuums.}[%
\item Original RV continuum for the visual sense, initially proposed by and readapted from \textcite{milgram1994taxonomy}.

%
The combination of the two axes defines 9 types of visuo-haptic environments, with 3 possible levels of RV for each visual or haptic axis (real, augmented, virtual).
%
For example, a visual augmented environment (vAE) that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a haptic real environment (hRE; see \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a virtual object is considered a haptic virtual environment (hVE; see \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
Haptic augmented reality (hAR) is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%

%
\figref{bau2012revel} shows another example of visuo-haptic AR rendering of texture when running the finger on a tangible surface (middle cell on both axes in \figref{visuo-haptic-rv-continuum3}).

Current (visual) AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching into the VE with the hand.
%
All visual virtual objects are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and to interact with them with confidence and efficiency.
%
It is therefore necessary to provide haptic feedback that is consistent with the vAE and ensures the best possible user experience.
%
The integration of wearable haptics with AR seems to be one of the most promising solutions, but it remains challenging due to their respective characteristics and the additional constraints of combining them.

Visuo-haptic environments with different degrees of reality-virtuality.
}[%
\item Visual AR environment with a real, tangible haptic object used as a proxy to manipulate a virtual object~\autocite{kahl2023using}.
\item Visual AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a virtual object~\autocite{meli2018combining}.
\item A tangible object seen in a visual VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\autocite{salazar2020altering}.
\item Visuo-haptic rendering of texture on a touched tangible object with a visual AR display and haptic electrovibration feedback~\autocite{bau2012revel}.
]
\subfigsheight{31mm}

\sectionstartoc{Research Challenges of Wearable Visuo-Haptic Augmented Reality}

The integration of wearable haptics with AR to create a visuo-haptic augmented environment (vhAE) is complex and presents many perceptual and interaction challenges, \ie sensing the AE and acting effectively upon it.
%
We are particularly interested in enabling direct contact and manipulation of virtual and augmented objects with the bare hand.
%
Our goal is to enable congruent, intuitive and seamless perception of and interaction with the vhAE.

The experience of such a vhAE relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
%
The RE and the user's hand are tracked in real time by sensors and reconstructed in visual and haptic VEs.
%
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an AR headset and a wearable haptic device.
%
Because the visuo-haptic VE is displayed in real time, colocalized and aligned with the real one, the user is given the illusion of directly perceiving and interacting with the virtual content as if it were part of the RE.

\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment}[%
One interacts with the visual (in blue) and haptic (in red) virtual environment through a virtual hand (in purple) interaction technique that tracks real hand movements and simulates contact with virtual objects. %
The virtual environment is rendered back to the user colocalized with the real one (in gray) using a visual AR headset and a wearable haptic device. %
]
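The loop just described (track the hand, simulate contact in the VE, render colocalized visual and haptic feedback) can be sketched as follows. This is a minimal illustrative sketch only, not the implementation used in this thesis: all function names, thresholds and gains are assumptions, and sensing/rendering are stubbed.

```python
# Minimal sketch of one step of the visuo-haptic AR interaction loop.
# Sensor input, contact simulation, and rendering are simple stand-ins.

def track_hand():
    """Stub for the hand-tracking sensor: returns a fingertip position in meters."""
    return (0.0, 0.0, 0.05)

def simulate_contact(finger_pos, object_pos, radius=0.03):
    """Penetration depth of the virtual fingertip into a spherical virtual object."""
    dist = sum((f - o) ** 2 for f, o in zip(finger_pos, object_pos)) ** 0.5
    return max(0.0, radius - dist)

def render(penetration):
    """Map penetration depth to visual and haptic outputs (illustrative gains)."""
    visual_highlight = penetration > 0.0             # e.g. highlight the touched object
    haptic_amplitude = min(1.0, 50.0 * penetration)  # command for the wearable actuator
    return visual_highlight, haptic_amplitude

def loop_step(object_pos=(0.0, 0.0, 0.06)):
    """One iteration: track the real hand, simulate the virtual hand, render feedback."""
    finger = track_hand()
    penetration = simulate_contact(finger, object_pos)
    return render(penetration)
```

In a real system this step runs at the display and haptic update rates, and the latency and spatial shifts discussed below arise precisely between `track_hand` and `render`.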

%
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of wearable haptics and augmented reality technologies.
%
In this context, we identify two main research challenges that we address in this thesis:
%
\begin{enumerate*}[label=(\Roman*)]
\item providing plausible and coherent visuo-haptic augmentations, and
\item enabling effective interaction with the augmented environment.
\end{enumerate*}
%
Each of these challenges also raises numerous design, technical and human issues specific to each of the two types of feedback, wearable haptics and immersive AR, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks into a coherent and seamless vhAE.
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{interaction-loop}.


\subsectionstartoc{Provide Plausible and Coherent Visuo-Haptic Augmentations}

Many haptic devices have been designed and evaluated specifically for use in VR, providing realistic and varied kinesthetic and tactile feedback for virtual objects.
%
Although closely related, (visual) AR and VR have key differences in their respective renderings that can affect user perception.

Firstly, the user's hand and RE are visible in AR, unlike VR where there is total control over the visual rendering of the hand and VE.
% (unless specifically overlaid with virtual visual content)
%
As such, in VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\autocite{ujitoko2021survey} or haptic retargeting~\autocite{azmandian2016haptic} effects.
%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\autocite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple virtual objects without the user noticing~\autocite{azmandian2016haptic}.
%
Moreover, many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for AR.
%
The user's hand must indeed remain free to touch and interact with the RE.
%
It is possible instead to place the haptic actuator close to the point of contact with the RE, as described above to implement hAR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as colocalised, but the virtual haptic feedback is not.
%
How such potential discrepancies affect the overall perception remains to be investigated in order to design visuo-haptic renderings adapted to AR.

So far, AR can only add visual and haptic sensations to the user's overall perception of the environment, but conversely it is very difficult to remove sensations.
%
These added virtual sensations can therefore be perceived as out of sync or even inconsistent with the sensations of the RE, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
%
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and to what extent they will conflict or complement each other in the perception of the AE.
%
%Therefore, it remains to be investigated how these three characteristics of using wearable haptics with AR affect the perception, especially with visually and haptically augmented objects.

\subsectionstartoc{Enable Effective Interaction with the Augmented Environment}

Touching, grasping and manipulating virtual objects are fundamental interactions for AR~\autocite{kim2018revisiting}, VR~\autocite{bergstrom2021how} and VEs in general~\autocite{laviola20173d}.
%
As the hand is not occupied or covered with a haptic device, so as not to impair interaction with the RE, as described in the previous section, one can expect seamless and direct interaction of the hand with the virtual content as if it were real.
%
Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely virtual object with the bare hand can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.

In addition, current AR systems have visual rendering limitations that also affect interaction with virtual objects. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
Visual AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
%
But the depth perception of the virtual objects is often underestimated~\autocite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a virtual object, \ie the hand can hide the object or be hidden by it~\autocite{macedo2023occlusion}.
%
Finally, as illustrated in \figref{interaction-loop}, interacting with a virtual object is an illusion: in fact, the real hand controls in real time a virtual hand, like an avatar, whose contacts with virtual objects are then simulated in the VE.
%
Therefore, there is inevitably a latency delay between the real hand's movements and the virtual object's return movements, and a spatial shift can occur between the real hand and the virtual hand, whose movements are constrained by the touched virtual object~\autocite{prachyabrued2014visual}.
%
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force required to grasp and move the object to a desired location.

Hence, it is necessary to provide visual and haptic feedback that allows the user to efficiently contact, grasp and manipulate a virtual object with the hand.
%
Yet, it is unclear which type of visual and haptic feedback is best suited to guide virtual object manipulation, and whether one of the two types of feedback, or their combination, is most beneficial for users.

% Since the hand is left free, and visible, there is no tracked controller as in VR, so one expects natural interactions directly with the hand, seamlessly between the physical and the virtual. Additional tracking of the actuators that cover the hand. Question of where to place the actuator; once again, the feedback is not colocalized with the action, unlike the visual feedback (+- lag). There are also temporal and spatial shifts between the physical hand and its virtual replica, yet it is the latter that acts on the virtual objects: should it be shown? Or are haptic renderings sufficient, complementary, contradictory?


\sectionstartoc{Approach and Contributions}

The aim of this thesis is to understand how immersive visual and wearable haptic feedback compare and complement each other in the context of direct hand perception and interaction with augmented and virtual objects.
%
As described in the Research Challenges section above, providing a convincing, consistent and effective vhAE to a user is complex and raises many issues.
%
Our approach is to
%
\begin{enumerate*}[label=(\arabic*)]
\item design immersive and wearable multimodal visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
\item evaluate in user studies how these renderings affect the perception of and interaction with these objects using psychophysical, performance, and user experience methods.
\end{enumerate*}
%
We consider two main axes of research, each addressing one of the research challenges identified above:
%
\begin{enumerate*}[label=(\Roman*)]
\item modifying the perception of tangible surfaces using visuo-haptic texture augmentations, and
\item improving the interaction with virtual objects using visuo-haptic augmentations of the hand.
\end{enumerate*}
%
Our contributions in these two axes are summarized in \figref{contributions}.

\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[%
The contributions are represented in dark gray boxes, and the research axes in light green circles.%
The first (I) axis designs and evaluates the perception of visuo-haptic texture augmentations of tangible surfaces, directly touched by the hand.%
The second (II) axis focuses on improving the manipulation of virtual objects with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.%
]


\subsectionstartoc{Modifying the Perception of Tangible Surfaces with Visuo-Haptic Texture Augmentations}

% Very short shared motivation of the shared objective of the two contributions
% Objective + We propose / we consider : (1) ... and (2) ...

% Very short abstract of contrib 2

Wearable haptic devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible object or covering the fingertip, forming a hAE~\autocite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
%
For this first axis of research, we propose to design and evaluate the perception of virtual visuo-haptic textures augmenting tangible surfaces. %, using an immersive AR headset and a wearable vibrotactile device.
%
To this end, we (1) design a system for rendering virtual visuo-haptic texture augmentations, (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (AR \vs VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in AR.

First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\autocite{culbertson2014modeling,asano2015vibrotactile}.
%
Yet, to achieve natural hand interaction and coherent visuo-haptic feedback, this requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
%
Thus, our first objective is to design an immersive, real time system that allows free exploration with the bare hand of visuo-haptic texture augmentations on tangible surfaces.
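As an illustration of this family of approaches, the classic grating-rendering principle maps a finger sliding at speed $v$ over a texture of spatial period $L$ to a vibration of temporal frequency $f = v/L$. The sketch below illustrates that principle only; the function name, parameters and values are assumptions for illustration and do not describe the system designed in this thesis.

```python
import math

def texture_vibration(finger_speed, spatial_period=0.002,
                      sample_rate=1000, duration=0.01):
    """Illustrative grating-style texture rendering: a finger sliding at
    finger_speed (m/s) over a grating of spatial_period (m) produces a
    vibration of temporal frequency f = v / L (Hz)."""
    freq = finger_speed / spatial_period              # vibration frequency (Hz)
    n = int(sample_rate * duration)                   # number of output samples
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]
```

In practice, the signal would be regenerated continuously from the tracked finger speed and streamed to the wearable actuator, which is what makes real-time rendering and visuo-haptic synchronization critical.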
|
||||
|
||||
Second, many works investigated the haptic rendering of virtual textures, but few have integrated them with immersive virtual environments or have considered the influence of the visual rendering on their perception.
|
||||
Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive VEs or have considered the influence of the visual rendering on their perception.
|
||||
%
|
||||
Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\autocite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in AR and VR~\autocite{diluca2011effects,gaffary2017ar}.
|
||||
%
|
||||
Hence, our second objective is to understand how different is the perception of haptic texture augmentation depending on the degree of visual virtuality of the hand and the environment.
|
||||
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.
|
||||
|
||||
Finally, some visuo-haptic texture databases have been modelled from real texture captures~\autocite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\autocite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural visuo-haptic AR using wearable haptics remains to be investigated.
%
Our third objective is to evaluate the perception of simultaneous and co-localized visuo-haptic texture augmentation of tangible surfaces in AR, directly touched by the hand, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.

\subsectionstartoc{Improving Virtual Object Interaction with Visuo-Haptic Augmentations of the Hand}

In immersive and wearable visuo-haptic AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of virtual objects with the bare hand.
%
However, the intangibility of the vVE, the many display limitations of current visual AR systems and wearable haptic devices, and the potential discrepancies between these two types of feedback can make the interaction with virtual objects particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with virtual objects with bare hands particularly challenging.
%
Still, two types of sensory feedback are known to improve such direct virtual object interaction, but they have not been studied in combination in immersive vAE: visual rendering of the hand~\autocite{piumsomboon2014graspshell,prachyabrued2014visual} and hand-object interaction rendering with wearable haptics~\autocite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with virtual objects.
%
We consider (1) the effect of different visual augmentations of the hand as AR avatars and (2) the effect of combining different visuo-haptic augmentations of the hand.

First, the visual rendering of the virtual hand is a key element for interacting with and manipulating virtual objects in VR~\autocite{prachyabrued2014visual,grubert2018effects}.
%
A few works have also investigated the visual rendering of the virtual hand in AR, from simulating mutual occlusions between the hand and virtual objects~\autocite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\autocite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But visual AR has significant perceptual differences from VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of virtual object manipulation.
%
Thus, our fourth objective is to evaluate and compare the effect of different visual hand augmentations on direct manipulation of virtual objects in AR.

Finally, as described above, wearable haptics for visual AR rely on moving the haptic actuator away from the fingertips to not impair the hand movements, sensations, and interactions with the RE.
%
Previous works have shown that wearable haptics that provide feedback on the hand interaction with virtual objects in AR can significantly improve the user performance and experience~\autocite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
%
Our last objective is to investigate the role of visuo-haptic augmentations of the hand in manipulating virtual objects directly with the hand in AR.

\sectionstartoc{Thesis Overview}

%Present the contributions and structure of the thesis.

This thesis is divided into four parts.
%
\partref{context} describes the context and background of our research, within which the current \textit{Introduction} chapter presents the research challenges, and the objectives, approach, and contributions of this thesis.
%
\chapref{related_work} then provides an overview of related work on the perception of and interaction with visual and haptic augmentations of objects.
%
Firstly, it gives an overview of existing wearable haptic devices and renderings, and how they have been used to enhance the touch perception with haptic augmentations and to improve the virtual object interaction, with a focus on vibrotactile feedback and haptic textures.
%
Secondly, it introduces the principles and user perception of augmented reality, and describes the 3D interaction techniques used in AR and VR environments to interact with virtual and augmented objects, in particular using the visual rendering of the user's hand.
%
Finally, it shows how multimodal visuo-haptic feedback has been used in AR and VR to alter the perception of tangible objects and to improve the interaction with virtual objects.
%
Then, we address each of our two research axes in a dedicated part.

%
We evaluate how the visual rendering of the hand (real or virtual), the environment (AR or VR) and the textures (displayed or hidden) affect the roughness perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.

\chapref{xr_perception} details a system for rendering visuo-haptic virtual textures that augment tangible surfaces using an immersive AR/VR headset and a wearable vibrotactile device.
%
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, which is provided to the middle phalanx of the index finger touching the texture using a voice-coil actuator (VCA).
%
The tracking of the real hand and environment is done using a marker-based technique, and the visual rendering of their virtual counterparts is done using the immersive optical see-through (OST) AR headset Microsoft HoloLens~2.

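%
To illustrate, a common way to synthesize such a grating signal (given here as an illustrative sketch, not necessarily the exact model used in this work) is a sinusoid driven by the finger position $x(t)$ over a virtual grating of spatial period $\lambda$:
\[
v(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\qquad
f(t) = \frac{|\dot{x}(t)|}{\lambda},
\]
where $A$ denotes the vibration amplitude, so that the instantaneous vibration frequency $f(t)$ grows with the exploration speed, as when sliding a finger over a real grating.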
It evaluates how different the perception of virtual haptic textures is in AR \vs VR.
%
We use psychophysical methods to measure the user roughness perception of the virtual textures, and extensive questionnaires to understand how this perception is affected by the visual rendering of the hand and the environment.

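As a concrete illustration of one such psychophysical procedure, the sketch below implements a classic 1-up/1-down adaptive staircase (a hypothetical example; the function name and parameters are ours, not this thesis's actual protocol), which converges on the stimulus level a participant judges equal to a reference:

```python
# Hypothetical sketch of a 1-up/1-down adaptive staircase, a classic
# psychophysical procedure: the stimulus level moves down after each
# "rougher" judgment and up after each "less rough" one, so that the
# levels oscillate around the point of subjective equality (PSE).
def staircase(respond, start=1.0, step=0.1, reversals_needed=6):
    """respond(level) -> True if the comparison is judged rougher than
    the reference. Returns the PSE estimated as the mean of the stimulus
    levels at the first `reversals_needed` reversals."""
    level, direction = start, 0
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        new_direction = -1 if respond(level) else 1
        if direction != 0 and new_direction != direction:
            reversal_levels.append(level)  # direction changed: a reversal
        direction = new_direction
        level = max(0.0, level + direction * step)
    return sum(reversal_levels) / len(reversal_levels)

# Deterministic observer with a "true" equality point at 0.45:
pse = staircase(lambda level: level > 0.45)
print(round(pse, 2))  # -> 0.45
```

With a deterministic observer the reversal levels oscillate around the boundary and their mean recovers it; with a real (noisy) participant, more reversals and a shrinking step size would be used.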
\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in AR.
%
The textures are paired visual and tactile models of real surfaces~\autocite{culbertson2014one}, and are rendered as visual texture overlays and as vibrotactile feedback, respectively, on the touched augmented surfaces.
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%
Our objective is to assess the perceived realism, plausibility and roughness of the combination of nine representative visuo-haptic texture pairs, and the coherence of their association.

\bigskip

\partref{manipulation} describes our contributions to the second axis of research, improving virtual object interaction with visuo-haptic augmentations of the hand.
%
We evaluate how the visual and haptic augmentation of the hand can improve the interaction with virtual objects directly with the hand.

\chapref{visual_hand} explores in a first user study the effect of six visual hand augmentations that provide contact feedback with the virtual object, as a set of the most popular hand renderings in the AR literature.
%
Using the OST-AR headset Microsoft HoloLens~2, the user performance and experience are evaluated in two representative manipulation tasks, \ie push-and-slide and grasp-and-place of a virtual object directly with the hand.

\chapref{visuo_haptic_hand} evaluates in a second user study two vibrotactile contact techniques, provided at four different locations on the real hand, as haptic rendering of the hand-object interaction.
%
They are compared to the two most representative visual hand augmentations from the previous study, and the user performance and experience are evaluated within the same OST-AR setup and manipulation tasks.

\bigskip

\partref{part:conclusion} finally concludes this thesis and discusses short-term future work and long-term perspectives for each of our contributions and research axes.