WIP intro

commit a5424cda32 (parent aa99d80a12), 2024-08-02 18:21:22 +02:00
45 changed files with 227 additions and 71 deletions

@@ -0,0 +1,11 @@
- "[grab](https://thenounproject.com/browse/icons/term/grab/)" by [Gan Khoon Lay](https://thenounproject.com/creator/leremy/) / [CC BY](https://creativecommons.org/licenses/by/3.0/)
- "[accuse](https://thenounproject.com/browse/icons/term/accuse/)" by [Gan Khoon Lay](https://thenounproject.com/creator/leremy/) / [CC BY](https://creativecommons.org/licenses/by/3.0/)
- "[turning knob](https://thenounproject.com/browse/icons/term/turning-knob)" by [Gan Khoon Lay](https://thenounproject.com/creator/leremy/) / [CC BY](https://creativecommons.org/licenses/by/3.0/)
- "[pressing button](https://thenounproject.com/browse/icons/term/pressing-button)" by [Gan Khoon Lay](https://thenounproject.com/creator/leremy/) / [CC BY](https://creativecommons.org/licenses/by/3.0/)
- "[pressing button](https://thenounproject.com/browse/icons/term/pressing-button)" by [Gan Khoon Lay](https://thenounproject.com/creator/leremy/) / [CC BY](https://creativecommons.org/licenses/by/3.0/)
- "[finger pointing](https://thenounproject.com/icon/finger-pointing-4230431/)"
- "[finger pointing](https://thenounproject.com/icon/finger-pointing-4230346/)"
- "[vibration](https://thenounproject.com/icon/vibration-6478365/)" by [Iconbunny](https://thenounproject.com/creator/iconbunny/)
- "[hololens](https://thenounproject.com/icon/hololens-1499195/)" by [daniel2021](https://thenounproject.com/creator/daniel2021/)
- "[hololens sideview](https://thenounproject.com/icon/hololens-sideview-966758/)" by [Henning Gross](https://thenounproject.com/creator/henningg/)
- "[hololens](https://thenounproject.com/icon/hololens-973887/)" by [Henning Gross](https://thenounproject.com/creator/henningg/)

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" data-name="Ebene 1" viewBox="0 0 24 30" x="0px" y="0px"><title>Iconsset</title><path d="M23.44,8.62A.5.5,0,0,0,23,8.25L1,7.75a.49.49,0,0,0-.49.34,3.16,3.16,0,0,0,0,1.74h0c0,.26,1.11,6.42,4.49,6.42a6,6,0,0,0,5.35-2.15.5.5,0,0,0,.15-.35,12.74,12.74,0,0,0-.1-1.52L23,11.75a.5.5,0,0,0,.46-.34A4.71,4.71,0,0,0,23.44,8.62ZM5,15.25c-1.88,0-3-3.21-3.39-5H8.73a7.24,7.24,0,0,1,.73,3.29C7.71,15.25,6.88,15.25,5,15.25Zm17.62-4.49-12.4.47A3.91,3.91,0,0,0,9.31,9.4.5.5,0,0,0,9,9.25H1.38a2.35,2.35,0,0,1,0-.49l21.2.48A3.71,3.71,0,0,1,22.57,10.76Z"/><text x="0" y="39" fill="#000000" font-size="5px" font-weight="bold" font-family="'Helvetica Neue', Helvetica, Arial-Unicode, Arial, Sans-serif">Created by Henning Gross</text><text x="0" y="44" fill="#000000" font-size="5px" font-weight="bold" font-family="'Helvetica Neue', Helvetica, Arial-Unicode, Arial, Sans-serif">from the Noun Project</text></svg>

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" version="1.1" x="0px" y="0px" viewBox="0 0 492 615" style="enable-background:new 0 0 492 492;" xml:space="preserve"><g><g><path d="M151.4,131.2h189.8c5.6,0,10.7,2.3,14.4,6c3.7,3.7,6,8.8,6,14.4v190.4c0,5.6-2.3,10.7-6,14.4c-3.7,3.7-8.8,6-14.4,6H151.4 c-5.6,0-10.7-2.3-14.4-6c-3.7-3.7-6-8.8-6-14.4V151.5c0-5.6,2.3-10.7,6-14.4C140.7,133.5,145.8,131.2,151.4,131.2L151.4,131.2z M341.2,146.6H151.4c-1.4,0-2.6,0.6-3.5,1.5c-0.9,0.9-1.5,2.1-1.5,3.5v190.4c0,1.4,0.6,2.6,1.5,3.5c0.9,0.9,2.1,1.5,3.5,1.5h189.8 c1.4,0,2.6-0.6,3.5-1.5c0.9-0.9,1.5-2.1,1.5-3.5V151.5c0-1.4-0.6-2.6-1.5-3.5C343.8,147.1,342.5,146.6,341.2,146.6z"/><path d="M357,220.6h5.4c4.2,0,7.7,3.4,7.7,7.7v36.9c0,4.2-3.4,7.7-7.7,7.7H357c-4.2,0-7.7-3.4-7.7-7.7v-36.9 C349.3,224.1,352.8,220.6,357,220.6z"/><path d="M130.1,220.6h5.4c4.2,0,7.7,3.4,7.7,7.7v36.9c0,4.2-3.4,7.7-7.7,7.7h-5.4c-4.2,0-7.7-3.4-7.7-7.7v-36.9 C122.4,224.1,125.9,220.6,130.1,220.6z"/><path d="M154.3,131.4c-4.1,1-6.6,5.2-5.6,9.3c1,4.1,5.2,6.6,9.3,5.6c20.5-5.1,22.8-22.1,25.1-39.4c2-14.9,4.1-30.2,26.5-29.1l0,0 c0.1,0,0.2,0,0.4,0l0,0h0h36.3h36.3c0.3,0,0.6,0,0.9,0c21.9-0.8,23.9,14.3,25.9,29.2c2.3,17.3,4.6,34.2,25.1,39.4 c4.1,1,8.3-1.5,9.3-5.6c1-4.1-1.5-8.3-5.6-9.3c-10.4-2.6-12-14.5-13.7-26.5c-2.9-21.7-5.9-43.8-41.7-42.4c-0.1,0-0.2,0-0.3,0 h-36.3h-36.3v0c-36-1.5-39,20.6-42,42.4C166.3,117,164.7,128.8,154.3,131.4z"/><path d="M158,347.2c-4.1-1-8.3,1.5-9.3,5.6c-1,4.1,1.5,8.3,5.6,9.3c10.4,2.6,12,14.5,13.7,26.5c2.9,21.8,6,44,42,42.4v0h36.3h36.3 c0.1,0,0.2,0,0.3,0c35.7,1.4,38.7-20.7,41.7-42.4c1.6-12.1,3.2-23.9,13.7-26.5c4.1-1,6.6-5.2,5.6-9.3c-1-4.1-5.2-6.6-9.3-5.6 c-20.5,5.1-22.8,22.1-25.1,39.4c-2,14.8-4.1,29.9-25.9,29.2c-0.3,0-0.6,0-0.9,0h-36.3H210h0l0,0c-0.1,0-0.2,0-0.4,0l0,0 c-22.4,1-24.4-14.2-26.5-29.1C180.8,369.3,178.5,352.3,158,347.2z"/><path d="M98.8,195.8c3.4-2.5,4.2-7.3,1.7-10.7c-2.5-3.4-7.3-4.2-10.7-1.7c-4.1,2.9-7.7,6.6-10.8,10.9 c-10.1,14-15.2,34.6-15.2,55.1s5.1,41,15.2,55.1c3.1,4.3,6.7,8,10.8,10.9c3.4,2.5,8.2,1.7,10.7-1.7s1.7-8.2-1.7-10.7 c-2.7-2-5.2-4.5-7.3-7.5c-8.2-11.3-12.3-28.6-12.3-46.1s4.1-34.7,12.3-46.1C93.7,200.3,96.1,197.8,98.8,195.8z"/><path d="M402.7,178.1c-3.4-2.5-8.2-1.7-10.7,1.7s-1.7,8.2,1.7,10.7c2.7,2,5.2,4.5,7.3,7.5c8.2,11.3,12.3,28.6,12.3,46.1h0 c0,17.5-4.1,34.7-12.3,46.1c-2.2,3-4.6,5.5-7.3,7.5c-3.4,2.5-4.2,7.3-1.7,10.7c2.5,3.4,7.3,4.2,10.7,1.7 c4.1-2.9,7.7-6.6,10.8-10.9c10.1-14,15.2-34.6,15.2-55.1h0c0-20.4-5.1-41.1-15.2-55.1C410.4,184.8,406.8,181.1,402.7,178.1z"/><path d="M123.9,205.2c3.4-2.5,4.2-7.3,1.7-10.7c-2.5-3.4-7.3-4.2-10.7-1.7c-3.5,2.5-6.6,5.7-9.3,9.4c-8.6,11.9-13,29.4-13,46.7 c0,17.3,4.3,34.8,13,46.7c2.7,3.7,5.8,6.9,9.3,9.4c3.4,2.5,8.2,1.7,10.7-1.7c2.5-3.4,1.7-8.2-1.7-10.7c-2.2-1.6-4.1-3.6-5.8-6 c-6.7-9.2-10-23.4-10-37.7c0-14.3,3.3-28.5,10-37.7C119.8,208.8,121.7,206.8,123.9,205.2z"/><path d="M377.6,188.4c-3.4-2.5-8.2-1.7-10.7,1.7c-2.5,3.4-1.7,8.2,1.7,10.7c2.2,1.6,4.1,3.6,5.8,6c6.7,9.2,10,23.4,10,37.7 s-3.3,28.5-10,37.7c-1.7,2.4-3.7,4.4-5.8,6c-3.4,2.5-4.2,7.3-1.7,10.7c2.5,3.4,7.3,4.2,10.7,1.7c3.5-2.5,6.6-5.7,9.3-9.4 c8.6-11.9,13-29.4,13-46.7s-4.3-34.8-13-46.7C384.2,194.1,381.2,191,377.6,188.4z"/></g></g><text x="0" y="507" fill="#000000" font-size="5px" font-weight="bold" font-family="'Helvetica Neue', Helvetica, Arial-Unicode, Arial, Sans-serif">Created by Iconbunny</text><text x="0" y="512" fill="#000000" font-size="5px" font-weight="bold" font-family="'Helvetica Neue', Helvetica, Arial-Unicode, Arial, Sans-serif">from the Noun Project</text></svg>

@@ -1,30 +1,30 @@
\chapter{Introduction}
\mainlabel{introduction}
This thesis presents research works on the perception and interaction directly with the hand with everyday objects that are visually and tactilely augmented with immersive augmented reality and wearable haptic devices.
This thesis presents research on perceiving and interacting, directly with the hand, with real and virtual everyday objects that are visually and haptically augmented using immersive augmented reality and wearable haptic devices.
\sectionstartoc{Visual and Tactile Object Augmentations}
\subsectionstartoc{Everyday Interaction with Everyday Objects}
In daily life, we look and touch simultaneously the everyday objects that surround us, without even thinking about it.
In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
%
Many these object properties can be perceived complementary by both vision and touch, such as their shapes, sizes or textures~\autocite{baumgartner2013visual}.
Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\autocite{baumgartner2013visual}.
%
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, and even to predict properties that we cannot see, such as weight or temperature.
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg stiffness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
%
In this way, visual and tactile sensations are often linked and complementary, or even redundant or contradictory.
Information from different sensory sources may be complementary, redundant or contradictory~\autocite{ernst2004merging}.
%
This is why we sometimes want to touch an object to check one of its properties that we have seen, such as its texture, and compare and confront our visual and tactile sensations.
This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations.
%
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as with the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
Another important aspect of touch is that is allows not only to perceive the environment, but also to interact with it.
The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
%
Also called the haptic sense, it comprises two sub-modalities: kinesthetic (or proprioception), which are the forces felt by muscles and tendons, and cutaneous (or tactile sensations), which are the pressures, stretches, vibrations and temperatures felt by the skin.
This is due to the many sensory receptors distributed throughout our hands and body, which can be divided into two modalities: kinesthetic (or proprioception), the forces felt by muscles and tendons, and cutaneous (or tactile), the pressures, stretches, vibrations and temperatures felt by the skin.
%
This rich and complex diversity of haptic actions and sensations makes it particularly difficult to recreate artificially, for example in virtual or remote operation environments~\cite{culbertson2018haptics}.
This rich and complex variety of actions and sensations makes the capabilities of touch particularly difficult to recreate artificially, for example in virtual or remote operating environments~\cite{culbertson2018haptics}.
%We simultaneously look at and touch the everyday objects around us, without even thinking about it.
%Many properties of these objects can be perceived in a complementary way by vision as well as by touch, such as shape, size or texture.
@@ -37,15 +37,41 @@ This rich and complex diversity of haptic actions and sensations makes it partic
%This rich and complex diversity of sensations makes it particularly difficult to recreate artificially, for example in virtual or teleoperation environments.
\subsectionstartoc{Wearable Haptics and the Augmentation of Touch}
\subsectionstartoc{Wearable Haptics Promise Everyday Use}
\textit{Present what is wearable haptics, how they can be used to augment the sense of touch~\autocite{pacchierotti2017wearable}. Detail then how they have been used with virtual reality, but how little they have been used with augmented reality.}
A wide range of portable or wearable haptic devices have been designed, rendering rich and realistic virtual haptic sensations to the user, as illustrated in \figref{wearable-haptics}.
\emph{Haptics} is the study of the sense of touch and user interfaces that involve touch.
%
But their sue has been little explored in combination with augmented reality (AR) so far.
Haptic interfaces take many forms, which can be classified into three categories: graspable, touchable and wearable, as illustrated in \figref{haptic-categories}.
%
Graspable systems are the traditional haptic devices that are held in the hand.
%
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
%
Touchable systems are actuated interfaces that are directly touched and that can dynamically change their shape or surface properties, such as stiffness or friction, providing simultaneous kinesthetic and cutaneous feedback.
%
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
%
Instead, wearable systems are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\autocite{pacchierotti2017wearable}.
\begin{subfigs}{wearable-haptics}{Wearable haptic devices are able to render sensations to the skin as feedback to touched virtual objects.}[%
\begin{subfigs}{haptic-categories}{Haptic devices can be classified into three categories according to their interface with the user:}[%
\item graspable,
\item touchable, and
\item wearable. Figure adapted from \textcite{culbertson2018haptics}.
]
\subfig[0.25]{culbertson2018haptics-graspable}
\subfig[0.25]{culbertson2018haptics-touchable}
\subfig[0.25]{culbertson2018haptics-wearable}
\end{subfigs}
A wide range of wearable haptic devices have been developed to provide the user with rich virtual haptic sensations, including normal force, skin stretch, vibration and thermal feedback.
%
\figref{wearable-haptics} shows some examples of different wearable haptic devices with different form factors and rendering capabilities.
%
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, virtual reality (VR), and social interactions~\autocite{pacchierotti2017wearable,culbertson2018haptics}.
%
But their use in combination with augmented reality (AR) has been little explored so far.
\begin{subfigs}{wearable-haptics}{Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.}[%
%\item CLAW, a handheld haptic device providing force and vibrotactile sensations to the fingertips to render contact and textures with virtual objects~\autocite{choi2018claw}.
\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers~\autocite{choi2016wolverine}.
\item Touch\&Fold, a wearable haptic device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\autocite{teng2021touch}.
@@ -61,17 +87,23 @@ But their sue has been little explored in combination with augmented reality (AR
\end{subfigs}
\subsectionstartoc{Augmented Reality is Not Only Visual}
\subsectionstartoc{Augmented Reality Is Not Only Visual}
AR integrates virtual content into the real world perception, creating the illusion of one unique environment and promising natural and seamless with the physical and digital objects (and their combination) directly with our hands.
AR integrates virtual content into the real world perception, creating the illusion of a unique environment.
%
It is technically and conceptually closely related to VR, which completely replace the real environment with an immersive virtual environment (VE).
It thus promises natural and seamless interaction with the physical and digital objects (and their combination) directly with our hands.
%
AR and VR can be placed on a reality-virtuality (RV) continuum, as proposed by \textcite{milgram1994taxonomy} and shown in \figref{rv-continuum}.
It is technically and conceptually closely related to VR, which completely replaces the real environment with an immersive virtual environment (VE).
%
It describes different levels of combination of real and virtual environments along one axis, with one end being the real, physical environment, and the other end being a purely virtual environment, \ie undistinguishable from real world (as \emph{the Matrix} movies or the \emph{Holodeck} in \emph{Star Trek} series).
AR and VR can be placed on a reality-virtuality (RV) continuum, as proposed by \textcite{milgram1994taxonomy} and illustrated in \figref{rv-continuum}.
%
It describes different levels of combination of real and virtual environments along an axis, with one end being the real, physical environment and the other end being a purely virtual environment, \ie indistinguishable from the real world (such as \emph{The Matrix} movies or the \emph{Holodeck} in the \emph{Star Trek} series).
%
In between, there is mixed reality (MR) which comprises AR and VR~\autocite{skarbez2021revisiting}.
%
AR/VR is most often understood as addressing only the visual sense, and, like haptics, it can take many forms as a user interface.
%
The most mature devices are head-mounted displays (HMDs), which are portable headsets worn directly on the head, providing the user with an immersive virtual environment.
\begin{subfigs}{rv-continuums}{Reality-virtuality (RV) continuums.}[%
\item Original RV continuum for the visual sense, as initially proposed by \textcite{milgram1994taxonomy} and readapted here.
@@ -81,29 +113,29 @@ In between, there is mixed reality (MR) which comprises AR and VR~\autocite{skar
\subfig[0.54]{visuo-haptic-rv-continuum3}
\end{subfigs}
Even though AR and VR are often considered as addressing only the visual sense, they can also be extended to render for other sensory modalities.
AR/VR can also be extended to render sensory modalities other than vision.
%
In particular, \textcite{jeon2009haptic} proposed to extend the RV continuum to include haptic feedback by decoupling into two orthogonal haptic and visual axes (see \figref{visuo-haptic-rv-continuum3}).
In particular, \textcite{jeon2009haptic} proposed extending the RV continuum to include haptic feedback by decoupling it into two orthogonal haptic and visual axes (see \figref{visuo-haptic-rv-continuum3}).
%
The combination of the two axes defines nine types of visuo-haptic environments, with three possible levels of RV for each visual or haptic axis (real, augmented, virtual).
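%
% NOTE (illustrative sketch): the nine environments can be laid out as a 3x3 grid, with the haptic level as rows and the visual level as columns. The cell contents follow the examples discussed below; the exact placement and left/right ordering are assumptions and may differ from \figref{visuo-haptic-rv-continuum3}.
\begin{tabular}{l|ccc}
	& \multicolumn{3}{c}{Visual level} \\
	Haptic level & Real & Augmented & Virtual \\
	\hline
	Real & unmediated reality & tangible proxy in AR & passive tangible in VR \\
	Augmented & -- & visuo-haptic AR & augmented tangible in VR \\
	Virtual & -- & synthetic feedback in AR & fully virtual environment \\
\end{tabular}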
%
For instance, a visual AR environment using a tangible object as a proxy to manipulate virtual content is considered a real haptic environment (see \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device providing synthetic haptic feedback when touching a virtual object is considered a virtual haptic environment (see \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
For example, a visual AR environment that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a real haptic environment (see \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a virtual object is considered a virtual haptic environment (see \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
Haptic augmented reality (HAR) is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
Haptic augmented reality (hAR) is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using wearable haptics.
%
\figref{salazar2020altering} shows an example of modifying the perceived stiffness a tangible seen in VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object in VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
%
\figref{bau2012revel} shows another example of visuo-haptic AR rendering of texture when running the finger on a tangible surface (middle cell on both axes in \figref{visuo-haptic-rv-continuum3}).
Yet current (visual) AR systems often lack haptic feedback, creating deceptive and incomplete user experience when reaching the hand to touch the virtual environment.
Current (visual) AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching out to touch the virtual environment with the hand.
%
All visual virtual objects are indeed, by nature, intangible and cannot physically constrain a user's hand, making it difficult to congruently perceive their properties and interact with them with confidence and efficiency.
All visual virtual objects are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and interact with them with confidence and efficiency.
%
Thus, it is necessary to provide haptic feedback that is consistent with the augmented visual environment and that ensure the best possible user experience.
It is therefore necessary to provide haptic feedback that is consistent with the augmented visual environment and ensures the best possible user experience.
%
Integrating wearable haptics with AR seems to be one of the most promising solutions, but it remains a challenge due to their many respective characteristics and the additional constraints of combining them.
The integration of wearable haptics with AR seems to be one of the most promising solutions, but it remains challenging due to their many respective characteristics and the additional constraints of combining them.
\begin{subfigs}{visuo-haptic-environments}{%
Visuo-haptic environments with different degrees of reality-virtuality.
@@ -123,22 +155,36 @@ Integrating wearable haptics with AR seems to be one of the most promising solut
\sectionstartoc{Research Challenges of Wearable Visuo-Haptic Augmented Reality}
Integrating wearable haptics with augmented reality to create a visuo-haptic augmented environment raises many perceptive and interaction challenges.
Integrating wearable haptics with augmented reality to create a visuo-haptic augmented environment is complex and raises many perceptual and interaction challenges.
%
We are in particular interested in enabling a direct contact and manipulation with bare hands of virtual and augmented objects with the objective of providing a congruent, intuitive and efficient perception and interaction with the visuo-haptic augmented environment.
We are particularly interested in enabling direct contact with, and manipulation of, virtual and augmented objects with bare hands.
%
The experience of such a visuo-haptic augmented environment relies on an interaction loop with the user, as illustrated in \figref{interaction-loop}.
%
The real environment and the user's hand are tracked in real time with sensors and reconstructed in virtual visual and haptic environments.
%
The interactions between the virtual hand and objects are then simulated and rendered as visual and haptic feedback to the user using an AR headset and a wearable haptic device.
%
Because the visuo-haptic virtual environment is displayed in real time, colocalized and aligned with the real one, it gives the user the illusion of directly perceiving and interacting with the virtual content as if it were part of the real environment.
\fig{interaction-loop}{The interaction loop between a user and a visuo-haptic augmented environment}[%
One interacts with the visual (in blue) and haptic (in red) virtual environment through a virtual hand (in purple) interaction technique that tracks the real hand actions and simulates the contact with the virtual objects. %
The virtual environment is rendered back to the user colocalized with the real one (in gray) using a visual AR headset and a wearable haptic device. %
]
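%
% NOTE (illustrative sketch): the loop above is essentially a sense--simulate--render cycle. The following Python pseudocode is purely illustrative; every name (tracker, simulation, ar_display, haptic_device) is a hypothetical placeholder, not necessarily the architecture implemented in this thesis.
\begin{verbatim}
def interaction_loop(tracker, simulation, ar_display, haptic_device):
    while True:
        # (1) Sense: track the real hand and environment.
        hand_pose = tracker.get_hand_pose()
        surfaces = tracker.get_surface_poses()

        # (2) Simulate: update the virtual hand and detect its
        # contacts with the virtual objects.
        simulation.update(hand_pose, surfaces)
        contacts = simulation.step()

        # (3) Render: draw the virtual content colocalized with the
        # real environment, and play each contact as a haptic stimulus.
        ar_display.draw(simulation.visual_state())
        for contact in contacts:
            haptic_device.render(contact)
\end{verbatim}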
Our goal is to provide a congruent, intuitive and seamless perception and interaction with the visuo-haptic augmented environment.
%
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of wearable haptics and augmented reality technologies.
%
Each of these challenges also pose numerous design and technical issues specific to each of the two type of feedback, wearable haptics and augmented reality, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks.
%
We identify two main research challenges which we address in this thesis:
In this context, we identify two main research challenges which we address in this thesis:
%
\begin{enumerate*}[label=(\Roman*)]
\item providing plausible and coherent visuo-haptic augmentations, and
\item enabling effective interaction with the augmented environment.
\end{enumerate*}
%
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{xxx}.
Each of these challenges also poses numerous design, technical and human issues specific to each of the two types of feedback, wearable haptics and immersive AR, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks into a coherent and seamless visuo-haptic augmented environment.
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{interaction-loop}.
\subsectionstartoc{Provide Plausible and Coherent Visuo-Haptic Augmentations}
@@ -157,15 +203,15 @@ Moreover, many wearable haptic devices take the form of controllers, gloves, exo
%
Indeed, the user's hand must not be impaired, so that it remains free to touch and interact with the real environment.
%
It is possible instead to place the haptic actuator close to the point of contact with the real environment, as described above to implement HAR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
It is possible instead to place the haptic actuator close to the point of contact with the real environment, as described above to implement hAR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen colocalized, but not the virtual haptic feedback.
%
How such potential discrepancies affect the overall perception remains to be investigated, in order to design visuo-haptic renderings adapted to AR.
Then, it is in AR, as of today, only possible to add visual and haptic sensation to the overall user perception of the environment.
Then, as of today, it is only possible in AR to add visual and haptic sensations to the overall user perception of the environment; removing sensations is very difficult.
%
These virtual visual and haptic sensations can therefore be perceived as being out of sync or even inconsistent with the sensations of the real environment, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
These added virtual sensations can therefore be perceived as being out of sync or even inconsistent with the sensations of the real environment, for example with a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
%
Hence, it is unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and how much they will conflict or complement each other in the perception of the augmented environment.
%
@@ -177,17 +223,17 @@ Hence, it is unclear to what extent the real and virtual visuo-haptic sensations
Touching, grasping and manipulating virtual objects are fundamental interactions for AR~\autocite{kim2018revisiting} and VR~\autocite{bergstrom2021how}.
%
As the hand is not occupied or covered with a haptic device to not impair interaction with the real environment, as described above, one can expect to interact seamlessly and directly with the hand with the virtual content as if it were real.
As the hand is not occupied or covered by a haptic device, so as not to impair interaction with the real environment, as described in the previous section, one can expect to interact seamlessly and directly with the hand with the virtual content as if it were real.
%
Thus, augmenting a tangible object present the advantage of physically constraint the hand, enabling an easy and natural interaction, but manipulating a purely virtual object with bare hands can be challenging with no good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}.%, and one will rely on visual and haptic feedback to guide the interaction.
Thus, augmenting a tangible object presents the advantage of physically constraining the hand, enabling an easy and natural interaction, but manipulating a purely virtual object with bare hands can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
Moreover, current AR systems present visual rendering limitations that also affect virtual object interaction.%, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
Moreover, current AR systems present visual rendering limitations that also affect virtual object interaction. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
AR is the display of superimposed images of the virtual world, synchronised with the current user view of the real world.
Visual AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
%
But the depth perception of the virtual objects is often underestimated~\autocite{peillard2019studying,adams2022depth}, and it often lacks mutual occlusion between the hand and a virtual object, \ie that the hand can hide the object or be hidden by the object~\autocite{macedo2023occlusion}.
%
Finally, interaction with the object is an illusion, because in fact the real hand is recorded by a tracking system and controls in real time a virtual hand, like an avatar, whose contacts with virtual objects is then simulated in the virtual environment.
Finally, interaction with the object is an illusion: in fact, the real hand is recorded by a tracking system and controls in real time a virtual hand, like an avatar, whose contacts with virtual objects are then simulated in the virtual environment, as illustrated in \figref{interaction-loop}.
%
There is therefore inevitably a latency between the real hand movements and the virtual object's responding movements, and a spatial shift can also occur between the real hand and the virtual hand, whose movements are constrained by the touched virtual object~\autocite{prachyabrued2014visual}.
%
@@ -216,18 +262,20 @@ Our approach is to
We consider two main axes of research, each addressing one of the research challenges identified above:
%
\begin{enumerate*}[label=(\Roman*)]
\item altering the perception of tangible surfaces with virtual visuo-haptic textures, and
\item altering the perception of tangible surfaces with visuo-haptic texture augmentations, and
\item improving virtual object interaction with visuo-haptic augmentations of the hand.
\end{enumerate*}
%
Our contributions in these two axes are summarized in \figref{xxx}.
Our contributions in these two axes are summarized in \figref{contributions}.
% TODO: Add figure with the two axes of research and the contributions
\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[%
The contributions are represented in dark gray boxes, and the research axes in light green circles.%
The first (I) axis covers the design and evaluation of the perception of visuo-haptic texture augmentations of tangible surfaces, directly touched by the hand.%
The second (II) axis focuses on improving the manipulation with bare hands of virtual objects with visuo-haptic augmentations of the hand as interaction feedback.%
]
\subsectionstartoc{Altering the Perception of Tangible Surfaces with Virtual Visuo-Haptic Textures}
(\textit{to complete})
\subsectionstartoc{Altering the Perception of Tangible Surfaces with Visuo-Haptic Texture Augmentations}
% Very short shared motivation of the shared objective of the two contributions
% Objective + We propose / we consider : (1) ... and (2) ...
@@ -236,13 +284,67 @@ Our contributions in these two axes are summarized in \figref{xxx}.
% Very short abstract of contrib 2
Wearable haptic devices have proven to be effective in altering the perception of a touched tangible surface, without modifying the tangible object or covering the touching fingertip, forming a hAR environment~\autocite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
%
However, wearable hAR has been little explored in combination with visual AR, and so has the visuo-haptic augmentation of textures.
%
Texture is indeed one of the main tactile sensations of a surface material~\cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch~\cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) renderings~\cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
%
For this first axis of research, we propose to design and evaluate the perception of virtual visuo-haptic textures augmenting tangible surfaces. %, using an immersive AR headset and a wearable vibrotactile device.
%
To do so, we (1) design a system for rendering virtual visuo-haptic texture augmentations, (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (AR \vs VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in AR.
First, an effective approach to render haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\autocite{culbertson2014modeling,asano2015vibrotactile}.
%
Yet, enabling the most natural interaction with the hand and a coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and a good synchronization between the visual and haptic feedbacks.
%
Thus, our first objective is to design an immersive, real-time system allowing free exploration with bare hands of visuo-haptic texture augmentations on tangible surfaces.
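%
% NOTE (illustrative sketch): one common model for such vibrotactile grating rendering, assumed here only for illustration, drives the actuator with a sinusoid whose instantaneous frequency follows the finger speed over the grating:
\[
a(t) = A \sin\big(2\pi\,\phi(t)\big),
\qquad
\phi(t) = \frac{1}{\lambda} \int_0^{t} v(\tau)\,\mathrm{d}\tau,
\]
% where $v(t)$ is the finger speed on the surface, $\lambda$ the spatial wavelength of the grating (so the instantaneous frequency is $v(t)/\lambda$), and $A$ the vibration amplitude.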
Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive virtual environments or considered the influence of the visual rendering on their perception.
%
Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\autocite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in AR and VR~\autocite{diluca2011effects,gaffary2017ar}.
%
Hence, our second objective is to understand how the perception of haptic texture augmentations differs depending on the degree of visual virtuality of the hand and the environment.
Finally, some visuo-haptic texture databases have been modelled from real texture captures~\autocite{culbertson2014penn,balasubramanian2024sens3}, enabling the rendering of virtual textures with graspable haptics that are perceived as similar to real textures~\autocite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural visuo-haptic AR using wearable haptics remains to be investigated.
%
Our third objective is to evaluate the perception of simultaneous and co-localized visuo-haptic texture augmentation of tangible surfaces in AR, still directly touched by the hand, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
\subsectionstartoc{Improving Virtual Object Interaction with Visuo-Haptic Augmentations of the Hand}
(\textit{to complete})
In immersive and wearable visuo-haptic AR, the hand is free to touch and interact seamlessly with real, augmented, and virtual objects, and one can expect natural and direct contact and manipulation of virtual objects with bare hands.
%
However, the intangibility of the virtual visual environment, the many display limitations of current visual AR systems and wearable haptic devices, and the potential discrepancies of these two feedbacks, can make the interaction with virtual objects particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of wearable haptics, the visual rendering limitations of current AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with virtual objects with bare hands particularly challenging.
%
Still, two types of sensory feedback are known to improve such direct virtual object interaction, but they have not been studied in combination in immersive visual AR environments: visual rendering of the hand~\autocite{piumsomboon2014graspshell,prachyabrued2014visual} and hand-object interaction rendering with wearable haptics~\autocite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with virtual objects.
%
We consider (1) the effect of different visual augmentations of the hand as AR avatars and (2) the effect of various combinations of visual and haptic augmentations of the hand.
% Touch allows to perceive the environment and interact with it, thus it motivates these two axes of research.
First, the visual rendering of the virtual hand is a key element for interacting with and manipulating virtual objects in VR~\autocite{prachyabrued2014visual,grubert2018effects}.
%
A few works have also investigated the visual rendering of the virtual hand in AR, from simulating mutual occlusions between the hand and virtual objects~\autocite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\autocite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But visual AR presents significant perceptual differences from VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of virtual object manipulation.
%
Thus, our fourth objective is to evaluate and compare the effect of different visual augmentations of the hand on direct manipulation of virtual objects in AR.
% it's a mix of augmented reality, virtual reality, vibrotactile feedback for visuo-tactile augmentation of the real world. Such multimodal rendering raise many questions on how to design, how renderings interact and complete each other, to give one perception
Finally, as detailed above, wearable haptics for visual AR rely on moving the haptic actuator away from the fingertips so as not to impair the hand movements, sensations, and interactions with the real environment.
%
Previous works have shown that wearable haptics, providing feedback on the hand interaction with virtual objects in AR, can significantly improve the performance and experience of the user~\autocite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
%
Our last objective is to investigate the role of visuo-haptic augmentations of the hand on manipulating virtual objects directly with the hand in AR.
\sectionstartoc{Thesis Overview}
@@ -251,7 +353,7 @@ Our contributions in these two axes are summarized in \figref{xxx}.
This thesis is structured in four parts.
%
\partref{context} describes the context and background of our research, within which this first current \textit{Introduction} chapter presents the research challenges, and the objective, approach, and contributions of this thesis.
\partref{context} describes the context and background of our research, within which this first current \textit{Introduction} chapter presents the research challenges, and the objectives, approach, and contributions of this thesis.
%
\chapref{related_work} then provides an overview of related work on the perception of and interaction with visual and haptic augmentations of objects.
%
@@ -273,29 +375,35 @@ We evaluate how the visual rendering of the hand (real or virtual), the environm
%
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, which is provided to the middle phalanx of the index finger touching the texture using a voice-coil actuator.
%
The tracking of the real hand and environment is done using marker-based technique, and the visual rendering of their virtual counterparts is done using the immersive AR headset Microsoft HoloLens~2.
The tracking of the real hand and environment is done using a marker-based technique, and the visual rendering of their virtual counterparts is done using the immersive optical see-through (OST) AR headset Microsoft HoloLens~2.
\chapref{xr_perception} then presents a first user study using this system.
%
It then presents a user study that evaluates how different the perception of virtual haptic textures is in AR \vs VR and when touched by a virtual hand \vs one's own hand.
It evaluates how different the perception of virtual haptic textures is in AR \vs VR and when touched by a virtual hand \vs one's own hand.
%
We use psychophysical methods to measure the user's roughness perception of the virtual textures, and extensive questionnaires to understand how this perception is affected by the visual rendering of the hand and the environment.
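%
% NOTE (illustrative sketch): as an example of a classical psychophysical procedure that could underlie such roughness measurements, a 1-up/1-down adaptive staircase converges on the point of subjective equality (PSE) between a comparison and a reference stimulus. This Python sketch is hypothetical and not necessarily the exact protocol of the study; present_trial is an assumed callback that runs one trial.
\begin{verbatim}
def staircase(present_trial, start=0.5, step=0.05, n_reversals=8):
    # present_trial(x) returns True if the comparison at intensity x
    # is judged rougher than the reference.
    x, last_dir, reversals = start, None, []
    while len(reversals) < n_reversals:
        direction = -1 if present_trial(x) else +1  # step toward equality
        if last_dir is not None and direction != last_dir:
            reversals.append(x)                     # record reversal point
        x, last_dir = max(0.0, x + direction * step), direction
    return sum(reversals) / len(reversals)          # PSE estimate
\end{verbatim}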
\chapref{ar_textures} presents the evaluation in a user study of the perception of visuo-haptic textures augmentations rendered on real tangible surfaces, using the system described in the previous chapter, and touched directly with one's own hand in AR.
\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in AR.
%
The textures are paired visual and tactile models of real surfaces~\autocite{culbertson2014one}, and are rendered as visual texture overlays and as vibrotactile feedback on the touched augmented surfaces, respectively.
%
We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%
Our objective is to assess the perceived realism, plausibility and roughness of the visual and haptic textures, and the coherence of their association.
Our objective is to assess the perceived realism, plausibility and roughness of the combination of nine representative visuo-haptic texture pairs, and the coherence of their association.
\bigskip
\partref{manipulation} describes our contributions to the second axis of research, improving virtual object interaction with visuo-haptic augmentations of the hand.
%
We evaluate how the visual and haptic rendering of the hand improve the interaction with virtual objects directly with the hand.
We evaluate how visual and haptic augmentations of the hand improve the interaction with virtual objects directly with the hand.
\chapref{visual_hand} explores the effect of six visual renderings of the hand that provide contact feedback with the virtual object. (\textit{to complete})
\chapref{visual_hand} explores in a first user study the effect of six visual augmentations of the hand that provide contact feedback with the virtual object, as a set of the most popular hand renderings in the AR literature.
%
With the OST-AR headset Microsoft HoloLens~2, the performance and the user experience are evaluated in two representative manipulation tasks, \ie push-and-slide and grasp-and-place of a virtual object directly with the hand.
\chapref{visuo_haptic_hand} evaluates two vibrotactile contact techniques, provided at four different locations on the real hand, and compared to the two most representative visual hand renderings from the previous contribution. (\textit{to complete})
\chapref{visuo_haptic_hand} evaluates in a second user study two vibrotactile contact techniques, provided at four different locations on the real hand, as haptic rendering of the hand-object interaction.
%
They are compared to the two most representative visual hand augmentations from the previous study, and, within the same OST-AR setup and manipulation tasks, the performance and the user experience are evaluated.
\bigskip


@@ -0,0 +1,10 @@
\section{Conclusion}
\label{sec:conclusion}
This paper presented two human subject studies aimed at better understanding the role of visuo-haptic rendering of the hand during virtual object manipulation in OST-AR.
%
The first experiment compared six visual hand renderings in two representative manipulation tasks in AR, \ie push-and-slide and grasp-and-place of a virtual object.
%
Results show that a visual hand rendering improved the performance, perceived effectiveness, and user confidence.
%
A skeleton rendering, providing a detailed view of the tracked joints and phalanges while not hiding the real hand, was the best performing and most effective.


@@ -11,3 +11,4 @@
\input{3-3-ranks}
\input{3-4-questions}
\input{4-discussion}
\input{5-conclusion}


@@ -3,12 +3,6 @@
This paper presented two human subject studies aimed at better understanding the role of visuo-haptic rendering of the hand during virtual object manipulation in OST-AR.
%
The first experiment compared six visual hand renderings in two representative manipulation tasks in AR, \ie push-and-slide and grasp-and-place of a virtual object.
%
Results show that a visual hand rendering improved the performance, perceived effectiveness, and user confidence.
%
A skeleton rendering, providing a detailed view of the tracked joints and phalanges while not hiding the real hand, was the most performant and effective.
%
The second experiment compared, in the same two manipulation tasks as before, sixteen visuo-haptic renderings of the hand as the combination of two vibrotactile contact techniques, provided at four different delocalized positions on the hand, and with the two most representative visual hand renderings established in the first experiment, \ie the skeleton hand rendering and no hand rendering.
%
Results show that delocalized vibrotactile haptic hand rendering improved the perceived effectiveness, realism, and usefulness when it is provided close to the contact point.


@@ -113,6 +113,17 @@
doi = {10/gfz8mv}
}
@inproceedings{balasubramanian2024sens3,
title = {{{SENS3}}: {{Multisensory Database}} of {{Finger-Surface Interactions}} and {{Corresponding Sensations}}},
shorttitle = {{{SENS3}}},
booktitle = {{{EuroHaptics}}},
author = {Balasubramanian, Jagan K. and Kodak, Bence L. and Vardar, Yasemin},
date = {2024},
eprint = {2401.01818},
eprinttype = {arXiv},
eprintclass = {cs, eess}
}
@article{bau2012revel,
title = {{{REVEL}}: Tactile Feedback Technology for Augmented Reality},
shorttitle = {{{REVEL}}},
@@ -266,9 +277,9 @@
@inproceedings{bianchi2006high,
title = {High {{Precision Augmented Reality Haptics}}},
booktitle = {Proc. {{EuroHaptics}}},
booktitle = {{{EuroHaptics}}},
author = {Bianchi, G and Knoerlein, B and Szekely, G and Harders, M},
date = {2006-07},
date = {2006},
volume = {6},
pages = {169--178}
}
@@ -596,6 +607,17 @@
doi = {10/fjhbkh}
}
@article{ernst2004merging,
title = {Merging the Senses into a Robust Percept},
author = {Ernst, Marc O. and Bülthoff, Heinrich H.},
date = {2004},
journaltitle = {Trends in Cognitive Sciences},
volume = {8},
number = {4},
pages = {162--169},
doi = {10/bzhmrh}
}
@inproceedings{feick2023turnitup,
title = {Turn-{{It-Up}}: {{Rendering Resistance}} for {{Knobs}} in {{Virtual Reality}} through {{Undetectable Pseudo-Haptics}}},
shorttitle = {Turn-{{It-Up}}},