WIP vhar_textures
\chaptertoc

This thesis, entitled \enquote{\ThesisTitle}, shows how wearable haptics, worn on the outside of the hand, can improve direct bare-hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.

\section{Visual and Haptic Object Augmentations}
\label{visuo_haptic_augmentations}

In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
Many object properties can be perceived in a complementary way through both vision and touch, such as shape, size or texture \cite{baumgartner2013visual}.
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
Information from different sensory sources may be complementary, redundant or contradictory \cite{ernst2004merging}.
This is why we sometimes want to touch an object to check one of its properties that we have seen, and to compare or confront our visual and tactile sensations.
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.
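
This integration is commonly formalized, following \textcite{ernst2002humans}, as maximum-likelihood cue combination, in which each modality's estimate is weighted by its relative reliability:
\[
\hat{s}_{vh} = w_v\,\hat{s}_v + w_h\,\hat{s}_h,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\]
so that the combined visuo-haptic estimate has a lower variance, \(\sigma_{vh}^2 = \sigma_v^2 \sigma_h^2 / (\sigma_v^2 + \sigma_h^2)\), than either the visual or the haptic estimate alone.
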

Haptic devices can be categorized according to how they interface with the user: graspable, touchable and wearable, as illustrated in \figref{haptic-categories}.
Graspable interfaces are the traditional haptic devices that are held in the hand.
They are either grounded devices that provide kinesthetic (force) feedback, \eg a robotic arm or a joystick, or ungrounded tactile devices, \eg a game controller or a smartphone.
Touchable interfaces are actuated devices that are directly touched and that can dynamically change their shape or surface properties, such as hardness or friction, providing simultaneous kinesthetic and cutaneous feedback.
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
Instead, wearable interfaces are directly mounted on the body to provide cutaneous sensations on the skin in a portable way and without restricting the user's movements \cite{pacchierotti2017wearable}.

\begin{subfigs}{haptic-categories}{
Haptic devices can be classified into three categories according to their interface with the user:
}[][
\item graspable (hand-held),
\item touchable, and
\item wearable. Adapted from \textcite{culbertson2018haptics}.
]
\item Wolverine, a wearable exoskeleton that simulates contact with and grasping of virtual objects through force feedback on the fingers \cite{choi2016wolverine}.
\item Touch\&Fold, a wearable haptic device mounted on the nail that folds on demand to render contact, normal forces and vibrations to the fingertip \cite{teng2021touch}.
\item The hRing, a wearable haptic ring mounted on the proximal phalanx, able to render normal and shear forces to the finger \cite{pacchierotti2016hring}.
\item Tasbi, a haptic bracelet capable of providing pressure and vibrotactile feedback to the wrist \cite{pezent2022design}.
]
\subfigsheight{28mm}
\subfig{choi2016wolverine}

Haptic \AR is then the combination of real and virtual haptic stimuli \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
In particular, it has been implemented by enhancing the haptic perception of tangible objects with timely tactile stimuli delivered by wearable haptics.
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object in \VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
\figref{bau2012revel} shows another example of visuo-haptic augmentation of virtual texture when running the finger on a tangible surface (middle cell on both axes in \figref{visuo-haptic-rv-continuum3}).

Current \AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching into the \VE with the hand.
All visual \VOs are inherently intangible and cannot physically constrain a user's hand, making it difficult to perceive their properties congruently and to interact with them with confidence and efficiency.
\item \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
\item \AR environment with a wearable haptic device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
\item A tangible object seen in a visual \VR environment whose perceived stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
\item Visuo-haptic texture augmentation of a tangible object being touched, using a hand-held \AR display and haptic electrovibration feedback \cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}

Many haptic devices have been designed and evaluated specifically for use in \VR, providing realistic and varied kinesthetic and tactile feedback for \VOs.
Although closely related, (visual) \AR and \VR have key differences in their respective renderings that can affect user perception.

In \AR, the user can still see the real-world surroundings, including their hands, the augmented tangible objects and the worn haptic devices, unlike \VR where there is total control over the visual rendering of the hand and \VE.
As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are even deliberately created to influence the user's perception, for example to produce pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
Moreover, many wearable haptic devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
The user's hand must indeed remain free to touch and interact with the \RE while wearing a haptic device.
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement haptic augmentations, \eg providing haptic feedback on the nail \cite{ando2007fingernailmounted,teng2021touch}, on another phalanx \cite{asano2015vibrotactile,salazar2020altering} or on the wrist \cite{pezent2022design,sarac2022perceived} to render fingertip contacts with virtual content.
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not.
It remains to be investigated how such potential discrepancies affect the overall perception, in order to design visuo-haptic augmentations adapted to \AR.

So far, \AR studies and applications often only add visual and haptic sensations to the user's overall perception of the environment; conversely, it is much more difficult to remove sensations.
These added virtual sensations can therefore be perceived as out of sync or even inconsistent with the sensations of the \RE, for example due to a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
It is therefore unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and to what extent they will conflict or complement each other in the perception of the \AE.
With a better understanding of how visual factors can influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic augmentations adapted to \AR can be designed.

\subsectionstarbookmark{Challenge II: Enable Effective Manipulation of the Augmented Environment}


%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
However, wearable haptic augmentations have been little explored with \AR, and the same is true of the visuo-haptic augmentation of textures.
%With a better understanding of how visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
Texture is indeed one of the fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
Being able to coherently substitute the visuo-haptic texture of an everyday surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting tangible surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, (2) evaluate how the perception of haptic texture augmentations is affected by the visual virtuality of the hand and the environment (real, augmented, or virtual), and (3) investigate the perception of co-localized visuo-haptic texture augmentations.

First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
Thus, our first objective is to \textbf{design an immersive, real-time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on tangible surfaces.

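
As a minimal, hypothetical sketch of this vibrotactile approach (not the system developed in this thesis), a virtual grating of spatial period lambda can be rendered as a sinusoid whose instantaneous frequency follows f = v / lambda, with v the tracked finger speed; the function name, sampling rate and parameter values below are illustrative assumptions.

```python
import numpy as np

def texture_signal(speeds_mm_s, wavelength_mm=2.0, fs=2000, amp=1.0):
    """Sketch of a speed-modulated vibrotactile texture signal.

    For a grating of spatial period `wavelength_mm`, the instantaneous
    vibration frequency is f = v / wavelength, so faster strokes yield
    higher-frequency vibrations. All values are illustrative.
    """
    speeds = np.asarray(speeds_mm_s, dtype=float)
    freq_hz = speeds / wavelength_mm              # instantaneous frequency (Hz)
    phase = 2 * np.pi * np.cumsum(freq_hz) / fs   # integrate frequency over time
    return amp * np.sin(phase)

# One second of exploration, accelerating from rest to 100 mm/s.
finger_speed = np.linspace(0.0, 100.0, 2000)
signal = texture_signal(finger_speed)
```

In a real system, such a loop would run once per haptic frame, driven by the hand-tracking estimate of finger speed, and the output samples would feed a voice-coil actuator.
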
Second, many works have investigated the haptic augmentation of textures, but none have integrated them with \AR and \VR, or considered the influence of the degree of visual virtuality on their perception.
Still, it is known that visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting}, and also that the perceived force feedback of grounded haptic devices is not the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}.
Hence, our second objective is to \textbf{evaluate how the perception of wearable haptic texture augmentations is affected by the visual virtuality of the hand and the environment} (real, augmented, or virtual).

Finally, some visuo-haptic texture databases have been modeled from captures of real textures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to the real ones \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures in an immersive and natural visuo-haptic \AR experience using wearable haptics remains to be investigated.
Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of tangible surfaces in \AR, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.

\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}


Other models have then been developed to capture everyday textures (such as sandpaper) \cite{guruswamy2011iir} with many force and velocity measurements \cite{romano2012creating,culbertson2014modeling}.
Such data-based models are capable of interpolating from the user's measured velocity and force as inputs to generate a virtual texture in real time (\secref{vibrotactile_actuators}).
This led to the release of the \HaTT database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, but captured in a direct touch context with the fingertip, has recently been released \cite{balasubramanian2024sens3}.
A limitation of these data-driven models is that they can only render \emph{isotropic} textures: their recordings do not depend on the position of the measurement, and the rendering is the same regardless of the direction of the movement.
Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}.
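
The interpolation idea behind such data-driven models can be sketched as follows; the grid values and the `interp_power` function are made up for illustration, and real models interpolate recorded filter coefficients rather than a single vibration-power value.

```python
import numpy as np

# Hypothetical grid of recorded vibration power for one texture,
# indexed by normal force (N) and scan speed (mm/s).
forces = np.array([0.5, 1.0, 2.0])           # N
speeds = np.array([20.0, 60.0, 120.0])       # mm/s
power = np.array([[0.1, 0.3, 0.6],
                  [0.2, 0.5, 0.9],
                  [0.4, 0.8, 1.4]])          # illustrative values

def interp_power(f, v):
    """Bilinearly interpolate the recorded power at the user's (force, speed)."""
    f = float(np.clip(f, forces[0], forces[-1]))
    v = float(np.clip(v, speeds[0], speeds[-1]))
    # Locate the grid cell containing (f, v).
    i = max(1, min(int(np.searchsorted(forces, f)), len(forces) - 1))
    j = max(1, min(int(np.searchsorted(speeds, v)), len(speeds) - 1))
    tf = (f - forces[i - 1]) / (forces[i] - forces[i - 1])
    tv = (v - speeds[j - 1]) / (speeds[j] - speeds[j - 1])
    # Interpolate along speed, then along force.
    p0 = power[i - 1, j - 1] * (1 - tv) + power[i - 1, j] * tv
    p1 = power[i, j - 1] * (1 - tv) + power[i, j] * tv
    return p0 * (1 - tf) + p1 * tf
```

At each haptic frame, the measured fingertip force and speed would index this lookup to scale the generated vibrotactile signal, which is what lets the rendering react to the user's exploration in real time.
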