This thesis presents research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and \WH devices.

%It is entitled:
%- Title 1: Augmenting the interaction with everyday objects with wearable haptics and Augmented Reality
%- Title 3: Direct Hand Perception and Manipulation in Visuo-Haptic Augmented Reality
%- Title 4: Integrating Wearable Haptics in Augmented Reality: Perception and Manipulation of Virtual and Augmented Objects
%- Title 5: Wearable Haptics for Hand Interaction in Augmented Reality
%- Title 6: Enhancing Direct Hand Interaction with Everyday Objects in Augmented Reality using Wearable Haptics
%- Enhancing Hand Interaction with Wearable Haptic in Augmented Reality

%The introduction chapter is structured as follows: first, we present the research challenges and objectives of this thesis, then we describe our approach and contributions, and finally we present the structure of the thesis.

% Some titles of previous related PhD theses
% Haptic Rendering in Virtual Reality During Interaction with Tangibles
% Contributions to the use of Electrotactile Feedback in Hand-based Interactions in Virtual Reality
% Towards user-adapted navigation techniques in virtual environments : study of factors influencing users behavior in virtual reality
% Bimanual haptic interaction with virtual environments
% Contributions to the design of novel hand-based interaction techniques for virtual environments
% Integrating haptic feedback in smart devices : multimodal interfaces and design guidelines

\section{Visual and Tactile Object Augmentations}
\label{visuo_haptic_augmentations}

\subsectionstarbookmark{Hand Interaction with Everyday Objects}

In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
%
Many of these objects' properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture \cite{baumgartner2013visual}.
%
But vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object, \eg stiffness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
%
Information from different sensory sources may be complementary, redundant or contradictory \cite{ernst2004merging}.
%
This is why we sometimes want to touch an object to check a property we have seen, and to compare or confront our visual and tactile sensations.
%
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object \cite{ernst2002humans}.

The sense of touch allows us not only to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
%
This is due to the many sensory receptors distributed throughout our hands and body, which can be divided into two modalities: kinesthetic (or proprioceptive), which are the forces felt by muscles and tendons, and cutaneous (or tactile), which are the pressures, stretches, vibrations and temperatures felt by the skin.
%
This rich and complex variety of actions and sensations makes it particularly difficult to artificially recreate the capabilities of touch, for example in virtual or remote operating environments \cite{culbertson2018haptics}.


\subsectionstarbookmark{Wearable Haptics Promise Everyday Use}

%
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
%
Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements \cite{pacchierotti2017wearable}.

\begin{subfigs}{haptic-categories}{
Haptic devices can be classified into three categories according to their interface with the user:

%
\figref{wearable-haptics} shows some examples of different \WH devices with different form factors and rendering capabilities.
%
Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions \cite{pacchierotti2017wearable,culbertson2018haptics}.
%
But their use in combination with \AR has been little explored so far.

\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.
}[
\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers \cite{choi2016wolverine}.
\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip \cite{teng2021touch}.
\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger \cite{pacchierotti2016hring}.
\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist \cite{pezent2022design}.
]
\subfigsheight{28mm}
\subfig{choi2016wolverine}

%
It describes the degree of \RV of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (as in \emph{The Matrix} movies).
%
Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments \cite{skarbez2021revisiting}.\footnote{This is the original and classic definition of \MR, but there is still a debate on how to define and characterize \AR and \MR experiences \cite{speicher2019what,skarbez2021revisiting}.}
%
\AR/\VR is most often understood as addressing only the visual sense, and, like haptics, it can take many forms as a user interface.
%
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
Haptic \AR (\h-\AR) is then the combination of real and virtual haptic stimuli \cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
%

\begin{subfigs}{visuo-haptic-environments}{
Visuo-haptic environments with different degrees of reality-virtuality.
}[
\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO \cite{kahl2023using}.
\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO \cite{meli2018combining}.
\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device \cite{salazar2020altering}.
\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback \cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}

Firstly, the user's hand and \RE are visible in \AR, unlike \VR where there is total control over the visual rendering of the hand and \VE.
% (unless specifically overlaid with virtual visual content)
%
As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic \cite{ujitoko2021survey} or haptic retargeting \cite{azmandian2016haptic} effects.
%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli \cite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing \cite{azmandian2016haptic}.
%
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
%
The user's hand must indeed be free to touch and interact with the \RE while wearing a \WH device.
%
It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx \cite{asano2015vibrotactile,salazar2020altering} or the wrist \cite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as co-localized, but the virtual haptic feedback is not.
%

\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}

Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviola20173d}.
%
Since the hand is not occupied or covered by a haptic device, so as not to impair interaction with the \RE as described in the previous section, one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real.
%
Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.

In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
Visual \AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
%
But the depth perception of the \VOs is often underestimated \cite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a \VO, \ie the hand can hide the object or be hidden by it \cite{macedo2023occlusion}.
%
Finally, as illustrated in \figref{interaction-loop}, interacting with a \VO is an illusion: in fact, the real hand is controlling a virtual hand in real time, like an avatar, whose contacts with \VOs are then simulated in the \VE.
%
Therefore, there is inevitably a latency between the real hand's movements and the \VO's responding movements, and a spatial shift between the real hand and the virtual hand, whose movements are constrained by the touched \VO \cite{prachyabrued2014visual}.
%
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force required to grasp and move the object to a desired location.
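
To make this loop concrete, the following minimal sketch illustrates the per-frame cycle just described. It is only an illustration: every name in it is a hypothetical stub standing in for a real hand tracker, physics simulation and visuo-haptic renderer.
\begin{verbatim}
# Minimal sketch (Python) of the per-frame interaction loop described
# above. All functions are hypothetical stubs, not an actual AR system.
import time

def track_real_hand():
    """Stub: pose of the real hand, already subject to tracking latency."""
    return (0.0, 0.0, 0.0)

def simulate_contacts(hand_pose, scene):
    """Stub: physics step; the virtual hand is constrained by the touched
    virtual objects, creating a spatial shift from the real hand."""
    return hand_pose, []

def render(virtual_hand, contacts):
    """Stub: visual overlay and wearable haptic feedback."""
    pass

scene = []
for frame in range(3):
    start = time.perf_counter()
    real_pose = track_real_hand()
    virtual_hand, contacts = simulate_contacts(real_pose, scene)
    render(virtual_hand, contacts)
    # Everything in this loop contributes to the delay between the real
    # hand's movement and the virtual object's visible response.
    frame_latency = time.perf_counter() - start
\end{verbatim}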

Our contributions in these two axes are summarized in \figref{contributions}.

% Very short abstract of contrib 2

\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible object or covering the fingertip, forming a \h-\AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
However, wearable \h-\AR has been little explored in combination with \v-\AR, as has the visuo-haptic augmentation of textures.
%
Texture is indeed one of the main tactile sensations of a surface material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic renderings (alone, without visuals) \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
%
For this first axis of research, we propose to design and evaluate the perception of virtual visuo-haptic textures augmenting tangible surfaces. %, using an immersive \AR headset and a wearable vibrotactile device.
%
To this end, we (1) design a system for rendering virtual visuo-haptic texture augmentations, to (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (\AR \vs \VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in \AR.

First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
%
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
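
As a hedged illustration of this approach (a sketch, not the rendering system contributed in this thesis), the following code synthesizes a vibrotactile signal whose instantaneous frequency follows the finger speed over an assumed periodic texture; the period, sampling rate and speed values are arbitrary.
\begin{verbatim}
# Sketch (Python): vibrotactile signal for a finger sliding over an
# assumed periodic texture of spatial period lam (metres), so that the
# instantaneous vibration frequency is f = v / lam.
import numpy as np

def texture_vibration(speeds, lam=0.0025, rate=2000.0):
    """Waveform for per-sample finger speeds (m/s), sampled at `rate` Hz."""
    freqs = np.asarray(speeds) / lam           # instantaneous frequency (Hz)
    phase = 2 * np.pi * np.cumsum(freqs) / rate
    return np.sin(phase)                       # drive signal for the actuator

# Example: 1 s of sliding at a constant 0.1 m/s -> a 40 Hz vibration
signal = texture_vibration(np.full(2000, 0.1))
\end{verbatim}
In practice, the finger speed would come from the hand tracking in real time, which is precisely why rendering latency and visuo-haptic synchronization matter.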
%

Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive \VEs or have considered the influence of the visual rendering on their perception.
%
Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations \cite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR \cite{diluca2011effects,gaffary2017ar}.
%
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.

Finally, some visuo-haptic texture databases have been modelled from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural \vh-\AR using \WHs remains to be investigated.
%

However, the intangibility of the \v-\VE, the many display limitations of current \v-\AR systems and \WH devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of \WHs, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
%
Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand \cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs \cite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
%
We consider (1) the effect of different visual augmentations of the hand as \AR avatars and (2) the effect of combining different visuo-haptic augmentations of the hand.

First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR \cite{prachyabrued2014visual,grubert2018effects}.
%
A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs \cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay \cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But \v-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
%

Finally, as described above, \WHs for \v-\AR rely on moving the haptic actuator away from the fingertips so as not to impair the hand movements, sensations, and interactions with the \RE.
%
Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience \cite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
%

\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
%
The textures are paired visual and tactile models of real surfaces \cite{culbertson2014one}, rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%


The haptic sense has specific characteristics that make it unique in regard to other senses.
It enables us to perceive a large diversity of properties in the surrounding objects, through a complex combination of sensations produced by numerous sensory receptors distributed throughout the body, but particularly in the hand.
It also allows us to act with the hand on these objects, to come into contact with them, to grasp them, to actively explore them, and to manipulate them.
This implies that the haptic perception is localized at the points of contact between the hand and the environment, \ie we cannot haptically perceive an object without actively touching it.
These two mechanisms, \emph{action} and \emph{perception}, are therefore closely associated and both essential to form the haptic experience of interacting with the environment using the hand \cite{lederman2009haptic}.


\subsection{The Haptic Sense}
Cutaneous haptic receptors are specialized nerve endings implanted in the skin that respond differently to the various stimuli applied to the skin. \figref{blausen2014medical_skin} shows the location of the four main cutaneous receptors that respond to mechanical deformation of the skin.

\fig[0.6]{blausen2014medical_skin}{Schema of cutaneous mechanoreceptors in a section of the skin \cite{blausen2014medical}.}

Adaptation rate and receptor size are the two key characteristics that respectively determine the temporal and spatial resolution of these \emph{mechanoreceptors}, as summarized in \tabref{cutaneous_receptors}.
The \emph{adaptation rate} is the speed and duration of the response to a stimulus.

The density of mechanoreceptors varies according to skin type and body region.
\emph{Glabrous skin}, especially on the face, feet, hands, and more importantly, the fingers, is particularly rich in cutaneous receptors, giving these regions great tactile sensitivity.
The density of the Meissner and Merkel receptors, which are the most sensitive, is notably high in the fingertips \cite{johansson2009coding}.
Conversely, \emph{hairy skin} is less sensitive and does not contain Meissner receptors, but has additional receptors at the base of the hairs, as well as receptors known as C-tactile, which are involved in pleasantness and affective touch \cite{ackerley2014touch}.

There are also two types of thermal receptors implanted in the skin, which respond to increases or decreases in skin temperature, respectively, providing sensations of warmth or cold \cite{lederman2009haptic}.
Finally, free nerve endings (without specialized receptors) provide information about pain \cite{mcglone2007discriminative}.

\begin{tab}{cutaneous_receptors}{Characteristics of the cutaneous mechanoreceptors.}[
Adaptation rate is the speed and duration of the receptor's response to a stimulus. Receptive size is the area of skin detectable by a single receptor. Sensitivities are the stimuli detected by the receptor. Adapted from \textcite{mcglone2007discriminative} and \textcite{johansson2009coding}.

\subsubsection{Kinesthetic Sensitivity}
\label{kinesthetic_sensitivity}

Kinesthetic receptors are also mechanoreceptors but are located in the muscles, tendons and joints \cite{jones2006human}.
The muscle spindles respond to the length and the rate of stretch/contraction of the muscles.
Golgi tendon organs, located at the junction of muscles and tendons, respond to the force developed by the muscles.
Ruffini and Pacini receptors are found in the joints and respond to joint movement.
They can also sense external forces and torques applied to the body.
Kinesthetic receptors are therefore closely linked to the motor control of the body.
By providing sensory feedback in response to the position and movement of our limbs, they enable us to perceive our body in space, a perception called \emph{proprioception}.
This allows us to plan and execute precise movements to touch or grasp a target, even with our eyes closed.
Cutaneous mechanoreceptors are essential for this perception because any movement of the body or contact with the environment necessarily deforms the skin \cite{johansson2009coding}.


\subsection{Hand-Object Interactions}
As illustrated in \figref{sensorimotor_continuum}, \Citeauthor{jones2006human} propose to delineate four categories of hand function on this continuum:
\begin{itemize}
\item \emph{Passive touch}, or tactile sensing, is the ability to perceive an object through cutaneous sensations with a static hand contact. The object may be moving, but the hand remains static. It allows for relatively good surface perception, \eg in \textcite{gunther2022smooth}.
\item \emph{Exploration}, or active haptic sensing, is the manual and voluntary exploration of an object with the hand, involving all cutaneous and kinesthetic sensations. It enables a more precise perception than passive touch \cite{lederman2009haptic}.
\item \emph{Prehension} is the action of grasping and holding an object with the hand. It involves fine coordination between hand and finger movements and the haptic sensations produced.
\item \emph{Gestures}, or non-prehensible skilled movements, are motor activities without constant contact with an object. Examples include pointing at a target, typing on a keyboard, accompanying speech with gestures, or signing in sign language, \eg in \textcite{yoon2020evaluating}.
\end{itemize}

The proximal phalanges can also adduct and abduct, \ie move the fingers towards and away from each other.
Finally, the metacarpal of the thumb is capable of flexion/extension and adduction/abduction, which allows the thumb to oppose the other fingers.
These axes of movement are called \DoFs and can be represented by a \emph{kinematic model} of the hand with 27 \DoFs, as shown in \figref{blausen2014medical_hand}.
Thus the thumb has 5 \DoFs, each of the other four fingers has 4 \DoFs and the wrist has 6 \DoFs and can take any position (3 \DoFs) or orientation (3 \DoFs) in space \cite{erol2007visionbased}.

This complex structure enables the hand to perform a wide range of movements and gestures. However, the way we explore and grasp objects follows simpler patterns, depending on the object being touched and the aim of the interaction.
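
As a quick sanity check of this count, the breakdown just described sums to 27 (a trivial sketch in Python):
\begin{verbatim}
# Tally of the 27-DoF kinematic hand model described above:
# thumb 5, each of the four other fingers 4, wrist 6 (3 position +
# 3 orientation).
dofs = {"thumb": 5, "index": 4, "middle": 4, "ring": 4, "little": 4,
        "wrist": 6}
assert sum(dofs.values()) == 27
\end{verbatim}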

\begin{subfigs}{hand}{Anatomy and motion of the hand. }[
\item Schema of the hand skeleton. Adapted from \textcite{blausen2014medical}.
\item Kinematic model of the hand with 27 \DoFs \cite{erol2007visionbased}.
]
\subfigsheight{58mm}
\subfig{blausen2014medical_hand}

\subsubsection{Exploratory Procedures}
\label{exploratory_procedures}

The exploration of an object by the hand follows patterns of movement, called exploratory procedures \cite{lederman1987hand}.
As illustrated in \figref{exploratory_procedures}, a specific and optimal movement of the hand is performed for a given property of the object being explored, to acquire the most relevant sensory information for that property.
For example, a \emph{lateral movement} of the fingers on the surface to identify its texture, a \emph{pressure} with the finger to perceive its hardness, or a \emph{contour following} of the object to infer its shape.
These three procedures involve only the fingertips and in particular the index finger \cite{gonzalez2014analysis}.
For the other procedures, the whole hand is used: for example, approaching or placing the palm to feel the temperature (\emph{static contact}), or holding the object in the hand to estimate its weight (\emph{unsupported holding}).
The \emph{enclosure} with the hand makes it possible to judge the general shape and size of the object.
It takes only \qtyrange{2}{3}{\s} to perform these procedures, except for contour following, which can take about ten seconds \cite{jones2006human}.

\fig{exploratory_procedures}{Exploratory procedures and their associated object properties (in parentheses). Adapted from \textcite{lederman2009haptic}.}

%The haptic sense alone (without vision) thus allows us to recognize objects and materials with great precision.
%The recognition of material properties, \ie the surface and its texture, stiffness and temperature, is better than with the visual sense alone.
%But the recognition of spatial properties, the shape and size of the object, is worse with haptics than with vision \cite{lederman2009haptic}.
%A few seconds (\qtyrange{2}{3}{\s}) are enough to perform these procedures, except for contour following, which can take about ten seconds \cite{jones2006human}.

\subsubsection{Grasp Types}
\label{grasp_types}

Thanks to the degrees of freedom of its skeleton, the hand can take many postures to grasp an object (\secref{hand_anatomy}).
By placing the thumb or palm against the other fingers (pad or palm opposition respectively), or by placing the fingers against each other as if holding a cigarette (side opposition), the hand can hold the object securely.
Grasping adapts to the shape of the object and the task to be performed, \eg grasping a pen with the fingertips then holding it to write, or taking a mug by the body to fill it and by the handle to drink from it \cite{cutkosky1986modeling}.
Three types of grasp are differentiated according to their degree of strength and precision.
In \emph{power grasps}, the object is held firmly and follows the movements of the hand rigidly.
In \emph{precision grasps}, the fingers can move the object within the hand but without moving the arm.
\emph{Intermediate grasps} combine strength and precision in equal proportions \cite{feix2016grasp}.

For all possible objects and tasks, the number of grasp types can be reduced to 34 and classified in the taxonomy of \figref{gonzalez2014analysis} \cite{gonzalez2014analysis}.\footnote{An updated taxonomy was then proposed by \textcite{feix2016grasp}: it is more complete but harder to present.}
For everyday objects, this number is even smaller, with between 5 and 10 grasp types depending on the activity \cite{bullock2013grasp}.
Furthermore, the fingertips are the most involved areas of the hand, both in terms of frequency of use and time spent in contact: In particular, the thumb is almost always used, as well as the index and middle fingers, but the other fingers are used less frequently \cite{gonzalez2014analysis}.
This can be explained by the sensitivity of the fingertips (\secref{haptic_sense}) and the ease with which the thumb can be opposed to the index and middle fingers compared to the other fingers.

\fig{gonzalez2014analysis}{Taxonomy of grasp types of \textcite{gonzalez2014analysis}}[, classified according to their type (power, precision or intermediate) and the shape of the grasped object. Each grasp shows the area of the palm and fingers in contact with the object and the grasp with an example of object.]


\subsection{Haptic Perception of Roughness and Hardness}
The active exploration of an object with the hand is performed as a sensorimotor loop: The exploratory movements (\secref{exploratory_procedures}) guide the search for sensory information (\secref{haptic_sense}) and adapt to it, allowing us to construct a haptic perception of the object's properties.
There are two main types of \emph{perceptual properties}.
The \emph{material properties} are the perception of the roughness, hardness, temperature and friction of the surface of the object \cite{bergmanntiest2010tactual}.
The \emph{spatial properties} are the perception of the weight, shape and size of the object \cite{lederman2009haptic}.

Each of these properties is closely related to a physical property of the object, which is defined and measurable, but perception is a subjective experience and often differs from this physical measurement.
Perception also depends on many other factors, such as the movements made and the exploration time, but also on the person, their sensitivity \cite{hollins2000individual} or age \cite{jones2006human}, and the context of the interaction \cite{kahrimanovic2009context,kappers2013haptic}.
These properties are described and rated\footnotemark using scales opposing two adjectives such as \enquote{rough/smooth} or \enquote{hot/cold} \cite{okamoto2013psychophysical}.
\footnotetext{All the haptic perception measurements described in this chapter were performed by blindfolded participants, to control for the influence of vision.}

The most salient and fundamental perceived material properties are the roughness and hardness of the object \cite{hollins1993perceptual,baumgartner2013visual}, which are also the most studied and best understood \cite{bergmanntiest2010tactual}.


\subsubsection{Roughness}
\label{roughness}

Roughness (or smoothness) is the perception of the \emph{micro-geometry} of a surface, \ie asperities with differences in height on the order of millimeters to micrometers \cite{bergmanntiest2010tactual}.
It is, for example, the perception of the fibers of fabric or wood and the texture of sandpaper or paint.
Roughness is what essentially characterizes the perception of the \emph{texture} of the surface \cite{hollins1993perceptual,baumgartner2013visual}.

When touching a surface in static touch, the asperities deform the skin and cause pressure sensations that allow a good perception of coarse roughness.
But when running the finger over the surface with a lateral movement (\secref{exploratory_procedures}), vibrations are also produced, which give a better discrimination range and precision of roughness \cite{bensmaia2005pacinian}.
In particular, when the asperities are smaller than \qty{0.1}{mm}, such as paper fibers, the pressure cues are no longer captured and only the movement, \ie the vibrations, can be used to detect the roughness \cite{hollins2000evidence}.
This limit distinguishes \emph{macro-roughness} from \emph{micro-roughness}.

The physical properties of the surface determine the haptic perception of roughness.
The most important characteristic is the density of the surface elements, \ie the spacing between them: The perceived (subjective) intensity of roughness increases with spacing, for macro-roughness \cite{klatzky2003feeling,lawrence2007haptic} and micro-roughness \cite{bensmaia2003vibrations}.
For macro-textures, the size of the elements, the force applied and the speed of exploration have limited effects on the intensity perceived \cite{klatzky2010multisensory}: macro-roughness is a \emph{spatial perception}.
This allows us to read Braille \cite{lederman2009haptic}.
However, the speed of exploration affects the perceived intensity of micro-roughness \cite{bensmaia2003vibrations}.

To establish the relationship between spacing and intensity for macro-roughness, patterned textured surfaces were manufactured: as a linear grating (on one axis) composed of ridges and grooves, \eg in \figref{lawrence2007haptic_1} \cite{lederman1972fingertip,lawrence2007haptic}, or as a surface composed of micro conical elements on two axes, \eg in \figref{klatzky2003feeling_1} \cite{klatzky2003feeling}.
As shown in \figref{lawrence2007haptic_2}, there is a quadratic relationship between the logarithm of the perceived roughness intensity $r$ and the logarithm of the space between the elements $s$ ($a$, $b$ and $c$ are empirical parameters to be estimated) \cite{klatzky2003feeling}:
\begin{equation}{roughness_intensity}
\log(r) \sim a \, \log(s)^2 + b \, \log(s) + c
\end{equation}
A larger spacing between elements increases the perceived roughness, but reaches a plateau from \qty{\sim 5}{\mm} for the linear grating \cite{lawrence2007haptic}, while the roughness decreases from \qty{\sim 2.5}{\mm} \cite{klatzky2003feeling} for the conical elements.
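
For illustration, the quadratic log-log model above can be fitted with an ordinary polynomial regression; in the sketch below the (spacing, intensity) values are made-up placeholders, not data from the cited studies.
\begin{verbatim}
# Sketch (Python): fitting log(r) ~ a*log(s)^2 + b*log(s) + c.
# The data points are hypothetical, for illustration only.
import numpy as np

spacing = np.array([0.5, 1.0, 2.0, 3.0, 4.0])    # element spacing s (mm)
intensity = np.array([1.2, 1.8, 2.9, 3.4, 3.5])  # perceived roughness r

a, b, c = np.polyfit(np.log(spacing), np.log(intensity), deg=2)
fitted = np.exp(a * np.log(spacing)**2 + b * np.log(spacing) + c)
\end{verbatim}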

\begin{subfigs}{lawrence2007hapti}{Estimation of haptic roughness of a linear grating surface by active exploration \cite{lawrence2007haptic}. }[
\item Schema of a linear grating surface, composed of ridges and grooves.
\item Perceived intensity of roughness (vertical axis) of the surface as a function of the size of the grooves (horizontal axis, interval of \qtyrange{0.125}{4.5}{mm}), the size of the ridges (RW, circles and squares) and the mode of exploration (with the finger in white and via a rigid probe held in hand in black).
]
\subfig{lawrence2007haptic_2}
\end{subfigs}

It is also possible to perceive the roughness of a surface by \emph{indirect touch}, with a tool held in the hand, for example by writing with a pen on paper \cite{klatzky2003feeling}.
The skin is no longer deformed and only the vibrations of the tool are transmitted.
But this information is sufficient to feel the roughness, whose perceived intensity follows the same quadratic law.
The intensity peak varies with the size of the contact surface of the tool, \eg a small tool allows perceiving finer spaces between the elements than with the finger (\figref{klatzky2003feeling_2}).
However, as the speed of exploration changes the transmitted vibrations, a faster speed shifts the perceived intensity peak slightly to the right, \ie decreasing perceived roughness for fine spacings and increasing it for large spacings \cite{klatzky2003feeling}.

\begin{subfigs}{klatzky2003feeling}{Estimation of haptic roughness of a surface of conical micro-elements by active exploration \cite{klatzky2003feeling}. }[
\item Electron micrograph of conical micro-elements on the surface.
\item Perceived intensity of roughness (vertical axis) of the surface as a function of the average spacing of the elements (horizontal axis, interval of \qtyrange{0.8}{4.5}{mm}) and the mode of exploration (with the finger in black and via a rigid probe held in hand in white).
]
\subfig[.5]{klatzky2003feeling_2}
\end{subfigs}

Even when the fingertips are deafferented (absence of cutaneous sensations), the perception of roughness is maintained \cite{libouton2012tactile}, thanks to the propagation of vibrations in the finger, hand and wrist, for both patterned and \enquote{natural} textures \cite{delhaye2012textureinduced}.
The spectrum of vibrations shifts to higher frequencies as the exploration speed increases, but the brain integrates this change with proprioception to keep the \emph{perception constant} of the texture.
|
||||
For grid textures, as illustrated in \figref{delhaye2012textureinduced}, the ratio of the finger speed $v$ to the frequency of the vibration intensity peak $f_p$ is measured most of the time equal to the period $\lambda$ of the spacing of the elements:
|
||||
For patterned textures, as illustrated in \figref{delhaye2012textureinduced}, the ratio of the finger speed $v$ to the frequency of the vibration intensity peak $f_p$ is measured most of the time equal to the period $\lambda$ of the spacing of the elements:
|
||||
\begin{equation}{grating_vibrations}
|
||||
\lambda \sim \frac{v}{f_p}
|
||||
\end{equation}
|
||||
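For example, reading \eqref{grating_vibrations} with illustrative values chosen within the ranges reported by \textcite{delhaye2012textureinduced}: sliding the finger at $v = \qty{100}{\mm\per\s}$ over a grating with a period of $\lambda = \qty{2}{\mm}$ should produce a vibration intensity peak around $f_p \sim v / \lambda = \qty{50}{\Hz}$.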

The vibrations generated by exploring natural textures are also very specific to each texture and similar between individuals, making them identifiable by vibration alone \cite{manfredi2014natural,greenspon2020effect}.
This shows the importance of vibration cues even for macro textures, and the possibility of generating virtual texture sensations with vibrotactile rendering.

\fig[0.55]{delhaye2012textureinduced}{Speed of finger exploration (horizontal axis) on grating textures with different periods $\lambda$ of spacing (in color) and frequency of the vibration intensity peak $f_p$ propagated in the wrist (vertical axis) \cite{delhaye2012textureinduced}.}

Everyday \enquote{natural} textures are more complex to study because they are composed of multiple elements of different sizes and spacings.
In addition, the perceptions of micro and macro roughness overlap and are difficult to distinguish \cite{okamoto2013psychophysical}.
Thus, individuals have a subjective definition of roughness, with some paying more attention to larger elements and others to smaller ones \cite{bergmanntiest2007haptic}, or even including other perceptual properties such as hardness or friction \cite{bergmanntiest2010tactual}.


\subsubsection{Hardness}
\label{hardness}

Hardness (or softness) is the perception of the \emph{resistance to deformation} of an object when pressed or tapped \cite{bergmanntiest2010tactual}.
The perceived softness of a fruit allows us to judge its ripeness, while ceramic is perceived as hard.
By tapping on a surface, metal will be perceived as harder than wood.
If the surface returns to its original shape after being deformed, the object is elastic (like a spring), otherwise it is plastic (like clay).

When the finger presses on an object (\figref{exploratory_procedures}), its surface will move and deform with some resistance, and the contact area of the skin will also expand, changing the pressure distribution.
When the surface is touched or tapped, vibrations are also transmitted to the skin \cite{higashi2019hardness}.
Passive touch (without voluntary hand movements) and tapping allow a perception of hardness as good as active touch \cite{friedman2008magnitude}.

Two physical properties determine the haptic perception of an object's hardness: its stiffness and its elasticity, as shown in \figref{hardness} \cite{bergmanntiest2010tactual}.
The \emph{stiffness} $k$ of an object is the ratio between the applied force $F$ and the resulting \emph{displacement} $D$ of the surface:
\begin{equation}{stiffness}
k = \frac{F}{D}
\end{equation}
The \emph{elasticity} of an object is expressed by its Young's modulus $Y$, which relates the pressure applied on the object to its resulting relative deformation.
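With the notations of \figref{stiffness_young}, where a force $F$ applied on an area $A$ compresses an object of length $l$ by a distance $D$, Young's modulus follows its standard definition:
% Equation reconstructed from the standard definition of Young's modulus and the notations of \figref{stiffness_young}; the label young_modulus is ours.
\begin{equation}{young_modulus}
Y = \frac{F / A}{D / l}
\end{equation}
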
\begin{subfigs}{stiffness_young}{Perceived hardness of an object by finger pressure. }[
\item Diagram of an object with a stiffness coefficient $k$ and a length $l$ compressed by a force $F$ on an area $A$ by a distance $D$.
\item Identical perceived hardness intensity between Young's modulus (horizontal axis) and stiffness (vertical axis). The dashed and dotted lines indicate the objects tested, the arrows the correspondences made between these objects, and the grey lines the predictions of the quadratic relationship \cite{bergmanntiest2009cues}.
]
\subfig[.3]{hardness}
\subfig[.45]{bergmanntiest2009cues}
\end{subfigs}

With finger pressure, a small relative stiffness difference (the \emph{Weber fraction}) is sufficient to perceive a change in hardness.
However, in the absence of pressure sensations (by placing a thin disc between the finger and the object), the necessary relative difference becomes much larger (Weber fraction of \percent{\sim 50}).
Thus, the perception of hardness relies for \percent{90} on surface deformation cues and for \percent{10} on displacement cues.
In addition, an object with low stiffness but high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
Finally, when pressing with the finger, the perceived hardness intensity $h$ follows a power law with the stiffness $k$ \cite{harper1964subjective}:
\begin{equation}{hardness_intensity}
h = k^{0.8}
\end{equation}
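Concretely, with this exponent of \num{0.8}, doubling the stiffness multiplies the perceived hardness by $2^{0.8} \approx \num{1.7}$, \ie the subjective hardness grows a little more slowly than the physical stiffness.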

%When pressing with the finger, the perceived (subjective) intensity of hardness follows a power law with the stiffness, with an exponent of \num{0.8} \cite{harper1964subjective}, \ie when the stiffness doubles, the perceived hardness increases by a factor of \num{1.7}.
%\textcite{bergmanntiest2009cues} thus observed a quadratic relationship of equal perceived hardness intensity, as illustrated in \figref{bergmanntiest2009cues}.


%\subsubsection{Friction}
%\label{friction}
%
%Friction (or slipperiness) is the perception of \emph{resistance to movement} on a surface \cite{bergmanntiest2010tactual}.
%Sandpaper is typically perceived as sticky because it has a strong resistance to sliding on its surface, while glass is perceived as more slippery.
%This perceptual property is closely related to the perception of roughness \cite{hollins1993perceptual,baumgartner2013visual}.
%
%When running the finger on a surface with a lateral movement (\secref{exploratory_procedures}), the skin-surface contacts generate frictional forces in the opposite direction to the finger movement, giving kinesthetic cues, and also stretch the skin, giving cutaneous cues.
%As illustrated in \figref{smith1996subjective_1}, a stick-slip phenomenon can also occur, where the finger is intermittently slowed by friction before continuing to move, on both rough and smooth surfaces \cite{derler2013stick}.
%The amplitude of the frictional force $F_s$ is proportional to the normal force of the finger $F_n$, \ie the force perpendicular to the surface, according to a coefficient of friction $\mu$:
%\begin{equation}{friction}
%	F_s = \mu \, F_n
%\end{equation}
%The perceived intensity of friction is thus roughly related to the friction coefficient $\mu$ \cite{smith1996subjective}.
%However, it is a complex perception because it is more determined by the micro-scale interactions between the surface and the skin: It depends on many factors such as the normal force applied, the speed of movement, the contact area and the moisture of the skin and the surface \cite{adams2013finger,messaoud2016relation}.
%In this sense, the perception of friction is still poorly understood \cite{okamoto2013psychophysical}.
%
%\begin{subfigs}{smith1996subjective}{Perceived intensity of friction of different materials by active exploration with the finger \cite{smith1996subjective}. }[
% \item Measurements of normal $F_n$ and tangential $F_t$ forces when exploring two surfaces: one smooth (glass) and one rough (nyloprint). The fluctuations in the tangential force are due to the stick-slip phenomenon. The coefficient of friction $\mu$ can be estimated as the slope of the relationship between the normal and tangential forces.
% \item Perceived friction intensity (vertical axis) as a function of the estimated friction coefficient $\mu$ of the exploration (horizontal axis) for four materials (shapes and colors).
% ]
%
%Yet, it is a fundamental perception for grasping and manipulating objects.
%The forces of friction indeed make it possible to hold the object firmly in the hand and prevent it from slipping.
%The perception of friction also allows us to automatically and very quickly adjust the force we apply to the object in order to grasp it \cite{johansson1984roles}.
%If the finger is anaesthetized, the lack of cutaneous sensation prevents effective adjustment of the gripping force: the forces of the object on the finger are no longer correctly perceived, and the fingers then press harder on the object in compensation, but without achieving good opposition of the fingers \cite{witney2004cutaneous}.

%\subsubsection{Temperature}
%\label{temperature}
%
%Temperature (or coldness/warmness) is the perception of the \emph{transfer of heat} between the touched surface and the skin \cite{bergmanntiest2010tactual}:
%When heat is removed from (added to) the skin, the surface is perceived as cold (hot).
%Metal will be perceived as colder than wood at the same room temperature: This perception is different from the physical temperature of the material and is therefore an important property for distinguishing between materials \cite{ho2006contribution}.
%This perception depends on the thermal conductivity and heat capacity of the material, the volume of the object, the initial temperature difference and the area of contact between the surface and the skin \cite{kappers2013haptic}.
%For example, a larger object or a smoother surface, which increases the contact area, causes more heat circulation and a more intense temperature sensation (hot or cold) \cite{bergmanntiest2008thermosensory}.

%Because it is based on heat circulation, the perception of temperature is slower than that of the other material properties and requires a static touch (see \figref{exploratory_procedures}) of several seconds for the skin temperature to equilibrate with that of the object.
%The temperature $T(t)$ of the finger at time $t$ in contact with a surface follows an exponentially decreasing law, where $T_s$ is the initial temperature of the skin, $T_e$ is the temperature of the surface, $t$ is the time and $\tau$ is the time constant:
%\begin{equation}{skin_temperature}
%	T(t) = (T_s - T_e) \, e^{-t / \tau} + T_e
%\end{equation}
%The heat transfer rate, described by $\tau$, and the temperature difference $T_s - T_e$ are the two essential cues for the perception of temperature.
%In everyday conditions, with a room temperature of \qty{20}{\celsius}, a relative difference in heat transfer rate of \percent{43} or a difference of \qty{2}{\celsius} is necessary to perceive a temperature difference \cite{bergmanntiest2009tactile}.


%\subsubsection{Spatial Properties}

%Weight, size and shape are haptic spatial properties that are independent of the material properties described above.

%Weight (or heaviness/lightness) is the perceived \emph{mass} of the object \cite{bergmanntiest2010haptic}.
%It is typically estimated by holding the object statically in the palm of the hand to feel the gravitational force (\secref{exploratory_procedures}).
%A relative weight difference of \percent{8} is then required to be perceptible \cite{brodie1985jiggling}.
%By lifting the object, it is also possible to feel the object's force of inertia, \ie its resistance to acceleration.
%This provides an additional perceptual cue to its mass and slightly improves weight discrimination.
%For both gravity and inertia, kinesthetic cues to force are much more important than cutaneous cues to pressure \cite{bergmanntiest2012investigating}.
%The link between the physical weight and the perceived intensity varies between individuals \cite{kappers2013haptic}.

%Size can be perceived as the object's \emph{length} (in one dimension) or its \emph{volume} (in three dimensions) \cite{kappers2013haptic}.
%In both cases, and if the object is small enough, a precision grip (\figref{gonzalez2014analysis}) between the thumb and index finger can discriminate between sizes with an accuracy of \qty{1}{\mm}, but with an overestimation of length (power law with an exponent of \num{1.3}).
%Alternatively, it is necessary to follow the contours of the object with the fingers to estimate its length (\secref{exploratory_procedures}), but with ten times less accuracy and an underestimation of length (power law with an exponent of \num{0.9}) \cite{bergmanntiest2011cutaneous}.
%The perception of the volume of an object that is not small is typically done by hand enclosure, but the estimate is strongly influenced by the size, shape and mass of the object, for an identical volume \cite{kahrimanovic2010haptic}.

%The shape of an object can be defined as the perception of its \emph{global geometry}, \ie its form and contours.
%This is the case, for example, when looking for a key in a pocket.
%The exploration of contours and enclosure are then employed, as for the estimation of length and volume.
%If the object is not known in advance, object identification is rather slow, taking several seconds \cite{norman2004visual}.
%Therefore, the exploration of other properties is favoured to recognize the object more quickly, in particular marked edges \cite{klatzky1987there}, \eg a screw among nails (\figref{plaisier2009salient_2}), or certain material properties \cite{lakatos1999haptic,plaisier2009salient}, \eg a metal object among plastic objects.

%\begin{subfigs}{plaisier2009salient}{Identification of a sphere among cubes \cite{plaisier2009salient}. }[
% \item The shape has a significant effect on the perception of the volume of an object, \eg a sphere is perceived smaller than a cube of the same volume.
% \item The absence of a marked edge on the sphere makes it easy to identify among cubes.
% ]

\section{Augmenting Object Perception with Wearable Haptics}
\label{wearable_haptics}

One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in a visual \VE \cite{maclean2008it,culbertson2018haptics}.
Moreover, a haptic augmentation system aims at \enquote{modulating the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
The haptic system should be hand-held or worn, \eg on the hand, and \enquote{not permanently attached to or integrated in the object} \cite{bhatia2024augmenting}.


\subsection{Level of Wearability}

Increasing wearability results in the loss of the system's kinesthetic feedback capability.

\begin{subfigs}{pacchierotti2017wearable}{
	Schematic wearability level of haptic devices for the hand \cite{pacchierotti2017wearable}.
}[
\item World-grounded haptic devices are fixed on the environment to provide kinesthetic feedback to the user.
\item Exoskeletons are body-grounded kinesthetic devices.

Haptic research comes from robotics and teleoperation, and historically led to the design of haptic systems that are \emph{world-grounded} to an external support in the environment, such as a table (\figref{pacchierotti2017wearable_1}).
These are robotic arms whose end-effector is either held in the hand or worn on a finger and which simulate interactions with a \VE by providing kinesthetic force and torque feedback (\figref{pacchierotti2015cutaneous}).
They provide high fidelity haptic feedback but are heavy, bulky and limited to small workspaces \cite{culbertson2018haptics}.

More portable designs have been developed by moving the grounded part to the user's body.
The entire robotic system is thus mounted on the user, forming an exoskeleton capable of providing kinesthetic feedback to the finger, \eg in \figref{achibet2017flexifingers}.

\begin{subfigs}{grounded_to_wearable}{
	Haptic devices for the hand with different wearability levels.
}[
\item Teleoperation of a virtual cube grasped with the thumb and index fingers, each attached to a grounded haptic device \cite{pacchierotti2015cutaneous}.
\item A passive exoskeleton for fingers simulating the stiffness of a trumpet's pistons \cite{achibet2017flexifingers}.
\item Manipulation of a virtual cube with the thumb and index fingers, each attached to a 3-RSR wearable haptic device \cite{leonardis20173rsr}.
]
\subfigsheight{38mm}
\subfig{pacchierotti2015cutaneous}
Several actuators are often combined in a haptic device to obtain richer haptic feedback.

\subsubsection{Moving Platforms}
\label{moving_platforms}

Moving platforms translate perpendicularly to the skin to create sensations of contact, pressure and edges \cite{pacchierotti2017wearable}.
Placed under the fingertips, they can come into contact with the skin with different forces, speeds and orientations.
The platform is moved by means of cables, \eg in \figref{gabardi2016new}, or articulated arms, \eg in \figref{perez2017optimizationbased}, activated by motors grounded to the nail \cite{gabardi2016new,perez2017optimizationbased}.
The motors lengthen and shorten the cables or orient the arms to move the platform over 3 \DoFs: two for orientation and one for normal force relative to the finger.
However, these platforms are specifically designed to provide haptic feedback to the fingertip in \VEs, preventing interaction with a \RE.

A pin-array is a surface made up of small, rigid pins arranged very close together in a grid, each of which can be moved individually.
When placed in contact with the fingertip, it can create sensations of edges, pressure and texture.
The \figref{sarakoglou2012high} shows an example of a pin-array consisting of \numproduct{4 x 4} pins of \qty{1.5}{\mm} diameter and \qty{2}{\mm} height, spaced at \qty{2}{\mm} \cite{sarakoglou2012high}.
Pneumatic systems use a fluid such as air or water to inflate membranes under the skin, creating sensations of contact and pressure \cite{raza2024pneumatically}.
Multiple membranes are often used in a grid to simulate edges and textures, as in the \figref{ujitoko2020development} \cite{ujitoko2020development}.
Although these two types of effectors can be considered wearable, their actuation requires a high level of mechanical and electronic complexity that makes the system as a whole non-portable.

\begin{subfigs}{normal_actuators}{
	Normal indentation actuators for the fingertip.
}[
\item A moving platform actuated with cables \cite{gabardi2016new}.
\item A moving platform actuated by articulated limbs \cite{perez2017optimizationbased}.
\item Diagram of a pin-array of tactors \cite{sarakoglou2012high}.
\item A pneumatic system composed of a \numproduct{12 x 10} array of air cylinders \cite{ujitoko2020development}.
]
\subfigsheight{37mm}
\subfig{gabardi2016new}
\label{tangential_actuators}

Similar in design to the moving platforms, tangential motion actuators activate a rigid pin or surface in contact with the skin under the fingertip to create a shearing sensation on the skin.
An articulated and motorized arm structure moves the effector in multiple directions over 2 \DoFs parallel to the skin, \eg in \figref{leonardis2015wearable} \cite{leonardis2015wearable}.
Some actuators are capable of both normal and tangential motion over 3 \DoFs on the skin and can also make or break contact with the finger, \eg in \figref{schorr2017fingertip} \cite{schorr2017fingertip}.

\subsubsection{Compression Belts}
\label{belt_actuators}

A simpler alternative approach is to place a belt under the finger, and to actuate it over 2 \DoFs by two motors placed on top of the finger \cite{minamizawa2007gravity}.
By turning in opposite directions, the motors shorten the belt and create a sensation of pressure.
Conversely, by turning simultaneously in the same direction, the belt pulls on the skin, creating a shearing sensation.
The simplicity of this approach allows the belt to be placed anywhere on the hand, leaving the fingertip free to interact with the \RE, \eg the hRing on the proximal phalanx in \figref{pacchierotti2016hring} \cite{pacchierotti2016hring} or Tasbi on the wrist in \figref{pezent2022design} \cite{pezent2022design}.
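
The mapping from a desired pressure and shear to the two motor commands can be sketched as a simple differential drive (a minimal illustration of the principle described above; the function name, sign conventions and scaling are our assumptions, not those of a specific device):

\begin{verbatim}
def belt_motor_commands(pressure, shear):
    """Sketch of a 2-DoF belt drive: two motors actuate the two ends
    of a belt placed under the finger. Equal winding of both ends
    tightens the belt (pressure), while opposite winding drags the
    belt along the skin (shear). Inputs are desired normalized
    intensities; outputs are the two motor commands in [-1, 1]."""
    left = max(-1.0, min(1.0, pressure + shear))
    right = max(-1.0, min(1.0, pressure - shear))
    return left, right

# Example: pure pressure -> (0.5, 0.5); pure shear -> (0.5, -0.5).
print(belt_motor_commands(0.5, 0.0), belt_motor_commands(0.0, 0.5))
\end{verbatim}
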
\begin{subfigs}{tangential_belts}{Tangential motion actuators and compression belts. }[
\item A skin stretch actuator for the fingertip \cite{leonardis2015wearable}.
\item A 3 \DoF actuator capable of normal and tangential motion on the fingertip \cite{schorr2017fingertip}.
%\item A shearing belt actuator for the fingertip \cite{minamizawa2007gravity}.
\item The hRing, a shearing belt actuator for the proximal phalanx of the finger \cite{pacchierotti2016hring}.
\item Tasbi, a wristband capable of pressure and vibrotactile feedback \cite{pezent2022design}.
]
\subfigsheight{33.5mm}
\subfig{leonardis2015wearable}
\begin{subfigs}{lra}{Diagram and performance of \LRAs. }[
\item Diagram. From Precision Microdrives~\footnotemarkrepeat.
\item Force generated by two \LRAs as a function of sine wave input with different frequencies: both their maximum force and resonant frequency are different \cite{azadi2014vibrotactile}.
]
\subfigsheight{50mm}
\subfig{precisionmicrodrives_lra}
\subsection{Modifying Perceived Haptic Roughness and Hardness}
\label{tactile_rendering}

Tactile rendering of haptic properties consists in modelling and reproducing virtual tactile sensations comparable to those perceived when interacting with real objects \cite{klatzky2013haptic}.
By adding such tactile rendering as feedback to the touch actions of the hand on a real object \cite{bhatia2024augmenting}, the perception of the object's haptic properties can be modified.
The integration of the real and virtual sensations into a single property perception is discussed in more detail in \secref{sensations_perception}.
%, both the real and virtual haptic sensations are integrated into a single property perception, as presented in \secref{sensations_perception}, \ie the perceived haptic property is modulated by the added virtual feedback.
In particular, the visual rendering of a touched object can also influence the perception of its haptic properties, \eg by modifying its visual texture in \AR or \VR, as discussed in the \secref{visuo_haptic}.

\textcite{bhatia2024augmenting} categorize haptic augmentations into three types: direct touch, touch-through, and tool-mediated.
In \emph{direct touch}, typically achieved with wearable haptics, the haptic device does not cover the inside of the hand, so as not to impair the user's interaction with the \RE.
In touch-through and tool-mediated augmentations, or \emph{indirect feel-through} \cite{jeon2015haptic}, the haptic device is interposed between the hand and the \RE.
%We are interested in direct touch augmentations with wearable haptics (\secref{wearable_haptic_devices}), as their integration with \AR is particularly promising for free hand interaction with visuo-haptic augmentations.
Many haptic augmentations were first developed with touch-through devices, and some (but not all) were later transposed to direct touch augmentations with wearable haptic devices.
%We also focus on tactile augmentations stimulating the mechanoreceptors of the skin (\secref{haptic_sense}), thus excluding temperature perception, as they are the most common existing haptic interfaces.

As we chose in \secref{object_properties} to focus on the haptic perception of the roughness and hardness of objects, we review below the methods to modify the perception of these properties.
Of course, wearable haptics can also be used in a direct touch context to modify the perceived friction \cite{konyo2008alternative,salazar2020altering}, weight \cite{minamizawa2007gravity}, or local deformation \cite{salazar2020altering} of real objects, but such augmentations are rare \cite{bhatia2024augmenting} and will not be detailed here.

% \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
% \cite{girard2016haptip} : renderings with a tangential motion actuator
\subsubsection{Roughness}
\label{texture_rendering}

To modify the perception of haptic roughness (or texture, see \secref{roughness}) of a real object, vibrations are typically provided to the skin by the wearable haptic device when running the finger over the surface.
This is because running the finger or a tool on a textured surface generates pressures and vibrations (\secref{roughness}) at frequencies that are too high for the rendering capabilities of most haptic devices \cite{campion2005fundamental,culbertson2018haptics}.
Two main approaches are used to render virtual textures: \emph{simulation models} and \emph{data-driven models} \cite{klatzky2013haptic,culbertson2018haptics}.

\paragraph{Simulation Models}

Simulations of virtual textures are based on the physics of the interaction between the finger and the surface, and are used to generate the vibrations that the user feels when running the finger over the surface.
As more traditional force-feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end-effector instead generate vibrations to simulate the interaction with the virtual texture \cite{campion2005fundamental,culbertson2018haptics}.
High-fidelity force-feedback devices can reproduce patterned textures with great precision and provide perceptions similar to real textures, but they are expensive, have a limited workspace, and impose holding a probe to explore the texture \cite{unger2011roughness}.

The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached \cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator \cite{asano2015vibrotactile}, creating a haptic texture augmentation.
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture specific and similar between individuals \cite{delhaye2012textureinduced,manfredi2014natural}.
Early renderings of virtual textures consisted of modelling the surface with a periodic function: a common method is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity \cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
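
This frequency-modulation method can be sketched as follows (a minimal illustration: the function and parameter names are ours, and the output would still have to be scaled to a specific actuator):

\begin{verbatim}
import numpy as np

def grating_vibration(finger_pos, wavelength=2e-3, amplitude=1.0):
    """Position-based sinusoidal texture rendering (sketch).

    finger_pos: sampled finger positions on the surface (m).
    wavelength: spatial period of the virtual grating (m).
    The phase advances by one cycle per grating period crossed,
    so the instantaneous frequency is f = v / wavelength for a
    finger moving at speed v.
    """
    phase = 2.0 * np.pi * np.asarray(finger_pos) / wavelength
    return amplitude * np.sin(phase)

# Example: finger accelerating from 0 to 100 mm/s during 1 s,
# sampled at 1 kHz: the drive frequency sweeps from 0 to ~50 Hz.
t = np.arange(0.0, 1.0, 1e-3)
pos = 0.5 * 0.1 * t**2   # x = a t^2 / 2 with a = 0.1 m/s^2
drive = grating_vibration(pos)
\end{verbatim}

Driving the signal by position rather than by time keeps the spatial period of the virtual grating consistent with the exploration movement, mirroring the perceptual constancy expressed by \eqref{grating_vibrations}.
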
\paragraph{Data-driven Models}

Because simulations of virtual textures can be very complex to design and to render in real-time, direct captures of real textures have been used instead to model the produced vibrations \cite{culbertson2018haptics}.

\textcite{okamura1998vibration} first dragged a stylus over sandpapers and patterned surfaces to measure the vibrations produced by the interaction.
They found that the contact vibrations with patterns could be modelled as exponentially decaying sine waves (\eqref{contact_transient}) that depend on the normal force and scanning velocity of the stylus on the surface.
This technique was employed by \textcite{ando2007fingernailmounted} to augment a smooth sheet of paper with a virtual patterned texture: with a \LRA mounted on the nail, they rendered the virtual contacts of the finger with \qty{20}{\ms} vibration impulses at \qty{130}{\Hz} (\figref{ando2007fingernailmounted}).
Participants matched the virtual textures to real ones, with \qty{0.25}{\mm} height and \qtyrange{1}{10}{\mm} width, but systematically overestimated the virtual width to be \qty{4}{\mm} longer.

More models have been developed to capture \enquote{natural} textures (such as sandpapers) \cite{guruswamy2011iir} from many force and speed measurements while staying compact and capable of real-time rendering \cite{romano2012creating,culbertson2014modeling}.
Such models take the user's measured velocity and force as inputs to interpolate and generate a virtual texture rendered as vibrations (\secref{vibrotactile_actuators}).
This led to the release of the Penn Haptic Texture Toolkit (HaTT), a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, but captured directly from the fingertip, was released very recently \cite{balasubramanian2024sens3}.
One limitation of these data-driven models is that they can render only isotropic textures: their capture does not depend on the position of the measurement, and the rendering is the same whatever the direction of the movement.
Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}.

When comparing real textures felt through a stylus with their virtual models rendered with a voice-coil actuator attached to the stylus (\figref{culbertson2012refined}), the virtual textures were found to accurately recreate roughness perception, but hardness and friction were not rendered properly \cite{culbertson2014modeling}.
\textcite{culbertson2015should} further showed that the perceived realism of the virtual textures, and their similarity to the real textures, depended mostly on the user's speed but not on the user's force as inputs to the model, \ie responding to velocity is sufficient to render isotropic virtual textures.
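
The interpolation step of such data-driven rendering can be sketched as follows (a simplified illustration only: models like those of \textcite{culbertson2014modeling} interpolate autoregressive filter coefficients rather than a single amplitude, and the names and grid values here are made up):

\begin{verbatim}
import numpy as np

# Hypothetical capture grid: vibration RMS amplitude recorded for one
# texture at a few (speed, force) combinations.
speeds = np.array([0.05, 0.10, 0.20])   # m/s
forces = np.array([0.5, 1.0, 2.0])      # N
rms = np.array([[0.2, 0.3, 0.5],        # rms[i, j] recorded at
                [0.4, 0.6, 0.9],        # speeds[i] and forces[j]
                [0.7, 1.0, 1.4]])       # (made-up values)

def interp_rms(speed, force):
    """Bilinear interpolation of the recorded vibration amplitude
    at the user's current speed and force (sketch)."""
    i = np.clip(np.searchsorted(speeds, speed) - 1, 0, len(speeds) - 2)
    j = np.clip(np.searchsorted(forces, force) - 1, 0, len(forces) - 2)
    ts = (speed - speeds[i]) / (speeds[i + 1] - speeds[i])
    tf = (force - forces[j]) / (forces[j + 1] - forces[j])
    lo = (1 - ts) * rms[i, j] + ts * rms[i + 1, j]
    hi = (1 - ts) * rms[i, j + 1] + ts * rms[i + 1, j + 1]
    return (1 - tf) * lo + tf * hi

# Example: amplitude to drive the actuator at 0.15 m/s under 0.8 N.
print(interp_rms(0.15, 0.8))
\end{verbatim}
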
\begin{subfigs}{textures_rendering_data}{Augmenting haptic texture perception with voice-coil actuators. }[
\item Increasing and decreasing the perceived roughness of a real patterned texture in direct touch \cite{asano2015vibrotactile}.
\item Comparing a real patterned texture with a virtual texture augmentation in direct touch \cite{friesen2024perceived}.
\item Rendering virtual contacts in direct touch with the virtual texture \cite{ando2007fingernailmounted}.
\item Rendering an isotropic virtual texture over a real surface while sliding a hand-held stylus on it \cite{culbertson2012refined}.
]
\subfigsheight{35mm}
\subfig{asano2015vibrotactile}
\subfig{friesen2024perceived}
\subfig{ando2007fingernailmounted}
\subfig{culbertson2012refined}
\end{subfigs}

\subsubsection{Hardness}

The perceived hardness (\secref{hardness}) of a real surface can be modified by modulating either the contact forces or the contact vibrations felt when touching it.

\paragraph{Modulating Forces}

When tapping or pressing a real object, the perceived stiffness $\tilde{k}$ of its surface can be modulated with force feedback \cite{jeon2015haptic}.
This was first proposed by \textcite{jeon2008modulating}, who augmented a real surface tapped in 1 \DoF with a grounded force-feedback device held in hand (\figref{jeon2009haptic_1}).
When the haptic end-effector contacts the object at time $t$, the object's surface deforms by a displacement $x_r(t)$ and opposes a real reaction force $f_r(t)$.
The virtual force of the device $\tilde{f_r}(t)$ is then controlled to:
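% Reconstructed control law: the original equation is elided at this point; this form matches how \eqref{stiffness_augmentation} is applied by \textcite{detinguy2018enhancing} below, where the added virtual force is the added stiffness times the displacement.
\begin{equation}{stiffness_augmentation}
\tilde{f_r}(t) = f_r(t) + \tilde{k} \, x_r(t)
\end{equation}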
A force sensor embedded in the device measures the reaction force $f_r(t)$.
The displacement $x_r(t)$ is estimated from the reaction force and tapping velocity using a pre-defined model of various materials, as described by \textcite{jeon2011extensions}.
As shown in \figref{jeon2009haptic_2}, since the force $\tilde{f_r}(t)$ perceived by the user is modulated, but not the displacement $x_r(t)$, the perceived stiffness becomes $\tilde{k}(t)$.
This stiffness augmentation technique was then extended to enable tapping and pressing with 3 \DoFs \cite{jeon2010stiffness}, to render friction and weight augmentations \cite{jeon2011extensions}, and to grasping and squeezing the real object with two contact points \cite{jeon2012extending}.

\begin{subfigs}{stiffness_rendering_grounded}{Augmenting the perceived stiffness of a real surface with a hand-held force-feedback device. }[%
\item Diagram of a user tapping the surface \cite{jeon2009haptic}.
\textcite{detinguy2018enhancing} transposed this stiffness augmentation technique to the hRing device (\secref{belt_actuators}): while pressing a real piston with the fingertip by a displacement $x_r(t)$, the belt compressed the finger with a virtual force $\tilde{k}\,x_r(t)$, where $\tilde{k}$ is the added stiffness (\eqref{stiffness_augmentation}), increasing the perceived stiffness of the piston (\figref{detinguy2018enhancing}).
%This enables to \emph{increase} the perceived stiffness of the real piston up to \percent{+14}.
More importantly, the augmentation proved to be robust to the placement of the device, as the increased stiffness was perceived the same on the fingertip, the middle phalanx and the proximal phalanx.
Conversely, the technique allowed \emph{decreasing} the perceived stiffness, by compressing the phalanx prior to the contact and diminishing the belt pressure as the user pressed the piston \cite{salazar2020altering}.
\textcite{tao2021altering} proposed instead to restrict the deformation of the fingerpad by pulling a hollow frame around it to decrease the perceived stiffness (\figref{tao2021altering}): it augments the finger contact area and thus the perceived Young's modulus of the object (\secref{hardness}).

\begin{subfigs}{stiffness_rendering_wearable}{Modifying the perceived stiffness with wearable pressure devices. }[%
\begin{equation}{contact_transient}
Q(t) = A \, |v_{in}| \, e^{- \tau t} \sin(2 \pi f t)
\end{equation}
With $A$ the amplitude slope, $\tau$ the sine decay rate and $f$ the sine frequency, which are measured material properties, and $v_{in}$ the impact velocity.
It has been shown that these material properties perceptually express the stiffness (\secref{hardness}) of real \cite{higashi2019hardness} and virtual surfaces \cite{choi2021perceived}.
Therefore, when contacting or tapping a real object through an indirect feel-through interface that provides such vibrations (\figref{choi2021augmenting_control}) using a voice-coil (\secref{vibrotactile_actuators}), the perceived stiffness can be increased or decreased \cite{kuchenbecker2006improving,hachisu2012augmentation,choi2021augmenting}, \eg a sponge feeling stiffer or wood feeling softer (\figref{choi2021augmenting_results}).
A challenge with this technique is to provide the vibration feedback at the right time, so that it is felt simultaneously with the real contact \cite{park2023perceptual}.
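
The generation of such a contact transient follows directly from \eqref{contact_transient} (illustrative parameter values only; real values of $A$, $\tau$ and $f$ are measured per material):

\begin{verbatim}
import numpy as np

def contact_transient(v_in, duration=0.05, sample_rate=2000.0,
                      A=1.0, tau=80.0, f=300.0):
    """Decaying sinusoid Q(t) = A |v_in| exp(-tau t) sin(2 pi f t).

    v_in: impact velocity (m/s); A, tau (1/s) and f (Hz) are the
    material-dependent amplitude slope, decay rate and frequency.
    Returns the vibration samples to send to the actuator at contact.
    """
    t = np.arange(0.0, duration, 1.0 / sample_rate)
    return A * abs(v_in) * np.exp(-tau * t) * np.sin(2 * np.pi * f * t)

# Example: a faster tap (0.3 m/s vs 0.1 m/s) produces a stronger
# transient with the same decay and frequency, i.e. a harder feel.
soft_tap = contact_transient(0.1)
hard_tap = contact_transient(0.3)
\end{verbatim}

As noted above, the practical difficulty is latency: the transient must start close enough to the real contact to be perceived as simultaneous with it \cite{park2023perceptual}.
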
\begin{subfigs}{contact_vibrations}{Augmenting perceived stiffness using vibrations when touching a real surface \cite{choi2021augmenting}. }[%
%\item Experimental setup with a voice-coil actuator attached to a touch-through interface.
\subfig{choi2021augmenting_results}
\end{subfigs}

Vibrations on contact have been employed with wearable haptics but, to the best of our knowledge, only to render \VOs \cite{pezent2019tasbi,teng2021touch,sabnis2023haptic}.
We describe them in the \secref{vhar_haptics}.

%A promising alternative approach

@@ -1,10 +1,10 @@
|
||||
\section{Manipulating Object with the Hands in AR}
|
||||
\section{Manipulating Objects with the Hands in AR}
|
||||
\label{augmented_reality}
|
||||
As with haptic systems (\secref{wearable_haptics}), visual \AR devices generate and integrate virtual content into the user's perception of the \RE, creating the illusion of the presence of the virtual.
Immersive systems such as headsets leave the hands free to interact with \VOs, promising natural and intuitive interactions similar to those with everyday real objects.

%\begin{subfigs}{sutherland1968headmounted}{Photos of the first \AR system \cite{sutherland1968headmounted}. }[
% \item The \AR headset.
% \item Wireframe \ThreeD \VOs were displayed registered in the real environment (as if they were part of it).
% ]
The first \AR headset was invented by \textcite{sutherland1968headmounted}: with the technology available at the time, it was already capable of displaying \VOs at a fixed point in space in real time, giving the user the illusion that the content was present in the room.
Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following the interaction loop presented in \figref[introduction]{interaction-loop}.

\subsubsection{A Definition of AR}
\label{ar_definition}

The system of \cite{sutherland1968headmounted} already fulfilled the first formal definition of \AR, proposed by \textcite{azuma1997survey} in the first survey of the domain:
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}; the original definition was \enquote{registered in \ThreeD}.}

Each of these characteristics is essential: the real-virtual combination distinguishes \AR from \VR; a movie with integrated digital content is not interactive; and a \TwoD overlay like an image filter is not registered.
There are also two key aspects to this definition: it does not focus on technology or method, but on the user's perspective of the system experience, and it does not specify a particular human sense, \ie it can be auditory \cite{yang2022audio}, haptic \cite{bhatia2024augmenting}, or even olfactory \cite{brooks2021stereosmell} or gustatory \cite{brooks2023taste}.
Yet, most research has focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as \v-\AR.

%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.

\subsubsection{Applications of AR}
\label{ar_applications}

Advances in technology, research and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration and entertainment applications \cite{dey2018systematic}.
For example, \AR can provide surgery training simulations in safe conditions \cite{harders2009calibration} (\figref{harders2009calibration}), or help students learn complex concepts and phenomena such as optics or chemistry \cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance or verification \cite{hartl2013mobile} (\figref{hartl2013mobile}), reinvent the way we interact with desktop computers \cite{lee2013spacetop} (\figref{lee2013spacetop}), or create completely new forms of gaming or tourism experiences \cite{roo2017inner} (\figref{roo2017inner}).
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.
Yet, the user experience in \AR is still highly dependent on the display used.

\subsubsection{AR Displays}
\label{ar_displays}
To experience virtual content combined and registered with the \RE, an output \UI that displays the \VE to the user is necessary.
There is a large variety of \AR displays with different methods of combining the real and virtual content (\VST, \OST, or projected), and different locations on the \RE or the user \cite{billinghurst2015survey}.

In \VST-\AR, the virtual images are superimposed on images of the \RE captured by a camera \cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.
This augmented view through the camera has the advantage of complete control over the real-virtual combination, such as mutual occlusion between real and virtual objects \cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images \cite{kruijff2010perceptual}.
But, due to the camera and the screen, the user's view is degraded, with a lower resolution, frame rate and field of view, and an overall visual latency compared to proprioception \cite{kruijff2010perceptual}.

An \OST-\AR display directly combines the virtual images with the real-world view using a transparent optical system \cite{itoh2022indistinguishable}, like augmented glasses, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE at the cost of more difficult registration (spatial misalignment or temporal latency between the real and virtual content) \cite{grubert2018survey} and mutual real-virtual occlusion \cite{macedo2023occlusion}.

Finally, projection-based \AR overlays the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear the display, but requires a real surface to project the virtual content on, and is vulnerable to shadows created by the user or the real objects \cite{billinghurst2015survey}.

\begin{subfigs}{ar_displays}{Simplified operating diagram of \AR display methods. }[
\item \VST-\AR \cite{itoh2022indistinguishable}.
\item \OST-\AR \cite{itoh2022indistinguishable}.
\item Projection-based \AR \cite{roo2017one}.
]
\subfig{itoh2022indistinguishable_vst}
\subfig{itoh2022indistinguishable_ost}
\subfig{roo2017one_2}
\end{subfigs}

Regardless of the \AR display, it can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also use fixed \OST or \VST windows (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight \cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR headsets or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing resources and algorithms, are today's most common \AR displays, but research and development promise more immersive and interactive \AR with headset displays \cite{billinghurst2021grand}.

\fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}

\subsubsection{Presence and Embodiment in AR}
\label{ar_presence_embodiment}

%Despite the clear and acknowledged definition presented in \secref{ar_definition} and the viewpoint of this thesis that \AR and \VR are two types of \MR experiences with different levels of mixing real and virtual environments, as presented in \secref[introduction]{visuo_haptic_augmentations}, there is still a debate on defining \AR and \MR as well as how to characterize and categorize such experiences \cite{speicher2019what,skarbez2021revisiting}.

Presence and embodiment are two key concepts that characterize the user experience in \AR and \VR.
While there is a large body of literature on these topics in \VR, they are less well defined and studied for \AR \cite{tran2024survey,genay2022being}.
Still, these concepts are useful to design, evaluate and discuss our contributions in the next chapters.

\paragraph{Presence}
\label{ar_presence}

\AR and \VR are both essentially illusions, as the virtual content does not physically exist but is digitally simulated and rendered to the user's senses through display \UIs.
Such an experience of suspension of disbelief in \VR is called \emph{presence}, and it can be decomposed into two dimensions: \PI and \PSI \cite{slater2009place}.
\PI is the user's sense of \enquote{being there} in the \VE (\figref{presence-vr}).
It emerges from the real-time rendering of the \VE from the user's perspective: the user can move around inside the \VE and look at it from different points of view.
\PSI is the illusion that the virtual events are really happening, even if the user knows that they are not real.
This does not mean that the virtual events are realistic, but that they are plausible and coherent with the user's expectations.

%The \AR presence is far less defined and studied than for \VR \cite{tran2024survey}
For \AR, \textcite{slater2022separate} proposed to invert \PI to what we can call \enquote{object illusion}, \ie the sense that the \VO \enquote{feels here} in the \RE (\figref{presence-ar}).
As in \VR, \VOs must be able to be seen from different angles by moving the head but also, and this is more difficult, be consistent with the \RE, \eg occlude or be occluded by real objects \cite{macedo2023occlusion}, cast shadows or reflect light.
The \PSI can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react to it accordingly.
\textcite{skarbez2021revisiting} also named \PI for \AR as \enquote{immersion} and \PSI as \enquote{coherence}, and these terms will be used in the remainder of this thesis.
One main issue with presence is how to measure it, both in \VR \cite{slater2022separate} and \AR \cite{tran2024survey}.

\begin{subfigs}{presence}{The sense of immersion in virtual and augmented environments. Adapted from \textcite{stevens2002putting}. }[
\item Place Illusion (PI) is the user's sense of \enquote{being there} in the \VE.
\item The object illusion is the sense that the \VO \enquote{feels here} in the \RE.
]
\subfig{presence-vr}
\subfig{presence-ar}
\end{subfigs}

\paragraph{Embodiment}
\label{ar_embodiment}

The \SoE is the \enquote{subjective experience of using and having a body} \cite{blanke2009fullbody}, \ie the feeling that a body is our own.
In everyday life, we are used to being, seeing and controlling our own body, but it is possible to embody a virtual body as an avatar while in \AR \cite{genay2022being} or \VR \cite{guy2023sense}.
This illusion arises when the visual, proprioceptive and (if any) haptic sensations of the virtual body are coherent \cite{kilteni2012sense}.
It can be decomposed into three subcomponents: \emph{Agency}, which is the feeling of controlling the body; \emph{Ownership}, which is the feeling that \enquote{the body is the source of the experienced sensations}; and \emph{Self-Location}, which is the \enquote{spatial experience of being inside [the] body} \cite{kilteni2012sense}.
In \AR, it could take the form of body accessorization, \eg wearing virtual clothes or make-up in overlay, of partial avatarization, \eg using a virtual prosthesis, or of full avatarization \cite{genay2022being}.


\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}

A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}).%, \eg through a hand-held controller, a tangible object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.

\subsubsection{User Interfaces and Interaction Techniques}
\label{interaction_techniques}

For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then act upon it through an input \UI.
Input \UIs can rely either on \emph{active sensing}, with a held or worn device such as a mouse, a touch screen, or a hand-held controller, or on \emph{passive sensing}, which does not require contact, such as eye trackers, voice recognition, or hand tracking \cite{laviola20173d}.
The information gathered from the sensors by the \UI is then translated into actions within the computer system by an \emph{interaction technique} (\figref{interaction-technique}).
For example, a cursor on a screen can be moved either with a mouse or with the arrow keys on a keyboard, and a two-finger swipe on a touchscreen can be used to scroll or zoom an image.
Choosing useful and efficient \UIs and interaction techniques is crucial for the user experience and the tasks that can be performed within the system.

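To make the separation between input \UI and interaction technique concrete, the following minimal sketch (in Python; the event names and handlers are hypothetical, not from a specific toolkit) shows how two different techniques can map different inputs to the same cursor-move action:
\begin{verbatim}
class Cursor:
    def __init__(self):
        self.x, self.y = 0, 0

    def move_by(self, dx, dy):          # the action within the system
        self.x += dx
        self.y += dy

def mouse_technique(cursor, mouse_delta):
    dx, dy = mouse_delta                # relative device movement
    cursor.move_by(dx, dy)              # direct 1:1 mapping

def arrow_key_technique(cursor, key, step=10):
    offsets = {"Left": (-step, 0), "Right": (step, 0),
               "Up": (0, -step), "Down": (0, step)}
    cursor.move_by(*offsets[key])       # discrete mapping

cursor = Cursor()
mouse_technique(cursor, (4, -2))        # active sensing input
arrow_key_technique(cursor, "Right")    # same action, other technique
\end{verbatim}
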
Wayfinding is the cognitive planning of the movement, such as path finding or route planning.
The \emph{system control tasks} are changes to the system state through commands or menus, such as creating, deleting, or modifying \VOs, \eg as in \figref{roo2017onea}. They also include the input of text, numbers, or symbols.

\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[
\item Spatial selection of a virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
\item Virtual drawing on a tangible object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable the \VE to be registered in it.
]
\subfigsheight{36mm}
\subfig{grubert2015multifi}
\subfig{grubert2017pervasive}
\subfig{roo2017onea}
\subfig{newcombe2011kinectfusion}
\end{subfigs}

\subsubsection{Reducing the Real-Virtual Gap}
\label{real-virtual-gap}

In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale and as an integral part of the \RE.
The rendering gap between the real and virtual elements, as described in the interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as very narrow or even not consciously perceived by the user.
This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.

As the real-virtual rendering gap is reduced, we could expect a similar, seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality-based interactions}.
As of today, an immersive \AR system tracks itself, and thus the user, in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}, \eg as in \figref{newcombe2011kinectfusion}.
This enables the \VE to be registered with the \RE, and the user simply moves to navigate within the virtual content.
%This tracking and mapping of the user and \RE into the \VE is named the \enquote{extent of world knowledge} by \textcite{skarbez2021revisiting}, \ie to what extent the \AR system knows about the \RE and is able to respond to changes in it.
However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite{billinghurst2015survey,hertel2021taxonomy}.

\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}

As \AR integrates visual virtual content into the perception of the \RE, it can involve real surrounding objects as \UIs: to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), and to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
According to \textcite{billinghurst2005designing}, each \VO is coupled to a tangible object, and the \VO is physically manipulated through the tangible object, providing a direct, efficient and seamless interaction with both the real and virtual content.
This is a technique similar to mapping a mouse's movements to a virtual cursor on a screen.

Methods have been developed to automatically pair and adapt the \VOs to render with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \enquote{space-multiplexed} interfaces is the high number and variety of tangibles required.
An alternative is to use a single \enquote{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \enquote{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing it with the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).

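As an illustration of such automatic pairing, the following minimal sketch (in Python; the similarity metric and weights are hypothetical, not those of the cited systems) matches each \VO to the available tangible of most similar size and shape:
\begin{verbatim}
def similarity(vo, tangible, w_size=1.0, w_shape=2.0):
    # Higher is better: penalize relative size difference and
    # mismatched shape categories.
    size_diff = abs(vo["size"] - tangible["size"]) / vo["size"]
    shape_diff = 0.0 if vo["shape"] == tangible["shape"] else 1.0
    return -(w_size * size_diff + w_shape * shape_diff)

def pair(virtual_objects, tangibles):
    return {vo["name"]: max(tangibles, key=lambda t: similarity(vo, t))
            for vo in virtual_objects}

tangibles = [{"name": "mug", "size": 9.0, "shape": "cylinder"},
             {"name": "box", "size": 12.0, "shape": "cuboid"}]
vos = [{"name": "virtual can", "size": 11.0, "shape": "cylinder"}]
print(pair(vos, tangibles))   # pairs the virtual can with the real mug
\end{verbatim}
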
Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
This is especially true in \OST-\AR, as the \VOs are slightly transparent, allowing the paired tangibles to be seen through them.
In a pick-and-place task with tangibles of different shapes, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using}) with the \VOs does not affect user performance or presence, and small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, we described in \secref{tactile_rendering} how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: it could be used to render coherent visuo-haptic material perceptions of objects directly touched with the hand in \AR.

\begin{subfigs}{ar_applications}{Manipulating \VOs with tangibles. }[
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
\item shape difference between a tangible and a \VO is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
\subfig{issartel2016tangible}
\subfig{kahl2021investigation}
\subfig{kahl2023using}
\end{subfigs}


\subsubsection{Manipulating with Virtual Hands}
\label{ar_virtual_hands}
Natural \UIs allow the user to use their body movements directly as inputs to the \VE \cite{billinghurst2015survey}.
Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), hence virtual hand interaction techniques seem the most natural way to manipulate virtual objects \cite{laviola20173d}.
While hands were initially tracked by active sensing devices such as gloves or controllers, it is now possible to track them in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.

The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviola20173d}.
The simplest models represent the hand as a rigid \ThreeD object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.

The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviola20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
However, they can produce unrealistic behaviour and are limited to the cases anticipated by the rules.
Physics-based techniques simulate forces at the contact points between the virtual hand and the \VO.
In particular, \textcite{borst2006spring} proposed an articulated kinematic model in which each phalanx is a rigid body simulated with the god-object \cite{zilles1995constraintbased} method: the virtual phalanx follows the movements of the real phalanx, but remains constrained to the surface of the virtual objects during contact. The forces acting on the object are calculated as a function of the distance between the real and virtual hands (\figref{borst2006spring}).
More advanced techniques simulate friction phenomena \cite{talvas2013godfinger} and finger deformations \cite{talvas2015aggregate}, allowing highly accurate and realistic interactions, but these can be difficult to compute in real time.

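To illustrate the principle behind such couplings, the following minimal one-dimensional sketch (in Python; the gains and the 1D setting are hypothetical simplifications, not taken from the cited implementations) constrains a virtual phalanx to an object surface and derives the applied force from the tracked-virtual separation:
\begin{verbatim}
def constrain_to_surface(x_tracked, surface=0.0):
    """The virtual phalanx follows the tracked one but never passes
    the object surface (the object occupies x > surface)."""
    return min(x_tracked, surface)

def coupling_force(x_tracked, x_virtual, v_virtual, k=500.0, b=5.0):
    """Spring-damper force (N) applied to the object: proportional to
    the tracked-virtual separation, damped for stability."""
    return k * (x_tracked - x_virtual) - b * v_virtual

x_tracked = 0.02    # tracked finger, 2 cm past the surface
x_virtual = constrain_to_surface(x_tracked)
force = coupling_force(x_tracked, x_virtual, v_virtual=0.0)
# force == 10.0: the deeper the tracked finger, the stronger the push.
\end{verbatim}
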
\begin{subfigs}{virtual-hand}{Manipulating \VOs with virtual hands. }[
\item Fingertip tracking that enables selecting a \VO by opening the hand \cite{lee2007handy}.
\item Physics-based hand-object manipulation with a virtual hand made of many small rigid-body spheres \cite{hilliges2012holodesk}.
\item Grasping a \VO through gestures when the fingers are detected as opposing on it \cite{piumsomboon2013userdefined}.
\item A kinematic hand model with rigid-body phalanges (in beige) following the real tracked hand (in green) but kept physically constrained to the \VO. Applied forces are displayed as red arrows \cite{borst2006spring}.
]
\subfigsheight{37mm}
\subfig{lee2007handy}
\subfig{hilliges2012holodesk_1}
\subfig{piumsomboon2013userdefined_1}
\subfig{borst2006spring}
\end{subfigs}

However, the lack of physical constraints on the user's hand movements makes manipulation actions tiring \cite{hincapie-ramos2014consumed}.
While the user's fingers traverse the virtual object, a physics-based virtual hand remains in contact with its surface, a discrepancy that may degrade the user's performance in \VR \cite{prachyabrued2012virtual}.
Finally, in the absence of haptic feedback on each finger, it is difficult to estimate the contact and forces exerted by the fingers on the object during grasping and manipulation \cite{maisto2017evaluation,meli2018combining}.
While a visual rendering of the virtual hand in \VR can compensate for these issues \cite{prachyabrued2014visual}, the visual and haptic rendering of the virtual hand, or their combination, in \AR is under-researched.


\subsection{Visual Rendering of Hands in AR}
\label{ar_visual_hands}

In \VR, as the user is fully immersed in the \VE and cannot see their real hands, it is necessary to represent them virtually (\secref{ar_embodiment}).
When interacting using a physics-based virtual hand method (\secref{ar_virtual_hands}), the visual rendering of the virtual hand has an influence on the perception, interaction performance, and preferences of users \cite{prachyabrued2014visual,argelaguet2016role,grubert2018effects,schwind2018touch}.
In a pick-and-place manipulation task in \VR, \textcite{prachyabrued2014visual} and \textcite{canales2019virtual} found that the visual hand rendering whose motion was constrained to the surface of the \VOs, similarly to \textcite{borst2006spring} (\enquote{Outer Hand} in \figref{prachyabrued2014visual}), performed the worst, while the visual hand rendering following the tracked human hand (thus penetrating the \VOs, \enquote{Inner Hand} in \figref{prachyabrued2014visual}) performed the best, even though it was rather disliked.
\textcite{prachyabrued2014visual} also observed that the best compromise was a double rendering, showing both the virtual hand and the tracked hand (\enquote{2-Hand} in \figref{prachyabrued2014visual}).
While a realistic human hand rendering increases the sense of ownership \cite{lin2016need}, a skeleton-like rendering provides a stronger sense of agency \cite{argelaguet2016role} (\secref{ar_embodiment}), and a minimalistic fingertip rendering reduces errors in typing text \cite{grubert2018effects}.
A visual hand rendering while in a \VE also seems to affect how one grasps an object \cite{blaga2020too}, or how real bumps and holes are perceived \cite{schwind2018touch}.

\fig{prachyabrued2014visual}{Visual hand renderings affect user experience in \VR \cite{prachyabrued2014visual}.}

Conversely, a user sees their own hands in \AR, and the mutual occlusion between the hands and the \VOs is a common issue (\secref{ar_displays}), \ie hiding the \VO when the real hand is in front of it and hiding the real hand when it is behind the \VO (\figref{hilliges2012holodesk_2}).
%For example, in \figref{hilliges2012holodesk_2}, the user is pinching a virtual cube in \OST-\AR with their thumb and index fingers, but while the index is behind the cube, it is seen as in front of it.
While in \VST-\AR this can be solved as a masking problem by combining the real and virtual images \cite{battisti2018seamless}, \eg in \figref{suzuki2014grasping}, in \OST-\AR it is much more difficult because the \VE is displayed as a transparent \TwoD image on top of the \ThreeD \RE, which cannot be easily masked \cite{macedo2023occlusion}.
%Yet, even in \VST-\AR,

%An alternative is to render the \VOs and the virtual hand semi-transparent, so that they are partially visible even when one is occluding the other (\figref{buchmann2005interaction}).
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.

As the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
A \VO overlaying a tangible object in \OST-\AR can vary in size and shape without worsening the users' experience or performance when manipulating it \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.

Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural but seems to be preferred to a mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
%Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in \VST-\AR \cite{buchmann2005interaction,ha2014wearhand,piumsomboon2014graspshell} and \VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in \OST-\AR.
Similarly, \textcite{blaga2017usability} evaluated, for direct hand manipulation in non-immersive \VST-\AR, a skeleton-like rendering against no visual hand rendering: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
\textcite{krichenbauer2018augmented} found participants \percent{22} faster in immersive \VST-\AR than in \VR in the same pick-and-place manipulation task, but no visual hand rendering was used in \VR while the real hand was visible in \AR.
In a collaboration task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
Taken together, these results suggest that a visual hand rendering in \AR could improve direct hand interaction with \VOs.
%To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of \VO manipulation.

\begin{subfigs}{visual-hands}{Visual hand renderings in \AR. }[
\item Grasping a \VO in \OST-\AR with no visual hand rendering \cite{hilliges2012holodesk}.
\item Simulated mutual occlusion between the grasping hand and the \VO in \VST-\AR \cite{suzuki2014grasping}.
\item Grasping a real object with a semi-transparent hand in \VST-\AR \cite{buchmann2005interaction}.
\item Skeleton rendering overlaying the real hand in \VST-\AR \cite{blaga2017usability}.
\item Robotic rendering overlaying the real hands in \OST-\AR \cite{genay2021virtual}.
]
\subfigsheight{29.5mm}
\subfig{hilliges2012holodesk_2}
\subfig{suzuki2014grasping}
\subfig{buchmann2005interaction}
\subfig{blaga2017usability}
\subfig{genay2021virtual}
\end{subfigs}

%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.

% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE \cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should \enquote{modulat[e] the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.

% Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.

\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}

A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).

When the sensations are redundant, \ie when only one sensation could be enough to estimate the property, they are integrated to form a single coherent perception \cite{ernst2004merging}.
No sensory information is completely reliable: it can provide different estimates of the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimate $\tilde{s}$ of the property is the weighted sum of the individual sensory estimates:
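\begin{equation}{mle_integration}
\tilde{s} = \sum_i w_i \, \tilde{s}_i
\qquad \text{with} \qquad
w_i = \frac{1 / \sigma_i^2}{\sum_j 1 / \sigma_j^2}
\end{equation}
With $w_i$ the weight of each sensory estimate, inversely proportional to its variance: the most reliable sensation dominates the integrated perception, and the integrated variance $\sigma^2 = (\sum_i 1/\sigma_i^2)^{-1}$ is smaller than that of any individual estimate \cite{ernst2004merging}.
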
\subsubsection{Visual-Haptic Influences}
\label{visual_haptic_influence}
A visuo-haptic perception of an object's property is thus robust to a certain difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of everyday objects is known to be constructed from both vision and touch \cite{klatzky2010multisensory}.
More precisely, when evaluating surfaces with vision or touch only, both senses mainly discriminate their materials by the same properties of roughness, hardness and friction, and with similar performance \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.

Overall perception can then be modified by changing one of the sensory modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
With a similar setup but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: participants matched visual textures to real textures when their respective hardness felt similar.
They found that the visual perception of roughness and hardness influenced the haptic perception of the real surfaces.

\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs \cite{gunther2022smooth}.}

The visual feedback can even be designed on purpose to influence the haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, on a fixed \VST-\AR screen (\secref{ar_displays}), visually deforming the geometry of a tangible object touched by the hand, as well as that of the touching hand, can alter the visuo-haptic perception of its size, shape or curvature \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).

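At the core of many pseudo-haptic effects is a control-to-display ratio applied to the user's input; the following minimal sketch (in Python; the gain values are hypothetical, not those of the cited systems) illustrates a pseudo-haptic softness effect:
\begin{verbatim}
def visual_deformation(physical_depth_mm, cd_ratio):
    """Control-to-display ratio: displayed indentation per unit of
    physical pressing depth. A ratio > 1 exaggerates the visual
    deformation, making a hard tangible object appear softer."""
    return cd_ratio * physical_depth_mm

stiff_looking = visual_deformation(2.0, cd_ratio=0.5)  # 1 mm shown
soft_looking = visual_deformation(2.0, cd_ratio=3.0)   # 6 mm shown
\end{verbatim}
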
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table that deforms when pressed by the hand \cite{punpongsanon2015softar}.
\item Visually modifying a tangible object and the hand touching it in \VST-\AR to modify its perceived shape \cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\end{subfigs}

The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both the haptic force $F$ and the visual displacement $D$, each perceived with its own delay $\Delta t_h$ or $\Delta t_v$:
\begin{equation}{stiffness_delay}
\tilde{k}(t) = \frac{F(t + \Delta t_h)}{D(t + \Delta t_v)}
\end{equation}
Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay in force and decreases with a visual delay in displacement \cite{diluca2011effects}.

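To make the effect of the delays concrete, the following small numeric sketch (in Python, with hypothetical, linearly ramping force and displacement signals) evaluates the estimate above during a press:
\begin{verbatim}
def F(t):                 # haptic force (N), ramping during the press
    return 2.0 * t

def D(t):                 # visual displacement (m), also ramping
    return 0.01 * t

def k_perceived(t, dt_h=0.0, dt_v=0.0):
    return F(t + dt_h) / D(t + dt_v)

t = 0.5                            # s, middle of the press
print(k_perceived(t))              # 200 N/m with no delay
print(k_perceived(t, dt_h=0.05))   # 220 N/m: haptic delay feels stiffer
print(k_perceived(t, dt_v=0.05))   # ~182 N/m: visual delay feels softer
\end{verbatim}
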
In a similar \TIFC user study, participants compared the perceived stiffness of virtual pistons in \OST-\AR and \VR \cite{gaffary2017ar}.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (video or optical) and seeing the hand touching the \VO.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR and
\item in \VR.
]
Several approaches have been proposed to move the actuator away to another location on the hand.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), thus the haptic feedback (\secref{tactile_rendering}), and the placement of the haptic rendering.

Other wearable haptic actuators have been proposed for \AR but are not detailed here.
|
||||
A first reason is that they permanently cover the fingertip and affect the interaction with the \RE, such as thin-skin tactile interfaces~\cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces~\cite{han2018hydroring}.
|
||||
Another category of actuators relies on systems that cannot be considered as portable, such as REVEL~\cite{bau2012revel} that provide friction sensations with reverse electrovibration that need to modify the real objects to augment, or Electrical Muscle Stimulation (EMS) devices~\cite{lopes2018adding} that provide kinesthetic feedback by contracting the muscles.
|
||||
A first reason is that they permanently cover the fingertip and affect the interaction with the \RE, such as thin-skin tactile interfaces \cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces \cite{han2018hydroring}.
|
||||
Another category of actuators relies on systems that cannot be considered as portable, such as REVEL \cite{bau2012revel} that provide friction sensations with reverse electrovibration that need to modify the real objects to augment, or Electrical Muscle Stimulation (EMS) devices \cite{lopes2018adding} that provide kinesthetic feedback by contracting the muscles.
|
||||
|
||||
|
||||
\subsubsection{Nail-Mounted Devices}
|
||||
\label{vhar_nails}
|
||||
|
||||
\textcite{ando2007fingernailmounted} were the first to propose to move away the actuator from the fingertip, as described in \secref{texture_rendering}.
|
||||
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device mounted on the nail but able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
|
||||
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure (\qty{0.34}{\N} force) and texture (\qtyrange{150}{190}{\Hz} bandwidth) sensations.
|
||||
\textcite{ando2007fingernailmounted} were the first to propose to move away the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
|
||||
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
|
||||
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
|
||||
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
|
||||
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
|
||||
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real tangible objects.
|
||||
@@ -177,16 +178,16 @@ Two rollers, one per side, can deform the skin: When rotating inwards, they pull
By doing quick in and out rotations, they can also simulate a texture sensation.
%The device is also very compact (\qtyproduct{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study not in \AR, but directly touching images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as for \textcite{teng2021touch}, finger speed was not taken into account for rendering vibrations, which may have been detrimental to texture perception, as described in \secref{texture_rendering}.

Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: They are very compact and lightweight vibrotactile \LRA devices designed to feature both integrated sensing of the finger movements and very low-latency haptic feedback (\qty{<5}{ms}).
However, no proper user study was conducted to evaluate these devices in \AR.

\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper \cite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip \cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin \cite{maeda2022fingeret}.
\item Haplets are very compact nail devices with integrated sensing and vibrotactile feedback \cite{preechayasomboon2021haplets}.
]
\subfigsheight{33mm}
%\subfig{ando2007fingernailmounted}
@@ -196,7 +197,7 @@ But no proper user study were conducted to evaluate these devices in \AR.
\end{subfigs}


\subsubsection{Belt Devices}
\label{vhar_rings}

The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been employed to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
@@ -217,38 +218,37 @@ However, the measured difference in performance could be attributed to either th
These two studies were also conducted in non-immersive setups, where users looked at a screen displaying the visual interactions, and only compared haptic and visual renderings of the hand-object contacts, without examining them together.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering the weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube \cite{maisto2017evaluation,meli2018combining}.
]
\subfigsheight{57mm}
\subfig{scheggi2010shape}
\subfig{maisto2017evaluation}
\end{subfigs}


%\subsubsection{Wrist Bracelet Devices}
%\label{vhar_bracelets}

With their \enquote{Tactile And Squeeze Bracelet Interface} (Tasbi), already mentioned in \secref{belt_actuators}, \textcite{pezent2019tasbi} and \textcite{pezent2022design} explored the use of a wrist-worn bracelet actuator.
It is capable of providing a uniform pressure sensation (up to \qty{15}{\N} and \qty{10}{\Hz}) and vibration with six \LRAs (\qtyrange{150}{200}{\Hz} bandwidth).
A user study was conducted in \VR to compare the perception of visuo-haptic stiffness rendering \cite{pezent2019tasbi}, and showed that the haptic pressure feedback was a more important perceptual cue than the visual displacement.
%In a \TIFC task (\secref{sensations_perception}), participants pressed a virtual button with different levels of stiffness via a virtual hand constrained by the \VE (\figref{pezent2019tasbi_2}).
%A higher visual stiffness required a larger physical displacement to press the button (C/D ratio, see \secref{pseudo_haptic}), while the haptic stiffness controlled the rate of the pressure feedback when pressing.
%When the visual and haptic stiffness were coherent or when only the haptic stiffness changed, participants easily discriminated two buttons with different stiffness levels (\figref{pezent2019tasbi_3}).
%However, if only the visual stiffness changed, participants were not able to discriminate the different stiffness levels (\figref{pezent2019tasbi_4}).
%This suggests that in \VR, the haptic pressure is a more important perceptual cue than the visual displacement to render stiffness.
%A short vibration (\qty{25}{\ms}, \qty{175}{\Hz} square-wave) was also rendered when contacting the button, but kept constant across all conditions: It may have affected the overall perception when only the visual stiffness changed.

%\begin{subfigs}{pezent2019tasbi}{Visuo-haptic stiffness rendering of a virtual button in \VR with the Tasbi bracelet. }[
%    \item The \VE seen by the user: the virtual hand (in beige) is constrained by the virtual button. The displacement is proportional to the visual stiffness. The real hand (in green) is hidden by the \VE.
%    \item When the rendered visuo-haptic stiffnesses are coherent (in purple) or only the haptic stiffness changes (in blue), participants easily discriminated the different levels.
%    \item When varying only the visual stiffness (in red) but keeping the haptic stiffness constant, participants were not able to discriminate the different stiffness levels.
%    ]
%    \subfigsheight{45mm}
%    \subfig{pezent2019tasbi_2}
%    \subfig{pezent2019tasbi_3}
%    \subfig{pezent2019tasbi_4}
%\end{subfigs}

% \cite{sarac2022perceived,palmer2022haptic} not in AR but studies on relocating to the wrist the haptic feedback of the fingertip-object contacts.


@@ -13,7 +13,7 @@

%However, a digital audio-visual experience can be imperfect and yet sufficient to be useful and interesting, as a video conference can be, and can convey sensations comparable to real ones, like watching and listening to a concert on a screen with headphones.
%Yet the visual and sound quality of such experiences is very different from that of a "real" conversation or a true everyday scene.
%Thus, rather than recreating realistic haptic experiences, it is more important to render the sensory stimulus "at the right time and at the right place" \cite{hayward2007it}.

%The quality of the illusory haptic experience is a function of the interplay between the user's perceptual system and the intrinsic technical qualities of the interfaces

@@ -23,6 +23,6 @@

%But it is also interesting to note that these two fields are at different stages of maturity.
%Indeed, being able to contribute to both fields raises, among other things, significant technical challenges, as detailed in \secref[introduction]{research_challenges}.
%And there is a need for standardization in wearable haptics \cite{culbertson2018haptics}, notably in terms of devices and renderings, whereas the industry is rather well established in AR, for example with the HoloLens~2 headset from Microsoft~\footnoteurl{https://www.microsoft.com/hololens} and the Vision~Pro from Apple~\footnoteurl{https://www.apple.com/apple-vision-pro/}, or the ARCore framework from Google~\footnoteurl{https://developers.google.com/ar} and ARKit from Apple~\footnoteurl{https://developer.apple.com/augmented-reality/}.
%This can partly be explained, on the one hand, by the maturity of the VR industry, which drives that of AR, with an announced trend towards the convergence of these two technologies \cite{speicher2019what}, and on the other hand by the greater complexity and particularities of the haptic sense \cite{culbertson2018haptics}.
%Conversely, defining and characterizing AR/MR, and to a much lesser extent VR, surprisingly remains an open question \cite{speicher2019what}.

@@ -6,31 +6,31 @@ This Section summarizes the state of the art in visual hand rendering and (weara
\subsection{Visual Hand Rendering in AR}
\label{2_hands}

Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments \cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
In video see-through AR (VST-AR), this could be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image \cite{macedo2023occlusion}.
%
In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked \cite{macedo2023occlusion}.
%
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated \cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.

An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one is occluding the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR \cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR \cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
%
However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.

In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users \cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that over a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control \cite{argelaguet2016role} and that a minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task \cite{grubert2018effects}.

In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
@@ -38,7 +38,7 @@ Additionally, \textcite{kahl2021investigation} showed that a virtual object over
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.

Few works have explored the effect of visual hand rendering in AR \cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
@@ -55,12 +55,12 @@ To the best of our knowledge, evaluating the role of a visual rendering of the h
\label{2_haptics}

Different haptic feedback systems have been explored to improve interactions in AR, including %
grounded force feedback devices \cite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
exoskeletons \cite{lee2021wearable}, %
tangible objects \cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
wearable haptic devices \cite{pacchierotti2016hring, lopes2018adding, pezent2022design, teng2021touch}.

Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content \cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2022design, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve hand tracking during a pick-and-place task.
%
@@ -68,7 +68,7 @@ For example, \textcite{pacchierotti2016hring} designed a haptic ring providing p
%
\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in \cite{ando2007fingernailmounted}.
%
Each of these haptic devices provided haptic feedback about fingertip interactions with the virtual content on other parts of the hand.
%
@@ -82,7 +82,7 @@ Results proved that moving the haptic feedback away from the point(s) of contact
%
In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of \cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices \cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to either the device or the device position (proximal \vs fingertip), or both.
%

@@ -4,31 +4,31 @@
\subsection{Haptics in AR}

As in VR, the addition of haptic feedback in AR has been explored through numerous approaches, including %
grounded force feedback devices \cite{jeon2009haptic,knorlein2009influence,hachisu2012augmentation,gaffary2017ar}, %
exoskeletons \cite{lee2021wearable}, %
wearable haptic devices \cite{maisto2017evaluation,detinguy2018enhancing,lopes2018adding,meli2018combining,pezent2022design,teng2021touch},
tangible objects \cite{punpongsanon2015softar,hettiarachchi2016annexing,kahl2021investigation}, and %
mid-air haptics \cite{ochiai2016crossfield}. %
%
Most have been used to provide haptic feedback to virtual objects.
%
While this may seem similar to haptic feedback in VR, there are significant differences in terms of perception, as in AR the real world and the hand of the user remain visible, but also because the virtual content may be less realistic or inconsistent with the real world \cite{kim2018revisiting,macedo2023occlusion}.
%
Indeed, the same haptic stimuli can be perceived differently in AR and VR, \eg the perceived stiffness of a piston seemed higher in AR than in VR \cite{gaffary2017ar} or was altered in the presence of a delay between the haptic and visual feedback \cite{knorlein2009influence}.
%
It might therefore be interesting to study how haptic and visual augmentations of textures of tangible surfaces are perceived in AR.

An additional challenge in AR is to leave the hand of the user free to touch, feel, and interact with the real objects \cite{maisto2017evaluation,detinguy2018enhancing,teng2021touch}.
%
For example, mounted on the nail, the haptic device of \textcite{teng2021touch} can be quickly unfolded on demand to the fingertip to render haptic feedback of virtual objects.
%
It is however not suitable for rendering haptic feedback when touching real objects.
%
In this respect, some wearable haptic devices were specifically designed to provide haptic feedback about fingertip interactions with the virtual content, but delocalized elsewhere on the body: on the proximal finger phalanx with the hRing haptic ring device \cite{pacchierotti2016hring,ferro2023deconstructing}, on the wrist with the Tasbi bracelet \cite{pezent2022design}, or on the arm \cite{lopes2018adding}.
%
Compared to a fingertip-worn device, the hRing was even preferred by participants and perceived as more effective in a virtual object manipulation task in AR \cite{maisto2017evaluation,meli2018combining}.
%
This device was then taken further to alter the cutaneous perception of touched tangible objects in VR and AR \cite{detinguy2018enhancing,salazar2020altering}: by providing normal and shear forces to the proximal phalanx skin in a timely manner, the perceived stiffness, softness, slipperiness, and local deformations (bumps and holes) of the touched tangible object were augmented.
%
However, wearable haptic devices have not yet been used in AR to modify the texture perception of a tangible surface.

@@ -71,15 +71,15 @@ However, wearable haptic devices have not yet been used in AR to modify the text

Many approaches have been used to generate realistic haptic virtual textures.
%
Ultrasonic vibrating screens are capable of modulating their friction \cite{rekik2017localized,ito2019tactile}, but their use in AR is limited.
%
By simulating the roughness of a surface instead, force feedback devices can reproduce perceptions of patterned textures identical to those of real textures \cite{unger2011roughness}, but they are expensive and have a limited workspace.
%
An alternative is to reproduce the vibrations that occur when a tool or the finger is moved across a surface using a vibrotactile device attached to a hand-held tool \cite{culbertson2018haptics}.
%
Several physical models have been proposed to represent such vibrations \cite{okamura1998vibration,guruswamy2011iir,chan2021hasti}.
%
However, as they can be difficult to tune, measurement-based models have been developed to record, model, and render these vibrations \cite{culbertson2014modeling,culbertson2017ungrounded}.
%
In this work, we employed such data-driven haptic models to augment tangible surfaces and studied their visuo-haptic texture perception in AR.%\CP{Here the original sentence was: ``We use these data-driven haptic models to augment [...].''. It was not clear what ``we use'' meant. Check that the new sentence is correct.}
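
As a rough sketch of how such a measurement-based model can be rendered (the function and coefficients below are hypothetical, for illustration only), the texture vibration is synthesized by filtering white noise through an autoregressive (AR) filter fitted to recorded accelerations; in practice, the filter coefficients are also interpolated according to the current scanning force and speed \cite{culbertson2014modeling}, which this minimal example omits:
\begin{verbatim}
import numpy as np

def synthesize_texture_vibration(ar_coeffs, noise_std, n_samples):
    # AR(p) synthesis: y[t] = e[t] + sum_k a_k * y[t-k], where e is
    # white noise shaped by the filter fitted to recorded accelerations.
    a = np.asarray(ar_coeffs, dtype=float)
    e = np.random.normal(0.0, noise_std, n_samples)
    y = np.zeros(n_samples)
    for t in range(n_samples):
        for k in range(min(len(a), t)):
            y[t] += a[k] * y[t - k - 1]
        y[t] += e[t]
    return y

# Hypothetical AR(2) coefficients and noise power for one texture at a
# fixed scanning force and speed; a real data-driven model would
# interpolate between many such coefficient sets.
vibration = synthesize_texture_vibration([1.6, -0.81], 0.01, 2000)
\end{verbatim}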

@@ -91,9 +91,9 @@ Similarly, \textcite{culbertson2014modeling} compared the similarity of all poss
%
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
For grating textures, an arbitrary roughness rating is used to determine a psycho-physical curve as a function of pattern spacing \cite{unger2011roughness,asano2015vibrotactile,degraen2019enhancing}.
%
Another common method is to identify a given haptic texture among visual representations of all haptic textures \cite{ando2007fingernailmounted,rekik2017localized,degraen2019enhancing,chan2021hasti}.
%
In this user study, participants matched the pairs of visual and haptic textures they found most coherent and ranked the textures according to their perceived roughness.
%\CP{Do you refer to the one in our paper? Not super clear.}
@@ -102,7 +102,7 @@ A few studies have explored vibrotactile haptic devices worn directly on the fin
%
\textcite{ando2007fingernailmounted} mounted a vibrotactile actuator on the index nail, which generated impulse vibrations to render virtual edges and gaps on a real surface.
%
%This rendering method was compared later to providing the vibrations with pressure directly on the fingertip in AR and was found more realistic to render virtual objects and textures \cite{teng2021touch}.
%
%Covering the fingertip is however not suitable for rendering haptic feedback when touching real objects.
%
@@ -123,13 +123,13 @@ The well-known phycho-physical model of \textcite{ernst2002humans} established t
%
This effect has been used to alter the texture perception in AR and VR.
%
For example, virtual visual opaque textures superimposed on real surfaces in AR can be perceived as coherent together even though they have very different roughnesses \cite{kitahara2010sensory}.
%
\textcite{fradin2023humans} explored this effect further, finding that a superimposed AR visual texture slightly different from a colocalized haptic texture affected the ability to recognize the haptic texture.
%
Similarly, \textcite{punpongsanon2015softar} altered the softness perception of a tangible surface using AR-projected visual textures, whereas \textcite{chan2021hasti} evaluated audio-haptic texture perception in VR.
%
Conversely, colocalized 3D-printed real hair structures were able to correctly render several virtual visual textures seen in VR in terms of haptic hardness and roughness \cite{degraen2019enhancing}.
%
This study investigated how virtual haptic roughness textures can be used to enhance touched real surfaces augmented with visual AR textures.
%In this article, the haptic textures are felt colocalized with visual textures

@@ -11,37 +11,37 @@ Yet visual and haptic sensations are often combined in everyday life, and it is
\subsection{Augmenting Haptic Texture Roughness}
\label{vibrotactile_roughness}

When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness \cite{klatzky2003feeling}.
%
%Several approaches have been proposed to render virtual haptic texture \cite{culbertson2018haptics}.
%
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture \cite{unger2011roughness}.
%
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture \cite{culbertson2018haptics}.
%
%In this way, physics-based models \cite{chan2021hasti,okamura1998vibration} and data-based models \cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
%Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture \cite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters \cite{culbertson2015should}.
%
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached \cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator \cite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
%
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR) \cite{bhatia2024augmenting,jeon2009haptic}.
%
One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand \cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture specific and similar between individuals \cite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity \cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%
It remains unclear whether such vibrotactile texture augmentation is perceived the same when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%
%We also add a phase adjustment to this sinusoidal signal to allow free exploration movements of the finger with a simple camera-based tracking system.
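
As an illustrative sketch of this sinusoidal method (the function and parameter names are ours, not from the cited works), locking the vibration phase to the measured finger position makes the instantaneous frequency equal to the finger speed divided by the virtual grating wavelength, and keeps the signal phase-continuous under free exploration:
\begin{verbatim}
import numpy as np

def grating_vibration(x, wavelength, amplitude):
    # x: regularly sampled finger positions (m) on the surface.
    # Locking the phase to position makes the instantaneous frequency
    # finger_speed / wavelength, e.g. 0.1 m/s over a 1 mm grating
    # yields 100 Hz, and the signal stays phase-continuous.
    phase = 2.0 * np.pi * np.asarray(x) / wavelength
    return amplitude * np.sin(phase)

# Hypothetical exploration: 0.1 m/s for 1 s, sampled at 2 kHz.
positions = np.linspace(0.0, 0.1, 2000)
signal = grating_vibration(positions, wavelength=1e-3, amplitude=1.0)
\end{verbatim}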

%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction \cite{brahimaj2023crossmodal,rekik2017localized}.
%
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations \cite{ito2019tactile}.
%
%However, this method is limited to the screen and does not allow to easily render textures on virtual (visual) objects or to alter the perception of real surfaces.

@@ -55,30 +55,30 @@ When the same object property is sensed simultaneously by vision and touch, the
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
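In this maximum-likelihood view \cite{ernst2002humans}, the combined estimate $\hat{S}$ of a property sensed both visually ($\hat{S}_V$) and haptically ($\hat{S}_H$) is a weighted average, each modality being weighted by the inverse of its variance:
\begin{equation}{mle_example}
\hat{S} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V
\end{equation}
so the less variable, \ie more reliable, sense dominates the integrated percept.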
%
%In particular, this effect has been used to better understand the visuo-haptic perception of texture and to design better feedback for virtual objects.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly \cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
%
%Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structures that were touched with the finger to induce a larger set of visuo-haptic material perceptions.
%
%\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch \cite{degraen2019enhancing} and passive touch \cite{gunther2022smooth} contexts.
%
\textcite{normand2024augmenting} also investigated the roughness perception of tangible surfaces touched with the finger and augmented with visual textures in AR and with wearable vibrotactile textures.
%
%A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR \cite{prachyabrued2014visual,blaga2020too} and AR \cite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR \cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.

% \cite{degraen2019enhancing} and \cite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \cite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.

A few works have also used pseudo-haptic feedback, deforming the visual representation of a user input, to change the perception of haptic stimuli and create richer feedback \cite{ujitoko2021survey}.
%
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device \cite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand \cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR \cite{choi2021augmenting}.
%
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture \cite{ujitoko2019modulating}.
%
%However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
@@ -95,8 +95,8 @@ Rendering a virtual piston pressed with one's real hand using a video see-throug
%
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
Using a VST-AR headset has notable consequences, as the \enquote{real} view of the environment and the hand is actually a visual stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR \cite{macedo2023occlusion}.
%
While a large literature has investigated these differences in visual perception, as well as for VR, \eg distances are underestimated \cite{adams2022depth,peillard2019studying}, less is known about visuo-haptic perception in AR and VR.
%
In this work we studied (1) the perception of a \emph{haptic texture augmentation} of a tangible surface and (2) the possible influence of the visual rendering of the environment (OST-AR or VR) and the hand touching the surface (real or virtual) on this perception.