Replace \autocite => \cite
@@ -11,15 +11,15 @@ This thesis presents research on direct hand interaction with real and virtual e
In daily life, we simultaneously look at and touch the everyday objects around us without even thinking about it.
%
-Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\autocite{baumgartner2013visual}.
+Many of these object properties can be perceived in a complementary way through both vision and touch, such as their shape, size or texture~\cite{baumgartner2013visual}.
%
But vision often precedes touch, enabling us to expect the tactile sensations we will feel when touching the object, \eg stiffness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
%
-Information from different sensory sources may be complementary, redundant or contradictory~\autocite{ernst2004merging}.
+Information from different sensory sources may be complementary, redundant or contradictory~\cite{ernst2004merging}.
%
This is why we sometimes want to touch an object to check one of its properties that we have seen and to compare or confront our visual and tactile sensations.
%
-We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
+We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\cite{ernst2002humans}.
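
For reference, the maximum-likelihood cue-combination model of \textcite{ernst2002humans} cited above can be stated compactly (an editor's sketch, not text from the thesis): assuming independent Gaussian noise on the visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$ with variances $\sigma_V^2$ and $\sigma_H^2$, the statistically optimal fused estimate weights each cue by its relative reliability,
\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2},
\]
so the combined percept is at least as reliable as the better of the two single cues.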

The sense of touch not only allows us to perceive our environment, but also to represent ourselves in space and to interact with the surrounding objects.
%
@@ -42,7 +42,7 @@ Touchable interfaces are actuated devices that are directly touched and that can
%
However, graspable interfaces occupy the hand, preventing interaction with other objects, and touchable interfaces often involve cumbersome mechanisms and are by definition limited to their own working surface.
%
-Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\autocite{pacchierotti2017wearable}.
+Instead, wearable interfaces are directly mounted on the body to provide kinesthetic or cutaneous sensations on the skin in a portable way and without restricting the user's movements~\cite{pacchierotti2017wearable}.

\begin{subfigs}{haptic-categories}{
Haptic devices can be classified into three categories according to their interface with the user:
@@ -60,17 +60,17 @@ A wide range of \WH devices have been developed to provide the user with rich vi
%
\figref{wearable-haptics} shows some examples of different \WH devices with different form factors and rendering capabilities.
%
-Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions~\autocite{pacchierotti2017wearable,culbertson2018haptics}.
+Their portability, \ie their small form factor, light weight and unobtrusiveness, makes them particularly promising for everyday use in a variety of applications such as robotics, teleoperation, \VR, and social interactions~\cite{pacchierotti2017wearable,culbertson2018haptics}.
%
But their use in combination with \AR has been little explored so far.

\begin{subfigs}{wearable-haptics}{
Wearable haptic devices can render sensations on the skin as feedback to real or virtual objects being touched.
}[
-\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers~\autocite{choi2016wolverine}.
-\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\autocite{teng2021touch}.
-\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger~\autocite{pacchierotti2016hring}.
-\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\autocite{pezent2019tasbi}.
+\item Wolverine, a wearable exoskeleton that simulates contact and grasping of virtual objects with force feedback on the fingers~\cite{choi2016wolverine}.
+\item Touch\&Fold, a \WH device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\cite{teng2021touch}.
+\item The hRing, a \WH ring mounted on the proximal phalanx able to render normal and shear forces to the finger~\cite{pacchierotti2016hring}.
+\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\cite{pezent2019tasbi}.
]
\subfigsheight{28mm}
\subfig{choi2016wolverine}
@@ -92,7 +92,7 @@ It is technically and conceptually closely related to \VR, which replaces the \R
%
It describes the degree of \RV of the environment along an axis, with one end being the \RE and the other end being a pure \VE, \ie indistinguishable from the real world (such as \emph{The Matrix} movies).
%
-Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments~\autocite{skarbez2021revisiting}.
+Between these two extremes lies \MR, which comprises \AR and \VR as different levels of mixing real and virtual environments~\cite{skarbez2021revisiting}.
%
\AR/\VR is most often understood as addressing only the visual sense, and, like haptics, it can take many forms as a user interface.
%
@@ -114,7 +114,7 @@ The combination of the two axes defines 9 types of \vh environments, with 3 poss
%
For example, a \v-\AE that uses a tangible (touchable) object as a proxy to manipulate virtual content is considered a \h-\RE (\eg \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device that provides synthetic haptic feedback when touching a \VO is considered a \h-\VE (\eg \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
-Haptic \AR is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
+Haptic \AR is then the combination of real and virtual haptic stimuli~\cite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects by providing timely tactile stimuli using \WHs.
%
@@ -133,10 +133,10 @@ The integration of \WHs with \AR seems to be one of the most promising solutions
\begin{subfigs}{visuo-haptic-environments}{
Visuo-haptic environments with different degrees of reality-virtuality.
}[
-\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO~\autocite{kahl2023using}.
-\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO~\autocite{meli2018combining}.
-\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\autocite{salazar2020altering}.
-\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback~\autocite{bau2012revel}.
+\item Visual \AR environment with a real, tangible haptic object used as a proxy to manipulate a \VO~\cite{kahl2023using}.
+\item Visual \AR environment with a \WH device that provides virtual, synthetic feedback from contact with a \VO~\cite{meli2018combining}.
+\item A tangible object seen in a \v-\VR environment whose haptic perception of stiffness is augmented with the hRing haptic device~\cite{salazar2020altering}.
+\item Visuo-haptic rendering of texture on a touched tangible object with a \v-\AR display and haptic electrovibration feedback~\cite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}
@@ -192,14 +192,14 @@ Although closely related, (visual) \AR and \VR have key differences in their res
Firstly, the user's hand and \RE are visible in \AR, unlike \VR where there is total control over the visual rendering of the hand and \VE.
% (unless specifically overlaid with virtual visual content)
%
-As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\autocite{ujitoko2021survey} or haptic retargeting~\autocite{azmandian2016haptic} effects.
-%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\autocite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing~\autocite{azmandian2016haptic}.
+As such, in \VR, visual sensations are particularly dominant in perception, and conflicts with haptic sensations are also specifically created to influence the user's perception, for example to create pseudo-haptic~\cite{ujitoko2021survey} or haptic retargeting~\cite{azmandian2016haptic} effects.
+%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\cite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple \VOs without the user noticing~\cite{azmandian2016haptic}.
%
Moreover, many \WH devices take the form of controllers, gloves or exoskeletons, all of which cover the fingertips and are therefore not suitable for \AR.
%
The user's hand must indeed be free to touch and interact with the \RE.
%
-It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
+It is possible instead to place the haptic actuator close to the point of contact with the \RE, as described above to implement \h-\AR, \eg providing haptic feedback on another phalanx~\cite{asano2015vibrotactile,salazar2020altering} or the wrist~\cite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen as colocalised, but the virtual haptic feedback is not.
%
@@ -217,21 +217,21 @@ It is therefore unclear to what extent the real and virtual visuo-haptic sensati

\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}

-Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\autocite{kim2018revisiting}, \VR~\autocite{bergstrom2021how} and VEs in general~\autocite{laviola20173d}.
+Touching, grasping and manipulating \VOs are fundamental interactions for \AR~\cite{kim2018revisiting}, \VR~\cite{bergstrom2021how} and VEs in general~\cite{laviola20173d}.
%
As the hand is not occupied or covered with a haptic device to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the virtual content with the hand as if it were real.
%
-Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
+Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback~\cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.

In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
Visual \AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
%
-But the depth perception of the \VOs is often underestimated~\autocite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a \VO, \ie that the hand can hide the object or be hidden by the object~\autocite{macedo2023occlusion}.
+But the depth perception of the \VOs is often underestimated~\cite{peillard2019studying,adams2022depth}, and there is often a lack of mutual occlusion between the hand and a \VO, \ie that the hand can hide the object or be hidden by the object~\cite{macedo2023occlusion}.
%
Finally, as illustrated in \figref{interaction-loop}, interacting with a \VO is an illusion, because in fact the real hand is controlling in real time a virtual hand, like an avatar, whose contacts with \VOs are then simulated in the \VE.
%
-Therefore, there is inevitably a delay between the real hand's movements and the \VO's return movements, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the touched \VO~\autocite{prachyabrued2014visual}.
+Therefore, there is inevitably a delay between the real hand's movements and the \VO's return movements, and a spatial shift between the real hand and the virtual hand, whose movements are constrained to the touched \VO~\cite{prachyabrued2014visual}.
%
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force required to grasp and move the object to a desired location.

@@ -283,7 +283,7 @@ Our contributions in these two axes are summarized in \figref{contributions}.

% Very short abstract of contrib 2

-\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a \h-\AE~\autocite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
+\WH devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a \h-\AE~\cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%
@@ -297,7 +297,7 @@ For this first axis of research, we propose to design and evaluate the perceptio
%
To this end, we (1) design a system for rendering virtual visuo-haptic texture augmentations, to (2) evaluate how the perception of these textures is affected by the visual virtuality of the hand and the environment (\AR \vs \VR), and (3) investigate the perception of co-localized visuo-haptic texture augmentations in \AR.

-First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\autocite{culbertson2014modeling,asano2015vibrotactile}.
+First, an effective approach to rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction~\cite{culbertson2014modeling,asano2015vibrotactile}.
%
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
%
@@ -305,11 +305,11 @@ Thus, our first objective is to design an immersive, real time system that allow

Second, many works have investigated the haptic rendering of virtual textures, but few have integrated them with immersive \VEs or have considered the influence of the visual rendering on their perception.
%
-Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\autocite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR~\autocite{diluca2011effects,gaffary2017ar}.
+Still, it is known that the visual feedback can alter the perception of real and virtual haptic sensations~\cite{schwind2018touch,choi2021augmenting} but also that the force feedback perception of grounded haptic devices is not the same in \AR and \VR~\cite{diluca2011effects,gaffary2017ar}.
%
Hence, our second objective is to understand how the perception of haptic texture augmentation differs depending on the degree of visual virtuality of the hand and the environment.

-Finally, some visuo-haptic texture databases have been modelled from real texture captures~\autocite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\autocite{culbertson2015should,friesen2024perceived}.
+Finally, some visuo-haptic texture databases have been modelled from real texture captures~\cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with graspable haptics that are perceived as similar to real textures~\cite{culbertson2015should,friesen2024perceived}.
%
However, the rendering of these textures in an immersive and natural \vh-\AR using \WHs remains to be investigated.
%
@@ -323,15 +323,15 @@ In immersive and wearable \vh-\AR, the hand is free to touch and interact seamle
However, the intangibility of the \v-\VE, the many display limitations of current \v-\AR systems and \WH devices, and the potential discrepancies between these two types of feedback can make the manipulation of \VOs particularly challenging.
%However, the intangibility of the virtual visual environment, the lack of kinesthetic feedback of \WHs, the visual rendering limitations of current \AR systems, as well as the spatial and temporal discrepancies between the real environment, the visual feedback, and the haptic feedback, can make the interaction with \VOs with bare hands particularly challenging.
%
-Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand~\autocite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs~\autocite{lopes2018adding,teng2021touch}.
+Still, two types of sensory feedback are known to improve such direct \VO manipulation, but they have not been studied in combination in immersive \v-\AE: visual rendering of the hand~\cite{piumsomboon2014graspshell,prachyabrued2014visual} and contact rendering with \WHs~\cite{lopes2018adding,teng2021touch}.
%
For this second axis of research, we propose to design and evaluate the role of visuo-haptic augmentations of the hand as interaction feedback with \VOs.
%
We consider (1) the effect of different visual augmentations of the hand as \AR avatars and (2) the effect of combining different visuo-haptic augmentations of the hand.

-First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR~\autocite{prachyabrued2014visual,grubert2018effects}.
+First, the visual rendering of the virtual hand is a key element for interacting and manipulating \VOs in \VR~\cite{prachyabrued2014visual,grubert2018effects}.
%
-A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs~\autocite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\autocite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
+A few works have also investigated the visual rendering of the virtual hand in \AR, from simulating mutual occlusions between the hand and \VOs~\cite{piumsomboon2014graspshell,al-kalbani2016analysis} to displaying the virtual hand as an avatar overlay~\cite{blaga2017usability,yoon2020evaluating}, augmenting the real hand.
%
But \v-\AR has significant perceptual differences from \VR due to the visibility of the real hand and environment, and these visual hand augmentations have not been evaluated in the context of \VO manipulation.
%
@@ -339,7 +339,7 @@ Thus, our fourth objective is to evaluate and compare the effect of different vi

Finally, as described above, \WHs for \v-\AR rely on moving the haptic actuator away from the fingertips to not impair the hand movements, sensations, and interactions with the \RE.
%
-Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience~\autocite{maisto2017evaluation,meli2018combining}.
+Previous works have shown that \WHs that provide feedback on the hand manipulation with \VOs in \AR can significantly improve the user performance and experience~\cite{maisto2017evaluation,meli2018combining}.
%
However, it is unclear which positioning of the actuator is the most beneficial, nor how a haptic augmentation of the hand compares with or complements a visual augmentation of the hand.
%
@@ -385,7 +385,7 @@ We use psychophysical methods to measure the user roughness perception of the vi

\chapref{ar_textures} presents a second user study using the same system and evaluating the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
%
-The textures are paired visual and tactile models of real surfaces~\autocite{culbertson2014one}, and are rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
+The textures are paired visual and tactile models of real surfaces~\cite{culbertson2014one}, and are rendered on the touched augmented surfaces as visual texture overlays and as vibrotactile feedback, respectively.
%
%We investigate the perception and user appreciation of the combination of nine representative visuo-haptic pairs of texture.
%

@@ -171,29 +171,29 @@ In particular, we are interested in wearable actuators stimulating the m
\subsubsection{Texture}
\label{texture_rendering}

-Several approaches have been proposed to render virtual haptic texture~\autocite{culbertson2018haptics}.
+Several approaches have been proposed to render virtual haptic texture~\cite{culbertson2018haptics}.
%
-High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and require holding a probe to explore the texture~\autocite{unger2011roughness}.
+High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and require holding a probe to explore the texture~\cite{unger2011roughness}.
%
-As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}.
+As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}.
%
-In this way, physics-based models~\autocite{chan2021hasti,okamura1998vibration,guruswamy2011iir} and data-based models~\autocite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but only approximating real textures, and the latter being more realistic but limited to the captured textures.
+In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration,guruswamy2011iir} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but only approximating real textures, and the latter being more realistic but limited to the captured textures.
%
Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
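
As an editor's aside, the decaying-sinusoid transient just mentioned is simple to reproduce. Below is a minimal Python sketch; the function name and the parameter values are hypothetical placeholders, and the speed-scaled amplitude is our reading of the technique rather than a detail taken from the cited paper.

```python
import numpy as np

def grating_transient(speed, amplitude_gain=0.1, decay_rate=60.0,
                      frequency_hz=150.0, fs=2000, duration_s=0.05):
    """One ridge-crossing transient as an exponentially decaying sinusoid:
    q(t) = A * exp(-B t) * sin(2 pi f t),
    with the amplitude A scaled by the scanning speed."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    amplitude = amplitude_gain * speed  # faster strokes -> stronger transient
    return amplitude * np.exp(-decay_rate * t) * np.sin(2.0 * np.pi * frequency_hz * t)

# One transient would be emitted each time the finger crosses a groove edge;
# summing the overlapping transients yields the full grating signal.
```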
%
-An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}, relying on the user's real-time measurements of position, velocity and force to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\autocite{culbertson2015should}.
+An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on the user's real-time measurements of position, velocity and force to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
% For example, when comparing the same virtual texture pairwise, but with different parameters, \textcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
% Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
-The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\autocite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\autocite{asano2015vibrotactile}, creating a haptic texture augmentation.
+The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile}, creating a haptic texture augmentation.
%
-The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\autocite{bhatia2024augmenting,jeon2009haptic}.
+The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.

-One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\autocite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
+One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
-Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\autocite{manfredi2014natural}.
+Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\cite{manfredi2014natural}.
%
-A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\autocite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
+A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
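
To make this method concrete, here is a minimal Python sketch (an editor's illustration, not code from the cited works; the names and the grating-texture assumption are ours): the instantaneous frequency equals the finger speed divided by the spatial period of the simulated texture, and the phase is integrated so the signal stays continuous as the speed changes.

```python
import numpy as np

def texture_signal(finger_speeds_mm_s, spatial_period_mm=1.0,
                   fs=2000, amplitude=1.0):
    """Velocity-modulated sinusoidal texture vibration.

    finger_speeds_mm_s: per-sample finger speed (mm/s), sampled at fs.
    A texture of spatial period L swept at speed v excites the skin at
    the temporal frequency f = v / L; integrating the phase avoids
    discontinuities when the speed varies."""
    inst_freq_hz = np.asarray(finger_speeds_mm_s) / spatial_period_mm
    phase = 2.0 * np.pi * np.cumsum(inst_freq_hz) / fs
    return amplitude * np.sin(phase)

# Example: finger accelerating from 20 to 120 mm/s over one second.
speeds = np.linspace(20.0, 120.0, 2000)
signal = texture_signal(speeds)  # ready to drive a voice-coil actuator
```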
%
It remains unclear whether such vibrotactile texture augmentation is perceived in the same way when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%

@@ -21,7 +21,7 @@ The psychophysical model of \textcite{ernst2002humans} established that the sens
\subsubsection{Texture Augmentations}
\label{vhar_texture}

-Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
+Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.

@@ -30,14 +30,14 @@ They extended this prototype in [@Bau2012REVEL] to alter the texture of touch

Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structures that were touched with the finger to induce a larger set of visuo-haptic materials perception.
\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
-Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\autocite{degraen2019enhancing} and passive touch~\autocite{gunther2022smooth} contexts.
+Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\cite{degraen2019enhancing} and passive touch~\cite{gunther2022smooth} contexts.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.

\subsection{Improving the Interactions}
\label{vhar_interaction}

-Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\autocite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\autocite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
+Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.

\subsubsection{Virtual Hands in Augmented Reality}
\label{vhar_hands}

@@ -6,31 +6,31 @@ This Section summarizes the state of the art in visual hand rendering and (weara
\subsection{Visual Hand Rendering in AR}
\label{2_hands}

-Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\autocite{piumsomboon2014graspshell, al-kalbani2016analysis}.
+Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
-In video see-through AR (VST-AR), this could be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image~\autocite{macedo2023occlusion}.
+In video see-through AR (VST-AR), this could be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image~\cite{macedo2023occlusion}.
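
A minimal sketch of this masking idea (our own illustration; the array names and the availability of a hand segmentation mask and per-pixel depth maps are assumptions, not details from the cited survey):

```python
import numpy as np

def composite_vst_frame(camera_rgb, virtual_rgba, hand_mask,
                        hand_depth, virtual_depth):
    """Per-pixel masking for video see-through AR occlusion.

    camera_rgb: HxWx3 camera image; virtual_rgba: HxWx4 rendered layer;
    hand_mask: HxW bool segmentation of the real hand;
    hand_depth, virtual_depth: HxW depth maps in the same units.
    The virtual layer is alpha-blended over the camera image except
    where the real hand is both detected and closer than the virtual
    surface, so the hand correctly occludes the virtual object."""
    alpha = virtual_rgba[..., 3:].astype(np.float32) / 255.0
    blended = (1.0 - alpha) * camera_rgb + alpha * virtual_rgba[..., :3]
    show_virtual = ~(hand_mask & (hand_depth < virtual_depth))
    out = camera_rgb.astype(np.float32)
    out[show_virtual] = blended[show_virtual]
    return out.astype(np.uint8)
```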
%
-In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\autocite{macedo2023occlusion}.
+In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\cite{macedo2023occlusion}.
%
-Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\autocite{al-kalbani2016analysis, maisto2017evaluation}.
+Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.

An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one is occluding the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
-Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\autocite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\autocite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
+Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
%
However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.

In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
-It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\autocite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
+It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects), performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
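
A sketch of what "constrained" means here (our own illustration, with a hypothetical axis-aligned box as the only obstacle): the tracked hand is drawn where the sensors report it, while the constrained copy is projected to the nearest non-penetrating position.

```python
import numpy as np

def constrained_hand_position(tracked_pos, box_center, box_half_extents):
    """Constrained copy of the tracked hand for a double-hand rendering.

    The tracked hand follows the sensor pose and may penetrate objects;
    this returns the position of the constrained copy, pushed to the
    nearest point on the surface of an axis-aligned box obstacle."""
    local = np.asarray(tracked_pos, dtype=float) - box_center
    clamped = np.clip(local, -box_half_extents, box_half_extents)
    if np.all(np.abs(local) < box_half_extents):
        # Inside the box: exit along the axis of least penetration.
        axis = int(np.argmin(box_half_extents - np.abs(local)))
        direction = 1.0 if local[axis] >= 0.0 else -1.0
        clamped[axis] = direction * box_half_extents[axis]
    return box_center + clamped

# Rendering both copies (tracked hand semi-transparent, constrained copy
# opaque) combines a sense of agency with plausible visual contact.
```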
%
-It has also been shown that, compared to a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\autocite{argelaguet2016role} and that minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\autocite{grubert2018effects}.
+It has also been shown that, compared to a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role} and that minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.

In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
@@ -38,7 +38,7 @@ Additionally, \textcite{kahl2021investigation} showed that a virtual object over
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.

-Few works have explored the effect of visual hand rendering in AR~\autocite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
+Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
@@ -55,12 +55,12 @@ To the best of our knowledge, evaluating the role of a visual rendering of the h
\label{2_haptics}

Different haptic feedback systems have been explored to improve interactions in AR, including %
-grounded force feedback devices~\autocite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
-exoskeletons~\autocite{lee2021wearable}, %
-tangible objects~\autocite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
-wearable haptic devices~\autocite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.
+grounded force feedback devices~\cite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
+exoskeletons~\cite{lee2021wearable}, %
+tangible objects~\cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
+wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.

-Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content~\autocite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
+Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
%
@@ -68,7 +68,7 @@ For example, \textcite{pacchierotti2016hring} designed a haptic ring providing p
%
\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
-This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\autocite{ando2007fingernailmounted}.
+This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
Each of these haptic devices provided haptic feedback about fingertip interactions with the virtual content on other parts of the hand.
%
@@ -82,7 +82,7 @@ Results proved that moving the haptic feedback away from the point(s) of contact
%
In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic {rendering of the} fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
-Moreover, employing the haptic ring of~\autocite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\autocite{chinello2020modular}.
+Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to either the device or the device position (proximal vs fingertip), or both.
%

@@ -4,31 +4,31 @@
\subsection{Haptics in AR}

As in VR, the addition of haptic feedback in AR has been explored through numerous approaches, including %
-grounded force feedback devices~\autocite{jeon2009haptic,knorlein2009influence,hachisu2012augmentation,gaffary2017ar}, %
-exoskeletons~\autocite{lee2021wearable}, %
-wearable haptic devices~\autocite{maisto2017evaluation,detinguy2018enhancing,lopes2018adding,meli2018combining,pezent2019tasbi,teng2021touch},
-tangible objects~\autocite{punpongsanon2015softar,hettiarachchi2016annexing,kahl2021investigation}, and %
-mid-air haptics~\autocite{ochiai2016crossfield}. %
+grounded force feedback devices~\cite{jeon2009haptic,knorlein2009influence,hachisu2012augmentation,gaffary2017ar}, %
+exoskeletons~\cite{lee2021wearable}, %
+wearable haptic devices~\cite{maisto2017evaluation,detinguy2018enhancing,lopes2018adding,meli2018combining,pezent2019tasbi,teng2021touch},
+tangible objects~\cite{punpongsanon2015softar,hettiarachchi2016annexing,kahl2021investigation}, and %
+mid-air haptics~\cite{ochiai2016crossfield}. %
%
Most have been used to provide haptic feedback to virtual objects.
%
-While this may seem similar to haptic feedback in VR, there are significant differences in terms of perception, as in AR the real world and the hand of the user remain visible, but also because the virtual content may be less realistic or inconsistent with the real world~\autocite{kim2018revisiting,macedo2023occlusion}.
+While this may seem similar to haptic feedback in VR, there are significant differences in terms of perception, as in AR the real world and the hand of the user remain visible, but also because the virtual content may be less realistic or inconsistent with the real world~\cite{kim2018revisiting,macedo2023occlusion}.
%
-Indeed, the same haptic stimuli can be perceived differently in AR and VR, \eg the perceived stiffness of a piston seemed higher in AR than in VR~\autocite{gaffary2017ar} or was altered in the presence of a delay between the haptic and visual feedback~\autocite{knorlein2009influence}.
+Indeed, the same haptic stimuli can be perceived differently in AR and VR, \eg the perceived stiffness of a piston seemed higher in AR than in VR~\cite{gaffary2017ar} or was altered in the presence of a delay between the haptic and visual feedback~\cite{knorlein2009influence}.
%
It might therefore be interesting to study how haptic and visual augmentations of textures of tangible surfaces are perceived in AR.

-An additional challenge in AR is to let the hand of the user free to touch, feel, and interact with the real objects~\autocite{maisto2017evaluation,detinguy2018enhancing,teng2021touch}.
+An additional challenge in AR is to let the hand of the user free to touch, feel, and interact with the real objects~\cite{maisto2017evaluation,detinguy2018enhancing,teng2021touch}.
%
For example, mounted on the nail, the haptic device of \textcite{teng2021touch} can be quickly unfolded on demand to the fingertip to render haptic feedback of virtual objects.
%
It is however not suitable for rendering haptic feedback when touching real objects.
%
-In this respect, some wearable haptic devices were specifically designed to provide haptic feedback about fingertip interactions with the virtual content, but delocalized elsewhere on the body: on the proximal finger phalanx with the hRing haptic ring device~\autocite{pacchierotti2016hring,ferro2023deconstructing}, on the wrist with the Tasbi bracelet~\autocite{pezent2019tasbi}, or on the arm~\autocite{lopes2018adding}.
+In this respect, some wearable haptic devices were specifically designed to provide haptic feedback about fingertip interactions with the virtual content, but delocalized elsewhere on the body: on the proximal finger phalanx with the hRing haptic ring device~\cite{pacchierotti2016hring,ferro2023deconstructing}, on the wrist with the Tasbi bracelet~\cite{pezent2019tasbi}, or on the arm~\cite{lopes2018adding}.
%
-Compared to a fingertip worn device, the hRing was even preferred by participants and perceived as more effective in a virtual object manipulation task in AR~\autocite{maisto2017evaluation,meli2018combining}.
+Compared to a fingertip worn device, the hRing was even preferred by participants and perceived as more effective in a virtual object manipulation task in AR~\cite{maisto2017evaluation,meli2018combining}.
%
-This device has then been taken further to alter cutaneous perception of touched tangible objects in VR and AR~\autocite{detinguy2018enhancing,salazar2020altering}: by providing normal and shear forces to the proximal phalanx skin in a timely manner, the perceived stiffness, softness, slipperiness, and local deformations (bumps and holes) of the touched tangible object were augmented.
+This device has then been taken further to alter cutaneous perception of touched tangible objects in VR and AR~\cite{detinguy2018enhancing,salazar2020altering}: by providing normal and shear forces to the proximal phalanx skin in a timely manner, the perceived stiffness, softness, slipperiness, and local deformations (bumps and holes) of the touched tangible object were augmented.
%
However, wearable haptic devices have not yet been used in AR to modify the texture perception of a tangible surface.

@@ -59,27 +59,27 @@ However, wearable haptic devices have not yet been used in AR to modify the text

\subsection{Virtual Texture Perception}

-% Explain how different from \autocite{konyo2005tactile, asano2012vibrotactile, asano2015vibrotactile}, \autocite{ando2007fingernailmounted}, \autocite{bau2012revel}, \autocite{chan2021hasti}, and culbertson
+% Explain how different from \cite{konyo2005tactile, asano2012vibrotactile, asano2015vibrotactile}, \cite{ando2007fingernailmounted}, \cite{bau2012revel}, \cite{chan2021hasti}, and culbertson
%
%Tactile perception of a real texture involves multiple sensations, among them roughness being one of the most important.
%
-%When running a finger over a surface, the perception of its roughness is due to the deformation of the skin caused by the micro height differences of the material \autocite{klatzky1999tactile,klatzky2003feeling}.
+%When running a finger over a surface, the perception of its roughness is due to the deformation of the skin caused by the micro height differences of the material \cite{klatzky1999tactile,klatzky2003feeling}.
%
%Interestingly, visual perception of material roughness seems almost as good as haptic perception of roughness.
%
-%However, there is a greater variability between individuals for visual perception than for haptic perception of roughness \autocite{bergmanntiest2007haptic}.
+%However, there is a greater variability between individuals for visual perception than for haptic perception of roughness \cite{bergmanntiest2007haptic}.

Many approaches have been used to generate realistic haptic virtual textures.
%
-Ultrasonic vibrating screens are capable of modulating their friction~\autocite{rekik2017localized,ito2019tactile}, but their use in AR is limited.
+Ultrasonic vibrating screens are capable of modulating their friction~\cite{rekik2017localized,ito2019tactile}, but their use in AR is limited.
%
-By simulating the roughness of a surface instead, force feedback devices can reproduce perceptions of patterned textures identical to those of real textures~\autocite{unger2011roughness}, but they are expensive and have a limited workspace.
+By simulating the roughness of a surface instead, force feedback devices can reproduce perceptions of patterned textures identical to those of real textures~\cite{unger2011roughness}, but they are expensive and have a limited workspace.
%
-An alternative is to reproduce the vibrations that occur when a tool or the finger is moved across a surface using a vibrotactile device attached to a hand-held tool~\autocite{culbertson2018haptics}.
+An alternative is to reproduce the vibrations that occur when a tool or the finger is moved across a surface using a vibrotactile device attached to a hand-held tool~\cite{culbertson2018haptics}.
%
-Several physical models have been proposed to represent such vibrations~\autocite{okamura1998vibration,guruswamy2011iir,chan2021hasti}.
+Several physical models have been proposed to represent such vibrations~\cite{okamura1998vibration,guruswamy2011iir,chan2021hasti}.
%
-However, as they can be difficult to tune, measurement-based models have been developed to record, model, and render these vibrations~\autocite{culbertson2014modeling,culbertson2017ungrounded}.
+However, as they can be difficult to tune, measurement-based models have been developed to record, model, and render these vibrations~\cite{culbertson2014modeling,culbertson2017ungrounded}.
%
In this work, we employed such data-driven haptic models to augment tangible surfaces and studied the visuo-haptic texture perception of these surfaces in AR.%\CP{Here the original sentence was: ``We use these data-driven haptic models to augment [...].''. It was not clear what ``we use'' meant. Check that the new sentence is correct.}

@@ -91,9 +91,9 @@ Similarly, \textcite{culbertson2014modeling} compared the similarity of all poss
%
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
-For grating textures, an arbitrary roughness rating is used to determine a psycho-physical curve as a function of pattern spacing~\autocite{unger2011roughness,asano2015vibrotactile,degraen2019enhancing}.
+For grating textures, an arbitrary roughness rating is used to determine a psycho-physical curve as a function of pattern spacing~\cite{unger2011roughness,asano2015vibrotactile,degraen2019enhancing}.
%
-Another common method is to identify a given haptic texture among visual representations of all haptic textures~\autocite{ando2007fingernailmounted,rekik2017localized,degraen2019enhancing,chan2021hasti}.
+Another common method is to identify a given haptic texture among visual representations of all haptic textures~\cite{ando2007fingernailmounted,rekik2017localized,degraen2019enhancing,chan2021hasti}.
%
In this user study, participants matched the pairs of visual and haptic textures they found most coherent and ranked the textures according to their perceived roughness.
%\CP{Do you refer to the one in our paper? Not super clear.}
@@ -102,7 +102,7 @@ A few studies have explored vibrotactile haptic devices worn directly on the fin
%
\textcite{ando2007fingernailmounted} mounted a vibrotactile actuator on the index nail, which generated impulse vibrations to render virtual edges and gaps on a real surface.
%
-%This rendering method was compared later to providing the vibrations with pressure directly on the fingertip in AR and was found more realistic to render virtual objects and textures~\autocite{teng2021touch}.
+%This rendering method was compared later to providing the vibrations with pressure directly on the fingertip in AR and was found more realistic to render virtual objects and textures~\cite{teng2021touch}.
%
%Covering the fingertip is however not suitable for rendering haptic feedback when touching real objects.
%
@@ -123,13 +123,13 @@ The well-known psychophysical model of \textcite{ernst2002humans} established t
%
This effect has been used to alter the texture perception in AR and VR.
%
-For example, virtual visual opaque textures superimposed on real surfaces in AR can be perceived as coherent together even though they have very different roughnesses~\autocite{kitahara2010sensory}.
+For example, virtual visual opaque textures superimposed on real surfaces in AR can be perceived as coherent together even though they have very different roughnesses~\cite{kitahara2010sensory}.
%
\textcite{fradin2023humans} explored this effect further, finding that a superimposed AR visual texture slightly different from a colocalized haptic texture affected the ability to recognize the haptic texture.
%
Similarly, \textcite{punpongsanon2015softar} altered the softness perception of a tangible surface using AR-projected visual textures whereas \textcite{chan2021hasti} evaluated audio-haptic texture perception in VR.
%
-Conversely, colocalized 3D-printed real hair structures were able to correctly render several virtual visual textures seen in VR in terms of haptic hardness and roughness~\autocite{degraen2019enhancing}.
+Conversely, colocalized 3D-printed real hair structures were able to correctly render several virtual visual textures seen in VR in terms of haptic hardness and roughness~\cite{degraen2019enhancing}.
%
This study investigated how virtual haptic roughness textures can be used to enhance touched real surfaces augmented with visual AR textures.
%In this article, the haptic textures are felt co-localized with visual textures

@@ -11,37 +11,37 @@ Yet visual and haptic sensations are often combined in everyday life, and it is
\subsection{Augmenting Haptic Texture Roughness}
\label{vibrotactile_roughness}

When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\cite{klatzky2003feeling}.
%
%Several approaches have been proposed to render virtual haptic texture~\autocite{culbertson2018haptics}.
%Several approaches have been proposed to render virtual haptic texture~\cite{culbertson2018haptics}.
%
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\autocite{unger2011roughness}.
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\cite{unger2011roughness}.
%
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}.
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}.
%
%In this way, physics-based models~\autocite{chan2021hasti,okamura1998vibration} and data-based models~\autocite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
%Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\autocite{culbertson2015should}.
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
%
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\autocite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\autocite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
%
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\autocite{bhatia2024augmenting,jeon2009haptic}.
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.
%
An additional challenge of augmenting direct finger touch is to keep the fingertip free to touch the real environment, which requires relocating the actuator elsewhere on the hand~\autocite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
An additional challenge of augmenting direct finger touch is to keep the fingertip free to touch the real environment, which requires relocating the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\autocite{manfredi2014natural}.
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\cite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\autocite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%
It remains unclear whether such vibrotactile texture augmentation is perceived the same when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%
%We also add a phase adjustment to this sinusoidal signal to allow free exploration movements of the finger with a simple camera-based tracking system.

%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\autocite{brahimaj2023crossmodal,rekik2017localized}.
%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\cite{brahimaj2023crossmodal,rekik2017localized}.
%
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\autocite{ito2019tactile}.
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\cite{ito2019tactile}.
%
%However, this method is limited to the screen and does not allow to easily render textures on virtual (visual) objects or to alter the perception of real surfaces.

@@ -55,30 +55,30 @@ When the same object property is sensed simultaneously by vision and touch, the
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
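In this maximum-likelihood formulation, the visual and haptic estimates $\hat{s}_v$ and $\hat{s}_h$ of the same property are combined into a single estimate weighted by their inverse variances, so that the less variable sense contributes more:
%
\begin{equation*}
\hat{s} = w_v \hat{s}_v + w_h \hat{s}_h,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\quad i \in \{v, h\}.
\end{equation*}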
%
%In particular, this effect has been used to better understand the visuo-haptic perception of texture and to design better feedback for virtual objects.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
%
%Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structure that were touched with the finger to induce a larger set of visuo-haptic materials perception.
%
%\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\autocite{degraen2019enhancing} and passive touch~\autocite{gunther2022smooth} contexts.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\cite{degraen2019enhancing} and passive touch~\cite{gunther2022smooth} contexts.
%
\textcite{normand2024augmenting} also investigated the roughness perception of tangible surfaces touched with the finger and augmented with visual textures in AR and with wearable vibrotactile textures.
%
%A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\autocite{prachyabrued2014visual,blaga2020too} and AR~\autocite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\autocite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR~\cite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.

% \autocite{degraen2019enhancing} and \autocite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \autocite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.
% \cite{degraen2019enhancing} and \cite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \cite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.

A few works have also used pseudo-haptic feedback to change the perception of haptic stimuli to create richer feedback by deforming the visual representation of a user input~\autocite{ujitoko2021survey}.
A few works have also used pseudo-haptic feedback to change the perception of haptic stimuli to create richer feedback by deforming the visual representation of a user input~\cite{ujitoko2021survey}.
%
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\autocite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\autocite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\autocite{choi2021augmenting}.
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
%
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\autocite{ujitoko2019modulating}.
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\cite{ujitoko2019modulating}.
%
%However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
@@ -95,8 +95,8 @@ Rendering a virtual piston pressed with one's real hand using a video see-throug
%
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
Using a VST-AR headset has notable consequences, as the \enquote{real} view of the environment and the hand is actually a visual stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR~\autocite{macedo2023occlusion}.
Using a VST-AR headset has notable consequences, as the \enquote{real} view of the environment and the hand is actually a visual stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR~\cite{macedo2023occlusion}.
%
While a large literature has investigated these differences in visual perception in both AR and VR, \eg distances being underestimated~\autocite{adams2022depth,peillard2019studying}, less is known about visuo-haptic perception in AR and VR.
While a large literature has investigated these differences in visual perception in both AR and VR, \eg distances being underestimated~\cite{adams2022depth,peillard2019studying}, less is known about visuo-haptic perception in AR and VR.
%
In this work, we studied (1) the perception of a \emph{haptic texture augmentation} of a tangible surface and (2) the possible influence of the visual rendering of the environment (OST-AR or VR) and the hand touching the surface (real or virtual) on this perception.

@@ -1,31 +1,31 @@
\section{Introduction}
\label{introduction}

When we look at the surface of an everyday object, we often then touch it to confirm or contrast our initial visual impression and to estimate the properties of the object~\autocite{ernst2002humans}.
When we look at the surface of an everyday object, we often then touch it to confirm or contrast our initial visual impression and to estimate the properties of the object~\cite{ernst2002humans}.
%
One of the main characteristics of a textured surface is its roughness, \ie the micro-geometry of the material~\autocite{klatzky2003feeling}, which is perceived equally well and similarly by both sight and touch~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
One of the main characteristics of a textured surface is its roughness, \ie the micro-geometry of the material~\cite{klatzky2003feeling}, which is perceived equally well and similarly by both sight and touch~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Many haptic devices and rendering methods have been used to generate realistic virtual rough textures~\autocite{culbertson2018haptics}.
Many haptic devices and rendering methods have been used to generate realistic virtual rough textures~\cite{culbertson2018haptics}.
%
One of the most common approaches is to reproduce the vibrations that occur when running across a surface, using a vibrotactile device attached to a hand-held tool~\autocite{culbertson2014modeling,culbertson2015should} or worn on the finger~\autocite{asano2015vibrotactile,friesen2024perceived}.
One of the most common approaches is to reproduce the vibrations that occur when running across a surface, using a vibrotactile device attached to a hand-held tool~\cite{culbertson2014modeling,culbertson2015should} or worn on the finger~\cite{asano2015vibrotactile,friesen2024perceived}.
%
By providing timely vibrations synchronized with the movement of the tool or the finger moving on a real object, the perceived roughness of the surface can be augmented~\autocite{culbertson2015should,asano2015vibrotactile}.
By providing timely vibrations synchronized with the movement of the tool or the finger moving on a real object, the perceived roughness of the surface can be augmented~\cite{culbertson2015should,asano2015vibrotactile}.
%
In that sense, data-driven haptic textures have been developed as captures and models of real surfaces, resulting in the Penn Haptic Texture Toolkit (HaTT) database~\autocite{culbertson2014one}.
In that sense, data-driven haptic textures have been developed as captures and models of real surfaces, resulting in the Penn Haptic Texture Toolkit (HaTT) database~\cite{culbertson2014one}.
%
While these virtual haptic textures are perceived as similar to real textures~\autocite{culbertson2015should}, they have been evaluated using hand-held tools and not yet with direct finger contact on the surface, in particular combined with visual textures in an immersive virtual environment.
While these virtual haptic textures are perceived as similar to real textures~\cite{culbertson2015should}, they have been evaluated using hand-held tools and not yet with direct finger contact on the surface, in particular combined with visual textures in an immersive virtual environment.

Combined with virtual reality (VR), where the user is immersed in a visual virtual environment, wearable haptic devices have also proven to be effective in modifying the visuo-haptic perception of tangible objects touched with the finger, without needing to modify the object~\autocite{asano2012vibrotactile,asano2015vibrotactile,salazar2020altering}.
Combined with virtual reality (VR), where the user is immersed in a visual virtual environment, wearable haptic devices have also proven to be effective in modifying the visuo-haptic perception of tangible objects touched with the finger, without needing to modify the object~\cite{asano2012vibrotactile,asano2015vibrotactile,salazar2020altering}.
%
Worn on the finger, but not directly on the fingertip to keep it free to interact with tangible objects, they have been used to alter perceived stiffness, softness, friction and local deformations~\autocite{detinguy2018enhancing,salazar2020altering}.
Worn on the finger, but not directly on the fingertip to keep it free to interact with tangible objects, they have been used to alter perceived stiffness, softness, friction and local deformations~\cite{detinguy2018enhancing,salazar2020altering}.
%
However, the use of wearable haptic devices has been little explored in Augmented Reality (AR), where visual virtual content is integrated into the real-world environment, especially for augmenting texture sensations~\autocite{punpongsanon2015softar,maisto2017evaluation,meli2018combining,chan2021hasti,teng2021touch,fradin2023humans,normand2024visuohaptic}.
However, the use of wearable haptic devices has been little explored in Augmented Reality (AR), where visual virtual content is integrated into the real-world environment, especially for augmenting texture sensations~\cite{punpongsanon2015softar,maisto2017evaluation,meli2018combining,chan2021hasti,teng2021touch,fradin2023humans,normand2024visuohaptic}.
%
A key difference in AR compared to VR is that the user can still see the real-world surroundings, including their hands, the augmented tangible objects and the worn haptic devices.
%
An additional issue of current AR systems is their visual display limitations, which can make virtual content appear inconsistent with the real world~\autocite{kim2018revisiting,macedo2023occlusion}.
An additional issue of current AR systems is their visual display limitations, which can make virtual content appear inconsistent with the real world~\cite{kim2018revisiting,macedo2023occlusion}.
%
These two factors have been shown to influence the perception of haptic stiffness rendering~\autocite{knorlein2009influence,gaffary2017ar}.
These two factors have been shown to influence the perception of haptic stiffness rendering~\cite{knorlein2009influence,gaffary2017ar}.
%
It remains to be investigated whether simultaneous and co-localized visual and haptic texture augmentation of tangible surfaces in AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
%
@@ -33,4 +33,4 @@ Being able to coherently substitute the visuo-haptic texture of an everyday surf

In this paper, we investigate how users perceive a tangible surface touched with the index finger when it is augmented with a visuo-haptic roughness texture using immersive optical see-through AR (OST-AR) and wearable vibrotactile stimuli provided on the index.
%
In a user study, twenty participants freely explored and evaluated the coherence, realism and roughness of the combination of nine representative pairs of visuo-haptic texture augmentations (see \figref{setup}, left) from the HaTT database~\autocite{culbertson2014one}.
In a user study, twenty participants freely explored and evaluated the coherence, realism and roughness of the combination of nine representative pairs of visuo-haptic texture augmentations (see \figref{setup}, left) from the HaTT database~\cite{culbertson2014one}.

@@ -4,7 +4,7 @@
\begin{subfigs}{setup}{%
User Study.
}[%
\item The nine visuo-haptic textures used in the user study, selected from the HaTT database~\autocite{culbertson2014one}. %
\item The nine visuo-haptic textures used in the user study, selected from the HaTT database~\cite{culbertson2014one}. %
The texture names were never shown, so as to prevent the use of the user's visual or haptic memory of the textures.
\item Experimental setup. %
Participants sat in front of the tangible surfaces, which were augmented with visual textures displayed by the HoloLens~2 AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx. %
@@ -20,7 +20,7 @@

The user study aimed at analyzing the user perception of tangible surfaces when augmented through a visuo-haptic texture using AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%
Nine representative visuo-haptic texture pairs from the HaTT database~\autocite{culbertson2014one} were investigated in two tasks:
Nine representative visuo-haptic texture pairs from the HaTT database~\cite{culbertson2014one} were investigated in two tasks:
%
(1) a matching task, where participants had to find the haptic texture that best matched a given visual texture; and (2) a ranking task, where participants had to rank only the haptic textures, only the visual textures, and the visuo-haptic texture pairs according to their perceived roughness.
%
@@ -30,7 +30,7 @@ Our objective is to assess which haptic textures were associated with which visu
\subsection{The textures}
\label{textures}

The 100 visuo-haptic texture pairs of the HaTT database~\autocite{culbertson2014one} were preliminarily tested and compared using AR and vibrotactile haptic feedback on the finger on a tangible surface.
The 100 visuo-haptic texture pairs of the HaTT database~\cite{culbertson2014one} were preliminarily tested and compared using AR and vibrotactile haptic feedback on the finger on a tangible surface.
%
These texture models were chosen as they are visuo-haptic representations of a wide range of real textures that are publicly available online.
%
@@ -54,17 +54,17 @@ Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) film
%
The visual textures were displayed on the tangible surfaces using the HoloLens~2 OST-AR headset (see \figref{setup}, middle and right) within a \qtyproduct{43 x 29}{\degree} field of view at \qty{60}{\Hz}; a set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the visual textures, which were then used throughout the user study.
%
When a haptic texture was touched, a \qty{48}{\kilo\hertz} audio signal was generated using the corresponding HaTT haptic texture model and the measured tangential speed of the finger, using the rendering procedure described by Culbertson \etal~\autocite{culbertson2014modeling}.
When a haptic texture was touched, a \qty{48}{\kilo\hertz} audio signal was generated using the corresponding HaTT haptic texture model and the measured tangential speed of the finger, using the rendering procedure described by Culbertson \etal~\cite{culbertson2014modeling}.
%
The normal force on the texture was assumed to be constant at \qty{1.2}{\N} to generate the audio signal from the model, following Culbertson \etal~\autocite{culbertson2015should}, who found that the HaTT textures can be rendered using only the speed as input without decreasing their perceived realism.
The normal force on the texture was assumed to be constant at \qty{1.2}{\N} to generate the audio signal from the model, following Culbertson \etal~\cite{culbertson2015should}, who found that the HaTT textures can be rendered using only the speed as input without decreasing their perceived realism.
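As an illustration, the principle of this data-driven synthesis can be sketched in a few lines: the texture vibration is modelled as an autoregressive (AR) process whose white-noise excitation is scaled according to the current finger speed. The coefficients and values below are hypothetical placeholders, not taken from HaTT, whose renderer interpolates between AR models captured at different speed and force combinations.
\begin{verbatim}
# Sketch of AR-model-based texture vibration synthesis (hypothetical values).
import numpy as np

ar_coeffs = np.array([1.2, -0.5, 0.1])  # a_1..a_p of a texture model
history = np.zeros(len(ar_coeffs))      # p most recent samples, newest first

def next_sample(noise_std):
    """One 48 kHz vibration sample from the AR model."""
    global history
    s = np.dot(ar_coeffs, history) + np.random.randn() * noise_std
    history = np.roll(history, 1)
    history[0] = s
    return s

# At each sample, noise_std would be looked up from the captured model as a
# function of the measured finger speed (and, in HaTT, the contact force).
samples = [next_sample(noise_std=0.02) for _ in range(480)]  # 10 ms of signal
\end{verbatim}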
%
An amplifier (XY-502, not branded) converted this audio signal to a current transmitted to the vibrotactile voice-coil actuator (HapCoil-One, Actronika), which was encased in a 3D-printed plastic shell firmly attached to the middle index phalanx of the participant's dominant hand, similarly to previous studies~\autocite{asano2015vibrotactile,friesen2024perceived}.
An amplifier (XY-502, not branded) converted this audio signal to a current transmitted to the vibrotactile voice-coil actuator (HapCoil-One, Actronika), which was encased in a 3D-printed plastic shell firmly attached to the middle index phalanx of the participant's dominant hand, similarly to previous studies~\cite{asano2015vibrotactile,friesen2024perceived}.
%
This voice-coil actuator was chosen for its wide frequency range (\qtyrange{10}{1000}{\Hz}) and its relatively low acceleration distortion, as specified by the manufacturer\footnoteurl{https://www.actronika.com/haptic-solutions}.
%
Overall latency was measured at \qty{46 \pm 6}{\ms}, as a result of latency in image acquisition \qty{16 \pm 1}{\ms}, fiducial marker detection \qty{8 \pm 3}{\ms}, network synchronization \qty{4 \pm 1}{\ms}, audio sampling \qty{3 \pm 1}{\ms}, and the vibrotactile actuator latency (\qty{15}{\ms}, as specified by the manufacturer\footnotemark[5]).
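The reported total is consistent with the sum of the mean component latencies:
%
\begin{equation*}
16 + 8 + 4 + 3 + 15 = \qty{46}{\ms}.
\end{equation*}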
%
This latency was below the \qty{60}{\ms} threshold for vibrotactile feedback \autocite{okamoto2009detectability} and was not noticed by the participants.
This latency was below the \qty{60}{\ms} threshold for vibrotactile feedback \cite{okamoto2009detectability} and was not noticed by the participants.
%
The user study was held in a quiet room with no windows, with one light source of \qty{800}{\lumen} placed \qty{70}{\cm} above the table.

@@ -92,7 +92,7 @@ The placement of the haptic textures was randomized before each trial.
%
Participants were instructed to look closely at the details of the visual textures and explore the haptic textures with a constant pressure and various speeds to find the haptic texture that best matched the visual texture, \ie choose the surface with the most coherent visual-haptic texture pair.
%
The texture names were never given or shown, to prevent the use of visual or haptic memory of the textures; nor was a definition of roughness given, so as to let participants complete the task as naturally as possible, similarly to Bergmann Tiest \etal~\autocite{bergmanntiest2007haptic}.
The texture names were never given or shown, to prevent the use of visual or haptic memory of the textures; nor was a definition of roughness given, so as to let participants complete the task as naturally as possible, similarly to Bergmann Tiest \etal~\cite{bergmanntiest2007haptic}.

Then, participants performed the \emph{ranking task}, employing the same setup as the matching task and the same 9 textures.
%

@@ -117,7 +117,7 @@ The first dimension was similar to the rankings (see \figref{results_matching_ra
%
It seems that the second dimension contrasted textures that were perceived as hard with those perceived as softer, as also reported by participants.
%
Stiffness is indeed an important perceptual dimension of a material~\autocite{okamoto2013psychophysical,culbertson2014modeling}.
Stiffness is indeed an important perceptual dimension of a material~\cite{okamoto2013psychophysical,culbertson2014modeling}.

\figref{results_similarity} (right) shows the dendrograms of the two hierarchical clusterings of the haptic and visual textures, constructed using the Euclidean distance and Ward's method on squared distances.
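For reference, such a clustering can be computed in a few lines of Python; the feature matrix below is a random placeholder standing in for the per-texture coordinates (\eg those obtained from the correspondence analysis):
\begin{verbatim}
# Sketch: agglomerative clustering with Ward's method, which operates on
# (squared) Euclidean distances, and extraction of the dendrogram structure.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage

rng = np.random.default_rng(0)
texture_features = rng.random((9, 2))  # placeholder for the nine textures

Z = linkage(texture_features, method="ward")
tree = dendrogram(Z, no_plot=True)     # plotting would require matplotlib
print(tree["ivl"])                     # leaf order of the dendrogram
\end{verbatim}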
%

@@ -3,13 +3,13 @@

In this study, we investigated the perception of visuo-haptic texture augmentation of tangible surfaces touched directly with the index fingertip, using visual texture overlays in AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
%
The nine evaluated pairs of visuo-haptic textures, taken from the HaTT database~\autocite{culbertson2014one}, are models of real texture captures.
The nine evaluated pairs of visuo-haptic textures, taken from the HaTT database~\cite{culbertson2014one}, are models of real texture captures.
%
Their perception was evaluated in a two-task user study in which participants chose the most coherent combinations of visual and haptic textures (matching task), and ranked all textures according to their perceived roughness (ranking task).
%
The visual textures were displayed statically on the tangible surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the tangible surface.
%
In addition, the interaction with the textures was designed to be as natural as possible, without imposing a specific speed of finger movement, as in similar studies~\autocite{asano2015vibrotactile,friesen2024perceived}.
In addition, the interaction with the textures was designed to be as natural as possible, without imposing a specific speed of finger movement, as in similar studies~\cite{asano2015vibrotactile,friesen2024perceived}.

In the matching task, participants were not able to effectively match the original visual and haptic texture pairs (see \figref{results_matching_ranking}, left), except for the Coffee Filter texture, which was the smoothest both visually and haptically.
%
@@ -25,9 +25,9 @@ Several strategies were used, as some participants reported using vibration freq
%
It should be noted that the task was rather difficult (see \figref{results_questions}), as participants had no prior knowledge of the textures, there were no additional visual cues such as the shape of an object, and the term \enquote{roughness} had not been used by the experimenter prior to the ranking task.

The correspondence analysis (see \figref{results_similarity}, left) highlighted that participants did indeed match visual and haptic textures primarily on the basis of their perceived roughness (60\% of variance), which is in line with previous perception studies on real~\autocite{baumgartner2013visual} and virtual~\autocite{culbertson2014modeling} textures.
The correspondence analysis (see \figref{results_similarity}, left) highlighted that participants did indeed match visual and haptic textures primarily on the basis of their perceived roughness (60\% of variance), which is in line with previous perception studies on real~\cite{baumgartner2013visual} and virtual~\cite{culbertson2014modeling} textures.
%
The rankings (see \figref{results_matching_ranking}, right) confirmed that the participants all perceived the roughness of haptic textures very similarly, but that there was less consensus for visual textures, which is also in line with roughness rankings for real haptic and visual textures~\autocite{bergmanntiest2007haptic}.
The rankings (see \figref{results_matching_ranking}, right) confirmed that the participants all perceived the roughness of haptic textures very similarly, but that there was less consensus for visual textures, which is also in line with roughness rankings for real haptic and visual textures~\cite{bergmanntiest2007haptic}.
%
These results made it possible to identify and name groups of textures in the form of clusters, and to construct confusion matrices between these clusters and between visual texture ranks with haptic clusters, showing that participants consistently identified and matched haptic and visual textures (see \figref{results_clusters}).
%
@@ -35,17 +35,17 @@ Interestingly, 30\% of the matching variance was captured with a second dimensio
%
One hypothesis is that this dimension could be the perceived stiffness of the textures, with Metal Mesh and smooth textures appearing stiffer than the other textures, whose granularity could have been perceived as bumps on the surface that could deform under finger pressure.
%
Stiffness is, together with roughness, one of the main characteristics perceived through vision and touch of real materials~\autocite{baumgartner2013visual,vardar2019fingertip}, but also of virtual haptic textures~\autocite{culbertson2014modeling,degraen2019enhancing}.
Stiffness is, together with roughness, one of the main characteristics perceived through vision and touch of real materials~\cite{baumgartner2013visual,vardar2019fingertip}, but also of virtual haptic textures~\cite{culbertson2014modeling,degraen2019enhancing}.
%
The last visuo-haptic roughness ranking (see \figref{results_matching_ranking}, right) showed that both haptic and visual sensory information were well integrated, as the resulting roughness ranking fell between the two individual haptic and visual rankings.
%
Several strategies were reported: some participants first classified visually and then corrected with haptics, others classified haptically and then integrated visuals.
%
While visual sensation did influence perception, as observed in previous haptic AR studies~\autocite{punpongsanon2015softar,gaffary2017ar,fradin2023humans}, haptic sensation dominated here.
While visual sensation did influence perception, as observed in previous haptic AR studies~\cite{punpongsanon2015softar,gaffary2017ar,fradin2023humans}, haptic sensation dominated here.
%
This indicates that participants were more confident and relied more on the haptic roughness perception than on the visual roughness perception when integrating both in one coherent perception.
%
Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges or patterns, something not reported when these textures were displayed in non-immersive virtual environments with a screen~\autocite{culbertson2014modeling,culbertson2015should}.
Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges or patterns, something not reported when these textures were displayed in non-immersive virtual environments with a screen~\cite{culbertson2014modeling,culbertson2015should}.
%
A few participants even reported that they clearly sensed patterns on haptic textures.
%
@@ -55,19 +55,19 @@ Overall, the haptic device was judged to be comfortable, and the visual and hapt

These results naturally have some limitations, as they addressed a small set of visuo-haptic textures augmenting the perception of smooth white tangible surfaces.
%
Indeed, the benefit of visuo-haptic texture augmentation may be limited on surfaces that already have strong visual or haptic patterns~\autocite{asano2012vibrotactile}, or on objects with complex shapes.
Indeed, the benefit of visuo-haptic texture augmentation may be limited on surfaces that already have strong visual or haptic patterns~\cite{asano2012vibrotactile}, or on objects with complex shapes.
%
In addition, the haptic textures used were modelled from the vibrations of a probe sliding over the captured surfaces.
%
The perception of surface roughness with the finger is actually more complex: it involves both the perception of vibrations and the spatial deformation of the skin~\autocite{klatzky2003feeling}; the sensations generated when exploring a surface depend on factors other than the speed of the finger alone, such as the contact force, angle, posture or contact area~\autocite{schafer2017transfer}; and the integration of this sensory information into one unified perception is not yet fully understood~\autocite{richardson2022learning}.
The perception of surface roughness with the finger is actually more complex: it involves both the perception of vibrations and the spatial deformation of the skin~\cite{klatzky2003feeling}; the sensations generated when exploring a surface depend on factors other than the speed of the finger alone, such as the contact force, angle, posture or contact area~\cite{schafer2017transfer}; and the integration of this sensory information into one unified perception is not yet fully understood~\cite{richardson2022learning}.
%
Another limitation that may have affected the perception of haptic textures is the lack of compensation for the frequency response of the actuator and amplifier~\autocite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
Another limitation that may have affected the perception of haptic textures is the lack of compensation for the frequency response of the actuator and amplifier~\cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
%
Finally, the visual textures used were also simple color captures not meant to be used in an immersive virtual environment.
%
However, our objective was not to accurately reproduce real textures, but to alter the perception of simultaneous visual and haptic roughness augmentation of a real surface directly touched by the finger in AR.
%
Beyond these limitations, both visual and haptic texture models should be improved by integrating the rendering of spatially localized breaks, edges or patterns, like real textures~\autocite{richardson2022learning}, and by being adaptable to individual sensitivities, as personalized haptics is a promising approach~\autocite{malvezzi2021design,young2020compensating}.
Beyond these limitations, both visual and haptic texture models should be improved by integrating the rendering of spatially localized breaks, edges or patterns, like real textures~\cite{richardson2022learning}, and by being adaptable to individual sensitivities, as personalized haptics is a promising approach~\cite{malvezzi2021design,young2020compensating}.
%
More generally, a wide range of haptic feedback should be integrated to form rich and complete haptic augmentations in AR~\autocite{maisto2017evaluation,detinguy2018enhancing,salazar2020altering,normand2024visuohaptic,pacchierotti2024haptics}.
More generally, a wide range of haptic feedback should be integrated to form rich and complete haptic augmentations in AR~\cite{maisto2017evaluation,detinguy2018enhancing,salazar2020altering,normand2024visuohaptic,pacchierotti2024haptics}.

@@ -19,27 +19,27 @@
%
%And what if you could also feel its shape or texture?
%
%Such tactile augmentation is made possible by wearable haptic devices, which are worn directly on the finger or hand and can provide a variety of sensations on the skin, while being small, light and discreet~\autocite{pacchierotti2017wearable}.
%Such tactile augmentation is made possible by wearable haptic devices, which are worn directly on the finger or hand and can provide a variety of sensations on the skin, while being small, light and discreet~\cite{pacchierotti2017wearable}.
%
Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in VR~\autocite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or AR~\autocite{maisto2017evaluation,meli2018combining,teng2021touch}.
Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in VR~\cite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or AR~\cite{maisto2017evaluation,meli2018combining,teng2021touch}.
%
They have also been used to alter the perception of roughness, stiffness, friction, and local shape perception of real tangible objects~\autocite{asano2015vibrotactile,detinguy2018enhancing,normand2024augmenting,salazar2020altering}.
They have also been used to alter the perception of roughness, stiffness, friction, and local shape perception of real tangible objects~\cite{asano2015vibrotactile,detinguy2018enhancing,normand2024augmenting,salazar2020altering}.
%
Such techniques place the actuator \emph{close} to the point of contact with the real environment, leaving the user free to directly touch the tangible object.
%
This combined use of wearable haptics with tangible objects enables a haptic \emph{augmented} reality (HAR)~\autocite{bhatia2024augmenting} that can provide a rich and varied haptic feedback.
This combined use of wearable haptics with tangible objects enables a haptic \emph{augmented} reality (HAR)~\cite{bhatia2024augmenting} that can provide a rich and varied haptic feedback.

The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic AR has been little explored with VR and (visual) AR~\autocite{choi2021augmenting,normand2024augmenting}.
The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic AR has been little explored with VR and (visual) AR~\cite{choi2021augmenting,normand2024augmenting}.
%
Although AR and VR are closely related, they have significant differences that can affect the user experience~\autocite{genay2021virtual,macedo2023occlusion}.
Although AR and VR are closely related, they have significant differences that can affect the user experience~\cite{genay2021virtual,macedo2023occlusion}.
%
%By integrating visual virtual content into the real environment, AR keeps the hand of the user, the haptic devices worn and the tangibles touched visible, unlike VR where they are hidden by immersing the user into a visual virtual environment.
%
%Current AR systems also suffer from display and rendering limitations not present in VR, affecting the user experience with virtual content that may be less realistic or inconsistent with the real augmented environment~\autocite{kim2018revisiting,macedo2023occlusion}.
%Current AR systems also suffer from display and rendering limitations not present in VR, affecting the user experience with virtual content that may be less realistic or inconsistent with the real augmented environment~\cite{kim2018revisiting,macedo2023occlusion}.
%
It therefore seems necessary to investigate and understand the potential effect of these differences in visual rendering on the perception of haptically augmented tangible objects.
%
Previous works have shown, for example, that the stiffness of a virtual piston rendered with a force feedback haptic system seen in AR is perceived as less rigid than in VR~\autocite{gaffary2017ar} or when the visual rendering is ahead of the haptic rendering~\autocite{diluca2011effects,knorlein2009influence}.
Previous works have shown, for example, that the stiffness of a virtual piston rendered with a force feedback haptic system seen in AR is perceived as less rigid than in VR~\cite{gaffary2017ar} or when the visual rendering is ahead of the haptic rendering~\cite{diluca2011effects,knorlein2009influence}.
%
%Taking our example from the beginning of this introduction, you now want to learn more about the context of the discovery of the ancient object or its use at the time of its creation by immersing yourself in a virtual environment in VR.
%
@@ -47,7 +47,7 @@ Previous works have shown, for example, that the stiffness of a virtual piston r

The goal of this paper is to study the role of the visual rendering of the hand (real or virtual) and its environment (AR or VR) on the perception of a tangible surface whose texture is augmented with a wearable vibrotactile device worn on the finger.
%
We focus on the perception of roughness, one of the main tactile sensations of materials~\autocite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations~\autocite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,normand2024augmenting,strohmeier2017generating,ujitoko2019modulating}.
We focus on the perception of roughness, one of the main tactile sensations of materials~\cite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations~\cite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,normand2024augmenting,strohmeier2017generating,ujitoko2019modulating}.
%
By understanding how these visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with AR can be better applied and new visuo-haptic renderings adapted to AR can be designed.

@@ -22,7 +22,7 @@
All computation steps except signal sampling are performed at \qty{60}{\Hz} and in separate threads to parallelize them.
}

%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations~\autocite{culbertson2018haptics}.
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations~\cite{culbertson2018haptics}.
%
In this section, we describe a system for rendering vibrotactile roughness texture in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
@@ -60,11 +60,11 @@ A fiducial marker (AprilTag) is glued to the top of the actuator (see \figref{me
%
Other markers are placed on the tangible surfaces to be augmented, to estimate the relative position of the finger with respect to the surfaces (see \figref{setup}).
%
Contrary to similar work which either constrained the hand to a constant speed to keep the signal frequency constant~\autocite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand~\autocite{friesen2024perceived,strohmeier2017generating}, using vision-based tracking allows both to free the hand movements and to augment any tangible surface.
Contrary to similar work which either constrained the hand to a constant speed to keep the signal frequency constant~\cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand~\cite{friesen2024perceived,strohmeier2017generating}, using vision-based tracking allows both to free the hand movements and to augment any tangible surface.
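To illustrate, once the marker detector provides the poses of the finger and surface markers in the camera frame as $4 \times 4$ homogeneous matrices, the relative pose and the finger velocity reduce to a few linear-algebra operations; the sketch below assumes such poses are available and omits the detection step itself:
\begin{verbatim}
# Sketch: finger pose and velocity relative to the augmented surface.
import numpy as np

def relative_pose(T_c_surface, T_c_finger):
    """Pose of the finger marker expressed in the surface (texture) frame."""
    return np.linalg.inv(T_c_surface) @ T_c_finger

def finger_velocity(p_prev, p_curr, dt):
    """Finite-difference velocity from two successive positions (m/s)."""
    return (p_curr - p_prev) / dt

T_c_surface = np.eye(4)                 # placeholder poses in camera frame
T_c_finger = np.eye(4)
T_c_finger[:3, 3] = [0.10, 0.02, 0.30]  # illustrative finger marker position

T_s_finger = relative_pose(T_c_surface, T_c_finger)
p = T_s_finger[:3, 3]                   # finger position in the texture frame
v = finger_velocity(p_prev=p - 0.001, p_curr=p, dt=1 / 60)  # 60 Hz updates
\end{verbatim}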
%
A camera external to the AR/VR headset with a marker-based technique is employed to provide accurate and robust tracking with a constant view of the markers~\autocite{marchand2016pose}.
A camera external to the AR/VR headset with a marker-based technique is employed to provide accurate and robust tracking with a constant view of the markers~\cite{marchand2016pose}.
%
To reduce the noise in the pose estimation while maintaining good responsiveness, the 1€ filter~\autocite{casiez2012filter} is applied.
To reduce the noise in the pose estimation while maintaining good responsiveness, the 1€ filter~\cite{casiez2012filter} is applied.
%
It is a low-pass filter with an adaptive cutoff frequency, specifically designed for tracking human motion.
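A compact implementation follows directly from its published definition~\cite{casiez2012filter}: the smoothing factor is recomputed at every sample from a cutoff frequency that grows with the estimated speed of the signal, trading jitter reduction at rest for low lag during fast motion (parameter values below are illustrative):
\begin{verbatim}
# Sketch of the 1-euro filter (illustrative min_cutoff and beta values).
import math

class OneEuroFilter:
    def __init__(self, rate, min_cutoff=1.0, beta=0.01, d_cutoff=1.0):
        self.te = 1.0 / rate          # sampling period (s)
        self.min_cutoff = min_cutoff  # cutoff at rest (Hz)
        self.beta = beta              # cutoff growth with signal speed
        self.d_cutoff = d_cutoff      # cutoff for the derivative (Hz)
        self.x_prev, self.dx_prev = None, 0.0

    def _alpha(self, cutoff):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / self.te)

    def __call__(self, x):
        if self.x_prev is None:
            self.x_prev = x
            return x
        a_d = self._alpha(self.d_cutoff)  # smooth the signal derivative
        dx_hat = a_d * (x - self.x_prev) / self.te + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)  # adaptive cutoff
        a = self._alpha(cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat

f = OneEuroFilter(rate=60.0)  # pose estimated at 60 Hz
smoothed = [f(x) for x in (0.00, 0.10, 0.12, 0.50)]
\end{verbatim}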
%
@@ -87,7 +87,7 @@ In our implementation, the virtual hand and environment are designed with Unity
%
The visual rendering is achieved using the Microsoft HoloLens~2, an OST-AR headset with a \qtyproduct{43 x 29}{\degree} field of view (FoV), a \qty{60}{\Hz} refresh rate, and self-localisation capabilities.
%
It was chosen over VST-AR because OST-AR only adds virtual content to the real environment, while VST-AR streams a real-time video capture of the real environment~\autocite{macedo2023occlusion}.
It was chosen over VST-AR because OST-AR only adds virtual content to the real environment, while VST-AR streams a real-time video capture of the real environment~\cite{macedo2023occlusion}.
%
Indeed, one of our objectives (see \secref{experiment}) is to directly compare a virtual environment with the real one it replicates. %, rather than a video feed that introduces many supplementary visual limitations.
%
@@ -101,11 +101,11 @@ A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotacti
%
The voice-coil actuator is encased in a 3D printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (see \figref{method/device}).
%
The actuator is driven by a Class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil~\autocite{mcmahan2014dynamic}.
The actuator is driven by a Class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil~\cite{mcmahan2014dynamic}.
%
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library.

The represented haptic texture is a series of parallel virtual grooves and ridges, similar to real grating textures manufactured for psychophysical roughness perception studies~\autocite{friesen2024perceived,klatzky2003feeling,unger2011roughness}.
The represented haptic texture is a series of parallel virtual grooves and ridges, similar to real grating textures manufactured for psychophysical roughness perception studies~\cite{friesen2024perceived,klatzky2003feeling,unger2011roughness}.
%
It is generated as a square wave audio signal, sampled at \qty{48}{\kilo\hertz}, with a period $\lambda$ (usually in the millimetre range) and an amplitude $A$.
%
@@ -119,21 +119,21 @@ A sample $s_k$ of the audio signal at sampling time $t_k$ is given by:
\end{align}
\end{subequations}
%
This is a common rendering method for vibrotactile textures, with well-defined parameters, that has been employed to modify perceived haptic roughness of a tangible surface~\autocite{asano2015vibrotactile,konyo2005tactile,ujitoko2019modulating}.
This is a common rendering method for vibrotactile textures, with well-defined parameters, that has been employed to modify perceived haptic roughness of a tangible surface~\cite{asano2015vibrotactile,konyo2005tactile,ujitoko2019modulating}.
%
As the finger position is estimated at a far lower rate (\qty{60}{\hertz}) than the audio signal, the finger position $x_f$ cannot be directly used to render the signal if the finger moves fast or if the texture period is small.
%
The best strategy instead is to modulate the frequency of the signal $s$ as a ratio of the finger velocity $\dot{x}_f$ and the texture period $\lambda$~\autocite{friesen2024perceived}.
The best strategy instead is to modulate the frequency of the signal $s$ as a ratio of the finger velocity $\dot{x}_f$ and the texture period $\lambda$~\cite{friesen2024perceived}.
%
This is important because it preserves the sensation of a constant spatial frequency of the virtual texture while the finger moves at various speeds, which is crucial for the perception of roughness~\autocite{klatzky2003feeling,unger2011roughness}.
This is important because it preserves the sensation of a constant spatial frequency of the virtual texture while the finger moves at various speeds, which is crucial for the perception of roughness~\cite{klatzky2003feeling,unger2011roughness}.
%
Note that the finger position and velocity are transformed from the camera frame $\mathcal{F}_c$ to the texture frame $\mathcal{F}_t$, with the $x$ axis aligned with the texture direction.
%
However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted along with the frequency to ensure continuity of the signal as described in \eqref{signal}.
%
This approach avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (see \figref{method/phase_adjustment}) and, contrary to previous work~\autocite{asano2015vibrotactile,friesen2024perceived}, it enables free exploration of the texture by the user with no constraints on the finger speed.
This approach avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (see \figref{method/phase_adjustment}) and, contrary to previous work~\cite{asano2015vibrotactile,friesen2024perceived}, it enables free exploration of the texture by the user with no constraints on the finger speed.
|
||||
%
|
||||
Finally, as \textcite{ujitoko2019modulating}, a square wave is chosen over a sine wave to get a rendering closer to a real grating texture with the sensation of crossing edges, and because the roughness perception of sine wave textures has been shown not to reproduce the roughness perception of real grating textures~\autocite{unger2011roughness}.
|
||||
Finally, as \textcite{ujitoko2019modulating}, a square wave is chosen over a sine wave to get a rendering closer to a real grating texture with the sensation of crossing edges, and because the roughness perception of sine wave textures has been shown not to reproduce the roughness perception of real grating textures~\cite{unger2011roughness}.
|
||||
%
%And secondly, to be able to render low frequencies that occur when the finger moves slowly or the texture period is large, as the actuator cannot render frequencies below \qty{\approx 20}{\Hz} with enough amplitude to be perceived with a pure sine wave signal.
%
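For illustration, the following is a minimal Python sketch of this phase-continuous rendering; it assumes the square wave $s_k = A\,\mathrm{sgn}(\sin(2\pi f t_k + \phi))$ of \eqref{signal}, with the frequency updated to $f_j = |\dot{x}_f| / \lambda$ at each tracking sample and the phase adjusted to $\phi_j = \phi_{j-1} + 2\pi t_j (f_{j-1} - f_j)$ so that the argument of the sine is unchanged at the update time $t_j$ (all names are ours; the actual implementation streams the audio through the WASAPI driver and the NAudio library).
\begin{verbatim}
import numpy as np

FS = 48_000  # audio sampling rate (Hz)

class SquareWaveTexture:
    """Phase-continuous square-wave texture signal (illustrative sketch)."""

    def __init__(self, wavelength, amplitude):
        self.lam = wavelength  # texture period lambda (m)
        self.A = amplitude     # signal amplitude A
        self.f = 0.0           # current frequency (Hz)
        self.phi = 0.0         # current phase (rad)
        self.k = 0             # global audio sample index

    def on_tracking_sample(self, finger_velocity):
        # Finger velocity is assumed to be already expressed in the
        # texture frame F_t, with x aligned with the texture direction.
        f_new = abs(finger_velocity) / self.lam
        t_j = self.k / FS
        # Keep 2*pi*f*t + phi continuous across the frequency change.
        self.phi += 2.0 * np.pi * t_j * (self.f - f_new)
        self.f = f_new

    def next_block(self, n):
        # Next n audio samples s_k = A * sgn(sin(2*pi*f*t_k + phi)).
        t = (self.k + np.arange(n)) / FS
        self.k += n
        return self.A * np.sign(np.sin(2.0 * np.pi * self.f * t + self.phi))
\end{verbatim}
A real-time implementation would call \texttt{on\_tracking\_sample} at the \qty{60}{\hertz} tracking rate and \texttt{next\_block} from the audio output callback.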
@@ -169,9 +169,9 @@ Both are the result of latency in image capture \qty{16 \pm 1}{\ms}, markers trac
%
The haptic loop also includes the voice-coil latency \qty{15}{\ms} (as specified by the manufacturer\footnotemark[1]), whereas the visual loop includes the latency in 3D rendering \qty{16 \pm 5}{\ms} (60 frames per second) and display \qty{5}{\ms}.
%
The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback~\autocite{okamoto2009detectability}.
The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback~\cite{okamoto2009detectability}.
%
The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking~\autocite{knorlein2009influence}.
The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking~\cite{knorlein2009influence}.

The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 \pm 30}{\ms}.
%

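This lag is the usual price of smoothing; as an illustration only (the filters actually used in the setup may differ), a simple exponential low-pass estimator trails its input by roughly $(1 - \alpha)/\alpha$ samples.
\begin{verbatim}
def smooth(x_prev, x_raw, alpha=0.2):
    # One step of a hypothetical exponential low-pass filter: smaller
    # alpha smooths the 60 Hz tracking samples more strongly, but the
    # estimate then lags the true signal by about (1 - alpha) / alpha
    # samples, i.e. ~67 ms at 60 Hz for alpha = 0.2.
    return alpha * x_raw + (1.0 - alpha) * x_prev
\end{verbatim}
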
@@ -28,7 +28,7 @@ The user study aimed to investigate the effect of visual hand rendering in AR or
%
In a two-alternative forced choice (2AFC) task, participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (see \figref{renderings}, \level{Real}), in AR with a realistic virtual hand superimposed on the real hand (see \figref{renderings}, \level{Mixed}), and in VR with the same virtual hand as an avatar (see \figref{renderings}, \level{Virtual}).
%
As vision is an important source of information and influence for the perception of texture~\autocite{bergmanntiest2007haptic,yanagisawa2015effects,normand2024augmenting,vardar2019fingertip}, the touched surface was a uniform white so as not to influence the perception; thus, only the visual aspect of the hand and the surrounding environment changed.
As vision is an important source of information and influence for the perception of texture~\cite{bergmanntiest2007haptic,yanagisawa2015effects,normand2024augmenting,vardar2019fingertip}, the touched surface was a uniform white so as not to influence the perception; thus, only the visual aspect of the hand and the surrounding environment changed.


\subsection{Participants}
@@ -68,7 +68,7 @@ The virtual hand model was a gender-neutral human right hand with realistic skin
%
Its size was adjusted to match the real hand of the participants before the experiment.
%
%An OST-AR headset (Microsoft HoloLens~2) was chosen over a VST-AR headset because the former only adds virtual content to the real environment, while the latter streams a real-time video capture of the real environment, and one of our objectives was to directly compare a virtual environment replicating a real one, not a video feed that introduces many other visual limitations~\autocite{macedo2023occlusion}.
%An OST-AR headset (Microsoft HoloLens~2) was chosen over a VST-AR headset because the former only adds virtual content to the real environment, while the latter streams a real-time video capture of the real environment, and one of our objectives was to directly compare a virtual environment replicating a real one, not a video feed that introduces many other visual limitations~\cite{macedo2023occlusion}.
%
The visual rendering of the virtual hand and environment is described in \secref{virtual_real_alignment}.
%
@@ -90,7 +90,7 @@ In the \level{Mixed} and \level{Real} conditions, the mask had two additional ho
%
%The position of the finger relative to the sheet was estimated using a webcam placed on top of the box (StreamCam, Logitech) and the OpenCV library by tracking a \qty{2}{\cm} square fiducial marker (AprilTag) glued to the top of the vibrotactile actuator.
%
%The total texture latency was measured at \qty{36 \pm 4}{\ms}, as a result of latency in image acquisition \qty{16 \pm 1}{\ms}, fiducial marker detection \qty{2 \pm 1}{\ms}, audio sampling \qty{3 \pm 1}{\ms}, and the vibrotactile actuator latency (\qty{15}{\ms}, as specified by the manufacturer\footnotemark[1]), and was below the \qty{60}{\ms} threshold for vibrotactile feedback~\autocite{okamoto2009detectability}.
%The total texture latency was measured at \qty{36 \pm 4}{\ms}, as a result of latency in image acquisition \qty{16 \pm 1}{\ms}, fiducial marker detection \qty{2 \pm 1}{\ms}, audio sampling \qty{3 \pm 1}{\ms}, and the vibrotactile actuator latency (\qty{15}{\ms}, as specified by the manufacturer\footnotemark[1]), and was below the \qty{60}{\ms} threshold for vibrotactile feedback~\cite{okamoto2009detectability}.
%
%The virtual hand followed the position of the fiducial marker with a slightly higher latency due to the network synchronization \qty{4 \pm 1}{\ms} between the computer and the HoloLens~2.


@@ -28,7 +28,7 @@ The \level{Real} rendering had the highest PSE (\percent{7.9} \ci{1.2}{4.1}) and
%
The JND represents the estimated minimum amplitude difference between the comparison and reference textures that participants could perceive,
% \ie the sensitivity to vibrotactile roughness differences,
calculated at the 84th percentile of the predictions of the GLMM (\ie one standard deviation of the normal distribution)~\autocite{ernst2002humans}.
calculated at the 84th percentile of the predictions of the GLMM (\ie one standard deviation of the normal distribution)~\cite{ernst2002humans}.
%
The \level{Real} rendering had the lowest JND (\percent{26} \ci{23}{29}), the \level{Mixed} rendering had the highest (\percent{33} \ci{30}{37}), and the \level{Virtual} rendering was in between (\percent{30} \ci{28}{32}).
%

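For reference, this conversion can be sketched as follows, assuming a probit-link GLMM that predicts the probability of judging the comparison rougher as $\Phi(\beta_0 + \beta_1 \Delta A)$ (this parameterisation is our assumption, not necessarily that of the fitted model).
\begin{verbatim}
from scipy.stats import norm

def pse_jnd(beta0, beta1):
    # Probit-link psychometric fit: P(rougher) = Phi(b0 + b1 * dA).
    pse = -beta0 / beta1                    # 50% point of the fit
    x84 = (norm.ppf(0.84) - beta0) / beta1  # 84th percentile point
    return pse, x84 - pse                   # JND = one SD of the fit
\end{verbatim}
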
@@ -18,7 +18,7 @@ Surprisingly, the PSE of the \level{Real} rendering was shifted to the right (to
%
The sensitivity of participants to roughness differences (just-noticeable differences, JND) also varied between all the visual renderings, with the \level{Real} rendering having the best JND (\percent{26}), followed by the \level{Virtual} (\percent{30}) and \level{Mixed} (\percent{33}) renderings (see \figref{results/trial_jnds}).
%
These JND values are in line with and at the upper end of the range of previous studies~\autocite{choi2013vibrotactile}, which may be due to the actuator being located on the top of the middle phalanx of the finger, which is less sensitive to vibration than the fingertip.
These JND values are in line with and at the upper end of the range of previous studies~\cite{choi2013vibrotactile}, which may be due to the actuator being located on the top of the middle phalanx of the finger, which is less sensitive to vibration than the fingertip.
%
Thus, compared to no visual rendering (\level{Real}), the addition of a visual rendering of the hand or environment reduced the roughness sensitivity (JND) and the average roughness perception (PSE), as if the virtual haptic textures felt \enquote{smoother}.

@@ -50,15 +50,15 @@ Thereby, we hypothesise that the differences in the perception of vibrotactile r
%
\textcite{diluca2011effects} demonstrated, in a VST-AR setup, how visual latency relative to proprioception increased the perception of stiffness of a virtual piston, while haptic latency decreased it.
%
Another complementary explanation could be a pseudo-haptic effect of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen in a non-immersive context~\autocite{ujitoko2019modulating}.
Another complementary explanation could be a pseudo-haptic effect of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen in a non-immersive context~\cite{ujitoko2019modulating}.
%
Such hypotheses could be tested by manipulating the latency and tracking accuracy of the virtual hand or the vibrotactile feedback. % to observe their effects on the roughness perception of the virtual textures.

The main limitation of our study is, of course, the absence of a visual representation of the touched virtual texture.
%
This is indeed a source of information as important as the haptic sensations for the perception of both real textures~\autocite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures~\autocite{degraen2019enhancing,gunther2022smooth,normand2024augmenting}.
This is indeed a source of information as important as the haptic sensations for the perception of both real textures~\cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures~\cite{degraen2019enhancing,gunther2022smooth,normand2024augmenting}.
%
%Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive AR or VR context, as the visuo-haptic coupling of such grating textures is not trivial~\autocite{unger2011roughness} even with real textures~\autocite{klatzky2003feeling}.
%Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive AR or VR context, as the visuo-haptic coupling of such grating textures is not trivial~\cite{unger2011roughness} even with real textures~\cite{klatzky2003feeling}.
%
The interactions between the visual and haptic sensory modalities are complex and deserve further investigation, in particular in the context of visuo-haptic AR.
%

@@ -12,4 +12,4 @@ We investigated then with a psychophysical user study the effect of visual rende
%Only the amplitude $A$ varied between the reference and comparison textures to create the different levels of roughness.
%
%Participants were not informed that there were reference and comparison textures, and
No texture was represented visually, to avoid any influence on the perception~\autocite{bergmanntiest2007haptic,normand2024augmenting,yanagisawa2015effects}.
No texture was represented visually, to avoid any influence on the perception~\cite{bergmanntiest2007haptic,normand2024augmenting,yanagisawa2015effects}.
@@ -23,35 +23,35 @@

Augmented reality (AR) integrates virtual content into our real-world surroundings, giving the illusion of one unique environment and promising natural and seamless interactions with real and virtual objects.
%
Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment~\autocite{laviolajr20173d, kim2018revisiting}.
Virtual object manipulation is particularly critical for useful and effective AR usage, such as in medical applications, training, or entertainment~\cite{laviolajr20173d, kim2018revisiting}.
%
Hand tracking technologies~\autocite{xiao2018mrtouch}, grasping techniques~\autocite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real~\autocite{piumsomboon2014graspshell}, without requiring controllers~\autocite{krichenbauer2018augmented}, gloves~\autocite{prachyabrued2014visual}, or predefined gesture techniques~\autocite{piumsomboon2013userdefined, ha2014wearhand}.
Hand tracking technologies~\cite{xiao2018mrtouch}, grasping techniques~\cite{holl2018efficient}, and real-time physics engines permit users to directly manipulate virtual objects with their bare hands as if they were real~\cite{piumsomboon2014graspshell}, without requiring controllers~\cite{krichenbauer2018augmented}, gloves~\cite{prachyabrued2014visual}, or predefined gesture techniques~\cite{piumsomboon2013userdefined, ha2014wearhand}.
%
Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction~\autocite{kim2018revisiting}.
Optical see-through AR (OST-AR) head-mounted displays (HMDs), such as the Microsoft HoloLens 2 or the Magic Leap, are particularly suited for this type of direct hand interaction~\cite{kim2018revisiting}.

However, there are still several haptic and visual limitations that affect manipulation in OST-AR, degrading the user experience.
%
For example, it is difficult to estimate the position of one's hand in relation to virtual content because mutual occlusion between the hand and the virtual object is often lacking~\autocite{macedo2023occlusion}, the depth of virtual content is underestimated~\autocite{diaz2017designing, peillard2019studying}, and hand tracking still has a noticeable latency~\autocite{xiao2018mrtouch}.
For example, it is difficult to estimate the position of one's hand in relation to virtual content because mutual occlusion between the hand and the virtual object is often lacking~\cite{macedo2023occlusion}, the depth of virtual content is underestimated~\cite{diaz2017designing, peillard2019studying}, and hand tracking still has a noticeable latency~\cite{xiao2018mrtouch}.
%
Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which of course cannot apply physical constraints on the hand~\autocite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
Similarly, it is challenging to ensure confident and realistic contact with a virtual object due to the lack of haptic feedback and the intangibility of the virtual environment, which of course cannot apply physical constraints on the hand~\cite{maisto2017evaluation, meli2018combining, lopes2018adding, teng2021touch}.
%
These limitations also make it difficult to confidently move a grasped object towards a target~\autocite{maisto2017evaluation, meli2018combining}.
These limitations also make it difficult to confidently move a grasped object towards a target~\cite{maisto2017evaluation, meli2018combining}.

To address these haptic and visual limitations, we investigate two types of sensory feedback that are known to improve virtual interactions with hands, but have not been studied together in an AR context: visual hand rendering and delocalized haptic rendering.
%
A few works explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects~\autocite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent~\autocite{ha2014wearhand, piumsomboon2014graspshell} or opaque~\autocite{blaga2017usability, yoon2020evaluating, saito2021contact}.
A few works explored the effect of a visual hand rendering on interactions in AR by simulating mutual occlusion between the real hand and virtual objects~\cite{ha2014wearhand, piumsomboon2014graspshell, al-kalbani2016analysis}, or displaying a 3D virtual hand model, semi-transparent~\cite{ha2014wearhand, piumsomboon2014graspshell} or opaque~\cite{blaga2017usability, yoon2020evaluating, saito2021contact}.
%
Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible~\autocite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
Indeed, some visual hand renderings are known to improve interactions or user experience in virtual reality (VR), where the real hand is not visible~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch, vanveldhuizen2021effect}.
%
However, the role of a visual hand rendering superimposed and seen above the real tracked hand has not yet been investigated in AR.
%
Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR~\autocite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
Conjointly, several studies have demonstrated that wearable haptics can significantly improve interaction performance and user experience in AR~\cite{maisto2017evaluation, meli2018combining, sarac2022perceived}.
%
But haptic rendering for AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking~\autocite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment~\autocite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
But haptic rendering for AR remains a challenge as it is difficult to provide rich and realistic haptic sensations while limiting their negative impact on hand tracking~\cite{pacchierotti2016hring} and keeping the fingertips and palm free to interact with the real environment~\cite{lopes2018adding, teng2021touch, sarac2022perceived, palmer2022haptic}.
%
Therefore, the haptic feedback of the fingertip contact with the virtual environment needs to be rendered elsewhere on the hand, but it is unclear which positioning should be preferred or which type of haptic feedback is best suited for manipulating virtual objects in AR.
%
A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred~\autocite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
A final question is whether one or the other of these (haptic or visual) hand renderings should be preferred~\cite{maisto2017evaluation, meli2018combining}, or whether a combined visuo-haptic rendering is beneficial for users.
%
In fact, both hand renderings may provide sufficient sensory cues on their own for efficient manipulation of virtual objects in AR, or they may prove to be complementary.


@@ -11,7 +11,7 @@ We compared a set of the most popular visual hand renderings.%, as also presente
%
Since we address hand-centered manipulation tasks, we only considered renderings including the fingertips.
%
Moreover, to keep the focus on the hand rendering itself, we used neutral semi-transparent grey meshes, consistent with the choices made in~\autocite{yoon2020evaluating, vanveldhuizen2021effect}.
Moreover, to keep the focus on the hand rendering itself, we used neutral semi-transparent grey meshes, consistent with the choices made in~\cite{yoon2020evaluating, vanveldhuizen2021effect}.
%
All considered hand renderings are drawn following the tracked pose of the user's real hand.
%
@@ -21,7 +21,7 @@ However, while the real hand can of course penetrate virtual objects, the visual
\subsubsection{None~(\figref{method/hands-none})}
\label{hands_none}

As a reference, we considered no visual hand rendering, as is common in AR~\autocite{hettiarachchi2016annexing, blaga2017usability, xiao2018mrtouch, teng2021touch}.
As a reference, we considered no visual hand rendering, as is common in AR~\cite{hettiarachchi2016annexing, blaga2017usability, xiao2018mrtouch, teng2021touch}.
%
Users have no information about hand tracking and no feedback about contact with the virtual objects, other than the objects' movement when touched.
%
@@ -31,9 +31,9 @@ As virtual content is rendered on top of the real environment, the hand of the u
\subsubsection{Occlusion (Occl,~\figref{method/hands-occlusion})}
\label{hands_occlusion}

To avoid the abovementioned undesired occlusions due to the virtual content being rendered on top of the real environment, we can carefully crop the former whenever it hides real content that should be visible~\autocite{macedo2023occlusion}, \eg the thumb of the user in \figref{method/hands-occlusion}.
To avoid the abovementioned undesired occlusions due to the virtual content being rendered on top of the real environment, we can carefully crop the former whenever it hides real content that should be visible~\cite{macedo2023occlusion}, \eg the thumb of the user in \figref{method/hands-occlusion}.
%
This approach is frequent in works using VST-AR headsets~\autocite{knorlein2009influence, ha2014wearhand, piumsomboon2014graspshell, suzuki2014grasping, al-kalbani2016analysis}.
This approach is frequent in works using VST-AR headsets~\cite{knorlein2009influence, ha2014wearhand, piumsomboon2014graspshell, suzuki2014grasping, al-kalbani2016analysis}.


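The general idea of this cropping can be sketched as a per-pixel depth test between the tracked hand and the virtual content (a simplified illustration with our own names; actual implementations perform this test in the rendering pipeline's depth buffer).
\begin{verbatim}
import numpy as np

def crop_virtual_layer(virtual_rgba, virtual_depth, hand_depth):
    # Hide virtual pixels wherever the tracked real hand is closer to
    # the camera than the virtual content, so the real hand appears in
    # front of the virtual object instead of being painted over.
    occluded = hand_depth < virtual_depth  # boolean mask, shape (H, W)
    out = virtual_rgba.copy()              # RGBA layer, shape (H, W, 4)
    out[occluded] = 0                      # fully transparent pixels
    return out
\end{verbatim}
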
\subsubsection{Tips (\figref{method/hands-tips})}
@@ -41,7 +41,7 @@ This approach is frequent in works using VST-AR headsets~\autocite{knorlein2009i

This rendering shows small visual rings around the fingertips of the user, highlighting the most important parts of the hand and contact with virtual objects during fine manipulation.
%
Unlike work using small spheres~\autocite{maisto2017evaluation, meli2014wearable, grubert2018effects, normand2018enlarging, schwind2018touch}, this ring rendering also provides information about the orientation of the fingertips.
Unlike work using small spheres~\cite{maisto2017evaluation, meli2014wearable, grubert2018effects, normand2018enlarging, schwind2018touch}, this ring rendering also provides information about the orientation of the fingertips.


\subsubsection{Contour (Cont,~\figref{method/hands-contour})}
@@ -51,7 +51,7 @@ This rendering is a {1-mm-thick} outline contouring the user's hands, providing
%
Unlike the other renderings, it is not occluded by the virtual objects, as shown in \figref{method/hands-contour}.
%
This rendering is less common in the literature than the previous ones~\autocite{kang2020comparative}.
This rendering is less common in the literature than the previous ones~\cite{kang2020comparative}.


\subsubsection{Skeleton (Skel,~\figref{method/hands-skeleton})}
@@ -61,13 +61,13 @@ This rendering schematically renders the joints and phalanges of the fingers wit
%
It can be seen as an extension of the Tips rendering to include the complete finger articulations.
%
It is widely used in VR~\autocite{argelaguet2016role, schwind2018touch, chessa2019grasping} and AR~\autocite{blaga2017usability, yoon2020evaluating}, as it is considered simple yet rich and comprehensive.
It is widely used in VR~\cite{argelaguet2016role, schwind2018touch, chessa2019grasping} and AR~\cite{blaga2017usability, yoon2020evaluating}, as it is considered simple yet rich and comprehensive.


\subsubsection{Mesh (\figref{method/hands-mesh})}
\label{hands_mesh}

This rendering is a 3D semi-transparent ($a=0.2$) hand model, which is common in VR~\autocite{prachyabrued2014visual, argelaguet2016role, schwind2018touch, chessa2019grasping, yoon2020evaluating, vanveldhuizen2021effect}.
This rendering is a 3D semi-transparent ($a=0.2$) hand model, which is common in VR~\cite{prachyabrued2014visual, argelaguet2016role, schwind2018touch, chessa2019grasping, yoon2020evaluating, vanveldhuizen2021effect}.
%
It can be seen as a filled version of the Contour hand rendering, thus partially covering the view of the real hand.

@@ -88,7 +88,7 @@ It can be seen as a filled version of the Contour hand rendering, thus partially
\subfig[0.23]{method/task-grasp}
\end{subfigs}

Following the guidelines of \textcite{bergstrom2021how} for designing object manipulation tasks, we considered two variations of a 3D pick-and-place task, commonly found in interaction and manipulation studies~\autocite{prachyabrued2014visual, maisto2017evaluation, meli2018combining, blaga2017usability, vanveldhuizen2021effect}.
Following the guidelines of \textcite{bergstrom2021how} for designing object manipulation tasks, we considered two variations of a 3D pick-and-place task, commonly found in interaction and manipulation studies~\cite{prachyabrued2014visual, maisto2017evaluation, meli2018combining, blaga2017usability, vanveldhuizen2021effect}.


\subsubsection{Push Task}
@@ -184,7 +184,7 @@ During this training, we did not use any of the six hand renderings we want to t

Participants were asked to carry out the two tasks as naturally and as fast as possible.
%
Similarly to~\autocite{prachyabrued2014visual, maisto2017evaluation, blaga2017usability, vanveldhuizen2021effect}, we only allowed the use of the dominant hand.
Similarly to~\cite{prachyabrued2014visual, maisto2017evaluation, blaga2017usability, vanveldhuizen2021effect}, we only allowed the use of the dominant hand.
%
The experiment took around 1 hour and 20 minutes to complete.

@@ -218,7 +218,7 @@ Finally, (iii) the mean \emph{Time per Contact}, defined as the total time any p
%
Solely for the grasp-and-place task, we also measured the (iv) \emph{Grip Aperture}, defined as the average distance between the thumb's fingertip and the other fingertips during the grasping of the cube;
%
lower values indicate a greater finger interpenetration with the cube, resulting in a greater discrepancy between the real hand and the visual hand rendering constrained to the cube surfaces, and showing how confident users are in their grasp~\autocite{prachyabrued2014visual, al-kalbani2016analysis, blaga2017usability, chessa2019grasping}.
lower values indicate a greater finger interpenetration with the cube, resulting in a greater discrepancy between the real hand and the visual hand rendering constrained to the cube surfaces, and showing how confident users are in their grasp~\cite{prachyabrued2014visual, al-kalbani2016analysis, blaga2017usability, chessa2019grasping}.
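As a minimal sketch of this measure at a single time step (the function name and array layout are ours):
\begin{verbatim}
import numpy as np

def grip_aperture(thumb_tip, other_tips):
    # Mean Euclidean distance between the thumb tip (shape (3,)) and
    # the other fingertips (shape (n, 3)); averaging this value over
    # the grasp phase gives the Grip Aperture measure.
    return float(np.mean(np.linalg.norm(other_tips - thumb_tip, axis=1)))
\end{verbatim}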
%
Taken together, these measures provide an overview of the performance and usability of each of the visual hand renderings tested, as we hypothesized that they would influence the behavior and effectiveness of the participants.


@@ -37,7 +37,7 @@ This result is consistent with \textcite{saito2021contact}, who found that disp

To summarize, when employing a visual hand rendering overlaying the real hand, participants were more effective and confident in manipulating virtual objects with bare hands in AR.
%
These results contrast with similar manipulation studies, but in non-immersive, on-screen AR, where the presence of a visual hand rendering was found by participants to improve the usability of the interaction, but not their performance~\autocite{blaga2017usability,maisto2017evaluation,meli2018combining}.
These results contrast with similar manipulation studies, but in non-immersive, on-screen AR, where the presence of a visual hand rendering was found by participants to improve the usability of the interaction, but not their performance~\cite{blaga2017usability,maisto2017evaluation,meli2018combining}.
%
Our results show the most effective visual hand rendering to be the Skeleton one. Participants appreciated that it provided a detailed and precise view of the tracking of the real hand, without hiding or masking it.
%
@@ -45,7 +45,7 @@ Although the Contour and Mesh hand renderings were also highly rated, some parti
%
This result is in line with the results of virtual object manipulation in VR of \textcite{prachyabrued2014visual}, who found that the most effective visual hand rendering was a double representation of both the real tracked hand and a visual hand physically constrained by the virtual environment.
%
This type of Skeleton rendering was also the one that provided the best sense of agency (control) in VR~\autocite{argelaguet2016role, schwind2018touch}.
This type of Skeleton rendering was also the one that provided the best sense of agency (control) in VR~\cite{argelaguet2016role, schwind2018touch}.

These results have, of course, some limitations, as they only address limited types of manipulation tasks and visual hand characteristics, evaluated in a specific OST-AR setup.
%