Replace \autocite => \cite

This commit is contained in:
2024-09-08 10:52:06 +02:00
parent 0c11bb2668
commit e96888afab
19 changed files with 197 additions and 197 deletions


\subsubsection{Texture}
\label{texture_rendering}
Several approaches have been proposed to render virtual haptic texture~\cite{culbertson2018haptics}.
%
High-fidelity force feedback devices can reproduce patterned textures with great precision and provide perceptions similar to real textures, but they are expensive, have a limited workspace, and require the user to hold a probe to explore the texture~\cite{unger2011roughness}.
%
As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}.
%
In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration,guruswamy2011iir} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but less faithful to real textures, and the latter being more realistic but limited to the captured textures.
%
Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
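The decaying-sinusoid idea can be sketched as follows. This is a minimal illustration, not the original implementation: the amplitude, decay rate, frequency, and ridge spacing below are illustrative assumptions, and one transient is emitted each time the tracked finger position crosses a grating boundary.

```python
import numpy as np

def decaying_sinusoid(t, amplitude=1.0, decay=80.0, freq=150.0):
    # One ridge-crossing transient: A * exp(-B*t) * sin(2*pi*f*t).
    # All three parameter values are illustrative, not from the paper.
    return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def grating_signal(positions, ridge_spacing=0.002, sample_rate=2000.0,
                   pulse_duration=0.05):
    # positions: finger positions in metres, sampled at sample_rate (Hz).
    n = len(positions)
    out = np.zeros(n)
    t = np.arange(int(pulse_duration * sample_rate)) / sample_rate
    pulse = decaying_sinusoid(t)
    last = int(np.floor(positions[0] / ridge_spacing))
    for i in range(1, n):
        cell = int(np.floor(positions[i] / ridge_spacing))
        if cell != last:  # finger crossed into the next groove/ridge cell
            end = min(n, i + len(pulse))
            out[i:end] += pulse[:end - i]  # superpose a new transient
            last = cell
    return out
```

Overlapping transients are simply summed, which is consistent with treating each ridge stroke as an independent impulse response.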
%
An effective approach to rendering virtual roughness is to generate vibrations that simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity, and force to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
% For example, when comparing the same virtual texture pairwise, but with different parameters, \textcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
% Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with a finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile}, creating a haptic texture augmentation.
%
The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.
One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\cite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
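This frequency-modulation scheme can be sketched as follows; the texture wavelength and amplitude are illustrative assumptions, and the phase is obtained by integrating the instantaneous frequency so that speed changes do not introduce discontinuities in the signal.

```python
import numpy as np

def texture_vibration(velocities, wavelength=0.001, sample_rate=1000.0,
                      amplitude=1.0):
    # Instantaneous frequency tracks finger speed: f = |v| / wavelength,
    # i.e. one vibration cycle per texture wavelength travelled.
    # wavelength (m) and amplitude are illustrative assumptions.
    freq = np.abs(np.asarray(velocities, dtype=float)) / wavelength
    # Integrate the phase (cumulative sum of f * dt) to keep the
    # sinusoid continuous when the finger speed varies.
    phase = 2.0 * np.pi * np.cumsum(freq) / sample_rate
    return amplitude * np.sin(phase)
```

A position-driven variant would instead take the phase directly proportional to the travelled distance, which this velocity integral approximates sample by sample.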
%
It remains unclear whether such vibrotactile texture augmentation is perceived the same when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%


\subsubsection{Texture Augmentations}
\label{vhar_texture}
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structures that were touched with the finger to induce a larger set of visuo-haptic material perceptions.
\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\cite{degraen2019enhancing} and passive touch~\cite{gunther2022smooth} contexts.
A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
\subsection{Improving the Interactions}
\label{vhar_interaction}
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
\subsubsection{Virtual Hands in Augmented Reality}
\label{vhar_hands}


\subsection{Visual Hand Rendering in AR}
\label{2_hands}
Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
In video see-through AR (VST-AR), this could be solved as a masking problem by combining the image of the real world captured by a camera and the generated virtual image~\cite{macedo2023occlusion}.
%
In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\cite{macedo2023occlusion}.
%
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.
An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one is occluding the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
%
However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube, but it appears to be behind it.
In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects), performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that over a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role} and that minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.
Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
\label{2_haptics}
Different haptic feedback systems have been explored to improve interactions in AR, including %
grounded force feedback devices~\cite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
exoskeletons~\cite{lee2021wearable}, %
tangible objects~\cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.
Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
%
%
\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
Each of these haptic devices provided haptic feedback about fingertip interactions with the virtual content on other parts of the hand.
%
%
In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to either the device or the device position (proximal vs fingertip), or both.
%


\subsection{Haptics in AR}
As in VR, the addition of haptic feedback in AR has been explored through numerous approaches, including %
grounded force feedback devices~\cite{jeon2009haptic,knorlein2009influence,hachisu2012augmentation,gaffary2017ar}, %
exoskeletons~\cite{lee2021wearable}, %
wearable haptic devices~\cite{maisto2017evaluation,detinguy2018enhancing,lopes2018adding,meli2018combining,pezent2019tasbi,teng2021touch}, %
tangible objects~\cite{punpongsanon2015softar,hettiarachchi2016annexing,kahl2021investigation}, and %
mid-air haptics~\cite{ochiai2016crossfield}. %
%
Most have been used to provide haptic feedback to virtual objects.
%
While this may seem similar to haptic feedback in VR, there are significant differences in terms of perception, as in AR the real world and the hand of the user remain visible, but also because the virtual content may be less realistic or inconsistent with the real world~\cite{kim2018revisiting,macedo2023occlusion}.
%
Indeed, the same haptic stimuli can be perceived differently in AR and VR, \eg the perceived stiffness of a piston seemed higher in AR than in VR~\cite{gaffary2017ar} or was altered in the presence of a delay between the haptic and visual feedback~\cite{knorlein2009influence}.
%
It might therefore be interesting to study how haptic and visual augmentations of the textures of tangible surfaces are perceived in AR.
An additional challenge in AR is to let the hand of the user free to touch, feel, and interact with the real objects~\cite{maisto2017evaluation,detinguy2018enhancing,teng2021touch}.
%
For example, mounted on the nail, the haptic device of \textcite{teng2021touch} can be quickly unfolded on demand to the fingertip to render haptic feedback of virtual objects.
%
It is however not suitable for rendering haptic feedback when touching real objects.
%
In this respect, some wearable haptic devices were specifically designed to provide haptic feedback about fingertip interactions with the virtual content, but delocalized elsewhere on the body: on the proximal finger phalanx with the hRing haptic ring device~\cite{pacchierotti2016hring,ferro2023deconstructing}, on the wrist with the Tasbi bracelet~\cite{pezent2019tasbi}, or on the arm~\cite{lopes2018adding}.
%
Compared to a fingertip-worn device, the hRing was even preferred by participants and perceived as more effective in a virtual object manipulation task in AR~\cite{maisto2017evaluation,meli2018combining}.
%
This device has then been taken further to alter the cutaneous perception of touched tangible objects in VR and AR~\cite{detinguy2018enhancing,salazar2020altering}: by providing normal and shear forces to the proximal phalanx skin in a timely manner, the perceived stiffness, softness, slipperiness, and local deformations (bumps and holes) of the touched tangible object were augmented.
%
However, wearable haptic devices have not yet been used in AR to modify the texture perception of a tangible surface.
\subsection{Virtual Texture Perception}
% Explain how different from \cite{konyo2005tactile, asano2012vibrotactile, asano2015vibrotactile}, \cite{ando2007fingernailmounted}, \cite{bau2012revel}, \cite{chan2021hasti}, and culbertson
%
%Tactile perception of a real texture involves multiple sensations, among them roughness being one of the most important.
%
%When running a finger over a surface, the perception of its roughness is due to the deformation of the skin caused by the micro height differences of the material \cite{klatzky1999tactile,klatzky2003feeling}.
%
%Interestingly, visual perception of material roughness seems almost as good as haptic perception of roughness.
%
%However, there is a greater variability between individuals for visual perception than for haptic perception of roughness \cite{bergmanntiest2007haptic}.
Many approaches have been used to generate realistic haptic virtual textures.
%
Ultrasonic vibrating screens are capable of modulating their friction~\cite{rekik2017localized,ito2019tactile}, but their use in AR is limited.
%
By simulating the roughness of a surface instead, force feedback devices can reproduce perceptions of patterned textures identical to those of real textures~\cite{unger2011roughness}, but they are expensive and have a limited workspace.
%
An alternative is to reproduce the vibrations that occur when a tool or the finger is moved across a surface using a vibrotactile device attached to a hand-held tool~\cite{culbertson2018haptics}.
%
Several physical models have been proposed to represent such vibrations~\cite{okamura1998vibration,guruswamy2011iir,chan2021hasti}.
%
However, as they can be difficult to tune, measurement-based models have been developed to record, model, and render these vibrations~\cite{culbertson2014modeling,culbertson2017ungrounded}.
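As a toy illustration of the measurement-based idea (not the actual models of the cited works), one can store an autoregressive filter and a noise gain per recorded scan speed, then resynthesize vibrations by filtering white noise through the model recorded closest to the user's current speed; all names and parameter values below are hypothetical.

```python
import numpy as np

def resynthesize(speed, recorded_models, n=1000, seed=0):
    # recorded_models: {scan speed (m/s): (AR coefficients, noise gain)}.
    # Nearest-neighbour model lookup; real systems interpolate between
    # models captured at different speeds and forces.
    rng = np.random.default_rng(seed)
    key = min(recorded_models, key=lambda s: abs(s - speed))
    coeffs, gain = recorded_models[key]
    excitation = gain * rng.standard_normal(n)  # white-noise excitation
    y = np.zeros(n)
    for i in range(n):
        acc = excitation[i]
        for k, a in enumerate(coeffs, start=1):  # AR feedback terms
            if i - k >= 0:
                acc -= a * y[i - k]
        y[i] = acc
    return y
```

The AR coefficients shape the vibration spectrum to match the recording, while the gain scales its intensity; in practice both are re-estimated continuously as the user's speed and force change.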
%
In this work, we employed such data-driven haptic models to augment tangible surfaces and studied their visuo-haptic texture perception in AR.%\CP{Here the original sentence was: ``We use these data-driven haptic models to augment [...].''. It was not clear what ``we use'' meant. Check that the new sentence is correct.}
%
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
For grating textures, an arbitrary roughness rating is used to determine a psycho-physical curve as a function of pattern spacing~\cite{unger2011roughness,asano2015vibrotactile,degraen2019enhancing}.
%
Another common method is to identify a given haptic texture among visual representations of all haptic textures~\cite{ando2007fingernailmounted,rekik2017localized,degraen2019enhancing,chan2021hasti}.
%
In this user study, participants matched the pairs of visual and haptic textures they found most coherent and ranked the textures according to their perceived roughness.
%\CP{Do you refer to the one in our paper? Not super clear.}
%
\textcite{ando2007fingernailmounted} mounted a vibrotactile actuator on the index nail, which generated impulse vibrations to render virtual edges and gaps on a real surface.
%
%This rendering method was compared later to providing the vibrations with pressure directly on the fingertip in AR and was found more realistic to render virtual objects and textures~\cite{teng2021touch}.
%
%Covering the fingertip is however not suitable for rendering haptic feedback when touching real objects.
%
%
This effect has been used to alter the texture perception in AR and VR.
%
For example, virtual opaque visual textures superimposed on real surfaces in AR can be perceived as coherent with those surfaces even though their roughnesses are very different~\cite{kitahara2010sensory}.
%
\textcite{fradin2023humans} explored this effect further, finding that a superimposed AR visual texture slightly different from a colocalized haptic texture affected the ability to recognize the haptic texture.
%
Similarly, \textcite{punpongsanon2015softar} altered the softness perception of a tangible surface using AR-projected visual textures whereas \textcite{chan2021hasti} evaluated audio-haptic texture perception in VR.
%
Conversely, colocalized 3D-printed real hair structures were able to correctly render several virtual visual textures seen in VR in terms of haptic hardness and roughness~\cite{degraen2019enhancing}.
%
This study investigated how virtual roughness haptic texture can be used to enhance touched real surfaces augmented with visual AR textures.
%Dans cet article, les textures haptiques sont senties co-localisées avec des textures visuelles


@@ -11,37 +11,37 @@ Yet visual and haptic sensations are often combined in everyday life, and it is
\subsection{Augmenting Haptic Texture Roughness}
\label{vibrotactile_roughness}
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\cite{klatzky2003feeling}.
%
%Several approaches have been proposed to render virtual haptic texture~\autocite{culbertson2018haptics}.
%Several approaches have been proposed to render virtual haptic texture~\cite{culbertson2018haptics}.
%
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\autocite{unger2011roughness}.
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\cite{unger2011roughness}.
%
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}.
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}.
%
%In this way, physics-based models~\autocite{chan2021hasti,okamura1998vibration} and data-based models~\autocite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
%Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated the strokes of the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity, and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\autocite{culbertson2015should}.
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity, and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
%
The perceived roughness of real surfaces can then be modified when they are touched by a tool with a vibrotactile actuator attached~\autocite{culbertson2014modeling,ujitoko2019modulating} or directly with a finger wearing the vibrotactile actuator~\autocite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
The perceived roughness of real surfaces can then be modified when they are touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with a finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
%
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\autocite{bhatia2024augmenting,jeon2009haptic}.
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.
%
One additional challenge of augmenting finger touch is to keep the fingertip free to touch the real environment, which requires delocalizing the actuator elsewhere on the hand~\autocite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
One additional challenge of augmenting finger touch is to keep the fingertip free to touch the real environment, which requires delocalizing the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced in the fingertip skin running over a real surface are texture-specific and similar across individuals~\autocite{manfredi2014natural}.
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced in the fingertip skin running over a real surface are texture-specific and similar across individuals~\cite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\autocite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%
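As a concrete sketch of this position-modulated approach (illustrative only; the function name, wavelength, and amplitude below are our assumptions, not parameters from the cited systems), a virtual grating can be rendered by tying the sinusoid's phase to the measured finger position:

```python
import numpy as np

def texture_signal(finger_positions_m, wavelength_m=0.002, amplitude=1.0):
    """Drive signal for a hypothetical virtual grating: one sinusoid period
    per spatial wavelength, so the felt temporal frequency equals the
    finger velocity divided by the wavelength."""
    x = np.asarray(finger_positions_m, dtype=float)
    return amplitude * np.sin(2.0 * np.pi * x / wavelength_m)
```

Because the phase depends on position rather than time, the vibration frequency naturally scales with exploration speed, which is the behavior the position-driven renderers above rely on.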
It remains unclear whether such a vibrotactile texture augmentation is perceived in the same way when integrated into visual AR or VR environments, or when the surface is touched with a virtual hand instead of the real hand.
%
%We also add a phase adjustment to this sinusoidal signal to allow free exploration movements of the finger with a simple camera-based tracking system.
%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\autocite{brahimaj2023crossmodal,rekik2017localized}.
%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\cite{brahimaj2023crossmodal,rekik2017localized}.
%
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\autocite{ito2019tactile}.
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\cite{ito2019tactile}.
%
%However, this method is limited to the screen and does not allow to easily render textures on virtual (visual) objects or to alter the perception of real surfaces.
@@ -55,30 +55,30 @@ When the same object property is sensed simultaneously by vision and touch, the
The psychophysical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
%
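As a reminder of how this model operates (the notation below is ours, with $V$ and $H$ denoting the visual and haptic estimates of the same property), the maximum-likelihood combination is commonly written as
\begin{equation*}
\hat{S} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \quad i \in \{V, H\},
\end{equation*}
so the modality with the smaller variance $\sigma_i^2$ receives the larger weight and dominates the combined estimate $\hat{S}$.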
%In particular, this effect has been used to better understand the visuo-haptic perception of texture and to design better feedback for virtual objects.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \textcite{yanagisawa2015effects}, who used a half-mirror to superimpose different real visual textures on real tactile textures touched with the finger, altering their perceived roughness, stiffness, and friction.
%
%Similarly but in VR, \textcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structure that were touched with the finger to induce a larger set of visuo-haptic materials perception.
%
%\textcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\autocite{degraen2019enhancing} and passive touch~\autocite{gunther2022smooth} contexts.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\cite{degraen2019enhancing} and passive touch~\cite{gunther2022smooth} contexts.
%
\textcite{normand2024augmenting} also investigated the roughness perception of tangible surfaces touched with the finger and augmented with visual textures in AR and with wearable vibrotactile textures.
%
%A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\autocite{prachyabrued2014visual,blaga2020too} and AR~\autocite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\autocite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR~\cite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
% \autocite{degraen2019enhancing} and \autocite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \autocite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.
% \cite{degraen2019enhancing} and \cite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \cite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.
A few works have also used pseudo-haptic feedback, which deforms the visual representation of a user input, to change the perception of haptic stimuli and create richer feedback~\autocite{ujitoko2021survey}.
A few works have also used pseudo-haptic feedback, which deforms the visual representation of a user input, to change the perception of haptic stimuli and create richer feedback~\cite{ujitoko2021survey}.
%
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\autocite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\autocite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\autocite{choi2021augmenting}.
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
%
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\autocite{ujitoko2019modulating}.
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\cite{ujitoko2019modulating}.
%
%However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
@@ -95,8 +95,8 @@ Rendering a virtual piston pressed with one's real hand using a video see-throug
%
In a similar setup, but with an optical see-through (OST) AR headset, \textcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
Using a VST-AR headset has notable consequences, as the ``real'' view of the environment and the hand is actually a video stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR~\autocite{macedo2023occlusion}.
Using a VST-AR headset has notable consequences, as the ``real'' view of the environment and the hand is actually a video stream from a camera, which has a noticeable delay and lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment with OST-AR~\cite{macedo2023occlusion}.
%
While a large body of literature has investigated these differences in visual perception in AR, as well as in VR (\eg distances are underestimated~\autocite{adams2022depth,peillard2019studying}), less is known about visuo-haptic perception in AR and VR.
While a large body of literature has investigated these differences in visual perception in AR, as well as in VR (\eg distances are underestimated~\cite{adams2022depth,peillard2019studying}), less is known about visuo-haptic perception in AR and VR.
%
In this work we studied (1) the perception of a \emph{haptic texture augmentation} of a tangible surface and (2) the possible influence of the visual rendering of the environment (OST-AR or VR) and the hand touching the surface (real or virtual) on this perception.