Replace \citeauthorcite => \textcite

2024-06-27 17:52:03 +02:00
parent eb80ce3e38
commit 65de33bb60
8 changed files with 44 additions and 44 deletions
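The renamed command is biblatex's built-in \textcite, which prints the author name(s) followed by the citation label in the running text. A minimal before/after sketch, assuming \citeauthorcite was a project-local wrapper around \citeauthor and \cite (its definition is not shown in this diff) and using a hypothetical references.bib:

% Assumed old macro (not part of this diff):
%   \newcommand{\citeauthorcite}[1]{\citeauthor{#1}~\cite{#1}}
\documentclass{article}
\usepackage[style=numeric]{biblatex} % example style only; the project may load a different one
\addbibresource{references.bib}      % hypothetical bibliography file
\begin{document}
% \textcite replaces the wrapper with a single built-in command:
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that
the virtual hand representation matters.
% With a numeric style this renders roughly as "Author and Author [1] found that ...";
% the exact author names and label come from the .bib entry and the chosen style.
\printbibliography
\end{document}

Both commands keep the citation key unchanged, so the replacement is a pure one-to-one text substitution, consistent with the 44 additions and 44 deletions reported above.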


@@ -26,27 +26,27 @@ In VR, as the user is fully immersed in the virtual environment and cannot see t
%
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
-In a pick-and-place task in VR, \citeauthorcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
+In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that over a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role} and that minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
-In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \citeauthorcite{prachyabrued2014visual}.
+In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
%
-Additionally, \citeauthorcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without worsening the users' experience or the performance.
+Additionally, \textcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without worsening the users' experience or the performance.
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.
Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
-For example, \citeauthorcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
+For example, \textcite{blaga2017usability} evaluated a skeleton rendering in several virtual object manipulations against no visual hand overlay.
%
Performance did not improve, but participants felt more confident with the virtual hand.
%
However, the experiment was carried out on a screen, in a non-immersive AR scenario.
%
-\citeauthorcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
+\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
%
To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
@@ -61,11 +61,11 @@ wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019
Wearable haptics seems particularly suited for this context, as it takes into account many of the AR constraints, \eg limited impact on hand tracking performance and reduced impairment of the senses and ability of the users to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
-For example, \citeauthorcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
+For example, \textcite{pacchierotti2016hring} designed a haptic ring providing pressure and skin stretch sensations to be worn at the proximal finger phalanx, so as to improve the hand tracking during a pick-and-place task.
%
-\citeauthorcite{pezent2019tasbi} proposed Tasbi: a wristband haptic device capable of rendering vibrations and pressures.
+\textcite{pezent2019tasbi} proposed Tasbi: a wristband haptic device capable of rendering vibrations and pressures.
%
-\citeauthorcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
+\textcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
@@ -75,11 +75,11 @@ If it is indeed necessary to delocalize the haptic feedback, each of these posit
Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
%
-\citeauthorcite{sarac2022perceived} and \citeauthorcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
+\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
%
Results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is more significant when the visual feedback is limited.
%
-In pick-and-place tasks in AR involving both virtual and real objects, \citeauthorcite{maisto2017evaluation} and \citeauthorcite{meli2018combining} showed that having a haptic {rendering of the} fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
+In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic {rendering of the} fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
%


@@ -20,7 +20,7 @@ It might be therefore interesting to study how haptic and visual augmentations o
An additional challenge in AR is to leave the user's hand free to touch, feel, and interact with real objects~\autocite{maisto2017evaluation,detinguy2018enhancing,teng2021touch}.
%
-For example, mounted on the nail, the haptic device of \citeauthorcite{teng2021touch} can be quickly unfolded on demand to the fingertip to render haptic feedback of virtual objects.
+For example, mounted on the nail, the haptic device of \textcite{teng2021touch} can be quickly unfolded on demand to the fingertip to render haptic feedback of virtual objects.
%
It is however not suitable for rendering haptic feedback when touching real objects.
%
@@ -83,11 +83,11 @@ However, as they can be difficult to tune, measurement-based models have been de
%
In this work, we employed such data-driven haptic models to augment tangible surfaces in AR and studied the resulting visuo-haptic texture perception.%\CP{Here the original sentence was: ``We use these data-driven haptic models to augment [...].''. It was not clear what ``we use'' meant. Check that the new sentence is correct.}
-To evaluate the perception of virtual haptic textures, the same psycho-physical methods as for real materials are often used, as described by \citeauthorcite{okamoto2013psychophysical}.
+To evaluate the perception of virtual haptic textures, the same psycho-physical methods as for real materials are often used, as described by \textcite{okamoto2013psychophysical}.
%
-For example, when comparing the same virtual texture pairwise, but with different parameters, \citeauthorcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
+For example, when comparing the same virtual texture pairwise, but with different parameters, \textcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
%
-Similarly, \citeauthorcite{culbertson2014modeling} compared the similarity of all possible pairs between five real textures and their data-driven virtual equivalents, and rated their perceived properties in terms of hardness, roughness, friction, and smoothness.
+Similarly, \textcite{culbertson2014modeling} compared the similarity of all possible pairs between five real textures and their data-driven virtual equivalents, and rated their perceived properties in terms of hardness, roughness, friction, and smoothness.
%
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
%
@@ -100,34 +100,34 @@ In this user study, participants matched the pairs of visual and haptic textures
A few studies have explored vibrotactile haptic devices worn directly on the finger to render virtual textures on real surfaces.
%
-\citeauthorcite{ando2007fingernailmounted} mounted a vibrotactile actuator on the index nail, which generated impulse vibrations to render virtual edges and gaps on a real surface.
+\textcite{ando2007fingernailmounted} mounted a vibrotactile actuator on the index nail, which generated impulse vibrations to render virtual edges and gaps on a real surface.
%
%This rendering method was compared later to providing the vibrations with pressure directly on the fingertip in AR and was found more realistic to render virtual objects and textures~\autocite{teng2021touch}.
%
%Covering the fingertip is however not suitable for rendering haptic feedback when touching real objects.
%
-Using a voice-coil actuator worn on the middle index phalanx, \citeauthorcite{asano2015vibrotactile} altered the roughness perception of a grating surface with a \qty{250}{\Hz} vibrotactile stimulus.
+Using a voice-coil actuator worn on the middle index phalanx, \textcite{asano2015vibrotactile} altered the roughness perception of a grating surface with a \qty{250}{\Hz} vibrotactile stimulus.
%
Small amplitudes as a function of finger speed increased perceived roughness, whereas large constant amplitudes decreased it.
%
We used a similar approach, but to augment in AR the visuo-haptic texture perception of \emph{real} surfaces.
-%As alternative, \citeauthorcite{teng2021touch} have designed a wearable haptic device specifically for AR scenarios mounted on the nail that can unfold on demand on the finger pad.%
+%As alternative, \textcite{teng2021touch} have designed a wearable haptic device specifically for AR scenarios mounted on the nail that can unfold on demand on the finger pad.%
%While it as been perceived more realistic in rendering virtual textures, covering the finger pad is only suitable for rendering mid-air virtual objects.
%[[chan2021hasti]] tried to combine homogenous textures with patterned textures with vibrotactile in VR.
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
%
-The well-known psycho-physical model of \citeauthorcite{ernst2002humans} established that the sense with the least variability dominates perception.
+The well-known psycho-physical model of \textcite{ernst2002humans} established that the sense with the least variability dominates perception.
%
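The cue-combination claim above can be made concrete with a short worked equation. A minimal sketch of the maximum-likelihood integration model attributed here to \textcite{ernst2002humans}, with generic symbols chosen for illustration (not taken from the cited paper): $\hat{S}_V$ and $\hat{S}_H$ are the visual and haptic estimates of the same property, with variances $\sigma_V^2$ and $\sigma_H^2$.
\[
  \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H,
  \qquad
  w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
  \qquad
  \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2},
  \qquad i \in \{V, H\}.
\]
The modality with the smaller variance receives the larger weight, which is the sense in which it dominates the combined percept; the fused variance is never larger than that of either modality alone.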
This effect has been used to alter the texture perception in AR and VR.
%
For example, opaque virtual visual textures superimposed on real surfaces in AR can be perceived as coherent with those surfaces even though the two have very different roughnesses~\autocite{kitahara2010sensory}.
%
-\citeauthorcite{fradin2023humans} explored this effect further, finding that a superimposed AR visual texture slightly different from a colocalized haptic texture affected the ability to recognize the haptic texture.
+\textcite{fradin2023humans} explored this effect further, finding that a superimposed AR visual texture slightly different from a colocalized haptic texture affected the ability to recognize the haptic texture.
%
-Similarly, \citeauthorcite{punpongsanon2015softar} altered the softness perception of a tangible surface using AR-projected visual textures, whereas \citeauthorcite{chan2021hasti} evaluated audio-haptic texture perception in VR.
+Similarly, \textcite{punpongsanon2015softar} altered the softness perception of a tangible surface using AR-projected visual textures, whereas \textcite{chan2021hasti} evaluated audio-haptic texture perception in VR.
%
Conversely, colocalized 3D-printed real hair structures were able to correctly render several virtual visual textures seen in VR in terms of haptic hardness and roughness~\autocite{degraen2019enhancing}.
%