Use \autocite

2024-06-26 19:02:04 +02:00
parent 74f2f271bc
commit e34039577b
12 changed files with 125 additions and 125 deletions


%
%And what if you could also feel its shape or texture?
%
%Such tactile augmentation is made possible by wearable haptic devices, which are worn directly on the finger or hand and can provide a variety of sensations on the skin, while being small, light and discreet~\autocite{pacchierotti2017wearable}.
%
Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in VR~\autocite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or AR~\autocite{maisto2017evaluation,meli2018combining,teng2021touch}.
%
They have also been used to alter the perception of roughness, stiffness, friction, and local shape perception of real tangible objects~\autocite{asano2015vibrotactile,detinguy2018enhancing,normand2024augmenting,salazar2020altering}.
%
Such techniques place the actuator \emph{close} to the point of contact with the real environment, leaving the user free to directly touch the tangible.
%
This combined use of wearable haptics with tangible objects enables a haptic \emph{augmented} reality (HAR)~\autocite{bhatia2024augmenting} that can provide rich and varied haptic feedback.
The degree of reality/virtuality in the visual and haptic sensory modalities can be varied independently, yet the combination of wearable haptic AR with visual VR and AR has been little explored~\autocite{choi2021augmenting,normand2024augmenting}.
%
Although AR and VR are closely related, they have significant differences that can affect the user experience~\autocite{genay2021virtual,macedo2023occlusion}.
%
%By integrating visual virtual content into the real environment, AR keeps the hand of the user, the haptic devices worn and the tangibles touched visible, unlike VR where they are hidden by immersing the user into a visual virtual environment.
%
%Current AR systems also suffer from display and rendering limitations not present in VR, affecting the user experience with virtual content that may be less realistic or inconsistent with the real augmented environment~\autocite{kim2018revisiting,macedo2023occlusion}.
%
It therefore seems necessary to investigate and understand the potential effect of these differences in visual rendering on the perception of haptically augmented tangible objects.
%
Previous works have shown, for example, that the stiffness of a virtual piston rendered with a force feedback haptic system seen in AR is perceived as less rigid than in VR~\autocite{gaffary2017ar} or when the visual rendering is ahead of the haptic rendering~\autocite{diluca2011effects,knorlein2009influence}.
%
%Taking our example from the beginning of this introduction, you now want to learn more about the context of the discovery of the ancient object or its use at the time of its creation by immersing yourself in a virtual environment in VR.
%
The goal of this paper is to study the role of the visual rendering of the hand (real or virtual) and its environment (AR or VR) on the perception of a tangible surface whose texture is augmented with a wearable vibrotactile device worn on the finger.
%
We focus on the perception of roughness, one of the main tactile sensations of materials~\autocite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations~\autocite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,normand2024augmenting,strohmeier2017generating,ujitoko2019modulating}.
%
By understanding how these visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with AR can be better applied and new visuo-haptic renderings adapted to AR can be designed.


\subsection{Augmenting Haptic Texture Roughness}
\sublabel{vibrotactile_roughness}
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
%
%Several approaches have been proposed to render virtual haptic texture~\autocite{culbertson2018haptics}.
%
%High-fidelity force feedback devices can reproduce patterned textures with great precision and provide similar perceptions to real textures, but they are expensive, have a limited workspace, and impose to hold a probe to explore the texture~\autocite{unger2011roughness}.
%
%As more traditional force feedback systems are unable to accurately render such micro-details on a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate interaction with the virtual texture~\autocite{culbertson2018haptics}.
%
%In this way, physics-based models~\autocite{chan2021hasti,okamura1998vibration} and data-based models~\autocite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but more approximate to real textures, and the latter being more realistic but limited to the captured textures.
%
%Notably, \citeauthorcite{okamura1998vibration} rendered grating textures with exponentially decaying sinudoids that simulated the strokes of the grooves and ridges of the surface, while \citeauthorcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
%
An effective approach to rendering virtual roughness is to generate vibrations that simulate the interaction with the virtual texture~\autocite{culbertson2018haptics}, relying on real-time measurements of the user's position, velocity and force. % to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\autocite{culbertson2015should}.
%
The perceived roughness of real surfaces can then be modified when they are touched by a tool with a vibrotactile actuator attached~\autocite{culbertson2014modeling,ujitoko2019modulating} or directly with a finger wearing the vibrotactile actuator~\autocite{asano2015vibrotactile,normand2024augmenting}, creating a haptic texture augmentation.
%
%The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\autocite{bhatia2024augmenting,jeon2009haptic}.
%
One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\autocite{ando2007fingernailmounted,friesen2024perceived,normand2024visuohaptic,teng2021touch}.
%
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture-specific and similar between individuals~\autocite{manfredi2014natural}.
%
A common method for vibrotactile rendering of texture is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\autocite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
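%
For a texture of spatial period $\lambda$ explored at finger position $x_f(t)$, such a signal can for instance be written $s(t) = A \sin\left(2\pi\, x_f(t)/\lambda\right)$: its instantaneous frequency $\dot{x}_f/\lambda$ then grows with the exploration speed.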
%
It remains unclear whether such vibrotactile texture augmentation is perceived the same when integrated into visual AR or VR environments or touched with a virtual hand instead of the real hand.
%
%We also add a phase adjustment to this sinusoidal signal to allow free exploration movements of the finger with a simple camera-based tracking system.
%Another approach is to use ultrasonic vibrating screens, which are able to modulate their friction~\autocite{brahimaj2023crossmodal,rekik2017localized}.
%
%Combined with vibrotactile rendering of roughness using a voice-coil actuator attached to the screen, they can produce realistic haptic texture sensations~\autocite{ito2019tactile}.
%
%However, this method is limited to the screen and does not easily allow rendering textures on virtual (visual) objects or altering the perception of real surfaces.
When the same object property is sensed simultaneously by vision and touch, the two sensory estimates are integrated into a single percept.
The psychophysical model of \citeauthorcite{ernst2002humans} established that the sense with the least variability dominates perception.
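%
In this maximum-likelihood view, for a visual estimate $\hat{s}_v$ with variance $\sigma_v^2$ and a haptic estimate $\hat{s}_h$ with variance $\sigma_h^2$, the combined percept is a reliability-weighted average:
\begin{equation*}
\hat{s}_{vh} = w_v\,\hat{s}_v + w_h\,\hat{s}_h,
\qquad
w_v = \frac{\sigma_h^2}{\sigma_v^2 + \sigma_h^2},
\qquad
w_h = \frac{\sigma_v^2}{\sigma_v^2 + \sigma_h^2},
\qquad
\sigma_{vh}^2 = \frac{\sigma_v^2\,\sigma_h^2}{\sigma_v^2 + \sigma_h^2}.
\end{equation*}
The less variable sense receives the larger weight, hence its dominance over the combined percept.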
%
%In particular, this effect has been used to better understand the visuo-haptic perception of texture and to design better feedback for virtual objects.
Particularly for real textures, it is known that both touch and sight individually perceive textures equally well and similarly~\autocite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.
%
Thus, the overall perception can be modified by changing one of the modalities, as shown by \citeauthorcite{yanagisawa2015effects}, who altered the perception of roughness, stiffness and friction of some real tactile textures touched by the finger by superimposing different real visual textures using a half-mirror.
%
%Similarly but in VR, \citeauthorcite{degraen2019enhancing} combined visual textures with different passive haptic hair-like structure that were touched with the finger to induce a larger set of visuo-haptic materials perception.
%
%\citeauthorcite{gunther2022smooth} studied in a complementary way how the visual rendering of a virtual object touching the arm with a tangible object influenced the perception of roughness.
Likewise, visual textures were combined in VR with various tangible objects to induce a larger set of visuo-haptic material perceptions, in both active touch~\autocite{degraen2019enhancing} and passive touch~\autocite{gunther2022smooth} contexts.
%
\citeauthorcite{normand2024augmenting} also investigated the roughness perception of tangible surfaces touched with the finger and augmented with visual textures in AR and with wearable vibrotactile textures.
%
%A common finding of these studies is that haptic sensations seem to dominate the perception of roughness, suggesting that a smaller set of haptic textures can support a larger set of visual textures.
%
Conversely, virtual hand rendering is also known to influence how an object is grasped in VR~\autocite{prachyabrued2014visual,blaga2020too} and AR~\autocite{normand2024visuohaptic}, or even how real bumps and holes are perceived in VR~\autocite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
% \autocite{degraen2019enhancing} and \autocite{gunther2022smooth} showed that the visual rendering of a virtual object can influence the perception of its haptic properties.
% \autocite{yanagisawa2015effects} with real visual textures superimposed on touched real textures affected the perception of the touched textures.
A few works have also used pseudo-haptic feedback to change the perception of haptic stimuli to create richer feedback by deforming the visual representation of a user input~\autocite{ujitoko2021survey}.
%
For example, %different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\autocite{achibet2017flexifingers} or
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\autocite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\autocite{choi2021augmenting}.
%
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\autocite{ujitoko2019modulating}.
%
%However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
%
Previous work rendered a virtual piston pressed with one's real hand using a video see-through (VST) AR headset.
%
In a similar setup, but with an optical see-through (OST) AR headset, \citeauthorcite{gaffary2017ar} found that the virtual piston was perceived as less stiff in AR than in VR, without participants noticing this difference.
%
Using a VST-AR headset has notable consequences, as the \enquote{real} view of the environment and the hand is actually a visual stream from a camera, with a noticeable delay and a lower quality (\eg resolution, frame rate, field of view) compared to the direct view of the real environment in OST-AR~\autocite{macedo2023occlusion}.
%
While a large literature has investigated these differences in visual perception in AR and VR, \eg showing that distances are underestimated~\autocite{adams2022depth,peillard2019studying}, less is known about visuo-haptic perception in AR and VR.
%
In this work we studied (1) the perception of a \emph{haptic texture augmentation} of a tangible surface and (2) the possible influence of the visual rendering of the environment (OST-AR or VR) and the hand touching the surface (real or virtual) on this perception.


All computation steps except signal sampling are performed at 60~Hz and in separate threads to parallelize them.
}
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations~\autocite{culbertson2018haptics}.
%
In this section, we describe a system for rendering vibrotactile roughness texture in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
A fiducial marker (AprilTag) is glued to the top of the actuator.
%
Other markers are placed on the tangible surfaces to be augmented, in order to estimate the relative position of the finger with respect to the surfaces (see \figref{setup}).
%
Contrary to similar work, which either constrained the hand to a constant speed to keep the signal frequency constant~\autocite{asano2015vibrotactile,friesen2024perceived} or used mechanical sensors attached to the hand~\autocite{friesen2024perceived,strohmeier2017generating}, vision-based tracking both frees the hand movements and makes it possible to augment any tangible surface.
%
A camera external to the AR/VR headset is employed with a marker-based technique to provide accurate and robust tracking and a constant view of the markers~\autocite{marchand2016pose}.
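%
As an illustration, the following minimal sketch detects the finger tag and estimates its position in the camera frame with OpenCV's ArUco module; the \qty{2}{\cm} tag size matches our setup, but the intrinsics and helper names are assumptions for the example, not our exact implementation:

```python
import cv2
import numpy as np

# Assumed camera intrinsics, for illustration only.
K = np.array([[900.0, 0.0, 640.0], [0.0, 900.0, 360.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)   # assume negligible lens distortion
TAG_SIZE = 0.02      # 2 cm AprilTag edge length, in metres

# Corners of the tag in its own frame, matching the detector's corner order.
obj = 0.5 * TAG_SIZE * np.array(
    [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11),
    cv2.aruco.DetectorParameters())

def finger_position(gray):
    """Return the tag position (m) in the camera frame, or None if not seen."""
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(-1, 2), K, dist)
    return tvec.ravel() if ok else None
```

The same estimation applied to the surface markers gives the finger pose relative to the texture frame, and finite differences of successive positions give the velocity fed to the filter described below.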
%
To reduce the noise in the pose estimation while maintaining a good responsiveness, the 1€ filter~\autocite{casiez2012filter} is applied.
%
It is a low-pass filter with an adaptive cutoff frequency, specifically designed for tracking human motion.
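%
For reference, the filter takes only a few lines; this sketch follows the published formulation, with illustrative parameter values:

```python
import math

class OneEuroFilter:
    """1-euro filter (Casiez et al., CHI 2012): a low-pass filter whose
    cutoff frequency adapts to the speed of the tracked signal."""

    def __init__(self, freq, min_cutoff=1.0, beta=0.007, d_cutoff=1.0):
        self.freq = freq              # sampling frequency (Hz), here 60
        self.min_cutoff = min_cutoff  # minimum cutoff frequency (Hz)
        self.beta = beta              # speed coefficient (jitter vs. lag)
        self.d_cutoff = d_cutoff      # fixed cutoff for the derivative (Hz)
        self.x_prev = None
        self.dx_prev = 0.0

    def _alpha(self, cutoff):
        # exponential smoothing factor of a first-order low-pass filter
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.freq)

    def __call__(self, x):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # smooth the derivative of the signal with a fixed cutoff
        dx = (x - self.x_prev) * self.freq
        a_d = self._alpha(self.d_cutoff)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # the faster the movement, the higher the cutoff (less smoothing)
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```

The cutoff frequency grows with the estimated speed, so slow movements are strongly smoothed (less jitter) while fast movements keep a low lag.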
%
In our implementation, the virtual hand and environment are designed with Unity.
%
The visual rendering is achieved using the Microsoft HoloLens~2, an OST-AR headset with a \qtyproduct{43 x 29}{\degree} field of view (FoV), a \qty{60}{\Hz} refresh rate, and self-localisation capabilities.
%
It was chosen over VST-AR because OST-AR only adds virtual content to the real environment, while VST-AR streams a real-time video capture of the real environment~\autocite{macedo2023occlusion}.
%
Indeed, one of our objectives (see \secref{xr_perception:experiment}) is to directly compare a real environment with a virtual environment that replicates it. %, rather than a video feed that introduces many supplementary visual limitations.
%
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile texture.
%
The voice-coil actuator is encased in a 3D printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (see \figref{method/device}).
%
The actuator is driven by a Class~D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil actuators~\autocite{mcmahan2014dynamic}.
%
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library.
The represented haptic texture is a series of parallel virtual grooves and ridges, similar to the real grating textures manufactured for psychophysical roughness perception studies~\autocite{friesen2024perceived,klatzky2003feeling,unger2011roughness}.
%
It is generated as a square wave audio signal, sampled at \qty{48}{\kilo\hertz}, with a period $\lambda$ (usually in the millimetre range) and an amplitude $A$.
%
A sample $s_k$ of the audio signal at sampling time $t_k$ is given by:
\begin{subequations}
\begin{align}
	s_k &= A \operatorname{sgn}\left(\sin\left(2\pi f_j\, t_k + \phi_j\right)\right), & f_j &= \frac{\dot{x}_f}{\lambda}\\
	\phi_j &= 2\pi \left(f_{j-1} - f_j\right) t_j + \phi_{j-1} \label{xr_perception:signal_phase}
\end{align}
\end{subequations}
%
This is a common rendering method for vibrotactile textures, with well-defined parameters, that has been employed to modify the perceived haptic roughness of a tangible surface~\autocite{asano2015vibrotactile,konyo2005tactile,ujitoko2019modulating}.
%
As the finger position is estimated at a far lower rate (\qty{60}{\hertz}) than the audio signal, the finger position $x_f$ cannot be directly used to render the signal if the finger moves fast or if the texture period is small.
%
The best strategy instead is to modulate the frequency of the signal $s$ as the ratio of the finger velocity $\dot{x}_f$ to the texture period $\lambda$~\autocite{friesen2024perceived}.
%
This is important because it preserves the sensation of a constant spatial frequency of the virtual texture while the finger moves at various speeds, which is crucial for the perception of roughness~\autocite{klatzky2003feeling,unger2011roughness}.
%
Note that the finger position and velocity are transformed from the camera frame $\mathcal{F}_c$ to the texture frame $\mathcal{F}_t$, with the $x$ axis aligned with the texture direction.
%
However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted along with the frequency to ensure the continuity of the signal, as described in \eqref{xr_perception:signal_phase}.
%
This approach avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (see \figref{method/phase_adjustment}) and, contrary to previous work~\autocite{asano2015vibrotactile,friesen2024perceived}, it enables a free exploration of the texture by the user, with no constraints on the finger speed.
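%
To make the loop concrete, here is a minimal sketch of this phase-continuous synthesis in Python, using the sounddevice library as a stand-in for the WASAPI/NAudio output described above; the parameter values and function names are illustrative:

```python
import numpy as np
import sounddevice as sd  # stand-in for the WASAPI/NAudio audio output

FS = 48_000      # audio sampling rate (Hz)
A = 0.5          # texture amplitude A (normalised signal units)
LAM = 0.002      # texture period lambda (m)

freq = 0.0       # current signal frequency f_j = |v_finger| / lambda (Hz)
phase = 0.0      # current phase offset phi_j (rad)
t = 0.0          # running signal time (s)

def on_finger_update(velocity):
    """Called at ~60 Hz with the filtered finger velocity (m/s).

    Adjusts the phase together with the frequency so the square wave stays
    continuous across the update (no jump of the actuator).  A real
    implementation would synchronise access to the shared state between
    the tracking and audio threads."""
    global freq, phase
    f_new = abs(velocity) / LAM
    # continuity at time t: 2*pi*freq*t + phase == 2*pi*f_new*t + phase_new
    phase = 2 * np.pi * (freq - f_new) * t + phase
    freq = f_new

def audio_callback(outdata, frames, time_info, status):
    """Renders the next audio block of the square-wave texture signal."""
    global t
    ts = t + np.arange(frames) / FS
    outdata[:, 0] = A * np.sign(np.sin(2 * np.pi * freq * ts + phase))
    t += frames / FS

stream = sd.OutputStream(samplerate=FS, channels=1, callback=audio_callback)
stream.start()  # the voice-coil is driven through the amplified audio output
```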
%
Finally, as in \citeauthorcite{ujitoko2019modulating}, a square wave is chosen over a sine wave to obtain a rendering closer to a real grating texture, with the sensation of crossing edges, and because the roughness perception of sine wave textures has been shown not to reproduce that of real grating textures~\autocite{unger2011roughness}.
%
%And secondly, to be able to render the low frequencies that occur when the finger moves slowly or the texture period is large, as the actuator cannot render frequencies below \qty{\approx 20}{\Hz} with enough amplitude to be perceived with a pure sine wave signal.
%
Both are the result of latency in image capture \qty{16 +- 1}{\ms} and markers tracking \qty{2 +- 1}{\ms}.
%
The haptic loop also includes the voice-coil latency \qty{15}{\ms} (as specified by the manufacturer\footnotemark[1]), whereas the visual loop includes the latency in 3D rendering \qty{16 +- 5}{\ms} (60 frames per second) and display \qty{5}{\ms}.
%
The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback~\autocite{okamoto2009detectability}.
%
The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking~\autocite{knorlein2009influence}.
The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 +- 30}{\ms}.
%


The user study aimed to investigate the effect of the visual hand rendering in AR or VR on the perception of the vibrotactile texture augmentation.
%
In a two-alternative forced choice (2AFC) task, participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (see \figref{renderings}, \level{Real}), in AR with a realistic virtual hand superimposed on the real hand (see \figref{renderings}, \level{Mixed}), and in VR with the same virtual hand as an avatar (see \figref{renderings}, \level{Virtual}).
%
In order not to influence the perception, as vision is an important source of information for the perception of texture~\autocite{bergmanntiest2007haptic,yanagisawa2015effects,normand2024augmenting,vardar2019fingertip}, the touched surface was visually a uniform white; thus only the visual aspect of the hand and the surrounding environment was changed.
\subsection{Participants}
The virtual hand model was a gender-neutral human right hand with realistic skin.
%
Its size was adjusted to match the real hand of the participants before the experiment.
%
%An OST-AR headset (Microsoft HoloLens~2) was chosen over a VST-AR headset because the former only adds virtual content to the real environment, while the latter streams a real-time video capture of the real environment, and one of our objectives was to directly compare a virtual environment replicating a real one, not to a video feed that introduces many other visual limitations~\autocite{macedo2023occlusion}.
%
The visual rendering of the virtual hand and environment is described in \secref{xr_perception:virtual_real_alignment}.
%
In the \level{Mixed} and \level{Real} conditions, the mask had two additional holes.
%
%The position of the finger relative to the sheet was estimated using a webcam placed on top of the box (StreamCam, Logitech) and the OpenCV library, by tracking a \qty{2}{\cm} square fiducial marker (AprilTag) glued to the top of the vibrotactile actuator.
%
%The total texture latency was measured to \qty{36 \pm 4}{\ms}, as a result of latency in image acquisition \qty{16 \pm 1}{\ms}, fiducial marker detection \qty{2 \pm 1}{\ms}, audio sampling \qty{3 \pm 1}{\ms}, and the vibrotactile actuator latency (\qty{15}{\ms}, as specified by the manufacturer\footnotemark[1]), and was below the \qty{60}{\ms} threshold for vibrotactile feedback \autocite{okamoto2009detectability}.
%
%The virtual hand followed the position of the fiducial marker with a slightly higher latency due to the network synchronization \qty{4 \pm 1}{\ms} between the computer and the HoloLens~2.


The \level{Real} rendering had the highest PSE (\percent{7.9} \ci{1.2}{4.1}).
%
The JND represents the estimated minimum amplitude difference between the comparison and reference textures that participants could perceive,
% \ie the sensitivity to vibrotactile roughness differences,
calculated at the 84th percentile of the predictions of the GLMM (\ie one standard deviation of the normal distribution)~\autocite{ernst2002humans}.
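%
For illustration, the same two quantities can be read off a cumulative-Gaussian psychometric function fitted to the pooled 2AFC responses; the data and parameter values in this Python sketch are made up, and the study itself derives them from the GLMM predictions:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

def psychometric(x, pse, sigma):
    """P(comparison judged rougher) vs. relative amplitude difference x."""
    return norm.cdf(x, loc=pse, scale=sigma)

# Illustrative 2AFC data: amplitude differences between comparison and
# reference, and proportion of "comparison rougher" answers per level.
x = np.array([-0.4, -0.2, -0.1, 0.0, 0.1, 0.2, 0.4])
y = np.array([0.05, 0.20, 0.35, 0.45, 0.60, 0.80, 0.97])

(pse, sigma), _ = curve_fit(psychometric, x, y, p0=(0.0, 0.2))

# JND: distance from the PSE to the 84th percentile of the fitted curve,
# i.e. one standard deviation of the underlying normal distribution.
jnd = norm.ppf(0.84, loc=pse, scale=sigma) - pse
```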
%
The \level{Real} rendering had the lowest JND (\percent{26} \ci{23}{29}), the \level{Mixed} rendering had the highest (\percent{33} \ci{30}{37}), and the \level{Virtual} rendering was in between (\percent{30} \ci{28}{32}).
%


Surprisingly, the PSE of the \level{Real} rendering was shifted to the right.
%
The sensitivity of participants to roughness differences (just-noticeable differences, JND) also varied between all the visual renderings, with the \level{Real} rendering having the best JND (\percent{26}), followed by the \level{Virtual} (\percent{30}) and \level{Mixed} (\percent{33}) renderings (see \figref{results/trial_jnds}).
%
These JND values are in line with, and at the upper end of, the range of previous studies~\autocite{choi2013vibrotactile}, which may be due to the location of the actuator on the top of the middle phalanx of the finger, which is less sensitive to vibration than the fingertip.
%
Thus, compared to no visual rendering (\level{Real}), the addition of a visual rendering of the hand or environment reduced the roughness sensitivity (JND) and the average roughness perception (PSE), as if the virtual haptic textures felt \enquote{smoother}.
Thereby, we hypothesise that the differences in the perception of vibrotactile roughness between the visual renderings stem from the latency and tracking inaccuracies of the virtual hand relative to the real hand.
%
\citeauthorcite{diluca2011effects} demonstrated, in a VST-AR setup, how visual latency relative to proprioception increased the perception of stiffness of a virtual piston, while haptic latency decreased it.
%
Another complementary explanation could be a pseudo-haptic effect of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen in a non-immersive context~\autocite{ujitoko2019modulating}.
%
Such hypotheses could be tested by manipulating the latency and tracking accuracy of the virtual hand or the vibrotactile feedback. % to observe their effects on the roughness perception of the virtual textures.
The main limitation of our study is, of course, the absence of a visual representation of the touched virtual texture.
%
This is indeed a source of information as important as the haptic sensations for the perception of both real textures~\autocite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures~\autocite{degraen2019enhancing,gunther2022smooth,normand2024augmenting}.
%
%Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive AR or VR context, as the visuo-haptic coupling of such grating textures is not trivial~\autocite{unger2011roughness} even with real textures~\autocite{klatzky2003feeling}.
%
The interaction between the visual and haptic sensory modalities is complex and deserves further investigation, in particular in the context of visuo-haptic AR.
%


We then investigated, with a psychophysical user study, the effect of the visual rendering of the hand and environment on the perception of this texture augmentation.
%Only the amplitude $A$ varied between the reference and comparison textures to create the different levels of roughness.
%
%Participants were not informed there was a reference and comparison textures, and
No texture was represented visually, to avoid any influence on the perception~\autocite{bergmanntiest2007haptic,normand2024augmenting,yanagisawa2015effects}.