All computation steps except signal sampling are performed at \qty{60}{\hertz} and in separate threads to parallelize them.
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations~\cite{culbertson2018haptics}.
%
In this section, we describe a system for rendering vibrotactile roughness textures in real time on any tangible surface touched directly with the index fingertip, with no constraints on hand movement, using a simple camera to track the finger pose.
%
A fiducial marker (AprilTag) is glued to the top of the actuator (\figref{method/device}).
%
Other markers are placed on the tangible surfaces to be augmented, in order to estimate the relative position of the finger with respect to these surfaces (\figref{setup}).
%
Contrary to similar work, which either constrained the hand to move at a constant speed to keep the signal frequency constant~\cite{asano2015vibrotactile,friesen2024perceived} or used mechanical sensors attached to the hand~\cite{friesen2024perceived,strohmeier2017generating}, vision-based tracking both frees the hand movements and allows any tangible surface to be augmented.
%
A camera external to the AR/VR headset, combined with a marker-based technique, is employed to provide accurate and robust tracking with a constant view of the markers~\cite{marchand2016pose}.
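%
For illustration, the marker detection and pose estimation can be sketched as follows in Python, using OpenCV's ArUco module (which includes the AprilTag dictionaries) as a stand-in for our actual tracking pipeline; the camera intrinsics, marker size and dictionary are placeholder values.
\begin{verbatim}
import cv2
import numpy as np

# Marker dictionary and detector (OpenCV >= 4.7).
dictionary = cv2.aruco.getPredefinedDictionary(
    cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(
    dictionary, cv2.aruco.DetectorParameters())

MARKER_SIZE = 0.02  # marker edge length (m), placeholder
K = np.array([[900.0, 0.0, 640.0],   # placeholder intrinsics
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# 3D marker corners in the marker frame, in the order
# returned by detectMarkers (TL, TR, BR, BL).
s = MARKER_SIZE / 2
object_pts = np.array([[-s, s, 0], [s, s, 0],
                       [s, -s, 0], [-s, -s, 0]], dtype=np.float32)

def estimate_poses(frame):
    """Return (id, rvec, tvec) for each detected marker."""
    corners, ids, _ = detector.detectMarkers(frame)
    poses = []
    if ids is not None:
        for c, i in zip(corners, ids.flatten()):
            ok, rvec, tvec = cv2.solvePnP(
                object_pts, c.reshape(-1, 2), K, dist)
            if ok:
                poses.append((int(i), rvec, tvec))
    return poses
\end{verbatim}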
%
To reduce the noise in the pose estimation while maintaining good responsiveness, the 1€ filter~\cite{casiez2012filter} is applied.
%
It is a low-pass filter with an adaptive cutoff frequency, specifically designed for tracking human motion.
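%
For illustration, a minimal Python sketch of the 1€ filter follows; the default parameter values are illustrative, not those of our implementation.
\begin{verbatim}
import math

class OneEuroFilter:
    def __init__(self, min_cutoff=1.0, beta=0.05, d_cutoff=1.0):
        self.min_cutoff = min_cutoff  # cutoff at rest (Hz): less jitter
        self.beta = beta              # speed coefficient: less lag
        self.d_cutoff = d_cutoff      # cutoff of the derivative filter (Hz)
        self.x_prev = None
        self.dx_prev = 0.0
        self.t_prev = None

    @staticmethod
    def alpha(cutoff, dt):
        # Exponential smoothing factor for a given cutoff frequency.
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, t):
        if self.x_prev is None:
            self.x_prev, self.t_prev = x, t
            return x
        dt = t - self.t_prev
        # Low-pass filter the derivative of the signal.
        dx = (x - self.x_prev) / dt
        a_d = self.alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # Cutoff adapts to speed: smooth when slow, reactive when fast.
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self.alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat
\end{verbatim}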
%
In our implementation, the virtual hand and environment are designed with Unity.
%
The visual rendering is achieved using the Microsoft HoloLens~2, an OST-AR headset with a \qtyproduct{43 x 29}{\degree} field of view (FoV), a \qty{60}{\Hz} refresh rate, and self-localisation capabilities.
%
It was chosen over VST-AR because OST-AR only adds virtual content to the real environment, while VST-AR streams a real-time video capture of the real environment~\cite{macedo2023occlusion}.
%
Indeed, one of our objectives (\secref{experiment}) is to directly compare a virtual environment with the real one it replicates. %, rather than a video feed that introduces many supplementary visual limitations.
%
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal.
%
The voice-coil actuator is encased in a 3D-printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (\figref{method/device}).
%
The actuator is driven by a Class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil actuators~\cite{mcmahan2014dynamic}.
%
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library.
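%
For illustration, an analogous streaming pattern can be sketched in Python, with the sounddevice library standing in for NAudio and WASAPI; the fixed frequency and amplitude are placeholders for the velocity-modulated signal described below.
\begin{verbatim}
import numpy as np
import sounddevice as sd  # illustrative stand-in for NAudio + WASAPI

SAMPLE_RATE = 48000
FREQ = 100.0  # Hz, placeholder; modulated by finger velocity in practice
AMP = 0.8     # placeholder amplitude
phase = 0.0

def callback(outdata, frames, time, status):
    # Fill the output buffer with a phase-continuous square wave.
    global phase
    ph = phase + 2 * np.pi * FREQ * np.arange(frames) / SAMPLE_RATE
    outdata[:, 0] = np.where(np.sin(ph) >= 0, AMP, -AMP)
    phase = (phase + 2 * np.pi * FREQ * frames / SAMPLE_RATE) % (2 * np.pi)

with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1,
                     callback=callback):
    sd.sleep(2000)  # stream for two seconds
\end{verbatim}
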
The represented haptic texture is a series of parallel virtual grooves and ridges, similar to real grating textures manufactured for psychophysical roughness perception studies~\cite{friesen2024perceived,klatzky2003feeling,unger2011roughness}.
%
It is generated as a square wave audio signal, sampled at \qty{48}{\kilo\hertz}, with a period $\lambda$ (usually in the millimetre range) and an amplitude $A$.
%
A sample $s_k$ of the audio signal at sampling time $t_k$ is given by:
%
\begin{subequations}
\label{signal}
\begin{align}
    f_j &= \frac{\lvert \dot{x}_f(t_j) \rvert}{\lambda},\\
    \phi_j &= 2\pi f_{j-1} \left(t_j - t_{j-1}\right) + \phi_{j-1},\\
    s_k &= A \operatorname{sgn}\left(\sin\left(2\pi f_j \left(t_k - t_j\right) + \phi_j\right)\right),
\end{align}
\end{subequations}
where $t_j$ is the time of the latest finger pose estimate before $t_k$.
%
This is a common rendering method for vibrotactile textures, with well-defined parameters, that has been employed to modify the perceived haptic roughness of a tangible surface~\cite{asano2015vibrotactile,konyo2005tactile,ujitoko2019modulating}.
%
As the finger position is estimated at a far lower rate (\qty{60}{\hertz}) than the audio sampling rate, the finger position $x_f$ cannot be directly used to render the signal if the finger moves fast or if the texture period is small.
%
The best strategy is instead to modulate the frequency of the signal $s$ as the ratio of the finger velocity $\dot{x}_f$ to the texture period $\lambda$~\cite{friesen2024perceived}.
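%
For instance, a finger moving at \qty{50}{\milli\metre\per\second} across a texture of period $\lambda = \qty{2}{\milli\metre}$ yields a \qty{25}{\hertz} square wave.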
%
This is important because it preserves the sensation of a constant spatial frequency of the virtual texture while the finger moves at various speeds, which is crucial for the perception of roughness~\cite{klatzky2003feeling,unger2011roughness}.
%
Note that the finger position and velocity are transformed from the camera frame $\mathcal{F}_c$ to the texture frame $\mathcal{F}_t$, with the $x$ axis aligned with the texture direction.
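%
As an illustration, assuming the marker pose estimate gives the rotation $R$ and translation $t$ of $\mathcal{F}_t$ expressed in $\mathcal{F}_c$, this change of frame can be sketched as:
\begin{verbatim}
import numpy as np

def to_texture_frame(p_c, v_c, R, t):
    # R, t: pose of the texture frame F_t in the camera frame F_c,
    # i.e. p_c = R @ p_t + t for any point p_t on the texture.
    p_t = R.T @ (p_c - t)  # finger position in F_t
    v_t = R.T @ v_c        # finger velocity in F_t (t is constant)
    # The texture direction is the x axis of F_t, so only the x
    # components drive the signal: x_f and its derivative.
    return p_t[0], v_t[0]
\end{verbatim}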
%
However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted along with the frequency to ensure the continuity of the signal, as described in \eqref{signal}.
%
This approach avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (\figref{method/phase_adjustment}) and, contrary to previous work~\cite{asano2015vibrotactile,friesen2024perceived}, it enables a free exploration of the texture with no constraints on the finger speed.
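%
A minimal Python sketch of this rendering strategy follows; accumulating the phase incrementally at every audio sample is equivalent to the explicit phase adjustment of \eqref{signal}, and the parameter names and rates are illustrative.
\begin{verbatim}
import math

class SquareWaveTexture:
    def __init__(self, period_m, amplitude, sample_rate=48000):
        self.period_m = period_m     # texture period lambda (m)
        self.amplitude = amplitude   # signal amplitude A
        self.dt = 1.0 / sample_rate  # audio sampling period
        self.freq = 0.0              # current signal frequency (Hz)
        self.phase = 0.0             # current phase (rad)

    def on_pose_estimate(self, finger_velocity):
        # Called at 60 Hz: the frequency is the ratio of finger speed
        # to texture period; the accumulated phase is kept, so the
        # signal stays continuous across the frequency change.
        self.freq = abs(finger_velocity) / self.period_m

    def next_sample(self):
        # Called at 48 kHz by the audio thread: advance the phase at
        # the current frequency and emit a square wave sample.
        self.phase = (self.phase + 2.0 * math.pi * self.freq
                      * self.dt) % (2.0 * math.pi)
        return self.amplitude if math.sin(self.phase) >= 0.0 \
            else -self.amplitude
\end{verbatim}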
%
Finally, as in \textcite{ujitoko2019modulating}, a square wave is chosen over a sine wave to get a rendering closer to a real grating texture, with the sensation of crossing edges, and because the roughness perception of sine wave textures has been shown not to reproduce that of real grating textures~\cite{unger2011roughness}.
%
%And secondly, to be able to render the low frequencies that occur when the finger moves slowly or the texture period is large, as the actuator cannot render frequencies below \qty{\approx 20}{\Hz} with enough amplitude to be perceived with a pure sine wave signal.
%
Both are the result of latency in image capture \qty{16 +- 1}{\ms} and marker tracking.
%
The haptic loop also includes the voice-coil latency \qty{15}{\ms} (as specified by the manufacturer\footnotemark[1]), whereas the visual loop includes the latency in 3D rendering \qty{16 +- 5}{\ms} (60 frames per second) and display \qty{5}{\ms}.
%
The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback~\cite{okamoto2009detectability}.
%
The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking~\cite{knorlein2009influence}.
The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 +- 30}{\ms}.
Block a user