\section{Concept}
\label{principle}
\subsection{Virtual Environment Alignment}
\label{virtual_real_alignment}
Before a user interacts with the system, it is necessary to design a \VE that will be registered with the \RE during the experiment.
Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented surface (\figref{device}).
In addition, the pose and size of the virtual textures are defined on the virtual replicas.
During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts.
This allows the system to detect whether a finger touches a virtual texture, using a collision detection algorithm (Nvidia PhysX), and to display the virtual elements and textures in real time, aligned with the \RE, using the chosen \AR or \VR headset.
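As an illustration of this step (outside the actual Unity/PhysX pipeline, and with illustrative names and dimensions), the alignment and touch test can be sketched as follows: the marker pose estimate gives the pose of the virtual texture patch, and the tracked fingertip position is tested against the patch extent in the texture frame.

\begin{verbatim}
import numpy as np

def to_texture_frame(p_world, T_texture_world):
    # Express a world-space point in the texture patch frame,
    # T_texture_world being the 4x4 pose estimated from the marker.
    p_h = np.append(p_world, 1.0)
    return (np.linalg.inv(T_texture_world) @ p_h)[:3]

def fingertip_touches_texture(p_finger_world, T_texture_world,
                              half_extents=(0.05, 0.035, 0.002)):
    # Axis-aligned containment test in the texture frame
    # (half_extents, in metres, are illustrative values).
    p = to_texture_frame(p_finger_world, T_texture_world)
    return all(abs(p[i]) <= half_extents[i] for i in range(3))
\end{verbatim}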
In our implementation, the \VE is designed with Unity (v2021.1) and the Mixed Reality Toolkit (v2.7)\footnoteurl{https://learn.microsoft.com/windows/mixed-reality/mrtk-unity}.
A \VST-\AR or a \VR headset could have been used as well.
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{HapCoil-One specific characteristics are described in its data sheet: \url{https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
The voice-coil actuator is encased in a \ThreeD printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, so that the fingertip can directly touch the environment (\figref{device}).
The actuator is driven by a class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments).
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library\footnoteurl{https://github.com/naudio/NAudio}.
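For illustration only, a comparable low-latency output path can be set up in Python with the \texttt{sounddevice} library, which also exposes WASAPI exclusive mode; the actual system uses NAudio in C\#, and the \qty{100}{\Hz} test tone below merely stands in for the velocity-driven texture signal described next.

\begin{verbatim}
import numpy as np
import sounddevice as sd   # WASAPI exclusive mode is Windows-only

FS = 48_000                # audio sampling rate, as in the paper
t0 = 0.0                   # running time of the generated signal

def callback(outdata, frames, time, status):
    # Fill each audio block with a 100 Hz square wave
    # (placeholder for the texture signal).
    global t0
    t = t0 + np.arange(frames) / FS
    outdata[:, 0] = 0.5 * np.sign(np.sin(2 * np.pi * 100 * t))
    t0 = t[-1] + 1 / FS

wasapi = sd.WasapiSettings(exclusive=True)
with sd.OutputStream(samplerate=FS, channels=1, dtype='float32',
                     callback=callback, extra_settings=wasapi):
    sd.sleep(2000)         # play for two seconds
\end{verbatim}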
The rendered haptic texture is a 1D series of parallel virtual grooves and ridges, similar to the real linear grating textures manufactured for psychophysical roughness perception studies \secref[related_work]{roughness}.
It is generated as a square wave audio signal $r$, sampled at \qty{48}{\kilo\hertz}, with a texture period $\lambda$ and an amplitude $A$, similar to \eqref[related_work]{grating_rendering}.
Its frequency is the ratio of the absolute filtered (scalar) finger velocity $\dot{x} = \poseX{s}{|\hat{\dot{X}}|}{f}$ to the texture period $\lambda$ \cite{friesen2024perceived}.
As the finger moves horizontally on the texture, only the $X$ component of the velocity is used.
This velocity modulation strategy is necessary because the finger position is estimated at a far lower rate (\qty{60}{\hertz}) than the audio signal is generated (unlike with high-fidelity force-feedback devices \cite{unger2011roughness}).
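For example (with an arbitrary, illustrative texture period), a grating of period $\lambda = \qty{2}{\mm}$ explored at \qty{75}{\mm\per\second} yields a square wave frequency of $\dot{x}/\lambda = \qty{37.5}{\Hz}$, well within the \qtyrange{10}{1000}{\Hz} range of the actuator.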
When a new finger velocity $\dot{x}\,(t_j)$ is estimated at time $t_j$, the phase $\phi\,(t_j)$ of the signal $r$ must also be adjusted to ensure the continuity of the signal.
In other words, the audio signal is sampled at \qty{48}{\kilo\hertz}, while its frequency and phase are updated at the far lower rate of \qty{60}{\hertz}, whenever a new finger velocity is estimated.
A sample $r(t_j, t_k)$ of the audio signal at sampling time $t_k$, with $t_k \geq t_j$, is thus given by:
\begin{subequations}
\begin{align}
    r\,(t_j, t_k) &= A \operatorname{sgn}\!\left[\sin\!\left(2\pi \frac{\dot{x}\,(t_j)}{\lambda} (t_k - t_j) + \phi\,(t_j)\right)\right],\\
    \phi\,(t_j) &= 2\pi \frac{\dot{x}\,(t_{j-1})}{\lambda} (t_j - t_{j-1}) + \phi\,(t_{j-1}) \pmod{2\pi}. \label{signal_phase}
\end{align}
\end{subequations}
This rendering preserves the sensation of a constant spatial frequency of the virtual texture while the finger moves at various speeds, which is crucial for the perception of roughness \cite{klatzky2003feeling,unger2011roughness}.
The phase matching avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (\figref{phase_adjustment}) and, contrary to previous work \cite{asano2015vibrotactile,ujitoko2019modulating}, it enables free exploration of the texture by the user, with no constraints on the finger speed.
A square wave is chosen to obtain a rendering closer to a real grating texture, with the sensation of crossing edges \cite{ujitoko2019modulating}, and because the roughness perception of sine wave textures has been shown not to reproduce that of real grating textures \cite{unger2011roughness}.
A square wave also makes it possible to render low signal frequencies that occur when the finger moves slowly or the texture period is large, as the actuator cannot render a pure sine wave signal below \qty{\approx 20}{\Hz} with sufficient amplitude to be perceived.
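As a complementary sketch (illustrative Python code with assumed values, not the system's C\# implementation), the velocity-driven, phase-continuous square wave generation described above can be written as:

\begin{verbatim}
import numpy as np

FS = 48_000        # audio sampling rate (Hz)
A = 0.5            # signal amplitude (illustrative value)
LAMBDA = 0.002     # texture period lambda, in metres (illustrative)

phi = 0.0          # current phase phi(t_j)
freq = 0.0         # current frequency |x_dot| / lambda
t_last = 0.0       # time t_j of the last velocity update

def on_velocity_update(x_dot, t_j):
    # Called at ~60 Hz when a new filtered finger velocity arrives.
    # The phase is advanced with the previous frequency up to t_j,
    # so the square wave stays continuous across frequency changes.
    global phi, freq, t_last
    phi = (phi + 2 * np.pi * freq * (t_j - t_last)) % (2 * np.pi)
    freq = abs(x_dot) / LAMBDA
    t_last = t_j

def render_block(n, t_start):
    # Render n samples of the square wave starting at t_start >= t_last.
    t = t_start + np.arange(n) / FS
    return A * np.sign(np.sin(2 * np.pi * freq * (t - t_last) + phi))
\end{verbatim}

Connected to an exclusive-mode output stream such as the one sketched earlier, \texttt{render\_block} fills each audio block while \texttt{on\_velocity\_update} is driven by the finger tracking.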
The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 \pm 30}{\ms}.
This lag causes a distance error between the displayed virtual hand position and the real hand position, and thus a delay in the triggering of the vibrotactile signal.
This error is proportional to the speed of the finger, \eg a distance error of \qty{12 \pm 2.3}{\mm} when the finger moves at \qty{75}{\mm\per\second}.
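Indeed, the reported error matches the product of the lag and the finger speed: $\qty{160}{\ms} \times \qty{75}{\mm\per\second} \approx \qty{12}{\mm}$.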