Fix xr-perception equation labels
@@ -36,7 +36,7 @@ The visuo-haptic texture rendering system is based on
 \item and a modulation of the signal frequency by the estimated finger speed with a phase matching.
 \end{enumerate*}
 %
-\figref{method/diagram} shows the diagram of the interaction loop and \eqref{signal} the definition of the vibrotactile signal.
+\figref{method/diagram} shows the diagram of the interaction loop and \eqref{xr_perception:signal} the definition of the vibrotactile signal.
 %
 The system is composed of three main components: the pose estimation of the tracked real elements, the visual rendering of the virtual environment, and the vibrotactile signal generation and rendering.
 
@@ -116,10 +116,10 @@ It is generated as a square wave audio signal, sampled at \qty{48}{\kilo\hertz},
 A sample $s_k$ of the audio signal at sampling time $t_k$ is given by:
 %
 \begin{subequations}
-\label{eq:signal}
+\label{eq:\labelprefix:signal}
 \begin{align}
-s(x_{f,j}, t_k) & = A \text{\,sgn} ( \sin (2 \pi \frac{\dot{x}_{f,j}}{\lambda} t_k + \phi_j) ) & \label{eq:signal_speed} \\
-\phi_j & = \phi_{j-1} + 2 \pi \frac{x_{f,j} - x_{f,{j-1}}}{\lambda} t_k & \label{eq:signal_phase}
+s(x_{f,j}, t_k) & = A \text{\,sgn} ( \sin (2 \pi \frac{\dot{x}_{f,j}}{\lambda} t_k + \phi_j) ) & \label{eq:\labelprefix:signal_speed} \\
+\phi_j & = \phi_{j-1} + 2 \pi \frac{x_{f,j} - x_{f,{j-1}}}{\lambda} t_k & \label{eq:\labelprefix:signal_phase}
 \end{align}
 \end{subequations}
 %
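
To make the sample definition in this hunk concrete, here is a minimal Python sketch, not the authors' implementation, of how one sample of the square-wave signal in \eqref{xr_perception:signal_speed} could be computed; the function name texture_sample, the amplitude A, and the wavelength lam are illustrative assumptions.

import numpy as np

# Illustrative sketch of one sample of the square-wave texture signal:
# the instantaneous frequency is the finger speed divided by the texture
# wavelength, and the square wave is the sign of the matching sinusoid.
def texture_sample(t_k, finger_speed, phi, A=1.0, lam=0.002):
    freq = finger_speed / lam  # speed-modulated frequency (Hz)
    return A * np.sign(np.sin(2.0 * np.pi * freq * t_k + phi))

# Usage: 10 ms of signal at the 48 kHz sampling rate stated above.
fs = 48_000
t = np.arange(480) / fs
samples = [texture_sample(t_k, finger_speed=0.05, phi=0.0) for t_k in t]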
@@ -133,7 +133,7 @@ This is important because it preserves the sensation of a constant spatial frequ
 %
 Note that the finger position and velocity are transformed from the camera frame $\mathcal{F}_c$ to the texture frame $\mathcal{F}_t$, with the $x$ axis aligned with the texture direction.
 %
-However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted as well with the frequency to ensure a continuity in the signal as described in \eqref{signal_phase}.
+However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted as well with the frequency to ensure a continuity in the signal as described in \eqref{xr_perception:signal_phase}.
 %
 This approach avoids sudden changes in the actuator movement that would affect the texture perception in an uncontrolled way (see \figref{method/phase_adjustment}) and, contrary to previous work~\cite{asano2015vibrotactile,friesen2024perceived}, it enables a free exploration of the texture by the user with no constraints on the finger speed.
 %
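
As a companion to the sketch above, the phase-matching step of \eqref{xr_perception:signal_phase} could be transcribed as follows; this follows the equation as written in the hunk above, and the names update_phase, x_f_j, x_f_prev, and lam are illustrative assumptions, not the authors' code.

import numpy as np

# Phase adjustment applied when a new finger position estimate arrives,
# transcribing \eqref{xr_perception:signal_phase}: the phase offset is
# advanced by the normalized finger displacement so that the square wave
# stays continuous when its frequency changes.
def update_phase(phi_prev, x_f_j, x_f_prev, t_k, lam=0.002):
    return phi_prev + 2.0 * np.pi * (x_f_j - x_f_prev) / lam * t_k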
|