Complete vhar_system conclusion

2024-10-01 11:18:32 +02:00
parent 53a45d62a7
commit afb96c7ff1
10 changed files with 55 additions and 51 deletions


@@ -9,7 +9,7 @@ In this chapter, we propose a \textbf{system for rendering visual and haptic vir
It is implemented with an immersive \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the tangibles are tracked using a webcam and fiducial markers.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.
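For reference, the standard 1D sinusoidal grating model mentioned here can be written as follows (the symbols $u$, $A$, $x$, $\lambda$ are generic notation for this sketch, not necessarily the chapter's):

```latex
% u(t): actuator drive signal; A: amplitude; x(t): finger position along
% the surface; \lambda: spatial period of the grating (assumed notation)
u(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\qquad
f(t) = \frac{1}{2\pi}\,\frac{\mathrm{d}}{\mathrm{d}t}\!\left(\frac{2\pi\, x(t)}{\lambda}\right)
     = \frac{v(t)}{\lambda},
```

so the instantaneous vibration frequency $f(t)$ follows the finger speed $v(t)$ divided by the spatial period, which is what synchronizing the signal with the finger movement amounts to.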
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -19,5 +19,13 @@ The haptic textures are rendered as a real-time vibrotactile signal representing
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.
% We describe a system for rendering vibrotactile roughness textures in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
% We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
\bigskip
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to the middle-phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the tangible surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
\subfig{apparatus}
\end{subfigs}


@@ -35,15 +35,6 @@ The system consists of three main components: the pose estimation of the tracked
\subsection{Pose Estimation}
\label{pose_estimation}
A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech) placed above the experimental setup, capturing \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
Other markers are placed on the tangible surfaces to be augmented (\figref{setup}), so that the position of the finger relative to the surfaces can be estimated.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows any tangible surface to be augmented.
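The relative pose described above can be sketched as a change of reference frame: given the camera-frame poses of the finger marker and of a surface marker, the finger pose in the surface frame is obtained by composing one pose with the inverse of the other. This is a minimal illustration with hypothetical function names, not the chapter's actual code:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix
    and a 3-vector translation (both expressed in the camera frame)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def finger_in_surface_frame(T_cam_finger, T_cam_surface):
    """Express the finger marker pose in the surface marker frame:
    T_surface->finger = inv(T_cam->surface) @ T_cam->finger."""
    return np.linalg.inv(T_cam_surface) @ T_cam_finger
```

For example, with both markers facing the camera at \qty{50}{\cm} and the finger offset by \qty{10}{\cm} along the surface, the relative translation is simply that offset:

```python
I = np.eye(3)
T_finger = pose_to_matrix(I, [0.10, 0.00, 0.50])   # finger marker, camera frame
T_surface = pose_to_matrix(I, [0.00, 0.00, 0.50])  # surface marker, camera frame
rel = finger_in_surface_frame(T_finger, T_surface)
# rel[:3, 3] is now approximately [0.10, 0.0, 0.0]
```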
@@ -72,7 +63,7 @@ A \VST-\AR or a \VR headset could have been used as well.
\subsection{Vibrotactile Signal Generation and Rendering}
\label{texture_generation}
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{HapCoil-One specific characteristics are described in its data sheet: \url{https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
The voice-coil actuator is encased in a \ThreeD printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (\figref{device}).
The actuator is driven by a class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil \cite{mcmahan2014dynamic}.
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library\footnoteurl{https://github.com/naudio/NAudio}.
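Since the signal is generated as audio samples, the grating rendering can be sketched as block-wise synthesis in which the phase is carried across blocks so that speed changes shift the instantaneous frequency without clicks. The constants and function name below are illustrative assumptions, not values from the chapter:

```python
import math

SAMPLE_RATE = 48_000   # audio output rate in Hz (illustrative value)
WAVELENGTH = 0.002     # spatial period of the grating in m (illustrative value)

def render_block(speed, n_samples, phase, amplitude=1.0):
    """Render one audio block of the sinusoidal grating for a given
    finger speed (m/s). The phase (in cycles) is returned and fed back
    into the next block, so the instantaneous frequency
    (speed / WAVELENGTH) can change without waveform discontinuities."""
    out = []
    dphi = speed / (WAVELENGTH * SAMPLE_RATE)  # cycles advanced per sample
    for _ in range(n_samples):
        phase = (phase + dphi) % 1.0
        out.append(amplitude * math.sin(2.0 * math.pi * phase))
    return out, phase
```

With these example values, a finger moving at \qty{0.1}{\m\per\s} over a \qty{2}{\mm} period produces a \qty{50}{\Hz} vibration, well within the actuator's \qtyrange{10}{1000}{\Hz} range.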


@@ -3,6 +3,11 @@
%Summary of the research problem, method, main findings, and implications.
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real tangible surface.
When the surface is touched directly with the fingertip, its perceived roughness can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated into a direct-touch context, for use with vision-based tracking of the finger, and paired it with an immersive \AR headset.
Our wearable visuo-haptic augmentation system enables any real surface to be augmented with a minimal setup.
It also allows free exploration of the textures, as if they were real (\secref[related_work]{ar_presence}), by letting the user view them from different poses and touch them with the bare finger without constraints on hand movements.
The visual latency we measured is typical of \AR systems, and the haptic latency is below the perceptual detection threshold for vibrotactile rendering.
This system forms the basis of the apparatus for the user studies presented in the next two chapters, which evaluate the user perception of these visuo-haptic texture augmentations.
