Complete vhar_system conclusion

This commit is contained in:
2024-10-01 11:18:32 +02:00
parent 53a45d62a7
commit afb96c7ff1
10 changed files with 55 additions and 51 deletions

View File

@@ -9,7 +9,7 @@ In this chapter, we propose a \textbf{system for rendering visual and haptic vir
It is implemented with an immersive \OST-\AR headset (Microsoft HoloLens~2) and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the tangibles are tracked using a webcam and marker-based tracking.
The haptic textures are rendered as a real-time vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.
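For illustration, a minimal Python sketch of this speed-synchronized grating synthesis (the spatial period, amplitude, and function names are hypothetical, not those of the actual implementation):

```python
import numpy as np

def grating_signal(speeds, spatial_period=0.002, sample_rate=48000, amplitude=1.0):
    """Synthesize a sinusoidal grating vibration from sampled finger speeds.

    Sliding at speed v (m/s) over a grating of spatial period lambda (m)
    produces v / lambda ridges per second, so the instantaneous frequency
    is f = v / lambda. The phase is accumulated sample by sample so that
    frequency changes remain click-free.
    """
    inst_freq = np.asarray(speeds) / spatial_period           # Hz, per sample
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / sample_rate  # accumulated phase
    return amplitude * np.sin(phase)

# Example: finger accelerating from 0 to 0.1 m/s over one second;
# with a 2 mm period this sweeps the vibration from 0 to 50 Hz.
signal = grating_signal(np.linspace(0.0, 0.1, 48000))
```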
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -19,5 +19,13 @@ The haptic textures are rendered as a real-time vibrotactile signal representing
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.
% We describe a system for rendering vibrotactile roughness textures in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
% We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
\bigskip
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top, attached to the middle phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the tangible surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
\subfig{apparatus}
\end{subfigs}

View File

@@ -35,15 +35,6 @@ The system consists of three main components: the pose estimation of the tracked
\subsection{Pose Estimation}
\label{pose_estimation}
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to the middle-phalanx of the user's index finger.
]
\subfig[0.325]{device}
%\subfig[0.65]{headset}
%\par\vspace{2.5pt}
%\subfig[0.992]{apparatus}
\end{subfigs}
A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech), which is placed above the experimental setup and captures \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
Other markers are placed on the tangible surfaces to be augmented (\figref{setup}), so that the position of the finger relative to the surfaces can be estimated.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows any tangible surface to be augmented.
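As a sketch of this marker-based relative pose estimation, assuming hypothetical camera intrinsics and the pupil-apriltags Python bindings (the actual implementation may differ):

```python
import cv2
import numpy as np
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")
# Hypothetical intrinsics for a calibrated 1280x720 camera.
CAMERA_PARAMS = (900.0, 900.0, 640.0, 360.0)  # fx, fy, cx, cy

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t.ravel()
    return T

def finger_pose_in_surface_frame(frame, finger_tag_id, surface_tag_id):
    """Express the finger-marker pose in the surface-marker frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    tags = {d.tag_id: d for d in detector.detect(
        gray, estimate_tag_pose=True,
        camera_params=CAMERA_PARAMS, tag_size=0.02)}  # 2 cm markers
    if finger_tag_id not in tags or surface_tag_id not in tags:
        return None  # one of the markers is occluded in this frame
    T_cam_finger = to_homogeneous(tags[finger_tag_id].pose_R,
                                  tags[finger_tag_id].pose_t)
    T_cam_surface = to_homogeneous(tags[surface_tag_id].pose_R,
                                   tags[surface_tag_id].pose_t)
    # Chain the transforms: surface <- camera <- finger.
    return np.linalg.inv(T_cam_surface) @ T_cam_finger
```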
@@ -72,7 +63,7 @@ A \VST-\AR or a \VR headset could have been used as well.
\subsection{Vibrotactile Signal Generation and Rendering}
\label{texture_generation}
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{HapCoil-One specific characteristics are described in its data sheet: \url{https://web.archive.org/web/20240228161416/https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{HapCoil-One specific characteristics are described in its data sheet: \url{https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
The voice-coil actuator is encased in a \ThreeD printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (\figref{device}).
The actuator is driven by a class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil actuators \cite{mcmahan2014dynamic}.
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library\footnoteurl{https://github.com/naudio/NAudio}.
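The actual rendering uses NAudio and WASAPI in exclusive mode on the .NET side; purely as an illustrative analogue, a low-latency streaming loop can be sketched in Python with the sounddevice library (block size, globals, and constants are hypothetical):

```python
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 48000
SPATIAL_PERIOD = 0.002   # grating period in metres (hypothetical)
finger_speed = 0.0       # updated asynchronously by the tracking thread
_phase = 0.0

def callback(outdata, frames, time_info, status):
    """Fill each audio buffer with the current grating vibration."""
    global _phase
    freq = finger_speed / SPATIAL_PERIOD  # instantaneous frequency (Hz)
    t = (np.arange(frames) + 1) / SAMPLE_RATE
    outdata[:, 0] = np.sin(_phase + 2 * np.pi * freq * t).astype(np.float32)
    _phase = (_phase + 2 * np.pi * freq * frames / SAMPLE_RATE) % (2 * np.pi)

# A small block size keeps the output latency low.
stream = sd.OutputStream(samplerate=SAMPLE_RATE, channels=1,
                         blocksize=256, callback=callback)
stream.start()
```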

View File

@@ -3,6 +3,11 @@
%Summary of the research problem, method, main findings, and implications.
In this chapter, we designed and implemented a system for rendering virtual haptic grating textures on a real tangible surface touched directly with the fingertip, using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger, and allowing free explorative movements of the hand on the surface.
This tactile feedback was integrated with an immersive visual virtual environment, using an \OST-\AR headset, to provide users with a coherent multimodal visuo-haptic augmentation of the \RE, which can be switched between an \AR and a \VR view.
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real tangible surface.
When the surface is touched directly with the fingertip, its perceived roughness can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated into a direct touch context, for use with vision-based tracking of the finger, and paired it with an immersive \AR headset.
Our wearable visuo-haptic augmentation system enables any real surface to be augmented with a minimal setup.
It also allows free exploration of the textures, as if they were real (\secref[related_work]{ar_presence}): the user can view them from different poses and touch them with the bare finger, without constraints on hand movements.
The visual latency we measured is typical of \AR systems, and the haptic latency is below the perceptual detection threshold for vibrotactile rendering.
This system forms the basis of the apparatus for the user studies presented in the next two chapters, which evaluate the user perception of these visuo-haptic texture augmentations.

Binary file not shown.

View File

@@ -3,8 +3,8 @@ Among the various haptic texture augmentations, data-driven methods allow to cap
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of tangible surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the visuo-haptic system presented in \chapref{vhar_system}, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored the combination of the visuo-haptic texture pairs to rate their coherence, realism and perceived roughness.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}.%, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored, in direct touch, the combinations of the visuo-haptic texture pairs to rate their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.
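The \HaTT models are autoregressive (AR) filters whose coefficients and excitation variance were recorded as functions of scan speed and force \cite{culbertson2014one}; a simplified sketch of the synthesis for one already-interpolated operating point (the coefficient values here are hypothetical):

```python
import numpy as np
from scipy.signal import lfilter

def synthesize_texture(ar_coeffs, noise_variance, n_samples, rng=None):
    """Generate a texture vibration from one autoregressive model.

    The output is white noise shaped by the AR filter:
        y[n] = e[n] - sum_k a_k * y[n - k]
    In the HaTT pipeline, ar_coeffs and noise_variance would first be
    interpolated from the recorded (speed, force) grid.
    """
    rng = rng or np.random.default_rng()
    excitation = rng.normal(0.0, np.sqrt(noise_variance), n_samples)
    # lfilter(b, a, x): [1, a_1, ..., a_p] is the AR denominator.
    return lfilter([1.0], np.concatenate(([1.0], ar_coeffs)), excitation)

# Hypothetical low-order model, 0.1 s of signal at 48 kHz.
y = synthesize_texture(np.array([-0.9, 0.2]), noise_variance=1e-4, n_samples=4800)
```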
\noindentskip The contributions of this chapter are:

View File

@@ -2,7 +2,7 @@
\label{conclusion}
In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of tangible surfaces seen in immersive \OST-\AR and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered to a voice-coil worn on the middle phalanx of the index finger, based on the \HaTT data-driven models and finger speed.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.
The results showed that participants consistently identified and matched clusters of visual and haptic textures with similar perceived roughness.

View File

@@ -1,10 +1,8 @@
% Delivers the motivation for your paper. It explains why you did the work you did.
\noindent Most of the haptic augmentations of tangible surfaces using wearable haptic devices, including texture roughness (\secref[related_work]{texture_rendering}), have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR.
Still, it is known that the visual rendering of a tangible can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in the perceived simultaneity of the visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
Indeed, in \AR, the user can see their own hand touching the surface, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.
In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{tangible surface whose haptic roughness is augmented} with a wearable voice-coil device worn on the finger.
In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{tangible surface whose haptic roughness is augmented} with wearable haptics.%voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) that augment the tangible surface being touched.% touched by the finger.% that can be directly touched with the bare finger.
We evaluated, in a \textbf{user study with psychophysical methods and an extensive questionnaire}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
To control for the influence of the visual rendering, the tangible surface was not visually augmented and stayed the same in all conditions.

View File

@@ -2,7 +2,8 @@
\label{conclusion}
In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, being either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the tangible surface being touched.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of tangible surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the tangible surface being touched.
With an immersive \AR headset, which could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.