\section{Introduction}
\label{intro}
One approach to rendering virtual haptic textures consists in simulating the roughness of a periodic grating surface as a sinusoidal vibrotactile signal (\secref[related_work]{texture_rendering}).
The vibrations are rendered by a voice-coil actuator embedded in a hand-held tool or worn on the finger (\secref[related_work]{vhar_haptics}).
To create the illusion of touching a pattern with a fixed spatial period, the frequency of the signal must be modulated according to the finger movement.
Previous work either used mechanical systems to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
However, this rendering approach has not yet been integrated into an \AR headset context, where the user should be able to freely touch and explore visuo-haptic texture augmentations.
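As a minimal illustration of this modulation (the notation $\lambda$, $x(t)$, $A$, $s(t)$, $f(t)$ is introduced here only for this sketch), for a grating of spatial period $\lambda$ touched at position $x(t)$ along the surface, such a vibration can be written as
\[
    s(t) = A \sin\!\left( 2\pi \, \frac{x(t)}{\lambda} \right),
    \qquad
    f(t) = \frac{\dot{x}(t)}{\lambda},
\]
so that the instantaneous frequency $f(t)$ follows the finger speed $\dot{x}(t)$, which is why the finger position must be tracked accurately and with low latency.
\par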
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip).
The visuo-haptic augmentations rendered with this design allow a user to \textbf{see the textures from any angle} and \textbf{explore them freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable rendering, the hand and the real surfaces are tracked using a webcam and marker-based pose estimation.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture, synchronized with the finger movement on the augmented surface.
The goal of this design is to enable new \AR applications capable of augmenting real objects with virtual visuo-haptic textures in a portable, on-demand manner, without impairing the user's interaction with the \RE.
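To give a concrete idea of what such marker-based tracking involves, the following Python sketch (an illustration of the technique only, not the implementation described in this chapter) detects a fiducial marker in a webcam image with OpenCV and recovers its pose; the camera intrinsics, distortion coefficients, marker size, and marker dictionary are placeholder values, and the ArUco class names assume OpenCV~4.7 or later.
\begin{verbatim}
import cv2
import numpy as np

# Placeholder values: replace with the webcam's calibrated intrinsics,
# its distortion coefficients, and the printed marker's side length (m).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
marker_len = 0.02

# 3D corners of the marker in its own frame (top-left first, z = 0).
h = marker_len / 2.0
obj_pts = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                    [ h, -h, 0.0], [-h, -h, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary,
                                   cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # Pose of the first detected marker in the camera frame.
        found, rvec, tvec = cv2.solvePnP(obj_pts, corners[0][0], K, dist)
        print("marker", ids[0][0], "position (m):", tvec.ravel())
cap.release()
\end{verbatim}
The same principle, applied to markers on the actuator and on the augmented surface, gives the finger position relative to the texture.
\par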
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item The rendering of virtual vibrotactile roughness textures representing a patterned grating texture, in real time, from free finger movements, using vision-based finger pose estimation (a minimal sketch is given after this list).
\item A system providing coherent visuo-haptic texture augmentations of the \RE in a direct-touch context, using an \OST-\AR headset and wearable haptics.
\end{itemize}
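As a minimal sketch of the first contribution (illustrative only; the sample rates, grating period, amplitude, and helper function below are assumptions, not the values of the actual renderer), successive finger positions can be turned into a grating vibration by interpolating the position up to the output rate and taking the phase directly from position, which keeps the signal continuous while its frequency follows the finger speed:
\begin{verbatim}
import numpy as np

# Illustrative values only, not those of the actual system.
AUDIO_RATE = 48000      # vibrotactile output sample rate (Hz)
TRACK_RATE = 60         # finger tracking rate (Hz)
SPATIAL_PERIOD = 0.002  # grating period on the surface (m)
AMPLITUDE = 0.5         # normalized vibration amplitude

def render_block(prev_pos, new_pos):
    """Vibration samples between two successive finger positions (m).

    The position is linearly interpolated up to the output rate and the
    phase is taken directly from it, so the instantaneous frequency is
    finger speed / SPATIAL_PERIOD and stays continuous across blocks."""
    n = AUDIO_RATE // TRACK_RATE
    pos = np.linspace(prev_pos, new_pos, n, endpoint=False)
    return AMPLITUDE * np.sin(2.0 * np.pi * pos / SPATIAL_PERIOD)

# Example: finger accelerating over the grating,
# one position per tracking sample.
finger_positions = [0.000, 0.001, 0.003, 0.006]
signal = np.concatenate(
    [render_block(a, b)
     for a, b in zip(finger_positions, finger_positions[1:])])
\end{verbatim}
In practice, the resulting samples would be streamed to the voice-coil actuator, for instance through an audio output.
\par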
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the \RE and \VE are registered, the generation of the vibrotactile textures, and measurements of the visual and haptic rendering latencies.

\bigskip
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator, with a fiducial marker on top, attached to the middle phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for estimating the poses of the hand and the real surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
\subfig{apparatus}
\end{subfigs}