One approach to render virtual haptic textures consists of simulating the roughness of a periodic grating surface as a vibrotactile sinusoidal signal (\secref[related_work]{texture_rendering}).
The vibrations are rendered by a voice-coil actuator embedded in a hand-held tool or worn on the finger, but to create the illusion of touching a pattern with a fixed spatial period, the frequency of the signal must be modulated according to the finger movement.
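This relation between finger movement and signal frequency can be made explicit. In the formulation below (the symbols are ours, chosen for illustration), the phase of the sinusoid is tied to the finger position $x(t)$ along a grating of spatial period $\lambda$:

```latex
% Position-driven sinusoid: the phase advances with travelled
% distance, so the perceived spatial period stays fixed at any speed.
\[
  s(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
  \qquad
  f(t) = \frac{\dot{x}(t)}{\lambda},
\]
```

so a constant finger speed $\dot{x}$ yields the constant frequency assumed in prior work, while a freely varying speed modulates $f(t)$ accordingly.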
Previous work either used mechanical systems to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
However, this method has not yet been integrated into an \AR headset context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based pose estimation.
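As a minimal sketch of how such marker-based registration can be used (the geometry, names, and values below are our illustration, not the system's actual code), a fingertip position measured in the webcam's frame can be expressed in the frame of a marker attached to the real surface by inverting the estimated marker pose:

```python
import numpy as np

# Assumed convention: the pose estimator yields a rotation matrix R and
# translation t mapping marker (surface) coordinates to camera
# coordinates, i.e. p_cam = R @ p_marker + t.
def to_surface_frame(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Express a camera-frame point (e.g. the tracked fingertip)
    in the coordinate frame of the marker on the real surface."""
    return R.T @ (p_cam - t)

# Illustrative example: marker rotated 90 degrees about the camera's
# z-axis and placed 0.5 m in front of it.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.0, 0.0, 0.5])

# A fingertip 2 cm along the marker's x-axis, seen in camera coordinates:
fingertip_cam = R @ np.array([0.02, 0.0, 0.0]) + t
p_surface = to_surface_frame(fingertip_cam, R, t)  # recovers [0.02, 0, 0]
```

The surface-frame position is what the texture renderer needs: the finger's displacement along the grating, independent of how the camera or surface is oriented.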
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
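The position-driven generation of this signal can be sketched as follows (the parameter values, sample rate, and names are illustrative assumptions, not the system's actual implementation):

```python
import numpy as np

# Illustrative parameters: a grating with a 2 mm spatial period,
# synthesized at a 48 kHz audio rate.
SPATIAL_PERIOD_M = 0.002   # grating period lambda, in metres
SAMPLE_RATE_HZ = 48_000
AMPLITUDE = 1.0

def grating_signal(finger_positions_m: np.ndarray) -> np.ndarray:
    """Map finger positions along the surface (one sample per audio
    frame) to a vibrotactile sinusoid whose phase tracks position, so
    the perceived spatial period stays fixed regardless of speed."""
    # Phase is proportional to travelled distance: phi = 2*pi*x/lambda.
    phase = 2.0 * np.pi * finger_positions_m / SPATIAL_PERIOD_M
    return AMPLITUDE * np.sin(phase)

# Moving at a constant 0.1 m/s, the output frequency is
# v / lambda = 0.1 / 0.002 = 50 Hz.
t = np.arange(SAMPLE_RATE_HZ) / SAMPLE_RATE_HZ   # one second of samples
signal = grating_signal(0.1 * t)
```

If the finger accelerates or stops, the phase accumulates accordingly and the instantaneous frequency follows, which is what keeps the haptic grating spatially consistent under free exploration.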
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item The rendering of virtual vibrotactile roughness textures representing a patterned grating texture, generated in real time from free finger movements using vision-based finger pose estimation.
\item A system to provide coherent visuo-haptic texture augmentations of the \RE in a direct-touch context using an \OST-\AR headset and wearable haptics.
\end{itemize}
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.