%which either constrained the hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with an immersive \OST-\AR headset (Microsoft HoloLens~2) and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based tracking.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
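
As an illustration of this synchronization, the position-based grating model common in the literature can be written as follows (a minimal formulation; the symbols $A$, $\lambda$ and $x$ are our notation, not necessarily the chapter's):
\[
s(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\]
where $x(t)$ is the finger displacement along the surface, $\lambda$ the spatial period of the grating, and $A$ the vibration amplitude. Because the signal depends on position rather than time, its instantaneous frequency $\dot{x}(t)/\lambda$ follows the finger speed, as it would on a real grating.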
\noindentskip The contributions of this chapter are:
\begin{itemize}
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top, attached to the middle phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the real surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations \cite{culbertson2018haptics}.
%
%We describe a system for rendering vibrotactile roughness textures in real time, on any real surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
%We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
The system consists of three main components: the pose estimation of the tracked real elements, the visual rendering of the \VE, and the vibrotactile signal generation and rendering.
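
The per-frame interplay of these components can be outlined as below (a hypothetical Python sketch, not the actual implementation; every name, \eg \texttt{detect\_markers} or \texttt{voice\_coil}, is a placeholder):
\begin{verbatim}
# Hypothetical per-frame pipeline (all names are placeholders).
while running:
    frame = camera.read()                 # 1280x720 webcam image at 60 Hz
    raw_poses = detect_markers(frame)     # {id: 4x4 pose in camera frame}
    poses = {i: lowpass(i, T) for i, T in raw_poses.items()}  # adaptive filter
    update_virtual_replicas(poses)        # align virtual models with the RE
    contact = detect_collision(poses)     # finger vs. virtual texture
    if contact:
        voice_coil.write(grating_signal(contact))  # position-based vibration
\end{verbatim}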
\figwide[1]{diagram}{Diagram of the visuo-haptic texture rendering system. }[
Fiducial markers attached to the voice-coil actuator and to the augmented surfaces are captured by a camera.
The positions and rotations (the poses) ${}^c\mathbf{T}_i$, $i=1..n$ of the $n$ defined markers in the camera frame $\mathcal{F}_c$ are estimated, then filtered with an adaptive low-pass filter.
%These poses are transformed to the \AR/\VR headset frame $\mathcal{F}_h$ and applied to the virtual model replicas to display them superimposed and aligned with the \RE.
These poses are used to move and display the virtual model replicas aligned with the \RE.
\label{pose_estimation}
A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech) placed above the experimental setup and capturing \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
Other markers are placed on the real surfaces to be augmented (\figref{setup}), in order to estimate the relative position of the finger with respect to these surfaces.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows any real surface to be augmented.
A camera external to the \AR headset, combined with a marker-based technique, is employed to provide accurate and robust tracking with a constant view of the markers \cite{marchand2016pose}.
We denote by ${}^c\mathbf{T}_i$, $i=1..n$, the homogeneous transformation matrix that defines the position and rotation of the $i$-th marker out of the $n$ defined markers in the camera frame $\mathcal{F}_c$, \eg the finger pose ${}^c\mathbf{T}_f$ and the texture pose ${}^c\mathbf{T}_t$.
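From these poses, the pose of the finger relative to an augmented texture follows by composing transformations (a standard identity, written out here for clarity):
\[
{}^{t}\mathbf{T}_{f} = \left({}^{c}\mathbf{T}_{t}\right)^{-1}\,{}^{c}\mathbf{T}_{f},
\]
whose translational part gives the finger position in the texture frame, from which the displacement driving the haptic rendering can be read.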
%To be able to compare virtual and augmented realities, we then create a \VE that closely replicates the real one.
Before a user interacts with the system, it is necessary to design a \VE that will be registered with the \RE during the experiment.
Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented surface (\figref{device}).
In addition, the pose and size of the virtual textures are defined on the virtual replicas.
During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts. %, according to the condition being tested.
This makes it possible to detect whether a finger touches a virtual texture using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the \RE, using the considered \AR or \VR headset.
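
To make the signal generation step concrete, it can be sketched as follows (a minimal Python sketch; the output rate, the parameter values and the linear interpolation between tracking updates are our assumptions, not the chapter's actual code):
\begin{verbatim}
import numpy as np

AUDIO_RATE = 48_000  # assumed output rate of the voice-coil driver (Hz)
PERIOD = 0.002       # assumed spatial period of the grating (m)
AMPLITUDE = 0.8      # assumed normalized vibration amplitude

def grating_block(x_prev, x_curr, n_samples):
    """One block of the position-based sinusoidal grating signal.

    x_prev, x_curr: finger positions (m) along the grating at the two
    latest tracking updates; interpolating between them lets the
    instantaneous frequency follow the finger speed.
    """
    x = np.linspace(x_prev, x_curr, n_samples, endpoint=False)
    return AMPLITUDE * np.sin(2.0 * np.pi * x / PERIOD)

# Example: one 60 Hz tracking interval while sliding at 5 cm/s,
# i.e. an instantaneous frequency of 0.05 / 0.002 = 25 Hz.
block = grating_block(0.010, 0.010 + 0.05 / 60, AUDIO_RATE // 60)
\end{verbatim}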
%Summary of the research problem, method, main findings, and implications.
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
When the surface is touched directly with the fingertip, its perceived roughness can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct-touch context, for use with vision-based tracking of the finger, and paired it with an immersive \AR headset.