tangible -> real

This commit is contained in:
2024-10-12 15:24:56 +02:00
parent 000a0a0fc5
commit f624ed5d44
16 changed files with 91 additions and 84 deletions

@@ -5,14 +5,14 @@ When we look at the surface of an everyday object, we then touch it to confirm o
Among the various haptic texture augmentations, data-driven methods make it possible to capture, model and reproduce the roughness perception of real surfaces when touched by a hand-held stylus \secref[related_work]{texture_rendering}.
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of tangible surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored combinations of the visuo-haptic texture pairs in direct touch and rated their coherence, realism and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item Transposition of data-driven visuo-haptic textures to augment tangible objects in a direct touch context in immersive \AR.
\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in immersive \AR.
\item A user study with 20 participants evaluating the coherence, realism and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}

@@ -1,7 +1,7 @@
\section{User Study}
\label{experiment}
%The user study aimed at analyzing the user perception of tangible surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%The user study aimed at analyzing the user perception of real surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%Nine representative visuo-haptic texture pairs from the \HaTT database \cite{culbertson2014one} were investigated in two tasks:
%\begin{enumerate}
% \item \level{Matching} task: participants had to find the haptic texture that best matched a given visual texture; and
@@ -13,7 +13,7 @@
\label{textures}
The 100 visuo-haptic texture pairs of the \HaTT database \cite{culbertson2014one} were preliminarily tested and compared using the apparatus described in \secref{apparatus} to select the most representative textures for the user study.
% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a tangible surface. on the finger on a tangible surface
% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a real surface. on the finger on a real surface
These texture models were chosen because they are publicly available online and provide visuo-haptic representations of a wide range of real textures.
Nine texture pairs were selected (\figref{experiment/textures}) to cover a range of perceived roughness, from rough to smooth, as named in the database: \level{Metal Mesh}, \level{Sandpaper~100}, \level{Brick~2}, \level{Cork}, \level{Sandpaper~320}, \level{Velcro Hooks}, \level{Plastic Mesh~1}, \level{Terra Cotta}, \level{Coffee Filter}.
All these visual and haptic textures are isotropic: their rendering (appearance or roughness) is the same regardless of the direction of movement on the surface, \ie there are no local deformations (holes, bumps, or breaks).
@@ -24,11 +24,11 @@ All these visual and haptic textures are isotropic: their rendering (appearance
\figref{experiment/setup} shows the experimental setup, and \figref{experiment/view} the first person view of participants during the user study.
The user study was held in a quiet room with no windows, with one light source of \qty{800}{\lumen} placed \qty{70}{\cm} above the table.
Nine \qty{5}{\cm} square cardboard tiles with a smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real tangible surfaces to augment.
Nine \qty{5}{\cm} square cardboard tiles with a smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real surfaces to augment.
Their poses were estimated with three \qty{2}{\cm} AprilTag fiducial markers glued onto the grid of surfaces.
Similarly, a \qty{2}{\cm} fiducial marker was glued on top of the vibrotactile actuator to detect the finger pose.
Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) filmed the markers to track finger movements relative to the surfaces, as described in \secref[vhar_system]{virtual_real_alignment}.
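As a minimal sketch of this relative-pose computation (Python with NumPy; the function and variable names are hypothetical and not taken from the system of \chapref{vhar_system}), the finger position can be expressed in the surface frame from the two marker poses, and the tangential speed estimated by finite differences:
\begin{verbatim}
import numpy as np

def finger_in_surface_frame(T_cam_surface, T_cam_finger):
    # Both inputs are 4x4 homogeneous marker poses in the camera frame,
    # e.g. as returned by an AprilTag pose estimator.
    T_surface_finger = np.linalg.inv(T_cam_surface) @ T_cam_finger
    return T_surface_finger[:3, 3]  # finger position in the surface frame (m)

def tangential_speed(p_prev, p_curr, dt):
    # Speed in the surface plane (x-y), ignoring the normal component,
    # estimated with a first-order finite difference.
    v = (p_curr[:2] - p_prev[:2]) / dt
    return float(np.linalg.norm(v))
\end{verbatim}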
The visual textures were displayed on the tangible surfaces using the \OST-\AR headset Microsoft HoloLens~2, running a custom application made with Unity 2021.1 and the Mixed Reality Toolkit (MRTK) 2.7.2 at \qty{60}{FPS}.
The visual textures were displayed on the real surfaces using the \OST-\AR headset Microsoft HoloLens~2, running a custom application made with Unity 2021.1 and the Mixed Reality Toolkit (MRTK) 2.7.2 at \qty{60}{FPS}.
A set of empirical tests enabled us to choose the best rendering characteristics in terms of transparency and brightness for the visual textures, which were then used throughout the user study.
When a virtual haptic texture was touched, a \qty{48}{\kHz} audio signal was generated using the rendering procedure described in \cite{culbertson2014modeling} from the corresponding \HaTT haptic texture model and the measured tangential speed of the finger (\secref[vhar_system]{texture_generation}).
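The sketch below gives a minimal idea of this kind of speed-driven, data-driven vibration synthesis; it is not the exact procedure of \cite{culbertson2014modeling}, as it assumes autoregressive (AR) coefficients and an excitation variance have already been selected for the current finger speed, and it ignores the force dependence and coefficient interpolation of the original method (all names are hypothetical):
\begin{verbatim}
import numpy as np

def synth_texture_buffer(ar_coeffs, noise_std, history, n_samples=480):
    # Generate one audio buffer (e.g. 10 ms at 48 kHz) of texture vibration
    # from an all-pole AR model excited by white noise.
    #   ar_coeffs : AR coefficients a_1..a_p for the current finger speed
    #   noise_std : standard deviation of the white-noise excitation
    #   history   : last p output samples, most recent last (updated in place)
    out = np.empty(n_samples)
    excitation = np.random.normal(0.0, noise_std, n_samples)
    for n in range(n_samples):
        # y[n] = e[n] - sum_k a_k * y[n-k]
        out[n] = excitation[n] - np.dot(ar_coeffs, history[::-1])
        history[:-1] = history[1:]
        history[-1] = out[n]
    return out
\end{verbatim}
In an actual renderer of this kind, the model parameters would be re-selected from the texture model at every buffer according to the measured tangential speed before generating the next samples.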
@@ -44,7 +44,7 @@ The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attache
\item The nine visuo-haptic textures used in the user study, selected from the \HaTT database \cite{culbertson2014one}.
The texture names were never shown, to prevent the use of the user's visual or haptic memory of the textures.
\item Experimental setup.
Participants sat in front of the tangible surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
Participants sat in front of the real surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
A webcam above the surfaces tracked the finger movements.
]
\subfig[0.49]{experiment/textures}
@@ -58,12 +58,12 @@ Participants were first given written instructions about the experimental setup,
Then, after having signed an informed consent form, they were asked to sit in front of the table with the experimental setup and to wear the \AR headset.
%The experimenter firmly attached the plastic shell encasing the vibrotactile actuator to the middle index phalanx of their dominant hand.
As the haptic textures generated no audible noise, participants did not wear any noise reduction headphones.
A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the tangible surfaces.
A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the real surfaces.
Finally, participants familiarized themselves with the augmented surfaces in a \qty{2}{min} training session, using textures different from the ones used in the user study.
Participants started with the \level{Matching} task.
They were informed that the user study involved nine pairs of corresponding visual and haptic textures that were separated and shuffled.
On each trial, the same visual texture was displayed on the nine tangible surfaces, while each of the nine haptic textures was rendered on only one of the surfaces, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
On each trial, the same visual texture was displayed on the nine real surfaces, while each of the nine haptic textures was rendered on only one of the surfaces, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
The placement of the haptic textures was randomized before each trial.
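As a simple illustration of this randomization (a sketch only, not the actual study software; the identifiers below merely mirror the texture names listed above), the haptic textures can be shuffled over the nine grid positions before each trial:
\begin{verbatim}
import random

HAPTIC_TEXTURES = ["Metal Mesh", "Sandpaper 100", "Brick 2", "Cork",
                   "Sandpaper 320", "Velcro Hooks", "Plastic Mesh 1",
                   "Terra Cotta", "Coffee Filter"]

def haptic_placement_for_trial(rng=random):
    # Random assignment of the nine haptic textures to the nine
    # positions (0..8) of the 3x3 surface grid, one per surface.
    placement = list(HAPTIC_TEXTURES)
    rng.shuffle(placement)
    return placement
\end{verbatim}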
Participants were instructed to look closely at the details of the visual textures and explore the haptic textures with a constant pressure and various speeds to find the haptic texture that best matched the visual texture, \ie choose the surface with the most coherent visual-haptic texture pair.
The texture names were never given or shown, to prevent participants from relying on their visual or haptic memory of the textures; no definition of roughness was given either, so that participants could complete the task as naturally as possible, similarly to \textcite{bergmanntiest2007haptic}.

@@ -1,11 +1,11 @@
\section{Discussion}
\label{discussion}
In this study, we investigated the perception of visuo-haptic texture augmentation of tangible surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
In this study, we investigated the perception of visuo-haptic texture augmentation of real surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
The nine evaluated pairs of visuo-haptic textures, taken from the \HaTT database \cite{culbertson2014one}, are models of real texture captures (\secref[related_work]{texture_rendering}).
Their perception was evaluated in a two-task user study in which participants chose the most coherent combinations of visual and haptic textures (\level{Matching} task), and ranked all textures according to their perceived roughness (\level{Ranking} task).
The visual textures were displayed statically on the tangible surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the tangible surface.
The visual textures were displayed statically on the real surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the surface.
In addition, the interaction with the textures was designed to be as natural as possible, without imposing a specific speed of finger movement, as in similar studies \cite{asano2015vibrotactile,friesen2024perceived}.
In the \level{Matching} task, participants were not able to effectively match the original visual and haptic texture pairs (\figref{results/matching_confusion_matrix}), except for the \level{Coffee Filter} texture, which was the smoothest both visually and haptically.

@@ -1,13 +1,13 @@
\section{Conclusion}
\label{conclusion}
In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of tangible surfaces seen in immersive \OST-\AR and touched directly with the index finger.
In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen in immersive \OST-\AR and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and the measured finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.
The results showed that participants consistently identified and matched clusters of visual and haptic textures with similar perceived roughness.
The texture rankings indeed showed that participants perceived the roughness of the haptic textures as very similar, less so for the visual textures, and that haptic roughness perception dominated the final roughness ranking of the original visuo-haptic pairs.
This suggests that \AR visual textures that augment tangible surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
This suggests that \AR visual textures that augment real surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
This paves the way for new \AR applications capable of augmenting a \RE with virtual visuo-haptic textures, such as visuo-haptic painting in artistic or object design contexts, or viewing and touching virtual objects in a museum or a showroom.
The latter is illustrated in \figref{experiment/use_case}, where a user applies different visuo-haptic textures to a wall, in an interior design scenario, to compare them visually and by touch.