Replace "immersive AR" with "AR headset"

2025-04-11 22:51:10 +02:00
parent f1cf425e7c
commit f8ec931cd6
22 changed files with 94 additions and 101 deletions


@@ -4,12 +4,12 @@
One approach to render virtual haptic textures consists in simulating the roughness of a periodic grating surface as a vibrotactile sinusoidal (\secref[related_work]{texture_rendering}).
The vibrations are rendered to a voice-coil actuator embedded in a hand-held tool or worn on the finger, but to create the illusion of touching a pattern with a fixed spatial period, the frequency of the signal must be modulated according to the finger movement.
Previous work either used mechanical systems to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
-However, this method has not yet been integrated in an \AR context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
+However, this method has not yet been integrated in an \AR headset context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
-It is implemented with an immersive \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
+It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based pose estimation.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
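The speed-synchronized grating rendering described above can be sketched as a phase accumulator: the instantaneous frequency of the vibrotactile sinusoid is the finger speed divided by the spatial period of the grating, so the perceived period stays fixed however the finger moves. A minimal sketch in Python; the function name, sample rate, and parameter values are illustrative, not the system's actual implementation:

```python
import math

def grating_samples(finger_speeds, dt, spatial_period, amplitude=1.0):
    """Accumulate phase so the sinusoid's instantaneous frequency
    f(t) = v(t) / spatial_period tracks the finger speed v(t),
    keeping the perceived spatial period of the grating constant."""
    phase, samples = 0.0, []
    for v in finger_speeds:
        phase += 2.0 * math.pi * (v / spatial_period) * dt
        samples.append(amplitude * math.sin(phase))
    return samples

# A constant 0.1 m/s over a 2 mm grating yields a 50 Hz vibration.
sig = grating_samples([0.1] * 1000, dt=0.001, spatial_period=0.002)
```

Accumulating phase (rather than computing `sin(2*pi*f*t)` with a varying `f`) avoids discontinuities in the output when the finger speed changes between samples.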
@@ -17,7 +17,7 @@ The haptic textures are rendered as a vibrotactile signal representing a pattern
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item The rendering of virtual vibrotactile roughness textures representing a patterned grating texture in real time from free finger movements and using vision-based finger pose estimation.
-\item A system to provide coherent visuo-haptic texture augmentations of the \RE in a direct touch context using an immersive \AR headset and wearable haptics.
+\item A system to provide coherent visuo-haptic texture augmentations of the \RE in a direct touch context using an \OST-\AR headset and wearable haptics.
\end{itemize}
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.


@@ -5,7 +5,7 @@
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
Directly touched with the fingertip, the perceived roughness of the surface can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
-We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based pose estimation of the finger and paired it with an immersive \AR headset.
+We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based pose estimation of the finger and paired it with an \OST-\AR headset.
Our wearable visuo-haptic augmentation system enables any real surface to be augmented with a minimal setup.
It also allows free exploration of the textures, as if they were real (\secref[related_work]{ar_presence}), by letting the user view them from different poses and touch them with the bare finger without constraints on hand movements.
@@ -18,4 +18,3 @@ This system forms the basis of the apparatus for the user studies presented in t
%Erwan Normand, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
%\enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}.
%In: \textit{ACM Symposium on Virtual Reality Software and Technology}. Trier, Germany, October 2024. pp. 287--296.


@@ -2,29 +2,29 @@
\label{intro}
In the previous chapter, we investigated the role of the visual feedback of the virtual hand and the environment (\AR \vs \VR) on the perception of wearable haptic texture augmentation.
-In this chapter, we explore the perception of wearable visuo-haptic texture augmentation of real surfaces touched directly with the finger in an immersive \AR context and without a virtual hand overlay.
+In this chapter, we explore the perception of wearable visuo-haptic texture augmentation of real surfaces touched directly with the finger.
When we look at the surface of an everyday object, we then touch it to confirm or contrast our initial visual impression and to estimate the properties of the object, particularly its texture (\secref[related_work]{visual_haptic_influence}).
Among the various haptic texture augmentations, data-driven methods make it possible to capture, model, and reproduce the roughness perception of real surfaces when touched by a hand-held stylus (\secref[related_work]{texture_rendering}).
-Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
+Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in a direct touch context with \AR and wearable haptics.
-In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
+In this chapter, we consider simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} with an \OST-\AR headset and wearable vibrotactile feedback.
+We investigate how these textures can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored the combinations of visuo-haptic texture pairs in direct touch to rate their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.
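The speed-driven use of the data-driven texture models mentioned above can be illustrated with a deliberately simplified sketch. The actual \HaTT models are autoregressive vibration filters fitted over tool speed and force; the dictionary, values, and names below are hypothetical and only show the idea of selecting a stored model entry from the measured finger speed:

```python
import random

# Hypothetical per-texture model: vibration standard deviation recorded
# at a few reference finger speeds (m/s). Real data-driven models (e.g.
# HaTT) are richer: autoregressive filters indexed by speed and force.
ROUGH_PAPER = {0.05: 0.2, 0.10: 0.5, 0.20: 0.9}

def vibration_sample(model, finger_speed, rng=random):
    """Pick the model entry nearest the current finger speed and
    draw one vibration sample with the stored intensity."""
    nearest = min(model, key=lambda s: abs(s - finger_speed))
    return rng.gauss(0.0, model[nearest])

# Faster finger movement selects a stronger vibration model.
sample = vibration_sample(ROUGH_PAPER, finger_speed=0.18)
```

The key design point shared with the real models is that the output depends only on the finger speed, not its direction, matching the isotropic texture captures used in the study.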
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in immersive \AR.
\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in \AR.
\item A user study evaluating with 20 participants the coherence, realism and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}
\smallskip
\fig[0.55]{experiment/view}{First person view of the user study.}[
%As seen through the immersive \AR headset.
The visual texture overlays were statically displayed on the surfaces, allowing the user to move around to view them from different angles.
-The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle index phalanx.% as it slides on the considered surface.
+The haptic texture augmentations were generated based on \HaTT data-driven texture models and finger speed, and were rendered on the middle index phalanx.
]
\noindentskip In the next sections, we first describe the apparatus and experimental design of the user study, including the two tasks performed. We then present the results obtained and discuss them before concluding.


@@ -29,7 +29,7 @@ Several strategies were reported: some participants first classified visually an
While visual sensation did influence perception, as observed in previous haptic \AR studies \cite{punpongsanon2015softar,gaffary2017ar,fradin2023humans}, haptic sensation dominated here.
This indicates that participants were more confident and relied more on the haptic roughness perception than on the visual roughness perception when integrating both in one coherent perception.
-Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges, or patterns that were not reported when these textures were displayed in non-immersive \VEs with a screen \cite{culbertson2014modeling,culbertson2015should}.
+Several participants also described attempting to identify visual and haptic textures using spatial breaks, edges, or patterns that were not reported when these textures were displayed in \VEs using a screen \cite{culbertson2014modeling,culbertson2015should}.
A few participants even reported that they clearly sensed patterns on haptic textures.
However, the visual and haptic textures used were isotropic and homogeneous models of real texture captures, \ie their rendered roughness was constant and did not depend on the direction of movement but only on the speed of the finger (\secref[related_work]{texture_rendering}).
Overall, the haptic device was judged to be comfortable, and the visual and haptic textures were judged to be fairly realistic and to work well together (\figref{results_questions}).


@@ -1,7 +1,7 @@
\section{Conclusion}
\label{conclusion}
-In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen in immersive \OST-\AR and touched directly with the index finger.
+In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen with an \OST-\AR headset and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.
@@ -15,7 +15,7 @@ This paves the way for new \AR applications capable of augmenting a real environ
The latter is illustrated in \figref{experiment/use_case}, where a user applies different visuo-haptic textures to a wall, in an interior design scenario, to compare them visually and by touch.
We instinctively perceive the properties of everyday objects by touching and exploring them, but we essentially interact with them by grasping in order to manipulate them.
-In this first part, we focused on the perception of wearable and immersive virtual textures that augment real surfaces when touched with the fingertip.
+In this first part, we focused on the perception of virtual visuo-haptic textures that augment real surfaces when touched with the fingertip.
In the next part, we will propose to improve the direct hand manipulation of virtual objects with wearable visuo-haptic interaction feedback.
\noindentskip The work described in \chapref{vhar_textures} was presented at the EuroHaptics 2024 conference:


@@ -1,7 +1,7 @@
\section{Introduction}
\label{intro}
-In the previous chapter, we presented a system for augmenting the visuo-haptic texture perception of real surfaces directly touched with the finger, using wearable vibrotactile haptics and an immersive \AR headset.
+In the previous chapter, we presented a system for augmenting the visuo-haptic texture perception of real surfaces directly touched with the finger, using wearable vibrotactile haptics and an \OST-\AR headset.
In this chapter and the next one, we evaluate the user's perception of such wearable haptic texture augmentation under different visual rendering conditions.
Most haptic augmentations of real surfaces using wearable haptic devices, including the roughness of textures, have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR (\secref[related_work]{texture_rendering}).


@@ -29,5 +29,5 @@ Thereby, we hypothesize that the differences in the perception of vibrotactile r
The perceived delay was most pronounced in \AR, where the virtual hand visually lags significantly behind the real one, but less so in \VR, where only the proprioceptive sense can help detect the lag.
This delay was not perceived when touching the virtual haptic textures without visual augmentation, because only the finger velocity was used to render them, and, despite the varied finger movements and velocities while exploring the textures, the participants did not perceive any latency in the vibrotactile rendering (\secref{results_questions}).
\textcite{diluca2011effects} demonstrated similarly, in a \VST-\AR setup, how visual latency relative to proprioception increased the perception of stiffness of a virtual piston, while haptic latency decreased it (\secref[related_work]{ar_vr_haptic}).
-Another complementary explanation could be a pseudo-haptic effect (\secref[related_work]{visual_haptic_influence}) of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen in a non-immersive context \cite{ujitoko2019modulating}.
+Another complementary explanation could be a pseudo-haptic effect (\secref[related_work]{visual_haptic_influence}) of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but seen on a screen \cite{ujitoko2019modulating}.
Such hypotheses could be tested by manipulating the latency and pose estimation accuracy of the virtual hand or the vibrotactile feedback. % to observe their effects on the roughness perception of the virtual textures.
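One way to manipulate the virtual hand latency, as suggested above, is to pass the pose stream through a fixed-length delay buffer before display. A minimal sketch, where the class name, frame counts, and frame rate are illustrative assumptions, not the system's actual code:

```python
from collections import deque

class PoseDelay:
    """Delay a pose stream by n_frames to inject artificial
    visual latency (e.g. 5 frames at 60 FPS is roughly 83 ms)."""
    def __init__(self, n_frames):
        # Keep the current pose plus n_frames older ones.
        self.buffer = deque(maxlen=n_frames + 1)

    def push(self, pose):
        self.buffer.append(pose)
        return self.buffer[0]  # oldest buffered pose: the delayed output

delay = PoseDelay(n_frames=2)
out = [delay.push(p) for p in [0, 1, 2, 3, 4]]
# out == [0, 0, 0, 1, 2]: each pose is displayed 2 frames late.
```

The same buffer could equally be placed on the vibrotactile command stream to delay the haptic rendering instead of the visual one.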


@@ -4,7 +4,7 @@
In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and the environment, being either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the real surface being touched.
-With an immersive \AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
+With an \OST-\AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
Our results showed that the visual virtuality of the hand (real or virtual) and the environment (\AR or \VR) had a significant effect on the perception of haptic textures and the exploration behaviour of the participants.