tangible -> real

This commit is contained in:
2024-10-12 15:24:56 +02:00
parent 000a0a0fc5
commit f624ed5d44
16 changed files with 91 additions and 84 deletions

View File

@@ -1,2 +1,2 @@
\part{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
\part{Augmenting the Visuo-haptic Texture Perception of Real Surfaces}
\mainlabel{perception}

View File

@@ -8,11 +8,11 @@ However, this method has not yet been integrated in an \AR context, where the us
%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment tangible surfaces}.
In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with an immersive \OST-\AR headset (Microsoft HoloLens~2) and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the tangibles are tracked using a webcam and marker-based tracking.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.
To ensure both real-time and reliable renderings, the hand and the real surfaces are tracked using a webcam and marker-based tracking.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented surface.
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -26,7 +26,7 @@ The haptic textures are rendered as a vibrotactile signal representing a pattern
\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to the middle-phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the tangible surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the real surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}

View File

@@ -1,6 +1,6 @@
%With a vibrotactile actuator attached to a hand-held device or directly on the finger, it is possible to simulate virtual haptic sensations as vibrations, such as texture, friction or contact vibrations \cite{culbertson2018haptics}.
%
%We describe a system for rendering vibrotactile roughness textures in real time, on any tangible surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%We describe a system for rendering vibrotactile roughness textures in real time, on any real surface, touched directly with the index fingertip, with no constraints on hand movement and using a simple camera to track the finger pose.
%
%We also describe how to pair this tactile rendering with an immersive \AR or \VR headset visual display to provide a coherent, multimodal visuo-haptic augmentation of the \RE.
@@ -18,7 +18,7 @@ The visuo-haptic texture rendering system is based on:
The system consists of three main components: the pose estimation of the tracked real elements, the visual rendering of the \VE, and the vibrotactile signal generation and rendering.
\figwide[1]{diagram}{Diagram of the visuo-haptic texture rendering system. }[
Fiducial markers attached to the voice-coil actuator and to tangible surfaces to track are captured by a camera.
Fiducial markers attached to the voice-coil actuator and to augmented surfaces to track are captured by a camera.
The positions and rotations (the poses) ${}^c\mathbf{T}_i$, $i=1..n$ of the $n$ defined markers in the camera frame $\mathcal{F}_c$ are estimated, then filtered with an adaptive low-pass filter.
%These poses are transformed to the \AR/\VR headset frame $\mathcal{F}_h$ and applied to the virtual model replicas to display them superimposed and aligned with the \RE.
These poses are used to move and display the virtual model replicas aligned with the \RE.
@@ -36,8 +36,8 @@ The system consists of three main components: the pose estimation of the tracked
\label{pose_estimation}
A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech), which is placed above the experimental setup and captures \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
Other markers are placed on the tangible surfaces to be augmented (\figref{setup}), to estimate the relative position of the finger with respect to the surfaces.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows augmenting any tangible surface.
Other markers are placed on the real surfaces to be augmented (\figref{setup}), to estimate the relative position of the finger with respect to the surfaces.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows augmenting any real surface.
A camera external to the \AR headset with a marker-based technique is employed to provide accurate and robust tracking with a constant view of the markers \cite{marchand2016pose}.
We denote by ${}^c\mathbf{T}_i$, $i=1..n$ the homogeneous transformation matrix that defines the position and rotation of the $i$-th marker out of the $n$ defined markers in the camera frame $\mathcal{F}_c$, \eg the finger pose ${}^c\mathbf{T}_f$ and the texture pose ${}^c\mathbf{T}_t$.
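For illustration, the pose of the finger marker relative to a texture marker follows from the camera-frame poses as ${}^t\mathbf{T}_f = ({}^c\mathbf{T}_t)^{-1}\,{}^c\mathbf{T}_f$. A minimal sketch in Python/NumPy, with illustrative poses (the AprilTag detection itself is omitted, and the function names are our own):

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(c_T_t, c_T_f):
    """Pose of the finger marker expressed in the texture marker frame:
    t_T_f = (c_T_t)^-1 @ c_T_f."""
    return np.linalg.inv(c_T_t) @ c_T_f

# Illustrative values: texture marker 30 cm in front of the camera,
# finger marker 5 cm away from it along the camera's y axis.
c_T_t = pose_matrix(np.eye(3), np.array([0.0, 0.0, 0.30]))
c_T_f = pose_matrix(np.eye(3), np.array([0.0, 0.05, 0.30]))
t_T_f = relative_pose(c_T_t, c_T_f)
print(t_T_f[:3, 3])  # finger offset expressed in the texture frame
```

In the actual system these poses are first filtered (the diagram mentions an adaptive low-pass filter) before being used for rendering.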
@@ -51,7 +51,7 @@ The velocity (without angular velocity) of the marker, denoted as ${}^c\dot{\mat
%To be able to compare virtual and augmented realities, we then create a \VE that closely replicate the real one.
Before a user interacts with the system, it is necessary to design a \VE that will be registered with the \RE during the experiment.
Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented tangible surface (\figref{device}).
Each real element tracked by a marker is modelled virtually, \eg the hand and the augmented surface (\figref{device}).
In addition, the pose and size of the virtual textures are defined on the virtual replicas.
During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts. %, according to the condition being tested.
This makes it possible to detect whether a finger touches a virtual texture, using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the \RE, using the considered \AR or \VR headset.
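The system delegates collision detection to Nvidia PhysX; as a simplified, hypothetical illustration of the underlying idea only, contact with a planar texture patch can be tested directly in the texture frame (the patch half-size and contact threshold below are made up):

```python
import numpy as np

def finger_on_texture(t_T_f, half_size=0.025, contact_eps=0.005):
    """Hypothetical contact test: the fingertip touches a planar texture patch
    if its position in the texture frame lies within the patch bounds (x, y)
    and within a small distance of the surface plane (z). Units in metres."""
    x, y, z = t_T_f[:3, 3]
    return bool(abs(x) <= half_size and abs(y) <= half_size and abs(z) <= contact_eps)

# Illustrative finger pose: 1 cm / -2 cm inside the patch, 2 mm above the plane.
T = np.eye(4)
T[:3, 3] = [0.01, -0.02, 0.002]
print(finger_on_texture(T))  # True
```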

View File

@@ -3,7 +3,7 @@
%Summary of the research problem, method, main findings, and implications.
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real tangible surface.
In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
Directly touched with the fingertip, the perceived roughness of the surface can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based tracking of the finger and paired it with an immersive \AR headset.
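In the 1D sinusoidal grating method mentioned above, the vibration phase advances with the finger displacement, so the instantaneous vibration frequency equals the finger speed divided by the grating wavelength. A minimal sketch of this principle (amplitude and wavelength values are illustrative, not those of the actual implementation):

```python
import numpy as np

def grating_signal(positions, wavelength=0.002, amplitude=1.0):
    """Vibrotactile signal for a 1D sinusoidal grating: the phase is driven by
    the finger displacement x along the surface, so the output frequency is
    v / wavelength. positions: finger positions (m), one per audio sample."""
    phase = 2 * np.pi * np.asarray(positions) / wavelength
    return amplitude * np.sin(phase)

# Finger moving at a constant 0.1 m/s over a 2 mm grating -> 50 Hz vibration.
fs = 48000                      # audio sampling rate (Hz)
t = np.arange(fs) / fs          # 1 s of samples
s = grating_signal(0.1 * t)
```

Because the phase is tied to position rather than time, the signal stays consistent with the finger movement even when the speed varies.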

View File

@@ -5,14 +5,14 @@ When we look at the surface of an everyday object, we then touch it to confirm o
Among the various haptic texture augmentations, data-driven methods make it possible to capture, model and reproduce the roughness perception of real surfaces when touched by a hand-held stylus (\secref[related_work]{texture_rendering}).
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.
In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of tangible surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality would contribute to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored in direct touch the combination of the visuo-haptic texture pairs to rate their coherence, realism and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.
\noindentskip The contributions of this chapter are:
\begin{itemize}
\item Transposition of data-driven visuo-haptic textures to augment tangible objects in a direct touch context in immersive \AR.
\item Transposition of data-driven visuo-haptic textures to augment real objects in a direct touch context in immersive \AR.
\item A user study evaluating with 20 participants the coherence, realism and perceived roughness of nine pairs of these visuo-haptic texture augmentations.
\end{itemize}

View File

@@ -1,7 +1,7 @@
\section{User Study}
\label{experiment}
%The user study aimed at analyzing the user perception of tangible surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%The user study aimed at analyzing the user perception of real surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%Nine representative visuo-haptic texture pairs from the \HaTT database \cite{culbertson2014one} were investigated in two tasks:
%\begin{enumerate}
% \item \level{Matching} task: participants had to find the haptic texture that best matched a given visual texture; and
@@ -13,7 +13,7 @@
\label{textures}
The 100 visuo-haptic texture pairs of the \HaTT database \cite{culbertson2014one} were preliminarily tested and compared using the apparatus described in \secref{apparatus} to select the most representative textures for the user study.
% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a tangible surface. on the finger on a tangible surface
% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a real surface. on the finger on a real surface
These texture models were chosen as they are visuo-haptic representations of a wide range of real textures that are publicly available online.
Nine texture pairs were selected (\figref{experiment/textures}) to cover various perceived roughness, from rough to smooth, as named on the database: \level{Metal Mesh}, \level{Sandpaper~100}, \level{Brick~2}, \level{Cork}, \level{Sandpaper~320}, \level{Velcro Hooks}, \level{Plastic Mesh~1}, \level{Terra Cotta}, \level{Coffee Filter}.
All these visual and haptic textures are isotropic: their rendering (appearance or roughness) is the same whatever the direction of the movement on the surface, \ie there are no local deformations (holes, bumps, or breaks).
@@ -24,11 +24,11 @@ All these visual and haptic textures are isotropic: their rendering (appearance
\figref{experiment/setup} shows the experimental setup, and \figref{experiment/view} the first person view of participants during the user study.
The user study was held in a quiet room with no windows, with one light source of \qty{800}{\lumen} placed \qty{70}{\cm} above the table.
Nine \qty{5}{\cm} square cardboards with a smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real tangible surfaces to augment.
Nine \qty{5}{\cm} square cardboards with a smooth, white melamine surface, arranged in a \numproduct{3 x 3} grid, were used as real surfaces to augment.
Their poses were estimated with three \qty{2}{\cm} AprilTag fiducial markers glued on the grid of surfaces.
Similarly, a \qty{2}{\cm} fiducial marker was glued on top of the vibrotactile actuator to detect the finger pose.
Positioned \qty{20}{\cm} above the surfaces, a webcam (StreamCam, Logitech) filmed the markers to track finger movements relative to the surfaces, as described in \secref[vhar_system]{virtual_real_alignment}.
The visual textures were displayed on the tangible surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
The visual textures were displayed on the real surfaces using the \OST-\AR headset Microsoft HoloLens~2 running a custom application at \qty{60}{FPS} made with Unity 2021.1 and Mixed Reality Toolkit (MRTK) 2.7.2.
A set of empirical tests enabled us to choose the best rendering characteristics for the visual textures in terms of transparency and brightness; these settings were used throughout the user study.
When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was generated using the rendering procedure described in \cite{culbertson2014modeling} from the corresponding \HaTT haptic texture model and the measured tangential speed of the finger (\secref[vhar_system]{texture_generation}).
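The rendering procedure of \textcite{culbertson2014modeling} drives interpolated autoregressive (AR) texture models with the measured finger speed and force. As a strongly simplified, hypothetical sketch of the principle only, a single fixed AR filter can be excited by noise scaled with the finger speed (the coefficients below are made up; the actual renderer interpolates models measured over force-speed space):

```python
import numpy as np

def ar_texture_signal(speeds, coeffs=(1.6, -0.9), base_std=1.0, seed=0):
    """Simplified data-driven texture synthesis: a second-order AR filter
    driven by white noise whose strength scales with the finger speed.
    speeds: tangential finger speed (m/s), one value per output sample."""
    rng = np.random.default_rng(seed)
    a1, a2 = coeffs                      # stable AR(2): poles inside unit circle
    out = np.zeros(len(speeds))
    for n in range(2, len(speeds)):
        excitation = rng.normal(0.0, base_std * speeds[n])
        out[n] = a1 * out[n - 1] + a2 * out[n - 2] + excitation
    return out

# 100 ms of constant 5 cm/s exploration at a 48 kHz output rate.
speeds = np.full(4800, 0.05)
s = ar_texture_signal(speeds)
```

A faster finger produces a stronger excitation and thus a more intense vibration, which is the behaviour the speed-dependent rendering aims for.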
@@ -44,7 +44,7 @@ The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attache
\item The nine visuo-haptic textures used in the user study, selected from the \HaTT database \cite{culbertson2014one}.
The texture names were never shown, to prevent the use of the user's visual or haptic memory of the textures.
\item Experimental setup.
Participants sat in front of the tangible surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
Participants sat in front of the real surfaces, which were augmented with visual textures displayed by the Microsoft HoloLens~2 \AR headset and haptic roughness textures rendered by the vibrotactile haptic device placed on the middle index phalanx.
A webcam above the surfaces tracked the finger movements.
]
\subfig[0.49]{experiment/textures}
@@ -58,12 +58,12 @@ Participants were first given written instructions about the experimental setup,
Then, after having signed an informed consent form, they were asked to sit in front of the table with the experimental setup and to wear the \AR headset.
%The experimenter firmly attached the plastic shell encasing the vibrotactile actuator to the middle index phalanx of their dominant hand.
As the haptic textures generated no audible noise, participants did not wear any noise reduction headphones.
A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the tangible surfaces.
A calibration of both the HoloLens~2 and the hand tracking was performed to ensure the correct alignment of the visual and haptic textures on the real surfaces.
Finally, participants familiarized themselves with the augmented surface in a \qty{2}{min} training session with textures different from the ones used in the user study.
Participants started with the \level{Matching} task.
They were informed that the user study involved nine pairs of corresponding visual and haptic textures that were separated and shuffled.
On each trial, the same visual texture was displayed on the nine tangible surfaces, while the nine haptic textures were rendered on only one of the surfaces at a time, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
On each trial, the same visual texture was displayed on the nine real surfaces, while the nine haptic textures were rendered on only one of the surfaces at a time, \ie all surfaces were augmented by the same visual texture, but each surface was augmented by a different haptic texture.
The placement of the haptic textures was randomized before each trial.
Participants were instructed to look closely at the details of the visual textures and explore the haptic textures with a constant pressure and various speeds to find the haptic texture that best matched the visual texture, \ie choose the surface with the most coherent visual-haptic texture pair.
The texture names were never given or shown, to prevent the use of visual or haptic memory of the textures; nor was a definition of roughness given, so that participants could complete the task as naturally as possible, similarly to \textcite{bergmanntiest2007haptic}.

View File

@@ -1,11 +1,11 @@
\section{Discussion}
\label{discussion}
In this study, we investigated the perception of visuo-haptic texture augmentation of tangible surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
In this study, we investigated the perception of visuo-haptic texture augmentation of real surfaces touched directly with the index fingertip, using visual texture overlays in \AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
The nine evaluated pairs of visuo-haptic textures, taken from the \HaTT database \cite{culbertson2014one}, are models of real texture captures (\secref[related_work]{texture_rendering}).
Their perception was evaluated in a two-task user study in which participants chose the most coherent combinations of visual and haptic textures (\level{Matching} task), and ranked all textures according to their perceived roughness (\level{Ranking} task).
The visual textures were displayed statically on the tangible surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the tangible surface.
The visual textures were displayed statically on the real surface, while the haptic textures adapted in real time to the speed of the finger on the surface, giving the impression that the visuo-haptic textures were integrated into the surface.
In addition, the interaction with the textures was designed to be as natural as possible, without imposing a specific speed of finger movement, as in similar studies \cite{asano2015vibrotactile,friesen2024perceived}.
In the \level{Matching} task, participants were not able to effectively match the original visual and haptic texture pairs (\figref{results/matching_confusion_matrix}), except for the \level{Coffee Filter} texture, which was the smoothest both visually and haptically.

View File

@@ -1,13 +1,13 @@
\section{Conclusion}
\label{conclusion}
In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of tangible surfaces seen in immersive \OST-\AR and touched directly with the index finger.
In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of real surfaces seen in immersive \OST-\AR and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and the finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.
The results showed that participants consistently identified and matched clusters of visual and haptic textures with similar perceived roughness.
The texture rankings indeed showed that participants perceived the roughness of the haptic textures as very similar, but less so for the visual textures, and the haptic roughness perception dominated the final roughness ranking of the original visuo-haptic pairs.
This suggests that \AR visual textures that augment tangible surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
This suggests that \AR visual textures that augment real surfaces can be enhanced with a set of data-driven vibrotactile haptic textures in a coherent and realistic manner.
This paves the way for new \AR applications capable of augmenting a \RE with virtual visuo-haptic textures, such as visuo-haptic painting in artistic or object design context, or viewing and touching virtual objects in a museum or a showroom.
The latter is illustrated in \figref{experiment/use_case}, where a user applies different visuo-haptic textures to a wall, in an interior design scenario, to compare them visually and by touch.

View File

@@ -1,14 +1,14 @@
\section{Introduction}
\label{intro}
Most of the haptic augmentations of tangible surfaces using wearable haptic devices, including roughness of textures (\secref[related_work]{texture_rendering}), have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR.
Still, it is known that the visual rendering of a tangible can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in the perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
Most of the haptic augmentations of real surfaces using wearable haptic devices, including roughness of textures (\secref[related_work]{texture_rendering}), have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR.
Still, it is known that the visual rendering of an object can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in the perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
Indeed, in \AR, the user can see their own hand touching the surface, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.
In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{tangible surface whose haptic roughness is augmented} with wearable haptics. %voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the tangible surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{real surface whose haptic roughness is augmented} with wearable haptics. %voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
We evaluated, in a \textbf{user study with psychophysical methods and an extensive questionnaire}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
To control for the influence of the visual rendering, the tangible surface was not visually augmented and stayed the same in all conditions.
To control for the influence of the visual rendering, the real surface was not visually augmented and stayed the same in all conditions.
\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -21,7 +21,7 @@ We then present the results obtained, discuss them, and outline recommendations
%First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual \AR/\VR headset to provide a coherent multimodal visuo-haptic augmentation of the \RE.
%An experimental setup is then presented to compare haptic roughness augmentation with an optical \AR headset (Microsoft HoloLens~2) that can be transformed into a \VR headset using a cardboard mask.
%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a tangible surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a real surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
\bigskip

View File

@@ -1,9 +1,9 @@
\section{User Study}
\label{experiment}
%The visuo-haptic rendering system, described in \secref[vhar_system]{method}, allows free exploration of virtual vibrotactile textures on tangible surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in \AR or \VR.
%The visuo-haptic rendering system, described in \secref[vhar_system]{method}, allows free exploration of virtual vibrotactile textures on real surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in \AR or \VR.
%
%The user study aimed to investigate the effect of visual hand rendering in \AR or \VR on the perception of roughness texture augmentation of a touched tangible surface.
%The user study aimed to investigate the effect of visual hand rendering in \AR or \VR on the perception of roughness texture augmentation of a touched real surface.
In a \TIFC task (\secref[related_work]{sensations_perception}), participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (\level{Real}, \figref{experiment/real}), in \AR with a realistic virtual hand superimposed on the real hand (\level{Mixed}, \figref{experiment/mixed}), and in \VR with the same virtual hand as an avatar (\level{Virtual}, \figref{experiment/virtual}).
As vision is an important source of information and influence for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}, the touched surface was kept a visually uniform white so as not to influence the perception; thus only the visual aspect of the hand and the surrounding environment was changed.
@@ -52,7 +52,7 @@ The user study was held in a quiet room with no windows.
\begin{subfigs}{setup}{Visuo-haptic textures rendering setup. }[][
\item HoloLens~2 \OST-\AR headset, the two cardboard masks to switch the real or virtual environments with the same field of view, and the \ThreeD-printed piece for attaching the masks to the headset.
\item User exploring a virtual vibrotactile texture on a tangible sheet of paper.
\item User exploring a virtual vibrotactile texture on a real sheet of paper.
]
\subfigsheight{48.5mm}
\subfig{experiment/headset}

View File

@@ -15,7 +15,7 @@ Thus, compared to no visual rendering (\level{Real}), the addition of a visual r
Differences in user behaviour were also observed between the visual renderings (but not between the haptic textures).
On average, participants responded faster (\percent{-16}), explored textures at a greater distance (\percent{+21}) and at a higher speed (\percent{+16}) without visual augmentation (\level{Real} rendering) than in \VR (\level{Virtual} rendering) (\figref{results_finger}).
The \level{Mixed} rendering was always in between, with no significant difference from the other two.
This suggests that touching a virtual vibrotactile texture on a tangible surface with a virtual hand in \VR is different from touching it with one's own hand: users were more cautious or less confident in their exploration in \VR.
This suggests that touching a virtual vibrotactile texture on a real surface with a virtual hand in \VR is different from touching it with one's own hand: users were more cautious or less confident in their exploration in \VR.
This does not seem to be due to the realism of the virtual hand or the environment, nor to the control of the virtual hand, all of which were rated high to very high by the participants (\secref{results_questions}) in both the \level{Mixed} and \level{Virtual} renderings.
The evaluation of the vibrotactile device and the textures was also the same between the visual renderings, with a high sense of control, a good realism and a low perceived latency of the textures (\secref{results_questions}).
Conversely, the perceived latency of the virtual hand (\response{Hand Latency} question) seemed to be related to the perceived roughness of the textures (with the \PSEs).
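The \PSEs discussed here are typically obtained by fitting a psychometric function to the forced-choice responses. A minimal sketch with hypothetical data (assuming SciPy; `pse` is the comparison level judged rougher than the reference half the time):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Psychometric function: probability of judging the comparison rougher."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical 2AFC data: comparison texture amplitudes (relative to the
# reference) and the proportion of "comparison rougher" responses.
amplitudes = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
p_rougher = np.array([0.10, 0.30, 0.55, 0.80, 0.95])

# Fit the PSE (50% point) and slope of the psychometric curve.
(pse, slope), _ = curve_fit(logistic, amplitudes, p_rougher, p0=[1.0, 5.0])
print(round(pse, 2))  # comparison amplitude perceived as rough as the reference
```

Shifts of the fitted PSE between visual rendering conditions are what indicates a change in perceived roughness.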

View File

@@ -2,8 +2,8 @@
\label{conclusion}
In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, being either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of tangible surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the tangible surface being touched.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the real surface being touched.
With an immersive \AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.
@@ -17,6 +17,6 @@ This study suggests that attention should be paid to the respective latencies of
Latencies should be measured \cite{friston2014measuring}, minimized to an acceptable level for users and kept synchronised with each other \cite{diluca2019perceptual}.
It also seems that the visual aspect of the hand or the environment in itself has little effect on the perception of haptic feedback, but the degree of visual reality-virtuality can affect the sensation of asynchrony between the latencies, even though they remain identical.
When designing for wearable haptics or integrating it into \AR/\VR, it seems important to test its perception in real, augmented and virtual environments.
%With a better understanding of how visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
%With a better understanding of how visual factors influence the perception of haptically augmented real objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
%Finally, a visual hand representation in OST-\AR together with wearable haptics should be avoided until acceptable tracking latencies are achieved, as was also observed for \VO interaction with the bare hand \cite{normand2024visuohaptic}.