Complete vhar_system conclusion
@@ -248,28 +248,29 @@ Finally, we describe how multimodal visual and haptic feedback have been combine
We then address each of our two research axes in a dedicated part.

\noindentskip
In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of tangible surfaces.
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR), and the textures (coherent, different, or not shown) affects the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.

In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment tangible surfaces.%, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The tracking of the real hand and the environment is done using a marker-based technique, and the visual rendering is done using the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.

In \textbf{\chapref{xr_perception}} we investigate in a user study how the perception of haptic texture augmentations differs in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
We use psychophysical methods to measure user perception, and extensive questionnaires to understand how this perception is affected by the visual virtuality of the hand and the environment.

In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
The virtual textures are paired visual and tactile models of real surfaces \cite{culbertson2014one} that we render as visual and haptic overlays on the touched augmented surfaces.
Our objective is to assess the perceived realism, coherence, and roughness of the combination of nine representative visuo-haptic texture pairs.

\noindentskip
In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving the free and direct hand manipulation of \VOs using visuo-haptic augmentations of the hand as interaction feedback in immersive \OST-\AR.

In \textbf{\chapref{visual_hand}} we investigate in a user study six visual renderings as hand augmentations, selected from the most popular hand renderings in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: push-and-slide and grasp-and-place a \VO directly with the hand.

In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study two vibrotactile contact techniques, provided at four different positions on the user's hand, as haptic rendering of the hand manipulation of \VOs.
They are compared to the two most representative visual hand renderings from the previous chapter, resulting in sixteen visuo-haptic hand renderings that are evaluated within the same experimental setup and design.

\noindentskip
In \textbf{\chapref{conclusion}} we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.
@@ -9,7 +9,7 @@ In this chapter, we propose a \textbf{system for rendering visual and haptic vir
It is implemented with the immersive \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip, \secref[related_work]{vhar_haptics}).
The visuo-haptic augmentations can be \textbf{viewed from any angle} and \textbf{explored freely with the bare finger}, as if they were real textures.
To ensure both real-time and reliable renderings, the hand and the tangibles are tracked using a webcam and marker-based tracking.
The haptic textures are rendered as a vibrotactile signal representing a patterned grating texture that is synchronized with the finger movement on the augmented tangible surface.

\noindentskip The contributions of this chapter are:
\begin{itemize}
@@ -19,5 +19,13 @@ The haptic textures are rendered as a real-time vibrotactile signal representing
\noindentskip In the remainder of this chapter, we describe the principles of the system, how the real and virtual environments are registered, the generation of the vibrotactile textures, and measures of visual and haptic rendering latencies.

\bigskip

\begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
\item HapCoil-One voice-coil actuator with a fiducial marker on top attached to the middle-phalanx of the user's index finger.
\item Our implementation of the system using a Microsoft HoloLens~2, a webcam for tracking the hand and the tangible surfaces, and an external computer for processing the tracking data and rendering the haptic textures.
]
\subfigsheight{60mm}
\subfig{device}
\subfig{apparatus}
\end{subfigs}
@@ -35,15 +35,6 @@ The system consists of three main components: the pose estimation of the tracked
\subsection{Pose Estimation}
\label{pose_estimation}

A \qty{2}{\cm} AprilTag fiducial marker \cite{wang2016apriltag} is glued to the top of the actuator (\figref{device}) to track the finger pose with a camera (StreamCam, Logitech) placed above the experimental setup, capturing \qtyproduct{1280 x 720}{px} images at \qty{60}{\hertz} (\figref{apparatus}).
Other markers are placed on the tangible surfaces to be augmented (\figref{setup}), allowing the relative position of the finger with respect to the surfaces to be estimated.
Contrary to similar work, using vision-based tracking both frees the hand movements and allows any tangible surface to be augmented.
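The relative pose computation above can be sketched as follows. This is a minimal illustration under our own assumptions (the function names and example poses are hypothetical, and we assume the tag detector yields each marker's rotation matrix and translation in the camera frame), not the thesis implementation:

```python
import numpy as np

def pose_to_matrix(R, t):
    # Build a 4x4 homogeneous transform from a rotation matrix and a translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def finger_in_surface_frame(T_cam_finger, T_cam_surface):
    # Express the finger-marker pose in the surface-marker frame:
    # T_surface_finger = inv(T_cam_surface) @ T_cam_finger.
    return np.linalg.inv(T_cam_surface) @ T_cam_finger

# Example: both markers facing the camera, the finger marker 5 cm
# closer to the camera than the surface marker.
T_cam_surface = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.50]))
T_cam_finger = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.45]))
T_rel = finger_in_surface_frame(T_cam_finger, T_cam_surface)
print(T_rel[:3, 3])  # finger position relative to the surface marker
```

In the running system this update would be repeated at the camera frame rate, with the rotation part coming from the tag detection.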
@@ -72,7 +63,7 @@ A \VST-\AR or a \VR headset could have been used as well.
\subsection{Vibrotactile Signal Generation and Rendering}
\label{texture_generation}

A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{The HapCoil-One's characteristics are described in its data sheet: \url{https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
The voice-coil actuator is encased in a \ThreeD printed plastic shell and firmly attached to the middle phalanx of the user's index finger with a Velcro strap, to enable the fingertip to directly touch the environment (\figref{device}).
The actuator is driven by a class D audio amplifier (XY-502 / TPA3116D2, Texas Instruments). %, which has proven to be an effective type of amplifier for driving moving-coil \cite{mcmahan2014dynamic}.
The amplifier is connected to the audio output of a computer that generates the signal using the WASAPI driver in exclusive mode and the NAudio library\footnoteurl{https://github.com/naudio/NAudio}.
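The position-based synthesis of such a grating signal can be sketched as follows. This is only an illustration of the standard sinusoidal-grating model from the texture-rendering literature, with parameter values of our own choosing, not the exact signal chain of the system:

```python
import numpy as np

def grating_signal(finger_positions, wavelength=0.002, amplitude=1.0):
    # Drive the actuator with a sinusoid whose phase advances with the
    # finger position x(t) along the surface:
    #   s(t) = A * sin(2*pi * x(t) / wavelength)
    # so the vibration frequency equals finger speed / wavelength.
    phase = 2.0 * np.pi * np.asarray(finger_positions) / wavelength
    return amplitude * np.sin(phase)

# Example: sliding at a constant 0.1 m/s over a 2 mm grating gives a
# 50 Hz vibration (0.1 / 0.002), here sampled at the audio rate.
sample_rate = 44100
t = np.arange(0, 0.1, 1.0 / sample_rate)
x = 0.1 * t
signal = grating_signal(x)
```

A real-time version would update x(t) from the tracker between camera frames and stream the resulting samples to the audio output.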
@@ -3,6 +3,11 @@
%Summary of the research problem, method, main findings, and implications.

In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real tangible surface.
The perceived roughness of the surface, touched directly with the fingertip, can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated in a direct touch context, for use with vision-based tracking of the finger, and paired it with an immersive \AR headset.

Our wearable visuo-haptic augmentation system enables any real surface to be augmented with a minimal setup.
It also allows a free exploration of the textures, as if they were real (\secref[related_work]{ar_presence}), by letting the user view them from different viewpoints and touch them with the bare finger without constraints on hand movements.
The visual latency we measured is typical of \AR systems, and the haptic latency is below the perceptual detection threshold for vibrotactile rendering.
This system forms the basis of the apparatus for the user studies presented in the next two chapters, which evaluate the user perception of these visuo-haptic texture augmentations.
BIN 2-perception/vhar-system/figures/apparatus.pdf (new binary file, not shown)
@@ -3,8 +3,8 @@ Among the various haptic texture augmentations, data-driven methods allow to cap
Databases of visuo-haptic textures have been developed in this way \cite{culbertson2014one,balasubramanian2024sens3}, but they have not yet been explored in an immersive and direct touch context with \AR and wearable haptics.

In this chapter, we investigate whether simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of tangible surfaces} in \AR can be perceived in a coherent and realistic manner, and to what extent each sensory modality contributes to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}.%, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
In a \textbf{user study}, 20 participants freely explored in direct touch the combinations of the visuo-haptic texture pairs to rate their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.

\noindentskip The contributions of this chapter are:
@@ -2,7 +2,7 @@
\label{conclusion}

In this chapter, we investigated how users perceived simultaneous and co-localized visuo-haptic texture augmentations of tangible surfaces seen in immersive \OST-\AR and touched directly with the index finger.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, the haptic roughness texture was rendered based on the \HaTT data-driven models and the finger speed.
In a user study, 20 participants rated the coherence, realism and perceived roughness of the combination of nine representative visuo-haptic texture pairs.

The results showed that participants consistently identified and matched clusters of visual and haptic textures with similar perceived roughness.
@@ -1,10 +1,8 @@
% Delivers the motivation for your paper. It explains why you did the work you did.

\noindent Most haptic augmentations of tangible surfaces using wearable haptic devices, including the roughness of textures, have been studied without visual feedback, and none have considered the influence of the visual rendering on their perception or integrated them in \AR and \VR (\secref[related_work]{texture_rendering}).
Still, it is known that the visual rendering of a tangible can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in the perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
Indeed, in \AR, the user can see their own touching hand, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.

In this chapter, we investigate the \textbf{role of the visual virtuality} of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a \textbf{tangible surface whose haptic roughness is augmented} with wearable haptics.%voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the tangible surface being touched.
We evaluated, in a \textbf{user study with psychophysical methods and extensive questionnaires}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
To control for the influence of the visual rendering, the tangible surface was not visually augmented and stayed the same in all conditions.
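As an illustration of how such psychophysical data could yield a point of subjective equality (PSE), here is a toy logistic fit by grid search; the design (constant stimuli), the data, and the function names are all hypothetical sketches, not the study's actual analysis:

```python
import numpy as np

def fit_psychometric(intensities, p_rougher):
    # Least-squares grid search of a logistic psychometric function
    # P(x) = 1 / (1 + exp(-slope * (x - pse))); the PSE is the intensity
    # judged "rougher" than the reference 50% of the time.
    intensities = np.asarray(intensities, dtype=float)
    p = np.asarray(p_rougher, dtype=float)
    best_pse, best_slope, best_err = None, None, np.inf
    for pse in np.linspace(intensities.min(), intensities.max(), 201):
        for slope in np.linspace(0.5, 20.0, 100):
            pred = 1.0 / (1.0 + np.exp(-slope * (intensities - pse)))
            err = np.sum((pred - p) ** 2)
            if err < best_err:
                best_pse, best_slope, best_err = pse, slope, err
    return best_pse, best_slope

# Synthetic example: comparison amplitudes vs. proportion judged rougher.
amps = np.array([0.5, 0.75, 1.0, 1.25, 1.5])
props = np.array([0.05, 0.20, 0.50, 0.80, 0.95])
pse, slope = fit_psychometric(amps, props)
```

Comparing the PSE obtained in each visual condition then indicates whether the visual rendering shifted the perceived roughness.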
@@ -2,7 +2,8 @@
\label{conclusion}

In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, being either real, augmented or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of tangible surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the tangible surface being touched.
With an immersive \AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.

@@ -5,19 +5,19 @@
\section{Summary}

In this thesis, entitled \enquote{\textbf{\ThesisTitle}}, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.
Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both directly touched by the hand, while preserving the freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical, and human challenges.
We have structured our research around two axes: \textbf{(I) modifying the texture perception of tangible surfaces}, and \textbf{(II) improving the manipulation of \VOs}.

\noindentskip In \partref{perception} we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment tangible surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR.
We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual virtuality of the hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.


In \chapref{vhar_system}, we presented a system for \textbf{augmenting any tangible surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
It allows \textbf{free visual and touch exploration} of the textures, as if they were real: the user can view them from different angles and touch them with the bare finger, without constraints on hand movements.
The user studies in the next two chapters are based on this system.

In \chapref{xr_perception} we explored how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, whether it is real, augmented, or virtual.
We augmented the perceived roughness of the tangible surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view.
The main limitation of this user study was the absence of a visual representation of the virtual patterned texture.
Visual information is indeed as important as haptic sensations for the perception of both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}, and their interaction in the overall perception is complex.
In particular, it remains to be investigated how the vibrotactile patterned textures can be represented visually in a convincing way, as the visuo-haptic coupling of such patterned textures is not trivial \cite{unger2011roughness}.% even with real textures \cite{klatzky2003feeling}.

\paragraph{Broader Visuo-Haptic Conditions}

Our study was conducted with an \OST-\AR headset, but the results may differ with a \VST-\AR headset, where the \RE is seen through cameras and screens, and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different.
We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal simulating a patterned texture: our objective was not to accurately reproduce real textures, but to induce different perceived roughness on the same tangible surface with well-controlled parameters.
However, more accurate models for simulating interaction with virtual textures, such as that of \textcite{unger2011roughness}, should be applied to wearable haptic augmentations.
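For reference, one common formulation of such a rendering anchors the square wave to the finger position on the surface, so that the vibration frequency follows the exploration speed (the spatial period $\lambda$ and amplitude $A$ below are placeholder parameters, not the exact values used in our studies):
\begin{equation*}
  s(t) = A \,\operatorname{sgn}\!\left[\sin\!\left(2\pi\,\frac{x(t)}{\lambda}\right)\right],
  \qquad
  f(t) = \frac{v(t)}{\lambda},
\end{equation*}
where $x(t)$ is the finger position along the surface, $v(t)$ its tangential speed, and $f(t)$ the resulting ridge-crossing frequency of the virtual grating.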
Another limitation that may have affected the perception of the haptic texture augmentations is the lack of compensation for the frequency response of the actuator and amplifier \cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
The dynamic response of the finger should also be considered, and may vary between individuals.
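Such a compensation is typically implemented as a regularized inverse filter applied to the drive signal; as an illustrative sketch of the general technique (not of our implementation):
\begin{equation*}
  \hat{S}(f) = S(f)\,\frac{\overline{H(f)}}{\lvert H(f)\rvert^{2} + \varepsilon},
\end{equation*}
where $S(f)$ is the spectrum of the desired vibration, $H(f)$ the measured frequency response of the amplifier and actuator chain, and $\varepsilon$ a small regularization term that avoids excessive gain in bands where the response is weak.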
\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}

\paragraph{Assess the Applicability of the Method}

As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a tangible surface being touched with simultaneous visual and haptic texture augmentations.
However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white tangible surfaces.
Visuo-haptic texture augmentations may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes.
The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D print before it is manufactured.
%Finally, the visual textures used were also simple color captures not meant to be used in an immersive \VE.

\paragraph{Adapt to the Specificities of Direct Touch}

The haptic textures used were captures and models of the vibrations of a hand-held probe sliding over real surfaces.
We generated the vibrotactile textures only from the finger speed \cite{culbertson2015should}, but the perceived roughness of real textures also depends on other factors, such as the contact force, angle, posture, or contact area \cite{schafer2017transfer}; their respective importance in perception is not yet fully understood \cite{richardson2022learning}.
It would be interesting to determine the importance of these factors on the perceived realism of virtual vibrotactile textures.
The virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}.