We then address each of our two research axes in a dedicated part.
\noindentskip
In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of tangible surfaces.
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment tangible surfaces.
The haptic textures represent periodic grating patterns, rendered as a real-time vibrotactile signal by a wearable voice-coil actuator worn on the middle phalanx of the finger touching the surface.
The real hand and the environment are tracked using a marker-based technique, and their virtual counterparts are rendered visually using the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
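The grating rendering described above can be sketched as follows. This is a minimal illustrative model under assumed parameters (a sinusoidal grating driven by the tracked finger position; the actual signal model, amplitudes and actuator interface of our system are not specified here): a finger sliding at speed $v$ over a grating of spatial period $\lambda$ feels a vibration of temporal frequency $v/\lambda$.

```python
import math


def grating_vibration(x_m: float, spatial_period_m: float,
                      amplitude: float = 1.0) -> float:
    """Vibration drive for a sinusoidal grating at finger position x_m (meters).

    Sliding at speed v over a grating of spatial period lambda produces
    a vibration of temporal frequency v / lambda.
    """
    return amplitude * math.sin(2.0 * math.pi * x_m / spatial_period_m)


def render_signal(speed_mps: float, spatial_period_m: float,
                  duration_s: float, sample_rate_hz: int = 2000) -> list[float]:
    """Sample the vibrotactile signal for a constant-speed swipe."""
    n = int(duration_s * sample_rate_hz)
    return [grating_vibration(speed_mps * (i / sample_rate_hz), spatial_period_m)
            for i in range(n)]


# Example: sliding at 10 cm/s over a 2 mm grating yields a 50 Hz vibration.
signal = render_signal(speed_mps=0.10, spatial_period_m=0.002,
                       duration_s=0.1, sample_rate_hz=2000)
```

In a real-time system the finger position would come from the hand tracker at each haptic frame rather than from a constant assumed speed.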
In \textbf{\chapref{xr_perception}} we investigate, in a user study, how the perception of haptic texture augmentations differs in \AR \vs \VR and when touched by a virtual hand \vs one's own hand.
We use psychophysical methods to measure user perception, and extensive questionnaires to understand how this perception is affected by the visual virtuality of the hand and the environment.
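As a generic, hypothetical illustration of such psychophysical methods (the specific procedure used in the study is not detailed in this outline), a one-up/two-down adaptive staircase converges on the stimulus level that is perceived correctly about 70.7\% of the time:

```python
def staircase(respond, start: float, step: float, n_reversals: int = 8) -> float:
    """One-up/two-down adaptive staircase (Levitt, 1971).

    `respond(level)` returns True when the participant answers correctly
    at `level`. The level decreases after two consecutive correct
    responses and increases after any incorrect one; the mean of the
    reversal levels estimates the ~70.7% threshold.
    """
    level, correct_streak, direction = start, 0, 0
    reversals: list[float] = []
    while len(reversals) < n_reversals:
        if respond(level):
            correct_streak += 1
            if correct_streak == 2:           # two correct -> make it harder
                correct_streak = 0
                if direction == +1:           # direction flipped: reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:                                 # one wrong -> make it easier
            correct_streak = 0
            if direction == -1:               # direction flipped: reversal
                reversals.append(level)
            direction = +1
            level += step
    return sum(reversals) / len(reversals)


# Example with a deterministic simulated observer (correct iff level >= 5.0):
estimate = staircase(lambda level: level >= 5.0, start=10.0, step=0.5)
```

With this noiseless observer the staircase oscillates between 4.5 and 5.0, so the estimate settles at 4.75, within half a step of the true threshold of 5.0.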
In \textbf{\chapref{vhar_textures}} we evaluate in a user study the perception of visuo-haptic texture augmentations, touched directly with one's own hand in \AR.
The virtual textures are paired visual and tactile models of real surfaces \cite{culbertson2014one} that we render as visual and haptic overlays on the touched augmented surfaces.
Our objective is to assess the perceived realism, coherence and roughness of nine representative combinations of visual and haptic textures.
\noindentskip
In \textbf{\partref{manipulation}} we describe our contributions to the second axis of research: improving the free and direct hand manipulation of \VOs in immersive \OST-\AR, using visuo-haptic augmentations of the hand as interaction feedback.
In \textbf{\chapref{visual_hand}} we investigate in a user study six visual renderings as hand augmentations, selected among the most popular hand renderings in the \AR literature.
Using the \OST-\AR headset Microsoft HoloLens~2, we evaluate their effect on user performance and experience in two representative manipulation tasks: pushing-and-sliding and grasping-and-placing a \VO directly with the hand.
In \textbf{\chapref{visuo_haptic_hand}} we evaluate in a user study two vibrotactile contact techniques, delivered at four different positions on the user's hand, as haptic rendering of the hand manipulation of \VOs.
They are combined with the two most representative visual hand renderings from the previous chapter, resulting in sixteen visuo-haptic hand renderings that are evaluated within the same experimental setup and design.
\noindentskip
In \textbf{\chapref{conclusion}} we conclude this thesis and discuss short-term future work and long-term perspectives for each of our contributions and research axes.