Update xr-perception chapter
%Summary of the research problem, method, main findings, and implications.
We designed and implemented a system for rendering virtual haptic grating textures on a real tangible surface touched directly with the fingertip, using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger. %, and allowing free explorative movements of the hand on the surface.
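As an illustration of this kind of grating rendering, the following is a minimal sketch, assuming a position-based sinusoidal grating model in which the vibration amplitude $A$ and the spatial wavelength of the grating are free parameters; the function name, signature, and default values are hypothetical and not the system's actual implementation:

```python
import numpy as np

def grating_signal(x_positions, amplitude=1.0, wavelength=0.002):
    """Hypothetical voice-coil drive signal for a virtual grating texture.

    x_positions : fingertip positions along the surface (metres)
    amplitude   : vibration amplitude A (arbitrary drive units)
    wavelength  : spatial period of the grating (metres)
    """
    x = np.asarray(x_positions, dtype=float)
    # Position-based rendering: the signal depends on where the finger is,
    # not on elapsed time, so the spatial layout of the virtual grating
    # stays consistent under free, variable-speed exploration.
    return amplitude * np.sin(2.0 * np.pi * x / wavelength)
```

In such a scheme, varying only the amplitude while keeping the wavelength fixed would produce different roughness levels of the same underlying texture.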
We investigated virtual textures that modify the roughness perception of real, tangible surfaces, using a wearable vibrotactile device worn on the finger.
%
This tactile feedback was integrated with an immersive visual virtual environment, using an OST-AR headset, to provide users with a coherent multimodal visuo-haptic augmentation of the real environment, which can be switched between an \AR and a \VR view.
%We studied how different such wearable haptic augmented textures are perceived when touched with a virtual hand instead of one's own hand, and when the hand and its environment are visually rendered in AR or VR.
%
We then investigated, with a psychophysical user study, the effect of the visual rendering of the hand and its environment on the roughness perception of the designed tactile texture augmentations: without visual augmentation (\level{Real} rendering), in \AR with a realistic virtual hand superimposed on the real hand (\level{Mixed} rendering), and in \VR with the same virtual hand as an avatar (\level{Virtual} rendering).
To this end, we first designed and implemented a visuo-haptic texture rendering system that allows free exploration of the augmented surface using a visual AR/VR headset.
%to render virtual vibrotactile textures on any tangible surface, allowing free exploration of the surface, and integrated them with an immersive visual OST-AR headset, that could be switched to a VR view.
%
%Only the amplitude $A$ varied between the reference and comparison textures to create the different levels of roughness.
%This provided a coherent and synchronised multimodal visuo-haptic augmentation of the real environment, which could also be switched between an AR and a VR view.
%
%Participants were not informed that there were reference and comparison textures, and
No texture was represented visually, to avoid any visual influence on tactile perception \cite{bergmanntiest2007haptic,yanagisawa2015effects}.
We then conducted a psychophysical user study with 20 participants to assess the roughness perception of these virtual texture augmentations directly touched with the finger (1) without visual augmentation, (2) with a realistic virtual hand rendering in AR, and (3) with the same virtual hand in VR.
%
%The results showed that the visual rendering of the hand and environment had a significant effect on the perception of haptic textures and the exploration behaviour of the participants.
%
The textures were on average perceived as \enquote{rougher}, and with a higher sensitivity, when touched with the real hand alone than with a virtual hand in either AR or VR.
%
We hypothesised that this difference in perception was due to the \emph{perceived latency} between the finger movements and the visual, haptic and proprioceptive feedback, which was the same in all visual renderings but more noticeable in AR and VR. % than without visual augmentation.
%
With a better understanding of how visual factors influence the perception of haptically augmented tangible objects, the many existing wearable haptic systems that have not yet been fully explored in AR can be better applied, and new visuo-haptic renderings adapted to AR can be designed.