Fix acronyms

@@ -1,4 +1,4 @@
-% Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in AR and VR, which we aim to investigate in this work.
+% Even before manipulating a visual representation to induce a haptic sensation, shifts and latencies between user input and co-localised visuo-haptic feedback can be experienced differently in \AR and \VR, which we aim to investigate in this work.
 
 %Imagine you're an archaeologist or in a museum, and you want to examine an ancient object.
 %
@@ -10,7 +10,7 @@
 %
 %Such tactile augmentation is made possible by wearable haptic devices, which are worn directly on the finger or hand and can provide a variety of sensations on the skin, while being small, light and discreet \cite{pacchierotti2017wearable}.
 %
-Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in VR \cite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or AR \cite{maisto2017evaluation,meli2018combining,teng2021touch}.
+Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in \VR \cite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or \AR \cite{maisto2017evaluation,meli2018combining,teng2021touch}.
 %
 They have also been used to alter the perceived roughness, stiffness, friction, and local shape of real tangible objects \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering}.
 %
@@ -18,42 +18,42 @@ Such techniques place the actuator \emph{close} to the point of contact with the
 %
 This combined use of wearable haptics with tangible objects enables a haptic \emph{augmented} reality (HAR) \cite{bhatia2024augmenting} that can provide rich and varied haptic feedback.
 
-The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic AR has been little explored with VR and (visual) AR \cite{choi2021augmenting}.
+The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic \AR has been little explored with \VR and (visual) \AR \cite{choi2021augmenting}.
 %
-Although AR and VR are closely related, they have significant differences that can affect the user experience \cite{genay2021virtual,macedo2023occlusion}.
+Although \AR and \VR are closely related, they have significant differences that can affect the user experience \cite{genay2021virtual,macedo2023occlusion}.
 %
-%By integrating visual virtual content into the real environment, AR keeps the hand of the user, the haptic devices worn and the tangibles touched visible, unlike VR where they are hidden by immersing the user into a visual virtual environment.
+%By integrating visual virtual content into the real environment, \AR keeps the hand of the user, the haptic devices worn and the tangibles touched visible, unlike \VR where they are hidden by immersing the user into a visual virtual environment.
 %
-%Current AR systems also suffer from display and rendering limitations not present in VR, affecting the user experience with virtual content that may be less realistic or inconsistent with the real augmented environment \cite{kim2018revisiting,macedo2023occlusion}.
+%Current \AR systems also suffer from display and rendering limitations not present in \VR, affecting the user experience with virtual content that may be less realistic or inconsistent with the real augmented environment \cite{kim2018revisiting,macedo2023occlusion}.
 %
 It therefore seems necessary to investigate and understand the potential effect of these differences in visual rendering on the perception of haptically augmented tangible objects.
 %
-Previous works have shown, for example, that a virtual piston rendered with a force-feedback haptic system is perceived as less stiff when seen in AR than in VR \cite{gaffary2017ar}, or when the visual rendering is ahead of the haptic rendering \cite{diluca2011effects,knorlein2009influence}.
+Previous works have shown, for example, that a virtual piston rendered with a force-feedback haptic system is perceived as less stiff when seen in \AR than in \VR \cite{gaffary2017ar}, or when the visual rendering is ahead of the haptic rendering \cite{diluca2011effects,knorlein2009influence}.
 %
-%Taking our example from the beginning of this introduction, you now want to learn more about the context of the discovery of the ancient object or its use at the time of its creation by immersing yourself in a virtual environment in VR.
+%Taking our example from the beginning of this introduction, you now want to learn more about the context of the discovery of the ancient object or its use at the time of its creation by immersing yourself in a virtual environment in \VR.
 %
-%But how different is the perception of the haptic augmentation in AR compared to VR, with a virtual hand instead of the real hand?
+%But how different is the perception of the haptic augmentation in \AR compared to \VR, with a virtual hand instead of the real hand?
 
-The goal of this paper is to study the role of the visual rendering of the hand (real or virtual) and its environment (AR or VR) on the perception of a tangible surface whose texture is augmented with a wearable vibrotactile device worn on the finger.
+The goal of this paper is to study the role of the visual rendering of the hand (real or virtual) and its environment (\AR or \VR) on the perception of a tangible surface whose texture is augmented with a wearable vibrotactile device worn on the finger.
 %
 We focus on the perception of roughness, one of the main tactile sensations of materials \cite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations \cite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
 %
-By understanding how these visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with AR can be better applied, and new visuo-haptic renderings adapted to AR can be designed.
+By understanding how these visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied, and new visuo-haptic renderings adapted to \AR can be designed.
 
 Our contributions are:
 %
 \begin{itemize}
 \item A system for rendering virtual vibrotactile roughness textures in real time on a tangible surface touched directly with the finger, integrated with an immersive visual AR/VR headset to provide a coherent multimodal visuo-haptic augmentation of the real environment.
-\item A psychophysical study with 20 participants to evaluate the perception of these virtual roughness textures in three visual rendering conditions: without visual augmentation, with a realistic virtual hand rendering in AR, and with the same virtual hand in VR.
+\item A psychophysical study with 20 participants to evaluate the perception of these virtual roughness textures in three visual rendering conditions: without visual augmentation, with a realistic virtual hand rendering in \AR, and with the same virtual hand in \VR.
 \end{itemize}
 %First, we present a system for rendering virtual vibrotactile textures in real time without constraints on hand movements and integrated with an immersive visual AR/VR headset to provide a coherent multimodal visuo-haptic augmentation of the real environment.
 %
-%An experimental setup is then presented to compare haptic roughness augmentation with an optical AR headset (Microsoft HoloLens~2) that can be transformed into a VR headset using a cardboard mask.
+%An experimental setup is then presented to compare haptic roughness augmentation with an optical \AR headset (Microsoft HoloLens~2) that can be transformed into a \VR headset using a cardboard mask.
 %
-%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a tangible surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in AR, and (3) with the same virtual hand in VR.
+%We then conduct a psychophysical study with 20 participants, where various virtual haptic textures on a tangible surface directly touched with the finger are compared in a two-alternative forced choice (2AFC) task in three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
 
 %\fig[1]{teaser/teaser2}{%
 % Vibrotactile textures were rendered in real time on a real surface using a wearable vibrotactile device worn on the finger.
 % %
-% Participants explored this haptic roughness augmentation with (Real) their real hand alone, (Mixed) a realistic virtual hand overlay in AR, and (Virtual) the same virtual hand in VR.
+% Participants explored this haptic roughness augmentation with (Real) their real hand alone, (Mixed) a realistic virtual hand overlay in \AR, and (Virtual) the same virtual hand in \VR.
 %}
 
@@ -2,7 +2,7 @@
 %
 In this section, we describe a system for rendering vibrotactile roughness textures in real time on any tangible surface touched directly with the index fingertip, with no constraints on hand movement, using a simple camera to track the finger pose.
 %
-We also describe how to pair this tactile rendering with the visual display of an immersive AR or VR headset to provide a coherent, multimodal visuo-haptic augmentation of the real environment.
+We also describe how to pair this tactile rendering with the visual display of an immersive \AR or \VR headset to provide a coherent, multimodal visuo-haptic augmentation of the real environment.
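As an aside on the camera-based finger tracking mentioned above: the system tracks a fiducial marker worn on the finger (shown in the setup figure later in this section), but the diff does not name a tracking library. The Python sketch below uses OpenCV's ArUco module purely as an illustration; the dictionary, marker size and camera intrinsics are assumptions, not the authors' values.

```python
import cv2
import numpy as np

# Illustrative marker-based pose estimation; names and values are assumed.
MARKER_SIZE = 0.02  # marker side length in metres (assumption)
# 3D marker corners in the marker frame, in ArUco corner order
# (top-left, top-right, bottom-right, bottom-left).
OBJ_PTS = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * (MARKER_SIZE / 2)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def marker_pose(frame, camera_matrix, dist_coeffs):
    """Return (rvec, tvec) of the first detected marker, or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None or len(ids) == 0:
        return None
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, corners[0][0],
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```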
 
 \section{Principle}
 \label{principle}
@@ -36,7 +36,7 @@ The system is composed of three main components: the pose estimation of the trac
 
 \begin{subfigs}{setup}{Visuo-haptic texture rendering system setup. }[][
 \item HapCoil-One voice-coil actuator with a fiducial marker on top attached to a participant's right index finger.
-\item HoloLens~2 AR headset, the two cardboard masks used to switch between the real and virtual environments with the same field of view, and the 3D-printed piece for attaching the masks to the headset.
+\item HoloLens~2 \AR headset, the two cardboard masks used to switch between the real and virtual environments with the same field of view, and the 3D-printed piece for attaching the masks to the headset.
 \item User exploring a virtual vibrotactile texture on a tangible sheet of paper.
 ]
 \subfig[0.325]{device}
@@ -70,7 +70,7 @@ In addition, the pose and size of the virtual textures are defined on the virtua
 %
 During the experiment, the system uses marker pose estimates to align the virtual models with their real-world counterparts. %, according to the condition being tested.
 %
-This makes it possible to detect whether a finger touches a virtual texture, using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the real environment (\figref{renderings}), using the considered AR or VR headset.
+This makes it possible to detect whether a finger touches a virtual texture, using a collision detection algorithm (Nvidia PhysX), and to show the virtual elements and textures in real time, aligned with the real environment (\figref{renderings}), using the considered \AR or \VR headset.
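The actual touch test runs through Nvidia PhysX inside Unity, as the changed line above states. Purely as a sketch of the underlying idea, not the authors' implementation, a plane-proximity check against the textured patch could look like this in Python; the contact tolerance, surface frame and texture bounds are all invented for illustration.

```python
import numpy as np

CONTACT_EPS = 0.003  # assumed contact tolerance in metres

def finger_touches_texture(finger_pos, plane_point, plane_normal, tex_bounds):
    """True if the fingertip lies on the textured patch of the surface.

    Assumes the texture patch is axis-aligned in the surface plane's x/y.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    # Signed distance from the fingertip to the tangible surface plane.
    dist = np.dot(finger_pos - plane_point, n)
    if abs(dist) > CONTACT_EPS:
        return False
    # Project onto the plane and test against the texture's 2D extent.
    proj = finger_pos - dist * n
    (xmin, xmax), (ymin, ymax) = tex_bounds
    return xmin <= proj[0] <= xmax and ymin <= proj[1] <= ymax
```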
 
 In our implementation, the virtual hand and environment are designed with Unity and the Mixed Reality Toolkit (MRTK).
 %
@@ -80,7 +80,7 @@ It was chosen over VST-AR because OST-AR only adds virtual content to the real e
 %
 Indeed, one of our objectives (\secref{experiment}) is to directly compare a virtual environment that replicates a real one. %, rather than a video feed that introduces many supplementary visual limitations.
 %
-To simulate a VR headset, a cardboard mask (with holes for sensors) is attached to the headset to block the view of the real environment (\figref{headset}).
+To simulate a \VR headset, a cardboard mask (with holes for sensors) is attached to the headset to block the view of the real environment (\figref{headset}).
 
 \section{Vibrotactile Signal Generation and Rendering}
 \label{texture_generation}
@@ -139,7 +139,7 @@ The tactile texture is described and rendered in this work as a one dimensional
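The hunk header above describes the tactile texture as a one-dimensional profile. A common way to render such a grating, sketched here under stated assumptions rather than as the authors' exact synthesis, is to drive the vibration frequency with the measured finger speed, $f = v/\lambda$, accumulating phase sample by sample; the sampling rate, spatial period and amplitude below are assumed values.

```python
import numpy as np

FS = 48_000      # audio sampling rate in Hz (assumption)
PERIOD = 0.002   # grating spatial period lambda in metres (assumption)
AMPLITUDE = 0.5  # output amplitude in [0, 1] (assumption)

def grating_block(speeds, phase=0.0):
    """Synthesise one block of the vibrotactile signal from finger speeds.

    speeds: per-sample finger speed estimates (m/s).
    Returns the signal block and the phase to carry into the next block,
    so consecutive blocks remain continuous.
    """
    freqs = np.asarray(speeds) / PERIOD                 # instantaneous Hz
    phases = phase + 2 * np.pi * np.cumsum(freqs) / FS  # integrated phase
    return AMPLITUDE * np.sin(phases), phases[-1] % (2 * np.pi)
```

With a model of this kind, faster strokes raise the vibration frequency proportionally, which is what makes a virtual grating feel spatially anchored to the surface rather than tied to time.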
 
 %As shown in \figref{diagram} and described above, the system includes various haptic and visual sensors and rendering devices linked by software processes for image processing, 3D rendering and audio generation.
 %
-Because the chosen AR headset is a standalone device (like most current AR/VR headsets) and cannot directly control the sound card and haptic actuator, the image capture, pose estimation and audio signal generation steps are performed on an external computer.
+Because the chosen \AR headset is a standalone device (like most current AR/VR headsets) and cannot directly control the sound card and haptic actuator, the image capture, pose estimation and audio signal generation steps are performed on an external computer.
 %
 All computation steps run in separate threads to parallelise them and reduce latency, and are synchronised with the headset via a local network and the ZeroMQ library.
 %
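As a minimal sketch of this local-network synchronisation (the text names ZeroMQ but not the socket pattern or message format, so both are assumptions), the external computer might publish the latest finger pose to the headset application like this:

```python
import time
import zmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5555")  # assumed port; headset app subscribes over the LAN

def publish_pose(position, velocity):
    # Timestamped pose update; a subscriber can set zmq.CONFLATE to keep
    # only the most recent message and minimise end-to-end latency.
    pub.send_json({"t": time.time(),
                   "pos": list(position),
                   "vel": list(velocity)})
```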
@@ -157,7 +157,7 @@ The haptic loop also includes the voice-coil latency \qty{15}{\ms} (as specified
 %
 The total haptic latency is below the \qty{60}{\ms} detection threshold in vibrotactile feedback \cite{okamoto2009detectability}.
 %
-The total visual latency can be considered slightly high, yet it is typical for an AR rendering involving vision-based tracking \cite{knorlein2009influence}.
+The total visual latency can be considered slightly high, yet it is typical for an \AR rendering involving vision-based tracking \cite{knorlein2009influence}.
 
 The two filters also introduce a constant lag between the finger movement and the estimated position and velocity, measured at \qty{160 +- 30}{\ms}.
 %
 
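The two position/velocity filters themselves are not specified in this excerpt. The sketch below uses a first-order low-pass (exponential smoothing) with an assumed cutoff simply to illustrate why any causal smoothing of the marker poses trades noise for a constant lag like the \qty{160 +- 30}{\ms} reported above.

```python
import math

class LowPass:
    """First-order low-pass filter (exponential smoothing), for illustration."""

    def __init__(self, cutoff_hz, dt):
        # Smoothing factor of a first-order filter at sample period dt.
        self.alpha = dt / (dt + 1.0 / (2.0 * math.pi * cutoff_hz))
        self.state = None

    def update(self, x):
        # Lower cutoffs smooth more but respond later: the source of the lag.
        if self.state is None:
            self.state = x
        else:
            self.state += self.alpha * (x - self.state)
        return self.state
```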
@@ -4,4 +4,4 @@
 %Summary of the research problem, method, main findings, and implications.
 
 We designed and implemented a system for rendering virtual haptic grating textures on a real tangible surface touched directly with the fingertip, using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger. %, and allowing free explorative movements of the hand on the surface.
-This tactile feedback was integrated with an immersive visual virtual environment, using an OST-AR headset, to provide users with a coherent multimodal visuo-haptic augmentation of the real environment, which can be switched between an AR and a VR view.
+This tactile feedback was integrated with an immersive visual virtual environment, using an OST-AR headset, to provide users with a coherent multimodal visuo-haptic augmentation of the real environment, which can be switched between an \AR and a \VR view.