Remove comments
@@ -7,8 +7,6 @@ To create the illusion of touching a pattern with a fixed spatial period, the fr
Previous work either used mechanical systems to track the movement at high frequency \cite{strohmeier2017generating,friesen2024perceived}, or required the user to move at a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,ujitoko2019modulating}.
However, this method has not yet been integrated into an \AR headset context, where the user should be able to freely touch and explore the visuo-haptic texture augmentations.
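The underlying principle can be summarized as follows (a schematic formulation in our own notation, not taken from the cited works): for a virtual grating of spatial period $\lambda$ explored with a finger displacement $x(t)$ along the surface, the vibrotactile command signal is
\begin{equation*}
    a(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
    \qquad
    f(t) = \frac{v(t)}{\lambda},
\end{equation*}
where $A$ is the vibration amplitude, so the instantaneous frequency $f(t)$ follows the finger speed $v(t) = \dot{x}(t)$ and the signal completes one cycle per spatial period crossed, whatever the exploration speed.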

%which either constrained hand to a constant speed to keep the signal frequency constant \cite{asano2015vibrotactile,friesen2024perceived}, or used mechanical sensors attached to the hand \cite{friesen2024perceived,strohmeier2017generating}

In this chapter, we propose a \textbf{system for rendering visual and haptic virtual textures that augment real surfaces}.
It is implemented with the \OST-\AR headset Microsoft HoloLens~2 and a wearable vibrotactile (voice-coil) device worn on the outside of the finger (not covering the fingertip).
The visuo-haptic augmentations rendered with this design allow a user to \textbf{see the textures from any angle} and \textbf{explore them freely with the bare finger}, as if they were real textures.

@@ -1,8 +1,6 @@
\section{Conclusion}
\label{conclusion}

%Summary of the research problem, method, main findings, and implications.

In this chapter, we designed and implemented a system for rendering virtual visuo-haptic textures that augment a real surface.
When the surface is touched directly with the fingertip, its perceived roughness can be increased using a wearable vibrotactile voice-coil device mounted on the middle phalanx of the finger.
We adapted the 1D sinusoidal grating rendering method, common in the literature but not yet integrated into a direct-touch context, for use with vision-based pose estimation of the finger and paired it with an \OST-\AR headset.
@@ -12,9 +10,3 @@ It also allows a free exploration of the textures, as if they were real (\secref
The visual latency we measured is typical of \AR systems, and the haptic latency is below the perceptual detection threshold for vibrotactile rendering.

This system forms the basis of the apparatus for the user studies presented in the next two chapters, which evaluate the user perception of these visuo-haptic texture augmentations.

%\noindentskip This work was presented and published at the VRST 2024 conference:
%
%Erwan Normand, Claudio Pacchierotti, Eric Marchand, and Maud Marchal.
%\enquote{How Different Is the Perception of Vibrotactile Texture Roughness in Augmented versus Virtual Reality?}.
%In: \textit{ACM Symposium on Virtual Reality Software and Technology}. Trier, Germany, October 2024. pp. 287--296.

@@ -10,7 +10,7 @@ Databases of visuo-haptic textures have been developed in this way \cite{culbert

In this chapter, we consider simultaneous and \textbf{co-localized visual and wearable haptic texture augmentation of real surfaces} with an \OST-\AR headset and wearable vibrotactile feedback.
We investigate how these textures can be perceived in a coherent and realistic manner, and to what extent each sensory modality contributes to the overall perception of the augmented texture.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}. %, an \OST-\AR headset, and a wearable voice-coil device worn on the finger.
We used nine pairs of \textbf{data-driven visuo-haptic textures} from the \HaTT database \cite{culbertson2014one}, which we rendered using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}.
In a \textbf{user study}, 20 participants freely explored, in direct touch, the combinations of the visuo-haptic texture pairs and rated their coherence, realism, and perceived roughness.
We aimed to assess \textbf{which haptic textures were matched with which visual textures}, how the roughness of the visual and haptic textures was perceived, and whether \textbf{the perceived roughness} could explain the matches made between them.

@@ -1,19 +1,10 @@
\section{User Study}
\label{experiment}

%The user study aimed at analyzing the user perception of real surfaces when augmented through a visuo-haptic texture using \AR and vibrotactile haptic feedback provided on the finger touching the surfaces.
%Nine representative visuo-haptic texture pairs from the \HaTT database \cite{culbertson2014one} were investigated in two tasks:
%\begin{enumerate}
% \item \level{Matching} task: participants had to find the haptic texture that best matched a given visual texture; and
% \item \level{Ranking} task: participants had to rank the haptic textures, the visual textures, and the visuo-haptic texture pairs according to their perceived roughness.
%\end{enumerate}
%Our objective is to assess which haptic textures were associated with which visual textures, how the roughness of the visual and haptic textures are perceived, and whether the perceived roughness can explain the matches made between them.

\subsection{The textures}
\label{textures}

The 100 visuo-haptic texture pairs of the \HaTT database \cite{culbertson2014one} were preliminarily tested and compared using the apparatus described in \secref{apparatus} to select the most representative textures for the user study.
% visuo-haptic system presented in \chapref{vhar_system}, and with the vibrotactile haptic feedback provided on the middle-phalanx of the finger touching a real surface. on the finger on a real surface
These texture models were chosen because they are publicly available visuo-haptic representations of a wide range of real textures.
Nine texture pairs were selected (\figref{experiment/textures}) to cover a wide range of perceived roughness, from rough to smooth, as named in the database: \level{Metal Mesh}, \level{Sandpaper~100}, \level{Brick~2}, \level{Cork}, \level{Sandpaper~320}, \level{Velcro Hooks}, \level{Plastic Mesh~1}, \level{Terra Cotta}, \level{Coffee Filter}.
All these visual and haptic textures are isotropic: their rendering (appearance or roughness) is the same regardless of the direction of movement on the surface, \ie there are no local deformations (holes, bumps, or breaks).
@@ -35,10 +26,6 @@ When a virtual haptic texture was touched, a \qty{48}{kHz} audio signal was gene
The normal force on the texture was assumed to be constant at \qty{1.2}{\N} to generate the audio signal from the model, following \textcite{culbertson2015should}, who found that the \HaTT textures can be rendered using only the speed as input without decreasing their perceived realism.
The rendering of the virtual texture is described in \secref[vhar_system]{texture_generation}.
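As a rough sketch of this data-driven rendering (our notation; the exact model structure and interpolation scheme follow \textcite{culbertson2014one}), each sample of the synthesized texture acceleration can be written as an autoregressive process whose coefficients $c_k$ and noise standard deviation $\sigma$ are interpolated from the recorded model at the current scan speed $v$ and (here fixed) normal force $F$:
\begin{equation*}
    a[n] = \sum_{k=1}^{p} c_k(v, F)\, a[n-k] + \sigma(v, F)\, e[n],
\end{equation*}
where $e[n]$ is unit-variance white noise; the signal is generated at the \qty{48}{kHz} audio rate and sent to the voice-coil actuator.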
The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attached to the middle phalanx of the index finger of the participant's dominant hand using a Velcro strap, similarly to previous studies \cite{asano2015vibrotactile,friesen2024perceived}.
%An amplifier (XY-502, not branded) converted this audio signal to a current transmitted to the vibrotactile voice-coil actuator (HapCoil-One, Actronika), that was encased in a \ThreeD-printed plastic shell firmly attached to the middle index phalanx of the participant's dominant hand, similarly to previous studies \cite{asano2015vibrotactile,friesen2024perceived}.
%This voice-coil actuator was chosen for its wide frequency range (\qtyrange{10}{1000}{\Hz}) and its relatively low acceleration distortion, as specified by the manufacturer\footnoteurl{https://www.actronika.com/haptic-solutions}.
%Overall latency was measured to \qty{46 \pm 6}{\ms}, as a result of latency in image acquisition \qty{16 \pm 1}{\ms}, fiducial marker detection \qty{8 \pm 3}{\ms}, network synchronization \qty{4 \pm 1}{\ms}, audio sampling \qty{3 \pm 1}{\ms}, and the vibrotactile actuator latency (\qty{15}{\ms}, as specified by the manufacturer\footnotemark[5]).
%This latency was below the \qty{60}{\ms} threshold for vibrotactile feedback \cite{okamoto2009detectability} and was not noticed by the participants.

\begin{subfigs}{setup}{Textures used and experimental setup of the user study. }[][
\item The nine visuo-haptic textures used in the user study, selected from the \HaTT database \cite{culbertson2014one}.
@@ -56,7 +43,6 @@ The vibrotactile voice-coil actuator (HapCoil-One, Actronika) was firmly attache

Participants were first given written instructions about the experimental setup, the tasks, and the procedure of the user study.
Then, after having signed an informed consent form, they were asked to sit in front of the table with the experimental setup and to wear the \AR headset.
%The experimenter firmly attached the plastic shell encasing the vibrotactile actuator to the middle index phalanx of their dominant hand.
As the haptic textures generated no audible noise, participants did not wear any noise-reduction headphones.
A calibration of both the HoloLens~2 and the finger pose estimation was performed to ensure the correct registration of the visuo-haptic textures and the real finger with the real surfaces, as described in \secref[vhar_system]{virtual_real_registration}.
Finally, participants familiarized themselves with the augmented surface in a \qty{2}{min} training session with textures different from the ones used in the user study.
@@ -89,8 +75,6 @@ A total of 9 textures \x 3 repetitions = 18 matching trials were performed per p
In the \level{Ranking} task, participants had to rank the haptic textures, the visual textures, and the visuo-haptic texture pairs according to their perceived roughness.
It had one within-subjects factor, \factor{Modality} with the following levels: \level{Visual}, \level{Haptic}, \level{Visuo-Haptic}.
Each modality level was ranked once per participant following the fixed order listed above (\secref{procedure}).
%The order of \level{Modality} was fixed as listed above, and.
%A total of 3 modalities = 60 ranking trials were collected.

\subsection{Participants}
\label{participants}

@@ -8,8 +8,8 @@ Most of the haptic augmentations of real surfaces using wearable haptic dev
Still, it is known that the visual rendering of an object can influence the perception of its haptic properties (\secref[related_work]{visual_haptic_influence}), and that the perception of the same haptic force-feedback or vibrotactile rendering can differ between \AR and \VR, probably due to differences in perceived simultaneity between visual and haptic stimuli (\secref[related_work]{ar_vr_haptic}).
In \AR, the user can see their own hand touching the surface, the worn haptic device, and the \RE, while in \VR these are hidden by the \VE.

In this chapter, we investigate the \textbf{role of the visual feedback of the virtual hand and of the environment (real or virtual) on the perception of a real surface whose haptic roughness is augmented} with wearable vibrotactile haptics. %voice-coil device worn on the finger.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched. % touched by the finger.% that can be directly touched with the bare finger.
In this chapter, we investigate the \textbf{role of the visual feedback of the virtual hand and of the environment (real or virtual) on the perception of a real surface whose haptic roughness is augmented} with wearable vibrotactile haptics.
To do so, we used the visuo-haptic system presented in \chapref{vhar_system} to render virtual vibrotactile patterned textures (\secref[related_work]{texture_rendering}) to augment the real surface being touched.
We evaluated, in a \textbf{user study with psychophysical methods and an extensive questionnaire}, the perceived roughness augmentation in three visual rendering conditions: \textbf{(1) without visual augmentation}, in \textbf{(2) \OST-\AR with a realistic virtual hand} rendering, and in \textbf{(3) \VR with the same virtual hand}.
To control for the influence of the visual rendering, the real surface was not visually augmented and stayed the same in all conditions.

@@ -1,9 +1,6 @@
\section{User Study}
\label{experiment}

%The visuo-haptic rendering system, described in \secref[vhar_system]{method}, allows free exploration of virtual vibrotactile textures on real surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in \AR or \VR.
%
%The user study aimed to investigate the effect of visual hand rendering in \AR or \VR on the perception of roughness texture augmentation of a touched real surface.
In a \TIFC task (\secref[related_work]{sensations_perception}), participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (\level{Real}, \figref{experiment/real}), in \AR with a realistic virtual hand superimposed on the real hand (\level{Mixed}, \figref{experiment/mixed}), and in \VR with the same virtual hand as an avatar (\level{Virtual}, \figref{experiment/virtual}).
Because vision is an important source of information and influence for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}, the touched surface was kept visually a uniform white so as not to influence perception; only the visual aspect of the hand and of the surrounding environment changed between conditions.

@@ -27,8 +27,8 @@ All pairwise differences were statistically significant.
]

\begin{subfigs}{discrimination_accuracy}{Results of the vibrotactile texture roughness discrimination task. }[][
\item Estimated \PSE of each visual rendering, defined as the amplitude difference at which both reference and comparison textures are perceived to be equivalent. %, \ie the accuracy in discriminating vibrotactile roughness.
\item Estimated \JND of each visual rendering. %, defined as the minimum perceptual amplitude difference, \ie the sensitivity to vibrotactile roughness differences.
\item Estimated \PSE of each visual rendering, defined as the amplitude difference at which both reference and comparison textures are perceived to be equivalent.
\item Estimated \JND of each visual rendering.
]
\subfig[0.35]{results/trial_pses}
\subfig[0.35]{results/trial_jnds}
@@ -107,7 +107,6 @@ Participants were mixed between feeling the vibrations on the surface or on the
{NASA-TLX questions asked to participants after each \factor{Visual Rendering} block of trials.}
[
Questions were bipolar 100-point scales (0~=~Very Low and 100~=~Very High, except for Performance where 0~=~Perfect and 100~=~Failure), with increments of 5.
%Participants were shown only the labels for all questions.
]
\begin{tabularx}{\linewidth}{l X}
\toprule

@@ -4,9 +4,6 @@
The results showed a difference in vibrotactile roughness perception between the three visual rendering conditions.
Given the estimated \PSEs, the textures were on average perceived as \enquote{rougher} in the \level{Real} rendering than in the \level{Virtual} (\percent{-2.8}) and \level{Mixed} (\percent{-6.0}) renderings (\figref{results/trial_pses}).
A \PSE difference in the same range was found for perceived stiffness, with the \VR perceived as \enquote{stiffer} and the \AR as \enquote{softer} \cite{gaffary2017ar}.
%
%However, the difference between the \level{Virtual} and \level{Mixed} conditions was not significant.
%
Surprisingly, the \PSE of the \level{Real} rendering was shifted to the right (perceived as \enquote{rougher}, \percent{7.9}) compared to the reference texture, whereas the \PSEs of the \level{Virtual} (\percent{5.1}) and \level{Mixed} (\percent{1.9}) renderings were perceived as \enquote{smoother} and closer to the reference texture (\figref{results/trial_predictions}).
The sensitivity of participants to roughness differences also varied, with the \level{Real} rendering having the best \JND (\percent{26}), followed by the \level{Virtual} (\percent{30}) and \level{Mixed} (\percent{33}) renderings (\figref{results/trial_jnds}).
These \JND values are in line with, and at the upper end of, the range of previous studies \cite{choi2013vibrotactile}, which may be due to the location of the actuator on the top of the middle phalanx of the finger, which is less sensitive to vibration than the fingertip.
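For reference, these two quantities can be read off the fitted psychometric function $\Psi(\Delta)$, the probability of judging the comparison texture rougher than the reference for an amplitude difference $\Delta$ (a standard formulation in our notation, not a restatement of the exact fitting procedure used in the study):
\begin{equation*}
    \Psi(\Delta_{\mathrm{PSE}}) = 0.5,
    \qquad
    \mathrm{JND} = \frac{\Delta_{75} - \Delta_{25}}{2},
\end{equation*}
where $\Delta_{75}$ and $\Delta_{25}$ are the amplitude differences at which $\Psi$ reaches \percent{75} and \percent{25}, respectively.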
@@ -23,11 +20,9 @@ The \level{Mixed} rendering had the lowest \PSE and highest perceived latency, t

Our wearable visuo-haptic texture augmentation system, described in \chapref{vhar_system}, aimed to provide coherent visuo-haptic renderings registered with the \RE.
Yet, it involves different sensory interaction loops between the user's movements and the visuo-haptic feedback (\figref[vhar_system]{diagram} and \figref[introduction]{interaction-loop}), which may not feel synchronized with each other or with proprioception.
%When a user runs their finger over a vibrotactile virtual texture, the haptic sensations and eventual display of the virtual hand lag behind the visual displacement and proprioceptive sensations of the real hand.
%
We therefore hypothesize that the differences in the perception of vibrotactile roughness are due less to the visual rendering of the hand or the environment and their associated differences in exploration behaviour than to the difference in the \emph{perceived} latency between one's own hand (visual and proprioceptive) and the virtual hand (visual and haptic).
The perceived delay was largest in \AR, where the virtual hand visually lags significantly behind the real one, but less so in \VR, where only the proprioceptive sense can help detect the lag.
This delay was not perceived when touching the virtual haptic textures without visual augmentation: only the finger velocity was used to render them, and despite the varied finger movements and velocities while exploring the textures, participants did not report any latency in the vibrotactile rendering (\secref{results_questions}).
\textcite{diluca2011effects} similarly demonstrated, in a \VST-\AR setup, how visual latency relative to proprioception increased the perceived stiffness of a virtual piston, while haptic latency decreased it (\secref[related_work]{ar_vr_haptic}).
Another complementary explanation could be a pseudo-haptic effect (\secref[related_work]{visual_haptic_influence}) of the displacement of the virtual hand, as already observed with this vibrotactile texture rendering, but displayed on a screen \cite{ujitoko2019modulating}.
Such hypotheses could be tested by manipulating the latency and pose estimation accuracy of the virtual hand or the vibrotactile feedback. % to observe their effects on the roughness perception of the virtual textures.
Such hypotheses could be tested by manipulating the latency and pose estimation accuracy of the virtual hand or the vibrotactile feedback.

@@ -3,7 +3,6 @@

In this chapter, we studied how the perception of wearable haptic augmented textures is affected by the visual feedback of the virtual hand and of the environment, which could be real, augmented, or virtual.
Using the wearable visuo-haptic augmentation system presented in \chapref{vhar_system}, we augmented the perceived roughness of real surfaces with virtual vibrotactile textures rendered on the finger.
%we rendered virtual vibrotactile patterned textures on the voice-coil worn on the middle-phalanx of the finger to augment the roughness perception of the real surface being touched.
With an \OST-\AR headset that could be switched to a \VR-only view, we considered three visual rendering conditions: (1) without visual augmentation, (2) with a realistic virtual hand rendering in \AR, and (3) with the same virtual hand in \VR.
We then evaluated the perceived roughness augmentation in these three visual conditions with a psychophysical user study involving 20 participants and extensive questionnaires.

@@ -13,13 +12,10 @@ Similarly, the sensitivity to differences in roughness was better with the real
Exploration behaviour was also slower in \VR than with the real hand alone, although the subjective evaluation of the textures was not affected.
We hypothesized that this difference in perception was due to the \emph{perceived latency} between the finger movements and the visual, haptic, and proprioceptive feedback: the latencies were the same in all visual renderings, but were more noticeable in \AR and \VR than without visual augmentation.

%We can outline recommendations for future \AR/\VR studies or applications using wearable haptics.
This study suggests that attention should be paid to the respective latencies of the visual and haptic sensory feedback inherent in such systems and, more importantly, to \emph{the perception of their possible asynchrony}.
Latencies should be measured \cite{friston2014measuring}, minimized to an acceptable level for users, and kept synchronized with each other \cite{diluca2019perceptual}.
It also seems that the visual aspect of the hand or the environment in itself has little effect on the perception of haptic feedback, but that the degree of visual virtuality can affect the perceived asynchrony of the latencies, even when the latencies remain identical.
When designing wearable haptics or integrating them into \AR/\VR, it seems important to test their perception in real (\RE), augmented (\AE), and virtual (\VE) environments.
%With a better understanding of how visual factors influence the perception of haptically augmented real objects, the many wearable haptic systems that already exist but have not yet been fully explored with \AR can be better applied and new visuo-haptic renderings adapted to \AR can be designed.
%Finally, a visual hand representation in OST-\AR together with wearable haptics should be avoided until acceptable tracking latencies are achieved, as was also observed for virtual object interaction with the bare hand \cite{normand2024visuohaptic}.

In the next chapter, we present a second user study where we investigate the perception of simultaneous and co-localized visual and haptic texture augmentation.
We will use the same system presented in \chapref{vhar_system} and a visual rendering condition similar to the \level{Real} condition of this study, in \AR without the virtual hand overlay.