Auto add chapter as prefix to labels

2024-06-26 23:58:14 +02:00
parent 3cf72ba41b
commit f795b6e9e5
21 changed files with 166 additions and 62 deletions

View File

@@ -0,0 +1,94 @@
\section{Related Work}
\label{2_literature}
This section summarizes the state of the art in visual hand rendering and (wearable) haptic rendering in AR, focusing on virtual object manipulation.
\subsection{Visual Hand Rendering in AR}
\label{2_hands}
Mutual visual occlusion between a virtual object and the real hand, \ie hiding the virtual object when the real hand is in front of it and hiding the real hand when it is behind the virtual object, is often presented as natural and realistic, enhancing the blending of real and virtual environments~\cite{piumsomboon2014graspshell, al-kalbani2016analysis}.
%
In video see-through AR (VST-AR), occlusion can be handled as a masking problem, by combining the image of the real world captured by a camera with the generated virtual image~\cite{macedo2023occlusion}.
%
In OST-AR, this is more difficult because the virtual environment is displayed as a transparent 2D image on top of the 3D real world, which cannot be easily masked~\cite{macedo2023occlusion}.
%
Moreover, in VST-AR, users often misestimate the grip aperture and the depth positioning of virtual objects~\cite{al-kalbani2016analysis, maisto2017evaluation}.
%
However, this effect has yet to be verified in an OST-AR setup.
An alternative is to render the virtual objects and the hand semi-transparent, so that they remain partially visible even when one occludes the other, \eg in \figref{hands-none} the real hand is behind the virtual cube but still visible.
%
Although perceived as less natural, this rendering seems to be preferred over mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but it has not yet been evaluated in OST-AR.
%
However, this rendering still causes depth conflicts that make it difficult to determine whether one's hand is behind or in front of a virtual object, \eg in \figref{hands-none} the thumb is in front of the virtual cube but appears to be behind it.
In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
%
It is known that the virtual hand representation has an impact on users' perception, interaction performance, and preference~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
%
In a pick-and-place task in VR, \citeauthorcite{prachyabrued2014visual} found that a virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while a representation following the tracked human hand (thus penetrating the virtual objects) performed the best, even though it was rather disliked.
%
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
%
It has also been shown that, compared to a realistic avatar, a skeleton rendering (similar to \figref{hands-skeleton}) can provide a stronger sense of being in control~\cite{argelaguet2016role}, and that a minimalistic fingertip rendering (similar to \figref{hands-tips}) can be more effective in a typing task~\cite{grubert2018effects}.
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a result similar to the promising double rendering of \citeauthorcite{prachyabrued2014visual}.
%
Additionally, \citeauthorcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without degrading either the users' experience or their performance.
%
This suggests that a visual hand rendering superimposed on the real hand could be helpful, but should not impair users.
Few works have explored the effect of visual hand rendering in AR~\cite{blaga2017usability, maisto2017evaluation, krichenbauer2018augmented, yoon2020evaluating, saito2021contact}.
%
For example, \citeauthorcite{blaga2017usability} evaluated a skeleton rendering against no visual hand overlay in several virtual object manipulation tasks.
%
Performance did not improve, but participants felt more confident with the virtual hand.
%
However, the experiment was carried out on a screen, in a non-immersive AR scenario.
%
\citeauthorcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
%
To the best of our knowledge, the role of a visual rendering of the hand displayed \enquote{and seen} directly on top of the real tracked hand in immersive OST-AR has not yet been explored, particularly in the context of virtual object manipulation.
\subsection{Wearable Haptic Feedback in AR}
\label{2_haptics}
Different haptic feedback systems have been explored to improve interactions in AR, including %
grounded force feedback devices~\cite{bianchi2006high, jeon2009haptic, knorlein2009influence}, %
exoskeletons~\cite{lee2021wearable}, %
tangible objects~\cite{hettiarachchi2016annexing, detinguy2018enhancing, salazar2020altering, normand2018enlarging, xiao2018mrtouch}, and %
wearable haptic devices~\cite{pacchierotti2016hring, lopes2018adding, pezent2019tasbi, teng2021touch}.
Wearable haptics seems particularly well suited to this context, as it respects many of the constraints of AR, \eg a limited impact on hand tracking performance and little impairment of the users' senses and of their ability to interact with real content~\cite{pacchierotti2016hring, maisto2017evaluation, lopes2018adding, meli2018combining, pezent2019tasbi, teng2021touch, kourtesis2022electrotactile, marchal2022virtual}.
%
For example, \citeauthorcite{pacchierotti2016hring} designed a haptic ring, worn on the proximal finger phalanx, that provides pressure and skin stretch sensations, so as to improve hand tracking during a pick-and-place task.
%
\citeauthorcite{pezent2019tasbi} proposed Tasbi, a wristband haptic device capable of rendering vibrations and pressure.
%
\citeauthorcite{teng2021touch} presented Touch\&Fold, a haptic device attached to the nail that provides pressure and texture sensations when interacting with virtual content, but also folds away when the user interacts with real objects, leaving the fingertip free.
%
This approach was also perceived as more realistic than providing sensations directly on the nail, as in~\cite{ando2007fingernailmounted}.
%
Each of these haptic devices relocated the haptic feedback about fingertip interactions with the virtual content to other parts of the hand.
%
If delocalizing the haptic feedback is indeed necessary, each of these positions is promising, but they have not yet been compared with one another.
In parallel, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
%
\citeauthorcite{sarac2022perceived} and \citeauthorcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
%
Results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is greater when the visual feedback is limited.
%
In pick-and-place tasks in AR involving both virtual and real objects, \citeauthorcite{maisto2017evaluation} and \citeauthorcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand, similar to \figref{hands-tips}.
%
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to improved performance compared to more standard fingertip haptic devices~\cite{chinello2020modular}.
%
However, the measured difference in performance could be attributed to the device, to its position (proximal vs. fingertip), or to both.
%
Furthermore, all of these studies were conducted in non-immersive setups, where users looked at a screen displaying the visual interactions; they compared haptic against visual feedback but did not examine the two together.
%
The improved performance and perceived effectiveness of a delocalized haptic feedback over a visual feedback alone, or their multimodal combination, remains to be verified in an immersive OST-AR setup.
\bigskip
These limitations have motivated the research presented in this paper, where we aim to fill the aforementioned gaps in the understanding of the role of the visuo-haptic rendering of the hand during the 3D manipulation of virtual objects in AR.

View File

@@ -1,5 +1,5 @@
\section{Introduction}
\sublabel{introduction}
\label{sec:introduction}
After looking at the surface of an everyday object, we touch it to confirm or contrast our initial visual impression and to estimate the object's properties~\autocite{ernst2002humans}.
%

View File

@@ -1,5 +1,5 @@
\section{User Study}
\sublabel{experiment}
\label{sec:experiment}
\begin{subfigs}{setup}{%
(Left) The nine visuo-haptic textures used in the user study, selected from the HaTT database~\autocite{culbertson2014one}. %
@@ -26,7 +26,7 @@ Our objective is to assess which haptic textures were associated with which visu
\subsection{The textures}
\sublabel{textures}
\label{sec::textures}
The 100 visuo-haptic texture pairs of the HaTT database~\autocite{culbertson2014one} were preliminarily tested and compared in AR, using vibrotactile haptic feedback on the finger on a tangible surface.
%
@@ -38,7 +38,7 @@ All these visual and haptic textures are isotropic: their rendering (appearance
\subsection{Apparatus}
\sublabel{apparatus}
\label{sec::apparatus}
\figref{setup} shows the experimental setup (middle) and the first-person view (right) of the user study.
%
@@ -68,7 +68,7 @@ The user study was held in a quiet room with no windows, with one light source o
\subsection{Procedure and Collected Data}
\sublabel{procedure}
\label{sec::procedure}
Participants were first given written instructions about the experimental setup, the tasks, and the procedure of the user study.
%
@@ -115,7 +115,7 @@ The user study took on average 1 hour to complete.
\subsection{Participants}
\sublabel{participants}
\label{sec::participants}
Twenty participants took part in the user study (12 males, 7 females, 1 preferred not to say), aged between 20 and 60 years old (M=29.1, SD=9.4).
%
@@ -133,7 +133,7 @@ They all signed an informed consent form before the user study.
\subsection{Design}
\sublabel{design}
\label{sec::design}
The matching task was a single-factor within-subjects design, \textit{Visual Texture}, with the following levels:
%

View File

@@ -1,11 +1,11 @@
\section{Results}
\sublabel{results}
\label{sec:results}
\subsection{Textures Matching}
\sublabel{results_matching}
\label{sec:results_matching}
\subsubsection{Confusion Matrix}
\sublabel{results_matching_confusion_matrix}
\label{sec:results_matching_confusion_matrix}
\begin{subfigs}{results_matching_ranking}{%
(Left) Confusion matrix of the matching task, with the presented visual textures as columns and the proportions of selected haptic textures as rows. %
@@ -45,7 +45,7 @@ Another explanation could be that the participants had difficulties to estimate
Indeed, many participants explained that they tried to identify or imagine the roughness of a given visual texture and then select the most plausible haptic texture, in terms of frequency and/or amplitude of vibrations.
\subsubsection{Completion Time}
\sublabel{results_matching_time}
\label{sec:results_matching_time}
To verify that all visual textures posed the same difficulty in the matching task, the \textit{Completion Time} of a trial, \ie the time between the visual texture display and the haptic texture selection, was analyzed.
%
@@ -59,7 +59,7 @@ No statistical significant effect of \textit{Visual Texture} was found (\anova{8
\subsection{Textures Ranking}
\sublabel{results_ranking}
\label{sec:results_ranking}
\figref{results_matching_ranking} (right) presents the results of the three rankings of the haptic textures alone, the visual textures alone, and the visuo-haptic texture pairs.
%
@@ -83,7 +83,7 @@ These results indicate, with \figref{results_matching_ranking} (right), that the
\subsection{Perceived Similarity of Visual and Haptic Textures}
\sublabel{results_similarity}
\label{sec:results_similarity}
\begin{subfigs}{results_similarity}{%
(Left) Correspondence analysis of the matching task confusion matrix (see \figref{results_matching_ranking}, left).
@@ -155,7 +155,7 @@ This shows that the participants consistently identified the roughness of each v
\subsection{Questionnaire}
\sublabel{results_questions}
\label{sec:results_questions}
\begin{subfigs}{results_questions}{%
Boxplots of the 7-item Likert scale question results (1=Not at all, 7=Extremely) %

View File

@@ -1,5 +1,5 @@
\section{Discussion}
\sublabel{discussion}
\label{sec:discussion}
In this study, we investigated the perception of visuo-haptic texture augmentation of tangible surfaces touched directly with the index fingertip, using visual texture overlays in AR and haptic roughness textures generated by a vibrotactile device worn on the middle index phalanx.
%

View File

@@ -1,5 +1,5 @@
\section{Conclusion}
\sublabel{conclusion}
\label{sec:conclusion}
\fig[0.6]{experiment/use_case}{%
Illustration of the texture augmentation in AR through an interior design scenario. %

View File

@@ -1,7 +1,5 @@
\mainchapter{Augmenting the Texture Perception of Tangible Surfaces in Augmented Reality using Vibrotactile Haptics}
\renewcommand{\labelprefix}{ar_textures}
\label{ch:\labelprefix}
\mainlabel{ar_textures}
\input{1-introduction}
\input{2-experiment}

View File

@@ -1,2 +1,2 @@
\part{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
\label{part:texture}
\mainlabel{perception}

View File

@@ -1,5 +1,5 @@
\section{Introduction}
\sublabel{introduction}
\label{sec:introduction}
% Delivers the motivation for your paper. It explains why you did the work you did.

View File

@@ -1,5 +1,5 @@
\section{Related Work}
\sublabel{related_work}
\label{sec:related_work}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
@@ -9,7 +9,7 @@ Yet visual and haptic sensations are often combined in everyday life, and it is
\subsection{Augmenting Haptic Texture Roughness}
\sublabel{vibrotactile_roughness}
\label{sec:vibrotactile_roughness}
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
%
@@ -48,7 +48,7 @@ It remains unclear whether such vibrotactile texture augmentation is perceived t
%In our study, we attached a voice-coil actuator to the middle phalanx of the finger and used a squared sinusoidal signal to render grating textures sensations, but we corrected its phase to allow a simple camera-based tracking and free exploration movements of the finger.
\subsection{Influence of Visual Rendering on Haptic Perception}
\sublabel{influence_visual_haptic}
\label{sec:influence_visual_haptic}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
%

View File

@@ -1,5 +1,5 @@
\section{Visuo-Haptic Texture Rendering in Mixed Reality}
\sublabel{method}
\label{sec:method}
\figwide[1]{method/diagram}{%
Diagram of the visuo-haptic texture rendering system.
@@ -36,13 +36,13 @@ The visuo-haptic texture rendering system is based on
\item and a modulation of the signal frequency by the estimated finger speed with a phase matching.
\end{enumerate*}
%
\figref{method/diagram} shows the diagram of the interaction loop and \eqref{xr_perception:signal} the definition of the vibrotactile signal.
\figref{method/diagram} shows the diagram of the interaction loop and \eqref{signal} the definition of the vibrotactile signal.
%
The system is composed of three main components: the pose estimation of the tracked real elements, the visual rendering of the virtual environment, and the vibrotactile signal generation and rendering.
\subsection{Pose Estimation and Virtual Environment Alignment}
\sublabel{virtual_real_alignment}
\label{sec:virtual_real_alignment}
\begin{subfigs}{setup}{%
Visuo-haptic texture rendering system setup.
@@ -93,13 +93,13 @@ The visual rendering is achieved using the Microsoft HoloLens~2, an OST-AR heads
%
It was chosen over VST-AR because OST-AR only adds virtual content to the real environment, while VST-AR streams a real-time video capture of the real environment~\autocite{macedo2023occlusion}.
%
Indeed, one of our objectives (see \secref{xr_perception:experiment}) is to directly compare a virtual environment that replicates a real one. %, rather than a video feed that introduces many supplementary visual limitations.
Indeed, one of our objectives (see \secref{experiment}) is to directly compare a virtual environment that replicates a real one. %, rather than a video feed that introduces many supplementary visual limitations.
%
To simulate a VR headset, a cardboard mask (with holes for sensors) is attached to the headset to block the view of the real environment (see \figref{method/headset}).
\subsection{Vibrotactile Signal Generation and Rendering}
\sublabel{texture_generation}
\label{sec:texture_generation}
A voice-coil actuator (HapCoil-One, Actronika) is used to display the vibrotactile signal, as it allows the frequency and amplitude of the signal to be controlled independently over time, covers a wide frequency range (\qtyrange{10}{1000}{\Hz}), and outputs the signal accurately with relatively low acceleration distortion\footnote{HapCoil-One specific characteristics are described in its data sheet: \url{https://web.archive.org/web/20240228161416/https://tactilelabs.com/wp-content/uploads/2023/11/HapCoil_One_datasheet.pdf}}.
%
@@ -116,10 +116,10 @@ It is generated as a square wave audio signal, sampled at \qty{48}{\kilo\hertz},
A sample $s_k$ of the audio signal at sampling time $t_k$ is given by:
%
\begin{subequations}
\label{eq:\labelprefix:signal}
\label{eq:signal}
\begin{align}
s(x_{f,j}, t_k) & = A \text{\,sgn} ( \sin (2 \pi \frac{\dot{x}_{f,j}}{\lambda} t_k + \phi_j) ) & \label{eq:\labelprefix:signal_speed} \\
\phi_j & = \phi_{j-1} + 2 \pi \frac{x_{f,j} - x_{f,{j-1}}}{\lambda} t_k & \label{eq:\labelprefix:signal_phase}
s(x_{f,j}, t_k) & = A \text{\,sgn} ( \sin (2 \pi \frac{\dot{x}_{f,j}}{\lambda} t_k + \phi_j) ) & \label{eq:signal_speed} \\
\phi_j & = \phi_{j-1} + 2 \pi \frac{x_{f,j} - x_{f,{j-1}}}{\lambda} t_k & \label{eq:signal_phase}
\end{align}
\end{subequations}
%
@@ -133,7 +133,7 @@ This is important because it preserves the sensation of a constant spatial frequ
%
Note that the finger position and velocity are transformed from the camera frame $\mathcal{F}_c$ to the texture frame $\mathcal{F}_t$, with the $x$ axis aligned with the texture direction.
%
However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted along with the frequency to ensure the continuity of the signal, as described in \eqref{xr_perception:signal_phase}.
However, when a new finger position is estimated at time $t_j$, the phase $\phi_j$ needs to be adjusted along with the frequency to ensure the continuity of the signal, as described in \eqref{signal_phase}.
%
This approach avoids sudden changes in the actuator movement that would otherwise affect the texture perception in an uncontrolled way (see \figref{method/phase_adjustment}) and, contrary to previous work~\autocite{asano2015vibrotactile,friesen2024perceived}, it enables free exploration of the texture by the user, with no constraints on the finger speed.
%
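As a worked illustration (the \qty{2}{\mm} period is the one used in the user study below, while the finger speeds are hypothetical values chosen for the example): a finger moving at \qty{20}{\mm\per\second} over a texture of period $\lambda = \qty{2}{\mm}$ yields a square wave of frequency $\dot{x}_{f,j} / \lambda = \qty{10}{\Hz}$.
%
If the estimated speed then doubles to \qty{40}{\mm\per\second} at update time $t_j$, the frequency steps to \qty{20}{\Hz}, and \eqref{signal_phase} shifts the phase $\phi_j$ so that the square wave of \eqref{signal_speed} continues without a jump at $t_j$.
%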
@@ -153,7 +153,7 @@ The tactile texture is described and rendered in this work as a one dimensional
\subsection{System Latency}
\sublabel{latency}
\label{sec:latency}
%As shown in \figref{method/diagram} and described above, the system includes various haptic and visual sensors and rendering devices linked by software processes for image processing, 3D rendering and audio generation.
%

View File

@@ -1,5 +1,5 @@
\section{User Study}
\sublabel{experiment}
\label{sec:experiment}
\begin{subfigswide}{renderings}{%
The three visual rendering conditions and the experimental procedure of the two-alternative forced choice (2AFC) psychophysical study.
@@ -22,7 +22,7 @@
\subfig[0.32][]{experiment/virtual}
\end{subfigswide}
Our visuo-haptic rendering system, described in \secref{xr_perception:method}, allows free exploration of virtual vibrotactile textures on tangible surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in AR or VR.
Our visuo-haptic rendering system, described in \secref{method}, allows free exploration of virtual vibrotactile textures on tangible surfaces directly touched with the bare finger to simulate roughness augmentation, while the visual rendering of the hand and environment can be controlled to be in AR or VR.
%
The user study aimed to investigate the effect of visual hand rendering in AR or VR on the perception of roughness texture augmentation. % of a touched tangible surface.
%
@@ -32,7 +32,7 @@ In order not to influence the perception, as vision is an important source of in
\subsection{Participants}
\sublabel{participants}
\label{sec:participants}
Twenty participants were recruited for the study (16 males, 3 females, 1 preferred not to say), aged between 18 and 61 years old (\median{26}{}, \iqr{6.8}{}).
%
@@ -50,7 +50,7 @@ They all signed an informed consent form before the user study and were unaware
\subsection{Apparatus}
\sublabel{apparatus}
\label{sec:apparatus}
An experimental environment similar to that of \citeauthorcite{gaffary2017ar} was created to ensure a similar visual rendering in AR and VR (see \figref{renderings}).
%
@@ -70,7 +70,7 @@ Its size was adjusted to match the real hand of the participants before the expe
%
%An OST-AR headset (Microsoft HoloLens~2) was chosen over a VST-AR headset because the former only adds virtual content to the real environment, while the latter streams a real-time video capture of the real environment, and one of our objectives was to directly compare a virtual environment replicating a real one, not to a video feed that introduces many other visual limitations~\autocite{macedo2023occlusion}.
%
The visual rendering of the virtual hand and environment is described in \secref{xr_perception:virtual_real_alignment}.
The visual rendering of the virtual hand and environment is described in \secref{virtual_real_alignment}.
%
%In the \level{Virtual} rendering, a cardboard mask (with holes for sensors) was attached to the headset to block the view of the real environment and simulate a VR headset (see \figref{method/headset}).
%
@@ -98,7 +98,7 @@ Participants sat comfortably in front of the box at a distance of \qty{30}{\cm},
%
%A vibrotactile voice-coil actuator (HapCoil-One, Actronika) was encased in a 3D printed plastic shell with a \qty{2}{\cm} AprilTag glued to top, and firmly attached to the middle phalanx of the right index finger of the participants using a Velcro strap.
%
The generation of the virtual texture and the control of the virtual hand is described in \secref{xr_perception:method}.
The generation of the virtual texture and the control of the virtual hand is described in \secref{method}.
%
They also wore headphones playing pink noise to mask the sound of the voice-coil.
%
@@ -106,7 +106,7 @@ The user study was held in a quiet room with no windows.
\subsection{Procedure}
\sublabel{procedure}
\label{sec:procedure}
Participants were first given written instructions about the experimental setup and procedure, the informed consent form to sign, and a demographic questionnaire.
%
@@ -136,13 +136,13 @@ Participants were not told that there was a reference and a comparison texture.
%
The order of presentation was randomised and not revealed to the participants.
%
All textures were rendered as described in \secref{xr_perception:texture_generation} with period $\lambda$ of \qty{2}{\mm}, but with different amplitudes $A$ to create different levels of roughness.
All textures were rendered as described in \secref{texture_generation} with period $\lambda$ of \qty{2}{\mm}, but with different amplitudes $A$ to create different levels of roughness.
%
Preliminary studies allowed us to determine a range of amplitudes that could be felt by the participants and were not too uncomfortable, and the reference texture was chosen to be the one with the middle amplitude.
\subsection{Experimental Design}
\sublabel{experimental_design}
\label{sec:experimental_design}
The user study was a within-subjects design with two factors:
%
@@ -161,7 +161,7 @@ A total of 3 visual renderings \x 6 amplitude differences \x 2 texture presentat
\subsection{Collected Data}
\sublabel{collected_data}
\label{sec:collected_data}
For each trial, the \textit{Texture Choice} by the participant as the roughest of the pair was recorded.
%

View File

@@ -1,8 +1,8 @@
\section{Results}
\sublabel{results}
\label{sec:results}
\subsection{Trial Measures}
\sublabel{results_trials}
\label{sec:results_trials}
All measures from trials were analysed using linear mixed models (LMM) or generalised linear mixed models (GLMM) with \factor{Visual Rendering}, \factor{Amplitude Difference} and their interaction as within-participant factors, and by-participant random intercepts.
%
@@ -16,7 +16,7 @@ Each estimate is reported with its 95\% confidence interval (CI) as follows: \ci
\subsubsection{Discrimination Accuracy}
\sublabel{discrimination_accuracy}
\label{sec:discrimination_accuracy}
A GLMM was fitted to the \response{Texture Choice} in the 2AFC vibrotactile texture roughness discrimination task, with by-participant random intercepts but no random slopes, and a probit link function (see \figref{results/trial_predictions}).
%
@@ -54,7 +54,7 @@ All pairwise differences were statistically significant.
\subsubsection{Response Time}
\sublabel{response_time}
\label{sec:response_time}
An LMM analysis of variance (AOV) with by-participant random slopes for \factor{Visual Rendering}, and a log transformation (as \response{Response Time} measures were gamma distributed), indicated a statistically significant effect of \factor{Visual Rendering} on \response{Response Time} (\anova{2}{18}{6.2}, \p{0.009}; see \figref{results/trial_response_times}).
%
@@ -64,7 +64,7 @@ The \level{Mixed} rendering was in between (\geomean{1.56}{s} \ci{1.49}{1.63}).
\subsubsection{Finger Position and Speed}
\sublabel{finger_position_speed}
\label{sec:finger_position_speed}
The frames analysed were those in which the participants actively touched the comparison textures with a finger speed greater than \qty{1}{\mm\per\second}.
%
@@ -96,7 +96,7 @@ All pairwise differences were statistically significant: \level{Real} \vs \level
\subsection{Questionnaires}
\sublabel{questions}
\label{sec:questions}
%\figref{results/question_heatmaps} shows the median and interquartile range (IQR) ratings to the questions in \tabref{questions} and to the NASA-TLX questionnaire.
%

View File

@@ -1,5 +1,5 @@
\section{Discussion}
\sublabel{discussion}
\label{sec:discussion}
%Interpret the findings in results, answer to the problem asked in the introduction, contrast with previous articles, draw possible implications. Give limitations of the study.
@@ -30,9 +30,9 @@ The \level{Mixed} rendering, displaying both the real and virtual hands, was alw
%
This suggests that touching a virtual vibrotactile texture on a tangible surface with a virtual hand in VR is different from touching it with one's own hand: users were more cautious or less confident in their exploration in VR.
%
This does not seem to be due to the realism of the virtual hand or environment, nor to the control of the virtual hand, which were all rated high to very high by the participants (see \secref{xr_perception:questions}) in both the \level{Mixed} and \level{Virtual} renderings.
This does not seem to be due to the realism of the virtual hand or environment, nor to the control of the virtual hand, which were all rated high to very high by the participants (see \secref{questions}) in both the \level{Mixed} and \level{Virtual} renderings.
%
Interestingly, the evaluation of the vibrotactile device and textures was also the same across the visual renderings, with a very high sensation of control, good realism, and a very low perceived latency of the textures (see \secref{xr_perception:questions}).
Interestingly, the evaluation of the vibrotactile device and textures was also the same across the visual renderings, with a very high sensation of control, good realism, and a very low perceived latency of the textures (see \secref{questions}).
%
However, the perceived latency of the virtual hand (\response{Hand Latency} question) seems to be related to the perceived roughness of the textures (as measured by the PSEs).
%

View File

@@ -1,5 +1,5 @@
\section{Conclusion}
\sublabel{conclusion}
\label{sec:conclusion}
%Summary of the research problem, method, main findings, and implications.

View File

@@ -1,7 +1,5 @@
\mainchapter{Perception of Visual-Haptic Texture Augmentation in Augmented and Virtual Reality}
\renewcommand{\labelprefix}{xr_perception}
\label{ch:\labelprefix}
\mainlabel{xr_perception}
\input{1-introduction}
\input{2-related-work}

View File

@@ -1,2 +1,2 @@
\part{Improving Virtual Object Manipulation with Visuo-Haptic Augmentations of the Hand}
\label{part:manipulation}
\mainlabel{manipulation}

View File

@@ -1,2 +1,2 @@
\mainchapter{Visual Rendering of the Hand in Augmented Reality}
\label{ch:visual-hand}
\mainlabel{visual-hand}

View File

@@ -1,2 +1,2 @@
\mainchapter{Visuo-Haptic Rendering of the Hand in Augmented Reality}
\label{ch:visuo-haptic-hand}
\mainlabel{visuo-haptic-hand}

View File

@@ -1,2 +1,2 @@
\part{Conclusion}
\label{part:conclusion}
\mainlabel{conclusion}

View File

@@ -91,8 +91,22 @@
\bookmarksetup{startatroot}%
}
% Add chapter label as prefix to all other sub-labels
\NewCommandCopy{\oldlabel}{\label}
\newcommand{\labelprefix}{main}
\newcommand{\sublabel}[1]{\label{sec:\labelprefix:#1}}
\newcommand{\mainlabel}[1]{%
  \renewcommand{\labelprefix}{#1}%
  \oldlabel{\labelprefix}%
}
\renewcommand{\label}[1]{\oldlabel{\labelprefix:#1}}
% References
\newcommand{\chapref}[1]{Chapter~\ref{#1}}
\renewcommand{\eqref}[1]{Equation~\ref{\labelprefix:eq:#1}}
\renewcommand{\figref}[1]{Figure~\ref{\labelprefix:fig:#1}}
\newcommand{\partref}[1]{Part~\ref{#1}}
\renewcommand{\secref}[1]{Section~\ref{\labelprefix:sec:#1}}
\renewcommand{\tabref}[1]{Table~\ref{\labelprefix:tab:#1}}
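% To illustrate how these commands compose (a minimal sketch assembled from the
% files changed above; the expansions shown follow from the definitions):
%   \mainlabel{xr_perception}  -> sets \labelprefix to "xr_perception" and creates the label "xr_perception"
%   \label{sec:experiment}     -> expands to \oldlabel{xr_perception:sec:experiment}
%   \secref{experiment}        -> Section~\ref{xr_perception:sec:experiment} (same chapter only, since the current \labelprefix is used)
%   \chapref{xr_perception}    -> Chapter~\ref{xr_perception}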
%% Document