Fix in references

2024-09-23 14:35:59 +02:00
parent 1766b83e59
commit 560a085e6e
10 changed files with 265 additions and 177 deletions


@@ -234,7 +234,7 @@ It is therefore unclear to what extent the real and virtual visuo-haptic sensati
\subsectionstarbookmark{Enable Effective Manipulation of the Augmented Environment}
-Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviola20173d}.
+Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviolajr20173d}.
%
As the hand is not occupied or covered with a haptic device to not impair interaction with the \RE, as described in the previous section, one can expect a seamless and direct manipulation of the hand with the virtual content as if it were real.
%


@@ -150,7 +150,7 @@ In all examples of \AR applications shown in \secref{ar_applications}, the user
\label{interaction_techniques}
For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then act upon it through an input \UI.
-Inputs \UI can be either an \emph{active sensing}, a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or a \emph{passive sensing}, that does not require a contact, such as eye trackers, voice recognition, or hand tracking \cite{laviola20173d}.
+Inputs \UI can be either an \emph{active sensing}, a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or a \emph{passive sensing}, that does not require a contact, such as eye trackers, voice recognition, or hand tracking \cite{laviolajr20173d}.
The information gathered from the sensors by the \UI is then translated into actions within the computer system by an \emph{interaction technique} (\figref{interaction-technique}).
For example, a cursor on a screen can be moved either with a mouse or with the arrow keys on a keyboard, or a two-finger swipe on a touchscreen can be used to scroll or zoom an image.
Choosing useful and efficient \UIs and interaction techniques is crucial for the user experience and the tasks that can be performed within the system.
@@ -161,7 +161,7 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
\subsubsection{Tasks with Virtual Environments}
\label{ve_tasks}
-\textcite{laviola20173d} classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
+\textcite{laviolajr20173d} classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
\textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for immersive \AR.
The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
@@ -242,15 +242,15 @@ Similarly, in \secref{tactile_rendering} we described how a material property (\
\label{ar_virtual_hands}
Natural \UIs allow the user to use their body movements directly as inputs to the \VE \cite{billinghurst2015survey}.
-Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite{laviola20173d}.
+Our hands allow us to manipulate real everyday objects with both strength and precision (\secref{grasp_types}), so virtual hand interaction techniques seem to be the most natural way to manipulate virtual objects \cite{laviolajr20173d}.
Hands were initially tracked by active sensing devices such as gloves or controllers, but it is now possible to track them in real time using cameras and computer vision algorithms natively integrated into \AR/\VR headsets \cite{tong2023survey}.
-The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviola20173d}.
+The user's hand is therefore tracked and reconstructed as a \emph{virtual hand} model in the \VE \cite{billinghurst2015survey,laviolajr20173d}.
The simplest models represent the hand as a rigid 3D object that follows the movements of the real hand with \qty{6}{\DoF} (position and orientation in space) \cite{talvas2012novel}.
An alternative is to model only the fingertips (\figref{lee2007handy}) or the whole hand (\figref{hilliges2012holodesk_1}) as points.
The most common technique is to reconstruct all the phalanges of the hand in an articulated kinematic model (\secref{hand_anatomy}) \cite{borst2006spring}.
-The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviola20173d}.
+The contacts between the virtual hand model and the \VOs are then simulated using heuristic or physics-based techniques \cite{laviolajr20173d}.
Heuristic techniques use rules to determine the selection, manipulation and release of a \VO (\figref{piumsomboon2013userdefined_1}).
However, they produce unrealistic behaviour and are limited to the cases predicted by the rules.
Physics-based techniques simulate forces at the points of contact between the virtual hand and the \VO.
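As an editor's aside: physics-based virtual hand techniques of the kind described above are commonly implemented with penalty (spring-damper) forces at each contact point. A minimal sketch, assuming a linear spring-damper model with purely illustrative constants (none of the names or values come from the cited works):

```python
def contact_force(penetration_depth: float, penetration_rate: float,
                  stiffness: float = 500.0, damping: float = 5.0) -> float:
    """Normal force (N) pushing a virtual fingertip out of a virtual object.

    A positive penetration_depth means the fingertip is inside the object;
    a positive penetration_rate means it is moving further in.
    Constants are illustrative placeholders, not values from the literature.
    """
    if penetration_depth <= 0.0:
        return 0.0  # no contact, no force
    force = stiffness * penetration_depth + damping * penetration_rate
    return max(force, 0.0)  # a rigid surface can push but never pull
```

The clamp to zero reflects that contact forces are unilateral; heuristic techniques would instead simply attach the object to the hand once a grasp rule fires.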


@@ -19,7 +19,7 @@ Combined with virtual reality (VR), where the user is immersed in a visual virtu
%
Worn on the finger, but not directly on the fingertip to keep it free to interact with tangible objects, they have been used to alter perceived stiffness, softness, friction and local deformations \cite{detinguy2018enhancing,salazar2020altering}.
%
-However, the use of wearable haptic devices has been little explored in Augmented Reality (AR), where visual virtual content is integrated into the real-world environment, especially for augmenting texture sensations \cite{punpongsanon2015softar,maisto2017evaluation,meli2018combining,chan2021hasti,teng2021touch,fradin2023humans,normand2024visuohaptic}.
+However, the use of wearable haptic devices has been little explored in Augmented Reality (AR), where visual virtual content is integrated into the real-world environment, especially for augmenting texture sensations \cite{punpongsanon2015softar,maisto2017evaluation,meli2018combining,chan2021hasti,teng2021touch,fradin2023humans}.
%
A key difference in AR compared to VR is that the user can still see the real-world surroundings, including their hands, the augmented tangible objects and the worn haptic devices.
%


@@ -69,5 +69,5 @@ However, our objective was not to accurately reproduce real textures, but to alt
%
In addition to these limitations, both visual and haptic texture models should be improved by integrating the rendering of spatially localized breaks, edges or patterns, like real textures \cite{richardson2022learning}, and by being adaptable to individual sensitivities, as personalized haptics is a promising approach \cite{malvezzi2021design,young2020compensating}.
%
-More generally, a wide range of haptic feedbacks should be integrated to form rich and complete haptic augmentations in AR \cite{maisto2017evaluation,detinguy2018enhancing,salazar2020altering,normand2024visuohaptic,pacchierotti2024haptics}.
+More generally, a wide range of haptic feedbacks should be integrated to form rich and complete haptic augmentations in AR \cite{maisto2017evaluation,detinguy2018enhancing,salazar2020altering}.


@@ -25,13 +25,13 @@
%
Wearable haptic devices, worn directly on the finger or hand, have been used to render a variety of tactile sensations to virtual objects seen in VR \cite{choi2018claw,detinguy2018enhancing,pezent2019tasbi} or AR \cite{maisto2017evaluation,meli2018combining,teng2021touch}.
%
-They have also been used to alter the perception of roughness, stiffness, friction, and local shape perception of real tangible objects \cite{asano2015vibrotactile,detinguy2018enhancing,normand2024augmenting,salazar2020altering}.
+They have also been used to alter the perception of roughness, stiffness, friction, and local shape perception of real tangible objects \cite{asano2015vibrotactile,detinguy2018enhancing,salazar2020altering}.
%
Such techniques place the actuator \emph{close} to the point of contact with the real environment, leaving the user free to directly touch the tangible.
%
This combined use of wearable haptics with tangible objects enables a haptic \emph{augmented} reality (HAR) \cite{bhatia2024augmenting} that can provide a rich and varied haptic feedback.
-The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic AR has been little explored with VR and (visual) AR \cite{choi2021augmenting,normand2024augmenting}.
+The degree of reality/virtuality in both visual and haptic sensory modalities can be varied independently, but wearable haptic AR has been little explored with VR and (visual) AR \cite{choi2021augmenting}.
%
Although AR and VR are closely related, they have significant differences that can affect the user experience \cite{genay2021virtual,macedo2023occlusion}.
%
@@ -49,7 +49,7 @@ Previous works have shown, for example, that the stiffness of a virtual piston r
The goal of this paper is to study the role of the visual rendering of the hand (real or virtual) and its environment (AR or VR) on the perception of a tangible surface whose texture is augmented with a wearable vibrotactile device worn on the finger.
%
-We focus on the perception of roughness, one of the main tactile sensations of materials \cite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations \cite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,normand2024augmenting,strohmeier2017generating,ujitoko2019modulating}.
+We focus on the perception of roughness, one of the main tactile sensations of materials \cite{baumgartner2013visual,hollins1993perceptual,okamoto2013psychophysical} and one of the most studied haptic augmentations \cite{asano2015vibrotactile,culbertson2014modeling,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
%
By understanding how these visual factors influence the perception of haptically augmented tangible objects, the many wearable haptic systems that already exist but have not yet been fully explored with AR can be better applied and new visuo-haptic renderings adapted to AR can be designed.


@@ -28,7 +28,7 @@ The user study aimed to investigate the effect of visual hand rendering in AR or
%
In a two-alternative forced choice (2AFC) task, participants compared the roughness of different tactile texture augmentations in three visual rendering conditions: without any visual augmentation (\figref{renderings}, \level{Real}), in AR with a realistic virtual hand superimposed on the real hand (\figref{renderings}, \level{Mixed}), and in VR with the same virtual hand as an avatar (\figref{renderings}, \level{Virtual}).
%
-In order not to influence the perception, as vision is an important source of information and influence for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,normand2024augmenting,vardar2019fingertip}, the touched surface was visually a uniform white; thus only the visual aspect of the hand and the surrounding environment is changed.
+In order not to influence the perception, as vision is an important source of information and influence for the perception of texture \cite{bergmanntiest2007haptic,yanagisawa2015effects,vardar2019fingertip}, the touched surface was visually a uniform white; thus only the visual aspect of the hand and the surrounding environment is changed.
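As an editor's aside: a 2AFC design of this kind is typically analysed by computing, per comparison level, the proportion of trials on which the comparison was judged rougher than the reference, before fitting a psychometric function. A minimal sketch (the data layout and level values are illustrative, not the study's):

```python
from collections import defaultdict

def proportion_rougher(trials):
    """trials: iterable of (comparison_level, judged_rougher) pairs.

    Returns {level: proportion of trials judged rougher than the reference},
    i.e. the raw points to which a psychometric curve would then be fitted.
    """
    counts = defaultdict(lambda: [0, 0])  # level -> [rougher_count, total]
    for level, rougher in trials:
        counts[level][0] += int(rougher)
        counts[level][1] += 1
    return {level: r / n for level, (r, n) in sorted(counts.items())}
```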
\subsection{Participants}


@@ -56,7 +56,7 @@ Such hypotheses could be tested by manipulating the latency and tracking accurac
The main limitation of our study is, of course, the absence of a visual representation of the touched virtual texture.
%
-This is indeed a source of information as important as haptic sensations for perception for both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth,normand2024augmenting}.
+This is indeed a source of information as important as haptic sensations for perception for both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}.
%
%Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive AR or VR context, as the visuo-haptic coupling of such grating textures is not trivial \cite{unger2011roughness} even with real textures \cite{klatzky2003feeling}.
%


@@ -12,4 +12,4 @@ We investigated then with a psychophysical user study the effect of visual rende
%Only the amplitude $A$ varied between the reference and comparison textures to create the different levels of roughness.
%
%Participants were not informed there was a reference and comparison textures, and
-No texture was represented visually, to avoid any influence on the perception \cite{bergmanntiest2007haptic,normand2024augmenting,yanagisawa2015effects}.
+No texture was represented visually, to avoid any influence on the perception \cite{bergmanntiest2007haptic,yanagisawa2015effects}.
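As an editor's aside: the amplitude-only manipulation mentioned in the commented-out line above can be sketched as a sinusoidal vibrotactile grating in which the roughness level is set solely by the amplitude $A$. The wavelength and sliding speed below are hypothetical placeholders, not the study's actual parameters:

```python
import math

def texture_signal(t: float, A: float,
                   wavelength: float = 0.002, speed: float = 0.05) -> float:
    """Vibration sample at time t (s) for a finger sliding at constant speed.

    The temporal frequency of a spatial grating is speed / wavelength
    (here 0.05 / 0.002 = 25 Hz); only the amplitude A varies with roughness.
    Parameter values are illustrative placeholders.
    """
    f = speed / wavelength  # temporal frequency in Hz
    return A * math.sin(2.0 * math.pi * f * t)
```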


@@ -46,8 +46,11 @@
\clearfield{isbn}%
}
\clearfield{day}%
\clearlist{editor}%
\clearfield{extra}%
\clearfield{location}%
\clearfield{month}%
\clearfield{series}%
\clearlist{publisher}%
\clearfield{url}%
}

File diff suppressed because it is too large.