Corrections

This commit is contained in:
2024-10-14 16:07:13 +02:00
parent 73af5fefb1
commit ffac70d093
4 changed files with 7 additions and 7 deletions


@@ -15,7 +15,7 @@ Our goal is to achieve a more plausible and coherent perception, as well as a mo
\subsectionstarbookmark{Hand Interaction with Everyday Objects}
In daily life, \textbf{we simultaneously look at, touch and manipulate the everyday objects} around us without even thinking about it.
-Many of these object properties can be perceived in a complementary way through all our sensory modalities, such as their shape, size or texture \cite{baumgartner2013visual}.
+Many of these object properties can be perceived in a complementary way through all our sensory modalities, such as their shape, size or material \cite{baumgartner2013visual}.
Vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object \cite{yanagisawa2015effects}, \eg hardness or texture, and even to predict properties that we cannot see, \eg weight or temperature.
Information from different sensory sources can be complementary, redundant or contradictory \cite{ernst2004merging}.
This is why we sometimes want to touch an object to verify one of the properties we have seen, comparing or confronting our visual and tactile sensations.
@@ -217,7 +217,7 @@ Wearable haptic devices have proven to be effective in altering the perception o
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
%It enables rich haptic feedback as the combination of kinesthetic sensation from the real and cutaneous sensation from the actuator.
However, wearable haptic augmentation combined with \AR remains little explored, as does the visuo-haptic augmentation of texture.
-Texture is indeed one of the most fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (i.e., without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
+Texture is indeed one of the most fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (i.e., without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,asano2015vibrotactile,strohmeier2017generating,friesen2024perceived}.
Being able to coherently substitute the visuo-haptic texture of a surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, to (2) evaluate how the perception of haptic texture augmentations is affected by the visual feedback of the virtual hand and the environment, and (3) investigate the perception of co-localized visuo-haptic texture augmentations.


@@ -333,5 +333,5 @@ Taken together, these results suggest that a visual rendering of the hand in \AR
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, when manipulating \VOs in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, a lack that could be mitigated by a visual rendering of the hand.
-A common alternative approach is to use real objects as tangible proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
+A common alternative approach is to use real objects as a proxy for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.


@@ -12,8 +12,8 @@ Wearable haptic augmentation is mostly achieved with vibrotactile feedback.
\AR headsets integrate virtual content immersively into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
However, direct hand interaction with and manipulation of \VOs is difficult due to the lack of haptic feedback and of mutual occlusion rendering between the hand and the \VO, the latter of which could be improved by a visual rendering of the hand.
-Tangibles are also used as proxies for manipulating \VOs, but can be inconsistent with the visual rendering, being haptically passive.
-Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of tangibles.
+Real objects are also used as proxies for manipulating \VOs, but can be inconsistent with the visual rendering, being haptically passive.
+Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of real objects.
Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with \VOs in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few can be integrated or experimentally evaluated for direct hand interaction in \AR.
@@ -22,4 +22,4 @@ Different relocation strategies have been proposed for different parts of the ha
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When integrating different sensory feedback, haptic and visual, real and virtual, into a single object property, perception is robust to variations in reliability and to spatial and temporal differences.
-However, the same haptic rendering or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
+Conversely, the same haptic rendering or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.