\section{Conclusion}
\label{conclusion}
Haptic perception and manipulation of objects with the hand involve exploratory movements and grasp types, respectively, with simultaneous sensory feedback from multiple cutaneous and kinaesthetic receptors embedded beneath the skin.
These receptors provide sensory cues about the physical properties of objects, such as roughness and hardness, which are then integrated to form a perception of the property being explored.
Perceptual constancy is maintained when one cue is absent, as the remaining cues compensate for it.

Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
Only a few of these systems are compact and portable enough to be considered wearable, and those are limited to cutaneous feedback.
If their haptic rendering is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
Wearable haptic augmentation is mostly achieved with vibrotactile feedback.

\AR headsets integrate virtual content immersively into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
However, direct hand interaction and manipulation of \VOs is difficult due to the lack of haptic feedback and of mutual occlusion between the hand and the \VO; the latter could be mitigated by a visual rendering of the hand.
Real objects are also used as proxies for manipulating \VOs, but, being haptically passive, they can be inconsistent with the visual rendering.
Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of real objects.

Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation of \VOs in immersive \AR remains challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated into or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be relocated away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear which of them is best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When different sensory feedback, haptic and visual, real and virtual, is integrated into the perception of a single object property, perception is robust to variations in cue reliability and to spatial and temporal discrepancies.
Conversely, the perception of the same haptic rendering or augmentation can be influenced by the user's visual expectations or by the visual rendering of the \VO.