WIP xr-perception

2024-09-27 22:11:28 +02:00
parent a9319210df
commit 344496cbef
19 changed files with 270 additions and 366 deletions


@@ -24,11 +24,45 @@ The results showed that providing vibrotactile feedback \textbf{improved the per
The wearable visuo-haptic augmentations of perception and manipulation we presented in this thesis, and the user studies we conducted, of course have some limitations.
In this section, we outline future work for each chapter that could address these limitations.
\subsection*{Augmenting the Visuo-Haptic Texture Perception of Tangible Surfaces}
\paragraph{Other Augmented Object Properties}
We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}).
However, many other wearable augmentations of object properties could be considered, such as hardness, friction, temperature, or local deformations.
Such an integration of several haptic augmentations on a tangible surface has nearly been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but it remains to be explored with wearable haptic devices.
In addition, combining them with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations.
\paragraph{Fully Integrated Tracking}
In our system, we registered the real and virtual environments (\secref[related_work]{ar_definition}) using fiducial markers and a webcam external to the \AR headset.
This only allowed us to track the index finger and the surface to be augmented with the haptic texture, but the tracking was reliable and accurate enough for our needs.
Indeed, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 were not able to track a hand wearing a voice-coil actuator.
A more robust hand tracking system would support wearing haptic devices on the hand, as well as holding real objects.
A complementary solution would be to embed tracking sensors, such as inertial measurement units (IMUs) or cameras, in the wearable haptic devices themselves, as done by \textcite{preechayasomboon2021haplets}.
This would allow a completely portable and wearable visuo-haptic system to be used in more ecological applications.
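As an illustration, the listing below is a minimal sketch of one way such an external fiducial-marker registration can be implemented, assuming OpenCV's ArUco module; the camera intrinsics, marker dictionary, marker size, and webcam index are placeholder values, not those of our actual setup.
\begin{verbatim}
import cv2
import numpy as np

# Placeholder intrinsics from a prior calibration of the external webcam.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

MARKER_SIDE = 0.04  # printed marker side length in metres (assumed)
# 3D corners of a marker in its own plane (z = 0), ordered like ArUco
# detections: top-left, top-right, bottom-right, bottom-left.
object_points = np.array([[-MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                          [ MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                          [ MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
                          [-MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0]],
                         dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

capture = cv2.VideoCapture(0)  # external webcam
while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        continue
    for marker_corners in corners:
        # Pose of each marker (finger or surface) in the webcam frame.
        found, rvec, tvec = cv2.solvePnP(object_points,
                                         marker_corners.reshape(-1, 2),
                                         camera_matrix, dist_coeffs)
        # rvec/tvec would then be transformed into the headset's world
        # frame and streamed to the AR application.
\end{verbatim}
Embedding an IMU or a camera in the haptic device, as mentioned above, could then replace or complement this external registration step.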
\subsection*{Perception of Haptic Texture Augmentation in Augmented and Virtual Reality}
\paragraph{Visual Representation of the Virtual Texture}
The main limitation of our study is the absence of a visual representation of the virtual texture.
Visual information is indeed as important as haptic sensations for the perception of both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}, and their interaction in the overall perception is complex.
Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive \AR or \VR context, as the visuo-haptic coupling of such grating textures is not trivial \cite{unger2011roughness} even with real textures \cite{klatzky2003feeling}.
\paragraph{Broader Visuo-Haptic Conditions}
In addition, our study was conducted with an \OST-\AR headset, but the results may differ with a \VST-\AR headset.
Finally, we focused on the perception of roughness sensations rendered with a square-wave vibrotactile signal on wearable haptics in \AR \vs \VR, but other haptic texture rendering methods should also be considered.
More generally, many other types of haptic feedback could be investigated in \AR \vs \VR with the same system and methodology, such as stiffness, friction, local deformations, or temperature.
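For reference, a common mapping for such grating-like vibrotactile rendering, stated here as an assumption rather than the exact signal used in our study, drives the actuator from the finger position along the surface:
\[
  s(t) = A \,\mathrm{sgn}\!\left[\sin\!\left(2\pi\,\frac{x(t)}{\lambda}\right)\right],
\]
where $x(t)$ is the finger position along the surface, $\lambda$ the spatial period of the virtual grating, and $A$ the vibration amplitude; at a constant sliding speed $v$, this yields a vibration frequency $f = v/\lambda$.
Alternative rendering methods would replace this square wave with, for example, sinusoidal, filtered-noise, or data-driven signals.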
\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}
\subsection*{Visual Rendering of the Hand for Manipulating Virtual Objects in AR}
\paragraph{Other AR Displays}
The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
We purposely chose this type of display because the lack of mutual occlusion between the hand and the \VO is the most challenging to solve with \OST-\AR \cite{macedo2023occlusion}.
We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display.
However, the user's visual perception and experience are very different with other types of displays, such as \VST-\AR, where the \RE is viewed through a screen (\secref[related_work]{ar_displays}).
@@ -41,7 +75,7 @@ While these tasks are fundamental building blocks for more complex manipulation
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs; this should also be evaluated.
\subsection*{Visuo-Haptic Rendering of Hand Manipulation With Virtual Objects in AR}
\paragraph{Richer Haptic Feedback}