\chapter{Conclusion}
\mainlabel{conclusion}
\section{Summary}
In this thesis, entitled \enquote{\textbf{\ThesisTitle}}, we presented our research on direct hand interaction with real and virtual everyday objects, visually and haptically augmented using immersive \AR and wearable haptic devices.
The first axis of our research was the \textbf{augmentation of the visuo-haptic texture perception of tangible surfaces}.
The second axis was the \textbf{visuo-haptic rendering of the direct hand manipulation of \VOs}.
\noindentskip In \partref{perception}, we addressed the challenge of augmenting the visuo-haptic texture perception of tangible surfaces touched directly with the finger, using wearable vibrotactile haptics and immersive \AR.
\noindentskip In \chapref{vhar_system}, we presented our wearable visuo-haptic \AR system, which tracks the user's index finger and the tangible surface it touches in order to augment that surface with a virtual vibrotactile texture.
\noindentskip In \chapref{xr_perception}, we studied how this haptic texture augmentation is perceived in immersive \AR compared to \VR.
\noindentskip In \chapref{vhar_textures}, we then studied how visual and haptic texture augmentations of the same tangible surface are perceived together in \AR.
\noindentskip In \partref{manipulation}, we addressed the challenge of improving the manipulation of \VOs directly with the hand in immersive \OST-\AR.
Our approach was to design, based on the literature, and evaluate in user studies visual renderings of the hand and delocalized haptic renderings of its interaction with \VOs.
We first focused on (1) \textbf{the visual rendering as hand augmentation} and then on (2) \textbf{the combination of different visuo-haptic renderings of the hand manipulation with \VOs}.
\noindentskip In \chapref{visual_hand}, we investigated the visual rendering as hand augmentation.
Seen as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on the hand tracking and on the interaction with \VOs.
We compared six renderings commonly used in the \AR literature in a user study with 24 participants, where we evaluated their effect on user performance and experience in two representative manipulation tasks.
The results showed that a visual hand rendering improved the user performance, perceived effectiveness and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}.
This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.
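As an illustration of this rendering, the following minimal sketch shows how such a skeleton-like overlay can be drawn as thin segments between consecutive tracked joints; the joint layout and the \texttt{draw\_segment} primitive are hypothetical placeholders for the line renderer of the actual \AR engine.
\begin{verbatim}
# Minimal sketch: draw a skeleton-like hand overlay as thin segments
# between consecutive tracked joints. The joint data layout and the
# draw_segment primitive are hypothetical placeholders for the AR
# engine's line renderer.

FINGERS = ["thumb", "index", "middle", "ring", "little"]

def draw_skeleton(joints, draw_segment, radius=0.002):
    """joints: dict mapping finger name to an ordered list of 3D joint
    positions, from metacarpal to fingertip."""
    for finger in FINGERS:
        chain = joints[finger]
        for proximal, distal in zip(chain, chain[1:]):
            # Thin segments keep the real hand visible underneath.
            draw_segment(proximal, distal, radius)
\end{verbatim}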
\noindentskip In \chapref{visuo_haptic_hand}, we then investigated the visuo-haptic rendering as feedback of the direct hand manipulation with \VOs using wearable vibrotactile haptics.
In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand renderings from the previous chapter.
The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when provided close to the fingertips}, and that the visual hand rendering complemented the haptic hand rendering well by giving continuous feedback on the hand tracking.
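As an illustration of this delocalized rendering, the following minimal sketch routes contact events detected at the thumb and index fingertips to a single vibrotactile actuator worn at a configurable position on the hand; the event format, the actuator interface, and the burst parameters are hypothetical.
\begin{verbatim}
# Minimal sketch: route contact events detected at the fingertips to a
# vibrotactile actuator worn at a delocalized position on the hand.
# The event format, Actuator interface and burst parameters are
# hypothetical, not those of our implementation.

CONTACT_POINTS = {"thumb_tip", "index_tip"}

def on_contact_event(event, actuator):
    """event: (contact_point, phase), with phase 'begin' or 'end'."""
    contact_point, phase = event
    if contact_point not in CONTACT_POINTS:
        return
    # Impact-like technique: a short burst confirms contact onset
    # (and, more weakly, release), wherever the actuator is worn.
    if phase == "begin":
        actuator.pulse(duration_s=0.03, amplitude=1.0)
    elif phase == "end":
        actuator.pulse(duration_s=0.03, amplitude=0.5)
\end{verbatim}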
\section{Future Work}
The wearable visuo-haptic augmentations of perception and manipulation we presented, and the user studies we conducted in this thesis, naturally have some limitations.
In this section, we present future work for each chapter that could address them.
\subsection*{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
\paragraph{Other Augmented Object Properties}
We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}).
However, many other wearable augmentations of object properties could be considered, such as hardness, friction, temperature, or local deformations.
Such an integration of haptic augmentations of a tangible surface has almost been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but it remains to be explored with wearable haptic devices.
In addition, combination with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations.
\paragraph{Fully Integrated Tracking}
In our system, we registered the real and virtual environments (\secref[related_work]{ar_definition}) using fiducial markers and a webcam external to the \AR headset.
This only allowed us to track the index finger and the surface to be augmented with the haptic texture, but the tracking was reliable and accurate enough for our needs.
In fact, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 were not able to track a hand wearing a voice-coil actuator.
A more robust hand tracking system would support wearing haptic devices on the hand, as well as holding real objects.
A complementary solution would be to embed tracking sensors, such as an inertial measurement unit (IMU) or cameras, in the wearable haptic devices themselves, as done by \textcite{preechayasomboon2021haplets}.
This would allow a fully portable and wearable visuo-haptic system to be used in more ecological applications.
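For reference, the following minimal sketch illustrates the kind of marker-based registration described above, detecting one fiducial marker on the finger and one on the augmented surface with OpenCV's ArUco module; the marker IDs, marker size, and camera intrinsics are placeholders, not the values of our actual setup.
\begin{verbatim}
# Minimal sketch: track a fingertip marker and a surface marker with an
# external camera using OpenCV's ArUco module. Marker IDs, marker size,
# and camera intrinsics are placeholders, not our actual setup.
import cv2
import numpy as np

MARKER_SIZE = 0.02            # marker side length in meters (placeholder)
FINGER_ID, SURFACE_ID = 1, 2  # placeholder marker IDs

def estimate_marker_poses(frame, camera_matrix, dist_coeffs):
    """Return {marker_id: (rvec, tvec)} poses in the camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
        cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    # 3D corners of a square marker centered on its own origin,
    # in the same order as the detected 2D corners.
    obj = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * (MARKER_SIZE / 2)
    poses = {}
    if ids is None:
        return poses
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if marker_id not in (FINGER_ID, SURFACE_ID):
            continue
        ok, rvec, tvec = cv2.solvePnP(
            obj, marker_corners.reshape(-1, 2).astype(np.float32),
            camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses
\end{verbatim}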
\subsection*{Perception of Haptic Texture Augmentation in Augmented and Virtual Reality}
\paragraph{Visual Representation of the Virtual Texture}
The main limitation of our study is the absence of a visual representation of the virtual texture.
This is indeed a source of information as important as haptic sensations for the perception of both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}, and their interaction in the overall perception is complex.
Specifically, it remains to be investigated how to visually represent vibrotactile textures in an immersive \AR or \VR context, as the visuo-haptic coupling of such grating textures is not trivial \cite{unger2011roughness} even with real textures \cite{klatzky2003feeling}.
\paragraph{Broader Visuo-Haptic Conditions}
Our study was also conducted with an \OST-\AR headset, but the results may differ with a \VST-\AR headset.
Finally, we focused on the perception of roughness sensations rendered in \AR \vs \VR with a square-wave vibrotactile signal, but different haptic texture rendering methods should be considered.
More generally, many other types of haptic feedback could be investigated in \AR \vs \VR using the same system and methodology, such as stiffness, friction, local deformations, or temperature.
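For reference, the following minimal sketch illustrates the kind of square-wave grating rendering referred to here, assuming the vibration frequency is driven by the finger speed divided by the grating spatial period; the parameter values are illustrative, not those used in our studies.
\begin{verbatim}
# Minimal sketch: square-wave vibrotactile signal for a virtual grating,
# with frequency driven by the finger speed over the grating spatial
# period (f = v / lambda). Parameter values are illustrative only.
import numpy as np

def square_wave_texture(finger_speed, spatial_period=0.002,
                        amplitude=1.0, duration=0.1, sample_rate=44100):
    """Return one block of drive signal for a voice-coil actuator."""
    frequency = finger_speed / spatial_period  # Hz
    t = np.arange(int(duration * sample_rate)) / sample_rate
    return amplitude * np.sign(np.sin(2 * np.pi * frequency * t))
\end{verbatim}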
\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}
\subsection*{Visual Rendering of the Hand for Manipulating Virtual Objects in AR}
\paragraph{Other AR Displays}
The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
We purposely chose this type of display as it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}.
We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display.
However, the user's visual perception and experience are very different with other types of displays, such as \VST-\AR, where the \RE view is seen through a screen (\secref[related_work]{ar_displays}).
While the mutual occlusion problem and the hand tracking latency can be overcome with \VST-\AR, the visual hand rendering could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.
\paragraph{More Ecological Conditions}
We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it.
While these tasks are fundamental building blocks for more complex manipulation tasks \cite{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered.
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs, and this should also be evaluated.
\subsection*{Visuo-Haptic Rendering of Hand Manipulation With Virtual Objects in AR}
\paragraph{Richer Haptic Feedback}
The haptic rendering we considered was limited to vibrotactile feedback using \ERM motors.
While the simpler contact vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs.
This will require considering a broader range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or stretching of the skin.
More importantly, the best compromise between well-rounded haptic feedback and wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}).
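As one example of such richer renderings, a common approach for conveying hardness with vibrotactile actuators is a decaying sinusoidal transient whose amplitude scales with the impact velocity, as in the following minimal sketch; the frequency, decay, and gain values are illustrative rather than tuned.
\begin{verbatim}
# Minimal sketch: decaying sinusoidal contact transient, a common way to
# convey hardness with vibrotactile actuators (stiffer virtual materials
# would use a higher frequency and faster decay). Values are illustrative.
import numpy as np

def contact_transient(impact_velocity, frequency=250.0, decay=60.0,
                      gain=0.5, duration=0.05, sample_rate=44100):
    t = np.arange(int(duration * sample_rate)) / sample_rate
    amplitude = np.clip(gain * impact_velocity, 0.0, 1.0)
    return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * frequency * t)
\end{verbatim}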
\paragraph{Personalized Haptics}
Some users found the vibration rendering to be too strong, suggesting that adapting and personalizing the haptic feedback to one's preference is a promising approach.
In addition, although the feedback was perceived as more effective and realistic when provided close to the point of contact, other positions, such as the wrist, may be preferred and still be sufficient for a given task.
The interactions in our user study were also restricted to the thumb and index fingertips, with the haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks.
It remains to be explored how to support rendering for different and larger areas of the hand, and positioning a delocalized rendering for contact points other than the fingertips could be challenging.
Indeed, personalized haptics is gaining interest in the community \cite{malvezzi2021design, umair2021exploring}.
\section{Perspectives}
\subsection*{Towards Universal Wearable Haptic Augmentation}
A first perspective is a systematic exploration of the parameter space of the haptic rendering, to determine its most important parameters and their influence on the perception of the augmented object properties.
Another is to measure how users differ in their sensitivity to the haptic feedback, and how much these differences affect the perceived augmentation.
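As a minimal sketch of how such sensitivity differences could be quantified, the following code runs a simple 1-up/1-down staircase on the vibration amplitude to estimate a detection threshold; the stimulus presentation and response collection are left abstract behind a hypothetical \texttt{get\_response} callable.
\begin{verbatim}
# Minimal sketch: 1-up/1-down staircase estimating a user's vibrotactile
# detection threshold, one way to quantify individual differences in
# sensitivity. The get_response callable (returns True if the vibration
# was felt) abstracts the actual stimulus presentation.

def staircase_threshold(get_response, start=1.0, step=0.1, n_reversals=8):
    amplitude, direction, reversals = start, None, []
    while len(reversals) < n_reversals:
        felt = get_response(amplitude)
        new_direction = -1 if felt else +1  # felt -> decrease amplitude
        if direction is not None and new_direction != direction:
            reversals.append(amplitude)
        direction = new_direction
        amplitude = max(0.0, amplitude + direction * step)
    # Threshold estimate: mean of the last few reversal amplitudes.
    return sum(reversals[-6:]) / len(reversals[-6:])
\end{verbatim}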
\subsection*{Responsive Visuo-Haptic Augmented Reality}
Given the diversity of haptic actuators and renderings, one should ultimately be able to interact with \VOs using any haptic device, worn anywhere on the body and providing personalized feedback, and the visuo-haptic system should be able to support such adapted usage.
This calls for designing, implementing, and validating procedures that automatically calibrate the haptic feedback to the user's perception, in accordance with what it was designed to represent.
Users should also remain free to easily adjust the feedback themselves, not over the whole spectrum of vibrotactile parameters, but along two or three perceptual dimensions, obtained for instance with multidimensional scaling and controlled with sliders.
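As a minimal sketch of this last point, assuming a matrix of pairwise perceptual dissimilarities between vibrotactile stimuli collected in a preliminary study, multidimensional scaling can project the rendering parameter space onto two axes that a user could then adjust with sliders.
\begin{verbatim}
# Minimal sketch: reduce a set of vibrotactile stimuli to 2 perceptual
# dimensions with multidimensional scaling (MDS), so users can adjust the
# feedback with two sliders instead of the full parameter space.
# The dissimilarity matrix is assumed to come from pairwise ratings.
import numpy as np
from sklearn.manifold import MDS

def perceptual_axes(dissimilarities, n_dims=2):
    """dissimilarities: symmetric (n_stimuli, n_stimuli) rating matrix."""
    mds = MDS(n_components=n_dims, dissimilarity="precomputed",
              random_state=0)
    return mds.fit_transform(np.asarray(dissimilarities))
\end{verbatim}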