\chapter{Conclusion}
\mainlabel{conclusion}
\chaptertoc
\section{Summary}
In this thesis, entitled \enquote{\textbf{\ThesisTitle}}, we have shown how wearable haptics, worn on the outside of the hand, can improve direct hand interaction in immersive \AR by augmenting the perception and manipulation of the virtual.
Wearable haptics can provide rich tactile feedback on \VOs and augment the perception of real objects, both touched directly by the hand, while preserving freedom of movement and interaction with the \RE.
However, their integration with \AR is still in its infancy and presents many design, technical, and human challenges.
We have structured our research around two axes: \textbf{(I) modifying the texture perception of tangible surfaces}, and \textbf{(II) improving the manipulation of \VOs}.
\noindentskip In \partref{perception} we focused on modifying the perception of wearable and immersive virtual visuo-haptic textures that augment tangible surfaces.
Texture is a fundamental property of an object, perceived equally by sight and touch.
It is also one of the most studied haptic augmentations, but it had not yet been integrated into \AR or \VR.
We \textbf{(1)} proposed a \textbf{wearable visuo-haptic texture augmentation system}, \textbf{(2)} evaluated how the perception of haptic texture augmentations is \textbf{affected by the visual virtuality of the hand} and the environment (real, augmented, or virtual), and \textbf{(3)} investigated the \textbf{perception of co-localized visuo-haptic texture augmentations}.
In \chapref{vhar_system}, we presented a system for \textbf{augmenting any tangible surface} with virtual \textbf{visuo-haptic roughness textures} using an immersive \AR headset and a wearable vibrotactile device worn on the middle phalanx of the finger.
It enables a \textbf{free visual and touch exploration} of the textures, as if they were real: the user can view them from different angles and touch them with the bare finger, without constraints on hand movements.
The user studies in the next two chapters are based on this system.
In \chapref{xr_perception} we explored how the perception of wearable haptic augmented textures is affected by the visual virtuality of the hand and the environment, whether it is real, augmented or virtual.
We augmented the perceived roughness of the tangible surface with virtual vibrotactile patterned textures, and rendered the visual conditions by switching the \OST-\AR headset to a \VR-only view.
We then conducted a psychophysical user study with 20 participants and extensive questionnaires to evaluate the perceived roughness augmentation in these three visual conditions.
The textures were perceived as \textbf{rougher when touched with the real hand alone compared to a virtual hand} in either \AR or \VR, possibly due to the \textbf{perceived latency} between finger movements and the different visual, haptic, and proprioceptive feedback.
In \chapref{vhar_textures}, we investigated the perception of co-localized visual and wearable haptic texture augmentations on tangible surfaces.
We transposed the \textbf{data-driven visuo-haptic textures} from the \HaTT database to the system presented in \chapref{vhar_system} and conducted a user study with 20 participants to rate the coherence, realism, and perceived roughness of the combination of nine visuo-haptic texture pairs.
Participants integrated roughness sensations from both visual and haptic modalities well, with \textbf{haptics predominating the perception}, and consistently identified and matched \textbf{clusters of visual and haptic textures with similar perceived roughness}.
\noindentskip In \partref{manipulation} we focused on improving the manipulation of \VOs directly with the hand in immersive \OST-\AR.
Our approach was to design visual renderings of the hand and delocalized haptic rendering, based on the literature, and to evaluate them in user studies.
We first considered \textbf{(1) the visual rendering as hand augmentation} and then \textbf{(2)} the combination of different visuo-haptic \textbf{renderings of the hand manipulation with \VOs}.
In \chapref{visual_hand}, we investigated the visual rendering as hand augmentation.
Displayed as an \textbf{overlay on the user's hand}, such a visual hand rendering provides feedback on the hand tracking and on the interaction with \VOs.
We compared six renderings commonly used in the \AR literature in a user study with 24 participants, where we evaluated their effect on user performance and experience in two representative manipulation tasks.
The results showed that a visual hand rendering improved the user performance, perceived effectiveness, and confidence, with a \textbf{skeleton-like rendering being the most performant and effective}.
This rendering provided a detailed view of the tracked phalanges while being thin enough not to hide the real hand.
In \chapref{visuo_haptic_hand}, we then investigated the visuo-haptic rendering as feedback of the direct hand manipulation with \VOs using wearable vibrotactile haptics.
In a user study with a similar design and 20 participants, we compared two vibrotactile contact techniques, provided at \textbf{four different delocalized positions on the user's hand}, and combined with the two most representative visual hand renderings from the previous chapter.
The results showed that providing vibrotactile feedback \textbf{improved the perceived effectiveness, realism, and usefulness when it was provided close to the fingertips}, and that the visual hand rendering complemented the haptic hand rendering well in giving a continuous feedback on the hand tracking.
\section{Future Work}
The wearable visuo-haptic augmentations of perception and manipulation we presented, and the user studies we conducted for this thesis, naturally have some limitations.
In this section, we outline future work for each chapter that could address them.
\subsection*{Augmenting the Visuo-haptic Texture Perception of Tangible Surfaces}
\paragraph{Other Augmented Object Properties}
We focused on the visuo-haptic augmentation of roughness using vibrotactile feedback, because it is one of the most salient properties of surfaces (\secref[related_work]{object_properties}), one of the most studied in haptic perception (\secref[related_work]{texture_rendering}), and equally perceived by sight and touch (\secref[related_work]{visual_haptic_influence}).
However, many other wearable augmentations of object properties should be considered, such as hardness, friction, temperature, or local deformations.
Such an integration of haptic augmentations of a tangible surface has nearly been achieved with the hand-held devices of \citeauthor{culbertson2017ungrounded} \cite{culbertson2017importance,culbertson2017ungrounded}, but will be more challenging with wearable haptic devices.
In addition, combination with pseudo-haptic rendering techniques \cite{ujitoko2021survey} should be systematically investigated to expand the range of possible wearable haptic augmentations.
\paragraph{Fully Integrated Tracking}
In our system, we registered the real and virtual environments (\secref[related_work]{ar_definition}) using fiducial markers and a webcam external to the \AR headset.
This only allowed us to track the index finger and the surface to be augmented with the haptic texture, but the tracking was reliable and accurate enough for our needs.
In fact, preliminary tests we conducted showed that the built-in tracking capabilities of the Microsoft HoloLens~2 could not track hands wearing a voice-coil actuator.
A more robust hand tracking system would support wearing haptic devices on the hand, as well as holding real objects.
A complementary solution would be to embed tracking sensors, such as an inertial measurement unit (IMU) or cameras, in the wearable haptic devices themselves, as done by \textcite{preechayasomboon2021haplets}.
This would allow a fully portable and wearable visuo-haptic system to be used in more ecological applications.
\subsection*{Perception of Haptic Texture Augmentation in Augmented and Virtual Reality}
\paragraph{Visual Representation of the Virtual Texture}
The main limitation of this user study was the absence of a visual representation of the virtual patterned texture.
This is indeed a source of information as important as haptic sensations for the perception of both real textures \cite{baumgartner2013visual,bergmanntiest2007haptic,vardar2019fingertip} and virtual textures \cite{degraen2019enhancing,gunther2022smooth}, and their interaction in the overall perception is complex.
In particular, it remains to be investigated how the vibrotactile patterned textures can be represented visually in a convincing way, as the visuo-haptic coupling of such patterned textures is not trivial \cite{unger2011roughness}.% even with real textures \cite{klatzky2003feeling}.
\paragraph{Broader Visuo-Haptic Conditions}
Our study was conducted with an \OST-\AR headset, but the results may be different with a \VST-\AR headset, where the \RE is seen through cameras and screens, and the perceived simultaneity between visual and haptic stimuli, real or virtual, is different.
We also focused on the perception of roughness augmentation using wearable vibrotactile haptics and a square wave signal to simulate a patterned texture: our objective was not to accurately reproduce real textures, but to induce different perceived roughness on the same tangible surface with well-controlled parameters.
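As a minimal illustration of one common way to render such a patterned texture (the symbols below are placeholders, not the values used in the study), the square-wave grating can be driven by the finger position rather than by time:
\[
v(t) = A \,\operatorname{sgn}\!\left(\sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right)\right),
\]
where $x(t)$ is the finger position along the surface, $\lambda$ the spatial period of the virtual grating, and $A$ the vibration amplitude; tying the phase to position keeps the spatial period of the grating constant regardless of the exploration speed.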
However, more accurate models for simulating interaction with virtual textures should be applied to wearable haptic augmentations, such as in \textcite{unger2011roughness}.
Another limitation that may have affected the perception of the haptic texture augmentations is the lack of compensation for the frequency response of the actuator and amplifier \cite{asano2012vibrotactile,culbertson2014modeling,friesen2024perceived}.
The dynamic response of the finger should also be considered, and may vary between individuals.
\subsection*{Perception of Visual and Haptic Texture Augmentations in Augmented Reality}
\paragraph{Assess the Applicability of the Method}
As in the previous chapter, our aim was not to accurately reproduce real textures, but to alter the perception of a tangible surface being touched with simultaneous visual and haptic texture augmentations.
However, the results also have some limitations, as they addressed a small set of visuo-haptic textures that augmented the perception of smooth and white tangible surfaces.
Visuo-haptic texture augmentations may be difficult on surfaces that already have strong visual or haptic patterns \cite{asano2012vibrotactile}, or on objects with complex shapes.
The role of visuo-haptic texture augmentation should also be evaluated in more complex tasks, such as object recognition and assembly, or in more concrete use cases, such as displaying and touching a museum object or a 3D print before it is manufactured.
%Finally, the visual textures used were also simple color captures not meant to be used in an immersive \VE.
\paragraph{Adapt to the Specificities of Direct Touch}
The haptic textures used were captures and models of the vibrations of a hand-held probe sliding over real surfaces.
We generated the vibrotactile textures from the finger speed only \cite{culbertson2015should}, but the perceived roughness of real textures also depends on other factors, such as the contact force, angle, posture, or contact area \cite{schafer2017transfer}, whose respective importance in perception is not yet fully understood \cite{richardson2022learning}.
It would be interesting to determine the importance of these factors on the perceived realism of virtual vibrotactile textures.
The virtual texture models should also be adaptable to individual sensitivities \cite{malvezzi2021design,young2020compensating}.
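For context, the data-driven models used in this line of work (e.g., \textcite{culbertson2015should}) typically synthesize the output vibration with an autoregressive filter excited by white noise, with coefficients interpolated from recordings at different scan speeds; a minimal sketch, with all symbols illustrative, is:
\[
v_k = \sum_{i=1}^{p} a_i(s)\, v_{k-i} + \sigma(s)\, e_k,
\]
where $s$ is the current finger speed, $a_i(s)$ and $\sigma(s)$ the interpolated filter coefficients and noise gain, and $e_k$ unit-variance white noise; adapting such a model to individual sensitivities would then amount to rescaling $\sigma(s)$ or re-fitting the coefficient set per user.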
\subsection*{Visual Rendering of the Hand for Manipulating \VOs in AR}
\paragraph{Other AR Displays}
The visual hand renderings we evaluated were displayed on the Microsoft HoloLens~2, which is a common \OST-\AR headset \cite{hertel2021taxonomy}.
We purposely chose this type of display as it is with \OST-\AR that the lack of mutual occlusion between the hand and the \VO is the most challenging to solve \cite{macedo2023occlusion}.
We thus hypothesized that a visual hand rendering would be more beneficial to users with this type of display.
However, the user's visual perception and experience are different with other types of displays, such as \VST-\AR, where the \RE is seen through cameras and screens (\secref[related_work]{ar_displays}).
While the mutual occlusion problem and the hand tracking latency can be overcome with \VST-\AR, the visual hand rendering could still be beneficial to users as it provides depth cues and feedback on the hand tracking, and should be evaluated as such.
\paragraph{More Ecological Conditions}
We conducted the user study with two manipulation tasks that involved placing a virtual cube in a target volume, either by pushing it on a table or by grasping and lifting it.
While these tasks are fundamental building blocks for more complex manipulation tasks \cite{laviolajr20173d}, such as stacking or assembly, more ecological uses should be considered.
Similarly, a broader experimental study might shed light on the role of gender and age, as our subject pool was not sufficiently diverse in this regard.
Finally, all visual hand renderings received both low and high rankings from different participants, suggesting that users should be able to choose and personalize some aspects of the visual hand rendering according to their preferences or needs; this should also be evaluated.
\subsection*{Visuo-Haptic Rendering of Hand Manipulation With \VOs in AR}
\paragraph{Richer Haptic Feedback}
The haptic rendering we considered was limited to vibrotactile feedback using \ERM motors.
While the simpler contact vibration technique (Impact technique) was sufficient to confirm contacts with the cube, richer vibrotactile renderings may be required for more complex interactions, such as rendering hardness (\secref[related_work]{hardness_rendering}), textures (\secref[related_work]{texture_rendering}), friction \cite{konyo2008alternative,jeon2011extensions,salazar2020altering}, or edges and shape of \VOs.
This will require considering a broader range of haptic actuators and sensations (\secref[related_work]{wearable_haptic_devices}), such as pressure or skin stretch.
More importantly, the best compromise between well-rounded haptic feedback and the wearability of the system with respect to \AR constraints should be analyzed (\secref[related_work]{vhar_haptics}).
\paragraph{Personalized Haptics}
Some users found the vibration rendering too strong, suggesting that adapting and personalizing the haptic feedback to the user's preferences should be investigated \cite{malvezzi2021design, umair2021exploring}.
In addition, although it was perceived as more effective and realistic when provided close to the point of contact, other positionings, such as the wrist, may be preferred and still be sufficient for a given task.
The interactions in our user study were also restricted to the thumb and index fingertips, with the haptic feedback provided only for these contact points, as these are the most commonly used parts of the hand for manipulation tasks.
It remains to be explored how to support rendering for different and larger areas of the hand; moreover, positioning a delocalized rendering for contact points other than the fingertips could be challenging.
\section{Perspectives}
\subsection*{Towards Universal Wearable Haptic Augmentation}
% systematic exploration of the parameter space of the haptic rendering to determine the most important parameters their influence on the perception
% measure the difference in sensitivity to the haptic feedback and how much it affects the perception of the object properties
\subsection*{Responsive Visuo-Haptic Augmented Reality}
%Given these three points, and the diversity of haptic actuators and renderings, one might be able to interact with the \VOs with any haptic device, worn anywhere on the body and providing personalized feedback on any other part of the hand, and the visuo-haptic system should be able to support such an adapted usage.
% design, implement and validate procedures to automatically calibrate the haptic feedback to the user's perception in accordance to what it has been designed to represent
% + let user free to easily adjust (eg can't let adjust whole spectrum of vibrotactile, reduce to two or three dimensions with sliders using MDS)