Replace "immersive AR" with "AR headset"
@@ -97,7 +97,7 @@ As illustrated in \figref{sensorimotor_continuum}, \textcite{jones2006human} del
 This classification has been further refined by \textcite{bullock2013handcentric} into 15 categories of possible hand interactions with an object.
-In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of virtual objects (\partref{manipulation}) using immersive \AR and wearable haptics.
+In this thesis, we are interested in exploring visuo-haptic texture augmentations (\partref{perception}) and grasping of virtual objects (\partref{manipulation}) using an \AR headset and wearable haptics.
 
 \subsubsection{Hand Anatomy and Motion}
 \label{hand_anatomy}

@@ -2,7 +2,8 @@
 \label{augmented_reality}
 
 \AR devices generate and integrate virtual content into the user's perception of their real environment (\RE), creating the illusion of the \emph{presence} of the virtual \cite{azuma1997survey,skarbez2021revisiting}.
-Immersive systems such as headsets leave the hands free to interact with virtual objects (virtual objects), promising natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
+Among the different types of devices, \AR headsets leave the hands free to interact with virtual objects.
+This promises natural and intuitive interactions similar to those with everyday real objects \cite{billinghurst2021grand,hertel2021taxonomy}.
 
 \subsection{What is Augmented Reality?}
 \label{what_is_ar}

@@ -72,7 +73,7 @@ It doesn't require the user to wear the display, but requires a real surface to
 Regardless of the \AR display, it can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
 \emph{Spatial \AR} usually refers to projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also be \OST or \VST \emph{fixed windows} (\figref{lee2013spacetop}).
 Alternatively, \AR displays can be \emph{hand-held}, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight \cite[p.141]{billinghurst2015survey}.
-Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a highly immersive and portable experience.
+Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a portable experience.
 
 \fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}

@@ -141,7 +142,7 @@ Choosing useful and efficient \UIs and interaction techniques is crucial for the
 \label{ve_tasks}
 
 \textcite{laviolajr20173d} (p.385) classify interaction techniques into three categories based on the tasks they enable users to perform: manipulation, navigation, and system control.
-\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for immersive \AR.
+\textcite{hertel2021taxonomy} proposed a similar taxonomy of interaction techniques specifically for \AR headsets.
 
 The \emph{manipulation tasks} are the most fundamental tasks in \AR and \VR systems, and the building blocks for more complex interactions.
 \emph{Selection} is the identification or acquisition of a specific virtual object, \eg pointing at a target as in \figref{grubert2015multifi}, touching a button with a finger, or grasping an object with a hand.

@@ -175,12 +176,12 @@ In this thesis we focus on manipulation tasks of virtual content directly with t
 \label{real_virtual_gap}
 
 In \AR and \VR, the state of the system is displayed to the user as a \ThreeD spatial \VE.
-In an immersive and portable \AR system, this \VE is experienced at a 1:1 scale and as an integral part of the \RE.
+With an \AR headset, the \VE can be experienced at a 1:1 scale and as an integral part of the \RE.
 The rendering gap between the real and virtual elements, as described in our interaction loop in \figref[introduction]{interaction-loop}, is thus experienced as narrow or even not consciously perceived by the user.
 This manifests as a sense of presence of the virtual, as described in \secref{ar_presence}.
 
 As the gap between real and virtual rendering is reduced, one could expect a similarly seamless interaction with the \VE as with a \RE, which \textcite{jacob2008realitybased} called \emph{reality-based interactions}.
-As of today, an immersive \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
+As of today, an \AR system tracks itself with the user in \ThreeD, using tracking sensors and pose estimation algorithms \cite{marchand2016pose}.
 This enables the \VE to be registered with the \RE, so that the user simply moves to navigate within the virtual content.
 However, direct hand manipulation of virtual content is a challenge that requires specific interaction techniques \cite{billinghurst2021grand}.
 It is often achieved using two interaction techniques: \emph{tangible objects} and \emph{virtual hands} \cite[p.165]{billinghurst2015survey}.

@@ -276,8 +277,8 @@ This suggests that a visual hand feedback superimposed on the real hand as a par
 
 Few works have compared different visual feedback of the virtual hand in \AR or with wearable haptic feedback.
 Rendering the real hand as a semi-transparent hand in \VST-\AR is perceived as less natural but seems to be preferred over mutual visual occlusion for interaction with real and virtual objects \cite{buchmann2005interaction,piumsomboon2014graspshell}.
-Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in non-immersive \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
-In a collaborative task in immersive \OST-\AR \vs \VR, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was the most preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
+Similarly, \textcite{blaga2017usability} evaluated direct hand manipulation in \VST-\AR with a skeleton-like rendering \vs no visual hand feedback: while user performance did not improve, participants felt more confident with the virtual hand (\figref{blaga2017usability}).
+In a collaborative task in \OST-\AR \vs \VR headsets, \textcite{yoon2020evaluating} showed that a realistic human hand rendering was preferred over a low-polygon hand and a skeleton-like hand for the remote partner.
 \textcite{genay2021virtual} found that the sense of embodiment with robotic hand overlays in \OST-\AR was stronger when the environment contained both real and virtual objects (\figref{genay2021virtual}).
 Finally, \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the visual and haptic feedback of the hand in \VST-\AR, as detailed in the next section (\secref{vhar_rings}).
 Taken together, these results suggest that a visual augmentation of the hand in \AR could improve usability and performance in direct hand manipulation tasks, but the best rendering has yet to be determined.

@@ -302,7 +303,7 @@ Taken together, these results suggest that a visual augmentation of the hand in
 
 \AR systems integrate virtual content into the user's perception as if it were part of the \RE.
 \AR headsets now enable real-time pose estimation of the head and hands, and high-quality display of virtual content, while being portable and mobile.
-They enable highly immersive augmented environments that users can explore with a strong sense of the presence of the virtual content.
+They create augmented environments that users can explore with a strong sense of the presence of the virtual content.
 However, without direct and seamless interaction with the virtual objects using the hands, the coherence of the augmented environment experience is compromised.
 In particular, when manipulating virtual objects in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual augmentation of the hand.
 A common alternative approach is to use real objects as proxies for interaction with virtual objects, but this raises concerns about their coherence with visual augmentations.

@@ -5,7 +5,7 @@ Perception and manipulation of objects with the hand typically involves both the
 Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable for many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.
 
 Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
-It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.
+It is essential to understand how a visuo-haptic rendering of a virtual object is perceived as a coherent object property, and how wearable haptics have been integrated with \AR headsets.
 
 \subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
 \label{vh_perception}

@@ -60,7 +60,7 @@ More precisely, when surfaces are evaluated by vision or touch alone, both sense
 
 The overall perception can then be modified by changing one of the sensory modalities.
 \textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
-In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
+In a similar setup, but in \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
 \textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple virtual objects in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
 They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real objects seemed to be sufficient to match all the visual virtual objects (\figref{gunther2022smooth}).

@@ -85,7 +85,7 @@ For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually defo
 
 Some studies have investigated the visuo-haptic perception of virtual objects rendered with force-feedback and vibrotactile feedback in \AR and \VR.
 
-In an immersive \VST-\AR setup, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
+In \VST-\AR, \textcite{knorlein2009influence} rendered a virtual piston using force-feedback haptics that participants pressed directly with their hand (\figref{visuo-haptic-stiffness}).
 In a \TIFC task (\secref{sensations_perception}), participants pressed two pistons and indicated which was stiffer.
 One had a reference stiffness but an additional visual or haptic delay, while the other varied with a comparison stiffness but had no delay.\footnote{Participants were not told about the delays and stiffness tested, nor which piston was the reference or comparison. The order of the pistons (which one was pressed first) was also randomized.}
 Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).

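One way to make sense of this asymmetry: if perceived stiffness is judged as the ratio of the felt force to the seen displacement of the piston, then a visual delay shrinks the displacement seen at a given force. The sketch below is our own reasoning about the effect, not a formulation taken from the cited study:

```latex
% Perceived stiffness as felt force over seen (possibly delayed) displacement.
% While pressing, x(t) is increasing, so a visual delay \Delta t_v > 0 gives
% x(t - \Delta t_v) < x(t), hence \hat{k} > k: the piston feels stiffer.
% A haptic delay instead acts on F(t) and biases \hat{k} downward,
% which is consistent with the two delays cancelling each other out.
\hat{k}(t) = \frac{F(t)}{x(t - \Delta t_v)}
```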
@@ -114,7 +114,7 @@ The visuo-haptic simultaneity was varied by adding a visual delay or by triggeri
 No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag and a \qty{15}{\ms} haptic lead, and only half of them detected a \qty{100}{\ms} visual lag and a \qty{70}{\ms} haptic lead.
 
 These studies have shown how the latency of the visual rendering of a virtual object or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
-We describe in the next section how wearable haptics have been integrated with immersive \AR.
+We describe in the next section how wearable haptics have been integrated with \AR.
 
 \begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[][
 \item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.

@@ -129,7 +129,7 @@ We describe in the next section how wearable haptics have been integrated with i
 \subsection{Wearable Haptics for Direct Hand Interaction in AR}
 \label{vhar_haptics}
 
-A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
+A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in \AR.
 Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
 Several approaches have been proposed to move the haptic actuator to a different location, on the outside of the finger or the hand, \eg the nail, the top of a phalanx, or the wrist.
 Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback (\secref{tactile_rendering}) and the placement of the haptic rendering.

@@ -178,12 +178,12 @@ In a \VST-\AR setup, \textcite{scheggi2010shape} explored the effect of renderin
 The middle phalanx of each of these fingers was equipped with a haptic ring of \textcite{minamizawa2007gravity}.
 \textcite{scheggi2010shape} reported that 12 out of 15 participants found the weight haptic feedback essential to feeling the presence of the virtual cube.
 
-In a pick-and-place task in non-immersive \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
+In a pick-and-place task in \VST-\AR involving direct hand manipulation of both virtual and real objects (\figref{maisto2017evaluation}), \textcite{maisto2017evaluation} and \textcite{meli2018combining} compared the effects of providing haptic or visual feedback about fingertip-object contacts.
 They compared the haptic ring of \textcite{pacchierotti2016hring} on the proximal phalanx, the moving platform of \textcite{chinello2020modular} on the fingertip, and a visual feedback of the tracked fingertips as virtual points.
 They showed that the haptic feedback improved the completion time and reduced the force exerted on the cubes compared to the visual feedback (\figref{visual-hands}).
 The haptic ring was also perceived as more effective than the moving platform.
 However, the measured difference in performance could be due to either the device or the device position (proximal \vs fingertip), or both.
-These two studies were also conducted in non-immersive setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual feedback of the hand-object contacts, but did not examine them together.
+These two studies were also conducted in static setups, where users viewed a screen displaying the visual interactions, and only compared the haptic and visual feedback of the hand-object contacts, but did not examine them together.
 
 \begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[][
 \item Rendering weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.

@@ -200,7 +200,7 @@ A user study was conducted in \VR to compare the perception of visuo-haptic stif
 \subsection{Conclusion}
 \label{visuo_haptic_conclusion}
 
-Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation with virtual objects in immersive \AR is challenging.
+Providing coherent visuo-haptic feedback to enhance direct hand perception and manipulation of virtual objects in \AR is challenging.
 While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
 Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
 Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear whether any of them are best suited for direct hand interaction in \AR.

@@ -10,13 +10,13 @@ Only a few haptic systems can be considered wearable due to their compactness an
 If their haptic rendering is synchronized with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
 Wearable haptic augmentation of roughness and hardness is mostly achieved with vibrotactile feedback (\secref{tactile_rendering}).
 
-\AR headsets integrate virtual content into the user's perception in an immersive way, as if it were part of the \RE, with real-time pose estimation of the head and hands (\secref{what_is_ar}).
+\AR headsets integrate virtual content into the user's perception as if it were part of the \RE, with real-time pose estimation of the head and hands (\secref{what_is_ar}).
 Direct hand interaction with virtual content is often implemented using the virtual hand interaction technique, which reconstructs the user's hand in the \VE and simulates its interactions with the virtual.
 However, the perception and manipulation of the virtual is difficult due to the lack of haptic feedback and of mutual occlusion between the hand and the virtual content (\secref{ar_interaction}). %, which could be addressed by a visual augmentation of the hand (\secref{ar_visual_hands}).
 Real surrounding objects can also be used as proxies to interact with the virtual, but they may be incoherent with their visual augmentation because they are haptically passive (\secref{ar_interaction}).
 Wearable haptics on the hand seem to be a promising solution to enable coherent and effective visuo-haptic augmentation of both virtual and real objects.
 
-\noindentskip In this thesis, we will use wearable haptic feedback in immersive \AR to create visuo-haptic texture augmentation when touching real objects (\partref{perception}) and to improve manipulation of virtual objects (\partref{manipulation}), both directly with the bare hand.
+\noindentskip In this thesis, we will use wearable haptic feedback with an \AR headset to create visuo-haptic texture augmentation when touching real objects (\partref{perception}) and to improve manipulation of virtual objects (\partref{manipulation}), both directly with the bare hand.
 
 First, it is challenging to provide coherent visuo-haptic feedback when augmenting real objects.
 Because different sensory feedback, haptic and visual, real and virtual, is integrated into a single object property, perception is somewhat robust to variations in reliability and to spatial and temporal differences.
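This robustness is commonly modeled as maximum-likelihood cue integration in the style of Ernst and Banks (2002): each modality contributes a noisy estimate of the property, and the combined percept weights each estimate by its reliability. The notation below is the textbook formulation, not this manuscript's:

```latex
% Reliability-weighted fusion of the visual and haptic estimates:
\hat{s} = w_v \hat{s}_v + w_h \hat{s}_h ,
\qquad
w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_h^2} ,
\qquad
w_h = 1 - w_v
% The fused variance is never worse than the best single cue:
% \sigma^2 = \sigma_v^2 \sigma_h^2 / (\sigma_v^2 + \sigma_h^2)
%          \le \min(\sigma_v^2, \sigma_h^2)
```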