tangible -> real
@@ -138,7 +138,7 @@ In \AR, it could take the form of body accessorization, \eg wearing virtual clot
\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}
A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a tangible object, or even directly with the hands.
A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a real object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.
\subsubsection{User Interfaces and Interaction Techniques}
@@ -173,7 +173,7 @@ The \emph{system control tasks} are changes to the system state through commands
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
\item Virtual drawing on a tangible object with a hand-held pen \cite{roo2017onea}.
\item Virtual drawing on a real object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable the \VE to be registered in it.
]
\subfigsheight{36mm}
@@ -201,26 +201,28 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), and to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
According to \textcite{billinghurst2005designing}, each \VO is coupled to a tangible object, and the \VO is physically manipulated through the tangible object, providing a direct, efficient and seamless interaction with both the real and virtual content.
This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: either to visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
%According to \textcite{billinghurst2005designing}
Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
The real objects used in this way are called \emph{tangibles}.
%This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.

Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
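As an illustration of how such automatic pairing could work, the following minimal sketch (a hypothetical scoring, not the actual method of \textcite{hettiarachchi2016annexing} or \textcite{jain2023ubitouch}) ranks the available tangibles by how closely their bounding-box dimensions match those of a \VO:

\begin{verbatim}
import numpy as np

# Score how well a candidate tangible matches a virtual object (VO),
# comparing their bounding-box dimensions (width, height, depth) in metres.
def pairing_score(vo_dims, tangible_dims):
    vo, tan = np.sort(vo_dims), np.sort(tangible_dims)  # ignore orientation
    relative_error = np.abs(vo - tan) / vo              # per-axis mismatch
    return float(1.0 / (1.0 + relative_error.mean()))   # in (0, 1], 1 = identical

# Pick the available tangible that best matches the VO.
def best_tangible(vo_dims, tangibles):
    return max(tangibles, key=lambda t: pairing_score(vo_dims, t["dims"]))

# Hypothetical inventory, echoing the drill/vaporizer pairing of Ubi-Touch.
inventory = [{"name": "vaporizer", "dims": [0.05, 0.18, 0.05]},
             {"name": "mug",       "dims": [0.09, 0.10, 0.09]}]
print(best_tangible([0.06, 0.20, 0.06], inventory)["name"])  # -> vaporizer
\end{verbatim}

A real system would also have to weigh grasp affordances and tracking reliability; the \percent{\sim 10} size tolerance reported by \textcite{kahl2021investigation} could then serve as an acceptance threshold on the per-axis mismatch.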

Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
Especially in \OST-\AR, since the \VOs are inherently slightly transparent allowing the paired tangibles to be seen through them.
In a pick-and-place task with tangibles of different shapes, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs does not affect user performance or presence, and that small variations (\percent{\sim 10} for size) were not even noticed by the users.
This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: It could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR.
Still, the virtual visual rendering and the real haptic sensations can be inconsistent.
This is especially true in \OST-\AR, where the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, differences in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) between the \VOs and the real objects were found not to affect user performance or presence, and small variations (\percent{\sim 10} in size) were not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions of objects directly touched with the hand in \AR.
\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
\item shape difference between a tangible and a \VO is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
\item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
@@ -291,7 +293,7 @@ While in \VST-\AR, this could be solved as a masking problem by combining the re
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.

Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
A \VO overlaying a tangible object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
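To make this constraint concrete, here is a minimal illustrative sketch (an assumption of this manuscript, not the implementation of \textcite{prachyabrued2014visual}) in which the \VO is modeled by a signed-distance function; a double-hand rendering would then display both the tracked and the constrained joint positions:

\begin{verbatim}
import numpy as np

# Project a tracked hand joint out of a VO so that the *rendered* hand
# never visually interpenetrates it; the tracked hand stays untouched.
# sdf(p): signed distance from p to the VO surface (negative inside).
# sdf_grad(p): outward gradient of the signed distance at p.
def constrain_to_surface(tracked_pos, sdf, sdf_grad, margin=1e-3):
    d = sdf(tracked_pos)
    if d >= 0.0:                            # outside the VO: render as tracked
        return tracked_pos
    n = sdf_grad(tracked_pos)
    n = n / np.linalg.norm(n)               # unit outward normal
    return tracked_pos + (margin - d) * n   # push back just above the surface

# Example: a spherical VO of radius 5 cm at the origin (illustrative).
sphere_sdf = lambda p: np.linalg.norm(p) - 0.05
sphere_grad = lambda p: p / np.linalg.norm(p)
fingertip = np.array([0.0, 0.0, 0.03])      # tracked 2 cm below the surface
print(constrain_to_surface(fingertip, sphere_sdf, sphere_grad))
\end{verbatim}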

Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
@@ -331,5 +333,5 @@ Taken together, these results suggest that a visual rendering of the hand in \AR
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, when manipulating \VOs in \OST-\AR, there is a lack of mutual occlusion and interaction cues between the hands and the virtual content, which could be mitigated by a visual rendering of the hand.
A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched tangible objects.
A common alternative approach is to use real objects as tangible proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.
@@ -17,7 +17,7 @@ It is essential to understand how a multimodal visuo-haptic rendering of a \VO i
\label{sensations_perception}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}).

If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable: measured multiple times, the same property, \eg the weight of an object, may yield different estimates.
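Formally, in the \MLE model of \textcite{ernst2004merging}, the integrated estimate \(\hat{s}\) of a property sensed visually (\(s_v\), with variance \(\sigma_v^2\)) and haptically (\(s_h\), with variance \(\sigma_h^2\)) is the reliability-weighted average
\[
\hat{s} = w_v s_v + w_h s_h,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_v^2 + 1/\sigma_h^2},
\qquad
\sigma_{vh}^2 = \frac{\sigma_v^2 \sigma_h^2}{\sigma_v^2 + \sigma_h^2} \leq \min(\sigma_v^2, \sigma_h^2),
\]
so the combined percept is biased toward the more reliable modality and is never less reliable than either sensation alone.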
@@ -63,7 +63,7 @@ The \MLE model implies that when seeing and touching a \VO in \AR, the combinati
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
\subsubsection{Influence of Visual Rendering on Tangible Perception}
\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
@@ -74,19 +74,19 @@ The overall perception can then be modified by changing one of the sensory modal
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.

\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs \cite{gunther2022smooth}.}

Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
For example, on a fixed \VST-\AR screen (\secref{ar_displays}), visually deforming the geometry of a real object touched by the hand, as well as that of the touching hand, can alter the visuo-haptic perception of its size, shape, or curvature \cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard real surface by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture rendered as vibrations through a hand-held stylus (\secref{texture_rendering}) by adding small oscillations to the visual feedback of the stylus on a screen.
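The principle of this last technique can be sketched as follows, with purely illustrative parameter values (not those of \textcite{ujitoko2019modulating}): small oscillations, locked to the spatial period of the virtual texture, are added to the displayed stylus position so that the texture is seen, and therefore felt, as rougher:

\begin{verbatim}
import numpy as np

# Pseudo-haptic modulation of the visual stylus trace on a screen.
# tracked_x: 1D array of tracked stylus positions along the surface (m).
# grating_period: spatial period of the virtual patterned texture (m).
# gain: amplitude of the added visual oscillation (m).
def displayed_stylus_positions(tracked_x, grating_period=0.004, gain=0.0015):
    phase = 2.0 * np.pi * tracked_x / grating_period  # one cycle per groove
    return tracked_x + gain * np.sin(phase)           # what the user sees

# Example: a 10 cm stroke sampled at 1 kHz.
tracked = np.linspace(0.0, 0.10, 1000)
visual = displayed_stylus_positions(tracked)
\end{verbatim}

Because the added visual oscillation is coherent with the vibrotactile rendering (\secref{texture_rendering}), both sensations can be attributed to the same surface property, increasing the perceived roughness.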
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[][
\item A virtual soft texture projected on a table that deforms when pressed by the hand \cite{punpongsanon2015softar}.
\item Modifying visually a tangible object and the hand touching it in \VST-\AR to modify its perceived shape \cite{ban2014displaying}.
\item Visually modifying a real object and the hand touching it in \VST-\AR to alter its perceived shape \cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
@@ -167,7 +167,7 @@ This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real tangible objects.
Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real objects.
% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9