\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}

A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a real object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.
\subsubsection{User Interfaces and Interaction Techniques}
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item of an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
\item Virtual drawing on a real object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time, enabling the \VE to be registered in it.
]
\subfigsheight{36mm}
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
As \AR integrates visual virtual content into the perception of the \RE, it can involve real surrounding objects as \UI: either to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
%According to \textcite{billinghurst2005designing}
Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
The real objects used in this way are called \emph{tangibles}.
%This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.

Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object as a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by moving the tangible into the \VO and pressing it with the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
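In such time-multiplexed interfaces, pairing amounts to freezing the rigid transform between the tracked tangible and the \VO at grasp time and reapplying it every frame. A minimal sketch of this coupling (Python with NumPy; the pose representation and function names are illustrative, not taken from the cited systems):

```python
import numpy as np

def pair(vo_pose_world, tangible_pose_world):
    """On pairing (e.g. a grasp inside the VO), record the constant
    offset between the tangible and the VO so the coupling is rigid."""
    return np.linalg.inv(tangible_pose_world) @ vo_pose_world

def update(tangible_pose_world, offset):
    """Each frame, the VO rigidly follows the tracked tangible."""
    return tangible_pose_world @ offset

def translation(x, y, z):
    """Helper: 4x4 homogeneous translation matrix (metres)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Example: tangible at the origin, VO 5 cm above it at pairing time.
vo = translation(0.0, 0.05, 0.0)
tangible = np.eye(4)
offset = pair(vo, tangible)

# Move the tangible 10 cm along x: the VO keeps its 5 cm offset.
tangible = translation(0.10, 0.0, 0.0)
vo = update(tangible, offset)
assert np.allclose(vo[:3, 3], [0.10, 0.05, 0.0])
```

Unpairing simply discards the stored offset, freeing the tangible to be coupled to the next \VO.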

Still, the virtual visual rendering and the real haptic sensations can be inconsistent.
This is especially true in \OST-\AR, since the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
In a pick-and-place task with real objects, differences in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) between the real objects and the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} in size) were not even noticed by the users.
This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: this could be used to render coherent visuo-haptic material perceptions when real objects are touched directly with the hand in \AR.
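To illustrate how a system might decide whether a simplified real object can stand in for a \VO, the reported tolerance can be turned into a simple per-axis size check. The sketch below is ours: the \percent{\sim 10} threshold comes from the size study above and is used only as an illustrative default; the function names and bounding-box representation are assumptions.

```python
def size_mismatch(vo_extents, tangible_extents):
    """Largest per-axis relative size difference between a VO and a
    candidate real proxy (extents = (x, y, z) bounding-box sizes in m)."""
    return max(abs(v - t) / v for v, t in zip(vo_extents, tangible_extents))

def acceptable(vo_extents, tangible_extents, tolerance=0.10):
    """Illustrative threshold: ~10% size variation went unnoticed in the
    cited pick-and-place study; larger mismatches are treated as visible."""
    return size_mismatch(vo_extents, tangible_extents) <= tolerance

# Hypothetical virtual mug, 8 x 10 x 8 cm.
mug_vo = (0.08, 0.10, 0.08)
assert acceptable(mug_vo, (0.085, 0.10, 0.08))      # ~6% off: unnoticed
assert not acceptable(mug_vo, (0.12, 0.10, 0.08))   # 50% off: visible
```

A pairing method in the spirit of \cite{hettiarachchi2016annexing} would then rank the available real objects by such a mismatch score and render each \VO on the best-matching proxy.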
\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
\item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
\item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
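One common way to realize such a physically constrained hand rendering is a god-object-style proxy: the visual fingertip is clamped to the \VO's surface while the tracked fingertip passes through the intangible \VO. A minimal sketch for a single fingertip and a spherical \VO (Python; a simplification of the general idea, not the method of the cited works):

```python
import math

def constrained_fingertip(real_tip, center, radius):
    """Visual fingertip position for a sphere VO: if the tracked
    fingertip penetrates the sphere, the visual fingertip is projected
    back onto the surface; otherwise it follows the real hand."""
    d = [p - c for p, c in zip(real_tip, center)]
    dist = math.sqrt(sum(x * x for x in d))
    if dist >= radius or dist == 0.0:
        return real_tip  # outside (or degenerate): follow the real hand
    s = radius / dist    # push the penetrating tip back to the surface
    return tuple(c + x * s for c, x in zip(center, d))

# A fingertip 3 cm from the center of a 5 cm-radius sphere (2 cm deep)
# is drawn on the surface, 5 cm from the center.
tip = constrained_fingertip((0.03, 0.0, 0.0), (0.0, 0.0, 0.0), 0.05)
assert abs(tip[0] - 0.05) < 1e-9
```

Applying this constraint per finger yields the visually constrained hand, while the real hand remains free, mirroring the double-hand rendering above.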

Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, there is a lack of mutual occlusion and interaction cues between the hands and virtual content when manipulating \VOs in \OST-\AR that could be mitigated by a visual rendering of the hand.
A common alternative approach is to use real objects as tangible proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.