tangible -> real
Touching, grasping and manipulating \VOs are fundamental interactions for \AR \cite{kim2018revisiting}, \VR \cite{bergstrom2021how} and \VEs in general \cite{laviolajr20173d}.
Since the hand is neither occupied nor covered by a haptic device that would impair interaction with the \RE, as described in the previous section, one can expect seamless and direct manipulation of the virtual content with the hand, as if it were real.
-Thus, augmenting a tangible object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.
+Thus, augmenting a real object has the advantage of physically constraining the hand, allowing for easy and natural interaction, but manipulating a purely \VO with the bare hand can be challenging without good haptic feedback \cite{maisto2017evaluation,meli2018combining}. %, and one will rely on visual and haptic feedback to guide the interaction.

In addition, current \AR systems have visual rendering limitations that also affect interaction with \VOs. %, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
\AR is the display of superimposed images of the virtual world, synchronized with the user's current view of the real world.
@@ -178,31 +178,31 @@ Our approach is to
We consider two main axes of research, each addressing one of the research challenges identified above:
\begin{enumerate*}[label=(\Roman*)]
-\item \textbf{modifying the texture perception of tangible surfaces}, and % with visuo-haptic texture augmentations, and
+\item \textbf{modifying the texture perception of real surfaces}, and % with visuo-haptic texture augmentations, and
\item \textbf{improving the manipulation of virtual objects}.% with visuo-haptic augmentations of the hand-object interaction.
\end{enumerate*}
Our contributions in these two axes are summarized in \figref{contributions}.
\fig[0.95]{contributions}{Summary of our contributions through the simplified interaction loop.}[
The contributions are represented in dark gray boxes, and the research axes in light green circles.
-The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of tangible surfaces, directly touched by the hand.
+The first axis is \textbf{(I)} the design and evaluation of the perception of visuo-haptic texture augmentations of real surfaces, directly touched by the hand.
The second axis focuses on \textbf{(II)} improving the manipulation of \VOs with the bare hand using visuo-haptic augmentations of the hand as interaction feedback.
]
-\subsectionstarbookmark{Axis I: Modifying the Texture Perception of Tangible Surfaces}
+\subsectionstarbookmark{Axis I: Modifying the Texture Perception of Real Surfaces}

-Wearable haptic devices have proven to be effective in modifying the perception of a touched tangible surface, without modifying the tangible, nor covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
+Wearable haptic devices have proven to be effective in altering the perception of a touched real surface, without modifying the object nor covering the fingertip, forming a haptic \AE \cite{bau2012revel,detinguy2018enhancing,salazar2020altering}.
%It is achieved by placing the haptic actuator close to the fingertip, to let it free to touch the surface, and rendering tactile stimuli timely synchronised with the finger movement.
-%It enables rich haptic feedback as the combination of kinesthetic sensation from the tangible and cutaneous sensation from the actuator.
+%It enables rich haptic feedback as the combination of kinesthetic sensation from the real surface and cutaneous sensation from the actuator.
However, wearable haptic augmentations remain little explored in \AR, as does the visuo-haptic augmentation of textures.
Texture is indeed one of the fundamental perceived properties of a surface's material \cite{hollins1993perceptual,okamoto2013psychophysical}, perceived equally well by both sight and touch \cite{bergmanntiest2007haptic,baumgartner2013visual}, and one of the most studied haptic-only (without visual) augmentations \cite{unger2011roughness,culbertson2014modeling,strohmeier2017generating}.
Being able to coherently substitute the visuo-haptic texture of an everyday surface directly touched by a finger is an important step towards \AR capable of visually and haptically augmenting the \RE of a user in a plausible way.
-For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting tangible surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
+For this first axis of research, we propose to design and evaluate the \textbf{perception of wearable virtual visuo-haptic textures augmenting real surfaces}. %, using an immersive \AR headset and a wearable vibrotactile device.
To this end, we (1) design a system for rendering wearable visuo-haptic texture augmentations, to (2) evaluate how the perception of haptic texture augmentations is affected by the visual virtuality of the hand and the environment (real, augmented, or virtual), and (3) investigate the perception of co-localized visuo-haptic texture augmentations.
First, an effective approach for rendering haptic textures is to generate a vibrotactile signal that represents the finger-texture interaction \cite{culbertson2014modeling,asano2015vibrotactile}.
Yet, achieving natural interaction with the hand and coherent visuo-haptic feedback requires real-time rendering of the textures, no constraints on the hand movements, and good synchronization between the visual and haptic feedback.
-Thus, our first objective is to \textbf{design an immersive, real time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on tangible surfaces.
+Thus, our first objective is to \textbf{design an immersive, real-time system} that allows free exploration with the bare hand of \textbf{wearable visuo-haptic texture augmentations} on real surfaces.
It will form the basis of the next two chapters in this section.
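For illustration, a minimal signal model, assuming a sinusoidal grating of spatial period $\lambda$ explored at finger position $x(t)$ along the surface, drives the actuator with
\begin{equation}
a(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\end{equation}
so that the instantaneous frequency $f(t) = v(t)/\lambda$ follows the finger speed $v(t)$, as on a real grating; data-driven texture models \cite{culbertson2014modeling} are richer, but share this motion-driven structure.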
Second, many works have investigated the haptic augmentation of textures, but none have integrated them with \AR and \VR, or have considered the influence of the degree of visual virtuality on their perception.
@@ -211,7 +211,7 @@ Hence, our second objective is to \textbf{evaluate how the perception of wearabl
Finally, some visuo-haptic texture databases have been modeled from real texture captures \cite{culbertson2014penn,balasubramanian2024sens3}, to be rendered as virtual textures with hand-held haptic devices that are perceived as similar to real textures \cite{culbertson2015should,friesen2024perceived}.
However, the rendering of these textures in an immersive and natural visuo-haptic \AR using wearable haptics remains to be investigated.
-Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of tangible surfaces in \AR, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.
+Our third objective is to \textbf{evaluate the perception of simultaneous and co-localized wearable visuo-haptic texture augmentations} of real surfaces in \AR, and to understand to what extent each sensory modality contributes to the overall perception of the augmented texture.

\subsectionstarbookmark{Axis II: Improving the Manipulation of Virtual Objects}
@@ -248,10 +248,10 @@ Finally, we describe how multimodal visual and haptic feedback have been combine
We then address each of our two research axes in a dedicated part.
\noindentskip
-In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of tangible surfaces.
+In \textbf{\partref{perception}} we present our contributions to the first axis of research: modifying the visuo-haptic texture perception of real surfaces.
We evaluate how the visual rendering of the hand (real or virtual), the environment (\AR or \VR) and the textures (coherent, different or not shown) affect the perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.
-In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment tangible surfaces.%, using an immersive \OST-\AR headset and a wearable vibrotactile device.
+In \textbf{\chapref{vhar_system}} we design and implement a system for rendering visuo-haptic virtual textures that augment real surfaces. %, using an immersive \OST-\AR headset and a wearable vibrotactile device.
The haptic textures represent a periodic patterned texture rendered by a wearable vibrotactile actuator worn on the middle phalanx of the finger touching the surface.
The real hand and the environment are tracked using a marker-based technique, and the visual rendering is performed by the immersive \OST-\AR headset Microsoft HoloLens~2.
The system allows free visual and haptic exploration of the textures, as if they were real, and forms the basis of the next two chapters.
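A minimal sketch of the corresponding rendering step, with hypothetical names and a simple sinusoidal grating model rather than our actual signal model, could look as follows:
\begin{verbatim}
import numpy as np

SPATIAL_PERIOD = 0.002  # grating period in meters (assumed value)
SAMPLE_RATE = 1000      # haptic update rate in Hz (assumed value)

def drive_signal(finger_positions):
    """Convert tracked finger positions along the surface (in meters,
    one per haptic tick) into a normalized vibrotactile drive signal."""
    x = np.asarray(finger_positions)
    return np.sin(2 * np.pi * x / SPATIAL_PERIOD)

# Example: a finger accelerating across the surface for one second
# produces a chirp whose frequency tracks the finger speed.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
x = 0.05 * t ** 2              # finger position in meters
samples = drive_signal(x)      # to be written to the actuator buffer
\end{verbatim}
Keeping the mapping position-driven rather than time-driven makes the vibration stop when the finger stops, as with a real texture.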
@@ -138,7 +138,7 @@ In \AR, it could take the form of body accessorization, \eg wearing virtual clot
\subsection{Direct Hand Manipulation in AR}
\label{ar_interaction}
-A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a tangible object, or even directly with the hands.
+A user in \AR must be able to interact with the virtual content to fulfil the second point of \textcite{azuma1997survey}'s definition (\secref{ar_definition}) and complete the interaction loop (\figref[introduction]{interaction-loop}). %, \eg through a hand-held controller, a real object, or even directly with the hands.
In all examples of \AR applications shown in \secref{ar_applications}, the user interacts with the \VE using their hands, either directly or through a physical interface.
\subsubsection{User Interfaces and Interaction Techniques}
@@ -173,7 +173,7 @@ The \emph{system control tasks} are changes to the system state through commands
\begin{subfigs}{interaction-techniques}{Interaction techniques in \AR. }[][
\item Spatial selection of a virtual item on an extended display using a hand-held smartphone \cite{grubert2015multifi}.
\item Displaying the route to follow as an overlay registered on the \RE \cite{grubert2017pervasive}.
-\item Virtual drawing on a tangible object with a hand-held pen \cite{roo2017onea}.
+\item Virtual drawing on a real object with a hand-held pen \cite{roo2017onea}.
\item Simultaneous Localization and Mapping (SLAM) algorithms such as KinectFusion \cite{newcombe2011kinectfusion} reconstruct the \RE in real time and enable the \VE to be registered in it.
]
\subfigsheight{36mm}
@@ -201,26 +201,28 @@ It is often achieved using two interaction techniques: \emph{tangible objects} a
\subsubsection{Manipulating with Tangibles}
\label{ar_tangibles}
-As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: to visually augment them, \eg by superimposing visual textures \cite{roo2017inner} (\figref{roo2017inner}), and to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
-According to \textcite{billinghurst2005designing}, each \VO is coupled to a tangible object, and the \VO is physically manipulated through the tangible object, providing a direct, efficient and seamless interaction with both the real and virtual content.
-This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.
+As \AR integrates visual virtual content into \RE perception, it can involve real surrounding objects as \UI: either to visually augment them (\figref{roo2017inner}), or to use them as physical proxies to support interaction with \VOs \cite{ishii1997tangible}.
+%According to \textcite{billinghurst2005designing}
+Each \VO is coupled to a real object and physically manipulated through it, providing a direct, efficient and seamless interaction with both the real and virtual content \cite{billinghurst2005designing}.
+The real objects are called \emph{tangible} in this usage context.
+%This technique is similar to mapping the movements of a mouse to a virtual cursor on a screen.

Methods have been developed to automatically pair and adapt the \VOs for rendering with available tangibles of similar shape and size \cite{hettiarachchi2016annexing,jain2023ubitouch} (\figref{jain2023ubitouch}).
The issue with these \emph{space-multiplexed} interfaces is the large number and variety of tangibles required.
An alternative is to use a single \emph{universal} tangible object like a hand-held controller, such as a cube \cite{issartel2016tangible} or a sphere \cite{englmeier2020tangible}.
These \emph{time-multiplexed} interfaces require interaction techniques that allow the user to pair the tangible with any \VO, \eg by placing the tangible into the \VO and pressing the fingers \cite{issartel2016tangible} (\figref{issartel2016tangible}), similar to a real grasp (\secref{grasp_types}).
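As an illustration of such pairing, a naive matcher could assign each \VO to the still-free tangible with the most similar bounding box; this is only a hypothetical sketch, as published systems \cite{hettiarachchi2016annexing,jain2023ubitouch} rely on richer shape and affordance descriptors:
\begin{verbatim}
def pair_with_tangibles(virtual_objects, tangibles):
    """Greedily pair each virtual object with the still-free tangible
    whose bounding box (width, height, depth in meters) is closest."""
    free = list(tangibles)
    pairs = {}
    for name, box in virtual_objects.items():
        best = min(free, key=lambda t: sum((a - b) ** 2
                                           for a, b in zip(box, t[1])))
        pairs[name] = best[0]
        free.remove(best)
    return pairs

virtual_objects = {"virtual_drill": (0.20, 0.25, 0.06)}
tangibles = [("vaporizer", (0.18, 0.24, 0.07)),
             ("cube", (0.05, 0.05, 0.05))]
print(pair_with_tangibles(virtual_objects, tangibles))
# {'virtual_drill': 'vaporizer'}
\end{verbatim}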
-Still, the virtual visual rendering and the tangible haptic sensations can be inconsistent.
-Especially in \OST-\AR, since the \VOs are inherently slightly transparent allowing the paired tangibles to be seen through them.
-In a pick-and-place task with tangibles of different shapes, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs does not affect user performance or presence, and that small variations (\percent{\sim 10} for size) were not even noticed by the users.
-This suggests the feasibility of using simplified tangibles in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
-Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched tangible can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: It could be used to render coherent visuo-haptic material perceptions directly touched with the hand in \AR.
+Still, the virtual visual rendering and the real haptic sensations can be inconsistent.
+This is especially true in \OST-\AR, since the \VOs are inherently slightly transparent, allowing the paired real objects to be seen through them.
+In a pick-and-place task with real objects, a difference in size \cite{kahl2021investigation} (\figref{kahl2021investigation}) and shape \cite{kahl2023using} (\figref{kahl2023using_1}) of the \VOs did not affect user performance or presence, and small variations (\percent{\sim 10} in size) were not even noticed by the users.
+This suggests the feasibility of using simplified real objects in \AR whose spatial properties (\secref{object_properties}) abstract those of the \VOs.
+Similarly, in \secref{tactile_rendering} we described how a material property (\secref{object_properties}) of a touched real object can be modified using wearable haptic devices \cite{detinguy2018enhancing,salazar2020altering}: it could be used to render coherent visuo-haptic material perceptions of surfaces directly touched with the hand in \AR.

\begin{subfigs}{ar_tangibles}{Manipulating \VOs with tangibles. }[][
\item Ubi-Touch paired the movements and screw interaction of a virtual drill with a real vaporizer held by the user \cite{jain2023ubitouch}.
-\item A tangible cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
+\item A real cube that can be moved into the \VE and used to grasp and manipulate \VOs \cite{issartel2016tangible}.
\item Size and
-\item shape difference between a tangible and a \VO is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
+\item shape difference between a real object and a virtual one is acceptable for manipulation in \AR \cite{kahl2021investigation,kahl2023using}.
]
\subfigsheight{37.5mm}
\subfig{jain2023ubitouch}
@@ -291,7 +293,7 @@ While in \VST-\AR, this could be solved as a masking problem by combining the re
%However, this effect still causes depth conflicts that make it difficult to determine if one's hand is behind or in front of a \VO, \eg the thumb is in front of the virtual cube, but could be perceived to be behind it.
Since the \VE is intangible, adding a visual rendering of the virtual hand in \AR that is physically constrained to the \VOs would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
-A \VO overlaying a tangible object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
+A \VO overlaying a real object in \OST-\AR can vary in size and shape without degrading user experience or manipulation performance \cite{kahl2021investigation,kahl2023using}.
This suggests that a visual hand rendering superimposed on the real hand as a partial avatarization (\secref{ar_embodiment}) might be helpful without impairing the user.
Few works have compared different visual hand renderings in \AR or with wearable haptic feedback.
@@ -331,5 +333,5 @@ Taken together, these results suggest that a visual rendering of the hand in \AR
They enable highly immersive \AEs that users can explore with a strong sense of the presence of the virtual content.
However, without a direct and seamless interaction with the \VOs using the hands, the coherence of the \AE experience is compromised.
In particular, there is a lack of mutual occlusion and interaction cues between the hands and virtual content when manipulating \VOs in \OST-\AR that could be mitigated by a visual rendering of the hand.
-A common alternative approach is to use tangible objects as proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
-In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched tangible objects.
+A common alternative approach is to use real objects as tangible proxies for interaction with \VOs, but this raises concerns about their consistency with the visual rendering.
+In this context, the use of wearable haptic systems worn on the hand seems to be a promising solution both for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of touched real objects.

@@ -17,7 +17,7 @@ It is essential to understand how a multimodal visuo-haptic rendering of a \VO i
\label{sensations_perception}
A \emph{perception} is the merging of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property \cite{ernst2004merging}.
-For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).
+For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a real object with a co-localized \VO (\secref{ar_tangibles}).

If the sensations are redundant, \ie if only one sensation could suffice to estimate the property, they are integrated to form a single perception \cite{ernst2004merging}.
No sensory information is completely reliable, and repeated measurements of the same property, \eg the weight of an object, may give different estimates.
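A standard formulation of this integration, assuming independent Gaussian noise on each modality, is the maximum-likelihood estimate that weights each sensation by its reliability:
\begin{equation}
\hat{S} = w_V \hat{S}_V + w_H \hat{S}_H,
\qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\end{equation}
where $\hat{S}_V$ and $\hat{S}_H$ are the visual and haptic estimates of the property, with variances $\sigma_V^2$ and $\sigma_H^2$. The integrated estimate has variance $\sigma_{VH}^2 = \sigma_V^2 \sigma_H^2 / (\sigma_V^2 + \sigma_H^2)$, lower than that of either modality alone, and the more reliable modality dominates the percept.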
@@ -63,7 +63,7 @@ The \MLE model implies that when seeing and touching a \VO in \AR, the combinati
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.
-\subsubsection{Influence of Visual Rendering on Tangible Perception}
+\subsubsection{Influence of Visual Rendering on Haptic Perception}
\label{visual_haptic_influence}
Thus, a visuo-haptic perception of an object's property is robust to some difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
@@ -74,19 +74,19 @@ The overall perception can then be modified by changing one of the sensory modal
\textcite{yanagisawa2015effects} altered the perceived roughness, stiffness, and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
In a similar setup, but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: many visual textures were found to match the real haptic textures.
\textcite{degraen2019enhancing} and \textcite{gunther2022smooth} also combined multiple \VOs in \VR with \ThreeD-printed hair structures or with everyday real surfaces, respectively.
-They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few tangibles seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
+They found that the visual perception of roughness and hardness influenced the haptic perception, and that only a few real, tangible objects seemed to be sufficient to match all the visual \VOs (\figref{gunther2022smooth}).
%Taken together, these studies suggest that a set of haptic textures, real or virtual, can be perceived as coherent with a larger set of visual virtual textures.
\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the visual \VOs \cite{gunther2022smooth}.}

Visual feedback can even be intentionally designed to influence haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback} \cite{ujitoko2021survey}.
-For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
-\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
+For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming the geometry of a real object touched by the hand, as well as the touching hand, the visuo-haptic perception of the size, shape, or curvature can be altered \cite{ban2013modifying,ban2014displaying}.
+\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard real surface by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture rendered as vibrations through a hand-held stylus (\secref{texture_rendering}) by adding small oscillations to the visual feedback of the stylus on a screen.
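As a minimal sketch of the principle, the visual position $x_v$ of the input (hand or stylus) is rendered as a remapped version of its real position $x_r$, for instance
\begin{equation}
x_v = k\, x_r
\qquad\text{or}\qquad
x_v = x_r + A \sin\!\left(\frac{2\pi\, x_r}{\lambda}\right),
\end{equation}
where a gain $k < 1$ increases the perceived resistance of the motion, and a small oscillation of amplitude $A$ and spatial period $\lambda$, in the spirit of \textcite{ujitoko2019modulating}, increases the perceived roughness; the exact mappings used in the cited studies differ.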
\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[][
\item A virtual soft texture projected on a table, which deforms when pressed by the hand \cite{punpongsanon2015softar}.
-\item Modifying visually a tangible object and the hand touching it in \VST-\AR to modify its perceived shape \cite{ban2014displaying}.
+\item Visually modifying a real object and the hand touching it in \VST-\AR to alter its perceived shape \cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
@@ -167,7 +167,7 @@ This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication. \qty{20}{\ms} for the Bluetooth
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
-Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism and this design is not suitable for augmenting real tangible objects.
+Still, there is a high (\qty{92}{\ms}) latency for the folding mechanism, and this design is not suitable for augmenting real objects.

% teng2021touch: (5.27+3.03+5.23+5.5+5.47)/5 = 4.9
% ando2007fingernailmounted: (2.4+2.63+3.63+2.57+3.2)/5 = 2.9