Everyday perception and manipulation of objects with the hand typically involves both the visual and haptic senses.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable of perceiving many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.

% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a multimodal visuo-haptic rendering of a \VO is perceived as a coherent object property, and how wearable haptics have been integrated with immersive \AR.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.

% spatial and temporal integration of visuo-haptic feedback as perceptual cues vs proprioception and real touch sensations
% delocalized : not at the point of contact = difficult to integrate with other perceptual cues ?

%Go back to the main objective "to understand how immersive visual and wearable haptic feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."
%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3} : we present previous work that either did haptic AR (the middle row), or haptic VR with visual AR, or visuo-haptic AR.

% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in visual \VE \cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should \enquote{modulating the feel of a real object by virtual [haptic] feedback} \cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.

% Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.

\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{sensations_perception}

No sensory information is completely reliable and may give different answers about the same object property.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ of the property $s$, with variance $\sigma_i^2$.
The \MLE model then predicts that the integrated estimate $\tilde{s}$ of the property is the weighted sum of the individual sensory estimates:
\begin{equation}{MLE}
\tilde{s} = \sum_i w_i \tilde{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
where the individual weights $w_i$ are proportional to the inverse variances:
\begin{equation}{MLE_weights}
w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2} = \frac{\sigma^2}{\sigma_i^2}
\end{equation}
and the integrated variance $\sigma^2$ is the inverse of the sum of the inverse individual variances:
\begin{equation}{MLE_variance}
\sigma^2 = \left( \sum_i \frac{1}{\sigma_i^2} \right)^{-1}
\end{equation}
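As a minimal sketch of this weighted integration, consider two hypothetical visual and haptic estimates of the same height (the numbers below are assumptions for illustration, not data from the cited studies):

```python
# Maximum-likelihood integration of two noisy sensory estimates of the
# same property s (hypothetical values for illustration).
s_v, sigma_v = 5.2, 0.5   # visual estimate (cm) and its standard deviation
s_h, sigma_h = 4.6, 1.0   # haptic estimate (cm) and its standard deviation

# Integrated variance: inverse of the sum of the inverse variances.
sigma2 = 1.0 / (1.0 / sigma_v**2 + 1.0 / sigma_h**2)

# Weights proportional to the inverse variances (they sum to 1).
w_v = sigma2 / sigma_v**2
w_h = sigma2 / sigma_h**2

# Integrated estimate: weighted sum of the individual estimates.
s_hat = w_v * s_v + w_h * s_h

print(round(w_v, 2), round(w_h, 2))          # the more reliable (visual) cue dominates
print(round(s_hat, 2))                       # estimate pulled toward the visual cue
print(sigma2 < min(sigma_v**2, sigma_h**2))  # fused estimate beats either cue alone
```

Note that the integrated variance is smaller than that of either cue alone, which is the key benefit the \MLE model predicts for multisensory fusion.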
This was demonstrated by \textcite{ernst2002humans} in a user study where participants estimated the height of a virtual bar using a fixed-window \OST-\AR display (\secref{ar_displays}) and force-feedback devices worn on the thumb and index finger (\secref{wearability_level}), as shown in \figref{ernst2002humans_setup}.
The objective was to determine a \PSE between the comparison and reference bars.
\figref{ernst2004merging_results} shows that when the visual noise was low, the visual feedback had more weight, but as visual noise increased, haptic feedback gained more weight, as predicted by the \MLE model.
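This shift of weights with noise can be sketched numerically; the standard deviations below are illustrative, not the study's values:

```python
# Visual weight w_v as visual noise grows, with haptic noise held constant:
# the integrated percept shifts from vision-dominated to haptics-dominated.
sigma_h = 1.0                          # haptic standard deviation (constant)
for sigma_v in [0.5, 1.0, 2.0, 4.0]:   # increasing visual noise
    w_v = (1.0 / sigma_v**2) / (1.0 / sigma_v**2 + 1.0 / sigma_h**2)
    print(sigma_v, round(w_v, 2))      # w_v falls as sigma_v grows
```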

\begin{subfigs}{ernst2002humans}{Visuo-haptic perception of the height of a virtual bar \cite{ernst2002humans}. }[
\item Experimental setup.%: Participants estimated height visually with an \OST-\AR display and haptically with force-feedback devices worn on the thumb and index fingers.
%\item with only haptic feedback (red) or only visual feedback (blue, with different added noise),
%\item combined visuo-haptic feedback (purple, with different visual noises).
\item Proportion of trials (vertical axis) where the comparison bar (horizontal axis) was perceived as taller than the reference bar, as a function of increasing variance (inverse of reliability) of the visual feedback (colors).
The reference had different conflicting visual $s_v$ and haptic $s_h$ heights.
]
\subfig[.34]{ernst2002humans_setup}
\subfig[.64]{ernst2004merging_results}
%\subfig{ernst2002humans_within}
%\subfig{ernst2002humans_visuo-haptic}
\end{subfigs}

%Hence, the \MLE model explains how a (visual) \VO in \AR can be perceived as coherent when combined with real haptic sensations of a tangible or a wearable haptic feedback.
The \MLE model implies that when seeing and touching a \VO in \AR, the combination of the visual and haptic sensations is weighted by their relative reliabilities.
%As long as the user is able to associate the sensations as the same object property, and even if there are discrepancies between the sensations, the overall perception can be influenced by changing one of the stimuli, as discussed in the next sections.
%for example by including tangible objects, wearable haptic feedback, or even by altering the visual rendering of the \VO, as discussed in the next sections.

\subsubsection{Influence of Visual Rendering on Tangible Perception}
\label{visual_haptic_influence}

For example, in a fixed \VST-\AR screen (\secref{ar_displays}), by visually deforming a tangible object and the hand touching it, the perceived shape of the object can be modified \cite{ban2014displaying}.
\textcite{ujitoko2019modulating} increased the perceived roughness of a virtual patterned texture, rendered as vibrations through a hand-held stylus (\secref{texture_rendering}), by adding small oscillations to the visual feedback of the stylus on a screen.

\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table that deforms when pressed by the hand \cite{punpongsanon2015softar}.
\item Visually modifying a tangible object and the hand touching it in \VST-\AR to alter its perceived shape \cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\end{subfigs}

%In all of these studies, the visual expectations of participants influenced their haptic perception.
One had a reference stiffness but an additional visual or haptic delay, while the other had a varying comparison stiffness and no delay.
Adding a visual delay increased the perceived stiffness of the reference piston, while adding a haptic delay decreased it, and adding both delays cancelled each other out (\figref{knorlein2009influence_2}).

\begin{subfigs}{visuo-haptic-stiffness}{Perception of haptic stiffness in \VST-\AR \cite{knorlein2009influence}. }[
\item Participant pressing a virtual piston, rendered by a force-feedback device, with their hand.
\item Proportion of trials where the comparison piston was perceived as stiffer than the reference piston (vertical axis), as a function of the comparison stiffness (horizontal axis) and the visual and haptic delays of the reference (colors).
]
\subfig[.44]{knorlein2009influence_1}
\subfig[.55]{knorlein2009influence_2}
\end{subfigs}

%explained how these delays affected the integration of the visual and haptic perceptual cues of stiffness.
The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both sight and proprioception as the ratio of the exerted force $F(t)$ and the displacement $D(t)$ of the piston, following \eqref{stiffness}, but with potential visual $\Delta t_v$ or haptic $\Delta t_h$ delays:
\begin{equation}{stiffness_delay}
\tilde{k}(t) = \frac{F(t - \Delta t_h)}{D(t - \Delta t_v)}
\end{equation}
Therefore, the perceived stiffness $\tilde{k}(t)$ decreases with a haptic delay in force and increases with a visual delay in displacement \cite{diluca2011effects}.
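This effect can be sketched numerically, assuming a linear press at constant speed (all values below are illustrative, not taken from the cited experiments):

```python
# Perceived stiffness of a pressed virtual piston under sensory delays.
k_true = 100.0   # rendered stiffness (N/m), illustrative
v = 0.05         # constant pressing speed (m/s), illustrative
t = 0.5          # time of the estimate (s)

def displacement(x):
    # Linear press: displacement grows at constant speed.
    return v * max(x, 0.0)

def force(x):
    # Elastic rendering: force proportional to displacement.
    return k_true * displacement(x)

def perceived_k(dt_v=0.0, dt_h=0.0):
    # A delayed sense perceives the signal produced dt seconds earlier.
    return force(t - dt_h) / displacement(t - dt_v)

print(perceived_k())                   # no delay: recovers the true stiffness
print(perceived_k(dt_v=0.1) > k_true)  # delayed displacement: feels stiffer
print(perceived_k(dt_h=0.1) < k_true)  # delayed force: feels softer
```

The two inequalities match the empirical finding of \textcite{knorlein2009influence} that a visual delay increases, and a haptic delay decreases, the perceived stiffness during pressing.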

This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (visuo or optical) and to see the hand touching the \VO.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR \cite{gaffary2017ar}. }[
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR and
\item in \VR.
]
\subfig[0.35]{gaffary2017ar_1}
\subfig[0.3]{gaffary2017ar_3}
\subfig[0.3]{gaffary2017ar_4}
\end{subfigs}
Finally, \textcite{diluca2019perceptual} investigated the perceived simultaneity of visuo-haptic contact with a \VO in \VR.
No participant (out of 19) was able to detect a \qty{50}{\ms} visual lag.
These studies have shown how the latency of the visual rendering of a \VO or the type of environment (\VE or \RE) can affect the perceived haptic stiffness of the object, rendered with a grounded force-feedback device.
We describe in the next section how wearable haptics have been integrated with immersive \AR.


\subsection{Wearable Haptics for Direct Hand Interaction in AR}
\label{vhar_haptics}

Other wearable haptic actuators have been proposed for \AR, but are not discussed here.
A first reason is that they permanently cover the fingertip and affect the interaction with the \RE, such as thin-skin tactile interfaces \cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces \cite{han2018hydroring}.
Another category relies on systems that cannot be considered portable, such as REVEL \cite{bau2012revel}, which provides friction sensations through reverse electrovibration but must modify the real objects to be augmented, or Electrical Muscle Stimulation (EMS) devices \cite{lopes2018adding}, which provide kinesthetic feedback by contracting the muscles.

\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}

Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) presented Haplets, very compact nail-mounted devices with integrated sensing and vibrotactile feedback.
However, no formal user study has been conducted to evaluate these devices in \AR.

\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper \cite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip \cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin \cite{maeda2022fingeret}.
\item Haplets is a very compact nail device with integrated sensing and vibrotactile feedback \cite{preechayasomboon2021haplets}.
]
\subfigsheight{33mm}
%\subfig{ando2007fingernailmounted}
\subfig{teng2021touch}
\subfig{maeda2022fingeret}
\subfig{preechayasomboon2021haplets}
\end{subfigs}


\subsubsection{Belt Devices}
\label{vhar_rings}

However, the measured difference in performance could be due to either the devices or other factors of the experimental design.
Moreover, these two studies were conducted in non-immersive setups where users viewed the visual interactions on a screen, and they compared the haptic and visual renderings of the hand-object contacts separately, without examining them together.

\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering the weight of a virtual cube placed on a real surface \cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube \cite{maisto2017evaluation,meli2018combining}.
]
\subfigsheight{57mm}
\subfig{scheggi2010shape}
\subfig{maisto2017evaluation}
\end{subfigs}

%\subsubsection{Wrist Bracelet Devices}