%Go back to the main objective "to understand how immersive visual and \WH feedback compare and complement each other in the context of direct hand perception and manipulation with augmented objects" and the two research challenges: "providing plausible and coherent visuo-haptic augmentations, and enabling effective manipulation of the augmented environment."

%Also go back to the \figref[introduction]{visuo-haptic-rv-continuum3}: we present previous work that either did haptic AR (the middle row), haptic VR with visual AR, or visuo-haptic AR.

% One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE~\cite{maclean2008it,culbertson2018haptics}. Moreover, a haptic \AR system should \enquote{modulat[e] the feel of a real object by virtual [haptic] feedback}~\cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.

% Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.

\subsubsection{Merging the Sensations into a Perception}
\label{sensations_perception}

A \emph{perception} is the merge of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).

When the sensations are redundant, \ie when only one sensation could be enough to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}.
No sensory information is completely reliable: it can provide different answers to the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimated property $\tilde{s}$ is the weighted sum of the individual sensory estimates:
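A standard form of this sum, reconstructed here from the \MLE cue-integration literature (the exact notation and label of the original equation may differ), weights each estimate by its relative reliability:
\begin{equation}{mle_integration}
	\tilde{s} = \sum_i w_i \, \tilde{s}_i
	\qquad \text{with} \qquad
	w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2}
\end{equation}
The most reliable (lowest-variance) sensation thus dominates the integrated estimate, whose variance $1/\sum_i (1/\sigma_i^2)$ is lower than that of any individual sensation.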
\label{visual_haptic_influence}

A visuo-haptic perception of an object's property is thus robust to a certain difference between the two sensory modalities, as long as one can match their respective sensations to the same property.
In particular, the texture perception of everyday objects is known to be constructed from both vision and touch~\cite{klatzky2010multisensory}.
More precisely, when evaluating surfaces with vision or touch only, both senses mainly discriminate materials by the same properties of roughness, hardness and friction, and with similar performance~\cite{bergmanntiest2007haptic,baumgartner2013visual,vardar2019fingertip}.

The overall perception can then be modified by changing one of the sensory modalities, as shown by \textcite{yanagisawa2015effects}, who altered the perceived roughness, stiffness and friction of real tactile materials touched by the finger by superimposing different real visual textures using a half-mirror.
With a similar setup but in immersive \VST-\AR, \textcite{kitahara2010sensory} overlaid visual textures on real textured surfaces touched through a glove: Participants matched visual textures to real textures when their respective hardness felt similar.

\fig{gunther2022smooth}{In a passive touch context in \VR, only a smooth and a rough real surface were found to match all the \VOs~\cite{gunther2022smooth}.}

The visual feedback can even be designed on purpose to influence the haptic perception, usually by deforming the visual representation of a user input, creating a \enquote{pseudo-haptic feedback}~\cite{ujitoko2021survey}.
For example, in a fixed \VST-\AR screen (\secref{ar_displays}), visually deforming the geometry of a tangible object touched by the hand, as well as the touching hand, can alter the visuo-haptic perception of its size, shape or curvature~\cite{ban2013modifying,ban2014displaying}.
\textcite{punpongsanon2015softar} used this technique in spatial \AR (\secref{ar_displays}) to induce a softness illusion of a hard tangible object by superimposing a virtual texture that deforms when pressed by the hand (\figref{punpongsanon2015softar}).
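At its core, pseudo-haptic feedback inserts a control/display (C/D) gain between the sensed user input and its displayed counterpart. A minimal sketch of this mapping (hypothetical names, not the implementation of the cited systems):

```python
def displayed_displacement(physical_mm: float, cd_gain: float) -> float:
    """Apply a control/display (C/D) gain to the sensed finger displacement.

    With cd_gain < 1 the virtual finger moves less than the real one, so the
    user must press further than expected and the surface tends to feel
    stiffer; with cd_gain > 1 the motion is amplified and it feels softer.
    """
    return physical_mm * cd_gain

# The same 10 mm physical press rendered under two different gains:
stiff_feel = displayed_displacement(10.0, 0.5)  # only 5 mm shown: stiffer feel
soft_feel = displayed_displacement(10.0, 1.5)   # 15 mm shown: softer feel
```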

\begin{subfigs}{pseudo_haptic}{Pseudo-haptic feedback in \AR. }[
\item A virtual soft texture projected on a table that deforms when pressed by the hand~\cite{punpongsanon2015softar}.
\item Visually modifying a tangible object and the hand touching it in \VST-\AR to alter its perceived shape~\cite{ban2014displaying}.
]
\subfigsheight{42mm}
\subfig{punpongsanon2015softar}
\subfig{ban2014displaying}
\end{subfigs}

The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both the haptic force $F$ and the visual displacement $D$:
\begin{equation}{stiffness_delay}
	\tilde{k}(t) = \frac{F(t + \Delta t_h)}{D(t + \Delta t_v)}
\end{equation}
Therefore, the perceived stiffness $\tilde{k}(t)$ increases with a haptic delay in force and decreases with a visual delay in displacement~\cite{diluca2011effects}.

In a similar \TIFC user study, participants compared the perceived stiffness of virtual pistons in \OST-\AR and \VR~\cite{gaffary2017ar}.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).
The reference piston was judged to be stiffer when seen in \VR than in \AR, without participants noticing this difference, and more force was exerted on the piston overall in \VR.
This suggests that the haptic stiffness of \VOs feels \enquote{softer} in an \AE than in a full \VE.
%Two differences that could be worth investigating with the two previous studies are the type of \AR (video or optical see-through) and seeing the hand touching the \VO.

\begin{subfigs}{gaffary2017ar}{Perception of haptic stiffness in \OST-\AR \vs \VR~\cite{gaffary2017ar}. }[
\item Experimental setup: a virtual piston was pressed with a force-feedback device placed to the side of the participant.
\item View of the virtual piston seen in front of the participant in \OST-\AR and
\item in \VR.
]
\end{subfigs}
Several approaches have been proposed to move the actuator away to another location.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), thus in the haptic feedback they provide (\secref{tactile_rendering}), and in the placement of the haptic rendering.

Other wearable haptic actuators have been proposed for \AR but are not detailed here.
A first reason is that they permanently cover the fingertip and affect the interaction with the \RE, such as thin-skin tactile interfaces~\cite{withana2018tacttoo,teng2024haptic} or fluid-based interfaces~\cite{han2018hydroring}.
Another category of actuators relies on systems that cannot be considered as portable, such as REVEL~\cite{bau2012revel}, which provides friction sensations with reverse electrovibration but requires modifying the real objects to augment, or Electrical Muscle Stimulation (EMS) devices~\cite{lopes2018adding}, which provide kinesthetic feedback by contracting the muscles.


\subsubsection{Nail-Mounted Devices}
\label{vhar_nails}

\textcite{ando2007fingernailmounted} were the first to propose to move away the actuator from the fingertip to the nail, as described in \secref{texture_rendering}.
This approach was later extended by \textcite{teng2021touch} with Touch\&Fold, a haptic device able to unfold its end-effector on demand to make contact with the fingertip when touching \VOs (\figref{teng2021touch}).
This moving platform also contains a \LRA (\secref{moving_platforms}) and provides contact pressure and texture sensations.
%The whole system is very compact (\qtyproduct{24 x 24 x 41}{\mm}), lightweight (\qty{9.5}{\g}), and fully portable by including a battery and Bluetooth wireless communication, with a \qty{20}{\ms} latency for the Bluetooth link.
When touching \VOs in \OST-\AR with the index finger, this device was found to be more realistic overall (5/7) than vibrations with a \LRA at \qty{170}{\Hz} on the nail (3/7).
Still, there is a high latency (\qty{92}{\ms}) for the folding mechanism, and this design is not suitable for augmenting real tangible objects.

Fingeret~\cite{maeda2022fingeret} places two rollers, one on each side of the fingertip, that can deform the skin: When rotating inwards, they pull the skin, and when rotating outwards, they push it.
By doing quick in-and-out rotations, they can also simulate a texture sensation.
%The device is also very compact (\qtyproduct{60 x 25 x 36}{\mm}), lightweight (\qty{18}{\g}), and portable with a battery and Bluetooth wireless communication with \qty{83}{\ms} latency.
In a user study not in \AR, but directly touching images on a tablet, Fingeret was found to be more realistic (4/7) than a \LRA at \qty{100}{\Hz} on the nail (3/7) for rendering buttons and a patterned texture (\secref{texture_rendering}), but not different from vibrations for rendering high-frequency textures (3.5/7 for both).
However, as for \textcite{teng2021touch}, finger speed was not taken into account for rendering vibrations, which may have been detrimental to texture perception, as described in \secref{texture_rendering}.

Finally, \textcite{preechayasomboon2021haplets} (\figref{preechayasomboon2021haplets}) and \textcite{sabnis2023haptic} designed Haplets and Haptic Servo, respectively: They are very compact and lightweight vibrotactile \LRA devices designed to feature both integrated sensing of the finger movements and very low-latency haptic feedback (\qty{<5}{\ms}).
But no proper user study was conducted to evaluate these devices in \AR.

\begin{subfigs}{ar_wearable}{Nail-mounted wearable haptic devices designed for \AR. }[
%\item A voice-coil rendering a virtual haptic texture on a real sheet of paper~\cite{ando2007fingernailmounted}.
\item Touch\&Fold provides contact pressure and vibrations on demand to the fingertip~\cite{teng2021touch}.
\item Fingeret is a finger-side wearable haptic device that pulls and pushes the fingertip skin~\cite{maeda2022fingeret}.
\item Haplets is a very compact nail device with integrated sensing and vibrotactile feedback~\cite{preechayasomboon2021haplets}.
]
\subfigsheight{33mm}
%\subfig{ando2007fingernailmounted}
\subfig{teng2021touch}
\subfig{maeda2022fingeret}
\subfig{preechayasomboon2021haplets}
\end{subfigs}


\subsubsection{Belt Devices}
\label{vhar_rings}

The haptic ring belt devices of \textcite{minamizawa2007gravity} and \textcite{pacchierotti2016hring}, presented in \secref{belt_actuators}, have been employed to improve the manipulation of \VOs in \AR (\secref{ar_interaction}).
However, the measured difference in performance could be attributed to either the haptic feedback or the visual rendering of the contacts.
These two studies were also conducted in non-immersive setups, where users looked at a screen displaying the visual interactions, and only compared haptic and visual rendering of the hand-object contacts, but did not examine them together.
\begin{subfigs}{ar_rings}{Wearable haptic ring devices for \AR. }[
\item Rendering the weight of a virtual cube placed on a real surface~\cite{scheggi2010shape}.
\item Rendering the contact force exerted by the fingers on a virtual cube~\cite{maisto2017evaluation,meli2018combining}.
]
\subfigsheight{57mm}
\subfig{scheggi2010shape}
\subfig{maisto2017evaluation}
\end{subfigs}


\subsubsection{Wrist Bracelet Devices}
\label{vhar_bracelets}

With their \enquote{Tactile And Squeeze Bracelet Interface} (Tasbi), already mentioned in \secref{belt_actuators}, \textcite{pezent2019tasbi} and \textcite{pezent2022design} explored the use of a wrist-worn bracelet actuator.
It is capable of providing a uniform pressure sensation (up to \qty{15}{\N} and \qty{10}{\Hz}) and vibrations with six \LRAs (\qtyrange{150}{200}{\Hz} bandwidth).
A user study was conducted in \VR to compare the perception of visuo-haptic stiffness rendering~\cite{pezent2019tasbi}.
In a \TIFC task (\secref{sensations_perception}), participants pressed a virtual button with different levels of stiffness via a virtual hand constrained by the \VE (\figref{pezent2019tasbi_2}).
A higher visual stiffness required a larger physical displacement to press the button (C/D ratio, see \secref{pseudo_haptic}), while the haptic stiffness controlled the rate of the pressure feedback when pressing.
When the visual and haptic stiffness were coherent, or when only the haptic stiffness changed, participants easily discriminated two buttons with different stiffness levels (\figref{pezent2019tasbi_3}).
However, if only the visual stiffness changed, participants were not able to discriminate the different stiffness levels (\figref{pezent2019tasbi_4}).
This suggests that in \VR, the haptic pressure is a more important perceptual cue than the visual displacement to render stiffness.
A short vibration (\qty{25}{\ms}, \qty{175}{\Hz} square wave) was also rendered when contacting the button, but kept constant across all conditions: It may have affected the overall perception when only the visual stiffness changed.

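The two stiffness cues of this design can be sketched as two independent rendering parameters (hypothetical names, a simplification of the study description above):

```python
def render_button_press(depth_mm: float, k_visual: float, k_haptic: float):
    """Render one frame of a virtual button press with a squeeze bracelet.

    A higher visual stiffness reduces the displayed travel for the same
    physical press (C/D ratio), while the haptic stiffness scales the
    pressure ramp felt at the wrist as the finger goes deeper.
    """
    displayed_depth_mm = depth_mm / k_visual  # stiffer visually: less displayed travel
    pressure_n = k_haptic * depth_mm          # stiffer haptically: steeper pressure ramp
    return displayed_depth_mm, pressure_n

# The same 10 mm physical press under a stiff vs a soft visual rendering:
stiff = render_button_press(10.0, k_visual=2.0, k_haptic=0.5)  # (5.0, 5.0)
soft = render_button_press(10.0, k_visual=1.0, k_haptic=0.5)   # (10.0, 5.0)
```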
\begin{subfigs}{pezent2019tasbi}{Visuo-haptic stiffness rendering of a virtual button in \VR with the Tasbi bracelet. }[
\item The \VE seen by the user: the virtual hand (in beige) is constrained by the virtual button. The displacement is proportional to the visual stiffness. The real hand (in green) is hidden by the \VE.
\item When the rendered visuo-haptic stiffnesses are coherent (in purple) or only the haptic stiffness changes (in blue), participants easily discriminated the different levels.
\item When varying only the visual stiffness (in red) but keeping the haptic stiffness constant, participants were not able to discriminate the different stiffness levels.
]
\subfigsheight{45mm}
\subfig{pezent2019tasbi_2}
\subfig{pezent2019tasbi_3}
\subfig{pezent2019tasbi_4}
\end{subfigs}
% \cite{sarac2022perceived,palmer2022haptic} not in AR but studies on relocating to the wrist the haptic feedback of the fingertip-object contacts.