WIP related work

2024-09-20 16:50:31 +02:00
parent 5eba5f75d9
commit 6daf361654
7 changed files with 83 additions and 34 deletions


@@ -194,9 +194,9 @@ This allows us to read Braille~\cite{lederman2009haptic}.
However, the speed of exploration affects the perceived intensity of micro-roughness~\cite{bensmaia2003vibrations}.
To establish the relationship between spacing and intensity for macro-roughness, patterned textured surfaces were manufactured: as a linear grating (on one axis) composed of ridges and grooves, \eg in \figref{lawrence2007haptic_1}~\cite{lederman1972fingertip,lawrence2007haptic}, or as a surface composed of micro conical elements on two axes, \eg in \figref{klatzky2003feeling_1}~\cite{klatzky2003feeling}.
As shown in \figref{lawrence2007haptic_2}, there is a quadratic relationship between the logarithm of the perceived roughness intensity $r$ and the logarithm of the space between the elements $s$ ($a$, $b$ and $c$ are empirical parameters to be estimated)~\cite{klatzky2003feeling}:
\begin{equation}{roughness_intensity}
\log(r) \sim a \, \log(s)^2 + b \, \log(s) + c
\end{equation}
A larger spacing between elements increases the perceived roughness, but the roughness plateaus beyond \qty{\sim 5}{\mm} for the linear grating~\cite{lawrence2007haptic} and decreases beyond \qty{\sim 2.5}{\mm} for the conical elements~\cite{klatzky2003feeling}.
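Assuming the fitted quadratic is concave ($a < 0$), as this increase-then-decrease behaviour suggests, the spacing that maximizes the perceived roughness follows directly from the vertex of \eqref{roughness_intensity}:
\begin{equation}{roughness_peak}
\log(s^*) = -\frac{b}{2 \, a}
\end{equation}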
@@ -230,7 +230,7 @@ For grid textures, as illustrated in \figref{delhaye2012textureinduced}, the rat
\lambda \sim \frac{v}{f_p}
\end{equation}
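For illustration, following this relation, a grating of spatial period $\lambda = \qty{1}{\mm}$ explored at $v = \qty{100}{\mm\per\s}$ produces a vibration peak around $f_p \approx \qty{100}{\Hz}$, well within the sensitivity range of the skin's vibration-sensitive mechanoreceptors.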
The vibrations generated by exploring natural textures are also very specific to each texture and similar between individuals, making them identifiable by vibration alone~\cite{manfredi2014natural,greenspon2020effect}.
This shows the importance of vibration cues even for macro textures and the possibility of generating virtual texture sensations with vibrotactile rendering.
\fig[0.55]{delhaye2012textureinduced}{Speed of finger exploration (horizontal axis) on grating textures with different periods $\lambda$ of spacing (in color) and frequency of the vibration intensity peak $f_p$ propagated in the wrist (vertical axis)~\cite{delhaye2012textureinduced}.}
@@ -249,7 +249,7 @@ By tapping on a surface, metal will be perceived as harder than wood.
If the surface returns to its original shape after being deformed, the object is elastic (like a spring), otherwise it is plastic (like clay).
When the finger presses on an object (\figref{exploratory_procedures}), its surface will move and deform with some resistance, and the contact area of the skin will also expand, changing the pressure distribution.
When the surface is touched or tapped, vibrations are also transmitted to the skin~\cite{higashi2019hardness}.
Passive touch (without voluntary hand movements) and tapping allow a perception of hardness as good as active touch~\cite{friedman2008magnitude}.
Two physical properties determine the haptic perception of hardness: its stiffness and elasticity, as shown in \figref{hardness}~\cite{bergmanntiest2010tactual}.
@@ -276,6 +276,10 @@ With finger pressure, a relative difference (the \emph{Weber fraction}) of \perc
However, in the absence of pressure sensations (by placing a thin disc between the finger and the object), the necessary relative difference becomes much larger (Weber fraction of \percent{\sim 50}).
Thus, the perception of hardness relies for \percent{90} on surface deformation cues and for \percent{10} on displacement cues.
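One way to relate these proportions to the reliability of each cue is the \MLE model presented in \secref{sensations_perception}: treating the Weber fractions as proportional to the standard deviations of the deformation and displacement estimates (for illustration, assuming \percent{\sim 15} with and \percent{\sim 50} without pressure cues), the weight of the deformation cue becomes
\begin{equation}{hardness_cues_weights}
w_{def} = \frac{1 / \sigma_{def}^2}{1 / \sigma_{def}^2 + 1 / \sigma_{disp}^2} \approx \frac{1 / 0.15^2}{1 / 0.15^2 + 1 / 0.50^2} \approx 0.9
\end{equation}
which is consistent with the \percent{90} contribution of the deformation cues.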
In addition, an object with low stiffness but high Young's modulus can be perceived as hard, and vice versa, as shown in \figref{bergmanntiest2009cues}.
Finally, when pressing with the finger, the perceived hardness intensity $h$ follows a power law with the stiffness $k$~\cite{harper1964subjective}:
\begin{equation}{hardness_intensity}
h \sim k^{0.8}
\end{equation}
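Following this power law, doubling the stiffness thus increases the perceived hardness by a factor of $2^{0.8} \approx \num{1.7}$~\cite{harper1964subjective}.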
%When pressing with the finger, the perceived (subjective) hardness intensity follows a power law with the stiffness, with an exponent of \num{0.8}~\cite{harper1964subjective}, \ie when the stiffness doubles, the perceived hardness increases by a factor of \num{1.7}.
%\textcite{bergmanntiest2009cues} thus observed a quadratic relationship of equal perceived hardness intensity, as illustrated in \figref{bergmanntiest2009cues}.


@@ -2,7 +2,7 @@
\label{wearable_haptics}
One of the roles of haptic systems is to render virtual interactions and sensations that are \emph{similar and comparable} to those experienced by the haptic sense with real objects, particularly in \v-\VE~\cite{maclean2008it,culbertson2018haptics}.
Moreover, a haptic augmentation system should enable \enquote{modulating the feel of a real object by virtual [haptic] feedback}~\cite{jeon2009haptic}, \ie a touch interaction with a real object whose perception is modified by the addition of virtual haptic feedback.
The haptic system should be hand-held or worn, \eg on the hand, and \enquote{not permanently attached to or integrated in the object}~\cite{bhatia2024augmenting}.
@@ -164,13 +164,16 @@ Several types of vibrotactile actuators are used in haptics, with different trad
\label{tactile_rendering}
Tactile rendering of haptic properties consists in modelling and reproducing virtual tactile sensations comparable to those perceived when interacting with real objects~\cite{klatzky2013haptic}.
By adding such tactile rendering as feedback to the touch actions of the hand on a real object~\cite{bhatia2024augmenting}, the perception of the object's haptic property is modified.
The integration of the real and virtual haptic sensations into a single property perception is discussed in more detail in \secref{sensations_perception}.
%, both the real and virtual haptic sensations are integrated into a single property perception, as presented in \secref{sensations_perception}, \ie the perceived haptic property is modulated by the added virtual feedback.
In particular, the visual rendering of a touched object can also greatly influence the perception of its haptic properties, \eg by modifying its visual texture in \AR or \VR, as discussed in \secref{visuo_haptic}.
\textcite{bhatia2024augmenting} categorize the haptic augmentations into three types: direct touch, touch-through, and tool mediated.
Also called direct feel-through~\cite{jeon2015haptic}, in \emph{direct touch}, the haptic device does not cover the inside of the hand, so as not to prevent the user from interacting with the \RE.
In touch-through and tool-mediated, or \emph{indirect feel-through}, the haptic device is interposed between the hand and the \RE or held in the hand as a tool, respectively.
%We are interested in direct touch augmentations with wearable haptics (\secref{wearable_haptic_devices}), as their integration with \AR is particularly promising for free hand interaction with visuo-haptic augmentations.
Many haptic augmentations were first developed with grounded haptic devices and later transposed to wearable haptic devices.
%We also focus on tactile augmentations stimulating the mechanoreceptors of the skin (\secref{haptic_sense}), thus excluding temperature perception, as they are the most common existing haptic interfaces.
% \cite{klatzky2003feeling} : rendering roughness, friction, deformation, temperatures
@@ -180,43 +183,75 @@ We are interested in direct touch augmentations with wearable haptic devices (\s
\label{texture_rendering}
Several approaches have been proposed to render virtual haptic textures~\cite{culbertson2018haptics}.
As more traditional force-feedback systems are unable to accurately render the micro-details of a simulated surface, vibrotactile devices attached to the end effector instead generate vibrations to simulate the interaction with the virtual texture~\cite{campion2005fundamental,culbertson2018haptics}.
In this way, physics-based models~\cite{chan2021hasti,okamura1998vibration,guruswamy2011iir} and data-based models~\cite{culbertson2015should,romano2010automatic} have been developed and evaluated, the former being simpler but only approximating real textures, and the latter being more realistic but limited to the captured textures.
\paragraph{Physics-based Models}
High-fidelity force-feedback devices can reproduce patterned textures with great precision and provide perceptions similar to real textures, but they are expensive, have a limited workspace, and require holding a probe to explore the texture~\cite{unger2011roughness}.
Notably, \textcite{okamura1998vibration} rendered grating textures with exponentially decaying sinusoids that simulated stroking over the grooves and ridges of the surface, while \textcite{culbertson2014modeling} captured and modelled the roughness of real surfaces to render them using the speed and force of the user.
\paragraph{Data-driven Models}
An effective approach to rendering virtual roughness is to generate vibrations to simulate interaction with the virtual texture~\cite{culbertson2018haptics}, relying on the user's real-time measurements of position, velocity and force to modulate the frequencies and amplitudes of the vibrations, with position and velocity being the most important parameters~\cite{culbertson2015should}.
For example, when comparing the same virtual texture pairwise, but with different parameters, \textcite{culbertson2015should} showed that the roughness vibrations generated should vary with user speed, but not necessarily with user force.
Virtual data-driven textures were perceived as similar to real textures, except for friction, which was not rendered properly.
The perceived roughness of real surfaces can then be modified when touched by a tool with a vibrotactile actuator attached~\cite{culbertson2014modeling,ujitoko2019modulating} or directly with the finger wearing the vibrotactile actuator~\cite{asano2015vibrotactile}, creating a haptic texture augmentation.
The objective is not just to render a virtual texture, but to alter the perception of a real, tangible surface, usually with wearable haptic devices, in what is known as haptic augmented reality (HAR)~\cite{bhatia2024augmenting,jeon2009haptic}.
Of course, the fingertip skin is not deformed by the virtual texture and only vibrations are felt, but it has been shown that the vibrations produced on the fingertip skin running over a real surface are texture specific and similar between individuals~\cite{delhaye2012textureinduced,manfredi2014natural}.
One additional challenge of augmenting the finger touch is to keep the fingertip free to touch the real environment, thus delocalizing the actuator elsewhere on the hand~\cite{ando2007fingernailmounted,friesen2024perceived,teng2021touch}.
A common method for the vibrotactile rendering of textures is to use a sinusoidal signal whose frequency is modulated by the finger position or velocity~\cite{asano2015vibrotactile,friesen2024perceived,strohmeier2017generating,ujitoko2019modulating}.
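A minimal formulation of this method, assuming a virtual grating of spatial period $\lambda$ explored with the finger at position $p(t)$ and velocity $v(t)$, drives the actuator with the signal
\begin{equation}{texture_sinusoid}
s(t) = A \, \sin\left(\frac{2 \pi}{\lambda} \, p(t)\right)
\end{equation}
where $A$ is the vibration amplitude: its instantaneous frequency $v(t) / \lambda$ is consistent with the relation between exploration speed and vibration frequency measured on real gratings~\cite{delhaye2012textureinduced}.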
\subsubsection{Hardness}
\label{hardness_rendering}
The two main approaches to modulating the perceived hardness of a real surface with wearable haptics are to render forces or vibrations.
\paragraph{Modulating Forces}
When tapping or pressing a real object with a tool, the perceived stiffness $\tilde{k}$ (\secref{hardness}) of its surface can be modulated with force feedback~\cite{jeon2015haptic}.
This was first proposed by \textcite{jeon2008modulating}, who augmented a real surface tapped along 1 \DoF with a grounded force-feedback device held in the hand (\figref{jeon2009haptic_1}).
When the haptic end-effector contacts the object at time $t$, the object's surface deforms by displacement $x_r(t)$ and opposes a real reaction force $f_r(t)$.
The virtual force $\tilde{f}_r(t)$ rendered by the device is then controlled to:
\begin{equation}{stiffness_augmentation}
\tilde{f}_r(t) = \tilde{k} \, x_r(t) - f_r(t)
\end{equation}
\end{equation}
A force sensor embedded in the device measures the reaction force $f_r(t)$.
The displacement $x_r(t)$ is estimated with the reaction force and tapping velocity using a pre-defined model of various materials, as described by \textcite{jeon2011extensions}.
As shown in \figref{jeon2009haptic_2}, the total force felt by the user thus becomes $f_r(t) + \tilde{f}_r(t) = \tilde{k} \, x_r(t)$, while the displacement $x_r(t)$ is left unchanged, so the perceived stiffness is $\tilde{k}$.
This stiffness augmentation technique was then extended to enable tapping and pressing with 3 \DoFs~\cite{jeon2010stiffness}, to render friction and weight augmentations~\cite{jeon2011extensions}, and to support grasping and squeezing the real object with two contact points~\cite{jeon2012extending}.
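As a numerical illustration of \eqref{stiffness_augmentation}, assume a real surface of stiffness $k = \qty{1}{\N\per\mm}$ pressed by $x_r = \qty{2}{\mm}$, so that $f_r = \qty{2}{\N}$: rendering a stiffer surface of $\tilde{k} = \qty{1.5}{\N\per\mm}$ requires the device to add $\tilde{f}_r = 1.5 \times 2 - 2 = \qty{1}{\N}$ to the real reaction force.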
\begin{subfigs}{stiffness_rendering}{Augmenting the perceived stiffness of a real surface.}[%
\item Diagram of a user tapping a real surface with a hand-held force-feedback device~\cite{jeon2009haptic}.
\item Displacement-force curves of a real rubber ball (dashed line) and when its perceived stiffness $\tilde{k}$ is modulated~\cite{jeon2009haptic}.
]
\subfigsheight{35mm}
\subfig[0.2]{jeon2009haptic_1}
\subfig[0.4]{jeon2009haptic_2}
\end{subfigs}
\cite{detinguy2018enhancing}
\cite{salazar2020altering}
\cite{kildal20103dpress}
\cite{tao2021altering} % wearable softness
\paragraph{Modulating Vibrations}
The second main approach is to modulate the vibrations felt when tapping a real surface with a tool~\cite{okamura1998vibration}.
Tapping with a tool on a real surface can be augmented with a vibrotactile actuator generating exponentially decaying sinusoids as contact transients:
\begin{equation}{contact_transient}
\tilde{f}_c(t) = a \, |v_{in}| \, e^{- \tau t} \sin(2 \pi f t)
\end{equation}
where the transient amplitude scales with the gain $a$ and the incoming tapping velocity $v_{in}$, $\tau$ is the decay rate, and $f$ the vibration frequency.
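For illustration, assuming $f = \qty{250}{\Hz}$ and $\tau = \qty{50}{\per\s}$, the transient decays to about \percent{37} of its initial amplitude ($1/e$) after \qty{20}{\ms}, \ie after about five oscillation periods, producing a short and sharp contact sensation.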
\cite{kuchenbecker2006improving}
\cite{hachisu2012augmentation}
\cite{park2019realistic}
\cite{park2023perceptual}
%\textcite{choi2021perceived} combined and compared these two rendering approaches (spring-damper and exponentially decaying sinusoids) but to render purely virtual surfaces.
%They found that the perceived intensity of the virtual hardness $\tilde{h}$ followed a power law, similarly to \eqref{hardness_intensity}, with the amplitude $a$, the frequency $f$ and the damping $b$ of the vibration, but not the decay time $\tau$.
With wearable haptics
\cite{park2017compensation}
%\subsubsection{Friction}
%\label{friction_rendering}


@@ -21,14 +21,14 @@
\label{sensations_perception}
A \emph{perception} is the merge of multiple sensations from different sensory modalities (visual, cutaneous, proprioceptive, etc.) about the same event or object property~\cite{ernst2004merging}.
For example, it is the haptic hardness perceived through skin pressure and force sensations (\secref{hardness}), the hand movement from proprioception and a visual hand avatar (\secref{ar_displays}), or the perceived size of a tangible with a co-localized \VO (\secref{ar_tangibles}).
When the sensations are redundant, \ie when only one of them could be enough to estimate the property, they are integrated to form a single coherent perception~\cite{ernst2004merging}.
No sensory information is completely reliable: it can provide different estimates of the same property when measured multiple times, \eg the weight of an object.
Therefore, each sensation $i$ is said to be an estimate $\tilde{s}_i$ with variance $\sigma_i^2$ of the property $s$.
The \MLE model then predicts that the integrated estimated property $\tilde{s}$ is the weighted sum of the individual sensory estimates:
\begin{equation}{MLE}
\tilde{s} = \sum_i w_i \tilde{s}_i \quad \text{with} \quad \sum_i w_i = 1
\end{equation}
where the individual weights $w_i$ are proportional to their inverse variances:
\begin{equation}{MLE_weights}
@@ -137,11 +137,11 @@ Adding a visual delay increased the perceived stiffness of the reference piston,
\end{subfigs}
%explained how these delays affected the integration of the visual and haptic perceptual cues of stiffness.
The stiffness $\tilde{k}(t)$ of the piston is indeed estimated at time $t$ by both sight and proprioception as the ratio of the exerted force $F(t)$ and the displacement $D(t)$ of the piston, following \eqref{stiffness}, but with potential haptic $\Delta t_h$ or visual $\Delta t_v$ delays:
\begin{equation}{stiffness_delay}
\tilde{k}(t) = \frac{F(t - \Delta t_h)}{D(t - \Delta t_v)}
\end{equation}
Therefore, while pressing the piston, the perceived stiffness $\tilde{k}$ decreases with a haptic delay on the force and increases with a visual delay on the displacement~\cite{diluca2011effects}.
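For illustration, assuming a press at constant velocity $v$ on a surface of real stiffness $k$ (so that $D(t) = v \, t$ and $F(t) = k \, v \, t$) and no haptic delay, a visual delay yields
\begin{equation}{stiffness_delay_example}
\tilde{k}(t) = \frac{k \, v \, t}{v \, (t - \Delta t_v)} = \frac{k}{1 - \Delta t_v / t} > k
\end{equation}
\ie the seen displacement lags behind the felt force and the piston appears stiffer than it really is; the reasoning is symmetric for a haptic delay on the force.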
In a similar \TIFC user study, participants compared the perceived stiffness of virtual pistons in \OST-\AR and \VR~\cite{gaffary2017ar}.
However, the force-feedback device and the participant's hand were not visible (\figref{gaffary2017ar}).


@@ -1450,6 +1450,16 @@
doi = {10/gm5m8d}
}
@article{higashi2019hardness,
title = {Hardness {{Perception Based}} on {{Dynamic Stiffness}} in {{Tapping}}},
author = {Higashi, Kosuke and Okamoto, Shogo and Yamada, Yoji and Nagano, Hikaru and Konyo, Masashi},
date = {2019},
journaltitle = {Front. Psychol.},
volume = {9},
pages = {2654},
doi = {10/gs4tmg}
}
@inproceedings{hilliges2012holodesk,
title = {{{HoloDesk}}: Direct 3d Interactions with a Situated See-through Display},
shorttitle = {{{HoloDesk}}},