Improve data textures in related work

2024-12-26 19:40:58 +01:00
parent 7b19effcce
commit 897d86c141
2 changed files with 45 additions and 9 deletions


@@ -170,7 +170,9 @@ However, they require high voltages to operate, limiting their use in wearable d
\label{tactile_rendering}
Rendering a haptic property consists in modeling and reproducing virtual sensations comparable to those perceived when interacting with real objects \cite{klatzky2013haptic}.
As we have just seen, the haptic sense is rich and complex (\secref{haptic_hand}).
Thus, a wide variety of wearable haptic actuators have been developed (\secref{wearable_haptic_devices}).
However, each actuator is only able to provide a subset of the haptic sensations felt by the hand.
In this section, we review wearable haptic rendering methods that modify the perceived roughness and hardness of real objects.
\subsubsection{Haptic Augmentations}
@@ -228,19 +230,23 @@ More complex models have also been developed to be physically accurate and repro
Because simulations of realistic virtual textures can be complex to design and to render in real-time, direct capture and models of real textures have been developed \cite{culbertson2018haptics}.
\textcite{okamura1998vibration} were the first to measure the vibrations produced by the interaction of a stylus dragged over sandpaper and patterned surfaces.
They found that contacts with the patterns while sliding over the texture generate vibrations that can be modelled as exponentially decaying sinusoids (\eqref{contact_transient}), whose parameters depend on the normal force and the scanning velocity of the stylus on the surface.
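For intuition, a common form of such a decaying-sinusoid contact model can be written as follows; the symbols here are illustrative and do not necessarily match the notation of \eqref{contact_transient}:
\[
    a(t) = A(v, f_n)\, e^{-B t} \sin(2 \pi f_0 t),
\]
where the transient amplitude $A$ grows with the scanning velocity $v$ and the normal force $f_n$, $B$ sets the decay rate, and $f_0$ is the natural frequency of the contact.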
This technique was used by \textcite{ando2007fingernailmounted} to augment a smooth sheet of paper with a virtual patterned texture: with a \LRA mounted on the nail, they rendered the virtual finger contacts with \qty{20}{\ms} vibration impulses at \qty{130}{\Hz} (\figref{ando2007fingernailmounted}).
Participants were able to match the virtual textures to the real ones (\qty{0.25}{\mm} height and \qtyrange{1}{10}{\mm} widths) but systematically overestimated the virtual width to be \qty{4}{\mm} longer.
This model was refined to capture everyday, unpatterned textures as well \cite{guruswamy2011iir}.
More complex models were then created to more systematically capture everyday textures from many stylus scan measures \cite{romano2012creating,culbertson2014modeling}.
This led to the release of the \HaTT database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, but captured from a direct touch context with the fingertip, has also recently been released \cite{balasubramanian2024sens3}.
A common limitation of these data-driven models is that they can only render \emph{isotropic} textures: their record does not depend on the position of the measure, and the rendering is the same regardless of the direction of the movement.
This was eventually addressed by including the user's velocity direction in the capture, modelling, and rendering of the textures \cite{abdulali2016datadriven,abdulali2018datadriven}.
%A third approach is to model
%Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}, or to simulate the vibrations from the (visual) texture maps used to visually render a \ThreeD object \cite{chan2021hasti}.
Using the user's velocity magnitude and normal force as inputs, these data-driven models interpolate from the scan measures to generate a virtual texture in real time as vibrations with high realism.
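Concretely, this interpolation step can be sketched as follows. This is a minimal illustration under stated assumptions: the grid values, function names, and the single-amplitude output are invented for clarity, whereas the published models interpolate auto-regressive filter coefficients rather than one amplitude value.

```python
# Hedged sketch: interpolating a texture-vibration amplitude from a small
# grid of (scan speed, normal force) measurements, in the spirit of
# data-driven texture models. All grid values below are illustrative.
import math

# Illustrative RMS vibration amplitudes recorded at grid points:
# rows index scan speed (mm/s), columns index normal force (N).
speeds = [50.0, 100.0, 200.0]   # mm/s
forces = [0.5, 1.0]             # N
amplitude = [
    [0.10, 0.14],
    [0.18, 0.25],
    [0.30, 0.42],
]

def interp_amplitude(v, f):
    """Bilinear interpolation of the recorded amplitude at (speed v, force f)."""
    # Clamp queries to the measured range.
    v = min(max(v, speeds[0]), speeds[-1])
    f = min(max(f, forces[0]), forces[-1])
    # Find the grid cell containing (v, f).
    i = max(j for j in range(len(speeds) - 1) if speeds[j] <= v)
    k = max(j for j in range(len(forces) - 1) if forces[j] <= f)
    tv = (v - speeds[i]) / (speeds[i + 1] - speeds[i])
    tf = (f - forces[k]) / (forces[k + 1] - forces[k])
    # Interpolate along force, then along speed.
    a0 = amplitude[i][k] * (1 - tf) + amplitude[i][k + 1] * tf
    a1 = amplitude[i + 1][k] * (1 - tf) + amplitude[i + 1][k + 1] * tf
    return a0 * (1 - tv) + a1 * tv

def vibration_sample(t, v, f, carrier_hz=130.0):
    """One output sample: the interpolated amplitude modulating a sinusoid."""
    return interp_amplitude(v, f) * math.sin(2 * math.pi * carrier_hz * t)
```

At each audio-rate tick, the renderer would read the current speed and force from the tracking hardware, call `vibration_sample`, and stream the result to the voice-coil actuator.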
When comparing real textures felt through a stylus with their virtual models rendered with a voice-coil actuator attached to the stylus (\figref{culbertson2012refined}), the virtual textures were found to accurately reproduce the perception of roughness, but hardness and friction were not rendered properly \cite{culbertson2014modeling}.
\textcite{culbertson2015should} further showed that the perceived realism of the virtual textures, and their similarity to the real textures, depended mostly on the user's velocity magnitude but not on the user's force as inputs to the model, \ie responding to velocity magnitude is sufficient to render isotropic virtual textures.
\begin{subfigs}{textures_rendering_data}{Augmenting haptic texture perception with voice-coil actuators.}[][
\item Increasing and decreasing the perceived roughness of a real patterned texture in direct touch \cite{asano2015vibrotactile}.


@@ -1,3 +1,33 @@
@incollection{abdulali2016datadriven,
title = {Data-{{Driven Modeling}} of {{Anisotropic Haptic Textures}}: {{Data Segmentation}} and {{Interpolation}}},
booktitle = {Haptics: {{Perception}}, {{Devices}}, {{Control}}, and {{Applications}}},
author = {Abdulali, Arsen and Jeon, Seokhee},
date = {2016},
volume = {9775},
pages = {228--239},
isbn = {978-3-319-42323-4 978-3-319-42324-1}
}
@inproceedings{abdulali2017sample,
title = {Sample Selection of Multi-Trial Data for Data-Driven Haptic Texture Modeling},
booktitle = {{{IEEE World Haptics Conf}}.},
author = {Abdulali, Arsen and Hassan, Waseem and Jeon, Seokhee},
date = {2017},
pages = {66--71},
doi = {10/g8t7zg},
isbn = {978-1-5090-1425-5}
}
@incollection{abdulali2018datadriven,
title = {Data-{{Driven Rendering}} of {{Anisotropic Haptic Textures}}},
booktitle = {Haptic {{Interaction}}},
author = {Abdulali, Arsen and Jeon, Seokhee},
date = {2018},
volume = {432},
pages = {401--407},
isbn = {978-981-10-4156-3 978-981-10-4157-0}
}
@inproceedings{achibet2017flexifingers,
  title = {{{FlexiFingers}}: {{Multi-finger}} Interaction in {{VR}} Combining Passive Haptics and Pseudo-Haptics},
  shorttitle = {{{FlexiFingers}}},