\label{tactile_rendering}
Rendering a haptic property consists in modeling and reproducing virtual sensations comparable to those perceived when interacting with real objects \cite{klatzky2013haptic}.
As we have just seen, the haptic sense is rich and complex (\secref{haptic_hand}).
Thus, a wide variety of wearable haptic actuators have been developed (\secref{wearable_haptic_devices}).
However, each actuator is only able to provide a subset of the haptic sensations felt by the hand.
In this section, we review rendering methods that use wearable haptics to modify the perceived roughness and hardness of real objects.
\subsubsection{Haptic Augmentations}
Because simulations of realistic virtual textures can be complex to design and to render in real time, approaches based on the direct capture and modeling of real textures have been developed \cite{culbertson2018haptics}.
\textcite{okamura1998vibration} were the first to measure the vibrations produced by the interaction of a stylus dragged over sandpaper and patterned surfaces.
They found that the contact with the patterns when sliding on the texture generates vibrations that can be modeled as exponentially decaying sinusoids (\eqref{contact_transient}) that depend on the normal force and the scanning velocity of the stylus on the surface.
This technique was used by \textcite{ando2007fingernailmounted} to augment a smooth sheet of paper with a virtual patterned texture: With a \LRA mounted on the nail, they rendered the virtual finger contacts with \qty{20}{\ms} vibration impulses at \qty{130}{\Hz} (\figref{ando2007fingernailmounted}).
Participants were able to match the virtual textures to the real ones (\qty{0.25}{\mm} height and \qtyrange{1}{10}{\mm} widths) but systematically overestimated the virtual width to be \qty{4}{\mm} longer.
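The decaying-sinusoid rendering described above can be sketched in code. The following is a minimal, hypothetical rendering loop assuming a regular grating; the ridge pitch, amplitude gain, and decay rate here are invented for illustration, not values from the cited studies, which fit them from measurements.

```python
import numpy as np

def contact_transient(t, amplitude, decay_rate, freq_hz):
    """Exponentially decaying sinusoid: a(t) = A * exp(-B*t) * sin(2*pi*f*t)."""
    return amplitude * np.exp(-decay_rate * t) * np.sin(2 * np.pi * freq_hz * t)

def render_grating(positions_mm, speed_mm_s, normal_force_n,
                   pitch_mm=2.0, fs=2000, pulse_ms=20, freq_hz=130):
    """Emit a short vibration transient each time the finger crosses a ridge.

    Amplitude grows with scanning speed and normal force (assumed linear
    scaling here; the published models fit these gains from recordings).
    """
    n = len(positions_mm)
    signal = np.zeros(n)
    pulse_len = int(fs * pulse_ms / 1000)
    t = np.arange(pulse_len) / fs
    amp = 0.1 * speed_mm_s * normal_force_n      # hypothetical gain
    pulse = contact_transient(t, amp, decay_rate=80.0, freq_hz=freq_hz)
    last_ridge = None
    for i, x in enumerate(positions_mm):
        ridge = int(x // pitch_mm)               # which ridge cell we are in
        if last_ridge is not None and ridge != last_ridge:
            end = min(i + pulse_len, n)          # superpose a transient here
            signal[i:end] += pulse[:end - i]
        last_ridge = ridge
    return signal
```

Each ridge crossing superposes one transient, so faster scans naturally produce denser, stronger vibrations, matching the velocity dependence reported above.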

This model was refined to capture everyday, unpatterned textures as well \cite{guruswamy2011iir}.
More complex models were then created to more systematically capture everyday textures from many stylus scan measures \cite{romano2012creating,culbertson2014modeling}.
This led to the release of the \HaTT database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, captured in a direct-touch context with the fingertip, has also recently been released \cite{balasubramanian2024sens3}.
A common limitation of these data-driven models is that they can only render \emph{isotropic} textures: the recordings do not depend on the position of the measurement, and the rendering is the same regardless of the direction of movement.
This limitation was eventually addressed by including the direction of the user's velocity in the capture, modeling, and rendering of the textures \cite{abdulali2016datadriven,abdulali2018datadriven}.
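A minimal sketch of such a direction-dependent extension, with invented per-direction gains blended by the instantaneous velocity direction (the cited works interpolate full vibration models per direction, not scalar gains):

```python
import math

# Hypothetical anisotropic texture: one gain per scan-direction bin.
n_bins = 8
bin_gain = [1.0, 1.3, 1.8, 1.3, 1.0, 1.3, 1.8, 1.3]  # assumed values

def directional_gain(vx, vy):
    """Blend the two nearest direction bins according to velocity direction."""
    angle = math.atan2(vy, vx) % (2 * math.pi)
    pos = angle / (2 * math.pi) * n_bins
    i = int(pos) % n_bins
    frac = pos - int(pos)
    return (1 - frac) * bin_gain[i] + frac * bin_gain[(i + 1) % n_bins]
```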
%A third approach is to model
%Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}, or to simulate the vibrations from the (visual) texture maps used to visually render a \ThreeD object \cite{chan2021hasti}.
Using the user's velocity magnitude and normal force as inputs, these data-driven models interpolate from the scan measures to generate highly realistic virtual textures in real time as vibrations.
When comparing real textures felt through a stylus with their virtual models rendered with a voice-coil actuator attached to the stylus (\figref{culbertson2012refined}), the virtual textures were found to accurately reproduce the perception of roughness, but hardness and friction were not rendered properly \cite{culbertson2014modeling}.
\textcite{culbertson2015should} further showed that the perceived realism of the virtual textures, and their similarity to the real textures, depended mostly on the user's velocity magnitude but not on the user's force as inputs to the model, \ie responding to velocity magnitude is sufficient to render isotropic virtual textures.

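As a rough illustration of the interpolation step, the sketch below uses an invented capture grid of scalar vibration powers indexed by scan speed and normal force; the published models instead store full autoregressive vibration models per recorded condition and interpolate those.

```python
import numpy as np

# Hypothetical capture grid: vibration power at a few recorded
# (scan speed, normal force) conditions. Values are invented.
speeds = np.array([50.0, 100.0, 200.0])   # mm/s
forces = np.array([0.5, 1.0, 2.0])        # N
power = np.array([[0.2, 0.3, 0.5],
                  [0.4, 0.6, 0.9],
                  [0.8, 1.1, 1.6]])       # rows: speed, cols: force

def interpolate_power(speed, force):
    """Bilinearly interpolate vibration power between the four
    recorded conditions surrounding the user's current (speed, force)."""
    i = np.clip(np.searchsorted(speeds, speed) - 1, 0, len(speeds) - 2)
    j = np.clip(np.searchsorted(forces, force) - 1, 0, len(forces) - 2)
    ts = (speed - speeds[i]) / (speeds[i + 1] - speeds[i])
    tf = (force - forces[j]) / (forces[j + 1] - forces[j])
    top = (1 - tf) * power[i, j] + tf * power[i, j + 1]
    bot = (1 - tf) * power[i + 1, j] + tf * power[i + 1, j + 1]
    return (1 - ts) * top + ts * bot
```

At each rendering step the interpolated model drives the voice-coil signal, so the output varies continuously as the user changes speed and force between the recorded conditions.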
\begin{subfigs}{textures_rendering_data}{Augmenting haptic texture perception with voice-coil actuators. }[][
\item Increasing and decreasing the perceived roughness of a real patterned texture in direct touch \cite{asano2015vibrotactile}.