WIP vhar_textures
@@ -228,7 +228,7 @@ Participants matched the virtual textures to the real ones, with \qty{0.25}{\mm}
Other models have since been developed to capture everyday textures (such as sandpaper) \cite{guruswamy2011iir} from many force and velocity measurements \cite{romano2012creating,culbertson2014modeling}.
Such data-driven models interpolate from measurements of the user's velocity and force to generate a virtual texture in real time (\secref{vibrotactile_actuators}).
-This led to the release of the Penn Haptic Texture Toolkit (HaTT) database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
+This led to the release of the \HaTT database, a public set of stylus recordings and models of 100 haptic textures \cite{culbertson2014one}.
A similar database, captured instead in a direct-touch context with the fingertip, has recently been released \cite{balasubramanian2024sens3}.
A limitation of these data-driven models is that they can only render \emph{isotropic} textures: their recordings do not depend on the position of the measurement, and the rendering is the same regardless of the direction of movement.
Alternative models have been proposed to render both isotropic and patterned textures \cite{chan2021hasti}.
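To make the interpolation step described above concrete, the following is a minimal sketch of how such a data-driven renderer can be structured, assuming HaTT-style autoregressive (AR) vibration models recorded at known (force, speed) conditions. The class name, data layout, and nearest-model fallback are illustrative assumptions, not the toolkit's actual API.

# Minimal sketch of data-driven texture rendering (assumed HaTT-style AR
# models; names and data layout are illustrative, not the toolkit's API).
import numpy as np
from scipy.spatial import Delaunay
from scipy.signal import lfilter

class TextureRenderer:
    def __init__(self, points, ar_coeffs, noise_gains):
        # points:      (N, 2) recorded (force, speed) conditions
        # ar_coeffs:   (N, P) AR filter coefficients fitted per condition
        # noise_gains: (N,)   excitation-noise standard deviation per condition
        self.tri = Delaunay(np.asarray(points))
        self.coeffs = np.asarray(ar_coeffs)
        self.gains = np.asarray(noise_gains)

    def interpolate(self, force, speed):
        # Blend the models at the vertices of the triangle containing the
        # query point, using barycentric weights.
        q = np.array([force, speed])
        s = self.tri.find_simplex(q)
        if s == -1:
            # Outside the recorded range: fall back to the nearest model.
            i = int(np.argmin(np.linalg.norm(self.tri.points - q, axis=1)))
            return self.coeffs[i], self.gains[i]
        T = self.tri.transform[s]
        b = T[:2] @ (q - T[2])
        w = np.append(b, 1.0 - b.sum())  # barycentric weights, sum to 1
        verts = self.tri.simplices[s]
        return w @ self.coeffs[verts], w @ self.gains[verts]

    def synthesize(self, force, speed, n_samples, rng=None):
        # Drive the interpolated all-pole (AR) filter with white noise to
        # produce the vibration waveform sent to the actuator.
        rng = rng or np.random.default_rng()
        a, gain = self.interpolate(force, speed)
        noise = gain * rng.standard_normal(n_samples)
        return lfilter([1.0], np.concatenate(([1.0], a)), noise)

At each haptic update, synthesize() would be called with the latest measured normal force and scanning speed to produce the next buffer of vibration samples for the actuator.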