Complete related work

2024-09-23 11:56:19 +02:00
parent 0495afd60c
commit d832de9f0c
6 changed files with 34 additions and 36 deletions

View File

@@ -204,7 +204,7 @@ The simplest texture simulation model is a 1D sinusoidal grating $v(t)$ with spa
\begin{equation}{grating_rendering}
v(t) = A \sin\!\left(\frac{2 \pi \, \dot{x}(t) \, t}{\lambda}\right)
\end{equation}
That is, this model generates a periodic signal whose frequency is proportional to the user's velocity, implementing the speed-frequency ratio observed with real patterned textures (\eqref{grating_vibrations}).
That is, this model generates a periodic signal whose frequency is modulated in proportion to the user's velocity, implementing the speed-frequency ratio observed with real patterned textures (\eqref{grating_vibrations}).
It gives the user the illusion of a texture with a \emph{fixed spatial period} that approximates real manufactured grating textures (\secref{roughness}).
The user's position could have been used instead of the velocity, but that would require measuring the position and generating the signal at rates (\qty{10}{\kHz}) too high for most sensors and haptic actuators \cite{campion2005fundamental}.
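To make the behavior of this velocity-driven model concrete, a minimal sketch (an illustration of the principle, not the exact formulation of a specific cited implementation) accumulates the phase from the measured velocity, so that the instantaneous frequency always equals the speed-wavelength ratio:
\begin{equation*}
% illustrative sketch: phase accumulation from the measured velocity (not from the cited works)
v(t) = A \sin\bigl(\varphi(t)\bigr), \qquad
\varphi(t) = \frac{2\pi}{\lambda} \int_0^t \lvert\dot{x}(\tau)\rvert \,\mathrm{d}\tau, \qquad
f(t) = \frac{\dot{\varphi}(t)}{2\pi} = \frac{\lvert\dot{x}(t)\rvert}{\lambda}
\end{equation*}
In a discrete-time implementation running at period $\Delta t$, the update $\varphi_{k+1} = \varphi_k + 2\pi\lvert\dot{x}_k\rvert\Delta t / \lambda$ keeps the phase continuous when the velocity changes, whereas evaluating \eqref{grating_rendering} directly with a time-varying $\dot{x}(t)$ can introduce phase jumps.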
@@ -212,6 +212,7 @@ With a voice-coil actuator attached to the middle phalanx of the finger, \textci
Participants moved their finger over real grating textures (\qtyrange{0.15}{0.29}{\mm} groove and ridge width) with a virtual sine grating (\qty{1}{\mm} spatial period) superimposed, rendered according to \eqref{grating_rendering}.
The perceived roughness increased in proportion to the virtual texture amplitude, but decreased again at high amplitudes.
\textcite{ujitoko2019modulating} instead used a square wave signal and a hand-held stylus with an embedded voice-coil.
\textcite{friesen2024perceived} compared the frequency modulation of \eqref{grating_rendering} with amplitude modulation (\figref{friesen2024perceived}), and found that the frequency modulation was perceived as more similar to real sinusoidal gratings for lower spatial periods (\qty{0.5}{\mm}) but both modulations were effective for higher spatial periods (\qty{1.5}{\mm}).
%\textcite{friesen2024perceived} proposed
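To illustrate this comparison, the two modulation schemes can be written side by side; this is a sketch in which the amplitude-modulated variant assumes a fixed carrier frequency $f_c$ and an envelope following the grating profile at the finger position, which may differ from the exact formulation of \textcite{friesen2024perceived}:
\begin{equation*}
% hedged sketch: the AM carrier frequency f_c and envelope shape are assumptions
v_{\mathrm{FM}}(t) = A \sin\!\left(\frac{2\pi\,\dot{x}(t)\,t}{\lambda}\right),
\qquad
v_{\mathrm{AM}}(t) = A \left\lvert \sin\!\left(\frac{2\pi\,x(t)}{\lambda}\right) \right\rvert \sin(2\pi f_c t)
\end{equation*}
In the first case the vibration frequency tracks the finger velocity; in the second, a fixed-frequency vibration is spatially gated by the finger position, so the spatial period is conveyed by the envelope rather than by the carrier.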
@@ -239,13 +240,13 @@ When comparing real textures felt through a stylus with their virtual models ren
\begin{subfigs}{textures_rendering_data}{Augmenting haptic texture perception with voice-coil actuators. }[
\item Increasing and decreasing the perceived roughness of a real patterned texture in direct touch \cite{asano2015vibrotactile}.
%\item Comparing real patterned texture with virtual texture augmentation in direct touch \cite{friesen2024perceived}.
\item Comparing real patterned texture with virtual texture augmentation in direct touch \cite{friesen2024perceived}.
\item Rendering virtual contacts in direct touch with the virtual texture \cite{ando2007fingernailmounted}.
\item Rendering an isotropic virtual texture over a real surface while sliding a hand-held stylus across it \cite{culbertson2012refined}.
]
\subfigsheight{38mm}
\subfigsheight{36mm}
\subfig{asano2015vibrotactile_2}
%\subfig{friesen2024perceived}
\subfig{friesen2024perceived}
\subfig{ando2007fingernailmounted}
\subfig{culbertson2012refined}
\end{subfigs}
@@ -358,7 +359,8 @@ Haptic systems aim to provide virtual interactions and sensations similar to tho
The complexity of the haptic sense has led to the design of numerous haptic devices and renderings.
While many haptic devices can be worn on the hand, only a few are compact and portable enough to be considered wearable, and those are limited to cutaneous feedback.
If the haptic rendering is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object can be modified.
Several rendering methods have been developed to modify the perceived roughness and hardness, but not all of them have yet been transposed to wearable haptics.
Several rendering methods have been developed to modify the perceived roughness and hardness, mostly using vibrotactile feedback and, to a lesser extent, pressure feedback.
However, not all of these haptic augmentations have yet been transposed to wearable haptics.
%, unlike most previous actuators that are designed specifically for fingertips and would require mechanical adaptation to be placed on other parts of the hand.
% thanks to the vibration propagation and the sensory capabilities distributed throughout the skin, they can be placed without adaption and on any part of the hand

View File

@@ -23,21 +23,14 @@ Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) p
\subsubsection{A Definition of AR}
\label{ar_definition}
The system of \cite{sutherland1968headmounted} already fulfilled the first formal definition of \AR, proposed by \textcite{azuma1997survey} in the first survey of the domain:
\begin{enumerate}[label=(\arabic*)]
\item combine real and virtual,
\item be interactive in real time, and
\item register real and virtual\footnotemark.
\end{enumerate}
%\footnotetext{There is quite some confusion in the literature and in (because of) the industry about the terms \AR and \MR. The term \MR is very often used as a synonym of \AR, or a version of \AR that enables an interaction with the virtual content. The title of this section refers to the title of the highly cited paper by \textcite{speicher2019what} that examines this debate.}
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}, the original definition was: \enquote{registered in \ThreeD}.}
The first formal definition of \AR was proposed by \textcite{azuma1997survey}: (1) combine real and virtual, (2) be interactive in real time, and (3) register real and virtual\footnotemark.
Each of these characteristics is essential: the real-virtual combination distinguishes \AR from \VR, a movie with integrated digital content is not interactive, and a \TwoD overlay like an image filter is not registered.
There are also two key aspects to this definition: it does not focus on a technology or method but on the user's experience of the system, and it does not specify a particular human sense, \ie the augmentation can be auditory \cite{yang2022audio}, haptic \cite{bhatia2024augmenting}, or even olfactory \cite{brooks2021stereosmell} or gustatory \cite{brooks2023taste}.
Yet, most research has focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as \v-\AR.
\footnotetext{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}, the original definition was: \enquote{registered in \ThreeD}.}
%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.
@@ -70,16 +63,16 @@ Yet, the user experience in \AR is still highly dependent on the display used.
\label{ar_displays}
To experience virtual content combined and registered with the \RE, an output \UI that displays the \VE to the user is necessary.
There is a large variety of \AR displays with different methods of combining the real and virtual content (\VST, \OST, or projected), and different locations on the \RE or the user \cite{billinghurst2015survey}.
There is a large variety of \AR displays with different methods of combining the real and virtual content, and different locations on the \RE or the user \cite{billinghurst2015survey}.
In \VST-\AR, the virtual images are superimposed on images of the \RE captured by a camera \cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.
In \emph{\VST-\AR}, the virtual images are superimposed on images of the \RE captured by a camera \cite{marchand2016pose}, and the combined real-virtual image is displayed on a screen to the user, as illustrated in \figref{itoh2022indistinguishable_vst}, \eg \figref{hartl2013mobile}.
This augmented view through the camera has the advantage of complete control over the real-virtual combination, such as mutual occlusion between real and virtual objects \cite{macedo2023occlusion}, coherent lighting, and no delay between the real and virtual images \cite{kruijff2010perceptual}.
But, due to the camera and the screen, the user's view is degraded, with a lower resolution, frame rate, and field of view, and an overall visual latency compared to proprioception \cite{kruijff2010perceptual}.
An \OST-\AR display directly combines the virtual images with the real-world view using a transparent optical system \cite{itoh2022indistinguishable}, like augmented glasses, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{lee2013spacetop}.
An \emph{\OST-\AR} display directly combines the virtual images with the real-world view using a transparent optical system \cite{itoh2022indistinguishable}, like augmented glasses, as illustrated in \figref{itoh2022indistinguishable_ost}, \eg \figref{lee2013spacetop}.
These displays feature a direct, preserved view of the \RE at the cost of a more difficult registration (spatial misalignment or temporal latency between the real and virtual content) \cite{grubert2018survey} and of mutual real-virtual occlusion \cite{macedo2023occlusion}.
Finally, projection-based \AR overlay the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
Finally, \emph{projection-based \AR} overlays the virtual images on the real world using a projector, as illustrated in \figref{roo2017one_2}, \eg \figref{roo2017inner}.
It does not require the user to wear the display, but it requires a real surface to project the virtual content on, and it is vulnerable to shadows cast by the user or the real objects \cite{billinghurst2015survey}.
\begin{subfigs}{ar_displays}{Simplified operating diagram of \AR display methods. }[
@@ -94,9 +87,9 @@ It doesn't require the user to wear the display, but requires a real surface to
\end{subfigs}
Regardless of the \AR display, it can be placed at different locations \cite{bimber2005spatial}, as shown in \figref{roo2017one_1}.
Spatial \AR usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also use \OST or \VST fixed windows (\figref{lee2013spacetop}).
Alternatively, \AR displays can be hand-held, like a \VST smartphone (\figref{hartl2013mobile}), or body-attached, like a micro-projector used as a flashlight \cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR headsets or glasses, providing a highly immersive and portable experience.
\emph{Spatial \AR} usually consists of projection-based displays placed at a fixed location (\figref{roo2017inner}), but it can also use \OST or \VST \emph{fixed windows} (\figref{lee2013spacetop}).
Alternatively, \AR displays can be \emph{hand-held}, like a \VST smartphone (\figref{hartl2013mobile}), or \emph{body-attached}, like a micro-projector used as a flashlight \cite{billinghurst2015survey}.
Finally, \AR displays can be head-worn like \VR \emph{headsets} or glasses, providing a highly immersive and portable experience.
%Smartphones, shipped with sensors, computing resources, and algorithms, are the most common \AR displays today, but research and development promise more immersive and interactive \AR with headset displays \cite{billinghurst2021grand}.
\fig[0.75]{roo2017one_1}{Locations of \AR displays from eye-worn to spatially projected. Adapted by \textcite{roo2017one} from \textcite{bimber2005spatial}.}

View File

@@ -2,8 +2,9 @@
\label{visuo_haptic}
Everyday perception and manipulation of objects with the hand typically involves both the visual and haptic senses.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable of perceiving many material properties, such as roughness, hardness, or friction \cite{baumgartner2013visual}.
Rendering a \VO with both visual and haptic feedback that feels coherent is a challenge, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
Each sense has unique capabilities for perceiving certain object properties, such as color for vision or temperature for touch, but they are equally capable of perceiving many properties, such as roughness, hardness, or geometry \cite{baumgartner2013visual}.
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a multimodal visuo-haptic rendering of a \VO is perceived.%, especially in immersive \AR where the haptic actuator is moved away so as not to cover the inside of the hand.
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
@@ -21,10 +22,6 @@ Rendering a \VO with both visual and haptic feedback that feels coherent is a ch
\subsection{Visuo-Haptic Perception of Virtual and Augmented Objects}
\label{sensations_perception}
Both \AR and wearable haptic systems integrate virtual content into the user's perception as sensory illusions.
It is essential to understand how a multimodal visuo-haptic rendering of a \VO is perceived.
\subsubsection{Merging the Sensations into a Perception}
\label{merging_sensations}
@@ -162,7 +159,7 @@ We describe in the next section how wearable haptics have been integrated with i
A few wearable haptic devices have been specifically designed or experimentally tested for direct hand interaction in immersive \AR.
Since virtual or augmented objects are naturally touched, grasped, and manipulated directly with the fingertips (\secref{exploratory_procedures} and \secref{grasp_types}), the main challenge of wearable haptics for \AR is to provide haptic sensations of these interactions while keeping the fingertips free to interact with the \RE.
Several approaches have been proposed to move the actuator to a different location on the hand.
Several approaches have been proposed to move the haptic actuator to a different location on the hand.
Yet, they differ greatly in the actuators used (\secref{wearable_haptic_devices}), and thus in the haptic feedback (\secref{tactile_rendering}) and in the placement of the haptic rendering.
Other wearable haptic actuators have been proposed for \AR, but are not discussed here.

View File

@@ -8,12 +8,18 @@ Perceptual constancy is possible in the absence of one cue by compensating with
Haptic systems aim to provide virtual interactions and sensations similar to those with real objects.
Only a few are compact and portable enough to be considered wearable, and they are limited to cutaneous feedback.
If their haptic rendering is synchronized in time with the user's touch actions on a real object, the perceived haptic properties of the object, such as its roughness and hardness, can be modified.
Wearable haptic augmentation is mostly achieved with vibrotactile feedback.
\AR headsets integrate virtual content into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
However, they lack direct hand interaction and manipulation of \VOs, which could be improved by visual rendering of the hand.
\AR headsets integrate virtual content immersively into the user's perception as if it were part of the \RE, with real-time tracking of the head and hands.
However, direct hand interaction and manipulation of \VOs is difficult due to the lack of haptic feedback and of mutual occlusion rendering between the hand and the \VO, which could be improved by a visual rendering of the hand.
Tangibles are also used as proxies for manipulating \VOs but, being haptically passive, they can be inconsistent with the visual rendering.
Wearable haptics on the hand is a promising solution for improving direct hand manipulation of \VOs and for coherent visuo-haptic augmentation of tangibles.
% the type of rendered object (real or virtual), the rendered haptic property (contact, hardness, texture, see \secref{tactile_rendering}), and .
%In this context of integrating \WHs with \AR to create a \vh-\AE (\chapref{introduction}), the definition of \textcite{pacchierotti2017wearable} can be extended to an additional criterion: The wearable haptic interface should not impair the interaction with the \RE, \ie the user should be able to touch and manipulate objects in the real world while wearing the haptic device.
% The haptic feedback is thus rendered de-localized from the point of contact of the finger on the rendered object.
Providing coherent multimodal visuo-haptic feedback to enhance direct hand perception and manipulation of \VOs in immersive \AR is challenging.
While many wearable haptic devices have been developed and are capable of providing varied tactile feedback, few have been integrated or experimentally evaluated for direct hand interaction in \AR.
Their haptic end-effector must be moved away from the inside of the hand so as not to interfere with the user's interaction with the \RE.
Different relocation strategies have been proposed for different parts of the hand, such as the nail, the index phalanges, or the wrist, but it remains unclear which of them is best suited for direct hand interaction in \AR.
In all cases, the real and virtual visual sensations are considered co-localized, but the virtual haptic feedback is not.
Such a discrepancy may affect the user's perception and experience and should be further investigated.
When different sensory feedback, haptic and visual, real and virtual, is integrated into the perception of a single object property, the resulting percept is robust to variations in cue reliability and to spatial and temporal differences.
However, the same haptic rendering or augmentation can be influenced by the user's visual expectation or the visual rendering of the \VO.
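This robustness is commonly described by the standard maximum-likelihood cue-combination model, recalled here as background (the notation $\hat{S}_V$, $\hat{S}_H$ for the visual and haptic estimates and $\sigma_V^2$, $\sigma_H^2$ for their variances is ours, not taken from the works above): the combined estimate weights each sense by its relative reliability,
\begin{equation*}
% background model, not a result from the cited works
\hat{S} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad
w_H = 1 - w_V
\end{equation*}
so a less reliable cue is down-weighted rather than breaking the unified percept, which is consistent with the influence of visual expectation noted above.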

Binary file not shown.

Before: 36 KiB | After: 22 KiB