WIP related work

2024-08-13 09:58:23 +02:00
parent 6ae15087f1
commit d28555ffcb
8 changed files with 93 additions and 35 deletions

View File

@@ -0,0 +1,24 @@
\section{Wearable Haptics}
\label{wearable_haptics}
\subsection{The Sense of Touch}
\label{touch_sense}
\subsection{Wearable Haptic Devices}
\label{wearable_haptics_devices}
\subsection{Renderings with Wearable Haptics}
\label{wearable_haptics_rendering}
\subsubsection{Contact Renderings}
\label{haptic_contacts}
\subsubsection{Texture Renderings}
\label{haptic_textures}
\subsection{Conclusion}
\label{wearable_haptics_conclusion}

View File

@@ -0,0 +1,44 @@
\section{Augmented Reality}
\label{augmented_reality}
% Intro with Sutherland
% A few examples of usages now
\subsection{Principles, Capabilities and Limitations}
\label{ar_intro}
Based on the interaction loop presented in \figref[introduction]{interaction-loop}, we briefly detail the main components of any AR application: tracking, rendering, and display.
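As a rough, hypothetical sketch (the function names are illustrative and not tied to any particular SDK), these three components can be viewed as a single per-frame loop:
\begin{verbatim}
# Minimal sketch of the per-frame AR loop: tracking, rendering, display.
def ar_loop(capture_frame, estimate_pose, render_virtual, show):
    while True:
        frame = capture_frame()         # sense the real environment
        pose = estimate_pose(frame)     # tracking: register the virtual and real worlds
        overlay = render_virtual(pose)  # rendering: draw virtual content from the user's viewpoint
        show(frame, overlay)            # display: present the combined view to the user
\end{verbatim}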
\subsection{What is Augmented Reality?}
% Definition of \cite{azuma1997survey}
% Recall Milgram and differences with VR
\subsection{How does AR work?}
% How it works briefly
\subsubsection{Calibrating \& Tracking}
% \cite{marchand2016pose}
\subsubsection{Modeling \& Rendering}
\subsubsection{Display}
% Bimber and types of AR
% State of current HMD
\subsection{How Virtual Content is Perceived in AR}
\label{ar_perception}
\subsection{Interacting with Virtual and Augmented Content}
\label{ar_interaction}
\subsubsection{Virtual Hand Rendering in AR}
\label{ar_interaction_hands}
\subsection{Conclusion}
\label{ar_conclusion}

View File

@@ -0,0 +1,10 @@
\section{Visuo-Haptic Augmented Reality}
\label{vhar}
\subsection{Altering the User's Perception}
\label{vhar_perception}
\subsection{Improving the User's Interaction}
\label{vhar_interaction}

View File

@@ -0,0 +1,2 @@
\section{Conclusion}
\label{conclusion}

View File

@@ -50,6 +50,7 @@ However, the experiment was carried out on a screen, in a non-immersive AR scena
%
To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
\subsection{Wearable Haptic Feedback in AR}
\label{2_haptics}

View File

@@ -1,5 +1,5 @@
\section{Related Work}
\label{related_work}
% Answer the following four questions: “Who else has done work with relevance to this work of yours? What did they do? What did they find? And how is your work here different?”
@@ -9,7 +9,7 @@ Yet visual and haptic sensations are often combined in everyday life, and it is
\subsection{Augmenting Haptic Texture Roughness}
\label{vibrotactile_roughness}
When running a finger over a surface, the deformations and vibrations of the skin caused by the micro-height differences of the material induce the sensation of roughness~\autocite{klatzky2003feeling}.
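As an illustration (the notation is introduced here and not taken from the cited works), when a periodic grating of spatial period $\lambda$ is scanned at speed $v$, the vibrations elicited in the skin, and likewise a position-driven vibrotactile signal reproducing the grating, have a fundamental temporal frequency of approximately
\[
    f \approx \frac{v}{\lambda},
\]
so finer textures or faster exploration movements shift the vibrations towards higher frequencies.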
%
@@ -48,7 +48,7 @@ It remains unclear whether such vibrotactile texture augmentation is perceived t
%In our study, we attached a voice-coil actuator to the middle phalanx of the finger and used a squared sinusoidal signal to render grating textures sensations, but we corrected its phase to allow a simple camera-based tracking and free exploration movements of the finger.
\subsection{Influence of Visual Rendering on Haptic Perception}
\label{influence_visual_haptic}
When the same object property is sensed simultaneously by vision and touch, the two modalities are integrated into a single perception.
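A common way to formalize this integration is the maximum-likelihood estimation model: assuming the visual and haptic estimates $\hat{S}_V$ and $\hat{S}_H$ of the property are corrupted by independent Gaussian noise of variances $\sigma_V^2$ and $\sigma_H^2$ (notation introduced here for illustration), the combined estimate weights each modality by its relative reliability,
\[
    \hat{S} = w_V\,\hat{S}_V + w_H\,\hat{S}_H,
    \qquad
    w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2} .
\]
Under this model, the more reliable modality dominates the combined percept, which is why a convincing visual rendering can bias the perceived haptic properties of an object.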
%

View File

@@ -1,40 +1,17 @@
\chapter{Related Work}
\mainlabel{related_work}
\chaptertoc
This chapter presents an overview of previous work on perception and interaction in the context of wearable haptics and \AR.
%
It first gives an overview of the sense of touch and of how wearable haptic devices and renderings have been used to provide haptic feedback that enhances the perception of and interaction with virtual and augmented objects, with a focus on vibrotactile feedback and haptic textures.
%
It then introduces the principles and user perception of \AR, and describes the 3D interaction techniques used in \AR and \VR environments to interact with virtual and augmented objects, in particular through the visual rendering of the user's hand.
%
Finally, it presents how multimodal visual and haptic feedback have been combined in \AR to modify the user's perception, in particular when touching a tangible object, and to improve the user's interaction with the augmented environment, notably when manipulating \VOs.
\input{1-wearable-haptics}
\input{2-augmented-reality}
\input{3-visuo-haptic}
\input{4-conclusion}