WIP related work
@@ -118,7 +118,7 @@ The most mature devices are \HMDs, which are portable headsets worn directly on

\AR/\VR can also be extended to render for sensory modalities other than vision.
%
\textcite{jeon2009haptic} proposed extending the \RV continuum to include haptic feedback by decoupling it into two orthogonal haptic and visual axes (see \figref{visuo-haptic-rv-continuum3}).
%
The combination of the two axes defines 9 types of \vh environments, with 3 possible levels of \RV on each \v or \h axis: real, augmented, and virtual.
%
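As a compact rephrasing of this taxonomy (our notation, not that of \textcite{jeon2009haptic}), each environment is a pair drawn from the Cartesian product of the two axes:
\[
\mathcal{E}_{vh} = L_{v} \times L_{h},
\qquad
L_{v} = L_{h} = \{\text{real}, \text{augmented}, \text{virtual}\},
\qquad
|\mathcal{E}_{vh}| = 3 \times 3 = 9 .
\]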
@@ -1,9 +1,19 @@

\section{Wearable Haptics}
\label{wearable_haptics}

To understand how wearable haptics have been used to provide the user's hand with tactile feedback, we first briefly describe how the hand senses and acts on its environment.

\subsection{The Haptic Sense}
\label{haptic_sense}

\subsubsection{Cutaneous Sensibility}
\label{cutaneous_sensibility}

\subsubsection{Kinesthetic Sensibility}
\label{kinesthetic_sensibility}

\subsubsection{Hand-Object Interactions}
\label{hand_object_interactions}


\subsection{Wearable Haptic Devices}

@@ -1,30 +1,100 @@

\section{Augmented Reality}
\label{augmented_reality}

% Intro with Sutherland
% A few examples of usages now

\subsection{Principles and Capabilities of AR}
\label{ar_intro}

Based on the interaction loop presented in \figref[introduction]{interaction-loop}, we briefly detail the main components that compose any \AR application: tracking, rendering and display.

\subsubsection{What is Augmented Reality?}

The first \AR \HMD was invented by \textcite{sutherland1968headmounted}: with the technology available at the time, it was already capable of displaying virtual objects at a fixed point in space in real time, giving the user the illusion that the content was present in the room (see \figref{sutherland1970computer3}).
%
Fixed to the ceiling, the headset displayed a stereoscopic (one image per eye) perspective projection of the virtual content on a transparent screen, taking into account the user's position, and thus already following the interaction loop presented in \figref[introduction]{interaction-loop}.
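In modern notation, and as a schematic formulation rather than Sutherland's own, this amounts to rendering each eye $e \in \{\text{left}, \text{right}\}$ with a view transform derived from the tracked head pose $\mathbf{T}_{\text{head}}(t)$ and a fixed head-to-eye offset $\mathbf{V}_{e}$, followed by a perspective projection $\mathbf{P}_{e}$:
\[
\mathbf{p}^{\,e}_{\text{screen}} \;\sim\; \mathbf{P}_{e} \, \mathbf{V}_{e} \, \mathbf{T}_{\text{head}}^{-1}(t) \, \mathbf{p}_{\text{world}},
\]
where $\sim$ denotes equality up to the perspective division of homogeneous coordinates.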
% Definition of \cite{azuma1997survey}
% Recall Milgram and differences with VR
\paragraph{A Definition of AR}
%
This system also already fulfilled the first formal definition of \AR, proposed by \textcite{azuma1997survey} in the first survey of the domain: (1) \emph{combine real and virtual}, (2) \emph{be interactive in real time} and (3) \emph{register real and virtual}\footnote{This third characteristic has been slightly adapted to use the version of \textcite{marchand2016pose}; the original definition was: \enquote{registered in \ThreeD}.}.
%
Each of these characteristics is essential: the real-virtual combination distinguishes \AR from \VR, a movie with integrated digital content is not interactive, and a \TwoD overlay like an image filter is not registered.
%
There are also two key aspects to this definition: it does not focus on a technology or method but on the user's experience of the system, and it does not specify a particular human sense, \ie the augmentation can be auditory~\cite{yang2022audio}, haptic~\cite{bhatia2024augmenting}, or even olfactory~\cite{brooks2021stereosmell} or gustatory~\cite{brooks2023taste}.
%
Yet, most of the research has focused on visual augmentations, and the term \AR (without a prefix) is almost always understood as visual \AR (\v-\AR).

\paragraph{On Presence}
%
Despite this clear and acknowledged definition, and the viewpoint of this thesis that \AR and \VR are two types of \MR experiences with different levels of mixing of real and virtual environments, as presented in \secref[introduction]{visuo_haptic_augmentations}, there is still debate on how to define \AR and \MR, as well as on how to characterize and categorize such experiences~\cite{speicher2019what,skarbez2021revisiting}.

\emph{Presence} is one of the key concepts used to characterize a \VR experience.
%
\AR and \VR are both essentially illusions, as the virtual content does not physically exist but is digitally simulated and rendered to the user's perception through a user interface addressing the user's senses.
%
This experience of suspension of disbelief in \VR is what is called presence, and it can be decomposed into two dimensions: \PI and \PSI~\cite{slater2009place}.
%
\PI is the user's sense of \enquote{being there} in the \VE; it emerges from the real-time rendering of the \VE from the user's perspective, with the displayed content conforming to and remaining consistent with the user's proprioception and actions.
%
\PSI is the illusion that the virtual events are really happening, even if the user knows that they are not real.
%
It does not mean that the virtual events are realistic, but that they are plausible and coherent with the user's expectations.
%
A third strong illusion in \VR is the \SoE, which is the illusion that the virtual body is one's own~\cite{slater2022separate,guy2023sense}.

Presence in \AR is far less defined and studied than in \VR~\cite{tran2024survey}, but it will be useful for designing, evaluating and discussing our contributions in the next chapters.
%
Accordingly, \textcite{slater2022separate} proposed to invert \PI for \AR as bringing the virtual into the physical world, \ie \enquote{place it here}.
%
As in \VR, \VOs must be visible from different angles as the user moves their head, but they must also, and this is more difficult, be consistent with the \RE, \eg occlude or be occluded by real objects~\cite{macedo2023occlusion} (illustrated below), cast shadows, or reflect lights.
%
The \PSI can be applied to \AR as is, but the \VOs must additionally have knowledge of the \RE and react to it accordingly.
%
\textcite{skarbez2021revisiting} also named \PI for \AR \enquote{immersion} and \PSI \enquote{coherence}; these terms will be used in the remainder of this thesis.
%
Finally, like presence, \SoE in \AR is a recent topic, and little is known about its effect on the user experience~\cite{genay2021virtual}.
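As a minimal illustration of the occlusion requirement mentioned above, assuming a per-pixel depth map of the \RE is available (\eg from a depth sensor) in a video see-through setting, a compositor can keep a virtual fragment only where it is closer to the camera than the sensed real surface. The following sketch is purely illustrative and does not correspond to a specific system:
\begin{verbatim}
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth test: show a virtual pixel only where it is closer
    than the sensed real surface. Depths are in meters; pixels without
    virtual content carry an infinite virtual depth."""
    occludes = virtual_depth < real_depth     # virtual fragment wins the depth test
    occludes = occludes[..., np.newaxis]      # broadcast the mask over RGB channels
    return np.where(occludes, virtual_rgb, real_rgb)
\end{verbatim}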
%For example, \textcite{milgram1994taxonomy} proposed a taxonomy of \MR experiences based on the degree of mixing real and virtual environments, and \textcite{skarbez2021revisiting} revisited this taxonomy to include the user's perception of the experience.

% debate on the definition of AR
% big brother VR and issue with presence/plausibility in AR (Slater)
% taxonomy of Milgram/Skarbez

\paragraph{Applications}
%
Advances in technology, research and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration and entertainment applications~\cite{dey2018systematic}.
%
For example, \AR can help surgeons visualize \ThreeD images of the brain overlaid on the patient's head prior to or during surgery (see \figref{watanabe2016transvisible}), or help students learn complex concepts and phenomena such as optics or chemistry (see \figref{bousquet2024reconfigurable}).
%
It can also guide workers in complex tasks, such as assembly, maintenance or verification (see \figref{hartl2013mobile}), or create completely new forms of gaming or tourism experiences (see \figref{roo2017inner}).
%
Most \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.

\begin{subfigs}{augmented-reality}{Examples of \AR applications. }[
\item The first \AR \HMD displaying wireframe \ThreeD virtual objects registered in the real environment~\cite{sutherland1968headmounted}.
\item Neurosurgery visualization of the brain on a patient's head~\cite{watanabe2016transvisible}.
\item HOBIT is a spatial, tangible \AR table simulating an optical bench for educational experimentations~\cite{bousquet2024reconfigurable}.
\item \AR can interactively guide document verification tasks by recognizing the document and comparing it with virtual references~\cite{hartl2013mobile}.
\item Inner Garden is a visually augmented zen garden for relaxation and meditation~\cite{roo2017inner}.
]
\subfigsheight{36mm}
\subfig{sutherland1970computer3}
\subfig{watanabe2016transvisible}
\subfig{bousquet2024reconfigurable}
\subfig{hartl2013mobile}
\subfig{roo2017inner}
\end{subfigs}

\subsubsection{How does AR work?}

% How it works briefly

\paragraph{Calibrating \& Tracking}

% \cite{marchand2016pose}

\paragraph{Modeling \& Rendering}

\paragraph{Displays}

% Bimber and types of AR
% State of current HMD
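To make the interplay between these components concrete, the following minimal sketch outlines the per-frame loop of a typical video see-through \AR application. It is a hypothetical illustration with placeholder names of our own (\eg \texttt{estimate\_camera\_pose}), not the API of a particular framework:
\begin{verbatim}
# Hypothetical per-frame AR loop; all names below are illustrative placeholders.
def ar_frame(camera, scene, display):
    image = camera.capture()                        # sense the real environment
    pose = estimate_camera_pose(image, scene.map)   # tracking: register real and virtual
    layer = render(scene.virtual_objects,           # rendering: draw the virtual content
                   view=pose, projection=camera.intrinsics)
    display.show(composite(image, layer))           # display: combine real and virtual
\end{verbatim}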
@@ -36,7 +106,7 @@ Based on the interaction loop presented in \figref[introduction]{interaction-loo

\subsection{Interacting with Virtual and Augmented Content}
\label{ar_interaction}

\subsubsection{Virtual Hands in AR}
\label{ar_interaction_hands}

@@ -1,6 +1,7 @@

\section{Visuo-Haptic Augmented Reality}
\label{vhar}

% Combining haptics and AR

\subsection{Altering the Perceptions}
\label{vhar_perception}

@@ -8,3 +9,7 @@

\subsection{Improving the Interactions}
\label{vhar_interaction}


\subsection{Conclusion}
\label{vhar_conclusion}

@@ -1,2 +1,19 @@

\section{Conclusion}
\label{conclusion}

Interestingly, the two previous sections, presenting wearable haptics in \secref{wearable_haptics} and \AR in \secref{augmented_reality}, follow rather opposite paths: the former starts from the haptic sense and the hand to then describe wearable haptic devices and the interactions they enable, while the latter starts from a technological description of \AR to then detail its perception and usage.
%
This is how each of the two fields is commonly introduced in the literature, for example in the works of \textcite{choi2013vibrotactile,culbertson2018haptics} for haptics and of \textcite{bimber2005spatial,kim2018revisiting} for \AR.

It is also interesting to note that these two fields are at different stages of maturity.
%
Indeed, contributing to both of them raises, among other things, significant technical challenges, as detailed in \secref[introduction]{research_challenges}.
%
There is also a need for standardization in wearable haptics~\cite{culbertson2018haptics}, notably in terms of devices and renderings, whereas the industry is rather well established in \AR, for example with the Microsoft HoloLens~2~\footnoteurl{https://www.microsoft.com/hololens} and Apple Vision~Pro~\footnoteurl{https://www.apple.com/apple-vision-pro/} headsets, or the Google ARCore~\footnoteurl{https://developers.google.com/ar} and Apple ARKit~\footnoteurl{https://developer.apple.com/augmented-reality/} frameworks.
%
This can be explained partly by the maturity of the \VR industry, which drives that of \AR, with an announced trend towards the convergence of these two technologies~\cite{speicher2019what}, and partly by the greater complexity and particularities of the haptic sense~\cite{culbertson2018haptics}.
%
Conversely, defining and characterizing \AR/\MR, and to a much lesser extent \VR, surprisingly remains an open topic~\cite{speicher2019what}.

This thesis must therefore also take into account the two elephants in the room: grounded haptics and \VR.
%
BIN 1-introduction/related-work/figures/hartl2013mobile.jpg (new file, 50 KiB)
BIN 1-introduction/related-work/figures/roo2017inner.jpg (new file, 94 KiB)
BIN 1-introduction/related-work/figures/sutherland1970computer1.jpg (new file, 33 KiB)
BIN 1-introduction/related-work/figures/sutherland1970computer2.jpg (new file, 32 KiB)
BIN 1-introduction/related-work/figures/sutherland1970computer3.jpg (new file, 21 KiB)
BIN 1-introduction/related-work/figures/watanabe2016transvisible.jpg (new file, 44 KiB)
@@ -11,7 +11,7 @@ To achieve this, this chapter first gives an overview of the haptic sense, and h
%
Secondly, it introduces the principles and user perception of \AR, and describes the interaction techniques used in \AR and \VR environments to interact with virtual and augmented objects, in particular using the visual rendering of the user's hand.
%
Finally, it presents how multimodal visual and haptic feedback have been combined in \AR to modify the user's perception, notably when touching a tangible, and to improve the user's interaction with the augmented environment, especially when manipulating \VOs.

\input{1-wearable-haptics}
\input{2-augmented-reality}

@@ -44,8 +44,11 @@
\acronym{HMD}{head-mounted display}
\acronym{MR}{mixed reality}
\acronym{OST}{optical see-through}
\acronym{PI}{place illusion}
\acronym[PSI]{Psi}{plausibility illusion}
\acronym{RE}{real environment}
\acronym{RV}{reality-virtuality}
\acronym{SoE}{sense of embodiment}
\acronym{v}{visual}
\acronym{VCA}{voice-coil actuator}
\acronym{VE}{virtual environment}

references.bib
@@ -113,6 +113,17 @@
  doi = {10/gfz8mv}
}

@article{azuma1997survey,
  title = {A Survey of Augmented Reality},
  author = {Azuma, Ronald T.},
  date = {1997},
  journaltitle = {Presence Teleoperators Virtual Environ.},
  volume = {6},
  number = {4},
  pages = {355--385},
  doi = {10/gd7gb4}
}

@inproceedings{balasubramanian2024sens3,
  title = {{{SENS3}}: {{Multisensory Database}} of {{Finger-Surface Interactions}} and {{Corresponding Sensations}}},
  shorttitle = {{{SENS3}}},
@@ -283,6 +294,23 @@
  pages = {169--178}
}

@article{billinghurst2021grand,
  title = {Grand {{Challenges}} for {{Augmented Reality}}},
  author = {Billinghurst, Mark},
  date = {2021},
  journaltitle = {Front. Virtual Real.},
  volume = {2},
  doi = {10/gjrwsw}
}

@book{bimber2005spatial,
  title = {Spatial Augmented Reality: Merging Real and Virtual Worlds},
  shorttitle = {Spatial Augmented Reality},
  author = {Bimber, Oliver and Raskar, Ramesh},
  date = {2005},
  pagetotal = {369}
}

@inproceedings{blaga2017usability,
  title = {Usability {{Analysis}} of an {{Off-the-Shelf Hand Posture Estimation Sensor}} for {{Freehand Physical Interaction}} in {{Egocentric Mixed Reality}}},
  booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
@@ -313,6 +341,17 @@
  doi = {10/bwdcc7}
}

@article{bousquet2024reconfigurable,
  title = {Reconfigurable and Versatile Augmented Reality Optical Setup for Tangible Experimentations},
  author = {Bousquet, Bruno and Hachet, Martin and Casamayou, Vincent and Normand, Erwan and Guillet, Jean-Paul and Canioni, Lionel},
  date = {2024},
  journaltitle = {Discov. Educ.},
  volume = {3},
  number = {1},
  pages = {113},
  doi = {10/gt63g4}
}

@inproceedings{brahimaj2023crossmodal,
  title = {Cross-Modal Interaction of Stereoscopy, Surface Deformation and Tactile Feedback on the Perception of Texture Roughness in an Active Touch Condition},
  booktitle = {Int. {{Francoph}}. {{Conf}}. {{Hum}}.-{{Comput}}. {{Interact}}.},
@@ -321,6 +360,22 @@
  doi = {10/gs6rbg}
}

@inproceedings{brooks2021stereosmell,
  title = {Stereo-{{Smell}} via {{Electrical Trigeminal Stimulation}}},
  booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
  author = {Brooks, Jas and Teng, Shan-Yuan and Wen, Jingxuan and Nith, Romain and Nishida, Jun and Lopes, Pedro},
  date = {2021},
  doi = {10/gksk2c}
}

@inproceedings{brooks2023taste,
  title = {Taste {{Retargeting}} via {{Chemical Taste Modulators}}},
  booktitle = {{{ACM Symp}}. {{User Interface Softw}}. {{Technol}}.},
  author = {Brooks, Jas and Amin, Noor and Lopes, Pedro},
  date = {2023},
  doi = {10/gt625w}
}

@inproceedings{buchmann2004fingartips,
  title = {{{FingARtips}}: Gesture Based Direct Manipulation in {{Augmented Reality}}},
  booktitle = {Int. {{Conf}}. {{Comput}}. {{Graph}}. {{Interact}}. {{Tech}}. {{Australas}}. {{South East Asia}}},
@@ -554,6 +609,16 @@
  doi = {10/gnnfv9}
}

@article{dey2018systematic,
  title = {A {{Systematic Review}} of 10 {{Years}} of {{Augmented Reality Usability Studies}}: 2005 to 2014},
  shorttitle = {A {{Systematic Review}} of 10 {{Years}} of {{Augmented Reality Usability Studies}}},
  author = {Dey, Arindam and Billinghurst, Mark and Lindeman, Robert W. and Swan, J. Edward},
  date = {2018},
  journaltitle = {Front. Robot. AI},
  volume = {5},
  doi = {10/gfz8jg}
}

@inproceedings{diaz2017designing,
  title = {Designing for {{Depth Perceptions}} in {{Augmented Reality}}},
  booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
@@ -685,6 +750,16 @@
  doi = {10/ghms7m}
}

@inproceedings{furio2017hobit,
  title = {Hobit: {{Hybrid}} Optical Bench for Innovative Teaching},
  shorttitle = {Hobit},
  booktitle = {Proc. 2017 {{Chi Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
  author = {Furió, David and Fleck, Stéphanie and Bousquet, Bruno and Guillet, Jean-Paul and Canioni, Lionel and Hachet, Martin},
  date = {2017},
  pages = {949--959},
  doi = {10/ghq79w}
}

@article{gaffary2017ar,
  title = {{{AR Feels}} “{{Softer}}” than {{VR}}: {{Haptic Perception}} of {{Stiffness}} in {{Augmented}} versus {{Virtual Reality}}},
  shorttitle = {{{AR Feels}} “{{Softer}}” than {{VR}}},
@@ -770,6 +845,15 @@
  doi = {10/b7rbf7}
}

@article{guy2023sense,
  title = {The Sense of Embodiment in {{Virtual Reality}} and Its Assessment Methods},
  author = {Guy, Martin and Normand, Jean-Marie and Jeunet-Kelway, Camille and Moreau, Guillaume},
  date = {2023},
  journaltitle = {Front. Virtual Real.},
  volume = {4},
  doi = {10/gt63zb}
}

@inproceedings{ha2014wearhand,
  title = {{{WeARHand}}: {{Head-worn}}, {{RGB-D}} Camera-Based, Bare-Hand User Interface with Visually Enhanced Depth Perception},
  shorttitle = {{{WeARHand}}},
@@ -811,6 +895,15 @@
  doi = {10/b3mj9n}
}

@inproceedings{hartl2013mobile,
  title = {Mobile Interactive Hologram Verification},
  booktitle = {2013 {{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
  author = {Hartl, Andreas and Grubert, Jens and Schmalstieg, Dieter and Reitmayr, Gerhard},
  date = {2013},
  pages = {75--82},
  doi = {10/gt63j8}
}

@inproceedings{heo2019pseudobend,
  title = {{{PseudoBend}}: {{Producing Haptic Illusions}} of {{Stretching}}, {{Bending}}, and {{Twisting Using Grain Vibrations}}},
  shorttitle = {{{PseudoBend}}},
@@ -1690,6 +1783,16 @@
  doi = {10/d38b53}
}

@inproceedings{roo2017inner,
  title = {Inner {{Garden}}: {{Connecting Inner States}} to a {{Mixed Reality Sandbox}} for {{Mindfulness}}},
  shorttitle = {Inner {{Garden}}},
  booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
  author = {Roo, Joan Sol and Gervais, Renaud and Frey, Jeremy and Hachet, Martin},
  date = {2017},
  pages = {1459--1470},
  doi = {10/ggrd6q}
}

@inproceedings{sabnis2023haptic,
  title = {Haptic {{Servos}}: {{Self-Contained Vibrotactile Rendering System}} for {{Creating}} or {{Augmenting Material Experiences}}},
  shorttitle = {Haptic {{Servos}}},
@@ -1786,6 +1889,36 @@
  doi = {10/gjrd9w}
}

@article{slater2009place,
  title = {Place Illusion and Plausibility Can Lead to Realistic Behaviour in Immersive Virtual Environments},
  author = {Slater, Mel},
  date = {2009},
  journaltitle = {Philos. Trans. R. Soc. B Biol. Sci.},
  volume = {364},
  number = {1535},
  pages = {3549--3557},
  doi = {10/df44xc}
}

@article{slater2022separate,
  title = {A {{Separate Reality}}: {{An Update}} on {{Place Illusion}} and {{Plausibility}} in {{Virtual Reality}}},
  shorttitle = {A {{Separate Reality}}},
  author = {Slater, Mel and Banakou, Domna and Beacco, Alejandro and Gallego, Jaime and Macia-Varela, Francisco and Oliva, Ramon},
  date = {2022},
  journaltitle = {Front. Virtual Real.},
  volume = {3},
  pages = {914392},
  doi = {10/gthn7h}
}

@inproceedings{speicher2019what,
  title = {What Is {{Mixed Reality}}?},
  booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
  author = {Speicher, Maximilian and Hall, Brian D. and Nebeling, Michael},
  date = {2019},
  doi = {10/ggp859}
}

@inproceedings{strohmeier2017generating,
  title = {Generating {{Haptic Textures}} with a {{Vibrotactile Actuator}}},
  booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
@@ -1803,6 +1936,30 @@
  doi = {10/gjbvmp}
}

@inproceedings{sutherland1965ultimate,
  title = {The {{Ultimate Display}}},
  booktitle = {{{IFIP Congr}}.},
  author = {Sutherland, Ivan E.},
  date = {1965},
  pages = {506--508}
}

@inproceedings{sutherland1968headmounted,
  title = {A Head-Mounted Three Dimensional Display},
  booktitle = {Fall {{Jt}}. {{Comput}}. {{Conf}}.},
  author = {Sutherland, Ivan E.},
  date = {1968},
  pages = {757--764}
}

@article{sutherland1970computer,
  title = {Computer {{Displays}}},
  author = {Sutherland, Ivan E.},
  date = {1970},
  journaltitle = {Sci. Am.},
  doi = {10/bnk34d}
}

@inproceedings{suzuki2014grasping,
  title = {Grasping a Virtual Object with a Bare Hand},
  booktitle = {{{ACM SIGGRAPH Posters}}},
@@ -1841,6 +1998,14 @@
  doi = {10/gm5m8f}
}

@inproceedings{tran2024survey,
  title = {A {{Survey On Measuring Presence}} in {{Mixed Reality}}},
  booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
  author = {Tran, Tanh Quang and Langlotz, Tobias and Regenbrecht, Holger},
  date = {2024},
  doi = {10/gt56cs}
}

@article{turk2014multimodal,
  title = {Multimodal Interaction: {{A}} Review},
  shorttitle = {Multimodal Interaction},
@@ -1935,6 +2100,17 @@
  doi = {10/ghbm3h}
}

@article{watanabe2016transvisible,
  title = {The {{Trans-Visible Navigator}}: {{A See-Through Neuronavigation System Using Augmented Reality}}},
  shorttitle = {The {{Trans-Visible Navigator}}},
  author = {Watanabe, Eiju and Satoh, Makoto and Konno, Takehiko and Hirai, Masahiro and Yamaguchi, Takashi},
  date = {2016},
  journaltitle = {World Neurosurg.},
  volume = {87},
  pages = {399--405},
  doi = {10/f8pdbs}
}

@article{wichmann2001psychometrica,
  title = {The Psychometric Function: {{I}}. {{Fitting}}, Sampling, and Goodness of Fit},
  shorttitle = {The Psychometric Function},
@@ -1989,6 +2165,18 @@
  number = {1}
}

@article{yang2022audio,
  title = {Audio {{Augmented Reality}}: {{A Systematic Review}} of {{Technologies}}, {{Applications}}, and {{Future Research Directions}}},
  shorttitle = {Audio {{Augmented Reality}}},
  author = {Yang, Jing and Barde, Amit and Billinghurst, Mark},
  date = {2022},
  journaltitle = {J. Audio Eng. Soc.},
  volume = {70},
  number = {10},
  pages = {788--809},
  doi = {10/gt623q}
}

@inproceedings{yoon2020evaluating,
  title = {Evaluating {{Remote Virtual Hands Models}} on {{Social Presence}} in {{Hand-based 3D Remote Collaboration}}},
  booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},