Typo
@@ -88,7 +88,8 @@ In particular, we will investigate the effect of the visual feedback of the virt
\paragraph{Presence}
\label{ar_presence}
-\AR and \VR are both essentially illusions as the virtual content does not physically exist but is just digitally simulated and rendered to the user's senses through display devices.
+\AR and \VR are both essentially illusions.
+The virtual content does not actually exist physically, but is only digitally simulated and rendered to the user's senses through display devices.
Such an experience of suspension of disbelief in \VR is called \emph{presence}, and it can be decomposed into two dimensions: place illusion and plausibility \cite{slater2009place,slater2022separate}.
\emph{Place illusion} is the user's sense of \enquote{being there} in the \VE (\figref{presence-vr}).
It emerges from the real-time rendering of the \VE from the user's perspective: being able to move around inside the \VE and to look at it from different points of view.
@@ -131,9 +132,9 @@ In all examples of \AR applications shown in \secref{ar_applications}, the user
For a user to interact with a computer system (desktop, mobile, \AR, etc.), they first perceive the state of the system and then act upon it through an input device \cite[p.145]{laviolajr20173d}.
Such input devices form an input \emph{\UI} that captures and translates the user's actions to the computer.
-Similarly, an output \UI render and display the state of the system to the user (such as a \AR/\VR display, \secref{ar_displays}, or an haptic actuator, \secref{wearable_haptic_devices}).
+Similarly, an output \UI renders and displays the state of the system to the user (such as an \AR/\VR display, \secref{ar_displays}, or a haptic actuator, \secref{wearable_haptic_devices}).
-Inputs \UI can be either an \emph{active sensing}, a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or a \emph{passive sensing}, that does not require a contact, such as eye trackers, voice recognition, or hand tracking \cite[p.294]{laviolajr20173d}.
+Input \UIs can rely on either \emph{active sensing}, \ie a held or worn device, such as a mouse, a touch screen, or a hand-held controller, or \emph{passive sensing}, which does not require contact, such as eye trackers, voice recognition, or hand tracking \cite[p.294]{laviolajr20173d}.
The captured information from the sensors is then translated into actions within the computer system by an \emph{interaction technique}.
For example, a cursor on a screen can be moved either with a mouse or with the arrow keys on a keyboard, and a two-finger swipe on a touchscreen can be used to scroll or zoom an image.
Choosing useful and efficient \UIs and interaction techniques is crucial for the user experience and the tasks that can be performed within the system.
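To make the notion of an interaction technique concrete, the sketch below shows, in Python, how the same cursor-movement action can be produced by two different input \UIs (a mouse and keyboard arrow keys). It is a minimal illustration of the concept only, under assumptions of our own; all names (CursorState, mouse_move_technique, arrow_key_technique) are hypothetical and not taken from the cited book.

    # Minimal sketch of an "interaction technique": the same logical action
    # (moving a cursor) is produced from two different input devices.
    # All names are hypothetical, for illustration only.
    from dataclasses import dataclass

    @dataclass
    class CursorState:
        x: int = 0
        y: int = 0

    # One step per arrow-key press (keyboard as the input device).
    ARROW_DELTAS = {"Up": (0, -1), "Down": (0, 1), "Left": (-1, 0), "Right": (1, 0)}

    def mouse_move_technique(cursor: CursorState, dx: int, dy: int) -> None:
        # Active sensing: relative motion reported by a held device (mouse).
        cursor.x += dx
        cursor.y += dy

    def arrow_key_technique(cursor: CursorState, key: str) -> None:
        # The same action driven by a different device (arrow keys).
        dx, dy = ARROW_DELTAS.get(key, (0, 0))
        cursor.x += dx
        cursor.y += dy

    cursor = CursorState()
    mouse_move_technique(cursor, dx=5, dy=-2)  # mouse reports a small motion
    arrow_key_technique(cursor, "Right")       # one key press, one step
    print(cursor)                              # CursorState(x=6, y=-2)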