\chapter{Introduction}

\mainlabel{introduction}


This thesis presents research on the perception of, and direct hand interaction with, everyday objects that are visually and tactilely augmented with immersive augmented reality and wearable haptic devices.

\sectionstartoc{Visual and Tactile Object Augmentations}

\subsectionstartoc{Everyday Interaction with Everyday Objects}

In daily life, we simultaneously look at and touch the everyday objects that surround us, without even thinking about it.
%
Many of these object properties, such as their shapes, sizes or textures, can be perceived in a complementary way by both vision and touch~\autocite{baumgartner2013visual}.
%
But vision often precedes touch, enabling us to anticipate the tactile sensations we will feel when touching the object, and even to predict properties that we cannot see, such as weight or temperature.
%
In this way, visual and tactile sensations are often linked and complementary.
%
This is why we sometimes want to touch an object to check one of its properties that we have seen, such as its texture, and to compare and confront our visual and tactile sensations.
%
We then instinctively construct a unified perception of the object we are exploring and manipulating from these visual and tactile sensory modalities, as well as from the movement of our hand and fingers on the object~\autocite{ernst2002humans}.
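%
As an illustration, this integration of visual and haptic cues is classically modelled as a reliability-weighted combination of the unimodal estimates, a simplified form of the model supported by \textcite{ernst2002humans}, where $\hat{S}_V$ and $\hat{S}_H$ denote the visual and haptic estimates of an object property and $\sigma_V^2$ and $\sigma_H^2$ their respective variances:
\begin{equation*}
    \hat{S} = w_V \hat{S}_V + w_H \hat{S}_H,
    \qquad
    w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
    \qquad
    w_H = 1 - w_V.
\end{equation*}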

Another important aspect of touch is that it allows us not only to perceive the environment, but also to interact with it.
%
Also called the haptic sense, it comprises two sub-modalities: kinesthetic (or proprioceptive) sensations, which are the forces felt by muscles and tendons, and cutaneous (or tactile) sensations, which are the pressures, stretches, vibrations and temperatures felt by the skin.
%
This rich and complex diversity of haptic actions and sensations makes it particularly difficult to recreate artificially, for example in virtual or remote operation environments~\autocite{culbertson2018haptics}.


\subsectionstartoc{Wearable Haptics and the Augmentation of Touch}

\textit{Present what wearable haptics are and how they can be used to augment the sense of touch~\autocite{pacchierotti2017wearable}. Then detail how they have been used with virtual reality, but how little they have been used with augmented reality.}

A wide range of portable or wearable haptic devices have been designed, rendering rich and realistic virtual haptic sensations to the user, as illustrated in \figref{wearable-haptics}.
%
But their use has been little explored in combination with augmented reality (AR) so far.

\begin{subfigs}{wearable-haptics}{Wearable haptic devices are able to render sensations to the skin as feedback to touched virtual objects.}[%
%\item CLAW, a handheld haptic device providing force and vibrotactile sensations to the fingertips to render contact and textures with virtual objects~\autocite{choi2018claw}.
\item Wolverine, a wearable exoskeleton that simulates contact with and grasping of virtual objects with force feedback on the fingers~\autocite{choi2016wolverine}.
\item Touch\&Fold, a wearable haptic device mounted on the nail that folds on demand to render contact, normal force and vibrations to the fingertip~\autocite{teng2021touch}.
\item The hRing, a wearable haptic ring mounted on the proximal phalanx able to render normal and shear forces to the fingertip~\autocite{pacchierotti2016hring}.
\item Tasbi, a haptic bracelet capable of rendering squeeze and vibrotactile feedback to the wrist~\autocite{pezent2019tasbi}.
%\item A vibrotactile voice-coil actuator mounted on the middle phalanx to modulate the perceived texture roughness of a tangible surface touched by the fingertip~\autocite{asano2015vibrotactile}.
]
\subfigsheight{28mm}
\subfig{choi2016wolverine}
\subfig{teng2021touch}
\subfig{pacchierotti2016hring}
\subfig{pezent2019tasbi}
\end{subfigs}

\subsectionstartoc{Augmented Reality is Not Only Visual}

AR integrates virtual content into the perception of the real world, creating the illusion of a single environment and promising natural and seamless interaction with physical and digital objects (and their combinations) directly with our hands.
%
It is technically and conceptually closely related to virtual reality (VR), which completely replaces the real environment with an immersive virtual environment (VE).
%
AR and VR can be placed on a reality-virtuality (RV) continuum, as proposed by \textcite{milgram1994taxonomy} and shown in \figref{rv-continuum}.
%
It describes different levels of combination of real and virtual environments along one axis, with one end being the real, physical environment, and the other end being a purely virtual environment, \ie indistinguishable from the real world (as in \emph{The Matrix} movies or the \emph{Holodeck} in the \emph{Star Trek} series).
%
In between lies mixed reality (MR), which comprises AR and VR~\autocite{skarbez2021revisiting}.

\begin{subfigs}{rv-continuums}{Reality-virtuality (RV) continuums.}[%
\item Original RV continuum for the visual sense initially proposed by and readapted from \textcite{milgram1994taxonomy}.
\item Extension of the RV continuum to include the haptic sense on a second, orthogonal axis, proposed by and readapted from \textcite{jeon2009haptic}.
]
\subfig[0.44]{rv-continuum}
\subfig[0.54]{visuo-haptic-rv-continuum3}
\end{subfigs}

Even though AR and VR are often considered as addressing only the visual sense, they can also be extended to render other sensory modalities.
%
In particular, \textcite{jeon2009haptic} proposed to extend the RV continuum to include haptic feedback by decoupling it into two orthogonal visual and haptic axes (see \figref{visuo-haptic-rv-continuum3}).
%
The combination of the two axes defines nine types of visuo-haptic environments, with three possible levels of RV for each visual or haptic axis (real, augmented, virtual).
%
For instance, a visual AR environment using a tangible object as a proxy to manipulate virtual content is considered a real haptic environment (see \figref{kahl2023using}; bottom middle cell in \figref{visuo-haptic-rv-continuum3}), whereas a device providing synthetic haptic feedback when touching a virtual object is considered a virtual haptic environment (see \figref{meli2018combining}; top middle cell in \figref{visuo-haptic-rv-continuum3}).
%
Haptic augmented reality (HAR) is then the combination of real and virtual haptic stimuli~\autocite{bhatia2024augmenting} (middle row in \figref{visuo-haptic-rv-continuum3}).
%
In particular, it has been implemented by enhancing the haptic perception of tangible objects with timely tactile stimuli provided by wearable haptics.
%
\figref{salazar2020altering} shows an example of modifying the perceived stiffness of a tangible object seen in VR using simultaneous pressure feedback on the finger (left middle cell in \figref{visuo-haptic-rv-continuum3}).
%
\figref{bau2012revel} shows another example of visuo-haptic AR rendering of texture when running the finger over a tangible surface (middle cell of both axes in \figref{visuo-haptic-rv-continuum3}).
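%
For illustration, the nine combinations can be summarised as follows, with the examples cited above placed at their corresponding visual and haptic levels (an indicative overview only; the exact layout is that of \figref{visuo-haptic-rv-continuum3}):

\begin{center}
\small
\begin{tabular}{l|ccc}
 & \textbf{Visually real} & \textbf{Visually augmented} & \textbf{Visually virtual} \\
\hline
\textbf{Haptically virtual} & & wearable haptics in AR~\autocite{meli2018combining} & synthetic haptics in VR \\
\textbf{Haptically augmented} & haptic AR of real objects & visuo-haptic AR~\autocite{bau2012revel} & augmented tangibles in VR~\autocite{salazar2020altering} \\
\textbf{Haptically real} & real environment & tangible proxies in AR~\autocite{kahl2023using} & passive haptic proxies in VR \\
\end{tabular}
\end{center}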

Yet current (visual) AR systems often lack haptic feedback, creating a deceptive and incomplete user experience when reaching out with the hand to touch the virtual environment.
%
All visual virtual objects are indeed, by nature, intangible and cannot physically constrain a user's hand, making it difficult to congruently perceive their properties and to interact with them with confidence and efficiency.
%
Thus, it is necessary to provide haptic feedback that is consistent with the augmented visual environment and that ensures the best possible user experience.
%
Integrating wearable haptics with AR seems to be one of the most promising solutions, but it remains a challenge due to their many respective characteristics and the additional constraints of combining them.

\begin{subfigs}{visuo-haptic-environments}{%
Visuo-haptic environments with different degrees of reality-virtuality.
}[%
\item Visual AR environment with a real, tangible haptic object used as a proxy to manipulate a virtual object~\autocite{kahl2023using}.
\item Visual AR environment with a wearable haptic device that provides virtual, synthetic feedback of contact with a virtual object~\autocite{meli2018combining}.
\item A tangible object seen in a visual VR environment, and whose haptic perception of stiffness is augmented with the hRing haptic device~\autocite{salazar2020altering}.
\item Visuo-haptic rendering of texture on a touched tangible object with a visual AR display and haptic electrovibration feedback~\autocite{bau2012revel}.
]
\subfigsheight{31mm}
\subfig{kahl2023using}
\subfig{meli2018combining}
\subfig{salazar2020altering}
\subfig{bau2012revel}
\end{subfigs}

\sectionstartoc{Research Challenges of Wearable Visuo-Haptic Augmented Reality}

Integrating wearable haptics with augmented reality to create a visuo-haptic augmented environment raises many perceptive and interaction challenges.
%
We are particularly interested in enabling direct contact with, and manipulation of, virtual and augmented objects with bare hands, with the objective of providing a congruent, intuitive and efficient perception of and interaction with the visuo-haptic augmented environment.
%
%This to ensure the best possible user experience, taking into account the current capabilities and limitations of wearable haptics and augmented reality technologies.
%
Each of these challenges also poses numerous design and technical issues specific to each of the two types of feedback, wearable haptics and augmented reality, as well as multimodal rendering and user experience issues in integrating these two sensorimotor feedbacks.
%
We identify two main research challenges which we address in this thesis:
%
\begin{enumerate*}[label=(\arabic*)]
\item providing plausible and coherent visuo-haptic augmentations, and
\item enabling effective interaction with the augmented environment.
\end{enumerate*}
%
These challenges are illustrated by the visuo-haptic interaction loops in \figref{xxx}.
%These challenges are illustrated in the visuo-haptic interaction loop in \figref{xxx}.

\subsectionstartoc{Provide Plausible and Coherent Visuo-Haptic Augmentations}

Numerous haptic devices have been specially designed and evaluated for use in VR, providing realistic and diverse kinesthetic and tactile feedback on virtual objects.
%
Although closely related, (visual) AR and VR have key differences in their rendering that can affect the user's perception.

First, the user's hand and the real environment are visible in AR, contrary to VR, where there is total control over the visual rendering of the hand and the environment.
% (unless specifically overlaid with virtual visual content)
%
As such, in VR, visual sensations are particularly dominant in the perception, and conflicts with haptic sensations are deliberately created to influence the user's perception, for example to produce pseudo-haptic~\autocite{ujitoko2021survey} or haptic retargeting~\autocite{azmandian2016haptic} effects.
%enabling techniques such as pseudo-haptic feedback that induce haptic feedback with visual stimuli~\autocite{ujitoko2021survey} or haptic retargeting that associate a single tangible object with multiple virtual objects without the user noticing~\autocite{azmandian2016haptic}.
%
Moreover, many wearable haptic devices take the form of controllers, gloves or exoskeletons that cover the fingertips, and are therefore not suitable for AR.
%
The user's hand must indeed remain unimpaired and free to touch and interact with the real environment.
%
It is possible instead to place the haptic actuator close to the point of contact with the real environment, as described above to implement HAR, \eg providing haptic feedback on another phalanx~\autocite{asano2015vibrotactile,salazar2020altering} or the wrist~\autocite{sarac2022perceived} for rendering fingertip contacts with virtual content.
%
Therefore, when touching a virtual or augmented object, the real and virtual visual sensations are seen colocalised, but not the virtual haptic feedback.
%
It remains to be investigated how such potential discrepancies affect the overall perception, in order to design visuo-haptic renderings adapted to AR.

Then, in AR, as of today, it is only possible to add virtual visual and haptic sensations to the user's overall perception of the environment.
%
These virtual visual and haptic sensations can therefore be perceived as being out of sync or even inconsistent with the sensations of the real environment, for example because of a lower rendering quality, a temporal latency, a spatial shift, or a combination of these.
%
Hence, it is unclear to what extent the real and virtual visuo-haptic sensations will be perceived as realistic or plausible, and how much they will conflict or complement each other in the perception of the augmented environment.
%
%Therefore, it remains to be investigated how these three characteristics of using wearable haptics with AR affect the perception, especially with visually and haptically augmented objects.

% We see our own hand touching, unlike in VR, where vision is particularly dominant (e.g. retargeting); hard to say whether this is the case in AR, especially since, when touching augmented objects, it is difficult to modify them visually and haptically: we can add sensations but not really remove them. The actuator is not where we touch: how realistic will the sensations be? Consistent with the visual sensations? How different is the perception from VR, in terms of hand/environment rendering and latency? Important because it will allow effective use, with corrections if needed compared to VR. The interaction loop necessarily has latency with respect to movements and proprioception, and not the same between visual and haptic: what effect?

\subsectionstartoc{Enable Effective Interaction with the Augmented Environment}

Touching, grasping and manipulating virtual objects are fundamental interactions for AR~\autocite{kim2018revisiting} and VR~\autocite{bergstrom2021how}.
%
As the hand is neither occupied nor covered by a haptic device, so as not to impair interaction with the real environment as described above, one can expect to interact with the virtual content seamlessly and directly with the hand, as if it were real.
%
Thus, augmenting a tangible object presents the advantage of physically constraining the hand, enabling an easy and natural interaction, but manipulating a purely virtual object with bare hands can be challenging without good haptic feedback~\autocite{maisto2017evaluation,meli2018combining}.%, and one will rely on visual and haptic feedback to guide the interaction.

Moreover, current AR systems present visual rendering limitations that also affect virtual object interaction.%, due to depth underestimation, a lack of mutual occlusions, and hand tracking latency.
%
AR consists in displaying images of the virtual world superimposed on, and synchronised with, the user's current view of the real world.
%
But the depth of virtual objects is often underestimated~\autocite{peillard2019studying,adams2022depth}, and mutual occlusion between the hand and a virtual object is often lacking, \ie the hand hiding the object or being hidden by it~\autocite{macedo2023occlusion}.
%
Finally, interaction with the object is an illusion: in fact, the real hand is recorded by a tracking system and controls in real time a virtual hand, like an avatar, whose contacts with virtual objects are then simulated in the virtual environment.
%
There is therefore inevitably a latency between the real hand movements and the resulting virtual object movements, and a spatial shift can also occur between the real hand and the virtual hand, whose movements are constrained by the touched virtual object~\autocite{prachyabrued2014visual}.
%
This makes it difficult to perceive the position of the fingers relative to the object before touching or grasping it, and also to estimate the force to apply to grasp it and move it to a desired location.

Hence, it is necessary to provide visual and haptic feedback that enables the user to efficiently contact, grasp and manipulate a virtual object with the hand.
%
Yet, it is unclear which type of visual and haptic feedback is best suited to guide virtual object manipulation, and whether one or the other of the two types of feedback, or their combination, is the most beneficial for users.

% Since the hand is left free and visible, there is no tracked controller as in VR, so we expect natural interactions directly with the hand, seamlessly between the physical and the virtual. Additional tracking of the actuators that cover the hand. Question of where to place the actuator; once again the feedback is not colocalised with the action, unlike the visual feedback (+/- lag). There are also temporal and spatial shifts between the physical hand and its virtual replica, yet it is the latter that acts on the virtual objects: should it be shown? Or are haptic renderings sufficient, complementary, contradictory?


\sectionstartoc{Approach and Contributions}

The objective of this thesis is to understand how immersive visual and wearable haptic feedback compare and complement each other in the context of direct hand perception of, and interaction with, augmented and virtual objects.
%
As described in the research challenges section above, providing a user with a convincing, consistent and effective visuo-haptic augmented environment is complex and raises many issues.
%
Our approach is to
%
\begin{enumerate*}[label=(\arabic*)]
\item design immersive and wearable multimodal visuo-haptic renderings that augment both the objects being interacted with and the hand interacting with them, and
\item evaluate experimentally in user studies how these renderings affect the perception of and interaction with these objects, using psychophysical, performance, and user experience methods.
\end{enumerate*}
%
We consider two main axes of research, each addressing one of the research challenges identified above:
%
\begin{enumerate*}[label=(\Roman*)]
\item altering the perception of tangible surfaces with virtual visuo-haptic textures, and
\item improving virtual object interaction with visuo-haptic augmentations of the hand.
\end{enumerate*}
%
Our contributions in these two axes are summarized in \figref{xxx}.

% TODO: Add figure with the two axes of research and the contributions

\subsectionstartoc{Altering the Perception of Tangible Surfaces with Virtual Visuo-Haptic Textures}

(\textit{to complete})

% Very short shared motivation of the shared objective of the two contributions
% Objective + We propose / we consider : (1) ... and (2) ...

% Very short abstract of contrib 1

% Very short abstract of contrib 2

\subsectionstartoc{Improving Virtual Object Interaction with Visuo-Haptic Augmentations of the Hand}

(\textit{to complete})

% Touch allows us to perceive the environment and to interact with it, which motivates these two axes of research.

% It is a mix of augmented reality, virtual reality and vibrotactile feedback for the visuo-tactile augmentation of the real world. Such multimodal rendering raises many questions on how to design the renderings, and how they interact with and complete each other to give one unified perception.


\sectionstartoc{Thesis Overview}

%Present the contributions and structure of the thesis.

First, it gives an overview of existing wearable haptic devices and renderings.
%
Second, it presents the principles, current capabilities and limitations of augmented reality, and describes the 3D interaction techniques used in AR and VR environments to interact with virtual and augmented objects, with a particular focus on the visual rendering of the user's hand.
%
Finally, it shows how multimodal visuo-haptic feedback has been used in AR and VR to alter the perception of tangible objects and to improve the interaction with and manipulation of virtual objects.
%
Then, we address each of our two research axes in a dedicated part.

%
We evaluate how the visual rendering of the hand (real or virtual), the environment (AR or VR) and the textures (displayed or hidden) affect the roughness perception of virtual vibrotactile textures rendered on real surfaces and touched directly with the index finger.

\chapref{xr_perception} details a system for rendering visuo-haptic virtual textures augmenting tangible surfaces using an immersive AR/VR headset and a wearable vibrotactile device.
%
The haptic textures are rendered as a real-time vibrotactile signal representing a grating texture, and are provided to the middle phalanx of the index finger touching the texture using a voice-coil actuator.
%
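For illustration, one simple formulation of such a grating signal, assuming a sinusoidal profile of spatial period $\lambda$, amplitude $A$ and finger position $x(t)$ measured along the surface (symbols given here only as a sketch), is
\begin{equation*}
    s(t) = A \sin\!\left(\frac{2\pi\, x(t)}{\lambda}\right),
\end{equation*}
so that the instantaneous vibration frequency scales with the finger speed over the texture.
%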
The textures are paired visual and tactile models of real surfaces.
%
We investigate the perception and user appreciation of nine representative visuo-haptic texture pairs.
%
Our objective is to assess the perceived realism, plausibility and roughness of the visual and haptic textures, and the coherence of their association.

\bigskip

\partref{manipulation} describes our contributions to the second axis of research, improving virtual object interaction with visuo-haptic augmentations of the hand.
%
We evaluate how the visual and haptic rendering of the hand improves interaction with virtual objects directly with the hand.

\chapref{visual_hand} explores the effect of six visual renderings of the hand that provide contact feedback with the virtual object. (\textit{to complete})

\chapref{visuo_haptic_hand} evaluates two vibrotactile contact techniques, provided at four different locations on the real hand, and compares them to the two most representative visual hand renderings from the previous contribution. (\textit{to complete})

\bigskip