WIP related work
@@ -290,17 +290,17 @@ Sandpaper is typically perceived as sticky because it has a strong resistance to
This perceptual property is closely related to the perception of roughness~\cite{hollins1993perceptual,baumgartner2013visual}.

When running the finger across a surface with a lateral movement (see \secref{exploratory_procedures}), the skin-surface contacts generate frictional forces opposing the finger movement, giving kinesthetic cues, and also stretch the skin, giving cutaneous cues.
As illustrated in \figref{smith1996subjective_1}, a stick-slip phenomenon can also occur, where the finger is intermittently slowed by friction before continuing to move, on both rough and smooth surfaces~\cite{derler2013stick}.
The amplitude of the frictional force $F_s$ is proportional to the normal force of the finger $F_n$, \ie the force perpendicular to the surface, according to a coefficient of friction $\mu$:
\begin{equation}
\label{eq:friction}
F_s = \mu \, F_n
\end{equation}
The perceived intensity of friction thus follows that of the coefficient of friction $\mu$~\cite{smith1996subjective}.
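
As a minimal worked example of this relation (the values are chosen for illustration only): pressing with a normal force $F_n = \qty{0.5}{\newton}$ on a surface with $\mu = 0.6$ yields a frictional force $F_s = 0.6 \times \qty{0.5}{\newton} = \qty{0.3}{\newton}$. Conversely, as in \figref{smith1996subjective_1}, $\mu$ can be estimated from paired force measurements $(F_{n,i}, F_{s,i})$ as the slope of a line fitted through the origin:
\begin{equation}
\hat{\mu} = \frac{\sum_i F_{n,i} \, F_{s,i}}{\sum_i F_{n,i}^2}
\end{equation}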

\begin{subfigs}{smith1996subjective}{Perceived intensity of friction of different materials during active exploration with the finger~\cite{smith1996subjective}. }[
\item Measurements of the normal $F_n$ and tangential $F_t$ forces when exploring two surfaces: one smooth (glass) and one rough (nyloprint). The fluctuations in the tangential force are due to the stick-slip phenomenon. The coefficient of friction $\mu$ can be estimated as the slope of the relationship between the normal and tangential forces.
\item Perceived friction intensity (vertical axis) as a function of the coefficient of friction $\mu$ estimated from the exploration (horizontal axis) for four materials (shapes and colors).
]
\subfigsheight{55mm}
\subfig{smith1996subjective_1}
@@ -318,11 +318,11 @@ Si le doigt est anesthésié, l'absence de sensations cutanées empêche d'ajust
|
||||
\subsubsection{Temperature}
|
||||
\label{temperature}

Temperature (or coldness/warmness) is the perception of the \emph{transfer of heat} between the touched surface and the skin~\cite{bergmanntiest2010tactual}:
if heat is extracted from (brought to) the skin, the surface is perceived as cold (warm).
Metal is perceived as colder than wood at the same room temperature; this perception is thus distinct from the physical temperature of the material and is an important property for discriminating between materials~\cite{ho2006contribution}.
It depends on the thermal conductance and heat capacity of the material, the volume of the object, the initial temperature difference between the surface and the skin, and the contact area~\cite{kappers2013haptic}.
For example, a larger object or a smoother surface, which increases the contact area, increases the heat flow and produces a more intense temperature sensation (hot or cold)~\cite{bergmanntiest2008thermosensory}.

%Because it is based on heat flow, the perception of temperature is slower than that of the other material properties and requires a static touch (see \figref{exploratory_procedures}) of several seconds for the skin temperature to equilibrate with that of the object.
%The temperature $T(t)$ of the finger at time $t$ in contact with a surface follows an exponentially decreasing law, where $T_s$ is the initial skin temperature, $T_e$ is the surface temperature, $t$ is the time and $\tau$ is the time constant:
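%A minimal sketch of this law (assuming a first-order exponential relaxation toward the surface temperature $T_e$; the exact formulation used by the cited works may differ):
%\begin{equation}
%	T(t) = T_e + (T_s - T_e) \, e^{-t/\tau}
%\end{equation}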
@@ -337,29 +337,30 @@ Par exemple, un objet plus volumineux ou une surface plus lisse, augmentant l'ai
\subsubsection{Spatial Properties}
\label{spatial_properties}

Weight, size and shape are so-called spatial haptic properties of an object, which are independent of the material properties described above.

Weight (or heaviness/lightness) is the perceived \emph{mass} of the object~\cite{bergmanntiest2010haptic}.
It is typically estimated by holding the object statically in the palm of the hand to feel the gravitational force, an \enquote{unsupported holding} (see \secref{exploratory_procedures}).
A relative weight difference of \percent{8} is then required to be perceptible~\cite{brodie1985jiggling}.
By lifting the object, it is also possible to feel the object's inertial force, \ie its resistance to acceleration.
This provides an additional perceptual cue to its mass and slightly improves weight discrimination~\cite{brodie1985jiggling}.
For both gravity and inertia, the kinesthetic cues of force are much more important than the cutaneous cues of pressure~\cite{bergmanntiest2012investigating}.
%The relationship between physical weight and perceived intensity varies between individuals~\cite{kappers2013haptic}.
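
As a minimal worked example of this \percent{8} threshold (the object mass is chosen for illustration only), for a \qty{500}{\g} object held in the palm, the just-noticeable weight difference is about
\begin{equation}
\Delta m \approx 0.08 \times \qty{500}{\g} = \qty{40}{\g}
\end{equation}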

Size can be perceived as the object's \emph{length} (in one dimension) or its \emph{volume} (in three dimensions)~\cite{kappers2013haptic}.
In both cases, if the object is small enough, a precision grip (see \figref{gonzalez2014analysis}) between the thumb and index finger can discriminate sizes with an accuracy of \qty{1}{\mm}, but with an overestimation of length (power law with an exponent of \num{1.3}).
Otherwise, it is necessary to follow the contours of the object with the fingers to estimate its length (see \secref{exploratory_procedures}), but with ten times less accuracy and an underestimation of length (power law with an exponent of \num{0.9})~\cite{bergmanntiest2011cutaneous}.
The volume of an object that is not small is typically perceived by enclosing it with the hand, but the estimate is strongly influenced by the size, shape and mass of the object, for an identical volume~\cite{kahrimanovic2010haptic}.
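
These two results can be summarized with a power law in Stevens' form (a sketch: the scaling constant $k$ and the notation are assumptions, not taken from \textcite{bergmanntiest2011cutaneous}), where $\psi$ is the perceived length, $\ell$ the physical length, and $\beta \approx 1.3$ for the precision grip or $\beta \approx 0.9$ for contour following:
\begin{equation}
\psi = k \, \ell^{\beta}
\end{equation}
For example, doubling the physical length multiplies the perceived length by $2^{1.3} \approx 2.5$ in the first case, but only by $2^{0.9} \approx 1.9$ in the second.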

The shape of an object can be defined as the perception of its \emph{global geometry}, \ie its overall form and contours.
This is the case, for example, when searching for a key in a pocket.
Contour following and enclosure are then employed, as for the estimation of length and volume.
If the object is not known in advance, its identification is rather slow, taking several seconds~\cite{norman2004visual}.
Therefore, the exploration of other properties is favoured to recognize the object more quickly, in particular sharp edges~\cite{klatzky1987there}, \eg a screw among nails (see \figref{plaisier2009salient_2}), or distinctive material properties~\cite{lakatos1999haptic,plaisier2009salient}, \eg a metal object among plastic objects.

\begin{subfigs}{plaisier2009salient}{Identification of a sphere among cubes~\cite{plaisier2009salient}. }[
\item The shape has a significant effect on the perception of the volume of an object, \eg a sphere is perceived as smaller than a cube of the same volume.
\item The absence of sharp edges on the sphere makes it easy to identify among cubes.
]
\subfigsheight{40mm}
\subfig{plaisier2009salient_1}
@@ -370,8 +371,8 @@ C'est pourquoi, l'exploration d'autres propriétés est alors privilégié pour
\subsection{Conclusion}
\label{haptic_sense_conclusion}

Haptic perception and manipulation of objects with the hand thus involve several simultaneous mechanisms with complex interactions.
Exploratory movements of the hand are performed on contact with the object to obtain multiple sensory cues from several cutaneous and kinesthetic receptors.
These sensations translate physical parameters into perceptual cues, which are then integrated to form a perception of the property being explored.
One perceptual cue is often particularly important in the perception of a given property, but some perceptual constancy is possible by compensating for its absence with the others.
In turn, these perceptions help to guide the grasping and manipulation of the object, by adapting the grasp type and the applied forces to the shape of the object and to the task to be performed.
@@ -209,9 +209,6 @@ A common method vibrotactile rendering of texture is to use a sinusoidal signal

\cite{kildal20103dpress}

\cite{pacchierotti2014improving}
\cite{pacchierotti2015enhancing}
\cite{park2017compensation} % Compensation of perceived hardness of a virtual object with cutaneous feedback
\cite{park2019realistic}
\cite{choi2021perceived}
\cite{park2023perceptual}
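
A minimal sketch of the sinusoidal vibrotactile texture rendering mentioned above (the notation and the grating-style mapping from scanning speed to frequency are assumptions, not tied to a specific cited implementation): the vibration $a(t)$ is a sine whose frequency is the finger scanning speed $v$ divided by the spatial period $\lambda$ of the virtual texture,
\begin{equation}
a(t) = A \sin\left(2 \pi \frac{v}{\lambda} t\right)
\end{equation}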
@@ -219,6 +216,8 @@ park 2017 Compensation of perceived hardness of a virtual object with cutaneous

\cite{detinguy2018enhancing}
\cite{salazar2020altering}
\cite{yim2021multicontact}

\cite{park2017compensation}
\cite{tao2021altering}

\subsubsection{Friction}

@@ -39,21 +39,21 @@ Yet, most of the research have focused on visual augmentations, and the term \AR
|
||||
\paragraph{Applications}

Advances in technology, research and development have enabled many uses of \AR, including medical, educational, industrial, navigation, collaboration and entertainment applications~\cite{dey2018systematic}.
For example, \AR can help surgeons visualize \ThreeD images of the brain overlaid on the patient's head prior to or during surgery, \eg in \figref{watanabe2016transvisible}~\cite{watanabe2016transvisible}, or help students learn complex concepts and phenomena such as optics or chemistry~\cite{bousquet2024reconfigurable}.
It can also guide workers in complex tasks, such as assembly, maintenance or verification, \eg in \figref{hartl2013mobile}~\cite{hartl2013mobile}, reinvent the way we interact with desktop computers, \eg in \figref{lee2013spacetop}~\cite{lee2013spacetop}, or create completely new forms of gaming or tourism experiences, \eg in \figref{roo2017inner}~\cite{roo2017inner}.
Most (visual) \AR/\VR experiences can now be implemented with commercially available hardware and software solutions, in particular for tracking, rendering and display.
Yet, the user experience in \AR is still highly dependent on the display used.

\begin{subfigs}{ar_applications}{Examples of \AR applications. }[
\item Neurosurgery \AR visualization of the brain on a patient's head~\cite{watanabe2016transvisible}.
\item HOBIT is a spatial, tangible \AR table simulating an optical bench for educational experiments~\cite{bousquet2024reconfigurable}.
\item SpaceTop is a transparent \AR desktop computer featuring direct hand interaction with content~\cite{lee2013spacetop}.
\item \AR can interactively guide document verification tasks by recognizing documents and comparing them with virtual references~\cite{hartl2013mobile}.
\item Inner Garden is a tangible, spatial \AR zen garden for relaxation and meditation~\cite{roo2017inner}.
]
\subfigsheight{45mm}
\subfig{watanabe2016transvisible}
\subfig{bousquet2024reconfigurable}
\subfig{lee2013spacetop}
\subfig{hartl2013mobile}
\subfig{roo2017inner}
\end{subfigs}
@@ -123,6 +123,8 @@ Mais il faut pouvoir permettre à l'utilisateur d'interagir avec l'environment e
|
||||
|
||||
\subsubsection{Interaction Techniques}
|
||||
|
||||
\paragraph{Reducing the Physical-Virtual Gap}

For this, interaction techniques are needed~\cite{billinghurst2005designing}: physical elements as input -- interaction technique --> virtual elements as output.
Interaction techniques are crucial for the user experience, as they largely dictate the coherence of the system (see \secref{ar_presence}) through the quality of the actions possible with the virtual environment.
\enquote{the aim is to link user inputs coming from physical sensors (mouse, touchscreen, camera images) to actions on the computer, represented by an output result (display, sound, command), via an interaction technique}
@@ -130,22 +132,33 @@ ex : "La technique d’interaction est donc une méthode qui permet de traduire

HCI principle~[Van Dam, 1997]: reduce the gap between the physical and virtual elements, \ie in a sense make the interaction as \enquote{natural} as possible, as \enquote{invisible} as possible.
In \AR, especially immersive and wearable \AR, this gap can be reduced so much that it is almost imperceptible to the user, and the interaction can be practically the same as with the \RE, \ie essentially touching, grasping and manipulating the virtual objects directly with the hands.

\paragraph{Tasks}

\textcite{laviola20173d} classified the techniques for interacting with a \VE into three categories according to the task they help to accomplish: \enquote{navigation}, \enquote{selection} and \enquote{manipulation}.
Navigation is the movement of the user in the \VE, but in the case of an \AR headset, the \VE is aligned with the \RE and both are perceptually one and the same augmented environment (immersion): navigation is thus essentially the movement of the user in the \RE. To do so, the headset localizes itself in the \RE with tracking sensors and algorithms, and the rendering of the \VE is translated and rotated to follow the real movement, so that it is displayed from the correct perspective of the user. See also \cite{marchand2016pose} for a review of tracking techniques for \AR.
Selection is the choice of a virtual object in the \VE, and manipulation is the interaction with this object, \ie moving, rotating, resizing it, etc.
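
As a minimal sketch of this principle (the notation is an assumption, not taken from \cite{marchand2016pose}): if the tracked pose of the headset in the \RE is the rigid transform $\mathbf{T}_{\mathrm{head}} \in SE(3)$ expressed in the world frame, the \VE is rendered with a view transform equal to its inverse, so that the virtual content remains aligned with the \RE while the user moves:
\begin{equation}
\mathbf{V} = \mathbf{T}_{\mathrm{head}}^{-1}
\end{equation}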

\textcite{hertel2021taxonomy} proposed a taxonomy of interaction techniques specifically for \AR, based on the tasks and the input modalities used.
The proposed tasks are creation, selection, (geometric) manipulation, abstract manipulation, and text entry.
Each of these tasks can potentially be performed with different input modalities.

\paragraph{Natural Interaction in AR}

In the case of immersive \AR with a \enquote{natural} interaction (cf.~\cite{billinghurst2005designing}), selection consists of touching the virtual object with the hands, and manipulation of grasping and moving it with the hands.
This is what is known as \enquote{virtual hands}: the user's virtual hands in the \VE.
The input device is not a controller, as is often the case in \VR, but directly the hands.
The hands are therefore tracked and reproduced in the \VE.

Nevertheless, the main problem of natural hand interaction with a \VE, besides hand tracking, is the lack of physical constraints on the movements of the hand and fingers, which makes actions tiring~\cite{hincapie-ramos2014consumed}, imprecise (without haptic feedback, one does not know whether one is touching the virtual object) and difficult (likewise, without haptic feedback one does not feel the object slipping, and there is no confirmation that it is firmly in hand).
Interaction techniques thus remain necessary, and haptic feedback adapted to the interaction constraints of \AR is essential for a good user experience.

This can also be difficult to understand: \textcite{chan2010touching} propose combining continuous feedback, so that the user can situate the tracking of their own body, with discrete feedback to confirm their actions. A visual rendering and display of the hands is a continuous feedback; a brief change of color or a haptic feedback is a discrete feedback. However, this combination has not been evaluated.

Prototypes: HandyAR and HoloDesk.
Two natural hand interaction techniques are the most common in immersive \AR: virtual hands and tangibles.
When the object is distant, selection can instead be done with pointing techniques or gestures (see \cite{hertel2021taxonomy}).

\subsubsection{Manipulating with Virtual Hands}

\cite{hilliges2012holodesk}
\cite{piumsomboon2013userdefined}: user-defined gestures for manipulation of virtual objects in AR.
\cite{piumsomboon2014graspshell}: direct hand manipulation of virtual objects in immersive AR vs vocal commands.
\cite{chan2010touching}: cues for touching (selection) virtual objects.
@@ -161,16 +174,16 @@ In OST-AR, this is more difficult because the virtual environment is displayed a
|
||||
Moreover, in VST-AR, the grip aperture and depth positioning of virtual objects often seem to be wrongly estimated~\cite{al-kalbani2016analysis, maisto2017evaluation}.
|
||||
However, this effect has yet to be verified in an OST-AR setup.
|
||||

An alternative is to render the virtual objects and the hand semi-transparent, so that they are partially visible even when one is occluding the other, \eg the real hand is behind the virtual cube but still visible.
Although perceived as less natural, this seems to be preferred to a mutual visual occlusion in VST-AR~\cite{buchmann2005interaction, ha2014wearhand, piumsomboon2014graspshell} and VR~\cite{vanveldhuizen2021effect}, but has not yet been evaluated in OST-AR.
However, this effect still causes depth conflicts that make it difficult to determine whether one's hand is behind or in front of a virtual object, \eg the thumb is in front of the virtual cube, but it appears to be behind it.

In VR, as the user is fully immersed in the virtual environment and cannot see their real hands, it is necessary to represent them virtually.
|
||||
Virtual hand rendering is also known to influence how an object is grasped in VR~\cite{prachyabrued2014visual,blaga2020too} and AR, or even how real bumps and holes are perceived in VR~\cite{schwind2018touch}, but its effect on the perception of a haptic texture augmentation has not yet been investigated.
|
||||
It is known that the virtual hand representation has an impact on perception, interaction performance, and preference of users~\cite{prachyabrued2014visual, argelaguet2016role, grubert2018effects, schwind2018touch}.
|
||||
In a pick-and-place task in VR, \textcite{prachyabrued2014visual} found that the virtual hand representation whose motion was constrained to the surface of the virtual objects performed the worst, while the virtual hand representation following the tracked human hand (thus penetrating the virtual objects), performed the best, even though it was rather disliked.
|
||||
The authors also observed that the best compromise was a double rendering, showing both the tracked hand and a hand rendering constrained by the virtual environment.
|
||||
It has also been shown that over a realistic avatar, a skeleton rendering can provide a stronger sense of being in control~\cite{argelaguet2016role} and that minimalistic fingertip rendering can be more effective in a typing task~\cite{grubert2018effects}.
|
||||
In AR, as the real hand of a user is visible but not physically constrained by the virtual environment, adding a visual hand rendering that can physically interact with virtual objects would achieve a similar result to the promising double-hand rendering of \textcite{prachyabrued2014visual}.
|
||||
Additionally, \textcite{kahl2021investigation} showed that a virtual object overlaying a tangible object in OST-AR can vary in size without worsening the users' experience or the performance.
|
||||
@@ -183,6 +196,20 @@ However, the experiment was carried out on a screen, in a non-immersive AR scena
|
||||
\textcite{saito2021contact} found that masking the real hand with a textured 3D opaque virtual hand did not improve performance in a reach-to-grasp task but displaying the points of contact on the virtual object did.
|
||||
To the best of our knowledge, evaluating the role of a visual rendering of the hand displayed \enquote{and seen} directly above real tracked hands in immersive OST-AR has not been explored, particularly in the context of virtual object manipulation.
|
||||

But this raises the question of the hand representation, which has been shown to affect performance and user experience in \VR but remains little studied in \AR.

\subsubsection{Manipulating with Tangibles}

\cite{issartel2016tangible}
\cite{englmeier2020tangible}
in OST-AR: \cite{kahl2021investigation,kahl2022influence,kahl2023using}

A threefold problem:
one tangible is needed per object, the association does not always work~\cite{hettiarachchi2016annexing}, and many tangibles may be required;
and the object seen may not match the haptic sensations of the tangible being manipulated~\cite{detinguy2019how}.
This is why using a wearable to modify the cutaneous sensations of the tangible is a solution that works in \VR~\cite{detinguy2018enhancing,salazar2020altering} and could be adapted to \AR.
However, and specific to \AR as opposed to \VR, the tangible and the hand are visible, at least partially, even if hidden by a virtual object: how will the haptic augmentation work in \AR compared to \VR? Are there perceptual biases? Since the user sees their own hand touching the tangible, whereas in \VR it is hidden, the illusion could potentially be stronger in \VR.

\subsection{Conclusion}
|
||||
\label{ar_conclusion}
|
||||
|
||||
@@ -33,9 +33,17 @@ A few works have also used pseudo-haptic feedback to change the perception of ha
|
||||
For example, different levels of stiffness can be simulated on a grasped virtual object with the same passive haptic device~\cite{achibet2017flexifingers} or
|
||||
the perceived softness of tangible objects can be altered by superimposing in AR a virtual texture that deforms when pressed by the hand~\cite{punpongsanon2015softar}, or in combination with vibrotactile rendering in VR~\cite{choi2021augmenting}.
|
||||
|
||||
% Ban
|
||||
\cite{ban2012modifying}
|
||||
\cite{ban2014displaying}
|
||||
\cite{taima2014controlling}
|
||||
\cite{ujitoko2019presenting}
|
||||
|
||||
% I. Jang and D. Lee. 2014. On utilizing pseudo-haptics for cutaneous fingertip.
|
||||
\cite{costes2019touchy}
|
||||
\cite{kalus2024simulating}
|
||||
\cite{detinguy2019how}
|
||||
\cite{samad2019pseudohaptic}
|
||||
\cite{issartel2015perceiving}
|
||||
\cite{ogawa2021effect}
|
||||
|
||||
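
A minimal sketch of the control-display ratio principle behind the pseudo-haptic weight effect of \textcite{samad2019pseudohaptic} cited above (the notation is an assumption): the displayed displacement $x_v$ of the virtual object is scaled with respect to the real hand displacement $x_h$ by a ratio $c$, and a ratio $c < 1$, \ie the object visually lags behind the hand, tends to make it feel heavier:
\begin{equation}
x_v = c \, x_h
\end{equation}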
The vibrotactile sinusoidal rendering of virtual texture cited above was also combined with visual oscillations of a cursor on a screen to increase the roughness perception of the texture~\cite{ujitoko2019modulating}.
|
||||
However, the visual representation was a virtual cursor seen on a screen while the haptic feedback was felt with a hand-held device.
|
||||
@@ -78,11 +86,13 @@ If it is indeed necessary to delocalize the haptic feedback, each of these posit
|
||||
\subsection{Improving the Interactions with Virtual Objects}
|
||||
\label{vhar_interaction}
|
||||
|
||||
\cite{pacchierotti2015cutaneous}
|
||||
|
||||
Conjointly, a few studies have explored and compared the effects of visual and haptic feedback in tasks involving the manipulation of virtual objects with the hand.
|
||||
\textcite{sarac2022perceived} and \textcite{palmer2022haptic} studied the effects of providing haptic feedback about contacts at the fingertips using haptic devices worn at the wrist, testing different mappings.
|
||||
Results showed that moving the haptic feedback away from the point(s) of contact is possible and effective, and that its impact is more significant when the visual feedback is limited.

In pick-and-place tasks in AR involving both virtual and real objects, \textcite{maisto2017evaluation} and \textcite{meli2018combining} showed that having a haptic rendering of the fingertip interactions with the virtual objects led to better performance and perceived effectiveness than having only a visual rendering of the hand.
Moreover, employing the haptic ring of~\cite{pacchierotti2016hring} on the proximal finger phalanx led to an improved performance with respect to more standard fingertip haptic devices~\cite{chinello2020modular}.
|
||||
However, the measured difference in performance could be attributed to either the device or the device position (proximal vs fingertip), or both.
|
||||
|
||||
|
||||
Binary file not shown.
|
Before Width: | Height: | Size: 27 KiB After Width: | Height: | Size: 172 KiB |
@@ -3,12 +3,11 @@
|
||||
|
||||
\chaptertoc

This chapter reviews previous work on the perception and manipulation of \AEs directly with the hand, using \WHs, \AR and their combination.
%Experiencing a visual, haptic, or visuo-haptic \AE relies on one to many interaction loops between a user and the environment, as shown in \figref[introduction]{interaction-loop}, and each main step must be addressed and understood: the tracking and modelling of the \RE into a \VE, the interaction techniques to act on the \VE, the rendering of the \VE to the user through visual and haptic user interfaces, and, finally, the user's perception and actions on the overall \AE.
To achieve this, we first describe how the hand senses and acts on its environment to perceive the haptic properties of real everyday objects.
Secondly, we present how \WH devices and renderings have been used to augment the haptic perception of augmented objects, with a focus on vibrotactile feedback and haptic textures.
Thirdly, we introduce the principles and user experience of \AR, and overview the main interaction techniques used to manipulate virtual and augmented objects.
Finally, we present how multimodal visual and haptic feedback have been combined in \AR to modify the user perception of tangible objects, and to improve the user interaction with \VOs.

\input{1-haptic-hand}
|
||||
|
||||
references.bib
@@ -162,6 +162,35 @@
|
||||
date = {2024}
|
||||
}
|
||||
|
||||
@inproceedings{ban2012modifying,
|
||||
title = {Modifying an Identified Curved Surface Shape Using Pseudo-Haptic Effect},
|
||||
booktitle = {{{IEEE Haptics Symp}}.},
|
||||
author = {Ban, Yuki and Kajinami, Takashi and Narumi, Takuji and Tanikawa, Tomohiro and Hirose, Michitaka},
|
||||
date = {2012},
|
||||
pages = {211--216},
|
||||
doi = {10/gnx4k5}
|
||||
}
|
||||
|
||||
@inproceedings{ban2014displaying,
|
||||
title = {Displaying Shapes with Various Types of Surfaces Using Visuo-Haptic Interaction},
|
||||
booktitle = {{{ACM Symp}}. {{Virtual Real}}. {{Softw}}. {{Technol}}.},
|
||||
author = {Ban, Yuki and Narumi, Takuji and Tanikawa, Tomohiro and Hirose, Michitaka},
|
||||
date = {2014},
|
||||
pages = {191--196},
|
||||
doi = {10/gvr3fg}
|
||||
}
|
||||
|
||||
@online{banerjee2024introducing,
|
||||
title = {Introducing {{HOT3D}}: {{An Egocentric Dataset}} for {{3D Hand}} and {{Object Tracking}}},
|
||||
shorttitle = {Introducing {{HOT3D}}},
|
||||
author = {Banerjee, Prithviraj and Shkodrani, Sindi and Moulon, Pierre and Hampali, Shreyas and Zhang, Fan and Fountain, Jade and Miller, Edward and Basol, Selen and Newcombe, Richard and Wang, Robert and Engel, Jakob Julian and Hodan, Tomas},
|
||||
date = {2024-06-13},
|
||||
eprint = {2406.09598},
|
||||
eprinttype = {arXiv},
|
||||
eprintclass = {cs},
|
||||
pubstate = {prepublished}
|
||||
}
|
||||
|
||||
@article{bau2012revel,
|
||||
title = {{{REVEL}}: Tactile Feedback Technology for Augmented Reality},
|
||||
shorttitle = {{{REVEL}}},
|
||||
@@ -252,7 +281,7 @@
|
||||
|
||||
@article{bergmanntiest2009cues,
|
||||
title = {Cues for {{Haptic Perception}} of {{Compliance}}},
|
||||
author = {Bergmann Tiest, W.M. and Kappers, A.},
|
||||
author = {Bergmann Tiest, Wouter M. and Kappers, Astrid M. L.},
|
||||
date = {2009},
|
||||
journaltitle = {IEEE Trans. Haptics},
|
||||
volume = {2},
|
||||
@@ -620,6 +649,16 @@
|
||||
doi = {10/gnc6m5}
|
||||
}
|
||||
|
||||
@inproceedings{choi2017grabity,
|
||||
title = {Grabity: {{A Wearable Haptic Interface}} for {{Simulating Weight}} and {{Grasping}} in {{Virtual Reality}}},
|
||||
shorttitle = {Grabity},
|
||||
booktitle = {{{ACM Symp}}. {{User Interface Softw}}. {{Technol}}.},
|
||||
author = {Choi, Inrak and Culbertson, Heather and Miller, Mark R. and Olwal, Alex and Follmer, Sean},
|
||||
date = {2017-10-20},
|
||||
pages = {119--130},
|
||||
doi = {10/gfz8mg}
|
||||
}
|
||||
|
||||
@inproceedings{choi2018claw,
|
||||
title = {{{CLAW}}: {{A Multifunctional Handheld Haptic Controller}} for {{Grasping}}, {{Touching}}, and {{Triggering}} in {{Virtual Reality}}},
|
||||
shorttitle = {{{CLAW}}},
|
||||
@@ -640,6 +679,27 @@
|
||||
doi = {10/grx8df}
|
||||
}
|
||||
|
||||
@inproceedings{choi2021perceived,
|
||||
title = {Perceived {{Hardness}} of {{Virtual Surface}}: {{A Function}} of {{Stiffness}}, {{Damping}}, and {{Contact Transient}}},
|
||||
shorttitle = {Perceived {{Hardness}} of {{Virtual Surface}}},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Choi, Hyejin and Bhardwaj, Amit and Yoon, Gyeore and Choi, Seungmoon},
|
||||
date = {2021},
|
||||
pages = {613--618},
|
||||
doi = {10/gsvh3z}
|
||||
}
|
||||
|
||||
@article{costes2019touchy,
|
||||
title = {Touchy : {{A Visual Approach}} for {{Simulating Haptic Effects}} on {{Touchscreens}}},
|
||||
shorttitle = {Touchy},
|
||||
author = {Costes, Antoine and Argelaguet, Ferran and Danieau, Fabien and Guillotel, Philippe and Lécuyer, Anatole},
|
||||
date = {2019},
|
||||
journaltitle = {Front. ICT},
|
||||
volume = {6},
|
||||
pages = {1},
|
||||
doi = {10/gtv39b}
|
||||
}
|
||||
|
||||
@inproceedings{culbertson2012refined,
|
||||
title = {Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data},
|
||||
booktitle = {{{IEEE Haptics Symp}}.},
|
||||
@@ -715,6 +775,16 @@
|
||||
doi = {10/gbth8c}
|
||||
}
|
||||
|
||||
@inproceedings{culbertson2017waves,
|
||||
title = {{{WAVES}}: {{A Wearable Asymmetric Vibration Excitation System}} for {{Presenting Three-Dimensional Translation}} and {{Rotation Cues}}},
|
||||
shorttitle = {{{WAVES}}},
|
||||
booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
author = {Culbertson, Heather and Walker, Julie M. and Raitor, Michael and Okamura, Allison M.},
|
||||
date = {2017},
|
||||
pages = {4972--4982},
|
||||
doi = {10/gfz8qh}
|
||||
}
|
||||
|
||||
@article{culbertson2018haptics,
|
||||
title = {Haptics: {{The Present}} and {{Future}} of {{Artificial Touch Sensation}}},
|
||||
shorttitle = {Haptics},
|
||||
@@ -793,9 +863,18 @@
|
||||
doi = {10/gf39qk}
|
||||
}
|
||||
|
||||
@inproceedings{detinguy2019how,
|
||||
title = {How {{Different Tangible}} and {{Virtual Objects Can Be While Still Feeling}} the {{Same}}?},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {family=Tinguy, given=Xavier, prefix=de, useprefix=true and Pacchierotti, Claudio and Emily, Mathieu and Chevalier, Mathilde and Guignardat, Aurelie and Guillaudeux, Morgan and Six, Chloe and Lecuyer, Anatole and Marchal, Maud},
|
||||
date = {2019},
|
||||
pages = {580--585},
|
||||
doi = {10/gm5m8f}
|
||||
}
|
||||
|
||||
@article{detinguy2021capacitive,
|
||||
title = {Capacitive {{Sensing}} for {{Improving Contact Rendering With Tangible Objects}} in {{VR}}},
|
||||
author = {De Tinguy, Xavier and Pacchierotti, Claudio and Lecuyer, Anatole and Marchal, Maud},
|
||||
author = {family=Tinguy, given=Xavier, prefix=de, useprefix=true and Pacchierotti, Claudio and Lecuyer, Anatole and Marchal, Maud},
|
||||
date = {2021},
|
||||
journaltitle = {IEEE Trans. Vis. Comput. Graph.},
|
||||
volume = {27},
|
||||
@@ -836,7 +915,7 @@
|
||||
|
||||
@article{diluca2011effects,
|
||||
title = {Effects of Visual–Haptic Asynchronies and Loading–Unloading Movements on Compliance Perception},
|
||||
author = {Di Luca, M. and Knörlein, B. and Ernst, M.O. and Harders, M.},
|
||||
author = {Di Luca, Massimiliano and Knörlein, Benjamin and Ernst, Marc O. and Harders, Matthias},
|
||||
date = {2011},
|
||||
journaltitle = {Brain Res. Bull.},
|
||||
volume = {85},
|
||||
@@ -866,6 +945,15 @@
|
||||
doi = {10/gtrgcm}
|
||||
}
|
||||
|
||||
@inproceedings{englmeier2020tangible,
|
||||
title = {A {{Tangible Spherical Proxy}} for {{Object Manipulation}} in {{Augmented Reality}}},
|
||||
booktitle = {{{IEEE Conf}}. {{Virtual Real}}. {{3D User Interfaces}}},
|
||||
author = {Englmeier, David and Dörner, Julia and Butz, Andreas and Höllerer, Tobias},
|
||||
date = {2020},
|
||||
pages = {221--229},
|
||||
doi = {10/gktm62}
|
||||
}
|
||||
|
||||
@article{ernst2002humans,
|
||||
title = {Humans Integrate Visual and Haptic Information in a Statistically Optimal Fashion},
|
||||
author = {Ernst, Marc O. and Banks, Martin S.},
|
||||
@@ -1046,6 +1134,15 @@
|
||||
doi = {10/gkghs9}
|
||||
}
|
||||
|
||||
@article{girard2016haptip,
|
||||
title = {{{HapTip}}: {{Displaying Haptic Shear Forces}} at the {{Fingertips}} for {{Multi-Finger Interaction}} in {{Virtual Environments}}},
|
||||
author = {Girard, Adrien and Marchal, Maud and Gosselin, Florian and Chabrier, Anthony and Louveau, François and Lécuyer, Anatole},
|
||||
date = {2016},
|
||||
journaltitle = {Front. ICT},
|
||||
volume = {3},
|
||||
doi = {10/gnc6nc}
|
||||
}
|
||||
|
||||
@article{gonzalez2014analysis,
|
||||
title = {Analysis of {{Hand Contact Areas}} and {{Interaction Capabilities During Manipulation}} and {{Exploration}}},
|
||||
author = {Gonzalez, Franck and Gosselin, Florian and Bachta, Wael},
|
||||
@@ -1222,6 +1319,15 @@
|
||||
doi = {10/grx8dh}
|
||||
}
|
||||
|
||||
@inproceedings{hertel2021taxonomy,
|
||||
title = {A {{Taxonomy}} of {{Interaction Techniques}} for {{Immersive Augmented Reality}} Based on an {{Iterative Literature Review}}},
|
||||
booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
|
||||
author = {Hertel, Julia and Karaosmanoglu, Sukran and Schmidt, Susanne and Bräker, Julia and Semmann, Martin and Steinicke, Frank},
|
||||
date = {2021},
|
||||
pages = {431--440},
|
||||
doi = {10/gnhc4h}
|
||||
}
|
||||
|
||||
@inproceedings{hettiarachchi2016annexing,
|
||||
title = {Annexing {{Reality}}: {{Enabling Opportunistic Use}} of {{Everyday Objects}} as {{Tangible Proxies}} in {{Augmented Reality}}},
|
||||
shorttitle = {Annexing {{Reality}}},
|
||||
@@ -1242,6 +1348,16 @@
|
||||
doi = {10/ggfv5r}
|
||||
}
|
||||
|
||||
@inproceedings{hincapie-ramos2014consumed,
|
||||
title = {Consumed Endurance: A Metric to Quantify Arm Fatigue of Mid-Air Interactions},
|
||||
shorttitle = {Consumed Endurance},
|
||||
booktitle = {Proc. {{SIGCHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
author = {Hincapié-Ramos, Juan David and Guo, Xiang and Moghadasian, Paymahn and Irani, Pourang},
|
||||
date = {2014-04-26},
|
||||
pages = {1063--1072},
|
||||
doi = {10/gfz8j5}
|
||||
}
|
||||
|
||||
@article{ho2006contribution,
|
||||
title = {Contribution of Thermal Cues to Material Discrimination and Localization},
|
||||
author = {Ho, Hsin-Ni and Jones, Lynette A.},
|
||||
@@ -1308,6 +1424,24 @@
|
||||
doi = {10/d648z2}
|
||||
}
|
||||
|
||||
@inproceedings{issartel2015perceiving,
|
||||
title = {Perceiving Mass in Mixed Reality through Pseudo-Haptic Rendering of {{Newton}}'s Third Law},
|
||||
booktitle = {{{IEEE Virtual Real}}.},
|
||||
author = {Issartel, Paul and Guéniat, Florimond and Coquillart, Sabine and Ammi, Mehdi},
|
||||
date = {2015},
|
||||
pages = {41--46},
|
||||
doi = {10/gnhzx3}
|
||||
}
|
||||
|
||||
@inproceedings{issartel2016tangible,
|
||||
title = {A {{Tangible Volume}} for {{Portable 3D Interaction}}},
|
||||
booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
|
||||
author = {Issartel, Paul and Besançon, Lonni and Isenberg, Tobias and Ammi, Mehdi},
|
||||
date = {2016},
|
||||
pages = {215--220},
|
||||
doi = {10/gshn2s}
|
||||
}
|
||||
|
||||
@article{ito2019tactile,
|
||||
title = {Tactile {{Texture Display}} with {{Vibrotactile}} and {{Electrostatic Friction Stimuli Mixed}} at {{Appropriate Ratio Presents Better Roughness Textures}}},
|
||||
author = {Ito, Ken and Okamoto, Shogo and Yamada, Yoji and Kajimoto, Hiroyuki},
|
||||
@@ -1330,6 +1464,26 @@
|
||||
doi = {10/dzwg7q}
|
||||
}
|
||||
|
||||
@inproceedings{jeon2011extensions,
|
||||
title = {Extensions to Haptic Augmented Reality: {{Modulating}} Friction and Weight},
|
||||
shorttitle = {Extensions to Haptic Augmented Reality},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Jeon, Seokhee and Metzger, Jean-Claude and {Seungmoon Choi} and Harders, M.},
|
||||
date = {2011},
|
||||
pages = {227--232},
|
||||
doi = {10/fpjmcw}
|
||||
}
|
||||
|
||||
@inproceedings{jeon2012extending,
|
||||
title = {Extending Haptic Augmented Reality: {{Modulating}} Stiffness during Two-Point Squeezing},
|
||||
shorttitle = {Extending Haptic Augmented Reality},
|
||||
booktitle = {{{IEEE Haptics Symp}}.},
|
||||
author = {Jeon, Seokhee and Harders, Matthias},
|
||||
date = {2012},
|
||||
pages = {141--146},
|
||||
doi = {10/gm97jp}
|
||||
}
|
||||
|
||||
@article{johansson1984roles,
|
||||
title = {Roles of Glabrous Skin Receptors and Sensorimotor Memory in Automatic Control of Precision Grip When Lifting Rougher or More Slippery Objects},
|
||||
author = {Johansson, R.S. and Westling, G.},
|
||||
@@ -1388,6 +1542,15 @@
|
||||
doi = {10/gnc6vc}
|
||||
}
|
||||
|
||||
@inproceedings{kahl2022influence,
|
||||
title = {The {{Influence}} of {{Environmental Lighting}} on {{Size Variations}} in {{Optical See-through Tangible Augmented Reality}}},
|
||||
booktitle = {2022 {{IEEE Conf}}. {{Virtual Real}}. {{3D User Interfaces VR}}},
|
||||
author = {Kahl, Denise and Ruble, Marc and Kruger, Antonio},
|
||||
date = {2022-03},
|
||||
pages = {121--129},
|
||||
doi = {10/gt7pqr}
|
||||
}
|
||||
|
||||
@online{kahl2023using,
|
||||
title = {Using {{Abstract Tangible Proxy Objects}} for {{Interaction}} in {{Optical See-through Augmented Reality}}},
|
||||
author = {Kahl, Denise and Krüger, Antonio},
|
||||
@@ -1420,6 +1583,12 @@
|
||||
doi = {10/crh7rv}
|
||||
}
|
||||
|
||||
@article{kalus2024simulating,
|
||||
title = {Simulating {{Object Weight}} in {{Virtual Reality}}: {{The Role}} of {{Absolute Mass}} and {{Weight Distributions}}},
|
||||
author = {Kalus, Alexander and Klein, Johannes and Ho, Tien-Julian and Henze, Niels},
|
||||
date = {2024}
|
||||
}
|
||||
|
||||
@inproceedings{kang2020comparative,
|
||||
title = {A {{Comparative Analysis}} of {{3D User Interaction}}: {{How}} to {{Move Virtual Objects}} in {{Mixed Reality}}},
|
||||
shorttitle = {A {{Comparative Analysis}} of {{3D User Interaction}}},
|
||||
@@ -1444,7 +1613,7 @@
|
||||
@article{kawazoe2021tactile,
|
||||
title = {Tactile {{Echoes}}: {{Multisensory Augmented Reality}} for the {{Hand}}},
|
||||
shorttitle = {Tactile {{Echoes}}},
|
||||
author = {Kawazoe, Anzu and Reardon, Gregory and Woo, Erin and Luca, Massimiliano Di and Visell, Yon},
|
||||
author = {Kawazoe, Anzu and Reardon, Gregory and Woo, Erin and Di Luca, Massimiliano and Visell, Yon},
|
||||
date = {2021},
|
||||
journaltitle = {IEEE Trans. Haptics},
|
||||
volume = {14},
|
||||
@@ -1453,6 +1622,16 @@
|
||||
doi = {10/gs4tnn}
|
||||
}
|
||||
|
||||
@inproceedings{kildal20103dpress,
|
||||
title = {{{3D-press}}: Haptic Illusion of Compliance When Pressing on a Rigid Surface},
|
||||
shorttitle = {{{3D-press}}},
|
||||
booktitle = {Int. {{Conf}}. {{Multimodal Interfaces Workshop Mach}}. {{Learn}}. {{Multimodal Interact}}.},
|
||||
author = {Kildal, Johan},
|
||||
date = {2010},
|
||||
pages = {1--8},
|
||||
doi = {10/bzjh8b}
|
||||
}
|
||||
|
||||
@article{kim2018revisiting,
|
||||
title = {Revisiting {{Trends}} in {{Augmented Reality Research}}: {{A Review}} of the 2nd {{Decade}} of {{ISMAR}} (2008-2017)},
|
||||
shorttitle = {Revisiting {{Trends}} in {{Augmented Reality Research}}},
|
||||
@@ -1543,7 +1722,7 @@
|
||||
@inproceedings{knorlein2009influence,
|
||||
title = {Influence of Visual and Haptic Delays on Stiffness Perception in Augmented Reality},
|
||||
booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
|
||||
author = {Knorlein, B. and Di Luca, M. and Harders, M.},
|
||||
author = {Knörlein, Benjamin and Di Luca, Massimiliano and Harders, Matthias},
|
||||
date = {2009},
|
||||
pages = {49--52},
|
||||
doi = {10/bbpd33}
|
||||
@@ -1558,6 +1737,19 @@
|
||||
doi = {10/dbpvrh}
|
||||
}
|
||||
|
||||
@incollection{konyo2008alternative,
|
||||
title = {Alternative {{Display}} of {{Friction Represented}} by {{Tactile Stimulation}} without {{Tangential Force}}},
|
||||
booktitle = {Haptics: {{Perception}}, {{Devices}} and {{Scenarios}}},
|
||||
author = {Konyo, Masashi and Yamada, Hiroshi and Okamoto, Shogo and Tadokoro, Satoshi},
|
||||
editor = {Ferre, Manuel},
|
||||
editora = {Hutchison, David and Kanade, Takeo and Kittler, Josef and Kleinberg, Jon M. and Mattern, Friedemann and Mitchell, John C. and Naor, Moni and Nierstrasz, Oscar and Pandu Rangan, C. and Steffen, Bernhard and Sudan, Madhu and Terzopoulos, Demetri and Tygar, Doug and Vardi, Moshe Y. and Weikum, Gerhard},
|
||||
editoratype = {redactor},
|
||||
date = {2008},
|
||||
volume = {5024},
|
||||
pages = {619--629},
|
||||
doi = {10.1007/978-3-540-69057-3_79}
|
||||
}
|
||||
|
||||
@article{kourtesis2022electrotactile,
|
||||
title = {Electrotactile Feedback Applications for Hand and Arm Interactions},
|
||||
author = {Kourtesis, Panagiotis and Argelaguet, Ferran and Vizcay, Sebastian and Marchal, Maud and Pacchierotti, Claudio},
|
||||
@@ -1801,6 +1993,16 @@
|
||||
doi = {10/dz653x}
|
||||
}
|
||||
|
||||
@inproceedings{maeda2022fingeret,
|
||||
title = {Fingeret: {{A Wearable Fingerpad-Free Haptic Device}} for {{Mixed Reality}}},
|
||||
shorttitle = {Fingeret},
|
||||
booktitle = {{{ACM Symp}}. {{Spat}}. {{User Interact}}.},
|
||||
author = {Maeda, Tomosuke and Yoshida, Shigeo and Murakami, Takaki and Matsuda, Kenroh and Tanikawa, Tomohiro and Sakai, Hiroyuki},
|
||||
date = {2022},
|
||||
pages = {1--10},
|
||||
doi = {10/gtk3c3}
|
||||
}
|
||||
|
||||
@article{maisto2017evaluation,
|
||||
title = {Evaluation of {{Wearable Haptic Systems}} for the {{Fingers}} in {{Augmented Reality Applications}}},
|
||||
author = {Maisto, Maurizio and Pacchierotti, Claudio and Chinello, Francesco and Salvietti, Gionata and De Luca, Alessandro and Prattichizzo, Domenico},
|
||||
@@ -1834,6 +2036,12 @@
|
||||
doi = {10/f5zzhq}
|
||||
}
|
||||
|
||||
@article{mangalam2024enhancing,
|
||||
title = {Enhancing {{Hand-Object Interactions}} in {{Virtual Reality}} for {{Precision Manual Tasks}}},
|
||||
author = {Mangalam, Madhur and Oruganti, Sanjay and Buckingham, Gavin and Borst, Christoph W},
|
||||
date = {2024}
|
||||
}
|
||||
|
||||
@incollection{marchal2022virtual,
|
||||
title = {Virtual {{Reality}} and {{Haptics}}},
|
||||
booktitle = {Robotics {{Goes MOOC}}},
|
||||
@@ -1942,6 +2150,15 @@
|
||||
doi = {10/grmtv8}
|
||||
}
|
||||
|
||||
@inproceedings{minamizawa2008interactive,
|
||||
title = {Interactive Representation of Virtual Object in Hand-Held Box by Finger-Worn Haptic Display},
|
||||
booktitle = {Symp. {{Haptic Interfaces Virtual Environ}}. {{Teleoperator Syst}}.},
|
||||
author = {Minamizawa, Kouta and Fukamachi, Souichiro and Kawakami, Naoki and Tachi, Susumu},
|
||||
date = {2008},
|
||||
pages = {367--368},
|
||||
doi = {10/bdjm7x}
|
||||
}
|
||||
|
||||
@inproceedings{mullenbach2014exploring,
|
||||
title = {Exploring Affective Communication through Variable-Friction Surface Haptics},
|
||||
booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
@@ -2125,7 +2342,7 @@
|
||||
|
||||
@article{pacchierotti2024haptics,
|
||||
title = {Haptics in the {{Metaverse}}: {{Haptic}} Feedback for {{Virtual}}, {{Augmented}}, {{Mixed}}, and {{eXtended Realities}}},
|
||||
author = {Pacchierotti, Claudio and Chinello, Francesco and Koumaditis, Konstantinos and Luca, Massimiliano Di and Ofek, Eyal and Georgiou, Orestis},
|
||||
author = {Pacchierotti, Claudio and Chinello, Francesco and Koumaditis, Konstantinos and Di Luca, Massimiliano and Ofek, Eyal and Georgiou, Orestis},
|
||||
date = {2024},
|
||||
journaltitle = {IEEE Trans. Haptics}
|
||||
}
|
||||
@@ -2148,6 +2365,33 @@
|
||||
doi = {10/fshqps}
|
||||
}
|
||||
|
||||
@inproceedings{park2017compensation,
|
||||
title = {Compensation of Perceived Hardness of a Virtual Object with Cutaneous Feedback},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Park, Jaeyoung and Kim, Jaeha and Oh, Yonghwan and Tan, Hong Z.},
|
||||
date = {2017},
|
||||
pages = {101--106},
|
||||
doi = {10/grx8c7}
|
||||
}
|
||||
|
||||
@inproceedings{park2019realistic,
|
||||
title = {Realistic {{Haptic Rendering}} of {{Collision Effects Using Multimodal Vibrotactile}} and {{Impact Feedback}}},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Park, Chaeyong and Park, Jaeyoung and Oh, Seungjae and Choi, Seungmoon},
|
||||
date = {2019},
|
||||
pages = {449--454},
|
||||
doi = {10/gr3xc8}
|
||||
}
|
||||
|
||||
@inproceedings{park2023perceptual,
|
||||
title = {Perceptual {{Simultaneity Between Vibrotactile}} and {{Impact Stimuli}}},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Park, Chaeyong and Choi, Seungmoon},
|
||||
date = {2023},
|
||||
pages = {148--155},
|
||||
doi = {10/gsvpqg}
|
||||
}
|
||||
|
||||
@inproceedings{peillard2019studying,
|
||||
title = {Studying {{Exocentric Distance Perception}} in {{Optical See-Through Augmented Reality}}},
|
||||
booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
|
||||
@@ -2284,6 +2528,17 @@
|
||||
doi = {10/b33wgg}
|
||||
}
|
||||
|
||||
@article{provancher2009fingerpad,
|
||||
title = {Fingerpad {{Skin Stretch Increases}} the {{Perception}} of {{Virtual Friction}}},
|
||||
author = {Provancher, W.R. and Sylvester, N.D.},
|
||||
date = {2009},
|
||||
journaltitle = {IEEE Trans. Haptics},
|
||||
volume = {2},
|
||||
number = {4},
|
||||
pages = {212--223},
|
||||
doi = {10/b55xd3}
|
||||
}
|
||||
|
||||
@article{punpongsanon2015softar,
|
||||
title = {{{SoftAR}}: {{Visually Manipulating Haptic Softness Perception}} in {{Spatial Augmented Reality}}},
|
||||
shorttitle = {{{SoftAR}}},
|
||||
@@ -2387,9 +2642,19 @@
|
||||
doi = {10/gnc6mf}
|
||||
}
|
||||
|
||||
@inproceedings{samad2019pseudohaptic,
|
||||
title = {Pseudo-Haptic Weight: {{Changing}} the Perceived Weight of Virtual Objects by Manipulating Control-Display Ratio},
|
||||
shorttitle = {Pseudo-Haptic Weight},
|
||||
booktitle = {Proc. 2019 {{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
author = {Samad, Majed and Gatti, Elia and Hermes, Anne and Benko, Hrvoje and Parise, Cesare},
|
||||
date = {2019},
|
||||
pages = {1--13},
|
||||
doi = {10/gf2hjv}
|
||||
}
|
||||
|
||||
@article{sarac2022perceived,
|
||||
title = {Perceived {{Intensities}} of {{Normal}} and {{Shear Skin Stimuli Using}} a {{Wearable Haptic Bracelet}}},
|
||||
author = {Sarac, Mine and Huh, Tae Myung and Choi, Hojung and Cutkosky, Mark R. and Luca, Massimiliano Di and Okamura, Allison M.},
|
||||
author = {Sarac, Mine and Huh, Tae Myung and Choi, Hojung and Cutkosky, Mark R. and Di Luca, Massimiliano and Okamura, Allison M.},
|
||||
date = {2022},
|
||||
journaltitle = {IEEE Robot. Autom. Lett.},
|
||||
volume = {7},
|
||||
@@ -2507,6 +2772,18 @@
|
||||
doi = {10/gt9fwt}
|
||||
}
|
||||
|
||||
@article{smith2010roughness,
|
||||
title = {Roughness of Simulated Surfaces Examined with a Haptic Tool: Effects of Spatial Period, Friction, and Resistance Amplitude},
|
||||
shorttitle = {Roughness of Simulated Surfaces Examined with a Haptic Tool},
|
||||
author = {Smith, Allan M. and Basile, Georges and Theriault-Groom, Jonathan and Fortier-Poisson, Pascal and Campion, Gianni and Hayward, Vincent},
|
||||
date = {2010},
|
||||
journaltitle = {Exp. Brain Res.},
|
||||
volume = {202},
|
||||
number = {1},
|
||||
pages = {33--43},
|
||||
doi = {10/fj8xvn}
|
||||
}
|
||||
|
||||
@inproceedings{speicher2019what,
|
||||
title = {What Is {{Mixed Reality}}?},
|
||||
booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
@@ -2576,6 +2853,15 @@
|
||||
doi = {10/gqd263}
|
||||
}
|
||||
|
||||
@inproceedings{taima2014controlling,
|
||||
title = {Controlling Fatigue While Lifting Objects Using {{Pseudo-haptics}} in a Mixed Reality Space},
|
||||
booktitle = {{{IEEE Haptics Symp}}.},
|
||||
author = {Taima, Yuki and Ban, Yuki and Narumi, Takuji and Tanikawa, Tomohiro and Hirose, Michitaka},
|
||||
date = {2014},
|
||||
pages = {175--180},
|
||||
doi = {10/gvr27n}
|
||||
}
|
||||
|
||||
@article{tanaka2014contact,
|
||||
title = {Contact {{Force}} and {{Scanning Velocity}} during {{Active Roughness Perception}}},
|
||||
author = {Tanaka, Yoshihiro and Bergmann Tiest, Wouter M. and Kappers, Astrid M. L. and Sano, Akihito},
|
||||
@@ -2588,6 +2874,15 @@
|
||||
doi = {10/f53jcg}
|
||||
}
|
||||
|
||||
@inproceedings{tao2021altering,
|
||||
title = {Altering {{Perceived Softness}} of {{Real Rigid Objects}} by {{Restricting Fingerpad Deformation}}},
|
||||
booktitle = {{{ACM Symp}}. {{User Interface Softw}}. {{Technol}}.},
|
||||
author = {Tao, Yujie and Teng, Shan-Yuan and Lopes, Pedro},
|
||||
date = {2021},
|
||||
pages = {985--996},
|
||||
doi = {10/gm7wzq}
|
||||
}
|
||||
|
||||
@inproceedings{teng2021touch,
|
||||
title = {Touch\&{{Fold}}: {{A Foldable Haptic Actuator}} for {{Rendering Touch}} in {{Mixed Reality}}},
|
||||
shorttitle = {Touch\&{{Fold}}},
|
||||
@@ -2597,15 +2892,6 @@
|
||||
doi = {10/gkskqb}
|
||||
}
|
||||
|
||||
@inproceedings{tinguy2019how,
|
||||
title = {How {{Different Tangible}} and {{Virtual Objects Can Be While Still Feeling}} the {{Same}}?},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Tinguy, Xavier De and Pacchierotti, Claudio and Emily, Mathieu and Chevalier, Mathilde and Guignardat, Aurelie and Guillaudeux, Morgan and Six, Chloe and Lecuyer, Anatole and Marchal, Maud},
|
||||
date = {2019},
|
||||
pages = {580--585},
|
||||
doi = {10/gm5m8f}
|
||||
}
|
||||
|
||||
@inproceedings{tran2024survey,
|
||||
title = {A {{Survey On Measuring Presence}} in {{Mixed Reality}}},
|
||||
booktitle = {{{CHI Conf}}. {{Hum}}. {{Factors Comput}}. {{Syst}}.},
|
||||
@@ -2636,6 +2922,15 @@
|
||||
doi = {10/grx8cz}
|
||||
}
|
||||
|
||||
@inproceedings{ujitoko2019presenting,
|
||||
title = {Presenting {{Static Friction Sensation}} at {{Stick-slip Transition}} Using {{Pseudo-haptic Effect}}},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Ujitoko, Yusuke and Ban, Yuki and Hirota, Koichi},
|
||||
date = {2019},
|
||||
pages = {181--186},
|
||||
doi = {10/gvr27t}
|
||||
}
|
||||
|
||||
@article{ujitoko2020development,
|
||||
title = {Development of {{Finger-Mounted High-Density Pin-Array Haptic Display}}},
|
||||
author = {Ujitoko, Yusuke and Taniguchi, Takaaki and Sakurai, Sho and Hirota, Koichi},
|
||||
@@ -2806,6 +3101,15 @@
|
||||
doi = {10/gt623q}
|
||||
}
|
||||
|
||||
@inproceedings{yim2021multicontact,
|
||||
title = {Multi-{{Contact Stiffness}} and {{Friction Augmentation Using Contact Centroid-Based Normal-Tangential Force Decomposing}}},
|
||||
booktitle = {{{IEEE World Haptics Conf}}.},
|
||||
author = {Yim, Sunghoon and Choi, Seungmoon and Jeon, Seokhee},
|
||||
date = {2021},
|
||||
pages = {385--390},
|
||||
doi = {10/gvm7sf}
|
||||
}
|
||||
|
||||
@inproceedings{yoon2020evaluating,
|
||||
title = {Evaluating {{Remote Virtual Hands Models}} on {{Social Presence}} in {{Hand-based 3D Remote Collaboration}}},
|
||||
booktitle = {{{IEEE Int}}. {{Symp}}. {{Mix}}. {{Augment}}. {{Real}}.},
|
||||
|
||||