Computer Haptics – Virtual Touch

Haptic rendering is the computational technology that allows us to interact with virtual worlds through the sense of touch. It relies on an algorithm that simulates a virtual world in a physically based manner and computes interaction forces, and on a robotic device that transmits those forces to the user. Haptics science is a multidisciplinary field that brings together psychophysics research for the understanding of tactile cues and human perception, mechanical engineering for the design of robotic devices, control theory for the analysis of the coupling between the real and virtual worlds, and computer science, in particular computer graphics, for the simulation of the virtual world and the design of the haptic rendering algorithm.

Our group carries out research in multiple areas of haptic rendering, from the algorithmic foundations for efficient computation of interaction forces with virtual environments, to the application of haptics in training, navigation, or visualization settings.



Wearable haptics

Project WEARHAP aims at laying the scientific and technological foundations for wearable haptics, a novel concept for the systematic exploration of haptics in advanced cognitive systems and robotics that will redefine the way humans cooperate with robots. This paradigm shift will enable novel forms of human intention recognition through haptic signals and novel forms of communication and cooperation between humans and robots. The research challenges are ambitious and cross traditional boundaries between robotics, cognitive science and neuroscience. Research findings from distributed robotics, biomechanical modeling, multisensory tracking, underactuation in control, and cognitive systems will be integrated to address the scientific and technological challenges involved in creating effective wearable haptic interaction.


Haptics for the hand

The hand concentrates a large number of the mechanoreceptors in the human body, and serves as the major means of bidirectional interaction with the world. Moreover, operations such as object manipulation and palpation rely on the fine perception of contact forces, both in time and space. Haptic simulation of grasping, with the rendering of contact forces resulting from the manipulation of virtual objects, requires realistic yet interactive models of hand mechanics. We have designed algorithms that allow interactive simulation of the skeletal and elastic properties of a human hand, enabling haptic grasping of virtual objects with soft finger contact. Our algorithms contain novel aspects such as a simple technique to couple skeletal and elastic elements, an efficient dynamics solver in the presence of joints and contact constraints, and an algorithm that connects the simulation to a haptic device (Garre et al. 2011). This work also relies on an efficient solution to articulated dynamics with stiff joints (Hernandez et al. 2011), and on earlier work on haptic rendering of soft objects (Garre et al. 2010).
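To illustrate the general idea of coupling skeletal and elastic elements, the toy sketch below attaches elastic surface nodes to target positions driven by the bones through stiff spring-dampers, and integrates the nodes with symplectic Euler. It is a minimal illustration of the coupling concept only, not the solver of Garre et al. (2011); all constants and function names are illustrative assumptions.

```python
import numpy as np

# Toy coupling of skeletal targets to elastic surface nodes via stiff
# spring-dampers (illustrative sketch, not the Garre et al. 2011 solver).
K_COUPLE = 500.0   # coupling stiffness (hypothetical value)
DAMPING = 5.0      # coupling damping (hypothetical value)
DT = 1e-3          # 1 kHz simulation step

def coupling_forces(bone_targets, elastic_pos, elastic_vel):
    """Spring-damper force pulling each elastic node toward its bone-driven target."""
    return K_COUPLE * (bone_targets - elastic_pos) - DAMPING * elastic_vel

def step(bone_targets, elastic_pos, elastic_vel, mass=0.01):
    """Symplectic Euler update of the elastic nodes under the coupling forces."""
    f = coupling_forces(bone_targets, elastic_pos, elastic_vel)
    elastic_vel = elastic_vel + DT * f / mass
    elastic_pos = elastic_pos + DT * elastic_vel
    return elastic_pos, elastic_vel
```

Run for enough steps and the elastic nodes settle onto the skeletal targets; in a full simulator the same coupling force would also be applied, with opposite sign, back onto the skeleton.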


Multirate haptic rendering

Six-degree-of-freedom (6-DoF) haptic rendering is the problem of computing the haptic interaction (force and torque) between a rigid tool manipulated by the user and other rigid objects in the virtual environment. In order to display stiff contact in a stable manner, feedback forces have to be updated at very high rates (e.g., 1 kHz).  The overall quality of a haptic rendering algorithm can be measured in terms of its stability and transparency, which can be enhanced using multirate algorithms that compute full contact handling at a moderately slow rate and evaluate feedback forces using a simplified model at a fast rate (Otaduy and Lin 2006).
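The split between a slow full contact solve and a fast simplified force evaluation can be sketched as below. This is a schematic of the multirate structure in the spirit of Otaduy and Lin (2006), not their algorithm: the single-plane penalty contact, the rates, and all names are illustrative assumptions.

```python
import numpy as np

FAST_DT = 1e-3    # 1 kHz haptic loop (illustrative)
SLOW_EVERY = 30   # full contact handling at roughly 33 Hz (illustrative)

def full_contact_handling(tool_pos):
    """Expensive slow-rate step. Here it trivially returns a local contact
    model (normal and stiffness) for a ground plane y = 0."""
    return {"normal": np.array([0.0, 1.0, 0.0]), "k": 1000.0}

def fast_force(tool_pos, model):
    """Cheap fast-rate evaluation of feedback force from the cached model."""
    depth = -tool_pos.dot(model["normal"])   # penetration below the plane
    if depth <= 0.0:
        return np.zeros(3)
    return model["k"] * depth * model["normal"]

def haptic_loop(trajectory):
    """Interleave the two rates over a sequence of tool positions."""
    forces, model = [], None
    for i, tool_pos in enumerate(trajectory):
        if i % SLOW_EVERY == 0:                     # slow rate: rebuild model
            model = full_contact_handling(tool_pos)
        forces.append(fast_force(tool_pos, model))  # fast rate: evaluate force
    return forces
```

In a real system the two rates run in separate threads, with the fast loop reading the most recent contact model published by the slow loop.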

The addition of deformable objects to the virtual world increases the complexity of haptic rendering. Multirate methods can be extended to the case where a rigid tool manipulated by the user is in contact with a deformable environment (Otaduy and Gross 2007), and also to the case where the tool itself is deformable. The solution employs robust models for contact handling on a slow rate, and the computation of a linearized contact model that is evaluated on a fast rate (Garre and Otaduy 2009a) (Garre and Otaduy 2009b) (Garre and Otaduy 2010).
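The linearized contact model amounts to a first-order expansion F(x) ≈ F0 + J (x − x0): the slow loop computes the force F0 and its Jacobian J at the last full contact solve, and the fast loop evaluates the linear model. The sketch below illustrates this structure only, with a finite-difference Jacobian as an assumption; it is not the formulation of Garre and Otaduy (2009a).

```python
import numpy as np

def linearize(force_fn, x0, eps=1e-6):
    """Slow rate: force F0 and Jacobian J of force_fn at x0 (finite differences,
    an illustrative stand-in for an analytic contact-force Jacobian)."""
    f0 = force_fn(x0)
    J = np.zeros((3, 3))
    for j in range(3):
        dx = np.zeros(3)
        dx[j] = eps
        J[:, j] = (force_fn(x0 + dx) - f0) / eps
    return f0, J

def fast_eval(f0, J, x0, x):
    """Fast rate: evaluate the linear model F(x) ~ F0 + J (x - x0)."""
    return f0 + J @ (x - x0)
```

The linear model stays accurate only near x0, which is why the slow loop must periodically relinearize as the tool moves.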


Visuo-haptic augmented reality

Visuo-haptic mixed reality adds to a real scene the ability to see and touch virtual objects. It requires see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user’s real hand in the mixed reality scene. We have developed a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user’s hand and the virtual tool (Cosco et al. 2009). Our method alleviates the visual obstruction and misalignment issues introduced by commodity haptic devices by relying on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user’s hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact of obstruction and misalignment correction on user performance (Cosco et al. 2013).
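The compositing step can be illustrated schematically: blend the rendered virtual frame over the camera image, then paint hand pixels (found by a color-based mask) back on top so the virtual tool never occludes the user's hand. The crude red-dominance skin test and all thresholds below are illustrative assumptions, not the segmentation or compositing pipeline of Cosco et al. (2013).

```python
import numpy as np

def skin_mask(rgb, r_min=120, rg_gap=20):
    """Very crude color-based hand detector: red-dominant pixels.
    Thresholds are hypothetical; a real system would be calibrated."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > r_min) & (r - g > rg_gap) & (r - b > rg_gap)

def composite(camera, virtual, virtual_alpha):
    """Blend virtual objects over the real scene, then restore hand
    pixels last so the user's hand always stays visible."""
    mask = skin_mask(camera)[..., None]
    mixed = virtual_alpha * virtual + (1.0 - virtual_alpha) * camera
    return np.where(mask, camera, mixed).astype(camera.dtype)
```

In the actual system the hand mask is combined with tracking-based segmentation of the device and image-based background repainting; this sketch covers only the final hand-over-virtual ordering.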

