Haptic Technologies for Direct Touch in Virtual Reality


 

Introduction


Virtual reality (VR) is experiencing a renaissance thanks to technological progress in computer graphics and commercial breakthroughs in head-mounted display and tracking technologies. Fully immersive VR requires virtual touch of comparably high quality that allows bimanual interaction with the environment. However, current VR systems lack the ability to convey realistic haptic (kinesthetic and cutaneous) sensations, because traditional haptic technologies have focused on grounded, kinesthetic haptic interfaces that render virtual environments by outputting a force through a robotic end effector. Such interfaces provide compelling simulations of tool-based interaction, but do not allow users to touch virtual content directly with their hands.

In line with the renaissance of VR, there has been an explosion of novel haptic technologies as well. In recent years, we have witnessed the advent of haptic technologies, both hardware and software, that enable compelling virtual touch directly with our hands. These novel technologies employ diverse actuation principles, such as wearable robotic end effectors, active surfaces, or ultrasound haptic interfaces for mid-air feedback, as shown in Figure 1.

 

Figure 1. Novel haptic technologies for direct touch, employing diverse actuation principles.

Novel haptic technologies open the door to many possibilities for computer graphics researchers and developers. While simulation and interaction methods for traditional kinesthetic haptic interfaces are well established, novel haptic interfaces exhibit many more degrees of freedom and pose multiple challenges for model and algorithm design. Similarly, they enable the design and development of previously unseen immersive VR applications.

 

Course Objectives


This course intends to disseminate the recent advances in haptic technologies among the computer graphics community. It will provide initial training on these technologies, so that computer graphics researchers and developers are ready to embark on the development of novel computational methods as well as immersive VR and AR applications with direct touch.

The course will cover a broad range of topics relevant for research and development of direct touch solutions:

  • Fundamentals of tactile perception and of the control theory relevant for application development, as well as tactile design considerations.
  • Actuation technologies employed in direct-touch haptic interfaces, to understand the dimensionality, range, bandwidth, and resolution of sensations that can be produced. The different variants and their control mechanisms will also be discussed.
  • Software methods for the connection of haptic actuators to VR simulations.
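As a rough illustration of the last topic, the sketch below shows the classic penalty-based pattern used in haptic rendering loops: a high-rate loop computes a restoring force proportional to the penetration of the tracked finger into a virtual surface, which is then sent to the actuator. All names and parameter values here are illustrative assumptions, not methods taught in the course.

```python
# Minimal sketch of a penalty-based haptic rendering step.
# Real systems run this loop at ~1 kHz, decoupled from the
# slower (~60-90 Hz) graphics loop; stiffness is an assumed value.

STIFFNESS = 800.0  # N/m, virtual wall stiffness (illustrative)

def wall_force(finger_pos, wall_height=0.0):
    """Penalty force for a horizontal plane at wall_height.
    Positive penetration yields a restoring force pushing out."""
    penetration = wall_height - finger_pos
    if penetration <= 0.0:
        return 0.0  # no contact, no force
    return STIFFNESS * penetration

# One simulated step: finger 2 mm below the virtual surface.
force = wall_force(finger_pos=-0.002)  # ~1.6 N pushing the finger out
```

In a full system, this force would be dispatched to the device driver each cycle; the same loop structure applies whether the actuator is a grounded end effector or a wearable cutaneous device.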

The course will also cover three alternative haptic technologies, discussing particular aspects of each of them:

  • Wearable cutaneous devices provide tactile feedback by stimulating skin directly with miniature electromechanical actuators, and eliminate typical workspace restrictions of haptic feedback. Several successful devices operate on the finger pad by translating and orienting a small mobile platform [Minamizawa et al. 2007; Prattichizzo et al. 2013], while others stretch skin tangentially to simulate frictional forces [Nishimura et al. 2014]. Tactile rendering methods control cutaneous devices to match contact configurations simulated in VR scenarios [Perez et al. 2015].
  • Active surfaces enable direct exploration and palpation of dynamically varying shapes. Two successful approaches operate by controlling local shape through particle jamming with pneumatic actuators [Stanley and Okamura 2015], or modulating height fields using mechanically actuated pin arrays [Leithinger et al. 2015].
  • Ultrasound haptic interfaces enable both direct-touch and mid-air interaction, without the need to hold or wear any device.
    Different devices stimulate the skin using either air jets [Sodhi et al. 2013], vibrotactile feedback through localized ultrasound modulation combined with hand tracking [Long et al. 2014], or full spatial modulation of the ultrasound field [Inoue et al. 2015].
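As a rough illustration of the principle behind such phased-array ultrasound devices, the sketch below computes per-transducer phase offsets so that the emitted waves arrive in phase at a chosen focal point in mid-air. The array geometry, frequency, and all function names are illustrative assumptions, not details of the cited devices.

```python
# Sketch: focusing an ultrasound transducer array at a mid-air point
# by assigning each transducer a phase offset that cancels its
# propagation delay to the focus. Parameters are illustrative.

import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, typical for airborne ultrasound haptics
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(transducers, focal_point):
    """Phase offset (radians, in [0, 2*pi)) per transducer so that
    all waves arrive in phase at focal_point: phase_i = -k * d_i."""
    k = 2.0 * math.pi / WAVELENGTH  # wavenumber
    phases = []
    for t in transducers:
        d = math.dist(t, focal_point)          # distance to the focus
        phases.append((-k * d) % (2.0 * math.pi))
    return phases

# Example: a small 3-element line array focusing 10 cm above its center.
array = [(-0.01, 0.0, 0.0), (0.0, 0.0, 0.0), (0.01, 0.0, 0.0)]
phases = focus_phases(array, (0.0, 0.0, 0.10))
```

By symmetry, the two outer transducers receive identical phases; modulating the focal point or the field amplitude over time is what produces a perceivable tactile sensation on the skin.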

 

Intended Audience


The course is intended for a general audience in computer graphics with an interest in research and development of VR applications. There are no prerequisites for course attendees. The course will start with fundamentals and then progress to technical content for each actuation technology, while paying special attention to the big picture.

 

Speakers


The course gathers three lecturers who are currently at the forefront of research on haptic technologies for direct touch.

In addition, their backgrounds span different areas (computer graphics, robotics, and HCI), contributing to the comprehensive coverage of the course.

  • Miguel A. Otaduy is Professor of Computer Science at Universidad Rey Juan Carlos, Madrid. He received his MS and PhD from UNC – Chapel Hill, and was a senior research associate at ETH Zurich. His research spans topics from physics-based simulation to computational haptics, and he is currently involved in two major European projects: Wearhap and the ERC Starting Grant Animetrics. He has served as program chair or editor-in-chief for the IEEE World Haptics Conference (2017 and 2019), the Symposium on Interactive 3D Graphics and Games (2014), and the Symposium on Computer Animation (2010). He is also co-chair of the Technical Committee on Haptics.
  • Allison Okamura is a Professor of Mechanical Engineering and (by courtesy) Computer Science at Stanford University. She received her BS from UC Berkeley and MS/PhD from Stanford University. She has been editor-in-chief of the IEEE International Conference on Robotics and Automation, associate editor of the IEEE Transactions on Haptics, and co-chair of the IEEE Haptics Symposium. Her awards include the IEEE Technical Committee on Haptics Early Career Award, the IEEE Robotics and Automation Society Early Career Award, and the NSF CAREER Award. She is an IEEE Fellow. Her research interests include haptics, teleoperation, virtual reality, medical robotics, neuromechanics, and education.
  • Sriram Subramanian is Professor of Engineering and Informatics at the University of Sussex. Before joining Sussex, he was a Professor at the University of Bristol and a senior scientist at Philips Research Netherlands. He is specifically interested in rich and expressive input combining multi-touch, haptics, and touchless gestures. He holds an ERC Starting Grant and has received funding from the EU FET-Open call. In 2014 he was one of 30 young scientists invited by the WEF to attend their Summer Davos. He co-founded Ultrahaptics, a spin-out company that aims to commercialise mid-air haptics using phased arrays of ultrasound transducers.

 

Agenda, Schedule and Location


Sunday, 24 July, 10:45 am – 12:15 pm, Anaheim Convention Center, Ballroom C

  • 30 min. Wearable cutaneous devices (Otaduy).
  • 30 min. Active surfaces (Okamura).
  • 30 min. Ultrasound haptic interfaces (Subramanian).

References

