Path Routing Optimization for STM Ultrasound Rendering

Héctor Barreiro, Stephen Sinclair and Miguel A. Otaduy
IEEE Transactions on Haptics, 2020



Abstract

Ultrasound transducer arrays are capable of producing tactile sensations on the hand, promising hands-free haptic interaction for virtual environments. However, controlling such an array to reproduce a desired perceived interaction remains a challenging problem. In this work, we approach this problem as a dynamic mapping of virtual interactions to existing control metaphors of ultrasound devices, namely, the modulation of focal point positions and intensities over time, a method known as Spatiotemporal Modulation (STM). In particular, we propose an optimization approach that takes into account known perceptual parameters and limitations of the STM method. This results in a set of focal point paths optimized to best reconstruct an arbitrary target pressure field.


Citation

@Article{BSO20,
  author       = "Barreiro, Héctor and Sinclair, Stephen and Otaduy, Miguel A.",
  title        = "Path Routing Optimization for STM Ultrasound Rendering",
  journal      = "IEEE Transactions on Haptics",
  year         = "2020",
  url          = "http://mslab.es/projects/PROSTM/"
}

Description

We study the problem of rendering interactions with virtual environments using ultrasound haptics. In particular, we approach this problem as a dynamic mapping of virtual interactions to the control metaphors of ultrasound devices. Most previous works have simplified this problem by displaying contact locations at maximum intensity, either through AM or STM. In contrast, we believe that a richer display can be obtained by considering the force distributions in virtual interactions, and not just contact locations.

In this work, we introduce PRO-STM, the first method that commands ultrasound STM to render arbitrary force distributions resulting from a dynamic virtual interaction. Given a target pressure field, we pose the rendering as the computation of a set of focal point loops that produce the best-matching quasi-static pressure field.
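The formulation above can be illustrated with a minimal sketch. Assuming a Gaussian approximation of the focal point's pressure footprint on the skin, and taking the quasi-static field as the time-average of that footprint along the loop, the rendering problem reduces to finding the path whose averaged field best matches the target. The function names, the Gaussian model, and the parameter values here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def quasi_static_field(grid_xy, path_pts, sigma=0.004, amplitude=1.0):
    """Approximate quasi-static pressure field produced by a focal point
    traversing a closed path: the time-average of an (assumed) Gaussian
    pressure footprint evaluated at each path sample.
    grid_xy:  (N, 2) skin-surface sample positions, in meters
    path_pts: (M, 2) focal point positions along the loop
    sigma:    assumed focal spot radius (~4 mm placeholder)"""
    d2 = ((grid_xy[:, None, :] - path_pts[None, :, :]) ** 2).sum(-1)
    footprints = amplitude * np.exp(-d2 / (2.0 * sigma ** 2))
    return footprints.mean(axis=1)  # average over one loop period

def reconstruction_error(target, grid_xy, path_pts, **kw):
    """L2 mismatch between a target field and the field rendered by a path."""
    return np.linalg.norm(quasi_static_field(grid_xy, path_pts, **kw) - target)
```

Under this model, "best-matching" simply means the loop minimizing `reconstruction_error` over admissible paths.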

Our method has been carefully designed to best balance the capabilities of the ultrasound device with the quality and richness of tactile stimuli. We pay special attention to the speed and frequency at which focal points traverse the skin to maximize the perceived intensity and ensure a continuous percept. This treatment allows us to assume that the rendered radiation pressure produces a persistent tactile perception. This is key to our method, as it allows us to consider the stimulus to be spatially varying but temporally invariant (i.e. a quasi-static pressure field) over a time window, enabling a greatly simplified algorithm design.
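The coupling between path length, repetition rate, and focal point speed can be sketched as a simple feasibility check. The numeric bounds below (speed band, minimum loop rate) are placeholder assumptions standing in for the perceptual parameters the paper calibrates; only the relationship speed = path length × loop frequency is structural:

```python
def stm_feasible(path_length_m, loop_hz,
                 speed_band=(5.0, 8.0),  # assumed m/s band for peak perceived intensity
                 min_loop_hz=25.0):      # assumed minimum repetition rate for a continuous percept
    """Check whether a closed path of the given length, repeated at loop_hz,
    keeps the focal point draw speed inside the assumed perceptually optimal
    band while repeating fast enough to sustain a continuous percept."""
    speed = path_length_m * loop_hz  # focal point speed along the loop
    return speed_band[0] <= speed <= speed_band[1] and loop_hz >= min_loop_hz
```

For example, a 6 cm loop repeated at 100 Hz yields a 6 m/s draw speed, inside the assumed band, whereas the same loop at 10 Hz is both too slow and too infrequent.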

Our algorithm works at two scales. First, on a coarse scale, it initializes paths over the target domain to optimize coverage weighted by pressure intensity. Then, on a finer scale, it refines the paths to maximize the similarity to the target pressure. Both steps search for paths that maximize coverage and integrated pressure intensity subject to the aforementioned considerations. The resulting paths produce the best matching quasi-static pressure field.
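The two-scale structure can be sketched as follows: a coarse initialization that samples sites in proportion to target pressure and orders them into a loop, followed by a fine refinement that locally perturbs path points to reduce the mismatch to the target field. This is a deliberately simplified stand-in (greedy tour, random local moves, Gaussian footprint model) for the paper's optimization, with all names and parameters hypothetical:

```python
import numpy as np

def init_path(grid_xy, target, n_pts, rng):
    """Coarse scale: sample sites with probability proportional to target
    pressure (coverage weighted by intensity), then order them into a loop
    with a greedy nearest-neighbor tour."""
    p = target / target.sum()
    idx = rng.choice(len(grid_xy), size=n_pts, replace=False, p=p)
    pts = grid_xy[idx]
    tour, rest = [0], list(range(1, n_pts))
    while rest:
        last = pts[tour[-1]]
        nxt = min(rest, key=lambda j: np.sum((pts[j] - last) ** 2))
        tour.append(nxt)
        rest.remove(nxt)
    return pts[tour]

def refine_path(path, grid_xy, target, sigma=0.004, iters=200, step=0.002, rng=None):
    """Fine scale: random local moves of individual path points, accepted
    only when they reduce the L2 mismatch to the target field."""
    rng = np.random.default_rng(0) if rng is None else rng

    def field(p):
        d2 = ((grid_xy[:, None, :] - p[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)

    err = np.linalg.norm(field(path) - target)
    for _ in range(iters):
        i = rng.integers(len(path))
        cand = path.copy()
        cand[i] += rng.normal(scale=step, size=2)
        e = np.linalg.norm(field(cand) - target)
        if e < err:
            path, err = cand, e
    return path, err
```

Since the refinement only accepts improving moves, the mismatch is non-increasing; the actual method replaces these heuristics with the coverage- and constraint-aware optimization described above.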


Results

We have applied PRO-STM to the interaction with gaseous fluid media. In such interactions, haptic perception is dictated by a spatially and temporally varying pressure field on the skin, which serves as the target for our algorithm. We have compared the reconstruction quality of PRO-STM against our previous AM rendering method, observing that PRO-STM provides broader and smoother coverage than the AM-based method.

In particular, we found that AM produces ambiguous results when rendering interaction with one wide plume or with multiple thin plumes, whereas STM does not suffer from this ambiguity.

We conducted a user study that confirms this observation, finding a significant difference between AM and STM, with a higher proportion of correct responses under STM.


Contact

Héctor Barreiro – hecbarcab@gmail.com
Stephen Sinclair – radarsat1@gmail.com
Miguel A. Otaduy – miguel.otaduy@urjc.es