Hybrid Team – Inria / IRISA, Rennes, France

Master Internship – Spatial Perception of Mid-air Ultrasound Haptic Rendering

  • M.Sc.
  • Rennes, France
  • Applications have closed

 

Spatial perception of mid-air ultrasound haptic rendering

 

Focused ultrasound arrays [1][2] have recently emerged as a technology for providing tactile stimuli at a distance in human-computer interaction (HCI) applications. They work by focusing the acoustic beams of several ultrasound speakers in order to create localized regions of oscillating high air pressure, which elicit vibrotactile sensations when they meet the skin (see Figure 1). A video presenting the underlying concept of this approach can be seen here.
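
As a rough, illustrative sketch of this focusing principle (not tied to any particular device or SDK), the Python snippet below computes per-transducer phase offsets so that all emitted waves arrive in phase at a chosen focal point; the 16×16 array geometry, 40 kHz carrier, and 20 cm focal height are assumptions chosen only for illustration. In practice the ultrasound carrier is typically also modulated at a lower frequency so that the focal point is felt as vibration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
CARRIER_FREQ = 40e3     # Hz; a common airborne-ultrasound carrier (assumed here)

def focusing_phases(transducer_positions, focal_point,
                    c=SPEED_OF_SOUND, f=CARRIER_FREQ):
    """Phase offset (radians) for each transducer so that the emitted waves
    arrive in phase at the focal point, creating a localized pressure focus."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    wavenumber = 2.0 * np.pi * f / c
    # Advancing each drive signal by k * d_i compensates its propagation delay.
    return np.mod(wavenumber * distances, 2.0 * np.pi)

# Hypothetical 16 x 16 array with 10 mm pitch, focused 20 cm above its centre.
pitch = 0.01
xs, ys = np.meshgrid(np.arange(16) * pitch, np.arange(16) * pitch)
positions = np.column_stack([xs.ravel() - xs.mean(),
                             ys.ravel() - ys.mean(),
                             np.zeros(xs.size)])
phases = focusing_phases(positions, np.array([0.0, 0.0, 0.2]))
```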

These novel tactile displays are finding applications in 3D gesture interfaces, e.g., in the automotive domain [4], in tactile displays for arts and entertainment [5][6], and in interactions in virtual reality [7][8]. An example of an interaction in VR can be seen here. While the technological aspects are continuously being refined and these devices show promise in these and other use cases, a fundamental understanding of the relationship between tactile rendering parameters and the perception of the displayed patterns is still missing.

The aim of this internship is to investigate the relationship between different rendering parameters for an ultrasound phased array and the resulting perceptual properties of the vibrotactile stimuli.

In particular, we wish to investigate:

  • The relationship between the generated pressure distributions and the perceived shape and size of the vibrotactile stimuli.
  • The relationship between spatiotemporal modulation rendering parameters (spatial and temporal steps, dwell time at each step) and the perceived size, continuity, and geometric features of the stimuli (see the sketch after this list).
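
To make these parameters concrete, the sketch below (a minimal example with placeholder values, not parameters prescribed by the project) discretises a circular spatiotemporal-modulation path into focal points from a chosen spatial step and dwell time, and derives the resulting pattern-repetition frequency.

```python
import numpy as np

def stm_circle(radius, spatial_step, dwell_time):
    """Discretise a circular spatiotemporal-modulation (STM) path.

    radius       -- circle radius in metres
    spatial_step -- distance between successive focal points in metres
    dwell_time   -- time spent at each focal point in seconds
    Returns the focal-point coordinates and the pattern-repetition frequency (Hz).
    """
    circumference = 2.0 * np.pi * radius
    n_points = max(1, int(round(circumference / spatial_step)))
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    points = np.column_stack([radius * np.cos(angles),
                              radius * np.sin(angles)])
    repetition_freq = 1.0 / (n_points * dwell_time)
    return points, repetition_freq

# Example: a 2 cm radius circle, 2.5 mm spatial step, and 0.5 ms dwell time
# give 50 focal points, with the full pattern redrawn at about 40 Hz.
points, freq = stm_circle(radius=0.02, spatial_step=0.0025, dwell_time=0.5e-3)
```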

Furthermore, we would like to investigate potential variations in the perceived patterns that may be induced by providing simultaneous vibrotactile stimuli with conventional vibration motors, in a mixed contact and mid-air haptic rendering scenario.

This internship will be conducted within the scope of the European project H-Reality (here and here). The research will be pursued in Rennes at IRISA, the largest research center in France in the fields of computer science and information technology, with more than 900 researchers and 41 research teams spread across Rennes, Lannion, and Vannes. Founded in 1975, IRISA is a joint effort of CNRS, University of Rennes 1, ENS Rennes, INSA Rennes, and Inria.

 

The candidate must be passionate about human perception and applied cognitive sciences, and be highly motivated to design and conduct user studies. Prior experience with haptics and technical skills (e.g., basic programming in Matlab/Python/C++) would be a plus but are not mandatory. The candidate will be expected to show initiative, critical thinking, and good communication skills. The internship is intended for a duration of 6 months. Depending on the results and the student's wishes, the option of continuing with a Ph.D. on a related topic may arise. Applications (CV + cover letter) should be sent by e-mail to Claudio Pacchierotti (claudio.pacchierotti@irisa.fr) and Maud Marchal (maud.marchal@irisa.fr).

 

[1] Carter, Tom, et al. “UltraHaptics: multi-point mid-air haptic feedback for touch surfaces.” Proceedings of the 26th annual ACM symposium on User interface software and technology. ACM, 2013.

[2] Iwamoto, Takayuki, et al. “Airborne ultrasound tactile display.” ACM SIGGRAPH 2008 New Tech Demos. ACM, 2008.

[3] Long, Benjamin, et al. “Rendering volumetric haptic shapes in mid-air using ultrasound.” ACM Transactions on Graphics (TOG) 33.6 (2014): 181.

[4] Harrington, Kyle, et al. “Exploring the use of mid-air ultrasonic feedback to enhance automotive user interfaces.” Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, 2018.

[5] Ablart, Damien, Carlos Velasco, and Marianna Obrist. “Integrating mid-air haptics into movie experiences.” Proceedings of the 2017 ACM International Conference on Interactive Experiences for TV and Online Video. ACM, 2017.

[6] Vi, Chi Thanh, et al. “Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition.” International Journal of Human-Computer Studies 108 (2017): 1-14.

[7] Georgiou, Orestis, et al. “Touchless haptic feedback for VR rhythm games.” 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2018.

[8] Martinez, Jonatan, et al. “Touchless haptic feedback for supernatural VR experiences.” 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2018.

 

French version here