
2019, 1(3): 265-275

Published Date: 2019-06-20 DOI: 10.3724/SP.J.2096-5796.2019.0009

Tactile sensitivity in ultrasonic haptics: Do different parts of hand and different rendering methods have an impact on perceptual threshold?


Ultrasonic tactile representation uses focused ultrasound to create tactile sensations on the bare skin of a user's hand without any contact with a device. This study is a preliminary investigation into whether different ultrasonic haptic rendering methods affect the perceptual threshold.
We conducted experiments with the adaptive step method to obtain participants' perceptual thresholds, examining (1) whether different parts of the palm of the hand have different perceptual thresholds; (2) whether the perceptual threshold differs when the ultrasonic focus point is stationary versus when it moves along different trajectories; (3) whether different moving speeds of the ultrasonic focus point influence the perceptual threshold; and (4) whether adding a DC offset to the modulating wave affects the perceptual threshold.
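The adaptive step method referred to above is a staircase procedure: stimulus intensity is decreased after each detected stimulus and increased after each missed one, and the threshold is estimated from the amplitudes at which the response direction reverses. A minimal sketch follows; the 1-up/1-down rule, the step size, the starting amplitude, and the reversal count are illustrative assumptions, not the parameters used in the study.

```python
def staircase_threshold(respond, start=1.0, step=0.1, n_reversals=6):
    """Simple 1-up/1-down adaptive staircase (illustrative sketch).

    respond(amplitude) -> True if the participant felt the stimulus.
    Returns the mean amplitude over the recorded reversals, an
    estimate of the perceptual threshold.
    """
    amplitude = start
    direction = -1            # start by decreasing intensity
    reversals = []
    while len(reversals) < n_reversals:
        felt = respond(amplitude)
        new_direction = -1 if felt else 1
        if new_direction != direction:    # response flipped: a reversal
            reversals.append(amplitude)
            direction = new_direction
        amplitude = max(0.0, amplitude + new_direction * step)
    return sum(reversals) / len(reversals)
```

With a deterministic simulated observer whose true threshold is 0.42, the staircase converges to an oscillation around that value, and the reversal mean lands between the two straddling amplitudes.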
The results show that the center of the palm is more sensitive to ultrasonic haptics than the fingertip, and that the palm is more sensitive to a stationary or slow-moving focus point than to a fast-moving one. When the modulating wave has a DC offset, the palm is sensitive to a much smaller modulation amplitude.
For future ultrasonic tactile representation systems, dynamic intensity adjustment is required to compensate for the differences in perceptual thresholds under different rendering methods and thereby achieve more realistic ultrasonic haptics.
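In amplitude-modulated ultrasonic haptics, a low-frequency envelope (at a vibrotactile rate) modulates a high-frequency carrier; a DC offset raises the envelope so the carrier never fully switches off. The sketch below illustrates that envelope; the function name `am_envelope`, the 200 Hz modulation rate, and the parameter values are assumptions for illustration, not the study's actual settings.

```python
import math

def am_envelope(t, mod_hz=200, mod_depth=1.0, dc_offset=0.0):
    """Amplitude-modulation envelope at time t (seconds), illustrative.

    The envelope is a sinusoid of depth `mod_depth` raised by
    `dc_offset`, so it spans [dc_offset, dc_offset + mod_depth];
    a nonzero offset keeps the ultrasonic carrier always on.
    """
    return dc_offset + mod_depth * 0.5 * (1 + math.sin(2 * math.pi * mod_hz * t))

def modulated_pressure(t, carrier_hz=40_000, **env_kwargs):
    """Envelope applied to a 40 kHz carrier (a typical array frequency)."""
    return am_envelope(t, **env_kwargs) * math.sin(2 * math.pi * carrier_hz * t)
```

With `dc_offset=0.2` and `mod_depth=1.0`, the envelope swings between 0.2 and 1.2, so the skin receives a constant baseline radiation pressure plus the vibrotactile component.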


Ultrasonic tactile; Rendering methods; Amplitude modulation; Perceptual threshold; Human-computer interaction

Cite this article

Chongyang SUN, Weizhi NAI, Xiaoying SUN. Tactile sensitivity in ultrasonic haptics: Do different parts of hand and different rendering methods have an impact on perceptual threshold? Virtual Reality & Intelligent Hardware, 2019, 1(3): 265-275. DOI: 10.3724/SP.J.2096-5796.2019.0009



1. Egemen ERTUGRUL, Ping LI, Bin SHENG, On attaining user-friendly hand gesture interfaces to control existing GUIs Virtual Reality & Intelligent Hardware 2020, 2(2): 153-161

2. Shiwei CHENG, Qiufeng PING, Jialing WANG, Yijian CHEN, EyeGaze: Hybrid eye tracking approach for handheld mobile devices Virtual Reality & Intelligent Hardware 2022, 4(2): 173-188

3. Mohammad Mahmudul ALAM, S. M. Mahbubur RAHMAN, Affine transformation of virtual 3D object using 2D localization of fingertips Virtual Reality & Intelligent Hardware 2020, 2(6): 534-555

4. Yuanyuan SHI, Yunan LI, Xiaolong FU, Kaibin MIAO, Qiguang MIAO, Review of dynamic gesture recognition Virtual Reality & Intelligent Hardware 2021, 3(3): 183-206

5. Wanlu ZHENG, Wenming ZHENG, Yuan ZONG, Multi-scale discrepancy adversarial network for cross-corpus speech emotion recognition Virtual Reality & Intelligent Hardware 2021, 3(1): 65-75

6. Xuezhi YAN, Qiushuang WU, Xiaoying SUN, Electrostatic tactile representation in multimedia mobile terminal Virtual Reality & Intelligent Hardware 2019, 1(2): 201-218

7. Xiaoxiong FAN, Yun CAI, Yufei YANG, Tianxing XU, Yike Li, Songhai ZHANG, Fanglue ZHANG, Detection of scene-irrelevant head movements via eye-head coordination information Virtual Reality & Intelligent Hardware 2021, 3(6): 501-514