
2022, 4(2): 173-188

Published Date: 2022-04-20   DOI: 10.1016/j.vrih.2021.10.003

EyeGaze: Hybrid eye tracking approach for handheld mobile devices

Abstract

Background
Eye-tracking technology for mobile devices has made significant progress. However, owing to the limited computing capacity of handheld devices and the complexity of real usage contexts, conventional image feature-based techniques cannot extract features accurately, which degrades tracking performance.
Methods
This study proposes a novel approach that combines appearance-based and feature-based eye-tracking methods. Face and eye regions were first detected, and the cropped eye images were fed to an appearance-based model that located the feature points. From these points, feature vectors such as the vector from the eye-corner center to the pupil center were generated, from which the gaze fixation coordinates were calculated.
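The pipeline above ends with mapping a feature vector (e.g., eye-corner center to pupil center) to on-screen gaze coordinates. A minimal sketch of that last step is shown below; the function names, the per-axis linear mapping, and the calibration procedure are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: corner-center-to-pupil-center (CC-PC) feature vector and a
# simple per-axis linear calibration mapping it to screen coordinates.

def corner_center(inner_corner, outer_corner):
    """Midpoint of the two eye corners, used as a head-stable anchor point."""
    return ((inner_corner[0] + outer_corner[0]) / 2,
            (inner_corner[1] + outer_corner[1]) / 2)

def gaze_feature(pupil_center, corner_ctr):
    """CC-PC feature vector: displacement of pupil center from corner center."""
    return (pupil_center[0] - corner_ctr[0], pupil_center[1] - corner_ctr[1])

def fit_linear_map(samples):
    """Least-squares fit, one axis at a time: screen = a * feature + b.
    `samples` is a list of (feature_vector, screen_point) calibration pairs,
    e.g., collected while the user fixates known on-screen targets."""
    def fit(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        denom = sum((x - mx) ** 2 for x in xs) or 1.0
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / denom
        return a, my - a * mx
    ax, bx = fit([f[0] for f, _ in samples], [s[0] for _, s in samples])
    ay, by = fit([f[1] for f, _ in samples], [s[1] for _, s in samples])
    return lambda f: (ax * f[0] + bx, ay * f[1] + by)
```

A per-axis linear map is the simplest plausible choice; a real system would likely use a higher-order polynomial or regression model and re-calibrate when head pose changes.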
Results
To identify the feature vector with the best performance, we compared different vectors under different image resolutions and illumination conditions. The best average gaze fixation accuracy, a visual angle of 1.93°, was achieved at an image resolution of 96 × 48 pixels with the light source illuminating the eye from the front.
Conclusions
Compared with current methods, our method improved gaze fixation accuracy and usability.

Keyword

Eye movement; Gaze estimation; Fixation; Human-computer interaction; Eye tracking

Cite this article

Shiwei CHENG, Qiufeng PING, Jialing WANG, Yijian CHEN. EyeGaze: Hybrid eye tracking approach for handheld mobile devices. Virtual Reality & Intelligent Hardware, 2022, 4(2): 173-188 DOI:10.1016/j.vrih.2021.10.003

