
2020, 2(4): 291-304

Published Date: 2020-08-20    DOI: 10.1016/j.vrih.2020.07.005

Multimodal interaction design and application in augmented reality for chemical experiment

Abstract

Background
Augmented reality classrooms have become an interesting research topic in the field of education, but there are some limitations. First, most researchers use cards to operate experiments, and handling a large number of cards is difficult and inconvenient for users. Second, most users conduct experiments only in the visual modality, and such single-modality interaction greatly reduces the users' sense of realistic interaction. To solve these problems, we propose the Multimodal Interaction Algorithm based on Augmented Reality (ARGEV), which is based on visual and tactile feedback in augmented reality. In addition, we design a Virtual and Real Fusion Interactive Tool Suite (VRFITS) with gesture recognition and intelligent equipment.
Methods
The ARGEV method fuses gestures, intelligent equipment, and virtual models. We use a gesture recognition model trained with a convolutional neural network to recognize gestures in AR and to trigger vibration feedback after recognizing a five-finger grasp gesture. We establish a coordinate mapping between the real hand and the virtual model to achieve the fusion of gestures and the virtual model.
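The fusion loop described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the classifier stub, function names, and the affine hand-to-model mapping are all assumptions standing in for the trained CNN and the calibrated coordinate transform.

```python
import numpy as np

# Illustrative gesture vocabulary; the real system's label set may differ.
GESTURES = ["five_finger_grasp", "pinch", "point", "open_palm"]

def classify_gesture(hand_image: np.ndarray) -> str:
    """Stand-in for the trained CNN classifier (hypothetical).

    A real system would run the network on the image tensor; here we
    derive a deterministic fake label from the pixel sum.
    """
    return GESTURES[int(hand_image.sum()) % len(GESTURES)]

def hand_to_virtual(p_hand: np.ndarray, scale: float, offset: np.ndarray) -> np.ndarray:
    """Map a real-hand coordinate into virtual-model space.

    Modeled here as a simple affine mapping (assumed form).
    """
    return scale * p_hand + offset

def step(hand_image, p_hand, scale=0.01, offset=np.zeros(3)):
    """One frame of the fusion loop: classify, trigger haptics, map coords."""
    gesture = classify_gesture(np.asarray(hand_image))
    # Vibration feedback fires only on a recognized five-finger grasp.
    vibrate = gesture == "five_finger_grasp"
    p_virtual = hand_to_virtual(np.asarray(p_hand, dtype=float), scale, offset)
    return gesture, vibrate, p_virtual
```

For example, a frame classified as a five-finger grasp would set the vibration flag and place the virtual model at the mapped hand position.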
Results
The average gesture recognition accuracy was 99.04%. We verify and apply VRFITS in the Augmented Reality Chemistry Lab (ARCL); compared with traditional virtual simulation experiments, the overall operation load of ARCL is reduced by 29.42%.
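The 29.42% figure is a relative reduction in operation load. A one-line sketch makes the computation explicit; the numbers below are illustrative only, not the paper's measured scores.

```python
def relative_reduction(baseline: float, new: float) -> float:
    """Relative reduction in percent: how much lower `new` is than `baseline`."""
    return 100.0 * (baseline - new) / baseline

# Illustrative: a baseline load of 100 dropping to 70.58 is a 29.42% reduction.
print(relative_reduction(100.0, 70.58))
```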
Conclusions
We achieve real-time fusion of gestures, the virtual model, and intelligent equipment in ARCL. Compared with the NOBOOK virtual simulation experiment, ARCL improves the users' sense of realistic operation and their interaction efficiency.

Keywords

Augmented reality; Gesture recognition; Intelligent equipment; Multimodal interaction; Augmented Reality Chemistry Lab

Cite this article

Mengting XIAO, Zhiquan FENG, Xiaohui YANG, Tao XU, Qingbei GUO. Multimodal interaction design and application in augmented reality for chemical experiment. Virtual Reality & Intelligent Hardware, 2020, 2(4): 291-304 DOI:10.1016/j.vrih.2020.07.005

