
2020, 2(1): 56-69

Published Date: 2020-02-20    DOI: 10.1016/j.vrih.2019.12.004

Survey on path and view planning for UAVs


Background: In recent decades, unmanned aerial vehicles (UAVs) have developed rapidly and been widely applied in many domains, including photography, reconstruction, monitoring, and search and rescue. In such applications, one key issue is path and view planning, which tells a UAV exactly where to fly and how to search. Methods: With specific consideration for three popular UAV applications (scene reconstruction, environment exploration, and aerial cinematography), we present a survey that should assist researchers in positioning and evaluating their work in the context of existing solutions. Results/Conclusions: The survey should also help newcomers and practitioners in related fields quickly gain an overview of the vast literature. In addition to the current research status, we analyze and elaborate on the advantages, disadvantages, and potential research trends for each application domain.
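The planners surveyed here search maps far richer than a flat grid, but the core idea behind "where to fly" can be sketched with a toy breadth-first planner over a 2D occupancy grid. This is purely illustrative and not taken from any of the surveyed methods; the grid, `plan_path` function, and 4-connected neighborhood are assumptions made for the example.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of lists, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns a shortest list of cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # maps each visited cell to its parent
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Reconstruct the path by walking parents back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        # Expand the 4-connected neighbors that are in bounds, free, and unvisited.
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

# A 3x3 map with a wall blocking the middle column.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (0, 2))  # detours around the wall
```

Real UAV planners replace the grid with occupancy octrees or viewpoint graphs and the step cost with flight time, energy, or expected information gain, but the search-over-free-space skeleton is the same.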


Unmanned aerial vehicle; Path planning; View planning; Multi-view reconstruction; Autonomous exploration; Scene navigation; Obstacle avoidance; Drone cinematography; Camera control

Cite this article

Xiaohui ZHOU, Zimu YI, Yilin LIU, Kai HUANG, Hui HUANG. Survey on path and view planning for UAVs. Virtual Reality & Intelligent Hardware, 2020, 2(1): 56-69. DOI: 10.1016/j.vrih.2019.12.004



