
Simultaneous Localization and Mapping (SLAM): An Overview

Abstract

Positioning is required for many mapping and navigation applications, in both civilian and military domains. Significant developments in satellite-based techniques, sensors, telecommunications, computer hardware and software, and image processing have made it possible to solve the positioning problem efficiently and in real time, and have thereby advanced autonomous navigation and its applications. One of the most interesting positioning techniques to emerge is what robotics calls Simultaneous Localization and Mapping (SLAM). Solutions to the SLAM problem have improved rapidly over the last decades, using either active sensors such as Radio Detection and Ranging (radar) and Light Detection and Ranging (LiDAR) or passive sensors such as cameras. Positioning and mapping is one of the main tasks of geomatics engineers, so it is important for them to understand SLAM. This is not easy, given the huge body of literature and algorithms available and the variety of SLAM solutions in terms of mathematical models, complexity, sensors used, and types of application. In this paper, a clear and simplified explanation of SLAM is introduced from a geometrical viewpoint, avoiding the complicated algorithmic details behind the presented techniques. In this way, a general overview of SLAM is presented, showing the relationship between its components and stages, such as the core front-end and back-end parts, and their role in the SLAM paradigm. Furthermore, we explain the major mathematical techniques of filtering and pose graph optimization, for both visual and LiDAR SLAM, and summarize the contribution of deep learning to the SLAM problem. Finally, we address examples of existing practical applications of SLAM in the real world.
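As an illustration of the pose-graph-optimization back-end mentioned above, the following minimal sketch (not taken from the paper; all poses, constraints, and values are invented for illustration) solves a one-dimensional pose graph by linear least squares. Three robot poses are linked by two odometry constraints and one slightly inconsistent loop-closure constraint; the optimizer distributes the loop-closure residual over the whole trajectory, which is the essence of graph-based SLAM in its simplest form.

```python
import numpy as np

# Hypothetical 1D pose graph: poses x0, x1, x2 along a line.
# Each row of A encodes one linear constraint a.x = b:
#   x0          = 0.0   (anchor the first pose to fix the gauge freedom)
#   x1 - x0     = 1.0   (odometry step 1)
#   x2 - x1     = 1.0   (odometry step 2)
#   x2 - x0     = 2.1   (loop closure, slightly inconsistent with odometry)
A = np.array([
    [ 1.0, 0.0, 0.0],
    [-1.0, 1.0, 0.0],
    [ 0.0,-1.0, 1.0],
    [-1.0, 0.0, 1.0],
])
b = np.array([0.0, 1.0, 1.0, 2.1])

# Least-squares solution: the 0.1 loop-closure disagreement is spread
# evenly over the two odometry edges instead of being absorbed by one.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))  # -> [0.    1.033 2.067]
```

In real SLAM back-ends the constraints are nonlinear (rotations are involved) and weighted by their covariances, so libraries iterate Gauss-Newton or Levenberg-Marquardt steps on exactly this kind of linearized system.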

Keywords

SLAM, Visual Odometry, Extended Kalman Filter, Deep Learning, Pose Graph Optimization


