A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical and experimental research and development in all areas of automation
Volume 12, Issue 6, Jun. 2025

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1% (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
G. Liu, K. Huang, X. Lv, Y. Sun, H. Li, X. Lei, Q. Yuan, and L. Shu, “Innovations and refinements in LiDAR odometry and mapping: A comprehensive review,” IEEE/CAA J. Autom. Sinica, vol. 12, no. 6, pp. 1072–1094, Jun. 2025. doi: 10.1109/JAS.2025.125198

Innovations and Refinements in LiDAR Odometry and Mapping: A Comprehensive Review

doi: 10.1109/JAS.2025.125198
Funds:  This work was supported by Jiangsu Agriculture Science and Technology Innovation Fund (CX(23)2003), China Agriculture Research System of MOF and MARA (CARS-28-21), the National Natural Science Foundation of China (32201680), the National Science and Technology Development Program of China (NK2022160104), and the National Key Research and Development Program of China (2022YFD2001400)
  • Since its introduction in 2014, the LiDAR odometry and mapping (LOAM) algorithm has become a cornerstone in the fields of autonomous driving and intelligent robotics. LOAM provides robust support for autonomous navigation in complex dynamic environments through precise localization and environmental mapping. This paper offers a comprehensive review of the innovations and optimizations made to the LOAM algorithm, covering advancements in multi-sensor fusion technology, frontend processing optimization, backend optimization, and loop closure detection. These improvements have significantly enhanced LOAM’s performance in various scenarios, including urban, agricultural, and underground environments. However, challenges remain in areas such as data synchronization, real-time processing, computational complexity, and environmental adaptability. Looking ahead, future developments are expected to focus on creating more efficient multi-sensor fusion algorithms, expanding application domains, and building more robust systems, thereby driving continued progress in autonomous driving, intelligent robotics, and autonomous unmanned systems.
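The frontend improvements surveyed in this review build on LOAM's core idea of splitting each scan line into edge and planar feature points by local curvature before registration. The sketch below illustrates that classification step only; it is a minimal reconstruction in the spirit of Zhang and Singh's formulation, not any cited implementation, and the function name, thresholds, and synthetic data are assumptions for illustration.

```python
import numpy as np

def classify_features(scan, k=5, edge_thresh=0.01, planar_thresh=0.001):
    """Split one ordered LiDAR scan line into edge and planar feature
    points by local curvature, in the spirit of LOAM's frontend.

    scan: (N, 3) array of points, ordered along the scan line.
    Thresholds are illustrative, not values from any LOAM variant.
    Returns (edge_idx, planar_idx) as arrays of point indices.
    """
    n = len(scan)
    valid = np.arange(k, n - k)          # points with k neighbors on each side
    c = np.empty(len(valid))
    for j, i in enumerate(valid):
        # Deviation of the point from the sum of its 2k neighbors:
        # near zero on flat surfaces, large at corners and edges.
        diff = (2 * k * scan[i]
                - scan[i - k:i].sum(axis=0)
                - scan[i + 1:i + k + 1].sum(axis=0))
        # Normalize by range so curvature is scale-independent.
        c[j] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan[i]) + 1e-9)
    edge_idx = valid[c > edge_thresh]
    planar_idx = valid[c < planar_thresh]
    return edge_idx, planar_idx
```

On a synthetic L-shaped wall, the interior of each flat segment classifies as planar and the corner as an edge; production variants additionally distribute selections across angular sectors and filter occluded or near-parallel points.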

     


    Figures(7)  / Tables(7)
