Volume 50 Issue 10
Oct. 2024
CUI Shaoyun, BAO Jiusheng, HU Deping, et al. Research status and development trends of SLAM technology in autonomous mining field[J]. Journal of Mine Automation,2024,50(10):38-52.  doi: 10.13272/j.issn.1671-251x.2024070010

Research status and development trends of SLAM technology in autonomous mining field

doi: 10.13272/j.issn.1671-251x.2024070010
  • Received Date: 2024-07-03
  • Rev Recd Date: 2024-10-28
  • Available Online: 2024-09-29
  • Autonomous driving is one of the key technologies for intelligent mining, and simultaneous localization and mapping (SLAM) is a key enabling link for realizing it. To advance SLAM technology for autonomous mining, this paper reviews the principles of SLAM, mature ground-based SLAM solutions, the current state of mining SLAM research, and future development trends. Based on the sensors employed, the technical principles and representative frameworks are analyzed from three aspects: visual, laser, and multi-sensor fusion SLAM. Visual and laser SLAM, which rely on a single camera or LiDAR, are susceptible to environmental interference and cannot adapt to complex environments; multi-sensor fusion SLAM is therefore the most effective solution. The study then examines the state of mining SLAM research, analyzing the applicability and research value of visual, laser, and multi-sensor fusion SLAM in underground coal mines and open-pit mines. It concludes that multi-sensor fusion SLAM is the optimal research direction for underground coal mines, while the research value of SLAM in open-pit mines is limited. Given the challenges facing underground SLAM, such as errors that accumulate with running time and travel range, the adverse effects of various scene conditions, and sensors that fall short of the hardware requirements of high-precision SLAM algorithms, it is proposed that future development of SLAM for autonomous mining should focus on multi-sensor fusion, solid-state sensing, and intelligent development.
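The accumulated-error problem the abstract highlights can be illustrated with a minimal, self-contained toy sketch (a hypothetical example for intuition, not the paper's method; the names `dead_reckon` and `loop_close` are invented here). Odometry noise makes the estimated trajectory drift, and a loop-closure constraint, here the naive assumption that the robot returns exactly to its start, lets the drift be corrected retroactively:

```python
import math
import random

random.seed(42)  # deterministic noise for reproducibility

def dead_reckon(moves, noise=0.02):
    """Integrate noisy odometry increments (dx, dy) into a 2D trajectory."""
    x = y = 0.0
    traj = [(x, y)]
    for dx, dy in moves:
        x += dx + random.gauss(0, noise)
        y += dy + random.gauss(0, noise)
        traj.append((x, y))
    return traj

def loop_close(traj):
    """Naive loop closure: the true loop ends at the origin, so spread the
    accumulated endpoint error linearly back along the trajectory."""
    ex, ey = traj[-1]          # residual at loop closure (ideally 0, 0)
    n = len(traj) - 1
    return [(x - ex * i / n, y - ey * i / n) for i, (x, y) in enumerate(traj)]

# Square path: 10 unit steps per side, returning to the start.
moves = [(1, 0)] * 10 + [(0, 1)] * 10 + [(-1, 0)] * 10 + [(0, -1)] * 10
raw = dead_reckon(moves)
fixed = loop_close(raw)

drift_raw = math.hypot(*raw[-1])      # endpoint drift from dead reckoning
drift_fixed = math.hypot(*fixed[-1])  # exactly zero after correction
```

Real SLAM systems generalize this idea: instead of distributing one residual linearly, they optimize a pose graph (or filter state) over many such constraints, which is why loop closure is central to bounding long-run drift.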

     

  • [1]
    鲍久圣,刘琴,葛世荣,等. 矿山运输装备智能化技术研究现状及发展趋势[J]. 智能矿山,2020,1(1):78-88.

    BAO Jiusheng,LIU Qin,GE Shirong,et al. Research status and development trend of intelligent technologies for mine transportation equipment[J]. Journal of Intelligent Mine,2020,1(1):78-88.
    [2]
    鲍久圣,张牧野,葛世荣,等. 基于改进A*和人工势场算法的无轨胶轮车井下无人驾驶路径规划[J]. 煤炭学报,2022,47(3):1347-1360.

    BAO Jiusheng,ZHANG Muye,GE Shirong,et al. Underground driverless path planning of trackless rubber tyred vehicle based on improved A* and artificial potential field algorithm[J]. Journal of China Coal Society,2022,47(3):1347-1360.
    [3]
    SMITH R C,CHEESEMAN P. On the representation and estimation of spatial uncertainty[J]. The International Journal of Robotics Research,1986,5(4):56-68. doi: 10.1177/027836498600500404
    [4]
    刘铭哲,徐光辉,唐堂,等. 激光雷达SLAM算法综述[J]. 计算机工程与应用,2024,60(1):1-14. doi: 10.54254/2755-2721/60/20240821

    LIU Mingzhe,XU Guanghui,TANG Tang,et al. Review of SLAM based on lidar[J]. Computer Engineering and Applications,2024,60(1):1-14. doi: 10.54254/2755-2721/60/20240821
    [5]
    HUANG Leyao. Review on LiDAR-based SLAM techniques[C]. International Conference on Signal Processing and Machine Learning,Stanford,2021:163-168.
    [6]
    李云天,穆荣军,单永志. 无人系统视觉SLAM技术发展现状简析[J]. 控制与决策,2021,36(3):513-522.

    LI Yuntian,MU Rongjun,SHAN Yongzhi. A survey of visual SLAM in unmanned systems[J]. Control and Decision,2021,36(3):513-522.
    [7]
    DAVISON A J,REID I D,MOLTON N D,et al. MonoSLAM:real-time single camera SLAM[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence,2007,29(6):1052-1067. doi: 10.1109/TPAMI.2007.1049
    [8]
    KLEIN G,MURRAY D. Parallel tracking and mapping for small AR workspaces[C]. 6th IEEE and ACM International Symposium on Mixed and Augmented Reality,Nara,2007:225-234.
    [9]
    MUR-ARTAL R,MONTIEL J M M,TARDOS J D. ORB-SLAM:a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics,2015,31(5):1147-1163. doi: 10.1109/TRO.2015.2463671
    [10]
    MUR-ARTAL R,TARDOS J D. ORB-SLAM2:an open-source SLAM system for monocular,stereo,and RGB-D cameras[J]. IEEE Transactions on Robotics,2017,33(5):1255-1262. doi: 10.1109/TRO.2017.2705103
    [11]
    NEWCOMBE R A,LOVEGROVE S J,DAVISON A J. DTAM:dense tracking and mapping in real-time[C]. International Conference on Computer Vision,Barcelona,2011:2320-2327.
    [12]
    张继贤,刘飞. 视觉SLAM环境感知技术现状与智能化测绘应用展望[J]. 测绘学报,2023,52(10):1617-1630.

    ZHANG Jixian,LIU Fei. Review of visual SLAM environment perception technology and intelligent surveying and mapping application[J]. Acta Geodaetica et Cartographica Sinica,2023,52(10):1617-1630.
    [13]
    ENGEL J,STUCKLER J,CREMERS D. Large-scale direct SLAM with stereo cameras[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Hamburg,2015:1935-1942.
    [14]
    TATENO K,TOMBARI F,LAINA I,et al. CNN-SLAM:real-time dense monocular SLAM with learned depth prediction[C]. IEEE Conference on Computer Vision and Pattern Recognition,Honolulu,2017:6243-6252.
    [15]
    尹鋆泰. 动态场景下基于深度学习的视觉SLAM技术研究[D]. 北京:北京邮电大学,2023.

    YIN Juntai. Research on visual SLAM technology based on deep learning in dynamic scene[D]. Beijing:Beijing University of Posts and Telecommunications,2023.
    [16]
    MONTEMERLO M,THRUN S,KOLLER D,et al. FastSLAM:a factored solution to the simultaneous localization and mapping problem[C]. AAAI National Conference on Artificial Intelligence,Edmonton,2002:593-598.
    [17]
    HESS W,KOHLER D,RAPP H,et al. Real-time loop closure in 2D LIDAR SLAM[C]. IEEE International Conference on Robotics and Automation,Stockholm,2016:1271-1278.
    [18]
    ZHANG Ji,SINGH S. LOAM:lidar odometry and mapping in real-time[C]. Robotics:Science and Systems,Berkeley,2014. DOI: 10.15607/RSS.2014.X.007.
    [19]
    SHAN Tixiao,ENGLOT B. LeGO-LOAM:lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Madrid,2018:4758-4765.
    [20]
    LIN Jiarong,ZHANG Fu. Loam livox:a fast,robust,high-precision LiDAR odometry and mapping package for LiDARs of small FoV[C]. IEEE International Conference on Robotics and Automation,Paris,2020:3126-3131.
    [21]
    LI Lin,KONG Xin,ZHAO Xiangrui,et al. SA-LOAM:semantic-aided LiDAR SLAM with loop closure[C]. IEEE International Conference on Robotics and Automation,Xi'an,2021:7627-7634.
    [22]
    CHEN X,MILIOTO A,PALAZZOLO E,et al. SuMa++:efficient LiDAR-based semantic SLAM[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Macau,2019:4530-4537.
    [23]
    WANG Guangming,WU Xinrui,JIANG Shuyang,et al. Efficient 3D deep LiDAR odometry[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence,2023,45(5):5749-5765.
    [24]
    QIN Tong,LI Peiliang,SHEN Shaojie. VINS-mono:a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics,2018,34(4):1004-1020. doi: 10.1109/TRO.2018.2853729
    [25]
    LI Peiliang,QIN Tong,HU Botao,et al. Monocular visual-inertial state estimation for mobile augmented reality[C]. International Symposium on Mixed and Augmented Reality,Nantes,2017:11-21.
    [26]
    QIN Tong,PAN Jie,GAO Shaozu,et al. A general optimization-based framework for local odometry estimation with multiple sensors[EB/OL]. (2019-01-11)[2024-06-22]. https://arxiv.org/abs/1901.03638.
    [27]
    SHAN Tixiao,ENGLOT B,MEYERS D,et al. LIO-SAM:tightly-coupled lidar inertial odometry via smoothing and mapping[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Las Vegas,2020:5135-5142.
    [28]
    SHAN Tixiao,ENGLOT B,RATTI C,et al. LVI-SAM:tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[C]. IEEE International Conference on Robotics and Automation,Xi'an,2021:5692-5698.
    [29]
    祝晓轩. 基于单目相机与IMU融合的SLAM系统研究[D]. 青岛:青岛大学,2023.

    ZHU Xiaoxuan. Research on SLAM system based on monocular camera and IMU fusion[D]. Qingdao:Qingdao University,2023.
    [30]
    秦晓辉,周洪,廖毅霏,等. 动态环境下基于时序滑动窗口的鲁棒激光SLAM系统[J]. 湖南大学学报(自然科学版),2023,50(12):49-58.

    QIN Xiaohui,ZHOU Hong,LIAO Yifei,et al. Robust laser SLAM system based on temporal sliding window in dynamic scenes[J]. Journal of Hunan University(Natural Sciences),2023,50(12):49-58.
    [31]
    YE Haoyang,CHEN Yuying,LIU Ming. Tightly coupled 3D lidar inertial odometry and mapping[C]. International Conference on Robotics and Automation,Montreal,2019:3144-3150.
    [32]
    QIN Chao,YE Haoyang,PRANATA C E,et al. LINS:a lidar-inertial state estimator for robust and efficient navigation[C]. IEEE International Conference on Robotics and Automation,Paris,2020:8899-8906.
    [33]
    XU Wei,ZHANG Fu. FAST-LIO:a fast,robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter[J]. IEEE Robotics and Automation Letters,2021,6(2):3317-3324. doi: 10.1109/LRA.2021.3064227
    [34]
    XU Wei,CAI Yixi,HE Dongjiao,et al. FAST-LIO2:fast direct LiDAR-inertial odometry[J]. IEEE Transactions on Robotics,2022,38(4):2053-2073. doi: 10.1109/TRO.2022.3141876
    [35]
    BAI Chunge,XIAO Tao,CHEN Yajie,et al. Faster-LIO:lightweight tightly coupled lidar-inertial odometry using parallel sparse incremental voxels[J]. IEEE Robotics and Automation Letters,2022,7(2):4861-4868. doi: 10.1109/LRA.2022.3152830
    [36]
    LI Kailai,LI Meng,HANEBECK U D. Towards high-performance solid-state-LiDAR-inertial odometry and mapping[J]. IEEE Robotics and Automation Letters,2021,6(3):5167-5174. doi: 10.1109/LRA.2021.3070251
    [37]
    ZHAO Shibo,FANG Zheng,LI Haolai,et al. A robust laser-inertial odometry and mapping method for large-scale highway environments[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Macau,2019:1285-1292.
    [38]
    LEUTENEGGER S,LYNEN S,BOSSE M,et al. Keyframe-based visual–inertial odometry using nonlinear optimization[J]. The International Journal of Robotics Research,2015,34(3):314-334. doi: 10.1177/0278364914554813
    [39]
    MUR-ARTAL R,TARDOS J D. Visual-inertial monocular SLAM with map reuse[J]. IEEE Robotics and Automation Letters,2017,2(2):796-803. doi: 10.1109/LRA.2017.2653359
    [40]
    CAMPOS C,ELVIRA R,RODRIGUEZ J J G,et al. ORB-SLAM3:an accurate open-source library for visual,visual-inertial,and multimap SLAM[J]. IEEE Transactions on Robotics,2021,37(6):1874-1890. doi: 10.1109/TRO.2021.3075644
    [41]
    ZHANG Ji,SINGH S. Visual-lidar odometry and mapping:low-drift,robust,and fast[C]. IEEE International Conference on Robotics and Automation,Seattle,2015:2174-2181.
    [42]
    GRAETER J,WILCZYNSKI A,LAUER M. LIMO:lidar-monocular visual odometry[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Madrid,2018:7872-7879.
    [43]
    LIN Jiarong,ZHENG Chunran,XU Wei,et al. R2LIVE:a robust,real-time,LiDAR-inertial-visual tightly-coupled state estimator and mapping[J]. IEEE Robotics and Automation Letters,2021,6(4):7469-7476. doi: 10.1109/LRA.2021.3095515
    [44]
    LIN Jiarong,ZHANG Fu. R3LIVE:a robust,real-time,RGB-colored,LiDAR-inertial-visual tightly-coupled state estimation and mapping package[C]. International Conference on Robotics and Automation,Philadelphia,2022:10672-10678.
    [45]
    LIN Jiarong,ZHANG Fu. R3LIVE++:a robust,real-time,radiance reconstruction package with a tightly-coupled LiDAR-inertial-visual state estimator[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence,2024. DOI: 10.1109/TPAMI.2024.3456473.
    [46]
    ZHENG Chunran,ZHU Qingyan,XU Wei,et al. FAST-LIVO:fast and tightly-coupled sparse-direct LiDAR-inertial-visual odometry[C]. IEEE/RSJ International Conference on Intelligent Robots and Systems,Kyoto,2022:4003-4009.
    [47]
    高毅楠,姚顽强,蔺小虎,等. 煤矿井下多重约束的视觉SLAM关键帧选取方法[J]. 煤炭学报,2024,49(增刊1):472-482.

    GAO Yinan,YAO Wanqiang,LIN Xiaohu,et al. Visual SLAM keyframe selection method with multiple constraints in underground coal mines[J]. Journal of China Coal Society,2024,49(S1):472-482.
    [48]
    冯玮,姚顽强,蔺小虎,等. 顾及图像增强的煤矿井下视觉同时定位与建图算法[J]. 工矿自动化,2023,49(5):74-81.

    FENG Wei,YAO Wanqiang,LIN Xiaohu,et al. Visual simultaneous localization and mapping algorithm of coal mine underground considering image enhancement[J]. Journal of Mine Automation,2023,49(5):74-81.
    [49]
    马宏伟,王岩,杨林. 煤矿井下移动机器人深度视觉自主导航研究[J]. 煤炭学报,2020,45(6):2193-2206.

    MA Hongwei,WANG Yan,YANG Lin. Research on depth vision based mobile robot autonomous navigation in underground coal mine[J]. Journal of China Coal Society,2020,45(6):2193-2206.
    [50]
    HUBER D F,VANDAPEL N. Automatic three-dimensional underground mine mapping[J]. The International Journal of Robotics Research,2006,25(1):7-17. doi: 10.1177/0278364906061157
    [51]
    安震. 自主导航搜救机器人关键技术研究[D]. 沈阳:东北大学,2015.

    AN Zhen. Research on key technologies of autonomous navigation search and rescue robot[D]. Shenyang:Northeastern University,2015.
    [52]
    LI Menggang,ZHU Hua,YOU Shaoze,et al. Efficient laser-based 3D SLAM for coal mine rescue robots[J]. IEEE Access,2019,7:14124-14138. doi: 10.1109/ACCESS.2018.2889304
    [53]
    REN Zhuli,WANG Liguan,BI Lin. Robust GICP-based 3D LiDAR SLAM for underground mining environment[J]. Sensors,2019,19(13). DOI: 10.3390/s19132915.
    [54]
    邹筱瑜,黄鑫淼,王忠宾,等. 基于集成式因子图优化的煤矿巷道移动机器人三维地图构建[J]. 工矿自动化,2022,48(12):57-67,92.

    ZOU Xiaoyu,HUANG Xinmiao,WANG Zhongbin,et al. 3D map construction of coal mine roadway mobile robot based on integrated factor graph optimization[J]. Journal of Mine Automation,2022,48(12):57-67,92.
    [55]
    许鹏程. 基于粒子群优化的煤矿井下机器人FASTSLAM算法研究[D]. 北京:煤炭科学研究总院,2017.

    XU Pengcheng. Research on FASTSLAM algorithm of coal mine underground robot based on particle swarm optimization[D]. Beijing:China Coal Research Institute,2017.
    [56]
    杨林,马宏伟,王岩,等. 煤矿巡检机器人同步定位与地图构建方法研究[J]. 工矿自动化,2019,45(9):18-24.

    YANG Lin,MA Hongwei,WANG Yan,et al. Research on method of simultaneous localization and mapping of coal mine inspection robot[J]. Industry and Mine Automation,2019,45(9):18-24.
    [57]
    代嘉惠. 大功率本安驱动煤矿救援机器人定位与建图算法研究[D]. 重庆:重庆大学,2019.

    DAI Jiahui. Study on localization and mapping algorithm of high-power intrinsically safe coal mine rescue robot[D]. Chongqing:Chongqing University,2019.
    [58]
    李仲强. 煤矿救援机器人自主建图和导航技术研究[D]. 淮南:安徽理工大学,2019.

    LI Zhongqiang. Research on self-construction and navigation technology of coal mine rescue robot[D]. Huainan:Anhui University of Science and Technology,2019.
    [59]
    李芳威,鲍久圣,王陈,等. 基于LD改进Cartographer建图算法的无人驾驶无轨胶轮车井下SLAM自主导航方法及试验[J/OL]. 煤炭学报:1-12[2024-06-22]. https://doi.org/10.13225/j.cnki.jccs.2023.0731.

    LI Fangwei,BAO Jiusheng,WANG Chen,et al. Underground SLAM autonomous navigation method and test for an unmanned trackless rubber-tyred vehicle based on an LD-improved Cartographer mapping algorithm[J/OL]. Journal of China Coal Society:1-12[2024-06-22]. https://doi.org/10.13225/j.cnki.jccs.2023.0731.
    [60]
    顾清华,白昌鑫,陈露,等. 基于多线激光雷达的井下斜坡道无人矿卡定位与建图方法[J]. 煤炭学报,2024,49(3):1680-1688.

    GU Qinghua,BAI Changxin,CHEN Lu,et al. Localization and mapping method for unmanned mining trucks in underground slope roads based on multi-line lidar[J]. Journal of China Coal Society,2024,49(3):1680-1688.
    [61]
    薛光辉,李瑞雪,张钲昊,等. 基于激光雷达的煤矿井底车场地图融合构建方法研究[J]. 煤炭科学技术,2023,51(8):219-227.

    XUE Guanghui,LI Ruixue,ZHANG Zhenghao,et al. Lidar based map construction fusion method for underground coal mine shaft bottom[J]. Coal Science and Technology,2023,51(8):219-227.
    [62]
    ZHU Daixian,JI Kangkang,WU Dong,et al. A coupled visual and inertial measurement units method for locating and mapping in coal mine tunnel[J]. Sensors,2022,22(19):7437. doi: 10.3390/s22197437
    [63]
    汪雷. 煤矿探测机器人图像处理及动态物体去除算法研究[D]. 徐州:中国矿业大学,2020.

    WANG Lei. Research on image processing and dynamic object removal algorithm of coal mine detection robot[D]. Xuzhou:China University of Mining and Technology,2020.
    [64]
    YANG Xin,LIN Xiaohu,YAO Wanqiang,et al. A robust LiDAR SLAM method for underground coal mine robot with degenerated scene compensation[J]. Remote Sensing,2022,15(1). DOI: 10.3390/RS15010186.
    [65]
    YANG Lin,MA Hongwei,NIE Zhen,et al. 3D LiDAR point cloud registration based on IMU preintegration in coal mine roadways[J]. Sensors,2023,23(7). DOI: 10.3390/S23073473.
    [66]
    司垒,王忠宾,魏东,等. 基于IMU−LiDAR紧耦合的煤矿防冲钻孔机器人定位导航方法[J]. 煤炭学报,2024,49(4):2179-2194.

    SI Lei,WANG Zhongbin,WEI Dong,et al. Positioning and navigation method of underground drilling robot for rock-burst prevention based on IMU-LiDAR tight coupling[J]. Journal of China Coal Society,2024,49(4):2179-2194.
    [67]
    李猛钢,胡而已,朱华. 煤矿移动机器人LiDAR/IMU紧耦合SLAM方法[J]. 工矿自动化,2022,48(12):68-78.

    LI Menggang,HU Eryi,ZHU Hua. LiDAR/IMU tightly-coupled SLAM method for coal mine mobile robot[J]. Journal of Mine Automation,2022,48(12):68-78.
    [68]
    董志华,姚顽强,蔺小虎,等. 煤矿井下顾及特征点动态提取的激光SLAM算法研究[J]. 煤矿安全,2023,54(8):241-246.

    DONG Zhihua,YAO Wanqiang,LIN Xiaohu,et al. LiDAR SLAM algorithm considering dynamic extraction of feature points in underground coal mine[J]. Safety in Coal Mines,2023,54(8):241-246.
    [69]
    薛光辉,张钲昊,张桂艺,等. 煤矿井下点云特征提取和配准算法改进与激光SLAM研究[J/OL]. 煤炭科学技术:1-12[2024-06-22]. http://kns.cnki.net/kcms/detail/11.2402.TD.20240722.1557.003.html.

    XUE Guanghui,ZHANG Zhenghao,ZHANG Guiyi,et al. Improvement of point cloud feature extraction and alignment algorithms and LiDAR SLAM in coal mine underground[J/OL]. Coal Science and Technology:1-12[2024-06-22]. http://kns.cnki.net/kcms/detail/11.2402.TD.20240722.1557.003.html.
    [70]
    李栋. 基于多源信息融合的巷道语义地图构建与复用方法研究[D]. 苏州:苏州大学,2022.

    LI Dong. A Method of construction and reuse of roadway semantic map based on multi-source information fusion[D]. Suzhou:Soochow University,2022.
    [71]
    陈步平. 矿用搜救机器人多源信息融合SLAM方法研究[D]. 徐州:中国矿业大学,2023.

    CHEN Buping. Research on SLAM method of multi-source information fusion for mine search and rescue robot[D]. Xuzhou:China University of Mining and Technology,2023.
    [72]
    马艾强,姚顽强. 煤矿井下移动机器人多传感器自适应融合SLAM方法[J]. 工矿自动化,2024,50(5):107-117.

    MA Aiqiang,YAO Wanqiang. Multi sensor adaptive fusion SLAM method for underground mobile robots in coal mines[J]. Journal of Mine Automation,2024,50(5):107-117.
    [73]
    滕睿. 露天矿运输车辆无人驾驶关键技术研究[D]. 阜新:辽宁工程技术大学,2023.

    TENG Rui. Research on key technologies of unmanned driving of transport vehicles in open-pit mine[D]. Fuxin:Liaoning Technical University,2023.
    [74]
    张清宇,崔丽珍,李敏超,等. 倾斜地面3D点云快速分割算法[J]. 无线电工程,2024,54(2):447-456.

    ZHANG Qingyu,CUI Lizhen,LI Minchao,et al. A fast segmentation algorithm for 3D point cloud on inclined ground[J]. Radio Engineering,2024,54(2):447-456.
    [75]
    张清宇. 煤矿环境下LiDAR/IMU融合定位算法研究与实现[D]. 包头:内蒙古科技大学,2023.

    ZHANG Qingyu. Research and implementation of LiDAR/IMU fusion positioning algorithm in coal mine environment[D]. Baotou:Inner Mongolia University of Science & Technology,2023.
    [76]
    马宝良,崔丽珍,李敏超,等. 露天煤矿环境下基于LiDAR/IMU的紧耦合SLAM算法研究[J]. 煤炭科学技术,2024,52(3):236-244.

    MA Baoliang,CUI Lizhen,LI Minchao,et al. Study on tightly coupled LiDAR-Inertial SLAM for open pit coal mine environment[J]. Coal Science and Technology,2024,52(3):236-244.
    [77]
    李慧,李敏超,崔丽珍,等. 露天煤矿三维激光雷达运动畸变算法研究[J/OL]. 煤炭科学技术:1-12[2024-06-22]. http://kns.cnki.net/kcms/detail/11.2402.td.20240325.1558.006.html.

    LI Hui,LI Minchao,CUI Lizhen,et al. Research on 3D LiDAR motion distortion algorithm for open-pit coal mine[J/OL]. Coal Science and Technology:1-12[2024-06-22]. http://kns.cnki.net/kcms/detail/11.2402.td.20240325.1558.006.html.

    Figures(19)  / Tables(1)
