A method for estimating the step size of underground personnel based on generative adversarial networks

WANG Taiji

Citation: WANG Taiji. A method for estimating the step size of underground personnel based on generative adversarial networks[J]. Journal of Mine Automation, 2024, 50(6): 103-111. doi: 10.13272/j.issn.1671-251x.2024020039

doi: 10.13272/j.issn.1671-251x.2024020039
Funding: Jiangsu Province Achievement Transformation Project (BA2022040).
Author biography:

    WANG Taiji (1975—), male, born in Wuwei, Gansu Province, senior engineer; research interests: coal mine electromechanical systems, informatization, and intelligent mining. E-mail: 740897262@qq.com

  • CLC number: TD655.3


  • Abstract: To address the accumulated error of step length estimation in pedestrian dead reckoning (PDR) based underground coal mine personnel positioning systems, and the excessively large datasets required by conventional deep learning methods, a step length estimation method for underground personnel based on a generative adversarial network (GAN) is proposed. The GAN model consists of two parts, a generative model and a discriminative model, both implemented with deep neural networks (DNNs). The generative model produces a continuous distribution of results (i.e. labels) from the input data; its output layer uses a linear activation function to preserve the linear character of the network, allowing the model to predict the step length of any person while walking. The discriminative model judges, from the input data and the label, whether the label is real or produced by the generator; its output layer uses a Sigmoid activation function to perform binary classification. Once the generative and discriminative models are defined, the GAN trains the two jointly: by constructing and optimizing the dynamic competition between generator and discriminator, the generator learns over successive iterations to produce increasingly realistic samples that are hard to distinguish from real ones. Experimental results show that, with the same training and test sets, the mean error of the GAN model is 0.14 m, and its standard deviation and root mean square error are both smaller than those of the DNNs model, with minimum values of 0.74 m. Outdoor tests show that the GAN-based step length estimation method has a minimum error of 3.21% and a maximum error of 4.79% in the uphill/downhill scenario; the playground scenario yields smaller errors, with a maximum of 1.91%.
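    The two sub-networks described in the abstract can be illustrated with a short sketch. The code below is an illustration only, assuming a PyTorch implementation with made-up layer sizes and an assumed per-step feature dimension (FEATURE_DIM); the paper's exact architectures are those of Fig. 7 and Fig. 8, not these. It captures the two properties the abstract highlights: a linear output layer on the generative model so it can regress any step length, and a Sigmoid output layer on the discriminative model for the real/generated decision.

```python
# Hedged sketch of the generator/discriminator pairing described in the abstract.
# Layer widths and FEATURE_DIM are illustrative assumptions, not the paper's values.
import torch
import torch.nn as nn

FEATURE_DIM = 64   # assumed length of the per-step IMU feature vector


class Generator(nn.Module):
    """Maps IMU features of one detected step to an estimated step length (label)."""
    def __init__(self, feature_dim: int = FEATURE_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),            # linear output: unconstrained step length
        )

    def forward(self, x):
        return self.net(x)


class Discriminator(nn.Module):
    """Judges whether a (features, step length) pair carries a real or generated label."""
    def __init__(self, feature_dim: int = FEATURE_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim + 1, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),   # Sigmoid output: real/generated probability
        )

    def forward(self, x, step_length):
        # step_length is expected with shape (batch, 1) so it can be concatenated
        return self.net(torch.cat([x, step_length], dim=1))
```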

     

  • Figure 1. Pedestrian dead reckoning (PDR) algorithm based on an inertial measurement unit (IMU)

    Figure 2. Underground personnel movement model

    Figure 3. Three-axis accelerometer data during normal walking of underground personnel

    Figure 4. Step detection results

    Figure 5. Three types of coordinate systems

    Figure 6. Architecture of the generative adversarial network (GAN) model

    Figure 7. Architecture and parameters of the generative model

    Figure 8. Architecture and parameters of the discriminative model

    Figure 9. Experimental site

    Figure 10. Root mean square error of the algorithm under different learning rates/decay rates

    Figure 11. Root mean square error of the algorithm under different training set ratios

    Figure 12. Walking pace and step size estimation in one experiment

    Figure 13. Test playground

    Figure 14. Uphill and downhill test routes

    Figure 15. Step detection results on the playground and on the uphill/downhill route

    Table 1. Experimental dataset

    Dataset        Walking speed   Walking distance/m   Collections   Actual steps
    Training set   Slow            20                   50            1 639
                   Medium          20                   50            1 428
                   Fast            20                   50            1 104
    Test set       Mixed           50                   20            1 413

    Table 2. Hyperparameter options for the generative and discriminative models

    Hyperparameter            Options
    Optimizer                 SGD, RMSprop, Adam
    Learning rate $\alpha$    $10^{-4}$, $2\times10^{-4}$, $10^{-3}$, $2\times10^{-3}$
    Momentum $\beta_1$        0.5, 0.9
    Decay rate $\lambda$      $10^{-4}$, $10^{-3}$, $10^{-2}$, 0
    Sample proportion         1/10, 1/5, 1/4, 1/3, 1/2, 1

    Table 3. Default hyperparameters of the generative and discriminative models

    Hyperparameter            Generative model   Discriminative model
    Optimizer                 Adam               Adam
    Learning rate $\alpha$    $10^{-4}$          $10^{-4}$
    Momentum $\beta_1$        0.5                0.5
    Decay rate $\lambda$      $10^{-4}$          $10^{-4}$
    Sample proportion         1/2                1/2

    Table 4. Optimal hyperparameters of the generative and discriminative models

    Hyperparameter            Generative model   Discriminative model
    Optimizer                 Adam               Adam
    Learning rate $\alpha$    $10^{-4}$          $10^{-4}$
    Momentum $\beta_1$        0.5                0.5
    Decay rate $\lambda$      $10^{-4}$          0
    Sample proportion         1/2                1/2
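    To make the selected values concrete, the following sketch wires the Table 4 optima into optimizer setup and one joint adversarial training step. It assumes PyTorch, reuses the Generator/Discriminator classes from the earlier sketch, interprets the decay rate $\lambda$ as Adam's weight-decay term, and fills in details the tables do not specify (the second Adam momentum and a standard conditional-GAN loss pairing); it is not the paper's training code.

```python
# Hedged sketch: optimizers configured with the Table 4 optima (Adam, lr 1e-4,
# beta1 = 0.5, weight decay 1e-4 for the generator and 0 for the discriminator),
# plus one joint GAN update. beta2 = 0.999 and the loss pairing are assumptions.
import torch
import torch.nn as nn

G, D = Generator(), Discriminator()          # classes from the previous sketch
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4, betas=(0.5, 0.999), weight_decay=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4, betas=(0.5, 0.999), weight_decay=0.0)
bce = nn.BCELoss()


def train_step(features, true_length):
    """One update of D and G on a batch of (IMU features, measured step length) pairs.
    `true_length` must have shape (batch, 1)."""
    real = torch.ones(features.size(0), 1)
    fake = torch.zeros(features.size(0), 1)

    # Discriminator: separate real labels from labels produced by the generator.
    opt_d.zero_grad()
    d_loss = bce(D(features, true_length), real) + \
             bce(D(features, G(features).detach()), fake)
    d_loss.backward()
    opt_d.step()

    # Generator: make its step lengths indistinguishable from the real ones.
    opt_g.zero_grad()
    g_loss = bce(D(features, G(features)), real)
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```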

    Table 5. Performance comparison of the GAN and DNNs models

    Training set size   Model   Mean±STD/m    RMSE/m
    15                  GAN     49.86±1.12    1.13
                        DNNs    50.06±1.49    1.49
    30                  GAN     49.83±0.99    1.00
                        DNNs    50.05±1.35    1.35
    60                  GAN     49.90±0.84    0.84
                        DNNs    50.12±1.28    1.29
    120                 GAN     49.89±0.76    0.77
                        DNNs    50.16±1.25    1.26
    150                 GAN     49.86±0.74    0.74
                        DNNs    50.15±1.23    1.24
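    For reference, the Table 5 columns can be reproduced from per-trial distance estimates (the sum of predicted step lengths over a walk) as shown below. The trial values in this sketch are invented for demonstration; only the 50 m ground-truth walking distance comes from the experiment description.

```python
# Illustration only: computing the "Mean±STD/m" and "RMSE/m" columns of Table 5
# from hypothetical per-trial estimated walking distances against the 50 m truth.
import numpy as np

true_distance = 50.0                              # each test walk covered 50 m
est = np.array([49.7, 50.3, 49.9, 50.1, 49.6])    # hypothetical per-trial estimates

mean, std = est.mean(), est.std(ddof=1)                 # "Mean±STD/m" column
rmse = np.sqrt(np.mean((est - true_distance) ** 2))     # "RMSE/m" column
print(f"{mean:.2f}±{std:.2f} m, RMSE = {rmse:.2f} m")
```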

    Table 6. Final outdoor test results

    Scene            Trial   GAN person 1         GAN person 2         DNNs person 1        DNNs person 2
                             Distance/m  Error/%  Distance/m  Error/%  Distance/m  Error/%  Distance/m  Error/%
    Playground       1       362.64      1.13     363.34      1.33     366.36      2.17     367.30      2.43
                     2       364.76      1.72     364.51      1.65     369.95      3.17     367.32      2.44
                     3       362.56      1.11     364.85      1.75     371.64      3.64     369.59      3.07
                     4       362.59      1.12     364.26      1.58     370.89      3.43     368.96      2.90
                     5       364.46      1.64     364.82      1.74     365.29      1.87     370.34      3.28
    Uphill/downhill  1       440.56      3.74     440.74      3.78     443.30      4.38     447.32      5.33
                     2       439.19      3.41     442.13      4.10     449.99      5.95     442.66      4.23
                     3       438.18      3.17     439.39      3.46     442.76      4.25     445.99      5.01
                     4       444.00      4.54     442.89      4.28     444.81      4.74     442.80      4.26
                     5       439.19      3.41     441.56      3.97     444.21      4.59     442.88      4.28
Publication history
  • Received: 2024-02-26
  • Revised: 2024-06-25
  • Published online: 2024-07-10
