Behavior recognition method for underground personnel based on fusion network
-
Abstract: Underground personnel behavior recognition is an important measure for ensuring safe production in coal mines. Existing research on underground personnel behavior recognition lacks analysis of the sensing mechanism and relies on a single feature-extraction technique. To address these problems, a behavior recognition method for underground personnel based on a fusion network is proposed. The method comprises three parts: data preprocessing, feature construction, and recognition network construction. Data preprocessing: the collected channel state information (CSI) data is processed with a CSI quotient model, subcarrier DC removal, and discrete wavelet denoising to reduce the influence of environmental and equipment noise. Feature construction: the processed data is transformed into images using Gramian angular summation/difference fields (GASF/GADF), preserving the spatial and temporal characteristics of the data. Recognition network construction: based on the characteristics of personnel actions, a fusion network composed of a gated recurrent unit (GRU) based encoder-decoder network and a multi-scale convolutional neural network (CNN) is proposed; the GRU preserves the correlation between successive data, while the weight-allocation strategy of the attention mechanism effectively extracts key features to improve recognition accuracy. Experimental results show that the average recognition accuracy of the method for eight actions, namely walking, taking off a hat, throwing things, sitting, smoking, waving, running, and sleeping, is 97.37%; recognition accuracy is highest for sleeping and sitting, while walking and running are the actions most prone to misjudgment. Using accuracy, precision, recall, and F1 score as evaluation indicators, the fusion network outperforms CNN and GRU alone, and its recognition accuracy exceeds that of the HAR, WiWave, and Wi-Sense systems. The average recognition accuracy for walking and taking off a hat at normal speed is 95.6%, higher than 93.6% for fast motion and 92.7% for slow motion. Recognition accuracy is highest when the distance between the transmitter and receiver is 2 m or 2.5 m.
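The preprocessing stage described above can be sketched as follows: an amplitude quotient between two receive antennas to cancel common hardware noise, per-subcarrier DC removal, and soft-threshold discrete wavelet denoising. This is a minimal illustration, assuming a packets-by-subcarriers array layout and the `db4` wavelet; neither detail is specified in the abstract.

```python
# Minimal sketch of the preprocessing pipeline (CSI quotient, subcarrier
# DC removal, discrete wavelet denoising). Shapes and wavelet choice are
# illustrative assumptions, not the paper's exact configuration.
import numpy as np
import pywt

def csi_quotient(csi_ant1, csi_ant2, eps=1e-8):
    """Amplitude quotient of two antennas' CSI to suppress common noise."""
    return np.abs(csi_ant1) / (np.abs(csi_ant2) + eps)

def remove_dc(subcarriers):
    """Subtract each subcarrier's mean (DC component) over time."""
    return subcarriers - subcarriers.mean(axis=0, keepdims=True)

def dwt_denoise(signal, wavelet="db4", level=3):
    """Soft-threshold wavelet denoising of a 1-D time series."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: 256 packets x 30 subcarriers of synthetic CSI from two antennas.
rng = np.random.default_rng(0)
csi1 = rng.normal(1.0, 0.1, (256, 30)) + 1j * rng.normal(0, 0.1, (256, 30))
csi2 = rng.normal(1.0, 0.1, (256, 30)) + 1j * rng.normal(0, 0.1, (256, 30))
q = remove_dc(csi_quotient(csi1, csi2))
clean = np.stack([dwt_denoise(q[:, k]) for k in range(q.shape[1])], axis=1)
print(clean.shape)  # (256, 30)
```

Each subcarrier is denoised independently here; the quotient model removes multiplicative noise shared by the two antennas before the additive noise is attacked by the wavelet stage.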
-
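The GASF/GADF feature construction mentioned in the abstract maps a time series to images: the series is rescaled into [-1, 1], converted to angles phi = arccos(x), and the summation field cos(phi_i + phi_j) and difference field sin(phi_i - phi_j) are formed. A minimal NumPy sketch:

```python
# Gramian angular summation/difference fields (GASF/GADF) for one series.
# The rescaling-to-[-1, 1] convention is the standard GAF definition.
import numpy as np

def gramian_angular_fields(series):
    x = np.asarray(series, dtype=float)
    # Rescale into [-1, 1] so that arccos is well defined.
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    gasf = np.cos(phi[:, None] + phi[None, :])  # summation field
    gadf = np.sin(phi[:, None] - phi[None, :])  # difference field
    return gasf, gadf

t = np.linspace(0, 2 * np.pi, 64)
gasf, gadf = gramian_angular_fields(np.sin(t))
print(gasf.shape, gadf.shape)  # (64, 64) (64, 64)
```

Because every pixel (i, j) combines the angles of time steps i and j, the resulting images keep both the temporal ordering along each axis and the pairwise (spatial) relations between time steps, which is why this encoding suits a CNN.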
Table 1. Parameters of the GRU-based encoder-decoder network

No.  Layer                    Output size
1    GRU                      256×512
2    GRU                      256×256
3    GRU                      128×256
4    GRU                      128×128
5    Transposed Convolution   128×128
6    Self-Attention           128×128
7    Transposed Convolution   128×256
8    Self-Attention           128×256
9    Transposed Convolution   128×512
10   Self-Attention           128×512
11   1D-Convolution           64×256
12   Flatten                  2048×1
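The structure in Table 1, GRU layers followed by transposed-convolution and self-attention stages and a final flatten, can be sketched in PyTorch as below. Layer widths and the head count are illustrative assumptions; the table's exact dimensions are not reproduced.

```python
# Hedged sketch of a GRU encoder with a transposed-convolution /
# self-attention decoder, in the spirit of Table 1. Sizes are illustrative.
import torch
import torch.nn as nn

class GRUEncoderDecoder(nn.Module):
    def __init__(self, in_dim=512, hidden=128, heads=4):
        super().__init__()
        self.enc = nn.GRU(in_dim, hidden, num_layers=2, batch_first=True)
        # Transposed convolution operates over the feature (channel) axis.
        self.up = nn.ConvTranspose1d(hidden, hidden, kernel_size=1)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.flatten = nn.Flatten()

    def forward(self, x):                      # x: (batch, time, features)
        h, _ = self.enc(x)                     # (batch, time, hidden)
        h = self.up(h.transpose(1, 2)).transpose(1, 2)
        h, _ = self.attn(h, h, h)              # self-attention over time steps
        return self.flatten(h)                 # (batch, time * hidden)

net = GRUEncoderDecoder()
out = net(torch.randn(2, 64, 512))
print(out.shape)  # torch.Size([2, 8192])
```

The GRU carries the correlation between successive packets forward in its hidden state, while the self-attention stage reweights time steps, matching the role the abstract assigns to the attention mechanism.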
Table 2. Parameters of the multi-scale CNN

No.  Layer        Kernel size  Kernels  Output size
1    ECA          -            -        512×1600
1-1  Con1-1       5            256      256×800
1-2  Con1-2       7            256      256×800
1-3  Con1-3       1            256      256×800
2-1  Con2-1       7            512      512×400
2-2  Con2-2       3            512      512×400
2-3  Con2-3       5            512      512×400
3-1  Con3-1       3            256      256×400
3-2  Con3-2       5            256      256×400
3-3  Con3-3       7            256      256×400
4-1  Pooling4-1   -            -        128×512
4-2  Pooling4-2   -            -        128×512
4-3  Pooling4-3   -            -        128×512
5-1  ECA          -            -        128×512
5-2  ECA          -            -        128×512
5-3  ECA          -            -        128×512
6    Flatten      -            -        2048×1
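The multi-scale idea behind Table 2, parallel convolution branches with different kernel sizes followed by pooling and ECA channel attention, can be sketched as follows. Channel counts and pooling sizes are illustrative assumptions, not the table's exact values.

```python
# Sketch of a multi-scale 1-D CNN with ECA-style channel attention.
# Parallel branches use kernel sizes 3/5/7 as in Table 2; widths are assumed.
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: a 1-D conv over the pooled channel vector."""
    def __init__(self, k=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):                      # x: (batch, channels, length)
        w = x.mean(dim=2, keepdim=True)        # global average pool -> (B, C, 1)
        w = self.conv(w.transpose(1, 2)).transpose(1, 2)
        return x * torch.sigmoid(w)            # reweight channels

class MultiScaleCNN(nn.Module):
    def __init__(self, in_ch=64, branch_ch=32):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Conv1d(in_ch, branch_ch, k, padding=k // 2),
                          nn.ReLU(), nn.MaxPool1d(2), ECA())
            for k in (3, 5, 7))
        self.flatten = nn.Flatten()

    def forward(self, x):                      # x: (batch, channels, length)
        return self.flatten(torch.cat([b(x) for b in self.branches], dim=1))

net = MultiScaleCNN()
out = net(torch.randn(2, 64, 128))
print(out.shape)  # torch.Size([2, 6144])
```

Different kernel sizes capture action features at different temporal scales; ECA then emphasizes the most informative channels before the branch outputs are concatenated and flattened for fusion with the GRU branch.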
Table 3. Experimental actions and corresponding potential hazardous behaviors

Action             Potential hazardous behavior
Walking            Entering a hazardous area
Taking off a hat   Removing a safety helmet
Throwing things    Discarding tools carelessly
Sitting            Resting in a hazardous area
Smoking            Smoking in violation of regulations
Waving             Fighting
Running            Dismounting a vehicle in violation of regulations
Sleeping           Sleeping in a hazardous area
Table 4. Recognition accuracy under different optimizers and learning rates

Learning rate   Accuracy/%
                AdaDelta   SGD     RMSProp   Adam
0.0001          95.54      94.75   96.33     97.01
0.001           93.43      94.37   94.64     97.37
0.01            96.45      94.88   93.41     92.15
0.05            92.97      92.64   90.72     92.89
0.1             88.25      89.76   91.45     91.99
-
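The evaluation indicators used above, accuracy, precision, recall, and F1 score, can be computed from a confusion matrix with macro averaging. A minimal NumPy sketch on toy labels (not the paper's data):

```python
# Macro-averaged accuracy/precision/recall/F1 from a confusion matrix.
import numpy as np

def macro_metrics(y_true, y_pred, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                          # rows: true, cols: predicted
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)
    recall = tp / np.maximum(cm.sum(axis=1), 1)
    f1 = np.where(precision + recall > 0,
                  2 * precision * recall / np.maximum(precision + recall, 1e-12),
                  0.0)
    accuracy = tp.sum() / cm.sum()
    return accuracy, precision.mean(), recall.mean(), f1.mean()

# Toy example with 3 classes.
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
acc, p, r, f1 = macro_metrics(y_true, y_pred, 3)
print(round(acc, 3))  # 0.667
```

Off-diagonal entries of the confusion matrix also reveal which action pairs are most easily confused, which is how results such as "walking vs. running is the most misjudged pair" are read off.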
[1] TAO Zhiyong, GUO Jing, LIU Ying. Efficient human behavior recognition method of CSI based on multi-antenna judgment[J]. Journal of Frontiers of Computer Science and Technology, 2021, 15(6): 1122-1132. doi: 10.3778/j.issn.1673-9418.2005021
[2] GU Yu, WANG Yantong, WANG Meng, et al. Secure user authentication leveraging keystroke dynamics via Wi-Fi sensing[J]. IEEE Transactions on Industrial Informatics, 2022, 18(4): 2784-2795. doi: 10.1109/TII.2021.3108850
[3] GORRINI A, MESSA F, CECCARELLI G, et al. Covid-19 pandemic and activity patterns in Milan. Wi-Fi sensors and location-based data[J]. TeMA - Journal of Land Use, Mobility and Environment, 2021, 14(2): 211-226.
[4] CHEN Liangqin, TIAN Liping, XU Zhimeng, et al. A survey of WiFi sensing techniques with channel state information[J]. ZTE Communications, 2020, 18(3): 57-63.
[5] MA Yongsen, ZHOU Gang, WANG Shuangquan. WiFi sensing with channel state information: a survey[J]. ACM Computing Surveys, 2019, 52(3): 1-36.
[6] FANG Yuanrun, XIAO Fu, SHENG Biyun, et al. Cross-scene passive human activity recognition using commodity WiFi[J]. Frontiers of Computer Science, 2022, 16: 1-11.
[7] ZHANG Lei, ZHANG Yue, BAO Rong, et al. A novel WiFi-based personnel behavior sensing with a deep learning method[J]. IEEE Access, 2022, 10: 120136-120145. doi: 10.1109/ACCESS.2022.3222381
[8] WEI Zhongcheng, ZHANG Xinqiu, LIAN Bin, et al. A survey on Wi-Fi signal based identification technology[J]. Chinese Journal on Internet of Things, 2021, 5(4): 107-119. doi: 10.11959/j.issn.2096-3750.2021.00213
[9] WANG Yan, LIU Jian, CHEN Yingying, et al. E-eyes: device-free location-oriented activity identification using fine-grained WiFi signatures[C]. Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, 2014: 617-628.
[10] YAN Huan, ZHANG Yong, WANG Yujie. WiAct: a passive WiFi-based human activity recognition system[J]. IEEE Sensors Journal, 2019, 20(1): 296-305.
[11] XIONG Xiaoqiao, FENG Xiufang, DING Yi. Research on hand gesture recognition method based on CSI[J]. Computer Applications and Software, 2022, 39(1): 181-187. doi: 10.3969/j.issn.1000-386x.2022.01.027
[12] ATITALLAH B B, ABBASI M B, BARIOUL R, et al. Simultaneous pressure sensors monitoring system for hand gestures recognition[C]. 2020 IEEE Sensors, Rotterdam, 2020: 1-4.
[13] CHU Xianzhi, LIU Jiang, SHIMAMOTO S. A sensor-based hand gesture recognition system for Japanese sign language[C]. 2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech), Nara, 2021: 311-312.
[14] YIN Kang, TANG Chengpei, ZHANG Xie, et al. Robust human activity recognition system with Wi-Fi using handcraft feature[C]. 2021 IEEE Symposium on Computers and Communications, Athens, 2021: 1-8.
[15] YU Bohan, WANG Yuxiang, NIU Kai, et al. WiFi-sleep: sleep stage monitoring using commodity Wi-Fi devices[J]. IEEE Internet of Things Journal, 2021, 8(18): 13900-13913. doi: 10.1109/JIOT.2021.3068798
[16] SOLIKHIN M, PRATAMA Y, PASARIBU P, et al. Analisis watermarking menggunakan metode discrete cosine transform (DCT) dan discrete fourier transform (DFT)[J]. Jurnal Sistem Cerdas, 2022, 5(3): 155-170.
[17] RAJASHEKHAR U, NEELAPPA D, RAJESH L. Electroencephalogram (EEG) signal classification for brain-computer interface using discrete wavelet transform (DWT)[J]. International Journal of Intelligent Unmanned Systems, 2022, 10(1): 86-97. doi: 10.1108/IJIUS-09-2020-0057
[18] CAN C, KAYA Y, KILIÇ F. A deep convolutional neural network model for hand gesture recognition in 2D near-infrared images[J]. Biomedical Physics & Engineering Express, 2021, 7(5). doi: 10.1088/2057-1976/ac0d91
[19] YU L, LI J, WANG T, et al. T2I-Net: time series classification via deep sequence-to-image transformation networks[C]. 2022 IEEE International Conference on Networking, Sensing and Control, Shanghai, 2022: 1-5.
[20] MOGHADDAM M G, SHIREHJINI A A N, SHIRMOHAMMADI S. A WiFi-based system for recognizing fine-grained multiple-subject human activities[C]. 2022 IEEE International Instrumentation and Measurement Technology Conference, Ottawa, 2022: 1-6.
[21] MEI Y, JIANG T, DING X, et al. WiWave: WiFi-based human activity recognition using the wavelet integrated CNN[C]. 2021 IEEE/CIC International Conference on Communications in China, Xiamen, 2021: 100-105.
[22] MUAAZ M, CHELLI A, GERDES M W, et al. Wi-Sense: a passive human activity recognition system using Wi-Fi and convolutional neural network and its integration in health information systems[J]. Annals of Telecommunications, 2022, 77(3): 163-175.