Citation: CHEN Zhiwen, CHEN Ailiangfei, TANG Xiaodan, et al. YOLOv5s pruning method for edge computing of coal mine safety monitoring[J]. Journal of Mine Automation, 2024, 50(7): 89-97. doi: 10.13272/j.issn.1671-251x.2024010095
[1] LUAN Hengxuan,XU Hao,TANG Wei,et al. Coal and gangue classification in actual environment of mines based on deep learning[J]. Measurement,2023,211. DOI: 10.1016/j.measurement.2023.112651.
[2] 王宇,于春华,陈晓青,等. 基于多模态特征融合的井下人员不安全行为识别[J]. 工矿自动化,2023,49(11):138-144.
WANG Yu,YU Chunhua,CHEN Xiaoqing,et al. Recognition of unsafe behaviors of underground personnel based on multi modal feature fusion[J]. Industry and Mine Automation,2023,49(11):138-144.
[3] 董昕宇,师杰,张国英. 基于参数轻量化的井下人体实时检测算法[J]. 工矿自动化,2021,47(6):71-78.
DONG Xinyu,SHI Jie,ZHANG Guoying. Real-time detection algorithm of underground human body based on lightweight parameters[J]. Industry and Mine Automation,2021,47(6):71-78.
[4] 许志,李敬兆,张传江,等. 轻量化CNN及其在煤矿智能视频监控中的应用[J]. 工矿自动化,2020,46(12):13-19.
XU Zhi,LI Jingzhao,ZHANG Chuanjiang,et al. Lightweight CNN and its application in coal mine intelligent video surveillance[J]. Industry and Mine Automation,2020,46(12):13-19.
[5] SHAO Linsong,ZUO Haorui,ZHANG Jianlin,et al. Filter pruning via measuring feature map information[J]. Sensors,2021,21(19). DOI: 10.3390/s21196601.
[6] LUO Jianhao,WU Jianxin. An entropy-based pruning method for CNN compression[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1706.05791v1.
[7] HE Yang,DING Yuhang,LIU Ping,et al. Learning filter pruning criteria for deep convolutional neural networks acceleration[C]. IEEE/CVF Conference on Computer Vision and Pattern Recognition,Seattle,2020:2006-2015.
[8] LI Hao,KADAV A,DURDANOVIC I,et al. Pruning filters for efficient ConvNets[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1608.08710v3.
[9] HE Yang,KANG Guoliang,DONG Xuanyi,et al. Soft filter pruning for accelerating deep convolutional neural networks[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1808.06866v1.
[10] SARVANI C H,RAM D S,MRINMOY G. UFKT:unimportant filters knowledge transfer for CNN pruning[J]. Neurocomputing,2022,514:101-112. doi: 10.1016/j.neucom.2022.09.150
[11] CHIN T W,DING Ruizhou,ZHANG Cha,et al. Towards efficient model compression via learned global ranking[C]. IEEE/CVF Conference on Computer Vision and Pattern Recognition,Seattle,2020:1515-1525.
[12] ZHANG Wei,WANG Zhiming. FPFS:filter-level pruning via distance weight measuring filter similarity[J]. Neurocomputing,2022,512:40-51. doi: 10.1016/j.neucom.2022.09.049
[13] HE Yang,LIU Ping,WANG Ziwei,et al. Filter pruning via geometric median for deep convolutional neural networks acceleration[C]. IEEE/CVF Conference on Computer Vision and Pattern Recognition,Long Beach,2019:4335-4344.
[14] FATEMEH B,MOHAMMAD A M. Evolutionary convolutional neural network for efficient brain tumor segmentation and overall survival prediction[J]. Expert Systems with Applications,2023,213. DOI: 10.1016/j.eswa.2022.118996.
[15] ALESSIA A,GIANLUCA B,FRANCESCO C,et al. Representation and compression of Residual Neural Networks through a multilayer network based approach[J]. Expert Systems with Applications,2023,215. DOI: 10.1016/j.eswa.2022.119391.
[16] ZHOU Hao,ALVAREZ J M,PORIKLI F. Less is more:towards compact CNNs[M]. Cham:Springer,2016.
[17] ÁLVAREZ J M,SALZMANN M. Learning the number of neurons in deep networks[J]. Neural Information Processing Systems,2016. DOI: 10.48550/arXiv.1611.06321.
[18] WEN Wei,WU Chunpeng,WANG Yandan,et al. Learning structured sparsity in deep neural networks[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1608.03665v4.
[19] LIU Zhuang,LI Jianguo,SHEN Zhiqiang,et al. Learning efficient convolutional networks through network slimming[C]. IEEE International Conference on Computer Vision,Venice,2017:2755-2763.
[20] HE Yihui,ZHANG Xiangyu,SUN Jian. Channel pruning for accelerating very deep neural networks[C]. IEEE International Conference on Computer Vision,Venice,2017:1398-1406.
[21] YOU Zhonghui,YAN Kun,YE Jinmian,et al. Gate decorator:global filter pruning method for accelerating deep convolutional neural networks[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1909.08174v1.
[22] MILTON M,BISHSHOY D,DUTTA R S,et al. Adaptive CNN filter pruning using global importance metrics[J]. Computer Vision and Image Understanding,2022,222. DOI: 10.1016/j.cviu.2022.103511.
[23] LIN Mingbao,JI Rongrong,WANG Yan,et al. HRank:filter pruning using high-rank feature map[C]. IEEE/CVF Conference on Computer Vision and Pattern Recognition,Seattle,2020:1526-1535.
[24] LUO Jianhao,WU Jianxin,LIN Weiyao. ThiNet:a filter level pruning method for deep neural network compression[C]. IEEE International Conference on Computer Vision,Venice,2017:5068-5076.
[25] CHANG Jingfei,LU Yang,XUE Ping,et al. Automatic channel pruning via clustering and swarm intelligence optimization for CNN[J]. Applied Intelligence,2022,52(15):17751-17771. doi: 10.1007/s10489-022-03508-1
[26] YU Ruichi,LI Ang,CHEN Chunfu,et al. NISP:pruning networks using neuron importance score propagation[C]. IEEE/CVF Conference on Computer Vision and Pattern Recognition,Salt Lake City,2018:9194-9203.
[27] ZHU M,GUPTA S. To prune,or not to prune:exploring the efficacy of pruning for model compression[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1710.01878v2.
[28] HAN Song,POOL J,TRAN J,et al. Learning both weights and connections for efficient neural network[J]. Neural Information Processing Systems,2015. DOI: 10.48550/arXiv.1506.02626.
[29] FRANKLE J,CARBIN M. The lottery ticket hypothesis:finding sparse,trainable neural networks[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1803.03635v5.
[30] GALE T,ELSEN E,HOOKER S. The state of sparsity in deep neural networks[EB/OL]. [2023-12-12]. https://arxiv.org/abs/1902.09574v1.
[31] MOSTAFA H,WANG Xin. Parameter efficient training of deep convolutional neural networks by dynamic sparse reparameterization[C]. 36th International Conference on Machine Learning,Long Beach,2019:4646-4655.
[32] EVERINGHAM M,GOOL L,WILLIAMS C K I,et al. The pascal visual object classes (VOC) challenge[J]. International Journal of Computer Vision,2010,88(2):303-338. doi: 10.1007/s11263-009-0275-4
[33] LI Bailin,WU Bowen,SU Jiang,et al. EagleEye:fast sub-net evaluation for efficient neural network pruning[C]. 16th European Conference on Computer Vision,Glasgow,2020:639-654.