A tunneling face temperature prediction model based on ensemble learning-enhanced BPNN


Abstract: To address the weak generalization ability, poor robustness, and limited capacity for nonlinear multidimensional data of existing tunneling face temperature prediction methods, a tunneling face temperature prediction model based on an ensemble learning-enhanced Back Propagation Neural Network (BPNN), namely t-SNE-BPNN-AdaBoost, was proposed. First, the t-Distributed Stochastic Neighbor Embedding (t-SNE) nonlinear dimensionality reduction technique was adopted to reduce seven high-dimensional features, including air volume, temperature, and relative humidity in front of the ventilator, to three dimensions, retaining the local structure of the data and removing noise. Then, the reduced-dimensional data were input into a BPNN as the base learner, and a preliminary model was obtained through iterative training. Finally, ensemble learning was carried out with Adaptive Boosting (AdaBoost), in which multiple weak BPNN learners were iteratively trained and combined into a strong learner by weighted integration, thereby enhancing the generalization ability of the model. Sixty sets of measured tunneling face data were divided into training and testing sets at a ratio of 8:2, and 5-fold cross-validation determined the optimal number of AdaBoost weak learners to be 30. The experimental results showed that: ① The prediction curve of t-SNE-BPNN-AdaBoost fit the true values best, with the smallest overall error, strong adaptability in sudden temperature change intervals, and stability far superior to the Support Vector Machine (SVM), BPNN, and t-SNE-BPNN. ② The relative prediction error of t-SNE-BPNN-AdaBoost was the smallest, almost entirely within 5%, demonstrating the best prediction accuracy. ③ On the test set, the coefficient of determination of t-SNE-BPNN-AdaBoost was 0.9784, an improvement of 60.3%, 17.2%, and 8.1% over SVM, BPNN, and t-SNE-BPNN, respectively. The Mean Absolute Error (MAE) was 0.1676, the Mean Squared Error (MSE) was 0.0567, and the Mean Absolute Percentage Error (MAPE) was 0.9640. All metrics were significantly better than those of SVM, BPNN, and t-SNE-BPNN, and the adaptability in sudden temperature change intervals was stronger.

     
