AUTHOR=Yang Mingxia, Xu Xiaojie, Cheng Huayan, Zhan Zhidan, Xu Zhongshen, Tong Lianghuai, Fang Kai, Ahmed Ahmedin M. TITLE=Industrial steam consumption analysis and prediction based on multi-source sensing data for sustainable energy development JOURNAL=Frontiers in Environmental Science VOLUME=11 YEAR=2023 URL=https://www.frontiersin.org/journals/environmental-science/articles/10.3389/fenvs.2023.1187201 DOI=10.3389/fenvs.2023.1187201 ISSN=2296-665X ABSTRACT=
Centralized heating is an energy-saving, environmentally friendly approach that is strongly promoted by the state: it improves energy utilization and reduces carbon emissions. However, centralized heating depends on accurate heat demand forecasting. On the one hand, overproduction wastes energy; on the other hand, insufficient capacity fails to meet the heat demand of enterprises. It is therefore necessary to forecast the future trend of heat consumption, so as to provide a reliable basis for enterprises to deploy fuel stocks and boiler power reasonably. At the same time, enterprises' steam consumption must also be analyzed and monitored for anomalies, in order to detect pipeline leakage and gas theft. Because of the nonlinear characteristics of heat load, traditional forecasting methods have difficulty capturing its trends; it is therefore necessary to study the characteristics of heat loads and explore suitable heat load prediction models. In this paper, the industrial steam consumption of a paper manufacturer is taken as an example. The steam consumption data are first analyzed for periodicity to study their time-series characteristics; steam consumption prediction models are then established based on the ARIMA model and an LSTM neural network, respectively, with predictions carried out at both minute and hour granularity. The experimental results show that the LSTM neural network has clear advantages for this steam consumption load prediction task and can meet the needs of heat load prediction.
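The forecasting workflow the abstract describes (study the periodicity of a consumption series, fit an autoregressive model, forecast future load) can be sketched in miniature. The snippet below is not the authors' data or code: it uses an invented periodic "steam consumption" series and a plain least-squares AR(p) fit as a simplified stand-in for the paper's ARIMA model, just to illustrate the fit-then-forecast loop.

```python
import numpy as np

def fit_ar(y, p):
    """Fit an AR(p) model by ordinary least squares.

    Returns coefficients as [intercept, lag-1, lag-2, ..., lag-p]."""
    n = len(y)
    # Each column j holds the lag-(j+1) values aligned with the targets y[p:].
    X = np.column_stack([y[p - 1 - j : n - 1 - j] for j in range(p)])
    X = np.column_stack([np.ones(n - p), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

def forecast_ar(y, coef, steps):
    """Roll the fitted AR model forward, feeding predictions back in."""
    p = len(coef) - 1
    hist = list(y[-p:])
    out = []
    for _ in range(steps):
        lags = hist[::-1][:p]  # most recent value first, matching coef order
        nxt = coef[0] + float(np.dot(coef[1:], lags))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

# Synthetic minute-level "steam consumption": a daily-style cycle plus noise.
rng = np.random.default_rng(0)
t = np.arange(600)
steam = 50 + 10 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 1, t.size)

coef = fit_ar(steam[:540], p=4)            # train on the first 540 points
pred = forecast_ar(steam[:540], coef, 60)  # forecast the held-out 60 points
mae = np.mean(np.abs(pred - steam[540:]))  # simple accuracy check
```

An LSTM variant of the same loop would replace `fit_ar`/`forecast_ar` with a recurrent network trained on sliding windows of the series (e.g. via Keras); the paper's finding is that the LSTM captures the nonlinear load better than the linear model at both time scales.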