Abstract: This article interprets "Historical Inertia: An Ignored but Powerful Baseline for Long Sequence Time-series Forecasting", a short paper published at CIKM'21 by the Huawei Cloud Database Innovation Lab and the Data and Intelligence Laboratory of the University of Electronic Science and Technology of China. The paper proposes a simple baseline for long sequence time-series forecasting.
This article is shared from the Huawei Cloud community post "CIKM'21 Historical Inertia Paper Interpretation", by the Cloud Database Innovation Lab.
Introduction
This article (Historical Inertia: An Ignored but Powerful Baseline for Long Sequence Time-series Forecasting) is a short paper published at CIKM'21 by the Huawei Cloud Database Innovation Lab and the Data and Intelligence Laboratory of the University of Electronic Science and Technology of China. It proposes a baseline for long sequence time-series forecasting. CIKM is one of the top academic conferences in the field of information retrieval and data mining. This edition of the conference received 626 short paper submissions and accepted 177, an acceptance rate of about 28%. This paper is one of the key technical achievements of the Cloud Database Innovation Lab in the area of time series analysis.
1 Summary
Long Sequence Time-series Forecasting (LSTF) has become increasingly popular due to its wide range of applications. Although a large number of complex models have been proposed to improve forecasting effectiveness and efficiency, they neglect or underestimate one of the most natural and basic properties of time series: historical inertia. In this paper, we propose a new LSTF baseline, Historical Inertia (HI), which directly uses the historical data points closest to the prediction target in the input time series as the prediction. We evaluate HI on four public LSTF data sets and two LSTF tasks. The results show that, compared with SOTA work, HI achieves a relative improvement of up to 82%. We also discuss the possibility of combining HI with existing methods.
2 HI
HI directly uses the historical data points closest to the prediction target, i.e. the most recent points of the input time series (as many as the forecast horizon), as the prediction.
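As a concrete illustration, here is a minimal sketch of the HI baseline in Python/NumPy. The function name and the assumption that the input window is at least as long as the forecast horizon are ours; the core idea of copying the most recent points follows the description above.

```python
import numpy as np

def historical_inertia(x, horizon):
    """Historical Inertia (HI) baseline: the prediction is simply the most
    recent `horizon` points of the input window, i.e. the historical values
    closest to the forecast target.

    x:        array of shape (input_length,) or (input_length, n_variables)
    horizon:  number of future steps to predict (must be <= input_length)
    """
    x = np.asarray(x)
    assert horizon <= len(x), "input window must cover the forecast horizon"
    return x[-horizon:]

# Example: predict the next 4 steps of a univariate series
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
print(historical_inertia(x, horizon=4))   # [3. 4. 5. 6.]
```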
3 Experiments
3.1 Univariate long-term series prediction results
For univariate long-term sequence prediction tasks, HI is significantly better than the SOTA models on the ETTh1 and ETTm1 data sets. Informer and its variants dominate on the ETTh2 data set. On the Electricity data set, HI, Informer, and DeepAR all perform well. Overall, HI achieves a relative improvement of up to 80% in MSE and 58% in MAE.
3.2 Multivariate long-term series prediction results
For multivariate long-term sequence prediction tasks, HI is significantly better than the SOTA models on most prediction tasks across the four data sets, bringing a relative improvement of up to 82%.
4 Discussion
4.1 Why HI works so well
We considered the reasons why HI can achieve good results from two perspectives:
- Value range: HI guarantees that the predicted sequence and the ground-truth sequence lie in a similar numerical range.
- Periodicity: for data sets that are periodic with relatively short periods, HI keeps the predicted sequence roughly in phase with the ground-truth sequence (see the synthetic sketch after this list).
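The periodicity point can be made concrete with a small synthetic experiment (ours, not from the paper): on a pure sine wave, a forecast horizon that is a multiple of the period keeps the copied window in phase with the future, so HI's error is near zero, while a mismatched horizon introduces a phase shift and a larger error.

```python
import numpy as np

def historical_inertia(x, horizon):
    # Use the most recent `horizon` points of the input as the forecast.
    return np.asarray(x)[-horizon:]

t = np.arange(0, 400)
period = 24                        # e.g. a daily cycle sampled hourly
series = np.sin(2 * np.pi * t / period)

input_len, split = 96, 300
history = series[split - input_len:split]

for horizon in (24, 48, 30):       # 24 and 48 are multiples of the period, 30 is not
    pred = historical_inertia(history, horizon)
    true = series[split:split + horizon]
    mse = np.mean((pred - true) ** 2)
    print(f"horizon={horizon:3d}  MSE={mse:.4f}")
# Horizons that are multiples of the period stay in phase (MSE ~ 0);
# the mismatched horizon introduces a phase shift and a larger error.
```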
4.2 How to use HI
We propose two possible directions for using HI:
- Hybrid model: fuse HI with other models, for example by simply weighting HI's output together with the model's output as a post-processing trick.
- Automatic machine learning (AutoML): in some cases complex models do not perform well, so the model structure can be adapted to the data, reducing or increasing model complexity as appropriate.
For the hybrid-model direction, we designed a simple experiment for verification: the outputs of HI and a 2-layer MLP are averaged to produce the final prediction. The results show that the MLP fused with HI achieves more accurate predictions, and the advantage is more pronounced on univariate long-term sequence prediction tasks.
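A minimal sketch of this fusion, assuming a PyTorch implementation; the hidden size, activation, and class/function names are illustrative choices rather than the paper's exact configuration (the paper only states that the HI and 2-layer MLP outputs are averaged).

```python
import torch
import torch.nn as nn

class TwoLayerMLP(nn.Module):
    # A simple 2-layer MLP forecaster mapping the input window to the horizon.
    # Hidden size and activation are illustrative assumptions.
    def __init__(self, input_len, horizon, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_len, hidden),
            nn.ReLU(),
            nn.Linear(hidden, horizon),
        )

    def forward(self, x):                      # x: (batch, input_len)
        return self.net(x)

def hybrid_forecast(mlp, x, horizon):
    """Average the HI prediction with the MLP prediction (equal weights)."""
    hi_pred = x[:, -horizon:]                  # HI: copy the most recent points
    mlp_pred = mlp(x)                          # learned forecast
    return 0.5 * (hi_pred + mlp_pred)

# Usage sketch (untrained weights, just to show the shapes):
input_len, horizon = 96, 24
mlp = TwoLayerMLP(input_len, horizon)
x = torch.randn(8, input_len)                  # a batch of 8 input windows
print(hybrid_forecast(mlp, x, horizon).shape)  # torch.Size([8, 24])
```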
Official website of the Huawei Cloud Database Innovation Lab: https://www.huaweicloud.com/lab/clouddb/home.html