Volume 2, Issue 3
A Survey of Deep Learning-Based Time Series Forecasting Methods
Time series forecasting, a core technique for analyzing the dynamic evolution patterns of historical data and inferring future states, provides critical decision support in domains such as economics, transportation, energy, and meteorology. With the growth of big data technologies and the exponential increase in computing power, deep learning, owing to its strong nonlinear fitting and automatic feature extraction capabilities, has overcome the limitations of traditional statistical models in handling high-dimensional, non-stationary data with complex spatiotemporal correlations, and has become the dominant paradigm for time series forecasting. This paper systematically reviews research progress on deep learning-based time series forecasting methods. First, it establishes a theoretical framework for the forecasting task, clarifying the task definition and a taxonomy of methods. Second, it analyzes the technical evolution and core mechanisms of four mainstream deep learning architectures: recurrent neural network (RNN)-based methods have evolved from the classic LSTM/GRU to the modern SegRNN and xLSTM, demonstrating the continued vitality of the recurrent inductive bias for modeling sequential dependencies; convolutional neural network (CNN)-based methods exploit dilated convolutions and multi-scale architectures (e.g., TimesNet), showing clear advantages in local feature extraction and parallel computational efficiency; Transformer-based methods, powered by the self-attention mechanism, dominate long-range dependency modeling and break through computational bottlenecks via architectural redesigns (e.g., iTransformer); and multilayer perceptron (MLP)-based methods achieve a balance of accuracy and efficiency with minimalist linear architectures (e.g., DLinear), challenging the necessity of complex models. The paper further surveys key challenges facing the field, including the robust handling of variable-length sequences, the lack of interpretability of black-box models, the computational cost of large-scale data, and limited cross-domain generalization. Finally, it outlines future research trends, identifying the construction of time series foundation models, multimodal pre-training paradigms, and lightweight deployment on edge devices as the next frontiers of innovation. Through this comprehensive review of existing methods and forward-looking analysis, the paper aims to inform both theoretical research and engineering applications in time series forecasting.
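To make the "minimalist linear architecture" idea concrete, the following is an illustrative sketch (not the authors' code) of a DLinear-style forecaster: the input window is split into a trend component (moving average) and a seasonal residual, and each component is mapped to the forecast horizon by a single linear layer. All names (`DLinearSketch`, `moving_average`) and the NumPy implementation are this sketch's own assumptions; the published model is trained end-to-end in a deep learning framework.

```python
import numpy as np

def moving_average(x, kernel=25):
    """Trend component via a centered moving average with edge padding,
    so the output has the same length as the input window."""
    pad = kernel // 2
    xp = np.concatenate([np.repeat(x[:1], pad), x,
                         np.repeat(x[-1:], kernel - 1 - pad)])
    return np.convolve(xp, np.ones(kernel) / kernel, mode="valid")

class DLinearSketch:
    """Decomposition + two linear maps: the entire model is
    2 * lookback * horizon parameters, with no recurrence or attention."""

    def __init__(self, lookback, horizon, kernel=25, seed=0):
        rng = np.random.default_rng(seed)
        self.kernel = kernel
        # one (horizon x lookback) linear map per component;
        # in practice these weights would be learned by gradient descent
        self.w_trend = rng.normal(0.0, 0.01, (horizon, lookback))
        self.w_season = rng.normal(0.0, 0.01, (horizon, lookback))

    def forward(self, x):
        trend = moving_average(x, self.kernel)   # smooth long-term movement
        season = x - trend                       # residual fluctuations
        # forecast = linear(trend) + linear(seasonal residual)
        return self.w_trend @ trend + self.w_season @ season

model = DLinearSketch(lookback=96, horizon=24)
x = np.sin(np.linspace(0, 8 * np.pi, 96))  # toy univariate input window
y_hat = model.forward(x)
print(y_hat.shape)  # (24,)
```

The point of the sketch is the parameter count: a direct lookback-to-horizon linear map per component, which is why such models are cited in the survey as challenging the necessity of far more complex architectures.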