Artificial neural networks in time series forecasting: a comparative analysis

Héctor Allende; Claudio Moraga; Rodrigo Salas

Kybernetika (2002)

  • Volume: 38, Issue: 6, page [685]-707
  • ISSN: 0023-5954

Abstract

Artificial neural networks (ANN) have received a great deal of attention in many fields of engineering and science. Inspired by the study of brain architecture, ANN represent a class of non-linear models capable of learning from data. ANN have been applied in many areas where statistical methods are traditionally employed, including pattern recognition, classification, prediction and process control. The purpose of this paper is to discuss ANN and compare them to non-linear time series models. We begin by exploring recent developments in time series forecasting, with particular emphasis on the use of non-linear models. Thereafter we review recent results on ANN. The relevance of ANN models to statistical methods is assessed on time series prediction problems. Finally we construct asymptotic prediction intervals for ANN and show how to use these intervals to choose the number of nodes in the network.
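
As a concrete illustration of the closing point of the abstract, the sketch below fits a one-hidden-layer feedforward network to lagged values of a simulated non-linear series and forms an asymptotic (delta-method) prediction interval of the kind studied by Hwang and Ding (reference 24 below). This is a minimal Python sketch, not the authors' code: the synthetic series, the number of hidden nodes H, the training procedure and the 95% normal quantile are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-linear AR(1) series: x_t = 0.8*sin(x_{t-1}) + noise (assumed example).
n = 400
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * np.sin(x[t - 1]) + 0.1 * rng.standard_normal()

X, y = x[:-1, None], x[1:]       # one lag as input, next value as target
H = 3                            # number of hidden nodes (the quantity to be selected)

def unpack(theta):
    """Split the flat parameter vector into the network's weights."""
    W = theta[:H].reshape(H, 1)  # input-to-hidden weights
    b = theta[H:2 * H]           # hidden biases
    v = theta[2 * H:3 * H]       # hidden-to-output weights
    c = theta[-1]                # output bias
    return W, b, v, c

def forward(theta, X):
    """Network output for each row of X."""
    W, b, v, c = unpack(theta)
    Z = np.tanh(X @ W.T + b)     # hidden activations, shape (n, H)
    return Z @ v + c, Z

def jacobian(theta, X):
    """Gradient of the output w.r.t. all parameters, one row per sample."""
    W, b, v, c = unpack(theta)
    Z = np.tanh(X @ W.T + b)
    dZ = 1.0 - Z ** 2            # derivative of tanh
    dW = dZ * v * X              # d out / d W_h
    db = dZ * v                  # d out / d b_h
    dv = Z                       # d out / d v_h
    dc = np.ones((X.shape[0], 1))
    return np.hstack([dW, db, dv, dc])

# Crude least-squares fit by batch gradient descent (illustrative settings).
theta = 0.5 * rng.standard_normal(3 * H + 1)
for _ in range(20000):
    pred, _ = forward(theta, X)
    grad = jacobian(theta, X).T @ (pred - y) / len(y)
    theta -= 0.05 * grad

# Asymptotic prediction interval at a new input (delta method).
pred, _ = forward(theta, X)
resid = y - pred
p = len(theta)
sigma2 = resid @ resid / (len(y) - p)      # residual variance estimate
J = jacobian(theta, X)
JtJ_inv = np.linalg.pinv(J.T @ J)

x0 = np.array([[x[-1]]])                   # one-step-ahead forecast origin
f0, _ = forward(theta, x0)
g0 = jacobian(theta, x0)[0]
se = np.sqrt(sigma2 * (1.0 + g0 @ JtJ_inv @ g0))
lo, hi = f0[0] - 1.96 * se, f0[0] + 1.96 * se
print(f"forecast = {f0[0]:.3f}, 95% PI = [{lo:.3f}, {hi:.3f}]")

Refitting for several values of H and comparing the resulting intervals (their width and empirical coverage on held-out data) is one way to use prediction intervals for choosing the number of nodes, in the spirit of the abstract's final sentence.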

How to cite


Allende, Héctor, Moraga, Claudio, and Salas, Rodrigo. "Artificial neural networks in time series forecasting: a comparative analysis." Kybernetika 38.6 (2002): [685]-707. <http://eudml.org/doc/33612>.

@article{Allende2002,
abstract = {Artificial neural networks (ANN) have received a great deal of attention in many fields of engineering and science. Inspired by the study of brain architecture, ANN represent a class of non-linear models capable of learning from data. ANN have been applied in many areas where statistical methods are traditionally employed. They have been used in pattern recognition, classification, prediction and process control. The purpose of this paper is to discuss ANN and compare them to non-linear time series models. We begin exploring recent developments in time series forecasting with particular emphasis on the use of non-linear models. Thereafter we include a review of recent results on the topic of ANN. The relevance of ANN models for the statistical methods is considered using time series prediction problems. Finally we construct asymptotic prediction intervals for ANN and show how to use prediction intervals to choose the number of nodes in the ANN.},
author = {Allende, Héctor and Moraga, Claudio and Salas, Rodrigo},
journal = {Kybernetika},
keywords = {artificial neural network; non-linear time series model; prediction},
language = {eng},
number = {6},
pages = {[685]-707},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Artificial neural networks in time series forecasting: a comparative analysis},
url = {http://eudml.org/doc/33612},
volume = {38},
year = {2002},
}

TY - JOUR
AU - Allende, Héctor
AU - Moraga, Claudio
AU - Salas, Rodrigo
TI - Artificial neural networks in time series forecasting: a comparative analysis
JO - Kybernetika
PY - 2002
PB - Institute of Information Theory and Automation AS CR
VL - 38
IS - 6
SP - [685]
EP - 707
AB - Artificial neural networks (ANN) have received a great deal of attention in many fields of engineering and science. Inspired by the study of brain architecture, ANN represent a class of non-linear models capable of learning from data. ANN have been applied in many areas where statistical methods are traditionally employed. They have been used in pattern recognition, classification, prediction and process control. The purpose of this paper is to discuss ANN and compare them to non-linear time series models. We begin exploring recent developments in time series forecasting with particular emphasis on the use of non-linear models. Thereafter we include a review of recent results on the topic of ANN. The relevance of ANN models for the statistical methods is considered using time series prediction problems. Finally we construct asymptotic prediction intervals for ANN and show how to use prediction intervals to choose the number of nodes in the ANN.
LA - eng
KW - artificial neural network; non-linear time series model; prediction
UR - http://eudml.org/doc/33612
ER -

References

  1. Allende H., Galbiati J., Robust test in time series model, J. Interamerican Statist. Inst. 48 (1996), 1, 35–79 (1996) MR1648377
  2. Allende H., Heiler S., 10.1111/j.1467-9892.1992.tb00091.x, J. Time Ser. Anal. 13 (1992), 1–18 (1992) Zbl0850.62666MR1149267DOI10.1111/j.1467-9892.1992.tb00091.x
  3. Anderson B., Moore J., Optimal Filtering, Prentice Hall, Englewood Cliffs, N.J. 1979 Zbl1191.93133
  4. Baxt W. G., 10.1162/neco.1990.2.4.480, Neural Computational 2 (1990), 480–489 (1990) DOI10.1162/neco.1990.2.4.480
  5. Benitez J. M., Castro J. L., Requena J., 10.1109/72.623216, IEEE Trans. Neural Networks 8 (1997), 1156–1163 (1997) DOI10.1109/72.623216
  6. Beran J., Statistics for Long-memory Processes, Chapman and Hall, London 1994 Zbl0869.60045MR1304490
  7. Bowerman B. L., O’Connell R. T., Forecasting and Time Series: An Applied Approach, Third edition. Duxbury Press, 1993 Zbl0779.62087MR0635926
  8. Box G. E. P., Jenkins G. M., Reinsel G. C., Time Series Analysis, Forecasting and Control, Third edition. Prentice Hall, Englewood Cliffs, N.J. 1994 Zbl1154.62062MR1312604
  9. Breiman L., Friedman J., Olshen, R., Stone C. J., Classification and Regression Trees, Belmont, C. A. Wadsworth, 1984 Zbl0541.62042MR0726392
  10. Brockwell P. J., Davis R. A., Time Series Theory and Methods, Springer Verlag, New York 1991 Zbl1169.62074MR1093459
  11. Brown R. G., Smoothing, Forecasting and Prediction of Discrete Time Series, Prentice Hall, Englewood Cliffs, N.J. 1962 Zbl0192.25606
  12. Chatfield C., Forecasting in the 1990s, Statistician 46 (1997), 4, 461–473 (1997)
  13. Cheng B., Titterington D. M., 10.3902/jnns.1.e2, Statist. Sci. 9 (1994), 1, 2–54 (1994) MR1278678DOI10.3902/jnns.1.e2
  14. Connor J. T., Martin R. D., 10.1109/72.279188, IEEE Trans. Neural Networks 5 (1994), 2, 240–253 (1994) DOI10.1109/72.279188
  15. Crato N., Ray B. K., 10.1002/(SICI)1099-131X(199603)15:2<107::AID-FOR612>3.0.CO;2-D, J. Forecasting 15 (1996), 107–125 (1996) DOI10.1002/(SICI)1099-131X(199603)15:2<107::AID-FOR612>3.0.CO;2-D
  16. Fine T. L., Feedforward Neural Network Methodology, Springer, New York 1999 Zbl0963.68163MR1691898
  17. Flury B., Riedwyl H., Multivariate Statistics: A Practical Approach, Chapman Hall, London 1990 
  18. Friedman J. H., 10.1214/aos/1176347963, Ann. Statist. 19 (1991), 1–141 (1991) MR1091842DOI10.1214/aos/1176347963
  19. Funahashi K. I., 10.1016/0893-6080(89)90003-8, Neural Networks 2 (1989), 183–192 (1989) DOI10.1016/0893-6080(89)90003-8
  20. Han J., Moraga, C., Sinne S., 10.1016/0952-1976(95)00001-1, Engrg. Appl. Artificial Intelligence 9 (1996), 2, 109–119 (1996) DOI10.1016/0952-1976(95)00001-1
  21. Hornik K., Stinchcombe, M., White H., 10.1016/0893-6080(89)90020-8, Neural Networks 2 (1989), 359–366 (1989) DOI10.1016/0893-6080(89)90020-8
  22. Hristev R. M., Artificial Neural Networks, Preprint of a book obtained via Internet from the author, 1998 
  23. Hutchinson J. M., A Radial Basis Function Approach to Financial Time Series Analysis, Ph.D. Thesis. Massachusetts Institute of Technology, 1994 MR2716481
  24. Hwang J. T. G., Ding A. A., Prediction for artificial neural networks, J. Amer. Statist. Assoc. 92 (1997), 438, 748–757 (1997) MR1467864
  25. Lin J. L., Granger C. W., 10.1002/for.3980130102, J. Forecasting 13 (1994), 1–9 (1994) DOI10.1002/for.3980130102
  26. Lippmann R. P., An introduction to computing with neural nets, IEEE ASSP Magazine 4 (1987), 4–22 (1987)
  27. Ljung G. M., Box G. E. P., 10.1093/biomet/65.2.297, Biometrika 65 (1978), 297–303 (1978) DOI10.1093/biomet/65.2.297
  28. McCullagh P., Nelder J. A., Generalized Linear Models, Chapman Hall, London 1989 Zbl0744.62098MR0727836
  29. Meditch J. S., Stochastic Optimal Linear Estimation and Control, McGraw–Hill, New York 1969 Zbl0269.93061
  30. Moody J. E., Utans J., Architecture selection strategies for neural networks, In: Neural Networks in the Capital Markets (A. P. N. Refenes, ed.), Wiley, New York 1995
  31. Moraga C., Properties of parametric feedforward neural networks, In: XXIII Conferencia Latinoamericana de Informática, Valparaíso 1997, Vol. 2, pp. 861–870 (1997) 
  32. Pineda F. J., Generalization of Backpropagation to recurrent and higher order networks, In: Proc. IEEE Conf. Neural Inform. Proc. Syst., 1987 
  33. Poli R., Cagnoni S., Coppini, G., Valli G., A neural network expert system for diagnosing and treating hypertension, Computer (1991), 64–71 (1991)
  34. Refenes A. P. N., Zapranis A. D., 10.1002/(SICI)1099-131X(199909)18:5<299::AID-FOR725>3.0.CO;2-T, J. Forecasting 18 (1999), 299–322 (1999) DOI10.1002/(SICI)1099-131X(199909)18:5<299::AID-FOR725>3.0.CO;2-T
  35. Reinsel G. C., Elements of Multivariate Time Series Analysis, Springer Verlag, New York 1993 Zbl1047.62078MR1238940
  36. Ripley B. D., Statistical aspects of neural networks, In: Networks and Chaos – Statistical and Probabilistic Aspects (O. E. Barndorff–Nielsen, J. L. Jensen, and W. S. Kendall, eds.), Chapman and Hall, London 1993 Zbl0825.68531MR1314652
  37. Sarle W. S., Neural networks and statistical methods, In: Proc. of the 19th Annual SAS Users Group International Conference, 1994
  38. Smith J., Yadav S., 10.1016/0169-2070(94)90019-1, Internat. J. Forecasting 10 (1994), 507–514 (1994) DOI10.1016/0169-2070(94)90019-1
  39. Stern H. S., 10.1080/00401706.1996.10484497, Technometrics 38 (1996), 3, 205–214 (1996) Zbl0896.62098MR1411878DOI10.1080/00401706.1996.10484497
  40. Rao T. Subba, On the theory of bilinear models, J. Roy. Statist. Soc. Ser. B 43 (1981), 244–255 (1981) MR0626772
  41. Sussmann H. J., 10.1016/S0893-6080(05)80037-1, Neural Networks 5 (1992), 589–593 (1992) DOI10.1016/S0893-6080(05)80037-1
  42. Temme K. H., Heider, R., Moraga C., Generalized neural networks for fuzzy modeling, In: Proc. Internat. Conference of European Society of Fuzzy Logic and Technology, EUSFLAT’99 Palma de Mallorca 1999 
  43. Tong H., Non-linear Time Series, Oxford University Press, Oxford 1990 Zbl0835.62076
  44. Vapnik V., The Nature of Statistical Learning Theory, Springer Verlag, Berlin 1995 Zbl0934.62009MR1367965
  45. Vapnik V., Chervonenkis A., The necessary and sufficient conditions for consistency of the method of empirical risk minimization, Pattern Recognition Image Anal. 1 (1991), 284–305 (1991)
  46. Waibel A., Hanazawa T., Hinton G., Shikano, K., Lang K. J., 10.1109/29.21701, IEEE Trans. Acoust. Speech Signal Process. 37 (1989), 324–329 (1989) DOI10.1109/29.21701
  47. Wu F. Y., Yen K. K., Application of neural network in regression analysis, In: Proc. 14th Annual Conference on Computers and Industrial Engineering, 1992 
