Bayesian estimation of mixtures with dynamic transitions and known component parameters

Ivan Nagy; Evgenia Suzdaleva; Miroslav Kárný

Kybernetika (2011)

  • Volume: 47, Issue: 4, page 572-594
  • ISSN: 0023-5954

Abstract

Probabilistic mixtures provide a flexible “universal” approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role, as it runs “naturally” in one-pass mode. This is important in on-line applications and/or with extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, quasi-Bayesian estimation relies on mixing via constant component weights. Thus, mixtures with dynamic components and dynamic transitions between them are not supported. The present paper fills this gap. For the sake of simplicity, and to give better insight into the task, the paper considers mixtures with known components. A general case with unknown components will be presented soon.
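The setting in the abstract, a mixture whose active component follows a Markov chain with an unknown transition matrix but fully known component densities, admits a simple one-pass quasi-Bayes recursion: predict the component weights through the current transition-matrix estimate, update them by the component likelihoods, and use the resulting responsibilities to update Dirichlet counts on the transition rows. The sketch below is an illustration of that generic recursion, not the paper's own algorithm or notation; the Gaussian component parameters, the true transition matrix `A_true`, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known component parameters (hypothetical 1-D Gaussian components).
means = np.array([-2.0, 0.0, 3.0])
stds = np.array([0.7, 0.5, 1.0])
K = len(means)

def component_likelihoods(y):
    """Likelihood of observation y under each known component."""
    return np.exp(-0.5 * ((y - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))

# Dirichlet prior counts for the rows of the unknown transition matrix,
# and an initial (flat) filtered distribution over the active component.
V = np.ones((K, K))
w_prev = np.full(K, 1.0 / K)

def quasi_bayes_step(y, V, w_prev):
    A_hat = V / V.sum(axis=1, keepdims=True)  # point estimate of transitions
    w_pred = w_prev @ A_hat                   # one-step-ahead component weights
    q = w_pred * component_likelihoods(y)     # data update
    q /= q.sum()                              # responsibilities at time t
    V = V + np.outer(w_prev, q)               # quasi-Bayes update of counts
    return V, q

# Simulate a switching sequence and run the one-pass recursion.
A_true = np.array([[0.90, 0.08, 0.02],
                   [0.05, 0.90, 0.05],
                   [0.02, 0.08, 0.90]])
state = 0
for _ in range(2000):
    state = rng.choice(K, p=A_true[state])
    y = rng.normal(means[state], stds[state])
    V, w_prev = quasi_bayes_step(y, V, w_prev)

A_est = V / V.sum(axis=1, keepdims=True)
print(np.round(A_est, 2))
```

The key approximation is the count update `np.outer(w_prev, q)`: instead of an exact (exponentially growing) posterior over switching histories, each transition cell is incremented by the product of the filtered responsibilities at consecutive times, which keeps the recursion strictly one-pass.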

How to cite


Nagy, Ivan, Suzdaleva, Evgenia, and Kárný, Miroslav. "Bayesian estimation of mixtures with dynamic transitions and known component parameters." Kybernetika 47.4 (2011): 572-594. <http://eudml.org/doc/196626>.

@article{Nagy2011,
abstract = {Probabilistic mixtures provide a flexible “universal” approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role, as it runs “naturally” in one-pass mode. This is important in on-line applications and/or with extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, quasi-Bayesian estimation relies on mixing via constant component weights. Thus, mixtures with dynamic components and dynamic transitions between them are not supported. The present paper fills this gap. For the sake of simplicity, and to give better insight into the task, the paper considers mixtures with known components. A general case with unknown components will be presented soon.},
author = {Nagy, Ivan and Suzdaleva, Evgenia and Kárný, Miroslav},
journal = {Kybernetika},
keywords = {mixture model; Bayesian estimation; approximation; clustering; classification},
language = {eng},
number = {4},
pages = {572-594},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Bayesian estimation of mixtures with dynamic transitions and known component parameters},
url = {http://eudml.org/doc/196626},
volume = {47},
year = {2011},
}

TY - JOUR
AU - Nagy, Ivan
AU - Suzdaleva, Evgenia
AU - Kárný, Miroslav
TI - Bayesian estimation of mixtures with dynamic transitions and known component parameters
JO - Kybernetika
PY - 2011
PB - Institute of Information Theory and Automation AS CR
VL - 47
IS - 4
SP - 572
EP - 594
AB - Probabilistic mixtures provide a flexible “universal” approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role, as it runs “naturally” in one-pass mode. This is important in on-line applications and/or with extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, quasi-Bayesian estimation relies on mixing via constant component weights. Thus, mixtures with dynamic components and dynamic transitions between them are not supported. The present paper fills this gap. For the sake of simplicity, and to give better insight into the task, the paper considers mixtures with known components. A general case with unknown components will be presented soon.
LA - eng
KW - mixture model; Bayesian estimation; approximation; clustering; classification
UR - http://eudml.org/doc/196626
ER -

