A novel robust principal component analysis method for image and video processing

Guoqiang Huan; Ying Li; Zhanjie Song

Applications of Mathematics (2016)

  • Volume: 61, Issue: 2, pages 197-214
  • ISSN: 0862-7940

Abstract

Research on robust principal component analysis has attracted much attention recently. Generally, the model assumes sparse noise and characterizes the error term by the $\ell _1$-norm. In practice, however, the sparse noise exhibits a clustering effect, so simply using a single $\ell _p$-norm is not appropriate for modeling it. In this paper, we propose a novel method based on sparse Bayesian learning principles and Markov random fields. By enforcing the low-rank constraint through a matrix factorization formulation and incorporating the contiguity prior as a sparsity constraint, the method proves very effective for low-rank matrix recovery and contiguous outlier detection. Experiments on both synthetic data and practical computer vision applications show that the proposed method is competitive with other state-of-the-art methods.
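
For context, the modeling issue the abstract raises can be made concrete with the classical robust PCA (principal component pursuit) formulation, which splits an observed matrix into a low-rank part and a sparse error part penalized by the $\ell _1$-norm, and with a contiguity-aware variant that regularizes the outlier support by a Markov random field energy. The displays below are an illustrative sketch in generic notation ($D$, $L$, $S$, $\Omega$, $\lambda$, $\beta$, $r$ are symbols assumed for this sketch, not taken from the paper), not the authors' exact model.

\[
\min_{L,\,S}\ \|L\|_{*} + \lambda\,\|S\|_{1} \quad \text{subject to} \quad D = L + S,
\]

where $\|L\|_{*}$ is the nuclear norm (the sum of singular values) and $\|S\|_{1}$ the entrywise $\ell _1$-norm. A contiguity prior is commonly added by introducing a binary outlier support $\Omega$ and a first-order Ising/MRF energy over neighbouring entries $\mathcal{E}$:

\[
\min_{L,\,\Omega}\ \frac{1}{2}\sum_{(i,j)\notin\Omega}\bigl(D_{ij}-L_{ij}\bigr)^{2} \;+\; \lambda\,|\Omega| \;+\; \beta \sum_{((i,j),(k,l))\in\mathcal{E}} \mathbf{1}\bigl[\Omega_{ij}\neq\Omega_{kl}\bigr] \quad \text{with} \quad \operatorname{rank}(L)\le r,
\]

where the rank constraint is typically enforced through a factorization $L = UV^{\top}$ with $U\in\mathbb{R}^{m\times r}$ and $V\in\mathbb{R}^{n\times r}$, and the pairwise MRF term favours spatially connected outlier regions rather than isolated entries.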

How to cite


Huan, Guoqiang, Li, Ying, and Song, Zhanjie. "A novel robust principal component analysis method for image and video processing." Applications of Mathematics 61.2 (2016): 197-214. <http://eudml.org/doc/276756>.

@article{Huan2016,
abstract = {Research on robust principal component analysis has attracted much attention recently. Generally, the model assumes sparse noise and characterizes the error term by the $\ell _1$-norm. In practice, however, the sparse noise exhibits a clustering effect, so simply using a single $\ell _p$-norm is not appropriate for modeling it. In this paper, we propose a novel method based on sparse Bayesian learning principles and Markov random fields. By enforcing the low-rank constraint through a matrix factorization formulation and incorporating the contiguity prior as a sparsity constraint, the method proves very effective for low-rank matrix recovery and contiguous outlier detection. Experiments on both synthetic data and practical computer vision applications show that the proposed method is competitive with other state-of-the-art methods.},
author = {Huan, Guoqiang and Li, Ying and Song, Zhanjie},
journal = {Applications of Mathematics},
keywords = {robust principal component analysis; sparse Bayesian learning; Markov random fields; matrix factorization; contiguity prior},
language = {eng},
number = {2},
pages = {197-214},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {A novel robust principal component analysis method for image and video processing},
url = {http://eudml.org/doc/276756},
volume = {61},
year = {2016},
}

TY - JOUR
AU - Huan, Guoqiang
AU - Li, Ying
AU - Song, Zhanjie
TI - A novel robust principal component analysis method for image and video processing
JO - Applications of Mathematics
PY - 2016
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 61
IS - 2
SP - 197
EP - 214
AB - Research on robust principal component analysis has attracted much attention recently. Generally, the model assumes sparse noise and characterizes the error term by the $\ell _1$-norm. In practice, however, the sparse noise exhibits a clustering effect, so simply using a single $\ell _p$-norm is not appropriate for modeling it. In this paper, we propose a novel method based on sparse Bayesian learning principles and Markov random fields. By enforcing the low-rank constraint through a matrix factorization formulation and incorporating the contiguity prior as a sparsity constraint, the method proves very effective for low-rank matrix recovery and contiguous outlier detection. Experiments on both synthetic data and practical computer vision applications show that the proposed method is competitive with other state-of-the-art methods.
LA - eng
KW - robust principal component analysis; sparse Bayesian learning; Markov random fields; matrix factorization; contiguity prior
UR - http://eudml.org/doc/276756
ER -
