Replicant compression coding in Besov spaces

Gérard Kerkyacharian; Dominique Picard

ESAIM: Probability and Statistics (2003)

  • Volume: 7, pages 239-250
  • ISSN: 1292-8100

Abstract

We present here a new proof of the theorem of Birman and Solomyak on the metric entropy of the unit ball of a Besov space $B^s_{\pi,q}$ on a regular domain of $\mathbb{R}^d$. The result is: if $s - d(1/\pi - 1/p)_+ > 0$, then the Kolmogorov metric entropy satisfies $H(\epsilon) \sim \epsilon^{-d/s}$. This proof takes advantage of the representation of such spaces on wavelet-type bases and extends the result to more general spaces. The lower bound is a consequence of very simple probabilistic exponential inequalities. To prove the upper bound, we provide a new universal coding based on a thresholding-quantizing procedure using replication.
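
To make the statement explicit, here is a short gloss using the standard definitions; the gloss is ours, not text from the paper, and in particular the $L^p$ norm as the ambient metric is implied by the appearance of $p$ in the condition rather than stated in the abstract. For the unit ball $B$ of $B^s_{\pi,q}$, viewed as a subset of $L^p$ on a regular bounded domain of $\mathbb{R}^d$, set

\[
N(\epsilon) = \min\Big\{ n \ge 1 : \exists\, f_1,\dots,f_n \in L^p,\ \ B \subset \bigcup_{i=1}^{n} \{ f : \| f - f_i \|_{L^p} \le \epsilon \} \Big\},
\qquad
H(\epsilon) = \log_2 N(\epsilon).
\]

The theorem of Birman and Solomyak, as recalled in the abstract, then reads

\[
s - d\Big( \tfrac{1}{\pi} - \tfrac{1}{p} \Big)_+ > 0
\quad \Longrightarrow \quad
H(\epsilon) \asymp \epsilon^{-d/s} \quad (\epsilon \to 0),
\]

where $x_+ = \max(x,0)$ and $\asymp$ (written $\sim$ in the abstract) means bounded above and below by positive constants independent of $\epsilon$.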

How to cite


Kerkyacharian, Gérard, and Picard, Dominique. "Replicant compression coding in Besov spaces." ESAIM: Probability and Statistics 7 (2003): 239-250. <http://eudml.org/doc/244843>.

@article{Kerkyacharian2003,
abstract = {We present here a new proof of the theorem of Birman and Solomyak on the metric entropy of the unit ball of a Besov space $B^s_{\pi,q}$ on a regular domain of $\mathbb{R}^d$. The result is: if $s - d(1/\pi - 1/p)_+ > 0$, then the Kolmogorov metric entropy satisfies $H(\epsilon) \sim \epsilon^{-d/s}$. This proof takes advantage of the representation of such spaces on wavelet-type bases and extends the result to more general spaces. The lower bound is a consequence of very simple probabilistic exponential inequalities. To prove the upper bound, we provide a new universal coding based on a thresholding-quantizing procedure using replication.},
author = {Kerkyacharian, Gérard and Picard, Dominique},
journal = {ESAIM: Probability and Statistics},
keywords = {entropy; coding; Besov spaces; wavelet bases; replication},
language = {eng},
pages = {239-250},
publisher = {EDP-Sciences},
title = {Replicant compression coding in Besov spaces},
url = {http://eudml.org/doc/244843},
volume = {7},
year = {2003},
}

TY - JOUR
AU - Kerkyacharian, Gérard
AU - Picard, Dominique
TI - Replicant compression coding in Besov spaces
JO - ESAIM: Probability and Statistics
PY - 2003
PB - EDP-Sciences
VL - 7
SP - 239
EP - 250
AB - We present here a new proof of the theorem of Birman and Solomyak on the metric entropy of the unit ball of a Besov space $B^s_{\pi,q}$ on a regular domain of $\mathbb{R}^d$. The result is: if $s - d(1/\pi - 1/p)_+ > 0$, then the Kolmogorov metric entropy satisfies $H(\epsilon) \sim \epsilon^{-d/s}$. This proof takes advantage of the representation of such spaces on wavelet-type bases and extends the result to more general spaces. The lower bound is a consequence of very simple probabilistic exponential inequalities. To prove the upper bound, we provide a new universal coding based on a thresholding-quantizing procedure using replication.
LA - eng
KW - entropy; coding; Besov spaces; wavelet bases; replication
UR - http://eudml.org/doc/244843
ER -

References

  [1] P. Assouad, Deux remarques sur l'estimation. C. R. Acad. Sci. Paris Sér. I Math. 296 (1983) 1021-1024. Zbl 0568.62003
  [2] L. Birgé, Sur un théorème de minimax et son application aux tests. Probab. Math. Statist. 3 (1984) 259-282. Zbl 0571.62036, MR 764150
  [3] L. Birgé and P. Massart, An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1-36. Zbl 1004.41006, MR 1848840
  [4] M.S. Birman and M.Z. Solomiak, Piecewise-polynomial approximation of functions of the classes $W_p^\alpha$. Mat. Sbornik 73 (1967) 295-317. Zbl 0173.16001, MR 217487
  [5] A. Cohen, R. DeVore and W. Dahmen, Multiscale methods on bounded domains. Trans. AMS 352 (2000) 3651-3685. Zbl 0945.42018, MR 1458320
  [6] A. Cohen, W. Dahmen, I. Daubechies and R. DeVore, Tree approximation and optimal encoding. Appl. Comput. Harmon. Anal. 11 (2001) 192-226. Zbl 0992.65151, MR 1848303
  [7] T.A. Cover and J.A. Thomas, Elements of Information Theory. Wiley Interscience (1991). Zbl 0762.94001, MR 1122806
  [8] B. Delyon and A. Juditski, On minimax wavelet estimators. Appl. Comput. Harmon. Anal. 3 (1996) 215-228. Zbl 0865.62023, MR 1400080
  [9] R. DeVore, R. Kyriazis and P. Wang, Multiscale characterization of Besov spaces on bounded domains. J. Approx. Theory 93 (1998) 273-292. Zbl 0922.46029, MR 1616793
  [10] R. DeVore, Nonlinear approximation. Cambridge University Press, Acta Numer. 7 (1998) 51-150. Zbl 0931.65007, MR 1689432
  [11] R. DeVore and G. Lorentz, Constructive Approximation. Springer-Verlag, New York (1993). Zbl 0797.41016, MR 1261635
  [12] D.L. Donoho, Unconditional bases and bit-level compression. Appl. Comput. Harmon. Anal. 3 (1996) 388-392. Zbl 0936.62004, MR 1420507
  [13] W. Hoeffding, Probability inequalities for sums of bounded random variables. J. Amer. Statist. Assoc. 58 (1963) 13-30. Zbl 0127.10602, MR 144363
  [14] W. Härdle, G. Kerkyacharian, D. Picard and A. Tsybakov, Wavelets, Approximation and Statistical Applications. Springer-Verlag, New York, Lecture Notes in Statist. 129 (1998). Zbl 0899.62002, MR 1618204
  [15] G. Kerkyacharian and D. Picard, Thresholding algorithms, maxisets and well-concentrated bases, with discussion. Test 9 (2000) 283-345. Zbl 1107.62323, MR 1821645
  [16] G. Kerkyacharian and D. Picard, Minimax or maxisets? Bernoulli 8 (2002) 219-253. Zbl 1006.62005, MR 1895892
  [17] G. Kerkyacharian and D. Picard, Entropy, universal coding, approximation and bases properties. Technical Report (2001). Zbl 1055.41015
  [18] G. Kerkyacharian and D. Picard, Density estimation by kernel and wavelets methods – Optimality of Besov spaces. Statist. Probab. Lett. 18 (1993) 327-336. Zbl 0793.62019
  [19] A.N. Kolmogorov and V.M. Tikhomirov, ε-entropy and ε-capacity. Uspekhi Mat. Nauk 14 (1959) 3-86. (English translation: Amer. Math. Soc. Transl. Ser. 2 17, 277-364.) Zbl 0133.06703, MR 112032
  [20] L. Le Cam, Convergence of estimates under dimensionality restrictions. Ann. Statist. 1 (1973) 38-53. Zbl 0255.62006, MR 334381
  [21] L. Le Cam, Metric dimension and statistical estimation, in Advances in mathematical sciences: CRM's 25 years. Montreal, PQ (1994) 303-311. Zbl 0942.62035
  [22] G.G. Lorentz, Metric entropy and approximation. Bull. Amer. Math. Soc. 72 (1966) 903-937. Zbl 0158.13603, MR 203320
  [23] S.M. Nikolskii, Approximation of Functions of Several Variables and Imbedding Theorems (Russian), Second Ed. Nauka, Moscow (1977). English translation of the first Ed., Berlin (1975). Zbl 0496.46020, MR 506247
  [24] V.V. Petrov, Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Oxford University Press (1995). Zbl 0826.60001, MR 1353441
  [25] S.A. van de Geer, Empirical Processes in M-Estimation. Cambridge University Press (2000). Zbl 1179.62073
