$L_1$-penalization in functional linear regression with subgaussian design
Vladimir Koltchinskii[1]; Stanislav Minsker[2]
- [1] School of Mathematics, Georgia Institute of Technology, 686 Cherry Street, Atlanta, GA 30332-0160, USA
- [2] Department of Mathematics, Duke University, Box 90320, Durham, NC 27708-0320, USA
Journal de l’École polytechnique — Mathématiques (2014)
- Volume: 1, pages 269-330
- ISSN: 2270-518X
How to cite
Koltchinskii, Vladimir, and Minsker, Stanislav. "$L_1$-penalization in functional linear regression with subgaussian design." Journal de l’École polytechnique — Mathématiques 1 (2014): 269-330. <http://eudml.org/doc/275615>.
@article{Koltchinskii2014,
abstract = {We study functional regression with random subgaussian design and real-valued response. The focus is on the problems in which the regression function can be well approximated by a functional linear model with the slope function being “sparse” in the sense that it can be represented as a sum of a small number of well separated “spikes”. This can be viewed as an extension of now classical sparse estimation problems to the case of infinite dictionaries. We study an estimator of the regression function based on penalized empirical risk minimization with quadratic loss and the complexity penalty defined in terms of $L_1$-norm (a continuous version of LASSO). The main goal is to introduce several important parameters characterizing sparsity in this class of problems and to prove sharp oracle inequalities showing how the $L_2$-error of the continuous LASSO estimator depends on the underlying sparsity of the problem.},
affiliation = {School of Mathematics, Georgia Institute of Technology, 686 Cherry Street, Atlanta, GA 30332-0160, USA; Department of Mathematics, Duke University, Box 90320, Durham, NC 27708-0320, USA},
author = {Koltchinskii, Vladimir and Minsker, Stanislav},
journal = {Journal de l’École polytechnique — Mathématiques},
keywords = {Functional regression; sparse recovery; LASSO; oracle inequality; infinite dictionaries},
language = {eng},
pages = {269-330},
publisher = {École polytechnique},
title = {$L_1$-penalization in functional linear regression with subgaussian design},
url = {http://eudml.org/doc/275615},
volume = {1},
year = {2014},
}
TY - JOUR
AU - Koltchinskii, Vladimir
AU - Minsker, Stanislav
TI - $L_1$-penalization in functional linear regression with subgaussian design
JO - Journal de l’École polytechnique — Mathématiques
PY - 2014
PB - École polytechnique
VL - 1
SP - 269
EP - 330
AB - We study functional regression with random subgaussian design and real-valued response. The focus is on the problems in which the regression function can be well approximated by a functional linear model with the slope function being “sparse” in the sense that it can be represented as a sum of a small number of well separated “spikes”. This can be viewed as an extension of now classical sparse estimation problems to the case of infinite dictionaries. We study an estimator of the regression function based on penalized empirical risk minimization with quadratic loss and the complexity penalty defined in terms of $L_1$-norm (a continuous version of LASSO). The main goal is to introduce several important parameters characterizing sparsity in this class of problems and to prove sharp oracle inequalities showing how the $L_2$-error of the continuous LASSO estimator depends on the underlying sparsity of the problem.
LA - eng
KW - Functional regression; sparse recovery; LASSO; oracle inequality; infinite dictionaries
UR - http://eudml.org/doc/275615
ER -
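The abstract describes the estimator only in words: penalized empirical risk minimization with quadratic loss and an $L_1$-norm complexity penalty, a continuous analogue of the LASSO over an infinite dictionary. The Python sketch below is not the authors' construction or code; it only illustrates the idea under simplifying assumptions. It discretizes $[0,1]$ on a grid, approximates the $L_2$ inner product and the $L_1$ penalty by Riemann sums, and solves the resulting finite-dimensional LASSO with a plain ISTA loop. The grid size, the spike locations and heights, the noise level, and the regularization level eps are all illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

# Problem sizes: n curves observed on an m-point grid over [0, 1] (illustrative).
n, m = 200, 500
t = np.linspace(0.0, 1.0, m)
h = t[1] - t[0]                              # grid spacing / quadrature weight

# "Sparse" slope function: a few well separated spikes (locations/heights made up).
lam_true = np.zeros(m)
lam_true[[60, 250, 430]] = [400.0, -300.0, 250.0]

# Subgaussian (here Gaussian) random design: Brownian-motion-like curves X_i.
X = np.cumsum(rng.standard_normal((n, m)), axis=1) / np.sqrt(m)
y = h * (X @ lam_true) + 0.1 * rng.standard_normal(n)   # <lam, X_i>_{L_2} + noise

def continuous_lasso(X, y, h, eps, n_iter=3000):
    """ISTA for  min_lam (1/n)||y - h X lam||^2 + eps * h * sum_k |lam_k|,
    a discretized version of quadratic empirical risk plus eps * ||lam||_{L_1}."""
    A = h * X
    n_samples = len(y)
    L = 2.0 / n_samples * np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    lam = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 / n_samples * A.T @ (A @ lam - y)      # gradient of the quadratic risk
        z = lam - grad / L
        lam = np.sign(z) * np.maximum(np.abs(z) - eps * h / L, 0.0)  # soft thresholding
    return lam

lam_hat = continuous_lasso(X, y, h, eps=0.05)             # eps: arbitrary illustrative value
print("grid points with |lam_hat| > 1e-3:", np.count_nonzero(np.abs(lam_hat) > 1e-3))
print("largest |lam_hat| at grid indices:", np.argsort(np.abs(lam_hat))[-3:].tolist())

In this discretization the term eps * h * sum_k |lam_k| plays the role of the continuous penalty $\varepsilon\|\lambda\|_{L_1}$ from the abstract; the paper's oracle inequalities concern the $L_2$-error of the continuous estimator and are not reproduced by this sketch.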
References
- R. Adamczak, A tail inequality for suprema of unbounded empirical processes with applications to Markov chains, Electron. J. Probab. 13 (2008), 1000-1034 Zbl1190.60010MR2424985
- R. Adams, Sobolev spaces, (1975), Academic Press, New York Zbl1098.46001MR450957
- G. Bal, Numerical methods for PDEs, (2009)
- P. L. Bartlett, S. Mendelson, J. Neeman, $\ell_1$-regularized linear regression: persistence and oracle inequalities, Probab. Theory Relat. Fields 154 (2012), 193-224 Zbl06125014MR2981422
- W. Bednorz, Concentration via chaining method and its applications, (2014)
- P. J. Bickel, Y. Ritov, A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, Ann. Statist. 37 (2009), 1705-1732 Zbl1173.62022MR2533469
- V. I. Bogachev, Measure theory. Vol. I, II, (2007), Springer-Verlag, Berlin Zbl1120.28001MR2267655
- P. Bühlmann, S. A. van de Geer, Statistics for high-dimensional data, (2011), Springer-Verlag, Berlin-Heidelberg Zbl1273.62015MR2807761
- F. Bunea, A. B. Tsybakov, M. Wegkamp, Sparsity oracle inequalities for the Lasso, Electron. J. Statist. 1 (2007), 169-194 Zbl1146.62028MR2312149
- T. T. Cai, P. Hall, Prediction in functional linear regression, Ann. Statist. 34 (2006), 2159-2179 Zbl1106.62036MR2291496
- E. Candès, The restricted isometry property and its implications for compressed sensing, Comptes Rendus Mathématique 346 (2008), 589-592 Zbl1153.94002MR2412803
- E. Candès, C. Fernandez-Granda, Towards a Mathematical Theory of Super-resolution, Comm. Pure Appl. Math. 67 (2014), 906-956 Zbl06298872MR3193963
- E. J. Candès, J. K. Romberg, T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Appl. Math. 59 (2006), 1207-1223 Zbl1098.94009MR2230846
- C. Crambes, A. Kneip, P. Sarda, Smoothing splines estimators for functional linear regression, Ann. Statist. 37 (2009), 35-72 Zbl1169.62027MR2488344
- S. Dirksen, Tail bounds via generic chaining, (2013) Zbl1327.60048
- S. A. van de Geer, High-dimensional generalized linear models and the Lasso, Ann. Statist. 36 (2008), 614-645 Zbl1138.62323MR2396809
- S. A. van de Geer, J. Lederer, The Lasso, correlated design, and improved oracle inequalities, A Festschrift in Honor of Jon Wellner (2012), 3468-3497, Institute of Mathematical Statistics MR3202642
- E. D. Gluskin, Norms of random matrices and widths of finite-dimensional sets, Mat. Sb. 120(162) (1983), 180-189 Zbl0528.46015MR687610
- M. Hebiri, J. Lederer, How Correlations Influence Lasso Prediction, IEEE Trans. Information Theory 59 (2013), 1846-1854 MR3030757
- A. D. Ioffe, V. M. Tikhomirov, Theory of Extremal Problems, (1974), Nauka, Moscow MR410502
- G. James, Sparseness and functional data analysis, The Oxford handbook of functional data analysis (2011), 298-323, Oxford University Press, New York MR2908027
- G. M. James, J. Wang, J. Zhu, Functional linear regression that’s interpretable, Ann. Statist. 37 (2009), 2083-2108 Zbl1171.62041MR2543686
- V. Koltchinskii, The Dantzig selector and sparsity oracle inequalities, Bernoulli 15 (2009), 799-828 Zbl05815956MR2555200
- V. Koltchinskii, Sparse recovery in Convex Hulls via Entropy penalization, Ann. Statist. 37 (2009), 1332-1359 Zbl1269.62039MR2509076
- V. Koltchinskii, Sparsity in Penalized Empirical Risk Minimization, Ann. Inst. H. Poincaré Probab. Statist. 45 (2009), 7-57 Zbl1168.62044MR2500227
- V. Koltchinskii, Oracle inequalities in empirical risk minimization and sparse recovery problems, 38th Probability Summer School (Saint-Flour, 2008) (2011), Springer Zbl1223.91002MR2829871
- V. Koltchinskii, K. Lounici, A. B. Tsybakov, Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion, Ann. Statist. 39 (2011), 2302-2329 Zbl1231.62097MR2906869
- V. Koltchinskii, S. Minsker, Sparse Recovery in Convex Hulls of Infinite Dictionaries, COLT 2010, 23rd Conference on Learning Theory (2010), 420-432
- S. Lang, Real and functional analysis, 142 (1993), Springer Zbl0831.46001MR1216137
- M. A. Lifshits, Gaussian random functions, 322 (1995), Kluwer Academic Publishers, Dordrecht Zbl0832.60002MR1472736
- P. Massart, C. Meynet, The Lasso as an $\ell_1$-ball model selection procedure, Electron. J. Statist. 5 (2011), 669-687 Zbl1274.62468MR2820635
- S. Mendelson, Oracle inequalities and the isomorphic method
- S. Mendelson, Empirical processes with a bounded diameter, Geom. Funct. Anal. 20 (2010), 988-1027 Zbl1204.60042MR2729283
- H. G. Müller, U. Stadtmüller, Generalized functional linear models, Ann. Statist. 33 (2005), 774-805 Zbl1068.62048
- J. O. Ramsay, Functional data analysis, (2006), Wiley Online Library
- J. O. Ramsay, B. W. Silverman, Applied functional data analysis: methods and case studies, 77 (2002), Springer, New York Zbl0882.62002MR1910407
- K. Ritter, G. W. Wasilkowski, H. Woźniakowski, Multivariate integration and approximation for random fields satisfying Sacks-Ylvisaker conditions, Ann. Appl. Probab. (1995), 518-540 Zbl0872.62063MR1336881
- J. Sacks, D. Ylvisaker, Designs for regression problems with correlated errors, Ann. Math. Statist. 37 (1966), 66-89 Zbl0152.17503MR192601
- M. Talagrand, The generic chaining, (2005), Springer-Verlag, Berlin Zbl1075.60001MR2133757
- R. Tibshirani, Regression shrinkage and selection via the Lasso, J. R. Stat. Soc. Ser. B Stat. Methodol. (1996), 267-288 Zbl0850.62538MR1379242
- A. W. van der Vaart, J. A. Wellner, Weak convergence and empirical processes, (1996), Springer-Verlag, New York Zbl0862.60002MR1385671
- M. Yuan, T. T. Cai, A reproducing kernel Hilbert space approach to functional linear regression, Ann. Statist. 38 (2010), 3412-3444 Zbl1204.62074MR2766857