Additive Covariance kernels for high-dimensional Gaussian Process modeling

Nicolas Durrande[1]; David Ginsbourger[2]; Olivier Roustant[3]

  • [1] School of mathematics and statistics, University of Sheffield, Sheffield S3 7RH, UK, Ecole Nationale Supérieure des Mines, FAYOL-EMSE, LSTI, F-42023 Saint-Etienne, France
  • [2] Institute of Mathematical Statistics and Actuarial Science, University of Berne, Alpeneggstrasse 22, 3012 Bern, Switzerland
  • [3] Ecole Nationale Supérieure des Mines, FAYOL-EMSE, LSTI, F-42023 Saint-Etienne, France

Annales de la faculté des sciences de Toulouse Mathématiques (2012)

  • Volume: 21, Issue: 3, page 481-499
  • ISSN: 0240-2963

Abstract

Gaussian Process models are often used for predicting and approximating expensive experiments. However, the number of observations required for building such models may become unrealistic when the input dimension increases. In order to avoid the curse of dimensionality, a popular approach in multivariate smoothing is to make simplifying assumptions like additivity. The ambition of the present work is to give an insight into a family of covariance kernels that allows combining the features of Gaussian Process modeling with the advantages of generalized additive models, and to describe some properties of the resulting models.
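
The core construction behind such kernels is to write a d-dimensional covariance as a sum of univariate kernels, so that the resulting centred Gaussian Process has additive sample paths. The short Python sketch below illustrates this idea with squared-exponential univariate kernels; the kernel choice, function names and parameter values are illustrative assumptions for the example, not the paper's implementation.

import numpy as np

def k1d(x, y, lengthscale=0.3, variance=1.0):
    # One-dimensional squared-exponential kernel k_i(x_i, y_i)
    return variance * np.exp(-0.5 * ((x - y) / lengthscale) ** 2)

def additive_kernel(x, y, lengthscales, variances):
    # Additive covariance k(x, y) = sum_i k_i(x_i, y_i) over the d coordinates
    d = len(x)
    return sum(k1d(x[i], y[i], lengthscales[i], variances[i]) for i in range(d))

# Example (hypothetical data): covariance matrix of 10 points in dimension d = 5
rng = np.random.default_rng(0)
X = rng.uniform(size=(10, 5))
ls, var = np.full(5, 0.3), np.full(5, 1.0)
K = np.array([[additive_kernel(xa, xb, ls, var) for xb in X] for xa in X])
print(K.shape)  # (10, 10)

Because each summand depends on a single coordinate, models built from such kernels inherit the parsimony of generalized additive models while retaining a Gaussian Process interpretation, which is the trade-off discussed in the abstract.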

How to cite


Durrande, Nicolas, Ginsbourger, David, and Roustant, Olivier. "Additive Covariance kernels for high-dimensional Gaussian Process modeling." Annales de la faculté des sciences de Toulouse Mathématiques 21.3 (2012): 481-499. <http://eudml.org/doc/251000>.

@article{Durrande2012,
abstract = {Gaussian Process models are often used for predicting and approximating expensive experiments. However, the number of observations required for building such models may become unrealistic when the input dimension increases. In order to avoid the curse of dimensionality, a popular approach in multivariate smoothing is to make simplifying assumptions like additivity. The ambition of the present work is to give an insight into a family of covariance kernels that allows combining the features of Gaussian Process modeling with the advantages of generalized additive models, and to describe some properties of the resulting models.},
affiliation = {School of mathematics and statistics, University of Sheffield, Sheffield S3 7RH, UK, Ecole Nationale Supérieure des Mines, FAYOL-EMSE, LSTI, F-42023 Saint-Etienne, France; Institute of Mathematical Statistics and Actuarial Science, University of Berne, Alpeneggstrasse 22, 3012 Bern, Switzerland; Ecole Nationale Supérieure des Mines, FAYOL-EMSE, LSTI, F-42023 Saint-Etienne, France},
author = {Durrande, Nicolas and Ginsbourger, David and Roustant, Olivier},
journal = {Annales de la faculté des sciences de Toulouse Mathématiques},
keywords = {additive covariance kernels; Gaussian process modeling; additive Kriging},
language = {eng},
month = {4},
number = {3},
pages = {481-499},
publisher = {Université Paul Sabatier, Toulouse},
title = {Additive Covariance kernels for high-dimensional Gaussian Process modeling},
url = {http://eudml.org/doc/251000},
volume = {21},
year = {2012},
}

TY - JOUR
AU - Durrande, Nicolas
AU - Ginsbourger, David
AU - Roustant, Olivier
TI - Additive Covariance kernels for high-dimensional Gaussian Process modeling
JO - Annales de la faculté des sciences de Toulouse Mathématiques
DA - 2012/4//
PB - Université Paul Sabatier, Toulouse
VL - 21
IS - 3
SP - 481
EP - 499
AB - Gaussian Process models are often used for predicting and approximating expensive experiments. However, the number of observations required for building such models may become unrealistic when the input dimension increases. In order to avoid the curse of dimensionality, a popular approach in multivariate smoothing is to make simplifying assumptions like additivity. The ambition of the present work is to give an insight into a family of covariance kernels that allows combining the features of Gaussian Process modeling with the advantages of generalized additive models, and to describe some properties of the resulting models.
LA - eng
KW - additive covariance kernels; Gaussian process modeling; additive Kriging
UR - http://eudml.org/doc/251000
ER -

References

  1. Azaïs (J.M.) and Wschebor (M.).— Level sets and extrema of random processes and fields, Wiley Online Library (2009). Zbl1168.60002MR2478201
  2. Bach (F.).— Exploring large feature spaces with hierarchical multiple kernel learning, Arxiv preprint arXiv:0809.1493 (2008). 
  3. Buja (A.), Hastie (T.) and Tibshirani (R.).— Linear smoothers and additive models, The Annals of Statistics, p. 453-510 (1989). Zbl0689.62029MR994249
  4. Chilès (J.P.) and Delfiner (P.).— Geostatistics: modeling spatial uncertainty, volume 344, Wiley-Interscience (1999). Zbl1256.86007MR1679557
  5. Cressie (N.).— Statistics for spatial data, Terra Nova, 4(5), p. 613-617 (1992). Zbl0799.62002MR1239641
  6. Fang (K.).— Design and modeling for computer experiments, volume 6. CRC Press (2006). Zbl1093.62117MR2223960
  7. Fortet (R.M.).— Les opérateurs intégraux dont le noyau est une covariance, Trabajos de estadística y de investigación operativa, 36(3), p. 133-144 (1985). Zbl0733.47030
  8. Gaetan (C.) and Guyon (X.).— Spatial statistics and modeling, Springer Verlag (2009). Zbl1271.62214MR2569034
  9. Ginsbourger (D.), Dupuy (D.), Badea (A.), Carraro (L.) and Roustant (O.).— A note on the choice and the estimation of kriging models for the analysis of deterministic computer experiments, Applied Stochastic Models in Business and Industry, 25(2), p. 115-131 (2009). Zbl1224.62149MR2510851
  10. Gunn (S.R.) and Brown (M.).— Supanova: A sparse, transparent modelling approach, In Neural Networks for Signal Processing IX, 1999, Proceedings of the 1999 IEEE Signal Processing Society Workshop, p. 21-30. IEEE (1999). 
  11. Hastie (T.).— gam: Generalized Additive Models, 2011, R package version 1.04.1. 
  12. Hastie (T.J.) and Tibshirani (R.J.).— Generalized additive models, Chapman & Hall/CRC (1990). Zbl0747.62061MR1082147
  13. Loeppky (J.L.), Sacks (J.) and Welch (W.J.).— Choosing the sample size of a computer experiment: A practical guide, Technometrics, 51(4), p. 366-376 (2009). MR2756473
  14. Muehlenstaedt (T.), Roustant (O.), Carraro (L.) and Kuhnt (S.).— Data-driven Kriging models based on FANOVA-decomposition, to appear in Statistics and Computing. 
  15. Newey (W.K.).— Kernel estimation of partial means and a general variance estimator, Econometric Theory, 10(02), p. 1-21 (1994). MR1293201
  16. Rasmussen (C.E.) and Williams (C.K.I.).— Gaussian processes for machine learning (2005). MR2514435
  17. Roustant (O.), Ginsbourger (D.) and Deville (Y.).— DiceKriging: Kriging methods for computer experiments, 2011, R package version 1.3. 
  18. Saltelli (A.), Chan (K.), Scott (E.M.) et al.— Sensitivity analysis, volume 134, Wiley New York (2000). Zbl1152.62071MR1886391
  19. Santner (T.J.), Williams (B.J.) and Notz (W.).— The design and analysis of computer experiments, Springer Verlag (2003). Zbl1041.62068MR2160708
  20. Sobol (I.M.).— Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates, Mathematics and Computers in Simulation, 55(1-3), p. 271-280 (2001). Zbl1005.65004MR1823119
  21. Stone (C.J.).— Additive regression and other nonparametric models, The Annals of Statistics, p. 689-705 (1985). Zbl0605.62065MR790566
  22. R Team.— R: A language and environment for statistical computing, R Foundation for Statistical Computing, Vienna, Austria (2008).
