Model Selection in Gaussian Regression for High-Dimensional Data

Felix Abramovich, Vadim Grinshtein

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

We consider model selection in Gaussian regression, where the number of predictors may be even larger than the number of observations. The proposed procedure is based on a penalized least squares criterion with a complexity penalty on the model size. We discuss asymptotic properties of the resulting estimators corresponding to linear and so-called 2k ln(p/k)-type nonlinear penalties for nearly-orthogonal and multicollinear designs. We show that no linear penalty can be simultaneously adaptive to both sparse and dense setups, while 2k ln(p/k)-type penalties achieve a wide adaptivity range. We also present a Bayesian perspective on the procedure that provides additional insight and can be used as a tool for obtaining a wide class of penalized estimators associated with various complexity penalties.
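For concreteness, the following minimal Python sketch (not from the chapter) illustrates the kind of criterion the abstract describes: it selects the predictor subset minimizing the residual sum of squares plus a complexity penalty on the model size, with either a linear (AIC-type) penalty or a 2k ln(p/k)-type penalty. The brute-force subset search, the known noise variance sigma2, and the exact penalty constants are illustrative assumptions, not the chapter's procedure.

import itertools
import numpy as np

def select_model(X, y, sigma2, penalty="2klogpk"):
    """Pick the predictor subset minimizing RSS(M) + sigma2 * pen(|M|).

    Brute-force over all subsets, so only feasible for small p; the
    chapter studies the criterion itself, not this search strategy.
    """
    n, p = X.shape

    def pen(k):
        if penalty == "linear":
            return 2.0 * k                                 # linear (AIC-type) penalty
        return 2.0 * k * np.log(p / k) if k > 0 else 0.0   # 2k ln(p/k)-type penalty

    best_crit, best_subset = np.sum(y ** 2), ()            # empty-model baseline: RSS = ||y||^2
    for k in range(1, p + 1):
        for subset in itertools.combinations(range(p), k):
            XM = X[:, subset]
            beta, *_ = np.linalg.lstsq(XM, y, rcond=None)  # least squares fit on the submodel
            rss = np.sum((y - XM @ beta) ** 2)
            crit = rss + sigma2 * pen(k)
            if crit < best_crit:
                best_crit, best_subset = crit, subset
    return best_subset, best_crit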
Original language: Undefined/Unknown
Title of host publication: Inverse Problems and High-Dimensional Estimation: Stats in the Château Summer School, August 31 - September 4, 2009
Editors: Pierre Alquier, Eric Gautier, Gilles Stoltz
Place of publication: Berlin, Heidelberg
Pages: 159-170
Number of pages: 12
State: Published - 2011
