Bayesian Compressed Regression
February 3, 2013
As an alternative to variable selection or shrinkage priors in high-dimensional regression problems, we propose to randomly compress the predictors to dramatically reduce storage and computational bottlenecks. In contrast to existing Bayesian dimensionality reduction approaches, the exact posterior distribution conditional on the compressed data is available analytically, speeding up computation by many orders of magnitude while also bypassing robustness issues due to convergence and mixing problems with MCMC. Model averaging is used to reduce sensitivity to the random projection matrix, while accommodating uncertainty in the subspace dimension. Strong theoretical support is provided for the approach by showing near parametric convergence rates for the predictive density in the large p, small n asymptotic paradigm. Practical performance relative to competitors is illustrated in simulations and real data applications.
Keywords: Compressed sensing; Data compression; Data squashing; Dimensionality reduction; Large p, small n; Random projection; Sparsity
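The core idea in the abstract can be sketched in a few lines: project the p predictors down to m dimensions with a random matrix, exploit conjugacy to get the posterior in closed form, and average predictions over several projections. The sketch below is illustrative only, with assumed sizes (n = 50, p = 1000, m = 10), a Gaussian N(0, 1/m) projection, and a simple normal prior with known noise variance; the paper's actual prior choices and model-averaging weights may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated large-p, small-n data (sizes are illustrative assumptions)
n, p, m = 50, 1000, 10            # m = compressed dimension, m << p
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0               # sparse true signal
y = X @ beta_true + rng.standard_normal(n)

def compressed_fit(X, y, m, rng, tau2=1.0):
    """Compress predictors with a random projection, then return the
    closed-form posterior-mean prediction function under a conjugate
    Gaussian prior N(0, tau2 I) on the compressed coefficients
    (noise variance fixed at 1 for simplicity)."""
    Phi = rng.standard_normal((m, X.shape[1])) / np.sqrt(m)  # random projection
    Z = X @ Phi.T                                  # compressed design, n x m
    A = Z.T @ Z + np.eye(m) / tau2                 # posterior precision
    b_hat = np.linalg.solve(A, Z.T @ y)            # posterior mean, no MCMC
    return lambda Xnew: (Xnew @ Phi.T) @ b_hat

# Averaging over independent projections reduces sensitivity to any one Phi
preds = np.mean([compressed_fit(X, y, m, rng)(X) for _ in range(20)], axis=0)
```

Because each fit requires only an m-by-m solve rather than anything involving p, the cost per projection is trivial even for very large p, which is the source of the claimed speedup over MCMC-based dimension reduction.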