Senior Analyst, Google Inc.
Bayesian Additive Regression Kernels
We propose a general Bayesian "sum of kernels" model, named Bayesian Additive Regression Kernels (BARK), for both regression and classification problems. The unknown mean function is represented as a weighted sum of kernel functions, with a prior constructed from symmetric α-stable (SαS) Lévy random fields. We investigate both truncation and continuous approximations to the SαS Lévy random fields, which lead to specifications of joint prior distributions on the number of kernel functions, the regression coefficients, and the kernel parameters. Dimension reduction and variable selection techniques are commonly used to gain insight and improve predictive accuracy in regression and classification problems. We detail a fully Bayesian approach to dimension reduction through low-rank Gaussian kernel functions in the nonparametric kernel logistic regression model. We also demonstrate a direct feature selection procedure that places a hierarchical mixture prior, a point mass at zero mixed with a gamma distribution, on the kernel scale parameters. Finally, we present a preliminary investigation of adding dependence structure to this feature selection procedure through a Markov model on the kernel scale parameters. Reversible jump algorithms are implemented to carry out posterior inference, and the methods are illustrated on several simulated and real data sets.
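As a minimal illustrative sketch (not the authors' implementation), the "sum of kernels" mean function with Gaussian kernels can be written as f(x) = Σ_j β_j K(x, χ_j), where the β_j are regression coefficients, the χ_j are kernel locations, and feature-specific scale parameters φ_d allow feature selection: setting φ_d = 0, the point mass in the mixture prior, removes feature d from every kernel. All names and values below are hypothetical.

```python
import numpy as np

def gaussian_kernel(x, chi, phi):
    # Gaussian kernel with feature-specific scale parameters phi;
    # phi[d] = 0 makes the kernel constant in feature d, i.e. the
    # feature is effectively dropped (the point mass at zero).
    return np.exp(-np.sum(phi * (x - chi) ** 2, axis=-1))

def bark_mean(x, betas, chis, phi):
    # f(x) = sum_j beta_j * K(x, chi_j): the weighted "sum of
    # kernels" representation of the unknown mean function.
    return sum(b * gaussian_kernel(x, c, phi)
               for b, c in zip(betas, chis))

# Toy example with J = 5 kernels in d = 3 dimensions.
rng = np.random.default_rng(0)
J, d = 5, 3
betas = rng.standard_normal(J)       # regression coefficients
chis = rng.standard_normal((J, d))   # kernel locations
phi = np.array([1.0, 0.5, 0.0])      # phi[2] = 0: feature 3 excluded

x = rng.standard_normal(d)
print(bark_mean(x, betas, chis, phi))
```

In the full model, J, the β_j, the χ_j, and the φ_d would all carry the prior distributions described above and be sampled by reversible jump MCMC; this sketch only evaluates the mean function for fixed values.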