Recursive partitioning and Bayesian inference on conditional distributions

Authors: 
Li Ma
Duke University

February 28, 2012

In this work we introduce a Bayesian framework for nonparametric inference on conditional distributions in the form of a prior called the conditional optional Pólya tree. The prior is constructed through a two-stage nested procedure, which in the first stage recursively partitions the predictor space, and then in the second stage generates the conditional distribution on those predictor blocks using a further recursive partitioning procedure on the response space. This design allows adaptive smoothing on both the predictor space and the response space. We show that this prior possesses desirable properties such as large support, posterior conjugacy, and weak consistency. Moreover, the prior can be marginalized analytically, producing a closed-form marginal likelihood, and based on the marginal likelihood the corresponding posterior can also be computed analytically, allowing direct sampling without Markov chain Monte Carlo. In addition, we show that this prior can be considered a nonparametric extension of Bayesian classification and regression trees (CART), and therefore many of its theoretical results extend to Bayesian CART as well. Our prior serves as a general tool for nonparametric inference on conditional distributions. We illustrate its use in density estimation, model selection, hypothesis testing, and regression through several numerical examples.
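
The two-stage construction described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: the function names, the midpoint splitting rule on an interval, the Beta(1, 1) probability assignments, and the stopping probabilities are all hypothetical choices, included only to show how a stage-one recursive partition of the predictor space nests a stage-two recursive partition of the response space within each terminal predictor block.

import random

def partition_predictor_space(block, depth=0, stop_prob=0.5, max_depth=4):
    """Stage 1: recursively split a predictor block, stopping at random."""
    if depth >= max_depth or random.random() < stop_prob:
        # At a terminal predictor block, generate the conditional
        # distribution of the response via a second recursive partition.
        return {"block": block,
                "conditional": partition_response_space((0.0, 1.0))}
    lo, hi = block
    mid = (lo + hi) / 2.0
    return {"block": block,
            "children": [partition_predictor_space((lo, mid), depth + 1, stop_prob, max_depth),
                         partition_predictor_space((mid, hi), depth + 1, stop_prob, max_depth)]}

def partition_response_space(block, depth=0, stop_prob=0.5, max_depth=4):
    """Stage 2: a Polya-tree-style recursive partition of the response space.
    Each split receives a random probability assignment (here Beta(1, 1))."""
    if depth >= max_depth or random.random() < stop_prob:
        return {"block": block}
    lo, hi = block
    mid = (lo + hi) / 2.0
    left_mass = random.betavariate(1.0, 1.0)  # probability mass sent to the left child
    return {"block": block, "left_mass": left_mass,
            "children": [partition_response_space((lo, mid), depth + 1, stop_prob, max_depth),
                         partition_response_space((mid, hi), depth + 1, stop_prob, max_depth)]}

# Example: draw one tree with the unit interval as the predictor space.
tree = partition_predictor_space((0.0, 1.0))

In this toy version, each terminal predictor block carries its own response-space tree, which is what allows the smoothing to adapt separately in the predictor and response directions.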

Manuscript: 

2012-03.pdf