Local Predictive Influence in Bayesian Regression

Author:
Michael Lavine
Duke University

November 30, 1989

In 1986, Cook presented the idea of local influence for studying the sensitivity of inferences to model assumptions: introduce a vector ω of perturbations to the model; choose a discrepancy function δ to measure the difference between the original inference and the inference under the perturbed model; and study the behavior of δ near ω = 0, the original model, usually by taking derivatives. Cook gives an example in which ω is a vector of case-weight perturbations in a linear regression. Johnson and Geisser (1983) measure influence in Bayesian linear regression by the Kullback-Leibler divergence between predictive distributions. The current work is a synthesis of Cook and of Johnson and Geisser, using the Kullback-Leibler divergence between predictive distributions as the discrepancy function in a local influence analysis of a Bayesian linear regression.
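The scheme described above can be sketched numerically. The following is a minimal illustration (not from the paper) in Python, assuming a conjugate normal prior with known error variance, case weights as the perturbation ω (with baseline weights of one), and the Kullback-Leibler divergence between posterior predictive distributions at a single covariate point as the discrepancy δ. The simulated data, prior variance, and prediction point x_star are all hypothetical, and the curvature of δ at the null perturbation is approximated by finite differences rather than the analytic derivatives of a local influence analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for a Bayesian linear regression (hypothetical example).
n, p = 20, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
sigma2 = 1.0          # known error variance (assumed for simplicity)
tau2 = 10.0           # prior variance: beta ~ N(0, tau2 * I)
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)
y[-1] += 5.0          # contaminate one case so it is influential

x_star = np.array([1.0, 1.0])   # covariate point at which we predict

def predictive(w):
    """Posterior predictive N(m, v) at x_star under case weights w."""
    W = np.diag(w)
    S = np.linalg.inv(X.T @ W @ X / sigma2 + np.eye(p) / tau2)
    mu = S @ X.T @ W @ y / sigma2
    return x_star @ mu, x_star @ S @ x_star + sigma2

def kl_normal(m1, v1, m2, v2):
    """KL divergence N(m1, v1) || N(m2, v2) for univariate normals."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

m0, v0 = predictive(np.ones(n))   # predictive under the original model

def delta(w):
    """Discrepancy: KL between perturbed and original predictives."""
    m, v = predictive(w)
    return kl_normal(m, v, m0, v0)

# delta is zero and has zero gradient at the null perturbation w = 1,
# so its local behavior is captured by the Hessian (curvature), here
# approximated by central finite differences.
h = 1e-4
w0 = np.ones(n)
H = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
        H[i, j] = (delta(w0 + ei + ej) - delta(w0 + ei - ej)
                   - delta(w0 - ei + ej) + delta(w0 - ei - ej)) / (4 * h * h)

# The direction of maximum curvature flags the influential cases.
eigvals, eigvecs = np.linalg.eigh(H)
d_max = eigvecs[:, -1]
print("most influential case:", int(np.argmax(np.abs(d_max))))
```

Large entries of d_max identify the cases whose weights most affect the predictive distribution, which is the role the direction of maximum curvature plays in Cook's formulation.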

Manuscript: 

1989-04.pdf