Stochastic nets and Bayesian regularization
Joshua Bon, Queensland University of Technology & ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS)
Monday, January 6, 2020 - 3:30pm
This work considers a relatively simple, but undocumented, representation of shrinkage priors in Bayesian inference. The new representation is based on a probabilistic decomposition analogous to the duality between constrained and penalized optimization. Stochastic nets, as we call them, introduce the notion of probabilistic regularization as an operator that acts on a distribution. The class includes, for example, the Horseshoe, regularized Horseshoe, Dirichlet-Laplace, and R2-D2 priors. I will introduce this new prior construct, outline some of its properties, and describe its strengths and weaknesses from both theoretical and computational perspectives.
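As background for the priors named in the abstract (not material from the talk itself): the Horseshoe prior has a standard scale-mixture representation in which each coefficient is conditionally Gaussian with a half-Cauchy local scale and a half-Cauchy global scale. The sketch below samples from that prior directly with NumPy; the function name and interface are illustrative, not from the speaker's work.

```python
import numpy as np

def sample_horseshoe(p, n_draws, rng):
    """Draw coefficient vectors from the Horseshoe prior via its
    scale-mixture representation:
        tau ~ C+(0, 1)            (global shrinkage)
        lambda_j ~ C+(0, 1)       (local shrinkage, one per coefficient)
        beta_j | lambda_j, tau ~ N(0, (tau * lambda_j)^2)
    """
    # A half-Cauchy draw is the absolute value of a standard Cauchy draw.
    tau = np.abs(rng.standard_cauchy(size=(n_draws, 1)))
    lam = np.abs(rng.standard_cauchy(size=(n_draws, p)))
    # Conditionally Gaussian given the local and global scales;
    # the (n_draws, 1) tau broadcasts against the (n_draws, p) lam.
    beta = rng.normal(loc=0.0, scale=tau * lam)
    return beta

rng = np.random.default_rng(0)
draws = sample_horseshoe(p=5, n_draws=1000, rng=rng)
print(draws.shape)  # (1000, 5)
```

The heavy-tailed half-Cauchy scales give the characteristic Horseshoe behavior: most draws are shrunk near zero while occasional draws are very large, which is the shrinkage profile the abstract's prior class generalizes.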
Seminars generally take place in 116 Old Chemistry Building on Fridays from 3:30 - 4:30 pm. For additional information contact: firstname.lastname@example.org or phone 919-684-8029. Sorry, but we do not have reprints available. Please feel free to contact the authors by email for follow-up information, articles, etc. Reception following seminar in 203B Old Chemistry.